The rise of AI-generated videos, while technically impressive, has raised serious concerns about the use of the technology for nefarious purposes, such as fake pornography. As AI technology continues to improve, identifying what is real and what isn’t has become increasingly difficult. Intel Corp., however, now has a solution.
Launched today, Intel’s new FakeCatcher does as its name suggests: it identifies fake videos, so-called deepfakes, with a 96% accuracy rate. Intel says the deepfake detection platform is the world’s first real-time deepfake detector, returning results in milliseconds.
Designed by Ilke Demir, a researcher at Intel Labs, in collaboration with Umur Ciftci of the State University of New York at Binghamton, the technology uses Intel hardware and software, runs on a server and interfaces through a web-based platform. The software uses specialized tools, including OpenVINO, to run AI models for face and landmark detection.
While other solutions pore over raw data for signs of inauthenticity, FakeCatcher looks for authentic clues in real videos by assessing “what makes us human,” including subtle “blood flow” in the pixels of a video. When the heart pumps blood, veins change color ever so slightly; FakeCatcher picks up on these blood flow signals to determine instantly whether a video is real or fake.
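Intel hasn’t published FakeCatcher’s implementation, but the blood-flow cue it describes resembles remote photoplethysmography, or rPPG: skin pixels brighten and darken faintly with each heartbeat, and that periodic signal is hard for video generators to fake. A minimal NumPy sketch of the idea on synthetic frames (the function names, the face region and the 1.2 Hz pulse are illustrative assumptions, not Intel’s algorithm):

```python
import numpy as np

def green_channel_signal(frames, face_box):
    """Average green-channel intensity inside a face region, per frame.

    frames: array of shape (T, H, W, 3); face_box: (y0, y1, x0, x1).
    Subtle periodic variation in this trace is the kind of
    photoplethysmography cue a blood-flow detector can exploit.
    """
    y0, y1, x0, x1 = face_box
    return frames[:, y0:y1, x0:x1, 1].mean(axis=(1, 2))

def dominant_frequency(signal, fps):
    """Strongest frequency (Hz) in the detrended signal, via FFT."""
    detrended = signal - signal.mean()
    spectrum = np.abs(np.fft.rfft(detrended))
    freqs = np.fft.rfftfreq(len(detrended), d=1.0 / fps)
    return freqs[spectrum.argmax()]

# Synthetic example: 10 s of 30 fps "video" with a faint 1.2 Hz
# (72 bpm) pulse added to the green channel of the face region.
fps = 30
t = np.arange(300) / fps
frames = np.full((300, 64, 64, 3), 128.0)
frames[:, 16:48, 16:48, 1] += 2.0 * np.sin(2 * np.pi * 1.2 * t)[:, None, None]

sig = green_channel_signal(frames, (16, 48, 16, 48))
print(round(dominant_frequency(sig, fps), 1))  # → 1.2
```

A real detector would, of course, first locate the face (FakeCatcher reportedly uses OpenVINO models for face and landmark detection) and then judge whether a plausible pulse signal is present across facial regions; a deepfake with no consistent pulse would be flagged.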
“Deception due to deepfakes can cause harm and result in negative consequences, like diminished trust in media,” Intel explains. “FakeCatcher helps restore trust by enabling users to distinguish between real and fake content.”
FakeCatcher has several potential use cases. News organizations could use the service to avoid amplifying manipulated videos; social media platforms could use the technology to detect harmful deepfake videos; and nonprofits could employ the platform to democratize the detection of deepfakes for everyone.
The launch of FakeCatcher is said to be part of Intel’s commitment to advancing AI technology responsibly through a multidisciplinary approach with academia and industry partners.