In what could prove to be a major step towards exposing impostors who attend virtual conferences without anyone’s knowledge or on somebody else’s behalf, researchers at the Indian Institute of Technology (IIT) Ropar in Punjab and Monash University, Australia, have developed a unique detector called FakeBuster. Not only does it expose impostors, it also detects faces manipulated on social media to defame or mock someone. FakeBuster could be seen as a way forward, especially at a time when most work and official meetings happen online.
The technology helps the organiser of an online conference or seminar detect whether a participant’s video is being manipulated or spoofed during the interaction. FakeBuster can, for instance, detect if an individual is attending a meeting on a colleague’s behalf by morphing the colleague’s face onto his own video feed.
A paper titled “FakeBuster: A DeepFakes Detection Tool for Video Conferencing Scenarios” was presented at the 26th International Conference on Intelligent User Interfaces in the US in April. The researchers say the software has been tested with Zoom and Skype and, importantly, works in both online and offline modes.
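The report does not spell out FakeBuster’s internal pipeline, but the general workflow of such a tool can be sketched in a few lines of Python: capture the participant’s video, locate faces in each frame, score the face crops with a pretrained deepfake classifier, and raise an alert when the average score over a short window stays high. In the sketch below, the model file deepfake_detector.pt, the input video, and the alert threshold are hypothetical placeholders, not details taken from the paper.

```python
# Minimal sketch (not the authors' code): score a participant's video feed
# for face manipulation. Assumes a pretrained binary deepfake classifier
# saved as "deepfake_detector.pt" (hypothetical) that maps a 224x224 face
# crop to a logit for "fake".
import cv2
import numpy as np
import torch

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
model = torch.jit.load("deepfake_detector.pt")  # hypothetical model file
model.eval()

def fake_probability(frame_bgr):
    """Return the highest fake-probability among faces found in one frame."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    scores = []
    for (x, y, w, h) in faces:
        crop = cv2.resize(frame_bgr[y:y + h, x:x + w], (224, 224))
        tensor = torch.from_numpy(crop).permute(2, 0, 1).float().unsqueeze(0) / 255.0
        with torch.no_grad():
            scores.append(torch.sigmoid(model(tensor)).item())
    return max(scores, default=0.0)

# Aggregate over a sliding window so a single noisy frame does not trigger an alert.
cap = cv2.VideoCapture("meeting_window.mp4")  # or a screen-capture stream
window = []
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    window = (window + [fake_probability(frame)])[-30:]  # keep the last ~30 frames
    if len(window) == 30 and np.mean(window) > 0.7:      # hypothetical threshold
        print("Possible deepfake detected in participant video")
cap.release()
```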
In a statement, Dr Abhinav Dhall, one of the key members of the four-member team that developed FakeBuster, said that sophisticated AI techniques have spurred a dramatic increase in the manipulation of media content, and that these manipulations keep evolving and becoming more realistic. “The tool has achieved over 90 per cent accuracy,” he said.
The other three members of the team are associate professor Ramanathan Subramanian and two students, Vineet Mehta and Parul Gupta.
Subramanian says the tool can be attached to laptops and desktops, adding that the team is aiming to “make the network smaller and lighter to enable it to run on mobile phones/devices as well.” The team is also working on using the tool to detect fake audio.
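The team has not said how it plans to shrink the network. One common route to a “smaller and lighter” model that can run on phones is post-training quantization followed by a mobile export, sketched below with PyTorch. The TinyDetector class and file names are hypothetical stand-ins for illustration only, not the team’s architecture.

```python
# Sketch (not the team's code): shrink a detector via dynamic quantization
# and export it for the PyTorch Mobile runtime.
import torch
import torch.nn as nn
from torch.utils.mobile_optimizer import optimize_for_mobile

class TinyDetector(nn.Module):  # hypothetical stand-in for the real network
    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Flatten(),
            nn.Linear(3 * 224 * 224, 256),
            nn.ReLU(),
            nn.Linear(256, 1))

    def forward(self, x):
        return torch.sigmoid(self.backbone(x))

model = TinyDetector().eval()

# Dynamic quantization stores the Linear weights as 8-bit integers,
# roughly quartering their size with little accuracy loss.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8)

# Script the model and optimize the graph for mobile deployment.
scripted = torch.jit.script(quantized)
mobile_ready = optimize_for_mobile(scripted)
mobile_ready._save_for_lite_interpreter("detector_mobile.ptl")
```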
Dr Dhall adds that the use of manipulated media content to spread fake news, pornography, and other such material online has been widely observed, with major repercussions. He says these manipulations have also made their way to video-calling platforms through spoofing tools based on the transfer of facial expressions. These fake facial expressions are often convincing to the human eye and can have serious implications, the researchers say. The fear now is that such real-time mimicked visuals, known as deepfakes, could even be used during online examinations and job interviews.
Dr Dhall’s team claims that FakeBuster uses deepfake detection technology and is one of the first tools designed to detect impostors during live video conferencing. It is expected to hit the market soon.