New Delhi, March 24 (Ajit Weekly News) Misinformation and disinformation spread through artificial intelligence (AI)-generated deepfakes and fake content are the biggest threat to the upcoming election in India, exposure management company Tenable said on Sunday.
According to the company, this misleading content will spread across social media and messaging platforms such as WhatsApp, X (formerly Twitter), Instagram, and others.
“The biggest threats to the 2024 Lok Sabha elections are misinformation and disinformation as part of influence operations conducted by malicious actors against the electorate,” Satnam Narang, Senior Staff Research Engineer, Tenable, told Ajit Weekly News.
According to a recent report by Tidal Cyber, India is among the 10 countries facing the highest levels of election cyber interference threats this year.
Recently, deepfake videos of former US President Bill Clinton and current President Joe Biden were fabricated and circulated to confuse citizens ahead of the upcoming US presidential election.
According to experts, the proliferation of deepfake content surged in late 2017, with over 7,900 videos online. By early 2019, that number had nearly doubled to 14,678, and the trend continues to escalate.
“With the increase in generative AI tools and their use growing around the world, we may see deepfakes, be it in images or video content, impersonating notable candidates seeking to retain their seats or those hoping to unseat those currently in parliament,” Narang said.
Recently, the Indian government issued directives to social media platforms such as X and Meta (formerly Facebook), urging them to curb the proliferation of AI-generated deepfake content.
In addition, ahead of the Lok Sabha elections, the Ministry of Electronics & IT (MeitY) issued an advisory directing such platforms to remove AI-generated deepfake content.
According to Tenable, the easiest way to identify a deepfake image is to look for text that is nonsensical or that looks almost alien-like, resembling no real language.
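Tenable's tip can be illustrated with a toy heuristic. The sketch below is purely illustrative and is not from the article or from Tenable: it assumes the text has already been extracted from an image (for example with an OCR library such as pytesseract, not shown here), and the tiny vocabulary and threshold are stand-in assumptions, not a production detector.

```python
# Illustrative heuristic: deepfake images often contain garbled,
# nonsensical text. Given OCR-extracted text, flag it when too few
# of its alphabetic tokens match known real words.

# Tiny stand-in vocabulary; a real check would use a full dictionary.
KNOWN_WORDS = {
    "vote", "for", "the", "election", "india", "people", "party",
    "candidate", "support", "our", "future", "together", "change",
}

def looks_garbled(ocr_text: str, threshold: float = 0.5) -> bool:
    """Return True if fewer than `threshold` of the alphabetic
    tokens in ocr_text appear in the known-word set."""
    tokens = [t.lower() for t in ocr_text.split() if t.isalpha()]
    if not tokens:
        return False  # no text to judge
    known = sum(1 for t in tokens if t in KNOWN_WORDS)
    return known / len(tokens) < threshold

print(looks_garbled("Vote for our candidate"))      # plausible campaign text
print(looks_garbled("Vtoe fr oru cnadidaet xqzl"))  # alien-looking text
```

A real pipeline would pair the OCR step with a proper dictionary or language model; the point here is only that nonsensical text is cheaply detectable once extracted.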
–Ajit Weekly News
shs/vd
News Credits – I A N S