Could deepfakes influence the result of the world's biggest election?
Video and words by ITV News Digital Video Producer George Hancorn
Dead politicians brought back to life and millions spent on false ads. How is the world's biggest democratic election already feeling the impact of misinformation?
India's election sees one in 10 of the world's population going to the polls - that's roughly 970 million voters. The election is staggered across six weeks and is already more than halfway through. The votes will then be counted on Tuesday, June 4.
So, how have deepfakes been involved and could the same be replicated in the UK when Prime Minister Rishi Sunak calls an election?
Even before the election campaign in India got underway, the public already had a taste of how artificial intelligence (AI) and tech might play into things.
Deepfakes are generally videos of a person whose face or body has been digitally altered to make them appear to say or do something they didn't.
And unusually, according to experts, deepfakes are often made not by foreign states but by teams commissioned by India's own political parties.
"They create them by themselves or by their in-house team, explains deepfake creator and expert Divyendra Singh Jadoun.
Jadoun, also known as 'The Indian Deepfaker', has created AI-generated video for Netflix and other major brands.
"There are so many tools available online, even if you don't have any knowledge of coding, you can do voice swapping in less than three minutes, and you can clone anybody's voice in just five to 10 seconds."
"The main thing about the deepfakes is they want to target people who are confused," explains Jadoun.
"They don't want to target the people who are already affiliated to a particular political party."
Politician Muthuvel Karunanidhi - who died in 2018 and led India's Dravida Munnetra Kazhagam party in the state of Tamil Nadu - was resurrected as a deepfake at a youth conference to praise his former party.
It's thought the influence of the once popular politician might still hold some sway over voters later in 2024.
Another deceased leader, Jayaram Jayalalithaa, was brought back through the release of an AI-generated voice clip.
Deepfaked interview clips of actors Aamir Khan and Ranveer Singh also raised concerns.
They racked up millions of views before being taken down by platforms, including WhatsApp and Instagram owner Meta.
These clips were put out in support of India's opposition party, Congress.
So far, parties haven't commented on the release of any deepfake videos.
In India, most polls predict a win for Modi and his Bharatiya Janata Party (BJP), which is up against a broad opposition alliance led by the Indian National Congress and powerful regional parties.
AI has also been used by Modi to translate his voice from Hindi into other languages, using an app.
But there are fears this could be hijacked to change what he's actually saying.
"To make deepfakes, it used to take 10-12 days," explains Jadoun.
"But now you can get a deepfake video in just three minutes and it costs less than five dollars."
"Organisations used to reach out to us via fake WhatsApp numbers or fake Instagram accounts. Then they'd ask to move to closed platforms like Telegram to carry on the conversation."
Any work deemed 'unethical' by Jadoun would be declined.
Why is all this happening? Well, it's much cheaper to push out campaigns on social media than it is to hold rallies or press conferences - and social platforms can reach a lot more people.
Plus, India doesn't currently have any specific laws around deepfakes - although the government has said it's working on legislation, and social media companies do have a duty to remove any content that violates their rules.
Jadoun offers a warning for those watching India's election from around the world: "People of the UK - it's already happening - each bit of content that's out there - you can't rely on what is real or not.
"Anything that hits your emotions, think before sharing."