The AI election year was not what everyone expected


Much AI-generated content has been used to express support or fandom for certain candidates. For example, an AI-generated video of Donald Trump and Elon Musk dancing to the Bee Gees song "Stayin' Alive" was shared millions of times on social networks, including by Sen. Mike Lee, a Utah Republican.

"It's all about social signaling. That's the whole reason people share this content. It's not AI. You're seeing the effects of a polarized electorate," said Bruce Schneier, a technologist and lecturer on public interest technology at the Harvard Kennedy School. "It's not like we've had perfect elections throughout our history and now suddenly there's AI and it's all misinformation."

But make no mistake: misleading deepfakes did spread during this election. For example, in the days before Bangladesh's election, deepfakes circulating online encouraged supporters of one of the country's political parties to boycott the vote. Sam Gregory, program director at the nonprofit Witness, which helps people use technology to support human rights and runs a rapid-response detection program for civil society organizations and journalists, said his team has noticed an increase in deepfake cases this year.

"In many election contexts, there have been examples of genuinely deceptive or misleading use of synthetic media in audio, video, and image formats that confused journalists or could not be fully verified or debunked," he said. This, he said, shows that existing tools and systems for detecting AI-generated media still lag behind the pace of the technology's development. In places outside the United States and Western Europe, these detection tools are even less reliable.

"Fortunately, AI has not been used in deceptive ways at scale or in pivotal ways in most elections, but there is a clear gap in detection tools and their accessibility to those who need them most," Gregory said. "This is not the time for complacency."

The very existence of synthetic media, he says, means politicians can allege that authentic media is actually fake, a phenomenon known as the "liar's dividend." In August, Donald Trump alleged that images showing large crowds of people rallying in support of Vice President Kamala Harris were created by AI. (They were not.) Gregory says that in an analysis of all reports submitted to Witness's deepfake response task force, about a third of the cases involved politicians using AI to deny evidence of a real event, many of them involving leaked conversations.





