Deepfake Robo-Calls Mimic President Biden to Meddle in New Hampshire Primary
Summary:
Over the weekend of January 20-21, New Hampshire residents received AI-generated robo-calls mimicking President Joe Biden's voice and urging them to abstain from the primary election. The state attorney general's office deemed the calls misinformation, and investigations into their source are ongoing. A similar incident involving deepfake audio of Manhattan Democratic leader Keith Wright suggests that audio fakes are becoming a preferred tool for malicious actors seeking to meddle in politics. Experts advise caution when interacting with content from unknown or dubious sources.
Over the weekend of January 20-21, residents of New Hampshire found themselves on the receiving end of an unexpected political plea. Automated phone calls mimicking the voice of U.S. President Joe Biden urged locals not to participate in the primary election on January 23. The robo-calls were produced with an artificial intelligence (AI) voice-cloning tool, creating what is known as a deepfake, with the evident intent of disrupting the 2024 presidential election.
The dubious calls urged residents to abstain from voting, as captured in an audio recording obtained by NBC: "Voting this Tuesday only bolsters the Republicans' ambition to re-elect Donald Trump. Your valuable vote will count in November, not this upcoming Tuesday."
The state attorney general's office was quick to denounce the calls as misinformation, urging New Hampshire voters to disregard the message entirely. Meanwhile, a representative for former President Donald Trump denied any involvement by Trump or his campaign team. The source behind the fraudulent calls is currently under investigation.
In a related development, another political controversy involving fake audio emerged. A deepfake recording impersonating Manhattan Democratic leader Keith Wright surfaced on January 21, in which the cloned voice was heard criticizing fellow Democratic Assembly member Inez Dickens. According to Politico, although some listeners dismissed the audio as fake, it managed to deceive at least one political insider for a moment.
Former City Council Speaker and Manhattan Democrat Melissa Mark-Viverito admitted to Politico that she initially fell for the deception: "I was taken aback. I initially believed it to be authentic."
Experts speculate that malicious actors prefer audio manipulations over video because audiences scrutinize visual trickery more closely and are better attuned to spotting it. AI advisor Henry Ajder told the Financial Times, "everyone's familiar with Photoshop or, at the very least, aware of its existence."
At the time of publication, no foolproof way to detect or deter deepfakes has been found. In the meantime, experts advise the public to exercise discernment when interacting with content from unknown or questionable sources, especially when extraordinary allegations are involved.
Published At
1/23/2024 7:25:04 PM