Biden robocall: Audio deepfake fuels election disinformation fears
Biden robocall: Audio deepfake fuels election disinformation fears / Photo: Roberto SCHMIDT - AFP

The 2024 White House race faces the prospect of a firehose of AI-enabled disinformation, with a robocall impersonating US President Joe Biden already stoking particular alarm about audio deepfakes.

"What a bunch of malarkey," said the phone message, digitally spoofing Biden's voice and echoing one of his signature phrases.

The robocall urged New Hampshire residents not to cast ballots in the Democratic primary last month, prompting state authorities to launch a probe into possible voter suppression.

It also triggered demands from campaigners for stricter guardrails around generative artificial intelligence tools or an outright ban on robocalls.

Disinformation researchers fear rampant misuse of AI-powered applications in a pivotal election year, fuelled by the proliferation of voice cloning tools that are cheap, easy to use and hard to trace.

"This is certainly the tip of the iceberg," Vijay Balasubramaniyan, chief executive and co-founder of cybersecurity firm Pindrop, told AFP.

"We can expect to see many more deepfakes throughout this election cycle."

A detailed analysis published by Pindrop said a text-to-speech system developed by the AI voice cloning startup ElevenLabs was used to create the Biden robocall.

The scandal comes as campaigners on both sides of the US political aisle harness advanced AI tools for effective campaign messaging and as tech investors pump millions of dollars into voice cloning startups.

Balasubramaniyan refused to say whether Pindrop had shared its findings with ElevenLabs, which last month announced a financing round from investors that, according to Bloomberg News, gave the firm a valuation of $1.1 billion.

ElevenLabs did not respond to repeated AFP requests for comment. Its website leads users to a free text-to-speech generator to "create natural AI voices instantly in any language."

Under its safety guidelines, the firm said users were allowed to generate voice clones of political figures such as Donald Trump without their permission if they "express humor or mockery" in a way that makes it "clear to the listener that what they are hearing is a parody, and not authentic content."

- 'Electoral chaos' -

US regulators have been considering making AI-generated robocalls illegal, with the fake Biden call giving the effort new impetus.

"The political deepfake moment is here," said Robert Weissman, president of the advocacy group Public Citizen.

"Policymakers must rush to put in place protections or we're facing electoral chaos. The New Hampshire deepfake is a reminder of the many ways that deepfakes can sow confusion."

Researchers fret over the impact of AI tools that create videos and text so seemingly real that voters could struggle to separate truth from fiction, undermining trust in the electoral process.

But audio deepfakes used to impersonate or smear celebrities and politicians around the world have sparked the most concern.

"Of all the surfaces -- video, image, audio -- that AI can be used for voter suppression, audio is the biggest vulnerability," Tim Harper, a senior policy analyst at the Center for Democracy & Technology, told AFP.

"It is easy to clone a voice using AI, and it is difficult to identify."

- 'Election integrity' -

The ease of creating and disseminating fake audio content complicates an already hyperpolarized political landscape, undermining confidence in the media and enabling anyone to claim that fact-based "evidence has been fabricated," Wasim Khaled, chief executive of Blackbird.AI, told AFP.

Such concerns are rife as the proliferation of AI audio tools outpaces detection software.

China's ByteDance, owner of the wildly popular platform TikTok, recently unveiled StreamVoice, an AI tool for real-time conversion of a user's voice to any desired alternative.

"Even though the attackers used ElevenLabs this time, it is likely to be a different generative AI system in future attacks," Balasubramaniyan said.

"It is imperative that there are enough safeguards available in these tools."

Balasubramaniyan and other researchers recommended building audio watermarks or digital signatures into such tools as possible protections, as well as regulation restricting them to verified users.
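
To illustrate the digital-signature idea in its simplest form, the Python sketch below tags a generated audio clip with a keyed hash that a verifier holding the same key can check. The key and function names are hypothetical; production provenance schemes (such as C2PA-style signed manifests or perceptual watermarks) use public-key cryptography and are designed to survive re-encoding.

    import hashlib
    import hmac

    # Hypothetical provider-held key; real schemes would use
    # public-key signatures so verifiers need no shared secret.
    SECRET_KEY = b"provider-signing-key"

    def sign_audio(audio_bytes: bytes) -> str:
        # Bind the clip to the provider's key with a keyed hash.
        return hmac.new(SECRET_KEY, audio_bytes, hashlib.sha256).hexdigest()

    def verify_audio(audio_bytes: bytes, tag: str) -> bool:
        # Constant-time comparison guards against timing attacks.
        return hmac.compare_digest(sign_audio(audio_bytes), tag)

    clip = b"raw audio bytes from a generation request"
    tag = sign_audio(clip)
    assert verify_audio(clip, tag)             # untouched clip verifies
    assert not verify_audio(clip + b"x", tag)  # any edit breaks the tag

A detached tag like this proves only that an unmodified clip came from the signing provider; watermarking, by contrast, embeds the marker in the audio signal itself so that it can survive copying and conversion.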

"Even with those actions, detecting when these tools are used to generate harmful content that violates your terms of service is really hard and really expensive," Harper said.

"(It) requires investment in trust and safety and a commitment to building with election integrity centred as a risk."

(U.Gruber--BBZ)