Berliner Boersenzeitung - Biden robocall: Audio deepfake fuels election disinformation fears


Biden robocall: Audio deepfake fuels election disinformation fears / Photo: Roberto SCHMIDT - AFP

The 2024 White House race faces the prospect of a firehose of AI-enabled disinformation, with a robocall impersonating US President Joe Biden already stoking particular alarm about audio deepfakes.

"What a bunch of malarkey," said the phone message, digitally spoofing Biden's voice and echoing one of his signature phrases.

The robocall urged New Hampshire residents not to cast ballots in the Democratic primary last month, prompting state authorities to launch a probe into possible voter suppression.

It also triggered demands from campaigners for stricter guardrails around generative artificial intelligence tools or an outright ban on robocalls.

Disinformation researchers fear rampant misuse of AI-powered applications in a pivotal election year thanks to proliferating voice cloning tools, which are cheap and easy to use and hard to trace.

"This is certainly the tip of the iceberg," Vijay Balasubramaniyan, chief executive and co-founder of cybersecurity firm Pindrop, told AFP.

"We can expect to see many more deepfakes throughout this election cycle."

A detailed analysis published by Pindrop said a text-to-speech system developed by the AI voice cloning startup ElevenLabs was used to create the Biden robocall.

The scandal comes as campaigners on both sides of the US political aisle harness advanced AI tools for effective campaign messaging and as tech investors pump millions of dollars into voice cloning startups.

Balasubramaniyan refused to say whether Pindrop had shared its findings with ElevenLabs, which last month announced a financing round from investors that, according to Bloomberg News, gave the firm a valuation of $1.1 billion.

ElevenLabs did not respond to repeated AFP requests for comment. Its website leads users to a free text-to-speech generator to "create natural AI voices instantly in any language."

Under its safety guidelines, the firm said users were allowed to generate voice clones of political figures such as Donald Trump without their permission if they "express humor or mockery" in a way that makes it "clear to the listener that what they are hearing is a parody, and not authentic content."

- 'Electoral chaos' -

US regulators have been considering making AI-generated robocalls illegal, with the fake Biden call giving the effort new impetus.

"The political deepfake moment is here," said Robert Weissman, president of the advocacy group Public Citizen.

"Policymakers must rush to put in place protections or we're facing electoral chaos. The New Hampshire deepfake is a reminder of the many ways that deepfakes can sow confusion."

Researchers worry about the impact of AI tools that create videos and text so seemingly real that voters could struggle to distinguish truth from fiction, undermining trust in the electoral process.

But audio deepfakes used to impersonate or smear celebrities and politicians around the world have sparked the most concern.

"Of all the surfaces -- video, image, audio -- that AI can be used for voter suppression, audio is the biggest vulnerability," Tim Harper, a senior policy analyst at the Center for Democracy & Technology, told AFP.

"It is easy to clone a voice using AI, and it is difficult to identify."

- 'Election integrity' -

The ease of creating and disseminating fake audio content complicates an already hyperpolarized political landscape, undermining confidence in the media and enabling anyone to claim that fact-based "evidence has been fabricated," Wasim Khaled, chief executive of Blackbird.AI, told AFP.

Such concerns are rife as the proliferation of AI audio tools outpaces detection software.

China's ByteDance, owner of the wildly popular platform TikTok, recently unveiled StreamVoice, an AI tool for real-time conversion of a user's voice to any desired alternative.

"Even though the attackers used ElevenLabs this time, it is likely to be a different generative AI system in future attacks," Balasubramaniyan said.

"It is imperative that there are enough safeguards available in these tools."

Balasubramaniyan and other researchers recommended building audio watermarks or digital signatures into tools as possible protections as well as regulation that makes them available only for verified users.

"Even with those actions, detecting when these tools are used to generate harmful content that violates your terms of service is really hard and really expensive," Harper said.

"(It) requires investment in trust and safety and a commitment to building with election integrity centred as a risk."

(U.Gruber--BBZ)