Berliner Boersenzeitung - Will AI really destroy humanity?


Will AI really destroy humanity? / Photo: JEWEL SAMAD - AFP/File


The warnings are coming from all angles: artificial intelligence poses an existential risk to humanity and must be shackled before it is too late.


But what are these disaster scenarios and how are machines supposed to wipe out humanity?

- Paperclips of doom -

Most disaster scenarios start in the same place: machines will outstrip human capacities, escape human control and refuse to be switched off.

"Once we have machines that have a self-preservation goal, we are in trouble," AI academic Yoshua Bengio told an event this month.

But because these machines do not yet exist, imagining how they could doom humanity is often left to philosophy and science fiction.

Philosopher Nick Bostrom has written about an "intelligence explosion" he says will happen when superintelligent machines begin designing machines of their own.

He illustrated the idea with the story of a superintelligent AI at a paperclip factory.

The AI is given the ultimate goal of maximising paperclip output and so "proceeds by converting first the Earth and then increasingly large chunks of the observable universe into paperclips".

Bostrom's ideas have been dismissed by many as science fiction, not least because he has separately argued that humanity is a computer simulation and supported theories close to eugenics.

He also recently apologised after a racist message he sent in the 1990s was unearthed.

Yet his thoughts on AI have been hugely influential, inspiring both Elon Musk and Professor Stephen Hawking.

- The Terminator -

If superintelligent machines are to destroy humanity, they surely need a physical form.

Arnold Schwarzenegger's red-eyed cyborg, sent from the future to end human resistance by an AI in the movie "The Terminator", has proved a seductive image, particularly for the media.

But experts have rubbished the idea.

"This science fiction concept is unlikely to become a reality in the coming decades if ever at all," the Stop Killer Robots campaign group wrote in a 2021 report.

However, the group has warned that giving machines the power to make decisions on life and death is an existential risk.

Robot expert Kerstin Dautenhahn, from the University of Waterloo in Canada, played down those fears.

She told AFP that AI was unlikely to give machines higher reasoning capabilities or imbue them with a desire to kill all humans.

"Robots are not evil," she said, although she conceded programmers could make them do evil things.

- Deadlier chemicals -

A less overtly sci-fi scenario sees "bad actors" using AI to create toxins or new viruses and unleashing them on the world.

Large language models like GPT-3, which was used to create ChatGPT, turn out to be extremely good at inventing horrific new chemical agents.

A group of scientists who were using AI to help discover new drugs ran an experiment where they tweaked their AI to search for harmful molecules instead.

They managed to generate 40,000 potentially poisonous agents in less than six hours, as reported in the Nature Machine Intelligence journal.

AI expert Joanna Bryson from the Hertie School in Berlin said she could imagine someone working out a way of spreading a poison like anthrax more quickly.

"But it's not an existential threat," she told AFP. "It's just a horrible, awful weapon."

- Species overtaken -

The rules of Hollywood dictate that epochal disasters must be sudden, immense and dramatic -- but what if humanity's end was slow, quiet and not definitive?

"At the bleakest end our species might come to an end with no successor," philosopher Huw Price says in a promotional video for Cambridge University's Centre for the Study of Existential Risk.

But he said there were "less bleak possibilities" where humans augmented by advanced technology could survive.

"The purely biological species eventually comes to an end, in that there are no humans around who don't have access to this enabling technology," he said.

The imagined apocalypse is often framed in evolutionary terms.

Stephen Hawking argued in 2014 that ultimately our species will no longer be able to compete with AI machines, telling the BBC it could "spell the end of the human race".

Geoffrey Hinton, who spent his career building machines that resemble the human brain, latterly for Google, talks in similar terms of "superintelligences" simply overtaking humans.

He told US broadcaster PBS recently that it was possible "humanity is just a passing phase in the evolution of intelligence".

(Y.Berger--BBZ)