Berliner Boersenzeitung - The fight over a 'dangerous' ideology shaping AI debate


The fight over a 'dangerous' ideology shaping AI debate
Photo: WANG Zhao - AFP/File

Silicon Valley's favourite philosophy, longtermism, has helped to frame the debate on artificial intelligence around the idea of human extinction.

But increasingly vocal critics are warning that the philosophy is dangerous, and the obsession with extinction distracts from real problems associated with AI like data theft and biased algorithms.

Author Emile Torres, a former longtermist turned critic of the movement, told AFP that the philosophy rested on the kind of principles used in the past to justify mass murder and genocide.

Yet the movement and linked ideologies like transhumanism and effective altruism hold huge sway in universities from Oxford to Stanford and throughout the tech sector.

Venture capitalists like Peter Thiel and Marc Andreessen have invested in life-extension companies and other pet projects linked to the movement.

Elon Musk and OpenAI's Sam Altman have signed open letters warning that AI could make humanity extinct -- though they stand to benefit by arguing only their products can save us.

Ultimately, critics say, this fringe movement wields far too much influence in public debates over the future of humanity.

- 'Really dangerous' -

Longtermists believe we are duty-bound to try to produce the best outcomes for the greatest number of humans.

This is little different from the utilitarianism of 19th-century liberals, but longtermists have a much longer timeline in mind.

They look to the far future and see trillions upon trillions of humans floating through space, colonising new worlds.

They argue that we owe the same duty to each of these future humans as we do to anyone alive today.

And because there are so many of them, they carry much more weight than today's specimens.

This kind of thinking makes the ideology "really dangerous", said Torres, author of "Human Extinction: A History of the Science and Ethics of Annihilation".

"Any time you have a utopian vision of the future marked by near infinite amounts of value, and you combine that with a sort of utilitarian mode of moral thinking where the ends can justify the means, it's going to be dangerous," said Torres.

If a superintelligent machine with the potential to destroy humanity were about to spring to life, longtermists would be bound to oppose it no matter the consequences.

When asked in March by a user of Twitter, the platform now known as X, how many people could die to stop this happening, longtermist ideologue Eliezer Yudkowsky replied that there only needed to be enough people "to form a viable reproductive population".

"So long as that's true, there's still a chance of reaching the stars someday," he wrote, though he later deleted the message.

- Eugenics claims -

Longtermism grew out of work done by Swedish philosopher Nick Bostrom in the 1990s and 2000s around existential risk and transhumanism -- the idea that humans can be augmented by technology.

Academic Timnit Gebru has pointed out that transhumanism was linked to eugenics from the start.

British biologist Julian Huxley, who coined the term transhumanism, was also president of the British Eugenics Society in the 1950s and 1960s.

"Longtermism is eugenics under a different name," Gebru wrote on X last year.

Bostrom has long faced accusations of supporting eugenics after he listed as an existential risk "dysgenic pressures", essentially less-intelligent people procreating faster than their smarter peers.

The philosopher, who runs the Future of Humanity Institute at the University of Oxford, apologised in January after admitting he had written racist posts on an internet forum in the 1990s.

"Do I support eugenics? No, not as the term is commonly understood," he wrote in his apology, pointing out it had been used to justify "some of the most horrific atrocities of the last century".

- 'More sensational' -

Despite these troubles, longtermists like Yudkowsky, a high school dropout known for writing Harry Potter fan-fiction and promoting polyamory, continue to be feted.

Altman has credited him with getting OpenAI funded and suggested in February that he deserved a Nobel Peace Prize.

But Gebru, Torres and many others are trying to refocus on harms like theft of artists' work, bias and concentration of wealth in the hands of a few corporations.

Torres, who uses the pronoun they, said while there were true believers like Yudkowsky, much of the debate around extinction was motivated by profit.

"Talking about human extinction, about a genuine apocalyptic event in which everybody dies, is just so much more sensational and captivating than Kenyan workers getting paid $1.32 an hour, or artists and writers being exploited," they said.

(F.Schuster--BBZ)