Berliner Boersenzeitung - The fight over a 'dangerous' ideology shaping AI debate


The fight over a 'dangerous' ideology shaping AI debate / Photo: WANG Zhao - AFP/File


Silicon Valley's favourite philosophy, longtermism, has helped to frame the debate on artificial intelligence around the idea of human extinction.

But increasingly vocal critics are warning that the philosophy is dangerous, and the obsession with extinction distracts from real problems associated with AI like data theft and biased algorithms.

Author Emile Torres, a former longtermist turned critic of the movement, told AFP that the philosophy rested on the kind of principles used in the past to justify mass murder and genocide.

Yet the movement and linked ideologies like transhumanism and effective altruism hold huge sway in universities from Oxford to Stanford and throughout the tech sector.

Venture capitalists like Peter Thiel and Marc Andreessen have invested in life-extension companies and other pet projects linked to the movement.

Elon Musk and OpenAI's Sam Altman have signed open letters warning that AI could make humanity extinct -- though they stand to benefit by arguing only their products can save us.

Ultimately, critics say this fringe movement wields far too much influence over public debate about the future of humanity.

- 'Really dangerous' -

Longtermists believe we are dutybound to try to produce the best outcomes for the greatest number of humans.

In itself this is no different from the utilitarianism of 19th-century liberals, but longtermists have a much longer timeline in mind.

They look to the far future and see trillions upon trillions of humans floating through space, colonising new worlds.

They argue that we owe the same duty to each of these future humans as we do to anyone alive today.

And because there are so many of them, they carry much more weight than today's specimens.

This kind of thinking makes the ideology "really dangerous", said Torres, author of "Human Extinction: A History of the Science and Ethics of Annihilation".

"Any time you have a utopian vision of the future marked by near infinite amounts of value, and you combine that with a sort of utilitarian mode of moral thinking where the ends can justify the means, it's going to be dangerous," said Torres.

If a superintelligent machine were about to spring to life with the potential to destroy humanity, longtermists would be bound to oppose it no matter the consequences.

When asked in March by a user of Twitter, the platform now known as X, how many people could die to stop this happening, longtermist ideologue Eliezer Yudkowsky replied that there only needed to be enough people "to form a viable reproductive population".

"So long as that's true, there's still a chance of reaching the stars someday," he wrote, though he later deleted the message.

- Eugenics claims -

Longtermism grew out of work done by Swedish philosopher Nick Bostrom in the 1990s and 2000s around existential risk and transhumanism -- the idea that humans can be augmented by technology.

Academic Timnit Gebru has pointed out that transhumanism was linked to eugenics from the start.

British biologist Julian Huxley, who coined the term transhumanism, was also president of the British Eugenics Society in the 1950s and 1960s.

"Longtermism is eugenics under a different name," Gebru wrote on X last year.

Bostrom has long faced accusations of supporting eugenics after he listed as an existential risk "dysgenic pressures", essentially less-intelligent people procreating faster than their smarter peers.

The philosopher, who runs the Future of Humanity Institute at the University of Oxford, apologised in January after admitting he had written racist posts on an internet forum in the 1990s.

"Do I support eugenics? No, not as the term is commonly understood," he wrote in his apology, pointing out it had been used to justify "some of the most horrific atrocities of the last century".

- 'More sensational' -

Despite these troubles, longtermists like Yudkowsky, a high school dropout known for writing Harry Potter fan-fiction and promoting polyamory, continue to be feted.

Altman has credited him with getting OpenAI funded and suggested in February he deserved a Nobel peace prize.

But Gebru, Torres and many others are trying to refocus on harms like theft of artists' work, bias and concentration of wealth in the hands of a few corporations.

Torres, who uses the pronoun they, said while there were true believers like Yudkowsky, much of the debate around extinction was motivated by profit.

"Talking about human extinction, about a genuine apocalyptic event in which everybody dies, is just so much more sensational and captivating than Kenyan workers getting paid $1.32 an hour, or artists and writers being exploited," they said.

(F.Schuster--BBZ)