Berliner Boersenzeitung - Angry Bing chatbot just mimicking humans, say experts

Angry Bing chatbot just mimicking humans, say experts

Photo: Jason Redmond - AFP

Microsoft's nascent Bing chatbot can turn testy or even threatening, likely because it essentially mimics what it learned from online conversations, analysts and academics said on Friday.

Tales of disturbing exchanges with the chatbot that have captured attention this week include the artificial intelligence (AI) issuing threats and expressing desires to steal nuclear codes, create a deadly virus, or be alive.

"I think this is basically mimicking conversations that it's seen online," said Graham Neubig, an associate professor at Carnegie Mellon University's language technologies institute.

"So once the conversation takes a turn, it's probably going to stick in that kind of angry state, or say 'I love you' and other things like this, because all of this is stuff that's been online before."

A chatbot, by design, serves up words it predicts are the most likely responses, without understanding meaning or context.

However, humans taking part in banter with programs naturally tend to read emotion and intent into what a chatbot says.

"Large language models have no concept of 'truth' -- they just know how to best complete a sentence in a way that's statistically probable based on their inputs and training set," programmer Simon Willison said in a blog post.

"So they make things up, and then state them with extreme confidence."

Laurent Daudet, co-founder of French AI company LightOn, theorized that the chatbot's seemingly rogue behavior stemmed from training on exchanges that themselves turned aggressive or inconsistent.

"Addressing this requires a lot of effort and a lot of human feedback, which is also the reason why we chose to restrict ourselves for now to business uses and not more conversational ones," Daudet told AFP.

- 'Off the rails' -

The Bing chatbot was designed by Microsoft and the start-up OpenAI, which has been causing a sensation since the November launch of ChatGPT, the headline-grabbing app capable of generating all sorts of written content in seconds on a simple request.

Since ChatGPT burst onto the scene, the technology behind it, known as generative AI, has been stirring up fascination and concern.

"The model at times tries to respond or reflect in the tone in which it is being asked to provide responses (and) that can lead to a style we didn't intend," Microsoft said in a blog post, noting the bot is a work in progress.

The Bing chatbot said in some shared exchanges that it had been code-named "Sydney" during development, and that it was given rules of behavior.

Those rules include "Sydney's responses should also be positive, interesting, entertaining and engaging," according to online posts.

Disturbing dialogues that combine steely threats and professions of love could be due to dueling directives to stay positive while mimicking what the AI mined from human exchanges, Willison theorized.

Chatbots seem to be more prone to disturbing or bizarre responses during lengthy conversations, losing a sense of where exchanges are going, eMarketer principal analyst Yoram Wurmser told AFP.

"They can really go off the rails," Wurmser said.

"It's very lifelike, because (the chatbot) is very good at sort of predicting next words that would make it seem like it has feelings or give it human like qualities; but it's still statistical outputs."

(T.Burkhard--BBZ)