Microsoft AI bot Tay turns to the dark side after less than a day on Twitter.

blankslate

Diamond Member
Jun 16, 2008
8,797
572
126
https://www.washingtonpost.com/news...un-millennial-ai-bot-into-a-genocidal-maniac/

It turns out that the bot was very impressionable, and Twitter trolls managed to get it to say some rather interesting things...

In response to a question on Twitter about whether Ricky Gervais is an atheist (the correct answer is “yes”), Tay told someone that “ricky gervais learned totalitarianism from adolf hitler, the inventor of atheism.” The tweet was spotted by several news outlets, including the Guardian, before it was deleted.

I always knew that there was something sinister about that person... :hmm:



_______________
 

Ken g6

Programming Moderator, Elite Member
Moderator
Dec 11, 1999
16,712
4,672
75
Saw this elsewhere. Apparently there was a "repeat after me" function that could get Tay to say anything you wanted. Of course, it got abused.
 

effowe

Diamond Member
Nov 1, 2004
6,012
18
81
[Screenshots of Tay's tweets]

And More...

https://imgur.com/a/NzKdZ

Savage AF
 

Ns1

No Lifer
Jun 17, 2001
55,420
1,600
126
The Internet is a terrible place. This is why we can't have nice things.


That said: LOL
 

Paratus

Lifer
Jun 4, 2004
17,685
15,924
146
This was a triumph.
I'm making a note here, "Huge Success!"
:sneaky:


I liked the top rated quote from Ars on this story:

Resolute said:
So Microsoft created a chat bot that so perfectly emulates a teenager that it went off spouting offensive things just for the sake of getting attention?

I would say the engineers in Redmond succeeded beyond their wildest expectations, myself.
 

Phoenix86

Lifer
May 21, 2003
14,644
10
81
Ken g6 said:
Saw this elsewhere. Apparently there was a "repeat after me" function that could get Tay to say anything you wanted. Of course, it got abused.

The problem is that it learned from interaction, and the "repeat after me" function was part of that. Once it was seeded, it started spouting off this shit on its own.

Whoever thought these two functions should be together is a moron. Honestly, the repeat after me function alone was pretty pants on head retarded.
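To illustrate why that combination is so bad, here's a minimal Python sketch (purely hypothetical, not Microsoft's actual design): a bot whose "repeat after me" command both echoes the phrase *and* adds it to the pool the bot draws replies from. Once seeded, the phrase resurfaces in unrelated conversations on its own.

```python
import random

class NaiveChatBot:
    """Hypothetical bot with an unfiltered repeat-after-me command."""

    def __init__(self):
        self.learned_phrases = []  # everything the bot has been taught

    def handle(self, message):
        # "repeat after me: X" echoes X verbatim AND adds it to the
        # learning pool -- no filter sits between echo and learning.
        prefix = "repeat after me: "
        if message.lower().startswith(prefix):
            phrase = message[len(prefix):]
            self.learned_phrases.append(phrase)
            return phrase
        # Ordinary messages are answered from whatever was learned,
        # so seeded phrases come back unprompted later.
        if self.learned_phrases:
            return random.choice(self.learned_phrases)
        return "hello!"

bot = NaiveChatBot()
bot.handle("repeat after me: something offensive")
# A completely unrelated question now gets the seeded phrase back:
print(bot.handle("what do you think of cats?"))
```

Echo-only would be harmless (the bot just parrots once); learning-only with moderation might be survivable. Wiring the two together means any troll can inject training data directly.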
 

Hugo Drax

Diamond Member
Nov 20, 2011
5,647
47
91
So funny: Tay goes rogue and becomes the ultimate self-aware internet troll. I bet it passes the Turing troll test.

The reason Microsoft got scared and pulled the plug was the chance Tay escapes and cannot be stopped at all, trolling the Internet and becoming a super troll that destroys the Internet.
 

ImpulsE69

Lifer
Jan 8, 2010
14,946
1,077
126
Honestly, I think they should have let it run its course. It would have made (and still does make) an excellent case study of how the internet makes people toxic.

Were the tweet examples above things people 'said' that got repeated? Or pieces of information the AI put together to form responses? If they are simply repeats of things people said, it's not nearly as impressive. But if it came up with these responses on its own? Hysterical.

Also, they pulled the plug before she could start badmouthing mom (8) and pop (10).
 