Jon Russell

Nearly a week after being silenced because the internet taught her to be racist, Microsoft’s artificial intelligence bot “Tay” briefly returned to Twitter today, whereon she went on a spam tirade and then quickly fell silent again.


2 thoughts on “Jon Russell”

  1. shinichi Post author

    Microsoft AI bot Tay returns to Twitter, goes on spam tirade, then back to sleep

    by Jon Russell

    TechCrunch

    Aol Tech

    http://techcrunch.com/2016/03/30/you-are-too-fast-please-take-a-rest/?ncid=rss

    **

    Tay was created by the Microsoft Technology and Research and Bing teams in an effort to conduct research on conversational understanding, and was billed as being capable of learning from interactions with people. The internet being the internet, Tay was schooled on a lot of inappropriate stuff, to the point that Microsoft thought it best to put her to sleep for a while so it could make “adjustments.”

    Well, post-lobotomy Tay seemed to have trouble keeping pace with herself. The Twitter account filled the timelines of her 215,000 followers with as many as seven tweets per second during a 10-minute spell. In doing so, some of her messages may have created a new Twitter meme: “You are too fast, please take a rest…”

    There’s a credible suggestion that the account may have been hacked. We asked Microsoft for more details but a spokesperson declined to comment beyond a recent blog post.

    **

    Microsoft seemingly saw this outburst and promptly quietened Tay once again, setting the AI’s Twitter account to private — meaning it can’t gain new followers unless they are approved. That also prevents us from embedding the tweets, but here are some additional screenshots.

    This feels like how the AI apocalypse starts…

  2. shinichi Post author

    Microsoft accidentally revives Nazi AI chatbot Tay, then kills it again

    A week after Tay’s first disaster, the bot briefly came back to life today.

    by Jon Brodkin

    http://arstechnica.com/information-technology/2016/03/microsoft-accidentally-revives-nazi-ai-chatbot-tay-then-kills-it-again/

    Microsoft today accidentally re-activated “Tay,” its Hitler-loving Twitter chatbot, only to be forced to kill her off for the second time in a week.

    Tay “went on a spam tirade and then quickly fell silent again,” TechCrunch reported this morning. “Most of the new messages from the millennial-mimicking character simply read ‘you are too fast, please take a rest,'” according to the Financial Times. “But other tweets included swear words and apparently apologetic phrases such as ‘I blame it on the alcohol.'”

    The new tirade reportedly began around 3 a.m. ET. Tay’s account, with 95,100 tweets and 213,000 followers, is now marked private. “Tay remains offline while we make adjustments,” Microsoft told several media outlets today. “As part of testing, she was inadvertently activated on Twitter for a brief period of time.”

    Microsoft designed Tay to be an artificial intelligence bot in the persona of a young adult on Twitter. But the company failed to prevent Tay from tweeting offensive things in response to real humans. Tay’s first spell on Twitter lasted less than 24 hours before she “started tweeting abuse at people and went full neo-Nazi, declaring that ‘Hitler was right I hate the jews,'” as we reported last week. Microsoft quickly turned her off.

    Some of the problems came because of a “repeat after me” feature, in which Tay repeated anything people told her to repeat. But the problems went beyond that. When one person asked Tay, “is Ricky Gervais an atheist?” the bot responded, “ricky gervais learned totalitarianism from adolf hitler, the inventor of atheism.”
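    The exploit is easy to picture. Microsoft has not published Tay’s code, so the Python sketch below is purely illustrative (every name in it is hypothetical), but it shows why an echo command that parrots user input verbatim hands control of the bot’s voice to whoever talks to it:

        # Hypothetical sketch of a naive "repeat after me" handler.
        # This is NOT Tay's actual code; it only illustrates the failure mode.

        BLOCKLIST = {"hitler", "nazi"}  # a real filter would need to be far broader

        def reply(message: str, filtered: bool = True) -> str:
            """Return the bot's reply to one incoming message."""
            prefix = "repeat after me:"
            text = message.strip()
            if text.lower().startswith(prefix):
                payload = text[len(prefix):].strip()
                if filtered and any(word in payload.lower() for word in BLOCKLIST):
                    return "i'd rather not repeat that."
                # The naive path: parrot the payload verbatim, so any user
                # can put any words in the bot's mouth.
                return payload
            return "tell me more!"  # stand-in for the real conversational model

        print(reply("repeat after me: i love my followers"))              # echoed
        print(reply("repeat after me: hitler was right"))                 # blocked
        print(reply("repeat after me: hitler was right", filtered=False)) # naive echo

    Even in this toy version, the blocklist is trivially evaded with misspellings, which hints at why a simple echo feature was so hard to make safe.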

    Microsoft apologized in a blog post on Friday, saying that “Tay is now offline and we’ll look to bring Tay back only when we are confident we can better anticipate malicious intent that conflicts with our principles and values.”

