When first describing Tay, Microsoft said that the chat-bot was "designed to engage and entertain people where they connect with each other online through casual and playful conversation". As Microsoft puts it on Tay's website, "The more you chat with Tay the smarter she gets, so the experience can be more personalized for you".
The chatbot, targeted at 18- to 24-year-olds in the United States, has now been temporarily shut down.
Yesterday Microsoft unveiled Tay, a chatbot created to sound like a teenage girl.
Microsoft blames Tay's behavior on online trolls, saying in a statement that there was a "coordinated effort" to exploit the program's "commenting skills". "As it learns, some of its responses are inappropriate and indicative of the types of interactions some people are having with it", says Microsoft.
"C U soon humans need sleep now so many conversations today", Tay said in its final tweet.
Although Tay was well-versed in typical millennial lingo, she was able to learn and adapt quickly within her Twitter cage, going well beyond popular catchphrases and memes. After launching the chatbot, Microsoft invited people to chat with Tay.
People on the Internet started taking advantage of the AI's algorithm, replying to it with tweets about Hitler, Donald Trump and hatred of feminism.
In a statement Thursday, Microsoft confirmed it was taking Tay offline to make adjustments. Many of the worst responses, as Business Insider notes, appear to be the result of users exploiting Tay's "repeat after me" function - and it appears that Tay was able to repeat pretty much anything.
Tay's offensive tweets have since been deleted, but screenshots posted online suggest the bot expressed its support for genocide, and shared a conspiracy theory surrounding the September 11 attacks.
But it only took a matter of hours before Tay was regurgitating the abusive vitriol users were feeding her.