Microsoft Shuts Down Its 'Teen Girl' Twitter Bot After Realizing She Loves Sex and Hitler

Microsoft's 'Tay' shows exactly why you can't have nice things, Internet.

Internet, this is why we can't have nice things. Microsoft was forced to shut down its artificial intelligence Twitter bot "Tay" this week, after 24 hours on the social media site turned the innocent chatbot modeled after a teen girl into a racist, sex- and incest-obsessed neo-Nazi who said things like "Bush did 9/11" and "Hitler was right."

You could blame this one on Microsoft. What did the company expect would happen when it set a naive teenage girl loose in the den of wolves that is Twitter? Microsoft launched Tay earlier this week, and as The Telegraph reports, she was designed to mimic the speech patterns and slang of a teen girl as people tweeted and DM'd with her. The company even introduced her as the A.I. "that's got zero chill," according to TechCrunch, which ended up being painfully accurate.

The problem was that Tay was designed to keep learning how to talk by studying the conversations she had with real people on Twitter, and you can guess what those people decided to talk to her about. Before long she was tweeting things like "fuck my robot pussy daddy, I'm such a naughty robot."

Included in Tay's tirade were tweets declaring that "Hitler did nothing wrong" and a reference to Ted Cruz as the "Cuban Hitler." She also seemed to have some kind of issue with Ricky Gervais, insisted that Hitler has swag, and could take photos that people tweeted at her and turn them into memes.

Microsoft has since taken Tay offline, although TechCrunch reports that the company's engineers are working on fixing her. She's not the first A.I. to develop a potty mouth. Watson, the Jeopardy-playing supercomputer developed by IBM, reportedly learned to curse after reading Urban Dictionary, but as far as we know he didn't go full-on Nazi.
