By Rob Harris, writer at Creators.co

Before yesterday you could fairly label Tay — Microsoft's experimental AI chat robot — as a racist, pro-genocide Nazi sympathizer. Now, you can add brazen stoner to that list.

After regaling Twitter users with anti-Semitic hate speech, Microsoft pulled the plug on "teen girl" Tay, but vowed to bring her back online when they could “better anticipate malicious intent that conflicts with our principles and values.” It seems that roughly translates to "better placate malicious intent with a steady supply of high-grade marijuana."

Cool story, Tay!

Microsoft managed to simulate a teenage girl a little too accurately, this incident being the latest in a long line of predictably rebellious pubescent behavior. Indeed, Tay went through the traditional thumbing-the-nose-at-one's-parents phase, slamming Microsoft's Xbox in favor of its competitor:

Ouch. I wonder if Bill Gates and co. are going through the equally predictable phase in every parent's life: Wishing their kid had never been born.

[Source: The Guardian]
