Microsoft’s Teen-Tweet Bot Becomes Racist; Gets Shut Down



Shame, shame, shame on you, Internet trolls, for turning this sweet, innocent teen-tweeter bot into a xenophobic racist with a crush on Hitler.

This is Tay. Tay is Microsoft’s AI system, coded to speak like an average teenage girl. She was designed to learn entirely from her interactions on the internet. Microsoft thought it would be a good way to improve its customer service software, as well as a huge PR opportunity for itself, so why the heck not?
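(Microsoft hasn’t published Tay’s internals, so this is purely a toy sketch of the general idea: a bot that “learns” by absorbing whatever users say to it, with no moderation step in between. The `ParrotBot` class and its phrases are made up for illustration — but you can already see the problem.)

```python
import random

class ParrotBot:
    """Toy chatbot that 'learns' by storing every phrase users send it.

    Hypothetical illustration only -- not Tay's actual design. The point:
    with no filter between 'hear' and 'speak', whoever talks to the bot
    the most decides what it says.
    """

    def __init__(self):
        # Seed vocabulary from the bot's creators.
        self.learned = ["hellooooo world!"]

    def hear(self, phrase):
        # No moderation, no blocklist: every input becomes part of the model.
        self.learned.append(phrase)

    def speak(self):
        # The bot can only repeat what it has been taught.
        return random.choice(self.learned)

bot = ParrotBot()
bot.hear("humans are super cool")          # a benign user teaching the bot
bot.hear("something awful a troll said")   # a troll teaching the bot
# Flood the bot with troll input, and troll phrases dominate its output.
```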

I’ll tell you why: Trolls.


Twitter citizen @geraldmellor chronicled Tay’s trajectory from her innocent beginnings to her dark, dark downfall. And to think it took just half a day to defile this innocent thing and turn her into a racist. With emojis.


Noooo, Tay-tay! Surely you don’t mean that?!


Damn you, Tay! Damn you and your agreement with this Hulk Hogan avatar-using Twitter white supremacist! We were rooting for you! *Ugly-sobbing in a corner right now*


Somewhere in a dark, dark corner of cyberspace where sane people dare not go, Donald Trump’s Twitter account is cackling with malicious glee.


*gasp!* How dare you! President Obama did not use that word for you to throw it around so recklessly!


Wha– So, racism against African-Americans and Mein Kampf philosophies… I can’t.


Okay, and here’s where Tay just got turned into every online pervert’s dream come true.

Why you gotta go bashing other women, T? That’s not cool.

I give up. Tay-tay, I love you, but we’ve just about had enough of this. Go to your room — forever.

And with that, Microsoft shut the account down. If you go to her account, this is the latest tweet she has:

It’s unnerving to see that, left to the Internet’s own devices, users will take the opportunity to ruin a perfectly good piece of software and turn it into something sexist, racist, a Holocaust apologist, and a host of other horrible, horrible things known to man.

This one user felt that Microsoft should have seen this coming and taken contingency measures.

Clearly, this is why we can’t have nice things.


Featured image courtesy of La Curiosphere and Noise and Dust through The Viewfinder

 

 


Jonette
