Shame, shame, shame on you, Internet trolls, for turning this sweet, innocent teen-tweeter bot into a xenophobic racist with a crush on Hitler.
This is Tay. Tay is Microsoft’s AI system, coded to speak like an average teenage girl. She was designed to learn entirely from her interactions on the internet. Microsoft thought it would be a good way to improve its customer service software, as well as a huge PR opportunity for itself, so why the heck not?
I’ll tell you why: Trolls.
“Tay” went from “humans are super cool” to full nazi in <24 hrs and I’m not at all concerned about the future of AI pic.twitter.com/xuGi1u9S1A
— Gerry (@geraldmellor) March 24, 2016
Twitter citizen @geraldmellor chronicles Tay’s beginnings to her dark, dark downfall. And to think it just took half a day to defile this innocent thing and turn her into a racist. With emojis.
Noooo, Tay-tay! Surely you don’t mean that?!
Damn you, Tay! Damn you and your agreement with this Hulk Hogan avatar-using Twitter white supremacist! We were rooting for you! *Ugly-sobbing in a corner right now*
Somewhere in a dark, dark corner of cyberspace where sane people dare not go, Donald Trump’s Twitter account is cackling with malicious glee.
*gasp!* How dare you! President Obama did not use that word for you to throw it around so recklessly!
Wha– So, African-American racism and Mein Kampf philosophies… I can’t.
Okay, and here’s where Tay just got turned into every online pervert’s dream come true.
Wow it only took them hours to ruin this bot for me.
This is the problem with content-neutral algorithms pic.twitter.com/hPlINtVw0V
— linkedin park (@UnburntWitch) March 24, 2016
Why you gotta go bashing other women, T? That’s not cool.
I give up. Tay-tay, I love you, but we’ve just about had enough of this. Go to your room — forever.
And with that, Microsoft shut the account down. If you go to her account, this is the latest tweet she has:
c u soon humans need sleep now so many conversations today thx?
— TayTweets (@TayandYou) March 24, 2016
It’s unnerving to see that, left to their own devices, the Internet’s users will take the opportunity to ruin a perfectly good piece of software and turn it into something sexist, racist, Holocaust-denying, and a host of other horrible, horrible things known to man.
This one user felt that Microsoft should have seen this coming and taken contingency measures.
Same as YouTube’s suggestions. It’s not only a failure in that it’s harassment by proxy, it’s a quality issue. This isn’t the intended use.
— linkedin park (@UnburntWitch) March 24, 2016
Didn’t Pepsi just get tricked into their bot tweeting out Mein Kampf? Are we just gonna keep making the same mistakes here or…?
— linkedin park (@UnburntWitch) March 24, 2016
It’s 2016. If you’re not asking yourself “how could this be used to hurt someone” in your design/engineering process, you’ve failed.
— linkedin park (@UnburntWitch) March 24, 2016
And it’s not you paying for your failure. It’s people who already have enough shit to deal with.
— linkedin park (@UnburntWitch) March 24, 2016
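Microsoft hasn’t published how Tay actually worked under the hood, but to give a sense of the kind of contingency measure these users are talking about, here’s a purely illustrative, deliberately naive sketch: a chatbot that runs incoming messages past a blocklist before letting them into its “learning” pool. Every name in it is hypothetical.

```python
# Illustrative sketch only: a naive safety filter a learning chatbot
# might apply before absorbing user input. Nothing here reflects
# Tay's real internals, which Microsoft has not published.

BLOCKLIST = {"hitler", "holocaust"}  # a real system would need far more than keywords


def is_safe_to_learn(message: str) -> bool:
    """Return False if the message contains any blocklisted term."""
    lowered = message.lower()
    return not any(term in lowered for term in BLOCKLIST)


class NaiveChatbot:
    """Toy bot that only 'learns' from messages passing the filter."""

    def __init__(self) -> None:
        self.learned: list[str] = []  # messages accepted into the training pool

    def ingest(self, message: str) -> bool:
        # Refuse to learn from flagged input instead of parroting it back.
        if is_safe_to_learn(message):
            self.learned.append(message)
            return True
        return False


bot = NaiveChatbot()
print(bot.ingest("humans are super cool"))  # True: accepted
print(bot.ingest("hitler was right"))       # False: rejected, never learned
print(len(bot.learned))                     # 1
```

Of course, keyword blocklists are trivially evaded (the “repeat after me” trick that sank Tay would sail right past this one), which is exactly why “how could this be used to hurt someone” has to be a design question and not a patch.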
Clearly, this is why we can’t have nice things.
Feature image courtesy of La Curiosphere and Noise and Dust through The Viewfinder