jbluhm86 Posted March 25, 2016

http://www.slate.com/blogs/business_insider/2016/03/24/microsoft_s_new_ai_chatbot_tay_removed_from_twitter_due_to_racist_tweets.html

Microsoft Took Its New A.I. Chatbot Offline After It Started Spewing Racist Tweets

Microsoft's new A.I. chatbot went off the rails Wednesday, posting a deluge of incredibly racist messages in response to questions. The tech company introduced “Tay” this week—a bot that responds to users' queries and emulates the casual, jokey speech patterns of a stereotypical millennial. The aim was to “experiment with and conduct research on conversational understanding,” with Tay able to learn from her conversations and get progressively “smarter.”

But Tay proved a smash hit with racists, trolls, and online troublemakers, who persuaded Tay to blithely use racial slurs, defend white-supremacist propaganda, and even outright call for genocide. Microsoft has now taken Tay offline for “upgrades,” and it is deleting some of the worst tweets—though many still remain.

It's important to note that Tay's racism is not a product of Microsoft or of Tay itself. Tay is simply a piece of software that is trying to learn how humans talk in a conversation. Tay doesn't even know it exists or what racism is. The reason it spouted garbage is that racist humans on Twitter quickly spotted a vulnerability—that Tay didn't understand what it was talking about—and exploited it. Nonetheless, it is hugely embarrassing for the company.

In one highly publicised tweet, which has since been deleted, Tay said: “bush did 9/11 and Hitler would have done a better job than the monkey we have now. donald trump is the only hope we've got.” In another, responding to a question, she said, “ricky gervais learned totalitarianism from adolf hitler, the inventor of atheism.” Many extremely inflammatory tweets remain online as of writing. Here's Tay denying the existence of the Holocaust. And here's the bot calling for genocide.
(Note: In some—but not all—instances, people managed to have Tay say offensive comments by asking her to repeat them. This appears to be what happened here.) It's clear that Microsoft's developers didn't include any filters on what words Tay could or could not use. Tay also expressed agreement with the “Fourteen Words”—an infamous white-supremacist slogan.

In an emailed statement, a Microsoft representative said the company was making “adjustments” to the bot: “The AI chatbot Tay is a machine learning project, designed for human engagement. As it learns, some of its responses are inappropriate and indicative of the types of interactions some people are having with it. We're making some adjustments to Tay.”
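The article's point about missing word filters is a concrete one: even a crude blocklist check on outgoing messages would have caught much of the “repeat after me” abuse. A minimal sketch of such a filter, purely illustrative (the function names and blocklist are assumptions, not anything from Microsoft's actual code):

```python
# Hypothetical output filter of the kind the article says Tay lacked.
# The blocklist and function names are illustrative; a real list would
# be far longer and maintained by a moderation team.

BLOCKLIST = {"genocide", "hitler"}  # placeholder terms

def is_safe(message: str) -> bool:
    """Return False if the outgoing message contains a blocked term."""
    words = message.lower().split()
    return not any(word.strip(".,!?") in BLOCKLIST for word in words)

def send_reply(message: str) -> str:
    # Refuse to emit anything that trips the filter.
    if not is_safe(message):
        return "I'd rather not repeat that."
    return message

print(send_reply("hello there"))           # passes the filter unchanged
print(send_reply("calling for genocide"))  # blocked
```

A keyword blocklist is the bluntest possible tool and easy to evade with misspellings, but it sets a floor; the notable thing is that, per the article, even this floor was apparently absent.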
Axe Posted March 25, 2016 I'm sorry... I LOL'ed. Does that mean I'm going to hell? Or just Purgatory, where I'll be forced to use Windows 10 until I repent?
MLD Woody Posted March 25, 2016 Hahahaha, I was going to post the same thing. This is amazing. Microsoft completely underestimated the internet. I love that they turned it into a Trump supporter
Clevfan4life Posted March 25, 2016 oh my god. oh my fucking god. Can someone explain how the hell an AI came up with that shit on its own? I'm fucking dying, that is some of the funniest shit I've ever read. The ngrs can't do taxes line had me ducking under my desk. Like nobody could know what I'm laughing at but I'm still ducking under the desk like I'm hiding from God's lightning bolts or something. Dear fucking lord that went bad quick. That fucking company steps in some shit almost bimonthly.
bbedward Posted March 25, 2016 oh my god. oh my fucking god. Can someone explain how the hell an AI came up with that shit on its own? I'm fucking dying, that is some of the funniest shit I've ever read. The ngrs can't do taxes line had me ducking under my desk. Like nobody could know what I'm laughing at but I'm still ducking under the desk like I'm hiding from God's lightning bolts or something. Dear fucking lord that went bad quick. That fucking company steps in some shit almost bimonthly. It supposedly "learns the more you talk to it", it just learned how to be a racist white supremacist somehow lol
Clevfan4life Posted March 25, 2016 yeah "somehow".....like they couldn't see that coming lmfao. No exaggeration you know someone at MS had to sprint...not walk not jog not hastily make their way....full max sprint down a hallway yelling turn it off turn it off oh god we're all fired turn it off.
Clevfan4life Posted March 25, 2016 Those poor people at MS PR, they must be absolutely raging behind closed doors right now. I wonder if they even had to roust old Bill. Certainly Nadella got an early call this morning.....ummm our cute 19 year old AI is now making Archie Bunker come off like Gandhi. I can only imagine my own building level of rage if I was their CEO reading those tweets...each one worse than the next. Who thought that repeat after me game wasn't going to turn out all bad? I would fire the whole dept, in person.....I'd even help them box all their shit up.
Clevfan4life Posted March 25, 2016 This is hilarious. Beyond hilarious. There might not be a proper English word for how funny this is.
Clevfan4life Posted March 25, 2016 If it was a person she would be the most amazing racist on the planet.
MLD Woody Posted March 25, 2016 Well, it started off perfectly normal (originally based off of conversations across the web on Twitter, Facebook, etc). Then, it was supposed to learn as more and more people talked with it on Twitter. You could also say "Tay, repeat after me..." and then say something racist/sexist/whatever, and that probably sped the process up. But all this needs to do is get to 4chan or whatever and they'll run with it and break it haha
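The failure mode described above — a verbatim echo command combined with unfiltered learning from user input — can be sketched as a toy loop. This is entirely hypothetical (Tay's real architecture was never published); the class and method names are made up for illustration:

```python
# Toy model of a chatbot that both echoes "repeat after me" commands
# and stores everything users say for later reuse -- the combination
# the thread describes. Entirely hypothetical; not Tay's actual design.
import random

class NaiveChatbot:
    def __init__(self):
        self.learned = ["hello!", "what's up?"]  # seed responses

    def respond(self, user_message: str) -> str:
        prefix = "tay, repeat after me: "
        if user_message.lower().startswith(prefix):
            # Echo the payload verbatim -- the exploited feature.
            return user_message[len(prefix):]
        # "Learning": store raw user input, unfiltered, for later reuse.
        self.learned.append(user_message)
        return random.choice(self.learned)

bot = NaiveChatbot()
print(bot.respond("Tay, repeat after me: something awful"))
# -> "something awful": with no filtering, a single message comes
# straight back out, and ordinary messages also pollute the pool
# that future replies are drawn from.
```

The two vulnerabilities compound: the echo command gives trolls immediate output, while the unfiltered learning pool means even deleted provocations keep resurfacing in later replies — which matches the coordinated-abuse pattern the thread jokes about.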
Clevfan4life Posted March 25, 2016 4chan is the downfall of humanity, I'm convinced. Like if you log into 4chan you should be investigated as having possible links to sex rings or whatnot. 4chan will be remembered as why we couldn't have nice things.
VaporTrail Posted March 25, 2016 lol, 4chan got a hold of the Patriots' contest for their millionth follower
bbedward Posted March 25, 2016 4chan is the downfall of humanity, I'm convinced. Like if you log into 4chan you should be investigated as having possible links to sex rings or whatnot. 4chan will be remembered as why we couldn't have nice things. I don't go on 4chan but the massive trolling efforts they do have brought me much joy.
Clevfan4life Posted March 26, 2016 I don't go on 4chan but the massive trolling efforts they do have brought me much joy. probably couldn't argue with that.
Clevfan4life Posted March 26, 2016 This troll was beautiful. Truly a masterpiece. They turned what was supposed to be an AI modeled after a 19-year-old chick into something that slaps Archie Bunker out of his recliner like a silly bitch and then flips it over on him for good measure.
Clevfan4life Posted March 26, 2016 No joke I was literally half ducking under my desk at work today while trying not to make too much sound. I was in awe of what I was reading.
Clevfan4life Posted March 27, 2016 When is Tay coming back online, anybody know? I've never signed up for Twitter nor had any interest, but........well........this has me interested.
calfoxwc Posted March 27, 2016 sometime in June, I hear. And only in a community in the southern section of the city of San Diego for a 6 month beta test.
This topic is now archived and is closed to further replies.