September 12, 2019 01:00 am

Taylor Swift Reportedly Threatened To Sue Microsoft Over Racist Twitter Bot

When an artificially intelligent chatbot that used Twitter to learn how to talk unsurprisingly turned into a bigot bot, Taylor Swift reportedly threatened legal action because the bot's name was Tay. Microsoft would probably rather forget the experiment where Twitter trolls took advantage of the chatbot's programming and taught it to be racist in 2016, but a new book is sharing unreleased details that show Microsoft had more to worry about than just the bot's racist remarks. Digital Trends reports: Tay was a social media chatbot geared toward teens, first launched in China before adopting the three-letter moniker when moving to the U.S. The bot, however, was programmed to learn how to talk based on Twitter conversations. In less than a day, the automatic responses the chatbot tweeted had Tay siding with Hitler, promoting genocide, and just generally hating everybody. Microsoft immediately removed the account and apologized. When the bot was reprogrammed, Tay was relaunched as Zo. But in the book Tools and Weapons by Microsoft president Brad Smith and Carol Ann Browne, Microsoft's communications director, the executives have finally revealed why: another Tay, Taylor Swift. According to The Guardian, the singer's lawyer threatened legal action over the chatbot's name before the bot broke bad. The singer claimed the name violated both federal and state laws. Rather than get into a legal battle with the singer, Smith writes, the company instead started considering new names.

Read more of this story at Slashdot.


Original Link: http://rss.slashdot.org/~r/Slashdot/slashdot/~3/7-zbNaPkpMU/taylor-swift-reportedly-threatened-to-sue-microsoft-over-racist-twitter-bot


Slashdot

Slashdot was originally created in September 1997 by Rob "CmdrTaco" Malda. Today it is owned by Geeknet, Inc.
