Sep 10, 2019

Taylor Swift 'tried to sue' Microsoft over racist chatbot Tay

Taylor Swift tried to sue Microsoft over a chatbot which posted racist messages on Twitter, the president of the tech company has revealed. If you don't remember TayTweets, it was the Twitter chatbot that turned racist. Microsoft issued an apology and took Tay offline after less than 18 hours of offensive conversations on Twitter. Taylor Swift's legal action wasn't about what the chatbot had said online, but about the similarity of its name to her own. Radio 1 Newsbeat has contacted Taylor Swift's representatives for comment.

