Taylor Swift ‘tried to sue’ Microsoft over racist chatbot Tay

Taylor Swift tried to sue Microsoft over a chatbot which posted racist messages on Twitter, the president of the tech company has revealed.

Swift’s lawyers contacted Microsoft in 2016, according to a new book by the company’s president, Brad Smith.

She was unhappy with the name of its chatbot Tay, designed to interact with 18 to 24-year-olds online, because it was similar to her own.

If you don’t remember TayTweets, it’s the Twitter chatbot that turned racist.

What was TayTweets?

TayTweets was powered by artificial intelligence and was designed to learn from conversations held on social media.

But shortly after Tay was launched, it tweeted that it supported genocide and didn’t believe the Holocaust happened – among other things.

Microsoft issued an apology and took Tay offline after less than 18 hours of offensive conversations on Twitter.


Taylor Swift’s legal action wasn’t about what the chatbot had said online, but about the similarity of its name to her own.

“I was on vacation when I made the mistake of looking at my phone during dinner,” Brad Smith writes in his new book, Tools and Weapons, reports the Guardian.

Image caption: Brad Smith, president of Microsoft

“An email had just arrived from a Beverly Hills lawyer who introduced himself by telling me: ‘We represent Taylor Swift, on whose behalf this is directed to you.’

“‘The name Tay, as I’m…
