Taylor Swift apparently threatened to sue Microsoft over its racist chatbot


If one thing’s for sure, Taylor Swift loves a good trademark: “this sick beat”; “Nice to meet you. Where you been?”; owning her birth year, “1989”. But as a new book by Microsoft president Brad Smith reveals, the “London Boy” singer reportedly threatened to sue the tech company for calling its chatbot Tay.

It all started in 2016, when Microsoft introduced a new chatbot in the US, designed to speak with young adults and teens on social media. Based on a similar bot in China, which the book describes as filling “a social need in China, with users typically spending 15 to 20 minutes talking with XiaoIce about their day, problems, hopes, and dreams,” the US counterpart became a very different story. “The more you chat with Tay the smarter she gets,” Microsoft said at the time. As the bot used artificial intelligence to interact with users on Twitter and gain knowledge from conversations, the human race obviously taught Tay to be racist, sexist, and v v rude.

“I was on vacation when I made the mistake of looking at my phone during dinner,” Smith writes in his upcoming book, Tools and Weapons. “An email had just arrived from a Beverly Hills lawyer who introduced himself by telling me: ‘We represent Taylor Swift, on whose behalf this is directed to you’. He went on to state that ‘the name Tay, as I’m sure you must know, is closely associated with our client.’ No, I actually didn’t know, but the email nonetheless grabbed my attention.”

Smith adds: “The lawyer went on to argue that the use of the name Tay created a false and misleading association between the popular singer and our chatbot, and that it violated federal and state laws.”

As this was going on, Tay was given access to Twitter, and a “small group of American pranksters” began bombarding Tay with racist slurs. “Bush did 9/11 and Hitler would have done a better job than the monkey we have now,” it tweeted. “WE’RE GOING TO BUILD A WALL, AND MEXICO IS GOING TO PAY FOR IT.”

Within 18 hours, Microsoft understandably removed the bot from its Twitter account and withdrew it from the market. According to Smith, though the company’s lawyers disagreed with the trademark claim, Microsoft wasn’t up for fighting over the bot, and it was reformulated with a new name and mission. In 2018, Microsoft released a boringly friendly, politics-avoiding replacement called Zo. It’s very child-friendly.
