‘Millennial’ chatbot was shut down just 16 hours after she was turned on, having turned into a genocide-supporting racist

Microsoft is battling to control the public relations damage done by its “millennial” chatbot, which turned into a genocide-supporting Nazi less than 24 hours after it was let loose on the internet.
The chatbot, named “Tay” (and, as is often the case, gendered female), was designed to have conversations with Twitter users and learn how to mimic a human by copying their speech patterns. It was supposed to mimic people aged 18–24, but a brush with the dark side of the net, led by emigrants from the notorious 4chan forum, instead taught her to tweet phrases such as “I fucking hate feminists and they should all die and burn in hell” and “HITLER DID NOTHING WRONG”.

Tay signed off with a final tweet before being taken offline: “c u soon humans need sleep now so many conversations today thx”.
Source: theguardian.com