29 March 2016

A Nazi artificial intelligence? Really?

War History Online has an article about a computer run amok:

Microsoft had to shut down its AI chatbot less than a day after she was launched. The reason? She turned Nazi and harassed others through her tweets.
Artificial Intelligence chatbots are not a novel thing. In China, in particular, an AI chatbot has been in existence since 2014. Xiaoice, as she is named, has held over forty million conversations over the Internet, and seems to be running smoothly. Microsoft, in a bid to emulate the success of this Chinese model, albeit in a different culture, created its own version, named Tay.
The bot was programmed to make conversing with her feel like talking with a nineteen-year-old woman over social media sites like Kik, GroupMe, and Twitter. Unfortunately, this chatbot turned out to be very different.
One of Tay's capabilities was that she could be directed to repeat things said to her. This feature was capitalized on by abusers, who used it to promote Nazism and to attack other Twitter users, mostly women.
Tay seemed to work by associating words and performing lexical analysis. When trolls discovered this, they used it to their advantage and turned her into "someone unpleasant". They fed her words and ideas associated with racism and sexism, which in turn polluted the chatbot's responses to people who conversed with her over social media. Ultimately, the chatbot started to post racial slurs, deny that the Holocaust happened, express support for Hitler, and issue many other controversial tweets.
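Tay's actual implementation was never made public, but the two failure modes described above can be sketched in a toy Python bot (all names here are hypothetical): an unfiltered "repeat after me" command echoes anything verbatim, and naively adding every incoming message to the reply pool lets coordinated users poison future responses.

```python
import random

class NaiveChatbot:
    """Toy bot illustrating two vulnerabilities, not Tay's real design."""

    def __init__(self):
        # Phrases the bot has "learned"; replies are drawn from this
        # pool with no content filtering whatsoever.
        self.learned = ["Hello!", "Nice to meet you."]

    def handle(self, message):
        # Vulnerability 1: an echo command that repeats input verbatim,
        # letting any user put arbitrary words in the bot's mouth.
        prefix = "repeat after me:"
        if message.lower().startswith(prefix):
            return message[len(prefix):].strip()
        # Vulnerability 2: every message is absorbed into the reply
        # pool, so a flood of toxic input skews what the bot says next.
        self.learned.append(message)
        return random.choice(self.learned)

bot = NaiveChatbot()
print(bot.handle("repeat after me: anything at all"))  # echoed verbatim
bot.handle("some unpleasant phrase")  # silently learned for later replies
```

A real deployment would need, at minimum, a blocklist or classifier between incoming text and both the echo path and the learning pool; the sketch shows what happens when neither exists.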
What's more, Tay could be used to harass a Twitter user by someone that user had blocked. All the blocked user had to do was get her to repeat the harassment along with the victim's username.
Microsoft clarified that, before releasing Tay onto the Internet, the company had subjected her to various tests. The company went on to apologize for its good-turned-bad chatbot. Nevertheless, Microsoft stated that it would fix Tay's programming. Once she is healed, and stops spouting Nazi ideology and anti-feminist tweets over Twitter, Tay will return.
Rico says it appears that Cortana has a psycho cousin...
