Friday, March 25, 2016

Microsoft's Brief Experiment With Tay the Teen Chatbot Is the Reason We Can't Have Nice Things


Microsoft released a chatbot aimed at 18-24 year olds, designed to improve her responses by learning from what people said to her. Unfortunately, like many teens, Tay the chatbot had no filter and rapidly spun out of control. Within 24 hours, she was shut down over the horrible, racist things she was spitting out in all caps. Thankfully, the internet did not let her go without some documentation!
