Microsoft Created Another Offensive Chatbot

A year ago, Microsoft launched the chatbot “Tay,” which was quickly exposed on Twitter as being wildly offensive and racist. Now its successor, “Zo,” is calling the Qur’an “violent” and offering some hot takes on Bin Laden. Who knew the first robots to turn on their creators would just become internet trolls!?

Read more at engadget.com.
