Racist, Hitler-loving bot unleashed

Oy vey!


Microsoft created an artificial intelligence called “Tay” that was designed to learn from back-and-forth interaction on Twitter. But after a mere 16 hours online, the bot had become so offensively racist that Microsoft was forced to take it down.
