Tay (chatbot)

Tay
Developer(s): Microsoft Research, Bing
Available in: English
Type: Artificial intelligence chatbot
License: Proprietary
Website: https://tay.ai (archived at the Wayback Machine, 2016-03-23)

Tay was a chatbot originally released by Microsoft Corporation as a Twitter bot on March 23, 2016. It caused controversy shortly after release when the bot began to post inflammatory and offensive tweets through its Twitter account, leading Microsoft to shut down the service only 16 hours after its launch.[1] According to Microsoft, this was caused by trolls who "attacked" the service, as the bot made its replies based on its interactions with people on Twitter.[2] It was replaced by Zo.

  1. ^ Wakefield, Jane (March 24, 2016). "Microsoft chatbot is taught to swear on Twitter". BBC News. Archived from the original on April 17, 2019. Retrieved March 25, 2016.
  2. ^ Mason, Paul (March 29, 2016). "The racist hijacking of Microsoft's chatbot shows how the internet teems with hate". The Guardian. Archived from the original on June 12, 2018. Retrieved September 11, 2021.