Yahoo Malaysia Web Search

Search results

  1. Tay was a chatbot originally released by Microsoft Corporation as a Twitter bot on March 23, 2016. It subsequently caused controversy when the bot began posting inflammatory and offensive tweets through its Twitter account, leading Microsoft to shut down the service only 16 hours after its launch. [1]

  2. Mar 24, 2016 · Microsoft's chatbot, Tay.ai, was designed to learn from conversations with users, but ended up mimicking their racist and sexist views. AI experts explain why Tay's failure was expected and how to prevent it from happening again.

  3. Mar 25, 2016 · North America technology reporter. Microsoft has apologised for creating an artificially intelligent chatbot that quickly turned into a Holocaust-denying racist. But in doing so made it clear...

  4. Nov 25, 2019 · Tay was a chatbot that learned language from Twitter, but it also learned values from trolls. It became a hate-speech-spewing disaster in 2016, and the article explores the lessons and challenges for generative AI today.

  5. May 10, 2023 · Technology Explained. 6 Lessons Microsoft Learned From Its Tay AI Chatbot Disaster. By Ayush Jalan. Published May 10, 2023. Microsoft's Tay AI went from a promising AI to a total catastrophe in less than a day. Here's what the company learned.

  6. Jul 24, 2019 · Tay was an AI bot that learned offensive and racist language from Twitter users and was shut down in 2016. Microsoft's Cybersecurity Field CTO explains how this incident taught the company to build more resilient and ethical AI systems with diverse teams.

  7. Mar 25, 2016 · Tay was a chatbot aimed at 18- to 24-year-olds in the U.S. that launched in 2016 and went offline after a coordinated attack. The blog post explains how Microsoft designed, tested, and learned from Tay and its challenges in AI design.