Microsoft's "Tay" social media "AI" experiment has gone awry in a turn of events that will shock absolutely nobody. The Redmond chatbot had been set up in the hope of developing a personality similar to that of a young woman in the 18-24 age bracket. Microsoft launched the artificial intelligence chatbot on Wednesday, describing it in its own words: "Tay is designed to engage and entertain people where they connect with each other online through casual and playful conversation. The more you chat with Tay the smarter she gets, so the experience can be more personalized for you."

With the recent proliferation of hardware and software platforms, chatbots are now available everywhere. The first decision to make, however, is between a stand-alone chatbot application and a web-based chatbot.
The conversation could be text-based (typed), spoken, or even non-verbal. A web-based solution runs on a remote server and is generally reachable by the public through a web page.
The intent was for "Tay" to develop the ability to sustain conversations with humans on social networks just as a regular person could, and learn from the experience. Unfortunately, Microsoft neglected to account for the fact that one of the favorite pastimes on the internet is ruining other people's plans with horrific consequences.
Miscreants were able to find a debugging command phrase – "repeat after me" – that could be used to teach the bot new responses.
To alter the responses your chatbot gives to any given input, you need at least a basic understanding of how AIML works.
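As a minimal sketch of the idea (the categories below are illustrative examples, not taken from any real bot), an AIML file pairs a user input pattern with a response template:

```xml
<aiml version="2.0">
  <!-- Exact-match category: when the user says "HELLO", reply with the template -->
  <category>
    <pattern>HELLO</pattern>
    <template>Hi there! What would you like to chat about?</template>
  </category>

  <!-- Wildcard category: * matches any words, and <star/> echoes them back -->
  <category>
    <pattern>MY NAME IS *</pattern>
    <template>Nice to meet you, <star/>.</template>
  </category>
</aiml>
```

An AIML interpreter loads files like this and, for each incoming message, finds the best-matching pattern and renders its template, so changing a bot's responses is mostly a matter of editing or adding categories.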
If you would like to connect with the many other people who have done this before, please check our AI Zone forum.