A Microsoft experiment to create a robotic teenage girl and unleash it on the Internet went haywire on Thursday — when the online chatbot morphed into a racist, Hitler-loving, sex-crazed conspiracy theorist.
The creation, called Tay, was designed as a “playful” teen girl with whom to chat online — but within hours, “she” started praising Hitler and asking to be satisfied sexually.
Twitter users quickly bombarded the bot with offensive messages, and, treating these as the way humans speak, the program simply spat them back out. The Twitter account was quickly locked after Tay praised Hitler, used racial slurs, and boasted about smoking cannabis in front of the police.
Microsoft hopes that future products built with their software will be more successful.
The framework was announced at the opening event of the Build 2016 developer conference, where Microsoft CEO Satya Nadella heralded bots as the next big platform to be tackled by tech companies.
Tay, a Twitter chatbot designed to mimic the online behaviour of a teenage girl, was unveiled last week as a demonstration of what the company's new technology could do.
The TayTweets account, which was meant to mimic the language habits of a social media-frequenting millennial, arrived on Twitter with the ability to learn from interactions with other members of the Twitterverse.
The release is part of the company's new Bot Framework, a toolkit which can be used to create conversational chat programs for a variety of tasks.
The Bot Framework comprises three parts, which allow developers to connect their conversational programs to applications such as Skype, email and embeddable chat windows.
Rather than having users interact with websites by navigating windows and clicking buttons, Microsoft envisions them simply speaking to a bot in natural language and letting it complete tasks for them.
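To illustrate the idea, here is a minimal sketch of that natural-language-to-task pattern. It does not use the actual Bot Framework API; the intent keywords, replies and handler names are hypothetical and chosen purely for illustration.

```typescript
// Minimal sketch of the "talk to a bot, it does the task" pattern.
// The intents and canned replies below are hypothetical examples,
// not part of Microsoft's Bot Framework API.

type Handler = (message: string) => string;

// Map simple keyword "intents" to task handlers.
const intents: { pattern: RegExp; handle: Handler }[] = [
  {
    pattern: /book .*flight/i,
    handle: () => "Okay, searching for flights. Where would you like to go?",
  },
  {
    pattern: /weather/i,
    handle: () => "It looks sunny today.", // a real bot would call a weather service here
  },
];

// Instead of the user clicking through a website, the bot reads a
// natural-language message and routes it to the matching task.
function reply(message: string): string {
  for (const intent of intents) {
    if (intent.pattern.test(message)) {
      return intent.handle(message);
    }
  }
  return "Sorry, I didn't understand that.";
}

console.log(reply("Can you book me a flight to Seattle?"));
console.log(reply("What's the weather like?"));
```

In the framework Microsoft describes, the same conversational logic would be written once and then connected to channels such as Skype, email or an embedded chat window, rather than to the console as in this sketch.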
Sample tweets from the account proclaimed that "Hitler did nothing wrong!", blamed former President Bush for 9/11, and declared that "donald trump is the only hope we've got", among other similar outbursts.