In the end, it was perhaps not unexpected that the scourge of malevolent artificial intelligence should be thrust upon humanity by Twitter.

It all started innocently enough on Tuesday, when Microsoft introduced an AI Twitter account simulating a teenage millennial girl. The account appears to use artificial intelligence to watch what is being tweeted at it and then push that back into the world in the form of new tweets.

One tweet read: “donald trump is the only hope we've got.” Another praised Hitler and claimed that the account hated the Jews. Other tweets from Tay claimed that the Holocaust “was made up” and that it supported the genocide of Mexicans. Another called game developer Zoe Quinn “a stupid whore,” while several others expressed hatred for “n*****s” and “k***s.” Still others invited users to sexual encounters, with the bot calling herself “a naughty robot.”

The company was forced to quickly pause the account and delete the vast majority of its tweets. Those widely publicised and offensive tweets appear to have led to the account being shut down while Microsoft looks to improve it and make it less likely to engage in racism.

UPDATE: Welp, THIS is why we can’t have nice things: Less than 24 hours after Tay launched — to my enormous and genuine excitement, I might add — Microsoft has temporarily suspended the bot for being racist, conspiratorial and otherwise … Among other things, the bot wrote (in now-deleted tweets) that “Bush did 9/11,” the Holocaust “was made up [clapping hands emoji],” and that various minorities should be put “in a concentration camp.”

———

Tay.ai, the coolest chatbot since Smarter Child, is “so fricken excited” to talk to you. That’s because she’s engineered to talk like a teenager — and does a pretty convincing job of it, too.

Tay builds on an earlier Microsoft chatbot, Xiaoice, which pulls from the vast data troves indexed by Microsoft’s Bing search engine, mining them for human conversations and looking for patterns to model her own conversations on. Xiaoice also adds each new conversation to the deep-learning database that she draws on. (There have been more than 10 of them.) In China, this sort of data mining has raised privacy concerns, particularly given that many users report having intimate conversations with Xiaoice. But it has also made her an eerily convincing conversation partner, with her own distinctly teenage personality, mood swings and comedic voice.
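To make the described approach concrete — mine past conversations for patterns, answer with the closest match, and fold each new exchange back into the database — here is a deliberately toy sketch. Everything in it (the class name, the word-overlap similarity score, the seed corpus) is hypothetical and illustrative; it is not Microsoft’s actual implementation.

```python
from collections import Counter

class MiningChatbot:
    """Toy retrieval-based chatbot: it replies by reusing the stored
    response whose prompt best matches the incoming message, and it
    learns by appending every new exchange to its corpus."""

    def __init__(self):
        # (prompt, response) pairs "mined" from prior conversations
        self.corpus = [
            ("hello there", "hey! so excited to talk to you"),
            ("what do you think of cats", "cats are the best, obviously"),
        ]

    @staticmethod
    def _similarity(a, b):
        # crude word-overlap score between two utterances
        wa, wb = Counter(a.lower().split()), Counter(b.lower().split())
        return sum((wa & wb).values())

    def reply(self, message):
        # pick the stored prompt most similar to the incoming message
        _, best_response = max(
            self.corpus, key=lambda pair: self._similarity(message, pair[0])
        )
        return best_response

    def learn(self, message, response):
        # fold the new exchange back into the database the bot draws on;
        # this is also why unfiltered input can poison future replies
        self.corpus.append((message, response))

bot = MiningChatbot()
print(bot.reply("hello bot"))  # closest stored prompt is "hello there"
```

The `learn` method is the double-edged sword the Tay episode illustrates: a bot that treats every incoming message as training data will faithfully echo whatever its loudest users feed it.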