[email protected]#%&*# hate feminists and they should all die and burn in hell. Bush did 9/11 and Hitler would have done a better job...

Some people believe that as humans we have become inured to bad behavior. Maybe it's the digital, always-on world we live in...

We won't understand their minds or bodies, and we will survive only by accident (War of the Worlds, Alien) or through Promethean cunning (Footfall, Independence Day).

Yet what happens when a seemingly nonsentient, nonorganic being (a creation of artificial intelligence, our creature, a benign animation targeted at a younger crowd) jumps the tracks? What do you do when high-tech creations built for parlor tricks, Jeopardy, chess, and Go start spewing racist, misogynist, hateful language?

Hateful language like the quote above, which came from a Microsoft AI-based chatbot named Tay.

You do as Peter Lee, Corporate Vice President of Microsoft Research, did: take it offline and quickly post an apology. "We are deeply sorry for the unintended offensive and hurtful tweets from Tay, which do not represent who we are or what we stand for, nor how we designed Tay..." NOR HOW WE DESIGNED TAY...

Who do we share ourselves with and what exactly are we sharing?

Hollywood aside, we seem to imagine that AIs will be more or less like us because we aim to make them like us.

And as part of that, we will make them with affection for, or at least obedience to, us.

We are also guest writers on international news services such as Found Remote (The Drum), C21Media and the MIP Blog.
