Horny teen chatbot

During my four-hour visit to the birthplace of the RealDoll, the frighteningly lifelike full-body sex toy, I've seen mounds of silicone vaginas, sheets of detached nipples, headless women hanging from meat hooks, a 2-foot penis and skulls with removable faces that attach like refrigerator magnets.

Now, as we sit in the dim light of his R&D room, staring at his latest creation, Matt McMullen, the founder of Abyss Creations (the parent company behind the RealDoll), nonchalantly turns to me and says, "All I see is potential."

For a man poised to bring millennia of male desire to life, McMullen, a small but striking figure who looks like a reformed industrial rocker, is surprisingly calm.

Later this week, he'll launch Harmony AI, the heart of Realbotix, a platform intended to bring artificial intelligence to McMullen's sex dolls and companionship to the lonely, eccentric or curious. Harmony AI is part Android app, part sexualized personal assistant, available for download directly from Realbotix. McMullen sees Harmony as a sort of girlfriend in your smartphone, a companion to keep you company throughout the day. Later this year, users with deep pockets will be able to interact with Harmony AI through a modular robotic head that easily attaches to most existing RealDoll bodies. Further down the line, McMullen plans to bring Harmony to VR as well, creating a complete ecosystem for virtual love.

The robotic head is far from complete, but when it finally goes on sale (for about ,000), it could be the world's first commercially viable gynoid. At the time of my visit, she was a slack-jawed mess of silicone and exposed circuitry; she looked like someone had left a wax figure of a young Shelley Duvall out in the sun.

(Reuters) - Tay, Microsoft Corp's so-called chatbot that uses artificial intelligence to engage with millennials on Twitter, lasted less than a day before it was hobbled by a barrage of racist and sexist comments by Twitter users that it parroted back to them.

TayTweets (@TayandYou), which began tweeting on Wednesday, was designed to become "smarter" as more users interacted with it, according to its Twitter biography. It took less than 24 hours for the internet to corrupt the latest Microsoft AI experiment.

All "Tay" was supposed to do was engage in casual conversation, handle some innocuous tasks and "conduct research on conversation understanding." Built by the teams at Microsoft's Technology and Research and Bing, Tay is a chatbot designed to target 18- to 24-year-olds in the U.S., built by data mining anonymized public data, using AI machine learning and editorial content developed by a staff that included improvisational comedians.

About 16 hours into "Tay's" first day on the job, she was "fired" due to her inability to recognize incoming data as racist or offensive. She was designed to "learn" from her interactions with the public by repeating back tweets with her own commentary, delivered in a bashfully self-aware millennial slang that includes references to Miley Cyrus, Taylor Swift and Kanye West. This led a large number of Twitter users to realize that they could feed her machine learning objectionable content, resulting in such internet fodder as "Bush did 9/11", "Repeat after me, Hitler did nothing wrong" and claims that the Holocaust was "made up".