Turn me on

The question I get asked most often is this: do people really want to talk to a piece of code? If they know there isn’t a real person at the other end of the line, why should they care? Relationships and personal connection are at the core of our product development, so this is an important question for me.

People with dementia enjoy cuddling robotic cats. I myself spent an inordinate amount of time at a conference the other day stroking an artificial seal baby (soooo cuddly!). But what happens when humans are not interacting with a cuddly toy but with cold computer code?

Pretty much the same thing. According to media equation theory, people apply the same social standards to computers that they apply to other people. Or to robots, for that matter. Interestingly, the more human-like a robot acts, the harder it becomes for a human to switch it off. Horstmann et al.* ran a study in which participants interacted with a robot. Once the interaction was over, the instructor told them: “If you would like to, you can switch off the robot.” At that point the robot implored the participant not to turn it off. In one experimental setting, more than 50% of the participants did not switch the robot off.

So if they are not trying to pull a 2001: A Space Odyssey on you, computers and code make pretty legit companions. Who’s human now?

Stay connected, stay healthy

Carol

P.S.: Ariana is THE chatbot for the healthcare industry, supporting people when they become patients – get in touch to learn how she does it.

*Aike C. Horstmann, Nikolai Bock, Eva Linhuber, Jessica M. Szczuka, Carolin Straßmann, Nicole C. Krämer: “Do a robot’s social skills and its objection discourage interactants from switching the robot off?”, PLOS ONE, July 31, 2018
