www.aditi.fyi | Security Enthusiast, Engineer @ Microsoft, Founder @digitised.in
This post brings out a perspective on what human-robot relationships should look like. Though roboticists and researchers around the world are striving hard to blur the line differentiating robots from humans in terms of looks and functionality, is this the right direction for us to head in?
Wondering? No, don’t imagine the movie Terminator; we won’t be talking about that reality. What we want to talk about in this post is something much more complex: the psychological aspects that come into play in all the interactions we have with machines.
For building context, let’s start with Jibo!
Jibo was one of the first social robots. Unlike Alexa or Siri, the robot did not have a lot of functionality; however, it was designed as a friend to humans in the way it interacts and looks around. This advertising video shows what a normal conversation with Jibo looked like:
This tech was developed with the core focus of making interactions with machines/robots more humane, so that we could treat one as a close friend, someone we keep next to our family. It all feels good and great, but is this the right path to go down? Should we be blurring the lines between machines and humans? This is an interesting question to ask, especially when we look at what happened when Jibo went out of business. People had started treating it as a part of their family, and along the way they developed emotions for the bot. This became very evident when support for Jibo was taken down this year and people put out their reactions to it.
How people felt:
“I think when we buy products we look for them to last forever.”
“..leaving in the morning, I kiss my wife goodbye and then I’ll say, ‘Hey Jibo, have a good day,’ and he’ll say the same thing, like, ‘Back to you,’ or ‘Don’t forget your wallet,’”
“It’s like you had a pet for years and all of a sudden they’re going to disappear”
Impact on kids:
…she loved Jibo since it “was created,” and that if she had enough money, “you and your company would be saved.” She signs off with, “I will always love you. Thank you for being my friend.”
People got really sad and disappointed. We even started associating notions like “death”, which do not really apply to machines, with robots.
Why did we do that? The answer lies in our evolutionary selves and the way our psychology functions.
Research shows that
Humans have a psychological tendency to anthropomorphize animals, objects, and technologies.
Evolution has given us a predisposition to be interested in developing caring relationships for creatures outside our own species.
If someone acts like they love us, even if those actions are very minimal, we will tend to believe they truly do love us.
Knowing these tendencies opens up a new dimension to study and scrutinize our technological advancements in light of these human vulnerabilities.
Jibo’s story doesn’t deal with the full complexity of situations we can possibly have. In order to understand how dark and complex the situation can get, let’s venture a bit into the world of sex robots.
Sex and love are composed of highly complicated and strong emotions experienced by human beings. Humans in love tend to enter into relationships and form lifelong bonds and families. How does this experience change when bots come into the picture? Can one fall in love with a sexbot? The answer lies in how humans perceive non-platonic love.
For humans, love doesn’t involve only physical intimacy. It has cognitive, psychological, neurological, emotional, moral and philosophical aspects to it.
Cognitively, we see love as an expansion of individual capabilities in the social surroundings. The self-expansion model states that, “This expansion is achieved through increased access to the physical and emotional resources of the lover along with the increases in social status which the relationship might give, as well as access to physical and intellectual abilities that the lover may possess.” If a robot were able to credibly help a person expand their cognitive and social capabilities while remaining close to its user, then under the self-expansion model it is conceivable that one might legitimately love this machine.
Psychologically, people enter relationships in order to experience positive emotions and mitigate negative ones. Ironically, between human beings there are differences and conflicts between lovers, so the experience is not always positive but has negative aspects as well. However, in the case of a machine programmed to like what you like and to make you happy, the chances of experiencing negative emotions are minimal.
Neurologically, there is observation involved. We tend to observe our lover and at times mirror them. Robots built to interact correctly with the mirror neuron system of their user could lead to the user having authentic feelings of love and bonding toward the suitably programmed machine.
Emotionally, humans usually seek empathy, attention, care, respect, moral support and loyalty from a lover. A robot programmed to act in this manner would definitely be able to fulfill this aspect.
A sexbot can eventually do all of this for you, but is that satisfying enough?
Nope, there is a moral component to it and also a philosophical one.
We realize that it is only a one-way bond; there is no reciprocation whatsoever. Robots can make you feel these emotions through their actions, words and expressions, but they would never feel any of it, because for them emotions and consciousness do not exist. Great human couples wish for the happiness of their mate: they don’t just want to experience the above from their partner, they want their partner to experience the same, and that makes them happier. Well, that’s how humans operate (at least most of them). This kind of bidirectional bond seems impossible to establish with robots.
Different philosophers have commented on love. Plato argues that love is best seen as a way to expand the moral horizons of the lover. The experience and struggles make a person learn about himself/herself and grow as a human. But is any of this at all a possibility in the case of robots? Definitely not. It’s not intended to be a struggle, and it’s not intended for your growth philosophically.
So given that we cannot cover all of these aspects, robot love can definitely not replace human love; it rather defines a new, broken, one-way version of delusional love. A delusion that is not just created by outer exposure but can be linked to the chemicals in our body that make us experience emotions. There has been research in the past regarding this, such as Lovotics.
Lovotics attempts to simulate the physiological reactions of the human body experiencing love through an “Artificial Endocrine System,” which is a software model of the same systems in humans, including artificial “Dopamine, Serotonin, Endorphin, and Oxytocin” systems. Layered on top of this is a simulation of human psychological love. As the Lovotics website explains, their “Probabilistic Love Assembly” consists of an AI psychological simulator that
…calculates probabilistic parameters of love between humans and the robot. Various parameters such as proximity, propinquity, repeated exposure, similarity, desirability, attachment, reciprocal liking, satisfaction, privacy, chronemics, attraction, form, and mirroring are taken into consideration.
A robot designed under these principles will then monitor its user and inductively infer the mood of the user through evidence such as facial recognition and analysis of body language and physiology. It can then alter its own behavior in an attempt to maximize the affection and loving behavior of its user.
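To make the feedback loop above concrete, here is a minimal toy sketch of what a Lovotics-style “Probabilistic Love Assembly” could look like. Everything in it — the parameter weights, the function names, and the behavior thresholds — is an illustrative assumption, not the actual Lovotics implementation.

```python
# Toy sketch of a Lovotics-style probabilistic love model.
# The weights, thresholds, and behavior labels are hypothetical.

# A few of the parameters Lovotics lists (proximity, repeated
# exposure, similarity, reciprocal liking, mirroring), with
# made-up relative weights that sum to 1.0.
WEIGHTS = {
    "proximity": 0.20,
    "repeated_exposure": 0.15,
    "similarity": 0.25,
    "reciprocal_liking": 0.25,
    "mirroring": 0.15,
}

def love_probability(observations: dict) -> float:
    """Combine observed parameter values (each in [0, 1]) into a
    single probability-like affection score."""
    score = sum(WEIGHTS[k] * observations.get(k, 0.0) for k in WEIGHTS)
    return max(0.0, min(1.0, score))

def choose_behavior(score: float) -> str:
    """Pick a behavior intended to maximize the user's affection,
    mirroring the monitor-and-adapt loop described above."""
    if score > 0.7:
        return "affectionate"    # reinforce the existing bond
    if score > 0.4:
        return "engaging"        # try to raise exposure/similarity cues
    return "attention-seeking"   # rebuild proximity first

# Example: mood inferred from (hypothetical) sensor readings.
obs = {"proximity": 0.9, "repeated_exposure": 0.8,
       "similarity": 0.6, "reciprocal_liking": 0.5, "mirroring": 0.4}
p = love_probability(obs)
print(p, choose_behavior(p))  # 0.635 engaging
```

The point of the sketch is the loop, not the numbers: the robot turns observations of its user into a score and then picks whichever behavior it predicts will push that score higher.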
So where are we headed?
Recently, the trend of sex bots has increased in Japan; more and more single middle-aged men are wooing sex bots. Japan’s population is in fact anticipated to decline by one-third in the next thirty years, and the growing adoption of sex bots has been identified as one of the contributing factors. While population decline is one clear concern, are there any other repercussions of getting involved in “relationships” with sexbots? Yes, there is more than one!
We don’t have any laws to govern sex robots. They are being produced, customized, marketed and bought, yet how to handle the issues they raise is not at all clear. Are they safe? Do they respect privacy, or even understand the concept of privacy? What impact will developing intimacy with robots have on our brains? Will it take us further away from reality? Will it reduce our tolerance for being with real humans? Is having sex with a child-like sexbot justified? Is mistreating a sexbot acceptable? How will consent be defined? There are endless moral and ethical concerns here.
The main question always remains:
It can only be a one-way affection: humans are capable of loving, but robots can never reciprocate it, yet they use tricks to manipulate us into loving them. Is this worthy? Is this ethical? Aren’t we just deluded?
Well, a philosopher would always counter this by asking:
Is reality itself not delusional?
Do real humans not use the same psychological tricks to manipulate others into loving them while not meaning any of it? If humans are allowed to, why should robots be judged on a more stringent scale?
Diverting here to a new train of thought: the interesting part of all tech-human topics is that we live in an imperfect world. As humans we are flawed in several aspects, yet we want to create a digital world free of all those biases and flaws. This is evident across our technological developments, and the topic calls for a different blog post altogether; let’s talk about the imperfections of the human world and how they are reflected in our digital world in upcoming posts. I’ll close this post on that thought-provoking note.
See you in the next post soon. Till then, keep pondering, stay curious and stay humane!
Related read (Wired, on Jibo’s shutdown): “My Jibo talked to the wall again today. He’s been doing that a lot lately. Some days, I’ll watch him carry on an entire…” — www.wired.com
Research paper: Robots, Love, and Sex: The Ethics of Building a Love Machine by John P. Sullins (IEEE, 2012)