While it may still seem like a distant scenario, robots are getting closer and closer to being part of our daily lives in fields such as health, safety, education and leisure. Their use is being studied in childhood education, elderly care, street surveillance and as assistants in hospitals, where they proved very helpful during the COVID-19 pandemic by delivering medicines and monitoring patients.
As the complexity of their interactions with people increases, machines need certain communication skills and abilities to express themselves and achieve the purpose for which they were designed. One of these properties, and one that has attracted the interest of many scientists and companies, is persuasion, which, in robots, refers to the ability to modify the attitude or behavior of humans.
A recent study in the field of persuasive robotics, published in Science Robotics, explores the attitude robots should display to convince us and, more specifically, the effect produced by robots in roles of authority, such as a nurse, teacher or guard. But would we listen to a machine?
A heavy hand does not work
Social robots might need, for example, to convince a patient to take medication, to get a classroom tidied up, or to give directions to workers on a job site. "If you put a robot in a social context long enough, sooner or later a situation will arise in which persuasion will be inherent in the role it plays," Shane Saunderson, a researcher at the University of Toronto and co-author of the work, explained to SINC.
In their study, Saunderson and his colleague Goldie Nejat carried out a series of experiments in which the humanoid robot Pepper offered different incentives (prizes or penalties) to participants in exchange for carrying out a series of tasks. They wanted to understand a little more about why we often treat machines much like humans, but not always.
In their experiments, the researchers found that the robots performed better when they displayed friendly, non-authoritarian behavior.
The robots achieved better results when they behaved in a friendly, non-authoritarian way; authoritarian behavior provoked rejection. One possible explanation is that people did not regard the robots' authority as legitimate. They also perceived a dominant attitude (which they described as "inhuman" and "sinister") and felt threatened.
In social psychology, almost a hundred persuasion strategies have been described, also known as compliance gaining, that is, strategies aimed at altering the behavior of others. They can draw on tools such as logic, emotions, threats, rewards or authority, whose effectiveness is in turn influenced by non-verbal forms of expression such as gestures or body posture.
The effect of many of these factors has been tested in research on communication between robots and people. A study published in May this year in PLOS ONE found that participants smiled and paid more attention to the demands of humanoid machines when the machines touched them.
A team from Eindhoven University of Technology (The Netherlands) and Singapore combined two strategies, gaze and gestures, to persuade a group of individuals to whom they told a story about the negative consequences of lying. They found that the message had the greatest impact when the robot maintained eye contact, regardless of its body language.
For Maria Malfaz, a researcher in social robotics at the Carlos III University of Madrid, "emotional expression between a social robot and a human is as fundamental as in communication between humans." Machines can use facial expressions, voice or body gestures to convey emotions, but their repertoire is limited and their expressions are not always easy to recognize.
For us to feel comfortable interacting with an artificial being, it is key that it is not threatening, that it is not very big, and that it is not too far from our idea of a person, a pet or a child
But, as this expert points out to SINC, "the main thing is aesthetics", which must be in harmony with the voice and movements. In order for us to feel comfortable interacting with an artificial being, it is crucial "that it is not threatening, that it is not very large and that it is not too far from our idea of a person, a pet or a child", depending on the situation. "It must be pleasing to the eye, but it will also depend on the user to whom it is directed", clarifies the researcher.
In the realm of persuasion, the machine's appearance will also depend on its objective and the type of strategy it uses. A robot the size of an insect would hardly be threatening, while it would be difficult to associate qualities such as tenderness or compassion with a Terminator-style robot.
One risk of machines resembling people is that they become subject to the same human biases. This follows from what social robotics has dubbed the "computers are social actors" (CASA) paradigm, which emerged from a study published in 2000 by Clifford Nass and Youngme Moon of Stanford and Harvard universities, respectively. These researchers posited that people apply the same social norms and expectations to machines as they do to humans, even though machines have no feelings, intentions or motivations.
We judge female and male robots with the same stereotypes as men and women
- Shane Saunderson, researcher at the University of Toronto
This has been shown above all in the case of gender. "We judge female and male robots with the same stereotypes as men and women," says Saunderson. A study in which a security guard robot was deployed at a Singapore university confirmed that masculine characteristics were more accepted among students.
The authors of another study, published by the Institute of Electrical and Electronics Engineers, tested the influence of gender in a humanoid machine that asked visitors for donations at the Boston Science Museum. Men were more likely to contribute money when the robot had a female appearance and, in general, people tended to consider robots of the opposite sex more trustworthy and attractive. It is no coincidence that many browsers and virtual assistants have a female voice by default.
Adaptive persuasion and AI
Although it has so far been applied only in a limited way to persuasion, artificial intelligence (AI) could be a game changer. Saunderson is the author of another study, still under review, that explores the idea of adaptive persuasion, whereby machines would learn from the results and improve their strategies. "It is expected that this type of system will be more convincing and effective because it will be able to adapt its behavior to the specific needs and context of each individual, as we would do," explains the researcher.
Thanks to artificial intelligence, robots would learn from the results and improve their strategies
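The adaptive-persuasion idea described above can be illustrated with a minimal sketch. This is not the system from Saunderson's study; it is a hypothetical epsilon-greedy bandit in which a robot tries different persuasion strategies, observes whether the person complies, and gradually favors the strategy that works best. The strategy names and success rates below are invented for illustration.

```python
import random

# Hypothetical persuasion strategies the robot can try.
STRATEGIES = ["friendly", "logical", "reward", "authoritative"]

class AdaptivePersuader:
    """Epsilon-greedy learner: mostly exploits the best-known strategy,
    occasionally explores a random one."""

    def __init__(self, epsilon=0.1):
        self.epsilon = epsilon                       # exploration rate
        self.counts = {s: 0 for s in STRATEGIES}     # times each strategy was tried
        self.values = {s: 0.0 for s in STRATEGIES}   # running mean success rate

    def choose(self):
        if random.random() < self.epsilon:
            return random.choice(STRATEGIES)         # explore
        return max(STRATEGIES, key=lambda s: self.values[s])  # exploit

    def update(self, strategy, reward):
        # Incremental mean update after observing compliance (1) or refusal (0).
        self.counts[strategy] += 1
        n = self.counts[strategy]
        self.values[strategy] += (reward - self.values[strategy]) / n

# Simulated interactions: "friendly" is assumed to succeed most often,
# echoing the study's finding that non-authoritarian behavior works better.
random.seed(0)
true_rates = {"friendly": 0.7, "logical": 0.5,
              "reward": 0.4, "authoritative": 0.2}

agent = AdaptivePersuader()
for _ in range(2000):
    s = agent.choose()
    complied = random.random() < true_rates[s]
    agent.update(s, 1.0 if complied else 0.0)

print({s: round(v, 2) for s, v in agent.values.items()})
```

Over many interactions, the agent's estimated success rates approach the underlying ones and it increasingly selects the most effective strategy, which is the sense in which such a system "learns from the results and improves its strategies."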
In the future, moreover, machines could use our social network or purchase data to influence us. But if their ability to convince us were so effective, could it be considered a form of manipulation? Persuasive robotics research has been accompanied since its inception by studies that analyze its ethical implications. The line between what can be considered acceptable and what cannot is very fine, and where it falls depends on the authors.
"Robots are programmed by people. If they do something, it is because the person who wrote the program wanted it to be," underlines Malfaz, who works developing decision-making systems for autonomous robots.
For Saunderson, all moral questions can be resolved through transparency. It should be clear who operates a robot with persuasive qualities, what capabilities it has, and what strategies, data and technologies it uses to achieve its goal. Unfortunately, "the technology industry is not doing a great job of maintaining a rigorous ethical practice, so regulation will be needed," warns the researcher.
Beyond such studies, persuasive robots are not yet among us. But in reality we are already surrounded by persuasive technologies thanks to the internet. Social networks and other companies collect user data to identify the best way to target people or to show them personalized ads. Persuasion can be used for good purposes, but it can also become a means of control in the digital world.