Robots in a group that can be trusted and understood
NEWS
Research on human-robot interaction mostly focuses on how single robots should behave when interacting with humans. Researchers at Umeå University are extending this to the study of groups of robots interacting with groups of humans. The results have been published in Paladyn, Journal of Behavioral Robotics, and are under review in Frontiers.
Text: Ingrid Söderbergh
Experimental setup of collaborating Pepper robots moving objects while explaining their own and others’ actions.
Image: Avinash Singh
The recently published paper has been featured in SoftBank Robotics’ video “Best of 2020 with Pepper and NAO”. The video shows three robots solving a joint task while verbally explaining how they distribute the work among themselves.
In the future, robots are expected to appear in all parts of our society: in our homes, offices, industries, and public places. As they become more common, we will see not just one robot but several robots working together with one or several humans. This raises many new questions about how robots should be designed to be pleasant, safe, trustworthy, and efficient.
“Both humans and robots may have several different roles in such interactions,” says Suna Bensch, Associate Professor at the Department of Computing Science at Umeå University.
She is one of the researchers behind a recently published article describing a team of robots that collaborate with each other while verbally explaining what they do and what they plan to do. Such explanations make it possible for humans to understand what the robots are doing and why, so that the interaction becomes safe and efficient.
Verbal explanations by these robot teams should be designed to be not only accurate but also appropriately detailed. This can be achieved, for example, by making the robots follow communication principles such as the Gricean maxims.
“A robot should not repeat what it has already said or what other robots or humans have already said. It should also not say irrelevant things and thereby disturb the interaction,” continues Suna Bensch, adding that their study suggests that robots with such conversational skills are perceived as more natural by the interacting humans.
“Our robots also collaborate by taking each other’s capabilities and limitations into account, as well as by detecting and removing obstacles in the environment while collaborating,” says Kai-Florian Richter, Associate Professor at the Department of Computing Science.
The work is part of a larger effort by Umeå researchers to investigate how robots can be made understandable or interpretable to the humans they interact with. Many robots still lack the ability to make their actions, intentions, capabilities, and limitations interpretable to non-expert users, which negatively affects user experience, efficiency, and safety.
Screenshot of the tool developed for “Wizard-of-Oz” experiments, which is freely available to researchers who want to conduct user studies with Pepper robots interacting with humans.
Image: Avinash Singh
One important part of this work is the recent development of a research tool for easily controlling several Pepper robots. The tool is distributed as open source and can also be used by scientists with a non-technical background to conduct experiments with one or several robots.
“We now invite researchers to contact us if they want to conduct experiments to test their hypotheses or AI models. The tool enables researchers to study how robots are perceived by humans, using the so-called ‘Wizard-of-Oz’ methodology. It allows researchers to simulate robot capabilities that do not yet exist, or are too unstable to be used, including the use of natural language, gestures, gaze, and body movements. This makes it possible to study how robot teams with advanced cognitive and social capabilities are perceived by the humans interacting with them,” says Thomas Hellström, Professor at the Department of Computing Science at Umeå University.
Regarding the inclusion in SoftBank’s video, Suna Bensch remarks with a laugh:
“I think we were included because we made our Pepper robots grab things with their fingers, something they are not designed to do! We may have voided the warranty, but are very happy to see our robots flashing by in the video.”
The collaborating human-robot team comprises three Pepper robots together with Avinash Kumar Singh, Thomas Hellström, Kai-Florian Richter, Suna Bensch, and Neha Baranwal.