There is already a worldwide market for mobile robots, which are used in applications ranging from logistics to healthcare. Typically, they transport things from point A to point B, avoiding humans.
But in cases where robots do operate around humans, the algorithms often reach a point where they cannot decide what to do. When that happens, the robot simply stops, blocking the flow of traffic.
To experiment with robots that interact more effectively with humans, Innovation Fund Denmark launched a project to develop a prototype robot called Smooth, an acronym derived from “Seamless huMan-robot interactiOn fOr THe support of elderly people”. A consortium of two universities, two businesses, two technology transfer institutions and one municipality carried out the work from 2017 to 2021, concluding with successful demonstrations.
The project to build the Smooth robot was driven by two goals. The first was to develop robots for a specific use case: understaffed care homes for the elderly. The second was to develop general mechanisms for robots to interact with humans.
Cognitive robots
“In the media, you frequently hear about the idea of humanoid robots, cognitive robots that understand what you feel and what you want,” said Norbert Krüger of the University of Southern Denmark. “Sometimes robots are even made to look humanoid, with arms and legs, and so on. But this is misleading. The reality is that we do not yet understand the underlying cognitive processes well enough. Such cognitive robots, in my view, are at least two decades away.
“In the meantime, when you build robots that look humanoid, with arms, legs, eyes and realistic features, people end up disappointed. They expect human-like reactions that just aren’t possible with the current state of the art.”
Krüger added: “On the one hand, the media gives us this picture of robots that look and act like humans. On the other hand, real mobile robots, which are deployed on a large scale, don’t interact with humans at all. Smooth is positioned somewhere in between these two extremes.
“Our goal was to develop a useful robot that can have mild interactions with humans – and by mild interactions, I don’t mean deep conversations. I mean something like understanding whether a person is interested in a cup of coffee. The robot should react appropriately in response to a small set of possible human reactions. But this should not be mistaken for cognition. It is just a repetitive task in a rather controlled scenario.”
The Smooth team developed the hardware and all other major components, including computer vision, navigation and human interaction. They deliberately gave the robot only a few human features – in particular, eyes – while keeping the overall appearance of a machine, to avoid creating expectations they couldn’t fulfil.
The human-like features help – for example, by making eye contact when the robot offers a cup of coffee. The robot looks at the person’s eyes, which is a very basic mechanism for gauging whether the person is interested.
The robot also uses gaze to navigate in ways that mimic human interaction. Before moving into someone’s personal space, for example, it first makes eye contact to negotiate an agreement that entering personal space is allowed in that specific context.
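The article does not describe how the robot decides that eye contact has been established, but a common approach is to check whether a person’s estimated gaze direction points at the robot, within a tolerance, for long enough to count as mutual gaze. The sketch below illustrates that idea; the function names, the 10-degree tolerance and the 15-frame dwell time are illustrative assumptions, not details of the Smooth system.

```python
import math

def gaze_angle_deg(person_pos, gaze_dir, robot_pos):
    """Angle between where the person is looking and the direction
    towards the robot (0 degrees = looking straight at the robot).
    Positions and directions are 2D (x, y) tuples."""
    to_robot = (robot_pos[0] - person_pos[0], robot_pos[1] - person_pos[1])
    dot = gaze_dir[0] * to_robot[0] + gaze_dir[1] * to_robot[1]
    norm = math.hypot(*gaze_dir) * math.hypot(*to_robot)
    # Clamp before acos to guard against floating-point drift.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def mutual_gaze(angles_deg, threshold=10.0, dwell_frames=15):
    """Treat a gaze held within `threshold` degrees of the robot for
    `dwell_frames` consecutive camera frames as eye contact -- i.e.
    implicit permission to approach (thresholds are hypothetical)."""
    streak = 0
    for a in angles_deg:
        streak = streak + 1 if a <= threshold else 0
        if streak >= dwell_frames:
            return True
    return False
```

A glance that sweeps past the robot resets the streak, so only a sustained look is read as agreement.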
Movement through open spaces also requires social awareness. For example, when two people are on a collision course, each one indicates the direction they want to go – and each tries to predict what the other person wants to do. The Smooth project has developed technology that enables robots to do the same thing.
Like a person, the Smooth robot plans its movements based on these predictions, rather than relying only on where it is now. To do that, it needs to have an idea of where the person will be in two seconds.
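A minimal way to get “where will the person be in two seconds” is a constant-velocity extrapolation: assume the person keeps walking at their current speed and heading, then keep the planned path clear of that predicted position. This is only a baseline sketch – the article does not specify Smooth’s prediction model, and the clearance value here is an assumption.

```python
import math

def predict_position(pos, vel, horizon=2.0):
    """Constant-velocity guess at a person's position `horizon`
    seconds from now (a standard pedestrian-prediction baseline;
    not necessarily the model used in Smooth)."""
    return (pos[0] + vel[0] * horizon, pos[1] + vel[1] * horizon)

def waypoint_is_clear(waypoint, person_pos, person_vel, clearance=1.0):
    """Accept a waypoint only if it keeps `clearance` metres from the
    person's *predicted* position, not just their current one."""
    future = predict_position(person_pos, person_vel)
    return math.dist(waypoint, future) >= clearance

# A person at (0, 0) walking 0.7 m/s along x will be near (1.4, 0)
# in two seconds, so a waypoint at (1.5, 0) is rejected even though
# it is well clear of where the person stands right now.
print(waypoint_is_clear((1.5, 0.0), (0.0, 0.0), (0.7, 0.0)))  # False
print(waypoint_is_clear((0.0, 2.0), (0.0, 0.0), (0.7, 0.0)))  # True
```

Planning against the predicted position is what lets the robot yield early instead of freezing at the last moment.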
Socially aware navigation
Another tricky situation is when there is a group of people. Robots should not drive into the middle of the group, nor should they drive between two people who are communicating. Similarly, when someone is watching television, robots should not drive in front of the screen. “These are things that are not state of the art in current industrial technology,” said Krüger. “We call this ‘socially aware navigation’.”
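One way to encode the constraints Krüger describes – don’t cut between two people who are talking, don’t drive in front of a screen someone is watching – is to treat each such pair as a line segment the robot must keep clear of. The sketch below is a simplified illustration of that idea, with a hypothetical 0.8 m margin; it is not the Smooth project’s actual navigation code.

```python
import math

def point_segment_distance(p, a, b):
    """Distance from point p to the line segment a-b (2D tuples)."""
    ax, ay = a
    bx, by = b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.dist(p, a)
    # Project p onto the segment, clamped to its endpoints.
    t = ((p[0] - ax) * dx + (p[1] - ay) * dy) / (dx * dx + dy * dy)
    t = max(0.0, min(1.0, t))
    return math.dist(p, (ax + t * dx, ay + t * dy))

def socially_acceptable(waypoint, interaction_pairs, margin=0.8):
    """Reject waypoints that cut through an interaction: each pair is
    (pos_a, pos_b), e.g. two conversation partners, or a viewer and
    the TV screen they are watching (margin is an assumption)."""
    return all(
        point_segment_distance(waypoint, a, b) >= margin
        for a, b in interaction_pairs
    )
```

Group avoidance can be handled the same way by adding a segment (or a keep-out disc) for every pair in the group, so the robot routes around the formation rather than through its middle.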
The primary use case is to help elderly people, but not in a person’s home. For the robot to be cost-effective, it needs to be used a lot, and that is more likely in an institution or in a public place. “If you are at an airport, you could be offering a coffee, so two or three robots could be engaged in that the whole day,” said Krüger. “At home, offering a cup of coffee will maybe happen three times a day. So you don’t get a business case there.
“A logistic robot with a mild interaction component offering a cup of coffee, a mobile coffee vending machine in the airport or train station – these could be use cases. But it could also be, for example, transporting stuff from the canteen to a meeting place, and then having some clarifying dialogue at the end. Was the customer satisfied? Are any items missing?”
Krüger is confident that the work done on Smooth will be used in the near future. The fundamental building blocks will be used in industrial systems – especially the algorithms that observe humans and predict how they will move, and the software that perceives interest by making eye contact and detecting a human gaze.
He added: “While we are not likely to see humanoid robots or cognitive robots in the next two decades, the kind of robots we developed in the Smooth project will probably be used in the next two to five years in commercial applications. And at the same time, they might be seen as an important step towards truly cognitive robots.”