SocSMCs: making robots interact without words

When we shake hands with someone or help them move a piece of furniture, our body is calculating far more than we realise. The direction in which our hand moves, how much force to apply, how we keep our balance in the meantime… These are automatic decisions our body makes every day, and we barely notice them happening.

Despite its complexity, all of this comes easily to people. For robots, it is a different story. Every calculation behind behaviour we would consider “normal” or “socially acceptable” has to be programmed explicitly, even at the most basic level of motion control. Robots have no intuition or social education to build their actions on. Yet this kind of interactive behaviour is essential if we want robots to adapt to our lives and help us in them.

One of the EU research projects we are involved in, SocSMCs, studies socially relevant action-effect contingencies in human-human and human-robot interaction scenarios. To move towards socially competent robotics technology, some of the SocSMCs tests are being carried out with PAL Robotics’ humanoid robot REEM-C.

First steps: a friendly and responsive handshake

One of the first behaviours developed with REEM-C for SocSMCs enables the humanoid to give a proper handshake: grasping the other person’s hand with controlled force and following their movements.
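
A natural way to frame such a behaviour is as an admittance control loop: the force measured at the wrist drives a small corrective motion, so the arm yields to the partner’s hand, while a separate limit keeps the squeeze gentle. Below is a minimal Python sketch of that idea; the sensor and actuator functions (`read_wrist_force`, `send_hand_position`, `send_grip_force`) and all the gains are hypothetical placeholders for illustration, not REEM-C’s actual API.

```python
import time
import numpy as np

# --- Hypothetical hardware interface (placeholders, not the REEM-C API) ---
def read_wrist_force():
    """3D contact force [N] measured at the wrist force/torque sensor."""
    return np.zeros(3)

def send_hand_position(x):
    """Command a Cartesian hand position [m]."""
    pass

def send_grip_force(f):
    """Command a grip (squeeze) force [N]."""
    pass

# Admittance parameters: how compliantly the arm yields to external force.
DAMPING = 40.0    # N*s/m: higher means a stiffer, slower-yielding arm
MAX_GRIP = 8.0    # N: cap so the handshake stays gentle
GRIP_RAMP = 2.0   # N/s: build up the squeeze gradually
DT = 0.01         # s: 100 Hz control loop

x_cmd = np.array([0.4, -0.2, 1.0])   # illustrative initial hand pose
grip_cmd = 0.0

for _ in range(int(5.0 / DT)):       # shake hands for ~5 s in this sketch
    f_ext = read_wrist_force()

    # Pure-damping admittance: velocity proportional to external force,
    # so the arm yields in whatever direction the partner moves it.
    x_cmd += (f_ext / DAMPING) * DT
    send_hand_position(x_cmd)

    # Ramp the squeeze up gently, then hold it below the comfort limit.
    grip_cmd = min(grip_cmd + GRIP_RAMP * DT, MAX_GRIP)
    send_grip_force(grip_cmd)

    time.sleep(DT)
```

The damping constant is the key design choice here: a low value makes the arm feel loose and passive, a high one makes it feel rigid, so tuning it is what makes the handshake read as “friendly and responsive”.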

The challenge is to create robots that operate in human environments in a way that feels natural to people. The robot should be able to understand the non-verbal cues that humans use all the time and react accordingly.

What is REEM-C learning now?

SocSMCs has more plans for our humanoid robot, such as taking a group selfie or helping someone move a piece of furniture without speaking. REEM-C is currently learning the steps to accomplish the latter task, which are (see the sketch after the list):

  1. See and recognise the object (the furniture)
  2. Walk to the object together with the person, using autonomous navigation
  3. Grasp the object and lift it in coordination with the person
  4. Move the object without being explicitly told where to go, “guessing the human’s intentions” from its sensors
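
Step 4 is the interesting one: the only signal the robot receives is the force the human transmits through the shared object. One common way to approach it is again as a follower problem: the robot measures the interaction force on its side of the furniture and converts it into a base velocity, so it moves wherever the human leads. The sketch below illustrates that idea under those assumptions; every function in it (`detect_furniture`, `navigate_to`, `grasp_and_lift`, and so on) is a hypothetical placeholder, not the project’s actual software.

```python
import numpy as np

# --- Hypothetical interfaces (placeholders, not the SocSMCs software) ---
def detect_furniture():
    """Step 1: perceive the object; returns its 2D position, or None."""
    return np.array([2.0, 1.0])

def navigate_to(goal):
    """Step 2: walk to the object using autonomous navigation."""
    pass

def grasp_and_lift():
    """Step 3: grasp and lift in coordination with the person."""
    pass

def read_interaction_force():
    """Step 4 input: 2D force [N] the human transmits through the object."""
    return np.zeros(2)

def send_base_velocity(v):
    """Command a 2D base velocity [m/s]."""
    pass

GAIN = 0.02       # m/s per N: how strongly the robot follows the human's lead
DEADBAND = 3.0    # N: ignore small forces so sensor noise doesn't cause drift
DT = 0.05         # s: 20 Hz follower loop

# Steps 1-3: perceive, approach, and lift the object with the person.
goal = detect_furniture()
if goal is not None:
    navigate_to(goal)
    grasp_and_lift()

    # Step 4: "guess the human's intentions" from the forces the robot
    # feels, moving in whatever direction the human pulls the furniture.
    for _ in range(int(30.0 / DT)):   # follow for ~30 s in this sketch
        f = read_interaction_force()
        if np.linalg.norm(f) < DEADBAND:
            send_base_velocity(np.zeros(2))   # no clear intention: hold still
        else:
            send_base_velocity(GAIN * f)      # follow the human's lead
```

The deadband is what keeps the robot from wandering off on sensor noise: it only starts moving once the human’s pull is unambiguous, which is exactly the kind of non-verbal cue the project is studying.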

Studying these cases will help us apply similar developments to the many other situations in which a robot needs to play an assistive, cooperative role in human-robot interaction.
