SocSMCs: making robots interact without words


When we shake hands with someone or help them move furniture, our body calculates far more than we realise: the direction in which our hand should move, how much force to apply, how to keep our balance in the meantime. These decisions are made automatically, all day long, without our ever noticing.

Despite its complexity, this comes easily to people. For robots, it is a different story. To behave in a way we consider “normal” or “socially acceptable”, a robot needs every calculation explicitly programmed, even at what we would call a very basic level of motion control. Robots have neither intuition nor a social upbringing to draw on. Yet such interactive behaviour is essential if we want robots to adapt to our lives and help in them.

One of the EU research projects we are involved in, SocSMCs, studies socially relevant action-effect contingencies in human-human and human-robot interaction scenarios. To work towards socially competent robotics technology, some of the SocSMCs experiments are being run on PAL Robotics’ humanoid robot REEM-C.

First steps: a friendly and responsive handshake


One of the first capabilities developed on REEM-C for SocSMCs is a proper handshake: the humanoid grasps the other person’s hand with a controlled force and follows their movements.

The challenge is to create robots that operate in human environments in a way that feels natural to people. The robot should be able to understand the non-verbal cues that humans use all the time and react accordingly.
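A force-controlled, compliant handshake is typically built on something like admittance control: rather than holding a rigid position, the hand yields in proportion to the force the partner applies. The sketch below is a minimal, illustrative single-axis simulation of that idea; the function name, the gains, and the simplified dynamics are our assumptions, not REEM-C’s actual control interface.

```python
def admittance_step(position, velocity, force, dt,
                    mass=1.0, damping=8.0, stiffness=20.0, rest=0.0):
    """One Euler step of the admittance law m*a + d*v + k*(x - rest) = f_ext.

    The hand behaves like a virtual mass-spring-damper: an external
    force displaces it smoothly instead of meeting a rigid joint.
    """
    acceleration = (force - damping * velocity
                    - stiffness * (position - rest)) / mass
    velocity += acceleration * dt
    position += velocity * dt
    return position, velocity

# Simulate a partner pulling the hand with a steady 5 N force.
x, v = 0.0, 0.0
for _ in range(200):              # 2 s at a 100 Hz control rate
    x, v = admittance_step(x, v, force=5.0, dt=0.01)
# The hand settles near f/k = 5/20 = 0.25 m of compliant displacement.
```

Tuning the virtual stiffness and damping trades firmness against compliance: a stiffer setting makes the grip feel assertive, a softer one makes the robot follow the partner’s hand more readily.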

What is REEM-C learning now?

SocSMCs has more plans for our humanoid robot, such as taking a group selfie or helping someone move a piece of furniture without speaking. REEM-C is currently learning the steps for the latter task:

  1. See and recognise the object (the furniture)
  2. Approach the object together with the person, using autonomous navigation
  3. Grasp and lift the object in coordination with the person
  4. Move the object without being explicitly told where to go, inferring the person’s intentions from its sensors
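The last step above, moving without explicit commands, can be sketched as a simple force-following rule: the robot reads the force the partner applies through the carried object (for example via a wrist force/torque sensor) and turns it into a velocity command. The function, gains, and deadband below are illustrative assumptions, not REEM-C’s real interface.

```python
def follow_partner(force_x, force_y, gain=0.05, deadband=2.0):
    """Map the partner's force on the carried object (N) to a base
    velocity command (m/s). Forces inside the deadband are treated
    as sensor noise and ignored."""
    def axis(f):
        if abs(f) <= deadband:
            return 0.0
        # Scale the force beyond the deadband into a velocity command.
        return gain * (f - deadband if f > 0 else f + deadband)
    return axis(force_x), axis(force_y)

# Partner pushes 12 N forward and wobbles 1 N sideways:
vx, vy = follow_partner(12.0, -1.0)
# -> vx = 0.5 m/s forward, vy = 0.0 (the sideways wobble is filtered out)
```

In this way the robot “guesses” the human’s intention from physical interaction alone: a steady push forward becomes forward motion, while small, noisy forces produce no movement at all.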

Studying these cases will help us apply similar developments to the many other situations in which a robot must play an assistive, cooperative role in human-robot interaction.
