Human-Robot Collaboration

We give humanoid robots the ability to help and collaborate with humans.

Why

In modern societies, the demand for physical assistance to humans is increasing. In factories, production workers execute repetitive tasks that, in the long run, often cause musculoskeletal diseases. In clinics, orthopedic patients need orthoses and prostheses to overcome their daily impairments. At home, elderly people require a wide range of physical assistance to compensate for their muscles slowly losing strength. We thus need robot collaborators that perceive humans and correct inefficient collaboration and unergonomic interaction, which lead, in the long term, to musculoskeletal diseases.


[Figure: collaboration scenario]


What 

Robots can fulfill the human need for physical assistance. Traditional robots, however, are designed to act for humans, whereas human-robot collaboration requires robots to act with humans in a shared workspace. Robots that are nowadays proficient in physical interaction should therefore become just as proficient in physical collaboration. We need to develop safe, dependable systems that are able to perceive, react to, and collaborate with human beings; to understand the biomechanics of human collaborative motion; to track, understand, and predict human motion, in real time, in dynamic environments; to integrate cognition technologies into human-robot collaboration; and to develop tools for intuitive collaboration that increase human performance.

To pursue the above objectives, we attempt to answer the following two research questions:

Q1: How can a robot help a human?

Q2: How can a human help a robot?

A fundamental concern here is to decode, from a mathematical perspective, what it means for a human (or a robot) to help. Part of our theoretical research effort is devoted to tackling and answering these open questions.


How

The above objectives on human-robot collaboration are pursued along several research directions.

Research on wearable sensors for force sensing

We work on wearable sensors that allow us to measure the external forces acting on human beings. These forces are then used to estimate the musculoskeletal stresses during specific tasks.

Sandals with force-torque sensors

We have developed sandals equipped with force-torque sensors built in-house at IIT, from which the interaction forces between the human feet and the floor can be precisely measured.


Each force-torque sensor is also equipped with an IMU and two temperature sensors.
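
As a rough illustration of how the two sandal wrenches can be combined, the minimal Python sketch below expresses each measured six-axis wrench in a common ground frame and sums them into a total ground-reaction wrench. The readings, sensor poses, and function names are all hypothetical assumptions for illustration, not the actual IIT driver API.

import numpy as np

def skew(p):
    # 3x3 matrix such that skew(p) @ v equals the cross product p x v
    return np.array([[0.0, -p[2], p[1]],
                     [p[2], 0.0, -p[0]],
                     [-p[1], p[0], 0.0]])

def transform_wrench(f, m, R, p):
    # Express a wrench (force f, moment m), measured in a sensor frame,
    # in a common ground frame, given the sensor orientation R and position p.
    f_c = R @ f                       # rotate the force into the ground frame
    m_c = R @ m + skew(p) @ f_c       # transport the moment to the ground origin
    return f_c, m_c

# Hypothetical raw readings (N, N*m) from the left and right sandal sensors.
left = (np.array([1.0, 0.0, 390.0]), np.array([0.2, -0.1, 0.0]))
right = (np.array([-0.5, 0.0, 400.0]), np.array([-0.1, 0.3, 0.0]))

# Hypothetical sensor poses (orientation, position) in the ground frame,
# e.g. as provided by each sensor's on-board IMU plus a calibration step.
poses = [(np.eye(3), np.array([0.0, -0.1, 0.0])),
         (np.eye(3), np.array([0.0, 0.1, 0.0]))]

f_tot, m_tot = np.zeros(3), np.zeros(3)
for (f, m), (R, p) in zip((left, right), poses):
    f_c, m_c = transform_wrench(f, m, R, p)
    f_tot += f_c
    m_tot += m_c

print("total ground-reaction force [N]:", f_tot)
print("total moment about ground origin [N*m]:", m_tot)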

Sandals with tactile-sensor-based insole instead of force-torque sensors

To reduce the cost of a pair of sandals, it would be ideal to substitute the force-torque sensors with cheaper sensors. Hence, together with the iCub research line, we developed an insole able to measure the pressure distribution produced by the foot in contact with the sandal.

The insole is an array of capacitance-based tactile sensors, and the above video shows the activation of the sensor arrays as the human foot exerts pressure on the sandal.
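
For intuition, here is a minimal Python sketch of how a calibrated pressure map from such an insole could be reduced to a total normal force and a centre of pressure. The array shapes, units, and names are assumptions for illustration, not the actual insole interface.

import numpy as np

def insole_summary(pressure, positions, taxel_area):
    # pressure:   (N,) calibrated pressure at each taxel, in Pa
    # positions:  (N, 2) x-y position of each taxel on the insole, in m
    # taxel_area: effective area of one taxel, in m^2
    forces = pressure * taxel_area                 # per-taxel normal forces (N)
    f_total = forces.sum()
    if f_total <= 0.0:
        return 0.0, None                           # foot not in contact
    cop = (forces[:, None] * positions).sum(axis=0) / f_total
    return f_total, cop                            # total force, centre of pressure

# Toy example with two taxels, one under the heel and one under the toe.
pressure = np.array([3.0e4, 1.0e4])                # Pa
positions = np.array([[0.05, 0.0], [0.20, 0.0]])   # m
print(insole_summary(pressure, positions, taxel_area=1.0e-4))
# -> (4.0, array([0.0875, 0.]))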

Research on the on-line estimation of human musculoskeletal stresses

The above wearable sensors are fundamental to retrieve the external forces acting on the human. They are complemented with other wearable sensors that measure the human motion: we use the Xsens wearable sensors to measure the position and orientation of the human limbs, and then apply on-line estimation algorithms developed in-house at IIT to retrieve the human posture.
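
As a simplified stand-in for this posture-retrieval step, the sketch below recovers a joint angle from the world orientations of two adjacent limb segments, which is the kind of information a body-worn suit such as the Xsens one provides. It is a minimal illustration; the function and variable names are hypothetical and the actual IIT pipeline is more involved.

import numpy as np

def joint_angle(R_parent, R_child):
    # World orientations of two adjacent segments as 3x3 rotation matrices.
    R_rel = R_parent.T @ R_child          # child orientation relative to parent
    # Rotation angle from the trace of a rotation matrix, clipped for safety.
    cos_a = np.clip((np.trace(R_rel) - 1.0) / 2.0, -1.0, 1.0)
    return np.arccos(cos_a)               # joint angle in rad

# Toy example: thigh aligned with the world frame, shank flexed 30 deg about x.
a = np.deg2rad(30.0)
R_thigh = np.eye(3)
R_shank = np.array([[1.0, 0.0, 0.0],
                    [0.0, np.cos(a), -np.sin(a)],
                    [0.0, np.sin(a), np.cos(a)]])
print(np.rad2deg(joint_angle(R_thigh, R_shank)))   # ~30.0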

By combining the human motion, the forces measured by the sensorised sandals, and a model of the human body, we can also estimate the human musculoskeletal stresses. We have developed on-line, maximum-a-posteriori (MAP) based algorithms that estimate the human musculoskeletal stresses in any human configuration.
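
To give a flavour of the MAP machinery, the sketch below solves the standard linear-Gaussian case: with measurements y = A d + noise and a Gaussian prior on the vector d of dynamic variables (which, in our setting, stacks quantities such as joint torques and contact forces), the MAP estimate has a closed form. This is a generic textbook illustration, not the exact IIT formulation.

import numpy as np

def map_estimate(A, y, Sigma_y, mu_d, Sigma_d):
    # MAP estimate of d for y = A d + e, with e ~ N(0, Sigma_y) and
    # prior d ~ N(mu_d, Sigma_d):
    #   d_hat = (A' Sy^-1 A + Sd^-1)^-1 (A' Sy^-1 y + Sd^-1 mu_d)
    Sy_inv = np.linalg.inv(Sigma_y)
    Sd_inv = np.linalg.inv(Sigma_d)
    H = A.T @ Sy_inv @ A + Sd_inv            # posterior information matrix
    b = A.T @ Sy_inv @ y + Sd_inv @ mu_d
    return np.linalg.solve(H, b)

# Toy example: two noisy measurements of a single dynamic variable.
A = np.array([[1.0],
              [1.0]])
y = np.array([2.1, 1.9])
d_hat = map_estimate(A, y,
                     Sigma_y=0.01 * np.eye(2),       # measurement covariance
                     mu_d=np.zeros(1),
                     Sigma_d=100.0 * np.eye(1))      # weak prior
print(d_hat)   # close to 2.0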

The above video visualises the outcome of our estimation algorithms for human musculoskeletal stresses. More precisely, the human performs some random motions while the robot stays still. The whiter the circles around the human avatar on the right-hand side, the higher the estimated human musculoskeletal stresses. The yellow arrows at the avatar's feet represent the estimated forces between the human and the floor.

Research on the control of physical human-robot interaction

Using Lyapunov theory, this research attempts to answer the two aforementioned questions, i.e.

Q1: How can a robot help a human?

Q2: How can a human help a robot?

In fact, the kinematics (i.e. position and velocity) and dynamics (i.e. musculoskeletal stresses and foot forces) of the human being can be sent to the robot, which has to move to achieve coordinated human-robot actions. For instance, the robot can stand up from a chair while being helped by a human, and can also minimise some human musculoskeletal stress during this action.
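
One way to make "help" mathematically precise, sketched here under our own simplifying assumptions, is via Lyapunov theory: encode the collaborative task in a positive-definite function V of the combined human-robot state x, and call a robot action helpful if it makes V decrease faster along the closed-loop trajectories than it would without the robot's contribution:

\[
V(x) > 0, \qquad
\dot{V}(x) = \nabla V(x)^{\top} f(x,\, u_h,\, u_r),
\]
\[
u_r \ \text{helps the human if} \qquad
\dot{V}(x;\, u_h,\, u_r) \;\le\; \dot{V}(x;\, u_h,\, 0) \;\le\; 0,
\]

where f is the coupled human-robot dynamics and u_h, u_r are the human and robot control actions. Question Q2 then follows by symmetry, exchanging the roles of u_h and u_r.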

The video below shows exactly this interaction: the humanoid robot iCub stands up from a chair while being helped by a human being. The robot uses all the kinematic and dynamic information coming from the sensorised human.