Humanoid Robots

ROBIN

RRLAB SEA


In the field of humanoid robot research, biologically motivated control approaches for bipedal running are investigated, as well as emotion-based interaction mechanisms using the humanoid robot ROBIN.


Bipedal Locomotion

After many years of development, bipedal robots still lag far behind their biological model. Compared to humans, they lack the inherent agility of the musculoskeletal system.

At the Robotics Research Lab, we are currently testing a new compliant leg that replicates the muscular redundancy of the human leg. Equipped with highly back-drivable, compliant Series Elastic Actuators (RRLAB-SEA), the leg exhibits key characteristics of the human leg: compliant muscles, end-point force magnification, a low inertial profile, agility, and low power consumption.
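The specifications of the RRLAB-SEA are not given here, but the series-elastic principle itself can be sketched: a spring sits between motor and joint, so joint torque can be inferred from the spring deflection and commanded by positioning the motor relative to the joint. A minimal sketch, assuming a linear torsional spring (the stiffness value and function names are illustrative, not RRLAB-SEA parameters):

```python
# Minimal sketch of the series-elastic actuator principle.
# The stiffness below is an assumed, illustrative value,
# not a specification of the RRLAB-SEA.

SPRING_STIFFNESS = 300.0  # N*m/rad, assumed linear torsional spring

def output_torque(motor_angle: float, joint_angle: float) -> float:
    """Joint torque inferred from spring deflection (Hooke's law)."""
    return SPRING_STIFFNESS * (motor_angle - joint_angle)

def motor_angle_for_torque(desired_torque: float, joint_angle: float) -> float:
    """Motor setpoint that produces the desired joint torque."""
    return joint_angle + desired_torque / SPRING_STIFFNESS
```

Because torque is sensed through the soft spring rather than a stiff load cell, the actuator remains back-drivable and tolerant of impacts, which is what makes it attractive for legged locomotion.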

Moreover, technical bipedal walking is still significantly restricted compared to human walking. Existing systems, mostly based on conventional control theory, are not efficient enough because they lack adaptability to uneven and unstructured terrain. The approach pursued at the Robotics Research Lab examines how the principles of natural running can be mapped onto adequately controlled technical systems. It has been shown that reflexes and motor programs developed at the lowest level can be fused for each joint so that two-legged running behavior emerges, even though no foot or upper-body movements are specified in advance. The control approach has been successfully tested on a dynamically simulated robot of 1.80 m height and 76 kg weight that moves in different application scenarios. Especially with respect to adaptivity to rough terrain and robustness to disturbances, the developed control system significantly exceeds approaches known from the literature. More details can be found here.
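The per-joint fusion of reflexes and motor programs described above can be illustrated schematically. The sketch below assumes a simple summation of a rhythmic feed-forward pattern and a damping-like reflex; all names, gains, and the fusion rule are illustrative assumptions, not the lab's actual controller:

```python
# Schematic sketch: fusing a low-level reflex and a motor program
# into one joint command. Gains and the fusion rule (summation)
# are illustrative assumptions.
import math

def stretch_reflex(joint_vel: float, gain: float = 5.0) -> float:
    """Damping-like reflex torque opposing rapid joint motion."""
    return -gain * joint_vel

def motor_program(phase: float, amplitude: float = 10.0) -> float:
    """Rhythmic feed-forward torque over the gait phase in [0, 1)."""
    return amplitude * math.sin(2.0 * math.pi * phase)

def joint_command(phase: float, joint_vel: float) -> float:
    """Fuse reflex and motor-program contributions for one joint."""
    return motor_program(phase) + stretch_reflex(joint_vel)
```

The point of such an architecture is that no explicit foot or trunk trajectory is planned; coordinated gait emerges from the interplay of these low-level contributions across all joints.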


Human-Robot Interaction

The second research area in the field of humanoid robots is human-machine interaction. Its major goal is to realize natural and efficient human-robot interaction through different modalities such as hand, arm, and body gestures as well as facial expressions performed by the robot. Conversely, by detecting these modalities in a person interacting with the robot, human behavior can be analyzed and human intentions predicted. It has been demonstrated that the system can recognize gestures, emotions, and the activity of interaction partners in the robot's surroundings. Low-level perception features such as facial expressions, hand gestures, head gestures, posture detection, face recognition, gender recognition, eye-gaze behavior, and ethnicity detection using visual and depth information have already been implemented. For complex human behavior, a high-level perception layer combines the information of multiple low-level features to recognize human feedback behavior, gather contextual information, and detect human personality.
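The idea of a high-level percept built on top of several low-level features can be shown with a toy fusion. The feature names, weights, and thresholds below are assumptions chosen for illustration, not the lab's actual model:

```python
# Illustrative sketch: fusing low-level perception features into a
# coarse human-feedback estimate. Feature names, weights, and
# thresholds are assumptions, not the lab's actual model.

LOW_LEVEL_WEIGHTS = {
    "smile": 0.5,        # from facial-expression detection
    "head_nod": 0.3,     # from head-gesture detection
    "eye_contact": 0.2,  # from eye-gaze analysis
}

def feedback_score(features: dict) -> float:
    """Weighted fusion of low-level feature activations into [0, 1]."""
    return sum(LOW_LEVEL_WEIGHTS[name] * float(features.get(name, 0.0))
               for name in LOW_LEVEL_WEIGHTS)

def classify_feedback(features: dict) -> str:
    """Map the fused score to a coarse feedback label."""
    score = feedback_score(features)
    if score > 0.6:
        return "positive"
    if score < 0.3:
        return "negative"
    return "neutral"
```

The design choice here is typical of layered perception: low-level detectors stay independent and reusable, while the high-level layer only consumes their outputs.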

The humanoid robot ROBIN is used to test human-robot interaction. It is equipped with a back-lit projected face, arms, hands, and a torso, and it can speak English and German via its built-in speech synthesis module. The face uses projection technology to express almost any facial expression through a set of action units. The upper body has 35 degrees of freedom, including dexterous hands that can perform nearly all common gestures. The system has been validated in different scenarios, including a Twenty Questions game and other general interactive scenarios. When recognizing human feedback behaviors, different low-level percepts are combined. Contextual information is stored for every human interacting with the robot, and the robot uses this information to address individual interests more specifically. Another high-level human behavior, personality detection, is also considered: psychological studies have shown that non-verbal cues carry rich information about a person's personality, and this aspect has been exploited in this research area as well. More information related to current research and experiments can be found here.
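Expressing faces through action units can be sketched as composing and blending per-AU intensities. The AU numbers below follow the Facial Action Coding System convention, but the particular expression-to-AU mappings and intensities are simplified assumptions, not ROBIN's actual face model:

```python
# Illustrative sketch: composing facial expressions from action units.
# AU numbers follow the Facial Action Coding System convention; the
# mappings and intensities are simplified assumptions.

EXPRESSIONS = {
    "happiness": {6: 1.0, 12: 1.0},          # cheek raiser + lip corner puller
    "surprise":  {1: 1.0, 2: 1.0, 26: 0.8},  # brow raisers + jaw drop
    "sadness":   {1: 0.6, 4: 0.8, 15: 1.0},  # brows + lip corner depressor
}

def blend(expr_a: str, expr_b: str, alpha: float) -> dict:
    """Linearly blend two expressions' action-unit intensities.

    alpha = 0.0 yields expr_a, alpha = 1.0 yields expr_b.
    """
    aus = set(EXPRESSIONS[expr_a]) | set(EXPRESSIONS[expr_b])
    return {au: (1.0 - alpha) * EXPRESSIONS[expr_a].get(au, 0.0)
                + alpha * EXPRESSIONS[expr_b].get(au, 0.0)
            for au in aus}
```

A projected face makes this attractive: since every AU is just rendered imagery, intermediate and blended expressions cost nothing mechanically.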