Robin: ROBot-human INteraction

Contact: Zuhair Zafar

Description

The interaction between humans and robots is often limited to input devices like keyboards and mice. Future applications of mobile robots will rely on natural interaction. Service robots, for example, should be able to assist people with their housework, and it must be possible to control such a robot without specific technical knowledge. Elderly users in particular need to communicate with robots in a natural way.

Communication between humans is not based on speech alone. It is a complex combination of speech, gestures, facial expressions and emotional cues. It is therefore necessary to observe the movements of a communication partner; these movements can range from a facial expression of emotion to a wave of the hand.

This research work focuses on efficient human-robot interaction. The goal is to recognize human emotions and predict human intentions in order to interact in a natural, human-like way. In this regard, RRLAB has conducted extensive work on basic perception tasks such as face and human detection, facial expression recognition, static and dynamic hand gesture recognition, head gesture and head pose estimation, as well as higher-level perception tasks such as recognizing human feedback and intentions. A dialog system has been established that builds on these basic modules and generates interaction scenarios, enabling the robot to interact with humans naturally. An important step towards natural communication is therefore the dynamic modeling of humans. This model enables the robot to interpret the movements of a communication partner and react adequately.
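The architecture described above can be sketched as independent perception modules feeding their observations into a central dialog system, which selects the robot's reaction. The following minimal Python sketch illustrates this idea; the module names, labels, confidence-based fusion rule and response table are illustrative assumptions, not RRLAB's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class Observation:
    """A single cue reported by a perception module (hypothetical schema)."""
    module: str       # e.g. "hand_gesture", "facial_expression"
    label: str        # recognized class, e.g. "wave", "sad"
    confidence: float # recognition confidence in [0, 1]

class DialogSystem:
    """Toy dialog system: fuses cues and maps the strongest one to a reaction."""

    def __init__(self) -> None:
        self.observations: list[Observation] = []

    def update(self, obs: Observation) -> None:
        # Perception modules push their results here asynchronously.
        self.observations.append(obs)

    def react(self) -> str:
        # Simple fusion rule (an assumption): act on the most confident cue.
        if not self.observations:
            return "idle"
        best = max(self.observations, key=lambda o: o.confidence)
        responses = {
            "wave": "greet user",
            "sad": "offer help",
            "nod": "confirm and continue",
        }
        return responses.get(best.label, "ask for clarification")

# Example: a detected hand wave outweighs a weaker facial-expression cue.
dialog = DialogSystem()
dialog.update(Observation("hand_gesture", "wave", 0.9))
dialog.update(Observation("facial_expression", "sad", 0.6))
print(dialog.react())  # -> greet user
```

In a real system each module would run its own recognizer (e.g. on camera input) and the fusion would be considerably richer, but the separation between perception modules and the dialog layer follows the structure described in the text.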

Images

Videos

Head Pose Estimation
Robot plays 'Hamlet'
Facial Expression Recognition
Dynamic Hand Gesture Recognition
Human Robot Interaction
Movie Schedule Information
Multiple Human Tracking
Static Hand Gesture Recognition
Disinterested Human Behavior
Weather Information
Human Posture Detection