Robotics Research Lab

Robin

ROBot-human INteraction

Description

The interaction between humans and robots is often limited to input devices such as keyboards and mice. Future applications of mobile robots will rely on natural interaction. A service robot, for example, should be able to assist a person with housework, and it must be possible to control such a robot without specific technical knowledge. Elderly people in particular want to communicate in a natural way.

Communication between humans is not based on speech alone. It is a complex interplay of speech, gestures, facial expressions and emotional cues. It is therefore necessary to observe the movements of a communication partner, which range from facial expressions of emotion to a simple wave of the hand.

This research work focuses on efficient human-robot interaction. The goal is to recognize human emotions and predict human intentions in order to interact in a natural, human-like way. To this end, RRLAB has conducted extensive work on basic perception tasks such as face and human detection, facial expression recognition, static and dynamic hand gestures, head gestures and head poses, as well as higher-level perception tasks such as recognizing human feedback and intentions. A dialog system has been established that builds on these basic modules and generates scenarios in which the robot can interact with humans naturally. An important step towards natural communication is therefore the dynamic modeling of humans: such a model enables the robot to interpret the movements of a communication partner and react adequately.
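The architecture sketched above, with independent perception modules whose outputs feed a dialog system that selects the robot's behavior, can be illustrated by a minimal toy example. This is not RRLAB's actual implementation: the `Percept` fields and the `select_response` policy are invented here purely to show the shape of such a fusion-then-dialog step.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Percept:
    """Fused output of several hypothetical perception modules."""
    emotion: Optional[str] = None    # e.g. "happy", "sad" from facial expressions
    gesture: Optional[str] = None    # e.g. "wave" from hand-gesture recognition
    intention: Optional[str] = None  # e.g. "greet" from intention prediction

def select_response(p: Percept) -> str:
    """Toy dialog policy: map the fused percept to a robot action."""
    if p.gesture == "wave" or p.intention == "greet":
        return "greet_back"
    if p.emotion == "sad":
        return "offer_help"
    return "idle"
```

In a real system each field would be produced by a dedicated recognizer and the policy would be far richer, but the principle is the same: low-level perception results are fused into one structure before the dialog layer decides how to react.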

Publications

  • Multimodal Fusion of Human Behavioural Traits. A Step Towards Emotionally Intelligent Human-Robot Interaction.
    Zuhair Zafar
    (2020)
    http://nbn-resolving.org/urn:nbn:de:hbz:386-kluedo-59800
  • Pseudo-Randomization in Automating Robot Behaviour during Human-Robot Interaction.
    Sarwar Paplu, Chinmaya Mishra and Karsten Berns
    Proceedings of the 10th IEEE International Conference on Development and Learning and Epigenetic Robotics (ICDL-ER2020), pp. 120-125. (2020)
  • Automatic Assessment of Human Personality Traits. A Step towards Intelligent Human-Robot Interaction.
    Zuhair Zafar, Sarwar Paplu and Karsten Berns
    2018 IEEE-RAS 18th International Conference on Humanoid Robots (Humanoids), pp. 670-675. (2018)
  • Emotion Based Human-Robot Interaction.
    Karsten Berns and Zuhair Zafar
    Proceedings of the 13th International Scientific-Technical Conference on Electromechanics and Robotics "Zavalishin's Readings" (ER(ZR)). (2018)
  • Real-Time Recognition of Human Postures for Human-Robot Interaction.
    Zuhair Zafar, Rahul Venugopal and Karsten Berns
    Proceedings of the 11th International Conference on Advances in Computer-Human Interactions (ACHI), pp. 114-119. (2018)
  • Real-time Recognition of Extroversion-Introversion Trait in Context of Human-Robot Interaction.
    Zuhair Zafar, Sarwar Paplu and Karsten Berns
    Advances in Service and Industrial Robotics, Vol. 67, pp. 63-70. (2018)
    https://doi.org/10.1007/978-3-030-00232-9 ISBN: 978-3-030-00232-9
  • Ability of Humanoid Robot to Perform Emotional Body Gestures.
    Djordje Urukalo, Ljubinko Kevac, Zuhair Zafar, Salah Al-Darraji, Aleksandar Rodić and Karsten Berns
    Advances in Service and Industrial Robotics, Vol. 49, pp. 657-664. (2017)
    http://www.springer.com/gp/book/9783319612751 ISBN: 978-3-319-61275-1
  • Human Robot Interaction using Dynamic Hand Gestures.
    Zuhair Zafar, Daniel Villarreal, Salah Al-Darraji, Djordje Urukalo, Karsten Berns and Aleksandar Rodić
    Advances in Service and Industrial Robotics, Vol. 49, pp. 647-656. (2017)
    http://www.springer.com/gp/book/9783319612751 ISBN: 978-3-319-61275-1
  • Interactive Communication Between Human and Robot Using Nonverbal Cues.
    Salah Al-Darraji, Zuhair Zafar, Karsten Berns, Djordje Urukalo and Aleksandar Rodić
    Advances in Service and Industrial Robotics, Vol. 49, pp. 673-680. (2017)
    http://www.springer.com/gp/book/9783319612751 ISBN: 978-3-319-61275-1
  • Action Unit Based Facial Expression Recognition Using Deep Learning.
    Salah Al-Darraji, Karsten Berns and Aleksandar Rodić
    Proceedings of the 25th International Conference on Robotics in Alpe-Adria-Danube Region, RAAD 2016, Vol. 540, pp. 413-420. (2016)
    http://www.springer.com/gp/book/9783319490571 ISBN: 978-3-319-49057-1
  • Embodiment of Human Personality with EI-Robots by Mapping Behaviour Traits from Live-Model.
    Aleksandar Rodić, Djordje Urukalo, Milica Vujović, Sofija Spasojević, Marija Tomić, Karsten Berns, Salah Al-Darraji and Zuhair Zafar
    Advances in Intelligent Systems and Computing, Vol. 540, pp. 437-448. (2016)
    http://www.springer.com/gp/book/9783319490571 ISBN: 978-3-319-49057-1
  • Perception of Nonverbal Cues for Human-Robot Interaction.
    Salah Al-Darraji
    (2016)
    http://www.dr.hut-verlag.de/9783843928526.html ISBN-13: 978-3-8439-2852-6
  • Real-time Perception of Nonverbal Human Feedback in a Gaming Scenario.
    Salah Al-Darraji, Zuhair Zafar and Karsten Berns
    Proceedings of the 2016 British HCI Conference. (2016)
  • Recognizing Hand Gestures Using Local Features. A Comparison Study.
    Zuhair Zafar, Karsten Berns and Aleksandar Rodić
    Advances in Intelligent Systems and Computing, Vol. 540, pp. 394-401. (2016)
    http://www.springer.com/gp/book/9783319490571 ISBN: 978-3-319-49057-1
  • Recognizing Hand Gestures for Human-Robot Interaction.
    Zuhair Zafar and Karsten Berns
    Proceedings of the 9th International Conference on Advances in Computer-Human Interactions (ACHI), pp. 333-338. (2016)
  • A Multimodal Nonverbal Human-Robot Communication System.
    Salah Saleh, Manish Sahu, Zuhair Zafar and Karsten Berns
    Proceedings of the 6th International Conference on Computational Bioengineering (ICCB). (2015)

 
