Thesis Defense - Amir Aly

On Tuesday, 16 December 2014 at 2:45 pm, Amir Aly defends his thesis entitled: "Towards an interactive human-robot relationship: Developing a customized robot's behavior to human's profile".


Thesis abstract:

Robots are becoming increasingly present in our society and daily life, and many challenges arise when trying to use them in a social context. A social robot that works and interacts among humans should have a clear understanding of the people acting in its surroundings. Understanding a human's profile (i.e., emotion and personality) is not a trivial task, and it plays an important role in enabling the robot to behave appropriately in a multimodal interaction context. This thesis addresses some of these interaction aspects with the aim of enhancing the human-robot relationship, and it is built upon three main contributions:

First, we developed a new online fuzzy-based emotion recognition system that can detect whether an expressed emotion matches one of the previously learned emotion clusters or constitutes a new cluster (not learned before) requiring a new verbal and/or non-verbal action to be synthesized. The obtained results demonstrated the soundness of the proposed methodology for online emotion recognition.
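As a rough sketch of this idea — not the thesis's actual fuzzy model; the distance-based novelty test, the threshold, and the update rule below are all assumptions for illustration — an online clusterer that either matches an incoming emotion feature vector to a learned cluster or opens a new one could look like:

```python
import numpy as np

def fuzzy_memberships(x, centroids, m=2.0):
    # Fuzzy c-means style membership degrees of sample x in each cluster
    d = np.linalg.norm(np.asarray(centroids) - x, axis=1) + 1e-12
    inv = d ** (-2.0 / (m - 1.0))
    return inv / inv.sum()

def classify_online(x, centroids, novelty_radius=3.0, lr=0.1):
    """Assign x to the closest learned emotion cluster, or declare a new one.

    Mutates `centroids` (a list of 1-D arrays) and returns
    (cluster_index, is_new): samples farther than `novelty_radius`
    from every centroid are treated as a not-yet-learned emotion."""
    x = np.asarray(x, dtype=float)
    if not centroids:
        centroids.append(x.copy())
        return 0, True
    dists = np.linalg.norm(np.asarray(centroids) - x, axis=1)
    if dists.min() > novelty_radius:
        centroids.append(x.copy())  # new cluster -> new action to synthesize
        return len(centroids) - 1, True
    u = fuzzy_memberships(x, centroids)
    best = int(np.argmax(u))
    # online update: pull the matched centroid toward the new sample
    centroids[best] = centroids[best] + lr * (x - centroids[best])
    return best, False
```

A matched sample nudges its centroid toward the new observation, so clusters keep adapting online, while a sample far from every centroid triggers a new cluster and, in the system described above, a new response to synthesize.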

In the second part of this thesis, we examined the similarity attraction principle in a human-robot interaction context. The robot detected the interacting human's personality as introverted or extraverted, and synthesized a combined verbal and non-verbal behavior adapted to that personality trait. The obtained results validated the similarity attraction principle (i.e., individuals preferred to interact with a robot having a similar personality). Another aspect addressed in this research was the comparison between the effect on interaction of a multimodal robot behavior expressed through gestures and speech, and that of a behavior expressed through speech only. The obtained results showed that the multimodal interaction was preferred by the interacting humans, as it makes the conveyed message clearer.
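The adaptation step can be pictured as a simple mapping from the detected trait to matched behavior parameters; the score scale and all parameter values below are illustrative placeholders, not taken from the thesis:

```python
def adapted_behavior(extraversion_score, threshold=0.5):
    """Map a detected personality score in [0, 1] to matching robot
    behavior parameters (similarity attraction: like prefers like).
    All parameter values here are illustrative placeholders."""
    if extraversion_score >= threshold:  # extraverted user -> lively robot
        return {"speech_rate": 1.2, "volume": 0.8,
                "gesture_amplitude": 0.9, "style": "expressive"}
    # introverted user -> calmer, more restrained robot
    return {"speech_rate": 0.9, "volume": 0.5,
            "gesture_amplitude": 0.4, "style": "reserved"}
```

The point of the sketch is only that both the verbal channel (speech rate, volume) and the non-verbal channel (gesture amplitude) are tuned jointly to the same detected trait.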

Finally, in the last part of this thesis, we focused on generating head-arm metaphoric gestures under different emotional states based on the prosodic cues of the interacting human, and the system achieved satisfying results. Additionally, we used this system in an emotional interactive human-robot storytelling scenario that employs different modalities of interaction (i.e., facial expressions, metaphoric gestures, and prosody) in order to measure the effect of the generated multimodal robot behavior on interaction. The obtained results showed that the multimodal robot behavior was preferred over a behavior employing fewer affective cues. All these aspects contribute to developing a system that allows the robot to interact with humans effectively in a wide range of contexts. In the conducted experiments, the NAO robot from Aldebaran Robotics and the ALICE robot from Hanson Robotics were used for validation.
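As a toy illustration of driving gesture dynamics from prosody — the chosen features, weights, and offsets below are invented for illustration and are not the thesis's model — one might map pitch and energy statistics of the human's speech to gesture amplitude and speed:

```python
def gesture_parameters(pitch_hz, energy_db, neutral_pitch=120.0):
    """Map simple prosodic statistics to head-arm gesture parameters.

    Higher, more variable pitch and louder speech yield larger and
    faster gestures; all weights and offsets are invented constants."""
    mean_pitch = sum(pitch_hz) / len(pitch_hz)
    pitch_range = max(pitch_hz) - min(pitch_hz)
    mean_energy = sum(energy_db) / len(energy_db)
    # crude "arousal" estimate clamped to [0, 1]
    arousal = (0.5 * (mean_pitch / neutral_pitch - 1.0)
               + 0.005 * pitch_range
               + 0.01 * (mean_energy - 60.0))
    arousal = max(0.0, min(1.0, arousal))
    return {
        "amplitude": 0.2 + 0.8 * arousal,  # fraction of full arm extension
        "speed": 0.3 + 0.7 * arousal,      # fraction of maximum joint speed
    }
```

Excited speech (high, variable pitch; high energy) thus produces broad, fast gestures, while flat, quiet speech produces small, slow ones.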

Thesis Examination Committee:

Reviewer: Jean-Claude Martin, University of Paris 11, France
Reviewer: Rachid Alami, LAAS-CNRS, France
Examiner: Angelo Cangelosi, University of Plymouth, England
Examiner: Peter Ford Dominey, INSERM CNRS, France
Examiner: Ginevra Castellano, University of Birmingham, England & Uppsala University, Sweden
Examiner: Nicola Bellotto, University of Lincoln, England
Supervisor: Adriana Tapus, ENSTA ParisTech, France


Practical information:

Time: 2:45 pm in lecture hall (amphi) 2.3.29 (2nd floor)

The defense is open to the public.

How to get to ENSTA ParisTech?

