
Bi-lateral Interaction Between Humanoid Robots And Human

Abstract: In this thesis, we address the problem of recognizing human body language in order to establish bi-lateral human-robot and robot-robot interaction. Our approach is founded on the identification of human gestures using a motion-analysis method that describes movements accurately. The thesis is divided into two parts: gesture recognition, and emotion recognition based on body gestures. In both parts we use two families of methods: classical machine learning and deep learning.

In the gesture-recognition part, we first define a local descriptor based on Laban Movement Analysis (LMA), a method that describes a movement through four components: Body, Space, Shape and Effort. Since the only goal in this part is gesture recognition, only the first three factors are used. The Dynamic Time Warping (DTW) algorithm is implemented to measure the similarity of the curves formed by the LMA descriptor vectors, and a Support Vector Machine (SVM) is used to train and classify the data. Thanks to a normalization process, our system is invariant to people's initial positions and orientations. Spline functions are used to resample the data, which reduces the size of our descriptor and adapts the data to the classification methods. Several experiments are performed on public data sets.

In the second part of this first section, we construct a new descriptor based on the geometric coordinates of different parts of the body in order to characterize a movement. In addition to the distances between the hip center and the other joints of the body and the angular changes, we define the triangles formed by different parts of the body and calculate their areas. We also calculate the area of the convex hull encompassing all the joints of the body.
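The geometric descriptor described above could be sketched roughly as follows. This is not the thesis implementation: the toy skeleton, the triangle choice, and all function names are assumptions, and for simplicity the convex-hull area is computed on a 2-D frontal-plane projection of the joints rather than in 3-D.

```python
import numpy as np

def triangle_area(p1, p2, p3):
    """Area of the 3-D triangle (p1, p2, p3) via the cross product."""
    return 0.5 * np.linalg.norm(np.cross(p2 - p1, p3 - p1))

def convex_hull_area_2d(points):
    """Andrew's monotone-chain convex hull plus the shoelace formula,
    applied to 2-D points."""
    pts = sorted(map(tuple, points))
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    def build(seq):
        h = []
        for p in seq:
            while len(h) >= 2 and cross(h[-2], h[-1], p) <= 0:
                h.pop()
            h.append(p)
        return h
    hull = build(pts)[:-1] + build(list(reversed(pts)))[:-1]
    n = len(hull)
    return 0.5 * abs(sum(hull[i][0] * hull[(i + 1) % n][1]
                         - hull[(i + 1) % n][0] * hull[i][1]
                         for i in range(n)))

def geometric_descriptor(joints, hip_idx=0, triangles=((1, 2, 4),)):
    """Per-frame descriptor: hip-to-joint distances, triangle areas,
    and the area of the convex hull of the joints (x-y projection)."""
    joints = np.asarray(joints, dtype=float)
    hip = joints[hip_idx]
    dists = [np.linalg.norm(j - hip)
             for i, j in enumerate(joints) if i != hip_idx]
    areas = [triangle_area(joints[a], joints[b], joints[c])
             for a, b, c in triangles]
    hull_area = convex_hull_area_2d(joints[:, :2])
    return np.array(dists + areas + [hull_area])

# Toy 5-joint "skeleton", hip at index 0 (made-up coordinates).
skeleton = np.array([[0.0, 0.0, 0.0], [0.0, 1.0, 0.0], [1.0, 1.0, 0.0],
                     [-1.0, 1.0, 0.0], [0.0, 2.0, 1.0]])
desc = geometric_descriptor(skeleton)
```

Each frame thus yields one fixed-length vector; concatenating (and spline-resampling) these vectors over time gives the sequence that the classifier consumes.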
Finally, we add the velocities of the different joints to the proposed descriptor. A long short-term memory (LSTM) network is used to evaluate this descriptor. The proposed algorithm is implemented on two public data sets, NTU RGB+D 120 and SYSU 3D HOI, and the results are compared with those available in the literature.

In the second section of this thesis, we first present a higher-level algorithm that identifies the inner feelings of human beings by observing their body movements. To define a robust descriptor, two methods are investigated: the first is LMA including the Effort factor, which describes both a movement and the state in which it is performed; the second is based on a set of spatio-temporal features. Next, a pipeline for expressive-motion recognition is proposed in order to classify people's emotions from their gestures using machine-learning methods (Random Decision Forest and feed-forward neural network), and a comparative study between these two methods is made in order to choose the better one. The approach is validated on public data sets and on our own data set of expressive gestures, called Xsens Expressive Motion (XEM). In the second part of this section, we carry out a statistical study based on human perception in order to evaluate the recognition system as well as the proposed motion descriptor. This allows us to estimate the capacity of our system to classify and analyze human emotions. In this part, two tasks are carried out with the two classifiers (the RDF for learning and the human-perception approach for validation).
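The DTW similarity measure used in the first part can be sketched with the classic dynamic-programming recurrence. This is a minimal illustration, not the thesis code; the two test sequences are made up, and in the thesis the inputs are LMA descriptor curves rather than scalar series.

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic Time Warping distance between two 1-D sequences.

    D[i, j] = local cost + min of the three predecessor cells,
    so the optimal alignment may stretch or compress time.
    """
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j],      # insertion
                                 D[i, j - 1],      # deletion
                                 D[i - 1, j - 1])  # match
    return D[n, m]

x = [0.0, 1.0, 2.0, 1.0, 0.0]
y = [0.0, 0.0, 1.0, 2.0, 1.0, 0.0]  # same shape, time-shifted
print(dtw_distance(x, y))  # -> 0.0: the warping absorbs the shift
```

A Euclidean comparison of the same two sequences would be nonzero (and is undefined for unequal lengths), which is why DTW is the natural choice for gesture curves of varying duration.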
Contributor: Frédéric Davesne
Submitted on: Monday, January 25, 2021 - 2:56:06 PM
Last modification on: Wednesday, March 17, 2021 - 7:26:28 PM




  • HAL Id : tel-03120401, version 1


Zahra Ramezanpanah. Bi-lateral Interaction Between Humanoid Robots And Human. Signal and Image processing. Université Paris-Saclay, Université d'Evry Val-d'Essonne, 2020. English. ⟨NNT : 2020UPASG039⟩. ⟨tel-03120401v1⟩


