Undergraduate degree project
Development of a human-humanoid interaction system through the recognition and learning of body language
Date
2018-03-13
Author
González López, Francisco Javier
Institution
Abstract
At present, advances in the different fields that make up robotics have allowed it to expand into areas beyond the industrial environment, giving rise to new concepts such as social and cognitive robotics, which, in their simplest formulation, seek to develop autonomous devices that act as companions for human beings, enrich people's daily lives and improve their quality of life. Thus, the development of humanoid robots intended for tasks such as assisting the sick and the elderly, or attending to the public in environments such as airports, hotels and shopping centers, has opened the door to new fields of research, ranging from the design of the robots themselves to the development of techniques for interaction with humans.
This project proposes the development of a system that adds functionalities to complement the communication process between human beings and humanoid robots, improving human-humanoid interaction through the implementation of machine learning algorithms, specifically artificial neural networks, used to recognize a person's mood (Happy, Sad, Angry, Surprised, Reflective and Normal) from the non-verbal language expressed through body language. The system also makes it possible to "teach" the humanoid robot non-verbal language with which it can complement verbal messages and interact in a manner consistent with the mood of the human being, so that the robot can generate its own body gestures.
The developed system includes a function based on multilayer perceptron (MLP) neural networks that recognizes the user's mood and classifies it into the established categories from the expressed body language, using angular values that describe the orientation of the different body joints, obtained with the Kinect Version 2 sensor developed by Microsoft.
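To illustrate this classification step, the sketch below shows a minimal version of the idea, assuming each Kinect V2 skeleton frame is reduced to a fixed-length vector of joint-orientation angles; the feature dimension, hidden-layer sizes and the scikit-learn implementation are illustrative assumptions, while the six mood categories come from the work described above.

# Minimal sketch of the mood-recognition step, not the thesis code.
import numpy as np
from sklearn.neural_network import MLPClassifier

MOODS = ["Happy", "Sad", "Angry", "Surprised", "Reflective", "Normal"]
N_ANGLES = 25  # hypothetical: one orientation angle per tracked Kinect V2 joint

def train_mood_classifier(X, y):
    """X: (n_samples, N_ANGLES) joint-angle features; y: mood labels."""
    clf = MLPClassifier(hidden_layer_sizes=(64, 32), activation="relu",
                        max_iter=500, random_state=0)
    clf.fit(X, y)
    return clf

# Toy usage with random angles standing in for captured skeleton frames.
rng = np.random.default_rng(0)
X_demo = rng.uniform(-np.pi, np.pi, size=(300, N_ANGLES))
y_demo = rng.choice(MOODS, size=300)
model = train_mood_classifier(X_demo, y_demo)
print(model.predict(X_demo[:3]))

In the actual system the feature vectors would be the joint angles computed from Kinect V2 skeleton data rather than random values.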
The system also includes a function that supports a conversation between the user and the Pepper robot, combining an integrated speech recognition system with a response generation tool built on recurrent neural networks (RNN); taking the generated response into account, this function then determines a suitable physical behavior with which the humanoid robot Pepper, developed by Aldebaran and SoftBank Robotics, can complement the verbal message.
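The abstract does not detail the RNN architecture, so the following sketch only makes the response-then-behavior flow concrete: a small word-level GRU generator with greedy decoding, plus a hypothetical lookup from the recognized mood to one of the taught gestures; vocabulary, layer sizes, gesture names and the PyTorch implementation are all assumptions.

# Minimal sketch of a recurrent reply generator and a mood-to-gesture lookup (illustrative only).
import torch
import torch.nn as nn

VOCAB = ["<pad>", "<end>", "hello", "how", "are", "you", "i", "am", "fine", "thanks"]
word_to_id = {w: i for i, w in enumerate(VOCAB)}

class ReplyGenerator(nn.Module):
    """Word-level recurrent model: context words in, next-word distribution out."""
    def __init__(self, vocab_size, emb_dim=16, hidden=32):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.GRU(emb_dim, hidden, batch_first=True)
        self.out = nn.Linear(hidden, vocab_size)

    def forward(self, tokens):                  # tokens: (batch, seq_len)
        h, _ = self.rnn(self.emb(tokens))
        return self.out(h[:, -1])               # logits for the next word

def generate_reply(model, seed_words, max_words=4):
    """Greedy decoding: feed the running context and append the most likely next word."""
    context = [word_to_id.get(w, 0) for w in seed_words]
    reply = []
    with torch.no_grad():
        for _ in range(max_words):
            next_id = int(model(torch.tensor([context])).argmax())
            if VOCAB[next_id] == "<end>":
                break
            reply.append(VOCAB[next_id])
            context.append(next_id)
    return " ".join(reply)

# Hypothetical mapping from the recognized mood to an imitation-taught gesture name.
MOOD_TO_GESTURE = {"Happy": "open_arms_wave", "Sad": "slow_head_drop", "Normal": "idle_sway"}

model = ReplyGenerator(len(VOCAB))
print(generate_reply(model, ["hello", "how", "are", "you"]))  # untrained, so the reply is arbitrary

In the complete pipeline the seed words would come from the speech recognition system, the generator would be trained on a conversation corpus, and the selected gesture would accompany the spoken reply.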
A series of movement sequences was developed using a technique in which the humanoid robot is programmed by imitation; these sequences express specific behaviors associated with the different moods the system can recognize and are consistent with the meaning of the conversation.
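A recorded imitation sequence could be replayed on Pepper through NAOqi's ALMotion service roughly as sketched below; the robot address, joint names and key-frame values are illustrative placeholders rather than the sequences developed in this work, and a reachable robot (or simulator) is assumed.

# Sketch of replaying a captured movement sequence on Pepper via NAOqi (illustrative values).
from naoqi import ALProxy

ROBOT_IP, ROBOT_PORT = "192.168.1.10", 9559    # hypothetical robot address

# One key-frame list per joint: target angles in radians and the times (s) to reach them.
names      = ["RShoulderPitch", "RShoulderRoll", "RElbowRoll"]
angle_keys = [[1.4, 0.3, 1.4], [-0.2, -0.6, -0.2], [0.5, 1.2, 0.5]]
time_keys  = [[1.0, 2.0, 3.0]] * len(names)

motion = ALProxy("ALMotion", ROBOT_IP, ROBOT_PORT)
motion.setStiffnesses("RArm", 1.0)                              # enable the arm motors
motion.angleInterpolation(names, angle_keys, time_keys, True)   # True = absolute angles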
Additionally, a Generative Adversarial Network (GAN) model was used; by exploiting its generative capability, it was possible to create, from the sequences captured with the imitation tool, original and distinct movement sequences that convey a sense of naturalness in the physical behavior reproduced by the robot.
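A minimal sketch of such an adversarial setup, assuming each movement sequence is represented as a fixed-length array of joint angles scaled to [-1, 1] (sequence length, joint count, network sizes and the PyTorch implementation are illustrative, not taken from this work):

# Minimal GAN sketch for generating joint-angle sequences (illustrative dimensions).
import torch
import torch.nn as nn

SEQ_LEN, N_JOINTS, NOISE_DIM = 20, 14, 32      # hypothetical sizes

class Generator(nn.Module):
    """Maps a noise vector to a short sequence of joint angles."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(NOISE_DIM, 128), nn.ReLU(),
            nn.Linear(128, SEQ_LEN * N_JOINTS), nn.Tanh())      # angles scaled to [-1, 1]
    def forward(self, z):
        return self.net(z).view(-1, SEQ_LEN, N_JOINTS)

class Discriminator(nn.Module):
    """Scores a sequence as recorded (real) or generated (fake)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(SEQ_LEN * N_JOINTS, 128), nn.LeakyReLU(0.2),
            nn.Linear(128, 1))                                   # real/fake logit
    def forward(self, seq):
        return self.net(seq.view(seq.size(0), -1))

def train_step(gen, disc, real_seqs, opt_g, opt_d):
    """One adversarial update: discriminator on real and fake, then generator to fool it."""
    bce = nn.BCEWithLogitsLoss()
    batch = real_seqs.size(0)
    fake = gen(torch.randn(batch, NOISE_DIM))
    # Discriminator: label recorded sequences 1, generated sequences 0.
    d_loss = bce(disc(real_seqs), torch.ones(batch, 1)) + \
             bce(disc(fake.detach()), torch.zeros(batch, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()
    # Generator: push the discriminator toward labelling its output as real.
    g_loss = bce(disc(fake), torch.ones(batch, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
    return d_loss.item(), g_loss.item()

# Toy usage with random arrays standing in for the imitation-recorded sequences.
gen, disc = Generator(), Discriminator()
opt_g = torch.optim.Adam(gen.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(disc.parameters(), lr=2e-4)
real = torch.rand(8, SEQ_LEN, N_JOINTS) * 2 - 1
print(train_step(gen, disc, real, opt_g, opt_d))

After training on the recorded sequences, sampling different noise vectors yields new, varied sequences, which is what gives the reproduced behavior its sense of naturalness.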