Scientists from the Universidad Carlos III de Madrid (UC3M) have published a paper presenting research into interactions between robots and deaf people, in which they were able to programme a humanoid, called TEO, to communicate in sign language.
For a robot to be able to "learn" sign language, it is necessary to combine different areas of engineering such as artificial intelligence, neural networks and artificial vision, as well as underactuated robotic hands.
"One of the main new developments of this research is that we united two major areas of Robotics: complex systems (such as robotic hands) and social interaction and communication," explains Juan Víctores, one of the researchers from the Robotics Lab in the Department of Systems Engineering and Automation of the UC3M.
The scientists first specified, in a simulation, the position of each phalanx needed to depict particular signs from Spanish Sign Language. They then reproduced these positions with the robotic hand, aiming to make the movements similar to those a human hand could make. "The objective is for them to be similar and, above all, natural. Various types of neural networks were tested to model this adaptation, and this allowed us to choose the one that could perform the gestures in a way that is comprehensible to people who communicate with sign language," the researchers explain.
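The pipeline described above, specifying target phalanx positions for a sign and then adapting them to what the robotic hand can physically do, can be sketched roughly as follows. Everything here is an illustrative assumption (the poses, joint limits, and function names are not from the UC3M work), and a simple per-joint clamp stands in for the neural-network adaptation the researchers actually trained:

```python
# Sketch: map a sign to per-phalanx target angles (as a human hand would
# form them), then adapt each angle to a robot hand whose joints have
# narrower ranges. A per-joint clamp replaces the learned mapping.

# Hypothetical target poses in degrees of flexion per phalanx; these are
# NOT real Spanish Sign Language data.
SIGN_POSES = {
    "A": {"thumb": [10, 5], "index": [90, 80], "middle": [90, 80]},
    "B": {"thumb": [60, 40], "index": [0, 0], "middle": [0, 0]},
}

# Assumed (min, max) joint limits of the robotic hand, per phalanx.
ROBOT_LIMITS = {
    "thumb": [(0, 50), (0, 45)],
    "index": [(0, 85), (0, 75)],
    "middle": [(0, 85), (0, 75)],
}

def adapt_pose(sign: str) -> dict:
    """Clamp each human-hand target angle into the robot's joint range."""
    adapted = {}
    for finger, angles in SIGN_POSES[sign].items():
        limits = ROBOT_LIMITS[finger]
        adapted[finger] = [
            min(max(angle, lo), hi) for angle, (lo, hi) in zip(angles, limits)
        ]
    return adapted

print(adapt_pose("A"))  # index/middle flexion clamped to the robot's limits
```

In the real system a neural network, rather than a clamp, learns this human-to-robot mapping, which is what lets the resulting gestures stay natural rather than merely feasible.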
Finally, the scientists verified that the system worked by interacting with potential end-users. "The deaf people who have been in contact with the robot have reported 80 percent satisfaction, so the response has been very positive," says another of the researchers from the Robotics Lab, Jennifer J. Gago. The experiments were carried out with TEO (Task Environment Operator), a humanoid robot for home use developed in the Robotics Lab of the UC3M.
To date, TEO has mastered the fingerspelling alphabet of sign language, as well as a basic vocabulary related to household tasks, the researcher explains. One of the challenges the scientists now face in continuing to develop the system is "the rendering of more complex gestures, using complete sentences", says another member of the Robotics Lab team, Bartek Lukawski. The robot could then be used by the approximately 13,300 people in Spain who use sign language to communicate.
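Fingerspelling a word then reduces to executing one hand configuration per letter in sequence. A minimal sketch of that idea, where the letter-to-pose table and the execution loop are hypothetical and not TEO's actual interface:

```python
import time

# Hypothetical letter-to-pose table; a real fingerspelling alphabet
# defines one hand configuration per letter.
LETTER_POSES = {
    "h": "pose_h", "o": "pose_o", "l": "pose_l", "a": "pose_a",
}

def fingerspell(word: str, hold_s: float = 0.0) -> list:
    """Return the sequence of poses for a word, pausing between letters."""
    executed = []
    for letter in word.lower():
        pose = LETTER_POSES[letter]
        executed.append(pose)  # a real robot would command the hand here
        time.sleep(hold_s)     # hold each letter briefly so it is legible
    return executed

print(fingerspell("hola"))  # ['pose_h', 'pose_o', 'pose_l', 'pose_a']
```

The "complete sentences" goal the researchers mention is harder than this loop suggests, since full signs involve continuous motion and transitions between configurations, not just discrete static poses.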
The broader objective is for robots of this type to become household assistants that are able to help with ironing (TEO also does this), folding clothes, serving food, and interacting with users in domestic environments. In addition, "these robotic hands could be implemented in other humanoids and they could be used in other environments and circumstances," says Jennifer J. Gago. "The really important thing is that all of these technologies, all of these developments that we contribute to, are geared towards including all members of society. It is a way of envisaging technology as an aid to inclusion, both of minorities and of majorities, within a democracy," Juan Víctores emphasises.
Source and top image: Carlos III University of Madrid (UC3M)