Telepresence Robot for the Disabled Takes Directions from Brain Signals

Written by prodigitalweb

An Experimental Telepresence Robot for People with Severe Motor Disabilities

Italian and Swiss researchers have developed an experimental telepresence robot that gives people with severe motor disabilities a new way of interacting with the world: a robot controlled through brain signals. The user tells the robot where to go via a brainwave-detecting headset, and the robot takes care of the details, such as avoiding obstacles and determining the best route ahead.

The robot is essentially a laptop mounted on a rolling base. The user views the robot's surroundings through the laptop's webcam and can communicate with people through Skype. To move the robot, the user wears a skullcap studded with electroencephalogram (EEG) sensors and imagines movements with their feet or hands. Each imagined movement corresponds to a different command, such as forward, backward, left, or right, and the software translates the resulting signals into actions for the robot.

The telepresence robot's control software, however, decides for itself how best to change trajectory and accelerate to reach where it has been told to go. Nine infrared sensors alert it to obstacles, which it can maneuver around while still following the user's instructions.
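One common way to implement this kind of shared control is to blend the user's intended heading with a repulsive adjustment computed from range sensors. The sketch below assumes nine forward-facing infrared sensors; the sensor layout, gains, and distance threshold are illustrative assumptions, not the actual controller described in the article.

```python
import math

# Assumed layout: nine infrared sensors spread across the robot's front,
# from -80 degrees (far left) to +80 degrees (far right).
SENSOR_ANGLES = [math.radians(a) for a in range(-80, 81, 20)]


def shared_control_heading(user_heading: float, ranges: list[float],
                           safe_dist: float = 0.5, gain: float = 0.3) -> float:
    """Return an adjusted heading in radians.

    Each sensor reporting an obstacle closer than safe_dist (metres)
    pushes the heading away from that sensor's side, proportionally to
    how close the obstacle is. With no nearby obstacles, the user's
    heading passes through unchanged.
    """
    adjustment = 0.0
    for angle, dist in zip(SENSOR_ANGLES, ranges):
        if dist < safe_dist:
            push = gain * (safe_dist - dist) / safe_dist
            # Obstacle on the right (positive angle) steers the robot
            # left, and vice versa; a dead-centre obstacle steers left.
            adjustment -= push * (1.0 if angle > 0 else -1.0)
    return user_heading + adjustment
```

The design point is that the user supplies only the high-level intent (the heading), while the controller continuously folds in sensor data, which is why fewer commands are needed than under fully manual control.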


A Design That Is Easy to Use

This design makes the telepresence robot easy to use and offers a practical way of giving disabled individuals more independence, according to Robert Leeb, a research scientist at the Swiss Federal Institute of Technology in Lausanne who worked on the project. He suggested imagining an end-user lying in his bed at home, connected to all the equipment essential to support his life; with such a telepresence robot, he can again participate in his family life. The researchers tested the robot by having people with and without motor disabilities navigate it through rooms containing obstacles.

Both groups were able to steer through the course in similar times. With shared control, users needed fewer commands than when they controlled the robot entirely themselves, and they completed the course more quickly. The participants without motor disabilities were also timed while manually navigating the robot without giving it any autonomy; their times were only slightly shorter than when they shared control and navigated through the brain-computer interface.

Brain-Computer Interfaces for Steering Wheelchairs and Moving Prosthetic Limbs

The researchers published their results in a recent issue of the journal Proceedings of the IEEE. Researchers are now exploring brain-computer interfaces for purposes ranging from steering wheelchairs to moving prosthetic limbs. Versions implanted in the brain can give individuals impressive control of robotic limbs, but they are challenging to install and maintain, and are not widely used.

Simpler systems are available for home use, such as the Muse headband, designed for meditation, and the Emotiv headset, designed for gaming. Non-invasive brain-computer interfaces, which listen for EEG signals by merely touching the scalp at various points, are less powerful but more practical.

Leeb says his shared-control system for robots will not reach the market for some time. Most effort to commercialize EEG-based brain-computer interfaces is aimed at low-cost, single-purpose devices, not the manufacture of high-quality sensors that could be used in a variety of applications.
