“They’re not quite psychic yet, but machines are getting better at reading your mind. Researchers have invented a new, noninvasive method for recording patterns of brain activity and using them to steer a robot. Scientists hope the technology will give “locked in” patients—those too disabled to communicate with the outside world—the ability to interact with others and even give the illusion of being physically present, or “telepresent,” with friends and family.
“Previous brain-machine interface systems have made it possible for people to control robots, cursors, or prosthetics with conscious thought, but they often take a lot of effort and concentration, says José del R. Millán, a biomedical engineer at the École Polytechnique Fédérale de Lausanne (EPFL) in Switzerland, who develops brain-machine interface systems that don’t need to be implanted into the brain.
“Millán’s goal is to make control as easy as driving a car on a highway. A partially autonomous robot would allow a user to stop concentrating on tasks that he or she would normally do subconsciously, such as following a person or avoiding running into walls. But if the robot encounters an unexpected event and needs to make a split-second decision, the user’s thoughts can override the robot’s artificial intelligence.” Reported by Sara Reardon in Science Now. And Janet Fang explains in Smartplanet.com (emphasis as in the article):
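The "shared control" idea described above can be sketched in a few lines: the robot follows its own autonomous behavior (e.g., obstacle avoidance), but a decoded user intention, when one arrives, takes priority. This is only an illustrative sketch with invented names (`Command`, `shared_control_step`), not the EPFL team's actual software.

```python
# Illustrative sketch of shared control: the robot acts on its own plan
# unless a user command is present, which overrides it.
# All names and behaviors here are hypothetical.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Command:
    direction: str  # "left", "right", or "forward"

def autonomous_step(obstacle_ahead: bool) -> Command:
    """Default behavior: drive forward, steer away from obstacles."""
    return Command("right") if obstacle_ahead else Command("forward")

def shared_control_step(user_cmd: Optional[Command],
                        obstacle_ahead: bool) -> Command:
    """A decoded user intention, when present, overrides the robot's plan."""
    if user_cmd is not None:
        return user_cmd
    return autonomous_step(obstacle_ahead)
```

With no user input the robot avoids obstacles on its own; a user command, however decoded, wins whenever it is issued.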
- “They modified a commercially available bot called Robotino (pictured), which is essentially a platform on 3 wheels that can avoid obstacles on its own using infrared sensors.
- “On top of the robot, they placed a laptop running Skype over a wireless internet connection. This allows the human controller to see where the robot is going. And since the laptop screen also shows a video of the controller, other people can interact with you as though you’re there.
- “The user wears a cap of tiny EEG electrodes that measure brain activity. The system translates the EEG signals into navigation instructions and transmits them in real-time to the robot.
- “Then the team recruited 2 patients whose lower bodies were paralyzed and who had been bedbound for 6 or 7 years.
“After 6 weeks of hour-long training sessions, the patients (in the hospital) were able to control the robots (in the lab) from 100 km away. They drove the robot to various targets – furniture, people, objects – around the lab for 12 minutes.
“In the future, Millán imagines modifying the shared control brain-machine interface so the user can control a prosthetic limb or a wheelchair. They may eventually add an arm to the current robot so it can grab objects.”
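The step where "the system translates the EEG signals into navigation instructions" amounts to classifying features of the recorded brain activity into a small set of commands. The sketch below is a deliberately crude stand-in: the feature (band power per channel), the two-channel setup, and the thresholds are all invented for illustration; real motor-imagery BCIs train a classifier per user on many electrodes.

```python
# Hypothetical sketch of EEG-to-command translation: compare the activity
# of two channels and map the imbalance to a steering command.
# Channels, features, and thresholds are invented for illustration.

import numpy as np

def band_power(signal: np.ndarray) -> float:
    """Crude power estimate of one EEG channel (mean squared amplitude)."""
    return float(np.mean(signal ** 2))

def classify(left_ch: np.ndarray, right_ch: np.ndarray,
             margin: float = 0.2) -> str:
    """Map the clearly stronger channel to a turn; near-equal power
    means keep going forward."""
    lp, rp = band_power(left_ch), band_power(right_ch)
    if lp > rp * (1 + margin):
        return "left"
    if rp > lp * (1 + margin):
        return "right"
    return "forward"
```

In a running system, this classification would be applied to each new window of EEG samples and the resulting command streamed to the robot, which blends it with its own obstacle avoidance as described above.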
This is a remarkable development and will open new worlds for the bedridden.
What other uses can you imagine for such a brain-computer interface?
Will we soon be able to stay on the couch and send our car shopping?