The Brain-Computer Interface in 2015: Are We Telepathic Yet?

A robotic appendage becomes easier to manipulate when it provides sensory feedback to the user.

Stepping back to 2011, VIDEO (1 min. 27 sec.) – In this video, a monkey uses a brain-computer interface to move a virtual arm on a computer screen, selecting objects according to sensory cues sent back to its brain. This clip was shown by Dr. Nicolelis in his TED talk, presented on the next page of this article.

Known as the “Move and Feel” experiment, this October 2011 study at Duke University was the first demonstration of a subject moving a virtual arm after receiving sensory feedback. Because the link now ran in both directions, the setup was described not as a brain-machine interface (BMI/BCI) but as a brain-machine-brain interface (BMBI), as the press release below explains:

DURHAM, NC – In a first-ever demonstration of a two-way interaction between a primate brain and a virtual body, two monkeys trained at the Duke University Center for Neuroengineering learned to employ brain activity alone to move an avatar hand and identify the texture of virtual objects.

“Someday in the near future, quadriplegic patients will take advantage of this technology not only to move their arms and hands and to walk again, but also to sense the texture of objects placed in their hands, or experience the nuances of the terrain on which they stroll with the help of a wearable robotic exoskeleton,” said study leader Miguel Nicolelis, MD, PhD, professor of neurobiology at Duke University Medical Center and co-director of the Duke Center for Neuroengineering.

Without moving any part of their real bodies, the monkeys used their electrical brain activity to direct the virtual hands of an avatar to the surface of virtual objects and, upon contact, were able to differentiate their textures.

Although the virtual objects employed in this study were visually identical, they were designed to have different artificial textures that could only be detected if the animals explored them with virtual hands controlled directly by their brain’s electrical activity.

The texture of the virtual objects was expressed as a pattern of minute electrical signals transmitted to the monkeys’ brains. Each of three distinct electrical patterns corresponded to one of three different object textures.
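To make the loop concrete, here is a minimal Python sketch of the idea just described: a decoder turns recorded spike counts into avatar motion, and contact with a virtual object triggers one of three fixed stimulation patterns. Everything in it (the linear decoder, the pulse-rate patterns, the neuron count) is a hypothetical illustration of the concept, not the actual code or parameters used in the Duke experiments.

```python
import numpy as np

# Toy model of a brain-machine-brain interface (BMBI) loop.
# Brain -> machine: decode motor-cortex spike counts into avatar motion.
# Machine -> brain: on contact, return a texture-specific stimulation pattern.
# All values below are hypothetical placeholders.

rng = np.random.default_rng(0)

N_NEURONS = 100                                    # recorded motor-cortex units (hypothetical)
DECODER = rng.normal(size=(2, N_NEURONS)) * 0.01   # toy linear decoder weights

# Three hypothetical stimulation patterns, one per virtual texture,
# reduced here to a single pulse-train frequency (Hz). Real intracortical
# microstimulation patterns are richer, but this captures the
# one-pattern-per-texture mapping described above.
TEXTURE_PATTERNS = {"coarse": 200.0, "medium": 100.0, "fine": 50.0}

def decode_velocity(spike_counts):
    """Map a vector of spike counts to a 2-D avatar hand velocity."""
    return DECODER @ spike_counts

def stimulation_for(texture):
    """Return the stimulation pulse rate encoding a given texture."""
    return TEXTURE_PATTERNS[texture]

# One simulated time step of the closed loop.
spikes = rng.poisson(lam=5.0, size=N_NEURONS)   # stand-in for recorded activity
velocity = decode_velocity(spikes)              # brain -> machine
touched = "coarse"                              # suppose the avatar touches an object
feedback_hz = stimulation_for(touched)          # machine -> brain
print(f"velocity={velocity}, feedback pulse rate={feedback_hz} Hz")
```

The essential point the sketch illustrates is the bidirectionality: the same system that reads motor intent out of the brain also writes sensory information back in, which is what distinguishes a BMBI from an ordinary BMI.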

Because no part of the animal’s real body was involved in the operation of this brain-machine-brain interface, these experiments suggest that in the future patients severely paralyzed due to a spinal cord lesion may take advantage of this technology, not only to regain mobility, but also to have their sense of touch restored, said Nicolelis, who was senior author of the study published in the journal Nature on October 5, 2011.

“This is the first demonstration of a brain-machine-brain interface (BMBI) that establishes a direct, bidirectional link between a brain and a virtual body,” Nicolelis said.

To read the full article, see the Duke University press release.

To hear a summary of the foregoing research, beginning with the sound of a “brainstorm,” visit the next page and listen to part or all of Professor Miguel Nicolelis’ TED (Technology, Entertainment, and Design) talk, filmed at TEDMED 2012 and published February 18, 2013.

Renee Leech
Renee Leech is an Education Copywriter on a mission to fight shallow reader experiences. She writes articles, B2C long-form sales letters, and B2B copy with tutorial value.
