The Duke University team moved a robotic arm in 2000
Scientists at Duke University, led by Miguel Nicolelis, working on brain-computer interface/brain-machine interface (BCI/BMI) technology, saw their research come to fruition when their experimental subject, an owl monkey, moved a robotic arm some 600 miles away at MIT over the internet. A press release by Elizabeth A. Thomson of the MIT News Office, dated December 6, 2000, announced details of the breakthrough in neuroscience:
Monkeys in North Carolina have remotely operated a robotic arm 600 miles away in MIT’s Touch Lab — using their brain signals.
The feat is based on a neural-recording system reported in the November 16 issue of Nature. In that system, tiny electrodes implanted in the animals’ brains detected their brain signals as they controlled a robot arm to reach for a piece of food.
According to the scientists from Duke University Medical Center, MIT and the State University of New York (SUNY) Health Science Center, the new system could form the basis for a brain-machine interface that would allow paralyzed patients to control the movement of prosthetic limbs.
The Internet experiment “was a historic moment, the start of something totally new,” Mandayam Srinivasan, director of MIT’s Touch Lab, said in a November 15 story in the Wall Street Journal.
The work also supports new thinking about how the brain encodes information, by spreading it across large populations of neurons and by rapidly adapting to new circumstances.
In the Nature paper, the scientists described how they tested their system on two owl monkeys, implanting arrays of as many as 96 electrodes, each less than the diameter of a human hair, into the monkeys’ brains.
The technique they used allows large numbers of single neurons to be recorded separately, then combines their information using a computer coding algorithm. The scientists implanted the electrodes in multiple regions of the brain’s cortex, including the motor cortex from which movement is controlled. They then recorded the output of these electrodes as the animals learned reaching tasks, including reaching for small pieces of food.
Analyzing brain signals
To determine whether it was possible to predict the trajectory of monkeys’ hands from the signals, the scientists fed the mass of neural signal data generated during many repetitions of these tasks into a computer, which analyzed the brain signals. In this analysis, the scientists used simple mathematical methods and artificial neural networks to predict hand trajectories in real time as the monkeys learned to make different types of hand movements.
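The press release does not spell out the decoding math, but the "simple mathematical methods" it mentions amount to fitting a linear model from binned firing rates to hand position. Below is a minimal Python sketch of that idea; the synthetic spike counts, array shapes, bin counts, and noise model are illustrative assumptions, not the team's actual data or pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: spike counts from 96 electrodes in short time
# bins, noisily tuned to a 3-D hand trajectory (purely illustrative numbers).
n_bins, n_neurons = 2000, 96
tuning = rng.normal(size=(n_neurons, 3))
rates = rng.poisson(lam=5.0, size=(n_bins, n_neurons)).astype(float)
hand_xyz = rates @ tuning + rng.normal(scale=20.0, size=(n_bins, 3))

# Split into a "learning" phase and a held-out "real-time" phase.
n_train = 1500
X_train = np.hstack([rates[:n_train], np.ones((n_train, 1))])  # bias column
X_test = np.hstack([rates[n_train:], np.ones((n_bins - n_train, 1))])

# Linear decoder: least-squares fit of hand position on firing rates.
W, *_ = np.linalg.lstsq(X_train, hand_xyz[:n_train], rcond=None)

# Predict the trajectory bin by bin on the held-out data.
predicted = X_test @ W
r = np.corrcoef(predicted[:, 0], hand_xyz[n_train:, 0])[0, 1]
print(f"correlation of predicted vs. actual x-position: {r:.2f}")
```

Once fitted, the same weights can be applied to each new bin of spike counts as it arrives, which is what makes a decoder of this kind usable in real time.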
“We found two amazing things,” said Miguel Nicolelis, associate professor of neurobiology at Duke. “One is that the brain signals denoting hand trajectory show up simultaneously in all the cortical areas we measured. This finding has important implications for the theory of brain coding, which holds that information about trajectory is really distributed over large territories in each of these areas even though the information is slightly different in each area.
“The second remarkable finding is that the functional unit in such processing does not seem to be a single neuron,” Professor Nicolelis said. “Even the best single-neuron predictor in our samples still could not perform as well as an analysis of a population of neurons. So this provides further support to the idea that the brain very likely relies on huge populations of neurons distributed across many areas in a dynamic way to encode behavior.”
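That population-versus-single-neuron point can be illustrated with a toy simulation (entirely synthetic; nothing here is the team's data or method): when each simulated neuron carries only a weak, noisy copy of the trajectory signal, even the best single cell decodes poorly, while pooling the whole population decodes well.

```python
import numpy as np

rng = np.random.default_rng(1)

# Each of 50 simulated neurons carries a weak, noisy copy of a 1-D
# "hand position" signal (a slow sinusoid); the coupling is illustrative.
n_bins, n_neurons = 1000, 50
signal = np.sin(np.linspace(0.0, 20.0 * np.pi, n_bins))
rates = 0.3 * signal[:, None] + rng.normal(size=(n_bins, n_neurons))

def r_squared(X, y):
    """R^2 of a least-squares fit of y on X (with a bias column)."""
    X = np.hstack([X, np.ones((len(X), 1))])
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return 1.0 - np.var(y - X @ w) / np.var(y)

best_single = max(r_squared(rates[:, [i]], signal) for i in range(n_neurons))
population = r_squared(rates, signal)

print(f"best single-neuron R^2: {best_single:.2f}")  # weak fit
print(f"full-population R^2:    {population:.2f}")   # far stronger fit
```

Averaging across many cells cancels the independent noise, so the population fit recovers a signal that no individual neuron carries cleanly.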
At the forefront of brain-computer interface research are Duke University, Durham, North Carolina; the University of Pittsburgh, Pennsylvania; Brown University in Providence, Rhode Island; and Washington University in St. Louis, Missouri.
To see how brain-computer interface research progressed at these universities in the U.S., continue through this article.