One promising approach involves circumventing neuronal damage by establishing connections between healthy areas of the brain and external devices, called brain-machine interfaces (BMIs), which are programmed to transform neural impulses into signals that can control a robotic device. While experiments have shown that animals using these artificial actuators can learn to adjust their brain activity to move robot arms, many issues remain unresolved, including what type of brain signal would provide the most appropriate input for programming these machines.
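The transformation described above is often conceived as a decoder: a mapping, fitted from recorded data, from the firing rates of a population of neurons to the kinematics of a limb or robot arm. The sketch below is a hypothetical, minimal linear readout for illustration only; it is not the authors' actual model, and the weights and firing rates are made-up numbers.

```python
# Minimal sketch of a linear BMI decoder (illustrative, not the authors'
# model): firing rates from recorded neurons are combined through fixed
# weights to predict a 2-D hand velocity, which would drive the robot arm.

def decode_velocity(firing_rates, weights, bias):
    """Map a vector of neuron firing rates (spikes/s) to a hand
    velocity (vx, vy) via a fixed linear readout."""
    vx = bias[0] + sum(w * r for w, r in zip(weights[0], firing_rates))
    vy = bias[1] + sum(w * r for w, r in zip(weights[1], firing_rates))
    return vx, vy

# Hypothetical weights for three recorded neurons (made-up values);
# in practice these would be fitted to simultaneous recordings of
# neural activity and arm movement.
weights = [[0.02, -0.01, 0.03],   # each neuron's contribution to vx
           [-0.01, 0.04, 0.01]]   # each neuron's contribution to vy
bias = [0.0, 0.0]

rates = [40.0, 25.0, 10.0]        # current firing rates, spikes/s
print(decode_velocity(rates, weights, bias))
```

Real decoders use hundreds of neurons and richer models, but the principle is the same: the readout is learned, and, as the paper's results suggest, the brain in turn adapts its activity to the readout.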
As they report in this paper, Miguel Nicolelis and colleagues have helped clarify some of the fundamental issues surrounding the programming and use of BMIs. Presenting results from a series of long-term studies in monkeys, they demonstrate that the same set of brain cells can control two distinct movements: the reaching and grasping of a robotic arm. This finding has important practical implications for patients with spinal-cord injuries: if the same cells can perform different functions, then surgeons have far more flexibility in how and where they can introduce electrodes or other functional enhancements into the brain. The researchers also show how monkeys learn to manipulate a robotic arm using a BMI, and they suggest how to compensate for delays and other limitations inherent in robotic devices to improve performance.
By charting the relationship between neural signals and motor movements, Nicolelis et al. demonstrate how BMIs can work with healthy neural areas to reconfigure the brain's motor command circuitry and help restore intentional movement. These findings, they say, suggest that such artificial models of arm dynamics could one day be used to retrain the brain of a patient with paralysis, offering not only better control of prosthetic devices but also the sense that these devices are truly an extension of the patient's own body.
Duke University Medical Center
Durham, NC 27710
United States of America