Eugene Alford just couldn’t get his legs to move, but it wasn’t for want of trying. It was 2012, and he was in a laboratory at the University of Houston in Texas, participating in a study that was designed to see whether people with paralysis could control a robotic exoskeleton with their thoughts. Alford, a plastic surgeon who’d lost the use of his legs when a tree fell on him at his farm, kept trying to walk by willing the electrical impulses in his brain up and into the electrodes on his head, from where they could be translated into movement.
Jose Contreras-Vidal, the neural engineer who was conducting the experiment, urged Alford not to think too specifically about the act of walking. Instead, he should just concentrate on where he wanted to go. “Finally, he put a cup of coffee on the desk, and I started thinking, ‘I want that cup of coffee’,” Alford, now 58, says. So Alford strode over to the desk and took it. By thinking about walking as an able-bodied person would — that is, by barely thinking about it at all — he was able to send the correct signals to the brain–machine interface that controlled the robot.
The movement that the technology bestowed was a big deal for Alford. “Just being able to stand up and look somebody face to face, in the eye, for a person who’s been in a wheelchair for five years, that’s what brings tears to your eye,” he says. Six years on, Contreras-Vidal’s lab at the Building Reliable Advances and Innovation in Neurotechnology Center, a collaboration between the University of Houston and Arizona State University, continues to train paralysed people to walk, albeit only under the supervision of researchers. His group is one of a number that are developing practical neural prostheses — devices capable of reading signals from the brain and then using them to restore movement in people who have been paralysed through injury or illness.
The World Health Organization estimates that 250,000–500,000 people worldwide suffer a spinal-cord injury every year, about 13% of whom will lose the ability to control all four limbs. Another 45% will retain some movement or feeling in all limbs, but are still severely limited in what they can do physically. And almost 2 million people affected by stroke in the United States are living with some degree of paralysis, as are another 1.5 million people with multiple sclerosis or cerebral palsy.
Against this backdrop of paralysis, researchers are working to engineer technological solutions. As well as enabling the control of robotic aids, some groups are learning to detect the brain’s intention to initiate movement and to then feed that instruction into the muscles. A few groups are also trying to send signals back into the brain to restore sensation in people who can no longer feel their limbs. But before these technologies can touch lives beyond the lab, researchers must improve the understanding of how best to integrate humans with machines.
A closer listen
Contreras-Vidal records electrical activity in the brains of his study volunteers through a skull cap that is studded with 64 electrodes. The impulses gathered are then translated into signals to control the robotic exoskeletons.
Listening to populations of neurons using electrodes mounted outside the skull is not a simple task. As with hearing music from across the street, some subtleties are lost. And movement of the scalp muscles, eye blinking and motion in the wires that connect the electrodes to the decoder all add noise that makes the neural signals trickier to interpret. The system provides enough information to unravel the user’s intentions and to translate them into movement, but other researchers are using implanted electrodes to read signals from individual neurons, in the hope of collecting a more nuanced signal and providing finer-grained motor control.
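The cleaning-up step described above can be illustrated in miniature. This is not the lab's actual pipeline — the function names, the smoothing width and the blink threshold are all assumptions for illustration — but it captures the two basic moves a scalp-EEG decoder makes before interpreting anything: low-pass filtering each channel, and flagging samples swamped by a blink-sized artifact.

```python
# Illustrative sketch only: a real EEG pipeline uses proper bandpass filters
# and artifact-removal methods; the threshold and window width here are
# invented for the example.

def moving_average(samples, width=5):
    """Smooth one EEG channel with a simple sliding-window average."""
    half = width // 2
    out = []
    for i in range(len(samples)):
        window = samples[max(0, i - half): i + half + 1]
        out.append(sum(window) / len(window))
    return out

def artifact_mask(samples, threshold_uv=100.0):
    """Mark samples as usable (True) unless their amplitude looks like a blink."""
    return [abs(s) <= threshold_uv for s in samples]

# A toy channel in microvolts: the 120 uV spike stands in for an eye blink.
channel = [1.0, 2.0, 120.0, 2.0, 1.0]
smoothed = moving_average(channel)
mask = artifact_mask(channel)
```

Even this toy version shows why blink-sized events matter: a single spike leaks into every smoothed sample around it, so decoders typically discard those windows rather than try to correct them.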
In 2016, Bill Kochevar of Cleveland, Ohio, became the first person with paralysis to use electrodes implanted in the motor cortex of his brain to stimulate his arm to move. Implanted electrodes had already enabled people with a spinal-cord injury to move robotic arms, but, thanks to a combination of the brain implants and a set of stimulatory electrodes in his right arm, he was able to move his arm to feed himself, raise a cup to his mouth and scratch his nose. Although these regained abilities were limited, they still opened up his world. “I know there are a lot more possibilities out there for doing things I didn’t think were possible,” he said in October last year. “It’s always been exciting to me that I’m first in the world to do this.”
The feat brings doctors closer to restoring lost function in people with paralysis. “It’s a big deal scientifically, but it’s also a big deal clinically,” says Bolu Ajiboye, a biomedical engineer at Case Western Reserve University in Cleveland, who worked with Kochevar. “He couldn’t do anything on his own before.”
Kochevar was in his mid-forties when he crashed his bicycle into the back of a postal truck, injuring the top of his spine, and causing him to completely lose the ability to move his limbs. He died last December at the age of 56 from complications of that injury, having participated in the implant research for about three years. The researchers he assisted — part of BrainGate, a collaboration between Case Western, Brown University in Providence, Rhode Island, Massachusetts General Hospital in Boston, Stanford University in California, and the US Department of Veterans Affairs — are continuing to recruit volunteers.
To enable Kochevar to move his arm, the researchers implanted two square arrays of 100 electrodes, both 4 millimetres long, in the area of his motor cortex that was responsible for hand movement. Another 36 electrodes implanted under the skin of his right arm provided tiny jolts to the muscles in his hand, elbow and shoulder through a technique known as functional electrical stimulation. The brain arrays were wired to bolt-like connectors that protruded from the top of his head. Cables carried signals from the connectors to a computer, which applied machine learning to the data to ascertain the movements that Kochevar wanted to make. The electrodes in his arm then received a pattern of stimuli that caused his muscles to move. Because Kochevar’s muscles had weakened through disuse, the researchers also provided him with a motorized arm support, which received the same movement commands as his own muscles.
Before Kochevar could start to use the system, the researchers had to train the computer to interpret his intentions. Initially, they asked him to watch a moving arm in virtual reality while imagining that he was making the same movements. Later, they tried a lower-tech approach that Ajiboye says worked just as well; they moved Kochevar’s arm using the computer and had him imagine he was doing it.
The imagined movements created distinct patterns of activity in the 200 or so neurons in Kochevar’s brain that were being monitored individually by the two implants. The researchers recorded the order and rate of neuron firing for each movement, enabling them to stimulate the correct muscles in Kochevar’s arm when a particular pattern of movement was detected in subsequent experiments.
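The decoding idea in the two paragraphs above — each imagined movement produces a characteristic pattern of firing rates across the recorded neurons, and new activity is matched against those stored patterns — can be sketched with a minimal nearest-pattern classifier. The real BrainGate decoders are far more sophisticated; the function names, the two-neuron vectors and the training data here are all invented for illustration.

```python
# Hedged sketch: store the average firing-rate pattern per imagined movement,
# then classify fresh activity by whichever stored pattern it sits closest to.
# Real decoders model timing and use many more neurons; this is the bare idea.

def train_decoder(labelled_rates):
    """labelled_rates: {movement: list of firing-rate vectors} -> centroids."""
    centroids = {}
    for movement, vectors in labelled_rates.items():
        n = len(vectors)
        centroids[movement] = [sum(v[i] for v in vectors) / n
                               for i in range(len(vectors[0]))]
    return centroids

def decode(centroids, rates):
    """Return the movement whose stored pattern is nearest the observed rates."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda m: sq_dist(centroids[m], rates))

# Toy training data: firing rates (spikes per second) for two recorded neurons
# during two imagined movements.
training = {
    "reach": [[20.0, 5.0], [22.0, 4.0]],
    "grasp": [[5.0, 18.0], [6.0, 21.0]],
}
model = train_decoder(training)
```

Calling `decode(model, [21.0, 5.0])` picks out `"reach"`, because that activity pattern lies near the stored reach centroid — the same matching step that, in the real system, triggered the corresponding stimulation pattern in Kochevar’s arm.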
To begin with, Kochevar had to concentrate on the individual movements that make up a gesture. “When I first started doing it, I thought a lot about moving in, out, up, down,” he said. But as time went on, he was able to go beyond purely mechanical directives. With practice, moving his arm came more naturally; like Alford, he learnt to think about what he wanted to do rather than how to do it. “I just think about going from here to there, and it pretty much goes there,” he said.
He only ever used the system under the supervision of the researchers, either in the lab or at his home, owing to the complexity of the set-up and US Food and Drug Administration (FDA) safety regulations. Ajiboye and his colleagues needed to calibrate the system at the start of each day of testing, to ensure that the electrodes were aligned correctly in the brain. Although the day-to-day drift is usually small, in time the implants could end up recording a different group of neurons, which would mean having to interpret a fresh set of activity patterns. Calibration takes around five minutes, but Ajiboye hopes that his team will eventually reduce it to just a few seconds.
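The calibration problem described above can be sketched too. If the electrodes drift, the same intended movement yields shifted firing-rate patterns, so each session the stored patterns must be nudged toward a fresh, short recording. The blending rule and its weight below are assumptions for illustration, not the team's actual method.

```python
# Illustrative sketch of daily recalibration: blend yesterday's stored
# firing-rate pattern toward today's quick calibration recording. A weight
# of 0.5 is an arbitrary choice for the example.

def recalibrate(stored_pattern, fresh_pattern, weight=0.5):
    """Move the stored pattern part-way toward today's recording."""
    return [(1 - weight) * old + weight * new
            for old, new in zip(stored_pattern, fresh_pattern)]

# Yesterday's stored pattern for one movement, and today's drifted version.
updated = recalibrate([20.0, 5.0], [24.0, 7.0])
```

When the drift is small, a brief blending step like this keeps the decoder aligned; if the arrays shift far enough to pick up a different group of neurons, as the article notes, the activity patterns must effectively be learnt afresh.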