Reconnecting the Brain After Paralysis Using Machine Learning

Brain-Computer Interface Restores Sense of Touch with Haptic Signals


Less than a year after his spinal cord injury, Ian Burkhart was ready for whatever was next. A 2010 diving accident had damaged his spinal cord, and Burkhart lost sensation and movement below his bicep. But he had not given up on regaining some of those capabilities.

He was working with doctors and physical therapists at The Ohio State University Wexner Medical Center to manage the effects of his injury. A few months after beginning treatment, he started asking his healthcare team about his options.

“I wanted to know what was possible today, and what I could hope for in the future,” he says.

Ian Burkhart is seated in front of a computer monitor. He is wearing the electrode sleeve on his right arm. A cable connects the implanted microelectrode array to the computer equipment.

Study partner, Ian Burkhart. Image credit: Battelle

Three years after his injury, he learned of a way to be involved in shaping that future. An experimental trial of a brain-computer interface (BCI) was being planned by a team at Ohio State and Battelle, just a few blocks away from where Burkhart was receiving care.

“It worked out perfectly,” says Burkhart. “It was the right place at the right time.”

“I wanted to know what was possible today, and what I could hope for in the future.”

Connecting the Brain Directly to the Muscles

The nervous system is the communication pathway between the brain and the rest of the body. It carries signals to and from the brain, allowing communication with the muscles and skin: when you think “pick up the pencil,” nerves carry the command that flexes the correct muscles into a pinching motion, and they carry the sense of touch back to the brain. When the nervous system is damaged, these signals can be blocked from reaching their target, leading to paralysis and an inability to feel.

BCIs use computational systems to record and analyze brain signals, sending those signals in the form of commands to a device that performs an action. Scientists have been working for decades to develop BCIs for people living with paralysis, but these systems are still mostly confined to the lab. The goal for the Ohio State and Battelle team, and other groups, is to create a portable device that can restore some function and independence for these individuals.

Battelle’s NeuroLife system was designed to help Burkhart regain conscious control of his fingers, hand, and wrist.

Some BCIs analyze the brain’s electrical signals with an electroencephalogram (EEG), a system of electrodes fixed to the scalp that records brain activity. EEG-based BCIs enable participants to perform simple tasks, such as moving a cursor on a screen with their thoughts, as well as more advanced ones, like controlling robotic prosthetics.

Other BCIs require computer chips surgically implanted directly into the brain. These chips have an array of electrodes that record signals from a small but specific group of neurons. While far more invasive, these systems provide precision, as the electrodes record directly from the desired cells. Battelle and Ohio State researchers opted for an implanted chip to reconnect a specific part of the brain’s motor cortex back to Burkhart’s paralyzed hand muscles. Battelle’s system, called NeuroLife, was designed to help Burkhart regain conscious control of his fingers, hand, and wrist.

This image is from the 2016 NeuroLife study. The upper left is an image of the brain that shows the location of the implanted microelectrode array. The upper middle shows Burkhart’s forearm wrapped in a sleeve of electrodes. The upper right shows Burkhart seated facing a computer monitor; he is wearing the electrode sleeve, and on the monitor is an avatar of his right hand. The bottom of the picture shows a series of grayscale histograms and multicolor raster readings that correspond to neuron activity when wrist movements were attempted.

In the earlier study, the BCI enabled Burkhart to move his hand and arm. (a) The location of the implanted microelectrode array and how it overlaps with neuron activity during arm movements. (b) The neuromuscular electrical stimulation sleeve. (c) The neural bypass system in use. (d) Rasters and histograms of neurons firing that correspond to attempted wrist movements. Image credit: Battelle

A Phased Approach

The earliest version of the Battelle system wasn’t a BCI at all. Before undergoing surgery to implant the chip, Burkhart tested the device that would help his muscles move, which at that point consisted of electrodes stuck to his arm. When the electrodes applied a small current to his forearm, they signaled specific muscles to activate and flex.

Rather than using his thoughts to control the device, a computer activated the electrodes, stimulating muscle movement. “This first phase demonstrated enough promise that Ian agreed to participate in the research project,” says Patrick Ganzer, principal research scientist at Battelle. 

“There was never a doubt in Ian’s mind that this was going to work,” says Marcie Bockbrader, assistant professor and physical medicine and rehabilitation physician at The Ohio State University, and principal investigator for this trial. “In his mind, it was always a question of how to use it.”

A few years after his injury, with the help of this BCI, Burkhart could move his hand to swipe a card, stir coffee, and even play a version of Guitar Hero.

The computer screen in the upper right corner shows an avatar of Burkhart’s hand in a clenched position, which is how his hand (middle of screen) is gripping the mug. He is pouring cubes from the mug into a glass.

With the NeuroLife system, Burkhart pours items into a cup. Image credit: Battelle

In 2014, Burkhart underwent brain surgery at the OSU Wexner Medical Center to implant the chip. About the size of a pea, this chip, made by Blackrock Microsystems, Inc., sits in his motor cortex, an area of the brain responsible for generating voluntary movements. “It has small wires that act like microphones; each one listens to a handful of brain cells,” says Ganzer. With the chip in place, the research team was ready to work on the second phase with a more complex interface.

Using MATLAB®, the team developed machine learning algorithms that could decode Burkhart’s thoughts as the chip recorded his brain activity. To translate Burkhart’s thoughts of moving his hand into motion, the NeuroLife system needed to bypass the spinal cord. This neural bypass technology transmits brain signals to a computer, where algorithms decode them and transform them into commands. Those commands drive a sleeve of electrodes wrapped around Burkhart’s forearm, stimulating his muscles so that they move in accordance with his thoughts.
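In broad strokes, that decoding step is a supervised classification problem: features extracted from the implant’s recordings are mapped to the movement Burkhart is attempting. The MATLAB sketch below illustrates the idea only; it is not Battelle’s pipeline. It assumes a feature matrix X (one row per time window, one column per electrode feature) and a label vector Y of attempted movements already exist in the workspace, and sendToSleeve is a hypothetical stand-in for the sleeve’s stimulation interface.

% Illustrative decoding sketch (not the NeuroLife pipeline).
% X: nWindows-by-nFeatures matrix of neural features (e.g., band power per
% electrode in short time windows). Y: label of the attempted movement in
% each window, e.g., categorical({'rest';'grip';'pinch'; ...}).

decoder = fitcecoc(X, Y);                   % multiclass SVM decoder

% Estimate how well the decoder generalizes with 5-fold cross-validation
cvDecoder = crossval(decoder, 'KFold', 5);
fprintf('Cross-validated accuracy: %.1f%%\n', 100*(1 - kfoldLoss(cvDecoder)));

% At run time, classify each incoming feature window and forward the result
newWindow = X(end, :);                      % stand-in for a live feature vector
intent = predict(decoder, newWindow);       % e.g., 'grip'
% sendToSleeve(intent);                     % hypothetical stimulation interface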

A few years after his injury, with the help of this BCI, Burkhart could move his hand to swipe a card, stir coffee, and even play a version of Guitar Hero®. “All of that innovation was ultimately driven by Ian,” says Bockbrader. “He wasn’t satisfied, and was always thinking about what came next.”

Sensing and Feeling

This amazing breakthrough still represented only one-way communication. Burkhart was able to send signals to his arm. He could play the guitar, but he couldn’t feel it in his hand.

Then Ganzer had an idea. He went to the rest of the research team and suggested experimenting to see if the chip in Burkhart’s brain was picking up any residual touch sensation. It was possible that signals were still traveling to the brain through the few remaining intact fibers in his spine, even if Burkhart couldn’t perceive touch.

Because it sits in the motor cortex, the chip was expected to pick up only Burkhart’s motor intentions. But the brain is adaptable, and the boundaries between brain areas can shift. According to Ganzer, it was possible that some touch-processing neurons were providing a faint signal to the motor cortex. His colleagues were skeptical at first. David Friedenberg, a senior data scientist at Battelle, remembers expressing doubts but thinking it might be worth a shot.

To test this idea, they blindfolded Burkhart and touched different parts of his arm and hand. By analyzing recordings from the chip in his brain, they could tell that a patch of the motor cortex was picking up some small amount of tactile information, even when Burkhart reported he didn’t feel anything.
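One simplified way to frame that analysis: train a binary classifier to distinguish “touch” windows from “rest” windows in the neural recordings and check whether it performs above chance. The MATLAB sketch below is illustrative rather than the study’s actual method; touchTrials and restTrials are assumed trials-by-features matrices of neural activity recorded while Burkhart’s arm was and was not being touched.

% Illustrative check for residual touch information (not the study's analysis).
% touchTrials, restTrials: trials-by-features matrices of neural activity
% recorded while the blindfolded participant's arm was / was not being touched.

X = [touchTrials; restTrials];
Y = [ones(size(touchTrials, 1), 1); zeros(size(restTrials, 1), 1)];

detector = fitcsvm(X, Y, 'Standardize', true);   % binary SVM: touch vs. rest
cvDetector = crossval(detector, 'KFold', 5);
acc = 1 - kfoldLoss(cvDetector);
fprintf('Touch vs. rest decoding accuracy: %.1f%%\n', 100*acc);

% Accuracy reliably above chance (50%) would indicate that the motor cortex
% array is picking up tactile information even without conscious perception.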

The block at the top shows an outline of a brain with the location of the neural implant magnified. Block 2 shows a drawing of a human torso and head with a connection in blue between the hand and the head. Block 3 shows an illustration of a forearm with the electrical stimulation sleeve and a separate band around the upper arm that provides haptic feedback. The illustration in block 4 indicates that there are separate signals for touch and movement that are multiplexed to the implant.

For the recent NeuroLife study: (1) Neural activity is detected with an implanted microelectrode array. (2) The study determined that residual touch signaling reaches the brain. (3) The electrical stimulation sleeve also has a band on the upper arm to provide closed-loop feedback. (4) Touch and movement signals are separated. (5) Grip strength is autonomously controlled by the touch signals. Image credit: Battelle

Burkhart is not alone in this. A handful of studies have analyzed brain activation and touch perception in people with spinal cord injuries like Burkhart’s. These studies suggest that as many as half of such injuries are “sensory discomplete.” Like Burkhart, others with sensory discomplete injuries are not able to feel touch, but residual nerve fibers are still sending sensory signals to the brain. The next step for the Battelle and OSU group was figuring out how to use this information to Burkhart’s advantage.

The original NeuroLife system enabled Burkhart to perform motions with his hand, but he still largely couldn’t feel when he was touching an object. Because of this lack of sensation, Burkhart couldn’t reliably tell when he was gripping an object unless he was looking at his hand. The Battelle and OSU team wanted to provide that missing sensory feedback.

“The possibility of sensory information would make the system work a lot better and allow me to be much more independent when I’m using the system,” says Burkhart. “But I did not think it was possible at all without having another surgery to implant a different device in the sensory area of my brain.”

Battelle worked with Burkhart to design a device that would work with the NeuroLife system to provide sensory feedback. They would not be able to restore a sense of touch to Burkhart’s hand, but they could use the residual touch-sensing brain signals and the BCI to provide artificial feedback to Burkhart when he grasped an object.

To do that, the team needed to find a way to route those brain signals to a device located somewhere on Burkhart’s body where he still had sensation. “That was challenging, figuring out the best way to feed that information back to him in a way that he could understand and make sense of,” says Friedenberg.

“It provides a lot of hope for the future that a device like this is going to change the lives of individuals like myself. It’s something that I look forward to all the time.”

Burkhart wanted the device to be something he could sense and control as naturally as possible. He and Battelle tested out a few different ideas, like putting the feedback device on his back and having it vibrate when he touched an object. Burkhart still had sensation there, but the device needed to be somewhere that felt more natural to him. He needed to be able to connect the artificial sensory feedback with the fact that he’s touching something. “That way, my brain wouldn’t have to do a whole lot of re-learning,” says Burkhart.

In the end, they settled on a vibrotactile band that wraps around Burkhart’s bicep. That area both had intact sensation and felt the most natural to Burkhart. To get the sensory information from the brain to the device, the researchers built and trained machine learning algorithms in MATLAB to detect and decode the brain’s subperceptual touch signals. When Burkhart touched an object while using the BCI, those algorithms teased apart the motor and sensory signals, transmitting touch feedback to the vibrotactile band and motor signals to the electrode sleeve. The band vibrated in real time, signaling to Burkhart that he was touching an object.
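Conceptually, the control loop runs two decoders on each window of neural data: one drives the stimulation sleeve, the other drives the haptic band. The MATLAB sketch below shows that structure under assumed interfaces only; motorDecoder and touchDetector are classifiers of the kind sketched earlier, and getFeatureWindow, sendToSleeve, setBandVibration, and bciIsRunning are hypothetical stand-ins for the device and session interfaces.

% Conceptual per-window control loop (hypothetical interfaces, not NeuroLife code).
while bciIsRunning
    features = getFeatureWindow();             % latest window of neural features

    % Motor pathway: decode the attempted movement and stimulate the forearm sleeve
    intent = predict(motorDecoder, features);
    sendToSleeve(intent);

    % Sensory pathway: detect subperceptual touch and drive the vibrotactile band
    isTouching = predict(touchDetector, features);
    setBandVibration(isTouching);              % vibrate only while gripping an object
end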

The first few tries with the device were a bit awkward. While Burkhart was getting sensory feedback, it was on his bicep, not his hand, which was actually touching the object. “It was a big challenge, kind of remapping that portion of the brain,” says Burkhart. “That took a little practice before I was able to link those two parts together.”

With the vibrotactile device, Burkhart can tell whether he’s gripping an object more than 90% of the time, even when blindfolded. Without the artificial sensory feedback, he is essentially guessing, or, depending on the object’s size, completely unable to recognize that he is touching it. “It was pretty amazing the first time I was able to do a test blindfolded,” says Burkhart.

Not only can Burkhart pick up objects without looking, but he also has more confidence when using the system thanks to the artificial sensory feedback. “It’s huge because I can know that I’m not going to drop something when I’m using the system,” he says. “Things like that really make it a more natural system to use.”

This latest iteration of the BCI is still too bulky and complex for Burkhart to use at home, however. Right now, it can only be used in the lab, setup is complicated, and the system often needs adjustments and recalibrations. Despite these challenges, the team is confident that one day Burkhart will be able to use it at home.

“It seemed very far-fetched seven years ago when we first started,” says Friedenberg. But now, he says, there’s little standing in the way of that goal.

“It provides a lot of hope for the future that a device like this is going to change the lives of individuals like me,” says Burkhart. “It’s something that I look forward to all the time.”

