
Brown scientist wins $1.5M innovator award for new approach to decoding brain signals

The motor cortex is the portion of the brain that controls movement, but it’s also active when a person watches someone else move or plans movements, such as tracking a ball before catching it. This complicates the process of decoding brain activity from that region in order to control a robotic limb, said Carlos Vargas-Irwin, an assistant professor of neuroscience (research) at Brown University.

Now, with a five-year, $1.5 million New Innovator Award from the National Institutes of Health’s High-Risk, High-Reward Research program, he will use cameras and artificial vision with the aim of decoding intentional movements and other activity in the motor cortex and applying that knowledge to develop more effective brain-computer interfaces.

“The goal of this project is to better understand the interaction between sensory and movement-related information so that we can interpret signals related to intended actions more accurately,” said Vargas-Irwin, who is part of the BrainGate collaboration led by researchers from Brown, Case Western Reserve University, Stanford University, Massachusetts General Hospital and the Providence V.A. Medical Center.

BrainGate is a brain-computer interface being used in a clinical trial to help paralyzed people regain independence by directly controlling assistive devices with their thoughts, even restoring some mobility by artificially stimulating muscles.

In Vargas-Irwin’s project, he will combine an external camera and artificial vision with the neural activity information from a neural decoder.

“This would allow the decoder to interpret neural activity within the context of the visual environment in a way that more closely resembles the natural operating state of the brain,” said Vargas-Irwin, who is affiliated with Brown’s Carney Institute for Brain Science.

To do this, Vargas-Irwin and his team will first train rhesus macaques to perform different movements while they capture three different sets of information. The team will record the macaque’s neural activity, use a Hollywood-like motion capture system to track its precise movements, and record the visual environment with a camera and artificial vision.

Next, the team will take the motion capture information and the artificial vision information and try to predict the neural activity. “This may sound a little bit backwards, but the idea is to see what the neurons in the motor cortex are actually encoding and then be able to develop more accurate models,” Vargas-Irwin said.
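As a rough illustration of that encoding-model idea, the sketch below fits a regularized linear model that predicts synthetic neural firing rates from combined motion-capture and artificial-vision features. The array shapes, feature names and the choice of ridge regression are assumptions made for illustration only, not the BrainGate team’s actual pipeline.

```python
# Illustrative encoding-model sketch (assumed setup, not the actual BrainGate pipeline):
# predict neural firing rates from combined kinematic and visual features.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_bins, n_units = 5000, 96                        # time bins x recorded units (hypothetical sizes)
kinematics = rng.normal(size=(n_bins, 12))        # motion-capture features (e.g., joint angles, velocities)
vision = rng.normal(size=(n_bins, 32))            # artificial-vision features of the visual scene
firing_rates = rng.poisson(5.0, (n_bins, n_units)).astype(float)  # placeholder neural data

# Design matrix combines movement-related and environment-related information.
X = np.hstack([kinematics, vision])
X_tr, X_te, y_tr, y_te = train_test_split(X, firing_rates, test_size=0.2, random_state=0)

# One regularized linear model per unit (Ridge handles multi-output targets directly).
encoder = Ridge(alpha=1.0).fit(X_tr, y_tr)
print("held-out R^2 (near zero here, since the data are random):", encoder.score(X_te, y_te))
```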

They will know they understand which features of the neural activity reflect the environment and which reflect the intended movement when they can take the rhesus macaque’s neural activity and the artificial vision information and correctly predict its movement.
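The decoding direction can be sketched the same way: the inputs are now neural activity plus the camera-derived visual features, and the target is the movement itself. Again, the feature layout and model choice are illustrative assumptions rather than the project’s actual methods.

```python
# Illustrative decoding sketch (assumed setup): predict hand kinematics from
# neural activity interpreted in the context of the visual environment.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_bins, n_units = 5000, 96
neural = rng.poisson(5.0, (n_bins, n_units)).astype(float)  # placeholder spike counts
vision = rng.normal(size=(n_bins, 32))                      # external-camera features (hypothetical)
hand_velocity = rng.normal(size=(n_bins, 3))                # motion-capture target kinematics

# Decoder input: neural signals plus visual context, mirroring the proposed approach.
X = np.hstack([neural, vision])
scores = cross_val_score(Ridge(alpha=1.0), X, hand_velocity, cv=5, scoring="r2")
print("cross-validated R^2 (near zero here, since the data are random):", scores.mean())
```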

Using rhesus macaques is important so that the team can collect information on the precise movements that result from the neural activity, Vargas-Irwin said. This will allow them to evaluate their models with data that wouldn’t be available from human patients, who have very limited motion. When the technology is eventually applied to the BrainGate clinical trials, which Vargas-Irwin said he fully anticipates, patients will be shown predetermined movement sequences and asked to try to generate the same movement.

Each year, the NIH supports unusually innovative research from early-career researchers; Vargas-Irwin was one of 58 recognized this year. Other labs are pursuing a more conventional approach to improving accuracy in interpreting motor cortex activity: recording neural activity in many related areas involved in motion planning, such as the visual cortex and posterior parietal cortex. Vargas-Irwin said that for clinical patients, it is important to use the fewest sensors that still provide clear information. Potentially, the method could even replace additional implants with external sensors such as video cameras.

John Donoghue, a BrainGate leader and Brown professor of neuroscience who has worked with Vargas-Irwin for more than 15 years, said he was pleased to see the NIH recognize Vargas-Irwin with the prestigious award.

“His project brings a clever approach to achieve much better control for brain computer interfaces (BCIs) by adding visual information to neural signals in a novel way,” Donoghue said. “This research will advance both basic understanding of how sensory signals contribute to coding of movement by the brain and practical knowledge on how to create useful BCIs that restore movement to people with paralysis.”

Vargas-Irwin grew up in Cali, Colombia, and earned both his bachelor’s degree and Ph.D. from Brown. He said he has long been interested in understanding how networks of neurons join forces to process information.

“The motor system is interesting because there’s a complex computation taking place in the brain, and we can directly measure the output,” he said. “And of course we can use this understanding of motor circuits to develop technology to help people with motor disorders or injuries.”


Source: https://news.brown.edu
