A University of Minnesota research team has brought thought-controlled movement closer to reality through the use of electronics and AI.
Researchers at the University of Minnesota Twin Cities have created a system that enables amputees to operate a robotic arm using their brain impulses rather than their muscles. This new technology is more precise and less intrusive than previous methods.
The majority of commercial prosthetic limbs now on the market are controlled by the shoulders or chest using a wire and harness system. More sophisticated models employ sensors to detect small muscle movements in the patient’s natural limb above the prosthetic. Both options, however, can be cumbersome for amputees to learn and are often unintuitive to use.
The Department of Biomedical Engineering at the University of Minnesota, with the help of industry collaborators, has developed a tiny implantable device that connects to the peripheral nerve in a person’s arm. When combined with a robotic arm and an artificial intelligence computer, the technology can detect and decipher nerve signals, enabling upper-limb amputees to move the arm using only their thoughts.
The researchers’ most recent paper was published in the Journal of Neural Engineering, a peer-reviewed scientific journal for the interdisciplinary field of neural engineering.
The University of Minnesota-led team’s technology allows research participant Cameron Slavens to move a robotic arm using only his thoughts. Credit: Eve Daniels
“It’s a lot more intuitive than any commercial system out there,” said Jules Anh Tuan Nguyen, a postdoctoral researcher and University of Minnesota Twin Cities biomedical engineering Ph.D. graduate. “With other commercial prosthetic systems, when amputees want to move a finger, they don’t actually think about moving a finger. They’re trying to activate the muscles in their arm, since that’s what the system reads. Because of that, these systems require a lot of learning and practice. For our technology, because we interpret the nerve signal directly, it knows the patient’s intention. If they want to move a finger, all they have to do is think about moving that finger.”
Nguyen has been working on this research for about 10 years with the University of Minnesota’s Department of Biomedical Engineering Associate Professor Zhi Yang and was one of the key developers of the neural chip technology.
The project began in 2012 when Edward Keefer, an industry neuroscientist and CEO of Nerves, Incorporated, approached Yang about creating a nerve implant that could benefit amputees. The pair received funding from the U.S. government’s Defense Advanced Research Projects Agency (DARPA).
A big part of what makes the system work so well compared to similar technologies is the incorporation of artificial intelligence, which uses machine learning to help interpret the signals from the nerve.
“Artificial intelligence has the tremendous capability to help explain a lot of relationships,” Yang said. “This technology allows us to record human data, nerve data, accurately. With that kind of nerve data, the AI system can fill in the gaps and determine what’s going on. That’s a really big thing, to be able to combine this new chip technology with AI. It can help answer a lot of questions we couldn’t answer before.”
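To illustrate the idea of machine learning mapping recorded nerve signals to intended finger movements, the sketch below uses a toy nearest-centroid classifier on synthetic data. This is not the team's actual deep-learning decoder (described in their Journal of Neural Engineering paper); the channel patterns, finger labels, and noise model are all illustrative assumptions.

```python
import numpy as np

# Hypothetical sketch: a stand-in decoder that maps simulated nerve-signal
# feature vectors to intended finger movements. The real system uses a
# deep-learning model trained on recorded human nerve data.

rng = np.random.default_rng(0)
FINGERS = ["thumb", "index", "middle", "ring", "pinky"]
N_CHANNELS = 8  # assumed number of recording channels (illustrative)

def make_training_data(samples_per_class=50):
    """Simulate feature vectors: each intended finger movement produces a
    characteristic activation pattern across the recording channels."""
    X, y = [], []
    for label in range(len(FINGERS)):
        pattern = np.zeros(N_CHANNELS)
        pattern[label] = 1.0  # assumed channel most active for this finger
        for _ in range(samples_per_class):
            X.append(pattern + 0.1 * rng.standard_normal(N_CHANNELS))
            y.append(label)
    return np.array(X), np.array(y)

def fit_centroids(X, y):
    """'Train' the decoder: average the feature vectors per intended finger."""
    return np.array([X[y == k].mean(axis=0) for k in range(len(FINGERS))])

def decode(centroids, signal):
    """Map a new nerve-signal feature vector to the nearest finger centroid."""
    distances = np.linalg.norm(centroids - signal, axis=1)
    return FINGERS[int(np.argmin(distances))]

X, y = make_training_data()
centroids = fit_centroids(X, y)

# A noisy signal resembling an intended index-finger movement:
test_signal = np.zeros(N_CHANNELS)
test_signal[1] = 1.0
test_signal += 0.05 * rng.standard_normal(N_CHANNELS)
print(decode(centroids, test_signal))  # prints "index"
```

The key point the sketch conveys is the one Yang describes: once nerve data can be recorded accurately, a learned model can fill in the mapping from noisy signals to the patient's intention.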
The technology has benefits not only for amputees but also for other patients who suffer from neurological disorders and chronic pain. Yang sees a future where invasive brain surgeries will no longer be needed and brain signals can be accessed through the peripheral nerve instead.
Plus, the implantable chip has applications that go beyond medicine.
Right now, the system requires wires that come through the skin to connect to the exterior AI interface and robotic arm. But, if the chip could connect remotely to any computer, it would give humans the ability to control their personal devices—a car or phone, for example—with their minds.
“Some of these things are actually happening. A lot of research is moving from what’s in the so-called ‘fantasy’ category into the scientific category,” Yang said. “This technology was designed for amputees for sure, but if you talk about its true potential, this could be applicable to all of us.”
In addition to Nguyen, Yang, and Keefer, other collaborators on this project include Associate Professor Catherine Qi Zhao and researcher Ming Jiang from the University of Minnesota Department of Computer Science and Engineering; Professor Jonathan Cheng from the University of Texas Southwestern Medical Center; and all group members of Yang’s Neuroelectronics Lab in the University of Minnesota’s Department of Biomedical Engineering.
Reference: “A portable, self-contained neuroprosthetic hand with deep learning-based finger control” by Anh Tuan Nguyen, Markus W. Drealan, Diu Khue Luu, Ming Jiang, Jian Xu, Jonathan Cheng, Qi Zhao, Edward W. Keefer and Zhi Yang, 11 October 2021, Journal of Neural Engineering. DOI: 10.1088/1741-2552/ac2a8d