Two robotic arms – one holding a fork, the other a knife – flank a man seated at a table with a piece of cake on a plate. A computerized voice announces each action: "moving fork toward food" and "retracting knife." The man, partially paralyzed from the waist up, makes subtle movements with his right and left fists at certain prompts, such as "select cutting location," so that the machine carves out a bite-sized piece. Then: "moving food to mouth," and another subtle gesture aligns the fork with his mouth. In less than 90 seconds, a person with very limited upper-body mobility, who has been unable to use his fingers for about 30 years, feeds himself using his mind and a pair of intelligent robotic hands.
A team led by researchers from the Applied Physics Laboratory (APL) at Johns Hopkins University in Laurel, Maryland, and the Department of Physical Medicine and Rehabilitation (PMR) at the Johns Hopkins School of Medicine published the breakthrough this Tuesday in the journal Frontiers in Neurorobotics. The paper describes this latest feat, achieved using a brain-machine interface (BMI) and a pair of modular prosthetic limbs, the experiment's authors note in a press release.
Brain-computer interface systems provide a direct communication link between the brain and a computer, which decodes neural signals and "translates" them into external actions, from moving a cursor on a screen to enjoying a bite of cake. In this particular experiment, signals from the brain associated with muscle movement helped control the robotic prostheses.
A new approach
The study builds on more than 15 years of research in neuroscience, robotics and software, led by the APL in collaboration with the Department of PMR as part of the 'Revolutionizing Prosthetics' program, originally sponsored by the US Defense Advanced Research Projects Agency (DARPA). The new paper describes an innovative 'shared control' model that allows a human to maneuver a pair of robotic prostheses with minimal mental effort.
"This shared control approach aims to harness the intrinsic capabilities of the brain-machine interface and the robotic system, creating a 'best of both worlds' environment in which the user can customize the behavior of an intelligent prosthesis," said Francesco Tenore, project manager in APL's Exploratory Research and Development Department. Tenore, the paper's senior author, focuses on neural interface research and applied neuroscience.
"Although our results are preliminary, we are excited about giving users with limited abilities a real sense of control over increasingly intelligent assistive machines," he adds.
Helping people with disabilities
One of the most important robotics advances demonstrated in the paper is the combination of robot autonomy with limited human intervention: the machine does most of the work while allowing the user to customize the robot's behavior to their liking, according to David Handelman, the paper's first author and a senior roboticist in the Intelligent Systems Branch of APL's Exploratory Research and Development Department.
"For robots to perform human-like tasks for people with reduced functionality, they will need human-like dexterity. Human dexterity requires complex control of a complex robotic skeleton," he explained. "Our goal is to make it easy for the user to control the few things that matter most for specific tasks."
Pablo Celnik, the project's principal investigator in the PMR department, said: "The human-machine interaction demonstrated in this project shows the potential capabilities that can be developed to help people with disabilities."
Closing the loop
Although the DARPA program officially ended in August 2020, the APL and Johns Hopkins School of Medicine team continue to collaborate with colleagues at other institutions to demonstrate and explore the potential of the technology.
A next step for the system could be to integrate earlier research which found that providing sensory stimulation to amputees allowed them not only to sense their phantom limb, but also to use muscle movement signals from the brain to control a prosthesis. The theory is that adding this so-called 'sensory feedback' could help a person perform some tasks without the constant visual feedback required in the current experiment.
"This research is a great example of this philosophy where we knew we had all the tools to demonstrate this complex everyday two-handed activity that people without disabilities take for granted," Tenore said. "There are still many challenges ahead, such as improving task execution, both in terms of accuracy and time, and control without the constant need for visual feedback." Celnik added: "Future research will explore the limits of these interactions, even beyond the basic activities of daily life."