A dive into how AI is helping overcome the limitations of current prosthetics by offering improved signal decoding, better functionality, and more intuitive control.
A 2021 study estimated that, as of 2017, 57.7 million people were living with limb amputation from traumatic causes worldwide [1]. In the United States alone, an estimated 185,000 amputations occur annually [2].
Unfortunately, only about 5% of amputees have access to prosthetics because of their high cost and barriers to use. This article looks at how artificial intelligence is helping prosthetics become smarter and easier to use, while open-source projects are making them more accessible and personalized.
The evolution of prosthetics
The earliest known example of a prosthetic device is a wooden toe discovered on a 3,000-year-old Egyptian mummy [3]. Like most traditional prosthetics, it was made of wood.
Many older prosthetics were body-powered, controlled by a cable system: the wearer's own body movements pulled the cable to open or close the device or to extend an artificial arm or leg.
Contemporary prosthetics have become simpler, lighter, and more customizable thanks to advances in material science.
Myoelectric prostheses are the most common commercially available type of advanced prosthetic. They rely on electromyography (EMG) signals from the muscles of the residual limb for control.
Sensors placed on the skin over the residual limb muscles pick up the electrical activity produced when the user contracts those muscles; this activity is then decoded into commands that drive the movement of the prosthetic.
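To make the idea of turning muscle signals into commands more concrete, here is a minimal sketch of a typical EMG decoding pipeline: slice the raw signal into short windows, extract simple amplitude features from each window, and classify the window into a movement command. The window length, feature set, and scikit-learn classifier below are illustrative assumptions, not the pipeline of any particular commercial device.

```python
# Minimal sketch of an EMG "signal -> command" pipeline (illustrative only).
# Assumptions: 8 surface EMG channels sampled at 1 kHz and a set of labelled
# training windows; a real device adds careful filtering, calibration,
# and per-user validation.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

FS = 1000        # sampling rate in Hz (assumed)
WINDOW = 200     # 200 ms analysis window at 1 kHz
CHANNELS = 8     # number of EMG electrodes (assumed)

def extract_features(window: np.ndarray) -> np.ndarray:
    """Classic time-domain EMG features per channel: mean absolute value,
    waveform length, and zero-crossing count."""
    mav = np.mean(np.abs(window), axis=0)
    wl = np.sum(np.abs(np.diff(window, axis=0)), axis=0)
    zc = np.sum(np.diff(np.signbit(window).astype(int), axis=0) != 0, axis=0)
    return np.concatenate([mav, wl, zc])

def decode(classifier, emg_stream: np.ndarray) -> list:
    """Slide over the incoming signal and emit one command per window."""
    commands = []
    for start in range(0, len(emg_stream) - WINDOW + 1, WINDOW):
        features = extract_features(emg_stream[start:start + WINDOW])
        commands.append(classifier.predict([features])[0])
    return commands

# Training on labelled windows (e.g. "open", "close", "rest"):
# X = np.array([extract_features(w) for w in training_windows])
# clf = LinearDiscriminantAnalysis().fit(X, labels)
# commands = decode(clf, live_emg)
```

In commercial hands, the same idea is usually reduced to detecting a handful of contraction patterns, which is exactly why fine finger control is so hard with surface EMG alone.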
The problem with current prosthetics
This control method is great if the user just wants to be able to perform simple tasks like opening and closing the prosthetic arm. However, things become complicated for tasks such as moving individual fingers or rotating the wrist because the muscles that control these movements are no longer there.
Targeted muscle reinnervation (TMR), a process that involves reconnecting transected nerves to remaining muscles within the residual limb or chest, can help users control more degrees of freedom but it has its limitations.
It requires a lengthy EMG signal training period before the amputee can produce the intended movements reliably, a process that is physically and mentally exhausting. Even then, control feels unnatural, and users often struggle to integrate the prosthetic into their daily tasks.
Additionally, factors such as a poor socket fit and residual limb sweating produce noisy EMG signals, further complicating the decoding process.
Using AI to improve prosthetics usability
Follow along as we explore how artificial intelligence is being used to improve the functionality of existing myoelectric prosthetics, and how it is being applied to newer control methods such as the peripheral nerve interface and the brain-machine interface to make them more efficient.
1. AI-powered myoelectric prosthetic hands
AI is being used to give prosthetic arms the autonomy to perform actions such as finger movement. For instance, in 2017 a team of researchers from Newcastle University created a prosthetic hand that uses computer vision to identify the object it is about to grasp and adjusts its grip accordingly, without manual intervention from the user [4].
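The Newcastle paper describes a deep-learning grasp classifier trained on object images; as a rough illustration of the same idea, the sketch below uses an off-the-shelf pretrained image classifier and a hypothetical object-to-grip lookup table. The model choice and the GRASP_FOR_OBJECT mapping are assumptions for illustration, not the researchers' actual system.

```python
# Illustrative sketch of vision-based grasp selection (not the Newcastle system).
# Assumption: a pretrained ImageNet classifier stands in for the grasp network,
# and the object-to-grip mapping below is invented for illustration.
import torch
from torchvision import models
from PIL import Image

GRASP_FOR_OBJECT = {            # hypothetical mapping
    "coffee mug": "power grip",
    "ballpoint": "tripod pinch",
    "envelope": "lateral pinch",
}

weights = models.MobileNet_V3_Small_Weights.IMAGENET1K_V1
model = models.mobilenet_v3_small(weights=weights).eval()
preprocess = weights.transforms()
categories = weights.meta["categories"]

def choose_grasp(image_path: str) -> str:
    """Classify the object in view, then look up a suitable grip pattern."""
    img = preprocess(Image.open(image_path)).unsqueeze(0)
    with torch.no_grad():
        object_name = categories[model(img).argmax().item()]
    return GRASP_FOR_OBJECT.get(object_name, "power grip")   # safe default grip
```

The appeal of this approach is that the camera, not the user, does the fine-grained work: the wearer only signals the intent to grasp, and the hand pre-shapes itself for the object it sees.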
2. AI-powered prosthetic leg
For a person using a prosthetic leg, tasks such as jumping over obstacles, walking on uneven ground, or navigating a flight of stairs can be challenging.
Mechanical engineers from the University of Utah have taken steps to make these common tasks easier by developing a bionic leg that leverages AI and machine learning to adapt to different environments based on feedback from the user's residual limb [5].
The leg uses sensors on the residual hip muscles to determine the user's intended movement and then uses AI to bend the prosthetic knee and adjust the swing duration accordingly. The leg can also adapt to a user's specific stride pattern, leading to more effortless and natural movement.
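As a rough sketch of how such a controller might be structured, the loop below maps features of the hip EMG and the user's recent stride timing to a knee-flexion target and swing duration. The activity labels, angles, and timing rules are invented for illustration and are not the Utah team's actual control law.

```python
# Conceptual sketch of a gait-adaptive knee controller (not the Utah design).
# Assumption: `intent_model` is a trained classifier that maps hip-EMG
# features to the activity the user is attempting; all numbers are illustrative.
from dataclasses import dataclass

@dataclass
class GaitCommand:
    knee_angle_deg: float     # target knee flexion for the coming swing phase
    swing_duration_s: float   # how long that swing phase should last

def control_step(hip_emg_features, stride_history_s, intent_model) -> GaitCommand:
    """One iteration of the control loop, executed many times per second."""
    # Predict what the user is trying to do: level walking, stair ascent,
    # stepping over an obstacle, and so on.
    activity = intent_model.predict([hip_emg_features])[0]

    # Adapt swing timing to the user's own recent strides rather than a
    # fixed factory default.
    avg_stride = sum(stride_history_s) / max(len(stride_history_s), 1)

    if activity == "stair_ascent":
        return GaitCommand(knee_angle_deg=65.0, swing_duration_s=0.9 * avg_stride)
    if activity == "obstacle":
        return GaitCommand(knee_angle_deg=80.0, swing_duration_s=1.1 * avg_stride)
    return GaitCommand(knee_angle_deg=60.0, swing_duration_s=avg_stride)
```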
The researchers have now partnered with Ottobock UK to take the leg to market.
3. AI decoder for the peripheral nerve interface
The peripheral nerve interface is a more efficient alternative to EMG-controlled prosthetic limbs. Instead of relying on EMG sensors placed on the skin over the residual limb, this control method uses implanted electrodes that read signals directly from the nerves.
Traditionally, scientists have relied on mathematical modeling algorithms to decode these signals, but researchers from the University of Minnesota have developed a cutting-edge way to decode them using an AI system [6].
To train the AI system, the prosthetic user wears a data glove on their intact hand and performs repeated hand movements, mirroring them on the amputated side. As they do this, the data glove records the intended movement while the peripheral nerve interface records the corresponding nerve signals from the residual arm.
In this way, the AI system learns to correlate patterns of nerve signals with specific hand movements. A key advantage of an AI decoder is that it can recognize and actuate several movements at the same time, such as pinching, which involves concurrent movement of the thumb and forefinger.
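Conceptually, the training step amounts to supervised learning, with the glove providing the labels. The sketch below pairs windows of nerve-signal features with the glove's recorded joint angles and fits a multi-output regressor; the feature format and network size are assumptions, not the Minnesota team's published architecture.

```python
# Sketch of training a nerve-signal decoder against data-glove targets
# (illustrative only; not the University of Minnesota implementation).
# Assumptions: `nerve_windows` holds one feature vector per time window of
# nerve activity, and `glove_angles` holds the finger joint angles recorded
# by the data glove at the same moments.
import numpy as np
from sklearn.neural_network import MLPRegressor

def train_decoder(nerve_windows: np.ndarray, glove_angles: np.ndarray):
    """Learn a mapping from nerve activity to intended finger positions.
    Predicting all joint angles at once is what lets a decoder drive several
    movements simultaneously, e.g. thumb plus index finger for a pinch."""
    decoder = MLPRegressor(hidden_layer_sizes=(128, 64), max_iter=500)
    decoder.fit(nerve_windows, glove_angles)   # multi-output regression
    return decoder

# At run time, each new window of nerve activity becomes a hand posture:
# joint_angles = decoder.predict(latest_window.reshape(1, -1))[0]
```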
CSE biomedical engineering associate professor Zhi Yang shakes hands with research participant Cameron Slavens, who tested out the researchers' robotic arm system. With the help of industry collaborators, the researchers have developed a way to tap into a patient’s brain signals through a neural chip implanted in the arm, effectively reading the patient’s mind and opening the door for less invasive alternatives to brain surgeries. Credit: Neuroelectronics Lab, University of Minnesota
The Future of AI Prosthetics
Prosthetics with full sensory feedback are the future. Users will be able to feel the object they are holding: how hot is it? How soft is it?
Research has already begun to establish the best way to connect prosthetics to the somatosensory system, the part of the sensory system concerned with the conscious perception of touch, pressure, pain, and temperature.
Data is recorded from sensors placed on the prosthetic, encoded into signals, and then sent to the brain in the form of electrical stimulation. This is being done either through the peripheral nerve interface or by directly stimulating the somatosensory cortex via intracortical microstimulation (ICMS) [7].
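To ground this, here is a minimal sketch of what a sensory encoder might do: map a fingertip pressure reading onto stimulation pulse parameters. The linear pressure-to-frequency mapping and the specific numbers are illustrative assumptions; real encoders are calibrated per user and bounded by strict safety limits.

```python
# Minimal sketch of sensory-feedback encoding (illustrative assumptions only):
# convert a fingertip pressure reading into stimulation pulse parameters that
# a nerve interface or ICMS system could deliver.
def encode_pressure(pressure_n: float,
                    max_pressure_n: float = 10.0,
                    min_freq_hz: float = 10.0,
                    max_freq_hz: float = 100.0,
                    pulse_amplitude_ua: float = 60.0) -> dict:
    """Linearly map contact pressure to stimulation pulse frequency."""
    level = min(max(pressure_n / max_pressure_n, 0.0), 1.0)   # clamp to [0, 1]
    return {
        "frequency_hz": min_freq_hz + level * (max_freq_hz - min_freq_hz),
        "amplitude_ua": pulse_amplitude_ua,   # fixed amplitude (assumed)
        "stimulate": level > 0.0,             # no pulses when nothing is touched
    }

# Example: a light touch of about 2 newtons produces low-frequency stimulation.
print(encode_pressure(2.0))   # {'frequency_hz': 28.0, 'amplitude_ua': 60.0, 'stimulate': True}
```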
Among the key drivers for this research is the revelation that prosthetic sensory feedback can help improve the cognitive control of prosthetics.
This work has progressed beyond animal trials and is now entering the clinical stage.
Challenges that need to be overcome
Even as the industry evolves, challenges around cost and accessibility need to be overcome before it reaches its full potential.
The use of cheap 3D-printed materials has helped lower the price of prosthetics, but advanced devices remain largely inaccessible to the average person. The research, clinical trials, technology components, and expertise required to make prosthetic devices all contribute to their high cost.
For instance, processors are a critical part of a prosthetic device, but ever since the pandemic hit, there has been a shortage of chips, leading to a hike in their prices.
Prosthetics also need regular maintenance and, in some instances, replacement, which further adds to the cost and disadvantages users in remote areas. Engineers will need to work with health agencies to ensure a strong relationship between research and primary care.
Conclusion
From body-controlled prosthetics, through muscle-controlled prosthetics, and now to mind-controlled prosthetics, it has been, without a doubt, a phenomenal journey.
We are still early in the age of AI prosthetics, but finally, there is hope that artificial limbs can replicate the full functionality of a biological limb.
We have already seen how AI is being used to add intelligence to prosthetics and improve their control, but the real game changer will be when scientists develop a prosthetic that is inherently smart and has perfect synchronicity with the user’s mind.
And that’s what advancements in nerve interfaces are leading to.
Right now, many of the projects using AI are still prototypes yet to be commercialized. With the support of governments, manufacturers, and investors, scientists will be able to create the ultimate artificial limb: one that is affordable and a true extension of the individual.
About the sponsor: Mouser Electronics
Mouser Electronics is a worldwide leading authorized distributor of semiconductors and electronic components for over 1,100 manufacturer brands. They specialize in the rapid introduction of new products and technologies for design engineers and buyers. Their extensive product offering includes semiconductors, interconnects, passives, and electromechanical components.
References
- McDonald CL, McCoy SW, Weaver MR, Haagsma J, Kartin D. Global prevalence of traumatic non-fatal limb amputation. Prosthet Orthot Int [Internet] [2021 Apr 1]; 45(2): 105-114. Available from: https://pubmed.ncbi.nlm.nih.gov/33274665 DOI: 10.1177/0309364620972258
- Amputee Coalition. Limb loss Statistics [Internet]. Washington DC: Amputee Coalition. Available from: https://www.amputee-coalition.org/resources/limb-loss-statistics/
- Cairo toe earliest fake body bit [Internet]. London UK: BBC; July 2007. Available from: http://news.bbc.co.uk/2/hi/health/6918687.stm
- Ghazaei G, Alameer A, Degenaar P, Morgan G, Nazarpour K. Deep learning-based artificial vision for grasp classification in myoelectric hands. J. Neural Eng [Internet]. 2017 May; 14(3): 1-19. Available from: https://iopscience.iop.org/article/10.1088/1741-2552/aa6802/meta;jsessionid=440CAFE830BF3682EF472808E158E9FC.c3.iopscience.cld.iop.org DOI: 10.1088/1741-2552/aa6802
- Utah COE. Utah bionic leg [web streaming video]. Utah (USA): Utah COE; 2019 Oct 29 [cited 2022 Dec 23]. Available from: https://www.youtube.com/watch?v=GHTbK3zJ6OY&ab_channel=UtahCOE
- Nguyen AT, et al. Artificial Intelligence Enables Real-Time and Intuitive Control of Prostheses via Nerve Interface. arXiv:2203.08648 [Internet]. 2022 Mar 16 [cited 2023 Jan 04]. Available from: https://arxiv.org/abs/2203.08648
- Vidal GWV, Rynes ML, Kelliher Z, Goodwin SJ. Review of Brain-Machine Interfaces Used in Neural Prosthetics with New Perspective on Somatosensory Feedback through Method of Signal Breakdown. Scientifica [Internet]. 2016 May [cited 2023 Jan 04];2016:8956432. Available from: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4904116/ DOI: 10.1155/2016/8956432