Albuquerque Journal

TOUCHING moments in prosthetics

New bionic limbs that can ‘feel’

BY PAYAL DHAR, THE WASHINGTON POST

Phantom pain was all that Keven Walgamott had left of the limb he lost in an accident over a decade ago — until he tried on the LUKE Arm for the first time in 2017, and told researchers that he could “feel” again. The arm is a motorized and sensorized prosthetic that has been in development for over 15 years by a team at the University of Utah.

Researchers around the world have been developing prosthetics that closely mimic the part of the human body they would replace. This goes beyond the cosmetic and even the functional; these are bionic body parts that can touch and feel, and even learn new things.

“Touch isn’t a single sense,” said Gregory Clark, associate professor of biomedical engineering at the University of Utah and lead researcher of the study. “When you first touch objects with a natural hand, there’s an extra burst of neural impulses.”

The brain then “translates” these into characteristics, such as firmness, texture and temperature, all of which are crucial in deciding how to interact with the object, he said. In other words, by using the LUKE Arm (named after the “Star Wars” hero Luke Skywalker and manufactured by Deka), Walgamott, of West Valley City, Utah, was able to “feel” the fragility of a mechanical egg, just as he would have with a natural limb. He could pick it up and transfer it without damaging it.

As he performed everyday activities with the prosthetic — such as holding his wife’s hand, sending a text message and plucking grapes from a bunch — Walgamott told researchers that it felt like he had his arm back. Even his phantom pain was reduced.

“When the prosthetic hand starts to feel like the user’s real hand, the brain is tricked into thinking that it actually is real,” Clark said. “Hence, the phantom limb doesn’t have a place to live in the brain any more. So it goes away — and, with it, goes the phantom pain.”

Clark’s team was able to achieve these results by stimulating the sensory nerve fibers in a “biologically realistic” manner, he said. Using a computer algorithm as a go-between, they delivered digital pulses that more closely resemble what the brain normally receives from a native arm.

“Participants can feel over 100 different locations and types of sensation coming from their missing hand,” Clark said. “They can also feel the location and the contraction force of their muscles — even when muscles aren’t there. That’s because we can send electrical signals up the sensory fibers from the muscles, so the brain interprets them as real.”
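
For readers curious how such a go-between might look in software, here is a deliberately simplified sketch in Python. It is not the Utah team's algorithm; the function name, gains and rate ceiling are hypothetical, chosen only to illustrate the pairing of a sustained response to steady pressure with the extra burst of impulses Clark describes at first contact.

```python
# A minimal illustrative encoder (not the Utah team's actual algorithm): steady
# pressure maps to a baseline stimulation rate, and a sudden rise in pressure
# adds a transient "onset burst," mimicking how touch nerves fire.

def encode_stimulation(pressure_samples, base_gain=80.0,
                       burst_gain=400.0, max_rate=300.0):
    """Map pressure readings (0..1) to stimulation rates in pulses per second.

    The gains and the rate ceiling are hypothetical placeholders; only the
    structure matters: a sustained component plus an onset component.
    """
    rates = []
    previous = 0.0
    for p in pressure_samples:
        sustained = base_gain * p                    # steady grip force
        onset = burst_gain * max(0.0, p - previous)  # extra burst at first touch
        rates.append(min(max_rate, sustained + onset))
        previous = p
    return rates


if __name__ == "__main__":
    # Simulated contact: no touch, sudden contact, steady hold, release.
    samples = [0.0] * 3 + [0.5] * 6 + [0.0] * 3
    print([round(r, 1) for r in encode_stimulation(samples)])
```

In this toy version, the onset term is the software analogue of that initial burst; without it, the encoder would simply scale pressure and the moment of first contact would feel no different from a steady grip.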

The critical component of a prosthetic powered by thought would be the communication between the brain and a robotic body part — called the brain-computer interface (BCI).

The LUKE Arm uses a neural interface, but in other mind-controlled prosthetics, brain implants are used to send instructions to a robotic limb, much like how neurons transmit messages from the brain to a muscle. But this means precision brain surgery and all the attendant risks, not to mention the expense and recovery time.

This might be about to change. Bin He, professor and head of biomedical engineering at Carnegie Mellon University, and his colleagues have been working on a noninvasive, high-precision BCI, and reported a breakthrough in June: a “mind-controlled robotic arm … that demonstrates for the first time, to our knowledge, the capability for humans to continuously control a robotic device using noninvasive EEG signals.”

Noninvasive BCIs have shown promising results, but only in performing distinct actions — for example, pushing a button. When it comes to a sustained, continuous action, such as tracking a cursor on a computer screen, noninvasive BCIs have resulted in jerky, disjointed movements of the robotic prosthesis.

In the demonstration by He and his team, the subject used their mind to control a robotic arm that tracked a cursor on a computer screen, and the prosthetic finger was able to follow the cursor in a smooth, continuous path — just as a real finger would. What is more interesting is that, while the researchers used a computer-wired EEG cap on the subject in the lab, He said that it is not necessary.
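
To give a rough sense of what continuous control means in software, the following toy sketch maps EEG-like features to a cursor velocity on every time step and smooths the result. It is purely illustrative and is not He's decoding method; the random features stand in for processed EEG signals, and the weight matrix stands in for a mapping that would normally be learned during training.

```python
import numpy as np

# Toy continuous decoder (illustrative only): every time window, EEG-like
# features are mapped to a 2-D cursor velocity, and exponential smoothing
# keeps successive commands from producing jerky, disjointed movement.

rng = np.random.default_rng(0)

n_channels = 8                                          # hypothetical EEG channel count
weights = rng.normal(scale=0.1, size=(2, n_channels))   # stand-in for a trained decoder


def decode_velocity(features, previous_velocity, alpha=0.2):
    """Return a smoothed 2-D velocity command for one window of EEG features."""
    raw = weights @ features                              # instantaneous decoder output
    return alpha * raw + (1 - alpha) * previous_velocity  # exponential smoothing


velocity = np.zeros(2)
position = np.zeros(2)
for step in range(100):                                  # one decode per time window
    features = rng.normal(size=n_channels)               # stand-in for band-power features
    velocity = decode_velocity(features, velocity)
    position += velocity                                  # integrate velocity into position

print("final cursor position:", position)
```

The contrast with button-style BCIs is in the loop: instead of waiting for a discrete decision, the decoder issues a fresh velocity command every fraction of a second, which is roughly what smooth, continuous tracking requires.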

A smartphone app programmed with EEG recordings and wireless electrodes could streamline the process for everyday use, He said.

This could pave the way for thought-controlled robotic devices by decoding “intention signals” from the brain without needing invasive and risky brain surgery, He said.

Engineers at the lab of Joseph Francis, associate professor of biomedical engineering at the University of Houston, have been working on a BCI that can autonomously update using implicit feedback from the user.

“We are moving toward an autonomous system that will learn to perform new actions as per the user’s intentions with the least supervision from outside, and enable the user to control the prosthetic more independently,” said Taruna Yadav, a Ph.D. student who is part of Francis’ team.
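
The sketch below shows the general shape of a decoder that adjusts itself from implicit feedback. It is not the Houston group's algorithm: the reward variable stands in for an implicit error-related signal inferred from the user, and the simple reward-modulated update is only a placeholder used to illustrate learning without an explicit retraining session.

```python
import numpy as np

# Schematic sketch of a self-updating decoder: the prosthetic picks an action,
# an implicit feedback signal (here, a made-up reward) says whether the user
# seemed satisfied, and the decoder nudges its weights with no retraining session.

rng = np.random.default_rng(1)
n_features, n_actions = 16, 4
weights = np.zeros((n_actions, n_features))


def choose_action(features, exploration=0.1):
    """Pick the action with the highest decoder score, exploring occasionally."""
    if rng.random() < exploration:
        return int(rng.integers(n_actions))
    return int(np.argmax(weights @ features))


def update_from_feedback(features, action, reward, learning_rate=0.05):
    """Nudge only the chosen action's weights, scaled by the implicit reward."""
    weights[action] += learning_rate * reward * features


for trial in range(500):
    features = rng.normal(size=n_features)            # stand-in for neural features
    intended = int(np.argmax(features[:n_actions]))   # hidden "user intention" for this demo
    action = choose_action(features)
    reward = 1.0 if action == intended else -0.2      # implicit signal: did it feel right?
    update_from_feedback(features, action, reward)

# Quick check: after learning, how often does the decoder match the intention?
hits = 0
for _ in range(100):
    features = rng.normal(size=n_features)
    if choose_action(features, exploration=0.0) == int(np.argmax(features[:n_actions])):
        hits += 1
print(f"matches intention on {hits}/100 held-out trials")
```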

If BCIs and other neural interfaces provide a means to connect our brains with external devices that extend the function of our body, and if they can make paralyzed patients walk again or restore a body part that has been lost to disease or injury, would it be theoretically possible to develop bionic add-ons that could bestow superhuman abilities?

“In a sense, yes,” Clark said. “Indeed, we already do. Glasses restore normal vision to the nearsighted. But telescopes and microscopes allow us to see what would be otherwise unseeable. Canes assist in walking after injury, but fiberglass vaulting poles allow us to clear superhuman heights.

“In clinical applications, exoskeletons provide important assistive technologies after spinal cord injury or stroke,” Clark said. “Yet, they can be used to increase the power and endurance of intact individuals.”

But in other ways, bionic parts are no match for nature.

“For all its merits, the LUKE Arm contains only 19 sensors and generates six different types of movements. Similarly, the neural interface we use can capture or convey hundreds of different electrical signals from or to the brain,” Clark said. “That’s a lot, but both are impoverished compared with the thousands of motor and sensory channels of the human body, or its natural functional capabilities.”

DAN HIXSON/UNIVERSITY OF UTAH COLLEGE OF ENGINEERING: Doctoral student Jacob George, left, and professor Gregory Clark examine the LUKE Arm, a motorized and sensorized prosthetic that has been in development for more than 15 years.
