Susan Walsh / AP
The touch-sensitive robotic arm let a paralyzed man quickly perform tasks such as pouring water from one cup to another.
The robotic arm provides haptic feedback directly to the man’s brain as he uses his thoughts to control the device, a team reported Thursday in the journal Science.
Previous versions of the arm had required the participant, Nathan Copeland, to guide the arm using vision alone.
“When I only had visual feedback, I could see that the hand was touching an object,” Copeland says. “But sometimes I would pick something up and it would fall.”
Tasks that had taken Copeland about 20 seconds to complete, “with sensory feedback, he was able to complete in 10,” says Jennifer Collinger, an associate professor in the Department of Physical Medicine and Rehabilitation at the University of Pittsburgh.
Collinger says haptic information is important for prosthetic use because it is difficult to grasp an object you cannot feel.
“Even something as simple as picking up a cup and trying to maintain the right amount of pressure while moving it to a different location depends a lot on the tactile feedback from your hand,” she says.
So Collinger and a team of researchers spent years researching ways to add sensory responses to a robotic arm and hand.
The team worked with Copeland, who was paralyzed as a teenager more than 15 years ago. He had already learned to control the robotic arm’s movements using a brain-computer interface.
The team began by placing electrodes in a region of Copeland’s brain that processes sensory information. This allowed them to use electrical impulses to evoke a range of sensations.
“It turned out that stimulation in the areas of the brain related to the fingertips produced sensations that felt as if they were coming from the participant’s own hand,” Collinger says.
The team then worked out how to generate these signals whenever the robotic hand touched an object. The final step was timing Copeland as he performed tasks such as grasping a block or pouring water, with and without haptic feedback.
The results showed that Copeland was able to complete some manual tasks at roughly the speed of someone using their own hand.
“The sensation actually changes in intensity based on the amount of force the hand exerts on an object,” Copeland says. “So I can also tell whether I’m holding something tightly or not.”
As an added bonus, Copeland says, adding the sense of touch makes using the robotic arm feel more natural.
“The controls are so intuitive that I just think about it as if I were moving my own arm,” he says.
The results have implications that extend well beyond robotic arms, says Jeremy D. Brown, the John C. Malone Associate Professor in the Department of Mechanical Engineering at Johns Hopkins University.
Brown says high-tech prosthetics work best when they mimic the sense of touch. Some do this by vibrating or providing another form of haptic feedback, the same approach many smartphones use to help users type on the screen.
The newest prostheses, Brown says, move much like our natural limbs. They can bend at the elbow, rotate at the wrist and grip with the fingers.
But giving someone the ability to control these devices while also feeling through them, the way they would with a natural sense of touch, is difficult, he says.
Most sensors, he says, have only rudimentary capabilities, such as detecting resistance or temperature.
When his own hand touches something, Brown says, “I feel pressure, I feel slipping, I can feel whether an object is wet or dry, I can feel texture, I know whether it is rough or smooth.”
Scientists are just beginning to learn how to make artificial hands and fingers that can detect these subtle features of an object. As prosthetic limbs provide more sensory feedback, Brown says, they will become more useful.
But he says the sense of touch is about more than just increased dexterity.
“It’s not just about being able to reach into your pocket and get your keys,” he says. “It’s also about being able to hold a loved one’s hand and feel that emotional connection.”