At SXSW 2018, TechRepublic’s Teena Maddox spoke with Hugh Herr, professor and Co-director of the Center for Extreme Bionics, MIT Media Lab. The following is a transcript of the edited video interview.

“At MIT, I co-direct the Center for Extreme Bionics, with my colleague, Edward Boyden,” explains Herr. “At the center, we are advancing bionics, electromechanics linked to the human nervous system that extends physicality, cognition, emotional experience; a very broad agenda across human augmentation.”

It’s also a technology with future applications, offering normalization and assistance for someone with some form of pathology. As an example, Herr cites someone who may suffer an issue within a limb, perhaps amputation or paralysis. In addition, says Herr, actual augmentation beyond innate physiological levels will be available for everyone. “We recently built an exoskeleton that augments human bipedalism, walking on two legs, in terms of energetics and speed.”

At SXSW, Herr says, “We are gearing up to do a bionic limb where there’s a mechanical integration into the bone, and also a neuro-electrical integration into the nervous system. We insert a titanium shaft through the skin membrane into the femur, for those with an above-the-knee amputation. We’re running wires through that shaft from nerves inside the body out to a bionic limb. We can actually close the loop between the bionic limb and the human nervous system, where the person can think and control the synthetic motors on the limb, and they can also feel the bionic limb moving, as part of their body. We recently got FDA approval, so this year we’ll be launching the initiative.”


Currently, explains Herr, “What’s used worldwide is a non-invasive socket, a cup which goes around the residuum on the outside, where body weight is loaded through the soft tissues.

“In this procedure, the load goes directly to the bone, to the skeleton, which is more natural. Also, most prosthetic limbs are not neurally controlled, cannot be controlled by the brain, and very few prostheses can be felt by the user, where synthetic computation is actually inputting information into the nervous system, so the person can feel their joints moving, or feel pressure.”

The procedure was not without challenges. “We’re reflecting proprioception-like signals from the prosthesis into the nervous system. Proprioception relates to the position of the robotic joint, its speed, and how much load is on it. We have a model where we map the sensory information onto the nervous system, so when the bionic joint moves, it’s the same feeling as if the joint were made of flesh and bone.”
