As part of a larger study exploring neural multiplexing and new modes of perception enabled by a brain-computer interface (BCI), Johns Hopkins researchers have demonstrated the ability to “feel” virtual objects by integrating neural stimulation into a mixed-reality environment.

The participant in the study, Robert “Buz” Chmielewski – who previously demonstrated simultaneous control of two of the world’s most advanced prosthetic limbs through a brain-machine interface, and used brain signals to feed himself with two prosthetic limbs – has now demonstrated virtual tactile perception.

“All organisms rely exclusively on their sensory organs to perceive information about the world around them,” said Mike Wolmetz, who manages the Human and Machine Intelligence program at the Johns Hopkins University Applied Physics Laboratory (APL) in Laurel, Maryland. “BCI creates a new pathway to perceive information directly, in ways that are not constrained by, filtered through or aligned with our specific sensory organs.

“This demonstration gives us a very early indication of how neural interfaces may fundamentally change the way we interact with technology and perceive our natural and digital environments in the not-too-distant future.”

The research is part of the Neurally Enhanced Operations (NEO) project, funded by the Defense Advanced Research Projects Agency (DARPA) to investigate neural multiplexing: the extent to which the brain can carry out typical perception and control through the senses and muscles while simultaneously perceiving and controlling through a BCI.

Chmielewski is particularly well suited to help. He suffered a spinal cord injury at age 17 that resulted in incomplete quadriplegia, leaving him with some motor function and sensation in his arms and hands. In 2019, he underwent a 12-hour brain surgery at The Johns Hopkins Hospital to become the first research participant with chronic microelectrode arrays implanted in both hemispheres of the brain.