To complement the non-human primate work, the lab also works with human participants who volunteer to receive intracranial implants, allowing us to record from their brains while they perform a suite of tasks, an approach better known as a brain-computer interface (BCI). While we are limited in the areas we can record from, human participants can perform an impressive range of experiments each day, experiments that might take years to train a monkey to perform, while also giving rich verbal reports that greatly deepen the insights we gain into the neural processing of touch and intracortical stimulation.
Biomimetic multi-channel microstimulation of somatosensory cortex conveys high resolution force feedback for bionic hands
The next paper in our BCI series focused on how well participants could perceive differences in intracortical microstimulation (ICMS) intensity, which determines how finely they could modulate the force a robotic hand applies to objects. We found that biomimetic encoding strategies doubled the precision with which participants could discriminate ICMS intensities. Furthermore, distributing these stimuli across several channels doubled the number of discriminable force levels again.
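To give a flavor of what "biomimetic encoding" means here, the sketch below shows one common form of the idea: stimulation amplitude tracks both the sustained force and its rate of change, so contact onset and release produce brief high-amplitude transients, echoing how tactile afferents respond to touch. The function, its weights, and the force trace are all illustrative assumptions, not the encoder or parameters used in the paper.

```python
import numpy as np

def biomimetic_encode(force, w_static=1.0, w_dynamic=100.0):
    """Illustrative biomimetic ICMS encoder (not the published one).

    Stimulation amplitude is a weighted sum of the instantaneous force
    and the rectified per-sample change in force, so that contact
    onset/offset transients are emphasized relative to sustained grasp.
    """
    d_force = np.abs(np.gradient(force))            # rectified force change per sample
    amplitude = w_static * force + w_dynamic * d_force
    return np.clip(amplitude, 0.0, None)            # amplitudes cannot be negative

# Example: a 1 s grasp sampled at 1 kHz -- ramp up, hold, then release
t = np.arange(0.0, 1.0, 0.001)
force = np.clip(np.minimum(t * 10, 1.0) - np.maximum(t - 0.8, 0.0) * 10, 0.0, 1.0)
amp = biomimetic_encode(force)
```

With these weights, the encoded amplitude peaks during the grasp and release transients and settles back to the static force level during the hold, which is the qualitative signature of a biomimetic scheme.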
Microstimulation of human somatosensory cortex evokes task-dependent, spatially patterned responses in motor cortex
Once we had our own BCI project up and running, one of the first questions we wanted to answer was how somatosensory cortex and motor cortex interact with one another. Consequently, we extensively stimulated somatosensory cortex and measured the evoked responses in motor cortex. We found a somatotopic relationship that determined the pattern of activation in motor cortex when somatosensory cortex was stimulated. Furthermore, the nature of the modulation depended on the motor task. These effects tended to interfere with motor decoding; biomimetic stimulation, however, reduced the magnitude of this effect and thus prevented catastrophic interference.
Chronic use of a sensitized bionic hand does not remap the sense of touch
In my first foray into BCI, we analyzed a historical dataset, courtesy of Max Ortiz-Catalan, in which participants had used, over several years, an osseointegrated prosthetic limb with a single sensor that provided tactile feedback. Because there was a mismatch between the location of the sensor and the location at which the participants perceived the stimulus, it had been hypothesized that plasticity would gradually shift the perceived sensation toward the sensor's location. Instead, we observed no change in the location of the evoked percept, implying that the cortical representation of touch is highly stable.