Matching Items (5)

Description

Many upper limb amputees experience an incessant, post-amputation “phantom limb pain” and report that their missing limbs feel paralyzed in an uncomfortable posture. One hypothesis is that efferent commands no longer generate expected afferent signals, such as proprioceptive feedback from changes in limb configuration, and that the mismatch of motor commands and visual feedback is interpreted as pain. Non-invasive therapeutic techniques for treating phantom limb pain, such as mirror visual feedback (MVF), rely on visualizations of postural changes. Advances in neural interfaces for artificial sensory feedback now make it possible to combine MVF with a high-tech “rubber hand” illusion, in which subjects develop a sense of embodiment with a fake hand when subjected to congruent visual and somatosensory feedback. We discuss clinical benefits that could arise from the confluence of known concepts such as MVF and the rubber hand illusion with new technologies such as neural interfaces for sensory feedback and highly sensorized robot hand testbeds like the “BairClaw” presented here. Our multi-articulating, anthropomorphic robot testbed can be used to study proprioceptive and tactile sensory stimuli during physical finger–object interactions. Conceived for artificial grasp, manipulation, and haptic exploration, the BairClaw could also be used for future studies on the neurorehabilitation of somatosensory disorders due to upper limb impairment or loss. A remote actuation system enables modular control of tendon-driven hands. The artificial proprioception system enables direct measurement of joint angles and tendon tensions, while measurements of temperature, vibration, and skin deformation are provided by a multimodal tactile sensor. The provision of multimodal sensory feedback that is spatiotemporally consistent with commanded actions could lead to benefits such as reduced phantom limb pain and increased prosthesis use due to improved functionality and reduced cognitive burden.
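The abstract above describes combining artificial proprioception (joint angles, tendon tensions) with multimodal tactile sensing (temperature, vibration, skin deformation) into spatiotemporally consistent feedback. As a minimal illustration only, the Python sketch below shows one way such sensor streams might be bundled into a common time-stamped frame; the class names, fields, and units are assumptions, not the BairClaw's actual API.

```python
from dataclasses import dataclass, field
from typing import List, Optional
import time

@dataclass
class ProprioceptiveSample:
    joint_angles_rad: List[float]    # one entry per finger joint, in radians
    tendon_tensions_n: List[float]   # one entry per tendon, in newtons

@dataclass
class TactileSample:
    temperature_c: float             # skin-surface temperature
    vibration_rms: float             # high-frequency vibration amplitude
    skin_deformation: List[float]    # taxel-wise deformation readings

@dataclass
class FeedbackFrame:
    """One spatiotemporally aligned multimodal feedback sample (illustrative)."""
    proprioception: Optional[ProprioceptiveSample] = None
    tactile: Optional[TactileSample] = None
    timestamp_s: float = field(default_factory=time.monotonic)

def bundle_frame(proprio: ProprioceptiveSample, tactile: TactileSample) -> FeedbackFrame:
    # Stamp both modalities with a common clock so downstream feedback
    # stays consistent with the commanded action that produced it.
    return FeedbackFrame(proprioception=proprio, tactile=tactile)

frame = bundle_frame(
    ProprioceptiveSample(joint_angles_rad=[0.1, 0.4, 0.7], tendon_tensions_n=[2.5, 3.1]),
    TactileSample(temperature_c=31.2, vibration_rms=0.02, skin_deformation=[0.3, 0.1, 0.0]),
)
print(frame.timestamp_s)
```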

Contributors: Hellman, Randall (Author) / Chang, Eric (Author) / Tanner, Justin (Author) / Helms Tillery, Stephen (Author) / Santos, Veronica (Author) / Ira A. Fulton Schools of Engineering (Contributor)
Created: 2015-02-19
Description

The addition of tactile and proprioceptive feedback to neuroprosthetic limbs is expected to significantly improve the control of these devices. Intracortical microstimulation (ICMS) of somatosensory cortex is a promising method of delivering this sensory feedback. To date, the main focus of somatosensory ICMS studies has been to deliver discriminable signals, corresponding to varying intensity, to a single location in cortex. However, multiple independent and simultaneous streams of sensory information will need to be encoded by ICMS to provide functionally relevant feedback for a neuroprosthetic limb (e.g., encoding contact events and pressure on multiple digits). In this study, we evaluated the ability of an awake, behaving non-human primate (Macaca mulatta) to discriminate ICMS stimuli delivered on multiple electrodes spaced within somatosensory cortex. We delivered serial stimulation on single electrodes to evaluate the discriminability of sensations corresponding to ICMS of distinct cortical locations. Additionally, we delivered trains of multichannel stimulation, derived from a tactile sensor, synchronously across multiple electrodes. Our results indicate that discrimination of multiple ICMS stimuli is a challenging task, but that discriminable sensory percepts can be elicited by both single and multichannel ICMS on electrodes spaced within somatosensory cortex.
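This abstract mentions trains of multichannel stimulation derived from a tactile sensor and delivered synchronously across electrodes. The sketch below is a hypothetical illustration of such an encoding, mapping per-digit contact pressure to per-electrode pulse rates; the function name, linear mapping, and constants are assumptions for illustration and are not taken from the study.

```python
import numpy as np

def pressure_to_pulse_rates(pressures_kpa: np.ndarray,
                            min_rate_hz: float = 20.0,
                            max_rate_hz: float = 300.0,
                            max_pressure_kpa: float = 50.0) -> np.ndarray:
    """Map per-digit tactile pressures to per-electrode ICMS pulse rates.

    Hypothetical linear encoding: each digit's contact pressure drives the
    pulse rate on the electrode assigned to that digit's cortical
    representation; digits with no contact receive no stimulation.
    """
    normalized = np.clip(pressures_kpa / max_pressure_kpa, 0.0, 1.0)
    rates = min_rate_hz + normalized * (max_rate_hz - min_rate_hz)
    return np.where(pressures_kpa > 0.0, rates, 0.0)

# Simultaneous contact events on two of three digits
print(pressure_to_pulse_rates(np.array([5.0, 30.0, 0.0])))  # [ 48. 188.   0.]
```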

Contributors: Overstreet, Cynthia (Author) / Hellman, Randall (Author) / Ponce Wong, Ruben (Author) / Santos, Veronica (Author) / Helms Tillery, Stephen (Author) / Ira A. Fulton Schools of Engineering (Contributor)
Created: 2016-12-02
Description

The neural mechanisms that take place during learning and adaptation can be directly probed with brain-machine interfaces (BMIs). We developed a BMI-controlled paradigm that enabled us to enforce learning by introducing perturbations that changed the relationship between neural activity and the BMI's output. We introduced a uniform perturbation to the system, through a visuomotor rotation (VMR), and a non-uniform perturbation, through a decorrelation task. The controller in the VMR was essentially unchanged, but produced an output rotated by 30° from the neurally specified output. The controller in the decorrelation trials decoupled the activity of neurons that were highly correlated in the BMI task by selectively forcing the preferred directions of these cell pairs to be orthogonal. We report that movement errors were larger in the decorrelation task, and subjects needed more trials to restore performance back to baseline. During learning, we measured decreasing trends in preferred direction changes and cross-correlation coefficients regardless of task type. Conversely, final adaptations in neural tunings were dependent on the type of controller used (VMR or decorrelation). These results hint at a similar process that the neural population might engage while adapting to new tasks, and at how, through a global process, the neural system can arrive at individual solutions.
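The visuomotor rotation described here leaves the decoder itself unchanged and rotates its output by 30° before it drives the BMI. A minimal sketch of that perturbation, assuming a 2D velocity output, is shown below; the function name and the velocity representation are illustrative assumptions, not the study's implementation.

```python
import numpy as np

def apply_vmr(decoded_velocity: np.ndarray, rotation_deg: float = 30.0) -> np.ndarray:
    """Rotate the neurally specified 2D output by a fixed angle (VMR sketch)."""
    theta = np.deg2rad(rotation_deg)
    rotation = np.array([[np.cos(theta), -np.sin(theta)],
                         [np.sin(theta),  np.cos(theta)]])
    return rotation @ decoded_velocity

# A rightward neurally specified movement is deflected by 30 degrees
print(apply_vmr(np.array([1.0, 0.0])))   # ~[0.866, 0.5]
```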

Created: 2016-08-23
Description

Visual and somatosensory signals participate together in providing an estimate of the hand's spatial location. While the ability of subjects to identify the spatial location of their hand based on visual and proprioceptive signals has previously been characterized, relatively few studies have examined in detail the spatial structure of the proprioceptive map of the arm. Here, we reconstructed and analyzed the spatial structure of the estimation errors that resulted when subjects reported the location of their unseen hand across a 2D horizontal workspace. Hand position estimation was mapped under four conditions: with and without tactile feedback, and with the right and left hands. In the task, we moved each subject's hand to one of 100 targets in the workspace while their eyes were closed. Then, we either a) applied tactile stimulation to the fingertip by allowing the index finger to touch the target or b) as a control, hovered the fingertip 2 cm above the target. After returning the hand to a neutral position, subjects opened their eyes to verbally report where their fingertip had been. We measured and analyzed both the direction and magnitude of the resulting estimation errors. Tactile feedback reduced the magnitude of these estimation errors, but did not change their overall structure. In addition, the spatial structure of these errors was idiosyncratic: each subject had a unique pattern of errors that was stable between hands and over time. Finally, we found that at the population level the magnitude of the estimation errors had a characteristic distribution over the workspace: errors were smallest closer to the body. The stability of estimation errors across conditions and time suggests the brain constructs a proprioceptive map that is reliable, even if it is not necessarily accurate. The idiosyncrasy across subjects emphasizes that each individual constructs a map that is unique to their own experiences.
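The analysis described here measures both the direction and magnitude of hand-position estimation errors across a 2D workspace. Below is a minimal sketch of that computation, assuming the actual and reported fingertip positions are stored as N x 2 arrays in centimeters; the array names and example values are illustrative, not the study's data.

```python
import numpy as np

def estimation_errors(targets_xy: np.ndarray, reports_xy: np.ndarray):
    """Compute the magnitude and direction of hand-position estimation errors."""
    error_vectors = reports_xy - targets_xy
    magnitudes = np.linalg.norm(error_vectors, axis=1)                 # cm
    directions = np.arctan2(error_vectors[:, 1], error_vectors[:, 0])  # rad
    return magnitudes, directions

# Three targets and the positions the subject reported (cm)
targets = np.array([[10.0, 20.0], [15.0, 25.0], [20.0, 30.0]])
reports = np.array([[11.0, 21.0], [14.0, 26.0], [20.5, 29.0]])
mags, dirs = estimation_errors(targets, reports)
print(mags.round(2), np.degrees(dirs).round(1))
```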

Created: 2011-11-16
Description

Tactile perception is typically considered the result of cortical interpretation of afferent signals from a network of mechanical sensors underneath the skin. Yet, tactile illusion studies suggest that tactile perception can be elicited without afferent signals from mechanoceptors. Therefore, the extent to which tactile perception arises from isomorphic mapping of tactile afferents onto the somatosensory cortex remains controversial. We tested whether isomorphic mapping of tactile afferent fibers onto the cortex leads directly to tactile perception by examining whether it is independent of proprioceptive input, evaluating the impact of different hand postures on the perception of a tactile illusion across fingertips. Using the Cutaneous Rabbit Effect, a well-studied illusion evoking the perception that a stimulus occurs at a location where none has been delivered, we found that hand posture has a significant effect on the perception of the illusion across the fingertips. This finding emphasizes that tactile perception arises from the integration of perceived mechanical and proprioceptive input and not purely from tactile interaction with the external environment.

Contributors: Warren, Jay (Author) / Santello, Marco (Author) / Helms Tillery, Stephen (Author) / Ira A. Fulton Schools of Engineering (Contributor)
Created: 2011-03-25