Matching Items (12)
Description
When surgical resection becomes necessary to alleviate a patient's epileptiform activity, that patient is monitored by video synchronized with electrocorticography (ECoG) to determine the type and location of the seizure focus. This provides a unique opportunity for researchers to gather neurophysiological data with high temporal and spatial resolution; these data are assessed prior to surgical resection to ensure the preservation of the patient's quality of life, e.g., to avoid the removal of brain tissue required for speech processing. Currently considered the "gold standard" for the mapping of cortex, electrical cortical stimulation (ECS) involves the systematic activation of pairs of electrodes to localize functionally specific brain regions. This method has distinct limitations, which often include pain experienced by the patient. Even in the best cases, the technique suffers from subjective assessments on the parts of both patients and physicians, and from high inter- and intra-observer variability. Recent advances have been made as researchers have reported the localization of language areas through several signal processing methodologies, all of which require patient participation in a controlled experiment. A quantification tool that could localize speech areas while a patient is engaged in an unconstrained interpersonal conversation would eliminate the dependence on biased patient and reviewer input, as well as unnecessary discomfort to the patient. Post-hoc ECoG data were gathered from five patients with intractable epilepsy while each was engaged in a conversation with family members or clinicians. After the data were separated into different speech conditions, the power of each condition was compared to baseline to determine statistically significant activated electrodes. The results of several analytical methods are presented here. The algorithms did not yield language-specific areas exclusively, as broad activation of statistically significant electrodes was apparent across cortical areas. For one patient, 15 adjacent contacts along the superior temporal gyrus (STG) and the posterior part of the temporal lobe were determined to be language-significant through a controlled experiment. The task involved the patient lying in bed listening to repeated words and yielded statistically significant activations that aligned with those of the clinical evaluation. The results of this study do not support the hypothesis that unconstrained conversation may be used to localize areas required for receptive and productive speech, yet they suggest that a simple listening task may be an adequate alternative to direct cortical stimulation.
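The electrode-screening analysis described in this abstract (comparing the power of each speech condition against baseline, electrode by electrode) can be sketched as follows. This is a minimal illustration, not the thesis's actual pipeline: the sampling rate, high-gamma band, Welch power estimate, two-sample t-test, and Bonferroni correction are all assumptions made for the example.

```python
import numpy as np
from scipy.signal import welch
from scipy.stats import ttest_ind

FS = 1000         # assumed sampling rate (Hz)
BAND = (70, 110)  # assumed high-gamma band of interest (Hz)

def band_power(trials, fs=FS, band=BAND):
    """Mean power in `band` for each trial and electrode.

    trials: array of shape (n_trials, n_electrodes, n_samples)
    returns: array of shape (n_trials, n_electrodes)
    """
    freqs, psd = welch(trials, fs=fs, nperseg=fs // 2, axis=-1)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[..., mask].mean(axis=-1)

def significant_electrodes(speech_trials, baseline_trials, alpha=0.05):
    """Flag electrodes whose band power during speech differs from baseline."""
    speech_pw = band_power(speech_trials)
    base_pw = band_power(baseline_trials)
    _, pvals = ttest_ind(speech_pw, base_pw, axis=0)
    # Bonferroni correction across electrodes (an illustrative choice)
    n_elec = speech_pw.shape[1]
    return np.flatnonzero(pvals < alpha / n_elec)

# Example with synthetic noise: 40 speech trials, 30 baseline trials, 64 electrodes.
# With pure noise the result will usually be an empty array.
rng = np.random.default_rng(0)
speech = rng.standard_normal((40, 64, 2000))
baseline = rng.standard_normal((30, 64, 2000))
print(significant_electrodes(speech, baseline))
```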
Contributors: Lingo VanGilder, Jennapher (Author) / Helms Tillery, Stephen I (Thesis advisor) / Wahnoun, Remy (Thesis advisor) / Buneo, Christopher (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
Intracortical microstimulation (ICMS) within somatosensory cortex can produce artificial sensations including touch, pressure, and vibration. There is significant interest in using ICMS to provide sensory feedback for a prosthetic limb. In such a system, information recorded from sensors on the prosthetic would be translated into electrical stimulation and delivered directly to the brain, providing feedback about features of objects in contact with the prosthetic. To achieve this goal, multiple simultaneous streams of information will need to be encoded by ICMS in a manner that produces robust, reliable, and discriminable sensations. The first segment of this work focuses on the discriminability of sensations elicited by ICMS within somatosensory cortex. Stimulation on multiple single electrodes and near-simultaneous stimulation across multiple electrodes, driven by a multimodal tactile sensor, were both used in these experiments. A SynTouch BioTac sensor was moved across a flat surface in several directions, and a subset of the sensor's electrode impedance channels were used to drive multichannel ICMS in the somatosensory cortex of a non-human primate. The animal performed a behavioral task during this stimulation to indicate the discriminability of sensations evoked by the electrical stimulation. The animal's responses to ICMS were somewhat inconsistent across experimental sessions but indicated that discriminable sensations were evoked by both single and multichannel ICMS. The factors that affect the discriminability of stimulation-induced sensations are not well understood, in part because the relationship between ICMS and the neural activity it induces is poorly defined. The second component of this work was to develop computational models that describe the populations of neurons likely to be activated by ICMS. Models of several neurons were constructed, and their responses to ICMS were calculated. A three-dimensional cortical model was constructed using these cell models and used to identify the populations of neurons likely to be recruited by ICMS. Stimulation activated neurons in a sparse and discontinuous fashion; additionally, the type, number, and location of neurons likely to be activated by stimulation varied with electrode depth.
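The sensor-to-stimulation translation described here (tactile sensor channels driving multichannel ICMS) might look roughly like the sketch below. The one-to-one channel-to-electrode mapping, amplitude range, and normalization are illustrative assumptions; real stimulation parameters are constrained by hardware and safety limits not modeled here.

```python
import numpy as np

# Assumed, illustrative parameter ranges; actual ICMS limits are set by safety
# constraints and the stimulator hardware, not by this sketch.
AMP_RANGE_UA = (10.0, 80.0)   # stimulation current amplitude, microamps
FREQ_HZ = 300.0               # fixed pulse-train frequency

def sensor_to_icms(impedance_channels, baseline, gain=1.0):
    """Map a subset of tactile-sensor impedance channels to per-electrode amplitudes.

    impedance_channels: current readings, shape (n_channels,)
    baseline: resting readings for the same channels, shape (n_channels,)
    Returns an amplitude (in microamps) for each stimulation electrode,
    assuming a one-to-one channel-to-electrode mapping.
    """
    # Normalize the deviation from baseline to [0, 1], then scale to the amplitude range
    deviation = np.abs(impedance_channels - baseline)
    norm = np.clip(gain * deviation / (deviation.max() + 1e-9), 0.0, 1.0)
    lo, hi = AMP_RANGE_UA
    return lo + norm * (hi - lo)

readings = np.array([105.0, 98.0, 140.0, 101.0])
rest = np.array([100.0, 100.0, 100.0, 100.0])
print(sensor_to_icms(readings, rest))  # larger deviations map to larger amplitudes
```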
Contributors: Overstreet, Cynthia K (Author) / Helms Tillery, Stephen I (Thesis advisor) / Santos, Veronica (Committee member) / Buneo, Christopher (Committee member) / Otto, Kevin (Committee member) / Santello, Marco (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
Human fingertips contain thousands of specialized mechanoreceptors that enable effortless physical interactions with the environment. Haptic perception capabilities enable grasp and manipulation in the absence of visual feedback, as when reaching into one's pocket or wrapping a belt around oneself. Unfortunately, state-of-the-art artificial tactile sensors and processing algorithms are no match for their biological counterparts. Tactile sensors must not only meet stringent practical specifications for everyday use, but their signals must be processed and interpreted within hundreds of milliseconds. Control of artificial manipulators, ranging from prosthetic hands to bomb defusal robots, requires a constant reliance on visual feedback that is not entirely practical. To address this, we conducted three studies aimed at advancing artificial haptic intelligence. First, we developed a novel, robust, microfluidic tactile sensor skin capable of measuring normal forces on flat or curved surfaces, such as a fingertip. The sensor consists of microchannels in an elastomer filled with a liquid metal alloy. The fluid serves as both electrical interconnects and tunable capacitive sensing units, and enables functionality despite substantial deformation. The second study investigated the use of a commercially-available, multimodal tactile sensor (BioTac sensor, SynTouch) to characterize edge orientation with respect to a body fixed reference frame, such as a fingertip. Trained on data from a robot testbed, a support vector regression model was developed to relate haptic exploration actions to perception of edge orientation. The model performed comparably to humans for estimating edge orientation. Finally, the robot testbed was used to perceive small, finger-sized geometric features. The efficiency and accuracy of different haptic exploratory procedures and supervised learning models were assessed for estimating feature properties such as type (bump, pit), order of curvature (flat, conical, spherical), and size. This study highlights the importance of tactile sensing in situations where other modalities fail, such as when the finger itself blocks line of sight. Insights from this work could be used to advance tactile sensor technology and haptic intelligence for artificial manipulators that improve quality of life, such as prosthetic hands and wheelchair-mounted robotic hands.
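A minimal sketch of the edge-orientation regression described in this abstract, using scikit-learn's SVR. The synthetic features, RBF kernel, and hyperparameters are placeholders; the actual model was trained on features recorded from the robot testbed's tactile sensor.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for tactile features gathered during exploratory movements;
# each row might hold electrode impedances and fluid-pressure statistics for one trial.
rng = np.random.default_rng(1)
n_trials, n_features = 200, 19
X = rng.standard_normal((n_trials, n_features))
true_weights = rng.standard_normal(n_features)
edge_angle_deg = X @ true_weights * 5.0 + rng.normal(0, 2.0, n_trials)  # target: edge orientation

# Scale features, then fit a kernel SVR; evaluate with cross-validated absolute error
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=1.0))
scores = cross_val_score(model, X, edge_angle_deg, cv=5, scoring="neg_mean_absolute_error")
print(f"cross-validated MAE: {-scores.mean():.2f} degrees")
```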
Contributors: Ponce Wong, Ruben Dario (Author) / Santos, Veronica J (Thesis advisor) / Artemiadis, Panagiotis K (Committee member) / Helms Tillery, Stephen I (Committee member) / Posner, Jonathan D (Committee member) / Runger, George C. (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
The basal ganglia are four sub-cortical nuclei associated with motor control and reward learning. They are part of numerous larger, mostly segregated loops in which the basal ganglia receive inputs from specific regions of cortex. Converging on these inputs are dopaminergic neurons that alter their firing based on received and/or predicted rewarding outcomes of a behavior. The basal ganglia's output feeds through the thalamus back to the areas of the cortex where the loop originated. Understanding the dynamic interactions between the various parts of these loops is critical to understanding the basal ganglia's role in motor control and reward-based learning. This work developed several experimental techniques that can be applied to further study basal ganglia function. The first technique used micro-volume injections of low-concentration muscimol to decrease the firing rates of recorded neurons in a limited area of cortex in rats. Afterwards, an artificial cerebrospinal fluid flush was injected to rapidly eliminate the muscimol's effects. This technique contained the effects of muscimol to a volume of approximately 1 mm radius and limited the duration of the drug effect to less than one hour. It could be used to temporarily perturb a small portion of the loops involving the basal ganglia and then observe how these effects propagate to other connected regions. The second part applied self-organizing maps (SOMs) to find temporal patterns in neural firing rates that are independent of behavior. The distribution of detected pattern frequencies on these maps can then be used to determine whether changes in neural activity are occurring over time. The final technique focused on the role of the basal ganglia in reward learning. A new conditioning technique was created to increase the occurrence of selected patterns of neural activity without utilizing any external reward or behavior. A pattern of neural activity in the cortex of rats was selected using an SOM. The pattern was then reinforced by pairing it with electrical stimulation of the medial forebrain bundle, triggering dopamine release in the basal ganglia. Ultimately, this technique proved unsuccessful, possibly due to poor selection of the patterns being reinforced.
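A compact illustration of the self-organizing map approach mentioned above, applied to short binned firing-rate patterns. The grid size, decay schedules, and random data are assumptions made for the example; they are not the parameters used in the dissertation.

```python
import numpy as np

def train_som(patterns, grid=(8, 8), n_iter=5000, lr0=0.5, sigma0=3.0, seed=0):
    """Minimal self-organizing map for short firing-rate patterns.

    patterns: array (n_patterns, pattern_length), e.g. binned firing rates.
    Returns the trained weight grid of shape (grid[0], grid[1], pattern_length).
    """
    rng = np.random.default_rng(seed)
    rows, cols = grid
    w = rng.standard_normal((rows, cols, patterns.shape[1]))
    gy, gx = np.mgrid[0:rows, 0:cols]  # grid coordinates for the neighborhood function
    for t in range(n_iter):
        x = patterns[rng.integers(len(patterns))]
        # Best-matching unit: node whose weights are closest to the sample
        dists = np.linalg.norm(w - x, axis=-1)
        by, bx = np.unravel_index(np.argmin(dists), dists.shape)
        # Exponentially decaying learning rate and neighborhood radius
        frac = t / n_iter
        lr = lr0 * np.exp(-3.0 * frac)
        sigma = sigma0 * np.exp(-3.0 * frac)
        neighborhood = np.exp(-((gy - by) ** 2 + (gx - bx) ** 2) / (2 * sigma ** 2))
        w += lr * neighborhood[..., None] * (x - w)
    return w

def map_pattern(w, x):
    """Return the grid coordinates of the best-matching unit for pattern x."""
    dists = np.linalg.norm(w - x, axis=-1)
    return np.unravel_index(np.argmin(dists), dists.shape)

# Example: 500 random 20-bin firing-rate snippets
rates = np.abs(np.random.default_rng(2).standard_normal((500, 20)))
som = train_som(rates)
print(map_pattern(som, rates[0]))
```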
Contributors: Baldwin, Nathan Aaron (Author) / Helms Tillery, Stephen I (Thesis advisor) / Castaneda, Edward (Committee member) / Buneo, Christopher A (Committee member) / Muthuswamy, Jitendran (Committee member) / Si, Jennie (Committee member) / Arizona State University (Publisher)
Created: 2014
Description
Effective tactile sensing in prosthetic and robotic hands is crucial for improving the functionality of such hands and enhancing the user's experience. Thus, improving the range of tactile sensing capabilities is essential for developing versatile artificial hands. Multimodal tactile sensors called BioTacs, which include a hydrophone and a force electrode array, were used to understand how grip force, contact angle, object texture, and slip direction may be encoded in the sensor data. Findings show that slip induced under conditions of high contact angles and grip forces resulted in significant changes in both AC and DC pressure magnitude and rate of change in pressure. Slip induced under conditions of low contact angles and grip forces resulted in significant changes in the rate of change in electrode impedance. Slip in the distal direction of a precision grip caused significant changes in pressure magnitude and rate of change in pressure, while slip in the radial direction of the wrist caused significant changes in the rate of change in electrode impedance. A strong relationship was established between slip direction and the rate of change in ratios of electrode impedance for radial and ulnar slip relative to the wrist. Consequently, establishing multiple thresholds or establishing a multivariate model may be a useful method for detecting and characterizing slip. Detecting slip for low contact angles could be done by monitoring electrode data, while detecting slip for high contact angles could be done by monitoring pressure data. Predicting slip in the distal direction could be done by monitoring pressure data, while predicting slip in the radial and ulnar directions could be done by monitoring electrode data.
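The threshold-based slip detection suggested in the last paragraph could be prototyped along these lines. The specific threshold values, sampling interval, and the rule that pressure signals high-contact-angle slip while impedance signals low-contact-angle slip are illustrative stand-ins for values that would be fit to labeled BioTac data.

```python
import numpy as np

# Illustrative thresholds; in practice they would be fit to labeled slip trials.
DPRESSURE_THRESH = 50.0   # |d(pressure)/dt| threshold for high-contact-angle slip
DIMPEDANCE_THRESH = 5.0   # |d(impedance)/dt| threshold for low-contact-angle slip

def detect_slip(pressure, impedance, dt):
    """Flag samples where either rate-of-change threshold is crossed.

    pressure, impedance: 1-D time series sampled every `dt` seconds.
    Returns a boolean array (one element shorter than the inputs) and the name
    of the channel that triggered first, mirroring the idea that pressure
    signals high-contact-angle slip and impedance signals low-contact-angle slip.
    """
    dp = np.abs(np.diff(pressure) / dt)
    dz = np.abs(np.diff(impedance) / dt)
    slip = (dp > DPRESSURE_THRESH) | (dz > DIMPEDANCE_THRESH)
    if not slip.any():
        return slip, None
    first = np.argmax(slip)  # index of the first flagged sample
    trigger = "pressure" if dp[first] > DPRESSURE_THRESH else "impedance"
    return slip, trigger

# Synthetic example: steady grip, then a rapid pressure change halfway through
pressure = np.concatenate([np.full(500, 200.0), np.linspace(200.0, 400.0, 500)])
impedance = np.full(1000, 30.0)
flags, channel = detect_slip(pressure, impedance, dt=0.001)
print(flags.any(), channel)
```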
Contributors: Hsia, Albert (Author) / Santos, Veronica J (Thesis advisor) / Santello, Marco (Committee member) / Helms Tillery, Stephen I (Committee member) / Arizona State University (Publisher)
Created: 2012
Description
In the past decade, research on the motor control side of neuroprosthetics has steadily gained momentum. However, modern research in prosthetic development supplements a focus on motor control with a concentration on sensory feedback. Simulating sensation is a central issue because without sensory capabilities, the sophistication of the most advanced motor control system fails to reach its full potential. This research is an effort toward the development of sensory feedback specifically for neuroprosthetic hands. The present aim of this work is to understand the processing and representation of cutaneous sensation by evaluating performance and neural activity in somatosensory cortex (SI) during a grasp task. A non-human primate (Macaca mulatta) was trained to reach out and grasp textured instrumented objects with a precision grip. Two different textures for the objects were used, 100% cotton cloth and 60-grade sandpaper, and the target object was presented at two different orientations. Of the 167 cells that were isolated for this experiment, only 42 were recorded while the subject executed a few blocks of successful trials for both textures. These latter cells were used in this study's statistical analysis. Of these, 37 units (88%) exhibited statistically significant task related activity. Twenty-two units (52%) exhibited statistically significant tuning to texture, and 16 units (38%) exhibited statistically significant tuning to posture. Ten of the cells (24%) exhibited statistically significant tuning to both texture and posture. These data suggest that single units in somatosensory cortex can encode multiple phenomena such as texture and posture. However, if this information is to be used to provide sensory feedback for a prosthesis, scientists must learn to further parse cortical activity to discover how to induce specific modalities of sensation. Future experiments should therefore be developed that probe more variables and that more systematically and comprehensively scan somatosensory cortex. This will allow researchers to seek out the existence or non-existence of cortical pockets reserved for certain modalities of sensation, which will be valuable in learning how to later provide appropriate sensory feedback for a prosthesis through cortical stimulation.
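One common way to test for the texture and posture tuning reported above is a per-unit two-way analysis of firing rate, sketched below with statsmodels. The ANOVA model, trial counts, and synthetic rates are assumptions for illustration and not necessarily the statistics used in this thesis.

```python
import numpy as np
import pandas as pd
from statsmodels.formula.api import ols
from statsmodels.stats.anova import anova_lm

def tuning_pvalues(trials):
    """Two-way ANOVA of single-unit firing rate against texture and posture.

    trials: DataFrame with columns 'rate', 'texture', 'posture' (one row per trial).
    Returns the p-values for the texture and posture main effects.
    """
    model = ols("rate ~ C(texture) + C(posture)", data=trials).fit()
    table = anova_lm(model, typ=2)
    return table.loc["C(texture)", "PR(>F)"], table.loc["C(posture)", "PR(>F)"]

# Synthetic unit: higher rate for sandpaper, smaller posture effect
rng = np.random.default_rng(3)
n = 80
texture = rng.choice(["cloth", "sandpaper"], n)
posture = rng.choice(["orientation_1", "orientation_2"], n)
rate = 20 + 8 * (texture == "sandpaper") + 2 * (posture == "orientation_2") + rng.normal(0, 3, n)
df = pd.DataFrame({"rate": rate, "texture": texture, "posture": posture})
p_tex, p_post = tuning_pvalues(df)
print(f"texture p={p_tex:.4f}, posture p={p_post:.4f}")
```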
Contributors: Naufel, Stephanie (Author) / Helms Tillery, Stephen I (Thesis advisor) / Santos, Veronica J (Thesis advisor) / Buneo, Christopher A (Committee member) / Robert, Jason S (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
Vagal Nerve Stimulation (VNS) has been shown to be a promising therapeutic technique for treating many neurological diseases, including epilepsy, stroke, traumatic brain injury, and migraine headache. The mechanisms by which VNS acts, however, are not fully understood but may involve changes in cerebral blood flow. The vagus nerve plays a significant role in the regulation of heart rate and cerebral blood flow, both of which are altered during VNS. Here, the effects of acute vagal nerve stimulation with varying stimulation parameters on both heart rate and cerebral blood flow were examined. Laser Speckle Contrast Analysis (LASCA) was used to analyze the cerebral blood flow of male Long–Evans rats. In the first experiment, results showed two distinct patterns of response to 0.8 mA stimulation, whereby animals experienced either a mild or a severe decrease in heart rate. Further, animals that displayed mild heart rate decreases showed an increase in cerebral blood flow that persisted beyond VNS. Animals that displayed severe decreases showed a transient decrease in cerebral blood flow followed by an increase that was greater than that observed in the mild animals but progressively declined after VNS. These results suggest two distinct patterns of change in both heart rate and blood flow that may be related to the intensity of VNS. To investigate the effects of lower levels of stimulation, an additional group of animals was stimulated at 0.4 mA. These animals showed moderate changes in heart rate but no significant changes in cerebral blood flow. The results demonstrate that VNS alters both heart rate and cerebral blood flow and that these effects depend on current intensity.
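Laser Speckle Contrast Analysis reduces a raw speckle image to a local contrast map, K = sigma/mean over a sliding window, with relative flow often taken proportional to 1/K^2. A minimal sketch is shown below; the window size and the 1/K^2 flow index are conventional choices assumed for the example, not details taken from this thesis.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def speckle_contrast(raw_frame, window=7):
    """Spatial speckle contrast K = sigma / mean over a sliding window.

    raw_frame: 2-D raw speckle image. Lower K corresponds to more blurring
    of the speckle pattern and therefore higher flow.
    """
    frame = raw_frame.astype(float)
    mean = uniform_filter(frame, size=window)
    mean_sq = uniform_filter(frame ** 2, size=window)
    var = np.clip(mean_sq - mean ** 2, 0, None)  # guard against negative rounding error
    return np.sqrt(var) / (mean + 1e-9)

def flow_index(raw_frame, window=7):
    """A common relative flow measure, 1 / K^2 (up to a scale factor)."""
    k = speckle_contrast(raw_frame, window)
    return 1.0 / (k ** 2 + 1e-9)

# Synthetic speckle-like frame as a stand-in for camera data
frame = np.random.default_rng(4).poisson(50, size=(256, 256))
print(flow_index(frame).mean())
```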
Contributors: Hillebrand, Peter (M.S.) (Author) / Kleim, Jeffrey A (Thesis advisor) / Helms Tillery, Stephen I (Committee member) / Muthuswamy, Jitendran (Committee member) / Arizona State University (Publisher)
Created: 2019
Description
Brain-machine interfaces (BMIs) were first imagined as a technology that would allow subjects to have direct communication with prosthetics and external devices (e.g. control over a computer cursor or robotic arm movement). Operation of these devices was not automatic, and subjects needed calibration and training in order to master this control. In short, learning became a key component in controlling these systems. As a result, BMIs have become ideal tools to probe and explore brain activity, since they allow the isolation of neural inputs and systematic altering of the relationships between the neural signals and output. I have used BMIs to explore the process of brain adaptability in a motor-like task. To this end, I trained non-human primates to control a 3D cursor and adapt to two different perturbations: a visuomotor rotation, uniform across the neural ensemble, and a decorrelation task, which non-uniformly altered the relationship between the activity of particular neurons in an ensemble and movement output. I measured individual and population level changes in the neural ensemble as subjects honed their skills over the span of several days. I found some similarities in the adaptation process elicited by these two tasks. On one hand, individual neurons displayed tuning changes across the entire ensemble after task adaptation: most neurons displayed transient changes in their preferred directions, and most neuron pairs showed changes in their cross-correlations during the learning process. On the other hand, I also measured population level adaptation in the neural ensemble: the underlying neural manifolds that control these neural signals also had dynamic changes during adaptation. I have found that the neural circuits seem to apply an exploratory strategy when adapting to new tasks. Our results suggest that information and trajectories in the neural space increase after initially introducing the perturbations, and before the subject settles into workable solutions. These results provide new insights into both the underlying population level processes in motor learning, and the changes in neural coding which are necessary for subjects to learn to control neuroprosthetics. Understanding of these mechanisms can help us create better control algorithms, and design training paradigms that will take advantage of these processes.
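The preferred-direction changes described above are typically quantified with a cosine-tuning fit. The sketch below estimates a unit's preferred direction by least squares and shows how a uniform visuomotor rotation shifts it; the cosine model, 45-degree rotation, and synthetic firing rates are assumptions made for illustration.

```python
import numpy as np

def preferred_direction(directions, rates):
    """Estimate a unit's preferred direction from a cosine tuning fit.

    directions: unit vectors of cursor/hand movement, shape (n_trials, 3)
    rates: firing rates per trial, shape (n_trials,)
    Fits rate ~ b0 + b . d by least squares; the preferred direction is b / |b|.
    """
    X = np.hstack([np.ones((len(directions), 1)), directions])
    coef, *_ = np.linalg.lstsq(X, rates, rcond=None)
    b = coef[1:]
    return b / np.linalg.norm(b)

# Synthetic unit tuned to the +x axis, before and after a visuomotor rotation
rng = np.random.default_rng(5)
dirs = rng.standard_normal((300, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
rates = 30 + 15 * dirs @ np.array([1.0, 0.0, 0.0]) + rng.normal(0, 2, 300)
pd_before = preferred_direction(dirs, rates)

theta = np.deg2rad(45)  # an assumed 45-degree rotation about the z axis
Rz = np.array([[np.cos(theta), -np.sin(theta), 0],
               [np.sin(theta),  np.cos(theta), 0],
               [0, 0, 1]])
pd_after = preferred_direction(dirs @ Rz.T, rates)
angle = np.degrees(np.arccos(np.clip(pd_before @ pd_after, -1, 1)))
print(f"PD shift after rotation: {angle:.1f} degrees")
```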
Contributors: Armenta Salas, Michelle (Author) / Helms Tillery, Stephen I (Thesis advisor) / Si, Jennie (Committee member) / Buneo, Christopher (Committee member) / Santello, Marco (Committee member) / Kleim, Jeffrey (Committee member) / Arizona State University (Publisher)
Created: 2015
Description
Robotic systems are outmatched by the abilities of the human hand to perceive and manipulate the world. Human hands are able to physically interact with the world to perceive, learn, and act to accomplish tasks. Limitations of robotic systems to interact with and manipulate the world diminish their usefulness. In order to advance robot end effectors, specifically artificial hands, rich multimodal tactile sensing is needed. In this work, a multi-articulating, anthropomorphic robot testbed was developed for investigating tactile sensory stimuli during finger-object interactions. The artificial finger is controlled by a tendon-driven remote actuation system that allows for modular control of any tendon-driven end effector and capabilities for both speed and strength. The artificial proprioception system enables direct measurement of joint angles and tendon tensions while temperature, vibration, and skin deformation are provided by a multimodal tactile sensor. Next, attention was focused on real-time artificial perception for decision-making. A robotic system needs to perceive its environment in order to make decisions. Specific actions such as “exploratory procedures” can be employed to classify and characterize object features. Prior work on offline perception was extended to develop an anytime predictive model that returns the probability of having touched a specific feature of an object based on minimally processed sensor data. Developing models for anytime classification of features facilitates real-time action-perception loops. Finally, by combining real-time action-perception with reinforcement learning, a policy was learned to complete a functional contour-following task: closing a deformable ziplock bag. The approach relies only on proprioceptive and localized tactile data. A Contextual Multi-Armed Bandit (C-MAB) reinforcement learning algorithm was implemented to maximize cumulative rewards within a finite time period by balancing exploration versus exploitation of the action space. Performance of the C-MAB learner was compared to a benchmark Q-learner that eventually returns the optimal policy. To assess robustness and generalizability, the learned policy was tested on variations of the original contour-following task. The work presented contributes to the full range of tools necessary to advance the abilities of artificial hands with respect to dexterity, perception, decision-making, and learning.
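The contextual multi-armed bandit idea can be illustrated with a minimal epsilon-greedy learner that keeps a value estimate per (context, action) pair, as below. This is not the C-MAB algorithm used in the dissertation; the discretized contexts, reward probabilities, and epsilon value are assumptions made for the example.

```python
import numpy as np

class EpsilonGreedyCMAB:
    """A minimal contextual bandit: one epsilon-greedy value table per context.

    Contexts here could be discretized tactile states (e.g. "on contour",
    "drifting left", "drifting right") and arms the candidate fingertip actions.
    """

    def __init__(self, n_contexts, n_arms, epsilon=0.1, seed=0):
        self.q = np.zeros((n_contexts, n_arms))       # estimated value per (context, arm)
        self.counts = np.zeros((n_contexts, n_arms))  # pulls per (context, arm)
        self.epsilon = epsilon
        self.rng = np.random.default_rng(seed)

    def select(self, context):
        if self.rng.random() < self.epsilon:          # explore
            return int(self.rng.integers(self.q.shape[1]))
        return int(np.argmax(self.q[context]))        # exploit

    def update(self, context, arm, reward):
        self.counts[context, arm] += 1
        # Incremental mean keeps a running estimate of each arm's reward
        self.q[context, arm] += (reward - self.q[context, arm]) / self.counts[context, arm]

# Toy environment: in each context a different arm pays off most often
true_best = np.array([2, 0, 1])
bandit = EpsilonGreedyCMAB(n_contexts=3, n_arms=3)
rng = np.random.default_rng(1)
for _ in range(3000):
    ctx = int(rng.integers(3))
    arm = bandit.select(ctx)
    reward = float(rng.random() < (0.8 if arm == true_best[ctx] else 0.3))
    bandit.update(ctx, arm, reward)
print(bandit.q.argmax(axis=1))  # should recover true_best
```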
Contributors: Hellman, Randall Blake (Author) / Santos, Veronica J (Thesis advisor) / Artemiadis, Panagiotis K (Committee member) / Berman, Spring (Committee member) / Helms Tillery, Stephen I (Committee member) / Fainekos, Georgios (Committee member) / Arizona State University (Publisher)
Created: 2016
Description
Prosthetic users abandon devices due to difficulties performing tasks without properly graded or interpretable feedback. The inability to adequately detect and correct errors of the device leads to failure and frustration. In advanced prostheses, peripheral nerve stimulation can be used to deliver sensations, but the standard schemes used in sensorized prosthetic systems induce percepts inconsistent with natural sensations, providing limited benefit. Recent uses of time-varying stimulation strategies appear to produce more practical sensations, but without a clear path to pursue improvements. This dissertation examines the use of physiologically based stimulation strategies to elicit sensations that are more readily interpretable. A psychophysical experiment designed to investigate sensitivity to the discrimination of perturbation direction within precision grip suggests that perception is biomechanically referenced: increased sensitivity along the ulnar-radial axis aligns with potential anisotropic deformation of the finger pad, indicating that somatosensation uses internal rather than environmental information. Contact-site- and direction-dependent deformation of the finger pad activates complementary fast-adapting and slow-adapting mechanoreceptors, exhibiting parallel activity of the two associated temporal patterns: static and dynamic. The spectrum of temporal activity seen in somatosensory cortex can be explained by a combined representation of these distinct response dynamics, a phenomenon referred to in this dissertation as "biphasic representation." In a reach-to-precision-grasp task, neurons in somatosensory cortex were found to possess biphasic firing patterns in their responses to texture, orientation, and movement. Sensitivities seem to align with variable deformation and mechanoreceptor activity: movement and smooth-texture responses align with potential fast-adapting activation, non-movement and coarse-texture responses align with potential increased slow-adapting activation, and responses to orientation are conceptually consistent with coding of tangential load. Using evidence of the biphasic representations' association with perceptual priorities, gamma-band phase locking is used to compare responses to peripheral nerve stimulation patterns and mechanical stimulation. Vibrotactile and punctate mechanical stimuli are used to represent the practical and impractical percepts commonly observed in peripheral nerve stimulation feedback. Standard patterns with constant parameters closely mimic impractical vibrotactile stimulation, while biphasic patterns better mimic punctate stimulation and provide a platform to investigate intragrip dynamics representing contextual activation.
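Gamma-band phase locking of the kind used here to compare stimulation and mechanical responses is commonly computed as an inter-trial phase-locking value from band-pass-filtered, Hilbert-transformed signals. The sketch below uses assumed band limits, filter order, and synthetic data; it illustrates the measure, not the dissertation's exact analysis.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

FS = 1000          # assumed sampling rate (Hz)
GAMMA = (30, 80)   # assumed gamma band (Hz)

def phase_locking_value(trials, fs=FS, band=GAMMA):
    """Inter-trial phase locking of band-limited activity.

    trials: array (n_trials, n_samples) of cortical recordings aligned to the
    stimulus. Returns the PLV time course: |mean over trials of exp(i*phase)|.
    """
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, trials, axis=-1)
    phase = np.angle(hilbert(filtered, axis=-1))
    return np.abs(np.exp(1j * phase).mean(axis=0))

# Synthetic trials with a brief phase-consistent gamma burst after the "stimulus"
rng = np.random.default_rng(6)
t = np.arange(0, 1, 1 / FS)
trials = rng.standard_normal((50, t.size))
burst = (t > 0.5) & (t < 0.6)
trials[:, burst] += 2 * np.sin(2 * np.pi * 50 * t[burst])
plv = phase_locking_value(trials)
print(plv[burst].mean(), plv[~burst].mean())  # PLV should be higher during the burst
```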
Contributors: Tanner, Justin Cody (Author) / Helms Tillery, Stephen I (Thesis advisor) / Santos, Veronica J (Committee member) / Santello, Marco (Committee member) / Greger, Bradley (Committee member) / Buneo, Christopher A (Committee member) / Arizona State University (Publisher)
Created: 2017