Matching Items (419)
150222-Thumbnail Image.png
Description
An accurate sense of upper limb position is crucial to reaching movements, where sensory information about upper limb position and target location is combined to specify critical features of the movement plan. This dissertation was dedicated to studying the mechanisms by which the brain estimates limb position in space and the consequences of misestimation of limb position on movements. Two independent but related studies were performed. The first involved characterizing the neural mechanisms of limb position estimation in the non-human primate brain. Single-unit recordings were obtained in area 5 of the posterior parietal cortex in order to examine the role of this area in estimating limb position based on visual and somatic signals (proprioceptive, efference copy). When examined individually, many area 5 neurons were tuned to the position of the limb in the workspace, but very few neurons were modulated by visual feedback. At the population level, however, decoding of limb position was somewhat more accurate when visual feedback was provided. These findings support a role for area 5 in limb position estimation but also suggest that visual signals regarding limb position are only weakly represented in this area, and only at the population level. The second part of this dissertation focused on the consequences of misestimation of limb position for movement production. It is well known that limb movements are inherently variable. This variability could be the result of noise arising at one or more stages of movement production. Here we used biomechanical modeling and simulation techniques to characterize movement variability resulting from noise in estimating limb position ('sensing noise') and in planning required movement vectors ('planning noise'), and compared that to the variability expected due to noise in movement execution.
We found that the effects of sensing- and planning-related noise on movement variability depended on both the planned movement direction and the initial configuration of the arm, and differed in many respects from the effects of execution noise.
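The contrast between sensing-stage and execution-stage noise described above can be illustrated with a toy Monte Carlo simulation. The 2-D point-to-point kinematics, the noise magnitudes, and the signal-dependent form of the execution noise are illustrative assumptions, not the dissertation's biomechanical model.

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials = 10_000
start = np.array([0.0, 0.0])
target = np.array([10.0, 0.0])          # cm; movement along +x

# 'Sensing noise': isotropic error in the estimated initial hand position.
# The movement vector is planned from the *estimated* start but executed
# from the true start, so the estimation error maps directly to the endpoint.
sensed_start = start + rng.normal(0.0, 0.5, size=(n_trials, 2))
endpoint_sense = start + (target - sensed_start)

# 'Execution noise': signal-dependent noise (std grows with the commanded
# vector component), a common assumption in motor-control models.
cmd = target - start
exec_sd = 0.08 * np.abs(cmd) + 0.05
endpoint_exec = start + cmd + rng.normal(0.0, 1.0, size=(n_trials, 2)) * exec_sd

cov_sense = np.cov(endpoint_sense.T)    # ~isotropic about the target
cov_exec = np.cov(endpoint_exec.T)      # elongated along the movement
print("sensing-noise cov diag:", np.round(np.diag(cov_sense), 3))
print("execution-noise cov diag:", np.round(np.diag(cov_exec), 3))
```

Even in this minimal sketch, the two noise sources leave different signatures: sensing noise produces endpoint scatter that mirrors the estimation error, while signal-dependent execution noise produces variability elongated along the movement direction.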
ContributorsShi, Ying (Author) / Buneo, Christopher A (Thesis advisor) / Helms Tillery, Stephen (Committee member) / Santello, Marco (Committee member) / He, Jiping (Committee member) / Santos, Veronica (Committee member) / Arizona State University (Publisher)
Created2011
151390-Thumbnail Image.png
Description
Our ability to estimate the position of our body parts in space, a fundamentally proprioceptive process, is crucial for interacting with the environment and for movement control. For proprioception to support these actions, the central nervous system has to rely on a stored internal representation of the body parts in space. However, relatively little is known about this internal representation of arm position. To this end, I developed a method to map proprioceptive estimates of hand location across a 2-D workspace. In this task, I moved each subject's hand to a target location while the subject's eyes were closed. After returning the hand, subjects opened their eyes and verbally reported the location where their fingertip had been. I then reconstructed and analyzed the spatial structure of the pattern of estimation errors. In the first two experiments, I probed the structure and stability of the pattern of errors by manipulating the hand used and the tactile feedback provided when the hand was at each target location. I found that the resulting pattern of errors was systematically stable across conditions for each subject, subject-specific, and not uniform across the workspace. These findings suggest that the observed pattern of errors has been constructed through experience, resulting in a systematically stable internal representation of arm location. Moreover, this representation is continuously being calibrated across the workspace. In the next two experiments, I aimed to probe the calibration of this structure. To this end, I used two different perturbation paradigms: 1) a virtual reality visuomotor adaptation paradigm to induce a local perturbation, and 2) a standard prism adaptation paradigm to induce a global perturbation. I found that the magnitude of the errors significantly increased to a similar extent after each perturbation.
This small effect indicates that proprioception is recalibrated to a similar extent regardless of how the perturbation is introduced, suggesting that sensory and motor changes may be two independent processes arising from the perturbation. Moreover, I propose that the internal representation of arm location might be constructed as a global solution and not be capable of local changes.
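The error-mapping analysis described above can be sketched as follows: given reported fingertip locations for a grid of targets under two conditions, compute the error vector field for each condition and correlate the fields to quantify stability. The target grid, the hypothetical subject-specific bias, and the noise levels below are illustrative assumptions, not the experimental data.

```python
import numpy as np

rng = np.random.default_rng(1)
xs, ys = np.meshgrid(np.linspace(-20, 20, 5), np.linspace(20, 40, 5))
targets = np.column_stack([xs.ravel(), ys.ravel()])        # cm, 25 targets

def subject_bias(t):
    # Hypothetical subject-specific, non-uniform distortion of the workspace.
    return np.column_stack([0.05 * t[:, 0] ** 2 / 10 - 1.0,
                            0.08 * t[:, 1] - 2.0])

# Reports from two conditions share the same systematic bias plus trial noise.
report_a = targets + subject_bias(targets) + rng.normal(0, 0.3, targets.shape)
report_b = targets + subject_bias(targets) + rng.normal(0, 0.3, targets.shape)

err_a = report_a - targets
err_b = report_b - targets
# Correlate the two error fields component-wise: high r -> stable structure.
r = np.corrcoef(err_a.ravel(), err_b.ravel())[0, 1]
print(f"error-field correlation across conditions: r = {r:.2f}")
```

A high correlation between conditions, as constructed here, is the kind of signature consistent with a systematically stable, subject-specific error map.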
ContributorsRincon Gonzalez, Liliana (Author) / Helms Tillery, Stephen I (Thesis advisor) / Buneo, Christopher A (Thesis advisor) / Santello, Marco (Committee member) / Santos, Veronica (Committee member) / Kleim, Jeffrey (Committee member) / Arizona State University (Publisher)
Created2012
151399-Thumbnail Image.png
Description
Millions of Americans live with motor impairments resulting from a stroke, and the best way to administer rehabilitative therapy to achieve recovery is not well understood. Adaptive mixed reality rehabilitation (AMRR) is a novel integration of motion capture technology and high-level media computing that provides precise kinematic measurements and engaging multimodal feedback for self-assessment during a therapeutic task. The AMRR system was evaluated in a small (N=3) cohort of stroke survivors to determine best practices for administering adaptive, media-based therapy. A proof of concept study followed, examining changes in clinical scale and kinematic performances among a group of stroke survivors who received either a month of AMRR therapy (N = 11) or matched dosing of traditional repetitive task therapy (N = 10). Both groups demonstrated statistically significant improvements in Wolf Motor Function Test and upper-extremity Fugl-Meyer Assessment scores, indicating increased function after the therapy. However, only participants who received AMRR therapy showed a consistent improvement in their kinematic measurements, including those measured in the trained reaching task (reaching to grasp a cone) and in an untrained reaching task (reaching to push a lighted button). These results suggest that the AMRR system can be used as a therapy tool to enhance both functionality and the reaching kinematics that quantify movement quality. Additionally, the AMRR concepts are currently being transitioned to a home-based training application. An inexpensive, easy-to-use toolkit of tangible objects has been developed to sense, assess, and provide feedback on hand function during different functional activities. These objects have been shown to accurately and consistently track hand function in people with unimpaired movements and will be tested with stroke survivors in the future.
ContributorsDuff, Margaret Rose (Author) / Rikakis, Thanassis (Thesis advisor) / He, Jiping (Thesis advisor) / Herman, Richard (Committee member) / Kleim, Jeffrey (Committee member) / Santos, Veronica (Committee member) / Towe, Bruce (Committee member) / Arizona State University (Publisher)
Created2012
152011-Thumbnail Image.png
Description
Humans' ability to perform fine object and tool manipulation is a defining feature of their sensorimotor repertoire. How the central nervous system builds and maintains internal representations of such skilled hand-object interactions has attracted significant attention over the past three decades. Nevertheless, two major gaps exist: a) how digit positions and forces are coordinated during natural manipulation tasks, and b) what mechanisms underlie the formation and retention of internal representations of dexterous manipulation. This dissertation addresses these two questions through five experiments that are based on novel grip devices and experimental protocols. It was found that a high-level representation of manipulation tasks can be learned in an effector-independent fashion. Specifically, when challenged by trial-to-trial variability in finger positions or when using digits that were not previously engaged in learning the task, subjects could adjust finger forces to compensate for this variability, thus leading to consistent task performance. The results from a follow-up experiment conducted in a virtual reality environment indicate that haptic feedback is sufficient to implement the above coordination between digit position and forces. However, it was also found that the generalizability of a learned manipulation is limited across tasks. Specifically, when subjects learned to manipulate the same object across different contexts that require different motor output, interference was found at the time of switching contexts. Data from additional studies provide evidence for parallel learning processes, which are characterized by different rates of decay and learning. These experiments have provided important insight into the neural mechanisms underlying learning and control of object manipulation. The present findings have potential biomedical applications including brain-machine interfaces, rehabilitation of hand function, and prosthetics.
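The "parallel learning processes characterized by different rates of decay and learning" mentioned above are often formalized as a two-state adaptation model (fast and slow processes, in the spirit of Smith et al., 2006). The rate constants and the constant-perturbation schedule below are illustrative assumptions, not values fit to these experiments.

```python
import numpy as np

A_f, B_f = 0.60, 0.30    # fast process: forgets quickly, learns quickly
A_s, B_s = 0.99, 0.05    # slow process: retains well, learns slowly

n_trials = 200
perturbation = np.ones(n_trials)      # constant context demanding adaptation
x_f = x_s = 0.0
net = np.zeros(n_trials)
for t in range(n_trials):
    # Both processes learn from the same error but retain and update
    # at different rates; net output is their sum.
    error = perturbation[t] - (x_f + x_s)
    x_f = A_f * x_f + B_f * error
    x_s = A_s * x_s + B_s * error
    net[t] = x_f + x_s

print(f"net adaptation after {n_trials} trials: {net[-1]:.2f}")
```

In such models the fast process dominates early adaptation and the slow process carries long-term retention, which is one way interference at context switches and differing decay rates can coexist.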
ContributorsFu, Qiushi (Author) / Santello, Marco (Thesis advisor) / Helms Tillery, Stephen (Committee member) / Buneo, Christopher (Committee member) / Santos, Veronica (Committee member) / Artemiadis, Panagiotis (Committee member) / Arizona State University (Publisher)
Created2013
Description
Intracortical microstimulation (ICMS) within somatosensory cortex can produce artificial sensations including touch, pressure, and vibration. There is significant interest in using ICMS to provide sensory feedback for a prosthetic limb. In such a system, information recorded from sensors on the prosthetic would be translated into electrical stimulation and delivered directly to the brain, providing feedback about features of objects in contact with the prosthetic. To achieve this goal, multiple simultaneous streams of information will need to be encoded by ICMS in a manner that produces robust, reliable, and discriminable sensations. The first segment of this work focuses on the discriminability of sensations elicited by ICMS within somatosensory cortex. Stimulation on multiple single electrodes and near-simultaneous stimulation across multiple electrodes, driven by a multimodal tactile sensor, were both used in these experiments. A SynTouch BioTac sensor was moved across a flat surface in several directions, and a subset of the sensor's electrode impedance channels were used to drive multichannel ICMS in the somatosensory cortex of a non-human primate. The animal performed a behavioral task during this stimulation to indicate the discriminability of sensations evoked by the electrical stimulation. The animal's responses to ICMS were somewhat inconsistent across experimental sessions but indicated that discriminable sensations were evoked by both single and multichannel ICMS. The factors that affect the discriminability of stimulation-induced sensations are not well understood, in part because the relationship between ICMS and the neural activity it induces is poorly defined. The second component of this work was to develop computational models that describe the populations of neurons likely to be activated by ICMS. Models of several neurons were constructed, and their responses to ICMS were calculated. 
A three-dimensional cortical model was constructed using these cell models and used to identify the populations of neurons likely to be recruited by ICMS. Stimulation activated neurons in a sparse and discontinuous fashion; additionally, the type, number, and location of neurons likely to be activated by stimulation varied with electrode depth.
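A first-order baseline for which neurons ICMS recruits is the classic current-distance relationship, in which threshold current grows with the square of the electrode-to-soma distance (I_th = k * r^2; Stoney et al., 1968). The sparse, discontinuous recruitment found above contrasts with this simple spherical picture. The excitability constant and the stimulation amplitudes below are illustrative; real values vary widely across cell types and stimulation parameters.

```python
import math

# Excitability constant reported by Stoney et al. (1968) for cortical
# pyramidal cells; treated here as an illustrative round number.
K = 1292.0   # uA/mm^2

def activation_radius_um(current_ua, k=K):
    """Distance (um) within which somata reach threshold under I_th = k * r^2."""
    return math.sqrt(current_ua / k) * 1000.0

for amp in (10, 50, 100):            # typical ICMS amplitudes, uA
    print(f"{amp:>4} uA -> activation radius ~ {activation_radius_um(amp):.0f} um")
```

The square-root scaling means that increasing stimulation amplitude expands the nominal activation sphere only slowly, one reason multichannel and amplitude-coded ICMS schemes are of interest for sensory feedback.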
ContributorsOverstreet, Cynthia K (Author) / Helms Tillery, Stephen I (Thesis advisor) / Santos, Veronica (Committee member) / Buneo, Christopher (Committee member) / Otto, Kevin (Committee member) / Santello, Marco (Committee member) / Arizona State University (Publisher)
Created2013
150499-Thumbnail Image.png
Description
The ability to plan, execute, and control goal-oriented reaching and grasping movements is among the most essential functions of the brain. Yet, these movements are inherently variable; a result of the noise pervading the neural signals underlying sensorimotor processing. The specific influences and interactions of these noise processes remain unclear. Thus, several studies were performed to elucidate the role and influence of sensorimotor noise on movement variability. The first study focuses on sensory integration and movement planning across the reaching workspace. An experiment was designed to examine the relative contributions of vision and proprioception to movement planning by measuring the rotation of the initial movement direction induced by a perturbation of the visual feedback prior to movement onset. The results suggest that the contribution of vision was relatively consistent across the evaluated workspace depths; however, the influence of vision differed between the vertical and lateral axes, indicating that additional factors beyond vision and proprioception influence the planning of 3-dimensional movements. While the first study investigated the role of noise in sensorimotor integration, the second and third studies investigated the relative influence of sensorimotor noise on reaching performance. Specifically, they evaluate how the characteristics of neural processing that underlie movement planning and execution manifest in movement variability during natural reaching. Subjects performed reaching movements with and without visual feedback throughout the movement, and the patterns of endpoint variability were compared across movement directions. The results of these studies suggest a primary role of visual feedback noise in shaping patterns of variability and in determining the relative influence of planning and execution related noise sources.
The final work considers a computational approach to characterizing how sensorimotor processes interact to shape movement variability. A model of multi-modal feedback control was developed to simulate the interaction of planning and execution noise on reaching variability. The model predictions suggest that anisotropic properties of feedback noise significantly affect the relative influence of planning and execution noise on patterns of reaching variability.
ContributorsApker, Gregory Allen (Author) / Buneo, Christopher A (Thesis advisor) / Helms Tillery, Stephen (Committee member) / Santello, Marco (Committee member) / Santos, Veronica (Committee member) / Si, Jennie (Committee member) / Arizona State University (Publisher)
Created2012
136991-Thumbnail Image.png
Description
The ideal function of an upper limb prosthesis is to replace the human hand and arm, but a gulf in functionality between prostheses and biological arms still exists, in large part due to the absence of the sense of touch. Tactile sensing of the human hand comprises a key component of a wide variety of interactions with the external environment; visual feedback alone is not always sufficient for the recreation of nuanced tasks. It is hoped that the results of this study can contribute to the advancement of prosthetics with a tactile feedback loop, with the ultimate goal of replacing biological function. A three-fingered robot hand equipped with tactile sensing fingertips was used to biomimetically grasp a ball in order to haptically explore the environment in a ball-in-hole task. The sensorized fingertips were used to measure the vibration, pressure, and skin deformation experienced by each fingertip. Vibration and pressure sensed by the fingertips were good indicators of changes in discrete phases of the exploratory motion, such as contact with the lip of a hole. The most informative tactile cue was the skin deformation of the fingers. Upon encountering the lip of the test surface, the lagging digit experienced compression in the fingertip and radial distal region of the digit. The middle digit experienced decompression of the middle region of the finger, and the lagging digit showed compression towards the middle digit and decompression in the distal-ulnar region. Larger holes caused an increase in pressure experienced by the fingertips, while changes in stroke speed showed no effect on tactile data. Larger coefficients of friction between the ball and the test surface led to an increase in pressure and skin deformation of the finger.
Unlike most tactile sensing studies that focus on tactile stimuli generated by direct contact between a fingertip and the environment, this preliminary study focused on tactile stimuli generated when a grasped object interacts with the environment. Findings from this study could be used to design experiments for functionally similar activities of daily living, such as the haptic search for a keyhole via a grasped key.
ContributorsLoges, Shea Remegio (Author) / Santos, Veronica (Thesis director) / Artemiadis, Panagiotis (Committee member) / Barrett, The Honors College (Contributor) / Mechanical and Aerospace Engineering Program (Contributor)
Created2014-05
137106-Thumbnail Image.png
Description
The goal of this project was to use the sense of touch to investigate tactile cues during multidigit rotational manipulations of objects. A robotic arm and hand equipped with three multimodal tactile sensors were used to gather data about skin deformation during rotation of a haptic knob. Three different rotation speeds and two levels of rotation resistance were used to investigate tactile cues during knob rotation. In the future, this multidigit task can be generalized to similar rotational tasks, such as opening a bottle or turning a doorknob.
ContributorsChalla, Santhi Priya (Author) / Santos, Veronica (Thesis director) / Helms Tillery, Stephen (Committee member) / Barrett, The Honors College (Contributor) / Mechanical and Aerospace Engineering Program (Contributor) / School of Earth and Space Exploration (Contributor)
Created2014-05
137748-Thumbnail Image.png
Description
I worked on the human-machine interface to improve human physical capability. This work was done in the Human Oriented Robotics and Control Lab (HORC) towards the creation of an advanced, EMG-controlled exoskeleton. The project was new, and any work on the human-machine interface requires the physical interface itself, so I designed and fabricated a human-robot coupling device with a novel safety feature. The validation testing of this coupling proved very successful, and the device was granted a provisional patent as well as published to facilitate its spread to other human-machine interface applications, where it could be of major benefit. I then employed this coupling in experimentation towards understanding impedance, with the end goal being the creation of an EMG-based impedance exoskeleton control system. I modified a previously established robot-to-human perturbation method for use in my novel, three-dimensional (3D) impedance measurement experiment. Upon execution of this experiment, I was able to successfully characterize passive, static human arm stiffness in 3D, and in doing so validated the aforementioned method. This establishes an important foundation for promising future work on understanding impedance and the creation of the proposed control scheme, thereby furthering the field of human-robot interaction.
ContributorsO'Neill, Gerald D. (Author) / Artemiadis, Panagiotis (Thesis director) / Santello, Marco (Committee member) / Santos, Veronica (Committee member) / Barrett, The Honors College (Contributor) / Mechanical and Aerospace Engineering Program (Contributor)
Created2013-05
137299-Thumbnail Image.png
Description
This thesis focused on grasping tasks with the goal of investigating, analyzing, and quantifying human catching trends by way of a mathematical model. The aim of this project was to study human trends in a dynamic grasping task (catching a rolling ball), relate those discovered trends to kinematic characteristics of the object, and use this relation to control a robot hand in real time. As an ultimate goal, it was hoped that this research will aid in furthering bio-inspired robot control methods. To achieve the above goal, a tactile sensing glove was first developed. This instrument allowed for in-depth study of human reactive grasping movements when worn by subjects during experimentation. This sensing glove system recorded force data from the palm and motion data from four fingers. From these data sets, temporal trends were established relating to when subjects initiated grasping during each trial. Moreover, optical tracking was implemented to study the kinematics of the moving object during human experiments and also to close the loop during the control of the robot hand. Ultimately, a mathematical bio-inspired model was created. This was embodied in a two-term decreasing power function which related the temporal trend of wait time to the ball's initial acceleration. The wait time is defined as the time between when the experimental conductor releases the ball and when the subject begins to initiate grasping by closing their fingers, over a distance of four feet. The initial acceleration is the first acceleration value of the object due to the force provided when the conductor throws the object. The distance over which the ball was thrown was incorporated into the model. This is discussed in depth within the thesis. Overall, the results presented here show promise for bio-inspired control schemes in the successful application of robotic devices.
This control methodology will ideally be developed to move robotic prosthesis past discrete tasks and into more complicated activities.
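A two-term decreasing power model of the kind described above can be fit with ordinary least squares: once the exponent is fixed, the model is linear in its coefficients, so a grid search over the exponent suffices. The functional form w(a0) = c1 * a0**(-p) + c2, the synthetic data, and all coefficients here are illustrative assumptions, not the thesis's fitted model.

```python
import numpy as np

def wait_time(a0, c1, p, c2):
    """Hypothetical model: wait time (s) decreasing with initial acceleration."""
    return c1 * a0 ** (-p) + c2

# Synthetic 'measurements' standing in for the experimental wait times.
rng = np.random.default_rng(3)
accel = np.linspace(0.5, 8.0, 40)                     # m/s^2, hypothetical range
observed = wait_time(accel, 0.9, 0.7, 0.15) + rng.normal(0, 0.02, accel.size)

# Sweep the exponent p; for each p, solve the linear subproblem in (c1, c2).
best = None
for p in np.linspace(0.1, 2.0, 96):
    X = np.column_stack([accel ** (-p), np.ones_like(accel)])
    coef, *_ = np.linalg.lstsq(X, observed, rcond=None)
    sse = float(np.sum((X @ coef - observed) ** 2))
    if best is None or sse < best[0]:
        best = (sse, p, coef)

sse, p_hat, (c1_hat, c2_hat) = best
print(f"fit: c1 = {c1_hat:.2f}, p = {p_hat:.2f}, c2 = {c2_hat:.2f}")
```

The decreasing power form captures the intuition that faster-approaching balls leave less time to wait before closing the fingers, with the constant term as a floor on reaction behavior.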
ContributorsCard, Dillon (Co-author) / Mincieli, Jennifer (Co-author) / Artemiadis, Panagiotis (Thesis director) / Santos, Veronica (Committee member) / Middleton, James (Committee member) / Barrett, The Honors College (Contributor) / School of Sustainability (Contributor) / Mechanical and Aerospace Engineering Program (Contributor) / W. P. Carey School of Business (Contributor)
Created2014-05