Matching Items (8)


Can Startle Elicit Sequential Movements in Highly Trained Individuals?

Description

Most daily living tasks consist of a series of sequential movements, e.g., reaching for a cup, grasping it, and lifting it to your mouth. The process by which we control and mediate the smooth progression of these tasks is not well understood. One method we can use to further evaluate these motions is known as startle-evoked movement (SEM). SEM is an established technique for probing motor learning and planning processes by detecting activation of the sternocleidomastoid muscles of the neck within 120 ms after a startling stimulus is presented. If activation of these muscles is detected within that 120 ms window, the trial is classified as Startle+; if not, the trial is classified as Startle-. For a movement to be considered SEM, movement onset in Startle+ trials must be faster than in Startle- trials. The objective of this study was to evaluate the effect of expertise on sequential movements and to determine whether startle can distinguish when the consolidation of actions, known as chunking, has occurred. We hypothesized that SEM could distinguish words that were solidified, or chunked; specifically, SEM would be present when expert typists were asked to type a common word but not an uncommon letter combination. The results indicated that the only word susceptible to SEM, where Startle+ trials were initiated faster than Startle-, was the uncommon combination "HET," while the common words "AND" and "THE" were not. Additionally, evaluation of the timing differences between keystrokes for common and uncommon words showed that startle was unable to distinguish differences in motor chunking between Startle+ and Startle- trials. These results may be explained by hand dominance in expert typists: little research has evaluated the susceptibility of the non-dominant hand's fingers to SEM, and future studies into this question, together with the present results, can advance our understanding of sequential movements.
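The Startle+/Startle- rule described above can be sketched as a simple classifier. This is an illustrative reconstruction of the stated criteria, not the study's analysis code; the function names and data layout are assumptions.

```python
# Hypothetical sketch of the SEM classification rule: a trial is Startle+
# when sternocleidomastoid (SCM) EMG activation occurs within 120 ms of the
# startling stimulus, and a movement counts as SEM when Startle+ trials are
# initiated faster than Startle- trials.

STARTLE_WINDOW_MS = 120  # SCM activation window after the stimulus

def classify_trial(scm_onset_ms):
    """Label a trial from the SCM EMG onset time, measured in ms after
    the startling stimulus (None if no activation was detected)."""
    if scm_onset_ms is not None and scm_onset_ms <= STARTLE_WINDOW_MS:
        return "Startle+"
    return "Startle-"

def is_sem(startle_plus_onsets_ms, startle_minus_onsets_ms):
    """A movement is considered SEM when the mean movement-onset latency
    of Startle+ trials is shorter than that of Startle- trials."""
    mean = lambda xs: sum(xs) / len(xs)
    return mean(startle_plus_onsets_ms) < mean(startle_minus_onsets_ms)
```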

Date Created
2018-05

Startle can evoke individuated movements of the fingers; implications for neural control

Description

Startle-evoked movement (SEM), the involuntary release of a planned movement via a startling stimulus, has gained significant attention recently for its ability to probe motor planning as well as enhance movement of the upper extremity following stroke. We recently showed that hand movements are susceptible to SEM. Interestingly, only coordinated movements of the hand (grasp), but not individuated movements of the fingers (finger abduction), were susceptible. It was suggested that this resulted from different neural mechanisms involved in each task; however, it is possible this was the result of task familiarity. The objective of this study was to evaluate a more familiar individuated finger movement, typing, to determine if this task was susceptible to SEM. We hypothesized that typing movements would be susceptible to SEM in all fingers. Our results indicate that individuated movements of the fingers are susceptible to SEM when they are part of a more familiar task, since the electromyogram (EMG) latency is faster in SCM+ trials than in SCM- trials. However, the middle finger does not show a difference in terms of the keystroke voltage signal, suggesting it is less susceptible to SEM. Given that SEM is thought to be mediated by the brainstem, specifically the reticulospinal tract, this suggests that the brainstem may play a role in movements of the distal limb when those movements are very familiar, and that the independence of each finger may also have a significant effect on SEM. Future research includes understanding SEM in the fingers of the stroke population. The implications of this research can impact the way upper extremity rehabilitation is delivered.

Date Created
2016-12

Active and passive precision grip responses to unexpected perturbations

Description

The development of advanced, anthropomorphic artificial hands aims to provide upper extremity amputees with improved functionality for activities of daily living. However, many state-of-the-art hands have a large number of degrees of freedom that can be challenging to control in an intuitive manner. Automated grip responses could be built into artificial hands in order to enhance grasp stability and reduce the cognitive burden on the user. To this end, three studies were conducted to understand how human hands respond, passively and actively, to unexpected perturbations of a grasped object along and about different axes relative to the hand. The first study investigated the effect of magnitude, direction, and axis of rotation on precision grip responses to unexpected rotational perturbations of a grasped object. A robust "catch-up response" (a rapid, pulse-like increase in grip force rate previously reported only for translational perturbations) was observed whose strength varied with the axis of rotation. Using two haptic robots, we then investigated the effects of grip surface friction, axis, and direction of perturbation on precision grip responses for unexpected translational and rotational perturbations for three different hand-centric axes. A robust catch-up response was observed for all axes and directions for both translational and rotational perturbations. Grip surface friction had no effect on the stereotypical catch-up response. Finally, we characterized the passive properties of the precision grip-object system via robot-imposed impulse perturbations. The hand-centric axis associated with the greatest translational stiffness was different from that for rotational stiffness. This work expands our understanding of the passive and active features of precision grip, a hallmark of human dexterous manipulation. Biological insights such as these could be used to enhance the functionality of artificial hands and the quality of life for upper extremity amputees.
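The "catch-up response" described above, a pulse-like increase in grip force rate, could be detected in an artificial hand roughly as follows. This is a minimal sketch under assumed values; the sampling rate and threshold are illustrative, not values from the studies.

```python
# Illustrative detection of a catch-up-response-like event: flag the first
# sample where the grip force rate (N/s) exceeds a pulse threshold.
# fs_hz and rate_threshold are assumed placeholder values.

def grip_force_rate(force_n, fs_hz):
    """First-difference estimate of grip force rate in N/s."""
    return [(b - a) * fs_hz for a, b in zip(force_n, force_n[1:])]

def detect_catch_up(force_n, fs_hz=1000, rate_threshold=40.0):
    """Return the index of the first sample whose grip force rate exceeds
    the pulse threshold, or None if no catch-up-like event occurs."""
    for i, rate in enumerate(grip_force_rate(force_n, fs_hz)):
        if rate > rate_threshold:
            return i
    return None
```

In an automated grip controller, a detection like this could trigger a reflexive grip force increase to stabilize the object before the user reacts.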

Date Created
2013

Human-robot cooperation: communication and leader-follower dynamics

Description

As robotic systems are used in increasingly diverse applications, the interaction of humans and robots has become an important area of research. In many applications of physical human-robot interaction (pHRI), the robot and the human can be seen as cooperating to complete a task with some object of interest. Often these applications are in unstructured environments where many paths can accomplish the goal. This creates a need to communicate a preferred direction of motion between both participants in order to move in a coordinated way. This communication should be bidirectional to fully utilize both the robot's and the human's capabilities. Moreover, in cooperative tasks between two humans, one human often operates as the leader of the task and the other as the follower, and these roles may switch during the task as needed. The need for communication extends to this leader-follower switching: not only must the desire to switch roles be communicated, but the switching process itself must be controlled. Impedance control has been used as a way of dealing with some of the complexities of pHRI. This investigation examined whether impedance control can be used to communicate a preferred direction between humans and robots. The first set of experiments tested whether a human could detect the preferred direction of a robot by grasping and moving an object coupled to the robot. The second set tested the reverse: whether the robot could detect the preferred direction of the human. The ability to detect the preferred direction was shown to be up to 99% effective. Using these results, a control method allowing a human and robot to switch leader and follower roles during a cooperative task was implemented and tested. This method proved successful 84% of the time. The control method was then refined using adaptive control, resulting in lower interaction forces and a success rate of 95%.
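One common way impedance control can signal a preferred direction, sketched below, is to render low stiffness along the preferred axis and high stiffness orthogonal to it, so the object moves easily only one way. This is an assumed minimal 2-D example, not the thesis's actual controller; all gains are placeholders.

```python
# Minimal anisotropic impedance law F = -K(x) - B*v, where stiffness K is
# low along the preferred direction and high across it. Gains k_along,
# k_across, and b are illustrative values.

import math

def impedance_force(pos, vel, preferred_dir,
                    k_along=50.0, k_across=800.0, b=10.0):
    """Return the 2-D restoring force (fx, fy) for position error `pos`
    and velocity `vel`, given a unit-normalizable preferred direction."""
    ux, uy = preferred_dir
    n = math.hypot(ux, uy)
    ux, uy = ux / n, uy / n
    # decompose position error into along-axis and across-axis components
    along = pos[0] * ux + pos[1] * uy
    across_x = pos[0] - along * ux
    across_y = pos[1] - along * uy
    fx = -k_along * along * ux - k_across * across_x - b * vel[0]
    fy = -k_along * along * uy - k_across * across_y - b * vel[1]
    return fx, fy
```

A human pushing the coupled object would feel much higher resistance across the preferred axis than along it, which is the cue the experiments rely on.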

Date Created
2014

Multi-directional slip detection between artificial fingers and a grasped object

Description

Effective tactile sensing in prosthetic and robotic hands is crucial for improving the functionality of such hands and enhancing the user's experience. Thus, improving the range of tactile sensing capabilities is essential for developing versatile artificial hands. Multimodal tactile sensors called BioTacs, which include a hydrophone and a force electrode array, were used to understand how grip force, contact angle, object texture, and slip direction may be encoded in the sensor data. Findings show that slip induced under conditions of high contact angles and grip forces resulted in significant changes in both AC and DC pressure magnitude and rate of change in pressure. Slip induced under conditions of low contact angles and grip forces resulted in significant changes in the rate of change in electrode impedance. Slip in the distal direction of a precision grip caused significant changes in pressure magnitude and rate of change in pressure, while slip in the radial direction of the wrist caused significant changes in the rate of change in electrode impedance. A strong relationship was established between slip direction and the rate of change in ratios of electrode impedance for radial and ulnar slip relative to the wrist. Consequently, establishing multiple thresholds or establishing a multivariate model may be a useful method for detecting and characterizing slip. Detecting slip for low contact angles could be done by monitoring electrode data, while detecting slip for high contact angles could be done by monitoring pressure data. Predicting slip in the distal direction could be done by monitoring pressure data, while predicting slip in the radial and ulnar directions could be done by monitoring electrode data.
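The multi-threshold scheme suggested above can be sketched as a simple decision rule: monitor pressure signals for high contact angles, and electrode impedance for low contact angles. The thresholds, cutoff angle, and signal names below are illustrative assumptions, not BioTac calibration values.

```python
# Hypothetical multi-threshold slip detector: choose which signal's rate
# of change to monitor based on the contact condition, per the findings
# summarized above. All numeric thresholds are placeholders.

def detect_slip(contact_angle_deg, dP_dt, dZ_dt,
                angle_cutoff=30.0, p_thresh=5.0, z_thresh=2.0):
    """Return True when the rate of change of the condition-appropriate
    signal (pressure dP_dt or electrode impedance dZ_dt) crosses its
    threshold."""
    if contact_angle_deg >= angle_cutoff:
        # high contact angle: slip shows up in the pressure data
        return abs(dP_dt) > p_thresh
    # low contact angle: slip shows up in the electrode impedance data
    return abs(dZ_dt) > z_thresh
```

A multivariate model fit to both signals, as the abstract proposes, would generalize this rule beyond a hard cutoff angle.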

Date Created
2012

Startle-evoked movement in multi-jointed, two-dimensional reaching tasks

Description

Previous research has shown that a loud acoustic stimulus can trigger an individual's prepared movement plan. This movement response is referred to as a startle-evoked movement (SEM). SEM has been observed in the stroke survivor population, where results have shown that SEM enhances single-joint movements that are usually performed with difficulty. While the presence of SEM in the stroke survivor population advances scientific understanding of movement capabilities following a stroke, published studies using the SEM phenomenon have examined only one joint. The ability of SEM to generate multi-jointed movements is understudied, which limits SEM as a potential therapy tool. To apply SEM as a therapy tool, however, the biomechanics of the arm in multi-jointed movement planning and execution must be better understood. Thus, the objective of our study was to evaluate whether SEM could elicit multi-joint reaching movements that were accurate in an unrestrained, two-dimensional workspace. Data were collected from ten subjects with no previous neck, arm, or brain injury. Each subject performed a reaching task to five targets equally spaced in a semicircle to create a two-dimensional workspace. The subject reached to each target following a sequence of two non-startling acoustic cues: "Get Ready" and "Go". A loud acoustic stimulus was randomly substituted for the "Go" cue. We hypothesized that SEM is accessible and accurate for unrestricted multi-jointed reaching tasks in a functional workspace and is therefore independent of movement direction. Our results show that SEM is possible in all five target directions, and that the probability of evoking SEM and the movement kinematics (i.e., total movement time, linear deviation, average velocity) to each target are not statistically different. Thus, we conclude that SEM is possible in a functional workspace and does not depend on where arm stability is maximized. Moreover, coordinated preparation and storage of a multi-jointed movement is indeed possible.

Date Created
2016-12

Human-Robot Interaction Utilizing Asymmetric Cooperation and the Brain

Description

The interaction between humans and robots has become an important area of research as the diversity of robotic applications has grown. The cooperation of a human and robot to achieve a goal is an important area within the physical human-robot interaction (pHRI) field, which is expanding toward robotic applications in unstructured environments. When humans cooperate with each other, there are often leader and follower roles, and these roles may change during the task. This creates a need for the robotic system to be able to exchange roles with the human during a cooperative task. The unstructured nature of the new applications also requires robotic systems to interact in six degrees of freedom (DOF). Moreover, in these unstructured environments, the robotic system will have incomplete information; it will sometimes perform an incorrect action, and control methods must be able to correct for this. However, the most compelling applications for robotics are those where the robot has capabilities the human does not, which creates the complementary need for robotic systems to correct human action when they detect an error. Activity in the brain precedes human action, and this activity can be used to classify the type of interaction the human desires. This dissertation improves the cooperation between humans and robots in two main areas. First, the ability of electroencephalography (EEG) to determine the desired cooperation role of a human is demonstrated, with a correct classification rate of 65%. Second, a robotic controller is developed to allow the human and robot to cooperate in six DOF with asymmetric role exchange; this system allowed the human and robot to perform a cooperative task with a 100% success rate. High, medium, and low levels of robotic automation are shown to affect performance, with the human making the greatest number of errors when the robotic system has a medium level of automation.

Date Created
2017

Characterization of 2D Human Ankle Stiffness during Postural Balance and Walking for Robot-aided Ankle Rehabilitation

Description

The human ankle is a vital joint in the lower limb of the human body. As the point of interaction between the human neuromuscular system and the physical world, the ankle plays an important role in lower extremity functions including postural balance and locomotion. Accurate characterization of ankle mechanics in lower extremity function is essential not only to advance the design and control of robots physically interacting with the human lower extremities but also to rehabilitate humans suffering from neurodegenerative disorders.

In order to characterize ankle mechanics and understand the underlying mechanisms that influence the neuromuscular properties of the ankle, a novel multi-axial robotic platform was developed. The robotic platform is capable of simulating various haptic environments and transiently perturbing the ankle to analyze its neuromechanics, specifically ankle impedance. Humans modulate ankle impedance to perform various tasks of the lower limb. The robotic platform was used to analyze the modulation of ankle impedance during postural balance and locomotion in various haptic environments, and various factors that influence this modulation were identified. Using the factors identified in these environment-dependent impedance modulation studies, the quantitative relationship between the factors, namely the muscle activation of the major ankle muscles, the weight loading on the ankle, and the torque generated at the ankle, was analyzed during postural balance and locomotion. A universal neuromuscular model of the ankle that quantitatively relates ankle stiffness, the major component of ankle impedance, to these factors was developed.
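A model of the kind described, relating ankle stiffness to muscle activation, weight loading, and torque, might take a form like the toy linear sketch below. The functional form, muscle choices, and all coefficients are assumptions for illustration, not the fitted model from this work.

```python
# Toy linear neuromuscular stiffness model: stiffness as a weighted
# combination of normalized EMG from two major ankle muscles (tibialis
# anterior, soleus), body-weight loading fraction, and net ankle torque.
# Every coefficient here is a placeholder, not a fitted value.

def ankle_stiffness(emg_ta, emg_sol, load_frac, torque_nm,
                    k0=10.0, w_emg=60.0, w_load=25.0, w_torque=1.5):
    """Estimated ankle stiffness (N*m/rad) from normalized EMG (0-1),
    weight-loading fraction (0-1), and ankle torque (N*m)."""
    return (k0
            + w_emg * (emg_ta + emg_sol)
            + w_load * load_frac
            + w_torque * abs(torque_nm))
```

In practice such coefficients would be identified by regressing robot-measured stiffness against the recorded EMG, loading, and torque across tasks.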

This neuromuscular model is then used as a basis to study the alterations in ankle behavior caused by neurodegenerative disorders such as multiple sclerosis and stroke. Pilot studies were performed to validate the analysis of altered ankle behavior and demonstrate the effectiveness of robotic rehabilitation protocols in addressing it. The pilot studies demonstrate that altered ankle mechanics can be quantified in the affected populations and correlate with the observed adverse effects of the disability. Further, robotic rehabilitation protocols improve ankle control in affected populations, as seen through functional improvements in postural balance and locomotion, validating the neuromuscular approach to rehabilitation.

Date Created
2020