Design of a Collapsible Instrument for Studying Grasp of Breakable Objects

Description
Research on human grasp typically involves the grasp of objects designed for the study of fingertip forces. Instrumented objects for such studies have often been designed for the simulation of functional tasks, such as feeding oneself, or for rigidity such that the objects do not deform when grasped. The goal of this thesis was to design a collapsible, instrumented object to study grasp of breakable objects. Such an object would enable experiments on human grip responses to unexpected finger-object events as well as anticipatory mechanisms once object fragility has been observed. The collapsible object was designed to be modular to allow for properties such as friction and breaking force to be altered. The instrumented object could be used to study both human and artificial grasp.
Date Created
2012-05

A Study of 3D Human Arm Impedance Towards the Development of an EMG-controlled Exoskeleton

Description
I worked on the human-machine interface to improve human physical capability. This work was done in the Human Oriented Robotics and Control Lab (HORC) toward the creation of an advanced, EMG-controlled exoskeleton. Because the project was new, any work on the human-machine interface first required the physical interface itself, so I designed and fabricated a human-robot coupling device with a novel safety feature. Validation testing of this coupling proved very successful, and the device was granted a provisional patent and published to facilitate its adoption in other human-machine interface applications, where it could be of major benefit. I then employed this coupling in experiments toward understanding impedance, with the end goal of creating an EMG-based impedance exoskeleton control system. I modified a previously established robot-to-human perturbation method for use in my novel, three-dimensional (3D) impedance measurement experiment. In executing this experiment, I successfully characterized passive, static human arm stiffness in 3D, and in doing so validated the aforementioned method. This establishes an important foundation for promising future work on understanding impedance and creating the proposed control scheme, thereby furthering the field of human-robot interaction.
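The stiffness characterization described above can be illustrated with a small numerical sketch. Assuming the perturbation method amounts to applying small endpoint displacements and recording the restoring forces (the model f = -K dx), the 3D stiffness matrix K can be recovered by least squares. All values below are hypothetical, not the thesis's measured data.

```python
import numpy as np

# Sketch of estimating a 3D static stiffness matrix from perturbation
# data. Model: restoring force f = -K @ dx for small displacements dx.

rng = np.random.default_rng(0)

# Hypothetical ground-truth stiffness matrix (N/m), roughly arm-like.
K_true = np.array([[300.0,  40.0,  10.0],
                   [ 40.0, 250.0,  20.0],
                   [ 10.0,  20.0, 150.0]])

# Simulated perturbations: small displacements (m) and measured
# restoring forces (N) at the endpoint, here noiseless.
dX = rng.normal(scale=0.01, size=(200, 3))   # 200 trials x 3 axes
F = -dX @ K_true.T

# Least-squares estimate: solve F ≈ -dX @ K.T for K.
K_est = -np.linalg.lstsq(dX, F, rcond=None)[0].T
print(np.round(K_est, 1))
```

On noiseless data this recovers K exactly; with real, noisy force data the same least-squares fit gives the best linear estimate.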
Date Created
2013-05

Bio-Inspired Control for Robot Hand Catching and Grasping

Description
This thesis focused on grasping tasks with the goal of investigating, analyzing, and quantifying human catching trends by way of a mathematical model. The aim of this project was to study human trends in a dynamic grasping task (catching a rolling ball), relate those discovered trends to kinematic characteristics of the object, and use this relation to control a robot hand in real time. As an ultimate goal, it was hoped that this research would aid in furthering the bio-inspiration in robot control methods. To achieve this goal, a tactile sensing glove was first developed. This instrument allowed for in-depth study of human reactionary grasping movements when worn by subjects during experimentation. The sensing glove system recorded force data from the palm and motion data from four fingers. From these data sets, temporal trends were established relating to when subjects initiated grasping during each trial. Moreover, optical tracking was implemented to study the kinematics of the moving object during human experiments and to close the loop during control of the robot hand. Ultimately, a mathematical bio-inspired model was created. This was embodied in a two-term decreasing power function relating the temporal trend of wait time to the ball's initial acceleration. The wait time is defined as the time between when the experimental conductor releases the ball and when the subject begins to initiate grasping by closing their fingers, over a distance of four feet. The initial acceleration is the first acceleration value of the object due to the force provided when the conductor throws the object. The distance over which the ball was thrown was incorporated into the model and is discussed in depth within the thesis. Overall, the results presented here show promise for bio-inspired control schemes in the successful application of robotic devices.
This control methodology will ideally be developed to move robotic prostheses past discrete tasks and into more complicated activities.
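A decreasing power-law relation of the kind described above can be fit in log-log space. The functional form t = a * accel^(-b) used here is one plausible reading of a "decreasing power function"; the coefficients and data points are synthetic, not the thesis's fitted values.

```python
import numpy as np

# Illustrative fit of a decreasing power law relating a ball's initial
# acceleration to the catcher's wait time. All numbers are synthetic.

accel = np.array([2.0, 4.0, 6.0, 8.0, 10.0])   # m/s^2 (synthetic)
a_true, b_true = 1.5, 0.6
wait = a_true * accel ** (-b_true)             # synthetic wait times (s)

# A power law is linear in log-log space: log t = log a - b * log x.
slope, intercept = np.polyfit(np.log(accel), np.log(wait), 1)
a_fit, b_fit = np.exp(intercept), -slope
print(a_fit, b_fit)   # recovers 1.5 and 0.6 on noiseless data
```

With noisy experimental data, the same log-log regression gives least-squares estimates of the power-law parameters.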
Date Created
2014-05

Investigation of Multimodal Tactile Cues for Multidigit Rotational Tasks

Description
The goal of this project was to use the sense of touch to investigate tactile cues during multidigit rotational manipulations of objects. A robotic arm and hand equipped with three multimodal tactile sensors were used to gather data about skin deformation during rotation of a haptic knob. Three different rotation speeds and two levels of rotation resistance were used to investigate tactile cues during knob rotation. In the future, this multidigit task can be generalized to similar rotational tasks, such as opening a bottle or turning a doorknob.
Date Created
2014-05

Multidigit Tactile Exploration of Environment through an Object

Description
The ideal function of an upper limb prosthesis is to replace the human hand and arm, but a gulf in functionality between prostheses and biological arms still exists, in large part due to the absence of the sense of touch. Tactile sensing by the human hand is a key component of a wide variety of interactions with the external environment; visual feedback alone is not always sufficient for the recreation of nuanced tasks. It is hoped that the results of this study can contribute to the advancement of prosthetics with a tactile feedback loop, with the ultimate goal of replacing biological function. A three-fingered robot hand equipped with tactile sensing fingertips was used to biomimetically grasp a ball in order to haptically explore the environment in a ball-in-hole task. The sensorized fingertips were used to measure the vibration, pressure, and skin deformation experienced by each fingertip. Vibration and pressure sensed by the fingertips were good indicators of changes in discrete phases of the exploratory motion, such as contact with the lip of a hole. The most informative tactile cue was the skin deformation of the fingers. Upon encountering the lip of the test surface, the lagging digit experienced compression in the fingertip and radial distal region of the digit. The middle digit experienced decompression of the middle region of the finger, and the leading digit showed compression towards the middle digit and decompression in the distal-ulnar region. Larger holes caused an increase in pressure experienced by the fingertips, while changes in stroke speed showed no effect on tactile data. Larger coefficients of friction between the ball and the test surface led to an increase in pressure and skin deformation of the finger.
Unlike most tactile sensing studies that focus on tactile stimuli generated by direct contact between a fingertip and the environment, this preliminary study focused on tactile stimuli generated when a grasped object interacts with the environment. Findings from this study could be used to design experiments for functionally similar activities of daily living, such as the haptic search for a keyhole via a grasped key.
Date Created
2014-05

Digit Control During Object Handover

Description
Currently, assistive robots and prostheses have a difficult time giving and receiving objects to and from humans. While many attempts have been made to program handover scenarios into robotic control algorithms, the algorithms are typically lacking in at least one of three important features: intuitiveness, safety, or efficiency. By performing a study to better understand human-to-human handovers, we observe trends that could inspire controllers for object handovers with robots. Ten pairs of human subjects handed over a cellular-phone-shaped, instrumented object using a key pinch while 3D force and motion tracking data were recorded. It was observed that during handovers, humans apply a compressive force on the object and employ linear grip force to load force ratios when two agents are grasping an object (referred to as the "mutual grasp period"). Results also suggested that object velocity during the mutual grasp period is driven by the receiver, while the duration of the mutual grasp period is driven by the preference of the slower agent involved in the handover. Ultimately, these findings will inspire the development of robotic handover controllers to advance seamless physical interaction between humans and robots.
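The linear grip-to-load relation mentioned above can be checked numerically: fit a line to grip force versus load force and verify linearity with a correlation coefficient. The force traces and slope below are synthetic placeholders, not the study's recorded data.

```python
import numpy as np

# Sketch: testing for a linear grip-to-load force relation during a
# mutual grasp period. All values are synthetic.

load = np.linspace(0.5, 4.0, 50)    # load (tangential) force, N
ratio = 1.8                          # hypothetical grip/load slope
offset = 0.4                         # hypothetical safety margin, N
grip = ratio * load + offset         # normal (grip) force, N

# Fit grip = m * load + c and quantify linearity with Pearson r.
m, c = np.polyfit(load, grip, 1)
r = np.corrcoef(load, grip)[0, 1]
print(round(m, 2), round(c, 2), round(r, 3))
```

A Pearson r near 1 over the mutual grasp period would indicate the kind of linear grip-to-load coupling the study reports.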
Date Created
2015-05

Discriminability of Single and Multichannel Intracortical Microstimulation Within Somatosensory Cortex

Description

The addition of tactile and proprioceptive feedback to neuroprosthetic limbs is expected to significantly improve the control of these devices. Intracortical microstimulation (ICMS) of somatosensory cortex is a promising method of delivering this sensory feedback. To date, the main focus of somatosensory ICMS studies has been to deliver discriminable signals, corresponding to varying intensity, to a single location in cortex. However, multiple independent and simultaneous streams of sensory information will need to be encoded by ICMS to provide functionally relevant feedback for a neuroprosthetic limb (e.g., encoding contact events and pressure on multiple digits). In this study, we evaluated the ability of an awake, behaving non-human primate (Macaca mulatta) to discriminate ICMS stimuli delivered on multiple electrodes spaced within somatosensory cortex. We delivered serial stimulation on single electrodes to evaluate the discriminability of sensations corresponding to ICMS of distinct cortical locations. Additionally, we delivered trains of multichannel stimulation, derived from a tactile sensor, synchronously across multiple electrodes. Our results indicate that discrimination of multiple ICMS stimuli is a challenging task, but that discriminable sensory percepts can be elicited by both single and multichannel ICMS on electrodes spaced within somatosensory cortex.
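One standard way to quantify the discriminability of such percepts in a behavioral task is the sensitivity index d', computed from hit and false-alarm rates. The rates below are made up for illustration, not results from this study.

```python
from statistics import NormalDist

# d' (sensitivity index) from signal detection theory:
# d' = z(hit rate) - z(false-alarm rate).

def d_prime(hit_rate, fa_rate):
    z = NormalDist().inv_cdf   # inverse standard normal CDF
    return z(hit_rate) - z(fa_rate)

# Hypothetical session: 85% hits, 20% false alarms.
print(round(d_prime(0.85, 0.20), 2))
```

A d' near zero indicates indiscriminable stimuli, while values above roughly 1 indicate reliably discriminable percepts.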

Date Created
2016-12-02

A Robot Hand Testbed Designed for Enhancing Embodiment and Functional Neurorehabilitation of Body Schema in Subjects With Upper Limb Impairment or Loss

Description

Many upper limb amputees experience an incessant, post-amputation “phantom limb pain” and report that their missing limbs feel paralyzed in an uncomfortable posture. One hypothesis is that efferent commands no longer generate expected afferent signals, such as proprioceptive feedback from changes in limb configuration, and that the mismatch of motor commands and visual feedback is interpreted as pain. Non-invasive therapeutic techniques for treating phantom limb pain, such as mirror visual feedback (MVF), rely on visualizations of postural changes. Advances in neural interfaces for artificial sensory feedback now make it possible to combine MVF with a high-tech “rubber hand” illusion, in which subjects develop a sense of embodiment with a fake hand when subjected to congruent visual and somatosensory feedback. We discuss clinical benefits that could arise from the confluence of known concepts such as MVF and the rubber hand illusion, and new technologies such as neural interfaces for sensory feedback and highly sensorized robot hand testbeds, such as the “BairClaw” presented here. Our multi-articulating, anthropomorphic robot testbed can be used to study proprioceptive and tactile sensory stimuli during physical finger–object interactions. Conceived for artificial grasp, manipulation, and haptic exploration, the BairClaw could also be used for future studies on the neurorehabilitation of somatosensory disorders due to upper limb impairment or loss. A remote actuation system enables the modular control of tendon-driven hands. The artificial proprioception system enables direct measurement of joint angles and tendon tensions while temperature, vibration, and skin deformation are provided by a multimodal tactile sensor. The provision of multimodal sensory feedback that is spatiotemporally consistent with commanded actions could lead to benefits such as reduced phantom limb pain, and increased prosthesis use due to improved functionality and reduced cognitive burden.

Date Created
2015-02-19

Human-robot cooperation: communication and leader-follower dynamics

Description
As robotic systems are used in increasingly diverse applications, the interaction of humans and robots has become an important area of research. In many applications of physical human-robot interaction (pHRI), the robot and the human can be seen as cooperating to complete a task with some object of interest. Often these applications are in unstructured environments where many paths can accomplish the goal. This creates a need for the ability to communicate a preferred direction of motion between both participants in order to move in a coordinated way. This communication method should be bidirectional in order to fully utilize both the robot's and the human's capabilities. Moreover, in cooperative tasks between two humans, one human will often operate as the leader of the task and the other as the follower, and these roles may switch during the task as needed. The need for communication extends to this leader-follower switching: not only must the desire to switch roles be communicated, but the switching process itself must be controlled. Impedance control has been used as a way of dealing with some of the complexities of pHRI. This investigation examined whether impedance control can be utilized as a way of communicating a preferred direction between humans and robots. The first set of experiments tested whether a human could detect the preferred direction of a robot by grasping and moving an object coupled to the robot. The second set tested the reverse case: whether the robot could detect the preferred direction of the human. The ability to detect the preferred direction was shown to be up to 99% effective. Using these results, a control method allowing a human and robot to switch leader and follower roles during a cooperative task was implemented and tested. This method proved successful 84% of the time. The control method was then refined using adaptive control, resulting in lower interaction forces and a success rate of 95%.
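One way impedance control can encode a preferred direction is through anisotropic stiffness: the robot renders low stiffness along the preferred axis and high stiffness across it, so the human feels little resistance moving the "right" way. This is a minimal sketch of that idea with hypothetical gains, not the controller used in the experiments above.

```python
import numpy as np

# Anisotropic impedance sketch: decompose the displacement from the
# nominal path into components along and across a preferred axis and
# apply different stiffness gains to each.

def impedance_force(dx, preferred, k_low=50.0, k_high=500.0):
    """Restoring force (N) for displacement dx (m); gains are hypothetical."""
    u = preferred / np.linalg.norm(preferred)   # unit preferred axis
    along = np.dot(dx, u) * u                   # component along the axis
    across = dx - along                         # component across the axis
    return -(k_low * along + k_high * across)

dx = np.array([0.01, 0.01])                     # 1 cm off in x and y
f = impedance_force(dx, preferred=np.array([1.0, 0.0]))
print(f)   # weak resistance along x, strong resistance along y
```

The asymmetry in felt resistance is the cue: by probing the object, the human (or robot) can infer which direction the partner "prefers".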
Date Created
2014

Intracortical microstimulation of somatosensory cortex: functional encoding and localization of neuronal recruitment

Description
Intracortical microstimulation (ICMS) within somatosensory cortex can produce artificial sensations including touch, pressure, and vibration. There is significant interest in using ICMS to provide sensory feedback for a prosthetic limb. In such a system, information recorded from sensors on the prosthetic would be translated into electrical stimulation and delivered directly to the brain, providing feedback about features of objects in contact with the prosthetic. To achieve this goal, multiple simultaneous streams of information will need to be encoded by ICMS in a manner that produces robust, reliable, and discriminable sensations. The first segment of this work focuses on the discriminability of sensations elicited by ICMS within somatosensory cortex. Stimulation on multiple single electrodes and near-simultaneous stimulation across multiple electrodes, driven by a multimodal tactile sensor, were both used in these experiments. A SynTouch BioTac sensor was moved across a flat surface in several directions, and a subset of the sensor's electrode impedance channels were used to drive multichannel ICMS in the somatosensory cortex of a non-human primate. The animal performed a behavioral task during this stimulation to indicate the discriminability of sensations evoked by the electrical stimulation. The animal's responses to ICMS were somewhat inconsistent across experimental sessions but indicated that discriminable sensations were evoked by both single and multichannel ICMS. The factors that affect the discriminability of stimulation-induced sensations are not well understood, in part because the relationship between ICMS and the neural activity it induces is poorly defined. The second component of this work was to develop computational models that describe the populations of neurons likely to be activated by ICMS. Models of several neurons were constructed, and their responses to ICMS were calculated. 
A three-dimensional cortical model was constructed using these cell models and used to identify the populations of neurons likely to be recruited by ICMS. Stimulation activated neurons in a sparse and discontinuous fashion; additionally, the type, number, and location of neurons likely to be activated by stimulation varied with electrode depth.
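A back-of-the-envelope version of the kind of calculation such recruitment models build on is the extracellular potential of a point-source electrode in a homogeneous medium, V(r) = I / (4·pi·sigma·r). The conductivity and threshold below are textbook-style placeholder values, not parameters from this model.

```python
import numpy as np

# Point-source estimate of the spatial extent of ICMS activation.
# V(r) = I / (4*pi*sigma*r) in an idealized homogeneous medium.

sigma = 0.3       # S/m, approximate conductivity of gray matter
I = 50e-6         # 50 uA stimulation current
V_thresh = 0.1    # hypothetical activation threshold, V

r = np.logspace(-5, -3, 200)          # distances from 10 um to 1 mm
V = I / (4 * np.pi * sigma * r)       # potential at each distance

# Radius inside which the potential exceeds the threshold:
r_act = I / (4 * np.pi * sigma * V_thresh)
print(f"{r_act * 1e6:.0f} um")
```

Real compartmental models replace this single threshold with cell-type-specific morphology and dynamics, which is why, as the thesis found, recruitment is sparse and varies with electrode depth rather than filling a simple sphere.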
Date Created
2013