Matching Items (60)

Communication and Inference of Intended Movement Direction during Human–Human Physical Interaction

Description

Of particular interest to the neuroscience and robotics communities is the understanding of how two humans could physically collaborate to perform motor tasks such as holding a tool or moving it across locations. When two humans physically interact with each other, sensory consequences and motor outcomes are not entirely predictable, as they also depend on the other agent's actions. The sensory mechanisms involved in physical interactions are not well understood. The present study was designed (1) to quantify human–human physical interactions where one agent ("follower") has to infer the intended or imagined—but not executed—direction of motion of another agent ("leader") and (2) to reveal the underlying strategies used by the dyad. This study also aimed at verifying the extent to which visual feedback (VF) is necessary for communicating intended movement direction. We found that the leader's control of the relationship between force and motion was a critical factor in conveying his/her intended movement direction to the follower, regardless of VF of the grasped handle or the arms. Interestingly, the dyad's ability to communicate and infer movement direction with significant accuracy (>83%) improved after a relatively short amount of practice. These results indicate that the relationship between force and motion (interpreted as arm impedance modulation), which the leader modulated according to intended direction, may represent an important means for communicating intended movement direction between biological agents. Ongoing work is investigating the application of the present findings to optimize communication of high-level movement goals during physical interactions between biological and non-biological agents.
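To make the force–motion relationship discussed above concrete, here is a minimal sketch (not the authors' analysis) of how an intended direction could in principle be inferred from handle force and displacement data, assuming the leader is more compliant along the intended direction; the 30° direction, the stiffness values, and all data are synthetic placeholders.

```python
# Illustrative sketch only (not the authors' analysis). If a leader keeps the arm
# more compliant along the intended direction, a follower could in principle
# recover that direction from the force-motion relationship. Data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
theta = np.deg2rad(30)                               # assumed intended direction
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
K = R @ np.diag([150.0, 600.0]) @ R.T                # soft along 30 deg (N/m)

disp = rng.normal(scale=0.01, size=(200, 2))         # small handle motions (m)
force = disp @ K.T + rng.normal(scale=0.2, size=(200, 2))   # interaction forces (N)

K_est, *_ = np.linalg.lstsq(disp, force, rcond=None)        # fit F = K x
eigvals, eigvecs = np.linalg.eigh((K_est + K_est.T) / 2)    # symmetrize, decompose
soft_axis = eigvecs[:, np.argmin(eigvals)]                  # most compliant axis
print("inferred direction: %.1f deg" %
      (np.rad2deg(np.arctan2(soft_axis[1], soft_axis[0])) % 180))
```

Running the sketch recovers the assumed 30° axis from the fitted force–displacement mapping.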

Date Created
  • 2017-04-13

Proof of Concept of an Online EMG-Based Decoding of Hand Postures and Individual Digit Forces for Prosthetic Hand Control

Description

Introduction: Options currently available to individuals with upper limb loss range from prosthetic hands that can perform many movements, but require more cognitive effort to control, to simpler terminal devices with limited functional abilities. We attempted to address this issue by designing a myoelectric control system to modulate prosthetic hand posture and digit force distribution.
Methods: We recorded surface electromyographic (EMG) signals from five forearm muscles in eight able-bodied subjects while they modulated hand posture and the flexion force distribution of individual fingers. We used a support vector machine (SVM) and a random forest regression (RFR) to map EMG signal features to hand posture and individual digit forces, respectively. After training, subjects performed grasping tasks and hand gestures while a computer program computed and displayed online feedback of hand posture (which digits were flexed) and the magnitude of the digit contact forces. We also used a commercially available prosthetic hand, the i-Limb (Touch Bionics), to provide a practical demonstration of the proposed approach's ability to control hand posture and finger forces.
Results: Subjects could control hand pose and force distribution across the fingers during online testing. Decoding success rates ranged from 60% for index finger pointing to 83% and 99% for the 2-digit grasp and resting state, respectively. Subjects could also modulate finger force distribution.
Discussion: This work provides a proof of concept for the application of SVM and RFR for online control of hand posture and finger force distribution, respectively. Our approach has potential applications for enabling in-hand manipulation with a prosthetic hand.
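As a hedged illustration of the decoding pipeline summarized above (an SVM for posture classification and random forest regression for digit forces), the following sketch uses scikit-learn with synthetic placeholder features standing in for windowed EMG data; the feature choice (per-channel RMS), label set, and array shapes are assumptions, not the study's protocol.

```python
# Illustrative sketch (not the study's code): SVM for hand-posture classification
# and random forest regression for digit forces, from placeholder EMG features.
import numpy as np
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n_windows, n_channels, n_digits = 500, 5, 5              # 5 forearm EMG channels

emg_rms = rng.random((n_windows, n_channels))            # assumed RMS features per window
posture_labels = rng.integers(0, 4, size=n_windows)      # e.g., 4 assumed posture classes
digit_forces = rng.random((n_windows, n_digits))         # normalized flexion forces

posture_decoder = SVC(kernel="rbf").fit(emg_rms, posture_labels)
force_decoder = RandomForestRegressor(n_estimators=100).fit(emg_rms, digit_forces)

new_window = rng.random((1, n_channels))                 # features from a new EMG window
print("decoded posture class:", posture_decoder.predict(new_window)[0])
print("decoded digit forces: ", force_decoder.predict(new_window)[0])
```

In an online setting, the two decoders would be applied to each incoming feature window to drive hand posture and finger forces simultaneously.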

Date Created
  • 2017-02-01

Stability of the Human Ankle with Respect to Environmental Mechanics

Description

This study presents quantification of ankle stability as affected by environmental conditions in two degrees of freedom (DOF) with three distinct analysis techniques. Additionally, this study presents gender-specific trends for comparison. Intuitively, ankle stability decreased in less stable environments with a negative simulated stiffness. Female subjects generally suffered a greater loss of stability in moderately and highly unstable environments. Both gender groups exhibited greater stability in the sagittal plane than the frontal plane across the entire range of simulated stiffnesses. Outcomes of this study are useful in the design of controllers for lower-extremity physically interactive robotics, understanding situations in which the ankle is likely to lose stability, and understanding the strengths and weaknesses of unique analysis techniques.
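As a minimal sketch of why a negative simulated environment stiffness tends to destabilize the ankle, consider a linearized single-DOF inverted-pendulum model of stance; the model and all parameter values below are illustrative assumptions, not the analysis techniques used in this study.

```python
# Minimal sketch: linearized single-DOF ankle model about upright stance.
# I*theta'' = m*g*h*theta - (k_ankle + k_env)*theta - b*theta'
# Stability (eigenvalues in the left half-plane) requires k_ankle + k_env > m*g*h.
# All parameter values are illustrative assumptions.
import numpy as np

m, g, h = 75.0, 9.81, 0.9                      # body mass (kg), COM height (m)
I = m * h**2                                   # point-mass inertia about the ankle
k_ankle, b = 800.0, 50.0                       # assumed ankle stiffness (N*m/rad), damping

for k_env in (200.0, 0.0, -200.0):             # simulated environment stiffness (N*m/rad)
    A = np.array([[0.0, 1.0],
                  [(m * g * h - k_ankle - k_env) / I, -b / I]])
    stable = np.all(np.linalg.eigvals(A).real < 0)
    print(f"k_env = {k_env:6.0f} N*m/rad -> {'stable' if stable else 'unstable'}")
```

In this toy model, making the simulated stiffness sufficiently negative pushes the combined ankle-plus-environment restoring torque below the gravitational toppling torque, and the upright equilibrium becomes unstable.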

Date Created
  • 2017-12

Lower Limb Gait Simulator Based on a Pure External Force

Description

For the past two decades, advanced limb gait simulators and exoskeletons have been developed to improve walking rehabilitation. A limb gait simulator is used to analyze the human step cycle and/or assist a user walking on a treadmill. Most modern limb gait simulators, such as ALEX, have proven themselves effective and reliable through their use of motors, springs, cables, elastics, pneumatics, and reaction loads. These mechanisms apply internal forces and reaction loads to the body. External forces, on the other hand, are those applied by an agent outside the system, such as air, water, or magnets. A design for an exoskeleton using external forces has seldom been attempted by researchers. This thesis project focuses on the development of a Limb Gait Simulator based on a Pure External Force and demonstrates its effectiveness in generating torque on the human leg. The external force is generated through air propulsion using an electric ducted fan (EDF) motor. Such a motor is typically used for remote-control airplanes, but its applications can go beyond this. The objective of this research is to generate torque on the human leg through control of the EDF engine's thrust and the opening/closing of the reverse-thruster flaps. This device qualifies as "assist as needed": the user is entirely in control of how much assistance he or she may want. Static thrust values for the EDF engine were recorded using a thrust test stand. The resulting torque is the product of the thrust (N) and the distance to its point of application on the thigh (m). With the motor running at maximum RPM, the highest torque value reached was 3.93 N·m. The EDF motor is powered by a 6S 5000 mAh LiPo battery. This torque value could be increased with the use of a second battery connected in series, but this comes at a price. The designed limb gait simulator demonstrates that external forces, such as air, could have potential in the development of future rehabilitation devices.
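The torque relation described above is simple to reproduce numerically; in the sketch below the thrust readings and moment arm are hypothetical values, since the thesis quotes only the peak torque of 3.93 N·m.

```python
# Torque on the leg from EDF thrust applied at a point on the thigh.
# Thrust values and moment arm below are illustrative assumptions; the thesis
# reports a peak torque of 3.93 N*m at maximum RPM.
thrust_newtons = [5.0, 10.0, 15.0]   # hypothetical static thrust readings (N)
moment_arm_m = 0.26                  # hypothetical distance from joint to thrust point (m)

for thrust in thrust_newtons:
    torque = thrust * moment_arm_m   # torque (N*m) = force (N) * moment arm (m)
    print(f"thrust {thrust:5.1f} N -> torque {torque:4.2f} N*m")
```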

Date Created
  • 2016-12

Development of a Lower Extremity Robotic Device for Ankle Studies

Description

The quality of life of many people is lowered by impediments to walking ability caused by neurological conditions such as strokes. Since the ankle joint plays an important role in locomotion, it is a common subject of study in rehabilitation research. Robotic devices such as active ankle-foot orthoses and powered exoskeletons have the potential to be used directly in physical therapy or indirectly in research pursuing more effective rehabilitation methods. This paper presents the LiTREAD, a lightweight three-degree-of-freedom robotic exoskeletal ankle device. This novel robotic system is designed to be worn on a user's leg and actuate the foot position during treadmill studies. The robot's sagittal-plane actuation is complemented by passive virtual-axis systems in the frontal and transverse planes. Together, these degrees of freedom allow the device to approximate the full range of motion of the ankle. The virtual-axis mechanisms feature locking configurations that will allow the effect of these degrees of freedom on gait dynamics to be studied. Based on a kinematic analysis of the robot's actuation and geometry, it is expected to meet its torque target and exceed its speed target. The device will fit either leg of a range of subject sizes and is expected to weigh just 1.3 kg (2.9 lb). These features and characteristics are designed to minimize the robot's interference with the natural walking motion. Pending validation studies confirming that all design criteria have been met, the constructed LiTREAD prototype will be used in experiments investigating properties of the ankle such as its mechanical impedance. It is hoped that the LiTREAD will yield valuable data that will expand our knowledge of the ankle and aid in the design of future lower-extremity devices.

Date Created
  • 2016-12

Robotic 3D Mapping for Virtual Reality Implementation

Description

In recent years, environment mapping has garnered significant interest in both industrial and academic settings as a viable means of generating comprehensive virtual models of the physical world. These maps are created using simultaneous localization and mapping (SLAM) algorithms that combine depth contours with visual imaging information to create rich, layered point clouds. Given the recent advances in virtual reality technology, these generated point clouds can be imported onto the Oculus Rift or a similar headset for virtual reality implementation. This project deals with the robotic implementation of RGB-D SLAM algorithms on mobile ground robots to generate complete point clouds that can be processed off-line and imported into virtual reality engines for viewing in the Oculus Rift. This project uses a ground robot along with a Kinect sensor to collect RGB-D data of the surrounding environment and build point cloud maps using SLAM software. These point clouds are then exported as object (.obj) or polygon (.ply) files for post-processing in software such as MeshLab or Unity. The point clouds generated from the SLAM software can be viewed in the Oculus Rift as-is. However, these maps are mainly empty space and can be further optimized for virtual viewing. Additional techniques such as meshing and texture meshing were implemented on the raw point cloud maps and tested on the Oculus Rift. The aim of this project was to increase the potential applications for virtual reality by taking a robotic mapping approach to virtual reality environment development. This project was successful in achieving its objective. The following report details the processes used in developing a remotely controlled robotic platform that can scan its environment and generate viable point cloud maps. These maps are then processed off-line and ported into virtual reality software for viewing through the Oculus Rift.
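As one hedged example of the off-line post-processing step described above (converting a raw SLAM point cloud into a mesh suitable for a game engine), the open-source Open3D library could be used as sketched below; the file names are placeholders, and this is not necessarily the toolchain used in the project.

```python
# Illustrative post-processing sketch using Open3D (not the project's exact pipeline):
# load a SLAM-generated point cloud, estimate normals, mesh it, and export for Unity.
import open3d as o3d

pcd = o3d.io.read_point_cloud("slam_map.ply")             # placeholder file name
pcd = pcd.voxel_down_sample(voxel_size=0.02)              # thin out redundant points
pcd.estimate_normals(
    search_param=o3d.geometry.KDTreeSearchParamHybrid(radius=0.1, max_nn=30))

# Poisson reconstruction fills the "empty space" between points with surfaces.
mesh, _ = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(pcd, depth=9)
o3d.io.write_triangle_mesh("slam_map_mesh.obj", mesh)     # importable into Unity/MeshLab
```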

Date Created
  • 2017-05

Design of a Collapsible Instrument for Studying Grasp of Breakable Objects

Description

Research on human grasp typically involves the grasp of objects designed for the study of fingertip forces. Instrumented objects for such studies have often been designed for the simulation of functional tasks, such as feeding oneself, or for rigidity such that the objects do not deform when grasped. The goal of this thesis was to design a collapsible, instrumented object to study grasp of breakable objects. Such an object would enable experiments on human grip responses to unexpected finger-object events as well as anticipatory mechanisms once object fragility has been observed. The collapsible object was designed to be modular to allow for properties such as friction and breaking force to be altered. The instrumented object could be used to study both human and artificial grasp.

Date Created
  • 2012-05

Multidigit Tactile Exploration of Environment through an Object

Description

The ideal function of an upper limb prosthesis is to replace the human hand and arm, but a gulf in functionality between prostheses and biological arms still exists, in large part due to the absence of the sense of touch. Tactile sensing is a key component of a wide variety of the human hand's interactions with the external environment; visual feedback alone is not always sufficient for the recreation of nuanced tasks. It is hoped that the results of this study can contribute to the advancement of prosthetics with a tactile feedback loop, with the ultimate goal of replacing biological function. A three-fingered robot hand equipped with tactile sensing fingertips was used to biomimetically grasp a ball in order to haptically explore the environment for a ball-in-hole task. The sensorized fingertips were used to measure the vibration, pressure, and skin deformation experienced by each fingertip. Vibration and pressure sensed by the fingertips were good indicators of changes in discrete phases of the exploratory motion, such as contact with the lip of a hole. The most informative tactile cue was the skin deformation of the fingers. Upon encountering the lip of the test surface, the lagging digit experienced compression in the fingertip and the radial distal region of the digit. The middle digit experienced decompression of the middle region of the finger, and the lagging digit showed compression towards the middle digit and decompression in the distal-ulnar region. Larger holes caused an increase in pressure experienced by the fingertips, while changes in stroke speed showed no effect on tactile data. Larger coefficients of friction between the ball and the test surface led to an increase in pressure and skin deformation of the finger. Unlike most tactile sensing studies that focus on tactile stimuli generated by direct contact between a fingertip and the environment, this preliminary study focused on tactile stimuli generated when a grasped object interacts with the environment. Findings from this study could be used to design experiments for functionally similar activities of daily living, such as the haptic search for a keyhole via a grasped key.
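A minimal sketch of the kind of event detection suggested by the findings above (vibration transients and pressure rises marking discrete phases such as contact with the lip of a hole), using synthetic signals and assumed thresholds rather than the study's sensor data:

```python
# Illustrative sketch: flag discrete contact events from fingertip pressure and
# vibration channels using simple thresholds. All signals below are synthetic.
import numpy as np

rng = np.random.default_rng(2)
t = np.arange(0, 2.0, 0.001)                       # 2 s at 1 kHz
pressure = 0.1 * rng.standard_normal(t.size)       # baseline pressure noise
vibration = 0.05 * rng.standard_normal(t.size)     # baseline vibration noise
pressure[1200:1400] += 1.0                         # simulated press against the lip
vibration[1200:1210] += 0.8                        # simulated impact transient

event = (np.abs(vibration) > 0.4) | (pressure > 0.5)    # assumed detection thresholds
onsets = np.flatnonzero(np.diff(event.astype(int)) == 1)
print("candidate contact events at t =", t[onsets], "s")
```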

Date Created
  • 2014-05

Variable Stiffness Treadmill (VST): Design, Development, and Implementation of a Novel Tool for the Investigation of Human Gait

Description

The generation of walking motion is one of the most vital functions of the human body because it allows us to be mobile in our environment. Unfortunately, numerous individuals suffer from gait impairment as a result of debilitating conditions like stroke, resulting in a serious loss of mobility. Our understanding of human gait is limited by the amount of research conducted on human walking mechanisms and their characteristics. In order to better understand these characteristics and the systems involved in the generation of human gait, it is necessary to increase the depth and range of research pertaining to walking motion. Specifically, one area of human gait research that could yield interesting conclusions about gait rehabilitation has received little investigation: the effect of surface stiffness on human gait. To investigate this question, a number of studies have been conducted using experimental devices that change surface stiffness; however, these systems lack certain functionality that would be useful in an experimental scenario. To address this gap and to investigate the effect of surface stiffness further, a system called the Variable Stiffness Treadmill (VST) has been developed. This treadmill system is a unique investigative tool that allows for the active control of surface stiffness. What is novel about this system is its ability to change the stiffness of the surface quickly, accurately, during the gait cycle, and throughout a large range of possible stiffness values. This type of functionality has never been implemented in an experimental system and constitutes a tremendous opportunity for valuable gait research regarding the influence of surface stiffness. In this work, the design, development, and implementation of the Variable Stiffness Treadmill system are presented and discussed along with preliminary experimentation. The results from characterization testing demonstrate highly accurate stiffness control and excellent response characteristics for specific configurations. Initial indications from human experimental trials regarding quantifiable effects of surface stiffness variation using the Variable Stiffness Treadmill system are encouraging.
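As a hedged sketch of what actively controlled surface stiffness could look like in simulation (the stiffness values and the switching schedule are illustrative assumptions, not the VST's actual controller):

```python
# Minimal sketch: a treadmill surface rendered as a spring whose stiffness k(t)
# is dropped partway through the stance phase. Values are illustrative only.
def surface_force(deflection_m, t, k_high=100e3, k_low=10e3, switch_t=0.3):
    """Spring-like support force (N) with a stiffness drop at switch_t seconds."""
    k = k_high if t < switch_t else k_low
    return -k * deflection_m

for t in (0.0, 0.2, 0.4, 0.6):
    label = "high" if t < 0.3 else "low"
    print(f"t = {t:.1f} s, stiffness = {label}, "
          f"force for 5 mm deflection = {surface_force(0.005, t):8.1f} N")
```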

Date Created
  • 2015-05

A Study of 3D Human Arm Impedance Towards the Development of an EMG-controlled Exoskeleton

Description

I worked on the human-machine interface to improve human physical capability. This work was done in the Human Oriented Robotics and Control Lab (HORC) towards the creation of an advanced, EMG-controlled exoskeleton. The project was new, and any work on the human-machine interface needs the physical interface itself, so I designed and fabricated a human-robot coupling device with a novel safety feature. Validation testing of this coupling proved very successful, and the device was granted a provisional patent and published to facilitate its spread to other human-machine interface applications, where it could be of major benefit. I then employed this coupling in experimentation towards understanding impedance, with the end goal being the creation of an EMG-based impedance exoskeleton control system. I modified a previously established robot-to-human perturbation method for use in my novel, three-dimensional (3D) impedance measurement experiment. Upon execution of this experiment, I was able to successfully characterize passive, static human arm stiffness in 3D, and in doing so validated the aforementioned method. This establishes an important foundation for promising future work on understanding impedance and the creation of the proposed control scheme, thereby furthering the field of human-robot interaction.
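A minimal sketch, with synthetic data, of the kind of 3D static stiffness characterization described above: apply small positional perturbations, record the restoring forces, and solve F = −K·Δx for a 3×3 stiffness matrix by least squares (illustrative only, not the thesis' actual method or code).

```python
# Illustrative only: estimate a 3x3 static arm stiffness matrix from
# perturbation displacements and measured restoring forces. Data are synthetic.
import numpy as np

rng = np.random.default_rng(3)
K_true = np.array([[400.0,  50.0,  20.0],
                   [ 50.0, 600.0,  30.0],
                   [ 20.0,  30.0, 300.0]])          # assumed endpoint stiffness (N/m)

dx = rng.normal(scale=0.008, size=(60, 3))          # small 3D perturbations (m)
f = -(dx @ K_true.T) + rng.normal(scale=0.3, size=(60, 3))   # restoring forces (N)

# Least-squares solution of f = -K dx recovers the stiffness matrix.
K_est, *_ = np.linalg.lstsq(dx, -f, rcond=None)
print("estimated stiffness matrix (N/m):\n", K_est.T)
```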

Date Created
  • 2013-05