Matching Items (4)



EMG-Interfaced Device for the Detection and Alleviation of Freezing of Gait in Individuals with Parkinson's Disease

Description

Parkinson's disease is a neurodegenerative disorder of the central nervous system that affects a host of daily activities and involves a variety of symptoms, including tremors, slurred speech, and rigid muscles. It is the second most common movement disorder globally. In Stage 3 of Parkinson's, afflicted individuals begin to develop an abnormal gait pattern known as freezing of gait (FoG), which is characterized by decreased step length, shuffling, and eventually a complete loss of movement that often results in a fall. Surface electromyography (sEMG) is a diagnostic tool that measures electrical activity in the muscles to assess overall muscle function. Most conventional EMG systems, however, are bulky, tethered to a single location, expensive, and primarily used in a lab or clinical setting. This project explores an affordable, open-source, and portable platform called the Open Brain-Computer Interface (OpenBCI). The purpose of the proposed device is to detect gait patterns by leveraging sEMG signals acquired with the OpenBCI and to help a patient overcome an episode using haptic feedback mechanisms. Previously designed devices with similar intended purposes instead use accelerometry for detection and audio or visual feedback mechanisms in their design.
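As an illustration of the detection idea described above, the sketch below flags a freezing episode when a moving-RMS envelope of one sEMG channel stays below an activity threshold for a sustained interval. The thesis does not specify its algorithm, so the envelope window, threshold, and duration parameters here are hypothetical assumptions, and no OpenBCI-specific API is used.

```python
import numpy as np

def emg_envelope(signal, fs, window_s=0.25):
    """Moving-RMS envelope of a raw EMG signal sampled at fs Hz."""
    n = max(1, int(window_s * fs))
    kernel = np.ones(n) / n
    # moving average of the squared signal, then square root
    return np.sqrt(np.convolve(signal ** 2, kernel, mode="same"))

def detect_freeze(envelope, threshold, fs, min_duration_s=0.5):
    """Flag samples where the envelope has stayed below `threshold`
    for at least `min_duration_s` seconds (sustained muscle inactivity)."""
    below = envelope < threshold
    flags = np.zeros_like(below)
    run = 0
    for i, b in enumerate(below):
        run = run + 1 if b else 0
        if run >= int(min_duration_s * fs):
            flags[i] = True
    return flags
```

In a real system the raw channel would come from the OpenBCI board's data stream, and a raised flag would trigger the haptic cue.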


Date Created
2016-05


Exploring the Design of Vibrotactile Cues for Visio-Haptic Sensory Substitution

Description

This paper presents the design and evaluation of a haptic interface for augmenting human-human interpersonal interactions by delivering the facial expressions of an interaction partner to an individual who is blind, using a visual-to-tactile mapping of facial action units and emotions. Pancake shaftless vibration motors are mounted on the back of a chair to provide vibrotactile stimulation in the context of a dyadic (one-on-one) interaction across a table. This work explores the design of spatiotemporal vibration patterns that can convey the basic building blocks of facial movements according to the Facial Action Coding System. A behavioral study was conducted to explore the factors that influence the naturalness of conveying affect using vibrotactile cues.
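To make the idea of spatiotemporal vibration patterns concrete, here is a minimal sketch of how action-unit-like facial movements might map onto timed frames of a motor grid. The grid size, motor indexing, and the particular sweep-to-AU associations are illustrative assumptions, not the mappings evaluated in the paper.

```python
# Hypothetical 4x4 grid of vibration motors on the chair back, row-major.
GRID_W, GRID_H = 4, 4

def row(y):
    """Motor indices of one horizontal row of the grid."""
    return [y * GRID_W + x for x in range(GRID_W)]

# A spatiotemporal pattern is a sequence of (motor_indices, duration_ms)
# frames. For example, an upward sweep might suggest raised eyebrows
# (AU 1/2) and a downward sweep a lip-corner depressor (AU 15).
PATTERNS = {
    "brow_raise": [(row(y), 150) for y in range(GRID_H - 1, -1, -1)],
    "frown":      [(row(y), 150) for y in range(GRID_H)],
}

def pattern_duration_ms(name):
    """Total playback time of one pattern."""
    return sum(d for _, d in PATTERNS[name])
```

A playback loop would simply step through the frames, energizing each frame's motors for its duration.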


Date Created
2014-05


Resonant microbeam high resolution vibrotactile haptic display

Description

One type of assistive device for the blind attempts to convert visual information into information that can be perceived through another sense, such as touch or hearing. A vibrotactile haptic display assistive device consists of an array of vibrating elements placed against the skin, allowing the blind individual to receive visual information through touch. However, these approaches face two significant technical challenges: large vibration element size and the number of microcontroller pins required for vibration control, both of which cause excessively low resolution. Here, I propose and investigate a type of high-resolution vibrotactile haptic display that overcomes these challenges by using a ‘microbeam’ as the vibrating element. These microbeams can then be actuated using only one microcontroller pin connected to a speaker or surface transducer. This approach could solve the low-resolution problem present in current haptic displays. In this paper, I present the results of an investigation into the manufacturability of such a device, simulation of its vibrational characteristics, and prototyping and experimental validation of the device concept. Possible reasons for the shift between the frequencies obtained from the forced and free responses of the beams and the frequency calculated from a lumped-mass approximation are investigated. One important cause of this shift is found to be the size effect: the dependence of the elastic modulus on specimen size and material. This size effect is investigated for A2 tool steel micro/meso-scale cantilever beams in the proposed system.
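For context on the lumped-mass approximation mentioned above: the standard textbook estimate concentrates an effective mass of 33/140 of the beam mass at the tip of a cantilever with tip stiffness k = 3EI/L³. Even for an ideal uniform beam this estimate sits about 1.5% above the distributed-parameter (Euler-Bernoulli) result, before any size effect on E is considered. The sketch below compares the two; these are generic formulas, not the thesis's own model or data.

```python
import math

def lumped_mass_frequency(E, I, L, m_beam):
    """First natural frequency (Hz) of a cantilever via the lumped-mass
    approximation: tip stiffness k = 3EI/L^3, effective tip mass 33/140
    of the total beam mass."""
    k = 3 * E * I / L ** 3
    m_eff = (33 / 140) * m_beam
    return math.sqrt(k / m_eff) / (2 * math.pi)

def exact_frequency(E, I, L, rho_A):
    """First natural frequency (Hz) from Euler-Bernoulli beam theory;
    (beta_1 * L) = 1.8751 for the first cantilever mode. rho_A is the
    mass per unit length."""
    beta1L = 1.8751
    return (beta1L ** 2 / (2 * math.pi)) * math.sqrt(E * I / (rho_A * L ** 4))
```

Because the material and geometry terms cancel, the ratio of the two estimates is a constant √(420/33)/1.8751² ≈ 1.015 for any uniform cantilever, so a measured shift larger than this points to physics outside the ideal model, such as the size-dependent modulus discussed above.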


Date Created
2019


Haptic Vision: Augmenting Non-visual Travel Tools, Techniques, and Methods by Increasing Spatial Knowledge Through Dynamic Haptic Interactions

Description

Access to real-time situational information, including the relative position and motion of surrounding objects, is critical for safe and independent travel. Object or obstacle (OO) detection at a distance is primarily a task of the visual system due to the high-resolution information the eyes are able to receive from afar. As a sensory organ, the eyes have an unparalleled ability to adjust to varying degrees of light, color, and distance. For a non-visual traveler, someone who is blind or has low vision, this information is therefore unattainable when it lies beyond the reach of the preferred mobility device or outside the path of travel. Although assistive technology in the form of electronic travel aids (ETAs) has received considerable attention over the last two decades, surprisingly little work has focused on augmenting, rather than replacing, current non-visual travel techniques, methods, and tools. Consequently, this work describes the design of an intuitive tactile language and a series of wearable tactile interfaces (the Haptic Chair, HaptWrap, and HapBack) that deliver real-time spatiotemporal data. The overall intuitiveness of the haptic mappings conveyed through the tactile interfaces is evaluated using a combination of absolute identification accuracy across a series of patterns and subjective feedback from post-experiment surveys. Two types of spatiotemporal representations are considered: static patterns, which represent object location at a single time instance, and dynamic patterns, added in the HaptWrap, which represent object movement over a time interval. Results support the viability of multi-dimensional haptics applied to the body to yield an intuitive understanding of dynamic interactions occurring around the navigator during travel.
Lastly, the guiding principle of this work was to provide the navigator with spatial knowledge otherwise unattainable through current mobility techniques, methods, and tools, and thus with the information necessary to make informed navigation decisions independently, at a distance.
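The distinction between static and dynamic spatiotemporal patterns can be sketched as follows, assuming a hypothetical ring of eight motors worn around the torso; the actual mappings of the Haptic Chair, HaptWrap, and HapBack are not reproduced here.

```python
# Hypothetical body-worn ring of 8 motors, motor i facing bearing i * 45 deg.
N_MOTORS = 8

def static_pattern(bearing_deg):
    """Object location at one instant: pulse the motor nearest the bearing.
    Returns a list of (motor_index, duration_ms) frames."""
    motor = round(bearing_deg / (360 / N_MOTORS)) % N_MOTORS
    return [(motor, 200)]

def dynamic_pattern(start_deg, end_deg, steps=4):
    """Object movement over an interval: sweep across intermediate
    bearings so the vibration appears to travel around the body."""
    frames = []
    for i in range(steps):
        b = start_deg + (end_deg - start_deg) * i / (steps - 1)
        frames.extend(static_pattern(b))
    return frames
```

A static cue answers "where is it now?", while the swept version conveys direction of travel over the interval, mirroring the two representation types evaluated above.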


Date Created
2020