Matching Items (13)

Haptic-Based Indoor Navigation for Emergency Responders

Description

Situations arise in which someone needs to navigate inside a building, for example to reach an exit or to retrieve an object. Sometimes vision is not a reliable sense for spatial awareness, perhaps because of smoke, darkness, or distraction. I propose a wearable haptic device, a belt or vest, that provides haptic feedback to help people navigate inside a building without relying on the user's vision. The first proposed device has an obstacle-avoidance component and a navigation component. This paper discusses the challenges of designing and implementing this kind of technology in the context of indoor navigation, where GPS signal is poor. The project explored analyzing accelerometer data for indoor navigation and then delivering haptic cues from a wearable device to guide the user, and the device is promising.
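The accelerometer-based navigation mentioned above is, at its core, dead reckoning: integrating acceleration twice to estimate displacement. A minimal sketch of that idea, assuming a constant sample rate and one-dimensional motion (the function name and parameters are hypothetical, not from the project):

```python
# Hypothetical sketch of accelerometer dead reckoning: double-integrate
# acceleration samples to estimate displacement. Assumes a fixed timestep
# dt and 1-D motion; a real device would also need drift correction.

def estimate_displacement(accel_samples, dt=0.01):
    """Double-integrate 1-D acceleration (m/s^2) sampled every dt seconds."""
    velocity = 0.0
    position = 0.0
    for a in accel_samples:
        velocity += a * dt          # acceleration -> velocity
        position += velocity * dt   # velocity -> position
    return position
```

In practice, sensor noise makes this estimate drift quadratically with time, which is one of the challenges of GPS-denied indoor navigation the abstract alludes to.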

Date Created
  • 2016-05

Investigation of Multimodal Tactile Cues for Multidigit Rotational Tasks

Description

The goal of this project was to use the sense of touch to investigate tactile cues during multidigit rotational manipulations of objects. A robotic arm and hand equipped with three multimodal tactile sensors were used to gather data about skin deformation during rotation of a haptic knob. Three different rotation speeds and two levels of rotation resistance were used to investigate tactile cues during knob rotation. In the future, this multidigit task can be generalized to similar rotational tasks, such as opening a bottle or turning a doorknob.

Date Created
  • 2014-05

Haptic Discrimination of Object Size via Tactile Sensation vs. Vibratory Sensory Substitution

Description

Humans rely on a complex interworking of visual, tactile, and proprioceptive feedback to accomplish even the simplest of daily tasks. These senses work together to provide information about the size, weight, shape, density, and texture of objects being interacted with. While vision is highly relied upon for many tasks, especially those involving accurate reaches, people can typically accomplish common daily skills without constant visual feedback, instead relying on tactile and proprioceptive cues. Amputees using prosthetic hands, however, do not currently have access to such cues, making these tasks impossible. This experiment was designed to test whether vibratory haptic cues could be used in place of tactile feedback to signal contact for a size discrimination task. Two experiments were run in which subjects were asked to identify changes in block size between consecutive trials, using either physical or virtual blocks to test the accuracy of size discrimination with tactile and haptic feedback, respectively. Blocks randomly increased or decreased in size in increments of 2 to 12 mm between trials for both experiments. This experiment showed that subjects were significantly better at determining size changes using tactile feedback than vibratory haptic cues. This suggests that, while haptic feedback can technically be used to grasp and discriminate between objects of different sizes, it does not provide the same quality of information as tactile cues.

Date Created
  • 2015-05

Exploring the Impact of a Haptic Glove on Immersion

Description

In this experiment, a haptic glove with vibratory motors on the fingertips was tested against the standard HTC Vive controller to see if the additional vibrations provided by the glove increased immersion in common gaming scenarios where haptic feedback is provided. Specifically, two scenarios were developed: an explosion scene containing a small and a large explosion, and a box interaction scene that allowed the participants to touch the box virtually with their hand. At the start of this project, it was hypothesized that the haptic glove would have a significant positive impact in at least one of these scenarios. Nine participants took part in the study, and immersion was measured through a post-experiment questionnaire. Statistical analysis of the results showed that the haptic glove did have a significant impact on immersion in the box interaction scene, but not in the explosion scene. In the end, I conclude that since this haptic glove does not significantly increase immersion across all scenarios when compared to the standard Vive controller, it should not be used as a replacement in its current state.

Date Created
  • 2021-05

Role of Proprioceptive Feedback and Multisensory Integration for Object Weight Perception

Description

Tactile and proprioceptive sensory feedback are the two sensory modalities that make up haptic sensation. The degree to which these two sensory modalities are integrated is not well known. To investigate this issue, a set of experiments was designed to separate these sensory modalities and test what happens when a person’s proprioceptive system is perturbed. A virtual reality system with haptic feedback, along with a weighted object, was utilized in a reach, grasp, and lift task. The subjects would lift two objects sequentially and try to judge which one was heavier. This project was split into three experiments to measure the subject’s perception in different situations. The first experiment utilized the virtual reality system to measure perception when the subject has only proprioceptive inputs. The second experiment included the virtual reality system and the weighted object to act as a comparison to the first experiment with the additional tactile input. The third experiment then added perturbations to the proprioceptive inputs through the virtual reality system to investigate how perception changes. Results from experiments 1 and 2 showed that subjects are almost as accurate at weight discrimination even with only proprioceptive inputs; however, subjects are much more consistent in their weight discrimination with both sensory modalities. Results from experiment 3 showed that subjective perception does change when proprioception is perturbed, but the magnitude of that change in perception depends on the perturbation performed.

Date Created
  • 2020-12

HapBack - Providing Spatial Awareness at a Distance Using Haptic Stimulation

Description

This paper presents a study done to gain knowledge on the communication of an object’s relative 3-dimensional position to individuals who are visually impaired and blind. The HapBack, a continuation of the HaptWrap V1.0 (Duarte et al., 2018), focused on the perception of objects and their distances in 3-dimensional space using haptic communication. The HapBack is a device that consists of two elastic bands secured horizontally around the user’s torso and two backpack straps secured along the user’s back. The backpack straps are embedded with 10 vibrotactile motors evenly positioned along the spine. This device is designed to provide a wearable interface for blind and visually impaired individuals in order to understand how the position of objects in 3-dimensional space is perceived through haptic communication. We analyzed the accuracy of the HapBack device through three factors: (1) two different modes of vibration, absolute and relative; (2) the location of the vibrotactile motors in absolute mode; and (3) the location of the vibrotactile motors in relative mode. The results support that the HapBack provided vibrotactile patterns that were intuitively mapped to the distances represented in the study. By analyzing the intuitiveness of the vibrotactile patterns and the accuracy of the users’ responses, we gained a better understanding of how distance can be perceived through haptic communication by individuals who are blind.
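As an illustration of the kind of mapping a device like this might use (the function, distance range, and parameters below are assumptions for illustration, not the HapBack's actual implementation), an absolute mode could map each distance reading onto one of the 10 spine motors:

```python
# Hypothetical absolute-mode mapping: divide a sensing range into
# num_motors evenly sized bands and return the motor index for a
# given distance. Range and motor count are assumed, not from the study.

def motor_for_distance(distance_m, max_range_m=5.0, num_motors=10):
    """Map a distance (meters) to a motor index: 0 = nearest, 9 = farthest."""
    clamped = max(0.0, min(distance_m, max_range_m))  # keep within range
    index = int(clamped / max_range_m * num_motors)
    return min(index, num_motors - 1)  # top of range maps to last motor
```

A relative mode, by contrast, might map the nearest detected object to motor 0 regardless of its absolute distance; the study compared the intuitiveness of both schemes.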

Date Created
  • 2019-12

Haptic Learning: The Effects of Multimedia Learning on Haptic Robotic Operation

Description

This is a report on an experiment that examines whether the principles of multimedia learning outlined in Richard E. Mayer’s article “Using multimedia for e-learning,” published in the Journal of Computer Assisted Learning, apply to haptic feedback used for haptic robotic operation. This was tested by developing and using a haptic robotic manipulator known as the Haptic Testbed (HTB). The HTB is a manipulator designed to emulate human hand movement for haptic testing purposes and features an index finger and thumb for the right hand. Control is conducted through a Leap Motion Controller, a visual sensor that uses infrared lights and cameras to gather data about hands it can see. In the experiment, test subjects completed a task in which they shifted objects along a circuit of positions; subjects were measured on time to complete the circuit as well as accuracy in reaching the individual points. Analysis of subject survey responses and performance during the experiment showed that haptic feedback during training improved individuals’ initial performance and lowered mental effort and mental demand during that training. The findings of this experiment support the hypothesis that Mayer’s principles apply to haptic feedback in training for haptic robotic manipulation. One implication of this experiment is the possibility that haptic and tactile senses are applicable to Mayer’s principles of multimedia learning, as most current work in the field focuses on visual or auditory senses. If the results were replicated in a future experiment, this would further support the hypothesis that the principles of multimedia learning can be utilized to improve training for haptic robotic operation.

Date Created
  • 2019-05

Isomorphic categories

Description

Learning and transfer were investigated for a categorical structure in which relevant stimulus information could be mapped without loss from one modality to another. The category space was composed of three non-overlapping, linearly-separable categories. Each stimulus was composed of a sequence of on-off events that varied in duration and number of sub-events (complexity). Categories were learned visually, haptically, or auditorily, and transferred to the same or an alternate modality. The transfer set contained old, new, and prototype stimuli, and subjects made both classification and recognition judgments. The results showed an early learning advantage in the visual modality, with transfer performance varying among the conditions in both classification and recognition. In general, classification accuracy was highest for the category prototype, with false recognition of the category prototype higher in the cross-modality conditions. The results are discussed in terms of current theories in modality transfer, and shed preliminary light on categorical transfer of temporal stimuli.

Date Created
  • 2011

Haptic Vision: Augmenting Non-visual Travel Tools, Techniques, and Methods by Increasing Spatial Knowledge Through Dynamic Haptic Interactions

Description

Access to real-time situational information, including the relative position and motion of surrounding objects, is critical for safe and independent travel. Object or obstacle (OO) detection at a distance is primarily a task of the visual system due to the high-resolution information the eyes are able to receive from afar. As a sensory organ, the eyes have an unparalleled ability to adjust to varying degrees of light, color, and distance. Therefore, for a non-visual traveler, someone who is blind or low vision, such information is unattainable if it is positioned beyond the reach of the preferred mobility device or outside the path of travel. Although the area of assistive technology, in terms of electronic travel aids (ETAs), has received considerable attention over the last two decades, the field has surprisingly seen little work focused on augmenting rather than replacing current non-visual travel techniques, methods, and tools. Consequently, this work describes the design of an intuitive tactile language and a series of wearable tactile interfaces (the Haptic Chair, HaptWrap, and HapBack) to deliver real-time spatiotemporal data. The overall intuitiveness of the haptic mappings conveyed through the tactile interfaces is evaluated using a combination of absolute identification accuracy for a series of patterns and subjective feedback from post-experiment surveys. Two types of spatiotemporal representations are considered: static patterns, representing object location at a single time instance, and dynamic patterns, added in the HaptWrap, which represent object movement over a time interval. Results support the viability of multi-dimensional haptics applied to the body to yield an intuitive understanding of dynamic interactions occurring around the navigator during travel.
Lastly, it is important to point out that the guiding principle of this work centered on providing the navigator with spatial knowledge otherwise unattainable through current mobility techniques, methods, and tools, thus providing the navigator with the information necessary to make informed navigation decisions independently, at a distance.

Date Created
  • 2020

Modern Sensory Substitution for Vision in Dynamic Environments

Description

Societal infrastructure is built with vision at the forefront of daily life. For those with severe visual impairments, this creates countless barriers to the participation in and enjoyment of life’s opportunities. Technological progress has been both a blessing and a curse in this regard. Digital text together with screen readers and refreshable Braille displays has made whole libraries readily accessible, and rideshare tech has made independent mobility more attainable. Simultaneously, screen-based interactions and experiences have only grown in pervasiveness and importance, excluding many of those with visual impairments.

Sensory Substitution, the process of substituting an unavailable modality with another one, has shown promise as an alternative to accommodation, but in recent years meaningful strides in Sensory Substitution for vision have declined in frequency. Given recent advances in Computer Vision, this stagnation is especially disconcerting. Designing Sensory Substitution Devices (SSDs) for vision for use in interactive settings that leverage modern Computer Vision techniques presents a variety of challenges, including perceptual bandwidth, human-computer interaction, and person-centered machine learning considerations. To surmount these barriers, an approach called Personal Foveated Haptic Gaze (PFHG) is introduced. PFHG consists of two primary components: a human-visual-system-inspired interaction paradigm, intuitive and flexible enough to generalize to a variety of applications, called Foveated Haptic Gaze (FHG); and a person-centered learning component to address the expressivity limitations of most SSDs. This component is called One-Shot Object Detection by Data Augmentation (1SODDA), a one-shot object detection approach that allows a user to specify the objects they are interested in locating visually and, with minimal effort, realize an object detection model that does so effectively.

The Personal Foveated Haptic Gaze framework was realized in a virtual and a real-world application: playing a 3D, interactive, first-person video game (DOOM) and finding user-specified real-world objects. User study results found Foveated Haptic Gaze to be an effective and intuitive interface for interacting with a dynamic visual world using solely haptics. Additionally, 1SODDA achieves competitive performance among few-shot object detection methods and high-framerate many-shot object detectors. Together, these pave the way for modern Sensory Substitution Devices for vision.

Date Created
  • 2020