Matching Items (2)
Description
Humans rely on a complex interplay of visual, tactile, and proprioceptive feedback to accomplish even the simplest daily tasks. These senses work together to provide information about the size, weight, shape, density, and texture of objects being interacted with. While vision is heavily relied upon for many tasks, especially those involving accurate reaches, people can typically accomplish common daily skills without constant visual feedback, instead relying on tactile and proprioceptive cues. Amputees using prosthetic hands, however, do not currently have access to such cues, making these tasks impossible. This experiment was designed to test whether vibratory haptic cues could be used in place of tactile feedback to signal contact for a size discrimination task. Two experiments were run in which subjects were asked to identify changes in block size between consecutive trials using either physical or virtual blocks, testing the accuracy of size discrimination using tactile and haptic feedback, respectively. In both experiments, blocks randomly increased or decreased in size in increments of 2 to 12 mm between trials. The experiments showed that subjects were significantly better at determining size changes using tactile feedback than vibratory haptic cues. This suggests that, while haptic feedback can technically be used to grasp and discriminate between objects of different sizes, it does not provide the same level of input as tactile cues.
Contributors: Olson, Markey Cierra (Author) / Helms-Tilley, Stephen (Thesis director) / Buneo, Christopher (Committee member) / Barrett, The Honors College (Contributor) / Harrington Bioengineering Program (Contributor)
Created: 2015-05
Description
Access to real-time situational information, including the relative position and motion of surrounding objects, is critical for safe and independent travel. Object or obstacle (OO) detection at a distance is primarily a task of the visual system due to the high-resolution information the eyes are able to receive from afar. As a sensory organ, the eyes have an unparalleled ability to adjust to varying degrees of light, color, and distance. Therefore, for a non-visual traveler, someone who is blind or has low vision, visual information is unattainable if it is positioned beyond the reach of the preferred mobility device or outside the path of travel. Although assistive technology in the form of electronic travel aids (ETAs) has received considerable attention over the last two decades, the field has surprisingly seen little work focused on augmenting, rather than replacing, current non-visual travel techniques, methods, and tools. Consequently, this work describes the design of an intuitive tactile language and a series of wearable tactile interfaces (the Haptic Chair, HaptWrap, and HapBack) to deliver real-time spatiotemporal data. The overall intuitiveness of the haptic mappings conveyed through the tactile interfaces is evaluated using a combination of absolute identification accuracy across a series of patterns and subjective feedback through post-experiment surveys. Two types of spatiotemporal representations are considered: static patterns, which represent object location at a single time instance, and dynamic patterns, added in the HaptWrap, which represent object movement over a time interval. Results support the viability of multi-dimensional haptics applied to the body to yield an intuitive understanding of dynamic interactions occurring around the navigator during travel.
Lastly, it is important to point out that the guiding principle of this work was to provide the navigator with spatial knowledge otherwise unattainable through current mobility techniques, methods, and tools, thus equipping the navigator to make informed navigation decisions independently, at a distance.
Contributors: Duarte, Bryan Joiner (Author) / McDaniel, Troy (Thesis advisor) / Davulcu, Hasan (Committee member) / Li, Baoxin (Committee member) / Venkateswara, Hemanth (Committee member) / Arizona State University (Publisher)
Created: 2020