This collection includes both ASU Theses and Dissertations, submitted by graduate students, and the Barrett, Honors College theses submitted by undergraduate students. 

Description
Humans rely on a complex interworking of visual, tactile, and proprioceptive feedback to accomplish even the simplest of daily tasks. These senses work together to provide information about the size, weight, shape, density, and texture of the objects being interacted with. While vision is heavily relied upon for many tasks, especially those involving accurate reaches, people can typically accomplish common daily skills without constant visual feedback, relying instead on tactile and proprioceptive cues. Amputees using prosthetic hands, however, do not currently have access to such cues, making these tasks impossible. This experiment was designed to test whether vibratory haptic cues could be used in place of tactile feedback to signal contact in a size discrimination task. Two experiments were run in which subjects were asked to identify changes in block size between consecutive trials, using either physical or virtual blocks to test the accuracy of size discrimination with tactile and haptic feedback, respectively. Blocks randomly increased or decreased in size in increments of 2 to 12 mm between trials in both experiments. The results showed that subjects were significantly better at detecting size changes using tactile feedback than vibratory haptic cues. This suggests that, while haptic feedback can technically be used to grasp and discriminate between objects of different sizes, it does not provide the same level of input as tactile cues.
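As a rough, hypothetical illustration of the trial structure described in the abstract above (the thesis does not provide code; the starting size, trial count, and continuous step sizes below are assumptions), this Python sketch generates a block-size sequence whose trial-to-trial changes are random increases or decreases of 2 to 12 mm:

```python
import random

def generate_block_sizes(n_trials=20, start_mm=60.0, min_step_mm=2.0,
                         max_step_mm=12.0, seed=None):
    """Generate a sequence of block sizes (in mm) in which each trial's block
    randomly increases or decreases by 2-12 mm relative to the previous trial,
    mirroring the protocol described in the abstract. The starting size, trial
    count, and continuous (rather than discrete) step sizes are assumptions."""
    rng = random.Random(seed)
    sizes = [start_mm]
    for _ in range(n_trials - 1):
        step = rng.uniform(min_step_mm, max_step_mm)  # magnitude of the size change
        direction = rng.choice([-1, 1])               # randomly grow or shrink
        sizes.append(sizes[-1] + direction * step)
    return sizes

# Example: a 10-trial sequence starting from a 60 mm block
print(generate_block_sizes(n_trials=10, seed=1))
```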
Contributors: Olson, Markey Cierra (Author) / Helms-Tillery, Stephen (Thesis director) / Buneo, Christopher (Committee member) / Barrett, The Honors College (Contributor) / Harrington Bioengineering Program (Contributor)
Created: 2015-05
Description
Humans constantly rely on a complex interaction of a variety of sensory modalities to complete even the simplest of daily tasks. For reaching and grasping movements used to interact with objects, the visual, tactile, and proprioceptive senses provide the majority of the information used. While vision is often relied on for many tasks, most people are able to accomplish common daily routines without constant visual attention, relying mainly on tactile and proprioceptive cues. Amputees using prosthetic arms, however, do not have access to these cues, making such tasks impossible without vision. Even tasks performed with vision can be incredibly difficult, as prosthesis users are unable to modulate grip force through touch and thus tend to grip objects excessively hard to make sure they do not slip.

Methods such as vibratory sensory substitution have shown promise for providing prosthesis users with a sense of contact and have proved helpful in completing motor tasks. In this thesis, two experiments were conducted to determine whether vibratory cues could be useful in discriminating between sizes. In the first experiment, subjects were asked to grasp a series of hidden virtual blocks of varying sizes, with vibrations on the fingertips as the indication of contact, and to compare the sizes of consecutive boxes. Vibratory haptic feedback significantly increased the accuracy of size discrimination over objects with only visual indication of contact, though accuracy was not as high as for typical grasping tasks with physical blocks. In the second experiment, subjects were asked to adjust their virtual finger position around a series of virtual boxes, again with vibratory feedback on the fingertips, using either finger movement or EMG. EMG control yielded significantly lower accuracy in size discrimination, implying that, while proprioceptive feedback alone is not enough to determine size, direct kinesthetic information about finger position is still needed.
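Purely as an illustrative sketch, and not the thesis's actual implementation (the linear mapping, 100 mm maximum aperture, and normalized control signal are assumptions), the code below shows one way a finger-movement or EMG-derived control signal could drive a virtual finger aperture and trigger a vibratory contact cue when the virtual fingers close onto a box:

```python
def virtual_aperture_mm(control_signal, max_aperture_mm=100.0):
    """Map a normalized control signal (0-1), taken from either finger flexion
    or a rectified, low-pass-filtered EMG envelope, to a virtual finger aperture.
    The linear mapping and 100 mm maximum aperture are illustrative assumptions."""
    control_signal = min(max(control_signal, 0.0), 1.0)
    return max_aperture_mm * (1.0 - control_signal)

def contact_cue(aperture_mm, box_size_mm):
    """Return True when the virtual fingers have closed onto the box, i.e. when
    a vibratory 'contact' cue would be delivered to the fingertips."""
    return aperture_mm <= box_size_mm

# Example: closing the virtual hand until the contact cue fires for a 40 mm box
for signal in (0.0, 0.3, 0.6, 0.9):
    aperture = virtual_aperture_mm(signal)
    print(f"signal={signal:.1f}  aperture={aperture:.0f} mm  "
          f"contact={contact_cue(aperture, 40.0)}")
```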
Contributors: Olson, Markey (Author) / Helms-Tillery, Stephen (Thesis advisor) / Buneo, Christopher (Committee member) / Santello, Marco (Committee member) / Arizona State University (Publisher)
Created: 2016