This collection includes both ASU Theses and Dissertations, submitted by graduate students, and the Barrett, Honors College theses submitted by undergraduate students. 

Description

Many strides have been made in enabling technologies to help individuals with visual impairment live an independent life. The advent of smart devices and the participatory web has especially facilitated the possibility of new interactions to aid everyday tasks. Current systems, however, tend to be complex and require multiple cumbersome devices that invariably come with steep learning curves. Building new cyber-human systems with simple, integrated interfaces, while keeping in mind the specific requirements of the target users, would help alleviate their mundane yet significant daily needs. Navigation is one such significant need: it forms an integral part of everyday life and is one of the areas where individuals with visual impairment face the most discomfort. Little technology exists to help travelers navigate new routes. A number of research prototypes have been proposed, but none of them are available to the general population. This may be because they require special equipment and expertise to deploy, because trained professionals are needed to calibrate the devices, or simply because the systems are not scalable. Another area that needs assistance is education: much classroom and textbook material is not readily available in alternate formats. A further area that requires attention is information delivery in the age of Web 2.0. Popular websites like Facebook and Amazon are designed with sighted people as the target audience. While their pared-down mobile editions make it easier to navigate with screen readers, there is still a long way to go in making such websites truly accessible.
Contributors: Paladugu, Devi Archana (Author) / Li, Baoxin (Thesis advisor) / Hedgpeth, Terri (Committee member) / Atkinson, Robert (Committee member) / Walker, Erin (Committee member) / Arizona State University (Publisher)
Created: 2016
Description

Access to real-time situational information, including the relative position and motion of surrounding objects, is critical for safe and independent travel. Object or obstacle (OO) detection at a distance is primarily a task of the visual system due to the high-resolution information the eyes are able to receive from afar. As a sensory organ, the eyes have an unparalleled ability to adjust to varying degrees of light, color, and distance. Therefore, for a non-visual traveler, someone who is blind or has low vision, access to visual information is unattainable if it is positioned beyond the reach of the preferred mobility device or outside the path of travel. Although assistive technology in the form of electronic travel aids (ETAs) has received considerable attention over the last two decades, surprisingly little work has focused on augmenting, rather than replacing, current non-visual travel techniques, methods, and tools. Consequently, this work describes the design of an intuitive tactile language and a series of wearable tactile interfaces (the Haptic Chair, HaptWrap, and HapBack) to deliver real-time spatiotemporal data. The overall intuitiveness of the haptic mappings conveyed through the tactile interfaces is evaluated using a combination of absolute identification accuracy on a series of patterns and subjective feedback through post-experiment surveys. Two types of spatiotemporal representations are considered: static patterns, which represent object location at a single time instance, and dynamic patterns, added in the HaptWrap, which represent object movement over a time interval. Results support the viability of multi-dimensional haptics applied to the body to yield an intuitive understanding of dynamic interactions occurring around the navigator during travel. Lastly, the guiding principle of this work is to provide the navigator with spatial knowledge otherwise unattainable through current mobility techniques, methods, and tools, thus giving the navigator the information necessary to make informed navigation decisions independently and at a distance.
Contributors: Duarte, Bryan Joiner (Author) / McDaniel, Troy (Thesis advisor) / Davulcu, Hasan (Committee member) / Li, Baoxin (Committee member) / Venkateswara, Hemanth (Committee member) / Arizona State University (Publisher)
Created: 2020