Matching Items (4)
Description
Highly automated vehicles require drivers to remain aware enough to take over during critical events. Driver distraction is a key factor that prevents drivers from reacting adequately, and thus there is a need for an alert to help drivers regain situational awareness and act quickly and successfully should a critical event arise. This study examines two aspects of alerts that could help facilitate driver takeover: mode (auditory and tactile) and direction (towards and away). Auditory alerts appear to be somewhat more effective than tactile alerts, though both modes produce significantly faster reaction times than no alert. Alerts moving towards the driver also appear to be more effective than alerts moving away from the driver. Future research should examine how multimodal alerts differ from single-mode alerts, and whether higher-fidelity alerts influence takeover times.
Contributors: Brogdon, Michael A. (Author) / Gray, Robert (Thesis advisor) / Branaghan, Russell (Committee member) / Chiou, Erin (Committee member) / Arizona State University (Publisher)
Created: 2018
Description
Reading partners’ actions correctly is essential for successful coordination, but interpretation does not always reflect reality. Attribution biases, such as self-serving and correspondence biases, lead people to misinterpret their partners’ actions and falsely assign blame after an unexpected event. These biases thus further influence people’s trust in their partners, including machine partners. The increasing capabilities and complexity of machines allow them to work physically with humans. However, these improvements may interfere with people’s ability to accurately calibrate trust in machines and their capabilities, which requires an understanding of how attribution biases affect human-machine coordination. Specifically, the current thesis explores how the development of trust in a partner is influenced by attribution biases and people’s assignment of blame for a negative outcome. This study also suggests how a machine partner should be designed to react to environmental disturbances and to report the appropriate level of information about external conditions.
Contributors: Hsiung, Chi-Ping (M.S.) (Author) / Chiou, Erin (Thesis advisor) / Cooke, Nancy J. (Thesis advisor) / Zhang, Wenlong (Committee member) / Arizona State University (Publisher)
Created: 2019
Description
With the growing prevalence of autonomous vehicles, it is important to understand the relationship between autonomous vehicles and the other drivers around them. More specifically, how does one’s knowledge about autonomous vehicles (AVs) affect positive and negative affect towards driving in their presence? Furthermore, how does trust in autonomous vehicles correlate with those emotions? These questions were addressed by conducting a survey measuring participants’ positive affect, negative affect, and trust when driving in the presence of autonomous vehicles. Participants were issued a pretest measuring existing knowledge of autonomous vehicles, followed by measures of affect and trust. After completing the pretest portion of the study, participants were given information about how autonomous vehicles work and were then presented with a posttest identical to the pretest. The educational intervention had no effect on positive or negative affect, though there was a positive relationship between positive affect and trust and a negative relationship between negative affect and trust. These findings will inform future research on trust and autonomous vehicles using a test bed developed at Arizona State University, which allows researchers to examine the behavior of multiple participants at the same time and to include autonomous vehicles in studies.
Contributors: Martin, Sterling (Author) / Cooke, Nancy J. (Thesis advisor) / Chiou, Erin (Committee member) / Gray, Robert (Committee member) / Arizona State University (Publisher)
Created: 2019
Description
The current study explores factors affecting trust in human-drone collaboration. A gap exists in research on civilian drone use and the role of trust in human-drone interaction and collaboration. Specifically, existing research lacks an explanation of the relationship between drone pilot experience, trust, trust-related behaviors, and other factors. Using two dimensions of trust in human-automation teams, purpose and performance, the effects of experience on drone design and trust are studied to explore factors that may contribute to such a model. An online survey examined civilian drone operators’ experience, familiarity, expertise, and trust in commercially available drones. It was predicted that factors of prior experience (familiarity, self-reported expertise) would have a significant effect on trust in drones, and that the choice to use or exclude the drone propellers in a search-and-identify scenario, paired with the pilots’ experience with drones, would further confirm the relevance of the trust dimensions of purpose versus performance in the human-drone relationship. If a pilot has a positive sense of purpose and benevolence with the drone, the pilot trusts that the drone has positive intent towards them and the task. If the pilot trusts the performance of the drone, they believe the drone has the skill to do the task. The researcher found no significant differences between mean trust scores across levels of familiarity, but did find some interaction between self-reported expertise, familiarity, and trust. Future research should explore more concrete measures of situational participant factors, such as self-confidence and expertise, to understand their role in civilian pilots’ trust in their drones.
Contributors: Niichel, Madeline Kathleen (Author) / Chiou, Erin (Thesis advisor) / Cooke, Nancy J. (Committee member) / Craig, Scotty (Committee member) / Arizona State University (Publisher)
Created: 2019