Matching Items (4)

Description
What if there were a way to integrate prosthetics seamlessly with the human body, and robots could help improve the lives of children with disabilities? With physical human-robot interaction appearing in many areas of life, including industry, medicine, and social settings, how these robots interact with humans becomes even more important. How smoothly a robot can interact with a person will determine how safe and efficient the relationship will be. This thesis investigates an adaptive control method that allows a robot to adapt to the human's actions based on the interaction force, making the relationship more effortless and less strained when the robot has a different goal than the human, as framed in game theory, using multiple techniques that adapt the system. A few possible applications include robots in physical therapy, manufacturing robots that can adapt to a changing environment, and robots teaching people something new, such as dancing or learning how to walk again after surgery.

The experience gained includes an understanding of how a system's cost function works, accounting for the tracking error, the speed of the system, the robot's effort, and the human's effort. This two-agent system results in a two-agent adaptive impedance model with an input for each agent. That leads to a nontraditional linear quadratic regulator (LQR) that must be separated and then recombined, yielding a traditional LQR. This experience can be used in the future to help build better safety protocols for manufacturing robots. The knowledge gained from this research could also be used to develop technologies that allow a robot to adapt in order to counteract human error.
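The recombination step described above, stacking both agents' inputs so the two-agent problem can be solved as a standard LQR, can be sketched as follows. This is a minimal illustration, not the thesis's actual model: the impedance matrices, the cost weights, and the assumption that both agents act through the same force channel are all invented for the example.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Hypothetical 1-DOF impedance model: state x = [position, velocity].
A = np.array([[0.0, 1.0],
              [-2.0, -1.0]])        # illustrative stiffness/damping
B_robot = np.array([[0.0], [1.0]])  # robot force input
B_human = np.array([[0.0], [1.0]])  # human force input

# Stack the two agents' input matrices side by side, turning the
# two-agent problem into a single standard LQR problem.
B = np.hstack([B_robot, B_human])

Q = np.diag([10.0, 1.0])  # penalize tracking error and speed
R = np.diag([1.0, 1.0])   # penalize robot effort and human effort

# Solve the algebraic Riccati equation for the combined system,
# then split the optimal gain back into one row per agent.
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)
K_robot, K_human = K[0:1, :], K[1:2, :]
```

Each agent then applies its own feedback law (u_robot = -K_robot x, u_human = -K_human x), so the separated gains together recover the behavior of the combined regulator.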
Contributors: Bell, Rebecca C (Author) / Zhang, Wenlong (Thesis advisor) / Chiou, Erin (Committee member) / Aukes, Daniel (Committee member) / Arizona State University (Publisher)
Created: 2019
Description
With the growing prevalence of autonomous vehicles, it is important to understand the relationship between autonomous vehicles and the other drivers around them. More specifically, how does one's knowledge about autonomous vehicles (AVs) affect positive and negative affect toward driving in their presence? Furthermore, how does trust in autonomous vehicles correlate with those emotions? These questions were addressed by conducting a survey to measure participants' positive affect, negative affect, and trust when driving in the presence of autonomous vehicles. Participants were issued a pretest measuring existing knowledge of autonomous vehicles, followed by measures of affect and trust. After completing this pretest portion of the study, participants were given information about how autonomous vehicles work and were then presented with a posttest identical to the pretest. The educational intervention had no effect on positive or negative affect, though there was a positive relationship between positive affect and trust and a negative relationship between negative affect and trust. These findings will inform future research on trust and autonomous vehicles using a test bed developed at Arizona State University, which allows researchers to examine the behavior of multiple participants at the same time and to include autonomous vehicles in studies.
Contributors: Martin, Sterling (Author) / Cooke, Nancy J. (Thesis advisor) / Chiou, Erin (Committee member) / Gray, Robert (Committee member) / Arizona State University (Publisher)
Created: 2019
Description
Human-robot teams (HRTs) have seen more frequent use over the past few years, specifically in the context of search and rescue (SAR) environments. Trust is an important factor in the success of HRTs, and both trust and reliance must be appropriately calibrated for the human operator to work effectively with a robot teammate. In highly complex and time-restrictive environments, such as a search and rescue mission following a disaster, uncertainty information may be given by the robot in the form of confidence to help properly calibrate trust and reliance. This study examines the impact that confidence information may have on trust and how it may help calibrate reliance in complex HRTs. Trust and reliance data were gathered using a simulated SAR task environment; participants received confidence information from the robot for one of two missions. Results indicated that trust was higher when participants received confidence information from the robot; however, no clear relationship between confidence and reliance was found. The findings from this study can be used to further improve human-robot teaming in search and rescue tasks.
Contributors: Wolff, Alexandra (Author) / Cooke, Nancy J (Thesis advisor) / Chiou, Erin (Committee member) / Gray, Rob (Committee member) / Arizona State University (Publisher)
Created: 2022
Description
The current study explores factors affecting trust in human-drone collaboration. A gap exists in research on civilian drone use and the role of trust in human-drone interaction and collaboration; specifically, existing research lacks an explanation of the relationship between drone pilot experience, trust, trust-related behaviors, and other factors. Using two dimensions of trust in human-automation teams, purpose and performance, the effects of experience on drone design and trust are studied to explore factors that may contribute to such a model. An online survey examined civilian drone operators' experience, familiarity, expertise, and trust in commercially available drones. It was predicted that factors of prior experience (familiarity, self-reported expertise) would have a significant effect on trust in drones. The choice to use or exclude the drone propellers in a search-and-identify scenario, paired with the pilots' experience with drones, would further confirm the relevance of the trust dimensions of purpose versus performance in the human-drone relationship. If the pilot has a positive sense of purpose and benevolence with the drone, the pilot trusts that the drone has positive intent toward them and the task. If the pilot trusts the performance of the drone, they believe the drone has the skill to do the task. The researcher found no significant differences between mean trust scores across levels of familiarity, but did find some interaction between self-reported expertise, familiarity, and trust. Future research should explore more concrete measures of situational participant factors, such as self-confidence and expertise, to understand their role in civilian pilots' trust in their drones.
Contributors: Niichel, Madeline Kathleen (Author) / Chiou, Erin (Thesis advisor) / Cooke, Nancy J. (Committee member) / Craig, Scotty (Committee member) / Arizona State University (Publisher)
Created: 2019