Matching Items (3)
Description
There are two common cognitive distortions present in risky decision-making behavior. The gambler's fallacy is the notion that a random game of chance is potentially biased by previous outcomes, and the near-miss effect is the overestimation of the probability of winning immediately after barely missing a win. This study replicated a portion of the methods of Clark et al. (2014) in an attempt to support the presence of these two fallacies in online simulated risky decision-making tasks. One hundred individuals were recruited and asked to perform one of two classic gambling tasks: predicting the outcome of a dichromatic roulette wheel or spinning a simplified, two-reel slot machine. An analysis of color predictions as a function of run length revealed a classic gambler's fallacy effect in the roulette wheel task. A heightened motivation to continue playing after a win, but not after a near or full miss, was seen in the slot machine task. How pleased an individual was with the results of the previous round directly affected his or her interest in continuing to play in both experiments. These findings indicate that the gambler's fallacy is present in online decision-making simulations involving risk, but that the near-miss effect is not.
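The statistical point underlying the gambler's fallacy can be checked directly: on a fair two-color wheel, the next outcome is independent of any preceding run. The sketch below is not part of the study's methods; it is a minimal Monte Carlo illustration (function name and parameters are hypothetical) estimating the probability of red immediately after a run of blacks of a given length.

```python
import random

def next_outcome_probability(run_length, trials=100_000, seed=0):
    """Estimate P(red on the next spin) given that the previous
    `run_length` spins were all black, on a fair two-color wheel.
    Because spins are independent, the estimate stays near 0.5
    no matter how long the preceding run is."""
    rng = random.Random(seed)
    hits = total = 0
    run = 0  # current run of consecutive blacks before this spin
    for _ in range(trials):
        spin = rng.choice(["red", "black"])
        if run >= run_length:
            total += 1
            if spin == "red":
                hits += 1
        run = run + 1 if spin == "black" else 0
    return hits / total
```

Running this with `run_length` values of 1, 3, or 5 gives estimates clustered around 0.5, which is exactly why treating a long run as "due" for a reversal is a fallacy.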
Contributors: Catinchi, Alexis Leigh (Author) / McClure, Samuel (Thesis director) / Glenberg, Arthur (Committee member) / Gatewood, Kira (Committee member) / School of Life Sciences (Contributor) / Department of Psychology (Contributor) / Barrett, The Honors College (Contributor)
Created: 2017-05
Description
The responses to idealized cases of peer disagreement given in the peer disagreement literature are presented as though those responses ought to be applied to real-world cases of disagreement. In order to apply the advice given in the literature to actual disagreement situations, one must first confidently identify one’s epistemic peers. Previous work in the literature, especially by Nathan King, suggests that one cannot confidently identify one’s epistemic peers in real-world cases of disagreement because it is unlikely that any two people will ever meet the idealized conditions of peerhood in real-world disagreements. I argue that due to the unconscious judgment-altering effects of certain cognitive biases, even if one could consciously meet the idealized conditions for epistemic peerhood as they are outlined in the peer disagreement literature, one should still not be confident that one has correctly identified others as one’s epistemic peers. I give examples of how cognitive biases can affect one’s judgments of one’s own epistemic abilities and the epistemic abilities of others, and I conclude that the peer disagreement literature’s prescriptions may not be suitable for, and are perhaps deleterious to, rational real-world disagreement resolution.
Contributors: Betts, Adam (Author) / Ballantyne, Nathan (Thesis advisor) / King, Nathan (Committee member) / Kung, Peter (Committee member) / Phillips, Ben (Committee member) / Arizona State University (Publisher)
Created: 2023
Description
Cyber threats are growing in number and sophistication, making it important to continually study and improve all dimensions of cyber defense. Human teamwork in cyber defense analysis has been overlooked even though it has been identified as an important predictor of cyber defense performance. Also, to detect advanced forms of threats, effective information sharing and collaboration between cyber defense analysts become imperative. Therefore, through this dissertation work, I took a cognitive engineering approach to investigating and improving cyber defense teamwork. The approach involved investigating a plausible team-level bias called the information pooling bias in cyber defense analyst teams conducting the detection task that is part of forensics analysis, through human-in-the-loop experimentation. The approach also involved developing agent-based models based on the experimental results to explore the cognitive underpinnings of this bias in human analysts. A prototype collaborative visualization tool was developed by considering the plausible cognitive limitations contributing to the bias, to investigate whether a cognitive engineering-driven visualization tool can help mitigate the bias in comparison to off-the-shelf tools. It was found that participant teams conducting the collaborative detection tasks as part of forensics analysis experience the information pooling bias, which affects their performance. Results indicate that cognitively friendly visualizations can help mitigate the effect of this bias in cyber defense analysts. Agent-based modeling produced insights on internal cognitive processes that might be contributing to this bias, which could be leveraged in building future visualizations.
This work has multiple implications, including the development of new knowledge about the science of cyber defense teamwork, a demonstration of the advantage of developing tools using a cognitive engineering approach, a demonstration of the advantage of using a hybrid cognitive engineering methodology to study teams in general, and finally a demonstration of the effect of effective teamwork on cyber defense performance.
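The information pooling bias the dissertation investigates can be illustrated with a toy sampling model. The sketch below is not the dissertation's agent-based model; it is a minimal hypothetical illustration of the classic mechanism, in which items known to every team member get raised in discussion far more often than items held by a single member, simply because more agents can sample them.

```python
import random

def pooled_mentions(n_agents=3, shared=4, unique_per_agent=2,
                    rounds=200, seed=1):
    """Toy sketch of the information pooling bias: each round, a
    random agent mentions a random item from its own memory.
    Items held by all agents ('shared') dominate the discussion
    over items held by one agent ('unique')."""
    rng = random.Random(seed)
    shared_items = [f"s{i}" for i in range(shared)]
    # Each agent knows all shared items plus its own unique items.
    memories = [shared_items + [f"u{a}_{i}" for i in range(unique_per_agent)]
                for a in range(n_agents)]
    counts = {"shared": 0, "unique": 0}
    for _ in range(rounds):
        agent = rng.randrange(n_agents)
        item = rng.choice(memories[agent])
        counts["shared" if item.startswith("s") else "unique"] += 1
    return counts
```

With these example parameters, each agent's memory is two-thirds shared items, so shared information is mentioned roughly twice as often in aggregate even though no agent prefers it, which is the pooling failure a collaborative visualization tool aims to counteract.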
Contributors: Rajivan, Prashanth (Author) / Cooke, Nancy J. (Thesis advisor) / Ahn, Gail-Joon (Committee member) / Janssen, Marcus (Committee member) / Arizona State University (Publisher)
Created: 2014