Matching Items (4)
Description
This paper presents the results of an empirical analysis of deceptive data visualizations paired with explanatory text. Data visualizations are used to communicate information about important social issues to large audiences and are found in the news, on social media, and across the Internet (Kirk, 2012). Modern technology and software allow people and organizations to easily produce and publish data visualizations, contributing to their growing prevalence as a means of communicating important information (Sue & Griffin, 2016). Ethical transgressions in data visualization are the intentional or unintentional use of deceptive techniques with the potential to alter the audience’s understanding of the information being presented (Pandey et al., 2015). While many have discussed the importance of ethics in data visualization, scientists have only recently begun to examine how deceptive data visualizations affect the reader. This study was administered as an online user survey and was designed to test the deceptive potential of data visualizations when they are accompanied by a paragraph of text. The study consisted of a demographic questionnaire, a chart familiarity assessment, and a data visualization survey. A total of 256 participants completed the survey and were evenly distributed between a control (non-deceptive) survey and a test (deceptive) survey, in which participants were asked to observe a paragraph of text paired with a data visualization. Participants then answered a question about the observed information to measure how they interpreted it. The differences between demographic groups and their responses were analyzed to understand how these groups reacted to deceptive data visualizations compared to the control group. The results confirmed that deceptive techniques in data visualizations caused participants to misinterpret the information even when it was accompanied by a paragraph of explanatory text. Furthermore, certain demographic groups and comfort levels with chart types were more susceptible to particular deceptive techniques. These results highlight the importance of education and practice in data visualization to ensure that deceptive practices are not used and that potential misinformation is avoided, especially when the information presented can be called into question.
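The abstract does not list the specific deceptive techniques that were tested, so the sketch below is only a hypothetical illustration of one technique commonly cited in this literature (e.g., Pandey et al., 2015): truncating the y-axis of a bar chart to exaggerate a small difference. None of the study's actual stimuli are reproduced here.

```python
# Hypothetical illustration of a commonly cited deceptive technique
# (a truncated y-axis); not one of the study's actual stimuli.
import matplotlib.pyplot as plt

categories = ["Group A", "Group B"]
values = [52, 55]  # a modest 3-point difference

fig, (honest, deceptive) = plt.subplots(1, 2, figsize=(8, 3))

# Honest version: the y-axis starts at zero, so the bars look nearly equal.
honest.bar(categories, values)
honest.set_ylim(0, 60)
honest.set_title("Axis starts at 0")

# Deceptive version: the y-axis is truncated at 50, visually exaggerating the gap.
deceptive.bar(categories, values)
deceptive.set_ylim(50, 56)
deceptive.set_title("Axis truncated at 50")

plt.tight_layout()
plt.show()
```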
Contributors: O'Brien, Shaun (Author) / Laure, Claire (Thesis advisor) / Brumberger, Eva (Committee member) / D'Angelo, Barbara J. (Committee member) / Arizona State University (Publisher)
Created: 2017
Description
In this Barrett Honors Thesis, I developed a model to quantify the complexity of Sankey diagrams, a visualization technique that shows flow between groups. To do this, I created a carefully controlled dataset of synthetic Sankey diagrams of varying sizes as study stimuli. Two online crowdsourced user studies were then conducted and analyzed. User performance for Sankey diagrams of varying size and features (number of groups, number of timesteps, and number of flow crossings) was algorithmically modeled as a formula that quantifies the complexity of these diagrams. Model accuracy was measured against the performance of users in the second crowdsourced study. The results of my experiment conclusively demonstrate that the algorithmic complexity formula I created closely models the visual complexity of the Sankey diagrams in the dataset.
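The thesis derives its own complexity formula and coefficients, which the abstract does not reproduce. The following is only a minimal sketch of what a feature-based complexity score could look like, assuming a linear combination of the three named features with placeholder weights.

```python
# Hypothetical sketch of a feature-based complexity score for a Sankey
# diagram; the thesis fits its own formula and coefficients, which are
# not reproduced here -- these weights are illustrative placeholders.
from dataclasses import dataclass

@dataclass
class SankeyFeatures:
    num_groups: int       # distinct groups (nodes) per timestep
    num_timesteps: int    # number of columns in the diagram
    num_crossings: int    # number of flow (link) crossings

def complexity_score(f: SankeyFeatures,
                     w_groups: float = 1.0,
                     w_timesteps: float = 1.0,
                     w_crossings: float = 2.0) -> float:
    """Weighted linear combination of diagram features (assumed form)."""
    return (w_groups * f.num_groups
            + w_timesteps * f.num_timesteps
            + w_crossings * f.num_crossings)

# Example: a diagram with 5 groups, 3 timesteps, and 7 crossings.
print(complexity_score(SankeyFeatures(num_groups=5, num_timesteps=3, num_crossings=7)))
```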

Contributors: Ginjpalli, Shashank (Author) / Bryan, Chris (Thesis director) / Hsiao, Sharon (Committee member) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2021-05
Description

Grubhub's user reviews from the Apple iOS App Store were analyzed to provide alternative user experience (UX) solutions by answering the following questions:
1. How is Grubhub's mobile app meeting user expectations?
2. How can Grubhub improve the mobile app experience?

Contributors: Diaz, Samantha (Author) / Harris, LaVerne Abe (Degree committee member) / D'Angelo, Barbara J. (Degree committee member) / Mara, Andrew (Degree committee member)
Created: 2019-12-13
Description
While significant qualitative, user-study-focused research has been done on augmented reality, relatively few studies have examined multiple co-located users collaborating synchronously in augmented reality. Recognizing the need for more collaborative user studies in augmented reality and the value such studies present, a user study of collaborative decision-making in augmented reality is conducted to investigate the following research question: “Does presenting data visualizations in augmented reality influence the collaborative decision-making behaviors of a team?” This study evaluates how viewing data visualizations with augmented reality headsets impacts collaboration in small teams compared to viewing them together on a single 2D desktop monitor as a baseline. Teams of two participants performed closed- and open-ended evaluation tasks to collaboratively analyze data visualized both in augmented reality and on a desktop monitor. Multiple means of collecting and analyzing data were employed to develop a well-rounded context for the results and conclusions, including software logging of participant interactions, qualitative analysis of video recordings of participant sessions, and pre- and post-study participant questionnaires. The results indicate that augmented reality does not significantly change the quantity of team-member communication but does impact the means and strategies participants use to collaborate.
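The abstract mentions software logging of participant interactions as one data source but does not specify its format; the snippet below is a minimal, assumed sketch of timestamped JSON-lines event logging, with hypothetical event names and fields rather than the study's actual schema.

```python
# Minimal sketch of interaction logging of the kind described above;
# event names, fields, and file format here are assumptions, not the
# study's actual logging implementation.
import json
import time

def log_interaction(log_file, participant_id: str, event: str, **details) -> None:
    """Append one timestamped interaction event as a JSON line."""
    record = {"t": time.time(), "participant": participant_id,
              "event": event, **details}
    log_file.write(json.dumps(record) + "\n")

with open("session_log.jsonl", "a") as f:
    log_interaction(f, "P01", "select_chart", chart="scatterplot")
    log_interaction(f, "P02", "rotate_view", degrees=45)
```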
Contributors: Kintscher, Michael (Author) / Bryan, Chris (Thesis advisor) / Amresh, Ashish (Thesis advisor) / Hansford, Dianne (Committee member) / Johnson, Erik (Committee member) / Arizona State University (Publisher)
Created: 2020