Matching Items (4)
Description
Crowdsourcing has become a popular method for collecting information from a large group of people at low cost. This thesis examines whether crowdsourcing is a viable way to collect data for Human Computer Interaction research and compares collaborative crowdsourcing with individual crowdsourcing. It was hypothesized that collaborative crowdsourcing would produce higher-quality results than individual crowdsourcing due to intrinsic motivation. The research draws on three instruments to measure the two groups: a top-10 usability problem list, a heuristic evaluation, and the WAMMI survey. The two groups used these tools to analyze the website Phoenix.Craigslist.com. The results were compared against each other and against a heuristic evaluation score given by an HCI researcher to determine their accuracy. The results of the experiment failed to confirm the hypothesis: both groups provided accurate results and differed only marginally from each other.
Contributors: Gupta, Kartik (Author) / Atkinson, Robert (Thesis director) / Chavez-Echeagaray, Maria Elena (Committee member) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2018-05
Description
The area of real-time baseball statistics presents several challenges that can be addressed using mobile devices. In order to accurately record real-time statistics, it is necessary to present the user with a concise interface that can be used to quickly record the necessary data during in-game events. In this project, we use a mobile application to address this by separating the required input into pre-game and in-game inputs. We also explore the use of a mobile application to leverage crowdsourcing techniques, which address the challenge of accuracy and precision in subjective real-time statistics.
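The pre-game/in-game split described above can be sketched as a simple data model: slow-to-enter context (lineups, venue) is captured once before the game, so each in-game event needs only a few taps. All class and field names below are illustrative assumptions, not taken from the thesis.

```python
from dataclasses import dataclass, field

@dataclass
class PreGameSetup:
    """Data entered once, before the game starts."""
    home_lineup: list[str]
    away_lineup: list[str]
    venue: str

@dataclass
class InGameEvent:
    """A single quick-entry event recorded during play."""
    inning: int
    batter: str
    outcome: str  # e.g. "1B", "K", "BB"

@dataclass
class GameRecord:
    setup: PreGameSetup
    events: list[InGameEvent] = field(default_factory=list)

    def record(self, inning: int, batter: str, outcome: str) -> None:
        # In-game entry needs only three fields because lineups and
        # context were already captured in the pre-game step.
        self.events.append(InGameEvent(inning, batter, outcome))

    def batting_line(self, batter: str) -> dict[str, int]:
        # Derive simple aggregate statistics from the raw event stream.
        hits = sum(1 for e in self.events
                   if e.batter == batter and e.outcome in {"1B", "2B", "3B", "HR"})
        at_bats = sum(1 for e in self.events
                      if e.batter == batter and e.outcome not in {"BB", "HBP"})
        return {"AB": at_bats, "H": hits}
```

Keeping raw events rather than running totals means any statistic can be recomputed later, which is one plausible way to reconcile conflicting crowd-sourced entries.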
Contributors: Van Egmond, Eric David (Author) / Tadayon-Navabi, Farideh (Thesis director) / Wilkerson, Kelly (Committee member) / Gorla, Mark (Committee member) / Barrett, The Honors College (Contributor) / Computer Science and Engineering Program (Contributor)
Created: 2013-05
Description
This study aims to combine the wisdom of crowds with machine learning (ML) to make more accurate stock price predictions for a select set of stocks. Unlike prior work, this study uses different input elicitation techniques to improve crowd performance. In addition, machine learning is used to support the crowd: the influence of ML on the crowd is tested by priming participants with suggestions from an ML model. Lastly, market conditions and stock popularity are observed to better understand crowd behavior.
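At its simplest, combining crowd estimates with an ML suggestion can be done by blending a robust crowd consensus with the model's output. The median consensus and the fixed blending weight below are illustrative assumptions, not the study's actual method.

```python
import statistics

def crowd_ml_prediction(crowd_estimates: list[float],
                        ml_suggestion: float,
                        ml_weight: float = 0.3) -> float:
    """Blend the crowd's consensus estimate with an ML model's suggestion.

    The median is used as the crowd consensus because it is robust to
    outlier estimates; ml_weight controls how much the model influences
    the final prediction.
    """
    crowd_consensus = statistics.median(crowd_estimates)
    return (1 - ml_weight) * crowd_consensus + ml_weight * ml_suggestion
```

Note that in the study, the ML suggestion is shown to participants *before* they estimate (priming), which shifts the crowd's inputs themselves; the post-hoc blend above is only the simplest way to illustrate combining the two signals.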
Contributors: Bhogaraju, Harika (Author) / Escobedo, Adolfo R (Thesis director) / Meuth, Ryan (Committee member) / Barrett, The Honors College (Contributor) / Computer Science and Engineering Program (Contributor)
Created: 2022-12
Description
Collecting accurate collective decisions via crowdsourcing is challenging due to cognitive biases, varying worker expertise, and varying subjective scales. This work investigates new ways to determine collective decisions by prompting users to provide input in multiple formats. A crowdsourced task is created that aims to determine ground truth by collecting information in two different ways: rankings and numerical estimates. Results indicate that accurate collective decisions can be achieved with fewer people when ordinal and cardinal information is collected and aggregated together using consensus-based, multimodal models. We also show that presenting users with larger problems produces more valuable ordinal information and is a more efficient way to collect an aggregate ranking. As a result, we suggest that input elicitation be more widely considered for future work in crowdsourcing and incorporated into future platforms to improve accuracy and efficiency.
Contributors: Kemmer, Ryan Wyeth (Author) / Escobedo, Adolfo (Thesis director) / Maciejewski, Ross (Committee member) / Computing and Informatics Program (Contributor) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2020-05