Barrett, The Honors College at Arizona State University proudly showcases the work of undergraduate honors students by sharing this collection exclusively with the ASU community.

Barrett accepts high-performing, academically engaged undergraduate students and works with them in collaboration with all of the other academic units at Arizona State University. All Barrett students complete a thesis or creative project, which is an opportunity to explore an intellectual interest and produce an original piece of scholarly research. The thesis or creative project is supervised by and defended before a faculty committee. Students are able to engage with professors who are nationally recognized in their fields and committed to working with honors students. Completing a Barrett thesis or creative project is an opportunity for undergraduate honors students to contribute to the ASU academic community in a meaningful way.

Description
The inherent risk in testing drugs has been hotly debated since the government first started regulating the drug industry in the early 1900s. Viewed through society's lens, it is unclear who can assume the risks associated with trying new pharmaceuticals. In the mid-twentieth century, the US Food and Drug Administration (FDA) published several guidance documents encouraging researchers to exclude women from early clinical drug research. The motivation to publish those documents, and the subsequent guidance documents in which the FDA and other regulatory offices established their standpoints on women in drug research, may have been connected to the events of the time. The problem of whether women should be involved in drug research is a question of who can assume risk and who is responsible for disseminating what specific kinds of information. The problem tends to be framed as one that sets the health of women and the health of fetuses in opposition. That opposition, coupled with the inherent uncertainty in testing drugs, creates a complex set of issues surrounding consent and access to information.
Contributors: Meek, Caroline Jane (Author) / Maienschein, Jane (Thesis director) / Brian, Jennifer (Committee member) / School of Life Sciences (Contributor) / Sanford School of Social and Family Dynamics (Contributor) / Barrett, The Honors College (Contributor)
Created: 2018-05
Description
In this project, the use of deep neural networks for selecting actions to execute within an environment to achieve a goal is explored. Scenarios like this are common in crafting-based games such as Terraria or Minecraft. Goals in these environments have recursive sub-goal dependencies, which form a dependency tree. An agent operating within these environments has access to only a small amount of data about the environment before interacting with it, so it is crucial that the agent be able to effectively use the tree of dependencies and its environmental surroundings to judge which sub-goals are most efficient to pursue at any point in time. A successful agent aims to minimize cost when completing a given goal. A deep neural network combined with Q-learning techniques was employed as the agent in this environment. This agent consistently performed better than agents using alternate models (models that used dependency-tree heuristics or human-like approaches to make sub-goal-oriented choices), with an average performance advantage of 33.86% (standard deviation 14.69%) over the best alternate agent. This shows that machine learning techniques can be consistently employed to make goal-oriented choices within an environment with recursive sub-goal dependencies and little prior information.
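A minimal sketch of the kind of approach the abstract describes, pairing a small neural network with a Q-learning update to score crafting sub-goals, is shown below. The sub-goal names, state features, reward values, and network sizes are illustrative assumptions, not details taken from the thesis.

import numpy as np

# Hypothetical crafting sub-goals; the state vector stands in for inventory/world features.
SUBGOALS = ["gather_wood", "craft_planks", "craft_table", "craft_pickaxe"]
STATE_DIM, HIDDEN, N_ACTIONS = 8, 16, len(SUBGOALS)

rng = np.random.default_rng(0)
W1 = rng.normal(0, 0.1, (STATE_DIM, HIDDEN))   # input -> hidden weights
W2 = rng.normal(0, 0.1, (HIDDEN, N_ACTIONS))   # hidden -> Q-value weights

def q_values(state):
    # Tiny two-layer network mapping a state vector to one Q-value per sub-goal.
    h = np.maximum(0.0, state @ W1)            # ReLU hidden layer
    return h @ W2, h

def td_update(state, action, reward, next_state, gamma=0.95, lr=0.01):
    # One Q-learning step: move Q(s, a) toward reward + gamma * max_a' Q(s', a').
    global W1, W2
    q, h = q_values(state)
    q_next, _ = q_values(next_state)
    error = reward + gamma * np.max(q_next) - q[action]
    grad_q = np.zeros(N_ACTIONS)
    grad_q[action] = -error                    # gradient of 0.5 * error^2 w.r.t. q[action]
    dW2 = np.outer(h, grad_q)
    dW1 = np.outer(state, (grad_q @ W2.T) * (h > 0))
    W2 -= lr * dW2
    W1 -= lr * dW1

def choose_subgoal(state, epsilon=0.1):
    # Epsilon-greedy choice of which sub-goal to pursue next.
    if rng.random() < epsilon:
        return int(rng.integers(N_ACTIONS))
    return int(np.argmax(q_values(state)[0]))

state = rng.random(STATE_DIM)
action = choose_subgoal(state)
td_update(state, action, reward=-1.0, next_state=rng.random(STATE_DIM))
print("next sub-goal:", SUBGOALS[action])

In a full agent, the reward would reflect the cost of pursuing each sub-goal, so minimizing cost corresponds to maximizing the learned Q-values.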
Contributors: Koleber, Derek (Author) / Acuna, Ruben (Thesis director) / Bansal, Ajay (Committee member) / W.P. Carey School of Business (Contributor) / Software Engineering (Contributor) / Barrett, The Honors College (Contributor)
Created: 2018-05
Description
Pandora is a play exploring our relationship with gendered technology through the lens of artificial intelligence. Can women be subjective under patriarchy? Do robots who look like women have subjectivity? Hoping to create a better version of ourselves, The Engineer must navigate the loss of her creation, and Pandora must navigate their new world. The play premiered March 27-28, 2018, with an original cast of Caitlin Andelora, Rikki Tremblay, and Michael Tristano Jr.
Contributors: Toye, Abigail Elizabeth (Author) / Linde, Jennifer (Thesis director) / Abele, Kelsey (Committee member) / Department of Information Systems (Contributor) / Economics Program in CLAS (Contributor) / Barrett, The Honors College (Contributor)
Created: 2018-05
Description
This thesis dives into the world of artificial intelligence by exploring the functionality of a single-layer artificial neural network through a simple housing price classification example, while simultaneously considering its impact from a data management perspective at both the software and hardware levels. To begin, the widely accepted model of an artificial neuron is broken down into its key components and analyzed for functionality by relating it back to its biological counterpart. The role of a neuron is then described in the context of a neural network, with equal emphasis placed on how training proceeds for an individual neuron and for an entire network. Using supervised learning, the neural network is trained on three main factors for housing price classification: total number of rooms, bathrooms, and square footage. Once trained on most of the generated data set, it is tested for accuracy by introducing the remainder of the data set and observing how closely its computed output for each set of inputs matches the target value. From a programming perspective, the artificial neuron is implemented in C so that it is more closely tied to the operating system, which makes the profiler data collected during the program's execution more precise. The program is designed to break each stage of the neuron's training process into distinct functions. In addition to this functional decomposition, the struct data type is used as the underlying data structure of the project, both to represent the neuron and to hold its training and test data. Once the neuron is fully trained, its test results are graphed to visually depict how well it learned from the sample training set. Finally, the profiler data is analyzed to describe how the program operated from a data management perspective at the software and hardware levels.
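The thesis implements the neuron in C using structs; as a rough illustration of the same single-neuron, supervised-learning idea, the Python sketch below trains one neuron on made-up housing data (rooms, bathrooms, square footage). The data, learning rate, and sigmoid activation are assumptions for the example only.

import numpy as np

rng = np.random.default_rng(1)

# Made-up training data: [rooms, bathrooms, square footage]; label 1 = "expensive".
X = np.array([[3, 2, 1500], [5, 3, 2600], [2, 1, 900], [4, 3, 2200]], dtype=float)
y = np.array([0, 1, 0, 1], dtype=float)

X /= X.max(axis=0)                   # scale features so the sigmoid does not saturate

w = rng.normal(0, 0.1, X.shape[1])   # one weight per input, mirroring the neuron model
b = 0.0

def neuron(x):
    # Weighted sum of inputs plus bias, squashed by a sigmoid activation.
    return 1.0 / (1.0 + np.exp(-(x @ w + b)))

# Supervised training: nudge the weights so each output moves toward its target.
for epoch in range(2000):
    for x, target in zip(X, y):
        out = neuron(x)
        grad = (target - out) * out * (1 - out)   # error times sigmoid derivative
        w += 0.5 * grad * x
        b += 0.5 * grad

print([round(float(neuron(x)), 2) for x in X])    # outputs should move toward [0, 1, 0, 1]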
Contributors: Richards, Nicholas Giovanni (Author) / Miller, Phillip (Thesis director) / Meuth, Ryan (Committee member) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2018-05
Description
In the medical industry, there have been promising advances in delivering new types of healthcare to the public. As of 2015, there was a 98% Premarket Approval rate, a 38% increase since 2010. In addition, 41 new novel drugs were approved for clinical usage in 2014, compared with an average of 25 per year from 2005-2013. However, the research process for creating and delivering new healthcare to the public remains remarkably inefficient. Discovering a novel drug for clinical usage takes on average 15 years and, by one estimate, over $900 million, with a success rate of less than 12%. Medical devices do not fare much better: between 2005-2009 there were over 700 recalls per year, and a 510(k)-exempt premarket approval takes a minimum of 3.25 years. In addition, a time lag exists in which only 14% of medical discoveries are implemented clinically, and doing so takes 17 years. Coupled with these inefficiencies, government funding for medical research has been decreasing since 2002 (2.5% of Gross Domestic Product) and is predicted to fall to 1.5% of Gross Domestic Product by 2019. Translational research, defined simply as the conversion of bench-side discoveries to clinical usage, has been on the rise since the 1990s and may be driving the increased premarket approvals and new novel drug approvals. At the very least, it is worth considering, as translational research is directly related to healthcare practices. In this paper, I propose to improve the outcomes of translational research in order to better deliver advancing healthcare to the public. I suggest that the Best Value Performance Information Procurement System (BV PIPS) be adapted in the selection process for funding translational research projects. BV PIPS has been shown to increase the efficiency and success rate of delivering projects and services: over 17 years of research covering $6.3 billion in delivered projects and services shows 98% customer satisfaction, 90% reduced management effort, and 50% less manpower and effort. Using the University of Michigan-Coulter Foundation Program's funding process as a baseline and standard for the current selection of translational research projects to fund, I offer changes to this process based on BV PIPS that may improve it. Because the concepts implemented in this process are congruent with the literature on successful translational research, this new model for selecting translational research projects to fund may reduce costs, increase efficiency, and increase success rates. This may then lead to more Premarket Approvals, more new novel drug approvals, quicker delivery to market, and fewer recalls.
Contributors: Del Rosario, Joseph Paul (Author) / Kashiwagi, Dean (Thesis director) / Kashiwagi, Jacob (Committee member) / Harrington Bioengineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05
Description
This project explores the dimensions that affect the success of a nonprofit organization's web presence, using a dance and health nonprofit's website as the foundation of the research and redesign. This report includes literature and design research, analysis, recommendations, and a journal of the web design process. Through research, three categories were identified as the primary dimensions affecting the success of a website: content, technical adequacy, and appearance. Furthermore, website success is influenced by how much the strength of each category depends on the others. To improve the web design of Dancers and Health Together Inc., content implementations and redesign elements were based on both research and personal preference. The redesigned website can be found at www.collaydennis.com and will become inactive after May 31, 2015.
Contributors: Dennis, Collay Carole (Author) / Coleman, Grisha (Thesis director) / Hosmer, Anthony Ryan (Committee member) / Barrett, The Honors College (Contributor) / Walter Cronkite School of Journalism and Mass Communication (Contributor)
Created: 2015-05
Description
The objective of this research is to determine an approach for automating the learning of the initial lexicon used in translating natural language sentences to their formal knowledge representations based on lambda-calculus expressions. Using a universal knowledge representation and its associated parser, this research applies word alignment techniques to align natural language sentences with the linearized parses of their associated knowledge representations in order to learn the meanings of individual words. The work includes proposing and analyzing an approach that can be used to learn some of the initial lexicon.
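As a concrete illustration of the word-alignment idea, the sketch below runs an IBM-Model-1-style EM loop over toy sentence/parse pairs and reads off the most probable parse token for each word. The data, token names, and the specific alignment model are illustrative assumptions rather than the system actually used in the thesis.

from collections import defaultdict

# Toy parallel data: each sentence is paired with the tokens of a linearized parse.
pairs = [
    ("john runs".split(), ["john", "run"]),
    ("mary runs".split(), ["mary", "run"]),
    ("john sleeps".split(), ["john", "sleep"]),
]

t = defaultdict(lambda: 1.0)   # t[(parse_token, word)]: translation weight, uniform start

for _ in range(10):            # EM iterations
    count = defaultdict(float)
    total = defaultdict(float)
    for words, tokens in pairs:
        for tok in tokens:
            norm = sum(t[(tok, w)] for w in words)
            for w in words:
                frac = t[(tok, w)] / norm   # expected alignment of tok to word w
                count[(tok, w)] += frac
                total[w] += frac
    for (tok, w), c in count.items():       # re-estimate t from the expected counts
        t[(tok, w)] = c / total[w]

# Read off a candidate lexicon: the most probable parse token for each word.
candidates = defaultdict(set)
for words, tokens in pairs:
    for w in words:
        candidates[w].update(tokens)
for w in sorted(candidates):
    print(w, "->", max(candidates[w], key=lambda tok: t[(tok, w)]))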
Contributors: Baldwin, Amy Lynn (Author) / Baral, Chitta (Thesis director) / Vo, Nguyen (Committee member) / Industrial, Systems (Contributor) / Barrett, The Honors College (Contributor) / Computer Science and Engineering Program (Contributor)
Created: 2015-05
Description
Artificial intelligence (AI) is a burgeoning technology, industry, and field of study. While interest in its applications in marketing has not yet translated into widespread adoption, AI holds tremendous potential for vastly altering how marketing is done. As such, AI in marketing is a crucial topic to research. By analyzing its current applications, its potential use cases in the near future, how to implement it, and its areas for improvement, we can achieve a high-level understanding of AI's long-term implications in marketing. AI offers an improvement to current marketing tactics, as well as entirely new ways of creating and distributing value to customers. For example, programmatic advertising and social media marketing can allow for a more comprehensive view of customer behavior, predictive analytics, and deeper insights through integration with AI. New marketing tools like biometrics, voice, and conversational user interfaces offer novel ways to add value for brands and consumers alike. These innovations all share characteristics of hyper-personalization, efficient spending, scalable experiences, and deep insights. There are important issues that need to be addressed before AI is extensively implemented, including the potential for it to be used maliciously, its effects on job displacement, and the technology itself. The recent progression of AI in marketing indicates that it will soon be adopted by a majority of companies. The long-term implications of widespread implementation are crucial to consider, as an AI-powered industry entails fundamental changes to the skill sets required to thrive, the way marketers and brands work, and consumer expectations.
Contributors: Cannella, James (Author) / Ostrom, Amy (Thesis director) / Giles, Charles (Committee member) / Department of Marketing (Contributor) / Department of Management and Entrepreneurship (Contributor) / Barrett, The Honors College (Contributor)
Created: 2018-05
Description
As robotics technology advances, robots are being created for use in situations where they collaborate with humans on complex tasks. For this to be safe and successful, it is important to understand what causes humans to trust robots more or less during a collaborative task. This research project aims to investigate human-robot trust through a collaborative game of logic that can be played by a human and a robot together. This thesis details the development of a game of logic that could be used for this purpose. The game is based upon a popular game in AI research called 'Wumpus World'. The original Wumpus World game was a low-interactivity game to be played by humans alone. In this project, the Wumpus World game is modified for a high degree of interactivity with a human player, while also allowing the game to be played simultaneously by an AI algorithm.
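For reference, the sketch below shows a minimal Wumpus-World-style grid with the game's classic percepts (stench near the wumpus, breeze near a pit, glitter on the gold square). The layout and grid size are illustrative assumptions; the interactive human-plus-AI version built for the thesis is not reproduced here.

# Hypothetical 4x4 Wumpus World layout; coordinates and hazards are made up.
WORLD = {(0, 0): "start", (2, 1): "wumpus", (1, 2): "pit", (3, 3): "gold"}

def neighbors(cell):
    # The four orthogonally adjacent squares that stay inside the 4x4 grid.
    x, y = cell
    return [(x + dx, y + dy) for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
            if 0 <= x + dx < 4 and 0 <= y + dy < 4]

def percepts(cell):
    # Classic Wumpus World cues that both a human player and an AI agent would receive.
    p = set()
    for n in neighbors(cell):
        if WORLD.get(n) == "wumpus":
            p.add("stench")
        if WORLD.get(n) == "pit":
            p.add("breeze")
    if WORLD.get(cell) == "gold":
        p.add("glitter")
    return p

print(percepts((2, 2)))   # adjacent to both the wumpus and the pit: {'stench', 'breeze'}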
Contributors: Boateng, Andrew Owusu (Author) / Sodemann, Angela (Thesis director) / Martin, Thomas (Committee member) / Software Engineering (Contributor) / Engineering Programs (Contributor) / Barrett, The Honors College (Contributor)
Created: 2018-05
Description
The spread of fake news (rumors) has been a growing problem on the internet in the past few years due to the rise of social media services. People sometimes share fake news articles on social media without knowing that those articles contain false information. Not knowing whether an article is fake or real is a problem because it causes social media news to lose credibility. Prior research on fake news has focused on how to detect it, but efforts toward controlling fake news articles on the internet still face challenges: it is hard to collect large sets of fake news data, it is hard to collect the locations of people who are spreading fake news, and it is difficult to study the geographic distribution of fake news. To address these challenges, I examine how fake news spreads in the United States (US) by developing a geographic visualization system for misinformation. I collect a set of fake news articles from a website called snopes.com. After collecting these articles, I extract the keywords from each article and store them in a file. I then use the stored keywords to search on Twitter in order to find the locations of users who spread the rumors. Finally, I mark those locations on a map in order to show the geographic distribution of fake news. Having access to large sets of fake news data, knowing the locations of people who are spreading fake news, and being able to understand the geographic distribution of fake news will help in the efforts toward addressing the fake news problem on the internet by providing target areas.
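A hypothetical outline of that pipeline is sketched below: extract keywords from a fact-checking article, query Twitter for matching posts, and collect user locations for mapping. The keyword heuristic and example data are assumptions, and search_twitter is a stub standing in for whatever Twitter client was actually used; it is not a real API call.

import re
from collections import Counter

STOPWORDS = {"the", "a", "an", "of", "to", "and", "in", "is", "that", "on"}

def extract_keywords(article_text, k=5):
    # Keep the k most frequent non-stopword terms as search keywords.
    words = re.findall(r"[a-z']+", article_text.lower())
    counts = Counter(w for w in words if w not in STOPWORDS and len(w) > 3)
    return [w for w, _ in counts.most_common(k)]

def search_twitter(keywords):
    # Placeholder: would query the Twitter search API and return
    # (user_location, tweet_text) pairs for tweets matching the keywords.
    return [("Phoenix, AZ", "..."), ("Austin, TX", "...")]

def locations_for_article(article_text):
    # Locations of users spreading the rumor, ready to be geocoded and mapped.
    keywords = extract_keywords(article_text)
    return [loc for loc, _ in search_twitter(keywords) if loc]

print(locations_for_article("A rumor claims that the vaccine contains microchips ..."))

Each returned location string would then be geocoded and plotted on a US map to show where the rumor is spreading.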
Contributors: Ngweta, Lilian Mathias (Author) / Liu, Huan (Thesis director) / Wu, Liang (Committee member) / Software Engineering (Contributor) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2018-05