Barrett, The Honors College at Arizona State University proudly showcases the work of undergraduate honors students by sharing this collection exclusively with the ASU community.

Barrett accepts high-performing, academically engaged undergraduate students and works with them in collaboration with all of the other academic units at Arizona State University. All Barrett students complete a thesis or creative project, an opportunity to explore an intellectual interest and produce an original piece of scholarly research. The thesis or creative project is supervised by, and defended before, a faculty committee. Students are able to engage with professors who are nationally recognized in their fields and committed to working with honors students. Completing a Barrett thesis or creative project is an opportunity for undergraduate honors students to contribute to the ASU academic community in a meaningful way.


Description
Object localization is used to determine the location of a device, an important aspect of applications ranging from autonomous driving to augmented reality. Commonly used localization techniques include global positioning systems (GPS), simultaneous localization and mapping (SLAM), and positional tracking, but all of these methodologies have drawbacks, especially in high-traffic indoor or urban environments. Using recent improvements in the field of machine learning, this project proposes a new localization method that uses networks with several wireless transceivers and can be implemented without heavy computational loads or high costs. This project aims to build a proof-of-concept prototype and demonstrate that the proposed technique is feasible and accurate.

Modern communication networks depend heavily on an estimate of the communication channel, which represents the distortions a transmitted signal undergoes as it travels to a receiver. A channel can become quite complicated due to signal reflections, delays, and other undesirable effects and, as a result, varies significantly from one location to another. This localization system seeks to take advantage of this distinctness by feeding channel information into a machine learning algorithm, which is trained to associate channels with their respective locations. A device in need of localization would then only need to calculate a channel estimate and present it to this algorithm to obtain its location.

As an additional step, this report investigates the effect of location noise. After the localization system described above demonstrates promising results, the team shows that the system is robust to noise on its location labels. In doing so, the team establishes that the system could be implemented in a continued-learning environment, in which some user agents report their estimated (noisy) locations over a wireless communication network, so that the model can be deployed without extensive data collection prior to release.
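The fingerprinting idea in this abstract can be illustrated with a minimal sketch. Everything below is this page's invention for illustration only: the synthetic channel model, the grid of locations, and the nearest-neighbour lookup (the project itself trains a machine learning model on real channel estimates).

```python
import math
import random

# Illustrative channel-fingerprint localization: each location has a
# synthetic "channel estimate" (a feature vector); a nearest-neighbour
# lookup maps a fresh estimate back to a location.

random.seed(0)

def synth_channel(loc, taps=8, noise=0.05):
    """Toy channel: deterministic per-location taps plus measurement noise."""
    rng = random.Random(hash(loc))           # fixed "true" channel per location
    base = [rng.uniform(-1, 1) for _ in range(taps)]
    return [b + random.gauss(0, noise) for b in base]

def nearest_location(channel, fingerprints):
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(fingerprints, key=lambda loc: dist(channel, fingerprints[loc]))

# "Training": collect one noisy fingerprint per grid location.
locations = [(x, y) for x in range(4) for y in range(4)]
fingerprints = {loc: synth_channel(loc) for loc in locations}

# "Inference": a device measures a fresh channel estimate and queries the table.
query = synth_channel((2, 3))
print(nearest_location(query, fingerprints))
```

A learned model replaces the lookup table in the actual project; the sketch only shows why distinct channels can act as location fingerprints.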
ContributorsChang, Roger (Co-author) / Kann, Trevor (Co-author) / Alkhateeb, Ahmed (Thesis director) / Bliss, Daniel (Committee member) / Electrical Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created2020-05
Description
At present, the vast majority of human subjects with neurological disease are still diagnosed through in-person assessments and qualitative analysis of patient data. In this paper, we propose to use Topological Data Analysis (TDA) together with machine learning tools to automate the process of Parkinson’s disease classification and severity assessment. An automated, stable, and accurate method to evaluate Parkinson’s would be significant in streamlining patient diagnosis and giving families more time for corrective measures. We propose a methodology which incorporates TDA into the analysis of Parkinson’s disease postural-shift data through the persistence-image representation. Studying the topology of a system has proven to be invariant to small changes in data and has been shown to perform well in discrimination tasks. The contributions of the paper are twofold. We propose a method to 1) classify healthy patients from those afflicted by disease and 2) diagnose the severity of disease. We explore the use of the proposed method in an application involving a Parkinson’s disease dataset comprising healthy-elderly, healthy-young, and Parkinson’s disease patients.
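As a rough illustration of the persistence-image step the abstract mentions: a persistence diagram (here assumed precomputed, e.g. by a TDA library, which this sketch does not use) is turned into a fixed-size vector by spreading each (birth, death) point as a persistence-weighted Gaussian bump over a grid. Grid size, kernel width, and weighting below are invented for illustration, not the thesis's settings.

```python
import math

def persistence_image(diagram, res=4, sigma=0.1, span=1.0):
    """Map a persistence diagram to a flattened res x res feature grid."""
    img = [[0.0] * res for _ in range(res)]
    for birth, death in diagram:
        pers = death - birth                 # lifetime of the feature
        for i in range(res):
            for j in range(res):
                # grid-cell centre in (birth, persistence) coordinates
                bx = (i + 0.5) * span / res
                py = (j + 0.5) * span / res
                w = math.exp(-((bx - birth) ** 2 + (py - pers) ** 2)
                             / (2 * sigma ** 2))
                img[i][j] += pers * w        # weight bumps by persistence
    return [v for row in img for v in row]

diagram = [(0.1, 0.6), (0.2, 0.3)]           # two topological features
vec = persistence_image(diagram)
print(len(vec))                              # fixed-length vector for a classifier
```

The fixed length is the point: diagrams with different numbers of features all map to the same vector size, so a standard classifier can consume them.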
ContributorsRahman, Farhan Nadir (Co-author) / Nawar, Afra (Co-author) / Turaga, Pavan (Thesis director) / Krishnamurthi, Narayanan (Committee member) / Electrical Engineering Program (Contributor) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created2020-05
Description
In this project, the use of deep neural networks for selecting actions to execute within an environment to achieve a goal is explored. Scenarios like this are common in crafting-based games such as Terraria or Minecraft. Goals in these environments have recursive sub-goal dependencies which form a dependency tree. An agent operating within these environments has access to little data about the environment before interacting with it, so it is crucial that the agent can effectively use a tree of dependencies and its environmental surroundings to judge which sub-goals are most efficient to pursue at any point in time. A successful agent aims to minimize cost when completing a given goal. A deep neural network in combination with Q-learning techniques was employed to act as the agent in this environment. This agent consistently performed better than agents using alternate models (models that used dependency-tree heuristics or human-like approaches to make sub-goal-oriented choices), with an average performance advantage of 33.86% (with a standard deviation of 14.69%) over the best alternate agent. This shows that machine learning techniques can be consistently employed to make goal-oriented choices within an environment with recursive sub-goal dependencies and little pre-known information.
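The sub-goal dependency setting can be illustrated with a toy tabular Q-learning sketch (the thesis uses a deep network; the crafting chain, rewards, and hyperparameters below are invented). The agent must discover that "craft_table" only succeeds after "craft_planks", which in turn needs "wood":

```python
import random

ACTIONS = ["get_wood", "craft_planks", "craft_table"]
NEEDS = {"get_wood": None, "craft_planks": "wood", "craft_table": "planks"}
GIVES = {"get_wood": "wood", "craft_planks": "planks", "craft_table": "table"}

def step(state, action):
    """Deterministic crafting step; failed prerequisites waste a turn."""
    need = NEEDS[action]
    if need is not None and need not in state:
        return state, -1.0                   # prerequisite missing
    new = state | {GIVES[action]}
    return new, (10.0 if "table" in new else -0.1)

random.seed(1)
Q = {}
for _ in range(500):                         # training episodes
    state = frozenset()
    for _ in range(10):
        a = (random.choice(ACTIONS) if random.random() < 0.2  # explore
             else max(ACTIONS, key=lambda x: Q.get((state, x), 0.0)))
        nxt, r = step(state, a)
        best = max(Q.get((nxt, x), 0.0) for x in ACTIONS)
        Q[(state, a)] = (Q.get((state, a), 0.0)
                         + 0.5 * (r + 0.9 * best - Q.get((state, a), 0.0)))
        state = nxt
        if "table" in state:
            break

# Greedy policy after training should follow the dependency order.
policy = [max(ACTIONS, key=lambda x: Q.get((frozenset(s), x), 0.0))
          for s in [(), ("wood",), ("wood", "planks")]]
print(policy)
```

The learned greedy policy recovers the dependency tree's ordering purely from reward signals, which is the behaviour the project scales up with a neural network.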
ContributorsKoleber, Derek (Author) / Acuna, Ruben (Thesis director) / Bansal, Ajay (Committee member) / W.P. Carey School of Business (Contributor) / Software Engineering (Contributor) / Barrett, The Honors College (Contributor)
Created2018-05
Description
This thesis dives into the world of artificial intelligence by exploring the functionality of a single-layer artificial neural network through a simple housing-price classification example, while simultaneously considering its impact from a data management perspective at both the software and hardware level. To begin this study, the universally accepted model of an artificial neuron is broken down into its key components and then analyzed for functionality by relating it back to its biological counterpart. The role of a neuron is then described in the context of a neural network, with equal emphasis placed on how it undergoes training individually and as part of an entire network. Using supervised learning, the neural network is trained on three main factors for housing-price classification: the total number of rooms, the number of bathrooms, and the square footage. Once trained with most of the generated data set, it is tested for accuracy by introducing the remainder of the data set and observing how closely its computed output for each set of inputs matches the target value. From a programming perspective, the artificial neuron is implemented in C so that it is more closely tied to the operating system, which makes the collected profiler data more precise during the program's execution. The program is designed to break each stage of the neuron's training process into distinct functions. In addition to this more functional code, the struct data type is used as the underlying data structure for the project, not only to represent the neuron but also to implement the neuron's training and test data. Once the neuron is fully trained, its test results are graphed to visually depict how well it learned from its sample training set. Finally, the profiler data is analyzed to describe how the program operated from a data management perspective at the software and hardware level.
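The thesis implements its neuron in C; as a hedged illustration only, the same single-neuron training loop can be sketched in Python, with an invented linear pricing rule standing in for the generated housing data:

```python
import math
import random

random.seed(3)

def make_house():
    """Invented data: label 1 ("expensive") comes from a synthetic linear rule."""
    rooms = random.randint(1, 6)
    baths = random.randint(1, 3)
    sqft = random.uniform(500, 3000)
    label = 1 if sqft + 300 * rooms > 2200 else 0
    return [rooms, baths, sqft / 1000.0], label   # scale square footage

def predict(w, b, x):
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 / (1 + math.exp(-z))                 # sigmoid activation

def train(samples, epochs=200, lr=0.1):
    w, b = [0.0, 0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in samples:
            err = y - predict(w, b, x)            # logistic-loss gradient step
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

# Train on most of the data, hold out the remainder for accuracy testing.
data = [make_house() for _ in range(200)]
train_set, test_set = data[:150], data[150:]
w, b = train(train_set)
correct = sum((predict(w, b, x) > 0.5) == (y == 1) for x, y in test_set)
print(correct, "of", len(test_set))
```

The train/hold-out split mirrors the thesis's procedure of testing the neuron on the unseen remainder of its generated data set.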
ContributorsRichards, Nicholas Giovanni (Author) / Miller, Phillip (Thesis director) / Meuth, Ryan (Committee member) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created2018-05
Description
This is a creative thesis project on the third-party logistics industry and the improvements that are possible through the implementation of goods-to-person technologies. The scope of the project entails the relationship between Company X, a third-party logistics provider, and Company Y, a major toy retailer. This thesis identifies current trends in the third-party logistics industry, such as rising operating costs, and the average savings achieved through these business relationships, and then examines the negative trends to which Company X is vulnerable, such as high human-resources costs and cost-of-quality issues. Given the findings derived from industry data, a final recommendation was settled on to improve productivity and ultimately reduce the use of temporary labor for Company X. The implementation of a goods-to-person technology solution provides the opportunity to reduce hours of operation and man-hours, as well as direct and indirect costs such as labor. Research has shown that firms operating in the retail industry rely heavily on temporary labor to handle the seasonal demand brought by the holidays, so this recommendation could be applied to a variety of operations. The data compiled throughout this thesis have major implications for the third-party logistics industry and for achieving long-term profitability in operations management.
ContributorsFonseca, Tanner (Author) / Printezis, Antonios (Thesis director) / Kellso, James (Committee member) / Department of Supply Chain Management (Contributor) / School of Sustainability (Contributor) / Barrett, The Honors College (Contributor)
Created2016-05
Description
Supply chain management has many fundamental principles that can be applied to all businesses to improve efficiency and create more transparency; this, in turn, encourages collaboration and fosters healthy professional relationships. Using the fundamental principles of supply chain management, I evaluated the Veterans Administration (VA) hospital's treatment for post-traumatic stress disorder (PTSD) to look for places where efficiency can be improved. I analyzed the problem in relation to supply chain management, PTSD, and design in order to create a more complete solution. Once these areas were addressed, I proposed a solution that included creating a separate clinic for PTSD treatment that addresses the current issues with treatment at the VA hospital. My goal was to improve space efficiencies and design a treatment environment that is more evolved and conducive to veterans suffering from PTSD. Though the creation of one PTSD clinic will not completely change the system, it can be a step in the right direction toward bringing about the change that needs to occur within the VA medical system.
ContributorsGriffin, Kailey Anne (Author) / Brandt, Beverly (Thesis director) / Davila, Eddie (Committee member) / Damore-Minchew, Elizabeth (Committee member) / Barrett, The Honors College (Contributor) / W. P. Carey School of Business (Contributor) / The Design School (Contributor)
Created2014-05
Description
This article summarizes exploratory research conducted on private and public hospital systems in Australia and Costa Rica, analyzing the trends observed within supply chain procurement. Physician preferences and a general lack of available comparative-effectiveness research—both of which are challenges unique to the health care industry—were found to be barriers to effective supply chain performance in both systems. Among other insights, the ability of policy to catalyze improved procurement performance in public hospital systems was also observed. The role of centralization was found to be fundamental to the success of the systems examined, allowing hospitals to focus on strategic rather than operational decisions and to conduct value-streaming activities that generate increased cost savings.
ContributorsBudgett, Alexander Jay (Author) / Schneller, Eugene (Thesis director) / Gopalakrishnan, Mohan (Committee member) / Barrett, The Honors College (Contributor) / Department of Supply Chain Management (Contributor) / Department of English (Contributor)
Created2015-05
Description
The rationale behind this thesis is grounded in nearly two years of experience interning at UTC Aerospace Systems (UTAS). I was able to gain wide exposure to different facets of the supply chain management organization during my time as an intern, from strategic sourcing and commodity management to tactical procurement and supplier development. In each of these areas, I observed a variety of initiatives that did not reach their full potential because employees were not provided the tools for success. One of these areas in particular is New Product Introduction (NPI) process management, in which there is no standard process for program managers to follow from start to finish. I saw this as an opportunity to narrow the scope of my thesis research, drawing on my experience at UTAS to improve a process and provide standard work and tools so it can be consistently executed. The current-state process is not formalized; it merely tracks certain metrics that are not necessarily applicable to the overall health of the program, because they do not monitor the program's progress. This has resulted in heavy costs incurred from inadequate planning, a skewed timeline, and customer frustration. The aim of the desired-state NPI process is to gather cross-functional expertise and input, adhere to a strict entry-to-market timeline, and increase customer satisfaction, all while minimizing costs incurred throughout the life of the program. The dominant output of this project will be a cross-functional flow chart of the process for each group to follow, along with standard work and tools to support the process across a variety of NPI program applications.
ContributorsThorn, Taylor Aiko Marie (Author) / Brown, Steven (Thesis director) / Arrigoni, Gregory (Committee member) / Barrett, The Honors College (Contributor) / School of International Letters and Cultures (Contributor) / Department of Supply Chain Management (Contributor) / W. P. Carey School of Business (Contributor)
Created2015-05
Description
Epilepsy affects numerous people around the world and is characterized by recurring seizures, motivating methods to predict them so that precautionary measures may be employed. One promising algorithm extracts spatiotemporal correlation-based features from intracranial electroencephalography signals for use with support vector machines. The robustness of this methodology is tested through a sensitivity analysis. Doing so also provides insight into how to construct more effective feature vectors.
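A sketch of what correlation-based features from a multichannel signal window might look like (the synthetic channels below are this page's invention; the thesis's exact feature construction and SVM setup are not reproduced here, and the classifier itself is assumed, e.g. an off-the-shelf SVM):

```python
import math
import random

def pearson(a, b):
    """Pearson correlation between two equal-length sample sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = math.sqrt(sum((x - ma) ** 2 for x in a))
    vb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (va * vb)

def correlation_features(window):
    """window: list of channels, each a list of samples.
    Returns the upper triangle of the channel correlation matrix."""
    feats = []
    for i in range(len(window)):
        for j in range(i + 1, len(window)):
            feats.append(pearson(window[i], window[j]))
    return feats

random.seed(7)
base = [random.gauss(0, 1) for _ in range(100)]
# Three synthetic channels: two strongly correlated, one independent.
chans = [base,
         [x + random.gauss(0, 0.3) for x in base],
         [random.gauss(0, 1) for _ in range(100)]]
feats = correlation_features(chans)
print([round(f, 2) for f in feats])
```

Each window yields one fixed-length feature vector (here 3 pairwise correlations), which is the kind of input a support vector machine can classify.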
ContributorsMa, Owen (Author) / Bliss, Daniel (Thesis director) / Berisha, Visar (Committee member) / Barrett, The Honors College (Contributor) / Electrical Engineering Program (Contributor)
Created2015-05
Description
Bots tamper with social media networks by artificially inflating the popularity of certain topics. In this paper, we define what a bot is, detail different motivations for bots, describe previous work in bot detection and observation, and then perform bot detection of our own. For our bot detection, we are interested in bots on Twitter that tweet Arabic extremist-like phrases. A testing dataset is collected using the honeypot method, and five different heuristics are measured for their effectiveness in detecting bots. The model underperformed, but we have laid the groundwork for a vastly untapped focus in bot detection: the diffusion of extremist ideas through bots.
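Heuristic-based bot scoring of the kind the abstract describes can be sketched as follows. The heuristic names, thresholds, and account fields below are invented for illustration and are not the five heuristics the thesis actually measures:

```python
def bot_score(account):
    """Count how many simple bot-like signals an account trips."""
    score = 0
    if account["tweets_per_day"] > 50:                     # unusually high volume
        score += 1
    if account["duplicate_ratio"] > 0.5:                   # mostly repeated content
        score += 1
    if account["followers"] < 0.1 * account["following"]:  # follow-spam pattern
        score += 1
    return score

human = {"tweets_per_day": 4, "duplicate_ratio": 0.05,
         "followers": 300, "following": 280}
bot = {"tweets_per_day": 400, "duplicate_ratio": 0.9,
       "followers": 12, "following": 2000}
print(bot_score(human), bot_score(bot))
```

Each heuristic can then be evaluated individually against labeled honeypot data, as the thesis does, by measuring how well its signal separates known bots from known humans.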
ContributorsKarlsrud, Mark C. (Author) / Liu, Huan (Thesis director) / Morstatter, Fred (Committee member) / Barrett, The Honors College (Contributor) / Computing and Informatics Program (Contributor) / Computer Science and Engineering Program (Contributor) / School of Mathematical and Statistical Sciences (Contributor)
Created2015-05