Barrett, The Honors College at Arizona State University proudly showcases the work of undergraduate honors students by sharing this collection exclusively with the ASU community.

Barrett accepts high-performing, academically engaged undergraduate students and works with them in collaboration with all of the other academic units at Arizona State University. All Barrett students complete a thesis or creative project, which is an opportunity to explore an intellectual interest and produce an original piece of scholarly research. The thesis or creative project is supervised by and defended in front of a faculty committee. Students are able to engage with professors who are nationally recognized in their fields and committed to working with honors students. Completing a Barrett thesis or creative project is an opportunity for undergraduate honors students to contribute to the ASU academic community in a meaningful way.


Description
The current trend of interconnected devices, or the Internet of Things (IoT), has led to the popularization of single-board computers (SBCs), primarily due to their small form factor and low price. This has led to unique networks of devices that can have unstable network connections and minimal processing power. Many parallel programming libraries are intended for use in high-performance computing (HPC) clusters, which, unlike the IoT environment described, generally assume very consistent network speeds and topologies. A significant number of software choices make up what is referred to as the HPC stack, or parallel processing stack. My thesis focused on building an HPC stack that would run on the SBC known as the Raspberry Pi. The intention in making this Raspberry Pi cluster is to research the performance of MPI implementations in an IoT environment, which had an impact on the design choices of the cluster. This thesis is a compilation of my research efforts in creating this cluster, as well as an evaluation of the software that was chosen to create the parallel processing stack.
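For context on what an MPI implementation provides on such a cluster, below is a minimal, generic MPI program in C (an illustrative sketch, not code from the thesis): each process reports its rank, the total process count, and the node it runs on, which is the usual starting point for any benchmark.

```c
#include <stdio.h>
#include <mpi.h>

/* Minimal MPI program -- a generic example, not code from the thesis.
   Compile with mpicc and launch across the cluster with mpirun. */
int main(int argc, char **argv) {
    int rank, size, name_len;
    char name[MPI_MAX_PROCESSOR_NAME];

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);   /* this process's id   */
    MPI_Comm_size(MPI_COMM_WORLD, &size);   /* total process count */
    MPI_Get_processor_name(name, &name_len);

    printf("rank %d of %d running on %s\n", rank, size, name);

    MPI_Finalize();
    return 0;
}
```

Launched across the Raspberry Pi nodes, each process prints one line, confirming the parallel processing stack works end to end.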
Contributors: O'Meara, Braedon Richard (Author) / Meuth, Ryan (Thesis director) / Dasgupta, Partha (Committee member) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2018-05
Description
The objective of this paper is to provide an educational diagnostic into the technology of blockchain and its application to the supply chain. Education on the topic is important to prevent misinformation about the capabilities of blockchain. As a new technology, blockchain can be confusing to grasp given the wide range of possibilities it provides, and overly broad definitions only convolute the topic. Instead, the focus will be maintained on explaining the technical details of how and why this technology works to improve the supply chain. The scope of explanation will not be limited to solutions, but will also detail current problems. Both public and private blockchain networks will be explained, along with the solutions they provide in supply chains. In addition, other non-blockchain systems will be described that provide important pieces of supply chain operations that blockchain cannot. When applied to the supply chain, blockchain provides improved consumer transparency, resource management, logistics, trade finance, and liquidity.
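To make the core mechanism concrete, below is a minimal hash-chain sketch in C (illustrative only, not from the thesis; a toy djb2 hash stands in for a real cryptographic hash such as SHA-256, and the shipment records are hypothetical). Each block stores the hash of the previous block, so tampering with any earlier record invalidates every link that follows:

```c
#include <stdio.h>

/* Toy djb2 hash -- a stand-in for a cryptographic hash like SHA-256. */
static unsigned long hash_bytes(const char *data, unsigned long seed) {
    unsigned long h = seed ? seed : 5381;
    while (*data)
        h = ((h << 5) + h) + (unsigned char)*data++;
    return h;
}

struct block {
    char record[64];         /* e.g., a supply chain event          */
    unsigned long prev_hash; /* link to the previous block          */
    unsigned long hash;      /* hash of this record plus prev_hash  */
};

int main(void) {
    struct block chain[3] = {
        {"shipment packed at factory"},
        {"shipment left port"},
        {"shipment received at warehouse"},
    };
    unsigned long prev = 0;
    for (int i = 0; i < 3; i++) {            /* seal each block */
        chain[i].prev_hash = prev;
        chain[i].hash = hash_bytes(chain[i].record, prev);
        prev = chain[i].hash;
    }
    prev = 0;                                /* verify the chain */
    for (int i = 0; i < 3; i++) {
        if (hash_bytes(chain[i].record, prev) != chain[i].hash)
            printf("tampering detected at block %d\n", i);
        prev = chain[i].hash;
    }
    printf("chain verified\n");
    return 0;
}
```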
Contributors: Krukar, Joel Michael (Author) / Oke, Adegoke (Thesis director) / Duarte, Brett (Committee member) / Hahn, Richard (Committee member) / School of Mathematical and Statistical Sciences (Contributor) / Department of Economics (Contributor) / Barrett, The Honors College (Contributor)
Created: 2018-05
Description
The original version of Helix, the one I pitched when first deciding to make a video game for my thesis, is an action-platformer, with the intent of metroidvania-style progression and an interconnected world map.

The current version of Helix is a turn-based role-playing game, with the intent of roguelike gameplay and a dark fantasy theme. We will first be exploring the challenges that came with programming my own game - not quite from scratch, but also without a prebuilt engine - then transition into game design and how Helix has evolved from its original form to what we see today.
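As a rough illustration of the turn-based structure described above (a generic sketch, not code from Helix; all names and numbers are hypothetical), a turn-based RPG alternates player and enemy actions inside a single loop:

```c
#include <stdio.h>

/* Minimal turn-based combat loop -- a generic sketch, not Helix's code. */
struct fighter { const char *name; int hp; int attack; };

int main(void) {
    struct fighter player = {"hero", 20, 5};
    struct fighter enemy  = {"ghoul", 15, 3};
    int turn = 0;
    while (player.hp > 0 && enemy.hp > 0) {
        struct fighter *atk = (turn % 2 == 0) ? &player : &enemy;
        struct fighter *def = (turn % 2 == 0) ? &enemy : &player;
        def->hp -= atk->attack;            /* resolve one action per turn */
        printf("turn %d: %s hits %s (hp now %d)\n",
               turn, atk->name, def->name, def->hp);
        turn++;
    }
    printf("%s wins\n", player.hp > 0 ? player.name : enemy.name);
    return 0;
}
```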
Contributors: Discipulo, Isaiah K (Author) / Meuth, Ryan (Thesis director) / Kobayashi, Yoshihiro (Committee member) / School of Mathematical and Statistical Sciences (Contributor) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2020-05
Description
RecyclePlus is an iOS mobile application that allows users to be knowledgeable in the realms of sustainability. It encourages users to be environmentally responsible by providing them access to recycling information. In particular, it allows users to search for certain materials and learn about their recyclability and how to properly dispose of them. Some searches will show locations of nearby facilities that collect certain materials and dispose of them properly. This is a full-stack software project that explores open source software and APIs, UI/UX design, and iOS development.
Contributors: Tran, Nikki (Author) / Ganesh, Tirupalavanam (Thesis director) / Meuth, Ryan (Committee member) / Watts College of Public Service & Community Solutions (Contributor) / Department of Information Systems (Contributor) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2020-05
Description
This thesis dives into the world of artificial intelligence by exploring the functionality of a single-layer artificial neural network through a simple housing price classification example, while simultaneously considering its impact from a data management perspective on both the software and hardware level. To begin this study, the universally accepted model of an artificial neuron is broken down into its key components and then analyzed for functionality by relating back to its biological counterpart. The role of a neuron is then described in the context of a neural network, with equal emphasis placed on how training proceeds for an individual neuron and for an entire network. Using the technique of supervised learning, the neural network is trained with three main factors for housing price classification: the total number of rooms, bathrooms, and square footage. Once trained with most of the generated data set, it is tested for accuracy by introducing the remainder of the data set and observing how closely its computed output for each set of inputs compares to the target value. From a programming perspective, the artificial neuron is implemented in C so that it is more closely tied to the operating system, making the collected profiler data more precise during the program's execution. The program is designed to break down each stage of the neuron's training process into distinct functions. In addition to utilizing more functional code, the struct data type is used as the underlying data structure for this project, representing not only the neuron but also the neuron's training and test data. Once fully trained, the neuron's test results are graphed to visually depict how well the neuron learned from its sample training set. Finally, the profiler data is analyzed to describe how the program operated from a data management perspective on the software and hardware level.
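As a concrete illustration of the design the abstract describes (a sketch under assumptions, not the thesis code; the learning rate, data values, and function names are hypothetical), a single neuron can be represented as a C struct and trained with a simple supervised update over the three housing inputs:

```c
#include <stdio.h>
#include <math.h>

/* A single artificial neuron over three housing features:
   rooms, bathrooms, square footage. Sketch only; not the thesis code. */
struct neuron {
    double w[3];   /* one weight per input */
    double bias;
};

static double sigmoid(double x) { return 1.0 / (1.0 + exp(-x)); }

static double forward(const struct neuron *n, const double x[3]) {
    double sum = n->bias;
    for (int i = 0; i < 3; i++)
        sum += n->w[i] * x[i];
    return sigmoid(sum);
}

/* One supervised-learning step: nudge weights toward the target. */
static void train_step(struct neuron *n, const double x[3],
                       double target, double rate) {
    double err = target - forward(n, x);
    for (int i = 0; i < 3; i++)
        n->w[i] += rate * err * x[i];
    n->bias += rate * err;
}

int main(void) {
    /* Hypothetical data: {rooms, baths, sqft/1000} -> 1 = "expensive". */
    double xs[4][3] = {{3,1,0.9},{8,4,3.5},{2,1,0.7},{7,3,3.0}};
    double ys[4]    = {0, 1, 0, 1};
    struct neuron n = {{0, 0, 0}, 0};
    for (int epoch = 0; epoch < 1000; epoch++)
        for (int i = 0; i < 4; i++)
            train_step(&n, xs[i], ys[i], 0.1);
    for (int i = 0; i < 4; i++)
        printf("sample %d: output %.2f (target %.0f)\n",
               i, forward(&n, xs[i]), ys[i]);
    return 0;
}
```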
Contributors: Richards, Nicholas Giovanni (Author) / Miller, Phillip (Thesis director) / Meuth, Ryan (Committee member) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2018-05
Description
The passage of 2007's Legal Arizona Workers Act, which required all new hires to be tested for legal employment status through the federal E-Verify database, drastically changed the employment prospects for undocumented workers in the state. Using data from the 2007-2010 American Community Survey, this paper seeks to identify the impact of this law on the labor force in Arizona, specifically regarding undocumented workers and less educated native workers. Overall, the data show that the wage bias against undocumented immigrants doubled in the four years studied, and the wages of native workers without a high school degree saw a temporary gain relative to comparable workers in other states. The law had no effect on the wages of native workers with a high school degree.
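The comparison against "comparable workers in other states" is a difference-in-differences design. As a sketch of the arithmetic in C (with entirely hypothetical wage figures, not numbers from the paper):

```c
#include <stdio.h>

/* Difference-in-differences sketch with hypothetical average wages.
   Estimator: (AZ after - AZ before) - (control after - control before). */
int main(void) {
    double az_before  = 10.00, az_after  = 10.80;   /* Arizona           */
    double ctl_before = 10.10, ctl_after = 10.40;   /* comparison states */
    double did = (az_after - az_before) - (ctl_after - ctl_before);
    printf("difference-in-differences estimate: %+.2f per hour\n", did);
    return 0;
}
```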
Contributors: Santiago, Maria Christina (Author) / Pereira, Claudiney (Thesis director) / Mendez, Jose (Committee member) / School of International Letters and Cultures (Contributor) / Department of Economics (Contributor) / Barrett, The Honors College (Contributor)
Created: 2018-05
Description
As the need for data concerning the health of the world's oceans increases, it becomes necessary to develop large, networked communication systems underwater. This research involves the development of an embedded operating system that is suited for optically-linked underwater wireless sensor networks (WSNs). Optical WSNs are unique in that large amounts of data may be received relatively infrequently, so the operating system for each node must be very responsive. Additionally, the volatile nature of the underwater environment means that the operating system must be accurate while still maintaining a low profile on a relatively small microprocessor core. The first part of this research concerns the implementation of the operating system's task scheduler and additional libraries to maintain synchronization, and the second part involves testing the operating system for responsiveness to interrupts and overall performance.
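As an illustration of the kind of task scheduling the abstract describes (a generic sketch, not the thesis implementation; the task names and tick-driven model are assumptions), a minimal cooperative scheduler cycles through a fixed task table on each timer tick:

```c
#include <stdio.h>

/* Minimal cooperative round-robin scheduler -- a generic sketch,
   not the thesis's scheduler. Each task runs to completion per tick. */
#define NUM_TASKS 3

struct task {
    const char *name;
    int period;              /* run every `period` ticks */
    void (*run)(void);
};

static void read_sensor(void)  { printf("  read sensor\n");          }
static void handle_radio(void) { printf("  service optical link\n"); }
static void log_data(void)     { printf("  log data\n");             }

static struct task tasks[NUM_TASKS] = {
    {"sensor", 1, read_sensor},
    {"radio",  2, handle_radio},
    {"logger", 4, log_data},
};

int main(void) {
    for (int tick = 0; tick < 8; tick++) {  /* stand-in for a timer interrupt */
        printf("tick %d:\n", tick);
        for (int i = 0; i < NUM_TASKS; i++)
            if (tick % tasks[i].period == 0)
                tasks[i].run();
    }
    return 0;
}
```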
Contributors: Tueller, Peter Michael (Author) / Youngbull, Cody (Thesis director) / Meuth, Ryan (Committee member) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05
Description
The January 12, 2010 Haiti earthquake, which hit Port-au-Prince in the late afternoon, caused over 220,000 deaths and $8 billion in damages, roughly 120% of national GDP at the time. An Mw 7.5 earthquake struck rural Guatemala in the early morning in 1976 and caused 23,000-25,000 deaths, three times as many injuries, and roughly $1.1 billion in damages, which accounted for approximately 30% of Guatemala's GDP. The earthquake that hit just outside of Christchurch, New Zealand early in the morning on September 4, 2010 had a magnitude of 7.1 and caused just two injuries, no deaths, and roughly $7.2 billion in damages (5% of GDP). These three earthquakes, all with magnitudes over 7, caused extremely varied amounts of economic damage in these three countries. This thesis aims to identify a possible explanation as to why this was the case and to suggest ways to improve disaster risk management going forward.
Contributors: Heuermann, Jamie Lynne (Author) / Schoellman, Todd (Thesis director) / Mendez, Jose (Committee member) / Department of Supply Chain Management (Contributor) / Department of Economics (Contributor) / W. P. Carey School of Business (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05
Description
Radiometric dating estimates the age of rocks by comparing the concentration of a decaying radioactive isotope to the concentrations of its decay byproducts. Radiometric dating has been instrumental in the calculation of the Earth's age, the Moon's age, and the age of our solar system. Geochronologists in the School of Earth and Space Exploration at ASU use radiometric dating extensively in their research and have very specific procedures, hardware, and software to perform the dating calculations. Researchers use lasers to drill small holes, or ablations, in rock faces, collect the masses of various isotopes using a mass spectrometer, and scan the pit with an interferometer, which records the z heights of the pit on an x-y grid. This scan is then processed by custom-made software to determine the volume of the pit, which is used along with the isotope masses and known decay rates to determine the age of the rock. My research has been focused on improving this volume calculation through computational geometry methods of surface reconstruction. During the process, I created a web application that reads interferometer scans, reconstructs a surface from those scans with Poisson reconstruction, renders the surface in the browser, and calculates the volume of the pit based on parameters provided by the researcher. The scans are stored in a central cloud datastore for future analysis, allowing researchers in the geochronology community to collaborate on scans from various rocks in their individual labs. The result of the project is a complete and functioning application that is accessible to any researcher and reproducible from any computer. The 3D representation of the scan data allows researchers to easily understand the topology of the pit ablation and determine early on whether the measurements of the interferometer are trustworthy for the particular ablation. The volume calculation by the new software also reduces the variability in the volume calculation, which hopefully indicates the process is removing noise from the scan data and performing volume calculations on a more realistic representation of the actual ablation. In the future, this research will serve as the groundwork for more robust testing and closer approximations through implementation of different reconstruction algorithms. As the project grows and becomes more usable, hopefully it will see adoption in the community and become a reproducible standard for geochronologists performing radiometric dating.
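To make the volume step concrete, below is a minimal sketch in C (illustrative only, not the thesis software; the grid size, spacing, and reference height are hypothetical) that estimates pit volume from an interferometer-style grid of z heights by summing each cell's depth below the undisturbed surface:

```c
#include <stdio.h>

/* Estimate ablation pit volume from a grid of z heights: sum the depth
   below a reference plane times the cell area. Sketch only -- the thesis
   software reconstructs a surface first (Poisson reconstruction). */
#define NX 4
#define NY 4

int main(void) {
    double dx = 1.0, dy = 1.0;  /* grid spacing, micrometers (hypothetical) */
    double z_ref = 0.0;         /* height of the undisturbed rock face      */
    /* Hypothetical scan: negative z values inside the pit. */
    double z[NY][NX] = {
        {0.0,  0.0,  0.0, 0.0},
        {0.0, -1.2, -1.0, 0.0},
        {0.0, -1.1, -0.9, 0.0},
        {0.0,  0.0,  0.0, 0.0},
    };
    double volume = 0.0;
    for (int j = 0; j < NY; j++)
        for (int i = 0; i < NX; i++)
            if (z[j][i] < z_ref)
                volume += (z_ref - z[j][i]) * dx * dy;
    printf("estimated pit volume: %.2f cubic micrometers\n", volume);
    return 0;
}
```

The resulting volume, together with the measured parent (P) and daughter (D) isotope quantities, feeds the standard decay relation t = ln(1 + D/P) / λ, where λ is the decay constant.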
Contributors: Pruitt, Jacob Richard (Author) / Hodges, Kip (Thesis director) / Mercer, Cameron (Committee member) / van Soest, Matthijs (Committee member) / Department of Economics (Contributor) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05
Description
I built a short-term West Texas Intermediate (WTI) crude oil price-forecasting model for two periods to understand how various drivers of crude oil behaved before and after the Great Recession. According to the Federal Reserve, the Great Recession "...began in December 2007 and ended in June 2009" (Rich 1). The research involves two models spanning two periods: the first encompasses 2000 to late 2007, and the second encompasses early 2010 to 2016. The dependent variable for this model is the monthly average WTI crude oil price. The independent variables are based on what the academic community believes are drivers of crude oil prices. While the studies may be scattered across different time periods, they provide valuable insight into what the academic community believes drives oil prices. The model includes variables that address two different data groups: (1) market fundamentals and expectations of market fundamentals, and (2) speculation. One of the biggest challenges I faced was defining and quantifying "speculation". I ended up using a previous study's definition, which treats speculation as the activity of certain market participants in the Commitments of Traders report released by the Commodity Futures Trading Commission. My research shows that the West Texas Intermediate crude oil market exhibited a structural change after the Great Recession. Furthermore, my research presents interesting findings that warrant further study. For example, I find that 3-month T-bills and 10-year Treasury notes lose their predictive edge starting in the second period (2010-2016). Furthermore, the positive correlation between oil and the U.S. dollar in the period 2000-2007 warrants further investigation. Lastly, it might be interesting to see why T-bills are positively correlated with WTI prices while 10-year Treasury notes are negatively correlated with WTI prices.
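The correlations mentioned above are ordinary Pearson correlations between a candidate driver and monthly WTI prices. As a sketch of that computation in C (the series below are hypothetical, not the thesis data):

```c
#include <stdio.h>
#include <math.h>

/* Pearson correlation between a candidate driver and WTI prices. */
static double pearson(const double *x, const double *y, int n) {
    double sx = 0, sy = 0, sxx = 0, syy = 0, sxy = 0;
    for (int i = 0; i < n; i++) {
        sx += x[i];  sy += y[i];
        sxx += x[i] * x[i];  syy += y[i] * y[i];
        sxy += x[i] * y[i];
    }
    double cov = sxy - sx * sy / n;           /* n * covariance          */
    double vx  = sxx - sx * sx / n;           /* n * variance of x       */
    double vy  = syy - sy * sy / n;           /* n * variance of y       */
    return cov / sqrt(vx * vy);
}

int main(void) {
    /* Hypothetical monthly series, not the thesis data. */
    double tbill[6] = {1.7, 1.8, 2.0, 2.1, 2.3, 2.4};       /* 3-mo T-bill, % */
    double wti[6]   = {48.0, 52.0, 55.0, 57.0, 61.0, 63.0}; /* WTI, $/bbl     */
    printf("corr(T-bill, WTI) = %.3f\n", pearson(tbill, wti, 6));
    return 0;
}
```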
Contributors: Mirza, Hisham Tariq (Author) / McDaniel, Cara (Thesis director) / Budolfson, Arthur (Committee member) / Department of Finance (Contributor) / Department of Economics (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05