Barrett, The Honors College at Arizona State University proudly showcases the work of undergraduate honors students by sharing this collection exclusively with the ASU community.

Barrett accepts high-performing, academically engaged undergraduate students and works with them in collaboration with all of the other academic units at Arizona State University. All Barrett students complete a thesis or creative project, which is an opportunity to explore an intellectual interest and produce an original piece of scholarly research. The thesis or creative project is supervised by, and defended in front of, a faculty committee. Students are able to engage with professors who are nationally recognized in their fields and committed to working with honors students. Completing a Barrett thesis or creative project is an opportunity for undergraduate honors students to contribute to the ASU academic community in a meaningful way.

Description
Bhairavi is a solo performance that investigates belonging and dis-belonging in diaspora communities, especially as they relate to the female body. Specifically, through my experience as a second-generation Indian-American woman, I expose and challenge the notion of 'tradition' as it is forced onto women's bodies and displaces them in their own homes. Bhairavi is a story told through movement and theatrical narrative composition, with research and material collected through structured and unstructured observation of my family, my cultural community, and myself.

Note: This work of creative scholarship is rooted in collaboration between three female artist-scholars: Carly Bates, Raji Ganesan, and Allyson Yoder. Working from a common intersectional, feminist framework, we served as artistic co-directors of each other's solo pieces and co-producers of Negotiations, in which we share these pieces in relationship to each other. Thus, Negotiations is not a showcase of three individual works, but rather a conversation among three voices. As collaborators, we have been uncompromising in the pursuit of our own unique inquiries and voices, and each of our works of creative scholarship stands alone. However, we believe that all of the parts are best understood in relationship to each other, and to the whole. For this reason, we have chosen to cross-reference our thesis documents.

French Vanilla: An Exploration of Biracial Identity Through Narrative Performance by Carly Bates

Deep roots, shared fruits: Emergent creative process and the ecology of solo performance through “Dress in Something Plain and Dark” by Allyson Yoder

Bhairavi: A Performance-Investigation of Belonging and Dis-Belonging in Diaspora Communities by Raji Ganesan
Contributors: Ganesan, Raji J (Author) / Underiner, Tamara (Thesis director) / Stephens, Mary (Committee member) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05
Description
As the need for data concerning the health of the world's oceans increases, it becomes necessary to develop large, networked communication systems underwater. This research involves the development of an embedded operating system suited for optically linked underwater wireless sensor networks (WSNs). Optical WSNs are unique in that large amounts of data may be received relatively infrequently, so the operating system on each node must be very responsive. Additionally, the volatile nature of the underwater environment means that the operating system must be accurate while still maintaining a low profile on a relatively small microprocessor core. The first part of this research concerns the implementation of the operating system's task scheduler and additional libraries to maintain synchronization; the second part involves testing the operating system for responsiveness to interrupts and overall performance.
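
As an illustration only (the thesis's actual scheduler code is not reproduced in this listing, and all names below are hypothetical), a minimal priority-based task loop in which interrupt handlers jump the queue might be sketched in Python as:

```python
import heapq
import time

class Task:
    """A schedulable unit of work; lower priority values run first."""
    def __init__(self, priority, name, action):
        self.priority = priority
        self.name = name
        self.action = action  # zero-argument callable

    def __lt__(self, other):
        return self.priority < other.priority

class Scheduler:
    """Cooperative priority scheduler: interrupt handlers preempt
    queued work by entering the queue at the highest priority."""
    def __init__(self):
        self._queue = []

    def submit(self, task):
        heapq.heappush(self._queue, task)

    def on_interrupt(self, name, action):
        self.submit(Task(0, name, action))  # priority 0 jumps the queue

    def run(self):
        while self._queue:
            task = heapq.heappop(self._queue)
            start = time.monotonic()
            task.action()
            print(f"{task.name} finished in {time.monotonic() - start:.6f}s")

# A periodic sensor read is preempted by an optical-receive "interrupt".
sched = Scheduler()
sched.submit(Task(5, "read_sensors", lambda: None))
sched.on_interrupt("optical_rx", lambda: None)
sched.run()  # optical_rx runs before read_sensors
```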
Contributors: Tueller, Peter Michael (Author) / Youngbull, Cody (Thesis director) / Meuth, Ryan (Committee member) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05
Description
Situations present themselves in which someone needs to navigate inside a building, for example to reach the exit or to retrieve an object. Sometimes vision is not a reliable source of spatial awareness, whether because of a smoky environment, a dark environment, or distractions. I propose a wearable haptic device, a belt or vest, that provides haptic feedback to help people navigate inside a building without relying on the user's vision. The first proposed device has an obstacle-avoidance component and a navigation component. This paper discusses the challenges of designing and implementing this kind of technology in the context of indoor navigation, where GPS signal is poor. This project explored analyzing accelerometer data for indoor navigation and delivering navigation cues through the wearable haptic device, and the device is promising.
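
The analysis code itself is not part of this listing; purely as an illustration of why raw accelerometer data is hard to use indoors, a naive dead-reckoning pass (synthetic values, rectangular integration) can be sketched as:

```python
def integrate(samples, dt):
    """Double-integrate one axis of acceleration (m/s^2) to estimate
    displacement. Real systems must correct for sensor drift; without
    correction, small bias errors grow quadratically with time."""
    velocity, position = 0.0, 0.0
    for a in samples:
        velocity += a * dt         # v = v0 + a*dt
        position += velocity * dt  # x = x0 + v*dt
    return position

# One second of synthetic data at 100 Hz: a brief push, then coasting.
samples = [0.5] * 20 + [0.0] * 80
print(f"estimated displacement: {integrate(samples, 0.01):.3f} m")
```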
Contributors: Berk, Emily Marie (Author) / Atkinson, Robert (Thesis director) / Chavez-Echeagaray, Maria Elena (Committee member) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05
Description
Radiometric dating estimates the age of rocks by comparing the concentration of a decaying radioactive isotope to the concentrations of its decay byproducts. Radiometric dating has been instrumental in the calculation of the Earth's age, the Moon's age, and the age of our solar system. Geochronologists in the School of Earth and Space Exploration at ASU use radiometric dating extensively in their research, and have very specific procedures, hardware, and software to perform the dating calculations. Researchers use lasers to drill small holes, or ablations, in rock faces, collect the masses of various isotopes using a mass spectrometer, and scan the pit with an interferometer, which records the z heights of the pit on an x-y grid. This scan is then processed by custom-made software to determine the volume of the pit, which is used along with the isotope masses and known decay rates to determine the age of the rock. My research has focused on improving this volume calculation through computational-geometry methods of surface reconstruction. During the process, I created a web application that reads interferometer scans, reconstructs a surface from those scans with Poisson reconstruction, renders the surface in the browser, and calculates the volume of the pit based on parameters provided by the researcher. The scans are stored in a central cloud datastore for future analysis, allowing researchers in the geochronology community to collaborate on scans from various rocks in their individual labs. The result of the project has been a complete and functioning application that is accessible to any researcher and reproducible from any computer. The 3D representation of the scan data allows researchers to easily understand the topology of the pit ablation and determine early on whether the measurements of the interferometer are trustworthy for the particular ablation. The new software also reduces the variability in the volume calculation, which hopefully indicates that the process is removing noise from the scan data and performing volume calculations on a more realistic representation of the actual ablation. In the future, this research will be used as the groundwork for more robust testing and closer approximations through implementation of different reconstruction algorithms. As the project grows and becomes more usable, hopefully there will be adoption in the community and it will become a reproducible standard for geochronologists performing radiometric dating.
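
The production software is more involved, but the two core calculations described above are conceptually simple. A toy version, assuming a regular x-y grid of z heights and the standard decay-age relation t = (1/λ)·ln(1 + D/P), might look like:

```python
import math

def pit_volume(z_grid, dx, dy, rim_height=0.0):
    """Approximate pit volume as a Riemann sum over the scan grid:
    depth below the rim times the area of each grid cell."""
    return sum(max(rim_height - z, 0.0) * dx * dy
               for row in z_grid for z in row)

def radiometric_age(daughter, parent, decay_constant):
    """t = (1/lambda) * ln(1 + D/P), the standard decay-age equation."""
    return math.log(1.0 + daughter / parent) / decay_constant

# Toy 3x3 interferometer scan (heights in micrometers) and a
# potassium-like decay constant (per year); values are illustrative only.
scan = [[0.0, -2.0, 0.0], [-2.0, -5.0, -2.0], [0.0, -2.0, 0.0]]
print(pit_volume(scan, dx=1.0, dy=1.0))      # cubic micrometers
print(radiometric_age(0.1, 1.0, 5.543e-10))  # years
```

The Poisson-reconstruction step in the thesis replaces this naive Riemann sum with a surface fitted to the scan points, which the abstract credits with removing noise from the scans.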
Contributors: Pruitt, Jacob Richard (Author) / Hodges, Kip (Thesis director) / Mercer, Cameron (Committee member) / van Soest, Matthijs (Committee member) / Department of Economics (Contributor) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05
Description
Bioscience High School, a small STEAM-focused (Science, Technology, Engineering, Arts, Math) magnet high school located in Downtown Phoenix, has been pushing to establish a computer science curriculum for all of its students from freshman to senior year. The school's Mision (Mission and Vision) is to "...provide a rigorous, collaborative, and relevant academic program emphasizing an innovative, problem-based curriculum that develops literacy in the sciences, mathematics, and the arts, thus cultivating critical thinkers, creative problem-solvers, and compassionate citizens, who are able to thrive in our increasingly complex and technological communities." Computational thinking is an important part of developing the kind of problem solver Bioscience High School is looking to produce. Bioscience High School is unique in that every student has a computer available to use, so it makes complete sense for the school to add computer science to its curriculum, since one of the school's goals is to utilize its resources to their full potential. However, the school's attempt at computer science integration has fallen short due to the lack of expertise among the math and science teachers. The lack of training and support has postponed the development of the program, and the school is desperately in need of someone with expertise in the field to help reboot it. As a result, I have decided to create a course focused on teaching students the concepts of computational thinking and its application through Scratch and Arduino programming.
Contributors: Liu, Deming (Author) / Meuth, Ryan (Thesis director) / Nakamura, Mutsumi (Committee member) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05
Description
This project's goal was to design a Central Processing Unit (CPU) incorporating a fairly large instruction set and a multistage pipeline design, with the potential to be used in a multi-core system. The CPU was coded and synthesized in Verilog. This was accomplished by building on the CPU design fundamentals learned in CSE320 and expanding the instruction set to resemble a proper Reduced Instruction Set Computing (RISC) CPU. A multistage pipeline was incorporated into the CPU to increase instruction throughput, or instructions per second. A major area of focus was creating a multi-core design. The design used is master-slave in nature: the master core instructs the sub-cores where they should begin execution, the idea being that the operating system or kernel will execute on the master core and the "user space" programs will run on the sub-cores. The rationale is that the system would specialize in running several small functions across all of its many supported cores. The system supports around 45 instructions, which include several types of jumps and branches (for changing the program counter based on conditions), arithmetic operations (addition, subtraction, or, and, etc.), and system calls (for controlling core execution). The system has a very low clocks-per-instruction (CPI) ratio, but to achieve this the second stage contains several modules and would most likely be a performance bottleneck if implemented. The CPU is not perfect and contains a few errors and oversights, but the system as a whole functions as intended.
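
No Verilog from the project is reproduced here; as a generic illustration of why pipelining lowers effective CPI (all figures hypothetical), a k-stage pipeline finishes n instructions in roughly k + (n - 1) cycles:

```python
def pipelined_cycles(n_instructions, n_stages, stalls=0):
    """A k-stage pipeline fills in k cycles, then retires roughly one
    instruction per cycle; stalls (hazards, branches) add cycles."""
    return n_stages + (n_instructions - 1) + stalls

n, k = 1000, 5
print(f"CPI = {pipelined_cycles(n, k) / n:.3f}")  # tends toward 1.0 as n grows
```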
Contributors: Kolden, Brian Andrew (Author) / Burger, Kevin (Thesis director) / Meuth, Ryan (Committee member) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05
Description
Observations of four times ionized iron and nickel (Fe V & Ni V) in the G191-B2B white dwarf spectrum have been used to test for variations in the fine structure constant, α, in the presence of strong gravitational fields. The laboratory wavelengths for these ions were thought to be the cause of inconsistent conclusions regarding the variation of α as observed through the white dwarf spectrum. This thesis presents 129 revised Fe V wavelengths (1200 Å to 1600 Å) and 161 revised Ni V wavelengths (1200 Å to 1400 Å) with uncertainties of approximately 3 mÅ. A systematic calibration error is identified in the previous Ni V wavelengths and is corrected in this work. The evaluation of the fine structure variation is significantly improved with the results found in this thesis.
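
For context (this parametrization is standard in the fine-structure literature and is an editorial addition, not text from the thesis): each transition frequency ω carries a sensitivity coefficient q to changes in α, so comparing an observed wavelength against an accurate laboratory value bounds the fractional variation:

```latex
\omega = \omega_0 + q\left[\left(\frac{\alpha}{\alpha_0}\right)^{2} - 1\right]
\qquad\Longrightarrow\qquad
\frac{\Delta\alpha}{\alpha_0} \approx \frac{\omega - \omega_0}{2q}
```

This is why laboratory wavelengths accurate to a few mÅ matter: the reference value ω₀ enters the constraint directly.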
Contributors: Ward, Jacob Wolfgang (Author) / Treacy, Michael (Thesis director) / Alarcon, Ricardo (Committee member) / Nave, Gillian (Committee member) / Department of Physics (Contributor) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05
Description
Preventive maintenance is a practice that has become popular in recent years, largely due to the increased dependency on electronics and other mechanical systems in modern technologies. The main idea of preventive maintenance is to take care of maintenance-type issues before they fully appear or cause disruption of processes and daily operations. One of the most important parts is being able to predict failures in the system, in order to make sure they are fixed before they turn into large issues. One specific area where preventive maintenance is a very big part of daily activity is the automotive industry. Automobile owners are encouraged to take their cars in for maintenance on a routine schedule (based on mileage or time), or when their car signals that there is an issue (low oil levels, for example). Although this level of maintenance is enough when people are in charge of cars, the rise of autonomous vehicles, specifically self-driving cars, changes that. Now, instead of a human looking at a car and diagnosing any issues, the car needs to be able to do this itself. The objective of this project was to create such a system. The Electronics Preventive Maintenance System (EPMS) is an internal system designed to meet all of these criteria and more. The EPMS comprises a central computer which monitors all major electronic components in an autonomous vehicle through the use of standard off-the-shelf sensors. The central computer compiles the sensor data and is able to sort and analyze the readings. The filtered data is run through several mathematical models, each of which diagnoses issues in a different part of the vehicle. The data for each component in the vehicle is compared to pre-set operating conditions. These operating conditions are set in order to encompass all normal ranges of output. If the sensor data is outside the margins, the warning and deviation are recorded and a severity level is calculated. In addition to the individual focus, there is also a vehicle-wide model, which predicts how necessary maintenance is for the vehicle. All of these results are analyzed by a simple heuristic algorithm, and a decision is made about the vehicle's health status, which is sent out to the Fleet Management System. This system allows for accurate, effortless monitoring of all parts of an autonomous vehicle, as well as predictive modeling that allows the system to determine maintenance needs. With this system, human inspectors are no longer necessary for a fleet of autonomous vehicles. Instead, the Fleet Management System is able to oversee inspections, and the system operator is able to set parameters to decide when to send cars for maintenance. All the models used for the sensor and component analysis are tailored specifically to the vehicle. The models and operating margins are created using empirical data collected during normal testing operations. The system is modular and can be used in a variety of different vehicle platforms, including underwater autonomous vehicles and aerial vehicles.
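
The project's models are specific to the vehicle, but the margin-and-severity logic described above can be sketched generically (all thresholds and names below are hypothetical):

```python
from dataclasses import dataclass

@dataclass
class OperatingRange:
    """Pre-set normal operating margin for one monitored component."""
    low: float
    high: float

def severity(reading, rng):
    """Return 0.0 for an in-range reading; otherwise scale severity by
    how far the reading deviates beyond its operating margin."""
    if rng.low <= reading <= rng.high:
        return 0.0
    edge = rng.low if reading < rng.low else rng.high
    return abs(reading - edge) / (rng.high - rng.low)

# Example: coolant temperature with a 60-90 C normal range.
print(severity(95.0, OperatingRange(60.0, 90.0)))  # ~0.167: mild deviation
```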
Contributors: Mian, Sami T. (Author) / Collofello, James (Thesis director) / Chen, Yinong (Committee member) / School of Mathematical and Statistical Sciences (Contributor) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05
Description
Company X has developed RealSense™ technology, a depth-sensing camera that provides machines the ability to capture three-dimensional spaces along with motion within those spaces. The goal of RealSense was to give machines human-like senses, such as knowing how far away objects are and perceiving the surrounding environment. The key issue for Company X is how to commercialize RealSense's depth-recognition capabilities. This thesis addresses the problem by examining which markets to enter and how to monetize the technology. The first part of the analysis identified potential markets for RealSense. This was achieved by evaluating current markets that could benefit from the camera's gesture-recognition, 3D-scanning, and depth-sensing abilities. After identifying seven industries where RealSense could add value, a model of the available, addressable, and obtainable market sizes was developed for each segment. Key competitors and market dynamics were used to estimate the portion of the market that Company X could capture. These models provided a forecast of the discounted gross profits that could be earned over the next five years. These forecasted gross profits, combined with an examination of the competitive landscape and synergistic opportunities, resulted in the selection of the three segments thought to be most profitable for Company X: smart home, consumer drones, and automotive. The final part of the analysis investigated entrance strategies. Company X's competitive advantages in each space were found by examining the competition, both for the RealSense camera in general and for other technologies specific to each industry. Finally, ideas about ways to monetize RealSense were developed by exploring various revenue models and channels.
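
The underlying market models are not included in this listing; the discounting step they rely on is standard, and with invented numbers it reduces to:

```python
def discounted_gross_profit(annual_profits, rate):
    """Present value of forecast gross profits:
    sum of profit_t / (1 + rate)**t over each forecast year t."""
    return sum(p / (1 + rate) ** t
               for t, p in enumerate(annual_profits, start=1))

# Hypothetical five-year forecast (in millions) at a 10% discount rate.
forecast = [2.0, 4.0, 7.0, 9.0, 12.0]
print(f"PV = {discounted_gross_profit(forecast, 0.10):.2f}M")
```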
Contributors: Dunn, Nicole (Co-author) / Boudreau, Thomas (Co-author) / Kinzy, Chris (Co-author) / Radigan, Thomas (Co-author) / Simonson, Mark (Thesis director) / Hertzel, Michael (Committee member) / WPC Graduate Programs (Contributor) / Department of Psychology (Contributor) / Department of Finance (Contributor) / School of Accountancy (Contributor) / Department of Economics (Contributor) / School of Mathematical and Statistical Science (Contributor) / W. P. Carey School of Business (Contributor) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05
Description
This paper presents work that was done to create a system capable of facial expression recognition (FER) using deep convolutional neural networks (CNNs) and to test multiple configurations and methods. CNNs are able to extract powerful information about an image using multiple layers of generic feature detectors. The extracted information can be used to understand the image better by recognizing the different features present within it. Deep CNNs, however, require training sets that can be larger than a million pictures in order to fine-tune their feature detectors. No facial expression datasets of this size are available. Due to this limited availability of training data, the idea of naïve domain adaptation is explored: instead of creating and using a new CNN trained specifically to extract features related to FER, a CNN previously trained for another computer vision task is used. Work for this research involved creating a system that can run a CNN, extract feature vectors from it, and classify those extracted features. Once this system was built, different aspects of it were tested and tuned, including the pre-trained CNN that was used, the layer from which features were extracted, the normalization applied to input images, and the training data for the classifier. Once properly tuned, the system returned results more accurate than previous attempts at facial expression recognition. Based on these positive results, naïve domain adaptation is shown to successfully leverage the advantages of deep CNNs for facial expression recognition.
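
The thesis does not say which framework or pre-trained network it used; as a present-day sketch of the same naïve-domain-adaptation idea (a frozen pre-trained backbone as a generic feature extractor, with a shallow classifier trained on top), using torchvision's ResNet-18 purely as a stand-in:

```python
import torch
import torchvision.models as models
import torchvision.transforms as transforms
from sklearn.svm import LinearSVC

# A CNN pre-trained on ImageNet, not on facial expressions; replacing its
# classifier head with Identity turns it into a 512-d feature extractor.
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()
backbone.eval()

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

@torch.no_grad()
def extract(images):
    """images: list of PIL images -> (N, 512) feature matrix."""
    batch = torch.stack([preprocess(im) for im in images])
    return backbone(batch).numpy()

# A shallow classifier is then trained on the extracted features, e.g.:
#   clf = LinearSVC().fit(extract(train_images), train_labels)
```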
Contributors: Eusebio, Jose Miguel Ang (Author) / Panchanathan, Sethuraman (Thesis director) / McDaniel, Troy (Committee member) / Venkateswara, Hemanth (Committee member) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05