This collection includes both ASU Theses and Dissertations, submitted by graduate students, and Barrett, The Honors College theses, submitted by undergraduate students.

Description
The objective of this paper is to provide an educational diagnostic of blockchain technology and its application to the supply chain. Education on the topic is important to prevent misinformation about the capabilities of blockchain. As a new technology, blockchain can be confusing to grasp given the wide range of possibilities it offers, and defining it too broadly only convolutes the topic. Instead, the focus is maintained on explaining the technical details of how and why this technology improves the supply chain. The scope of explanation is not limited to solutions; current problems are also detailed. Both public and private blockchain networks are explained, along with the solutions they provide in supply chains. In addition, other non-blockchain systems are described that provide important pieces of supply chain operations that blockchain cannot. Applied to the supply chain, blockchain improves consumer transparency, resource management, logistics, trade finance, and liquidity.
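As a generic illustration of why a blockchain ledger supports the transparency and traceability claims above (a minimal hash-chain sketch, not code from the paper), each supply-chain record commits to the hash of the previous record, so altering any entry invalidates every later one:

```python
import hashlib, json

# Generic hash-chain sketch: each supply-chain record commits to the hash
# of the previous record, so tampering with any entry is detectable.
def block_hash(record, prev_hash):
    payload = json.dumps({"record": record, "prev": prev_hash}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

chain = []
prev = "0" * 64
for record in ["shipped from factory", "received at port", "delivered"]:
    prev = block_hash(record, prev)
    chain.append({"record": record, "hash": prev})

# Verification: recompute each hash; any edited record breaks the chain.
prev = "0" * 64
for block in chain:
    assert block_hash(block["record"], prev) == block["hash"], "tampering detected"
    prev = block["hash"]
print("chain verified")
```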
Contributors: Krukar, Joel Michael (Author) / Oke, Adegoke (Thesis director) / Duarte, Brett (Committee member) / Hahn, Richard (Committee member) / School of Mathematical and Statistical Sciences (Contributor) / Department of Economics (Contributor) / Barrett, The Honors College (Contributor)
Created: 2018-05
Description
Breast microcalcifications are a potential indicator of cancerous tumors. Current visualization methods are either uncomfortable or impractical. Impedance-measurement studies have been performed, but not in a clinical setting, due to low sensitivity and specificity. We hope to overcome this challenge with the development of a highly accurate impedance probe on a biopsy needle. With this technique, microcalcifications and the surrounding tissue could be differentiated more efficiently and comfortably than with current biopsy techniques. We have developed and tested a functioning prototype for a biopsy needle that uses bioimpedance sensors to detect microcalcifications in the human body. In the final prototype, a waveform generator sends a sine wave at a relatively low frequency (<1 kHz) into a pre-amplifier, which both stabilizes and amplifies the signal. A modified Howland bridge is then used to drive a steady AC current through the electrodes. The voltage difference across the electrodes is then used to calculate the impedance between them. In our testing, the microcalcifications we are looking for have a noticeably higher impedance than the surrounding breast tissue; this spike in impedance signals the presence of the calcifications, which are then sampled for examination by radiology.
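To make the measurement principle concrete, here is a minimal sketch of the impedance calculation the abstract describes, assuming a known RMS excitation current from the Howland source; the current value, threshold, and voltage readings are illustrative, not the thesis's actual parameters:

```python
import numpy as np

# Minimal sketch of the impedance calculation described above.
# Assumes a Howland current source driving a known AC current through
# the electrodes; all values are illustrative only.
I_RMS = 100e-6          # assumed 100 uA excitation current (RMS)
THRESHOLD_OHMS = 5e3    # hypothetical impedance threshold for a calcification

def impedance_magnitude(v_rms):
    """|Z| = V/I for the voltage measured across the electrodes."""
    return v_rms / I_RMS

# Simulated voltage readings as the needle advances through tissue.
voltages = np.array([0.12, 0.13, 0.95, 0.14])   # volts (RMS), illustrative
for z in impedance_magnitude(voltages):
    flag = "calcification?" if z > THRESHOLD_OHMS else "soft tissue"
    print(f"|Z| = {z:,.0f} ohms -> {flag}")
```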
Contributors: Wen, Robert Bobby (Co-author) / Grula, Adam (Co-author) / Vergara, Marvin (Co-author) / Ramkumar, Shreya (Co-author) / Kozicki, Michael (Thesis director) / Ranjani, Kumaran (Committee member) / School of Molecular Sciences (Contributor) / Electrical Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2018-05
Description
Object localization is used to determine the location of a device, an important aspect of applications ranging from autonomous driving to augmented reality. Commonly used localization techniques include global positioning systems (GPS), simultaneous localization and mapping (SLAM), and positional tracking, but all of these methodologies have drawbacks, especially in high-traffic indoor or urban environments. Drawing on recent improvements in machine learning, this project proposes a new method of localization that uses networks with several wireless transceivers and can be implemented without heavy computational loads or high costs. This project aims to build a proof-of-concept prototype and demonstrate that the proposed technique is feasible and accurate.

Modern communication networks heavily depend upon an estimate of the communication channel, which represents the distortions that a transmitted signal takes as it moves towards a receiver. A channel can become quite complicated due to signal reflections, delays, and other undesirable effects and, as a result, varies significantly with each different location. This localization system seeks to take advantage of this distinctness by feeding channel information into a machine learning algorithm, which will be trained to associate channels with their respective locations. A device in need of localization would then only need to calculate a channel estimate and pose it to this algorithm to obtain its location.

As an additional step, the effect of location noise is investigated in this report. Once the localization system described above demonstrates promising results, the team shows that the system is robust to noise on its location labels. In doing so, the team establishes that this system could be implemented in a continued-learning environment, in which some user agents report their estimated (noisy) location over a wireless communication network, so that the model can be deployed without extensive data collection prior to release.
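A minimal sketch of the channel-fingerprint idea, including noisy location labels, might look like the following; the synthetic channel model, network size, and noise scale are all assumptions for illustration, not the team's actual setup:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Toy sketch of channel-fingerprint localization (not the thesis's model).
# Each sample: a flattened channel estimate (complex gains split into
# real/imag features); label: a 2-D (x, y) position, with Gaussian noise
# added to mimic user-reported locations.
rng = np.random.default_rng(0)
n_samples, n_subcarriers = 2000, 32

positions = rng.uniform(0, 10, size=(n_samples, 2))        # true (x, y) in meters
# Assumed synthetic channel: position-dependent phase plus measurement noise.
phases = positions @ rng.normal(size=(2, n_subcarriers))
channels = np.exp(1j * phases) + 0.05 * rng.normal(size=(n_samples, n_subcarriers))
features = np.hstack([channels.real, channels.imag])

noisy_labels = positions + rng.normal(scale=0.5, size=positions.shape)

model = MLPRegressor(hidden_layer_sizes=(128, 64), max_iter=500)
model.fit(features[:1600], noisy_labels[:1600])            # train on noisy labels
err = np.linalg.norm(model.predict(features[1600:]) - positions[1600:], axis=1)
print(f"mean localization error: {err.mean():.2f} m")      # scored on true positions
```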
Contributors: Chang, Roger (Co-author) / Kann, Trevor (Co-author) / Alkhateeb, Ahmed (Thesis director) / Bliss, Daniel (Committee member) / Electrical Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2020-05
Description
At present, the vast majority of human subjects with neurological disease are still diagnosed through in-person assessments and qualitative analysis of patient data. In this paper, we propose to use Topological Data Analysis (TDA) together with machine learning tools to automate the process of Parkinson's disease classification and severity assessment. An automated, stable, and accurate method to evaluate Parkinson's would be significant in streamlining diagnoses and giving families more time for corrective measures. We propose a methodology that incorporates TDA into the analysis of Parkinson's disease postural-shift data through the representation of persistence images. Topological features are robust to small changes in the data and have been shown to perform well in discrimination tasks. The contributions of the paper are twofold: we propose a method to 1) distinguish healthy patients from those afflicted by the disease and 2) assess the severity of the disease. We explore the use of the proposed method in an application involving a Parkinson's disease dataset composed of healthy-elderly, healthy-young, and Parkinson's disease patients.
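A toy version of such a pipeline, assuming the open-source ripser and persim packages (the thesis's exact tooling is not specified), turns point clouds into persistence diagrams, then persistence images, then classifier features:

```python
import numpy as np
from ripser import ripser              # persistent homology
from persim import PersistenceImager   # persistence images
from sklearn.svm import SVC

# Toy sketch of the TDA pipeline described above (illustrative, not the
# thesis's code). Data here is random; real input would be postural-sway
# trajectories embedded as point clouds.
rng = np.random.default_rng(0)
clouds = [rng.normal(size=(100, 2)) * (1 + label) for label in (0, 1) for _ in range(20)]
labels = np.repeat([0, 1], 20)

diagrams = [ripser(c)["dgms"][1] for c in clouds]   # H1 persistence diagrams
imgr = PersistenceImager(pixel_size=0.2)
imgr.fit(diagrams)
features = np.array([img.ravel() for img in imgr.transform(diagrams)])

clf = SVC().fit(features[::2], labels[::2])          # train on half the samples
print("held-out accuracy:", clf.score(features[1::2], labels[1::2]))
```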
Contributors: Rahman, Farhan Nadir (Co-author) / Nawar, Afra (Co-author) / Turaga, Pavan (Thesis director) / Krishnamurthi, Narayanan (Committee member) / Electrical Engineering Program (Contributor) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2020-05
Description
The passage of 2007's Legal Arizona Workers Act, which required all new hires to be tested for legal employment status through the federal E-Verify database, drastically changed the employment prospects for undocumented workers in the state. Using data from the 2007-2010 American Community Survey, this paper seeks to identify the impact of this law on the labor force in Arizona, specifically regarding undocumented workers and less-educated native workers. Overall, the data show that the wage bias against undocumented immigrants doubled in the four years studied, and that the wages of native workers without a high school degree saw a temporary increase relative to comparable workers in other states. The law did not have an effect on the wages of native workers with a high school degree.
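The comparison described, Arizona workers against comparable workers in other states before and after the law, has the shape of a difference-in-differences regression. A hypothetical sketch with assumed column names (not the paper's actual specification):

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical difference-in-differences sketch of the comparison described
# above (file and column names are assumptions, not the paper's variables).
# acs.csv: one row per ACS respondent, with log_wage, arizona (0/1),
# post2008 (0/1), and an education category for controls.
df = pd.read_csv("acs.csv")
model = smf.ols("log_wage ~ arizona * post2008 + C(education)", data=df).fit()
# The interaction coefficient estimates the wage change in Arizona
# relative to other states after the law took effect.
print(model.params["arizona:post2008"])
```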
Contributors: Santiago, Maria Christina (Author) / Pereira, Claudiney (Thesis director) / Mendez, Jose (Committee member) / School of International Letters and Cultures (Contributor) / Department of Economics (Contributor) / Barrett, The Honors College (Contributor)
Created: 2018-05
Description
Buck converters are a class of switched-mode power converters often used to step down DC input voltages to a lower DC output voltage. These converters naturally produce a current and voltage ripple at their output due to their switching action. Traditional methods of reducing this ripple have involved adding large discrete inductors and capacitors to filter the ripple, but large discrete components cannot be integrated onto chips. As an alternative to using passive filtering components, this project investigates the use of active ripple cancellation to reduce the peak output ripple. Hysteretic-controlled buck converters were chosen for their simplicity of design and fast transient response. The proposed cancellation circuits sense the output ripple of the buck converter and inject an equal ripple exactly out of phase with the sensed ripple. Both current-mode and voltage-mode feedback loops are simulated, and the effectiveness of each cancellation circuit is examined. Results show that integrated active ripple cancellation circuits offer a promising substitute for large discrete filters.
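A toy numerical sketch of the cancellation idea, with an assumed switching frequency, ripple amplitude, and loop gain/delay error (illustrative values, not the simulated circuits):

```python
import numpy as np

# Toy sketch of active ripple cancellation (illustrative only): a triangular
# output ripple is sensed and an inverted copy is injected at the output.
fsw = 1e6                                   # assumed 1 MHz switching frequency
t = np.linspace(0, 4 / fsw, 4000)
# Triangular ripple, 20 mV peak-to-peak, riding on a 1.2 V output.
ripple = 0.010 * (2 * np.abs(2 * ((t * fsw) % 1) - 1) - 1)
v_out = 1.2 + ripple

# Ideal cancellation injects -ripple; model the loop's gain and delay error.
gain_error, delay = 0.95, 0.02 / fsw
injected = -gain_error * np.interp(t - delay, t, ripple)
v_cancelled = v_out + injected

print(f"ripple before: {np.ptp(ripple)*1e3:.1f} mVpp, "
      f"after: {np.ptp(v_cancelled - 1.2)*1e3:.1f} mVpp")
```

Even with 5% gain error and a small sensing delay, the residual ripple in this sketch drops by roughly an order of magnitude, which is the basic appeal of the technique over bulky passive filtering.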
Contributors: Wang, Ziyan (Author) / Bakkaloglu, Bertan (Thesis director) / Kitchen, Jennifer (Committee member) / Electrical Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2017-12
Description
This creative project thesis involves electronic music composition and production, and it uses some elements of algorithmic music composition (through recurrent neural networks). Algorithmic composition techniques are used here as a tool in composing the pieces, but are not the main focus. Thematically, this project explores the analogy between artificial neural networks and neural activity in the brain. The project consists of three short pieces, each exploring this concept in different ways.
Contributors: Karpur, Ajay (Author) / Suzuki, Kotoka (Thesis director) / Ingalls, Todd (Committee member) / Electrical Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05
Description
The January 12, 2010 Haiti earthquake, which hit Port-au-Prince in the late afternoon, was the cause of over 220,000 deaths and $8 billion in damages, roughly 120% of national GDP at the time. A Mw 7.5 earthquake struck rural Guatemala in the early morning in 1976 and caused 23,000-25,000 deaths, three times as many injuries, and roughly $1.1 billion in damages, approximately 30% of Guatemala's GDP. The earthquake that hit just outside of Christchurch, New Zealand early in the morning on September 4, 2010 had a magnitude of 7.1 and caused just two injuries, no deaths, and roughly $7.2 billion in damages (5% of GDP). These three earthquakes, all with magnitudes over 7, caused extremely varied amounts of economic damage for these three countries. This thesis aims to identify a possible explanation as to why this was the case and suggest ways in which to improve disaster risk management going forward.
Contributors: Heuermann, Jamie Lynne (Author) / Schoellman, Todd (Thesis director) / Mendez, Jose (Committee member) / Department of Supply Chain Management (Contributor) / Department of Economics (Contributor) / W. P. Carey School of Business (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05
Description
Radiometric dating estimates the age of rocks by comparing the concentration of a decaying radioactive isotope to the concentrations of the decay byproducts. Radiometric dating has been instrumental in the calculation of the Earth's age, the Moon's age, and the age of our solar system. Geochronologists in the School of Earth and Space Exploration at ASU use radiometric dating extensively in their research and have very specific procedures, hardware, and software to perform the dating calculations. Researchers use lasers to drill small holes, or ablations, in rock faces, collect the masses of various isotopes using a mass spectrometer, and scan the pit with an interferometer, which records the z heights of the pit on an x-y grid. This scan is then processed by custom-made software to determine the volume of the pit, which is then used along with the isotope masses and known decay rates to determine the age of the rock. My research has focused on improving this volume calculation through computational-geometry methods of surface reconstruction. During the process, I created a web application that reads interferometer scans, reconstructs a surface from those scans with Poisson reconstruction, renders the surface in the browser, and calculates the volume of the pit based on parameters provided by the researcher. The scans are stored in a central cloud datastore for future analysis, allowing researchers in the geochronology community to collaborate on scans from various rocks in their individual labs. The result of the project is a complete and functioning application that is accessible to any researcher and reproducible from any computer. The 3D representation of the scan data allows researchers to easily understand the topology of the pit ablation and determine early on whether the interferometer's measurements are trustworthy for the particular ablation. The volume calculation by the new software also reduces the variability of the volume calculation, which hopefully indicates that the process removes noise from the scan data and performs volume calculations on a more realistic representation of the actual ablation. In the future, this research will serve as the groundwork for more robust testing and closer approximations through the implementation of different reconstruction algorithms. As the project grows and becomes more usable, it will hopefully be adopted by the community and become a reproducible standard for geochronologists performing radiometric dating.
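For illustration, a minimal sketch of the two calculations described above: integrating pit depth over the interferometer's x-y grid, and applying the standard age equation t = (1/λ) ln(1 + D/P). The grid spacing, reference-surface choice, and isotope amounts are assumptions for the example, not the application's actual logic:

```python
import numpy as np

# Illustrative sketch (not the thesis's application) of the two calculations
# described above: pit volume from an interferometer z-height grid, and a
# radiometric age from parent/daughter isotope amounts.

def pit_volume(z, dx, dy):
    """Integrate depth below the reference surface over the x-y grid."""
    surface = np.median(z)                  # assume the flat rock face dominates
    depth = np.clip(surface - z, 0.0, None)
    return depth.sum() * dx * dy            # simple Riemann-sum volume

def radiometric_age(daughter, parent, decay_const):
    """t = (1/lambda) * ln(1 + D/P), the standard age equation."""
    return np.log(1.0 + daughter / parent) / decay_const

# Example with made-up numbers: a Gaussian pit on a 10x10 um scan,
# and the U-238 decay constant (~1.55125e-10 per year).
z = -np.fromfunction(lambda i, j: np.exp(-((i - 50)**2 + (j - 50)**2) / 200),
                     (100, 100))
print("volume:", pit_volume(z, dx=0.1, dy=0.1), "um^3")
print("age:", radiometric_age(daughter=0.1, parent=1.0,
                              decay_const=1.55125e-10), "years")
```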
Contributors: Pruitt, Jacob Richard (Author) / Hodges, Kip (Thesis director) / Mercer, Cameron (Committee member) / van Soest, Matthijs (Committee member) / Department of Economics (Contributor) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05
Description
I built a short-term West Texas Intermediate (WTI) crude oil price-forecasting model for two periods to understand how various drivers of crude oil behaved before and after the Great Recession. According to the Federal Reserve, the Great Recession "...began in December 2007 and ended in June 2009" (Rich 1). The research involves two models spanning two periods: the first encompasses 2000 to late 2007, and the second encompasses early 2010 to 2016. The dependent variable for this model is the monthly average WTI crude oil price. The independent variables are based on what the academic community believes drives crude oil prices; while the studies are scattered across different time periods, they provide valuable insight into those drivers. The model includes variables that address two data groups: (1) market fundamentals and expectations of market fundamentals, and (2) speculation. One of the biggest challenges I faced was defining and quantifying "speculation". I ended up using a previous study's definition, which treats speculation as the activity of certain market participants in the Commitments of Traders report released by the Commodity Futures Trading Commission. My research shows that the West Texas Intermediate crude oil market exhibited a structural change after the Great Recession. It also presents interesting findings that warrant further research: for example, 3-month T-bills and 10-year Treasury notes lose their predictive edge in the second period (2010-2016), and the positive correlation between oil and the U.S. dollar in 2000-2007 warrants further investigation. Lastly, it might be interesting to examine why T-bills are positively correlated with WTI prices while 10-year Treasury notes are negatively correlated.
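A hypothetical sketch of the two-period regression setup with assumed column names (the thesis's actual variables are not listed in the abstract):

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical sketch of the two-period regression described above
# (file and column names are assumptions, not the thesis's dataset).
# wti.csv: monthly rows with wti_price, fundamentals proxies (inventories,
# production), and a speculation measure from the CFTC COT report.
df = pd.read_csv("wti.csv", parse_dates=["month"], index_col="month")

def fit_period(data, start, end):
    sub = data.loc[start:end]
    X = sm.add_constant(sub[["inventories", "production", "speculation"]])
    return sm.OLS(sub["wti_price"], X).fit()

pre = fit_period(df, "2000-01", "2007-11")   # before the Great Recession
post = fit_period(df, "2010-01", "2016-12")  # after
# Comparing coefficients across the two fits is one simple way to see the
# structural change the thesis reports.
print(pre.params, post.params, sep="\n")
```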
Contributors: Mirza, Hisham Tariq (Author) / McDaniel, Cara (Thesis director) / Budolfson, Arthur (Committee member) / Department of Finance (Contributor) / Department of Economics (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05