Matching Items (248)
Description

The built environment is responsible for a significant portion of global waste generation. Construction and demolition (C&D) waste requires significant landfill areas and costs billions of dollars. New business models that reduce this waste may prove to be financially beneficial and generally more sustainable. One such model is referred to as the “Circular Economy” (CE), which promotes the efficient use of materials to minimize waste generation and raw material consumption. CE is achieved by maximizing the life of materials and components and by reclaiming the typically wasted value at the end of their life. This thesis identifies the potential opportunities for using CE in the built environment. It first calculates the magnitude of C&D waste and its main streams, highlights the top C&D materials based on weight and value using data from various regions, identifies the top C&D materials’ current recycling and reuse rates, and finally estimates a potential financial benefit of $3.7 billion from redirecting C&D waste using the CE concept in the United States.

Contributors: Aldaaja, Mohammad (Author) / El Asmar, Mounir (Thesis advisor) / Buch, Rajesh (Committee member) / Kaloush, Kamil (Committee member) / Arizona State University (Publisher)
Created: 2019
Description
Object localization is used to determine the location of a device, an important aspect of applications ranging from autonomous driving to augmented reality. Commonly-used localization techniques include global positioning systems (GPS), simultaneous localization and mapping (SLAM), and positional tracking, but all of these methodologies have drawbacks, especially in high traffic indoor or urban environments. Using recent improvements in the field of machine learning, this project proposes a new method of localization using networks with several wireless transceivers and implemented without heavy computational loads or high costs. This project aims to build a proof-of-concept prototype and demonstrate that the proposed technique is feasible and accurate.

Modern communication networks heavily depend upon an estimate of the communication channel, which represents the distortions that a transmitted signal takes as it moves towards a receiver. A channel can become quite complicated due to signal reflections, delays, and other undesirable effects and, as a result, varies significantly with each different location. This localization system seeks to take advantage of this distinctness by feeding channel information into a machine learning algorithm, which will be trained to associate channels with their respective locations. A device in need of localization would then only need to calculate a channel estimate and pose it to this algorithm to obtain its location.
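The fingerprinting idea described above can be sketched in a few lines. The following toy is purely illustrative and is not code from the thesis: synthetic multipath tap vectors stand in for real channel estimates, and a nearest-neighbor lookup over a stored fingerprint database plays the role of the trained machine learning model.

```python
import math
import random

def synth_channel(location, paths=4, noise=0.02, rng=None):
    """Synthesize a channel 'fingerprint' for a 2-D grid location.

    A location-seeded generator fixes a few multipath tap gains, and a
    small Gaussian term models measurement error between observations.
    """
    rng = rng or random.Random()
    taps_rng = random.Random(hash(location))      # location-specific multipath
    taps = [taps_rng.uniform(-1, 1) for _ in range(paths)]
    return [t + rng.gauss(0, noise) for t in taps]

def nearest_location(channel, fingerprints):
    """Return the stored location whose fingerprint is closest in L2 norm."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(fingerprints, key=lambda loc: dist(channel, fingerprints[loc]))

# "Training": record one channel estimate per location on a 3x3 grid.
rng = random.Random(0)
db = {(x, y): synth_channel((x, y), rng=rng) for x in range(3) for y in range(3)}

# "Inference": a device at (1, 2) measures a fresh, noisy channel estimate
# and is localized by matching it against the stored fingerprints.
measured = synth_channel((1, 2), rng=rng)
print(nearest_location(measured, db))
```

Because distinct locations produce very different tap vectors while repeated measurements at one location differ only by small noise, the nearest fingerprint recovers the device's position; a learned model generalizes the same association beyond stored grid points.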

As an additional step, the effect of location noise is investigated in this report. Once the localization system described above demonstrates promising results, the team demonstrates that the system is robust to noise on its location labels. In doing so, the team demonstrates that this system could be implemented in a continued learning environment, in which some user agents report their estimated (noisy) location over a wireless communication network, such that the model can be implemented in an environment without extensive data collection prior to release.
Contributors: Chang, Roger (Co-author) / Kann, Trevor (Co-author) / Alkhateeb, Ahmed (Thesis director) / Bliss, Daniel (Committee member) / Electrical Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2020-05
Description
At present, the vast majority of human subjects with neurological disease are still diagnosed through in-person assessments and qualitative analysis of patient data. In this paper, we propose to use Topological Data Analysis (TDA) together with machine learning tools to automate the process of Parkinson’s disease classification and severity assessment. An automated, stable, and accurate method to evaluate Parkinson’s disease would be significant in streamlining patient diagnoses and giving families more time to pursue corrective measures. We propose a methodology that incorporates TDA into the analysis of Parkinson’s disease postural-shift data through the persistence-image representation. Studying the topology of a system has proven to be invariant to small changes in data and has been shown to perform well in discrimination tasks. The contributions of the paper are twofold: we propose a method to 1) distinguish healthy patients from those afflicted by the disease and 2) assess the severity of the disease. We explore the use of the proposed method in an application involving a Parkinson’s disease dataset comprising healthy elderly, healthy young, and Parkinson’s disease patients.
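As a rough illustration of the persistence-image representation mentioned above, the sketch below rasterizes a persistence diagram onto a small grid using persistence-weighted Gaussian bumps. The diagrams, grid size, and bandwidth are invented for illustration and are not from the study.

```python
import math

def persistence_image(diagram, grid_size=8, sigma=0.1, span=1.0):
    """Rasterize a persistence diagram into a flat 'persistence image'.

    Each (birth, death) pair is mapped to (birth, persistence) and spread
    onto a grid_size x grid_size grid over [0, span]^2 as a Gaussian bump,
    weighted by its persistence so long-lived features dominate.
    """
    points = [(b, d - b) for b, d in diagram]
    step = span / grid_size
    image = []
    for i in range(grid_size):
        for j in range(grid_size):
            cx, cy = (i + 0.5) * step, (j + 0.5) * step   # pixel center
            val = sum(y * math.exp(-((cx - x) ** 2 + (cy - y) ** 2)
                                   / (2 * sigma ** 2))
                      for x, y in points)
            image.append(val)
    return image

# Two toy diagrams: one long-lived topological feature vs. two short-lived
# ("noisy") ones. Persistence weighting makes their feature vectors easy to
# separate, which is what a downstream classifier relies on.
a = persistence_image([(0.1, 0.9)])               # persistence 0.8
b = persistence_image([(0.2, 0.25), (0.5, 0.6)])  # persistences 0.05, 0.1
print(max(a), max(b))
```

The fixed-length vector this produces can be fed to any standard classifier, and small perturbations of the input data move the image only slightly, which is the stability property the abstract alludes to.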
Contributors: Rahman, Farhan Nadir (Co-author) / Nawar, Afra (Co-author) / Turaga, Pavan (Thesis director) / Krishnamurthi, Narayanan (Committee member) / Electrical Engineering Program (Contributor) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2020-05
Description
In this project, the use of deep neural networks for the process of selecting actions to execute within an environment to achieve a goal is explored. Scenarios like this are common in crafting-based games such as Terraria or Minecraft. Goals in these environments have recursive sub-goal dependencies, which form a dependency tree. An agent operating within these environments has access to little data about the environment before interacting with it, so it is crucial that the agent can effectively use the tree of dependencies and its environmental surroundings to judge which sub-goals are most efficient to pursue at any point in time. A successful agent aims to minimize cost when completing a given goal. A deep neural network in combination with Q-learning techniques was employed to act as the agent in this environment. This agent consistently performed better than agents using alternate models (models that used dependency-tree heuristics or human-like approaches to make sub-goal-oriented choices), with an average performance advantage of 33.86% (standard deviation 14.69%) over the best alternate agent. This shows that machine learning techniques can be consistently employed to make goal-oriented choices within an environment with recursive sub-goal dependencies and little pre-known information.
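The thesis pairs a deep neural network with Q-learning; as a simplified stand-in for that setup, the toy below uses plain tabular Q-learning on an invented two-level crafting tree (a "tool" requires a "plank", which in turn requires "wood") to show how an agent can learn to order sub-goals so as to minimize cost. All recipes, rewards, and hyperparameters are illustrative choices, not values from the project.

```python
import random

# Invented crafting tree with a recursive sub-goal dependency.
RECIPES = {"plank": {"wood"}, "tool": {"plank", "stone"}}
GATHERABLE = {"wood", "stone"}
ACTIONS = sorted(GATHERABLE | set(RECIPES))
GOAL = "tool"

def step(state, action):
    """Apply an action to a frozenset of held items; return (state, reward)."""
    if action in GATHERABLE:
        return state | {action}, -1.0            # gathering costs one step
    if RECIPES[action] <= state:                 # prerequisites satisfied
        new_state = (state - RECIPES[action]) | {action}
        return new_state, 10.0 if action == GOAL else -1.0
    return state, -2.0                           # wasted crafting attempt

def train(episodes=3000, alpha=0.5, gamma=0.95, eps=0.3, seed=0):
    """Tabular Q-learning over (held-items, action) pairs."""
    rng, Q = random.Random(seed), {}
    for _ in range(episodes):
        state = frozenset()
        for _ in range(20):
            a = (rng.choice(ACTIONS) if rng.random() < eps
                 else max(ACTIONS, key=lambda x: Q.get((state, x), 0.0)))
            nxt, r = step(state, a)
            best = max(Q.get((nxt, x), 0.0) for x in ACTIONS)
            q = Q.get((state, a), 0.0)
            Q[(state, a)] = q + alpha * (r + gamma * best - q)
            state = nxt
            if GOAL in state:
                break
    return Q

def greedy_plan(Q, max_steps=12):
    """Follow the learned policy greedily from an empty inventory."""
    state, plan = frozenset(), []
    for _ in range(max_steps):
        a = max(ACTIONS, key=lambda x: Q.get((state, x), 0.0))
        plan.append(a)
        state, _ = step(state, a)
        if GOAL in state:
            break
    return plan

print(greedy_plan(train()))  # e.g. gather materials, craft plank, craft tool
```

The learned plan respects the dependency tree without ever being given it explicitly: crafting out of order is penalized during exploration, so the Q-values come to encode the efficient sub-goal ordering. The thesis replaces this lookup table with a deep network so the approach scales to larger state spaces.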
Contributors: Koleber, Derek (Author) / Acuna, Ruben (Thesis director) / Bansal, Ajay (Committee member) / W.P. Carey School of Business (Contributor) / Software Engineering (Contributor) / Barrett, The Honors College (Contributor)
Created: 2018-05
Description
This thesis dives into the world of artificial intelligence by exploring the functionality of a single-layer artificial neural network through a simple housing-price classification example, while simultaneously considering its impact from a data-management perspective at both the software and hardware level. To begin this study, the universally accepted model of an artificial neuron is broken down into its key components, which are then analyzed for functionality by relating them back to their biological counterparts. The role of a neuron is then described in the context of a neural network, with equal emphasis placed on how it undergoes training individually and as part of an entire network. Using the technique of supervised learning, the neural network is trained on three main factors for housing-price classification: the total number of rooms, the number of bathrooms, and the square footage. Once trained with most of the generated data set, it is tested for accuracy by introducing the remainder of the data set and observing how closely its computed output for each set of inputs matches the target value. From a programming perspective, the artificial neuron is implemented in C so that it is more closely tied to the operating system, making the collected profiler data more precise during the program's execution. The program is designed to break each stage of the neuron's training process into distinct functions. In addition to this functional decomposition, the struct data type is used as the underlying data structure, representing not only the neuron but also its training and test data. Once the neuron is fully trained, its test results are graphed to visually depict how well it learned from its training sample. Finally, the profiler data is analyzed to describe how the program operated from a data-management perspective at the software and hardware level.
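The thesis implements its neuron in C; the short Python sketch below mirrors only the training scheme the abstract describes (a single sigmoid neuron, three housing features, supervised learning). The synthetic data, learning rate, and choice of a logistic-loss gradient are illustrative assumptions, not details taken from the thesis.

```python
import math
import random

def train_neuron(samples, labels, epochs=1000, lr=0.1, seed=1):
    """Train a single sigmoid neuron, out = sigmoid(w . x + b), with
    stochastic gradient descent on the logistic loss."""
    rng = random.Random(seed)
    w = [rng.uniform(-0.5, 0.5) for _ in range(len(samples[0]))]
    b = 0.0
    for _ in range(epochs):
        for x, t in zip(samples, labels):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            out = 1.0 / (1.0 + math.exp(-z))
            grad = out - t                      # logistic-loss gradient in z
            w = [wi - lr * grad * xi for wi, xi in zip(w, x)]
            b -= lr * grad
    return w, b

def predict(w, b, x):
    """Classify: 1 ('expensive') if the neuron's pre-activation is positive."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# Tiny synthetic set: (rooms, bathrooms, square footage in thousands).
data = [([2, 1, 0.8], 0), ([3, 1, 1.0], 0), ([3, 2, 1.2], 0),
        ([5, 3, 2.5], 1), ([6, 4, 3.0], 1), ([7, 4, 3.5], 1)]
xs, ys = [x for x, _ in data], [y for _, y in data]
w, b = train_neuron(xs, ys)
print([predict(w, b, x) for x in xs])
```

Because the toy classes are linearly separable, the single neuron learns weights that recover the training labels; the thesis applies the same train-then-hold-out evaluation to its generated housing data.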
Contributors: Richards, Nicholas Giovanni (Author) / Miller, Phillip (Thesis director) / Meuth, Ryan (Committee member) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2018-05
Description
The passage of 2007's Legal Arizona Workers Act, which required all new hires to be tested for legal employment status through the federal E-Verify database, drastically changed the employment prospects for undocumented workers in the state. Using data from the 2007-2010 American Community Survey, this paper seeks to identify the impact of this law on the labor force in Arizona, specifically regarding undocumented workers and less educated native workers. Overall, the data shows that the wage bias against undocumented immigrants doubled in the four years studied, and the wages of native workers without a high school degree saw a temporary, positive increase compared to comparable workers in other states. The law did not have an effect on the wages of native workers with a high school degree.
Contributors: Santiago, Maria Christina (Author) / Pereira, Claudiney (Thesis director) / Mendez, Jose (Committee member) / School of International Letters and Cultures (Contributor) / Department of Economics (Contributor) / Barrett, The Honors College (Contributor)
Created: 2018-05
Description
The January 12, 2010 Haiti earthquake, which hit Port-au-Prince in the late afternoon, caused over 220,000 deaths and $8 billion in damages, roughly 120% of national GDP at the time. A Mw 7.5 earthquake struck rural Guatemala in the early morning in 1976 and caused 23,000-25,000 deaths, three times as many injuries, and roughly $1.1 billion in damages, approximately 30% of Guatemala's GDP. The earthquake that hit just outside Christchurch, New Zealand early in the morning on September 4, 2010 had a magnitude of 7.1 and caused just two injuries, no deaths, and roughly 7.2 billion USD in damages (5% of GDP). These three earthquakes, all with magnitudes over 7, caused extremely varied amounts of economic damage in these three countries. This thesis aims to identify a possible explanation as to why this was the case and to suggest ways to improve disaster risk management going forward.
Contributors: Heuermann, Jamie Lynne (Author) / Schoellman, Todd (Thesis director) / Mendez, Jose (Committee member) / Department of Supply Chain Management (Contributor) / Department of Economics (Contributor) / W. P. Carey School of Business (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05
Description
Sustainable Materials Management and Circular Economy are both frameworks for considering the way we interact with the world's resources. Different organizations and institutions across the world have adopted one philosophy or the other. To some, there seems to be little overlap between the two, and to others, they are perceived as interchangeable. This paper evaluates Sustainable Materials Management (SMM) and Circular Economy (CE) individually and in comparison to see how truly different these frameworks are from one another. This comparison is then extended into a theoretical walk-through of an SMM treatment of concrete pavement in contrast with a CE treatment. With concrete being ubiquitous in the world's buildings and roads, as well as a major constituent of the construction and demolition waste generated, its analysis is applicable to a significant portion of the world's material flow. The ultimate test of differentiation between SMM and CE would ask: 1) If SMM principles guided action, would the outcomes be aligned with or at odds with CE principles? and conversely 2) If CE principles guided action, would the outcomes be aligned with or at odds with SMM principles? Using concrete pavement as an example, this paper seeks to determine whether or not Sustainable Materials Management and Circular Economy are simply different roads leading to the same destination.
Contributors: Abdul-Quadir, Anisa (Author) / Kelman, Candice (Thesis director) / Buch, Rajesh (Committee member) / Barrett, The Honors College (Contributor)
Created: 2017-05
Description
The ability to draft and develop productive Major League players is vital to the success of any MLB organization. A core of cost-controlled, productive players is as important as ever with free agent salaries continuing to rise dramatically. In a sport where mere percentage points separate winners from losers at the end of a long season, any slight advantage in identifying talent is valuable. This study examines the 2004-2008 MLB Amateur Drafts in order to analyze whether certain types of prospects are more valuable selections than others. If organizations can better identify which draft prospects will more likely contribute at the Major League level in the future, they can more optimally spend their allotted signing bonus pool in order to acquire as much potential production as possible through the draft. Based on the data examined, during these five drafts high school prospects provided higher value than college prospects. While college players reached the Majors at a higher rate, high school players produced greater value in their first six seasons of service time. In the all-important first round of the draft, where signing bonuses are at their largest, college players proved the more valuable selection. When players were separated by position, position players held greater expected value than pitchers, with corner infielders leading the way as the position group with the highest expected value. College players were found to provide better value than high school players at defensively demanding positions such as catcher and middle infield, while high school players were more valuable among outfielders and pitchers.
Contributors: Gildea, Adam Joseph (Author) / Eaton, John (Thesis director) / McIntosh, Daniel (Committee member) / Department of Economics (Contributor) / W. P. Carey School of Business (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05
Description
The following paper consists of a review of sovereign debt sustainability economics and IMF debt sustainability frameworks, a historical case study of Greece, and a suggested variable for improving the IMF's baseline assumptions. The purpose of this paper is to review the current methodology of assessing debt and to improve upon it in the face of an increasingly indebted global economy. Thus, this paper suggests the IMF adopt the variable calculated in Reinhart and Rogoff (2009) as a new benchmark for determining the debt sustainability of market access countries. Through an exploration of the most recent Greek crisis, as well as modern Greek financial and political history, the author contends the IMF should narrow the scope of the MAC DSA (the debt sustainability analysis for market access countries), as this will yield better debt sustainability projections and assumptions when implementing debt program policy.
Contributors: Jennings, Zane Phillips (Author) / Mendez, Jose (Thesis director) / Roberts, Nancy (Committee member) / Economics Program in CLAS (Contributor) / School of Politics and Global Studies (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05