Description
A specific species of the genus Geobacter exhibits useful electrical properties when processing a molecule often found in wastewater. A team at ASU including Dr. César Torres and Dr. Sudeep Popat used that species to create a special type of fuel cell known as a microbial fuel cell. Possible chemical processes and properties of the reactions used by the Geobacter are investigated indirectly by taking electrochemical impedance spectroscopy measurements of the electrode-electrolyte interface of the microbial fuel cell, yielding the fuel cell's complex impedance at specific frequencies. Because the multiple polarization processes that give rise to the measured impedance values are difficult to investigate directly, the distribution function of relaxation times (DRT) is examined instead. The DRT is related to the measured complex impedance values through a general, non-physical equivalent circuit model. That model is originally given as a Fredholm integral equation with a non-square-integrable kernel, which makes the inverse problem of determining the DRT from the impedance measurements ill posed. A change of variables rewrites the original integral equation as an equation relating the complex impedance to the convolution of a function based on the original integral kernel with a related but separate distribution function, which we call the convolutional distribution function. This convolutional equation is solved by reducing the convolution to a pointwise product using the Fourier transform, solving the inverse problem by pointwise division and application of a filter function (equivalent to regularization), and then taking the inverse Fourier transform to obtain the convolutional distribution function. In the literature, the convolutional distribution function is then examined and certain values of a specific, less general equivalent circuit model are calculated, from which aspects of the original chemical processes are derived. We attempted instead to determine the original DRT directly from the calculated convolutional distribution function. In practice this method proved less useful: parameters fixed at the time of the experiment meant the original DRT could be recovered only within a window that would not normally contain the desired information, which limits any attempt to extend the solution for the convolutional distribution function to the original DRT. Further research may determine a method for interpreting the convolutional distribution function without an equivalent circuit model, as is done with the regularization method used to solve directly for the original DRT.
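As an illustration of the Fourier-deconvolution step described above, here is a minimal numerical sketch. It is not the thesis code: the grid, the synthetic "true" distribution, and the Gaussian filter width are assumed stand-ins; only the kernel 1/(2 cosh x) is the standard DRT kernel in log-frequency variables.

```python
import numpy as np

# Hypothetical 1-D sketch: the (ohmic-corrected) impedance is modeled as
# kernel * gamma in a log-frequency variable, so the FFT turns the
# convolution into a pointwise product.
n = 512
x = np.linspace(-10, 10, n)                 # log-frequency grid (assumed)
dx = x[1] - x[0]

kernel = 1.0 / (2.0 * np.cosh(x))           # standard DRT kernel in log variables
gamma_true = np.exp(-((x - 1.0) ** 2))      # synthetic "true" distribution

# Forward problem: circular convolution (for illustration only).
z = np.fft.ifft(np.fft.fft(kernel) * np.fft.fft(gamma_true)).real * dx

# Inverse problem: pointwise division plus a low-pass filter function,
# which plays the role of regularization for the ill-posed division.
freqs = np.fft.fftfreq(n, d=dx)
filt = np.exp(-(freqs / 0.5) ** 2)          # assumed Gaussian filter; width is a tuning choice
gamma_rec = np.fft.ifft(filt * np.fft.fft(z) / np.fft.fft(kernel)).real / dx

# gamma_rec is a slightly smoothed copy of gamma_true; with noisy data
# the filter width trades resolution against noise amplification.
print(np.max(np.abs(gamma_rec - gamma_true)))
```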
Contributors: Baker, Robert Simpson (Author) / Renaut, Rosemary (Thesis director) / Kostelich, Eric (Committee member) / School of Mathematical and Statistical Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05
Description
A problem of interest in theoretical physics is the evaporation of black holes via Hawking radiation on a fixed background. We approach this problem by considering an electromagnetic analogue, in which Hawking radiation is replaced by the Schwinger effect. We treat the case of massless QED in 1+1 dimensions using the path-integral approach to quantum field theory, and discuss the Feynman diagrams resulting from our analysis. The results of this thesis may be useful in finding a version of the Schwinger effect that can be solved both exactly and perturbatively, as such a version may provide insights into the gravitational problem of Hawking radiation.
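For orientation, the mechanism substituted for Hawking radiation has a standard closed-form rate. The well-known 3+1-dimensional Schwinger result below is not part of the abstract and is quoted only for context; it shows why the massless limit is the natural analogue of unsuppressed emission.

```latex
% Standard 3+1-dimensional Schwinger pair-production rate per unit
% volume for charge e and mass m in a constant electric field E
% (quoted for context; the thesis itself works in 1+1 dimensions):
\Gamma \;=\; \frac{(eE)^{2}}{4\pi^{3}}
  \sum_{n=1}^{\infty} \frac{1}{n^{2}}
  \exp\!\left(-\frac{n\pi m^{2}}{eE}\right)
% The exponential suppression vanishes as m -> 0, so the massless case
% yields copious pair creation, analogous to steady Hawking emission
% from a fixed background.
```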
Contributors: Dhumuntarao, Aditya (Author) / Parikh, Maulik (Thesis director) / Davies, Paul C. W. (Committee member) / Department of Physics (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05
Description
For this thesis, the authors create a hypothetical private equity real estate investment firm that focuses on creating value for partners by taking an opportunistic approach to acquiring under-performing urban multifamily properties with large upside potential. The project focuses on both the market analysis and the financial modeling associated with the investment strategy and its transactions. Commercial real estate involves a substantial amount of complexity, and this thesis seeks to offer an accurate and comprehensive account of the process while simplifying it for everyday readers. Investment decisions also carry a significant number of risk factors, so the industry best practices documented in this manuscript are valuable tools for successful investing. To gain the most reliable industry knowledge, the authors leveraged the experience of dozens of industry professionals through research and personal interviews. Through careful analysis, the authors were able to ascertain the current position in the real estate cycle and to create a plan for future investing. They were also able to identify and evaluate a specific asset for purchase. As a result, the authors found that multifamily properties are a sound investment for the next two years and that the company should slowly begin shifting toward office and retail properties in 2018.
Contributors: Bacon, David (Co-author) / Soto, Justin (Co-author) / Kashiwagi, Dean (Thesis director) / Kashiwagi, Jacob (Committee member) / Department of Finance (Contributor) / Department of Supply Chain Management (Contributor) / Department of Marketing (Contributor) / W. P. Carey School of Business (Contributor) / School of Accountancy (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05
Description
The Clean Power Plan seeks to reduce CO2 emissions from the energy industry, the largest source of CO2 emissions in the United States. To comply with the Clean Power Plan, electric utilities in Arizona will need to meet electricity demand while reducing the use of fossil fuel sources in generation. The study first outlines the organization of the power sector in the United States and the structural and price changes attempted in the industry during the period of restructuring. The recent final rule of the Clean Power Plan is then described in detail, with a narrowed focus on Arizona. Data from APS, a representative Arizona utility, is used for the remainder of the analysis to determine the price increase necessary to cut Arizona's CO2 emissions enough to meet the federal goal. The first regression models the variables that affect total demand, and thus generation load, from which we estimate the marginal effect of price on demand. The second regression models CO2 emissions as a function of different levels of generation, which allows the effect of generation on emissions to vary across ranges of load, following the logic of the merit order of plants and the differing emission rates of different sources. Two methods are used to find the necessary percentage increase in price to meet the CPP goals: one based on the mass-based goal for Arizona and the other based on the percentage-reduction goal for Arizona. A price increase is then calculated for a projection into the future using known changes in energy supply.
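A minimal sketch of the two-stage regression logic described above, with simulated data: the variable names, covariates, units, and the high-load breakpoint are all hypothetical stand-ins, since the abstract does not list the actual APS specification.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for the utility data described in the abstract.
price = rng.uniform(8, 14, 200)             # cents/kWh (assumed units)
temp = rng.uniform(60, 110, 200)            # one assumed demand covariate
demand = 5000 - 120 * price + 30 * temp + rng.normal(0, 50, 200)  # MWh

# Regression 1: demand on price and covariates. The price coefficient is
# the marginal effect used to translate a price increase into a demand
# (and hence generation-load) reduction.
X1 = np.column_stack([np.ones_like(price), price, temp])
beta1, *_ = np.linalg.lstsq(X1, demand, rcond=None)
print("marginal effect of price on demand:", beta1[1])

# Regression 2: emissions as a function of load, with a separate slope
# above an assumed 6000 MWh breakpoint to mimic the merit order (dirtier
# marginal plants dispatched at high load).
load = demand
high = np.clip(load - 6000, 0, None)
emissions = 0.4 * load + 0.3 * high + rng.normal(0, 80, 200)
X2 = np.column_stack([np.ones_like(load), load, high])
beta2, *_ = np.linalg.lstsq(X2, emissions, rcond=None)
print("base and high-load emission rates:", beta2[1], beta2[1] + beta2[2])
```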
Contributors: Herman, Laura Alexandra (Author) / Silverman, Daniel (Thesis director) / Kuminoff, Nicolai (Committee member) / Department of Economics (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05
Description
A semi-implicit, fourth-order time-filtered leapfrog numerical scheme is investigated for accuracy and stability, and applied to several test cases, including one-dimensional advection and diffusion, the anelastic equations to simulate the Kelvin-Helmholtz instability, and the global shallow water spectral model to simulate the nonlinear evolution of twin tropical cyclones. The leapfrog scheme leads to computational modes in the solutions to highly nonlinear systems, and time-filters are often used to damp these modes. The proposed filter damps the computational modes without appreciably degrading the physical mode. Its performance in these metrics is superior to the second-order time-filtered leapfrog scheme developed by Robert and Asselin.
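For reference, a minimal sketch of the baseline the thesis improves upon: leapfrog with the second-order Robert-Asselin filter, applied to the 1-D linear advection test case with periodic boundaries. The proposed fourth-order filter is not reproduced here, and the filter coefficient below is an assumed value.

```python
import numpy as np

# Leapfrog + Robert-Asselin filter for u_t + c u_x = 0 on a periodic grid.
n, c, dt, nu = 128, 1.0, 0.004, 0.1   # nu is an assumed filter coefficient
x = np.linspace(0.0, 1.0, n, endpoint=False)
dx = x[1] - x[0]

def rhs(u):
    # Centered difference for -c u_x with periodic boundaries.
    return -c * (np.roll(u, -1) - np.roll(u, 1)) / (2.0 * dx)

u_prev = np.exp(-100.0 * (x - 0.5) ** 2)   # initial Gaussian pulse
u_curr = u_prev + dt * rhs(u_prev)         # one Euler step to start leapfrog

for _ in range(1000):
    u_next = u_prev + 2.0 * dt * rhs(u_curr)
    # Robert-Asselin filter: damp the leapfrog computational mode by
    # nudging the middle time level toward the mean of its neighbors.
    u_curr = u_curr + nu * (u_next - 2.0 * u_curr + u_prev)
    u_prev, u_curr = u_curr, u_next

# The filter suppresses the 2*dt computational mode, at the cost of some
# damping of the physical mode (the deficiency the thesis addresses).
print("pulse peak after 1000 steps:", float(u_curr.max()))
```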
Created: 2016-05
Description
Alternative currencies have a long and varied history, of which Bitcoin is the latest chapter. The pseudonymous Satoshi Nakamoto created Bitcoin as an implementation of a cryptocurrency: a decentralized currency based on the principles of cryptography. Since its creation in 2008, Bitcoin has had a fairly tumultuous existence that has limited its adoption. Wide price fluctuations occurred as the appeal of earning free money by running a piece of computer software drove people to purchase expensive hardware, and high-profile scandals cast Bitcoin as an unstable currency suited primarily for purchasing illicit materials. Consumer confidence in the currency was extremely low, and businesses were extremely hesitant to accept a currency that could easily lose half (or more) of its value overnight. In recent years, however, the currency has begun to stabilize as businesses and mainstream investors have begun to accept and support it. Alternative cryptocurrencies, dubbed "altcoins," have also been created to fill market niches that Bitcoin was not addressing. Governmental intervention, a concern of many following the currency, has been surprisingly restrained and has actually contributed to Bitcoin's stability. The future of Bitcoin looks very bright as it carries the dream of the alternative currency forward into the 21st century.
Contributors: Reardon, Brett (Co-author) / Burke, Ryan (Co-author) / Happel, Stephen (Thesis director) / Boyes, William (Committee member) / School of Politics and Global Studies (Contributor) / Department of Information Systems (Contributor) / Department of Finance (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05
Description
Preventive maintenance is a practice that has become popular in recent years, largely due to the increased dependency on electronics and other mechanical systems in modern technologies. The main idea of preventive maintenance is to address maintenance-type issues before they fully appear or disrupt processes and daily operations. One of the most important requirements is the ability to predict and foreshadow failures in the system, so that they can be fixed before they turn into large issues. One area where preventive maintenance is a very big part of daily activity is the automotive industry. Automobile owners are encouraged to take their cars in for maintenance on a routine schedule (based on mileage or time) or when their car signals that there is an issue (low oil levels, for example). Although this level of maintenance is sufficient when people are in charge of cars, the rise of autonomous vehicles, specifically self-driving cars, changes that: instead of a human looking at a car and diagnosing any issues, the car needs to be able to do this itself. The objective of this project was to create such a system. The Electronics Preventive Maintenance System (EPMS) is an internal system designed to meet these criteria. The EPMS comprises a central computer that monitors all major electronic components in an autonomous vehicle through standard off-the-shelf sensors. The central computer compiles the sensor data and is able to sort and analyze the readings. The filtered data is run through several mathematical models, each of which diagnoses issues in a different part of the vehicle. The data for each component is compared to preset operating conditions, which are set to encompass all normal ranges of output. If the sensor data falls outside these margins, the warning and the deviation are recorded and a severity level is calculated. In addition to the individual component models, there is also a vehicle-wide model that predicts how urgently the vehicle needs maintenance. All of these results are analyzed by a simple heuristic algorithm, and a decision about the vehicle's health status is sent to the Fleet Management System. This system allows accurate, effortless monitoring of all parts of an autonomous vehicle, as well as predictive modeling that lets the system determine maintenance needs. With this system, human inspectors are no longer necessary for a fleet of autonomous vehicles; instead, the Fleet Management System oversees inspections, and the system operator sets parameters that decide when to send cars for maintenance. All the models used for the sensor and component analysis are tailored specifically to the vehicle; the models and operating margins are created using empirical data collected during normal testing operations. The system is modular and can be used in a variety of vehicle platforms, including underwater and aerial autonomous vehicles.
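A toy sketch of the margin-check-and-severity logic described above. The component names, operating margins, severity heuristic, and status thresholds are all hypothetical, since the abstract does not publish the actual models.

```python
# Hypothetical per-component margin check; the margins and the severity
# weighting below are illustrative, not the thesis's empirical values.
OPERATING_MARGINS = {
    "battery_temp_c": (10.0, 45.0),
    "motor_current_a": (0.0, 120.0),
    "lidar_voltage_v": (11.5, 12.5),
}

def check_component(name, reading):
    """Return (deviation, severity) for one sensor reading.

    Deviation is how far outside the margin the reading falls; severity
    scales it by the margin width (an assumed heuristic).
    """
    low, high = OPERATING_MARGINS[name]
    deviation = max(low - reading, reading - high, 0.0)
    severity = deviation / (high - low)
    return deviation, severity

def vehicle_health(readings):
    """Aggregate per-component severities into a fleet-facing status."""
    worst = max(check_component(n, r)[1] for n, r in readings.items())
    if worst == 0.0:
        return "healthy"
    return "schedule maintenance" if worst < 0.5 else "pull from service"

# Example: an overheating battery triggers a maintenance recommendation.
print(vehicle_health({"battery_temp_c": 48.0,
                      "motor_current_a": 80.0,
                      "lidar_voltage_v": 12.1}))
```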
Contributors: Mian, Sami T. (Author) / Collofello, James (Thesis director) / Chen, Yinong (Committee member) / School of Mathematical and Statistical Sciences (Contributor) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05
Description
Glioblastoma multiforme (GBM) is a malignant, aggressive, and infiltrative cancer of the central nervous system, with a median survival of 14.6 months under standard care. GBM is diagnosed using medical imaging such as magnetic resonance imaging (MRI) or computed tomography (CT). Treatment is informed by medical images and includes chemotherapy, radiation therapy, and surgical removal if the tumor is surgically accessible. Treatment seldom results in a significant increase in longevity, partly due to the lack of precise information regarding tumor size and location. This lack of information arises from the physical limitations of MR and CT imaging coupled with the diffusive nature of glioblastoma tumors. GBM tumor cells can migrate far beyond the visible boundaries of the tumor and will produce a recurring tumor if not killed or removed. Since medical images are the only readily available information about the tumor, we aim to improve mathematical models of tumor growth to better estimate the missing information. In particular, we investigate the effect of random variation in tumor cell behavior using stochastic parameterizations of an established proliferation-diffusion model of tumor growth. To evaluate the performance of our mathematical model, we use MR images from an animal model consisting of murine GL261 tumors implanted in immunocompetent mice, which provides consistency in tumor initiation and location, immune response, genetic variation, and treatment. Compared to non-stochastic simulations, stochastic simulations showed improved volume accuracy when proliferation variability was high, but diffusion variability was found to affect tumor volume estimates only marginally. Neither proliferation nor diffusion variability significantly affected the spatial distribution accuracy of the simulations. While certain stochastic parameterizations improved volume accuracy, they failed to significantly improve simulation accuracy overall. Both the non-stochastic and stochastic simulations failed to achieve over 75% spatial distribution accuracy, suggesting that the underlying structure of the model fails to capture one or more biological processes that affect tumor growth. Two biological features that are candidates for further investigation are angiogenesis and the anisotropy arising from differences between white and gray matter. Time-dependent proliferation and diffusion terms could be introduced to model angiogenesis, and diffusion tensor imaging (DTI) could be used to differentiate between white and gray matter, which might allow for improved estimates of brain anisotropy.
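A compact sketch of a proliferation-diffusion (Fisher-KPP) model with a stochastic proliferation rate, the kind of parameterization described above. The grid size, parameter means, and variability are assumed values, not the fitted GL261 parameters.

```python
import numpy as np

# Fisher-KPP proliferation-diffusion model on a 2-D grid:
#   dc/dt = D * laplacian(c) + rho * c * (1 - c)
# with rho drawn per grid cell to mimic stochastic proliferation.
rng = np.random.default_rng(1)
n, dx, dt, steps = 100, 0.1, 0.001, 2000
D = 0.01                                        # mm^2/day (assumed)
rho = rng.normal(0.2, 0.05, (n, n)).clip(0.01)  # per-cell rates (assumed)

c = np.zeros((n, n))
c[n // 2, n // 2] = 1.0                         # seeded tumor in the center

def laplacian(f):
    # Five-point stencil; periodic wrap is harmless while the tumor
    # stays far from the boundary.
    return (np.roll(f, 1, 0) + np.roll(f, -1, 0) +
            np.roll(f, 1, 1) + np.roll(f, -1, 1) - 4.0 * f) / dx**2

for _ in range(steps):
    c += dt * (D * laplacian(c) + rho * c * (1.0 - c))

# Summed cell fraction stands in for the simulated tumor volume that
# would be compared against the MR-derived volume.
print("simulated tumor volume:", c.sum() * dx * dx)
```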
Contributors: Anderies, Barrett James (Author) / Kostelich, Eric (Thesis director) / Kuang, Yang (Committee member) / Stepien, Tracy (Committee member) / Harrington Bioengineering Program (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05
Description
Aberrant glycosylation has been shown to be linked to specific cancers, and using this idea, it was proposed that the levels of glycans in the blood could predict stage I adenocarcinoma. To track this glycosylation, glycans were broken down into glycan nodes via methylation analysis. This analysis utilized information from N-, O-, and lipid-linked glycans detected by gas chromatography-mass spectrometry. The resulting glycan node ratios represent the initial quantitative data used in this experiment.
For this experiment, two sets of 50 µl blood plasma samples were provided by NYU Medical School and analyzed by Dr. Borges's lab to obtain normalized biomarker levels from patients with stage I adenocarcinoma and from control patients matched for age, smoking status, and gender. ROC curves were constructed under individual and paired conditions, and the AUC was calculated in Wolfram Mathematica 10.2. Methods such as increasing the size of the training set, using hard versus soft margins, and processing biomarkers together and individually were used to increase the AUC. Using a soft margin proved most useful for this particular data set, raising the AUC from 0.6013 (with the initial hard margin) to 0.6585. Among the individual biomarkers, the 6-Glc/6-Man and 3,6-Gal glycan node ratios performed best, with an AUC of 0.7687, a sensitivity of 0.7684, and a specificity of 0.6051. While this accuracy is not sufficient for a primary diagnostic tool for stage I adenocarcinoma, the methods examined in this paper warrant further evaluation; by comparison, the current clinical standard blood test for prostate cancer has an AUC of only 0.67.
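A minimal sketch of the soft-margin classification and AUC evaluation described above. The thesis's computations were done in Wolfram Mathematica; scikit-learn is substituted here, and the glycan node-ratio data are simulated rather than the real normalized biomarker levels.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Simulated stand-ins for two glycan node ratios (e.g., 6-Glc/6-Man and
# 3,6-Gal); cases are shifted upward so the classes partly overlap.
rng = np.random.default_rng(2)
n = 200
y = rng.integers(0, 2, n)                        # 1 = stage I adenocarcinoma
X = rng.normal(0, 1, (n, 2)) + 0.8 * y[:, None]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Soft margin: a finite C lets some training points violate the margin,
# the relaxation that raised the AUC in the thesis (0.6013 -> 0.6585).
clf = SVC(kernel="linear", C=1.0).fit(X_tr, y_tr)

# Rank test points by signed distance to the hyperplane; the ROC/AUC is
# computed over that ranking.
scores = clf.decision_function(X_te)
print("AUC:", roc_auc_score(y_te, scores))
```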
Contributors: De Jesus, Celine Spicer (Author) / Taylor, Thomas (Thesis director) / Borges, Chad (Committee member) / School of Mathematical and Statistical Sciences (Contributor) / School of Molecular Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05
Description
This thesis looks into the current method a particular company uses to value its inventory carrying costs (ICC). By identifying costs incurred during all stages of production and incorporating industry standards and academic research, while avoiding the shortcomings of the company's current method, this thesis derives a more comprehensive and manageable tool for measuring ICC. Our findings led to concrete recommendations, which will provide real value to company managers by improving the accuracy of project finance calculations, supply chain optimization modeling, and numerous other decisions relying on accurate inventory data inputs.
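The abstract does not publish the company's formula, but a conventional ICC calculation of the kind it alludes to (average inventory value times a carrying rate built from capital, storage, service, and risk costs) can be sketched as follows; every rate below is an assumed illustrative value.

```python
# Conventional inventory-carrying-cost sketch; all figures are assumed
# illustrative values, not the company's actual data.
average_inventory_value = 12_000_000  # dollars

carrying_rate = sum({
    "cost_of_capital": 0.08,   # financing tied up in inventory
    "storage": 0.04,           # warehouse space, handling, utilities
    "service": 0.02,           # insurance and taxes on inventory
    "risk": 0.03,              # shrinkage, damage, obsolescence
}.values())

annual_icc = average_inventory_value * carrying_rate
print(f"annual inventory carrying cost: ${annual_icc:,.0f}")  # $2,040,000
```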
Contributors: Dougherty, Mitch (Co-author) / Marshall, Jeffrey (Co-author) / Zieler, Jason (Co-author) / Gilmore, Eric (Co-author) / Hertzel, Michael (Thesis director) / Simonson, Mark (Committee member) / Yarn, James (Committee member) / Walter Cronkite School of Journalism and Mass Communication (Contributor) / Department of Finance (Contributor) / Barrett, The Honors College (Contributor)
Created: 2014-05