Matching Items (52)

Description
A semi-implicit, fourth-order time-filtered leapfrog numerical scheme is investigated for accuracy and stability, and applied to several test cases, including one-dimensional advection and diffusion, the anelastic equations to simulate the Kelvin-Helmholtz instability, and the global shallow water spectral model to simulate the nonlinear evolution of twin tropical cyclones. The leapfrog scheme leads to computational modes in the solutions to highly nonlinear systems, and time-filters are often used to damp these modes. The proposed filter damps the computational modes without appreciably degrading the physical mode. Its performance in these metrics is superior to the second-order time-filtered leapfrog scheme developed by Robert and Asselin.
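To make the role of the time filter concrete, here is a minimal sketch (not the thesis's fourth-order scheme) of the classical Robert-Asselin second-order filtered leapfrog applied to one-dimensional linear advection on a periodic domain; the filter coefficient nu and all numerical values are illustrative assumptions.

```python
import numpy as np

def leapfrog_ra_advection(u0, c, dx, dt, nsteps, nu=0.1):
    """Leapfrog for u_t + c u_x = 0 on a periodic grid, with a Robert-Asselin
    time filter of strength nu applied to the middle time level."""
    u_old = u0.copy()
    # First step: one forward-in-time, centered-in-space step to start the leapfrog.
    u_now = u0 - c * dt / (2 * dx) * (np.roll(u0, -1) - np.roll(u0, 1))
    for _ in range(nsteps):
        # Leapfrog: u^{n+1}_j = u^{n-1}_j - (c dt/dx) (u^n_{j+1} - u^n_{j-1})
        u_new = u_old - c * dt / dx * (np.roll(u_now, -1) - np.roll(u_now, 1))
        # Robert-Asselin filter damps the computational mode:
        # u^n <- u^n + nu (u^{n+1} - 2 u^n + u^{n-1})
        u_now_filtered = u_now + nu * (u_new - 2 * u_now + u_old)
        u_old, u_now = u_now_filtered, u_new
    return u_now

# Example: advect a Gaussian pulse once around a periodic domain (Courant number 0.4).
x = np.linspace(0, 1, 200, endpoint=False)
u0 = np.exp(-200 * (x - 0.5) ** 2)
u = leapfrog_ra_advection(u0, c=1.0, dx=x[1] - x[0], dt=0.002, nsteps=500)
```

The filter line is the key step: without it, the odd and even time levels of the leapfrog scheme decouple and the computational mode can grow in nonlinear problems.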
Created: 2016-05
Description
Honey bees (Apis mellifera) are responsible for pollinating nearly 80% of all pollinated plants, meaning humans depend on honey bees to pollinate many staple crops. The success or failure of a colony is vital to global food production. There are various complex factors that can contribute to a colony's failure, including pesticides. Neonicotinoids are a popular class of pesticides that has been widely used in recent years. In this study we concern ourselves with pesticides and their impact on honey bee colonies. Previous investigations from which we draw significant inspiration include Khoury et al.'s "A Quantitative Model of Honey Bee Colony Population Dynamics," Henry et al.'s "A Common Pesticide Decreases Foraging Success and Survival in Honey Bees," and Brown's "Mathematical Models of Honey Bee Populations: Rapid Population Decline." In this project we extend a mathematical model to investigate the impact of pesticides on a honey bee colony, with birth rates and death rates depending on pesticide exposure, and we examine how these death rates influence the growth of the colony. Our study finds an equilibrium point that depends on the pesticide level. Trace amounts of pesticide are detrimental, as they affect not only death rates but birth rates as well.
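As a rough illustration of how pesticide dependence can enter birth and death rates, here is a toy hive-bee/forager ODE sketch in the spirit of Khoury et al.; the functional forms, parameter names, and values are assumptions for illustration only, not the model developed in this project.

```python
import numpy as np
from scipy.integrate import solve_ivp

def colony_rhs(t, y, L, w, alpha, sigma, m, p):
    """Toy hive-bee (H) / forager (F) model. A hypothetical pesticide level p
    (0 = none) lowers the effective eclosion rate and raises forager mortality."""
    H, F = y
    N = H + F
    eclosion = L * (1 - p) * N / (w + N)                 # brood successfully emerging
    recruitment = H * max(alpha - sigma * F / N, 0.0)    # hive bees becoming foragers
    forager_death = m * (1 + p) * F                      # pesticide raises death rate
    return [eclosion - recruitment, recruitment - forager_death]

# Example run with made-up parameter values and a trace pesticide level p = 0.1.
sol = solve_ivp(colony_rhs, (0, 200), [10000, 5000],
                args=(2000, 27000, 0.25, 0.75, 0.24, 0.1))
H_end, F_end = sol.y[:, -1]
```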
Contributors: Salinas, Armando (Author) / Vaz, Paul (Thesis director) / Jones, Donald (Committee member) / School of Mathematical and Statistical Sciences (Contributor) / School of International Letters and Cultures (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05
Description
A Guide to Financial Mathematics is a comprehensive and easy-to-use study guide for students preparing for one of the first actuarial exams, Exam FM. While there are many resources available to students studying for these exams, this guide is free to students and presents the material in an approach similar to that used in class at ASU. The guide is available to students and professors in the new Actuarial Science degree program offered by ASU. There are twelve chapters, including financial calculator tips, detailed notes, examples, and practice exercises. A list of referenced material is included at the end of the guide.
Contributors: Dougher, Caroline Marie (Author) / Milovanovic, Jelena (Thesis director) / Boggess, May (Committee member) / Barrett, The Honors College (Contributor) / Department of Information Systems (Contributor) / School of Mathematical and Statistical Sciences (Contributor)
Created: 2015-05
Description
Covering subsequences with sets of permutations arises in many applications, including event-sequence testing. Given a set of subsequences to cover, one is often interested in knowing the fewest permutations required to cover every subsequence, and in finding an explicit construction of such a set of permutations whose size is close to or equal to the minimum possible. The construction of such permutation coverings has proven to be computationally difficult. While many examples for permutations of small length have been found, and strong asymptotic behavior is known, there are few explicit constructions for permutations of intermediate lengths; most of these are generated from scratch using greedy algorithms. We explore a different approach here. Starting with a set of permutations with the desired coverage properties, we compute local changes to individual permutations that retain the total coverage of the set. By choosing these local changes so as to make one permutation less "essential" in maintaining the coverage of the set, our method attempts to make a permutation completely non-essential, so that it can be removed without sacrificing total coverage. We develop a post-optimization method to do this and present results on sequence covering arrays and other types of permutation covering problems, demonstrating that it is surprisingly effective.
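A hedged sketch of the coverage check that such post-optimization relies on: the helper names below are illustrative (not from the thesis), and the code simply tests whether every ordered t-tuple of distinct symbols appears as a subsequence of some permutation in the set, and which permutations could be dropped without losing that property.

```python
from itertools import permutations as perms

def covers(perm, tup):
    """True if the ordered tuple `tup` occurs as a subsequence of `perm`."""
    it = iter(perm)
    return all(sym in it for sym in tup)  # each `in` advances the iterator

def is_sequence_covering(array, v, t):
    """Check whether `array` (a list of permutations of range(v)) covers
    every ordered t-tuple of distinct symbols."""
    return all(any(covers(p, tup) for p in array)
               for tup in perms(range(v), t))

def redundant_rows(array, v, t):
    """Indices of permutations that can be removed without losing coverage."""
    return [i for i in range(len(array))
            if is_sequence_covering(array[:i] + array[i + 1:], v, t)]
```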
Contributors: Murray, Patrick Charles (Author) / Colbourn, Charles (Thesis director) / Czygrinow, Andrzej (Committee member) / Barrett, The Honors College (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Department of Physics (Contributor)
Created: 2014-12
Description
Deconvolution of noisy data is an ill-posed problem, and requires some form of regularization to stabilize its solution. Tikhonov regularization is the most common method used, but it depends on the choice of a regularization parameter λ which must generally be estimated using one of several common methods. These methods can be computationally intensive, so I consider their behavior when only a portion of the sampled data is used. I show that the results of these methods converge as the sampling resolution increases, and use this to suggest a method of downsampling to estimate λ. I then present numerical results showing that this method can be feasible, and propose future avenues of inquiry.
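For context, a minimal sketch of Tikhonov-regularized deconvolution (a standard formulation, not the thesis's code): the regularized solution minimizes ||Ax - b||^2 + lambda^2 ||x||^2, which leads to the normal equations (A^T A + lambda^2 I) x = A^T b. The test signal, kernel, and value of lambda below are illustrative assumptions.

```python
import numpy as np

def tikhonov_deconvolve(A, b, lam):
    """Solve min_x ||A x - b||^2 + lam^2 ||x||^2 via the normal equations
    (A^T A + lam^2 I) x = A^T b."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam ** 2 * np.eye(n), A.T @ b)

# Example: blur a boxcar signal with a Gaussian kernel, add noise, and recover it.
rng = np.random.default_rng(0)
n = 200
x_true = np.zeros(n); x_true[60:120] = 1.0
t = np.arange(n)
A = np.exp(-0.5 * ((t[:, None] - t[None, :]) / 4.0) ** 2)  # convolution matrix
A /= A.sum(axis=1, keepdims=True)
b = A @ x_true + 0.01 * rng.standard_normal(n)
x_rec = tikhonov_deconvolve(A, b, lam=0.05)
```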
Contributors: Hansen, Jakob Kristian (Author) / Renaut, Rosemary (Thesis director) / Cochran, Douglas (Committee member) / Barrett, The Honors College (Contributor) / School of Music (Contributor) / Economics Program in CLAS (Contributor) / School of Mathematical and Statistical Sciences (Contributor)
Created: 2015-05
Description
The purpose of this thesis is to examine the events surrounding the creation of the oboe and its rapid spread throughout Europe during the mid to late seventeenth century. The first section describes similar instruments that existed for thousands of years before the invention of the oboe. The following sections examine reasons and methods for the oboe's invention, as well as possible causes of its migration from its starting place in France to other European countries and to many other places around the world. I conclude that the oboe was invented to suit the needs of composers in the court of Louis XIV, and that it was brought to other countries by French performers who left France for many reasons, including to escape the authority of the composer Jean-Baptiste Lully and, in some cases, to promote French culture abroad.
Contributors: Cook, Mary Katherine (Author) / Schuring, Martin (Thesis director) / Micklich, Albie (Committee member) / Barrett, The Honors College (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / School of Music (Contributor)
Created: 2015-05
Description
In this paper, I analyze representations of nature in popular film, using the feminist/deconstructionist concept of a dualism to structure my critique. Using Val Plumwood's analysis of the logical structure of dualism and the 5 'features of a dualism' that she identifies, I critique 5 popular movies – Star Wars, Lord of the Rings, Brave, Grizzly Man, and Planet Earth – by locating within each of them one of the 5 features and explaining how the movie functions to reinforce the Nature/Culture dualism. By showing how the Nature/Culture dualism shapes and is shaped by popular cinema, I show how "Nature" is a social construct, created as part of this very dualism, and reified through popular culture. I conclude with the introduction of a number of 'subversive' pieces of visual art that undermine and actively deconstruct the Nature/Culture dualism and show to the viewer a more honest presentation of the non-human world.
Contributors: Barton, Christopher Joseph (Author) / Broglio, Ron (Thesis director) / Minteer, Ben (Committee member) / Barrett, The Honors College (Contributor) / School of Sustainability (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / School of Geographical Sciences and Urban Planning (Contributor)
Created: 2015-05
Description
This paper focuses on the Szemerédi regularity lemma, a result in the field of extremal graph theory. The lemma says that every graph can be partitioned into a bounded number of roughly equal parts such that most edges of the graph run between parts, and these edges are distributed in a fairly uniform way. Definitions and notation will be established, leading to explorations of three proofs of the regularity lemma: a version of the original proof, a "Pythagoras" proof using elementary geometry, and a proof using concepts from spectral graph theory. This paper is intended to supplement the proofs with background information about the concepts utilized. Furthermore, it is hoped that this paper will serve as another resource for students and others beginning study of the regularity lemma.
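For readers new to the statement, a conventional textbook formulation of the lemma (paraphrased, not quoted from the thesis) is the following.

```latex
% Edge density and eps-regularity of a pair of disjoint vertex sets X, Y:
d(X,Y) := \frac{e(X,Y)}{|X|\,|Y|}, \qquad
(X,Y)\ \text{is } \varepsilon\text{-regular} \iff
\forall A \subseteq X,\ B \subseteq Y:\
|A| \ge \varepsilon|X|,\ |B| \ge \varepsilon|Y|
\ \Rightarrow\ |d(A,B) - d(X,Y)| \le \varepsilon .

% Regularity lemma: for every eps > 0 and every m there exists M = M(eps, m)
% such that every graph G admits a partition
% V(G) = V_0 \cup V_1 \cup \cdots \cup V_k with m \le k \le M,
% |V_0| \le \varepsilon |V(G)|, |V_1| = \cdots = |V_k|, and all but at most
% \varepsilon k^2 of the pairs (V_i, V_j), i < j, are \varepsilon-regular.
```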
Contributors: Byrne, Michael John (Author) / Czygrinow, Andrzej (Thesis director) / Kierstead, Hal (Committee member) / Barrett, The Honors College (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Department of Chemistry and Biochemistry (Contributor)
Created: 2015-05
Description
Lights Out is a puzzle game where the goal is to turn off all the lights on an n×n board, starting from a random configuration. To find the solution of a configuration, the game is modeled with linear algebra over the field Z mod 2. The game can then be described by the system Ap = s, which is the center of the investigation when determining solvability for any n×n board, since A is not always invertible, leading to some interesting cases. The goal of this thesis was to construct a model that allows the player to solve for the pushes that attain the zero state for an n×n system. Constructing the model gave a procedure for solving the puzzle. The procedure presented here first uses a simple clearing technique (valid for any board size) to turn off all the lights except those in the last row, which we call the standard clear. The heart of the technique is to give a way to use the information about which lights remain lit in the last row to determine which switches in the first row need to be pushed before the standard clear. This part of the solution algorithm we call the first-row adjustment, and it depends heavily on the specific board size n of the problem. Finally, after these first-row pushes are made, the standard clear will turn off all the lights, including (seemingly magically) the last row. Thus the solution to the Lights Out puzzle of a given size is reduced to finding a first-row adjustment for that size. (Please refer to the actual thesis for the full abstract.)
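A small sketch, not taken from the thesis, of the linear-algebra formulation described above: build the n²×n² push matrix A over Z mod 2 (pressing a switch toggles its own light and its orthogonal neighbors) and solve Ap = s by Gaussian elimination mod 2; when A is not invertible, the same elimination reveals whether a given state s is solvable at all.

```python
import numpy as np

def lights_out_matrix(n):
    """n^2 x n^2 matrix over Z/2: column (i, j) toggles cell (i, j) and its
    orthogonal neighbours."""
    A = np.zeros((n * n, n * n), dtype=np.uint8)
    for i in range(n):
        for j in range(n):
            k = i * n + j
            A[k, k] = 1
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                r, c = i + di, j + dj
                if 0 <= r < n and 0 <= c < n:
                    A[r * n + c, k] = 1
    return A

def solve_mod2(A, s):
    """Gauss-Jordan elimination over Z/2 for A p = s; returns a solution p,
    or None if the state s is unsolvable."""
    A, s = A.copy() % 2, s.copy() % 2
    m, n = A.shape
    pivots, row = [], 0
    for col in range(n):
        piv = next((r for r in range(row, m) if A[r, col]), None)
        if piv is None:
            continue
        A[[row, piv]] = A[[piv, row]]          # swap pivot row into place
        s[[row, piv]] = s[[piv, row]]
        for r in range(m):
            if r != row and A[r, col]:
                A[r] ^= A[row]                  # XOR = addition mod 2
                s[r] ^= s[row]
        pivots.append(col)
        row += 1
    if any(s[row:]):                            # inconsistent: s not reachable
        return None
    p = np.zeros(n, dtype=np.uint8)
    p[pivots] = s[:len(pivots)]                 # free variables set to 0
    return p

# Example: solve the all-on 5x5 board.
n = 5
p = solve_mod2(lights_out_matrix(n), np.ones(n * n, dtype=np.uint8))
```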
Created: 2015-05
Description
From 2007 to 2017, the state of California experienced two major droughts that required significant governmental action to decrease urban water demand. The purpose of this project is to isolate and explore the effects of these policy changes on water use during and after these droughts, and to see how these policies interact with hydroclimatic variability. As explanatory variables in multiple linear regression (MLR) models, water use policies were found to be significant at both the zip code and city levels. Policies that specifically target behavioral changes were significant mathematical drivers of water use in city-level models. Policy data was aggregated into a timeline and coded based on categories including user type, whether the policy was voluntary or mandatory, the targeted water use type, and whether the change in question concerns active or passive conservation. The analyzed policies include but are not limited to state drought declarations, regulatory municipal ordinances, and incentive programs for household appliances. Spatial averages of available hydroclimatic data have been computed and validated using inverse distance weighting methods. The data was aggregated at the zip code level to be comparable to the available water use data for use in MLR models. Factors already known to affect water use, such as temperature, precipitation, income, and water stress, were brought into the MLR models as explanatory variables. After controlling for these factors, the timeline policies were brought into the model as coded variables to test their effect on water demand during the years 2000-2017. Clearly identifying which policy traits are effective will inform future policymaking in cities aiming to conserve water. The findings suggest that drought-related policies impact per capita urban water use. The results of the city level MLR models indicate that implementation of mandatory policies that target water use behaviors effectively reduce water use. Temperature, income, unemployment, and the WaSSI were also observed to be mathematical drivers of water use. Interaction effects between policies and the WaSSI were statistically significant at both model scales.
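As an illustrative sketch of the modeling setup described above (the variable names and synthetic data are assumptions, not the study's actual panel), an ordinary least squares fit of per capita water use on hydroclimatic and socioeconomic controls, coded policy dummies, and a policy-WaSSI interaction might look like this:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for the zip-code/city panel; names are illustrative.
rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "temperature": rng.normal(20, 5, n),
    "precipitation": rng.gamma(2, 10, n),
    "income": rng.normal(60, 15, n),
    "unemployment": rng.normal(6, 2, n),
    "wassi": rng.uniform(0, 1, n),                  # water supply stress index
    "mandatory_behavioral": rng.integers(0, 2, n),  # coded policy dummies
    "voluntary_behavioral": rng.integers(0, 2, n),
})
df["gpcd"] = (150 + 2 * df.temperature - 0.3 * df.precipitation
              - 8 * df.mandatory_behavioral
              - 15 * df.mandatory_behavioral * df.wassi
              + rng.normal(0, 10, n))               # per capita use (synthetic)

# OLS of per capita use on controls, policy dummies, and a policy x WaSSI interaction.
model = smf.ols("gpcd ~ temperature + precipitation + income + unemployment"
                " + wassi + mandatory_behavioral + voluntary_behavioral"
                " + mandatory_behavioral:wassi", data=df)
print(model.fit().summary())
```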
Contributors: Hjelmstad, Annika Margaret (Author) / Garcia, Margaret (Thesis director) / Larson, Kelli (Committee member) / Civil, Environmental and Sustainable Eng Program (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created: 2018-12