Description
By the von Neumann min-max theorem, a two-person zero-sum game with finitely many pure strategies has a unique value for each player (summing to zero), and each player has a non-empty set of optimal mixed strategies. If the payoffs are independent, identically distributed (iid) uniform (0,1) random variables, then with probability one, both players have unique optimal mixed strategies utilizing the same number of pure strategies with positive probability (Jonasson 2004). The pure strategies with positive probability in the unique optimal mixed strategies are called saddle squares. In 1957, Goldman evaluated the probability of a saddle point (a 1 by 1 saddle square), a result rediscovered by many authors including Thorp (1979). Thorp gave two proofs of the probability of a saddle point, one using combinatorics and one using a beta integral. In 1965, Falk and Thrall investigated the integrals required for the probabilities of a 2 by 2 saddle square for 2 × n and m × 2 games with iid uniform (0,1) payoffs, but they were not able to evaluate the integrals. This dissertation generalizes Thorp's beta integral proof of Goldman's probability of a saddle point, establishing an integral formula for the probability that an m × n game with iid uniform (0,1) payoffs has a k by k saddle square (k ≤ min(m, n)). Additionally, the probabilities of a 2 by 2 and a 3 by 3 saddle square for a 3 × 3 game with iid uniform (0,1) payoffs are found. For these, the 14 integrals observed by Falk and Thrall are dissected into 38 disjoint domains, and the integrals are evaluated using the basic properties of the dilogarithm function. The final results for the probabilities of a 2 by 2 and a 3 by 3 saddle square in a 3 × 3 game are linear combinations of 1, π², and ln(2) with rational coefficients.
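Goldman's saddle-point probability has the closed form m! n! / (m + n − 1)! for an m × n game with iid continuous payoffs, which is easy to check by simulation. The sketch below (a minimal illustration, not code from the dissertation) estimates the probability by Monte Carlo and compares it with the formula:

```python
import math
import random

def has_saddle_point(A):
    """A pure saddle point is an entry that is both the minimum of its
    row and the maximum of its column."""
    row_mins = [min(row) for row in A]
    col_maxs = [max(col) for col in zip(*A)]
    return any(A[i][j] == row_mins[i] and A[i][j] == col_maxs[j]
               for i in range(len(A)) for j in range(len(A[0])))

def goldman_probability(m, n):
    # Goldman (1957): P(saddle point) = m! n! / (m + n - 1)!
    return math.factorial(m) * math.factorial(n) / math.factorial(m + n - 1)

def monte_carlo_probability(m, n, trials=50_000, seed=0):
    """Frequency of saddle points in random m x n games with iid
    uniform (0,1) payoffs."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        A = [[rng.random() for _ in range(n)] for _ in range(m)]
        hits += has_saddle_point(A)
    return hits / trials
```

For a 3 × 3 game the formula gives 3!·3!/5! = 0.3, and the simulated frequency agrees to within sampling error.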
ContributorsManley, Michael (Author) / Kadell, Kevin W. J. (Thesis advisor) / Kao, Ming-Hung (Committee member) / Lanchier, Nicolas (Committee member) / Lohr, Sharon (Committee member) / Reiser, Mark R. (Committee member) / Arizona State University (Publisher)
Created2011
Description
Parallel Monte Carlo applications require the pseudorandom numbers used on each processor to be independent in a probabilistic sense. The TestU01 software package is the standard testing suite for detecting stream dependence and other properties that make certain pseudorandom generators ineffective in parallel (as well as serial) settings. TestU01 employs two basic schemes for testing parallel generated streams. The first applies serial tests to the individual streams and then tests the resulting p-values for uniformity. The second turns all the parallel generated streams into one long vector and then applies serial tests to the resulting concatenated stream. Various forms of stream dependence can be missed by each approach because neither one fully addresses the multivariate nature of the accumulated data when generators are run in parallel. This dissertation identifies these potential faults in the parallel testing methodologies of TestU01 and investigates two different methods to better detect inter-stream dependencies: correlation-motivated multivariate tests and vector-time-series-based tests. These methods have been implemented in an extension to TestU01 built in C++, and the unique aspects of this extension are discussed. A variety of generation scenarios are then examined using the TestU01 suite in concert with the extension. This enhanced software package is found to better detect certain forms of inter-stream dependencies than the original TestU01 suites of tests.
Contributors: Ismay, Chester (Author) / Eubank, Randall (Thesis advisor) / Young, Dennis (Committee member) / Kao, Ming-Hung (Committee member) / Lanchier, Nicolas (Committee member) / Reiser, Mark R. (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
The NFL is one of the largest and most influential industries in the world. In America there are few companies that have a stronger hold on the American culture or that create such a phenomenon from year to year. This project aimed to develop a strategy that helps an NFL team be as successful as possible by defining which positions are most important to a team's success. Data from fifteen years of NFL games was collected and information on every player in the league was analyzed. First there needed to be a benchmark which describes a team as being average, and then every player in the NFL was compared to that average. Using linear regression with ordinary least squares, this project defines a model that shows each position's importance. Finally, once such a model had been established, the focus turned to the NFL draft, with the goal of finding a strategy for where each position should be drafted so that it is most likely to give the best payoff based on the results of the regression in part one.
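The project's dataset and position metrics are not reproduced here; the sketch below is only a generic illustration of the ordinary-least-squares step on synthetic data, with made-up "quarterback" and "defense" ratings standing in for position scores (all names and numbers are invented):

```python
import random

def ols(X, y):
    """Ordinary least squares via the normal equations (X'X)b = X'y.
    Each row of X starts with a 1 for the intercept."""
    k = len(X[0])
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    Xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    A = [row[:] + [b] for row, b in zip(XtX, Xty)]   # augmented matrix
    for c in range(k):                               # Gaussian elimination
        p = max(range(c, k), key=lambda r: abs(A[r][c]))
        A[c], A[p] = A[p], A[c]
        for r in range(c + 1, k):
            f = A[r][c] / A[c][c]
            for j in range(c, k + 1):
                A[r][j] -= f * A[c][j]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):                   # back substitution
        beta[r] = (A[r][k] - sum(A[r][j] * beta[j]
                                 for j in range(r + 1, k))) / A[r][r]
    return beta

# Synthetic example: wins driven mostly by the quarterback rating.
rng = random.Random(0)
rows, wins = [], []
for _ in range(200):
    qb, dl = rng.random(), rng.random()
    rows.append([1.0, qb, dl])
    wins.append(4.0 + 3.0 * qb + 1.0 * dl + rng.gauss(0, 0.1))
beta = ols(rows, wins)   # recovers roughly [4, 3, 1]
```

The relative sizes of the fitted coefficients are what a position-importance ranking of this kind would read off.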
Contributors: Balzer, Kevin Ryan (Author) / Goegan, Brian (Thesis director) / Dassanayake, Maduranga (Committee member) / Barrett, The Honors College (Contributor) / Economics Program in CLAS (Contributor) / School of Mathematical and Statistical Sciences (Contributor)
Created: 2015-05
Description
We model communication among social insects as an interacting particle system in which individuals perform one of two tasks and neighboring sites anti-mimic one another. Parameters of our model are a probability of defection ε ∈ (0, 1) and relative cost ci > 0 to the individual performing task i. We examine this process on complete graphs, bipartite graphs, and the integers, answering questions about the relationship between communication, defection rates and the division of labor. Assuming the division of labor is ideal when exactly half of the colony is performing each task, we find that on some bipartite graphs and the integers it can eventually be made arbitrarily close to optimal if defection rates are sufficiently small. On complete graphs the fraction of individuals performing each task is also closest to one half when there is no defection, but is bounded by a constant dependent on the relative costs of each task.
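The abstract does not spell out the update rules, so the sketch below simulates one plausible reading of anti-mimicry on the complete graph, purely as an illustration: at each step a random individual looks at a random other individual and, unless it defects (probability `eps`, a stand-in for the abstract's defection parameter), adopts the task its neighbor is not performing. The time-averaged fraction performing task 1 then hovers near one half, consistent with the complete-graph result described above.

```python
import random

def simulate(n=100, eps=0.05, steps=20_000, seed=7):
    """Toy anti-mimicry dynamic on the complete graph: with probability
    1 - eps adopt the opposite of a random neighbor's task, otherwise
    defect (keep your own task). Returns the time-averaged fraction
    of individuals performing task 1."""
    rng = random.Random(seed)
    task = [rng.randint(1, 2) for _ in range(n)]
    total = 0.0
    for _ in range(steps):
        i = rng.randrange(n)
        j = rng.randrange(n - 1)
        if j >= i:
            j += 1                      # a neighbor distinct from i
        if rng.random() > eps:          # no defection: anti-mimic
            task[i] = 3 - task[j]       # 1 <-> 2
        total += task.count(1) / n
    return total / steps

avg_fraction = simulate()   # close to 1/2
```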
Contributors: Arcuri, Alesandro Antonio (Author) / Lanchier, Nicolas (Thesis director) / Kang, Yun (Committee member) / Fewell, Jennifer (Committee member) / Barrett, The Honors College (Contributor) / School of International Letters and Cultures (Contributor) / Economics Program in CLAS (Contributor) / School of Mathematical and Statistical Sciences (Contributor)
Created: 2015-05
Description
The Axelrod Model is an agent-based adaptive model. The Axelrod Model shows the effects of a mechanism of convergent social influence. Do local convergences generate global polarization? Will it be possible for all differences between individuals in a population comprised of neighbors to disappear? There are many mechanisms to approach this issue; the Axelrod Model is one of them.
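The abstract does not describe the dynamics, but the standard Axelrod model is well known: each agent carries F cultural features, each taking one of q traits; a random agent interacts with a random neighbor with probability equal to their similarity, copying one feature on which they differ. The toy one-dimensional sketch below (an illustration of the standard model, not code from the thesis) runs the process to absorption, where every neighboring pair agrees on all features or on none:

```python
import random

def axelrod_1d(n=12, F=3, q=2, seed=3):
    """Standard Axelrod model on a line of n agents; runs to absorption."""
    rng = random.Random(seed)
    culture = [[rng.randrange(q) for _ in range(F)] for _ in range(n)]
    edges = [(i, i + 1) for i in range(n - 1)]
    while True:
        # "Active" edges: neighbors agreeing on some but not all features.
        active = [(i, j) for i, j in edges
                  if 0 < sum(a == b for a, b in zip(culture[i], culture[j])) < F]
        if not active:
            return culture              # absorbed: consensus or polarization
        i, j = rng.choice(active)
        sim = sum(a == b for a, b in zip(culture[i], culture[j])) / F
        if rng.random() < sim:          # interact with prob = similarity
            diff = [f for f in range(F) if culture[i][f] != culture[j][f]]
            f = rng.choice(diff)
            culture[i][f] = culture[j][f]

final = axelrod_1d()
```

At absorption, counting the distinct cultures that remain is one way to quantify "local convergence versus global polarization."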
Contributors: Yu, Yili (Author) / Lanchier, Nicolas (Thesis director) / Kang, Yun (Committee member) / Brooks, Dan (Committee member) / Barrett, The Honors College (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Department of Finance (Contributor)
Created: 2013-05
Description
Serge Galam's voting systems and public debate models are used to model voting behaviors of two competing opinions in democratic societies. Galam assumes that individuals in the population are independently in favor of one opinion with a fixed probability p, making the initial number of that type of opinion a binomial random variable. This analysis revisits Galam's models from the point of view of the hypergeometric random variable by assuming the initial number of individuals in favor of an opinion is a fixed deterministic number. This assumption is more realistic, especially when analyzing small populations. Evolution of the models is based on majority rules, with a bias introduced when there is a tie. For the hierarchical voting system model, in order to derive the probability that opinion +1 would win, the analysis was done by reversing time and assuming that an individual in favor of opinion +1 wins. Then, working backwards, we counted the number of configurations at the next lowest level that could induce each possible configuration at the level above, and continued this process until reaching the bottom level, i.e., the initial population. Using this method, we were able to derive an explicit formula for the probability that an individual in favor of opinion +1 wins given any initial count of that opinion, for any group size greater than or equal to three. For the public debate model, we counted the total number of individuals in favor of opinion +1 at each time step and used this variable to define a random walk. Then, we used first-step analysis to derive an explicit formula for the probability that an individual in favor of opinion +1 wins given any initial count of that opinion for group sizes of three. The spatial public debate model evolves based on the proportional rule. For the spatial model, the most natural graphical representation of the process results in a model that is not mathematically tractable. Thus, we defined a different graphical representation that is mathematically equivalent to the first, but under which it is possible to define a dual process that is mathematically tractable. Using this graphical representation we prove clustering in 1D and 2D and coexistence in higher dimensions, following the same approach as for the voter model interacting particle system.
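Under Galam's original binomial assumption (which the analysis above replaces with a hypergeometric count), the probability that opinion +1 survives one level of a size-3 hierarchy is just the majority-rule recursion, and iterating it shows how any initial edge is amplified level by level. A minimal sketch of that classical recursion, not of the hypergeometric computation developed in the thesis:

```python
def majority_step(p):
    """Probability a size-3 group elects +1 when each member independently
    favors +1 with probability p (odd groups, so no ties)."""
    return p ** 3 + 3 * p ** 2 * (1 - p)

def win_probability(p0, levels):
    """Iterate the majority-rule recursion up a hierarchy of given depth."""
    p = p0
    for _ in range(levels):
        p = majority_step(p)
    return p
```

Starting from p0 = 0.6, ten levels already give a win probability above 0.99; one half is the unstable fixed point separating the two outcomes.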
Contributors: Taylor, Nicole Robyn (Co-author) / Lanchier, Nicolas (Co-author, Thesis director) / Smith, Hal (Committee member) / Hurlbert, Glenn (Committee member) / Barrett, The Honors College (Contributor) / School of Mathematical and Statistical Sciences (Contributor)
Created: 2013-05
Description
In baseball, a starting pitcher has historically been a more durable pitcher capable of lasting long into games without tiring. For the entire history of Major League Baseball, these pitchers have been expected to last six innings or more into a game before being replaced. However, with the advances in statistics and sabermetrics and their gradual acceptance by professional coaches, the role of the starting pitcher is beginning to change. Teams are experimenting with replacing starters sooner, challenging the traditional role of the starting pitcher. The goal of this study is to determine whether there is an exact point at which a team would benefit from replacing a starting or relief pitcher with another pitcher, using statistical analyses. We use logistic stepwise regression to predict the likelihood of a team scoring a run if a substitution is made or not made given the current game situation.
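The study's game data and stepwise variable selection are not reproduced here; the sketch below illustrates only the core logistic-regression step on synthetic data, with a made-up pitch-count feature standing in for the game situation (the feature, coefficients, and data are all invented for illustration):

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_logistic(X, y, lr=0.1, epochs=300):
    """Plain stochastic gradient ascent on the logistic log-likelihood."""
    w = [0.0] * (1 + len(X[0]))        # intercept + one weight per feature
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi)))
            g = yi - p                  # gradient of log-likelihood
            w[0] += lr * g
            for j, xj in enumerate(xi):
                w[j + 1] += lr * g * xj
    return w

# Synthetic data: the chance the opponent scores rises with pitch count.
rng = random.Random(42)
X, y = [], []
for _ in range(400):
    pc = rng.uniform(60, 120)
    x = (pc - 90) / 30                  # scaled pitch count
    X.append([x])
    y.append(1 if rng.random() < sigmoid(1.5 * x) else 0)
w = train_logistic(X, y)                # w[1] recovers a positive slope
```

Comparing the predicted run probability with and without a substitution, at the same game state, is the kind of contrast the study describes.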
Contributors: Buckley, Nicholas J (Author) / Samara, Marko (Thesis director) / Lanchier, Nicolas (Committee member) / School of Mathematical and Statistical Sciences (Contributor) / Department of Information Systems (Contributor) / Barrett, The Honors College (Contributor)
Created: 2019-05
Description
This paper analyzes responses to a survey using a modified fourfold pattern of preference to determine whether implicit information, once made explicit, is practically significant in nudging irrational decision makers toward more rational decisions. For each of the four questions from the fourfold pattern, respondents chose between two scenarios, with an option for indifference, and with expected value as the implicit information. Respondents were then asked their familiarity with expected value and given the same four questions again, but with the expected value of each scenario explicitly given. Respondents were asked to give feedback on whether their answers had changed and whether the addition of the explicit information was the reason for that change. Results found the addition of the explicit information in the form of expected value to be practically significant, with roughly 90% of respondents who changed their answers giving it as the reason. In the implicit section of the survey, three of the four questions drew a majority of responses for the lower-expected-value scenario. In the explicit section, all four questions drew a majority of responses for the higher-expected-value scenario. In moving from the implicit to the explicit section, for each question, the lower-expected-value scenario lost response share, while the higher-expected-value scenario and the indifference option both gained response share.
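The survey's actual questions are not reproduced here; the snippet below only illustrates the kind of expected-value computation that was made explicit, using hypothetical fourfold-pattern-style options (the probabilities and payoffs are invented):

```python
def expected_value(outcomes):
    """outcomes: list of (probability, payoff) pairs whose probabilities sum to 1."""
    return sum(p * x for p, x in outcomes)

# Hypothetical choice in the spirit of the fourfold pattern (not survey items):
gamble = [(0.05, 1000.0), (0.95, 0.0)]   # small chance of a large gain
sure_thing = [(1.0, 50.0)]               # certain smaller gain
# Both options have expected value 50, so making that explicit tells a
# respondent the two are equivalent in expectation.
```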
Contributors: Johnson, Matthew (Author) / Goegan, Brian (Thesis director) / Foster, William (Committee member) / School of Sustainability (Contributor) / Economics Program in CLAS (Contributor) / Dean, W.P. Carey School of Business (Contributor) / Barrett, The Honors College (Contributor)
Created: 2019-05
Description
This study examines the economic impact of the opioid crisis in the United States. Primarily testing the years 2007-2018, I gathered data from the Census Bureau, Centers for Disease Control, and Kaiser Family Foundation in order to examine the relative impact of a one-dollar increase in GDP per capita on opioid-related death rates. Implementing a fixed-effects panel data design, I regressed deaths on GDP per capita while holding the following constant: population, U.S. retail opioid prescriptions per 100 people, annual average unemployment rate, percent of the population that is Caucasian, and percent of the population that is male. I found that GDP per capita and opioid-related deaths are negatively correlated: as opioid deaths rise, GDP per capita falls. This finding is important because opioid overdose is harmful to society, and U.S. life expectancy is consistently dropping as opioid death rates rise. Increasing awareness of this topic can help prevent misuse and reduce opioid-related deaths overall.
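The study's data are not reproduced here; the sketch below only illustrates the fixed-effects (within) transformation on synthetic two-group panel data: demeaning each group's observations removes the group intercepts, after which a pooled slope can be read off. Group names, slopes, and noise levels are all invented.

```python
import random

def within_slope(panel):
    """panel: {group: [(x, y), ...]}. Demean x and y within each group,
    then compute the pooled OLS slope (the fixed-effects estimator)."""
    sxy = sxx = 0.0
    for obs in panel.values():
        mx = sum(x for x, _ in obs) / len(obs)
        my = sum(y for _, y in obs) / len(obs)
        for x, y in obs:
            sxy += (x - mx) * (y - my)
            sxx += (x - mx) ** 2
    return sxy / sxx

# Two synthetic "states" with very different baselines but the same slope -2.
rng = random.Random(5)
panel = {"A": [], "B": []}
for _ in range(100):
    xa = rng.uniform(0, 1)
    panel["A"].append((xa, 10.0 - 2.0 * xa + rng.gauss(0, 0.1)))
    xb = rng.uniform(2, 3)
    panel["B"].append((xb, 0.0 - 2.0 * xb + rng.gauss(0, 0.1)))
slope = within_slope(panel)   # close to the common slope -2
```

A naive pooled regression on these data would be badly biased by the baseline difference between the groups, which is exactly what the fixed-effects design guards against.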
Contributors: Ravi, Ritika Lisa (Author) / Goegan, Brian (Thesis director) / Hill, John (Committee member) / Department of Economics (Contributor) / Department of Information Systems (Contributor) / Barrett, The Honors College (Contributor)
Created: 2019-05
Description
More than 40% of all U.S. opioid overdose deaths in 2016 involved a prescription opioid, with more than 46 people dying every day from overdoses involving prescription opioids (CDC, 2017). Over the years, lawmakers have implemented policies and laws to address the opioid epidemic, and many of these vary from state to state. This study lays out the basic guidelines of common pieces of legislation. It also examines relationships between six state-specific prescribing or preventative laws and associated changes in opioid-related deaths using a longitudinal cross-state study design (2007-2015). Specifically, it uses a linear regression to examine changes in state-specific rates of opioid-related deaths after implementation of specific policies, and whether states implementing these policies saw smaller increases than states without them. Initial key findings show that three policies have a statistically significant association with opioid-related overdose deaths: Good Samaritan Laws, Standing Order Laws, and Naloxone Liability Laws. Paradoxically, all three correlated with an increase in opioid overdose deaths between 2007 and 2016. However, after correcting for the potential spurious relationship between state-specific timing of policy implementation and death rates, two policies had a statistically significant association (alpha < 0.05) with opioid overdose death rates. First, Naloxone Liability Laws were significantly associated with a 0.33 log increase in opioid overdose death rates, or a 29% increase. This equates to about 1.39 more deaths per year per 100,000 people. Second, legislation that allows for third-party naloxone prescriptions was associated with a 0.33 log decrease in opioid overdose death rates, or a 29% decrease. This equates to 1.39 fewer deaths per year per 100,000 people.
Contributors: Davis, Joshua Alan (Author) / Hruschka, Daniel (Thesis director) / Gaughan, Monica (Committee member) / School of Human Evolution & Social Change (Contributor) / Barrett, The Honors College (Contributor)
Created: 2019-05