Description
A primary goal in computer science is to develop autonomous systems. Usually, we provide computers with tasks and rules for completing those tasks, but what if we could extend this type of system to physical technology as well? In the field of programmable matter, researchers are tasked with developing synthetic materials that can change their physical properties — such as color, density, and even shape — based on predefined rules or continuous, autonomous collection of input. In this research, we are most interested in particles that can perform computations, bond with other particles, and move. In this paper, we provide a theoretical particle model that can be used to simulate the performance of such physical particle systems, as well as an algorithm to perform expansion, wherein these particles can be used to enclose spaces or even objects.
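For intuition, particle models of this kind are often built on an expand/contract movement primitive over the triangular lattice. The sketch below is an illustrative skeleton of that primitive only, using axial coordinates; the class and function names are assumptions, not the thesis's actual model or expansion algorithm.

```python
# Hypothetical sketch of the expand/contract movement primitive common to
# amoebot-style particle models. Illustrative only; not the thesis's algorithm.
TRI_NEIGHBORS = [(1, 0), (-1, 0), (0, 1), (0, -1), (1, -1), (-1, 1)]  # axial coords

class Particle:
    def __init__(self, pos):
        self.head = pos      # occupied lattice node
        self.tail = pos      # equals head when contracted

    def is_expanded(self):
        return self.head != self.tail

def try_expand(particle, occupied, direction):
    """Move the particle's head into an adjacent empty node, spanning two nodes."""
    dq, dr = TRI_NEIGHBORS[direction]
    target = (particle.head[0] + dq, particle.head[1] + dr)
    if particle.is_expanded() or target in occupied:
        return False
    particle.tail = particle.head
    particle.head = target
    occupied.add(target)
    return True

def contract_tail(particle, occupied):
    """Release the tail node, completing one unit of movement."""
    if not particle.is_expanded():
        return False
    occupied.discard(particle.tail)
    particle.tail = particle.head
    return True
```

A full system would run many such particles asynchronously under local rules; this skeleton only shows the movement mechanics those rules drive.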
Contributors: Laff, Miles (Author) / Richa, Andrea (Thesis director) / Bazzi, Rida (Committee member) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor) / School of Mathematical and Statistical Sciences (Contributor)
Created: 2015-05
Description
Bots tamper with social media networks by artificially inflating the popularity of certain topics. In this paper, we define what a bot is, we detail different motivations for bots, we describe previous work in bot detection and observation, and then we perform bot detection of our own. For our bot detection, we are interested in bots on Twitter that tweet Arabic extremist-like phrases. A testing dataset is collected using the honeypot method, and five different heuristics are measured for their effectiveness in detecting bots. The model underperformed, but we have laid the groundwork for a largely untapped focus of bot detection: the diffusion of extremist ideals through bots.
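Heuristic bot detection of this kind typically combines several per-account signals into a score. The sketch below illustrates the general shape of such a scorer; the feature names and thresholds here are invented for illustration and are not the five heuristics the thesis actually measured.

```python
# Illustrative heuristic bot scorer; features and thresholds are assumptions,
# not the thesis's measured heuristics.
def bot_score(account):
    """Combine simple per-account heuristics into a score in [0, 5]."""
    score = 0
    if account["tweets_per_day"] > 50:          # unusually high posting rate
        score += 1
    if account["duplicate_tweet_ratio"] > 0.5:  # mostly repeated text
        score += 1
    if account["followers"] < 0.1 * account["following"]:  # lopsided follow graph
        score += 1
    if account["account_age_days"] < 30:        # very new account
        score += 1
    if not account["has_profile_bio"]:          # empty profile
        score += 1
    return score

def classify(account, threshold=3):
    """Flag an account as a bot when enough heuristics fire."""
    return "bot" if bot_score(account) >= threshold else "human"
```

In practice each heuristic would be evaluated against labeled data (e.g., the honeypot-collected set) before being trusted.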
Contributors: Karlsrud, Mark C. (Author) / Liu, Huan (Thesis director) / Morstatter, Fred (Committee member) / Barrett, The Honors College (Contributor) / Computing and Informatics Program (Contributor) / Computer Science and Engineering Program (Contributor) / School of Mathematical and Statistical Sciences (Contributor)
Created: 2015-05
Description
Many programmable matter systems have been proposed and realized recently, each often tailored toward a particular task or physical setting. In our work on self-organizing particle systems, we abstract away from specific settings and instead describe programmable matter as a collection of simple computational elements (to be referred to as particles) with limited computational power that each perform fully distributed, local, asynchronous algorithms to solve system-wide problems of movement, configuration, and coordination. In this thesis, we focus on the compression problem, in which the particle system gathers as tightly together as possible, as in a sphere or its equivalent in the presence of some underlying geometry. While there are many ways to formalize what it means for a particle system to be compressed, we address three different notions of compression: (1) local compression, in which each individual particle utilizes local rules to create an overall convex structure containing no holes, (2) hole elimination, in which the particle system seeks to detect and eliminate any holes it contains, and (3) alpha-compression, in which the particle system seeks to shrink its perimeter to be within a constant factor of the minimum possible value. We analyze the behavior of each of these algorithms, examining correctness and convergence where appropriate. In the case of the Markov Chain Algorithm for Compression, we provide improvements to the original bounds for the bias parameter lambda, which influences the system to either compress or expand. Lastly, we briefly discuss contributions to the problem of leader election, in which a particle system elects a single leader, since it acts as an important prerequisite for compression algorithms that use a predetermined seed particle.
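The bias parameter lambda in Markov-chain compression algorithms of this family typically enters through a Metropolis-style acceptance rule: a proposed particle move is accepted with probability min(1, lambda^delta), where delta is the change in some compression measure (e.g., the number of lattice edges between neighboring particles). The sketch below shows only that acceptance step, under those assumptions; it omits the connectivity checks the full algorithm requires.

```python
import random

# Minimal Metropolis-style acceptance rule, illustrative of how a bias
# parameter lambda_ drives compression (lambda_ > 1) or expansion (lambda_ < 1).
# Not the thesis's full algorithm, which also enforces local connectivity.
def accept_move(lambda_, delta, rng=random.random):
    """Accept a proposed move with probability min(1, lambda_ ** delta),
    where delta is the change in the compression measure the move causes."""
    return rng() < min(1.0, lambda_ ** delta)
```

With lambda above 1, moves that increase the measure (delta > 0) are always accepted and moves that decrease it are accepted only rarely, biasing the chain toward compressed configurations.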
Contributors: Daymude, Joshua Jungwoo (Author) / Richa, Andrea (Thesis director) / Kierstead, Henry (Committee member) / Computer Science and Engineering Program (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05
Description
Covering subsequences with sets of permutations arises in many applications, including event-sequence testing. Given a set of subsequences to cover, one is often interested in knowing the minimum number of permutations required to cover each subsequence, and in finding an explicit construction of such a set of permutations that has size close to or equal to the minimum possible. The construction of such permutation coverings has proven to be computationally difficult. While many examples for permutations of small length have been found, and strong asymptotic behavior is known, there are few explicit constructions for permutations of intermediate lengths. Most of these are generated from scratch using greedy algorithms. We explore a different approach here. Starting with a set of permutations with the desired coverage properties, we compute local changes to individual permutations that retain the total coverage of the set. By choosing these local changes so as to make one permutation less "essential" in maintaining the coverage of the set, our method attempts to make a permutation completely non-essential, so it can be removed without sacrificing total coverage. We develop a post-optimization method to do this and present results on sequence covering arrays and other types of permutation covering problems, demonstrating that it is surprisingly effective.
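The core redundancy check behind this post-optimization idea can be sketched directly: a permutation is non-essential when every length-t subsequence it covers is also covered by some other permutation in the set. The helper names below are illustrative, not the thesis's implementation.

```python
from itertools import combinations

# Sketch of the non-essentiality test used in permutation-covering
# post-optimization. Illustrative only.
def covered(perm, t):
    """All length-t subsequences (in order) appearing in a permutation."""
    return set(combinations(perm, t))

def nonessential(perms, t):
    """Indices of permutations whose removal leaves total coverage unchanged."""
    cover_sets = [covered(p, t) for p in perms]
    result = []
    for i in range(len(perms)):
        rest = set().union(*(cover_sets[:i] + cover_sets[i + 1:]))
        if cover_sets[i] <= rest:  # everything i covers is covered elsewhere
            result.append(i)
    return result
```

A post-optimizer would apply coverage-preserving local changes to permutations until some index lands in this list, then delete that permutation and repeat.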
Contributors: Murray, Patrick Charles (Author) / Colbourn, Charles (Thesis director) / Czygrinow, Andrzej (Committee member) / Barrett, The Honors College (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Department of Physics (Contributor)
Created: 2014-12
Description
Economists, political philosophers, and others have often characterized social preferences regarding inequality by imagining a hypothetical choice of distributions behind "a veil of ignorance". Recent behavioral economics work has shown that subjects care about equality of outcomes, and are willing to sacrifice, in experimental contexts, some amount of personal gain in order to achieve greater equality. We review some of this literature and then conduct an experiment of our own, comparing subjects' choices in two risky situations, one being a choice for a purely individualized lottery for themselves, and the other a choice among possible distributions to members of a randomly selected group. We find that choosing in the group situation makes subjects significantly more risk averse than when choosing an individual lottery. This supports the hypothesis that an additional preference for equality exists alongside ordinary risk aversion, and that in a hypothetical "veil of ignorance" scenario, such preferences may make subjects significantly more averse to unequal distributions of rewards than can be explained by risk aversion alone.
Contributors: Theisen, Alexander Scott (Co-author) / McMullin, Caitlin (Co-author) / Li, Marilyn (Co-author) / DeSerpa, Allan (Thesis director) / Schlee, Edward (Committee member) / Baldwin, Marjorie (Committee member) / Barrett, The Honors College (Contributor) / Department of Economics (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Economics Program in CLAS (Contributor) / School of Historical, Philosophical and Religious Studies (Contributor)
Created: 2014-05
Description
Polar ice masses can be valuable indicators of trends in global climate. In an effort to better understand the dynamics of Arctic ice, this project analyzes sea ice concentration anomaly data collected over gridded regions (cells) and builds graphs based upon high correlations between cells. These graphs offer the opportunity to use metrics such as clustering coefficients and connected components to isolate representative trends in ice masses. Based upon this analysis, the structure of sea ice graphs differs at a statistically significant level from random graphs, and several regions show erratically decreasing trends in sea ice concentration.
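The graph construction described here can be sketched concisely: grid cells become vertices, and an edge joins two cells whose anomaly time series are highly correlated. The correlation threshold and function names below are illustrative assumptions, not the thesis's calibrated values.

```python
# Sketch of building a correlation graph over grid cells; threshold and names
# are illustrative. Assumes each series is non-constant (nonzero variance).
def pearson(x, y):
    """Pearson correlation of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def correlation_graph(series_by_cell, threshold=0.9):
    """Adjacency sets over cells whose pairwise |correlation| meets the threshold."""
    cells = list(series_by_cell)
    adj = {c: set() for c in cells}
    for i, a in enumerate(cells):
        for b in cells[i + 1:]:
            if abs(pearson(series_by_cell[a], series_by_cell[b])) >= threshold:
                adj[a].add(b)
                adj[b].add(a)
    return adj
```

Metrics such as clustering coefficients and connected components would then be computed on the resulting adjacency structure.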
Contributors: Wallace-Patterson, Chloe Rae (Author) / Syrotiuk, Violet (Thesis director) / Colbourn, Charles (Committee member) / Montgomery, Douglas (Committee member) / Barrett, The Honors College (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Computer Science and Engineering Program (Contributor)
Created: 2013-05
Description

This paper examines infrastructure spending in a model economy. Infrastructure is subdivided into two types: one that makes future production more efficient, and another that decreases the risk of devastation to the future economy. We call the first type base infrastructure, and the second type risk-reducing infrastructure. Our model assumes that a single representative individual makes all the decisions within a society and optimizes their own total utility over the present and future. We then calibrate an aggregate economic, two-period model to identify the optimal allocation of today’s output into consumption, base infrastructure, and risk-reducing infrastructure. This model finds that many governments can make substantive improvements to the happiness of their citizens by investing significantly more into risk-reducing infrastructure.
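A two-period allocation model of this shape can be sketched as a small optimization: split today's output among consumption, base infrastructure (raising future output), and risk-reducing infrastructure (lowering disaster probability), then maximize total utility. Every functional form and parameter below is an illustrative assumption, not the thesis's calibration.

```python
import math

# Toy two-period model in the spirit described above; all forms and parameters
# are assumptions for illustration, not the thesis's calibrated model.
def expected_utility(c, b, r, beta=0.95):
    """Log utility of consuming c now plus discounted expected log utility later.
    Base infrastructure b raises future output; risk-reducing infrastructure r
    lowers the probability of a disaster that cuts future output to a tenth."""
    if c <= 0:
        return float("-inf")
    future = (1.0 + b) ** 0.5            # diminishing returns to base infrastructure
    p = 0.5 * math.exp(-5.0 * r)         # disaster probability falls with r
    return math.log(c) + beta * ((1 - p) * math.log(future)
                                 + p * math.log(0.1 * future))

def best_allocation(y=1.0, step=0.01):
    """Grid-search the split of output y into (consumption, base, risk-reducing)."""
    best, best_u = None, float("-inf")
    n = int(round(y / step))
    for i in range(1, n + 1):
        for j in range(n + 1 - i):
            c, b = i * step, j * step
            r = y - c - b
            u = expected_utility(c, b, r)
            if u > best_u:
                best, best_u = (c, b, r), u
    return best
```

Under these toy parameters the optimizer allocates a nonzero share to risk-reducing infrastructure, mirroring the qualitative conclusion above; the actual shares depend entirely on the assumed forms.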

Contributors: Fink, Justin (Co-author) / Fuller, John "Jack" (Co-author) / Prescott, Edward (Thesis director) / Millington, Matthew (Committee member) / School of Mathematical and Statistical Sciences (Contributor) / Economics Program in CLAS (Contributor) / Barrett, The Honors College (Contributor)
Created: 2021-05
Description
Dr. Dean Kashiwagi created a new thinking paradigm, Information Measurement Theory (IMT), which utilizes the understanding of natural laws to help individuals minimize decision-making and risk, which leads to reduced stress. In this new paradigm, any given situation can only have one unique outcome. The more information an individual has for the given situation, the better they can predict the outcome. Using IMT can help correctly "predict the future" of any situation, given enough of the correct information. A prime example of using IMT: to correctly predict what a young woman will be like when she is older, simply look at the young woman's mother. In essence, if you can't fall in love with the mother, don't marry the young woman. The researchers utilize the concept of IMT and extrapolate it to the world of financial investing. They researched different financial investing strategies and concluded that a strategy utilizing IMT would yield the highest results for investors while minimizing stress. Investors using deductive logic to invest received, on average, 1300% more returns than investors who did not over a 25-year period. Where other investors made many decisions and were constantly stressed by the tribulations of the market, the investors utilizing IMT made one decision and earned much more than other investors. The research confirms the stock market will continue to increase over time by looking at the history of the stock market from a bird's-eye view. Throughout the existence of the stock market, there have been highs and lows, but at the end of the day, the market continues to break through new ceilings. Investing in the stock market can be a dark and scary place for the blind investor. Using the concept of IMT can remove that blindfold, reducing stress on investors while earning the highest financial return potential.
Using the basis of IMT, the researchers predict the market will continue to increase in the future; in conclusion, the best investment strategy is to invest in blue-chip stocks with a history of past success, in order to capture secure growth with minimal risk and stress.
Contributors: Berns, Ryan (Co-author) / Ybanez, Julian (Co-author) / Kashiwagi, Dean (Thesis director) / Kashiwagi, Jacob (Committee member) / Barrett, The Honors College (Contributor) / Department of Finance (Contributor) / Department of Marketing (Contributor) / W. P. Carey School of Business (Contributor)
Created: 2015-05
Description
A two-way deterministic finite pushdown automaton ("2PDA") is developed for the Lua language. This 2PDA is evaluated against both a purpose-built Lua syntax test suite and the test suite used by the reference implementation of Lua, and fully passes both.
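For a flavor of why pushdown power is needed for Lua syntax, consider block nesting: `do`/`then`/`function` blocks close with `end`, and `repeat` closes with `until`, which no finite automaton can track at arbitrary depth. The sketch below is a tiny one-way stack recognizer for that nesting only; the thesis's 2PDA (two-way head movement, full Lua grammar) is far richer, and this is not it.

```python
# Toy stack-based check that Lua-style block openers and closers nest properly
# in a token stream. Illustrative only; keyword sets are simplified.
OPENERS = {"do", "then", "function", "repeat"}
CLOSERS = {"end": {"do", "then", "function"}, "until": {"repeat"}}

def blocks_balanced(tokens):
    """Return True when every closer matches the most recent open block."""
    stack = []
    for tok in tokens:
        if tok in OPENERS:
            stack.append(tok)
        elif tok in CLOSERS:
            if not stack or stack.pop() not in CLOSERS[tok]:
                return False  # closer with no matching opener
    return not stack          # every opened block was closed
```

The stack is exactly the pushdown store; a deterministic PDA for full Lua syntax generalizes this idea across the whole grammar.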
Contributors: Stevens, Kevin A (Author) / Shoshitaishvili, Yan (Thesis director) / Wang, Ruoyu (Committee member) / School of Mathematical and Statistical Sciences (Contributor) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2020-05
Description
We consider programmable matter as a collection of simple computational elements (or particles) that self-organize to solve system-wide problems of movement, configuration, and coordination. Here, we focus on the compression problem, in which the particle system gathers as tightly together as possible, as in a sphere or its equivalent in the presence of some underlying geometry. Within this model, a configuration of particles can be represented as a unique closed self-avoiding walk on the triangular lattice. In this paper we will examine the bias parameter of a Markov chain based algorithm that solves the compression problem under the geometric amoebot model, for particle systems that begin in a connected configuration with no holes. This bias parameter, $\lambda$, determines the behavior of the algorithm. It has been shown that for $\lambda > 2+\sqrt{2}$, with all but exponentially small probability, the algorithm achieves compression. Additionally, the same algorithm can be used for expansion for small values of $\lambda$; in particular, for all $0 < \lambda < \sqrt{\tau}$, where $\lim_{n\to\infty} (p_n)^{1/n} = \tau$ and $p_n$ counts self-avoiding polygons of perimeter $n$. This research will focus on improving approximations on the lower bound of $\tau$. Toward this end we will examine algorithmic enumeration, and series analysis for self-avoiding polygons.
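The "algorithmic enumeration" direction can be illustrated with a brute-force count: enumerate closed self-avoiding walks on the triangular lattice by depth-first search. Note this raw count is only a stand-in for the polygon counts $p_n$, which additionally de-duplicate rotations, reflections, and starting points; the code is an illustration, not the thesis's enumeration method.

```python
# Brute-force count of closed self-avoiding walks of length n on the
# triangular lattice (axial coordinates), starting and ending at the origin.
# Illustrative only; real polygon counts p_n quotient out symmetries.
TRI_STEPS = [(1, 0), (-1, 0), (0, 1), (0, -1), (1, -1), (-1, 1)]

def closed_walks(n):
    """Number of length-n self-avoiding walks from the origin back to the origin."""
    def dfs(pos, visited, steps_left):
        if steps_left == 0:
            return 1 if pos == (0, 0) else 0
        total = 0
        for dq, dr in TRI_STEPS:
            nxt = (pos[0] + dq, pos[1] + dr)
            if nxt == (0, 0):
                # the origin may be revisited only on the final step
                total += dfs(nxt, visited, steps_left - 1) if steps_left == 1 else 0
            elif nxt not in visited:
                total += dfs(nxt, visited | {nxt}, steps_left - 1)
        return total
    return dfs((0, 0), {(0, 0)}, n)
```

Ratios or roots of successive counts give crude numerical handles on the growth constant; serious lower bounds on $\tau$ come from far more efficient enumeration (e.g., transfer-matrix methods) and series analysis.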
Contributors: Lough, Kevin James (Author) / Richa, Andrea (Thesis director) / Fishel, Susanna (Committee member) / School of Mathematical and Statistical Sciences (Contributor) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2019-05