Over the years, advances in research have continued to shrink computers from the size of a room to a small device that fits in one's palm. However, if an application requires neither extensive computational power nor accessories such as a screen, the corresponding machine can be microscopic, only a few nanometers across. Researchers at MIT have successfully created Syncells, micro-scale robots with limited computation power and memory that communicate locally to achieve complex collective tasks. To control these Syncells toward a desired outcome, each must run a simple distributed algorithm. Because they are capable only of local communication, Syncells cannot receive commands from a control center, so their algorithms cannot be centralized. In this work, we created a distributed algorithm that each Syncell can execute so that the system of Syncells finds and converges to a specific target within the environment. The most direct applications of this problem are in medicine: such a system could serve as a safer alternative to invasive surgery or be used to treat internal bleeding or tumors. We tested and analyzed our algorithm through simulation and visualization in Python. Overall, our algorithm successfully caused the system of particles to converge on a specific target present within the environment.
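To make the local-communication constraint concrete, here is a minimal Python sketch of one possible decentralized rule; it is not the thesis's actual algorithm, and all names and parameters (signal model, radius, step size) are hypothetical. Each particle senses a scalar signal that grows stronger near the target, compares readings with neighbors inside a fixed communication radius, and steps toward the best-informed neighbor.

```python
import numpy as np

# Minimal sketch of a decentralized target-seeking rule (hypothetical, not
# the thesis's algorithm): particles sense a signal that decays with
# distance to the target, compare readings with in-range neighbors, and
# step toward the best-informed neighbor.

rng = np.random.default_rng(0)
N, RADIUS, STEP, T = 50, 2.0, 0.1, 500
target = np.array([8.0, 8.0])             # unknown to the particles themselves
pos = rng.uniform(0.0, 10.0, size=(N, 2))

def signal(p):
    """Sensed intensity: decays with distance to the (hidden) target."""
    return 1.0 / (1.0 + np.linalg.norm(p - target))

for _ in range(T):
    readings = np.array([signal(p) for p in pos])
    new_pos = pos.copy()
    for i in range(N):
        d = np.linalg.norm(pos - pos[i], axis=1)          # local neighborhood
        nbrs = np.where((d < RADIUS) & (d > 0))[0]
        better = [j for j in nbrs if readings[j] > readings[i]]
        if better:
            # Step toward the best-informed neighbor in range.
            best = max(better, key=lambda j: readings[j])
            direction = pos[best] - pos[i]
            new_pos[i] += STEP * direction / np.linalg.norm(direction)
        else:
            # Toy exploration: locally best particles hill-climb the signal.
            trial = STEP * rng.normal(size=2)
            if signal(pos[i] + trial) > readings[i]:
                new_pos[i] += trial
    pos = new_pos

print("mean distance to target:", np.linalg.norm(pos - target, axis=1).mean())
```

Note that no particle ever uses the target's coordinates or any global state; movement depends only on its own reading and those of neighbors within the communication radius, which is the defining constraint of the problem.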
Optimal foraging theory provides a suite of tools that model the best way for an animal to structure its searching and processing decisions in uncertain environments. It has been successful at characterizing real patterns of animal decision making, thereby providing insight into why animals behave the way they do. However, it does not speak to how animals make decisions that tend to be adaptive. Using simulation studies, prior work has shown empirically that a simple decision-making heuristic tends to produce prey-choice behaviors that, on average, match the predicted behaviors of optimal foraging theory. That heuristic chooses to spend time processing an encountered prey item if that prey item's marginal rate of caloric gain (in calories per unit of processing time) is greater than the forager's current long-term rate of accumulated caloric gain (in calories per unit of total searching and processing time). Although this heuristic may seem intuitive, a rigorous mathematical argument for why it tends to produce the theorized optimal foraging behavior has not been developed. In this thesis, an analytical argument is given for why this simple decision-making heuristic is expected to realize the optimal performance predicted by optimal foraging theory. This theoretical guarantee not only provides support for why such a heuristic might be favored by natural selection, but also for why it might be a reliable tool for decision-making in autonomous engineered agents moving through theatres of uncertain rewards. Ultimately, this simple decision-making heuristic may provide a recipe for reinforcement learning in small robots with limited computational capabilities.
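As an illustration of the heuristic itself (not of the thesis's analytical argument), the Python sketch below simulates a forager with hypothetical prey types and timing parameters: an encountered item is processed only when its marginal rate g/h exceeds the forager's running long-term rate.

```python
import random

# Illustrative sketch of the prey-choice heuristic (hypothetical prey types
# and timings; not the thesis's analysis). Process an encountered item only
# if its marginal rate g/h beats the forager's running long-term rate.

random.seed(1)
PREY = [(50.0, 10.0), (20.0, 12.0), (5.0, 8.0)]  # (calories g, handling time h)
SEARCH_TIME = 2.0                                 # assumed time between encounters

calories, time = 0.0, 0.0
for _ in range(100_000):
    time += SEARCH_TIME                  # search until the next encounter
    g, h = random.choice(PREY)           # encounter a random prey type
    long_term_rate = calories / time
    if g / h > long_term_rate:           # the heuristic's single comparison
        calories += g                    # process (consume) the item
        time += h                        # ...paying its handling time
    # otherwise, skip the item and resume searching

print(f"realized long-term rate: {calories / time:.3f} cal per unit time")
```

A low-value item (small g/h) is eaten early, when the running rate is still low, but gets skipped once the forager's accumulated rate surpasses it, which is exactly the diet-narrowing behavior optimal foraging theory predicts.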
The era of mass data collection is upon us, and only recently have people begun to consider the value of their data. Our clicks and likes have helped big tech companies build predictive models that tailor their products to consumer buying patterns. Big data collection has its advantages in increasing profitability and efficiency, but many are concerned about the lack of transparency in these technologies (Dwyer). The dependency on algorithms to make and influence decisions has become a growing concern in law enforcement, where this technology is commonly referred to as data-driven decision making, also known as predictive policing. These technologies are thought to reduce the biases of traditional policing by creating statistically sound, evidence-based models. However, many lawsuits have highlighted that predictive technologies do more to reflect historical bias than to eradicate it. The clandestine measures behind the algorithms may conflict with the due process clause and the penumbra of privacy rights enumerated in the First, Third, Fourth, and Fifth Amendments.

Predictive policing technology has come under fire for over-policing historically Black and Latinx neighborhoods. GIS (Geographic Information Systems) is supposed to help officers identify where crime will likely happen over the next twelve hours. However, the LAPD's own internal audit of its program concluded that the technology did not help officers solve crimes or reduce the crime rate any better than traditional patrol methods (Puente). Similarly, other tools used to calculate recidivism risk for bond sentencing are disproportionately biased toward rating Black people as having a higher risk of reoffending (Angwin). Civil liberties groups have filed lawsuits against the police departments that utilized these technologies. This paper will examine the constitutional pitfalls of predictive technology and propose ways the system could work to ameliorate its practices.
A statistical method is proposed to learn the diffusion coefficient at any point in space on a cell membrane. The method uses Bayesian nonparametrics to infer this spatially varying value. Learning the diffusion coefficient may be useful for understanding more about cellular dynamics.
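The following toy Python sketch is not the thesis's Bayesian nonparametric method; it only illustrates, under assumed parameters, the likelihood such inference builds on: each observed increment satisfies dx | x ~ Normal(0, 2·D(x)·dt), and here a simple binned conjugate update recovers a spatially varying D from a simulated trajectory.

```python
import numpy as np

# Toy sketch (not the thesis's method): simulate a 1-D trajectory with a
# spatially varying diffusion coefficient D(x), then form a binned Bayesian
# estimate of D from the observed increments.

rng = np.random.default_rng(0)
dt, steps = 0.01, 200_000

def D_true(x):
    return 0.5 + 0.4 * np.sin(x)               # hypothetical diffusion field

x = np.zeros(steps)
for t in range(steps - 1):                      # Euler-Maruyama simulation
    x[t + 1] = x[t] + np.sqrt(2.0 * D_true(x[t]) * dt) * rng.standard_normal()

# Conjugate update per spatial bin: with an Inverse-Gamma(a0, b0) prior on
# D, the Gaussian increment likelihood gives an Inverse-Gamma posterior
# with a = a0 + n/2 and b = b0 + sum(dx^2) / (4 * dt).
a0, b0 = 2.0, 1.0
edges = np.linspace(x.min(), x.max(), 20)
dx2 = np.diff(x) ** 2
which = np.digitize(x[:-1], edges)
for b in range(1, len(edges)):
    s = dx2[which == b]
    if len(s) < 100:                            # skip sparsely visited bins
        continue
    a_post = a0 + len(s) / 2.0
    b_post = b0 + s.sum() / (4.0 * dt)
    d_hat = b_post / (a_post - 1.0)             # posterior mean of D
    center = 0.5 * (edges[b - 1] + edges[b])
    print(f"x={center:+.2f}  D_hat={d_hat:.3f}  D_true={D_true(center):.3f}")
```

A nonparametric treatment would replace the independent per-bin priors with, say, a Gaussian process prior on log D(x) so that estimates share information across space; the binned version above is only the simplest possible stand-in.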
The growth constant of self-avoiding polygons is defined by the limit $\lim_{n\to\infty}p_n^{\frac{1}{n}}=\tau$, where $p_n$ counts self-avoiding polygons of perimeter $n$. This research will focus on improving approximations of the lower bound of $\tau$. Toward this end, we will examine algorithmic enumeration and series analysis for self-avoiding polygons.
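As a baseline for what algorithmic enumeration involves (serious counts use transfer-matrix / finite-lattice methods rather than brute force), the Python sketch below counts self-avoiding polygons on the square lattice by backtracking over closed self-avoiding walks; by a standard concatenation argument, each finite $p_n^{1/n}$ supplies a (weak) lower bound on $\tau$.

```python
# Brute-force baseline for enumerating self-avoiding polygons: count closed
# self-avoiding walks of length n starting and ending at the origin, then
# divide by 2n (n root choices x 2 orientations) to count each polygon once
# up to translation.

STEPS = [(1, 0), (-1, 0), (0, 1), (0, -1)]

def closed_walks(n):
    count = 0
    visited = {(0, 0)}
    def extend(x, y, remaining):
        nonlocal count
        for dx, dy in STEPS:
            nx, ny = x + dx, y + dy
            if remaining == 1:                 # last step must close the loop
                count += (nx, ny) == (0, 0)
                continue
            if (nx, ny) in visited:            # enforce self-avoidance
                continue
            visited.add((nx, ny))
            extend(nx, ny, remaining - 1)
            visited.remove((nx, ny))
    extend(0, 0, n)
    return count

for n in range(4, 13, 2):                      # polygons have even perimeter
    p_n = closed_walks(n) // (2 * n)           # unroot and unorient the walks
    print(f"n={n:2d}  p_n={p_n:4d}  p_n^(1/n)={p_n ** (1 / n):.4f}")
```

Exponential blowup makes this approach useless beyond small n, which is precisely why the enumeration and series-analysis techniques studied here matter for tightening bounds on $\tau$.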
This thesis intends to show that the diverse body of algorithmic choreography can be organized into more specific categories. Because algorithmic choreography is fundamentally intertwined with the concept of computation, it is natural to propose that algorithmic works be separated along a spectrum defined by the extent to which computation is involved in each piece.
This thesis outlines three primary categories into which algorithmic works can fall: pieces that involve minimal computational influence, pieces that are entirely computationally generated, and pieces that lie in between. Three original works were created to reflect these categories, providing examples of the various methods by which computation can influence and enhance choreography.
The first piece, entitled Rαinwater, displays a minimal amount of computational influence: the use of space in the piece is limited to random, computationally generated paths, from which the dancers extracted a narrative element. This iteration resulted in a piece that explores the dancers' emotional interactions within the context of a rainy environment. The second piece, entitled Mymec, utilizes an intermediate amount of computation. The piece sees a dancer interact with a projected display of an Ant Colony Optimization (ACO) algorithm, taking direct inspiration from the movement of the virtual ants and embodying the visualization of the algorithm. The final piece, entitled nSkeleton, exhibits maximal computational influence: Kinect position data was manipulated using iterative methods from computational mathematics to create computer-generated movement performed by a dancer on stage.
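For readers unfamiliar with ACO, the toy Python run below (hypothetical parameters, not the Mymec visualization) shows the core dynamic the dancer embodies: ants choose paths with probability proportional to pheromone, pheromone evaporates each round, and deposits favor shorter paths, so the colony's preference gradually converges.

```python
import random

# Illustrative mini Ant Colony Optimization run (hypothetical parameters):
# ants repeatedly pick between two paths, deposit pheromone inversely
# proportional to path length, and converge on the shorter path.

random.seed(0)
lengths = {"short": 1.0, "long": 2.0}
pheromone = {"short": 1.0, "long": 1.0}
EVAPORATION, N_ANTS, ROUNDS = 0.1, 20, 30

for _ in range(ROUNDS):
    deposits = {"short": 0.0, "long": 0.0}
    for _ in range(N_ANTS):
        total = pheromone["short"] + pheromone["long"]
        # Choose a path with probability proportional to its pheromone.
        path = "short" if random.random() < pheromone["short"] / total else "long"
        deposits[path] += 1.0 / lengths[path]   # shorter path -> more pheromone
    for p in pheromone:
        pheromone[p] = (1 - EVAPORATION) * pheromone[p] + deposits[p]

share = pheromone["short"] / (pheromone["short"] + pheromone["long"])
print(f"pheromone share on short path after {ROUNDS} rounds: {share:.2f}")
```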
Each original piece was intended to be presented to the public as part of an evening-length show. However, due to the COVID-19 pandemic, all public campus events were canceled and the government recommended that gatherings of more than 10 people be avoided entirely. The pieces will instead be presented in a video published online, encompassing information about the creation of each piece as well as clips of the choreography.