Description

Cancer rates vary between people, between cultures, and between tissue types, driven by clinically relevant distinctions in the risk factors that lead to different cancer types. Despite the importance of cancer location in human health, little is known about tissue-specific cancers in non-human animals. We can gain significant insight into how evolutionary history has shaped mechanisms of cancer suppression by examining how life history traits impact cancer susceptibility across species. Here, we perform a multi-level analysis to test how species-level life history strategies are associated with differences in neoplasia prevalence, and apply this to mammary neoplasia within mammals. We propose that the same patterns of cancer prevalence that have been reported across species will be maintained at the tissue-specific level. We used a combination of factor analysis and phylogenetic regression on 13 life history traits across 90 mammalian species to determine how each life history trait relates to mammary neoplasia prevalence. The factor analysis provided a way to quantify the underlying factors that contribute to the covariance of entangled life history variables. A greater risk of mammary neoplasia was found to be most significantly correlated with shorter gestation length. This analysis provides a framework for how different life history modalities can influence cancer vulnerability. Additionally, the statistical methods developed for this project offer a framework for future comparative oncology studies and have the potential for many diverse applications.
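As a rough illustration of the kind of analysis described, the sketch below pairs a factor analysis of 13 life history traits with a regression of neoplasia prevalence on the extracted factors. Ordinary least squares stands in for the phylogenetic regression used in the thesis, and the trait matrix and prevalence values are random placeholders rather than the study's data.

```python
# Sketch: factor analysis of life history traits, then regression of mammary
# neoplasia prevalence on the factor scores. OLS stands in for the
# phylogenetic regression; all data below are placeholders.
import numpy as np
import pandas as pd
from sklearn.decomposition import FactorAnalysis
import statsmodels.api as sm

rng = np.random.default_rng(0)
traits = pd.DataFrame(
    rng.normal(size=(90, 13)),                  # 90 species x 13 life history traits
    columns=[f"trait_{i}" for i in range(13)],
)
neoplasia_prev = rng.uniform(0, 0.3, size=90)   # per-species mammary neoplasia prevalence

# Collapse the correlated traits into a few underlying factors.
fa = FactorAnalysis(n_components=3, random_state=0)
factors = fa.fit_transform(traits)

# Regress prevalence on the factor scores; a phylogenetic regression would also
# incorporate a covariance structure derived from the species tree.
X = sm.add_constant(factors)
print(sm.OLS(neoplasia_prev, X).fit().summary())
```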

Contributors: Fox, Morgan Shane (Author) / Maley, Carlo C. (Thesis director) / Boddy, Amy (Committee member) / Compton, Zachary (Committee member) / School of Mathematical and Statistical Sciences (Contributor) / School of Molecular Sciences (Contributor) / School of Life Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created: 2021-05
Description

A primary goal in computer science is to develop autonomous systems. Usually, we provide computers with tasks and rules for completing those tasks, but what if we could extend this type of system to physical technology as well? In the field of programmable matter, researchers are tasked with developing synthetic materials that can change their physical properties (such as color, density, and even shape) based on predefined rules or continuous, autonomous collection of input. In this research, we are most interested in particles that can perform computations, bond with other particles, and move. In this paper, we provide a theoretical particle model that can be used to simulate the performance of such physical particle systems, as well as an algorithm to perform expansion, wherein these particles can be used to enclose spaces or even objects.
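The abstract does not spell out the particle model itself; purely as a toy illustration of a geometric particle that can expand into an adjacent lattice node and contract again, a sketch might look like the following, with the lattice, neighborhood, and bookkeeping all simplified placeholders.

```python
# Toy particle on a triangular lattice that can expand into an adjacent empty
# node and then contract; a simplified placeholder, not the thesis's model.
from dataclasses import dataclass

NEIGHBORS = [(1, 0), (-1, 0), (0, 1), (0, -1), (1, -1), (-1, 1)]  # axial hex offsets

@dataclass
class Particle:
    head: tuple            # node currently occupied
    tail: tuple = None     # second node occupied while expanded

    def expand(self, offset, occupied):
        """Expand into an adjacent empty node; return True on success."""
        target = (self.head[0] + offset[0], self.head[1] + offset[1])
        if offset in NEIGHBORS and target not in occupied:
            self.tail, self.head = self.head, target
            occupied.add(target)
            return True
        return False

    def contract(self, occupied):
        """Contract back to a single node, keeping the head."""
        if self.tail is not None:
            occupied.discard(self.tail)
            self.tail = None

occupied = {(0, 0)}
p = Particle(head=(0, 0))
p.expand((1, 0), occupied)
p.contract(occupied)
print(p.head, occupied)    # (1, 0) {(1, 0)}
```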
Contributors: Laff, Miles (Author) / Richa, Andrea (Thesis director) / Bazzi, Rida (Committee member) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor) / School of Mathematical and Statistical Sciences (Contributor)
Created: 2015-05
Description

Despite the 40-year war on cancer, very limited progress has been made in developing a cure for the disease. This failure has prompted the reevaluation of the causes and development of cancer. One resulting model, coined the atavistic model of cancer, posits that cancer is a default phenotype of the cells of multicellular organisms which arises when the cell is subjected to an unusual amount of stress. Since this default phenotype is similar across cell types and even organisms, it seems it must be an evolutionarily ancestral phenotype. We take a phylostratigraphical approach, but systematically add species divergence time data to estimate gene ages numerically and use these ages to investigate the ages of genes involved in cancer. We find that ancient disease-recessive cancer genes are significantly enriched for DNA repair and SOS activity, which seems to imply that a core component of cancer development is not the regulation of growth, but the regulation of mutation. Verification of this finding could drastically improve cancer treatment and prevention.
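The core computational step described, assigning each gene a numeric age from species divergence times, can be illustrated with a minimal sketch; the species, divergence times, and homolog hits below are placeholders rather than data from the thesis.

```python
# Sketch: a gene's age is taken to be the divergence time (millions of years)
# of the most distantly related taxon in which a homolog is detected.
# All values below are illustrative placeholders.
divergence_mya = {          # time since the human lineage split from each taxon
    "chimpanzee": 6,
    "mouse": 90,
    "zebrafish": 435,
    "fruit_fly": 797,
    "yeast": 1275,
}

gene_homologs = {           # taxa in which a homolog of each gene is found
    "TP53": ["chimpanzee", "mouse", "zebrafish", "fruit_fly"],
    "BRCA2": ["chimpanzee", "mouse", "zebrafish"],
    "NOVEL1": ["chimpanzee"],
}

gene_age = {
    gene: max(divergence_mya[taxon] for taxon in taxa)
    for gene, taxa in gene_homologs.items()
}
print(gene_age)             # {'TP53': 797, 'BRCA2': 435, 'NOVEL1': 6}
```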
Contributors: Orr, Adam James (Author) / Davies, Paul (Thesis director) / Bussey, Kimberly (Committee member) / Barrett, The Honors College (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Department of Chemistry and Biochemistry (Contributor) / School of Life Sciences (Contributor)
Created: 2015-05
Description

Bots tamper with social media networks by artificially inflating the popularity of certain topics. In this paper, we define what a bot is, we detail different motivations for bots, we describe previous work in bot detection and observation, and then we perform bot detection of our own. For our bot detection, we are interested in bots on Twitter that tweet Arabic extremist-like phrases. A testing dataset is collected using the honeypot method, and five different heuristics are measured for their effectiveness in detecting bots. The model underperformed, but we have laid the groundwork for a largely untapped focus in bot detection: the diffusion of extremist ideals through bots.
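The five heuristics evaluated in the thesis are not named in the abstract; purely as an illustration of heuristic-based bot scoring, the sketch below uses three hypothetical signals (posting rate, duplicate content, follower-to-following ratio) with made-up thresholds.

```python
# Hypothetical heuristic scoring for bot detection; the signals and thresholds
# are illustrative stand-ins, not the five heuristics measured in the thesis.
def bot_score(account):
    score = 0
    if account["tweets_per_day"] > 100:            # unusually high posting rate
        score += 1
    if account["duplicate_tweet_ratio"] > 0.5:     # mostly repeated content
        score += 1
    if account["followers"] < 0.01 * account["following"]:  # follows far more than it is followed
        score += 1
    return score

account = {"tweets_per_day": 240, "duplicate_tweet_ratio": 0.8,
           "followers": 5, "following": 2000}
print(bot_score(account))   # 3 -> flagged as a likely bot under these toy thresholds
```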
Contributors: Karlsrud, Mark C. (Author) / Liu, Huan (Thesis director) / Morstatter, Fred (Committee member) / Barrett, The Honors College (Contributor) / Computing and Informatics Program (Contributor) / Computer Science and Engineering Program (Contributor) / School of Mathematical and Statistical Sciences (Contributor)
Created: 2015-05
Description

Many programmable matter systems have been proposed and realized recently, each often tailored toward a particular task or physical setting. In our work on self-organizing particle systems, we abstract away from specific settings and instead describe programmable matter as a collection of simple computational elements (to be referred to as particles) with limited computational power that each perform fully distributed, local, asynchronous algorithms to solve system-wide problems of movement, configuration, and coordination. In this thesis, we focus on the compression problem, in which the particle system gathers as tightly together as possible, as in a sphere or its equivalent in the presence of some underlying geometry. While there are many ways to formalize what it means for a particle system to be compressed, we address three different notions of compression: (1) local compression, in which each individual particle utilizes local rules to create an overall convex structure containing no holes, (2) hole elimination, in which the particle system seeks to detect and eliminate any holes it contains, and (3) alpha-compression, in which the particle system seeks to shrink its perimeter to be within a constant factor of the minimum possible value. We analyze the behavior of each of these algorithms, examining correctness and convergence where appropriate. In the case of the Markov Chain Algorithm for Compression, we provide improvements to the original bounds for the bias parameter lambda, which influences the system to either compress or expand. Lastly, we briefly discuss contributions to the problem of leader election, in which a particle system elects a single leader, since it acts as an important prerequisite for compression algorithms that use a predetermined seed particle.
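As a rough sketch of the Metropolis-style dynamics behind a biased compression Markov chain, the toy simulation below moves a random particle to an adjacent empty site with probability min(1, lambda^(new neighbors - old neighbors)), so lambda > 1 favors configurations with more neighboring pairs; the connectivity and hole checks of the actual algorithm are omitted, and the lattice and parameters are illustrative.

```python
# Toy Metropolis move biased by lambda: more occupied neighbors after the move
# makes acceptance more likely when lambda > 1 (compression). Connectivity and
# hole checks from the real algorithm are omitted.
import random

NEIGHBORS = [(1, 0), (-1, 0), (0, 1), (0, -1), (1, -1), (-1, 1)]  # axial hex offsets

def neighbor_count(site, occupied):
    return sum((site[0] + dx, site[1] + dy) in occupied for dx, dy in NEIGHBORS)

def metropolis_step(occupied, lam):
    site = random.choice(list(occupied))
    dx, dy = random.choice(NEIGHBORS)
    target = (site[0] + dx, site[1] + dy)
    if target in occupied:
        return
    others = occupied - {site}
    old = neighbor_count(site, others)
    new = neighbor_count(target, others)
    if random.random() < min(1.0, lam ** (new - old)):
        occupied.remove(site)
        occupied.add(target)

occupied = {(x, 0) for x in range(10)}     # start from a line of 10 particles
for _ in range(10000):
    metropolis_step(occupied, lam=4.0)
print(occupied)                            # typically a more compact blob
```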
Contributors: Daymude, Joshua Jungwoo (Author) / Richa, Andrea (Thesis director) / Kierstead, Henry (Committee member) / Computer Science and Engineering Program (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05
Description

Glioblastoma Multiforme (GBM) is an aggressive and deadly form of brain cancer with a median survival time of about a year with treatment. Due to the aggressive nature of these tumors and the tendency of gliomas to follow white matter tracts in the brain, each tumor mass has a unique growth pattern. Consequently, it is difficult for neurosurgeons to anticipate where the tumor will spread in the brain, making treatment planning difficult. Archival patient data, including MRI scans depicting the progress of tumors, have been helpful in developing a model to predict glioblastoma proliferation, but limited scans per patient make the tumor growth rate difficult to determine. Furthermore, patient treatment between scan points can significantly compound the challenge of accurately predicting the tumor growth. A partnership with Barrow Neurological Institute has allowed murine studies to be conducted in order to closely observe tumor growth and potentially improve the current model to more closely resemble intermediate stages of GBM growth without treatment effects.
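The abstract does not specify the mathematical form of the growth model; reaction-diffusion (proliferation-invasion) equations are a common choice in glioma modeling, so the 1D finite-difference sketch below is offered only as an illustrative assumption, with made-up parameter values.

```python
# Illustrative 1D reaction-diffusion sketch, dc/dt = D*d2c/dx2 + rho*c*(1 - c),
# a common (assumed) form for glioma growth models; parameters are placeholders.
import numpy as np

D, rho = 0.05, 0.2        # diffusion (mm^2/day) and proliferation (1/day) rates
dx, dt = 0.5, 0.1         # grid spacing (mm) and time step (days)
x = np.arange(0, 60, dx)
c = np.exp(-((x - 30) ** 2))              # small initial tumor density bump (fraction of capacity)

for _ in range(int(100 / dt)):            # simulate 100 days
    lap = (np.roll(c, 1) - 2 * c + np.roll(c, -1)) / dx**2
    c = c + dt * (D * lap + rho * c * (1 - c))
    c[0], c[-1] = c[1], c[-2]             # approximate zero-flux boundaries

radius = (c > 0.16).sum() * dx / 2        # half-width of the region above 16% of capacity
print(f"approximate tumor radius: {radius:.1f} mm")
```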
Contributors: Snyder, Lena Haley (Author) / Kostelich, Eric (Thesis director) / Frakes, David (Committee member) / Barrett, The Honors College (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Harrington Bioengineering Program (Contributor)
Created: 2014-05
Description

Covering subsequences with sets of permutations arises in many applications, including event-sequence testing. Given a set of subsequences to cover, one is often interested in knowing the minimum number of permutations required to cover each subsequence, and in finding an explicit construction of such a set of permutations whose size is close to or equal to the minimum possible. The construction of such permutation coverings has proven to be computationally difficult. While many examples for permutations of small length have been found, and strong asymptotic behavior is known, there are few explicit constructions for permutations of intermediate lengths. Most of these are generated from scratch using greedy algorithms. We explore a different approach here. Starting with a set of permutations with the desired coverage properties, we compute local changes to individual permutations that retain the total coverage of the set. By choosing these local changes so as to make one permutation less "essential" in maintaining the coverage of the set, our method attempts to make a permutation completely non-essential, so it can be removed without sacrificing total coverage. We develop a post-optimization method to do this and present results on sequence covering arrays and other types of permutation covering problems, demonstrating that it is surprisingly effective.
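The redundancy test underlying the post-optimization idea, whether a permutation can be dropped without losing any t-subsequence coverage, can be sketched with a brute-force check; the toy permutation set below is illustrative, and the prints simply report whether it happens to achieve full coverage of 3-subsequences and which members are redundant.

```python
# Brute-force coverage and redundancy check for a set of permutations;
# illustrative only, not the thesis's optimized post-optimization procedure.
from itertools import combinations, permutations

def covered(perm, t):
    """All length-t subsequences of one permutation, in their original order."""
    return set(combinations(perm, t))

def redundant(perms, i, t):
    """True if perms[i] can be removed without losing any t-subsequence coverage."""
    without_i = set().union(*(covered(p, t) for j, p in enumerate(perms) if j != i))
    return covered(perms[i], t) <= without_i

perms = [
    (0, 1, 2, 3, 4), (4, 3, 2, 1, 0), (2, 0, 4, 1, 3),
    (3, 1, 4, 0, 2), (1, 3, 0, 4, 2), (0, 2, 1, 4, 3),
]
target = set(permutations(range(5), 3))    # every ordering of 3 distinct symbols
covered_all = set().union(*(covered(p, 3) for p in perms))
print("full coverage:", covered_all >= target)
print("redundant permutations:", [i for i in range(len(perms)) if redundant(perms, i, 3)])
```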
Contributors: Murray, Patrick Charles (Author) / Colbourn, Charles (Thesis director) / Czygrinow, Andrzej (Committee member) / Barrett, The Honors College (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Department of Physics (Contributor)
Created: 2014-12
Description

Polar ice masses can be valuable indicators of trends in global climate. In an effort to better understand the dynamics of Arctic ice, this project analyzes sea ice concentration anomaly data collected over gridded regions (cells) and builds graphs based upon high correlations between cells. These graphs offer the opportunity to use metrics such as clustering coefficients and connected components to isolate representative trends in ice masses. Based upon this analysis, the structure of sea ice graphs differs at a statistically significant level from random graphs, and several regions show erratically decreasing trends in sea ice concentration.
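A minimal sketch of the graph construction described, connecting grid cells whose anomaly time series are highly correlated and then computing clustering coefficients and connected components, follows; the anomaly data and correlation threshold are placeholders.

```python
# Sketch: build a graph whose nodes are grid cells and whose edges join cells
# with highly correlated sea ice concentration anomaly series; the data and
# threshold here are placeholders.
import numpy as np
import networkx as nx

rng = np.random.default_rng(1)
n_cells, n_months = 50, 240
anomalies = rng.normal(size=(n_cells, n_months))   # one anomaly time series per grid cell

corr = np.corrcoef(anomalies)                      # cell-by-cell correlation matrix
G = nx.Graph()
G.add_nodes_from(range(n_cells))
threshold = 0.3                                    # correlation cutoff (illustrative)
for i in range(n_cells):
    for j in range(i + 1, n_cells):
        if corr[i, j] >= threshold:
            G.add_edge(i, j)

print("average clustering coefficient:", nx.average_clustering(G))
print("component sizes:", sorted(len(c) for c in nx.connected_components(G)))
```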
Contributors: Wallace-Patterson, Chloe Rae (Author) / Syrotiuk, Violet (Thesis director) / Colbourn, Charles (Committee member) / Montgomery, Douglas (Committee member) / Barrett, The Honors College (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Computer Science and Engineering Program (Contributor)
Created: 2013-05
Description

Over time, tumor treatment resistance inadvertently develops when androgen deprivation therapy (ADT) is applied to metastasized prostate cancer (PCa). To combat tumor resistance while reducing the harsh side effects of hormone therapy, the clinician may opt to cyclically alternate the patient’s treatment on and off. This method, known as intermittent ADT, is an alternative to continuous ADT that improves the patient’s quality of life while testosterone levels recover between cycles. In this paper, we explore the response of metastasized prostate cancer to intermittent ADT by applying a mathematical model, previously validated against clinical data, to new clinical data from patients undergoing Abiraterone therapy. This cell quota model, a system of ordinary differential equations constructed using Droop’s nutrient limiting theory, assumes the tumor comprises castration-sensitive (CS) and castration-resistant (CR) cancer sub-populations. The two sub-populations rely on varying levels of intracellular androgen for growth, death, and transformation. Due to the complexity of the model, we carry out sensitivity analyses to study the effect of certain parameters on the model outputs, and to increase the identifiability of each patient’s unique parameter set. The model’s forecasting results show consistent accuracy for patients with sufficient data, which means the model could give useful information in practice, especially in deciding whether an additional round of treatment would be effective.
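As a loose illustration of a Droop-type cell quota model with castration-sensitive and castration-resistant sub-populations, the sketch below integrates a simplified ODE system; the equation forms and parameter values are assumptions for illustration, not the calibrated patient model from the thesis.

```python
# Simplified Droop-type cell quota sketch with CS and CR sub-populations;
# equation forms and parameters are illustrative assumptions.
from scipy.integrate import solve_ivp

MU, Q_CS, Q_CR, DEATH, SWITCH = 0.06, 0.6, 0.2, 0.01, 0.005   # illustrative rates

def rhs(t, y, adt_on=True):
    x_cs, x_cr, Q = y                                  # CS cells, CR cells, androgen quota
    growth_cs = MU * max(0.0, 1 - Q_CS / Q) * x_cs     # Droop growth: stalls below minimum quota
    growth_cr = MU * max(0.0, 1 - Q_CR / Q) * x_cr     # CR cells need less androgen to grow
    switch = SWITCH * x_cs                             # CS -> CR transformation
    androgen_in = 0.002 if adt_on else 0.02            # ADT suppresses the androgen supply
    return [growth_cs - DEATH * x_cs - switch,
            growth_cr - DEATH * x_cr + switch,
            androgen_in - 0.01 * Q]

sol = solve_ivp(rhs, (0, 365), [1.0, 0.01, 1.0], max_step=1.0)   # one year with ADT on
print("final CS, CR populations:", sol.y[0, -1], sol.y[1, -1])
```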

Contributors: Bennett, Justin Klark (Author) / Kuang, Yang (Thesis director) / Kostelich, Eric (Committee member) / Phan, Tin (Committee member) / School of Mathematical and Statistical Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created: 2021-05
Description

A two-way deterministic finite pushdown automaton ("2PDA") is developed for the Lua language. This 2PDA is evaluated against both a purpose-built Lua syntax test suite and the test suite used by the reference implementation of Lua, and fully passes both.
Contributors: Stevens, Kevin A (Author) / Shoshitaishvili, Yan (Thesis director) / Wang, Ruoyu (Committee member) / School of Mathematical and Statistical Sciences (Contributor) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2020-05