Matching Items (77)
Description
Diseases have been part of human life for generations; they evolve within populations, sometimes dying out and other times becoming endemic or causing recurrent outbreaks. The long-term influence of a disease stems from different dynamics within or between pathogen and host, which many researchers have analyzed and studied using mathematical models. Co-infection with different pathogens is common, yet little is known about how infection with one pathogen affects the host's immunological response to another. Moreover, no work has been found in the literature that considers the variability of host immune health or that examines a disease at the population level together with its interconnectedness with the host immune system. Knowing that the spread of a disease in the population starts at the individual level, this thesis explores how variability in immune system response within an endemic environment affects an individual's vulnerability and susceptibility to co-infection. Immunology-based models of Malaria and Tuberculosis (TB) are constructed by extending and modifying existing mathematical models in the literature. The two are then combined to give a single nine-variable model of co-infection with Malaria and TB. Because the large number of parameters makes these models difficult to analyze, a phenomenological model of co-infection is proposed with subsystems corresponding to the individual immunology-based models of a single infection. Within this phenomenological model, the variability of host immune health is incorporated through three different pathogen response curves, using nonlinear bounded Michaelis-Menten functions that describe the state of the immune system (healthy, moderate, and severely compromised). The immunology-based models of Malaria and TB give numerical results that agree with biological observations.
The Malaria-TB co-infection model gives reasonable results, which suggest that the order in which the two diseases are introduced has an impact on the behavior of both. The subsystems of the phenomenological model that correspond to a single infection (either Malaria or TB) mimic much of the observed behavior of their immunology-based counterparts and can demonstrate different behavior depending on the chosen pathogen response curve. In addition, varying some of the parameters and initial conditions in the phenomenological model yields a range of topologically different mathematical behaviors, which suggests that this behavior may also be observable in the immunology-based models. The phenomenological models clearly replicate the qualitative behavior of primary and secondary infection as well as co-infection. The mathematical solutions of the models correspond to the fundamental states described by immunologists: the virgin state, the immune state, and the tolerance state. The phenomenological model of co-infection also demonstrates a range of parameter values and initial conditions in which the introduction of a second disease causes both diseases to grow without bound, even though those same parameters and initial conditions did not yield unbounded growth in the corresponding subsystems. This result applies to all three states of the host immune system. In terms of the immunology-based system, this suggests that there may be parameter values and initial conditions for which a person can clear Malaria or TB (separately) from their system, but for which the presence of both can result in the person dying of one of the diseases. Finally, this thesis studies links between epidemiology (the population level) and immunology in an effort to assess the impact of a pathogen's spread within the population on the immune response of individuals.
Models of Malaria and TB are proposed that incorporate the immune system of the host into a mathematical model of an epidemic at the population level.
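The bounded Michaelis-Menten response curves described above can be sketched as follows; the function name, the three-state parameterization, and all numerical values are illustrative assumptions, not taken from the thesis.

```python
def michaelis_menten(p, v_max, k):
    """Bounded Michaelis-Menten response: grows with pathogen load p
    but saturates at v_max, with half-saturation constant k."""
    return v_max * p / (k + p)

# Three hypothetical immune states, distinguished here only by their
# maximum response v_max and half-saturation k (illustrative values).
states = {
    "healthy":  {"v_max": 1.0, "k": 0.5},
    "moderate": {"v_max": 0.6, "k": 1.0},
    "severe":   {"v_max": 0.2, "k": 2.0},
}

for name, params in states.items():
    response = michaelis_menten(10.0, **params)
    print(f"{name}: response to load 10.0 = {response:.3f}")
```

The boundedness is the point: however large the pathogen load, the modeled immune response never exceeds v_max, which is what lets the three curves encode a healthy, moderate, or severely compromised host.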
ContributorsSoho, Edmé L (Author) / Wirkus, Stephen (Thesis advisor) / Castillo-Chavez, Carlos (Thesis advisor) / Chowell-Puente, Gerardo (Committee member) / Arizona State University (Publisher)
Created2011
Description
Introductory programming courses, also known as CS1, have a specific set of expected outcomes related to the learning of the most basic and essential computational concepts in computer science (CS). However, two of the most often heard complaints in such courses are that (1) they are divorced from the reality of application and (2) they make the learning of the basic concepts tedious. The concepts introduced in CS1 courses are highly abstract and not easily comprehensible. In general, the difficulty is intrinsic to the field of computing, often described as "too mathematical or too abstract." This dissertation presents a small-scale mixed-method study conducted during the fall 2009 semester of CS1 courses at Arizona State University. This study explored and assessed students' comprehension of three core computational concepts (abstraction, arrays of objects, and inheritance) in both algorithm design and problem solving. Through this investigation, student profiles were categorized based on their scores, and their mistakes were classified into instances of five computational thinking concepts: abstraction, algorithm, scalability, linguistics, and reasoning. It was shown that even though the notion of computational thinking is not explicit in the curriculum, participants possessed and/or developed this skill through the learning and application of the CS1 core concepts. Furthermore, problem-solving experiences had a direct impact on participants' knowledge skills, explanation skills, and confidence. Implications for teaching CS1 and for future research are also considered.
ContributorsBillionniere, Elodie V (Author) / Collofello, James (Thesis advisor) / Ganesh, Tirupalavanam G. (Thesis advisor) / VanLehn, Kurt (Committee member) / Burleson, Winslow (Committee member) / Arizona State University (Publisher)
Created2011
Description
The complexity of the systems that software engineers build has continuously grown since the inception of the field. What has not changed is the engineers' mental capacity to operate on about seven distinct pieces of information at a time. The widespread use of UML has led to more abstract software design activities; however, the same cannot be said for reverse engineering activities. The introduction of abstraction to reverse engineering will allow the engineer to move farther away from the details of the system, increasing their ability to see the role that domain-level concepts play in the system. In this thesis, we present a technique that facilitates filtering of classes from existing systems at the source level based on their relationship to concepts in the domain, via a classification method using machine learning. We showed that concepts can be identified using a machine learning classifier based on source-level metrics. We developed an Eclipse plugin to assist with the process of manually classifying Java source code and collecting metrics and classifications into a standard file format. We developed a second Eclipse plugin to act as a concept identifier that visually indicates whether a class is a domain concept or not. We minimized the size of training sets to ensure a useful approach in practice. This allowed us to determine that a training set of 7.5% to 10% is nearly as effective as a training set representing 50% of the system. We showed that random selection is the most consistent and effective means of selecting a training set. We found that KNN is the most consistent performer among the learning algorithms tested. We determined the optimal feature set for this classification problem. We discussed two possible structures besides a one-to-one mapping of domain knowledge to implementation. We showed that classes representing more than one concept are simply concepts at differing levels of abstraction.
We also discussed composite concepts representing a domain concept implemented by more than one class. We showed that these composite concepts are difficult to detect because the problem is NP-complete.
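The KNN classification step described above can be sketched in a few lines; the feature vectors, labels, and "metrics" below are hypothetical stand-ins, not the thesis's actual source-level metric set.

```python
import math
from collections import Counter

def knn_classify(train, query, k=3):
    """Classify a metrics vector by majority vote among its k nearest
    labeled neighbors under Euclidean distance."""
    dists = sorted(
        (math.dist(features, query), label) for features, label in train
    )
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Hypothetical training set: (metrics vector, label). The features might
# stand for e.g. (method count, field count, coupling) -- illustrative only.
train = [
    ((12, 8, 3), "concept"),
    ((10, 6, 2), "concept"),
    ((2, 1, 9), "utility"),
    ((3, 0, 8), "utility"),
]

print(knn_classify(train, (11, 7, 2)))  # -> concept
```

With a labeled training set this small, the choice of which classes get labeled matters a great deal, which is consistent with the thesis's finding that random selection of the training set was the most reliable strategy.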
ContributorsCarey, Maurice (Author) / Colbourn, Charles (Thesis advisor) / Collofello, James (Thesis advisor) / Davulcu, Hasan (Committee member) / Sarjoughian, Hessam S. (Committee member) / Ye, Jieping (Committee member) / Arizona State University (Publisher)
Created2013
Description
Solution methods for certain linear and nonlinear evolution equations are presented in this dissertation. Emphasis is placed mainly on the analytical treatment of nonautonomous differential equations, which are challenging to solve despite the numerical and symbolic software currently available. Ideas from transformation theory are adopted, allowing one to solve the problems under consideration from a non-traditional perspective. First, the Cauchy initial value problem is considered for a class of nonautonomous and inhomogeneous linear diffusion-type equations on the entire real line. Explicit transformations are used to reduce the equations under study to their corresponding standard forms, emphasizing natural relations with certain Riccati- (and/or Ermakov-) type systems. These relations give solvability results for the Cauchy problem of the parabolic equation considered. The superposition principle allows one to solve this problem formally from an unconventional point of view. An eigenfunction expansion approach is also considered for this general evolution equation. Examples considered to corroborate the efficacy of the proposed solution methods include the Fokker-Planck equation, the Black-Scholes model, and the one-factor Gaussian Hull-White model. The results obtained in the first part are then used to solve the Cauchy initial value problem for a certain inhomogeneous Burgers-type equation. The connection between linear (diffusion-type) and nonlinear (Burgers-type) parabolic equations is stressed in order to establish a strong commutative relation. Traveling wave solutions of a nonautonomous Burgers equation are also investigated. Finally, the minimum-uncertainty squeezed states for quantum harmonic oscillators are constructed explicitly. They are derived by the action of the corresponding maximal kinematical invariance group on the standard ground-state solution.
It is shown that the product of the variances attains the required minimum value only at the instants when one variance is a minimum and the other is a maximum, that is, when squeezing of one of the variances occurs. Such an explicit construction is possible due to the relation between the diffusion-type equation studied in the first part and the time-dependent Schrödinger equation. A modification of the radiation field operators for squeezed photons in a perfect cavity is also suggested with the help of a nonstandard solution of Heisenberg's equation of motion.
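The linear-nonlinear connection mentioned above is, in the standard autonomous case, the classical Cole-Hopf transformation; the thesis works with nonautonomous generalizations, so the formula below is the textbook special case rather than the thesis's own result:

```latex
% Cole--Hopf: a logarithmic-derivative substitution linearizes Burgers' equation
u(x,t) = -2\nu\,\frac{\varphi_x(x,t)}{\varphi(x,t)},
\qquad
\varphi_t = \nu\,\varphi_{xx}
\;\Longrightarrow\;
u_t + u\,u_x = \nu\,u_{xx}.
```

Solving the linear diffusion-type equation for φ therefore yields solutions of the Burgers-type equation for u, which is the sense in which results for the first problem transfer to the second.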
ContributorsVega-Guzmán, José Manuel, 1982- (Author) / Suslov, Sergei K (Thesis advisor) / Castillo-Chavez, Carlos (Thesis advisor) / Platte, Rodrigo (Committee member) / Chowell-Puente, Gerardo (Committee member) / Arizona State University (Publisher)
Created2013
Description
Biological systems are complex in many dimensions as endless transportation and communication networks all function simultaneously. Our ability to intervene within both healthy and diseased systems is tied directly to our ability to understand and model core functionality. The progress in increasingly accurate and thorough high-throughput measurement technologies has provided a deluge of data from which we may attempt to infer a representation of the true genetic regulatory system. A gene regulatory network model, if accurate enough, may allow us to perform hypothesis testing in the form of computational experiments. Of great importance to modeling accuracy is the acknowledgment of biological contexts within the models, i.e., recognizing the heterogeneous nature of the true biological system and the data it generates. This marriage of engineering, mathematics, and computer science with systems biology creates a cycle of progress between computer simulation and lab experimentation, rapidly translating interventions and treatments for patients from the bench to the bedside. This dissertation first discusses the landscape for modeling the biological system, then explores the identification of targets for intervention in Boolean network models of biological interactions, and finally examines context specificity, both in new graphical depictions of models embodying context-specific genomic regulation and in novel analysis approaches designed to reveal embedded contextual information. Overall, the dissertation explores a spectrum of biological modeling with a goal of therapeutic intervention, with both formal and informal notions of biological context, in such a way that will enable future work to have an even greater impact in terms of direct patient benefit at an individualized level.
ContributorsVerdicchio, Michael (Author) / Kim, Seungchan (Thesis advisor) / Baral, Chitta (Committee member) / Stolovitzky, Gustavo (Committee member) / Collofello, James (Committee member) / Arizona State University (Publisher)
Created2013
Description
Laboratory automation systems have seen many technological advances in recent times. As a result, the software written for them is becoming increasingly sophisticated. Existing software architectures and standards are targeted at a wider domain of software development and need to be customized before they can be used to develop software for laboratory automation systems. This thesis proposes an architecture that is based on existing software architectural paradigms and is specifically tailored to developing software for a laboratory automation system. The architecture is based on fairly autonomous software components that can be distributed across multiple computers. The components in the architecture communicate asynchronously by passing messages to one another. The architecture can be used to develop software that is distributed, responsive, and thread-safe. The thesis also proposes a framework that has been developed to implement the ideas proposed by the architecture. The framework is used to develop software that is scalable, distributed, responsive, and thread-safe. The framework currently has components to control commonly used laboratory automation devices such as mechanical stages and cameras, and to perform common laboratory automation tasks such as imaging.
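The component-plus-message-passing style described above can be sketched with thread-safe mailboxes; the class, component names ("stage", "camera"), and message vocabulary are illustrative inventions, not the thesis's actual framework API.

```python
import queue
import threading

class Component:
    """A fairly autonomous component that communicates with its peers only
    through asynchronous message passing, in the spirit of the architecture
    described above. Names here are illustrative, not from the thesis."""

    def __init__(self, name):
        self.name = name
        self.inbox = queue.Queue()   # thread-safe mailbox
        self.peers = {}
        self.received = []

    def send(self, peer_name, message):
        # Non-blocking: deposit the message in the peer's inbox and return.
        self.peers[peer_name].inbox.put((self.name, message))

    def run(self):
        # Event loop: block on the inbox and react to each message.
        while True:
            sender, message = self.inbox.get()
            if message == "stop":
                break
            self.received.append((sender, message))

stage, camera = Component("stage"), Component("camera")
stage.peers["camera"] = camera

worker = threading.Thread(target=camera.run)
worker.start()
stage.send("camera", "acquire_image")   # asynchronous: returns immediately
stage.send("camera", "stop")
worker.join()
print(camera.received)
```

Because each component touches only its own state and reads from a thread-safe queue, the sketch stays responsive and thread-safe without explicit locks, which is the property the architecture is after.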
ContributorsKuppuswamy, Venkataramanan (Author) / Meldrum, Deirdre (Thesis advisor) / Collofello, James (Thesis advisor) / Sarjoughian, Hessam S. (Committee member) / Johnson, Roger (Committee member) / Arizona State University (Publisher)
Created2012
Description
Consideration of both biological and human-use dynamics in coupled social-ecological systems is essential for the success of interventions such as marine reserves. As purely human institutions, marine reserves have no direct effects on ecological systems. Consequently, the success of a marine reserve depends on managers' ability to alter human behavior in the direction and magnitude that support reserve objectives. Further, a marine reserve is just one component in a larger coupled social-ecological system. The social, economic, political, and biological landscape all determine the social acceptability of a reserve, the conflicts that arise, how the reserve interacts with existing fisheries management, the accuracy of reserve monitoring, and whether the reserve is ultimately able to meet conservation and fishery enhancement goals. Just as the social-ecological landscape is critical at all stages of a marine reserve, from initial establishment to maintenance, the reserve in turn interacts with biological and human-use dynamics beyond its borders. Those interactions can lead to the failure of a reserve to meet management goals, or can compromise management goals outside the reserve. I use a bio-economic model of a fishery in a spatially patchy environment to demonstrate how the pre-reserve fisheries management strategy determines the pattern of fishing effort displacement once the reserve is established, and I discuss the social, political, and biological consequences of different patterns for the reserve and the fishery. Using a stochastic bio-economic model, I demonstrate how biological and human-use connectivity can confound the accurate detection of reserve effects by violating assumptions in the quasi-experimental framework. Finally, I examine data on recreational fishing site selection to investigate changes in response to the announcement of enforcement of a marine reserve in the Gulf of California, Mexico.
I generate a scale of fines that would fully or partially protect the reserve, providing a data-driven way for managers to balance biological and socio-economic goals. I suggest that natural resource managers consider human use dynamics with the same frequency, rigor, and tools as they do biological stocks.
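The effort-displacement mechanism described above can be sketched with a toy two-patch model: logistic stock growth with harvest in each patch, where closing one patch shifts its effort onto the other. All parameter values and the Euler time-stepping are illustrative assumptions, not the dissertation's actual bio-economic model.

```python
def simulate(effort, steps=200, r=0.5, K=1.0, q=0.5, dt=0.1):
    """Euler-stepped logistic stock dynamics in two patches under fixed
    fishing effort: dB/dt = r*B*(1 - B/K) - q*E*B in each patch.
    Parameter values are illustrative, not estimated from data."""
    stocks = [K, K]
    for _ in range(steps):
        stocks = [
            b + dt * (r * b * (1 - b / K) - q * e * b)
            for b, e in zip(stocks, effort)
        ]
    return stocks

# Pre-reserve: effort spread evenly over both patches.
open_access = simulate([0.5, 0.5])
# Post-reserve: patch 0 closed, all displaced effort concentrates on patch 1.
with_reserve = simulate([0.0, 1.0])
print(open_access, with_reserve)
```

Under these illustrative parameters the protected stock recovers toward carrying capacity while the displaced effort depresses the open patch, which is the kind of spillover consequence the dissertation traces back to the pre-reserve management regime.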
ContributorsFujitani, Marie (Author) / Abbott, Joshua (Thesis advisor) / Fenichel, Eli (Thesis advisor) / Gerber, Leah (Committee member) / Anderies, John (Committee member) / Arizona State University (Publisher)
Created2014
Description
Single cell analysis has become increasingly important in understanding disease onset, progression, treatment, and prognosis, especially when applied to cancer, where cellular responses are highly heterogeneous. Through the advent of single cell computerized tomography (Cell-CT), researchers and clinicians now have the ability to obtain high-resolution three-dimensional (3D) reconstructions of single cells. Yet to date, no live-cell-compatible version of the technology exists. In this thesis, a microfluidic chip with the ability to rotate live single cells in hydrodynamic microvortices about an axis parallel to the optical focal plane has been demonstrated. The chip utilizes a novel 3D microchamber design arranged beneath a main channel, creating flow detachment into the chamber and producing recirculating flow conditions. Single cells are flowed through the main channel, held in the center of the microvortex by an optical trap, and rotated by the forces induced by the recirculating fluid flow. Computational fluid dynamics (CFD) was employed to optimize the geometry of the microchamber. Two methods for the fabrication of the 3D microchamber were devised: anisotropic etching of silicon and backside diffuser photolithography (BDPL). First, the optimization of the silicon etching conditions was demonstrated through design of experiments (DOE). In addition, a non-conventional method of soft lithography was demonstrated that incorporates the use of two positive molds, one of the main channel and the other of the microchambers, compressed together during replication to produce a single ultra-thin (<200 µm) negative used for device assembly. Second, methods for using thick negative photoresists such as SU-8 with BDPL have been developed, including a new simple and effective method for promoting the adhesion of SU-8 to glass. An assembly method that bonds two individual ultra-thin (<100 µm) replications of the channel and the microfeatures has also been demonstrated.
Finally, a pressure-driven pumping system with nanoliter-per-minute flow rate regulation, sub-second response times, and <3% flow variability has been designed and characterized. The fabrication and assembly of this device is inexpensive and utilizes simple variants of conventional microfluidic fabrication techniques, making it easily accessible to the single cell analysis community.
ContributorsMyers, Jakrey R (Author) / Meldrum, Deirdre (Thesis advisor) / Johnson, Roger (Committee member) / Frakes, David (Committee member) / Arizona State University (Publisher)
Created2012
Description
The pay-as-you-go economic model of cloud computing increases the visibility, traceability, and verifiability of software costs. Application developers must understand how their software uses resources when running in the cloud in order to stay within budgeted costs and/or produce expected profits. Cloud computing's unique economic model also leads naturally to an earn-as-you-go profit model for many cloud-based applications. These applications can benefit from low-level analyses for cost optimization and verification. Testing cloud applications to ensure they meet monetary cost objectives has not been well explored in the current literature. When considering revenues and costs for cloud applications, the resource economic model can be scaled down to the transaction level in order to associate source code with costs incurred while running in the cloud. Both static and dynamic analysis techniques can be developed and applied to understand how and where cloud applications incur costs. Such analyses can help optimize (i.e., minimize) costs and verify that they stay within expected tolerances. An adaptation of Worst Case Execution Time (WCET) analysis is presented here to statically determine the worst-case monetary costs of cloud applications. This analysis is used to produce an algorithm for determining control flow paths within an application that can exceed a given cost threshold. The corresponding results are used to identify path sections that contribute most to cost excess. A hybrid approach for determining cost excesses is also presented that is comprised mostly of dynamic measurements but also incorporates calculations based on the static analysis approach. This approach uses operational profiles to increase the precision and usefulness of the calculations.
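The WCET-style idea above, which is finding the most expensive path through a control-flow graph, can be sketched as a longest-path computation over an acyclic graph. The node names and per-node dollar costs are hypothetical, and this toy ignores loops and the thesis's actual cost model.

```python
import functools

def worst_case_cost(cfg, costs, entry, exit_node):
    """Most expensive entry-to-exit path through an acyclic control-flow
    graph: a toy analogue of the WCET-style monetary cost analysis.
    Memoization makes this linear in the number of edges."""

    @functools.lru_cache(maxsize=None)
    def best(node):
        if node == exit_node:
            return costs[node]
        return costs[node] + max(best(succ) for succ in cfg[node])

    return best(entry)

# Hypothetical CFG: a branch where one arm is cheap and the other
# triggers costly cloud I/O (illustrative dollar amounts per node).
cfg = {"entry": ["fast", "slow"], "fast": ["exit"], "slow": ["exit"]}
costs = {"entry": 0.01, "fast": 0.02, "slow": 0.50, "exit": 0.01}

print(worst_case_cost(cfg, costs, "entry", "exit"))  # the 'slow' arm dominates
```

Comparing the worst-case total against a budget threshold is then a single inequality, and recording which branch attains the maximum identifies the path sections that contribute most to any cost excess.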
ContributorsBuell, Kevin, Ph.D (Author) / Collofello, James (Thesis advisor) / Davulcu, Hasan (Committee member) / Lindquist, Timothy (Committee member) / Sen, Arunabha (Committee member) / Arizona State University (Publisher)
Created2012
Description
Mortality from the 1918 influenza virus was high, partly due to bacterial coinfections. We characterize pandemic mortality in Arizona, which had a high prevalence of tuberculosis. We applied regression models to over 35,000 data points to estimate the basic reproduction number and excess mortality. Age-specific mortality curves show elevated mortality for all age groups, especially the young, as well as a senior-sparing effect. The low value of the reproduction number indicates that transmissibility was moderately low.
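One standard way to estimate the basic reproduction number from early epidemic data is to regress log counts on time and convert the growth rate with an SIR-type formula. The sketch below uses synthetic counts, not the Arizona records, and the one-week infectious period is an illustrative assumption, not the thesis's estimate.

```python
import math

def growth_rate(counts):
    """Least-squares slope of log(counts) against time: the early-epidemic
    exponential growth rate r per time unit."""
    n = len(counts)
    xs, ys = list(range(n)), [math.log(c) for c in counts]
    xbar, ybar = sum(xs) / n, sum(ys) / n
    num = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
    den = sum((x - xbar) ** 2 for x in xs)
    return num / den

# Synthetic weekly counts growing by exactly 20% per week.
counts = [10 * 1.2 ** k for k in range(6)]
r = growth_rate(counts)

# Simple SIR-type approximation: R0 ~= 1 + r * T, where T is the mean
# infectious period (taken here as 1 week, an illustrative assumption).
print(f"r = {r:.3f} per week, R0 ~= {1 + r * 1.0:.2f}")
```

A growth rate this small yields a reproduction number only modestly above 1, illustrating how a low fitted r translates into the "moderately low transmissibility" conclusion drawn in the abstract.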
ContributorsJenner, Melinda Eva (Author) / Chowell-Puente, Gerardo (Thesis director) / Kostelich, Eric (Committee member) / Barrett, The Honors College (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / School of Life Sciences (Contributor)
Created2015-05