Description
Finding the optimal solution to a problem with an enormous search space can be challenging. Unless a combinatorial construction technique is found that also guarantees the optimality of the resulting solution, this can be an infeasible task. If such a technique is unavailable, different heuristic methods are generally used to improve the upper bound on the size of the optimal solution. This dissertation presents an alternative method which can be used to improve a solution to a problem rather than construct a solution from scratch. Necessity analysis, which is the key to this approach, is the process of analyzing the necessity of each element in a solution. The post-optimization algorithm presented here utilizes the result of the necessity analysis to improve the quality of the solution by eliminating unnecessary objects from the solution. While this technique could potentially be applied to different domains, this dissertation focuses on k-restriction problems, where a solution to the problem can be presented as an array. A scalable post-optimization algorithm for covering arrays is described, which starts from a valid solution and performs necessity analysis to iteratively improve the quality of the solution. It is shown that not only can this technique improve upon the previously best known results, but it can also be added as a refinement step to any construction technique, with further improvements expected in most cases. The post-optimization algorithm is then modified to accommodate every k-restriction problem, and this generic algorithm can be used as a starting point to create a reasonably sized solution for any such problem. This generic algorithm is then further refined for hash family problems by adding a conflict graph analysis to the necessity analysis phase. By recoloring the conflict graphs, a new degree of flexibility is explored, which can further improve the quality of the solution.
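The necessity-analysis idea in this abstract can be illustrated with a toy sketch (an illustration only, not the dissertation's scalable algorithm): start from a valid strength-2 covering array, flag rows whose removal still leaves every pairwise interaction covered, and delete them one at a time.

```python
from itertools import combinations, product

def pairs_covered(array, k):
    """Set of (col_i, col_j, val_i, val_j) interactions covered by the rows."""
    cov = set()
    for row in array:
        for i, j in combinations(range(k), 2):
            cov.add((i, j, row[i], row[j]))
    return cov

def necessity_analysis(array, k, v=2):
    """Indices of rows that are not strictly necessary: every interaction
    they cover is also covered by some other row."""
    required = {(i, j, a, b) for i, j in combinations(range(k), 2)
                for a, b in product(range(v), repeat=2)}
    unnecessary = []
    for r in range(len(array)):
        rest = array[:r] + array[r + 1:]
        if required <= pairs_covered(rest, k):
            unnecessary.append(r)
    return unnecessary

def post_optimize(array, k, v=2):
    """Greedily remove unnecessary rows until every remaining row is needed."""
    array = list(array)
    while True:
        idle = necessity_analysis(array, k, v)
        if not idle:
            return array
        del array[idle[0]]   # remove one row, then re-analyze
```

Starting from the full 2^3 factorial (8 rows), this sketch shrinks the array to 4 rows, which is optimal for strength 2 with three binary columns.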
ContributorsNayeri, Peyman (Author) / Colbourn, Charles (Thesis advisor) / Konjevod, Goran (Thesis advisor) / Sen, Arunabha (Committee member) / Stanzione Jr, Daniel (Committee member) / Arizona State University (Publisher)
Created2011
Description
Reverse engineering gene regulatory networks (GRNs) is an important problem in the domain of systems biology. Learning GRNs is challenging due to the inherent complexity of real regulatory networks and the heterogeneity of samples in available biomedical data. Real-world biological data are commonly collected from broad surveys (profiling studies) and aggregate highly heterogeneous biological samples. Popular methods to learn GRNs simplistically assume a single universal regulatory network corresponding to the available data. They neglect regulatory network adaptation due to changes in underlying conditions, cellular phenotype, or both. This dissertation presents a novel computational framework to learn the common regulatory interactions and networks underlying different sets of relatively homogeneous samples from real-world biological data. The characteristic set of samples/conditions and corresponding regulatory interactions defines the cellular context (context). Context, in this dissertation, represents the deterministic transcriptional activity within a specific cellular regulatory mechanism. The major contributions of this framework include: modeling and learning context-specific GRNs; associating enriched samples with contexts to interpret contextual interactions using biological knowledge; pruning extraneous edges from the context-specific GRN to improve the precision of the final GRNs; integrating multisource data to learn inter- and intra-domain interactions and increase confidence in the obtained GRNs; and finally, learning combinatorial conditioning factors from the data to identify regulatory cofactors. The framework, Expattern, was applied to both real-world and synthetic data. Interesting insights were obtained into the mechanism of action of drugs from analysis of NCI60 drug activity and gene expression data. Application to refractory cancer data and Glioblastoma multiforme yielded GRNs that were readily annotated with context-specific phenotypic information. Refractory cancer GRNs also displayed associations between distinct cancers that were not observed through clustering alone. Performance comparisons on multi-context synthetic data show that the framework Expattern performs better than other comparable methods.
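As a loose illustration of the context idea (not the Expattern algorithm itself; the gene names, threshold, and correlation test below are assumptions), one can compute a separate co-expression network per predefined sample context and compare which edges appear only in one context:

```python
import math

def pearson(x, y):
    """Pearson correlation of two equal-length numeric sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy) if sx and sy else 0.0

def context_specific_edges(expr, contexts, threshold=0.8):
    """expr: {gene: [expression value per sample]};
    contexts: {context name: [sample indices]}.
    Returns, per context, the gene pairs strongly co-expressed within
    that context's samples."""
    genes = sorted(expr)
    nets = {}
    for name, idx in contexts.items():
        edges = set()
        for i in range(len(genes)):
            for j in range(i + 1, len(genes)):
                x = [expr[genes[i]][s] for s in idx]
                y = [expr[genes[j]][s] for s in idx]
                if abs(pearson(x, y)) >= threshold:
                    edges.add((genes[i], genes[j]))
        nets[name] = edges
    return nets
```

A pooled analysis over all samples could miss an edge that is strong in one context but absent in another, which is the motivation for context-specific learning.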
ContributorsSen, Ina (Author) / Kim, Seungchan (Thesis advisor) / Baral, Chitta (Committee member) / Bittner, Michael (Committee member) / Konjevod, Goran (Committee member) / Arizona State University (Publisher)
Created2011
Description
This dissertation studies routing in small-world networks such as grids plus long-range edges and real networks. Kleinberg showed that geography-based greedy routing in a grid-based network takes an expected number of steps polylogarithmic in the network size, thus justifying empirical efficiency observed beginning with Milgram. A counterpart for the grid-based model is provided; it creates all edges deterministically and shows an asymptotically matching upper bound on the route length. The main goal is to improve greedy routing through a decentralized machine learning process. Two considered methods are based on weighted majority and an algorithm of de Farias and Megiddo, both learning from feedback using ensembles of experts. Tests are run on both artificial and real networks, with decentralized spectral graph embedding supplying geometric information for real networks where it is not intrinsically available. An important measure analyzed in this work is overpayment, the difference between the cost of the method and that of the shortest path. Adaptive routing overtakes greedy after about a hundred or fewer searches per node, consistently across different network sizes and types. Learning stabilizes, typically at overpayment of a third to a half of that by greedy. The problem is made more difficult by eliminating the knowledge of neighbors' locations or by introducing uncooperative nodes. Even under these conditions, the learned routes are usually better than the greedy routes. The second part of the dissertation is related to the community structure of unannotated networks. A modularity-based algorithm of Newman is extended to work with overlapping communities (including considerably overlapping communities), where each node locally makes decisions to which potential communities it belongs. To measure quality of a cover of overlapping communities, a notion of a node contribution to modularity is introduced, and subsequently the notion of modularity is extended from partitions to covers. The final part considers a problem of network anonymization, mostly by the means of edge deletion. The point of interest is utility preservation. It is shown that a concentration on the preservation of routing abilities might damage the preservation of community structure, and vice versa.
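Kleinberg-style greedy geographic routing, the baseline against which the learned routing is compared, can be sketched as follows (the uniform long-range contact here is a simplification of Kleinberg's distance-based distribution, chosen only to keep the example short):

```python
import random

def greedy_route(neighbors, src, dst):
    """Greedy geographic routing: at each hop forward to the neighbour
    closest (in Manhattan distance) to dst; give up if no neighbour
    makes progress."""
    def dist(a, b):
        return abs(a[0] - b[0]) + abs(a[1] - b[1])
    path, cur = [src], src
    while cur != dst:
        nxt = min(neighbors[cur], key=lambda v: dist(v, dst))
        if dist(nxt, dst) >= dist(cur, dst):
            return None                      # stuck in a local minimum
        cur = nxt
        path.append(cur)
    return path

def grid_with_long_range(n, seed=0):
    """n x n grid edges plus one random long-range contact per node
    (a uniform stand-in for Kleinberg's distance^-2 distribution)."""
    rng = random.Random(seed)
    nodes = [(i, j) for i in range(n) for j in range(n)]
    nbrs = {u: set() for u in nodes}
    for i, j in nodes:
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            if (i + di, j + dj) in nbrs:
                nbrs[(i, j)].add((i + di, j + dj))
        nbrs[(i, j)].add(rng.choice(nodes))  # one long-range edge
    return nbrs
```

On a pure grid every hop shortens the Manhattan distance by exactly one; the long-range edges are what let greedy occasionally jump much closer, and the adaptive methods in the dissertation learn when taking them pays off.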
ContributorsBakun, Oleg (Author) / Konjevod, Goran (Thesis advisor) / Richa, Andrea (Thesis advisor) / Syrotiuk, Violet R. (Committee member) / Czygrinow, Andrzej (Committee member) / Arizona State University (Publisher)
Created2011
Description
The use of synthetic cathinones or "bath salts" has risen dramatically in recent years, with one of the most popular being methylenedioxypyrovalerone (MDPV). Following the temporary legislative ban on the sale and distribution of this compound, a multitude of other cathinone derivatives have been synthesized. The current study seeks to compare the abuse potential of MDPV with that of one of the emergent synthetic cathinones, 4-methylethcathinone (4-MEC), based on their respective abilities to lower current thresholds in an intracranial self-stimulation (ICSS) paradigm. Following acute administration (0.1, 0.5, 1, and 2 mg/kg i.p.), MDPV was found to significantly lower ICSS thresholds at all doses tested (F4,35 = 11.549, p < 0.001). However, following acute administration (0.3, 1, 3, 10, and 30 mg/kg i.p.), 4-MEC produced no significant ICSS threshold depression (F5,135 = 0.622, p = 0.684). Together these findings suggest that while MDPV may possess significant abuse potential, other synthetic cathinones such as 4-MEC may have a drastically reduced potential for abuse.
ContributorsWegner, Scott Andrew (Author) / Olive, M. Foster (Thesis director) / Presson, Clark (Committee member) / Sanabria, Federico (Committee member) / Barrett, The Honors College (Contributor) / Department of Chemistry and Biochemistry (Contributor) / Department of Psychology (Contributor)
Created2013-05
Description
Chronic restraint stress impairs hippocampal-mediated spatial learning and memory, which improves following a post-stress recovery period. Here, we investigated whether brain derived neurotrophic factor (BDNF), a protein important for hippocampal function, would alter the recovery from chronic stress-induced spatial memory deficits. Adult male Sprague-Dawley rats were infused into the hippocampus with adeno-associated viral vectors containing the coding sequence for short interfering (si)RNA directed against BDNF or a scrambled sequence (Scr), with both containing the coding information for green fluorescent protein to aid in anatomical localization. Rats were then chronically restrained (wire mesh, 6h/d/21d) and assessed for spatial learning and memory using a radial arm water maze (RAWM) either immediately after stressor cessation (Str-Imm) or following a 21-day post-stress recovery period (Str-Rec). All groups learned the RAWM task similarly, but differed on the memory retention trial. Rats in the Str-Imm group, regardless of viral vector contents, committed more errors in the spatial reference memory domain than did non-stressed controls. Importantly, the typical improvement in spatial memory following recovery from chronic stress was blocked with the siRNA against BDNF, as Str-Rec-siRNA performed worse on the RAWM compared to the non-stressed controls or Str-Rec-Scr. These effects were specific for the reference memory domain, as repeated entry errors that reflect spatial working memory were unaffected by stress condition or viral vector contents. These results demonstrate that hippocampal BDNF is necessary for the recovery from stress-induced hippocampal-dependent spatial memory deficits in the reference memory domain.
ContributorsOrtiz, J. Bryce (Author) / Conrad, Cheryl D. (Thesis advisor) / Olive, M. Foster (Committee member) / Taylor, Sara (Committee member) / Bimonte-Nelson, Heather A. (Committee member) / Arizona State University (Publisher)
Created2013
Description
The brain is a fundamental target of the stress response that promotes adaptation and survival, but repeated activation of the stress response has the potential to alter cognition, emotion, and motivation, key functions of the limbic system. Three structures of the limbic system in particular, the hippocampus, medial prefrontal cortex (mPFC), and amygdala, are of special interest due to documented structural changes and their implication in post-traumatic stress disorder (PTSD). One notable chronic stress-induced change is dendritic arbor restructuring, which reflects plasticity patterns in parallel with the direction of alterations observed in functional imaging studies in PTSD patients. For instance, chronic stress produces dendritic retraction in the hippocampus and mPFC, but dendritic hypertrophy in the amygdala, consistent with functional imaging in patients with PTSD. Some have hypothesized that these limbic regions' modifications contribute to one's susceptibility to develop PTSD following a traumatic event. Consequently, we used a familiar chronic stress procedure in a rat model to create a vulnerable brain that might develop traits consistent with PTSD when presented with a challenge. In adult male rats, chronic stress by wire mesh restraint (6h/d/21d) was followed by a variety of behavioral tasks, including the radial arm water maze (RAWM), fear conditioning and extinction, and fear memory reconsolidation, to determine chronic stress effects on behaviors mediated by these limbic structures. In chapter 2, we corroborated past findings that chronic stress caused hippocampal CA3 dendritic retraction. Importantly, we present new findings that CA3 dendritic retraction corresponded with poor spatial memory in the RAWM and that these outcomes reversed after a recovery period. In chapter 3, we also showed that chronic stress impaired mPFC-mediated extinction memory, findings that others have reported. Using carefully assessed behavior, we present new findings that chronic stress impacted nonassociative fear by enhancing contextual fear during extinction that generalized to a new context. Moreover, the generalization behavior corresponded with enhanced functional activation in the hippocampus and amygdala during fear extinction memory retrieval. In chapter 5, we showed for the first time that chronic stress enhanced amygdala functional activation during fear memory retrieval, i.e., reactivation. Moreover, these enhanced fear memories were resistant to interference with protein synthesis during reconsolidation, a novel attempt to weaken a previously formed, chronic stress-enhanced traumatic memory. Collectively, these studies demonstrated the plastic and dynamic effects of chronic stress on the limbic neurocircuitry implicated in PTSD. We showed that chronic stress created a structural and functional imbalance across the hippocampus, mPFC, and amygdala, which led to a PTSD-like phenotype with persistent and exaggerated fear following fear conditioning. These behavioral disruptions, in conjunction with morphological and functional imaging data, reflect a chronic stress-induced imbalance between hippocampal and mPFC regulation in favor of amygdala functional overdrive, and support a novel approach for traumatic memory processing in PTSD.
ContributorsHoffman, Ann (Author) / Conrad, Cheryl D. (Thesis advisor) / Olive, M. Foster (Committee member) / Hammer, Jr., Ronald P. (Committee member) / Sanabria, Federico (Committee member) / Arizona State University (Publisher)
Created2013
Description
Surgery as a profession requires significant training to improve both clinical decision making and psychomotor proficiency. In the medical knowledge domain, tools have been developed, validated, and accepted for evaluation of surgeons' competencies. However, assessment of psychomotor skills still relies on the Halstedian model of apprenticeship, wherein surgeons are observed during residency for judgment of their skills. Although the value of this method of skills assessment cannot be ignored, novel methodologies of objective skills assessment need to be designed, developed, and evaluated to augment the traditional approach. Several sensor-based systems have been developed to measure a user's skill quantitatively, but the use of sensors could interfere with skill execution and thus limit the potential for evaluating real-life surgery. However, having a method to judge skills automatically in real-life conditions should be the ultimate goal, since only with such features would a system be widely adopted. This research proposes a novel video-based approach for observing surgeons' hand and surgical tool movements in minimally invasive surgical training exercises as well as during laparoscopic surgery. Because the system does not require surgeons to wear special sensors, it has the distinct advantage over alternatives of offering skills assessment in both learning and real-life environments. The system automatically detects major skill-measuring features from surgical task videos using a series of computer vision algorithms and provides on-screen real-time performance feedback for more efficient skill learning. Finally, a machine-learning approach is used to develop an observer-independent composite scoring model through objective and quantitative measurement of surgical skills. To increase the effectiveness and usability of the developed system, it is integrated with a cloud-based tool, which automatically assesses surgical videos uploaded to the cloud.
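A hypothetical sketch of the feature-extraction step (the actual system derives features via computer-vision tracking and a learned composite score; the metric choices and sampling rate below are assumptions): given a tracked tool-tip trajectory, motion economy and smoothness can be summarized as total path length and mean absolute jerk.

```python
import math

def motion_metrics(traj, dt=1 / 30):
    """traj: list of (x, y) tool-tip positions sampled every dt seconds.
    Returns (path length, mean absolute jerk): two features commonly
    used to quantify motion economy and smoothness."""
    def d(a, b):
        return math.hypot(b[0] - a[0], b[1] - a[1])
    path_len = sum(d(traj[i], traj[i + 1]) for i in range(len(traj) - 1))
    # finite-difference speed, acceleration, and jerk magnitudes
    vel = [d(traj[i], traj[i + 1]) / dt for i in range(len(traj) - 1)]
    acc = [(vel[i + 1] - vel[i]) / dt for i in range(len(vel) - 1)]
    jerk = [(acc[i + 1] - acc[i]) / dt for i in range(len(acc) - 1)]
    mean_jerk = sum(abs(j) for j in jerk) / len(jerk) if jerk else 0.0
    return path_len, mean_jerk
```

A smooth, direct trajectory scores low on both metrics, while hesitant or erratic tool motion inflates them; such features can then feed a scoring model.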
ContributorsIslam, Gazi (Author) / Li, Baoxin (Thesis advisor) / Liang, Jianming (Thesis advisor) / Dinu, Valentin (Committee member) / Greenes, Robert (Committee member) / Smith, Marshall (Committee member) / Kahol, Kanav (Committee member) / Patel, Vimla L. (Committee member) / Arizona State University (Publisher)
Created2013
Description
The primary function of the medium access control (MAC) protocol is managing access to a shared communication channel. From the viewpoint of transmitters, the MAC protocol determines each transmitter's persistence, the fraction of time it is permitted to spend transmitting. Schedule-based schemes implement stable persistences, achieving low variation in delay and throughput, and sometimes bounding maximum delay. However, they adapt slowly, if at all, to changes in the network. Contention-based schemes are agile, adapting quickly to changes in perceived contention, but suffer from short-term unfairness, large variations in packet delay, and poor performance at high load. The perfect MAC protocol, it seems, embodies the strengths of both contention- and schedule-based approaches while avoiding their weaknesses. This thesis culminates in the design of a Variable-Weight and Adaptive Topology Transparent (VWATT) MAC protocol. The design of VWATT first required answers for two questions: (1) If a node is equipped with schedules of different weights, which weight should it employ? (2) How is the node to compute the desired weight in a network lacking centralized control? The first question is answered by the Topology- and Load-Aware (TLA) allocation which defines target persistences that conform to both network topology and traffic load. Simulations show the TLA allocation to outperform IEEE 802.11, improving on the expectation and variation of delay, throughput, and drop rate. The second question is answered in the design of an Adaptive Topology- and Load-Aware Scheduled (ATLAS) MAC that computes the TLA allocation in a decentralized and adaptive manner. Simulation results show that ATLAS converges quickly on the TLA allocation, supporting highly dynamic networks. With these questions answered, a construction based on transversal designs is given for a variable-weight topology transparent schedule that allows nodes to dynamically and independently select weights to accommodate local topology and traffic load. The schedule maintains a guarantee on maximum delay when the maximum neighbourhood size is not too large. The schedule is integrated with the distributed computation of ATLAS to create VWATT. Simulations indicate that VWATT offers the stable performance characteristics of a scheduled MAC while adapting quickly to changes in topology and traffic load.
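The variable-weight construction itself is beyond an abstract, but the classic fixed-weight polynomial schedule that topology-transparent MACs build on (and which corresponds to a transversal design) can be sketched as a simplified illustration, not VWATT itself: assign each node a distinct degree-k polynomial over GF(p); since two distinct degree-k polynomials agree in at most k points, any two nodes collide in at most k of the p frames.

```python
def tt_schedule(coeffs, p):
    """Topology-transparent transmission schedule from a polynomial over
    GF(p), given as coeffs [c0, c1, ...] for f(x) = c0 + c1*x + ...
    In frame i (of p frames), the node transmits in slot f(i) mod p.
    Distinct degree-k polynomials collide in at most k frames, so with
    enough frames relative to neighbourhood size, each node keeps at
    least one collision-free slot regardless of topology."""
    def f(x):
        return sum(c * pow(x, e, p) for e, c in enumerate(coeffs)) % p
    return [f(i) for i in range(p)]
```

For example, over GF(5) the nodes with f(x) = 1 + 2x and f(x) = x transmit in the same slot only in the single frame where the polynomials agree.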
ContributorsLutz, Jonathan (Author) / Colbourn, Charles J (Thesis advisor) / Syrotiuk, Violet R. (Thesis advisor) / Konjevod, Goran (Committee member) / Lloyd, Errol L. (Committee member) / Arizona State University (Publisher)
Created2013
Description
This dissertation investigates the condition of skeletal muscle insulin resistance using bioinformatics and computational biology approaches. Drawing from several studies and numerous data sources, I have attempted to uncover molecular mechanisms at multiple levels. From detailed atomistic simulations of a single protein to data-mining approaches applied at the systems biology level, I provide new targets to explore for the research community. Furthermore, I present a new online web resource that unifies various bioinformatics databases to enable discovery of relevant features in 3D protein structures.
ContributorsMielke, Clinton (Author) / Mandarino, Lawrence (Committee member) / LaBaer, Joshua (Committee member) / Magee, D. Mitchell (Committee member) / Dinu, Valentin (Committee member) / Willis, Wayne (Committee member) / Arizona State University (Publisher)
Created2013
Description
The living world we inhabit and observe is extraordinarily complex. From the perspective of a person analyzing data about the living world, complexity is most commonly encountered in two forms: 1) in the sheer size of the datasets that must be analyzed and the physical number of mathematical computations necessary to obtain an answer and 2) in the underlying structure of the data, which does not conform to classical normal theory statistical assumptions and includes clustering and unobserved latent constructs. Until recently, the methods and tools necessary to effectively address the complexity of biomedical data were not ordinarily available. The utility of four methods--High Performance Computing, Monte Carlo Simulations, Multi-Level Modeling and Structural Equation Modeling--designed to help make sense of complex biomedical data are presented here.
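Of the four methods named, Monte Carlo simulation is the easiest to sketch in a few lines (an illustrative example with assumed parameters, not drawn from the dissertation): estimate the statistical power of a two-group comparison by simulating the experiment many times and counting how often the test rejects.

```python
import random

def monte_carlo_power(effect=0.5, n=30, trials=2000, seed=1):
    """Estimate power of a two-sample comparison by simulation: draw two
    groups of n unit-variance normal samples (one shifted by `effect`),
    apply a two-sided z-style test on the difference of means at the
    5% level, and return the fraction of rejections."""
    rng = random.Random(seed)
    rejections = 0
    for _ in range(trials):
        a = [rng.gauss(0.0, 1.0) for _ in range(n)]
        b = [rng.gauss(effect, 1.0) for _ in range(n)]
        diff = sum(b) / n - sum(a) / n
        se = (2.0 / n) ** 0.5          # known unit variances
        if abs(diff) / se > 1.96:      # two-sided test at alpha = 0.05
            rejections += 1
    return rejections / trials
```

With no true effect the rejection rate hovers near the nominal 5% level, which is a quick sanity check on any such simulation.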
ContributorsBrown, Justin Reed (Author) / Dinu, Valentin (Thesis advisor) / Johnson, William (Committee member) / Petitti, Diana (Committee member) / Arizona State University (Publisher)
Created2012