Matching Items (1,041)
Item 150038
Description
In a large network (graph), it would be desirable to guarantee the existence of some local property based only on global knowledge of the network. Consider the following classical example: how many connections are necessary to guarantee that the network contains three nodes which are pairwise adjacent? It turns out that more than n^2/4 connections are needed, and no smaller number will suffice in general. Problems of this type fall into the category of "extremal graph theory." Generally speaking, extremal graph theory is the study of how global parameters of a graph are related to local properties. This dissertation deals with the relationship between minimum degree conditions of a host graph G and the property that G contains a specified spanning subgraph (or class of subgraphs). The goal is to find the optimal minimum degree which guarantees the existence of a desired spanning subgraph. This goal is achieved in four different settings, with the main tools being Szemeredi's Regularity Lemma; the Blow-up Lemma of Komlos, Sarkozy, and Szemeredi; and some basic probabilistic techniques.
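To make the classical example concrete, the short Python sketch below (illustrative only, not part of the dissertation; the function names are my own) brute-forces small cases of Mantel's theorem: every graph on n vertices with more than n^2/4 edges contains a triangle, while the balanced complete bipartite graph shows that the bound cannot be lowered.

```python
# Brute-force check of the triangle example for small n (illustrative sketch only):
# (a) every graph on n vertices with more than floor(n^2/4) edges has a triangle;
# (b) the balanced complete bipartite graph has floor(n^2/4) edges and no triangle.
from itertools import combinations

def has_triangle(edges):
    """Return True if some three vertices are pairwise adjacent."""
    e = set(edges)
    verts = {v for edge in edges for v in edge}
    return any(
        (a, b) in e and (b, c) in e and (a, c) in e
        for a, b, c in combinations(sorted(verts), 3)
    )

def check_mantel(n):
    pairs = list(combinations(range(n), 2))   # all possible edges, smaller vertex first
    bound = n * n // 4
    # (a) every graph with more than n^2/4 edges contains a triangle
    for m in range(bound + 1, len(pairs) + 1):
        assert all(has_triangle(g) for g in combinations(pairs, m))
    # (b) the balanced complete bipartite graph attains the bound without a triangle
    bipartite = [(u, v) for u in range(n // 2) for v in range(n // 2, n)]
    assert len(bipartite) == bound and not has_triangle(bipartite)

for n in range(3, 7):
    check_mantel(n)
print("Mantel bound verified for n = 3..6")
```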
Contributors: DeBiasio, Louis (Author) / Kierstead, Henry A (Thesis advisor) / Czygrinow, Andrzej (Thesis advisor) / Hurlbert, Glenn (Committee member) / Kadell, Kevin (Committee member) / Fishel, Susanna (Committee member) / Arizona State University (Publisher)
Created: 2011
Item 149788
Description
Residue number systems have gained significant importance in the field of high-speed digital signal processing due to their carry-free nature and the speed-up provided by parallelism. The critical aspects in the application of RNS are the selection of the moduli set and the design of the conversion units. Several RNS moduli sets have been proposed for the implementation of digital filters; however, some are unbalanced and some do not provide the required dynamic range. This thesis addresses the drawbacks of existing RNS moduli sets and proposes a new moduli set for efficient implementation of FIR filters. An efficient VLSI implementation model has been derived for the design of a reverse converter from RNS to the conventional two's complement representation. This model facilitates the realization of a reverse converter with better performance and less hardware complexity than the reverse converter designs of the existing balanced 4-moduli sets. Experimental results comparing RNS multiply-and-accumulate units implemented with the proposed four-moduli set against state-of-the-art balanced four-moduli sets show large reductions in area (46%) and power (43%) across various dynamic ranges. RNS FIR filters using the proposed moduli set and an existing balanced 4-moduli set were implemented in RTL and compared for chip area and power, showing 20% improvements. This thesis also presents a threshold logic implementation of the reverse converter.
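As a rough illustration of the forward and reverse conversions the abstract refers to, here is a minimal Python sketch based on the Chinese Remainder Theorem. The moduli set, function names, and signed mapping are placeholder assumptions: the abstract does not state the proposed moduli set, and the thesis's reverse converter is a dedicated VLSI design rather than CRT arithmetic in software.

```python
# Minimal RNS round-trip sketch: forward conversion of an integer to residues,
# and reverse conversion back to a signed value via the Chinese Remainder Theorem.
# The moduli below are only an illustrative pairwise-coprime choice.
from math import prod

MODULI = [7, 8, 9, 127]           # example pairwise-coprime moduli (not the proposed set)
M = prod(MODULI)                  # dynamic range of the system

def to_rns(x):
    """Forward conversion: integer -> tuple of residues."""
    return tuple(x % m for m in MODULI)

def from_rns(residues):
    """Reverse conversion via CRT; values in the upper half of the range are
    interpreted as negative (two's-complement-style signed mapping)."""
    x = 0
    for r, m in zip(residues, MODULI):
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)    # pow(Mi, -1, m) is the modular inverse
    x %= M
    return x - M if x >= M // 2 else x

# round trip of a signed value, e.g. one tap of a FIR accumulation
assert from_rns(to_rns(-1234)) == -1234
```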
Contributors: Chalivendra, Gayathri (Author) / Vrudhula, Sarma (Thesis advisor) / Shrivastava, Aviral (Committee member) / Bakkaloglu, Bertan (Committee member) / Arizona State University (Publisher)
Created: 2011
Item 149851
Description
This research describes software-based remote attestation schemes for obtaining the integrity of an executing user application and the Operating System (OS) text section of an untrusted client platform. A trusted external entity issues a challenge to the client platform. The challenge is executable code which the client must execute, and the code generates results which are sent to the external entity. These results provide the external entity with assurance as to whether the client application and the OS are in pristine condition. This work also presents a technique for verifying that the attested application is not replaced by a different application after the attestation completes. The implementation of these three techniques was achieved entirely in software and is backward compatible with legacy machines on the Intel x86 architecture. This research also presents two approaches to incorporating a software-based "root of trust" using Virtual Machine Monitors (VMMs). The first approach determines the integrity of an executing Guest OS from the Host OS using the Linux Kernel-based Virtual Machine (KVM) and qemu emulation software. The second approach implements a small VMM called MIvmm that can be utilized as a trusted codebase to build security applications such as those implemented in this research. MIvmm was conceptualized and implemented without using any existing codebase; its minimal size allows it to be trustworthy. Both VMM approaches leverage processor support for virtualization in the Intel x86 architecture.
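The sketch below illustrates the general challenge-response pattern in Python under simplifying assumptions: the client merely hashes its code region together with a fresh nonce, whereas the actual schemes in this research ship executable challenge code to the client and target native x86 binaries. All names and the placeholder data are hypothetical.

```python
# Simplified challenge-response attestation sketch (not the scheme from this research).
# The verifier holds a pristine copy of the client's text section, issues a fresh nonce
# so stale responses cannot be replayed, and compares the client's checksum against one
# computed over the known-good copy.
import hashlib, hmac, os

def client_respond(code_bytes: bytes, nonce: bytes) -> bytes:
    """Client side: checksum over its (claimed) text section, keyed by the nonce."""
    return hashlib.sha256(nonce + code_bytes).digest()

def verifier_check(known_good_code: bytes, response: bytes, nonce: bytes) -> bool:
    """Verifier side: recompute the expected response from a pristine copy."""
    expected = hashlib.sha256(nonce + known_good_code).digest()
    return hmac.compare_digest(expected, response)

pristine = b"placeholder for the application's text section bytes"  # hypothetical
nonce = os.urandom(16)
response = client_respond(pristine, nonce)      # in reality computed on the untrusted client
print(verifier_check(pristine, response, nonce))  # True only when the sections match
```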
Contributors: Srinivasan, Raghunathan (Author) / Dasgupta, Partha (Thesis advisor) / Colbourn, Charles (Committee member) / Shrivastava, Aviral (Committee member) / Huang, Dijiang (Committee member) / Dewan, Prashant (Committee member) / Arizona State University (Publisher)
Created: 2011
Item 150111
Description
Finding the optimal solution to a problem with an enormous search space can be challenging. Unless a combinatorial construction technique is found that also guarantees the optimality of the resulting solution, this can be an infeasible task. If such a technique is unavailable, different heuristic methods are generally used to improve the upper bound on the size of the optimal solution. This dissertation presents an alternative method which can be used to improve a solution to a problem rather than construct a solution from scratch. Necessity analysis, which is the key to this approach, is the process of analyzing the necessity of each element in a solution. The post-optimization algorithm presented here uses the result of the necessity analysis to improve the quality of the solution by eliminating unnecessary objects from it. While this technique could potentially be applied to different domains, this dissertation focuses on k-restriction problems, where a solution to the problem can be presented as an array. A scalable post-optimization algorithm for covering arrays is described, which starts from a valid solution and performs necessity analysis to iteratively improve the quality of the solution. It is shown that not only can this technique improve upon the previously best known results, but it can also be added as a refinement step to any construction technique, with further improvements expected in most cases. The post-optimization algorithm is then modified to accommodate every k-restriction problem, and this generic algorithm can be used as a starting point to create a reasonably sized solution for any such problem. The generic algorithm is then further refined for hash family problems by adding a conflict graph analysis to the necessity analysis phase. By recoloring the conflict graphs, a new degree of flexibility is explored, which can further improve the quality of the solution.
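A simplified (and far less scalable) version of the necessity-analysis idea for covering arrays might look like the following Python sketch, which is not the dissertation's algorithm: a row is unnecessary when every t-way interaction it covers is also covered by some other row, so it can be removed without breaking coverage.

```python
# Illustrative necessity-analysis sketch for covering arrays: repeatedly drop
# any row whose t-way interactions are all covered by the remaining rows.
from itertools import combinations

def interactions(row, t):
    """All t-way column/value interactions covered by one row."""
    return {(cols, tuple(row[c] for c in cols))
            for cols in combinations(range(len(row)), t)}

def post_optimize(array, t):
    rows = list(array)
    changed = True
    while changed:
        changed = False
        for i, row in enumerate(rows):
            others = set().union(*(interactions(r, t)
                                   for j, r in enumerate(rows) if j != i))
            if interactions(row, t) <= others:   # row adds no new coverage
                rows.pop(i)
                changed = True
                break
    return rows

# toy example: a strength-2 covering array over 3 binary columns with one
# deliberately duplicated row; post-optimization removes one copy of it
ca = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0), (1, 1, 0)]
print(post_optimize(ca, t=2))
```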
Contributors: Nayeri, Peyman (Author) / Colbourn, Charles (Thesis advisor) / Konjevod, Goran (Thesis advisor) / Sen, Arunabha (Committee member) / Stanzione Jr, Daniel (Committee member) / Arizona State University (Publisher)
Created: 2011
Item 137869
Description
Meteorology is an uncommon term rarely resonating through elementary classrooms. However, it is a concept found in both the fourth and sixth grade Arizona science standards. As issues involving the environment are becoming more pertinent, it is important to study and understand atmospheric processes along with fulfilling the standards for each grade level. This thesis project teaches the practical skills of weather map reading and weather forecasting through the creation and execution of an after-school lesson with the aid of seven teen assistants.
Contributors: Choulet, Shayna (Author) / Walters, Debra (Thesis director) / Oliver, Jill (Committee member) / Balling, Robert (Committee member) / Barrett, The Honors College (Contributor) / College of Liberal Arts and Sciences (Contributor)
Created: 2012-12
Item 137870
Description
Plants are essential to human life. They release oxygen into the atmosphere for us to breathe. They also provide shelter, medicine, clothing, tools, and food. For many people, the food that is on their tables and in their supermarkets isn't given much thought. Where did it come from? What part of the plant is it? How does it relate to others in the plant kingdom? How do other cultures use this plant? The most many of us know about them is that they are at the supermarket when we need them for dinner (Nabhan, 2009; Vileisis, 2008).
Contributors: Barron, Kara (Author) / Landrum, Leslie (Thesis director) / Swanson, Tod (Committee member) / Pigg, Kathleen (Committee member) / Barrett, The Honors College (Contributor) / College of Liberal Arts and Sciences (Contributor)
Created: 2012-12
Item 137871
Description
Based on previous research and findings, it is proven that a non-profit class to create awareness will be beneficial in the prevention of eating disorders. This analysis will provide significant research to defend the proposed class.
Contributors: Allen, Brittany (Author) / Chung, Deborah (Author) / Fey, Richard (Thesis director) / Peck, Sidnee (Committee member) / Mazurkiewicz, Milena (Committee member) / Barrett, The Honors College (Contributor) / W. P. Carey School of Business (Contributor) / College of Liberal Arts and Sciences (Contributor)
Created: 2012-12
Description
Restraint stress is the most commonly used laboratory stressor. It is difficult to characterize as psychological or physical, because past studies show psychological features, but the nature of confinement adds a physical dimension. This was the first study to investigate how experience with restraint stress affects the brain's response to the next stressor in the absence of a physical burden. Pair-housed adult male rats were transported to a novel context and restrained or left undisturbed (6 hr). The next day, rats were returned to the same context and were either restrained or left undisturbed in the context (n=8/group). After 90 min, rats were euthanized to determine functional activation in limbic structures using Fos immunohistochemistry and to measure HPA axis reactivity through blood serum corticosterone levels. Regardless of day 1 experience, context exposure on day 2 enhanced Fos expression in CA1 and CA3 of the hippocampus, the basolateral amygdala, and the central amygdala. Conversely, other regions and corticosterone levels showed modulation by the previous day's experience. Specifically, rats that were placed back into the restraint context but not restrained on day 2 showed enhanced Fos expression in the dentate gyrus suprapyramidal blade (DGSup) and infralimbic cortex (IL). Fos expression in the IL and medial amygdala (MEA) was also attenuated in rats that received two restraint exposures, suggesting habituation. Only the DG infrapyramidal blade (DGInf) showed enhanced Fos expression to restraint on day 2 without influence of the previous day. While context predominantly directed Fos activation, prior experience with restraint influenced Fos expression in the DGSup, IL, and MEA, as well as corticosterone levels, supporting the view that restraint has psychological components.
Contributors: Anouti, P. Danya (Author) / Conrad, D. Cheryl (Thesis director) / Hammer, Ronald (Committee member) / Hoffman, N. Ann (Committee member) / Barrett, The Honors College (Contributor) / College of Liberal Arts and Sciences (Contributor)
Created: 2012-12
Item 151802
Description
The complexity of the systems that software engineers build has continuously grown since the inception of the field. What has not changed is the engineers' mental capacity to operate on about seven distinct pieces of information at a time. The widespread use of UML has led to more abstract software design activities; however, the same cannot be said for reverse engineering activities. The introduction of abstraction to reverse engineering will allow the engineer to move farther away from the details of the system, increasing his ability to see the role that domain-level concepts play in the system. In this thesis, we present a technique that facilitates filtering of classes from existing systems at the source level based on their relationship to concepts in the domain, via a classification method using machine learning. We showed that concepts can be identified using a machine learning classifier based on source-level metrics. We developed an Eclipse plugin to assist with the process of manually classifying Java source code and collecting metrics and classifications into a standard file format. We developed an Eclipse plugin to act as a concept identifier that visually indicates whether a class is a domain concept or not. We minimized the size of training sets to ensure a useful approach in practice; this allowed us to determine that a training set of 7.5 to 10% is nearly as effective as a training set representing 50% of the system. We showed that random selection is the most consistent and effective means of selecting a training set. We found that KNN is the most consistent performer among the learning algorithms tested. We determined the optimal feature set for this classification problem. We discussed two possible structures besides a one-to-one mapping of domain knowledge to implementation. We showed that classes representing more than one concept are simply concepts at differing levels of abstraction. We also discussed composite concepts representing a domain concept implemented by more than one class. We showed that these composite concepts are difficult to detect because the problem is NP-complete.
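As a rough sketch of the classification step, the Python snippet below trains a KNN classifier on a small (about 10%), randomly selected training set of classes described by source-level metrics. The metric names and the synthetic data are placeholders, not the feature set or corpus used in the thesis.

```python
# KNN concept-identification sketch: each class is a vector of source-level
# metrics; a small random training set labels classes as domain concepts or not,
# and the classifier predicts labels for the rest.  Data here is synthetic.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_classes = 400
# hypothetical metrics per class: [lines of code, number of methods, fan-in, fan-out]
X = rng.integers(1, 500, size=(n_classes, 4)).astype(float)
y = rng.integers(0, 2, size=n_classes)          # 1 = domain concept (placeholder labels)

# random selection of a ~10% training set; the rest is left for the tool to label
X_train, X_test, y_train, y_test = train_test_split(
    X, y, train_size=0.10, random_state=0, stratify=y)

knn = KNeighborsClassifier(n_neighbors=5).fit(X_train, y_train)
print("held-out accuracy:", knn.score(X_test, y_test))
```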
Contributors: Carey, Maurice (Author) / Colbourn, Charles (Thesis advisor) / Collofello, James (Thesis advisor) / Davulcu, Hasan (Committee member) / Sarjoughian, Hessam S. (Committee member) / Ye, Jieping (Committee member) / Arizona State University (Publisher)
Created: 2013
Item 152166
Description
The advent of threshold logic simplifies traditional Boolean logic to a single-level multi-input function. The threshold logic latch (TLL), one implementation of threshold logic, is functionally equivalent to a multi-input function with an edge-triggered flip-flop; it stands out for improving area and both dynamic and leakage power consumption, providing an appropriate design alternative. Accordingly, a TLL standard cell library is designed. Through technology mapping, a hybrid circuit is generated by absorbing the logic cone backward from each flip-flop to obtain the smallest remaining feeder. With the scan test methodology adopted, design for testability (DFT) is proposed, including scan element design and scan chain insertion. A test synthesis flow is then introduced based on the Cadence RTL Compiler tool. Test application is the process of applying vectors and analyzing responses, which is mainly a matter of testbench design; a parameterized, generic, self-checking Verilog testbench is designed for static fault detection. Test development refers to fault modeling and test generation. First, functional truth-table test generation on TLL cells is proposed. Before the truth-table test of the threshold function, the dependence on the sequence of vectors applied, i.e., the dependence of the current state on the previous state, must be eliminated. A transition test (dynamic pattern) on all weak inputs is shown to test the reset function, which is supposed to erase the history in the reset phase before every evaluation phase. The remaining vectors in the truth table, except the weak inputs, are then applied statically (static patterns). Second, dynamic patterns for all weak inputs are proposed to detect the structural transistor-level faults analyzed in the TLL cell, under the single-fault assumption and considering stuck-at, stuck-on, and stuck-open faults. Containing these patterns, the functional test covers all testable structural faults inside the TLL. Third, at the scope of the whole hybrid netlist, a three-step test generation procedure is proposed: scan chain test; test of feeders and other scan elements except TLLs; and functional pattern test of TLL cells. The implementation of this procedure is discussed in the automatic test pattern generation (ATPG) chapter.
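For readers unfamiliar with threshold logic, the short Python sketch below shows the kind of function a TLL cell evaluates and a naive enumeration of truth-table (static) patterns. The weights and threshold are an arbitrary example rather than a cell from the TLL library, and the reset/evaluation phasing and weak-input dynamic patterns discussed above are omitted.

```python
# Threshold function and naive truth-table pattern generation (illustrative only).
from itertools import product

def threshold_gate(inputs, weights, T):
    """Single-level multi-input threshold function: 1 iff the weighted sum >= T."""
    return int(sum(w * x for w, x in zip(weights, inputs)) >= T)

def truth_table_patterns(weights, T):
    """Enumerate every input vector with its expected output (static patterns)."""
    n = len(weights)
    return [(vec, threshold_gate(vec, weights, T))
            for vec in product((0, 1), repeat=n)]

# example: a 3-input majority function expressed as a threshold gate
for vector, expected in truth_table_patterns(weights=(1, 1, 1), T=2):
    print(vector, "->", expected)
```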
Contributors: Hu, Yang (Author) / Vrudhula, Sarma (Thesis advisor) / Barnaby, Hugh (Committee member) / Yu, Shimeng (Committee member) / Arizona State University (Publisher)
Created: 2013