Matching Items (1,852)

Description
Java is currently making its way into embedded systems and mobile devices such as Android. Programs written in Java are compiled into machine-independent bytecode stored in binary class files, which a Java Virtual Machine (JVM) executes. The Java platform additionally specifies the Java Native Interface (JNI). JNI allows Java code running within a JVM to interoperate with applications or libraries written in other languages and compiled to the host CPU's ISA. JNI plays an important role in embedded systems because it provides a mechanism for interacting with platform-specific libraries. This thesis addresses the overhead incurred in the JNI due to reflection and serialization when objects are accessed on Android-based mobile devices, and it provides techniques to reduce this overhead. It also provides an API to access an object through its reference by pinning its memory location. The Android emulator was used to evaluate the performance of these techniques, and we observed a 5-10% performance gain with the new Java Native Interface.
ContributorsChandrian, Preetham (Author) / Lee, Yann-Hang (Thesis advisor) / Davulcu, Hasan (Committee member) / Li, Baoxin (Committee member) / Arizona State University (Publisher)
Created2011
Description
As pointed out in H. V. Jagadish's keynote speech at SIGMOD'07, and as commonly agreed in the database community, the usability of structured data by casual users is as important as the functionality of data management systems. A major difficulty in using structured data is easily retrieving information from it to satisfy a user's information needs. Learning and using a structured query language (e.g., SQL or XQuery) is overwhelmingly burdensome for most users: not only are these languages sophisticated, but users must also know the data schema. Keyword search provides an opportunity to conveniently access structured data and can significantly enhance its usability. However, processing keyword search on structured data is challenging due to various types of ambiguity, such as structural ambiguity (keyword queries have no structure), keyword ambiguity (the keywords may not be accurate), and user preference ambiguity (the user may have implicit preferences not indicated in the query), as well as efficiency challenges arising from the large search space. This dissertation presents an expansive study of keyword search processing techniques as a gateway for users to access structured data and retrieve desired information. The key issues addressed include: (1) resolving structural ambiguities in keyword queries by generating meaningful query results, which involves identifying relevant keyword matches, identifying return information, and composing query results based on both; (2) resolving structural, keyword, and user preference ambiguities through result analysis, including snippet generation, result differentiation, result clustering, and result summarization/query expansion; and (3) addressing the efficiency challenge of processing keyword search on structured data by utilizing and efficiently maintaining materialized views.
These works deliver significant technical contributions towards building a full-fledged search engine for structured data.
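As a toy illustration of the gateway idea, and not of the dissertation's algorithms, keyword search over structured records can be sketched with a simple inverted index; the records, field names, and query below are invented for the example:

```python
from collections import defaultdict

# Hypothetical structured records (rows of a table), for illustration only.
records = [
    {"id": 1, "title": "XML keyword search", "author": "Liu"},
    {"id": 2, "title": "SQL query processing", "author": "Chen"},
    {"id": 3, "title": "Keyword search on graphs", "author": "Liu"},
]

# Build an inverted index: token -> set of record ids containing it.
index = defaultdict(set)
for rec in records:
    for value in rec.values():
        for token in str(value).lower().split():
            index[token].add(rec["id"])

def keyword_search(query):
    """Rank record ids by how many query keywords they match."""
    scores = defaultdict(int)
    for token in query.lower().split():
        for rid in index.get(token, set()):
            scores[rid] += 1
    return sorted(scores, key=lambda rid: -scores[rid])

results = keyword_search("keyword search liu")
```

Note that this sketch already exhibits the structural ambiguity discussed above: the query carries no structure, so the system must decide which records (and which of their fields) constitute a meaningful answer.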
ContributorsLiu, Ziyang (Author) / Chen, Yi (Thesis advisor) / Candan, Kasim S (Committee member) / Davulcu, Hasan (Committee member) / Jagadish, H V (Committee member) / Arizona State University (Publisher)
Created2011
Description
Reliable extraction of human pose features that are invariant to view angle and body shape changes is critical for advancing human movement analysis. In this dissertation, multifactor analysis techniques, including multilinear analysis and multifactor Gaussian process methods, have been exploited to extract such invariant pose features from video data by decomposing the key contributing factors, such as pose, view angle, and body shape, in the generation of the image observations. Experimental results have shown that the pose features extracted using the proposed methods exhibit excellent invariance to changes in view angle and body shape. Furthermore, using the proposed invariant multifactor pose features, a suite of simple yet effective algorithms has been developed to solve the movement recognition and pose estimation problems. Using these algorithms, excellent human movement analysis results have been obtained, most of them superior to those of state-of-the-art algorithms on the same testing datasets. Moreover, a number of key movement analysis challenges, including robust online gesture spotting and multi-camera gesture recognition, have also been addressed in this research. To this end, an online gesture spotting framework has been developed to automatically detect and learn non-gesture movement patterns, improving gesture localization and recognition from continuous data streams using a hidden Markov network. In addition, the optimal data fusion scheme has been investigated for multi-camera gesture recognition, and decision-level camera fusion using the product rule has been found to be optimal for gesture recognition with multiple uncalibrated cameras. Furthermore, the challenge of optimal camera selection in multi-camera gesture recognition has also been tackled, and a measure quantifying the complementary strength across cameras has been proposed.
Experimental results obtained from a real-life gesture recognition dataset have shown that the optimal camera combinations identified according to the proposed complementary measure always lead to the best gesture recognition results.
ContributorsPeng, Bo (Author) / Qian, Gang (Thesis advisor) / Ye, Jieping (Committee member) / Li, Baoxin (Committee member) / Spanias, Andreas (Committee member) / Arizona State University (Publisher)
Created2011
Description
A dual-channel directional digital hearing aid (DHA) front end is presented, comprising a microphone interface circuit based on a fully differential difference amplifier (FDDA) for capacitive micro-electro-mechanical systems (MEMS) microphones and an adaptive-power analog front end (AFE). The FDDA-based microphone interface circuit converts capacitance variations into a voltage signal, achieving a noise floor of 32 dB SPL (sound pressure level) and an SNR of 72 dB; additionally, it performs single-ended-to-differential conversion, allowing a fully differential analog signal chain. The analog front end consists of a 40 dB VGA and a power-scalable continuous-time sigma-delta ADC with 68 dB SNR, dissipating 67 µW from a 1.2 V supply. The ADC implements a self-calibrating feedback DAC to calibrate the second-order non-linearity. The VGA and power-scalable ADC are fabricated in a 0.25 µm TSMC CMOS process. The dual channels of the DHA are precisely matched, achieving about 0.5 dB gain mismatch and a directivity index greater than 5 dB. This enables a highly integrated, low-power DHA.
ContributorsNaqvi, Syed Roomi (Author) / Kiaei, Sayfe (Thesis advisor) / Bakkaloglu, Bertan (Committee member) / Chae, Junseok (Committee member) / Barnby, Hugh (Committee member) / Aberle, James T., 1961- (Committee member) / Arizona State University (Publisher)
Created2011
Description
Demand for biosensor research applications is growing steadily. According to a report by Frost & Sullivan, the biosensor market is expected to reach $14.42 billion by 2016. Clinical diagnostic applications continue to be the largest market for biosensors, and this demand is likely to continue through 2016 and beyond. Biosensor technology for clinical diagnostics, however, requires translational research that moves bench science and theoretical knowledge toward marketable products. Despite the high volume of academic research to date, only a handful of biomedical devices have become viable commercial applications, and academic research must increase its focus on practical uses for biosensors. This dissertation is an example of that increased focus, discussing work to advance microfluidic-based protein biosensor technologies for practical use in clinical diagnostics. Four areas of work are discussed. The first involved developing reusable/reconfigurable biosensors, which are useful in applications such as biochemical science and analytical chemistry that require detailed sensor calibration; this work resulted in a prototype sensor and an in-situ electrochemical surface regeneration technique that can be used to produce microfluidic-based reusable biosensors. The second addressed non-specific adsorption (NSA) of biomolecules, a persistent challenge in conventional microfluidic biosensors; this work produced design methods that reduce NSA. The third involved a novel microfluidic sensing platform designed to detect target biomarkers using competitive protein adsorption. This technique uses physical adsorption of proteins to a surface rather than complex and time-consuming immobilization procedures, and it enabled selective detection of a thyroid cancer biomarker, thyroglobulin, in a controlled protein cocktail and of a cardiovascular biomarker, fibrinogen, in undiluted human serum.
The fourth area of work expanded the technique into a unique protein identification method: pattern recognition. A sample mixture of proteins generates a distinctive composite pattern upon interaction with a sensing platform consisting of multiple surfaces, each with a distinct type of protein pre-adsorbed on it. The utility of this pattern-recognition sensing mechanism was then verified by recognizing a particular biomarker, C-reactive protein, in a cocktail sample mixture.
ContributorsChoi, Seokheun (Author) / Chae, Junseok (Thesis advisor) / Tao, Nongjian (Committee member) / Yu, Hongyu (Committee member) / Forzani, Erica (Committee member) / Arizona State University (Publisher)
Created2011
Description
This thesis describes a synthetic task environment, CyberCog, created for the purposes of (1) understanding and measuring individual and team situation awareness in the context of a cyber security defense task and (2) providing a context for evaluating algorithms, visualizations, and other interventions intended to improve cyber situation awareness. CyberCog provides an interactive environment for conducting human-in-the-loop experiments in which participants perform the tasks of a cyber security defense analyst in response to a cyber-attack scenario. CyberCog generates the performance measures and interaction logs needed for measuring individual and team cyber situation awareness. Moreover, the CyberCog environment provides good experimental control for conducting effective situation awareness studies while retaining realism in the scenario and in the tasks performed.
ContributorsRajivan, Prashanth (Author) / Femiani, John (Thesis advisor) / Cooke, Nancy J. (Thesis advisor) / Lindquist, Timothy (Committee member) / Gary, Kevin (Committee member) / Arizona State University (Publisher)
Created2011
Description
Alzheimer's disease (AD) is a debilitating neurodegenerative disease. The disease leads to dementia and loss of cognitive function and affects about 4.5 million people in the United States. It is the seventh leading cause of death and a huge financial burden on the healthcare industry. There are no means of diagnosing the disease before neurodegeneration is significant, and sadly there is no cure that controls its progression. The protein beta-amyloid (Aβ) plays an important role in the progression of the disease. It is formed by cleavage of the amyloid precursor protein (APP) by two enzymes, β- and γ-secretase, and is found in the plaques deposited in Alzheimer brains. This work describes the generation of therapeutics based on inhibiting cleavage by β-secretase. Using in vitro recombinant antibody display libraries to screen for single-chain variable fragment (scFv) antibodies, this work describes the isolation and characterization of scFv that target the β-secretase cleavage site on APP. This approach is especially relevant because non-specific inhibition of the enzyme may have undesirable effects, since the enzyme has been shown to have other important substrates. The scFv iBSEC1 successfully recognized APP, reduced β-secretase cleavage of APP, and reduced Aβ levels in a cell model of Alzheimer's disease. This work then describes the first application of bispecific antibody therapeutics to Alzheimer's disease. The iBSEC1 scFv was combined with a proteolytic scFv that enhances the "good" pathway (α-secretase cleavage), which results in alternative cleavage of APP, to generate the bispecific tandem scFv DIA10D. DIA10D reduced APP cleavage by β-secretase and steered it toward the "good" pathway, thus increasing generation of the fragment sAPPα, which is neuroprotective. Finally, treatment with iBSEC1 is evaluated for reduced oxidative stress, which is observed in cells overexpressing APP when they are exposed to stress.
Recombinant antibody-based therapeutics such as scFv have several advantages: they retain the high specificity of antibodies but are safer because they lack the constant region, and they are smaller, potentially facilitating easier delivery to the brain.
ContributorsBoddapati, Shanta (Author) / Sierks, Michael (Thesis advisor) / Arizona State University (Publisher)
Created2011
Description
With the introduction of compressed sensing and sparse representation, many image processing and computer vision problems have been looked at in a new way. Recent trends indicate that many challenging computer vision and image processing problems are being solved using compressive sensing and sparse representation algorithms. This thesis examines applications of compressive sensing and sparse representation to image enhancement, restoration, and classification. The first application deals with image super-resolution through compressive-sensing-based sparse representation. A novel framework is developed for understanding and analyzing some of the implications of compressive sensing in the reconstruction and recovery of an image through raw-sampled and trained dictionaries. Properties of the projection operator and the dictionary are examined and the corresponding results presented. In the second application, a novel technique for representing image classes uniquely in a high-dimensional space for image classification is presented. The design and implementation of an image classification system based on unique affine sparse codes are described, leading to state-of-the-art results; this further leads to analysis of some of the properties attributed to these unique sparse codes. In addition to obtaining the codes, a strong classifier is designed and implemented to boost the results obtained. Evaluation with publicly available datasets shows that the proposed method outperforms other state-of-the-art methods in image classification. The final part of the thesis deals with image denoising, with a novel approach to obtaining high-quality denoised image patches using only a single image. A new technique is proposed to obtain highly correlated image patches through sparse representation, which are then subjected to matrix completion to obtain high-quality image patches.
Experiments suggest that there may exist a structure within a noisy image which can be exploited for denoising through a low-rank constraint.
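As a self-contained sketch of greedy sparse coding in the spirit of these methods (matching pursuit, a simpler relative of the algorithms typically used; not the thesis's framework), a signal can be approximated as a sparse combination of dictionary atoms. The tiny dictionary and signal below are invented for illustration:

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def matching_pursuit(signal, dictionary, n_iters=3):
    """Greedily approximate `signal` as a sparse combination of unit-norm atoms."""
    residual = list(signal)
    coeffs = [0.0] * len(dictionary)
    for _ in range(n_iters):
        # Pick the atom most correlated with the current residual.
        best = max(range(len(dictionary)),
                   key=lambda i: abs(dot(residual, dictionary[i])))
        c = dot(residual, dictionary[best])
        coeffs[best] += c
        # Subtract the selected atom's contribution from the residual.
        residual = [r - c * d for r, d in zip(residual, dictionary[best])]
    return coeffs, residual

# Toy dictionary of unit-norm atoms in R^3 (illustrative only).
atoms = [
    (1.0, 0.0, 0.0),
    (0.0, 1.0, 0.0),
    (0.0, 0.0, 1.0),
    (1 / math.sqrt(2), 1 / math.sqrt(2), 0.0),
]
coeffs, residual = matching_pursuit([3.0, 3.0, 1.0], atoms, n_iters=2)
```

After two iterations the diagonal atom and the third basis atom are selected, leaving a near-zero residual: the signal is represented with only two of the four coefficients, which is the essence of a sparse code.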
ContributorsKulkarni, Naveen (Author) / Li, Baoxin (Thesis advisor) / Ye, Jieping (Committee member) / Sen, Arunabha (Committee member) / Arizona State University (Publisher)
Created2011
Description
In this work, a novel method is developed for making nano- and micro-fibrous hydrogels capable of preventing the rejection of implanted materials. This is achieved by either (1) mimicking the native cellular environment, to exert fine control over the cellular response, or (2) acting as a protective barrier, to camouflage the foreign nature of a material and evade recognition by the immune system. Comprehensive characterization and in vitro studies described here provide a foundation for developing substrates for use in clinical applications. Hydrogel dextran and poly(acrylic acid) (PAA) fibers are formed via electrospinning, in sizes ranging from nanometers to microns in diameter. While "as-electrospun" fibers are continuous in length, sonication is used to fragment fibers into short fiber "bristles" and generate nano- and micro-fibrous surface coatings over a wide range of topographies. The dex-PAA fibrous surfaces are chemically modified, then optimized and characterized for non-fouling and ECM-mimetic properties. The non-fouling nature of the fibers is verified, and cell culture studies show differential responses dependent upon chemical, topographical, and mechanical properties. Dex-PAA fibers are advantageously unique in that (1) a fine degree of control is possible over three parameters critical for modifying cellular response, namely topography, chemistry, and mechanical properties, over a range emulating that of native cellular environments, (2) the innate nature of the material is non-fouling, providing an inert background for adding back specific bioactive functionality, and (3) the fibers can be applied as a surface coating or constitute the scaffold itself. This is the first reported work on dex-PAA hydrogel fibers formed via electrospinning and thermal cross-linking, and, unique to this method, no toxic solvents or cross-linking agents are needed to create the hydrogels or attach them to surfaces.
This is also the first reported work of using sonication to fragment electrospun hydrogel fibers, and in which surface coatings were made via simple electrostatic interaction and dehydration. These versatile features enable fibrous surface coatings to be applied to virtually any material. Results of this research broadly impact the design of biomaterials which contact cells in the body by directing the consequent cell-material interaction.
ContributorsLouie, Katherine BoYook (Author) / Massia, Stephen P (Thesis advisor) / Bennett, Kevin (Committee member) / Garcia, Antonio (Committee member) / Pauken, Christine (Committee member) / Vernon, Brent (Committee member) / Arizona State University (Publisher)
Created2011
Description
Genes have widely different pertinences to the etiology and pathology of diseases. Thus, they can be ranked according to their disease significance on a genomic scale, which is the subject of gene prioritization. Given a set of genes known to be related to a disease, it is reasonable to use them as a basis for determining the significance of other candidate genes, which are then ranked by the association they exhibit with the given set of known genes. Experimental and computational data of various kinds have different reliability and relevance to a disease under study. This work presents a gene prioritization method based on integrated biological networks that incorporates and models the varying levels of relevance and reliability of diverse sources. The method is shown to achieve significantly higher performance than two well-known gene prioritization algorithms. Essentially no bias in performance was seen when the method was applied to diseases of diverse etiology, e.g., monogenic, polygenic, and cancer. The method was highly stable and robust against significant levels of noise in the data. Biological networks are often sparse, which can impede the operation of association-based gene prioritization algorithms such as the one presented here from a computational perspective. As a potential approach to overcoming this limitation, we explore the value that transcription factor binding sites can have in elucidating suitable targets. Transcription factors are needed for the expression of most genes, especially in higher organisms, and hence genes can be associated via their genetic regulatory properties. While each transcription factor recognizes specific DNA sequence patterns, such patterns are unknown for many transcription factors, and even those that are known are inconsistently reported in the literature, implying a potentially high level of inaccuracy.
We developed computational methods for the prediction and improvement of transcription factor binding patterns. Tests of the improvement method, employing synthetic patterns under various conditions, showed that it is very robust and that the patterns produced invariably converge to nearly identical series of patterns. Preliminary tests were conducted to incorporate knowledge from transcription factor binding sites into our network-based model for prioritization, with encouraging results. To validate these approaches in a disease-specific context, we built a schizophrenia-specific network based on the inferred associations and performed a comprehensive prioritization of human genes with respect to the disease. These results are expected to be validated empirically, but computational validation using known targets is very positive.
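Network-based gene prioritization of the kind described above is often formalized as a random walk with restart from the known disease genes; candidate genes are then ranked by their steady-state visiting probability. The following is an illustrative sketch only, not the method developed in this work, and the tiny graph, gene names, and seed set are invented for the example:

```python
# Random walk with restart (RWR) on a toy gene-association network.
# The graph, gene names, and seed set are hypothetical, for illustration only.
edges = {
    "G1": ["G2", "G3"],
    "G2": ["G1", "G4"],
    "G3": ["G1", "G4"],
    "G4": ["G2", "G3", "G5"],
    "G5": ["G4"],
}

def rwr(edges, seeds, restart=0.3, n_iters=100):
    """Iterate p <- (1 - c) * W p + c * p0, where p0 is uniform over the seeds."""
    genes = list(edges)
    p0 = {g: (1.0 / len(seeds) if g in seeds else 0.0) for g in genes}
    p = dict(p0)
    for _ in range(n_iters):
        nxt = {}
        for g in genes:
            # Mass flowing into g: each neighbor h splits its mass over its degree.
            inflow = sum(p[h] / len(edges[h]) for h in edges if g in edges[h])
            nxt[g] = (1 - restart) * inflow + restart * p0[g]
        p = nxt
    return p

scores = rwr(edges, seeds={"G1"})
ranked = sorted(scores, key=lambda g: -scores[g])
```

In this toy run the seed gene ranks first and the gene farthest from the seed ranks last, matching the intuition that candidates closely associated with known disease genes should receive the highest priority.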
ContributorsLee, Jang (Author) / Gonzalez, Graciela (Thesis advisor) / Ye, Jieping (Committee member) / Davulcu, Hasan (Committee member) / Gallitano-Mendel, Amelia (Committee member) / Arizona State University (Publisher)
Created2011