This collection includes most of the ASU Theses and Dissertations from 2011 to the present. ASU Theses and Dissertations are available in downloadable PDF format; however, a small percentage of items are under embargo. Information about the dissertations and theses includes degree information, committee members, an abstract, and supporting data or media.

In addition to the electronic theses in the ASU Digital Repository, ASU Theses and Dissertations can also be found in the ASU Library Catalog.

Dissertations and theses granted by Arizona State University are archived and made available through a joint effort of the ASU Graduate College and the ASU Libraries. For more information or questions about this collection, visit the Digital Repository ETD Library Guide or contact the ASU Graduate College at gradformat@asu.edu.

Displaying 1 - 10 of 311

Description
Java is currently making its way into embedded systems and mobile devices such as Android phones. Programs written in Java are compiled into machine-independent binary class files (bytecode), which a Java Virtual Machine (JVM) executes. The Java platform additionally specifies the Java Native Interface (JNI). JNI allows Java code running within a JVM to interoperate with applications or libraries that are written in other languages and compiled to the host CPU ISA. JNI plays an important role in embedded systems because it provides a mechanism to interact with libraries specific to the platform. This thesis addresses the overhead incurred in the JNI due to reflection and serialization when objects are accessed on Android-based mobile devices, and it provides techniques to reduce this overhead. It also provides an API to access objects directly by reference, by pinning their memory locations. The Android emulator was used to evaluate the performance of these techniques, and a 5-10% performance gain was observed with the new Java Native Interface.
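To make the boundary crossing concrete, the hedged Java sketch below contrasts passing an ordinary object to native code, which forces the native side into reflection-style JNI lookups (GetObjectClass, GetFieldID, GetDoubleField), with sharing a direct ByteBuffer whose backing memory native code can read in place via GetDirectBufferAddress. The class, method, and library names are invented for illustration and are not the API proposed in the thesis.

```java
import java.nio.ByteBuffer;

// Hypothetical sketch (names invented): two ways Java data can cross the JNI boundary.
public class SensorBridge {
    static {
        // Assumes a native library "sensorbridge" exists; it is not part of the thesis.
        System.loadLibrary("sensorbridge");
    }

    // Passing an ordinary object: the native side must resolve the class, field IDs,
    // and field values with JNI calls (GetObjectClass, GetFieldID, GetDoubleField, ...),
    // which is the reflection-style overhead the abstract refers to.
    public static native double processSample(Sample s);

    // Passing a direct buffer: the native side obtains a raw pointer with
    // GetDirectBufferAddress and reads the samples in place, with no per-field
    // lookups and no copying - conceptually similar to accessing a pinned object.
    public static native double processBuffer(ByteBuffer samples, int count);

    public static ByteBuffer makeBuffer(double[] values) {
        ByteBuffer buf = ByteBuffer.allocateDirect(values.length * Double.BYTES);
        for (double v : values) {
            buf.putDouble(v);
        }
        return buf; // hand this to processBuffer(...) once the native library is loaded
    }
}

class Sample {
    double value;
    long timestamp;
}
```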
Contributors: Chandrian, Preetham (Author) / Lee, Yann-Hang (Thesis advisor) / Davulcu, Hasan (Committee member) / Li, Baoxin (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
As pointed out in the keynote speech by H. V. Jagadish at SIGMOD'07, and as commonly agreed in the database community, the usability of structured data by casual users is as important as the functionality of data management systems. A major difficulty in using structured data is the problem of easily retrieving information from it given a user's information needs. Learning and using a structured query language (e.g., SQL or XQuery) is overwhelmingly burdensome for most users: not only are these languages sophisticated, but users also need to know the data schema. Keyword search offers a convenient way to access structured data and can significantly enhance its usability. However, processing keyword search on structured data is challenging due to various types of ambiguities, such as structural ambiguity (keyword queries have no structure), keyword ambiguity (the keywords may not be accurate), and user preference ambiguity (the user may have implicit preferences that are not indicated in the query), as well as the efficiency challenges posed by a large search space. This dissertation performs an expansive study of keyword search processing techniques as a gateway for users to access structured data and retrieve desired information. The key issues addressed include: (1) resolving structural ambiguities in keyword queries by generating meaningful query results, which involves identifying relevant keyword matches, identifying return information, and composing query results based on the relevant matches and return information; (2) resolving structural, keyword, and user preference ambiguities through result analysis, including snippet generation, result differentiation, result clustering, and result summarization/query expansion; and (3) resolving the efficiency challenge of processing keyword search on structured data by utilizing and efficiently maintaining materialized views. These works deliver significant technical contributions toward building a full-fledged search engine for structured data.
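As a minimal illustration of the gap keyword search bridges, the hedged Java sketch below lets a user supply bare keywords while the system, not the user, locates matching tuples and composes a joined answer. The tiny in-memory "tables" and the brute-force matching are invented for this sketch and are far simpler than the techniques developed in the dissertation.

```java
import java.util.*;

// Toy illustration: keyword search over two tiny in-memory "tables"
// (papers and authors) joined by authorId. The user supplies only keywords;
// the system finds matching rows and composes a joined answer.
public class KeywordSearchToy {

    record Author(int id, String name) {}
    record Paper(int id, int authorId, String title) {}

    public static void main(String[] args) {
        List<Author> authors = List.of(new Author(1, "Liu"), new Author(2, "Chen"));
        List<Paper> papers = List.of(
                new Paper(10, 1, "Keyword search on XML data"),
                new Paper(11, 2, "Materialized views for graphs"));

        Set<String> keywords = Set.of("keyword", "liu");   // the user's whole "query"

        // Identify relevant matches in each table, then compose results by joining
        // papers to authors so that, together, a result covers all keywords.
        for (Paper p : papers) {
            for (Author a : authors) {
                if (p.authorId() != a.id()) continue;
                String text = (p.title() + " " + a.name()).toLowerCase();
                boolean coversAll = keywords.stream().allMatch(text::contains);
                if (coversAll) {
                    System.out.println("Result: " + a.name() + " -> " + p.title());
                }
            }
        }
    }
}
```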
Contributors: Liu, Ziyang (Author) / Chen, Yi (Thesis advisor) / Candan, Kasim S (Committee member) / Davulcu, Hasan (Committee member) / Jagadish, H V (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
The focus of this study was the first Serbian opera, Na Uranku (At Dawn), written by Stanislav Binički (1872-1942) and first performed in 1903 at the National Theatre in Belgrade. The project had two objectives: (1) a live concert performance of the opera, which produced an audio recording included as an appendix; and (2) an accompanying document containing a history and an analysis of the work. While Binički's opera is recognized as an extraordinary artistic achievement that enriched Serbian music with a new genre, little had previously been written about either the composer or the work. At Dawn is a romantic opera in the verismo tradition with national elements. The significance of this opera lies not only in its artistic expression but also in how it helped the music of Serbia evolve. Early opera productions in Serbia in the mid-nineteenth to early twentieth century could not draw on the wealth of history found in the rich operatic oeuvre of Western Europe and Russia. Conditions for performance were likewise unsatisfactory, as there were no professional orchestras or singers, and audiences were not accustomed to this type of art form. The opera served as an educational instrument for the audience, not only introducing them to a different type of music but also developing their national consciousness. Binički's opera was a foundation on which later generations of composers built. The artistic value of the opera is emphasized: its musical language assimilates various influences from Western Europe and Russia, properly incorporated into a Serbian musical core. Audience reaction is also discussed, a positive affirmation that Binički was moving in the right direction in establishing a path for the further development of Serbian musical culture. A synopsis of the work as well as the requisite performing forces is also included.
Contributors: Minov, Jana (Author) / Russell, Timothy (Thesis advisor) / Levy, Benjamin (Committee member) / Schildkret, David (Committee member) / Rogers, Rodney (Committee member) / Reber, William (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
Delirium is a piece for large wind ensemble that synthesizes compositional techniques to generate unique juxtapositions of contrasting musical elements. The piece is about 8:30 in duration and uses the full complement of winds, brass, and percussion. Although the composition begins tonally, chromatic alterations gradually shift the melodic content outside of the tonal center. In addition to changes in the melody, octatonic, chromatic, and synthetic scales and quartal and quintal harmonies are progressively introduced throughout the piece to add color and create dissonance. Delirium contains four primary sections that are all related by chromatic mediant. The subdivisions of the first part create abrupt transitions between contrasting material, evocative of the symptoms of delirium. As each sub-section progresses, the A minor tonality of the opening gradually gives way to increased chromaticism and dissonance. The next area transitions to C minor and begins to feature octatonic scales, secundal harmonies, and chromatic flourishes more prominently. The full sound of the ensemble then drops to solo instruments in the third section, now in G# minor, where the elements of the previous section are built upon with the addition of synthetic scales and quartal harmonies. The last division, before the recapitulation of the opening material, provides a drastic change in atmosphere as the chromatic elements are removed and the tense sound of the quartal harmonies is replaced with quintal sonorities and a more tonal melody. The tonality of this final section is used to return to the opening material. After an incomplete recapitulation, the descending motive used throughout the piece, which can be found in measure 61 in the flutes, is inverted and layered in minor thirds. This inverted figure builds to the same sonority found in measure 138 before ending on an F# chord, a minor third away from the A minor tonal center of the opening and where the piece seems as though it should end.
Contributors: Bell, Jeremy, 1986- (Composer) / Rogers, Rodney (Thesis advisor) / Oldani, Robert (Committee member) / Levy, Benjamin (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
Genes have widely different pertinences to the etiology and pathology of diseases. Thus, they can be ranked according to their disease significance on a genomic scale, which is the subject of gene prioritization. Given a set of genes known to be related to a disease, it is reasonable to use them as a basis to determine the significance of other candidate genes, which are then ranked based on the association they exhibit with respect to the given set of known genes. Experimental and computational data of various kinds have different reliability and relevance to a disease under study. This work presents a gene prioritization method based on integrated biological networks that incorporates and models the various levels of relevance and reliability of diverse sources. The method is shown to achieve significantly higher performance than two well-known gene prioritization algorithms. Essentially no bias in performance was seen as it was applied to diseases of diverse etiology, e.g., monogenic, polygenic, and cancer. The method was highly stable and robust against significant levels of noise in the data.
Biological networks are often sparse, which can impede the operation of association-based gene prioritization algorithms such as the one presented here from a computational perspective. As a potential approach to overcoming this limitation, we explore the value that transcription factor binding sites can have in elucidating suitable targets. Transcription factors are needed for the expression of most genes, especially in higher organisms, and hence genes can be associated via their genetic regulatory properties. While each transcription factor recognizes specific DNA sequence patterns, such patterns are unknown for many transcription factors, and even those that are known are inconsistently reported in the literature, implying a potentially high level of inaccuracy. We developed computational methods for the prediction and improvement of transcription factor binding patterns. Tests performed on the improvement method by employing synthetic patterns under various conditions showed that the method is very robust and that the patterns produced invariably converge to nearly identical series of patterns. Preliminary tests were conducted to incorporate knowledge from transcription factor binding sites into our network-based model for prioritization, with encouraging results.
To validate these approaches in a disease-specific context, we built a schizophrenia-specific network based on the inferred associations and performed a comprehensive prioritization of human genes with respect to the disease. These results are expected to be validated empirically, but computational validation using known targets is very positive.
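For orientation, the hedged Java sketch below ranks candidate genes with random walk with restart on a toy association network, a standard network-propagation baseline for this kind of prioritization; the dissertation's method, which integrates multiple evidence sources with modeled reliabilities, is more elaborate, and the network and seed genes here are invented.

```java
import java.util.Arrays;

// Hedged sketch of network-based gene prioritization via random walk with restart
// (a common baseline); the dissertation's integrated multi-network method differs.
public class GenePrioritizationSketch {
    public static void main(String[] args) {
        String[] genes = {"G0", "G1", "G2", "G3", "G4"};
        // Toy undirected association network (adjacency matrix), invented values.
        double[][] adj = {
                {0, 1, 1, 0, 0},
                {1, 0, 1, 0, 0},
                {1, 1, 0, 1, 0},
                {0, 0, 1, 0, 1},
                {0, 0, 0, 1, 0}};
        boolean[] seed = {true, false, false, false, false}; // known disease genes

        int n = genes.length;
        double[] restart = new double[n];     // restart distribution over seed genes
        int seeds = 0;
        for (boolean s : seed) if (s) seeds++;
        for (int i = 0; i < n; i++) restart[i] = seed[i] ? 1.0 / seeds : 0.0;
        double[] p = Arrays.copyOf(restart, n);

        // Column-normalize the adjacency matrix into a transition matrix.
        double[][] w = new double[n][n];
        for (int j = 0; j < n; j++) {
            double colSum = 0;
            for (int i = 0; i < n; i++) colSum += adj[i][j];
            for (int i = 0; i < n; i++) w[i][j] = colSum > 0 ? adj[i][j] / colSum : 0;
        }

        double alpha = 0.7; // probability of continuing the walk vs. restarting
        for (int iter = 0; iter < 100; iter++) {
            double[] next = new double[n];
            for (int i = 0; i < n; i++) {
                double walk = 0;
                for (int j = 0; j < n; j++) walk += w[i][j] * p[j];
                next[i] = alpha * walk + (1 - alpha) * restart[i];
            }
            p = next;
        }
        // Higher steady-state probability = stronger association with the seed genes.
        for (int i = 0; i < n; i++) System.out.printf("%s: %.4f%n", genes[i], p[i]);
    }
}
```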
Contributors: Lee, Jang (Author) / Gonzalez, Graciela (Thesis advisor) / Ye, Jieping (Committee member) / Davulcu, Hasan (Committee member) / Gallitano-Mendel, Amelia (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
Demands in file size and transfer rates for consumer-oriented products have escalated in recent times, primarily due to the emergence of high-definition video content. Factor in the consumer desire for convenience, and wireless service becomes the most desired approach for inter-connectivity. Consumers expect wireless service to emulate wired service with little to virtually no difference in quality of service (QoS). The background section of this document examines the QoS requirements for wireless connectivity of high-definition video applications. I then look at proposed solutions at the physical (PHY) and media access control (MAC) layers as well as cross-layer schemes. These schemes are subsequently evaluated in terms of their usefulness in a multi-gigabit, 60 GHz wireless multimedia system targeting the average consumer. It is determined that a substantial gap exists in the published literature pertinent to this application. Specifically, little or no work has been found that shows how an adaptive PHY-MAC cross-layer solution providing real-time compensation for varying channel conditions might actually be implemented, and no work has been found that shows the results of such a model. This research proposes, develops, and implements in MATLAB an alternative cross-layer solution that provides acceptable QoS for multimedia applications. Simulations using actual high-definition video sequences are used to test the proposed solution. Results based on the average PSNR metric show that a quasi-adaptive algorithm provides greater than 7 dB of improvement over a non-adaptive approach, while a fully adaptive algorithm provides over 18 dB of improvement. The fully adaptive implementation has been conclusively shown to be superior to non-adaptive techniques and sufficiently superior even to quasi-adaptive algorithms.
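Since the improvements are reported in terms of average PSNR, the short Java sketch below shows how that metric is computed for 8-bit frames, 10*log10(255^2/MSE) averaged over a sequence; the frame data here is synthetic, and this is only the evaluation metric, not the adaptive cross-layer algorithm itself.

```java
// Minimal sketch of the average-PSNR metric cited in the abstract: PSNR for
// 8-bit frames is 10*log10(255^2 / MSE). The frame data is synthetic; a real
// evaluation would compare decoded video frames against the originals.
public class PsnrSketch {
    static double psnr(int[][] reference, int[][] degraded) {
        double mse = 0;
        int rows = reference.length, cols = reference[0].length;
        for (int r = 0; r < rows; r++)
            for (int c = 0; c < cols; c++) {
                double d = reference[r][c] - degraded[r][c];
                mse += d * d;
            }
        mse /= (double) rows * cols;
        return mse == 0 ? Double.POSITIVE_INFINITY
                        : 10.0 * Math.log10(255.0 * 255.0 / mse);
    }

    public static void main(String[] args) {
        int[][] ref = new int[4][4], deg = new int[4][4];
        for (int r = 0; r < 4; r++)
            for (int c = 0; c < 4; c++) {
                ref[r][c] = 16 * r + c;           // synthetic luminance values
                deg[r][c] = ref[r][c] + (c % 2);  // small distortion
            }
        System.out.printf("PSNR = %.2f dB%n", psnr(ref, deg));
    }
}
```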
Contributors: Bosco, Bruce (Author) / Reisslein, Martin (Thesis advisor) / Tepedelenlioğlu, Cihan (Committee member) / Sen, Arunabha (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
Underwater acoustic communications face significant challenges unprecedented in terrestrial radio communications, including long multipath delay spreads, strong Doppler effects, and stringent bandwidth requirements. Recently, multi-carrier communications based on orthogonal frequency division multiplexing (OFDM) have seen significant growth in underwater acoustic (UWA) communications, thanks to their well-known robustness against severely time-dispersive channels. However, the performance of OFDM systems over UWA channels deteriorates significantly due to severe intercarrier interference (ICI) resulting from rapid time variations of the channel. Motivated by the need for enabling techniques for OFDM over UWA channels, the major contributions of this thesis include: (1) two effective frequency-domain equalizers that provide general means to counteract the ICI; (2) a family of multiple-resampling receiver designs dealing with distortions caused by user- and/or path-specific Doppler scaling effects; (3) the proposal of orthogonal frequency division multiple access (OFDMA) as an effective multiple access scheme for UWA communications; and (4) a capacity evaluation of single-resampling versus multiple-resampling receiver designs. All of the proposed receiver designs have been verified both through simulations and through emulations based on data collected in real-life UWA communications experiments. In particular, the frequency-domain equalizers are shown to be effective with significantly reduced pilot overhead and to offer robustness against Doppler and timing estimation errors. The multiple-resampling designs, in which each branch is tasked with the Doppler distortion of a different path and/or user, overcome the disadvantages of the commonly used single-resampling receivers and yield significant performance gains. Multiple-resampling receivers are also demonstrated to be necessary for UWA OFDMA systems: the design effectively mitigates interuser interference (IUI), opening up the possibility of exploiting advanced user subcarrier assignment schemes. Finally, the benefits of the multiple-resampling receivers are further demonstrated through channel capacity evaluation results.
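As a rough illustration of what one branch of a multiple-resampling front end does, the hedged Java sketch below resamples a received waveform by a branch-specific Doppler scaling factor using linear interpolation, the step that would precede OFDM demodulation; the scaling factor, test signal, and interpolation choice are invented for this sketch, and the receivers developed in the dissertation are considerably more sophisticated.

```java
// Hedged sketch of the core operation in one branch of a multiple-resampling
// receiver: undo a Doppler scaling factor by resampling before the FFT stage.
public class ResamplingBranchSketch {
    // If the received signal is x[k] ~ s(k*(1+a)), map it back to the nominal
    // grid via y[k] ~ x(k/(1+a)) using simple linear interpolation.
    static double[] resample(double[] x, double a) {
        int n = x.length;
        double[] y = new double[n];
        for (int k = 0; k < n; k++) {
            double t = k / (1 + a);          // fractional index into the received samples
            int i = (int) Math.floor(t);
            double frac = t - i;
            if (i + 1 < n) y[k] = (1 - frac) * x[i] + frac * x[i + 1];
            else if (i < n) y[k] = x[i];
        }
        return y;
    }

    public static void main(String[] args) {
        // Synthetic single-tone "received" signal compressed by a = 1e-3.
        int n = 1024;
        double a = 1e-3;
        double[] rx = new double[n];
        for (int k = 0; k < n; k++) rx[k] = Math.cos(2 * Math.PI * 0.05 * k * (1 + a));
        double[] branchOut = resample(rx, a);   // one branch, one scaling factor
        System.out.println("first samples: " + branchOut[0] + ", " + branchOut[1]);
        // A multiple-resampling receiver runs several such branches in parallel,
        // one per estimated path/user Doppler factor, before OFDM demodulation.
    }
}
```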
Contributors: Tu, Kai (Author) / Duman, Tolga M. (Thesis advisor) / Zhang, Junshan (Committee member) / Tepedelenlioğlu, Cihan (Committee member) / Papandreou-Suppappola, Antonia (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
Great advances have been made in the construction of photovoltaic (PV) cells and modules, but array-level management remains much the same as it has been in previous decades. Conventionally, the PV array is connected in a fixed topology, which is not always appropriate in the presence of faults in the array and varying weather conditions. With the introduction of smarter inverters and solar modules, the data obtained from the photovoltaic array can be used to dynamically modify the array topology and improve the array power output. This is especially beneficial when module mismatches such as shading, soiling, and aging occur in the photovoltaic array. This research focuses on the topology optimization of PV arrays under shading conditions using measurements obtained from a PV array set-up. A scheme known as the topology reconfiguration method is proposed to find the optimal array topology for a given weather condition and faulty-module information. Various topologies such as the series-parallel (SP), the total cross-tied (TCT), the bridge-link (BL), and their bypassed versions are considered. The topology reconfiguration method compares the efficiencies of the topologies, evaluates the percentage gain in generated power that would be obtained by reconfiguring the array, and considers other factors to find the optimal topology. This method is employed for various possible shading patterns to predict the best topology. The results demonstrate the benefit of having an electrically reconfigurable array topology. The effects of irradiance and shading on the array performance are also studied. The simulations are carried out using a SPICE simulator, and the simulation results are validated with experimental data provided by the PACECO Company.
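To give a feel for why reconfiguration matters, the deliberately simplified Java sketch below compares series-parallel (SP) and total cross-tied (TCT) wiring under a toy partial-shading pattern, modeling each module only as an irradiance-proportional current source at a fixed voltage; bypass diodes, maximum power point tracking, and the SPICE-level detail used in the thesis are ignored, and all numbers are assumptions.

```java
// Deliberately simplified, hedged comparison of two PV array topologies under
// partial shading. Each module is modeled only by an irradiance-proportional
// current at a fixed module voltage; this is a toy first-order model.
public class PvTopologySketch {
    public static void main(String[] args) {
        // 3 strings x 3 modules; irradiance in suns, two shaded modules (0.3).
        double[][] g = {
                {1.0, 1.0, 0.3},
                {0.3, 1.0, 1.0},
                {1.0, 1.0, 1.0}};
        double iStc = 8.0;   // module current at 1 sun (A), assumed
        double vMod = 30.0;  // module operating voltage (V), assumed

        // Series-parallel: each string's current is limited by its weakest module.
        double pSp = 0;
        for (double[] string : g) {
            double worst = Double.MAX_VALUE;
            for (double gi : string) worst = Math.min(worst, gi);
            pSp += (worst * iStc) * (vMod * string.length);
        }

        // Total cross-tied: modules in the same tier are paralleled, so tier
        // currents add; the series chain of tiers is limited by the weakest tier.
        double worstTier = Double.MAX_VALUE;
        for (int tier = 0; tier < g[0].length; tier++) {
            double tierCurrent = 0;
            for (double[] string : g) tierCurrent += string[tier] * iStc;
            worstTier = Math.min(worstTier, tierCurrent);
        }
        double pTct = worstTier * (vMod * g[0].length);

        System.out.printf("SP  power ~ %.0f W%n", pSp);
        System.out.printf("TCT power ~ %.0f W%n", pTct);
    }
}
```

Under this toy model the cross ties recover roughly 500 W because the two shaded modules fall in different strings and different tiers; other shading patterns favor the layouts differently, which is exactly the pattern dependence a reconfiguration scheme can exploit.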
Contributors: Buddha, Santoshi Tejasri (Author) / Spanias, Andreas (Thesis advisor) / Tepedelenlioğlu, Cihan (Thesis advisor) / Zhang, Junshan (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
Systematic approaches to composition have been used by a variety of composers to control an assortment of musical elements in their pieces. This paper begins with a brief survey of some of the important systematic approaches that composers have employed in their compositions, devoting particular attention to Pierre Boulez's Structures Ia. The purpose of this survey is to examine several systematic approaches to composition by prominent composers and their philosophies in adopting this type of approach. The next section of the paper introduces my own systematic approach to composition: the Take-Away System. The third section provides several musical applications of the system, citing my work Octulus for two pianos as an example. The appendix details theorems and observations within the system for further study.
Contributors: Harbin, Doug (Author) / Hackbarth, Glenn (Thesis advisor) / DeMars, James (Committee member) / Etezady, Roshanne, 1973- (Committee member) / Rockmaker, Jody (Committee member) / Rogers, Rodney (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
During the twentieth century, the dual influence of nationalism and modernism in the eclectic music of Latin America promoted an idiosyncratic style that naturally combined traditional themes, popular genres, and secular music. The saxophone, commonly used as a popular instrument, started to develop a prominent role in Latin American classical music beginning in 1970. The lack of exposure and distribution of the Latin American repertoire has created a general perception that composers are not interested in the instrument and that the Latin American repertoire for classical saxophone is minimal. However, there are more than 1100 works originally written for saxophone in the region, and the number continues to grow. This document, Modern Latin American Repertoire for Classical Saxophone: Recording Project and Performance Guide, establishes and exhibits seven works by seven representative Latin American composers. The recording includes works by Carlos Gonzalo Guzman (Colombia), Ricardo Tacuchian (Brazil), Roque Cordero (Panama), Luis Naón (Argentina), Andrés Alén-Rodriguez (Cuba), Alejandro César Morales (Mexico), and Jose-Luis Maúrtua (Peru), featuring works ranging from solo alto saxophone to alto saxophone with piano, alto saxophone with vibraphone, and tenor saxophone with electronic tape, thus forming an important selection of Latin American repertoire. Complete recorded performances of all seven pieces are supplemented by biographical, historical, and performance practice suggestions. The result is a written and audio guide to some of the most important pieces composed for classical saxophone in Latin America, with an emphasis on fostering interest in, and research into, the composers who have contributed to the development of the instrument in Latin America.
Contributors: Ocampo Cardona, Javier Andrés (Author) / McAllister, Timothy (Thesis advisor) / Spring, Robert (Committee member) / Hill, Gary (Committee member) / Pilafian, Sam (Committee member) / Rogers, Rodney (Committee member) / Gardner, Joshua (Committee member) / Arizona State University (Publisher)
Created: 2011