This collection includes most ASU Theses and Dissertations from 2011 to the present. ASU Theses and Dissertations are available in downloadable PDF format; however, a small percentage of items are under embargo. Information about each dissertation or thesis includes degree information, committee members, an abstract, and supporting data or media.

In addition to the electronic theses in the ASU Digital Repository, ASU Theses and Dissertations can also be found in the ASU Library Catalog.

Dissertations and theses granted by Arizona State University are archived and made available through a joint effort of the ASU Graduate College and the ASU Libraries. For more information or questions about this collection, visit the Digital Repository ETD Library Guide or contact the ASU Graduate College at gradformat@asu.edu.


Description

In recent years, an increase in environmental temperature in urban areas has raised many concerns. These areas are subjected to higher temperatures than the surrounding rural areas. Modification of the land surface and the use of materials such as concrete and asphalt are the main factors influencing the surface energy balance, and therefore the environmental temperature, in urban areas. Engineered materials absorb relatively more solar energy and tend to trap a higher fraction of the incoming solar radiation. They also possess a higher heat storage capacity that allows them to retain heat during the day and then slowly release it back into the atmosphere as the sun goes down. This phenomenon is known as the Urban Heat Island (UHI) effect and causes an increase in urban air temperature. Many researchers believe that albedo is the key pavement property affecting the urban heat island. However, this research has shown that the problem is more complex and that solar reflectivity may not be the only important factor in evaluating the ability of a pavement to mitigate UHI. The main objective of this study was to analyze the influence of pavement materials on the near-surface air temperature. To accomplish this, test sections consisting of Hot Mix Asphalt (HMA), Porous Hot Mix Asphalt (PHMA), Portland Cement Concrete (PCC), Pervious Portland Cement Concrete (PPCC), artificial turf, and landscape gravel were constructed in the Phoenix, Arizona area. Air temperature, albedo, wind speed, solar radiation, and wind direction were recorded, analyzed, and compared above each pavement material type. The results showed that there was no significant difference in the air temperature at 3 feet and above, regardless of the type of pavement. Near-surface pavement temperatures were also measured and modeled. The results indicated that for UHI analysis, it is important to consider the interaction between pavement structure, material properties, and environmental factors. Overall, this study demonstrated the complexity of evaluating pavement structures for UHI mitigation; it provided insight into the effects of material types and properties on surface temperatures and near-surface air temperature.
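
For reference, the surface energy balance the abstract refers to is commonly written as follows (a textbook form from the UHI literature, not an equation quoted from the thesis):

```latex
% Surface energy balance at a pavement surface (textbook form, not
% quoted from the thesis): absorbed shortwave plus net longwave
% radiation is partitioned into sensible, latent, and storage fluxes.
(1 - \alpha)\,S_{\downarrow} + L_{\downarrow} - L_{\uparrow} = H + LE + G
```

Albedo enters only through the absorbed-shortwave term; the storage flux G depends on the heat capacity and conductivity of the pavement layers, which is consistent with the abstract's finding that reflectivity alone does not determine UHI behavior.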

Contributors: Pourshams-Manzouri, Tina (Author) / Kaloush, Kamil (Thesis advisor) / Wang, Zhihua (Thesis advisor) / Zapata, Claudia E. (Committee member) / Mamlouk, Michael (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
Vehicle type choice is a significant determinant of fuel consumption and energy sustainability; larger, heavier vehicles consume more fuel, and expel twice as many pollutants, as their smaller, lighter counterparts. Over the past few decades, vehicle type choice has seen a vast shift, with many households making more trips in larger vehicles with lower fuel economy. During the 1990s, SUVs were the fastest growing segment of the automotive industry, comprising 7% of the total light vehicle market in 1990 and 25% in 2005. More recently, due to rising oil prices, greater environmental awareness, the desire to reduce dependence on foreign oil, and the availability of new vehicle technologies, many households are considering newer vehicles with better fuel economy, such as hybrids and electric vehicles, over the SUVs or low-fuel-economy vehicles they may already own. The goal of this research is to examine how vehicle miles traveled, fuel consumption, and emissions may be reduced through shifts in vehicle type choice behavior. Using the 2009 National Household Travel Survey data, it is possible to develop a model to estimate household travel demand and total fuel consumption. Given a vehicle type choice shift scenario, the model can then be used to calculate the potential fuel consumption savings that would result from such a shift. In this way, fuel consumption reductions can be estimated under a wide variety of scenarios.
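
As a rough sketch of the scenario calculation described above (the fuel economy figures, field names, and shift rule are hypothetical illustrations, not the 2009 NHTS schema or the thesis's estimated model):

```python
# Hypothetical sketch of a vehicle type choice shift scenario.
# Fuel economies, field names, and the shift rule are invented for
# illustration; they are not the NHTS schema or the thesis's model.

GALLONS_PER_MILE = {"suv": 1 / 18.0, "sedan": 1 / 28.0, "hybrid": 1 / 45.0}

households = [
    {"id": 1, "vehicle": "suv", "annual_miles": 14_000},
    {"id": 2, "vehicle": "sedan", "annual_miles": 11_000},
    {"id": 3, "vehicle": "suv", "annual_miles": 9_000},
]

def total_fuel(hh, shift=None):
    """Total annual gallons; `shift` optionally remaps vehicle types."""
    total = 0.0
    for h in hh:
        vtype = shift.get(h["vehicle"], h["vehicle"]) if shift else h["vehicle"]
        total += h["annual_miles"] * GALLONS_PER_MILE[vtype]
    return total

baseline = total_fuel(households)
# Scenario: every SUV household switches to a hybrid.
scenario = total_fuel(households, shift={"suv": "hybrid"})
print(f"baseline: {baseline:.0f} gal, scenario: {scenario:.0f} gal, "
      f"savings: {100 * (1 - scenario / baseline):.1f}%")
```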
Contributors: Christian, Keith (Author) / Pendyala, Ram M. (Thesis advisor) / Chester, Mikhail (Committee member) / Kaloush, Kamil (Committee member) / Ahn, Soyoung (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
Automating aspects of biocuration through biomedical information extraction could significantly impact biomedical research by enabling greater biocuration throughput and improving the feasibility of a wider scope. An important step in biomedical information extraction systems is named entity recognition (NER), where mentions of entities such as proteins and diseases are located within natural-language text and their semantic type is determined. This step is critical for later tasks in an information extraction pipeline, including normalization and relationship extraction. BANNER is a benchmark biomedical NER system using linear-chain conditional random fields and the rich feature set approach. A case study with BANNER locating genes and proteins in biomedical literature is described. The first corpus for disease NER adequate for use as training data is introduced, and employed in a case study of disease NER. The first corpus locating adverse drug reactions (ADRs) in user posts to a health-related social website is also described, and a system to locate and identify ADRs in social media text is created and evaluated. The rich feature set approach to creating NER feature sets is argued to be subject to diminishing returns, implying that additional improvements may require more sophisticated methods for creating the feature set. This motivates the first application of multivariate feature selection with filters and false discovery rate analysis to biomedical NER, resulting in a feature set at least 3 orders of magnitude smaller than the set created by the rich feature set approach. Finally, two novel approaches to NER by modeling the semantics of token sequences are introduced. The first method focuses on the sequence content by using language models to determine whether a sequence resembles entries in a lexicon of entity names or text from an unlabeled corpus more closely. The second method models the distributional semantics of token sequences, determining the similarity between a potential mention and the token sequences from the training data by analyzing the contexts where each sequence appears in a large unlabeled corpus. The second method is shown to improve the performance of BANNER on multiple data sets.
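
For a flavor of the linear-chain CRF, rich-feature-set approach that BANNER embodies, here is a minimal sketch. BANNER itself is a Java system; this toy version uses the sklearn-crfsuite Python package, a deliberately tiny feature set, and invented training data purely for illustration:

```python
# Minimal linear-chain CRF NER sketch (illustrative only; BANNER is a
# Java system with a far richer feature set). Requires sklearn-crfsuite.
import sklearn_crfsuite

def token_features(sent, i):
    """A small slice of a 'rich feature set': identity, shape, affixes, context."""
    w = sent[i]
    return {
        "lower": w.lower(),
        "suffix3": w[-3:],
        "is_title": w.istitle(),
        "has_digit": any(c.isdigit() for c in w),
        "prev": sent[i - 1].lower() if i > 0 else "<BOS>",
        "next": sent[i + 1].lower() if i < len(sent) - 1 else "<EOS>",
    }

# Toy training data with BIO labels for disease mentions.
sents = [["Mutations", "cause", "cystic", "fibrosis", "."],
         ["He", "studied", "breast", "cancer", "cells", "."]]
labels = [["O", "O", "B-DISEASE", "I-DISEASE", "O"],
          ["O", "O", "B-DISEASE", "I-DISEASE", "O", "O"]]

X = [[token_features(s, i) for i in range(len(s))] for s in sents]
crf = sklearn_crfsuite.CRF(algorithm="lbfgs", c1=0.1, c2=0.1, max_iterations=50)
crf.fit(X, labels)

test = ["Treatment", "of", "lung", "cancer", "."]
print(crf.predict([[token_features(test, i) for i in range(len(test))]]))
```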
Contributors: Leaman, James Robert (Author) / Gonzalez, Graciela (Thesis advisor) / Baral, Chitta (Thesis advisor) / Cohen, Kevin B (Committee member) / Liu, Huan (Committee member) / Ye, Jieping (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
Heating of asphalt during production and construction causes the volatilization and oxidation of binders used in mixes. Volatilization and oxidation cause degradation of asphalt pavements by increasing the stiffness of the binders, increasing susceptibility to cracking and negatively affecting the functional and structural performance of the pavements. Degradation of asphalt binders by volatilization and oxidation due to high production temperatures occurs during the early stages of pavement life and is known as Short Term Aging (STA). Elevated temperatures, and increased exposure time to them, cause increased STA of asphalt. The objective of this research was to investigate how elevated mixing temperatures and exposure time to elevated temperatures affect aging and stiffening of binders, thus influencing the properties of asphalt mixtures. The study was conducted in two stages. The first stage evaluated the STA effect on asphalt binders. It involved aging two Performance Graded (PG) virgin asphalt binders, PG 76-16 and PG 64-22, at two different temperatures and durations, then measuring their viscosities. The second stage evaluated the effects of elevated STA temperature and time on the properties of asphalt mixtures. It involved short-term aging of laboratory-produced mixtures with the PG 64-22 binder at mixing temperatures elevated 25°F above standard practice and at STA times 2 and 4 hours longer than standard practice, followed by compaction in a gyratory compactor. Dynamic modulus (E*) and Indirect Tensile Strength (IDT) were measured for the aged mixtures at each temperature and duration to determine the effect of different aging times and temperatures on the stiffness and fatigue properties of the aged asphalt mixtures. The binder test results showed increased viscosity in all cases, with the highest increase resulting from increased aging time. The results also indicated that the PG 64-22 binder was more susceptible to elevated STA temperature and extended time than the PG 76-16 binder. The asphalt mixture test results confirmed the expected outcome that increasing the STA and mixing temperature by 25°F alters the stiffness of mixtures. Significant change in the dynamic modulus mostly occurred at the four-hour increase in STA time, regardless of temperature.
Contributors: Lolly, Rubben (Author) / Kaloush, Kamil (Thesis advisor) / Bearup, Wylie (Committee member) / Zapata, Claudia (Committee member) / Mamlouk, Michael (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
Biological systems are complex in many dimensions, as endless transportation and communication networks all function simultaneously. Our ability to intervene in both healthy and diseased systems is tied directly to our ability to understand and model core functionality. Progress in increasingly accurate and thorough high-throughput measurement technologies has provided a deluge of data from which we may attempt to infer a representation of the true genetic regulatory system. A gene regulatory network model, if accurate enough, may allow us to perform hypothesis testing in the form of computational experiments. Of great importance to modeling accuracy is the acknowledgment of biological contexts within the models, i.e., recognizing the heterogeneous nature of the true biological system and the data it generates. This marriage of engineering, mathematics, and computer science with systems biology creates a cycle of progress between computer simulation and lab experimentation, rapidly translating interventions and treatments for patients from the bench to the bedside. This dissertation first discusses the landscape of biological system modeling, explores the identification of targets for intervention in Boolean network models of biological interactions, and explores context specificity both in new graphical depictions of models embodying context-specific genomic regulation and in novel analysis approaches designed to reveal embedded contextual information. Overall, the dissertation explores a spectrum of biological modeling with a view toward therapeutic intervention, with both formal and informal notions of biological context, in such a way that will enable future work to have an even greater impact in terms of direct patient benefit on an individualized level.
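
As a toy illustration of the Boolean network modeling and intervention analysis mentioned above (the three-gene network and its update rules are invented for this example, not taken from the dissertation):

```python
# Toy Boolean network: synchronous updates over an invented 3-gene
# network. The wiring is hypothetical; it only illustrates the idea.
RULES = {
    "A": lambda s: not s["C"],          # A is repressed by C
    "B": lambda s: s["A"],              # B is activated by A
    "C": lambda s: s["A"] and s["B"],   # C needs both A and B
}

def step(state, pinned=None):
    """One synchronous update; `pinned` fixes genes to model an intervention."""
    nxt = {g: bool(f(state)) for g, f in RULES.items()}
    if pinned:
        nxt.update(pinned)
    return nxt

def attractor(state, pinned=None):
    """Iterate until a previously seen state recurs; return the cycle."""
    key = lambda s: tuple(s[g] for g in RULES)
    seen, trace = {}, []
    while key(state) not in seen:
        seen[key(state)] = len(trace)
        trace.append(state)
        state = step(state, pinned)
    return trace[seen[key(state)]:]

start = {"A": True, "B": False, "C": False}
print("wild type attractor:", attractor(start))
print("C pinned off:       ", attractor(start, pinned={"C": False}))
```

Pinning a gene changes which attractor the system settles into, which is the basic sense in which such models suggest candidate intervention targets.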
Contributors: Verdicchio, Michael (Author) / Kim, Seungchan (Thesis advisor) / Baral, Chitta (Committee member) / Stolovitzky, Gustavo (Committee member) / Collofello, James (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
Currently, to interact with computer-based systems one needs to learn the specific interface language of that system. In most cases, interaction would be much easier if it could be done in natural language. That requires a module which understands natural language and automatically translates it to the interface language of the system. The NL2KR (Natural Language to Knowledge Representation) v.1 system is a prototype of such a system. It is a learning-based system that learns new meanings of words in terms of lambda-calculus formulas, given an initial lexicon of some words and their meanings and a training corpus of sentences with their translations. As part of this thesis, we take the prototype NL2KR v.1 system and enhance various components of it to make it usable for somewhat substantial and useful interface languages. We revamped the lexicon-learning components, the Inverse-lambda and Generalization modules, and redesigned the lexicon-learning algorithm which uses these components to learn new meanings of words. Similarly, we re-developed the system's built-in parser in Answer Set Programming (ASP) and also integrated an external parser with the system. Apart from this, we added new features such as various system configurations and a memory cache in the learning component of the NL2KR system. These enhancements helped in learning more meanings of words, boosted the performance of the system by reducing computation time by a factor of 8, and improved its usability. We evaluated the NL2KR system on the iRODS domain. iRODS is a rule-oriented data system which helps in managing large sets of computer files using policies. It provides a rule-oriented interface language whose syntactic structure is like that of any procedural programming language (e.g., C). However, direct translation of natural language (NL) to this interface language is difficult. So, for automatic translation of NL to this language, we define a simple intermediate Policy Declarative Language (IPDL) to represent the knowledge in the policies, which can then be directly translated to iRODS rules. We develop a corpus of 100 policy statements and manually translate them to IPDL. This corpus is then used for the evaluation of the NL2KR system, on which we performed 10-fold cross validation. Furthermore, using this corpus, we illustrate how the different components of our NL2KR system work.
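
As a toy illustration of representing word meanings as lambda-calculus formulas and composing them by application (the lexicon entries are invented; NL2KR's actual lexicon, Inverse-lambda, and Generalization machinery are considerably more involved):

```python
# Toy lambda-calculus lexicon: word meanings as Python lambdas, composed
# by function application (beta reduction). Entries are invented examples,
# not NL2KR's learned lexicon.
lexicon = {
    # "every" : \P. \Q. forall x. P(x) -> Q(x)
    "every": lambda P: lambda Q: f"forall x. ({P('x')} -> {Q('x')})",
    "file":  lambda x: f"file({x})",
    "is":    lambda P: P,                 # identity on predicates
    "large": lambda x: f"large({x})",
}

# Compose meanings for "every file is large" in the order dictated by
# the (assumed) parse: (every file) (is large).
subject = lexicon["every"](lexicon["file"])
predicate = lexicon["is"](lexicon["large"])
print(subject(predicate))
# -> forall x. (file(x) -> large(x))
```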
Contributors: Kumbhare, Kanchan Ravishankar (Author) / Baral, Chitta (Thesis advisor) / Ye, Jieping (Committee member) / Li, Baoxin (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
Linear Temporal Logic (LTL) is gaining popularity as a high-level specification language for robot motion planning due to its expressive power and the scalability of LTL control synthesis algorithms. This formalism, however, requires expert knowledge, which makes it inaccessible to non-expert users. This thesis introduces a graphical specification environment for creating high-level motion plans to control robots in the field by converting a visual representation of the motion/task plan into an LTL specification. The visual interface is built on the Android tablet platform and provides functionality to create task plans through a set of well-defined gestures and on-screen controls. It uses the notion of waypoints to quickly and efficiently describe the motion plan, and it enables a variety of complex LTL specifications to be described succinctly and intuitively by the user without requiring knowledge or understanding of LTL. This opens avenues for its use by personnel in military operations, warehouse management, and search and rescue missions. This thesis describes the construction of LTL specifications for various robot navigation scenarios using the visual interface developed, and it leverages existing LTL-based motion planners to carry out the task plan on a robot.
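
As a sketch of the kind of translation such an interface performs, the following uses a standard nested-eventually encoding of ordered waypoint visits (the function is illustrative, not the thesis's actual generator):

```python
# Illustrative waypoint-to-LTL translation: visit waypoints in order
# while always avoiding an obstacle region. A standard nested-eventually
# encoding, not necessarily the exact pattern the thesis's tool emits.
def waypoints_to_ltl(waypoints, avoid=None):
    # Build F(p1 && F(p2 && F(p3))) from the ordered waypoint list.
    formula = waypoints[-1]
    for wp in reversed(waypoints[:-1]):
        formula = f"{wp} && F ({formula})"
    formula = f"F ({formula})"
    if avoid:
        # Safety requirement: globally avoid the named region.
        formula = f"({formula}) && G (!{avoid})"
    return formula

print(waypoints_to_ltl(["p1", "p2", "p3"], avoid="obstacle"))
# -> (F (p1 && F (p2 && F (p3)))) && G (!obstacle)
```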
Contributors: Srinivas, Shashank (Author) / Fainekos, Georgios (Thesis advisor) / Baral, Chitta (Committee member) / Burleson, Winslow (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
Answer Set Programming (ASP) is one of the most prominent and successful knowledge representation paradigms. The success of ASP is due to its expressive non-monotonic modeling language and its efficient computational methods, which originate from work on building propositional satisfiability solvers. The wide adoption of ASP has motivated several extensions to its modeling language in order to enhance expressivity, such as incorporating aggregates and interfaces with ontologies. Also, in order to overcome the grounding bottleneck of computation in ASP, there is increasing interest in integrating ASP with other computing paradigms, such as Constraint Programming (CP) and Satisfiability Modulo Theories (SMT). Due to the non-monotonic nature of the ASP semantics, such enhancements turned out to be non-trivial, and the existing extensions are not fully satisfactory. We observe that one main reason for the difficulties is rooted in the propositional semantics of ASP, which is limited in handling first-order constructs (such as aggregates and ontologies) and functions (such as constraint variables in CP and SMT) in natural ways. This dissertation presents a unifying view of these extensions by viewing them as instances of formulas with generalized quantifiers and intensional functions. We extend the first-order stable model semantics of Ferraris, Lee, and Lifschitz to allow generalized quantifiers, which cover aggregates, DL-atoms, constraints, and SMT theory atoms as special cases. Using this unifying framework, we study and relate different extensions of ASP. We also present a tight integration of ASP with SMT, based on which we enhance the action language C+ to handle reasoning about continuous changes. Our framework yields a systematic approach to studying and extending non-monotonic languages.
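
For a small taste of the ASP constructs discussed above, here is a made-up program with a choice rule and a #count aggregate, run through the clingo solver's Python API (usage as documented for clingo; the program itself is an invented example):

```python
# Tiny ASP example with a choice rule and a #count aggregate, solved via
# clingo's Python API (a made-up program; API usage per the clingo docs).
import clingo

program = """
item(a). item(b). item(c).
{ pick(X) : item(X) }.           % choice rule: any subset of items
:- #count { X : pick(X) } != 2.  % aggregate: keep exactly two picks
"""

ctl = clingo.Control(["0"])       # "0" = enumerate all answer sets
ctl.add("base", [], program)
ctl.ground([("base", [])])
ctl.solve(on_model=lambda m: print("answer set:", m))
# Prints the three 2-element subsets of {a, b, c} as pick/1 atoms.
```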
Contributors: Meng, Yunsong (Author) / Lee, Joohyung (Thesis advisor) / Ahn, Gail-Joon (Committee member) / Baral, Chitta (Committee member) / Fainekos, Georgios (Committee member) / Lifschitz, Vladimir (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
Laboratory assessment of crack resistance and propagation in asphalt concrete is a difficult task that challenges researchers and engineers. Several fracture-mechanics-based laboratory tests currently exist; however, these tests and their analysis methods rely on elastic behavior assumptions and do not consider the time-dependent nature of asphalt concrete. The C* Line Integral test has shown promise in capturing crack resistance and propagation within asphalt concrete. In addition, the fracture-mechanics-based C* parameter accounts for the time-dependent creep behavior of the material. However, previous research was limited, lacked a standardized test procedure, and did not fully present detailed data analysis methods. This dissertation describes the development and refinement of the C* Fracture Test (CFT), based on concepts of the C* line integral test. The CFT is a promising test for assessing crack propagation and fracture resistance, especially in modified mixtures. A detailed CFT test protocol was developed based on a laboratory study of different specimen sizes and test conditions. CFT numerical simulations agreed with laboratory results and indicated that the maximum horizontal tensile stress (Mode I) occurs at the crack tip but diminishes at longer crack lengths as shear stress (Mode II) becomes present. Using CFT test results and the principles of time-temperature superposition, a crack growth rate master curve was successfully developed to describe crack growth over a range of test temperatures. This master curve can be applied to pavement design and analysis to describe crack propagation as a function of traffic conditions and pavement temperatures. Several plant mixtures were subjected to the CFT, and results showed differences in resistance to crack propagation, especially when comparing an asphalt rubber mixture to a conventional one. Results indicated that crack propagation is ideally captured within a given range of dynamic modulus values. Crack growth rate and C* prediction models were successfully developed for all unmodified mixtures in the CFT database; these models can be used to predict creep crack propagation and the C* parameter when laboratory testing is not feasible. Finally, a conceptual approach to incorporating the crack growth rate and the C* parameter into pavement design and analysis is presented.
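
For reference, the standard creep fracture mechanics relations behind this kind of analysis are shown below (textbook forms; the thesis's fitted models and coefficients may differ):

```latex
% Creep crack growth rate correlates with the C* integral through a
% power law (textbook form; A and n are material-dependent constants):
\frac{da}{dt} = A \, (C^{*})^{n}
% Master-curve construction applies a temperature shift factor a_T to
% time, collapsing data at temperature T onto the reference-temperature
% curve via reduced time:
t_{r} = \frac{t}{a_T(T)}
```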
Contributors: Stempihar, Jeffrey (Author) / Kaloush, Kamil (Thesis advisor) / Witczak, Matthew (Committee member) / Mamlouk, Michael (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
In this dissertation I develop a deep theory of temporal planning well-suited to analyzing, understanding, and improving state-of-the-art implementations (as of 2012). At face value the work is strictly theoretical; nonetheless its impact is entirely real and practical. The easiest portion of that impact to highlight concerns the notable improvements to the format of the temporal fragment of the International Planning Competitions (IPCs). In particular, the theory I expound upon here is the primary cause of, and justification for, the altered (i) selection of benchmark problems and (ii) notion of "winning temporal planner". For higher-level motivation: robotics, web service composition, industrial manufacturing, business process management, cybersecurity, space exploration, deep ocean exploration, and logistics all benefit from applying domain-independent automated planning techniques. Naturally, actually carrying out such case studies has much to offer; for example, we may extract the lesson that reasoning carefully about deadlines is crucial to planning in practice. More generally, effectively automating temporal planning in particular is well-motivated by applications. Entirely abstractly, the aim is to improve the theory of automated temporal planning by distilling from its practice. My thesis is that the key feature of computational interest is concurrency. In support, I demonstrate by way of compilation methods, worst-case counting arguments, and analysis of algorithmic properties such as completeness that the more immediately pressing computational obstacles (facing would-be temporal generalizations of classical planning systems) can be dealt with in a theoretically efficient manner. More precisely, the technical contribution is to demonstrate that the computationally significant obstacle remaining for automated temporal planning is concurrency itself.
Contributors: Cushing, William Albemarle (Author) / Kambhampati, Subbarao (Thesis advisor) / Weld, Daniel S. (Committee member) / Smith, David E. (Committee member) / Baral, Chitta (Committee member) / Davalcu, Hasan (Committee member) / Arizona State University (Publisher)
Created: 2012