Matching Items (1,853)

Description
Linear Temporal Logic (LTL) is gaining popularity as a high-level specification language for robot motion planning due to its expressive power and the scalability of LTL control synthesis algorithms. This formalism, however, requires expert knowledge, which makes it inaccessible to non-expert users. This thesis introduces a graphical specification environment for creating high-level motion plans to control robots in the field by converting a visual representation of the motion/task plan into an LTL specification. The visual interface is built on the Android tablet platform and provides functionality to create task plans through a set of well-defined gestures and on-screen controls. It uses the notion of waypoints to quickly and efficiently describe the motion plan and enables a variety of complex LTL specifications to be described succinctly and intuitively by the user without requiring any knowledge or understanding of LTL. Thus, it opens avenues for use by personnel in military, warehouse-management, and search-and-rescue missions. This thesis describes the construction of LTL specifications for various robot-navigation scenarios using the visual interface developed, and it leverages existing LTL-based motion planners to have a robot carry out the task plan.
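As a hedged illustration of how a drawn waypoint plan can map to a formula (a generic sequencing-with-safety pattern, not a specification taken from this thesis), visiting regions \pi_1, \pi_2, and \pi_3 in order while always avoiding an obstacle region can be written in LTL as

    \varphi = \Diamond\bigl(\pi_1 \wedge \Diamond(\pi_2 \wedge \Diamond\,\pi_3)\bigr) \wedge \Box\,\lnot\pi_{\mathrm{obs}}

where \Diamond ("eventually") encodes the ordered visits and \Box ("always") encodes the safety constraint; a graphical front end only has to emit formulas of this shape from the gestures the user draws.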
Contributors: Srinivas, Shashank (Author) / Fainekos, Georgios (Thesis advisor) / Baral, Chitta (Committee member) / Burleson, Winslow (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
The lack of substantive, multi-dimensional perspectives on civic space planning and design has undermined the potential role of these valuable social and ecological amenities in advancing urban sustainability goals. Responding to these deficiencies, this dissertation utilized mixed quantitative and qualitative methods and synthesized multiple social and natural science perspectives to inform the development of progressive civic space planning and design, theory, and public policy aimed at improving the social, economic, and environmental health of cities. Using Phoenix, Arizona as a case study, the analysis was tailored to arid cities, yet the products and findings are flexible enough to be geographically customized to the social, environmental, built, and public policy goals of other urbanized regions. The dissertation is organized into three articles. The first paper applies geospatial and statistical methods to analyze and classify urban parks in Phoenix based on multiple social, ecological, and built criteria, including land use/land cover, 'greenness,' and site amenities, as well as the socioeconomic and built characteristics of park neighborhoods. The second article uses spatial empirical analysis to rezone the City of Phoenix following a transect-based form-based code. The current park system was then assessed within this framework, and recommendations are presented to inform the planning and design of civic spaces sensitive to their social and built context. The final paper culminates in the development of a planning tool and site design guidelines for civic space planning and design across the urban-to-natural gradient, augmented with multiple ecosystem service considerations and tailored to desert cities.
Contributors: Ibes, Dorothy (Author) / Talen, Emily (Thesis advisor) / Boone, Christopher (Committee member) / Crewe, Katherine (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
Despite the arid climate of Maricopa County, Arizona, vector-borne diseases have presented significant health challenges to the residents and public health professionals of Maricopa County in the past, and will continue to do so in the foreseeable future. Currently, West Nile virus is the only mosquito-transmitted disease actively, and natively, transmitted throughout the state of Arizona. In an effort to gain a more complete understanding of the transmission dynamics of West Nile virus, this thesis examines human, vector, and environment interactions as they exist within Maricopa County. Through ethnographic and geographic information systems research methods, this thesis identifies 1) the individual factors that influence residents' knowledge and behaviors regarding mosquitoes, 2) the individual and regional factors that influence residents' knowledge of mosquito ecology and the spatial distribution of local mosquito populations, and 3) the environmental, demographic, and socioeconomic factors that influence mosquito abundance within Maricopa County. By identifying the factors that influence human-vector and vector-environment interactions, this thesis may inform current and future educational and mosquito-control efforts throughout Maricopa County.
Contributors: Kunzweiler, Colin (Author) / Boone, Christopher (Thesis advisor) / Wutich, Amber (Committee member) / Brewis-Slade, Alexandra (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
Despite significant advances in digital pathology and automation sciences, current diagnostic practice for cancer detection primarily relies on a qualitative manual inspection of tissue architecture and cell and nuclear morphology in stained biopsies using low-magnification, two-dimensional (2D) brightfield microscopy. The efficacy of this process is limited by inter-operator variations in sample preparation and imaging, and by inter-observer variability in assessment. Over the past few decades, the predictive value of quantitative morphology measurements derived from computerized analysis of micrographs has been compromised by the inability of 2D microscopy to capture information in the third dimension, and by the anisotropic spatial resolution inherent to conventional microscopy techniques that generate volumetric images by stacking 2D optical sections to approximate 3D. To gain insight into the 3D nature of cells, this dissertation explores the application of single-cell optical computed tomography (optical cell CT), a promising new 3D tomographic imaging technique that uses visible light absorption to image stained cells individually with sub-micron, isotropic spatial resolution. This dissertation provides a scalable analytical framework for fully automated 3D morphological analysis of transmission-mode optical cell CT images of hematoxylin-stained cells. The developed framework performs rapid and accurate quantification of 3D cell and nuclear morphology, facilitates assessment of morphological heterogeneity, and generates shape- and texture-based biosignatures predictive of the cell state. Custom 3D image segmentation methods were developed to precisely delineate volumes of interest (VOIs) from reconstructed cell images. Comparison with user-defined ground truth assessments yielded an average agreement (Dice coefficient) of 94% for the cell and its nucleus. Seventy-nine biologically relevant morphological descriptors (features) were computed from the segmented VOIs, and statistical classification methods were implemented to determine the subset of features that best predicted cell health. The efficacy of our proposed framework was demonstrated on an in vitro model of multistep carcinogenesis in human Barrett's esophagus (BE), and classifier performance using our 3D morphometric analysis was compared against computerized analysis of 2D image slices that reflected conventional cytological observation. Our results enable sensitive and specific nuclear grade classification for early cancer diagnosis and underline the value of the approach as an objective adjunctive tool to better understand morphological changes associated with malignant transformation.
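As a hedged sketch (illustrative code, not the dissertation's implementation; the array shapes and toy volumes are assumptions), the Dice agreement between an automated 3D segmentation mask and a ground-truth mask can be computed as:

    import numpy as np

    def dice_coefficient(mask_a, mask_b):
        # Dice = 2 * |A intersect B| / (|A| + |B|) for two same-shape binary volumes.
        a = np.asarray(mask_a, dtype=bool)
        b = np.asarray(mask_b, dtype=bool)
        intersection = np.logical_and(a, b).sum()
        total = a.sum() + b.sum()
        return 2.0 * intersection / total if total > 0 else 1.0

    # Toy example: two overlapping cubes inside a 64x64x64 volume.
    auto_seg = np.zeros((64, 64, 64), dtype=bool)
    auto_seg[20:40, 20:40, 20:40] = True
    truth = np.zeros((64, 64, 64), dtype=bool)
    truth[22:42, 20:40, 20:40] = True
    print(dice_coefficient(auto_seg, truth))   # 0.9 for this toy overlap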
Contributors: Nandakumar, Vivek (Author) / Meldrum, Deirdre R (Thesis advisor) / Nelson, Alan C. (Committee member) / Karam, Lina J (Committee member) / Ye, Jieping (Committee member) / Johnson, Roger H (Committee member) / Bussey, Kimberly J (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
We solve the problem of activity verification in the context of sustainability. Activity verification is the process of proving user assertions pertaining to a certain activity performed by the user. Our motivation lies in incentivizing users to engage in sustainable activities like taking public transport or recycling. Such incentivization schemes require the system to verify the claim made by the user. The system verifies these claims by analyzing the supporting evidence captured by the user while performing the activity. The proliferation of portable smartphones in the past few years has provided us with a ubiquitous and relatively cheap platform, with multiple sensors such as an accelerometer, gyroscope, and microphone, to capture this evidence data in situ. In this research, we investigate supervised and semi-supervised learning techniques for activity verification. Both techniques make use of a data set constructed from the evidence submitted by the user. Supervised learning uses annotated evidence data to build a function that predicts the class labels of unlabeled data points. The evidence data captured can be either unimodal or multimodal in nature. We use accelerometer data as evidence for transportation-mode verification and image data as evidence for recycling verification. After training the system, we achieve a maximum accuracy of 94% when classifying the transport mode and 81% when detecting recycling activity. In the case of recycling verification, we could improve the classification accuracy by asking the user for more evidence. We present techniques for asking the user for the next best piece of evidence, i.e., the piece that maximizes the probability of correct classification. Using these techniques for detecting recycling activity, the accuracy increases to 93%. The major disadvantage of supervised models is that they require extensive annotated training data, which is expensive to collect. Due to the limited training data, we look at graph-based inductive semi-supervised learning methods to propagate the labels among the unlabeled samples. In the semi-supervised approach, we represent each instance in the data set as a node in a graph. Since the graph is complete, every pair of nodes is connected by an edge whose weight represents the similarity between the corresponding points. We propagate the labels in this graph based on the proximity of the data points to the labeled nodes. We estimate the performance of these algorithms by measuring how close the probability distribution of the data after label propagation is to the probability distribution of the ground-truth data. Since labeling has a cost associated with it, in this thesis we propose two algorithms that help in selecting the minimum number of labeled points needed to propagate the labels accurately. Our proposed algorithm achieves up to a 73% increase in performance compared to the baseline algorithm.
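A minimal sketch of graph-based label propagation on a complete similarity graph, in the spirit of the semi-supervised approach described above (the RBF similarity, the clamping scheme, and all names here are illustrative assumptions; the thesis's own algorithms, features, and label-selection strategies are not reproduced):

    import numpy as np

    def label_propagation(X, y, sigma=1.0, n_iter=100):
        # X: (n, d) feature vectors; y: (n,) integer labels, with -1 marking unlabeled points.
        # Complete graph: RBF similarity between every pair of instances.
        dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
        W = np.exp(-dists / (2.0 * sigma ** 2))
        np.fill_diagonal(W, 0.0)
        P = W / W.sum(axis=1, keepdims=True)       # row-normalized transition matrix
        F = np.zeros((X.shape[0], int(y.max()) + 1))
        labeled = y >= 0
        F[labeled, y[labeled]] = 1.0
        for _ in range(n_iter):
            F = P @ F                              # diffuse label mass along weighted edges
            F[labeled] = 0.0                       # re-clamp the labeled nodes each step
            F[labeled, y[labeled]] = 1.0
        return F.argmax(axis=1)

    # Toy usage: two labeled points, three unlabeled points whose labels are inferred.
    X = np.array([[0.0], [0.2], [0.1], [5.0], [5.1]])
    y = np.array([0, -1, -1, 1, -1])
    print(label_propagation(X, y))                 # expected: [0 0 0 1 1]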
Contributors: Desai, Vaishnav (Author) / Sundaram, Hari (Thesis advisor) / Li, Baoxin (Thesis advisor) / Turaga, Pavan (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
Energy is a central concern of sustainability because how we produce and consume energy affects society, the economy, and the environment. Sustainability scientists are interested in energy transitions away from fossil fuels because fossil fuels are nonrenewable, increasingly expensive, have adverse health effects, and may be the main driver of climate change. They see an opportunity for developing countries to avoid the negative consequences of fossil-fuel-based energy systems, and also to increase resilience, by leapfrogging over the centralized energy-grid systems that dominate the developed world. Energy transitions pose both challenges and opportunities. Obstacles to transitions include 1) an existing, centralized, complex energy-grid system whose function is invisible to most users, 2) coordination and collective-action problems that are path dependent, and 3) difficulty in scaling up renewable energy technologies. Because energy transitions rely on technological and social innovations, I am interested in how institutional factors can be leveraged to surmount these obstacles. The overarching question that underlies my research is: What constellation of institutional, biophysical, and social factors is essential for an energy transition? My objective is to derive a set of "design principles" for energy transitions, which I term institutional drivers, analogous to Ostrom's institutional design principles. My dissertation research will analyze energy transitions using two approaches: applying the Institutional Analysis and Development Framework and conducting a comparative case study analysis comprising both primary and secondary sources. This dissertation includes: 1) an analysis of the world's energy portfolio; 2) a case study analysis of five countries; 3) a description of the institutional factors likely to promote a transition to renewable-energy use; and 4) an in-depth case study of Thailand's progress in replacing nonrenewable energy sources with renewable energy sources. My research will contribute to our understanding of how energy transitions at different scales can be accomplished in developing countries and what it takes for innovation to spread in a society.
Contributors: Koster, Auriane Magdalena (Author) / Anderies, John M (Thesis advisor) / Aggarwal, Rimjhim (Committee member) / Van Der Leeuw, Sander (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
Answer Set Programming (ASP) is one of the most prominent and successful knowledge representation paradigms. The success of ASP is due to its expressive non-monotonic modeling language and its efficient computational methods, which originate from the building of propositional satisfiability solvers. The wide adoption of ASP has motivated several extensions to its modeling language in order to enhance expressivity, such as incorporating aggregates and interfaces with ontologies. Also, in order to overcome the grounding bottleneck of computation in ASP, there is increasing interest in integrating ASP with other computing paradigms, such as Constraint Programming (CP) and Satisfiability Modulo Theories (SMT). Due to the non-monotonic nature of the ASP semantics, such enhancements turned out to be non-trivial, and the existing extensions are not fully satisfactory. We observe that one main reason for the difficulties is rooted in the propositional semantics of ASP, which is limited in handling first-order constructs (such as aggregates and ontologies) and functions (such as constraint variables in CP and SMT) in natural ways. This dissertation presents a unifying view of these extensions by treating them as instances of formulas with generalized quantifiers and intensional functions. We extend the first-order stable model semantics by Ferraris, Lee, and Lifschitz to allow generalized quantifiers, which cover aggregates, DL-atoms, constraints, and SMT theory atoms as special cases. Using this unifying framework, we study and relate different extensions of ASP. We also present a tight integration of ASP with SMT, based on which we enhance the action language C+ to handle reasoning about continuous changes. Our framework yields a systematic approach to studying and extending non-monotonic languages.
Contributors: Meng, Yunsong (Author) / Lee, Joohyung (Thesis advisor) / Ahn, Gail-Joon (Committee member) / Baral, Chitta (Committee member) / Fainekos, Georgios (Committee member) / Lifschitz, Vladimir (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
Life Cycle Assessment (LCA) quantifies the environmental impacts of products across raw material extraction, processing, manufacturing, distribution, use, and final disposal. The findings of an LCA can be used to improve industry practices, aid in product development, and guide public policy. Unfortunately, existing approaches to LCA are unreliable in the case of emerging technologies, where data is unavailable and rapid technological advances outstrip environmental knowledge. Previous studies have demonstrated several shortcomings of existing practices, including the masking of environmental impacts, the difficulty of selecting appropriate weight sets for multi-stakeholder problems, and difficulties in exploring variability and uncertainty. In particular, there is an acute need for decision-driven interpretation methods that can guide decision makers toward making balanced, environmentally sound decisions in instances of high uncertainty. We propose the first major methodological innovation in LCA since the early establishment of LCA as the analytical perspective of choice in problems of environmental management: coupling stochastic multi-criteria decision-analytic tools with existing approaches to inventory building and characterization to create a robust approach to comparative technology assessment in the context of high uncertainty, rapid technological change, and evolving stakeholder values. Specifically, this study introduces a novel method, Stochastic Multi-attribute Analysis for Life Cycle Impact Assessment (SMAA-LCIA), that uses internal normalization by means of outranking and exploration of feasible weight spaces.
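A minimal sketch of the stochastic weight-space exploration behind SMAA-style comparison (weighted-sum aggregation is used here purely for brevity; the actual SMAA-LCIA method couples the sampling with outranking and internal normalization, and the scores below are invented):

    import numpy as np

    rng = np.random.default_rng(0)

    def rank_acceptability(scores, n_samples=10_000):
        # scores: (n_alternatives, n_criteria); higher means better on every criterion.
        n_alt, n_crit = scores.shape
        wins = np.zeros(n_alt)
        for _ in range(n_samples):
            w = rng.dirichlet(np.ones(n_crit))     # uniform draw from the feasible weight simplex
            totals = scores @ w                    # aggregate each alternative under this weighting
            wins[np.argmax(totals)] += 1
        return wins / n_samples                    # share of weight space where each alternative ranks first

    # Toy usage: three technologies scored on four normalized criteria.
    scores = np.array([[0.9, 0.2, 0.5, 0.7],
                       [0.4, 0.8, 0.6, 0.6],
                       [0.5, 0.5, 0.9, 0.3]])
    print(rank_acceptability(scores))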
Contributors: Prado, Valentina (Author) / Seager, Thomas P (Thesis advisor) / Landis, Amy E. (Committee member) / Chester, Mikhail (Committee member) / White, Philip (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
This study investigates how well prominent behavioral theories from social psychology explain green purchasing behavior (GPB). I assess three prominent theories in terms of their suitability for GPB research, their attractiveness to GPB empiricists, and the strength of their empirical evidence when applied to GPB. First, a qualitative assessment of the Theory of Planned Behavior (TPB), Norm Activation Theory (NAT), and Value-Belief-Norm Theory (VBN) is conducted to evaluate a) how well the phenomenon and concepts in each theory match the characteristics of pro-environmental behavior and b) how well the assumptions made in each theory match common assumptions made in purchasing theory. Second, a quantitative assessment of these three theories is conducted in which r² values and methodological parameters (e.g., sample size) are collected from a sample of 21 empirical studies on GPB to evaluate the accuracy and generalizability of the empirical evidence. In the qualitative assessment, the results show that each theory has its advantages and disadvantages. The results also provide a theoretically grounded roadmap for modifying each theory to be more suitable for GPB research. In the quantitative assessment, the TPB outperforms the other two theories in every aspect taken into consideration. It proves to 1) create the most accurate models, 2) be supported by the most generalizable empirical evidence, and 3) be the most attractive theory to empiricists. Although the TPB establishes itself as the best foundational theory for an empiricist to start from, it is clear that a more comprehensive model is needed to achieve consistent results and improve our understanding of GPB. NAT and the Theory of Interpersonal Behavior (TIB) offer pathways to extend the TPB. The TIB seems particularly apt for this endeavor, while VBN does not appear to have much to offer. Overall, the TPB has already proven to hold relatively high predictive value. But with the state of ecosystem services continuing to decline on a global scale, it is important for models of GPB to become more accurate and reliable. Better models have the capacity to help marketing professionals, product developers, and policy makers develop strategies for encouraging consumers to buy green products.
Contributors: Redd, Thomas Christopher (Author) / Dooley, Kevin (Thesis advisor) / Basile, George (Committee member) / Darnall, Nicole (Committee member) / Arizona State University (Publisher)
Created: 2012
Description
Over 2 billion people are using online social network services, such as Facebook, Twitter, Google+, LinkedIn, and Pinterest. Users update their status, post their photos, share their information, and chat with others on these social network sites every day; however, not everyone shares the same amount of information. This thesis explores methods of linking publicly available data sources as a means of extrapolating missing information on Facebook. An application named "Visual Friends Income Map" was created on Facebook to collect social network data and explore geodemographic properties in order to link it with publicly available data, such as US census data. Multiple predictors are implemented to link the data sets and accurately extrapolate missing information from Facebook. The location-based predictor matches Facebook users' locations with census data at the city level for income and demographic predictions. Age- and relationship-based predictors are created to improve the accuracy of the proposed location-based predictor by utilizing social network link information. In the case where a user does not share any location information on their Facebook profile, a kernel density estimation location predictor is created. This predictor utilizes publicly available telephone records of all people in the US with the same surname as the user to create a likelihood distribution of the user's location, which is then combined with the user's IP-level information to narrow the probability estimate down to a local region.
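A minimal sketch of the kernel-density location idea (function names, the bounding-box step, and the grid resolution are illustrative assumptions rather than the thesis implementation):

    import numpy as np
    from scipy.stats import gaussian_kde

    def predict_location(surname_lonlat, ip_bbox, grid_size=200):
        # surname_lonlat: (2, n) longitudes/latitudes of same-surname phone-book entries.
        # ip_bbox: (lon_min, lon_max, lat_min, lat_max) region implied by the user's IP address.
        kde = gaussian_kde(surname_lonlat)         # density over where the surname occurs
        lon_min, lon_max, lat_min, lat_max = ip_bbox
        lons = np.linspace(lon_min, lon_max, grid_size)
        lats = np.linspace(lat_min, lat_max, grid_size)
        grid_lon, grid_lat = np.meshgrid(lons, lats)
        density = kde(np.vstack([grid_lon.ravel(), grid_lat.ravel()]))
        best = np.argmax(density)                  # most likely grid cell inside the IP region
        return grid_lon.ravel()[best], grid_lat.ravel()[best]

    # Toy usage: 500 synthetic surname locations concentrated around Phoenix, AZ.
    pts = np.random.default_rng(1).normal(loc=[[-112.07], [33.45]], scale=0.5, size=(2, 500))
    print(predict_location(pts, ip_bbox=(-113.0, -111.0, 32.5, 34.5)))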
Contributors: Mao, Jingxian (Author) / Maciejewski, Ross (Thesis advisor) / Farin, Gerald (Committee member) / Wang, Yalin (Committee member) / Arizona State University (Publisher)
Created: 2012