Matching Items (117)

Description

The increasing popularity of Twitter makes improved trustworthiness and relevance assessment of tweets much more important for search. However, given the limited size of tweets, it is hard to extract ranking measures from a tweet's content alone. I propose a method, RAProp, that ranks tweets by generating a reputation score for each tweet based not just on content but also on additional information from the Twitter ecosystem, which consists of users, tweets, and the web pages that tweets link to. This information is obtained by modeling the Twitter ecosystem as a three-layer graph. The reputation score powers two novel methods of ranking tweets by propagating the reputation over an agreement graph based on tweets' content similarity. Additionally, I show how the agreement graph helps counter tweet spam. An evaluation of my method on 16 million tweets from the TREC 2011 Microblog Dataset shows that it doubles the precision over baseline Twitter Search and achieves higher precision than the current state-of-the-art method. I present a detailed internal empirical evaluation of RAProp in comparison to several alternative approaches I propose, as well as an external evaluation against the current state-of-the-art method.
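
To make the propagation step concrete, here is a minimal Python sketch of building an agreement graph from content similarity and spreading reputation over it; the Jaccard similarity, threshold, and damping factor are illustrative assumptions, not RAProp's exact formulation.

```python
# Sketch: reputation propagation over an agreement graph built from tweet
# content similarity. Helper names, threshold, and damping are hypothetical.
from itertools import combinations


def jaccard(a: set, b: set) -> float:
    """Content similarity between two tweets represented as word sets."""
    return len(a & b) / len(a | b) if a | b else 0.0


def build_agreement_graph(tweets: dict, threshold: float = 0.3) -> dict:
    """Connect tweets whose content similarity exceeds a threshold."""
    graph = {tid: {} for tid in tweets}
    for t1, t2 in combinations(tweets, 2):
        sim = jaccard(tweets[t1], tweets[t2])
        if sim >= threshold:
            graph[t1][t2] = sim
            graph[t2][t1] = sim
    return graph


def propagate_reputation(reputation: dict, graph: dict,
                         damping: float = 0.5, iterations: int = 10) -> dict:
    """Blend each tweet's own reputation with its agreeing neighbors' scores."""
    scores = dict(reputation)
    for _ in range(iterations):
        updated = {}
        for tid, neighbors in graph.items():
            if neighbors:
                total_w = sum(neighbors.values())
                neighbor_avg = sum(scores[n] * w for n, w in neighbors.items()) / total_w
            else:
                neighbor_avg = scores[tid]
            updated[tid] = damping * reputation[tid] + (1 - damping) * neighbor_avg
        scores = updated
    return scores


if __name__ == "__main__":
    tweets = {"t1": {"earthquake", "japan", "tsunami"},
              "t2": {"earthquake", "japan", "magnitude"},
              "t3": {"free", "iphone", "click"}}
    # Initial reputation, e.g. derived from the user/tweet/web-page layers.
    reputation = {"t1": 0.8, "t2": 0.4, "t3": 0.1}
    graph = build_agreement_graph(tweets)
    print(sorted(propagate_reputation(reputation, graph).items(),
                 key=lambda kv: -kv[1]))
```
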
Contributors: Ravikumar, Srijith (Author) / Kambhampati, Subbarao (Thesis advisor) / Davulcu, Hasan (Committee member) / Liu, Huan (Committee member) / Arizona State University (Publisher)
Created: 2013

Description

This dissertation explores vulnerability to extreme heat hazards in the Maricopa County, Arizona metropolitan region. By engaging an interdisciplinary approach, I uncover the epidemiological, historical-geographical, and mitigation dimensions of human vulnerability to extreme heat in a rapidly urbanizing region characterized by an intense urban heat island and summertime heat waves. I first frame the overall research within the global climate change and hazards vulnerability research literature, and then present three case studies. I conclude with a synthesis of the findings and lessons learned from my interdisciplinary approach using an urban political ecology framework. In the first case study I construct and map a predictive index of neighborhood sensitivity to heat health risks, compare predicted neighborhood sensitivity to heat-related hospitalization rates, and estimate the relative risk of hospitalization for neighborhoods. In the second case study, I unpack the history and geography of land use/land cover change, urban development, and marginalization of minorities that created the metropolitan region's urban heat island and, consequently, the present conditions of extreme heat exposure and vulnerability in the urban core. The third study uses computational microclimate modeling to evaluate the potential of a vegetation-based intervention for mitigating extreme heat in an urban core neighborhood. Several findings relevant to extreme heat vulnerability emerge from the case studies. First, two main socio-demographic groups are found to be at higher risk for heat illness: low-income minorities in sparsely vegetated neighborhoods in the urban core, and the elderly and socially isolated in the expansive suburban fringe of Maricopa County. The second case study reveals that current conditions of heat exposure in the region's urban heat island are the legacy of historical marginalization of minorities and large-scale land use/land cover transformations of natural desert land covers into heat-retaining urban surfaces of the built environment. Third, summertime air temperature reductions of 0.9-1.9 °C, and surface temperature reductions of up to 8.4 °C, can be achieved in the urban core through desert-adapted canopied vegetation, suggesting that, at the microscale, the urban heat island can be mitigated by creating vegetated park cool islands. A synthesis of the three case studies using the urban political ecology framework argues that climate change-induced heat hazards in cities must be problematized within the socio-ecological transformations that produce and reproduce urban landscapes of risk. The interdisciplinary approach to heat hazards in this dissertation advances understanding of the social and ecological drivers of extreme heat by drawing on multiple theories and methods from sociology, urban and Marxist geography, microclimatology, spatial epidemiology, environmental history, political economy, and urban political ecology.
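
As a rough illustration of how such a neighborhood sensitivity index can be assembled, the sketch below standardizes a few socio-demographic indicators and sums them; the indicator names, toy values, and equal weighting are assumptions, not the dissertation's actual index specification.

```python
# Sketch: a heat-sensitivity index from z-scored indicators (hypothetical data).
from statistics import mean, stdev


def z_scores(values):
    mu, sigma = mean(values), stdev(values)
    return [(v - mu) / sigma for v in values]


def sensitivity_index(neighborhoods):
    """neighborhoods: name -> dict of raw indicators, where higher values
    mean greater sensitivity (e.g. % poverty, % elderly, % unvegetated)."""
    names = list(neighborhoods)
    indicators = list(next(iter(neighborhoods.values())))
    standardized = {ind: z_scores([neighborhoods[n][ind] for n in names])
                    for ind in indicators}
    # Equal-weight sum of z-scores; a principal-component weighting is a
    # common alternative.
    return {n: sum(standardized[ind][i] for ind in indicators)
            for i, n in enumerate(names)}


if __name__ == "__main__":
    data = {
        "urban_core": {"pct_poverty": 38, "pct_elderly": 9, "pct_unvegetated": 85},
        "suburb_a":   {"pct_poverty": 12, "pct_elderly": 22, "pct_unvegetated": 55},
        "suburb_b":   {"pct_poverty": 7,  "pct_elderly": 15, "pct_unvegetated": 40},
    }
    for name, score in sorted(sensitivity_index(data).items(), key=lambda kv: -kv[1]):
        print(f"{name}: {score:+.2f}")
```
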
Contributors: Declet-Barreto, Juan (Author) / Harlan, Sharon L (Thesis advisor) / Bolin, Bob (Thesis advisor) / Hirt, Paul (Committee member) / Boone, Christopher (Committee member) / Arizona State University (Publisher)
Created: 2013

Description

Automating aspects of biocuration through biomedical information extraction could significantly impact biomedical research by enabling greater biocuration throughput and improving the feasibility of a wider scope. An important step in biomedical information extraction systems is named entity recognition (NER), where mentions of entities such as proteins and diseases are located within natural-language text and their semantic type is determined. This step is critical for later tasks in an information extraction pipeline, including normalization and relationship extraction. BANNER is a benchmark biomedical NER system using linear-chain conditional random fields and the rich feature set approach. A case study with BANNER locating genes and proteins in biomedical literature is described. The first corpus for disease NER adequate for use as training data is introduced, and employed in a case study of disease NER. The first corpus locating adverse drug reactions (ADRs) in user posts to a health-related social website is also described, and a system to locate and identify ADRs in social media text is created and evaluated. The rich feature set approach to creating NER feature sets is argued to be subject to diminishing returns, implying that additional improvements may require more sophisticated methods for creating the feature set. This motivates the first application of multivariate feature selection with filters and false discovery rate analysis to biomedical NER, resulting in a feature set at least 3 orders of magnitude smaller than the set created by the rich feature set approach. Finally, two novel approaches to NER by modeling the semantics of token sequences are introduced. The first method focuses on the sequence content by using language models to determine whether a sequence resembles entries in a lexicon of entity names or text from an unlabeled corpus more closely. The second method models the distributional semantics of token sequences, determining the similarity between a potential mention and the token sequences from the training data by analyzing the contexts where each sequence appears in a large unlabeled corpus. The second method is shown to improve the performance of BANNER on multiple data sets.
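
For readers unfamiliar with the rich feature set approach, the sketch below shows the kind of token-level features typically fed to a linear-chain CRF for biomedical NER; the specific features are illustrative assumptions, not BANNER's exact feature set.

```python
# Sketch: token features in the "rich feature set" style for CRF-based NER.
import re


def token_features(tokens, i):
    """Features for token i in a sentence, including neighboring tokens."""
    tok = tokens[i]
    feats = {
        "word.lower": tok.lower(),
        "word.isupper": tok.isupper(),
        "word.istitle": tok.istitle(),
        "word.isdigit": tok.isdigit(),
        "word.has_digit": any(c.isdigit() for c in tok),
        "word.has_hyphen": "-" in tok,
        "prefix3": tok[:3].lower(),
        "suffix3": tok[-3:].lower(),
        # Word-shape feature: "BRCA1" -> "AAAA0", "p53" -> "a00"
        "shape": re.sub(r"[A-Z]", "A", re.sub(r"[a-z]", "a", re.sub(r"\d", "0", tok))),
    }
    feats["prev.lower"] = tokens[i - 1].lower() if i > 0 else "<BOS>"
    feats["next.lower"] = tokens[i + 1].lower() if i < len(tokens) - 1 else "<EOS>"
    return feats


if __name__ == "__main__":
    sentence = "Mutations in BRCA1 increase breast cancer risk .".split()
    for i, tok in enumerate(sentence):
        print(tok, token_features(sentence, i)["shape"])
```
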
Contributors: Leaman, James Robert (Author) / Gonzalez, Graciela (Thesis advisor) / Baral, Chitta (Thesis advisor) / Cohen, Kevin B (Committee member) / Liu, Huan (Committee member) / Ye, Jieping (Committee member) / Arizona State University (Publisher)
Created: 2013

Description

Residential historic preservation occurs through inhabitation. Through day-to-day domesticities a suite of bodily comportments and aesthetic practices are perpetually at work tearing and stitching the historic fabric anew. Such paradoxical practice materializes seemingly incompatible relations between past and present, people and things. Through a playful posture of experience/experiment, this dissertation attends to the materiality of historic habitation vis-à-vis practices and performances in the Coronado historic neighborhood (1907-1942) in Phoenix, Arizona. Characterized by diversity in the built and social environs, Coronado defies preservation's exclusionary tendencies. First, I propose a theoretical frame to account for the amorphous expression of nostalgia, the way it seeps, tugs, and lures `historic' people and things together. I push the argument that everyday nostalgic practice and performance in Coronado gives rise to an aesthetic of pastness that draws attention to what is near, a sensual attunement of care rather than strict adherence to preservation guidelines. Drawing on the institutional legacy of Neighborhood Housing Services, I then rethink residential historic preservation in Coronado as urban bricolage, the aesthetic ordering of urban space through practices of inclusivity, temporal juxtaposition, and the art of everyday living. Finally, I explore the historic practice of home touring in Coronado as demonstrative of urban hospitality, an opening of self and neighborhood toward other bodies, critical in the making of viable, ethical urban communities. These three moments contribute to the body of literature rethinking urbanism as sensual, enchanted, and hospitable.
Contributors: Kitson, Jennifer (Author) / McHugh, Kevin (Thesis advisor) / Lukinbeal, Christopher (Committee member) / Bolin, Bob (Committee member) / Klett, Mark (Committee member) / Arizona State University (Publisher)
Created: 2013

Description

Most social networking websites allow users to perform interactive activities. One of the fundamental features these sites provide is connecting with like-minded users. On one hand, this activity makes online connections visible and tangible; on the other hand, it makes exploring our connections and expanding our social networks easier. The aggregation of people who share common interests forms social groups, which are fundamental parts of our social lives. Social behavioral analysis at the group level is an active research area and attracts considerable interest from industry. The challenges of my work arise mainly from the scale and complexity of user-generated behavioral data. The multiple types of interactions, the highly dynamic nature of social networking, and volatile user behavior make these data complex and large in general. Effective and efficient approaches are required to analyze and interpret such data. My work provides effective channels to help connect the like-minded and, furthermore, to understand user behavior at the group level. The contributions of this dissertation are threefold: (1) proposing a novel representation of collective tagging knowledge via tag networks; (2) proposing the new problem of information spreader identification in egocentric social networks; (3) defining group profiling as a systematic approach to understanding social groups. In sum, the research proposes novel concepts and approaches for connecting the like-minded, enables the understanding of user groups, and exposes interesting research opportunities.
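
As a small illustration of the tag-network idea in contribution (1), the sketch below builds a tag co-occurrence graph from tagged items and queries a tag's strongest neighbors; the co-occurrence weighting is an assumption, not the dissertation's exact construction.

```python
# Sketch: collective tagging knowledge as a tag co-occurrence network.
from collections import defaultdict
from itertools import combinations


def build_tag_network(tagged_items):
    """tagged_items: iterable of tag sets, one per tagged item.
    Returns edge weights counting how often two tags co-occur."""
    edges = defaultdict(int)
    for tags in tagged_items:
        for t1, t2 in combinations(sorted(tags), 2):
            edges[(t1, t2)] += 1
    return dict(edges)


def top_neighbors(edges, tag, k=3):
    """Tags most strongly connected to `tag`, a crude proxy for shared interests."""
    scores = defaultdict(int)
    for (t1, t2), w in edges.items():
        if t1 == tag:
            scores[t2] += w
        elif t2 == tag:
            scores[t1] += w
    return sorted(scores.items(), key=lambda kv: -kv[1])[:k]


if __name__ == "__main__":
    items = [{"python", "data-mining", "tutorial"},
             {"python", "machine-learning"},
             {"data-mining", "machine-learning", "python"}]
    net = build_tag_network(items)
    print(top_neighbors(net, "python"))
```
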
Contributors: Wang, Xufei (Author) / Liu, Huan (Thesis advisor) / Kambhampati, Subbarao (Committee member) / Sundaram, Hari (Committee member) / Ye, Jieping (Committee member) / Arizona State University (Publisher)
Created: 2013

Description

Data mining is increasingly important in solving a variety of industry problems. Our initiative involves estimating resource requirements by skill set for future projects by mining and analyzing actual resource consumption data from past projects in the semiconductor industry. To achieve this goal we face difficulties such as consumption data stored in inconsistent formats and insufficient data about project attributes with which to interpret the consumption data. Our first goal is to clean the historical data and organize it into meaningful structures for analysis. Once this preprocessing is completed, data mining techniques such as clustering are applied to find projects that involve resources with similar skill sets and that have similar complexity and size. This results in "resource utilization templates" for groups of related projects from a resource consumption perspective. Project characteristics that generate this diversity in headcounts and skill sets are then identified. These characteristics are not currently contained in the database and are elicited from the managers of historical projects. This represents an opportunity to improve the usefulness of the data collection system for the future. The ultimate goal is to match product technical features with the resource requirements of past projects as a model to forecast resource requirements by skill set for future projects. The forecasting model is developed using linear regression with cross-validation of the training data, as past project executions are relatively few in number. Acceptable levels of forecast accuracy are achieved relative to human experts' results, and the tool is applied to forecast resource demand for several future projects.
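
Below is a minimal sketch of the two-stage pipeline described above, assuming scikit-learn and invented project features: cluster past projects into resource-utilization templates, then fit a cross-validated linear regression from technical features to headcount. The feature names and toy numbers are illustrative, not the actual project data.

```python
# Sketch: cluster past projects, then forecast headcount with cross-validated
# linear regression (toy, hypothetical data).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

# Rows: past projects. Columns: hypothetical technical features
# (e.g. process-node complexity, die size, number of IP blocks).
X = np.array([[1, 40, 3], [2, 55, 4], [1, 42, 3], [3, 80, 7],
              [3, 85, 6], [2, 60, 5], [1, 38, 2], [3, 90, 8]], dtype=float)
# Target: total engineering headcount consumed by each past project.
y = np.array([12, 18, 13, 35, 38, 20, 11, 40], dtype=float)

# Stage 1: group projects with similar size/complexity into templates.
templates = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print("resource-utilization template per project:", templates)

# Stage 2: regression with cross-validation, since past projects are few.
model = LinearRegression()
cv_scores = cross_val_score(model, X, y, cv=4, scoring="r2")
print("cross-validated R^2:", cv_scores.round(2))

# Forecast resource demand for a hypothetical future project.
model.fit(X, y)
print("forecast headcount:", model.predict([[2, 65, 5]]).round(1))
```
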
Contributors: Bhattacharya, Indrani (Author) / Sen, Arunabha (Thesis advisor) / Kempf, Karl G. (Thesis advisor) / Liu, Huan (Committee member) / Arizona State University (Publisher)
Created: 2013

Description

Contemporary online social platforms present individuals with social signals in the form of news feeds on their peers' activities. On networks such as Facebook and Quora, the network operator decides how that information is shown to an individual. The user, with her own interests and resource constraints, then selectively acts on a subset of the items presented to her. The network operator, in turn, shows that activity to a selection of her peers, thus creating a behavioral loop. That mechanism of interaction and information flow raises some very interesting questions, such as: can the network operator design social signals to promote a particular activity, like sustainability or public health care awareness, or to promote a specific product? The focus of my thesis is to answer that question. In this thesis, I develop a framework to personalize social signals for users to guide their activities on an online platform. As a result, we gradually nudge the activity distribution on the platform from the initial distribution p to the target distribution q. My work is particularly applicable to guiding collaborations, guiding collective actions, and online advertising. In particular, I first propose a probabilistic model of how users behave and how information flows on the platform. The main part of this thesis then discusses the Influence Individuals through Social Signals (IISS) framework. IISS consists of four main components: (1) Learner: it learns users' interests and characteristics from their historical activities using a Bayesian model; (2) Calculator: it uses a gradient descent method to compute the intermediate activity distributions; (3) Selector: it selects users who can be influenced to adopt or drop specific activities; (4) Designer: it personalizes social signals for each user. I evaluate the performance of the IISS framework by simulation on several network topologies, such as preferential attachment, small world, and random graphs. I show that the framework gradually nudges users' activities toward the target distribution. I use both simulation and mathematical analysis to study convergence properties, such as how fast and how closely we can approach the target distribution. When the number of activities is 3, I show that for about 45% of target distributions we can achieve a KL-divergence as low as 0.05, but for some other distributions the KL-divergence can be as large as 0.5.
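
To illustrate the nudging loop and the KL-divergence measure reported above, here is a minimal sketch in which an activity distribution p is moved stepwise toward a target q; the simple interpolation update stands in for the Calculator's gradient-descent computation and is an assumption, not the IISS algorithm itself.

```python
# Sketch: nudging an activity distribution toward a target and tracking KL(p||q).
import math


def kl_divergence(p, q):
    """KL(p || q) for two discrete distributions over the same activities."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)


def nudge(p, q, step=0.2):
    """Move a fraction of the way from p toward q and renormalize."""
    mixed = [(1 - step) * pi + step * qi for pi, qi in zip(p, q)]
    total = sum(mixed)
    return [m / total for m in mixed]


if __name__ == "__main__":
    p = [0.70, 0.20, 0.10]   # current share of three activity types
    q = [0.40, 0.40, 0.20]   # target distribution set by the operator
    for t in range(10):
        print(f"round {t}: KL(p||q) = {kl_divergence(p, q):.4f}")
        p = nudge(p, q)
```
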
Contributors: Le, Tien D (Author) / Sundaram, Hari (Thesis advisor) / Davulcu, Hasan (Thesis advisor) / Liu, Huan (Committee member) / Arizona State University (Publisher)
Created: 2014

Description

Most data cleaning systems aim to go from a given deterministic dirty database to another deterministic but clean database. Such an enterprise presupposes that it is in fact possible for the cleaning process to uniquely recover the clean version of each dirty data tuple. This is not possible in many cases, where the most a cleaning system can do is to generate a (hopefully small) set of clean candidates for each dirty tuple. When the cleaning system is required to output a deterministic database, it is forced to pick one clean candidate (say the "most likely" candidate) per tuple. Such an approach can lead to loss of information. For example, in a situation where there are three equally likely clean candidates for a dirty tuple, keeping only one of them discards two equally plausible interpretations of the data. An appealing alternative that avoids such information loss is to abandon the requirement that the output database be deterministic. In other words, even though the input (dirty) database is deterministic, I allow the reconstructed database to be probabilistic. Although such an approach does avoid the information loss, it also brings forth several challenges. For example, how many alternatives should be kept per tuple in the reconstructed database? Maintaining too many alternatives increases the size of the reconstructed database, and hence the query processing time. Second, while processing queries on the probabilistic database may well increase recall, how would it affect the precision of query processing? In this thesis, I investigate these questions. My investigation is done in the context of a data cleaning system called BayesWipe that can produce multiple clean candidates for each dirty tuple, along with the probability that each is the correct cleaned version. I represent these alternatives as tuples in a tuple-disjoint probabilistic database, and use the Mystiq system to process queries on it. This probabilistic reconstruction (called BayesWipe-PDB) is compared to a deterministic reconstruction (called BayesWipe-DET), in which the most likely clean candidate for each tuple is chosen and the rest of the alternatives discarded.
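
A minimal sketch of the tuple-disjoint representation follows: each dirty tuple keeps several clean candidates with probabilities, and a selection query sums the probabilities of the qualifying candidates. The schema and numbers are invented for illustration and are not BayesWipe's actual output.

```python
# Sketch: querying a tuple-disjoint probabilistic reconstruction.
def query_probability(prob_db, predicate):
    """P(predicate holds for the clean version of each input tuple)."""
    results = {}
    for tuple_id, alternatives in prob_db.items():
        # Alternatives of one dirty tuple are mutually exclusive (disjoint),
        # so their probabilities simply add up.
        results[tuple_id] = sum(p for candidate, p in alternatives
                                if predicate(candidate))
    return results


if __name__ == "__main__":
    # Dirty tuple t1 has three roughly equally likely clean candidates;
    # a deterministic cleaner would have to discard two of them.
    prob_db = {
        "t1": [({"make": "Honda", "model": "Civic"}, 0.34),
               ({"make": "Honda", "model": "Accord"}, 0.33),
               ({"make": "Hyundai", "model": "Accent"}, 0.33)],
        "t2": [({"make": "Honda", "model": "Civic"}, 0.90),
               ({"make": "Honda", "model": "CRV"}, 0.10)],
    }
    honda = query_probability(prob_db, lambda row: row["make"] == "Honda")
    print(honda)  # roughly {'t1': 0.67, 't2': 1.0}
```
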
Contributors: Rihan, Preet Inder Singh (Author) / Kambhampati, Subbarao (Thesis advisor) / Liu, Huan (Committee member) / Davulcu, Hasan (Committee member) / Arizona State University (Publisher)
Created: 2013

Description

This study explores the potential risks associated with the 65 U.S.-based commercial nuclear power plants (NPPs) and the distribution of those risks among the populations of their respective host communities and of communities located in outlying areas. First, I examine the relevant environmental justice issues. I start by examining the racial/ethnic composition of the host community populations, as well as the disparities in socio-economic status that exist, if any, between the host communities and communities located in outlying areas. Second, I estimate the statistical associations that exist, if any, between a population's distance from an NPP and several independent variables. I conduct multivariate ordinary least squares (OLS) regression analyses and spatial autocorrelation regression (SAR) analyses at the national, regional, and individual-NPP levels. Third, I construct an NPP potential risk index (NPP PRI) that defines four discrete risk categories: very high risk, high risk, moderate risk, and low risk. The NPP PRI then allows me to estimate the demographic characteristics of the populations exposed to each so-defined level of risk. Fourth, using the Palo Verde NPP as the subject, I simulate a scenario in which an NPP experiences a core-damage accident. I use the RASCAL 4.3 software to simulate the path of dispersion of the resultant radioactive plume, and to investigate the statistical associations that exist, if any, between the dispersed radioactive plume and the demographic characteristics of the populations located within the plume's footprint. This study utilizes distributive justice theories to understand the distribution of the potential risks associated with NPPs, many of which are unpredictable, irreversible, and inescapable. I employ an approach that takes into account multiple stakeholders in order to provide avenues for all parties to express concerns, and to ensure the relevance and actionability of any resulting policy recommendations.
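
As a toy illustration of the distance-based association tests, the sketch below fits an ordinary least squares line relating a block group's minority share to its distance from the nearest NPP; the data and the single predictor are assumptions, whereas the study itself uses multiple socio-economic variables and spatial (SAR) models as well.

```python
# Sketch: OLS regression of % minority on distance to the nearest NPP
# (hypothetical block-group data).
import numpy as np

distance_km = np.array([5, 8, 12, 20, 30, 45, 60, 80], dtype=float)
pct_minority = np.array([42, 38, 35, 30, 24, 22, 18, 15], dtype=float)

# Ordinary least squares: pct_minority = b0 + b1 * distance_km
X = np.column_stack([np.ones_like(distance_km), distance_km])
coef, *_ = np.linalg.lstsq(X, pct_minority, rcond=None)
b0, b1 = coef
print(f"intercept = {b0:.2f}, slope = {b1:.3f} (% minority per km)")

# A negative slope indicates higher minority shares closer to the plant,
# the kind of statistical association the environmental-justice analysis tests.
```
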
Contributors: Kyne, Dean (Author) / Bolin, Bob (Thesis advisor) / Boone, Christopher (Committee member) / Pijawka, David (Committee member) / Arizona State University (Publisher)
Created: 2014

Description

The coastal fishing community of Barrington, Southwest Nova Scotia (SWNS), has depended on the resilience of ocean ecosystems and resource-based economic activities for centuries. But while many coastal fisheries have developed unique ways to govern their resources, global environmental and economic change presents new challenges. In this study, I examine the multi-species fishery of Barrington. My objective was to understand what makes the fishery and its governance system robust to economic and ecological change, what makes fishing households vulnerable, and how household vulnerability and system-level robustness interact. I addressed these questions by focusing on action arenas: their contexts, interactions, and outcomes. I used a combination of case comparisons, ethnography, surveys, and quantitative and qualitative analysis to understand what influences action arenas in Barrington. I found that the robustness of the fishery at the system level depended on the strength of feedback between the operational level, where resource users interact with the resource, and the collective-choice level, where agents develop rules to influence fishing behavior. Weak feedback in Barrington has precipitated governance mismatches. At the household level, accounts from harvesters, buyers, and experts suggested that decision-making arenas lacked procedural justice. Households preferred individual strategies to acquire access to and exploit fisheries resources, but the transferability of quota and licenses has created divisions between haves and have-nots. Those who have lost their traditional access to other species, such as cod, halibut, and haddock, have become highly dependent on lobster. Based on regressions and multi-criteria decision analysis, I found that new entrants in the lobster fishery needed to maintain high effort and catches to service their debts, while harvesters who did not enter the race for higher catches were most sensitive to low demand and low prices for lobster. This study demonstrates the importance of combining multiple methods and theoretical approaches to avoid tunnel vision in fisheries policy.
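
As a rough illustration of the multi-criteria decision analysis mentioned above, the sketch below ranks hypothetical harvesting strategies by a weighted sum of criteria under a low-price scenario; the criteria, weights, and scores are invented for illustration rather than drawn from the study's elicited data.

```python
# Sketch: weighted-sum multi-criteria comparison of harvesting strategies.
def mcda_rank(strategies, weights):
    """Score each strategy as the weighted sum of its normalized criteria."""
    scores = {}
    for name, criteria in strategies.items():
        scores[name] = sum(weights[c] * v for c, v in criteria.items())
    return sorted(scores.items(), key=lambda kv: -kv[1])


if __name__ == "__main__":
    # Criteria scored 0-1 (higher is better) under low lobster demand/prices.
    strategies = {
        "race_for_catch":    {"net_income": 0.7, "debt_service": 0.3, "stock_health": 0.4},
        "moderate_effort":   {"net_income": 0.5, "debt_service": 0.7, "stock_health": 0.7},
        "diversify_species": {"net_income": 0.4, "debt_service": 0.6, "stock_health": 0.8},
    }
    weights = {"net_income": 0.4, "debt_service": 0.4, "stock_health": 0.2}
    for name, score in mcda_rank(strategies, weights):
        print(f"{name}: {score:.2f}")
```
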
Contributors: Barnett, Allain J. D (Author) / Anderies, John M (Thesis advisor) / Abbott, Joshua K (Committee member) / Bolin, Bob (Committee member) / Eakin, Hallie (Committee member) / Arizona State University (Publisher)
Created: 2014