Does School Participatory Budgeting Increase Students’ Political Efficacy? Bandura’s “Sources,” Civic Pedagogy, and Education for Democracy
Description

Does school participatory budgeting (SPB) increase students’ political efficacy? SPB, which is implemented in thousands of schools around the world, is a democratic process of deliberation and decision-making in which students determine how to spend a portion of the school’s budget. We examined the impact of SPB on political efficacy in one middle school in Arizona. Our participants’ (n = 28) responses on survey items designed to measure self-perceived growth in political efficacy indicated a large effect size (Cohen’s d = 1.46), suggesting that SPB is an effective approach to civic pedagogy, with promising prospects for developing students’ political efficacy.

Contributors: Gibbs, Norman P. (Author) / Bartlett, Tara Lynn (Author) / Schugurensky, Daniel, 1958- (Author)
Created: 2021-05-01
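The effect size reported in the preceding abstract can be reproduced with a short calculation. The sketch below is a minimal example of Cohen's d with a pooled standard deviation, assuming a simple pre/post comparison of survey scores; the study's actual items, scoring, and data are not given here, so the numbers and variable names are purely illustrative.

```python
import numpy as np

def cohens_d(group_a, group_b):
    """Cohen's d for two sets of scores using the pooled standard deviation."""
    a, b = np.asarray(group_a, dtype=float), np.asarray(group_b, dtype=float)
    n_a, n_b = len(a), len(b)
    pooled_var = ((n_a - 1) * a.var(ddof=1) + (n_b - 1) * b.var(ddof=1)) / (n_a + n_b - 2)
    return (a.mean() - b.mean()) / np.sqrt(pooled_var)

# Hypothetical pre/post political-efficacy scores on a 5-point scale (not the study's data).
pre = np.array([2.8, 3.0, 3.2, 2.5, 3.1, 2.9, 3.0, 2.7])
post = np.array([3.9, 4.1, 4.0, 3.6, 4.2, 3.8, 4.0, 3.7])
print(f"Cohen's d = {cohens_d(post, pre):.2f}")
```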
Description
This dissertation creates models of past potential vegetation in the Southern Levant during most of the Holocene, from the beginnings of farming through the rise of urbanized civilization (12 to 2.5 ka BP). The time scale encompasses the rise and collapse of the earliest agrarian civilizations in this region. The archaeological record suggests that increases in social complexity were linked to climatic episodes (e.g., favorable climatic conditions coincide with intervals of prosperity or marked social development such as the Neolithic Revolution ca. 11.5 ka BP, the Secondary Products Revolution ca. 6 ka BP, and the Middle Bronze Age ca. 4 ka BP). The opposite can be said about periods of climatic deterioration, when settled villages were abandoned as the inhabitants returned to nomadic or semi-nomadic lifestyles (e.g., abandonment of the largest Neolithic farming towns after 8 ka BP and collapse of Bronze Age towns and cities after 3.5 ka BP during the Late Bronze Age). This study develops chronologically refined models of past vegetation from 12 to 2.5 ka BP, at 500-year intervals, using GIS, remote sensing, and statistical modeling tools (MAXENT) that derive from species distribution modeling. Plants are sensitive to alterations in their environment and respond accordingly. Because of this, they are valuable indicators of landscape change. An extensive database of historical and field-gathered observations was created. Using this database as well as environmental variables that include temperature and precipitation surfaces for the whole study period (also at 500-year intervals), the potential vegetation of the region was modeled. In this way, a continuous chronology of potential vegetation of the Southern Levant was built. The produced paleo-vegetation models generally agree with the proxy records. They indicate a gradual decline of forests and expansion of steppe and desert throughout the Holocene, interrupted briefly during the Mid-Holocene (ca. 4 ka BP, Middle Bronze Age). They also suggest that during the Early Holocene, forest areas were extensive, spreading into the Northern Negev. The two remaining forested areas in the Northern and Southern Plateau Region in Jordan were also connected during this time. The models also show general agreement with the major cultural developments, with forested areas either expanding or remaining stable during prosperous periods (e.g., Pre-Pottery Neolithic and Middle Bronze Age), and significantly contracting during moments of instability (e.g., Late Bronze Age).
Contributors: Soto-Berelov, Mariela (Author) / Fall, Patricia L. (Thesis advisor) / Myint, Soe (Committee member) / Turner, Billie L (Committee member) / Falconer, Steven (Committee member) / Arizona State University (Publisher)
Created: 2011
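The species-distribution-modeling step described in the preceding abstract pairs vegetation occurrence records with gridded climate surfaces for each time slice. The sketch below is not MAXENT itself; it uses a logistic-regression presence/background classifier as a simplified stand-in to illustrate the general workflow. All predictor values, sample sizes, and variable names are hypothetical.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical environmental predictors at presence points and random background points:
# columns = [mean annual temperature (deg C), annual precipitation (mm)].
presence = np.column_stack([rng.normal(17, 2, 200), rng.normal(550, 120, 200)])
background = np.column_stack([rng.normal(21, 4, 1000), rng.normal(250, 180, 1000)])

X = np.vstack([presence, background])
y = np.concatenate([np.ones(len(presence)), np.zeros(len(background))])

# Presence/background model (MAXENT plays the analogous role in the dissertation).
model = LogisticRegression().fit(X, y)

# Predict relative habitat suitability on a hypothetical climate grid for one time slice.
grid = np.column_stack([rng.normal(19, 3, 500), rng.normal(400, 200, 500)])
suitability = model.predict_proba(grid)[:, 1]
print("mean predicted suitability:", suitability.mean().round(3))
```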
Description
In an effort to begin validating the large number of discovered candidate biomarkers, proteomics is beginning to shift from shotgun proteomic experiments towards targeted proteomic approaches that provide solutions to automation and economic concerns. Such approaches to validating biomarkers necessitate the mass spectrometric analysis of hundreds to thousands of human samples. As this takes place, a serendipitous opportunity has become evident. As the focus narrows to "single" protein targets (instead of entire proteomes) using pan-antibody-based enrichment techniques, a discovery science has emerged, so to speak. This is because the context in which "single" proteins exist in blood (i.e., polymorphisms, transcript variants, and posttranslational modifications) is largely unknown; hence, targeted proteomics has applications even for established biomarkers. Furthermore, besides protein heterogeneity accounting for interferences with conventional immunometric platforms, it is becoming evident that this formerly hidden dimension of structural information also contains rich pathobiological information. Consequently, targeted proteomics studies that aim to ascertain a protein's genuine presentation within disease-stratified populations, and that serve as a stepping-stone within a biomarker translational pipeline, are of clinical interest. Roughly 128 million Americans are pre-diabetic, diabetic, and/or have kidney disease, and public and private spending for treating these diseases runs to hundreds of billions of dollars. In an effort to create new solutions for the early detection and management of these conditions, described herein is the design, development, and translation of mass spectrometric immunoassays targeted towards diabetes and kidney disease. Population proteomics experiments were performed for the following clinically relevant proteins: insulin, C-peptide, RANTES, and parathyroid hormone. At least thirty-eight protein isoforms were detected. Besides the numerous disease correlations found within the disease-stratified cohorts, certain isoforms also appeared to be causally related to the underlying pathophysiology and/or to have therapeutic implications. Technical advancements include multiplexed isoform quantification as well as a "dual-extraction" methodology for eliminating non-specific proteins while simultaneously validating isoforms. Industrial efforts towards widespread clinical adoption are also described. Consequently, this work lays a foundation for the translation of mass spectrometric immunoassays into the clinical arena and simultaneously presents the most recent advancements concerning the mass spectrometric immunoassay approach.
Contributors: Oran, Paul (Author) / Nelson, Randall (Thesis advisor) / Hayes, Mark (Thesis advisor) / Ros, Alexandra (Committee member) / Williams, Peter (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
Ten regional climate models (RCMs) and atmosphere-ocean generalized model pairings from the North America Regional Climate Change Assessment Program were used to estimate the shift of extreme precipitation due to climate change using present-day and future-day climate scenarios. RCMs emulate winter storms and one-day duration events at the sub-regional level. Annual maximum series were derived for each model pairing and each modeling period, and for the annual and winter seasons. The reliability ensemble average (REA) method was used to rate each RCM's annual maximum series on its ability to reproduce historical records and to approximate the average prediction, because there are no future records. These series were used to determine (a) shifts in extreme precipitation frequencies and magnitudes, and (b) shifts in distribution parameters between modeling periods. The REA method showed that the winter season had lower REA factors than the annual season. For the winter season, the pairing of the Hadley Regional Model 3 with the Geophysical Fluid Dynamics Laboratory atmospheric-land generalized model had the lowest REA factors. However, in replicating present-day climate, the pairing of the Abdus Salam International Center for Theoretical Physics' Regional Climate Model Version 3 with the Geophysical Fluid Dynamics Laboratory atmospheric-land generalized model was superior. Shifts of extreme precipitation in the 24-hour event were measured using the precipitation magnitude for each frequency in the annual maximum series, and the difference frequency curve in the generalized extreme-value function parameters. The average trend across all RCM pairings implied no significant shift in the winter annual maximum series; however, the REA-selected models showed an increase in precipitation extremes of approximately 0.37 inches for the 100-year return period in the annual season and approximately 0.57 inches for the same return period in the winter season. Shifts of extreme precipitation were estimated using predictions 70 years into the future based on the RCMs. Although these models do not provide climate information for the intervening 70-year period, they do provide an indication of the behavior of future climate. The shift in extreme precipitation may be significant in the frequency distribution function and will vary with each model-pairing condition. The proposed methodology addresses many of the uncertainties associated with current methodologies for dealing with extreme precipitation.
Contributors: Riaño, Alejandro (Author) / Mays, Larry W. (Thesis advisor) / Vivoni, Enrique (Committee member) / Huang, Huei-Ping (Committee member) / Arizona State University (Publisher)
Created: 2013
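The return-period shifts quoted in the preceding abstract (e.g., roughly 0.37 inches for the 100-year event) come from fitting generalized extreme-value (GEV) distributions to annual maximum series from the present-day and future model runs and comparing return levels. The sketch below illustrates that basic calculation with `scipy.stats.genextreme` on hypothetical annual maxima; the series, parameters, and results are not the dissertation's.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(42)

# Hypothetical 24-hour annual maximum precipitation series (inches) for two modeling periods.
present = genextreme.rvs(c=-0.1, loc=1.8, scale=0.50, size=30, random_state=rng)
future = genextreme.rvs(c=-0.1, loc=2.0, scale=0.55, size=30, random_state=rng)

def return_level(series, return_period_years):
    """Fit a GEV to an annual maximum series and return the T-year return level."""
    shape, loc, scale = genextreme.fit(series)
    return genextreme.ppf(1.0 - 1.0 / return_period_years, shape, loc=loc, scale=scale)

rl_present = return_level(present, 100)
rl_future = return_level(future, 100)
print(f"100-year return level shift: {rl_future - rl_present:+.2f} in")
```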
Description
Two critical limitations of hyperspatial imagery are higher imagery variances and large data sizes. Although object-based analyses with a multi-scale framework for diverse object sizes are the solution, more data sources and large amounts of testing at high cost are required. In this study, I used tree density segmentation as the key element of a three-level hierarchical vegetation framework for reducing those costs, and a three-step procedure was used to evaluate its effects. A two-step procedure, which involved environmental stratifications and the random walker algorithm, was used for tree density segmentation. I determined whether variation in tone and texture could be reduced within environmental strata, and whether tree density segmentations could be labeled by species associations. At the final level, two tree density segmentations were partitioned into smaller subsets using eCognition in order to label individual species or tree stands in two test areas of differing tree density, and the Z values of Moran's I were used to evaluate whether image objects have mean values that differ from those of neighboring segments, as a measure of segmentation accuracy. The two-step procedure was able to delineate tree density segments and label species types robustly compared to previous hierarchical frameworks. However, eCognition was not able to produce detailed, reasonable image objects with optimal scale parameters for species labeling. This hierarchical vegetation framework is applicable to fine-scale, time-series vegetation mapping for developing baseline data to evaluate climate change impacts on vegetation at low cost, using widely available data and a personal laptop.
Contributors: Liau, Yan-ting (Author) / Franklin, Janet (Thesis advisor) / Turner, Billie (Committee member) / Myint, Soe (Committee member) / Arizona State University (Publisher)
Created: 2013
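Segmentation accuracy in the preceding abstract is judged by whether an image object's mean value differs from that of its neighbors, summarized with the Z value of Moran's I. The sketch below computes global Moran's I for segment means on a hypothetical adjacency (spatial weights) matrix and derives a permutation-based z-score; the segment values and adjacency structure are illustrative, not the study's data.

```python
import numpy as np

def morans_i(values, weights):
    """Global Moran's I for values on a binary adjacency (spatial weights) matrix."""
    x = np.asarray(values, dtype=float)
    z = x - x.mean()
    return (len(x) / weights.sum()) * (z @ weights @ z) / (z @ z)

def morans_z(values, weights, n_perm=999, seed=0):
    """Permutation-based z-score for Moran's I."""
    rng = np.random.default_rng(seed)
    observed = morans_i(values, weights)
    sims = np.array([morans_i(rng.permutation(values), weights) for _ in range(n_perm)])
    return (observed - sims.mean()) / sims.std(ddof=1)

# Hypothetical mean spectral value per segment and a symmetric adjacency matrix (1 = neighbors).
values = np.array([0.62, 0.58, 0.60, 0.31, 0.29, 0.35])
W = np.array([[0, 1, 1, 0, 0, 0],
              [1, 0, 1, 0, 0, 0],
              [1, 1, 0, 1, 0, 0],
              [0, 0, 1, 0, 1, 1],
              [0, 0, 0, 1, 0, 1],
              [0, 0, 0, 1, 1, 0]])
print("Moran's I z-score:", round(morans_z(values, W), 2))
```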
Description
The partitioning of available solar energy into different fluxes at the Earth's surface is important in determining different physical processes, such as turbulent transport, subsurface hydrology, and land-atmosphere interactions. Direct measurements of these turbulent fluxes are carried out using eddy-covariance (EC) towers. However, the distribution of EC towers is sparse due to their relatively high cost and practical difficulties in logistics and deployment. As a result, the data are temporally and spatially limited and inadequate for research at large scales, such as regional and global climate modeling. Besides field measurements, an alternative is to estimate turbulent fluxes based on the intrinsic relations between surface energy budget components, largely through thermodynamic equilibrium. These relations, referred to as relative efficiencies, have been included in several models to estimate the magnitude of turbulent fluxes in the surface energy budget, such as latent heat and sensible heat. In this study, three theoretical models, based on a lumped heat transfer model, linear stability analysis, and the maximum entropy principle, respectively, were investigated. Model predictions of relative efficiencies were compared with turbulent flux data over different land covers, viz. lake, grassland, and suburban surfaces. Similar results were observed over the lake and suburban surfaces, but significant deviations were found over the vegetated surface. The relative efficiency of outgoing longwave radiation was found to deviate from theoretical predictions by orders of magnitude. Meanwhile, the results show that the energy partitioning process is strongly influenced by surface water availability. The study provides insight into which properties determine the energy partitioning process over different land covers and offers suggestions for future models.
Contributors: Yang, Jiachuan (Author) / Wang, Zhihua (Thesis advisor) / Huang, Huei-Ping (Committee member) / Vivoni, Enrique (Committee member) / Mays, Larry (Committee member) / Arizona State University (Publisher)
Created: 2012
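A concrete way to see the partitioning discussed in the preceding abstract is to compute how measured eddy-covariance fluxes divide the available energy. The sketch below uses hypothetical half-hourly flux values; the evaporative fraction and Bowen ratio shown are standard diagnostics, not the specific relative-efficiency definitions of the three models compared in the thesis.

```python
import numpy as np

# Hypothetical half-hourly eddy-covariance data (W m^-2): net radiation, sensible heat,
# latent heat, and ground heat flux over one afternoon.
Rn = np.array([420.0, 460.0, 480.0, 450.0, 380.0])
H  = np.array([150.0, 170.0, 180.0, 160.0, 120.0])
LE = np.array([200.0, 220.0, 230.0, 215.0, 190.0])
G  = np.array([ 50.0,  55.0,  55.0,  50.0,  45.0])

available_energy = Rn - G               # energy available for turbulent transport
evaporative_fraction = LE / (H + LE)    # share of turbulent flux going to latent heat
bowen_ratio = H / LE
closure = (H + LE) / available_energy   # energy-balance closure check

print("mean evaporative fraction:", evaporative_fraction.mean().round(2))
print("mean Bowen ratio:", bowen_ratio.mean().round(2))
print("mean energy-balance closure:", closure.mean().round(2))
```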
Description
Signal processing techniques have been used extensively in many engineering problems, and in recent years their application has extended to non-traditional research fields such as biological systems. Many of these applications require extraction of a signal or parameter of interest from degraded measurements. One such application is mass spectrometry immunoassay (MSIA), which has become one of the primary techniques for biomarker discovery. MSIA analyzes protein molecules as potential biomarkers using time-of-flight mass spectrometry (TOF-MS). Peak detection in TOF-MS is important for biomarker analysis and many other MS-related applications. Though many peak detection algorithms exist, most of them are based on heuristic models. One way of detecting signal peaks is by deploying stochastic models of the signal and noise observations. The likelihood ratio test (LRT) detector, based on the Neyman-Pearson (NP) lemma, is a uniformly most powerful test for decision-making in the form of a hypothesis test. The primary goal of this dissertation is to develop signal and noise models for electrospray ionization (ESI) TOF-MS data. A new method is proposed for developing the signal model by employing first-principles calculations based on device physics and molecular properties. The noise model is developed by analyzing MS data from careful experiments on the ESI mass spectrometer. A non-flat baseline in MS data is common, and the reasons behind its formation have not been fully understood. A new signal model explaining the presence of the baseline is proposed, though detailed experiments are needed to further substantiate the model assumptions. Signal detection schemes based on these signal and noise models are proposed. A maximum likelihood (ML) method is introduced for estimating the signal peak amplitudes. The performance of the detection methods and the ML estimation is evaluated with Monte Carlo simulations, which show promising results. An application of these methods is proposed for fractional abundance calculation in biomarker analysis, which is mathematically robust and fundamentally different from current algorithms. Biomarker panels for type 2 diabetes and cardiovascular disease are analyzed using existing MS analysis algorithms. Finally, a support vector machine based multi-classification algorithm is developed for evaluating the biomarkers' effectiveness in discriminating type 2 diabetes and cardiovascular disease, and is shown to perform better than a linear discriminant analysis based classifier.
Contributors: Buddi, Sai (Author) / Taylor, Thomas (Thesis advisor) / Cochran, Douglas (Thesis advisor) / Nelson, Randall (Committee member) / Duman, Tolga (Committee member) / Arizona State University (Publisher)
Created: 2012
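For the peak-detection framework described in the preceding abstract, the classical textbook case of a known peak shape in white Gaussian noise yields a simple Neyman-Pearson LRT (a matched-filter statistic compared against a threshold) and a closed-form ML amplitude estimate. The sketch below illustrates only that textbook case on a synthetic Gaussian peak; it is not the dissertation's ESI-TOF signal or noise model.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)

# Synthetic TOF-like trace: a Gaussian peak of known shape plus white Gaussian noise.
t = np.linspace(0, 1, 500)
template = np.exp(-0.5 * ((t - 0.5) / 0.01) ** 2)   # known, unit-amplitude peak shape
sigma = 0.2                                          # known noise standard deviation
signal = 0.8 * template + rng.normal(0, sigma, t.size)

# ML estimate of the peak amplitude for a known template in white Gaussian noise.
a_hat = (signal @ template) / (template @ template)

# The Neyman-Pearson LRT reduces to thresholding the matched-filter statistic;
# the threshold is set for a chosen false-alarm probability (here 1e-3).
test_stat = signal @ template
threshold = sigma * np.sqrt(template @ template) * norm.ppf(1 - 1e-3)
print(f"ML amplitude estimate: {a_hat:.3f}, peak detected: {test_stat > threshold}")
```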
Description
Land transformation under conditions of rapid urbanization has significantly altered the structure and functioning of Earth's systems. Land fragmentation, a characteristic of land transformation, is recognized as a primary driving force in the loss of biological diversity worldwide. However, little is known about its implications in complex urban settings, where interaction with social dynamics is intense. This research asks: How do patterns of land cover and land fragmentation vary over time and space, and what are the socio-ecological drivers and consequences of land transformation in a rapidly growing city? Using Metropolitan Phoenix as a case study, the research links pattern and process relationships between land cover, land fragmentation, and socio-ecological systems in the region. It examines population growth, water provision, and institutions as major drivers of land transformation, and the changes in bird biodiversity that result from land transformation. How to manage socio-ecological systems is one of the biggest challenges of moving towards sustainability. This research project provides a deeper understanding of how land transformation affects socio-ecological dynamics in an urban setting. It uses a series of indices to evaluate land cover and fragmentation patterns over the past twenty years, including patch number, contagion, shape, and diversity indices. It then generates empirical evidence on the linkages between land cover patterns and ecosystem properties by exploring the drivers and impacts of land cover change. An interdisciplinary approach that integrates social, ecological, and spatial analysis is applied in this research. Findings of the research provide a documented dataset that can help researchers study the relationship between human activities and biotic processes in an urban setting, and contribute to sustainable urban development.
Contributors: Zhang, Sainan (Author) / Boone, Christopher G. (Thesis advisor) / York, Abigail M. (Committee member) / Myint, Soe (Committee member) / Arizona State University (Publisher)
Created: 2013
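Several of the fragmentation indices named in the preceding abstract can be computed directly from a classified land-cover raster. The sketch below computes a patch count (via connected-component labeling) and Shannon's diversity index for a small hypothetical grid; it illustrates the type of metric involved, not the study's actual dataset or full index set.

```python
import numpy as np
from scipy import ndimage

# Hypothetical 6x6 classified land-cover grid (0 = desert, 1 = urban, 2 = vegetation).
landcover = np.array([
    [1, 1, 0, 0, 2, 2],
    [1, 1, 0, 2, 2, 2],
    [0, 0, 0, 2, 2, 0],
    [0, 1, 1, 0, 0, 0],
    [2, 1, 1, 0, 1, 1],
    [2, 2, 0, 0, 1, 1],
])

# Patch count: connected components of each class (4-neighbor connectivity).
n_patches = sum(ndimage.label(landcover == c)[1] for c in np.unique(landcover))

# Shannon's diversity index over class proportions.
_, counts = np.unique(landcover, return_counts=True)
p = counts / counts.sum()
shdi = -np.sum(p * np.log(p))

print("number of patches:", n_patches)
print("Shannon diversity index:", round(shdi, 3))
```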
Description

Hydrology and biogeochemistry are coupled in all systems. However, human decision-making regarding hydrology and biogeochemistry is often separate, even though decisions about hydrologic systems may have substantial impacts on biogeochemical patterns and processes. The overarching question of this dissertation was: How does hydrologic engineering interact with the effects of nutrient loading and climate to drive watershed nutrient yields? I conducted research in two study systems with contrasting spatial and temporal scales. Using a combination of data-mining and modeling approaches, I reconstructed nitrogen and phosphorus budgets for the northeastern US over the 20th century, including anthropogenic nutrient inputs and riverine fluxes, for ~200 watersheds at 5-year time intervals. Infrastructure systems, such as sewers, wastewater treatment plants, and reservoirs, strongly affected the spatial and temporal patterns of nutrient fluxes from northeastern watersheds. At a smaller scale, I investigated the effects of urban stormwater drainage infrastructure on water and nutrient delivery from urban watersheds in Phoenix, AZ. Using a combination of field monitoring and statistical modeling, I tested hypotheses about the importance of hydrologic and biogeochemical control of nutrient delivery. My research suggests that hydrology is the major driver of differences in nutrient fluxes from urban watersheds at the event scale, and that consideration of altered hydrologic networks is critical for understanding anthropogenic impacts on biogeochemical cycles. Overall, I found that human activities affect nutrient transport via multiple pathways. Anthropogenic nutrient additions increase the supply of nutrients available for transport, whereas hydrologic infrastructure controls the delivery of nutrients from watersheds. Incorporating the effects of hydrologic infrastructure is critical for understanding anthropogenic effects on biogeochemical fluxes across spatial and temporal scales.

Contributors: Hale, Rebecca Leslie (Author) / Grimm, Nancy (Thesis advisor) / Childers, Daniel (Committee member) / Vivoni, Enrique (Committee member) / York, Abigail (Committee member) / Wu, Jianguo (Committee member) / Arizona State University (Publisher)
Created: 2013
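The budget reconstruction described in the preceding abstract is, at its core, a mass balance: anthropogenic inputs minus riverine export gives the fraction of nutrients retained (or removed by other pathways) within a watershed. The sketch below illustrates that arithmetic with hypothetical nitrogen input terms for a single watershed and time step; the input categories and values are illustrative, not the dissertation's data.

```python
# Hypothetical nitrogen budget for one watershed in one 5-year interval (kg N per km^2 per yr).
inputs = {
    "fertilizer": 1200.0,
    "atmospheric_deposition": 800.0,
    "net_food_and_feed_import": 950.0,
    "biological_N_fixation": 300.0,
}
riverine_export = 900.0  # flux measured or modeled at the watershed outlet

total_input = sum(inputs.values())
retained = total_input - riverine_export          # stored, denitrified, or otherwise removed
fractional_export = riverine_export / total_input

print(f"total input: {total_input:.0f} kg N km^-2 yr^-1")
print(f"retained or removed: {retained:.0f} kg N km^-2 yr^-1")
print(f"fraction exported in rivers: {fractional_export:.2f}")
```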
Description
Cancer claims hundreds of thousands of lives every year in the US alone. Finding ways for early detection of cancer onset is crucial for better management and treatment of cancer. Thus, biomarkers, especially protein biomarkers, which as functional units reflect dynamic physiological changes, need to be discovered. Though important, only a few protein cancer biomarkers have been approved to date. To accelerate this process, fast, comprehensive, and affordable assays are required which can be applied to large population studies. For this, these assays should be able to comprehensively characterize and explore the molecular diversity of nominally "single" proteins across populations. This information is usually unavailable with commonly used immunoassays such as ELISA (enzyme-linked immunosorbent assay), which either ignore protein microheterogeneity or are confounded by it. To this end, mass spectrometric immunoassays (MSIA) for three different human plasma proteins have been developed. These proteins, namely IGF-1, hemopexin, and tetranectin, have been reported in the literature to correlate with many diseases, including several carcinomas. The developed assays were used to extract the entire proteins from plasma samples, which were subsequently analyzed on mass spectrometric platforms. Matrix-assisted laser desorption ionization (MALDI) and electrospray ionization (ESI) mass spectrometric techniques were used due to their availability and suitability for the analysis. This revealed different structural forms of these proteins, showing structural microheterogeneity that is invisible to commonly used immunoassays. These assays are fast and comprehensive and can be applied in large sample studies to analyze proteins for biomarker discovery.
Contributors: Rai, Samita (Author) / Nelson, Randall (Thesis advisor) / Hayes, Mark (Thesis advisor) / Borges, Chad (Committee member) / Ros, Alexandra (Committee member) / Arizona State University (Publisher)
Created: 2012