This growing collection consists of scholarly works authored by ASU-affiliated faculty, staff, and community members, and it contains many open access articles. ASU-affiliated authors are encouraged to Share Your Work in KEEP.

Description

Although perceptions of physically, socially, and morally stigmatized occupations – ‘dirty work’ – are socially constructed, very little attention has been paid to how the context shapes those constructions. We explore the impact of historical trends (when), macro and micro cultures (where), and demographic characteristics (who) on the social construction of dirty work. Historically, the rise of hygiene, along with economic and technological development, resulted in greater societal distancing from dirty work, while the rise of liberalism has resulted in greater social acceptance of some morally stigmatized occupations. Culturally, masculinity tends to be preferred over femininity as an ideological discourse for dirty work, unless the occupation is female-dominated; members of collectivist cultures are generally better able than members of individualist cultures to combat the collective-level threat that stigma inherently represents; and members of high power-distance cultures tend to view dirty work more negatively than members of low power-distance cultures. Demographically, marginalized work tends to devolve to marginalized socioeconomic, gender, and racioethnic categories, creating a pernicious and entrapping recursive loop between ‘dirty work’ and being labeled as ‘dirty people.’

ContributorsAshforth, Blake (Author) / Kreiner, Glen E. (Author) / W.P. Carey School of Business (Contributor)
Created2014-07-01
Description

National and state organizations have developed policies calling upon afterschool programs (ASPs, 3–6 pm) to serve a fruit or vegetable (FV) each day for snack, to eliminate foods and beverages high in added sugars, and to ensure children accumulate a minimum of 30 min/d of moderate-to-vigorous physical activity (MVPA). Few efficacious and cost-effective strategies exist to assist ASP providers in achieving these important public health goals. This paper reports on the design and conceptual framework of Making Healthy Eating and Physical Activity (HEPA) Policy Practice in ASPs, a 3-year group randomized controlled trial testing the effectiveness of strategies designed to improve snacks served and increase MVPA in children attending community-based ASPs. Twenty ASPs, serving over 1800 children (6–12 years), will be enrolled and match-paired based on enrollment size, average daily min/d of MVPA, and days/week FV are served, with ASPs randomized after baseline data collection to an immediate intervention or a 1-year delayed group. The framework employed, STEPs (Strategies To Enhance Practice), focuses on intentional programming of HEPA in each ASP’s daily schedule, and includes a grocery store partnership to reduce price barriers to purchasing FV, professional development training to develop core physical activity promotion competencies, and ongoing technical support/assistance. Primary outcome measures include children’s accelerometry-derived MVPA and time spent sedentary while attending an ASP, direct observation of staff HEPA-promoting and -inhibiting behaviors, types of snacks served and child consumption of snacks, as well as cost of snacks via receipts and detailed accounting of intervention delivery costs to estimate cost-effectiveness.

ContributorsBeets, Michael W. (Author) / Weaver, R. Glenn (Author) / Turner-McGrievy, Gabrielle (Author) / Huberty, Jennifer (Author) / Ward, Dianne S. (Author) / Freedman, Darcy A. (Author) / Saunders, Ruth (Author) / Pate, Russell R. (Author) / Beighle, Aaron (Author) / Hutto, Brent (Author) / Moore, Justin B. (Author) / College of Health Solutions (Contributor)
Created2014-07-01
Description

Background: GoGirlGo! (GGG) is designed to increase girls’ physical activity (PA) using a health behavior and PA-based curriculum and is widely available for free to afterschool programs across the nation. However, GGG has not been formally evaluated. The purpose of this pilot study was to evaluate the effectiveness of the GGG curricula to improve PA, and self-efficacy for and enjoyment of PA in elementary aged girls (i.e., 5-13 years).

Methods: Nine afterschool programs were recruited to participate in the pilot (within-subjects repeated measures design). GGG is a 12-week program with a once-a-week, one-hour lesson (30 minutes of education and 30 minutes of PA). Data collection occurred at baseline, mid (twice), post, and at follow-up (3 months after the intervention ended). PA was assessed via accelerometry at each time point. Self-efficacy for and enjoyment of PA were measured using the Self-Efficacy Scale and the Short PA Enjoyment Scale at baseline, post, and follow-up. Fidelity was assessed at midpoint.

Results: Across all age groups there was a statistically significant increase in PA. Overall, on days GGG was offered, girls accumulated an average of 11 minutes of moderate-to-vigorous PA compared to 8 minutes on non-GGG days. There was a statistically significant difference in girls’ self-efficacy for PA reported between baseline and post, which was maintained at follow-up. An improvement in girls’ enjoyment of PA was found between baseline and follow-up. According to the fidelity assessment, 89% of the activities within the curriculum were completed each lesson. Girls appeared to respond well to the curriculum, but girls aged 5-7 years had difficulty paying attention and understanding discussion questions.

Conclusions: Even though there were statistically significant differences in self-efficacy for PA and enjoyment of PA, minimal increases in girls’ PA were observed. GGG curricula improvements are warranted. Future GGG programming should explore offering GGG every day, modifying activities so that they are moderate-to-vigorous in intensity, and providing additional trainings that allow staff to better implement PA and improve behavior management techniques. With modifications, GGG could provide a promising no-cost curriculum that afterschool programs may implement to help girls achieve recommendations for PA.

ContributorsHuberty, Jennifer (Author) / Dinkel, Danae M. (Author) / Beets, Michael W. (Author) / College of Health Solutions (Contributor)
Created2014-02-05
Description

Theory suggests that human behavior has implications for disease spread. We examine the hypothesis that individuals engage in voluntary defensive behavior during an epidemic. We estimate the number of passengers missing previously purchased flights as a function of concern for swine flu or A/H1N1 influenza using 1.7 million detailed flight records, Google Trends, and the World Health Organization's FluNet data. We estimate that concern over “swine flu,” as measured by Google Trends, accounted for 0.34% of missed flights during the epidemic. The Google Trends data correlate strongly with media attention, but poorly (at times negatively) with reported cases in FluNet. Passengers show no response to reported cases. Passengers skipping their purchased trips forwent at least $50M in travel-related benefits. Responding to actual cases would have cut this estimate in half. Thus, people appear to respond to an epidemic by voluntarily engaging in self-protection behavior, but this behavior may not be responsive to objective measures of risk. Clearer risk communication could substantially reduce epidemic costs. That people undertake costly risk-reduction behavior, such as forgoing nonrefundable flights, suggests they may also make less costly behavior adjustments to avoid infection. Accounting for defensive behaviors may be important for forecasting epidemics, but linking behavior with epidemics likely requires consideration of risk communication.
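The core estimation idea, regressing the rate of missed flights on a search-based concern index, can be sketched with ordinary least squares on synthetic data. This is a minimal illustration rather than the paper's actual specification (which used 1.7 million individual flight records); all numbers below are hypothetical.

```python
import numpy as np

# Hypothetical weekly data: a 0-100 search-concern index and the
# share of purchased seats that go unused. All values are synthetic.
rng = np.random.default_rng(0)
weeks = 52
concern = np.clip(rng.normal(50, 20, weeks), 0, 100)
missed_share = 0.08 + 0.0004 * concern + rng.normal(0, 0.002, weeks)

# OLS: regress the missed-flight share on the concern index.
Z = np.column_stack([np.ones(weeks), concern])
(intercept, slope), *_ = np.linalg.lstsq(Z, missed_share, rcond=None)

# Share of missed flights attributable to concern at its mean level.
attributed = slope * concern.mean() / missed_share.mean()
print(f"slope={slope:.5f}, attributed share={attributed:.1%}")
```

A positive, precisely estimated slope is the analogue of the paper's finding that search-measured concern predicts missed flights.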

ContributorsFenichel, Eli P. (Author) / Kuminoff, Nicolai (Author) / Chowell-Puente, Gerardo (Author) / W.P. Carey School of Business (Contributor)
Created2013-03-20
Description

The United States generates the most waste among OECD countries, and waste generation has adverse effects. One of the most serious is greenhouse gas emissions, especially CH4, which contributes to global warming. However, the amount of waste generated is not decreasing, and the United States recycling rate, which could reduce waste generation, is only 26%, lower than in other OECD countries. Thus, waste generation and greenhouse gas emissions should decrease, and in order for that to happen, identifying the causes should be a priority. The research objective is to verify whether the Environmental Kuznets Curve relationship holds for waste generation and GDP across the U.S. Moreover, the study also examines whether total waste generation and recycled waste influence carbon dioxide emissions from the waste sector. Annual U.S. data from 1990 to 2012 were used. The data were collected from various sources, and the Granger causality test was applied to identify causal relationships. The results showed no causality between GDP and waste generation, but total waste generation and recycling have significant positive and negative effects, respectively, on greenhouse gas emissions from the waste sector. This implies that waste generation will not decrease even if GDP increases, and that if waste generation decreases or the recycling rate increases, greenhouse gas emissions will decrease. Based on these results, it is expected that waste generation and carbon dioxide emissions from the waste sector can be reduced more efficiently.
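The Granger causality test used here asks whether lagged values of one series improve prediction of another beyond that series' own lags. A minimal numpy sketch of the underlying F-test, on synthetic series rather than the paper's 1990-2012 U.S. waste data, illustrates the idea:

```python
import numpy as np

def granger_f_stat(y, x, lags=2):
    """F-statistic testing whether lagged x improves prediction of y
    beyond y's own lags (the Granger sense of 'x causes y')."""
    n, rows = len(y), len(y) - lags
    Y = y[lags:]
    # Restricted design: intercept + lags of y; unrestricted adds lags of x.
    Z_r = np.column_stack([np.ones(rows)] +
                          [y[lags - k:n - k] for k in range(1, lags + 1)])
    Z_u = np.column_stack([Z_r] +
                          [x[lags - k:n - k] for k in range(1, lags + 1)])
    rss = lambda Z: np.sum((Y - Z @ np.linalg.lstsq(Z, Y, rcond=None)[0]) ** 2)
    return (((rss(Z_r) - rss(Z_u)) / lags) /
            (rss(Z_u) / (rows - Z_u.shape[1])))

# Synthetic series in which x drives y with a one-period delay.
rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = np.zeros(200)
for t in range(1, 200):
    y[t] = 0.8 * x[t - 1] + 0.1 * rng.normal()

print(granger_f_stat(y, x))  # large: lagged x predicts y
print(granger_f_stat(x, y))  # small: lagged y does not predict x
```

A large F-statistic in one direction and a small one in the other is the pattern the paper reports for waste generation and waste-sector emissions.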

ContributorsLee, Seungtaek (Author) / Kim, Jonghoon (Author) / Chong, Oswald (Author) / Ira A. Fulton Schools of Engineering (Contributor)
Created2016-05-20
Description

With the advent of high-dimensional stored big data and streaming data, suddenly machine learning on a very large scale has become a critical need. Such machine learning should be extremely fast, should scale up easily with volume and dimension, should be able to learn from streaming data, should automatically perform dimension reduction for high-dimensional data, and should be deployable on hardware. Neural networks are well positioned to address these challenges of large scale machine learning. In this paper, we present a method that can effectively handle large scale, high-dimensional data. It is an online method that can be used for both streaming and large volumes of stored big data. It primarily uses Kohonen nets, although only a few selected neurons (nodes) from multiple Kohonen nets are actually retained in the end; we discard all Kohonen nets after training. We use Kohonen nets both for dimensionality reduction through feature selection and for building an ensemble of classifiers using single Kohonen neurons. The method is meant to exploit massive parallelism and should be easily deployable on hardware that implements Kohonen nets. Some initial computational results are presented.
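A Kohonen net (self-organizing map) learns by pulling the best-matching neuron, and that neuron's map neighbors, toward each input. The sketch below is a minimal 1-D illustration on synthetic 2-D data; it shows basic Kohonen training only, not the paper's full pipeline of feature selection and single-neuron ensembles drawn from multiple nets.

```python
import numpy as np

def train_kohonen(data, n_nodes=10, epochs=20, lr0=0.5, sigma0=3.0):
    """Train a 1-D Kohonen net: each sample pulls its best-matching
    node (BMU), and that node's map neighbors, toward itself."""
    rng = np.random.default_rng(1)
    w = rng.normal(size=(n_nodes, data.shape[1]))
    idx = np.arange(n_nodes)
    for e in range(epochs):
        lr = lr0 * (1 - e / epochs)                # decaying learning rate
        sigma = sigma0 * (1 - e / epochs) + 1e-3   # shrinking neighborhood
        for v in data:
            bmu = np.argmin(np.linalg.norm(w - v, axis=1))
            h = np.exp(-((idx - bmu) ** 2) / (2 * sigma ** 2))
            w += lr * h[:, None] * (v - w)
    return w

# Two tight synthetic clusters; trained node weights should settle
# on and between the two cluster centers.
rng = np.random.default_rng(0)
data = np.vstack([rng.normal(0.0, 0.1, (50, 2)),
                  rng.normal(5.0, 0.1, (50, 2))])
w = train_kohonen(data)
```

Because each sample touches only one node and its neighbors, the update is naturally parallel across nodes, which is the hardware-friendliness the abstract alludes to.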

ContributorsRoy, Asim (Author) / W.P. Carey School of Business (Contributor)
Created2015-08-10
Description

Construction waste management has become extremely important due to stricter disposal and landfill regulations and fewer available landfills. Extensive work has been done on waste treatment and management in the construction industry. Concepts like deconstruction, recyclability, and Design for Disassembly (DfD) are examples of better construction waste management methods. Although some authors and organizations have published rich guides addressing DfD principles, only a few buildings have so far been developed in this area. This study aims to identify the challenges in the current practice of deconstruction and the gaps between its theory and implementation. Furthermore, it aims to provide insights into how DfD can create opportunities to turn these concepts into strategies that can be widely adopted by construction industry stakeholders in the near future.

ContributorsRios, Fernanda (Author) / Chong, Oswald (Author) / Grau, David (Author) / Julie Ann Wrigley Global Institute of Sustainability (Contributor)
Created2015-09-14
Description

Previous studies in building energy assessment clearly state that to meet sustainable energy goals, existing buildings, as well as new buildings, will need to improve their energy efficiency. Thus, meeting energy goals relies on retrofitting existing buildings. Most building energy models are bottom-up engineering models, meaning these models calculate energy demand of individual buildings through their physical properties and energy use for specific end uses (e.g., lighting, appliances, and water heating). Researchers then scale up these model results to represent the building stock of the region studied.

Studies reveal a lack of information about the building stock and associated modeling tools, and this lack of knowledge affects the assessment of building energy efficiency strategies. The literature suggests that the complexity of energy models needs to be limited. The accuracy of these models can be improved by reducing the number of input parameters, alleviating the need for users to make many assumptions about building construction and occupancy, among other factors. To mitigate the need for assumptions and the resulting model inaccuracies, the authors argue buildings should be described in a regional stock model with a restricted number of input parameters. One commonly accepted method of identifying critical input parameters is sensitivity analysis, which requires a large number of runs that are time consuming and may require high processing capacity.

This paper utilizes the Energy, Carbon and Cost Assessment for Buildings Stocks (ECCABS) model, which calculates the net energy demand of buildings and presents aggregated and individual-building-level demand for specific end uses, e.g., heating, cooling, lighting, hot water, and appliances. The model has already been validated using Swedish, Spanish, and UK building stock data. This paper discusses potential improvements to this model by assessing the feasibility of using stepwise regression to identify the most important input parameters, using data from the UK residential sector. The paper presents the results of stepwise regression and compares these to sensitivity analysis; finally, the paper documents the advantages and challenges associated with each method.
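Stepwise regression of the kind described adds, at each step, the input parameter that most improves the fit, so that a handful of parameters can stand in for the full model. A minimal forward-selection sketch on synthetic data (the parameter names are hypothetical, not actual ECCABS inputs):

```python
import numpy as np

def forward_stepwise(X, y, names, max_params=2):
    """Greedy forward selection: at each step, add the parameter
    whose inclusion most reduces the OLS residual sum of squares."""
    def rss(cols):
        Z = np.column_stack([np.ones(len(y))] + [X[:, c] for c in cols])
        beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
        return np.sum((y - Z @ beta) ** 2)
    chosen, remaining = [], list(range(X.shape[1]))
    while remaining and len(chosen) < max_params:
        best = min(remaining, key=lambda c: rss(chosen + [c]))
        chosen.append(best)
        remaining.remove(best)
    return [names[c] for c in chosen]

# Synthetic data: demand depends mainly on the first two inputs;
# the parameter names below are invented for illustration.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = 3.0 * X[:, 0] + 1.5 * X[:, 1] + 0.05 * rng.normal(size=200)
print(forward_stepwise(X, y, ["floor_area", "u_value", "occupancy"]))
```

Unlike sensitivity analysis, each selection step reuses the same data set, which is why the paper considers it a cheaper route to ranking input parameters.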

ContributorsArababadi, Reza (Author) / Naganathan, Hariharan (Author) / Parrish, Kristen (Author) / Chong, Oswald (Author) / Ira A. Fulton Schools of Engineering (Contributor)
Created2015-09-14
Description

As the construction industry continues to lead in the number of injuries and fatalities annually, several organizations and agencies are working avidly to minimize them. The Occupational Safety and Health Administration (OSHA) is one such agency, working to assure safe and healthful working conditions for working men and women by setting and enforcing standards and by providing training, outreach, education, and assistance. Given the large databases of OSHA historical events and reports, manual analysis of the fatality and catastrophe investigation content is a time-consuming and expensive process. This paper aims to evaluate the strength of unsupervised machine learning and Natural Language Processing (NLP) in supporting safety inspections and reorganizing accident databases at the state level. After collecting construction accident reports from the OSHA Arizona office, the methodology preprocesses the accident reports and weights terms in order to apply a data-driven, unsupervised, K-Means-based clustering approach. The proposed method classifies the collected reports into four clusters, each reporting a type of accident. The results show construction accidents in the state of Arizona to be caused by falls (42.9%), struck-by objects (34.3%), electrocutions (12.5%), and trench collapses (10.3%). The findings of this research empower state and local agencies with a customized presentation of the accidents fitting their regulations and weather conditions. What is applicable to one climate might not be suitable for another; therefore, such rearrangement of the accident database at the state level is a necessary prerequisite to enhance local safety applications and standards.
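The pipeline described, term weighting followed by K-Means clustering, can be sketched end to end with plain numpy. The report snippets below are invented stand-ins for OSHA narratives, and seeding one center per accident type is a simplification of the usual random initialization:

```python
import numpy as np

# Invented mini-corpus standing in for OSHA accident narratives.
reports = [
    "fall from ladder worker fell roof",
    "fall from scaffold worker fell",
    "struck by falling object crane",
    "struck by swinging object load",
    "electrocution power line contact",
    "electrocution energized power wire",
]

# TF-IDF weighting: term counts scaled by inverse document frequency,
# then each report vector normalized to unit length.
vocab = sorted({w for r in reports for w in r.split()})
tf = np.array([[r.split().count(w) for w in vocab] for r in reports], float)
df = (tf > 0).sum(axis=0)
X = tf * np.log(len(reports) / df)
X /= np.linalg.norm(X, axis=1, keepdims=True)

def kmeans(X, init_rows, iters=10):
    """Plain Lloyd's algorithm, seeded deterministically for clarity."""
    centers = X[list(init_rows)].copy()
    for _ in range(iters):
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = np.argmin(d2, axis=1)
        for j in range(len(centers)):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels

labels = kmeans(X, init_rows=(0, 2, 4))
print(labels)  # each pair of similar reports shares a cluster
```

On a real corpus the cluster count and initialization would be chosen by the analyst; the point here is only that shared accident vocabulary pulls similar reports into the same cluster.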

ContributorsChokor, Abbas (Author) / Naganathan, Hariharan (Author) / Chong, Oswald (Author) / El Asmar, Mounir (Author) / Ira A. Fulton Schools of Engineering (Contributor)
Created2016-05-20
Description

Based on considerable neurophysiological evidence, Roy (2012) proposed the theory that localist representation is widely used in the brain, starting from the lowest levels of processing. Grandmother cells are a special case of localist representation. In this article, I present the theory that grandmother cells are also widely used in the brain. To support the proposed theory, I present neurophysiological evidence and an analysis of the concept of grandmother cells. Konorski (1967) first predicted the existence of grandmother cells (he called them “gnostic” neurons) – single neurons that respond to complex stimuli such as faces, hands, expressions, objects, and so on. The term “grandmother cell” was introduced by Jerry Lettvin in 1969 (Barlow, 1995).

ContributorsRoy, Asim (Author) / W.P. Carey School of Business (Contributor)
Created2013-05-24