This growing collection consists of scholarly works authored by ASU-affiliated faculty, staff, and community members, and it contains many open access articles. ASU-affiliated authors are encouraged to Share Your Work in KEEP.

Description

The objective of this study was to identify physical, social, and intrapersonal cues associated with the consumption of sweetened beverages and sweet and salty snacks among adolescents from lower-SES neighborhoods. Students were recruited from high schools where at least 25% of students received free or reduced-cost lunches. Using ecological momentary assessment, participants (N = 158) were trained to answer brief questionnaires on handheld PDA devices: (a) each time they ate or drank, (b) when prompted randomly, and (c) once each evening. Data were collected over 7 days for each participant. Participants reported their location (e.g., school grounds, home), mood, social environment, activities (e.g., watching TV, texting), cravings, food cues (e.g., saw a snack), and food choices. Results showed that adolescents' consumption of unhealthy snacks or sweet drinks was associated with being at school, being with friends, feeling lonely or bored, craving a drink or snack, and exposure to food cues. Surprisingly, sweet drink consumption was associated with exercising. Watching TV was associated with consuming sweet snacks but not with salty snacks or sweet drinks. These findings identify important environmental and intrapersonal cues to poor snacking choices that may be applied to interventions designed to disrupt these cue-linked eating habits.

Contributors: Grenard, Jerry L. (Author) / Stacy, Alan W. (Author) / Shiffman, Saul (Author) / Baraldi, Amanda (Author) / MacKinnon, David (Author) / Lockhart, Ginger (Author) / Kisbu-Sakarya, Yasemin (Author) / Boyle, Sarah (Author) / Beleva, Yuliyana (Author) / Koprowski, Carol (Author) / Ames, Susan L. (Author) / Reynolds, Kim D. (Author) / College of Liberal Arts and Sciences (Contributor)
Created: 2013-09-09
Description

Methodologists have developed mediation analysis techniques for a broad range of substantive applications, yet methods for estimating mediating mechanisms with missing data have been understudied. This study outlined a general Bayesian missing data handling approach that can accommodate mediation analyses with any number of manifest variables. Computer simulation studies showed that the Bayesian approach produced frequentist coverage rates and power estimates comparable to those of maximum likelihood with the bias-corrected bootstrap. We share a SAS macro that implements Bayesian estimation and demonstrate its use with two data analysis examples.
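
As an illustration of the bias-corrected bootstrap that the Bayesian approach was benchmarked against, here is a minimal Python sketch (not the authors' SAS macro) estimating the indirect effect a*b in a single-mediator model; the sample size, path coefficients, and data are invented for demonstration.

```python
# A minimal sketch, not the authors' SAS macro: estimate the indirect
# effect a*b in a single-mediator model and build a bias-corrected
# bootstrap CI. Sample size, paths, and data are invented.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=n)
M = 0.4 * X + rng.normal(size=n)             # a-path = 0.4
Y = 0.5 * M + 0.2 * X + rng.normal(size=n)   # b-path = 0.5

def indirect(X, M, Y):
    a = np.polyfit(X, M, 1)[0]               # slope of M on X
    design = np.column_stack([M, X, np.ones(len(X))])
    b = np.linalg.lstsq(design, Y, rcond=None)[0][0]  # slope of Y on M, given X
    return a * b

est = indirect(X, M, Y)
boots = np.empty(2000)
for i in range(boots.size):
    idx = rng.integers(0, n, n)              # resample cases with replacement
    boots[i] = indirect(X[idx], M[idx], Y[idx])

# Bias correction: shift the percentile endpoints by z0, the normal
# quantile of the share of bootstrap estimates below the point estimate.
z0 = norm.ppf(np.mean(boots < est))
lo, hi = norm.cdf(2 * z0 + norm.ppf([0.025, 0.975]))
ci = np.quantile(boots, [lo, hi])
print(f"ab = {est:.3f}, 95% BC CI = ({ci[0]:.3f}, {ci[1]:.3f})")
```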

Contributors: Enders, Craig (Author) / Fairchild, Amanda J. (Author) / MacKinnon, David (Author) / College of Liberal Arts and Sciences (Contributor)
Created: 2013
Description

Background: Immunosignaturing is a new peptide microarray-based technology for profiling humoral immune responses. Despite new challenges, immunosignaturing gives us the opportunity to explore new and fundamentally different research questions. In addition to classifying samples based on disease status, the complex patterns and latent factors underlying immunosignatures, which we attempt to model, may have a diverse range of applications.

Methods: We investigate the utility of a number of statistical methods to determine model performance and address challenges inherent in analyzing immunosignatures. These methods include exploratory and confirmatory factor analysis, classical significance testing, structural equation modeling, and mixture modeling.

Results: We demonstrate an ability to classify samples based on disease status and show that immunosignaturing is a very promising technology for disease screening, including presymptomatic screening. In addition, we are able to model complex patterns and latent factors underlying immunosignatures. These latent factors may serve as biomarkers for disease and may play a key role in a bioinformatic method for antibody discovery.

Conclusion: Based on this research, we lay out an analytic framework illustrating how immunosignatures may be useful as a general method for disease screening, including presymptomatic screening, as well as for antibody discovery.
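
As a hedged illustration of the latent-factor modeling the authors describe, the sketch below applies exploratory factor analysis to a synthetic samples-by-peptides intensity matrix; the array dimensions and data are invented, and scikit-learn's FactorAnalysis stands in for the authors' own modeling tools.

```python
# A hedged sketch of one analysis the abstract mentions: extracting latent
# factors from a peptide-array intensity matrix with exploratory factor
# analysis. The data here are synthetic; a real immunosignature would be a
# samples-by-peptides matrix of antibody-binding intensities.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(1)
n_samples, n_peptides, n_factors = 60, 500, 3

# Simulate intensities driven by a few latent antibody responses.
loadings = rng.normal(size=(n_factors, n_peptides))
scores = rng.normal(size=(n_samples, n_factors))
intensities = scores @ loadings + rng.normal(scale=0.5,
                                             size=(n_samples, n_peptides))

fa = FactorAnalysis(n_components=n_factors, random_state=0)
latent = fa.fit_transform(intensities)   # per-sample factor scores
print(latent.shape)                      # (60, 3)
```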

Contributors: Brown, Justin (Author) / Stafford, Phillip (Author) / Johnston, Stephen (Author) / Dinu, Valentin (Author) / College of Health Solutions (Contributor)
Created: 2011-08-19
Description

Background: High-throughput technologies such as DNA, RNA, protein, antibody, and peptide microarrays are often used to examine differences across drug treatments, diseases, transgenic animals, and other conditions. Typically, one trains a classification system by gathering large amounts of probe-level data and selecting informative features, then classifies test samples using a small number of those features. As new microarrays are invented, classification systems that worked well for other array types may not be ideal. Expression microarrays, arguably one of the most prevalent array types, have been used for years to help develop classification algorithms, and many biological assumptions are built into classifiers designed for these data. One of the more problematic assumptions is independence, both at the probe level and at the biological level. Probes for RNA transcripts are designed to bind single transcripts. At the biological level, many genes have dependencies across transcriptional pathways, where co-regulation of transcriptional units may make many genes appear completely dependent. Thus, algorithms that perform well on gene expression data may not be suitable for other technologies with different binding characteristics. The immunosignaturing microarray is based on complex mixtures of antibodies binding to arrays of random-sequence peptides. It relies on many-to-many binding: each peptide can bind multiple antibodies, and each antibody can bind multiple peptides. This technology has been shown to be highly reproducible and appears promising for diagnosing a variety of disease states. However, it is not clear which classification algorithm is optimal for analyzing this new type of data.

Results: We characterized several classification algorithms to analyze immunosignaturing data. We selected several datasets that range from easy to difficult to classify, from simple monoclonal binding to complex binding patterns in asthma patients. We then classified the biological samples using 17 different classification algorithms. Using a wide variety of assessment criteria, we found ‘Naïve Bayes’ far more useful than other widely used methods due to its simplicity, robustness, speed and accuracy.

Conclusions: The ‘Naïve Bayes’ algorithm appears to accommodate the complex patterns hidden within multilayered immunosignaturing microarray data due to its fundamental mathematical properties.
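
To make the conclusion concrete, here is a minimal sketch, assuming synthetic data, of Gaussian naive Bayes classification of peptide-array profiles with scikit-learn; it omits the paper's feature selection and 17-algorithm comparison.

```python
# A minimal sketch, not the paper's pipeline: classify synthetic
# peptide-array profiles with Gaussian naive Bayes, the family of
# classifier the study found most effective.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
n_per_class, n_peptides = 40, 1000

# Two disease classes that differ on a small subset of peptides.
healthy = rng.normal(size=(n_per_class, n_peptides))
disease = rng.normal(size=(n_per_class, n_peptides))
disease[:, :25] += 1.0                    # 25 informative peptides

X = np.vstack([healthy, disease])
y = np.array([0] * n_per_class + [1] * n_per_class)

scores = cross_val_score(GaussianNB(), X, y, cv=5)
print(f"5-fold accuracy: {scores.mean():.2f}")
```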

Contributors: Kukreja, Muskan (Author) / Johnston, Stephen (Author) / Stafford, Phillip (Author) / Biodesign Institute (Contributor)
Created: 2012-06-21
Description

The rise in antibiotic resistance has led to an increased research focus on the discovery of new antibacterial candidates. While broad-spectrum antibiotics are widely pursued, there is evidence that resistance arises in part from their widespread use. Our group has developed a system to produce protein affinity agents, called synbodies, which have high affinity and specificity for their target. In this report, we describe the adaptation of this system to produce new antibacterial candidates against a target bacterium. The system screens target bacteria against an array of 10,000 random-sequence peptides and, using a combination of membrane labeling and intracellular dyes, identifies peptides with target-specific binding or killing functions. Binding and lytic peptides were identified in this manner, and in vitro tests confirmed the activity of the lead peptides. A peptide with antibacterial activity was linked to a peptide that specifically binds Staphylococcus aureus to create a synbody with increased antibacterial activity. Subsequent tests showed that this synbody could block S. aureus-induced killing of HEK293 cells in a co-culture experiment. These results demonstrate the feasibility of using the synbody system to discover new antibacterial candidate agents.
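
The hit-selection step can be illustrated with a short sketch: given per-peptide binding and killing readouts from a 10,000-peptide array, flag peptides that stand out from the array-wide distribution. The readouts and the z > 3 cutoff below are hypothetical, not taken from the paper.

```python
# An illustrative sketch (not the authors' assay code) of hit selection:
# flag peptides whose binding or killing signal stands well above the
# array-wide distribution. Signals and thresholds are hypothetical.
import numpy as np

rng = np.random.default_rng(3)
n_peptides = 10_000
binding = rng.normal(loc=1.0, scale=0.3, size=n_peptides)   # membrane label
killing = rng.normal(loc=0.1, scale=0.1, size=n_peptides)   # dye uptake

def zscores(x):
    return (x - x.mean()) / x.std()

# z > 3 is an arbitrary cutoff for demonstration, not the paper's.
binders = np.flatnonzero(zscores(binding) > 3)
lytic = np.flatnonzero(zscores(killing) > 3)
print(f"{binders.size} binding hits, {lytic.size} lytic hits")
```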

Contributors: Domenyuk, Valeriy (Author) / Loskutov, Andrey (Author) / Johnston, Stephen (Author) / Diehnelt, Chris (Author) / Biodesign Institute (Contributor)
Created: 2013-01-23
Description

As gesture interfaces become more mainstream, it is increasingly important to investigate the behavioral characteristics of these interactions, particularly in three-dimensional (3D) space. In this study, Fitts' method was extended to such input technologies, and the applicability of Fitts' law to gesture-based interactions was examined. The experiment included three gesture-based input devices that use different techniques to capture user movement and compared them to conventional input technologies such as the touchscreen and mouse. Participants completed a target-acquisition test in which they were instructed to move a cursor from a home location to a spherical target as quickly and accurately as possible. Three distances and three target sizes were tested six times each, in randomized order, for all input devices. A total of 81 participants completed all tasks. Movement time, error rate, and throughput were calculated for each input technology. Results showed that mean movement time was highly correlated with the target's index of difficulty for all devices, providing evidence that Fitts' law can be extended to gesture-based devices. Throughput was significantly lower for the gesture-based devices than for the mouse and touchscreen, and as the index of difficulty increased, movement time increased significantly more for the gesture technologies. Error counts were significantly higher for all gesture-based input technologies than for the mouse. In addition, error counts for all inputs were highly correlated with target width but were little affected by movement distance. Overall, the findings suggest that gesture-based devices can be characterized by Fitts' law in much the same way as conventional 1D or 2D devices.
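
The analysis described above can be sketched in a few lines: compute each condition's index of difficulty (Shannon formulation), fit MT = a + b * ID by least squares, and derive throughput as ID/MT. The movement times below are simulated, not the study's data.

```python
# A hedged sketch of a Fitts' law analysis: index of difficulty for a
# 3 x 3 distance-by-width design, a linear fit of movement time on ID,
# and mean throughput. All numbers are made up for illustration.
import numpy as np

D = np.repeat([100.0, 200.0, 400.0], 3)   # movement distances (arbitrary units)
W = np.tile([10.0, 20.0, 40.0], 3)        # target widths
ID = np.log2(D / W + 1)                   # index of difficulty (bits)

# Simulated movement times around a hypothetical a=0.3 s, b=0.25 s/bit.
MT = 0.3 + 0.25 * ID + np.random.default_rng(4).normal(scale=0.02, size=9)

b, a = np.polyfit(ID, MT, 1)              # slope b, intercept a
throughput = (ID / MT).mean()             # bits per second
print(f"MT = {a:.2f} + {b:.2f} * ID, throughput = {throughput:.1f} bit/s")
```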

Contributors: Burno, Rachael A. (Author) / Wu, Bing (Author) / Doherty, Rina (Author) / Colett, Hannah (Author) / Elnaggar, Rania (Author) / Ira A. Fulton School of Engineering (Contributor)
Created: 2015-10-23
Description

Load-associated fatigue cracking is one of the major distress types occurring in flexible pavements. The flexural beam fatigue laboratory test has been used for several decades and is considered an integral part of the Superpave advanced characterization procedure. One of the most significant ways to extend the fatigue life of an asphalt mixture is to add sustainable materials such as rubber or polymers. A laboratory testing program was performed on three gap-graded mixtures: unmodified, Asphalt Rubber (AR), and polymer-modified. Strain-controlled fatigue tests were conducted according to the AASHTO T321 procedure. The results of the beam fatigue tests indicated that the AR and polymer-modified gap-graded mixtures would have much longer fatigue lives than the reference (unmodified) mixture. In addition, a mechanistic analysis using 3D-Move software, coupled with a cost-effectiveness analysis based on the fatigue performance of the three mixtures, was performed. Overall, the analysis showed that although rubber and polymer modification increase the cost of the material, the AR and polymer-modified asphalt mixtures are significantly more cost-effective than the unmodified HMA mixture.
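
As an illustrative sketch of the underlying calculations, assuming invented numbers throughout, the snippet below fits the classical strain-based fatigue relation Nf = k1 * (1/strain)^k2 to beam-fatigue results and compares a simple cost-effectiveness ratio (cycles to failure per unit cost) across mixtures. Neither the relation's coefficients nor the costs come from the paper.

```python
# An illustrative calculation, not the paper's analysis: fit the classical
# strain-based fatigue relation Nf = k1 * (1/strain)^k2 to beam-fatigue
# data, then compare cycles-to-failure per unit cost. All numbers invented.
import numpy as np

strain = np.array([400e-6, 600e-6, 800e-6])   # applied tensile strains
cycles = np.array([2.0e6, 4.0e5, 9.0e4])      # observed cycles to failure

# Linear fit in log space: log Nf = log k1 + k2 * log(1/strain)
k2, log_k1 = np.polyfit(np.log(1 / strain), np.log(cycles), 1)
print(f"Nf = {np.exp(log_k1):.3g} * (1/strain)^{k2:.2f}")

# Hypothetical fatigue lives at a design strain and unit costs ($/ton);
# the ratio mirrors the paper's cost-effectiveness comparison in spirit.
mixes = {
    "unmodified":       {"life": 3.0e5, "cost": 60.0},
    "asphalt-rubber":   {"life": 1.2e6, "cost": 75.0},
    "polymer-modified": {"life": 1.0e6, "cost": 80.0},
}
for name, m in mixes.items():
    print(f"{name}: {m['life'] / m['cost']:,.0f} cycles per dollar")
```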

Contributors: Souliman, Mena I. (Author) / Mamlouk, Michael (Author) / Eifert, Annie (Author) / Ira A. Fulton School of Engineering (Contributor)
Created: 2016-05-20
Description

The principles of a new project management model have been tested for the past 20 years. This model utilizes expertise instead of traditional management, direction, and control (MDC); it is a leadership-based model rather than a management model. Practicing the new model requires a change in paradigm and in project management structure. Practices of this new paradigm include minimizing the flow of information and communications to and from the project manager (including meetings, emails, and documents), eliminating technical communications, reducing client management, direction, and control of the vendor, and hiring vendors or personnel to do specific tasks. A vendor is hired only after they have clearly shown that they know what they are doing, by demonstrating past performance on similar projects; that they understand how to create transparency to minimize risk they do not control; and that they can outline their project plan in a detailed milestone schedule covering time, cost, and tasks, all communicated in the language of metrics.

Contributors: Rivera, Alfredo (Author) / Kashiwagi, Dean (Author) / Ira A. Fulton School of Engineering (Contributor)
Created: 2016-05-20
Description

For the past three decades, the Saudi construction industry (SCI) has exhibited poor performance. Many research efforts have tried to identify the problem and its potential causes, but there have been few publications identifying ways to mitigate the problem and describing testing to validate the proposed solution. This paper examines the research and development (R&D) approach in the SCI. A literature review was performed to identify the impact that R&D has had on the SCI, and a questionnaire was created to survey industry professionals and researchers. The results show evidence that SCI practice and academic research exist in separate silos. This study recommends a change of mindset in both the public and private sectors regarding R&D, since cooperation is required to create collaboration between the two sectors and improve the competitiveness of the country's economy.

Contributors: Alhammadi, Yasir (Author) / Algahtany, Mohammed (Author) / Kashiwagi, Dean (Author) / Sullivan, Kenneth (Author) / Kashiwagi, Jacob (Author) / Ira A. Fulton School of Engineering (Contributor)
Created: 2016-05-20
Description

To date, little research has been performed on the planning and management of “small” projects, those typically differentiated from “large” projects by their lower costs. In 2013, the Construction Industry Institute (CII) set out to develop a front-end planning tool that would provide practitioners with a standardized process for planning small projects in the industrial sector. The research team determined that data should be sought from industry regarding small industrial projects to ensure the applicability, effectiveness, and validity of the new tool. The team developed and administered a survey to determine (1) the prevalence of small projects, (2) the planning processes currently in use for small projects, and (3) the metrics currently used by industry to differentiate between small and large projects. The survey data showed that small projects make up the majority of projects completed in the industrial sector, that planning of these projects varies greatly across the industry, and that the metrics posed in the survey were mostly not appropriate for differentiating between small and large projects. This study contributes to knowledge by adding to the limited research on small projects and by suggesting future research on using measures of project complexity to differentiate between small and large projects.

Contributors: Collins, Wesley (Author) / Parrish, Kristen (Author) / Gibson, G. (Author) / Ira A. Fulton School of Engineering (Contributor)
Created: 2017-08-24