This growing collection consists of scholarly works authored by ASU-affiliated faculty, staff, and community members, and it contains many open access articles. ASU-affiliated authors are encouraged to Share Your Work in KEEP.

Description

Background: Immunosignaturing is a new peptide microarray-based technology for profiling humoral immune responses. Despite new challenges, immunosignaturing gives us the opportunity to explore new and fundamentally different research questions. In addition to classifying samples based on disease status, the complex patterns and latent factors underlying immunosignatures, which we attempt to model, may have a diverse range of applications.

Methods: We investigate the utility of a number of statistical methods to determine model performance and address challenges inherent in analyzing immunosignatures. These methods include exploratory and confirmatory factor analysis, classical significance testing, and structural equation and mixture modeling.

Results: We demonstrate an ability to classify samples based on disease status and show that immunosignaturing is a very promising technology for screening and presymptomatic screening of disease. In addition, we are able to model complex patterns and latent factors underlying immunosignatures. These latent factors may serve as biomarkers for disease and may play a key role in a bioinformatic method for antibody discovery.

Conclusion: Based on this research, we lay out an analytic framework illustrating how immunosignatures may be useful as a general method for screening and presymptomatic screening of disease as well as antibody discovery.
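The analytic framework described above combines latent-factor modeling with disease-status classification. As a minimal sketch of that idea, the following example fits an exploratory factor analysis to simulated peptide-array intensities and classifies simulated disease status from the factor scores; the data, factor count, and choice of classifier are illustrative assumptions, not the authors' pipeline.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_samples, n_peptides, n_factors = 60, 500, 3

# Simulate latent factors whose means differ by disease status, plus peptide noise.
status = np.repeat([0, 1], n_samples // 2)          # 0 = control, 1 = disease
latent = rng.normal(size=(n_samples, n_factors)) + 1.5 * status[:, None]
loadings = rng.normal(size=(n_factors, n_peptides))
intensities = latent @ loadings + rng.normal(scale=2.0, size=(n_samples, n_peptides))

# Exploratory factor analysis recovers a low-dimensional representation.
fa = FactorAnalysis(n_components=n_factors, random_state=0)
scores = fa.fit_transform(intensities)

# Classify disease status from the factor scores with cross-validation.
acc = cross_val_score(LogisticRegression(), scores, status, cv=5).mean()
print(f"cross-validated accuracy: {acc:.2f}")
```

The factor scores, rather than the raw peptide intensities, serve as the classifier's features, mirroring the paper's use of latent factors as candidate biomarkers.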

Contributors: Brown, Justin (Author) / Stafford, Phillip (Author) / Johnston, Stephen (Author) / Dinu, Valentin (Author) / College of Health Solutions (Contributor)
Created: 2011-08-19
Description

Background: We introduced a hypometabolic convergence index (HCI) to characterize in a single measurement the extent to which a person’s fluorodeoxyglucose positron emission tomogram (FDG PET) corresponds to that in Alzheimer’s disease (AD). Apolipoprotein E ε4 (APOE ε4) gene dose is associated with three levels of risk for late-onset AD. We explored the association between gene dose and HCI in cognitively normal ε4 homozygotes, heterozygotes, and non-carriers.

Methods: An algorithm was used to characterize and compare AD-related HCIs in cognitively normal individuals, including 36 ε4 homozygotes, 46 heterozygotes, and 78 non-carriers.

Results: These three groups differed significantly in their HCIs (ANOVA, p = 0.004), and there was a significant association between HCIs and gene dose (linear trend, p = 0.001).

Conclusions: The HCI is associated with three levels of genetic risk for late-onset AD. This supports the possibility of using a single FDG PET measurement to help in the preclinical detection and tracking of AD.
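The statistics reported above (a one-way ANOVA across the three gene-dose groups and a linear-trend test of HCI against gene dose) can be sketched as follows. The HCI values are simulated; the group sizes mirror the abstract (78 non-carriers, 46 heterozygotes, 36 homozygotes), but the effect sizes and direction are assumptions for illustration only.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Simulated HCIs with mean increasing by gene dose (assumed effect direction).
hci_non  = rng.normal(loc=10.0, scale=3.0, size=78)   # 0 copies of ε4
hci_het  = rng.normal(loc=11.5, scale=3.0, size=46)   # 1 copy
hci_homo = rng.normal(loc=13.0, scale=3.0, size=36)   # 2 copies

# One-way ANOVA: do the three groups differ?
f_stat, p_anova = stats.f_oneway(hci_non, hci_het, hci_homo)

# Linear trend: regress HCI on gene dose (0, 1, 2).
dose = np.concatenate([np.zeros(78), np.ones(46), np.full(36, 2.0)])
hci = np.concatenate([hci_non, hci_het, hci_homo])
slope, intercept, r, p_trend, se = stats.linregress(dose, hci)
print(f"ANOVA p = {p_anova:.4f}, trend p = {p_trend:.4f}, slope = {slope:.2f}")
```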

Contributors: Schraml, Frank (Author) / Chen, Kewei (Author) / Ayutyanont, Napatkamon (Author) / Auttawut, Roontiva (Author) / Langbaum, Jessica B. S. (Author) / Lee, Wendy (Author) / Liu, Xiaofen (Author) / Bandy, Dan (Author) / Reeder, Stephanie Q. (Author) / Alexander, Gene E. (Author) / Caselli, Richard J. (Author) / Fleisher, Adam S. (Author) / Reiman, Eric M. (Author) / Alzheimer's Disease Neuroimaging Initiative (Project) (Contributor)
Created: 2013-06-26
Description

One of the gravest dangers facing cancer patients is an extended symptom-free lull between tumor initiation and the first diagnosis. Detection of tumors is critical for effective intervention. Using the body’s immune system to detect and amplify tumor-specific signals may enable detection of cancer using an inexpensive immunoassay. Immunosignatures are one such assay: they provide a map of antibody interactions with random-sequence peptides. They enable detection of disease-specific patterns using classic train/test methods. However, to date, very little effort has gone into extracting information from the sequence of peptides that interact with disease-specific antibodies. Because it is difficult to represent all possible antigen peptides in a microarray format, we chose to synthesize only 330,000 peptides on a single immunosignature microarray. The 330,000 random-sequence peptides on the microarray represent 83% of all tetramers and 27% of all pentamers, creating an unbiased but substantial gap in the coverage of total sequence space. We therefore chose to examine many relatively short motifs from these random-sequence peptides. Time-variant analysis of recurrent subsequences provided a means to dissect amino acid sequences from the peptides while simultaneously retaining the antibody–peptide binding intensities. We first used a simple experiment in which monoclonal antibodies with known linear epitopes were exposed to these random-sequence peptides, and their binding intensities were used to create our algorithm. We then demonstrated the performance of the proposed algorithm by examining immunosignatures from patients with glioblastoma multiforme (GBM), an aggressive form of brain cancer. Eight different frameshift targets were identified from the random-sequence peptides using this technique.
If immune-reactive antigens can be identified using a relatively simple immune assay, it might enable a diagnostic test with sufficient sensitivity to detect tumors in a clinically useful way.
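The core of the motif-extraction idea, tallying short subsequences of the random-sequence peptides weighted by antibody-binding intensity so that enriched motifs surface as candidate epitopes, can be sketched as follows. The peptides, intensities, and k-mer length are toy assumptions, not the published algorithm.

```python
from collections import Counter

def weighted_kmer_counts(peptides, intensities, k=4):
    """Sum binding intensity over every occurrence of each k-mer."""
    counts = Counter()
    for peptide, intensity in zip(peptides, intensities):
        for i in range(len(peptide) - k + 1):
            counts[peptide[i:i + k]] += intensity
    return counts

# Toy data: peptides containing the motif "DFRG" bind more strongly.
peptides = ["AKDFRGML", "GGDFRGAA", "MLKPQRST", "TTTSVWYA"]
intensities = [9.0, 8.5, 1.0, 0.8]

counts = weighted_kmer_counts(peptides, intensities, k=4)
top_motif, top_score = counts.most_common(1)[0]
print(top_motif, top_score)
```

Because the motif recurs across the two strongly binding peptides, its accumulated score dominates every k-mer that appears only once.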

Created: 2015-06-18
Description

Background: High-throughput technologies such as DNA, RNA, protein, antibody and peptide microarrays are often used to examine differences across drug treatments, diseases, transgenic animals, and others. Typically, one trains a classification system by gathering large amounts of probe-level data and selecting informative features, then classifies test samples using a small number of those features. As new microarrays are invented, classification systems that worked well for other array types may not be ideal. Expression microarrays, arguably one of the most prevalent array types, have been used for years to help develop classification algorithms. Many biological assumptions are built into classifiers that were designed for these types of data. One of the more problematic is the assumption of independence, both at the probe level and again at the biological level. Probes for RNA transcripts are designed to bind single transcripts. At the biological level, many genes have dependencies across transcriptional pathways, where co-regulation of transcriptional units may make many genes appear completely dependent. Thus, algorithms that perform well for gene expression data may not be suitable for technologies with different binding characteristics. The immunosignaturing microarray is based on complex mixtures of antibodies binding to arrays of random-sequence peptides. It relies on many-to-many binding of antibodies to the random-sequence peptides: each peptide can bind multiple antibodies, and each antibody can bind multiple peptides. This technology has been shown to be highly reproducible and appears promising for diagnosing a variety of disease states. However, it is not clear which classification algorithm is optimal for analyzing this new type of data.

Results: We characterized several classification algorithms to analyze immunosignaturing data. We selected several datasets that range from easy to difficult to classify, from simple monoclonal binding to complex binding patterns in asthma patients. We then classified the biological samples using 17 different classification algorithms. Using a wide variety of assessment criteria, we found ‘Naïve Bayes’ far more useful than other widely used methods due to its simplicity, robustness, speed and accuracy.

Conclusions: ‘Naïve Bayes’ algorithm appears to accommodate the complex patterns hidden within multilayered immunosignaturing microarray data due to its fundamental mathematical properties.
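As a minimal sketch of the winning approach, the following example trains a Gaussian Naïve Bayes classifier on simulated peptide-array intensities. The data shape, number of informative peptides, and effect size are illustrative assumptions, not the study's datasets.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n_per_class, n_peptides = 40, 200

# Two classes whose mean intensities differ on a subset of "informative" peptides.
healthy = rng.normal(loc=0.0, scale=1.0, size=(n_per_class, n_peptides))
disease = rng.normal(loc=0.0, scale=1.0, size=(n_per_class, n_peptides))
disease[:, :20] += 1.2                        # 20 informative peptides

X = np.vstack([healthy, disease])
y = np.array([0] * n_per_class + [1] * n_per_class)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0)

clf = GaussianNB().fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)
print(f"test accuracy: {accuracy:.2f}")
```

Naïve Bayes treats each peptide as conditionally independent, which, as the abstract notes, is a more defensible assumption for random-sequence peptide arrays than for co-regulated gene expression probes.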

Contributors: Kukreja, Muskan (Author) / Johnston, Stephen (Author) / Stafford, Phillip (Author) / Biodesign Institute (Contributor)
Created: 2012-06-21
Description

Purpose: Positron emission tomography (PET) imaging studies of functional metabolism in the animal brain using fluorodeoxyglucose (¹⁸F-FDG) are important in neuroscience. FDG-PET imaging studies are often performed on groups of rats, so it is desirable to establish an objective voxel-based statistical methodology for group data analysis.

Material and Methods: This study establishes a statistical parametric mapping (SPM) toolbox (plug-ins) named spmratIHEP for voxel-wise analysis of FDG-PET images of the rat brain, in which an FDG-PET template and an intracranial mask image of the rat brain in Paxinos & Watson space were constructed, and the default settings were modified according to features of the rat brain. Compared to previous studies, our rat brain template comprises not only the cerebrum and cerebellum but also the whole olfactory bulb, making subsequent cognitive studies more comprehensive. With an intracranial mask image in the template space, the brain tissue of individual animals can be extracted automatically. Moreover, an atlas space is used to anatomically label the functional findings in the Paxinos & Watson space. To register the template image with the atlas accurately, a synthetic FDG-PET image containing six main anatomical structures was constructed from the atlas, which serves as the target image in co-registration.

Results: The spatial normalization procedure was evaluated: individual rat brain images could be standardized into the Paxinos & Watson space successfully, and the intracranial tissue could be extracted accurately. The practical usability of the toolbox was evaluated using FDG-PET functional images from rats with left-side middle cerebral artery occlusion (MCAO) compared with normal control rats. The two-sample t-test results localized primarily to the territory of the left middle cerebral artery.

Conclusion: We established a toolbox of SPM8 named spmratIHEP for voxel-wise analysis of FDG-PET images of rat brain.

Contributors: Nie, Binbin (Author) / Liu, Hua (Author) / Chen, Kewei (Author) / Jiang, Xiaofeng (Author) / Shan, Baoci (Author) / College of Liberal Arts and Sciences (Contributor)
Created: 2014-09-26
Description

As gesture interfaces become more mainstream, it is increasingly important to investigate the behavioral characteristics of these interactions – particularly in three-dimensional (3D) space. In this study, Fitts’ method was extended to such input technologies, and the applicability of Fitts’ law to gesture-based interactions was examined. The experiment included three gesture-based input devices that utilize different techniques to capture user movement, and compared them to conventional input technologies like touchscreen and mouse. Participants completed a target-acquisition test and were instructed to move a cursor from a home location to a spherical target as quickly and accurately as possible. Three distances and three target sizes were tested six times in a randomized order for all input devices. A total of 81 participants completed all tasks. Movement time, error rate, and throughput were calculated for each input technology. Results showed that the mean movement time was highly correlated with the target's index of difficulty for all devices, providing evidence that Fitts’ law can be extended and applied to gesture-based devices. Throughputs were found to be significantly lower for the gesture-based devices compared to mouse and touchscreen, and as the index of difficulty increased, the movement time increased significantly more for these gesture technologies. Error counts were statistically higher for all gesture-based input technologies compared to mouse. In addition, error counts for all inputs were highly correlated with target width, but little impact was shown by movement distance. Overall, the findings suggest that gesture-based devices can be characterized by Fitts’ law in a similar fashion to conventional 1D or 2D devices.
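The analysis above rests on the Shannon formulation of the index of difficulty and a linear fit MT = a + b·ID, with throughput inversely related to the slope. A minimal sketch with simulated movement times (the distances and widths echo the 3 × 3 design, but the timing values and noise level are invented):

```python
import math
import numpy as np

def index_of_difficulty(distance, width):
    """Shannon formulation: ID = log2(D / W + 1), in bits."""
    return math.log2(distance / width + 1)

# Three distances x three widths, as in the experimental design.
conditions = [(d, w) for d in (100.0, 200.0, 400.0) for w in (20.0, 40.0, 80.0)]
ids = np.array([index_of_difficulty(d, w) for d, w in conditions])

# Simulated mean movement times (ms) that follow MT = a + b * ID plus noise.
rng = np.random.default_rng(7)
mt = 200.0 + 150.0 * ids + rng.normal(scale=10.0, size=ids.size)

# Least-squares fit recovers intercept a and slope b; throughput scales as 1/b.
b, a = np.polyfit(ids, mt, 1)
throughput = 1000.0 / b                       # bits per second
print(f"a = {a:.0f} ms, b = {b:.0f} ms/bit, throughput = {throughput:.1f} bits/s")
```

A higher slope b (more milliseconds per bit of difficulty) corresponds to a lower throughput, which is why the gesture devices' steeper movement-time growth translates into the lower throughputs reported above.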

Contributors: Burno, Rachael A. (Author) / Wu, Bing (Author) / Doherty, Rina (Author) / Colett, Hannah (Author) / Elnaggar, Rania (Author) / Ira A. Fulton Schools of Engineering (Contributor)
Created: 2015-10-23
Description

Cerebral small-vessel damage manifests as white matter hyperintensities and cerebral atrophy on brain MRI and is associated with aging, cognitive decline and dementia. We sought to examine the interrelationship of these imaging biomarkers and the influence of hypertension in older individuals. We used a multivariate spatial covariance neuroimaging technique to localize the effects of white matter lesion load on regional gray matter volume and assessed the role of blood pressure control, age and education on this relationship. Using a case-control design matching for age, gender, and educational attainment we selected 64 participants with normal blood pressure, controlled hypertension or uncontrolled hypertension from the Northern Manhattan Study cohort. We applied gray matter voxel-based morphometry with the scaled subprofile model to (1) identify regional covariance patterns of gray matter volume differences associated with white matter lesion load, (2) compare this relationship across blood pressure groups, and (3) relate it to cognitive performance. In this group of participants aged 60–86 years, we identified a pattern of reduced gray matter volume associated with white matter lesion load in bilateral temporal-parietal regions with relative preservation of volume in the basal forebrain, thalami and cingulate cortex. This pattern was expressed most in the uncontrolled hypertension group and least in the normotensives, but was also more evident in older and more educated individuals. Expression of this pattern was associated with worse performance in executive function and memory. In summary, white matter lesions from small-vessel disease are associated with a regional pattern of gray matter atrophy that is mitigated by blood pressure control, exacerbated by aging, and associated with cognitive performance.
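A loose stand-in for the multivariate covariance approach: principal component analysis of simulated regional gray-matter volumes, followed by correlating each component's subject scores with white matter lesion load. This is an illustrative sketch with invented data, not the scaled subprofile model implementation used in the study.

```python
import numpy as np

rng = np.random.default_rng(3)
n_subjects, n_regions = 64, 50

# Simulate a spatial pattern whose expression tracks lesion load.
lesion_load = rng.gamma(shape=2.0, scale=1.0, size=n_subjects)
pattern = rng.normal(size=n_regions)
gm = (np.outer(lesion_load, -pattern)          # volume loss scales with load
      + rng.normal(scale=1.5, size=(n_subjects, n_regions)))

# PCA via SVD of the mean-centered data matrix.
centered = gm - gm.mean(axis=0)
u, s, vt = np.linalg.svd(centered, full_matrices=False)
scores = u * s                                 # subject expression scores

# Which component's expression correlates most with lesion load?
r = [abs(np.corrcoef(scores[:, k], lesion_load)[0, 1]) for k in range(5)]
best = int(np.argmax(r))
print(f"component {best} correlates r = {max(r):.2f} with lesion load")
```

The subject expression scores of the recovered component play the role that pattern expression plays in the study: a per-subject summary that can be related to lesion load, blood pressure group, or cognitive performance.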

Contributors: Kern, Kyle C. (Author) / Wright, Clinton B. (Author) / Bergfield, Kaitlin L. (Author) / Fitzhugh, Megan C. (Author) / Chen, Kewei (Author) / Moeller, James R. (Author) / Nabizadeh, Nooshin (Author) / Elkind, Mitchell S. V. (Author) / Sacco, Ralph L. (Author) / Stern, Yaakov (Author) / DeCarli, Charles S. (Author) / Alexander, Gene E. (Author) / Ira A. Fulton Schools of Engineering (Contributor)
Created: 2017-05-15
Description

Load-associated fatigue cracking is one of the major distress types occurring in flexible pavements. The flexural beam fatigue laboratory test has been used for several decades and is considered an integral part of the Superpave advanced characterization procedure. One of the most significant ways to extend the fatigue life of an asphalt mixture is to add sustainable materials such as rubber or polymers. A laboratory testing program was performed on three gap-graded mixtures: unmodified, Asphalt Rubber (AR), and polymer-modified. Strain-controlled fatigue tests were conducted according to the AASHTO T321 procedure. The results from the beam fatigue tests indicated that the AR and polymer-modified gap-graded mixtures would have much longer fatigue lives than the reference (unmodified) mixture. In addition, a mechanistic analysis using 3D-Move software, coupled with a cost-effectiveness analysis based on the fatigue performance of the three mixtures, was performed. Although rubber and polymer modification increase the cost of the material, the analysis showed that the AR and polymer-modified asphalt mixtures are significantly more cost effective than the unmodified HMA mixture.
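The cost-effectiveness comparison reduces to a ratio of fatigue life delivered to material cost. A sketch with invented placeholder values (not the study's measured fatigue lives or costs):

```python
mixtures = {
    # name: (fatigue life in load cycles, cost in $/ton) -- assumed values
    "unmodified":       (200_000, 60.0),
    "asphalt rubber":   (900_000, 80.0),
    "polymer-modified": (800_000, 75.0),
}

def cost_effectiveness(cycles, cost):
    """Fatigue cycles delivered per dollar of material cost."""
    return cycles / cost

ranked = sorted(
    ((name, cost_effectiveness(*vals)) for name, vals in mixtures.items()),
    key=lambda item: item[1], reverse=True)

for name, ce in ranked:
    print(f"{name}: {ce:,.0f} cycles per $")
```

With these assumed numbers, the large fatigue-life gains of the modified mixtures outweigh their higher unit cost, illustrating the direction of the study's conclusion.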

Contributors: Souliman, Mena I. (Author) / Mamlouk, Michael (Author) / Eifert, Annie (Author) / Ira A. Fulton Schools of Engineering (Contributor)
Created: 2016-05-20
Description

The principles of a new project management model have been tested for the past 20 years. This model utilizes expertise instead of traditional management, direction, and control (MDC); it is a leadership-based model rather than a management model. Practicing the new model requires a change in paradigm and project management structure. Practices of this new paradigm include minimizing the flow of information and communications to and from the project manager (including meetings, emails, and documents), eliminating technical communications, reducing client management, direction, and control of the vendor, and hiring vendors or personnel to do specific tasks. A vendor is hired only after it has clearly shown that it knows what it is doing through past performance on similar projects, that it clearly understands how to create transparency to minimize risk it does not control, and that it can clearly outline its project plan using a detailed milestone schedule including time, cost, and tasks, all communicated in the language of metrics.

Contributors: Rivera, Alfredo (Author) / Kashiwagi, Dean (Author) / Ira A. Fulton Schools of Engineering (Contributor)
Created: 2016-05-20
Description

For the past three decades, the Saudi construction industry (SCI) has exhibited poor performance. Many research efforts have tried to identify the problem and its potential causes, but few publications have identified ways to mitigate the problem or described testing to validate a proposed solution. This paper examines the research and development (R&D) approach in the SCI. A literature review was performed to identify the impact that R&D has had on the SCI, and a questionnaire was created to survey industry professionals and researchers. The results show evidence that SCI practice and academic research exist in separate silos. This study recommends a change of mindset in both the public and private sectors regarding R&D, since cooperation between the two is required to create collaboration and improve the competitiveness of the country's economy.

Contributors: Alhammadi, Yasir (Author) / Algahtany, Mohammed (Author) / Kashiwagi, Dean (Author) / Sullivan, Kenneth (Author) / Kashiwagi, Jacob (Author) / Ira A. Fulton Schools of Engineering (Contributor)
Created: 2016-05-20