This growing collection consists of scholarly works authored by ASU-affiliated faculty, staff, and community members, and it contains many open access articles. ASU-affiliated authors are encouraged to Share Your Work in KEEP.

Description

Photosynthesis, a process catalysed by plants, algae and cyanobacteria, converts sunlight to energy, thus sustaining all higher life on Earth. Two large membrane protein complexes, photosystem I and II (PSI and PSII), act in series to catalyse the light-driven reactions in photosynthesis. PSII catalyses the light-driven water splitting process, which maintains the Earth’s oxygenic atmosphere. In this process, the oxygen-evolving complex (OEC) of PSII cycles through five states, S0 to S4, in which four electrons are sequentially extracted from the OEC in four light-driven charge-separation events. Here we describe time-resolved experiments on PSII nano/microcrystals from Thermosynechococcus elongatus performed with the recently developed technique of serial femtosecond crystallography. Structures have been determined from PSII in the dark S1 state and after double laser excitation (putative S3 state) at 5 and 5.5 Å resolution, respectively. The results provide evidence that PSII undergoes significant conformational changes at the electron acceptor side and at the Mn4CaO5 core of the OEC. These include an elongation of the metal cluster, accompanied by changes in the protein environment, which could allow for binding of the second substrate water molecule between the more distant protruding Mn (referred to as the ‘dangler’ Mn) and the Mn3CaOx cubane in the S2 to S3 transition, as predicted by spectroscopic and computational studies. This work shows the great potential of time-resolved serial femtosecond crystallography for the investigation of catalytic processes in biomolecules.

ContributorsKupitz, Christopher (Author) / Basu, Shibom (Author) / Grotjohann, Ingo (Author) / Fromme, Raimund (Author) / Zatsepin, Nadia (Author) / Rendek, Kimberly (Author) / Hunter, Mark (Author) / Shoeman, Robert L. (Author) / White, Thomas A. (Author) / Wang, Dingjie (Author) / James, Daniel (Author) / Yang, Jay-How (Author) / Cobb, Danielle (Author) / Reeder, Brenda (Author) / Sierra, Raymond G. (Author) / Liu, Haiguang (Author) / Barty, Anton (Author) / Aquila, Andrew L. (Author) / Deponte, Daniel (Author) / Kirian, Richard (Author) / Bari, Sadia (Author) / Bergkamp, Jesse (Author) / Beyerlein, Kenneth R. (Author) / Bogan, Michael J. (Author) / Caleman, Carl (Author) / Chao, Tzu-Chiao (Author) / Conrad, Chelsie (Author) / Davis, Katherine M. (Author) / Department of Chemistry and Biochemistry (Contributor)
Created2014-09-11
Description

Background: Cancer diagnosis in both dogs and humans is complicated by the lack of a non-invasive diagnostic test. To meet this clinical need, we apply the recently developed immunosignature assay to spontaneous canine lymphoma as clinical proof-of-concept. Here we evaluate the immunosignature as a diagnostic for spontaneous canine lymphoma both at initial diagnosis and in evaluating the disease-free interval following treatment.

Methods: Sera from dogs with confirmed lymphoma (B cell n = 38, T cell n = 11) and clinically normal dogs (n = 39) were analyzed. Serum antibody responses were characterized by analyzing the binding pattern, or immunosignature, of serum antibodies on a non-natural sequence peptide microarray. Peptides were selected and tested for the ability to distinguish healthy dogs from those with lymphoma and to distinguish lymphoma subtypes based on immunophenotype. The immunosignatures of dogs with lymphoma were evaluated for individual signatures. Changes in the immunosignatures were evaluated following treatment and eventual relapse.
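
The workflow described above amounts to a feature-selection and classification pipeline over peptide-array intensities. Below is a minimal sketch of that kind of pipeline, not the authors' analysis; the array dimensions, labels and scikit-learn components are illustrative assumptions.

```python
# Minimal sketch of a peptide feature-selection + classification pipeline,
# loosely mirroring the immunosignature workflow; data and dimensions are
# synthetic placeholders, not the study's measurements.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n_dogs, n_peptides = 80, 10_000             # hypothetical microarray dimensions
X = rng.normal(size=(n_dogs, n_peptides))   # peptide binding intensities
y = rng.integers(0, 2, size=n_dogs)         # 0 = healthy, 1 = lymphoma (toy labels)

# Select the peptides that best separate the two groups, then classify.
model = make_pipeline(SelectKBest(f_classif, k=100), LogisticRegression(max_iter=1000))
print("cross-validated accuracy:", cross_val_score(model, X, y, cv=5).mean())
```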

Results: Although lymphoma is a clonal disease, both an individual immunosignature and a generalized lymphoma immunosignature were observed in each dog. The general lymphoma immunosignature identified in the initial set of dogs (n = 32) was able to predict disease status in an independent set of dogs (n = 42, 97% accuracy). A separate immunosignature was able to distinguish the lymphomas based on immunophenotype (n = 25, 88% accuracy). The individual immunosignature was capable of confirming remission three months following diagnosis. The immunosignature at diagnosis was able to predict which dogs with B cell lymphoma would relapse in less than 120 days (n = 33, 97% accuracy).

Conclusion: We conclude that the immunosignature can serve as a multilevel diagnostic for canine, and potentially human, lymphoma.

ContributorsJohnston, Stephen (Author) / Thamm, Douglas H. (Author) / Legutki, Joseph Barten (Author) / Biodesign Institute (Contributor)
Created2014-09-08
Description

We present results from experiments at the Linac Coherent Light Source (LCLS) demonstrating that serial femtosecond crystallography (SFX) can be performed to high resolution (~2.5 Å) using protein microcrystals deposited on an ultra-thin silicon nitride membrane and embedded in a preservation medium at room temperature. Data can be acquired at a high acquisition rate using x-ray free electron laser sources to overcome radiation damage, while sample consumption is dramatically reduced compared to flowing jet methods. We achieved a peak data acquisition rate of 10 Hz with a hit rate of ~38%, indicating that a complete data set could be acquired in about one 12-hour LCLS shift using the setup described here, or in even less time using hardware optimized for fixed target SFX. This demonstration opens the door to ultra low sample consumption SFX using the technique of diffraction-before-destruction on proteins that exist in only small quantities and/or do not produce the copious quantities of microcrystals required for flowing jet methods.
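
The claim that a complete data set fits within one 12-hour shift follows directly from the stated acquisition and hit rates. A quick back-of-the-envelope check (only the 10 Hz rate, ~38% hit rate and 12-hour shift come from the text; the number of indexed patterns required for completeness is not specified here):

```python
# Back-of-the-envelope check of the data-collection claim above.
# Only the 10 Hz rate, ~38% hit rate and 12 h shift come from the text;
# the number of patterns needed for a complete data set is left open.
acquisition_rate_hz = 10
hit_rate = 0.38
shift_hours = 12

hits_per_shift = acquisition_rate_hz * hit_rate * shift_hours * 3600
print(f"crystal hits collected in one shift: {hits_per_shift:,.0f}")  # ~164,000
```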

ContributorsHunter, Mark S. (Author) / Segelke, Brent (Author) / Messerschmidt, Marc (Author) / Williams, Garth J. (Author) / Zatsepin, Nadia (Author) / Barty, Anton (Author) / Benner, W. Henry (Author) / Carlson, David B. (Author) / Coleman, Matthew (Author) / Graf, Alexander (Author) / Hau-Riege, Stefan P. (Author) / Pardini, Tommaso (Author) / Seibert, M. Marvin (Author) / Evans, James (Author) / Boutet, Sebastien (Author) / Frank, Matthias (Author) / College of Liberal Arts and Sciences (Contributor)
Created2014-08-12
Description

The estimation of energy demand (by power plants) has traditionally relied on historical energy use data for the region(s) that a plant produces for. Regression analysis, artificial neural networks and Bayesian theory are the most common approaches for analysing these data. Such data and techniques do not generate reliable results. Consequently, excess energy has to be generated to prevent blackouts; the causes of energy surges are not easily determined; and the potential energy use reduction from energy efficiency solutions is usually not translated into actual energy use reduction. The paper highlights the weaknesses of traditional techniques and lays out a framework to improve the prediction of energy demand by combining energy use models of equipment, physical systems and buildings with the proposed data mining algorithms for reverse engineering. The research team first analyses samples from large, complex energy datasets and then presents a set of computationally efficient data mining algorithms for reverse engineering. To develop a structural system model for reverse engineering, two focus groups are developed that have a direct relation to the cause and effect variables. The research findings of this paper include testing different sets of reverse engineering algorithms, understanding their output patterns and modifying the algorithms to improve the accuracy of the outputs.
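
For context, here is a minimal sketch of the kind of regression baseline the paper argues against, i.e. predicting demand purely from historical usage and weather-style features; the feature names and data below are invented for illustration only.

```python
# Sketch of the traditional regression baseline criticized above: predicting
# demand from historical weather/usage features. Data are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n_hours = 24 * 365
outdoor_temp = rng.normal(25, 8, n_hours)           # degrees C
hour_of_day = np.tile(np.arange(24), 365)
demand = 500 + 12 * np.abs(outdoor_temp - 21) + rng.normal(0, 30, n_hours)  # kW, synthetic

X = np.column_stack([outdoor_temp, hour_of_day])
model = LinearRegression().fit(X, demand)
print("R^2 on synthetic history:", model.score(X, demand))
```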

ContributorsNaganathan, Hariharan (Author) / Chong, Oswald (Author) / Ye, Long (Author) / Ira A. Fulton School of Engineering (Contributor)
Created2015-12-09
Description

Small and medium office buildings account for a significant share of the U.S. building stock's energy consumption. Still, owners lack the resources and experience to conduct detailed energy audits and retrofit analyses. We present an eight-step framework for an energy retrofit assessment in small and medium office buildings. Through a bottom-up approach and a web-based retrofit toolkit tested on a case study in Arizona, this methodology was able to save about 50% of the total energy consumed by the case study building, depending on the adopted measures and invested capital. While the case study presented is a deep energy retrofit, the proposed framework is effective in guiding the decision-making process that precedes any energy retrofit, deep or light.
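
The headline outcome (roughly 50% savings, depending on the adopted measures and invested capital) is, at its simplest, an aggregation of per-measure savings against a baseline. A toy illustration follows; the measures, savings fractions and costs are entirely invented, and the measures are treated as independent, which real retrofit analysis would not assume.

```python
# Illustrative aggregation of retrofit measures against a baseline; the
# measures, savings fractions and costs below are invented for the example,
# and interactions between measures are ignored.
baseline_kwh = 250_000  # hypothetical annual consumption of a small office
measures = {            # measure: (fraction of baseline saved, installed cost in USD)
    "LED lighting retrofit": (0.12, 18_000),
    "HVAC upgrade":          (0.28, 90_000),
    "envelope sealing":      (0.08, 15_000),
}
total_saved = sum(f for f, _ in measures.values()) * baseline_kwh
total_cost = sum(c for _, c in measures.values())
print(f"estimated savings: {total_saved:,.0f} kWh/yr "
      f"({total_saved / baseline_kwh:.0%}) for ${total_cost:,}")
```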

ContributorsRios, Fernanda (Author) / Parrish, Kristen (Author) / Chong, Oswald (Author) / Ira A. Fulton School of Engineering (Contributor)
Created2016-05-20
Description

Commercial buildings’ consumption is driven by multiple factors that include occupancy, system and equipment efficiency, thermal heat transfer, equipment plug loads, maintenance and operational procedures, and outdoor and indoor temperatures. A modern building energy system can be viewed as a complex dynamical system that is interconnected and influenced by external and internal factors. Modern large-scale sensors measure physical signals to monitor real-time system behaviors. Such data have the potential to detect anomalies, identify consumption patterns, and analyze peak loads. The paper proposes a novel method to detect hidden anomalies in commercial building energy consumption systems. The framework is based on the Hilbert-Huang transform and instantaneous frequency analysis. The objective is to develop an automated data pre-processing system that can detect anomalies and provide solutions for a real-time consumption database using the Ensemble Empirical Mode Decomposition (EEMD) method. The findings of this paper also include a comparison of Empirical Mode Decomposition and Ensemble Empirical Mode Decomposition for three important types of institutional buildings.
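
As a rough illustration of the instantaneous-frequency idea behind the Hilbert-Huang approach, the sketch below computes instantaneous frequency for a synthetic consumption-like signal with SciPy's Hilbert transform and flags abrupt deviations; the EEMD decomposition step described in the paper is omitted, and all numbers are made up.

```python
# Minimal sketch: instantaneous frequency via the Hilbert transform, used to
# flag abrupt changes in a synthetic consumption-like signal. The EEMD
# decomposition step described in the paper is omitted for brevity.
import numpy as np
from scipy.signal import hilbert

fs = 96                                    # samples per day (15-min readings)
t = np.arange(14 * fs) / fs                # two weeks of synthetic data, in days
signal = np.sin(2 * np.pi * t) + 0.1 * np.random.default_rng(2).normal(size=t.size)
signal[800:820] += np.sin(2 * np.pi * 6 * t[800:820])  # injected high-frequency anomaly

analytic = hilbert(signal)
phase = np.unwrap(np.angle(analytic))
inst_freq = np.diff(phase) * fs / (2 * np.pi)           # cycles per day

threshold = inst_freq.mean() + 3 * inst_freq.std()
anomalies = np.flatnonzero(inst_freq > threshold)
print("flagged samples:", anomalies[:10], "...")
```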

ContributorsNaganathan, Hariharan (Author) / Chong, Oswald (Author) / Huang, Zigang (Author) / Cheng, Ying (Author) / Ira A. Fulton School of Engineering (Contributor)
Created2016-05-20
Description

There is an increasing variety of applications in which peptides are both synthesized and used attached to solid surfaces. This has created a need for high-throughput sequence analysis directly on surfaces. However, common sequencing approaches that can be adapted to surface-bound peptides lack the throughput often needed in library-based applications. Here we describe a simple approach for sequence analysis directly on solid surfaces that is both high speed and high throughput, utilizing equipment available in most protein analysis facilities. In this approach, surface-bound peptides, selectively labeled at their N-termini with a positive charge-bearing group, are subjected to controlled degradation in ammonia gas, resulting in a set of fragments differing by a single amino acid that remain spatially confined on the surface they were bound to. These fragments can then be analyzed by MALDI mass spectrometry, and the peptide sequences read directly from the resulting spectra.
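
Reading a sequence from a ladder of fragments that differ by one residue comes down to converting consecutive mass differences back into amino acids. A simplified sketch of that mass-difference step follows; only a handful of monoisotopic residue masses are listed, and the ladder values are hypothetical.

```python
# Simplified sketch of reading a peptide sequence from a MALDI mass ladder in
# which consecutive fragments differ by one residue. Only a handful of
# monoisotopic residue masses are listed; the ladder values are hypothetical.
RESIDUE_MASS = {           # monoisotopic residue masses (Da)
    "G": 57.02146, "A": 71.03711, "S": 87.03203, "V": 99.06841,
    "L": 113.08406, "F": 147.06841, "K": 128.09496,
}

def read_ladder(masses, tol=0.02):
    """Match each consecutive mass difference to the closest residue mass."""
    sequence = []
    for lighter, heavier in zip(masses, masses[1:]):
        delta = heavier - lighter
        residue = min(RESIDUE_MASS, key=lambda r: abs(RESIDUE_MASS[r] - delta))
        if abs(RESIDUE_MASS[residue] - delta) > tol:
            residue = "?"  # no listed residue explains this gap within tolerance
        sequence.append(residue)
    return "".join(sequence)

# Hypothetical ladder (sorted, smallest fragment first):
ladder = [400.20, 471.24, 584.32, 731.39]
print(read_ladder(ladder))  # -> "ALF" for these made-up masses
```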

ContributorsZhao, Zhan-Gong (Author) / Cordovez, Lalaine Anne (Author) / Johnston, Stephen (Author) / Woodbury, Neal (Author) / Biodesign Institute (Contributor)
Created2017-12-19
Description

Previous proof-of-concept measurements on single-layer two-dimensional membrane-protein crystals performed at X-ray free-electron lasers (FELs) have demonstrated that the collection of meaningful diffraction patterns, which is not possible at synchrotrons because of radiation-damage issues, is feasible. Here, the results obtained from the analysis of a thousand single-shot, room-temperature X-ray FEL diffraction images from two-dimensional crystals of a bacteriorhodopsin mutant are reported in detail. The high redundancy in the measurements boosts the intensity signal-to-noise ratio, so that the values of the diffracted intensities can be reliably determined down to the detector-edge resolution of 4 Å. The results show that two-dimensional serial crystallography at X-ray FELs is a suitable method to study membrane proteins to near-atomic length scales at ambient temperature. The method presented here can be extended to pump–probe studies of optically triggered structural changes on submillisecond timescales in two-dimensional crystals, which allow functionally relevant large-scale motions that may be quenched in three-dimensional crystals.
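
The statement that high redundancy boosts the intensity signal-to-noise ratio reflects Monte-Carlo-style merging: averaging N noisy single-shot observations of an equivalent reflection reduces the error roughly as 1/sqrt(N). A tiny numerical illustration with synthetic numbers (not values from the experiment):

```python
# Tiny illustration of how redundancy improves merged intensity SNR: averaging
# N noisy single-shot observations of one reflection reduces the error roughly
# as 1/sqrt(N). Numbers are synthetic, not from the experiment.
import numpy as np

rng = np.random.default_rng(3)
true_intensity, shot_noise = 100.0, 40.0

for n_obs in (1, 10, 100, 1000):
    merged = np.array([
        rng.normal(true_intensity, shot_noise, n_obs).mean() for _ in range(2000)
    ])
    print(f"N={n_obs:5d}  merged-intensity std ~ {merged.std():6.2f}"
          f"  (expected ~ {shot_noise / np.sqrt(n_obs):6.2f})")
```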

ContributorsCasadei, Cecilia M. (Author) / Tsai, Ching-Ju (Author) / Barty, Anton (Author) / Hunter, Mark S. (Author) / Zatsepin, Nadia (Author) / Padeste, Celestino (Author) / Capitani, Guido (Author) / Benner, W. Henry (Author) / Boutet, Sebastien (Author) / Hau-Riege, Stefan P. (Author) / Kupitz, Christopher (Author) / Messerschmidt, Marc (Author) / Ogren, John I. (Author) / Pardini, Tom (Author) / Rothschild, Kenneth J. (Author) / Sala, Leonardo (Author) / Segelke, Brent (Author) / Williams, Garth J. (Author) / Evans, James E. (Author) / Li, Xiao-Dan (Author) / Coleman, Matthew (Author) / Pedrini, Bill (Author) / Frank, Matthias (Author) / College of Liberal Arts and Sciences (Contributor)
Created2018-01
Description

There are many data mining and machine learning techniques to manage large sets of complex energy supply and demand data for buildings, organizations and cities. As the amount of data continues to grow, new data analysis methods are needed to address the increasing complexity. Using data on the energy losses between supply (energy production sources) and demand (building and city consumption), this paper proposes a Semi-Supervised Energy Model (SSEM) to analyse different loss factors for a building cluster. This is done through deep machine learning, by training machines to semi-supervise the learning, understanding and management of the energy loss process. The Semi-Supervised Energy Model (SSEM) aims at understanding the demand-supply characteristics of a building cluster and utilizes confident unlabelled data (loss factors) through deep machine learning techniques. The research findings involve sample data from one of the university campuses and present an output that provides an estimate of the losses that can be reduced. The paper also provides a list of the loss factors that contribute to the total losses and suggests a threshold value for each loss factor, determined through real-time experiments. The paper concludes with a proposed energy model that can provide accurate numbers on energy demand, which in turn helps suppliers adopt such a model to optimize their supply strategies.
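
The use of "confident unlabelled data" is a self-training idea: a model trained on the few labelled periods pseudo-labels the unlabelled ones it is confident about and retrains. The sketch below shows that generic mechanism with scikit-learn's SelfTrainingClassifier on synthetic data; it is not the authors' SSEM, and the two classes merely stand in for, e.g., normal versus high-loss operating periods.

```python
# Generic self-training sketch illustrating the "confident unlabelled data"
# idea behind SSEM; this is not the authors' model. Data are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.semi_supervised import SelfTrainingClassifier

rng = np.random.default_rng(4)
X = np.vstack([rng.normal(0, 1, (200, 5)), rng.normal(2, 1, (200, 5))])
y_true = np.repeat([0, 1], 200)

# Pretend only 10% of the periods are labelled; the rest are marked -1.
y = np.full_like(y_true, -1)
labelled = rng.choice(len(y), size=40, replace=False)
y[labelled] = y_true[labelled]

model = SelfTrainingClassifier(LogisticRegression(), threshold=0.9)
model.fit(X, y)
print("accuracy on all periods:", (model.predict(X) == y_true).mean())
```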

ContributorsNaganathan, Hariharan (Author) / Chong, Oswald (Author) / Chen, Xue-wen (Author) / Ira A. Fulton Schools of Engineering (Contributor)
Created2015-09-14
Description

Recent infectious outbreaks highlight the need for platform technologies that can be quickly deployed to develop therapeutics needed to contain the outbreak. We present a simple concept for rapid development of new antimicrobials. The goal was to produce in as little as one week thousands of doses of an intervention for a new pathogen. We tested the feasibility of a system based on antimicrobial synbodies. The system involves creating an array of 100 peptides that have been selected for broad capability to bind and/or kill viruses and bacteria. The peptides are pre-screened for low cell toxicity prior to large scale synthesis. Any pathogen is then assayed on the chip to find peptides that bind or kill it. Peptides are combined in pairs as synbodies and further screened for activity and toxicity. The lead synbody can be quickly produced in large scale, with completion of the entire process in one week.
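
The screening step described above is combinatorial: score the 100 array peptides against the pathogen, keep the strongest binders, and enumerate candidate pairs as synbodies. A small sketch of that ranking-and-pairing logic, with random placeholder scores standing in for measured binding or killing activity:

```python
# Small sketch of the ranking-and-pairing step: score 100 peptides against a
# pathogen, keep the best binders, and enumerate candidate pairs (synbodies).
# Scores are random placeholders for measured binding/killing activity.
from itertools import combinations
import numpy as np

rng = np.random.default_rng(5)
peptide_ids = [f"pep{i:03d}" for i in range(100)]
activity = dict(zip(peptide_ids, rng.random(100)))    # placeholder array readout

top = sorted(peptide_ids, key=activity.get, reverse=True)[:10]
candidate_synbodies = list(combinations(top, 2))      # 10 choose 2 = 45 pairs
print(len(candidate_synbodies), "pairs to test, e.g.", candidate_synbodies[0])
```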

ContributorsJohnston, Stephen (Author) / Domenyuk, Valeriy (Author) / Gupta, Nidhi (Author) / Tavares Batista, Milene (Author) / Lainson, John (Author) / Zhao, Zhan-Gong (Author) / Lusk, Joel (Author) / Loskutov, Andrey (Author) / Cichacz, Zbigniew (Author) / Stafford, Phillip (Author) / Legutki, Joseph Barten (Author) / Diehnelt, Chris (Author) / Biodesign Institute (Contributor)
Created2017-12-14