This growing collection consists of scholarly works authored by ASU-affiliated faculty, staff, and community members, and it contains many open access articles. ASU-affiliated authors are encouraged to Share Your Work in KEEP.

Description

In the digital humanities, there is a constant need to turn images and PDF files into plain text to apply analyses such as topic modelling, named entity recognition, and other techniques. However, although there exist different solutions to extract text embedded in PDF files or run OCR on images, they typically require additional training (for example, scholars have to learn how to use the command line) or are difficult to automate without programming skills. The Giles Ecosystem is a distributed system based on Apache Kafka that allows users to upload documents for text and image extraction. The system components are implemented using Java and the Spring Framework and are available under an Open Source license on GitHub (https://github.com/diging/).
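
Giles itself is implemented in Java with the Spring Framework; as a language-neutral illustration of the Kafka-based workflow the abstract describes, here is a minimal Python sketch using the kafka-python client. The topic names, message schema, and extract_text helper are hypothetical, not part of the actual Giles codebase.

```python
# Hypothetical sketch of a Kafka-based extraction pipeline in the style of
# the Giles Ecosystem; topic names and message fields are assumptions,
# not the actual Giles implementation.
import json
from kafka import KafkaConsumer, KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda m: json.dumps(m).encode("utf-8"),
)

def extract_text(path: str) -> str:
    # Placeholder: in practice, delegate to an OCR engine (e.g., Tesseract)
    # or a PDF text extractor.
    ...

def submit_upload(document_id: str, path: str) -> None:
    """Announce an uploaded PDF/image so extraction workers can pick it up."""
    producer.send("uploads.requests", {"id": document_id, "path": path})
    producer.flush()

def run_extraction_worker() -> None:
    """Consume upload events, run extraction, publish plain text."""
    consumer = KafkaConsumer(
        "uploads.requests",
        bootstrap_servers="localhost:9092",
        value_deserializer=lambda m: json.loads(m.decode("utf-8")),
        group_id="extraction-workers",
    )
    for message in consumer:
        doc = message.value
        text = extract_text(doc["path"])  # OCR or embedded-text extraction
        producer.send("extraction.results", {"id": doc["id"], "text": text})
```

Decoupling upload from extraction through topics is what lets such a system scale workers independently and hide the command line from end users.
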
Contributors: Lessios-Damerow, Julia (Contributor) / Peirson, Erick (Contributor) / Laubichler, Manfred (Contributor) / ASU-SFI Center for Biosocial Complex Systems (Contributor)
Created: 2017-09-28
Description

Five immunocompetent C57BL/6-cBrd/cBrd/Cr (albino C57BL/6) mice were injected with GL261-luc2 cells, a cell line sharing characteristics of human glioblastoma multiforme (GBM). The mice were imaged using magnetic resonance (MR) at five separate time points to characterize growth and development of the tumor. After 25 days, the final tumor volumes of the mice varied from 12 mm³ to 62 mm³, even though the mice were inoculated from the same tumor cell line under carefully controlled conditions. We generated hypotheses to explore the large variance in final tumor size and tested them with our simple reaction-diffusion model using both a 3-dimensional (3D) finite difference method and a 2-dimensional (2D) level set method. The parameters obtained from a best-fit procedure, designed to yield simulated tumors as close as possible to the observed ones, vary by an order of magnitude among the three mice analyzed in detail. These differences may reflect morphological and biological variability in tumor growth, as well as errors in the mathematical model, perhaps from an oversimplification of the tumor dynamics or nonidentifiability of parameters. Our results yield parameters that match other experimental in vitro and in vivo measurements. Additionally, we calculate wave speed, which matches other rat and human measurements.
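
The abstract does not spell out the reaction-diffusion model here; a standard choice for glioma growth is the Fisher-KPP equation, ∂u/∂t = D∇²u + ρu(1 − u). The sketch below is a minimal 2D explicit finite-difference step under that assumption, with illustrative (not fitted) parameter values.

```python
import numpy as np

# Minimal explicit finite-difference step for a Fisher-KPP reaction-diffusion
# model, du/dt = D * laplacian(u) + rho * u * (1 - u). The equation form and
# parameter values are illustrative assumptions, not the paper's fitted model.
D, rho = 0.05, 0.2     # diffusion (mm^2/day) and proliferation (1/day), assumed
dx, dt = 0.1, 0.01     # grid spacing (mm) and time step (days)

def step(u: np.ndarray) -> np.ndarray:
    # 5-point Laplacian with periodic boundaries (np.roll) for simplicity.
    lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
           np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4 * u) / dx**2
    return u + dt * (D * lap + rho * u * (1 - u))

# Seed a small tumor and evolve for 25 simulated days.
u = np.zeros((200, 200))
u[95:105, 95:105] = 0.5
for _ in range(int(25 / dt)):
    u = step(u)
print("simulated tumor area (mm^2):", (u > 0.1).sum() * dx**2)
```

For Fisher-KPP dynamics the traveling-wave speed is 2√(Dρ), which is the quantity the authors compare against rat and human measurements.
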

Contributors: Rutter, Erica (Author) / Stepien, Tracy (Author) / Anderies, Barrett (Author) / Plasencia, Jonathan (Author) / Woolf, Eric C. (Author) / Scheck, Adrienne C. (Author) / Turner, Gregory H. (Author) / Liu, Qingwei (Author) / Frakes, David (Author) / Kodibagkar, Vikram (Author) / Kuang, Yang (Author) / Preul, Mark C. (Author) / Kostelich, Eric (Author) / College of Liberal Arts and Sciences (Contributor)
Created: 2017-05-31
Description

Background:
Data assimilation refers to methods for updating the state vector (initial condition) of a complex spatiotemporal model (such as a numerical weather model) by combining new observations with one or more prior forecasts. We consider the potential feasibility of this approach for making short-term (60-day) forecasts of the growth and spread of a malignant brain cancer (glioblastoma multiforme) in individual patient cases, where the observations are synthetic magnetic resonance images of a hypothetical tumor.

Results:
We apply a modern state estimation algorithm (the Local Ensemble Transform Kalman Filter), previously developed for numerical weather prediction, to two different mathematical models of glioblastoma, taking into account likely errors in model parameters and measurement uncertainties in magnetic resonance imaging. The filter can accurately shadow the growth of a representative synthetic tumor for 360 days (six 60-day forecast/update cycles) in the presence of a moderate degree of systematic model error and measurement noise.

Conclusions:
The mathematical methodology described here may prove useful for other modeling efforts in biology and oncology. An accurate forecast system for glioblastoma may prove useful in clinical settings for treatment planning and patient counseling.
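
The LETKF adds localization and an ensemble-space transform; as a simpler illustration of the underlying ensemble update, here is a basic stochastic ensemble Kalman filter analysis step (a simplified relative of the LETKF, not the paper's algorithm).

```python
import numpy as np

# Simplified stochastic EnKF analysis step (not the LETKF of the paper):
# update an ensemble of state vectors with perturbed observations.
rng = np.random.default_rng(0)

def enkf_update(X: np.ndarray, y: np.ndarray, H: np.ndarray, R: np.ndarray):
    """X: (n_state, n_ens) ensemble; y: (n_obs,) observation;
    H: (n_obs, n_state) observation operator; R: (n_obs, n_obs) obs covariance."""
    n_ens = X.shape[1]
    A = X - X.mean(axis=1, keepdims=True)           # ensemble anomalies
    P = A @ A.T / (n_ens - 1)                       # sample covariance
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)    # Kalman gain
    # Perturb the observation for each member (stochastic EnKF).
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, n_ens).T
    return X + K @ (Y - H @ X)

# Toy usage: 3-variable state, one observed component, 20 ensemble members.
X = rng.normal(size=(3, 20))
H = np.array([[1.0, 0.0, 0.0]])
R = np.array([[0.1]])
X = enkf_update(X, np.array([0.5]), H, R)
```

In the paper's setting, the state vector holds tumor cell densities on a grid and the observations come from (synthetic) MR images; each 60-day forecast/update cycle repeats an analysis step of this general kind.
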

Contributors: Kostelich, Eric (Author) / Kuang, Yang (Author) / McDaniel, Joshua (Author) / Moore, Nina Z. (Author) / Martirosyan, Nikolay L. (Author) / Preul, Mark C. (Author) / College of Liberal Arts and Sciences (Contributor)
Created: 2011-12-21
Description

At the end of the Dark Ages, anatomy was taught as though everything that could be known was known. Scholars learned about what had been discovered rather than how to make discoveries. This was true even though the body (and the rest of biology) was very poorly understood. The Renaissance eventually brought a revolution in how scholars (and graduate students) were trained and worked. This revolution never occurred in K-12 or university education, so we now teach young students much as scholars were taught in the Dark Ages: we teach them what is already known rather than the process of knowing. Citizen science offers a way to change K-12 and university education and, in doing so, complete the Renaissance. Here we offer an example of such an approach and call for change in the way students are taught science, change that is more possible than it has ever been and is, nonetheless, five hundred years delayed.

Created: 2016-03-01
Description

Introduction: Fluorescence-guided surgery is one of the rapidly emerging methods of surgical “theranostics.” In this review, we summarize current fluorescence techniques used in neurosurgical practice for brain tumor patients as well as future applications of recent laboratory and translational studies.

Methods: Review of the literature.

Results: A wide spectrum of fluorophores that have been tested for brain surgery is reviewed. Beginning with Moore's application of fluorescein sodium in 1948, fluorescence-guided brain tumor surgery is either routinely applied in some centers or under active study in clinical trials. Besides the trinity of commonly used drugs (fluorescein sodium, 5-aminolevulinic acid, and indocyanine green), less studied fluorescent stains, such as tetracyclines, cancer-selective alkylphosphocholine analogs, cresyl violet, acridine orange, and acriflavine, can be used for rapid tumor detection and pathological tissue examination. Other emerging agents, such as activity-based probes and targeted molecular probes that can provide biomolecular specificity for surgical visualization and treatment, are reviewed. Furthermore, we review available engineering and optical solutions for fluorescent surgical visualization. Instruments for fluorescence-guided surgery are divided into wide-field imaging systems and hand-held probes. Recent advancements in quantitative fluorescence-guided surgery are discussed.

Conclusion: We are standing on the threshold of the era of marker-assisted tumor management. Innovations in the fields of surgical optics, computer image analysis, and molecular bioengineering are advancing fluorescence-guided tumor resection paradigms, leading to cell-level approaches to visualization and resection of brain tumors.

Created: 2016-10-17
Description

The estimation of energy demand (by power plants) has traditionally relied on historical energy use data for the region(s) that a plant serves. Regression analysis, artificial neural networks, and Bayesian methods are the most common approaches for analysing these data. Such data and techniques do not generate reliable results. Consequently, excess energy has to be generated to prevent blackouts; the causes of energy surges are not easily determined; and the potential energy use reduction from energy efficiency solutions is usually not translated into actual energy use reduction. The paper highlights the weaknesses of traditional techniques and lays out a framework to improve the prediction of energy demand by combining energy use models of equipment, physical systems, and buildings with the proposed data mining algorithms for reverse engineering. The research team first analyses data samples from large, complex energy datasets and then presents a set of computationally efficient data mining algorithms for reverse engineering. To develop a structural system model for reverse engineering, two focus groups are formed that have a direct relation to cause-and-effect variables. The research findings of this paper include tests of different sets of reverse engineering algorithms, analyses of their output patterns, and modifications to the algorithms to improve the accuracy of their outputs.
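
For context, a minimal version of the traditional regression baseline the paper critiques might look like the following; the features and data are entirely synthetic.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical illustration of the traditional regression baseline the paper
# critiques: predict demand from historical usage and outdoor temperature.
rng = rng = np.random.default_rng(1)
temp = rng.uniform(10, 40, 365)                  # daily outdoor temp (C)
prev = rng.uniform(800, 1200, 365)               # previous day's demand (kWh)
# Synthetic "true" demand includes a nonlinear cooling-load term that a
# plain linear fit cannot capture, which is part of the paper's point.
demand = 0.6 * prev + 15 * np.maximum(temp - 22, 0) + rng.normal(0, 30, 365)

X = np.column_stack([prev, temp])
model = LinearRegression().fit(X, demand)
print("next-day forecast (kWh):", model.predict(np.array([[1000.0, 35.0]]))[0])
```
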

Contributors: Naganathan, Hariharan (Author) / Chong, Oswald (Author) / Ye, Long (Author) / Ira A. Fulton School of Engineering (Contributor)
Created: 2015-12-09
Description

Small and medium office buildings account for a significant share of the U.S. building stock's energy consumption, yet their owners often lack the resources and experience to conduct detailed energy audits and retrofit analyses. We present an eight-step framework for energy retrofit assessment in small and medium office buildings. Through a bottom-up approach and a web-based retrofit toolkit tested on a case study in Arizona, this methodology was able to save about 50% of the total energy consumed by the case study building, depending on the adopted measures and invested capital. While the case study presented is a deep energy retrofit, the proposed framework is effective in guiding the decision-making process that precedes any energy retrofit, deep or light.
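
The decision-making step ultimately weighs upfront cost against expected savings; a minimal sketch of that trade-off, with hypothetical measures and numbers rather than the Arizona case-study data:

```python
# Hypothetical retrofit measures: (name, upfront cost in $, annual kWh saved).
# Numbers are illustrative only, not from the case study.
measures = [
    ("LED lighting retrofit", 12_000, 30_000),
    ("HVAC controls upgrade", 25_000, 45_000),
    ("Window film",            8_000, 10_000),
]
price_per_kwh = 0.12  # assumed utility rate ($/kWh)

for name, cost, kwh_saved in measures:
    annual_savings = kwh_saved * price_per_kwh
    payback_years = cost / annual_savings
    print(f"{name}: ${annual_savings:,.0f}/yr, payback {payback_years:.1f} yr")
```
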

Contributors: Rios, Fernanda (Author) / Parrish, Kristen (Author) / Chong, Oswald (Author) / Ira A. Fulton School of Engineering (Contributor)
Created: 2016-05-20
Description

Background: Modern advances in sequencing technology have enabled the census of microbial members of many natural ecosystems. Recently, attention has increasingly turned to the microbial residents of human-made, built ecosystems, both private (homes) and public (subways, office buildings, and hospitals). Here, we report results of the characterization of the microbial ecology of a singular built environment, the International Space Station (ISS). This sampling involved the collection and microbial analysis (via 16S rRNA gene PCR) of swabs from 15 surfaces onboard the ISS. The sampling was a component of Project MERCCURI (Microbial Ecology Research Combining Citizen and University Researchers on ISS). Learning more about the microbial inhabitants of the “buildings” in which we travel through space will take on increasing importance as plans for human exploration continue, with the possibility of colonization of other planets and moons.

Results: Sterile swabs were used to sample 15 surfaces onboard the ISS. The sites sampled were designed to be analogous to samples collected for (1) the Wildlife of Our Homes project and (2) a study of cell phones and shoes that were concurrently being collected for another component of Project MERCCURI. Sequencing of the 16S rRNA genes amplified from DNA extracted from each swab was used to produce a census of the microbes present on each surface sampled. We compared the microbes found on the ISS swabs to those from both homes on Earth and data from the Human Microbiome Project.

Conclusions: While significantly different from both homes on Earth and the Human Microbiome Project samples analyzed here, the microbial community composition on the ISS was more similar to home surfaces than to the human microbiome samples. The ISS surfaces are OTU-rich, with 1,036–4,294 operational taxonomic units (OTUs) per sample. There was no discernible biogeography of microbes on the 15 ISS surfaces, although this may be a reflection of the small sample size we were able to obtain.
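
Given an OTU table (samples × OTUs), per-sample richness and between-sample comparisons reduce to simple operations. The sketch below computes OTU richness and pairwise Bray-Curtis dissimilarity on a made-up table; the study's own comparison method and data may differ.

```python
import numpy as np

# Toy OTU table (rows = surface samples, columns = OTU read counts);
# values are made up, not Project MERCCURI data.
otu_table = np.array([
    [120,  0, 45,  3, 60],
    [ 80, 10,  0, 25, 55],
    [  5, 90, 30,  0, 40],
])

# Per-sample OTU richness: number of OTUs with nonzero counts.
richness = (otu_table > 0).sum(axis=1)
print("OTU richness per sample:", richness)

# Pairwise Bray-Curtis dissimilarity between two count vectors.
def bray_curtis(a: np.ndarray, b: np.ndarray) -> float:
    return np.abs(a - b).sum() / (a + b).sum()

n = len(otu_table)
for i in range(n):
    for j in range(i + 1, n):
        print(f"BC({i},{j}) = {bray_curtis(otu_table[i], otu_table[j]):.3f}")
```
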

Contributors: Lang, Jenna M. (Author) / Coil, David A. (Author) / Neches, Russell Y. (Author) / Brown, Wendy E. (Author) / Cavalier, Darlene (Author) / Severance, Mark (Author) / Hampton-Marcell, Jarrad T. (Author) / Gilbert, Jack A. (Author) / Eisen, Jonathan A. (Author) / ASU-SFI Center for Biosocial Complex Systems (Contributor)
Created: 2017-12-05
Description

Commercial buildings’ energy consumption is driven by multiple factors that include occupancy, system and equipment efficiency, thermal heat transfer, equipment plug loads, maintenance and operational procedures, and outdoor and indoor temperatures. A modern building energy system can be viewed as a complex dynamical system that is interconnected and influenced by external and internal factors. Modern large-scale sensor networks measure physical signals to monitor real-time system behavior, and such data have the potential to reveal anomalies, identify consumption patterns, and analyze peak loads. The paper proposes a novel method to detect hidden anomalies in commercial building energy consumption systems. The framework is based on the Hilbert-Huang transform and instantaneous frequency analysis. The objective is to develop an automated data pre-processing system that can detect anomalies and provide solutions from a real-time consumption database using the Ensemble Empirical Mode Decomposition (EEMD) method. The findings of this paper also include comparisons of Empirical Mode Decomposition and Ensemble Empirical Mode Decomposition across three important types of institutional buildings.
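
As a sketch of the EEMD and instantaneous-frequency pipeline described above, using the PyEMD package on a synthetic signal (the data, units, and anomaly threshold are illustrative, not the buildings' measurements):

```python
import numpy as np
from PyEMD import EEMD                 # pip install EMD-signal
from scipy.signal import hilbert

# Synthetic hourly consumption signal with an injected anomaly; the data and
# threshold are illustrative, not the institutional buildings' measurements.
t = np.linspace(0, 14, 14 * 24)                        # two weeks, hourly
load = 100 + 20 * np.sin(2 * np.pi * t)                # daily cycle
load[150:160] += 60                                    # hidden anomaly

# Decompose into intrinsic mode functions with EEMD.
imfs = EEMD().eemd(load)

# Instantaneous frequency of each IMF via the Hilbert transform.
for k, imf in enumerate(imfs):
    phase = np.unwrap(np.angle(hilbert(imf)))
    inst_freq = np.diff(phase) / (2 * np.pi) * 24      # cycles per day
    # Flag hours where a mode's frequency jumps far from its median.
    jumps = np.abs(inst_freq - np.median(inst_freq)) > 5 * inst_freq.std()
    if jumps.any():
        print(f"IMF {k}: possible anomalies near hours {np.where(jumps)[0]}")
```
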

Contributors: Naganathan, Hariharan (Author) / Chong, Oswald (Author) / Huang, Zigang (Author) / Cheng, Ying (Author) / Ira A. Fulton School of Engineering (Contributor)
Created: 2016-05-20
Description

There are many data mining and machine learning techniques for managing large sets of complex energy supply and demand data at the building, organization, and city levels. As the amount of data continues to grow, new data analysis methods are needed to address the increasing complexity. Using data on the energy losses between supply (energy production sources) and demand (building and city consumption), this paper proposes a Semi-Supervised Energy Model (SSEM) to analyse the different loss factors for a building cluster. This is done by training machines, through deep machine learning, to learn, understand, and manage the process of energy losses in a semi-supervised fashion. The SSEM aims at understanding the demand-supply characteristics of a building cluster and utilizes the unlabelled data (loss factors) in which the model is confident, using deep machine learning techniques. The research findings involve sample data from one university campus and present output that estimates the losses that can be reduced. The paper also provides a list of loss factors that contribute to the total losses and suggests a threshold value for each loss factor, determined through real-time experiments. The paper concludes with a proposed energy model that can provide accurate numbers on energy demand, which in turn helps suppliers adopt such a model to optimize their supply strategies.
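
SSEM is the paper's own model; as a generic illustration of the semi-supervised idea (pseudo-labeling the unlabelled points the model is confident about), here is a sketch using scikit-learn's SelfTrainingClassifier on synthetic loss-factor data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.semi_supervised import SelfTrainingClassifier

# Generic semi-supervised illustration (not the paper's SSEM): classify
# loss events from two hypothetical features, with most labels missing.
rng = np.random.default_rng(2)
X = rng.normal(size=(500, 2))          # e.g., transmission load, temperature delta
y_true = (X[:, 0] + X[:, 1] > 0).astype(int)

# Mark 90% of the data unlabelled (-1 is scikit-learn's "no label" marker).
y = y_true.copy()
y[rng.random(500) < 0.9] = -1

# Self-training: iteratively pseudo-label the unlabelled points the base
# classifier is most confident about (probability above threshold), refit.
model = SelfTrainingClassifier(RandomForestClassifier(), threshold=0.9)
model.fit(X, y)
print("accuracy on all points:", (model.predict(X) == y_true).mean())
```
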

Contributors: Naganathan, Hariharan (Author) / Chong, Oswald (Author) / Chen, Xue-wen (Author) / Ira A. Fulton Schools of Engineering (Contributor)
Created: 2015-09-14