Matching Items (414)
Description
Methanogens are methane-producing archaea that play a major role in the global carbon cycle. However, despite their importance, the community dynamics of these organisms have not been thoroughly characterized or modeled. In the majority of methanogenesis models, the communities are approximated as a chemical reaction or divided into two populations based on the most common methanogenic pathways. These models provide reasonable estimates of methanogenesis rates but cannot predict community structure. In this work, a trait-based model for methanogenic communities in peatlands is developed. The model divides methanogens commonly found in wetlands into ten guilds, with divisions based on factors such as substrate affinity, pH tolerance, and phylogeny. The model uses steady-state, mixotrophic Monod kinetics to model growth and assumes peatlands operate as a semi-batch system. An extensive literature review was performed to parameterize the model. The acetoclastic module of the model was validated against experimental data. This portion of the model was able to reproduce the major result of an experiment that examined competition between Methanosaeta and Methanosarcina species under irregular feeding conditions. The model was analyzed as a whole using Monte Carlo simulation methods. It was found that equilibrium membership is negatively correlated with a guild's half-substrate constant but independent of the guild's yield. These results match what is seen in simple pairwise competition models. In contrast, both the half-substrate constant and the yield affected a guild's numerical dominance: lower half-substrate constants and higher yields led to a guild accounting for a greater fraction of community biomass. This is not seen in simple pairwise competition models, where only yield affects final biomass.
As a whole, the development of this model framework and the accompanying analyses have laid the groundwork for a new class of more detailed methanogen community models that go beyond the two-compartment acetoclastic-hydrogenotrophic assumption.
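The competition result summarized in this abstract (equilibrium membership set by the half-substrate constant, dominance set by both half-substrate constant and yield) can be illustrated with a minimal two-guild Monod chemostat sketch. This is not the thesis's ten-guild, semi-batch model; the parameter values and the simple chemostat setup are illustrative assumptions only.

```python
# Minimal two-guild Monod competition in a chemostat (illustrative sketch;
# the thesis model uses ten guilds and semi-batch operation).
def simulate(mu_max, Ks, Y, D=0.05, S_in=10.0, dt=0.05, t_end=2000.0):
    """Forward-Euler integration of substrate S and guild biomasses X."""
    n = len(Ks)
    S = S_in
    X = [0.01] * n
    t = 0.0
    while t < t_end:
        mu = [mu_max[i] * S / (Ks[i] + S) for i in range(n)]
        # substrate: dilution inflow/outflow minus consumption by all guilds
        dS = D * (S_in - S) - sum(mu[i] * X[i] / Y[i] for i in range(n))
        for i in range(n):
            X[i] += dt * (mu[i] - D) * X[i]
        S = max(S + dt * dS, 0.0)
        t += dt
    return S, X

# Two guilds with identical max growth rate and yield but different
# half-substrate constants: the guild with the lower Ks excludes the other.
S, (X1, X2) = simulate(mu_max=[0.3, 0.3], Ks=[0.5, 2.0], Y=[0.05, 0.05])
print(f"S = {S:.3f}, X1 = {X1:.3f}, X2 = {X2:.2e}")
```

At steady state the surviving guild holds the substrate at its break-even concentration S* = Ks·D/(μmax − D), so any guild with a higher break-even concentration washes out regardless of its yield, mirroring the pairwise result the abstract describes.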
ContributorsLopez Jr, Jaime Gerardo (Author) / Cadillo-Quiroz, Hinsby (Thesis director) / Marcus, Andrew (Committee member) / Chemical Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created2017-05
Description
While biodiesel production from photosynthesizing algae is a promising form of alternative energy, the process is water and nutrient intensive. I designed a mathematical model for a photobioreactor system that filters the reactor effluent and returns the permeate to the system so that unutilized nutrients are not wasted, addressing these problems. The model tracks the soluble and biomass components that govern the rates of the processes within the photobioreactor (PBR). It considers light attenuation and inhibition, nutrient limitation, preference for ammonia consumption over nitrate, production of soluble microbial products (SMP) and extracellular polymeric substance (EPS), and competition with heterotrophic bacteria that predominantly consume SMP. I model a continuous photobioreactor + microfiltration system under nine unique operating conditions: three dilution rates and three recycling rates. I also evaluate the health of a PBR under different dilution rates for two values of qpred. I evaluate the success of each run by calculating values such as biomass productivity and specific biomass yield. The model shows that for low dilution rates (D < 0.2 d-1) and high recycling rates (>66%), nutrient limitation can lead to a PBR crash. In balancing biomass productivity with water conservation, the most favorable runs were those in which the dilution rate and the recycling rate were highest. In the second part of my thesis, I developed a model that describes the interactions of phototrophs and their predators. This model shows that dilution rates corresponding to realistic PBR operation can wash predators out of the system, but the simulation outputs depend heavily on the accuracy of parameters that are not well defined.
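A back-of-the-envelope steady-state balance illustrates qualitatively why heavy permeate recycling can starve the reactor of nutrients. This single-nutrient Monod sketch with assumed parameter values is far simpler than the thesis's multi-component model: if biomass leaves at dilution rate D while a fraction r of the (biomass-free, nutrient-bearing) permeate is returned, the nutrient balance gives N* = Ks·D/(μmax − D) and X* = Y·(1 − r)·(N_in − N*).

```python
# Steady-state biomass in a continuous PBR with permeate recycling.
# Illustrative single-nutrient Monod sketch with assumed parameters,
# not the thesis's multi-component photobioreactor model.
MU_MAX = 1.6   # 1/d, assumed max specific growth rate
KS     = 0.10  # g/m^3, assumed half-saturation constant
Y      = 0.50  # g biomass per g nutrient, assumed yield
N_IN   = 5.0   # g/m^3, nutrient in fresh feed

def steady_state_biomass(D, r):
    """Biomass X* when a fraction r of nutrient-bearing permeate is recycled.

    Nutrient balance at steady state: D*(1-r)*(N_IN - N*) = mu(N*)*X/Y,
    with mu(N*) = D, so N* = KS*D/(MU_MAX - D) and X* = Y*(1-r)*(N_IN - N*).
    """
    if D >= MU_MAX:
        return 0.0  # washout: growth cannot keep up with dilution
    n_star = KS * D / (MU_MAX - D)
    return max(Y * (1 - r) * (N_IN - n_star), 0.0)

for r in (0.0, 0.33, 0.66, 0.90):
    print(f"recycle {r:.0%}: X* = {steady_state_biomass(0.2, r):.2f} g/m^3")
```

In this simplified balance, raising the recycle fraction shrinks the fresh-nutrient supply term (1 − r) and hence the sustainable biomass, consistent with the nutrient-limitation crashes the abstract reports at high recycling rates.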
ContributorsWik, Benjamin Philip (Author) / Marcus, Andrew (Thesis director) / Rittmann, Bruce (Committee member) / School of Sustainability (Contributor) / Chemical Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created2018-05
Description
Computer simulations are gaining recognition as educational tools, but in general there is still a line dividing a simulation from a game. Yet as many recent and successful video games heavily involve simulations (SimCity comes to mind), there is not only the growing question of whether games can be used for educational purposes, but also of how a game might qualify as educational. Endemic: The Agent is a project that tries to bridge the gap between educational simulations and educational games. This paper outlines the creation of the project and the characteristics that make it an educational tool, a simulation, and a game.
ContributorsFish, Derek Austin (Author) / Karr, Timothy (Thesis director) / Marcus, Andrew (Committee member) / Jones, Donald (Committee member) / Barrett, The Honors College (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Department of Physics (Contributor)
Created2013-05
Description

The human gut microbiome is a complex community of microorganisms. These microbes play an important role in host health by contributing essential compounds and acting as a barrier against pathogens. However, these communities and their associated functions can be impacted by factors like disease and diet. In particular, microbial fermentation of dietary components like polysaccharides, proteins, and fats that reach the gut is being examined to better understand how these biopolymers are utilized and affect community structure. Thus, evaluating the accuracy of methods used to quantify specific macromolecules is crucial to gaining a precise understanding of how gut microbes hydrolyze those substrates. This study presents findings on the accuracy of the Megazyme RS kit (Rapid) modified for high performance liquid chromatography (HPLC) readings and of the DC Protein Assay when performed on samples from complex gut media with potato starch treatments and bovine serum albumin (BSA) treatments. Overall, our data indicate that the Megazyme RS kit needs further modification to detect the expected starch content with the HPLC and that the DC Protein Assay is not suitable for specific protein analysis.

ContributorsKlein, Rachel Marie (Author) / Krajmalnik-Brown, Rosa (Thesis director) / Marcus, Andrew (Committee member) / School of Life Sciences (Contributor) / School of International Letters and Cultures (Contributor) / Barrett, The Honors College (Contributor)
Created2021-05
Description

Pay-for-performance (PFP) is a relatively new approach to agricultural conservation that attaches an incentive payment to quantified reductions in nutrient runoff from a participating farm. Similar to a payment for ecosystem services approach, PFP lends itself to providing incentives for the most beneficial practices at the field level. To date, PFP conservation in the U.S. has only been applied in small pilot programs. Because monitoring conservation performance for each field enrolled in a program would be cost-prohibitive, field-level modeling can provide cost-effective estimates of anticipated improvements in nutrient runoff. We developed a PFP system that uses a unique application of one of the leading agricultural models, the USDA’s Soil and Water Assessment Tool, to evaluate the nutrient load reductions of potential farm practice changes based on field-level agronomic and management data. The initial phase of the project focused on simulating individual fields in the River Raisin watershed in southeastern Michigan. Here we present development of the modeling approach and results from the pilot year, 2015-2016. These results stress that (1) there is variability in practice effectiveness both within and between farms, and thus there is not one “best practice” for all farms, (2) conservation decisions are made most effectively at the scale of the farm field rather than the sub-watershed or watershed level, and (3) detailed, field-level management information is needed to accurately model and manage on-farm nutrient loadings.

Supplemental information mentioned in the article is attached as a separate document.

ContributorsMuenich, Rebecca (Author) / Kalcic, M. M. (Author) / Winsten, J. (Author) / Fisher, K. (Author) / Day, M. (Author) / O'Neil, G. (Author) / Wang, Y.-C. (Author) / Scavia, D. (Author) / Ira A. Fulton Schools of Engineering (Contributor)
Created2017
Description

The emerging field of neuroprosthetics is focused on the development of new therapeutic interventions that will be able to restore some lost neural function by selective electrical stimulation or by harnessing activity recorded from populations of neurons. As more and more patients benefit from these approaches, the interest in neural interfaces has grown significantly and a new generation of penetrating microelectrode arrays are providing unprecedented access to the neurons of the central nervous system (CNS). These microelectrodes have active tip dimensions that are similar in size to neurons and because they penetrate the nervous system, they provide selective access to these cells (within a few microns). However, the very long-term viability of chronically implanted microelectrodes and the capability of recording the same spiking activity over long time periods still remain to be established and confirmed in human studies. Here we review the main responses to acute implantation of microelectrode arrays, and emphasize that it will become essential to control the neural tissue damage induced by these intracortical microelectrodes in order to achieve the high clinical potentials accompanying this technology.

ContributorsFernandez, Eduardo (Author) / Greger, Bradley (Author) / House, Paul A. (Author) / Aranda, Ignacio (Author) / Botella, Carlos (Author) / Albisua, Julio (Author) / Soto-Sanchez, Cristina (Author) / Alfaro, Arantxa (Author) / Normann, Richard A. (Author) / Ira A. Fulton Schools of Engineering (Contributor)
Created2014-07-21
Description

In this study, a low-cycle fatigue experiment was conducted on printed wiring boards (PWB). The Weibull regression model and a computational Bayesian analysis method were applied to analyze failure time data and to identify important factors that influence PWB lifetime. The analysis shows that both the shape parameter and the scale parameter of the Weibull distribution are affected by the supplier factor and the preconditioning method. Based on the energy equivalence approach, a 6-cycle reflow precondition can be replaced by a 5-cycle IST precondition, so the total testing time can be greatly reduced. This conclusion was validated by a likelihood ratio test of two datasets collected under the two different preconditioning methods. Therefore, the Weibull regression modeling approach is an effective way to account for the variation of experimental settings in PWB lifetime prediction.
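As an illustration of the Weibull analysis underlying this kind of study, the shape and scale parameters can be estimated by maximum likelihood: the profile equation for the shape k is monotone, so bisection suffices, after which the scale follows in closed form. The sketch below uses synthetic data, not the PWB failure times, and implements only the plain (no-covariate) fit, not the full Weibull regression or Bayesian analysis.

```python
import math
import random

def fit_weibull(x):
    """MLE of Weibull shape k and scale lam for positive data x.

    The profile equation for k,
        sum(x^k * ln x) / sum(x^k) - 1/k - mean(ln x) = 0,
    is increasing in k, so we solve it by bisection, then recover
    lam = (sum(x^k) / n) ** (1/k).
    """
    logs = [math.log(v) for v in x]
    mean_log = sum(logs) / len(x)

    def g(k):
        xk = [v ** k for v in x]
        return sum(a * b for a, b in zip(xk, logs)) / sum(xk) - 1.0 / k - mean_log

    lo, hi = 0.05, 50.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if g(mid) < 0:
            lo = mid
        else:
            hi = mid
    k = 0.5 * (lo + hi)
    lam = (sum(v ** k for v in x) / len(x)) ** (1.0 / k)
    return k, lam

# Synthetic "failure cycles" drawn from Weibull(shape=2, scale=100)
# via inverse-CDF sampling: x = scale * (-ln(1-u))**(1/shape).
rng = random.Random(0)
data = [100.0 * (-math.log(1.0 - rng.random())) ** 0.5 for _ in range(200)]
k_hat, lam_hat = fit_weibull(data)
print(f"shape ~ {k_hat:.2f}, scale ~ {lam_hat:.1f}")
```

A likelihood ratio test like the one the abstract mentions would fit each preconditioning group separately, fit the pooled data, and compare twice the log-likelihood gain to a chi-squared critical value.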

ContributorsPan, Rong (Author) / Xu, Xinyue (Author) / Juarez, Joseph (Author) / Ira A. Fulton Schools of Engineering (Contributor)
Created2016-11-12
Description

Studies about the data quality of the National Bridge Inventory (NBI) reveal missing, erroneous, and logically conflicting data. Existing data quality programs lack a focus on detecting the logical inconsistencies within NBI and between NBI and external data sources. For example, within NBI, the structural condition ratings of some bridges improve over a period while no improvement activity or maintenance funds are recorded in the relevant attributes documented in NBI. An example of a logical inconsistency between NBI and external data sources is that some bridges are not located within 100 meters of any roads extracted from Google Maps. Manual detection of such logical errors is tedious and error-prone. This paper proposes a systematic “hypothesis testing” approach for automatically detecting logical inconsistencies within NBI and between NBI and external data sources. Using this framework, the authors detected logical inconsistencies in the NBI data of two sample states to reveal suspicious data items in NBI. The results showed that about 1% of bridges were not located within 100 meters of any actual roads, and a few bridges showed improvements in their structural evaluation without any reported maintenance records.
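The within-NBI check described here amounts to testing the hypothesis "condition ratings only improve when maintenance work is recorded" against every record. A toy version of such a rule check might look like the following; the field names and data are hypothetical, not actual NBI item codes.

```python
# Toy logical-consistency check: flag bridges whose condition rating
# improves between inspections with no maintenance recorded in between.
# Field names and values are hypothetical, not actual NBI item codes.
records = [
    {"bridge": "A", "rating_2014": 5, "rating_2016": 7, "maintenance_funds": 0},
    {"bridge": "B", "rating_2014": 6, "rating_2016": 6, "maintenance_funds": 0},
    {"bridge": "C", "rating_2014": 4, "rating_2016": 6, "maintenance_funds": 250_000},
    {"bridge": "D", "rating_2014": 7, "rating_2016": 5, "maintenance_funds": 0},
]

def flag_inconsistent(recs):
    """Return bridges violating the hypothesis that ratings only
    improve when maintenance work is recorded."""
    return [r["bridge"] for r in recs
            if r["rating_2016"] > r["rating_2014"] and r["maintenance_funds"] == 0]

suspicious = flag_inconsistent(records)
print(suspicious)  # bridge A improved with no recorded maintenance
```

The cross-source check (bridges more than 100 m from any road) follows the same pattern, with the predicate computed against geometry extracted from an external map source.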

ContributorsDin, Zia Ud (Author) / Tang, Pingbo (Author) / Ira A. Fulton Schools of Engineering (Contributor)
Created2016-05-20
Description

Quorum-sensing networks enable bacteria to sense and respond to chemical signals produced by neighboring bacteria. They are widespread: over 100 morphologically and genetically distinct species of eubacteria are known to use quorum sensing to control gene expression. This diversity suggests the potential to use natural protein variants to engineer parallel, input-specific, cell–cell communication pathways. However, only three distinct signaling pathways, Lux, Las, and Rhl, have been adapted for and broadly used in engineered systems. The paucity of unique quorum-sensing systems and their propensity for crosstalk limits the usefulness of our current quorum-sensing toolkit. This review discusses the need for more signaling pathways, roadblocks to using multiple pathways in parallel, and strategies for expanding the quorum-sensing toolbox for synthetic biology.

ContributorsDaer, Rene (Author) / Muller, Ryan Yue (Author) / Haynes, Karmella (Author) / Ira A. Fulton Schools of Engineering (Contributor)
Created2015-03-10
Description

Target-based screening is one of the major approaches in drug discovery. Besides the intended target, unexpected drug off-target interactions often occur, and many of them have not been recognized and characterized. These off-target interactions can be responsible for either therapeutic or side effects. Thus, identifying the genome-wide off-targets of lead compounds or existing drugs will be critical for designing effective and safe drugs and will provide new opportunities for drug repurposing. Although many computational methods have been developed to predict drug-target interactions, they are either less accurate than the one we propose here or too computationally intensive, limiting their capability for large-scale off-target identification. In addition, the performance of most machine learning-based algorithms has mainly been evaluated on predicting off-target interactions within the same gene family for hundreds of chemicals. It is not clear how these algorithms perform in detecting off-targets across gene families on a proteome scale.

Here, we present a fast and accurate off-target prediction method, REMAP, which is based on a dual regularized one-class collaborative filtering algorithm, to explore continuous chemical space, protein space, and their interactome on a large scale. When tested in a reliable, extensive, and cross-gene-family benchmark, REMAP outperforms the state-of-the-art methods. Furthermore, REMAP is highly scalable: it can screen a dataset of 200 thousand chemicals against 20 thousand proteins within 2 hours. Using the reconstructed genome-wide target profile as the fingerprint of a chemical compound, we predicted that seven FDA-approved drugs can be repurposed as novel anti-cancer therapies. The anti-cancer activity of six of them is supported by experimental evidence. Thus, REMAP is a valuable addition to the existing in silico toolbox for drug target identification, drug repurposing, phenotypic screening, and side effect prediction. The software and benchmark are available at https://github.com/hansaimlim/REMAP.
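REMAP's dual regularized algorithm is beyond a short snippet, but the core idea of one-class collaborative filtering (factorize the chemical-protein interaction matrix while down-weighting unobserved pairs, which may simply be untested rather than truly negative) can be sketched as follows. The dimensions, weights, and plain gradient-descent solver are illustrative assumptions, not REMAP's actual algorithm, which additionally regularizes the factors with chemical and protein similarity networks.

```python
import random

# Toy one-class weighted matrix factorization over a 4-chemical x 3-protein
# interaction matrix: observed pairs get target 1 with full weight;
# unobserved pairs get target 0 with a small weight.  (Illustrative only.)
N_CHEM, N_PROT, RANK = 4, 3, 2
OBSERVED = {(0, 0), (0, 1), (1, 1), (2, 2), (3, 0), (3, 2)}
W_UNOBS, LAM, LR, ITERS = 0.1, 0.01, 0.05, 2000

rng = random.Random(1)
U = [[rng.uniform(-0.1, 0.1) for _ in range(RANK)] for _ in range(N_CHEM)]
V = [[rng.uniform(-0.1, 0.1) for _ in range(RANK)] for _ in range(N_PROT)]

def pred(i, j):
    """Predicted interaction score: dot product of latent factors."""
    return sum(U[i][k] * V[j][k] for k in range(RANK))

for _ in range(ITERS):
    # gradients of the weighted squared loss plus L2 regularization
    gU = [[LAM * u for u in row] for row in U]
    gV = [[LAM * v for v in row] for row in V]
    for i in range(N_CHEM):
        for j in range(N_PROT):
            w = 1.0 if (i, j) in OBSERVED else W_UNOBS
            target = 1.0 if (i, j) in OBSERVED else 0.0
            e = w * (pred(i, j) - target)
            for k in range(RANK):
                gU[i][k] += e * V[j][k]
                gV[j][k] += e * U[i][k]
    for i in range(N_CHEM):
        for k in range(RANK):
            U[i][k] -= LR * gU[i][k]
    for j in range(N_PROT):
        for k in range(RANK):
            V[j][k] -= LR * gV[j][k]

obs = [pred(i, j) for (i, j) in OBSERVED]
unobs = [pred(i, j) for i in range(N_CHEM) for j in range(N_PROT)
         if (i, j) not in OBSERVED]
print(f"mean score: observed {sum(obs)/len(obs):.2f}, "
      f"unobserved {sum(unobs)/len(unobs):.2f}")
```

After training, the reconstructed scores for unobserved pairs serve as off-target candidates, which is the sense in which the completed matrix acts as a genome-wide target profile.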

ContributorsLim, Hansaim (Author) / Poleksic, Aleksandar (Author) / Yao, Yuan (Author) / Tong, Hanghang (Author) / He, Di (Author) / Zhuang, Luke (Author) / Meng, Patrick (Author) / Xie, Lei (Author) / Ira A. Fulton Schools of Engineering (Contributor)
Created2016-10-07