First, in existing linear controllability frameworks, the ability to steer a network from any initial state to any desired state is measured by the minimum number of driver nodes. However, the associated optimal control energy can become prohibitively large, preventing actual control from being realized. Here I develop a physical controllability framework and propose strategies to turn physically uncontrollable networks into physically controllable ones. I also find that, although full control can be guaranteed by the prevailing structural controllability theory, achieving actual control requires balancing the number of driver nodes against the control energy, and my work provides a framework to address this issue.
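The tension between few driver nodes and large control energy can be made concrete with a toy computation. The sketch below is my own illustration (not the dissertation's code), assuming a discrete-time linear model x(t+1) = A x(t) + B u(t); `chain` and `min_control_energy` are hypothetical helper names. The minimum energy to reach a target state x_f from the origin in T steps is x_f^T W^{-1} x_f, where W is the controllability Gramian. For a directed chain driven only at its head, one driver node suffices for structural controllability, yet the energy to move the far end grows exponentially with chain length.

```python
def matmul(X, Y):
    """Multiply two matrices stored as lists of rows."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def transpose(X):
    return [list(row) for row in zip(*X)]

def identity(n):
    return [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]

def solve(W, b):
    """Solve W x = b by Gaussian elimination with partial pivoting."""
    n = len(W)
    M = [row[:] + [b[i]] for i, row in enumerate(W)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def min_control_energy(A, B, x_f, T):
    """Minimum energy x_f^T W^{-1} x_f, with Gramian
    W = sum_{k=0}^{T-1} A^k B B^T (A^T)^k."""
    n = len(A)
    W = [[0.0] * n for _ in range(n)]
    Ak = identity(n)
    for _ in range(T):
        P = matmul(matmul(Ak, B), matmul(transpose(B), transpose(Ak)))
        for i in range(n):
            for j in range(n):
                W[i][j] += P[i][j]
        Ak = matmul(Ak, A)
    Winv_xf = solve(W, x_f)
    return sum(x_f[i] * Winv_xf[i] for i in range(n))

def chain(n, w=0.5):
    """Directed chain 1 -> 2 -> ... -> n, driver input at node 1."""
    A = [[0.0] * n for _ in range(n)]
    for i in range(1, n):
        A[i][i - 1] = w
    B = [[1.0 if i == 0 else 0.0] for i in range(n)]
    return A, B

for n in (3, 5, 7):
    A, B = chain(n)
    x_f = [0.0] * (n - 1) + [1.0]   # push the last node to 1
    print(n, min_control_energy(A, B, x_f, T=n))
```

For this chain with link weight 0.5, the energy to set the last node to 1 grows as 4^(n-1): a single driver node is structurally sufficient but physically expensive, which is exactly the trade-off the physical controllability framework quantifies.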
Second, despite recent progress in linear controllability, controlling nonlinear dynamical networks remains an outstanding problem. Here I develop an experimentally feasible control framework for nonlinear dynamical networks that exhibit multistability. The control objective is to apply parameter perturbations to drive the system from one attractor to another. I introduce the concept of the attractor network and formulate a quantifiable framework: a network is more controllable if its attractor network is more strongly connected. I test the control framework on examples from various models and demonstrate the beneficial role of noise in facilitating control.
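The connectivity criterion can be illustrated with a small sketch. This is a hypothetical example of my own (the attractor graphs and the function `controllability_fraction` are illustrative, not from the dissertation): each node is an attractor, and a directed edge A -> B means some admissible parameter perturbation drives the system from attractor A into the basin of attractor B. The fraction of ordered attractor pairs connected by a directed path then measures how many attractor-to-attractor transitions control can realize.

```python
def reachable(graph, start):
    """Set of nodes reachable from `start` along directed edges (DFS)."""
    seen, stack = {start}, [start]
    while stack:
        node = stack.pop()
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen

def controllability_fraction(graph):
    """Fraction of ordered pairs (A, B), A != B, with a directed path
    A -> ... -> B, i.e. the fraction of realizable transitions."""
    nodes = list(graph)
    pairs = ok = 0
    for a in nodes:
        reach = reachable(graph, a)
        for b in nodes:
            if a != b:
                pairs += 1
                ok += b in reach
    return ok / pairs

# Fully strongly connected: perturbations can cycle A -> B -> C -> A,
# so every transition is achievable.
strong = {"A": ["B"], "B": ["C"], "C": ["A"]}
# Attractor C is a trap: once there, no perturbation escapes it.
weak = {"A": ["B"], "B": ["C"], "C": []}
print(controllability_fraction(strong))  # 1.0
print(controllability_fraction(weak))    # 0.5
```

A strongly connected attractor network scores 1.0 (any attractor can be reached from any other), while removing a single escape edge cuts the realizable transitions in half.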
Third, I analyze large data sets from diverse online social networking (OSN) systems and find that the growth dynamics of meme popularity exhibit characteristically different behaviors: linear, “S”-shaped, and exponential growth. Inspired by cell population growth models in microbial ecology, I construct a base growth model for meme popularity in OSNs. I then incorporate human interest dynamics into the base model and propose a hybrid model with a small number of free parameters. The model successfully predicts the various distinct meme growth dynamics.
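The three growth regimes can be sketched with a toy discrete-time model. This is my own minimal illustration, not the dissertation's hybrid model (which also couples in human-interest dynamics): a constant adoption rate yields linear growth, an unconstrained per-capita rate yields exponential growth, and a logistic saturation term of the kind used in microbial population ecology yields the "S" shape.

```python
def simulate(rate_fn, n0=1.0, steps=50):
    """Iterate popularity n(t+1) = n(t) + rate_fn(n(t)) and record the trajectory."""
    n, traj = n0, [n0]
    for _ in range(steps):
        n += rate_fn(n)
        traj.append(n)
    return traj

linear      = simulate(lambda n: 5.0)                      # dn ~ const
exponential = simulate(lambda n: 0.2 * n)                  # dn ~ r n
s_shaped    = simulate(lambda n: 0.2 * n * (1 - n / 500))  # dn ~ r n (1 - n/K)

# The logistic trajectory saturates near its carrying capacity K = 500,
# reproducing the "S"-shaped plateau seen in empirical meme popularity.
print(linear[-1], exponential[-1], s_shaped[-1])
```

Fitting which regime a given meme follows, and when it crosses between regimes, is what the small set of free parameters in the hybrid model controls.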
Finally, I propose a nonlinear dynamics model to characterize the control of the WNT signaling pathway in the differentiation of neural progenitor cells. The model is able to predict experimental results and sheds light on the mechanisms of WNT regulation.
Almost every step of analysis and quantification requires an often empirically determined threshold, which makes quantification of noise less accurate. In addition, each research group tends to develop its own data analysis pipeline, making it difficult to compare data across groups. To remedy this problem, a streamlined and standardized scRNA-seq data analysis and normalization protocol was designed and developed. After analyzing multiple experiments, we identified the necessary pipeline stages and tools. Our pipeline is capable of handling data with adapters and barcodes, which was not the case for the pipelines used in some experiments. It can be used both to analyze scRNA-seq data from a single experiment and to compare scRNA-seq data across experiments. Processes such as data gathering, file conversion, and data merging were automated in the pipeline. The main focus was to standardize and normalize single-cell RNA-seq data so as to minimize the technical noise introduced by disparate platforms.
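One standard normalization step of the kind such a pipeline applies can be sketched as follows. This is a hypothetical helper of my own (the function name and data layout are illustrative, not the group's actual pipeline code): scaling each cell's counts to a common library size (counts per million, CPM) followed by a log transform removes sequencing-depth differences, one major source of platform-dependent technical noise in cross-experiment comparison.

```python
import math

def cpm_log_normalize(counts):
    """counts: {cell_id: {gene: raw_count}} -> per-cell log1p(CPM) values."""
    normalized = {}
    for cell, genes in counts.items():
        total = sum(genes.values())  # the cell's library size
        normalized[cell] = {
            gene: math.log1p(count / total * 1e6)
            for gene, count in genes.items()
        }
    return normalized

# Two cells sequenced at very different depths: after normalization a gene
# with the same relative abundance gets the same value in both cells.
raw = {
    "cell_shallow": {"GENE_A": 10, "GENE_B": 90},
    "cell_deep":    {"GENE_A": 1000, "GENE_B": 9000},
}
norm = cpm_log_normalize(raw)
print(norm["cell_shallow"]["GENE_A"] == norm["cell_deep"]["GENE_A"])  # True
```

Depth normalization is only one stage; a full pipeline would precede it with adapter and barcode handling and follow it with cross-experiment batch correction.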
A globally integrated carbon observation and analysis system is needed to improve the fundamental understanding of the global carbon cycle, to improve our ability to project future changes, and to verify the effectiveness of policies aiming to reduce greenhouse gas emissions and increase carbon sequestration. Building an integrated carbon observation system requires transformational advances from the existing sparse, exploratory framework towards a dense, robust, and sustained system in all components: anthropogenic emissions, the atmosphere, the ocean, and the terrestrial biosphere. The paper is addressed to scientists, policymakers, and funding agencies who need to have a global picture of the current state of the (diverse) carbon observations.
We identify the current state of carbon observations, and the needs and notional requirements for a global integrated carbon observation system that can be built in the next decade. A key conclusion is that a substantial expansion of the ground-based observation networks is required to reach the high spatial resolution needed for CO2 and CH4 fluxes and for carbon stocks, in order to address policy-relevant objectives and attribute flux changes to underlying processes in each region. To establish flux and stock diagnostics over areas such as the southern oceans, tropical forests, and the Arctic, in situ observations will have to be complemented with remote-sensing measurements. Remote sensing offers the advantage of dense spatial coverage and frequent revisits. A key challenge is to bring remote-sensing measurements to a level of long-term consistency and accuracy so that they can be efficiently combined in models to reduce uncertainties, in synergy with ground-based data.
Bringing tight observational constraints on fossil fuel and land use change emissions will be the biggest challenge for deployment of a policy-relevant integrated carbon observation system. This will require in situ and remotely sensed data at much higher resolution and density than currently achieved for natural fluxes, albeit over smaller land areas (cities, industrial sites, power plants), as well as the inclusion of fossil fuel CO2 proxy measurements such as radiocarbon in CO2 and carbon-fuel combustion tracers. Additionally, a policy-relevant carbon monitoring system should provide mechanisms for reconciling regional top-down (atmosphere-based) and bottom-up (surface-based) flux estimates across the range of spatial and temporal scales relevant to mitigation policies. Uncertainties for each observation data stream should also be assessed. The success of the system will rely on long-term commitments to monitoring, on improved international collaboration to fill gaps in the current observations, on sustained efforts to improve access to the different data streams and make databases interoperable, and on the calibration of each component of the system to agreed-upon international scales.