Matching Items (442)
Description
Wall-bounded turbulence manifests itself in a broad range of applications, not least of which are hydraulic systems. Here we briefly review the significant advances over the past few decades in the fundamental study of wall turbulence over smooth and rough surfaces, with an emphasis on coherent structures and their role at high Reynolds numbers. We attempt to relate these findings to parallel efforts in the hydraulic engineering community and discuss the implications of coherent structures in important hydraulic phenomena.
Created: 2012-09-10
Description
Vortex organization in the outer layer of a turbulent boundary layer overlying sparse, hemispherical roughness elements is explored with two-component particle-image velocimetry (PIV) in multiple streamwise-wall-normal measurement planes downstream of and between elements. The presence of sparse roughness elements causes a shortening of the streamwise length scale in the near-wall region. These measurements confirm that vortex packets exist in the outer layer of flow over rough walls, but that their organization is altered, and this is interpreted as the underlying cause of the length-scale reduction. In particular, the elements shed vortices which appear to align in the near-wall region, but are distinct from the packets. Further, it is observed that ejection events triggered in the element wakes are more intense than the ejection events over a smooth wall. We speculate that this may initiate a self-sustaining mechanism leading to the formation of hairpin packets via an instability much more effective than those typical of smooth-wall turbulence.
Created: 2012-09-09
Description
The dynamic importance of spanwise vorticity and vortex filaments has been assessed in steady, uniform open-channel flows by means of particle image velocimetry (PIV). By expressing the net force due to Reynolds’ turbulent shear stress, $\partial(-\overline{uv})/\partial y$, in terms of two velocity-vorticity correlations, $\overline{v\omega_z}$ and $\overline{w\omega_y}$, the results show that both spanwise vorticity $\overline{\omega_z}$ and the portion of it that is due to spanwise filaments make important contributions to the net force and hence the shape of the mean flow profile. Using the swirling strength to identify spanwise vortex filaments, it is found that they account for about 45% of $\overline{v\omega_z}$, the remainder coming from non-filamentary spanwise vorticity, i.e. shear. The mechanism underlying this contribution is the movement of vortex filaments away from the wall. The contribution of spanwise vortex filaments to the Reynolds stress is small because they occupy a small fraction of the flow. The contribution of the induced motion of the spanwise vortex filaments is significant.
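For reference, a minimal sketch of the identity behind this decomposition, assuming a statistically stationary flow that is homogeneous in the streamwise and spanwise directions (overbars denote Reynolds averages, and $u$, $v$, $w$, $\omega_y$, $\omega_z$ are fluctuating velocity and vorticity components):
\[
-\frac{\partial \overline{uv}}{\partial y} \;=\; \overline{v\omega_z} - \overline{w\omega_y},
\]
so that, under these assumptions, the net turbulent force in the mean momentum balance is carried entirely by the two velocity-vorticity correlations named above.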
Created: 2013-11-30
Description
We have fabricated a high mobility device, composed of a monolayer graphene flake sandwiched between two sheets of hexagonal boron nitride. Conductance fluctuations as functions of a back gate voltage and magnetic field were obtained to check for ergodicity. Non-linear dynamics concepts were used to study the nature of these fluctuations. The distribution of eigenvalues was estimated from the conductance fluctuations with Gaussian kernels and it indicates that the carrier motion is chaotic at low temperatures. We argue that a two-phase dynamical fluid model best describes the transport in this system and can be used to explain the violation of the so-called ergodic hypothesis found in graphene.
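As a point of reference only, a minimal sketch of the Gaussian-kernel density step mentioned above, assuming the eigenvalue estimates have already been extracted from the measured conductance fluctuations (the values here are synthetic placeholders, not the device data):

```python
"""Minimal sketch: estimate a distribution with Gaussian kernels, as described
above. The eigenvalue samples are synthetic placeholders; extracting them from
the measured conductance fluctuations is a separate step not shown."""
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(3)
eigenvalues = rng.normal(0.0, 1.0, size=200)   # placeholder eigenvalue estimates

kde = gaussian_kde(eigenvalues)                # Gaussian-kernel density estimate
grid = np.linspace(eigenvalues.min(), eigenvalues.max(), 200)
density = kde(grid)                            # estimated probability density on the grid
print(f"mode of the estimated distribution: {grid[np.argmax(density)]:.2f}")
```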
Contributors: da Cunha, C. R. (Author) / Mineharu, M. (Author) / Matsunaga, M. (Author) / Matsumoto, N. (Author) / Chuang, C. (Author) / Ochiai, Y. (Author) / Kim, G.-H. (Author) / Watanabe, K. (Author) / Taniguchi, T. (Author) / Ferry, David (Author) / Aoki, N. (Author) / Ira A. Fulton Schools of Engineering (Contributor) / School of Electrical, Computer and Energy Engineering (Contributor)
Created: 2016-09-09
Description

Inhibition by ammonium at concentrations above 1000 mgN/L is known to harm the methanogenesis phase of anaerobic digestion. We anaerobically digested swine waste and achieved steady state COD-removal efficiency of around 52% with no fatty-acid or H₂ accumulation. As the anaerobic microbial community adapted to the gradual increase of total ammonia-N (NH₃-N) from 890 ± 295 to 2040 ± 30 mg/L, the Bacterial and Archaeal communities became less diverse. Phylotypes most closely related to hydrogenotrophic Methanoculleus (36.4%) and Methanobrevibacter (11.6%), along with acetoclastic Methanosaeta (29.3%), became the most abundant Archaeal sequences during acclimation. This was accompanied by a sharp increase in the relative abundances of phylotypes most closely related to acetogens and fatty-acid producers (Clostridium, Coprococcus, and Sphaerochaeta) and syntrophic fatty-acid Bacteria (Syntrophomonas, Clostridium, Clostridiaceae species, and Cloacamonaceae species) that have metabolic capabilities for butyrate and propionate fermentation, as well as for reverse acetogenesis. Our results provide evidence countering a prevailing theory that acetoclastic methanogens are selectively inhibited when the total ammonia-N concentration is greater than ~1000 mgN/L. Instead, acetoclastic and hydrogenotrophic methanogens coexisted in the presence of total ammonia-N of ~2000 mgN/L by establishing syntrophic relationships with fatty-acid fermenters, as well as homoacetogens able to carry out forward and reverse acetogenesis.

Created: 2016-08-11
Description
Background
In the weeks following the first imported case of Ebola in the U.S. on September 29, 2014, coverage of the very limited outbreak dominated the news media, in a manner quite disproportionate to the actual threat to national public health; by the end of October, 2014, there were only four laboratory-confirmed cases of Ebola in the entire nation. Public interest in these events was high, as reflected in the millions of Ebola-related Internet searches and tweets performed in the month following the first confirmed case. Use of trending Internet searches and tweets has been proposed in the past for real-time prediction of outbreaks (a field referred to as “digital epidemiology”), but accounting for the biases of public panic has been problematic. In the case of the limited U.S. Ebola outbreak, we know that the Ebola-related searches and tweets originating in the U.S. during the outbreak were due only to public interest or panic, providing an unprecedented means to determine how these dynamics affect such data, and how news media may be driving these trends.
Methodology
We examine daily Ebola-related Internet search and Twitter data in the U.S. during the six-week period ending Oct 31, 2014. TV news coverage data were obtained from the daily number of Ebola-related news videos appearing on two major news networks. We fit the parameters of a mathematical contagion model to the data to determine if the news coverage was a significant factor in the temporal patterns of Ebola-related Internet and Twitter data.
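As an illustration of the kind of fit described above, here is a minimal sketch that treats daily tweet counts as a baseline plus news-video counts convolved with an exponentially decaying response, estimated by nonlinear least squares; the functional form, parameter names, and data are assumptions for illustration, not the authors' exact model or data:

```python
"""Minimal sketch (assumed form, not the authors' exact model): daily tweet
counts modeled as a baseline plus news-video counts convolved with an
exponentially decaying response, fit by nonlinear least squares."""
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)
days = np.arange(42)                            # six-week observation window
news = rng.poisson(30, size=days.size)          # synthetic daily news-video counts

def media_driven(_, baseline, gain, decay):
    """Expected tweets per day: baseline + gain * (news convolved with exp(-decay*t))."""
    kernel = np.exp(-decay * days)
    return baseline + gain * np.convolve(news, kernel)[: days.size]

true_params = (2000.0, 3000.0, 0.4)
tweets = rng.poisson(media_driven(days, *true_params))   # synthetic "observed" tweets

popt, _ = curve_fit(media_driven, days, tweets, p0=(1000, 1000, 0.5), bounds=(0, np.inf))
pred = media_driven(days, *popt)
r2 = 1 - np.sum((tweets - pred) ** 2) / np.sum((tweets - tweets.mean()) ** 2)
print(f"fitted (baseline, gain, decay) = {popt}, variance explained = {r2:.2f}")
```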
Conclusions
We find significant evidence of contagion, with each Ebola-related news video inspiring tens of thousands of Ebola-related tweets and Internet searches. Between 65% and 76% of the variance in all samples is described by the news media contagion model.
Created: 2015-06-11
Description
Background
Grading schemes for breast cancer diagnosis are predominantly based on pathologists' qualitative assessment of altered nuclear structure from 2D brightfield microscopy images. However, cells are three-dimensional (3D) objects with features that are inherently 3D and thus poorly characterized in 2D. Our goal is to quantitatively characterize nuclear structure in 3D, assess its variation with malignancy, and investigate whether such variation correlates with standard nuclear grading criteria.
Methodology
We applied micro-optical computed tomographic imaging and automated 3D nuclear morphometry to quantify and compare morphological variations between human cell lines derived from normal, benign fibrocystic or malignant breast epithelium. To reproduce the appearance and contrast in clinical cytopathology images, we stained cells with hematoxylin and eosin and obtained 3D images of 150 individual stained cells of each cell type at sub-micron, isotropic resolution. Applying volumetric image analyses, we computed 42 3D morphological and textural descriptors of cellular and nuclear structure.
Principal Findings
We observed four distinct nuclear shape categories, the predominant one being a mushroom-cap shape. Cell and nuclear volumes increased from normal to fibrocystic to metastatic type, but there was little difference in the volume ratio of nucleus to cytoplasm (N/C ratio) between the lines. Abnormal cell nuclei had more nucleoli, markedly higher density, and clumpier chromatin organization than normal nuclei. Nuclei of non-tumorigenic, fibrocystic cells exhibited larger textural variations than metastatic cell nuclei. At p<0.0025 by ANOVA and Kruskal-Wallis tests, 90% of our computed descriptors statistically differentiated control from abnormal cell populations, but only 69% of these features statistically differentiated the fibrocystic from the metastatic cell populations.
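As an illustration of the screening step described above, a minimal sketch that applies ANOVA and Kruskal-Wallis tests to a single (hypothetical) 3D descriptor measured on the three cell populations; descriptor values here are synthetic placeholders:

```python
"""Minimal sketch: screen one (hypothetical) 3D nuclear descriptor for
differences across the three cell populations with ANOVA and Kruskal-Wallis,
using the p < 0.0025 threshold quoted above. Values are synthetic placeholders."""
import numpy as np
from scipy.stats import f_oneway, kruskal

rng = np.random.default_rng(1)
normal      = rng.normal(200, 30, 150)   # descriptor values, 150 cells per line
fibrocystic = rng.normal(260, 45, 150)
metastatic  = rng.normal(300, 40, 150)

alpha = 0.0025
_, p_anova = f_oneway(normal, fibrocystic, metastatic)
_, p_kw = kruskal(normal, fibrocystic, metastatic)
print(f"ANOVA p = {p_anova:.2e}, Kruskal-Wallis p = {p_kw:.2e}, "
      f"differentiates populations: {p_anova < alpha and p_kw < alpha}")
```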
Conclusions
Our results provide a new perspective on nuclear structure variations associated with malignancy and point to the value of automated quantitative 3D nuclear morphometry as an objective tool to enable development of sensitive and specific nuclear grade classification in breast cancer diagnosis.
Created: 2012-01-05
Description
Recent studies indicate the presence of nano-scale titanium dioxide (TiO₂) as an additive in human foodstuffs, but a practical protocol to isolate and separate nano-fractions from soluble foodstuffs as a source of material remains elusive. As such, we developed a method for separating the nano and submicron fractions found in commercial-grade TiO₂ (E171) and E171 extracted from soluble foodstuffs and pharmaceutical products (e.g., chewing gum, pain reliever, and allergy medicine). Primary particle analysis of commercial-grade E171 indicated that 54% of particles were nano-sized (i.e., < 100 nm). Isolation and primary particle analysis of five consumer goods intended to be ingested revealed differences in the percent of nano-sized particles, ranging from 32% to 58%. Separation and enrichment of nano- and submicron-sized particles from commercial-grade E171 and E171 isolated from foodstuffs and pharmaceuticals was accomplished using rate-zonal centrifugation. Commercial-grade E171 was separated into nano- and submicron-enriched fractions with nano:submicron ratios of approximately 0.45:1 and 3.2:1, respectively. E171 extracted from gum had nano:submicron ratios of 1.4:1 and 0.19:1 for the nano- and submicron-enriched fractions, respectively. We show a difference in particle adhesion to the cell surface, which was found to be dependent on particle size and epithelial orientation. Finally, we provide evidence that E171 particles are not immediately cytotoxic to the Caco-2 human intestinal epithelium model. These data suggest that this separation method is appropriate for studies interested in isolating the nano-sized particle fraction taken directly from consumer products, in order to study separately the effects of nano and submicron particles.
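For illustration, a minimal sketch of the primary-particle bookkeeping reported above: given a set of measured particle diameters, compute the percent of nano-sized particles (< 100 nm) and the nano:submicron number ratio. The diameters below are synthetic placeholders, not measurements from E171:

```python
"""Minimal sketch: from a set of measured primary-particle diameters (nm),
compute the percent of nano-sized particles (< 100 nm) and the nano:submicron
number ratio. Diameters below are synthetic placeholders, not E171 data."""
import numpy as np

rng = np.random.default_rng(2)
diameters_nm = rng.lognormal(mean=np.log(110), sigma=0.35, size=500)

is_nano = diameters_nm < 100.0
percent_nano = 100.0 * is_nano.mean()
nano_to_submicron = is_nano.sum() / (~is_nano).sum()
print(f"nano-sized: {percent_nano:.0f}%, nano:submicron ratio: {nano_to_submicron:.2f}:1")
```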
Created: 2016-10-31
Description
Background
Seroepidemiological studies before and after the epidemic wave of H1N1-2009 are useful for estimating population attack rates with a potential to validate early estimates of the reproduction number, R, in modeling studies.
Methodology/Principal Findings
Since the final epidemic size, the proportion of individuals in a population who become infected during an epidemic, is not the result of a binomial sampling process because infection events are not independent of each other, we propose the use of an asymptotic distribution of the final size to compute approximate 95% confidence intervals of the observed final size. This allows the comparison of the observed final sizes against predictions based on the modeling study (R = 1.15, 1.40 and 1.90), which also yields simple formulae for determining sample sizes for future seroepidemiological studies. We examine a total of eleven published seroepidemiological studies of H1N1-2009 that took place after observing the peak incidence in a number of countries. Observed seropositive proportions in six studies appear to be smaller than those predicted from R = 1.40; four of the six studies sampled serum less than one month after the reported peak incidence. The comparison of the observed final sizes against R = 1.15 and 1.90 reveals that all eleven studies appear not to deviate significantly from the prediction with R = 1.15, but final sizes in nine studies indicate overestimation if the value R = 1.90 is used.
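For context, the predicted attack rates for these reproduction numbers can be sketched with the standard final-size relation for a homogeneously mixing SIR-type epidemic, z = 1 − exp(−Rz); this is a minimal illustration and may differ in detail from the calculation and confidence intervals used in the study:

```python
"""Minimal sketch: predicted attack rate from the standard final-size relation
z = 1 - exp(-R*z) for a homogeneously mixing SIR-type epidemic (an assumption;
the study's exact calculation and confidence intervals may differ)."""
from math import exp
from scipy.optimize import brentq

def final_size(R: float) -> float:
    """Non-zero root of z = 1 - exp(-R*z) for R > 1."""
    return brentq(lambda z: z - (1.0 - exp(-R * z)), 1e-6, 1.0)

for R in (1.15, 1.40, 1.90):
    print(f"R = {R:.2f}: predicted final size = {final_size(R):.1%}")
```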
Conclusions
Sample sizes of published seroepidemiological studies were too small to assess the validity of model predictions except when R = 1.90 was used. We recommend the use of the proposed approach in determining the sample size of post-epidemic seroepidemiological studies, calculating the 95% confidence interval of observed final size, and conducting relevant hypothesis testing instead of the use of methods that rely on a binomial proportion.
Created: 2011-03-24
Description
Background
Several past studies have found that media reports of suicides and homicides appear to subsequently increase the incidence of similar events in the community, apparently due to the coverage planting the seeds of ideation in at-risk individuals to commit similar acts.
Methods
Here we explore whether or not contagion is evident in more high-profile incidents, such as school shootings and mass killings (incidents with four or more people killed). We fit a contagion model to recent data sets related to such incidents in the US, with terms that take into account the fact that a school shooting or mass murder may temporarily increase the probability of a similar event in the immediate future, by assuming an exponential decay in contagiousness after an event.
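As an illustration of the kind of contagion term described above, here is a minimal sketch of a self-exciting event rate in which each past event adds an exponentially decaying excess probability; the parameter values echo the averages reported below, but the study's exact model and fitting procedure may differ:

```python
"""Minimal sketch of a self-exciting ("contagion") event rate with exponential
decay; parameter values echo the reported averages but are illustrative."""
import numpy as np

def intensity(t, event_times, baseline, excess, decay_days):
    """Event rate (events/day) at time t: a constant baseline plus an
    exponentially decaying bump from every past event. Each event adds
    excess * decay_days expected secondary events in total."""
    past = np.asarray([s for s in event_times if s < t], dtype=float)
    return baseline + excess * np.sum(np.exp(-(t - past) / decay_days))

# Roughly one background event every two weeks; each event incites ~0.3 more,
# spread over an exponential window with a 13-day mean (numbers from the text).
events = [0.0, 5.0, 9.0]                      # hypothetical event days
print(intensity(10.0, events, baseline=1 / 14, excess=0.3 / 13, decay_days=13.0))
```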
Conclusions
We find significant evidence that mass killings involving firearms are incented by similar events in the immediate past. On average, this temporary increase in probability lasts 13 days, and each incident incites at least 0.30 new incidents (p = 0.0015). We also find significant evidence of contagion in school shootings, for which an incident is contagious for an average of 13 days, and incites an average of at least 0.22 new incidents (p = 0.0001). All p-values are assessed based on a likelihood ratio test comparing the likelihood of a contagion model to that of a null model with no contagion. On average, mass killings involving firearms occur approximately every two weeks in the US, while school shootings occur on average monthly. We find that state prevalence of firearm ownership is significantly associated with the state incidence of mass killings with firearms, school shootings, and mass shootings.
Created: 2015-07-02