Description
As the use of engineered nanomaterials (ENMs) in consumer products becomes more common, the amount of ENMs entering wastewater treatment plants (WWTPs) increases. Investigating the fate of ENMs in WWTPs is critical for risk assessment and pollution control. The objectives of this dissertation were to (1) quantify and characterize titanium (Ti) in full-scale wastewater treatment plants, (2) quantify sorption of different ENMs to wastewater biomass in laboratory-scale batch reactors, (3) evaluate the use of a standard, soluble-pollutant sorption test method for quantifying ENM interaction with wastewater biomass, and (4) develop a mechanistic model of a biological wastewater treatment reactor to serve as the basis for modeling nanomaterial fate in WWTPs. Using Ti as a model material for the fate of ENMs in WWTPs, Ti concentrations were measured in 10 municipal WWTPs. Ti concentrations in plant influent ranged from 181 to 3000 µg/L, and more than 96% of Ti was removed, with effluent Ti concentrations less than 25 µg/L. Ti removed from wastewater accumulated in solids at concentrations ranging from 1 to 6 µg Ti/mg solids. Using transmission electron microscopy, spherical titanium oxide nanoparticles with diameters ranging from 4 to 30 nm were found in WWTP effluents, evidence that some nanoscale particles will pass through WWTPs and enter aquatic systems. Batch experiments were conducted to quantify sorption of different ENM types to activated sludge. Percentages of sorption to 400 mg TSS/L biomass ranged from about 10 to 90%, depending on the ENM material and functionalization. Natural organic matter, surfactants, and proteins had a stabilizing effect on most of the ENMs tested. The United States Environmental Protection Agency's standard sorption testing method (OPPTS 835.1110) used for soluble compounds was found to be inapplicable to ENMs, as freeze-dried activated sludge transforms ENMs into stable particles in suspension.
In conjunction with the experiments, we created a mechanistic model of the microbiological processes in membrane bioreactors (MBRs) to predict MBR performance, extended and modified this model to predict the fate of soluble micropollutants, and then discussed how the micropollutant fate model could be used to predict the fate of nanomaterials in wastewater treatment plants.
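The removal efficiency reported above reduces to a simple arithmetic check; a minimal sketch follows, in which the helper function name and the 1000 µg/L example influent are illustrative assumptions, not values from the dissertation:

```python
def removal_efficiency(influent_ug_per_L, effluent_ug_per_L):
    """Percent of the influent concentration removed across the plant."""
    return 100.0 * (influent_ug_per_L - effluent_ug_per_L) / influent_ug_per_L

# A hypothetical plant at 1000 ug/L influent, with the reported 25 ug/L
# effluent ceiling, lands above the 96% removal figure cited above.
print(round(removal_efficiency(1000, 25), 1))  # 97.5
```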
Contributors: Kiser, Mehlika Ayla (Author) / Westerhoff, Paul K. (Thesis advisor) / Rittmann, Bruce E. (Committee member) / Hristovski, Kiril D. (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
A numerical study of incremental spin-up and spin-up from rest of a thermally stratified fluid enclosed within a right circular cylinder with rigid bottom and side walls and a stress-free upper surface is presented. Thermally stratified spin-up is a typical example of baroclinity: it is initiated by a sudden increase in rotation rate, and the resulting tilting of isotherms gives rise to a baroclinic source of vorticity. Research by Smirnov et al. [2010a] showed differences in the evolution of instabilities when Dirichlet and Neumann thermal boundary conditions were applied at the top and bottom walls. The parametric variations studied in this dissertation confirmed the instability patterns they observed for the given aspect ratio and Rossby numbers greater than 0.5. The results also reveal that the flow remained axisymmetric and stable in short-aspect-ratio containers, regardless of the rotational increment imparted. An investigation of the vorticity components provides a framework for the baroclinic vorticity feedback mechanism, which plays an important role in the delayed rise of instabilities when Dirichlet thermal boundary conditions are applied.
Contributors: Kher, Aditya Deepak (Author) / Chen, Kangping (Thesis advisor) / Huang, Huei-Ping (Committee member) / Herrmann, Marcus (Committee member) / Arizona State University (Publisher)
Created: 2011
Description

A recent joint study by Arizona State University and the Arizona Department of Transportation (ADOT) evaluated certain Warm Mix Asphalt (WMA) properties in the laboratory. WMA material was taken from an actual ADOT project that involved two WMA sections. The first section used a foaming-based WMA admixture, and the second section used a chemical-based WMA admixture; the rest of the project used a control hot mix asphalt (HMA) mixture. The evaluation included testing of field-core specimens and laboratory-compacted specimens. The laboratory specimens were compacted at two temperatures: 270 °F (132 °C) and 310 °F (154 °C). The experimental plan included four laboratory tests: dynamic modulus (E*), indirect tensile strength (IDT), moisture damage evaluation using the AASHTO T-283 test, and the Hamburg wheel-track test. The dynamic modulus (E*) results of the field cores at 70 °F showed similar E* values for the control HMA and foaming-based WMA mixtures; the E* values of the chemical-based WMA mixture were relatively higher. IDT test results of the field cores yielded findings comparable to the E* results. For the laboratory-compacted specimens, both E* and IDT results indicated that decreasing the compaction temperature from 310 °F to 270 °F did not have any negative effect on material strength for either WMA mixture, while the control HMA strength was affected to some extent. The E* and IDT results of the chemical-based WMA field cores were high; however, the laboratory-compacted specimen results did not show the same tendency. The moisture sensitivity findings from the TSR test disagreed with those of the Hamburg test: while TSR results indicated relatively low values of about 60% for all three mixtures, the Hamburg test results were excellent.
In general, the results of this study indicated that both WMA mixes are best evaluated through field-compacted mixes/cores; the results of the laboratory-compacted specimens were helpful only to a certain extent. The dynamic moduli of the field-core specimens were higher than those of specimens compacted in the laboratory. The moisture damage findings indicated that more investigation is needed to evaluate moisture damage susceptibility in the field.

Contributors: Alossta, Abdulaziz (Author) / Kaloush, Kamil (Thesis advisor) / Witczak, Matthew W. (Committee member) / Mamlouk, Michael S. (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
Following the success of incorporating perceptual models in audio coding algorithms, their application in other speech/audio processing systems is expanding. In general, all perceptual speech/audio processing algorithms involve minimization of an objective function that directly or indirectly incorporates properties of human perception. This dissertation primarily investigates the problems associated with directly embedding an auditory model in the objective function formulation and proposes solutions to the high-complexity issues that arise in real-time speech/audio algorithms. Specific problems addressed in this dissertation include: 1) the development of approximate but computationally efficient auditory model implementations that are consistent with the principles of psychoacoustics, and 2) the development of a mapping scheme that allows synthesizing a time/frequency domain representation from its equivalent auditory model output. The first problem addresses the high computational complexity involved in solving perceptual objective functions that require repeated application of the auditory model to evaluate different candidate solutions. In this dissertation, frequency-pruning and detector-pruning algorithms are developed that efficiently implement the various auditory model stages. The performance of the pruned model is compared to that of the original auditory model for different types of test signals in the SQAM database. Experimental results indicate only a 4-7% relative error in loudness while attaining up to an 80-90% reduction in computational complexity. Similarly, a hybrid algorithm is developed specifically for use with sinusoidal signals; it employs the proposed auditory pattern combining technique together with a look-up table that stores representative auditory patterns.
The second problem concerns obtaining an estimate of the auditory representation that minimizes a perceptual objective function and transforming that auditory pattern back to its equivalent time/frequency representation. This avoids the repeated application of the auditory model stages to test different candidate time/frequency vectors when minimizing perceptual objective functions. In this dissertation, a constrained mapping scheme is developed by linearizing certain auditory model stages, which ensures obtaining a time/frequency mapping corresponding to the estimated auditory representation. This paradigm was successfully incorporated in a perceptual speech enhancement algorithm and a sinusoidal component selection task.
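The frequency-pruning idea above can be illustrated with a minimal sketch: bins whose level falls below a threshold are skipped before any further auditory-model processing, so later stages touch fewer bins. The function name, threshold, and example spectrum are assumptions for illustration, not the dissertation's algorithm.

```python
import math

def prune_bins(power_spectrum, threshold_db=-60.0):
    """Keep only (index, power) pairs whose level exceeds the threshold."""
    kept = []
    for i, power in enumerate(power_spectrum):
        # Convert linear power to dB; silence maps to -infinity.
        level_db = 10.0 * math.log10(power) if power > 0 else float("-inf")
        if level_db >= threshold_db:
            kept.append((i, power))
    return kept

# Bins 1 (-90 dB) and 3 (silence) are pruned before any costly model stages.
print(prune_bins([1.0, 1e-9, 0.5, 0.0, 1e-3]))
```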
Contributors: Krishnamoorthi, Harish (Author) / Spanias, Andreas (Thesis advisor) / Papandreou-Suppappola, Antonia (Committee member) / Tepedelenlioğlu, Cihan (Committee member) / Tsakalis, Konstantinos (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
Concern regarding the quality of traffic data exists among engineers and planners tasked with obtaining and using the data for various transportation applications. While data quality issues are often understood by the analysts doing the hands-on work, the quality characteristics of the data are rarely communicated effectively beyond the analyst. This research is an exercise in measuring and reporting data quality. The assessment was conducted to support the performance measurement program at the Maricopa Association of Governments in Phoenix, Arizona, and investigates the traffic data from 228 continuous-monitoring freeway sensors in the metropolitan region. Results of the assessment provide an example of describing the quality of the traffic data with each of six data quality measures suggested in the literature: accuracy, completeness, validity, timeliness, coverage, and accessibility. An important contribution is made in the use of data quality visualization tools. These visualization tools are used to evaluate the validity of the traffic data beyond the pass/fail criteria commonly used. More significantly, they serve to cultivate an intuitive understanding of the underlying characteristics of the data considered valid. Recommendations from the experience gained in this assessment include that data quality visualization tools be developed and used in the processing and quality control of traffic data, and that these visualization tools, along with other information on the quality control effort, be stored as metadata with the processed data.
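Two of the six measures named above, completeness and validity, can be sketched for a single sensor's speed records; the plausibility range and the sample values below are hypothetical, not drawn from the MAG dataset.

```python
def completeness(records, expected_count):
    """Percent of expected observations actually received."""
    return 100.0 * len(records) / expected_count

def validity(records, low=0.0, high=120.0):
    """Percent of received speed records inside a plausible range (mph)."""
    return 100.0 * sum(1 for r in records if low <= r <= high) / len(records)

speeds = [62.0, 65.5, 180.0, 58.2]  # one implausible reading; one record lost
print(completeness(speeds, expected_count=5))  # 80.0
print(validity(speeds))                        # 75.0
```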
Contributors: Samuelson, Jothan P. (Author) / Pendyala, Ram M. (Thesis advisor) / Ahn, Soyoung (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
In the search for chemical biosensors designed for patient-based physiological applications, non-invasive diagnostic approaches continue to have value. The work described in this thesis builds upon previous breath analysis studies. In particular, it seeks to assess the adsorptive mechanisms active in both acetone and ethanol biosensors designed for breath analysis. The thermoelectric biosensors under investigation were constructed using a thermopile for transduction and four different materials for biorecognition. The analytes, acetone and ethanol, were evaluated under dry-air and humidified-air conditions. The biosensor response to acetone concentration was found to be both repeatable and linear, while the sensor response to ethanol presence was also found to be repeatable. The different biorecognition materials produced discernible thermoelectric responses that were characteristic for each analyte. The sensor output data is presented in this report. Additionally, the results were evaluated against a mathematical model for further analysis. Ultimately, a thermoelectric biosensor based upon adsorption chemistry was developed and characterized. Additional work is needed to characterize the physicochemical action mechanism.
ContributorsWilson, Kimberly (Author) / Guilbeau, Eric (Thesis advisor) / Pizziconi, Vincent (Thesis advisor) / LaBelle, Jeffrey (Committee member) / Arizona State University (Publisher)
Created2011
Description
The video game graphics pipeline has traditionally rendered the scene using a polygonal approach. Advances in modern graphics hardware now allow the rendering of parametric methods. This thesis explores various smooth surface rendering methods that can be integrated into the video game graphics engine. Moving from the polygonal domain to parametric or smooth surfaces has its share of issues, and there is an inherent need to address various rendering bottlenecks that could hamper such a move. The game engine needs to choose an appropriate method based on the in-game characteristics of the objects; character and animated objects need more sophisticated methods, whereas static objects can use simpler techniques. Scaling the polygon count across various hardware platforms becomes an important factor. Much control is needed over the tessellation levels, whether imposed by hardware limitations or by the application, to adaptively render the mesh without significant loss in performance. This thesis explores several methods that help game engine developers make correct design choices by optimally balancing the trade-offs while rendering the scene using smooth surfaces. It proposes a novel technique for adaptive tessellation of triangular meshes that vastly improves speed and tessellation count. It develops an approximate method for rendering Loop subdivision surfaces on tessellation-enabled hardware. A taxonomy and evaluation of the methods is provided, and a unified rendering system that provides automatic level of detail by switching between the methods is proposed.
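The idea of adapting tessellation to scene content can be sketched as a screen-space heuristic: more on-screen pixels earn more triangles, up to a hardware cap. The sizing rule, constants, and function name below are illustrative assumptions, not the thesis's algorithm.

```python
def tessellation_level(screen_area_px, px_per_triangle=64.0, hw_max=64):
    """Map projected screen coverage to a triangle budget, capped by hardware."""
    level = int(screen_area_px / px_per_triangle) or 1  # at least one triangle
    return min(level, hw_max)

print(tessellation_level(256.0))     # small on-screen object -> level 4
print(tessellation_level(100000.0))  # large object -> clamped at hw_max, 64
```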
Contributors: Amresh, Ashish (Author) / Farin, Gerald (Thesis advisor) / Razdan, Anshuman (Thesis advisor) / Wonka, Peter (Committee member) / Hansford, Dianne (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
It is estimated that wind-induced soil erosion transports more than 500 × 10⁶ metric tons of fugitive dust annually. Soil erosion has negative effects on human health, the productivity of farms, and the quality of surface waters. A variety of polymer stabilizers are available on the market for fugitive dust control. Most of these are expensive synthetic polymer products, and their adverse effects and expense usually limit their use. Biopolymers provide a potential alternative to synthetic polymers: they can provide dust abatement by encapsulating soil particles and creating a binding network throughout the treated area. This research into the effectiveness of biopolymers for fugitive dust control involved three phases. Phase I included proof-of-concept tests, Phase II consisted of wind tunnel tests, and Phase III consisted of field experiments. The proof-of-concept tests showed that biopolymers have the potential to reduce soil erosion and fugitive dust transport. Wind tunnel tests on two candidate biopolymers, xanthan gum and chitosan, showed a proportional relationship between biopolymer application rate and threshold wind velocity. The wind tunnel tests also indicated that xanthan gum would be more successful in the field than chitosan. The field tests showed that xanthan gum was effective at controlling soil erosion; however, the chitosan field data were inconsistent with both the xanthan data and the field data on bare soil.
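The proportional relationship between application rate and threshold wind velocity described above can be sketched as a least-squares line; the data pairs below are invented for illustration and are not the dissertation's measurements.

```python
def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum(
        (x - mx) ** 2 for x in xs
    )
    return a, my - a * mx

rates = [0.5, 1.0, 1.5, 2.0]         # biopolymer application rate (hypothetical)
thresholds = [6.0, 8.0, 10.0, 12.0]  # threshold wind velocity, m/s (invented)
slope, intercept = fit_line(rates, thresholds)
print(slope, intercept)  # 4.0 4.0 -> higher application rate, higher threshold
```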
Contributors: Alsanad, Abdullah (Author) / Kavazanjian, Edward (Thesis advisor) / Edwards, David (Committee member) / Zapata, Claudia (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
There is increasing interest in the medical and behavioral health communities in developing effective strategies for the treatment of chronic diseases. Among these strategies are adaptive interventions, which adjust treatment dosages over time based on participant response. Control engineering offers a broad-based solution framework for optimizing the effectiveness of such interventions. In this thesis, an approach is proposed to develop dynamical models and, subsequently, hybrid model predictive control schemes for assigning optimal dosages of naltrexone, an opioid antagonist, as treatment for a chronic pain condition known as fibromyalgia. System identification techniques are employed to model the dynamics from daily diary reports completed by participants of a blind naltrexone intervention trial. These self-reports include assessments of outcomes of interest (e.g., general pain symptoms, sleep quality) and additional external variables (disturbances) that affect these outcomes (e.g., stress, anxiety, and mood). Using prediction-error methods, a multi-input model describing the effect of drug, placebo, and other disturbances on outcomes of interest is developed. This discrete-time model is approximated by a continuous second-order model with a zero, which was found adequate to capture the dynamics of this intervention. Data from 40 participants in two clinical trials were analyzed, and participants were classified as responders or non-responders based on the models obtained from system identification. The dynamical models can be used by a model predictive controller for automated dosage selection of naltrexone using feedback/feedforward control actions in the presence of external disturbances. The clinical requirement for categorical (i.e., discrete-valued) drug dosage levels creates a need for hybrid model predictive control (HMPC).
The controller features a multiple degree-of-freedom formulation that enables the user to adjust the speed of setpoint tracking, measured disturbance rejection and unmeasured disturbance rejection independently in the closed loop system. The nominal and robust performance of the proposed control scheme is examined via simulation using system identification models from a representative participant in the naltrexone intervention trial. The controller evaluation described in this thesis gives credibility to the promise and applicability of control engineering principles for optimizing adaptive interventions.
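The categorical-dosage requirement above can be illustrated with a toy enumeration over a short horizon: simulate each allowed discrete dose sequence through a simple response model and keep the cheapest plan. The first-order model, dosage levels, and quadratic cost below are invented for illustration; they are not the thesis's identified models or its HMPC formulation.

```python
import itertools

LEVELS = (0.0, 2.25, 4.5)  # hypothetical discrete naltrexone dosages (mg)

def simulate(y0, doses, gain=0.4, decay=0.8):
    """First-order symptom response to a dose sequence (lower y is better)."""
    y, traj = y0, []
    for d in doses:
        y = decay * y - gain * d
        traj.append(y)
    return traj

def best_sequence(y0, target=0.0, horizon=3):
    """Enumerate every discrete dose plan and return the lowest-cost one."""
    def cost(seq):
        return sum((y - target) ** 2 for y in simulate(y0, seq))
    return min(itertools.product(LEVELS, repeat=horizon), key=cost)

# Starting from an elevated symptom level, the search front-loads the doses.
print(best_sequence(y0=5.0))
```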
Contributors: Deśapāṇḍe, Sunīla (Author) / Rivera, Daniel E. (Thesis advisor) / Si, Jennie (Committee member) / Tsakalis, Konstantinos (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
As a term and method that is rapidly gaining popularity, Building Information Modeling (BIM) is under the scrutiny of many building professionals who question its potential benefits on their projects. A relevant and accepted calculation methodology and baseline for properly evaluating BIM's benefits have not been established; thus there are mixed perspectives and opinions of the benefits of BIM, creating a general misunderstanding of the expected outcomes. The purpose of this thesis was to develop a more complete methodology for analyzing the benefits of BIM and to apply this methodology to recent projects to quantify outcomes, resulting in a more holistic framework of BIM and its impacts on project efficiency. From the literature, a framework calculation model to determine the value of BIM is developed and presented. The developed model is applied via case studies within a large industrial setting where similar projects are evaluated, some implementing BIM and some using traditional non-BIM approaches. Cost (investment) metrics were considered along with benefit (return) metrics. The return metrics were requests for information (RFIs), change orders, and duration improvements; the investment metrics were design and construction costs. The methodology was tested against three separate cases, and results on the returns and investments are presented. The findings indicate that in the tool installation department of semiconductor manufacturing, there is high potential for BIM benefits to be realized. The evidence also suggests that actual returns and investments will vary with each project.
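The return metrics named above (RFIs, change orders, duration) reduce to percent-change comparisons between a BIM project and a non-BIM baseline; a minimal sketch follows, with numbers invented for illustration rather than taken from the thesis's case studies.

```python
def percent_change(baseline, observed):
    """Percent change vs. baseline; negative means the BIM project improved."""
    return 100.0 * (observed - baseline) / baseline

# Hypothetical counts for one matched pair of projects.
non_bim = {"rfis": 120, "change_orders": 30, "duration_days": 200}
bim     = {"rfis": 80,  "change_orders": 18, "duration_days": 170}

for metric in non_bim:
    print(metric, round(percent_change(non_bim[metric], bim[metric]), 1))
# rfis -33.3, change_orders -40.0, duration_days -15.0
```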
Contributors: Barlish, Kristen Caroline (Author) / Sullivan, Kenneth T. (Thesis advisor) / Kashiwagi, Dean T. (Committee member) / Badger, William W. (Committee member) / Arizona State University (Publisher)
Created: 2011