Matching Items (169)
Description

Intimate coupling of TiO2 photocatalysis and biodegradation (ICPB) offers potential for degrading biorecalcitrant and toxic organic compounds much better than is possible with conventional wastewater treatments. This study reports on using a novel sponge-type, TiO2-coated biofilm carrier that shows significant adherence of TiO2 to its exterior and the ability to accumulate biomass in its interior (protected from UV light and free radicals). First, this carrier was tested for ICPB in a continuous-flow photocatalytic circulating-bed biofilm reactor (PCBBR) to mineralize a biorecalcitrant organic compound: 2,4,5-trichlorophenol (TCP). Four mechanisms possibly acting in ICPB were tested separately: TCP adsorption, UV photolysis, photocatalysis, and biodegradation. The carrier exhibited strong TCP adsorption, while photolysis was negligible. Photocatalysis produced TCP-degradation products that could be mineralized, and the strong adsorption of TCP to the carrier enhanced biodegradation by relieving toxicity. Validating the ICPB concept, biofilm inside the carriers was protected from UV light and free radicals. ICPB significantly lowered the diversity of the bacterial community, but five genera known to biodegrade chlorinated phenols were markedly enriched. Second, decolorization and mineralization of reactive dyes by ICPB were investigated on a refined TiO2-coated biofilm carrier in a PCBBR. Two typical reactive dyes, Reactive Black 5 (RB5) and Reactive Yellow 86 (RY86), showed similar first-order kinetics when photocatalytically decolorized at low pH (~4-5); decolorization was inhibited at neutral pH in the presence of phosphate or carbonate buffer, presumably due to electrostatic repulsion from negatively charged surface sites on TiO2, radical scavenging by phosphate or carbonate, or both. In the PCBBR, photocatalysis alone with TiO2-coated carriers removed RB5 and COD by 97% and 47%, respectively.
Addition of biofilm inside the macroporous carriers maintained a similar RB5 removal efficiency, but COD removal increased to 65%, which is evidence of ICPB despite the low pH. A proposed ICPB pathway for RB5 suggests that a major intermediate, a naphthol derivative, was responsible for most of the residual COD. Finally, three low-temperature sintering methods, called O, D, and DN, were compared based on photocatalytic efficiency and TiO2 adherence. The DN method had the best TiO2-coating properties and produced a successful carrier for ICPB of RB5 in a PCBBR.
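The first-order decolorization kinetics mentioned above can be sketched numerically. This is a minimal Python illustration assuming a purely hypothetical rate constant; the study's fitted values are not given here:

```python
import math

def residual_fraction(k_per_min: float, t_min: float) -> float:
    """First-order decay: C(t)/C0 = exp(-k * t)."""
    return math.exp(-k_per_min * t_min)

def half_life(k_per_min: float) -> float:
    """Time for the dye concentration to fall to half its initial value."""
    return math.log(2) / k_per_min

# Hypothetical rate constant for illustration only (not from the study).
k = 0.023  # 1/min
print(f"Fraction remaining after 60 min: {residual_fraction(k, 60):.2f}")
print(f"Half-life: {half_life(k):.0f} min")
```

Under first-order kinetics, a plot of ln(C/C0) against time is linear with slope -k, which is how such rate constants are typically fitted from decolorization data.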
Contributors: Li, Guozheng (Author) / Rittmann, Bruce E. (Thesis advisor) / Halden, Rolf (Committee member) / Krajmalnik-Brown, Rosa (Committee member) / Arizona State University (Publisher)
Created: 2011
Description

With the advent of technologies such as web services, service-oriented architecture, and cloud computing, modern organizations have to deal with policies such as firewall policies to secure their networks and XACML (eXtensible Access Control Markup Language) policies for controlling access to critical information and resources. Management of these policies is an extremely important task for avoiding unintended security leakages via illegal accesses while maintaining proper access to services for legitimate users. Managing and maintaining access control policies manually over a long period of time is an error-prone task due to their inherently complex nature. Existing tools and mechanisms for policy management use different approaches for different types of policies. This thesis presents a generic framework that provides a unified approach for policy analysis and management of different types of policies. The generic approach captures the common semantics and structure of different access control policies with the notion of a policy ontology. The policy ontology representation is then utilized for effectively analyzing and managing the policies. This thesis also discusses a proof-of-concept implementation of the proposed generic framework and demonstrates how efficiently this unified approach can be used for analysis and management of different types of access control policies.
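The idea of one structure capturing the common semantics of firewall and XACML policies can be illustrated with a toy model. This is an illustrative sketch only, not the thesis's actual ontology; the `PolicyRule` structure and `conflicts` check are assumptions:

```python
from dataclasses import dataclass
from enum import Enum

class Effect(Enum):
    PERMIT = "permit"
    DENY = "deny"

@dataclass(frozen=True)
class PolicyRule:
    """A generic rule capturing semantics shared by firewall and XACML
    policies: who (subject) may do what (action) to which resource."""
    subject: str
    resource: str
    action: str
    effect: Effect

def conflicts(a: PolicyRule, b: PolicyRule) -> bool:
    """Two rules conflict if they match the same request but differ in effect."""
    return (a.subject, a.resource, a.action) == (b.subject, b.resource, b.action) \
        and a.effect is not b.effect

# A firewall-style rule and an XACML-style rule expressed in one model.
fw = PolicyRule("10.0.0.0/8", "port:22", "connect", Effect.PERMIT)
xacml = PolicyRule("role:auditor", "record/123", "read", Effect.PERMIT)
bad = PolicyRule("10.0.0.0/8", "port:22", "connect", Effect.DENY)
print(conflicts(fw, bad))
```

Once heterogeneous policies are normalized into a common representation like this, analyses such as conflict detection can be written once and applied to every policy type.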
Contributors: Kulkarni, Ketan (Author) / Ahn, Gail-Joon (Thesis advisor) / Yau, Stephen S. (Committee member) / Huang, Dijiang (Committee member) / Arizona State University (Publisher)
Created: 2011
Description

This dissertation is focused on building scalable Attribute Based Security Systems (ABSS), including efficient and privacy-preserving attribute based encryption schemes and applications to group communications and cloud computing. First, a Constant Ciphertext Policy Attribute Based Encryption (CCP-ABE) scheme is proposed. Existing Attribute Based Encryption (ABE) schemes usually incur large, linearly increasing ciphertext. The proposed CCP-ABE dramatically reduces the ciphertext to a small, constant size; it is the first ABE scheme to achieve constant ciphertext size. The proposed CCP-ABE scheme is also fully collusion-resistant, such that users cannot combine their attributes to elevate their decryption capacity. Next, efficient ABE schemes are applied to construct optimal group communication and broadcast encryption schemes. An attribute based Optimal Group Key (OGK) management scheme that attains communication-storage optimality without collusion vulnerability is presented. Then, a novel broadcast encryption model, Attribute Based Broadcast Encryption (ABBE), is introduced, which exploits the many-to-many nature of attributes to dramatically reduce the storage complexity from linear to logarithmic and to enable expressive attribute based access policies. Privacy issues are also considered and addressed in ABSS. First, a hidden-policy-based ABE scheme is proposed to protect receivers' privacy by hiding the access policy. Second, a new concept, Gradual Identity Exposure (GIE), is introduced to address the restrictions of hidden-policy-based ABE schemes. GIE's approach is to reveal the receivers' information gradually by allowing ciphertext recipients to decrypt the message using their possessed attributes one by one. If the receiver does not possess one attribute in this procedure, the remaining attributes stay hidden.
Compared to hidden-policy-based solutions, GIE provides significant performance improvement in terms of reducing both computation and communication overhead. Last but not least, ABSS are incorporated into mobile cloud computing scenarios. In the proposed secure mobile cloud data management framework, lightweight mobile devices can securely outsource expensive ABE operations and data storage to untrusted cloud service providers. The reported scheme includes two components: (1) a Cloud-Assisted Attribute-Based Encryption/Decryption (CA-ABE) scheme and (2) an Attribute-Based Data Storage (ABDS) scheme that achieves information-theoretical optimality.
Contributors: Zhou, Zhibin (Author) / Huang, Dijiang (Thesis advisor) / Yau, Sik-Sang (Committee member) / Ahn, Gail-Joon (Committee member) / Reisslein, Martin (Committee member) / Arizona State University (Publisher)
Created: 2011
Description

To address sustainability issues in wastewater treatment (WWT), Siemens Water Technologies (SWT) has designed a "hybrid" process that couples common activated sludge (AS) and anaerobic digestion (AD) technologies with the novel concepts of AD sludge recycle and biosorption. At least 85% of the hybrid's AD sludge is recycled to the AS process, providing additional sorbent for influent particulate chemical oxygen demand (PCOD) biosorption in contact tanks. Biosorbed PCOD is transported to the AD, where it is converted to methane. The aim of this study is to provide mass balance and microbial community analysis (MCA) of SWT's two hybrid and one conventional pilot-plant trains, along with mathematical modeling of the hybrid process, including a novel model of biosorption. A detailed mass balance was performed on each tank and the overall system. The mass balance data support the conclusion that the hybrid process is more sustainable: it produces 1.5 to 5.5 times more methane and 50 to 83% less sludge than the conventional train. The hybrid's superior performance is driven by solid retention times (SRTs) 4 to 8 times longer than those of the conventional trains. However, the conversion of influent COD to methane was low, at 15 to 22%, and neither train exhibited significant nitrification or denitrification. Data were inconclusive as to the role of biosorption in the processes. MCA indicated the presence of Archaea and nitrifiers throughout both systems. However, it is inconclusive how active the Archaea and nitrifiers are under anoxic, aerobic, and anaerobic conditions. Mathematical modeling confirms the hybrid process produces 4 to 20 times more methane and 20 to 83% less sludge than the conventional train under various operating conditions. Neither process removes more than 25% of the influent nitrogen or converts more than 13% to nitrogen gas, due to biomass washout in the contact tank and short SRTs in the stabilization tank.
In addition, a mathematical relationship was developed to describe PCOD biosorption through adsorption to biomass and floc entrapment. Ultimately, process performance is more heavily influenced by the higher AD SRTs attained when sludge is recycled through the system and less influenced by the inclusion of biosorption kinetics.
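The COD-to-methane conversion figures above can be translated into gas volumes using the standard theoretical yield of roughly 0.35 L CH4 (at STP) per gram of COD converted. A short sketch with a hypothetical influent load (the load value is an assumption, not from the study):

```python
# Theoretical methane yield: ~0.35 L CH4 (STP) per g COD converted.
CH4_L_PER_G_COD = 0.35

def methane_volume_L(influent_cod_g: float, fraction_to_ch4: float) -> float:
    """Volume of methane produced from the fraction of influent COD converted."""
    return influent_cod_g * fraction_to_ch4 * CH4_L_PER_G_COD

# Hypothetical influent load; the 15-22% conversion range is from the study.
cod_load = 1000.0  # g COD/day
low = methane_volume_L(cod_load, 0.15)
high = methane_volume_L(cod_load, 0.22)
print(f"CH4 production: {low:.1f}-{high:.1f} L/day")
```

This kind of conversion is what lets the mass balance compare trains on methane output rather than raw COD numbers.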
Contributors: Young, Michelle Nichole (Author) / Rittmann, Bruce E. (Thesis advisor) / Fox, Peter (Committee member) / Krajmalnik-Brown, Rosa (Committee member) / Arizona State University (Publisher)
Created: 2011
Description

Action language C+ is a formalism for describing properties of actions, which is based on nonmonotonic causal logic. The definite fragment of C+ is implemented in the Causal Calculator (CCalc), which is based on the reduction of nonmonotonic causal logic to propositional logic. This thesis describes the language of CCalc in terms of answer set programming (ASP), based on the translation of nonmonotonic causal logic to formulas under the stable model semantics. I designed a standard library which describes the constructs of the input language of CCalc in terms of ASP, allowing a simple modular method to represent CCalc input programs in the language of ASP. Using the combination of system F2LP and answer set solvers, this method achieves functionality close to that of CCalc while taking advantage of answer set solvers to yield efficient computation that is orders of magnitude faster than CCalc for many benchmark examples. In support of this, I created an automated translation system Cplus2ASP that implements the translation and encoding method and automatically invokes the necessary software to solve the translated input programs.
Contributors: Casolary, Michael (Author) / Lee, Joohyung (Thesis advisor) / Ahn, Gail-Joon (Committee member) / Baral, Chitta (Committee member) / Arizona State University (Publisher)
Created: 2011
Description

In order to catch the smartest criminals in the world, digital forensics examiners need a means of collaborating and sharing information with each other and outside experts that is not prohibitively difficult. However, standard operating procedures and the rules of evidence generally disallow the use of the collaboration software and techniques that are currently available because they do not fully adhere to the dictated procedures for the handling, analysis, and disclosure of items relating to cases. The aim of this work is to conceive and design a framework that provides a completely new architecture that 1) can perform fundamental functions that are common and necessary to forensic analyses, and 2) is structured such that it is possible to include collaboration-facilitating components without changing the way users interact with the system sans collaboration. This framework is called the Collaborative Forensic Framework (CUFF). CUFF is constructed from four main components: Cuff Link, Storage, Web Interface, and Analysis Block. With the Cuff Link acting as a mediator between components, CUFF is flexible in both the method of deployment and the technologies used in implementation. The details of a realization of CUFF are given, which uses a combination of Java, the Google Web Toolkit, Django with Apache for a RESTful web service, and an Ubuntu Enterprise Cloud using Eucalyptus. The functionality of CUFF's components is demonstrated by the integration of an acquisition script designed for Android OS-based mobile devices that use the YAFFS2 file system. While this work has obvious application to examination labs which work under the mandate of judicial or investigative bodies, security officers at any organization would benefit from the improved ability to cooperate in electronic discovery efforts and internal investigations.
Contributors: Mabey, Michael Kent (Author) / Ahn, Gail-Joon (Thesis advisor) / Yau, Stephen S. (Committee member) / Huang, Dijiang (Committee member) / Arizona State University (Publisher)
Created: 2011
Description

The digital forensics community has neglected email forensics as a process, despite the fact that email remains an important tool in the commission of crime. Current forensic practices focus mostly on disk forensics, while email forensics is left as an analysis task stemming from that practice. As there is no well-defined process to be used for email forensics, the comprehensiveness, extensibility of tools, uniformity of evidence, usefulness in collaborative/distributed environments, and consistency of investigations are hindered. At present, there exists little support for discovering, acquiring, and representing web-based email, despite its widespread use. To remedy this, a systematic process for email forensics is presented that includes discovering, acquiring, and representing web-based email, is integrated into the normal forensic analysis workflow, and accommodates the distinct characteristics of email evidence. This process focuses on detecting the presence of non-obvious artifacts related to email accounts, retrieving the data from the service provider, and representing email in a well-structured format based on existing standards. As a result, developers and organizations can collaboratively create and use analysis tools that can analyze email evidence from any source in the same fashion, and examiners can access additional data relevant to their forensic cases. Finally, an extensible framework implementing this novel process-driven approach is presented in an attempt to address the problems of comprehensiveness, extensibility, uniformity, collaboration/distribution, and consistency within forensic investigations involving email evidence.
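The notion of representing email from any source in one well-structured format can be illustrated with Python's standard `email` library. The record schema below is a hypothetical sketch, not the representation defined in this work:

```python
from email.message import EmailMessage
import json

def to_forensic_record(msg: EmailMessage, source: str) -> dict:
    """Flatten an acquired message into a uniform, tool-agnostic record so
    that evidence from any provider can be analyzed in the same fashion."""
    return {
        "source": source,  # e.g. provider or artifact path
        "from": str(msg.get("From", "")),
        "to": str(msg.get("To", "")),
        "subject": str(msg.get("Subject", "")),
        "date": str(msg.get("Date", "")),
        "body": msg.get_content() if not msg.is_multipart() else "",
    }

# Build a sample message and normalize it.
msg = EmailMessage()
msg["From"] = "alice@example.com"
msg["To"] = "bob@example.com"
msg["Subject"] = "Quarterly report"
msg.set_content("Attached as discussed.")
record = to_forensic_record(msg, "webmail:acct-1")
print(json.dumps(record, indent=2))
```

Tools that all consume a uniform record like this can analyze webmail, local mail stores, and provider exports interchangeably, which is the interoperability goal the process aims at.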
Contributors: Paglierani, Justin W (Author) / Ahn, Gail-Joon (Thesis advisor) / Yau, Stephen S. (Committee member) / Santanam, Raghu T (Committee member) / Arizona State University (Publisher)
Created: 2013
Description

Answer Set Programming (ASP) is one of the most prominent and successful knowledge representation paradigms. The success of ASP is due to its expressive non-monotonic modeling language and its efficient computational methods originating from building propositional satisfiability solvers. The wide adoption of ASP has motivated several extensions to its modeling language in order to enhance expressivity, such as incorporating aggregates and interfaces with ontologies. Also, in order to overcome the grounding bottleneck of computation in ASP, there is increasing interest in integrating ASP with other computing paradigms, such as Constraint Programming (CP) and Satisfiability Modulo Theories (SMT). Due to the non-monotonic nature of the ASP semantics, such enhancements turned out to be non-trivial, and the existing extensions are not fully satisfactory. We observe that one main reason for the difficulties is rooted in the propositional semantics of ASP, which is limited in handling first-order constructs (such as aggregates and ontologies) and functions (such as constraint variables in CP and SMT) in natural ways. This dissertation presents a unifying view on these extensions by viewing them as instances of formulas with generalized quantifiers and intensional functions. We extend the first-order stable model semantics by Ferraris, Lee, and Lifschitz to allow generalized quantifiers, which cover aggregates, DL-atoms, constraints, and SMT theory atoms as special cases. Using this unifying framework, we study and relate different extensions of ASP. We also present a tight integration of ASP with SMT, based on which we enhance the action language C+ to handle reasoning about continuous changes. Our framework yields a systematic approach to studying and extending non-monotonic languages.
Contributors: Meng, Yunsong (Author) / Lee, Joohyung (Thesis advisor) / Ahn, Gail-Joon (Committee member) / Baral, Chitta (Committee member) / Fainekos, Georgios (Committee member) / Lifschitz, Vladimir (Committee member) / Arizona State University (Publisher)
Created: 2013
Description

This work focuses on a generalized assessment of source zone natural attenuation (SZNA) at chlorinated aliphatic hydrocarbon (CAH) impacted sites. Given the number of such sites and the technical challenges for cleanup, there is a need for an SZNA method at CAH-impacted sites. The method anticipates that decision makers will be interested in the following questions: (1) Is SZNA occurring, and what processes contribute? (2) What are the current SZNA rates? (3) What are the longer-term implications? The approach is macroscopic and uses multiple lines of evidence. An in-depth application of the generalized, non-site-specific method over multiple site events, with sampling refinement approaches applied for improving SZNA estimates, at three CAH-impacted sites is presented, with a focus on discharge rates for four events over approximately three years (Site 1: 2.9, 8.4, 4.9, and 2.8 kg/yr as PCE; Site 2: 1.6, 2.2, 1.7, and 1.1 kg/yr as PCE; Site 3: 570, 590, 250, and 240 kg/yr as TCE). When applying the generalized CAH-SZNA method, it is likely that different practitioners will not sample a site similarly, especially regarding sampling density on a groundwater transect. Calculation of SZNA rates is affected by contaminant spatial variability with reference to transect sampling intervals and density, with variations in either resulting in different mass discharge estimates. The effects on discharge estimates of varied sampling densities and spacings were examined to develop heuristic sampling guidelines with practical site sampling densities; the guidelines aim to reduce the variability in discharge estimates due to different sampling approaches and to improve confidence in SZNA rates, allowing decision makers to place the rates in perspective and determine a course of action based on remedial goals. Finally, bench-scale testing was used to address longer-term questions, specifically the nature and extent of source architecture. A rapid in-situ disturbance method was developed using a bench-scale apparatus.
The approach allows for rapid identification of the presence of DNAPL using several common pilot scale technologies (ISCO, air-sparging, water-injection) and can identify relevant source architectural features (ganglia, pools, dissolved source). Understanding of source architecture and identification of DNAPL containing regions greatly enhances site conceptualization models, improving estimated time frames for SZNA, and possibly improving design of remedial systems.
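The transect-based mass discharge estimates discussed above follow the usual flux summation M = sum over i of C_i * q_i * A_i. A minimal sketch with hypothetical transect data (not the study's measurements):

```python
def mass_discharge_kg_per_yr(conc_mg_L, q_m_yr, area_m2):
    """Sum per-cell contaminant fluxes across a groundwater transect.

    conc_mg_L: concentration at each sampling cell (1 mg/L = 1 g/m^3)
    q_m_yr:    specific discharge through each cell (m/yr)
    area_m2:   transect area represented by each cell (m^2)
    """
    grams_per_year = sum(c * q * a for c, q, a in zip(conc_mg_L, q_m_yr, area_m2))
    return grams_per_year / 1000.0  # g/yr -> kg/yr

# Hypothetical five-point transect, for illustration only.
conc = [0.05, 0.80, 2.10, 0.60, 0.02]  # mg/L TCE
q = [15.0] * 5                         # m/yr, uniform specific discharge
area = [25.0] * 5                      # m^2 per sampling cell
print(f"Estimated discharge: {mass_discharge_kg_per_yr(conc, q, area):.2f} kg/yr")
```

Because the estimate is a sum over cells, coarser sampling that misses the high-concentration core of the plume can change the result substantially, which is exactly the sensitivity the heuristic sampling guidelines address.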
Contributors: Ekre, Ryan (Author) / Johnson, Paul Carr (Thesis advisor) / Rittmann, Bruce (Committee member) / Krajmalnik-Brown, Rosa (Committee member) / Arizona State University (Publisher)
Created: 2013
Description

Microbial electrochemical cells (MXCs) are promising platforms for bioenergy production from renewable resources. In these systems, specialized anode-respiring bacteria (ARB) deliver electrons from oxidation of organic substrates to the anode of an MXC. While much progress has been made in understanding the microbiology, physiology, and electrochemistry of well-studied model ARB such as Geobacter and Shewanella, tremendous potential exists for MXCs as microbiological platforms for exploring novel ARB. This dissertation introduces approaches for selective enrichment and characterization of phototrophic, halophilic, and alkaliphilic ARB. An enrichment scheme based on manipulation of poised anode potential, light, and nutrient availability led to current generation that responded negatively to light. Analysis of phototrophically enriched communities suggested essential roles for green sulfur bacteria and halophilic ARB in electricity generation. Reconstruction of light-responsive current generation could be successfully achieved using cocultures of anode-respiring Geobacter and phototrophic Chlorobium isolated from the MXC enrichments. Experiments lacking exogenously supplied organic electron donors indicated that Geobacter could produce a measurable current from stored photosynthate in the dark. Community analysis of phototrophic enrichments also identified members of the novel genus Geoalkalibacter as potential ARB. Electrochemical characterization of two haloalkaliphilic, non-phototrophic Geoalkalibacter spp. showed that these bacteria were in fact capable of producing high current densities (4-8 A/m2) and using higher organic substrates under saline or alkaline conditions. The success of these selective enrichment approaches and community analyses in identifying and understanding novel ARB capabilities invites further use of MXCs as robust platforms for fundamental microbiological investigations.
Contributors: Badalamenti, Jonathan P (Author) / Krajmalnik-Brown, Rosa (Thesis advisor) / Garcia-Pichel, Ferran (Committee member) / Rittmann, Bruce E. (Committee member) / Torres, César I (Committee member) / Vermaas, Willem (Committee member) / Arizona State University (Publisher)
Created: 2013