Matching Items (4)
Description
In this dissertation, two interrelated problems of service-based systems (SBS) are addressed: protecting users' data confidentiality from service providers, and managing the performance of multiple workflows in SBS. Current SBSs pose serious limitations to protecting users' data confidentiality. Since users' sensitive data is sent in unencrypted form to remote machines owned and operated by third-party service providers, there is a risk of unauthorized use of that data by the providers. Although many techniques exist for protecting users' data from outside attackers, there is currently no effective way to protect users' sensitive data from service providers themselves. This dissertation presents an approach for protecting the confidentiality of users' data from service providers and for ensuring that service providers cannot collect users' confidential data while the data is processed or stored in cloud computing systems. The approach has four major features: (1) separation of software service providers and infrastructure service providers, (2) hiding data-owner information, (3) data obfuscation, and (4) software module decomposition and distributed execution. Because the approach includes software module decomposition and distributed execution, it is essential to allocate server resources in the SBS to each software module effectively in order to manage the overall performance of its workflows. A resource allocation approach for SBS is therefore presented that adaptively allocates server resources to software modules at runtime so as to satisfy the performance requirements of multiple workflows. Experimental results show that the dynamic resource allocation approach can substantially increase the throughput of an SBS and that the optimal resource allocation can be found in polynomial time.
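As a hedged illustration of the adaptive resource allocation idea described in this abstract (the dissertation's actual algorithm is not detailed here), the sketch below proportionally reallocates a server's CPU shares among software modules according to each module's measured load and response-time target. The module names, numbers, and the proportional heuristic are assumptions for illustration only.

```python
# Hypothetical sketch: adaptively reallocate a server's CPU shares among
# software modules so that modules with heavier load and tighter response-time
# targets receive proportionally more capacity. The heuristic and all names
# are illustrative assumptions, not the dissertation's actual algorithm.

def allocate_shares(total_capacity, modules):
    """modules: dict of module name -> (measured_load, response_time_target)."""
    # Weight each module by load relative to its target: heavier load and
    # tighter targets both increase the weight.
    weights = {
        name: load / max(target, 1e-9)
        for name, (load, target) in modules.items()
    }
    total_weight = sum(weights.values()) or 1.0
    return {
        name: total_capacity * w / total_weight
        for name, w in weights.items()
    }


if __name__ == "__main__":
    # Example workflow modules (names and numbers are made up).
    modules = {
        "obfuscate": (120.0, 0.2),   # requests/sec, target seconds
        "transform": (80.0, 0.5),
        "aggregate": (40.0, 1.0),
    }
    print(allocate_shares(total_capacity=16.0, modules=modules))
```

In a runtime setting, the same reallocation would be re-run whenever measured loads change, which is the "adaptive" aspect the abstract refers to.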
ContributorsAn, Ho Geun (Author) / Yau, Sik-Sang (Thesis advisor) / Huang, Dijiang (Committee member) / Ahn, Gail-Joon (Committee member) / Santanam, Raghu (Committee member) / Arizona State University (Publisher)
Created2012
Description
Rapid advances in sensor and information technology have resulted in an environment rich in both spatial and temporal data, creating a pressing need to develop novel statistical methods and associated computational tools to extract intelligent knowledge and informative patterns from these massive datasets. The statistical challenges posed by such datasets lie in their complex structures, such as high dimensionality, hierarchy, multi-modality, heterogeneity, and data uncertainty. Beyond the statistical challenges, the associated computational approaches are also essential for achieving efficiency, effectiveness, and numerical stability in practice. At the same time, recent developments in statistics and machine learning, such as sparse learning and transfer learning, as well as traditional methodologies that still hold potential, such as multi-level models, all shed light on how to address these complex datasets in a statistically powerful and computationally efficient way. In this dissertation, we identify four kinds of general complex datasets, namely "high-dimensional datasets", "hierarchically-structured datasets", "multi-modality datasets", and "data uncertainties", which are ubiquitous in many domains, such as biology, medicine, neuroscience, health care delivery, and manufacturing. We describe the development of novel statistical models to analyze complex datasets falling under these four categories and show how these models can be applied to real-world applications, such as Alzheimer's disease research, nursing care processes, and manufacturing.
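Sparse learning is mentioned above as one tool for high-dimensional datasets; as a small, hedged illustration of that general idea (not the dissertation's specific models), the sketch below fits an L1-penalized (lasso) regression to a synthetic high-dimensional dataset in which only a few coefficients are truly non-zero. The data, penalty value, and use of scikit-learn are assumptions for demonstration.

```python
# Illustrative sketch of sparse learning on a synthetic high-dimensional
# dataset (many features, few samples). Not the dissertation's actual models;
# data and penalty strength are arbitrary assumptions.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p = 100, 500                                   # few samples, many features
X = rng.standard_normal((n, p))
true_coef = np.zeros(p)
true_coef[:5] = [3.0, -2.0, 1.5, 2.5, 4.0]        # only a handful matter
y = X @ true_coef + 0.1 * rng.standard_normal(n)

model = Lasso(alpha=0.1).fit(X, y)                # L1 penalty zeroes most coefficients
recovered = np.flatnonzero(model.coef_)
print("non-zero coefficients recovered at indices:", recovered)
```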
ContributorsHuang, Shuai (Author) / Li, Jing (Thesis advisor) / Askin, Ronald (Committee member) / Ye, Jieping (Committee member) / Runger, George C. (Committee member) / Arizona State University (Publisher)
Created2012
Description
Prognostics and health management (PHM) is a method that permits the reliability of a system to be evaluated under its actual application conditions. This work involved developing a robust system to determine the onset of failure. Using the data from the PHM experiment, a model was developed to estimate prognostic features and build a condition-based system from the measured prognostics. To enable prognostics, a framework was developed to extract the load parameters required for damage assessment from irregular time-load data. As part of the methodology, a database engine was built to maintain and monitor the experimental data. The framework significantly reduces the time-load data without compromising the features that are essential for damage estimation. A failure-precursor-based approach was used for remaining-life prognostics. The developed system has a throughput of 4 MB/sec, with 90% of latencies within 100 ms. This work hence provides a survey of prognostic frameworks, a prognostics framework architecture and design approach, and a robust system implementation.
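The abstract describes reducing irregular time-load data while preserving the features needed for damage estimation. As a hedged illustration of that general idea (not the dissertation's actual framework), the sketch below keeps only the turning points, i.e. the local peaks and valleys, of a load history, a common preprocessing step before fatigue cycle counting. The data and function names are assumptions.

```python
# Illustrative sketch: reduce an irregularly sampled time-load history to its
# turning points (local peaks and valleys), which preserve the reversals
# typically needed for fatigue/damage estimation. This is a generic technique,
# not the dissertation's specific framework; all names and data are assumed.

def turning_points(times, loads):
    """Return (time, load) pairs where the load reverses direction."""
    kept = [(times[0], loads[0])]
    for i in range(1, len(loads) - 1):
        prev_delta = loads[i] - loads[i - 1]
        next_delta = loads[i + 1] - loads[i]
        if prev_delta * next_delta < 0:      # sign change => peak or valley
            kept.append((times[i], loads[i]))
    kept.append((times[-1], loads[-1]))
    return kept


if __name__ == "__main__":
    # Irregularly spaced samples (made-up values).
    t = [0.0, 0.3, 0.9, 1.0, 1.7, 2.5, 2.6, 3.4]
    load = [0.0, 2.0, 5.0, 4.0, 1.0, 3.0, 6.0, 2.0]
    print(turning_points(t, load))   # far fewer points, same load reversals
```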
ContributorsVaradarajan, Gayathri (Author) / Liu, Huan (Thesis advisor) / Ye, Jieping (Committee member) / Davalcu, Hasan (Committee member) / Arizona State University (Publisher)
Created2010
Description
Access control has been historically recognized as an effective technique for ensuring that computer systems preserve important security properties. Recently, attribute-based access control (ABAC) has emerged as a new paradigm to provide access mediation by leveraging the concept of attributes: observable properties that become relevant under a certain security context and are exhibited by the entities normally involved in the mediation process, namely, end-users and protected resources. Also recently, independently-run organizations from the private and public sectors have recognized the benefits of engaging in multi-disciplinary research collaborations that involve sharing sensitive proprietary resources such as scientific data, networking capabilities and computation time, and have recognized ABAC as the paradigm that suits their needs for restricting the way such resources are to be shared with each other. In such a setting, a robust yet flexible access mediation scheme is crucial to guarantee participants are granted access to such resources in a safe and secure manner.

However, no consensus exists in the literature on a formal model that clearly defines how the components of ABAC should interact with each other, so that the rigorous study of security properties can be effectively pursued. This dissertation proposes an approach tailored to provide a well-defined and formal definition of ABAC, including a description of how attributes exhibited by different independent organizations are to be leveraged for mediating access to shared resources, by allowing collaborating parties to engage in federations for the specification, discovery, evaluation and communication of attributes, policies, and access mediation decisions. In addition, a software assurance framework is introduced to support the correct construction of enforcement mechanisms implementing our approach by leveraging validation and verification techniques based on software assertions, namely, design by contract (DBC) and behavioral interface specification languages (BISL). Finally, this dissertation also proposes a distributed trust framework that allows for exchanging recommendations on the perceived reputations of members of our proposed federations, in such a way that the level of trust of previously-unknown participants can be properly assessed for the purposes of access mediation.
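As a hedged illustration of attribute-based access mediation as described above (the dissertation's formal model and federation protocol are not given in this abstract), the sketch below evaluates a toy ABAC-style policy over user and resource attributes. The attribute names, policy rules, and decision function are assumptions for illustration only.

```python
# Illustrative ABAC-style check: an access decision is derived from attributes
# of the requesting user and the protected resource. The attributes and policy
# below are made-up assumptions, not the dissertation's formal model.

def permit(user_attrs, resource_attrs, action):
    """Toy policy: members of a federation project may read that project's
    data; writing additionally requires the 'curator' role."""
    same_project = user_attrs.get("project") == resource_attrs.get("project")
    if action == "read":
        return same_project
    if action == "write":
        return same_project and "curator" in user_attrs.get("roles", ())
    return False


if __name__ == "__main__":
    user = {"project": "genomics-fed", "roles": ("researcher",)}
    dataset = {"project": "genomics-fed", "sensitivity": "high"}
    print(permit(user, dataset, "read"))    # True: same federation project
    print(permit(user, dataset, "write"))   # False: user lacks 'curator' role
```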
ContributorsRubio Medrano, Carlos Ernesto (Author) / Ahn, Gail-Joon (Thesis advisor) / Doupe, Adam (Committee member) / Zhao, Ziming (Committee member) / Santanam, Raghu (Committee member) / Huang, Dijiang (Committee member) / Arizona State University (Publisher)
Created2016