Matching Items (4)
Description

In this dissertation, two interrelated problems of service-based systems (SBS) are addressed: protecting users' data confidentiality from service providers, and managing the performance of multiple workflows in SBS. Current SBSs have serious limitations in protecting users' data confidentiality. Since users' sensitive data is sent in unencrypted form to remote machines owned and operated by third-party service providers, there are risks of unauthorized use of the users' sensitive data by service providers. Although there are many techniques for protecting users' data from outside attackers, currently there is no effective way to protect users' sensitive data from service providers. In this dissertation, an approach is presented to protect the confidentiality of users' data from service providers and to ensure that service providers cannot collect users' confidential data while the data is processed or stored in cloud computing systems. The approach has four major features: (1) separation of software service providers and infrastructure service providers, (2) hiding the information of the owners of data, (3) data obfuscation, and (4) software module decomposition and distributed execution.

Since the approach to protecting users' data confidentiality includes software module decomposition and distributed execution, it is very important to effectively allocate the resources of servers in an SBS to each of the software modules to manage the overall performance of workflows in the SBS. An approach to resource allocation for SBS is presented that adaptively allocates the system resources of servers to their software modules at runtime in order to satisfy the performance requirements of multiple workflows in the SBS. Experimental results show that the dynamic resource allocation approach can substantially increase the throughput of an SBS and that the optimal resource allocation can be found in polynomial time.
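As a rough illustration of the kind of runtime decision such a resource-allocation scheme makes, the sketch below splits a server's CPU capacity among software modules in proportion to their current demand. The module names, demand figures, and proportional-share rule are hypothetical examples for intuition only, not the dissertation's polynomial-time allocation algorithm.

```python
# Hypothetical sketch: proportionally allocate a server's CPU capacity to the
# software modules of several workflows at runtime. The demand numbers and the
# proportional-share rule are illustrative only, not the dissertation's method.

def allocate_cpu(capacity, module_demands):
    """Split `capacity` among modules in proportion to their current demand.

    module_demands: dict mapping module name -> estimated CPU demand
    returns: dict mapping module name -> allocated CPU share
    """
    total = sum(module_demands.values())
    if total == 0:
        return {m: 0.0 for m in module_demands}
    return {m: capacity * d / total for m, d in module_demands.items()}

# Example: three modules sharing a server with 8 cores.
demands = {"parse": 2.0, "obfuscate": 5.0, "aggregate": 1.0}
print(allocate_cpu(8, demands))  # {'parse': 2.0, 'obfuscate': 5.0, 'aggregate': 1.0}
```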
ContributorsAn, Ho Geun (Author) / Yau, Sik-Sang (Thesis advisor) / Huang, Dijiang (Committee member) / Ahn, Gail-Joon (Committee member) / Santanam, Raghu (Committee member) / Arizona State University (Publisher)
Created2012
Description

As integrated technologies are scaling down, there is an increasing trend in the process, voltage and temperature (PVT) variations of highly integrated RF systems. Accounting for these variations during the design phase requires a tremendous amount of time to predict RF performance and optimize it accordingly. Thus, there is an increasing gap between the need to relax the RF performance requirements at the design phase for rapid development and the need to provide high-performance and low-cost RF circuits that function under PVT variations. No matter how carefully designed, RF integrated circuits (ICs) manufactured with advanced technology nodes necessitate lengthy post-production calibration and test cycles with expensive RF test instruments. Hence, design-for-test (DFT) is proposed for low-cost and fast measurement of performance parameters during both post-production and in-field operation. For example, built-in self-test (BIST) is a DFT solution for low-cost on-chip measurement of RF performance parameters. In this dissertation, three aspects of automated test and calibration, including the DFT mathematical model, BIST hardware, and built-in calibration, are covered for RF front-end blocks.

First, the theoretical foundation of a post-production test of RF integrated phased-array antennas is established by developing a mathematical model to measure gain and phase mismatches between antenna elements without any electrical contact. The proposed technique is fast and cost-efficient and uses near-field measurements of the power radiated from the antennas; hence, it requires only a single test setup, is easy to implement, and takes little test time, which makes it viable for industrialized high-volume integrated-IC production test.
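To give a numerical flavor of how gain and phase mismatch between two antenna elements can be recovered from contactless, power-only near-field readings, the sketch below assumes three measurements: element 1 driven alone, element 2 driven alone, and both driven together with the same excitation. This three-measurement scheme and the simple probe model are assumptions made for illustration; they are not the dissertation's mathematical model.

```python
import numpy as np

# Hypothetical illustration: relative gain and phase of two antenna elements
# from power-only near-field readings (no electrical contact). The measurement
# scheme (element 1 on, element 2 on, both on) is an assumption.

def mismatch_from_powers(p1, p2, p12):
    """p1, p2: probe power with only element 1 / element 2 driven.
    p12: probe power with both elements driven with the same excitation."""
    gain_ratio = np.sqrt(p2 / p1)                            # |a2| / |a1|
    cos_dphi = (p12 - p1 - p2) / (2.0 * np.sqrt(p1 * p2))
    # power-only data leaves the sign of the phase difference unresolved
    phase_diff = np.degrees(np.arccos(np.clip(cos_dphi, -1.0, 1.0)))
    return gain_ratio, phase_diff

# Self-check with a simulated pair: element 2 has 0.9x gain and +20 deg phase.
a1, a2 = 1.0, 0.9 * np.exp(1j * np.radians(20))
g, dphi = mismatch_from_powers(abs(a1)**2, abs(a2)**2, abs(a1 + a2)**2)
print(g, dphi)   # ~0.9, ~20.0
```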

Second, a BIST model intended for the characterization of I/Q offset, gain and phase mismatch of IQ transmitters without relying on external equipment is introduced. The proposed BIST method is based on on-chip amplitude measurement, as in prior works; however, here the variations in the BIST circuit do not affect the target parameter estimation accuracy, since the measurements are designed to be relative. The BIST circuit is implemented in 130 nm technology and can be used for post-production and in-field calibration.
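To make the "relative measurement" idea concrete, the following sketch assumes a simplified model in which three on-chip amplitude readings are taken: I channel alone, Q channel alone, and both channels driven equally. Because every reading is scaled by the same unknown detector gain, only ratios of the readings are used, so the gain and phase mismatch estimates are insensitive to BIST-circuit variations. The signal model and measurement sequence are illustrative assumptions (DC offsets are ignored), not the dissertation's BIST architecture.

```python
import numpy as np

# Hypothetical model: TX output = I*cos(wc*t) - g*Q*sin(wc*t + phi),
# where g is the gain mismatch and phi the phase mismatch. An on-chip
# amplitude detector with unknown gain k reports k * envelope amplitude.
# I/Q dc offsets are ignored for brevity.

def iq_mismatch_from_amplitudes(a_i, a_q, a_iq):
    """a_i: amplitude with I-only drive, a_q: Q-only, a_iq: equal I and Q drive.
    The detector gain cancels because only ratios of the readings are used."""
    g = a_q / a_i                                   # gain mismatch estimate
    r = (a_iq / a_i) ** 2                           # = 1 + g^2 - 2*g*sin(phi)
    phi = np.degrees(np.arcsin((1 + g**2 - r) / (2 * g)))
    return g, phi

# Self-check: simulate g = 1.05, phi = 3 deg, arbitrary detector gain k = 0.7.
g_true, phi_true, k = 1.05, np.radians(3.0), 0.7
a_i = k * 1.0
a_q = k * g_true
a_iq = k * np.sqrt(1 + g_true**2 - 2 * g_true * np.sin(phi_true))
print(iq_mismatch_from_amplitudes(a_i, a_q, a_iq))  # ~ (1.05, 3.0)
```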

Third, a programmable low-noise amplifier (LNA) is proposed that is adaptable to different application scenarios depending on the specification requirements. Its performance is optimized with regard to the required specifications, e.g., distance, power consumption, BER, data rate, etc. Statistical modeling is used to capture the correlations among measured performance parameters and calibration modes for fast adaptation. A machine learning technique is used to capture these non-linear correlations and build the probability distribution of a target parameter based on measurement results of the correlated parameters. The proposed concept is demonstrated by embedding built-in tuning knobs in an LNA design in 130 nm technology. The tuning knobs are carefully designed to provide independent combinations of important performance parameters such as gain and linearity. A minimum number of switches is used to provide the desired tuning range without the need for an external analog input.
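One simple way to picture the statistical-modeling step is a regression that, from cheaply measured correlated parameters, yields a predictive distribution for a target parameter that is expensive to measure directly. The sketch below uses an ordinary linear least-squares fit with a Gaussian residual model; the feature names, the fake data, and the linear/Gaussian assumptions are illustrative and not the specific machine learning model used in the dissertation.

```python
import numpy as np

# Hypothetical sketch: predict a target LNA parameter (e.g. IIP3) from
# correlated, easier-to-measure parameters (e.g. gain, dc current) and report
# a probability distribution (mean, std) rather than a point estimate.

rng = np.random.default_rng(0)

# Fake training data from 200 characterized chips: columns = [gain_dB, idc_mA].
X = rng.normal([15.0, 4.0], [0.8, 0.3], size=(200, 2))
iip3 = -5.0 + 0.6 * X[:, 0] - 1.2 * X[:, 1] + rng.normal(0, 0.2, 200)

# Fit a linear model and estimate the residual spread.
A = np.column_stack([X, np.ones(len(X))])
coef, *_ = np.linalg.lstsq(A, iip3, rcond=None)
sigma = np.std(iip3 - A @ coef)

def predict_iip3(gain_db, idc_ma):
    """Return (mean, std) of the predicted IIP3 distribution for a new chip."""
    mean = coef[0] * gain_db + coef[1] * idc_ma + coef[2]
    return mean, sigma

print(predict_iip3(15.5, 3.9))   # roughly (-0.4, 0.2) for this fake data
```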
ContributorsShafiee, Maryam (Author) / Ozev, Sule (Thesis advisor) / Diaz, Rodolfo (Committee member) / Ogras, Umit Y. (Committee member) / Bakkaloglu, Bertan (Committee member) / Arizona State University (Publisher)
Created2018
Description

Cyber systems, including IoT (Internet of Things), are increasingly being used ubiquitously to vastly improve the efficiency and reduce the cost of critical application areas, such as finance, transportation, defense, and healthcare. Over the past two decades, computing efficiency and hardware cost have dramatically improved. These improvements have made cyber systems omnipotent, controlling many aspects of human lives. Emerging trends in successful cyber system breaches have shown increasing sophistication in attacks and that attackers are no longer limited by resources, including human and computing power. Most existing cyber defense systems for IoT systems have two major issues: (1) they do not incorporate human user behavior(s) and preferences in their approaches, and (2) they do not continuously learn from the dynamic environment and effectively adapt to thwart sophisticated cyber-attacks. Consequently, the security solutions generated may not be usable or implementable by the user(s), thereby drastically reducing the effectiveness of these security solutions.

In order to address these major issues, a comprehensive approach is presented to securing ubiquitous smart devices in an IoT environment by incorporating probabilistic human user behavioral inputs. The approach includes techniques to (1) protect the controller device(s) [smart phone or tablet] by continuously learning and authenticating the legitimate user in the background based on touch-screen finger gestures, without requiring users to provide their finger gesture inputs intentionally for training purposes, and (2) efficiently configure IoT devices through the controller device(s), in conformance with the probabilistic human user behavior(s) and preferences, to effectively adapt the IoT devices to the changing environment. The effectiveness of the approach is demonstrated with experiments based on collected user behavioral data and simulations.
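As a deliberately simplified picture of background gesture-based authentication, the sketch below keeps running per-feature statistics of the legitimate user's swipes and scores each new swipe with a Gaussian log-likelihood; low scores flag a possible impostor. The feature set, the independent-Gaussian model, and the threshold are illustrative assumptions, not the probabilistic model developed in the dissertation.

```python
import numpy as np

# Hypothetical sketch: continuous, passive authentication from touch gestures.
# Each swipe is summarized by a few features; the profile is updated in the
# background, and a log-likelihood score decides whether the swipe looks like
# the legitimate user. Features and threshold are illustrative only.

class GestureProfile:
    def __init__(self, n_features, threshold=-12.0):
        self.mean = np.zeros(n_features)
        self.var = np.ones(n_features)
        self.count = 0
        self.threshold = threshold

    def update(self, features):
        """Running mean/variance update from a swipe believed to be legitimate."""
        features = np.asarray(features, dtype=float)
        self.count += 1
        delta = features - self.mean
        self.mean += delta / self.count
        self.var += (delta * (features - self.mean) - self.var) / self.count

    def score(self, features):
        """Independent-Gaussian log-likelihood of a new swipe."""
        var = np.maximum(self.var, 1e-6)
        z2 = (np.asarray(features, dtype=float) - self.mean) ** 2 / var
        return float(-0.5 * np.sum(z2 + np.log(2 * np.pi * var)))

    def is_legitimate(self, features):
        return self.score(features) > self.threshold

# Example features per swipe: [duration_s, length_px, mean_pressure, speed_px_s]
profile = GestureProfile(n_features=4)
swipes = np.random.default_rng(1).normal([0.3, 600, 0.5, 2000],
                                          [0.05, 50, 0.05, 200], (100, 4))
for swipe in swipes:
    profile.update(swipe)
print(profile.is_legitimate([0.31, 610, 0.52, 1950]))   # likely True
print(profile.is_legitimate([0.9, 150, 0.95, 300]))     # likely False
```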
ContributorsBuduru, Arun Balaji (Author) / Yau, Sik-Sang (Thesis advisor) / Ahn, Gail-Joon (Committee member) / Davulcu, Hasan (Committee member) / Zhang, Yanchao (Committee member) / Arizona State University (Publisher)
Created2016
Description

The Internet of Things (IoT) is emerging as part of the infrastructures for advancing a large variety of applications involving connections of many intelligent devices, leading to smart communities. Due to the severe limitations of the computing resources of IoT devices, it is common to offload tasks of various applications that require substantial computing resources to computing systems with sufficient resources, such as servers, cloud systems, and/or data centers, for processing. However, this offloading method suffers from both high latency and network congestion in the IoT infrastructures.

Recently, edge computing has emerged to reduce the negative impacts of offloading tasks to remote computing systems. As edge computing is in close proximity to IoT devices, it can reduce the latency of task offloading and alleviate network congestion. Yet, edge computing has its drawbacks, such as the limited computing resources of some edge computing devices and the unbalanced loads among these devices. In order to effectively explore the potential of edge computing to support IoT applications, it is necessary to have efficient task management and load balancing in edge computing networks.

In this dissertation research, an approach is presented to periodically distribute tasks within the edge computing network while satisfying the quality-of-service (QoS) requirements of the tasks. The QoS requirements include task completion deadlines and security requirements. The approach aims to maximize the number of tasks that can be accommodated in the edge computing network, with consideration of the tasks' priorities. This goal is achieved through joint optimization of computing resource allocation and network bandwidth provisioning. Evaluation results show that the approach increases the number of tasks that can be accommodated in the edge computing network and improves the efficiency of resource utilization.
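A very small sketch of the flavor of this scheduling decision is shown below: tasks are considered in priority order and placed on an edge node only if the node's remaining CPU and link bandwidth can finish the transfer and computation before the deadline and the node meets the task's security level. The greedy rule and the data model are illustrative assumptions; the dissertation formulates the problem as a joint optimization rather than a simple greedy pass.

```python
# Hypothetical greedy sketch of priority-aware task placement on edge nodes,
# respecting completion deadlines, a security level, CPU capacity, and link
# bandwidth. Illustrative only; not the dissertation's joint optimization.
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    priority: int        # higher = more important
    cycles: float        # required CPU cycles
    data_bits: float     # input data to transfer
    deadline_s: float
    security_level: int  # minimum node security level required

@dataclass
class Node:
    name: str
    cpu_hz: float        # remaining CPU capacity
    bw_bps: float        # remaining link bandwidth
    security_level: int

def place_tasks(tasks, nodes):
    placement = {}
    for t in sorted(tasks, key=lambda t: -t.priority):
        for n in nodes:
            if n.security_level < t.security_level or n.cpu_hz <= 0 or n.bw_bps <= 0:
                continue
            finish = t.data_bits / n.bw_bps + t.cycles / n.cpu_hz
            if finish <= t.deadline_s:
                placement[t.name] = n.name
                # crude bookkeeping: reserve the capacity this task consumes
                n.cpu_hz -= t.cycles / t.deadline_s
                n.bw_bps -= t.data_bits / t.deadline_s
                break
    return placement

tasks = [Task("cam-analytics", 2, 2e9, 8e6, 1.0, 1),
         Task("fw-update", 1, 5e8, 4e7, 5.0, 2)]
nodes = [Node("edge-1", 4e9, 1e8, 2), Node("edge-2", 2e9, 5e7, 1)]
print(place_tasks(tasks, nodes))   # both tasks fit on edge-1 in this toy case
```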
ContributorsSong, Yaozhong (Author) / Yau, Sik-Sang (Thesis advisor) / Huang, Dijiang (Committee member) / Sarjoughian, Hessam S. (Committee member) / Zhang, Yanchao (Committee member) / Arizona State University (Publisher)
Created2018