Matching Items (1,190)

Description
Service-based software (SBS) systems are software systems consisting of services based on the service-oriented architecture (SOA). Each service in an SBS system provides partial functionality and collaborates with other services in workflows to provide the functionality required by the system. These services may be developed and/or owned by different entities and physically distributed across the Internet. Compared with traditional software components, which are usually designed specifically for their target systems and bound tightly to them, services have standardized interfaces and communication protocols, which allow SBS systems to support late binding and provide better interoperability, greater flexibility in dynamic business logic, and higher fault tolerance. The development process of SBS systems can be divided into three major phases: 1) SBS specification, 2) service discovery and matching, and 3) service composition and workflow execution. This dissertation focuses on the second phase and presents a privacy-preserving service discovery and ranking approach for multiple user QoS requirements. This approach helps service providers register services and service users search for services through public but untrusted service directories while protecting their privacy against the directories. The service directories can match the registered services with service requests but do not learn any information about either. The approach also enforces access control on services during the matching process, which prevents unauthorized users from discovering services. After the service directories match a set of services that satisfy a service user's functionality requirements, the approach further considers the user's QoS requirements in two steps. First, it optimizes services' QoS by making tradeoffs among various QoS aspects according to the user's QoS requirements and preferences. Second, it ranks services based on how well they satisfy the user's QoS requirements, helping service users select the most suitable services for developing their SBSs.
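The two-step QoS handling described above can be illustrated with a minimal sketch. The dissertation's actual optimization and ranking algorithms are not reproduced here; the attribute names (`latency`, `availability`), the equal weights, and the simple additive-weighting scheme are all illustrative assumptions:

```python
def normalize(values, higher_is_better):
    """Min-max normalize a list of raw QoS values onto [0, 1],
    flipping the scale when lower raw values are better."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [1.0] * len(values)
    if higher_is_better:
        return [(v - lo) / (hi - lo) for v in values]
    return [(hi - v) / (hi - lo) for v in values]

def rank_services(services, weights, higher_is_better):
    """Score each candidate service by a weighted sum of its
    normalized QoS attributes, then return (name, score) best-first."""
    attrs = list(weights)
    norm = {a: normalize([s[a] for s in services], higher_is_better[a])
            for a in attrs}
    scores = [sum(weights[a] * norm[a][i] for a in attrs)
              for i in range(len(services))]
    order = sorted(range(len(services)), key=lambda i: -scores[i])
    return [(services[i]["name"], scores[i]) for i in order]
```

With hypothetical weights of 0.5 each on latency (lower is better) and availability (higher is better), a service that balances both ranks ahead of one that excels on only a single attribute.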
Contributors: Yin, Yin (Author) / Yau, Stephen S. (Thesis advisor) / Candan, Kasim (Committee member) / Dasgupta, Partha (Committee member) / Santanam, Raghu (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
This thesis proposes a novel approach to establishing a trust model in a social network scenario based on users' emails. Email is one of the most important social connections nowadays. By analyzing email exchange activities among users, a social network trust model can be established to judge the trust rate between any two users. The trust checking process is divided into two steps: local checking and remote checking. Local checking directly contacts the email server to calculate the trust rate based on the user's own email communication history. Remote checking is a distributed computing process that enlists the user's social network friends to build the trust rate together. The email-based trust model is built upon a cloud computing framework called MobiCloud. Inside MobiCloud, each user occupies a virtual machine that can directly communicate with others. Based on this feature, the distributed trust model is implemented as a combination of local analysis and remote analysis in the cloud. Experiment results show that the trust evaluation model can give an accurate trust rate even in a small-scale social network without many social connections. With this trust model, security in both social network services and email communication can be improved.
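A rough illustration of how a local check and a remote check might combine into one trust rate follows. The thesis's actual formulas are not given here; the volume cap, the symmetry (balance) term, and the blending weight `alpha` are all hypothetical:

```python
def local_trust(sent, received, cap=50):
    """Local check: score the direct email history between two users.
    Symmetric exchange counts more than one-way mail, and capping the
    volume keeps heavy mailers from saturating the score."""
    volume = min(sent + received, cap) / cap
    peak = max(sent, received)
    balance = (min(sent, received) / peak) if peak else 0.0
    return 0.5 * volume + 0.5 * balance

def combined_trust(local_score, friend_scores, alpha=0.6):
    """Remote check: blend the local score with the mean of the trust
    rates reported back by the user's friends (each computed in the
    friend's own MobiCloud virtual machine, in this sketch)."""
    if not friend_scores:
        return local_score
    remote = sum(friend_scores) / len(friend_scores)
    return alpha * local_score + (1 - alpha) * remote
```

The fallback to the purely local score when no friends respond mirrors the abstract's claim that the model still works in sparse social networks.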
Contributors: Zhong, Yunji (Author) / Huang, Dijiang (Thesis advisor) / Dasgupta, Partha (Committee member) / Syrotiuk, Violet (Committee member) / Arizona State University (Publisher)
Created: 2011
Description

Reducing the amount of error and introduced data variability increases the accuracy of Western blot results. In this study, different methods of normalization for loading differences and data alignment were explored with respect to their impact on Western blot results. GAPDH was compared to the LI-COR Revert total protein stain as a loading control. The impact of normalizing data to a control condition, which is commonly done to align Western blot data distributed over several immunoblots, was also investigated. Specifically, this study addressed whether normalization to a small subset of distinct controls on each immunoblot increases pooled data variability compared to a larger set of controls. Protein expression data for NOX-2 and SOD-2 from a study investigating the protective role of the bradykinin type 1 receptor in angiotensin-II induced left ventricle remodeling were used to address these questions but are also discussed in the context of the original study. The comparison of GAPDH and Revert total protein stain as a loading control was done by assessing their correlation and comparing how they affected protein expression results. Additionally, the impact of treatment on GAPDH was investigated. To assess how normalization to different combinations of controls influences data variability, protein data were normalized to the average of 5 controls, the average of 2 controls, or an average vehicle and the results by treatment were compared. The results of this study demonstrated that GAPDH expression is not affected by angiotensin-II or bradykinin type 1 receptor antagonist R-954 and is a less sensitive loading control compared to Revert total protein stain. Normalization to the average of 5 controls tended to reduce pooled data variability compared to 2 controls. 
Lastly, the results of this study provided preliminary evidence that R-954 does not alter the expression of NOX-2 or SOD-2 to an expression profile that would be expected to explain the protection it confers against Ang-II induced left ventricle remodeling.
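The normalization steps compared in the study come down to simple per-lane arithmetic. A minimal sketch with hypothetical signal values follows (the study's actual quantification pipeline, e.g. LI-COR imaging software, is not shown):

```python
def loading_correct(target, total_protein):
    """Correct for loading differences: divide each lane's target-band
    signal by that lane's total protein stain signal."""
    return [t / p for t, p in zip(target, total_protein)]

def normalize_to_controls(signals, control_indices):
    """Express each lane's loading-corrected signal relative to the
    mean of the designated control lanes on the same blot, so data
    from separate blots can be pooled."""
    controls = [signals[i] for i in control_indices]
    ref = sum(controls) / len(controls)
    return [s / ref for s in signals]
```

Normalizing to the average of more control lanes (5 rather than 2) simply makes `ref` a less noisy estimate, which is consistent with the reduced pooled-data variability reported above.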

Contributors: Siegel, Matthew Marat (Author) / Mills, Jeremy (Thesis director) / Sweazea, Karen (Committee member) / Hale, Taben (Committee member) / School of Molecular Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created: 2021-05
Description

The Electoral College, the current electoral system in the U.S., operates on a winner-take-all, or First Past the Post (FPTP), principle, where the candidate with the most votes wins. Despite being the current system, the Electoral College is problematic. According to Lani Guinier in Tyranny of the Majority, "the winner-take-all principle invariably wastes some votes" (121). This means that the majority group gets all of the power in an election while the votes of the minority groups are wasted and hold little to no significance. Additionally, FPTP systems reinforce a two-party system: even when neither dominant party satisfies the majority of the electorate's needs, voters are forced to choose between the two. Moreover, voting for a third-party candidate only hurts the voter, since it takes votes away from the party they might otherwise support and hands victory to the party they prefer least, ensuring that the two-party system is inescapable. A winner-take-all system therefore provides the electorate with neither fair nor proportional representation and creates voter disenfranchisement: it offers voters very few choices that appeal to their needs and forces them to choose a candidate they dislike. There are, however, alternative voting systems that remedy these issues, such as a ranked voting system, in which voters rank their candidate choices in order of preference, or a proportional voting system, in which a political party acquires a number of seats based on the proportion of votes it receives from the voter base. Given these alternatives, we will implement a software simulation of one of these systems to demonstrate how it works in contrast to FPTP, and thereby provide evidence of how these alternative systems could work in practice and in place of the current electoral system.
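A ranked-voting count of the kind such a simulation would demonstrate can be sketched in a few lines. This is a generic instant-runoff illustration, not the thesis's actual simulation code:

```python
def instant_runoff(ballots):
    """Instant-runoff (ranked-choice) count: each round, every ballot
    counts for its highest-ranked surviving candidate; if no candidate
    holds a majority, the one with the fewest votes is eliminated and
    those ballots transfer to each voter's next surviving choice."""
    candidates = {c for ballot in ballots for c in ballot}
    while True:
        tally = {c: 0 for c in candidates}
        for ballot in ballots:
            for choice in ballot:
                if choice in candidates:
                    tally[choice] += 1
                    break  # only the top surviving choice counts
        total = sum(tally.values())
        leader = max(tally, key=tally.get)
        if tally[leader] * 2 > total or len(candidates) == 1:
            return leader
        candidates.discard(min(tally, key=tally.get))
```

Unlike FPTP, a third-choice ballot here is not wasted: when its first pick is eliminated, the vote transfers rather than handing the election to the voter's least-preferred party.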

Contributors: Summers, Jack Gillespie (Co-author) / Martin, Autumn (Co-author) / Burger, Kevin (Thesis director) / Voorhees, Matthew (Committee member) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2021-05
Description

System and software verification is a vital component in the development and reliability of cyber-physical systems, especially in critical domains where the margin of error is minimal. In the case of autonomous driving systems (ADS), the vision perception subsystem is a necessity to ensure correct maneuvering through the environment and identification of objects. The challenge posed in perception systems involves verifying the accuracy and rigidity of detections. The use of Spatio-Temporal Perception Logic (STPL) enables the user to express requirements for the perception system to verify, validate, and ensure its behavior; however, a drawback of STPL is its accessibility. It is limited to individuals with expert-level knowledge of temporal and spatial logics, and the formally written requirements become quite verbose as more restrictions are imposed. In this thesis, I propose a domain-specific language (DSL) catered to Spatio-Temporal Perception Logic that enables non-expert users to capture requirements for perception subsystems while reducing the need for an experienced background in the logic. The DSL is built upon the formal language with two abstractions. The main abstraction captures simple programming statements that are translated to lower-level STPL expressions accepted by the testing monitor. The STPL DSL provides a seamless interface for writing formal expressions while maintaining the power and expressiveness of STPL. These translated, equivalent expressions can direct a standard for perception systems that ensures safety and reduces the risks involved in ill-formed detections.

Contributors: Anderson, Jacob (Author) / Fainekos, Georgios (Thesis director) / Yang, Yezhou (Committee member) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2021-05
Description

This thesis identifies and explains two main problems students face during their internships. The first relates to feeling bored at internships due to the simplicity of projects or a lack of work. From the interviews conducted, several strategies to avoid this boredom were developed, including having employers design education plans for interns to further their knowledge of programs such as Excel during their downtime. The second problem discovered focuses on the gap between what is taught in school and what is expected of interns in practice. This thesis identifies several opportunities for improvement in education, as well as strategies for handling the feeling of being overwhelmed on intern projects due to a lack of knowledge.

Contributors: Komarnyckyj, Katya (Author) / Byrne, Jared (Thesis director) / Crawford, Cassidy (Committee member) / School of Molecular Sciences (Contributor) / Department of Finance (Contributor) / Barrett, The Honors College (Contributor)
Created: 2021-05
Description

This thesis proposes hardware and software security enhancements to the robotic explorer of a capstone team, in collaboration with the NASA Psyche Mission Student Collaborations program. The NASA Psyche Mission, launching in 2022 and reaching the metallic asteroid of the same name in 2026, will explore from orbit what is hypothesized to be remnant core material of an early planet, potentially providing key insights to planet formation. Following this initial mission, it is possible there would be scientists and engineers interested in proposing a mission to land an explorer on the surface of Psyche to further document various properties of the asteroid. As a proposal for a second mission, an interdisciplinary engineering and science capstone team at Arizona State University designed and constructed a robotic explorer for the hypothesized surfaces of Psyche, capable of semi-autonomously navigating simulated surfaces to collect scientific data from onboard sensors. A critical component of this explorer is the command and data handling subsystem, and as such, the security of this system, though outside the scope of the capstone project, remains a crucial consideration. This thesis proposes the pairing of Trusted Platform Module (TPM) technology for increased hardware security and the implementation of SELinux (Security Enhanced Linux) for increased software security for Earth-based testing as well as space-ready missions.

Contributors: Anderson, Kelly Joanne (Author) / Bowman, Catherine (Thesis director) / Kozicki, Michael (Committee member) / Electrical Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2021-05
Description

CubeSats can encounter a myriad of difficulties in space, such as cosmic rays, temperature issues, and loss of control. By creating better, more reliable software, these problems can be mitigated, increasing the chance of mission success. This research sets out to answer the question "How do we create reliable flight software for CubeSats?" by providing a concentrated list of the best flight software development practices. The CubeSat used in this research is the Deployable Optical Receiver Aperture (DORA) CubeSat, a 3U CubeSat that seeks to demonstrate optical communication data rates of 1 Gbps over long distances. We present an analysis of many of the flight software development practices currently in use in the industry, including those from industry leader NASA, and identify three key flight software development areas of focus: memory, concurrency, and error handling. Within each of these areas, best practices were defined. These practices were also developed using experience from the creation of flight software for the DORA CubeSat, where they drove the design and testing of the system. We analyze DORA's effectiveness in the three areas of focus and discuss how following the identified best practices helped create a more reliable flight software system for the DORA CubeSat.
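As one illustration of the error-handling area of focus, a common pattern is to check every fallible call and bound its retries rather than letting faults propagate into the main loop. The sketch below is generic and shown in Python for brevity; the function names and retry policy are illustrative assumptions, not DORA's actual flight code:

```python
import time

def read_with_retry(read_fn, retries=3, delay_s=0.0):
    """Bounded-retry wrapper: a transient sensor fault is retried a
    fixed number of times, and a persistent fault is reported back as
    an explicit status instead of an unhandled exception."""
    for _ in range(retries):
        try:
            return read_fn(), None  # success: (value, no fault)
        except IOError:
            time.sleep(delay_s)  # brief back-off before retrying
    return None, "FAULT"  # persistent failure: caller must handle
```

Returning an explicit fault code forces the caller to decide on a safe fallback (e.g. entering a safe mode), which is the kind of deliberate failure path that makes flight software predictable under radiation-induced upsets.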

Contributors: Hoffmann, Zachary Christian (Author) / Chavez-Echeagaray, Maria Elena (Thesis director) / Jacobs, Daniel (Committee member) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2021-05
Description

Over 40% of adults in the United States are considered obese. Obesity is known to cause abnormal metabolic effects and lead to other negative health consequences. Interestingly, differences in metabolism and contractile performance between obese and healthy-weight individuals are associated with differences in skeletal muscle fiber type composition between these groups. Each fiber type is characterized by unique metabolic and contractile properties, which are largely determined by the myosin heavy chain (MHC) isoform or isoform combination that the fiber expresses. In previous studies, SDS-PAGE single fiber analysis has been utilized as a method to determine MHC isoform distribution and single fiber type distribution in skeletal muscle. Herein, a methodological approach to analyzing MHC isoform and fiber type distribution in skeletal muscle was fine-tuned for use in human and rodent studies. In the future, this revised methodology will be implemented to evaluate the effects of obesity and exercise on the phenotypic fiber type composition of skeletal muscle.

Contributors: Ohr, Jalonna Rose (Author) / Katsanos, Christos (Thesis director) / Tucker, Derek (Committee member) / Serrano, Nathan (Committee member) / School of Life Sciences (Contributor) / School of Molecular Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created: 2021-05
Description

Since 1975, the prevalence of obesity has nearly tripled around the world. In 2016, 39% of adults, or 1.9 billion people, were considered overweight, and 13% of adults, or 650 million people, were considered obese. Furthermore, cardiovascular disease remains the leading cause of death for adults in the United States, with 655,000 people dying from related conditions and consequences each year. Including fiber in one's dietary regimen has been shown to greatly improve health outcomes in both of these areas. However, little literature is available on the effects of corn-based fiber, especially detailing the individual components of the grain itself. The purpose of this preliminary study was to test the differences in influence on both LDL-cholesterol and triglycerides among treatments based on whole-grain corn flour, refined corn flour, and 50% refined corn flour + 50% corn bran derived from whole-grain cornmeal (excellent fiber) in healthy overweight (BMI ≥ 25.0 kg/m2) adults (ages 18 - 70) with high LDL cholesterol (LDL ≥ 120 mg/dL). Twenty participants, ages 18 - 64 (10 males, 10 females), were involved. Data were derived from blood draws taken before and after each of the three treatments, as well as before and after each treatment's washout period. A general linear model was used to assess the effect of corn products on circulating concentrations of LDL-cholesterol and triglycerides. The model showed that the whole-grain corn flour and the 50% refined corn flour + 50% corn bran derived from whole-grain cornmeal treatments produced a similar, greater benefit in reducing LDL-cholesterol. However, the whole-grain flour, refined flour, and bran-based fiber treatments did not influence the triglyceride levels of the participants throughout this study. Further research is needed to elucidate the effects of these fiber items on cardiometabolic disease markers in the long term and with a larger sample size.

Contributors: Le, Justin (Author) / Whisner, Corrie (Thesis director) / Ortega Santos, Carmen (Committee member) / School of Molecular Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created: 2021-05