Matching Items (63)

Item 151948
Description

A smart home system (SHS) is an information system aimed at realizing home automation. An SHS can connect to almost any kind of electronic or electric device used in a home so that the devices can be controlled and monitored centrally. Today's technology also allows homeowners to control and monitor the SHS installed in their homes remotely, which is typically realized by giving the SHS network access. Although network access brings many conveniences to homeowners, it also exposes the SHS to more security threats than ever before. As a result, when designing an SHS, the security threats it might face should be given careful consideration. Security threats can be addressed properly by understanding them and knowing which parts of the system should be protected against them first. This leads to the idea of addressing the security threats an SHS might face at the requirements engineering level. Following this idea, this paper proposes a systematic approach to generating the security requirements specifications for an SHS. It can be viewed as the first step toward a complete SHS security requirements engineering process.
Contributors: Xu, Rongcao (Author) / Ghazarian, Arbi (Thesis advisor) / Bansal, Ajay (Committee member) / Lindquist, Timothy (Committee member) / Arizona State University (Publisher)
Created: 2013
Item 150509
Description

Gathering and managing software requirements, known as Requirements Engineering (RE), is a significant and basic step of the Software Development Life Cycle (SDLC). Any error or defect introduced during the RE step propagates to later steps of the SDLC, where resolving it is more costly than resolving a defect introduced in any other step. In order to produce better-quality software, the requirements have to be free of defects. Verification and Validation (V&V) of requirements is performed to improve their quality by applying the V&V process to the Software Requirement Specification (SRS) document. Focusing V&V of software requirements on a specific domain helps improve quality. A large database of software requirements from software projects of different domains was created. Software requirements from commercial applications are the focus of this project; other domains (embedded, mobile, e-commerce, etc.) can be the focus of future efforts. V&V is done to inspect the requirements and improve their quality. Inspections are done to detect defects in the requirements, and three approaches for inspecting software requirements are discussed: ad-hoc techniques, checklists, and scenario-based techniques. A more systematic, domain-specific technique is presented for performing V&V of requirements.
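As a small, hypothetical illustration of the checklist-based inspection technique mentioned in the abstract (the checklist terms and requirement statements below are invented, not drawn from the thesis database), the sketch flags requirement statements that contain common ambiguity indicators:

```python
# A hypothetical checklist-based inspection: flag requirement statements that
# contain weak or ambiguous wording. The checklist and requirements below are
# invented for illustration and are not from the thesis database.
AMBIGUITY_CHECKLIST = ["should", "may", "fast", "user-friendly", "etc.",
                       "as appropriate", "if possible", "easy"]

requirements = [
    "The system shall respond to a search query within 2 seconds.",
    "The interface should be user-friendly and fast.",
]

def inspect(requirement):
    """Return the checklist items found in a requirement statement."""
    text = requirement.lower()
    return [term for term in AMBIGUITY_CHECKLIST if term in text]

for req in requirements:
    findings = inspect(req)
    status = "DEFECTS: " + ", ".join(findings) if findings else "OK"
    print(f"{status:35s} | {req}")
```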
Contributors: Chughtai, Rehman (Author) / Ghazarian, Arbi (Thesis advisor) / Bansal, Ajay (Committee member) / Millard, Bruce (Committee member) / Arizona State University (Publisher)
Created: 2012
Item 161626
Description

Calculus is an important subject that students need to succeed in before venturing into STEM majors. This thesis focuses on the early detection of at-risk students in a calculus course, which can enable the proper intervention that might help them succeed in the course. Calculus has high failure rates, which is corroborated by data collected from Arizona State University showing that 40% of the 3,266 students whose data were used failed their calculus course. This thesis proposes to utilize educational big data to detect students at high risk of failure, so that early detection and subsequent intervention can be useful. Some existing studies similar to this thesis use open-scale data that are lower in data count and perform predictions on low-impact Massive Open Online Course (MOOC) based courses. In this thesis, an automatic detection method for academically at-risk students is developed using learning management system (LMS) activity data along with student information system (SIS) data from Arizona State University (ASU) for the course Calculus for Engineers I (MAT 265). The method detects students at risk by employing machine learning to identify key features that contribute to a student's success. This thesis also proposes a new technique to convert button-click data into a button-click sequence that can be used as input to classifiers. In addition, advancements in the Natural Language Processing field can be adopted, using methods such as part-of-speech (POS) tagging and tools such as Facebook fastText word embeddings, to convert these button-click sequences into numeric vectors before feeding them into the classifiers. The thesis proposes two preprocessing techniques and evaluates them on three different machine learning ensembles to determine their performance across the two modalities of the class.
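The abstract describes converting button-click sequences into numeric vectors with fastText embeddings before classification. The following sketch, with invented event names and using gensim rather than any code from the thesis, shows one way that step could look:

```python
# A minimal sketch (not the thesis code) of turning LMS button-click
# sequences into fixed-length numeric vectors with fastText embeddings.
# Event names and vector dimensions are hypothetical.
from gensim.models import FastText
import numpy as np

# Each student session is a "sentence" of button-click tokens.
click_sequences = [
    ["open_gradebook", "view_assignment", "submit_quiz"],
    ["view_assignment", "watch_video", "view_assignment", "submit_quiz"],
]

# Train small fastText embeddings over the click vocabulary.
model = FastText(sentences=click_sequences, vector_size=32, window=3,
                 min_count=1, epochs=20)

def sequence_vector(sequence):
    """Average the embeddings of the clicks in one sequence."""
    vectors = [model.wv[token] for token in sequence]
    return np.mean(vectors, axis=0)

X = np.vstack([sequence_vector(s) for s in click_sequences])
print(X.shape)  # (num_sequences, 32) -- ready to feed a classifier
```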
Contributors: Dileep, Akshay Kumar (Author) / Bansal, Ajay (Thesis advisor) / Cunningham, James (Committee member) / Acuna, Ruben (Committee member) / Arizona State University (Publisher)
Created: 2021
Item 161629
Description

One persisting problem in Massive Open Online Courses (MOOCs) is the issue of student dropout from these courses. Predicting student dropout from MOOC courses can identify the factors responsible for such an event, and it can further initiate intervention before the event occurs to increase student success in the MOOC. Different approaches and various features are available for predicting student dropout in MOOC courses. In this research, data derived from the self-paced math course 'College Algebra and Problem Solving', offered on the Open edX MOOC platform by Arizona State University (ASU) from 2016 to 2020, were considered. This research aims to predict the dropout of students from a MOOC course given a set of features engineered from the learning of students in a day. The machine learning (ML) model used is Random Forest (RF), and the model is evaluated using validation metrics such as accuracy, precision, recall, F1-score, Area Under the Curve (AUC), and the Receiver Operating Characteristic (ROC) curve. The average rate of student learning progress was found to have more impact than other features. The model developed can predict the dropout or continuation of students on any given day in the MOOC course with an accuracy of 87.5%, AUC of 94.5%, precision of 88%, recall of 87.5%, and F1-score of 87.5%. The contributing features and interactions were explained using Shapley values for the model's predictions. The features engineered in this research are predictive of student dropout and could be used for similar courses to predict student dropout. This model can also help in making interventions at a critical time to help students succeed in this MOOC course.
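As a hedged illustration of the modeling and evaluation pipeline named in the abstract (Random Forest plus the listed metrics, with Shapley-value explanations), the sketch below uses scikit-learn on synthetic features; it is not the thesis code and the feature set is invented:

```python
# A minimal sketch (assumed feature set, not the thesis pipeline) of a
# Random Forest dropout classifier evaluated with the metrics listed above.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import (accuracy_score, precision_score, recall_score,
                             f1_score, roc_auc_score)

rng = np.random.default_rng(0)
# Hypothetical per-day features, e.g. learning-progress rate and activity counts.
X = rng.random((1000, 4))
y = (X[:, 0] + 0.2 * rng.standard_normal(1000) < 0.5).astype(int)  # 1 = dropout

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

pred = clf.predict(X_te)
proba = clf.predict_proba(X_te)[:, 1]
print("accuracy ", accuracy_score(y_te, pred))
print("precision", precision_score(y_te, pred))
print("recall   ", recall_score(y_te, pred))
print("f1       ", f1_score(y_te, pred))
print("auc      ", roc_auc_score(y_te, proba))

# Shapley-value explanations (requires the optional `shap` package):
# import shap
# explainer = shap.TreeExplainer(clf)
# shap_values = explainer.shap_values(X_te)
```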
Contributors: Dominic Ravichandran, Sheran Dass (Author) / Gary, Kevin (Thesis advisor) / Bansal, Ajay (Committee member) / Cunningham, James (Committee member) / Sannier, Adrian (Committee member) / Arizona State University (Publisher)
Created: 2021
Item 171603
Description

A significant proportion of medical errors involve crucial medical information, and most stem from misinterpreting non-standardized clinical notes. The Clinical Skills exam offered by the United States Medical Licensing Examination (USMLE) was put in place to certify patient note-taking skills before medical students join professional practice, offering a first line of defense in protecting patients from medical errors. Nonetheless, the exam was discontinued in 2021 owing to the high cost and resource usage of scoring it. This thesis compares four transformer-based models, namely BERT (Bidirectional Encoder Representations from Transformers) Base Uncased, emilyalsentzer/Bio_ClinicalBERT, RoBERTa (Robustly Optimized BERT Pre-training Approach), and DeBERTa (Decoding-enhanced BERT with disentangled attention), with the goal of mapping free text in patient notes to clinical concepts present in the exam rubric. The impact of context-specific embeddings on BERT was also studied to determine the need for a clinical BERT in the Clinical Skills exam. After comparing DeBERTa with the three other transformer models, this thesis proposes its use as the backbone model for patient note scoring in the USMLE Clinical Skills exam. The disentangled attention and enhanced mask decoder integrated into DeBERTa were credited for its high performance compared to the other models. In addition, the effect of meta pseudo labeling was investigated, which further enhanced DeBERTa's performance.
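The following sketch shows, under assumptions not stated in the abstract (the microsoft/deberta-base checkpoint and a BIO-style rubric labeling scheme), how a DeBERTa backbone could be wired up for token-level concept mapping with Hugging Face transformers; it is illustrative only, not the thesis pipeline:

```python
# A hedged sketch (not the thesis training code) of a DeBERTa backbone set up
# for token-level mapping of patient-note text to rubric concepts.
# The checkpoint choice and label names are assumptions.
from transformers import AutoTokenizer, AutoModelForTokenClassification
import torch

model_name = "microsoft/deberta-base"      # assumed checkpoint choice
labels = ["O", "B-CONCEPT", "I-CONCEPT"]   # hypothetical BIO rubric labels

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForTokenClassification.from_pretrained(
    model_name, num_labels=len(labels))

note = "17 yo male with recurrent palpitations and no chest pain"
inputs = tokenizer(note, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits        # (1, seq_len, num_labels)
predictions = logits.argmax(dim=-1).squeeze(0)
print([labels[i] for i in predictions.tolist()])
```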
Contributors: Ganesh, Jay (Author) / Bansal, Ajay (Thesis advisor) / Mehlhase, Alexandra (Committee member) / Findler, Michael (Committee member) / Arizona State University (Publisher)
Created: 2022
Item 171980
Description

The increasing availability of data and advances in computation have spurred the development of data-driven approaches for modeling complex dynamical systems. These approaches are based on the idea that the underlying structure of a complex system can be discovered from data using mathematical and computational techniques. They also show promise for addressing the challenges of modeling high-dimensional, nonlinear systems with limited data. In this research expository, the state of the art in data-driven approaches for modeling complex dynamical systems is surveyed in a systematic way. First, the general formulation of data-driven modeling of dynamical systems is discussed. Then several representative methods in feature engineering and system identification/prediction are reviewed, including recent advances and key challenges.
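As a generic, simplified example of the system-identification theme the survey covers (not taken from the expository itself), the sketch below recovers a linear dynamics matrix from snapshot data with a least-squares pseudoinverse fit:

```python
# A generic illustration of the data-driven system-identification idea:
# recover a linear dynamics matrix A from snapshot data x_{k+1} = A x_k
# by a least-squares / pseudoinverse fit. All values are made up.
import numpy as np

rng = np.random.default_rng(1)
A_true = np.array([[0.95, 0.10],
                   [-0.10, 0.95]])          # a stable rotation-like system

# Simulate a trajectory to serve as the "measured" data.
x = np.zeros((2, 200))
x[:, 0] = [1.0, 0.0]
for k in range(199):
    x[:, k + 1] = A_true @ x[:, k] + 0.01 * rng.standard_normal(2)

X, Y = x[:, :-1], x[:, 1:]                  # snapshot pairs
A_est = Y @ np.linalg.pinv(X)               # least-squares fit of the dynamics
print(np.round(A_est, 3))                   # should be close to A_true
```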
Contributors: Shi, Wenlong (Author) / Ren, Yi (Thesis advisor) / Hong, Qijun (Committee member) / Jiao, Yang (Committee member) / Yang, Yezhou (Committee member) / Arizona State University (Publisher)
Created: 2022
Item 171992
Description

The need for autonomous cars has never been more vital, and for a vehicle to be completely autonomous, multiple components must work together, one of which is the capacity to park at the end of a mission. This thesis project aims to design and execute an automated parking assist system (APAS). Traditional APASs may not be effective in some constrained urban parking environments because of parking space dimensions. The thesis proposes a novel four-wheel steering (4WS) vehicle for automated parallel parking to overcome this kind of challenge. Benefiting from the maneuverability enabled by the 4WS system, the feasible initial parking area is vastly expanded compared with that of conventional two-wheel steering (2WS) vehicles. In addition, the expanded initial area is divided into four areas for which different paths are planned correspondingly. In the proposed novel APAS, a suitable parking space is first identified through ultrasonic sensors mounted around the vehicle, and then, depending upon the vehicle's initial position, various compact and smooth parallel parking paths are generated. An optimization function is built to obtain the smoothest parallel parking path, i.e., the one with the smallest steering-angle change and the shortest length. With full utilization of the 4WS system, the proposed path planning algorithm allows a larger initial parking area that can be easily tracked by 4WS vehicles. The proposed APAS for 4WS vehicles makes the automatic parking process in restricted spaces efficient. To verify the feasibility and effectiveness of the proposed APAS, a 4WS vehicle prototype is used for validation through both simulation and experimental results.
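To illustrate why rear-wheel steering helps in tight spaces, the sketch below steps a textbook kinematic bicycle model extended with a rear steering angle; the parameters are made up and this is not the thesis planner or controller:

```python
# A textbook kinematic bicycle model extended with rear-wheel steering,
# given only as a hedged illustration of the 4WS maneuverability argument.
import math

def step_4ws(x, y, yaw, v, delta_f, delta_r, lf=1.2, lr=1.4, dt=0.05):
    """Advance the 4WS kinematic state one time step.

    delta_f / delta_r: front / rear steering angles (rad).
    Steering the rear wheels opposite to the front tightens the turn.
    """
    beta = math.atan((lf * math.tan(delta_r) + lr * math.tan(delta_f)) / (lf + lr))
    x += v * math.cos(yaw + beta) * dt
    y += v * math.sin(yaw + beta) * dt
    yaw += v * math.cos(beta) * (math.tan(delta_f) - math.tan(delta_r)) / (lf + lr) * dt
    return x, y, yaw

# Compare a 2WS turn (rear angle 0) with a counter-phase 4WS turn.
state_2ws = (0.0, 0.0, 0.0)
state_4ws = (0.0, 0.0, 0.0)
for _ in range(100):
    state_2ws = step_4ws(*state_2ws, v=1.0, delta_f=0.3, delta_r=0.0)
    state_4ws = step_4ws(*state_4ws, v=1.0, delta_f=0.3, delta_r=-0.3)
print("2WS heading change:", round(state_2ws[2], 2))
print("4WS heading change:", round(state_4ws[2], 2))  # larger => tighter turn
```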
Contributors: Gujarathi, Kaushik Kumar (Author) / Chen, Yan (Thesis advisor) / Yong, Sze Zheng (Committee member) / Ren, Yi (Committee member) / Arizona State University (Publisher)
Created: 2022
Item 172002
Description

The distribution and transport of mercury in the human body are poorly constrained. For instance, the long-term persistence and intra-individual distribution of mercury in bones from dental amalgams or environmental exposure have not been studied. A robust method validated for accuracy and precision specifically for mercury in human bone would facilitate anthropological, forensic, and medical studies of mercury. I present a highly precise and accurate analytical method for mercury concentration targeted at human bone samples. This method uses commonly available, reliable commercial instruments that are not limited to elemental Hg analysis. It requires significantly less sample than existing methods because it has a much lower limit of detection than the best mercury analyzers on the market and other analytical methods. With the low limit of detection achieved, this mercury concentration protocol is an excellent fit for studies with a limited amount of sample available for destructive analysis. I then use this method to analyze the distribution of mercury concentrations in modern skeletal collections provided by three U.S. anthropological research facilities. Mercury concentration and distribution were analyzed in 35 donors' skeletons, with 18 different skeletal elements (bones) per donor, to evaluate both the intra-individual and inter-individual variation in mercury concentration. Factors considered include geological differences among decomposition sites and the presence of dental amalgam fillings. Geological differences among decomposition sites did not statistically affect the mercury concentration in the donors' skeletons. The presence of dental amalgam significantly affected the inter-individual and intra-individual variation in mercury concentration in donors' skeletal samples. Individuals who had dental amalgam had significantly higher mercury concentrations in their skeletons than individuals who did not (p-value < 0.01). Mercury concentrations in the mandible, occipital bone, patella, and proximal phalanx (foot) were significantly affected by the presence of dental amalgam.
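The abstract reports a significant group difference (p < 0.01) but does not name the statistical test used. Purely as an illustration of how such a two-group comparison could be checked, the sketch below runs a Mann-Whitney U test on made-up concentration values:

```python
# Illustrative only: a nonparametric two-group comparison of skeletal mercury
# concentrations for donors with and without dental amalgam. The data are
# synthetic and the choice of test is an assumption, not from the thesis.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(7)
hg_amalgam = rng.lognormal(mean=3.0, sigma=0.4, size=20)     # hypothetical donors with amalgam
hg_no_amalgam = rng.lognormal(mean=2.2, sigma=0.4, size=15)  # hypothetical donors without

stat, p_value = mannwhitneyu(hg_amalgam, hg_no_amalgam, alternative="greater")
print(f"U = {stat:.1f}, p = {p_value:.4f}")
```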
Contributors: Ren, Yi (Author) / Gordon, Gwyneth (Thesis advisor) / Anbar, Ariel (Thesis advisor) / Shock, Everett (Committee member) / Knudson, Kelly (Committee member) / Arizona State University (Publisher)
Created: 2022
Item 190879
Description

Open Information Extraction (OIE) is a subset of Natural Language Processing (NLP) that constitutes the processing of natural language into structured, machine-readable data. This thesis uses data in the Resource Description Framework (RDF) triple format, which comprises a subject, a predicate, and an object. The extraction of RDF triples from natural language is an essential step toward importing data into web ontologies as part of the linked open data cloud on the Semantic Web. There are a number of related techniques for extracting triples from plain natural language text, including but not limited to ClausIE, OLLIE, ReVerb, and DeepEx. This study aims to reduce the dependency on conventional machine learning models, since they require training datasets and are not easily customizable or explainable. By leveraging a context-free grammar (CFG) based model, this thesis aims to address some of these issues while minimizing the trade-offs in performance and accuracy. Furthermore, a deep dive is conducted to analyze the strengths and limitations of the proposed approach.
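As a toy illustration of the CFG-based idea (the grammar and sentence are invented and far simpler than anything in the thesis), the sketch below parses a sentence with an NLTK context-free grammar and reads a (subject, predicate, object) triple off the parse tree:

```python
# A toy CFG parse used to extract a (subject, predicate, object) triple.
# The grammar and example sentence are invented for illustration only.
import nltk

grammar = nltk.CFG.fromstring("""
S  -> NP VP
NP -> Det N | N
VP -> V NP
Det -> 'the'
N  -> 'cat' | 'mouse'
V  -> 'chased'
""")

parser = nltk.ChartParser(grammar)
sentence = "the cat chased the mouse".split()

for tree in parser.parse(sentence):
    subject = " ".join(tree[0].leaves())        # NP under S
    predicate = " ".join(tree[1][0].leaves())   # V under VP
    obj = " ".join(tree[1][1].leaves())         # NP under VP
    print((subject, predicate, obj))            # ('the cat', 'chased', 'the mouse')
```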
Contributors: Singh, Varun (Author) / Bansal, Srividya (Thesis advisor) / Bansal, Ajay (Committee member) / Mehlhase, Alexandra (Committee member) / Arizona State University (Publisher)
Created: 2023
Item 187873
Description

Least-squares fitting in 3D is applied to produce higher-level geometric parameters that describe the optimum location of a line-profile through many nodal points. The points are derived from Finite Element Analysis (FEA) simulations of the elastic spring-back of features, both on stamped sheet-metal components after they have been plastically deformed in a press and released, and on simple assemblies made from them. Although the traditional Moore-Penrose inverse was used to solve the superabundant linear equations, the formulation of these equations was distinct: it is based on virtual work and statics applied to parallel-actuated robots, in order to allow for both more complex profiles and a change in profile size. The output, a small displacement torsor (SDT), is used to describe the displacement of the profile from its nominal location. It may be regarded as a generalization of the slope and intercept parameters of a line that result from a Gauss-Markov regression fit of points in a plane. Additionally, minimum-zone magnitudes were computed that just capture the points along the profile. Finally, algorithms were created to compute simple parameters for the cross-sectional shapes of components from the sprung-back data points, according to the protocol of simulations and benchmark experiments conducted by the metal-forming community 30 years ago, although it was necessary to modify that protocol for some geometries that differed from the benchmark.
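As a hedged, simplified analogue of the fitting approach (following the abstract's own comparison to a Gauss-Markov line fit), the sketch below solves a superabundant linear system with the Moore-Penrose pseudoinverse in NumPy; the thesis applies the same machinery to a small displacement torsor in 3D, which is not reproduced here:

```python
# Solving a superabundant (overdetermined) linear system with the
# Moore-Penrose pseudoinverse: fit slope and intercept of a line to noisy
# points. A simplified stand-in for the 3D profile fit, with made-up data.
import numpy as np

rng = np.random.default_rng(3)
x = np.linspace(0.0, 10.0, 50)
y = 2.0 * x + 1.0 + 0.1 * rng.standard_normal(x.size)   # noisy "measured" points

A = np.column_stack([x, np.ones_like(x)])   # one equation per point: [x 1][m b]^T = y
m, b = np.linalg.pinv(A) @ y                # Moore-Penrose least-squares solution
print(round(m, 3), round(b, 3))             # close to slope 2 and intercept 1
```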
Contributors: Sunkara, Sai Chandu (Author) / Davidson, Joseph (Thesis advisor) / Shah, Jami (Committee member) / Ren, Yi (Committee member) / Arizona State University (Publisher)
Created: 2023