Description
Brain Computer Interfaces are becoming the next generation of controllers, not only in medical devices for disabled individuals but also in the gaming and entertainment industries. In order to build an effective Brain Computer Interface that accurately translates the user's thoughts into machine commands, it is important to have robust and fail-proof signal processing and machine learning modules which operate on the raw EEG signals and estimate the user's current thought.

In this thesis, several techniques used to perform EEG signal pre-processing, feature extraction, and signal classification have been discussed, implemented, validated, and verified, and efficient supervised machine learning models for EEG motor imagery signal classification are identified. To further improve the performance of the system, unsupervised feature learning techniques have been investigated by pre-training the deep learning models. The use of pre-trained stacked autoencoders has been proposed to solve the problems caused by random initialization of weights in neural networks.
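
As a rough illustration of this pre-training idea (not the thesis implementation), the sketch below greedily trains a stack of single-layer autoencoders and copies their encoder weights into a feed-forward classifier in place of random initialization. The layer sizes, sigmoid activations, optimizer settings, and the use of PyTorch are assumptions made for the example.

```python
# Minimal sketch of stacked-autoencoder pre-training used to initialize a neural
# network classifier. Architecture and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn

def pretrain_autoencoder(data, in_dim, hid_dim, epochs=50, lr=1e-3):
    """Train one autoencoder layer to reconstruct its input; return the encoder."""
    encoder = nn.Linear(in_dim, hid_dim)
    decoder = nn.Linear(hid_dim, in_dim)
    opt = torch.optim.Adam(list(encoder.parameters()) + list(decoder.parameters()), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        recon = decoder(torch.sigmoid(encoder(data)))
        loss = loss_fn(recon, data)
        loss.backward()
        opt.step()
    return encoder

def build_pretrained_classifier(X, layer_sizes, n_classes):
    """Greedy layer-wise pre-training; encoders become the classifier's hidden layers."""
    encoders, inputs = [], X
    dims = [X.shape[1]] + layer_sizes
    for in_dim, hid_dim in zip(dims[:-1], dims[1:]):
        enc = pretrain_autoencoder(inputs, in_dim, hid_dim)
        encoders.append(enc)
        with torch.no_grad():
            inputs = torch.sigmoid(enc(inputs))  # feed encoded data to the next layer
    layers = []
    for enc in encoders:                         # pre-trained weights replace random init
        layers += [enc, nn.Sigmoid()]
    layers.append(nn.Linear(layer_sizes[-1], n_classes))  # output layer stays random
    return nn.Sequential(*layers)                # fine-tune end-to-end with labels afterwards

# Hypothetical usage: X_features is an (n_trials, n_features) float tensor of EEG features.
# model = build_pretrained_classifier(X_features, layer_sizes=[64, 32], n_classes=2)
```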

Motor Imagery (imaginary hand and leg movement) signals are acquired using the Emotiv EEG headset. Different kinds of features, such as the mean signal, band powers, and RMS of the signal, have been extracted and supplied to the machine learning (ML) stage, wherein several ML techniques such as LDA, KNN, SVM, logistic regression, and neural networks are applied and validated. During the validation phase, the performances of the various techniques are compared and some important observations are reported. Further, deep learning techniques such as autoencoding have been used to perform unsupervised feature learning. The reliability of the learned features is analyzed by performing classification with the ML techniques mentioned earlier. The performance of the neural networks has been further improved by pre-training the network in an unsupervised fashion using stacked autoencoders and supplying the stacked autoencoders' network parameters as initial parameters to the neural network. The findings from each phase of this research (pre-processing, feature extraction, classification) are directly relevant to, and can be used by, the BCI research community for building motor-imagery-based BCI applications.
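
A minimal sketch of this feature-extraction and model-comparison workflow is given below; it is not the thesis code. The 128 Hz sampling rate, the mu/beta band definitions, the Welch PSD settings, and the cross-validation scheme are illustrative assumptions; the classifier list mirrors the methods named above.

```python
# Minimal sketch: mean / RMS / band-power features per channel, then a comparison of
# the classifiers named in the abstract via cross-validation. Parameters are assumptions.
import numpy as np
from scipy.signal import welch
from sklearn.model_selection import cross_val_score
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier

FS = 128                                    # assumed Emotiv sampling rate
BANDS = {"mu": (8, 12), "beta": (13, 30)}   # motor-imagery-relevant bands (assumption)

def trial_features(trial):
    """trial: (n_channels, n_samples) EEG segment -> flat feature vector."""
    feats = []
    for ch in trial:
        freqs, psd = welch(ch, fs=FS, nperseg=FS)
        feats.append(ch.mean())                    # mean signal
        feats.append(np.sqrt(np.mean(ch ** 2)))    # RMS
        for lo, hi in BANDS.values():              # band powers
            feats.append(psd[(freqs >= lo) & (freqs <= hi)].sum())
    return np.array(feats)

def compare_classifiers(trials, labels):
    """Return mean 5-fold cross-validation accuracy for each classifier."""
    X = np.vstack([trial_features(t) for t in trials])
    models = {
        "LDA": LinearDiscriminantAnalysis(),
        "KNN": KNeighborsClassifier(n_neighbors=5),
        "SVM": SVC(kernel="rbf"),
        "LogReg": LogisticRegression(max_iter=1000),
        "NN": MLPClassifier(hidden_layer_sizes=(64,), max_iter=1000),
    }
    return {name: cross_val_score(m, X, labels, cv=5).mean() for name, m in models.items()}
```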

Additionally, this thesis attempts to develop, test, and compare the performance of an alternative method for classifying human driving behavior: it proposes using driver affective states to infer driving behavior. The purpose of this part of the thesis was to classify EEG data collected from several subjects while they drove a simulated vehicle and to compare the classification results with those obtained by classifying driving behavior from vehicle parameters collected simultaneously for all subjects. The objective is to determine whether a driver's mental state is reflected in his or her driving behavior.
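
This comparison might be sketched as follows (illustrative only, not the thesis code): the same classifier is trained separately on EEG-derived features and on vehicle parameters, both are scored against the shared driving-behavior labels, and the agreement between the two views is measured. The SVM choice, the feature matrices, and the use of Cohen's kappa as an agreement measure are assumptions.

```python
# Minimal sketch of comparing EEG-based and vehicle-parameter-based classification of
# the same driving-behavior labels. Inputs and model choice are illustrative assumptions.
from sklearn.model_selection import cross_val_predict
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score, cohen_kappa_score

def compare_views(X_eeg, X_vehicle, y_behavior):
    """X_eeg, X_vehicle: per-trial feature matrices; y_behavior: shared behavior labels."""
    pred_eeg = cross_val_predict(SVC(), X_eeg, y_behavior, cv=5)
    pred_veh = cross_val_predict(SVC(), X_vehicle, y_behavior, cv=5)
    return {
        "eeg_accuracy": accuracy_score(y_behavior, pred_eeg),
        "vehicle_accuracy": accuracy_score(y_behavior, pred_veh),
        # agreement between the two views: does mental state mirror driving behavior?
        "eeg_vehicle_agreement": cohen_kappa_score(pred_eeg, pred_veh),
    }
```
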
Contributors: Manchala, Vamsi Krishna (Author) / Redkar, Sangram (Thesis advisor) / Rogers, Bradley (Committee member) / Sugar, Thomas (Committee member) / Arizona State University (Publisher)
Created: 2015
Description
Traditional usability methods in Human-Computer Interaction (HCI) have been extensively used to understand the usability of products. Measurements of user experience (UX) in traditional HCI studies rely mostly on task performance, observable user interactions with the product or service (such as usability tests and contextual inquiry), and subjective self-report data, including questionnaires and interviews. However, these studies do not directly reflect a user's psychological involvement, nor do they explain the underlying cognitive processing and the related emotional arousal. Thus, capturing how users think and feel when they are using a product remains a vital challenge for user experience evaluation studies.

In contrast, recent research has revealed that sensor-based affect detection technologies, such as eye tracking, electroencephalography (EEG), galvanic skin response (GSR), and facial expression analysis, effectively capture affective states and physiological responses. These methods are efficient indicators of cognitive involvement and emotional arousal and constitute effective strategies for a comprehensive measurement of UX. The literature review shows that the impacts of sensor-based affect detection systems on UX evaluation fall into two groups: (1) confirmatory, validating the results obtained from traditional usability methods; and (2) complementary, enhancing the findings or providing more precise and valid evidence. Both yield comprehensive findings that uncover issues along mental and physiological pathways and thereby enhance the design of products and services. Therefore, this dissertation claims that integrating sensor-based affect detection technologies can efficiently address the current gaps and weaknesses of traditional usability methods.

The dissertation revealed that a multi-sensor UX evaluation approach, using biometric tools and software, corroborated the user experience identified by traditional UX methods during an online purchasing task. The use of these systems enhanced the findings and provided more precise and valid evidence for predicting consumer purchasing preferences; their impact on the overall UX evaluation was therefore "complementary." The dissertation also provided information on the unique contributions of each tool and recommended ways in which user experience researchers can combine sensor-based and traditional UX approaches to explain consumer purchasing preferences.
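
One practical step behind such a multi-sensor evaluation is aligning the biometric streams to the task timeline so that physiological responses can be attributed to specific interface actions. The sketch below shows one plausible way to do this with pandas; the column names, sensors, and aggregation are illustrative assumptions, not the dissertation's toolchain.

```python
# Minimal sketch: align GSR and eye-tracking samples to task events on a shared
# timeline, then summarize arousal and attention per task step. Columns are assumptions.
import pandas as pd

def align_sensors(gsr: pd.DataFrame, fixations: pd.DataFrame, events: pd.DataFrame) -> pd.DataFrame:
    """Each frame has a 'timestamp' column (seconds); events also has 'task_step'."""
    gsr = gsr.sort_values("timestamp")
    fixations = fixations.sort_values("timestamp")
    events = events.sort_values("timestamp")
    # Attach the most recent task step to every GSR sample, then the nearest fixation.
    merged = pd.merge_asof(gsr, events, on="timestamp", direction="backward")
    merged = pd.merge_asof(merged, fixations, on="timestamp", direction="nearest")
    # Average skin conductance and fixation duration per task step.
    return merged.groupby("task_step")[["conductance", "fixation_duration"]].mean()
```
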
Contributors: Kula, Irfan (Author) / Atkinson, Robert K. (Thesis advisor) / Roscoe, Rod D. (Thesis advisor) / Branaghan, Russell J. (Committee member) / Arizona State University (Publisher)
Created: 2018