
Description

Space microbiology, or the study of microorganisms in space, has significant applications for both human spaceflight and Earth-based medicine. This thesis traces the evolution of the field of space microbiology since its creation in 1935. Beginning with simple studies to determine if terrestrial life could survive spaceflight, the field of space microbiology has grown to encompass a substantial body of work that is now recognized as an essential component of NASA's research endeavors. Part one provides an overview of the early period of space microbiology, from high-altitude balloon and rocket studies to work conducted during the Apollo program. Part two summarizes the current state of the field, with a specific focus on the revolutionary contributions made by the Nickerson lab at the Biodesign Institute at ASU using the NASA-designed Rotating Wall Vessel (RWV) bioreactor. Finally, part three highlights the research I have conducted in the Nickerson lab, as well as continuing studies within the field of space microbiology.
Contributors: McCarthy, Breanne E. (Author) / Lynch, John (Thesis director) / Foy, Joseph (Committee member) / School of Life Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05
Description

Orbiting space debris is an active problem that affects the ability to launch future satellites, probes, and space shuttles, and it will become nearly insurmountable without action. Debris of varying sizes and speeds orbits the Earth at a range of altitudes above the atmosphere and must be removed to avoid damaging the critical equipment of active satellites, including the International Space Station. Making debris removal feasible requires addressing several facets of the problem, beginning with an understanding of its magnitude and sources. This literature assessment covers the amount of space debris in low-Earth and geosynchronous orbit, as well as the collision events that have added to it. Over the last decade, several space agencies have made efforts to limit the debris added to orbit by current and future launches, which serve as a temporary fix until removal can be carried out. This paper explores known removal efforts through mitigation, projects conceived and tested by DARPA, related space policies and laws, CubeSat technology, and the cataloguing of known space debris. Making removal a reality also requires clearing the legal roadblocks to obtaining permission from states or countries for space missions; these restrictions exist to protect the assets of many countries and organizations, and guidelines intended to curb the growth of space debris are weakened by the resulting ownership constraints and have failed to prevent that growth. This paper therefore covers space policy and law, the economy, satellite ownership, international conflict, the current status of space debris, and the overall feasibility of space debris removal. It then discusses currently proposed solutions for removing space debris. Finally, this paper weighs the advantages and disadvantages of the idea that space debris removal should include the opportunity to recycle materials; for example, defunct satellites and other discarded spacecraft could supply material for future launches. It concludes with a personal exploration of which materials can be recycled, which chemical processes can be used to break materials down, and how recycling and chemical processing could be combined in space-based recycling stations between Earth and the Moon. The overall question driving the search for making space debris removal a reality is whether it is feasible technologically, legally, monetarily, and physically.
Contributors: Breden, Elizabeth Catherine (Author) / Foy, Joseph (Thesis director) / Thoesen, Andrew (Committee member) / Maximon, Leonard (Committee member) / School of Molecular Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created: 2017-05
Description

This paper presents work that was done to create a system capable of facial expression recognition (FER) using deep convolutional neural networks (CNNs) and to test multiple configurations and methods. CNNs are able to extract powerful information about an image using multiple layers of generic feature detectors, and the extracted information can be used to understand the image better by recognizing the different features present within it. Deep CNNs, however, require training sets that can be larger than a million pictures in order to fine-tune their feature detectors, and no facial expression dataset of that size is available. Due to this limited availability of training data, the idea of naïve domain adaptation is explored: instead of creating and training a new CNN specifically to extract features related to FER, a CNN previously trained for another computer vision task is used. Work for this research involved creating a system that can run a CNN, extract feature vectors from it, and classify those features. Once this system was built, different aspects of it were tested and tuned, including the pre-trained CNN that was used, the layer from which features were extracted, the normalization applied to input images, and the training data for the classifier. Once properly tuned, the system returned results more accurate than previous attempts at facial expression recognition. Based on these positive results, naïve domain adaptation is shown to successfully leverage the advantages of deep CNNs for facial expression recognition.
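
As a hedged illustration of the pipeline described above (running a pre-trained CNN as a fixed feature extractor and training a separate classifier on the extracted vectors), the sketch below uses ResNet-18 and a linear SVM; the thesis's actual network, extraction layer, normalization, and classifier are not specified here, so those choices are assumptions.

```python
# Minimal sketch of naive domain adaptation for FER: a CNN pre-trained on another
# vision task is used only as a fixed feature extractor, and a lightweight
# classifier is trained on the extracted vectors. ResNet-18 and LinearSVC are
# illustrative choices, not the thesis's exact setup.
import torch
import torchvision.models as models
import torchvision.transforms as T
from sklearn.svm import LinearSVC

cnn = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
cnn.fc = torch.nn.Identity()   # drop the classification head, keep pooled features
cnn.eval()

preprocess = T.Compose([
    T.Resize((224, 224)),
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def extract_features(images):
    """Return one feature vector per face image (PIL images assumed)."""
    batch = torch.stack([preprocess(img) for img in images])
    with torch.no_grad():
        return cnn(batch).numpy()

# Training the classifier on extracted features (placeholder dataset names):
# X_train = extract_features(train_images)
# clf = LinearSVC().fit(X_train, train_labels)
# predictions = clf.predict(extract_features(test_images))
```
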
Contributors: Eusebio, Jose Miguel Ang (Author) / Panchanathan, Sethuraman (Thesis director) / McDaniel, Troy (Committee member) / Venkateswara, Hemanth (Committee member) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05
Description

Classical planning is a field of Artificial Intelligence concerned with allowing autonomous agents to make reasonable decisions in complex environments. This work investigates the application of deep learning and planning techniques, with the aim of constructing generalized plans capable of solving multiple problem instances. We construct a deep neural network that, given an abstract problem state, predicts both (i) the best action to be taken from that state and (ii) the generalized “role” of the object being manipulated. The neural network was tested on two classical planning domains: the blocks world domain and the logistics domain. Results indicate that neural networks are capable of making such predictions with high accuracy, suggesting a promising new framework for approaching generalized planning problems.
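
A minimal sketch of the kind of two-headed network described above follows; the state encoding, layer sizes, and numbers of actions and roles are illustrative assumptions rather than the thesis's actual architecture.

```python
# Sketch of a network that, given an encoded abstract problem state, predicts
# both the next action and the generalized "role" of the manipulated object.
# All dimensions and label counts are placeholders.
import torch
import torch.nn as nn

class GeneralizedPlanNet(nn.Module):
    def __init__(self, state_dim=128, hidden_dim=256, n_actions=10, n_roles=6):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(state_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim), nn.ReLU(),
        )
        self.action_head = nn.Linear(hidden_dim, n_actions)  # best action from this state
        self.role_head = nn.Linear(hidden_dim, n_roles)      # role of the manipulated object

    def forward(self, state):
        h = self.encoder(state)
        return self.action_head(h), self.role_head(h)

# Training would minimize the sum of two cross-entropy losses, one per head:
# action_logits, role_logits = model(state_batch)
# loss = nn.functional.cross_entropy(action_logits, action_labels) \
#      + nn.functional.cross_entropy(role_logits, role_labels)
```
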
Contributors: Nakhleh, Julia Blair (Author) / Srivastava, Siddharth (Thesis director) / Fainekos, Georgios (Committee member) / Computer Science and Engineering Program (Contributor) / School of International Letters and Cultures (Contributor) / Barrett, The Honors College (Contributor)
Created: 2019-05
Description

Medical records are increasingly being kept in the form of electronic health records (EHRs), with a significant amount of patient data recorded as unstructured natural language text. Consequently, being able to extract and utilize the clinical data present within these records is an important step in furthering clinical care. One important aspect of these records is prescription information. Existing techniques for extracting prescription information, which includes medication names, dosages, frequencies, reasons for taking, and modes of administration, from unstructured text have focused on rule-based and classifier-based methods. While state-of-the-art systems can be effective in extracting many types of information, they require significant effort to develop hand-crafted rules and to conduct effective feature engineering. This paper presents the use of a bidirectional LSTM with a CRF tagging layer, initialized with precomputed word embeddings, for extracting prescription information from sentences without requiring significant feature engineering. The experimental results, obtained on the i2b2 2009 dataset, achieve a macro F1 measure of 0.8562 and scores above 0.9449 on four of the six categories, indicating significant potential for this model.
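
The following is a hedged sketch of a BiLSTM-CRF tagger of the sort described above, with an embedding layer that can be initialized from precomputed word embeddings; the dimensions, tag set, and use of the third-party pytorch-crf package for the CRF layer are assumptions, not the thesis's exact implementation.

```python
# Sketch of a BiLSTM-CRF sequence tagger for prescription-field extraction.
# Sizes and the CRF implementation (pytorch-crf) are illustrative assumptions.
import torch
import torch.nn as nn
from torchcrf import CRF  # pip install pytorch-crf

class BiLSTMCRFTagger(nn.Module):
    def __init__(self, vocab_size, n_tags, embed_dim=100, hidden_dim=128,
                 pretrained_embeddings=None):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        if pretrained_embeddings is not None:
            # Initialize from precomputed word embeddings (e.g., word2vec/GloVe).
            self.embedding.weight.data.copy_(pretrained_embeddings)
        self.lstm = nn.LSTM(embed_dim, hidden_dim // 2, bidirectional=True,
                            batch_first=True)
        self.hidden2tag = nn.Linear(hidden_dim, n_tags)
        self.crf = CRF(n_tags, batch_first=True)

    def forward(self, tokens, tags=None, mask=None):
        emissions = self.hidden2tag(self.lstm(self.embedding(tokens))[0])
        if tags is not None:
            # Training: negative log-likelihood of the gold tag sequence under the CRF.
            return -self.crf(emissions, tags, mask=mask)
        # Inference: Viterbi-decode the best tag sequence per sentence.
        return self.crf.decode(emissions, mask=mask)
```
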
Contributors: Rawal, Samarth Chetan (Author) / Baral, Chitta (Thesis director) / Anwar, Saadat (Committee member) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2018-05
Description


CubeSats can encounter a myriad of difficulties in space, such as cosmic rays, temperature issues, and loss of control. By creating better, more reliable software, these problems can be mitigated and the mission's chance of success increased. This research sets out to answer the question of how to create reliable flight software for CubeSats by providing a concentrated list of the best flight software development practices. The CubeSat used in this research is the Deployable Optical Receiver Aperture (DORA) CubeSat, a 3U CubeSat that seeks to demonstrate optical communication data rates of 1 Gbps over long distances. We present an analysis of many of the flight software development practices currently in use in the industry, including those from NASA, and identify three key flight software development areas of focus: memory, concurrency, and error handling. Within each of these areas, best practices were defined for how to approach it. These practices were also refined using experience from the creation of the DORA CubeSat's flight software, where they drove the design and testing of the system. We analyze DORA's effectiveness in the three areas of focus and discuss how following the identified best practices helped to create a more reliable flight software system for the DORA CubeSat.
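
The DORA flight software itself is not reproduced here; as a generic, hedged illustration of the error-handling focus area, the sketch below wraps a command handler so that transient failures are retried a bounded number of times before degrading to a safe mode rather than crashing the flight loop. The language, function names, and retry policy are all assumptions.

```python
# Illustrative error-handling pattern for a flight-software command loop:
# bounded retries for transient faults, then a fall-back to safe mode.
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("flight")

def enter_safe_mode():
    # Placeholder: power down non-essential subsystems and await ground contact.
    log.info("safe mode engaged")

def run_with_retries(command, max_attempts=3, delay_s=1.0):
    """Execute a command callable, retrying transient failures before
    falling back to a safe state instead of crashing the flight loop."""
    for attempt in range(1, max_attempts + 1):
        try:
            return command()
        except Exception as exc:  # real flight code would catch narrower error types
            log.warning("command failed (attempt %d/%d): %s",
                        attempt, max_attempts, exc)
            time.sleep(delay_s)
    log.error("command failed after %d attempts; entering safe mode", max_attempts)
    enter_safe_mode()
    return None
```
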

Contributors: Hoffmann, Zachary Christian (Author) / Chavez-Echeagaray, Maria Elena (Thesis director) / Jacobs, Daniel (Committee member) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2021-05
Description


This thesis attempts to explain Everettian quantum mechanics from the ground up, such that those with little to no experience in quantum physics can understand it. First, we introduce the history of quantum theory and some concepts that make up the framework of quantum physics. Through these concepts, we reveal why interpretations are necessary to map the quantum world onto our classical world. We then introduce the Copenhagen interpretation and how many-worlds differs from it. From there, we dive into the concepts of entanglement and decoherence, explaining how worlds branch in an Everettian universe and how such a universe can appear as the classical world we observe. We then attempt to answer common questions about many-worlds and discuss whether there are philosophical ramifications to believing such a theory. Finally, we look at whether the many-worlds interpretation can be proven, and why one might choose to believe it.

Contributors: Secrest, Micah (Author) / Foy, Joseph (Thesis director) / Hines, Taylor (Committee member) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2021-05
Description


The purpose of this paper is to provide an analysis of entanglement and the particular problems it poses for some physicists. In addition to looking at the history of entanglement and non-locality, this paper will use the Bell test, which measures the behavior of electron pairs whose combined internal angular momentum is zero, as a means of demonstrating how entanglement works. This paper will go over Bell's famous inequality, which shows why the process of entanglement cannot be explained by traditional local processes. Entanglement will be viewed initially through the Copenhagen Interpretation, but this paper will also look at two other models of quantum mechanics, de Broglie-Bohm theory and Everett's Many-Worlds Interpretation, and observe how they explain the behavior of spin and entangled particles compared to the Copenhagen Interpretation.
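
For reference, the textbook CHSH form of the inequality mentioned above can be stated compactly; the thesis's exact presentation is not reproduced here, so this is the standard version for the spin-singlet setup.

```latex
% CHSH form of Bell's inequality for measurement settings a, a', b, b'.
% Any local hidden-variable model must satisfy |S| <= 2, while quantum
% mechanics predicts E(a,b) = -a.b = -cos(theta_ab) for the singlet state,
% allowing |S| up to 2*sqrt(2).
S = E(\mathbf{a},\mathbf{b}) - E(\mathbf{a},\mathbf{b}')
  + E(\mathbf{a}',\mathbf{b}) + E(\mathbf{a}',\mathbf{b}'),
\qquad |S| \le 2 \ \text{(local)},
\qquad E(\mathbf{a},\mathbf{b}) = -\mathbf{a}\cdot\mathbf{b} \ \text{(quantum)}.
```
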

Contributors: Wood, Keaten Lawrence (Author) / Foy, Joseph (Thesis director) / Hines, Taylor (Committee member) / Department of Physics (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created: 2021-05
Description


Breast cancer is one of the most common types of cancer worldwide. Early detection and diagnosis are crucial for improving the chances of successful treatment and survival. In this thesis, many different machine learning algorithms were evaluated and compared for predicting breast cancer malignancy from diagnostic features extracted from digitized images of fine-needle aspirates of breast tissue. Breast cancer diagnosis typically involves a combination of mammography, ultrasound, and biopsy. However, machine learning algorithms can assist in the detection and diagnosis of breast cancer by analyzing large amounts of data and identifying patterns that may not be discernible to the human eye. By using these algorithms, healthcare professionals can potentially detect breast cancer at an earlier stage, leading to more effective treatment and better patient outcomes. The results showed that the gradient boosting classifier performed best, achieving an accuracy of 96% on the test set. This indicates that the algorithm can be a useful tool for healthcare professionals in the early detection and diagnosis of breast cancer, potentially leading to improved patient outcomes.
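
As a hedged sketch of the kind of experiment described above, the snippet below trains scikit-learn's gradient boosting classifier on the built-in Wisconsin Diagnostic Breast Cancer features (computed from fine-needle aspirates); the thesis's exact dataset splits, preprocessing, and hyperparameters are assumptions here, so the resulting accuracy will not necessarily match the reported 96%.

```python
# Gradient boosting on fine-needle-aspirate features for malignancy prediction.
# Dataset, split, and hyperparameters are illustrative assumptions.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y)

clf = GradientBoostingClassifier(random_state=42)
clf.fit(X_train, y_train)

print("test accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```
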

Contributors: Mallya, Aatmik (Author) / De Luca, Gennaro (Thesis director) / Chen, Yinong (Committee member) / Barrett, The Honors College (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Computer Science and Engineering Program (Contributor)
Created: 2023-05
Description


This research paper explores the effects of data variance on the quality of artificial intelligence image generation models and on viewers' perception of the generated images. The study examines how the quality and accuracy of the images produced by these models are influenced by factors such as the size, labeling, and format of the training data. The findings suggest that reducing the training dataset size leads to a decrease in image coherence, with model outputs degrading as the training set shrinks. The study also makes surprising discoveries regarding image generation models trained on highly varied datasets. In addition, the study includes a survey in which people were asked to rate the subjective realism of the generated images on a scale from 1 to 5 and to sort the images into their respective classes. The findings emphasize the importance of considering dataset variance and size as critical aspects of improving image generation models, as well as the implications of using AI technology in the future.
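
As a hedged sketch of one part of the study design described above, the snippet below draws progressively smaller random subsets of a labeled image dataset so that the same generative model can be trained at each size and its outputs compared; the fractions, dataset object, and training and evaluation routines are hypothetical placeholders, not the paper's actual code.

```python
# Build reduced-size training subsets for controlled runs of an image generator.
# All names below are placeholders for the study's actual data and pipeline.
import random

def make_subsets(dataset, fractions=(1.0, 0.5, 0.25, 0.1), seed=0):
    """Return {fraction: list_of_samples} drawn without replacement."""
    rng = random.Random(seed)
    items = list(dataset)
    subsets = {}
    for frac in fractions:
        k = max(1, int(len(items) * frac))
        subsets[frac] = rng.sample(items, k)
    return subsets

# for frac, subset in make_subsets(full_dataset).items():
#     model = train_image_generator(subset)   # hypothetical training routine
#     evaluate_coherence(model)               # e.g., via the 1-5 realism survey
```
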

Contributors: Punyamurthula, Rushil (Author) / Carter, Lynn (Thesis director) / Sarmento, Rick (Committee member) / Barrett, The Honors College (Contributor) / School of Sustainability (Contributor) / Computer Science and Engineering Program (Contributor)
Created: 2023-05