Matching Items (6)
Description

The U.S. Navy and other amphibious military organizations use a derivation of the traditional side stroke called the Combat Side Stroke, or CSS, and tout it as the most efficient technique available. Because of its low aerobic requirements and slow yet powerful movements, which the Navy cites as advantages over the front crawl (freestyle) traditionally considered the best stroke, the CSS is the go-to stroke for any operation in the water. The purpose of this thesis is to apply principles of Industrial Engineering to a real-world situation not typically approached from an optimization perspective. I will analyze pre-existing data about various swim strokes in order to compare their efficiency across several variables: calories burned, speed, and strokes per unit distance, as well as their interactions. Calories will be measured by heart rate monitors, converting BPM to calories burned. Speed will be measured by stopwatch and observer, and strokes per unit distance will be counted by an observer. The strokes to be analyzed are the breaststroke, crawl stroke, butterfly, and combat side stroke. The goal is to informally test the U.S. Navy's claim that the combat side stroke is the optimal stroke for conserving energy while covering distance. Because of limitations in the scope of the project, the analysis will use data collected from literature sources rather than new experimentation; the thesis also includes a design of experiment for testing the findings in a practical study. The main method of analysis will be linear programming, followed by hypothesis testing, culminating in a design of experiment for future progress on this topic.
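
The abstract names linear programming as the primary analysis method for trading off calories, speed, and strokes per unit distance. As a hedged illustration only, the sketch below sets up a small linear program of that general kind with scipy.optimize.linprog; every stroke figure and constraint value is a made-up placeholder, not data from the thesis.

```python
# Hypothetical sketch of the kind of linear program the thesis describes:
# choose how many 100 m segments of each stroke to swim so that a total
# distance is covered while calories burned are minimized.  All numbers
# below are illustrative placeholders, not data from the thesis.
from scipy.optimize import linprog

strokes = ["breaststroke", "crawl", "butterfly", "combat side stroke"]
calories_per_100m = [55, 45, 70, 40]     # objective: minimize total calories
minutes_per_100m = [2.5, 1.8, 2.0, 2.2]  # used in a time-budget constraint

total_distance_m = 1000   # must cover at least 1000 m
time_budget_min = 25      # within 25 minutes

# Decision variables x_i = number of 100 m segments swum with stroke i.
c = calories_per_100m
A_ub = [[-1, -1, -1, -1],            # -(sum of segments) <= -10  ->  distance >= 1000 m
        minutes_per_100m]            # total time <= budget
b_ub = [-total_distance_m / 100, time_budget_min]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 4, method="highs")
for name, x in zip(strokes, res.x):
    print(f"{name}: {x:.1f} segments of 100 m")
print(f"Estimated calories: {res.fun:.0f}")
```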

Contributors: Goodsell, Kevin Lewis (Author) / McCarville, Daniel R. (Thesis director) / Kashiwagi, Jacob (Committee member) / Industrial, Systems (Contributor) / Barrett, The Honors College (Contributor)
Created: 2014-12
Description
The current Enterprise Requirements and Acquisition Model (ERAM), a discrete event simulation of the major tasks and decisions within the DoD acquisition system, identifies several what-if intervention strategies to improve program completion time. However, processes that contribute to the program acquisition completion time were not explicitly identified in the simulation study. This research seeks to determine the acquisition processes that contribute significantly to total simulated program time in the acquisition system for all programs reaching Milestone C. Specifically, this research examines the effect of increased scope management, technology maturity, and decreased variation and mean process times in post-Design Readiness Review contractor activities by performing additional simulation analyses. Potential policies are formulated from the results to further improve program acquisition completion time.
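
ERAM itself is a full discrete event simulation of the DoD acquisition system; the sketch below is not that model but a minimal Monte Carlo stand-in for the kind of what-if comparison described above: total program time under baseline versus reduced mean and variation in post-Design Readiness Review contractor activity times. All phase names, distributions, and parameters are hypothetical.

```python
# Minimal Monte Carlo sketch (not the ERAM model itself) of the what-if
# question described above: how does reducing the mean and variation of
# post-Design Readiness Review contractor activities change total
# simulated program time?  All activity names and parameters are hypothetical.
import numpy as np

rng = np.random.default_rng(42)
N = 10_000  # simulated program runs

def total_program_time(contractor_median, contractor_sigma):
    """Sum of sequential phase durations in months (lognormal for skew)."""
    requirements = rng.lognormal(mean=np.log(12), sigma=0.3, size=N)
    design       = rng.lognormal(mean=np.log(18), sigma=0.3, size=N)
    contractor   = rng.lognormal(mean=np.log(contractor_median),
                                 sigma=contractor_sigma, size=N)
    return requirements + design + contractor

baseline = total_program_time(contractor_median=30, contractor_sigma=0.5)
improved = total_program_time(contractor_median=24, contractor_sigma=0.25)

print(f"Baseline mean time to Milestone C: {baseline.mean():.1f} months")
print(f"Improved mean time to Milestone C: {improved.mean():.1f} months")
print(f"90th-percentile reduction: "
      f"{np.percentile(baseline, 90) - np.percentile(improved, 90):.1f} months")
```
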
Contributors: Worger, Danielle Marie (Author) / Wu, Teresa (Thesis director) / Shunk, Dan (Committee member) / Wirthlin, J. Robert (Committee member) / Industrial, Systems (Contributor) / Barrett, The Honors College (Contributor)
Created: 2013-05
Description
Chess has been a common research topic for expert-novice studies, and thus for learning science as a whole, because of its limited framework and longevity as a game. One factor is that chess studies are good at measuring how expert chess players use their memory and skills to approach a new chessboard configuration. Studies have shown that chess skill is based on memory, specifically on "chunks" of chess piece positions that players have previously encountered. However, debate exists concerning how these chunks are constructed in players' memory. Chunks could be constructed from the proximity of pieces on the chessboard and their precise locations, or they could be constructed through attack-defense relations. The primary objective of this study is to determine which account, proximity or attack/defense, better matches chess players' actual memory-based abilities. This study replicates and extends an experiment conducted by McGregor and Howe (2002), which explored the argument that pieces are primed more by attack and defense relations than by proximity. Like their study, the present study examined novice and expert chess players' response times for correct and error responses by showing slides of game configurations. In addition to these metrics, the present study also incorporated an eye-tracker to measure visual attention and EEG to measure affective and cognitive states; these were added to allow comparison of the subtle and unconscious behaviors of novice and expert chess players. Overall, most of McGregor and Howe's (2002) results were replicated, supporting their theory of chess expertise. This included a statistically significant effect of skill on error rates, with mean error rates on the piece recognition tests of 70.1% for novices and 87.9% for experts, as well as a significant two-way interaction between relatedness and proximity, with error rates of 22.4% for unrelated/far, 18.8% for related/far, 15.8% for unrelated/near, and 29.3% for related/near. Unfortunately, there was no statistical significance for any of the response time effects, including the skill-by-proximity interaction that McGregor and Howe found. Although the eye-tracking and EEG data neither supported nor refuted McGregor and Howe's theory of how chess players memorize chessboard configurations, these metrics did help build a secondary theory that novices typically rely on proximity when approaching chess, and new visual problems in general. This was exemplified by the statistically significant short-term excitement result for the two-way interaction of skill and proximity, where the largest short-term excitement scores occurred for novices on near-proximity slides. This may indicate that novices, because they lean toward using proximity to recall the pieces, experience a short burst of excitement when the pieces are close to each other, since they are more likely to recall those configurations.
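
The key statistical test reported above is a two-way interaction between relatedness and proximity in error rates. As a hedged sketch only, the code below runs that kind of factorial ANOVA on synthetic trial-level data generated from the cell error rates quoted in the abstract; it is not the study's actual analysis or data.

```python
# Hedged sketch of the relatedness x proximity interaction test described
# above, run on synthetic trial data (the real study's data are not shown here).
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

rng = np.random.default_rng(0)
rows = []
# Cell error rates quoted in the abstract, used only to generate illustrative trials.
cell_error = {("unrelated", "far"): 0.224, ("related", "far"): 0.188,
              ("unrelated", "near"): 0.158, ("related", "near"): 0.293}
for (related, proximity), p in cell_error.items():
    errors = rng.binomial(1, p, size=200)   # 200 synthetic trials per cell
    for e in errors:
        rows.append({"related": related, "proximity": proximity, "error": e})

df = pd.DataFrame(rows)
model = ols("error ~ C(related) * C(proximity)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))      # look for the interaction term
```
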
Contributors: Seto, Christian Paul (Author) / Atkinson, Robert (Thesis director) / Runger, George (Committee member) / Industrial, Systems (Contributor) / Barrett, The Honors College (Contributor)
Created: 2017-05
Description
In this study, the implementation of educational technology and its effect on learning and user experience are measured. A demographic survey, a pretest/posttest, and an educational experience survey were used to collect data on the control and experimental groups. The experimental group was given different learning material than the control group: the Elements 4D mobile application by Daqri, used to learn basic chemical elements and compounds. The control group's learning material provided exactly the same information as the application, but in the 2D form of a printed packet. It was expected that the experimental group would outperform the control group and have a more enjoyable experience. After data analysis, it was concluded that, contrary to the hypothesis, the control group outperformed the experimental group and both groups had similar experiences. Once the limitations of differing study durations, learning the application beforehand, and memorization-only questions are addressed, the study can be conducted again. Application improvements may also alter the future results of the study and hopefully lead to full implementation into a curriculum.
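
The comparison described above is a pretest/posttest design with a control and an experimental group. As an illustrative sketch only, the code below compares learning gains between two such groups with Welch's t-test on simulated scores; the group sizes, score scale, and numbers are placeholders, not the study's data.

```python
# Hedged sketch of the control-vs-experimental comparison described above,
# using hypothetical pretest/posttest scores rather than the study's data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Simulated scores for two groups of 25 students each (placeholder values;
# the gain direction loosely mirrors the reported outcome).
control_pre  = rng.normal(10, 2, 25)
control_post = control_pre + rng.normal(4.0, 2, 25)
exp_pre      = rng.normal(10, 2, 25)
exp_post     = exp_pre + rng.normal(2.5, 2, 25)

control_gain = control_post - control_pre
exp_gain = exp_post - exp_pre

t, p = stats.ttest_ind(control_gain, exp_gain, equal_var=False)  # Welch's t-test
print(f"Control gain {control_gain.mean():.1f} vs experimental gain {exp_gain.mean():.1f}")
print(f"Welch t = {t:.2f}, p = {p:.3f}")
```
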
Contributors: Applegate, Garrett Charles (Author) / Atkinson, Robert (Thesis director) / Chavez-Echeagaray, Maria Elena (Committee member) / Industrial, Systems (Contributor) / Barrett, The Honors College (Contributor)
Created: 2017-05
Description

Based on findings of previous studies, there was speculation that two well-known experimental design software packages, JMP and Design Expert, produced varying power outputs given the same design and user inputs. For context and scope, another popular experimental design software package, Minitab® Statistical Software version 17, was added to the comparison. The study compared multiple test cases run on the three software packages, focusing on 2^k and 3^k factorial designs and adjusting the standard deviation, effect size, number of categorical factors, levels, number of factors, and replicates. All six cases were run on all three programs, with attempts at one, two, and three replicates each. There was an issue at the one-replicate stage, however: Minitab does not allow full factorial designs with only one replicate, and Design Expert will not provide power outputs for a single replicate unless there are three or more factors. From the analysis of these results, it was concluded that the differences between JMP 13 and Design Expert 10 were well within the margin of error and likely caused by rounding. The differences between Minitab 17 and the other two packages, on the other hand, indicated a fundamental difference in the way Minitab performs its power calculations compared with the latest versions of JMP and Design Expert. This is likely because Minitab uses dummy variable coding as its default, instead of the orthogonal coding that is the default in the other two packages. Although dummy variable and orthogonal coding produce the same analysis results for factorial designs, the coding does affect the power calculations. All three programs can be adjusted to use either method of coding, but the exact instructions are difficult to find, so a follow-up guide on changing the coding of factorial variables would help address this issue.
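
The coding explanation above is the technical crux: orthogonal (-1/+1) coding and dummy (0/1) coding fit the same factorial model but give different coefficient variances, which feed into different power numbers. The sketch below illustrates this for a replicated 2^2 design; it is an illustrative calculation, not a reproduction of any package's power routine.

```python
# Sketch of the coding difference blamed for the power discrepancy:
# the same replicated 2^2 factorial coded two ways.  Orthogonal (-1/+1)
# coding gives X'X = n*I, so every effect has variance sigma^2/n; dummy
# (0/1) coding gives a non-diagonal X'X and larger coefficient variances,
# which leads to a different power calculation for the same design.
import numpy as np

levels = [(a, b) for a in (0, 1) for b in (0, 1)]
replicates = 2
runs = levels * replicates

def design_matrix(coding):
    rows = []
    for a, b in runs:
        if coding == "orthogonal":
            a, b = 2 * a - 1, 2 * b - 1      # recode 0/1 -> -1/+1
        rows.append([1, a, b, a * b])        # intercept, A, B, AB
    return np.array(rows, dtype=float)

for coding in ("orthogonal", "dummy"):
    X = design_matrix(coding)
    cov = np.linalg.inv(X.T @ X)             # coefficient covariance / sigma^2
    print(coding, "variance multipliers:", np.round(np.diag(cov), 3))
```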

Contributors: Armstrong, Julia Robin (Author) / McCarville, Daniel R. (Thesis director) / Montgomery, Douglas (Committee member) / Industrial, Systems (Contributor) / Barrett, The Honors College (Contributor)
Created: 2017-05
Description
In 2010, for the first time in human history, more than half of the world's total population lived in cities; this number is expected to increase to 60% or more by 2050. The goal of this research effort is to create a comprehensive model and modelling framework for megacities, middleweight cities, and urban agglomerations, collectively referred to as dense urban areas. The motivation for this project comes from the United States Army's desire for readiness in all operating environments, including dense urban areas. Though there is valuable insight in research to support Army operational behaviors, megacities are of unique interest to nearly every societal sector imaginable. A design of experiments is a novel application for determining both main effects and interaction effects between factors within a dense urban area, providing insight into factor causation. Regression modelling can also be employed for analysis of dense urban areas, providing wide-ranging insight into correlations between factors and their interactions. Past studies involving megacities concern themselves with general trends of cities and their operation. This study is unique in its effort to model a single megacity, enabling decision support for military operational planning as well as potential decision support for city planners seeking to increase the sustainability of these dense urban areas and megacities.
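
The abstract proposes regression modelling with interaction effects as one analysis route for a dense urban area. Purely as a hedged sketch, the code below fits a regression with an interaction term to synthetic data; the factor names (population density, infrastructure index, governance index) and the response are invented placeholders, not variables from the study.

```python
# Hedged sketch of the regression-with-interactions idea described above,
# fit to synthetic data; factor names and the response are placeholders,
# not variables from the study.
import numpy as np
import pandas as pd
from statsmodels.formula.api import ols

rng = np.random.default_rng(7)
n = 200
df = pd.DataFrame({
    "population_density": rng.uniform(2, 30, n),    # thousands per km^2
    "infrastructure_index": rng.uniform(0, 1, n),
    "governance_index": rng.uniform(0, 1, n),
})
# Synthetic response with a built-in density x infrastructure interaction.
df["stress"] = (0.5 * df.population_density
                - 8 * df.infrastructure_index
                - 0.3 * df.population_density * df.infrastructure_index
                + rng.normal(0, 2, n))

model = ols("stress ~ population_density * infrastructure_index + governance_index",
            data=df).fit()
print(model.summary().tables[1])   # main effects and the interaction term
```
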
Contributors: Mathesen, Logan Michael (Author) / Zenzen, Frances (Thesis director) / Jennings, Cheryl (Committee member) / Industrial, Systems (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05