Matching Items (3)
Description

This study measures the implementation of educational technology and its effect on learning and user experience. A demographic survey, a pretest/posttest, and an educational experience survey were used to collect data on the control and experimental groups. The experimental group received different learning material than the control group: it used the Elements 4D mobile application by Daqri to learn basic chemical elements and compounds, while the control group's material presented exactly the same information as the application in the 2D form of a printed packet. It was expected that the experimental group would outperform the control group and report a more enjoyable experience. After data analysis, it was concluded that, contrary to the hypothesis, the control group outperformed the experimental group and both groups reported similar experiences. Once the limitations of differing study durations, the need to learn the application beforehand, and memorization-only questions are addressed, the study can be conducted again. Improvements to the application may also alter future results and could ultimately lead to full implementation in a curriculum.
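For readers unfamiliar with this kind of analysis, the group comparison described above can be illustrated with a small sketch. This is not the study's actual analysis script; the gain scores below are invented placeholders used only to show the shape of the comparison.

import numpy as np
from scipy.stats import ttest_ind

# gain = posttest score minus pretest score, one value per participant (placeholder data)
control_gain      = np.array([12, 9, 15, 11, 14, 10, 13])   # printed 2D packet group
experimental_gain = np.array([ 8, 11,  7, 10,  9, 12,  6])  # Elements 4D app group

# Welch's t-test does not assume equal variances between the two groups
t_stat, p_value = ttest_ind(control_gain, experimental_gain, equal_var=False)
print(f"Welch t = {t_stat:.2f}, p = {p_value:.3f}")

A significantly positive t statistic in such a comparison would be consistent with the control group outperforming the experimental group on the knowledge measure.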
Contributors: Applegate, Garrett Charles (Author) / Atkinson, Robert (Thesis director) / Chavez-Echeagaray, Maria Elena (Committee member) / Industrial, Systems (Contributor) / Barrett, The Honors College (Contributor)
Created: 2017-05
Description

Based on findings of previous studies, there was speculation that two well-known experimental design software packages, JMP and Design Expert, produced varying power outputs given the same design and user inputs. For context and scope, another popular experimental design software package, Minitab® Statistical Software version 17, was added to the comparison. The study compared multiple test cases run on the three software packages, focusing on 2^k and 3^k factorial designs and adjusting the effect size (in standard deviations), the number of categorical factors, the number of levels, the number of factors, and the number of replicates. All six cases were run on all three programs, with an attempt to run each at one, two, and three replicates. There was an issue at the one-replicate stage, however: Minitab does not allow full factorial designs with only one replicate, and Design Expert will not provide power outputs for a single replicate unless there are three or more factors. From the analysis of these results, it was concluded that the differences between JMP 13 and Design Expert 10 were well within the margin of error and likely caused by rounding. The differences between JMP 13, Design Expert 10, and Minitab 17, on the other hand, indicated a fundamental difference in the way Minitab addresses power calculation compared to the latest versions of JMP and Design Expert. This was most likely caused by Minitab defaulting to dummy variable coding rather than the orthogonal coding used by default in the other two packages. Although dummy variable and orthogonal coding for factorial designs do not produce different analysis results, the coding method does affect the power calculations. All three programs can be adjusted to use either coding method, but exact instructions are difficult to find, so a follow-up guide on changing the coding for factorial variables would address this issue.
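The power comparison is easier to follow with the underlying calculation in view. Below is a minimal sketch (not code from the thesis) of a noncentral-F power computation for one main effect in a replicated 2^k full factorial, assuming orthogonal (-1/+1) coding so that an effect of size delta corresponds to a coefficient of delta/2; the function name, parameter values, and the assumption of fitting the full model are all illustrative choices.

from scipy.stats import f as f_dist, ncf

def main_effect_power(k, replicates, delta, sigma, alpha=0.05):
    """Approximate power to detect a main effect of size delta (difference
    between the high- and low-level means) with error standard deviation sigma,
    in a replicated 2^k full factorial under orthogonal (-1/+1) coding."""
    n_runs = replicates * 2 ** k            # total observations
    df_error = n_runs - 2 ** k              # full model uses 2^k terms
    if df_error <= 0:
        raise ValueError("full model with a single replicate leaves no error df")
    beta = delta / 2                        # coefficient under -1/+1 coding
    lam = n_runs * beta ** 2 / sigma ** 2   # noncentrality parameter
    f_crit = f_dist.ppf(1 - alpha, 1, df_error)
    return 1 - ncf.cdf(f_crit, 1, df_error, lam)

# Example: 2^3 design at two and three replicates, delta = 2*sigma
for r in (2, 3):
    print(r, "replicates:", round(main_effect_power(3, r, delta=2.0, sigma=1.0), 3))

Note that fitting the full model with a single replicate leaves zero error degrees of freedom, which mirrors the one-replicate restriction described above; software packages also differ in how a user-specified effect is mapped onto the coefficient scale under their default coding, which is one plausible source of the discrepancies the study observed.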

Contributors: Armstrong, Julia Robin (Author) / McCarville, Daniel R. (Thesis director) / Montgomery, Douglas (Committee member) / Industrial, Systems (Contributor) / Barrett, The Honors College (Contributor)
Created: 2017-05
Description

Access to clean drinking water has been identified by the National Academy of Engineering as one of the Grand Challenges of the 21st century. This thesis investigated clean drinking water access in the greater Phoenix area, specifically with regard to drinking water quality standards and management strategies. This research report provides an introduction to water quality, treatment, and management; a background on the Salt River Project; and an analysis of the source water mix and drinking water quality indicators for water delivered to Tempe, Arizona, water treatment facilities.
Contributors: Mercer, Rebecca Nicole (Author) / Ganesh, Tirupalavanam (Thesis director) / Trowbridge, Amy (Committee member) / Industrial, Systems (Contributor) / School of Sustainability (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-12