Description
In 2015, a new way to track baseball games was introduced to MLB, marking the beginning of the Statcast Revolution. This tracking technology brought with it a number of new statistics, including expected statistics, which estimate what a player’s results should be, on average, given the same actions. This is explored further in the paper that follows. While expected statistics are not intended to predict players’ future performance, I theorized that some relationship may exist, particularly for younger players. No research has yet been published on this topic, and if a correlation between expected statistics and future performance does exist, it would give teams a new way to project their players’ performance. To search for such a correlation, I computed the predictive accuracy of expected batting average and expected slugging percentage for 12 MLB players across their rookie through eighth-year seasons, then combined these measures to construct an interval in which I could be confident the correlation lay. Overall, I could not conclude with confidence that a correlation exists between the predictive accuracy of expected statistics and the length of time a player has played in MLB. While this conclusion does not offer insight into how to better predict a player’s future performance, the methodology and findings still present opportunities to better understand the predictive properties of expected statistics.
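The interval-construction step can be sketched with the standard Fisher z-transformation confidence interval for a correlation coefficient. This is only a plausible illustration of the approach, not the thesis's exact procedure, and the `r` and `n` values in the example are hypothetical, not the study's data.

```python
import math

def correlation_ci(r, n, alpha=0.05):
    """95% confidence interval for a Pearson correlation via the
    Fisher z-transformation (a standard method; the thesis's exact
    procedure may differ)."""
    z = 0.5 * math.log((1 + r) / (1 - r))   # Fisher transform of r
    se = 1.0 / math.sqrt(n - 3)             # standard error of z
    zcrit = 1.959963984540054               # 97.5th percentile of N(0, 1)
    lo, hi = z - zcrit * se, z + zcrit * se
    # Back-transform the endpoints to the correlation scale
    return math.tanh(lo), math.tanh(hi)

# Hypothetical example: an observed r of 0.4 across n = 12 players.
# The interval spans zero, so no correlation can be confirmed at the
# 95% level -- mirroring the kind of inconclusive result described above.
lo, hi = correlation_ci(0.4, 12)
```

With only 12 players, the standard error of the transformed correlation is large (1/3), which is why even a moderate observed correlation yields an interval that includes zero.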
Contributors: Edmiston, Alexander (Author) / Pavlic, Theodore (Thesis director) / Montgomery, Douglas (Committee member) / Barrett, The Honors College (Contributor) / Dean, W.P. Carey School of Business (Contributor) / Industrial, Systems & Operations Engineering Prgm (Contributor)
Created: 2024-05
Description
Complex systems appear when interaction among system components creates emergent behavior that is difficult to predict from component properties. The growth of the Internet of Things (IoT) and embedded technology has increased complexity across several sectors (e.g., automotive, aerospace, agriculture, city infrastructures, home technologies, healthcare), where the paradigm of cyber-physical systems (CPSs) has become standard. While CPSs enable unprecedented capabilities, they raise new challenges in system design, certification, control, and verification. When optimizing system performance, computationally expensive simulation tools are often required, and search algorithms that sequentially interrogate a simulator to learn promising solutions are in great demand. This class of algorithms is known as black-box optimization. However, the generality that makes black-box optimization desirable also causes computational efficiency difficulties when it is applied to real problems. This thesis focuses on Bayesian optimization, a prominent black-box optimization family, and proposes new principles, translated into implementable algorithms, to scale Bayesian optimization to highly expensive, large-scale problems. Four problem contexts are studied, and approaches are proposed for practically applying Bayesian optimization concepts, namely: (1) increasing the sample efficiency of a highly expensive simulator in the presence of other sources of information, where multi-fidelity optimization is used to leverage complementary information sources; (2) accelerating global optimization in the presence of local searches by avoiding over-exploitation with adaptive restart behavior; (3) scaling optimization to high-dimensional input spaces by integrating game-theoretic mechanisms with traditional techniques; (4) accelerating optimization by embedding function structure when the reward function is a minimum of several functions.
In the first context, this thesis produces two multi-fidelity algorithms, a sample-driven and a model-driven approach, which are implemented to optimize a serial production line; in the second context, the Stochastic Optimization with Adaptive Restart (SOAR) framework is produced and analyzed with multiple applications to CPS falsification problems; in the third context, the Bayesian optimization with sample fictitious play (BOFiP) algorithm is developed with an implementation in high-dimensional neural network training; in the last problem context, the minimum surrogate optimization (MSO) framework is produced and combined with both Bayesian optimization and the SOAR framework, with applications in simultaneous falsification of multiple CPS requirements.
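The basic Bayesian optimization loop underlying all four contexts can be sketched in a few lines: fit a Gaussian-process surrogate to the evaluations so far, maximize an acquisition function (here, expected improvement) to pick the next query, and repeat. This is a minimal 1-D illustration with an analytic test function, not any of the thesis's algorithms; the kernel length-scale, grid resolution, and budget are arbitrary choices.

```python
import numpy as np
from scipy.stats import norm

def rbf(a, b, ls=0.3):
    """Squared-exponential (RBF) kernel on 1-D inputs."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

def bayes_opt_min(f, lo, hi, n_init=4, n_iter=20, seed=0):
    """Minimize f on [lo, hi] with a GP surrogate and expected improvement."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lo, hi, n_init)           # initial random design
    y = np.array([f(x) for x in X])
    grid = np.linspace(lo, hi, 401)           # candidate points
    for _ in range(n_iter):
        K = rbf(X, X) + 1e-6 * np.eye(len(X))  # jitter for stability
        Ks = rbf(grid, X)
        mu = Ks @ np.linalg.solve(K, y)        # posterior mean on the grid
        v = np.linalg.solve(K, Ks.T)
        var = 1.0 - np.einsum('ij,ji->i', Ks, v)  # posterior variance
        sd = np.sqrt(np.clip(var, 1e-12, None))
        best = y.min()
        z = (best - mu) / sd
        # Expected improvement for minimization
        ei = (best - mu) * norm.cdf(z) + sd * norm.pdf(z)
        x_next = grid[int(np.argmax(ei))]      # most promising query
        X = np.append(X, x_next)
        y = np.append(y, f(x_next))
    i = int(np.argmin(y))
    return X[i], y[i]

# Toy objective standing in for an expensive simulator
x_best, y_best = bayes_opt_min(lambda x: (x - 0.3) ** 2, 0.0, 1.0)
```

The expense the thesis targets comes from each call to `f` being a full simulation run, which is why the loop spends effort choosing queries rather than sampling densely.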
Contributors: Mathesen, Logan (Author) / Pedrielli, Giulia (Thesis advisor) / Candan, Kasim (Committee member) / Fainekos, Georgios (Committee member) / Gel, Esma (Committee member) / Montgomery, Douglas (Committee member) / Zabinsky, Zelda (Committee member) / Arizona State University (Publisher)
Created: 2021
Description
Purpose: To investigate use of an improved ocular tear film analysis protocol (OPI 2.0) in the Controlled Adverse Environment (CAE℠) model of dry eye disease, and to examine the utility of new metrics in the identification of subpopulations of dry eye patients.
Methods: Thirty-three dry-eye subjects completed a single-center, single-visit, pilot CAE study. The primary endpoint was mean break-up area (MBA) as assessed by the OPI 2.0 system. Secondary endpoints included corneal fluorescein staining, tear film break-up time, and OPI 2.0 system measurements. Subjects were also asked to rate their ocular discomfort throughout the CAE. Dry eye endpoints were measured at baseline, immediately following a 90-minute CAE exposure, and again 30 minutes after exposure.
Results: The post-CAE measurements of MBA showed a statistically significant decrease from the baseline measurements. The decrease was relatively specific to those patients with moderate to severe dry eye, as measured by baseline MBA. Secondary endpoints, including palpebral fissure size, corneal staining, and redness, also showed significant changes when pre- and post-CAE measurements were compared. A correlation analysis identified specific associations between MBA, blink rate, and palpebral fissure size. Comparison of MBA responses allowed us to identify subpopulations of subjects who exhibited different compensatory mechanisms in response to CAE challenge. Of note, none of the measures of tear film break-up time showed statistically significant changes or correlations in pre- versus post-CAE measures.
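A correlation screen of the kind described above can be sketched with a simple Pearson analysis. The arrays below are synthetic stand-ins for the study's measurements, with an assumed positive association between MBA and blink rate built in purely for illustration.

```python
import numpy as np
from scipy.stats import pearsonr

# Synthetic per-subject metrics for n = 33 subjects -- illustrative
# values only, not the study's data.
rng = np.random.default_rng(2)
mba = rng.uniform(0.5, 3.0, 33)                     # mean break-up area
blink_rate = 10 + 4 * mba + rng.normal(0, 1.5, 33)  # assumed compensatory blinking

# Pearson correlation between the two metrics, with its p-value
r, p = pearsonr(mba, blink_rate)
```

In the actual study, such pairwise correlations (MBA vs. blink rate vs. palpebral fissure size) were what suggested distinct compensatory subpopulations.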
Conclusion: This pilot study confirms that the tear film metric MBA can detect changes in the ocular surface induced by a CAE, and that these changes are correlated with other, established measures of dry eye disease. The observed decrease in MBA following CAE exposure demonstrates that compensatory mechanisms are initiated during the CAE exposure, and that this compensation may provide the means to identify and characterize clinically relevant subpopulations of dry eye patients.
Created: 2012-11-12
Description
Purpose: To evaluate a new method of measuring ocular exposure in the context of a natural blink pattern through analysis of the variables tear film breakup time (TFBUT), interblink interval (IBI), and tear film breakup area (BUA).
Methods: The traditional methodology (Forced-Stare [FS]) measures TFBUT and IBI separately. TFBUT is measured under forced-stare conditions by an examiner using a stopwatch, while IBI is measured as the subject watches television. The new methodology (video capture manual analysis [VCMA]) involves retrospective analysis of video data of fluorescein-stained eyes taken through a slit lamp while the subject watches television, and provides TFBUT and BUA for each IBI during the 1-minute video under natural blink conditions. The FS and VCMA methods were directly compared in the same set of dry-eye subjects. The VCMA method was evaluated for the ability to discriminate between dry-eye subjects and normal subjects. The VCMA method was further evaluated in the dry-eye subjects for the ability to detect a treatment effect before, and 10 minutes after, bilateral instillation of an artificial tear solution.
Results: Ten normal subjects and 17 dry-eye subjects were studied. In the dry-eye subjects, the two methods differed with respect to mean TFBUTs (5.82 seconds, FS; 3.98 seconds, VCMA; P = 0.002). The FS variables alone (TFBUT, IBI) were not able to successfully distinguish between the dry-eye and normal subjects, whereas the additional VCMA variables, both derived and observed (BUA, BUA/IBI, breakup rate), were able to successfully distinguish between the dry-eye and normal subjects in a statistically significant fashion. TFBUT (P = 0.034) and BUA/IBI (P = 0.001) were able to distinguish the treatment effect of artificial tears in dry-eye subjects.
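The method comparison reported above is a paired design: each subject yields one TFBUT reading per method, so a paired test is the natural analysis. The sketch below uses synthetic readings, loosely shaped after the reported means (5.82 s vs. 3.98 s), not the study's actual data or its exact statistical procedure.

```python
import numpy as np
from scipy.stats import ttest_rel

# Synthetic paired TFBUT readings (seconds) for 17 dry-eye subjects --
# illustrative values only, not the study's measurements.
rng = np.random.default_rng(1)
fs_tfbut = rng.normal(5.8, 1.5, 17)                 # Forced-Stare readings
vcma_tfbut = fs_tfbut - rng.normal(1.8, 0.8, 17)    # VCMA assumed to read shorter

# Paired t-test on the per-subject differences
t_stat, p_value = ttest_rel(fs_tfbut, vcma_tfbut)
```

Because the two readings come from the same eye on the same visit, the paired test removes between-subject variability and gains power over an unpaired comparison.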
Conclusion: The VCMA methodology provides a clinically relevant analysis of tear film stability measured in the context of a natural blink pattern.
Created: 2011-09-21