Matching Items (16)
Description
The Experimental Data Processing (EDP) software is a C++ GUI-based application that streamlines the process of creating a model for structural systems based on experimental data. EDP is designed to process raw data, filter the data for noise and outliers, create a fitted model to describe that data, complete a probabilistic analysis to describe the variation between replicates of the experimental process, and analyze the reliability of a structural system based on that model. To help design the EDP software to perform the full analysis, the probabilistic and regression modeling aspects of this analysis have been explored. The focus has been on creating and analyzing probabilistic models for the data, adding multivariate and nonparametric fits to raw data, and developing computational techniques that allow these methods to be properly implemented within EDP. For creating a probabilistic model of replicate data, the normal, lognormal, gamma, Weibull, and generalized exponential distributions have been explored. Goodness-of-fit tests, including the chi-squared, Anderson-Darling, and Kolmogorov-Smirnov tests, have been used to analyze how well each of these probabilistic models describes the variation of parameters between replicates of an experimental test. An example using Young's modulus data from a Kevlar-49 swath stress-strain test demonstrates how this analysis is performed within EDP. To implement the distributions, numerical solutions for the gamma, beta, and hypergeometric functions were implemented, along with an arbitrary-precision library to store numbers that exceed the range of double-precision floating-point values. To create a multivariate fit, the multilinear solution was developed as the simplest solution to the multivariate regression problem. This solution was then extended to solve nonlinear problems that can be linearized into multiple separable terms. These problems were solved analytically with the closed-form solution for multilinear regression, and then numerically with a QR decomposition, which avoids the numerical instabilities associated with matrix inversion. For nonparametric regression, or smoothing, the loess method was developed as a robust technique for filtering noise while maintaining the general structure of the data points. The loess solution was created by addressing concerns associated with simpler smoothing methods, including the running mean, running line, and kernel smoothing techniques, and by combining the strengths of each of these methods to resolve those issues. The loess smoothing method involves weighting each point in a partition of the data set and then fitting either a line or a polynomial within that partition. Both linear and quadratic methods were applied to a carbon fiber compression test, showing that the quadratic model was more accurate but the linear model had a shape that was more effective for analyzing the experimental data. Finally, the EDP program itself was explored to consider its current functionality for processing data, as described by shear tests on carbon fiber data, and the future functionality to be developed. The probabilistic and raw data processing capabilities were demonstrated within EDP, and the multivariate and loess analyses were demonstrated using R.
As the functionality and relevant considerations for these methods have been developed, the immediate goal is to finish implementing and integrating these additional features into a version of EDP that performs a full streamlined structural analysis on experimental data.
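The QR-based regression step described above can be sketched outside of EDP. The snippet below is an illustrative Python example, not EDP's C++ implementation; the function name and the synthetic data are assumptions made purely for demonstration.

```python
import numpy as np

def multilinear_fit_qr(X, y):
    """Least-squares fit of y against the columns of X (plus an intercept)
    using a QR decomposition, so that (X^T X)^{-1} is never formed, which
    avoids the numerical instabilities associated with matrix inversion."""
    A = np.column_stack([np.ones(len(X)), X])   # design matrix with intercept
    Q, R = np.linalg.qr(A)                      # thin QR factorization
    beta = np.linalg.solve(R, Q.T @ y)          # R is square upper triangular
    return beta

# Synthetic illustration only (not the thesis data)
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))
y = 1.0 + 2.0 * X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.1, size=50)
print(multilinear_fit_qr(X, y))   # approximately [1.0, 2.0, -0.5]
```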
Contributors: Markov, Elan Richard (Author) / Rajan, Subramaniam (Thesis director) / Khaled, Bilal (Committee member) / Chemical Engineering Program (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Ira A. Fulton School of Engineering (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05
Description
The NFL is one of the largest and most influential industries in the world. In America, there are few companies that have a stronger hold on American culture or create such a phenomenon from year to year. This project aimed to develop a strategy that helps an NFL team be as successful as possible by defining which positions are most important to a team's success. Data from fifteen years of NFL games were collected, and information on every player in the league was analyzed. First, a benchmark describing an average team was established, and every player in the NFL was then compared to that average. Using the properties of linear regression with ordinary least squares, this project aims to define a model that shows each position's importance. Finally, once such a model had been established, the focus turned to the NFL draft, where the goal was to find a strategy for where each position should be drafted so that it is most likely to give the best payoff, based on the results of the regression in part one.
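As a rough illustration of the ordinary least squares step described above, the following Python sketch regresses a team-success measure on per-position performance scores; the position list, the synthetic data, and the success measure (season wins) are all assumptions for demonstration, not the thesis data or results.

```python
import numpy as np

# Hypothetical per-position performance scores (relative to the league
# average) regressed against season wins; positions, data, and weights
# below are illustrative only.
positions = ["QB", "RB", "WR", "OL", "DL", "LB", "DB"]

rng = np.random.default_rng(1)
X = rng.normal(size=(32 * 15, len(positions)))          # 32 teams x 15 seasons
true_weights = np.array([3.0, 0.8, 1.2, 1.5, 1.4, 1.0, 1.1])
wins = 8.0 + X @ true_weights + rng.normal(scale=1.5, size=len(X))

X_design = np.column_stack([np.ones(len(X)), X])         # add intercept column
beta, *_ = np.linalg.lstsq(X_design, wins, rcond=None)   # ordinary least squares
for name, coef in zip(["intercept"] + positions, beta):
    print(f"{name:>9}: {coef: .2f}")   # larger coefficient = position contributes more to wins
```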
Contributors: Balzer, Kevin Ryan (Author) / Goegan, Brian (Thesis director) / Dassanayake, Maduranga (Committee member) / Barrett, The Honors College (Contributor) / Economics Program in CLAS (Contributor) / School of Mathematical and Statistical Sciences (Contributor)
Created: 2015-05
Description
Exchange traded funds (ETFs) are in many ways similar to more traditional closed-end mutual funds, although they differ in a crucial way. ETFs rely on a creation and redemption feature to achieve their functionality, and this mechanism is designed to minimize the deviations that occur between an ETF's listed price and the net asset value of the ETF's underlying assets. However, while this does cause ETF deviations to be generally lower than those of their mutual fund counterparts, as our paper explores, this process does not eliminate these deviations completely. This article builds off an earlier paper by Engle and Sarkar (2006) that investigates these properties of premiums (discounts) of ETFs from their fair market value, and looks to see whether these premia have changed in the last 10 years. Our paper then diverges from the original and takes a deeper look into the standard deviations of these premia specifically.
Our findings show that over 70% of an ETF's standard deviation of premia can be explained through a linear combination of two variables: a categorical variable (Domestic [US], Developed, Emerging) and a discrete variable (time difference from the US). This paper also finds that more traditional metrics such as market cap, ETF price volatility, and even third-party market indicators such as the economic freedom index and investment freedom index are insignificant predictors of an ETF's standard deviation of premia. These findings differ somewhat from the existing literature, which indicates that these factors should have significant predictive power for an ETF's standard deviation of premia.
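A minimal sketch of the reported specification, regressing an ETF's standard deviation of premia on a market category and the time difference from the US, might look like the following; the data values, column names, and dummy-variable encoding are assumptions for illustration, not the paper's data set.

```python
import numpy as np
import pandas as pd

# Toy version of the reported specification: regress each ETF's standard
# deviation of premia on a market category and the time-zone difference
# from the US. All values below are invented for illustration.
df = pd.DataFrame({
    "category": ["Domestic", "Developed", "Emerging", "Developed", "Emerging"],
    "time_diff_hours": [0, 6, 9, 8, 12],
    "premium_std": [0.10, 0.45, 0.80, 0.55, 0.95],
})

X = pd.get_dummies(df[["category", "time_diff_hours"]],
                   columns=["category"], drop_first=True).astype(float)
X.insert(0, "intercept", 1.0)
y = df["premium_std"].to_numpy()

beta, *_ = np.linalg.lstsq(X.to_numpy(), y, rcond=None)   # ordinary least squares
print(dict(zip(X.columns, beta.round(3))))
```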
Contributors: Henning, Thomas Louis (Co-author) / Zhang, Jingbo (Co-author) / Simonson, Mark (Thesis director) / Wendell, Licon (Committee member) / School of Mathematical and Statistical Sciences (Contributor) / Department of Finance (Contributor) / Barrett, The Honors College (Contributor)
Created: 2019-05
Description
In the last decade, the population of honey bees across the globe has declined sharply, leaving scientists and beekeepers to wonder why. Amongst all nations, the United States has seen some of the greatest declines in the last 10-plus years. Without a definite explanation, the term Colony Collapse Disorder (CCD) was coined to describe the sudden and sharp decline of honey bee colonies that beekeepers were experiencing. Colony collapses have been rising above expected averages over the years, and losses during the winter season are even more severe than what is normally considered acceptable. There are some possible explanations pointing towards meteorological variables, diseases, and even pesticide usage. Despite the cause of CCD being unknown, thousands of beekeepers have reported their losses, and in the most recent years, even the numbers of infected colonies and colonies under certain stressors. Using the data reported to the United States Department of Agriculture (USDA), as well as weather data collected by the National Oceanic and Atmospheric Administration (NOAA) and the National Centers for Environmental Information (NCEI), regression analysis was used to find relationships between stressors in honey bee colonies, meteorological variables, and colony collapses during the winter months. The regression analysis focused on the winter season, or quarter 4 of the year, which includes the months of October, November, and December. In the model, the response variable was the percentage of colonies lost in quarter 4. Through the model, it was concluded that certain weather thresholds and the percentage increase of colonies under certain stressors were related to colony loss.
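A minimal sketch of the quarter-4 regression described above is given below; the specific stressor and weather predictors, their names, and all numbers are assumptions for illustration and are not the USDA/NOAA/NCEI data.

```python
import numpy as np

# Hypothetical quarter-4 model: percentage of colonies lost regressed on the
# percentage of colonies under selected stressors and on meteorological
# variables. All predictor names and values are invented for illustration.
rng = np.random.default_rng(2)
n = 200
varroa_pct    = rng.uniform(0, 60, n)    # % of colonies reporting varroa mites
pesticide_pct = rng.uniform(0, 30, n)    # % of colonies reporting pesticide exposure
min_temp_c    = rng.uniform(-15, 10, n)  # mean Q4 minimum temperature (deg C)

loss_pct = (5 + 0.3 * varroa_pct + 0.2 * pesticide_pct - 0.4 * min_temp_c
            + rng.normal(scale=3, size=n))

X = np.column_stack([np.ones(n), varroa_pct, pesticide_pct, min_temp_c])
beta, *_ = np.linalg.lstsq(X, loss_pct, rcond=None)   # ordinary least squares
print(beta)   # intercept and one coefficient per predictor
```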
Contributors: Vasquez, Henry Antony (Author) / Zheng, Yi (Thesis director) / Saffell, Erinanne (Committee member) / School of Mathematical and Statistical Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created: 2018-05
Description
Science fiction has a unique ability to express, analyze, and critique concepts in a subtle way that emphasizes a point while remaining entertaining to the audience. Because of this ability, science fiction has long been a powerful way to ask questions that would normally not be addressed. As such, this paper provides an overview of the effects of biomedical technology in science fiction films. The discussions in this paper will analyze the different portrayals of the technology in the viewed cinematic pieces and the effects they have on the characters in each film. The discussion will begin with the films whose technology is based in Genetic Engineering. This will then be followed by a discussion of the biomedical technology based in the fields of Endocrinology; Reanimation; Preservation; Prosthetics; Physical Metamorphosis; Super-Drugs and Super-Viruses; and Diagnostic, Surgical, and Monitoring Equipment. At the end of this paper, movie summaries are provided to assist in clarifying plot details.
Contributors: Grzybowski, Amanda Ann (Author) / Foy, Joseph (Thesis director) / Facinelli, Diane (Committee member) / Barrett, The Honors College (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Harrington Bioengineering Program (Contributor)
Created: 2014-05
Description
This paper begins by discussing the potential uses and challenges of efficient and accurate traffic forecasting. The data we used include traffic volume from seven locations on a busy Athens street in April and May of 2000; these data were used as part of a traffic forecasting competition. Our initial observation was that, due to the volatility and oscillating nature of daily traffic volume, simple linear regression models will not perform well in predicting the time-series data. For this we present the Harmonic Time Series model. Such a model (assuming all predictors are significant) will include a sinusoidal term for each time index within a period of data. Our assumption is that traffic volumes have a period of one week (which is evidenced by the graphs reproduced in our paper). This leads to a model that has 6,720 sine and cosine terms. This is clearly too many coefficients, so, in an effort to avoid over-fitting and keep the model efficient, we apply the subset-selection algorithm known as the Adaptive Lasso.
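As a hedged sketch of the approach described above, the snippet below builds weekly harmonic (sine/cosine) features and prunes them with an L1 penalty; the sampling rate, the synthetic traffic signal, and the use of plain lasso (rather than the adaptive lasso applied in the thesis) are all simplifying assumptions.

```python
import numpy as np
from sklearn.linear_model import Lasso

# Synthetic traffic-like series; the real data and the adaptive-lasso
# weighting scheme from the thesis are not reproduced here.
P = 7 * 24                        # assume hourly counts, so one weekly period = 168 points
n_weeks = 8
t = np.arange(P * n_weeks)

rng = np.random.default_rng(3)
y = (100
     + 30 * np.sin(2 * np.pi * t / 24)      # daily cycle
     + 15 * np.sin(2 * np.pi * t / P)       # weekly cycle
     + rng.normal(scale=5, size=t.size))

# One sine and one cosine column for each harmonic of the weekly period
K = P // 2
X = np.column_stack(
    [f(2 * np.pi * k * t / P) for k in range(1, K + 1) for f in (np.sin, np.cos)]
)

model = Lasso(alpha=1.0, max_iter=10000).fit(X, y)   # L1 penalty zeroes out unneeded harmonics
print("nonzero harmonic terms:", np.count_nonzero(model.coef_), "of", X.shape[1])
```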
Contributors: Mora, Juan (Author) / Kamarianakis, Ioannis (Thesis director) / Yu, Wanchunzi (Committee member) / W. P. Carey School of Business (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created: 2017-05
Description
Modern studies of sciatic nerves often use imaging techniques that can clearly find myelinated axons (Group A and Group B) and analyze their properties, but have trouble with the more numerous Remak fibers (Group C). In this paper, Group A and B fibers, as well as Remak fibers, are analyzed using osmium tetroxide staining and transmission electron microscopy. Using this method, nerves were imaged under various electrical stimulation conditions: with a cuff electrode attached, with a stimulator attached, and with both, with images taken at the center of the nerve and at its ends. The number of Remak fibers and the area they occupy were analyzed, along with the g-ratios of the Group A and B fibers; these measures, together with vacuolization and the mitochondria present, help assess the overall health of the fibers. While some important information was gained from this evaluation, further testing is needed to improve the myelin detection system and to analyze the relevant Remak fibers and the role they play. The research examines the material thoroughly so that it can serve as a guide for further experimentation with electrical stimuli, and notes the differences found within and between the various groups, points of observation, and stimuli. Nevertheless, this research provides a strong look at the benefits of transmission electron microscopy and its ability to assess the effects of electrical stimulation.
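For reference, the g-ratio mentioned above is the ratio of the inner (axon) diameter to the outer diameter of the fiber including its myelin sheath; the minimal sketch below uses made-up measurements purely for illustration.

```python
def g_ratio(axon_diameter_um: float, fiber_diameter_um: float) -> float:
    """g-ratio of a myelinated fiber: inner (axon) diameter divided by the
    outer (axon + myelin) diameter; values around 0.6-0.7 are commonly
    cited as typical for healthy myelination."""
    return axon_diameter_um / fiber_diameter_um

print(g_ratio(4.0, 6.0))   # illustrative measurements -> ~0.67
```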
Contributors: Nambiar, Karthik (Author) / Muthuswamy, Jitendran (Thesis director) / Towe, Bruce (Committee member) / Harrington Bioengineering Program (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created: 2018-05
Description
The role of retention and forgetting of context-dependent sensorimotor memory of dexterous manipulation was explored. Human subjects manipulated a U-shaped object by switching the handle to be grasped (context) three times, and then came back two weeks later to lift the same object in the opposite context relative to that experienced on the last block. On each context switch, interference from the previous block of trials was found, resulting in manipulation errors (object tilt). However, no significant re-learning was found two weeks later for the first block of trials (p = 0.826), indicating that the previously observed interference among contexts lasted a very short time. Interestingly, upon switching to the other context, sensorimotor memories again interfered with visually based planning; that is, the memory of lifting in the first context somehow blocked the memory of lifting in the second context. In addition, performance in the first trial two weeks later and in the previous trial of the same context was not significantly different (p = 0.159), which means that subjects are able to retain long-term sensorimotor memories. Lastly, the last four trials in which subjects switched contexts were not significantly different from each other (p = 0.334), meaning that the interference from sensorimotor memories of lifting in opposite contexts became weaker, eventually leading to the attainment of steady performance.
Contributors: Gaw, Nathan Benjamin (Author) / Santello, Marco (Thesis director) / Helms Tillery, Stephen (Committee member) / Buneo, Christopher (Committee member) / Barrett, The Honors College (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Harrington Bioengineering Program (Contributor)
Created: 2013-05
Description
Rupture of intracranial aneurysms causes a subarachnoid hemorrhage, which is often a lethal health event. A minimally invasive method of solving this problem may involve a material that can be administered as a liquid and then becomes a strong solid within minutes, preventing the flow of blood into the aneurysm. Here we report on the development of temperature-responsive copolymers, which are deliverable through a microcatheter at body temperature and then rapidly cure to form a highly elastic hydrogel. To our knowledge, this is the first physically and chemically crosslinked hydrogel capable of rapid crosslinking at temperatures above the gel transition temperature. The polymer system, poly(N-isopropylacrylamide-co-cysteamine-co-Jeffamine® M-1000 acrylamide) and poly(ethylene glycol) diacrylate, was evaluated in wide-neck aneurysm flow models to assess the stability of the hydrogels. Investigation of this polymer system indicates that the Jeffamine® M-1000 causes the gels to retain water, resulting in gels that are initially weak and viscous but become stronger and more elastic after chemical crosslinking.
Contributors: Lee, Elizabeth Jean (Author) / Vernon, Brent (Thesis director) / Brennecka, Celeste (Committee member) / Overstreet, Derek (Committee member) / Barrett, The Honors College (Contributor) / School of Mathematical and Statistical Sciences (Contributor)
Created: 2013-05
Description
The development of computational systems known as brain-computer interfaces (BCIs) offers the possibility of allowing individuals disabled by neurological disorders such as Amyotrophic Lateral Sclerosis (ALS) and ischemic stroke to perform relatively complex tasks such as communicating with others and walking. BCIs are closed-loop systems that record physiological signals from the brain and translate those signals into commands that control an external device such as a wheelchair or a robotic exoskeleton. Despite the potential for BCIs to vastly improve the lives of almost one billion people, one question arises: Just because we can use brain-computer interfaces, should we? The human brain is an embodiment of the mind, which is largely seen to determine a person's identity, so a number of ethical and philosophical concerns emerge over current and future uses of BCIs. These concerns include privacy, informed consent, autonomy, identity, enhancement, and justice. In this thesis, I focus on three of these issues: privacy, informed consent, and autonomy. The ultimate purpose of brain-computer interfaces is to provide patients with a greater degree of autonomy; thus, many of the ethical issues associated with BCIs are intertwined with autonomy. Currently, brain-computer interfaces exist mainly in the domain of medicine and medical research, but companies have recently started commercializing BCIs and providing them at affordable prices. These consumer-grade BCIs are intended primarily for non-medical purposes and so fall beyond the scope of medicine. As BCIs become more widespread in the near future, it is crucial for interdisciplinary teams of ethicists, philosophers, engineers, and physicians to collaborate to address these ethical concerns now, before BCIs become more commonplace.
Contributors: Chu, Kevin Michael (Author) / Ankeny, Casey (Thesis director) / Robert, Jason (Committee member) / Frow, Emma (Committee member) / Harrington Bioengineering Program (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Barrett, The Honors College (Contributor) / School for the Future of Innovation in Society (Contributor) / Lincoln Center for Applied Ethics (Contributor)
Created: 2016-05