Description
Since 1994, the Performance Based Studies Research Group (PBSRG) at Arizona State University has utilized an approach to industry called Best Value (BV). Since its origin, this approach has been used in 1,860 tests, delivering $6.4 billion of projects and services at a customer satisfaction rating of 95%. Best Value is rooted in simplicity, and seeks to help organizations hire experts, plan ahead, minimize risk, and optimize resources. This is accomplished largely through the use of a tool the PBSRG calls the Kashiwagi Solution Model (KSM). Kashiwagi Solution Models can be used across every industry, from construction to Wall Street, to help achieve sustainable success in what is perhaps the most efficient and effective manner available today. Using Best Value and the Kashiwagi Solution Model, the author identified groups on Wall Street and throughout the world who deal in a unique entity called "Over-The-Counter (OTC) Derivatives". More specifically, this paper focuses on the current status and ramifications of derivative contracts that two parties enter with the sole intention of speculating. KSMs are used in Information Measurement Theory, which seeks to take seemingly complex subjects and simplify them into terms that everyone can understand. This document uses Information Measurement Theory to explain what OTC derivatives are in the simplest possible way, so that little prior knowledge of finance is required to understand the material. Through research and observation, KSMs can be used to identify the characteristics of groups who deal in OTC derivatives, which contributed to the financial crisis in 2008 and have grown in size and complexity. This document uses dominant information in order to see the potential problems within the OTC derivatives market from 30,000 feet, and to offer solutions to those problems.
Keywords: simplicity, best value approach, identify characteristics, dominant information
ContributorsBills, Andrew Marius (Author) / Kashiwagi, Dean (Thesis director) / Kashiwagi, Jacob (Committee member) / Rivera, Alfredo (Committee member) / Department of Finance (Contributor) / W. P. Carey School of Business (Contributor) / Barrett, The Honors College (Contributor)
Created2016-05
Description
The Experimental Data Processing (EDP) software is a C++ GUI-based application that streamlines the process of creating a model for structural systems based on experimental data. EDP is designed to process raw data, filter the data for noise and outliers, create a fitted model to describe that data, complete a probabilistic analysis to describe the variation between replicates of the experimental process, and analyze the reliability of a structural system based on that model. In order to help design the EDP software to perform the full analysis, the probabilistic and regression modeling aspects of this analysis have been explored. The focus has been on creating and analyzing probabilistic models for the data, adding multivariate and nonparametric fits to raw data, and developing computational techniques that allow these methods to be properly implemented within EDP. For creating a probabilistic model of replicate data, the normal, lognormal, gamma, Weibull, and generalized exponential distributions have been explored. Goodness-of-fit tests, including the chi-squared, Anderson-Darling, and Kolmogorov-Smirnov tests, have been used to analyze the effectiveness of each of these probabilistic models in describing the variation of parameters between replicates of an experimental test. An example using Young's modulus data for a Kevlar-49 swath stress-strain test was used to demonstrate how this analysis is performed within EDP. In order to implement the distributions, numerical solutions for the gamma, beta, and hypergeometric functions were implemented, along with an arbitrary-precision library to store numbers that exceed the range and precision of double-precision floating-point numbers. To create a multivariate fit, the multilinear solution was created as the simplest solution to the multivariate regression problem. This solution was then extended to solve nonlinear problems that can be linearized into multiple separable terms.
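The goodness-of-fit workflow described above can be illustrated with a short, standalone sketch. This is not EDP's own implementation; it is a minimal Python example using SciPy, with simulated Young's modulus values standing in for real replicate data. It fits a lognormal distribution to the replicates and applies the Kolmogorov-Smirnov test to judge how well the fitted distribution describes them.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Simulated Young's modulus replicates (GPa); illustrative values only,
# not the Kevlar-49 data referenced in the abstract.
samples = rng.lognormal(mean=np.log(112.0), sigma=0.05, size=30)

# Fit a lognormal distribution to the replicate data (location fixed at 0).
shape, loc, scale = stats.lognorm.fit(samples, floc=0)

# Kolmogorov-Smirnov test: compare the empirical CDF of the replicates
# against the CDF of the fitted lognormal distribution.
statistic, p_value = stats.kstest(samples, "lognorm", args=(shape, loc, scale))
print(f"KS statistic = {statistic:.4f}, p-value = {p_value:.4f}")
```

A small KS statistic (and a large p-value) indicates the fitted distribution is consistent with the observed variation between replicates; the same comparison can be repeated for the normal, gamma, Weibull, and generalized exponential candidates to select the best-fitting model.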
These problems were solved analytically with the closed-form solution for the multilinear regression, and then numerically using a QR decomposition, which avoids the numerical instabilities associated with matrix inversion. For nonparametric regression, or smoothing, the loess method was developed as a robust technique for filtering noise while maintaining the general structure of the data points. The loess solution was created by addressing the shortcomings of simpler smoothing methods, including the running mean, running line, and kernel smoothing techniques, and combining the strengths of each. The loess smoothing method involves weighting each point in a partition of the data set and then fitting either a line or a polynomial within that partition. Both linear and quadratic methods were applied to a carbon fiber compression test, showing that the quadratic model was more accurate but the linear model had a shape that was more effective for analyzing the experimental data. Finally, the EDP program itself was explored to consider its current functionalities for processing data, as described by shear tests on carbon fiber data, and the future functionalities to be developed. The probabilistic and raw data processing capabilities were demonstrated within EDP, and the multivariate and loess analyses were demonstrated using R. As the functionality and relevant considerations for these methods have been developed, the immediate goal is to finish implementing and integrating these additional features into a version of EDP that performs a full streamlined structural analysis on experimental data.
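The two multilinear solution paths mentioned above, the closed-form normal equations and the QR decomposition, can be compared in a brief sketch. This is an illustrative Python example on synthetic data, not EDP's C++ code; the coefficients and noise level are assumptions chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic multilinear data: y = 2 + 3*x1 - 1.5*x2 + noise (illustrative).
n = 200
X = np.column_stack([np.ones(n), rng.uniform(0, 10, n), rng.uniform(0, 10, n)])
y = X @ np.array([2.0, 3.0, -1.5]) + rng.normal(0, 0.1, n)

# Closed-form solution via the normal equations: beta = (X^T X)^-1 X^T y.
# Forming and inverting X^T X squares the condition number of X.
beta_closed = np.linalg.inv(X.T @ X) @ (X.T @ y)

# QR decomposition: X = QR, then solve the triangular system R beta = Q^T y.
# This avoids forming X^T X and is numerically more stable.
Q, R = np.linalg.qr(X)
beta_qr = np.linalg.solve(R, Q.T @ y)

print(beta_qr)  # both estimates recover coefficients near [2, 3, -1.5]
assert np.allclose(beta_closed, beta_qr)
```

On a well-conditioned problem like this, the two routes agree closely; the QR route matters most when the design matrix is nearly collinear, which is exactly when inverting X^T X becomes unstable.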
ContributorsMarkov, Elan Richard (Author) / Rajan, Subramaniam (Thesis director) / Khaled, Bilal (Committee member) / Chemical Engineering Program (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Ira A. Fulton School of Engineering (Contributor) / Barrett, The Honors College (Contributor)
Created2016-05