Matching Items (160)

136255-Thumbnail Image.png
Description
Over the course of six months, we have worked in partnership with Arizona State University and a leading producer of semiconductor chips in the United States market (referred to as the "Company"), lending our skills in finance, statistics, model building, and external insight. We attempt to design models that help predict how much time it takes to implement a cost-saving project. These projects had previously been considered only on the merit of cost savings, but with the added dimension of time, we hope to forecast implementation time as a function of a number of variables. With such a forecast, we can then apply it to an expense project prioritization model that relates time and cost savings, compares many different projects simultaneously, and returns a series of present value calculations over different ranges of time. The goal is twofold: assist with an accurate prediction of a project's time to implementation, and provide a basis to compare different projects based on their present values, ultimately helping to reduce the Company's manufacturing costs and improve gross margins. We believe this approach, and the research conducted toward this goal, is most valuable for the Company. Two coaches from the Company have provided assistance and clarified our questions when necessary throughout our research. In this paper, we begin by defining the problem, setting an objective, and establishing a checklist to monitor our progress. Next, our attention shifts to the data: making observations, trimming the dataset, and framing and scoping the variables to be used in the analysis portion of the paper. Before forming a hypothesis, we perform a preliminary statistical analysis of certain individual variables to enrich our variable selection process. After the hypothesis, we run multiple linear regressions with project duration as the dependent variable. After regression analysis and a test for robustness, we shift our focus to an intuitive model based on rules of thumb. We relate these models to an expense project prioritization tool developed in Microsoft Excel. Our deliverables to the Company come in the form of (1) a rules-of-thumb intuitive model and (2) an expense project prioritization tool.
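As a rough illustration of the prioritization logic this abstract describes, the sketch below combines a project's forecast implementation time with its annual cost savings to produce a present value for ranking. The function, discount rate, horizon, and all figures are assumptions for illustration, not the authors' actual model or data.

```python
# Hypothetical sketch (not the authors' model) of the prioritization idea:
# discount each project's annual cost savings, which begin only after the
# forecast implementation time, and rank projects by present value.

def project_present_value(annual_savings, implement_months,
                          horizon_years=5, rate=0.10):
    """PV of `horizon_years` of annual savings starting after `implement_months`."""
    start = implement_months / 12  # years until savings begin
    return sum(annual_savings / (1 + rate) ** (start + year)
               for year in range(1, horizon_years + 1))

# Illustrative inputs only; savings figures and durations are assumptions.
projects = {"Project A": (500_000, 6), "Project B": (650_000, 18)}
ranked = sorted(projects.items(),
                key=lambda kv: project_present_value(*kv[1]), reverse=True)
for name, (savings, months) in ranked:
    print(f"{name}: PV = ${project_present_value(savings, months):,.0f}")
```

Under this kind of model, a project with smaller savings but a shorter implementation time can outrank a slower, higher-savings project, which is the time-and-cost trade-off the tool is meant to surface.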
Contributors: Al-Assi, Hashim (Co-author) / Chiang, Robert (Co-author) / Liu, Andrew (Co-author) / Ludwick, David (Co-author) / Simonson, Mark (Thesis director) / Hertzel, Michael (Committee member) / Barrett, The Honors College (Contributor) / Department of Information Systems (Contributor) / Department of Finance (Contributor) / Department of Economics (Contributor) / Department of Supply Chain Management (Contributor) / School of Accountancy (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Mechanical and Aerospace Engineering Program (Contributor) / WPC Graduate Programs (Contributor)
Created: 2015-05
136147-Thumbnail Image.png
Description
This paper examines defined contribution 401(k) plans in the United States to analyze whether participants who pay more for administration services, advisory services, and investments receive plans with the better plan characteristics defined in this study. By collecting and analyzing Form 5500 and audit data, I find no relation between how much a plan and its participants pay in recordkeeping, advisory, and investment fees and the plan characteristics they receive with regard to active/passive allocation, revenue share, and fund performance.
Contributors: Aziz, Julian (Author) / Wahal, Sunil (Thesis director) / Bharath, Sreedhar (Committee member) / Barrett, The Honors College (Contributor) / Department of Information Systems (Contributor) / Department of Finance (Contributor)
Created: 2015-05
133198-Thumbnail Image.png
Description
The Financial Intelligence Pays Off blog is an easy-to-use blog for high school juniors and seniors and college students to access in order to receive a quick overview of essential financial topics. There are many sources and college courses students can take to gain a more in-depth understanding of topics such as saving, filing taxes, and learning about credit, but many times students do not know about these courses. Moreover, such courses are often restricted to business majors, and online sources sometimes use terminology too technical for young adults to follow. The goal of this blog is to give students a quick overview of what taxes are, how to manage and maintain a good credit score, how to keep a budget, and other essential financial tasks. Five topics are covered in the blog, along with resources for students who would like more information on a topic.
Contributors: Favata, Danielle (Co-author) / Perez-Vargas, Sofia (Co-author) / Sadusky, Brian (Thesis director) / Hoffman, David (Committee member) / WPC Graduate Programs (Contributor) / School of Accountancy (Contributor) / Department of Information Systems (Contributor) / Department of Finance (Contributor) / Barrett, The Honors College (Contributor)
Created: 2018-12
132456-Thumbnail Image.png
Description
This paper seeks to emphasize how the presence of uncertainty, speculation, and leverage work in concert within the stock market to exacerbate crashes in a cyclical market. It analyzes three major stock market events: the crash of Oct. 19, 1987, "Black Monday"; the dotcom bust, from 1999 to 2002; and the subprime mortgage crisis, from 2007 to 2010. Within each event period I define determinants or measurements of uncertainty, speculation, and leverage. Analysis of how these three concepts functioned during boom and bust will highlight how their presence can amplify the magnitude of a crash. This paper postulates that the amount of leverage during a crash determines how long-lasting its effects will be. This theory is fortified by extensive research and interviews with stock market experts who had a front-row view of the discussed crises.
Contributors: Graff, Veronica Camille (Author) / Leckey, Andrew (Thesis director) / Cohen, Sarah (Committee member) / Historical, Philosophical & Religious Studies (Contributor) / Walter Cronkite School of Journalism & Mass Comm (Contributor) / Dean, W.P. Carey School of Business (Contributor) / Barrett, The Honors College (Contributor)
Created: 2019-05
133950-Thumbnail Image.png
Description
When making investment decisions, many different indicators are taken into consideration before picking a stock/corporation to invest in (retail or institutional). Traditionally these indicators tend to be financial measures such as earnings per share, price-to-earnings ratio, price-to-book-value ratio, and dividend yield/payout ratio. Often these indicators do not take into consideration the actual operating intricacies of a company, as they are simply based on historical financial statements, thus limiting an investor's decision-making ability. In this paper I analyze several companies' stock performance to see whether analyzing operational factors such as supply chain management before making an investment decision would have resulted in a profitable investment and thus proved a reliable investment indicator. To do this I focused my analysis over a period of five years on two companies within each of three industries: fast food, processing, and e-commerce. These industries were selected because the nature of their businesses requires intensive supply chains, so this strategy would be most applicable to them, as opposed to a software or IT company. Of the two companies selected from each industry, one would be listed and analyzed in Gartner's "Annual Supply Chain Top 25" ranking while the other would not be. The Gartner ranking served as a measure of whether a company had a good supply chain. These companies then had their traditional financial metrics evaluated to see if supply chain analysis indirectly encapsulated some of these metrics as well. The goal of this analysis was to find whether there was a strong correlation between companies listed in Gartner's ranking and strong stock performance. If so, this would suggest that investors can capture a benefit by using supply chain analysis as an indicator when making investment decisions.
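A minimal sketch of the pairwise comparison this abstract describes might look like the following; all tickers, prices, and pairings are illustrative placeholders, not the paper's actual sample or findings.

```python
# Illustrative sketch of the industry-pair comparison: for each industry,
# contrast the five-year price return of the Gartner-listed company with
# its non-listed peer. Tickers and prices are made-up placeholders.

def total_return(start_price, end_price):
    """Simple five-year price return, ignoring dividends."""
    return (end_price - start_price) / start_price

# (ticker, price five years ago, price today) for each hypothetical pair
pairs = {
    "Fast food":  {"Gartner-listed": ("AAA", 97.5, 156.0),
                   "Not listed":     ("BBB", 90.0, 116.4)},
    "E-commerce": {"Gartner-listed": ("CCC", 40.0, 88.0),
                   "Not listed":     ("DDD", 35.0, 51.0)},
}
for industry, pair in pairs.items():
    for status, (ticker, p0, p1) in pair.items():
        print(f"{industry:10s} {status:15s} {ticker}: {total_return(p0, p1):6.1%}")
```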
Contributors: Thompson, Tyler Thomas (Author) / Kellso, James (Thesis director) / Smith, Geoffrey (Committee member) / Department of Finance (Contributor) / Department of Supply Chain Management (Contributor) / Barrett, The Honors College (Contributor)
Created: 2018-05
132832-Thumbnail Image.png
Description
Exchange traded funds (ETFs) are in many ways similar to more traditional closed-end mutual funds, although they differ in a crucial way. ETFs rely on a creation and redemption feature to achieve their functionality, and this mechanism is designed to minimize the deviations that occur between the ETF's listed price and the net asset value of the ETF's underlying assets. However, while this does cause ETF deviations to be generally lower than those of their mutual fund counterparts, as our paper explores, this process does not eliminate these deviations completely. This article builds off an earlier paper by Engle and Sarkar (2006) that investigates these properties of premiums (discounts) of ETFs from their fair market value, and looks to see whether these premia have changed in the last 10 years. Our paper then diverges from the original and takes a deeper look into the standard deviations of these premia specifically.

Our findings show that over 70% of an ETF's standard deviation of premia can be explained through a linear combination of two variables: a categorical variable (Domestic [US], Developed, Emerging) and a discrete variable (time difference from the US). This paper also finds that more traditional metrics such as market cap and ETF price volatility, and even third-party market indicators such as the economic freedom index and investment freedom index, are insignificant predictors of an ETF's standard deviation of premia when combined with the categorical variable. These findings differ somewhat from the existing literature, which indicates that these factors should have a significant impact on the predictive ability of an ETF's standard deviation of premia.
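A minimal sketch of this kind of categorical-plus-discrete specification, assuming a hypothetical DataFrame `etfs` whose column names and values are invented for illustration (they are not the paper's data), might look like:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: standard deviation of premia, market category, and
# time-zone difference from the US. All values are made up for illustration.
etfs = pd.DataFrame({
    "std_premia":    [0.12, 0.15, 0.45, 0.52, 0.88, 0.95],
    "category":      ["US", "US", "Developed", "Developed",
                      "Emerging", "Emerging"],
    "hours_from_us": [0, 0, 6, 7, 12, 13],
})

# C() expands the category into dummy variables, mirroring the paper's
# categorical-plus-discrete specification.
model = smf.ols("std_premia ~ C(category) + hours_from_us", data=etfs).fit()
print(model.params)
print("R-squared:", round(model.rsquared, 3))  # the paper reports > 70%
```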
Contributors: Zhang, Jingbo (Co-author) / Henning, Thomas (Co-author) / Simonson, Mark (Thesis director) / Licon, L. Wendell (Committee member) / Department of Finance (Contributor) / Department of Information Systems (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created: 2019-05
132834-Thumbnail Image.png
Description
Exchange traded funds (ETFs) are in many ways similar to more traditional closed-end mutual funds, although they differ in a crucial way. ETFs rely on a creation and redemption feature to achieve their functionality, and this mechanism is designed to minimize the deviations that occur between the ETF's listed price and the net asset value of the ETF's underlying assets. However, while this does cause ETF deviations to be generally lower than those of their mutual fund counterparts, as our paper explores, this process does not eliminate these deviations completely. This article builds off an earlier paper by Engle and Sarkar (2006) that investigates these properties of premiums (discounts) of ETFs from their fair market value, and looks to see whether these premia have changed in the last 10 years. Our paper then diverges from the original and takes a deeper look into the standard deviations of these premia specifically.

Our findings show that over 70% of an ETF's standard deviation of premia can be explained through a linear combination of two variables: a categorical variable (Domestic [US], Developed, Emerging) and a discrete variable (time difference from the US). This paper also finds that more traditional metrics such as market cap and ETF price volatility, and even third-party market indicators such as the economic freedom index and investment freedom index, are insignificant predictors of an ETF's standard deviation of premia. These findings differ somewhat from the existing literature, which indicates that these factors should have a significant impact on the predictive ability of an ETF's standard deviation of premia.
Contributors: Henning, Thomas Louis (Co-author) / Zhang, Jingbo (Co-author) / Simonson, Mark (Thesis director) / Licon, L. Wendell (Committee member) / School of Mathematical and Statistical Sciences (Contributor) / Department of Finance (Contributor) / Barrett, The Honors College (Contributor)
Created: 2019-05
132835-Thumbnail Image.png
Description
This paper classifies private equity groups (PEGs) seeking to engage in public-to-private transactions (PTPs) and determines, primarily through an examination of the implied merger arbitrage spread, whether certain reputational factors associated with the private equity industry affect a firm's ability to acquire a publicly traded company. We use a sample of 1,027 US-based take-private transactions announced between January 5, 2009 and August 2, 2018, of which 333 are private-equity-led take-privates, to investigate how merger arbitrage spreads, offer premiums, and deal closure are impacted by PEG- and PTP-specific input variables. We find that the merger arbitrage spreads of PEG-backed deals are 2-3% wider than those of strategic deals, hostile deals have greater merger arbitrage spreads, larger bid premiums widen spreads, and markets accurately identify deals that will close through a narrower spread. PEG deals, friendly deals, and larger deals offer lower premiums. Offer premiums are 8.2% larger among deals that eventually consummate. In a logistic regression, we identify that PEG deals are less likely to close than strategic deals; however, friendly deals are much more likely to close, and Mega Funds are more likely to consummate deals than their PEG peers. These findings support previous research on PTP deals. The insignificance of PEG-classified variables on arbitrage spreads and premiums suggests that investors do not differentiate PEG-backed deals by PEG, due to most PEGs' equal ability to raise competitive financing. However, Mega Funds are more likely to close deals, and thus we identify that merger arbitrage spreads should be narrower among this PEG classification.
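For reference, a minimal sketch of the implied merger arbitrage spread the paper centers on is below; the prices are illustrative assumptions, not figures from the paper's sample.

```python
# Minimal sketch of the implied merger arbitrage spread: the percentage gap
# between the announced offer price and the target's post-announcement
# market price. Prices below are illustrative assumptions.

def implied_arb_spread(offer_price, market_price):
    """Spread as a fraction of the target's current market price."""
    return (offer_price - market_price) / market_price

# A $50.00 cash offer against a $47.50 market price implies a 5.26% spread;
# the paper finds PEG-backed deals trade 2-3% wider than strategic deals.
print(f"{implied_arb_spread(50.00, 47.50):.2%}")
```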
Contributors: Sliwicki, Austin James (Co-author) / Schifman, Eli (Co-author) / Simonson, Mark (Thesis director) / Hertzel, Michael (Committee member) / Department of Economics (Contributor) / School of Accountancy (Contributor) / Barrett, The Honors College (Contributor)
Created: 2019-05
133824-Thumbnail Image.png
Description
Autonomous vehicles (AVs) are capable of producing massive amounts of real-time and precise data. This data has the ability to present new business possibilities across a vast number of markets, ranging from simple applications to unprecedented use cases. With this in mind, the three main objectives we sought to accomplish in our thesis were to: (1) understand whether there is monetization potential in autonomous vehicle data; (2) create a financial model detailing the viability of AV data monetization; and (3) discover how a particular company (Company X) can take advantage of this opportunity, and outline how that company might access this autonomous vehicle data.
Contributors: Carlton, Corrine (Co-author) / Clark, Rachael (Co-author) / Quintana, Alex (Co-author) / Shapiro, Brandon (Co-author) / Sigrist, Austin (Co-author) / Simonson, Mark (Thesis director) / Reber, Kevin (Committee member) / School of Accountancy (Contributor) / Department of Finance (Contributor) / Department of Information Systems (Contributor) / Barrett, The Honors College (Contributor)
Created: 2018-05
133707-Thumbnail Image.png
Description
Dodd-Frank should be celebrated for its success in stabilizing the financial sector following the last financial crisis. Some of its measures have not only contained financial disaster but contributed to economic growth. These elements of Dodd-Frank have been identified as "clear wins" and include the increase of financial institutions' capital requirements, the single-point-of-entry approach to regulating financial firms, and the creation of the Consumer Financial Protection Bureau (CFPB). The single-point-of-entry (SPOE) strategy, specifically, has done much to bring an end to the age of "too big to fail" institutions. By identifying firms that could expect to be aided in case of financial crisis, the SPOE approach reduces uncertainty among financial institutions. Moreover, SPOE eliminates a significant source of risk by establishing clear protocols for resolving failed financial firms. Dodd-Frank has also taken measures to better protect consumers with the creation of the CFPB. Some of the CFPB's stabilizing actions have included the removal of deceptive financial products, setting guidelines for qualified mortgages, and other regulatory safeguards on money transfers. Despite the CFPB's many triumphs, however, there is room for improvement, especially in the agency's ability to reduce regulatory redundancies in supervision and collaboration with other financial sector regulators. The significant strengths of Dodd-Frank are evident in the elements that have secured financial stability. However, it is important to also consider any potential to stifle healthy economic growth. There are several areas for legislative amendments and reforms that would improve the performance of Dodd-Frank, given its sweeping regulatory impact. Several governing redundancies now exist with the creation of new regulatory authorities. Special efforts to increase the authority of the Financial Stability Oversight Council (FSOC) and to preserve the impartiality of the Office of Financial Research (OFR) are specific examples of reforms still needed to elevate the effectiveness of Dodd-Frank. In addition, Dodd-Frank could do more to clarify the Volcker Rule in order to ease banks' burden of complying with excessive oversight. Going forward, policymakers must be willing to adjust parts of Dodd-Frank that encroach too far on the private sector's ability to foster efficiency or development. In addition, identifying and monitoring areas of the legislation deemed "too soon to tell" will provide insight into the accuracy and benefit of some Dodd-Frank measures.
Contributors: Conrad, Cody Lee (Author) / Sadusky, Brian (Thesis director) / Hoffman, David (Committee member) / School of Politics and Global Studies (Contributor) / Department of Management and Entrepreneurship (Contributor) / Barrett, The Honors College (Contributor)
Created: 2018-05