Matching Items (15)
Description
The purpose of our research was to develop recommendations and strategies for Company A's data center group in the context of the server CPU chip industry. We used data collected by the International Data Corporation (IDC), provided by our team coaches, along with data accessible on the internet. As the server CPU industry expands and transitions to cloud computing, Company A's Data Center Group will need to expand its server CPU chip product mix to meet the new demands of the cloud industry and to maintain its high market share. Company A boasts leading performance with its x86 server chips and a 95% market segment share. The cloud industry is dominated by seven companies that Company A calls "the Super 7": Amazon, Google, Microsoft, Facebook, Alibaba, Tencent, and Baidu. In the long run, the growing market share of the Super 7 could give them substantial buying power over Company A, which could lead to discounts and margin compression for Company A's main growth engine. That growth could also fuel the development of the Super 7's own design teams and a push toward making their own server chips internally, which would be detrimental to Company A's data center revenue.

We first researched the server industry and key terminology relevant to our project. We narrowed our scope by focusing on the cloud computing aspect of the server industry. We then researched what Company A has already been doing in the context of cloud computing and what it is currently doing to address the problem. Next, using our market analysis, we identified key areas we think Company A's data center group should focus on. Using the information available to us, we developed the strategies and recommendations that we think will help Company A's Data Center Group position itself well in an extremely fast-growing cloud computing industry.
Contributors: Jurgenson, Alex (Co-author) / Nguyen, Duy (Co-author) / Kolder, Sean (Co-author) / Wang, Chenxi (Co-author) / Simonson, Mark (Thesis director) / Hertzel, Michael (Committee member) / Department of Finance (Contributor) / Department of Management (Contributor) / Department of Information Systems (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / School of Accountancy (Contributor) / WPC Graduate Programs (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05
Description
A Guide to Financial Mathematics is a comprehensive and easy-to-use study guide for students preparing for one of the first actuarial exams, Exam FM. While there are many resources available to students studying for these exams, this guide is free to students and presents the material in a manner similar to how it is taught in class at ASU. The guide is available to students and professors in the new Actuarial Science degree program offered by ASU. There are twelve chapters, including financial calculator tips, detailed notes, examples, and practice exercises. A list of referenced material is included at the end of the guide.
Contributors: Dougher, Caroline Marie (Author) / Milovanovic, Jelena (Thesis director) / Boggess, May (Committee member) / Barrett, The Honors College (Contributor) / Department of Information Systems (Contributor) / School of Mathematical and Statistical Sciences (Contributor)
Created: 2015-05
Description
This paper provides evidence through an event study, portfolio simulation, and regression analysis that insider trading, when appropriately aggregated, has predictive power for abnormal risk-adjusted returns on some country and sector exchange traded funds (ETFs). I examine ETFs because of their broad scope and liquidity. ETF markets are relatively efficient and, thus, the effects I document would not be expected to appear there. My evidence that aggregated insider trading predicts abnormal returns in some ETFs suggests that it is likely to have predictive power for financial assets traded in less efficient markets as well. My analysis depends on specialized insider trading data covering 88 countries that was generously provided by 2iQ.
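The aggregation idea in this abstract can be sketched with synthetic data. The panel below, the planted coefficient, and the three-long/three-short portfolio rule are all illustrative assumptions; the 2iQ data and the thesis's exact methodology are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical monthly panel (not the 2iQ dataset): for each of 10 country
# ETFs, aggregate net insider buying last month and the ETF's return this month.
months, etfs = 60, 10
insider_buying = rng.normal(0.0, 1.0, (months, etfs))
# Plant a positive relationship so the sketch has something to detect.
returns = 0.01 * insider_buying + rng.normal(0.0, 0.04, (months, etfs))

# Each month, go long the three ETFs with the heaviest insider buying and
# short the three with the lightest; the average spread proxies predictive power.
spreads = []
for m in range(months):
    order = np.argsort(insider_buying[m])
    spreads.append(returns[m, order[-3:]].mean() - returns[m, order[:3]].mean())
mean_spread = float(np.mean(spreads))
```

A consistently positive mean spread would indicate that the aggregated signal carries information about future returns, which is the kind of effect the event study and portfolio simulation are designed to detect.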
Contributors: Kerker, Mackenzie Alan (Author) / Coles, Jeffrey (Thesis director) / Mcauley, Daniel (Committee member) / Licon, Wendell (Committee member) / Barrett, The Honors College (Contributor) / Department of Economics (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Department of Finance (Contributor)
Created: 2014-05
Description
Over the course of six months, we have worked in partnership with Arizona State University and a leading producer of semiconductor chips in the United States market (referred to as the "Company"), lending our skills in finance, statistics, model building, and external insight. We attempt to design models that help predict how much time it takes to implement a cost-saving project. These projects had previously been considered only on the merit of their cost savings; with the added dimension of time, we hope to forecast implementation time as a function of a number of variables. With such a forecast, we can then apply it to an expense project prioritization model that relates time and cost savings, compares many different projects simultaneously, and returns a series of present value calculations over different ranges of time. The goal is twofold: to assist with an accurate prediction of a project's time to implementation, and to provide a basis for comparing different projects based on their present values, ultimately helping to reduce the Company's manufacturing costs and improve gross margins. We believe this approach, and the research conducted toward this goal, is most valuable for the Company. Two coaches from the Company have provided assistance and clarified our questions when necessary throughout our research.

In this paper, we begin by defining the problem, setting an objective, and establishing a checklist to monitor our progress. Next, our attention shifts to the data: making observations, trimming the dataset, and framing and scoping the variables to be used for the analysis portion of the paper. Before forming a hypothesis, we perform a preliminary statistical analysis of certain individual variables to enrich our variable selection process. After the hypothesis, we run multiple linear regressions with project duration as the dependent variable. After regression analysis and a test for robustness, we shift our focus to an intuitive model based on rules of thumb. We relate these models to an expense project prioritization tool developed using Microsoft Excel software. Our deliverables to the Company come in the form of (1) a rules-of-thumb intuitive model and (2) an expense project prioritization tool.
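A minimal sketch of the prioritization idea described above: savings only begin once the predicted implementation time has elapsed, so two projects can be compared on present value rather than raw savings. All figures and the discount rate are hypothetical; the Company's actual model and inputs are not public.

```python
def project_npv(annual_savings, implementation_months, horizon_years, annual_rate=0.10):
    """Present value of a cost-saving project: savings start only after
    the predicted implementation time has elapsed, discounted monthly."""
    monthly_rate = (1 + annual_rate) ** (1 / 12) - 1
    monthly_savings = annual_savings / 12
    total_months = horizon_years * 12
    # Discount each month of savings that falls after implementation completes.
    return sum(
        monthly_savings / (1 + monthly_rate) ** m
        for m in range(implementation_months + 1, total_months + 1)
    )

# Rank two hypothetical projects by NPV over a 5-year horizon: B saves more
# per year, but its longer implementation time delays every month of savings.
projects = {
    "A": project_npv(annual_savings=120_000, implementation_months=6, horizon_years=5),
    "B": project_npv(annual_savings=150_000, implementation_months=18, horizon_years=5),
}
best = max(projects, key=projects.get)
```

Under these assumed numbers the faster project wins despite its smaller annual savings, which is exactly the trade-off the time dimension is meant to surface.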
Contributors: Al-Assi, Hashim (Co-author) / Chiang, Robert (Co-author) / Liu, Andrew (Co-author) / Ludwick, David (Co-author) / Simonson, Mark (Thesis director) / Hertzel, Michael (Committee member) / Barrett, The Honors College (Contributor) / Department of Information Systems (Contributor) / Department of Finance (Contributor) / Department of Economics (Contributor) / Department of Supply Chain Management (Contributor) / School of Accountancy (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Mechanical and Aerospace Engineering Program (Contributor) / WPC Graduate Programs (Contributor)
Created: 2015-05
Description
Exchange traded funds (ETFs) are in many ways similar to more traditional closed-end mutual funds, although they differ in a crucial way. ETFs rely on a creation and redemption feature to achieve their functionality, and this mechanism is designed to minimize the deviations that occur between the ETF’s listed price and the net asset value of the ETF’s underlying assets. However, while this does cause ETF deviations to be generally lower than those of their mutual fund counterparts, as our paper explores, this process does not eliminate these deviations completely. This article builds off an earlier paper by Engle and Sarkar (2006) that investigates these properties of premiums (discounts) of ETFs from their fair market value, and looks to see whether these premia have changed in the last 10 years. Our paper then diverges from the original and takes a deeper look specifically at the standard deviations of these premia.

Our findings show that over 70% of an ETF's standard deviation of premia can be explained through a linear combination of two variables: a categorical variable (Domestic [US], Developed, Emerging) and a discrete variable (time difference from the US). This paper also finds that more traditional metrics such as market cap and ETF price volatility, and even third-party market indicators such as the economic freedom index and investment freedom index, are insignificant predictors of an ETF's standard deviation of premia when combined with the categorical variable. These findings differ somewhat from the existing literature, which indicates that these factors should significantly predict an ETF's standard deviation of premia.
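A two-variable regression of this form can be sketched with a dummy-encoded ordinary least squares fit. The data below are invented for illustration only, not the thesis's sample, and the variable names are assumptions.

```python
import numpy as np

# Hypothetical ETF-level data (not the paper's dataset): each row is one ETF.
# region: 0 = Domestic (US), 1 = Developed, 2 = Emerging
region = np.array([0, 0, 1, 1, 1, 2, 2, 2])
hours_offset = np.array([0, 0, 6, 8, 9, 11, 12, 13])  # time difference from the US (hours)
premia_sd = np.array([0.10, 0.12, 0.50, 0.60, 0.65, 0.90, 1.00, 1.05])  # std dev of premia (%)

# Dummy-encode the categorical (Domestic is the baseline) and add an intercept.
X = np.column_stack([
    np.ones(len(region)),
    (region == 1).astype(float),  # Developed dummy
    (region == 2).astype(float),  # Emerging dummy
    hours_offset,
])
beta, *_ = np.linalg.lstsq(X, premia_sd, rcond=None)

# R^2: share of variance in premia std dev explained by the two variables.
fitted = X @ beta
r2 = 1 - np.sum((premia_sd - fitted) ** 2) / np.sum((premia_sd - premia_sd.mean()) ** 2)
```

With real data, an R-squared above 0.7 from just these two regressors would correspond to the explanatory power the abstract reports.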
Contributors: Zhang, Jingbo (Co-author) / Henning, Thomas (Co-author) / Simonson, Mark (Thesis director) / Licon, L. Wendell (Committee member) / Department of Finance (Contributor) / Department of Information Systems (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created: 2019-05
Description
Exchange traded funds (ETFs) are in many ways similar to more traditional closed-end mutual funds, although they differ in a crucial way. ETFs rely on a creation and redemption feature to achieve their functionality, and this mechanism is designed to minimize the deviations that occur between the ETF’s listed price and the net asset value of the ETF’s underlying assets. However, while this does cause ETF deviations to be generally lower than those of their mutual fund counterparts, as our paper explores, this process does not eliminate these deviations completely. This article builds off an earlier paper by Engle and Sarkar (2006) that investigates these properties of premiums (discounts) of ETFs from their fair market value, and looks to see whether these premia have changed in the last 10 years. Our paper then diverges from the original and takes a deeper look specifically at the standard deviations of these premia.

Our findings show that over 70% of an ETF's standard deviation of premia can be explained through a linear combination of two variables: a categorical variable (Domestic [US], Developed, Emerging) and a discrete variable (time difference from the US). This paper also finds that more traditional metrics such as market cap and ETF price volatility, and even third-party market indicators such as the economic freedom index and investment freedom index, are insignificant predictors of an ETF's standard deviation of premia. These findings differ somewhat from the existing literature, which indicates that these factors should significantly predict an ETF's standard deviation of premia.
Contributors: Henning, Thomas Louis (Co-author) / Zhang, Jingbo (Co-author) / Simonson, Mark (Thesis director) / Licon, Wendell (Committee member) / School of Mathematical and Statistical Sciences (Contributor) / Department of Finance (Contributor) / Barrett, The Honors College (Contributor)
Created: 2019-05
Description
This paper investigates whether measures of investor sentiment can be used to predict future total returns of the S&P 500 index. Rolling regressions and other statistical techniques are used to determine which indicators contain the most predictive information and which time horizons' returns are "easiest" to predict in a three-year data set. The five "most predictive" indicators are used to predict 180-calendar-day future returns of the market, and simulated investment by hypothetical accounts is conducted in an independent six-year data set based on the rolling regressions' future-return predictions. Some indicators, most notably the VIX index, appear to contain predictive information that led to outperformance by the accounts that invested based on the rolling regression model's predictions.
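A rolling regression of the kind described can be sketched as follows. The sentiment series, the planted relationship, and the window length are illustrative assumptions, not the thesis's data or indicator set; the idea is only that each prediction uses a regression refit on a trailing window.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical daily series (not the thesis data): a sentiment indicator such
# as a volatility index level, and the forward total return it is used to predict.
n = 500
sentiment = rng.normal(20, 5, n)
future_ret = 0.05 - 0.002 * sentiment + rng.normal(0, 0.02, n)  # planted relationship

window = 250  # roughly one trading year of history per regression
predictions = []
for t in range(window, n):
    x, y = sentiment[t - window:t], future_ret[t - window:t]
    slope, intercept = np.polyfit(x, y, 1)  # one-factor OLS on the trailing window
    predictions.append(intercept + slope * sentiment[t])
predictions = np.array(predictions)
```

A simulated account would then invest (or not) at each step according to the sign or magnitude of the prediction, exactly as the out-of-sample test in the paper does with its independent data set.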
Contributors: Dundas, Matthew William (Author) / Boggess, May (Thesis director) / Budolfson, Arthur (Committee member) / Hedegaard, Esben (Committee member) / Barrett, The Honors College (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Department of Finance (Contributor)
Created: 2013-12
Description
The purpose of this paper is to study the impact that poison pills have on the value of share prices after the cancellation of a transaction. While various studies have focused on the generic share price impact of poison pills, very few have focused on their impact in cancelled transactions. Based on our research and analysis, in cancelled transactions, both target firms that had poison pills prior to the transaction and target firms without poison pills generate returns above the announcement-date premium and above a subsequent investment in the S&P 500, whether shares are held to the cancellation of the transaction or held from cancellation to 6 months after the transaction. This analysis can contribute to the argument that holding shares of target firms regardless of cancellation risk is preferable to taking profit at the announcement date. Additionally, it can contribute to the study of the undiscovered pricing impact of poison pills.
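The comparison described above can be illustrated with hypothetical numbers; only the structure of the calculation, not the figures, reflects the thesis.

```python
# Hypothetical prices (not the thesis data) for a target firm whose deal is
# announced and later cancelled.
pre_announcement_price = 50.0
announcement_price = 60.0   # reflects a 20% announcement-date premium
cancellation_price = 63.0   # assumed price when the deal falls through
sp500_return = 0.03         # assumed S&P 500 return over the same holding period

# Strategy 1: take profit at announcement and move the proceeds into the S&P 500.
take_profit = (announcement_price / pre_announcement_price) * (1 + sp500_return) - 1
# Strategy 2: hold the target's shares through to the cancellation date.
hold_to_cancellation = cancellation_price / pre_announcement_price - 1
```

With these assumed prices, holding through cancellation beats selling at announcement, which is the pattern the paper's analysis documents in its sample.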
Contributors: Chotalla, Gurkaran (Co-author) / Amjad, Hamza (Co-author) / Reddy, Samir (Co-author) / Stein, Luke (Thesis director) / Lindsey, Laura (Committee member) / School of Mathematical and Statistical Sciences (Contributor) / Department of Finance (Contributor) / Economics Program in CLAS (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-12
Description
This research project examines the craft brewing industry and its position in the North American market. Specifically, this research will highlight the most important aspects of the product market, cost structure, and market trends, as well as an assessment of the viability of several modes of entry. The data and analysis provided indicate that the industry is promising and poised to grow in comparison to many other sectors within the alcoholic beverages industry, as demand for differentiated craft beer products is relatively strong. The continued existence of craft brewing would not be possible without the devotion and dedication of individuals simply interested in brewing recipes at home. Although the process of brewing remains relatively traditional, the paper will discuss the possibilities for a successful craft brewing brand to diversify, given consumers' willingness and curiosity to try new beverages. Production details and supply chain processes will be discussed to fully understand the path from a local brewer's fruitful beginnings to a large-scale company that distributes nationwide. Nonetheless, prominent risks include extensive regulatory hurdles ranging from the local to the federal level and threats from significant established competitors. These competitors and their business activities will be heavily discussed as they pertain to the question of whether entering the market is a smart business decision. The purpose of this research is to provide potential business owners and investors with the knowledge to engage in the craft brewing industry. In essence, the business decision to participate in the craft brewing industry is met with encouragement from an avid consumer base, collaboration with competitors, and an undying passion to brew quality beer for consumption.
Contributors: Knapp, Kurt (Co-author) / Wu, Katherine (Co-author) / Nguyen, Kelley (Co-author) / Budolfson, Arthur (Thesis director) / Bhattacharya, Anand (Committee member) / Department of Finance (Contributor) / Department of Economics (Contributor) / Department of Supply Chain Management (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / School of Accountancy (Contributor) / Hugh Downs School of Human Communication (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-12
Description
The goal of this thesis was to provide in-depth research into the semiconductor wet-etch market and to create a supplier analysis tool that would allow Company X to identify the best supplier partnerships. Several models were used to analyze the wet-etch market, including Porter's Five Forces and SWOT analyses. These models were used to rate suppliers based on financial indicators, management history, market share, research and development spend, and investment diversity. This research allowed for the removal of one of the four companies in question due to a discovered conflict of interest. Once the initial research was complete, a dynamic Excel model was created that would allow Company X to continually compare the costs and factors of the suppliers' products. Many cost factors were analyzed, such as initial capital investment, power and chemical usage, warranty costs, and spare parts usage. Other factors that required comparison across suppliers included wafer throughput, the number of layers the tool could process, the number of chambers the tool has, and the amount of space the tool requires. The demand needed for the tool was estimated by Company X in order to determine how each supplier's tool set would handle the required usage. The final feature added to the model was the ability to run a sensitivity analysis on each tool set. This allows Company X to quickly and accurately forecast how changes to costs or tool capacities would affect total cost of ownership, which could be heavily utilized during Company X's negotiations with suppliers. The initial research as well as the model led to the final recommendation of Supplier A, as they had the most cost-effective tool given the required demand. However, this recommendation is subject to change as demand fluctuates or if changes can be made during negotiations.
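A minimal sketch of the kind of cost comparison and sensitivity analysis described above. All figures, the cost structure, and the function names are hypothetical; Company X's actual model is not public.

```python
import math

def total_cost_of_ownership(capex, power_cost, chemical_cost, warranty_cost,
                            spares_cost, wafers_per_hour, required_wafers, years=5):
    """Hypothetical TCO sketch: capital outlay plus annual operating costs,
    scaled by how many tools are needed to meet the required wafer demand."""
    annual_capacity = wafers_per_hour * 24 * 365
    tools_needed = math.ceil(required_wafers / annual_capacity)
    annual_opex = power_cost + chemical_cost + warranty_cost + spares_cost
    return tools_needed * (capex + years * annual_opex)

def sensitivity(base_kwargs, param, changes):
    """Percent change in TCO for a given percent change in one cost input."""
    base = total_cost_of_ownership(**base_kwargs)
    out = {}
    for pct in changes:
        kwargs = dict(base_kwargs, **{param: base_kwargs[param] * (1 + pct)})
        out[pct] = total_cost_of_ownership(**kwargs) / base - 1
    return out

# Invented inputs for one supplier's tool set.
supplier_a = dict(capex=2_000_000, power_cost=50_000, chemical_cost=80_000,
                  warranty_cost=30_000, spares_cost=40_000,
                  wafers_per_hour=60, required_wafers=900_000)
swings = sensitivity(supplier_a, "chemical_cost", [-0.10, 0.10])
```

Running the same sensitivity over each supplier's inputs shows which cost assumptions the recommendation is most exposed to, which is what makes such a model useful in negotiations.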
Contributors: Schmitt, Connor (Co-author) / Rickets, Dawson (Co-author) / Castiglione, Maia (Co-author) / Witten, Forrest (Co-author) / Simonson, Mark (Thesis director) / Hertzel, Michael (Committee member) / Department of Finance (Contributor) / Department of Economics (Contributor) / Department of Information Systems (Contributor) / Department of Supply Chain Management (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / School of Accountancy (Contributor) / WPC Graduate Programs (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-12