Matching Items (36)

Description
This paper looks at the growth of influencer marketing in application and how it has shifted the relationship between brands and consumers. Barriers to entering the space and methods of practice are discussed and analyzed to project the accessibility of obtaining influencer status. Best practices for brands and influencers are outlined based on research, and key findings are analyzed from interviews with participants who play an active role in the field. Another component of the paper is a discussion of the significance of platform dependence as influencers and brands use social media channels to reach consumers. The relationship that exists between consumers, brands, and platforms is illustrated through a model that demonstrates its interdependence. The final component of the paper explores the field as an active participant through an experiment conducted by the researcher in pursuit of the question: can anyone be an influencer? The answer is explored through personal accounts of an eight-month process of testing content creation and promotion to build awareness and increase engagement. The barriers to entering the space as an influencer and to collaborating with brands are addressed by testing tactics and strategies on social channels, along with travel expeditions across Arizona that contributed content for blog articles. The findings throughout the paper conclusively show that the value of influencer marketing is increasing as more brands validate and utilize this method in their marketing efforts.
Contributors: Davis, Natalie Marie (Author) / Giles, Bret (Thesis director) / Schlacter, John (Committee member) / Department of Information Systems (Contributor) / Department of Marketing (Contributor) / Walter Cronkite School of Journalism and Mass Communication (Contributor) / Barrett, The Honors College (Contributor)
Created: 2018-05
Description
Over the course of six months, we have worked in partnership with Arizona State University and a leading producer of semiconductor chips in the United States market (referred to as the "Company"), lending our skills in finance, statistics, model building, and external insight. We attempt to design models that help predict how much time it takes to implement a cost-saving project. These projects had previously been considered only on the merit of cost savings, but with an added dimension of time, we hope to forecast time according to a number of variables. With such a forecast, we can then apply it to an expense project prioritization model which relates time and cost savings together, compares many different projects simultaneously, and returns a series of present value calculations over different ranges of time. The goal is twofold: assist with an accurate prediction of a project's time to implementation, and provide a basis to compare different projects based on their present values, ultimately helping to reduce the Company's manufacturing costs and improve gross margins. We believe this approach, and the research found toward this goal, is most valuable for the Company. Two coaches from the Company have provided assistance and clarified our questions when necessary throughout our research. In this paper, we begin by defining the problem, setting an objective, and establishing a checklist to monitor our progress. Next, our attention shifts to the data: making observations, trimming the dataset, framing and scoping the variables to be used for the analysis portion of the paper. Before creating a hypothesis, we perform a preliminary statistical analysis of certain individual variables to enrich our variable selection process. After the hypothesis, we run multiple linear regressions with project duration as the dependent variable. After regression analysis and a test for robustness, we shift our focus to an intuitive model based on rules of thumb. 
We relate these models to an expense project prioritization tool developed using Microsoft Excel software. Our deliverables to the Company come in the form of (1) a rules of thumb intuitive model and (2) an expense project prioritization tool.
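The core tradeoff the prioritization tool captures, cost savings weighed against time to implementation, can be sketched as a present-value calculation. This is a minimal illustration, not the Company's actual model; the discount rate, dollar amounts, and horizon below are hypothetical.

```python
def project_npv(annual_savings, implementation_months, horizon_years, annual_rate=0.10):
    """Discount a project's annual cost savings back to today, assuming
    savings begin only after the predicted implementation time."""
    start = implementation_months / 12.0
    npv = 0.0
    for year in range(1, horizon_years + 1):
        t = start + year  # savings realized at year-ends after implementation
        npv += annual_savings / (1 + annual_rate) ** t
    return npv

# Two hypothetical projects with equal savings but different implementation times
fast = project_npv(100_000, implementation_months=3, horizon_years=5)
slow = project_npv(100_000, implementation_months=18, horizon_years=5)
```

Holding savings constant, the shorter implementation time yields the higher present value, which is why forecasting project duration matters when ranking projects.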
Contributors: Al-Assi, Hashim (Co-author) / Chiang, Robert (Co-author) / Liu, Andrew (Co-author) / Ludwick, David (Co-author) / Simonson, Mark (Thesis director) / Hertzel, Michael (Committee member) / Barrett, The Honors College (Contributor) / Department of Information Systems (Contributor) / Department of Finance (Contributor) / Department of Economics (Contributor) / Department of Supply Chain Management (Contributor) / School of Accountancy (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Mechanical and Aerospace Engineering Program (Contributor) / WPC Graduate Programs (Contributor)
Created: 2015-05
Description
Coherent vortices are ubiquitous structures in natural flows that affect mixing and transport of substances and momentum/energy. Being able to detect these coherent structures is important for pollutant mitigation, ecological conservation, and many other applications. In recent years, mathematical criteria and algorithms have been developed to extract these coherent structures in turbulent flows. In this study, we apply these tools to extract important coherent structures and analyze their statistical properties as well as their implications for the kinematics and dynamics of the flow. Such information will aid the representation of small-scale nonlinear processes that large-scale models of natural processes may not be able to resolve.
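The abstract does not name which detection criterion is used; one classical example of such a tool is the Okubo-Weiss parameter, which flags vorticity-dominated regions of a two-dimensional flow. The sketch below is an illustrative assumption, not necessarily the method employed in the study.

```python
import numpy as np

def okubo_weiss(u, v, dx, dy):
    """Okubo-Weiss parameter W = s_n^2 + s_s^2 - omega^2 on a 2-D grid.
    Regions with W < 0 are vorticity-dominated (candidate vortex cores)."""
    du_dy, du_dx = np.gradient(u, dy, dx)  # axis 0 is y, axis 1 is x
    dv_dy, dv_dx = np.gradient(v, dy, dx)
    s_n = du_dx - dv_dy        # normal strain
    s_s = dv_dx + du_dy        # shear strain
    omega = dv_dx - du_dy      # relative vorticity
    return s_n**2 + s_s**2 - omega**2

# Solid-body rotation (an idealized vortex): u = -y, v = x gives W < 0 everywhere
y, x = np.mgrid[-1:1:21j, -1:1:21j]
W = okubo_weiss(-y, x, dx=0.1, dy=0.1)
```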
Contributors: Cass, Brentlee Jerry (Author) / Tang, Wenbo (Thesis director) / Kostelich, Eric (Committee member) / Department of Information Systems (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created: 2018-05
Description
Exchange traded funds (ETFs) are in many ways similar to more traditional closed-end mutual funds, although they differ in a crucial way. ETFs rely on a creation and redemption feature to achieve their functionality, and this mechanism is designed to minimize the deviations that occur between the ETF's listed price and the net asset value of the ETF's underlying assets. However, while this does cause ETF deviations to be generally lower than those of their mutual fund counterparts, as our paper explores, this process does not eliminate these deviations completely. This article builds off an earlier paper by Engle and Sarkar (2006) that investigates these properties of premiums (discounts) of ETFs from their fair market value, and looks to see whether these premia have changed in the last 10 years. Our paper then diverges from the original and takes a deeper look into the standard deviations of these premia specifically.

Our findings show that over 70% of an ETF's standard deviation of premia can be explained through a linear combination of two variables: a categorical variable (Domestic [US], Developed, Emerging) and a discrete variable (time difference from the US). This paper also finds that more traditional metrics such as market cap and ETF price volatility, and even third-party market indicators such as the economic freedom index and investment freedom index, are insignificant predictors of an ETF's standard deviation of premia when combined with the categorical variable. These findings differ somewhat from existing literature, which indicates that these factors should be significant predictors of an ETF's standard deviation of premia.
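The regression described above can be sketched as an ordinary-least-squares fit with the region category dummy-encoded alongside the time-difference variable. The data below is synthetic and purely illustrative; the actual ETF sample and coefficients come from the thesis.

```python
import numpy as np

# Synthetic illustration: region category and trading-hour offset from the US
regions = ["US", "Developed", "Emerging", "US", "Developed", "Emerging"]
tz_offset = np.array([0.0, 6.0, 9.0, 0.0, 8.0, 12.0])      # hours from US market
sd_premia = np.array([0.10, 0.45, 0.80, 0.12, 0.55, 0.95])  # made-up responses

# Dummy-encode the categorical variable (US as the baseline) plus an intercept
is_dev = np.array([r == "Developed" for r in regions], dtype=float)
is_emg = np.array([r == "Emerging" for r in regions], dtype=float)
X = np.column_stack([np.ones(len(regions)), is_dev, is_emg, tz_offset])

beta, *_ = np.linalg.lstsq(X, sd_premia, rcond=None)
resid = sd_premia - X @ beta
r_squared = 1 - (resid ** 2).sum() / ((sd_premia - sd_premia.mean()) ** 2).sum()
```

With an intercept in the design matrix, `r_squared` is the share of variance in the standard deviation of premia that the two variables explain, the "over 70%" figure in the finding above.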
Contributors: Zhang, Jingbo (Co-author) / Henning, Thomas (Co-author) / Simonson, Mark (Thesis director) / Licon, L. Wendell (Committee member) / Department of Finance (Contributor) / Department of Information Systems (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created: 2019-05
Description
The United States is in a period of political turmoil and polarization. New technologies have matured over the last ten years, which have transformed an individual's relationship with society and government. The emergence of these technologies has revolutionized access to both information and misinformation. Skills such as bias recognition and critical thinking are more imperative than at any other time to separate truth from false or misleading information. Meanwhile, education has not evolved with these changes. The average individual is more likely to come to uninformed conclusions and less likely to listen to differing perspectives. Moreover, technology is further complicating and compounding other issues in the political process. All of this is manifesting in division among the American people, who elect more polarized politicians who increasingly fail to find avenues for compromise.

In an effort to address these trends, we founded a student organization, The Political Literates, to fight political apathy by delivering political news in an easy-to-understand and unbiased manner. Inspired by our experience with this organization, we combine our insights with research to paint a new perspective on the state of the American political system.

This thesis analyzes various issues identified through our observations and research, with a heavy emphasis on using examples from the 2016 election. Our focus is how new technologies like data analytics, the Internet, smartphones, and social media are changing politics by driving political and social transformation. We identify and analyze five core issues that have been amplified by new technology, hindering the effectiveness of elections and further increasing political polarization:

● Gerrymandering, which skews partisan debate by forcing politicians to pander to ideologically skewed districts.
● Consolidation of media companies, which affects the diversity of how news is shared.
● Repeal of the Fairness Doctrine, which allowed media to become more partisan.
● The Citizens United ruling, which skews power away from average voters in elections.
● A failing education system, which does not prepare Americans to be civically engaged and to avoid being swayed by biased or untrue media.

Based on our experiment with The Political Literates and our research, we call for improving how critical thinking and civics are taught in the American education system. Critical thought and civics must be developed pervasively. With this, more people would be able to form more sophisticated views by listening to others to learn rather than to win, tuning out irrelevant information, and forming a culture with more engagement in politics. Through this re-enlightenment, many of America's other problems may evaporate or become more actionable.
Contributors: Stenseth, Kyle (Co-author) / Tumas, Trevor (Co-author) / Mokwa, Michael (Thesis director) / Eaton, John (Committee member) / Department of Information Systems (Contributor) / Department of Supply Chain Management (Contributor) / Sandra Day O'Connor College of Law (Contributor) / Watts College of Public Service & Community Solutions (Contributor) / Barrett, The Honors College (Contributor)
Created: 2019-05
Description
The object of the present study is to examine methods by which the company can optimize its costs for third-party suppliers who oversee other third-party trade labor. The third parties in the scope of this study are suspected to overstaff their workforce, thus overcharging the company. We will introduce a complex spreadsheet model that proposes a proper project staffing level based on key qualitative variables and statistics. Using the model outputs, the thesis team proposes a headcount solution for the company and problem areas to focus on going forward. All sources of information come from company proprietary and confidential documents.
Contributors: Loo, Andrew (Co-author) / Brennan, Michael (Co-author) / Sheiner, Alexander (Co-author) / Hertzel, Michael (Thesis director) / Simonson, Mark (Committee member) / Barrett, The Honors College (Contributor) / Department of Information Systems (Contributor) / Department of Finance (Contributor) / Department of Supply Chain Management (Contributor) / WPC Graduate Programs (Contributor) / School of Accountancy (Contributor)
Created: 2014-05
Description
The main goal of this study was to understand the awareness of small business owners regarding occupational fraud, meaning fraud committed from within an organization. A survey/questionnaire was used to gather insight into the knowledge and perceptions of small business owners, while also obtaining information about the history of fraud and the internal controls within their businesses. Twenty-four owners of businesses with fewer than 100 employees participated in the study. The results suggest that small business owners overestimate their knowledge regarding internal controls and occupational fraud, while also underestimating the risk of fraud within their own businesses. In fact, 92% of participants were not at all familiar with the widely used Internal Control—Integrated Framework published by the Committee of Sponsoring Organizations of the Treadway Commission. The results also show that small business owners tend to overestimate the protection provided by their currently implemented controls with regard to their risk of fraud. Overall, by deepening their knowledge of internal controls and occupational fraud, business owners can better protect their businesses from the risk of occupational fraud.
Contributors: Dennis, Lauren Nicole (Author) / Orpurt, Steven (Thesis director) / Munshi, Perseus (Committee member) / Barrett, The Honors College (Contributor) / Department of Information Systems (Contributor) / School of Accountancy (Contributor)
Created: 2014-05
Description
The prevalence of bots, or automated accounts, on social media is a well-known problem. Some of the ways bots harm social media users include, but are not limited to, spreading misinformation, influencing topic discussions, and dispersing harmful links. Bots have affected the field of disaster relief on social media as well. These bots cause problems such as preventing rescuers from determining credible calls for help, spreading fake news and other malicious content, and generating large amounts of content which burdens rescuers attempting to provide aid in the aftermath of disasters. To address these problems, this research seeks to detect bots participating in disaster event related discussions and increase the recall, or number of bots removed from the network, of Twitter bot detection methods. The removal of these bots will also prevent human users from accidentally interacting with these bot accounts and being manipulated by them. To accomplish this goal, an existing bot detection classification algorithm known as BoostOR was employed. BoostOR is an ensemble learning algorithm originally modeled to increase bot detection recall in a dataset and it has the possibility to solve the social media bot dilemma where there may be several different types of bots in the data. BoostOR was first introduced as an adjustment to existing ensemble classifiers to increase recall. However, after testing the BoostOR algorithm on unobserved datasets, results showed that BoostOR does not perform as expected. This study attempts to improve the BoostOR algorithm by comparing it with a baseline classification algorithm, AdaBoost, and then discussing the intentional differences between the two. Additionally, this study presents the main factors which contribute to the shortcomings of the BoostOR algorithm and proposes a solution to improve it. These recommendations should ensure that the BoostOR algorithm can be applied to new and unobserved datasets in the future.
Contributors: Davis, Matthew William (Author) / Liu, Huan (Thesis director) / Nazer, Tahora H. (Committee member) / Computer Science and Engineering Program (Contributor) / Department of Information Systems (Contributor) / Barrett, The Honors College (Contributor)
Created: 2018-12
Description
Within this paper I summarize the key features, and results, of research conducted to support the development, design, and implementation of an internal control system at a startup small business. These efforts were conducted for an Honors Thesis/Creative Project for Barrett, the Honors College at Arizona State University. The research revolved around deciding which financial policies, procedures, and safeguards could be useful in creating an internal control system for small businesses. In addition to academic research, I developed an “Internal Control Questionnaire” for use as a ‘jumping off point’ in conversations about a business’ existing accounting system. This questionnaire is applicable across many industries, covering the major topics which every small business/startup should consider.

The questionnaire was then used in conjunction with two interviews of small business owners. The interviews covered both the overall financial status of their business and their business’ pre-existing accounting system. The feedback received during these interviews was subsequently used to provide the business owners with eleven recommendations ranging from the implementation of new policies to verification of existing internal controls.

Finally, I summarize my findings, both academic and real-world, conveying that many small business owners do not implement formal internal control systems. I also discuss why the business owners, in this specific circumstance, did not yet implement the aforementioned eleven suggestions.
Contributors: Duncan, Spencer James (Author) / Garverick, Michael (Thesis director) / Casas Arce, Pablo (Committee member) / School of Accountancy (Contributor) / School of International Letters and Cultures (Contributor) / Barrett, The Honors College (Contributor)
Created: 2019-05
Description
Our research encompassed the prospect draft in baseball and looked at what type of player teams drafted to maximize value. We wanted to know which position returned the best value to the team that drafted them, and which level is safer to draft players from: college or high school. We decided to look at draft data from 2006-2010 for the first ten rounds of players selected. Because there is only a monetary cap on players drafted in the first ten rounds, we restricted our data to these players. Once we set up the parameters, we compiled a spreadsheet of these players with both their signing bonuses and their wins above replacement (WAR). This allowed us to see how much a team was spending per win at the major league level. After the data was compiled, we made pivot tables and graphs to visually represent our data and better understand the numbers. We found that the worst position MLB teams could draft would be high school second basemen, who returned the lowest WAR of any players we looked at. In general, though, high school players were more costly to sign and had lower WARs than their college counterparts, making them, on average, a worse pick value-wise. The best position you could pick was college shortstop: college shortstops had the trifecta of the best signability of all players, along with one of the highest WARs and among the lowest signing bonuses. These were three of the main factors you want in a draft pick, and they ranked near the top in all three categories. This research can help give guidelines to Major League teams as they go to select players in the draft. While there will always be exceptions to trends, by following the enclosed research teams can minimize risk in the draft.
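The core value metric in the analysis, dollars of signing bonus paid per win above replacement, can be sketched as follows. The bonus and WAR figures are hypothetical placeholders, not the thesis data.

```python
def cost_per_war(signing_bonus, war):
    """Dollars a team paid per win above replacement; lower means better value.
    Returns None for players with zero or negative career WAR."""
    return signing_bonus / war if war > 0 else None

# Hypothetical picks: (position/level, signing bonus in dollars, career WAR)
picks = [
    ("college SS", 1_200_000, 8.0),
    ("HS 2B", 900_000, 0.5),
]
for label, bonus, war in picks:
    print(label, cost_per_war(bonus, war))  # cost per win for each pick
```

Grouping this ratio by position and by level (college vs. high school), as the pivot tables do, is what surfaces the college-shortstop and high-school-second-baseman findings above.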
Contributors: Valentine, Robert (Co-author) / Johnson, Ben (Co-author) / Eaton, John (Thesis director) / Goegan, Brian (Committee member) / Department of Finance (Contributor) / Department of Economics (Contributor) / Department of Information Systems (Contributor) / School of Accountancy (Contributor) / Barrett, The Honors College (Contributor)
Created: 2017-05