Matching Items (337)
Description
We apply a Bayesian network-based approach to determine the structure of consumers' brand concept maps, and we extend this approach to provide a precise delineation of the set of cognitive variations of that brand concept map structure which can simultaneously coexist within the data. This methodology can operate with nonlinear as well as linear relationships between the variables, and uses simple Likert-style marketing survey data as input. In addition, the method can operate without any a priori hypothesized structures or relations among the brand associations in the model. The resulting brand concept map structures delineate directional (as opposed to simply correlational) relations among the brand associations, and differentiate between the predictive and the diagnostic directions within each link. Further, we derive a Bayesian network-based link strength measure, and apply it to a comparison of the strengths of the connections between different semantic categories of brand association descriptors, as well as between different strategically important drivers of brand differentiation. Finally, we apply a precise form of information propagation through the predictive and diagnostic links within the network in order to evaluate the effect of introducing new information to the brand concept network. The overall methodology operates via a factorization of the joint distribution of the brand association variables using conditional independence properties and an application of the causal Markov condition; as such, it represents an alternative to correlation-based structural determination methods. By using conditional independence as a core structural construct, the methods used here are especially well-suited for determining and analyzing asymmetric or directional beliefs about brand or product attributes. This methodology builds on the pioneering Brand Concept Mapping approach of Roedder John et al. (2006). 
Similar to that approach, the Bayesian network-based method derives the specific link-by-link structure among a brand's associations, and also allows for a precise quantitative determination of the likely effects that manipulation of specific brand associations will have upon other strategically important associations within that brand image. In addition, the method's precise informational semantics and specific structural measures allow for a greater understanding of the structure of these brand associations.
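The predictive/diagnostic distinction described in the abstract can be illustrated with a toy network. The sketch below is not the dissertation's model: it assumes a hypothetical three-node chain (Quality -> Trust -> Loyalty) with invented conditional probability tables, factorizes the joint distribution via the causal Markov condition, and propagates evidence in both directions by brute-force enumeration.

```python
from itertools import product

# Hypothetical three-node brand network: Quality -> Trust -> Loyalty.
# All probabilities are illustrative, not estimated from any survey data.
p_q = {1: 0.6, 0: 0.4}                                # P(Quality)
p_t_q = {1: {1: 0.8, 0: 0.2}, 0: {1: 0.3, 0: 0.7}}    # P(Trust | Quality)
p_l_t = {1: {1: 0.7, 0: 0.3}, 0: {1: 0.2, 0: 0.8}}    # P(Loyalty | Trust)

def joint(q, t, l):
    # Factorization via the causal Markov condition:
    # P(Q, T, L) = P(Q) * P(T | Q) * P(L | T)
    return p_q[q] * p_t_q[q][t] * p_l_t[t][l]

def prob(query, evidence=None):
    """Marginal/conditional probability by brute-force enumeration."""
    evidence = evidence or {}
    num = den = 0.0
    for q, t, l in product([0, 1], repeat=3):
        state = {"Q": q, "T": t, "L": l}
        if any(state[k] != v for k, v in evidence.items()):
            continue
        p = joint(q, t, l)
        den += p
        if all(state[k] == v for k, v in query.items()):
            num += p
    return num / den

baseline = prob({"L": 1})                 # prior belief in loyalty
predictive = prob({"L": 1}, {"Q": 1})     # quality evidence propagated forward
diagnostic = prob({"Q": 1}, {"L": 1})     # loyalty evidence propagated backward
```

With these toy numbers, evidence on Quality raises belief in Loyalty through the predictive link (0.5 to 0.6), while observing Loyalty raises belief in Quality through the diagnostic link (0.6 prior to 0.72).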
ContributorsBrownstein, Steven Alan (Author) / Reingen, Peter (Thesis advisor) / Kumar, Ajith (Committee member) / Mokwa, Michael (Committee member) / Arizona State University (Publisher)
Created2013
Description
Convergent products are products that offer multiple capabilities from different product categories. For example, a smartphone acts as an internet browser, personal assistant, and telephone. Marketers are constantly considering the value of adding new functionalities to these convergent products. This work examines convergent products in terms of the hedonic and utilitarian value they provide along with whether the addition is related to the base product, revealing complex and nuanced interactions. This work contributes to marketing theory by advancing knowledge in the convergent products and product design literatures, specifically by showing how hedonic and utilitarian value and addition relatedness interact to impact the evaluation of convergent goods and services. Looking at a greater complexity of convergent product types also helps to resolve prior conflicting findings in the convergent products and hedonic and utilitarian value literatures. Additionally, this work examines the role of justification in convergent products, showing how different additions can help consumers to justify the evaluation of a convergent product. A three-item measure for justification was developed for this research, and can be used by future researchers to better understand the effects of justification in consumption. This work is also the first to explicitly compare effects between convergent goods and convergent services. Across two experiments, it is found that these two product types (convergent goods versus convergent services) are evaluated differently. For convergent goods, consumers evaluate additions based on anticipated practicality/productivity and on how easily they are justified. For convergent services, consumers evaluate additions based on perceptions of performance risk associated with the convergent service, which stems from the intangibility of these services. 
The insights gleaned from the research allow specific recommendations to be made to managers regarding convergent offerings. This research also examines the applicability of hedonic and utilitarian value to a special type of advertising appeal: reward appeals. Reward appeals are appeals that focus on peripheral benefits from purchasing or using a product, such as time or money savings, and make suggestions on how to use these savings. This work examines potential interactions between reward appeals and other common advertising elements: social norms information and role clarity messaging.
ContributorsEaton, Kathryn Karnos (Author) / Bitner, Mary Jo (Thesis advisor) / Olsen, G. Douglas (Thesis advisor) / Mokwa, Michael (Committee member) / Arizona State University (Publisher)
Created2012
Description
The objective of this paper is to provide an educational diagnostic of blockchain technology and its application to the supply chain. Education on the topic is important to prevent misinformation about the capabilities of blockchain. As a new technology, blockchain can be confusing to grasp given the wide range of possibilities it offers, and defining it too broadly only convolutes the topic. Instead, the focus is maintained on explaining the technical details of how and why this technology improves the supply chain. The scope of explanation is not limited to solutions; current problems are also detailed. Both public and private blockchain networks are explained, along with the solutions each provides in supply chains. In addition, other non-blockchain systems are described that provide important pieces of supply chain operations which blockchain cannot. Applied to the supply chain, blockchain improves consumer transparency, resource management, logistics, trade finance, and liquidity.
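To make the core mechanism concrete, here is a minimal sketch of the hash-chaining that underlies a blockchain ledger: each block commits to its payload and to the previous block's hash, so tampering with any past handoff record invalidates the rest of the chain. This illustrates the general idea only; it is not modeled on any particular supply chain platform, and the event records are hypothetical.

```python
import hashlib
import json

def make_block(index, data, prev_hash):
    """A block commits to its payload and to the previous block's hash."""
    body = {"index": index, "data": data, "prev_hash": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return {**body, "hash": digest}

def verify_chain(chain):
    """Recomputing every hash detects tampering with any past entry."""
    for i, block in enumerate(chain):
        body = {k: block[k] for k in ("index", "data", "prev_hash")}
        expected = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if block["hash"] != expected:
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

# A toy supply-chain ledger: each handoff is appended as a block.
genesis = make_block(0, {"event": "goods manufactured"}, "0" * 64)
shipped = make_block(1, {"event": "shipped to distributor"}, genesis["hash"])
chain = [genesis, shipped]
```

Rewriting the "shipped" record after the fact changes its hash, so `verify_chain` fails; this is the tamper-evidence property that supply chain transparency claims rest on.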
ContributorsKrukar, Joel Michael (Author) / Oke, Adegoke (Thesis director) / Duarte, Brett (Committee member) / Hahn, Richard (Committee member) / School of Mathematical and Statistical Sciences (Contributor) / Department of Economics (Contributor) / Barrett, The Honors College (Contributor)
Created2018-05
Description
The passage of 2007's Legal Arizona Workers Act, which required all new hires to be tested for legal employment status through the federal E-Verify database, drastically changed the employment prospects for undocumented workers in the state. Using data from the 2007-2010 American Community Survey, this paper seeks to identify the impact of this law on the labor force in Arizona, specifically regarding undocumented workers and less-educated native workers. Overall, the data show that the wage bias against undocumented immigrants doubled over the four years studied, and the wages of native workers without a high school degree saw a temporary increase relative to comparable workers in other states. The law did not have an effect on the wages of native workers with a high school degree.
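The comparison to similar workers in other states follows the logic of a difference-in-differences estimate. A minimal sketch, with hypothetical wage numbers that are not taken from the paper:

```python
def diff_in_diff(treat_pre, treat_post, control_pre, control_post):
    """Difference-in-differences: the change for the treated group
    net of the change experienced by the comparison group."""
    return (treat_post - treat_pre) - (control_post - control_pre)

# Hypothetical mean log wages of less-educated natives,
# Arizona (treated by the law) vs. comparable workers in other states.
effect = diff_in_diff(treat_pre=2.40, treat_post=2.50,
                      control_pre=2.45, control_post=2.47)
# (0.10) - (0.02) = 0.08 log points attributed to the law
```

The subtraction nets out shocks common to both groups (e.g., the onset of the recession), which is why the comparison-state baseline matters.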
ContributorsSantiago, Maria Christina (Author) / Pereira, Claudiney (Thesis director) / Mendez, Jose (Committee member) / School of International Letters and Cultures (Contributor) / Department of Economics (Contributor) / Barrett, The Honors College (Contributor)
Created2018-05
Description
The January 12, 2010 Haiti earthquake, which hit Port-au-Prince in the late afternoon, caused over 220,000 deaths and $8 billion in damages, roughly 120% of national GDP at the time. A moment magnitude (Mw) 7.5 earthquake struck rural Guatemala in the early morning in 1976 and caused 23,000-25,000 deaths, three times as many injuries, and roughly $1.1 billion in damages, approximately 30% of Guatemala's GDP. The earthquake that hit just outside of Christchurch, New Zealand early in the morning on September 4, 2010 had a magnitude of 7.1 and caused just two injuries, no deaths, and roughly 7.2 billion USD in damages (5% of GDP). These three earthquakes, all with magnitudes over 7, caused extremely varied amounts of economic damage in these three countries. This thesis aims to identify a possible explanation as to why this was the case and suggest ways to improve disaster risk management going forward.
ContributorsHeuermann, Jamie Lynne (Author) / Schoellman, Todd (Thesis director) / Mendez, Jose (Committee member) / Department of Supply Chain Management (Contributor) / Department of Economics (Contributor) / W. P. Carey School of Business (Contributor) / Barrett, The Honors College (Contributor)
Created2016-05
Description
Radiometric dating estimates the age of rocks by comparing the concentration of a decaying radioactive isotope to the concentrations of the decay byproducts. Radiometric dating has been instrumental in the calculation of the Earth's age, the Moon's age, and the age of our solar system. Geochronologists in the School of Earth and Space Exploration at ASU use radiometric dating extensively in their research, and have very specific procedures, hardware, and software to perform the dating calculations. Researchers use lasers to drill small holes, or ablations, in rock faces, collect the masses of various isotopes using a mass spectrometer, and scan the pit with an interferometer, which records the z heights of the pit on an x-y grid. This scan is then processed by custom-made software to determine the volume of the pit, which is then used along with the isotope masses and known decay rates to determine the age of the rock. My research has been focused on improving this volume calculation through computational geometry methods of surface reconstruction. During the process, I created a web application that reads interferometer scans, reconstructs a surface from those scans with Poisson reconstruction, renders the surface in the browser, and calculates the volume of the pit based on parameters provided by the researcher. The scans are stored in a central cloud datastore for future analysis, allowing researchers in the geochronology community to collaborate on scans from various rocks in their individual labs. The result of the project has been a complete and functioning application that is accessible to any researcher and reproducible from any computer. The 3D representation of the scan data allows researchers to easily understand the topology of the pit ablation and determine early on whether the measurements of the interferometer are trustworthy for the particular ablation. 
The new software also reduces the variability of the volume calculation, which hopefully indicates that the process removes noise from the scan data and computes volumes from a more realistic representation of the actual ablation. In the future, this research will serve as the groundwork for more robust testing and closer approximations through the implementation of different reconstruction algorithms. As the project grows and becomes more usable, it may see adoption in the community and become a reproducible standard for geochronologists performing radiometric dating.
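As a simplified illustration of the underlying volume computation, the sketch below approximates pit volume from an x-y grid of z heights by summing cell depths below a reference plane. This is a plain Riemann-sum stand-in for the Poisson-reconstruction pipeline described above, and the grid values are hypothetical.

```python
def pit_volume(z_grid, dx, dy, reference_z=0.0):
    """Approximate ablation-pit volume from an interferometer-style
    x-y grid of surface heights: each cell below the reference plane
    contributes depth * cell area."""
    volume = 0.0
    for row in z_grid:
        for z in row:
            depth = reference_z - z
            if depth > 0:          # count only material removed below the surface
                volume += depth * dx * dy
    return volume

# Hypothetical 3x3 scan (heights in microns): a single 2-micron-deep
# cell at the center of an otherwise flat rock face.
scan = [
    [0.0,  0.0, 0.0],
    [0.0, -2.0, 0.0],
    [0.0,  0.0, 0.0],
]
vol = pit_volume(scan, dx=1.0, dy=1.0)   # one cell of area 1, depth 2
```

A surface-reconstruction approach improves on this cell-sum by smoothing sensor noise before integrating, which is the variability reduction the text refers to.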
ContributorsPruitt, Jacob Richard (Author) / Hodges, Kip (Thesis director) / Mercer, Cameron (Committee member) / van Soest, Matthijs (Committee member) / Department of Economics (Contributor) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created2016-05
Description
I built a short-term West Texas Intermediate (WTI) crude oil price-forecasting model for two periods to understand how various drivers of crude oil behaved before and after the Great Recession. According to the Federal Reserve, the Great Recession "...began in December 2007 and ended in June 2009" (Rich 1). The research involves two models spanning two periods: the first encompasses 2000 to late 2007 and the second encompasses early 2010 to 2016. The dependent variable for this model is the monthly average WTI crude oil price. The independent variables are based on what the academic community believes are the drivers of crude oil prices. While the studies may be scattered across different time periods, they provide valuable insight into what the academic community believes drives oil prices. The model includes variables that address two data groups: (1) market fundamentals and expectations of market fundamentals, and (2) speculation. One of the biggest challenges I faced was defining and quantifying "speculation". I ended up using a previous study's definition, which treats speculation as the activity of certain market participants in the Commitment of Traders report released by the Commodity Futures Trading Commission. My research shows that the West Texas Intermediate crude oil market exhibited a structural change after the Great Recession. It also presents findings that warrant further investigation. For example, I find that 3-month T-bills and 10yr Treasury notes lose their predictive edge starting in the second period (2010-2016). Furthermore, the positive correlation between oil and the U.S. dollar in the period 2000-2007 warrants further investigation. Lastly, it might be interesting to see why T-bills are positively correlated with WTI prices while 10yr Treasury notes are negatively correlated with WTI prices.
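The structural-change finding can be illustrated by fitting the same simple regression in each sub-period and comparing slopes. The sketch below uses a hand-rolled OLS slope and invented numbers, not the thesis's data or variables:

```python
def ols_slope(x, y):
    """Slope of a simple least-squares fit y = a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    return sxy / sxx

# Hypothetical monthly driver (e.g., a speculation index) vs. WTI price,
# in two sub-samples; all numbers are illustrative only.
pre_x,  pre_y  = [1, 2, 3, 4], [30, 35, 40, 45]   # driver tracks price pre-2008
post_x, post_y = [1, 2, 3, 4], [80, 79, 81, 80]   # relationship flattens after

b_pre = ols_slope(pre_x, pre_y)     # strong association in the first period
b_post = ols_slope(post_x, post_y)  # near-zero slope in the second period
```

A large gap between `b_pre` and `b_post` is the informal signature of a structural break; a formal treatment would use something like a Chow test on the pooled sample.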
ContributorsMirza, Hisham Tariq (Author) / McDaniel, Cara (Thesis director) / Budolfson, Arthur (Committee member) / Department of Finance (Contributor) / Department of Economics (Contributor) / Barrett, The Honors College (Contributor)
Created2016-05
Description
This thesis examines Endgame, a gaming-themed bar and restaurant located in the heart of Tempe, Arizona on Mill Avenue. The business serves regular bar fare and offers a wide selection of video games for its customers to play and enjoy. Recently, Endgame recognized that it was unsatisfied with its current revenue stream, prompting this investigative study. Upon completing this project, three business problems limiting Endgame's revenue growth were identified: food sales, visibility/access, and alcohol sales. To better understand each issue, ethnographic research was conducted and a survey was distributed to Endgame's target market: two instances of observational research were completed, and the survey reached 400+ students in the W. P. Carey School of Business. The data collected revealed underlying sentiments about Endgame's food/beverage service and issues related to locating the bar. The investigation showed that ordering food and beverages at Endgame is difficult and not a straightforward process, which led to a set of recommendations for creating an efficient and simple ordering process. The study also showed that Endgame (which is on the second floor of a building) lacks the appropriate signage to indicate its location; accordingly, recommendations were made for Endgame to add signage near the stairs and elevators. The research also revealed a general lack of consumer awareness of Endgame's alcoholic beverages, which contributed to low sales, leading to a strategy to revitalize Endgame's marketing campaign and a redesign of the beverage menu. Outside of the three business problems found during observational research, several other areas were examined in the survey at the request of Endgame's management, revealing additional insight into consumer behavior and feelings toward Endgame. 
These customer insights along with the recommendations given in this paper will be used by Endgame to increase their overall business revenues.
ContributorsPaplham, Tyler James (Author) / Eaton, John (Thesis director) / Mokwa, Michael (Committee member) / Department of Information Systems (Contributor) / Department of Marketing (Contributor) / Barrett, The Honors College (Contributor)
Created2016-05
Description
The ability to draft and develop productive Major League players is vital to the success of any MLB organization. A core of cost-controlled, productive players is as important as ever with free agent salaries continuing to rise dramatically. In a sport where mere percentage points separate winners from losers at the end of a long season, any slight advantage in identifying talent is valuable. This study examines the 2004-2008 MLB Amateur Drafts in order to analyze whether certain types of prospects are more valuable selections than others. If organizations can better identify which draft prospects will more likely contribute at the Major League level in the future, they can more optimally spend their allotted signing bonus pool in order to acquire as much potential production as possible through the draft. Based on the data examined, during these five drafts high school prospects provided higher value than college prospects. While college players reached the Majors at a higher rate, high school players produced greater value in their first six seasons of service time. In the all-important first round of the draft, where signing bonuses are at their largest, college players proved the more valuable selection. When players were separated by position, position players held greater expected value than pitchers, with corner infielders leading the way as the position group with the highest expected value. College players were found to provide better value than high school players at defensively demanding positions such as catcher and middle infield, while high school players were more valuable among outfielders and pitchers.
ContributorsGildea, Adam Joseph (Author) / Eaton, John (Thesis director) / McIntosh, Daniel (Committee member) / Department of Economics (Contributor) / W. P. Carey School of Business (Contributor) / Barrett, The Honors College (Contributor)
Created2016-05
Description
The Clean Power Plan seeks to reduce CO2 emissions in the energy industry, which is the largest source of CO2 emissions in the United States. In order to comply with the Clean Power Plan, electric utilities in Arizona will need to meet the electricity demand while reducing the use of fossil fuel sources in generation. The study first outlines the organization of the power sector in the United States and the structural and price changes attempted in the industry during the period of restructuring. The recent final rule of the Clean Power Plan is then described in detail with a narrowed focus on Arizona. Data from APS, a representative utility of Arizona, is used for the remainder of the analysis to determine the price increase necessary to cut Arizona's CO2 emissions in order to meet the federal goal. The first regression models the variables which affect total demand and thus generation load, from which we estimate the marginal effect of price on demand. The second regression models CO2 emissions as a function of different levels of generation. This allows the effect of generation on emissions to fluctuate with ranges of load, following the logic of the merit order of plants and changing rates of emissions for different sources. Two methods are used to find the necessary percentage increase in price to meet the CPP goals: one based on the mass-based goal for Arizona and the other based on the percentage reduction for Arizona. Then a price increase is calculated for a projection into the future using known changes in energy supply.
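The final step, translating an emissions-driven generation cut into a required price increase, can be sketched as a constant-elasticity back-of-the-envelope calculation; the elasticity and reduction target below are hypothetical, not APS estimates from the study:

```python
def required_price_increase(target_demand_cut, price_elasticity):
    """Percentage price increase needed to achieve a given percentage
    demand reduction under a constant own-price elasticity:
    %dQ = elasticity * %dP  =>  %dP = %dQ / elasticity."""
    return target_demand_cut / price_elasticity

# Hypothetical numbers: cut electricity demand 10% given an estimated
# own-price elasticity of -0.3 (electricity demand is quite inelastic).
pct = required_price_increase(target_demand_cut=-0.10, price_elasticity=-0.3)
# A 10% cut would require roughly a 33% price increase at this elasticity.
```

In the study's framework, the elasticity comes from the first regression (demand on price) and the target cut comes from mapping the second regression (emissions on load) back to the Clean Power Plan goal.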
ContributorsHerman, Laura Alexandra (Author) / Silverman, Daniel (Thesis director) / Kuminoff, Nicolai (Committee member) / Department of Economics (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created2016-05