Description
The nonprofit sector has experienced exponential growth in recent decades, creating a distinct industry for nonprofits, an industry that requires education and training to run efficiently and successfully. As a result, Nonprofit Management Education (NME) at both the graduate and undergraduate levels has steadily increased in number and demand. Recent changes in the political climate and in government funding present new challenges to nonprofit professionals, enhancing the value of specific NME that prepares professionals for these challenges. To leverage NME and ensure that students are adequately prepared for these challenges, it is important to design curricula that address the needs of the growing nonprofit industry. The Nonprofit Academic Centers Council (NACC) is the creator of the NACC Curricular Guidelines, which currently serve as the model that NME curricula should emulate. This study uses Arizona State University (ASU) as a case study, comparing its current curriculum model to the NACC Curricular Guidelines as well as to the current challenges facing the nonprofit sector. In so doing, this study provides an in-depth overview of NME at ASU through 1) focus groups of nonprofit leaders; 2) survey data from former students; and 3) curriculum mapping.

The comprehensive results indicated areas of opportunity for both ASU and the NACC Curricular Guidelines. Based on the feedback of students and nonprofit professionals and the current state of the ASU curriculum, ASU may wish to increase emphasis on Financial Management; Managing Staff and Volunteers; Assessment, Evaluation, and Decision Making; and Leading and Managing Nonprofit Organizations. After considering feedback from nonprofit professionals, NACC may consider adding new competencies that reflect an emphasis on collective impact, cross-sector leadership, relationship building, and the use of technology for nonprofit impact. The research team recommends accomplishing these changes by enhancing pedagogy, including case studies and an integrated curriculum in the ASU NME program. By applying the suggested changes to both the ASU curriculum and the NACC guidelines, this research prepares both ASU and NACC for the process of accreditation and for formalizing the NLM degree on a national level.
Contributors: Findlay, Molly Rebecca (Author) / Legg, Eric (Thesis director) / Ashcraft, Robert (Committee member) / Department of Information Systems (Contributor) / School of Community Resources and Development (Contributor) / Barrett, The Honors College (Contributor)
Created: 2017-12
Description
This thesis, entitled "A Community Perspective on Alcohol Education," was conducted over a ten-month period during the Spring 2014 and Fall 2014 semesters by Christopher Stuller and Nicholas Schmitzer. The research involved interviewing twelve professionals from Arizona State University and the City of Tempe to gather a holistic view of alcohol education and alcohol safety as they involve students at ASU. Upon completion of the interviews, recommendations were made regarding areas of improvement for alcohol education and alcohol safety at Arizona State University. These recommendations range from creating a mandatory alcohol education class to passing a Guardian Angel Law to creating a national network of alcohol education best practices. Through this thesis, the authors hope to prevent future alcohol-related injuries, deaths, and tragedies. For the final display of this thesis, a website was created; for ease of reading, all information here is presented in text format.
Contributors: Schmitzer, Nicholas (Co-author) / Stuller, Christopher (Co-author) / Koretz, Lora (Thesis director) / Scott Lynch, Jacquelyn (Committee member) / Barrett, The Honors College (Contributor) / Department of Information Systems (Contributor) / School of Accountancy (Contributor) / Department of Supply Chain Management (Contributor)
Created: 2014-12
Description
In the words of W. Edwards Deming, "the central problem in management and in leadership is failure to understand the information in variation." While many quality management programs propose instituting technical training in advanced statistical methods, this paper proposes that by understanding the fundamental information behind statistical theory, and by minimizing bias and variance while fully utilizing the available information about the system at hand, one can make valuable, accurate predictions about the future. Combining this knowledge with the work of quality gurus W. E. Deming, Eliyahu Goldratt, and Dean Kashiwagi yields a framework for making valuable predictions for continuous improvement. After this information is synthesized, it is concluded that the best way to make accurate, informative predictions about the future is to "balance the present and future," seeing the future through the lens of the present and thus minimizing bias, variance, and risk.
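The bias and variance minimization the abstract invokes can be illustrated with a short simulation; the sketch below (invented for illustration, not drawn from the thesis) compares an unbiased estimator of a known mean against a deliberately biased, lower-variance alternative:

```python
import random

# Invented illustration (not from the thesis): estimate the bias and variance
# of two estimators of a known mean, showing the bias-variance tradeoff.
random.seed(42)
TRUE_MEAN = 10.0
N_TRIALS = 10_000
SAMPLE_SIZE = 5

def simulate(estimator):
    """Monte Carlo estimate of an estimator's bias and variance."""
    estimates = []
    for _ in range(N_TRIALS):
        sample = [random.gauss(TRUE_MEAN, 2.0) for _ in range(SAMPLE_SIZE)]
        estimates.append(estimator(sample))
    mean_est = sum(estimates) / N_TRIALS
    bias = mean_est - TRUE_MEAN
    variance = sum((e - mean_est) ** 2 for e in estimates) / N_TRIALS
    return bias, variance

def sample_mean(s):
    return sum(s) / len(s)              # unbiased, but higher variance

def shrunk_mean(s):
    return 0.8 * sum(s) / len(s)        # biased toward zero, lower variance

for name, est in [("sample mean", sample_mean), ("shrunk mean", shrunk_mean)]:
    bias, var = simulate(est)
    print(f"{name}: bias={bias:+.3f} variance={var:.3f}")
```

Neither estimator dominates: which one predicts better depends on how the squared bias and the variance trade off, which is the kind of judgment the framework above is meant to support.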
Contributors: Synodis, Nicholas Dahn (Author) / Kashiwagi, Dean (Thesis director, Committee member) / Barrett, The Honors College (Contributor) / School of Mathematical and Statistical Sciences (Contributor)
Created: 2015-05
Description
The NFL is one of the largest and most influential industries in the world. In America, few organizations have a stronger hold on the culture or create such a phenomenon from year to year. This project aimed to develop a strategy that helps an NFL team be as successful as possible by defining which positions are most important to a team's success. Data from fifteen years of NFL games were collected, and information on every player in the league was analyzed. First, a benchmark describing an average team was established; every player in the NFL was then compared to that average. Using linear regression with ordinary least squares, this project defines a model that shows each position's importance. Finally, once such a model had been established, the focus turned to the NFL draft, with the goal of finding a strategy for where each position should be drafted so that it is most likely to give the best payoff based on the results of the regression in part one.
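The OLS approach described above can be sketched in a few lines; the position groups, "importance" weights, and data here are entirely hypothetical stand-ins for the thesis's fifteen-year NFL dataset:

```python
import numpy as np

# Hypothetical sketch of regressing team outcomes on position-group
# performance; the groups, weights, and data are invented, not the thesis's.
rng = np.random.default_rng(0)
n_teams = 200
# Columns: QB, OL, DEF performance, in standard deviations above league average
X = rng.normal(size=(n_teams, 3))
true_weights = np.array([6.0, 3.0, 4.0])      # assumed importance weights
point_diff = X @ true_weights + rng.normal(scale=2.0, size=n_teams)

# Ordinary least squares: choose b to minimize ||y - Xb||^2
X1 = np.column_stack([np.ones(n_teams), X])   # prepend an intercept column
coef, *_ = np.linalg.lstsq(X1, point_diff, rcond=None)
for name, b in zip(["intercept", "QB", "OL", "DEF"], coef):
    print(f"{name:9s} {b:+.2f}")
```

The fitted coefficients recover the assumed weights, and their relative sizes are exactly the "position importance" ranking the project uses to inform draft strategy.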
Contributors: Balzer, Kevin Ryan (Author) / Goegan, Brian (Thesis director) / Dassanayake, Maduranga (Committee member) / Barrett, The Honors College (Contributor) / Economics Program in CLAS (Contributor) / School of Mathematical and Statistical Sciences (Contributor)
Created: 2015-05
Description
The concentration factor edge detection method was developed to compute the locations and values of jump discontinuities in a piecewise-analytic function from its first few Fourier series coefficients. The method approximates the singular support of a piecewise smooth function using an altered Fourier conjugate partial sum. The accuracy and characteristic features of the resulting jump function approximation depend on these filters, known as concentration factors. Recent research showed that these concentration factors can be designed using a flexible iterative framework, improving the overall accuracy and robustness of the method, especially in cases where some Fourier data are untrustworthy or altogether missing. Hypothesis testing methods were used to determine how well the original concentration factor method could locate edges using noisy Fourier data. This thesis combines the iterative design aspect of concentration factors with hypothesis testing by presenting a new algorithm that incorporates multiple concentration factors into one statistical test, which proves more effective at determining jump discontinuities than previous hypothesis testing methods. This thesis also examines how the quantity and location of Fourier data affect the accuracy of these methods. Numerical examples are provided.
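As a rough illustration of the underlying (non-iterative) concentration factor method, the sketch below recovers the jump of the sawtooth f(x) = x on [-π, π) from its exact Fourier coefficients, using the standard trigonometric concentration factor; this is a textbook example, not the thesis's multi-factor statistical test:

```python
import numpy as np

# Textbook sketch of the basic concentration factor method, NOT the thesis's
# algorithm: recover the jump of the sawtooth f(x) = x on [-pi, pi), which
# drops by 2*pi at x = +/-pi, from its exact Fourier coefficients
# f_hat_k = i(-1)^k / k.
N = 64                                        # highest Fourier mode used
x = np.linspace(-np.pi, np.pi, 512, endpoint=False)

k = np.arange(-N, N + 1)
fhat = np.zeros(2 * N + 1, dtype=complex)
nz = k != 0
fhat[nz] = 1j * (-1.0) ** k[nz] / k[nz]

# Trigonometric concentration factor sigma(eta) = pi*sin(pi*eta)/Si(pi),
# where Si(pi) is the sine integral evaluated at pi (a known constant).
Si_pi = 1.8519370
sigma = np.pi * np.sin(np.pi * np.abs(k) / N) / Si_pi

# Altered conjugate partial sum: approximates the jump function [f](x)
S = 1j * ((np.sign(k) * sigma * fhat) @ np.exp(1j * np.outer(k, x)))
jump = S.real

loc = x[np.argmin(jump)]
print(f"jump of {jump.min():.3f} detected at x = {loc:.3f} (exact: -2*pi at -pi)")
```

The approximation concentrates at the singular support: it is near zero away from x = -π and close to the true jump height -2π at the discontinuity, which is the behavior the hypothesis tests in the thesis are designed to detect under noise.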
Contributors: Lubold, Shane Michael (Author) / Gelb, Anne (Thesis director) / Cochran, Doug (Committee member) / Viswanathan, Aditya (Committee member) / Economics Program in CLAS (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05
Description
This piece discusses the roles of emerging geographies within the context of global supply chains, approaching the conversation with a "systems" view and emphasizing three facets essential to a holistic and interdisciplinary environmental analysis:
- The Implications of Governmental & Economic Activities
- Supply Chain Enablement Activities, Risk Mitigation in Emerging Nations
- Implications Regarding Sustainability, Corporate Social Responsibility
By appreciating the interdisciplinary implications that stem from participation in global supply networks, supply chain professionals can position their firms for continued success through the proactive construction of robust and resilient supply chains. Across industries, how will supply networks in emerging geographies continue to evolve? Appreciating the inherent nuances of a region's political and economic climate, the extent to which enablement activities must occur, and the sustainability/CSR tie-ins will be key to acquiring this understanding. This deliverable leverages the work of philosophers, researchers, and business personnel as these questions are explored. The author also introduces a novel method of teaching (IMRS) in the undergraduate business classroom that challenges students to integrate their prior experiences in the classroom and in the business world as they learn to craft locally relevant solutions to complex global problems.
Contributors: Vaney, Rachel Lee (Author) / Maltz, Arnold (Thesis director) / Kellso, James (Committee member) / Barrett, The Honors College (Contributor) / Department of Supply Chain Management (Contributor) / Department of Information Systems (Contributor)
Created: 2015-05
Description
Over the course of six months, we have worked in partnership with Arizona State University and a leading producer of semiconductor chips in the United States market (referred to as the "Company"), lending our skills in finance, statistics, model building, and external insight. We attempt to design models that help predict how much time it takes to implement a cost-saving project. These projects had previously been evaluated only on the merit of their cost savings; with the added dimension of time, we hope to forecast implementation time as a function of a number of variables. With such a forecast, we can then apply it to an expense project prioritization model that relates time and cost savings, compares many different projects simultaneously, and returns a series of present value calculations over different ranges of time. The goal is twofold: assist with an accurate prediction of a project's time to implementation, and provide a basis for comparing different projects based on their present values, ultimately helping to reduce the Company's manufacturing costs and improve gross margins. We believe this approach, and the research conducted toward this goal, is most valuable for the Company. Two coaches from the Company have provided assistance and clarified our questions when necessary throughout our research. In this paper, we begin by defining the problem, setting an objective, and establishing a checklist to monitor our progress. Next, our attention shifts to the data: making observations, trimming the dataset, and framing and scoping the variables to be used in the analysis portion of the paper. Before forming a hypothesis, we perform a preliminary statistical analysis of certain individual variables to enrich our variable selection process. After the hypothesis, we run multiple linear regressions with project duration as the dependent variable. Following the regression analysis and a test for robustness, we shift our focus to an intuitive model based on rules of thumb.
We relate these models to an expense project prioritization tool developed using Microsoft Excel software. Our deliverables to the Company come in the form of (1) a rules-of-thumb intuitive model and (2) an expense project prioritization tool.
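The present-value logic behind such a prioritization tool can be sketched briefly; the project names, savings figures, implementation times, discount rate, and horizon below are entirely hypothetical, not the Company's data:

```python
# Illustrative sketch (not the Company's actual Excel tool): rank cost-saving
# projects by the present value of their annual savings, discounted over a
# predicted implementation delay. All figures are hypothetical.

def present_value(annual_savings, months_to_implement, horizon_years=5,
                  annual_rate=0.10):
    """PV of a savings stream that only starts after the implementation delay."""
    delay = months_to_implement / 12.0
    pv = 0.0
    for year in range(horizon_years):
        t = delay + year + 1           # savings received at end of each year
        pv += annual_savings / (1 + annual_rate) ** t
    return pv

projects = [
    ("Tool retrofit",      120_000, 4),    # (name, $/yr savings, months)
    ("Process automation", 200_000, 18),
    ("Vendor switch",       90_000, 2),
]
ranked = sorted(projects, key=lambda p: -present_value(p[1], p[2]))
for name, savings, months in ranked:
    print(f"{name:18s} PV = ${present_value(savings, months):,.0f}")
```

Because time to implementation discounts every future savings cash flow, a slower project must offer proportionally larger savings to outrank a fast one, which is exactly why forecasting duration matters for the prioritization model.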
Contributors: Al-Assi, Hashim (Co-author) / Chiang, Robert (Co-author) / Liu, Andrew (Co-author) / Ludwick, David (Co-author) / Simonson, Mark (Thesis director) / Hertzel, Michael (Committee member) / Barrett, The Honors College (Contributor) / Department of Information Systems (Contributor) / Department of Finance (Contributor) / Department of Economics (Contributor) / Department of Supply Chain Management (Contributor) / School of Accountancy (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Mechanical and Aerospace Engineering Program (Contributor) / WPC Graduate Programs (Contributor)
Created: 2015-05
Description
A thorough understanding of the key concepts of logic is critical for student success. Logic is often not explicitly taught as its own subject in modern curricula, which results in misconceptions among students as to what comprises logical reasoning. In addition, current standardized testing schemes often promote teaching styles that emphasize students' ability to memorize set problem-solving methods over their capacity to reason abstractly and creatively. These phenomena, in tandem with halting progress in United States education compared to other developed nations, suggest that implementing logic courses in public schools and universities can better prepare students for professional careers and beyond. In particular, logic is essential for mathematics students as they transition from calculation-based courses to theoretical, proof-based classes. Many students find this adjustment difficult, and existing university-level courses that emphasize the technical aspects of symbolic logic do not fully bridge the gap between these two approaches to mathematics. As a step toward resolving this problem, this project proposes a logic course that integrates historical, technical, and interdisciplinary investigations to present logic as a robust and meaningful subject warranting independent study. This course is designed with mathematics students in mind, with particular emphasis on different formulations of deductively valid proof schemes. Additionally, this class can either be taught before existing logic classes, in an effort to gradually expose students to logic over an extended period of time, or replace current logic courses as a more holistic introduction to the subject. The first section of the course investigates historical developments in the study of argumentation and logic across civilizations; specifically, the works of ancient China, ancient India, ancient Greece, medieval Europe, and modernity are investigated.
Along the way, several important themes are highlighted within appropriate historical contexts; these are often presented in an ad hoc way in courses emphasizing the technical features of symbolic logic. After the motivations for modern symbolic logic are established, its key technical features are presented, including logical connectives, truth tables, logical equivalence, derivations, predicates, and quantifiers. Potential obstacles to students' understanding of these ideas are anticipated, and resolution methods are proposed. Finally, examples of how the ideas of symbolic logic manifest in many modern disciplines are presented. In particular, key concepts in game theory, computer science, biology, grammar, and mathematics are reformulated in the context of symbolic logic. By combining the three perspectives of historical context, technical aspects, and practical applications of symbolic logic, this course will ideally make logic a more meaningful and accessible subject for students.
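The truth tables and logical equivalences named above are easy to generate mechanically; a small example (ours, not the thesis's) verifying the classic equivalence p → q ≡ ¬p ∨ q:

```python
from itertools import product

# A small generated truth table of the kind the course's technical section
# covers, verifying the equivalence  p -> q  ==  (not p) or q.
def implies(p, q):
    return (not p) or q

print("p     q     p->q  ~p|q")
for p, q in product([True, False], repeat=2):
    lhs, rhs = implies(p, q), (not p) or q
    print(f"{p!s:5} {q!s:5} {lhs!s:5} {rhs!s:5}")
```

Exhaustively checking all assignments like this is precisely the truth-table notion of logical equivalence, and it contrasts with the derivation-based proofs the course also presents.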
Contributors: Ryba, Austin (Author) / Vaz, Paul (Thesis director) / Jones, Donald (Committee member) / School of Mathematical and Statistical Sciences (Contributor) / School of Historical, Philosophical and Religious Studies (Contributor) / Barrett, The Honors College (Contributor)
Created: 2018-05
Description
Coherent vortices are ubiquitous structures in natural flows that affect the mixing and transport of substances, momentum, and energy. Being able to detect these coherent structures is important for pollutant mitigation, ecological conservation, and many other applications. In recent years, mathematical criteria and algorithms have been developed to extract these coherent structures in turbulent flows. In this study, we apply these tools to extract important coherent structures and analyze their statistical properties as well as their implications for the kinematics and dynamics of the flow. Such information will aid the representation of small-scale nonlinear processes that large-scale models of natural processes may not be able to resolve.
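The abstract does not name the extraction criteria used; as one standard example from the vortex-identification literature (purely illustrative, and not necessarily the criterion used in this study), the sketch below evaluates the Okubo-Weiss parameter on a synthetic Gaussian vortex:

```python
import numpy as np

# Illustrative only: compute the Okubo-Weiss parameter
# W = (normal strain)^2 + (shear strain)^2 - (vorticity)^2
# on a synthetic Gaussian vortex. Regions with W < 0 are rotation-dominated
# and mark candidate vortex cores.
n = 128
x = np.linspace(-3.0, 3.0, n)
X, Y = np.meshgrid(x, x)
R2 = X**2 + Y**2
U, V = -Y * np.exp(-R2), X * np.exp(-R2)      # one synthetic vortex at the origin

dx = x[1] - x[0]
Uy, Ux = np.gradient(U, dx)                   # gradients along rows (y), cols (x)
Vy, Vx = np.gradient(V, dx)

W = (Ux - Vy) ** 2 + (Vx + Uy) ** 2 - (Vx - Uy) ** 2
core = W < 0
print(f"rotation-dominated region covers {core.mean():.1%} of the domain")
print(f"W near the center = {W[n // 2, n // 2]:.2f}")
```

Thresholding W is only one of several criteria in use; once candidate structures are flagged this way, their sizes, strengths, and lifetimes can be tallied for the kind of statistical analysis the study describes.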
Contributors: Cass, Brentlee Jerry (Author) / Tang, Wenbo (Thesis director) / Kostelich, Eric (Committee member) / Department of Information Systems (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created: 2018-05
Description
Exchange-traded funds (ETFs) are in many ways similar to more traditional closed-end mutual funds, although they differ in a crucial way. ETFs rely on a creation and redemption feature to achieve their functionality, and this mechanism is designed to minimize the deviations that occur between an ETF's listed price and the net asset value of its underlying assets. While this does cause ETF deviations to be generally lower than those of their mutual fund counterparts, as our paper explores, the process does not eliminate these deviations completely. This article builds on an earlier paper by Engle and Sarkar (2006) that investigates these premiums (discounts) of ETFs from their fair market value, and examines whether these premia have changed in the last 10 years. Our paper then diverges from the original and takes a deeper look specifically into the standard deviations of these premia.

Our findings show that over 70% of an ETF's standard deviation of premia can be explained by a linear combination of two variables: a categorical variable (Domestic [US], Developed, Emerging) and a discrete variable (time difference from the US). This paper also finds that more traditional metrics such as market cap and ETF price volatility, and even third-party market indicators such as the economic freedom index and the investment freedom index, are insignificant predictors of an ETF's standard deviation of premia when combined with the categorical variable. These findings differ somewhat from the existing literature, which indicates that these factors should be significant in predicting an ETF's standard deviation of premia.
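The two-variable specification described above can be sketched as follows; the data are simulated and the category effects assumed, purely to illustrate a categorical-plus-time-difference regression, not to reproduce the paper's results:

```python
import numpy as np

# Hypothetical illustration (data simulated, effects assumed; not the paper's
# dataset): regress an ETF's standard deviation of premia on a market-category
# variable (Domestic[US], Developed, Emerging) plus time difference from the US.
rng = np.random.default_rng(1)
n = 300
categories = rng.integers(0, 3, size=n)       # 0=US, 1=Developed, 2=Emerging
hours_from_us = np.where(categories == 0, 0, rng.integers(5, 14, size=n))
base_effect = np.array([0.05, 0.30, 0.55])[categories]   # assumed category effects
sd_premia = base_effect + 0.02 * hours_from_us + rng.normal(scale=0.05, size=n)

# Design matrix: intercept, two category dummies, and the time difference
X = np.column_stack([
    np.ones(n),
    (categories == 1).astype(float),
    (categories == 2).astype(float),
    hours_from_us,
])
beta, *_ = np.linalg.lstsq(X, sd_premia, rcond=None)
resid = sd_premia - X @ beta
r2 = 1.0 - resid.var() / sd_premia.var()
print("coefficients (intercept, Developed, Emerging, hours):", np.round(beta, 3))
print(f"R^2 = {r2:.2f}")
```

Encoding the three-level category as two dummy variables against a US baseline is the standard way to put a categorical regressor into OLS; the R² reported here plays the role of the explained-variation figure quoted above.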
Contributors: Zhang, Jingbo (Co-author) / Henning, Thomas (Co-author) / Simonson, Mark (Thesis director) / Licon, L. Wendell (Committee member) / Department of Finance (Contributor) / Department of Information Systems (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created: 2019-05