This collection includes both ASU Theses and Dissertations, submitted by graduate students, and the Barrett, Honors College theses submitted by undergraduate students. 

Description
Many products undergo several stages of testing, ranging from tests on individual components to end-item tests. Additionally, these products may be further "tested" via customer or field use. The later failure of a delivered product may in some cases be due to circumstances that have no correlation with the product's inherent quality. At times, however, there may be cues in the upstream test data that, if detected, could serve to predict the likelihood of downstream failure or performance degradation induced by product use or environmental stresses. This study explores the use of downstream factory test data or product field reliability data to derive data-mining or pattern-recognition criteria for manufacturing-process or upstream test data by means of support vector machines (SVM), in order to provide reliability prediction models. In concert with a risk/benefit analysis, these models can be used to drive improvement of the product or, at least, to improve the reliability of the product delivered to the customer via screening. Such models can also aid in reliability risk assessment based on detectable correlations between the product test performance and the sources of supply, test stands, or other factors related to product manufacture. To enhance the usefulness of the SVM or hyperplane classifier in this context, L-moments and the Western Electric Company (WECO) rules are used to augment or replace the native process or test data used as inputs to the classifier. As part of this research, a generalizable binary classification methodology was developed that can be used to design and implement predictors of end-item field failure or downstream product performance based on upstream test data composed of single-parameter, time-series, or multivariate real-valued data. 
Additionally, the methodology provides input-parameter weighting factors that have proved useful in failure analysis and root-cause investigations as indicators of which of several upstream product parameters have the greatest influence on downstream failure outcomes.
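The L-moments mentioned above are robust distributional summaries (location, scale, and a skewness-like measure) that can replace raw test data as classifier inputs. The sketch below computes the first three sample L-moments via the standard probability-weighted-moment estimators; the function and variable names are illustrative, not taken from the thesis.

```python
def sample_l_moments(data):
    """Return the first three sample L-moments (l1, l2, l3)."""
    x = sorted(data)
    n = len(x)
    # Probability-weighted moments b0, b1, b2 (0-based indexing of sorted data)
    b0 = sum(x) / n
    b1 = sum(i / (n - 1) * x[i] for i in range(1, n)) / n
    b2 = sum(i * (i - 1) / ((n - 1) * (n - 2)) * x[i] for i in range(2, n)) / n
    l1 = b0                    # location (equals the sample mean)
    l2 = 2 * b1 - b0           # scale (dispersion analogue)
    l3 = 6 * b2 - 6 * b1 + b0  # skewness-like measure (0 for symmetric data)
    return l1, l2, l3
```

A feature vector of such moments, computed per test parameter, could then feed an SVM in place of the raw measurements.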
ContributorsMosley, James (Author) / Morrell, Darryl (Committee member) / Cochran, Douglas (Committee member) / Papandreou-Suppappola, Antonia (Committee member) / Roberts, Chell (Committee member) / Spanias, Andreas (Committee member) / Arizona State University (Publisher)
Created2011
Description
Federal education policies call for school district leaders to promote classroom technology integration to prepare students with 21st century skills. However, schools are struggling to integrate technology effectively, with students often reporting that they feel like they need to power down and step back in time technologically when they enter classrooms. The lack of meaningful technology use in classrooms indicates a need for increased teacher preparation. The purpose of this study was to investigate the impact a coaching model of professional development had on school administrators' abilities to increase middle school teachers' technology integration in their classrooms. This study attempted to coach administrators to develop and articulate a vision, cultivate a culture, and model instruction relative to the meaningful use of instructional technology. The study occurred in a middle school. Data for this case study were collected via administrator interviews, the Principal's Computer Technology Survey, structured observations using the Higher Order Thinking, Engaged Learning, Authentic Learning, Technology Use protocol, field notes, the Technology Integration Matrix, teacher interviews, and a research log. Findings concluded that cultivating change in an organization is a complex process that requires commitment over an extended period of time. The meaningful use of instructional technology remained minimal at the school during fall 2010. My actions as a change agent informed the school's administrators about the role meaningful use of technology can play in instruction. Limited professional development, administrative vision, and expectations minimized the teachers' meaningful use of instructional technology; competing priorities and limited time minimized the administrators' efforts to improve the meaningful use of instructional technology. 
Realizing that technology proficient teachers contribute to student success with technology, it may be wise for administrators to incorporate technology-enriched professional development and exercise their leadership abilities to promote meaningful technology use in classrooms.
ContributorsRobertson, Kristen (Author) / Moore, David (Thesis advisor) / Cheatham, Greg (Committee member) / Catalano, Ruth (Committee member) / Arizona State University (Publisher)
Created2011
Description
Infectious diseases are a leading cause of death worldwide. With the development of drugs, vaccines, and antibiotics, it was believed that for the first time in human history diseases would no longer be a major cause of mortality. Newly emerging diseases, re-emerging diseases, and the emergence of microorganisms resistant to existing treatment have forced us to re-evaluate that optimistic perspective. In this study, a simple mathematical framework for super-infection is considered in order to explore the transmission dynamics of drug resistance. Through its theoretical analysis, we identify the conditions necessary for coexistence between sensitive strains and drug-resistant strains. Further, in order to investigate the effectiveness of control measures, the model is extended to include vaccination and treatment, and the impact these preventive and control measures may have on the disease dynamics is evaluated. Theoretical results are confirmed via numerical simulations. Our theoretical results on two-strain drug-resistance models are applied in the context of Malaria, antimalarial drugs, and the administration of a possible partially effective vaccine. The objective is to develop a monitoring epidemiological framework that helps evaluate the impact of antimalarial drugs and a partially effective vaccine in reducing the disease burden at the population level. Optimal control theory is applied in the context of this framework in order to assess the impact of time-dependent, cost-effective treatment efforts. It is shown that cost-effective combinations of treatment efforts depend on the population size, the cost of implementing treatment controls, and the parameters of the model. We use these results to identify optimal control strategies for several scenarios.
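A minimal numerical sketch of the kind of two-strain transmission model described above: a drug-sensitive and a drug-resistant strain competing for susceptibles in an SIR-type system, integrated with forward Euler. All parameter values and function names here are illustrative assumptions, not taken from the thesis.

```python
def simulate_two_strain(beta_s=0.30, beta_r=0.25, gamma_s=0.10, gamma_r=0.05,
                        s0=0.98, is0=0.01, ir0=0.01, dt=0.1, steps=2000):
    """Forward-Euler integration of a two-strain SIR-type model.

    s: susceptible fraction; i_s / i_r: infected with the sensitive /
    resistant strain. Recovered individuals leave the system.
    """
    s, i_s, i_r = s0, is0, ir0
    for _ in range(steps):
        ds = -(beta_s * i_s + beta_r * i_r) * s        # new infections
        di_s = beta_s * s * i_s - gamma_s * i_s        # sensitive strain
        di_r = beta_r * s * i_r - gamma_r * i_r        # resistant strain
        s += dt * ds
        i_s += dt * di_s
        i_r += dt * di_r
    return s, i_s, i_r
```

With the resistant strain's slower clearance (smaller gamma_r), this toy setup illustrates how a resistant strain can persist and deplete the susceptible pool; the thesis's coexistence conditions are derived analytically, not from such a simulation.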
ContributorsUrdapilleta, Alicia (Author) / Castillo-Chavez, Carlos (Thesis advisor) / Wang, Xiaohong (Thesis advisor) / Wirkus, Stephen (Committee member) / Camacho, Erika (Committee member) / Arizona State University (Publisher)
Created2011
Description
This study follows three secondary teachers as they facilitate a digital storytelling project with their students for the first time. None of the three teachers was specifically trained in digital storytelling, in order to investigate what happens when a digital storytelling novice tries a project like this with his or her students. The study follows two high school English teachers and one middle school math teacher. Each teacher's experience is shared in a case study, and all three case studies are compared and contrasted in a cross-case analysis. There is a discussion of the types of projects the teachers conducted and the challenges they faced. Strategies to overcome the challenges are also included. A variety of assessment rubrics are included in the appendix. In the review of literature, the history of digital storytelling is illuminated, as are historical concepts of literacy. There is also an exploration of twenty-first century skills, including multiliteracies such as media and technology literacy. Both the teachers and their students offer suggestions to future teachers taking on digital storytelling projects. The dissertation ends with a discussion of future scholarship in educational uses of digital storytelling.
ContributorsGordon, Corrine (Author) / Blasingame, James (Thesis advisor) / Nilsen, Alleen P (Committee member) / Early, Jessica (Committee member) / Marsh, Josephine (Committee member) / Arizona State University (Publisher)
Created2011
Description
Diseases have been part of human life for generations and evolve within the population, sometimes dying out while at other times becoming endemic or the cause of recurrent outbreaks. The long-term influence of a disease stems from different dynamics within or between pathogen and host, which have been analyzed and studied by many researchers using mathematical models. Co-infection with different pathogens is common, yet little is known about how infection with one pathogen affects the host's immunological response to another. Moreover, no work has been found in the literature that considers the variability of the host's immune health or that examines a disease at the population level together with its interconnectedness with the host immune system. Knowing that the spread of disease in the population starts at the individual level, this thesis explores how variability in immune system response within an endemic environment affects an individual's vulnerability and susceptibility to co-infection. Immunology-based models of Malaria and Tuberculosis (TB) are constructed by extending and modifying existing mathematical models in the literature. The two are then combined to give a single nine-variable model of co-infection with Malaria and TB. Because the large number of parameters makes these models difficult to analyze, a phenomenological model of co-infection is proposed, with subsystems corresponding to the individual immunology-based models of a single infection. Within this phenomenological model, the variability of the host's immune health is incorporated through three different pathogen response curves, using nonlinear bounded Michaelis-Menten functions that describe the state of the immune system (healthy, moderately compromised, and severely compromised). The immunology-based models of Malaria and TB give numerical results that agree with biological observations. 
The Malaria-TB co-infection model gives reasonable results, which suggest that the order in which the two diseases are introduced has an impact on the behavior of both. The subsystems of the phenomenological model that correspond to a single infection (of either Malaria or TB) mimic much of the observed behavior of their immunology-based counterparts and can demonstrate different behavior depending on the chosen pathogen response curve. In addition, varying some of the parameters and initial conditions in the phenomenological model yields a range of topologically different mathematical behaviors, which suggests that this behavior may be observable in the immunology-based models as well. The phenomenological models clearly replicate the qualitative behavior of primary and secondary infection as well as co-infection. The mathematical solutions of the models correspond to the fundamental states described by immunologists: the virgin state, the immune state, and the tolerance state. The phenomenological model of co-infection also demonstrates a range of parameter values and initial conditions in which the introduction of a second disease causes both diseases to grow without bound, even though those same parameters and initial conditions did not yield unbounded growth in the corresponding subsystems. This result applies to all three states of the host immune system. In terms of the immunology-based system, this suggests that there may be parameter values and initial conditions under which a person can clear Malaria or TB separately, but in which the presence of both can result in the person dying of one of the diseases. Finally, this thesis studies links between epidemiology (the population level) and immunology in an effort to assess the impact of a pathogen's spread within the population on the immune response of individuals. 
Models of Malaria and TB are proposed that incorporate the immune system of the host into a mathematical model of an epidemic at the population level.
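The bounded Michaelis-Menten response curves described above saturate at a maximum response level, which makes them a natural way to encode immune states of differing strength. The sketch below shows the functional form with three hypothetical parameter pairs for the healthy, moderately compromised, and severely compromised states; the constants are assumptions for demonstration, not values from the thesis.

```python
def immune_response(pathogen_load, v_max, k_half):
    """Bounded Michaelis-Menten response: approaches v_max as load grows.

    k_half is the pathogen load at which the response reaches v_max / 2.
    """
    return v_max * pathogen_load / (k_half + pathogen_load)

# Hypothetical (v_max, k_half) pairs for the three immune states.
IMMUNE_STATES = {
    "healthy":  (1.0, 10.0),   # strong maximal response, sensitive trigger
    "moderate": (0.6, 20.0),
    "severe":   (0.3, 40.0),   # weak maximal response, sluggish trigger
}
```

Lowering v_max caps how strongly the host can ever respond, while raising k_half delays the response, which is one plausible way to represent a compromised immune system.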
ContributorsSoho, Edmé L (Author) / Wirkus, Stephen (Thesis advisor) / Castillo-Chavez, Carlos (Thesis advisor) / Chowell-Puente, Gerardo (Committee member) / Arizona State University (Publisher)
Created2011
Description

In this creative thesis project I use digital “scrolleytelling” (an interactive, scroll-based form of storytelling) to investigate diversity and inclusion at big tech companies. I wanted to know why diversity numbers were flatlining at Facebook, Apple, Amazon, Microsoft, and Google, and took a data journalism approach to explore the relationship between what corporations were saying versus what they were doing. Finally, I critiqued diversity and inclusion efforts by giving examples of how the current way we are addressing D&I is not fixing the problem.

ContributorsBrust, Jiaying Eliza (Author) / Coleman, Grisha (Thesis director) / Tinapple, David (Committee member) / Arts, Media and Engineering Sch T (Contributor) / Barrett, The Honors College (Contributor)
Created2021-05
Description

In recent years, advanced metrics have dominated the game of Major League Baseball. One such metric, the Pythagorean Win-Loss Formula, is commonly used by fans, reporters, analysts, and teams alike to estimate a team's expected winning percentage from its runs scored and runs allowed. However, this method is not perfect and shows notable room for improvement. One weakness is that it can be drastically affected by a single blowout game, a game in which one team significantly outscores its opponent.

We hypothesize that meaningless runs scored in blowouts are harming the predictive power of Pythagorean Win-Loss and similar win expectancy statistics such as the Linear Formula for Baseball and BaseRuns. We developed a win probability-based cutoff approach that tallied the score of each game once a certain win probability threshold was passed, effectively removing those meaningless runs from a team's season-long runs scored and runs allowed totals. These truncated totals were then inserted into the Pythagorean Win-Loss and Linear Formulas and tested against the base models.

The preliminary results show that, while certain runs are more meaningful than others depending on the situation in which they are scored, the base models more accurately predicted future record than our truncated versions. For now, there is not enough evidence to either confirm or reject our hypothesis. In this paper, we suggest several potential strategies for improving the results.

At the end, we address how these results speak to the importance of responsibility and restraint when using advanced statistics in reporting.
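For reference, the standard Pythagorean Win-Loss estimate discussed above computes expected winning percentage from runs scored (RS) and runs allowed (RA). The classic exponent is 2; analysts often use a value near 1.83 for a better empirical fit. The function below is a minimal sketch of that public formula, not of the thesis's truncated variant.

```python
def pythagorean_win_pct(runs_scored, runs_allowed, exponent=2.0):
    """Expected winning percentage: RS^k / (RS^k + RA^k)."""
    rs = runs_scored ** exponent
    ra = runs_allowed ** exponent
    return rs / (rs + ra)
```

The blowout sensitivity the thesis targets is visible here: adding 10 meaningless runs to a single lopsided win inflates `runs_scored` for the whole season, shifting the estimate even though those runs changed no game's outcome.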

ContributorsIversen, Joshua Allen (Author) / Satpathy, Asish (Thesis director) / Kurland, Brett (Committee member) / Department of Information Systems (Contributor) / Walter Cronkite School of Journalism and Mass Comm (Contributor) / Barrett, The Honors College (Contributor)
Created2021-05
Description

A journalistic, first-person narrative going through the lessons learned from travel. The story is complemented by a series of photos from childhood to the present all uploaded to a Wix-based website.

Created2021-05
Description
Many previous studies have analyzed human tutoring in great depth and have shown expert human tutors to produce effect sizes roughly twice those produced by an intelligent tutoring system (ITS). However, there has been no consensus on which factor makes them so effective. It is important to know this so that the same phenomena can be replicated in an ITS in order to achieve the same level of proficiency as expert human tutors. Also, to the best of my knowledge, no one has looked at student reactions when they are working with a computer-based tutor. The answers to both these questions are needed in order to build a highly effective computer-based tutor. My research focuses on the second question. In the first phase of my thesis, I analyzed the behavior of students as they worked with the step-based tutor Andes, using verbal-protocol analysis. Doing so revealed some of the ways in which students use a step-based tutor, which can pave the way for the creation of more effective computer-based tutors. I found from the first phase of the research that students often keep trying to fix errors by guessing repeatedly instead of asking for help by clicking the hint button. This phenomenon is known as hint refusal. Surprisingly, a large portion of the students' floundering was due to hint refusal. The hypothesis tested in the second phase of the research is that hint refusal can be significantly reduced, and learning significantly increased, if Andes uses more unsolicited hints and meta-hints. An unsolicited hint is a hint that is given without the student asking for one. A meta-hint is like an unsolicited hint in that it is given without the student asking for it, but it merely prompts the student to click on the hint button. Two versions of Andes were compared: the original version and a new version that gave more unsolicited hints and meta-hints. 
During a two-hour experiment, there were large, statistically reliable differences in several performance measures suggesting that the new policy was more effective.
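The hint policy described above can be sketched as a simple decision rule: escalate from waiting, to a meta-hint, to an unsolicited hint as consecutive errors accumulate without a hint request. The thresholds and names below are illustrative assumptions, not the actual Andes policy.

```python
def next_tutor_action(consecutive_errors, hint_requested,
                      meta_threshold=2, unsolicited_threshold=4):
    """Choose the tutor's next move for one student step."""
    if hint_requested:
        return "solicited_hint"        # student clicked the hint button
    if consecutive_errors >= unsolicited_threshold:
        return "unsolicited_hint"      # give help without being asked
    if consecutive_errors >= meta_threshold:
        return "meta_hint"             # prompt: "try clicking the hint button"
    return "wait"                      # let the student keep working
```

The point of the meta-hint tier is to counter hint refusal gently: the student is nudged toward the existing help channel before the tutor intervenes outright.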
ContributorsRanganathan, Rajagopalan (Author) / VanLehn, Kurt (Thesis advisor) / Atkinson, Robert (Committee member) / Burleson, Winslow (Committee member) / Arizona State University (Publisher)
Created2011
Description
A new method of adaptive mesh generation for the computation of fluid flows is investigated. The method utilizes gradients of the flow solution to adapt the size and stretching of elements or volumes in the computational mesh as is commonly done in the conventional Hessian approach. However, in the new method, higher-order gradients are used in place of the Hessian. The method is applied to the finite element solution of the incompressible Navier-Stokes equations on model problems. Results indicate that a significant efficiency benefit is realized.
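The core idea above, sizing mesh elements from gradients of the flow solution, can be illustrated in one dimension: where the solution changes rapidly, the target element size shrinks. The scaling rule and constants below are a simplified assumption for demonstration; the thesis's method uses higher-order gradients in place of this first-order surrogate.

```python
def target_element_sizes(u, x, h_min=0.01, h_max=0.5):
    """Given solution samples u at nodes x, return a target size per interval.

    Larger local gradient -> smaller target element; sizes are clamped
    to [h_min, h_max].
    """
    sizes = []
    for i in range(len(x) - 1):
        grad = abs((u[i + 1] - u[i]) / (x[i + 1] - x[i]))
        h = h_max / (1.0 + grad)
        sizes.append(min(h_max, max(h_min, h)))
    return sizes
```

An adaptive solver would remesh to these targets and re-solve, iterating until the solution and mesh are mutually consistent.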
ContributorsShortridge, Randall (Author) / Chen, Kang Ping (Thesis advisor) / Herrmann, Marcus (Thesis advisor) / Wells, Valana (Committee member) / Huang, Huei-Ping (Committee member) / Mittelmann, Hans (Committee member) / Arizona State University (Publisher)
Created2011