Description
This is a study of scientific realism, and of the extent to which it is undermined by objections that have been raised by advocates of various forms of antirealism. I seek to develop and present a version of scientific realism that improves on past formulations, and then to show that standard antirealist arguments against it do not succeed. In this paper, I will first present my formulation of scientific realism, which conceives of theories as model-based and as fundamentally non-linguistic. I advocate an epistemic position that accords with indirect realism, and I review and assess the threat posed by theses of underdetermination. Next, I review and discuss three important views: the antirealist constructivist view of Thomas Kuhn, the realist view of Norwood Hanson, and the antirealist constructive empiricist view of Bas van Fraassen. I find merits and flaws in all three views. In the course of those discussions, I develop the theme that antirealists' arguments generally depend on assumptions that are open to question, especially from the perspective of the version of realism I advocate. I further argue that these antirealist views are undermined by their own tacit appeals to realism.
Contributors: Novack, Alexander Dion (Author) / Armendt, Brad (Thesis advisor) / Creath, Richard (Committee member) / French, Peter (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
The academic literature on science communication widely acknowledges a problem: science communication between experts and lay audiences is important, but it is not done well. General audience popular science books, however, carry a reputation for clear science communication and are understudied in the academic literature. For this doctoral dissertation, I utilize Sam Harris's The Moral Landscape, a general audience science book on the particularly thorny topic of neuroscientific approaches to morality, as a case study to explore the possibility of using general audience science books as models for science communication more broadly. I conduct a literary analysis of the text that delimits the scope of its project, its intended audience, and the domains of science to be communicated. I also identify seven literary aspects of the text: three positive aspects that facilitate clarity and four negative aspects that interfere with lay public engagement. I conclude that The Moral Landscape relies on an assumed knowledge base and intuitions of its audience that cannot reasonably be expected of lay audiences; therefore, it cannot properly be construed as popular science communication. It nevertheless contains normative lessons for the broader science communication project, both in literary aspects to be salvaged and in literary aspects and concepts to be consciously avoided and combated. I note that The Moral Landscape's failings can also be taken as an indication that typical descriptions of science communication offer under-detailed taxonomies of both audiences for science communication and the varieties of science communication aimed at those audiences. Future directions of study include rethinking appropriate target audiences for science literacy projects and developing a more discriminating taxonomy of both science communication and lay publics.
Contributors: Johnson, Nathan W (Author) / Robert, Jason S (Thesis advisor) / Creath, Richard (Committee member) / Martinez, Jacqueline (Committee member) / Sylvester, Edward (Committee member) / Lynch, John (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
Once perceived as an unimportant occurrence in living organisms, cell degeneration was reconfigured as an important biological phenomenon in development, aging, health, and disease in the twentieth century. This dissertation tells a twentieth-century history of scientific investigations of cell degeneration, including cell death and aging. By describing four central developments in cell degeneration research across the four major chapters, I trace the emergence of the degenerating cell as a scientific object, describe the generation of a variety of concepts, interpretations, and usages associated with cell death and aging, and analyze the transformative influence of the rising field of cell degeneration research. In particular, the four chapters show how changing scientific practices concerning cellular life in embryology, cell culture, aging research, and the molecular biology of Caenorhabditis elegans shaped twentieth-century interpretations of cell degeneration as life-shaping, limit-setting, complex, yet regulated. These events created and consolidated important concepts in the life sciences such as programmed cell death, the Hayflick limit, apoptosis, and death genes. These cases also subsequently transformed material and epistemic practices concerning the end of cellular life and led to the formation of new research communities. Together, the four cases show how cell degeneration became a subject shared among molecular cell biology, developmental biology, gerontology, oncology, and the pathology of degenerative diseases. These practices and perspectives created a special kind of interconnectivity between different fields and produced a level of interdisciplinarity within cell degeneration research by the early 1990s.
Contributors: Jiang, Lijing (Author) / Maienschein, Jane (Thesis advisor) / Laubichler, Manfred (Thesis advisor) / Hurlbut, James (Committee member) / Creath, Richard (Committee member) / White, Michael (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
The present essay addresses the epistemic difficulties involved in achieving consensus with respect to the Hayek-Keynes debate. In particular, it is argued that the debate cannot be settled on the basis of the observable evidence; or, more precisely, that the empirical implications of the theories of Hayek and Keynes are such that, regardless of what is observed, both of the theories can be interpreted as true, or at least, not falsified. Regardless of the evidence, both Hayek and Keynes can be interpreted as right. The underdetermination of theories by evidence is an old and ubiquitous problem in science. The present essay makes explicit the respects in which the empirical evidence underdetermines the choice between the theories of Hayek and Keynes. In particular, it is argued both that there are convenient responses one can offer that protect each theory from what appears to be threatening evidence (i.e., that the choice between the two theories is underdetermined in the holist sense) and that, for particular kinds of evidence, the two theories are empirically equivalent (i.e., with respect to certain kinds of evidence, the choice between the two theories is underdetermined in the contrastive sense).
Contributors: Scheall, Scott (Author) / Creath, Richard (Thesis advisor) / Armendt, Brad (Committee member) / French, Peter (Committee member) / Arizona State University (Publisher)
Created: 2012
Description
Gay identity is usually cast in generics--statements about an indeterminate number of members of a given category. Sometimes these generic statements get built up into folk definitions, vague and imprecise ways to talk about objects. Other times generics get co-opted into authentic definitions, definitions that pick out a few traits and assert that real members of the class have these traits, while members that do not are members only by a technicality. I assess how we adopt these generic traits into our language and what the ramifications are of using generic traits as a social identity. I analyze the use of authentic definitions in Queer Theory, particularly Michael Warner's use of authentic traits to define a normative Queer identity. I focus not simply on what the effects are, but on how these folk or authentic definitions gain currency and, furthermore, how they can be changed. I conclude with an analytic account of what it means to be gay and argue that such an account will undercut many of the problems associated with folk or authentic definitions about being gay.
Contributors: Blankschaen, Kurt (Author) / Calhoun, Cheshire (Thesis advisor) / Pinillos, Angel (Committee member) / Creath, Richard (Committee member) / Arizona State University (Publisher)
Created: 2012
Description
Influenza has shown its potential to affect and even kill millions of people within an extremely short time frame, yet studies and surveys show that the general public is not well educated about influenza, including its prevention and treatment. For this reason, public perception of influenza is extremely skewed, with people generally not taking the disease as seriously as its severity warrants. To investigate the inconsistencies between action and awareness of the best available knowledge regarding influenza, this study conducted a literature review and a survey of university students about their knowledge, perceptions, and actions in relation to influenza. Due to their dense living quarters, constant daily interactions, and mindset that they are "immune" to fairly common diseases like influenza, university students are a representative sample of urban populations. According to the World Health Organization (WHO), 54% of the world's population lived in cities as of 2014 (Urban population growth). Between 2015 and 2020, the global urban population is expected to grow 1.84% per year, 1.63% between 2020 and 2025, and 1.44% between 2025 and 2030 (Urban population growth). Similar projections estimate that by 2017, an overwhelming majority of the world's population, even in less developed countries, will be living in cities (Urban population growth). Results of this study suggest possible reasons for the large gap between the best available knowledge on the one hand and the perceptions and actions of individuals on the other. This may lead to better-oriented influenza education initiatives, more effective prevention and treatment plans, and generally greater excitement and awareness surrounding public health and scientific communication.
Contributors: Gur-Arie, Rachel Ellen Haviva (Author) / Maienschein, Jane (Thesis director) / Laubichler, Manfred (Committee member) / Creath, Richard (Committee member) / Barrett, The Honors College (Contributor) / School of Life Sciences (Contributor)
Created: 2014-12
Description
Influenza is a viral infection with the potential to infect millions worldwide. In the case of such a pandemic outbreak, direct patient interaction is handled by the medical community, composed of hospitals, medical professionals, and the policies that regulate them. The medical community is responsible not only for treating infected individuals, but also for preventing the spread of influenza to healthy individuals. Given this responsibility, the medical community has drafted preparedness plans laying down guidelines for action in the case of an influenza pandemic. This project reviewed these preparedness plans for hospitals in Arizona, as well as the literature produced by the Department of Health and Human Services to guide these plans. The review revealed that the medical community is woefully unprepared to handle the projected number of infected individuals, close to 90 million. Plans are disorganized, outdated, or altogether nonexistent. The conclusions of this thesis offer suggestions for pandemic policy improvement.
Contributors: Abboud, Alexis J (Author) / Maienschein, Jane (Thesis director) / Creath, Richard (Committee member) / O'Neil, Erica (Committee member) / Barrett, The Honors College (Contributor) / School of Life Sciences (Contributor)
Created: 2014-05
Description
Two tasks have been predominantly used over the past thirty years to measure false belief understanding: the Location task and the Typical Box task. These tasks have produced robust findings that children fail false belief tasks at age 3 and pass them at age 4. Recent theory, however, points out a shared confound in the 2-option Location and Typical Box tasks. This confound would allow children to perform successfully on the standard false belief tasks without understanding belief. Instead, children might be using perceptual access reasoning (PAR) and reason that ignorant agents will "get it wrong" and be incorrect about reality. Modified 3-option tasks were used to address this confound by introducing a third, irrelevant option to the 2-option tasks. According to PAR, children who pass 2-option tasks should perform worse on the 3-option tasks because there are two "wrong" answers. We argue that subtle differences in salience between the false belief and irrelevant options, in combination with one open-ended test question, can draw children who use PAR toward one or the other in unpredictable ways. To demonstrate that other procedures will give more salience to the irrelevant options, several studies are needed, each with minor variations in procedure that do not alter the basic false belief structure. Thus, in five studies we varied superficial characteristics across tasks in order to test for a task effect across studies. We used the "continuously cumulating meta-analysis" (CCMA) approach, combining each replication study into a broader analysis (final N=113) for higher power.
Our CCMA analyses provide strong support for the PAR hypothesis because 1) children performed worse on the 3-option tasks than the 2-option tasks, 2) children's proportion of false belief responses out of non-reality responses did not replicate across studies, 3) children's proportion of reality responses replicated across studies, and 4) the Location task was easier than the Box tasks across studies. These findings suggest that there is a lack of construct validity in traditional false belief tasks; thus, new methods of testing for false belief understanding are needed to determine at what point children acquire Theory of Mind.
Contributors: Pesch, Annelise Nicole (Author) / Fabricius, William (Thesis director) / Kobes, Bernard (Committee member) / Barrett, The Honors College (Contributor) / School of Historical, Philosophical and Religious Studies (Contributor) / Department of Psychology (Contributor)
Created: 2014-05
Description
Modern computers interact with the external environment in complex ways — for instance, they interact with human users via keyboards, mice, monitors, etc., and with other computers via networking. Existing models of computation — Turing machines, λ-calculus functions, etc. — cannot model these behaviors completely. Some additional conceptual apparatus is required in order to model processes of interactive computation.
Contributors: Thomas, Nicholas Woodlief (Author) / Armendt, Brad (Thesis director) / Kobes, Bernard (Committee member) / Blackson, Thomas (Committee member) / Barrett, The Honors College (Contributor) / School of Historical, Philosophical and Religious Studies (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Department of Psychology (Contributor)
Created: 2013-05
Description
The study of literature, which has traditionally been the work of the humanities, has seemingly opened up to biology in recent years through an infusion of cognitive science and evolutionary psychology. This essay examines two perspectives on the potential for reader/character identification, one perspective from cognitive/evolutionary studies, and the other from the humanities. Building on both perspectives, I propose my own notion of reader/character identification called immersive identification. I argue that fiction is especially suited to prompt readers to identify with fictional characters in an immersive way. Then, I demonstrate how different cognitive/evolutionary perspectives of fiction can accommodate my notion of immersive identification. Finally, I defend my account of immersive identification against a counterexample.
Contributors: Dhein, Kelle James (Author) / Eder, James (Thesis director) / Kobes, Bernard (Committee member) / Cassell, Paul (Committee member) / Barrett, The Honors College (Contributor) / School of Human Evolution and Social Change (Contributor) / Department of English (Contributor) / School of Historical, Philosophical and Religious Studies (Contributor)
Created: 2014-05