Matching Items (27)
Description
There is no doubt that inductive logic and inductive arguments are vital to the formation of scientific theories. This thesis questions the use of inductive inferences within the sciences. Specifically, it will examine various perspectives on David Hume's famed "problem of induction". Hume proposes that inductive inferences cannot be logically justified. Here we will explore several assessments of Hume's ideas and of inductive logic in general, examining the views of four philosophers and logicians: Karl Popper, Nelson Goodman, Larry Laudan, and Wesley Salmon. By comparing the radically different views of these philosophers, it is possible to gain insight into the complex nature of making inductive inferences. First, Popper agrees with Hume that inductive inferences can never be logically justified; he maintains that the only way around the problem of induction is to rid science of inductive logic altogether. Goodman, on the other hand, believes induction can be justified in much the same way as deduction is. Goodman sets up a logical schema in which the rules of induction justify particular inductive inferences, and these general rules are in turn justified by correct inferences. In this way, Goodman provides an explication of inductive logic. Laudan and Salmon go on to give more specific details about how the particular rules of induction should be constructed. Though both complete Goodman's logical schema, their approaches are quite different: Laudan takes a more qualitative approach, while Salmon uses the quantitative rules of probability to explicate induction. In the end, it seems quite possible to justify inductive inferences, though there may be more than one possible set of rules of induction.
Contributors: Feddern, James William Edward (Author) / Creath, Richard (Thesis director) / Armendt, Brad (Committee member) / Department of Physics (Contributor) / Department of Military Science (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05
Description
Moral status questions (who and what counts morally) are of central concern to moral philosophers. There is also a rich history of psychological work exploring the topic. The received view in the psychology of moral status accounts for it as a function of other-mind perception. On this view, entities are morally considerable because they are perceived to have the right sort of minds. This dissertation analyzes and tests this theory, pointing out both empirical and conceptual issues with the received view. The results presented show that important moral intuitions (for example, about unjustifiable interpersonal killing) cannot be explained by appealing to other-mind perception. Some alternative views of the psychology of moral status are presented, as well as avenues for further research.
Contributors: LaTourelle, Jonathan Jacob (Author) / Creath, Richard (Thesis advisor) / Van Gelderen, Elly (Thesis advisor) / Robert, Jason (Committee member) / Ellison, Karin (Committee member) / Becker, D. Vaughn (Committee member) / Arizona State University (Publisher)
Created: 2022
Description
This dissertation examines the efforts of the Carnegie Image Tube Committee (CITC), a group created by Vannevar Bush and composed of astronomers and physicists, which sought to develop a photoelectric imaging device, generally called an image tube, to aid astronomical observations. The Carnegie Institution of Washington’s Department of Terrestrial Magnetism coordinated the CITC, but the committee included members from observatories and laboratories across the United States. The CITC, which operated from 1954 to 1976, sought to replace direct photography as the primary means of astronomical imaging.

Physicists who gained training in electronics during World War II led the early push for the development of image tubes in astronomy. Vannevar Bush’s concern for scientific prestige led him to form a committee to investigate image tube technology, and postwar federal funding for the sciences helped the CITC sustain development efforts for a decade. During those development years, the CITC acted as a mediator between the astronomical community and the image tube producers but failed to engage astronomers concerning various development paths, resulting in a user group without real buy-in on the final product.

After a decade of development efforts, the CITC designed an image tube, which the Radio Corporation of America manufactured and which, with additional funding from the National Science Foundation, the committee distributed to observatories around the world. While astronomers were excited about the potential of electronic imaging, few used the Carnegie-developed device regularly. Although the CITC’s efforts did not result in an overwhelming adoption of image tubes by the astronomical community, examining the design, funding, production, and marketing of the Carnegie image tube shows the many and varied processes through which astronomers have acquired new tools. Astronomers’ use of the Carnegie image tube to acquire useful scientific data illustrates factors that contribute to astronomers’ adoption or non-adoption of new tools.
Contributors: Thompson, Samantha Michelle (Author) / Ellison, Karin (Thesis advisor) / Wetmore, Jameson (Thesis advisor) / Maienschein, Jane (Committee member) / Creath, Richard (Committee member) / DeVorkin, David (Committee member) / Arizona State University (Publisher)
Created: 2019
Description
At the interface of developmental biology and evolutionary biology, the very criteria of scientific knowledge are up for grabs. A central issue is the status of evolutionary genetics models, which some argue cannot coherently be used with complex gene regulatory network (GRN) models to explain the same evolutionary phenomena. Despite those claims, many researchers use evolutionary genetics models jointly with GRN models to study evolutionary phenomena.

How do those researchers deploy those two kinds of models so that they are consistent and compatible with each other? To address that question, this dissertation closely examines, dissects, and compares two recent research projects in which researchers jointly use the two kinds of models. To identify, select, reconstruct, describe, and compare those cases, I use methods from the empirical social sciences, such as digital corpus analysis, content analysis, and structured case analysis.

From those analyses, I infer three primary conclusions about projects of the kind studied. First, they employ an implicit gene concept that enables the joint use of both kinds of models. Second, they pursue epistemic aims beyond mechanistic explanation of phenomena. Third, they do not work to create and export broad synthesized theories. Rather, they focus on phenomena too complex to be understood by a common general theory, distinguish parts of the phenomena, and apply models from different theories to the different parts. For such projects, seemingly incompatible models are synthesized largely through mediated representations of complex phenomena.

The dissertation closes by proposing how developmental evolution, a field traditionally focused on macroevolution, might fruitfully expand its research agenda to include projects that study microevolution.
Contributors: Elliott, Steve (Author) / Creath, Richard (Thesis advisor) / Laubichler, Manfred D. (Thesis advisor) / Armendt, Brad (Committee member) / Forber, Patrick (Committee member) / Pratt, Stephen (Committee member) / Arizona State University (Publisher)
Created: 2017
Description
The relationship between science and religion in the modern day is complex to the point that the lines between them are often blurred. We need to distinguish the two from each other for a variety of practical reasons. Various philosophies, theories, and tests have been suggested concerning how the two interact and how they are divided. One set of criteria that has been shown to work was originally introduced in the opinion of Judge Overton in the case of McLean v. Arkansas. McLean v. Arkansas is a pivotal case in that it gave us a useful definition of what science is and is not in the context of the law. It used the already established Lemon test to show what counts as the establishment of religion. Given Judge Overton's distinction, there are questions as to whether there is even overlap or tension between science and religion, as in Stephen Jay Gould's theory of Nonoverlapping Magisteria (NOMA). What we find in this thesis is that the NOMA principle is doubtful at best. Through the discussion of McLean v. Arkansas, NOMA, and the commentaries of Professors Larry Laudan and Michael Ruse, this thesis develops a contextualization principle that can be used as a guide to develop further theories, particularly regarding the divisions between science and religion.

Contributors: Ammanamanchi, Amrit (Author) / Creath, Richard (Thesis director) / Minteer, Ben (Committee member) / Barrett, The Honors College (Contributor) / School of Politics and Global Studies (Contributor) / School of Life Sciences (Contributor)
Created: 2022-05
Description
Computational tools in the digital humanities often work either on the macro-scale, enabling researchers to analyze huge amounts of data, or on the micro-scale, supporting scholars in the interpretation and analysis of individual documents. The research system developed in the context of this dissertation (the "Quadriga System") works to bridge these two extremes by offering tools to support close reading and interpretation of texts, while at the same time providing a means for collaboration and data collection that could lead to analyses based on big datasets. In the field of history of science, researchers usually work with unstructured data such as texts or images. Before such data can be analyzed computationally, it first has to be transformed into a machine-understandable format. The Quadriga System is based on the idea of representing texts as graphs of contextualized triples (or quadruples). Those graphs (or networks) can then be mathematically analyzed and visualized. This dissertation describes two projects that use the Quadriga System for the analysis and exploration of texts and the creation of social networks. Furthermore, a model for digital humanities education is proposed that brings together students from the humanities and computer science in order to develop user-oriented, innovative tools, methods, and infrastructures.
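The abstract above describes encoding texts as graphs of contextualized triples or quadruples that can then be analyzed as networks. The sketch below is a minimal, hypothetical illustration of that idea; it is not the Quadriga System's actual code, and the Quadruple fields, example statements, and the simple out-degree measure are assumptions made only for illustration.

```python
# A minimal sketch (not the Quadriga implementation) of treating close-reading
# annotations as contextualized quadruples and analyzing them as a graph.
from collections import defaultdict
from dataclasses import dataclass

@dataclass(frozen=True)
class Quadruple:
    subject: str    # node the statement is about
    predicate: str  # labeled relation (edge type)
    obj: str        # node the statement points to
    context: str    # provenance: document, passage, annotator, etc.

# Hypothetical annotations extracted from two documents.
quads = [
    Quadruple("Darwin", "corresponded_with", "Gray", "letter_1857"),
    Quadruple("Darwin", "cited_by", "Gray", "review_1860"),
    Quadruple("Gray", "corresponded_with", "Hooker", "letter_1858"),
]

# Build a directed multigraph as an adjacency list keyed by subject node;
# each edge keeps its predicate and the context in which it was asserted.
graph = defaultdict(list)
for q in quads:
    graph[q.subject].append((q.obj, q.predicate, q.context))

# A trivial network measure: out-degree per node, i.e. how many statements
# each actor is the subject of across all contexts.
out_degree = {node: len(edges) for node, edges in graph.items()}
print(out_degree)  # {'Darwin': 2, 'Gray': 1}
```

Even this toy graph shows how individual annotations made during close reading become network data that could, in principle, be aggregated across many documents and contributors for larger-scale analysis.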
Contributors: Damerow, Julia (Author) / Laubichler, Manfred (Thesis advisor) / Maienschein, Jane (Thesis advisor) / Creath, Richard (Committee member) / Ellison, Karin (Committee member) / Hooper, Wallace (Committee member) / Renn, Jürgen (Committee member) / Arizona State University (Publisher)
Created: 2014
Description
This dissertation is an historical analysis of the science of human origins, paleoanthropology, examining the intersection of science and culture around fossil human ancestors (hominins) over the last century and a half. Focusing on fossils as scientific objects, this work examines three controversial fossils from the science’s history, asking: how do fossils formulate, challenge, and reconfigure notions of what it means to be human? The introduction reviews the historiography of paleoanthropology and the gaps that exist in the literature. Chapter two examines the first case study, the type specimen of Homo neanderthalensis, known as the Feldhofer Neanderthal, providing a biography of the object from its discovery in Germany in 1856 until its species designation in 1864. Chapter three briefly links the Neanderthal’s story in time and space to the next fossil’s story. Chapter four picks up the story of paleoanthropology in 1924 in South Africa, with the discovery and initial analysis of a specimen nicknamed the Taungs Baby, which was labeled a new hominin species, Australopithecus africanus. Chapter five is another brief chapter connecting the Taungs Baby story in time and space to the final specimen examined in this work at the end of the century. Chapter six examines the final case study, a specimen discovered in 2003 in Indonesia, designated a new species named Homo floresiensis and nicknamed the Hobbit. Through comparing, contrasting, and connecting the stories of these three specimens, three major conclusions emerge about the field. First, the fossils themselves play an important role in knowledge production about the hominin past. Second, scientific practice shaped both interpretations of fossils and larger questions of what it means to be human. Third, scientific practice is itself shaped by local culture, which continually interacts with attempts to establish a global perspective about the human past. The perspective gleaned through the eyes of these three fossils therefore reveals the way shifting, rather than eternally true, claims are embedded in culture and intertwined with the perspectives of the humans conducting the science.
Contributors: Madison, Paige (Author) / Maienschein, Jane (Thesis advisor) / Kimbel, William (Committee member) / Creath, Richard (Committee member) / Hurlbut, James (Committee member) / Laubichler, Manfred (Committee member) / Arizona State University (Publisher)
Created: 2020