Matching Items (161)
Description
This study involves determining whether the political symbols associated with ideological labels vary between the old and new terms. Specifically, the terms conservatism, liberalism, moderate, progressivism, and populism were used, where conservatism and liberalism are the old terms and progressivism and populism are the new terms. A survey was given to a representative sample of the United States, provided by SurveyMonkey, consisting of 205 respondents. Questions regarding favoritism toward and support for groups and political issues were asked to determine a trend of what each political ideology favors. Voting behavior was also evaluated to identify whether there was a connection between self-identification with a political ideology or party and the frequency and type of elections that individuals voted in. The hypothesis was that by adding progressivism to the liberalism category, the percentage of people who identify with these groups would be roughly equal to the percentage of people who identify as conservative, since the percentage of people who identify as conservative has historically been much greater than the percentage who identify as liberal. The results showed that the percentage of people who identified as liberal and progressive together was greater than the percentage of those who identified as conservative: the percentages of people who identified as conservative, moderate, liberal, and progressive were 25.9%, 31.7%, 27.3%, and 14.6%, respectively. Ultimately, after evaluating issue and symbolic preferences, progressivism is not just a term used in place of liberalism but a whole new ideology that is distinct from other popular political ideologies. Considering voting behavior, there is no conclusive evidence that people who identify with one ideology vote more frequently, or in different elections, than people who identify with other ideologies.
ContributorsSypkens, Sasha T. (Author) / Ramirez, Mark (Thesis director) / Bustikova-Siroky, Lenka (Committee member) / Department of Physics (Contributor) / Barrett, The Honors College (Contributor)
Created2016-12
Description
A working knowledge of mathematics is a vital requirement for introductory university physics courses. However, there is mounting evidence that many incoming introductory physics students do not have the necessary mathematical ability to succeed in physics. The investigation reported in this thesis used preinstruction diagnostics and interviews to examine this problem in depth. It was found that in some cases, over 75% of students could not solve the most basic mathematics problems. We asked questions involving right triangles, vector addition, vector direction, systems of equations, and arithmetic, to give a few examples. The correct response rates were typically between 25% and 75%, which is worrying because these problems are far simpler than the typical problem encountered in an introductory quantitative physics course. This thesis uncovered a few common problem-solving strategies that were not particularly effective. When solving trigonometry problems, 13% of students wrote down the mnemonic "SOH CAH TOA," but a chi-squared test revealed that this was not a statistically significant factor in getting the correct answer, and it was actually detrimental in certain situations. Also, about 50% of students used a tip-to-tail method to add vectors, but there is evidence to suggest that this method is not as effective as using components. There are also a number of problem-solving strategies that successful students use to solve mathematics problems. Using the components of a vector increases student success when adding vectors and examining their direction. Preliminary evidence also suggests that repetitive trigonometry practice may be the best way to improve student performance on trigonometry problems. In addition, teaching students to use a wide variety of algebraic techniques, like the distributive property, may keep them from getting stuck when working through problems. Finally, evidence suggests that checking work could eliminate up to a third of student errors.
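The chi-squared test of independence mentioned above can be illustrated with a small sketch. The counts below are invented for demonstration (they are not the study's data); the test asks whether writing the mnemonic is associated with answering correctly.

```python
import numpy as np

# Illustrative chi-squared test of independence, mirroring the thesis's check
# of whether writing "SOH CAH TOA" is associated with answering correctly.
# The counts below are invented for demonstration, not the study's data.
#                 correct  incorrect
table = np.array([[10,      16],      # wrote the mnemonic
                  [80,      94]])     # did not write it

row_tot = table.sum(axis=1, keepdims=True)
col_tot = table.sum(axis=0, keepdims=True)
expected = row_tot * col_tot / table.sum()      # counts expected if independent
chi2 = ((table - expected) ** 2 / expected).sum()

# Critical value for df = (2-1)*(2-1) = 1 at the 5% significance level.
CRITICAL_5PCT_DF1 = 3.841
print(f"chi2 = {chi2:.3f}; significant at 5%: {chi2 > CRITICAL_5PCT_DF1}")
```

With these made-up counts the statistic falls well below the 5% critical value, the same qualitative outcome the thesis reports for the mnemonic.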
ContributorsJones, Matthew Isaiah (Author) / Meltzer, David (Thesis director) / Peng, Xihong (Committee member) / Department of Physics (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created2016-12
Description
The FoF1 ATP synthase is a molecular motor critical to the metabolism of virtually all life forms, and it acts in the manner of a hydroelectric generator. The F1 complex contains an (αβ)3 hexamer ring in which catalysis occurs, as well as a rotor composed of subunit-ε together with the coiled-coil and globular foot domains of subunit-γ. The F1 complex can hydrolyze ATP in vitro in a manner that drives counterclockwise (CCW) rotation, in 120° power strokes, as viewed from the positive side of the membrane. The power strokes, which occur in ≈ 300 μsec, are separated by catalytic dwells that occur on a msec time scale. A single-molecule rotation assay that uses the intensity of polarized light scattered from a 75 × 35 nm gold nanorod determined the average rotational velocity of the power stroke (ω, in degrees/ms) as a function of the rotational position of the rotor (θ, in degrees, measured relative to the catalytic dwell). The velocity is not constant but rather accelerates and decelerates in two phases. Phase-1 (0° - 60°) is believed to derive power from elastic energy in the protein. At concentrations of ATP that limit the rate of ATP hydrolysis, the rotor can stop for an ATP-binding dwell during Phase-1. Although the most probable position for the ATP-binding dwell is 40° after the catalytic dwell, the dwell can occur at any rotational position during Phase-1 of the power stroke. Phase-2 of the power stroke (60° - 120°) is believed to be powered by the ATP-binding-induced closure of the lever domain of a β-subunit as it acts as a camshaft against the γ-subunit. Algorithms were written to sort and analyze F1-ATPase power strokes and to determine the average rotational velocity profile of power strokes as a function of the rotational position at which the ATP-binding dwell occurs (θATP-bd), and when the ATP-binding dwell is absent. Sorting individual ω(θ) curves as a function of θATP-bd revealed that a dependence of ω on θATP-bd exists. The ATP-binding dwell can occur even at saturating ATP concentrations. We report that ω follows a distinct pattern in the vicinity of the ATP-binding dwell, and that the ω(θ) curve contains the same oscillations within it regardless of θATP-bd. We observed that the acceleration before, and deceleration after, the ATP-binding dwell persisted for increasing intervals as the dwell occurred later in Phase-1, up to a maximum of ≈ 40°. The results were interpreted in terms of a model in which the ATP-binding dwell results from internal drag at a variable position on the γε rotor.
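A minimal sketch of such a sorting step (with synthetic traces and an assumed stall threshold, not the thesis's actual analysis code) might look like:

```python
import numpy as np

# Sketch of the sorting algorithms described above: locate the ATP-binding
# dwell in a single omega(theta) power-stroke trace (a slow point inside
# Phase-1, 0-60 degrees after the catalytic dwell) and bin traces by that
# position.  The traces and the stall threshold are synthetic illustrations.

def dwell_position(theta, omega, phase1=(0.0, 60.0), stall=50.0):
    """Return the position (deg) of the slowest point in Phase-1 where the
    velocity drops below `stall` deg/ms, or None if no dwell occurs."""
    mask = (theta >= phase1[0]) & (theta <= phase1[1]) & (omega < stall)
    if not mask.any():
        return None
    return float(theta[mask][np.argmin(omega[mask])])

def sort_by_dwell(traces, bin_width=10.0):
    """Group (theta, omega) traces by binned dwell position; traces with no
    ATP-binding dwell are collected under the key None."""
    groups = {}
    for theta, omega in traces:
        pos = dwell_position(theta, omega)
        key = None if pos is None else bin_width * int(pos // bin_width)
        groups.setdefault(key, []).append((theta, omega))
    return groups

# Two synthetic 120-degree power strokes: one pausing near 40 degrees
# (a Gaussian dip in velocity) and one with no pause.
theta = np.linspace(0.0, 120.0, 121)
omega_dwell = 400.0 - 390.0 * np.exp(-((theta - 40.0) / 5.0) ** 2)
omega_smooth = np.full_like(theta, 400.0)
groups = sort_by_dwell([(theta, omega_dwell), (theta, omega_smooth)])
print(sorted(k for k in groups if k is not None))   # -> [40.0]
```

Averaging the ω(θ) curves within each bin then yields the velocity profile as a function of θATP-bd.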
ContributorsBukhari, Zain Aziz (Author) / Frasch, Wayne D. (Thesis director) / Allen, James P. (Committee member) / Redding, Kevin (Committee member) / School of Molecular Sciences (Contributor) / Department of Physics (Contributor) / Barrett, The Honors College (Contributor)
Created2016-12
Description
Because of its massive nature and simple two-body structure, the heavy meson bottomonium (the flavorless bound state of the bottom quark and anti-quark) is among the simplest systems available for the study of the strong force and quantum chromodynamics (QCD)—a feature which has made it of special interest to particle physicists.

Despite being bound by the strong force, bottomonium exhibits a rich spectrum of resonances corresponding to excited states closely analogous to those of positronium or even of familiar atomic systems. Transitions between these levels are possible via the absorption or emission of a photon, or of a gluon or gluons manifesting as light hadrons. The goal of this thesis was to establish a theoretical value for the currently unmeasured partial decay width of one such transition, the electromagnetic decay channel hb -> etab gamma. To this end, two methods were utilized.

The first approach relied on the presumption of a nonrelativistic constituent quark model interacting via a simple static potential, allowing radial wave functions and energy eigenvalues to be obtained for the states of interest via the Schrödinger equation. Applying the standard electromagnetic multipole expansion and then the electric dipole (E1) decay width formula yielded a value of 57.7 ± 0.4 keV.
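The numerical core of this approach can be sketched as follows. The reduced mass, Coulomb-like coupling, and confinement slope below are illustrative toy values, not the thesis's fitted parameters; the sketch diagonalizes the radial Schrödinger equation for a Cornell-type potential on a grid and forms the dipole matrix element that enters the E1 width formula.

```python
import numpy as np

# Toy finite-difference solver (natural units, hbar = c = 1, energies in GeV)
# for the radial Schrodinger equation with a Cornell-type static potential
# V(r) = -ALPHA/r + SIGMA*r.  All parameter values are assumptions.
MU = 2.4       # reduced mass of the b b-bar pair (assumed)
ALPHA = 0.4    # Coulomb-like coupling (assumed)
SIGMA = 0.2    # linear confinement slope in GeV^2 (assumed)

def radial_states(l, n_grid=800, r_max=10.0):
    """Diagonalize the radial Hamiltonian for u(r) = r*R(r) at angular
    momentum l; returns the grid, sorted eigenvalues, and eigenvectors."""
    r = np.linspace(r_max / n_grid, r_max, n_grid)
    h = r[1] - r[0]
    v = -ALPHA / r + SIGMA * r + l * (l + 1) / (2.0 * MU * r**2)
    diag = 1.0 / (MU * h**2) + v                       # kinetic + potential
    off = np.full(n_grid - 1, -1.0 / (2.0 * MU * h**2))
    ham = np.diag(diag) + np.diag(off, 1) + np.diag(off, -1)
    energies, vectors = np.linalg.eigh(ham)
    return r, energies, vectors

r, e_s, u_s = radial_states(l=0)   # S-wave states (1S ground state first)
_, e_p, u_p = radial_states(l=1)   # P-wave states (1P ground state first)

# E1 transitions are driven by the dipole matrix element <1S| r |1P>.
h = r[1] - r[0]
u1s = u_s[:, 0] / np.sqrt(np.sum(u_s[:, 0] ** 2) * h)   # normalize u(r)
u1p = u_p[:, 0] / np.sqrt(np.sum(u_p[:, 0] ** 2) * h)
dipole = abs(np.sum(u1p * r * u1s) * h)

print(f"E(1S) = {e_s[0]:.3f} GeV, E(1P) = {e_p[0]:.3f} GeV, "
      f"|<1S|r|1P>| = {dipole:.3f} GeV^-1")
```

The E1 width then follows from this matrix element and the photon energy via the standard dipole formula.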

The second approach stemmed from the effective Lagrangian describing the bottomonium P to S electromagnetic transitions and relied on the presumption that a single coupling constant could be approximated as describing all nP to mS transitions regardless of spin. A value for this coupling constant could then be extracted from the 1P to 1S spin triplet data and used to predict the width for the singlet 1P to 1S transition. The partial decay width value found in this manner was 47.8 ± 2.0 keV.

Various other methods and models have established a predicted range of 35 to 60 keV for this partial decay width. As the values determined in this thesis fall within the expected range, they agree well with our current understanding of this electromagnetic transition and place further confidence on the expected range.
ContributorsIreland, Aurora Nicole (Author) / McCartney, Martha (Thesis director) / Foy, Joseph (Committee member) / Maximon, Leonard (Committee member) / Department of Physics (Contributor) / Barrett, The Honors College (Contributor)
Created2016-12
Description
Computer simulations are gaining recognition as educational tools, but in general there is still a line dividing a simulation from a game. Yet as many recent and successful video games heavily involve simulations (SimCity comes to mind), there is not only the growing question of whether games can be used for educational purposes, but also of how a game might qualify as educational. Endemic: The Agent is a project that tries to bridge the gap between educational simulations and educational games. This paper outlines the creation of the project and the characteristics that make it an educational tool, a simulation, and a game.
ContributorsFish, Derek Austin (Author) / Karr, Timothy (Thesis director) / Marcus, Andrew (Committee member) / Jones, Donald (Committee member) / Barrett, The Honors College (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Department of Physics (Contributor)
Created2013-05
Description
There is no doubt that inductive logic and inductive arguments are vital to the formation of scientific theories. This thesis questions the use of inductive inferences within the sciences. Specifically, it will examine various perspectives on David Hume's famed "problem of induction". Hume proposes that inductive inferences cannot be logically justified. Here we will explore several assessments of Hume's ideas and inductive logic in general. We will examine the views of the philosophers and logicians Karl Popper, Nelson Goodman, Larry Laudan, and Wesley Salmon. By comparing the radically different views of these philosophers, it is possible to gain insight into the complex nature of making inductive inferences. First, Popper agrees with Hume that inductive inferences can never be logically justified. He maintains that the only way around the problem of induction is to rid science of inductive logic altogether. Goodman, on the other hand, believes induction can be justified in much the same way as deduction is justified. Goodman sets up a logical schema in which the rules of induction justify the particular inductive inferences. These general rules are in turn justified by the inferences we accept as correct. In this way, Goodman sets up an explication of inductive logic. Laudan and Salmon go on to provide more specific details about how the particular rules of induction should be constructed. Though both Laudan and Salmon are completing the logical schema of Goodman, their approaches are quite different. Laudan takes a more qualitative approach, while Salmon uses the quantitative rules of probability to explicate induction. In the end, it seems quite possible to justify inductive inferences, though there may be more than one possible set of rules of induction.
ContributorsFeddern, James William Edward (Author) / Creath, Richard (Thesis director) / Armendt, Brad (Committee member) / Department of Physics (Contributor) / Department of Military Science (Contributor) / Barrett, The Honors College (Contributor)
Created2016-05
Description
Integration of dielectrics with graphene is essential to the fulfillment of graphene based electronic applications. While many dielectric deposition techniques exist, plasma enhanced atomic layer deposition (PEALD) is emerging as a technique to deposit ultrathin dielectric films with superior densities and interfaces. However, the degree to which PEALD on graphene can be achieved without plasma-induced graphene deterioration is not well understood. In this work, we investigate a range of plasma conditions across a single sample, characterizing both oxide growth and graphene deterioration using spectroscopic analysis and atomic force microscopy. Investigation of graphene and film quality produced by these conditions yields insight into plasma effects. Using a specially designed sample configuration, we achieve ultrathin (< 1 nm) aluminum oxide films atop graphene.
ContributorsTrimble, Christie Jordan (Author) / Nemanich, Robert (Thesis director) / Zaniewski, Anna (Committee member) / Department of Physics (Contributor) / Barrett, The Honors College (Contributor)
Created2016-05
Description
In this paper, I first explain the legal theory leading up to Obergefell v. Hodges, and then analyze Obergefell v. Hodges itself. My analysis leads me to conclude that the legal reasoning, or the argument used to come to the decision, is flawed, for it relies too heavily upon public opinion and amounts to a legislative action by the Supreme Court. Therefore, I offer three alternatives, each of which improves upon the legal reasoning in different ways. Furthermore, my analysis of these three arguments, and particularly the Free Exercise Argument, leads me to postulate that there is in fact a Freedom to Practice embedded in the penumbral, or unstated, rights of the United States Constitution. While the full extent of the implications of such a right must be explored in another paper, I establish the legal reasoning for the freedom by four routes, showing that although precedent has yet to materialize, there are several arguments for the freedom.
ContributorsMartin, Daniel Brockie (Author) / Kramer, Zachary (Thesis director) / Graff, Sarah (Committee member) / Department of Physics (Contributor) / Barrett, The Honors College (Contributor)
Created2015-12
Description
The purpose of this creative project was to establish the foundation of an educational program, named stillHUMAN, that teaches financial literacy to the local homeless population. The project consisted of two parts, a needs analysis and a prototyping phase. The needs analysis was conducted at the Phoenix Rescue Mission Center, a faith-based homeless shelter that caters to male "clients", through written surveys and one-on-one interviews. Before interviewing the clients, the team acquired IRB approval as well as consent from the Center to carry out this study. These needs were then organized into a House of Quality. We concluded from Part 1 that we would need to create 3- to 7-minute video modules that would be available on an online platform and cover topics including professional development, budgeting, credit, and Internet literacy. To commence Part 2, each team member recorded a video module. These three videos collectively conveyed instruction on how to write a resume, how to use the Internet and fill out an application online, and how to budget money. The videos were uploaded to YouTube and shown to clients at Phoenix Rescue Mission, who were each asked to fill out a feedback survey afterwards. The team plans to use these responses to improve the quality of future video modules and ultimately create a holistic lesson plan that covers all financial literacy topics the clients desire. A website was also made to store future videos. The team plans to continue with this project post-graduation. Future tasks include creating and testing a complete lesson plan, establishing a student organization at Arizona State University and recruiting volunteers from different disciplines, and creating an on-site tutoring program so clients may receive individualized attention.
Once the lesson plan is demonstrated to be effective at Phoenix Rescue Mission, we plan to administer it at other local homeless shelters and assess its efficacy in a non-faith-based and non-male environment. After a successful financial literacy program has been created, we aim to create lesson plans for other topics, including health literacy, human rights, and basic education. Ultimately, stillHUMAN will become a sustainable program that unites the efforts of students and professionals to improve the quality of life of the homeless population.
ContributorsKim, Michael (Co-author) / Gulati, Guneet (Co-author) / Vanood, Aimen (Co-author) / Ganesh, Tirupalavanam (Thesis director) / Shrake, Scott (Committee member) / School of Mathematical and Statistical Sciences (Contributor) / School of Life Sciences (Contributor) / Department of Physics (Contributor) / Department of Psychology (Contributor) / Harrington Bioengineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created2016-05
Description
Agent-based models allow complex results to emerge from simple parameters. The mobile agents in my model, the firms, are allocated an amount of capital, while the static agents, the workers, are allocated a range of wages. The firms are then allowed to move around and compete until they match with a worker that maximizes their production. It was found from the simulation that as competition increases, so do wages. It was also found that when firms stay in the environment longer, a higher wage is possible as a result of a larger window for drawn-out competition. The different parameters result in a range of equilibria that take variable amounts of time to reach. These results are interesting because they demonstrate that the mean wage is strongly dependent upon the window of time within which firms are able to compete. This type of model was useful because it demonstrated that there is variation in the time dependence of the equilibrium. It also demonstrated that when there is very little entry into and exit from the market, wages level out at the same equilibrium regardless of the ratio between the number of firms and the number of workers. Further work to be done on this model includes the addition of a matching function so that firms and workers reach a fairer agreement. I will also be adding parameters that allow firms to see the workers around them, so that firms are able to interact with multiple workers at the same time. Both of these alterations should improve the overall accuracy of the model.
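A minimal sketch of this kind of firm-worker matching model might look like the following. The production function, the move order, and all parameter ranges are assumptions for illustration; they are not the thesis's exact model.

```python
import random

# Toy agent-based matching: firms endowed with capital inspect workers with
# fixed wage demands and claim the affordable worker that maximizes a simple
# production function.  All rules and parameter ranges here are assumptions.
random.seed(0)

def production(capital, wage):
    """Toy output net of the wage paid; positive only if the match pays off."""
    return (capital * wage) ** 0.5 - wage

def simulate(n_firms=50, n_workers=50):
    firms = [random.uniform(1.0, 10.0) for _ in range(n_firms)]
    wages = [random.uniform(0.5, 5.0) for _ in range(n_workers)]
    matched = {}                                  # worker index -> firm capital
    for capital in sorted(firms, reverse=True):   # richer firms move first
        candidates = [i for i in range(n_workers)
                      if i not in matched and production(capital, wages[i]) > 0]
        if candidates:
            best = max(candidates, key=lambda i: production(capital, wages[i]))
            matched[best] = capital
    mean_wage = sum(wages[i] for i in matched) / len(matched)
    return wages, matched, mean_wage

wages, matched, mean_wage = simulate()
print(f"{len(matched)} matches; mean matched wage = {mean_wage:.2f}")
```

Varying the number of firms per worker, or letting unmatched firms re-enter for additional rounds, would reproduce the competition-window experiments described above.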
ContributorsElledge, Jacob Morris (Author) / Veramendi, Gregory (Thesis director) / Murphy, Alvin (Committee member) / Department of Economics (Contributor) / Department of Physics (Contributor) / Barrett, The Honors College (Contributor)
Created2015-12