Description
This research compares shifts in the resonant frequency of a SuperSpec titanium nitride (TiN) kinetic inductance detector (KID) with accepted models for other KIDs. SuperSpec, under development at the University of Colorado Boulder, is an on-chip spectrometer with a multiplexed readout of multiple KIDs, designed for broadband transmission of these measurements. It is useful for detecting radiation at mm and sub-mm wavelengths, which is significant because absorption and re-emission of photons by dust causes radiation from distant objects to reach us in the infrared and far-infrared bands. In preparation for testing, our team installed stages previously designed by Paul Abers and his group into our cryostat, and designed and installed the other parts necessary for the cryostat to test devices on the 250 mK stage. This work included the design and construction of additional parts, a new wiring setup for the cryostat, the assembly, testing, and installation of several stainless steel coaxial cables for measurements through the devices, and other cryogenic and low-pressure considerations. The SuperSpec KID was successfully tested on this 250 mK stage, confirming that the new setup is functional. Our results agree with existing models, which predict that the breaking of Cooper pairs in the detector's superconductor in response to temperature, optical load, and readout power decreases the resonant frequencies. A negative linear relationship appears in our results, as expected, since the parameters are varied only slightly, so a linear approximation is appropriate. We compared the rate at which the resonant frequency responded to temperature and found it to be close to the expected value.
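The final comparison described above amounts to fitting a line to frequency-versus-temperature data and checking the slope. A minimal sketch of that step, with invented numbers (these are not the thesis measurements):

```python
import numpy as np

# Hypothetical example: a small temperature sweep (K) against measured
# resonant frequencies (GHz) for a single KID. The values are made up
# purely to illustrate the fitting step.
temps = np.array([0.250, 0.260, 0.270, 0.280, 0.290])
freqs = np.array([2.000000, 1.999996, 1.999992, 1.999988, 1.999984])

# Because the parameters vary only slightly, a linear approximation is
# appropriate: fit f(T) ~ f0 + (df/dT) * T and read off the slope.
slope, f0 = np.polyfit(temps, freqs, 1)
print(f"df/dT = {slope:.3e} GHz/K")  # negative, as the models predict
```

The fitted slope is the responsivity that would then be compared against the expected value from the model.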
Contributors: Diaz, Heriberto Chacon (Author) / Mauskopf, Philip (Thesis director) / McCartney, Martha (Committee member) / Department of Physics (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created: 2018-05
Description
This study estimates the capitalization effect of golf courses in Maricopa County using the hedonic pricing method. It draws upon a dataset of 574,989 residential transactions from 2000 to 2006 to examine how the aesthetic, non-golf benefits of golf courses capitalize across a gradient of proximity measures. The measures for amenity value extend beyond home adjacency and include considerations for homes within a range of discrete walkability buffers around golf courses. The models also distinguish between public and private golf courses as a proxy for the level of golf course access perceived by non-golfers. Unobserved spatial characteristics of the neighborhoods around golf courses are controlled for by increasing the extent of spatial fixed effects from city, to census tract, and finally to 2000-meter golf course ‘neighborhoods.’ The estimation results support two primary conclusions. First, golf course proximity is found to be highly valued for adjacent homes and homes up to 50 meters away from a course, still evident but minimal between 50 and 150 meters, and insignificant at all other distance ranges. Second, private golf courses do not command a higher proximity premium than public courses, with the exception of homes within 25 to 50 meters of a course, indicating that the non-golf benefits of courses capitalize similarly regardless of course type. The results of this study motivate further investigation into golf course features that signal access or add value to homes in the range of capitalization, particularly for near-adjacent homes between 50 and 150 meters previously thought not to capitalize.
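The core estimation step can be sketched as an ordinary least squares regression of log price on distance-band dummies. Everything below (coefficients, sample size, band probabilities) is synthetic and only illustrates the hedonic method, not the study's actual dataset or its fixed-effects specifications:

```python
import numpy as np

# Synthetic illustration of a hedonic regression: log sale price on
# house size plus dummies for discrete golf-course distance bands.
# All coefficients and sample sizes are invented for this sketch.
rng = np.random.default_rng(0)
n = 1000
sqft = rng.uniform(1000, 3000, n)
adjacent = rng.random(n) < 0.05                        # touches the course
near = (~adjacent) & (rng.random(n) < 0.10)            # 0-50 m band
mid = (~adjacent) & (~near) & (rng.random(n) < 0.10)   # 50-150 m band
log_price = (11.0 + 0.0003 * sqft + 0.08 * adjacent
             + 0.04 * near + 0.01 * mid + rng.normal(0, 0.05, n))

# OLS via least squares; columns: intercept, sqft, three band dummies.
X = np.column_stack([np.ones(n), sqft, adjacent, near, mid])
beta, *_ = np.linalg.lstsq(X, log_price, rcond=None)
print("estimated band premia (log points):", beta[2:])
```

The estimated dummy coefficients recover the planted premia, which decay with distance just as the study finds in the real data.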
Contributors: Joiner, Emily (Author) / Abbott, Joshua (Thesis director) / Smith, Kerry (Committee member) / Economics Program in CLAS (Contributor) / School of Sustainability (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created: 2018-05
Description
The current trend of interconnected devices, or the internet of things (IOT), has led to the popularization of single board computers (SBC), primarily due to their form factor and low price. This has led to unique networks of devices that can have unstable network connections and minimal processing power. Many parallel programming libraries are intended for use in high performance computing (HPC) clusters, and unlike the IOT environment described, HPC clusters generally have very consistent network speeds and topologies. A significant number of software choices make up what is referred to as the HPC stack, or parallel processing stack. My thesis focused on building an HPC stack that would run on the SBC known as the Raspberry Pi. The intention in making this Raspberry Pi cluster is to research the performance of MPI implementations in an IOT environment, which had an impact on the design choices of the cluster. This thesis is a compilation of my research efforts in creating this cluster as well as an evaluation of the software that was chosen to create the parallel processing stack.
Contributors: O'Meara, Braedon Richard (Author) / Meuth, Ryan (Thesis director) / Dasgupta, Partha (Committee member) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2018-05
Description
The objective of this paper is to provide an educational diagnostic into the technology of blockchain and its application to the supply chain. Education on the topic is important to prevent misinformation about the capabilities of blockchain. Blockchain as a new technology can be confusing to grasp given the wide possibilities it can provide, which can convolute the topic when it is defined too broadly. Instead, the focus will be maintained on explaining the technical details of how and why this technology works in improving the supply chain. The scope of explanation will not be limited to the solutions, but will also detail current problems. Both public and private blockchain networks will be explained, along with the solutions they provide in supply chains. In addition, other non-blockchain systems will be described that provide important pieces of supply chain operations that blockchain cannot. Blockchain, when applied to the supply chain, provides improved consumer transparency, management of resources, logistics, trade finance, and liquidity.
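The tamper-evidence property that makes blockchain attractive for supply chains can be shown in a few lines: each block commits to its predecessor's hash, so editing any earlier record invalidates every later block. A minimal sketch (field names are illustrative, not from any specific supply-chain platform):

```python
import hashlib
import json

def block_hash(block):
    # Canonical serialization so the same block always hashes the same.
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain, record):
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "record": record})

def verify(chain):
    # Recompute each link; any edited record breaks the chain.
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain = []
append_block(chain, {"sku": "A100", "event": "shipped"})
append_block(chain, {"sku": "A100", "event": "received"})
print(verify(chain))  # True
chain[0]["record"]["event"] = "lost"  # tamper with history
print(verify(chain))  # False
```

Real deployments add consensus among untrusting parties on top of this structure; the hash chain alone only makes tampering detectable, not impossible.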
Contributors: Krukar, Joel Michael (Author) / Oke, Adegoke (Thesis director) / Duarte, Brett (Committee member) / Hahn, Richard (Committee member) / School of Mathematical and Statistical Sciences (Contributor) / Department of Economics (Contributor) / Barrett, The Honors College (Contributor)
Created: 2018-05
Description
The Super Catalan numbers are a known set of numbers which have so far eluded a combinatorial interpretation. Several weighted interpretations have appeared since their discovery, one of which was discovered by William Kuszmaul in 2017. In this paper, we connect the weighted Super Catalan structure created previously by Kuszmaul with a natural $q$-analogue of the Super Catalan numbers. We do this by creating a statistic $\sigma$ for which the $q$-Super Catalan numbers satisfy $S_q(m,n)=\sum_X (-1)^{\mu(X)} q^{\sigma(X)}$. In doing so, we take a step towards finding a strict combinatorial interpretation for the Super Catalan numbers.
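For context, the Super Catalan numbers have a simple closed form, $S(m,n) = \frac{(2m)!\,(2n)!}{m!\,n!\,(m+n)!}$, which is always an integer even though no strict combinatorial interpretation is known. A quick sketch of the standard formula (not the weighted structure studied in the thesis):

```python
from math import factorial

def super_catalan(m, n):
    # S(m, n) = (2m)! (2n)! / (m! n! (m+n)!), an integer for all m, n >= 0.
    num = factorial(2 * m) * factorial(2 * n)
    den = factorial(m) * factorial(n) * factorial(m + n)
    assert num % den == 0  # integrality, despite the fractional form
    return num // den

# S(1, n) is twice the n-th Catalan number:
print([super_catalan(1, n) for n in range(5)])  # [2, 2, 4, 10, 28]
```

The open problem is to find a set of objects counted (without signs or weights) by these values, which is what the statistic $\sigma$ works toward.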
Contributors: House, John Douglas (Author) / Fishel, Susanna (Thesis director) / Childress, Nancy (Committee member) / School of Mathematical and Statistical Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created: 2018-05
Description
This thesis discusses three recent optimization problems that seek to reduce disease spread on arbitrary graphs by deleting edges, and it discusses three approximation algorithms developed for these problems. Important definitions are presented, including the Linear Threshold and Triggering Set models and the set function properties of submodularity and monotonicity. Also, important results regarding the Linear Threshold model and computation of the influence function are presented along with proof sketches. The three main problems are formally presented, and NP-hardness results along with proof sketches are presented where applicable. The first problem seeks to reduce spread of infection over the Linear Threshold process by making use of an efficient tree data structure. The second problem seeks to reduce the spread of infection over the Linear Threshold process while preserving the PageRank distribution of the input graph. The third problem seeks to minimize the spectral radius of the input graph. The algorithms designed for these problems are described in writing and with pseudocode, and their approximation bounds are stated along with time complexities. The discussion considers how these algorithms could see real-world use; challenges are noted, along with the ways in which the algorithms do or do not overcome them. Two related works, one which presents an edge-deletion disease spread reduction problem over a deterministic threshold process and the other which considers a graph modification problem aimed at minimizing worst-case disease spread, are compared with the three main works to provide interesting perspectives. Furthermore, a new problem is proposed that could avoid some issues faced by the three main problems described, and directions for future work are suggested.
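The quantity targeted by the third problem is concrete enough to sketch: the spectral radius of a graph's adjacency matrix upper-bounds how easily an epidemic sustains itself, and deleting well-chosen edges lowers it. The greedy rule below (score each edge by the product of leading-eigenvector entries) is a standard heuristic used for illustration, not the thesis's algorithm:

```python
import numpy as np

def spectral_radius(A):
    # Largest eigenvalue of a symmetric adjacency matrix.
    return float(np.max(np.linalg.eigvalsh(A)))

def greedy_edge_deletion(A, k):
    A = A.copy().astype(float)
    for _ in range(k):
        w = np.linalg.eigh(A)[1][:, -1]            # leading eigenvector
        scores = np.abs(np.outer(w, w)) * (A > 0)  # score existing edges
        i, j = np.unravel_index(np.argmax(scores), A.shape)
        A[i, j] = A[j, i] = 0                      # delete highest-scoring edge
    return A

# Complete graph on 5 nodes: spectral radius is n - 1 = 4.
A = np.ones((5, 5)) - np.eye(5)
B = greedy_edge_deletion(A, 3)
print(spectral_radius(A), "->", spectral_radius(B))
```

Each deletion strictly lowers the radius of a connected graph; the thesis's algorithms aim for provable approximation bounds rather than this unguaranteed greedy behavior.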
Contributors: Stanton, Andrew Warren (Author) / Richa, Andrea (Thesis director) / Czygrinow, Andrzej (Committee member) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2018-05
Description
Construction is a defining characteristic of geometry classes. In a traditional classroom, teachers and students use physical tools (i.e. a compass and straight-edge) in their constructions. However, with modern technology, construction is possible through the use of digital applications such as GeoGebra and Geometer’s SketchPad.
Many other studies have researched the benefits of digital manipulatives and digital environments through student completion of tasks and testing. This study intends to research students’ use of the digital tools and manipulatives, along with the students’ interactions with the digital environment. To this end, I conducted exploratory teaching experiments with two calculus I students.
In the exploratory teaching experiments, students were introduced to a GeoGebra application developed by Fischer (2019), which includes instructional videos and corresponding quizzes, as well as exercises and interactive notepads, where students could use digital tools to construct line segments and circles (corresponding to the physical straight-edge and compass). The application built up the students’ foundational knowledge, culminating in the construction and verbal proof of Euclid’s Elements, Proposition 1 (Euclid, 1733).
The central findings of this thesis are the students’ interactions with the digital environment, with observed changes in their conceptions of radii and circles, and in their use of tools. The students were observed to have conceptions of radii as a process, a geometric shape, and a geometric object. I observed the students’ conceptions of a circle change from a geometric shape to a geometric object, and with that change, observed the students’ use of tools change from a measuring focus to a property focus.
I report a summary of the students’ work and classify their reasoning and actions into the above categories, and an analysis of how the digital environment impacts the students’ conceptions. I also briefly discuss the impact of the findings on pedagogy and future research.
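The construction the students worked toward, Euclid's Elements, Proposition 1, translates directly into coordinates: draw circles of radius |AB| centered at A and at B, and either intersection point completes an equilateral triangle. A small sketch of that computation (the function name is my own, not from the GeoGebra application):

```python
import math

def equilateral_apex(ax, ay, bx, by):
    # Euclid I.1 in coordinates: the apex C lies above the midpoint of AB,
    # at height (sqrt(3)/2) * |AB| along the perpendicular direction.
    mx, my = (ax + bx) / 2, (ay + by) / 2
    dx, dy = bx - ax, by - ay
    h = math.sqrt(3) / 2
    return mx - h * dy, my + h * dx

cx, cy = equilateral_apex(0.0, 0.0, 1.0, 0.0)
print(cx, cy)  # apex at (0.5, sqrt(3)/2)
```

Both |AC| and |BC| equal |AB|, which is exactly what the compass-and-straight-edge steps guarantee.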
Contributors: Sakauye, Noelle Marie (Author) / Roh, Kyeong Hah (Thesis director) / Zandieh, Michelle (Committee member) / School of Mathematical and Statistical Sciences (Contributor) / School of International Letters and Cultures (Contributor) / Barrett, The Honors College (Contributor)
Created: 2020-05
Description
In the last few years, billion-dollar companies like Yahoo and Equifax have had data breaches causing millions of people’s personal information to be leaked online. Other billion-dollar companies like Google and Facebook have gotten in trouble for abusing people’s personal information for financial gain as well. In this new age of technology, where everything is being digitalized and stored online, people all over the world are concerned about what is happening to their personal information and how they can trust it is being kept safe. This paper describes, first, the importance of protecting user data; second, one easy tool that companies and developers can use to help ensure that their users’ information (credit card information specifically) is kept safe; how to implement that tool; and finally, future work and research that needs to be done. The solution I propose is a software tool that keeps credit card data secured. It is only a small step towards achieving a completely secure, anonymized data system, but when implemented correctly, it can reduce the risk of credit card data being exposed to the public. The software tool is a script that can scan every viable file in any given server or other file-structured Linux system and detect if there are any visible credit card numbers that should be hidden.
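The detection step such a script performs can be sketched in two parts: a pattern match for 13-16 digit runs, then the Luhn checksum to filter out random digit strings. This illustrative version scans a string rather than walking a filesystem, and is my own sketch rather than the thesis's actual script:

```python
import re

# Match 13-16 digits, optionally separated by spaces or hyphens.
CARD_RE = re.compile(r"\b\d(?:[ -]?\d){12,15}\b")

def luhn_valid(number):
    digits = [int(d) for d in re.sub(r"\D", "", number)]
    checksum = 0
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:      # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        checksum += d
    return checksum % 10 == 0

def find_card_numbers(text):
    # Keep only matches that pass the Luhn check.
    return [m.group() for m in CARD_RE.finditer(text)
            if luhn_valid(m.group())]

sample = "order 4111 1111 1111 1111 shipped, ref 1234567890123"
print(find_card_numbers(sample))  # ['4111 1111 1111 1111']
```

The Luhn filter matters in practice: most 13-16 digit strings in logs (order numbers, timestamps) fail the checksum, so false positives drop sharply.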
Contributors: Pappas, Alexander (Author) / Zhao, Ming (Thesis director) / Kuznetsov, Eugene (Committee member) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2020-05
Description
Political polarization is the coalescence of political parties -- and the individuals of which parties are composed -- around opposing ends of the ideological spectrum. Political parties in the United States have always been divided; however, in recent years this division has only intensified. Recently, polarization has also wound its way to the Supreme Court and the nomination processes of justices to the Court. This paper examines how prevalent polarization in the Supreme Court nomination process has become by looking specifically at the failed nomination of Judge Merrick Garland and the confirmations of now-Justices Neil Gorsuch and Brett Kavanaugh. This is accomplished by comparing the ideologies and qualifications of the three most recent nominees to those of previous nominees, as well as analyzing the ideological composition of the Senate at the times of the individual nominations.
Contributors: Joss, Jacob (Author) / Hoekstra, Valerie (Thesis director) / Critchlow, Donald (Committee member) / Computer Science and Engineering Program (Contributor) / School of Politics and Global Studies (Contributor) / Barrett, The Honors College (Contributor)
Created: 2020-05
Description
The original version of Helix, the one I pitched when first deciding to make a video game for my thesis, is an action-platformer, with the intent of metroidvania-style progression and an interconnected world map.

The current version of Helix is a turn-based role-playing game, with the intent of roguelike gameplay and a dark fantasy theme. We will first explore the challenges that came with programming my own game - not quite from scratch, but also without a prebuilt engine - then transition into game design and how Helix has evolved from its original form to what we see today.
Contributors: Discipulo, Isaiah K (Author) / Meuth, Ryan (Thesis director) / Kobayashi, Yoshihiro (Committee member) / School of Mathematical and Statistical Sciences (Contributor) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2020-05