Advancing the State-of-the-Art of Microwave Astronomy: Novel FPGA-Based Firmware Algorithms for the Next Generation of Observational Radio and Sub-millimeter Wave Detection

Description
This dissertation presents a comprehensive study on the advancement of astrophysical radio, microwave, and terahertz instrumentation and simulations with three pivotal components. First, theoretical simulations of high-metallicity galaxies are conducted using the supercomputing resources of Purdue University and NASA. These simulations model the evolution of a gaseous cloud akin to a nascent galaxy, incorporating variables such as kinetic energy, mass, radiation fields, magnetic fields, and turbulence. The objective is to scrutinize the spatial distribution of various isotopic elements in galaxies with unusually high metallicities and to measure the effects of magnetic fields on their structural distribution. Next, I investigate the technology used for reading out Microwave Kinetic Inductance Detectors (MKIDs) and the dynamic range limitations tied to the current method of FPGA-based readout firmware. In response, I introduce an innovative algorithm that employs PID controllers and phase-locked loops to track the natural frequencies of resonator pixels, thereby eliminating the need for costly mid-observation frequency recalibrations that currently hinder the widespread use of MKID arrays. Finally, I unveil the novel Spectroscopic Lock-in Firmware (SpLiF) algorithm, designed to address the pernicious low-frequency noise plaguing emergent quantum-limited detection technologies. The SpLiF algorithm combines the mathematical principles of lock-in amplification with the capabilities of a Fast Fourier Transform to protect spectral information from pink noise and other low-frequency noise contributors inherent to most detection systems. The efficacy of the SpLiF algorithm is substantiated through rigorous mathematical formulation, software simulations, firmware simulations, and benchtop lab results.
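
To make the lock-in principle that SpLiF builds on concrete, the sketch below shows, in Python, how a quantity modulated at a reference frequency can be recovered by demodulation and averaging even when strong pink (1/f) noise dominates the raw data. This is a minimal illustration of lock-in amplification only, not the SpLiF firmware itself; the sample rate, reference frequency, and noise amplitude are assumed values chosen for the example.

```python
import numpy as np

# Minimal sketch of the lock-in principle behind SpLiF: a quantity of interest
# is modulated at a reference frequency f_ref, shifting it above the 1/f
# ("pink") noise, then demodulated and averaged to recover its amplitude.
# All parameters below are illustrative, not values from the dissertation.

fs = 100_000            # sample rate (Hz), assumed
f_ref = 5_000           # modulation/reference frequency (Hz), assumed
t = np.arange(0, 1.0, 1 / fs)

amplitude = 0.7                                      # value to recover
signal = amplitude * np.sin(2 * np.pi * f_ref * t)   # modulated carrier

# Synthesize pink-ish noise by shaping white noise with a 1/sqrt(f) filter.
white = np.random.randn(t.size)
freqs = np.fft.rfftfreq(t.size, 1 / fs)
shaping = np.where(freqs > 0, 1 / np.sqrt(freqs), 0)
pink = np.fft.irfft(np.fft.rfft(white) * shaping, n=t.size)
pink *= 5.0 / np.std(pink)                           # let low-f noise dominate

measured = signal + pink

# Demodulate with in-phase and quadrature references, then average (low-pass).
i_ref = np.sin(2 * np.pi * f_ref * t)
q_ref = np.cos(2 * np.pi * f_ref * t)
i_comp = 2 * np.mean(measured * i_ref)
q_comp = 2 * np.mean(measured * q_ref)
recovered = np.hypot(i_comp, q_comp)

print(f"true amplitude: {amplitude:.3f}, recovered: {recovered:.3f}")
```

In the firmware described above, this demodulation idea is combined with a Fast Fourier Transform so that spectral information is protected from the low-frequency noise floor.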
Date Created
2024

CUTLASS: Coral Reef, Underwater Terrain, and Littoral Archaeological Site Surveyor

Description

Undersea scientific ocean exploration and research began in earnest only approximately 150 years ago. Much has been learned and discovered in that time, but there are also gaps in our understanding of the ocean depths. One source of this knowledge gap is the relative lack of crewed exploration in some regions of the ocean. This work presents a vehicle that provides divers with longer time at deeper depths than is currently available in an unpressurized environment, reduces diver workload, and improves situational awareness. Working in collaboration with the scientific diver community, top-level requirements were defined and a Concept of Operations was developed. This effort is followed by a vehicle design that provides the capability for two divers to complete unpressurized dives to 200 meters, remain there for 20 minutes, and return to the surface within 12 hours. Additional functionality provided by the vehicle includes significant cargo capacity, voice and data communication with the surface, geolocation capabilities, and automated maneuvering and decompression management. Analysis of the hull shape and propulsion system is presented, which demonstrates that the vehicle can reach its velocity and acceleration performance requirements. A virtual environment is then presented which has the potential to allow for end-to-end mission performance evaluation. Finally, the constraints on the life support system are discussed and source code for a simulation is presented. The final chapter of this work examines a hypothetical mission to 200 meters depth. The various phases of the mission are discussed, as well as the potential consumption of both oxygen and electricity. Two life support gas mixtures are examined, and the resulting decompression profiles are presented. The final analysis shows that it is possible to conduct dives to 200 meters, perform 20 minutes of work, and return to the surface within 12 hours using the CUTLASS vehicle presented here.

Date Created
2022

Falsification of the Integrated Information Theory of Consciousness

Description
Astrobiology is premised on the idea that life beyond Earth can exist. Yet everything known about life derives from life on Earth. To understand life beyond Earth, then, requires a definition of life that is abstracted beyond a particular geophysical context. To do this requires a formal understanding of the physical mechanisms by which matter is animated into life. At present, such descriptions are completely lacking for the emergence of life, but they do exist for the emergence of consciousness. Namely, contemporary neuroscience offers definitions for universal physical processes that are in one-to-one correspondence with conscious experience. Since consciousness is a sufficient condition for life, these universal definitions of consciousness offer an interesting way forward in the search for life in the cosmos. In this work, I systematically examine Integrated Information Theory (IIT), a well-established theory of consciousness, with the aim of applying it in both biological and astrobiological settings. Surprisingly, I discover major problems with Integrated Information Theory on two fronts: mathematical and epistemological. On the mathematical side, I show how degeneracies buried deep within the theory render it mathematically ill-defined, while on the epistemological side, I prove that the postulates of IIT are scientifically unfalsifiable and inherently metaphysical. Given that IIT is the preeminent theory of consciousness in modern neuroscience, these results have far-reaching implications for the field. In addition, I show that the epistemic issues of falsifiability that hamstring IIT apply quite generally to all contemporary theories of consciousness, which suggests a major reframing of the problem is necessary. The problems I reveal in defining consciousness offer an important parallel for defining life, as both fields seek to define their topic of study in the absence of an existing theoretical framework. To avoid metaphysical problems related to falsifiability, universal theories of both life and consciousness must be framed with respect to independent empirical observations that can be used to benchmark predictions from the theory. In this regard, I argue that the epistemic debate over scientific theories of consciousness should be used to inform the discussion regarding theoretical definitions of life.
Date Created
2021

Instrument design and radiation pattern testing for terahertz astronomical instruments

Description
The Milky Way galaxy is a powerful dynamic system that is highly efficient at recycling material. Stars are born out of interstellar gas and dust, fuse light elements into heavier elements in their cores, and upon stellar death spread material throughout the galaxy, either through the diffusion of planetary nebulae or through explosive events for high-mass stars; that gas must then cool and condense to form new stellar nurseries. Though the stellar lifecycle has been studied in detail, relatively little is known about the processes by which hot, diffuse gas ejected by dying stars cools and conglomerates in the interstellar medium (ISM). Much of this mystery arises because only recently have instruments with sufficient spatial and spectral resolution, sensitivity, and bandwidth become available in the terahertz (THz) frequency spectrum, where these clouds peak in either thermal or line emission. In this dissertation, I demonstrate technology advancement of instruments in this frequency regime with new characterization techniques, machining strategies, and scientific models of the spectral behavior of gas species targeted by these instruments.

I begin this work with a description of radiation pattern measurements and their use in astronomical instrument characterization. I introduce a novel technique to measure complex (phase-sensitive) field patterns using direct detectors, and I successfully demonstrate the technique with a single-pixel microwave kinetic inductance detector (MKID) experiment. I expand that work by measuring the APEX MKID (A-MKID) focal plane array of 880 detector pixels centered at 350 GHz. In both chapters I discuss the development of an analysis pipeline that takes advantage of all of the information provided by complex field mapping. I then discuss the design, simulation, fabrication processes, and characterization of a circular-to-rectangular waveguide transformer module integrated into a circularly symmetric feedhorn block. I conclude with a summary of this work and how to advance these technologies for future ISM studies.
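
As a generic illustration of why phase-sensitive beam maps are valuable, the sketch below propagates a complex aperture field to the far field with a 2-D FFT, so that a phase tilt across the aperture appears directly as a steered beam. This is textbook Fraunhofer aperture theory included only for intuition; it is not the measurement technique or analysis pipeline developed in this dissertation, and the grid size, aperture radius, and phase tilt are assumed values.

```python
import numpy as np

# With both amplitude and phase measured across an aperture, a 2-D Fourier
# transform propagates the field to the far field (Fraunhofer approximation),
# so phase structure shows up directly in the predicted beam. Illustrative
# parameters only; not the dissertation's measurement technique.

n = 256                                    # samples per axis, assumed
extent = 0.05                              # aperture-plane half-width (m), assumed
x = np.linspace(-extent, extent, n)
xx, yy = np.meshgrid(x, x)

# Uniformly illuminated circular aperture with a deliberate 3-cycle phase tilt.
radius = 0.02
aperture = (xx**2 + yy**2 <= radius**2).astype(complex)
aperture *= np.exp(1j * 2 * np.pi * 3.0 * xx / (2 * extent))

# Far-field beam pattern: 2-D FFT of the complex aperture field.
far_field = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(aperture)))
peak = np.unravel_index(np.argmax(np.abs(far_field)), far_field.shape)

# The phase tilt steers the beam: the peak lands a few bins from the centre.
print("grid centre bin:", (n // 2, n // 2), "-> beam peak bin:", peak)
```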
Date Created
2018

Antenna design and foreground characterization for improved detection of the redshifted 21 cm global signature during the Epoch of Reionization

Description
The Universe transitioned from a state of neutral hydrogen (HI) shortly after recombination to its present-day ionized state, but this transition, the Epoch of Reionization (EoR), has been poorly constrained by observational data. Estimates place the EoR between redshifts 6 < z < 13 (330-770 Myr).

The interaction of the 21 cm hyperfine ground-state emission/absorption line of HI with the cosmic microwave background (CMB) and the radiation from the first luminous sources in the universe can be used to extract cosmological information about the EoR. Theorists have created global redshifted 21 cm EoR models of this interaction that predict the temperature perturbations to the CMB in the form of a sky-averaged difference temperature, Tb. The difficulty in measuring Tb is that it is predicted to be on the order of 20 to 100 mK, while the sky foreground is dominated by synchrotron radiation that is 10^5 times brighter. The challenge is to subtract the much brighter foreground radiation without subtracting the Tb signal, which can only be done when the data have small error levels.

The Experiment to Detect the Global EoR Signature (EDGES) is an effort to measure Tb with a single wide field-of-view, well-calibrated antenna. This dissertation focuses on reducing systematic errors by quantifying the impact of the chromatic nature of the antenna's beam directivity and by measuring the variability of the spectral index of the radio sky foreground. The chromatic beam study quantified the superior qualities of the rectangular blade-shaped antenna, led to its adoption over the previously used fourpoint-shaped antenna, and determined that a five-term polynomial was optimal for removing the foreground. The spectral index, β, of the sky was measured, using 211 nights of data, to be −2.60 > β > −2.62 in lower LST regions, increasing to −2.50 near the Galactic plane. This matched simulated results using the Guzmán et al. (2011) sky map (Δβ < 0.05) and demonstrated the exceptional stability of the EDGES instrument. Lastly, an EoR model by Kaurov & Gnedin (2016) was shown to be inconsistent with measured EDGES data at a significance level of 1.9σ.
Date Created
2017

Advancement of heterodyne focal plane arrays for terahertz astronomy

Description
The Kilopixel Array Pathfinder Project (KAPPa) advances the number of coherent high-frequency terahertz (THz) receivers that could be packed into a single focal plane array on existing submm telescopes. The KAPPa receiver, at 655-695 GHz, is a high-frequency heterodyne receiver that can achieve system temperatures of less than 200 K, the specification for ALMA Band 9. The KAPPa receiver uses a novel permanent-magnet design to suppress the noise generated by the DC Josephson effect. This is in stark contrast to the benchmark solution of an electromagnet, which is both too expensive and too large for use in kilopixel arrays. I present a simple, robust design for a single receiver element that can be tessellated throughout a telescope's focal plane to make a ~1000-pixel array, much larger than the current state-of-the-art array, SuperCam, at 64 pixels and ~345 GHz.

While the original goal to develop receiver technologies has been accomplished, the path to this accomplishment required a far more holistic approach than originally anticipated. The scope of the present work has expanded well beyond KAPPa's promised technical achievements. In the present work, KAPPa and its extensions, I present solutions ranging from 1) the creation of large-scale astronomical maps, to 2) metaheuristic algorithms that solve tasks too complex for humans, to 3) the detailed technical assembly of microscopic circuit components. Each part is equally integral to the realization of ~1000-pixel THz arrays.

Our automated tuning algorithm, Alice, uses differential evolution techniques and has been extremely successful in its implementation. Alice provides good results for characterizing the extremely complex tuning topology of THz receivers. More importantly, it has accomplished rapid optimization of an entire array without human intervention. In the age of big-data astronomy, I have prepared THz heterodyne receiver arrays by making cutting-edge, community-oriented data analysis tools for the future of large-scale discovery. I present a from-scratch reduction and analysis architecture developed for observations of hundreds of square degrees of on-the-sky maps with SuperCam, to address the gulf between observing with single-dish antennas versus a truly integrated focal plane array.
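
As a sketch of how differential evolution can tune a receiver without human intervention, the example below minimizes a synthetic noise-temperature surface over two bias parameters using SciPy's differential_evolution. The cost function, parameter names, and bounds are hypothetical stand-ins; the real tuning topology handled by Alice is far more complex.

```python
import numpy as np
from scipy.optimize import differential_evolution

# Illustrative differential-evolution tuning in the spirit of Alice. The cost
# function is a made-up noise-temperature surface over two bias parameters
# (e.g., junction bias voltage and LO power); it is not the dissertation's
# receiver model, and the bounds are assumed.

def noise_temperature(params):
    """Hypothetical receiver noise temperature (K) with several local minima."""
    v_bias, lo_power = params
    bowl = 150.0 + 40.0 * (v_bias - 2.1) ** 2 + 60.0 * (lo_power - 0.7) ** 2
    ripple = 25.0 * np.sin(8.0 * v_bias) * np.cos(6.0 * lo_power)
    return bowl + ripple

bounds = [(0.0, 4.0),    # bias voltage (mV), assumed range
          (0.0, 1.5)]    # LO power (arbitrary units), assumed range

result = differential_evolution(noise_temperature, bounds, seed=0,
                                maxiter=200, tol=1e-6)

print("best bias settings:", result.x)
print(f"predicted noise temperature: {result.fun:.1f} K")
```

Because differential evolution needs only cost-function evaluations, the same loop can be repeated over every pixel of an array, which is what makes fully automated, human-free tuning practical.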
Date Created
2016

Investigation of Star Formation: Instrumentation and Methodology

Description
A thorough exploration of star formation necessitates observation across the electromagnetic spectrum. In particular, observations in the submillimeter and ultra-violet allow one to observe very early stage star formation and to trace the evolution from molecular cloud collapse to stellar ignition. Submillimeter observations are essential for piercing the heart of heavily obscured stellar nurseries to observe star formation in its infancy. Ultra-violet observations allow one to observe stars just after they emerge from their surrounding environment, allowing higher energy radiation to escape. To make detailed observations of early stage star formation in both spectral regimes requires state-of-the-art detector technology and instrumentation. In this dissertation, I discuss the calibration and feasibility of detectors developed by Lawrence Berkeley National Laboratory and specially processed at the Jet Propulsion Laboratory to increase their quantum efficiency at far-ultraviolet wavelengths. A cursory treatment of the delta-doping process is presented, followed by a thorough discussion of calibration procedures developed at JPL and in the Laboratory for Astronomical and Space Instrumentation at ASU. Subsequent discussion turns to a novel design for a Modular Imager Cell forming one possible basis for construction of future large focal plane arrays. I then discuss the design, fabrication, and calibration of a sounding rocket imaging system developed using the MIC and these specially processed detectors. Finally, I discuss one scientific application of sub-mm observations. I used data from the Heinrich Hertz Sub-millimeter Telescope and the Sub-Millimeter Array (SMA) to observe sub-millimeter transitions and continuum emission towards AFGL 2591. I tested the use of vibrationally excited HCN emission to probe the protostellar accretion disk structure. I measured vibrationally excited HCN line ratios in order to elucidate the appropriate excitation mechanism. I find collisional excitation to be dominant, showing the emission originates in extremely dense (n ∼ 10^11 cm^-3), warm (T ∼ 1000 K) gas. Furthermore, from the line profile of the v = (0, 2^2d, 0) transition, I find evidence for a possible accretion disk.
Date Created
2012