Matching Items (10)

Description
Radio frequency (RF) transceivers require a disproportionately high effort in terms of test development time, test equipment cost, and test time. The relatively high test cost stems from two contributing factors. First, RF transceivers require the measurement of a diverse set of specifications, requiring multiple test set-ups and long test times, which complicates load-board design, debug, and diagnosis. Second, high-frequency operation necessitates the use of expensive equipment, resulting in a higher per-second test cost compared with mixed-signal or digital circuits. Moreover, in terms of the non-recurring engineering cost, the need to measure complex specifications complicates the test development process and necessitates a long learning process for test engineers. Test time is dominated by the changing and settling time for each test set-up; thus, single set-up test solutions are desirable. The loop-back configuration, in which the transmitter output is connected to the receiver input, is the desirable test set-up for RF transceivers, since it eliminates the reliance on expensive instrumentation for RF signal analysis and enables measuring multiple parameters at once. In-phase and quadrature (IQ) imbalance, non-linearity, DC offset, and IQ time skew are some of the most detrimental imperfections affecting transceiver performance. Measurement of these parameters in the loop-back mode is challenging due to the coupling between the receiver (RX) and transmitter (TX) parameters. Loop-back-based solutions are proposed in this work to resolve this issue. A calibration algorithm for a subset of the above-mentioned impairments is also presented.

Error Vector Magnitude (EVM) is a system-level parameter that is specified for most advanced communication standards. EVM measurement often takes extensive test development effort, tester resources, and long test times. EVM is analytically related to system impairments, which are typically measured in a production test environment. Thus, the EVM test can be eliminated from the test list if the relations between EVM and system impairments are derived independent of the circuit implementation and manufacturing process. In this work, the focus is on the WLAN standard and on deriving the relations between EVM and three of the most detrimental impairments for QAM/OFDM-based systems (IQ imbalance, non-linearity, and noise). Low-cost test techniques for measuring RF transceiver imperfections, combined with the ability to analytically compute EVM from the measured parameters, constitute a complete test solution for RF transceivers. These techniques, along with the proposed calibration method, can be used to improve yield by widening the pass/fail boundaries for transceiver imperfections. For all of the proposed methods, simulation and hardware measurements prove that the proposed techniques provide accurate characterization of RF transceivers.
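As a rough, implementation-independent illustration of how EVM relates to constellation-level impairments (this is not the dissertation's analytical derivation), the sketch below computes RMS EVM from received symbols distorted by an assumed I/Q gain imbalance, DC offset, and noise. The impairment values are hypothetical placeholders.

```python
import numpy as np

def evm_percent(rx_symbols, ref_symbols):
    """RMS error vector magnitude, normalized by the RMS power of the
    ideal reference constellation and expressed in percent."""
    err = rx_symbols - ref_symbols
    return 100.0 * np.sqrt(np.mean(np.abs(err) ** 2) /
                           np.mean(np.abs(ref_symbols) ** 2))

# Hypothetical QPSK reference distorted by an assumed I/Q gain imbalance,
# DC offset, and additive noise (placeholder values, not measured data).
rng = np.random.default_rng(0)
ref = (rng.choice([-1.0, 1.0], 1000) + 1j * rng.choice([-1.0, 1.0], 1000)) / np.sqrt(2)
g_i, dc = 1.05, 0.02 + 0.01j
rx = g_i * ref.real + 1j * ref.imag + dc
rx += 0.02 * (rng.standard_normal(1000) + 1j * rng.standard_normal(1000))
print(f"EVM = {evm_percent(rx, ref):.2f} %")
```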
Contributors: Nassery, Afsaneh (Author) / Ozev, Sule (Thesis advisor) / Bakkaloglu, Bertan (Committee member) / Kiaei, Sayfe (Committee member) / Kitchen, Jennifer (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
ABSTRACT To meet stringent market demands, manufacturers must produce Radio Frequency (RF) transceivers that provide wireless communication between electronic components used in consumer products at extremely low cost. Semiconductor manufacturers are in a steady race to increase integration levels through advanced system-on-chip (SoC) technology. The testing costs of these devices tend to increase with higher integration levels. As integration levels increase and the devices get faster, the need for high-calibre, low-cost test equipment becomes dominant; however, testing the overall system becomes harder and more expensive. Traditionally, the transceiver system is tested in two steps utilizing high-calibre RF instrumentation and mixed-signal testers, with separate measurement set-ups for the transmitter and receiver paths. Impairments in the RF front-end, such as I/Q gain and phase imbalance and nonlinearity, severely affect the performance of the device. The transceiver needs to be characterized in terms of these impairments in order to guarantee good performance and meet specification requirements. The motivation for this thesis is to develop a low-cost and computationally simple technique for extracting these impairments. In the proposed extraction technique, the mapping between transmitter input signals and receiver output signals is used to extract the impairment and nonlinearity parameters. This is done with the help of detailed mathematical modeling of the transceiver. While the overall behavior is nonlinear, both linear and nonlinear models to be used under different test set-ups are developed. A two-step extraction technique is proposed in this work. The extraction of system parameters is performed by using the mathematical model developed, along with a genetic algorithm implemented in MATLAB. The technique yields good extraction results with reasonable error. It uses simple mathematical operations, which makes the extraction fast and computationally simple compared with other existing techniques, such as the traditional two-step dedicated approach and the Nonlinear Solver (NLS) approach. It employs frequency-domain analysis of low-frequency input and output signals rather than cumbersome time-domain computations. Thus a test method is presented, including detailed behavioral modeling of the transceiver, appropriate test signal design, and a simple extraction algorithm.
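The thesis extracts the impairments using a detailed behavioral model and a genetic algorithm in MATLAB. As a much-simplified sketch of the underlying idea of inverting a loop-back mapping (not the thesis's method), the example below fits a hypothetical linear baseband model, y = alpha*x + beta*conj(x) + dc, in which I/Q imbalance appears as an image term alongside a DC offset, using ordinary least squares; the model, stimulus, and parameter values are assumptions for illustration.

```python
import numpy as np

# Hypothetical loop-back model: received baseband y is the transmitted baseband x
# plus an image term (I/Q imbalance) and a DC offset,
#     y = alpha * x + beta * conj(x) + dc,
# which is linear in (alpha, beta, dc) and can be solved directly by least squares.
rng = np.random.default_rng(1)
n = 2000
x = np.exp(2j * np.pi * 0.01 * np.arange(n))                 # single-tone test stimulus
alpha_true, beta_true, dc_true = 0.98 * np.exp(0.02j), 0.03 - 0.01j, 0.05 + 0.02j
y = alpha_true * x + beta_true * np.conj(x) + dc_true
y += 0.01 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))  # receiver noise

A = np.column_stack([x, np.conj(x), np.ones(n)])
alpha_hat, beta_hat, dc_hat = np.linalg.lstsq(A, y, rcond=None)[0]
print("image rejection ratio (dB):", 20 * np.log10(np.abs(alpha_hat / beta_hat)))
print("estimated DC offset:", dc_hat)
```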
Contributors: Sreenivassan, Aiswariya (Author) / Ozev, Sule (Thesis advisor) / Kiaei, Sayfe (Committee member) / Bakkaloglu, Bertan (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
My project focuses on the future of traditional radio and answers the question of whether or not it will be able to survive in a digital age. I provided a literature review to offer background on the history of radio and the challenges it has faced during its existence. I addressed five specific areas: television, satellite radio, Internet radio, podcasts, and mobile devices. My creative element consisted of a radio documentary that compiled interviews from professionals in the broadcast industry. I answered three questions in my research: 1.) Do experts in the industry believe traditional radio will be able to survive the emergence of new technologies and non-traditional programming? Or will these new technologies and non-traditional programming ultimately overcome traditional radio and become the new standard? 2.) In what ways do industry experts believe the traditional radio format will have to change in order to compete with and prevail over new technologies and remain successful? 3.) Which non-traditional radio formats do industry experts believe pose the biggest threat to traditional radio? In conclusion, I found that uncertain times, but also times of opportunity and innovation, lie ahead for the broadcast industry. Traditional radio will exist for the foreseeable future. As long as the radio dial exists in cars, traditional radio will remain relevant. In order to adapt as digital technology gains popularity, traditional radio must provide programming that is interesting and compelling enough to satisfy the increased thirst for audio. Keywords: future of traditional radio, disruption, digital audio
Contributors: Brown, Benjamin Donald (Author) / Blatt, Rebecca (Thesis director) / Rackham, Brian (Committee member) / Walter Cronkite School of Journalism and Mass Communication (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05
Description
Chicago, the third largest city in the United States, is frequently in the national media's spotlight for negative news such as violence or failed gun laws. The city is hardly ever talked about in a positive light. This study aims to inform and educate outsiders about what the city is like through the lens of Chicago residents. To grasp a general understanding of Chicago, this creative project was completed as a narrative, interview-driven podcast series split into different topic categories. These categories were Chicago food, Chicago neighborhoods, Chicago's Southside, and Chicago sports. These topic areas are some of the things Chicago is most known for and give an adequate representation of what the city is like. Researching and putting this creative project into podcast form proved how podcasts can be an alternative to in-depth and long-form journalism projects. The Chicago food episode, called "Harold's v. Uncle Remus," explains the city's delicious food culture and showcases two of the popular black restaurant chains that cater to the city. These two chicken spots are always a hot topic in heated debates over which place has the best chicken. The neighborhoods episode, called "Won't You Be My Neighbor," highlights some of Chicago's interesting neighborhoods that tourists may not have on their attractions lists. This episode covers the Pill Hill, Printers Row, and Little Italy neighborhoods, which all have unique histories. "Southside With You" explores the infamous region of Chicago, tells its history, and gives a theory as to why it continues to be the area it is known for in the media. Lastly, the sports episode, "Sports: A History Lesson," is a full interview with a Chicago resident who has been a Chicago sports fan since the mid-60s and who has experienced the effects of racial divisions in sports. The episodes give only a peek of what the large city is like, but they demonstrate that Chicago is not a scary place but one with a complicated and fascinating history.
Contributors: Carter, Jade (Author) / Thornton, Leslie-Jean (Thesis director) / Gatewood, Kira (Committee member) / School of Social and Behavioral Sciences (Contributor) / Walter Cronkite School of Journalism & Mass Comm (Contributor) / Barrett, The Honors College (Contributor)
Created: 2018-12
Description
Using data from the Arizona Radio Observatory Submillimeter Telescope, we have studied the active, star-forming region of the R Coronae Australis molecular cloud in 12CO (2-1), 13CO (2-1), and HCO+ (3-2). We baselined and mapped the data using CLASS, then used the data to create integrated intensity, outflow, and centroid velocity maps in IDL. These maps clearly show the main large outflow, and we also identified a few other possible outflows.
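The maps described above were produced in IDL as part of the thesis. Purely to illustrate what an integrated-intensity (moment 0) and centroid-velocity (moment 1) calculation involves, here is a minimal Python sketch on a synthetic position-position-velocity cube; the axis ordering, units, and data are assumptions, not the thesis pipeline.

```python
import numpy as np

def moment_maps(cube, velocities):
    """Moment 0 (integrated intensity, K km/s) and moment 1 (intensity-weighted
    centroid velocity, km/s) from a cube with axes (velocity, y, x)."""
    dv = np.abs(np.diff(velocities).mean())          # channel width in km/s
    mom0 = np.nansum(cube, axis=0) * dv
    mom1 = np.nansum(cube * velocities[:, None, None], axis=0) * dv / mom0
    return mom0, mom1

# Synthetic cube: a Gaussian line centered at +2 km/s in every pixel (placeholder data).
v = np.linspace(-10.0, 10.0, 50)
cube = np.exp(-0.5 * ((v[:, None, None] - 2.0) / 1.5) ** 2) * np.ones((50, 32, 32))
mom0, mom1 = moment_maps(cube, v)
```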
Contributors: Blumm, Margaret Elizabeth (Author) / Groppi, Christopher (Thesis director) / Bowman, Judd (Committee member) / Mauskopf, Philip (Committee member) / Barrett, The Honors College (Contributor) / School of Earth and Space Exploration (Contributor)
Created: 2014-05
Description
Radio astronomy is a subfield of astronomy that deals with objects emitting at frequencies of roughly 10 MHz to 100 GHz. The Low Frequency Array (LOFAR) is an array of radio antennas in Europe that can reach very low frequencies, roughly 10-240 MHz. Our project was to image and clean a field from LOFAR. The data covered a 10-degree-square region of sky centered at right ascension 10:19:34.608 and declination +49.36.52.482, observed for 600 seconds at 141 MHz. To clean the field, we had to flag and remove any stations that were not responding. Using a program called FACTOR, we cleaned the image and reduced the residuals. Next, we checked the validity of our sources. We checked positional offsets for our sources against the TGSS survey at 150 MHz and corrected the declinations of our LOFAR sources by 0.0002 degrees. We also scaled the LOFAR fluxes by a factor of 1.15. After these systematic checks, we calculated the spectral index of our sources using the FIRST survey at 1435 MHz. We plotted this spectral index against LOFAR flux as well as against the redshift of the sources, and compared these to the literature.
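For reference, the two-point spectral index mentioned above follows from the convention S_nu proportional to nu**alpha. The sketch below shows the calculation between the LOFAR (141 MHz) and FIRST (1435 MHz) bands used in the text; the flux values are placeholders, not results from this work.

```python
import numpy as np

def spectral_index(s1, nu1, s2, nu2):
    """Two-point spectral index alpha, assuming S_nu proportional to nu**alpha."""
    return np.log(s1 / s2) / np.log(nu1 / nu2)

# Placeholder fluxes in Jy at the two frequencies used in the text.
alpha = spectral_index(s1=0.50, nu1=141e6, s2=0.10, nu2=1435e6)
print(f"alpha = {alpha:.2f}")   # negative alpha: flux falls with frequency
```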
Contributors: Stawinski, Stephanie Mae (Author) / Scannapieco, Evan (Thesis director) / Windhorst, Rogier (Committee member) / Olsen, Karen (Committee member) / Department of Physics (Contributor) / School of International Letters and Cultures (Contributor) / School of Earth and Space Exploration (Contributor) / Barrett, The Honors College (Contributor)
Created: 2017-05
Description

A reflection on my diverse educational experience as a sports journalism student, the key lessons I learned about specific forms of communication and content creation across social media, written reporting, and radio/podcasting, and the demand for versatility among all modern journalists.

Contributors: Breber, Carson (Author) / Boivin, Paola (Thesis director) / Karpman, Chris (Committee member) / Barrett, The Honors College (Contributor) / School of International Letters and Cultures (Contributor) / Walter Cronkite School of Journalism and Mass Comm (Contributor)
Created: 2022-05
Description
In the upcoming decade, powerful new astronomical facilities such as the James Webb Space Telescope (JWST), the Square Kilometer Array (SKA), and ground-based 30-meter telescopes will open up the epoch of reionization to direct astronomical observation. One of the primary tools used to understand the bulk astrophysical properties of the high-redshift universe is the set of empirically derived star-forming laws, which relate observed luminosity to fundamental astrophysical quantities such as star formation rate. The radio/infrared relation is one of the more mysterious of these relations: despite its somewhat uncertain astrophysical origins, it is extremely tight and linear, with 0.3 dex of scatter over five orders of magnitude in galaxy luminosity. The effect of primordial metallicities on canonical star-forming laws is an open question: a growing body of evidence suggests that the current empirical star-forming laws may not be valid in the unenriched, metal-poor environment of the very early universe.

In the modern universe, nearby dwarf galaxies with less than 1/10th the Solar metal abundance provide an opportunity to recalibrate our star formation laws and to study the astrophysics of extremely metal-deficient (XMD) environments in detail. I assemble a sample of nearby dwarf galaxies, all within 100 megaparsecs, with nebular oxygen abundances between 1/5th and 1/50th Solar. I identify the subsample of these galaxies with space-based mid- and far-infrared data, and investigate the effects of extreme metallicities on the infrared-radio relationship. For ten of these galaxies, I have acquired 40 hours of observations with the Jansky Very Large Array (JVLA). C-band (4-8 GHz) radio continuum emission is detected from all 10 of these galaxies. These observations represent the first radio continuum detections for seven galaxies in this sample, including Leo A, UGC 4704, HS 0822+3542, SBS 0940+544, and SBS 1129+476. The radio continuum in these galaxies is strongly associated with the presence of optical H-alpha emission, with spectral slopes suggesting a mix of thermal and non-thermal sources. I use the ratio of the radio and far-infrared emission to investigate the behavior of the C-band (4-8 GHz) radio/infrared relation at metallicities below 1/10th Solar.

I compare the low-metallicity sample with the 4.8 GHz radio/infrared relationship from the KINGFISH nearby galaxy sample (Tabatabaei et al. 2017) and with the 1.4 GHz radio/infrared relationship from the blue compact dwarf galaxy sample of Wu et al. (2008). The infrared/radio ratio q of the low-metallicity galaxies is below the average q of star-forming galaxies in the modern universe. I compare these galaxies' infrared and radio luminosities to their corresponding H-alpha luminosities, and find that both the infrared/H-alpha and the radio/H-alpha ratios are reduced by nearly 1 dex in the low-metallicity sample relative to higher-metallicity galaxies; however, the deficit is not straightforwardly interpreted as a metallicity effect.
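For context, the infrared/radio ratio q referred to above is conventionally defined (following Helou et al. 1985) as a logarithmic ratio of far-infrared to radio flux. The dissertation's exact C-band (4-8 GHz) normalization may differ from this 1.4 GHz convention, and the values in the sketch below are placeholders, not measurements from the sample.

```python
import numpy as np

def q_ratio(fir_flux_wm2, s_radio_wm2hz):
    """Conventional infrared/radio ratio (Helou et al. 1985):
    q = log10[(FIR / 3.75e12 Hz) / S_radio], with FIR in W m^-2 and
    S_radio in W m^-2 Hz^-1."""
    return np.log10((fir_flux_wm2 / 3.75e12) / s_radio_wm2hz)

# Placeholder fluxes only, to illustrate the calculation.
print(q_ratio(fir_flux_wm2=1.0e-13, s_radio_wm2hz=1.0e-29))
```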
Contributors: Monkiewicz, Jacqueline Ann (Author) / Bowman, Judd (Thesis advisor) / Scowen, Paul (Thesis advisor) / Mauskopf, Philip (Committee member) / Scannapieco, Evan (Committee member) / Jansen, Rolf (Committee member) / Arizona State University (Publisher)
Created: 2018
Description
The Hydrogen Epoch of Reionization Array (HERA) is a radio telescope currently being built in South Africa to observe the early universe, specifically the earliest period of star and galaxy formation. It plans to use a tool called a delay spectrum to separate the signal emitted during this period from the much brighter radio foregrounds. The purpose of this paper is to outline the method used to characterize the contamination of these delay spectra by bright terrestrial radio emission known as radio frequency interference (RFI). The portion of the bandwidth containing the signal from the period of initial star formation was specifically examined. To obtain usable data, the HERA commissioning team was assisted in evaluating the most recent data releases. On the first batch of usable data, flagging algorithms were run in order to mask all of the RFI present. A method of filling these masked values was determined, which allowed the delay spectrum to be computed. Various methods of injecting RFI into the data were tested, which demonstrated how strongly the delay spectrum depends on its presence. Finally, the noise power was estimated in order to determine whether the limitations observed in the dynamic range were comparable to the noise floor. Examining the evolution of the delay spectrum's power as a range of noise powers was introduced provides good evidence that this limitation is in fact the noise floor. From this, we see that the excision algorithms and interpolation used are capable of removing the effects of nearly all of the RFI contamination.
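The delay spectrum at the heart of this analysis is, in essence, a Fourier transform of a visibility spectrum along frequency with a taper: smooth-spectrum foregrounds concentrate at low delays, while narrow-band RFI (or the gaps left by RFI flags) scatters power to all delays. The following is a minimal sketch of that transform, not the HERA analysis pipeline; the visibility, taper choice, and RFI spike are assumed for illustration.

```python
import numpy as np

def delay_spectrum(vis, freqs):
    """Delay power spectrum of one visibility spectrum: taper, FFT along
    frequency, and square. Delays are returned in seconds."""
    taper = np.blackman(len(freqs))
    df = freqs[1] - freqs[0]
    delays = np.fft.fftshift(np.fft.fftfreq(len(freqs), d=df))
    power = np.abs(np.fft.fftshift(np.fft.fft(vis * taper)) * df) ** 2
    return delays, power

# Illustrative visibility: a smooth foreground-like ripple at 200 ns delay
# plus a single-channel RFI spike (placeholder values).
freqs = np.linspace(100e6, 200e6, 1024)
vis = 100.0 * np.exp(2j * np.pi * 200e-9 * freqs)
vis[512] += 1e4
delays, power = delay_spectrum(vis, freqs)
```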
Contributors: Bechtel, Shane Kirkpatrick (Author) / Bowman, Judd (Thesis director) / Jacobs, Daniel (Committee member) / Beardsley, Adam (Committee member) / School of Earth and Space Exploration (Contributor) / Barrett, The Honors College (Contributor)
Created: 2019-05
Description
Ever since Cleveland-based disc jockey Alan Freed coined the term "rock 'n' roll" in the early 1950s, the genre has gone through various mass media and digital changes over the decades. These changes took place on the radio, on television, and on the internet. Each platform had its own unique ways of increasing the popularity of rock artists as well as of the genre itself. Although radio is not as popular today as it was in the 20th century, it helped pave the way for today's most popular music streaming platforms, such as Spotify and Apple Music. Television gave artists a chance to be seen nationally or even worldwide; music videos and live performances allowed viewers to see past artists' voices and witness their energy. The internet gave bands and artists multiple platforms to share their content and connect with fans. In 2020, having a social media presence became essential for artists wanting to maintain a successful music career during the COVID-19 pandemic. Rock music most likely would not be what it is today if it had not gone through these various changes.
Contributors: Urriola, Monica (Author) / Thornton, Leslie (Thesis director) / Agne, Tim (Committee member) / School of International Letters and Cultures (Contributor) / Walter Cronkite School of Journalism & Mass Comm (Contributor) / Barrett, The Honors College (Contributor)
Created: 2020-12