Matching Items (392)
Description
Magnetic Resonance Imaging using spiral trajectories has many advantages in speed, efficiency of data acquisition, and robustness to motion- and flow-related artifacts. The increase in sampling speed, however, requires high performance of the gradient system. Hardware inaccuracies from system delays and eddy currents can cause spatial and temporal distortions in the encoding gradient waveforms, which creates sampling discrepancies between the actual and the ideal k-space trajectory. Reconstruction assuming an ideal trajectory can result in shading and blurring artifacts in spiral images. Current methods to estimate such hardware errors require many modifications to the pulse sequence, phantom measurements, or specialized hardware. This work presents a new method to estimate time-varying system delays for spiral-based trajectories. It requires a minor modification of a conventional stack-of-spirals sequence and analyzes data collected on three orthogonal cylinders. The method is fast, robust to off-resonance effects, requires no phantom measurements or specialized hardware, and estimates variable system delays for the three gradient channels over the data-sampling period. Initial results for acquired phantom and in-vivo data show a substantial reduction in artifacts and improvement in image quality.
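Because the k-space trajectory is the time integral of the gradient waveform, a channel-dependent hardware delay shifts the sampled trajectory away from the nominal spiral, which is the distortion this work sets out to estimate. The short Python sketch below illustrates only that relationship; the waveform shape, raster time, and 2 µs delay are assumptions for illustration and are not taken from the thesis.

```python
import numpy as np

# Illustrative sketch (not the estimation method of the thesis): shows how a
# per-channel gradient delay shifts the sampled k-space trajectory away from
# the nominal spiral, since k(t) is the time integral of the gradient.
gamma_bar = 42.577e6          # proton gyromagnetic ratio / 2*pi, Hz/T
dt = 4e-6                     # gradient raster time, s (assumed)
t = np.arange(0.0, 10e-3, dt) # 10 ms readout window

# Toy Archimedean-spiral gradient waveforms (arbitrary amplitudes, T/m)
omega = 2 * np.pi * 1.5e3
gx = 5e-3 * (t / t[-1]) * np.cos(omega * t)
gy = 5e-3 * (t / t[-1]) * np.sin(omega * t)

def ktraj(g, delay=0.0):
    """Integrate a gradient waveform, optionally shifted by a hardware delay."""
    shifted = np.interp(t - delay, t, g, left=0.0)
    return gamma_bar * np.cumsum(shifted) * dt   # k-space position, 1/m

kx_ideal = ktraj(gx)
kx_actual = ktraj(gx, delay=2e-6)  # assumed 2 us delay on the x channel
print("max kx deviation (1/m):", np.abs(kx_ideal - kx_actual).max())
```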
Contributors: Bhavsar, Payal (Author) / Pipe, James G (Thesis advisor) / Frakes, David (Committee member) / Kodibagkar, Vikram (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
The end of the nineteenth century was an exhilarating and revolutionary era for the flute. This period is considered the Second Golden Age of the flute, when players and teachers associated with the Paris Conservatory developed what is regarded as the birth of the modern flute school. In addition, the founding in 1871 of the Société Nationale de Musique by Camille Saint-Saëns (1835-1921) and Romain Bussine (1830-1899) made possible the promotion of contemporary French composers. The founding of the Société des Instruments à Vent by Paul Taffanel (1844-1908) in 1879 also invigorated a new era of chamber music for wind instruments. Within this groundbreaking environment, Mélanie Hélène Bonis (pen name Mel Bonis) entered the Paris Conservatory in 1876, under the tutelage of César Franck (1822-1890). Many flutists are dismayed by the scarcity of repertoire for the instrument in the Romantic and post-Romantic traditions; they make up for this absence by borrowing the violin sonatas of Gabriel Fauré (1845-1924) and Franck. The flute and piano works of Mel Bonis help to fill this void with music composed originally for flute. Bonis was a prolific composer with over 300 works to her credit, but her works for flute and piano had not been researched or professionally recorded in the United States before the present study. Although virtually unknown today in the American flute community, Bonis's music received much acclaim from her contemporaries and deserves a prominent place in the flutist's repertoire. After a brief biographical introduction, this document examines Mel Bonis's musical style and describes in detail her six works for flute and piano while also offering performance suggestions.
Contributors: Daum, Jenna Elyse (Author) / Buck, Elizabeth (Thesis advisor) / Holbrook, Amy (Committee member) / Micklich, Albie (Committee member) / Schuring, Martin (Committee member) / Norton, Kay (Committee member) / Arizona State University (Publisher)
Created: 2013
Contributors: Matthews, Eyona (Performer) / Yoo, Katie Jihye (Performer) / Roubison, Ryan (Performer) / ASU Library. Music Library (Publisher)
Created: 2018-03-25
Contributors: Hoeckley, Stephanie (Performer) / Lee, Juhyun (Performer) / ASU Library. Music Library (Publisher)
Created: 2018-03-24
Description
This dissertation describes the development of a procedure for obtaining high-quality, optical-grade sand coupons from frozen sand specimens of Ottawa 20/30 sand for image processing and analysis to quantify soil structure, along with a methodology for quantifying the microstructure from the images. A technique for thawing and stabilizing frozen core samples was developed using optical-grade Buehler® Epo-Tek® epoxy resin, a modified triaxial cell, a vacuum/reservoir chamber, a desiccator, and a moisture gauge. Uniform epoxy resin impregnation required proper drying of the soil specimen, application of appropriate confining pressure and vacuum levels, and epoxy mixing, de-airing, and curing. The resulting stabilized sand specimen was sectioned into 10 mm thick coupons that were planed, ground, and polished with progressively finer diamond abrasive grit levels using the modified Allied HTP Inc. polishing method, so that the soil structure could be accurately quantified from images obtained with an optical microscopy technique. Illumination via bright-field microscopy was used to capture the images for subsequent image processing and sand microstructure analysis. The quality of the resulting images, and the validity of the subsequent image morphology analysis, hinged largely on a polishing and grinding technique that produced a flat, scratch-free, reflective coupon surface characterized by minimal microstructure relief and good contrast between the sand particles and the surrounding epoxy resin. Subsequent image processing involved conversion of the color images first to grayscale images and then to binary images with the use of contrast and image adjustments, removal of noise and image artifacts, image filtering, and image segmentation. Mathematical morphology algorithms were applied to the resulting binary images to further enhance image quality. The binary images were then used to calculate soil structure parameters that included particle roundness and sphericity, particle orientation variability represented by rose diagrams, statistics on the local void ratio variability as a function of sample size, and local void ratio distribution histograms obtained with Oda's method and the Voronoi tessellation method, including the skewness, kurtosis, and entropy of a gamma cumulative probability distribution fit to the local void ratio distribution.
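As a rough illustration of the processing chain described above (grayscale conversion, thresholding, morphological clean-up, and per-particle measurements), a minimal Python sketch using scikit-image is given below; the file name, structuring-element sizes, and threshold choice are placeholder assumptions rather than values from the dissertation.

```python
import numpy as np
from skimage import io, color, filters, morphology, measure

# Hypothetical sketch of the grayscale -> binary -> morphology -> statistics
# chain; parameters are placeholders, not the dissertation's settings.
rgb = io.imread("coupon_section.tif")
gray = color.rgb2gray(rgb)                            # color image to grayscale
smoothed = filters.median(gray, morphology.disk(2))   # suppress noise/artifacts
binary = smoothed > filters.threshold_otsu(smoothed)  # segment grains from epoxy
binary = morphology.remove_small_objects(binary, min_size=64)
binary = morphology.binary_closing(binary, morphology.disk(3))

# Per-particle descriptors feeding roundness/sphericity and rose diagrams
labels = measure.label(binary)
for region in measure.regionprops(labels):
    circularity = 4 * np.pi * region.area / region.perimeter ** 2
    print(region.label, circularity, np.degrees(region.orientation))
```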
Contributors: Czupak, Zbigniew David (Author) / Kavazanjian, Edward (Thesis advisor) / Zapata, Claudia (Committee member) / Houston, Sandra (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
The Arizona State University Herbarium began in 1896 when Professor Fredrick Irish collected the first recorded Arizona specimen, a Parkinsonia microphylla, for what was then called the Tempe Normal School. Since then, the collection has grown to approximately 400,000 specimens of vascular plants and lichens. The most recent project includes the digitization, both the imaging and the databasing, of approximately 55,000 vascular plant specimens from Latin America. To accomplish this efficiently, non-traditional methods involving both new and existing technologies were explored. SALIX (semi-automatic label information extraction) was developed as the central tool to handle automatic parsing, along with BarcodeRenamer (BCR) to automate image file renaming by barcode. These two developments, combined with existing technologies, make up the SALIX Method. The SALIX Method provides a way to digitize herbarium specimens more efficiently than the traditional approach of entering data solely through keystroking. Using digital imaging, optical character recognition, and automatic parsing, I found that the SALIX Method processes data at an average rate that is 30% faster than typing. Data entry speed is dependent on user proficiency, label quality, and, to a lesser degree, label length. This method is used to capture full specimen records, including close-up images where applicable. Access to biodiversity data is limited by the time and resources required to digitize, but I have found that it is possible to do so at a rate that is faster than typing. Finally, I experiment with the use of digital field guides to advance access to biodiversity data and to stimulate public engagement in natural history collections.
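To make the OCR-plus-parsing idea concrete, a hypothetical Python sketch is given below; it is not the actual SALIX code, and the label formats, regular expressions, and file name are assumptions chosen only to show how OCR output might be pre-parsed into fields for an operator to verify.

```python
import re
import pytesseract
from PIL import Image

# Hypothetical illustration of the OCR-plus-parsing step behind the SALIX
# Method (not the actual SALIX implementation): run OCR on a label image,
# then pre-fill a few database fields for an operator to review.
label_text = pytesseract.image_to_string(Image.open("specimen_label.jpg"))

matches = {
    # catalog barcodes assumed to be a fixed-width digit string
    "barcode": re.search(r"\b\d{6,8}\b", label_text),
    # collection dates assumed to look like "12 Mar 1987"
    "date": re.search(r"\b\d{1,2}\s+[A-Z][a-z]{2,8}\.?\s+\d{4}\b", label_text),
    # elevations assumed to look like "850 m"
    "elevation": re.search(r"\b\d{2,4}\s?m\b", label_text),
}
parsed = {field: (m.group(0) if m else None) for field, m in matches.items()}
print(parsed)   # operator reviews and corrects before committing to the database
```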
Contributors: Barber, Anne Christine (Author) / Landrum, Leslie R. (Thesis advisor) / Wojciechowski, Martin F. (Thesis advisor) / Gilbert, Edward (Committee member) / Lafferty, Daryl (Committee member) / Arizona State University (Publisher)
Created: 2012
Contributors: McClain, Katelyn (Performer) / Buringrud, Deanna (Contributor) / Lee, Juhyun (Performer) / ASU Library. Music Library (Publisher)
Created: 2018-03-31
Description
The challenge of radiation therapy is to maximize the dose to the tumor while simultaneously minimizing the dose elsewhere. Proton therapy is well suited to this challenge due to the way protons slow down in matter. As the proton slows down, the rate of energy loss per unit path length continuously increases, leading to a sharp dose peak near the end of the range. Unlike conventional radiation therapy, protons stop inside the patient, sparing tissue beyond the tumor. Proton therapy should therefore be superior to existing modalities; however, because protons stop inside the patient, there is uncertainty in the range. "Range uncertainty" causes doctors to take a conservative approach in treatment planning, counteracting the advantages offered by proton therapy. Range uncertainty prevents proton therapy from reaching its full potential.

A new method of delivering protons, pencil-beam scanning (PBS), has become the standard for treatment over the past few years. PBS uses magnets to raster-scan a thin proton beam across the tumor at discrete locations, using many discrete pulses of typically 10 ms duration each. The depth is controlled by changing the beam energy. The discretization in time of the proton delivery allows for new methods of dose verification; however, few devices have been developed that can meet the bandwidth demands of PBS.
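A schematic sketch of this discretized delivery pattern, assuming a toy spot list and treating the hardware as a simple log-and-wait loop rather than real control code, might look as follows.

```python
import time
from dataclasses import dataclass

# Schematic sketch of the discretized PBS delivery described above, not
# vendor control code: depth is set by the beam energy for each layer, and
# the scanning magnets step the spot across the layer in discrete pulses.
@dataclass
class Spot:
    x_mm: float
    y_mm: float
    energy_mev: float  # selects the depth of the Bragg peak

# Toy plan: two spots in a 120 MeV layer, one spot in a shallower 110 MeV layer
plan = [Spot(0.0, 0.0, 120.0), Spot(5.0, 0.0, 120.0), Spot(0.0, 5.0, 110.0)]

for spot in plan:
    # A real system would steer the magnets and gate the beam here; this
    # sketch only logs the spot and holds for the nominal ~10 ms pulse.
    print(f"energy {spot.energy_mev} MeV, spot ({spot.x_mm}, {spot.y_mm}) mm")
    time.sleep(0.010)
```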

In this work, two devices have been developed to perform dose verification and monitoring, with an emphasis placed on fast response times. Measurements were performed at the Mayo Clinic. One detector addresses range uncertainty by measuring prompt gamma rays emitted during treatment. The range detector presented in this work is able to measure the proton range in vivo to within 1.1 mm at depths up to 11 cm in less than 500 ms, and up to 7.5 cm in less than 200 ms. A beam fluence detector presented in this work is able to measure the position and shape of each beam spot. It is hoped that this work leads to further maturation of detection techniques in proton therapy, helping the treatment reach its full potential to improve patient outcomes.
Contributors: Holmes, Jason M (Author) / Alarcon, Ricardo (Thesis advisor) / Bues, Martin (Committee member) / Galyaev, Eugene (Committee member) / Chamberlin, Ralph (Committee member) / Arizona State University (Publisher)
Created: 2019
Contributors: Hur, Jiyoun (Performer) / Lee, Juhyun (Performer) / ASU Library. Music Library (Publisher)
Created: 2018-03-01
Description
Introduction: There are 350 to 400 pediatric heart transplants annually according to the Pediatric Heart Transplant Database (Dipchand et al. 2014). Finding appropriate donors can be challenging, especially for the pediatric population. The current standard of care is a donor-to-recipient weight ratio, but this ratio is not necessarily a parameter directly indicative of the size of a heart, potentially leading to ill-fitting allografts (Tang et al. 2010). In this paper, a regression model, developed by correlating total cardiac volume with non-invasive imaging parameters and patient characteristics, is presented for use in determining ideal allograft fit with respect to total cardiac volume.
Methods: A virtual, 3D library of clinically defined normal hearts was compiled from reconstructed CT and MR scans. Non-invasive imaging parameters and patient characteristics were collected and subjected to backward-elimination linear regression to define a model relating patient parameters to total cardiac volume. This regression model was then used to retrospectively accept or reject an 'ideal' donor graft from the library for three patients who had undergone heart transplantation. Oversized and undersized grafts were also virtually transplanted to qualitatively analyze the specificity of virtual transplantation.
Results: Backward elimination on the data for the 20 patients rejected the factors of BMI, BSA, sex, and both end-systolic and end-diastolic left ventricular measurements from echocardiography. Height and weight were retained in the linear regression model, yielding an adjusted R-squared of 82.5%. Height and weight showed statistical significance with p-values of 0.005 and 0.02, respectively. The final equation for the linear regression model was TCV = -169.320 + 2.874h + 3.578w ± 73 (h = height, w = weight, TCV = total cardiac volume).
Discussion: With the current regression model, height and weight significantly correlate with total cardiac volume. This regression model and the virtual normal-heart library provide for the possibility of virtual transplantation and size matching for transplantation. The study and regression model are, however, limited by the small sample size. Additionally, the lack of volumetric resolution in the MR datasets is a potentially limiting factor. Despite these limitations, the virtual library has the potential to be a critical tool for clinical care that will continue to grow as normal hearts are added.
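For readers who want to apply the reported model, a minimal sketch evaluating TCV = -169.320 + 2.874h + 3.578w ± 73 is given below; the units (height in cm, weight in kg, TCV in mL) are an assumption, since the abstract does not state them explicitly.

```python
# Minimal sketch evaluating the reported regression model,
# TCV = -169.320 + 2.874*h + 3.578*w (+/- 73), with height in cm and
# weight in kg assumed as the units (not stated in the abstract).
def predicted_tcv(height: float, weight: float) -> tuple[float, float, float]:
    tcv = -169.320 + 2.874 * height + 3.578 * weight
    return tcv - 73, tcv, tcv + 73   # lower bound, point estimate, upper bound

low, est, high = predicted_tcv(height=150.0, weight=45.0)
print(f"estimated TCV: {est:.1f} (range {low:.1f} to {high:.1f})")
```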
Contributors: Sajadi, Susan (Co-author) / Lindquist, Jacob (Co-author) / Frakes, David (Thesis director) / Ryan, Justin (Committee member) / Harrington Bioengineering Program (Contributor) / School of International Letters and Cultures (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05