Matching Items (3)

Description
Deconvolution of noisy data is an ill-posed problem and requires some form of regularization to stabilize its solution. Tikhonov regularization is the most commonly used method, but it depends on the choice of a regularization parameter λ, which must generally be estimated using one of several standard methods. These methods can be computationally intensive, so I consider their behavior when only a portion of the sampled data is used. I show that the results of these methods converge as the sampling resolution increases, and I use this to suggest a method of downsampling to estimate λ. I then present numerical results showing that this method is feasible, and I propose future avenues of inquiry.
Contributors: Hansen, Jakob Kristian (Author) / Renaut, Rosemary (Thesis director) / Cochran, Douglas (Committee member) / Barrett, The Honors College (Contributor) / School of Music (Contributor) / Economics Program in CLAS (Contributor) / School of Mathematical and Statistical Sciences (Contributor)
Created: 2015-05
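
As a concrete illustration of the downsampling idea in this abstract, the following Python sketch solves a one-dimensional deconvolution problem with Tikhonov regularization and chooses λ by minimizing a generalized cross-validation (GCV) score on a 4x-downsampled copy of the problem. The Gaussian blur model, the GCV criterion, the grid sizes, and the kernel rescaling are illustrative assumptions, not the thesis's exact formulation.

import numpy as np
from scipy.linalg import toeplitz
from scipy.optimize import minimize_scalar

def blur_matrix(n, sigma):
    """Row-normalized Toeplitz matrix for a discrete Gaussian blur
    (a stand-in forward model; the thesis's operator may differ)."""
    A = toeplitz(np.exp(-0.5 * (np.arange(n) / sigma) ** 2))
    return A / A.sum(axis=1, keepdims=True)

def tikhonov_solve(A, b, lam):
    """x_lam = argmin ||Ax - b||^2 + lam^2 ||x||^2, via the normal equations."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam ** 2 * np.eye(n), A.T @ b)

def gcv(lam, A, b):
    """GCV score, one common way to choose lam. Each evaluation builds the
    full influence matrix H, which is exactly the cost that motivates
    estimating lam on a smaller, downsampled problem."""
    n = A.shape[0]
    H = A @ np.linalg.solve(A.T @ A + lam ** 2 * np.eye(A.shape[1]), A.T)
    return n * np.linalg.norm(H @ b - b) ** 2 / (n - np.trace(H)) ** 2

rng = np.random.default_rng(0)
n, sigma, step = 512, 8.0, 4
x_true = np.zeros(n)
x_true[200:300] = 1.0
A = blur_matrix(n, sigma)
b = A @ x_true + 1e-2 * rng.standard_normal(n)

# Estimate lam on a 4x-downsampled copy of the problem (much cheaper),
# rescaling the kernel width so the coarse grid models the same blur.
A_small = blur_matrix(n // step, sigma / step)
res = minimize_scalar(gcv, bounds=(1e-4, 1.0), args=(A_small, b[::step]),
                      method="bounded")

# Reuse the downsampled estimate on the full-resolution problem.
x_rec = tikhonov_solve(A, b, res.x)
print("estimated lambda:", res.x)
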
Description
The detection and characterization of transients in signals is important in many applications, ranging from computer vision to audio processing. Edge detection on images is typically realized using small, local, discrete convolution kernels, but this is not possible when samples are measured directly in the frequency domain. The concentration factor edge detection method was therefore developed to realize an edge detector directly from spectral data. This thesis explores the possibility of detecting edges from the phase of the spectral data alone, that is, without the magnitude of the sampled spectral data. Prior work has demonstrated that the spectral phase carries particularly important information about the underlying features of a signal. Furthermore, the concentration factor method yields some insight into the detection of edges in spectral phase data. An iterative design approach was taken to realize an edge detector using only the spectral phase data, which also allows an edge detector to be designed when the phase data are intermittent or corrupted. Problem formulations showing the power of the design approach are given throughout. A post-processing scheme relying on the difference of multiple edge approximations yields a strong edge detector that is shown to be resilient under noisy, intermittent phase data. Lastly, a thresholding technique is applied to give an explicit, enhanced edge detector ready for use. Examples are demonstrated throughout on both signals and images.
Contributors: Reynolds, Alexander Bryce (Author) / Gelb, Anne (Thesis director) / Cochran, Douglas (Committee member) / Viswanathan, Adityavikram (Committee member) / School of Mathematical and Statistical Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05
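
The following Python sketch illustrates the underlying concentration factor method and a phase-only variant in the spirit of this abstract: the jump function is approximated as S_N f(x) = i Σ_k sgn(k) σ(|k|/N) f̂_k e^{ikx}, and the phase-only detector simply normalizes every Fourier coefficient to unit magnitude before applying the same formula. The first-order polynomial factor σ(η) = πη and the test signal are assumptions; the thesis's iterative design, post-processing, and thresholding schemes are not reproduced here.

import numpy as np

def jump_approx(fhat, phase_only=False):
    """Concentration-factor approximation of the jump function:
    S_N f(x) = i * sum_k sgn(k) * sigma(|k|/N) * fhat_k * exp(ikx),
    here with the first-order polynomial factor sigma(eta) = pi * eta."""
    N = (len(fhat) - 1) // 2
    k = np.arange(-N, N + 1)
    if phase_only:
        # discard the spectral magnitude, keep only the phase
        mag = np.abs(fhat)
        mag[mag == 0] = 1.0
        fhat = fhat / mag
    sigma = np.pi * np.abs(k) / N
    x = np.linspace(-np.pi, np.pi, 512, endpoint=False)
    modes = np.exp(1j * np.outer(x, k))
    return x, np.real(1j * modes @ (np.sign(k) * sigma * fhat))

# piecewise-constant test signal: jumps of +1 at x = -1 and -1 at x = +1
M = 2048
x = np.linspace(-np.pi, np.pi, M, endpoint=False)
f = np.where(np.abs(x) < 1.0, 1.0, 0.0)

# Fourier coefficients fhat_k = (1/2pi) * integral of f(x) exp(-ikx) dx,
# approximated by a Riemann sum on the uniform grid
N = 64
k = np.arange(-N, N + 1)
fhat = np.array([(f * np.exp(-1j * kk * x)).mean() for kk in k])

xg, edges = jump_approx(fhat)                          # full spectral data
xg, edges_phase = jump_approx(fhat, phase_only=True)   # spectral phase only
print(xg[np.argmax(np.abs(edges))], xg[np.argmax(np.abs(edges_phase))])

The phase-only response no longer recovers the jump heights exactly, but its peaks still localize the edges, which is the behavior the abstract describes.
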
Description
Analysis of social networks has the potential to provide insights into a wide range of applications. As datasets continue to grow, a key challenge is the lack of a widely applicable algorithmic framework for the detection of statistically anomalous networks and network properties. Unlike traditional signal processing, where models of truth or empirical verification and background data exist and are often well defined, these features are commonly lacking in social and other networks. Here, a novel algorithmic framework for statistical signal processing for graphs is presented. The framework is based on the analysis of the spectral properties of the residuals matrix. It is applied to the detection of innovation patterns in publication networks, leveraging well-studied empirical knowledge from the history of science. Both the framework itself and the application constitute novel contributions, advancing algorithmic and mathematical techniques for graph-based data as well as our understanding of how novel scientific research emerges. Results indicate the efficacy of the approach and highlight a number of fruitful future directions.
Contributors: Bliss, Nadya Travinin (Author) / Laubichler, Manfred (Thesis advisor) / Castillo-Chavez, Carlos (Thesis advisor) / Papandreou-Suppappola, Antonia (Committee member) / Arizona State University (Publisher)
Created: 2015
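
A minimal Python sketch of the residuals-matrix idea, under stated assumptions: the expected adjacency is taken to be the degree-based (modularity-style) model E[A]_ij = d_i d_j / 2m, and vertices are scored by the norm of their coordinates in the top eigenspace of B = A - E[A]. Both the expectation model and the scoring rule are illustrative choices; the thesis's framework and its application to publication networks may differ.

import numpy as np

def residuals_matrix(A):
    """B = A - E[A], with E[A]_ij = d_i * d_j / (2m): observed minus
    degree-based expected connectivity (a modularity-style model)."""
    d = A.sum(axis=1)
    return A - np.outer(d, d) / d.sum()

def anomaly_scores(A, k=2):
    """Norm of each vertex's coordinates in the top-k eigenspace of B.
    Background vertices cluster near the origin, while members of an
    unusually dense subgraph stand out along the leading eigenvectors."""
    B = residuals_matrix(A)
    _, vecs = np.linalg.eigh(B)            # eigenvalues in ascending order
    return np.linalg.norm(vecs[:, -k:], axis=1)

rng = np.random.default_rng(1)

# background: Erdos-Renyi graph on 200 vertices, edge probability 0.05
n, p = 200, 0.05
A = (rng.random((n, n)) < p).astype(float)
A = np.triu(A, 1)
A = A + A.T                                # symmetric, no self-loops

# plant a 10-vertex clique as the statistical anomaly
A[:10, :10] = 1.0
np.fill_diagonal(A, 0.0)

scores = anomaly_scores(A)
print(np.argsort(scores)[-10:])            # planted vertices should dominate
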