Lossy compression is a form of compression that slightly degrades a signal, ideally in ways that are not detectable to the human ear. It contrasts with lossless compression, in which the signal is not degraded at all. While lossless compression may seem like the better option, lossy compression, which is used for most audio and video, reduces transmission time and yields much smaller files. Pushed too far, however, it noticeably harms quality: the more a waveform is compressed, the more it degrades, and once a file has been lossy compressed the process cannot be reversed. This project observes the degradation of an audio signal after applying Singular Value Decomposition (SVD) compression, a lossy compression that eliminates singular values from the signal's matrix.
To better assess the performance of constant false alarm rate (CFAR) approaches, three clutter models, Gamma, Weibull, and Log-normal, were introduced to evaluate the detection capability of each CFAR algorithm.
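Samples from the three clutter models named above can be drawn directly with NumPy's random generator. The shape and scale parameters here are arbitrary placeholders for illustration; the thesis's actual parameter choices are not given in this passage.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000  # number of clutter samples per model

# Illustrative parameterizations of the three clutter models.
clutter = {
    "gamma":     rng.gamma(shape=2.0, scale=1.0, size=n),
    "weibull":   rng.weibull(a=1.5, size=n),
    "lognormal": rng.lognormal(mean=0.0, sigma=0.5, size=n),
}

for name, x in clutter.items():
    print(f"{name:9s} mean={x.mean():.2f} std={x.std():.2f}")
```

Feeding each of these sample sets through a CFAR detector, and counting false alarms against the design rate, is one straightforward way to compare the algorithms across clutter statistics.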
The order-statistic constant false alarm rate (OS-CFAR) approach outperforms other conventional CFAR methods, especially in heavily cluttered environments. However, it incurs high power consumption due to its repeated sorting.
In automotive radar systems, the computational complexity of the algorithms is essential because the system operates in real time. The algorithms must therefore be fast and efficient to keep power consumption and processing time low.
Reduced-complexity implementations of the cell-averaging and order-statistic constant false alarm rate algorithms were explored, lowering both their big-O complexity and their processing time.
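The two baseline detectors compared above can be sketched as follows. This is a simplified single-cell illustration, not the thesis's implementation: the window size, guard-cell count, order-statistic index `k`, and scale factor `alpha` are assumed values, and a fixed synthetic target is injected for demonstration.

```python
import numpy as np

def cfar_threshold(x, cut, n_ref=16, n_guard=2, k=24, alpha=4.0):
    """Return (CA-CFAR, OS-CFAR) thresholds for the cell under test `cut`.

    Reference cells are taken from both sides of the cell under test,
    skipping `n_guard` guard cells on each side.
    """
    left = x[cut - n_guard - n_ref : cut - n_guard]
    right = x[cut + n_guard + 1 : cut + n_guard + 1 + n_ref]
    ref = np.concatenate([left, right])     # 2 * n_ref reference cells
    ca = alpha * ref.mean()                 # cell averaging: one mean
    os_ = alpha * np.sort(ref)[k - 1]       # order statistic: needs a sort
    return ca, os_

rng = np.random.default_rng(2)
power = rng.exponential(scale=1.0, size=200)  # noise-like power profile
power[100] = 30.0                             # injected synthetic target

ca, os_ = cfar_threshold(power, cut=100)
print(f"CA threshold {ca:.2f}, OS threshold {os_:.2f}, cell power {power[100]:.1f}")
```

The sort inside the OS-CFAR branch is the per-cell cost that the reduced-complexity implementations target: CA-CFAR's running mean can be updated in constant time as the window slides, while a naive OS-CFAR re-sorts the reference window at every cell.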