Information divergence functions, such as the Kullback-Leibler divergence or the Hellinger distance, play a critical role in statistical signal processing and information theory; however, estimating them can be challenging. Most often, parametric assumptions are made about the two distributions in order to estimate the divergence of interest. In cases where no parametric model fits the data, non-parametric density estimation is used instead.
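To illustrate the contrast the abstract draws, the following minimal sketch (not from the thesis itself) estimates the KL divergence between two univariate samples in two ways: a parametric estimate that assumes both distributions are Gaussian, and a non-parametric plug-in estimate built from kernel density estimates. The function names `kl_gaussian` and `kl_kde_plugin` are hypothetical labels chosen here for clarity.

```python
import numpy as np
from scipy.stats import gaussian_kde

def kl_gaussian(p_samples, q_samples):
    """Parametric KL(p||q): assume both samples are Gaussian and
    plug the fitted means/std-devs into the closed-form expression."""
    mu_p, sd_p = np.mean(p_samples), np.std(p_samples, ddof=1)
    mu_q, sd_q = np.mean(q_samples), np.std(q_samples, ddof=1)
    return (np.log(sd_q / sd_p)
            + (sd_p**2 + (mu_p - mu_q)**2) / (2 * sd_q**2) - 0.5)

def kl_kde_plugin(p_samples, q_samples):
    """Non-parametric plug-in KL(p||q): fit kernel density estimates to
    each sample, then average the log density ratio over draws from p."""
    p_hat = gaussian_kde(p_samples)
    q_hat = gaussian_kde(q_samples)
    return np.mean(np.log(p_hat(p_samples) / q_hat(p_samples)))

# Hypothetical usage: two Gaussian samples with different means/scales.
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 2000)   # samples from p
y = rng.normal(0.5, 1.2, 2000)   # samples from q
print(kl_gaussian(x, y))    # parametric estimate (model assumed correct)
print(kl_kde_plugin(x, y))  # non-parametric plug-in estimate
```

When the Gaussian assumption holds, the parametric estimate converges quickly; when it does not, the KDE plug-in remains usable at the cost of slower convergence and sensitivity to bandwidth choice.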
- Partial requirement for: Ph.D., Arizona State University, 2017
- Includes bibliographical references (pages 102-110)
- Field of study: Electrical engineering