Description

Information divergence functions, such as the Kullback-Leibler divergence or the Hellinger distance, play a critical role in statistical signal processing and information theory; however, estimating them can be challenging. Most often, parametric assumptions are made about the two distributions in order to estimate the divergence of interest. In cases where no parametric model fits the data, non-parametric density estimation is used instead.
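
For reference, the Kullback-Leibler divergence between densities p and q is D_KL(p || q) = ∫ p(x) log(p(x)/q(x)) dx, and the squared Hellinger distance is H²(p, q) = ½ ∫ (√p(x) − √q(x))² dx. The sketch below illustrates the general "plug-in" strategy the abstract alludes to: estimate each density non-parametrically (here with SciPy's gaussian_kde) and evaluate the divergence on the estimates. It is a minimal illustrative example under those assumptions, not the estimator developed in the thesis, and the name plug_in_kl is hypothetical.

    # Minimal plug-in KL-divergence sketch (illustrative; assumes NumPy and SciPy).
    import numpy as np
    from scipy.stats import gaussian_kde

    def plug_in_kl(x, y, eps=1e-12):
        """Plug-in estimate of D_KL(p || q) from samples x ~ p and y ~ q."""
        p_hat = gaussian_kde(x)            # non-parametric (KDE) estimate of p
        q_hat = gaussian_kde(y)            # non-parametric (KDE) estimate of q
        px = np.maximum(p_hat(x), eps)     # clip to avoid log(0)
        qx = np.maximum(q_hat(x), eps)
        # Monte Carlo average of log(p/q) under p, using the x samples
        return float(np.mean(np.log(px / qx)))

    rng = np.random.default_rng(0)
    x = rng.normal(0.0, 1.0, 2000)         # samples from p = N(0, 1)
    y = rng.normal(1.0, 1.0, 2000)         # samples from q = N(1, 1)
    print(plug_in_kl(x, y))                # true D_KL(p || q) = 0.5 here

Such plug-in estimators are simple but inherit the bias and bandwidth sensitivity of the underlying density estimates, which is part of what motivates studying divergence estimation in its own right.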

    Details

    Date Created
    • 2017
    Resource Type
    • Text
    Note
    • Partial requirement for: Ph.D., Arizona State University, 2017 (note type: thesis)
    • Includes bibliographical references (pages 102-110) (note type: bibliography)
    • Field of study: Electrical engineering

    Citation and reuse

    Statement of Responsibility

    by Alan Wisler
