Bootstrapped information-theoretic model selection with error control (BITSEC)

Description

Statistical model selection using the Akaike Information Criterion (AIC) and similar criteria is a useful tool for comparing multiple and non-nested models without the specification of a null model, which has made it increasingly popular in the natural and social sciences. Despite their common usage, model selection methods are not driven by a notion of statistical confidence, so their results entail an unknown degree of uncertainty. This paper introduces a general framework which extends notions of Type-I and Type-II error to model selection. A theoretical method for controlling Type-I error using Difference of Goodness of Fit (DGOF) distributions is given, along with a bootstrap approach that approximates the procedure. Results are presented for simulated experiments using normal distributions, random walk models, nested linear regression, and non-nested regression including nonlinear models. Tests are performed using an R package developed by the author, which will be made publicly available upon journal publication of the research results.
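The bootstrap procedure described in the abstract can be illustrated with a minimal sketch. This is not the author's R package or exact method; it is a hypothetical Python example, assuming a nested linear-regression setting where the DGOF statistic is taken to be the AIC difference between the two candidate models, and the null (simpler) model is used to generate bootstrap replicates whose DGOF quantile supplies a Type-I-error-controlled critical value.

```python
import numpy as np

def normal_loglik(resid, n):
    # Gaussian log-likelihood at the MLE of the error variance
    sigma2 = np.mean(resid**2)
    return -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)

def dgof(x, y):
    # DGOF statistic: AIC(model 0) - AIC(model 1); positive values favor model 1
    n = len(y)
    # Model 0: intercept only (2 parameters: mean, variance)
    aic0 = 2 * 2 - 2 * normal_loglik(y - y.mean(), n)
    # Model 1: simple linear regression (3 parameters: intercept, slope, variance)
    b1, b0 = np.polyfit(x, y, 1)
    aic1 = 2 * 3 - 2 * normal_loglik(y - (b0 + b1 * x), n)
    return aic0 - aic1

rng = np.random.default_rng(0)
n = 50
x = rng.uniform(0, 1, n)
y = 1.0 + 0.3 * x + rng.normal(0, 1, n)  # weak true slope
dgof_obs = dgof(x, y)

# Bootstrap the DGOF distribution under the fitted null model (model 0),
# then take its 95th percentile as the critical value for alpha = 0.05
mu_hat, sd_hat = y.mean(), y.std()
boot = np.array([dgof(x, rng.normal(mu_hat, sd_hat, n)) for _ in range(2000)])
crit = np.quantile(boot, 0.95)

# Select model 1 over model 0 only if the observed DGOF exceeds the
# bootstrap critical value, rather than merely exceeding zero
reject_null = dgof_obs > crit
print(f"observed DGOF = {dgof_obs:.3f}, critical value = {crit:.3f}, "
      f"select larger model: {reject_null}")
```

Note the contrast with plain AIC selection, which would prefer model 1 whenever the observed DGOF is positive; the bootstrap critical value raises that threshold so the probability of wrongly selecting the larger model is held near the chosen alpha level.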

Details

Contributors
Date Created
2018
Resource Type
Language
  • eng
Note
  • thesis
    Partial requirement for: M.S., Arizona State University, 2018
  • bibliography
    Includes bibliographical references (pages 73-76)
  • Field of study: Statistics

Citation and reuse

Statement of Responsibility
by Michael J. Cullan

Additional Information

Extent
  • vi, 76 pages : illustrations (some color)
Open Access
Peer-reviewed