Description

Mixture of experts is a machine learning ensemble approach consisting of individual models trained to be "experts" on subsets of the data, and a gating network that provides weights for combining the expert predictions. Mixture of experts models do not currently see wide use due to the difficulty of training diverse experts and their high computational requirements. This work presents modifications of the mixture of experts formulation that use domain knowledge to improve training and incorporate parameter sharing among experts to reduce computational requirements.
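
To illustrate the basic formulation described in the abstract, the following is a minimal sketch (assuming PyTorch, with simple linear experts; the class and parameter names are illustrative and not taken from the dissertation). A gating network produces softmax weights over the experts, and the output is the weighted combination of the expert predictions.

    import torch
    import torch.nn as nn

    class MixtureOfExperts(nn.Module):
        def __init__(self, in_dim: int, out_dim: int, n_experts: int):
            super().__init__()
            # Each expert is a small model; here, a single linear layer.
            self.experts = nn.ModuleList(
                [nn.Linear(in_dim, out_dim) for _ in range(n_experts)]
            )
            # The gating network maps the input to one weight per expert.
            self.gate = nn.Linear(in_dim, n_experts)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # Gating weights: softmax over experts, shape (batch, n_experts).
            weights = torch.softmax(self.gate(x), dim=-1)
            # Expert predictions stacked along a new dimension:
            # shape (batch, n_experts, out_dim).
            outputs = torch.stack([expert(x) for expert in self.experts], dim=1)
            # Weighted combination of the expert predictions.
            return (weights.unsqueeze(-1) * outputs).sum(dim=1)

    # Example usage with random data.
    model = MixtureOfExperts(in_dim=8, out_dim=2, n_experts=4)
    y = model(torch.randn(16, 8))  # shape (16, 2)

In this baseline every expert processes every input; the dissertation's modifications (domain-informed training and parameter sharing among experts) would alter how the experts are trained and parameterized, which this sketch does not attempt to reproduce.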

Details

Contributors
Date Created
  • 2018
Resource Type
  • Text
Collections this item is in
Note
  • Doctoral Dissertation Electrical Engineering 2018