Estimation of Generalization Error

  • Cross-validation (e.g., leave-one-out).
    • In k-fold cross-validation the data is divided into k subsets, and the network is trained k times, each time leaving a different subset out for computing the validation error (see the first sketch after this list).
    • The “crossing” makes this method an improvement over split-sampling: every data item is eventually used both for training and for validation.
    • The disadvantage of cross-validation is that the network must be re-trained many times (k times for k-fold cross-validation).
  • Bootstrapping.
    • Bootstrapping works on random sub-samples drawn with replacement from the full data set (see the second sketch after this list).
    • Because sampling is with replacement, any data item may be selected any number of times.
    • The sub-samples are drawn and analyzed repeatedly, and the resulting error estimates are averaged.
  • No matter which method is applied, the estimate of the generalization error of the best network will be optimistic, because the same error estimate was used to select that network in the first place.
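
A minimal sketch of k-fold cross-validation, assuming the data are NumPy arrays and that `fit` and `error` are hypothetical stand-ins for training a network and measuring its error (neither is defined in the slide):

```python
import numpy as np

def k_fold_cv_error(X, y, fit, error, k=5, seed=0):
    """Average validation error over k folds."""
    n = len(X)
    idx = np.random.default_rng(seed).permutation(n)
    folds = np.array_split(idx, k)           # k disjoint subsets of the data
    fold_errors = []
    for i in range(k):
        val = folds[i]                       # subset left out this round
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        model = fit(X[train], y[train])      # re-train on the remaining k-1 subsets
        fold_errors.append(error(model, X[val], y[val]))
    return np.mean(fold_errors)              # every item was validated exactly once
```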
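
A minimal sketch of the bootstrap, under the same assumptions about `fit` and `error`. One common variant is shown here: each round resamples the data with replacement, so an item may appear several times, and the items never drawn that round ("out-of-bag" items) serve as the validation set:

```python
import numpy as np

def bootstrap_error(X, y, fit, error, rounds=100, seed=0):
    """Average out-of-bag validation error over repeated bootstrap rounds."""
    rng = np.random.default_rng(seed)
    n = len(X)
    errors = []
    for _ in range(rounds):
        sample = rng.integers(0, n, size=n)       # n indices drawn with replacement
        oob = np.setdiff1d(np.arange(n), sample)  # items never drawn this round
        if len(oob) == 0:
            continue                              # rare: every item was drawn
        model = fit(X[sample], y[sample])
        errors.append(error(model, X[oob], y[oob]))
    return np.mean(errors)                        # average over the repeated analyses
```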
