Rule Post-Pruning

  • Applied in C4.5.
  • Steps:
    1. Infer the decision tree from the training set, growing the tree until the training data is fit as well as possible, allowing overfitting to occur.

    2. Convert the learned tree into an equivalent set of rules by creating one rule for each path from the root node to a leaf node.

    3. Prune (generalize) each rule by removing any precondition whose removal improves the rule's estimated accuracy.

    4. Sort the pruned rules by their estimated accuracy, and consider them in this sequence when classifying subsequent instances.
  • Rule accuracy is estimated pessimistically on the training set: C4.5 computes the standard deviation of the observed accuracy and takes the lower bound of the resulting confidence interval as the estimate of rule performance.
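The steps above can be sketched in code. This is a minimal illustration, not C4.5's actual implementation: it assumes rules are lists of (attribute, value) equality tests, training examples are dicts with a "label" key, and the pessimistic estimate is the simple normal-approximation lower bound (accuracy minus z standard deviations) rather than C4.5's exact error-based formula.

```python
import math

def pessimistic_accuracy(correct, total, z=1.96):
    # Lower bound of a normal-approximation confidence interval on
    # the rule's accuracy -- a pessimistic estimate in the spirit of C4.5.
    if total == 0:
        return 0.0
    acc = correct / total
    std = math.sqrt(acc * (1 - acc) / total)
    return acc - z * std

def rule_accuracy(preconditions, label, examples):
    # Pessimistic accuracy of one rule on the training examples.
    covered = [e for e in examples
               if all(e[a] == v for a, v in preconditions)]
    correct = sum(1 for e in covered if e["label"] == label)
    return pessimistic_accuracy(correct, len(covered))

def prune_rule(preconditions, label, examples):
    # Step 3: greedily drop any precondition whose removal improves
    # the rule's estimated (pessimistic) accuracy.
    best = list(preconditions)
    best_score = rule_accuracy(best, label, examples)
    improved = True
    while improved and best:
        improved = False
        for i in range(len(best)):
            candidate = best[:i] + best[i + 1:]
            score = rule_accuracy(candidate, label, examples)
            if score > best_score:
                best, best_score = candidate, score
                improved = True
                break
    return best, best_score
```

After pruning each rule this way, the rules would be sorted by `best_score` (step 4) and applied in that order when classifying new instances.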
