Multi-Layer Feed-Forward (2)

  • Multi-Layer Perceptrons (MLP) have fully connected layers.
  • The number of hidden units is typically chosen by hand; the more layers, the more complex the network (see Step 2 of Building Neural Networks)
  • Hidden layers enlarge the space of hypotheses that the network can represent.
  • Learning is done by the back-propagation algorithm → errors are propagated backwards from the output layer to the hidden layers (see the sketch after this list).
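
The following is a minimal sketch, not part of the original slide, of the ideas above: a fully connected network with one hand-chosen hidden layer, trained on XOR by back-propagating the output error to the hidden layer. The layer sizes, learning rate, and number of epochs are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR inputs and targets
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Fully connected layers: 2 inputs -> 4 hidden units (chosen by hand) -> 1 output
W1 = rng.normal(scale=0.5, size=(2, 4))
b1 = np.zeros(4)
W2 = rng.normal(scale=0.5, size=(4, 1))
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for epoch in range(5000):
    # Forward pass through the fully connected layers
    h = sigmoid(X @ W1 + b1)      # hidden activations
    out = sigmoid(h @ W2 + b2)    # output activations

    # Back-propagate the error: output delta, then hidden delta
    delta_out = (out - y) * out * (1 - out)
    delta_hid = (delta_out @ W2.T) * h * (1 - h)

    # Gradient-descent weight updates
    W2 -= lr * h.T @ delta_out
    b2 -= lr * delta_out.sum(axis=0)
    W1 -= lr * X.T @ delta_hid
    b1 -= lr * delta_hid.sum(axis=0)

print(np.round(out, 2))  # approaches [[0], [1], [1], [0]]
```

Without the hidden layer this network could only represent linearly separable functions; the hidden units enlarge the hypothesis space enough to learn XOR.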
