SVM—Linearly Separable
- A separating hyperplane can be written as
- W · X + b = 0
- where W = {w1, w2, …, wn} is a weight vector and b is a scalar (the bias)
- For 2-D it can be written as
- w0 + w1 x1 + w2 x2 = 0 (writing the bias b as w0)
- The hyperplanes defining the sides of the margin:
- H1: w0 + w1 x1 + w2 x2 ≥ 1 for yi = +1, and
- H2: w0 + w1 x1 + w2 x2 ≤ –1 for yi = –1
- Any training tuples that fall on hyperplanes H1 or H2 (i.e., the sides defining the margin) are support vectors (see the first sketch after this list)
- This becomes a constrained (convex) quadratic optimization problem: a quadratic objective function with linear constraints → Quadratic Programming (QP) → Lagrange multipliers (see the second sketch after this list)
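
To make the margin constraints concrete, here is a minimal NumPy sketch. The weight vector, bias, points, and labels are all hand-picked for illustration (none of these values come from the slide):

```python
import numpy as np

# Hand-picked hyperplane for illustration (assumed values):
# w0 + w1*x1 + w2*x2 = 0 with W = (w1, w2) = (1, 1) and bias w0 = -3
W = np.array([1.0, 1.0])
w0 = -3.0

# Toy 2-D training tuples with labels yi in {+1, -1} (also assumed)
X = np.array([[2.0, 2.0],   # lies on H1: w0 + W.x = +1
              [3.0, 3.0],
              [1.0, 1.0],   # lies on H2: w0 + W.x = -1
              [0.0, 1.0]])
y = np.array([+1, +1, -1, -1])

# f(x) = w0 + W.x; its sign is the predicted class
f = w0 + X @ W

# Margin constraints from the slide:
#   H1: f(x) >= +1 for yi = +1, and H2: f(x) <= -1 for yi = -1,
# which combine into the single condition yi * f(xi) >= 1
satisfies_margin = y * f >= 1

# Support vectors lie exactly on H1 or H2, i.e. |f(x)| = 1
on_margin = np.isclose(np.abs(f), 1.0)

for xi, yi, fi, sv in zip(X, y, f, on_margin):
    print(f"x={xi}, y={yi:+d}, f(x)={fi:+.1f}, support vector: {sv}")
```

Here [2, 2] and [1, 1] sit exactly on H1 and H2 and are the support vectors; the other two tuples satisfy the constraints with room to spare.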
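
The QP itself is usually delegated to a library rather than solved by hand. A short sketch, assuming scikit-learn is available, fits a linear SVM on the same toy data; a large C approximates the hard-margin (linearly separable) formulation:

```python
import numpy as np
from sklearn.svm import SVC

# Same toy linearly separable data as above (assumed for illustration)
X = np.array([[2.0, 2.0], [3.0, 3.0], [1.0, 1.0], [0.0, 1.0]])
y = np.array([+1, +1, -1, -1])

# kernel='linear' keeps the model a plain hyperplane; the large C
# penalty approximates the hard-margin formulation
clf = SVC(kernel='linear', C=1e6)
clf.fit(X, y)

print("W =", clf.coef_[0])             # weight vector (w1, w2)
print("b =", clf.intercept_[0])        # bias (w0)
print("support vectors:\n", clf.support_vectors_)
print("yi * alpha_i:", clf.dual_coef_[0])  # from the Lagrangian dual
```

For this data the solver should recover W ≈ (1, 1) and b ≈ −3, matching the hand-picked hyperplane in the first sketch, with [2, 2] and [1, 1] as the support vectors.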