
### Linear Regression

- Linear regression: models a response variable y as a linear function of a single predictor variable x

\[ y=w_{0}+w_{1}x \]

where \(w_0\) (y-intercept) and \(w_1\) (slope) are regression coefficients

- Method of least squares: estimates the coefficients of the best-fitting straight line

\[ w_{1}=\frac{\sum_{i=1}^{|D|}(x_{i}-\bar{x})(y_{i}-\bar{y})}{\sum_{i=1}^{|D|}(x_{i}-\bar{x})^2} \]

\[ w_{0}=\bar{y}-w_{1}\bar{x} \]
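The two closed-form estimates above translate directly into code. A minimal sketch (the function name `fit_simple_linear` and the sample data are illustrative, not from the slide):

```python
# Simple linear regression via the least-squares formulas above.
# Variable names (w0, w1, x_bar, y_bar) mirror the slide's notation.

def fit_simple_linear(x, y):
    n = len(x)
    x_bar = sum(x) / n
    y_bar = sum(y) / n
    # w1 = sum((x_i - x_bar)(y_i - y_bar)) / sum((x_i - x_bar)^2)
    num = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
    den = sum((xi - x_bar) ** 2 for xi in x)
    w1 = num / den
    # w0 = y_bar - w1 * x_bar
    w0 = y_bar - w1 * x_bar
    return w0, w1

# Illustrative data lying exactly on y = 1 + 2x
x = [1, 2, 3, 4, 5]
y = [3, 5, 7, 9, 11]
w0, w1 = fit_simple_linear(x, y)
print(w0, w1)  # → 1.0 2.0
```

Because the data here is exactly linear, the fitted line recovers the generating intercept and slope; with noisy data the same formulas give the least-squares best fit.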

- Multiple linear regression: involves more than one predictor variable
- Training data is of the form \((\mathbf{X}_1, y_1), (\mathbf{X}_2, y_2), \ldots, (\mathbf{X}_{|D|}, y_{|D|})\)
- Ex. For 2-D data, we may have:

\[ y = w_{0} + w_{1}x_{1} + w_{2}x_{2} \]

- Solvable by an extension of the least-squares method or using statistical software such as SAS or S-Plus
- Many nonlinear functions can be transformed into the above
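The 2-D case above can be sketched with NumPy's least-squares solver, one standard extension of the least-squares method (the data below is fabricated so that \(y = 1 + 2x_1 + 3x_2\) exactly):

```python
import numpy as np

# Multiple linear regression y = w0 + w1*x1 + w2*x2,
# solved by linear least squares over a design matrix.
X = np.array([[1.0, 2.0],
              [2.0, 1.0],
              [3.0, 3.0],
              [4.0, 2.0]])
y = 1 + 2 * X[:, 0] + 3 * X[:, 1]

# Prepend a column of ones so w0 is learned as the intercept.
A = np.hstack([np.ones((X.shape[0], 1)), X])
w, *_ = np.linalg.lstsq(A, y, rcond=None)
print(w)  # ≈ [1. 2. 3.]
```

The same machinery handles the nonlinear-transform bullet: a polynomial model \(y = w_0 + w_1 x + w_2 x^2\) fits this form by setting \(x_1 = x\) and \(x_2 = x^2\) before building the design matrix.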

