[MACHINE LEARNING] Feature Scaling and Normal Equation
l_j_yeon 2017. 3. 29. 20:57
+) This post is based on the lectures and content of the Coursera (https://www.coursera.org/) Machine Learning class (taught by Professor Andrew Ng).
Feature Scaling
We can speed up gradient descent by having each of our input values in roughly the same range.
−1 ≤ x_(i) ≤ 1

or

−0.5 ≤ x_(i) ≤ 0.5
These aren't exact requirements; we are only trying to speed things up. The goal is to get all input variables into roughly one of these ranges, give or take a few.
Two techniques to help with this are feature scaling and mean normalization. Feature scaling involves dividing the input values by the range (i.e. the maximum value minus the minimum value) of the input variable, resulting in a new range of just 1. Mean normalization involves subtracting the average value of an input variable from all of its values, resulting in a new average of just zero. To implement both techniques, adjust your input values as in this formula:

x_i := (x_i − μ_i) / s_i

where μ_i is the average of all the values for feature i and s_i is the range of values (max − min), or alternatively the standard deviation.
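A minimal sketch of both techniques in Python/NumPy (the function name and the example data are illustrative, not from the lecture):

```python
import numpy as np

def scale_features(X):
    """Mean-normalize and scale each feature column of X.

    Returns the scaled matrix along with the per-feature mean (mu) and
    range (s = max - min), which are needed to transform new examples
    the same way. The standard deviation could be used for s instead.
    """
    mu = X.mean(axis=0)
    s = X.max(axis=0) - X.min(axis=0)
    return (X - mu) / s, mu, s

# Example: two features with very different ranges
X = np.array([[2104.0, 3], [1600.0, 3], [2400.0, 4], [1416.0, 2]])
X_scaled, mu, s = scale_features(X)
print(X_scaled)  # every column now lies roughly in [-0.5, 0.5]
```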
Learning Rate
Debugging gradient descent: plot the cost J(θ) against the number of iterations; if J(θ) ever increases, you probably need to decrease α.

To summarize:

If α is too small: slow convergence.

If α is too large: J(θ) may not decrease on every iteration and thus may not converge.
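As a rough sketch of where the learning rate α enters the update rule, here is batch gradient descent for linear regression (the squared-error cost and the bias column in X are assumptions carried over from the earlier lectures, not stated in this post):

```python
import numpy as np

def gradient_descent(X, y, alpha=0.01, num_iters=400):
    """Batch gradient descent for linear regression (illustrative sketch).

    X is assumed to already contain a leading column of ones and to be
    feature-scaled; alpha is the learning rate. Returns theta and the
    cost history, which can be plotted to check that J(theta) decreases.
    """
    m, n = X.shape
    theta = np.zeros(n)
    J_history = []
    for _ in range(num_iters):
        predictions = X @ theta
        theta -= alpha * (X.T @ (predictions - y)) / m   # simultaneous update
        J_history.append(((predictions - y) ** 2).sum() / (2 * m))
    return theta, J_history
```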
Features and Polynomial Regression
Our hypothesis function need not be linear (a straight line) if that does not fit the data well.
We can change the behavior or curve of our hypothesis function by making it a quadratic, cubic or square root function (or any other form).
e.g. if our hypothesis function is h_θ(x) = θ_0 + θ_1·x_1, we can create additional features based on x_1 to get the quadratic function h_θ(x) = θ_0 + θ_1·x_1 + θ_2·x_1^2 or the cubic function h_θ(x) = θ_0 + θ_1·x_1 + θ_2·x_1^2 + θ_3·x_1^3. Note that if you choose your features this way, feature scaling becomes very important, since the ranges of x_1^2 and x_1^3 grow very quickly.
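As a small sketch of constructing such polynomial features in NumPy (the helper name is hypothetical):

```python
import numpy as np

def add_polynomial_features(x1, degree=3):
    """Build a design matrix [1, x1, x1^2, ..., x1^degree] from a single
    feature vector x1 (illustrative helper, not from the lecture)."""
    x1 = np.asarray(x1, dtype=float)
    columns = [np.ones_like(x1)] + [x1 ** d for d in range(1, degree + 1)]
    return np.column_stack(columns)

x1 = np.array([1.0, 2.0, 3.0, 4.0])
X_poly = add_polynomial_features(x1, degree=3)
# The higher powers have much larger ranges, so feature scaling matters here.
```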
Normal Equation
The normal equation minimizes J by solving for θ analytically, with no iteration:

θ = (X^T X)^(-1) X^T y

There is no need to do feature scaling with the normal equation.
The following is a comparison of gradient descent and the normal equation:
Gradient Descent | Normal Equation
---|---
Need to choose alpha | No need to choose alpha
Needs many iterations | No need to iterate
O(kn^2) | O(n^3), need to calculate the inverse of X^T X
Works well when n is large | Slow if n is very large
X^T X may be noninvertible. The common causes are:

- Redundant features, where two features are very closely related (i.e. they are linearly dependent)
- Too many features (e.g. m ≤ n). In this case, delete some features or use "regularization" (to be explained in a later lesson).

Even when X^T X is noninvertible, Octave's 'pinv' (pseudo-inverse) function will still give you a value for θ.
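A minimal sketch of the normal equation in Python/NumPy, using the pseudo-inverse for the same reason the lecture recommends 'pinv' in Octave (the example data is made up for illustration):

```python
import numpy as np

def normal_equation(X, y):
    """Solve for theta analytically: theta = pinv(X^T X) @ X^T y.

    np.linalg.pinv returns the pseudo-inverse, so this still yields a
    result even when X^T X is noninvertible (e.g. redundant features).
    """
    return np.linalg.pinv(X.T @ X) @ X.T @ y

# Example: X includes a bias column of ones; y is the target vector.
X = np.array([[1.0, 2104.0, 3], [1.0, 1600.0, 3], [1.0, 2400.0, 4], [1.0, 1416.0, 2]])
y = np.array([400.0, 330.0, 369.0, 232.0])
theta = normal_equation(X, y)  # no feature scaling or alpha needed
```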