List: MACHINE LEARNING/Stanford University (4)
+) This post is based on the lecture and content in the Coursera (https://www.coursera.org/) machine learning class (professor).

Classification
To attempt classification, one method is to use linear regression and map all predictions greater than 0.5 as a 1 and all less than 0.5 as a 0. However, this method doesn't work well because classification is not actually a linear function. The classificatio..
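The following is a minimal Octave sketch (my own toy data and variable names, not from the course) of why thresholding a linear hypothesis at 0.5 breaks down: one large example drags the fitted line, shifting the 0.5 crossing point and misclassifying an example that was previously handled correctly.

% Toy illustration: threshold a least-squares line at 0.5 (assumed data, my own)
x = [1; 2; 3; 4; 50];               % one feature; the last example is an outlier
y = [0; 0; 1; 1; 1];                % binary labels
X = [ones(length(x), 1) x];         % add the intercept column
theta = pinv(X' * X) * X' * y;      % linear regression via the normal equation
pred  = (X * theta) >= 0.5;         % map h(x) >= 0.5 to 1, otherwise 0
disp(pred');                        % prints 0 0 0 1 1 -- the example at x = 3 is lost
sigmoid = @(z) 1 ./ (1 + exp(-z));  % the fix: logistic regression uses a hypothesis that stays in (0, 1)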
+) This post is based on the lecture and content in the Coursera (https://www.coursera.org/) machine learning class (professor).
+) You can derive the new algorithm from the original gradient descent function, so you can see that these functions are the same.

Feature Scaling
We can speed up gradient descent by having each of our input values in roughly the same range:
−1 ≤ x(i) ≤ 1
or
−0.5 ≤ x(i) ≤ 0.5
These aren't exact re..
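A minimal Octave sketch of the mean-normalization form of feature scaling described above; the toy matrix and variable names are my own, not from the assignment:

% Feature scaling by mean normalization (toy data, assumed for illustration)
X = [2104 3; 1600 3; 2400 4; 1416 2];   % rows are examples: size (sq ft), bedrooms
mu    = mean(X);                        % per-feature mean
sigma = std(X);                         % per-feature standard deviation
X_norm = (X - mu) ./ sigma;             % each column now lies roughly in [-1, 1] (uses broadcasting)
disp(X_norm);

Either rescaling variant brings the features onto a comparable scale, which is what lets gradient descent converge faster.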
+) This post is based on the lecture and content in the Coursera (https://www.coursera.org/) machine learning class (professor).

Supervised Learning
=> Regression problem: we are trying to predict results within a continuous output, meaning that we are trying to map input variables to some continuous function.
=> Classification problem: we are instead trying to predict results in a discrete outpu..
+) Octave, MATLAB

1. Simple Octave/MATLAB function
The first assignment is really simple; it is meant to teach you how to submit the code and get a score. You can open the warmUpExercise.m file, write down A = eye(5), and submit that code.

2. Linear regression with one variable
2.1 Plotting the Data
Before starting on any task, it is often useful to understand the data by visualizing it. For..
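As a sketch, the warm-up function described above would look roughly like this (the function body is exactly the A = eye(5) line mentioned in the post):

% warmUpExercise.m -- return the 5x5 identity matrix
function A = warmUpExercise()
  A = eye(5);     % 5x5 identity matrix
end

And a rough sketch of the plotting step; the ex1data1.txt file name and the axis labels come from the exercise handout and are assumed here:

% Plot the one-variable training data (file name and labels assumed from ex1)
data = load('ex1data1.txt');        % first column: population, second: profit
x = data(:, 1);  y = data(:, 2);
plot(x, y, 'rx', 'MarkerSize', 10); % mark each training example with a red x
xlabel('Population of City in 10,000s');
ylabel('Profit in $10,000s');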