Machine Learning Week 3: Classification

17/11/03 MWhite's learning notes

1. Classification

Linear regression: y ∈ ℝ
Classification (logistic regression): y ∈ {0, 1, ...}

1.1 Hypothesis Representation
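Logistic regression passes the linear term θᵀx through the sigmoid (logistic) function g, so the output is always between 0 and 1:

```latex
h_\theta(x) = g(\theta^T x), \qquad g(z) = \frac{1}{1 + e^{-z}}, \qquad 0 \le h_\theta(x) \le 1
```

h_θ(x) is interpreted as P(y = 1 | x; θ); predict y = 1 when h_θ(x) ≥ 0.5, which happens exactly when θᵀx ≥ 0.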

1.2 Cost Function for Logistic Regression
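Squared error would make J(θ) non-convex when h_θ is the sigmoid, so logistic regression uses a log cost instead:

```latex
\mathrm{Cost}(h_\theta(x), y) =
\begin{cases}
-\log(h_\theta(x)) & \text{if } y = 1 \\
-\log(1 - h_\theta(x)) & \text{if } y = 0
\end{cases}
```

The cost is 0 when the prediction matches the label exactly, and grows without bound as the prediction approaches the wrong extreme.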


Simplified Cost Function
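Because y is always 0 or 1, the two cases collapse into a single expression:

```latex
J(\theta) = -\frac{1}{m} \sum_{i=1}^{m} \left[ y^{(i)} \log h_\theta(x^{(i)}) + (1 - y^{(i)}) \log\left(1 - h_\theta(x^{(i)})\right) \right]
```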



1.3 Gradient Descent for Logistic Regression
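Repeat until convergence, updating all θ_j simultaneously:

```latex
\theta_j := \theta_j - \alpha \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right) x_j^{(i)}
```

This update is identical in form to linear regression's, but here h_θ(x) = 1/(1 + e^{-θᵀx}).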


Vectorized implementation:
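With X the m×(n+1) design matrix and g applied element-wise, the whole update is one matrix expression:

```latex
\theta := \theta - \frac{\alpha}{m} X^T \left( g(X\theta) - \vec{y} \right)
```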


1.4 Advanced Optimization

Octave's library function fminunc() (an advanced optimizer; no need to pick a learning rate α manually):

% costFunction returns both the cost J(theta) and its gradient vector
function [jVal, gradient] = costFunction(theta)
  jVal = [...code to compute J(theta)...];
  gradient = [...code to compute derivative of J(theta)...];
end

% 'GradObj','on' tells fminunc the gradient is supplied; cap at 100 iterations
options = optimset('GradObj', 'on', 'MaxIter', 100);
initialTheta = zeros(2,1);
[optTheta, functionVal, exitFlag] = fminunc(@costFunction, initialTheta, options);

2. Multiclass Classification

One-vs-all: to handle K classes, train K separate logistic regression classifiers h_θ^(i)(x), each treating class i as positive and every other class as negative. To classify a new input x, run all K classifiers and pick the class i that maximizes h_θ^(i)(x).
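The course uses Octave, but the one-vs-all prediction step can be sketched in NumPy; `predict_one_vs_all` and `all_theta` are illustrative names, not from the course code:

```python
import numpy as np

def sigmoid(z):
    """Logistic function g(z) = 1 / (1 + e^-z), applied element-wise."""
    return 1.0 / (1.0 + np.exp(-z))

def predict_one_vs_all(all_theta, X):
    """all_theta: (K, n+1) array, one row of trained parameters per class.
    X: (m, n+1) design matrix with a leading column of ones.
    Returns, for each example, the class whose classifier outputs the
    highest probability h_theta(x)."""
    probs = sigmoid(X @ all_theta.T)   # (m, K): one probability per class
    return np.argmax(probs, axis=1)    # pick the most confident classifier
```

Because argmax only compares the K probabilities, applying the (monotonic) sigmoid does not change the answer; comparing the raw scores X @ all_theta.T would give the same predictions.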



3. Overfitting

With too many features, the learned hypothesis can fit the training set very well (J(θ) ≈ 0) yet fail to generalize to new examples. Two main remedies: reduce the number of features (manually or via model selection), or keep all the features and use regularization to shrink the magnitudes of the parameters θ_j.



Note: the regularization penalty sums over θ_1, ..., θ_n only; it skips θ0, so the bias term is not penalized.

3.1 Regularized Linear Regression

  • Cost Function
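The squared-error cost plus a penalty on the parameter magnitudes (the sum starts at j = 1):

```latex
J(\theta) = \frac{1}{2m} \left[ \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right)^2 + \lambda \sum_{j=1}^{n} \theta_j^2 \right]
```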


  • Gradient descent
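θ0 keeps the ordinary update; every other θ_j is first shrunk by the factor (1 − αλ/m):

```latex
\theta_0 := \theta_0 - \alpha \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right) x_0^{(i)}
```

```latex
\theta_j := \theta_j \left( 1 - \alpha \frac{\lambda}{m} \right) - \alpha \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right) x_j^{(i)}, \qquad j \ge 1
```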


  • Normal Equation
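The closed-form solution with regularization, where L is the (n+1)×(n+1) identity matrix with its top-left entry zeroed (so θ0 is not penalized):

```latex
\theta = \left( X^T X + \lambda L \right)^{-1} X^T y, \qquad
L = \begin{bmatrix} 0 & & & \\ & 1 & & \\ & & \ddots & \\ & & & 1 \end{bmatrix}
```

A side benefit: for λ > 0, the matrix XᵀX + λL is always invertible, even when XᵀX itself is singular (e.g. when m ≤ n).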


3.2 Regularized Logistic Regression

  • Cost Function
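The logistic log cost from section 1.2 plus the same penalty term (again skipping θ0):

```latex
J(\theta) = -\frac{1}{m} \sum_{i=1}^{m} \left[ y^{(i)} \log h_\theta(x^{(i)}) + (1 - y^{(i)}) \log\left(1 - h_\theta(x^{(i)})\right) \right] + \frac{\lambda}{2m} \sum_{j=1}^{n} \theta_j^2
```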


  • Gradient descent
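The updates look identical to regularized linear regression's, except that here h_θ(x) = g(θᵀx) is the sigmoid hypothesis:

```latex
\theta_0 := \theta_0 - \alpha \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right) x_0^{(i)}
```

```latex
\theta_j := \theta_j - \alpha \left[ \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right) x_j^{(i)} + \frac{\lambda}{m} \theta_j \right], \qquad j \ge 1
```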

