Lecture 10: Logistic Regression


[Reference] https://redstonewill.com/236/
[Summary]
1. Soft binary classification: the closer the output is to 1, the more likely the example belongs to the positive class; the closer it is to 0, the more likely it belongs to the negative class.
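In the usual notation, with linear score $s = \mathbf{w}^T\mathbf{x}$, the "soft" output comes from squashing $s$ through the logistic (sigmoid) function:

$$
\theta(s) = \frac{1}{1 + e^{-s}}, \qquad h(\mathbf{x}) = \theta(\mathbf{w}^T\mathbf{x}) = \frac{1}{1 + e^{-\mathbf{w}^T\mathbf{x}}},
$$

so that $h(\mathbf{x})$ can be read as an estimate of $P(y = +1 \mid \mathbf{x})$.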

2. Comparing Logistic Regression with Linear Classification and Linear Regression:
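One compact way to line the three models up, using the shared score $s = \mathbf{w}^T\mathbf{x}$ and labels $y \in \{-1, +1\}$ (hypothesis and pointwise error measure):

$$
\begin{array}{lll}
\text{linear classification:} & h(\mathbf{x}) = \mathrm{sign}(s), & \mathrm{err}_{0/1} = [\mathrm{sign}(s) \neq y] \\
\text{linear regression:} & h(\mathbf{x}) = s, & \mathrm{err}_{\mathrm{sqr}} = (s - y)^2 \\
\text{logistic regression:} & h(\mathbf{x}) = \theta(s), & \mathrm{err}_{\mathrm{ce}} = \ln\big(1 + \exp(-y s)\big)
\end{array}
$$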

  1. The error function of logistic regression is called the cross-entropy error:
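With labels $y \in \{-1, +1\}$, the pointwise cross-entropy error and the resulting in-sample error are:

$$
\mathrm{err}(\mathbf{w}, \mathbf{x}, y) = \ln\big(1 + \exp(-y\,\mathbf{w}^T\mathbf{x})\big), \qquad
E_{\mathrm{in}}(\mathbf{w}) = \frac{1}{N}\sum_{n=1}^{N} \ln\big(1 + \exp(-y_n\,\mathbf{w}^T\mathbf{x}_n)\big).
$$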

The goal is to minimize Ein:
3. Ein(w) is continuous, differentiable, and convex, so its minimum is where the gradient is zero; unlike linear regression, there is no closed-form solution, so the minimum is found iteratively by gradient descent.
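Differentiating the cross-entropy error gives the gradient that each descent step uses (here $\theta$ is the logistic function defined above):

$$
\nabla E_{\mathrm{in}}(\mathbf{w}) = \frac{1}{N}\sum_{n=1}^{N} \theta\big(-y_n\,\mathbf{w}^T\mathbf{x}_n\big)\,\big(-y_n\,\mathbf{x}_n\big).
$$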
4. The steps of the gradient-descent-based Logistic Regression algorithm are as follows:
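In outline: initialize $\mathbf{w}_0$; at each step compute $\nabla E_{\mathrm{in}}(\mathbf{w}_t)$ and update $\mathbf{w}_{t+1} = \mathbf{w}_t - \eta\,\nabla E_{\mathrm{in}}(\mathbf{w}_t)$; stop when the gradient is (close to) zero or the iteration budget is exhausted, and return the last $\mathbf{w}$ as $g$. The NumPy sketch below only illustrates these steps: the function names, the learning rate `eta`, and the stopping thresholds are illustrative choices rather than values from the lecture, and `X` is assumed to already contain the constant $x_0 = 1$ feature, with `y` holding $\pm 1$ labels.

```python
import numpy as np

def sigmoid(s):
    """Logistic function theta(s) = 1 / (1 + exp(-s))."""
    return 1.0 / (1.0 + np.exp(-s))

def logistic_regression_gd(X, y, eta=0.1, max_iters=2000, tol=1e-6):
    """Minimize the cross-entropy error E_in(w) with batch gradient descent.

    X: (N, d) array of examples, first column assumed to be the constant 1.
    y: (N,) array of labels in {-1, +1}.
    eta: learning rate (illustrative value, not prescribed by the lecture).
    """
    N, d = X.shape
    w = np.zeros(d)                        # initialize w_0
    for _ in range(max_iters):
        s = X @ w
        # gradient of E_in: (1/N) * sum_n theta(-y_n * s_n) * (-y_n * x_n)
        grad = (sigmoid(-y * s) * (-y)) @ X / N
        if np.linalg.norm(grad) < tol:     # stop once the gradient is (nearly) zero
            break
        w = w - eta * grad                 # w_{t+1} = w_t - eta * grad
    return w                               # return the last w as g
```

Predictions are then `sigmoid(X @ w)`, which can be thresholded at 0.5 if a hard classification is needed.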