The previous lecture covered linear regression and how linear regression can be used for binary classification. There we obtained a hard classification: each point is assigned to one class or the other, with no middle ground. Naturally, if we also want to know how much confidence to place in a classification result, that is exactly what this lecture on Logistic Regression addresses.
1. Logistic Regression Problem
The target function is $f(x) = P(+1 \mid x) \in [0, 1]$. The data are the same as in hard binary classification; only the target function differs.
2. Logistic Regression Error
Linear scoring function: $s = w^T x$
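The accompanying slide is not reproduced in the post; in the usual formulation, the score $s$ is squashed by the logistic (sigmoid) function to form the hypothesis:

$$\theta(s) = \frac{1}{1 + e^{-s}}, \qquad h(x) = \theta(w^T x) = \frac{1}{1 + e^{-w^T x}},$$

so that $h(x)$ can be read as an estimate of $P(+1 \mid x)$.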
Likelihood of Logistic Hypothesis:
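The formula itself is missing here; presumably it is the standard argument: if $h \approx f$, then the probability that $h$ generates the observed data $\{(x_n, y_n)\}_{n=1}^N$ should be large. Using the symmetry $1 - \theta(s) = \theta(-s)$ and labels $y_n \in \{-1, +1\}$,

$$\text{likelihood}(h) \;\propto\; \prod_{n=1}^{N} h(y_n x_n) \;=\; \prod_{n=1}^{N} \theta(y_n w^T x_n),$$

and logistic regression chooses the $w$ that maximizes this likelihood.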
Cross-Entropy Error:
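Taking the negative logarithm of the likelihood and averaging over the $N$ examples gives the cross-entropy (log-loss) form of the in-sample error:

$$E_{\text{in}}(w) = \frac{1}{N} \sum_{n=1}^{N} \ln\!\left(1 + \exp(-y_n w^T x_n)\right).$$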
3. Gradient of Logistic Regression Error
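The gradient formula is not reproduced in the post; differentiating the cross-entropy error above gives the standard result

$$\nabla E_{\text{in}}(w) = \frac{1}{N} \sum_{n=1}^{N} \theta\!\left(-y_n w^T x_n\right)\left(-y_n x_n\right),$$

a $\theta(\cdot)$-weighted average of the vectors $-y_n x_n$. Setting this to zero has no closed-form solution, which motivates an iterative method such as gradient descent.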
4. Gradient Descent
Linear Approximation:
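Again the slide is missing; the usual argument is a first-order Taylor expansion. For a small step size $\eta > 0$ and a unit direction $v$,

$$E_{\text{in}}(w_t + \eta v) \approx E_{\text{in}}(w_t) + \eta\, v^T \nabla E_{\text{in}}(w_t),$$

which is minimized over unit vectors $v$ by moving opposite the gradient: $v = -\nabla E_{\text{in}}(w_t) / \lVert \nabla E_{\text{in}}(w_t) \rVert$.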
Gradient Descent:
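With a fixed learning rate $\eta$, the update rule (in its common un-normalized form) is

$$w_{t+1} \leftarrow w_t - \eta\, \nabla E_{\text{in}}(w_t),$$

repeated until $\nabla E_{\text{in}}(w_{t+1}) \approx 0$ or a maximum number of iterations is reached.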
Logistic Regression Algorithm:
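The algorithm box from the lecture is not included in the post; the following NumPy sketch implements the steps above (initialize $w$, compute the gradient of the cross-entropy error, take a fixed-step update, stop when the gradient is near zero). Function and parameter names are my own choices for illustration, not taken from the lecture.

```python
import numpy as np

def sigmoid(s):
    # logistic function theta(s) = 1 / (1 + exp(-s))
    return 1.0 / (1.0 + np.exp(-s))

def logistic_regression(X, y, eta=0.1, max_iter=1000):
    """Minimize the cross-entropy error with fixed-step gradient descent.

    X : (N, d) feature matrix (append a constant column for the bias term)
    y : (N,)  labels in {-1, +1}
    """
    N, d = X.shape
    w = np.zeros(d)                          # initialize w_0
    for _ in range(max_iter):
        # grad E_in(w) = (1/N) * sum_n theta(-y_n w^T x_n) * (-y_n x_n)
        grad = np.mean(sigmoid(-y * (X @ w))[:, None] * (-y[:, None] * X), axis=0)
        w = w - eta * grad                   # step in the negative-gradient direction
        if np.linalg.norm(grad) < 1e-6:      # stop when the gradient is (nearly) zero
            break
    return w

def predict_proba(X, w):
    # estimated P(+1 | x) = theta(w^T x)
    return sigmoid(X @ w)
```

A usage example would pass the training matrix and $\{-1,+1\}$ labels, e.g. `w = logistic_regression(X, y, eta=0.1)`, then threshold `predict_proba(X_new, w)` at $0.5$ for hard decisions or report the probability itself as the confidence.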
The gradient descent algorithm is fairly sensitive to the learning rate $\eta$: different choices strongly affect how many steps convergence takes. For a detailed analysis, see Andrew Ng's Machine Learning course on Coursera [1].
[1] https://www.coursera.org/course/ml