LeiGaoCoder's personal blog on ScienceNet http://blog.sciencenet.cn/u/LeiGaoCoder


SVM Notes

Viewed 1979 times | 2013-7-25 15:53 | Category: research notes

what?

A classification method: find the separating hyperplane with the largest margin,

$\max_{w,b}\ \dfrac{1}{\|w\|}\,\min_i\, y_i\,(w^\top x_i + b)$

i.e., maximize the smallest signed distance from any training point $x_i$ (with label $y_i \in \{-1,+1\}$) to the hyperplane.

why?

Where the name "support vector" comes from:

The points closest to the separating hyperplane are known as support vectors.

Margin: Distance of closest example from the decision line/hyperplane

Build our classifier in such a way that the farther a data point is from the decision boundary, the more confident we are about the prediction we've made.

The farther a point lies from the decision boundary, the more confident we can be about whether it is +1 or -1.


We want to have the greatest possible margin, because if we made a mistake or trained our classifier on limited data, we'd want it to be as robust as possible.
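As a toy sketch of "distance from the boundary = confidence": the signed perpendicular distance of a point from the hyperplane gives both the predicted label (its sign) and the confidence (its magnitude). The weights w, b below are made up for illustration, not from the post:

```python
import numpy as np

# Illustrative hyperplane w.x + b = 0 (weights chosen arbitrarily)
w = np.array([1.0, 1.0])
b = -1.0

def confidence(x):
    """Signed perpendicular distance of x from the hyperplane.

    sign(distance) is the predicted class (+1 / -1); its magnitude
    says how far x sits from the boundary, i.e. how confident the call is.
    """
    return (w @ x + b) / np.linalg.norm(w)

print(confidence(np.array([3.0, 3.0])))  # far from the boundary: a confident +1
print(confidence(np.array([0.6, 0.6])))  # near the boundary: a shaky +1
```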


Pseudocode of the simplified SMO (Sequential Minimal Optimization) routine that finds the alphas:

Create an alphas vector filled with 0s
While the number of iterations is less than MaxIterations:
    For every data vector in the dataset:
        If the data vector can be optimized:
            Select another data vector at random
            Optimize the two vectors together
            If the vectors can't be optimized → break
    If no vectors were optimized → increment the iteration count
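A minimal NumPy sketch of that loop, assuming a linear kernel; the function name smo_simple and the parameters C, tol, max_iter are my own choices for illustration, not from the post:

```python
import random
import numpy as np

def smo_simple(X, y, C=1.0, tol=1e-3, max_iter=40):
    """Simplified SMO for a linear-kernel SVM.

    X: (m, n) data matrix; y: (m,) labels in {-1, +1}.
    Returns the weight vector w, bias b, and the alphas.
    """
    m = X.shape[0]
    alphas = np.zeros(m)          # create an alphas vector filled with 0s
    b = 0.0
    K = X @ X.T                   # precomputed linear kernel
    it = 0
    while it < max_iter:          # iterations without any change
        changed = 0
        for i in range(m):        # for every data vector in the dataset
            Ei = (alphas * y) @ K[:, i] + b - y[i]
            # "can be optimized" = alpha_i violates its KKT condition
            if (y[i] * Ei < -tol and alphas[i] < C) or (y[i] * Ei > tol and alphas[i] > 0):
                j = random.choice([k for k in range(m) if k != i])  # second vector at random
                Ej = (alphas * y) @ K[:, j] + b - y[j]
                ai_old, aj_old = alphas[i], alphas[j]
                # clipping bounds keeping 0 <= alpha <= C and the sum constraint
                if y[i] != y[j]:
                    L, H = max(0.0, aj_old - ai_old), min(C, C + aj_old - ai_old)
                else:
                    L, H = max(0.0, ai_old + aj_old - C), min(C, ai_old + aj_old)
                if L == H:
                    continue      # this pair can't be optimized
                eta = 2.0 * K[i, j] - K[i, i] - K[j, j]
                if eta >= 0:
                    continue
                # optimize the two alphas together
                alphas[j] -= y[j] * (Ei - Ej) / eta
                alphas[j] = min(H, max(L, alphas[j]))
                if abs(alphas[j] - aj_old) < 1e-5:
                    continue
                alphas[i] += y[i] * y[j] * (aj_old - alphas[j])
                # update the threshold b from whichever alpha stayed interior
                b1 = b - Ei - y[i]*(alphas[i]-ai_old)*K[i, i] - y[j]*(alphas[j]-aj_old)*K[i, j]
                b2 = b - Ej - y[i]*(alphas[i]-ai_old)*K[i, j] - y[j]*(alphas[j]-aj_old)*K[j, j]
                if 0 < alphas[i] < C:
                    b = b1
                elif 0 < alphas[j] < C:
                    b = b2
                else:
                    b = (b1 + b2) / 2.0
                changed += 1
        # if no vectors were optimized, increment the iteration count
        it = 0 if changed else it + 1
    w = (alphas * y) @ X          # recover w from the dual variables
    return w, b, alphas
```

After training, the points with alphas[i] > 0 are exactly the support vectors: only they contribute to w.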







https://blog.sciencenet.cn/blog-1013787-711180.html

