# Support Vector Machine (SVM): Part I

1. Let $w$ be the normal vector of the separating hyperplane.
2. The classifier assigns each data point a class by $h(x)=\mathrm{sign}(w^Tx+b)$.
3. Classifying every point correctly is equivalent to requiring $y_{n}(w^{T}x_{n}+b)>0$ for all $n$.
4. The distance from each data point to the hyperplane is $\frac{1}{||w||}y_{n}(w^{T}x_{n}+b)$.
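As a quick numerical check of the classifier and the distance formula, with hypothetical values for $w$, $b$, and one point $x$ (illustrative values, not from the notes):

```python
import numpy as np

# Hypothetical hyperplane (w, b) and one labeled point -- illustrative values only
w = np.array([1.0, 1.0])
b = -3.0
x = np.array([2.0, 2.0])
y = 1  # true label of x

score = w @ x + b                          # w^T x + b = 1.0
h = np.sign(score)                         # h(x) = sign(w^T x + b) -> +1
distance = y * score / np.linalg.norm(w)   # (1/||w||) * y * (w^T x + b)
```

Here `distance` is positive exactly when the point is classified correctly, which is why the margin definitions below can use it directly.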

The goal is to maximize the margin:

$\underset{b,w}{\max}\qquad \mathrm{margin}(b,w)$
$\text{subject to}\qquad y_{n}(w^{T}x_{n}+b)>0$
$\text{where}\qquad \mathrm{margin}(b,w) = \underset{n=1,\dots,N}{\min}\frac{1}{||w||}y_{n}(w^{T}x_{n}+b)$

Since rescaling $(w,b)$ leaves the hyperplane unchanged, we may fix the minimum functional margin to $1$, so the margin equals $1/||w||$:

$\underset{b,w}{\max}\qquad \frac{1}{||w||}$
$\text{subject to}\qquad \underset{n=1,\dots,N}{\min}\,y_{n}(w^{T}x_{n}+b)=1$

Maximizing $1/||w||$ is equivalent to minimizing $\frac{1}{2}w^Tw$, and the equality constraint can be relaxed to an inequality without changing the optimum:

$\underset{b,w}{\min}\qquad \frac{1}{2}w^Tw$
$\text{subject to}\qquad y_{n}(w^{T}x_{n}+b)\geq 1,\quad \text{for } n=1,\dots,N$

Consider a toy data set with two negative and two positive points:

$x_1=(0,0),\ y_1=-1$
$x_2=(2,2),\ y_2=-1$
$x_3=(2,0),\ y_3=+1$
$x_4=(3,0),\ y_4=+1$
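For this toy data set, the hard-margin problem above can be solved numerically; a sketch using `scipy.optimize.minimize` with SLSQP (my choice of solver -- the notes do not prescribe one):

```python
import numpy as np
from scipy.optimize import minimize

# Toy data set from the text
X = np.array([[0., 0.], [2., 2.], [2., 0.], [3., 0.]])
y = np.array([-1., -1., 1., 1.])

# Variables u = [b, w1, w2]: minimize (1/2) w^T w  s.t.  y_n (w^T x_n + b) >= 1
obj = lambda u: 0.5 * (u[1] ** 2 + u[2] ** 2)
cons = [{'type': 'ineq', 'fun': (lambda u, i=i: y[i] * (X[i] @ u[1:] + u[0]) - 1)}
        for i in range(len(y))]
res = minimize(obj, np.zeros(3), method='SLSQP', constraints=cons)
b, w = res.x[0], res.x[1:]
# Analytic optimum for this data: w = (1, -1), b = -1
```

At the optimum the first three points satisfy $y_n(w^Tx_n+b)=1$ exactly (they sit on the margin), while $x_4$ has slack.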

This is a standard quadratic program (QP) in the variable $u$:

$\underset{u}{\min}\qquad \frac{1}{2}u^TQu+p^Tu$
$\text{subject to}\qquad a_m^Tu\geq c_m,\quad \text{for } m = 1,2,\dots,M$

$u = \begin{bmatrix} b \\ w \end{bmatrix}$
$Q = \begin{bmatrix} 0 & 0^T_d \\ 0_d & I_d \end{bmatrix}$
$p = 0_{d+1}$

$a^T_n = y_n\begin{bmatrix}1 & x^T_n\end{bmatrix}$
$c_n = 1$
$M=N$
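These QP pieces can be assembled explicitly; a sketch for the toy data above ($d=2$, $N=4$):

```python
import numpy as np

# u = [b; w] with d = 2, so Q is (d+1) x (d+1) with I_d in the w-block
d = 2
Q = np.zeros((d + 1, d + 1))
Q[1:, 1:] = np.eye(d)        # [[0, 0_d^T], [0_d, I_d]]
p = np.zeros(d + 1)          # p = 0_{d+1}

# Toy data set from the text
X = np.array([[0., 0.], [2., 2.], [2., 0.], [3., 0.]])
y = np.array([-1., -1., 1., 1.])

A = y[:, None] * np.hstack([np.ones((len(y), 1)), X])  # rows a_n^T = y_n [1, x_n^T]
c = np.ones(len(y))                                    # c_n = 1, M = N
```

Any off-the-shelf QP solver that accepts `(Q, p, A, c)` in this form can then return the optimal $u = [b; w]$.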

The corresponding dual problem, written in terms of the (possibly feature-transformed) inputs $z_n$, is:

$\underset{\alpha}{\min}\qquad \frac{1}{2}\sum_{n=1}^{N}\sum_{m=1}^{N}\alpha_n\alpha_my_ny_mz_n^Tz_m-\sum_{n=1}^{N}\alpha_n$

$\text{subject to}\quad \sum_{n=1}^{N}y_n\alpha_n = 0;$

$\qquad \alpha_n\geq 0,\quad \text{for } n=1,2,3,\dots,N$

$\underset{\alpha}{\min}\qquad \frac{1}{2}\alpha^TQ_D\alpha-1^T\alpha$

$\text{subject to}\quad y^T\alpha = 0;$

$\qquad \alpha_n\geq 0,\quad \text{for } n=1,2,3,\dots,N$

where the entries of the matrix $Q_D$ are $q_{n,m}=y_ny_mz_n^Tz_m$.
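$Q_D$ can be built in one vectorized expression; a sketch for the toy data, assuming the identity transform $z_n = x_n$:

```python
import numpy as np

# Q_D via an outer product: q_{n,m} = y_n y_m z_n^T z_m (here z_n = x_n)
X = np.array([[0., 0.], [2., 2.], [2., 0.], [3., 0.]])
y = np.array([-1., -1., 1., 1.])
Q_D = (y[:, None] * y[None, :]) * (X @ X.T)
```

`X @ X.T` is the Gram matrix of inner products, so replacing it with a kernel matrix later changes nothing else in the dual.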

A code example in CVX:
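The actual CVX (MATLAB) listing is not reproduced in these notes; as a stand-in, the same dual QP can be solved in Python with `scipy.optimize.minimize` (SLSQP -- my substitution, not the original CVX code):

```python
import numpy as np
from scipy.optimize import minimize

# Toy data and Q_D as above (identity transform z_n = x_n)
X = np.array([[0., 0.], [2., 2.], [2., 0.], [3., 0.]])
y = np.array([-1., -1., 1., 1.])
Q_D = (y[:, None] * y[None, :]) * (X @ X.T)

# min (1/2) a^T Q_D a - 1^T a   s.t.   y^T a = 0,  a >= 0
obj = lambda a: 0.5 * a @ Q_D @ a - a.sum()
cons = [{'type': 'eq', 'fun': lambda a: y @ a}]
res = minimize(obj, np.zeros(4), method='SLSQP',
               bounds=[(0, None)] * 4, constraints=cons)
alpha = res.x
# Analytic optimum for this data: alpha = (0.5, 0.5, 1, 0)
```

The three nonzero $\alpha_n$ mark the support vectors; $\alpha_4=0$ because $x_4$ lies strictly outside the margin.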

After solving the dual, $w$ and $b$ are recovered from the support vectors (SVs):

$w=\sum_{SV}\alpha_ny_nz_n$

$b=y_n-w^Tz_n,\quad \text{with any SV } (z_n,y_n)$
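These two recovery formulas can be sketched on the toy data, using its dual optimum $\alpha=(0.5,\,0.5,\,1,\,0)$ (computed separately, not stated in the notes):

```python
import numpy as np

# Toy data and a dual optimum alpha for it
X = np.array([[0., 0.], [2., 2.], [2., 0.], [3., 0.]])
y = np.array([-1., -1., 1., 1.])
alpha = np.array([0.5, 0.5, 1.0, 0.0])

w = (alpha * y) @ X                  # w = sum_{SV} alpha_n y_n z_n
sv = int(np.argmax(alpha > 1e-8))    # index of one support vector
b = y[sv] - w @ X[sv]                # b = y_n - w^T z_n for any SV
# w = (1, -1), b = -1
```

Any support vector gives the same $b$, which is a useful consistency check on a numerical dual solution.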

In Python, a support vector machine can be fit with scikit-learn's SVM module as follows, with the data X and labels y stored as numpy.ndarray:
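A minimal sketch (the hard margin is approximated by a very large `C`; the toy data above is reused):

```python
import numpy as np
from sklearn.svm import SVC

# Toy data set from the text, as numpy.ndarray
X = np.array([[0., 0.], [2., 2.], [2., 0.], [3., 0.]])
y = np.array([-1, -1, 1, 1])

# Hard margin approximated by a very large C with a linear kernel
clf = SVC(kernel='linear', C=1e6)
clf.fit(X, y)
w, b = clf.coef_[0], clf.intercept_[0]
# Should be close to w = (1, -1), b = -1; clf.support_ lists the SV indices
```

`SVC` solves the dual internally (via libsvm), so `clf.dual_coef_` holds the products $\alpha_n y_n$ for the support vectors.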