Machine Learning (Watermelon Book): Derivation of the Univariate Linear Regression Formulas

Derivation of the univariate linear regression formulas

1. Roadmap for deriving the bias $b$: derive the loss function $E(w,b)$ from least squares → prove that the loss function $E(w,b)$ is a convex function of $w$ and $b$ → take the first-order partial derivative of $E(w,b)$ with respect to $b$ → set that partial derivative to $0$ and solve for $b$.

Deriving the loss function $E(w,b)$ from least squares (Eq. 3.4, p. 54 of the book):

$$E_{(w,b)}=\sum_{i=1}^{m}\bigl(y_i-f(x_i)\bigr)^2=\sum_{i=1}^{m}\bigl(y_i-(w x_i+b)\bigr)^2=\sum_{i=1}^{m}(y_i-w x_i-b)^2 \tag{3.4}$$
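
As a quick sanity check on Eq. (3.4), here is a minimal NumPy sketch of the loss. The toy arrays `x`, `y` and the trial parameters are made-up values for illustration only, and NumPy is assumed to be available:

```python
import numpy as np

def loss(w, b, x, y):
    """Least-squares loss E(w, b) = sum_i (y_i - w*x_i - b)^2, as in Eq. (3.4)."""
    return np.sum((y - w * x - b) ** 2)

# Made-up toy sample (m = 5), just to exercise the formula.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

print(loss(2.0, 0.0, x, y))  # loss of a reasonable guess
print(loss(0.0, 0.0, x, y))  # loss of the all-zero model, noticeably larger
```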

Proving that the loss function $E(w,b)$ is convex in $w$ and $b$ (the convexity criterion for bivariate functions is recalled at the end of this post). First compute $A=f''_{xx}(x,y)$, starting from the first-order partial derivative with respect to $w$:

$$\frac{\partial E_{(w,b)}}{\partial w}=\frac{\partial}{\partial w}\Bigl[\sum_{i=1}^{m}(y_i-w x_i-b)^2\Bigr]=\sum_{i=1}^{m}\frac{\partial}{\partial w}(y_i-w x_i-b)^2=\sum_{i=1}^{m}2\,(y_i-w x_i-b)\cdot(-x_i)=2\Bigl(w\sum_{i=1}^{m}x_i^2-\sum_{i=1}^{m}(y_i-b)x_i\Bigr) \tag{3.5}$$

The expression above is exactly Eq. (3.5) in the book.
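
A hedged numerical check of Eq. (3.5): the analytic partial derivative should match a central finite difference of the loss. The toy data, the evaluation point `(w0, b0)`, and the step size `eps` are arbitrary choices of mine, not from the book:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

def loss(w, b):
    return np.sum((y - w * x - b) ** 2)

def dE_dw(w, b):
    # Eq. (3.5): 2 * (w * sum(x_i^2) - sum((y_i - b) * x_i))
    return 2 * (w * np.sum(x ** 2) - np.sum((y - b) * x))

w0, b0, eps = 1.5, 0.3, 1e-6
numeric = (loss(w0 + eps, b0) - loss(w0 - eps, b0)) / (2 * eps)
print(dE_dw(w0, b0), numeric)  # the two printed values should agree closely
```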

$$A=f''_{xx}(x,y)=\frac{\partial^2 E_{(w,b)}}{\partial w^2}=\frac{\partial}{\partial w}\Bigl(\frac{\partial E_{(w,b)}}{\partial w}\Bigr)=\frac{\partial}{\partial w}\Bigl[2\Bigl(w\sum_{i=1}^{m}x_i^2-\sum_{i=1}^{m}(y_i-b)x_i\Bigr)\Bigr]=\frac{\partial}{\partial w}\Bigl[2w\sum_{i=1}^{m}x_i^2\Bigr]=2\sum_{i=1}^{m}x_i^2$$

Next compute $B=f''_{xy}(x,y)$:

$$B=f''_{xy}(x,y)=\frac{\partial^2 E_{(w,b)}}{\partial w\,\partial b}=\frac{\partial}{\partial b}\Bigl(\frac{\partial E_{(w,b)}}{\partial w}\Bigr)=\frac{\partial}{\partial b}\Bigl[2\Bigl(w\sum_{i=1}^{m}x_i^2-\sum_{i=1}^{m}(y_i-b)x_i\Bigr)\Bigr]=\frac{\partial}{\partial b}\Bigl[-2\sum_{i=1}^{m}(y_i-b)x_i\Bigr]=2\sum_{i=1}^{m}x_i$$
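
A hedged spot-check of $B$: since $E(w,b)$ is quadratic, the mixed second-order partial can be recovered exactly (up to rounding) by a cross finite difference of the loss. The data, evaluation point, and `eps` below are the same made-up values as before:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

def loss(w, b):
    return np.sum((y - w * x - b) ** 2)

B_analytic = 2 * np.sum(x)  # derived above: B = 2 * sum(x_i)

# Cross finite difference approximation of d^2E / (dw db).
w0, b0, eps = 1.5, 0.3, 1e-4
B_numeric = (loss(w0 + eps, b0 + eps) - loss(w0 + eps, b0 - eps)
             - loss(w0 - eps, b0 + eps) + loss(w0 - eps, b0 - eps)) / (4 * eps ** 2)
print(B_analytic, B_numeric)  # the two values should be close
```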

Finally compute $C=f''_{yy}(x,y)$. First take the first-order partial derivative of $E_{(w,b)}$ with respect to $b$:

$$\frac{\partial E_{(w,b)}}{\partial b}=\frac{\partial}{\partial b}\Bigl[\sum_{i=1}^{m}(y_i-w x_i-b)^2\Bigr]=\sum_{i=1}^{m}\frac{\partial}{\partial b}(y_i-w x_i-b)^2=\sum_{i=1}^{m}2\,(y_i-w x_i-b)\cdot(-1)=2\Bigl(mb-\sum_{i=1}^{m}(y_i-w x_i)\Bigr) \tag{3.6}$$

The expression above is exactly Eq. (3.6) in the book.
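
Eq. (3.6) can be checked the same way as Eq. (3.5); a minimal sketch with the same made-up data and an arbitrary evaluation point:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])
m = len(x)

def loss(w, b):
    return np.sum((y - w * x - b) ** 2)

def dE_db(w, b):
    # Eq. (3.6): 2 * (m*b - sum(y_i - w*x_i))
    return 2 * (m * b - np.sum(y - w * x))

w0, b0, eps = 1.5, 0.3, 1e-6
numeric = (loss(w0, b0 + eps) - loss(w0, b0 - eps)) / (2 * eps)
print(dE_db(w0, b0), numeric)  # the two printed values should agree closely
```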

Differentiating Eq. (3.6) once more with respect to $b$ gives

$$C=f''_{yy}(x,y)=\frac{\partial^2 E_{(w,b)}}{\partial b^2}=\frac{\partial}{\partial b}\Bigl[2\Bigl(mb-\sum_{i=1}^{m}(y_i-w x_i)\Bigr)\Bigr]=2m$$

Therefore

$$A=2\sum_{i=1}^{m}x_i^2\ge 0,\qquad AC-B^2=2\sum_{i=1}^{m}x_i^2\cdot 2m-\Bigl(2\sum_{i=1}^{m}x_i\Bigr)^2=4m\sum_{i=1}^{m}x_i^2-4\Bigl(\sum_{i=1}^{m}x_i\Bigr)^2=4m\sum_{i=1}^{m}(x_i-\bar{x})^2\ge 0,$$

where $\bar{x}=\frac{1}{m}\sum_{i=1}^{m}x_i$. By the convexity criterion for bivariate functions (recalled at the end of this post), this proves that the loss function $E(w,b)$ is convex in $w$ and $b$.
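
To make the convexity argument concrete, one can assemble the Hessian $\begin{pmatrix}A & B\\ B & C\end{pmatrix}$ from the second-order partials derived above and confirm its eigenvalues are non-negative. A minimal sketch with made-up data:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
m = len(x)

# Second-order partials derived above; they do not depend on w or b.
A = 2 * np.sum(x ** 2)
B = 2 * np.sum(x)
C = 2 * m

H = np.array([[A, B], [B, C]])
print(np.linalg.eigvalsh(H))   # both eigenvalues >= 0, so H is positive semidefinite
print(A * C - B ** 2)          # 4*m*sum(x^2) - 4*(sum(x))^2 >= 0
```
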
Now take the first-order partial derivative of the loss function $E(w,b)$ with respect to $b$, which is exactly Eq. (3.6) above:

$$\frac{\partial E_{(w,b)}}{\partial b}=2\Bigl(mb-\sum_{i=1}^{m}(y_i-w x_i)\Bigr)$$

Set this partial derivative to $0$ and solve for $b$:

$$2\Bigl(mb-\sum_{i=1}^{m}(y_i-w x_i)\Bigr)=0\;\Longrightarrow\;b=\frac{1}{m}\sum_{i=1}^{m}(y_i-w x_i) \tag{3.8}$$

Since $\frac{1}{m}\sum_{i=1}^{m}y_i=\bar{y}$ and $\frac{1}{m}\sum_{i=1}^{m}x_i=\bar{x}$, this can also be written as $b=\bar{y}-w\bar{x}$.
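
A hedged numerical illustration of Eq. (3.8): for any fixed $w$, the $b$ given by the formula should beat every nearby value of $b$. The fixed slope `w0` and the toy data are arbitrary choices of mine:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

def loss(w, b):
    return np.sum((y - w * x - b) ** 2)

w0 = 1.7                          # any fixed slope
b_star = np.mean(y - w0 * x)      # Eq. (3.8): b = (1/m) * sum(y_i - w*x_i) = ybar - w*xbar

# Compare against a grid of candidate intercepts around b_star.
grid = b_star + np.linspace(-1.0, 1.0, 201)
best_on_grid = min(loss(w0, b) for b in grid)
print(b_star, np.isclose(loss(w0, b_star), best_on_grid))  # True: b_star minimizes E(w0, .)
```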

2. Roadmap for deriving the weight $w$: take the first-order partial derivative of the loss function $E(w,b)$ with respect to $w$ → set that partial derivative to $0$ and solve for $w$.

Set the first-order partial derivative in Eq. (3.5) to $0$ and solve for $w$. Substituting $b=\bar{y}-w\bar{x}$ into

$$2\Bigl(w\sum_{i=1}^{m}x_i^2-\sum_{i=1}^{m}(y_i-b)x_i\Bigr)=0$$

gives

$$w\sum_{i=1}^{m}x_i^2=\sum_{i=1}^{m}y_i x_i-\sum_{i=1}^{m}(\bar{y}-w\bar{x})x_i=\sum_{i=1}^{m}y_i x_i-\bar{y}\sum_{i=1}^{m}x_i+w\bar{x}\sum_{i=1}^{m}x_i,$$

so that

$$w\Bigl(\sum_{i=1}^{m}x_i^2-\bar{x}\sum_{i=1}^{m}x_i\Bigr)=\sum_{i=1}^{m}y_i x_i-\bar{y}\sum_{i=1}^{m}x_i.$$

Since $\bar{y}\sum_{i=1}^{m}x_i=\bar{x}\sum_{i=1}^{m}y_i$, the right-hand side equals $\sum_{i=1}^{m}y_i(x_i-\bar{x})$, and since $\bar{x}\sum_{i=1}^{m}x_i=\frac{1}{m}\bigl(\sum_{i=1}^{m}x_i\bigr)^2$, the bracket on the left-hand side equals $\sum_{i=1}^{m}x_i^2-\frac{1}{m}\bigl(\sum_{i=1}^{m}x_i\bigr)^2$. Hence

$$w=\frac{\sum_{i=1}^{m}y_i(x_i-\bar{x})}{\sum_{i=1}^{m}x_i^2-\frac{1}{m}\bigl(\sum_{i=1}^{m}x_i\bigr)^2} \tag{3.7}$$

which is exactly Eq. (3.7) in the book.
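
As a cross-check of Eqs. (3.7) and (3.8), the closed-form $w$ and $b$ should coincide with an off-the-shelf least-squares fit such as `np.polyfit`. The toy data are made up for illustration:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])
m = len(x)

xbar = x.mean()
w = np.sum(y * (x - xbar)) / (np.sum(x ** 2) - np.sum(x) ** 2 / m)  # Eq. (3.7)
b = np.mean(y - w * x)                                              # Eq. (3.8)

slope, intercept = np.polyfit(x, y, deg=1)
print(w, slope)       # should match
print(b, intercept)   # should match
```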

3. Vectorizing $w$:

The summation form of Eq. (3.7) can be rewritten with mean-centered samples. Because $\sum_{i=1}^{m}\bar{y}(x_i-\bar{x})=\bar{y}\sum_{i=1}^{m}(x_i-\bar{x})=0$ and $\sum_{i=1}^{m}x_i^2-\frac{1}{m}\bigl(\sum_{i=1}^{m}x_i\bigr)^2=\sum_{i=1}^{m}(x_i-\bar{x})^2$, we have

$$w=\frac{\sum_{i=1}^{m}y_i(x_i-\bar{x})}{\sum_{i=1}^{m}x_i^2-\frac{1}{m}\bigl(\sum_{i=1}^{m}x_i\bigr)^2}=\frac{\sum_{i=1}^{m}(x_i-\bar{x})(y_i-\bar{y})}{\sum_{i=1}^{m}(x_i-\bar{x})^2}=\frac{\boldsymbol{x}_d^{\mathrm{T}}\boldsymbol{y}_d}{\boldsymbol{x}_d^{\mathrm{T}}\boldsymbol{x}_d},$$

where $\boldsymbol{x}_d=(x_1-\bar{x},\,x_2-\bar{x},\,\dots,\,x_m-\bar{x})^{\mathrm{T}}$ and $\boldsymbol{y}_d=(y_1-\bar{y},\,y_2-\bar{y},\,\dots,\,y_m-\bar{y})^{\mathrm{T}}$ are the mean-centered sample vectors.
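
The vectorized form translates directly into NumPy: mean-center the samples and take a ratio of dot products. A minimal sketch with the same made-up data, checked against the summation form of Eq. (3.7):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])
m = len(x)

x_d = x - x.mean()                 # mean-centered x
y_d = y - y.mean()                 # mean-centered y
w_vec = (x_d @ y_d) / (x_d @ x_d)  # w = x_d^T y_d / (x_d^T x_d)

w_sum = np.sum(y * (x - x.mean())) / (np.sum(x ** 2) - np.sum(x) ** 2 / m)  # Eq. (3.7)
print(np.isclose(w_vec, w_sum))    # True: the two expressions agree
```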

Mathematical background: convexity criterion for a bivariate function. Suppose $f(x,y)$ has continuous second-order partial derivatives on an open convex region $D$, and write $A=f''_{xx}(x,y)$, $B=f''_{xy}(x,y)$, $C=f''_{yy}(x,y)$. If $A>0$ and $AC-B^2\ge 0$ hold everywhere on $D$, then $f(x,y)$ is convex on $D$; if $A<0$ and $AC-B^2\ge 0$ hold everywhere on $D$, then $f(x,y)$ is concave on $D$. More generally, $f$ is convex on $D$ if and only if its Hessian matrix $\begin{pmatrix}A & B\\ B & C\end{pmatrix}$ is positive semidefinite everywhere on $D$.
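
A hedged symbolic check of the quantities $A$, $B$, $C$ that feed this criterion, using SymPy (assumed available) on a small symbolic sample of size $m=3$; the symbol names are my own choices:

```python
import sympy as sp

m = 3
w, b = sp.symbols('w b', real=True)
xs = sp.symbols('x1:4', real=True)  # x1, x2, x3
ys = sp.symbols('y1:4', real=True)  # y1, y2, y3

# Loss E(w, b) = sum_i (y_i - w*x_i - b)^2 for the symbolic sample.
E = sum((yi - w * xi - b) ** 2 for xi, yi in zip(xs, ys))

A = sp.diff(E, w, 2)   # d^2 E / dw^2
B = sp.diff(E, w, b)   # d^2 E / (dw db)
C = sp.diff(E, b, 2)   # d^2 E / db^2

# Each difference should simplify to 0, matching A = 2*sum(x^2), B = 2*sum(x), C = 2*m.
print(sp.simplify(A - 2 * sum(xi ** 2 for xi in xs)))
print(sp.simplify(B - 2 * sum(xs)))
print(sp.simplify(C - 2 * m))
```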