1 Gradient Descent

For a function of two variables, the first-order approximation is

$$f(x+\Delta x, y+\Delta y) \simeq f(x,y) + \frac{\partial f(x,y)}{\partial x}\Delta x + \frac{\partial f(x,y)}{\partial y}\Delta y$$

so the resulting change in z = f(x, y) is

$$\Delta z = f(x+\Delta x, y+\Delta y) - f(x,y) \simeq \frac{\partial f(x,y)}{\partial x}\Delta x + \frac{\partial f(x,y)}{\partial y}\Delta y$$

Writing the right-hand side as an inner product of two vectors gives

$$\Delta z \simeq \left(\frac{\partial z}{\partial x}, \frac{\partial z}{\partial y}\right)\cdot(\Delta x, \Delta y) = \nabla z \cdot (\Delta x, \Delta y)$$

To make z decrease as fast as possible, the displacement vector must point in exactly the opposite direction to the gradient, that is,

$$(\Delta x, \Delta y) = -\eta \nabla z$$

where η is a small positive constant (the learning rate).
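A minimal sketch of this update rule in Python, assuming for illustration the objective f(x, y) = x² + y² with a hand-coded gradient (the objective, step count, and starting point are not from the text):

```python
import numpy as np

# Illustrative objective (an assumption): f(x, y) = x^2 + y^2
def grad_f(x, y):
    # Analytic gradient ∇z = (∂f/∂x, ∂f/∂y) = (2x, 2y)
    return np.array([2.0 * x, 2.0 * y])

eta = 0.1                      # learning rate η
point = np.array([3.0, 2.0])   # starting position (x, y)

for _ in range(50):
    # (Δx, Δy) = -η ∇z : step against the gradient
    point = point - eta * grad_f(*point)

print(point)   # converges toward the minimum at (0, 0)
```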
2 Error Backpropagation in a Neural Network (NN)

Gradients with respect to the parameters w and b:

$$\frac{\partial C}{\partial w^{l}_{ji}} = \delta^l_j\, a^{l-1}_i,\qquad \frac{\partial C}{\partial b^{l}_{j}} = \delta^l_j \qquad (l = 2, 3, \ldots)$$

How to compute δ

At the output layer (L denotes the output layer):

$$\delta^L_j = \frac{\partial C}{\partial a^L_j}\, a'(z^L_j)$$

With the squared-error cost

$$C = \frac{1}{2}\left\{ (t_1 - a^L_1)^2 + (t_2 - a^L_2)^2 \right\}$$

this becomes

$$\delta^L_j = \frac{\partial C}{\partial a^L_j}\, a'(z^L_j) = (a^L_j - t_j)\, a'(z^L_j)$$
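The intermediate derivative used here, written out as a short worked step for the squared-error cost above:

```latex
\frac{\partial C}{\partial a^L_j}
  = \frac{\partial}{\partial a^L_j}\,\frac{1}{2}\sum_{k=1}^{2}\bigl(t_k - a^L_k\bigr)^2
  = -\bigl(t_j - a^L_j\bigr)
  = a^L_j - t_j, \qquad j = 1, 2
```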
At a hidden layer, the recurrence between layer l and the next layer l+1, where m is the number of units in layer l+1 and l is an integer ≥ 2:

$$\delta^l_i = \left(\delta^{l+1}_1 w^{l+1}_{1i} + \delta^{l+1}_2 w^{l+1}_{2i} + \cdots + \delta^{l+1}_m w^{l+1}_{mi}\right) a'(z^l_i)$$

Unit error at the output layer (here layer 3 is the output layer):

$$\delta^3_j = \frac{\partial C}{\partial z^3_j} = \frac{\partial C}{\partial a^3_j}\,\frac{\partial a^3_j}{\partial z^3_j} = \frac{\partial C}{\partial a^3_j}\, a'(z^3_j)$$

Unit error at the hidden layer (layer 2):

$$\delta^2_i = \left(\delta^3_1 w^3_{1i} + \delta^3_2 w^3_{2i}\right) a'(z^2_i) \qquad (i = 1, 2, 3)$$
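A minimal sketch of these formulas for a 2-3-2 network; the layer sizes, the sigmoid activation, and all numeric values are assumptions for illustration:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    s = sigmoid(z)
    return s * (1.0 - s)

rng = np.random.default_rng(0)

# 2-3-2 network: layer 1 (input) -> layer 2 (hidden) -> layer 3 (output)
W2, b2 = rng.normal(size=(3, 2)), np.zeros(3)   # w^2_{ji}, b^2_j
W3, b3 = rng.normal(size=(2, 3)), np.zeros(2)   # w^3_{ji}, b^3_j

x = np.array([0.5, -0.2])   # input a^1
t = np.array([1.0, 0.0])    # target

# Forward pass
z2 = W2 @ x + b2;  a2 = sigmoid(z2)
z3 = W3 @ a2 + b3; a3 = sigmoid(z3)

# Output-layer error: δ^3_j = (a^3_j - t_j) a'(z^3_j)
delta3 = (a3 - t) * sigmoid_prime(z3)

# Hidden-layer error: δ^2_i = (Σ_j δ^3_j w^3_{ji}) a'(z^2_i)
delta2 = (W3.T @ delta3) * sigmoid_prime(z2)

# Gradients: ∂C/∂w^l_{ji} = δ^l_j a^{l-1}_i, ∂C/∂b^l_j = δ^l_j
grad_W3, grad_b3 = np.outer(delta3, a2), delta3
grad_W2, grad_b2 = np.outer(delta2, x),  delta2
```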
3 Error Backpropagation in a CNN

Gradient components at the output layer:

$$\frac{\partial C}{\partial w^{On}_{k\text{-}ij}} = \delta^O_n\, a^{Pk}_{ij},\qquad \frac{\partial C}{\partial b^{O}_{n}} = \delta^O_n$$

where n indexes the output-layer units, k indexes the pooling sublayers, and ij is the row-column index of a unit within pooling sublayer k (i, j = 1, 2).
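A minimal sketch of these output-layer gradient components, assuming three pooling sublayers of size 2x2 and three output units (the counts and the random arrays are placeholders, not from the text):

```python
import numpy as np

rng = np.random.default_rng(0)

delta_O = rng.normal(size=3)        # output-layer errors δ^O_n, n = 1, 2, 3
a_P = rng.normal(size=(3, 2, 2))    # pooled activations a^{Pk}_{ij}, k = 1..3, i, j = 1, 2

# ∂C/∂w^{On}_{k-ij} = δ^O_n a^{Pk}_{ij}: one entry per (n, k, i, j)
grad_w_O = delta_O[:, None, None, None] * a_P[None, :, :, :]   # shape (3, 3, 2, 2)

# ∂C/∂b^O_n = δ^O_n
grad_b_O = delta_O
```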
Gradient components at the convolution layer:

$$\frac{\partial C}{\partial w^{Fk}_{ij}} = \delta^{Fk}_{11} x_{ij} + \delta^{Fk}_{12} x_{i\,j+1} + \cdots + \delta^{Fk}_{44} x_{i+3\,j+3}$$

where k indexes the filter and ij is the row-column index within the filter (i, j = 1, 2, 3).

$$\frac{\partial C}{\partial b^{Fk}} = \delta^{Fk}_{11} + \delta^{Fk}_{12} + \cdots + \delta^{Fk}_{44}$$

where k indexes the filter.
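A minimal sketch of these two sums for a single filter k, assuming (from the index ranges above) a 6x6 input, a 3x3 filter, and a 4x4 convolution layer; the random arrays are placeholders:

```python
import numpy as np

rng = np.random.default_rng(0)

x_img = rng.normal(size=(6, 6))     # input pixels x_{ij} (6x6 assumed from the index range)
delta_Fk = rng.normal(size=(4, 4))  # convolution-layer errors δ^{Fk}_{pq}, p, q = 1..4

# ∂C/∂w^{Fk}_{ij} = Σ_{p,q} δ^{Fk}_{pq} x_{i+p-1, j+q-1}   for i, j = 1, 2, 3
grad_w_F = np.zeros((3, 3))
for i in range(3):
    for j in range(3):
        grad_w_F[i, j] = np.sum(delta_Fk * x_img[i:i+4, j:j+4])

# ∂C/∂b^{Fk} = Σ_{p,q} δ^{Fk}_{pq}
grad_b_F = np.sum(delta_Fk)
```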
Computing δ at the output layer:

$$\delta^O_n = \frac{\partial C}{\partial z^O_n} = \frac{\partial C}{\partial a^O_n}\,\frac{\partial a^O_n}{\partial z^O_n} = \frac{\partial C}{\partial a^O_n}\, a'(z^O_n)$$

where n indexes the output-layer units. With the squared-error cost

$$C = \frac{1}{2}\left\{ (t_1 - a^O_1)^2 + (t_2 - a^O_2)^2 + (t_3 - a^O_3)^2 \right\}$$

taking the derivative gives

$$\frac{\partial C}{\partial a^O_n} = a^O_n - t_n \qquad (n = 1, 2, 3)$$

and substituting into the δ formula above yields

$$\delta^O_n = (a^O_n - t_n)\, a'(z^O_n)$$

Computing δ at the convolution layer:

$$\delta^{Fk}_{ij} = \left\{\delta^{O}_{1} w^{O1}_{k\text{-}i'j'} + \delta^{O}_{2} w^{O2}_{k\text{-}i'j'} + \delta^{O}_{3} w^{O3}_{k\text{-}i'j'}\right\}\times\bigl(\text{1 if } a^{Fk}_{ij} \text{ is the maximum in its pooling block, 0 otherwise}\bigr)\times a'(z^{Fk}_{ij})$$

where k indexes the convolution sublayer, ij is the row-column index of a unit in that sublayer, and i'j' is the position of the pooling-layer unit that the convolution unit in row i, column j connects to.
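A minimal sketch of this formula for one convolution sublayer k, assuming (as in the sketches above) a 4x4 convolution layer, 2x2 max pooling, three output units, and a sigmoid activation; all arrays are random placeholders:

```python
import numpy as np

def sigmoid_prime(z):
    s = 1.0 / (1.0 + np.exp(-z))
    return s * (1.0 - s)

rng = np.random.default_rng(0)

z_Fk = rng.normal(size=(4, 4))          # convolution-layer weighted inputs z^{Fk}_{ij}
a_Fk = 1.0 / (1.0 + np.exp(-z_Fk))      # convolution-layer activations a^{Fk}_{ij}
delta_O = rng.normal(size=3)            # output-layer errors δ^O_n, n = 1, 2, 3
W_O = rng.normal(size=(3, 2, 2))        # weights w^{On}_{k-i'j'} for this sublayer k

delta_Fk = np.zeros((4, 4))
for i in range(4):
    for j in range(4):
        ip, jp = i // 2, j // 2                          # pooled position (i', j') of unit (i, j)
        block = a_Fk[2*ip:2*ip+2, 2*jp:2*jp+2]           # the 2x2 pooling block containing (i, j)
        is_max = 1.0 if a_Fk[i, j] == block.max() else 0.0
        backprop_sum = np.sum(delta_O * W_O[:, ip, jp])  # δ^O_1 w^{O1}_{k-i'j'} + δ^O_2 w^{O2}_{k-i'j'} + δ^O_3 w^{O3}_{k-i'j'}
        delta_Fk[i, j] = backprop_sum * is_max * sigmoid_prime(z_Fk[i, j])
```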