Multiple convolution layers
Max Pooling
Average Pooling
Sum Pooling
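The three pooling variants above differ only in the reduction applied to each window. A minimal NumPy sketch, assuming a square window, a non-overlapping stride, and a hypothetical helper name `pool2d`:

```python
import numpy as np

def pool2d(x, size=2, stride=2, mode="max"):
    """Pool a 2-D feature map with a square window (illustrative sketch)."""
    reduce = {"max": np.max, "avg": np.mean, "sum": np.sum}[mode]
    h_out = (x.shape[0] - size) // stride + 1
    w_out = (x.shape[1] - size) // stride + 1
    out = np.zeros((h_out, w_out))
    for i in range(h_out):
        for j in range(w_out):
            window = x[i*stride:i*stride+size, j*stride:j*stride+size]
            out[i, j] = reduce(window)
    return out

x = np.array([[1., 2., 5., 6.],
              [3., 4., 7., 8.],
              [0., 1., 2., 3.],
              [1., 0., 4., 5.]])
print(pool2d(x, mode="max"))   # [[4. 8.] [1. 5.]]
print(pool2d(x, mode="avg"))   # [[2.5 6.5] [0.5 3.5]]
print(pool2d(x, mode="sum"))   # [[10. 26.] [2. 14.]]
```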
Dropout: A Simple Way to Prevent Neural Networks from Overfitting
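A minimal sketch of the idea from the paper above, written as "inverted dropout", the formulation commonly used in practice; the keep probability and function name are illustrative assumptions:

```python
import numpy as np

def dropout_forward(a, keep_prob=0.8, training=True):
    """Inverted dropout: zero each unit with probability 1 - keep_prob, rescale the rest."""
    if not training:
        return a                                  # dropout is disabled at test time
    mask = np.random.rand(*a.shape) < keep_prob   # 1 = keep, 0 = drop
    return a * mask / keep_prob                   # rescale so the expected activation is unchanged
```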
$G[m,n]=(f*h)[m,n]=\sum_{j}\sum_{k} h[j,k]\,f[m-j,n-k]$
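A direct NumPy transcription of the sum above, restricted to the "valid" output region; the function name `conv2d_valid` is an assumption. Flipping the kernel $h$ and sliding it over $f$ reproduces exactly the sums in the formula:

```python
import numpy as np

def conv2d_valid(f, h):
    """G[m,n] = sum_j sum_k h[j,k] * f[m-j, n-k], evaluated on the 'valid' region only."""
    h_flip = h[::-1, ::-1]              # 180-degree flip of the kernel
    kh, kw = h.shape
    out_h, out_w = f.shape[0] - kh + 1, f.shape[1] - kw + 1
    G = np.zeros((out_h, out_w))
    for m in range(out_h):
        for n in range(out_w):
            # sliding the flipped kernel over the image reproduces the double sum
            G[m, n] = np.sum(f[m:m + kh, n:n + kw] * h_flip)
    return G

# Sanity check (if SciPy is available):
#   from scipy.signal import convolve2d
#   convolve2d(f, h, mode="valid") should match conv2d_valid(f, h)
```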
Padding ("same" convolution, stride 1, odd $f$): $p = (f-1)/2$
Striding
$n_{out}=\left\lfloor\frac{n_{in}+2p-f}{s}\right\rfloor+1$
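A small helper tying the padding, stride, and output-size relations together; the parameter names and example sizes are illustrative:

```python
def conv_output_size(n_in, f, p, s):
    """n_out = floor((n_in + 2p - f) / s) + 1"""
    return (n_in + 2 * p - f) // s + 1

def same_padding(f):
    """p = (f - 1) / 2 (odd filter size, stride 1)."""
    return (f - 1) // 2

# A 5x5 filter over a 28x28 input with 'same' padding and stride 1 keeps 28x28:
assert conv_output_size(28, 5, same_padding(5), 1) == 28
# The same filter with no padding and stride 2 shrinks the map to 12x12:
assert conv_output_size(28, 5, 0, 2) == 12
```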
The third dimension
Tensor Dimensions
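A shape-only sketch of how the third (channel) dimension flows through a convolutional layer; the concrete sizes are assumptions for illustration:

```python
import numpy as np

n_h, n_w, n_c_in = 32, 32, 3              # input: height x width x channels (e.g. an RGB image)
f, n_c_out, p, s = 3, 16, 1, 1            # 16 filters of size 3x3, 'same' padding, stride 1

x = np.zeros((n_h, n_w, n_c_in))          # input tensor
W = np.zeros((f, f, n_c_in, n_c_out))     # each filter spans all input channels
b = np.zeros((n_c_out,))                  # one bias per filter

n_h_out = (n_h + 2 * p - f) // s + 1
n_w_out = (n_w + 2 * p - f) // s + 1
print((n_h_out, n_w_out, n_c_out))        # (32, 32, 16): one output channel per filter
```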
Connection Cutting and Parameter Sharing
Two important features of CNNs
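A back-of-the-envelope comparison that makes both features concrete: cutting connections and sharing one small filter across all spatial positions reduces the parameter count by several orders of magnitude. The layer sizes are illustrative assumptions:

```python
# One 32x32x3 input mapped to a 32x32x16 output:
dense_params = (32 * 32 * 3) * (32 * 32 * 16)   # fully connected: every input to every output
conv_params  = (3 * 3 * 3) * 16 + 16            # convolutional: 16 shared 3x3x3 filters + biases
print(dense_params, conv_params)                # 50331648 448
```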
Convolutional Layer Backpropagation
$dZ^{[l]}=dA^{[l]}\odot g'(Z^{[l]})$, where $\odot$ denotes the element-wise product
$dA+=\sum_{m=0}^{n_h}\sum_{n=0}^{n_w}W\cdot dZ[m,n]$
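A minimal sketch of the accumulation above for a single-channel, stride-1, unpadded layer, using the cross-correlation convention common in deep-learning code: each $dZ[m,n]$ sends the shared filter $W$ back onto the input window it was computed from. Names and shapes are assumptions; bias and activation gradients are omitted:

```python
import numpy as np

def conv_backward(dZ, W, A):
    """Gradients of a single-channel, stride-1, unpadded convolutional layer."""
    f = W.shape[0]
    n_h, n_w = dZ.shape
    dA = np.zeros_like(A)
    dW = np.zeros_like(W)
    for m in range(n_h):
        for n in range(n_w):
            # each output gradient flows back through the shared filter ...
            dA[m:m + f, n:n + f] += W * dZ[m, n]
            # ... and the filter gradient accumulates the input window it saw
            dW += A[m:m + f, n:n + f] * dZ[m, n]
    return dA, dW
```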
Pooling Layer
Pooling Layer Backpropagation
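A sketch of the two standard cases: max pooling routes the gradient only to the position that held the maximum in each window, while average pooling spreads it evenly. Non-overlapping windows (stride equal to the window size) are assumed:

```python
import numpy as np

def pool_backward(dOut, x, size=2, mode="max"):
    """Backprop through non-overlapping pooling (stride equal to the window size)."""
    dX = np.zeros_like(x)
    for i in range(dOut.shape[0]):
        for j in range(dOut.shape[1]):
            window = x[i*size:(i+1)*size, j*size:(j+1)*size]
            if mode == "max":
                mask = (window == window.max())              # 1 only where the max was taken
                dX[i*size:(i+1)*size, j*size:(j+1)*size] += mask * dOut[i, j]
            else:                                            # average pooling
                dX[i*size:(i+1)*size, j*size:(j+1)*size] += dOut[i, j] / (size * size)
    return dX
```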