Convolutional Neural Networks

  • Overview
  • Alternative View
  • More detail about CNNs

Dropout: A Simple Way to Prevent Neural Networks from Overfitting

  • A randomly chosen set of units (i.e. neurons) is ignored with probability $1-p$ during the training phase
  • These units are not considered during a particular forward or backward pass

  • Forces the network to learn more robust features that are useful in conjunction with many different random subsets of the other neurons
  • Roughly doubles the number of iterations required to converge; however, training time per epoch is shorter (see the sketch after this list)
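As a rough illustration of the bullets above, here is a minimal NumPy sketch of inverted dropout; the function names and the keep probability p=0.8 in the example are illustrative assumptions, not from the slides.

    import numpy as np

    def dropout_forward(x, p=0.5, training=True, rng=None):
        # Keep each unit with probability p (i.e. drop it with probability 1 - p)
        # and rescale by 1/p so no change is needed at test time (inverted dropout).
        if not training:
            return x, None
        rng = np.random.default_rng() if rng is None else rng
        mask = (rng.random(x.shape) < p) / p
        return x * mask, mask

    def dropout_backward(dout, mask):
        # Dropped units are not considered in the backward pass either:
        # they receive no gradient.
        return dout * mask

    # Example: a hidden activation of shape (batch, units)
    h = np.ones((4, 6))
    h_drop, mask = dropout_forward(h, p=0.8, training=True)

Because the mask is resampled on every forward pass, each iteration effectively trains a different random sub-network, which is what makes the learned features robust to missing co-adapted neurons.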

Data Augmentation

Early Stopping

Well Known ConvNets

  • LeNet
  • AlexNet
  • ZF Net
  • GoogLeNet
  • VGGNet
  • ResNet

Applications

Machine Learning Applications and Practices