Basic Deep Learning Concepts Good for You

 


Hard concepts are bolded.

  • supervised / unsupervised / semi-supervised / weakly-supervised learning
  • weight initialization
  • learning rate decay
  • dropout
  • forward propagation (inference) / backward propagation (see the sketch after this list)
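
These pieces fit together in a single training step. Below is a minimal NumPy sketch (all variable names are illustrative): He-style weight initialization, a forward pass with inverted dropout, inverse-time learning rate decay, and a hand-derived backward pass for one linear layer under an L2 loss.

```python
import numpy as np

rng = np.random.default_rng(0)

# weight initialization: He init, a common heuristic for ReLU layers
n_in, n_out = 4, 3
W = rng.normal(0.0, np.sqrt(2.0 / n_in), size=(n_in, n_out))
b = np.zeros(n_out)

def forward(x, train=True, p_drop=0.5):
    """Forward propagation; train=False corresponds to inference."""
    h = np.maximum(0.0, x @ W + b)                  # linear layer + ReLU
    if train:
        # inverted dropout: zero units with prob p_drop, rescale survivors
        mask = (rng.random(h.shape) > p_drop) / (1.0 - p_drop)
        h = h * mask
    return h

# learning rate decay: shrink the step size as training progresses
base_lr, decay_rate, step = 0.1, 0.01, 100
lr = base_lr / (1.0 + decay_rate * step)            # inverse-time decay

# backward propagation for the linear layer under an L2 loss
x = rng.normal(size=(2, n_in))
y = rng.normal(size=(2, n_out))
h = x @ W + b                                       # forward (activation omitted)
dL_dh = (h - y) / h.size                            # d/dh of 0.5 * mean((h - y)**2)
dW, db = x.T @ dL_dh, dL_dh.sum(axis=0)             # chain rule to the parameters
W -= lr * dW
b -= lr * db
```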

Activation

  • What an activation layer is and why we use it
  • ReLU, Leaky ReLU
  • softmax
  • sigmoid
  • Difference between softmax and sigmoid (see the sketch after this list)
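
A small NumPy sketch of these activations. The last two lines illustrate the softmax/sigmoid difference: sigmoid maps each score to a probability independently, while softmax normalizes the scores into a single distribution over classes.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # keeps a small slope for x < 0, so negative units are not "dead"
    return np.where(x > 0, x, alpha * x)

def sigmoid(x):
    # squashes each score independently into (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    # turns a score vector into one probability distribution over classes
    z = x - x.max(axis=-1, keepdims=True)   # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

scores = np.array([2.0, -1.0, 0.5])
print(sigmoid(scores))          # three independent probabilities
print(softmax(scores).sum())    # 1.0: a single distribution over three classes
```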

Loss

  • What a loss function is and why we use it
  • L1 loss, L2 loss (= MSE loss)
  • binary cross entropy
  • cross entropy
  • Difference between binary cross entropy and cross entropy
  • Why binary cross entropy pairs with sigmoid, and cross entropy with softmax (see the sketch after this list)
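
A NumPy sketch of the four losses (function names are illustrative). The pairing rule falls out of the activations: sigmoid gives independent per-output probabilities, which is what binary cross entropy expects, while softmax gives one distribution over mutually exclusive classes, which is what cross entropy expects.

```python
import numpy as np

def l1_loss(pred, target):
    return np.abs(pred - target).mean()

def l2_loss(pred, target):                       # also called MSE loss
    return ((pred - target) ** 2).mean()

def binary_cross_entropy(p, y, eps=1e-12):
    # p: sigmoid outputs in (0, 1); y: a 0/1 label per independent output
    p = np.clip(p, eps, 1.0 - eps)
    return -(y * np.log(p) + (1 - y) * np.log(1 - p)).mean()

def cross_entropy(probs, y_idx, eps=1e-12):
    # probs: softmax outputs, one distribution per row; y_idx: true class index
    probs = np.clip(probs, eps, 1.0 - eps)
    return -np.log(probs[np.arange(len(y_idx)), y_idx]).mean()

# sigmoid + BCE: each output answers an independent yes/no question
print(binary_cross_entropy(np.array([0.9, 0.2]), np.array([1, 0])))
# softmax + CE: outputs form one distribution over mutually exclusive classes
print(cross_entropy(np.array([[0.7, 0.2, 0.1]]), np.array([0])))
```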

Networks

  • CNN (Convolutional Neural Networks)
    • deconvolution layer (transpose convolution)
    • dilated convolution
  • RNN (Recurrent Neural Networks)
  • residual connection
  • U-Net (see the sketch after this list)
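
A minimal PyTorch sketch tying several of these together (assuming PyTorch is installed; ResidualBlock is an illustrative name, not a library class): a residual connection around two convolutions, one of them dilated, plus a transpose convolution for the learned upsampling used in U-Net-style decoders.

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Two convolutions with a residual (skip) connection: out = F(x) + x."""
    def __init__(self, ch):
        super().__init__()
        self.conv1 = nn.Conv2d(ch, ch, kernel_size=3, padding=1)
        # dilated convolution: larger receptive field, same output size
        self.conv2 = nn.Conv2d(ch, ch, kernel_size=3, padding=2, dilation=2)
        self.relu = nn.ReLU()

    def forward(self, x):
        out = self.relu(self.conv1(x))
        out = self.conv2(out)
        return self.relu(out + x)   # the skip connection eases gradient flow

x = torch.randn(1, 8, 32, 32)
print(ResidualBlock(8)(x).shape)    # torch.Size([1, 8, 32, 32])

# transpose ("de")convolution: learned upsampling, 32x32 -> 64x64,
# as used in the decoder half of a U-Net
up = nn.ConvTranspose2d(8, 8, kernel_size=2, stride=2)
print(up(x).shape)                  # torch.Size([1, 8, 64, 64])
```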

Train with less data

  • data augmentation
  • transfer learning (see the sketch after this list)
  • semi-supervised / weakly-supervised learning
  • domain adaptation
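
As a sketch of transfer learning with torchvision (the weights argument assumes a recent torchvision; the 10-class head is a made-up example): load an ImageNet-pretrained backbone, freeze its features, and train only a new classification head. This often works well when labeled data is scarce.

```python
import torch
import torch.nn as nn
from torchvision import models

# load an ImageNet-pretrained backbone (weights= assumes torchvision >= 0.13)
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# freeze the pretrained features so only the new head is trained
for param in model.parameters():
    param.requires_grad = False

# replace the final layer for a hypothetical 10-class target task;
# a freshly created layer has requires_grad=True by default
model.fc = nn.Linear(model.fc.in_features, 10)

# the optimizer only needs to see the trainable head parameters
optimizer = torch.optim.SGD(model.fc.parameters(), lr=1e-3)
```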

Normalization

(Figure: normalization_method.png, a comparison of normalization methods)

  • Batch Norm
  • Layer Norm
  • Instance Norm
  • Group Norm (see the sketch after this list)
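
All four methods apply the same zero-mean, unit-variance normalization and differ only in which axes of an (N, C, H, W) tensor they average over. A NumPy sketch, with the learnable scale and shift parameters omitted for brevity:

```python
import numpy as np

def normalize(x, axes, eps=1e-5):
    # zero mean, unit variance over the given axes (scale/shift omitted)
    mu = x.mean(axis=axes, keepdims=True)
    var = x.var(axis=axes, keepdims=True)
    return (x - mu) / np.sqrt(var + eps)

x = np.random.randn(8, 6, 4, 4)       # (N, C, H, W)

bn = normalize(x, (0, 2, 3))          # Batch Norm: per channel, across the batch
ln = normalize(x, (1, 2, 3))          # Layer Norm: per sample, across C, H, W
in_ = normalize(x, (2, 3))            # Instance Norm: per sample and channel

# Group Norm: split channels into groups, normalize within each (sample, group)
G = 2
gn = normalize(x.reshape(8, G, 6 // G, 4, 4), (2, 3, 4)).reshape(x.shape)
```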

Sites

Lectures

CS231n

모두를 위한 딥러닝 (Deep Learning for Everyone)