NNet starter

By Alexandre Allauzen

The drive with all the resources is here.

1 Expected road-map

1.1 First day : 20/06

1.1.1 10-12, Course

Basics and reminders:

  • Neural Net basics: forward and backward
  • Computation graph and implementation overview
  • Training algorithm
  • Deep Learning: vanishing gradient
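To make the forward/backward idea concrete before the course, here is a minimal sketch of a one-hidden-layer network with a hand-written backward pass (sizes, data, and learning rate are arbitrary placeholders, not the course's actual exercise):

```python
import numpy as np

# One hidden layer: forward pass, then backward pass via the chain rule.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))        # batch of 4 inputs, 3 features
y = rng.normal(size=(4, 1))        # regression targets
W1 = rng.normal(size=(3, 5)); b1 = np.zeros(5)
W2 = rng.normal(size=(5, 1)); b2 = np.zeros(1)

# Forward: linear -> sigmoid -> linear -> squared loss
h_pre = x @ W1 + b1
h = 1.0 / (1.0 + np.exp(-h_pre))   # sigmoid activation
y_hat = h @ W2 + b2
loss = np.mean((y_hat - y) ** 2)

# Backward: propagate gradients layer by layer
d_yhat = 2.0 * (y_hat - y) / y.shape[0]
dW2 = h.T @ d_yhat
db2 = d_yhat.sum(axis=0)
dh = d_yhat @ W2.T
dh_pre = dh * h * (1.0 - h)        # derivative of the sigmoid
dW1 = x.T @ dh_pre
db1 = dh_pre.sum(axis=0)

# One gradient-descent step
lr = 0.01
W1 -= lr * dW1; b1 -= lr * db1
W2 -= lr * dW2; b2 -= lr * db2
```

A framework like PyTorch builds the computation graph and performs this backward pass automatically.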

NNet for NLP:

  • Word embedding
  • Composition: Document classification
  • Convolution
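The embedding-plus-composition idea can be sketched in a few lines of PyTorch: embed each word, average the embeddings to compose a document vector, and classify it. Vocabulary size, dimensions, and class count below are arbitrary:

```python
import torch
import torch.nn as nn

vocab_size, emb_dim, n_classes = 100, 16, 2

class BagOfEmbeddings(nn.Module):
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.fc = nn.Linear(emb_dim, n_classes)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) integer word indices
        e = self.emb(token_ids)    # (batch, seq_len, emb_dim)
        doc = e.mean(dim=1)        # compose: average over the words
        return self.fc(doc)        # (batch, n_classes) logits

model = BagOfEmbeddings()
logits = model(torch.randint(0, vocab_size, (3, 7)))  # 3 docs, 7 tokens each
```

Replacing the mean with a convolution over the embedding sequence gives the convolutional variant listed above.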

1.1.2 13:30-15:30, Course

Sequence processing:

  • Recurrent, LSTM and GRU
  • Bi-LSTM
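As a preview of the sequence-processing session, a bidirectional LSTM in PyTorch looks like this (the dimensions are illustrative placeholders):

```python
import torch
import torch.nn as nn

# Encode a batch of sequences with a bidirectional LSTM.
lstm = nn.LSTM(input_size=8, hidden_size=16,
               batch_first=True, bidirectional=True)
x = torch.randn(4, 10, 8)       # batch=4, seq_len=10, features=8
out, (h_n, c_n) = lstm(x)
# out: (4, 10, 32) -- forward and backward hidden states concatenated
# h_n: (2, 4, 16) -- final hidden state of each direction
```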

Sequence to Sequence:

  • Case study: Machine Translation
  • Encoder-Decoder
  • Attention model
  • Transformer network
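The attention mechanism at the heart of the Transformer can be sketched as scaled dot-product attention (single head, random placeholder queries/keys/values):

```python
import math
import torch

def attention(q, k, v):
    # Compare each query with every key, normalize, and mix the values.
    scores = q @ k.transpose(-2, -1) / math.sqrt(q.size(-1))
    weights = torch.softmax(scores, dim=-1)   # each row sums to 1
    return weights @ v, weights

q = torch.randn(2, 5, 8)   # (batch, query_len, dim)
k = torch.randn(2, 6, 8)   # (batch, key_len, dim)
v = torch.randn(2, 6, 8)
out, w = attention(q, k, v)
```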

1.1.3 16-18, Lab Session

First steps in PyTorch:

  • Image classifier (MNIST)
  • Sentence classifier
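A skeleton of the first lab exercise, an MNIST classifier, might look as follows (a random batch stands in for the data here; the actual lab would load MNIST, e.g. via torchvision):

```python
import torch
import torch.nn as nn

# A simple fully-connected classifier for 28x28 grayscale digits.
model = nn.Sequential(
    nn.Flatten(),                 # (batch, 1, 28, 28) -> (batch, 784)
    nn.Linear(28 * 28, 128),
    nn.ReLU(),
    nn.Linear(128, 10),           # 10 digit classes
)
images = torch.randn(32, 1, 28, 28)
labels = torch.randint(0, 10, (32,))
loss = nn.CrossEntropyLoss()(model(images), labels)
loss.backward()                   # gradients are now stored in the parameters
```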

1.2 Second day : 21/06

1.2.1 9-12, Course

Advanced models and algorithms:

  • Large Vocabulary, issues and solutions
  • Training/Decoding for Sequence to Sequence
  • Variational Auto-Encoder

1.2.2 13-16, Lab Session

Generative models for NLP:

  • Recurrent language model
  • Machine Translation
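A recurrent language model of the kind built in this lab can be sketched as: embed the tokens, run an RNN, and predict the next token at every position (sizes below are placeholders):

```python
import torch
import torch.nn as nn

class RNNLM(nn.Module):
    def __init__(self, vocab_size=50, emb_dim=16, hidden=32):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.GRU(emb_dim, hidden, batch_first=True)
        self.out = nn.Linear(hidden, vocab_size)

    def forward(self, tokens):
        h, _ = self.rnn(self.emb(tokens))
        return self.out(h)   # next-token logits at each position

lm = RNNLM()
tokens = torch.randint(0, 50, (2, 9))   # 2 sequences of 9 tokens
logits = lm(tokens)                     # (2, 9, 50)
# Training shifts by one step: predict tokens[:, 1:] from logits[:, :-1]
loss = nn.CrossEntropyLoss()(
    logits[:, :-1].reshape(-1, 50), tokens[:, 1:].reshape(-1))
```

The same encoder, paired with a second decoder network, is the starting point for the machine-translation exercise.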

2 NNet basics

To introduce neural networks, start with the videos from Hugo Larochelle. The roadmap is:

  • Capsules 1.1 to 1.6: the artificial neuron and the feed-forward architecture (definition)
  • Capsules 2.1 to 2.11: training basics

3 Pytorch basics

Download this notebook and run it with Jupyter. To get the tools, you can install anaconda3 and then PyTorch.
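As a first taste of what the notebook covers, PyTorch tensors behave much like numpy arrays but can track gradients automatically (a minimal sketch):

```python
import torch

x = torch.tensor([2.0, 3.0], requires_grad=True)
y = (x ** 2).sum()     # y = 4 + 9 = 13
y.backward()           # autograd computes dy/dx = 2x
print(x.grad)          # tensor([4., 6.])
```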

PyTorch is a Python module. If you need a Python refresher:

Numpy is one of the core libraries for scientific computing in Python. It provides a high-performance multidimensional array object and tools for working with these arrays.
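For example, numpy arrays support vectorized, elementwise arithmetic and axis-wise reductions:

```python
import numpy as np

a = np.array([[1, 2, 3], [4, 5, 6]])
print(a.shape)        # (2, 3)
print(a * 2)          # elementwise: [[ 2  4  6], [ 8 10 12]]
print(a.sum(axis=0))  # column sums: [5 7 9]
```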

Using Python can be easier with IPython; have a look at this tutorial: http://cs231n.github.io/ipython-tutorial/. If you prefer a more standard IDE: https://www.jetbrains.com/pycharm-edu/