2022-2023: Deep-Learning and Natural Language Processing, IASD
This course is an introduction to Natural Language Processing (with deep-learning methods). *It is over now!*
News
- The course starts on the 10th of January: 8h30 at PariSantéCampus
- There will be 3 guest lectures
Registration for the readings and projects
Slides and resources
Expected schedule
The course starts in January 2023 (the 10th). Sessions are scheduled on Tuesdays, starting at 8:30 in the morning.
10-jan, course:
NLP, overview and the main tasks
- For the linguistic part, you can refer to this web site
- For the NLP basics: see this book
Text classification with NNet
The basics and a first NNet with W2V (a minimal code sketch follows the readings below)
Further and essential readings:
- Natural Language Processing (Almost) from Scratch
- A Neural Probabilistic Language Model
- A Primer on Neural Network Models for Natural Language Processing
For word2vec and fasttext:
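To make this concrete, here is a minimal sketch (not the official lab code) that trains word2vec with gensim on a made-up toy corpus and then classifies sentences by averaging their word vectors, in the spirit of the "first NNet with W2V" session. The corpus, labels, and hyperparameters below are illustrative assumptions.

```python
# Minimal sketch: word2vec with gensim + a simple classifier over averaged
# word vectors. Everything here (corpus, labels, hyperparameters) is a toy
# example for illustration only.
import numpy as np
from gensim.models import Word2Vec
from sklearn.linear_model import LogisticRegression

corpus = [
    ["the", "movie", "was", "great"],
    ["the", "movie", "was", "terrible"],
    ["a", "wonderful", "film"],
    ["a", "boring", "film"],
]
labels = [1, 0, 1, 0]  # toy sentiment labels

# Train a small word2vec model (gensim >= 4 uses `vector_size`).
w2v = Word2Vec(corpus, vector_size=50, window=2, min_count=1, epochs=50, seed=0)

def sentence_vector(tokens):
    """Average the word vectors of a sentence (ignoring unknown words)."""
    vecs = [w2v.wv[t] for t in tokens if t in w2v.wv]
    return np.mean(vecs, axis=0) if vecs else np.zeros(w2v.vector_size)

X = np.stack([sentence_vector(s) for s in corpus])
clf = LogisticRegression().fit(X, labels)
print(clf.predict(X))
```

For fasttext, gensim also provides a FastText class with a very similar interface, so the same sketch can be adapted with subword information.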
17-jan, course: sequence models
- Language modelling
- n-gram language models (a toy counting sketch follows the readings below)
- recurrent models
Some interesting readings:
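As a small illustration of the n-gram language model idea (my own toy example, not course material), the sketch below estimates bigram probabilities by counting, with add-one smoothing. The corpus is invented.

```python
# Toy sketch: a bigram language model estimated by counting,
# with add-one (Laplace) smoothing.
from collections import Counter

corpus = "the cat sat on the mat the dog sat on the rug".split()
vocab = set(corpus)

unigrams = Counter(corpus)
bigrams = Counter(zip(corpus, corpus[1:]))

def p_bigram(w, prev):
    """P(w | prev) with add-one smoothing."""
    return (bigrams[(prev, w)] + 1) / (unigrams[prev] + len(vocab))

print(p_bigram("cat", "the"))   # relatively high: "the cat" occurs in the corpus
print(p_bigram("rug", "cat"))   # low: "cat rug" never occurs
```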
24-jan, course on Advanced models
- The end of recurrent models, with LSTMs
- Bi-LSTM, Attention
- Transformer (a scaled dot-product attention sketch follows the readings below)
Some readings:
- Vanishing Gradient
- ELMO paper
- ULMFit paper
- The paper that introduced attention
- You can also look at the BERT paper and the Transformer paper, even if I found them not very easy to read.
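As a companion to the attention and Transformer readings above, here is a minimal sketch of scaled dot-product attention, the core operation of the Transformer. The tensor shapes and random inputs are illustrative assumptions; this is not the full multi-head layer.

```python
# Minimal sketch of scaled dot-product attention:
# Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(Q, K, V):
    d_k = Q.size(-1)
    scores = Q @ K.transpose(-2, -1) / d_k ** 0.5   # (seq_q, seq_k)
    weights = F.softmax(scores, dim=-1)             # each row sums to 1
    return weights @ V, weights

# One toy "sentence" of 3 tokens with dimension 4 (self-attention: Q = K = V).
x = torch.randn(3, 4)
out, attn = scaled_dot_product_attention(x, x, x)
print(out.shape, attn.shape)  # torch.Size([3, 4]) torch.Size([3, 3])
```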
31-jan, course on Representation learning and contrastive estimation
by Matthieu Labeau
07-feb, course on Syntax!
by Benoit Crabbé
14-feb, Readings
14-march, course on model probing
by Guillaume Wisniewski
Evaluation
The evaluation is in two parts. For both, first form your team (typically 2-3 students).
Reading
The goal is to read an article and to give a presentation (on the 14-feb). A list will be available soon, but you can also propose one (I must agree beforehand). Select one article per team, read and analyse it, and prepare a clear and synthetic presentation. Some questions you may use to guide your reading are (among others):
- Did you like the paper? Did you find it interesting? Be honest!
- What are the most important things you learned from the paper? Why are they important?
- Do the lessons learned generalize beyond the specific task? Do they contribute towards building an important system or application?
- Is the experimental setup satisfying? Any experiments missing? Any obvious or important baseline missing?
- Is the problem/approach well motivated?
- Are you convinced by the results? Why?
- Is the writing clear? Is the paper well structured?
The important dates are:
- Make up your team and select the paper before the 1-February
- Presentation: the 14-February
Project
A list is available, but you can also propose one (I must agree beforehand).
- Team and project registration: before 1-feb
- Deliverable for the XXX: 2 pages (pdf only) to describe the data, the task and your plan
- Deliverable for XXX: a github/gitlab repository
- Final deliverable: a report in pdf and the code via the git repos
Final deadline: XXX
Feel free to use the Teams channel to interact with me or with the other groups.