Natural Language Processing and Translation
| Code | Completion | Credits | Range | Language |
|---|---|---|---|---|
| BECM36NLPT | Z,ZK | 6 | 2P+2C | English |
- Course guarantor: Ondřej Bojar
- Lecturer: Ondřej Bojar
- Tutor: Ondřej Bojar
- Supervisor: Department of Computer Science
- Synopsis:

The course covers the area of natural language processing (NLP) through an in-depth focus on the task of machine translation (MT).
- Requirements:
- Syllabus of lectures:
1. NLP Experiment. Methods of Evaluation. Evaluating Text Generation Tasks, including Machine Translation (a BLEU scoring sketch follows this list).
2. Layers of Linguistic Analysis. Overview of Approaches to Machine Translation.
3. Processing Text with Neural Networks. Neural Machine Translation.
4. Data Acquisition and Preparation. Alignment.
5. Classical AI Search Space. Phrase-Based Machine Translation.
6. Search Space Factorization. Morphology in Machine Translation.
7. Syntactic Analysis. Structural Approaches to Machine Translation.
8. Transformer Architecture. Caveats on Interpreting Results.
9. Word and Sentence Representations.
10. Transfer Learning in NLP. Multi-lingual Models. Multi-lingual Machine Translation.
11. Multi-modality and NLP. Non-text Modalities in Translation.
12. Pre-trained Models, Large Language Models, Instruction Tuning. Emergent Properties.
13. Self-Learning in NLP. Unsupervised Machine Translation.
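For the evaluation topic of lecture 1, a minimal sketch of corpus-level BLEU scoring follows, assuming the widely used sacrebleu package is installed; the hypothesis and reference sentences are invented purely for illustration.

```python
# Minimal BLEU evaluation sketch (lecture 1), assuming `pip install sacrebleu`;
# the hypothesis and reference sentences below are invented for illustration.
import sacrebleu

hypotheses = [
    "the cat sat on the mat",
    "there is a dog in the garden",
]
# One reference translation per hypothesis; sacrebleu accepts multiple
# reference streams, hence the outer list in the call below.
references = [
    "the cat sat on the mat",
    "a dog is in the garden",
]

bleu = sacrebleu.corpus_bleu(hypotheses, [references])
print(f"BLEU = {bleu.score:.2f}")  # corpus-level score on a 0-100 scale
```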
- Syllabus of tutorials:
- Study Objective:
- Study materials:
Koehn, Philipp (2009). Statistical Machine Translation. Cambridge University Press.
Koehn, Philipp (2020). Neural Machine Translation. Cambridge University Press.
Bojar, Ondřej (2012). Čeština a strojový překlad: Strojový překlad našincům, našinci strojovému překladu [Czech and Machine Translation: Machine Translation for Our People, Our People for Machine Translation]. ISBN 978-80-904571-4-0. 168 pp.
Kocmi, Tom; Macháček, Dominik; Bojar, Ondřej (2021). The Reality of Multi-Lingual Machine Translation. ISBN 978-80-88132-11-0. 191 pp.
Vaswani, Ashish, et al. (2017). Attention Is All You Need. Advances in Neural Information Processing Systems.
Rogers, Anna; Kovaleva, Olga; Rumshisky, Anna (2020). A Primer in BERTology: What We Know About How BERT Works. Transactions of the Association for Computational Linguistics, 8: 842–866.
Ouyang, Long, et al. (2022). Training Language Models to Follow Instructions with Human Feedback. Advances in Neural Information Processing Systems, 35: 27730–27744.
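The Vaswani et al. paper above (covered in lecture 8) centers on scaled dot-product attention, Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. A minimal NumPy sketch of that formula follows; the toy dimensions are invented for illustration.

```python
# Minimal sketch of scaled dot-product attention (Vaswani et al., 2017):
# Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V.
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    d_k = Q.shape[-1]
    # Similarity of every query to every key, scaled to keep the softmax stable.
    scores = Q @ K.T / np.sqrt(d_k)
    # Row-wise softmax turns scores into attention weights summing to 1.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output row is a weighted average of the value vectors.
    return weights @ V

rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))  # 4 query positions, d_k = 8 (toy sizes)
K = rng.normal(size=(6, 8))  # 6 key positions
V = rng.normal(size=(6, 8))  # one value vector per key
print(scaled_dot_product_attention(Q, K, V).shape)  # (4, 8)
```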
- Note:
- Further information:
- No timetable has been prepared for this course.
- The course is a part of the following study plans: