Research group

Machine Learning Applications and Deep Learning


6 March 2019

Many natural language processing tasks are solved most effectively by fine-tuning a model that has been pretrained on a large corpus of general-purpose text.

At the seminar, we will discuss Google's BERT model, which uses a pretrained transformer to achieve state-of-the-art results on a wide range of natural language understanding tasks.
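The pretrain-then-fine-tune pattern discussed above can be sketched as follows. This is a toy illustration in PyTorch, not the actual BERT model: the small transformer encoder stands in for a large pretrained network, and the classifier head and dimensions are hypothetical choices for the sketch.

```python
import torch
import torch.nn as nn

# Stand-in for a pretrained encoder; in practice this would be a large
# model such as BERT, loaded from a published checkpoint.
encoder_layer = nn.TransformerEncoderLayer(d_model=32, nhead=4, batch_first=True)
pretrained_encoder = nn.TransformerEncoder(encoder_layer, num_layers=2)

class Classifier(nn.Module):
    """Fine-tuning pattern: a small task-specific head on top of the
    encoder's representation of the first ([CLS]-style) token."""

    def __init__(self, encoder, num_classes=2):
        super().__init__()
        self.encoder = encoder
        self.head = nn.Linear(32, num_classes)

    def forward(self, x):
        h = self.encoder(x)        # (batch, seq_len, d_model)
        return self.head(h[:, 0])  # classify from the first token

model = Classifier(pretrained_encoder)
logits = model(torch.randn(3, 10, 32))  # batch of 3 sequences of length 10
print(logits.shape)                     # torch.Size([3, 2])
```

During fine-tuning, both the head and (optionally) the encoder weights are updated on the downstream task's labeled data, which is typically far smaller than the pretraining corpus.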

Speaker: Andrei Gusev.

Presentation language: Russian.

Date and time: March 6th, 18:30-20:00.

Location: Times, room 204.

Videos from previous seminars are available at