Many natural language processing tasks are solved most effectively by fine-tuning a model pretrained on a large corpus of general-purpose text.
At the seminar, we will discuss BERT, a model from Google that uses a pretrained Transformer to achieve state-of-the-art results on a broad range of language understanding tasks.
Speaker: Andrei Gusev.
Presentation language: Russian.
Date and time: March 6th, 18:30-20:00.
Location: Times, room 204.
Videos from previous seminars are available at http://bit.ly/MLJBSeminars
29 May 2019: ICLR 2019 Overview
22 May 2019: Segmentation in 2019. The fastest and the most accurate.
8 May 2019: Speech recognition and speech synthesis
17 April 2019: Open Questions about Generative Adversarial Networks
10 April 2019: Adaptive Sampled Softmax with Kernel Based Sampling