Training Neural Networks with Local Error Signals

Supervised training of neural networks for classification is typically performed with a global loss function. Classic back-propagation has unpleasant side effects, such as "backward locking" (earlier layers are locked until gradients have been computed for all later layers), as well as other problems that cannot be resolved without changing the training algorithm. An alternative approach is to train the network with layer-wise loss functions.
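
To make the idea concrete, here is a minimal illustrative sketch in PyTorch. It is not the exact method from the paper (which combines a local prediction loss with a similarity-matching loss); names such as LocalBlock and train_step are ours. Each block is trained against its own auxiliary classifier, and detach() keeps gradients from crossing block boundaries, so there is no global backward pass and hence no backward locking.

```python
# Simplified sketch of layer-wise training with local losses.
# Assumptions: each block gets a local cross-entropy head; the paper's
# similarity-matching loss is omitted for brevity.
import torch
import torch.nn as nn
import torch.nn.functional as F

class LocalBlock(nn.Module):
    def __init__(self, in_dim, out_dim, num_classes):
        super().__init__()
        self.layer = nn.Sequential(nn.Linear(in_dim, out_dim), nn.ReLU())
        self.aux_classifier = nn.Linear(out_dim, num_classes)  # local head

    def forward(self, x, y=None):
        h = self.layer(x)
        loss = None
        if y is not None:
            # Local prediction loss computed from this block's output only.
            loss = F.cross_entropy(self.aux_classifier(h), y)
        # Detach so the next block's backward pass never reaches this block.
        return h.detach(), loss

blocks = nn.ModuleList([
    LocalBlock(784, 256, 10),
    LocalBlock(256, 128, 10),
])
opts = [torch.optim.Adam(b.parameters(), lr=1e-3) for b in blocks]

def train_step(x, y):
    h = x
    for block, opt in zip(blocks, opts):
        h, loss = block(h, y)
        opt.zero_grad()
        loss.backward()  # backward pass stays inside this block
        opt.step()
    return h

# Example usage with random data shaped like flattened MNIST digits.
x = torch.randn(32, 784)
y = torch.randint(0, 10, (32,))
train_step(x, y)
```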

At the next seminar, we will discuss this new approach and how it can get close to the state of the art on a variety of image datasets.

Speaker: Oktai Tatanov.

Presentation language: Russian.

Date and time: January 30th, 18:30-20:00.

Location: Times, room 204.

Materials:
Nøkland, Arild, and Lars Hiller Eidnes. "Training Neural Networks with Local Error Signals." arXiv preprint arXiv:1901.06656 (2019).

Videos from previous seminars are available at http://bit.ly/MLJBSeminars