Deep residual networks (ResNets) have significantly pushed forward the state-of-the-art on image classification, increasing in performance as networks grow both deeper and wider. However, memory consumption becomes a bottleneck, as one needs to store the activations in order to calculate gradients using backpropagation. We present the Reversible Residual Network (RevNet), a variant of ResNets where each layer's activations can be reconstructed exactly from the next layer's.
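The reversible structure can be illustrated with a minimal sketch of one reversible block using additive coupling, where the input is split into two halves; the residual functions `F` and `G` below are toy placeholders, not the network's actual layers:

```python
import numpy as np

def F(x):
    return np.tanh(x)          # toy residual branch (placeholder)

def G(x):
    return np.tanh(2.0 * x)    # toy residual branch (placeholder)

def forward(x1, x2):
    # Forward pass: y1 = x1 + F(x2), y2 = x2 + G(y1)
    y1 = x1 + F(x2)
    y2 = x2 + G(y1)
    return y1, y2

def inverse(y1, y2):
    # Inputs are recovered exactly from the outputs,
    # so activations need not be stored for backpropagation.
    x2 = y2 - G(y1)
    x1 = y1 - F(x2)
    return x1, x2

x1 = np.random.randn(4)
x2 = np.random.randn(4)
y1, y2 = forward(x1, x2)
r1, r2 = inverse(y1, y2)
assert np.allclose(r1, x1) and np.allclose(r2, x2)
```

Because inversion only subtracts the same residual terms the forward pass added, reconstruction is exact up to floating-point error, which is what lets RevNets trade a little extra compute for a large saving in activation memory.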
At the seminar, we will talk about RevNets: how they reduce memory consumption and how they can help defend against adversarial attacks.
Speaker: Rauf Kurbanov.
Presentation language: Russian.
Date and time: September 17th, 8:00-9:30 pm.
Location: Times, room 204.
Videos from previous seminars are available at http://bit.ly/MLJBSeminars