The invention of superpositional linear-nonlinear models, widely known as neural networks, is one of the most significant events in the development of machine learning. Over the last decade it has become clear that using max-plus arithmetic (an important part of tropical mathematics) in the structure of convolutional neural networks (for ReLU activations and max-pooling layers) significantly improves their quantitative and qualitative characteristics and opens a new horizon in designing scalable deep architectures with adequate learning procedures that were not available in the previous generation of "shallow" neural models. We discuss new perspectives for superpositional linear-nonlinear modelling of consensus dynamics in complex distributed systems, including multi-agent and multi-team systems, and how tropical mathematics could resolve some intrinsic difficulties of the neural network approach as a whole.
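To make the connection between max-plus arithmetic and standard network layers concrete, here is a minimal sketch (not from the talk itself; the function names and the NumPy framing are illustrative assumptions). In the tropical (max-plus) semiring, "addition" is max and "multiplication" is +, so ReLU and max-pooling are both tropical sums:

```python
import numpy as np

# Tropical (max-plus) semiring: "addition" is max, "multiplication" is +.
def trop_add(a, b):
    return np.maximum(a, b)

def trop_mul(a, b):
    return a + b

# ReLU is the tropical sum of the input and the constant 0:
# ReLU(x) = max(x, 0) = x (+) 0 in max-plus notation.
def relu(x):
    return trop_add(x, 0.0)

# Max-pooling over a window of size k is the tropical sum of its entries.
def max_pool_1d(x, k):
    return np.array([x[i:i + k].max() for i in range(0, len(x) - k + 1, k)])

x = np.array([-1.0, 2.0, -3.0, 4.0])
relu(x)            # → [0., 2., 0., 4.]
max_pool_1d(x, 2)  # → [2., 4.]
```

Because an affine layer followed by ReLU computes a maximum of affine functions, a ReLU network as a whole is a tropical rational expression, which is the structural observation the abstract refers to.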
Speaker: Dmitry Nikolaev.
Presentation language: Russian.
Date and Time: February 5th, 18:30-20:00.
Place: Times, room 204.
Videos from previous seminars are available at http://bit.ly/MLJBSeminars