Symmetry and conservation laws in neural networks
- Date
- November 20 (Fri), 2020, 10:00 - 11:00 (JST)
- Speaker
- Hidenori Tanaka (Group Leader & Senior Scientist, Physics & Informatics Laboratories, NTT Research, Inc., USA / Visiting Scholar, Stanford University, USA)
- Venue
- via Zoom
- Language
- English
Symmetry is the central guiding principle in the exploration of the physical world, but it has been underutilized in understanding and engineering neural networks. We first identify simple yet powerful geometric properties imposed by symmetry. We then apply the theory to answer the following series of important questions: (i) What, if anything, can we quantitatively predict about the complex learning dynamics of real-world deep learning models trained on real-world datasets? (ii) How can we make deep learning models more efficient by removing parameters without disconnecting information flow? (iii) How can we distill experimentally testable neuroscientific hypotheses by reducing the complexity of deep learning models that mimic the brain? Overall, our approach demonstrates how the principles of symmetry and conservation laws can be harnessed to reduce the complexity of deep learning models and advance the science and engineering of biological and artificial neural networks.
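To give a concrete flavor of the symmetry-to-conservation-law connection the abstract alludes to (this sketch is not from the talk itself): each hidden ReLU unit has a rescale symmetry, since multiplying its incoming weights by a constant and dividing its outgoing weights by the same constant leaves the network function unchanged. A known consequence is that, under gradient flow, the difference between the squared norms of a unit's incoming and outgoing weights is conserved. The minimal NumPy sketch below, with all sizes and data chosen arbitrarily for illustration, checks that this quantity drifts only at second order in the step size under small-step gradient descent.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny two-layer ReLU network: y = W2 @ relu(W1 @ x), squared loss.
# The per-unit rescale symmetry implies gradient flow conserves
#   c_h = ||W1[h, :]||^2 - ||W2[:, h]||^2
# for each hidden unit h; with finite steps, c_h drifts only at O(lr^2).
n_in, n_hidden, n_out, n_samples = 5, 8, 3, 100
W1 = rng.normal(size=(n_hidden, n_in))
W2 = rng.normal(size=(n_out, n_hidden))
X = rng.normal(size=(n_in, n_samples))
Y = rng.normal(size=(n_out, n_samples))

def conserved(W1, W2):
    # Difference of incoming and outgoing squared weight norms, per unit.
    return (W1**2).sum(axis=1) - (W2**2).sum(axis=0)

c0 = conserved(W1, W2)
lr = 1e-3
for _ in range(500):
    Z = W1 @ X
    H = np.maximum(Z, 0.0)                      # ReLU activations
    E = W2 @ H - Y                              # residual of mean squared loss
    gW2 = (E @ H.T) / n_samples                 # dL/dW2
    gW1 = (((W2.T @ E) * (Z > 0)) @ X.T) / n_samples  # dL/dW1 (ReLU mask)
    W1 -= lr * gW1
    W2 -= lr * gW2

drift = np.abs(conserved(W1, W2) - c0).max()
print(drift)  # stays small relative to |c0| for small learning rates
```

The first-order change in each c_h cancels exactly because the incoming and outgoing gradients contract to the same scalar for a ReLU unit, so the residual drift printed above comes only from the lr² discretization error of gradient descent.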