Date & Time
Friday, November 20, 2020, 10:00 - 11:00 (JST)
Speaker
  • Hidenori Tanaka (Group Leader & Senior Scientist, Physics & Informatics Laboratories, NTT Research, Inc., USA / Visiting Scholar, Stanford University, USA)
Venue
  • via Zoom
Language
English

Symmetry is the central guiding principle in the exploration of the physical world, but it has been underutilized in understanding and engineering neural networks. We first identify simple yet powerful geometrical properties imposed by symmetry. We then apply the theory to answer the following questions: (i) What, if anything, can we quantitatively predict about the complex learning dynamics of real-world deep learning models driven by real-world datasets? (ii) How can we make deep learning models more efficient by removing parameters without disconnecting information flow? (iii) How can we distill experimentally testable neuroscientific hypotheses by reducing the complexity of deep learning models mimicking the brain? Overall, our approach demonstrates how we can harness the principles of symmetry and conservation laws to reduce deep learning models' complexity and make advances in the science and engineering of biological and artificial neural networks.
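As an illustration of the kind of symmetry-induced conservation law the abstract refers to (a standard textbook-style example, not drawn from the talk materials themselves): consider a ReLU network in which hidden neuron $i$ has incoming weights $w^{\mathrm{in}}_i$ and outgoing weights $w^{\mathrm{out}}_i$. The network output, and hence the loss $L$, is invariant under the per-neuron rescaling $w^{\mathrm{in}}_i \to \lambda\, w^{\mathrm{in}}_i$, $w^{\mathrm{out}}_i \to w^{\mathrm{out}}_i / \lambda$, so the directional derivative of $L$ along this symmetry direction vanishes:

$$\langle w^{\mathrm{in}}_i, \nabla_{w^{\mathrm{in}}_i} L \rangle \;-\; \langle w^{\mathrm{out}}_i, \nabla_{w^{\mathrm{out}}_i} L \rangle \;=\; 0 .$$

Under gradient flow $\dot{\theta} = -\nabla_\theta L$, this immediately yields a conserved quantity:

$$\frac{d}{dt}\Big( \|w^{\mathrm{in}}_i\|^2 - \|w^{\mathrm{out}}_i\|^2 \Big) \;=\; -2\Big( \langle w^{\mathrm{in}}_i, \nabla_{w^{\mathrm{in}}_i} L \rangle - \langle w^{\mathrm{out}}_i, \nabla_{w^{\mathrm{out}}_i} L \rangle \Big) \;=\; 0 ,$$

i.e. the difference between the squared norms of each neuron's incoming and outgoing weights is preserved throughout training.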

Related News