Date
December 8 (Tue), 2020, 13:00 - 14:00 (JST)
Speaker
Venue
  • via Zoom
Language
English

The practical updating process of deep neural networks based on stochastic gradient descent is quite similar to the stochastic dynamics described by the Langevin equation. For a Langevin system, we can "derive" the second law of thermodynamics, i.e., the total entropy of the system increases. This fact suggests a "second law of thermodynamics in deep learning." In this talk, I would like to explain this idea roughly. There will be no concrete new results, but I hope it may provide new perspectives for studying neural networks.
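The analogy in the abstract can be sketched numerically: an SGD-style update with Gaussian gradient noise is the Euler-Maruyama discretization of the overdamped Langevin equation. The following minimal sketch (not from the talk; the quadratic loss and the "temperature" parameter are illustrative assumptions) shows the correspondence on a one-parameter model.

```python
import numpy as np

# Illustrative toy loss (assumption, not from the talk):
# L(theta) = theta^2 / 2, so grad L(theta) = theta.
def grad_loss(theta):
    return theta

rng = np.random.default_rng(0)
eta = 0.01   # learning rate, playing the role of the time step dt
temp = 0.1   # effective "temperature" of the gradient noise

theta = 5.0
for _ in range(2000):
    # Noisy gradient step: theta <- theta - eta * grad L + sqrt(2 * eta * T) * xi.
    # This is the Euler-Maruyama discretization of the Langevin equation
    #   d theta = -grad L(theta) dt + sqrt(2 T) dW.
    theta += -eta * grad_loss(theta) + np.sqrt(2 * eta * temp) * rng.standard_normal()

# At long times, theta fluctuates around the minimum with variance of order T,
# the stationary (Gibbs) distribution exp(-L(theta) / T).
print(theta)
```

Under this identification, statements about entropy production in Langevin dynamics can be carried over to the training process, which is the sense in which a "second law" for deep learning is suggested.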

*For detailed information about the seminar, please refer to the email.
