Date/Time
December 8, 2020, 13:00 - 14:00 (JST)
Speaker
Akinori Tanaka (Senior Research Scientist, Interdisciplinary Theoretical and Mathematical Sciences Program)
Venue
via Zoom
Language
English

The practical updating process of deep neural networks based on stochastic gradient descent is quite similar to the stochastic dynamics described by the Langevin equation. For a Langevin system, one can "derive" the second law of thermodynamics, i.e., the total entropy of the system increases. This fact suggests a "second law of thermodynamics in deep learning." In this talk, I will explain this idea in rough terms; there will be no concrete new results, but I hope it may offer new perspectives for studying neural networks.
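The analogy above can be made concrete in a toy setting: if the minibatch gradient noise is modeled as Gaussian, the SGD update rule coincides with the Euler-Maruyama discretization of an overdamped Langevin equation, and the iterates eventually sample a Gibbs distribution exp(-L(w)/T). A minimal sketch (the quadratic loss, learning rate, and "temperature" are all illustrative choices, not from the talk):

```python
import numpy as np

rng = np.random.default_rng(0)

eta = 0.01  # learning rate, playing the role of the Langevin time step
T = 0.1     # effective "temperature" setting the gradient-noise scale
w = 5.0     # initial parameter, far from the minimum

# Toy loss L(w) = w**2 / 2, so grad L(w) = w. The update
#   w <- w - eta * grad L(w) + sqrt(2 * T * eta) * xi,  xi ~ N(0, 1)
# is SGD with Gaussian gradient noise, and equally the Euler-Maruyama
# discretization of  dw = -grad L(w) dt + sqrt(2T) dW.
samples = []
for step in range(60000):
    w = w - eta * w + np.sqrt(2 * T * eta) * rng.normal()
    if step >= 20000:  # discard burn-in, keep the stationary regime
        samples.append(w)

# At stationarity the iterates follow the Gibbs distribution
# p(w) ~ exp(-L(w)/T), which for this loss is N(0, T).
var = float(np.var(samples))
print(var)  # should be close to T = 0.1
```

The empirical variance of the late iterates matching T is the discrete fingerprint of the Langevin/Gibbs picture that the entropy-production ("second law") argument builds on.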

*For detailed information about the seminar, please refer to the email.
