Kicking off the study group on information theory, Akinori Tanaka (Senior Research Scientist at iTHEMS and AIP) talked about the connection between the Langevin equation and deep neural networks. He first showed that by analyzing Langevin equations one can derive the second law of thermodynamics, which states that the total entropy of the system cannot decrease. He then delved into stochastic gradient descent (SGD) and showed how it is used to train neural networks in general. We discussed too enthusiastically for him to finish his talk, so we will organize the second part next week (8 December 2020). We are all looking forward to discussing more, and thank you so much for the elegant talk!
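As a rough illustration of the Langevin/SGD connection touched on in the talk (this sketch is not from the talk itself; the toy regression problem, step size, and temperature below are illustrative assumptions), one can view SGD with injected Gaussian noise as a discretized overdamped Langevin dynamics, as in stochastic gradient Langevin dynamics:

```python
# Minimal sketch: stochastic gradient Langevin dynamics on a toy quadratic loss.
# Update rule: w <- w - eta * grad + sqrt(2 * eta * T) * xi,  xi ~ N(0, I).
import numpy as np

rng = np.random.default_rng(0)

# Toy data for linear regression; loss L(w) = mean((X w - y)^2) / 2.
X = rng.normal(size=(256, 2))
true_w = np.array([1.5, -0.7])
y = X @ true_w + 0.1 * rng.normal(size=256)

def minibatch_grad(w, batch_size=32):
    """Loss gradient estimated on a random minibatch (the 'stochastic' part of SGD)."""
    idx = rng.integers(0, len(X), size=batch_size)
    Xb, yb = X[idx], y[idx]
    return Xb.T @ (Xb @ w - yb) / batch_size

w = np.zeros(2)
eta = 0.05           # learning rate / Langevin step size (illustrative value)
temperature = 1e-3   # scales the injected noise, playing the role of temperature

for t in range(2000):
    noise = np.sqrt(2 * eta * temperature) * rng.normal(size=2)
    w = w - eta * minibatch_grad(w) + noise  # discretized Langevin update

print("estimated weights:", w)  # approaches true_w up to thermal fluctuations
```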
-Ryosuke Iritani (iTHEMS)

Related Events