Information Theory SG Seminar
26 events
-
Accelerated equilibration in classical stochastic systems
January 13 (Wed) at 13:00 - 14:00, 2021
Kyosuke Adachi (Special Postdoctoral Researcher, iTHEMS / Special Postdoctoral Researcher, Nonequilibrium Physics of Living Matter RIKEN Hakubi Research Team, RIKEN Center for Biosystems Dynamics Research (BDR))
Shortcuts to adiabaticity (STA) [1] are protocols that drive a given quantum state into a target state quickly, which can be useful for avoiding decoherence in quantum experiments. In this journal club, I will concisely review the concept of STA and then focus on the recently proposed classical counterparts of STA, sometimes called engineered swift equilibration, in Brownian particle systems [2] and evolutionary systems [3]. (A toy sketch of the idea follows this entry.)
Venue: via Zoom
Event Official Language: English
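As a toy illustration of the engineered-swift-equilibration idea above, here is a minimal Python sketch under simplifying assumptions, with illustrative names and parameters (it is not material from the talk). For an overdamped Brownian particle in a harmonic trap of stiffness kappa(t), the position variance sigma^2 obeys d(sigma^2)/dt = (2/gamma)(k_B T - kappa sigma^2); prescribing a smooth variance schedule between the initial and final equilibrium values and solving for kappa(t) gives a protocol that reaches the target equilibrium at an arbitrarily chosen time tau.

import numpy as np

def ese_stiffness(t, tau, kappa_i, kappa_f, kT=1.0, gamma=1.0):
    """Stiffness protocol kappa(t) for engineered swift equilibration.

    Prescribes sigma^2(t) interpolating smoothly between the equilibrium
    variances kT/kappa_i and kT/kappa_f, then inverts
        d(sigma^2)/dt = (2/gamma) * (kT - kappa * sigma^2)
    to obtain the stiffness that realizes this schedule.
    """
    s = t / tau
    poly = 3 * s**2 - 2 * s**3          # smooth step, flat at both ends
    dpoly = (6 * s - 6 * s**2) / tau    # its time derivative
    sig2_i, sig2_f = kT / kappa_i, kT / kappa_f
    sig2 = sig2_i + (sig2_f - sig2_i) * poly
    dsig2 = (sig2_f - sig2_i) * dpoly
    return (kT - 0.5 * gamma * dsig2) / sig2

# Example: compress the trap fourfold within tau = 0.1, much shorter than
# the natural relaxation time 1/kappa_i = 1 in these units.
times = np.linspace(0.0, 0.1, 5)
print(ese_stiffness(times, tau=0.1, kappa_i=1.0, kappa_f=4.0))

Because the schedule's derivative vanishes at t = 0 and t = tau, the protocol starts and ends exactly at the equilibrium stiffnesses kappa_i and kappa_f.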
-
Review on the Lieb-Robinson bound
December 23 (Wed) at 13:00 - 14:00, 2020
Yukimi Goto (Special Postdoctoral Researcher, iTHEMS)
The Lieb-Robinson bound is an inequality on the group velocity of information propagation in quantum many-body systems. In this talk, I will review this bound mathematically and explain some of its consequences. (A standard form of the bound is reproduced after this entry.)
Event Official Language: English
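For reference, a standard form of the Lieb-Robinson bound, as it commonly appears in the literature (conventions vary and the constants are model-dependent):

\[
  \bigl\| [\, A_X(t),\, B_Y \,] \bigr\|
  \;\le\; C \, \|A_X\| \, \|B_Y\| \,
  e^{-\mu \left( d(X,Y) - v\,|t| \right)}
\]

Here A_X(t) = e^{iHt} A_X e^{-iHt} is an observable supported on a region X and evolved under a short-range Hamiltonian H, B_Y is an observable supported on a region Y, d(X,Y) is the distance between the regions, v is the Lieb-Robinson velocity, and C, \mu > 0 are model-dependent constants. The content of the bound is that commutators of spatially separated observables remain exponentially small outside the effective light cone d(X,Y) = v|t|.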
-
Quantum Wasserstein distance of order 1
December 16 (Wed) at 13:00 - 14:30, 2020
Ryusuke Hamazaki (Senior Research Scientist, iTHEMS / RIKEN Hakubi Team Leader, Nonequilibrium Quantum Statistical Mechanics RIKEN Hakubi Research Team, RIKEN Cluster for Pioneering Research (CPR))
The Wasserstein distance is an indicator of the closeness of two probability distributions and is applied in various fields ranging from information theory to neural networks [1]. It is particularly useful for treating the geometry of the underlying space, such as tensor-product structures. In this journal club, I will talk about one of the recent proposals for a quantum extension of the Wasserstein distance [2]. After reviewing basic properties of the classical Wasserstein distance, e.g., its relation to concentration phenomena, I will discuss how they might be generalized to the quantum realm. (A toy computation of the classical distance follows this entry.)
Venue: via Zoom
Event Official Language: English
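As a warm-up to the quantum case discussed above, here is a minimal Python sketch of the classical order-1 Wasserstein distance for discrete distributions on a one-dimensional support, using SciPy's implementation; the distributions are illustrative, and the quantum extension of [2] is not implemented here.

import numpy as np
from scipy.stats import wasserstein_distance

# Two probability distributions on the same discrete 1D support.
support = np.arange(8)
p = np.array([0.40, 0.30, 0.15, 0.05, 0.04, 0.03, 0.02, 0.01])
q = np.array([0.01, 0.02, 0.03, 0.04, 0.05, 0.15, 0.30, 0.40])

# W1(p, q): the minimal "mass times distance" needed to morph p into q.
print(wasserstein_distance(support, support, u_weights=p, v_weights=q))

# Consistency check: in 1D with unit spacing, W1 equals the L1 distance
# between the cumulative distribution functions.
print(np.abs(np.cumsum(p) - np.cumsum(q)).sum())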
-
Statistical model for meaning representation of language
December 16 (Wed) at 10:30 - 12:00, 2020
Koichiro Yoshino (Team Leader, Robotics Project, RIKEN Cluster for Science, Technology and Innovation Hub (RCSTI))
One of the final goals of natural language processing is to build a model that captures the semantic meaning of language elements. Language modeling is a recent research trend aimed at building a statistical model that expresses the meaning of language. The language model is based on the distributional hypothesis, which states that the surrounding elements of a target element describe that element's meaning. In other words, the relative positions of sentence elements (morphemes, words, and sentences) are essential for knowing an element's meaning (a toy illustration follows this entry). Recent works on distributed representation mainly focus on relations between explicit elements: characters, morphemes, words, and sentences. However, it is essential to use structural information of language, such as dependencies and semantic roles, to build a human-understandable statistical model of language. In this talk, we describe the basis of the statistical language model and then discuss our research direction of introducing language structure.
Venue: via Zoom
Event Official Language: English
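As a toy illustration of the distributional hypothesis mentioned above (a sketch with an illustrative corpus and window size, not the speaker's model): represent each word by the counts of the words appearing in a small window around it, so that words used in similar contexts receive similar representations.

from collections import Counter, defaultdict

# Toy corpus; under the distributional hypothesis, a word's meaning is
# characterized by the words that surround it.
corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "the cat chased the dog",
]

window = 2  # neighbors on each side that count as "context"
contexts = defaultdict(Counter)
for sentence in corpus:
    tokens = sentence.split()
    for i, word in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                contexts[word][tokens[j]] += 1

# "cat" and "dog" occur in near-identical contexts, so their count
# vectors are similar -- the starting point for distributed embeddings.
print(contexts["cat"])
print(contexts["dog"])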
-
Journal Club of Information Theory SG II
December 8 (Tue) at 13:00 - 14:00, 2020
Akinori Tanaka (Senior Research Scientist, iTHEMS)
The practical update process of deep neural networks based on stochastic gradient descent is quite similar to the stochastic dynamics described by the Langevin equation. For a Langevin system, one can "derive" the second law of thermodynamics, i.e., the increase of the total entropy of the system. This fact suggests a "second law of thermodynamics in deep learning." In this talk, I will explain this idea roughly; there are no concrete new results, but I hope it provides new perspectives for studying neural networks. (A toy comparison of the two update rules follows this entry.)
Venue: via Zoom
Event Official Language: English
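As a toy comparison of the two update rules (a sketch on a quadratic loss with illustrative parameters, not material from the talk): an SGD step with minibatch gradient noise has the same form as a discretized overdamped Langevin update, with the learning rate playing the role of the time step and the minibatch noise playing the role of the thermal kicks.

import numpy as np

rng = np.random.default_rng(0)

def grad_loss(w, x_batch):
    # Gradient of a toy quadratic loss 0.5 * mean((w - x)^2).
    return w - x_batch.mean()

data = rng.normal(loc=2.0, scale=1.0, size=1000)
eta, batch = 0.05, 16

# SGD: minibatch sampling makes the gradient a noisy estimate of the
# full-batch gradient -- the analogue of the Langevin noise term.
w_sgd = 0.0
for _ in range(500):
    x = rng.choice(data, size=batch)
    w_sgd -= eta * grad_loss(w_sgd, x)

# Overdamped Langevin: dw = -grad V(w) dt + sqrt(2T) dW, discretized
# with time step dt = eta and an explicit Gaussian noise term.
w_lgv, T = 0.0, 0.01
for _ in range(500):
    w_lgv += -eta * grad_loss(w_lgv, data) + np.sqrt(2 * T * eta) * rng.normal()

print(w_sgd, w_lgv)  # both settle near the minimum at w = 2.0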
-
Journal Club of Information Theory SG
December 1 (Tue) at 13:00 - 14:00, 2020
Akinori Tanaka (Senior Research Scientist, iTHEMS)
The practical update process of deep neural networks based on stochastic gradient descent is quite similar to the stochastic dynamics described by the Langevin equation. For a Langevin system, one can "derive" the second law of thermodynamics, i.e., the increase of the total entropy of the system. This fact suggests a "second law of thermodynamics in deep learning." In this talk, I will explain this idea roughly; there are no concrete new results, but I hope it provides new perspectives for studying neural networks. *For detailed information about the seminar, refer to the email.
Venue: via Zoom
Event Official Language: English