Information Theory SG Seminar

Quantum annealing and its fundamental aspects / Quantum annealing and its application to the real world

August 4, 2021, 13:30 - 16:00

Masayuki Ohzeki (Professor, Graduate School of Information Sciences, Tohoku University / Professor, Institute of Innovative Research, Tokyo Institute of Technology / CEO, Sigma-i Co., Ltd.)

Talk A (13:30 - 14:30)
Title: Quantum annealing and its fundamental aspects
Abstract: We introduce quantum annealing, a heuristic solver for combinatorial optimization problems. Quantum annealing utilizes the quantum tunneling effect to search for the ground state. In particular, the Ising model with a transverse field is employed to demonstrate quantum annealing. Most combinatorial optimization problems can be described by the Ising model and can then be solved by quantum annealing. A decade ago, D-Wave Systems Inc. succeeded in realizing quantum annealing in their manufactured spin system. In this talk, the concept of quantum annealing and its implementation in the D-Wave quantum annealer are introduced.

Talk B (14:40 - 15:40)
Title: Quantum annealing and its application to the real world
Abstract: In this talk, we review the fundamental aspects of quantum annealing and show several applications to practical combinatorial optimization problems. In Japan in particular, many researchers in industry are interested in practical applications of quantum annealing. At Tohoku University, we are carrying out collaborations with many companies in Japan. The first example is controlling automated guided vehicles in collaboration with DENSO. The second is listing hotel recommendations on a website with Recruit Lifestyle. Other examples will also be presented as far as possible. Let us discuss future perspectives for quantum annealing in practical applications.
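The transverse-field Ising setup mentioned in Talk A is conventionally written as an interpolating Hamiltonian (a standard textbook form, added here for orientation; the notation is not taken from the talk):

```latex
H(s) \;=\; -\,s \Big( \sum_{i<j} J_{ij}\, \sigma_i^z \sigma_j^z + \sum_i h_i\, \sigma_i^z \Big)
\;-\; (1-s)\, \Gamma \sum_i \sigma_i^x ,
\qquad s = t/T \in [0,1].
```

At s = 0 the ground state is the uniform superposition over all spin configurations; sweeping s slowly toward 1 keeps the system near the instantaneous ground state via quantum tunneling, so it ends close to the ground state of the Ising part, which encodes the optimization problem.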

Venue: via Zoom

Event official language: English


Overview of Tensor Networks in Machine Learning

July 28, 2021, 13:30 - 14:50

Qibin Zhao (Team Leader, Tensor Learning Team, RIKEN Center for Advanced Intelligence Project (AIP))

Tensor networks (TNs) are factorizations of high-dimensional tensors into networks of many low-dimensional tensors, which have been studied in quantum physics, high-performance computing, and applied mathematics. In recent years, TNs have been increasingly investigated and applied to machine learning and signal processing, due to their significant advantages in handling large-scale and high-dimensional problems, model compression in deep neural networks, and efficient computations for learning algorithms. This talk aims to present a broad overview of recent progress in TN technology applied to machine learning, covering basic principles and algorithms, novel approaches in unsupervised learning, tensor completion, multi-task and multi-modal learning, and various applications in DNNs, CNNs, RNNs, etc. We also discuss future research directions and new trends in this area.
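As a concrete illustration of the factorization idea (a minimal sketch of ours, not code from the talk; `tt_decompose` and the rank choice are purely illustrative), a three-way tensor can be split into a tensor train by successive truncated SVDs:

```python
import numpy as np

def tt_decompose(tensor, rank):
    """Split a 3-way tensor into three tensor-train (TT) cores via
    successive truncated SVDs (TT-SVD sketch)."""
    d1, d2, d3 = tensor.shape
    u, s, vt = np.linalg.svd(tensor.reshape(d1, d2 * d3), full_matrices=False)
    r1 = min(rank, len(s))
    core1 = u[:, :r1]                                  # shape (d1, r1)
    rest = (np.diag(s[:r1]) @ vt[:r1]).reshape(r1 * d2, d3)
    u, s, vt = np.linalg.svd(rest, full_matrices=False)
    r2 = min(rank, len(s))
    core2 = u[:, :r2].reshape(r1, d2, r2)              # shape (r1, d2, r2)
    core3 = np.diag(s[:r2]) @ vt[:r2]                  # shape (r2, d3)
    return core1, core2, core3

def tt_reconstruct(core1, core2, core3):
    # Contract the train back into a full tensor.
    return np.einsum('ia,ajb,bk->ijk', core1, core2, core3)

# A rank-1 tensor is recovered exactly even with rank-1 cores.
a, b, c = np.arange(1.0, 4.0), np.arange(1.0, 5.0), np.arange(1.0, 6.0)
t = np.einsum('i,j,k->ijk', a, b, c)
err = np.linalg.norm(t - tt_reconstruct(*tt_decompose(t, rank=1)))
```

For general tensors the truncation rank trades accuracy for compression, which is the mechanism behind TN-based model compression.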

Venue: via Zoom

Event official language: English


Introduction to the replica method

June 23, 2021, 13:30 - 15:40

Yoshiyuki Kabashima (Professor, Graduate School of Science, The University of Tokyo)

The replica method is a mathematical technique for evaluating the "quenched" average of the logarithm (or a real-number power) of the partition function with respect to predetermined random variables that condition the objective system. The technique has a long history, dating back at least to a book by Hardy et al. in the 1930s, but it became well known only after its application to the physics of spin glasses in the 1970s. More recently, its range of application has been spreading rapidly to various fields in information science, including information theory, communication theory, signal processing, computational complexity theory, machine learning, etc. In this talk, we introduce the basic idea of the replica method and its mathematical faults, illustrating them with a few examples.
*For detailed information about the seminar, please refer to the email.
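The basic identity behind the method (a standard form, added here for orientation) is:

```latex
\left\langle \ln Z \right\rangle
  \;=\; \lim_{n \to 0} \frac{\left\langle Z^{n} \right\rangle - 1}{n}
  \;=\; \lim_{n \to 0} \frac{1}{n} \ln \left\langle Z^{n} \right\rangle ,
```

where ⟨·⟩ denotes the quenched average. One evaluates ⟨Z^n⟩ for integer n by introducing n coupled replicas of the system and then analytically continues to n → 0; that continuation is the mathematically delicate step the talk refers to.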

Venue: via Zoom

Event official language: English


Journal Club: Intrinsically Disordered Region (IDR)

May 19, 2021, 13:00 - 14:00

Kyosuke Adachi (Special Postdoctoral Researcher, iTHEMS / Special Postdoctoral Researcher, Nonequilibrium Physics of Living Matter RIKEN Hakubi Research Team, RIKEN Center for Biosystems Dynamics Research)

A class of protein domains called intrinsically disordered regions (IDRs) is known to take no rigid three-dimensional structure. Recent studies have shown that IDRs can perform biological functions through phase separation, and it is important to clarify what kind of amino acid sequence of an IDR leads to phase separation and what kind of mutation results in malfunction. In this journal club, I will discuss these topics by reviewing recent papers.
*For detailed information about the seminar, please refer to the email.

Venue: via Zoom

Event official language: English


Thermodynamic Uncertainty Relation Connects Physics, Information Science, and Biology

April 28, 2021, 13:30 - 16:00

Yoshihiko Hasegawa (Associate Professor, Graduate School of Information Science and Technology, The University of Tokyo)

Higher precision demands more resources. Although this fact is widely accepted, it has only recently been proved theoretically. The thermodynamic uncertainty relation serves as a theoretical basis for this notion: it states that current fluctuations are bounded from below by thermodynamic costs, such as entropy production and dynamical activity. In this seminar, I show a strong connection between the thermodynamic uncertainty relation and information theory by deriving it through an information inequality known as the Cramér-Rao bound, which provides the error bound for any statistical estimator. Moreover, by using a quantum Cramér-Rao bound, I derive a quantum extension of the thermodynamic uncertainty relation, which holds for general open quantum systems. The thermodynamic uncertainty relation predicts the fundamental limit of biomolecular processes, and thus it can be applied to infer the entropy production, corresponding to the consumption of adenosine triphosphate, of biological systems in the absence of detailed knowledge about them.
*For detailed information about the seminar, please refer to the email.
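For classical Markov jump processes in a steady state, the relation takes the standard form (the notation here is ours, not necessarily the speaker's):

```latex
\frac{\operatorname{Var}[J_{\tau}]}{\langle J_{\tau} \rangle^{2}}
\;\ge\; \frac{2 k_{\mathrm{B}}}{\Sigma_{\tau}} ,
```

where J_τ is any time-integrated current over an interval τ and Σ_τ is the total entropy production in that interval: higher relative precision (a smaller left-hand side) requires more dissipation.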

Venue: via Zoom

Event official language: English


Journal Club: Trace inequalities and their applications

April 14, 2021, 14:30 - 15:30

Yukimi Goto (Special Postdoctoral Researcher, iTHEMS)

In this talk, I will explain trace inequalities and related topics, focusing mainly on results concerning quantum entropy. This talk is an elementary introduction to these subjects.
*For detailed information about the seminar, please refer to the email.

Venue: via Zoom

Event official language: English


Journal Club: Reinforcement Learning

March 24, 2021, 13:00 - 14:00

Akinori Tanaka (Senior Research Scientist, iTHEMS)

Reinforcement learning (RL) is a scheme of machine learning that is applicable "without training data." Instead, we prepare a "world" that agents (learners) can probe, and we try to optimize their behavior. Historically, the study of RL has deep connections to psychology and neuroscience. In this journal club, I would like to give a lightning review of RL.
*For detailed information about the seminar, please refer to the email.
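To make the "world that agents can probe" concrete, here is a minimal tabular Q-learning sketch on a hypothetical 5-state chain (the environment, hyperparameters, and names are ours, purely for illustration):

```python
import random

# Hypothetical toy world: 5 states in a row; action 1 moves right,
# action 0 moves left; reaching the rightmost state pays reward 1.
N_STATES, ACTIONS = 5, (0, 1)

def step(state, action):
    nxt = max(0, min(N_STATES - 1, state + (1 if action == 1 else -1)))
    reward = 1.0 if nxt == N_STATES - 1 else 0.0
    return nxt, reward, nxt == N_STATES - 1

def q_learning(episodes=400, alpha=0.5, gamma=0.9, eps=0.3, seed=0):
    rng = random.Random(seed)
    q = [[0.0, 0.0] for _ in range(N_STATES)]   # Q-table; no training data
    for _ in range(episodes):
        s = 0
        for _ in range(2000):                   # cap episode length
            greedy = 0 if q[s][0] >= q[s][1] else 1
            a = rng.choice(ACTIONS) if rng.random() < eps else greedy
            s2, r, done = step(s, a)
            # Temporal-difference update toward r + gamma * max_a' Q(s', a').
            q[s][a] += alpha * (r + gamma * max(q[s2]) - q[s][a])
            s = s2
            if done:
                break
    return q

q = q_learning()
# Greedy policy learned purely by probing the world: move right everywhere.
policy = [0 if q[s][0] >= q[s][1] else 1 for s in range(N_STATES - 1)]
```

The epsilon-greedy exploration is what replaces a labeled training set: the agent discovers the reward by trial and error, and the TD update propagates the value of the goal backward along the chain.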

Venue: via Zoom

Event official language: English


Journal Club: Large deviation statistics of Markovian quantum systems

February 17, 2021, 13:00 - 14:30

Ryusuke Hamazaki (Senior Research Scientist, iTHEMS / RIKEN Hakubi Team Leader, Nonequilibrium Quantum Statistical Mechanics RIKEN Hakubi Research Team, RIKEN Cluster for Pioneering Research)

Large deviation theory is a mathematical framework for treating “rare events” in random processes [1]. In this journal club, I talk about recent developments in large deviation analysis of open Markovian quantum systems [2,3]. I first introduce the notion of large deviation statistics using simple independent and identically distributed random variables. I then review recent developments in level 2.5 large deviation statistics for classical Markovian jump processes and its application to the thermodynamic uncertainty relation [4]. Finally, I discuss how the classical results are extended to the quantum regime.
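For the iid warm-up mentioned above, Cramér's theorem gives the textbook statement:

```latex
\Pr\!\left[\tfrac{1}{n}\textstyle\sum_{i=1}^{n} X_i \approx x\right] \;\asymp\; e^{-n I(x)},
\qquad
I(x) \;=\; \sup_{\lambda}\left[\lambda x - \ln \mathbb{E}\, e^{\lambda X_1}\right],
```

so fluctuations of the empirical mean away from its expectation are exponentially rare in n, with the rate function I given by the Legendre transform of the cumulant generating function.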

Venue: via Zoom

Event official language: English


Journal Club: Sampling the stable structures based on the replica-permutation method

January 27, 2021, 13:00 - 14:30

Hiroshi Yokota (Postdoctoral Researcher, iTHEMS)

When we want to search for the (meta)stable structures of macromolecules such as proteins, the combination of molecular dynamics simulation and the replica exchange method (REM) is useful. In REM, sampling is performed by exchanging replicas (copies) of the system at different temperatures, where each exchange is accepted according to the Metropolis algorithm. In this method, the exchange can be rejected, which leads to a decrease in sampling efficiency. To obtain more efficient sampling than REM, Itoh and Okumura proposed the replica-permutation method (RPM), in which the replicas are permuted according to the Suwa-Todo algorithm. In this journal club, I will introduce RPM and some examples of its application.
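The Metropolis exchange step in REM can be written in a few lines (a generic sketch of the standard acceptance rule, not code from the talk; the numerical values are illustrative):

```python
import math

def exchange_prob(beta1, e1, beta2, e2):
    """Metropolis acceptance probability for swapping the configurations of
    two replicas at inverse temperatures beta1 and beta2 with energies e1, e2."""
    return min(1.0, math.exp((beta1 - beta2) * (e1 - e2)))

# A swap that hands the colder replica (beta = 2.0) the lower-energy
# configuration is always accepted; the reverse swap is accepted only
# rarely, which is the rejection problem RPM is designed to mitigate.
p_favorable = exchange_prob(2.0, 5.0, 1.0, 1.0)   # -> 1.0
p_rare = exchange_prob(2.0, 1.0, 1.0, 5.0)        # -> exp(-4), about 0.018
```

RPM replaces this pairwise accept/reject step with a rejection-free choice among all permutations of replicas, via the Suwa-Todo algorithm.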

Event official language: English


Information theory in ecology: Markov chain, Venn diagram, Kronecker (and Cartesian graph) products, and Tsallis entropy

January 20, 2021, 13:00 - 14:00

Ryosuke Iritani (Research Scientist, iTHEMS)

This is more of an introductory talk on how I was motivated to work with information theory, and it includes unpublished data. Ecologists have long been interested in understanding the diversity (divergence) of natural ecosystems. One possible way of accounting for diversity is to use a species' presence/absence table across spatial locations (species-location table), in which we record 1 if a focal species is present in a given site (otherwise 0). Recent interest lies in assessing how diversity (e.g., the number of species) changes with time: for instance, extinction and colonization of species may modify such tables over time. However, we are yet to have theoretical toolkits to model the dynamics of species-site tables. In this talk, I will introduce my model (in collaboration with R. Hamazaki, S. Tatsumi, and M. Cadotte) of the dynamics of species-site tables based on Markovian stochastic processes. Specifically, our approach allows us to obtain the solution of the full stochastic dynamics analytically by localizing the dynamics to a single site and then expanding it to the global sites with the Kronecker product (in linear algebra) or the Cartesian product (in graph theory). Intuition is obtained by illustrating the dynamics on a Venn diagram, where we draw several sets (corresponding to locations) and binary numbers (corresponding to presence-absence data) and consider random walks across sets on the Venn diagram; this Venn-diagram-based interpretation is also mathematically underpinned by the Cartesian product of graphs. Finally, I will briefly talk about how we assess the diversity of ecosystems using the Tsallis entropy (a generalized Shannon entropy).
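The localize-then-expand idea can be sketched for two independent sites (a minimal illustration of ours; the rates c and e are hypothetical, not data from the talk):

```python
import numpy as np

# Each site is a two-state Markov chain over {absent, present}, with
# per-step colonization probability c and extinction probability e.
c, e = 0.3, 0.1
P_site = np.array([[1 - c, c],
                   [e, 1 - e]])      # row-stochastic local transition matrix

# With independent sites, the global chain over the presence/absence
# patterns {00, 01, 10, 11} is the Kronecker product of the local chains.
P_global = np.kron(P_site, P_site)

# The local stationary law also lifts by Kronecker product, so the full
# stochastic dynamics is solved from the single-site solution.
pi = np.array([e / (c + e), c / (c + e)])
pi_global = np.kron(pi, pi)
```

The same construction extends to more sites by repeated Kronecker products, which is the linear-algebra counterpart of the Cartesian graph product mentioned in the abstract.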

Venue: via Zoom

Event official language: English


Accelerated equilibration in classical stochastic systems

January 13, 2021, 13:00 - 14:00

Kyosuke Adachi (Special Postdoctoral Researcher, iTHEMS / Special Postdoctoral Researcher, Nonequilibrium Physics of Living Matter RIKEN Hakubi Research Team, RIKEN Center for Biosystems Dynamics Research)

Shortcuts to adiabaticity (STA) [1] are processes that make a given quantum state evolve into a target state in a fast manner, which can be useful to avoid decoherence in quantum experiments. In this journal club, I will concisely review the concept of STA, and then focus on the recently proposed classical counterparts of STA, sometimes called engineered swift equilibration, in Brownian particle systems [2] and evolutionary systems [3].

Venue: via Zoom

Event official language: English


Review on the Lieb-Robinson bound

December 23, 2020, 13:00 - 14:00

Yukimi Goto (Special Postdoctoral Researcher, iTHEMS)

The Lieb-Robinson bound is an inequality on the group velocity of information propagation in quantum many-body systems. In this talk, I review this bound mathematically and explain some of its consequences.
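A common form of the bound (one of several equivalent statements, given here for orientation) is:

```latex
\left\| \big[ A(t),\, B \big] \right\|
\;\le\; C\, \|A\|\, \|B\|\, e^{-\mu \left( d(X,Y) - v\,|t| \right)} ,
```

for observables A and B supported on regions X and Y at distance d(X,Y), with constants C, μ and the Lieb-Robinson velocity v: information propagation outside the effective light cone d(X,Y) > v|t| is exponentially suppressed.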

Event official language: English


Quantum Wasserstein distance of order 1

December 16, 2020, 13:00 - 14:30

Ryusuke Hamazaki (Senior Research Scientist, iTHEMS / RIKEN Hakubi Team Leader, Nonequilibrium Quantum Statistical Mechanics RIKEN Hakubi Research Team, RIKEN Cluster for Pioneering Research)

The Wasserstein distance is an indicator of the closeness of two probability distributions and is applied in various fields ranging from information theory to neural networks [1]. It is particularly useful for treating the geometry of the underlying space, such as tensor-product structures. In this journal club, I talk about one of the recent proposals for a quantum extension of the Wasserstein distance [2]. After reviewing basic properties of the classical Wasserstein distance, e.g., its relation to concentration phenomena, I discuss how they might be generalized to the quantum realm.
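For orientation, the classical order-1 Wasserstein distance in one dimension reduces to an integral of the CDF difference (a minimal classical sketch of ours; the quantum proposal of [2] is not reproduced here):

```python
import numpy as np

def w1_distance(p, q, positions):
    """Order-1 Wasserstein distance between probability vectors p and q
    supported on sorted 1-D positions, via the |CDF difference| formula."""
    cdf_diff = np.abs(np.cumsum(np.asarray(p, float) - np.asarray(q, float)))[:-1]
    # Weight each inter-point gap by how much cumulative mass must cross it.
    return float(np.sum(cdf_diff * np.diff(positions)))

# Moving a unit point mass from x = 0 to x = 1 costs exactly 1.
d = w1_distance([1.0, 0.0], [0.0, 1.0], np.array([0.0, 1.0]))
```

Unlike, say, total variation, this distance is sensitive to how far mass must be transported, which is the geometric property the quantum extension tries to preserve for tensor-product structures.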

Venue: via Zoom

Event official language: English


Statistical model for meaning representation of language

December 16, 2020, 10:30 - 12:00

Koichiro Yoshino (Team Leader, Robotics Project, RIKEN Cluster for Science, Technology and Innovation Hub (RCSTI))

One of the final goals of natural language processing is building a model that captures the semantic meaning of language elements. Language modeling is a recent research trend for building a statistical model that expresses the meaning of language. Language models are based on the distributional hypothesis, which states that the surrounding elements of a target element describe the meaning of that element. In other words, relative positions between sentence elements (morphemes, words, and sentences) are essential to knowing an element's meaning. Recent works on distributed representation mainly focus on relations between explicit elements: characters, morphemes, words, and sentences. However, it is essential to use structural information of languages, such as dependency and semantic roles, for building a human-understandable statistical model of languages. In this talk, we describe the basis of statistical language models and then discuss our research direction for incorporating language structure.
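The distributional hypothesis can be illustrated with raw co-occurrence counts (a hypothetical mini-corpus of ours, far simpler than the statistical models discussed in the talk):

```python
from collections import Counter
import math

# Words that appear in similar contexts get similar context-count vectors.
corpus = "the cat sat on the mat the dog sat on the rug".split()

def context_vector(word, window=1):
    """Count the neighbors of every occurrence of `word` within the window."""
    ctx = Counter()
    for i, w in enumerate(corpus):
        if w == word:
            for j in range(max(0, i - window), min(len(corpus), i + window + 1)):
                if j != i:
                    ctx[corpus[j]] += 1
    return ctx

def cosine(u, v):
    keys = set(u) | set(v)
    dot = sum(u[k] * v[k] for k in keys)
    return dot / (math.sqrt(sum(x * x for x in u.values()))
                  * math.sqrt(sum(x * x for x in v.values())))

# "cat" and "dog" share the context (the __ sat), so their vectors agree.
sim_cat_dog = cosine(context_vector("cat"), context_vector("dog"))
sim_cat_rug = cosine(context_vector("cat"), context_vector("rug"))
```

Modern language models replace these counts with learned dense vectors, but the underlying principle, meaning from surrounding elements, is the same; the talk's point is that purely positional context misses structural information such as dependencies and semantic roles.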

Venue: via Zoom

Event official language: English


Journal club of Information Theory SG II

December 8, 2020, 13:00 - 14:00

Akinori Tanaka (Senior Research Scientist, iTHEMS)

The practical updating process of deep neural networks based on stochastic gradient descent is quite similar to the stochastic dynamics described by the Langevin equation. For Langevin systems, we can "derive" the second law of thermodynamics, i.e., the increase of the total entropy of the system. This fact suggests a "second law of thermodynamics in deep learning." In this talk, I explain this idea roughly; there will be no concrete new results, but I hope it provides new perspectives for studying neural networks.
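The SGD-Langevin analogy can be sketched with an Euler-Maruyama discretization (our own minimal illustration on a quadratic "loss"; the parameter names and values are hypothetical):

```python
import math, random

# Noisy gradient descent as a discretization of overdamped Langevin dynamics
#   dx = -U'(x) dt + sqrt(2T) dW,
# with the learning rate playing the role of the time step.
def langevin_sgd(grad, x0, lr=0.01, temp=0.1, steps=2000, seed=0):
    rng = random.Random(seed)
    x = x0
    for _ in range(steps):
        # gradient step plus thermal noise of matched strength
        x += -lr * grad(x) + math.sqrt(2 * lr * temp) * rng.gauss(0.0, 1.0)
    return x

# For U(x) = x^2 / 2 the stationary law is Gaussian with variance ~ temp,
# so endpoints of independent runs scatter around 0 with that spread.
endpoints = [langevin_sgd(lambda x: x, 0.0, seed=s) for s in range(200)]
```

Because the discretized dynamics samples a Gibbs-like stationary distribution, thermodynamic notions such as entropy production become available, which is the bridge the talk exploits.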

Venue: via Zoom

Event official language: English


Journal club of Information Theory SG

December 1, 2020, 13:00 - 14:00

Akinori Tanaka (Senior Research Scientist, iTHEMS)

The practical updating process of deep neural networks based on stochastic gradient descent is quite similar to the stochastic dynamics described by the Langevin equation. For Langevin systems, we can "derive" the second law of thermodynamics, i.e., the increase of the total entropy of the system. This fact suggests a "second law of thermodynamics in deep learning." In this talk, I explain this idea roughly; there will be no concrete new results, but I hope it provides new perspectives for studying neural networks.

Venue: via Zoom

Event official language: English