Date
October 21 (Mon) at 15:30 - October 23 (Wed) at 14:30, 2019 (JST)
Speaker
Language
Japanese

Oct. 21: 15:30-16:30 and 16:40-17:40, Okochi Hall
Oct. 22: 13:30-14:30, room #435-437, Main Research Building
Oct. 23: 13:30-14:30, room #435-437, Main Research Building

Mean dimension is a topological invariant of dynamical systems; it counts the number of parameters per second needed to describe the orbits of the system.
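
For reference, a standard definition of mean dimension (going back to Gromov, and developed by Lindenstrauss and Weiss) can be sketched as follows; details such as the existence of the limits are omitted here.

\[
\mathrm{mdim}(X,T) \;=\; \lim_{\varepsilon \to 0}\, \lim_{N \to \infty} \frac{\mathrm{Widim}_{\varepsilon}(X, d_N)}{N},
\qquad
d_N(x,y) \;:=\; \max_{0 \le n < N} d\bigl(T^{n}x, T^{n}y\bigr),
\]

where \((X,d)\) is a compact metric space, \(T\colon X \to X\) is a homeomorphism, and \(\mathrm{Widim}_{\varepsilon}(X,d)\) is the smallest dimension of a simplicial complex \(P\) admitting a continuous map \(f\colon X \to P\) all of whose fibers have diameter less than \(\varepsilon\). The value does not depend on the choice of the metric \(d\).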

I will explain the connections between this quantity and information theory:

  1. Mean dimension and communication: One of Shannon's monumental works is the calculation of the amount of information that can be transmitted using band-limited signals (e.g., telephone signals). We develop a "topological dynamical version" of this result and apply it to the problem of embedding dynamical systems into Hilbert cubes, where mean dimension plays the role of a key parameter (a rough statement is sketched after this list).
  2. Mean dimension and data compression: Shannon entropy is the fundamental limit of lossless data compression. However, most signals in our world (e.g., audio signals and images) are modeled by continuous variables and hence have infinite entropy. To compress such infinite-entropy signals we must turn to lossy data compression, which is the subject of rate distortion theory. We found that applying a kind of mini-max principle to rate distortion theory yields mean dimension (a rough formula is sketched after this list).
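
As background for (1): the embedding problem concerns realizing a dynamical system inside the shift on a Hilbert cube, and mean dimension gives the decisive criterion. A rough sketch of the known result for minimal systems (due to Gutman and Tsukamoto, with the optimality of the threshold due to Lindenstrauss and Tsukamoto; technical hypotheses omitted) is:

\[
\mathrm{mdim}(X,T) < \frac{D}{2}
\;\Longrightarrow\;
(X,T) \ \text{embeds into the shift on}\ \bigl([0,1]^{D}\bigr)^{\mathbb{Z}},
\]

and the constant \(D/2\) cannot be improved. This parallels the Nyquist rate in Shannon's theory: a band-limited signal of bandwidth \(W\) is determined by \(2W\) real samples per second.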
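
As background for (2): the mini-max principle in question can be sketched as a double variational principle of the following shape (stated roughly; the precise version, due to Lindenstrauss and Tsukamoto, distinguishes upper and lower rate distortion dimensions and requires additional hypotheses such as the marker property).

\[
\mathrm{mdim}(X,T) \;=\; \min_{d}\, \sup_{\mu}\, \mathrm{rdim}(X,T,d,\mu),
\qquad
\mathrm{rdim}(X,T,d,\mu) \;=\; \lim_{\varepsilon \to 0} \frac{R_{\mu}(d,\varepsilon)}{\log(1/\varepsilon)},
\]

where \(d\) runs over metrics compatible with the topology of \(X\), \(\mu\) over \(T\)-invariant Borel probability measures, and \(R_{\mu}(d,\varepsilon)\) is the rate distortion function of the process generated by \((X,T,\mu)\) at distortion level \(\varepsilon\).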

On October 21, I will outline (1) and (2) above. I will explain (1) in more detail on October 22 and (2) on October 23.

This plan of talks may change according to requests from seminar participants.
