RIKEN Quantum Lecture
5 events
-
Lecture
Lectures on Quantum Measurement Theory: IV
June 30 (Tue) 15:30 - 17:00, 2026
Masanao Ozawa (Professor Emeritus, Nagoya University)
Lecture IV: Instruments in classical mechanics, quantum field theory, and cognitive science
In algebraic quantum field theory, measurements describable by interactions between the field and the measuring apparatus are characterized by the class of completely positive instruments satisfying the condition called the normal extension property (NEP) (Okamura-Ozawa 2016). In classical mechanics, traditionally only non-invasive measurements (those with trivial interaction) were considered admissible, so that the trajectory of motion remains observable. Here, however, the full class of measurements realizable by classical-mechanical interactions is characterized in terms of instruments with the NEP, as a basis for the study of invasive measurements of classical systems. Cognitive processes are also represented by completely positive instruments, in line with the long-standing paradigm of von Helmholtz, who described the sensation-perception process as a kind of measuring interaction and called it an unconscious inference. This framework is used to show the compatibility of the question order effect and the response replicability effect (Ozawa-Khrennikov 2019), which an earlier approach using only projective measurement models failed to explain. It is shown that there exists an instrument model, realizing both the question order effect and the response replicability effect, that almost faithfully reproduces public-opinion survey data such as the well-known Clinton-Gore survey by Gallup in 1997 (Ozawa-Khrennikov 2021).
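As an illustrative sketch (not the model of the cited papers), the question order effect already appears when two non-commuting projective yes/no questions are asked in succession on a qubit: the joint "yes, yes" probability depends on the order of the questions.

```python
import numpy as np

# "Yes" projectors for two non-commuting yes/no questions on a qubit:
# question A projects onto |0>, question B onto |+> = (|0>+|1>)/sqrt(2).
PA = np.array([[1.0, 0.0], [0.0, 0.0]])
plus = np.array([1.0, 1.0]) / np.sqrt(2)
PB = np.outer(plus, plus)

# An (illustrative) initial state as a density matrix.
rho = np.array([[0.7, 0.3], [0.3, 0.3]])

def p_yes_yes(P1, P2, rho):
    """Probability of answering 'yes' to question 1 and then 'yes' to question 2."""
    rho1 = P1 @ rho @ P1                  # unnormalized state after the first 'yes'
    return np.trace(P2 @ rho1 @ P2).real  # probability of the second 'yes'

print(p_yes_yes(PA, PB, rho))  # A asked first: 0.35
print(p_yes_yes(PB, PA, rho))  # B asked first: 0.40
```

Because PA and PB do not commute, the two orderings give different statistics, which is the signature of the question order effect in the instrument picture.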
Venue: Seminar Room #359 (Main Venue) / via Zoom
Event Official Language: English
-
Lecture
Lectures on Quantum Measurement Theory: III
June 23 (Tue) 15:30 - 17:00, 2026
Masanao Ozawa (Professor Emeritus, Nagoya University)
Lecture III: Measurement error, disturbance, the universally valid reformulation of Heisenberg’s uncertainty principle, and a quantitative generalization of the Wigner–Araki–Yanase theorem
Definitions of measurement error and disturbance are introduced (Ozawa 2002, 2019), and it is shown that there exists a solvable model of a physically realizable measurement that serves as a counterexample both to Heisenberg’s uncertainty principle in its conventional formulation and to the SQL (Ozawa 1988, 1989, 2002). Thus, those limits can no longer be regarded as universal. In fact, the counterexample to the SQL was found in 1988 using the idea of contractive state measurements due to Yuen (1983), and the LIGO project, started in 1994, succeeded in detecting gravitational waves in 2015, as announced in 2016. New formulations of the uncertainty principle are then proved: one concerning the errors in the approximate simultaneous measurement of two physical quantities, called the "joint error relation" (Ozawa 2003b, 2004), and one concerning the error and disturbance associated with the measurement of a single physical quantity, called the "error-disturbance relation" (Ozawa 2003a). From the error-disturbance relation, a quantitative relation for measurement error under an additive conservation law is proved (Ozawa 2002a, 2003b), generalizing the "Wigner–Araki–Yanase theorem" (Wigner 1952, Araki-Yanase 1960), which states that a physical quantity not commuting with a conserved quantity cannot be measured accurately by a measurement interaction satisfying an additive conservation law. The above relation also yields limits on realizing quantum computing and operations under conservation laws (Ozawa 2002b), results later developed into the resource theory of asymmetry.
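For reference, the error-disturbance relation of Ozawa (2003a) reads, for the error ε(A) of a measurement of an observable A, the disturbance η(B) it causes to an observable B, and the pre-measurement standard deviations σ(A), σ(B) in the state |ψ⟩:

```latex
\varepsilon(A)\,\eta(B) \;+\; \varepsilon(A)\,\sigma(B) \;+\; \sigma(A)\,\eta(B)
\;\ge\; \tfrac{1}{2}\,\bigl|\langle \psi \,|\, [A,B] \,|\, \psi \rangle\bigr|
```

Unlike the conventional product form ε(A)η(B) ≥ ½|⟨[A,B]⟩|, the two additional terms allow ε(A)η(B) to vanish even for non-commuting A and B, which is how the counterexamples above evade the conventional bound.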
Venue: Seminar Room #359 (Main Venue) / via Zoom
Event Official Language: English
-
Lecture
Lectures on Quantum Measurement Theory: II
June 16 (Tue) 15:30 - 17:00, 2026
Masanao Ozawa (Professor Emeritus, Nagoya University)
Lecture II: Modern approach: Quantum instruments, POVMs, measuring processes, intersubjectivity, and value reproducibility
The modern approach to quantum measurement theory is based on the "realizability theorem," which states that a measurement is physically realizable if and only if its statistical properties are represented by a completely positive instrument; this is also equivalent to saying that the measurement can be described by an interaction with a measuring apparatus (Ozawa 1984, 2004). The conventional analysis of a measuring process determines the post-measurement object state by applying the "projection postulate" to the meter measurement in the post-measurement state that "entangles" the object and the apparatus, but the above result has been established without assuming the projection postulate at all; rather, only the classical Bayesian probability update rule is used (Ozawa 1984). We introduce the "intersubjectivity theorem," which states that when multiple observers simultaneously and statistically correctly measure the same physical quantity, they obtain the same measurement value, and the "value reproducibility theorem," which states that a statistically correct measurement correctly reproduces the value of the physical quantity immediately before the measurement (Ozawa 2025). The above three theorems essentially solve the so-called measurement problem, since the collapse of the wave function is eliminated and the reality of the pre-measurement value of the measured observable, copied to the meter value and recorded by the observer, is established.
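As a minimal numerical sketch (with an illustrative choice of operators, not one from the lectures), a completely positive instrument can be given by Kraus operators M_x, one per outcome x: the outcome probability is Tr(M_x ρ M_x†) and the conditional post-measurement state is M_x ρ M_x† renormalized, with no projection postulate invoked.

```python
import numpy as np

# A completely positive instrument on a qubit given by Kraus operators
# M_x with sum_x M_x^dagger M_x = I: an unsharp measurement of sigma_z
# (illustrative strengths 0.9 / 0.1).
a, b = np.sqrt(0.9), np.sqrt(0.1)
M = [np.diag([a, b]), np.diag([b, a])]  # outcomes x = 0, 1

# Completeness check: the operations sum to a trace-preserving map.
assert np.allclose(sum(m.conj().T @ m for m in M), np.eye(2))

rho = np.array([[0.6, 0.2], [0.2, 0.4]])  # pre-measurement state

for x, m in enumerate(M):
    p = np.trace(m @ rho @ m.conj().T).real  # outcome probability
    post = (m @ rho @ m.conj().T) / p        # conditional post-measurement state
    print(f"outcome {x}: p = {p:.2f}, post-state trace = {np.trace(post).real:.2f}")
```

The probabilities (0.58 and 0.42) sum to one by the completeness condition, and each conditional state is again a valid density matrix, illustrating the Bayesian-update description of the measuring process.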
Venue: Seminar Room #359 (Main Venue) / via Zoom
Event Official Language: English
-
Lecture
Lectures on Quantum Measurement Theory: I
June 2 (Tue) 15:30 - 17:00, 2026
Masanao Ozawa (Professor Emeritus, Nagoya University)
Lecture I: Conventional approach: Repeatability, Heisenberg’s original uncertainty principle, and the SQL for gravitational-wave detection
The conventional approach to quantum measurement theory, taken by von Neumann (1932), Dirac (1958), and Schrödinger (1935), assumes the "repeatability hypothesis," which states that if a physical quantity is measured twice in succession, the same value is obtained each time. It is often quantitatively generalized to the "approximate repeatability hypothesis," which states that after a measurement of a physical quantity with error ε, the post-measurement deviation around the measured value is no larger than ε; equivalently, the state after obtaining a measurement result with error ε becomes an ε-approximate eigenstate corresponding to that result. From the approximate repeatability hypothesis, one can derive "Heisenberg’s original formulation of the uncertainty principle," namely, that when position and momentum are approximately measured simultaneously, the product of their respective errors is at least ℏ/2 (Heisenberg 1927, Kennard 1927, Ozawa 2015), as well as the "standard quantum limit (SQL) for monitoring the free-mass position," which states that when the position of a free mass m is measured twice at a time interval τ, the result of the second measurement cannot be predicted with uncertainty smaller than (ℏτ/m)^{1/2} (Caves 1985). The last result leads to a sensitivity limit for interferometric gravitational-wave detectors, and in the early 1980s it was therefore argued that gravitational waves of the expected strength could not be observed using interferometric detectors (Braginsky et al. 1980, Caves et al. 1980).
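As a back-of-the-envelope illustration (the mass and timescale below are chosen for illustration only, loosely LIGO-like), the SQL (ℏτ/m)^{1/2} can be evaluated numerically:

```python
import math

hbar = 1.054571817e-34  # reduced Planck constant, J*s

def sql(m_kg, tau_s):
    """Standard quantum limit sqrt(hbar * tau / m) on predicting the
    result of the second position measurement of a free mass."""
    return math.sqrt(hbar * tau_s / m_kg)

# Illustrative numbers: a 40 kg test mass monitored over 0.01 s.
print(f"{sql(40.0, 0.01):.2e} m")  # ~1.62e-19 m
```

Even for a macroscopic test mass the limit is around 10^-19 m, which is exactly the displacement scale interferometric detectors must resolve; hence the 1980s concern that the SQL would preclude detection.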
Venue: Seminar Room #359 (Main Venue) / via Zoom
Event Official Language: English
-
Lecture
Rapid development of cold-atom quantum computers and their prospect
December 26 (Tue) 13:30 - 17:00, 2023
Takafumi Tomita (Assistant Professor, Photo-Molecular Science, Institute for Molecular Science)
Note for participants: On-site participants, please register via the registration form. Online participants will receive the Zoom link after filling in the registration form.
Program:
13:30-15:00 Lecture 1
15:00-15:30 Coffee break
15:30-17:00 Lecture 2
Abstract: In this talk, I will give an overview of the recent rapid progress of cold-atom quantum computers. In a cold-atom quantum computer, a laser-cooled atomic gas in a vacuum chamber is captured in a two-dimensional trap array called an optical tweezers array, an array of tightly focused laser beams. The array of cold single atoms thus created is initialized, gate-operated, and read out with other laser beams. Because of its controllability and scalability, the cold-atom quantum computer has attracted much attention as one of the most promising candidates in the race to develop quantum-computer hardware. I will describe the characteristics and development trends of cold-atom hardware, as well as the development of a cold-atom quantum computer at the Institute for Molecular Science, including the realization of an ultrafast quantum gate using ultrashort laser pulses.
Venue: #435-437, 4F, Main Research Building (Main Venue) / via Zoom
Event Official Language: English