Date & Time
Tuesday, April 21, 2026, 10:00 - 11:00 (JST)
Speaker
  • Zhang Bingzhi (PostDoc, University of Southern California, USA)
Venue
  • via Zoom
Language
English
Host
Jinyang Li

Deep generative models are a key enabling technology for computer vision, text generation, and large language models. Generative models for quantum data offer a promising route toward learning and preparing complex quantum-state ensembles. In this talk, I will introduce the quantum denoising diffusion probabilistic model (QuDDPM) [1], which adapts the diffusion-model idea to quantum systems through a forward randomization process and trainable backward denoising dynamics. I will discuss how this framework enables stepwise learning of target quantum-state ensembles and demonstrate its capabilities in various learning tasks. I will then present its extension to mixed states, which eliminates the need for scrambling [2]. I will conclude with a brief discussion of recent results on scaling laws of quantum information lifetime in monitored quantum dynamics [3], emphasizing how mid-circuit measurements can maintain information and provide useful intuition for measurement-assisted quantum machine learning.
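The quantum circuits of QuDDPM are beyond a short example, but the classical diffusion idea it adapts — a forward process that gradually randomizes data into a featureless distribution, paired with a learned backward denoising process — can be illustrated with a toy NumPy sketch of the forward pass alone. All names and parameter values below are illustrative and not taken from the papers.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward_noising(x0, betas):
    """Forward diffusion: repeatedly mix the samples with fresh Gaussian noise.

    After enough steps the data approaches an isotropic Gaussian, which is
    the easy-to-sample starting point for a learned backward (denoising) pass.
    """
    xs = [x0]
    x = x0
    for beta in betas:
        # Standard DDPM-style update: shrink the signal, inject noise.
        x = np.sqrt(1.0 - beta) * x + np.sqrt(beta) * rng.standard_normal(x.shape)
        xs.append(x)
    return xs

# Toy data: a 2D point cloud concentrated near (1, 1).
x0 = rng.normal(loc=1.0, scale=0.1, size=(500, 2))
betas = np.linspace(1e-3, 0.2, 40)   # illustrative noise schedule
trajectory = forward_noising(x0, betas)

# The structured initial cloud ends up close to a zero-mean, unit-variance Gaussian.
print("initial mean:", np.round(trajectory[0].mean(axis=0), 2))
print("final std:", np.round(trajectory[-1].std(), 2))
```

In the quantum analogue discussed in the talk, the Gaussian noise steps are replaced by random unitary (scrambling) layers, and the backward pass is a trainable quantum circuit that denoises step by step toward the target ensemble.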

References

  1. Bingzhi Zhang, Peng Xu, Xiaohui Chen, Quntao Zhuang, "Generative quantum machine learning via denoising diffusion probabilistic models", Phys. Rev. Lett. 132, 100602 (2024), doi: 10.1103/PhysRevLett.132.100602
  2. Gino Kwun, Bingzhi Zhang, Quntao Zhuang, "Mixed-state quantum denoising diffusion probabilistic model", Phys. Rev. A 111, 032610 (2025), doi: 10.1103/PhysRevA.111.032610
  3. Bingzhi Zhang, Fangjun Hu, Runzhe Mo, Tianyang Chen, Hakan E. Tureci, Quntao Zhuang, "Scaling Laws of Quantum Information Lifetime in Monitored Quantum Dynamics" (2025), arXiv: 2506.22755

This is a closed event for researchers and is not open to the general public. If you are not a member or affiliate and would like to attend, please contact us via the form. Please note that attendance may not be possible, depending on the wishes of the speaker and host.

Inquire about this event