Date
  • April 21 (Tue), 2026, 10:00-11:00 (JST)
Speaker
  • Bingzhi Zhang (Postdoctoral Researcher, University of Southern California, USA)
Venue
  • via Zoom
Language
  • English
Host
  • Jinyang Li

Deep generative models are a key enabling technology for computer vision, text generation, and large language models. Generative models for quantum data offer a promising route toward learning and preparing complex quantum-state ensembles. In this talk, I will introduce the quantum denoising diffusion probabilistic model (QuDDPM) [1], which adapts the diffusion-model idea to quantum systems through a forward randomization process and trainable backward denoising dynamics. I will discuss how this framework enables stepwise learning of target quantum-state ensembles and demonstrate its capabilities in various learning tasks. I will then present its extension to mixed states, which eliminates the need for scrambling [2]. I will conclude with a brief discussion of recent results on scaling laws of quantum information lifetime in monitored quantum dynamics [3], emphasizing how mid-circuit measurements can preserve information and provide useful intuition for measurement-assisted quantum machine learning.
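The forward randomization step can be illustrated with a small single-qubit simulation. The sketch below is an illustrative toy analogue, not code from the paper: averaging a density matrix over many small random unitary rotations drives a pure state toward the maximally mixed state, which is the role the forward scrambling process plays in QuDDPM (the trainable backward dynamics would then learn to invert this).

```python
import numpy as np

rng = np.random.default_rng(0)

# Pauli matrices and identity
I2 = np.eye(2, dtype=complex)
PAULIS = [np.array([[0, 1], [1, 0]], dtype=complex),
          np.array([[0, -1j], [1j, 0]], dtype=complex),
          np.array([[1, 0], [0, -1]], dtype=complex)]

def scramble_step(rho, eps=0.4, samples=200):
    """One forward step: average rho over small random rotations U rho U†."""
    out = np.zeros_like(rho)
    for _ in range(samples):
        axis = rng.normal(size=3)
        axis /= np.linalg.norm(axis)          # uniform random rotation axis
        theta = rng.normal(scale=eps)         # small random rotation angle
        n_sigma = sum(a * P for a, P in zip(axis, PAULIS))
        U = np.cos(theta / 2) * I2 - 1j * np.sin(theta / 2) * n_sigma
        out += U @ rho @ U.conj().T
    return out / samples

rho = np.array([[1, 0], [0, 0]], dtype=complex)  # pure state |0><0|, purity 1
for t in range(30):
    rho = scramble_step(rho)

purity = np.real(np.trace(rho @ rho))
# purity decays from 1 toward 0.5, i.e. toward the maximally mixed state
```

Each averaging step contracts the Bloch vector, so repeated application converges to rho = I/2 regardless of the initial state; the step size `eps` and sample count are arbitrary illustration parameters.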

References

  1. Bingzhi Zhang, Peng Xu, Xiaohui Chen, and Quntao Zhuang, "Generative quantum machine learning via denoising diffusion probabilistic models," Phys. Rev. Lett. 132, 100602 (2024), doi: 10.1103/PhysRevLett.132.100602
  2. Gino Kwun, Bingzhi Zhang, and Quntao Zhuang, "Mixed-state quantum denoising diffusion probabilistic model," Phys. Rev. A 111, 032610 (2025), doi: 10.1103/PhysRevA.111.032610
  3. Bingzhi Zhang, Fangjun Hu, Runzhe Mo, Tianyang Chen, Hakan E. Tureci, and Quntao Zhuang, "Scaling laws of quantum information lifetime in monitored quantum dynamics," arXiv:2506.22755 (2025)

This is a closed event for scientists; non-scientists are not permitted to attend. If you are not a member or an affiliated person and would like to attend, please contact us using the inquiry form. Please note that your request to attend must be authorized by the event organizer or the speaker.

Inquire about this event