Near-Term Quantum
Near-term quantum computing research focuses on developing and applying quantum algorithms and machine learning models on currently available noisy quantum hardware. Key areas include variational quantum algorithms (VQAs), particularly for optimization and machine learning tasks. Active research directions include improved gradient estimation methods, AI-assisted circuit compilation, and novel quantum neural network architectures such as equivariant convolutional circuits and hybrid classical-quantum models. These efforts aim to overcome the limitations imposed by noise and small qubit counts, ultimately seeking to demonstrate practical quantum advantage in fields such as drug discovery, materials science, and machine learning.
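To make the VQA idea concrete, here is a minimal, self-contained sketch of the core training loop: a parameterized circuit whose expectation value is minimized with gradients from the parameter-shift rule (a standard exact-gradient technique for such circuits). This is an illustrative single-qubit toy simulated in NumPy, not code from any of the papers listed below; all function names are my own.

```python
import numpy as np

def ry(theta):
    # Single-qubit rotation about the Y axis
    return np.array([[np.cos(theta / 2), -np.sin(theta / 2)],
                     [np.sin(theta / 2),  np.cos(theta / 2)]])

Z = np.array([[1.0, 0.0], [0.0, -1.0]])  # Pauli-Z observable

def expval_z(theta):
    # Cost function: <0| RY(theta)^dag Z RY(theta) |0> = cos(theta)
    psi = ry(theta) @ np.array([1.0, 0.0])
    return float(psi.conj() @ Z @ psi)

def parameter_shift_grad(theta):
    # Exact gradient via the parameter-shift rule:
    # d<Z>/dtheta = [f(theta + pi/2) - f(theta - pi/2)] / 2
    return 0.5 * (expval_z(theta + np.pi / 2) - expval_z(theta - np.pi / 2))

# Gradient descent drives theta toward pi, where <Z> reaches its minimum of -1
theta, lr = 0.4, 0.5
for _ in range(100):
    theta -= lr * parameter_shift_grad(theta)
print(round(expval_z(theta), 4))  # -1.0
```

On real hardware each `expval_z` call would be a noisy estimate from repeated circuit executions, which is why the improved gradient-estimation methods mentioned above matter.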
Papers
Dynamical simulation via quantum machine learning with provable generalization
Joe Gibbs, Zoë Holmes, Matthias C. Caro, Nicholas Ezzell, Hsin-Yuan Huang, Lukasz Cincio, Andrew T. Sornborger, Patrick J. Coles
Out-of-distribution generalization for learning quantum dynamics
Matthias C. Caro, Hsin-Yuan Huang, Nicholas Ezzell, Joe Gibbs, Andrew T. Sornborger, Lukasz Cincio, Patrick J. Coles, Zoë Holmes