Efficient Quantum Algorithm for Solving Linear Distributed Delay Differential Equations
Abstract
Non-Markovian dynamics is ubiquitous in both quantum and classical systems, but the numerical computation of time-delay dynamics is demanding. In this work, we propose an efficient quantum algorithm for solving linear distributed delay differential equations and identify the condition under which it applies. When the kernel function is characterized by a phase-type distribution, the linear chain trick embeds the distributed delay differential equations into ordinary differential equations augmented with auxiliary variables. Employing the Schrödingerization method, the resulting equations can then be embedded into a Schrödinger equation and solved efficiently by Hamiltonian simulation. Although this embedding requires the augmented differential equations to be semi-stable, we show that this condition holds if and only if the original distributed delay differential equations are semi-stable. The query complexity of obtaining the normalized solution state of the delay system is expressed in terms of the allowable error, the dimension of the auxiliary variables associated with each kernel function, the Hamiltonian operator, and its sparsity. The gate complexity is this quantity multiplied by a factor depending on the number of precision bits. To demonstrate the efficacy of our method, we apply it to the generalized master equation and to the Redfield equation of the dephasing model.
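The two embeddings in the abstract can be illustrated on a minimal scalar example. The sketch below is not from the paper: the coefficients, the single exponential (phase-type) kernel, and the symbol names are illustrative assumptions. It shows (1) the linear chain trick, which turns a distributed-delay equation with kernel K(s) = γ·exp(−γs) into a delay-free linear ODE z' = Az with one auxiliary variable, and (2) the Schrödingerization-style splitting A = H1 + i·H2 into Hermitian parts, for which the embedding into a Schrödinger equation requires the semi-stability condition H1 ⪯ 0.

```python
# Hedged sketch (illustrative example, not the paper's construction):
#   x'(t) = a*x(t) + b * ∫_0^∞ K(s) x(t-s) ds,  K(s) = γ exp(-γ s).
# Linear chain trick: define y(t) = ∫_0^∞ γ exp(-γ s) x(t-s) ds; then
# y' = γ (x - y), so z = (x, y) obeys the delay-free system z' = A z.
import numpy as np

a, b, gamma = -1.0, 0.5, 2.0          # hypothetical coefficients
A = np.array([[a,     b     ],
              [gamma, -gamma]])       # augmented generator

# Semi-stability: every eigenvalue of A has non-positive real part.
assert np.all(np.linalg.eigvals(A).real <= 1e-12)

# Schrödingerization-style splitting A = H1 + 1j*H2, both Hermitian;
# the embedding applies when H1 is negative semidefinite.
H1 = (A + A.conj().T) / 2
H2 = (A - A.conj().T) / 2j
assert np.allclose(H1 + 1j * H2, A)
assert np.all(np.linalg.eigvalsh(H1) <= 1e-12)   # H1 ⪯ 0 holds here

# Classical RK4 integration of the augmented ODE from the constant
# history x ≡ 1, which also fixes the auxiliary variable y(0) = 1.
def rk4_step(z, dt):
    k1 = A @ z
    k2 = A @ (z + 0.5 * dt * k1)
    k3 = A @ (z + 0.5 * dt * k2)
    k4 = A @ (z + dt * k3)
    return z + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

z, dt = np.array([1.0, 1.0]), 0.01
for _ in range(1000):                 # evolve to t = 10
    z = rk4_step(z, dt)
print(np.linalg.norm(z))              # trajectory decays toward 0
```

With these illustrative coefficients both eigenvalues of A are strictly negative, so the semi-stability condition on the augmented system (and hence on the Schrödingerization embedding) is satisfied and the solution decays.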
Source: arXiv:2603.17941v1 - http://arxiv.org/abs/2603.17941v1 (PDF: https://arxiv.org/pdf/2603.17941v1)