Paper Title

Analytic theory for the dynamics of wide quantum neural networks

Paper Authors

Liu, Junyu, Najafi, Khadijeh, Sharma, Kunal, Tacchino, Francesco, Jiang, Liang, Mezzacapo, Antonio

Paper Abstract

Parameterized quantum circuits can be used as quantum neural networks and have the potential to outperform their classical counterparts when trained for addressing learning problems. To date, much of the results on their performance on practical problems are heuristic in nature. In particular, the convergence rate for the training of quantum neural networks is not fully understood. Here, we analyze the dynamics of gradient descent for the training error of a class of variational quantum machine learning models. We define wide quantum neural networks as parameterized quantum circuits in the limit of a large number of qubits and variational parameters. We then find a simple analytic formula that captures the average behavior of their loss function and discuss the consequences of our findings. For example, for random quantum circuits, we predict and characterize an exponential decay of the residual training error as a function of the parameters of the system. We finally validate our analytic results with numerical experiments.
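The exponential decay of the residual training error described in the abstract can be illustrated with a toy model. The sketch below is not the paper's quantum circuit: it assumes a linearized ("lazy") training regime in which gradient descent on a quadratic loss shrinks each eigenmode of a fixed kernel geometrically, so the training error decays exponentially at rates set by the kernel eigenvalues. All names (`eigs`, `eta`, the mode count `n`) are illustrative choices, not quantities from the paper.

```python
import numpy as np

# Toy illustration (not the paper's model): in a linearized training regime,
# gradient descent on L(r) = 0.5 * ||r||^2, where each residual mode r_i is
# multiplied by its own kernel eigenvalue in the gradient, contracts every
# mode geometrically -- hence exponential decay of the training error.

rng = np.random.default_rng(0)
n = 8                                  # number of residual modes (assumed)
eigs = rng.uniform(0.5, 1.5, size=n)   # eigenvalues of a fixed "kernel"
r0 = rng.normal(size=n)                # initial residuals in the eigenbasis
eta = 0.1                              # learning rate; eta * max(eigs) < 2 for stability

steps = 200
residual = r0.copy()
loss = []
for t in range(steps):
    loss.append(0.5 * np.sum(residual**2))
    residual -= eta * eigs * residual  # gradient step, mode by mode

# Closed form: r_i(t) = r_i(0) * (1 - eta * lam_i)^t, so the simulated loss
# should match the analytic geometric-decay prediction exactly.
pred = [0.5 * np.sum((r0 * (1 - eta * eigs) ** t) ** 2) for t in range(steps)]
print(np.allclose(loss, pred))
```

Plotting `loss` on a log scale would show the straight-line (exponential) decay whose rate is governed by the smallest eigenvalue, which is the qualitative behavior the paper derives analytically for wide quantum neural networks.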
