Avoiding local minima in variational quantum eigensolvers with the natural gradient optimizer

(2020). arXiv:2004.14666. Comment: 16 pages, 6 figures, 1 table.

Abstract

We compare the BFGS optimizer, ADAM, and Natural Gradient Descent (NatGrad) in the context of Variational Quantum Eigensolvers (VQEs). We systematically analyze their performance on the QAOA ansatz for the Transverse Field Ising Model (TFIM) as well as on overparametrized circuits with the ability to break the symmetry of the Hamiltonian. The BFGS algorithm is frequently unable to find a global minimum for systems beyond about 20 spins, and ADAM easily gets trapped in local minima. On the other hand, NatGrad shows stable performance on all considered system sizes, albeit at a significantly higher cost per epoch. In sharp contrast to most classical gradient-based learning, the performance of all optimizers is found to decrease upon seemingly benign overparametrization of the ansatz class, with BFGS and ADAM failing more often and more severely than NatGrad. Additional tests for the Heisenberg XXZ model corroborate the accuracy problems of BFGS in high dimensions, but they reveal some shortcomings of NatGrad as well. Our results suggest that great care needs to be taken in the choice of gradient-based optimizers and the parametrization for VQEs.
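The sketch below illustrates where NatGrad's higher per-epoch cost comes from: in addition to the ordinary gradient, every update requires estimating the Fubini-Study metric tensor of the ansatz state. This is a minimal sketch and not the paper's code; it assumes a recent PennyLane version (qml.metric_tensor, qml.IsingZZ), an open-chain TFIM at field strength h = 1, and illustrative choices for the QAOA depth, initial parameters, and step size.

```python
# Minimal sketch (not the paper's code) of one natural-gradient step
# for a QAOA-style TFIM ansatz, using PennyLane. Depth, field strength,
# initial parameters, and step size are illustrative assumptions.
import pennylane as qml
from pennylane import numpy as np

n_qubits, p = 4, 2                      # chain length and QAOA depth (illustrative)
dev = qml.device("default.qubit", wires=n_qubits)

# TFIM Hamiltonian H = -sum_i Z_i Z_{i+1} - h * sum_i X_i (open chain, h = 1)
coeffs = [-1.0] * (n_qubits - 1) + [-1.0] * n_qubits
obs = [qml.PauliZ(i) @ qml.PauliZ(i + 1) for i in range(n_qubits - 1)]
obs += [qml.PauliX(i) for i in range(n_qubits)]
H = qml.Hamiltonian(coeffs, obs)

@qml.qnode(dev)
def cost(params):
    # QAOA ansatz: alternate e^{-i gamma H_ZZ} and e^{-i beta H_X} layers
    for i in range(n_qubits):
        qml.Hadamard(wires=i)           # start in |+>^n, ground state of -sum_i X_i
    for layer in range(p):
        gamma, beta = params[2 * layer], params[2 * layer + 1]
        for i in range(n_qubits - 1):
            qml.IsingZZ(2 * gamma, wires=[i, i + 1])
        for i in range(n_qubits):
            qml.RX(2 * beta, wires=i)
    return qml.expval(H)

params = np.array([0.1] * (2 * p), requires_grad=True)
eta = 0.05

# One NatGrad step: theta <- theta - eta * F^+ grad, where F is the
# Fubini-Study metric tensor; estimating F is the extra per-epoch cost
grad = qml.grad(cost)(params)
F = qml.metric_tensor(cost, approx="block-diag")(params)
params = params - eta * np.linalg.pinv(F) @ grad
print(cost(params))
```

Replacing np.linalg.pinv(F) @ grad with grad alone recovers vanilla gradient descent, which makes NatGrad's overhead explicit: one metric-tensor estimate per step on top of the gradient evaluation.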

