Abstract
Quantum machine learning is an emerging field at the intersection of machine
learning and quantum computing. Classical cross entropy plays a central role in
machine learning. We define its quantum generalization, the quantum cross
entropy, prove its lower bounds, and investigate its relation to quantum
fidelity. In the classical case, cross entropy equals the negative
log-likelihood, so minimizing cross entropy is equivalent to maximizing
likelihood. In the quantum case, this relation holds when the quantum cross
entropy is constructed from quantum data undisturbed by quantum measurements.
When the quantum cross entropy is instead obtained through the empirical
density matrix built from measurement outcomes, it is lower-bounded by the
negative log-likelihood. The contrast between these two scenarios illustrates
the information loss incurred by quantum measurements. We conclude that to
achieve fully quantum machine learning, it is crucial to utilize the deferred
measurement principle.
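
For concreteness, the standard definitions behind the abstract can be stated
as follows (the paper's notation and precise bounds may differ): for a data
distribution p and a model distribution q, the classical cross entropy is
H(p, q) = -Σ_x p(x) log q(x), which, when p is the empirical distribution of
the data, is the average negative log-likelihood; a common quantum
generalization for density matrices ρ and σ is Q(ρ, σ) = -Tr(ρ log σ), which
is lower-bounded by the von Neumann entropy S(ρ) = -Tr(ρ log ρ) because the
relative entropy D(ρ‖σ) = Q(ρ, σ) - S(ρ) is non-negative. The sketch below
numerically checks that bound; the function names and the random-state
construction are illustrative choices, not taken from the paper.

```python
# Illustrative numerical check of the quantum cross entropy
# Q(rho, sigma) = -Tr(rho log sigma) and its von Neumann entropy lower bound.
# The helper names and random-state construction are ours, not the paper's.
import numpy as np
from scipy.linalg import logm


def random_density_matrix(dim, rng):
    """Sample a full-rank density matrix (so that logm is well defined)."""
    a = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
    m = a @ a.conj().T + 1e-6 * np.eye(dim)
    return m / np.trace(m)


def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log rho), in nats."""
    return float(np.real(-np.trace(rho @ logm(rho))))


def quantum_cross_entropy(rho, sigma):
    """Q(rho, sigma) = -Tr(rho log sigma), in nats."""
    return float(np.real(-np.trace(rho @ logm(sigma))))


rng = np.random.default_rng(0)
rho, sigma = random_density_matrix(4, rng), random_density_matrix(4, rng)

q = quantum_cross_entropy(rho, sigma)
s = von_neumann_entropy(rho)
print(f"Q(rho, sigma) = {q:.4f}, S(rho) = {s:.4f}")
# Non-negativity of relative entropy, D(rho||sigma) = Q - S >= 0,
# gives Q(rho, sigma) >= S(rho), with equality iff sigma equals rho.
assert q >= s - 1e-9
```

Running the sketch prints the two quantities for a pair of randomly sampled
states, and the final assertion confirms Q(ρ, σ) ≥ S(ρ) for that sample.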