On the design and analysis of near-term quantum network protocols using Markov decision processes

07/07/2022
by Sumeet Khatri, et al.

The quantum internet is one of the frontiers of quantum information science research. It will revolutionize the way we communicate and perform other tasks, and it will enable applications that are not possible with the current, classical internet. The backbone of a quantum internet is entanglement distributed globally, allowing such novel applications to be performed over long distances. Experimental progress is being made toward realizing quantum networks on a small scale, but much theoretical work is still needed to understand how best to distribute entanglement, especially with the limitations of near-term quantum technologies taken into account. This work provides an initial step towards that goal. We lay out a theory of near-term quantum networks based on Markov decision processes (MDPs), and we show that MDPs provide a precise and systematic mathematical framework for modeling near-term quantum network protocols that is agnostic to the specific implementation platform. We start by simplifying the MDP for elementary links introduced in prior work and by providing new results on policies for elementary links in the steady-state (infinite-time) limit; in particular, we show that the well-known memory-cutoff policy is optimal in this limit. We then show how the elementary-link MDP can be used to analyze a quantum network protocol in which we wait for all elementary links to be active before creating end-to-end links. Finally, we extend the MDP formalism to two elementary links, which is useful for analyzing more sophisticated quantum network protocols. Here, as new results, we derive linear programming relaxations that allow us to obtain optimal steady-state policies with respect to the expected fidelity and waiting time of the end-to-end link.
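To give a concrete flavor of the memory-cutoff policy discussed above, the following is a minimal simulation sketch. It assumes a simple Bernoulli model for elementary-link generation (success probability `p_succ`) and an exponential decay model for the fidelity of a stored link; these modeling choices and all parameter values are hypothetical illustrations, not the paper's formalism or numbers.

```python
import math
import random


def simulate_memory_cutoff(p_succ=0.3, t_cutoff=5, f0=0.95, decay=0.05,
                           n_steps=200_000, seed=0):
    """Simulate one elementary link under a memory-cutoff policy.

    Each time step: if no link is held, attempt generation (succeeds with
    probability p_succ, producing a fresh link of fidelity f0). If a link
    is held, it ages by one step and its fidelity decays exponentially;
    once its age reaches t_cutoff, the link is discarded and generation
    restarts. Returns the fraction of time steps in which a link is active
    and the average fidelity of the link while active.
    """
    rng = random.Random(seed)
    age = None               # None means no link is currently held in memory
    active_steps = 0
    fidelity_sum = 0.0

    for _ in range(n_steps):
        if age is None:
            # No link in memory: attempt elementary-link generation.
            if rng.random() < p_succ:
                age = 0
        else:
            # Link held: it ages; discard it when the cutoff is reached.
            age += 1
            if age >= t_cutoff:
                age = None
        if age is not None:
            active_steps += 1
            # Simple exponential decay model for the stored link's fidelity.
            fidelity_sum += f0 * math.exp(-decay * age)

    availability = active_steps / n_steps
    avg_fidelity = fidelity_sum / active_steps if active_steps else 0.0
    return availability, avg_fidelity


if __name__ == "__main__":
    for cutoff in (1, 3, 5, 10):
        avail, fid = simulate_memory_cutoff(t_cutoff=cutoff)
        print(f"cutoff={cutoff:2d}  availability={avail:.3f}  avg fidelity={fid:.3f}")
```

Running this sketch illustrates the basic tradeoff that cutoff policies manage: a longer cutoff keeps the link available more often, while a shorter cutoff keeps the average fidelity of the stored link higher.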
