Maintained coherence for better quantum information applications

How do we prevent decoherence in quantum information technology and, in turn, create more advanced applications? This question is at the heart of the MQC project, which investigated ways to maintain coherence in quantum systems.

With Moore’s Law soon to hit a wall, all eyes are on quantum information technology. We’re not too far off: quantum architectures such as trapped ions, colour centres in crystals and Rydberg atoms are already available and can be used to implement quantum information applications.

From there, the principle is simple, at least on paper: the longer quantum coherence can be maintained, the richer and more interesting quantum applications become. In other words, quantum physicists need new approaches to prevent decoherence caused by noise, leakage and decay channels.

Itsik Cohen has investigated ways to maintain this coherence and realise a variety of quantum applications under the MQC project. The Marie Skłodowska-Curie fellow agreed to discuss his approach and findings with us.

Your work focuses on quantum coherence. Why is it important for the future of quantum computing?

Itsik Cohen: In physics, coherence is maintained as long as waves preserve their relative phase, thus enabling interference. The same applies to quantum coherence, which is maintained as long as the quantum superposition (phase and amplitude) is kept stable.
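As a minimal textbook illustration of what a preserved relative phase means (not specific to the project), consider a single qubit in an equal superposition; dephasing noise randomises the relative phase, which suppresses the off-diagonal (coherence) elements of the density matrix while leaving the populations intact:

```latex
% A qubit superposition with relative phase \varphi:
\[
  |\psi\rangle = \frac{1}{\sqrt{2}}\bigl(|0\rangle + e^{i\varphi}|1\rangle\bigr).
\]
% Dephasing randomises \varphi. Averaging over a Gaussian-distributed phase
% with variance \sigma^2 shrinks the off-diagonal (coherence) elements,
% while the populations survive:
\[
  \rho = \frac{1}{2}
  \begin{pmatrix}
    1 & e^{-\sigma^{2}/2} \\
    e^{-\sigma^{2}/2} & 1
  \end{pmatrix}.
\]
```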

Quantum coherence is at the heart of quantum information technology, which can only be realised as long as coherence is preserved. In fact, as quantum applications increase in complexity, the coherence time needs to be extended accordingly. Likewise, longer coherence times translate into higher performance and higher quantum operation fidelity, which is extremely important, especially for quantum computing.

How do you achieve such coherence?

To maintain coherence, one should overcome noise, leakage and decay channels that constitute the main sources of decoherence. Refocusing techniques such as dynamical decoupling and quantum error correction have been developed specifically to this end.

The field of dynamical decoupling was born with Hahn’s idea to refocus inhomogeneous broadening in nuclear magnetic resonance (NMR), an effect named the spin echo. It is currently used in many areas of physics, from atomic systems to condensed matter. Breakthroughs in this field have given us the ability to initialise, manipulate and detect the state of a qubit with extremely high precision. Even more impressively, they have allowed the coherence time of qubits to be prolonged by many orders of magnitude. A complementary approach to pulsed dynamical decoupling is the continuous one: a continuous resonant driving field opens an energy gap that protects against the slow power spectrum of the decoherence source.
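To make the refocusing idea concrete, here is a rough numerical sketch of the Hahn spin echo with illustrative, hypothetical parameters (not taken from the project): an ensemble of spins dephases under random static detunings, and a single π pulse midway through the evolution inverts the accumulated phases so that they cancel at the echo time.

```python
# Minimal spin-echo illustration: static inhomogeneous broadening dephases an
# ensemble, and a pi pulse at time tau refocuses the coherence at time 2*tau.
import numpy as np

rng = np.random.default_rng(0)
n_spins = 10_000
detunings = rng.normal(0.0, 1.0, n_spins)   # random static detunings (rad per unit time)
tau = 5.0                                    # free-evolution time before the pi pulse

# Free induction decay: the ensemble coherence |<e^{i*delta*t}>| decays as spins dephase.
t = np.linspace(0.0, 2 * tau, 201)
fid = np.abs(np.exp(1j * np.outer(t, detunings)).mean(axis=1))

# Spin echo: phase grows as delta*t until the pi pulse at tau negates it, so for
# t >= tau the accumulated phase is delta*(t - 2*tau), which vanishes at t = 2*tau.
effective_time = np.where(t >= tau, t - 2 * tau, t)
echo = np.abs(np.exp(1j * np.outer(effective_time, detunings)).mean(axis=1))

print(f"coherence without echo at t = 2*tau: {fid[-1]:.3f}")   # ~0 (dephased)
print(f"coherence with echo    at t = 2*tau: {echo[-1]:.3f}")  # 1.000 (refocused)
```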

The field of quantum error correction (QEC), on the other hand, was born with Peter Shor’s syndrome measurement and feedback algorithm. In Shor’s code, a single logical qubit is encoded into nine physical qubits, and noise is detected by syndrome measurements that leave the encoded qubit subspace intact. One can then apply feedback operations to invert the noise process. Note that because QEC protocols demand more resources, it is experimentally preferable to use dynamical decoupling (DD) schemes when possible.
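As a hedged illustration of the syndrome-and-feedback idea, the following sketch implements the three-qubit bit-flip code, a building block of Shor’s nine-qubit construction (not the full code, and not taken from the project). Parity measurements locate a single bit flip without revealing the encoded amplitudes, and a feedback X inverts it.

```python
# Three-qubit bit-flip code sketch: encode, corrupt, measure syndrome, correct.
import numpy as np

# Encode |psi> = a|0> + b|1>  as  a|000> + b|111>  (amplitudes over the 8 basis states).
def encode(a, b):
    state = np.zeros(8, dtype=complex)
    state[0b000] = a
    state[0b111] = b
    return state

def apply_x(state, qubit):
    """Bit-flip X on the given qubit (0 = leftmost)."""
    out = np.zeros_like(state)
    for basis in range(8):
        out[basis ^ (1 << (2 - qubit))] = state[basis]
    return out

def syndrome(state):
    """Parities Z0Z1 and Z1Z2. After at most one bit flip, every nonzero component
    shares the same parities, so inspecting one basis state suffices for this sketch."""
    basis = int(np.flatnonzero(np.abs(state) > 1e-12)[0])
    bits = [(basis >> k) & 1 for k in (2, 1, 0)]
    return bits[0] ^ bits[1], bits[1] ^ bits[2]

# Example: encode, let noise flip the middle qubit, then detect and invert the error.
state = encode(0.6, 0.8j)
state = apply_x(state, 1)
s01, s12 = syndrome(state)
error_qubit = {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}[(s01, s12)]
if error_qubit is not None:
    state = apply_x(state, error_qubit)          # feedback inverts the noise process
print(np.round(state[[0b000, 0b111]], 3))        # recovered amplitudes [0.6, 0.8j]
```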

What would you say makes the project’s approach particularly innovative?

How refocusing schemes are used to maintain quantum coherence depends on the noise sources, which vary between quantum platforms and experimental setups. Naively applying such a protective intervention might also refocus, and thereby destroy, the desired quantum application.

Overcoming this difficulty is what makes my projects particularly innovative: a great deal of creativity is required to compensate for the noise while still realising the desired applications.

How did you proceed to solve problems with quantum communications platforms?

In a previous project, we theoretically proposed a scheme for entanglement distribution in quantum networks. By sending a single photon between quantum nodes – each consisting of an atomic qubit embedded in a cavity – we manage to generate a multi-control phase gate between the atoms. This is an important universal gate, needed for quantum search protocols. One of the experimental obstacles lies in maintaining the optical phase of the photon in the presence of optical path-length fluctuations. We overcame that problem by sending multiple photons to the quantum network: we used a pulsed version of dynamical decoupling to refocus the random optical phase.
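For readers unfamiliar with the gate mentioned above, here is a small, generic illustration (not the project’s photonic protocol) of a multi-control phase gate: for n qubits it adds a phase only to the all-ones basis state and leaves every other basis state untouched, which is exactly the ingredient used in quantum search (Grover-type) protocols.

```python
# Generic multi-control phase gate acting on a uniform superposition of 3 qubits.
import numpy as np

def multi_control_phase(n_qubits, phase=np.pi):
    """Identity on all basis states except |11...1>, which acquires exp(i*phase)."""
    dim = 2 ** n_qubits
    gate = np.eye(dim, dtype=complex)
    gate[dim - 1, dim - 1] = np.exp(1j * phase)
    return gate

n = 3
psi = np.full(2 ** n, 1 / np.sqrt(2 ** n), dtype=complex)   # uniform superposition
psi = multi_control_phase(n) @ psi
print(np.round(psi, 3))   # only the |111> amplitude changes sign (-0.354), the rest stay +0.354
```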

More recently, in a theory-experiment collaboration with the Weizmann Institute group led by Ofer Firstenberg, we demonstrated that continuous dynamical decoupling can be applied to protect a collective excitation in warm atoms against Doppler broadening. When a photon (or a weak coherent pulse) is absorbed by atomic vapour, the resulting global atomic excitation behaves as a spin wave: each atom acquires a phase determined by its position and the momentum of the absorbed photon. Because of the random atomic velocities, the global excitation is subject to Doppler decoherence that destroys the desired atomic phase.

To tackle this, we introduce an additional ancillary sensor state with the opposite sensitivity to the same Doppler mechanism. By smartly driving the transition between the excited and sensor states, we couple them and obtain a protected dressed state that is insensitive to Doppler noise. As a result, the coherence time is prolonged.
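A rough two-level sketch of the dressed-state idea behind continuous dynamical decoupling is shown below (an illustrative model with made-up parameters, not the project’s full atomic-level scheme). The bare excited and sensor states shift in opposite directions by the Doppler detuning, while a strong resonant drive couples them; the resulting dressed energies depend on the detuning only to second order, so slow Doppler noise is strongly suppressed.

```python
# Dressed-state protection: H = (delta/2) sigma_z + (omega/2) sigma_x in the {|e>, |s>} basis.
# Eigenvalues are +/- sqrt(omega^2 + delta^2)/2, i.e. first-order insensitive to delta.
import numpy as np

def dressed_energies(delta, omega):
    h = 0.5 * np.array([[delta, omega],
                        [omega, -delta]])
    return np.linalg.eigvalsh(h)          # ascending: (lower, upper)

omega = 10.0                               # drive (Rabi) strength, in units of the Doppler width
for delta in (0.0, 0.5, 1.0):              # slow Doppler detunings
    lower, upper = dressed_energies(delta, omega)
    print(f"delta = {delta:4.1f}: dressed splitting = {upper - lower:.4f}")
# Without the drive the two states would split linearly (by delta itself); with the drive
# the splitting changes only by ~delta**2 / (2 * omega).
```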

What are, according to you, the most important outcomes of these two projects so far and why?

Both projects are important to the field of quantum information technologies. Both have many further applications and can lead to further enriching research. However, as a theoretician, I believe that any theoretical proposal that matures into an experimental realisation ultimately becomes more important than one that does not. For that reason, I think the experiment-theory collaboration in the second project is the more important of the two – at least for now, until the first project is realised experimentally.

In the long run, what do you hope will be the impact of your research on quantum computing? What applications do you foresee?

Rydberg atoms have recently emerged as an interesting candidate platform for quantum computation. Our scheme can be used to protect against Doppler decoherence while still effectively populating the Rydberg states that are needed for generating interactions between different atoms. I am therefore sure that our scheme will be useful for upcoming experiments in quantum computing with Rydberg atoms.

Scaling quantum computation to a large number of qubits remains one of the main challenges of quantum information processing. Using our scheme for entanglement distribution in quantum networks goes hand in hand with the concept of a hierarchical quantum computer, where each quantum node is a small quantum computer of a few qubits. I am therefore convinced that, even though our scheme has yet to be verified experimentally, it eventually will be.

Can you tell us more about your follow-up plans, if any?

There are many options. One of them is to pursue academic research in the field of quantum technologies. On the other hand, there are many companies outside academia where I can also contribute. I haven’t quite decided yet!


published: 2020-06-02