A Linear Path to Efficient Quantum Technologies

Developing quantum technologies efficiently calls for a multifaceted approach spanning fields such as quantum computing, quantum communication, and quantum sensing.

Researchers at the University of Stuttgart have demonstrated that a key component of many quantum computation and communication schemes can be performed with efficiency that exceeds the commonly assumed upper theoretical limit, opening up new avenues for a variety of photonic quantum technologies.

Not only has quantum science transformed our understanding of nature, but it is also inspiring groundbreaking new computing, communication, and sensor devices. Exploiting quantum effects in such ‘quantum technologies’ usually requires a combination of in-depth understanding of the underlying quantum-physical principles, systematic methodological advances, and astute engineering.

It is precisely this combination that researchers in Prof. Stefanie Barz’s group at the University of Stuttgart and the Center for Integrated Quantum Science and Technology (IQST) have demonstrated in a recent study, in which they improved the efficiency of an essential building block of many quantum devices beyond an apparent inherent limit.

From philosophy to technology

A property known as quantum entanglement plays a key role in the field of quantum technologies. The concept first took shape in a heated debate between Albert Einstein and Niels Bohr.

Their argument, in a nutshell, was about how information can be shared across multiple quantum systems. Importantly, this can happen in ways for which classical physics has no analog. The debate that Einstein and Bohr started remained largely philosophical until the 1960s, when the physicist John Stewart Bell devised an experimental method to resolve the disagreement. Bell’s framework was first investigated in experiments with photons, or light quanta. Three pioneers in this field – Alain Aspect, John Clauser, and Anton Zeilinger – were jointly awarded last year’s Nobel Prize in Physics for their groundbreaking experiments with entangled photons.

Bell died in 1990, but his name lives on, not least in the so-called Bell states. These are the quantum states of two particles that are as strongly entangled as they can possibly be. There are four Bell states in total, and Bell-state measurements, which determine which of the four states a quantum system is in, are an essential tool for putting quantum entanglement into practice. Perhaps most famously, Bell-state measurements are a key component of quantum teleportation, which in turn underpins most schemes for quantum communication and computation.
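
In the standard notation of quantum information, the four Bell states of two qubits read as follows (a textbook definition, included here for reference):

```latex
% The four maximally entangled two-qubit Bell states
\begin{align}
  |\Phi^{\pm}\rangle &= \frac{1}{\sqrt{2}}\bigl(|00\rangle \pm |11\rangle\bigr), \\
  |\Psi^{\pm}\rangle &= \frac{1}{\sqrt{2}}\bigl(|01\rangle \pm |10\rangle\bigr).
\end{align}
```

A Bell-state measurement projects a pair of qubits onto this basis and reports which of the four states was present.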

But there is a problem: when experiments are performed with conventional optical elements such as mirrors, beam splitters, and waveplates, two of the four Bell states have identical experimental signatures and are therefore indistinguishable from each other. This means that the overall probability of success (and thus the success rate of, say, a quantum-teleportation experiment) is inherently limited to 50 percent if only such ‘linear’ optical components are used. Or is it?
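
To see where the 50-percent limit comes from, it helps to recall the textbook linear-optical scheme in which the two photons, carrying the Bell state in their polarization, interfere on a balanced beam splitter. The signatures below follow that standard account (polarization encoding, one balanced beam splitter) and are not necessarily the specific configuration used in Stuttgart:

```latex
% Output signatures of the standard two-photon Bell-state analyzer
\begin{tabular}{ll}
  $|\Psi^{-}\rangle$:                       & one photon in each output port (unique signature) \\
  $|\Psi^{+}\rangle$:                       & two orthogonally polarized photons in one port (unique signature) \\
  $|\Phi^{+}\rangle$, $|\Phi^{-}\rangle$:   & two identically polarized photons in one port (identical signatures) \\
\end{tabular}
```

Only $|\Psi^{+}\rangle$ and $|\Psi^{-}\rangle$ can be identified unambiguously; the two $|\Phi^{\pm}\rangle$ states produce the same detection pattern, so on average at most two of the four states, i.e. 50 percent, can be distinguished.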

With all the bells and whistles

This is where the work of the Barz group comes in. As they recently reported in the journal Science Advances, doctoral researchers Matthias Bayerbach and Simone D’Aurelio carried out Bell-state measurements in which they achieved a success rate of 57.9 percent. But how did they reach an efficiency that should have been unattainable with the tools available?

Their outstanding result was made possible by using two additional photons in tandem with the entangled photon pair. It has long been known in theory that such ‘auxiliary’ photons offer a way to perform Bell-state measurements with an efficiency beyond 50 percent. The experimental realization, however, had remained elusive.
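
For a sense of the theoretical scaling: in one well-known family of proposals, supplying $2^{N}-2$ ancillary photons prepared in entangled states raises the success probability to the value below (a result from the linear-optics literature, stated as background rather than as a description of the exact Stuttgart scheme):

```latex
% Success probability with 2^N - 2 entangled ancillary photons
p_{\text{success}} = 1 - \frac{1}{2^{N}}, \qquad N = 1, 2, 3, \ldots
```

Here $N = 1$ (no ancillae) reproduces the 50-percent limit, and each further level of ancillae halves the remaining failure probability; the 62.5-percent bound quoted below corresponds to the specific two-ancilla arrangement used in the experiment.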

One reason for this is that the scheme demands sophisticated detectors capable of determining the number of photons impinging on them. Bayerbach and D’Aurelio overcame this difficulty by employing 48 single-photon detectors working in near-perfect synchrony to resolve the precise states of up to four photons arriving at the detector array. With this capability, the team was able to detect distinct photon-number distributions for each Bell state – albeit with some overlap for the two previously indistinguishable states, which is why, even in theory, the efficiency cannot exceed 62.5 percent. But the 50-percent barrier has been broken. Furthermore, the probability of success can be brought arbitrarily close to 100 percent, at the expense of adding a greater number of ancilla photons.
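
Why can a bank of 48 ordinary single-photon detectors stand in for a photon-number-resolving detector? In a toy model, each arriving photon is routed quasi-uniformly onto one of the detectors; a count is only lost when two photons happen to hit the same detector. The sketch below estimates that collision probability by simulation. It is an illustration of the multiplexing idea under idealized assumptions (uniform routing, no loss), not the team’s actual detector model:

```python
import random

def undercount_probability(n_photons: int, n_detectors: int,
                           trials: int = 100_000) -> float:
    """Estimate the probability that at least two photons land on the
    same detector, so that the number of clicks underestimates the
    true photon number (idealized multiplexed detection, no loss)."""
    undercounts = 0
    for _ in range(trials):
        hits = [random.randrange(n_detectors) for _ in range(n_photons)]
        if len(set(hits)) < n_photons:  # collision: two photons, one click
            undercounts += 1
    return undercounts / trials

if __name__ == "__main__":
    for n in range(2, 5):  # up to the four photons used in the experiment
        p = undercount_probability(n, 48)
        print(f"{n} photons on 48 detectors: ~{p:.1%} chance of undercounting")
```

In this idealized model, four photons are undercounted in roughly 12 percent of events; spreading the light over many detectors keeps such ambiguities rare enough that large multiplexed arrays can approximate true photon-number resolution.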

Bright prospects

Even the most sophisticated experiment is plagued by imperfections, and this reality has to be taken into account when analyzing the data and predicting how the technique would perform in larger systems. The Stuttgart researchers therefore teamed up with Prof. Dr. Peter van Loock, a theorist at the Johannes Gutenberg University Mainz and one of the architects of the ancilla-assisted Bell-state measurement scheme.

Van Loock and Barz are both members of the PhotonQ collaboration, which is funded by the German Federal Ministry of Education and Research (BMBF) and brings together academic and industrial partners from across Germany to work on the development of a specific type of photonic quantum computer. The improved Bell-state measurement scheme is one of the first fruits of this collaborative effort.

Although the increase in efficiency from 50 to 57.9 percent may appear minor, it provides a significant advantage in scenarios requiring a series of sequential measurements, such as long-distance quantum communication. Crucially for such upscaling, the linear-optics platform has low instrumental complexity compared with other approaches.
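
The practical weight of those extra percentage points becomes apparent when many Bell-state measurements must all succeed in sequence, as in a long-distance link. The back-of-the-envelope sketch below compares the compound success probability for the conventional 50-percent limit, the measured 57.9 percent, and the 62.5-percent theoretical cap, assuming independent attempts (an illustration, not a full repeater model):

```python
def chain_success(p_single: float, n_steps: int) -> float:
    """Probability that n_steps consecutive Bell-state measurements
    all succeed, assuming independent attempts."""
    return p_single ** n_steps

if __name__ == "__main__":
    rates = {
        "conventional limit": 0.500,
        "measured (Stuttgart)": 0.579,
        "theoretical cap": 0.625,
    }
    for label, p in rates.items():
        print(f"{label:>20}: 10 steps -> {chain_success(p, 10):.4%}")
```

Over ten sequential measurements, the measured 57.9 percent already yields more than four times the end-to-end success rate of the 50-percent baseline.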