Friday, 26 September 2014
First Quantum Logic Operation For An Integrated Photonic Chip
The first teleportation of a photon inside a photonic chip illustrates both the potential for quantum computation and the significant challenges that lie ahead.
Back in 2001, an obscure group of theoretical physicists proved a remarkable result. They showed that it was possible to build a quantum computer out of ordinary optical components, such as mirrors, lenses, lasers and so on.
That was something of a surprise. Until then, physicists had thought that quantum computing would only be possible using non-linear crystals and other exotic components that are hard to control.
The prospect of using ordinary bits and pieces has an important consequence. It immediately suggests that more powerful devices can be built simply by adding extra components. Scalability of this kind has always eluded other attempts to build quantum computers.
The reaction from the theoretical physics community was barely controlled excitement. But in practice, this approach has never lived up to its early promise. That’s because it is hard to build even ordinary optical components into chip-like devices that can be scaled like conventional silicon chips. It is just not possible to manufacture them with the required performance and tolerances.
Today, Benjamin Metcalf at the University of Oxford and a few pals show how they are tackling these problems while aiming for the ultimate goal of scalable quantum computation. These guys have built the first self-contained photonic chip capable of teleportation, one of the fundamental logic operations necessary for quantum computation. The device is a proof-of-principle demonstration that scalable quantum computers of this type are firmly in the crosshairs of experimental physicists. But it also reveals that significant challenges lie ahead.
Quantum teleportation is a standard procedure in quantum optics laboratories all over the world. It aims to transfer the quantum state of an input qubit, Q1, to a target qubit, Q3. The process begins by creating a pair of entangled photons, Q2 and Q3. These share the same quantum existence so that a measurement on one immediately influences the other.
This measurement is important. Known as a two-qubit Bell-state measurement, it is carried out on both Q1 and Q2 at the same time. Because Q2 is entangled with Q3, this results in the quantum state of Q1 being transferred to Q3. In other words, the quantum state of Q1 is teleported to Q3, which may be in an entirely different part of the universe.
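The protocol described above can be sketched numerically. The snippet below is a minimal textbook simulation, not the chip's optics: it prepares an arbitrary input state on Q1, entangles Q2 and Q3 in a Bell pair, projects Q1 and Q2 onto each of the four Bell states, and applies the corresponding Pauli correction to Q3. Each outcome occurs with probability 1/4, and in every case Q3 ends up in the input state.

```python
import numpy as np

# Pauli operators
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

# Q1: input qubit in an arbitrary (illustrative) state alpha|0> + beta|1>
alpha, beta = 0.6, 0.8j
psi = np.array([alpha, beta], dtype=complex)

# Q2,Q3: entangled Bell pair (|00> + |11>)/sqrt(2)
pair = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

# Joint three-qubit state: Q1 (x) (Q2,Q3)
state = np.kron(psi, pair)

# The four Bell states of the Q1,Q2 measurement, each paired with the
# Pauli correction that outcome tells us to apply to Q3
s2 = np.sqrt(2)
bell_basis = [
    ("Phi+", np.array([1, 0, 0,  1], dtype=complex) / s2, I2),
    ("Phi-", np.array([1, 0, 0, -1], dtype=complex) / s2, Z),
    ("Psi+", np.array([0, 1, 1,  0], dtype=complex) / s2, X),
    ("Psi-", np.array([0, 1, -1, 0], dtype=complex) / s2, Z @ X),
]

results = []
for name, b, correction in bell_basis:
    # Project Q1,Q2 onto this Bell state: contract the first two qubits with <b|
    q3 = b.conj() @ state.reshape(4, 2)       # unnormalised Q3 state
    prob = np.vdot(q3, q3).real               # outcome probability (1/4 each)
    q3 = correction @ (q3 / np.sqrt(prob))    # collapse and correct
    fidelity = abs(np.vdot(psi, q3)) ** 2     # overlap with the input state
    results.append((prob, fidelity))
    print(f"{name}: p = {prob:.2f}, fidelity = {fidelity:.3f}")
```

For every measurement outcome the printed fidelity is 1.000: with ideal components, teleportation is exact. The chip's job is to approximate each of these steps in integrated optics, which is where the losses discussed below come in.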
This process is usually carried out using low intensity laser beams and ordinary components such as mirrors and optical fibres. But the new photonic device shrinks all these components onto a single silicon chip.
It has a source of photons, beam splitters, and silica waveguides to channel the photons through the device, as well as components for creating and measuring quantum bits, or qubits. One of the key questions these guys set out to answer is how well each of these components works and how their limitations contribute to the overall performance of the chip.
Until now, one problem with this approach has been that it is difficult to create high-quality single photons in chip-based devices. What’s more, these photons tend to get absorbed by imperfect beam splitters or scattered in the silica waveguides, dramatically reducing the robustness of the process.
The advance that Metcalf and co have achieved is to dramatically improve the quality of their single photon sources while characterising the losses from other optical components such as beam splitters and waveguides for the first time. In doing so they’ve demonstrated one of the basic logic operations of quantum computing inside a photonic chip for the first time: the teleportation of a qubit.
The new chip is by no means perfect: it performs with around 89 percent fidelity. One source of errors is the photon source, which is far from ideal. “Whilst the success of this experiment relies on our development of high-quality single photon sources with exceptional heralded state purity and heralding efficiency, the absence of a true on-demand single photon source continues to limit the achievable fidelity,” they say.
A more significant source of errors is the non-ideal beam splitters, which by themselves reduce the fidelity of the device to around 90 percent. That’s good enough for secure communication. “But it is still below the fidelity of 99% thought to be required for a fault-tolerant quantum computer,” admit Metcalf and co.
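A rough way to see how component imperfections eat into fidelity is a toy noise model (this is an illustration, not the paper's error analysis): treat the accumulated component errors as a depolarising channel that, with some probability p, replaces the teleported state with pure noise. Fidelity then falls as F = 1 - p/2, so even modest noise pulls the device well below the 99 percent fault-tolerance threshold.

```python
import numpy as np

# Ideal teleported qubit |psi> = alpha|0> + beta|1> (illustrative values)
alpha, beta = 0.6, 0.8j
psi = np.array([alpha, beta], dtype=complex)
rho_ideal = np.outer(psi, psi.conj())

def depolarize(rho, p):
    """Toy error model: with probability p the state is replaced
    by the maximally mixed state (pure noise)."""
    return (1 - p) * rho + p * np.eye(2) / 2

def fidelity(psi, rho):
    """F = <psi| rho |psi> for a pure target state."""
    return np.vdot(psi, rho @ psi).real

# Hypothetical error rates, not measured figures from the chip
for p in (0.0, 0.1, 0.2):
    rho = depolarize(rho_ideal, p)
    print(f"noise p = {p:.1f} -> fidelity = {fidelity(psi, rho):.2f}")
# -> fidelities 1.00, 0.95, 0.90
```

In this simple model, a 20 percent depolarising error already drags fidelity down to 0.90, roughly the level the non-ideal beam splitters impose, which makes clear how little noise budget remains for the other components.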
It is inevitable that beam splitters and waveguides made in this way will deviate from their design parameters. The challenge is to ensure that these deviations are kept to a minimum or corrected by other components in real time.
Finally, future photonic chips will need better materials that reduce the loss of photons due to scattering. That becomes particularly important as chips become larger and more complex.
So the scale of the future challenges is clear. If physicists want to build photonic chips capable of carrying out quantum computation, they will need better photon guns, less lossy materials and active components that can measure and correct aberrations in real time.
That’s a big ask. Large-scale quantum computers are coming but on this evidence, not in the very near future in photonic form.