data compression technique that could save significant amounts of quantum memory, bringing quantum technologies closer to reality...

(Squished cat image)

F U CN RD THS, you already appreciate the basics of data compression (and have a very clean mind). Data compression is a technology we all use every day, whether we think about it or not. It is what allows huge media files to be stored in formats that fit manageably on the USB drive on your keychain, or to be sent efficiently over the internet. In its simplest form, compression relies on redundancy. Just as many vowels are superfluous – the list of consonants "vwls" already uniquely specifies which word you have in mind – many of the bits in large computer files are unnecessary. Think of sending an image file, pixel by pixel. The image may contain many black regions – a whole lot of "zeroes" in the data file. Why type 1000 zeroes in a row (using up 1000 bytes of memory) when you could just say "now insert 1000 zeroes," an instruction that can be stated in just a few bytes? Patterns, repetition – all the same things we use to process an image mentally when we look at it – can be used to store the important information in a smaller file. After all, if a picture is only worth 1000 words, why should it take a million bytes to store it?
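The "now insert 1000 zeroes" trick is the essence of run-length encoding, one of the simplest compression schemes. Here is a minimal sketch (real formats like PNG or ZIP use far more sophisticated methods, but the principle is the same):

```python
def rle_encode(data):
    """Collapse runs of repeated values into (value, count) pairs."""
    encoded = []
    for value in data:
        if encoded and encoded[-1][0] == value:
            encoded[-1] = (value, encoded[-1][1] + 1)
        else:
            encoded.append((value, 1))
    return encoded

def rle_decode(encoded):
    """Expand (value, count) pairs back into the original sequence."""
    return [value for value, count in encoded for _ in range(count)]

# An "image row" with big black regions: 1503 pixels shrink to three pairs.
pixels = [0] * 1000 + [1] * 3 + [0] * 500
compressed = rle_encode(pixels)
print(compressed)                        # [(0, 1000), (1, 3), (0, 500)]
assert rle_decode(compressed) == pixels  # lossless: nothing was thrown away
```

Note that the scheme only wins when the data actually contains long runs; a file of random pixels would get no smaller.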

But a new set of technologies based on quantum information is now emerging. In the quantum computers scientists are aiming to build, instead of merely being a 1 or a 0, each bit (or "qubit," for "quantum bit") would be like a little Schrödinger's Cat – where the famous feline is known for being alive and dead at the same time, a qubit can be 1 and 0 at the same time. It has been shown that such qubits could be amazingly powerful for all sorts of tasks, from transmitting ultra-secure codes to breaking the codes we previously considered secure, and from simulating the quantum mechanics of previously mystifying complex systems to speeding up the search for items in an unsorted database. But first we must learn to harness and store quantum information. In particular, the development of quantum memories has only just begun, and is proving remarkably difficult – it is hard enough to keep a cat alive in a sealed box, but imagine the task of keeping it alive and dead at the same time (without anyone on the outside being able to get a whiff of which is the case)!
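The "1 and 0 at the same time" idea can be made concrete with a few lines of plain Python (an illustrative sketch, not how real quantum hardware works): a qubit's state is a pair of complex amplitudes (a, b) with |a|² + |b|² = 1, where |a|² gives the probability of finding 0 and |b|² of finding 1 when you look.

```python
import math
import random

# "Alive and dead in equal measure": equal amplitudes for 0 and 1.
a, b = 1 / math.sqrt(2), 1 / math.sqrt(2)
assert abs(a**2 + b**2 - 1) < 1e-12  # amplitudes must be normalized

def measure(a, b):
    """Looking forces the qubit to 'choose': it collapses to 0 or 1 at random."""
    return 0 if random.random() < abs(a) ** 2 else 1

counts = [measure(a, b) for _ in range(10000)]
print(sum(counts) / len(counts))  # hovers near 0.5: half the outcomes are 1
```

Each individual measurement yields only a definite 0 or 1; the superposition itself is never seen directly, which is exactly what makes storing such states so delicate.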

One of the fascinating – and troublesome – aspects of quantum information is light-heartedly known as the "no-cloning theorem." If I give you a classical bit and ask you to make 1000 copies of it, you can easily (if laboriously) type in 1000 zeroes or 1000 ones. But with a qubit, no such thing is possible. You cannot measure the state of Schrödinger's Cat without disturbing it (forcing it to "choose" life or death once and for all), and therefore you cannot make additional copies. On the flip side, unlike in our classical world, if you had 1000 identical copies of a qubit, you could extract much more information from them than you could from just a single one. This fact is central to the business of characterizing quantum states and quantum processors: to get good information about a quantum system, you need many identically prepared copies of it.
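A quick simulation shows why many copies beat one (a sketch with an assumed, illustrative state parameter `t`): a single measurement of a qubit returns just one 0 or 1, revealing almost nothing, but the fraction of 1s over many identically prepared copies pins down the underlying probability ever more precisely.

```python
import math
import random

t = 0.6                    # the unknown parameter of the prepared state
p1 = math.sin(t) ** 2      # probability that measuring one copy yields 1

def estimate(n_copies):
    """Measure n identically prepared copies; return the empirical fraction of 1s."""
    ones = sum(random.random() < p1 for _ in range(n_copies))
    return ones / n_copies

for n in (1, 100, 10000):
    print(n, estimate(n))  # the estimate of p1 sharpens as n grows
```

With one copy the "estimate" is always exactly 0 or 1; with thousands of copies the statistical error shrinks like 1/√n, which is why state characterization demands many identical copies.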

Given all of this, it had not been clear what the fate of data compression would be in the world of quantum information. In an upcoming edition of Physical Review Letters, a group of physicists at the University of Toronto, led by CIFAR Fellow Aephraim Steinberg and supported by NSERC, aided by a collaboration with the University of Tokyo, present an algorithm for quantum data compression, and demonstrate experimentally that it works. This algorithm is applicable to a very specific problem: if you are given a large number of identically prepared quantum systems (as described earlier), do you need to store all of them, or is there some way to compress all the information about this quantum state into a smaller memory? The researchers found that the state could be compressed exponentially – that is, while it would take a 10-qubit memory to store all the information about 1000 identical qubits, it would only take an 11-qubit memory to store all the information about 2000, and so on. 20 qubits would contain everything you'd ever want to know about a million. In this proof of principle, a million was a little bit out of reach; the scientists contented themselves to taking 3 qubits and proving they could compress them into 2. The information was stored in the paths and polarisations of pairs of entangled photons, and by applying some of the latest ideas in optical quantum computing, the group was able to carry out the compression and prove that all the information was still present in the smaller "file" at the end.
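The numbers quoted above follow from a simple back-of-the-envelope count: the collective state of N identically prepared qubits lives in a space of only N+1 dimensions (the symmetric subspace), and a memory of k qubits holds a 2^k-dimensional space, so k = ⌈log₂(N+1)⌉ qubits suffice.

```python
import math

def qubits_needed(n_copies):
    """Qubits required to store the (n_copies + 1)-dimensional collective state."""
    return math.ceil(math.log2(n_copies + 1))

for n in (3, 1000, 2000, 1_000_000):
    print(n, "->", qubits_needed(n))
# 3 -> 2, 1000 -> 10, 2000 -> 11, 1000000 -> 20
```

Doubling the number of copies adds only about one qubit of memory, which is the exponential saving the experiment demonstrates at its smallest scale: 3 qubits into 2.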

Quantum information offers a wealth of possible applications, but also raises deep questions about the relationship between physics and information – how much information can be stored in a given physical system? How much information is needed to describe the state of a quantum system? This work sheds light on some of the striking differences between information in the classical and quantum worlds, while also promising to provide an exponential reduction in the amount of quantum memory needed for certain tasks.

The technical article, "Quantum Data Compression of a Qubit Ensemble" by Lee Rozema, Dylan Mahler, Alex Hayat, Peter Turner, and Aephraim Steinberg, was published as Phys. Rev. Lett. 113, 160504 (2014); and David Lindley wrote a "Focus" article about it for Physics 7, 106 (2014).

**For Further Reading:**

"Squeeze me, tease me, please me!" by John Preskill

Bacon, Chuang, & Harrow, "Efficient Quantum Circuits for Schur and Clebsch-Gordan Transforms," PRL 97, 170502 (2006)

Plesch and Buzek, "Efficient compression of quantum information," PRA 81, 032317 (2010)

"Quantum Data are Compressed for the First Time" (Physics World)

Quantum bits get their first compression (Nature News)
