May to August 1999

**Organisers**: A Albrecht (*UC Davis*), Peter Knight (*Imperial*), RM Solovay (*Berkeley*), W Zurek (*LANL*)

10 May to 20 August 1999

Organisation

Participation

Meetings and Workshops

Achievements

Conclusions

The remarkable increase in the power of computers over the last few decades creates the illusion that this process will continue indefinitely. However, it is clear that in the next 20 years a limit will be reached which is dictated by fundamental physical processes: within this time-scale the size of individual components etched onto micro-chips will approach atomic dimensions. In order to allow progress to continue a fundamentally new approach will be needed. Interestingly, at this fundamental component-size limit, the quantum mechanical nature of the interactions within the computer would become important. These considerations have led researchers to question whether a new type of computer could be constructed - a quantum computer - which would take advantage of quantum behaviour in order to perform computing tasks in different and vastly more efficient ways. At the heart of these ideas is the notion of entanglement in which the quantum states of different particles or fields become linked in a fundamental way leading to surprising non-local phenomena. Theoretical work has shown that a quantum processor may be able to perform certain important tasks much more efficiently than its classical counterpart because of the intrinsically parallel nature of quantum computations.
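As a minimal numerical illustration of entanglement (not drawn from the programme itself), the following Python sketch builds the maximally entangled Bell state by applying a Hadamard gate and then a CNOT gate to two qubits initialised in |00⟩; the resulting state cannot be written as a product of single-qubit states:

```python
import numpy as np

# Computational basis states for one qubit
zero = np.array([1.0, 0.0])

# Hadamard gate: creates an equal superposition of |0> and |1>
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

# CNOT gate: flips the second qubit when the first is |1>
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

# Start in |00>, put the first qubit in superposition, then entangle
state = np.kron(H @ zero, zero)
bell = CNOT @ state
print(bell)  # (|00> + |11>)/sqrt(2): amplitudes [0.707, 0, 0, 0.707]
```

The correlations in the final state (measuring either qubit determines the other) are the non-local feature exploited throughout quantum information processing.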

The emerging field of quantum information processing has grown at a staggering rate over the last few years. Up until now much of the activity has been theoretical but a few seminal experiments have been performed. These early experiments have highlighted just how difficult it will be to build a working quantum computer, by identifying decoherence in quantum systems as a key issue in practical implementations. The quantum superpositions used in this type of processor are very fragile and can be destroyed by a wide variety of sources of dissipation and noise that may prove to be very difficult to isolate and control. As a result, the study of decoherence has become a vital ingredient for current and future experimental programmes.

Thus far, only two technologies have been used to demonstrate simple quantum gates: single trapped atomic ions and nuclear magnetic resonance (NMR) in macroscopic samples. These are illustrated in Fig 1 (below), from the ion-trap quantum computing group of Prof Blatt in Innsbruck, showing laser-cooled trapped ions held in space by confining electrodes, and in Fig 2 (below), from Dr Laflamme’s discussion of his NMR work during the programme.

There is currently a debate under way as to whether the NMR experiments truly qualify for ‘quantum processor’ status, since an ensemble of approximately 10^23 systems is used. This methodology also suffers in that it is not scalable to large numbers of quantum bits (qubits). By contrast, the trapped ion work is less developed but clearly generates ‘true’ quantum gates since it is intrinsically a single atom technique.

Both approaches have shown the deleterious effects of decoherence. Fortuitously, it has been shown that quantum information processing may be possible even in the presence of a certain amount of decoherence through quantum error correction procedures. Much work was devoted to this topic at the programme, with some emphasis being given to a comparison of the merits of using quantum error correction or using the newly identified ‘Decoherence-Free Subspaces’.
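The idea behind quantum error correction can be sketched with its simplest instance, the three-qubit bit-flip code. The following Python toy (an illustrative sketch restricted to a single known error type, not a model of the schemes discussed at the programme) encodes a logical qubit redundantly, applies a bit-flip error, and recovers the state from parity-check syndromes:

```python
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=float)  # bit-flip (Pauli-X)
I = np.eye(2)

def on(qubit, op, n=3):
    """Apply a single-qubit operator to one qubit of an n-qubit register."""
    out = np.array([[1.0]])
    for i in range(n):
        out = np.kron(out, op if i == qubit else I)
    return out

# Encode a logical qubit a|0> + b|1> as a|000> + b|111>
a, b = 0.6, 0.8
encoded = np.zeros(8)
encoded[0b000], encoded[0b111] = a, b

# A bit-flip error hits qubit 1
corrupted = on(1, X) @ encoded

def syndrome(state):
    """Parities of qubit pairs (0,1) and (1,2) locate the flipped qubit."""
    s01 = s12 = 0
    for idx, amp in enumerate(state):
        if abs(amp) > 0:
            bits = [(idx >> (2 - k)) & 1 for k in range(3)]
            s01, s12 = bits[0] ^ bits[1], bits[1] ^ bits[2]
    return s01, s12

s01, s12 = syndrome(corrupted)
flip = {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}[(s01, s12)]
recovered = corrupted if flip is None else on(flip, X) @ corrupted
print(np.allclose(recovered, encoded))  # True
```

Crucially, the syndrome identifies the error without measuring the encoded amplitudes themselves, which is what allows correction to proceed without destroying the superposition.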

In Fig 3 (below), we show (from Kwiat et al) the angular distribution of correlated photons from an optical parametric amplifier: the rings correspond to different frequencies emitted. The sum of the upper and lower beam frequencies must add up to the pump laser frequency; this plus momentum conservation generates maximally entangled pairs of photons used in much of quantum information processing (including quantum teleportation, a central topic of discussion at the programme).

In Fig 4 (below), we show an image of two laser-cooled trapped ions at NIST in Boulder: these can be entangled and used to implement two-qubit quantum gates.

The core of the programme was concerned with the role of quantum coherence in superpositions, and especially entangled states, when embedded in a decoherent environment. The programme brought together experts in quantum coherence, quantum information theory, decoherence, complexity theory, and the like. It was the first major workshop in this rapidly developing area and was, in our view, highly successful. The Workshop organising team consisted of the Programme Organisers plus workshop organisers R Jozsa (Plymouth - now Bristol) and M B Plenio (Imperial College).

In addition, we were helped in many ways by the presence of two local experts who were then at the Newton Institute, S Popescu and N Linden.

The initial proposer and organiser, Prof A Albrecht, moved from Imperial to UC Davis in 1998, and Prof P L Knight from Imperial then joined the organising team to maintain a UK resident organiser.

The long-term programme ran at the Newton Institute, but the workshops were transferred rather late in the day (due to building activities and associated disturbance) to New Hall, where some practical difficulties were experienced.

The finances of the workshops were somewhat precarious, and only the co-location of the initial meeting of the European Science Foundation Network (coordinated by Dr Plenio) with the INI workshops made possible the high level of activity we eventually enjoyed.

The facilities provided for us at the INI were outstanding. If we were to make minor criticisms, we should say that it is incredible that a modern building with so many windows does not have an effective air-conditioning system.

We were very grateful to the local staff of the INI for their professional assistance before and during the meeting. Their help was invaluable.

The week-by-week programmes of the workshops were outstanding. In particular, the second week of the programme consisted of a workshop on Entanglement and Quantum Information Processing, which was organised in conjunction with the ESF programme on Quantum Information Theory and Quantum Computation (launched at the beginning of 1999).

About 100 researchers from all over the world participated. Exchange of ideas was intense and stimulating: a number of publications posted to the Los Alamos preprint server acknowledge the workshop, and more work is in preparation. In the conference programme, leading researchers as well as young researchers (PhD students and young postdocs) were given the opportunity to present their latest results. It was encouraging to see that a substantial fraction of the talks reported work that had not been published in any journal nor appeared on the Los Alamos preprint server. This enhanced the active, workshop atmosphere of the meeting.

The scientific standard was high and a number of exciting new results, both experimental and theoretical, were reported. On the experimental side, for example, the work presented by Haroche (Paris), which appeared in Nature after the meeting, and the impressive progress on quantum cryptography reported by Gisin (Geneva) and Hughes (Los Alamos), generated much interest. The theoretical study of quantum entanglement of finite systems received a strong stimulus from work reported by Nielsen (Caltech), who introduced a new mathematical structure for this situation. Plenio (London) further demonstrated that entanglement can be used in a catalytic way without being consumed, revealing an entirely new quality of entanglement. Both results will appear in PRL. The workshop had a strongly interdisciplinary nature, with participants ranging from chemistry, experimental and theoretical physics (quantum optics, solid-state physics and even cosmology) and mathematics to computer science. The resulting interdisciplinary activities are exemplified by the work on quantum computation in Nuclear Magnetic Resonance (Laflamme, Knill and Jones, as reported in PRL and Nature).

It was the general feeling that the meeting was very successful and marked the most important event in the field of quantum information theory in 1999. The scientific quality of the meeting demonstrated that the field is healthy and progressing well.

Major new insights were obtained. In particular:

- The need for entanglement in the speed-up of quantum computing, and the way in which NMR quantum computing works.
- The role quantum teleportation plays in secure quantum communication.
- The construction of bounds on the amount of entanglement within a particular mixed state.
- The use of a new mathematical technique of ‘majorisation’ in quantum information theory.
- The emergence of classicality through the intervention of decoherence, and the role of ‘erasure’.
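The majorisation technique mentioned above has a concrete statement: a bipartite pure state with Schmidt-coefficient vector λ_ψ can be converted to one with vector λ_φ by local operations and classical communication if and only if λ_ψ is majorised by λ_φ. A minimal Python sketch of the majorisation check, with illustrative example vectors of our own choosing:

```python
import numpy as np

def majorised(x, y):
    """True if x is majorised by y (x ≺ y): every partial sum of the
    entries of x, sorted in decreasing order, is bounded by the
    corresponding partial sum for y (with equal totals)."""
    xs = np.sort(np.asarray(x))[::-1]
    ys = np.sort(np.asarray(y))[::-1]
    if not np.isclose(xs.sum(), ys.sum()):
        return False
    return bool(np.all(np.cumsum(xs) <= np.cumsum(ys) + 1e-12))

# Example: a maximally entangled pair of qubits (Schmidt vector
# [0.5, 0.5]) versus a less entangled state ([0.8, 0.2]).
lam_psi = np.array([0.5, 0.5])
lam_phi = np.array([0.8, 0.2])
print(majorised(lam_psi, lam_phi))  # True: conversion by LOCC possible
print(majorised(lam_phi, lam_psi))  # False: the reverse is impossible
```

The one-way character of the check reflects the fact that local operations can only degrade entanglement, never create it.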

The programme fulfilled the aims of the initial proposal. Scientists from different disciplines were able to interact and share insights. New results were obtained and a number of important papers written. The informal seminar series was valuable in making participants known to each other and in encouraging interaction. Many of the European participants were able to join forces in formulating collaborative applications for funding to the European Union for further support in this area. (We learnt that the EU has committed 16 million euros to this field and essentially every senior participant in our programme is now involved in this new collaboration for the future.)

Publications are being collected by the INI Information Officer. I have noted many papers appearing on the quant-ph Los Alamos server with INI acknowledgments.