Scalable Architectures for Quantum Simulation

635th WE-Heraeus-Seminar

Event Report

29 Jan - 01 Feb 2017

Where:

Physikzentrum Bad Honnef

Scientific organizers:

Dr. A. Fuhrer, IBM Rüschlikon/CH • Prof. D. DiVincenzo, RWTH Aachen • Prof. F. Wilhelm-Mauch, U Saarbrücken • Dr. S. Filipp, IBM Rüschlikon/CH

“Nature isn’t classical, dammit, and if you want to make a simulation of nature, you’d better make it quantum mechanical, and by golly it’s a wonderful problem, because it doesn’t look so easy.”

—Richard Feynman

 

The 635th WE-Heraeus Seminar on "Scalable Architectures for Quantum Simulation" brings together scientists from academia and industry who are investigating scalable quantum systems that may or may not require the overhead of fully fault-tolerant operation. Theory and experiment from the superconducting-qubit, spin-qubit, atomic-quantum-gas, and ion-trap communities are represented, as are researchers working on optimal control and on novel methods for quantum simulation. The seminar focuses on quantum computing architectures and methods that promise to reach a level of complexity at which selected problems can be solved more efficiently than on a conventional computer.

Richard Feynman's statement above, from 1981, is widely considered to have given birth to the field of quantum simulation. Since then, various highly active fields of research have emerged with the common aim of using well-controlled quantum systems such as trapped ions, atomic quantum gases, spins in semiconducting nanostructures, and superconducting circuits to solve a known Hamiltonian problem that is not efficiently tractable with classical methods. It is, however, becoming evident that the necessary error-correction capability is challenging to implement and that thousands of well-controlled qubits would be needed to build an operational prototype.

Much progress has been made in materials, nanofabrication as well as in instrumentation and measuring technology. Over the years, this has led to continuous improvements in the accuracy of quantum systems. Today, platforms containing on the order of 10 qubits are state of the art and systems of a few tens to hundreds of qubits seem entirely feasible. Considering that even storing a 60-qubit quantum state would require exabytes (10^18 bytes) of memory, far exceeding the capabilities of a conventional computer, it is becoming increasingly intriguing to imagine what physical problems could be investigated with a quantum simulator featuring a few dozen qubits even with a finite error rate.
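The memory estimate above follows from simple arithmetic: an n-qubit pure state has 2^n complex amplitudes. A minimal sketch (assuming double-precision complex amplitudes at 16 bytes each; the function name is illustrative):

```python
def statevector_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    """Memory needed to store a full n-qubit state vector.

    A pure state of n qubits has 2**n complex amplitudes; at double
    precision each amplitude occupies 16 bytes (two 8-byte floats).
    """
    return (2 ** n_qubits) * bytes_per_amplitude

# 60 qubits: 2**60 amplitudes * 16 bytes, i.e. roughly 18 exabytes.
print(f"{statevector_bytes(60) / 1e18:.1f} exabytes")  # prints "18.4 exabytes"
```

The doubling of memory with every added qubit is what puts even modest system sizes beyond classical reach.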

Of particular interest are problems involving interacting fermions and frustrated spins for quantum chemistry and condensed matter physics, e.g., exploring the Fermi–Hubbard model and investigating high-temperature superconductivity. Another application is the solution of classically hard (NP-hard) optimization problems by encoding them in the ground state of an Ising spin lattice and finding that ground state via quantum annealing protocols.
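To make the Ising formulation concrete, here is a minimal classical sketch (the function names and the toy 3-spin frustrated triangle are illustrative, not from the seminar) that brute-forces the ground state of a small Ising instance; the exponential 2^n search cost of this approach is precisely what quantum annealing protocols aim to circumvent:

```python
from itertools import product

def ising_energy(spins, J, h):
    """Energy of one configuration: E = -sum_ij J_ij s_i s_j - sum_i h_i s_i."""
    e = -sum(h[i] * s for i, s in enumerate(spins))
    e -= sum(Jij * spins[i] * spins[j] for (i, j), Jij in J.items())
    return e

def ground_state(n, J, h):
    """Exhaustively search all 2**n spin configurations (tractable only for small n)."""
    return min(product([-1, 1], repeat=n), key=lambda s: ising_energy(s, J, h))

# Toy instance: 3 antiferromagnetically coupled spins on a triangle, no field.
# The triangle is frustrated: no configuration satisfies all three bonds.
J = {(0, 1): -1.0, (1, 2): -1.0, (0, 2): -1.0}
h = [0.0, 0.0, 0.0]
gs = ground_state(3, J, h)
print(gs, ising_energy(gs, J, h))
```

Even this tiny frustrated triangle illustrates the structure of the problem: the ground state is degenerate, and the best the system can do is satisfy two of the three bonds (energy -1 rather than -3).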

The central challenge for all quantum computers is twofold: (1) to control the properties and couplings of as many qubits as possible, both precisely and quickly, and (2) to decouple those qubits from any external perturbation. These requirements conflict, and none of the quantum systems introduced so far has emerged as the clear frontrunner in the resulting trade-off between scalability, controllability, manipulation speed, and qubit error rate. It is the aim of this workshop to trigger discussions about successful methods, current challenges, and future perspectives of the various systems with regard to scalability and applications.