# The Chaim Leib Pekeris Memorial Lecture

Symmetry is defined in terms of structure-preserving transformations (automorphisms); regularity, in terms of numerical invariants. Symmetry always implies regularity, but there are many highly regular combinatorial objects (such as "strongly regular graphs") with no symmetry. The opposite of irregularity is regularity, not symmetry. Yet we show that in a well-defined sense, *the opposite of hidden irregularity is hidden symmetry*, and in fact hidden symmetry of a particularly robust kind.

The symmetry of a circle is easily destroyed: just "individualize" two non-opposite points -- color one of them red, the other blue -- and all the symmetry is gone. In fact, the resulting structure is completely irregular: every point is uniquely identified by a pair of numerical invariants, namely, its pair of distances to the two individualized points. We shall say that the circle has a high degree of hidden irregularity.
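The circle example can be checked by hand on its discrete analogue, the cycle graph. The sketch below (an illustration under that assumption, not part of the lecture itself) individualizes two non-antipodal vertices of a 12-cycle and verifies that the resulting distance pairs identify every vertex uniquely, whereas an antipodal pair leaves a reflection symmetry intact:

```python
# Discrete analogue of the circle: on the cycle C_n, individualize two
# vertices a and b; each vertex v then carries the invariant pair
# (d(v, a), d(v, b)) of shortest-path distances.

def cycle_distance(u, v, n):
    """Shortest-path distance between vertices u and v on the n-cycle."""
    d = abs(u - v) % n
    return min(d, n - d)

def distance_pairs(n, a, b):
    """Distance pair of every vertex to the two individualized vertices."""
    return [(cycle_distance(v, a, n), cycle_distance(v, b, n))
            for v in range(n)]

n = 12
pairs = distance_pairs(n, 0, 5)      # non-antipodal: 5 != n // 2
assert len(set(pairs)) == n          # every vertex uniquely identified

# With antipodal individualized vertices, the reflection through them
# survives, and the distance pairs are no longer all distinct:
pairs_anti = distance_pairs(n, 0, 6)
assert len(set(pairs_anti)) < n
```

The first assertion is exactly the "complete irregularity" of the text: the two individualized points destroy all symmetry, and the numerical invariants separate every point.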

In contrast, *Johnson graphs* are objects with robust symmetry: individualizing a small number of vertices of a Johnson graph hardly makes a dent in its symmetry.
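For concreteness, the Johnson graph J(m, k) has the k-element subsets of an m-element set as vertices, two subsets being adjacent when they share k − 1 elements; every permutation of the ground set induces an automorphism, so J(m, k) carries at least m! symmetries. A minimal construction sketch (the definition is standard; the code is only an illustration):

```python
from itertools import combinations

def johnson_graph(m, k):
    """Johnson graph J(m, k): vertices are the k-subsets of an m-set,
    adjacent when their intersection has size k - 1.
    Edges are stored as ordered pairs (both directions)."""
    verts = [frozenset(c) for c in combinations(range(m), k)]
    edges = {(u, v) for u in verts for v in verts
             if u != v and len(u & v) == k - 1}
    return verts, edges

verts, edges = johnson_graph(5, 2)
assert len(verts) == 10                       # C(5, 2) = 10 vertices

# J(m, k) is regular of degree k * (m - k) = 2 * 3 = 6:
deg = {v: sum(1 for (u, w) in edges if u == v) for v in verts}
assert all(d == 6 for d in deg.values())

# Any permutation of the 5-element ground set induces an automorphism,
# so J(5, 2) has at least 5! = 120 symmetries:
perm = {0: 1, 1: 2, 2: 0, 3: 4, 4: 3}
relabel = lambda s: frozenset(perm[x] for x in s)
assert {(relabel(u), relabel(v)) for (u, v) in edges} == edges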

Recent work on the algorithmic problem of Graph Isomorphism has revealed that Johnson graphs are unique in this regard: Every finite relational structure of small arity either has a measurable (say 10%) hidden irregularity (revealed by individualizing a polylogarithmic number of elements) or has a large degree of hidden symmetry, manifested in a canonically embedded Johnson graph on more than 90% of the underlying set.

This dichotomy is the key Divide-and-Conquer tool in recent progress on the worst-case complexity of the Graph Isomorphism problem.

This subject is purely combinatorial and does not require advanced mathematical apparatus. The group-theoretic aspects of the new Graph Isomorphism test will be discussed in a follow-up seminar on January 30.

---

Quantum computers are hypothetical devices, based on quantum physics, which would enable us to perform certain computations (among them some that Chaim Leib Pekeris pioneered) hundreds of orders of magnitude faster than digital computers. This feature has been termed “quantum supremacy.” We start the lecture with a gentle introduction to computing, classical and quantum, with basic notions of computational complexity, and with the vision of quantum computers.

A main reason for concern regarding the feasibility of quantum computers is that quantum systems are inherently noisy. We will explain what "noise" is and describe an optimistic hypothesis regarding quantum noise that would allow quantum computing and a pessimistic hypothesis that would not. The remarkable progress witnessed during the past two decades in the experimental physics of controlled quantum systems places the decision between the pessimistic and optimistic hypotheses within reach. On the optimistic side, one aspect or another of quantum supremacy might be demonstrated experimentally in the near future: by implementing quantum error correction, by systems of free bosons, by exotic new phases of matter called anyons, by quantum annealing, or in various other ways.

In the lecture I will explain my pessimistic line of research; here is a brief summary of my view: understanding quantum computers in the presence of noise requires considering their behavior at different scales. At the small scale, standard models of noise from the mid-1990s are suitable, and the quantum evolutions and states they describe manifest very low-level computational power. This small-scale behavior has far-reaching consequences for noisy quantum systems at larger scales. On the one hand, it does not allow reaching the starting points for quantum fault tolerance and quantum supremacy, making both impossible at all scales. On the other hand, it leads to novel implicit ways of modeling noise at larger scales and to various predictions about the behavior of noisy quantum systems.

We will rely on the theory of noise sensitivity and stability developed with Benjamini and Schramm in the late 1990s and on recent work with Guy Kindler related to the mysterious gap between permanents and determinants (or, in other words, between bosons and fermions).
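The gap mentioned here is easy to state, even if its resolution is not: the determinant and the permanent are the same sum over permutations except for a sign, yet the determinant can be computed in polynomial time by Gaussian elimination, while computing the permanent is #P-hard (Valiant). A brute-force sketch for tiny matrices, illustrating only the definitions (not the methods of the cited work):

```python
from itertools import permutations

def det_and_perm(A):
    """Both quantities are sums over permutations of products of entries;
    the determinant weights each term by the permutation's sign, the
    permanent by +1.  This brute force is O(n! * n^2) -- fine for tiny n,
    and for the permanent nothing essentially faster than exponential
    time is known."""
    n = len(A)
    det = perm = 0
    for p in permutations(range(n)):
        prod = 1
        for i in range(n):
            prod *= A[i][p[i]]
        # sign of p via its inversion count
        inv = sum(1 for i in range(n) for j in range(i + 1, n)
                  if p[i] > p[j])
        det += (-1) ** inv * prod
        perm += prod
    return det, perm

A = [[1, 2], [3, 4]]
d, p = det_and_perm(A)
assert (d, p) == (-2, 10)   # det = 1*4 - 2*3, perm = 1*4 + 2*3
```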

---

The Ising model is an archetypal model of the order-disorder phase transition: though simple to formulate, it exhibits complex behavior, much like real-world phenomena in solid-state physics, ferromagnetism, chemistry, biology, and computer science.

In 1920 the physicist Wilhelm Lenz proposed to model ferromagnetic materials by a lattice of plus-minus spins with interacting neighbors. His student Ernst Ising solved the model in dimension one four years later. The one-dimensional behavior turned out to be trivial, with no phase transition as the interaction strength changes, and for a decade people searched for other possible models. However, Rudolf Peierls established a ferromagnetic phase transition in higher dimensions in 1936, and in 1944 Lars Onsager famously calculated the free energy of the two-dimensional Ising model.
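The Lenz–Ising setup described above really is simple to formulate: spins take values ±1 on a lattice, and a configuration's energy is minus the coupling times the sum of products over neighboring pairs. A minimal sketch, assuming a square lattice with periodic boundaries, unit coupling, and the standard Metropolis dynamics (none of which are specified in the abstract):

```python
import math
import random

def ising_energy(spins, J=1.0):
    """Energy of a spin configuration on an L x L periodic square lattice:
    E = -J * sum over nearest-neighbor pairs of s_i * s_j."""
    L = len(spins)
    E = 0.0
    for i in range(L):
        for j in range(L):
            s = spins[i][j]
            # count each pair once: right and down neighbors only
            E -= J * s * (spins[i][(j + 1) % L] + spins[(i + 1) % L][j])
    return E

def metropolis_step(spins, T, J=1.0):
    """Flip one random spin with the Metropolis acceptance rule."""
    L = len(spins)
    i, j = random.randrange(L), random.randrange(L)
    nbrs = (spins[i][(j + 1) % L] + spins[i][(j - 1) % L]
            + spins[(i + 1) % L][j] + spins[(i - 1) % L][j])
    dE = 2 * J * spins[i][j] * nbrs    # energy change of the flip
    if dE <= 0 or random.random() < math.exp(-dE / T):
        spins[i][j] *= -1

# Fully aligned 3x3 lattice: 2 * 9 = 18 neighbor pairs, so E = -18.
up = [[1] * 3 for _ in range(3)]
assert ising_energy(up) == -18.0
```

Iterating `metropolis_step` at low temperature T drives the lattice toward aligned (ferromagnetic) configurations; at high T it stays disordered, which is the phase transition discussed below.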

Since then the Ising model has become widely studied, as it exhibits complicated phase-transition behavior yet allows a very detailed analysis in dimension two. We will give a historical introduction to the model and then describe some recent results.