
The science of secrecy


Date: May 3, 2022

Passwords, puzzles, ciphers, and keys—such have been the weapons in the millennia-long battle between code makers and code breakers. The last century—the first century of the electronic computer, computer networks, and wireless communication—has seen a meteoric rise in the sophistication of both secrecy and decipherment, and that upsurge is only accelerating as we enter the quantum age. The intricacies of modern-day communication and information sharing touch so many aspects of our day-to-day lives, and so it follows that cryptography has captured the imagination of Weizmann scientists—with implications for privacy, fairness, and security in healthcare, legislation, finance, banking, and beyond.

The beauty of entanglement
“It’s an exciting time to be in the field of cryptography,” says Dr. Rotem Arnon-Friedman, a recent recruit to the Department of Physics of Complex Systems. Dr. Arnon-Friedman tackles one of the basic security questions in quantum cryptography: how to create a shared secret key—an encryption/decryption scheme—that is impossible to crack, even if the hacker is able to manipulate the communications and has access to a quantum computer.
This secret key is based on the states of sets of ‘entangled’ particles (such as atoms or photons) separated by great distances. Entanglement is a central feature of quantum mechanics: roughly speaking, two systems are said to be entangled if the behavior of the whole physical system cannot be explained by considering each of the two subsystems separately.
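The correlations that make such a key possible can be caricatured in a few lines of Python. This is a deliberately classical toy, nothing like a real quantum protocol: it only illustrates the end result that two distant parties obtain matching random bits without ever sending the key itself over a channel.

```python
import random

# Hypothetical toy: measuring both halves of a maximally entangled pair in
# the same basis yields outcomes that are individually random but always
# identical. The two parties accumulate these into a shared secret key.
def measure_entangled_pair(rng):
    outcome = rng.choice([0, 1])  # each measurement result is random...
    return outcome, outcome       # ...but the two halves always agree

rng = random.Random(7)
alice_key, bob_key = [], []
for _ in range(16):
    a, b = measure_entangled_pair(rng)
    alice_key.append(a)
    bob_key.append(b)
# The two keys match, yet were never transmitted between the parties.
```

In the real quantum setting, of course, the correlations cannot be pre-programmed this way; that impossibility is precisely what makes the key secure against an eavesdropper.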
Dr. Arnon-Friedman is also working on how to create secret keys whose security does not rely on trusting the inner workings of any particular device—an approach known as device-independent quantum key distribution.
What makes this research so exciting to her is not only the potential power of such systems to ensure privacy and security, but the fact that the work involves a synergistic partnership between theory and experiment. That is, theoreticians like Dr. Arnon-Friedman are refining quantum science theories of how to establish long-distance entanglement, which experimentalists—such as her current collaborative partners at the University of Chicago—then engineer into complex quantum systems used in proof-of-concept studies. The knowledge gained from one group of these ‘entangled’ scientists directly guides the work of the other group, and vice versa.

Post-quantum cryptography
While Dr. Arnon-Friedman focuses on entangled pairs of particles separated by long distances to block quantum-computing hackers, Prof. Zvika Brakerski, from the Department of Computer Science and Applied Mathematics, aims to design algorithms and hash-based systems that run on ‘ordinary’ computers, and yet are secure against a cryptanalytic attack by a quantum computer. This work is referred to as post-quantum cryptography.
In other words, although the RSA cryptosystem is near-impossible to break using today’s computers—doing so requires factoring enormous numbers, a problem of extreme computational difficulty—a sufficiently powerful quantum computer running Shor’s algorithm could perform that factoring quickly, thereby compromising security. Prof. Brakerski is thus working to stay ahead of that curve.
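Why factoring is the weak point can be seen in a toy RSA example. The primes here are tiny and purely illustrative; real keys use primes hundreds of digits long, which is exactly what makes the factoring step infeasible for classical computers.

```python
# Toy RSA with tiny, illustrative primes.
p, q = 61, 53
n = p * q                 # public modulus
phi = (p - 1) * (q - 1)   # computable only if you know the factors of n
e = 17                    # public exponent
d = pow(e, -1, phi)       # private exponent (modular inverse, Python 3.8+)

msg = 42
cipher = pow(msg, e, n)   # anyone can encrypt with (n, e)
plain = pow(cipher, d, n) # only the holder of d can decrypt

# An attacker who factors n can recompute phi and d the same way;
# Shor's algorithm on a large quantum computer would make that fast.
```

The entire secret, in other words, is the factorization of `n`—which is why post-quantum schemes are built on problems that quantum computers are not known to solve efficiently.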

Teaching computers how to diagnose
But there is more to cryptography than making commercial ventures secure or being able to send secret messages among spies—corporate or otherwise. Prof. Guy Rothblum, also in the Department of Computer Science and Applied Mathematics, argues that the ‘lens of cryptography’ is also relevant to building systems and tools that can help us whenever there’s an issue of trust between two parties, whenever there’s a need for authentication and privacy protection, and even when we are attempting to use ‘big data’ analysis to triage the most appropriate medical treatment for different patients.  
The latter case is especially relevant to issues of fairness in medicine. By combining big data science with genomics, scientists and physicians—such as those participating in the Weizmann Institute’s current collaboration with Clalit Health Systems, Israel’s largest health maintenance organization—aim to personalize medicine as precisely as possible. Achieving this ambitious goal requires access to repositories of electronic health records, as well as newly acquired data from patients’ blood and tissue samples. The idea is to use mathematical and statistical models of patients’ combined data to improve diagnosis and personalized treatment protocols.
We might assume cryptography’s role here relates solely to privacy protection, but that’s only part of the story. As Prof. Rothblum points out, the computer models that crunch the data do so using sophisticated algorithms to ‘teach’ computer programs how to diagnose.
But who wrote those algorithms? Where do the data used in this machine learning process come from? Do the data accurately reflect minority groups? A powerful algorithm may be able to accurately predict which medicines work best for middle-aged men, but be highly inaccurate when trying to do so for women of the same age, or men of different ethnicities. Fairness requires that algorithms are written and programs are trained to account for the true variety of patients a physician may encounter. This is the space in which cryptographers like Prof. Rothblum work. 
In effect, although they won’t make the final treatment decision, such cryptographers will illuminate the risks and opportunities presented by algorithmic decision-making systems. 

Trade-offs and data fidelity
The role and purpose of cryptography has implications beyond healthcare. We live in a world increasingly governed by computer science—from algorithms that select candidates for job interviews, to online mortgage eligibility calculators. This increasing automation isn’t a matter of ‘good’ or ‘bad’; it is a case of greater efficiency that comes from algorithms written by humans based on data collected from a sampling of other humans. And therefore, the algorithms may be faulty, or the data may be incomplete. While we can improve the accuracy of both, it comes at a cost: privacy.
But must it be either/or—accuracy or privacy? Not necessarily, says Prof. Rothblum: it’s a case of stepwise trade-offs, not an all-or-nothing dilemma. The major challenge facing cryptographers today and in the years ahead is to improve cryptographic tools so that it will be possible to maintain control of one’s personal information without unbearable risk—even in cases where genomic and behavioral data are combined—by identifying and calculating the trade-offs along the way.
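One way to make such a trade-off concrete is to add calibrated random noise to statistical queries—the idea behind differential privacy, a framework central to this research area (the term is my addition; the article does not name it). A sketch:

```python
import math
import random

def laplace_noise(scale, rng):
    # Standard inverse-CDF sampler for the Laplace distribution.
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def noisy_count(records, predicate, epsilon, rng):
    # Smaller epsilon -> larger noise -> stronger privacy, lower accuracy.
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon, rng)

records = list(range(100))   # stand-in for 100 sensitive records
query = lambda r: r < 50     # "how many records satisfy this property?"
private_answer = noisy_count(records, query, epsilon=1.0, rng=random.Random(0))
noisier_answer = noisy_count(records, query, epsilon=0.1, rng=random.Random(0))
```

The analyst still gets a usable answer, but no individual record can be pinned down from it—and the `epsilon` knob is precisely the kind of quantifiable trade-off the text describes.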

Bridging basic and applied mathematics
Another way to improve the accuracy of algorithms is to protect and maintain the fidelity of data over time and space.
Both Prof. Rothblum and Prof. Irit Dinur—another member of the Department of Computer Science and Applied Mathematics—have investigated ways to strengthen the fidelity of data transfer. Though it may come as a surprise, the solution to such a ‘real world’ problem as ‘computer glitches’ may actually lie in basic science and mathematics.
Prof. Dinur’s approach takes root in one of the best-known challenges in mathematics and computer science: the P (polynomial time) versus NP (nondeterministic polynomial time) problem. The P/NP problem is so daunting that the Clay Mathematics Institute has offered a $1 million prize for the first correct solution.
The challenge involves problems with a unique characteristic: although it is easy to test whether a given solution is correct, it is difficult to actually calculate a solution. That is, if we were given a problem together with an appropriate solution, it would be relatively easy to verify that the solution is correct (think Sudoku—it is easy to see whether a filled-in board is correct or not). But if we were only given the problem (a nearly empty Sudoku board) and had to find the solution ourselves, it would likely take much longer—so long, for some problems, that there may be no practical way to come up with the correct solution.
No practical solution—that’s how complex it can be to ensure the fidelity of large data transfers involving millions of bits. But is the problem truly unsolvable?
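The asymmetry between checking and solving can be made concrete with Sudoku itself. Verifying a completed board takes a fixed, small amount of work; solving a nearly empty one is where the combinatorial explosion lives.

```python
# Verifying a completed 9x9 Sudoku: each row, column, and 3x3 box must
# contain the digits 1 through 9 exactly once. This check is fast no matter
# how hard the puzzle was to solve.
def is_valid_sudoku(board):
    digits = set(range(1, 10))
    rows_ok = all(set(row) == digits for row in board)
    cols_ok = all(set(col) == digits for col in zip(*board))
    boxes_ok = all(
        {board[r + i][c + j] for i in range(3) for j in range(3)} == digits
        for r in (0, 3, 6) for c in (0, 3, 6)
    )
    return rows_ok and cols_ok and boxes_ok

# A known-valid board built from a standard shifting pattern.
solved = [[(r * 3 + r // 3 + c) % 9 + 1 for c in range(9)] for r in range(9)]
```

The verifier runs in a handful of set comparisons; a solver, by contrast, may have to search an astronomical number of candidate boards.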
Prof. Dinur’s solution—crafted in collaboration with Prof. Alexander Lubotzky, newly recruited to the Department of Mathematics, as well as mathematicians at the Hebrew University of Jerusalem—is based on a multi-dimensional construction that examines only a small number of bits, yet can detect errors among many other bits, including those that are “far removed” within the file (spatially separated among the reams of data). For example, reading 100 bits is enough to test a file hundreds of gigabytes in size (hundreds of billions of bits). All this is done quickly and efficiently, with little redundancy.
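The flavor of such local testing can be sketched in a toy form—the actual construction is vastly more sophisticated, but the principle is the same: attach small parity constraints to the data when it is stored, then audit the file later by reading only the handful of bits each sampled constraint touches.

```python
import random

# Toy local-testing sketch (illustrative only, not the real construction):
# each check records the parity of a few randomly chosen bit positions.
def make_checks(bits, num_checks=200, arity=3, seed=0):
    rng = random.Random(seed)
    positions = [rng.sample(range(len(bits)), arity) for _ in range(num_checks)]
    parities = [sum(bits[i] for i in pos) % 2 for pos in positions]
    return positions, parities

def spot_check(bits, positions, parities, samples=20, seed=1):
    # Audit by re-checking only a few randomly sampled constraints.
    rng = random.Random(seed)
    for k in rng.sample(range(len(positions)), samples):
        if sum(bits[i] for i in positions[k]) % 2 != parities[k]:
            return False  # corruption detected after reading only a few bits
    return True
```

Checking 20 constraints of arity 3 reads only 60 bits, regardless of how large the file is—the essence of testing locally.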
Further research on such ‘local testing’ methods may lead to the development of more sophisticated and effective tools for noise resilience in data storage and telecommunication. 
“Our starting point was completely theoretical,” says Prof. Dinur. “The end result is exciting because it connects abstract mathematics to topics that are much more practical.”