IBM Research has a long and illustrious history of creating and implementing cryptography. As we prepare to enter an exciting era of quantum computing, we are again taking on the foundational and engineering challenges of creating a next generation of cryptography that remains safe in this era. We have been working with the National Institute of Standards and Technology (NIST) over the last five years to design new generations of public-key cryptography. During this time, we have learned what it takes to migrate some of our key platforms to become quantum-safe, and we have been working with key clients to help them on the same journey.

We are very proud that several members of our team have been involved in all four algorithms selected for standardization by NIST on July 5, 2022 (see below).



IBM scientists help develop NIST’s quantum-safe standards

Learn more

Foundational Cryptography

Preparing for the next era of computing with Quantum-Safe Cryptography

Learn more

Quantum-Safe Crypto Migration

Is your cybersecurity ready to take the quantum leap?

Learn more

Quantum-Safe Systems

The risk

The Quantum Risk

Cryptography underpins much of the security that we use today to protect data and systems. Large scale quantum computers threaten much of this cryptography. The starting point for any organization is to understand the specific risks for its particular business and operations.

Learn more

Quantum-Safe Cloud

The foundation

Quantum-Safe Cryptography

Creating cryptography that is safe in the quantum era requires an understanding of quantum capabilities coupled with deep foundational know-how in mathematics and cryptographic algorithm design. These new algorithms are designed to rely on the computational difficulty of problems from the mathematical areas of lattices, isogenies, hash functions, and multivariate equations.

Learn more


The practice

Quantum-Safe Migration

Migrating a business to become secure in the quantum era can be complex and costly. It does not have to be when the migration is combined with legacy system migration, with the introduction of zero-trust capabilities, or with the move to secure software supply chains.

Learn more

Our Research Activities

We help to make our IBM Cloud and IBM Systems quantum-safe and continue foundational cryptography research beyond the current quantum-safe algorithms.

The Quantum Risk

Why We Are Working on Quantum-Safe Cryptography

Quantum science and technology have led to breakthroughs that threaten some of the core cryptographic principles that are used to secure systems and to protect data communications. Quantum machines large enough to pose a real threat are many years, perhaps even decades, away from being built. However, there is a tangible threat to the data and systems that we are building today and that will still have value when we enter the era of large quantum machines.

To understand the threat dimensions, it is helpful to look at the impact that future misuse of a large quantum machine could have:

  • Confidential data that has been harvested, stolen or lost over the years could be decrypted
  • Assets on long term blockchains - such as Satoshi's original bitcoins - could be fraudulently transferred
  • Digital signatures that are used to legally validate transactions could be called into question
  • Legacy systems could be targeted with fraudulent software updates
  • Digital evidence could be manipulated
These threats apply to all data, systems and technologies that are not made quantum-safe.



In a nutshell:

  1. The impact is in the future but the problem is NOW
  2. We need new cryptography
  3. We need to transition to new cryptography

"We estimate that over 20 billion digital devices will need to be either upgraded or replaced in the next 10-20 years to use the new forms of quantum-resistant encrypted communication. This will require a massive effort, similar to the Y2K effort that occurred in the computing industry 20 years ago. We do recommend that organizations start planning for this now."

—World Economic Forum

Quantum risk

What is at stake?

The question of when cryptography will be broken by quantum computing is an unfortunate one, because it implicitly frames the threat as lying sometime in the future. The threat is today; the impact is in the future. Data that is considered securely protected today is already lost to a future quantum adversary if it is stolen or harvested. All data, past, present, and future, that is not protected using quantum-safe security will be at risk, and the longer the migration to quantum-safe standards is postponed, the more data there is that will be at risk.

The probability of a quantum threat within a certain time frame will depend on several factors that include:

  1. The rate at which quantum computers scale,
  2. Improvements in quantum algorithms or the discovery of new algorithms,
  3. Access to data and security artefacts required for the attack – for example public key certificates,
  4. The difficulty of adding mitigating approaches to threatened systems.

Enterprises need to monitor industry progress in quantum computing and the speed at which industry standards are being made quantum-safe. There is also an opportunity to make new application development and legacy system migration projects aware of the need for quantum safety and of the concept of cryptographic agility.

Quantum-safe risk management needs to consider the “security time value” of systems and data. This concept looks at the value of a vulnerability at a specific time in the future. Systems where the security impact of a breach remains high for years to come will need mitigating actions far earlier than the expected arrival date of large-scale quantum computers.
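This “security time value” reasoning is often summarized by Mosca's inequality: if data must stay secure for x years and migration takes y years, systems are already exposed whenever x + y exceeds the estimated number of years z until a cryptographically relevant quantum computer exists. A minimal sketch of that check (the year figures below are illustrative assumptions, not estimates from this page):

```python
def quantum_exposed(shelf_life_years: float,
                    migration_years: float,
                    years_to_quantum: float) -> bool:
    """Mosca's inequality: data is already at risk if the time it must
    stay secure plus the time needed to migrate exceeds the time until
    a cryptographically relevant quantum computer arrives."""
    return shelf_life_years + migration_years > years_to_quantum

# Illustrative figures only: records that must stay confidential for
# 15 years, a 5-year migration, a quantum computer assumed in 12 years.
print(quantum_exposed(15, 5, 12))   # True: data harvested today is at risk
print(quantum_exposed(2, 1, 12))    # False: short-lived data is not
```

The point of the inequality is that the migration decision depends on data lifetime, not only on when quantum hardware arrives.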

The Consequences

The systems that we are building today are vulnerable

Quantum risk

Attackers can attack systems through fraudulent authentication

Learn more

The data that we are protecting today is vulnerable

Quantum risk

Attackers can harvest and later decrypt confidential information

Learn more

The legal underpinnings of digitalization are vulnerable

Quantum risk

Attackers can manipulate legal history through forged digital signatures

Learn more

The first threat that we identify concerns the authentication methods of the many systems that are based on current PKI technology. Systems often live for decades, meaning that they will be vulnerable if they survive into the quantum era. Technologies such as bitcoin are already a decade old, and the cryptography they use is difficult to migrate because it was not designed with migration in mind.

The second example is sensitive email exchanges across public infrastructure protected by classical encryption (Threat 2). Encrypted exchanges harvested today are considered safe because they cannot currently be decrypted. However, an attacker who keeps these exchanges may be able to decrypt them with the aid of a quantum computer in the future.

A final threat dimension is the fact that society and businesses now attach legal value to digital signatures. Threat 3 describes how a future quantum adversary will be able to create valid-looking but fraudulent digital signatures. This has an as-yet-unexplored set of implications for all businesses relying on digital signatures in their business processes.

Quantum-Safe Cryptography

Quantum-safe (sometimes also called “post-quantum”) cryptography is the design and implementation of protocols that are believed to be secure against the added computational capabilities of quantum computers. The two quantum algorithms that cause problems for current cryptography are Grover’s algorithm and Shor’s algorithm. Grover’s algorithm allows one to brute-force search a list in time proportional to the square root of the list’s size. This algorithm mostly affects the security of symmetric-key primitives (e.g. AES, SHA-256, etc.), and protecting against it generally requires one to simply double the key size. Shor’s algorithm, on the other hand, is more troublesome for the security of certain public-key primitives (e.g. RSA, EC-DSA, etc.). To withstand Shor’s algorithm, the key sizes of these schemes would need to increase exponentially, which would render them practically useless.
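The effect of Grover's algorithm on symmetric keys can be stated numerically: an n-bit key offers roughly n/2 bits of security against a quantum brute-force search, which is why doubling the key size restores the original security level. A small illustration of this rule of thumb:

```python
def quantum_security_bits(key_bits: int) -> int:
    """Grover's quadratic speedup: brute-forcing an n-bit key takes
    about 2**(n/2) quantum operations, so the effective security
    level against a quantum adversary is roughly halved."""
    return key_bits // 2

# AES-128 drops to ~64-bit security against a quantum adversary;
# moving to AES-256 restores ~128-bit security.
print(quantum_security_bits(128))  # 64
print(quantum_security_bits(256))  # 128
```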

The field of quantum-safe cryptography deals with building public-key cryptography, which can be implemented on standard devices, that can resist quantum attacks. While quantum computers have the potential to solve certain mathematical problems with devastating efficiency, there are many other problems that have been studied for several decades for which quantum algorithms are not believed to be at all helpful. Some of these problems come from the mathematical areas of lattices, codes, isogenies, and multivariate equations. Migrating to quantum-safe cryptography requires us to first design efficient foundational primitives based on the hardness of such problems, and then combine them into various protocols.

A current central research objective of our scientists is the design, implementation, and standardization of new quantum-safe cryptographic algorithms that can replace the classical non-quantum-safe ones. These include encryption and signature schemes that are currently undergoing standardization by NIST, as well as more advanced schemes from the area of privacy-preserving cryptography.


NIST Post-Quantum Standardization

From 2017 to 2022, NIST went through three rounds of a selection process to produce standards for quantum-safe encryption and digital signature schemes. There were 69 initial submissions, which were judged on the basis of security and performance. The third and final round was completed at the end of March 2022, and NIST announced its selection of new algorithms to recommend for standardization in July 2022.

IBM Research scientists have been involved in creating many quantum-safe algorithm designs. Below are the algorithms with IBM Research leadership and contributions among the NIST Post-Quantum Cryptography (PQC) finalists and the selected standards.

IBM Research Scientists' Involvement in NIST PQC Public-Key Encryption/KEMs Finalists

🥇 NIST selected primary standard


Kyber is a public key encryption / key establishment mechanism based on the hardness of finding short vectors in Euclidean lattices. More specifically, Kyber is based on the module learning with errors problem. It offers high security, balanced key and ciphertext sizes, and leading performance on a diverse range of platforms.

IBM Research scientists Vadim Lyubashevsky and Gregor Seiler contributed to the design and implementation of Kyber.

Learn more

IBM Research Scientists' Involvement in NIST PQC Digital Signature Finalists

🥇 NIST selected primary standard


Similar to Kyber, Dilithium is a lattice-based signature scheme based on the module learning with errors and module short integer solution problems. Its construction follows the Fiat-Shamir with aborts paradigm that was invented by IBM Researcher Vadim Lyubashevsky. Unlike many other signature schemes, Dilithium lends itself to high-confidence secure implementations and still offers very fast performance in optimized implementations. The combined key and signature size of Dilithium is the second smallest in the competition.

The Dilithium team is led by Vadim Lyubashevsky, and Gregor Seiler also contributed to the design and led the implementation of the scheme.

Learn more

🥈 NIST selected standard


Falcon is also a lattice-based signature scheme. Compared to Dilithium, Falcon uses a different design paradigm and offers shorter key and signature sizes at the cost of higher implementation complexity and slightly worse performance, especially on constrained devices. The combined key and signature size of Falcon is the smallest in the competition.

IBM Researchers Vadim Lyubashevsky and Gregor Seiler contributed to the design of Falcon.

Learn more

🥈 NIST selected standard


Relying only on the security of standard hash functions, SPHINCS+ is the most conservative signature scheme. This strong security guarantee comes at the cost of a somewhat large signature size or a somewhat large signing time, depending on which variant of SPHINCS+ is used. SPHINCS+ is closely related to the stateful eXtended Merkle Signature Scheme (XMSS), which is standardized by the IETF and recommended by NIST. Unlike XMSS, SPHINCS+ is not stateful, which makes SPHINCS+ suitable for general use.
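The idea of building signatures from nothing but hash functions is easiest to see in the classic Lamport one-time signature scheme, a simplified ancestor of the many-time constructions that XMSS and SPHINCS+ build on (this sketch is for intuition only and is not the SPHINCS+ construction itself):

```python
import hashlib
import secrets

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def keygen(bits: int = 256):
    """Secret key: two random values per message-digest bit.
    Public key: their hashes."""
    sk = [[secrets.token_bytes(32), secrets.token_bytes(32)] for _ in range(bits)]
    pk = [[H(pair[0]), H(pair[1])] for pair in sk]
    return sk, pk

def sign(msg: bytes, sk):
    """Reveal, for each bit of H(msg), the matching secret value."""
    digest = H(msg)
    bits = [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(len(sk))]
    return [sk[i][b] for i, b in enumerate(bits)]

def verify(msg: bytes, sig, pk) -> bool:
    """Check each revealed value hashes to the matching public value."""
    digest = H(msg)
    bits = [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(len(pk))]
    return all(H(s) == pk[i][b] for i, (s, b) in enumerate(zip(sig, bits)))

sk, pk = keygen()
sig = sign(b"hello", sk)
assert verify(b"hello", sig, pk)
assert not verify(b"goodbye", sig, pk)
```

A Lamport key must only ever sign one message, since each signature leaks half the secret key; the stateful (XMSS) and stateless (SPHINCS+) schemes exist precisely to lift this one-time restriction.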

Ward Beullens has contributed to the design of SPHINCS+ since the third round of the NIST process.

Learn more


Lattice-Based Cryptography

“Lattice-based cryptography” is an approach for constructing security primitives. It is based on problems from an area of mathematics called “geometry of numbers.”

Suppose that one is given a square, full-rank matrix A and a value b = Ax mod p, where x is a vector with 0/1 coefficients and p is a small (e.g. 13-bit) prime. One is then tasked with finding x. This problem has a unique solution x, which is actually quite easy to find by using Gaussian elimination.

On the other hand, if one is given a slightly “noisy” version of Ax, that is Ax+e mod p, where e is some random vector with 0/1 coefficients, then for matrices of large-enough dimension (say, around 512), this problem becomes surprisingly difficult.

This type of problem is related to both the subset sum and the learning parity with noise problems that have been widely studied since the 1980s and have not succumbed to any algorithmic attacks, either classical or quantum.
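The contrast between the noiseless and noisy cases can be demonstrated directly: in the noiseless case, a few lines of Gaussian elimination over the integers mod p recover the secret exactly. This sketch uses a toy dimension of 8 so it runs instantly; the noisy variant only becomes hard at dimensions around 512.

```python
import random

p = 8191  # a small (13-bit) prime modulus

def solve_mod_p(A, b, p):
    """Recover x from b = A x mod p by Gaussian elimination."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]  # augmented matrix
    for col in range(n):
        # Find a pivot row with a nonzero entry in this column.
        piv = next(r for r in range(col, n) if M[r][col] % p != 0)
        M[col], M[piv] = M[piv], M[col]
        inv = pow(M[col][col], -1, p)          # modular inverse of the pivot
        M[col] = [v * inv % p for v in M[col]]
        for r in range(n):
            if r != col and M[r][col]:
                f = M[r][col]
                M[r] = [(v - f * w) % p for v, w in zip(M[r], M[col])]
    return [M[i][n] for i in range(n)]

n = 8  # toy dimension; the hard noisy variant needs n around 512
while True:
    A = [[random.randrange(p) for _ in range(n)] for _ in range(n)]
    x = [random.randrange(2) for _ in range(n)]  # secret 0/1 vector
    b = [sum(a * xi for a, xi in zip(row, x)) % p for row in A]
    try:
        recovered = solve_mod_p(A, b, p)
        break
    except StopIteration:  # A was singular mod p (very rare); retry
        continue

assert recovered == x  # noiseless case: x is recovered exactly
# With noise, b = (A x + e) mod p for a random 0/1 error vector e,
# and Gaussian elimination no longer recovers x at large dimensions.
```

Adding the error vector e destroys the exact linear relations that elimination exploits, which is precisely the source of hardness that lattice-based schemes such as Kyber and Dilithium rely on.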



European Research Council (ERC)

PLAZA: Post-Quantum Lattice-Based Zero-Knowledge (2021 – 2026)

The goal of the PLAZA project is to extend the efficient lattice-based techniques that were used to create the new quantum-safe NIST standards to create practical zero-knowledge proofs and privacy-based protocols. It is our hope to have all the necessary pieces in place before the decentralized, privacy-based ecosystem receives widespread adoption.

Principal investigator:
Vadim Lyubashevsky

Learn more

European Research Council (ERC)

FELICITY: Foundations of Efficient Lattice Cryptography (2016 – 2021)

Public key cryptography is the backbone of internet security, but most of the current mathematical assumptions on which it relies can be broken by quantum computers. Lattice cryptography is considered the most promising candidate to become the basis of tomorrow’s cryptography. The FELICITY project is pushing the boundaries of what can be efficiently built based on the difficulty of lattice problems.

Principal investigator:
Vadim Lyubashevsky

Learn more

Industry efforts and collaborations

Open Source

Open Quantum Safe

The Open Quantum Safe (OQS) project is an open-source project that aims to support the development and prototyping of quantum-safe cryptography. OQS consists of two main lines of work: liboqs, an open-source C library for quantum-resistant cryptographic algorithms, and prototype integrations into protocols and applications, including the widely used OpenSSL library.

Together with partners from industry and academia, we provide the latest upstream implementations of quantum-safe algorithms to OQS and support the development of new features and releases of the project. We use OQS as a testbed to see how quantum-safe cryptography behaves in practice, across different protocol stacks and applications. This provides us with valuable insights before we put quantum-safe cryptography into production. All of the development in OQS takes place publicly on GitHub.

Learn more


Quantum-Safe Key Serialization

The current standardization of quantum-safe algorithms concentrates on the algorithm specifications, but not on how keys should be specified and managed.

Keys for quantum-safe algorithms are typically larger, and in some cases much larger, than the ECC and RSA keys that we use today. Many existing systems therefore need to compress quantum-safe keys before they can store and move them.
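To give a sense of the size gap, the figures below compare raw classical key material with object sizes from the round-3 parameter sets of the selected NIST schemes (sizes in bytes, from the published specifications; the RSA figure is the bare 2048-bit modulus, not a DER encoding):

```python
# Approximate object sizes in bytes. Classical figures are raw key
# material, not certificate encodings; PQC figures are from the
# round-3 NIST submission documents.
sizes = {
    "ECC P-256 public key (uncompressed)": 65,
    "RSA-2048 modulus": 256,
    "Falcon-512 public key": 897,
    "Kyber-768 ciphertext": 1088,
    "Kyber-768 public key": 1184,
    "Dilithium2 public key": 1312,
    "Dilithium2 signature": 2420,
}
for name, nbytes in sorted(sizes.items(), key=lambda kv: kv[1]):
    print(f"{name:38s} {nbytes:5d} B")
```

Protocols and stores designed around 32- to 256-byte classical keys can thus face kilobyte-scale objects after migration, which is what motivates the serialization and compression work described here.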

The uncoordinated specification of compressed keys for ECC cryptography led to complexity for cryptographic implementors.

Together with other industry collaborators, and with the aim of smoothing the migration path to quantum-safe cryptography, we have proposed a work item at the IETF LAMPS standards group.

Learn more

Open Source

Fully Homomorphic Encryption (FHE)

Confidential computing technologies such as Fully Homomorphic Encryption (FHE) allow processing of encrypted data without having to decrypt it.

The breakthrough in FHE came at IBM in 2009, when Craig Gentry used lattice cryptography to create the world’s first fully homomorphic encryption scheme. The use of lattice cryptography gave FHE quantum-safe security before it was even thought possible that quantum computers of any reasonable size could be built.

IBM Research is an active developer of FHE technologies that enable AI processing on encrypted data. See the release of HElayers, a software development kit (SDK) for the practical and efficient execution of secure AI workloads on fully homomorphically encrypted data.

Learn more

Quantum-Safe Migration

While there is not yet a fully standardized set of algorithms for quantum-safe key exchange or signatures, the threat that quantum computing poses to asymmetric cryptography is well recognized. Beyond awareness of the threat, it is essential to prepare organizations now for a full transition to quantum-safe cryptography (QSC) as soon as the relevant standards are ratified.

This means putting actions in place to establish an inventory of the cryptographic algorithms currently in use, so that those vulnerable to quantum computing can readily be migrated.

This in turn requires a migration plan, for which adequate funding and a management structure are must-haves. The migration plan might consist of a single step directly to the exclusive use of QSC algorithms, or it can be a two-step procedure that uses hybrid legacy/QSC algorithms as an intermediate solution. Enabling a migration without any disruption of business services must be a strong focus: beyond the inventory of cryptographic algorithms and the related migration plan, the establishment, distribution, verification and revocation of QSC certificates needs to be addressed.
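A common pattern for the hybrid intermediate step is to run a classical and a quantum-safe key exchange in parallel and derive the session key from both shared secrets, so the result stays secure as long as either component does. A minimal sketch of such a combiner using HKDF-style extract-and-expand over SHA-256 (the input secrets below are placeholders, not real ECDH or Kyber outputs):

```python
import hashlib
import hmac
import secrets

def hybrid_session_key(classical_secret: bytes,
                       pq_secret: bytes,
                       context: bytes = b"hybrid-kex-demo") -> bytes:
    """Combine two shared secrets with HKDF-SHA-256 (extract plus a
    single expand block). The output is pseudorandom as long as at
    least one of the two input secrets is."""
    # HKDF-Extract with an all-zero salt
    prk = hmac.new(b"\x00" * 32, classical_secret + pq_secret,
                   hashlib.sha256).digest()
    # HKDF-Expand, first (and only) output block
    return hmac.new(prk, context + b"\x01", hashlib.sha256).digest()

# Placeholder secrets standing in for an ECDH and a Kyber shared secret.
ecdh_secret = secrets.token_bytes(32)
kyber_secret = secrets.token_bytes(32)
key = hybrid_session_key(ecdh_secret, kyber_secret)
assert len(key) == 32
```

The design point is that a break of the classical component (by a quantum attacker) or of the young quantum-safe component (by new cryptanalysis) alone does not expose the derived session key.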



Quantum-safe migration


Michael Osborne
Manager, Foundational Cryptography and Quantum-Safe Cloud & Systems