At its pre-IIW workshop on Monday, April 14, the OpenID Foundation convened a panel on Post-Quantum Computing and Identity. The panelists included:
- Andrea D’Intino, Dyne.org
- Nancy Cam-Winget, Cisco
- John Bradley, Yubico
- Rick Byers, Google
- Gail Hodges, OpenID Foundation (moderator)
This summary draws out general themes; comments are not attributed to individuals or their organizations. You can also read a separate summary by OpenID Foundation Chairman Nat Sakimura.
Dissecting the Threat
Traditional public-key cryptography (PKC) will not withstand the code-breaking power of quantum computers. Put simply, today’s encryption methods depend upon mathematical problems that conventional computers cannot solve in any practical amount of time. PKC is used widely: to protect private data at rest, to secure messages (data in transit), to prevent unwanted third parties from hijacking digital sessions, and much more. In the case of the OpenID Foundation, our standards - from OIDC to OID4VP and Federation - leverage PKC to secure identity data. As an increasingly digital society, it is how we all secure private data, business processes, government secrets, and critical infrastructure.
In a world of quantum computing, however, certain algorithms become feasible to run (namely Shor’s algorithm and Grover’s search). With these algorithms in play, the mathematical problems that underpin PKC become breakable.
The panel discussed several ways in which modern identity infrastructure is vulnerable to these threats.
Tokens - OpenID Connect, Verifiable Credentials, and other standards rely heavily on tokens. The ID Token issued at the end of an OIDC flow, for example, is a JSON Web Token (JWT). Tokens depend upon cryptographic digital signatures that will come under threat in a post-quantum world.
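To make that dependency concrete, here is a minimal Python sketch (assuming the PyJWT and cryptography packages are installed) of an ID Token being signed and verified with ES256, a classical elliptic-curve algorithm of exactly the kind Shor’s algorithm would break. The claims and key handling are illustrative only.

```python
import jwt  # PyJWT
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import ec

# Elliptic-curve keypair of the kind an OpenID Provider uses to sign ID Tokens.
private_key = ec.generate_private_key(ec.SECP256R1())
private_pem = private_key.private_bytes(
    serialization.Encoding.PEM,
    serialization.PrivateFormat.PKCS8,
    serialization.NoEncryption(),
)
public_pem = private_key.public_key().public_bytes(
    serialization.Encoding.PEM,
    serialization.PublicFormat.SubjectPublicKeyInfo,
)

# Illustrative claims only.
claims = {"iss": "https://op.example", "sub": "user-123", "aud": "client-abc"}

# The token's integrity rests on the elliptic-curve discrete-log problem,
# which Shor's algorithm makes tractable on a large quantum computer.
id_token = jwt.encode(claims, private_pem, algorithm="ES256")

# Verification by the Relying Party with the provider's public key.
decoded = jwt.decode(id_token, public_pem, algorithms=["ES256"], audience="client-abc")
print(decoded["sub"])
```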
Certificates and Transport Layer Security - the panel discussed the dependence of Transport Layer Security (TLS) on certificates and digital signatures. Digitally signed certificates come under threat when those signatures can more easily be forged in a post-quantum environment.
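As an illustration of where to look, the sketch below (assuming the cryptography package; example.com is a placeholder host) fetches a server’s certificate and reports which signature algorithm protects it - the kind of cryptographic inventory step a post-quantum migration plan typically starts with.

```python
import socket
import ssl
from cryptography import x509

hostname = "example.com"  # placeholder host

# Open a TLS connection and capture the server certificate in DER form.
context = ssl.create_default_context()
with socket.create_connection((hostname, 443)) as sock:
    with context.wrap_socket(sock, server_hostname=hostname) as tls:
        der_cert = tls.getpeercert(binary_form=True)

cert = x509.load_der_x509_certificate(der_cert)
# Today this typically reports an RSA- or ECDSA-based signature, both of which
# rest on problems a large quantum computer could solve with Shor's algorithm.
print(cert.signature_algorithm_oid)
```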
Hardware Security Modules (HSMs) - create, store, and (ideally) never export key material. Today, this provides strong assurance of signature provenance. However, if the algorithms for which an HSM generates keys are not quantum-resistant, the signatures produced with those keys are vulnerable. Furthermore, large enterprises that require scalable and resilient signing capabilities routinely transmit and import key material: unless those transfer protocols are quantum-resistant, private keys may be intercepted.
Ultimately, quantum computing threatens not only our data, but also our Roots of Trust: once these systems are broken, how will anyone reliably know the nature of any entity they are interacting with online?
For more information about the threat of quantum in the context of today’s cryptography, see Google’s Threat Model for Post-Quantum Cryptography.
Not a Y2K Event
When asked whether there were parallels between the quantum threat and Y2K, the panelists agreed that the analogy doesn’t hold: Y2K was a predictable event with a known timeline. The “post-quantum cross-over” is much less clear-cut, and the threats are more insidious.
Most importantly, the threats of quantum computing actually begin long before the technology is available. While quantum computers do not seem to pose an immediate threat, since they are not readily available, the panelists agreed that much of today’s encrypted data is still threatened. In part, this is because nation-state actors may acquire these capabilities on the foreseeable horizon.
It is also because data captured today can be stored (and is being stored) with the intent of future decryption - an approach often described as “harvest now, decrypt later.” This includes data stored today that could lead to the reverse engineering of signing keys years into the future. For the avoidance of doubt, this includes data stored or transmitted by individuals, companies, governments, and all other actors online. Encrypted digital transactions, like payments and secure messages, intercepted today can be stored and decrypted tomorrow. As such, there are major implications for the way we secure any secrets that will still matter in the future - whether that be state secrets or the private keys of our hardware.
Preparation is Key
…no pun intended…
The panelists agreed that, given the recognizable threat, players across the landscape must work to develop and align around quantum-resistant algorithms that can be implemented at scale. This alignment will enable organizations to roll out future-proofed hardware and protect today’s data against the threats of tomorrow. In the interim, many players are introducing “quantum-safe” solutions and offering guidance as to how organizations can best position themselves today. Some of these solutions involve actively testing and “doubling up” pre- and post-quantum controls, applying both a classical and a quantum-resistant algorithm to the same data; a minimal sketch of this idea follows.
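The sketch signs one payload with both a classical algorithm (Ed25519) and a post-quantum one (Dilithium), and treats the payload as valid only if both signatures verify. It assumes the cryptography package and the liboqs-python binding (oqs) are available; the exact algorithm identifier depends on the liboqs release, and the structure is illustrative rather than any standardized hybrid format.

```python
import oqs  # liboqs-python binding (name and availability assumed)
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

payload = b'{"sub": "user-123", "amount": "100.00"}'  # illustrative payload

# Classical signature: protective against today's attackers.
ed_key = Ed25519PrivateKey.generate()
classical_sig = ed_key.sign(payload)

# Post-quantum signature: protective against "store now, forge later" attacks.
pq_signer = oqs.Signature("Dilithium3")  # identifier depends on liboqs version
pq_public = pq_signer.generate_keypair()
pq_sig = pq_signer.sign(payload)

def verify_hybrid(message: bytes) -> bool:
    """Accept the message only if BOTH signatures verify."""
    try:
        ed_key.public_key().verify(classical_sig, message)
    except InvalidSignature:
        return False
    pq_verifier = oqs.Signature("Dilithium3")
    return pq_verifier.verify(message, pq_sig, pq_public)

print(verify_hybrid(payload))       # True
print(verify_hybrid(b"tampered"))   # False
```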
All panelists agreed that not all safeguards are “post-quantum.” Many of today’s best practices will continue to be protective and, as yet, are not sufficiently commonplace (see, for example, the Cyber Safety Review Board’s recent recommendations to Cloud Service Providers). These include best practices for key storage, key rotation, and passkey (or password) management. Today’s essential hygiene remains critical, and the ecosystem will continue to be only as protected as its weakest link. The panelists argued that a robust cybersecurity posture has dependencies across the whole supply chain. If we want to solve these problems, the industry must continue to advocate for comprehensive standards and controls that protect the whole ecosystem.
This includes supporting global efforts to find quantum-resistant algorithms that scale and complement existing infrastructures. For example, NIST is in the middle of a six-year competition to standardize algorithms fit for essential use cases. Four winners were announced in 2022 (a brief sketch of the key-exchange winner in use appears after this list):
- CRYSTALS-Kyber (a key encapsulation mechanism)
- CRYSTALS-Dilithium (signing)
- Falcon (signing)
- SPHINCS+ (signing)
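For a sense of what the key-exchange winner looks like in use, here is a minimal sketch of Kyber key encapsulation. It assumes the liboqs-python binding (oqs) is installed and exposes Kyber under the name shown; newer releases may use the standardized ML-KEM names instead.

```python
import oqs  # liboqs-python binding (name and availability assumed)

ALG = "Kyber768"  # newer liboqs releases may name this ML-KEM-768

# The receiving party generates a keypair and publishes the public key.
receiver = oqs.KeyEncapsulation(ALG)
public_key = receiver.generate_keypair()

# The sending party encapsulates a fresh shared secret against that public key.
sender = oqs.KeyEncapsulation(ALG)
ciphertext, secret_at_sender = sender.encap_secret(public_key)

# The receiver decapsulates the ciphertext and recovers the same secret,
# which can then key a symmetric cipher for the session.
secret_at_receiver = receiver.decap_secret(ciphertext)

print(secret_at_sender == secret_at_receiver)  # True
```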
The panel saw examples of how some of this had already been implemented, including Dyne.org’s implementation of Dilithium-signed Decentralized Identifiers.
What does this mean for the OpenID Foundation?
Members of the panel and OIDF members in the room reached consensus that, given the post-quantum vulnerabilities to identity tokens, credentials, and trust ecosystems, this is an area that requires immediate attention from the Foundation.
OIDF standards have a role in underpinning trust today. Our standards are designed to mitigate modern security threats. FAPI, for example, provides security profiles and guidance appropriate for use cases requiring financial-grade privacy and security. OpenID Federation also underpins trust between counterparties through cryptographically signed entity statements. Both standards have a place in addressing the risks in today’s cybersecurity landscape: broad adoption of these and other emerging standards will reduce risk now.
However, while OIDF standards utilize modern cryptography, all Working Groups need to keep a watchful eye on emerging standards for quantum resilience. At a point in the not-too-distant future, they will need to bake new cryptographic best practices into the standards. In the meantime, we can increase our advocacy and cross-industry collaborative efforts to test and refine quantum-safe algorithms.
We welcome your thoughts and contributions in the comments or by email to director@oidf.org.