What Is Post-Quantum Cryptography (and What It Isn’t)
Post-quantum cryptography (PQC) refers to cryptographic algorithms designed to remain secure even if large-scale quantum computers become practical. These algorithms are built to resist attacks from both classical computers and future quantum systems.
The need for PQC comes from a specific weakness in today's cryptography. Much of the internet relies on public-key cryptography, including RSA, elliptic curve cryptography (ECC), and Diffie-Hellman, for key exchange, authentication, and digital signatures. These systems are secure against classical computers but are vulnerable to quantum algorithms, most notably Shor's algorithm, which can efficiently solve the underlying mathematical problems of integer factorization and discrete logarithms.
Post-quantum cryptography addresses this risk by replacing vulnerable public-key mechanisms with alternatives based on different mathematical foundations, ones that quantum computers are not known to break efficiently.
What PQC is not is a complete overhaul of encryption as we know it. The symmetric algorithms that actually protect data in transit and at rest, such as AES-256, remain secure even in a post-quantum world when used with appropriate key sizes. The core challenge lies in how encryption keys are established, identities are authenticated, and trust is enforced across systems.
This distinction matters because it keeps the problem scoped and practical. Post-quantum cryptography is not about rewriting every application or abandoning modern encryption. It is about upgrading the public-key foundations that underpin secure communication, identity, and trust across modern infrastructure.
Importantly, PQC is no longer theoretical. Standards bodies have finalized the first generation of post-quantum algorithms, and cloud providers and software vendors have begun introducing support, often through hybrid configurations that combine classical and post-quantum techniques. This shift moves PQC from academic research into real-world security planning.
For cloud environments in particular, PQC is best understood as a future-proofing measure. Data encrypted today, including API traffic, backups, logs, and identity tokens, may need to remain confidential for many years. Post-quantum cryptography exists to ensure that the security decisions made now continue to hold up as computing capabilities evolve.
Why Post-Quantum Cryptography Is a Data Problem, Not a Quantum One
When people talk about post-quantum cryptography, the conversation often centers on when quantum computers will arrive. That framing misses the real issue. The risk is not tied to a specific breakthrough date. It is tied to how long your data needs to remain secure.
If you manage information that must stay confidential for ten, twenty, or even more years, the threat already exists. Encrypted data captured today can be stored and decrypted later once quantum-capable systems become available, a pattern often called "harvest now, decrypt later." This turns post-quantum cryptography into a data longevity problem rather than a future technology problem.
Cloud environments make this risk more pronounced. Encryption is used everywhere, from TLS connections and API calls to service-to-service communication and identity tokens. These systems rely heavily on public-key cryptography for key establishment and authentication. While the data itself may be protected by strong symmetric encryption, the mechanisms that negotiate and authenticate those keys are vulnerable to future quantum attacks.
The scale and automation of cloud infrastructure amplify exposure. Short-lived workloads generate large volumes of encrypted traffic, certificates, and credentials. Even when individual services are ephemeral, the data they process and the logs they generate often are not. Backups, archives, and audit records can persist for years, creating long-lived assets that must remain secure well into the future.
This is why waiting for certainty around quantum timelines is risky. Once sensitive data has been exposed through interception or collection, it cannot be taken back. Post-quantum cryptography exists to ensure that data encrypted today remains protected tomorrow, regardless of how computing capabilities evolve.
Where Today’s Cryptography Breaks (and Where It Doesn’t)
Not all cryptography is equally affected by the rise of quantum computing. Understanding where the real weaknesses are helps keep post-quantum planning focused and practical.
The primary risk lies in public-key cryptography. Algorithms such as RSA, elliptic curve cryptography (ECC), and Diffie-Hellman underpin key exchange, authentication, and digital signatures across the internet. These systems rely on mathematical problems that are difficult for classical computers to solve but are expected to be tractable for sufficiently powerful quantum computers. Once those problems can be solved efficiently, attackers could impersonate trusted services, forge signatures, or decrypt previously secure communications.
By contrast, the symmetric cryptography that protects the actual contents of data in transit and at rest is far more resilient. Known quantum attacks on symmetric ciphers, such as Grover's algorithm, offer at most a quadratic speedup, which effectively halves key strength; AES-256 therefore retains roughly 128-bit security. The same is true for modern hash functions and message authentication codes with adequate key and output sizes.
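To make this concrete, here is a minimal sketch using the pyca/cryptography library (the payload and associated data are placeholders). Symmetric encryption like AES-256-GCM works exactly the same way before and after a post-quantum migration; what changes is how the key is established, not how it is used.

```python
# Minimal sketch: AES-256-GCM with the pyca/cryptography library.
# Symmetric primitives like this do not need to change in a post-quantum
# migration; only key establishment and authentication mechanisms do.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)  # ~128-bit security even against Grover's algorithm
aesgcm = AESGCM(key)
nonce = os.urandom(12)                     # 96-bit nonce, unique per message

ciphertext = aesgcm.encrypt(nonce, b"backup payload", b"associated-data")
plaintext = aesgcm.decrypt(nonce, ciphertext, b"associated-data")
assert plaintext == b"backup payload"
```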
This distinction is critical. Post-quantum cryptography is not about replacing every encryption primitive in your environment. It is about addressing the public-key systems that establish trust and exchange keys. If those foundations fail, even strong symmetric encryption cannot protect data, because the keys themselves can be compromised.
In practice, this means the focus of PQC readiness is narrow but deep. Security teams need to understand where public-key cryptography is used for TLS handshakes, SSH access, API authentication, certificate chains, and identity systems. These are the points where quantum-resistant algorithms will eventually need to replace or augment today’s mechanisms.
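As a starting point, a check like the sketch below (which assumes the pyca/cryptography package and uses a placeholder hostname) can show both the negotiated TLS parameters of a public endpoint and whether its certificate still relies on a quantum-vulnerable key type.

```python
# Sketch: inspect a public endpoint's TLS session and certificate key type.
# Hostname is a placeholder; requires the pyca/cryptography package.
import socket
import ssl
from cryptography import x509
from cryptography.hazmat.primitives.asymmetric import ec, rsa

def endpoint_summary(host: str, port: int = 443) -> str:
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            cipher, version, _ = tls.cipher()
            cert = x509.load_der_x509_certificate(tls.getpeercert(binary_form=True))
    pub = cert.public_key()
    if isinstance(pub, rsa.RSAPublicKey):
        key = f"RSA-{pub.key_size} (quantum-vulnerable)"
    elif isinstance(pub, ec.EllipticCurvePublicKey):
        key = f"ECC {pub.curve.name} (quantum-vulnerable)"
    else:
        key = type(pub).__name__
    return f"{host}: {version} {cipher}, certificate key: {key}"

print(endpoint_summary("example.com"))
```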
By scoping the problem correctly, organizations can avoid unnecessary disruption while still addressing the parts of their cryptographic stack that truly matter in a post-quantum world.
The Practical State of Post-Quantum Cryptography
Post-quantum cryptography is no longer confined to research papers and long-term forecasts. Standards bodies have finalized the first generation of quantum-resistant algorithms, and vendors have begun the slow work of turning those standards into deployable technology.
This matters because standardization is the line between theory and practice. With NIST's 2024 finalization of FIPS 203 (ML-KEM) for key establishment and FIPS 204 (ML-DSA) and FIPS 205 (SLH-DSA) for digital signatures, organizations now have a stable foundation to plan against. These standards give software vendors, cloud providers, and security teams a shared target for implementation and interoperability.
At the same time, it is important to separate standards from readiness. Most production environments are not yet fully prepared to adopt post-quantum cryptography end to end. Support across cryptographic libraries, protocols, and cloud services is still uneven, and many implementations are limited to specific use cases or hybrid configurations that combine classical and post-quantum techniques.
This uneven adoption is expected. Cryptography sits deep in operating systems, libraries, protocols, and managed services. Updating it safely takes time, especially when backward compatibility and performance must be preserved. As a result, most organizations will operate in mixed environments for years, where some connections support post-quantum algorithms and others still rely on classical cryptography.
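To illustrate what a hybrid configuration means in practice, the sketch below combines a classical X25519 exchange with an ML-KEM-768 encapsulation and derives one session key from both secrets, so the result stays safe unless both schemes are broken. It assumes the open-source liboqs-python bindings (imported as oqs) alongside pyca/cryptography; real protocols, such as the X25519MLKEM768 group in TLS 1.3, specify this combination precisely.

```python
# Illustrative sketch of hybrid key establishment, assuming the
# liboqs-python bindings ("oqs") plus pyca/cryptography are installed.
import oqs
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Classical part: X25519 Diffie-Hellman exchange
client_ec = X25519PrivateKey.generate()
server_ec = X25519PrivateKey.generate()
classical_secret = client_ec.exchange(server_ec.public_key())

# Post-quantum part: ML-KEM-768 encapsulation
with oqs.KeyEncapsulation("ML-KEM-768") as client_kem:
    pq_public = client_kem.generate_keypair()  # client sends this to the server
    with oqs.KeyEncapsulation("ML-KEM-768") as server_kem:
        ciphertext, pq_secret = server_kem.encap_secret(pq_public)
    assert client_kem.decap_secret(ciphertext) == pq_secret

# Derive one session key from BOTH secrets: an attacker must break
# both X25519 and ML-KEM-768 to recover it.
session_key = HKDF(
    algorithm=hashes.SHA256(), length=32, salt=None, info=b"hybrid-demo",
).derive(classical_secret + pq_secret)
```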
For security teams, the takeaway is not to wait for universal support. It is to plan for a phased transition. Migration plans should be broken into three parts:
1. Inventory what needs to be migrated. You cannot upgrade what you cannot see.
2. Migrate TLS and SSH key exchange. This step is urgent because it mitigates the harvest now, decrypt later risk, and it is comparatively easy because it typically involves upgrading software or adjusting a configuration rather than replacing keys or credentials (see the sketch after this list).
3. Migrate everything else. Regenerating key pairs, updating authentication mechanisms, and replacing cryptographic libraries embedded in applications is more complex and will take longer.
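For step 2, modern OpenSSH already ships hybrid post-quantum key exchange: version 9.0 made sntrup761x25519-sha512 the default, and 9.9 added mlkem768x25519-sha256. A quick way to see what a given client supports, assuming an ssh binary is on PATH, is sketched below.

```python
# Quick check of which post-quantum hybrid KEX algorithms the local
# OpenSSH client supports. `ssh -Q kex` lists every key exchange
# algorithm the client can negotiate.
import subprocess

kex = subprocess.run(["ssh", "-Q", "kex"], capture_output=True, text=True).stdout.split()
pq_kex = [k for k in kex if "sntrup761" in k or "mlkem" in k]
print("Post-quantum SSH key exchange available:", pq_kex or "none")
```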
This phased approach keeps the problem manageable. The first two steps are achievable today, while the third will unfold over years as ecosystem support matures.
Understanding this reality helps set the right expectations. Post-quantum readiness is not a one-time upgrade. It is a multi-year process that requires visibility, prioritization, and steady progress as support matures across the ecosystem.
Why Post-Quantum Cryptography Is Hard in Cloud Environments
Post-quantum cryptography is challenging in any environment, but cloud infrastructure makes the problem significantly harder to reason about and manage.
In the cloud, cryptography is not confined to a small set of well-defined systems. It is embedded across nearly every layer of the stack. TLS secures inbound and outbound traffic. Service-to-service communication relies on certificates and mutual authentication. APIs depend on signed requests and tokens. Identity systems use cryptographic keys to establish trust between users, workloads, and services.
At the same time, cloud environments are highly dynamic. Workloads are ephemeral. Containers are rebuilt frequently. Services scale automatically. Third-party managed services introduce cryptographic choices that are outside direct application control. As a result, cryptography is constantly being created, rotated, and retired, often without a single team having a complete view of where and how it is used.
This dynamism makes post-quantum migration fundamentally different from traditional cryptographic upgrades. In on-prem environments, teams could often inventory servers, rotate certificates, and move forward in a controlled sequence. In the cloud, cryptographic usage is distributed across infrastructure, platforms, and applications, with ownership split between security, engineering, and cloud operations teams.
The result is a visibility gap. Organizations may know that they rely heavily on RSA or ECC, but lack clarity on where those algorithms appear, which services depend on them, and which data flows they protect. Without that context, planning a transition to post-quantum cryptography becomes guesswork.
This is the core challenge cloud security teams face with PQC. It is not a lack of standards or algorithms. It is the difficulty of understanding cryptographic exposure in environments that change continuously and operate at scale. Until that visibility exists, meaningful migration planning remains out of reach.
The First Step to PQC Readiness: Cryptographic Visibility
Before an organization can plan a transition to post-quantum cryptography, it needs a clear understanding of where cryptography is actually used. Without that visibility, migration efforts tend to default to assumptions, blanket policies, or incomplete upgrades that miss the areas of highest risk.
Cryptographic visibility starts with inventory. Security teams need to know which certificates, keys, protocols, and libraries are active across their cloud environments. This includes TLS configurations on load balancers and APIs, SSH keys used for access, cryptographic libraries embedded in container images, and the key material used by identity systems and managed services.
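As a small illustration of what inventory can look like at the file level, the sketch below walks a directory of PEM certificates and tallies key types. The path is a placeholder, and a real inventory would also cover keys, protocols, and libraries as described above; it assumes the pyca/cryptography package.

```python
# Sketch: tally certificate key types in a directory as a seed for a
# cryptographic inventory. The path is a placeholder.
import pathlib
from collections import Counter
from cryptography import x509
from cryptography.hazmat.primitives.asymmetric import ec, rsa

def key_type(path: pathlib.Path) -> str:
    cert = x509.load_pem_x509_certificate(path.read_bytes())
    pub = cert.public_key()
    if isinstance(pub, rsa.RSAPublicKey):
        return f"RSA-{pub.key_size}"
    if isinstance(pub, ec.EllipticCurvePublicKey):
        return f"EC-{pub.curve.name}"
    return type(pub).__name__

inventory = Counter()
for path in pathlib.Path("/etc/ssl/certs").glob("*.pem"):
    try:
        inventory[key_type(path)] += 1
    except ValueError:
        pass  # skip files that are not single PEM certificates

print(inventory)  # e.g. Counter({'RSA-2048': 37, 'EC-secp256r1': 12})
```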
In cloud environments, this inventory cannot be static. Workloads are ephemeral, services are redeployed frequently, and dependencies change as teams ship new code. Traditional approaches that rely on manual tracking or periodic audits struggle to keep up. Effective visibility requires continuous discovery that reflects how cryptography is actually used in running environments.
This is where the concept of a Cryptographic Bill of Materials (CBOM) begins to emerge. Much like a Software Bill of Materials maps application dependencies, a CBOM aims to map the cryptographic components that software and infrastructure rely on. While CBOM standards are still evolving, the underlying principle is already essential: you cannot assess quantum risk without knowing which algorithms and key types are in play.
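As a purely illustrative example, loosely modeled on the cryptographic-asset component type introduced in CycloneDX 1.6 (field names are simplified here), a single CBOM entry might record an algorithm, its quantum-safety status, and the services that depend on it.

```python
# Illustrative sketch of a minimal CBOM entry, loosely modeled on
# CycloneDX 1.6's "cryptographic-asset" component type. Field names
# and service names are simplified placeholders, not a formal schema.
import json

cbom_entry = {
    "type": "cryptographic-asset",
    "name": "tls-server-certificate",
    "cryptoProperties": {
        "assetType": "certificate",
        "algorithm": "RSA-2048",
        "quantumSafe": False,  # this flag drives migration priority
        "usedBy": ["api-gateway", "billing-service"],
    },
}
print(json.dumps(cbom_entry, indent=2))
```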
Visibility also enables prioritization. Not every instance of public-key cryptography carries the same risk. Internet-facing endpoints, long-lived certificates, and systems that protect sensitive or regulated data deserve attention before internal services with short-lived credentials. A clear inventory makes it possible to focus on the cryptographic assets that matter most, rather than treating all usage as equally urgent.
Post-quantum readiness does not start with replacing algorithms. It starts with understanding exposure. Once teams can see where cryptography lives and how it supports data flows and identities, planning a phased and realistic migration becomes possible.
How Wiz Helps Teams Prepare for Post-Quantum Cryptography
Preparing for post-quantum cryptography starts with understanding where quantum-vulnerable cryptography exists in real cloud environments. Wiz approaches this problem from the inside out, using agentless scanning to identify cryptographic usage across infrastructure, platforms, and applications without requiring code changes.
Wiz continuously discovers where public-key cryptography is used across cloud workloads and services. This includes TLS configurations on load balancers and APIs, certificates used for service-to-service authentication, cryptographic libraries embedded in container images, and key material associated with identity systems and managed services. By surfacing where RSA, elliptic curve cryptography, and Diffie-Hellman–based mechanisms appear, Wiz helps teams see where post-quantum migration will eventually be required.
Visibility alone is not enough. The challenge with post-quantum cryptography is deciding where to act first. Wiz adds context through its Security Graph, which correlates cryptographic findings with network exposure, identity permissions, and data access patterns. This allows teams to distinguish between low-risk internal usage and cryptographic assets that protect sensitive data, authenticate privileged access, or sit on internet-facing attack paths.
Testing PQC Support with the Wiz PQC Tester
In addition to internal visibility, Wiz provides a practical way to assess external readiness through the Wiz PQC Tester. The PQC Tester allows teams to evaluate whether public-facing endpoints support post-quantum or hybrid key exchange mechanisms during TLS negotiation.
This matters because internet-facing services are the easiest targets for data harvesting. Even when internal systems are well controlled, externally exposed endpoints are where encrypted traffic is most likely to be intercepted and stored. The PQC Tester helps teams quickly understand whether their public endpoints are still relying solely on classical key exchange or whether post-quantum protections are beginning to take effect.
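For a rough do-it-yourself analogue of this kind of check (not how the Wiz PQC Tester is implemented), you can offer a server only a hybrid post-quantum group and see whether the TLS 1.3 handshake succeeds. The sketch below assumes an OpenSSL 3.5+ build with ML-KEM support on PATH and uses a placeholder hostname.

```python
# Rough probe: offer ONLY the X25519MLKEM768 hybrid group; if the server
# cannot negotiate it, the handshake (and the command) fails. Assumes an
# OpenSSL 3.5+ build with ML-KEM support is installed.
import subprocess

def supports_hybrid_kex(host: str, port: int = 443) -> bool:
    result = subprocess.run(
        ["openssl", "s_client", "-connect", f"{host}:{port}",
         "-groups", "X25519MLKEM768", "-brief"],
        input="", capture_output=True, text=True, timeout=15,
    )
    return result.returncode == 0

print(supports_hybrid_kex("example.com"))
```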
On its own, endpoint testing provides a useful signal but limited context. The strength of Wiz’s approach is combining that external signal with internal visibility. Teams can correlate PQC Tester results with cloud asset ownership, cryptographic inventory, exposure paths, and data sensitivity. This makes it possible to answer practical questions, such as which teams own endpoints that lack post-quantum support, which services sit behind those endpoints, and which data flows would be impacted.
Together, internal discovery and external testing provide a more complete picture of post-quantum readiness. Teams can validate progress, identify gaps, and prioritize remediation based on real risk rather than assumptions.
As standards mature and cloud providers expand post-quantum support, Wiz provides continuous insight into both progress and drift. New services that reintroduce quantum-vulnerable cryptography are surfaced quickly, helping teams maintain forward momentum throughout a multi-year transition.