Google post-quantum cryptography push speeds migration
Google says post-quantum cryptography adoption must accelerate as quantum attack cost estimates for RSA-2048 fall sharply.
Google has sharpened its post-quantum cryptography timeline and is pushing the industry to migrate sooner than many security teams had planned. Its March 2026 posture, built on the February statement that a cryptographically relevant quantum computer is no longer safely distant, tells developers and infrastructure teams to treat quantum-vulnerable public-key systems as a live migration problem, not a long-range research topic.
The practical impact is straightforward. If your stack still depends on RSA, ECDSA, or elliptic-curve key exchange in certificates, KMS workflows, service-to-service auth, or device identity, your replacement window is shorter than the relaxed reading many organizations took from NIST’s broad 2035 endpoint.
Google’s accelerated position
Google’s quantum security statement frames the threat around two points. First, “store now, decrypt later” is already a relevant risk model for sensitive data with long confidentiality lifetimes. Second, migration has to happen before a large-scale quantum machine exists, because cryptographic replacement across infrastructure takes years.
Google says it has been preparing since 2016 and has used PQC protections for internal communications since 2022. It has also rolled out post-quantum protections across Chrome, products, and data-center infrastructure, while aligning its own migration with current NIST guidance.
One detail matters for accuracy. The accessible Google materials support the accelerated migration stance, but do not provide a direct official quote for a hard 2029 deadline. The substance is still clear: Google is pressing for an earlier cutoff than many enterprises assumed.
The technical reason the deadline moved up
The urgency is tied to Google’s 2025 quantum factoring work. Google reported that RSA-2048 could theoretically be broken by a quantum computer with 1 million noisy qubits running for less than one week, down from its 2019 estimate of 20 million physical qubits.
That is a 20x reduction in estimated quantum resources for a meaningful real-world target.
| Estimate | RSA-2048 attack cost |
|---|---|
| Google 2019 | 20 million physical qubits |
| Google 2025 | 1 million noisy qubits, under 1 week |
Current machines remain far below that scale, at roughly 100 to 1,000 qubits. The issue is not immediate breakage. The issue is that the migration lead time now dominates the hardware lead time.
If you build long-lived systems, the same planning discipline used in AI security monitoring applies here. You do not wait for the threat to be fully operational before changing the architecture.
Which cryptography is in scope
The affected systems are the standard public-key primitives used almost everywhere in modern infrastructure:
- RSA
- Elliptic Curve Diffie-Hellman
- ECDSA
- Other classical public-key key-establishment and signature schemes in the same category
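When an inventory scan turns up algorithm names, it helps to bucket them consistently. The sketch below is a minimal classifier along the lines of the list above; the identifier sets are illustrative common names, not an exhaustive or authoritative taxonomy:

```python
# Rough classification of public-key algorithm names by quantum exposure.
# Illustrative sketch for inventory tooling; the sets are not exhaustive.

QUANTUM_VULNERABLE = {
    "RSA", "ECDSA", "ECDH", "X25519", "ED25519", "DSA", "DH",
}

POST_QUANTUM = {
    "ML-KEM",   # FIPS 203, key establishment
    "ML-DSA",   # FIPS 204, digital signatures
    "SLH-DSA",  # FIPS 205, digital signatures
}

def classify(algorithm: str) -> str:
    """Return 'post-quantum', 'vulnerable', or 'unknown' for an algorithm name."""
    name = algorithm.strip().upper()
    # Check PQ prefixes first so "ML-DSA-65" is not misread via its "DSA" suffix.
    if any(name.startswith(pq) for pq in POST_QUANTUM):
        return "post-quantum"
    if any(name.startswith(v) for v in QUANTUM_VULNERABLE):
        return "vulnerable"
    return "unknown"
```

Feeding scanner output through a single classifier like this keeps the inventory consistent across teams, even when certificate dumps and config files spell the same algorithm differently.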
The current standardized replacements come from NIST’s finalized PQC standards published in 2024:
| Function | Standard | Algorithm |
|---|---|---|
| Key establishment | FIPS 203 | ML-KEM |
| Digital signatures | FIPS 204 | ML-DSA |
| Digital signatures | FIPS 205 | SLH-DSA |
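One practical difference for developers: ML-KEM replaces Diffie-Hellman-style key agreement with a key-encapsulation interface of keygen, encapsulate, and decapsulate. The sketch below shows only that call pattern, using a deliberately insecure HMAC stand-in (the "public" key is literally a copy of the secret key) so the flow is runnable; a real system would plug ML-KEM from a vetted library behind the same three calls:

```python
import hmac
import hashlib
import secrets

# The KEM call pattern (keygen / encapsulate / decapsulate) that ML-KEM
# standardizes. The math below is a toy stand-in, NOT ML-KEM: the "public"
# key is a copy of the secret key, which is insecure by construction.

def keygen() -> tuple[bytes, bytes]:
    sk = secrets.token_bytes(32)
    pk = sk  # toy only; a real KEM derives a distinct public key
    return pk, sk

def encapsulate(pk: bytes) -> tuple[bytes, bytes]:
    """Sender: produce a ciphertext and a shared secret from the public key."""
    ct = secrets.token_bytes(32)
    ss = hmac.new(pk, ct, hashlib.sha256).digest()
    return ct, ss

def decapsulate(sk: bytes, ct: bytes) -> bytes:
    """Receiver: recover the same shared secret from the ciphertext."""
    return hmac.new(sk, ct, hashlib.sha256).digest()

# Both sides end with the same symmetric key to feed into AEAD encryption.
pk, sk = keygen()
ct, sender_secret = encapsulate(pk)
receiver_secret = decapsulate(sk, ct)
assert sender_secret == receiver_secret
```

Code written against this encapsulate/decapsulate shape, rather than against a specific Diffie-Hellman API, is already most of the way to an ML-KEM swap.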
For developers, this is less about theory than dependency mapping. Anywhere you use key exchange, certificate issuance, signed artifacts, firmware signing, or roots of trust, you need an inventory.
Google Cloud is already productizing the migration
Google’s urgency is backed by deployable services. Cloud KMS supports PQC digital signatures with ML-DSA-65 and SLH-DSA-SHA2-128s, plus a hash-based variant. Its KMS algorithm support also includes ML-KEM-768, ML-KEM-1024, and KEM-XWING for key encapsulation.
That means the migration is no longer blocked on standards availability. It has moved into implementation and rollout work, which is where most organizations get delayed by certificate chains, hardware dependencies, vendor support, and protocol compatibility.
This is the same pattern developers already see in fast-moving AI stacks. A capability becomes real only when it lands in production tooling, whether that is multicloud security controls or code security pipelines.
The timeline tension with NIST and NSA
NIST’s current draft transition guidance still points to 2035 as the main federal target for completed migration, with many classical schemes deprecated after 2030 and disallowed after 2035.
NSA’s CNSA 2.0 direction is tighter for national security systems. Draft operational dates point to 2027 for new products and services, 2030 for replacing unsupported equipment, and 2031 for broader system coverage.
| Organization | Key dates |
|---|---|
| Google | Publicly pressing for faster migration now |
| NIST | 2030 deprecation, 2035 disallow/completion horizon |
| NSA / CNSA 2.0 | 2027, 2030, 2031 operational milestones |
For enterprise teams, Google’s stance lands much closer to the NSA view than to the most relaxed interpretation of NIST’s 2035 endpoint.
What developers should do now
Start with a cryptographic inventory, not a procurement exercise. Identify every place your systems depend on RSA or elliptic-curve public-key cryptography, especially TLS termination, service identity, code signing, device onboarding, KMS-backed signatures, and archived sensitive data.
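A first pass at that inventory can be as blunt as walking the repo and config trees for quantum-vulnerable algorithm identifiers. This is a minimal sketch; the pattern list is illustrative, not complete, and real inventories should also cover certificate stores and vendor SBOMs:

```python
import re
from pathlib import Path

# Illustrative, not complete: identifiers that usually signal RSA or
# elliptic-curve dependencies in code, configs, and IaC templates.
VULNERABLE_PATTERNS = re.compile(
    r"\b(RSA|ECDSA|ECDH|secp256r1|secp384r1|prime256v1|X25519|Ed25519)\b",
    re.IGNORECASE,
)

def scan(root: str) -> list[tuple[str, int, str]]:
    """Return (path, line number, matched identifier) for every hit under root."""
    hits = []
    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue  # unreadable file; note it in a real audit
        for lineno, line in enumerate(text.splitlines(), start=1):
            for match in VULNERABLE_PATTERNS.finditer(line):
                hits.append((str(path), lineno, match.group(0)))
    return hits
```

Output like this seeds the prioritization step: hits in TLS termination, service identity, and signing paths come before test fixtures and docs.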
Then prioritize crypto agility. If your systems cannot swap algorithms without application rewrites, that is the real blocker. Teams already doing disciplined interface design in areas like agent infrastructure and evaluation should apply the same approach here: isolate the cryptographic boundary, test replacement paths early, and remove hard-coded assumptions before the standards deadline turns into an outage.
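One way to isolate that cryptographic boundary is a narrow signing interface that application code depends on, with the concrete algorithm selected by configuration. The sketch below uses HMAC stand-ins in place of real backends (an RSA signer today, an ML-DSA signer later, registered the same way); the registry names are hypothetical:

```python
import hmac
import hashlib
from typing import Protocol

class Signer(Protocol):
    """The only signing surface application code should see."""
    algorithm: str
    def sign(self, message: bytes) -> bytes: ...
    def verify(self, message: bytes, signature: bytes) -> bool: ...

class HmacStandInSigner:
    """Stand-in backend; a real registry would hold RSA and ML-DSA signers."""
    def __init__(self, algorithm: str, key: bytes):
        self.algorithm = algorithm
        self._key = key

    def sign(self, message: bytes) -> bytes:
        return hmac.new(self._key, message, hashlib.sha256).digest()

    def verify(self, message: bytes, signature: bytes) -> bool:
        return hmac.compare_digest(self.sign(message), signature)

# Swapping algorithms becomes a registry/config change, not an app rewrite.
REGISTRY: dict[str, Signer] = {
    "legacy-rsa": HmacStandInSigner("legacy-rsa", b"demo-key-1"),
    "pq-ml-dsa": HmacStandInSigner("pq-ml-dsa", b"demo-key-2"),
}

def get_signer(name: str) -> Signer:
    return REGISTRY[name]
```

The design choice that matters is the narrow `Signer` interface: callers never import an algorithm-specific module, so migrating from RSA to ML-DSA touches the registry and key material, not every service that signs or verifies.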