Cloudflare Moves Post-Quantum Deadline to 2029: 5 Things Every Security Team Needs to Know Now
Cloudflare called the new quantum research 'a real shock' and pulled its deadline forward. Here's what changed and what to do.
Cloudflare has moved its post-quantum security deadline to 2029, and the company isn’t being subtle about why. Someone inside Cloudflare told Time that the new quantum computing research was “a real shock” and that the company would “need to speed up our efforts considerably.” That’s not the language of a company executing a planned roadmap. That’s the language of a company that just got surprised.
If you run infrastructure, manage security policy, or make decisions about cryptographic standards, this deadline shift is the most important thing that happened in your field this month. Here’s what changed, what it means, and what the 2029 target actually requires.
The Deadline That Just Moved
For years, the working assumption in enterprise security was that quantum computers capable of breaking modern encryption were a 2030s problem — maybe a 2035 problem. That gave organizations a comfortable runway. Migrate eventually. Prioritize other things now.
That window just closed.
Cloudflare announced it is targeting 2029 to be fully post-quantum secure, including post-quantum authentication — not just encryption. The company cited “credible new research and rapid industry developments” as the reason for pulling the deadline forward. This is not a minor adjustment. A runway assumed to stretch into the mid-2030s just compressed to four years, and the research driving that compression is still accelerating.
The specific research Cloudflare was responding to came from two directions simultaneously. Google published estimates suggesting a future quantum computer could attack the P-256 elliptic curve discrete logarithm problem using fewer than 1,200 logical qubits and fewer than 19 million Toffoli gates. A separate estimate from the same team put it at 1,450 logical qubits and fewer than 17 million Toffoli gates. Either way, the number is dramatically lower than prior estimates — and Google calculated that such an attack could run on a superconducting quantum computer with fewer than 500,000 physical qubits, potentially in minutes.
Separately, researchers connected to Caltech and a company called Oatomic published a paper arguing that Shor’s algorithm could run at cryptographically relevant scales with approximately 10,000 reconfigurable atomic qubits. Their estimate: around 26,000 physical qubits could attack P-256 in a few days.
Both papers are theoretical. Neither represents a machine that exists today. But Cloudflare, which processes a significant fraction of the world’s internet traffic, looked at this research and said: we need to move faster.
What the Research Actually Says (and What It Doesn’t)
The Oatomic paper has not been peer-reviewed. Princeton’s Jeff Thompson, a quantum computing researcher, warned that the paper’s assumptions are untested and that “it is very easy to shrink a computer on paper if you assume better qubits.” That’s a legitimate criticism. Resource estimates in quantum computing have a long history of optimistic assumptions that don’t survive contact with physical hardware.
Google’s approach was more unusual. Instead of publishing the exact attack circuits — which would amount to publishing a partial blueprint for breaking encryption — Google used a zero-knowledge proof to verify its claims. A zero-knowledge proof lets you demonstrate that you know something without revealing the thing itself. In this context: Google proved its estimates are valid without handing anyone the actual method. That’s a meaningful signal about how seriously the company takes the dual-use risk of this research.
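Google hasn’t described the exact construction it used, but the underlying idea can be illustrated with a classic Schnorr identification protocol: the prover demonstrates knowledge of a discrete logarithm without revealing it. The parameters below are deliberately tiny toy values, purely to show the shape of a zero-knowledge interaction:

```python
import secrets

# Toy Schnorr identification protocol: prove knowledge of x with y = g^x mod p
# WITHOUT revealing x. Parameters are tiny and insecure, for illustration only;
# this is a classical ZK construction, not whatever scheme Google actually used.
p = 2039          # safe prime: p = 2q + 1
q = 1019          # prime order of the subgroup generated by g
g = 4             # generator of the order-q subgroup mod p

x = secrets.randbelow(q)   # prover's secret (the "knowledge")
y = pow(g, x, p)           # public value the verifier knows

def prove(challenge_fn):
    r = secrets.randbelow(q)
    t = pow(g, r, p)               # commitment, sent first
    c = challenge_fn(t)            # verifier picks a random challenge
    s = (r + c * x) % q            # response; reveals nothing about x by itself
    return t, c, s

def verify(t, c, s):
    # Accept iff g^s == t * y^c (mod p) — holds exactly when the prover knows x
    return pow(g, s, p) == (t * pow(y, c, p)) % p

t, c, s = prove(lambda t: secrets.randbelow(q))
```

The verifier learns that the prover knows `x`, and nothing else — the same property that lets Google attest to its resource estimates without publishing the attack circuits.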
John Preskill, one of the most respected names in quantum computing and a co-author of the Oatomic paper, told Time he was surprised by how much the team was able to reduce the qubit count. He also made a point that tends to get lost in the coverage: humans were still driving the research. The AI helped search through a vast space of possible approaches. The scientists decided which questions to ask.
That distinction matters because the AI contribution here wasn’t incidental. The Oatomic team used OpenEvolve, an open-source tool that uses large language models to optimize algorithms through a process similar to evolutionary search. Instead of researchers manually testing a handful of approaches, the system could evaluate thousands. The team’s early algorithms were reportedly about 1,000 times worse than what they ended up with. One author said plainly that the project “would not work” without the AI-assisted improvements.
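The control flow of that kind of search is simple even though the search space is vast. The sketch below is a minimal evolutionary loop in the spirit of OpenEvolve; in the real tool the mutation step is an LLM proposing code edits and the cost function scores a candidate algorithm (for example, its gate count), whereas here both are stand-in placeholders:

```python
import random

# Minimal evolutionary-search loop. In OpenEvolve, mutate() would be an LLM
# proposing edits to a program and cost() would score the resulting algorithm;
# here they are simple numeric stand-ins to show the selection loop itself.
random.seed(0)

def cost(candidate):
    # Stand-in objective: squared distance to a hidden optimum.
    target = [3.0, -1.0, 2.5]
    return sum((a - b) ** 2 for a, b in zip(candidate, target))

def mutate(candidate):
    # Random tweak to one coordinate (the LLM's role in the real system).
    child = list(candidate)
    i = random.randrange(len(child))
    child[i] += random.gauss(0, 0.5)
    return child

population = [[0.0, 0.0, 0.0] for _ in range(20)]
for generation in range(200):
    population.sort(key=cost)
    survivors = population[:5]                      # keep the best candidates
    population = survivors + [mutate(random.choice(survivors))
                              for _ in range(15)]   # breed the next generation

best = min(population, key=cost)
```

The leverage comes from evaluating thousands of candidates instead of the handful a human would try by hand — which is how the team’s early algorithms could end up roughly 1,000 times worse than the final result.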
This is the part that should recalibrate your mental model of how fast this field moves. The threat to encryption isn’t just “quantum computers are getting bigger.” It’s “quantum computers are getting bigger and the algorithms they run are getting more efficient and AI is helping researchers find those efficiencies faster.” All three are happening at once.
Why Authentication Is the Part That Should Worry You More
Most of the public conversation about quantum computing and encryption focuses on confidentiality — the idea that a future quantum computer could decrypt messages that are encrypted today. That’s real. But Cloudflare’s 2029 deadline specifically calls out authentication as a top priority, and that’s the more immediately dangerous problem.
Here’s the distinction. Encryption protects the content of a communication. Authentication proves identity — that the server you’re talking to is actually the server it claims to be, that the software update you’re installing was actually signed by the company that wrote it, that the API key you’re using belongs to who it says it does.
If a quantum attacker can forge authentication credentials, they don’t need to decrypt your traffic. They can impersonate your bank, your software vendor, your identity provider. They walk through the front door with a forged key, as Cloudflare’s blog put it.
Long-lived keys are the specific vulnerability here. Root certificates, code signing certificates, API authentication keys — these are credentials that get issued once and trusted for years. If a quantum computer can break the cryptography underlying those keys, every system that trusts them is compromised. And because these keys have long lifespans, the window for an attacker to harvest them now and break them later is already open.
This is the harvest-now-decrypt-later threat that the NSA, CISA, and NIST have all issued warnings about. The attack is simple in concept: collect encrypted traffic and authentication material today, store it, and wait for quantum hardware to mature enough to break it. The data being stolen right now may not need to be decrypted for five or ten years. That’s fine, if you’re patient.
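This risk has a standard formalization, usually called Mosca’s inequality: if the time your data must stay confidential plus the time you need to migrate exceeds the time until a cryptographically relevant quantum computer, data encrypted today is already exposed. The year values below are illustrative assumptions, not predictions:

```python
# Mosca's inequality for harvest-now-decrypt-later risk.
# All three durations are illustrative assumptions — plug in your own.
shelf_life_years = 10      # how long the data must remain confidential
migration_years = 4        # how long your organization needs to fully migrate
quantum_arrival_years = 8  # assumed years until a relevant quantum computer

at_risk = shelf_life_years + migration_years > quantum_arrival_years
exposure_years = shelf_life_years + migration_years - quantum_arrival_years
```

Under these assumptions the data is exposed for six years: an adversary harvesting it today can simply wait out the gap.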
NIST finalized its first three post-quantum cryptography standards — FIPS 203 (ML-KEM, for key encapsulation) and FIPS 204 (ML-DSA) and FIPS 205 (SLH-DSA), both for digital signatures — on August 13, 2024, and has been urging system administrators to begin transitioning immediately. The standards exist. The migration path exists. The problem is that full integration takes time — and that time is now shorter than most organizations planned for.
The Non-Obvious Problem: Turning Off the Old Stuff
Here’s what gets buried in most coverage of post-quantum migration: adding new cryptography isn’t enough.
Cloudflare has been ahead of most organizations on this. The company enabled post-quantum encryption for all websites and APIs it proxies back in 2022, and more than 65% of human traffic passing through its network is already post-quantum encrypted. That’s a meaningful number. But Cloudflare is explicit that encryption alone doesn’t get you to the finish line.
The full migration requires disabling quantum-vulnerable cryptography entirely. If you add post-quantum encryption but leave the old algorithms available, an attacker can execute a downgrade attack — tricking two systems into negotiating the weaker, vulnerable protocol even though the stronger one exists. The new lock doesn’t help if you leave the old lock on the door.
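The mechanics are easy to see in a toy negotiation model. This is not real TLS — TLS 1.3 in particular has transcript-based downgrade protections, and all the names below are illustrative — but it shows why merely adding the stronger option is not enough while the weaker one remains accepted:

```python
# Toy model of a protocol downgrade attack. Illustrative names only;
# real TLS negotiation is more complex and has its own downgrade defenses.
PQ = "pq-hybrid-kem"
LEGACY = "classical-ecdh"

def negotiate(client_offers, server_supports):
    for alg in client_offers:              # client preference order
        if alg in server_supports:
            return alg
    raise ConnectionError("no common algorithm")

def attacker_strips_pq(offers):
    # A man-in-the-middle filters the post-quantum option out of the offer.
    return [a for a in offers if a != PQ]

offers = [PQ, LEGACY]
server = {PQ, LEGACY}
assert negotiate(offers, server) == PQ                          # normal case
assert negotiate(attacker_strips_pq(offers), server) == LEGACY  # silent downgrade

# Once legacy crypto is disabled server-side, the attack fails loudly instead:
hardened = {PQ}
try:
    negotiate(attacker_strips_pq(offers), hardened)
    downgrade_blocked = False
except ConnectionError:
    downgrade_blocked = True
```

With the legacy algorithm still enabled, the downgrade succeeds silently; with it disabled, the attacker can only cause a visible failure. That is why the migration ends with turning the old cryptography off.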
After the old cryptography is turned off, secrets like passwords and access tokens may need to be rotated. That creates a dependency chain involving third-party vendors, validation systems, fraud monitoring, and compliance requirements. It’s not an app update. It’s closer to replacing the locks, the keys, the ID cards, and the alarm systems across your entire infrastructure while the building stays open.
The migration also has a sequencing problem. You can’t just flip a switch. Different parts of your stack will be ready at different times. Third-party dependencies will lag. Validation frameworks will need updating. The 2029 deadline sounds distant until you map out the actual dependency graph.
What the 2029 Deadline Requires in Practice
Cloudflare’s target is specific: fully post-quantum secure, including authentication, by 2029. That’s four years. For large organizations with complex infrastructure, four years is not a lot of time.
A few things follow from this.
The harvest-now-decrypt-later threat means the clock started before 2029. If your organization handles data that needs to remain confidential for more than four years — government records, medical data, financial information, long-term contracts — that data is potentially at risk right now, from adversaries who are collecting it today. The migration deadline isn’t just about when quantum computers arrive. It’s about when the data you’re generating today becomes vulnerable.
NIST’s three finalized post-quantum standards give you a concrete starting point. The algorithms are standardized. The migration path is defined. What’s missing for most organizations is the internal prioritization to treat this as urgent rather than eventual.
The AI acceleration factor is the wildcard. The Oatomic team’s algorithms improved by roughly 1,000x with AI assistance. If that kind of improvement is repeatable — and there’s no reason to think it isn’t — then the resource estimates for quantum attacks will continue to compress. The 2029 deadline is based on current research. If the research keeps moving at this pace, 2029 may not be conservative enough.
Cloudflare’s internal reaction — “a real shock,” accelerating considerably — is the most useful signal here. This is a company with deep expertise in cryptographic infrastructure, with financial incentives to be accurate rather than alarmist, and with direct visibility into how the internet’s security layer actually works. When that company describes new research as a shock and pulls its deadline forward, that’s worth taking seriously.
The question Cloudflare’s blog posed is worth sitting with: not when will encrypted data be at risk, but how long before an attacker walks through the front door with a quantum-forged key. The forging isn’t happening today. But the harvesting might be.
Where to Start Before the Deadline Catches You
The practical starting point is inventory. You can’t migrate what you haven’t mapped. That means identifying every place in your stack where public-key cryptography is used — TLS connections, code signing, API authentication, certificate infrastructure — and assessing which of those use algorithms that are quantum-vulnerable.
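The output of that inventory pass can be as simple as a per-asset classification. The sketch below is a minimal version of that idea — the asset list, algorithm names, and categories are illustrative placeholders, not a standard taxonomy:

```python
# Sketch of a cryptographic-inventory pass: classify each asset's public-key
# algorithm as quantum-vulnerable, post-quantum, or needing review.
# Asset names and algorithm labels below are illustrative placeholders.
QUANTUM_VULNERABLE = {"RSA-2048", "RSA-4096", "ECDSA-P256", "X25519", "Ed25519"}
POST_QUANTUM = {"ML-KEM-768", "ML-DSA-65", "SLH-DSA-128s"}

def classify(algorithm):
    if algorithm in QUANTUM_VULNERABLE:
        return "migrate"
    if algorithm in POST_QUANTUM:
        return "ok"
    return "review"   # unknown algorithms need a human look

inventory = [
    {"asset": "api.example.com TLS cert", "algorithm": "ECDSA-P256"},
    {"asset": "release code-signing key", "algorithm": "RSA-4096"},
    {"asset": "internal service mTLS",    "algorithm": "ML-KEM-768"},
]
report = {item["asset"]: classify(item["algorithm"]) for item in inventory}
```

Even a spreadsheet-grade version of this gives you the dependency graph the 2029 deadline forces you to reason about: which assets to migrate first, and which third parties you’re waiting on.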
The NIST standards are the technical foundation. The 2029 deadline is the forcing function. The harvest-now-decrypt-later threat is the reason you can’t wait until 2028 to start.
Cloudflare called this a real shock. The honest read is that it should be a useful one.