Cardano’s Hoskinson on post-quantum cryptography: the risks of moving too early

Cardano founder Charles Hoskinson believes the crypto industry will inevitably have to adopt post-quantum cryptography—but warns that doing it too early, and without the right infrastructure, could seriously damage blockchain performance and usability.

According to Hoskinson, the issue is no longer whether post-quantum algorithms exist—the core problem is *when and how* they should be deployed.

He pointed to the fact that standardized post-quantum schemes are already available. In August 2024, the U.S. National Institute of Standards and Technology (NIST) finalized its first batch of post-quantum cryptographic standards (FIPS 203, 204, and 205, covering the ML-KEM key-encapsulation mechanism and the ML-DSA and SLH-DSA signature schemes), part of a long-running program to prepare digital systems for a future with powerful quantum computers. From a purely mathematical and engineering standpoint, the tools needed to harden blockchains against quantum attacks are now on the table.

But Hoskinson emphasized that simply swapping out existing cryptography for these new algorithms is not a drop-in upgrade. The performance and scalability trade-offs are significant.

Post-quantum schemes, he noted, often require much more computational effort and storage. Many of the leading candidate algorithms are roughly an order of magnitude more demanding than today’s widely used elliptic curve cryptography. In practice, that can mean signatures and proofs that are around ten times larger, operations that are roughly ten times slower, and protocols that are materially more resource-hungry across the board.

For a blockchain, those differences are not just an implementation detail—they directly affect throughput, fees, node requirements, and the user experience. Bigger proofs and signatures mean more data per transaction, which inflates block sizes, increases bandwidth consumption, and makes it harder for smaller operators to run full nodes. Slower cryptographic operations extend validation times, impacting block production, finality, and consensus efficiency.
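To make that arithmetic concrete, here is a rough, illustrative sketch in Python. The key and signature sizes are approximate published figures for Ed25519 and for ML-DSA-44, one of the NIST-standardized lattice signature parameter sets; the block budget and per-transaction payload below are assumed values chosen only to show the effect, not Cardano’s actual parameters.

```python
# Back-of-the-envelope illustration (not a benchmark): how many simplified
# transactions fit in one block before and after swapping the signature scheme.
# Key/signature sizes are approximate published parameter sizes; the block
# budget and per-transaction payload are assumed, illustrative values only.

SCHEMES = {
    # name: (public key bytes, signature bytes), approximate
    "Ed25519 (classical)":    (32, 64),
    "ML-DSA-44 (lattice PQ)": (1312, 2420),
}

BLOCK_BUDGET_BYTES = 88_000  # assumed block size for illustration
TX_PAYLOAD_BYTES = 300       # assumed non-cryptographic payload per transaction

for name, (pk_len, sig_len) in SCHEMES.items():
    tx_size = TX_PAYLOAD_BYTES + pk_len + sig_len  # one key + one signature per tx
    txs_per_block = BLOCK_BUDGET_BYTES // tx_size
    print(f"{name:24} tx ~{tx_size} bytes, ~{txs_per_block} txs per block")
```

Under these assumptions the post-quantum variant fits roughly a tenth as many transactions per block as the classical one, which is exactly the throughput pressure described above.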

Hoskinson argued that, unless hardware and network infrastructure evolve in tandem, moving an entire blockchain ecosystem to post-quantum primitives too early could amount to self-sabotage. A chain that becomes dramatically slower or more expensive to use, purely in anticipation of a quantum threat that may still be years away, risks losing developers, users, and liquidity to more responsive competitors.

He framed the dilemma as a question of timing and readiness, not of technical feasibility. In his view, the industry needs to avoid both extremes: ignoring the quantum threat until it’s too late, and overreacting by adopting heavy, immature post-quantum schemes long before hardware, wallets, and validators can support them efficiently.

This timing problem is especially acute for public blockchains, where changes to core cryptography affect everything from consensus to smart contracts and wallet infrastructure. Unlike centralized systems that can push through a coordinated upgrade, open networks must align miners, validators, developers, and users around any switch in cryptographic foundations. That coordination alone can take years.

Hoskinson also highlighted the role of hardware acceleration. Today’s blockchains already lean heavily on optimized libraries and specialized hardware to keep existing cryptographic operations fast and cheap. To make post-quantum systems practical at scale, similar acceleration will be essential—whether through optimized CPU instructions, GPUs, FPGAs, or even dedicated ASICs.

Without that support, he warned, post-quantum networks could end up with bloated blocks, high fees, and slow transaction confirmation, undermining the core value proposition of many chains. In other words, security against a future adversary could come at the cost of competitiveness in the present.

At the same time, Hoskinson acknowledged that the quantum threat is real. Many of today’s most widely used public-key schemes, including those that secure blockchain wallets and transaction signatures, are vulnerable in principle to sufficiently powerful quantum computers. While no such machines exist yet at the scale required to break modern cryptography, research in quantum computing is advancing steadily, and cryptographers typically aim to harden systems *before* that threshold is crossed.

This creates a strategic tension: crypto networks must plan for a multi-decade lifespan in which assets need to stay secure, yet the exact arrival time and capabilities of large-scale quantum computers remain uncertain. A cautious, staged migration path may therefore be preferable to an abrupt, network-wide switch.

In practical terms, that could mean:

– Designing protocols that are “quantum-agile,” able to support multiple cryptographic schemes and rotate keys or algorithms without disrupting the network.
– Introducing hybrid approaches that combine classical and post-quantum algorithms, so that an attacker would need to break both to compromise security (a code sketch of this idea follows the list).
– Gradually upgrading key infrastructure—wallets, nodes, smart contract platforms—so that when a full switch becomes necessary, most of the ecosystem is already prepared.
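
To make the hybrid idea concrete, here is a minimal sketch, assuming the widely used `cryptography` package for Ed25519 and the liboqs Python bindings (imported as `oqs`) for a lattice-based signature; the exact algorithm name and availability depend on the installed liboqs build. It illustrates only the combiner pattern, not Cardano’s design or any production wallet format.

```python
# Minimal sketch of a hybrid signature: the message is signed with a classical
# scheme (Ed25519) and a post-quantum scheme, and a verifier accepts only if
# BOTH signatures check out. Illustrative only; not any chain's actual format.

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric import ed25519
import oqs  # liboqs Python bindings (assumed available)

# Algorithm name depends on the installed liboqs version; older builds expose
# the pre-standard name "Dilithium2" instead of "ML-DSA-44".
PQ_ALG = "ML-DSA-44"

def hybrid_keygen():
    classical_sk = ed25519.Ed25519PrivateKey.generate()
    pq_signer = oqs.Signature(PQ_ALG)
    pq_pk = pq_signer.generate_keypair()
    return classical_sk, pq_signer, pq_pk

def hybrid_sign(classical_sk, pq_signer, message: bytes):
    # Two independent signatures over the same message.
    return classical_sk.sign(message), pq_signer.sign(message)

def hybrid_verify(classical_pk, pq_pk, message: bytes, sigs) -> bool:
    classical_sig, pq_sig = sigs
    try:
        classical_pk.verify(classical_sig, message)  # raises if invalid
    except InvalidSignature:
        return False
    pq_verifier = oqs.Signature(PQ_ALG)
    if not pq_verifier.verify(message, pq_sig, pq_pk):
        return False
    return True  # forging requires breaking BOTH schemes

if __name__ == "__main__":
    classical_sk, pq_signer, pq_pk = hybrid_keygen()
    msg = b"example transaction bytes"
    sigs = hybrid_sign(classical_sk, pq_signer, msg)
    print("hybrid signature valid:",
          hybrid_verify(classical_sk.public_key(), pq_pk, msg, sigs))
```

A real deployment would also have to serialize both public keys and both signatures into a single address and witness format, and account for the larger combined size in fee rules, which is where the performance penalties discussed earlier reappear.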

Hoskinson’s comments suggest that Cardano and other major networks will likely pursue such incremental strategies rather than rushing into full post-quantum adoption on the consensus layer.

He also implied that the cost of premature migration is not purely technical. If early post-quantum choices are later found to be flawed or suboptimal, the ecosystem could be forced into multiple disruptive upgrades. That churn can erode trust, complicate developer tooling, and leave long-lived applications in a constant state of cryptographic flux.

Another important dimension is backward compatibility. Existing addresses, keys, and contracts on many blockchains were created using classical cryptography. Transition plans must consider how to protect historical funds and long-standing contracts, potentially requiring complex key rotation mechanisms, migration incentives, or multi-signature schemes that blend old and new cryptographic primitives.

For users, the abstract debate about post-quantum algorithms translates into concrete questions: Will my coins remain safe if quantum computers become practical? Will I need to move funds to new addresses or upgrade my wallet? Will transaction fees rise or network speed suffer because of heavier cryptography?

Hoskinson’s stance essentially argues for a balanced answer: yes, the ecosystem must evolve to remain safe in a post-quantum world, but that shift should be phased, data-driven, and aligned with real advances in hardware, standards, and threat intelligence. Overhauling cryptography purely out of fear, without the supporting infrastructure, might cause more damage than the theoretical attack it is meant to prevent.

In parallel, research continues on more efficient post-quantum schemes and better implementations. As libraries mature and hardware vendors begin to consider acceleration paths for lattice-based and other post-quantum primitives, the performance penalty Hoskinson described may shrink over time. The “ten times slower and larger” characterization reflects the current state of many candidate algorithms, not necessarily the final shape of optimized, production-ready systems.

For now, his warning serves as a reminder that security, scalability, and decentralization are tightly coupled. Any move toward post-quantum security will inevitably involve trade-offs among these pillars. The challenge for blockchain developers and communities is to navigate those trade-offs without sacrificing the usability and openness that made public ledgers viable in the first place.

Ultimately, Hoskinson’s message is that post-quantum cryptography is not a silver bullet that can be bolted onto existing chains overnight. It is a long-term engineering and governance challenge, where the right answer depends as much on timing, incentives, and ecosystem coordination as it does on mathematical security proofs.