Bitcoin’s Quantum Defense Plan: What BIP-360 Actually Changes

Key takeaways

  • BIP-360 formally puts quantum resistance on Bitcoin’s road map for the first time. It represents a measured, incremental step rather than a dramatic cryptographic overhaul.

  • Quantum risk primarily targets exposed public keys, not Bitcoin’s SHA-256 hashing, making public key exposure the central vulnerability developers aim to reduce.

  • BIP-360 introduces Pay-to-Merkle-Root (P2MR), which removes Taproot’s key path spending option and forces all spends through script paths to minimize elliptic curve exposure.

  • Smart contract flexibility remains intact, as P2MR still supports multisig, timelocks and complex custody structures via Tapscript Merkle trees.

Bitcoin was built to withstand hostile economic, political and technical scenarios. As of March 10, 2026, its developers are preparing to confront an emerging threat: quantum computing.

The recent publication of Bitcoin Improvement Proposal 360 (BIP-360) officially adds quantum resistance to Bitcoin’s long-term technical road map for the first time. While some headlines portray it as a dramatic shift, the reality is far more measured and incremental.

This article explores how BIP-360 introduces Pay-to-Merkle-Root (P2MR) to reduce Bitcoin’s quantum exposure by removing Taproot key path spending. It explains what the proposal improves, what trade-offs it introduces and why it does not yet make Bitcoin fully post-quantum secure.

Why quantum computing poses a risk to Bitcoin

For security, Bitcoin depends on cryptography, primarily the Elliptic Curve Digital Signature Algorithm (ECDSA) and Schnorr signatures introduced via Taproot. Regular computers cannot realistically derive a private key from a public key. However, a powerful quantum computer running Shor’s algorithm could break elliptic curve discrete logarithms, exposing those keys.

Key distinctions include:

  • Quantum attacks hit public-key cryptography hardest, not hashing.

  • Bitcoin’s SHA-256 remains relatively strong against quantum methods. Grover’s algorithm only provides a quadratic speedup, not an exponential one.

  • The real risk appears when public keys become exposed on the blockchain.

This is why the community focuses on public key exposure as the primary quantum risk vector.
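The exposure distinction can be sketched in a few lines of Python: a hash commitment publishes only the digest of a public key, so nothing onchain gives Shor's algorithm an elliptic curve target until the key is revealed at spend time. The key bytes here are illustrative placeholders, not a real key.

```python
import hashlib

# Illustrative placeholder for a 33-byte compressed public key (not a real key).
pubkey = bytes.fromhex("02" + "ab" * 32)

# A hash-based output commits only to SHA-256(pubkey); the key itself stays
# hidden, so there is no elliptic curve target for Shor's algorithm yet.
commitment = hashlib.sha256(pubkey).digest()
print("onchain commitment:", commitment.hex())

# Only at spend time is the public key revealed and checked against the
# commitment, and that reveal is the exposure window developers aim to shrink.
assert hashlib.sha256(pubkey).digest() == commitment
```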

Bitcoin’s vulnerabilities in 2026

Not every address type in the Bitcoin network faces the same level of future quantum threat:

  • Reused addresses: Spending reveals the public key onchain, leaving it exposed to a future cryptographically relevant quantum computer (CRQC).

  • Legacy pay to public key (P2PK) outputs: Early Bitcoin transactions directly embedded public keys in transaction outputs.

  • Taproot key path spends: Taproot (2021) offers two paths: a compact key path (which exposes a tweaked public key on spend) or a script path (which reveals scripts via a Merkle proof). The key path is the main theoretical weak point under a quantum attack.

BIP-360 directly targets that key path exposure.

What BIP-360 introduces: P2MR

BIP-360 adds a new output type, Pay-to-Merkle-Root (P2MR), modeled closely on Taproot but with one critical change. It removes the key path spending option entirely.

Instead of committing to an internal public key as Taproot does, P2MR commits solely to the Merkle root of a script tree. To spend, the owner reveals one script from that tree, a Merkle proof showing it belongs to the committed root, and the data satisfying that script's conditions. No public-key-based spending route exists at all.
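The spend flow can be sketched with a simplified Merkle tree. This sketch uses plain (untagged) SHA-256 and a naive odd-leaf rule; real Taproot and P2MR trees use BIP-341 tagged hashes, but the shape of the argument is the same: the spender reveals one script plus a Merkle path, and verifiers recompute the root.

```python
import hashlib

def h(data: bytes) -> bytes:
    # Simplified: real Taproot/P2MR trees use tagged SHA-256 hashes (BIP-341).
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Fold leaf hashes pairwise into a single root, sorting each pair."""
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])  # simplification for odd-width levels
        level = [h(min(a, b) + max(a, b))
                 for a, b in zip(level[0::2], level[1::2])]
    return level[0]

def verify_proof(leaf: bytes, proof, root: bytes) -> bool:
    """Recompute the root from one revealed script and its Merkle path."""
    node = h(leaf)
    for sibling in proof:
        node = h(min(node, sibling) + max(node, sibling))
    return node == root

# Four hypothetical spending conditions committed in one P2MR-style output.
scripts = [b"2-of-3 multisig", b"timelock recovery", b"inheritance", b"fallback"]
root = merkle_root(scripts)  # this root, not a public key, goes onchain

# Spending reveals one script plus its Merkle path, never a bare public key.
leaf = scripts[0]
h1, h2, h3 = h(scripts[1]), h(scripts[2]), h(scripts[3])
proof = [h1, h(min(h2, h3) + max(h2, h3))]
assert verify_proof(leaf, proof, root)
```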

Eliminating key path spends means:

  • No public key exposure for direct signature checks.

  • All spending routes rely on hash-based commitments.

  • Long-term elliptic curve public key exposure drops sharply.

Hash-based methods are far more resilient to quantum attack than methods relying on elliptic curve assumptions. This significantly shrinks the attack surface.

What BIP-360 preserves

A common misconception is that dropping key path spending weakens smart contracts or scripting. It does not. P2MR fully supports:

  • Multisig setups

  • Timelocks

  • Conditional payments

  • Inheritance schemes

  • Advanced custody

BIP-360 executes all these functions via Tapscript Merkle trees. While the process retains full scripting capability, the convenient but vulnerable direct signature shortcut disappears.

Did you know? Satoshi Nakamoto briefly acknowledged quantum computing in early forum discussions, suggesting that if it became practical, Bitcoin could migrate to stronger signature schemes. This shows that upgrade flexibility was always part of the design philosophy.

Practical implications of BIP-360

BIP-360 may sound like a purely technical refinement, but its impact would be felt at the wallet, exchange and custody levels. If activated, it would gradually reshape how new Bitcoin outputs are created, spent and secured, especially for users prioritizing long-term quantum resilience.

  • Wallets could introduce opt-in P2MR addresses (likely starting with “bc1z”) as a “quantum-hardened” choice for new coins or long-term holdings.

  • Transactions would be slightly larger (more witness data from script paths), potentially raising fees somewhat compared with Taproot key path spends. Security trades off against compactness.

  • A full rollout would require updates to wallets, exchanges, custodians and hardware wallets. Planning should start years in advance.
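The size cost in the fee point above can be roughed out with Taproot-style numbers. All byte counts below are simplified and illustrative (a Schnorr signature is 64 bytes; a BIP-341 control block is 33 bytes plus 32 per Merkle level); real sizes depend on the script used and the tree depth.

```python
# Rough witness-size comparison (illustrative, simplified byte counts).
# Witness bytes carry 1/4 weight under SegWit, so divide by 4 for vbytes.
SIG = 64                    # one Schnorr signature
SCRIPT = 34                 # a hypothetical small single-key Tapscript
CONTROL = 33 + 32 * 2       # control block: 33 bytes + 32 per level (depth 2)

keypath_witness = SIG                        # Taproot key path: signature only
scriptpath_witness = SIG + SCRIPT + CONTROL  # script path: sig + script + proof

print("key path witness bytes:   ", keypath_witness)
print("script path witness bytes:", scriptpath_witness)
print("extra vbytes per input:   ", (scriptpath_witness - keypath_witness) / 4)
```

The difference, roughly 30 vbytes per input at this depth, is the "slightly larger" overhead in question; each extra tree level adds about 8 vbytes.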

Did you know? Governments are already preparing for “harvest now, decrypt later” risks, where encrypted data is stored today in anticipation of future quantum decryption. This strategy mirrors concerns about exposed Bitcoin public keys.

What BIP-360 explicitly does not do

While BIP-360 strengthens Bitcoin in the face of future quantum threats, it is not a sweeping cryptographic overhaul. Understanding its limits is just as important as understanding its innovations:

  • No automatic upgrade for existing coins: Old unspent transaction outputs (UTXOs) remain vulnerable until users manually move funds to P2MR outputs. Migration depends on user behavior.

  • No new post-quantum signatures: BIP-360 does not replace ECDSA or Schnorr with lattice-based (for example, Dilithium or ML-DSA) or hash-based (for example SPHINCS+) schemes. It only removes the Taproot key path exposure pattern. A full base layer transition to post-quantum signatures would require a much larger change.

  • No complete quantum immunity: A sudden CRQC breakthrough would still require massive coordination among miners, nodes, exchanges and custodians. Dormant coins could create complex governance issues, and network stress could follow.

Why developers are acting now

Quantum progress is uncertain. Some believe it is decades away. Others point to IBM’s late 2020s fault-tolerant goals, Google’s chip advances, Microsoft’s topological research and US government transitions planned for 2030-2035.

Critical infrastructure migrations take many years. Bitcoin’s developers stress planning across BIP design, software, infrastructure and user adoption. Waiting for certainty in quantum progress could leave insufficient time for infrastructure upgrades.

If consensus builds, a phased soft fork could unfold:

  1. Activate the P2MR output type

  2. Wallets, exchanges and custodians add support

  3. Gradual user migration over years

This mirrors the adoption path of SegWit and Taproot: optional at first, then widespread.

The broader debate around BIP-360

Debate continues on urgency and costs. Questions under discussion include:

  • Are modest fee increases acceptable for HODLers?

  • Should institutions lead the migration?

  • What about coins that never move?

  • How should wallets signal “quantum safety” without causing unnecessary alarm?

This is an ongoing conversation. BIP-360 advances the discussion but does not close it.

Did you know? The idea that quantum computers could threaten cryptography dates back to 1994, when mathematician Peter Shor introduced Shor’s algorithm, long before Bitcoin existed. Bitcoin’s future quantum planning is essentially a response to a theoretical breakthrough now more than 30 years old.

What users can do right now

There is no need to panic for now, as quantum threats are not imminent. Prudent steps you might take include:

  • Never reuse addresses

  • Stick to up-to-date wallet software

  • Follow protocol upgrade news

  • Watch for P2MR support in wallets

Those with large holdings should quietly map exposures and consider contingency plans.

BIP-360: The first step toward quantum resistance

BIP-360 represents Bitcoin’s first concrete step toward reducing its quantum exposure at the protocol level. It redefines how new outputs can be created, minimizes public key leaks and sets the stage for long-term migration planning.

It does not change existing coins automatically, keeps current signatures intact and underscores the need for a careful, coordinated ecosystem-wide effort. True quantum resistance will come from sustained engineering and phased adoption, not a single BIP.

Cointelegraph maintains full editorial independence. The selection, commissioning and publication of Features and Magazine content are not influenced by advertisers, partners or commercial relationships.

Market Analysis: EUR/USD Reclaims Ground While USD/JPY Momentum Fades

EUR/USD is recovering losses from 1.1500. USD/JPY is correcting gains from 159.00 and might decline further if it stays below 158.30.

Important Takeaways for EUR/USD and USD/JPY Analysis Today

  • The Euro struggled to stay in a positive zone and declined below 1.1700 before finding support.
  • There was a break above a connecting bearish trend line with resistance at 1.1580 on the hourly chart of EUR/USD at FXOpen.
  • USD/JPY started a decent increase above 157.00 before the bears appeared near 158.90.
  • There is a key contracting triangle forming with resistance near 158.30 on the hourly chart at FXOpen.

EUR/USD Technical Analysis

On the hourly chart of EUR/USD at FXOpen, the pair started a fresh decline from 1.1825. The pair broke below 1.1665 and the 50-hour simple moving average. Finally, it tested the 1.1500 zone. A low was formed at 1.1507, and the pair is now recovering losses.

There was a move above 1.1550 and a connecting bearish trend line at 1.1580. The pair surpassed the 38.2% Fib retracement level of the downward move from the 1.1826 swing high to the 1.1507 low. On the upside, the pair is now facing resistance near the 50% Fib retracement at 1.1665.
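For readers following along, the retracement levels quoted here fall out of the cited swing points directly; a quick sketch of the arithmetic:

```python
# Fibonacci retracement levels for the cited EUR/USD move:
# decline from the 1.1826 swing high to the 1.1507 swing low.
high, low = 1.1826, 1.1507
rng = high - low  # 0.0319

for ratio in (0.382, 0.5, 0.618):
    level = low + ratio * rng
    print(f"{ratio:.1%} retracement: {level:.4f}")
```

The 38.2% level lands near 1.1629 and the 50% level near 1.1667, in line with the resistance zones discussed above.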

The first major hurdle for the bulls could be 1.1705. A break above 1.1705 could set the pace for another increase. In the stated case, the pair might rise toward 1.1775.

If not, the pair might drop again. Immediate support is near the 50-hour simple moving average and 1.1620. The next key area of interest might be 1.1565. If there is a downside break below 1.1565, the pair could drop towards 1.1505. The main target for the bears on the EUR/USD chart could be 1.1440, below which the pair could start a major decline.

USD/JPY Technical Analysis

On the hourly chart of USD/JPY at FXOpen, the pair gained pace for a move above 158.00. The US dollar even traded close to 159.00 against the Japanese yen before the bears emerged.

A high was formed at 158.90 before a downside correction. The pair dipped below 158.00 and the 50% Fib retracement level of the upward move from the 156.45 swing low to the 158.90 high. However, the bulls were active above 157.00 and protected the 61.8% Fib retracement.

The pair is back above the 50-hour simple moving average and 158.00. Immediate resistance on the USD/JPY chart is near 158.30. There is also a key contracting triangle at 158.30.

If there is a close above the triangle and the hourly RSI moves above 65, the pair could rise towards 158.90. The next major barrier for the bulls could be 159.25, above which the pair could test 160.00 in the near term.

On the downside, the first major support is near 158.00. The next key region for the bears might be 157.40. If there is a close below 157.40, the pair could decline steadily. In the stated case, the pair might drop towards 156.45. Any more losses might send the pair toward 155.85.

This article represents the opinion of the Companies operating under the FXOpen brand only. It is not to be construed as an offer, solicitation, or recommendation with respect to products and services provided by the Companies operating under the FXOpen brand, nor is it to be considered financial advice.

Scaling AI Makes It Riskier

Opinion by: Mohammed Marikar, co-founder at Neem Capital

So far, artificial intelligence has been defined by scale: bigger models, faster processing, expanding data centers. The assumption, based on traditional technology cycles, was that scale would keep improving performance and that, over time, costs would fall and access would expand.

That assumption is now breaking down. AI is not scaling like other software. Instead, it is capital-intensive, constrained by physical limits, and hitting diminishing returns far earlier than expected.

The numbers make this clear. Electricity demand from global data centers is projected to more than double by 2030, reaching levels once associated with entire industrial sectors. In the US alone, data center power demand is projected to rise by well over 100 percent before the decade ends. This expansion is demanding trillions of dollars in new investment alongside major expansions in grid capacity.

Meanwhile, these systems are being embedded into law, finance, compliance, trading and risk management, where errors propagate quickly but credibility is non-negotiable. In June 2025, the UK High Court warned lawyers to immediately stop submitting filings that cited fabricated case law generated by AI tools.

The scaling AI debate

When an AI system can invent a precedent that never existed, and a professional relies on it, debates about scaling become serious questions of public trust. Scaling is amplifying AI’s weaknesses rather than solving them.

Part of the problem lies in what scale actually improves. Large language models (LLMs) grow increasingly fluent with scale because language is pattern-based. The more examples an LLM sees of how real people write, summarize and translate, the faster it improves.

Deeper intelligence — reasoning — does not scale the same way. The next generation of AI must understand cause and effect and know when an answer is uncertain or incomplete. It will need to explain why a conclusion follows, not simply produce a confident response. This does not reliably improve with more parameters or more compute.

The consequence is a growing verification burden. Humans must spend more time checking machine output rather than acting on it, and that burden builds as systems are deployed more widely.

The cost of training AI models

Training frontier AI models has already become extraordinarily expensive, with credible tracking suggesting costs have been multiplying year over year, and projections that single training runs could soon exceed $1 billion. Training is only the entry cost.

The larger expense is inference: running these models continuously, at scale, with real latency, uptime and verification requirements. Every query consumes energy. Every deployment requires infrastructure. As usage grows, energy use and costs compound.
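The compounding argument can be made concrete with a toy model. Every parameter below is hypothetical, chosen only to illustrate the shape of the curve (a one-off training bill versus a per-query cost that grows with usage), not to estimate real costs.

```python
# Toy cost model: one-off training versus compounding inference (all numbers
# hypothetical and for illustration only).
training_cost = 100_000_000    # one-off training bill, USD (assumed)
cost_per_query = 0.005         # energy + infrastructure per inference (assumed)
queries_per_day = 100_000_000  # initial daily volume (assumed)
daily_growth = 1.002           # 0.2% daily usage growth (assumed)

total_inference = 0.0
q = queries_per_day
for _ in range(365):           # first year of deployment
    total_inference += q * cost_per_query
    q *= daily_growth

print(f"training (one-off):     ${training_cost:,.0f}")
print(f"inference (first year): ${total_inference:,.0f}")
# Under these assumptions, one year of inference already outspends training.
```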

In markets and crypto, AI systems are increasingly used to monitor onchain activity, analyze sentiment, generate code for smart contracts, flag suspicious transactions and automate decisions.

In such a fast-moving, competitive environment, fluent but unreliable AI propagates errors quickly; false signals move capital, and fabricated explanations and hallucinations undermine trust. One example of this is false positives being generated in automated Anti-Money Laundering (AML) flagging, a common issue that wastes time and resources on investigating innocent trading activity.

Time to improve reasoning

Scaling AI systems without improving their reasoning amplifies risk, especially in use cases where automation and credibility are vital and tightly coupled.

Ensuring AI is economically viable and socially valuable means we cannot rely on scaling. The dominant approach today prioritizes increasing compute and data while leaving the underlying reasoning machinery largely unchanged, a strategy that is becoming more expensive without becoming proportionally safer.

Related: Crypto dev launches website for agentic AI to ‘rent a human’

The alternative is architectural. Systems need to do more than predict the next word. They need to represent relationships, apply rules, check their own steps and make it possible to see how conclusions were reached.

This is where cognitive or neurosymbolic systems come into play. By organizing knowledge into interrelated concepts, rather than relying solely on brute-force pattern matching, these systems can deliver high reasoning capability with far lower energy and infrastructure demands.

Emerging “cognitive AI” platforms are demonstrating how structured reasoning systems can operate on local servers or edge devices, allowing users to keep control over their own knowledge rather than outsourcing cognition to distant infrastructure.

Cognitive AI systems are harder to design and can underperform on open-ended tasks, but when reasoning is reusable in this way rather than rederived from scratch through massive compute, costs fall and verification becomes tractable.

Control over how AI is built matters as much as how it reasons. Communities need systems they can shape, audit and deploy without waiting for permission from centralized platform owners.

Some platforms are exploring this frontier by using blockchain to enable both individuals and corporations to contribute data, models and computing resources. By decentralizing AI development itself, these approaches reduce concentration risk and align deployment with local needs rather than global demands.

AI faces an inflection point. When reasoning can be reused rather than rediscovered through massive pattern matching, systems require less compute per decision and impose a smaller verification burden on humans. That shifts the economics: experimentation becomes cheaper and inference more predictable. Scaling no longer depends on exponential increases in infrastructure.

Scaling has already done what it could. What it has exposed, just as clearly, is the limit of relying on size alone. The question now is whether the industry keeps pushing scale or starts investing in architectures that make intelligence reliable before making it bigger.
