Self-Healing Protocols: The Next Evolution in DeFi Resilience

Decentralized finance (DeFi) has revolutionized the way users interact with financial services, removing intermediaries and enabling permissionless access to lending, trading, and asset management. Yet as the ecosystem has grown, so have the risks: market volatility, liquidity crises, and exploits can cause sudden, severe disruptions. Enter self-healing protocols, a class of smart contract systems designed to anticipate, react to, and adapt to adverse conditions automatically.

What Are Self-Healing Protocols?

A self-healing protocol is a smart contract system engineered to respond dynamically to stress events. Rather than relying solely on governance intervention or manual adjustments, these protocols can automatically:

  • Adjust incentives: For example, increasing yield rewards to encourage liquidity provision when a pool is undercapitalized.

  • Rebalance pools: Automatically shift liquidity between pools or adjust token weights to maintain stability and minimize slippage.

  • Redistribute risk: Move exposure away from highly leveraged positions or risky assets to protect the system during market crashes.

These mechanisms essentially allow a protocol to “heal itself” in response to abnormal conditions, reducing systemic risk and enhancing user confidence.

How They Work

Self-healing protocols leverage a combination of on-chain oracles, algorithmic rules, and dynamic parameters. Key components include:

  1. Real-Time Data Monitoring: Oracles feed the protocol with market prices, liquidity metrics, and on-chain activity.

  2. Automated Trigger Mechanisms: Smart contracts detect stress conditions—like a sudden liquidity drop or extreme volatility—and trigger corrective actions.

  3. Dynamic Incentive Adjustments: Rewards and penalties are algorithmically recalibrated to encourage stabilizing behavior among participants.

  4. Risk Redistribution Algorithms: Funds can be automatically reallocated across pools, vaults, or derivatives to minimize the impact of defaults or liquidations.

Some protocols also integrate simulation engines that run stress-test scenarios on-chain to anticipate potential crises before they escalate.

Benefits of Self-Healing Protocols

  • Reduced Governance Lag: Human intervention is often slow and reactionary. Self-healing protocols act instantly.

  • Resilience Against Market Shocks: Liquidity imbalances and sudden withdrawals are mitigated before they snowball.

  • Improved User Trust: Knowing that a protocol can adapt autonomously increases confidence among liquidity providers and traders.

  • Enhanced Composability: Other DeFi products can safely integrate with self-healing protocols without inheriting all the risk.

Challenges and Considerations

Despite their promise, self-healing protocols are not without challenges:

  • Complexity and Audit Risk: More logic means more potential for bugs. Thorough audits are critical.

  • Oracle Dependence: Reliance on external data sources can introduce new points of failure.

  • Economic Exploits: Sophisticated actors may attempt to game dynamic incentive mechanisms.

  • Transparency vs. Flexibility: Too much automatic adjustment can be hard for users to understand, possibly reducing adoption.

Looking Ahead

Self-healing protocols represent a frontier where algorithmic finance meets resilience engineering. Projects exploring this concept could redefine how DeFi handles risk, moving the ecosystem closer to fully autonomous, self-stabilizing financial networks.

As DeFi matures, these protocols may become a standard layer of protection, much like insurance or circuit breakers in traditional finance—but fully automated and embedded in code.

Scaling Next-Gen AI Is Increasing Risks, Not Benefits

Artificial intelligence has long been defined by scale: larger models, faster processing, and sprawling data centers. Yet a growing cohort of researchers, investors, and practitioners suggests the traditional growth path is hitting a ceiling. AI is increasingly capital-intensive and tethered to physical limits, with diminishing returns appearing sooner than many anticipated. The latest data underscore the shift: electricity demand from global data centers is projected to more than double by 2030, a surge comparable to adding entire industrial sectors, and in the United States, data-center power usage is forecast to rise well over 100% by the end of the decade. As the economics of AI tighten, trillions of dollars in new investment and substantial grid upgrades loom, just as the technology embeds itself into finance, law, and crypto workflows.

Key takeaways

  • Energy demand tied to AI is accelerating, with the IEA projecting data-center electricity use will more than double by 2030, highlighting a fundamental constraint in the current scaling paradigm.
  • The United States could see data-center power consumption surge by more than 100% before the 2030s, signaling a major resource and infrastructure challenge for AI-enabled sectors.
  • Frontier AI training costs are skyrocketing, with estimates suggesting single training runs could exceed $1 billion, making inference and ongoing operation the dominant long-term expense.
  • The verification burden grows with scale: as AI outputs proliferate, human oversight becomes increasingly critical to prevent errors from propagating, such as false positives in automated AML flagging.
  • Architectural shifts toward cognitive or neurosymbolic systems—emphasizing reasoning, verifiability, and localized deployment—offer a path to reduce energy use and improve reliability versus brute-force scaling.
  • Blockchain-enabled, decentralized AI concepts may distribute data, models, and computing resources more broadly, potentially lowering concentration risk and aligning deployment with local needs.

Sentiment: Neutral

Market context: The convergence of AI with crypto analytics and DeFi tooling sits amid broader questions about energy consumption, regulation, and the governance of automated decision-making. As AI tools increasingly monitor on-chain activity, assess sentiment, and assist in smart-contract development, the industry faces a tighter coupling between performance, verification, and accountability.

Why it matters

The debate over AI scaling is not a theoretical one—it touches the core of how and where AI is deployed in high-stakes sectors. Large language models (LLMs) have grown fluent by pattern-matching across vast text corpora, enabling impressive capabilities but not necessarily robust, reliable reasoning. As these systems become embedded in legal workflows, financial risk management, and crypto operations, the consequences of incorrect outputs become less tolerable and more costly.

Training frontier AI models remains a mission-critical and expensive endeavor. Independent analyses suggest that the cumulative cost of training can be immense, with credible voices estimating that a single training run could cross the $1 billion threshold in the near future. Yet even more consequential is the ongoing cost of inference—running models at scale with low latency, high uptime, and rigorous verification requirements. Each query consumes energy, and each deployment necessitates infrastructure. As usage expands, energy use compounds, pressuring both operators and grids alike. In crypto contexts, AI systems increasingly monitor on-chain activity, analyze sentiment, generate code for smart contracts, flag suspicious transactions, and automate decision-making; missteps here can move capital and undermine trust across markets.

The industry is beginning to recognize that fluency alone is insufficient. When AI can produce convincing but incorrect conclusions, verification burdens intensify. False positives in AML flagging, for instance, have been documented as a practical drag on resources, diverting investigators from genuine activity. This dynamic underscores why a shift toward architectures that integrate cause-and-effect reasoning, explicit rules, and self-checking mechanisms is gaining traction. Cognitive AI and neurosymbolic approaches—where knowledge is structured into interrelated concepts and reasoning can be revisited and audited—promise higher reliability with lower energy demands than brute-force scaling.

Beyond the architecture, there is a broader trend toward decentralization of AI development itself. Some platforms explore blockchain-enabled models for contributing data, models, and computing resources, reducing concentration risk and aligning deployment with local needs. In a field where room for error is small and the stakes are high, the ability to inspect, audit, and shape AI systems matters just as much as the outputs they produce. The turning point is clear: scaling for the sake of scale may no longer be sufficient. The industry must invest in architectures that make intelligence more reliable, verifiable, and controlled by communities rather than distant, centralized infrastructure.

As AI considerations bleed into crypto workflows, the stakes grow sharper. On-chain monitoring, sentiment analysis for market signals, automated code generation for smart contracts, and risk-management automation are all increasingly dependent on AI, yet they demand a higher standard of trust. The tension between speed and accuracy—between fast, automated decisions and verifiable reasoning—will shape the next wave of crypto tooling and governance. The upshot is not simply bigger models; it is better systems that can reason about their own steps, explain conclusions, and operate within clear constraints.

Ultimately, the industry faces an inflection point. If architecture and reasoning take precedence over sheer scale, AI could become more affordable to operate, while remaining safer and more controllable. The era of growth-at-any-cost may yield to a more deliberate phase where wealth creation in AI and crypto hinges on transparent verification, resilient design, and decentralized collaboration. The author argues that the path forward lies in rethinking how intelligence is built and deployed—prioritizing robust reasoning and governance over incremental increases in parameter counts.

What to watch next

  • Regulatory and policy developments around AI safety, auditing, and accountability in finance and crypto.
  • Advances in cognitive AI and neurosymbolic architectures, including practical deployments on edge devices and local servers.
  • Decentralized AI initiatives that use blockchain-inspired models to distribute data, models, and computing resources.
  • Shifts in data-center capacity, energy pricing, and grid infrastructure tied to AI-enabled demand.
  • New benchmarks or case studies illustrating the trade-offs between scale, reasoning, and verification in real-world crypto applications.

Sources & verification

  • Energy demand from AI: IEA, Energy and AI — energy demand from AI.
  • U.S. data-center power demand projections: Pew Research Center / energy use at US data centers amid the AI boom.
  • UK legal AI cautionary note: Guardian article on the High Court warning against AI-generated fabricated case law in legal filings (June 2025).
  • AML false positives and AI risk: IBM Think topics on AI fraud detection in banking and related AML flagging issues.
  • Costs to train frontier AI models and ongoing inference costs: Epoch AI blog and Digital Experience Live analyses.
  • On-chain and crypto AI applications: efforts around Ethereum and on-chain tooling that leverage AI signals (as referenced in industry coverage).

Rethinking AI scaling: energy, reasoning, and the crypto interface

Artificial intelligence has long scaled on a simple premise—more data, bigger models, faster hardware would continually unlock better performance and lower costs. The latest economic and technical signals, however, suggest a pivot. Energy and capital intensity are rising faster than anticipated, with global data-center electricity demand projected to more than double by 2030. In the United States alone, data-center power consumption is expected to rise by more than 100% before the decade ends, a trajectory that will require massive investments in grid capacity and infrastructure as AI becomes embedded in critical sectors, including markets, compliance, and on-chain activity monitoring.

Training frontier AI models remains extraordinarily expensive, with credible estimates pointing to costs that could top $1 billion per training run. Yet even more consequential is the ongoing cost of inference—sustained, low-latency operation that must deliver results with high reliability. In markets and crypto, AI systems are increasingly used to monitor on-chain activity, analyze sentiment, generate smart-contract code, flag suspicious transactions, and automate governance decisions. The result is a double exposure: the potential for rapid, data-driven signals coupled with the risk of false signals that can misallocate capital or mischaracterize risk. Notably, false positives in automated AML flagging illustrate how unreliable outputs can waste human resources and erode trust when deployed widely.

To address these pressures, the narrative is shifting away from sheer scale toward architectures that emphasize reasoning and verifiability. Cognitive AI and neurosymbolic approaches seek to braid pattern recognition with structured knowledge, rules, and self-checks. These systems aim to deliver usable reasoning traces and transparent decision processes, reducing the need for brute-force computation and enabling more predictable energy use. Early demonstrations suggest that local or edge deployments, supported by knowledge representations, could keep control with users and organizations rather than entrusting cognition to centralized, opaque infrastructure.

Decentralized AI models—where data, models, and computation can be contributed by diverse participants—offer another path to resilience. By distributing the workload and oversight, communities can mitigate concentration risk and tailor AI deployments to local needs. In this ecosystem, the role of governance becomes more pronounced: platforms must enable auditing, adjustment, and interoperability without compromising security or performance. The shift toward more sophisticated reasoning, coupled with a commitment to verifiable outcomes, marks a meaningful departure from scaling solely for scale’s sake. If the industry can operationalize cognitive architectures at scale, the economics of AI may improve—reducing both energy consumption per decision and the verification burden on human operators.

In the crypto arena, this evolution matters. The reliability of AI-assisted on-chain analytics, fraud detection, and smart-contract tooling will influence investor confidence and market integrity. The path forward requires not only bigger systems but smarter ones—systems whose inner workings can be inspected, challenged, and improved by a broad community. The debate is no longer about whether AI should grow, but how to grow it in a way that is auditable, trustworthy, and aligned with the needs of decentralized finance and broader digital markets.

BTC remains modestly lower at $69,500 following in-line inflation data

U.S. inflation, Polkadot upgrade, Solstice-Kamino announcement: Crypto Week Ahead

U.S. inflation data met expectations on Wednesday, reinforcing anticipation that the Federal Reserve will keep interest rates steady not just at its March 18 meeting, but likely at the bank’s April meeting as well.

The Consumer Price Index (CPI) rose 0.3% in February, according to a report from the Bureau of Labor Statistics. Economist forecasts had been for a rise of 0.3% and January’s increase was 0.2%.

On a year-over-year basis, CPI was higher by 2.4% against expectations of 2.4% and January’s 2.4%.

Core CPI, which excludes food and energy costs, rose 0.2% in February versus forecasts of 0.2% and January’s 0.3%. Year-over-year core CPI was higher by 2.5% versus forecasts of 2.5% and January’s 2.5%.

Under modest pressure for the morning, bitcoin was trading at $69,500 in the minutes following the report, lower by 1.2% over the past 24 hours.

U.S. stock index futures were slightly lower across the board and the 10-year Treasury yield ticked up to 4.18%. The main actor in markets this week, WTI crude oil, was higher by 4.2% at $87 per barrel.

Ahead of the data, markets were pricing in a 99% probability that the Federal Reserve would leave interest rates unchanged at its March meeting next week, according to the CME FedWatch tool. For the April meeting, rate cut odds were at just 11% versus 21% one month ago.

February’s inflation numbers, of course, are somewhat old news given the events that have transpired since, namely the war in Iran and spiking oil prices. How much this plays into the Fed’s thinking on interest rates should become more evident following next week’s policy meeting.

Mining giant Foundry to introduce institutional zcash mining pool

Foundry Digital, one of the largest Bitcoin mining pools by hashrate, said it plans to introduce a zcash (ZEC) mining pool by next month, expanding beyond BTC and bringing a large institutional operator into the privacy-focused network.

With the new pool, Foundry aims to offer zcash miners a U.S.-based platform designed around compliance checks, reporting standards and operational controls often required by public companies and large firms.

The move addresses what Foundry describes as a gap in Zcash infrastructure. While the cryptocurrency has existed for nearly a decade, much of its mining ecosystem still consists of smaller global pools that often operate outside formal compliance frameworks.

“Zcash has matured into an institutional-grade asset, but the mining infrastructure supporting it hasn’t kept pace,” Foundry CEO Mike Colyer said in a statement shared with CoinDesk.

Betting on privacy

The expansion comes as privacy-focused cryptocurrencies regain attention across the market. New crypto tax reporting rules, carrying the threat of asset seizure, kicked in across the European Union at the turn of the year, and onchain analysis keeps advancing, driving growing demand for financial anonymity.

Zcash, along with other privacy coins including monero (XMR) and dash (DASH), has seen renewed interest that has helped their prices surge. ZEC has notably outperformed, up more than 670% over the past 12 months, compared with XMR’s 72% rise over the same period, while DASH is up 51%.

ZEC’s outperformance can likely be attributed to its hybrid privacy model, which makes shielded (completely anonymous) transactions optional and allows selective disclosure. Transactions can therefore remain transparent for custody and exchanges, a feature that has attracted accumulation from a Winklevoss-backed treasury firm as well as inflows into the Grayscale Zcash Trust.

Foundry’s shift toward zcash also likely reflects broader changes in mining economics. Bitcoin mining profitability has tightened following the 2024 halving, which cut block rewards in half while mining difficulty surged.

Speaking to CoinDesk, Colyer pushed back on the idea that the move is primarily a response to shrinking bitcoin margins.

“We evaluate opportunities based on where institutional infrastructure is needed, not on bitcoin margins at any given moment,” he said. “Foundry’s bitcoin mining business is strong and remains our core foundation.”

The expansion, Colyer said, was driven by an identified gap in compliant Zcash infrastructure. “Institutional and public miners who want exposure to zcash have had no US-based, compliant, purpose-built infrastructure to do it through,” he added.

As for whether the move signals a broader multi-chain strategy, Colyer said the company’s focus is “squarely on bitcoin and zcash” for now, though he added that Foundry is “always evaluating opportunities” that align with its mission and the demands of institutional miners.

While the price of bitcoin saw a major rise to near $125,000 late last year, it has since corrected to around $69,500. That has seen hashprice, a measure of the expected daily value of one petahash per second (PH/s) of computing power, drop from over $60 to about $30.

As margins shrink, many large mining firms have begun exploring other proof-of-work networks to diversify revenue.

Zcash mining infrastructure

Zcash launched in 2016 as a privacy-focused cryptocurrency built on zero-knowledge proof technology. The network allows users to send transactions on a public blockchain while keeping key details private. Using a cryptographic method known as zk-SNARKs, Zcash can verify that a transaction is valid without revealing the sender, receiver or amount involved.

Like Bitcoin, the Zcash network relies on proof-of-work mining to secure its blockchain: miners use specialized hardware to solve computational puzzles. When a miner or mining pool solves one of these puzzles, it adds a new block of transactions to the chain and earns a reward in newly issued ZEC along with transaction fees.
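As an illustration of the general proof-of-work idea, the following Python sketch searches for a nonce whose block-header hash falls below a difficulty target. This is a toy SHA-256 search, not Zcash's memory-hard Equihash; the header bytes and difficulty are made-up parameters.

```python
import hashlib

def mine(header: bytes, difficulty_bits: int) -> int:
    """Find a nonce such that sha256(header + nonce) has at least
    `difficulty_bits` leading zero bits, i.e. falls below the target."""
    target = 1 << (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(header + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

# Each extra difficulty bit doubles the expected number of attempts.
nonce = mine(b"block-header-data", difficulty_bits=16)
print("found nonce:", nonce)
```

Finding the nonce requires brute-force search, but anyone can verify it with a single hash, which is why difficulty can be tuned to keep block times steady.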

Zcash blocks are produced about every 75 seconds, faster than bitcoin’s blocks, which arrive roughly every 10 minutes. Still, both share a supply cap of 21 million coins. The mining process uses an algorithm called Equihash, which differs from Bitcoin’s SHA-256 and was designed to require large amounts of memory during computation.

Network difficulty, which keeps the time between blocks consistent, means the probability of solving a block alone is low. As a result, miners band together in mining pools, in which participants combine computing power and share rewards based on how much work they contribute. Large pools can influence the stability and decentralization of a network because they control significant portions of its total hashrate.

Foundry’s zcash pool

Foundry said its zcash pool will include identity verification checks for participants through rigorous know-your-customer and anti-money laundering compliance, transparent payout calculations and reporting tools aimed at institutional users. It’ll feature a dedicated support team and its operations will be based in the United States.

The company plans to apply the same operational framework used by its bitcoin pool, which has undergone SOC 1 Type 2 and SOC 2 Type 2 compliance audits, it said.

Mining rewards will be distributed through transparent Zcash addresses, not shielded ones, the company said. The pool will pay miners on a Pay Per Last N Shares (PPLNS) model, which Colyer said is “fully auditable” and provides detailed data supporting daily payment reconciliation.
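The PPLNS payout rule can be sketched in a few lines of Python: only the last N submitted shares count toward a block reward, so payouts track recent contribution. This is an illustrative model, not Foundry's implementation; the miner names, share weights, and reward figure are made up.

```python
def pplns_payout(shares, block_reward, n):
    """Split a block reward across the last N submitted shares.

    `shares` is an ordered list of (miner_id, weight) tuples, oldest first.
    Each miner in the window earns reward proportional to its share weight.
    """
    window = shares[-n:]
    total = sum(weight for _, weight in window)
    payouts = {}
    for miner, weight in window:
        payouts[miner] = payouts.get(miner, 0.0) + block_reward * weight / total
    return payouts

# Five shares arrive; with n=4, alice's oldest share falls outside the window.
shares = [("alice", 1), ("bob", 1), ("alice", 1), ("carol", 2), ("bob", 1)]
payouts = pplns_payout(shares, block_reward=1.0, n=4)
print(payouts)
```

Because every payout is a deterministic function of the recorded share log, the scheme is straightforward to audit, which is presumably what makes it attractive for compliance-focused pools.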

Foundry didn’t disclose the fee for miners, saying only that it will offer “competitive pool fee rates.” There will be no minimum hashrate threshold to join the pool, Colyer said, noting that the Zcash mining ecosystem is still emerging.

The company expects demand from miners that already operate in regulated environments such as North America. Many of those firms rely on formal reporting systems and compliance programs to meet corporate governance requirements.

If the zcash pool launches on schedule in 2026, it would mark one of the largest institutional entries into the Zcash mining ecosystem to date. Other major mining pools operating within it include F2Pool, 2Miners, and ViaBTC.

Market Analysis: EUR/USD Reclaims Ground While USD/JPY Momentum Fades

EUR/USD is recovering losses from 1.1500. USD/JPY is correcting gains from 159.00 and might decline further if it stays below 158.30.

Important Takeaways for EUR/USD and USD/JPY Analysis Today

  • The Euro struggled to stay in a positive zone and declined below 1.1700 before finding support.
  • There was a break above a connecting bearish trend line with resistance at 1.1580 on the hourly chart of EUR/USD at FXOpen.
  • USD/JPY started a decent increase above 157.00 before the bears appeared near 158.90.
  • There is a key contracting triangle forming with resistance near 158.30 on the hourly chart at FXOpen.

EUR/USD Technical Analysis

On the hourly chart of EUR/USD at FXOpen, the pair started a fresh decline from 1.1825. The pair broke below 1.1665 and the 50-hour simple moving average. Finally, it tested the 1.1500 zone. A low was formed at 1.1507, and the pair is now recovering losses.

There was a move above 1.1550 and a connecting bearish trend line at 1.1580. The pair surpassed the 38.2% Fib retracement level of the downward move from the 1.1826 swing high to the 1.1507 low. On the upside, the pair is now facing resistance near the 50% Fib retracement at 1.1665.

The first major hurdle for the bulls could be 1.1705. A break above 1.1705 could set the pace for another increase. In the stated case, the pair might rise toward 1.1775.

If not, the pair might drop again. Immediate support is near the 50-hour simple moving average and 1.1620. The next key area of interest might be 1.1565. If there is a downside break below 1.1565, the pair could drop towards 1.1505. The main target for the bears on the EUR/USD chart could be 1.1440, below which the pair could start a major decline.
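The retracement levels quoted above follow from simple arithmetic on the swing high and low. A short Python sketch (illustrative only, not an FXOpen tool) reproduces them; note the 50% level of the 1.1826 to 1.1507 move lands near the 1.1665 resistance mentioned.

```python
def fib_retracements(swing_high: float, swing_low: float) -> dict:
    """Retracement levels for a downward move, measured up from the low.

    Each level marks where a bounce has recovered that fraction of the drop.
    """
    span = swing_high - swing_low
    return {pct: swing_low + pct * span for pct in (0.236, 0.382, 0.5, 0.618)}

# EUR/USD swing from the article: high 1.1826, low 1.1507.
levels = fib_retracements(1.1826, 1.1507)
for pct, price in levels.items():
    print(f"{pct:.1%}: {price:.4f}")
```

The same function applied to the USD/JPY swing (low 156.45, high 158.90) gives the corresponding downside retracement levels when mirrored around the high.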

USD/JPY Technical Analysis

On the hourly chart of USD/JPY at FXOpen, the pair gained pace for a move above 158.00. The US dollar even traded close to 159.00 against the Japanese yen before the bears emerged.

A high was formed at 158.90 before a downside correction. The pair dipped below 158.00 and the 50% Fib retracement level of the upward move from the 156.45 swing low to the 158.90 high. However, the bulls were active above 157.00 and protected the 61.8% Fib retracement.

The pair is back above the 50-hour simple moving average and 158.00. Immediate resistance on the USD/JPY chart is near 158.30. There is also a key contracting triangle at 158.30.

If there is a close above the triangle and the hourly RSI moves above 65, the pair could rise towards 158.90. The next major barrier for the bulls could be 159.25, above which the pair could test 160.00 in the near term.

On the downside, the first major support is near 158.00. The next key region for the bears might be 157.40. If there is a close below 157.40, the pair could decline steadily. In the stated case, the pair might drop towards 156.45. Any more losses might send the pair toward 155.85.

This article represents the opinion of the Companies operating under the FXOpen brand only. It is not to be construed as an offer, solicitation, or recommendation with respect to products and services provided by the Companies operating under the FXOpen brand, nor is it to be considered financial advice.

Scaling AI Makes It Riskier

Opinion by: Mohammed Marikar, co-founder at Neem Capital

Artificial intelligence has so far been defined by scale: bigger models, faster processing, expanding data centers. The assumption, based on traditional technology cycles, was that scale would keep improving performance and that, over time, costs would fall and access would expand.

That assumption is now breaking down. AI is not scaling like other software. Instead, it is capital-intensive, constrained by physical limits, and hitting diminishing returns far earlier than expected.

The numbers make this clear. Electricity demand from global data centers will more than double by 2030 — levels once associated with entire industrial sectors. In the US alone, data center power demand is projected to rise well over 100 percent before the decade ends. This expansion is demanding trillions of dollars in new investment alongside major expansions in grid capacity.

Meanwhile, these systems are being embedded into law, finance, compliance, trading and risk management, where errors propagate quickly but credibility is non-negotiable. In June 2025, the UK High Court warned lawyers to immediately stop submitting filings that cited fabricated case law generated by AI tools.

The scaling AI debate

When an AI system can invent a precedent that never existed, and a professional relies on it, debates about scaling start becoming serious questions of public trust. Scaling is amplifying AI’s weaknesses rather than solving them.

Part of the problem lies in what scale actually improves. Large language models (LLMs) become increasingly fluent because language is pattern-based: the more examples an LLM sees of how real people write, summarize and translate, the faster it improves.

Deeper intelligence — reasoning — does not scale the same way. The next generation of AI must understand cause and effect and know when an answer is uncertain or incomplete. It will need to explain why a conclusion follows, not simply produce a confident response. This does not reliably improve with more parameters or more compute.

The consequence is a growing verification burden. Humans must spend more time checking machine output rather than acting on it, and that burden builds as systems are deployed more widely.

The cost of training AI models

Training frontier AI models has already become extraordinarily expensive, with credible tracking suggesting costs have been multiplying year over year, and projections that single training runs could soon exceed $1 billion. Training is only the entry cost.

The larger expense is inference: running these models continuously, at scale, with real latency, uptime and verification requirements. Every query consumes energy. Every deployment requires infrastructure. As usage grows, energy use and costs compound.

In markets and crypto, AI systems are increasingly used to monitor onchain activity, analyze sentiment, generate code for smart contracts, flag suspicious transactions and automate decisions.

In such a fast-moving, competitive environment, fluent but unreliable AI propagates errors quickly: false signals move capital, and fabricated explanations and hallucinations undermine trust. One example is the false positives generated in automated Anti-Money Laundering (AML) flagging, a common issue that wastes time and resources investigating innocent trading activity.

Time to improve reasoning

Scaling AI systems without improving their reasoning amplifies risk, especially in use cases where automation and credibility are vital and tightly coupled.

Ensuring AI is economically viable and socially valuable means we cannot rely on scaling. The dominant approach today prioritizes increasing compute and data while leaving the underlying reasoning machinery largely unchanged, a strategy that is becoming more expensive without becoming proportionally safer.

The alternative is architectural. Systems need to do more than predict the next word. They need to represent relationships, apply rules, check their own steps and make it possible to see how conclusions were reached.

This is where cognitive or neurosymbolic systems come into play. By organizing knowledge into interrelated concepts, rather than relying solely on brute-force pattern matching, these systems can deliver high reasoning capability with far lower energy and infrastructure demands.

Emerging “cognitive AI” platforms are demonstrating how structured reasoning systems can operate on local servers or edge devices, allowing users to keep control over their own knowledge rather than outsourcing cognition to distant infrastructure.

Cognitive AI systems are harder to design and can underperform on open-ended tasks, but when reasoning is reusable in this way rather than rederived from scratch through massive compute, costs fall and verification becomes tractable.

Control over how AI is built matters as much as how it reasons. Communities need systems they can shape, audit and deploy without waiting for permission from centralized platform owners.

Some platforms are exploring this frontier by using blockchain to enable both individuals and corporations to contribute data, models and computing resources. By decentralizing AI development itself, these approaches reduce concentration risk and align deployment with local needs rather than global demands.

AI faces an inflection point. When reasoning can be reused rather than rediscovered through massive pattern matching, systems require less compute per decision and impose a smaller verification burden on humans. That shifts the economics. Experimentation becomes cheaper, inference becomes more predictable. Scaling no longer depends on exponential increases in infrastructure.

Scaling has already done what it could. What it has exposed, just as clearly, is the limit of relying on size alone. The question now is whether the industry keeps pushing scale or starts investing in architectures that make intelligence reliable before making it bigger.
