
Crypto World

Revolutionising AI Application Development with Language Models


by Gonzalo Wangüemert Villalba

4 September 2025

Introduction

The open-source AI ecosystem reached a turning point in August 2025 when Elon Musk’s company xAI released Grok 2.5 and, almost simultaneously, OpenAI launched two new models, GPT-OSS-20B and GPT-OSS-120B. While both announcements signalled a commitment to transparency and broader accessibility, the details of these releases reveal strikingly different approaches to what open AI should mean. This article explores the architecture, accessibility, performance benchmarks, regulatory compliance and wider industry impact of these three models, with the aim of clarifying whether xAI’s Grok or OpenAI’s GPT-OSS family currently offers more value for developers, businesses and regulators in Europe and beyond.

What Was Released

Grok 2.5, described by xAI as a 270-billion-parameter model, was made available through the release of its weights and tokenizer. These files amount to roughly half a terabyte and were published on Hugging Face. Yet the release lacks critical elements such as training code, detailed architectural notes and dataset documentation. Most importantly, Grok 2.5 comes with a bespoke licence drafted by xAI that has not yet been scrutinised by legal or open-source communities. Analysts have noted that its terms could be revocable or carry restrictions that prevent the model from being considered genuinely open source. Elon Musk promised on social media that Grok 3 would be published in the same manner within six months, suggesting this is just the beginning of a broader open-source strategy by xAI.

By contrast, OpenAI unveiled GPT-OSS-20B and GPT-OSS-120B on 5 August 2025 with a far more comprehensive package. The models were released under the widely recognised Apache 2.0 licence, which is permissive, business-friendly and in line with the requirements of the European Union’s AI Act.
OpenAI shared not only the weights but also architectural details, training methodology, evaluation benchmarks, code samples and usage guidelines. This represents one of the most transparent releases the company has ever made, after years of criticism for keeping its frontier models proprietary.

Architectural Approach

The architectural differences between these models reveal much about their intended use. Grok 2.5 is a dense transformer with all 270 billion parameters engaged in computation. Without detailed documentation, it is unclear how efficiently it handles scaling or what kinds of attention mechanisms are employed. GPT-OSS-20B and GPT-OSS-120B, meanwhile, use a Mixture-of-Experts design: although the models contain 21 billion and 117 billion parameters respectively, only a small subset of those parameters is activated for each token. GPT-OSS-20B activates 3.6 billion and GPT-OSS-120B just over 5 billion. This architecture yields far greater efficiency, allowing the smaller of the two to run comfortably on devices with only 16 gigabytes of memory, including Snapdragon laptops and consumer-grade graphics cards. The larger model requires 80 gigabytes of GPU memory, placing it in the range of high-end professional hardware, yet it remains far more efficient than a dense model of similar size. This is a deliberate choice by OpenAI to ensure that open-weight models are not only theoretically available but practically usable.

Documentation and Transparency

The difference in documentation further separates the two releases. OpenAI’s GPT-OSS models include explanations of their sparse attention layers, grouped multi-query attention, and support for extended context lengths of up to 128,000 tokens. These details allow independent researchers to understand, test and even modify the architecture. By contrast, Grok 2.5 offers little more than its weight files and tokenizer, making it effectively a black box.
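The sparse activation behind the Mixture-of-Experts design can be sketched as top-k routing: each token scores every expert, only the best k experts actually run, and their outputs are blended. The following NumPy sketch is purely illustrative of the general technique; the expert count, hidden size and routing details are invented for the example and are not GPT-OSS’s actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

NUM_EXPERTS = 8   # total experts in the layer (illustrative, not GPT-OSS's real count)
TOP_K = 2         # experts actually activated per token
D = 16            # hidden size (illustrative)

# One tiny linear "expert" per slot, plus a router that scores experts per token.
experts = [rng.standard_normal((D, D)) / np.sqrt(D) for _ in range(NUM_EXPERTS)]
router_w = rng.standard_normal((D, NUM_EXPERTS)) / np.sqrt(D)

def moe_forward(x):
    """Route a single token vector x through only its top-k experts."""
    logits = x @ router_w                 # score every expert for this token
    top = np.argsort(logits)[-TOP_K:]     # indices of the k highest-scoring experts
    weights = np.exp(logits[top])
    weights /= weights.sum()              # softmax over the chosen experts only
    # Only TOP_K of NUM_EXPERTS experts do any computation for this token.
    out = sum(w * (x @ experts[i]) for w, i in zip(weights, top))
    return out, top

token = rng.standard_normal(D)
out, active = moe_forward(token)
print(f"active experts for this token: {sorted(active.tolist())} of {NUM_EXPERTS}")
```

Scaled up, this is why a model can hold 21 billion parameters yet compute with only 3.6 billion per token: the router touches a small, fixed subset of experts for each token, so memory holds all experts while compute stays proportional to the active few.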
From a developer’s perspective this is crucial: access to weights without knowledge of how the system was trained or structured limits reproducibility and hinders adaptation. Transparency also affects regulatory compliance and community trust, making OpenAI’s approach significantly more robust.

Performance and Benchmarks

Benchmark performance is another area where the GPT-OSS models shine. According to OpenAI’s technical documentation and independent testing, GPT-OSS-120B rivals or exceeds the reasoning ability of the company’s o4-mini model, while GPT-OSS-20B achieves parity with o3-mini. On benchmarks such as MMLU, Codeforces, HealthBench and the AIME mathematics tests from 2024 and 2025, the models perform strongly, especially considering their efficient architecture. GPT-OSS-20B in particular impressed researchers by outperforming much larger competitors such as Qwen3-32B on certain coding and reasoning tasks, despite using less energy and memory. Academic studies published on arXiv in August 2025 reported that the model achieved nearly 32 per cent higher throughput and more than 25 per cent lower energy consumption per 1,000 tokens than rival models. Interestingly, one paper noted that GPT-OSS-20B outperformed its larger sibling GPT-OSS-120B on some human evaluation benchmarks, suggesting that sparse scaling does not always correlate linearly with capability.

In terms of safety and robustness, the GPT-OSS models again appear carefully designed. They perform comparably to o4-mini on jailbreak resistance and bias testing, though they display higher hallucination rates on simple factual question-answering tasks. This transparency allows researchers to target weaknesses directly, which is part of the value of an open-weight release. Grok 2.5, however, lacks publicly available benchmarks altogether. Without independent testing, its actual capabilities remain uncertain, leaving the community with only Musk’s promotional statements to go by.
Regulatory Compliance

Regulatory compliance is a particularly important issue for organisations in Europe under the EU AI Act. The legislation requires general-purpose AI models to be released under genuinely open licences, accompanied by detailed technical documentation, information on training and testing datasets, and usage reporting. For models that exceed systemic risk thresholds, such as those trained with more than 10²⁵ floating-point operations, further obligations apply, including risk assessment and registration. Grok 2.5, by virtue of its vague licence and lack of documentation, appears non-compliant on several counts. Unless xAI publishes more details or adapts its licensing, European businesses may find it difficult or legally risky to adopt Grok in their workflows. GPT-OSS-20B and 120B, by contrast, seem carefully aligned with the requirements of the AI Act: their Apache 2.0 licence is recognised under the Act, their documentation meets transparency demands, and OpenAI has signalled a commitment to provide usage reporting. From a regulatory standpoint, OpenAI’s releases are the safer bet for integration within the UK and EU.

Community Reception

The reception from the AI community reflects these differences. Developers welcomed OpenAI’s move as a long-awaited recognition of the open-source movement, especially after years of criticism that the company had become overly protective of its models. Some users, however, expressed frustration with the Mixture-of-Experts design, reporting that it can lead to repetitive tool-calling behaviours and less engaging conversational output. Yet most acknowledged that for tasks requiring structured reasoning, coding or mathematical precision, the GPT-OSS family performs exceptionally well. Grok 2.5’s release was greeted with more scepticism.
While some praised Musk for at least releasing the weights, others argued that without a proper licence or documentation it was little more than a symbolic gesture, designed to signal openness while avoiding true transparency.

Strategic Implications

The strategic motivations behind these releases are also worth considering. For xAI, releasing Grok 2.5 may be less about immediate usability and more about positioning in the competitive AI landscape, particularly against Chinese developers and American rivals. For OpenAI, the move appears to be a balancing act: maintaining leadership in proprietary frontier models like GPT-5 while offering credible open-weight alternatives that address regulatory scrutiny and community pressure. This dual strategy could prove effective, enabling the company to compete in both commercial and open-source markets.

Conclusion

Ultimately, the comparison between Grok 2.5 and GPT-OSS-20B and 120B is not merely technical but philosophical. xAI’s release demonstrates a willingness to participate in the open-source movement but stops short of true openness. OpenAI, on the other hand, has set a new standard for what open-weight releases should look like in 2025: efficient architectures, extensive documentation, clear licensing, strong benchmark performance and regulatory compliance. For European businesses and policymakers evaluating open-source AI options, GPT-OSS currently represents the more practical, compliant and capable choice. While both xAI and OpenAI added momentum to open-source AI in August 2025, the details show that not all openness is created equal: Grok 2.5 stands as an important symbolic release, but OpenAI’s GPT-OSS family sets the benchmark for practical usability, compliance with the EU AI Act and genuine transparency.



Fluid Proposes Establishing a Foundation Funded by $3M Annual Grant From DAO


If approved, the governance proposal by Instadapp’s COO would establish a non-profit foundation to oversee the DeFi protocol’s code, frontend and trademarks.

Fluid DAO is considering a proposal to transfer all of the DeFi platform’s intellectual property into a Cayman Islands foundation, and to approve a $250,000 monthly grant to fund development and operations.

The proposal was submitted on Monday, Feb. 23, by DMH, the COO of Instadapp, the firm behind Fluid. It calls for the creation of the Fluid Foundation governed by DAO votes, a familiar corporate setup for crypto organizations.

Under the plan, “all Fluid Protocol smart contract code,” front-end interfaces, domains, trademarks and related assets would be transferred to the foundation. Once completed, the assets would “belong to the Foundation — not to any individual, company, or labs entity,” DMH wrote.


The foundation would have no owners and would operate through custodians and directors, according to the proposal. Its sole purpose would be to hold and steward the protocol’s intellectual property on behalf of the DAO.

“The Fluid team acts as custodians of the Foundation — not owners,” the proposal states, with FLUID token holders retaining “ultimate authority” through governance.

Control Stays with DAO

The proposal argues that a legal entity is needed as the protocol, which now has over $1 billion in total value locked (TVL), expands and engages with off-chain counterparties. A foundation structure would allow Fluid to meet “AML, KYC, banking, and regulatory requirements” without altering how token-based governance functions, the proposal argues.

Token holders would also retain the power to change foundation policy or shut it down entirely. The proposal says holders could “in an extreme case, dissolve the Foundation entirely through a governance vote.” DMH further elaborated in a response to a comment on the proposal:


“It is very important to understand that in the legal field, token holders and DAO have no rights; this is why we are creating a legal wrapper that can now have ownership rights over the protocol, and this foundation has no ownership.”

To fund the structure, the DAO is being asked to approve a $250,000 monthly grant, or about $3 million a year from its treasury, which is funded by protocol revenue. The budget would cover engineering, infrastructure, security, business development and general operational costs, according to DMH.

‘Foundation Bears the Legal Costs’

Fluid operates a decentralized lending and borrowing protocol, as well as a swap interface. According to data from DefiLlama, that combination has brought Fluid roughly $1.2 billion in TVL and generated about $1.1 million in revenue in January. In August, the platform saw a record-high revenue of $1.52 million. Measured against Fluid’s best revenue month yet, the grant would consume around 16% of monthly revenue.
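The budget figures above are easy to verify; a quick check using the article’s own numbers ($250,000 monthly grant, $1.52 million record August revenue, $1.1 million January revenue):

```python
monthly_grant = 250_000
annual_grant = monthly_grant * 12          # the ~$3M/year figure in the proposal
record_month_revenue = 1_520_000           # August record high
january_revenue = 1_100_000                # January revenue

share_of_record = monthly_grant / record_month_revenue
share_of_january = monthly_grant / january_revenue

print(f"annual grant: ${annual_grant:,}")
print(f"share of best month: {share_of_record:.0%}")   # ~16%
print(f"share of January:    {share_of_january:.0%}")  # ~23%
```

Against the record month the grant is about 16% of revenue, as stated; against the more typical January figure it would consume closer to a quarter of revenue.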

Fluid’s TVL and revenue. Source: DefiLlama

If approved, legal work to transfer the IP is expected to be completed by mid-2026, with Cayman Islands counsel handling the process. The team also plans to move ownership of all EVM deployments under direct DAO governance.

Some raised concerns about liability if the foundation were sued. In response, DMH said that “if the foundation gets sued, the foundation itself bears the legal costs and any liability.”

Over the past 24 hours, FLUID slid 6% from around $2 to $1.88, but has since recovered to $1.96.


The Defiant reached out to Instadapp for comment on the proposal but had not heard back by press time.

Late last year, a fee-related dispute between the two main entities behind Aave — Aave Labs and Aave DAO — turned into a broader debate on how crypto organizations should be structured.



Tom Lee’s ETH losses at Bitmine exceed FTX customer losses


Tom Lee, founder of Fundstrat and Chairman of ether (ETH) treasury company Bitmine Immersion Technologies, has lost more on ETH, using other people’s money, than the roughly $8 billion in losses suffered by FTX customers.

With 4,422,659 ETH purchased at an average $3,850 apiece, Lee’s company raised capital to buy the asset at over $2,000 more per coin than today’s price.

As a result, he’s lost $8.8 billion of his company’s assets.

At time of writing, ETH is trading at $1,843, down 60% over the past six months alone. Unfortunately, Bitmine Immersion has been buying tons of ETH over that bearish period — increasing losses for its investors at an alarming rate. 

Over the past six months, as ETH was declining 60%, Bitmine Immersion bought an extra 2,708,760 ETH. 

Those progressively disastrous additions increased the company’s losses from $4.8 billion to $8.8 billion.
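The headline loss follows directly from the holdings and prices cited above. A quick sanity check with the article’s figures (4,422,659 ETH at an average cost of $3,850, against a spot price of $1,843):

```python
holdings_eth = 4_422_659
avg_cost = 3_850    # average purchase price per ETH, per the article
spot = 1_843        # ETH price at time of writing

unrealized_loss = holdings_eth * (avg_cost - spot)
print(f"unrealized loss: ${unrealized_loss:,}")  # ≈ $8.88 billion
```

The result, about $8.88 billion, matches the article’s rounded $8.8 billion figure.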

Read more: Even Ethereum treasury companies are selling ETH to pay off debt


Bitmine Immersion lost $8.8 billion by buying ETH

It’s not particularly remarkable for digital asset treasury (DAT) companies to have declined in value.

The Wall Street fad, which peaked in early summer 2025, was to overpay for leverage in the hope that the mania would increase to even more exuberant heights, or that the company could convince bond investors or other capital allocators to offer it even more leverage.

DATs also pointed to the ultimately limited supply of bitcoin (BTC) or ETH as another reason to invest in these leveraged acquisition strategies, even though their efforts to corner the market usually fizzled out at single-digit percentages of those assets’ outstanding supply.

What started as modest premiums of a few percentage points quickly ballooned into stock debuts rallying to 23x the value of their crypto holdings.


That once-23x overvalued stock, like many similar treasury stocks, fell 98% by November from its May peak, and is now down over 99%.

Bitmine Immersion is down 88% from its July 2025 high. It’s lost over $600 million on its ETH holdings in the past week.

Within five months of its June 3, 2025 peak, Lee’s company had shed 80% of its stock value. By February 5 of this year, Lee’s ETH treasury had lost $8 billion for investors, and that loss extended to as much as $9 billion intraday this morning. 




Coinbase Opens Commission-Free Stock and ETF Trading to All US Users


Coinbase has opened stock and exchange-traded fund trading to all US users, allowing customers to buy and sell equities alongside crypto within the same app on a 24/5 basis. The rollout includes commission-free trading, fractional shares, and instant funding with USD or USDC. 

According to a company post on Tuesday, thousands of stocks are available to trade 24 hours a day, five days a week, with approximately 6,000 securities currently supported and plans to expand that number in the coming weeks.

Coinbase said it aims to introduce stock perpetual futures for non-US users through Coinbase Bermuda Ltd., subject to regulatory approval, and said it intends to offer tokenized equities in the future.

Today’s announcement comes on the heels of Coinbase expanding its prediction markets offering to all 50 US states last month through a partnership with Kalshi, allowing users to trade contracts tied to real-world events across sports, politics and culture. 


Brian Armstrong, CEO of Coinbase, posted the news today on X, writing “The everything exchange is growing.”

Source: Brian Armstrong

Related: WisdomTree gets SEC approval for round-the-clock trading of tokenized MMF

Tokenized equities gain traction from crypto platforms to Wall Street

Tokenized equities, blockchain-based representations of traditional shares, have emerged as a major theme in crypto over the past year.

In June, more than 60 tokenized stocks became available on crypto exchanges Kraken and Bybit, as well as on Solana-based DeFi platforms. The rollout, led by Backed Finance through its xStocks product, gave users blockchain-based exposure to major companies including Apple, Amazon, Tesla, Nvidia, Meta, Coinbase and Robinhood.

In October, fintech Robinhood expanded its own tokenization program on the Arbitrum blockchain, adding 80 new stock tokens and bringing its total to 493 tokenized assets.


While crypto-native and fintech platforms have led recent rollouts, interest in tokenized equities now extends to some of the world’s largest exchanges.

In September, Nasdaq filed with the US Securities and Exchange Commission (SEC) seeking approval to list tokenized equities, and in November, the exchange’s head of digital assets strategy, Matt Savarese, told CNBC that securing SEC approval to list tokenized versions of exchange-listed stocks is a top priority for the company.

In January, the New York Stock Exchange and its parent company, Intercontinental Exchange, announced plans to develop a platform for trading tokenized stocks and ETFs. The proposed system would support 24/7 trading and instant settlement by combining NYSE’s Pillar matching engine with blockchain-based post-trade infrastructure.


Coinbase also today announced a partnership with Yahoo Finance to enable users to move from researching an asset on Yahoo Finance to executing a trade on Coinbase with one click. Yahoo Finance will incorporate real-time information from Coinbase for asset discovery and tracking.

The US-based exchange said Coinbase One members can earn rewards on USDC balances used for trading, and Yahoo Finance users will be offered a one-month trial of Coinbase One Basic as part of the partnership.
