
Crypto World

Revolutionising Advanced Problem-Solving with AI


by Gonzalo Wangüemert Villalba

4 September 2025

Introduction

The open-source AI ecosystem reached a turning point in August 2025 when Elon Musk’s company xAI released Grok 2.5 and, almost simultaneously, OpenAI launched two new models, GPT-OSS-20B and GPT-OSS-120B. While both announcements signalled a commitment to transparency and broader accessibility, the details of these releases reveal strikingly different approaches to what open AI should mean. This article examines the architecture, accessibility, performance benchmarks, regulatory compliance and wider industry impact of the three models, with the aim of clarifying whether xAI’s Grok or OpenAI’s GPT-OSS family currently offers more value to developers, businesses and regulators in Europe and beyond.

What Was Released

Grok 2.5, described by xAI as a 270-billion-parameter model, was made available through the release of its weights and tokenizer. These files amount to roughly half a terabyte and were published on Hugging Face. Yet the release lacks critical elements such as training code, detailed architectural notes and dataset documentation. Most importantly, Grok 2.5 comes with a bespoke licence drafted by xAI that has not yet been fully scrutinised by legal or open-source communities; analysts have noted that its terms could be revocable or carry restrictions that prevent the model from being considered genuinely open source. Elon Musk has promised on social media that Grok 3 will be published in the same manner within six months, suggesting this is just the beginning of a broader open-source strategy by xAI.

By contrast, OpenAI unveiled GPT-OSS-20B and GPT-OSS-120B on 5 August 2025 with a far more comprehensive package. The models were released under the widely recognised Apache 2.0 licence, which is permissive, business-friendly and in line with the requirements of the European Union’s AI Act.
OpenAI shared not only the weights but also architectural details, training methodology, evaluation benchmarks, code samples and usage guidelines. This represents one of the most transparent releases the company has ever made, after years of criticism for keeping its frontier models proprietary.

Architectural Approach

The architectural differences between these models reveal much about their intended use. Grok 2.5 is a dense transformer with all 270 billion parameters engaged in computation. Without detailed documentation, it is unclear how efficiently it handles scaling or what kinds of attention mechanisms are employed. Meanwhile, GPT-OSS-20B and GPT-OSS-120B use a Mixture-of-Experts design. In practice this means that although the models contain 21 and 117 billion parameters respectively, only a small subset of those parameters is activated for each token: GPT-OSS-20B activates 3.6 billion and GPT-OSS-120B just over 5 billion. This architecture yields far greater efficiency, allowing the smaller of the two to run comfortably on devices with only 16 gigabytes of memory, including Snapdragon laptops and consumer-grade graphics cards. The larger model requires 80 gigabytes of GPU memory, placing it in the range of high-end professional hardware, yet it is still far more efficient than a dense model of similar size. This is a deliberate choice by OpenAI to ensure that open-weight models are not only theoretically available but practically usable.

Documentation and Transparency

The difference in documentation further separates the two releases. OpenAI’s GPT-OSS models include explanations of their sparse attention layers, grouped multi-query attention, and support for extended context lengths up to 128,000 tokens. These details allow independent researchers to understand, test and even modify the architecture. By contrast, Grok 2.5 offers little more than its weight files and tokenizer, making it effectively a black box.
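The mixture-of-experts routing described above can be illustrated with a toy sketch: a small gating network scores every expert, but only the top-k experts actually run for each token, which is why the active parameter count is so much smaller than the total. This is a minimal illustration, not the actual GPT-OSS routing code, and all dimensions and weights here are made up:

```python
import numpy as np

rng = np.random.default_rng(0)

def moe_forward(x, experts_w, gate_w, k=2):
    """Route one token through the top-k of n experts.

    x:         (d,) token embedding
    experts_w: (n, d, d) one weight matrix per expert
    gate_w:    (n, d) gating projection
    """
    logits = gate_w @ x                       # score each expert for this token
    top = np.argsort(logits)[-k:]             # indices of the k best experts
    weights = np.exp(logits[top])
    weights /= weights.sum()                  # softmax over the selected k only
    # Only k of the n expert matrices touch this token;
    # the other n - k contribute no computation at all.
    return sum(w * (experts_w[i] @ x) for w, i in zip(weights, top))

d, n = 8, 16
x = rng.standard_normal(d)
experts = rng.standard_normal((n, d, d))
gate = rng.standard_normal((n, d))
y = moe_forward(x, experts, gate, k=2)        # 2 of 16 experts active
```

With k=2 of 16 experts active, roughly one eighth of the expert parameters are exercised per token, mirroring (in miniature) the 3.6B-of-21B ratio the article describes for GPT-OSS-20B.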
From a developer’s perspective this is crucial: access to weights without knowledge of how the system was trained or structured limits reproducibility and hinders adaptation. Transparency also affects regulatory compliance and community trust, making OpenAI’s approach significantly more robust.

Performance and Benchmarks

Benchmark performance is another area where the GPT-OSS models shine. According to OpenAI’s technical documentation and independent testing, GPT-OSS-120B rivals or exceeds the reasoning ability of the company’s o4-mini model, while GPT-OSS-20B achieves parity with the o3-mini. On benchmarks such as MMLU, Codeforces, HealthBench and the AIME mathematics tests from 2024 and 2025, the models perform strongly, especially considering their efficient architecture. GPT-OSS-20B in particular impressed researchers by outperforming much larger competitors such as Qwen3-32B on certain coding and reasoning tasks, despite using less energy and memory. Academic studies published on arXiv in August 2025 reported that the model achieved nearly 32 per cent higher throughput and more than 25 per cent lower energy consumption per 1,000 tokens than rival models. Interestingly, one paper noted that GPT-OSS-20B outperformed its larger sibling GPT-OSS-120B on some human evaluation benchmarks, suggesting that sparse scaling does not always correlate linearly with capability.

In terms of safety and robustness, the GPT-OSS models again appear carefully designed. They perform comparably to o4-mini on jailbreak resistance and bias testing, though they display higher hallucination rates in simple factual question-answering tasks. This transparency allows researchers to target weaknesses directly, which is part of the value of an open-weight release. Grok 2.5, however, lacks publicly available benchmarks altogether. Without independent testing, its actual capabilities remain uncertain, leaving the community with only Musk’s promotional statements to go by.
Regulatory Compliance

Regulatory compliance is a particularly important issue for organisations in Europe under the EU AI Act. The legislation requires general-purpose AI models to be released under genuinely open licences, accompanied by detailed technical documentation, information on training and testing datasets, and usage reporting. For models that exceed systemic risk thresholds, such as those trained with more than 10²⁵ floating point operations, further obligations apply, including risk assessment and registration. Grok 2.5, by virtue of its vague licence and lack of documentation, appears non-compliant on several counts. Unless xAI publishes more details or adapts its licensing, European businesses may find it difficult or legally risky to adopt Grok in their workflows. GPT-OSS-20B and 120B, by contrast, appear carefully aligned with the requirements of the AI Act: their Apache 2.0 licence is recognised under the Act, their documentation meets transparency demands, and OpenAI has signalled a commitment to provide usage reporting. From a regulatory standpoint, OpenAI’s releases are safer bets for integration within the UK and EU.

Community Reception

The reception from the AI community reflects these differences. Developers welcomed OpenAI’s move as a long-awaited recognition of the open-source movement, especially after years of criticism that the company had become overly protective of its models. Some users, however, expressed frustration with the mixture-of-experts design, reporting that it can lead to repetitive tool-calling behaviours and less engaging conversational output. Yet most acknowledged that for tasks requiring structured reasoning, coding or mathematical precision, the GPT-OSS family performs exceptionally well. Grok 2.5’s release was greeted with more scepticism.
While some praised Musk for at least releasing weights, others argued that without a proper licence or documentation it was little more than a symbolic gesture, designed to signal openness while avoiding true transparency.

Strategic Implications

The strategic motivations behind these releases are also worth considering. For xAI, releasing Grok 2.5 may be less about immediate usability and more about positioning in the competitive AI landscape, particularly against Chinese developers and American rivals. For OpenAI, the move appears to be a balancing act: maintaining leadership in proprietary frontier models like GPT-5 while offering credible open-weight alternatives that address regulatory scrutiny and community pressure. This dual strategy could prove effective, enabling the company to compete in both commercial and open-source markets.

Conclusion

Ultimately, the comparison between Grok 2.5 and GPT-OSS-20B and 120B is not merely technical but philosophical. xAI’s release demonstrates a willingness to participate in the open-source movement but stops short of true openness. OpenAI, on the other hand, has set a new standard for what open-weight releases should look like in 2025: efficient architectures, extensive documentation, clear licensing, strong benchmark performance and regulatory compliance. For European businesses and policymakers evaluating open-source AI options, GPT-OSS currently represents the more practical, compliant and capable choice.

While both xAI and OpenAI contributed to the momentum of open-source AI in August 2025, the details reveal that not all openness is created equal. Grok 2.5 stands as an important symbolic release, but OpenAI’s GPT-OSS family sets the benchmark for practical usability, compliance with the EU AI Act, and genuine transparency.




Zama Token Debuts at $400 Million Valuation


ZAMA Chart

ZAMA is currently trading 30% below its ICO price.

Zama’s highly anticipated $ZAMA token has made headlines as the first production-scale use of Fully Homomorphic Encryption (FHE) on the Ethereum mainnet.

However, the token is currently trading at $0.035, a 30% decrease from its initial coin offering (ICO) price.


Zama’s auction format was notable for its confidentiality features. The token sale raised $118.5 million through a sealed-bid Dutch auction, using Zama’s technology to protect the privacy of participants’ bids.
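The article does not describe Zama’s exact pricing rule, but the clearing step of a sealed-bid auction of this general kind can be sketched as a uniform-price fill: sort bids by price descending, allocate until supply runs out, and let the last filled bid set the price everyone pays. A minimal generic sketch; all bid figures below are hypothetical:

```python
def clear_auction(bids, supply):
    """Uniform-price sealed-bid clearing.

    bids:   list of (price, quantity) tuples
    supply: total tokens on offer
    Returns (clearing_price, list of (price, filled_quantity)).
    """
    fills, remaining = [], supply
    for price, qty in sorted(bids, reverse=True):   # highest price first
        if remaining <= 0:
            break
        take = min(qty, remaining)                  # partial fill at the margin
        fills.append((price, take))
        remaining -= take
    clearing_price = fills[-1][0] if fills else None
    return clearing_price, fills

# Hypothetical bids: supply of 4,000 tokens clears at the marginal bid's price.
price, fills = clear_auction([(0.05, 1000), (0.04, 2000), (0.03, 5000)], supply=4000)
```

In a sealed-bid variant such as Zama's, the bids stay encrypted until clearing; the point of the FHE machinery is that a computation like the one above can, in principle, run without ever revealing individual bids.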

Zama’s focus on FHE is part of a broader strategy to enable confidential smart contracts on Ethereum. This technology enables computation on encrypted data without first decrypting it, enhancing privacy for blockchain applications.
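Full FHE is beyond a short snippet, but the core idea of computing on encrypted data can be shown with the classic Paillier scheme, which is additively (not fully) homomorphic: multiplying two ciphertexts yields an encryption of the sum of the plaintexts. This is a textbook sketch with toy key sizes, not Zama's construction, and far too small to be secure:

```python
import math
import random

def keygen(p=1009, q=1013):
    """Paillier keys from two (toy-sized) primes; g is fixed to n + 1."""
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    mu = pow(lam, -1, n)               # valid because g = n + 1
    return (n,), (n, lam, mu)          # public key, secret key

def encrypt(pk, m):
    (n,) = pk
    n2 = n * n
    while True:                        # random r coprime to n
        r = random.randrange(1, n)
        if math.gcd(r, n) == 1:
            break
    return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2

def decrypt(sk, c):
    n, lam, mu = sk
    n2 = n * n
    L = (pow(c, lam, n2) - 1) // n
    return (L * mu) % n

pk, sk = keygen()
c1, c2 = encrypt(pk, 20), encrypt(pk, 22)
c_sum = (c1 * c2) % (pk[0] ** 2)       # ciphertext product = plaintext sum
total = decrypt(sk, c_sum)             # 42, computed without decrypting c1 or c2
```

A fully homomorphic scheme additionally supports multiplication of plaintexts under encryption, which is what allows arbitrary smart-contract logic to run over encrypted state.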


This article was generated with the assistance of AI workflows.


Trump-Linked World Liberty Financial Draws House Scrutiny After $500M UAE Stake Revealed


A US House investigation has turned its focus to World Liberty Financial, a Trump-linked crypto venture.

The move follows a recent Wall Street Journal report of a $500M UAE-linked stake agreed shortly before President Donald Trump’s inauguration.

Rep. Ro Khanna, a Democrat from California and the ranking member of the House Select Committee on the Chinese Communist Party, on Wednesday sent a letter to World Liberty co-founder Zach Witkoff seeking ownership records, payment details and internal communications tied to the reported deal and related transactions.

Khanna wrote that the Journal reported “lieutenants to an Abu Dhabi royal secretly signed a deal with the Trump Family to purchase a 49% stake in their fledgling cryptocurrency venture [World Liberty Financial] for half a billion dollars” shortly before Trump took office.


He argued the reported investment raises questions about conflicts of interest, national security and whether US technology policy shifted in ways that benefited foreign capital tied to strategic priorities.

Meanwhile, Trump has said he had no knowledge of the deal. Speaking to reporters on Monday, he said he was not aware of the transaction and noted that his sons and other family members manage the business and receive investments from various parties.

Crypto Venture Deal Draws Scrutiny Over AI And National Security Policy Intersection

The letter also linked the reported stake to US export controls on advanced AI chips and concerns about diversion to China through third countries.

Khanna said the Journal report suggested the UAE-linked investment “may have resulted in significant changes to U.S. Government policies designed to prevent the diversion of advanced artificial intelligence chips and related computing capabilities to the People’s Republic of China.”

According to the Journal account cited in the letter, the agreement was signed by Eric Trump days before the inauguration.


The investor group was described as linked to Sheikh Tahnoon bin Zayed Al Nahyan, the UAE national security adviser. Two senior figures connected to his network later joined World Liberty’s board.

USD1 Stablecoin Use Raises Questions Over Influence And Profits

Khanna’s letter pointed to another UAE-linked deal involving World Liberty’s USD1 stablecoin, which he said was used to facilitate a $2B investment into Binance by MGX, an entity tied to Sheikh Tahnoon. He wrote that this use “helped catapult USD1 into one of the world’s largest stablecoins”, which could have increased fees and revenues for the project and its shareholders.

The lawmaker also connected the Binance investment to later policy developments, including chip export decisions and a presidential pardon for Binance founder Changpeng Zhao.


He cited a former pardon attorney who said, “The influence that money played in securing this pardon is unprecedented. The self-dealing aspect of the pardon in terms of the benefit that it conferred on President Trump, and his family, and people in his inner circle is also unprecedented.”

Khanna framed the overall picture as more than political optics. “Taken together, these arrangements are not just a scandal, but may even represent a violation of multiple laws and the United States Constitution,” he wrote, citing conflict-of-interest rules and the Constitution’s Foreign Emoluments Clause.

Khanna Warns Of National Security Stakes In WLFI Case

He asked World Liberty to answer detailed questions and produce documents by March 1, 2026, including agreements tied to the reported 49% stake, payment flows, communications with UAE-linked representatives, board appointments, due diligence and records tied to the USD1 stablecoin’s role in the Binance transaction.


Khanna also pressed for details on any discussions around export controls, US policy toward the UAE and strategic competition with China, as well as communications related to President Trump’s decision to pardon Zhao.

The probe lands at a moment when stablecoins sit closer to the center of market structure debates, and when politically connected crypto ventures face sharper questions about ownership, governance and access.

Khanna closed his letter with a warning about the stakes, writing, “Congress will not be supine amid this scandal and its unmistakable implications on our national security.”

The post Trump-Linked World Liberty Financial Draws House Scrutiny After $500M UAE Stake Revealed appeared first on Cryptonews.



Feds Crypto Trace Gets Incognito Market Creator 30 Years


The creator of Incognito Market, the online black market that used crypto as its economic heart, has been sentenced to 30 years in prison after blockchain sleuthing led US authorities straight to the platform’s operator.

The Justice Department said on Wednesday that a Manhattan court gave Rui-Siang Lin three decades behind bars for owning and operating Incognito, which sold $105 million worth of illicit narcotics between its launch in October 2020 and its closure in March 2024.

Lin, who pleaded guilty to his role in December 2024, was sentenced for conspiring to distribute narcotics, money laundering, and conspiring to sell misbranded medication.

Incognito allowed users to buy and sell drugs using Bitcoin (BTC) and Monero (XMR) while taking a 5% cut, and Lin’s undoing ultimately came after the FBI traced the platform’s crypto to an account in Lin’s name at a crypto exchange.


“Today’s sentence puts traffickers on notice: you cannot hide in the shadows of the Internet,” said Manhattan US Attorney Jay Clayton. “Our larger message is simple: the internet, ‘decentralization,’ ‘blockchain’ — any technology — is not a license to operate a narcotics distribution business.”

Dark Markets, Court, Dark Web
Source: US Attorney SDNY

In addition to prison time, Lin was sentenced to five years of supervised release and ordered to pay more than $105 million in forfeiture.

Crypto tracing led FBI right to Lin

In March 2024, the Justice Department said Lin closed Incognito and stole at least $1 million that its users had deposited in their accounts on the platform.

Lin, known online as “Pharoah,” then attempted to blackmail Incognito’s users, demanding that buyers and vendors pay him or he would publicly share their user history and crypto addresses.

Lin wrote “YES, THIS IS AN EXTORTION!!!” in a post to Incognito’s website. Source: Department of Justice

Months later, in May 2024, authorities arrested Lin, a Taiwanese national, at New York’s John F. Kennedy Airport after the FBI tied him to Incognito partly by tracing the platform’s crypto transfers to a crypto exchange account in Lin’s name.

The FBI said a crypto wallet that Lin controlled received funds from a known wallet of Incognito’s, and those funds were then sent to Lin’s exchange account.



The agency said it traced at least four transfers showing Lin’s crypto wallet sent Bitcoin originally from Incognito to a “swapping service” to exchange it for XMR, which was then deposited to the exchange account.
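At its core, the kind of transfer tracing described here is a graph walk: treat each observed transfer as a directed edge and search outward from a known address until a flagged endpoint (such as an exchange deposit address) is reached. A toy breadth-first sketch; the addresses are hypothetical placeholders, and real tracing must also contend with mixers, swap services and clustering heuristics:

```python
from collections import deque

def trace(transfers, source, targets):
    """Find a path of transfers from `source` to any address in `targets`.

    transfers: list of (from_addr, to_addr) edges
    Returns the address path as a list, or None if unreachable.
    """
    graph = {}
    for frm, to in transfers:
        graph.setdefault(frm, []).append(to)

    seen, queue = {source: None}, deque([source])   # seen maps addr -> parent
    while queue:
        addr = queue.popleft()
        if addr in targets:
            path = []                               # rebuild path via parents
            while addr is not None:
                path.append(addr)
                addr = seen[path[-1]]
            return path[::-1]
        for nxt in graph.get(addr, []):
            if nxt not in seen:
                seen[nxt] = addr
                queue.append(nxt)
    return None

# Hypothetical transfer graph: market wallet -> personal wallet -> swap -> exchange
transfers = [
    ("incognito_wallet", "personal_wallet"),
    ("personal_wallet", "swap_service"),
    ("swap_service", "exchange_deposit"),
]
path = trace(transfers, "incognito_wallet", {"exchange_deposit"})
```

The path recovered here mirrors the pattern the FBI described: funds leaving the market's known wallet, passing through a swapping service, and landing in an identifiable exchange account.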

The exchange gave the FBI a photo of Lin’s Taiwanese driver’s license used to open the account, along with an email address and phone number, and the agency tied the email and number to an account at the web domain registrar Namecheap.