

Microsoft rejects critical Azure vulnerability report, no CVE issued


A security researcher claims Microsoft quietly fixed an Azure Backup for AKS vulnerability after rejecting his report and blocking a CVE from being issued.

The researcher’s report describes a critical privilege escalation flaw that allowed cluster-admin access from the low-privileged “Backup Contributor” role.

Microsoft disputes the claim, telling BleepingComputer the behavior was expected and that “no product changes were made,” despite the researcher documenting new permission checks and failed exploit attempts after disclosure, suggestive of a silent patch.

CERT agrees it’s a bug, but Microsoft blocks CVE

Security researcher Justin O’Leary discovered the security flaw in March and reported it to Microsoft on March 17.


Microsoft Security Response Center (MSRC) rejected the report on April 13, claiming the issue only involved obtaining cluster-admin on a cluster where “the attacker already held administrator access,” a characterization O’Leary says misrepresents the attack entirely.

“This is factually incorrect,” states the researcher.

“The vulnerability allows a user with zero Kubernetes permissions to gain cluster-admin. The attack does not require existing cluster access — it grants it.”

O’Leary further says that Microsoft described the submission to MITRE as “AI-generated content,” something he says did not address the technical merits of the report.


After the rejection, O’Leary escalated the issue to CERT Coordination Center, which independently validated the vulnerability on April 16 and, according to the researcher, assigned it an identifier, VU#284781:

CERT/CC assigning the flaw a tracking identifier and disclosure date (Justin O’Leary)

CERT/CC had initially scheduled public disclosure for June 1, 2026, but that disclosure never happened.

On May 4, Microsoft staff reportedly contacted MITRE recommending against CVE assignment, again arguing the issue required pre-existing administrative access:

Microsoft recommending MITRE against a CVE issuance (Justin O’Leary)

CERT/CC later closed the case under CNA hierarchy rules, effectively leaving Microsoft (which is a CNA) with final authority over CVE issuance for its own products.

How the attack worked

Azure Backup for AKS uses Trusted Access to grant backup extensions cluster-admin privileges inside Kubernetes clusters.

According to O’Leary, the flaw allowed anyone with only the Backup Contributor role on a backup vault to trigger that Trusted Access relationship without already having Kubernetes permissions.


An attacker could enable backup on a target AKS cluster, causing Azure to automatically configure Trusted Access with cluster-admin privileges. From there, an attacker could extract secrets through backup operations or restore malicious workloads into the cluster.

O’Leary classified the issue as a Confused Deputy vulnerability (CWE-441), where Azure RBAC and Kubernetes RBAC trust boundaries interacted in a manner that bypassed expected authorization controls.
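The trust-boundary interaction can be illustrated with a toy model. All class, role, and function names below are invented for illustration, not real Azure or Kubernetes APIs: the backup service plays the "deputy" holding cluster-admin via Trusted Access, and the caller needs only an Azure-side vault role to make it act.

```python
# Toy model of the confused-deputy pattern (CWE-441) described above.
# Every name here is illustrative; this is not actual Azure behavior or code.

class KubernetesCluster:
    """Target cluster with its own RBAC and some secrets."""
    def __init__(self):
        self.secrets = {"db-password": "hunter2"}
        self.rbac = {}  # principal -> set of roles

    def read_secret(self, principal, name):
        if "cluster-admin" not in self.rbac.get(principal, set()):
            raise PermissionError(f"{principal} lacks cluster-admin")
        return self.secrets[name]

class BackupService:
    """The 'deputy': granted cluster-admin inside the cluster via Trusted Access."""
    def __init__(self, cluster):
        self.cluster = cluster
        cluster.rbac["backup-extension"] = {"cluster-admin"}  # Trusted Access grant

    def enable_backup_and_dump(self, azure_caller_roles, secret_name):
        # The flaw as described: only the caller's *Azure* role is checked,
        # never their Kubernetes permissions, before the deputy acts with
        # its own cluster-admin power on the caller's behalf.
        if "Backup Contributor" not in azure_caller_roles:
            raise PermissionError("needs Backup Contributor on the vault")
        return self.cluster.read_secret("backup-extension", secret_name)

cluster = KubernetesCluster()
deputy = BackupService(cluster)
# The attacker holds zero Kubernetes permissions, only an Azure vault role:
leaked = deputy.enable_backup_and_dump({"Backup Contributor"}, "db-password")
```

The point of the model: it is the deputy's privilege, not the caller's, that unlocks the secret, which is exactly the boundary confusion O'Leary describes.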

Microsoft says no changes made, behavior says otherwise

BleepingComputer reached out to Microsoft to ask whether the tech giant considered this finding a valid security vulnerability.

A Microsoft spokesperson told BleepingComputer:


“Our assessment concluded that this is not a security vulnerability, but rather expected behavior that requires pre-existing administrative privileges within the customer’s environment. Therefore, no product changes were made to address this report and no CVE or CVSS score were issued.”

However, following the disclosure of his report this month, O’Leary observed that the original attack path no longer works.

“Current behavior returns errors that did not exist in March 2026,” he states:

ERROR: UserErrorTrustedAccessGatewayReturnedForbidden

“The Trusted Access role binding is missing/has gotten removed”


According to O’Leary, Azure Backup for AKS now requires Trusted Access to be manually configured before backup can be enabled, reversing the earlier behavior where Azure configured it automatically.

He also observed additional permission checks that were absent during his original testing in March. The vault MSI now requires Reader permissions on both the AKS cluster and snapshot resource group, while the AKS cluster MSI requires Contributor permissions on the snapshot resource group.

In other words, the vulnerability appears to have been fixed, but Microsoft has neither issued a public advisory nor notified customers.

The visibility problem for defenders

Without a CVE or advisory, defenders have little visibility into the exposure window or remediation timeline.


“Organizations that granted Backup Contributor between an unknown start date and May 2026 were exposed to privilege escalation,” writes the researcher.

“Without a CVE, security teams cannot track this exposure. Silent patching protects vendors, not customers.”

The case highlights a structural problem with no easy fix.

Disputes between security researchers and major vendors over severity, exploitability, and disclosure have become common in recent years, especially as vulnerability disclosure programs face increasing volumes of reports.


Some open-source maintainers have also publicly complained that AI-assisted reports are overwhelming bug bounty and security triage systems, making it harder for legitimate findings to receive timely attention. Cases where big tech vendors ignored valid flaws despite repeated contact from different researchers are not uncommon either.

Without a framework that realigns incentives for all parties, responsible disclosure risks becoming a bureaucratic exercise that serves no one—least of all the organizations left exposed in the dark.




Classic 7 is Windows 10 LTSC cosplaying as Windows 7


Uncanny rebuild resurrects the 2009 desktop, complete with support, updates, and licensing questions

For those who miss what Windows looked like in 2009, Classic 7 is a heavily modified version of Windows 10 IoT LTSC, reworked to make it look as much as possible like Windows 7, while still being in support and receiving updates.

This has been accomplished thanks to a large compilation of skins, themes, add-ons, tweaks, and so on – some of which are real components from older versions of Windows, adapted and modified to run on Windows 10.


We were not sure whether to cover Classic 7, because while it is impressive and fun, we are not at all sure it is legitimate to use. But we can see a target audience.


The good old sky-blue login screen with its decorative vine tells you that things are not what they seem

This isn’t just a layer of makeup; it’s more like a face transplant. It includes some real binaries from Windows 7, and indeed earlier versions, adapted and grafted onto Windows 10. One component is the Windows Media Center from Windows XP, which was cut from Windows 10 before release.

The specific version of Windows 10 that has been modified is significant. It’s Windows 10 IoT LTSC. We talked about this specific edition in April 2025 because it’s the last version of Windows 10 that is still in support and receiving updates. The standard Windows 10 Enterprise LTSC release will continue to receive updates until 2027, and the IoT edition, which is only available in US English, will get updates until 2032 – so this is the longest-lived version of Windows 10.

At the bottom of our story on Windows 10 LTSC, we mentioned the slightly shady world of third-party modified editions of Windows. Classic 7 is one; it’s a modified version of an Enterprise edition of Windows, one that’s only available for legitimate licensing via a Volume License Agreement. Unless you have appropriate volume licensing for the underlying Windows edition and have paid the fairly hefty fee, this is an unlicensed copy of Windows. So we have to spell out that this is not for production use, and you should not use it in any working environment. It’s an interesting hack, though, and it might be a bit of fun for a home gaming machine or something like that.


As an aside, one of the most widely used tools for activating unauthorized copies of Windows and Office, MassGrave, is in fact hosted on GitHub. In other words, Microsoft itself is hosting tools to activate unlicensed copies of Windows and Office. Whether that counts as tacit approval, we wouldn’t like to say.


This, unbelievably, is a Windows 10 desktop. Yes, we know it says it’s Windows 7 Ultimate

Classic 7 has been under construction for over a year and a half, and it’s the sequel to an earlier project called Reunion7 – also hosted on GitHub, as it happens.

As its list of credits shows, Classic 7 is in part a compilation of a lot of existing tools. Some of them are relatively well known, such as Winaero Tweaker, which can run on any copy of Windows and, among lots of other options, allows some of the less desirable changes in the Windows UI to be undone – for instance, switching to the hidden Aero Lite theme.

Classic 7 includes this and a lot more besides. We could identify some of the couple of dozen credited projects, such as the Aero11 theme, itself a port of Aero10 to Windows 11. This works alongside OpenGlass, which brings Aero-style transparency to Windows 10.


There’s also the Windows NT Modding Utility, and another hack that lets you change the Windows version number reported on the command line, called Custom CMD Version Text. Multiple sub-components come from the Windhawk mods collection, some credited to a developer called ImSwordQueen, whose themes can be seen on DeviantArt.


Classic 7 runs the original Windows 7 Explorer, and there’s a README file on the desktop with credits

Other components are more than just cosmetic. For instance, the remarkable description of Explorer7: “explorer7 is a wrapper library that allows Windows 7’s explorer.exe to run properly on modern Windows versions, aiming to resurrect the original Windows 7 shell experience.” So this is not merely a theme for Windows 10 Explorer: as far as we can tell, it’s the real Windows 7 Explorer, but running on top of 10. The same appears to apply to Control Panel as well, thanks to the Control Panel Restoration Pack. Thanks to the Windows Media Center (Modern Hardware) effort, this is the real XP version, which an on-screen message says replaced the Windows 8 version used in an older build.

We tried Classic 7 in VMware, and the experience is quite uncanny. We did hit some glitches: our first installation failed when we let it do its own disk partitioning. Deleting all the partitions, manually creating a single large C: drive, and telling the installer to use that worked. A few error messages did appear here and there. Trying to change screen resolution went badly awry until we installed the VMware guest additions. Opening Windows Update just threw an error.

Overall, though, it is genuinely remarkable. It looks and feels like Windows 7 – but in principle, you can run the latest apps and drivers and they should work. It even includes your choice of older Firefox versions, including version 115 ESR, skinned to look exactly like Internet Explorer – an effort called BeautyFox.


Although the menu says ‘About Internet Explorer,’ it isn’t: this is Firefox, and the title bar shows that alpha-blending is working

Last year, we wrote a piece on running Windows 7 in 2025, and it really reminded us how great the 2009 release looked compared to anything that’s come since. Apparently, that late-noughties translucent look is now known as Frutiger Aero, and frankly we miss it.

In all honesty, we feel Classic 7 goes too far. We don’t want Help/About dialog boxes, the winver tool, and even the ver command lying to us. We’d prefer something that told the truth, but looked pretty while doing it.

But as we wrote last year, some personal friends are still running Windows 7 by choice, and compatibility is starting to become a problem. If you want a recent Firefox, well, you’re out of luck. Firefox 115 from 2023 still works, and remarkably, it’s still getting security fixes now: the March end-of-life has been postponed again, and it’s currently August 2026. The Irish Sea wing of Vulture Towers is still running it on OS X 10.13 and it works flawlessly.

This is a way out: to keep the 17-year-old vintage look, while running a codebase that still has another five years in it. If you’re that determined, it’s an option… and it’s undeniably an attractive GUI. Whether this unauthorized rebuild of an unlicensed OS is an attractive option, though – you must decide that for yourself. ®



Hackers exploit auth bypass flaw in Burst Statistics WordPress plugin


Hackers are leveraging a critical authentication bypass vulnerability in the WordPress plugin Burst Statistics to obtain admin-level access to websites.

Burst Statistics is a privacy-focused analytics plugin active on 200,000 WordPress sites and marketed as a lightweight alternative to Google Analytics.

The flaw, tracked as CVE-2026-8181, was introduced on April 23 with the release of version 3.4.0 of the plugin. The vulnerable code was also present in the following iteration, version 3.4.1.

According to Wordfence, which discovered CVE-2026-8181 on May 8, the flaw allows unauthenticated attackers to impersonate known admin users during REST API requests, and even create rogue admin accounts.


“This vulnerability allows unauthenticated attackers who know a valid administrator username to fully impersonate that administrator for the duration of any REST API request, including WordPress core endpoints such as /wp-json/wp/v2/users, by supplying any arbitrary and incorrect password in a Basic Authentication header,” explains Wordfence.

“In a worst-case scenario, an attacker could exploit this flaw to create a new administrator-level account with no prior authentication whatsoever.”
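A defender wanting to check their own site for this behavior could construct the request Wordfence describes. This is a hedged sketch: the /wp-json/wp/v2/users endpoint comes from the advisory, while the site URL and admin username below are placeholders you would substitute with your own.

```python
import base64

def basic_auth_header(username: str, password: str) -> str:
    """Build the Basic Authentication header the advisory describes:
    a valid admin username paired with any arbitrary, wrong password."""
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    return f"Basic {token}"

# Placeholder values; replace with your own site and a real admin username.
site = "https://example.com"
url = f"{site}/wp-json/wp/v2/users"
header = basic_auth_header("admin", "definitely-wrong-password")

# With the vulnerable plugin (3.4.0/3.4.1) active, a GET to `url` carrying
# {"Authorization": header} would be served as the impersonated admin;
# on a patched site (3.4.2+) the deliberately wrong password should be
# rejected with a 401.
```

Only run this against infrastructure you are authorized to test.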

The root cause is the incorrect interpretation of the ‘wp_authenticate_application_password()’ function’s results: specifically, treating a ‘WP_Error’ as an indication of successful authentication.

The researchers explain that WordPress can also return ‘null’ in some cases, which is likewise mistakenly treated as an authenticated request.


As a result, the code calls ‘wp_set_current_user()’ with the attacker-supplied username, effectively impersonating that user for the duration of the REST API request.
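The failure mode can be sketched in Python. The function below is a stand-in for WordPress core’s wp_authenticate_application_password(), and the two checks contrast the misinterpretation with a correct one; all names and control flow here are illustrative, not the plugin’s actual code.

```python
class WPError:
    """Stand-in for WordPress's WP_Error return value."""
    def __init__(self, msg):
        self.msg = msg

def wp_authenticate_application_password(user, password):
    # Simplified stand-in: a wrong password for a known user yields a
    # WP_Error, and core can also return None ("null") when no
    # application password applies at all.
    if password == "correct-app-password":
        return {"user": user}
    if user == "admin":
        return WPError("incorrect application password")
    return None

def vulnerable_is_authenticated(result):
    # Models the described bug: neither a WP_Error nor a null result is
    # rejected, so the request proceeds as the supplied username.
    if isinstance(result, WPError):   # error misread as success
        return True
    if result is None:                # null misread as success
        return True
    return "user" in result

def patched_is_authenticated(result):
    # Correct logic: only an actual user object counts.
    return isinstance(result, dict) and "user" in result

username = "admin"  # attacker only needs to know a valid admin username
result = wp_authenticate_application_password(username, "any-wrong-password")
# In the flawed flow, wp_set_current_user() is then called with attacker input:
current_user = username if vulnerable_is_authenticated(result) else None
```

Under the patched check, the same WP_Error (or null) result would be rejected and no user would be set.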

Admin usernames may be exposed in blog posts, comments, or even in public API requests, but attackers can also use brute-force techniques to guess them.

Admin-level access allows attackers to access private databases, plant backdoors, redirect visitors to unsafe locations, distribute malware, create rogue admin users, and more.

While Wordfence warned in its post that they “expect this vulnerability to be targeted by attackers and, as such, updating to the latest version as soon as possible is critical,” its tracker shows that malicious activity has already begun.


According to the same platform, the website security firm has blocked over 7,400 attacks targeting CVE-2026-8181 in the past 24 hours, so the activity is significant.

Users of the Burst Statistics plugin are recommended to upgrade to the patched release, version 3.4.2, released on May 12, 2026, or disable the plugin on their site.

WordPress.org stats show that Burst Statistics had 85,000 downloads since the release of 3.4.2, so assuming that all were for the latest version, there remain roughly 115,000 sites exposed to admin takeover attacks.




Analogue 3D’s Latest Update Lets You Save Whenever You Want






Analogue is adding a bit of “modern convenience” to its contemporary remake of the Nintendo 64 with its latest update. In the 1.3.0 version of Analogue’s 3DOS, players get the ability to quicksave whenever they want thanks to the company’s “signature save-state system” called Memories. Now, instead of risking a run to the next save point on low HP, the Memories feature lets you capture game progress at any point and reload whenever you want.

Analogue first introduced Memories with the Analogue Pocket in 2022 and would later advertise the feature as part of the Analogue 3D’s announcement. However, the quicksaving feature was ultimately delayed and didn’t come with the console’s launch in November 2025. Now that it’s here, Analogue has introduced hotkeys to create Memories, which work on both 8BitDo’s 64 Controller and the original N64 controllers, and is letting 3D owners keep up to 20 save files with Memories. According to Analogue, the oldest file will automatically be deleted when creating a new quicksave, but players can pin a specific Memory to ensure its preservation.
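The slot behavior Analogue describes (up to 20 saves, the oldest unpinned file evicted first, pinned saves preserved) amounts to a small bounded buffer with pinning. A toy Python model, with all names invented here:

```python
# Toy model of the Memories behavior described above: a capped list of
# save states where eviction skips pinned entries. Illustrative only.

class Memories:
    LIMIT = 20

    def __init__(self):
        self.saves = []        # oldest first
        self.pinned = set()

    def pin(self, name):
        """Mark a save so it is never auto-deleted."""
        self.pinned.add(name)

    def quicksave(self, name):
        if len(self.saves) >= self.LIMIT:
            # Evict the oldest save that isn't pinned.
            for old in self.saves:
                if old not in self.pinned:
                    self.saves.remove(old)
                    break
        self.saves.append(name)

m = Memories()
for i in range(20):
    m.quicksave(f"save-{i}")
m.pin("save-0")
m.quicksave("save-20")   # evicts save-1, because save-0 is pinned
```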


Analogue 3D owners can download the latest firmware, which also comes with a few bug fixes, on the company’s website. As much as this update comes as welcome news for existing owners, those still looking to get their hands on an Analogue 3D will have to wait for new stock alerts, as both the original and limited-edition colorways have been out of stock for some time now.





Marketing operating system Nectar Social raises $30M Series A led by Menlo


AI-powered marketing platform Nectar Social announced Thursday that it raised a $30 million Series A round led by Menlo Ventures and its Anthology Fund, which was created alongside Anthropic.

The company, which officially exited stealth last year, is an agentic operating system for marketers. It told TechCrunch that it uses autonomous AI agents to help brands run “social activity, moderation, creator workflows, competitive intelligence and commerce conversations end-to-end.” It also has data partnerships with companies like Meta and Reddit that allow the Nectar agent to pull and pool data into one place from various platforms, rather than brands needing to use different tools to manage different platforms.

Nectar Social was founded by sisters Misbah and Farah Uraizee, ex-Meta employees. Misbah, the CEO, told TechCrunch that this round will help the company expand and hire more across applied AI, engineering, and go-to-market.

“The buying conversation has moved into social, and no human team can staff every place it happens,” Misbah said. “We’re accelerating our category lead in building the operating system that lets brands show up everywhere.”


The company said clients include Liquid Death, Figma, and e.l.f. Beauty. Other investors in the round include Gwyneth Paltrow’s Kinship Ventures, GV, and True Ventures.



Why don’t AirPods Max have an Apple logo on them?


An ex-Apple designer has revealed just how long ago the company began working on AirPods Max, and the reasoning behind not having an Apple logo on the product.

The original AirPods came out in 2016, and AirPods Max weren’t launched until 2020. So it’s an easy assumption that Apple only decided to make AirPods Max once those initial tiny white earbuds had proven to be a success.

Then, too, AirPods Max went so long without an update that it was easy to assume they were not considered important. That’s especially so since their eventual update chiefly consisted of switching them to USB-C for charging.

Yet according to designer Eugene Whang, he was working on AirPods Max fully five years before they were released. Speaking to Highsnobiety magazine, Whang described the job as really being about three products.


Those were the headband, the case, and the cushion that together hold the AirPods Max comfortably against people’s ears. It was reportedly especially hard to get the cushioned section right, because it had to suit as many different head sizes and shapes as possible.

The process involved experimenting with “hundreds and hundreds of variations,” he said.

According to Whang, there weren’t just practical or technical issues to consider, nor was it all about what to add to the product. Instead, there were deeper issues, such as the way AirPods Max intentionally omit something every other Apple product has: an Apple logo.

“We didn’t want to brand your head,” says Whang.


Inside the design team

AirPods Max was reportedly one of the last products Whang worked on in his 22 years at Apple. And throughout that time, he says, Apple maintained a disciplined approach to design.

“We would huddle around a table for hours,” says Whang. “Everyone’s equal. You’re only as good as your ideas. We were very direct with one another.”

“No one had any ego,” he continued. “You’re not criticizing the person; you’re criticizing the idea.”

Jony Ive has talked before about the importance of detail, and of how the care that goes into a product is felt by the user. Whang believes that too.


“If it’s not right on the inside, it’s not going to be right on the outside,” he says. “We literally designed from the inside out.”

“The interior details would have as much design work as the exterior,” he continues. “The shape of the PCB. The placement of components. Constant shuffling and Tetris of internals.”

Hand pressing a button on rose gold over-ear headphones worn on a person with light brown hair, showing close-up of ear cup, headband hinge, and control knob

Hundreds and hundreds of variations were tried for how AirPods Max should fit comfortably

Whang is not shy about saying Apple’s designers “literally shaped culture through our products.” But he seems to say it as factual, rather than through ego, because he then says that immediately after launch, the whole team is constantly concerned.


“There’s always defense mode,” he says. “What’s going to go wrong that we didn’t think of?”

He also talked about the incredible impact of Apple products, and the concomitant pressure to keep doing well. “When you’re in the eye of the storm, it doesn’t feel that crazy. You’re just doing the work,” he says.

Whang left Apple shortly after Jony Ive did, and was one of the designers who went to work for Ive’s startup, LoveFrom. He would ultimately leave to care for his ill mother, but says that during his time at LoveFrom, he worked on a huge range of projects, from technology to interior design.

“It was amazing, the work was inspiring,” he says, “the people were inspiring.”


Separately, LoveFrom most recently showed off its technology and interior design with the Ferrari Luce electric vehicle.



Git is unprepared for the AI coding tsunami


Last month, Mitchell Hashimoto, HashiCorp co-founder, publicly declared that he was moving his popular open source Ghostty terminal emulator project from GitHub. GitHub runs the world’s largest service built on the Git distributed version control system, created by Linus Torvalds.

Once an enthusiastic user, Hashimoto grew disillusioned with service disruptions and increasingly slow pull requests. “This is no longer a place for serious work if it just blocks you out for hours per day, every day,” he wrote.

Hashimoto was quick to defend Git itself: “The issue isn’t Git, it’s the infrastructure we rely on around it: issues, PRs, Actions, etc.”

Many have blamed GitHub’s performance on Microsoft, which acquired the company in 2018. But to be fair, GitHub itself has been experiencing heavier-than-expected traffic thanks to a proliferation of AI-generated pull requests.


In 2025, GitHub saw 206 percent year-over-year growth in AI-generated projects, measured by the use of Bash shell scripts, a widespread way of running agents. And more AI code means more bugs: research from GitClear found that AI-generated code averaged 10.83 issues per pull request, compared to 6.45 for the old-fashioned human variety.

Our new agentic workforce is raising big questions about how the entire software development lifecycle (SDLC) should evolve, and whether Git should come along.

“Agents are nudging us toward a continuous flow,” warned Peco Karayanev, co-founder of DevOps platform provider Autoptic, which bridges Git-based deployments with observability tools for agent-based remediation.

Autoptic’s entire user base runs on some form of Git, either homebrew or from a service provider like GitLab. 


Given the volume and magnitude of changes across repos, “we need git to start operating in a more continuous mode,” Karayanev wrote in an email interview.

Git operations, especially when used in GitOps-style automated deployments, still need to be managed by people. Updates, commits, pushes, merges are often yoked into sequences of “stop/go” episodes where someone has to hit enter on the keyboard a few times to continue the workflow, Karayanev noted. This model may not hold up once agents start getting priority. 

A butler for Git

Git has always had its share of critics, especially those who use the tool daily. 

There may not be another piece of software that is so widely adopted and yet so inscrutable. Torvalds and other Linux kernel developers built Git in 2005 after frustrations with trying to shoehorn Linux code into the commercial BitKeeper tool. Linux, a global group project of mammoth proportions, required a distributed version control system able to support non-linear development of thousands of parallel branches. 


Like any distributed system, Git can be difficult to understand.

Scott Chacon, one of the co-founders of GitHub, co-wrote a book on using Git (2009’s Pro Git), and still finds himself occasionally flummoxed by the version control system.

There are still “sharp edges” to Git, Chacon told The Register. “There’s a lot of stuff that it doesn’t do very well from a usability standpoint,” he said. 

Chacon co-founded GitButler as a way to “rethink the porcelain” of Git, to make Git more suitable to modern workflows. (Last month, GitButler received $17 million in venture capital funding). 


Think of GitButler as a super-powered Git client. It allows the developer to work on two different branches simultaneously, using a technique called virtual branching. It reconciles the code a developer is working on with the upstream code. They can reorder commits, or edit the comments of a previous commit. It offers richer metadata about the files being worked on. It can show which commits are unique to that branch.
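The virtual-branch idea can be sketched conceptually: several in-progress branches share one working tree, yet each still knows which changes are its own, so any one of them can be committed independently. This is a toy model of the concept, not GitButler’s actual implementation; all names below are invented.

```python
# Toy illustration of virtual branching: two branches' edits coexist in
# one working tree, and each branch can still be committed on its own.

class VirtualBranch:
    def __init__(self, name):
        self.name = name
        self.changes = {}          # path -> new file content

    def edit(self, path, content):
        self.changes[path] = content

def working_tree(base, branches):
    """Combine the base tree with every virtual branch's changes."""
    tree = dict(base)
    for b in branches:
        tree.update(b.changes)
    return tree

base = {"README.md": "v1", "app.py": "print('hi')"}
feature = VirtualBranch("feature-x")
hotfix = VirtualBranch("hotfix")
feature.edit("app.py", "print('hello')")
hotfix.edit("README.md", "v2")

# The developer sees both sets of edits at once:
tree = working_tree(base, [feature, hotfix])
# But committing "hotfix" alone takes only its own changes:
hotfix_commit = {**base, **hotfix.changes}
```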

Best of all, it eliminates what many developers call “rebase hell,” where merges into an updated codebase must be checked one at a time, a problem GitButler solves by keeping the user’s code synchronized with what is upstream.

Many of these actions GitButler offers can be done through the Git command itself – although Git’s command language, and its rules, can be so obtuse that “you will probably make a mistake at some point,” Chacon said.

A Git for agents

Chacon believes GitHub’s current reliability issues stem from the current tsunami of agentic work. 


This is “ironic” because GitHub was built to scale Git, he said. “But an influx of agents is pushing the service to the brink.”

The problem lies not with Git itself, but with everyone using one service, Chacon argued. Last year, GitHub had about 180 million users working across 630 million repositories – with 121 million created in 2025 alone, according to the company’s most recent annual Octoverse report. 

“From the longer-term perspective, it doesn’t need to be like this,” he argued. Maybe Git should be run locally, mirrored globally and managed with clients … such as GitButler, Chacon suggested. Perhaps Git-based version control systems could be customized for specific industry verticals. 

We need to think about how we “distribute these systems more,” he said. “Git is designed to be distributed but we’re not distributing it.”


GitButler has created a command line interface specifically for agents. It was designed to give MCP servers an integrated map of the repository, which otherwise would require stitching together multiple Git commands. The Virtual Files concept allows the agent to work on a section of code that is also being worked on by a developer, or another agent.

These are changes that point to a rethinking of how a Git workflow should run. 

“I think all of these systems should fundamentally change, because all of our workflows have changed, right? There needs to be different, sort of primitives for how to deal with these problem sets,” Chacon said. 

A tip from gaming development

One company that wants its platform to replace Git altogether is Diversion, which has built an eponymous distributed version control system initially pitched for large-scale game design.

Advertisement

“Git’s architecture is actually an issue that prevents scaling,” argues Diversion CEO Sasha Medvedovsky in an interview with The Register. “Fundamentally it’s an architecture problem that can’t be fixed and is a bottleneck for end users and hosting services.”

Git is a distributed system insofar as every user, or hosted service, requires a dedicated database (much like blockchain). “It’s not distributed in the regular sense but rather replicated,” he wrote in an exchange with The Register on LinkedIn. 

Operations run on a single thread, making concurrent operations impossible. As a result, the larger the repository, the slower the commit operations – a deadly combination for fast-paced agentic software development, Medvedovsky noted. 

Of course, every CEO will have their talking points ready about a competitor’s weaknesses (Diversion is finalizing a blog post with hard numbers about Git and GitHub performance). But there are a growing number of other initiatives around prepping Git for the challenging times ahead.


Perhaps most notable is Jujutsu, a Git-compatible distributed version control system, stewarded by Google senior software engineer Martin von Zweigbergk. Like GitButler, Jujutsu (jj) aims to eliminate a lot of the annoyances that come with Git. It includes an undo button and the ability to keep committing even when there is a conflict. 

And because everything written in C must be recast into Rust these days, long-time Git contributor Sebastian Thiel started a project called Gitoxide to rebuild Git in Rust. Potential benefits include significant performance improvements through multicore processing, and the much-needed memory safety that comes with Rust. 

Will Git 3 solve all the problems?

Git’s chief maintainer is Junio Hamano, who took the reins from Torvalds in 2005. And he remains busy keeping Git current.

At FOSDEM this February, core Git contributor and GitLab engineering manager Patrick Steinhardt discussed some of the changes coming in the next version of Git, version 3, which is gradually being rolled out this year. 


One of the chief improvements will be in the way Git manages the commit references, the IDs that point to each change being made. Surprisingly, this operation is a real bottleneck for the software. “The design is inefficient,” Steinhardt told the audience.

Every time a programmer commits a code change, the updated reference is recorded in a “packed-refs” file, which saves time by consolidating references into one file rather than giving each its own. 

As projects grow larger, however, it takes longer for Git to amend or delete a reference in packed-refs. (One GitLab repo has a packed-refs file of more than 20 million references, Steinhardt said.) 

This is especially problematic when you have multiple, simultaneous readers and writers of that file. And just forget about getting a consistent view of all the references. 
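To see why a single packed file scales poorly, here is a hypothetical Python sketch (not Git's real code or its actual on-disk format): because every entry lives in one flat file, changing a single reference means reading and rewriting every line.

```python
# Simplified model of a packed-refs-style file: each line maps an
# object hash to a ref name. Updating one ref still costs O(total
# entries), because the whole file must be rewritten.

def update_packed_ref(lines, ref_name, new_hash):
    """Return a new list of 'hash refname' lines with one ref updated.

    Mirrors the cost profile of a packed file: even a one-line change
    copies every other entry.
    """
    out = []
    for line in lines:
        obj_hash, name = line.split(" ", 1)
        if name == ref_name:
            out.append(f"{new_hash} {name}")  # the one real change
        else:
            out.append(line)                  # everything else copied anyway
    return out

refs = [
    "1111111 refs/heads/main",
    "2222222 refs/heads/feature",
    "3333333 refs/tags/v1.0",
]
updated = update_packed_ref(refs, "refs/heads/feature", "9999999")
print(updated[1])  # prints: 9999999 refs/heads/feature
```

With 20 million entries instead of three, that copy loop is exactly the bottleneck Steinhardt describes.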


The freshly implemented Reftable feature, which will be the default in Git 3.0, stores references in an indexable binary format. The Git folks borrowed this concept from the Eclipse Foundation’s JGit Java implementation of Git. 

Reftable allows for block updates, eliminating the need to rewrite a 2 GB file for a single entry. And it is much faster for reading, which would pave the way for Git supporting larger, more sprawling repositories – perfect for an ever-busy agentic workforce.
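The block-and-index idea behind reftable can be sketched in a few lines of Python. This is a toy illustration, not the real reftable format: sorted references are grouped into fixed-size blocks, and an index of each block's first key lets an update touch one block instead of the whole file.

```python
# Toy block-indexed ref store (illustrative only, not Git's reftable).
import bisect

def build_blocks(refs, block_size=2):
    """Group sorted (name, hash) pairs into blocks; index by first key."""
    items = sorted(refs.items())
    blocks = [dict(items[i:i + block_size])
              for i in range(0, len(items), block_size)]
    index = [min(block) for block in blocks]
    return blocks, index

def update_ref(blocks, index, name, new_hash):
    """Binary-search the index, then rewrite just the matching block."""
    i = bisect.bisect_right(index, name) - 1
    blocks[i][name] = new_hash  # only this block is touched
    return i

refs = {
    "refs/heads/dev": "aaa",
    "refs/heads/main": "bbb",
    "refs/tags/v1.0": "ccc",
    "refs/tags/v2.0": "ddd",
}
blocks, index = build_blocks(refs)
touched = update_ref(blocks, index, "refs/tags/v1.0", "eee")
print(touched)  # prints: 1 (the second block; the first was never read)
```

The same index makes reads fast too: a lookup inspects one block rather than scanning millions of entries.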

For nearly two decades, Git has proved to be the version control system of choice for geeks worldwide. But even with these new features and various third-party enhancements, can it retain relevance for a new generation of agentically enhanced coders? 

The battle is on. ®


Tech

‘What is so worrying is not just that this information is being sold, but how little it can cost’: NordVPN research claims your stolen card details are being sold online for less than a fancy coffee

  • Bank cards and IDs are easily available, and cheap, on the dark web, NordVPN warns
  • UK citizens are a major target, with their data worth more
  • The best fix is to secure your online accounts

We already know the risks of using the internet, and that basic cybersecurity hygiene can do much of the heavy lifting in keeping you safe. Now NordVPN has revealed exactly what happens to your data after it's been stolen.

The company’s research found that stolen UK payment card details are now commonly sold on dark web marketplaces for around £9 (around $12), with more complete ‘digital identity packs’ going for around £30 ($40).


Tech

Digital chief for England’s schools. Must enjoy data, AI, and concrete problems


Are you ready to RAAC?

England’s Department for Education is advertising a role paying up to £200,000 a year to lead a new digital and infrastructure group overseeing school buildings and maintenance, as well as technology and data.

Its Director General, Digital and Infrastructure, will lead the technology function of around 1,800 staff, develop a new strategy covering digital services, data, and artificial intelligence, and lead work on a unique identifier for children and other learners in England. Scotland, Wales, and Northern Ireland run education services on a devolved basis.


The successful candidate will also implement a new strategy for “the education estate” of schools, colleges, nurseries, and children’s homes. The job ad warns the function “carries some of the highest levels of risk and accountability in the department – including life-and-death decisions on safety,” citing ongoing work to remove unsafe reinforced autoclaved aerated concrete (RAAC) from schools.

“I am looking for a leader who is motivated by impact – someone who is able to combine their digital and data expertise with their drive to improve outcomes for children and young people,” writes the department’s permanent secretary, Susan Acland-Hood, in a briefing document with the advert. “Whilst you do not need to be an expert on education policy, you need to be curious and committed to rapidly building your understanding of the latest evidence, system, and policy landscape.”

The department is willing to base the job in Bristol, Cambridge, Coventry, Darlington, London, Manchester, Nottingham, or Sheffield, although those who do not work in the capital will need to go there frequently. Applications close on June 1.

Several other departments have recently advertised digital director-general posts, the civil service job category just below permanent secretary (equivalent to chief executive). In January, England’s Department of Health and Social Care advertised the role of director general for technology, digital and data with a salary of up to £285,000 a year.


In February, the Ministry of Defence offered £270,000 to £300,000 for its chief digital and information officer job. And in April, the Department for Science, Innovation and Technology advertised for three directors-general, one paying £174,000 and the other two between £200,000 and £260,000 annually. ®


Tech

This Is Toyota’s Cheapest Model Available In 2026


There’s nothing wrong with wanting a bargain when you buy a new vehicle. Finding a reasonably priced car definitely helps ease the pocketbook, especially when everything else just gets more and more expensive. To get the most bang for your buck here, then, you’ll want something from a brand known for making affordable and reliable products, and any list of those brands must include Toyota — after all, it dethroned Subaru as Consumer Reports’ most reliable brand in 2025.

Toyota produces several vehicles that won’t break the bank for the 2026 model year, such as the Camry and Prius, which both have starting prices under $30,000. However, if you want the absolute cheapest Toyota, you need to turn to the 2026 Toyota Corolla. Toyota has been producing the Corolla since the mid-1960s, and it’s been a wonderful budget-friendly option for those wanting a compact car from the beginning. As of mid-2026, the Corolla is the tenth best-selling vehicle in the United States for the year, and JD Power ranks it as the most dependable compact car on the market.


The 2026 Toyota Corolla has a starting price of just $23,125 (plus a $1,295 delivery, processing, and handling fee) for the base LE trim. That makes it $1,455 less than the second-cheapest Toyota, the 2026 Corolla Hatchback. While that’s definitely very affordable, going with the cheapest possible Corolla means you’ll be missing out on quite a few options available on higher-priced variants.


What you do and don’t get with the cheapest Corolla

There’s a lot that comes standard with even the most basic 2026 Toyota Corolla LE. Maybe most surprisingly, you get treated to a vast array of safety features, like blind spot monitoring, cross-traffic alerts, pedestrian detection, lane tracing assist, and more. You also get to enjoy the conveniences of Apple CarPlay or Android Auto. These may be enough to entice some buyers, but there are some features that the base LE trim simply cannot provide.

One of the sillier ways Toyota can get more money out of you is with what color you want your car to be. The Corolla comes in eight colors, but two of them cost extra. If you want a car that is Ruby Flare Pearl or White Chill Pearl, that’ll cost you an additional $475. You will also have to pay more for your Corolla LE if you want a push-button start, keyless entry, or wireless charging for your devices. Those are all available in a premium package that costs $1,135.

Then there are all the things you have to choose a different, more expensive trim to get. If you want a hybrid powertrain, you’re looking at a starting price of $24,975 (plus the $1,295 fee). Some features, like the upgraded JBL audio system, aren’t even available as an upgrade on the LE and will require a higher-end trim. The Corolla LE doesn’t even offer variable speeds for your intermittent windshield wipers. You do get a lot for a low price with the 2026 Toyota Corolla LE, but you can’t get everything.


Tech

Your Browser Probably Lies To The Big Sites (Blame Chrome)


When you visit certain large sites in Firefox or Safari, the browser may detect your visit and change its behavior. It could be as simple as lying about its identity, or it may totally change how it renders the page. But according to a post by [Den Odel], this isn’t a conspiracy between browsers and big Internet — rather, it is a byproduct of Chrome’s dominance.

Here’s how it goes. Chrome puts out a new feature and everyone rushes to implement it on their site. Maybe the new code breaks other browsers. Maybe the other browser supports the feature, but the website doesn’t detect it correctly or is unaware. Maybe it just relies on some quirk of Chrome. Regardless, Firefox and Safari will change to match the site rather than mess up the user’s experience.

If you want to check it out, Firefox will show you what it does and let you disable specific fixes if you visit the about:compat URL. For Safari, you’ll have to read code from a file named quirks. Bugzilla tracks the fixes for Firefox, if you want more details.
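Conceptually, these site-specific interventions boil down to a lookup table keyed by hostname. The sketch below is hypothetical Python (the hostnames and UA strings are invented, and real browsers implement this in C++ inside the engine), showing how a browser might swap in a Chrome-like User-Agent for sites known to sniff for Chrome:

```python
# Hypothetical model of a per-site UA intervention list, in the spirit
# of Firefox's about:compat entries. Hostnames here are made up.

REAL_UA = ("Mozilla/5.0 (X11; Linux x86_64; rv:128.0) "
           "Gecko/20100101 Firefox/128.0")
SPOOFED_UA = ("Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 "
              "(KHTML, like Gecko) Chrome/126.0 Safari/537.36")

# Site-specific overrides: only listed hosts get the Chrome-like string.
UA_OVERRIDES = {
    "example-video-site.com": SPOOFED_UA,
}

def user_agent_for(host):
    """Return the UA string the browser should present to this host."""
    return UA_OVERRIDES.get(host, REAL_UA)

print(user_agent_for("example-video-site.com"))  # Chrome-like string
print(user_agent_for("other.example"))           # honest Firefox string
```

The same table-driven pattern extends beyond UA strings to CSS or script patches applied only on matching hosts, which is roughly what the quirks lists track.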


Browsers are huge and complex, so even niche browsers today usually use one of a handful of rendering engines. The question, it seems, isn't whether a big company should control the way the web works, but which one currently dominates.
