Anthropic on Thursday released Claude Opus 4.6, a major upgrade to its flagship artificial intelligence model that the company says plans more carefully, sustains longer autonomous workflows, and outperforms competitors including OpenAI’s GPT-5.2 on key enterprise benchmarks — a release that arrives at a tumultuous moment for the AI industry and global software markets.
The launch comes just three days after OpenAI released its own Codex desktop application in a direct challenge to Anthropic’s Claude Code momentum, and amid a $285 billion rout in software and services stocks that investors attribute partly to fears that Anthropic’s AI tools could disrupt established enterprise software businesses.
For the first time, Anthropic’s Opus-class models will feature a 1 million token context window, allowing the AI to process and reason across vastly more information than previous versions. The company also introduced “agent teams” in Claude Code — a research preview feature that enables multiple AI agents to work simultaneously on different aspects of a coding project, coordinating autonomously.
“We’re focused on building the most capable, reliable, and safe AI systems,” an Anthropic spokesperson told VentureBeat about the announcements. “Opus 4.6 is even better at planning, helping solve the most complex coding tasks. And the new agent teams feature means users can split work across multiple agents — one on the frontend, one on the API, one on the migration — each owning its piece and coordinating directly with the others.”
Why OpenAI and Anthropic are locked in an all-out war for enterprise developers
The release intensifies an already fierce competition between Anthropic and OpenAI, the two most valuable privately held AI companies in the world. OpenAI on Monday released a new desktop application for its Codex artificial intelligence coding system, a tool the company says transforms software development from a collaborative exercise with a single AI assistant into something more akin to managing a team of autonomous workers.
AI coding assistants have exploded in popularity over the last year, and OpenAI said more than 1 million developers have used Codex in the past month. The new Codex app is part of OpenAI’s ongoing effort to lure users and market share away from rivals like Anthropic and Cursor.
The timing of Anthropic’s release — just 72 hours after OpenAI’s Codex launch — underscores the breakneck pace of competition in AI development tools. OpenAI faces intensifying competition from Anthropic, which posted the largest share increase of any frontier lab since May 2025, according to a recent Andreessen Horowitz survey. Forty-four percent of enterprises now use Anthropic in production, driven by rapid capability gains in software development since late 2024. The desktop launch is a strategic counter to Claude Code’s momentum.
According to Anthropic’s announcement, Opus 4.6 achieves the highest score on Terminal-Bench 2.0, an agentic coding evaluation, and leads all other frontier models on Humanity’s Last Exam, a complex multi-discipline reasoning test. On GDPval-AA — a benchmark measuring performance on economically valuable knowledge work tasks in finance, legal and other domains — Opus 4.6 outperforms OpenAI’s GPT-5.2 by approximately 144 Elo points, which translates to obtaining a higher score approximately 70% of the time.
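That roughly-70% figure follows directly from the standard Elo expected-score formula; a quick sketch, where the 144-point gap is the only input taken from the announcement:

```python
def elo_win_probability(delta: float) -> float:
    """Expected win rate for a competitor holding a `delta`-point Elo advantage."""
    return 1.0 / (1.0 + 10.0 ** (-delta / 400.0))

# A 144-point advantage implies winning roughly 70% of head-to-head comparisons.
print(f"{elo_win_probability(144):.1%}")  # → 69.6%
```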
Claude Opus 4.6 leads or matches competitors across most benchmark categories, according to Anthropic’s internal testing. The model showed particular strength in agentic tasks, office work and novel problem-solving. (Source: Anthropic)
Inside Claude Code’s $1 billion revenue milestone and growing enterprise footprint
The stakes are substantial. Asked about Claude Code’s financial performance, the Anthropic spokesperson noted that in November, the company announced that Claude Code reached $1 billion in run rate revenue only six months after becoming generally available in May 2025.
The spokesperson highlighted major enterprise deployments: “Claude Code is used by Uber across teams like software engineering, data science, finance, and trust and safety; wall-to-wall deployment across Salesforce’s global engineering org; tens of thousands of devs at Accenture; and companies across industries like Spotify, Rakuten, Snowflake, Novo Nordisk, and Ramp.”
That enterprise traction has translated into skyrocketing valuations. Earlier this month, Anthropic signed a term sheet for a $10 billion funding round at a $350 billion valuation. Bloomberg reported that Anthropic is simultaneously working on a tender offer that would allow employees to sell shares at that valuation, offering liquidity to staffers who have watched the company’s worth multiply since its 2021 founding.
How Opus 4.6 solves the ‘context rot’ problem that has plagued AI models
One of Opus 4.6’s most significant technical improvements addresses what the AI industry calls “context rot”: the degradation of model performance as conversations grow longer. Anthropic says Opus 4.6 scores 76% on MRCR v2, a needle-in-a-haystack benchmark testing a model’s ability to retrieve information hidden in vast amounts of text, compared to just 18.5% for Sonnet 4.5.
“This is a qualitative shift in how much context a model can actually use while maintaining peak performance,” the company said in its announcement.
The model also supports outputs of up to 128,000 tokens — enough to complete substantial coding tasks or documents without breaking them into multiple requests.
For developers, Anthropic is introducing several new API features alongside the model: adaptive thinking, which allows Claude to decide when deeper reasoning would be helpful rather than requiring a binary on-off choice; four effort levels (low, medium, high, max) to control intelligence, speed and cost tradeoffs; and context compaction, a beta feature that automatically summarizes older context to enable longer-running tasks.
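To make those controls concrete, here is a hypothetical request payload. The model ID and the 128,000-token output ceiling come from the announcement, but the exact field names below (`effort`, the adaptive-thinking setting, the compaction toggle) are illustrative guesses, not confirmed API parameters; consult Anthropic’s API documentation for the real request shapes.

```python
# Hypothetical payload sketch. Field names "effort", "thinking", and
# "context_management" are inferred from the article, not confirmed API params.
request = {
    "model": "claude-opus-4-6",          # model ID from the announcement
    "max_tokens": 128_000,               # maximum supported output length
    "effort": "high",                    # one of: low, medium, high, max
    "thinking": {"type": "adaptive"},    # let the model decide when to reason deeply
    "context_management": {"compaction": True},  # beta: summarize older context
    "messages": [{"role": "user", "content": "Refactor the payment module."}],
}

assert request["effort"] in {"low", "medium", "high", "max"}
```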
Opus 4.6 dramatically outperformed its predecessor on tests measuring how well models retrieve information buried in long documents — a key capability for enterprise coding and research tasks. (Source: Anthropic)
Anthropic’s delicate balancing act: Building powerful AI agents without losing control
Anthropic, which has built its brand around AI safety research, emphasized that Opus 4.6 maintains alignment with its predecessors despite its enhanced capabilities. On the company’s automated behavior audit measuring misaligned behaviors such as deception, sycophancy, and cooperation with misuse, Opus 4.6 “showed a low rate” of problematic responses while also achieving “the lowest rate of over-refusals — where the model fails to answer benign queries — of any recent Claude model.”
When asked how Anthropic thinks about safety guardrails as Claude becomes more agentic, particularly with multiple agents coordinating autonomously, the spokesperson pointed to the company’s published framework: “Agents have tremendous potential for positive impacts in work but it’s important that agents continue to be safe, reliable, and trustworthy. We outlined our framework for developing safe and trustworthy agents last year which shares core principles developers should consider when building agents.”
The company said it has developed six new cybersecurity probes to detect potentially harmful uses of the model’s enhanced capabilities, and is using Opus 4.6 to help find and patch vulnerabilities in open-source software as part of defensive cybersecurity efforts.
Anthropic says its newest model exhibits the lowest rate of problematic behaviors — including deception and sycophancy — of any Claude version tested, even as capabilities have increased. (Source: Anthropic)
Sam Altman vs. Dario Amodei: The Super Bowl ad battle that exposed AI’s deepest divisions
The rivalry between Anthropic and OpenAI has spilled into consumer marketing in dramatic fashion. Both companies will feature prominently during Sunday’s Super Bowl. Anthropic is airing commercials that mock OpenAI’s decision to begin testing advertisements in ChatGPT, with the tagline: “Ads are coming to AI. But not to Claude.”
OpenAI CEO Sam Altman responded by calling the ads “funny” but “clearly dishonest,” posting on X that his company would “obviously never run ads in the way Anthropic depicts them” and that “Anthropic wants to control what people do with AI” while serving “an expensive product to rich people.”
The exchange highlights a fundamental strategic divergence: OpenAI has moved to monetize its massive free user base through advertising, while Anthropic has focused almost exclusively on enterprise sales and premium subscriptions.
The $285 billion stock selloff that revealed Wall Street’s AI anxiety
The launch occurs against a backdrop of historic market volatility in software stocks. A new AI automation tool from Anthropic PBC sparked a $285 billion rout in stocks across the software, financial services and asset management sectors on Tuesday as investors raced to dump shares with even the slightest exposure. A Goldman Sachs basket of US software stocks sank 6%, its biggest one-day decline since April’s tariff-fueled selloff.
The selloff was triggered by Anthropic’s launch on Friday of plug-ins for its Claude Cowork agent, which enable automated tasks across legal, sales, marketing and data analysis. The move underscored the AI industry’s growing push into industries that can unlock the lucrative enterprise revenue needed to fund massive investments in the technology.
Thomson Reuters plunged 15.83% Tuesday, its biggest single-day drop on record, and LegalZoom sank 19.68%. European legal software providers including RELX, owner of LexisNexis, and Wolters Kluwer experienced their worst single-day performances in decades.
Not everyone agrees the selloff is warranted. Nvidia CEO Jensen Huang said on Tuesday that fears AI would replace software and related tools were “illogical” and “time will prove itself.” Mark Murphy, head of U.S. enterprise software research at JPMorgan, said in a Reuters report it “feels like an illogical leap” to say a new plug-in from an LLM would “replace every layer of mission-critical enterprise software.”
What Claude’s new PowerPoint integration means for Microsoft’s AI strategy
Among the more notable product announcements: Anthropic is releasing Claude in PowerPoint in research preview, allowing users to create presentations using the same AI capabilities that power Claude’s document and spreadsheet work. The integration puts Claude directly inside a core Microsoft product — an unusual arrangement given Microsoft’s 27% stake in OpenAI.
The Anthropic spokesperson framed the move pragmatically in an interview with VentureBeat: “Microsoft has an official add-in marketplace for Office products with multiple add-ins available to help people with slide creation and iteration. Any developer can build a plugin for Excel or PowerPoint. We’re participating in that ecosystem to bring Claude into PowerPoint. This is about participating in the ecosystem and giving users the ability to work with the tools that they want, in the programs they want.”
Claude’s new PowerPoint integration, shown here analyzing a market research slide, places Anthropic’s AI directly inside a flagship Microsoft product — despite Microsoft’s major investment in rival OpenAI. (Source: Anthropic)
The data behind enterprise AI adoption: Who’s winning and who’s losing ground
Data from a16z’s recent enterprise AI survey suggests both Anthropic and OpenAI face an increasingly competitive landscape. While OpenAI remains the most widely used AI provider in the enterprise, with approximately 77% of surveyed companies using it in production in January 2026, Anthropic’s adoption is rising rapidly — from near-zero in March 2024 to approximately 40% using it in production by January 2026.
The survey data also shows that 75% of Anthropic’s enterprise customers are using it in production, with 89% either testing or in production — figures that comfortably exceed OpenAI’s 46% in-production and 73% testing-or-in-production rates among its own customer base.
Enterprise spending on AI continues to accelerate. Average enterprise LLM spend reached $7 million in 2025, up 180% from $2.5 million in 2024, with projections suggesting $11.6 million in 2026 — a 65% increase year-over-year.
OpenAI remains the dominant AI provider in enterprise settings, but Anthropic’s share has surged from near zero in early 2024 to roughly 40 percent of companies using it in production by January 2026. (Source: Andreessen Horowitz survey, January 2026)
Pricing, availability, and what developers need to know about Claude Opus 4.6
Opus 4.6 is available immediately on claude.ai, the Claude API, and major cloud platforms. Developers can access it via claude-opus-4-6 through the API. Pricing remains unchanged at $5 per million input tokens and $25 per million output tokens, with premium pricing of $10/$37.50 for prompts exceeding 200,000 tokens using the 1 million token context window.
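At those rates, estimating a request’s cost is simple arithmetic. A minimal sketch, assuming the premium tier applies to the whole request once its prompt crosses the 200,000-token threshold — the announcement doesn’t spell out the exact boundary rules, so check Anthropic’s pricing documentation before relying on this:

```python
def opus_46_cost_usd(input_tokens: int, output_tokens: int) -> float:
    """Estimated request cost from the quoted per-million-token rates.

    Assumption: the $10 / $37.50 long-context rates apply to the entire
    request whenever the prompt exceeds 200,000 tokens.
    """
    premium = input_tokens > 200_000
    in_rate, out_rate = (10.00, 37.50) if premium else (5.00, 25.00)
    return (input_tokens * in_rate + output_tokens * out_rate) / 1_000_000

print(f"${opus_46_cost_usd(100_000, 20_000):.2f}")  # standard tier → $1.00
print(f"${opus_46_cost_usd(500_000, 20_000):.2f}")  # long-context tier → $5.75
```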
For users who find Opus 4.6 “overthinking” simpler tasks — a characteristic Anthropic acknowledges can add cost and latency — the company recommends adjusting the effort parameter from its default high setting to medium.
The recommendation captures something essential about where the AI industry now stands. These models have grown so capable that their creators must now teach customers how to make them think less. Whether that represents a breakthrough or a warning sign depends entirely on which side of the disruption you’re standing on — and whether you remembered to sell your software stocks before Tuesday.
When you think wearables from the likes of Google, Motorola and Samsung, you probably think earbuds and maybe watches. But in the age of AI, a whole new world of wearable tech is coming to life, and we could see these companies soon branch out to make AI-powered pins, pendants and other unexpected gadgets too.
This new generation of wearable tech will be made possible by Qualcomm, which on Monday announced the latest version of its wearables chip, the Snapdragon Wear Elite, at Mobile World Congress in Barcelona. This new platform will be used by a range of partners, including Google, Motorola and Samsung to design a constellation of new devices.
Qualcomm’s philosophy toward wearables is very much “build it, and they will come.” It makes the underlying technology that will power devices and will then encourage companies to build on top of it how they see fit.
When I attended the company’s Snapdragon Summit in Hawaii last year, Qualcomm Chief Marketing Officer Don McGuire painted me a picture of how he imagines the convergence of AI and wearables playing out.
“AI is going to be ambient in a lot of ways,” he told me. It might not even be called a “device” if it’s something woven into your clothing or worn on your person. “There’s lots of ideas out there floating around,” he said.
At the same event, Dino Bekis, who runs Qualcomm’s wearables business, introduced me to the Looki L1 — a life-logging camera created with the company’s W5 Gen 2 chip. This is the wearables platform Qualcomm introduced last year, which was designed to work with Google’s Wear OS and launched with the Pixel Watch.
Unlike its predecessor, the new Wear Elite chip will work across Google’s Wear OS, Android and Linux, with a neural processing unit that enables on-device AI with low power consumption. This is key for wearable devices, which you don’t necessarily want to charge every day. Qualcomm says the Wear Elite’s advanced power management enables 30% longer battery use, compared to the previous version, with rapid charging bringing devices to 50% in around ten minutes.
“The Snapdragon Wear Elite platform opens new possibilities, delivering the performance, battery life and connectivity essential for the next generation of Wear OS,” said Bjørn Kilburn, general manager of Wear OS by Google, in a statement.
The first devices powered by the Wear Elite chip should be available in the coming months, with Motorola saying it will use the platform to build more AI wearables like its Project Maxwell concept, shown at CES in January, and Samsung saying it will integrate Wear Elite into the next Galaxy Watch. That will make the watch “an even more holistic wellness companion,” said InKang Song, EVP and head of tech strategy at Samsung.
Samsung and Google might be focused on watches, but Snapdragon Wear Elite points to a future halo of personal wearables, which CNET Editor at Large Scott Stein has explored in more detail. The possibilities stretch beyond what we’ve seen so far as this latest platform is embraced by companies big and small. I’ll be looking for demos making use of the new chip this week at MWC, so stay tuned for more.
The Klipsch Flexus Core 200 ($549) is about as sensible as soundbars get. If you’re wondering what I mean by that, the Core 200 offers up all the essentials: Dolby Atmos processing, up-firing speakers for height effects, and HDMI eARC connectivity in a reasonably compact and powerful package. And by stripping out the things that not everyone needs in a soundbar – specifically, extra HDMI ports and built-in Wi-Fi for streaming music – Klipsch managed to hit an affordable price point with the Flexus Core 200. They don’t call it Core for nothing.
Sandwiched between the 2.1-channel Core 100 ($349) and 5.1.2-channel Core 300 ($1,199) in the Flexus soundbar lineup, the Core 200 walks the line between basic TV sound enhancement and full-on Atmos immersion. It can also be scaled up to a 5.1.2- or even a 5.1.4-channel configuration by adding an optional Klipsch wireless subwoofer and surround speakers. For this review, I paired it with the Flexus SUB 100 subwoofer ($349) and Flexus SURR 100 rear speakers ($249/pair). Total system price: $1,175.
What Is It?
The Klipsch Flexus Core 200 is a 3.1.2-channel powered soundbar that decodes Dolby Atmos and legacy Dolby Digital and PCM formats. DTS:X is not supported, an omission some may find disappointing now that support for that format has been added to movies on the Disney+ streaming service.
As I mentioned above, there’s no Wi-Fi onboard for music listening via TIDAL Connect, Spotify Connect and other services, though Bluetooth is onboard for basic streaming of music, internet radio and podcasts. Bluetooth is also used by the Klipsch Connect setup app and for the wireless hookup between the soundbar, subwoofer and rear speakers.
The 44-inch-wide Flexus Core 200 soundbar is a good fit for 55-inch and larger TVs.
At 44 inches (111.8 cm) wide, 3 inches (7.8cm) high, and 5 inches (12.6cm) deep, the Core 200 mates well visually with 55- or 65-inch TVs, though its somewhat chonky 3-inch height means you may need a TV with an adjustable stand to provide sufficient screen clearance. The soundbar’s attractive cabinet is made of plastic, wood, and metal, and there are black and walnut finish options.
Being a Klipsch soundbar, there’s going to be a horn somewhere, and in this case it’s the center speaker’s 0.75-inch horn-loaded tweeter, which is flanked by two 2.25-inch aluminum cone drivers. The same 2.25-inch drivers are also used for the left and right speakers and up-firing elevation speakers, while a pair of 4-inch paper cone woofers bring the bass. Onboard power for the Onkyo-designed amplifier section is specified at 185 watts (RMS) and frequency response at 43Hz-20kHz.
The Flexus Transport USB transmitter (at right) is included with Klipsch’s Flexus SUB subwoofers and Flexus Surr rear speakers.
Along with the Core 200’s HDMI eARC port, there’s an optical digital input, an RCA output for a hardwired subwoofer connection, a USB-C port (service only) and a USB Type-A port to plug in the wireless dongle that connects the SUB 100 subwoofer and SURR 100 rear speakers.
Controls located on the Core 200’s top surface let you toggle power on and off, switch inputs and adjust the volume level. A large alphanumeric LED display located on the front provides visual feedback when making adjustments using the Klipsch Connect app or the included remote control – a useful and very welcome feature that’s not always provided on soundbars, including ones priced significantly higher than the Core 200.
Basic touch-sensitive controls are located on the Core 200 soundbar’s top surface.
The Klipsch Flexus SUB 100 I used for my test packs a 10-inch paper cone woofer powered by an 80-watt (RMS) class D amplifier in a sealed enclosure. It comes with the Flexus Transport USB transmitter used for the soundbar’s wireless connection, and it also has an RCA input for a hardwired hookup. At 13.25 inches (33.7cm) wide x 13.3 inches (33.8cm) high x 13.75 inches (35cm) deep, it’s a relatively compact cube and makes for a good visual match with the Core 200 soundbar.
The Klipsch Flexus SURR 100 speakers I used for rear channels are tiny, almost toy-like at 4.25 inches (10.8cm) wide x 6.75 inches (17.1cm) high x 4.25 inches (10.8cm) deep. Each speaker uses a 3-inch paper cone driver powered by 25 watts (RMS) and the package also comes with the Flexus Transport USB transmitter. Klipsch offers a beefier rear speaker option in the Flexus SURR 200 ($499/pair), a model that adds a 2.25-inch up-firing driver to the 3-inch front-facing one to convey Dolby Atmos height effects.
Klipsch Flexus SUB 100 Subwoofer (left) and SURR 100 Wireless Rear Speakers (right)
Setup and Use
I found setting up the Core 200 with Klipsch’s optional subwoofer and rear speakers to be super easy compared to other soundbar-based wireless surround systems I’ve tested. If you’re using just the Core 200, all that’s needed is to run a cable from your TV’s HDMI eARC/ARC port to the soundbar’s HDMI port. Connect it to power and you’re done – even the Klipsch Connect app is optional and not needed for setup.
If, like me, you’re extending the system with a wireless sub and speakers, you’ll need to insert one of the included USB transmitters into the soundbar’s USB type-A port and then press the Connect button located on the rear panel of both the subwoofer and rear speakers. An audio tone confirms that a wireless connection has been made and an LED indicator light, also located on the back, changes from a pulsing to a solid white.
For my setup, I had the Core 200 placed on a stand beneath a 75-inch TV in my 9 x 12 x 16 (H x W x D) foot viewing room, the SUB 100 in the front right corner, and the SURR 100s on stands to the right and left and slightly behind my sofa.
The Core 200 soundbar’s full-featured remote control.
Klipsch’s remote control provides access to most adjustments for tuning the soundbar for your viewing environment or whatever content you’re listening to or watching. There are buttons to select the Sound (Movie or Music) and Night (volume levelling) modes, and to configure Dialog level (1-3 or Off) plus front height, back left and right, and subwoofer level. All of these adjustments are indicated on the Core 200’s big, beautiful front LED display, and you can also adjust the display’s brightness using the remote.
The Klipsch Connect app features all the same adjustments, but further provides a three-band EQ with multiple presets plus a Custom setting. EQ may get scoffed at by audio purists, but I find it to be highly necessary for soundbars, where you regularly need to make adjustments for the differences in TV and movie soundtracks, as well as for any music you listen to.
The Core 200’s large alphanumeric LED display provides easy to read visual feedback.
Movie Performance
I started out my evaluation with the Klipsch soundbar alone before adding the subwoofer and rear speakers to the mix. The Core 200 had a nicely balanced presentation overall, with full bass and clear, natural-sounding dialogue. Atmos effects were also pronounced, especially with the soundbar’s height adjustments edged up toward maximum level.
These qualities served F1 well when I streamed it on Apple TV via my Apple TV 4K. In the movie’s opening scene, Sonny Hayes (Brad Pitt) races at Daytona to the strains of Led Zeppelin’s “Whole Lotta Love.” The roar of the car engines was vivid and clean, and I could easily hear a shifting level of spaciousness to the commentator’s voice as the action cut between interior and exterior shots. John Paul Jones’ bass came across as muscular and deep, with a level of dynamic power that was impressive for a standalone soundbar.
The Klipsch Connect app duplicates all the remote control functions and adds EQ adjustments (center).
Watching F1 revealed the Core 200’s ability to cast a tall and wide soundstage, especially when a fireworks display lit up the night sky during the race, but the presentation was mainly locked to the front of the room. This was evident when I watched the scene from the Dune: Part II 4K Blu-ray where Paul Atreides (Timothée Chalamet) hitches a ride on a giant sandworm – as worm and rider plowed through the desert landscape, the spray of sand was cast high and wide, but I didn’t feel overly immersed in the action.
It was the same deal when I watched the scene from the Twisters 4K Blu-ray where the doomed young meteorologists flee a monster tornado only to get sucked up one by one into the deadly funnel. The sound of the storm was powerful and dramatic, but I didn’t feel like I was in the eye of the storm. Dialogue in this complex and chaotic sequence also tended to get obscured, but a few hits of the Dialog button on the remote successfully boosted it to the point where I could hear it.
Movies with SUB & SURR
Watching the same clips with the SUB 100 and SURR 100 speakers added to the mix elevated the Core 200’s game to the point where I’d deem them indispensable. It’s not that Klipsch’s soundbar isn’t effective on its own; for the price, I’d even say it’s an overachiever. But adding those optional extras brought about a surprisingly effective increase in both dynamic range and surround immersion. I was actually caught off guard by it.
The Klipsch Flexus SURR 100 proved surprisingly potent for compact rear speakers.
Giving F1 another spin, the SUB 100 created a deep foundation of bass that added dimension to engine sounds and emphasis to John Bonham’s kick drum. The little SURR 100 speakers lit up as the cars circled the track, providing a strong sense of being positioned in the driver’s seat.
The Dune: Part II worm rodeo scene also benefited greatly from the speaker additions, with the sand now seeming to spray to the back of the room. Twisters, too, took on a new dimension: the trajectory of wind sounds now seeped from the front to the rear speakers, creating a much more vivid sense of being caught inside the storm.
Music Performance
For music, I decided to leave the full 5.1.2 configuration intact since I was mostly listening to Dolby Atmos music tracks on Apple Music (played via the Apple TV 4K). Also, the Core 200 automatically upconverts stereo tracks in both Music and Movie mode, so everything I listened to ended up being in surround sound format anyway.
With the Core 200 soundbar’s streaming options limited to Bluetooth, you’ll need to rely on an external streamer for lossless music listening.
I’m a fan of Ryan Ulyate’s Atmos mix of Tom Petty’s Wildflowers, which manages to subtly expand the stereo original while maintaining a rock-solid presentation of vocals and instruments. Heard on the Flexus Core 200 system, Petty’s voice on “It’s Good to be King” had the same dry, natural quality I’m used to hearing on higher-end setups. The piano maintained its clean, well-rounded tone, and the Atmos mix spread subtly towards the rear of the room in a way that added warmth to the sound.
Beck’s “She’s Gone,” also in Atmos, further confirmed my impression of the Core 200’s neutral, and mostly transparent, handling of music. Beck’s vocals sounded natural, with just a slight touch of reverb, and the acoustic guitar and harmonica had a crisp, clean tone. The bass guitar had a similar level of depth and punch as on “Whole Lotta Love” when I watched F1, but it gained a deeper, more authoritative foundation with the addition of the SUB 100 subwoofer.
To see how far I could flex that sub, I next played Deadmau5’s “Imaginary Friend” in stereo via the Apple TV 4K’s TIDAL app. For a compact sub with a 10-inch driver, the SUB 100 did an impressive job pressurizing the room and fleshing out the electronic beats. I could literally feel the bass hit in my chest. Upconverted for surround, the track gained a compelling sense of spaciousness, and the addition of a height dimension via the soundbar’s up-firing speakers gave it a nice wall of sound effect.
The Klipsch Flexus SUB 100 delivered impressive bass power for a compact sealed subwoofer with a 10-inch driver.
Shifting back to Atmos, I dug out my Pink Floyd Wish You Were Here (50th Anniversary) Blu-ray, which features a fantastic Dolby Atmos mix by the band’s longtime producer and engineer, James Guthrie. I had been a bit underwhelmed by the Core 200’s Atmos presentation of this disc when I had listened to it without the SUB 100 and SURR 100 speakers, but hearing “Welcome to the Machine” on the full system was a very different experience. The up-front vocals and guitar had a full, monolithic quality, floating well above the physical confines of the soundbar, while the synths stretched out well into the room and around my head. To me, “Welcome to the Machine” is about as good as Atmos music gets, and the Klipsch system did it justice.
Klipsch Flexus Core 200 Soundbar System with SUB 100 Subwoofer and SURR 100 Wireless Rear Speakers for 5.1.2 Dolby Atmos
The Bottom Line
If I haven’t already made this clear enough, the Klipsch Flexus Core 200 soundbar’s performance takes a big leap forward when augmented by the Flexus SUB 100 subwoofer and Flexus SURR 100 rear speakers. That’s not to knock the Core 200, which performs very well for a 3.1.2-channel soundbar, especially one priced at $549. I’m sure many folks would be more than satisfied with its standalone sound, and also with its ease of setup and use.
Are there crucial features missing from the Core 200? Aside from DTS:X support, it would be nice to have built-in Wi-Fi for streaming, so you could use your phone to cue up music without having to rely on lossy Bluetooth for playback. Wi-Fi is a feature found on the Sonos Beam Gen 2 ($499), which lets you stream lossless music from a wide range of apps, and also brings support for AirPlay 2. In my case it was easy enough to use my Apple TV 4K for lossless and Dolby Atmos music streaming, but not everyone will want to deal with an external streamer.
Even without Wi-Fi for music streaming, the Klipsch Flexus Core 200 is a great value. I was very impressed with its performance for the price, and at $1,175 for the full package with subwoofer and rear speakers, it’s a very affordable way to dive into Dolby Atmos surround sound. I’ve regularly found that companies with a long history making speakers also do a great job with soundbars, and the Klipsch Flexus Core 200 system proves that to still be the case.
Pros:
Dynamic sound with clear dialogue
Powerful bass and good immersion with optional subwoofer and rear speakers added
Full-featured remote control
Dialog boost and EQ adjustments
Simple setup
Large, alphanumeric LED display
Great value
Cons:
No built-in Wi-Fi for music streaming
No DTS:X or DTS support
Standalone Core 200 soundbar has limited immersive effect
I’ve been steeling myself for a coming wave of AI-infused wearables that could be worn all over the place, based on reports of gadget plans at Meta, Google and Apple — a halo of connected tech with cameras onboard, streaming to AI services. Qualcomm’s latest chip, announced Monday at Mobile World Congress in Barcelona, is built for it, and the first devices using it are coming this summer. Samsung, Google and Motorola are already building hardware with it.
I sat down with John Kehrli, senior director of product management for Qualcomm, to discuss the newest wearable chip push, and it caught my attention on several levels. The reason you should care is that this is a clear preview of tech products to come: Qualcomm’s chips power almost all of the non-Apple watches, VR headsets and smart glasses out there.
While Qualcomm has had separate chip lines for smartwatches and for smart glasses and VR headsets, the new Snapdragon Wear Elite chip aims to bridge those categories. It's a higher-powered watch chip packed with a range of wireless connectivity options, but it's also built to support video input and streaming for AI, and even 1080p video output to displays. That could include AI-infused smart glasses.
“It’s not just the watch: for sure that’s a focus for us, but the portfolio [of devices] has expanded dramatically,” Kehrli says.
Here’s the news about Snapdragon Wear Elite that stood out for me.
Qualcomm’s new chip design is meant to be flexible in form. It could end up in many places.
A lot more onboard processing for offline AI
A big part of Qualcomm’s push on these chips is to do more generative AI and LLM work on device, a trend I expect to grow. The Snapdragon Wear Elite looks a lot more powerful than previous Qualcomm watch chips. Some of the offline, on-device functions could be voice-based AI, for fitness or, according to Qualcomm, for “life logging.”
I’m not sure I need life logging, but I’d be interested in having more AI-based controls for wearables. The extra power looks to also drive video on displays and run onboard cameras, including video streaming. The whole idea behind next-wave multimodal AI is to have AI services be aware of what you’re doing — that’ll mostly happen via camera access.
Kehrli says the neural processing unit on the Snapdragon Wear Elite could support AI models of up to 2 billion parameters on device, processing at about 10 tokens per second. He sees that being good enough for a lot of offline needs, with cloud-connected AI kicking in when needed.
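Those two figures imply a rough memory budget for the chip. A back-of-envelope sketch, under my own assumptions (4-bit quantized weights, and each generated token streaming the full weight set from memory once) rather than anything Qualcomm has stated:

```python
# Back-of-envelope sizing for a 2B-parameter on-device model.
# Assumptions are mine, not Qualcomm's: 4-bit quantized weights,
# one full pass over the weights per generated token.
params = 2_000_000_000
bytes_per_weight = 0.5                        # 4-bit quantization
weights_gb = params * bytes_per_weight / 1e9  # ~1 GB resident in memory
tokens_per_s = 10
bandwidth_gbs = weights_gb * tokens_per_s     # ~10 GB/s effective memory bandwidth
```

Under those assumptions, 10 tokens per second is consistent with the modest memory bandwidth you'd expect from a wearable-class part.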
Kehrli sees a lot of local AI needs for the extra sensors, including cameras, that are going to be on these wearables. “There’s so many exciting inputs coming in [to the devices]. Location, sound, voice, text, all the sensors — we’re really seeing a lot of medical-grade sensors come into the retail space. What do I do with that data?”
Qualcomm’s concept for a wearable pendant is like a smartwatch, but with outward-facing camera.
Cameras everywhere?
In Qualcomm’s sizzle video for the new chip, we get a glimpse of a watch with a camera on its top edge. Most smartwatches don’t have cameras right now, but that could be changing soon. While a watch isn’t necessarily a great way to take photos, onboard cameras are likely more an additional way to tap into AI: face-recognition biometrics for tap-to-pay, using a watch as a smart key for cars or other connected things, or maybe other AI-based controls.
Another concept shot of a pendant, which looks basically like a neck-worn smartwatch, has its camera facing out. All the AI pins and pendants that have been trickling in these last few months are showing similar ideas. Like smart glasses, the outer-facing cameras could be another way to see things without putting something on your face. But you’d have to wear some pin or pendant.
It also sounds like devices with these new chips will last longer on a charge. Qualcomm’s promising 30% better battery life than with its previous watch chip — potentially “days” of use. I’d still expect more or less a full day, considering these chips might also be supercharging more camera-based and AI features.
The faster charging sounds promising, though. The chips could charge devices to 50% in 10 minutes of charging. That’s key because a lot of these wearables are being designed to be worn all the time, some even while you’re sleeping. It’s like companies are trying to find ways to do a quick recharge pit stop without spending too much time off your body.
The most interesting part could be the boosted wireless features. Qualcomm’s got six different protocols on-chip: 5G RedCap (a reduced-capability flavor of 5G for low-power connected devices), Bluetooth 6.0, ultra wideband, GPS, satellite-connected NB-NTN for messaging, and micropower Wi-Fi 802.11ax.
The micropower Wi-Fi support could let these new wearables stay Wi-Fi connected continuously, says Kehrli, letting them work in the background longer. Meta’s Ray-Ban glasses, for instance, are currently Bluetooth-connected by default and don’t stream video unless you switch into a mode that kills battery life fast. Always-on streaming AI modes could last longer on Elite-powered devices.
Qualcomm’s plans for this chip extend to nearly every wearable territory.
Where they could show up: Watches, glasses, headphones, pendants, more
Qualcomm’s aiming to put its new chip across a wide range of wearables, from camera-enabled headphones and earbuds like Razer’s Motoko concept (which I tried at CES in January) to next-gen smartwatches and AI pendants, to smart glasses, and even sensor-connected bands. Devices like Meta’s neural band, which uses EMG (electromyography, using skin contact sensors) for hand gestures that control its smart glasses, could see upgrades with this chip. Maybe that’s exactly the sort of territory Meta could be exploring with its reported smartwatch debut this year.
It’s also clear that no one, Qualcomm included, is entirely sure where people will prefer to wear these future AI gadgets. Is it glasses? Pendant? Watch? Headphones? All of the above? Kehrli feels people will have different preferences and will choose what works. Will that sort of redundancy make sense, or will it settle into clearer categories in another year or two?
Glasses, Kehrli adds, could be a landing spot for this chip because of the cellular-connecting possibilities, saying he expects adoption of wearables with their own data connections will keep rising, especially with AI services. “We’re seeing, on-wrist, up to 50% of customers taking connected [wearables] with a service plan. We’re seeing that dramatically increase, especially with this AI on device/off device type of experience in the cloud.”
It’s clear that halos of wearables are on deck from several big companies. How it all shakes out and works, though, is still unclear. And while these new wearables should be a lot more powerful, the focus right now isn’t on improving how they could stay connected and communicate with each other, something I got a glimpse of in a demo of a personal mesh network made by startup Ixana at CES. Maybe that’s next on deck.
For now, wearables are trying to be better extensions of your phone, first, and act better as standalone devices too.
We’ll start this week off with a bit of controversy from Linux Land. Anyone who’s ever used the sudo command knows that you don’t see any kind of visual feedback while entering your password. This was intended as a security feature, as it was believed that an on-screen indicator of how many characters had been entered would allow somebody snooping over your shoulder to figure out the length of your password. But in Ubuntu 26.04, that’s no longer the case. The traditional sudo binary has been replaced with one written in Rust, which Canonical has recently patched to follow the modern convention of showing asterisks at the password prompt.
As you might expect, this prompted an immediate reaction from Linux greybeards. A bug report was filed just a few days ago demanding that the change be reverted, arguing that breaking a decades-old expectation with no warning could be confusing for users. The official response from a Canonical dev was that they see it the other way around, and that the change was made to improve the user experience. It was also pointed out that those who want to revert to the old style of prompt can do so with a config change. The issue was immediately marked as “Won’t Fix”, but the discussion is ongoing.
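For reference, in classic sudo this behavior is controlled by the pwfeedback flag in the sudoers policy. A hedged sketch of the revert, assuming Ubuntu's Rust sudo honors the same flag name (the exact option it accepts may differ):

```
# /etc/sudoers.d/no-pwfeedback -- edit via visudo.
# Assumes the classic sudoers "pwfeedback" flag is what the Rust
# sudo honors here; check your distribution's documentation.
Defaults !pwfeedback
```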
Speaking of unexpected changes, multiple reports are coming in that the February security update for Samsung Galaxy devices, which is currently rolling out, removes several functions from the Android recovery menu. After the update is applied to phones such as the S25 and Fold 7, long-standing features, such as the ability to wipe the device’s cache partition or install updates via Android Debug Bridge (ADB), disappear.
Just like with the change to sudo, this is the sort of thing that will aggravate veteran users the most. There’s been no official explanation for these changes, and it’s not immediately obvious why Samsung would fiddle with a recovery menu that’s remained largely unchanged since Android’s introduction. As 9to5Google mentions, it could be an attempt to prevent users from installing leaked firmware builds — a practice that’s gotten the attention of the electronics giant’s legal department.
These days, software updates are just one of the things you need to keep track of. Add in emails, RSS feeds, and incoming chat messages, and keeping up with the notifications on your computer or smartphone can be a challenge. But that’s nothing compared to the 800,000 alerts fired off earlier this week by the Vera Rubin Observatory. The observatory uses a 3.2 gigapixel camera to take long exposure images of the night sky, which are then compared with earlier shots to detect visual changes. Astronomers create filters to narrow down what they’re after, and can be notified when the automated system detects a match. A preview image is available in just seconds, while the full-resolution imagery takes around 80 hours to process. It’s still early days, but once the VRO gets up to speed, it’s expected that as many as seven million alerts will be generated each night.
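The filtering idea described above can be sketched in a few lines. The alert fields and thresholds here are hypothetical stand-ins, not the actual Rubin alert schema:

```python
# Hedged sketch of filtering sky-survey alerts; field names
# ("delta_mag", "band") are illustrative, not the real schema.
alerts = [
    {"id": 1, "delta_mag": 0.2, "band": "g"},
    {"id": 2, "delta_mag": 1.7, "band": "r"},
    {"id": 3, "delta_mag": 2.4, "band": "g"},
]

def matches(alert, min_delta=1.0, band=None):
    """Notify only on brightness changes of at least min_delta,
    optionally restricted to a single photometric band."""
    if band is not None and alert["band"] != band:
        return False
    return alert["delta_mag"] >= min_delta

hits = [a["id"] for a in alerts if matches(a, min_delta=1.0)]
```

Astronomers' real filters run server-side against millions of alerts per night, but the shape of the problem is the same: a predicate over each alert, with notifications only for matches.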
While we’re on the subject of large-scale engineering projects: this week, Google announced that its new data center in Minnesota will be hooked up to the world’s largest battery. The 300 megawatt array built by Form Energy will use iron-air technology, which essentially uses a reversible rusting process to store energy produced by renewable sources such as wind and solar. When those sources aren’t available, the data center can run off battery power for up to 100 hours.
While heavier and less efficient than lithium-ion, iron-air batteries have the advantage of being substantially cheaper to produce. So while it’s unlikely you’ll see the technology in smartphones anytime soon, it’s perfect for static installations like this.
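The two quoted figures imply an enormous storage capacity. Simple arithmetic from the numbers above:

```python
# Implied capacity of the Form Energy iron-air array,
# from the figures quoted above.
power_mw = 300     # sustained output
duration_h = 100   # hours of discharge
energy_mwh = power_mw * duration_h  # 30,000 MWh, i.e. 30 GWh
```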
Finally, some sad news from the world of retro computing/games: a very rare copy of Tsukihime Trial Edition was apparently destroyed while in transit from one collector to another. It might not look like much — the game was distributed by the indie developers on unbranded floppies at a Japanese convention in 1999 — but it represents one of only 50 copies known to exist. While the occasional damaged package is all but unavoidable, this one is particularly egregious as it appears that someone at US Customs intentionally ripped the disk to pieces. The purchaser has filed a complaint with Customs, and we’re interested in hearing what their version of the story sounds like.
See something interesting that you think would be a good fit for our weekly Links column? Drop us a line, we’d love to hear about it.
Honor isn’t reinventing its book-style foldable this year, but with the Magic V6, it doesn’t really need to.
Instead, the company has delivered what feels like a careful refinement of an already accomplished formula. It is thinner in places and tougher where it matters. Moreover, it packs meaningful internal upgrades without upsetting the balance that made its predecessor so compelling.
On paper, the Magic V6 looks familiar. It retains the slim, symmetrical profile that has become a hallmark of Honor’s foldables: the White model measures 8.75mm when folded and just 4.0mm when unfolded. Even the heavier colourways tip the scales at a very reasonable 224g, while the White model comes in at 219g.
For a device with a near eight-inch internal display, especially one packing a huge 6600mAh battery that would shame many conventional flagships, that’s impressive engineering restraint.
Durability has been pushed further this time around. The handset carries IP68 and IP69 ratings, offering resistance to dust and water under controlled conditions, which is still a rarity in the foldable space.
Honor also doubles down on display protection, pairing its Super Armored Inner Screen with an anti-reflective coating, while the outer display sports an Anti-scratch NanoCrystal Shield. It’s clear the company is targeting everyday usability, not just spec-sheet bragging rights.
Open the device and you’re greeted by a 7.95-inch LTPO 2.0 internal panel with a variable 1–120Hz refresh rate and a peak brightness rated at up to 5000 nits. Resolution lands at 2352 x 2172, with 1.07 billion colours and full DCI-P3 coverage.
The external 6.52-inch screen mirrors much of that ambition. It stretches to an even higher quoted peak brightness of 6000 nits and a sharp 2420 x 1080 resolution. Both displays support stylus input, but you’ll need to buy the Magic Pen separately.
Honor continues to make a strong play around eye comfort too. The Magic V6 supports 4320Hz PWM dimming on both screens. It also has features such as AI Defocus Eyecare 2.0, Circadian Night Display and hardware-level low blue light. These aren’t headline-grabbing upgrades, but they reinforce the brand’s focus on long-term usability. This is important on a device designed for extended reading, streaming and multitasking sessions.
Powering the show is Qualcomm’s Snapdragon 8 Elite Gen 5 Mobile Platform, paired with 16GB of RAM and 512GB of storage. That’s flagship territory by any metric, and it’s backed by a substantial 6600mAh silicon-carbon battery. Fast charging remains a strength, with support for 80W wired SuperCharge and 66W wireless charging using compatible chargers. There is also wireless reverse charging.
Cameras are led by the Honor AI Falcon system, headlined by a 50MP ultra-light-sensitive main sensor with OIS, a 64MP periscope telephoto (also with OIS) and a 50MP ultra-wide. Honor quotes up to 6.5 stops of CIPA-rated image stabilisation.
Around the front, there are dual 20MP cameras – one on each display – ensuring consistent selfie and video call quality whether folded or unfolded. Video capture tops out at 4K across both rear and front cameras.
Artificial intelligence plays a prominent role, but it feels more integrated than ornamental. The AI Image Engine includes tools such as AI Super Zoom, AI Enhanced Portrait and Harcourt Portrait. Editing features span AI Eraser, AI Upscale and AI Outpainting. Beyond imaging, users get AI Writing, Call Translation, AI Meeting Agent and even AI Deepfake Detection. Google Gemini support is also baked in, signalling Honor’s intent to compete not just on hardware, but on smart software experiences too.
Connectivity is equally modern. The Magic V6 supports Wi-Fi 7, Bluetooth 6.0 and USB-C with USB 3.2 Gen 1 speeds. It also includes features such as Mac Screen Sharing and file transfer compatibility with iOS devices. This is a nod to cross-platform users who may not live entirely within Android ecosystems.
In truth, the Magic V6 isn’t a radical reinvention. But that’s precisely the point. Honor has taken one of the most well-rounded foldables on the market and tightened the screws: it has trimmed millimetres, boosted brightness, increased battery capacity and layered in more AI functionality. It’s an evolutionary update, not a revolution. Yet in a maturing foldable category, refinement may well be the smarter move.
If the previous generation proved Honor could build one of the best foldables around, the Magic V6 is about proving it can sustain that position.
Qualcomm’s Snapdragon Elite chips are reserved for the best Android phones and laptops, and now the company has introduced the first in the Elite series for wearables. The Snapdragon Wear Elite processor is designed for smartwatches and AI devices like pendants and promises up to a fivefold increase in single-thread CPU performance, Qualcomm announced.
The new processor is built on a 3nm process to improve speed and efficiency over previous models, while boosting the number of cores to five (one big core at 2.1GHz and 4 little cores at 1.9GHz). With those changes, the company is promising up to five times faster single-threaded performance, with GPU speeds boosted up to seven times.
The Snapdragon Wear Elite is also equipped with a new NPU that allows low-power AI use cases like keyword recognition along with noise cancellation. It’s also the first Snapdragon wearable processor with a dedicated Hexagon NPU supporting AI models with two billion parameters. That will allow new “personal AI experiences,” the company said, like context-aware recommendations, natural voice interactions, life logging and AI agents that can orchestrate tasks on your behalf.
Wear OS devices with the chip should see up to 30 percent better battery life and charging to 50 percent in ten minutes. The chip also allows for more types of connectivity, including 5G reduced capability (RedCap), micro-power Wi-Fi, NB-NTN for satellites, Bluetooth 6.0, GNSS and UWB. However, manufacturers will be able to source versions of the chip without some of those wireless features.
Whether the Snapdragon Wear Elite will give Wear OS watch manufacturers a better chance to chip into the 50-plus percent market share of Apple’s Watch remains to be seen. The first devices using the chip will start to ship in the “next few months,” Qualcomm said. “Leading global partners are supporting the platform including Google, Motorola and Samsung.”
I’m a big fan of my Mac Mini, but it’s not the only option when it comes to buying one of the best mini PCs. Sure, it looks fantastic and boasts powerful hardware, but you have to pay for the privilege, and not everyone has that kind of budget.
To take advantage of the 42% discount, you’ll just need to enter the code R6VSNOOY at checkout. The deal expires March 31, though, so don’t hang around.
With this spec you’ll be covered for office tasks, remote working, light content creation, gaming, and more. It’s not the only deal at Kamrui, though. Check out their Kamrui Amazon Shop, where you’re sure to find a mini PC that suits you.
The Hyper H2 Mini PC is powered by an Intel Core i5-14450HX with 10 cores and 16 threads (up to 4.8GHz). This delivers stable, sustained multi-core performance under heavy workloads, and Kamrui pitches it as the superior choice over Ryzen 7 alternatives for high-intensity scenarios such as AAA gaming, 3D rendering, and video editing.
The CPU is backed up by 32GB of RAM, which Kamrui claims is 50-70% faster than a 16GB configuration. It’s also equipped with a 1TB NVMe PCIe 4.0 x4 SSD, which significantly improves loading times. Storage can also be expanded to 4TB if you run out of space.
One of the best things about the Hyper H2 is that the integrated Intel UHD Graphics supports up to three 4K displays simultaneously. This is ideal for multitasking, data analysis, or multi-window design workflows.
This is the mini PC that just keeps on giving, and at just $478.39 at Amazon, we recommend it highly.
If you’re going to give your vehicle a good clean inside and out, it’s generally agreed that microfiber towels are the way to go. They’re gentle yet effective and, as long as you’re washing your microfiber towels properly, they can be used repeatedly. However, not all microfiber towels are fit for all surfaces, as different car materials call for different towel types. When cleaning glass, for instance, you want to use low-pile towels with shorter, more tightly-woven fibers. Unfortunately, what makes these good for glass makes them risky to use on paint.
Low-pile towels have more fibers, which means more points of contact with a surface. This can help remove fingerprints effectively and reduce the likelihood of streaks. However, this strong point can become a weakness when used on paint. More points of contact mean a stronger possibility of grabbing and dragging dirt particles across paint, thus creating scratches. This is an even bigger concern if the car hasn’t been effectively cleaned beforehand and there’s excess dirt on the paint. Thus, plush, high-pile towels are better suited for paint cleaning.
Of course, the matter of pile size is just one element of microfiber cloth selection. Material blend and weave types are key too; much like pile size, glass and paint cleaning don’t call for the same varieties.
Material blend and weave type are also important
When looking for microfiber towels, you’ll likely see material ratios indicating the towel’s blend of polyester and polyamide, also known as nylon. Polyester is for scrubbing, while polyamide covers absorption. An 80/20 polyester-polyamide split is typically recommended for cleaning glass, prioritizing the removal of smudges, grime, and dirt while still being able to suck up residual water and cleaning products. However, a 70/30 blend is typically recommended for potentially scratch-prone vehicle paint jobs, as this composition reduces abrasion and increases absorption and softness.
Aside from the cloth’s blend, you should also pay attention to the type of microfiber towel you’re using. The material’s weave will affect the towel’s cleaning attributes, making some towels better for specific surfaces than others. For glass, common popular weave types include pearl, diamond, and waffle, for their blend of cleaning ability and softness. Meanwhile, plush and twist cloths, for example, are more paint-friendly. These offer increased softness, reduced friction, and high absorption.
If U.S. automakers turn their backs on electric vehicles, “their sales outside the U.S. will shrivel,” warns Bloomberg. [Alternate URL.]
They’re already falling behind on the technology, relying on a 100% U.S. tariff on Chinese EVs to keep surging rivals like BYD Co. at bay…. While the American automakers “mostly understand the challenge in front of them, they don’t have full plans” to confront it [said Mark Wakefield, head of the global automotive practice at consultant AlixPartners]…
“Now is a great time for the V-8 engine,” said Ryan Shaughnessy, the Mustang’s brand manager. “We’ve done extensive customer research in multiple cities, looking at a variety of powertrains, and the V-8 is always the number-one choice.” It isn’t just customers. U.S. automakers have long been run by “car guys”: enthusiasts who live for the bone-shaking rumble of a big engine. For them, quiet and smooth EVs — even the absurdly fast ones — can’t satisfy that craving. They’re convinced many American car buyers share the same enthusiasm for what Shaughnessy described as “the sound and roar of the V-8.”
Wall Street couldn’t be happier with the new direction… Ford’s fortunes are also on the rise, as it’s predicting operating profits could grow by as much as 47% this year to $10 billion. Ford’s stock has risen nearly 50% over the last 12 months. Under the previous environmental rules, automakers effectively had to sell zero-emission vehicles in growing numbers to offset their gas-guzzlers. When they fell short, they had to buy regulatory credits from EV companies such as Tesla Inc. or face penalties. GM spent $3.5 billion on credits from 2022 to the middle of 2025. Now, according to JPMorgan Chase & Co. analyst Ryan Brinkman, GM and Ford each have “billion dollar tailwinds”…
[T]he hangover from all that new horsepower could leave US automakers lagging their Chinese rivals who already build the world’s most advanced — and lowest priced — electric cars. Indeed, there is much talk in Detroit about the competitive tsunami that will be unleashed on American automakers once Chinese car companies find a way to break through trade barriers now protecting the US market. [Ford Chief Executive Officer Jim] Farley even calls it an “existential threat”…

“They’re going to build as many V-8 engines and big trucks as they can get out the factory doors,” said Sam Fiorani, vice president of vehicle forecasting for consultant Auto Forecast Solutions. “And as the rest of the world develops modern drivetrains, newer batteries and better electric vehicles, GM and Ford in particular are going to find themselves falling even further behind.”

The article notes GM “continues to develop battery-powered vehicles, and CEO Mary Barra said the automaker would begin offering a ‘handful’ of hybrids soon,” while Ford and Stellantis “have plans to launch extended-range electric vehicles, or EREVs, a new kind of plug-in hybrid with an internal combustion engine that recharges the battery as the vehicle drives down the road.” But while automakers may be investing in future EV vehicles, they’re also “leaning into the lucre that comes from selling millions of fossil-fuel vehicles in a rare moment of loosened regulation.”
Most people have by now accumulated a collection of USB drives over the years. However, no digital storage medium lasts forever. Once a USB drive is three to five years old, it counts as a mature drive, and once it ages beyond that band it should be relegated to less intensive tasks. The total lifespan of a USB flash drive is about a decade. Of course, there are gradients of quality to consider.
An expensive, ruggedized SSD is likely to use higher-quality flash memory than a thumb drive from the bargain bin at Walgreens and should last far longer before experiencing any issues. But many people still hold onto flash memory well beyond its prime, regardless of quality. Even tech aficionados aren’t immune. There may be many reasons why you can’t bring yourself to part with your old jump drives, portable hard drives, and SSDs, but you should be aware that they are no longer safe to use for certain purposes.
There are a number of things you should never trust an old USB drive to handle; instead, use a new, fast, and reliable USB-C flash drive or SSD. That ancient USB drive you stored your old tax records on and left in a drawer? It might already have corrupted those crucial documents beyond recovery. The SSD with that video of your firstborn child’s first steps? The passage of time will render both the video and its subject unrecognizable. So, here are five ways we would never reuse old USB drives and why you shouldn’t, either.
Never use old USB flash sticks to store important data
You should, in general, abstain from keeping only a single copy of your important data regardless of where it’s stored, but that advice is even more important in the context of an old USB storage drive. Even a brand new, top-of-the-line SSD can potentially fail or become corrupted, but the odds of failure increase dramatically as a storage drive ages. The older a drive is, the more write cycles (files stored and deleted) it is likely to have endured. You can think of this like wear and tear. The more write cycles, the greater the potential for instability. Files like your wedding album or important tax documents should never live on a flash drive alone. You should always make sure they’re stored in at least one other place.
Redundant copies may require a bit of extra work, but if one of your storage solutions suffers an outage or gets lost, you’ll be grateful you spent a few minutes copying your irreplaceable data. However, you should also be aware of the trade-off between preservation and security. For data that’s valuable to you but not sensitive (for instance, that wedding album), it can be a good idea to keep three copies: one on your computer or phone, one on an external drive, and one in the cloud using a service like Dropbox or Google Drive. However, if the data is sensitive (for instance, a passport, ID, or financial data), you may want to eschew the cloud in favor of physical drives you can keep an eye on personally.
Never use old USB drives for long-term storage
Along the same lines, you should never use an old flash drive or hard drive for long-term data archival. On average, the lifespan of an HDD is about three to five years, and that of a flash storage device is about the same. That’s because flash storage of the kind used in thumb drives, SSDs, SD cards, and so on stores data by trapping a small electrical charge in its transistors. Since computers rely on binary code — ones and zeros — the presence or absence of that trapped charge is what tells a computer whether it should read a one or a zero. The charge remains trapped regardless of whether the drive has external power, but no cell can hold its charge forever. Over time, some cells leak their charge, which can lead to data corruption. This is often referred to as bit rot or data rot.
A drive that is already experiencing degradation or data rot will lose data more quickly when it is not connected to a computer. Immediately after writing a file, that file will be accessible. But stick the USB drive in a drawer for a few weeks, and you may find that same file unrecoverable when you plug it back into your computer. So, if you were planning to store your critical documents or media on an ancient USB drive and throw that drive in the safe, you’re much better off copying them onto a brand new storage device instead. Even so, you should still make sure the drive works by checking it every so often and replacing it every three years at a minimum.
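One low-effort way to act on that advice is to keep a checksum manifest alongside the archived files and re-verify it on each periodic check. A minimal sketch (the file name is illustrative; a real manifest would be saved to disk with the archive):

```python
# Hedged sketch: record a SHA-256 checksum when archiving a file,
# then verify it on later periodic checks to catch silent corruption.
import hashlib
import os
import tempfile

def sha256_of(path):
    """Hash a file in chunks so large files don't need to fit in RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

# Demo against a temp file standing in for a file on the archive drive.
with tempfile.TemporaryDirectory() as d:
    p = os.path.join(d, "taxes.pdf")
    with open(p, "wb") as f:
        f.write(b"important records")
    manifest = {p: sha256_of(p)}  # store this alongside the files
    # Later check: any mismatch means the copy has degraded.
    ok = all(sha256_of(path) == digest for path, digest in manifest.items())
```

If a verification ever fails, restore that file from one of your redundant copies and retire the drive.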
Don’t live boot an operating system from an old USB
One of the niftiest uses for a USB drive is live booting an operating system. Every computer stores its operating system on a storage drive, which means you can BYOOS — bring your own operating system — by writing it to a flash drive or SSD and plugging that storage device into an existing computer. You can then turn nearly any computer you come across into “your” computer by booting the OS off of your USB drive. This is also a handy method for testing whether a PC is functional, repairing an OS installation, or quickly installing your OS of choice on a new PC. A drive with a bootable operating system installed on it is referred to as a “live USB.”
However, it’s a bad idea to use an old USB drive for live booting, especially if you rely on it as your main computing environment. This is because of the same issues we discussed above. Whether you’re storing files or booting an OS, older USB drives are prone to data rot. Your entire OS could become corrupted without warning. But you’ll also run into another issue: storage speed.
USB 3.0 SuperSpeed at its base data transfer rate of 5 Gbps can create a bit of a bottleneck when live booting on newer hardware, but many old USB drives use even older USB 2.0 specifications that will significantly slow performance. Lastly, old USBs tend to have smaller storage sizes — often less than 1 GB if they were released much more than a decade ago. Depending on the operating system, you might be able to scoot by with 8 GB, but 16 is really preferred in order to ensure you have some storage headroom for optimal performance.
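To put those interface numbers in perspective, here's a rough best-case comparison for moving an 8 GB OS image. These use the protocols' theoretical ceilings, so real drives will be slower:

```python
# Best-case transfer times for an 8 GB OS image at protocol ceilings.
# Real-world drives fall well short of these rates.
image_mb = 8 * 1000          # 8 GB image
usb2_mbps = 480 / 8          # USB 2.0: 480 Mbps -> 60 MB/s ceiling
usb3_mbps = 5000 / 8         # USB 3.0 SuperSpeed: 5 Gbps -> 625 MB/s ceiling
t_usb2 = image_mb / usb2_mbps   # ~133 seconds
t_usb3 = image_mb / usb3_mbps   # ~13 seconds
```

Even at these optimistic ceilings, a USB 2.0 stick takes roughly ten times as long, and the gap is felt constantly when an OS is running live off the drive, not just during the initial copy.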
Live file editing is a job for newer SSDs
If you’re often working with files, such as editing documents across multiple computers or editing video shot with a professional-grade camera, you may be accustomed to live file editing from a USB drive. That means you’re editing the files while they’re still stored on the drive without first moving them onto your computer. However, live file editing is an easy way to stress out your USB drive, and you should therefore abstain from using an old drive for that purpose.
A few edits on a Word document may not be too problematic, but over time, the risk compounds. It’s a bit like going outside without sunscreen, in that the more you do it, the bigger the risk you run. One reason you shouldn’t buy used USB drives is that you don’t know how heavily they were used.
Video editors in particular may choose to live edit from a USB drive for two reasons. First, video files are often extremely large, so there may not be enough space on the computer's internal drive for multiple projects. Second, video editing wears out a drive more quickly than editing lightweight files, so working from an external SSD spares the internal drive that wear. These are valid justifications, but only when editing from newer drives. When you're using an old USB drive that may already be nearing the end of its lifespan, the heavy write cycles of video editing could be the final nail in its coffin.
Don’t use old USB drives for console or PC storage
It’s tempting to give an old USB drive a second life as extra storage for your computer or gaming console, especially if it’s a high-quality drive that you can’t bring yourself to part with. However, it is a bad idea to do so for many of the same reasons we’ve discussed in other contexts. Old USB storage devices are more likely to fail, and they are more likely to use older, slower standards that will become major bottlenecks in your workflow. But there are other issues with using an old USB thumb drive or SSD as a working drive.
For flash drives in particular, which are designed for quick file transfers and occasional use, you increase the chances of failure when you leave them plugged into a computer for days, let alone months or years. Voltage issues and overheating can occur when a flash drive is used as long-term computer or console storage, and that's all the more true the older your drive is. Constant writes cause flash memory to heat up, which is why so many professional-grade portable SSDs include a heat sink. High-intensity tasks like video game storage put even more stress on a drive.
SSDs can often be used as computer or console storage, provided they use a newer USB generation, preferably at least USB 3.2 Gen 2, which delivers 10 gigabits per second (Gbps) transfer speeds on modern hardware while maintaining broad compatibility. Older SSDs should be used with caution, and you should make sure an intact heat sink is attached. Test the drive with a benchmarking suite like CrystalDiskMark on Windows before use, and avoid relying on it for intensive tasks if possible.
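CrystalDiskMark covers Windows; on Linux, a rough sequential-write sanity check can be sketched with GNU `dd`. This writes a 256 MB test file to the drive's mount point and reports throughput (the `/mnt/usb` path is a placeholder for wherever your drive is actually mounted):

```shell
# Write 256 MB of zeros to the mounted drive and report the achieved speed
dd if=/dev/zero of=/mnt/usb/testfile bs=1M count=256 status=progress

# Flush caches so the numbers reflect the drive, then clean up
sync
rm /mnt/usb/testfile
```

If the reported speed is far below the drive's rated class, or the write stalls partway through, treat that as a warning sign and keep the drive away from anything important.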