The iPhone 17 Pro is absolutely worthy of its ranking among the best camera phones you can buy in 2026. Thanks to its trio of lenses and features like ProRAW, it's capable of taking stunning images, in broad daylight or in the dead of night, that rival professional mirrorless cameras. But while Apple may have held its crown as mobile photography champ for a long time, an increasing number of flagship Android phones offer incredible camera skills as well, and the Oppo Find X9 Pro is just such a device. Its camera setup is excellent, and I've taken some beautiful images with it using both its wide and 200-megapixel zoom cameras.
The Find X9 Pro is a powerhouse phone in all respects, which is why it scored so highly in my full review — and why it was given a coveted CNET Editors’ Choice Award. So to see just how it stacks up against the iPhone 17 Pro, I took it out on a series of photo missions around my beautiful home city of Edinburgh.
Before we dive in, a quick note about the images. They were all shot with each phone’s default camera mode in JPEG with no other settings applied (the Photographic Style on the iPhone was set to Standard). The images have been imported into Lightroom for the purposes of comparison and exporting at file sizes that will play nicely on the internet, but no other edits, sharpening or noise reduction have been applied.
Remember that while some decisions about which images look better might be obvious (such as a lack of detail or image processing aberrations), others will simply come down to personal opinion. I’m a professional photographer, so I typically look for an image that captures the scene more naturally. You may like a more vibrant image with high contrast, so take my findings with a pinch of salt.
With that said, let’s dive in.
Wide cameras comparison
iPhone 17 Pro, shot on the main camera.
Andrew Lanxon/CNET
Oppo Find X9 Pro, shot on the main camera.
Andrew Lanxon/CNET
Starting off with this easy snap overlooking the train tracks: both phones exposed the scene well, but the Oppo's shot has more natural warm tones on the brickwork of the wall, while the iPhone's looks more magenta. The Oppo's colors are more vibrant, too, but not overly so.
iPhone 17 Pro, shot on the ultrawide camera.
Andrew Lanxon/CNET
Oppo Find X9 Pro, shot on the ultrawide camera.
Andrew Lanxon/CNET
Switching to the ultrawide lens, the blue sky definitely looks oversaturated in the Oppo’s shot. And here’s where we have to dive deeper; Oppo’s image has had more digital sharpening applied to it, which helps some details look crisp, but it’s also got a lot of noise reduction, which smooths details in other areas.
Detail crop with the iPhone 17 Pro on the left and Oppo Find X9 Pro on the right.
Andrew Lanxon/CNET
If we look up close at this section of wall, we can see that the strong lines of mortar between the bricks look sharper in the Oppo’s photo on the right. But the bricks themselves look almost polished as they’ve been stripped of detail by the noise reduction. The iPhone’s image has retained that detail.
iPhone 17 Pro, shot with the main camera.
Andrew Lanxon/CNET
Oppo Find X9 Pro, shot with the main camera.
Andrew Lanxon/CNET
Another weird one to analyze. The wooden box of the library is unquestionably sharper in the Oppo's shot, with even the minute scratches on the Perspex clearly visible. But as soon as we look further out toward the edges of the frame, that detail plummets.
Detail crop with the iPhone 17 Pro on the left and Oppo Find X9 Pro on the right.
Andrew Lanxon/CNET
Zooming in close on a section to the right side of the frame, it’s clear that the Oppo’s image severely lacks detail compared to the iPhone’s image. Whether this is an image processing issue or due to the quality of the lens, I’m not sure, but it’s surprising to see, especially given how sharp the rest of the image is.
iPhone 17 Pro, shot with the main camera.
Andrew Lanxon/CNET
Oppo Find X9 Pro, shot with the main camera.
Andrew Lanxon/CNET
This indoor shot on the main camera feels like a slightly easier win for the Oppo. Its image is brighter and colors look richer without being too punchy. As before, it both sharpens some areas and reduces texture in others. There’s a lack of detail toward the edge of the frame, but you’d only notice if you really get up close to the pixels. Overall, I prefer the look of the Oppo’s shot.
iPhone 17 Pro, shot with the ultrawide camera.
Andrew Lanxon/CNET
Oppo Find X9 Pro, shot with the ultrawide camera.
Andrew Lanxon/CNET
And it’s the same when I switched to the ultrawide lens — the Oppo takes the win here.
iPhone 17 Pro, shot with the main camera.
Andrew Lanxon/CNET
Oppo Find X9 Pro, shot with the main camera.
Andrew Lanxon/CNET
I love the balanced exposure from both phones in this vibrant outdoor scene, but I prefer the warmer tone of the Oppo’s shot. The iPhone’s photo looks like it saw all the golden colors and set its auto white balance on the cooler side to compensate. The Oppo produced a more true-to-life image and I think it’s a great shot as a result.
iPhone 17 Pro, shot with the main camera.
Andrew Lanxon/CNET
Oppo Find X9 Pro, shot with the main camera.
Andrew Lanxon/CNET
I don’t like the Oppo’s effort here, though. It artificially brightened the shadows way too much, giving this scene a fake HDR look that screams, “I took this on an Android phone.” The iPhone takes an easy win with its more natural handle on shadows.
iPhone 17 Pro, shot with the main camera.
Andrew Lanxon/CNET
Oppo Find X9 Pro, shot with the main camera.
Andrew Lanxon/CNET
I’m conflicted on this one. The Oppo’s shot is brighter and more vibrant, but it’s almost too much. The blue sky is a bit on the electric-blue side for my taste, while the buildings in the center of the frame look slightly too bright. Still, I think I prefer its rendition to the iPhone’s, which does look a little drab by comparison.
iPhone 17 Pro, shot with 2x zoom.
Andrew Lanxon/CNET
Oppo Find X9 Pro, shot with 2x zoom.
Andrew Lanxon/CNET
At 2x zoom, this indoor scene looks solid on both phones. Overall, I think the Oppo’s shot takes the win as it’s brighter and sharper than the iPhone’s.
iPhone 17 Pro, shot with 8x zoom.
Andrew Lanxon/CNET
Oppo Find X9 Pro, shot with 6x zoom.
Andrew Lanxon/CNET
Taking each phone up to its maximum default zoom level (8x on the iPhone, 6x on the Oppo), the results look dramatically different. For one thing, the color balance is wildly different: the iPhone leans into teal tones, while the Oppo's photo has a more magenta cast. Honestly, neither looks especially realistic, with both phones going a bit too hard in opposite directions. The Oppo's image has also gone overboard with the digital sharpening, resulting in a crunchiness to the details that I'm not a fan of.
Detail crop with the iPhone 17 Pro on the left and Oppo Find X9 Pro on the right.
Andrew Lanxon/CNET
The huge amount of digital sharpening on the Oppo’s shot is clear when you zoom in on the details.
iPhone 17 Pro, shot with 8x zoom.
Andrew Lanxon/CNET
Oppo Find X9 Pro, shot with 6x zoom.
Andrew Lanxon/CNET
This is an odd one; at max zoom, the Oppo has catastrophically failed to render the details on the side of the building.
Detail crop with the iPhone 17 Pro on the left and Oppo Find X9 Pro on the right.
Andrew Lanxon/CNET
Check out this detail crop; I don't know what the Oppo was doing in its image, but that building has been turned into a bizarre, smeary mess. The iPhone has done a superb job of capturing those distant fine details.
iPhone 17 Pro, shot with 8x zoom.
Andrew Lanxon/CNET
Oppo Find X9 Pro, shot with 6x zoom.
Andrew Lanxon/CNET
Seagulls on a log. There's very little to choose between the two phones in this example. Take your pick!
iPhone 17 Pro, shot with 8x zoom.
Andrew Lanxon/CNET
Oppo Find X9 Pro, shot with the Hasselblad zoom lens.
Andrew Lanxon/CNET
The Oppo Find X9 Pro does have a secret weapon when it comes to zoom, though, in the form of the Hasselblad telephoto zoom accessory. This optional lens attaches to the phone and gives huge zoom lengths — up to 40x — while retaining excellent quality. You can see the difference here in the maximum zoom range of the iPhone against the zoom of the Find X9 Pro with the lens attached; it’s both closer and sharper.
The telephoto lens looks just like a real Hasselblad camera lens. It’s great fun to play with.
Andrew Lanxon/CNET
Oppo Find X9 Pro, shot with the Hasselblad telephoto zoom lens.
Andrew Lanxon/CNET
I absolutely love using the lens add-on for street photography, as you can get some great candid moments without anyone noticing. It’s worth keeping in mind, though, that the Hasselblad lens for the phone is an eye-watering £435 or $580 (based on a rough conversion of the 499 euro price), and third-party telephoto lenses from the likes of Sandmarc are also available for the iPhone.
Night photography
iPhone 17 Pro, shot with the main camera, night mode.
Andrew Lanxon/CNET
Oppo Find X9 Pro, shot with the main camera, night mode.
Andrew Lanxon/CNET
The iPhone's night mode shot here does look brighter, but I prefer the richer contrast of the Oppo's. Otherwise, it's a pretty even match.
iPhone 17 Pro, shot with the main camera, night mode.
Andrew Lanxon/CNET
Oppo Find X9 Pro, shot with the main camera, night mode.
Andrew Lanxon/CNET
But it’s a much easier win for the Oppo here. The deeper contrast has helped keep some of the flare from the lights at bay, while the details on the front of the building are much sharper.
iPhone 17 Pro, shot with the main camera.
Andrew Lanxon/CNET
Oppo Find X9 Pro, shot with the main camera.
Andrew Lanxon/CNET
This indoor scene is brighter, warmer and more vibrant on the Oppo and I much prefer it as a result.
iPhone 17 Pro, shot with the main camera, night mode.
Andrew Lanxon/CNET
Oppo Find X9 Pro, shot with the main camera, night mode.
Andrew Lanxon/CNET
The iPhone’s image is brighter here, especially in the sky, but if you zoom in on the details, the Oppo’s image is sharper.
iPhone 17 Pro, shot with the ultrawide camera, night mode.
Andrew Lanxon/CNET
Oppo Find X9 Pro, shot with the ultrawide camera, night mode.
Andrew Lanxon/CNET
And it’s basically the same story when you switch to the ultrawide lens.
iPhone 17 Pro, shot with 8x zoom, night mode.
Andrew Lanxon/CNET
Oppo Find X9 Pro, shot with 6x zoom, night mode.
Andrew Lanxon/CNET
When we jump to the zooms, though, the Oppo has ramped up the sharpening again, resulting in an image that looks rather over-processed.
iPhone 17 Pro, shot with 2x zoom.
Andrew Lanxon/CNET
Oppo Find X9 Pro, shot with 2x zoom.
Andrew Lanxon/CNET
I caught a glorious sunset one evening, but only the iPhone managed to do it justice. I love the iPhone's natural tones and deep shadows, whereas the Oppo delivered an oversaturated shot that looks like I applied a tacky filter before posting it to Instagram.
iPhone 17 Pro, shot with the main camera.
Andrew Lanxon/CNET
Oppo Find X9 Pro, shot with the main camera.
Andrew Lanxon/CNET
And it’s the same here with the Oppo’s shot looking saturated against the iPhone’s more realistic version.
iPhone 17 Pro, shot with 8x zoom.
Andrew Lanxon/CNET
Oppo Find X9 Pro, shot with 6x zoom.
Andrew Lanxon/CNET
But the difference was most obvious when using the zoom lenses. Not only does the iPhone's shot have more natural colors, but the Oppo's heavy-handed processing has given the lighthouse an unpleasant halo (a light haziness around its edges) that really spoils the shot.
iPhone 17 Pro, shot with the selfie camera.
Andrew Lanxon/CNET
Oppo Find X9 Pro, shot with the selfie camera.
Andrew Lanxon/CNET
I ended on a selfie, and here both phones went in interesting directions. The Oppo is certainly the winner to my eye: its shot is considerably sharper (without overdoing it), with more natural skin tones and an accurate orange hue on my jacket. The background is a bit too cyan, but it's certainly a better-looking attempt than the iPhone's.
iPhone 17 Pro vs. Oppo Find X9 Pro: Which takes better photos?
I was surprised at the results. Oppo's phones, and those of its sister company OnePlus, have a history of leaning hard into image processing: wildly brightened shadows, too much sharpening and inaccurate colors that resulted in shots only really suited to casual snaps. The Find X9 Pro does have some of that (the image of the red restaurant front is a particularly egregious example of shadow brightening), but it's far more toned down than I expected.
In fact, in many instances it delivered shots I preferred over the iPhone's. The golden hues of the tree-lined pathway looked sublime on the Oppo, while the warmer, brighter tones inside the pub were a clear victory for the X9 Pro. I preferred most of the images from the Oppo's main camera over the iPhone's, including some at night. It wasn't a win in every instance, which just goes to show that each phone's image processing can still trip up in different scenarios.
But overall, I think I have to give the win to the Oppo Find X9 Pro. Its ability to capture scenes accurately, with just enough processing to give images a little pop without going overboard, is admirable. It's safe to say, then, that if you're looking for a high-performance Android camera phone, the Find X9 Pro is certainly one to consider.
This week Amazon opened up its parcel shipping, fulfillment, and distribution “to businesses of all types and sizes.” Any business can now ship, store, and deliver “using the same supply chain that supports Amazon,” according to Monday’s announcement of “Amazon Supply Chain Services.”
The move sent shares of UPS and FedEx "tumbling" Monday, writes GeekWire. And though both stocks bounced back as the week went on, GeekWire sees this as the latest example of Amazon "turning its internal capabilities into products and services for sale…"
“Amazon had already surpassed both carriers to become the nation’s largest parcel shipper by volume, according to parcel-analytics firm ShipMatrix.”
Initial customers include Procter & Gamble, which is using Amazon’s freight network to transport raw materials; 3M, which is using it to move products to distribution centers; Lands’ End, which is fulfilling orders across sales channels from Amazon’s warehouses; and American Eagle Outfitters, which is using Amazon’s parcel service for last-mile delivery. The service can fulfill orders placed through platforms that compete with Amazon’s own marketplace, including Walmart, Shopify, TikTok, and others… Peter Larsen, vice president of Amazon Supply Chain Services, compared the launch to the origins of Amazon’s cloud business…
In addition to putting Amazon in competition with existing players in the logistics industry, the move also raises questions about data privacy. Amazon has faced accusations of using nonpublic seller data to compete against merchants on its marketplace, which it has denied. Larsen told the Wall Street Journal that the company prohibits using supply chain customer data for its own marketplace decisions, noting that hundreds of thousands of Amazon sellers already trust the company to fulfill orders placed on rival platforms.
The article notes that in his annual shareholder letter, Amazon's CEO "said the company is also exploring selling its custom AI chips and robotics to outside customers."
Trump Mobile has repeatedly let down over half a million people waiting for its T1 smartphone, with the fabled device still not going on sale almost a year after its launch. It may never do so.
Launched back in June 2025, the Trump Organization's Trump Mobile said it would release the T1 smartphone as a "made-in-USA" device. It was a popular prospect among supporters of President Trump, but no one has actually managed to get their hands on one.
Approximately 590,000 people put down a deposit of $100 for the smartphone, which would ultimately sell at $499, reports IBTimes. Despite receiving advances in the region of $59 million, Trump Mobile has yet to ship a single unit to consumers.
There doesn’t seem to be any sign of it arriving by the quickly approaching one-year mark, either.
A stretching timeline
The T1 was pitched as a “Made in USA” smartphone, almost in response to Apple and others constructing the iPhone and other smartphones in other countries. Shortly after its introduction and after many assurances of its USA-centric nature, the Internet quickly determined that it was probably a rebadged budget Android device made in China.
The research didn’t stop the surge of preorders for the device, even when Trump Mobile silently dropped the “Made in America” claim.
After failing to ship in the late summer of 2025, the release date kept being moved back to later in the year, then into early 2026. A redesign of the Trump Mobile website in April 2026 removed the release date entirely, instead showing a link to “join the waitlist.”
Previous images of the fabled Trump Mobile T1
Reporting on the delay included one call center representative telling journalists in January 2026 that the T1 was in the "final stages of certification and field testing." There was apparently a ship date of Q1 2026, but that has passed.
At one point, a representative blamed a 43-day federal government shutdown, though critics pointed out that the shutdown wouldn't really impact a privately held hardware producer.
Considerable doubt
While those who have put down a deposit for the T1 are patiently waiting, there is a possibility that they may never get the smartphone at all.
The April website update also included revised terms of service, which discussed the deposit scheme. The document states that the deposit is not a guarantee that depositors will receive a working device.
Instead, it explains that the deposit provides "only a conditional opportunity" if Trump Mobile actually puts the T1 on sale. The deposit isn't a binding sales contract, it doesn't lock in the price, Trump Mobile can change the specifications before release, and the device isn't even guaranteed to function on a phone network.
At best, depositors have paid $100 that could turn into a $100 credit towards the T1. That is, if it goes on sale.
If Trump Mobile decides to cancel the T1 entirely, it will issue refunds of the original deposit amount. However, it won't be liable for delays caused by issues such as "parts shortages or hold-ups with regulators."
Consumers can also submit a request for a cancellation before a sale is completed.
FTC radio silence
The severe lack of progress has led to some lawmaker complaints about the ordeal. This includes a request from Senator Elizabeth Warren and other Democratic lawmakers to the Federal Trade Commission in January, over the use of alleged “bait-and-switch tactics” and false advertising over the “Made in the USA” claim.
However, as of May 2026, the FTC has not confirmed the existence of such an investigation, nor if one will ever be opened.
As it stands, 590,000 consumers have handed over their money for a smartphone that isn’t “Made in the USA” and may not even get made at all.
At best, they’ll get a smartphone with underwhelming specifications that is bolstered by branding. At worst, the deposits are gone, and all the consumers get is an expensive lesson.
Earbud-based translators are the next game changer. These devices come in a pair: one for you, one for your conversation partner. Each of you wears one earbud, and the software on your phone handles the translation, both ways, behind the scenes. The best earbud translators make for the most natural way to communicate with someone in a foreign language that I've found to date, though handhelds tend to have more capabilities. (Earbud-based designs seem to be the direction the industry is heading.)
When shopping for a handheld translator, watch out for expensive subscription plans. Many devices come with free service, but only for a time, and re-upping after the trial period ends can be pricey. Check the fine print before you buy. Also, make sure the translator you’re considering covers all the languages you need. Note that while some translators support hundreds of languages, they may be limited in the language pairs they can translate between.
Who Really Needs a Handheld Language Translator?
Again, if you only need casual translation for occasional or emergency use, you can definitely get by with a free translation app on your phone. Translation devices are best for frequent users who expect to carry on multiple sustained conversations with speakers of other languages over time.
Those scenarios could include attending a reunion with your Swedish wife’s extended family or a lengthy workshop series with colleagues from other parts of the world. These tools are also often marketed to first responders who need to quickly assess a situation when human translation services aren’t available.
In situations where you may need to communicate with several speakers, each speaking a different language, a portable translator can make even more sense. If you expect your journeys to take you to far-flung areas or off the grid entirely, where internet service may be poor or nonexistent, a translator can be a helpful tool in your travel bag, even if you only expect to use it for emergencies.
Which Handheld Language Translators Are Best?
After testing numerous handheld translators, I recommend this trio. Which one you pick will depend on how you expect to use it—and your budget.
Best Stand-Alone Translator
Timekettle
T1 Handheld Translator Device
The Timekettle T1 is a reasonably affordable and very pocketable device that makes for an easy addition to your travel kit. Built for two users to communicate, each with access to half the screen, the T1 translates each side of a conversation—written or spoken—into that user’s own language. Using it can be a little tricky: a color-coded button on the side of the device or a virtual one on the 4-inch touchscreen must be held down to tell the T1 which language to listen for. But once you get the hang of it, the system works pretty well.
Accuracy is solid, and translations are fast, popping up in well under a second. One challenge I had with the device relates to its small screen. Like most translators, the T1 supports photo-based translations via its 8-megapixel camera, but the 540 x 1080-pixel screen is too small to display much text at once. Also, while the unit includes a global eSIM with two years of free service included ($50/year after that), I encountered plenty of signal gaps, even in my own home. The good news is that if Wi-Fi’s available, that works too. The unit also supports 31 offline language pairs (10 in combination with English), so if you plan ahead, service woes may not be an issue at all.
Best Translator Earbuds
iFLYTEK
iFLYTEK AI Translation Earbuds
If you want to upgrade your translation experience and make it more immersive, you'll want to invest in a pair of earbuds, which give you a more personal and natural way to communicate. As described above, the classic way to use these is to pop one on yourself and give the other to your friend. An app on your phone handles two-way translation, back and forth.
These 12-gram on-ear earbuds are the best I’ve tested, primarily because once they’re configured, they work completely hands-free. No clicking buttons or tapping the side of your head every time you’re ready to speak: The earbuds understand who’s talking and when, and they work with remarkable speed, almost like a professional interpreter whispering in your ear.
In this week’s “Sunday Reboot,” a good chip issue for Apple to have, regulatory comparisons with oranges, and “Schmigadoon!” gets 12 Tony Award nominations.
Sunday Reboot is a weekly column covering some of the lighter stories within the Apple reality distortion field from the past seven days. All to get the next week underway with a good first step.
This week, Apple had to contend with Maryland lawmakers siding with Apple Towson employees after the store closure announcement, Canada wants Apple to weaken encryption, and Apple failed to reduce the scope of a $4.1 billion iCloud suit in the UK.
A tale of two chip struggles
Apple had to deal with two chip shortage situations this week, but with wildly different results.
On Tuesday, it was discovered that Apple had pulled some of the configuration options for the Mac mini and Mac Studio. Consumers planning to get models with mountains of memory were stopped, as Apple removed the 256GB option from the M3 Ultra Mac Studio.
At the same time, if you want an M4 Pro Mac mini with 64GB of memory, you're out of luck; only the 24GB and 48GB options remain available.
These haven’t been the only changes to the lineup, as the Mac Studio lost the 512GB RAM option in March, and the 256GB SSD version of the M4 Mac mini has similarly disappeared.
The upshot is that, while this obviously stems from the global memory crisis affecting the entire tech industry, it is also one of Apple's ways of avoiding significant price hikes. Sure, eliminating high-RAM options isn't the greatest approach, but the alternative would be to raise prices considerably.
By doing this, Apple can stretch its existing memory inventory a bit further while still keeping the lower-capacity options on sale. Those lower-memory variants are also less exposed to rising RAM costs than RAM-heavy configurations.
Some may suspect this isn't really a RAM problem at all, but Apple winding down existing stock ahead of new models. We've seen that before, but CEO Tim Cook's remarks during the recent financials indicate we won't be seeing any real Mac upgrade options until September.
The other chip issue was one of Apple’s own making.
The MacBook Neo is too popular for Apple’s own good.
The MacBook Neo is extremely popular, beyond Apple's own expectations. As a budget MacBook, it has seen so much demand that Apple had to double its production plans for the model.
That brought about a new problem for Apple, in the form of a lack of A18 Pro chips.
The MacBook Neo is cheap for Apple to produce, partly because it relies on existing component inventory. It was a recycling effort, using up surplus chips that Apple had already paid for, which let the company pass the lower production cost on to consumers.
With the massive success of the model, it is believed that Apple now has to do another production run for the A18 Pro chips.
Evidently, while Apple has a good idea of what can make a product a hit, sometimes it can even surprise itself.
Apple and oranges in logos
Big companies are extremely protective of their brands, and Apple is one of the most defensive. Adding the Apple logo to something that's vaguely Apple-like will quickly invite legal action from Apple's lawyers.
This makes sense, as Apple has a need to pursue anyone misusing its trademarks to prevent diluting its worth. There’s also the whole thing of preventing consumers from buying fake products that use the brand without authorization.
However, sometimes Apple’s battle over its precious Apple logo goes in some strange directions.
The latest instance is a filing with the EU Intellectual Property Office, trying to convince the regulator not to grant a trademark to another company. The move was partly successful for Apple, as the trademark cannot be used for keyboards and computer equipment.
The logo being objected to, used by keyboard maker Yichun Quinningment Electronics Co., wasn't an apple but a citrus fruit: a circular fruit with a leaf on top, a section taken out of the right-hand side, and visible segments and "keys" in the middle.
Apple logo [left], Yichun Quinningmeng’s logo [right]
You could argue that the cut-out bit is reminiscent of Apple's bite mark, and the leaf is pretty close, but they aren't the same. The EUIPO admitted that the logos were "visually similar," though it qualified that similarity as "to a very low degree." That was still enough, however, to create a "mental link" between the two companies for consumers.
Apple has done this a few times in the past, taking on people submitting fruit-based trademarks and complaining of how they are trading off Apple’s logo.
Cases have included the Norwegian Progress Party, which stuck an F motif in the middle of an apple; the pear logo used by Prepear; and the battle against Fruit Union Suisse. That last case was against a century-old organization that had used a red apple image with a white cross for many years; Apple complained about an anniversary redesign.
In trying to work out how far Apple will go over fruit-based trademarks, I discovered there are limitations to its reach. In November 2018, it failed to block logos for Banana Mobile and Banana Computer in Europe.
At the time, the EUIPO concluded that an apple is not a banana.
Cancelled, but nominated
Apple TV shows are frequently listed as nominees for awards. Sometimes, those awards come after the show has ended, typically the following year, but there are exceptions.
On Wednesday, “Schmigadoon!” was the recipient of a massive 12 nominations for the 2026 Tony Awards.
This is a big achievement, but there are massive asterisks at play here. The awards weren't for the show itself, but for the Broadway production.
Apple TV’s ‘Schmigadoon!’
Apple ended the TV version in early 2024, killing it after two seasons despite the third already having been written. The popularity of the show wasn't enough to save it from cancellation, but it did live on in theater.
A stage adaptation of the comedy musical arrived in 2025, as a precursor to the Broadway version, which Apple has co-produced.
This isn't the first time the show has been up for awards: previous recognition includes a Creative Arts Emmy in 2022, a Critics Choice TV award nomination in 2021, and a spot on the American Film Institute's "Television Programs of the Year" list for 2021.
With 12 nominations at the Tony Awards, the Broadway musical could end up picking up more trophies in its stage life following its TV death.
Who knows, maybe a big win will convince someone at Apple that they made the wrong move and commission that third season.
Last week’s Sunday Reboot covered Apple’s F1 ambitions, its massive Q2 financial results, and the return of “Ted Lasso” in August.
Attackers are abusing Google Ads and legitimate Claude.ai shared chats in an active malvertising campaign.
Users searching for “Claude mac download” may come across sponsored search results that list claude.ai as the target website, but lead to instructions that install malware on their Mac.
Google’s sponsored search result for ‘claude download mac’ (BleepingComputer)
Shared Claude Chats weaponized to target macOS users
The campaign was spotted by Berk Albayrak, a security engineer at Trendyol Group, who shared his findings on LinkedIn.
The researcher’s alert about the ongoing malvertising campaign
Albayrak identified a Claude.ai shared chat that presents itself as an official “Claude Code on Mac” installation guide, attributed to “Apple Support.”
The chat walks users through opening Terminal and pasting a command, which silently downloads and runs malware on their Mac.
While attempting to verify Albayrak’s findings, BleepingComputer landed on a second shared Claude chat carrying out the same attack through entirely separate infrastructure.
The two chats follow an identical structure and social engineering approach but use different domains and payloads. Both chats were publicly accessible at the time of writing:
Shared Claude chat with malicious instructions (BleepingComputer)
What does the macOS malware do?
The base64 instructions shown in the shared Claude chat download an encoded shell script from domains such as:
In the variant seen by Albayrak [VirusTotal]: hxxp://customroofingcontractors[.]com/curl/b42a0ed9d1ecb72e42d6034502c304845d98805481d99cea4e259359f9ab206e
In the variant seen by BleepingComputer [VirusTotal]: hxxps://bernasibutuwqu2[.]com/debug/loader.sh?build=a39427f9d5bfda11277f1a58c89b7c2d
The ‘loader.sh’ script (served by the second link above) is another set of gzip-compressed shell instructions:
Base64 code retrieves the first-stage ‘loader.sh’ payload (BleepingComputer)
This compressed shell script runs entirely in memory, leaving little obvious trace on disk.
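A payload packaged this way can be examined safely by decoding and decompressing it to text rather than piping it to a shell. The sketch below uses a harmless stand-in script (not the actual malware) to show the decode path:

```python
import base64
import gzip

# Hypothetical stand-in for an attacker-supplied payload: a shell script
# that has been gzip-compressed and then base64-encoded, as described above.
fake_script = b'#!/bin/sh\necho "stage two would run here"\n'
encoded = base64.b64encode(gzip.compress(fake_script))

# Safe inspection: reverse the encoding to readable text.
# Never pipe the result to sh; read it instead.
decoded = gzip.decompress(base64.b64decode(encoded))
print(decoded.decode())
```

The same two-step decode works on a real sample pulled with curl, as long as the output goes to a file or pager instead of an interpreter.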
BleepingComputer observed the server serving a uniquely obfuscated version of the payload on each request (a technique known as polymorphic delivery), making it harder for security tools to flag the download based on a known hash or signature.
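Why polymorphic delivery defeats hash-based blocklists can be shown in a few lines: two downloads that behave identically but differ by even one junk byte produce unrelated SHA-256 digests, so a signature recorded for one never matches the other. A toy illustration:

```python
import hashlib

# Toy illustration: the same underlying script, wrapped differently on
# each request, yields a different file hash every time, so defenses
# keyed on a known SHA-256 never match twice.
core = "echo payload"
variant_a = f"# junk 12345\n{core}\n"
variant_b = f"# junk 67890\n{core}\n"

hash_a = hashlib.sha256(variant_a.encode()).hexdigest()
hash_b = hashlib.sha256(variant_b.encode()).hexdigest()
print(hash_a == hash_b)  # False: identical behavior, different signatures
```

This is why behavioral detection (what the script does) holds up better here than static file signatures.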
The variant BleepingComputer identified starts by checking whether the machine has Russian or CIS-region keyboard input sources configured. If it does, the script exits without doing anything, sending a quiet cis_blocked status ping to the attacker’s server on its way out. Only machines that pass this check get the next stage.
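The gate logic described here is simple substring matching against the configured input sources. The following is an illustrative sketch of that evasion pattern, not the actual malware code; the hint list is an assumption:

```python
# Illustrative sketch of the region check described above: abort if any
# configured keyboard layout looks like a Russian or CIS-region input
# source. The hint list here is assumed, not taken from the sample.
CIS_HINTS = ("Russian", "Ukrainian", "Belarusian", "Kazakh")

def should_abort(input_sources):
    """Return True if any input source name matches a CIS-region hint."""
    return any(hint in src for src in input_sources for hint in CIS_HINTS)

print(should_abort(["U.S.", "Russian - PC"]))  # True: script would exit
print(should_abort(["U.S.", "British"]))       # False: attack proceeds
```

Checks like this are common in commodity infostealers and double as a detection signal: a script enumerating keyboard locales before doing anything else is rarely benign.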
Before proceeding further, the script also collects the victim’s external IP address, hostname, OS version, and keyboard locale, sending all of it back to the attacker. This kind of victim profiling before payload delivery suggests the operators are being selective about who they target.
The script then pulls down a second-stage payload and runs it through osascript, macOS’s built-in scripting engine. This gives the attacker remote code execution without ever dropping a traditional application or binary.
The variant identified by Albayrak, however, appears to skip the profiling steps. It goes straight to execution.
It harvests browser credentials, cookies, and macOS Keychain contents, packages them up, and exfiltrates them to the attacker’s server. Albayrak identified this as a variant of the MacSync macOS infostealer:
Albayrak’s variant skips the user-fingerprinting step (BleepingComputer)
The briskinternet[.]com domain shown above in the variant identified by Albayrak appeared to be down at the time of writing.
When the legitimate URL is the threat
Malvertising has become a recurring delivery mechanism for malware.
BleepingComputer has previously reported on similar campaigns targeting users searching for software like GIMP, where a convincing Google ad would list a legitimate-looking domain but take visitors to a lookalike phishing site instead.
This campaign flips that, as there is no fake domain to spot.
Both Google ads seen here point to Anthropic’s real domain, claude.ai, since the attackers are hosting their malicious instructions inside Claude’s own shared chat feature. The destination URL in the ad is genuine.
It is not, however, the first time that attackers have abused AI platform shared chats this way. In December, BleepingComputer reported a similar campaign targeting ChatGPT and Grok users.
Users should navigate directly to claude.ai for downloading the native Claude app, rather than clicking sponsored search results. The legitimate Claude Code CLI is available through Anthropic’s official documentation and does not require pasting commands from a chat interface.
As a general rule, treat any instructions asking you to paste terminal commands with caution, regardless of where those instructions appear to come from.
BleepingComputer reached out to Anthropic and Google for comment prior to publishing.
For years, Uber talked about becoming a super app. Then Waymo started picking up passengers in San Francisco, and the conversation grew more urgent. The company has been trying to embed itself inside the AV industry — as a data provider, an investor, and a distribution platform — but the consumer-facing bet may be just as important.
Two weeks ago, Uber held its annual GO-GET product event in New York and announced something its executives had been circling for a long time: users in the U.S. can now book hotels inside the Uber app, through a partnership with Expedia Group, with access to more than 700,000 properties worldwide. Uber One members — the company’s subscription tier at $9.99 a month — get 20% off a rotating list of 10,000 hotels and 10% back in credits. Vacation rentals through Vrbo will follow later this year, along with restaurant reservations via OpenTable. In the meantime, a “Shop for Me” feature lets users order from stores that aren’t even on the platform.
The announcements, taken together, were the most concrete picture yet of something Uber has been trying to conjure since at least 2019: that an app with 199 million monthly active users could become the app they use for nearly everything.
Praveen Neppalli Naga, Uber’s CTO, offered the clearest explanation of the company’s thinking at TechCrunch’s StrictlyVC event late last month in San Francisco. The super app concept has existed for years in India and Southeast Asia, he noted, but U.S. versions have mostly flopped by bolting services onto traffic rather than building toward a reason to stay.
His answer to what fits? Membership. Every new category — food, groceries, now hotels — gives someone another reason to pay for Uber One. “I take Uber, go to the airport, take a flight, take another Uber, go to a hotel, go to a restaurant,” he said. “There is a flow you can actually build into it.”
Flights are not available yet, though Naga didn’t rule them out. Uber tried flight booking in Europe years ago without success. “First let’s get the hotel things done,” he said. Financial services sound like a possibility too — Uber already offers a debit card to drivers in Mexico — though how far that goes, or when, remains unclear. Said Naga: “Never say never.”
Uber isn’t alone in this race. Airbnb, arguably the company most directly threatened by Uber’s hotel push, announced its own transportation ambitions in late March — a partnership with Welcome Pickups to offer airport transfers in 125 cities across Asia, Europe, and Latin America, structured to keep users inside the Airbnb app rather than sending them to Uber. Meanwhile, Elon Musk has spent three years promising to turn X into an “everything app” in the WeChat mold, and is now nearing what he describes as a long-stated goal: X Money, a banking and payments platform built inside the social network, is expected to launch publicly soon. X claims 500 million monthly active users.
The big question is how many super apps the American market will actually support. WeChat works in China partly because the alternative was a patchwork of inferior options. In the U.S., people already have apps they like for most of what Uber wants to do. Getting them to consolidate inside a single platform requires either a compelling reason — Uber One’s discounts, say — or a seamless enough experience that switching feels worth it.
Uber’s bet is that its installed base is the moat. Its users have already handed over a credit card. Convincing them to book a hotel, or order from a store they’d never find on Uber Eats, is an easy lift compared with convincing them to download something new. Its most recent earnings, reported a few days ago, suggest Uber Eats may be the strongest argument for that thesis: delivery revenue grew 34% year over year in the first quarter, to $5.07 billion, making it easily the fastest-growing part of the business and pulling almost even with mobility in gross bookings.
Uber’s stock is still down about 8% from a year ago — suggesting that Wall Street isn’t fully convinced. But the company says that 50 million people are now paying for Uber One, and together they account for roughly half the company’s total bookings.
It’s a bird, it’s a plane, it’s a six-propeller flying vehicle with a nearly eight-foot wingspan.
For the next year, delivery drones operated by the British company Skyports are taking daily weekday trips across New York City’s East River, between the tip of Manhattan and a pier in Brooklyn. Since early May—a bit behind schedule—the drones have carried light cargo for a New York City health care system. Right now, those loads are basically a few pounds of paper; once the health care system is confident the setup works, the cargo should expand to nonhazardous, non-biological packages, such as light pharmaceuticals.
The drones are part of an experiment run by two New York-New Jersey agencies to discover how a relatively new and sometimes controversial sky-bound delivery tech might fit into a hectic urban environment—and the airspace above it. The pilot program will also try to answer a question that hangs over the entire drone delivery industry: Where does it make sense?
“Will there be enough regular flights (1 to 2 per hour) that the client health care system finds true value?” Stephan Pezdek, the regional freight planning manager at the Port Authority of New York and New Jersey, which is operating the pilot, wrote in an email to WIRED. (The Port Authority declined to name the health care system for contractual reasons.) “Will deliveries make it to their destination faster and within the financial constraints of the current carriers they are using? Will the community appreciate the work and not feel like it is a disruption? All of this will inform our understanding of how the first corridor shapes up.”
The Port Authority, which is also working with the New York City Economic Development Corporation (NYCEDC) on this drone project, will also measure how the deliveries affect patient care, Pezdek says.
Globally, drone delivery is still in an experimental phase. What projects do exist mostly focus on carrying cargo to rural or suburban areas, where gaps in road networks and services, plus emptier skies, could make the tech a better fit. Skyports has been delivering mail in remote areas of Scotland since 2023, and carrying cargo to offshore wind turbines in Germany. The US company Zipline says it makes deliveries to and from some 5,000 health facilities across four continents; its oldest program delivers vaccines and blood products in Rwanda. In the US, companies including Alphabet’s Wing and Amazon’s Prime Air are working to expand delivery services across the South, with a focus on the suburban areas surrounding Houston, Austin, and Dallas, Texas.
For drones, dense cities present different challenges. First, there’s the safety question. New York City’s airspace is packed, hosting three international airports. In Manhattan alone, there are three publicly owned heliports. In May 2023, nearly 9,000 helicopter flights took place over city land or water, according to data compiled by the New York City Council. This drone pilot program’s start date was pushed back in part because another experimental aviation tech, an electric vertical takeoff and landing (eVTOL) vehicle, was demo-ing its own first-of-its-kind flights out of the same heliport.
That citified hustle and bustle leads to extra precautions. The pilot project was, as standard, approved by the US Federal Aviation Administration, which requires a certified drone pilot to supervise every flight. Each flight will take place over a fixed route away from residential buildings. The project must obtain a weekly NYPD permit to operate, and delays in acquiring the first one also led the city to push back its start date, says Amanda Kwan, a spokesperson for the Port Authority. The agency also spoke with three local community boards before it allowed the drones to take off.
Anthropic’s Claude Mythos Preview found thousands of zero-day vulnerabilities across major operating systems and browsers, prompting the Fed chair and Treasury secretary to convene bank CEOs. The company warns of a six-to-twelve month window before adversaries replicate the capability.
Anthropic built an AI model that found thousands of zero-day vulnerabilities in every major operating system and web browser. The Federal Reserve chair and the Treasury secretary called bank CEOs to discuss it. The company says there is a six-to-twelve month window to patch the flaws before adversaries build models that can do the same thing. The cybersecurity industry says the threat was already here. Both are right.
Claude Mythos Preview is the model. It is not yet publicly released. In controlled testing, it surpassed all but the most skilled humans at finding and exploiting software vulnerabilities, identifying flaws that had existed undetected for decades, including a 27-year-old bug in OpenBSD and a 17-year-old remote code execution flaw in FreeBSD. Anthropic CEO Dario Amodei described the current period as a “moment of danger” and warned of “some enormous increase in the amount of vulnerabilities, in the amount of breaches, in the financial damage that’s done from ransomware on schools, hospitals, not to mention banks.”
The model’s capability raises a question that the cybersecurity industry has been theorising about for years and now must answer practically: what happens when the cost of finding vulnerabilities drops to near zero? The traditional economics of cybersecurity depend on the asymmetry between attackers, who must find one flaw, and defenders, who must secure all of them. Mythos collapses the cost on both sides. Defenders can now scan their entire codebase for flaws they never knew existed. Attackers, once they build or obtain equivalent models, can do the same.
The response
Anthropic chose a controlled rollout, which it calls Project Glasswing. Approximately 40 technology companies and institutions have initial access to Mythos to bolster their systems. The list does not include most central banks and governments. The asymmetry is intentional: give defenders a head start before the capability becomes widely available.
The response from financial regulators was immediate. Federal Reserve Chairman Jerome Powell and Treasury Secretary Scott Bessent convened a meeting with major US bank CEOs to discuss the cyber risks raised by Mythos. The IMF flagged AI-powered cyber threats to the global banking system. The concern is not that Mythos itself will be used to attack banks. It is that the capability Mythos demonstrates, automated discovery of vulnerabilities at superhuman speed, will be replicated by adversaries who are not bound by Anthropic’s responsible disclosure practices.
Amodei’s six-to-twelve month window is a prediction about how long it will take Chinese AI companies to build models with equivalent vulnerability-discovery capabilities. The window is not about whether adversaries will develop the capability. It is about when. The controlled rollout of Mythos is designed to give the companies that receive early access enough time to patch their most critical flaws before the window closes.
OpenAI released GPT-5.4-Cyber for vetted security teams, scaling its Trusted Access programme in direct response to the Mythos disclosure. The competitive dynamic between Anthropic and OpenAI has extended from commercial AI products into cybersecurity, with both companies positioning themselves as defenders of the software infrastructure their own models could be used to compromise.
The cybersecurity community’s response to the Mythos disclosure has been a mixture of alarm and scepticism. Security researchers note that AI-assisted vulnerability discovery has been developing for years and that the capabilities Mythos demonstrates, while impressive in scale, are an acceleration of existing trends rather than a discontinuous leap. The threat of AI-powered cyberattacks was identified by the UK’s National Cyber Security Centre more than a year ago. What Mythos changes is not the existence of the threat but the specificity of the evidence.
Anthropic occupies an unusual position. It is a company whose business model depends on selling AI capabilities to enterprises, including banks, while simultaneously arguing that AI capabilities of the kind it is developing pose an existential threat to the cybersecurity of those same enterprises. The resolution of the contradiction is commercial: Anthropic’s pitch is that you need its AI to defend against AI of the kind it builds. The logic is circular but the threat is real.
The 271 Firefox vulnerabilities were real. The 27-year-old OpenBSD bug was real. The meeting between the Fed chair and bank CEOs was real. The question is not whether AI will transform cybersecurity. The question is whether the six-to-twelve months Amodei describes is enough time to patch decades of accumulated vulnerabilities across every operating system, browser, and financial platform in production, or whether the window is an estimate designed to create urgency for a problem that cannot be solved on any timeline. Mythos found the flaws. Fixing them is a human problem.
In modern datacenters, storage can live anywhere — local to the machine, remotely accessed over the network, and/or shared between systems.
The next generation of servers will treat system memory in much the same way. Systems will still have some local DDR5, but the bulk of it will be remotely accessed from what some have taken to calling the memory godbox.
The ongoing DRAM shortage has created a perfect storm for the proliferation of these appliances, which not only allow memory to be pooled, but also allow data stored in that memory to be shared by multiple machines simultaneously. In effect, memory becomes a fungible resource.
More importantly, your next round of servers will probably support the tech, if they don’t already.
CXL finally has its moment to shine
The technology at the heart of these memory godboxes isn’t new. Compute Express Link (CXL) has been slowly gaining traction since its introduction seven years ago.
As a quick refresher, CXL defines a common, cache-coherent interface for connecting CPUs, memory, accelerators, and other peripherals.
The technology comes in three flavors: CXL.mem, CXL.cache, and CXL.io, which, taken together, have implications for disaggregated compute. Imagine a rack with a CPU node, GPU node, memory node, and storage node, which can talk to one another completely independently. That’s the core idea behind CXL.
CXL piggybacks off the PCIe standard, which means in theory it should be broadly compatible, but, up to this point, it’s primarily been used with memory devices.
The 1.0 spec opened the door to memory expansion modules, which allow you to add more memory by slotting them into a CXL-compatible PCIe slot. To the operating system — assuming you’re running Linux, that is — the extra memory is largely transparent, showing up as if it were attached to another CPU socket, just one without any additional compute.
The 2.0 spec, which showed up in 2020, added basic support for switching, which meant memory could be pooled and then allocated to any number of connected systems.
AMD and Intel’s current crop of Epycs and Xeons already support these appliances. But while the memory can be partitioned and reallocated to different machines as needed, two machines can’t work on the same data simultaneously.
Unless you were memory-constrained, the added complexity of CXL 2.0 didn’t offer much benefit over simply using higher capacity DIMMs in the first place.
At least, not until memory prices went through the roof.
Where things really get interesting is when the 3.0 spec arrives in AMD and Intel’s next generation of Epycs and Xeons. In fact, from what we understand, Amazon’s Graviton5 CPUs we looked at in December already support the spec.
CXL 3.0 introduces two key capabilities that make it particularly interesting for memory appliances. The first is support for larger topologies: Multiple CXL switches can be stitched together into a fabric. The second is support for memory sharing: Rather than partitioning memory into slices only accessible to one machine at a time, memory can be shared between machines.
In theory this could allow two machines running the same set of workloads to use the memory closer to that of one. It’s a bit like deduplication for memory. In fact, we already do this in virtualized environments like KVM, but it now works across machines.
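The deduplication analogy works like this: if two hosts hold byte-identical data, a shared pool only needs one physical copy, keyed by content. The toy model below illustrates the idea; it is not how CXL hardware actually implements sharing:

```python
import hashlib

# Toy model of content-based page deduplication across hosts:
# byte-identical pages collapse to a single stored copy in a shared pool.
pool = {}  # content hash -> the single stored page

def store_page(page: bytes) -> str:
    """Store a page in the shared pool; identical pages share one copy."""
    key = hashlib.sha256(page).hexdigest()
    pool.setdefault(key, page)  # only the first copy is actually kept
    return key

# Two hosts running the same workload write identical 4 KiB pages.
ref_a = store_page(b"\x00" * 4096)  # host A
ref_b = store_page(b"\x00" * 4096)  # host B

print(ref_a == ref_b, len(pool))  # same reference, one physical copy
```

Linux’s kernel samepage merging (KSM) does something similar within a single host; CXL 3.0 sharing extends the benefit across hosts attached to the same fabric.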
There are security and performance implications to all of this. Thankfully in CXL 3.1 and later, the consortium introduced confidential computing capabilities into the spec, allowing for isolation where necessary.
On the performance end of things, CXL 3.0 moves to PCIe 6.0 as a baseline, which provides 16 GB/s of bidirectional bandwidth per lane (8 GB/s in each direction). Assuming 64 lanes of CXL per CPU, that works out to an additional 512 GB/s in each direction. So memory bandwidth shouldn’t be too much of an issue for most applications. Latency, on the other hand, is a different story.
CXL-attached memory is going to add some latency. However, as we’ve previously discussed, the latency isn’t as bad as you’re probably thinking — on the order of a NUMA hop, or about 170 to 250 nanoseconds of round trip latency. Obviously, the farther the memory appliance is from the host CPU, the worse the latency is going to be.
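The back-of-the-envelope numbers above work out as follows. The per-lane figure comes from the PCIe 6.0 baseline discussed earlier; the local-DRAM latency is an assumed ballpark, not a figure from this article:

```python
# Bandwidth: PCIe 6.0 moves roughly 8 GB/s per lane in each direction
# (16 GB/s bidirectional). With 64 CXL lanes per CPU:
per_lane_gbs = 8            # GB/s, one direction
lanes = 64
aggregate = per_lane_gbs * lanes
print(aggregate)            # 512 GB/s in each direction

# Latency: a CXL hop lands in roughly NUMA-hop territory.
local_dram_ns = 100         # assumed ballpark for local DDR5 access
cxl_lo_ns, cxl_hi_ns = 170, 250  # round-trip range cited above
print(cxl_lo_ns / local_dram_ns, cxl_hi_ns / local_dram_ns)  # ~1.7x to 2.5x
```

In other words, CXL-attached memory costs you roughly a doubling of access latency in exchange for capacity that no DIMM configuration can match.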
Late last year, the CXL consortium ratified the 4.0 spec, which among other things doubles the bandwidth from 16 GB/s per lane to 32 GB/s by re-basing on PCIe 7.0. However, it’ll be a while before we see appliances based on the spec.
Where’s my memory godbox?
There are several companies developing hardware for these kinds of networked memory appliances.
Panmnesia’s CXL 3.2-compatible PanSwitch is one of the most sophisticated examples. The switch features 256 lanes of connectivity for CXL memory modules, devices, or CPUs to connect, pool, or share resources.
If you’re okay with memory pooling and don’t need the niceties of CXL 3.0, then there are already several memory appliances available that are compatible with the latest generation of Xeon 6 and Epyc Turin processors.
Liqid’s composable memory platform, for example, can provide a pool of up to 100 TB of DDR5 to as many as 32 hosts. Meanwhile, UnifabriX Max systems provide CXL 1.1 or 2.0 connectivity to 16 or more systems with support for CXL 3.2 already in the works.
We suspect that as more CXL 3.0 compatible CPUs and GPUs hit the market, more of these memory godboxes will appear.
AI eats everything
Don’t get too excited. While network-attached memory has the potential to reduce an enterprise’s infrastructure spend, those same qualities make it attractive for the very thing driving the memory shortage in the first place.
AI adoption has driven demand for DRAM off the charts. In addition to the HBM used by GPUs, DDR5 is being used for key value cache offload during inference.
These KV caches store model state and can chew through significant amounts of memory — often more than the model itself — in multi-tenant serving scenarios.
Rather than discard these caches and recompute them when the model state is restored, it’s more efficient to offload them to system memory and eventually flash storage.
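The offload pattern being described is essentially cache tiering: hot key-value entries stay in local DRAM, and cold ones are demoted to a larger, slower tier rather than being thrown away and recomputed later. A simplified sketch, with tier names and capacities invented for illustration:

```python
from collections import OrderedDict

# Simplified KV-cache tiering: hot entries live in a small "DRAM" tier;
# when it fills, the least-recently-used entry is demoted to a larger
# "CXL" tier rather than being discarded and recomputed later.
DRAM_CAPACITY = 2   # entries; tiny on purpose for the demo

dram = OrderedDict()  # hot tier, kept in LRU order
cxl = {}              # cold tier

def put(key, value):
    dram[key] = value
    dram.move_to_end(key)
    if len(dram) > DRAM_CAPACITY:
        cold_key, cold_val = dram.popitem(last=False)  # evict LRU entry
        cxl[cold_key] = cold_val                       # demote, don't drop

def get(key):
    if key in dram:
        dram.move_to_end(key)
        return dram[key]
    if key in cxl:                  # promote back into DRAM on access
        put(key, cxl.pop(key))
        return dram[key]
    return None                     # true miss: would trigger a recompute

put("seq1", "kv-state-1")
put("seq2", "kv-state-2")
put("seq3", "kv-state-3")           # seq1 is demoted to the CXL tier
print("seq1" in dram, "seq1" in cxl)
print(get("seq1"))                  # promoted back, no recompute needed
```

The economics are the same at datacenter scale: a CXL pool is slower than DRAM but far cheaper per gigabyte than recomputing cache state on every request.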
The problem with using flash storage is that it has a finite write endurance. After a while it wears out. Instead, CXL memory vendors are positioning the tech as a more resilient alternative.
That’s bad news for enterprises looking to these memory godboxes for salvation from the RAMpocalypse. ®