Equinix has raised $15 billion in funding to expand its xScale data centers for AI, particularly for investments in the U.S.
Redwood City, California-based Equinix has built one of the backbones of the internet with data centers around the world. I visited a secret site once and was amazed at how big the places were that house tons of servers and cabling and cooling — and they’re about to get bigger and more plentiful.
Krupal Raval, managing director of xScale data centers at Equinix, said in an interview that the digital infrastructure company has completed the signing of a joint venture agreement, raising over $15 billion in capital with its partners. The exact mix of equity and debt is to be determined.
The limited liability partnership — subject to close in the fourth quarter — includes GIC and the Canada Pension Plan Investment Board (CPP Investments).
“The $15 billion announcement is associated with, frankly, just the scale of the opportunity and real projects that we’re targeting,” Raval said. “That’s one of the points behind the $15 billion and then the second element is that the partnerships are very key to this equation.”
Raval noted that xScale data centers and the plan behind them were hatched five years ago, and there have already been $8 billion in financial commitments prior to today’s announcement. GIC has supported the expansion in the past, and now CPP is joining to further invest in North America.
“We feel like it’s a great testament to the health of our partnership and great working relationship. So we’re beyond thrilled over the fact that GIC is continuing to double down. I guess it’s tripling down into this project. But in addition to GIC, we also have CPP as a new investor. And the reason for that is because the scale of the opportunity is so large, we thought it prudent to bring in multiple investor parties.”
Driven by increasing artificial intelligence (AI) and cloud growth, the joint venture is intended to accelerate the Equinix xScale data center portfolio, which enables hyperscale companies to add core deployments to their existing access point footprints at Equinix International Business Exchange (IBX) data centers. At full buildout, this new JV will nearly triple the investment capital of the Equinix xScale program.
Raval noted that the xScale program already represents an $8 billion commitment of capital, and this additional $15 billion will be invested in the U.S. to build out data centers to handle AI demand primarily in the U.S.
“It will change everything,” said Raval. “We are just in the early innings of AI. Everyone is talking about this as the single most important technological shift in generations.”
With the capital raised through the JV, Equinix expects the joint venture to purchase land to build new state-of-the-art xScale facilities on multiple greater-than-100-megawatt (MW) campuses in the U.S., eventually adding more than 1.5 gigawatts of new capacity for hyperscale customers.
Equinix has a longstanding relationship with GIC, having previously partnered on xScale projects in Asia, the Americas and Europe (see links below for details on other joint ventures). This agreement represents the first joint venture between Equinix and CPP Investments, which manages the assets of the Canada Pension Plan for more than 22 million contributors and beneficiaries.
Under the terms of the agreement, CPP Investments and GIC will each control a 37.5% equity interest in the joint venture, and Equinix will own a 25% equity interest. Each party has made equity commitments, and the joint venture also expects to take on debt to raise the total pool of investable capital to more than $15 billion over time.
Equinix’s existing hyperscale joint venture portfolio in Europe, Asia-Pacific and the Americas has a committed investment of over $8 billion, which is expected to result in greater than 725 megawatts of power capacity across more than 35 facilities at full buildout.
Platform Equinix features nearly 40% of the private on-ramps to the top global cloud service providers, which is more than any other provider. As hyperscale companies scale their operations at Equinix, the ecosystem of over 10,000 enterprises and other companies currently operating at Equinix can benefit from increased opportunities to directly connect and operate in proximity to the largest global cloud operators.
xScale data centers serve the unique core workload deployment needs of the world’s largest cloud service providers, including hyperscalers, which are key players in the AI ecosystem. These companies can add core deployments to their existing access point footprints at Equinix IBX data centers, enabling their growth on a single platform that can immediately span 72 global metros and offer direct interconnection to an ecosystem of more than 10,000 customers.
Equinix said it is committed to delivering sustainable digital infrastructure and engaging its suppliers and partners in supply chain responsibility. Equinix has continued to make advancements in the way it designs, builds and operates its data centers with high energy-efficiency standards, and all xScale data centers will be LEED certified (or certified in the regional equivalent).
Raval said that the company maintains the highest standards in its sustainable approach to building its data centers.
“It’s an industry gold standard in terms of where we stand and in terms of our commitment to sustainability,” he said. “For many years, we’ve had a commitment towards being 100% based on clean energy. We have science-based targets for 2030, and we’re the trailblazer in many of these things.”
The closing of the joint venture is subject to the receipt of required regulatory approvals, which are expected to be received in the fourth quarter of 2024. Morgan Stanley served as exclusive financial advisor to Equinix in connection with this transaction.
“As the world’s leading companies build out their infrastructure to support key workloads such as artificial intelligence, they require the combination of large-scale data center footprints optimized for AI training and interconnection nodes for the most efficient inferencing,” said Adaire Fox-Martin, CEO of Equinix, in a statement. “Our xScale and IBX offerings are uniquely positioned to address this business need, enabling companies to realize the powerful potential of AI.”
Goh Chin Kiong, chief investment officer for real estate at GIC, said in a statement, “We are proud to expand our years-long partnership with Equinix, addressing a massive and growing demand for digital infrastructure, driven by the rapid advancement of technology, including AI. GIC’s capital and scale, paired with Equinix’s operational expertise, has driven meaningful value across our investments together. Through this joint venture, we look forward to providing the funding needed to develop state-of-the-art digital infrastructure across the U.S. alongside our likeminded partner, CPP Investments.”
Max Biagosch, senior managing director at CPP Investments, said in a statement, “CPP Investments has invested in data centers for several years and we have developed strong expertise in this space. This investment will help meet the increasing demand for data centers driven by rapid technological advancements and marks a significant step forward in our broader data center strategy. We are pleased to partner with Equinix and GIC to deliver strong long-term risk-adjusted returns for the CPP Fund.”
Raval noted Equinix invests multiple billions of dollars in capital expansion in normal years.
“I don’t think that we should necessarily limit ourselves to whatever history is. I think we have the capability to do more,” Raval said.
All told, the xScale commitment is now about $23 billion. Raval thinks of a data center as a “product.” This amounts to about 20-plus gigawatts of power needed to run these data centers.
“We’ve designed a product that’s flexible, that can accommodate liquid cooling, that can run the gamut,” he said.
I asked about whether Nvidia’s belief in the onset of sovereign AI, where countries re-create their data infrastructure to make sure they own their own data, is a reason for this. And he said, in brief, yes.
“I think AI is going to grow everywhere,” he said. “This particular announcement is focused on the United States because we believe that the biggest growth market in AI is going to be the United States.”
Equinix expects an unspecified amount of hiring related to this project, which will include construction jobs. The company has already acquired its first U.S. xScale data center site, which will support 240 megawatts of power in the Atlanta area.
Sleep Number’s newest smart bed is designed to keep you cool at night. The ClimateCool Smart Bed, starting at $5,499, is the latest product from the company famous for its adjustable mattress firmness. In a press release, the company says the new mattress can keep your body at the optimal temperature with its “scientifically backed” cooling programs that could be of particular interest to women dealing with symptoms of menopause.
This is Sleep Number’s second smart bed that offers individual temperature control on either side of the bed. The Climate360, which launched in 2020, similarly actively draws heat away from your body to help you stay cool, but unlike the ClimateCool, it can also warm you up if you’re too chilly at night.
But while the 360 starts at a whopping $10,000 for a Queen size, the new ClimateCool starts at $5,499. This pricing includes a base; you can get the adjustable one for $1,500 more. Competitors such as Eight Sleep’s mattress cover, which can heat and cool and also offers an adjustable base, start at $2,649, but you need to bring your own mattress.
Sleep Number says the ClimateCool uses the same cooling technology as the Climate360, and both mattresses in the Climate series can use the active cooling feature enabled by its new SmartTemp cooling programs. These were developed with research from Northwestern University’s Feinberg School of Medicine and work in conjunction with ceramic gel layers and a breathable sleep surface in the mattress to keep you cool while adjusting to your body’s temperature throughout the night.
The ClimateCool smart bed has layers of ceramic gel, an airflow system, and a breathable surface to draw heat away from your body to help maintain a comfortable body temperature. (Image: Sleep Number)
Sleep Number said it conducted studies that found body temperature changes during menopause negatively impact women’s sleep quality. Its survey of more than 10,550 Sleep Number bed users found that “90 percent of female respondents experiencing menopause or perimenopause suffer from night sweats.”
The company claims its active cooling technology could help these women sleep better by sensing their body’s temperature change and drawing the excess heat away from them with its dynamic airflow system. As with its adjustable firmness, each side of the bed can be set to different cooling programs so you can stay cooler while your partner stays cozy.
Users can create their own cooling program or choose from two programs designed to address different needs, including recovery, deep sleep, menopause, illness recovery, and relaxation:
‘All Night Cooling,’ which keeps sleepers cool and can help ease temperature changes and hot flashes.
‘Deep Sleep Cooling,’ designed to help reduce sleep disruptions in the middle of the night.
As with all Sleep Number mattresses, the ClimateCool features adjustable firmness and built-in sleep tracking that measures your biosignals to provide you with a sleep report. These features are accessed through the Sleep Number app.
The Sleep Number ClimateCool smart bed is available now at sleepnumber.com and at Sleep Number stores, starting at $5,499 (Queen size, with integrated base) and $6,999 (Queen size, with FlexFit 2 adjustable base).
The WashG1 is a dedicated wet floor cleaner and Dyson’s first attempt to prove that it doesn’t just do carpets. It launched in the UK and Australia last month but has just gone on sale in the US. It’s currently only available to buy direct from Dyson, and has a list price of $699.99.
Unclutch those pearls; we all knew it was going to be expensive. I do think that some Dyson products justify their eye-watering price tags, but in this case, there are things worth factoring in before you decide to gamble your child’s college fund on a wet floor cleaner.
I tested one out and you can get the full low-down in my Dyson WashG1 review, but the gist is that it works fantastically well on perfectly smooth, flat floors like linoleum or polished concrete but is nowhere near as impressive on textured or uneven floors (including tiled floors with grouting gaps).
This is Dyson’s first dedicated wet floor cleaner (I say ‘dedicated’ because we do have the Dyson V15s Submarine, which is a vacuum cleaner with a wet floorhead that can be swapped in). Significantly for this brand, which has built its reputation on being really good at moving air about, it doesn’t use suction. Instead, it employs a combination of agitation, hydration and separation to get your floors gleaming.
Water is expelled through the cleaner head, rollers help loosen the dirt and pick up things like hair and solid particles, and then the inner mechanisms separate liquid and solid spillages. That last part is designed to make maintenance easier.
Should you buy one?
It’s very good at certain things. Like today’s best Dyson vacuums, it’s extremely maneuverable; the floorhead can pivot any which way, and it’ll get right up close to baseboards, too. The fact that it can handle liquid and solid waste is really helpful for things like dinnertime messes. I have a small niece and nephew who cannot complete a meal without coating everything in the vicinity with whatever they’ve been eating, and a once-over with the WashG1 is by far the least disgusting way to deal with it that I’ve found so far. The base also takes care of some of the maintenance by running a self-clean cycle when you dock it.
However, it’s not worth the investment if you have uneven floors. The WashG1 will struggle to clean them evenly, as I discovered when I tested mine on a flagstone floor. Because the rollers don’t really ‘scrub’, it’s only really capable of tackling surface dirt.
That includes missing the grouting cracks between tiles. (Apparently, the engineers found that adding more water is a more effective way to tackle stubborn dirt than rubbing at it, and while they might have a bit of a point, I still think there are limitations to this approach.)
Those niggles aside, it still might be a good investment for some shoppers. Because it’s brand new, don’t expect discounts any time soon – I have my fingers crossed for a price-drop in the Black Friday sales, though.
Ben and I have an annual ritual. For the last half decade, around this time of year, we run to the store, hastily unbox the latest iPhone and get shooting. We do this because we’re passionate about finding out everything there is to know about the new camera — not just to make sure things work well with Halide, but also because no other camera has as many changes year over year.
A byproduct of this ritual? A pretty thorough iPhone review.
If you’ve read our reviews before, you know we do things differently. They’re not a quick take or a broad look at the iPhone. As a photographer, I like to focus on reviewing the iPhone 16 Pro as if it were purely a camera. So I set off once more on a trip, taking tons of photos and videos, to see how it held up.
For the first “Desert Titanium” iPhone, I headed to the desert. Let’s dive in and see what’s new.
What’s New
Design
As a designer from an era when windows sported brushed metal surfaces, it comes as no surprise that I love the finish of this year’s model. Where the titanium on the iPhone 15 Pro was brushed on the side rails, this year features a more radiant, brushless finish that comes from a different process.
It is particularly nice on the Desert Titanium, which could also be described as “Sequoia Forest Bronze”:
The front features the now-standard Dynamic Island and slimmer bezels. The rear packs the familiar Pro camera array introduced way back in iPhone 11 Pro.
Its sibling, the iPhone 16, features a colored glass process unique to Apple. This year’s vibrant colors feel like a reaction to last year’s muted tones. I haven’t seen this process copied anywhere else, and it’s beginning to earn its rank as the signature style of the iPhone. The ultramarine (read: “blue”) iPhone 16 is gorgeous, and needs to be seen in real life. I went with the color Apple calls “teal,” but I would describe it more as “vivid Agave.”
The sensor array on the 16 has returned to the stacked design of the iPhone X. The motivation behind the change may be technical — better support for Spatial Video — but from an aesthetic perspective, I also simply prefer the vertical arrangement.
While beautiful to look at, that’s also about all I will say about iPhone 16. While admittedly a bit less colorful, the iPhone Pro line has always been Apple’s camera flagship, so that’s the one we’ll dive into.
Inside iPhone 16 Pro
A New 48 Megapixel Ultra Wide
The most upgraded camera is the ultra-wide camera, now 48 megapixels, a 4x resolution improvement from last year. The ultra-wide shows impressive sharpness, even at this higher resolution.
At 13mm, the ultra-wide remains an apt name. It’s so wide that you have to be careful to stay out of frame. However, it does allow for some incredible perspectives:
At the same time, temper your expectations. When the iPhone 14 Pro introduced a 48 MP sensor for its main camera, Apple also almost doubled the physical size of the sensor compared to the iPhone 13 Pro. This year, the ultra-wide is the same physical size, but they crammed in more photosites. In ideal lighting, you can tell the difference. In low light, the expected noise reduction will result in the same smudgier images you’d get from the 15 Pro.
One very compelling bonus of the 48 MP upgrade is that you get more than just high-resolution shots. It does wonders for macro photography.
Since the iPhone 13 Pro, the ultra-wide camera has had the smallest focus distance of any iPhone camera. This lets you get ridiculously close to subjects.
The problem was that… it was an ultra-wide lens. The shot above is a tight crop of a very wide frame. If you wanted a close up shot like that, you ended up with a lot of extra stuff in your shot which you’d ultimately crop-out.
In the past, that meant a center crop of your 12 MP ultra-wide image would get cropped down to a 3 MP image. In Halide, we worked around this with the help of machine learning to intelligently upscale the image.
With a 48 MP image, however, a center crop delivers a true 12 MP image. It makes for macro shots that are on another level.
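The crop arithmetic here is simple: a center crop keeps a fraction of the width and of the height, so pixel count falls with the square of the linear crop factor. That is why a 2× crop of a 12 MP frame leaves about 3 MP, while the same crop of a 48 MP frame still yields 12 MP. A minimal sketch (the function name is mine, purely illustrative):

```python
def cropped_megapixels(sensor_mp: float, linear_crop: float) -> float:
    # A center crop keeps 1/linear_crop of the width and of the height,
    # so the remaining pixel count falls by the square of the crop factor.
    return sensor_mp / linear_crop ** 2

# 2x crop of the old 12 MP ultra-wide: only ~3 MP left
print(cropped_megapixels(12, 2))  # 3.0
# the same 2x crop of the new 48 MP ultra-wide: a full 12 MP
print(cropped_megapixels(48, 2))  # 12.0
```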
Fusion Energy
Here’s the main meat: the camera most people shoot almost all their shots on, the iPhone 16 Pro’s 48-megapixel main camera sensor.
iPhone 16 Pro packs a new 24mm main camera, which Apple now dubs the Fusion camera. It is a new sensor, the ‘second generation’ of the 48 MP shooter introduced in the iPhone 14 Pro. The iPhone 16 is also listed as having a ‘Fusion’ camera — but they are, in fact, very different cameras, with the iPhone 16 Pro getting a much larger and higher-quality sensor.
‘Fusion’ refers to the myriad of ways Apple is implementing computational magic that produces high quality shots. If you were to zoom in on the microscopic structure of the sensor, you would see that every pixel is made up of four ‘photosites’ — tiny sensor areas that collect green, red, or blue light.
When the iPhone 14 Pro quadrupled its resolution, Apple opted for a ‘Quad Bayer’ arrangement, dividing each photosite into four rather than adopting a denser ‘regular’ arrangement. There’s a huge benefit to this arrangement: the sensor can combine those adjacent sites to act like single, larger pixels, so you can shoot higher-quality 12 MP shots. This was already employed in video and Night mode.
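The combining step is known as pixel binning: average each 2×2 block of photosite values into one output pixel, trading resolution for lower noise. This toy sketch ignores the color filter array (real Quad Bayer sensors bin same-colored photosites under one filter), so treat it as an illustration of the principle only:

```python
def bin_2x2(pixels):
    # Average each 2x2 block of photosite values into one pixel,
    # halving resolution in each dimension (a quarter of the pixel
    # count) while averaging down per-site noise.
    h, w = len(pixels), len(pixels[0])
    return [
        [(pixels[y][x] + pixels[y][x + 1] +
          pixels[y + 1][x] + pixels[y + 1][x + 1]) / 4
         for x in range(0, w, 2)]
        for y in range(0, h, 2)
    ]

# A toy 4x4 "sensor" bins down to 2x2:
raw = [[10, 12, 20, 22],
       [14, 16, 24, 26],
       [30, 32, 40, 42],
       [34, 36, 44, 46]]
print(bin_2x2(raw))  # [[13.0, 23.0], [33.0, 43.0]]
```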
The ‘Fusion’ workflow essentially combines the 48 megapixels’ worth of detail with the binned 12-megapixel mode to produce great 24-megapixel shots. I think this is perfect. I firmly believe most people do not benefit from giant 48-megapixel photos for everyday snaps, and it seems Apple agrees. It’s a very Apple decision to use more megapixels but intelligently combine them to get a better outcome for the average user.
Is processing very different from last year? No, not really. It was great, and it’s still great. While there’s slightly more processing happening, I found it difficult to spot a difference between iPhone 15 Pro and iPhone 16 Pro captures. The sensor is the same physical size as last year’s iPhone 15 Pro / Pro Max, and still delivers delightfully shallow depth of field as a result.
The larger the sensor, the nicer this is, and it really renders beautifully — especially in its secondary telephoto lens mode.
Telephoto: 5× and Fusion at Work
The telephoto camera is a defining characteristic of the Pro line of iPhones. Last year only the 15 Pro Max featured the 5× ‘tetraprism’ lens. This year it’s standard across the Pro line, and I’m happy I have the option of going smaller this year.
That said, I’m a huge fan of the outgoing 3× lens. It was dang near perfect for me. Now, every focal length between 1× and 5× is bridged by the 48 MP main camera, and that’s a bit controversial. Because of its quad-Bayer configuration, there’s been a question as to whether the 48-megapixel main sensor really resolves 48 MP, since it needs to do a bit more guesswork to recover details.
Well, comparing a 12 MP crop on the sensor to a “real” 12 MP image shot on iPhone 12 Pro, I preferred my ‘virtual’ output on the 16 Pro.
I’ll admit that years ago I was a skeptic. I like my lenses optical and tangible, and it feels wrong to crop in. Well, this past year, I’ve been sporting the iPhone 15 Pro Max with its 5× zoom, so I found myself using the imaginary 2× lens much more to bridge the gap between focal lengths.
Thanks to the wider aperture on the Fusion camera, the virtual 2× produces better results than the physical 2× of the past. I really like it. I no longer want Apple to bring back the physical 2×. Give me an even larger, better Fusion camera.
As for the 5×, after a year of real-world use on the 15 Pro, I don’t want to lose that reach. It’s like having a set of binoculars, and amazing for wildlife, landscapes, or just inspecting things far away.
On a creative level, the 5× can be a tricky focal length to master. While the ultra-wide camera captures everything, giving you latitude to reframe shots in editing, the 5× forces you to frame your shot right there. Photographers sometimes say, “zoom with your feet,” which means taking a few steps back from your subject to use these longer lenses. This requires a bit more work than just cropping in post, but the results are worth it.
At night, the telephoto camera suffers: it’s the only remaining 12 MP sensor, and its narrower field of view lets in less light. I’d appreciate a larger or 48 MP sensor in the future, not for the added resolution, but to reduce noise through binning. What this camera needs more than anything is more light; it would be transformative, and I hope Apple takes things in this direction in the future.
For portraits, which usually happen in a more controlled lighting environment, the 5× telephoto also truly shines. It’s a great lens, and we’re all better for having it on all the Pro iPhones.
Night Photography
As the sun set, I noticed the latest display made a big difference. With a screen that can dim down to one nit, shooting out in the dark was really pleasant.
Within Night mode, HDR now allows a larger dynamic range to be captured. However, it was still a frustrating dance at times to get exactly what I wanted out of the exposure, with some exposures overdone and inconsistent exposure times. In fact, I enjoyed shooting on the iPhone 16 Pro outside of Night mode, as it gave me darker, contrastier shots.
Night Mode remains incredibly impressive, and its intelligence produces solid results without any thought on your part — but at times it can still be frustrating to get exactly what I want. I wish there were an API for apps like Halide to dial in manual settings.
(If anyone at Apple reads this, we filed request FB11689438.)
Under the Hood
If you were to treat this as a review of the iPhone as a camera, there’s actually more to talk about than the cameras. This is a unique year, because the iPhone 16 Pro packs improvements that go beyond the cameras — touching on every part of the photography and videography workflow. In my testing, USB transfer speeds were faster than on my iPhone 15 Pro. On the wireless front, Wi-Fi 7 offers up to 46 Gbps, in theory.
The new modem in here has given me easily my best cellular download speeds — in more places. I pulled down a 450 MB offline map of the Mojave Desert in Joshua Tree in less than a minute.
On the wireless power front, I noticed much faster wireless charge speeds with a new MagSafe cable, and also when plugged in. All those things add up to minutes, hours, even days saved on the job.
Thermals are a make-or-break aspect of an iPhone, especially now that it shoots computationally intensive video like Apple Log with ProRes. I tested by shooting 4K at 120 fps for a bit, and found it considerably less hot than the 15 Pro under similar demand. In fact, I never got it to overheat!
Average users will appreciate these quality of life improvements, and Pros will appreciate how it lets them push these devices further than ever before.
Digging deeper into the camera subsystems, the new “Apple Camera Interface” internals allow for faster sensor readout times. This improves features like QuickTake (not that QuickTake), the feature that lets you quickly take videos by holding the camera button.
Previously, it wasn’t possible to quickly reconfigure the camera system for high-quality video. QuickTake footage looked on par with your viewfinder’s video feed, which isn’t as high quality as what you’d record from the camera’s dedicated video mode. On iPhone 16 Pro, QuickTake gets far better processing — Dolby Vision HDR, 4K resolution, the works. It’s noticeable.
Burst ProRAW 48 MP capture performance is also much faster. When absolutely mashing the shutter, the 48 MP ProRAW frame rate clocked in at 2× the iPhone 15 Pro’s speed. This is good news, but it doesn’t solve the tradeoff that comes with ProRAW files — the lag. Apple talked about ‘Zero Shutter Lag’ in the keynote, and that’s exactly what this is about.
When an iPhone captures a ProRAW photo, there’s a two step process. First, the iPhone captures a burst of photos, and then it merges those photos together with the help of sophisticated computational photography algorithms. The iPhone 16 is faster at the first step, grabbing source photos. It still takes several seconds to process the resulting shot, but if you tap the shutter button the camera will now take a photo practically instantaneously — where there was a very real delay before.
The improvement is huge in practice. In total, the iPhone 16 Pro beat the iPhone 15 Pro by anywhere from 400 to 900 milliseconds. Hundreds of milliseconds matter in the moment, and could mean the difference between getting the shot or missing it completely. It’s a massive improvement and a huge achievement, technologically.
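One way to picture the two-step design is a producer/consumer split: the shutter path only enqueues the burst, which is nearly instant, while a background worker performs the slow computational merge. This sketch is my own illustration of that pattern, not Apple’s actual pipeline; the “merge” is a stand-in average:

```python
import queue
import threading
import time

# Decoupling capture from merging: "tapping the shutter" only enqueues
# a burst of source frames (fast), while a background worker performs
# the slow computational merge later.
pending = queue.Queue()
merged = []

def merge_worker():
    while True:
        burst = pending.get()
        if burst is None:      # sentinel: no more bursts are coming
            break
        time.sleep(0.01)       # stand-in for the expensive merge step
        merged.append(sum(burst) / len(burst))  # toy "merge": average frames

worker = threading.Thread(target=merge_worker)
worker.start()

# Two shutter taps return almost immediately, even though merging is slow:
for burst in ([1, 2, 3], [4, 5, 6]):
    pending.put(burst)

pending.put(None)  # signal shutdown
worker.join()
print(merged)  # [2.0, 5.0]
```

The shutter call is just a `put()`, so its latency is independent of how long the merge takes — which is the essence of zero shutter lag.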
Software
While the hardware was upgraded, the iPhone 16 line also comes with iOS 18 — a huge update that touches on every single part of the photography experience. We won’t touch on Apple Intelligence or Clean Up, which won’t be ready until next month, but there’s still plenty to talk about in iOS 18.0.
Capture overhaul
You can finally open camera apps from the Lock Screen, which is the single biggest feature request we’ve had from Halide users. In the past, we had to make do offering widgets you could load on your Lock Screen, but real Lock Screen support goes way beyond that, letting you capture photos without unlocking your device.
Aside from several changes in the camera app like being able to pause recordings and keep music playing while you record, there’s an elephant in the room… Photos.
This year, Photos received its biggest overhaul since the first iPhone, and reactions will be subjective. For me, it’s been challenging to adapt — but I do believe in the mission. Photos’ fundamental interface has not changed in 16 years, and I do think it has to evolve. For most, it might really work better. Its added customizability is a step forward, and fits the theme of giving you greater control.
Shooting in Style(s)
Which brings us to Photographic Styles, which have also been overhauled. When they were introduced with the iPhone 13 Pro, they were simple filters. You could pick a warm look, a more contrasty look, maybe a cooler look, but the results were all pre-canned.
Now consider this salt flat. You might want to bring out the coolness of the sunrise and really make that a vivid blue in contrast to the sky above:
But if I applied a simple filter, it would apply the look to both skin color and the sky equally. Blue skin doesn’t work outside James Cameron movies. These new Photographic Styles can target undertones, the way the filter affects skin tones in your shots, making things feel more natural.
These filters are named after moods, such as “Dramatic” or “Quiet.” You can fine tune them with a two dimensional pad. There’s also a slider to adjust the color palette.
Maybe it’s just me, but I found the UI a bit bewildering at first, so I drew this legend to illustrate.
Your adjustments get wiped after a while, though, if not configured to persist.
In the past, I avoided photographic styles, because they were destructive; if I went with a black and white style, I’d lose all the color information. The coolest change to photographic styles this year is that they’re “perceptually non-destructive.” You should be able to reverse the effects of the style, later.
It passed my test — it worked great for me. This even survives an AirDrop to a friend — they can undo or edit the Style as long as the metadata remains intact.
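Conceptually, a “perceptually non-destructive” style behaves like a parameterized, invertible edit whose parameters travel with the file as metadata: any device that reads them can undo or re-tune the look. A toy model of that idea — the functions and the gain/lift parameters are hypothetical, purely for illustration, not Apple’s actual rendering math:

```python
def apply_style(value, gain, lift):
    # A toy "style": a reversible tone adjustment on one channel value.
    return value * gain + lift

def undo_style(styled, gain, lift):
    # Because the parameters are preserved, the edit can be inverted.
    return (styled - lift) / gain

# The parameters travel with the photo as metadata, so any device
# that reads them can undo or re-edit the style:
metadata = {"gain": 1.2, "lift": 8.0}
styled = apply_style(100.0, **metadata)
restored = round(undo_style(styled, **metadata), 6)
print(restored)  # 100.0
```

Contrast this with a destructive filter, which bakes the look into the pixels and discards the information needed to reverse it.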
The added control in Photos also allows you to tune down the “HDR look,” one of the more polarizing aesthetics of iPhone photos. However, Photos doesn’t reduce sharpening, noise reduction, and subject-based adjustments. They still give your photos a “Shot on iPhone” look, whether or not that’s your cup of tea. For deliberate photography, I’m sticking to RAW. For quick snapshots I’ll be shooting in a custom Rose Gold Style.
Video and Audio
iPhone 16 Pro brings 4K 120fps video, including ProRes Log (!). It’s a huge upgrade, and the improved controls in Photos to adjust playback speed are a welcome change too. 4K 120fps video can be recorded in HDR / SDR and with full processing (Slo-mo video uses a visibly lower quality process), whereas ProRes can only be captured to an external SSD. I love the ‘new Apple’ that is shipping features like this, clearly aimed at professionals; I don’t see many people shooting iPhones with an SSD attached, but for those that do, this is a fantastic improvement on what has already proven to be the best phone to capture video on.
With Log and ACES, shots from iPhone simply fit into a nice workflow and can slip in undetected as deep depth of field B-roll no problem:
I am not a tremendously heavy user of iPhone mics, but both iPhones (iPhone 16 and iPhone 16 Pro) get an improved 4-mic array that supports a new Audio Mix feature. It lets you filter out background noise or re-render your audio recording as if it was captured in a studio or mastered more cinematically.
iPhone 16 Pro can capture Spatial Audio along with your Spatial Video, and does this computational audio processing a bit better than its amateur sibling. It’s very impressive, and can be a huge benefit if you find yourself without a mic — which for most people is probably most situations!
A minor improvement I would suggest to make this more useful to us: allow 4-mic audio capture sessions to run during recordings that use an external microphone. The peace of mind of having a usable backup recording with Audio Mix would be tremendous.
Camera Control
Okay, here’s the really big deal. Something entirely new on your iPhone.
Over the life of the iPhone, its buttons have either remained the same, evolved, or vanished. Here’s the original iPhone: home, power, volume down, volume up, and a ringer switch.
The first thing to change was the home button. It became a fingerprint sensor and no longer actually clicked down. With iPhone X, it was finally put out to pasture: a full-screen phone didn’t need a home button. A system of gestures worked much better, and Face ID removed the need to scan your finger to unlock.
After that, things stayed the same up until last year, when the ringer switch became an action button. That’s evolution on par with the home button. Everything so far has been evolution or reduction.
The addition of a new control, then, is a huge deal. I feel like everyone is being fairly casual about this, when Apple is extraordinarily focused on reducing things down to the bare essentials. This showing up on the outside of your iPhone means Apple views it as essential.
How is it in actual use? To me, the most important part about controls on a camera is that they become an extension of you. You can get it in your fingers and use it blindly. You know what a press or swipe sets. Camera Control delivers on this on some fronts, and not on others.
At the core of the Camera Control experience are two fundamental interactions: one is to open your camera — truly, yours; it can be any camera app — and the other is to interact with it.
The Button
The first was something I had truly underrated when I saw the announcement. What caught eyes and headlines about the Control is the way you can half press and swipe on it; after all, we’ve had camera buttons on phones before. When I got my first iPhone, my then-girlfriend was deep into fancy Nokias — her Nokia N95 had a camera button (and a lot of other ones, too). Nothing new here. Or is there?
I found myself grabbing my ‘old’ iPhone 15 Pro after just days of using the 16 Pro and pointlessly mashing the side of the phone instinctively when I went to take a shot. The Camera Control (don’t call it a button!) is flush with your iPhone; it does not detract from the regular, familiar iPhone hand feel. But it will change the way you interact with it all the same.
Take a beat right now if you are reading this on your phone. Imagine a sudden flash of light causes a gorgeous rainbow to appear in an instant outside your window. Close your eyes. What routine do you have to quickly open your camera?
I had one. We all have some kind of routine, and after years of iPhone use, it’s pretty hard wired. It might take you some time to override this on iPhone 16 Pro, but once you do, it’s much, much faster. You just press that button. Locked? Press the button. Reading in the News app? Press the button.
When I reflexively went to do it on my older iPhone, the phone felt broken — as if you’d press the side button and the screen didn’t light up. I think we’ll see this camera opening button on many if not all Android phones very soon. It just becomes routine so fast, and once this gets in your muscle memory it’s extremely frustrating when it’s not there. You miss a shot. Because of that stupid button-less phone.
When Apple adds something like this, it tends to be just a bit more thought out than a new button that takes a photo — not a thing tacked on as a quick shiny thing to entice buyers. Thoughtful details abound with the camera-triggering press: in your pocket, iPhone won’t open the camera if you press it by accident, as was so wonderfully tested in Faruk’s review:
John Gruber wrote an excellent part of his review going into more detail on what makes it behave the way it does. I myself found all this ‘smart’ behavior solid — I haven’t ended up with any errant snaps.
Let’s talk about the rest of this control, though — what is beneath the Sapphire surface.
The Adjustments
This button can be half-pressed, which is to say, not pressed fully. A light press on the Control while Camera is open brings up an Adjustment menu. Swiping on the Control itself lets you dial in the selected setting. The settings are, in order: Exposure, Depth, Zoom, Cameras, Style, and Tone.
By default, it behaves as a zoom dial. The dial ’snaps’ to the native focal lengths of each lens fairly aggressively, which is a good thing because a swipe on the Control has momentum if you swipe and let go. For precise adjustment, keeping your finger on the Control will allow pretty fine-grained dialing-in with minimal finger movements. I am impressed with its precision.
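If you’re wondering what that snap-with-momentum behavior boils down to, here’s a rough sketch of the idea in Python. Everything here — the names, the friction value, the snap threshold — is my own illustration, not Apple’s actual implementation:

```python
# Illustrative model of the dial: a released swipe coasts with momentum,
# then snaps to a native focal length if it settles close to one.

NATIVE_ZOOMS = [0.5, 1.0, 2.0, 5.0]  # the iPhone 16 Pro's lens stops
SNAP_THRESHOLD = 0.15                # relative distance that counts as "close"
FRICTION = 0.85                      # momentum decay per tick

def settle(zoom: float, velocity: float) -> float:
    """Let a released swipe coast to a stop, then snap if near a lens."""
    while abs(velocity) > 0.001:     # coast until momentum dies out
        zoom += velocity
        velocity *= FRICTION
    zoom = max(NATIVE_ZOOMS[0], min(NATIVE_ZOOMS[-1], zoom))
    nearest = min(NATIVE_ZOOMS, key=lambda z: abs(z - zoom))
    if abs(nearest - zoom) < SNAP_THRESHOLD * nearest:
        return nearest               # aggressive snap to the native lens
    return zoom
```

In this toy model, a flick that coasts to anywhere near 1× lands exactly on 1×, while keeping your finger on the Control skips the momentum phase entirely, which is what makes the fine-grained dialing possible.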
Regardless, if you are like me and consider zooming to be no more than cropping your shot, Apple’s ‘Cameras’ Adjustment is by far my favorite way to use it. The Adjustment has all four ‘lenses’ in a row — from 0.5× at one end, through 1× and 2×, to the 5× telephoto at the other. The result is a quick, pleasing way to cycle through your framing options with a satisfying level of precision — an interaction iPhone cameras have never had.
The Cameras adjustment can be used… blindly. This may sound bizarre on the face of it — why would you want to operate the camera without seeing it on a smartphone? Well — recall that reflexive habit-forming I described, of opening your camera without looking at it by pressing the Control? The same applies here. Not only can I open it, I can swipe, feel the haptic click of landing on the ultra-wide or telephoto, and raise my camera to take my shot.
You’ll see photographers look at a shot and have a hand on their lens, snapping to a setting and then raising it to their eyes to shoot. It’s essential. With this, I can hold my phone in an awkward position with little visibility and shoot through one of the lenses without seeing the screen. I ended up using this a lot. It’s really hard to put into words, but it becomes something in your fingers; a really tactile camera experience that is more of an extension of you. It’s so nice. It’s just like using a camera lens.
That brings me to the not-so-good part: the Cameras adjustment experience is so nice, integrated and good that it makes the rest of the adjustments feel less great.
Apple has successfully kept a lot of its Camera app paradigms rooted in traditional concepts of photography. Portrait mode features f-stops for its depth effect; lenses are described in full-frame equivalent focal lengths. This stuff matters: it exposes users, even casual ones, to the fundamentals of photography. Through the most popular camera, everyone continues to be educated about these things and can learn what they mean in a very hands-on manner.
Camera Control offers a lot of options, and in doing so, I feel it somewhat breaks from your traditional expectation of what a ‘dial’ on a camera does. Dials do one thing. This does many. In doing so, it departs from a camera convention whose simplicity is appreciated by amateurs and professionals alike.
In my ideal setup, Camera Control simply has one, potentially mode-dependent, adjustment. Ideally, it has a logical and predictable start and end (‘opening up’ an aperture can be done without looking at the lens — a similar thing goes for the zoom range). Simplicity can be its flexibility: ideally, it is so predictable and applicable to the camera task at hand that it works even if you cannot see an on-screen interface. Having a “double light press” and navigating a sort of mini-meta-menu system just ends up feeling kind of clunky and odd.
It ends up packing a lot of on-screen interface, and that can also get in the way: if I launch into the Camera, swipe quickly to get to the ultra-wide, then hold my finger on it to be ready to shoot, the Camera Control overlay keeps hovering over my frame.
In all, I think the relative plethora of Adjustments makes it feel clumsier and less sleek and snappy than it could be. Given its soft haptic feedback and many options, it can seem a bit overwhelming even to more photographically savvy users. Those more conspiratorially minded might assume Apple added more features here to compensate for the iPhone having fewer at launch; I myself think it’s just a commendable first attempt to do something new.
Focus
For us as developers, it is an interesting new thing. It seems, for now, uniquely catered to us: not only can you set the Camera Control to open any (camera) app like Halide, you can also create your own Adjustments. The API allows us to pick a system icon and use a basic picker — no custom interface — to tie into features of our own app.
It was tempting to just rush into this and have something on day one, but we really wanted to live with the Camera Control and the devices for a while to see how it would fit into our way of doing things. We like to do things a certain, opinionated, focused way. And that’s exactly what we did: Camera Control in Halide offers two Adjustments: EV, to adjust exposure, and Focus, at the end of the scale.
Much like Cameras, a manual focus adjustment allows you to quickly focus on something as close as possible without looking at the phone. The adjustment for exposure lives in the middle, with a bit more latitude than the system (we go up to ±6 EV, vs. ±2) — and the top one is like the neutral in your gearbox: “Lock”. Leaving a simple locked adjustment at the top level means Halide does not suffer from any accidental triggers in case you have sensitive adjustments.
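For a sense of what that extra latitude means: each EV stop doubles (or halves) the light, so the two ranges compare like this. A trivial back-of-the-envelope sketch, nothing Halide-specific:

```python
# Each EV stop is a factor of two in exposure.

def exposure_factor(ev_bias: float) -> float:
    """Multiplier on exposure for a given EV compensation."""
    return 2.0 ** ev_bias

# A +/-2 EV range spans a 16x spread in exposure;
# a +/-6 EV range spans a 4096x spread.
system_spread = exposure_factor(2) / exposure_factor(-2)   # 16.0
wider_spread = exposure_factor(6) / exposure_factor(-6)    # 4096.0
```

That 256-fold difference in spread is what lets you rescue a shot that would otherwise be hopelessly blown out or crushed.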
On The Nature Of Shutters
There’s an invisible aspect to Camera Control I want to touch on before we move on. I noticed it is also deeply integrated into a low-level improvement I mentioned before — and to understand that, you have to look at how cameras take photos.
Try pressing a shutter button on a regular camera. Film or digital — it will make a quick click. The moment the button reaches the bottom of its throw is when a camera takes a photo.
iPhone 16s do not do that. In fact, they cannot do that. What do I mean by ’that’? Taking a photo as soon as you press down. They take a photo when you release the button. This is something we worked hard to avoid in Halide: when you press the shutter, you want the smallest possible delay; a shutter should fire when the shutter is triggered, not upon release.
But the Camera Control can be long-pressed to take a video. How, then, do you still capture what you see on your screen? Therein lies the smart part of this camera — using the aforementioned Zero Shutter Lag, it can offset the ’slowness’ of the button by grabbing a photo in its buffer. It’s remarkable, and works great for getting a steady shot despite your press, and despite any delay from raising your finger.
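One way to picture it: the camera keeps a rolling buffer of recent frames, and on release it reaches back for the frame nearest the moment you pressed. A heavily simplified Python sketch, with made-up names and no relation to Apple’s actual pipeline:

```python
from collections import deque
from dataclasses import dataclass

@dataclass
class Frame:
    timestamp: float  # capture time in seconds
    pixels: object    # stand-in for image data

class ZeroShutterLagBuffer:
    """Keep the last few dozen frames so a 'late' release can still
    produce the photo that was on screen at press time."""

    def __init__(self, capacity: int = 30):
        self.frames = deque(maxlen=capacity)

    def on_new_frame(self, frame: Frame) -> None:
        self.frames.append(frame)  # oldest frame falls off automatically

    def capture(self, press_time: float) -> Frame:
        """On release, pick the buffered frame closest to the press moment."""
        return min(self.frames, key=lambda f: abs(f.timestamp - press_time))
```

The deque’s `maxlen` does the ‘ring’ part for free: old frames fall off as new ones arrive, so the buffer only ever holds the last second or so of footage.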
The Long Game
I am obviously excited about what the Camera Control brings to the iPhone. It’s a huge change, but it’s easy to miss the long view here.
There’s a reason this isn’t restricted to the Pro phones the way the telephoto is. Apple knows something about cameras: they will mean something very different in the years, and probably decades, to come.
As our devices become our intelligent companions, cameras are their most important sensors: their eyes to the world. Accessing the toggle that lets them see and interact with the world is exactly what this control is about. While I feel tremendously catered to, I do think the long view here isn’t to use this as an aperture ring or a focus dial — it’s a button and aperture for the intelligence inside your device.
Processing
And that brings us to the intelligence that does live in this device and controls how every image comes out: Apple’s intelligent image processing.
Image processing has been a hot topic of these reviews for a while now, and this generation is no different. It’s something a lot of iPhone 16 reviews have already talked about in varying ways.
Here’s the thing that won’t change, review after review: an iPhone is just better at being a computer than a camera. That’s the reality of it. If you have a large camera with a big lens and a big sensor, it can gather a lot more light. That’s just physics. If you have a small camera and a small sensor, you’re going to have to make up for it somehow. The way the iPhone makes up for it is by being a better computer than a camera. All the computational magic that it does, merging a dozen frames into one, gives it great dynamic range. It lets it take photos at night. It does magic — stuff a small camera shouldn’t be able to pull off.
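To make that frame-merging concrete, here’s a toy statistical model of why it works: averaging N noisy readings of one scene cuts random noise by roughly the square root of N. This is a sketch of the principle, not of Apple’s actual pipeline:

```python
import random

def capture_frame(true_value: float, noise: float, rng: random.Random) -> float:
    """One noisy sensor reading of a single pixel."""
    return true_value + rng.gauss(0, noise)

def merged_pixel(true_value: float, noise: float, n_frames: int,
                 rng: random.Random) -> float:
    """Average n_frames readings, the way a stacked computational capture would."""
    readings = [capture_frame(true_value, noise, rng) for _ in range(n_frames)]
    return sum(readings) / n_frames
```

Merging a dozen frames knocks random noise down by about 3.5×, which is a big part of how a tiny sensor produces a usable night shot.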
It’s honestly invisible and fantastic when it works. But when it doesn’t, and it does something unexpected, it’s not great. Is that different this year?
In brief: if you were a fan of the iPhone 15 Pro’s processing, you will enjoy what iPhone 16 Pro is offering up this year. And if you didn’t, there is now a genuinely useful and mostly-lossless way to get shots looking very different than the past years’ iPhones without editing them.
I think there are people at Apple who probably want the iPhone camera to have a more opinionated ‘look’ — but at this point, a billion people use it. It’s an eternal balance between being a tool for creatives and being the most popular tool in people’s hands to capture the world around them as it exists. Not an easy task.
That being said: I think Apple should put almost all of its effort into achieving the seemingly impossible: a noise reduction method that looks more natural than AI ‘making up’ details or watercolor smudging. If anyone can make grain a symbol of photographic craft and authenticity, even out of a digital camera, it’s Apple. I still get shots where iPhone goes to tremendous lengths to prevent me from having a noisy shot. I can see that extracting detail from the noise is difficult; but the resulting image just looks odd.
If there is one theme to iPhone’s approach to photography this year, it’s more control — that applies to Camera Control and to Photographic Styles alike. But the output remains rather processed, whether you like it or not. My advice?
Start to accept that highly processed images are here to stay.
As technology marches on, we are using cameras that help us achieve greater results than the physics alone would support — but in doing so, some level of creative control is lost. And while we have tools, like our Process Zero, to achieve what I would call ‘old-fashioned photography’, we are not sure that approach will even survive far into the future.
As we strive for ever-thinner devices, folding phones and the tech we see in science fiction, processing is the only thing that enables cameras to work within the increasing constraints on power and size they have to fit into.
Even on your new iPhone, camera quality isn’t quantified only by the sharpness of a lens or the rendering of a single image anymore. The definitions of color and sharpness have given way to photography reborn as data science. Your camera gathers signal — and in that signal is noise. The more signal it can acquire, the better. It can handle the aberrations, it can handle the noise with extra processing — as long as it can maximize its light input. In native RAW captures, we see more color fringing than years ago; it’s just very well processed out of your shot. Lenses get ‘worse’ — but the photos get better.
That’s why I am here to tell you not to be optimistic about our cellphone cameras moving toward less processing. Cameras are being optimized for a future where photography relies increasingly on magic — and today’s processing will seem quaint. Things in a decade will be very different from how they are today.
iPhone SE (Spatial Edition)
I’ve talked a lot about photography and video changing, but if you’ll humor me for just one more moment, I’ll talk about one change that excites me. Apple’s push into Spatial photo and video might not be for everyone, but its existence helps solve a chicken-and-egg problem in an emerging medium that has moved more people close to me to tears than I can recall.
Spatial media — that is, photos and videos shot in 3D for you to relive on a device like Apple Vision Pro — is still nascent.
There are various tools for capturing immersive and spatial video and audio, but if this is the first iPhone built from the ground up for AI, it’s equally fair to say it’s the first one built from the ground up for Spatial Capture.
That excites me, not because I am an avid lover or consumer of it, but because it’s a genuine new form of media arts that does not involve boiling a lake to generate an image of an astronaut riding a cat. I love that Apple’s working hard to make tools, regardless of demand. The only way we can experience amazing art is if we invent the tools to make it, first.
Verdict: A Camera That Adds Something
iPhone 16 Pro, along with iPhone 15 Pro and 14 Pro, are all what I would call ’seismic’ camera releases for Pros: the kind with changes so significant that you would not consider it an incremental move, but one that makes it practically impossible to go back.
iPhone 14 Pro brought us a large, gorgeous 48MP main camera. iPhone 15 Pro brought ProRes Log. And now, iPhone 16 Pro brings Zero Shutter Lag and Camera Control.
If you want a quick verdict: the iPhone 16 Pro is a tremendous camera because between Camera Control, Zero Shutter Lag and its advanced Photographic Styles, it will capture more moments than any iPhone ever did by a huge margin — and that in itself makes me recommend it over any previous one.
That being said, there’s a larger feeling I am left with after reviewing this device in my hands.
As I feel myself getting older, I hold on to the idea of what I think a ‘camera’ or ‘photography’ is more and more. The same happened with cellphones. People used to ridicule that your telephone had a camera on it. No doubt there were purists that said, “well, in my day, this was a thing you took phone calls on. Not a computer in your pocket.”
Here I am: in my day, a camera was a thing you took photos on. Not a computer brain’s eyes to the world. Perhaps I am feeling this is a big change because this is possibly close to the last of its kind, a link in the evolution: an iPhone that has long since redefined what a phone is, but is about to redefine what a camera is, and what photography means.
Recall the introduction of the iPhone as a phone, an internet communicator, and an iPod. Notably lacking? The camera.
This iPhone is a camera. Maybe the first, if you were to define a camera as a device with a dedicated control for it.
It was in a place like this where one of Steve Jobs’ greatest inspirations once stood and imagined a revolution in photography that shocked the world. He imagined something simple: Instead of having to hire a photographer with a camera, who would bring film to a lab or go into a darkroom to present the shot days later, he imagined a small, elegant metal rectangle that fit into your pocket.
You could simply take it out, slide your finger on its surface to adjust your shot, and take the photo. The real magic? You’d take it and see it; no need to develop any film. Instant gratification.
That man was Edwin Land. He envisioned something most considered impossible: the Polaroid SX-70. It changed photography forever. It seems futuristic today. And guess what? The only controls on that camera were a button… and one slider, right here at the top.
Land didn’t create this because he was obsessed with technology. He wanted to strip away the complications of photography and make it accessible. To focus on the craft and art, and worry less about know-how or technique. To truly bring it to its essence: empowering anyone to capture a moment. Surely, some lamented the loss of craft. The loss of essential parts of photography.
Perhaps it was the camera phone that was the next step that truly made photography even more accessible and instant. But many feel like something was lost. It’s telling, then, that where Land removed so many parts of the camera, Apple is adding one.
Apple adding a new control – a button, a dial – to iPhone isn’t a move it makes casually. It’s an admission of a fundamental change in iPhone’s nature that happened over time. An admission that iPhones are far less phones today, and far more cameras.
But as a photographer, remember that ‘camera’ might really mean something entirely different than what we are used to — phones once made phone calls. Today, cameras take photos. In the future? Perhaps this is much more a lens to see and process the world with. A camera, as it is defined in the 21st century.
If I’m reviewing this the way it is, then, I’m really enjoying what I have in my hands. A device on the edge of the sands of time — rooted in the cameras I love, with just enough of the future of photography packed in here for me to manage.
The UK, US and Australia have announced sanctions against 16 people authorities accuse of being part of the most wanted cyber crime gang in the world.
Russia-based Evil Corp is accused of stealing around $300m in nearly ten years of hacking.
The UK’s National Crime Agency (NCA) says it can now reveal the gang’s notorious leader, Maksim Yakubets, has been supported by his father Viktor Yakubets – something he had denied when interviewed by the BBC in 2021.
The information has been released as part of a large, multinational operation to disrupt Evil Corp and another notorious hacking group called LockBit.
Known for its mafia-style operation, Evil Corp has waged a campaign of destructive cyber-attacks worldwide for over a decade.
In 2019, Maksim Yakubets was sanctioned and a $5m bounty was put up for his arrest, along with another man called Igor Turashev.
Other Russian individuals, including Yakubets’ brother Artem, were also named as part of the US sanctions and designations.
In 2021 the BBC travelled to Russia to search for and interview members of the gang to get their side of the story.
At a former home of Maksim Yakubets we found his father, who gave an impassioned defence of his son while claiming he was personally innocent.
But now the NCA says that Yakubets senior was a major part of the cyber-crime group, accusing him of aiding the gang in laundering some of its stolen funds.
As well as the Yakubets family members, Maksim’s father-in-law was also sanctioned for helping to protect and coordinate the group with his connections to the Russian security services.
Western authorities have now officially linked Eduard Benderskiy, a former high-ranking FSB official, to Evil Corp.
“Maksim Yakubets and his Evil Corp gang has for years lived the archetypal Russian hacker playboy lifestyle seemingly untouchable to law enforcement but today’s announcement shows that we are still watching, digging and determined to disrupt them and bring them to justice,” said Will Lyne, Head of Cyber Intelligence at the NCA.
LockBit connections
Another of those sanctioned is Aleksandr Ryzhenkov, described by the NCA as the younger Yakubets’ right-hand man, and an affiliate of the notorious ransomware gang LockBit.
It’s the first time that a member of Evil Corp has been linked to another major gang and indicates that hackers are working across groups to carry out attacks.
As well as the sanctions, four arrests were made, including two in the UK.
In August, the NCA executed a number of search warrants in the south of England and arrested a 46-year-old male who is suspected of being linked to a LockBit affiliate.
A 50-year-old female was also arrested on suspicion of money laundering offences.
They too were interviewed and later released under investigation whilst the criminal investigation continues.
Both individuals were identified through the analysis and enrichment of data acquired during the course of Operation Cronos – the international police operation that brought down LockBit’s internal infrastructure.
“The action announced today has taken place in conjunction with extensive and complex investigations by the NCA into two of the most harmful cybercrime groups of all time,” said James Babbage, Director General for Threats at the NCA.
The NCA said Evil Corp’s links to the Russian state had been exposed.
“Today’s sanctions send a clear message to the Kremlin that we will not tolerate Russian cyber-attacks – whether from the state itself or from its cyber-criminal ecosystem,” said foreign secretary David Lammy.