Ben and I have an annual ritual. For the last half decade, around this time of year, we run to the store, hastily unbox the latest iPhone and get shooting. We do this because we’re passionate about finding out everything there is to know about the new camera — not just to make sure things work well with Halide, but also because no other camera has as many changes year over year.
A byproduct of this ritual? A pretty thorough iPhone review.
If you’ve read our reviews before, you know we do things differently. They’re not a quick take or a broad look at the iPhone. As a photographer, I like to review the iPhone 16 Pro as if it were purely a camera. So I set off once more on a trip, taking tons of photos and videos, to see how it held up.
For the first “Desert Titanium” iPhone, I headed to the desert. Let’s dive in and see what’s new.
What’s New
Design
As a designer from an era when windows sported brushed-metal surfaces, it comes as no surprise that I love the finish of this year’s model. Where the titanium on the iPhone 15 Pro was brushed on the side rails, this year’s model features a more radiant, brushless finish that comes from a different process.
It is particularly nice on the Desert Titanium, which could also be described more like “Sequoia Forest Bronze”:
The front features the now-standard Dynamic Island and slimmer bezels. The rear packs the familiar Pro camera array introduced way back in iPhone 11 Pro.
Its sibling, the iPhone 16, features a colored glass process unique to Apple. This year’s vibrant colors feel like a reaction to last year’s muted tones. I haven’t seen this process copied anywhere else, and it’s beginning to earn its rank as the signature style of the iPhone. The ultramarine (read: “blue”) iPhone 16 is gorgeous, and needs to be seen in real life. I went with the color Apple calls “teal,” but I would describe it more as “vivid Agave.”
The sensor array on the 16 has returned to the stacked design of the iPhone X. The motivation behind the change may be technical (better support for Spatial Video), but from an aesthetic perspective, I also simply prefer the vertical arrangement.
While beautiful to look at, that’s also about all I will say about iPhone 16. While admittedly a bit less colorful, the iPhone Pro line has always been Apple’s camera flagship, so that’s the one we’ll dive into.
Inside iPhone 16 Pro
A New 48 Megapixel Ultra Wide
The most upgraded camera is the ultra-wide camera, now 48 megapixels, a 4x resolution improvement from last year. The ultra-wide shows impressive sharpness, even at this higher resolution.
At 13mm, the ultra-wide remains an apt name. It’s so wide that you have to be careful to stay out of frame. However, it does allow for some incredible perspectives:
At the same time, temper your expectations. When the iPhone 14 Pro introduced a 48 MP sensor for its main camera, Apple almost doubled the physical size of the sensor compared to the iPhone 13 Pro. This year, the ultra-wide sensor is the same physical size, but with more photosites crammed in. In ideal lighting, you can tell the difference. In low light, the expected noise reduction will produce the same smudgy images you’d also get from the 15 Pro.
One very compelling bonus of the 48 MP upgrade is that you get more than just high-resolution shots. It does wonders for macro photography.
Since the iPhone 13 Pro, the ultra-wide camera has had the shortest focusing distance of any iPhone camera. This lets you get ridiculously close to subjects.
The problem was that… it was an ultra-wide lens. The shot above is a tight crop of a very wide frame. If you wanted a close-up shot like that, you ended up with a lot of extra stuff in your frame that you’d ultimately crop out.
In the past, that meant a center crop of your 12 MP ultra-wide image would get cropped down to a 3 MP image. In Halide, we worked around this with the help of machine learning to intelligently upscale the image.
With a 48 MP image, however, a center crop delivers a true 12 MP image. It makes for macro shots that are on another level.
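The crop math above can be sketched in a few lines — a back-of-the-envelope illustration, not anything from Apple’s or Halide’s actual pipelines:

```python
# A center crop that doubles the effective focal length keeps half the
# width and half the height of the frame, i.e. one quarter of the pixels.

def center_crop_megapixels(sensor_mp: float, crop_factor: float) -> float:
    """Megapixels remaining after cropping to 1/crop_factor of each dimension."""
    return sensor_mp / (crop_factor ** 2)

# Old 12 MP ultra-wide: a 2x crop leaves only 3 MP.
print(center_crop_megapixels(12, 2))   # 3.0
# New 48 MP ultra-wide: the same crop yields a true 12 MP image.
print(center_crop_megapixels(48, 2))   # 12.0
```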
Fusion Energy
Here’s the meat of it: the camera most people shoot almost all their shots on, the iPhone 16 Pro’s 48 megapixel main sensor.
iPhone 16 Pro packs a new 24mm main camera, which Apple now dubs the Fusion camera. It is a new sensor, the ‘second generation’ of the 48 MP shooter introduced in the iPhone 14 Pro. The iPhone 16 is also listed as having a ‘Fusion’ camera — but they are, in fact, very different cameras, with the iPhone 16 Pro getting a much larger and higher-quality sensor.
‘Fusion’ refers to the myriad ways Apple implements computational magic to produce high-quality shots. If you were to zoom in on the microscopic structure of the sensor, you would see that every pixel is made up of four ‘photosites’ — tiny sensor areas that collect green, red, or blue light.
When the iPhone 14 Pro quadrupled its resolution, Apple opted for a ‘Quad Bayer’ arrangement, dividing each photosite into four, rather than a denser ‘regular’ arrangement. There’s a huge benefit to this arrangement: the sensor can combine those adjacent sites to act like single, larger pixels, so you can shoot higher-quality 12 MP shots. This binning was already employed in video and Night mode.
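For the curious, here is a toy sketch of that 2×2 binning idea (purely illustrative; real binning happens on same-color photosites in silicon, not in Python):

```python
# Minimal sketch (not Apple's pipeline): average each 2x2 group of adjacent
# photosites into one larger "virtual" pixel, trading resolution for lower
# noise. A 4x4 grid bins down to 2x2.

def bin_2x2(frame: list[list[int]]) -> list[list[float]]:
    """Average each 2x2 block of an even-sized grid into a single value."""
    h, w = len(frame), len(frame[0])
    return [
        [
            (frame[y][x] + frame[y][x + 1] +
             frame[y + 1][x] + frame[y + 1][x + 1]) / 4
            for x in range(0, w, 2)
        ]
        for y in range(0, h, 2)
    ]

raw = [
    [10, 12, 20, 22],
    [14, 16, 24, 26],
    [30, 32, 40, 42],
    [34, 36, 44, 46],
]
print(bin_2x2(raw))  # [[13.0, 23.0], [33.0, 43.0]]
```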
The ‘Fusion’ workflow essentially combines the 48 megapixels’ worth of data with the 12 megapixel binned mode to produce great 24 megapixel shots. I think this is perfect. I firmly believe most people do not benefit from giant 48 megapixel photos for everyday snaps, and it seems Apple agrees. It is a very Apple decision to capture more megapixels but intelligently combine them into a better outcome for the average user.
Is processing very different from last year? No, not really. It was great, and it’s still great. While there’s slightly more processing happening, I found it difficult to spot a difference between iPhone 15 Pro and iPhone 16 Pro captures. The sensor is the same physical size as last year’s iPhone 15 Pro / Pro Max, and still has delightful amounts of depth of field as a result.
The larger the sensor, the nicer this is, and it really renders beautifully — especially in its secondary telephoto lens mode.
Telephoto: 5× and Fusion at Work
The telephoto camera is a defining characteristic of the Pro line of iPhones. Last year only the 15 Pro Max featured the 5× ‘tetraprism’ lens. This year it’s standard across the Pro line, and I’m happy I have the option of going smaller this year.
That said, I’m a huge fan of the outgoing 3× lens. It was dang near perfect for me. Now, every focal length between 1× and 5× is bridged by the 48 MP main camera, and that’s a bit controversial. Because of its quad-Bayer configuration, there’s been a question as to whether the 48 megapixels on the main sensor are really 48 MP, since it needs to do a bit more guesswork to recover details.
Well, comparing a 12 MP crop from the sensor to a “real” 12 MP image shot on an iPhone 12 Pro, I preferred the ‘virtual’ output of the 16 Pro.
I’ll admit that years ago I was a skeptic. I like my lenses optical and tangible, and it feels wrong to crop in. Well, this past year, I’ve been sporting the iPhone 15 Pro Max with its 5× zoom, so I found myself using the imaginary 2× lens much more to bridge the gap between focal lengths.
Thanks to the wider aperture of the Fusion camera, the virtual 2× produces better results than the physical 2× of the past. I really like it. I no longer want Apple to bring back the physical 2×. Give me an even larger, better Fusion camera.
As for the 5×, after a year of real-world use on the 15 Pro, I don’t want to lose that reach. It’s like having a set of binoculars, and amazing for wildlife, landscapes, or just inspecting things far away.
On a creative level, the 5× can be a tricky focal length to master. While the ultra-wide camera captures everything, giving you latitude to reframe shots in editing, the 5× forces you to frame your shot right there. Photographers sometimes say, “zoom with your feet,” which means taking a few steps back from your subject to use these longer lenses. This requires a bit more work than just cropping in post, but the results are worth it.
At night, the telephoto camera suffers: it has the only remaining 12 MP sensor, and its narrower field of view lets in less light. I’d appreciate a larger or 48 MP sensor in the future, not for the added resolution, but to reduce noise through binning. What this camera needs more than anything is more light; that would be transformative, and I hope Apple takes things in this direction in the future.
For portraits, which usually happen in a more controlled lighting environment, the 5× telephoto truly shines. It’s a great lens, and we’re all better for having it on all the iPhones Pro.
Night Photography
With the sun setting, I noticed the latest display making a big difference. With a screen that dims down to one nit, shooting out in the dark was really pleasant.
Within Night mode, HDR now allows a larger dynamic range to be captured. However, it was still a frustrating dance at times to get exactly what I wanted out of the exposure, with some exposures overdone and inconsistent exposure times. In fact, I enjoyed shooting on the iPhone 16 Pro outside of Night mode, as it gave me darker, contrasty shots.
Night Mode remains incredibly impressive, and its intelligence produces solid results without thinking, but at times it can still be frustrating to get exactly what I want. I wish there were an API for apps like Halide to dial in manual settings.
(If anyone at Apple reads this, we filed request FB11689438.)
Under the Hood
If you treat this as a review of the iPhone as a camera, there’s actually more to talk about than the cameras themselves. This is a unique year, because the iPhone 16 Pro packs improvements that go beyond the cameras, touching every part of the photography and videography workflow. In my testing, USB transfer speeds were faster than on my iPhone 15 Pro. On the wireless front, Wi-Fi 7 offers up to 46 Gbps, in theory.
The new modem in here has given me easily my best cellular download speeds — in more places. I pulled down a 450 MB offline map of the Mojave Desert in Joshua Tree in less than a minute.
On the wireless power front, I noticed much faster wireless charge speeds with a new MagSafe cable, and also when plugged in. All those savings add up, from minutes to hours to days saved on the job.
Thermals are a make-or-break aspect of an iPhone, especially now that it shoots computationally intensive video like Apple Log with ProRes. I tested by shooting 4K at 120 fps for a bit, and found it considerably less hot than the 15 Pro under similar demand. In fact, I never got it to overheat!
Average users will appreciate these quality of life improvements, and Pros will appreciate how it lets them push these devices further than ever before.
Digging deeper into the camera subsystems, the new “Apple Camera Interface” internals allow for faster sensor readout times. This improves features like QuickTake (not that QuickTake), the feature that lets you quickly take videos by holding the camera button.
Previously, it wasn’t possible to quickly reconfigure the camera system for high-quality video, so QuickTake footage was on par with your viewfinder’s video feed, which isn’t as high quality as recording from the camera’s dedicated video mode. On iPhone 16 Pro, QuickTake gets far better processing — Dolby Vision HDR, 4K resolution, the works. It’s noticeable.
Burst ProRAW 48 MP capture performance is also much faster. When absolutely mashing the shutter, the 48 MP ProRAW frame rate clocked in at 2× the iPhone 15 Pro’s speed. This is good news, but it doesn’t solve the tradeoff that comes with ProRAW files: the lag. Apple talked about ‘Zero Shutter Lag’ in the keynote, and that’s exactly what this is about.
When an iPhone captures a ProRAW photo, there’s a two-step process. First, the iPhone captures a burst of photos; then it merges those photos together with the help of sophisticated computational photography algorithms. The iPhone 16 is faster at the first step, grabbing source photos. It still takes several seconds to process the resulting shot, but if you tap the shutter button, the camera will now take a photo practically instantaneously, where there was a very real delay before.
The improvement is huge in practice. In total, the iPhone 16 Pro beat the iPhone 15 Pro by anywhere from 400 to 900 milliseconds. Hundreds of milliseconds matter in the moment, and could mean the difference between getting the shot or missing it completely. It’s a massive improvement and a huge achievement, technologically.
Software
While the hardware was upgraded, the iPhone 16 lineup also comes with iOS 18 — a huge update that touches on every single part of the photography experience. We won’t touch on Apple Intelligence or Clean Up, which won’t be ready until next month, but there’s still plenty to talk about in iOS 18.0.
Capture overhaul
You can finally open camera apps from the Lock Screen, which is the single biggest feature request we’ve had from Halide users. In the past, we had to make do offering widgets you could load on your Lock Screen, but real Lock Screen support goes way beyond that, letting you capture photos without unlocking your device.
Aside from several changes in the camera app like being able to pause recordings and keep music playing while you record, there’s an elephant in the room… Photos.
This year, Photos received its biggest overhaul since the first iPhone, and the results are subjective. For me, it’s been challenging to adapt, but I do believe in the mission. Photos’ fundamental interface has not changed in 16 years, and I do think it has to evolve. For most people, it might really work better. Its added customizability is a step forward, and fits into the theme of giving you greater control.
Shooting in Style(s)
Which brings us to Photographic Styles, which have also been overhauled. When they were introduced with the iPhone 13 Pro, they were simple filters. You could pick from a warm look, a more contrasty look, maybe a cooler look, but the results were all pre-canned.
Now consider this salt flat. You might want to bring out the coolness of the sunrise and really make that a vivid blue in contrast to the sky above:
But if I applied a simple filter, it would apply the look to both skin and sky equally. Blue skin doesn’t work outside of James Cameron movies. The new Photographic Styles can target undertones (the way the filter affects skin tones in your shots), making things feel more natural.
These filters are named after moods, such as “Dramatic” or “Quiet.” You can fine tune them with a two dimensional pad. There’s also a slider to adjust the color palette.
Maybe it’s just me, but I found the UI a bit bewildering at first, so I drew this legend to illustrate.
Your adjustments get wiped after a while, though, if not configured to persist.
In the past, I avoided Photographic Styles because they were destructive; if I went with a black-and-white style, I’d lose all the color information. The coolest change to Photographic Styles this year is that they’re “perceptually non-destructive”: you should be able to reverse the effects of the style later.
It passed my test; it worked great for me. This even survives an AirDrop to a friend: they can undo or edit the Style as long as the metadata remains intact.
The added control in Photos also allows you to tune down the “HDR look,” one of the more polarizing aesthetics of iPhone photos. However, Photos doesn’t reduce sharpening, noise reduction, and subject-based adjustments. They still give your photos a “Shot on iPhone” look, whether or not that’s your cup of tea. For deliberate photography, I’m sticking to RAW. For quick snapshots I’ll be shooting in a custom Rose Gold Style.
Video and Audio
iPhone 16 Pro brings 4K 120fps video, including ProRes Log (!). It’s a huge upgrade, and the improved controls in Photos to adjust playback speed are a welcome change too. 4K 120fps video can be recorded in HDR / SDR and with full processing (Slo-mo video uses a visibly lower quality process), whereas ProRes can only be captured using an external SSD. I love the ’new Apple’ that is shipping features like this clearly aimed at the professionals; I don’t see many people shooting iPhones with an SSD attached, but for those that do, this is a fantastic improvement on what has already proven to be the best phone to capture video on.
With Log and ACES, shots from iPhone simply fit into a nice workflow and can slip in undetected as deep depth of field B-roll no problem:
I am not a tremendously heavy user of iPhone mics, but both iPhones (iPhone 16 and iPhone 16 Pro) get an improved 4-mic array that supports a new Audio Mix feature. It lets you filter out background noise, or re-render your audio recording as if it were captured in a studio or mastered more cinematically.
iPhone 16 Pro can capture Spatial Audio along with your Spatial Video, and does this computational audio processing a bit better than its amateur sibling. It’s very impressive, and can be a huge benefit if you find yourself without a mic — which for most people is probably most situations!
A minor improvement I would suggest to make this useful to us is to allow us to run 4-mic audio capture sessions during recordings that use an external microphone. The peace of mind of having a usable backup recording with Audio Mix would be tremendous.
Camera Control
Okay, here’s the really big deal. Something entirely new on your iPhone.
Over the life of the iPhone, its buttons have either remained the same, evolved, or vanished. Here’s the original iPhone: home, power, volume down, volume up, and a ringer switch.
The first thing to change was the home button. It became a fingerprint sensor and no longer actually clicked down. With iPhone X, it was finally put out to pasture: a full-screen phone didn’t need a home button. A system of gestures worked much better, and Face ID removed the need to scan your finger to unlock.
After that, things stayed the same up until last year, when the ringer switch became an Action button. That’s evolution on par with the home button. So far, every change has been evolution or reduction.
The addition of a new control, then, is a huge deal. I feel like everyone is being fairly casual about this, when Apple is extraordinarily focused on reducing things down to the bare essentials. This showing up on the outside of your iPhone means Apple views it as essential.
How is it in actual use? To me, the most important part of controls on a camera is that they become an extension of you. You can get them in your fingers and use them blindly. You know what a press or swipe sets. Camera Control delivers on this on some fronts, and not on others.
At the core of the Camera Control experience, there are two fundamental interactions: one is to open your camera (truly, yours; it can be any camera app), and the other is to interact with it.
The Button
The first was something I had truly underrated when I saw the announcement. What caught eyes and headlines about the Control is the way you can half press and swipe on it; after all, we’ve had camera buttons on phones before. When I got my first iPhone, my then-girlfriend was deep into fancy Nokias — her Nokia N95 had a camera button (and a lot of other ones, too). Nothing new here. Or is there?
I found myself grabbing my ‘old’ iPhone 15 Pro after just days of using the 16 Pro and pointlessly mashing the side of the phone instinctively when I went to take a shot. The Camera Control (don’t call it a button!) is flush with your iPhone; it does not detract from the regular, familiar iPhone hand feel. But it will change the way you interact with it all the same.
Take a beat right now if you are reading this on your phone. Imagine a sudden flash of light causes a gorgeous rainbow to appear in an instant outside your window. Close your eyes. What routine do you have to quickly open your camera?
I had one. We all have some kind of routine, and after years of iPhone use, it’s pretty hard wired. It might take you some time to override this on iPhone 16 Pro, but once you do, it’s much, much faster. You just press that button. Locked? Press the button. Reading in the News app? Press the button.
When I reflexively went to do it on my older iPhone, the phone felt broken — as if you’d pressed the side button and the screen didn’t light up. I think we’ll see this camera-opening button on many if not all Android phones very soon. It becomes routine so fast, and once it gets into your muscle memory it’s extremely frustrating when it’s not there. You miss a shot. Because of that stupid button-less phone.
When Apple adds something like this, it tends to be a bit more thought out than a new button that takes a photo, not a thing tacked on as a quick shiny feature to entice buyers. Thoughtful details abound in the camera-triggering press: in your pocket, iPhone won’t open the camera if you press it by accident, as was so wonderfully tested in Faruk’s review:
John Gruber wrote an excellent part of his review going into more detail on what makes it behave the way it does. Myself, I found all this ‘smart’ behavior solid — I haven’t ended up with any errant snaps.
Let’s talk about the rest of this control, though — what lies beneath the sapphire surface.
The Adjustments
This button can be half-pressed, which is to say, not pressed fully. A light press on the button while Camera is open brings up an Adjustment menu. Swiping on the control itself lets you dial in the selected setting. The settings are (sequentially): Exposure, Depth, Zoom, Cameras, Style, and Tone.
By default, it behaves as a zoom dial. The dial ’snaps’ to the native focal lengths of each lens fairly aggressively, which is a good thing because a swipe on the Control has momentum if you swipe and let go. For precise adjustment, keeping your finger on the Control will allow pretty fine-grained dialing-in with minimal finger movements. I am impressed with its precision.
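That snapping behavior can be roughed out in a few lines. This is my guess at the logic, not Apple’s code; the focal lengths are the iPhone 16 Pro’s real ones, but the snap threshold is an assumption:

```python
# Toy sketch of "snap to native focal lengths": when the dial settles near
# a native lens, pull the value to it; otherwise keep the free-form zoom.

NATIVE_ZOOMS = [0.5, 1.0, 2.0, 5.0]  # iPhone 16 Pro's lens positions

def snap_zoom(value: float, strength: float = 0.15) -> float:
    """Snap `value` to the nearest native zoom if within `strength` (relative)."""
    nearest = min(NATIVE_ZOOMS, key=lambda z: abs(z - value))
    if abs(value - nearest) / nearest <= strength:
        return nearest
    return round(value, 2)

print(snap_zoom(1.05))  # 1.0  (within 15% of 1x, so it snaps)
print(snap_zoom(3.2))   # 3.2  (between 2x and 5x, no snap)
```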
Regardless, if you are like me and consider zooming no more than cropping your shot, Apple has a ‘Cameras’ Adjustment that is by far my favorite way to use it. The Adjustment puts all four ‘lenses’ in a row, from 0.5× at the far end through 1× and 2× to the 5× telephoto at the other. The result is a quick, pleasing way to cycle through your framing options with a satisfying level of precision, and an amazing interaction iPhone cameras have never had.
The Cameras adjustment can be used… blindly. This may sound bizarre on the face of it — why would you want to operate a smartphone camera without seeing it? Well, recall that reflexive habit I described of opening your camera without looking at it by pressing the Control? The same applies here. Not only can I open it, I can swipe, feel the haptic click of landing on the ultra-wide or telephoto, and raise my camera to take my shot.
You’ll see photographers look at a shot and have a hand on their lens, snapping to a setting and then raising it to their eyes to shoot. It’s essential. With this, I can hold my phone in an awkward position with little visibility and shoot through one of the lenses without seeing the screen. I ended up using this a lot. It’s really hard to put into words, but it becomes something in your fingers; a really tactile camera experience that is more of an extension of you. It’s so nice. It’s just like using a camera lens.
That brings me to the not-so-good part: the Cameras adjustment experience is so nice, integrated, and good that it makes the rest of the adjustments feel less great.
Apple has successfully kept a lot of its Camera app paradigms rooted in traditional concepts of photography. Portrait mode features f-stops for its depth effect; lenses are described in full-frame equivalent focal lengths. This stuff matters: it exposes users, even casual ones, to the fundamentals of photography. Through the most popular camera, everyone continues to be educated about these things and can learn what they mean in a very hands-on manner.
Camera Control offers a lot of options, and in doing so, it somewhat breaks from your traditional expectation of what a ‘dial’ on a camera does. Dials do one thing. This does many. In doing so, it departs from a camera convention whose simplicity is appreciated by amateurs and professionals alike.
In my ideal setup, Camera Control simply has one, potentially mode-dependent, adjustment. Ideally, it has a logical and predictable start and end (‘opening up’ an aperture can be done without looking at the lens; a similar thing goes for the zoom range). Simplicity can be its flexibility: ideally, it is so predictable and applicable to the camera task at hand that it works even if you cannot see an on-screen interface. Having a “double light press” and navigating a sort of mini meta-menu system just ends up feeling clunky and odd.
It ends up packing a lot of on-screen interface, and that can also get in the way: if I launch into the Camera, swipe quickly to get to the ultra-wide, then hold my finger on it to be ready to shoot, the Camera Control interface keeps hovering over my frame.
In all, I think the relative plethora of Adjustments makes it feel clumsier and less sleek and snappy than it could be. Given its soft haptic feedback and many options, it can seem a bit overwhelming even to more photographically savvy users. Those more conspiratorially minded might assume Apple added more features here to compensate for the iPhone having fewer at launch; I myself think it’s just a commendable first attempt to do something new.
Focus
For us as developers, it is an interesting new thing. It seems, for now, uniquely catered to us: not only can you set the Camera Control to open any (camera) app like Halide, you can also create your own Adjustments. The API allows us to pick a system icon and use a basic picker — no custom interface — to tie into features of our own app.
It was tempting to rush into this and have something on day one, but we really wanted to live with the Camera Control and devices for a while to see how it would work our way. We like to do things a certain, opinionated, focused way. And that’s exactly what we did: Camera Control in Halide offers two Adjustments: EV, to adjust exposure, and Focus, at the end of the scale.
Much like Cameras, a manual focus adjustment allows you to quickly focus on something as close as possible without looking at the phone. The adjustment for exposure lives in the middle, with a bit more latitude than the system offers (we go up to ±6 EV, vs. ±2), and the top one is like the middle of your gearbox: “Lock.” Leaving a simple locked adjustment at the top level means Halide does not suffer from accidental triggers in case you have sensitive adjustments.
On The Nature Of Shutters
There’s an invisible aspect to Camera Control I want to touch on before we move on. I noticed it is also deeply integrated into a low-level improvement I mentioned before — and to understand that, you have to look at how cameras take photos.
Try pressing a shutter button on a regular camera. Film or digital — it will make a quick click. The moment the button reaches the bottom of its throw is when a camera takes a photo.
iPhone 16s do not do that. In fact, they cannot do that. What do I mean by ‘that’? Taking a photo as soon as you press down. They take a photo when you release the button. This is something we worked hard to avoid in Halide: when you press the shutter, you want the smallest possible delay; a shutter should fire when it is pressed, not upon release.
But the Camera Control can be long-pressed to take a video. How, then, do you still capture what you saw on your screen? Therein lies the smart part of this camera: using the aforementioned Zero Shutter Lag, it can offset the ‘slowness’ of the button by grabbing a photo from its buffer. It’s remarkable, and works great for getting a steady shot despite your press, and despite any delay from raising your finger.
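Here is a hedged sketch of how such a zero-shutter-lag buffer might work in principle. This is not Apple’s implementation; the class, frame rate, and buffer depth are all illustrative:

```python
# Sketch of a zero-shutter-lag buffer: the camera continuously keeps the
# last few preview frames in a ring buffer, so a "capture" can reach back
# to the frame shown when the press began, hiding button and release delay.

from collections import deque

class ZeroShutterLagBuffer:
    def __init__(self, depth: int = 4):
        # deque with maxlen silently drops the oldest frame when full
        self.frames = deque(maxlen=depth)

    def on_new_frame(self, frame, timestamp_ms: int):
        self.frames.append((timestamp_ms, frame))

    def capture(self, press_timestamp_ms: int):
        """Return the buffered frame closest to the moment of the press."""
        if not self.frames:
            return None
        return min(self.frames, key=lambda f: abs(f[0] - press_timestamp_ms))[1]

buf = ZeroShutterLagBuffer(depth=3)
for t in (0, 33, 66, 99):          # ~30 fps preview frames
    buf.on_new_frame(f"frame@{t}", t)
print(buf.capture(press_timestamp_ms=70))  # frame@66
```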
The Long Game
I am obviously excited about what the Camera Control brings to the iPhone. It’s a huge change, but it’s easy to miss the long view here.
There’s a reason this isn’t on just the Pro phones, like a telephoto. Apple knows something about cameras: they will mean something very different in the years, and probably decades, to come.
As our devices become our intelligent companions, cameras are their most important sensors — their eyes to the world. Accessing the toggle that lets them see and interact with the world is exactly what this control is about. While I feel tremendously catered to, I do think the long view here isn’t to use this as an aperture ring or a focus dial; it’s a button and aperture for the intelligence inside your device.
Processing
And that brings us to the intelligence that does live in this device and controls how every image comes out: Apple’s intelligent image processing.
Image processing has been a hot topic in these reviews for a while now, and this generation is no different. It’s something a lot of iPhone 16 reviews have already talked about in varying ways.
Here’s the thing that won’t change, review after review: an iPhone is just better at being a computer than a camera. That’s the reality of it. If you have a large camera with a big lens and a big sensor, it can gather a lot more light. That’s just physics. If you have a small camera and a small sensor, you’re going to have to make up for it somehow. The way the iPhone makes up for it is by being a better computer than a camera. All this computational magic that it does, merging a dozen frames into one, gives it great dynamic range. It lets it take photos at night. It does magic — stuff a small camera shouldn’t be able to pull off.
It’s honestly invisible and fantastic when it works. But when it doesn’t, and it does something unexpected, it’s not great. Is that different this year?
In brief: if you were a fan of the iPhone 15 Pro’s processing, you will enjoy what iPhone 16 Pro is offering up this year. And if you didn’t, there is now a genuinely useful and mostly-lossless way to get shots looking very different than the past years’ iPhones without editing them.
I think there are people at Apple who probably want the iPhone camera to have a more opinionated ‘look’ — but at this point, a billion people use it. It’s an eternal balance between being a tool for creatives and being the most popular tool in human hands for capturing the world as it exists. Not an easy task.
That being said: I think Apple should put almost all of its effort into achieving the seemingly impossible: a noise reduction method that looks more natural than AI ‘making up’ details, or watercolor smudging. If anyone can make grain a symbol of photographic craft and authenticity, even out of a digital camera, it’s Apple. I still get shots from iPhone where it goes to tremendous lengths to prevent me from having a noisy shot. I can see that extracting detail from the noise is difficult, but the resulting image just looks odd.
If there is one theme to iPhone’s approach to photography this year, it’s more control. That applies to Camera Control and Photographic Styles alike, but the output remains rather processed whether you like it or not. My advice?
Start to accept that highly processed images are here to stay.
As technology marches on, we are using cameras that help us achieve greater results than the physics would otherwise support — but in doing so, some level of creative control is lost. And while we have tools, like our Process Zero, to achieve what I would call ‘old-fashioned photography,’ we are not sure that approach will even survive far into the future.
As we strive for ever-thinner devices, folding phones and the tech we see in science fiction, processing is the only thing that enables cameras to work within the increasing power and size constraints they have to fit into.
Even on your new iPhone, camera quality isn’t quantified only by the sharpness of a lens or the rendering of a single image anymore. The definitions of color and sharpness have given way to photography reborn as data science. Your camera gathers signal — and in that signal is noise. The more signal it can acquire, the better. It can handle the aberrations, and it can handle the noise with extra processing — as long as it can maximize its light input. In native RAW captures, we see more color fringing than years ago; it’s just very well processed out of your shot. Lenses get ‘worse’ — but the photos get better.
That’s why I am here to tell you not to be optimistic about our cellphone cameras moving toward less processing. Cameras are being optimized for a future where photography relies increasingly on magic — and today’s processing will seem quaint. Things in a decade will be very different from today.
iPhone SE (Spatial Edition)
I’ve talked a lot about photography and video changing, but if you’ll humor me for just one more moment, I’ll talk about one change that excites me. Apple’s push into Spatial photo and video might not be for everyone, but its existence helps solve a chicken-and-egg problem in an emerging medium, one that has moved more of the people close to me to tears than any I can recall.
Spatial media — that is, photos and videos shot in 3D for you to relive on a device like Apple Vision Pro — is still nascent.
There are various tools for capturing immersive and spatial video and audio, but if this is the first iPhone built from the ground up for AI, it’s equally fair to say it’s the first one built from the ground up for Spatial Capture.
That excites me, not because I am an avid lover or consumer of it, but because it’s a genuinely new form of media art that does not involve boiling a lake to generate an image of an astronaut riding a cat. I love that Apple’s working hard to make tools, regardless of demand. The only way we can experience amazing art is if we invent the tools to make it, first.
Verdict: A Camera That Adds Something
iPhone 16 Pro, like iPhone 15 Pro and 14 Pro before it, is what I would call a ‘seismic’ camera release for Pros: the kind with changes so significant that you would not consider it an incremental move, but one that makes it practically impossible to go back.
iPhone 14 Pro brought us a large, gorgeous 48MP main camera. iPhone 15 Pro brought ProRes Log. And now, iPhone 16 Pro brings Zero Shutter Lag and Camera Control.
If you want a quick verdict: the iPhone 16 Pro is a tremendous camera because between Camera Control, Zero Shutter Lag and its advanced Photographic Styles, it will capture more moments than any iPhone before it, by a huge margin — and that in itself makes me recommend it over any previous one.
That being said, there’s a larger feeling I am left with after reviewing this device in my hands.
As I feel myself getting older, I hold on more and more to the idea of what I think a ‘camera’ or ‘photography’ is. The same happened with cellphones. People used to ridicule the idea that your telephone had a camera on it. No doubt there were purists who said, “well, in my day, this was a thing you took phone calls on. Not a computer in your pocket.”
Here I am: in my day, a camera was a thing you took photos on. Not a computer brain’s eyes to the world. Perhaps I feel this is such a big change because this is possibly close to the last of its kind, or a link in the evolution: an iPhone that has long since redefined what a phone is, but is about to redefine what a camera is, and what photography means.
Recall the introduction of the iPhone as a phone, an internet communicator, and an iPod. Notably lacking? The camera.
This iPhone is a camera. Maybe the first, if you were to define a camera as a device that has dedicated control for it.
It was in a place like this where one of Steve Jobs’ greatest inspirations once stood and imagined a revolution in photography that shocked the world. He imagined something simple: Instead of having to hire a photographer with a camera, who would bring film to a lab or go into a darkroom to present the shot days later, he imagined a small, elegant metal rectangle that fit into your pocket.
You could simply take it out, slide your finger on its surface to adjust your shot, and take the photo. The real magic? You’d take it and see it; no need to develop any film. Instant gratification.
That man was Edwin Land. He envisioned something most considered impossible: the Polaroid SX-70. It changed photography forever. It seems futuristic today. And guess what? The only controls on that camera were a button… and one slider, right here at the top.
Land didn’t create this because he was obsessed with technology. He wanted to strip away the complications of photography and make it accessible. To focus on the craft and the art, and less on know-how or technique. To truly bring it to its essence: empowering anyone to capture a moment. Surely, some lamented the loss of craft. The loss of essential parts of photography.
Perhaps the camera phone was the next step that truly made photography even more accessible and instant. But many feel like something was lost. It’s telling, then, that where Land removed so many parts of the camera, Apple is adding one.
Apple adding a new control – a button, a dial – to iPhone isn’t a move it makes casually. It’s an admission of a fundamental change in iPhone’s nature that happened over time. An admission that iPhones are far less phones today, and far more cameras.
But as a photographer, remember that ‘camera’ might really mean something entirely different than what we are used to — phones once made phone calls. Today, cameras take photos. In the future? Perhaps this is much more a lens to see and process the world with. A camera, as it is defined in the 21st century.
If I’m reviewing this the way it is, then, I’m really enjoying what I have in my hands. A device on the edge of the sands of time — rooted in the cameras I love, with just enough of the future of photography packed in here for me to manage.