All The Latest iOS 17 Photography Features

New iPhone season is upon us! Okay, more like the ‘new-iPhones-arrive-week’, but every year the event is inaugurated with a fresh update to iOS. This year, iOS 17 contains goodies that can make photography feel super fast and responsive. Naturally, we’re supporting them on day 1.

Let’s dig into how we’re supporting them in today’s Halide update, and other ways we spent our summer.

Zero Shutter Lag

When looking at the live preview in the camera on your phone, what you see often won’t be what you get when you snap a photo.

Let’s take a photo of the iPad stopwatch. The image on the left is what we saw in the viewfinder, and the image on the right is what was actually captured.

A screenshot of a viewfinder showing a stopwatch, side by side with a photo of the stopwatch, the times slightly different.
The numbers blend together because the stopwatch updates the display so quickly.

Why is there a difference of about 1/5 of a second? By the time an image arrives on your screen, it’s already been captured. When you tap the capture button, you’re asking the iPhone to capture the next image that hits your image sensor.

While the delay seems small, it can be enough to throw off a perfect shot. This is why, last year, we asked Apple to offer Zero Shutter Lag, and we’re super excited to see it launch!

A sequence of seven photos in quick succession, with the fourth photo highlighted.

With ZSL, your iPhone constantly captures photos from the moment you start a session. It doesn’t actually save these photos; it just holds them in memory for a few seconds. When an app asks the iPhone to take a photo, iOS actually looks back at the last few seconds of photos and grabs the one that was visible when you tapped that capture button.

There are a few situations where this won’t work: when capturing with manual exposure settings (shutter speed and ISO), or if you’re shooting with flash (shame on you). We imagine the former is a technical limitation that could be fixed someday, but we don’t see flash support coming to ZSL. It would have to leave the light on the whole time you’re shooting, which would be weird.
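
For the developers following along: opting in is refreshingly small. Here’s a rough sketch of what it looks like with the AVFoundation additions in iOS 17 (a simplified example, not our exact capture code):

```swift
import AVFoundation

// A simplified sketch of opting into Zero Shutter Lag on iOS 17.
// Assumes `photoOutput` is already attached to a configured AVCaptureSession.
@available(iOS 17.0, *)
func enableZeroShutterLag(on photoOutput: AVCapturePhotoOutput) {
    guard photoOutput.isZeroShutterLagSupported else { return }
    // With this on, the system keeps a short rolling buffer of frames and,
    // when a capture is requested, picks the frame that was on screen at
    // the moment the shutter was tapped. As noted above, the system skips
    // ZSL for manual-exposure and flash captures.
    photoOutput.isZeroShutterLagEnabled = true
}
```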

Prioritizing Responsiveness

After your iPhone picks a photo, its job isn’t done. Next it performs magic such as Deep Fusion, Smart HDR, and some other bespoke adjustments not mentioned in iPhone marketing. Let’s call all of this stuff “smart processing” for brevity.

Smart processing uses a lot of memory and GPU time, pushing your iPhone to its limit. If you keep smashing that capture button while it’s in the middle of processing photos, it will quickly get overwhelmed. Rather than run out of memory and crash, iOS stops allowing you to take photos until it’s had a chance to catch up.

This can make it hard to snap photos of fast-moving subjects, like the scrub jay you named Blorbo that visits you every morning.

We’ve had a solution in Halide since 2019: a switch in Capture Settings that lets you turn off the smartest processing. This cuts down the time and resources it takes to process a photo, making it easier for your iPhone to keep up with demand.

A screenshot of the "Enable Smartest Processing" option in Halide.
A lot of users dig this switch because they prefer a less-processed look.

This isn’t a perfect solution, because you need to know this switch exists in the first place, and then manually toggle it on and off. No more! With iOS 17, Halide detects when it’s getting overwhelmed and temporarily cuts back on processing.

In other words, if you just snap one photo every few seconds, your photos will get Deep Fusion and similar magic. If you quickly tap the capture button, the first couple of photos will probably get all that processing, but then the system will turn it off so you don’t miss shots. Say hello to Blorbo!

Photos of a blue California scrub jay.
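
Under the hood, this maps onto new switches Apple added to AVCapturePhotoOutput in iOS 17: responsive capture, which lets a new capture start while earlier photos are still processing, and fast capture prioritization, which lets the system dial back processing under load. A simplified sketch of opting in (illustrative, not our exact implementation):

```swift
import AVFoundation

// A simplified sketch of iOS 17's responsiveness switches, not Halide's actual code.
@available(iOS 17.0, *)
func prioritizeResponsiveness(on photoOutput: AVCapturePhotoOutput) {
    // Allow a new capture to begin while earlier photos are still processing.
    if photoOutput.isResponsiveCaptureSupported {
        photoOutput.isResponsiveCaptureEnabled = true
    }
    // Let the system temporarily skip Deep Fusion-style processing when
    // captures are requested faster than it can keep up.
    if photoOutput.isFastCapturePrioritizationSupported {
        photoOutput.isFastCapturePrioritizationEnabled = true
    }
}
```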

If you don’t like this new automatic behavior, and want to ensure every photo gets all the smartest processing, you can opt-out in Capture Settings by disabling our new ‘Prioritize Responsiveness’ toggle.

A screenshot of the "Prioritize responsiveness" setting in Halide.

HDR Photo Capture and Display

In the last few years, TVs, phones, and computer screens have gotten better at displaying really bright images. You’d notice this if you photographed a very bright object, such as a lightbulb, and displayed it on one of these “High Dynamic Range” (HDR) screens. The bright parts really pop.

Of course, writing about photography is like dancing about architecture, and we’d love to show you some example HDR photos. Unfortunately, browser support for HDR photos is not great. It’s not nearly as bad as shopping for an HDR television, where you deal with a train wreck of competing formats like HDR10, HDR10+, and Dolby Vision. (Yes, the same industry that brought you “Beta vs VHS” and “Blu-ray vs HD-DVD” continues to innovate.)

Apple wants to avoid a similar fate in still photography, so this year they put their weight behind an open standard called “ISO HDR.” Ok, technically it’s ISO/TS 22028-5, which doesn’t quite roll off the tongue. The important thing is that with iOS 17, Halide captures HDR photos. In fact, it has for years!

While the ISO HDR format is new, iPhones have been capturing this extra brightness information since 2020 and saving it inside your photos. You might not have noticed, because until iOS 17 there was no way for outside developers to actually show the HDR parts of those photos in their apps.

That said… HDR can also be incredibly annoying. If you’ve scrolled through Instagram since they introduced Reels, you’ve come across uncomfortably bright videos. That’s because those content creators have cranked up their brightness into the HDR range to grab your attention and stop your scrolling, kind of like how TV commercials crank their sound really loud. It’s just tacky.

Anyway, in Halide, we don’t want our last-photo thumbnail to throw off your perception while composing a shot, so HDR is turned down in that thumbnail view. HDR is only visible in the full-sized reviewer.
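
If you’re curious how that split works at the API level, iOS 17 lets each image view declare how much dynamic range it may use. Something along these lines (a simplified UIKit sketch, not our actual view code):

```swift
import UIKit

// A simplified sketch of the idea: keep the thumbnail in standard dynamic
// range and let the full-size reviewer render the photo's HDR headroom.
@available(iOS 17.0, *)
func configureDynamicRange(thumbnail: UIImageView, reviewer: UIImageView) {
    thumbnail.preferredImageDynamicRange = .standard   // tone it down while composing
    reviewer.preferredImageDynamicRange = .high         // full HDR when reviewing
}
```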

The technology behind HDR is thorny, and the artistic side is thornier. Used sparingly, HDR can look incredible. Used incorrectly, it’s an eyesore. We’ve been thinking of elegant ways to deal with it in Halide, and we’ll talk about that in a long post in the future.

Bug Fixes All Around

While OS updates give us new toys, they invariably break apps. For example, this is the expected animation when you tap the thumbnail to open our photo reviewer.

But in the early iOS 17 betas, the image jumped in from off screen, upside-down.

We won’t dig too deep into the problem, but to make a long story short, iOS 17 seemed to change how it loaded orientation information during some animations. We reached out to Apple, including code that reproduced the bug for them, and they were able to fix things before iOS 17 broke a bunch of photography apps.

That was definitely a bug, but OS updates can also break things by deliberately changing the way existing code behaves.

iOS 17 changes how widget backgrounds get laid out. Apple told developers how to update apps to handle the changes, but if you don’t follow their instructions and update your app, you can end up with widgets that get cropped weird.

A screenshot of the quick-launch Halide widget, with the edges cut off.
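
For what it’s worth, the fix Apple documents is small: widget views now declare their background through a dedicated modifier. Here’s a sketch with placeholder content (not our actual widget code):

```swift
import SwiftUI
import WidgetKit

// A sketch of the iOS 17 widget migration, with placeholder content rather
// than Halide's real quick-launch widget. Skipping this step is what leads
// to the oddly cropped layout shown above.
struct QuickLaunchWidgetView: View {
    var body: some View {
        Image(systemName: "camera")   // placeholder icon
            .font(.largeTitle)
            .containerBackground(for: .widget) {
                Color.black           // the background now goes through the new API
            }
    }
}
```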

A cropped widget looks embarrassing, but it’s not the end of the world. On the more serious side of things, we found some rules quietly changed around how we need to package up photos before sending them to the iOS photo library. Otherwise, they wouldn’t save. Yeah.

Whether it’s bugs or rules changing, each of these fixes usually takes only a couple of days. Rarely, we’ll deal with stuff that takes weeks to resolve, and we’ve even had an iOS issue that plagued us for over a year. This is all time we’d rather spend moving Halide forward, but it’s a normal reality of building software for a major platform, whether it’s an iPhone or a PlayStation.

And we have sins of our own. For example, the screen in Halide that lets you pick a custom icon would sometimes cut off text on smaller iPhones. A lot of companies have trouble making a case to go back and fix these small things, because that time could go toward delivering new features that drive sales. Well, our team feels money comes second to shipping things we’re proud of, so we spent time this summer chipping away at small bugs and annoyances in Halide, like that icon picker.

We think this puts us in good shape to ship major features this fall — and we have some very cool things in the works.

Coming Soon (and Sometimes Never)

Whenever Apple releases a new technology, we don’t just slap it onto Halide and call it a day. One thing we consider is how mature it is. Maybe rather than launch something in iOS 17, we’ll wait until iOS 17.1.

One such feature was the new “Deferred Processing” system. It allows iOS to write a half-finished photo to your photo library right away and then finish its smart processing later. This makes your iPhone even more responsive, or at least, it sounds like it does. We were never able to get it to work on our test devices. We won’t release stuff we can’t test, because a slightly slower camera is better than a crashing one, but once we get it working we’ll ship it in an update.
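
For the curious, the opt-in itself looks tiny; it’s the end-to-end path we couldn’t verify on our devices. A simplified sketch (illustrative only, not shipping Halide code):

```swift
import AVFoundation

// A simplified sketch of opting into iOS 17's deferred photo processing.
// Illustrative only; as noted above, we haven't shipped this yet.
@available(iOS 17.0, *)
func enableDeferredProcessing(on photoOutput: AVCapturePhotoOutput) {
    guard photoOutput.isAutoDeferredPhotoDeliverySupported else { return }
    // The system can hand back a lightweight "proxy" photo immediately and
    // finish the smart processing in the background later.
    photoOutput.isAutoDeferredPhotoDeliveryEnabled = true
}
```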

Another reason we’ll reject a feature is if it doesn’t make sense for Halide. There’s an old adage: “Every program attempts to expand until it can read mail.” It’s tempting to add every feature that comes out, because it’s fun and gets your app attention, but every feature comes at a cost.

Every new button makes the UI a bit more complicated, and every new line of code could break with a future OS update. We love it when Apple gives us new toys in OS updates, but we’re excited by updates that let us delete stuff.

But sometimes we’ll reject something for Halide that makes perfect sense as a little spinoff app…

Ahem. So: Halide 2.12 is available right now. We have our iPhone 15s arriving this Friday, so keep your eyes out for our big iPhone update and our deep dive into the iPhone cameras around the corner!

The Huawei Watch D2 is a surprise sequel to one of 2022’s weirdest watches


The Huawei Watch D was one of the weirdest wearables of 2022, and now the airbag-packing, blood-pressure-tracking marvel is back. 

Released this week as one of six new Huawei Watch models, the D2 builds on the ground-breaking blood pressure tracking of the first D model. In a world-first for any smartwatch, its ambulatory blood pressure monitoring (ABPM) system is now certified by China’s National Medical Products Administration and the EU’s Medical Device Regulation body. 

Field Tested in the Galapagos


Normally we share updates through long articles, but writing about photography is like dancing about math. What if we share it in a video? (Spoiler: we are not launching a video app.) If you dig the new format, let us know in the video comments or through Halide@mastodon.social

Cold war spy satellites and AI detect ancient underground aqueducts


Holes at the top of this image are vertical shafts to underground aqueducts called qanats (Image: Nazarij Buławka et al)

Most of the ancient underground aqueducts that enabled humans to settle in the world’s hottest and driest regions have been lost over time. Now, archaeologists are rediscovering them by using artificial intelligence to analyse spy satellite images taken during the cold war.

The oldest known underground aqueducts that are found across much of North Africa and the Middle East are called qanats and are up to 3000 years old. They were designed to carry water from highland or mountain…

How generative AI changes consumer lifestyles


Host Andrew McDougall sits down with technology experts Jason Thomson and Thomas Slide to discuss the future of generative AI. What can brands do to connect with consumers using the latest generative AI technology? How will this change the way consumers look for information, and what is the next “big” breakthrough in this space? Listen to find out more.

This $75 million blockbuster was reportedly shot on an iPhone


The highly anticipated horror flick 28 Years Later was shot entirely on the iPhone 15, Wired claimed in a report on Thursday, noting that with a budget of $75 million, it’s the biggest movie yet to use a smartphone for filming.

The main filming for the Danny Boyle movie finished up last month and the final product is expected to land in theaters in June 2025. Those working on set had reportedly been instructed to sign a non-disclosure agreement to ensure news didn’t leak about the use of the iPhone. It’s possible that Apple and the moviemakers had been planning a big reveal to highlight the powerful capabilities of the iPhone when it comes to capturing moving pictures, but Wired’s report may impact that plan.

It was first suggested that Boyle was using a smartphone for at least some of the shots in 28 Years Later after a paparazzi photo of the movie set, taken in July, revealed on close inspection a protective cage holding something that was most definitely not a regular movie camera but, quite possibly, a high-end smartphone.

Wired investigated further and received confirmation from several people linked to the movie that Boyle had indeed been using a number of iPhone 15 Pro Max handsets — connected to elaborate rigs — to film scenes for 28 Years Later.

Having a prominent moviemaker like Danny Boyle use an iPhone for a big-budget movie is a real boon for Apple, which sells a lot of its smartphones off the back of the handset’s strong reputation for producing excellent imagery.

Apple itself also showed off the phone’s ability to record footage for elaborate productions during its Scary Fast event last October, in which it introduced the new MacBook Pro and iMac with M3 chips. All of the presenters, locations, and drone footage in the online presentation were filmed using the iPhone 15 Pro Max. As many reports noted at the time (and as also applies to 28 Years Later), the phone was supported by a plethora of advanced moviemaking equipment such as lighting, dollies, and cranes, and a highly experienced post-production team crafted the footage into something compelling. The results are astonishing for such a tiny device, so we’re eager to see what Boyle has managed to do with it.




Google funds FireSat launch to detect and track wildfires


Google has backed FireSat, a constellation of satellites intended to detect, track, and perhaps even prevent wildfires from spreading. The first satellite in the FireSat program is expected to launch early next year.

Google is backing the FireSat satellite launch

Google Maps and Search services have been alerting users about nearby wildfire boundaries since 2020. The search giant has been mapping the wildfires in detail, ensuring users are aware of the potential danger. Google also sends notifications and instructions on how to stay safe.

Google has infused $13m into an initiative led by the Earth Fire Alliance that aims to “detect and track wildfires the size of a classroom within 20 minutes”. A blog post published this week details FireSat. Essentially, it is a new constellation of satellites to monitor, detect, and track early-stage wildfires.

In addition to financially backing FireSat, Google Research will also contribute to this project. The platform will use artificial intelligence (AI) to provide a better way to monitor and manage wildfires.

Google has indicated that the Google Research team will plug relevant data into Machine Learning (ML) technology. This would help develop AI-driven enhancements aimed at detecting wildfires when they are small.

How will FireSat help detect wildfires and save lives?

Wildfires are notoriously difficult to detect. Oftentimes, there are false alarms. Moreover, the current-generation satellite imagery used for wildfire detection is low-resolution and infrequently updated.

All these restrictions usually mean wildfires remain undetected until they become as large as football fields. Needless to say, such delays allow wildfires to rapidly expand, destroy habitats, and threaten nearby towns. Google and the FireSat constellation aim to bring down, or perhaps eliminate, the aforementioned limitations, and speed up detection.

The first FireSat satellite, which Google is helping launch, is expected to go up early next year. Fully deployed, the constellation should have 50 satellites in low-Earth orbit.

The FireSat satellites are equipped with infrared sensors that detect small fires. Some reports suggest the constellation could eventually detect a fire as small as 5 by 5 meters or about the size of a classroom.

FireSat should be able to provide accurate and actionable information about the location, size, and intensity of early-stage wildfires. This early detection, coupled with real-time updates, could mean agencies can douse wildfires before they pose any serious threat.
