Today, we are launching something unlike any tech product in 2024: a product that uses zero AI and zero computational photography to produce natural, film-like photos. We call it Process Zero. It lives in Halide, and it turns your iPhone into a classic camera.
Process Zero is a new mode in Halide that skips the standard iPhone image-processing system. It produces photos with more detail and gives the photographer greater control over lighting and exposure. This is not a photo filter: it genuinely develops photos at the raw, sensor-data level.
Just like film, Process Zero photos come with (digital) negatives, affording incredible control to change exposure after the fact. Much like film, it has grain. It works best in daytime or mixed lighting, rather than nighttime shots. Thankfully, unlike film, you don’t need any chemicals to develop these negatives. We give you one dial.
Best of all, Process Zero is available on every iPhone that runs Halide and iOS 17, not just the latest iPhone Pro models.
Because Process Zero eschews magical algorithms, it has tradeoffs. This is why it’s a new choice in addition to the standard iPhone photo processing system in Halide. Read on to learn why we built this, what the tradeoffs are, and where we’re going.
By comparison, an iPhone is downright conservative, mostly a magic helping hand in difficult lighting situations. Consider the classic problem of capturing a window on a sunny day. If you’ve only captured photos on an iPhone, you might not even know this is a classic photography problem:
Classic cameras can either over-expose the outside or under-expose the room. But algorithms can combine these multiple photos and voilà!
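As a rough sketch of the idea (this is not Apple’s Smart HDR, whose internals aren’t public), an exposure-fusion merge can be as simple as keeping, for each pixel, whichever bracketed value is best exposed:

```python
def fuse_brackets(brackets):
    """For each pixel, keep the bracket value closest to mid-gray (128).
    A crude stand-in for the weighted blending real exposure fusion uses."""
    height, width = len(brackets[0]), len(brackets[0][0])
    return [
        [min((shot[y][x] for shot in brackets), key=lambda v: abs(v - 128))
         for x in range(width)]
        for y in range(height)
    ]

bright = [[255, 90]]   # long exposure: window blown out, room readable
dark = [[140, 10]]     # short exposure: window readable, room crushed
print(fuse_brackets([bright, dark]))  # [[140, 90]]
```

Real fusion weights and blends values rather than hard-picking one, but the principle is the same: the merged frame borrows the readable window from the short exposure and the readable room from the long one.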
This is great for aspiring photographers, who can now focus on learning high-level concepts instead of getting bogged down fiddling with knobs. Even experts can appreciate the convenience of just pressing a button and getting useful results.
In the years following Halide’s launch in 2017, iPhone cameras have gotten much smarter, adding sophisticated algorithms like Smart HDR and Deep Fusion. We worried about what this could mean for our app. What’s the point in manual control when a phone can do better if you stay out of the way?
It turns out we were wrong to worry. As cameras have gotten smarter, Halide has only thrived. Users want more than standard manual controls; they want control over the algorithms. Tons of people tell us they love Halide for this one toggle:
To be clear, these algorithms are amazing, but leaving all decisions to a machine means sacrificing some choices as an artist. A machine can only make objective decisions, but many technical decisions are inherently subjective.
For example, a photographer manually editing their photos might ask themselves, “Do I want noise in my photo, or to eliminate it at the cost of detail?” The iPhone’s image processing pipeline doesn’t like noise at all, and that’s fine. We wouldn’t be surprised if most iPhone users prefer their photos that way.
But Sebastiaan and I like a bit of noise in our shots, and that toggle in Halide that reduces processing didn’t help. Noise reduction is just one of those things that gives iPhone photos their look. Because Halide was built on top of the system processing, we had to come along for the ride.
So our love of noise sent us down the path of building our own process.
What’s going on with the noise removal? We can’t say for sure, because we didn’t build the iPhone’s algorithm. We do know that when you combine multiple photos (as in the window example earlier) you are no longer capturing a single moment in time, and when you average together multiple photos, noise goes away.
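Here’s a toy simulation of that effect (illustrative only, not the iPhone’s pipeline): averaging N noisy readings of the same pixel shrinks the random noise by roughly the square root of N.

```python
import random
import statistics

random.seed(42)

NOISE_SD = 10.0  # per-shot sensor noise, in arbitrary units

def capture(true_value=100.0):
    """One noisy sensor reading of a single pixel."""
    return true_value + random.gauss(0.0, NOISE_SD)

def merged_pixel(n_frames):
    """Average n readings, as a multi-frame pipeline effectively does."""
    return sum(capture() for _ in range(n_frames)) / n_frames

# Spread of single shots vs. nine-frame merges, over many trials.
single_sd = statistics.stdev(merged_pixel(1) for _ in range(2000))
merged_sd = statistics.stdev(merged_pixel(9) for _ in range(2000))

print(round(single_sd, 1))  # close to 10: the raw per-shot noise
print(round(merged_sd, 1))  # close to 10 / sqrt(9), i.e. about 3.3
```

With nine frames, the noise drops to about a third of a single shot, which is why merged photos look so clean.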
Unfortunately, photo-merging algorithms have to guess how each photo lines up. This is especially tricky with moving objects, and if the algorithm guesses wrong, you see ghosts. In the end, all of this intricate guesswork sacrifices not just noise, but fine detail too.
In contrast, Process Zero is a single-shot process. We take one, and only one, photo. If parts of your image are not properly exposed, we don’t have any algorithms to fix that. Sometimes this is a good thing! Consider this photo of Ethan from the day he came home from the hospital.
If an algorithm sees this image, it may think it needs to bring out details in the shadows and smooth away the noise. This can frustrate an experienced photographer who knows what they’re doing.
Process Zero also lets you retain full control over your camera settings. For example, algorithm-based exposure logic can’t let you pick a specific shutter speed, because the whole job of the algorithm is to take nine photos at different shutter speeds and pick the best of the bunch.
As we’ve followed the iPhone’s algorithms getting more sophisticated over the years, we’ve found that our single-shot approach is the only reliable way to give users total control over shutter speed and other exposure settings.
Consider these shots I took last year from a boat in the Galapagos. By shooting a single photo with a fast shutter speed, I outperformed the algorithm.
Just be careful what you wish for. Turning off the algorithms has tradeoffs.
I mentioned grain earlier, and just like film, Process Zero will have an ideal ‘ISO’ range. In the dark, it will get noisy. Fortunately, newer iPhones with quad-Bayer sensors have far better low-light performance than their predecessors. Don’t go in expecting Night mode, but I’ve been surprised by how useful the results can be.
Because Process Zero does not fuse multiple shots, you are limited by the dynamic range of the sensor. That means that if you’re shooting something like that window from earlier, you need to choose which bits you want exposed.
Finally, some flagship iPhone features are deeply integrated with these algorithms. If you want full 48-megapixel resolution rather than binned output, that isn’t possible. The same goes for the virtual 2× camera at 12-megapixel resolution. However, the situation might improve if enough people file feedback with Apple!
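For the curious, pixel binning itself is conceptually simple: a quad-Bayer-style sensor averages each 2×2 block of photosites into one output pixel, trading resolution for light gathering. A toy sketch (not Apple’s actual implementation):

```python
def bin_2x2(photosites):
    """Average each 2x2 block of photosites into one output pixel,
    a simplified model of quad-Bayer pixel binning."""
    h, w = len(photosites), len(photosites[0])
    return [
        [(photosites[y][x] + photosites[y][x + 1]
          + photosites[y + 1][x] + photosites[y + 1][x + 1]) / 4
         for x in range(0, w, 2)]
        for y in range(0, h, 2)
    ]

# A 4x4 patch of a "48 MP" sensor becomes a 2x2 patch of "12 MP" output.
patch = [
    [10, 12, 20, 22],
    [14, 16, 24, 26],
    [30, 32, 40, 42],
    [34, 36, 44, 46],
]
print(bin_2x2(patch))  # [[13.0, 23.0], [33.0, 43.0]]
```

Each output pixel collects the light of four photosites, which is why binned 12-megapixel shots are cleaner than full-resolution 48-megapixel ones.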
Then there’s the issue of HDR output. If you’ve ever scrolled through photos and video in your camera roll, and your phone suddenly got bright, that’s what we’re talking about. Some people hate the look, but I think the results can be stunning when done with thought and care.
We’d love to show you examples, but browsers don’t really support it, and that cuts to the heart of the problem. If you shoot in HDR today, your photos can look pretty different from screen to screen. Later this year, new standards will land that help with HDR compatibility, and we’ll revisit it then.
To summarize: Process Zero gives you a single 12-megapixel shot. It will be less saturated, softer, grainier, and quite different than what you see from most phones. Each shot includes a true Bayer RAW file, if you want to use it in a full-fledged RAW editor, but we designed Halide so you don’t need one.
If any of Process Zero’s tradeoffs are dealbreakers, that’s fine! If Process Zero just shows you how valuable smart processing can be in difficult situations, that’s cool. We find ourselves toggling between Process Zero and the system processing depending on the content of a scene and what we’re hoping to accomplish. That’s why we made it easy to switch modes with a tap.
As cameras make more and more creative choices on your behalf, we think the photographer should retain the agency to cast aside algorithms and do their own thing. Just as a photographer expresses themselves in their choice of lens, exposure settings, and film stock, Halide now lets you choose the process that works for you.
Image Lab
Back in the days of film photography, half of the art was in taking the photo, and the other half was developing your negative. Sometimes it was to correct mistakes, and sometimes it was for creative effect. When going analog, we love pushing and pulling film.
However, adjusting the exposure on a processed JPEG/HEIC is never as good as ‘re-developing’ a digital negative.
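A toy model shows why. A JPEG has already been gamma-encoded and clipped to 8 bits, so pushing exposure after the fact distorts tones and blows out values the linear negative still holds. (This sketch assumes a bare 2.2 gamma curve; real pipelines add tone mapping, white balance, and more.)

```python
GAMMA = 2.2  # toy display gamma; real pipelines use full tone curves

def bake(linear):
    """Gamma-encode linear sensor values (0.0-1.0) to 8-bit."""
    return [min(255, round((v ** (1 / GAMMA)) * 255)) for v in linear]

def push_jpeg(baked, stops):
    """Push exposure on already-baked 8-bit values: clips and distorts."""
    return [min(255, round(p * 2 ** stops)) for p in baked]

def push_raw(linear, stops):
    """Push exposure in linear space first, then bake: 're-developing'."""
    return bake([min(1.0, v * 2 ** stops) for v in linear])

scene = [0.02, 0.10, 0.45, 0.90]  # linear scene light, fractions of full scale

print(push_jpeg(bake(scene), 1))  # [86, 180, 255, 255]
print(push_raw(scene, 1))         # [59, 123, 243, 255]
```

Pushed one stop, the baked file blows its 0.45 midtone out to pure white and lifts the shadows far too hard, while re-developing the negative keeps the tones in proportion.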
This is why Process Zero includes a digital negative. Rather than leave you to find an editor that supports them, we decided to go the extra mile and include Image Lab, a one-dial solution to developing your negative.
We think Image Lab gives Halide a great “batteries included” experience. Take a shot, tweak it if you have to, and share it — no other steps are required. We also built Image Lab because even if you’re comfortable working with RAW editors, they can yield different results than Process Zero.
Image Lab is not a full-fledged editor. It does not contain any color or contrast adjustment knobs. You can’t even crop! Think of Process Zero + Image Lab as “step zero” of your workflow.
But if you do take those DNGs we include with Process Zero into another app, they edit pretty well:
Process Zero is Step One
Today is also step one in our journey to the next generation of Halide. We plan to make it bigger than our award-winning Mark II. We’re going to call it… wait for it… Mark III.
Our Plan for Mark III
We released Halide Mark II after a long silent period, so we could ship one big, splashy update. While that’s fun, it’s both risky and ultimately less useful to our users. This time, we’ve decided to roll out some Mark III features early rather than saving them all for one big launch at the end. As much as we love surprises, we think getting these features into your hands and gathering feedback will make Mark III much better than developing it in a vacuum.
Process Zero is step one: the first part of giving you even more control, and beautiful output out of the box. We’re going to go beyond that.
Halide members will get early access to even more Mark III features, along with exclusive icons and other goodies. You can get the app here — and try it for free for a week by starting a membership.
If subscriptions aren’t your thing, that’s totally fine. We’re continuing to offer a one-time-purchase option too. While it’s technically a one-time purchase for Mark II, we’ve decided to include Mark III, too.
It has been a busy year. A few months ago, we shipped our new video app Kino. Today, we launched Process Zero, the first step on the road to Mark III. In a few weeks we’ll enter the busiest time of year. We feel incredibly privileged to be able to work on what we love, and it wouldn’t be possible without your support. Thank you.
We’d love to hear your feedback on Process Zero, and we can’t wait to see what you shoot with it. Make sure to tag your photos #ProcessZero so we can see and share your shots!
Last month, two young paddleboarders found themselves stranded in the ocean, pushed 2,000 feet from the shore by strong winds and currents. Thanks to the deployment of a drone, rescuers kept an eye on them the whole time and safely brought them aboard a rescue boat within minutes.
In North Carolina, the Oak Island Fire Department is one of a few in the country using drone technology for ocean rescues. Firefighter-turned-drone pilot Sean Barry explained the drone’s capabilities as it was demonstrated on a windy day.
“This drone is capable of flying in all types of weather and environments,” Barry said.
The drone is equipped with a camera that can switch between modes, including infrared to spot people in distress, and a speaker through which responders can communicate instructions. It can also carry life-preserving equipment.
The device is activated by a CO2 cartridge when it comes in contact with water. Once triggered, it inflates into a tube approximately 26 inches long, giving distressed swimmers something to hold on to.
In a real-life rescue, after a 911 call from shore, the drone spotted a swimmer in distress. It released two floating tubes, providing the swimmer with buoyancy until help arrived.
Like many coastal communities, Oak Island’s population can swell from about 10,000 to 50,000 during the summer tourist season. Riptides, which are hard to detect on the surface, can happen at any time.
Every year, about 100 people die in rip currents on U.S. beaches, and more than 80% of beach rescues involve them. If you’re caught in one, rescuers advise not to panic or fight it; instead, float or swim parallel to the coastline to get out of the current.
Oak Island Fire Chief Lee Price noted that many people underestimate the force of rip currents.
“People are, ‘Oh, I’m a good swimmer. I’m gonna go out there,’ and then they get in trouble,” Price said.
For Price, the benefit of drones isn’t just faster response times but also keeping rescuers safe. Through the camera and speaker, they can determine whether someone is actually in distress.
Price said many people might not be aware of it.
“It’s like anything as technology advances, it takes a little bit for everybody to catch up and get used to it,” said Price.
In a demonstration, Barry showed how the drone can bring a safety rope to a swimmer while rescuers prepare to pull the swimmer to shore.
“The speed and accuracy that this gives you … rapid deployment, speed, accuracy, and safety overall,” Price said. “Not just safety for the victim, but safety for our responders.”
Manuel Bojorquez is a CBS News national correspondent based in Miami. He joined CBS News in 2012 as a Dallas-based correspondent and was promoted to national correspondent for the network’s Miami bureau in January 2017.
It’s been quite some time since we heard anything about Netflix’s animated adaptation of Splinter Cell — but the streamer has finally provided some details on the show. The reveal comes in the form of a very brief teaser trailer, which shows a little bit of the show, but mostly showcases Liev Schreiber’s gravelly take on lead character Sam Fisher. We also have a proper name now: it’s called Splinter Cell: Deathwatch.
Horseshoe crabs: Ancient creatures who are a medical marvel – CBS News
Correspondent Conor Knighton visits New Jersey beaches along the Delaware Bay to learn about horseshoe crabs – mysterious creatures that predate dinosaurs – whose very blood has proved vital to keeping humans healthy by helping detect bacterial endotoxins. He talks with environmentalists about the decline in the horseshoe crab population, and with researchers who are pushing the pharmaceutical industry to switch from horseshoe crab blood to a synthetic alternative used in medical testing.
Strands is the NYT’s latest word game after the likes of Wordle, Spelling Bee and Connections – and it’s great fun. It can be difficult, though, so read on for my Strands hints.
SPOILER WARNING: Information about NYT Strands today is below, so don’t read on if you don’t want to know the answers.
Your Strands expert
Marc McLaren
NYT Strands today (game #201) – hint #1 – today’s theme
What is the theme of today’s NYT Strands?
• Today’s NYT Strands theme is… A way with words
NYT Strands today (game #201) – hint #2 – clue words
Play any of these words to unlock the in-game hints system.
DIRTY
STRICT
POSE
POSED
DEAN
DOSE
NYT Strands today (game #201) – hint #3 – spangram
What is a hint for today’s spangram?
• A bard’s domain
NYT Strands today (game #201) – hint #4 – spangram position
What are two sides of the board that today’s spangram touches?
First: left, 4th row
Last: right, 5th row
Right, the answers are below, so DO NOT SCROLL ANY FURTHER IF YOU DON’T WANT TO SEE THEM.
NYT Strands today (game #201) – the answers
The answers to today’s Strands, game #201, are…
RHYME
VERSE
METER
STANZA
SYNTAX
DICTION
SCANSION
SPANGRAM: POETRY
My rating: Moderate
My score: 2 hints
I’ve never been a fan of poetry, though I love words and language. Set it to music and it’s a different matter – and I guess the best lyricists are also poets. But ask me to talk about SCANSION and STANZAs and I’m a little lost. All of which is a way of justifying why I needed two hints to complete what for some people will probably be a fairly simple Strands puzzle.
Sign up for breaking news, reviews, opinion, top tech deals, and more.
I worked out what the theme was early on, with the clue of ‘A way with words’ and the fact that I found RHYME by accident combining to set me on the right track. But though I spotted a couple more, I couldn’t get them all without needing a helping hand for METER and STANZA. After that I spotted the spangram, and the others were solved pretty much by a combination of guesswork and my modicum of knowledge.
Yesterday’s NYT Strands answers (Thursday 19 September, game #200)
SPIDER
MILLIPEDE
BEETLE
TERMITE
EARWIG
SPANGRAM: CREEPYCRAWLIES
What is NYT Strands?
Strands is the NYT’s new word game, following Wordle and Connections. It’s now out of beta so is a fully fledged member of the NYT’s games stable and can be played on the NYT Games site on desktop or mobile.
I’ve got a full guide to how to play NYT Strands, complete with tips for solving it, so check that out if you’re struggling to beat it each day.
A blockchain entrepreneur, a cinematographer, a polar adventurer and a robotics researcher plan to fly around Earth’s poles aboard a SpaceX Crew Dragon capsule by the end of the year, becoming the first humans to observe the ice caps and extreme polar environments from orbit, SpaceX announced Monday.
The historic flight, launched from the Kennedy Space Center in Florida, will be commanded by Chun Wang, a wealthy bitcoin pioneer who founded f2pool and stakefish, “which are among the largest Bitcoin mining pools and Ethereum staking providers,” the crew’s website says.
“Wang aims to use the mission to highlight the crew’s explorational spirit, bring a sense of wonder and curiosity to the larger public and highlight how technology can help push the boundaries of exploration of Earth and through the mission’s research,” SpaceX said on its website.
Wang’s crewmates are Norwegian cinematographer Jannicke Mikkelsen, Australian adventurer Eric Philips and Rabea Rogge, a German robotics researcher. All four have an interest in extreme polar environments and plan to carry out related research and photography from orbit.
The mission, known as “Fram2” in honor of a Norwegian ship used to explore both the Arctic and Antarctic regions, will last three to five days and fly at altitudes between about 265 and 280 miles.
“This looks like a cool & well thought out mission. I wish the @framonauts the best on this epic exploration adventure!” tweeted Jared Isaacman, the billionaire philanthropist who chartered the first private SpaceX mission, Inspiration4, and who plans to blast off on a second flight, Polaris Dawn, later this month.
The flights “showcase what commercial missions can achieve thanks to @SpaceX’s reusability and NASA’s vision with the commercial crew program,” Isaacman said. “All just small steps towards unlocking the last great frontier.”
Like the Inspiration4 mission before them, Wang and his crewmates will fly in a Crew Dragon equipped with a transparent cupola giving them a picture-window view of Earth below and deep space beyond.
No astronauts or cosmonauts have ever viewed Earth from the vantage point of a polar orbit, one tilted, or inclined, 90 degrees to the equator. Such orbits are favored by spy satellites, weather stations and commercial photo-reconnaissance satellites because they fly over the entire planet as it rotates beneath them.
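A back-of-envelope check with Kepler’s third law (assuming a circular orbit at roughly the mission’s mid-range 272-mile altitude) shows why that coverage works: each orbit takes about 93 minutes, during which Earth rotates about 23 degrees beneath the spacecraft, so successive passes sweep fresh ground.

```python
import math

MU = 3.986004418e14      # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6_371_000.0    # mean Earth radius, m
MILE = 1609.344          # meters per statute mile

def circular_period_s(altitude_m):
    """Period of a circular orbit, from Kepler's third law."""
    a = R_EARTH + altitude_m
    return 2 * math.pi * math.sqrt(a ** 3 / MU)

period = circular_period_s(272 * MILE)  # mid-range of 265-280 miles
minutes = period / 60
deg_per_orbit = 360 * period / 86164    # sidereal day = 86,164 s

print(round(minutes, 1))        # roughly 93 minutes per orbit
print(round(deg_per_orbit, 1))  # Earth turns roughly 23 degrees per orbit
```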
The high-inclination record for piloted flight was set in the early 1960s by Soviet Vostok spacecraft launched into orbits inclined 65 degrees. The U.S. record was set by a space shuttle mission launched in 1990 that carried out a classified military mission in an orbit tilted 62 degrees with respect to the equator.
The International Space Station never flies beyond 51.6 degrees north and south latitude. NASA planned to launch a space shuttle on a classified military mission around the poles in 1986, but the flight was canceled in the wake of the Challenger disaster.
“The North and South Poles are invisible to astronauts on the International Space Station, as well as to all previous human spaceflight missions except for the Apollo lunar missions but only from far away,” the Fram2 website says. “This new flight trajectory will unlock new possibilities for human spaceflight.”
SpaceX has launched 13 piloted missions carrying 50 astronauts, cosmonauts and private citizens to orbit in nine NASA flights to the space station, three commercial visits to the lab and the Inspiration4 mission chartered by Isaacman.
Isaacman and three crewmates plan to blast off Aug. 26 on another fully commercial flight, this one featuring the first civilian spacewalks. NASA plans to launch its next Crew Dragon flight to the space station around Sept. 24.
Bill Harwood has been covering the U.S. space program full-time since 1984, first as Cape Canaveral bureau chief for United Press International and now as a consultant for CBS News.
Today we’re launching a totally new, totally different app. Meet Orion.
Orion is a small, fun app that helps you use your iPad as an external HDMI display for any camera, video game console, or even a VHS player. Just plug in one of the bajillion inexpensive adapters, and Orion handles the rest.
But wait — we’re a camera company. Why an HDMI monitor?
We built this to scratch a few itches. First, in professional cinematography, it’s common to connect an external screen to your camera to get a better view of the action. Orion not only gives you a bigger screen, but you can even share screenshots with your crew with a couple of taps.
We also built this for… pure fun. When traveling with a Nintendo Switch, it’s a delight to play games on a bigger screen, especially alongside friends.
Orion goes a step beyond a bare display. Out of the box, inputs can look fuzzy on an iPad’s Retina display. (Why? The Switch runs at a modest 1080p resolution, and even if it ran higher, most adapters on the market can only capture 1080p at 60 frames per second.)
Orion sharpens those low-resolution inputs with an AI-powered upscaler!
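Orion’s upscaler is a machine-learning model we can’t reduce to a few lines here, but for contrast, this is the naive nearest-neighbor baseline that makes a low-resolution input look blocky on a dense display:

```python
def upscale_nearest(img, factor):
    """Nearest-neighbor integer upscale: repeat each pixel `factor` times
    in both directions. The blunt baseline a smarter upscaler improves on."""
    return [
        [row[x // factor] for x in range(len(row) * factor)]
        for row in img
        for _ in range(factor)
    ]

# A 2x2 image doubled to 4x4: every pixel becomes a 2x2 block.
print(upscale_nearest([[1, 2], [3, 4]], 2))
```

Every source pixel simply becomes a square block of identical pixels; a learned upscaler instead synthesizes plausible detail between them.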
Another perk is control over the brightness of the image beyond the iPad’s screen brightness. If you’re trying to view video in daylight, crank up brightness to HDR range for extra help. If you’re on a late-night flight and don’t want to bother anyone around you, make things darker than the iPad’s darkest.
OK, I hear you ask, but how much does all of this cost? A camera monitor is hundreds of dollars. Well, Orion is free. Yep, free.
If you want to support the app, get Orion Pro: It packs AI upscaling, CRT emulation for retro games, and image adjustments (and whatever else we cook up). It’s a one-time upgrade for $5. It unlocks everything. No subscriptions.
As for those adapters, we found plenty available for under $20. Now it’s easy to get confused and accidentally buy, say, a USB-C hub with video output, which can’t capture anything. (Ask me how I know.) That’s why we personally tested the top ten adapters on Amazon and made a helpful buying guide with our recommendations and some other accessories, too.
The Story of Orion
This summer, Apple announced a set of awesome new features coming to iOS 17, and one of them was external-webcam support on iPad. After digging into the feature for our flagship app, Halide, we weren’t satisfied with the results in a camera app.
However, we did discover that a ton of companies sell tiny, inexpensive adapters that convert HDMI signals into webcams. “What if you could use an iPad as a portable screen?” Hmm! Intriguing. We had an idea, and we got to work.
We wrote the first line of code on August 6th and shipped on September 20th, 45 days later.
We’re launching at the start of new iPhone season, so we’re already super busy and shifting our focus to our flagship iPhone photography app, Halide. Orion won’t distract us from that, because we’re calling it a b-side.
B-sides are small, fun, and focused projects. Apps like Halide need major work every year to keep up with new hardware, but we expect Orion will be “done” after a release or two. We’ll keep maintaining it so it doesn’t break, but we won’t revolve our lives around it. It’s a fun utility, and that’s why we’re only asking for a few bucks.
Beyond being fun to build and design, apps like Orion let us experiment with new developer tools earlier than we can in our flagship apps. In a mature app used by lots of people, it’s wise to wait a year or two before adopting a cutting-edge technology: Apple launched SwiftUI in 2019, but we waited until 2021 to add it to Halide. SwiftUI has been a huge win for certain types of problems (we couldn’t have built Orion so quickly without it), but those two years of waiting meant playing a lot of catch-up in 2021.
So apps like Orion allow us to scratch our own itch, which is how we got into building apps in the first place, and also help us keep up with where iOS is heading.
The Orion Video System Design
You might notice something about the styling of Orion — it’s very stylized in a… retro sort of way.
When we set off to design the app, we really wanted it to be fun. Starting with the basic idea—a portable screen—we thought of the era where televisions and video were still exciting, fresh technology. The techno-utopia of the early 1980s came to mind. We find this a delightful aesthetic.
Pastels, purples and pinks. Detailed technical illustrations and bright colors. Futuristic logos. Type that tracked far too tightly thanks to the invention of the photo-typesetter. And of course, the invention of bitmap typefaces and on-screen user interfaces and icons.
We didn’t want to just lean into the clichés (there are enough vaporwave sunsets with DeLoreans out there trying to seem ’80s’), so we went and developed a visual language based on the electronics brochures and VCR interfaces of bygone days, one that conveys ‘modern’ in a way only the 1980s visual vernacular can.
In Halide, we did everything we could to make the app feel as tactile as a real camera. Great cameras are wonderfully tactile — every knob and switch has a weighted, deliberate feel and click to it.
In Orion, we wanted to give you the joy of your own ‘video system’. That meant starting from the beginning: you open the box to unpack it. Because, well, why not.
Instructions follow, so you can get started quickly.
And when not actively in use, you return to a glowing, slightly distorted nostalgic place of on-screen menus, where our custom-made pixel font called Radiant steals the show.
If it wasn’t obvious, we had a lot of fun doing this. And that’s what really mattered to us: if anything, Orion was a project to collaborate with friends on something fun and different.
Thank you
We want to build things with craft, fun and delight. To showcase that apps are an art form, and have no business being boring. We hope you enjoy the result — we know that we loved building it for you. Thanks to you, we get to do what we love.
Orion was a collaboration with friends. Some of the incredible design and typography on display (and our two custom typefaces) are the work of Jelmar Geertsma. Orion was co-engineered with Anton Heestand. The opening music (yes, opening music) is by Cabel Sasser. Extra thanks go to Louie Mantia for bézier wrangling and our families — especially Margo — for supporting us in doing what we love. If you are still reading here, please consider leaving us a review on our apps — it goes a very long way.