Apple Camera Engineers Want iPhone Photos to Reflect Real Life

Closeup of the iPhone 12 Pro camera.

Apple’s Product Line Manager for iPhone, Francesca Sweet, and Vice President of Camera Software Engineering, Jon McCormack, sat down for an interview to talk about some of the photography advancements Apple has made with the iPhone 12 (via PetaPixel).

iPhone 12 Camera

Throughout the interview, Mr. McCormack repeatedly mentioned that the goal is to faithfully reproduce a photo so it reflects “what it was like to actually be there.” While professional photographers typically shoot photos with the intention of editing them later, Apple wants the photo you get straight out of the camera to be good enough for most people.

We replicate as much as we can to what the photographer will do in post. There are two sides to taking a photo: the exposure, and how you develop it afterwards. We use a lot of computational photography in exposure, but more and more in post and doing that automatically for you. The goal of this is to make photographs that look more true to life, to replicate what it was like to actually be there.

Apple’s approach to photography isn’t just about the lenses and sensor; it includes the software, the image signal processing, the machine learning, and all of the work put into the A14 Bionic chip.

Apple’s ProRAW also came up in the interview. We received some information during the iPhone keynote, and Mr. McCormack reiterated Apple’s goal. Traditionally, photographers have had to choose between shooting in a RAW format, which preserves the unprocessed image data, and a compressed format like JPEG, which gives you the benefits of computational machine learning at the expense of quality.

ProRAW was designed to give you the best of both worlds. It applies Apple’s computational photography but saves the result, processing included, as a digital negative (DNG) file. This lets iPhone photographers enjoy the iPhone’s image processing while still working with the raw image data.
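The interview doesn’t go into developer details, but based on the AVFoundation APIs Apple shipped alongside ProRAW (iOS 14.3 and later), opting in from a third-party camera app looks roughly like the sketch below. This is a minimal illustration only; the capture session configuration and the delegate that receives the finished photo are assumed to exist elsewhere in the app.

```swift
import AVFoundation

// Minimal sketch of enabling Apple ProRAW capture (iOS 14.3+).
// Session setup and the photo delegate are assumed to exist elsewhere.
@available(iOS 14.3, *)
func captureProRAWPhoto(from photoOutput: AVCapturePhotoOutput,
                        delegate: AVCapturePhotoCaptureDelegate) {
    // ProRAW is only available on supported hardware (e.g. iPhone 12 Pro).
    guard photoOutput.isAppleProRAWSupported else {
        print("ProRAW is not supported on this device")
        return
    }

    // Opt in to ProRAW on the output before requesting a raw format.
    photoOutput.isAppleProRAWEnabled = true

    // Find the ProRAW pixel format among the available raw formats.
    guard let proRAWFormat = photoOutput.availableRawPhotoPixelFormatTypes
        .first(where: { AVCapturePhotoOutput.isAppleProRAWPixelFormat($0) })
    else {
        print("No ProRAW pixel format available")
        return
    }

    // The photo is delivered as a DNG that still carries Apple's
    // computational processing in an editable form.
    let settings = AVCapturePhotoSettings(rawPixelFormatType: proRAWFormat)
    photoOutput.capturePhoto(with: settings, delegate: delegate)
}
```

The notable design point is that the computational processing stays automatic; the app simply opts in on the photo output, and the result arrives as a DNG rather than a fully baked compressed image.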
