Apple's iPhone camera design philosophy


PetaPixel has an interview with Apple Product Line Manager Francesca Sweet and Vice President of Camera Software Engineering Jon McCormack, providing insights into the changes to the iPhone 12 Pro Max camera. The new flagship iPhone introduces, for the first time, a larger camera sensor. When asked why now and not sooner, Apple explained its holistic approach to iPhone design.

â€"You could of course go for a bigger sensor, which has form factor issues, or you can look at it from an entire system to ask if there are other ways to accomplish that,” McCormack said of Apple’s perspective. â€"We think about what the goal is, and the goal is not to have a bigger sensor that we can brag about. The goal is to ask how we can take more beautiful photos in more conditions that people are in. It was this thinking that brought about deep fusion, night mode, and temporal image signal processing.

McCormack emphasized that because Apple is developing the entire system, it sees things differently than if it were only responsible for one or two parts of the process.

â€"We don’t tend to think of a single axis like ‘if we go and do this kind of thing to hardware’ then a magical thing will happen. Since we design everything from the lens to the GPU and CPU, we actually get to have many more places that we can do innovation.”

Apple often avoids talking about specs, in part because specs are a commodity that can easily be matched or beaten, but also because they don't tell the whole story. There is some spin in Apple's framing here, but the approach has served it well. Each component in its devices affects other components in multiple ways, and space is at a premium, so making any one part bigger is generally a significant decision.