Apple Executives Discuss the iPhone 12 Camera Design Philosophy: Sensors and Lenses, the A14 Bionic, and Software (Engadget Japanese Version)




With the iPhone 12 mini and iPhone 12 Pro Max going on sale this week, Apple executives gave an interview about the design philosophy behind the iPhone’s cameras.

Jon McCormack, Apple’s vice president of camera software engineering, and Francesca Sweet, iPhone product line manager, shared the vision and design philosophy behind the iPhone camera’s development with the camera and photography publication PetaPixel.

As an overview, Apple made clear that camera development is centralized and that software and hardware are treated as one. That is, everything is designed together, from the sensors and lenses to the A14 Bionic chip, image signal processing, and the software behind computational photography (photography that is processed and generated computationally).

The main objective is to let people capture the moments of their lives as they happen, without being tied down by the technology. Rather than having them think about ISO, subject movement and other factors, McCormack said, “Apple wants to eliminate that need so that people can focus on the moment, take a great photo, and get back to their lives. That’s it.”

McCormack also stressed that Apple has “more serious photographers” in mind as well. “We replicate as much as we can of what a photographer does when shooting,” he said, splitting a shot into two stages: “exposure” and the subsequent “development.” On top of that, “We use a great deal of image processing for exposure, but we have also automated more of the development. The aim is an image that looks true to life and reproduces the experience of actually being there.”
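
As a rough illustration of that two-stage split, here is a minimal Swift sketch (a simplification for this article, not Apple’s actual pipeline): an “exposure” stage that fuses several fast captures into one linear image, and a “development” stage that renders it afterwards.

```swift
import Foundation

// "Exposure": fuse several fast bracketed captures into one linear image.
// A real pipeline would align the frames and weight them by sharpness and
// noise; a plain average keeps the sketch short.
func exposureStage(brackets: [[Double]]) -> [Double] {
    guard let first = brackets.first else { return [] }
    var fused = [Double](repeating: 0, count: first.count)
    for frame in brackets {
        for i in fused.indices { fused[i] += frame[i] }
    }
    return fused.map { $0 / Double(brackets.count) }
}

// "Development": the automatic rendering applied afterwards, reduced here
// to a simple gamma curve standing in for tone, contrast and color work.
func developmentStage(linear: [Double], gamma: Double = 2.2) -> [Double] {
    linear.map { pow(max($0, 0), 1.0 / gamma) }
}

let photo = developmentStage(linear: exposureStage(brackets: [[0.1, 0.5],
                                                              [0.2, 0.6]]))
```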

More interestingly, he explained that machine learning segments a photo into elements such as “background, foreground, eyes and lips, hair and skin, clothes and sky,” and processes each of them individually: “These are processed independently, the way you would make local adjustments in (Adobe) Lightroom. We adjust exposure, contrast, saturation and so on for each, then combine them all into a single photo.”
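
Concretely, that kind of mask-based local adjustment can be sketched as follows (illustrative names and values, not Apple’s implementation): each region carries a per-pixel mask from the segmenter and its own gain, and the adjusted pixels are blended back through the mask.

```swift
// One segmented region: a per-pixel 0...1 mask from the segmenter plus the
// adjustment to apply inside it. (Hypothetical structure, illustration only.)
struct Segment {
    let name: String
    let mask: [Double]
    let exposureGain: Double
}

// Adjust each region independently, then composite: the gain is blended back
// through the mask so the edit stays local, like a Lightroom local adjustment.
// Works on a single luminance channel to keep the sketch short.
func applyLocalAdjustments(pixels: [Double], segments: [Segment]) -> [Double] {
    var out = pixels
    for segment in segments {
        for i in out.indices {
            let adjusted = out[i] * segment.exposureGain
            out[i] = segment.mask[i] * adjusted + (1 - segment.mask[i]) * out[i]
        }
    }
    return out
}

// Example: brighten the "sky" region of a 4-pixel image by half a stop.
let sky = Segment(name: "sky", mask: [1, 1, 0, 0], exposureGain: 1.4)
let result = applyLocalAdjustments(pixels: [0.2, 0.3, 0.5, 0.6], segments: [sky])
```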

Sweet, for her part, commented on the improved Night mode in the iPhone 12 series. “The new wide-angle camera and the improved image-blending algorithms reduce noise and bring out finer detail,” she said. On the iPhone 12 Pro Max in particular, “the larger sensor lets us capture more light in less time, which reduces blur, especially at night.”
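
A toy version of that kind of multi-frame blending might look like this (a sketch of the general technique, not Apple’s algorithm):

```swift
// Average aligned frames, but down-weight pixels that disagree with the
// reference frame so moving subjects are mostly rejected instead of ghosting.
func blendNightFrames(frames: [[Double]], threshold: Double = 0.1) -> [Double] {
    guard let reference = frames.first else { return [] }
    var sum = [Double](repeating: 0, count: reference.count)
    var weight = [Double](repeating: 0, count: reference.count)
    for frame in frames {
        for i in reference.indices {
            let w = abs(frame[i] - reference[i]) < threshold ? 1.0 : 0.1
            sum[i] += w * frame[i]
            weight[i] += w
        }
    }
    // Averaging N consistent frames cuts shot noise by roughly the square
    // root of N, which is where Night mode's cleaner shadows come from.
    return zip(sum, weight).map { $0 / $1 }
}
```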

As for the Apple ProRAW format (supported only on the 12 Pro and 12 Pro Max), due to be added later this year, McCormack said it was conceived to layer the results of computational photography onto a traditional camera RAW, so that photographers keep full control over the image.
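
For developers, the AVFoundation opt-in that ProRAW eventually shipped with (iOS 14.3 and later) looks roughly like this minimal sketch; session setup, authorization, and the photo-capture delegate are omitted:

```swift
import AVFoundation

// Build capture settings that request an Apple ProRAW photo
// (iOS 14.3+, iPhone 12 Pro / Pro Max).
func makeProRAWSettings(for photoOutput: AVCapturePhotoOutput) -> AVCapturePhotoSettings? {
    // ProRAW must be enabled on the output before it can be requested.
    guard photoOutput.isAppleProRAWSupported else { return nil }
    photoOutput.isAppleProRAWEnabled = true

    // Pick a ProRAW pixel format from those the output advertises.
    guard let rawFormat = photoOutput.availableRawPhotoPixelFormatTypes
        .first(where: { AVCapturePhotoOutput.isAppleProRAWPixelFormat($0) })
    else { return nil }

    return AVCapturePhotoSettings(rawPixelFormatType: rawFormat)
}
```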

The interview also touches on the processing power of the A14 Bionic and Apple’s thinking on machine learning in photography, so if you are interested, the original article is worth reading in full.

Source: PetaPixel

Via: 9to5Mac


