Apple’s iPhone 12 and iPhone 12 Mini add significant new photography features, but the camera hardware and computational photography software on the higher-end iPhone 12 Pro models really show how hard Apple is working to appeal to photo and video enthusiasts.
Among the changes in the iPhone 12 Pro models are new capabilities to merge multiple frames into a superior shot and a lidar sensor to improve autofocus. The iPhone 12 Pro Max also has a larger sensor for better low-light performance in the main camera, a telephoto camera that brings distant subjects closer, and better stabilization to counter shaky hands.
The iPhone 12, iPhone 12 Mini, iPhone 12 Pro and iPhone 12 Pro Max debuted at Apple’s iPhone 12 launch event Tuesday. The iPhone 12 (from $799, £799, AU$1,349) and 12 Mini (from $699, £699, AU$1,199) stick with last year’s design, with regular, ultrawide and selfie cameras.
The biggest photographic enhancements come with the 12 Pro (from $999, £999, AU$1,699) and 12 Pro Max (from $1,099, £1,099, AU$1,849), which get a larger image sensor and a fourth camera, a telephoto for more distant subjects. The iPhone 12 Pro has the same 2x telephoto zoom range as previous iPhones, a 52mm-equivalent focal length, but the 12 Pro Max extends to a 2.5x zoom, or a 65mm-equivalent lens.
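Those zoom figures follow from simple arithmetic against the main camera's 26mm-equivalent focal length (an assumption based on recent iPhones, not stated in this article); a quick sketch:

```python
# Hypothetical sketch: how "2x" and "2.5x" optical zoom map to
# 35mm-equivalent focal lengths, assuming a 26mm-equivalent main
# camera (typical of recent iPhones; an assumption, not a spec
# confirmed here).

MAIN_EQUIV_MM = 26  # assumed main-camera equivalent focal length

def equivalent_focal_length(zoom_factor: float) -> float:
    """Equivalent focal length for a given optical zoom factor."""
    return MAIN_EQUIV_MM * zoom_factor

print(equivalent_focal_length(2.0))  # iPhone 12 Pro telephoto -> 52.0
print(equivalent_focal_length(2.5))  # 12 Pro Max telephoto -> 65.0
```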
The iPhone 12 and iPhone 12 Mini also get significant improvements. They benefit from night mode, which now also works on the ultrawide and selfie cameras, and an improved HDR mode for challenging scenes with both bright and dark elements.
Apple’s efforts in this area reflect the fact that consumers consider the camera one of the most important features of a smartphone, along with the processor and network speed. We take photos and videos to document our lives, to share with friends and family, and for artistic expression.
Computational photography tricks
HDR stands for high dynamic range: the ability to capture shadow detail without blowing out highlights. All new iPhones feature third-generation Smart HDR technology designed to better capture details such as faces, Apple said. It also uses machine learning to judge processing options, such as brightening dark areas.
The iPhone 12 Pro models get another computational photography technique that Apple calls ProRaw. iPhones and Android phones have been able to take raw photos for years, a flexible alternative to JPEG that lets photographers decide the best way to edit an image. Apple ProRaw combines Apple’s computational photography with a raw format, so photographers get its noise reduction and dynamic range benefits along with the editing flexibility of raw, Apple said. It’s similar to Google’s computational raw technology that arrived with the Pixel 3 in 2018.
Google pioneered the range of processing tricks called computational photography, which helped erase the comfortable lead in image quality that Apple’s early iPhones held for years.
But with the iPhone 11, Apple employed its own versions of some Google techniques, such as combining multiple underexposed frames into a single shot to capture shadow detail without blowing out bright skies. Google calls its approach HDR+ and Apple calls its own Smart HDR; a related Apple technology called Deep Fusion blends frames for better detail and texture, particularly in low light.
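The core idea behind these multi-frame techniques can be sketched in a few lines: average a stack of aligned, deliberately underexposed frames. Underexposure keeps highlights from clipping, and averaging N frames cuts random sensor noise by roughly the square root of N, lifting usable shadow detail. This is a minimal illustration of the principle, not Apple’s or Google’s actual pipeline, which also aligns frames and merges per-tile:

```python
# Minimal sketch of multi-frame merging, the idea underlying HDR+ and
# Smart HDR: average several aligned, underexposed frames so shadow
# noise drops while highlights stay unclipped. Illustrative only.
import numpy as np

def merge_frames(frames):
    """Average a stack of aligned frames (H x W arrays, 0..1 range)."""
    stack = np.stack(frames).astype(np.float64)
    return stack.mean(axis=0)

rng = np.random.default_rng(0)
truth = np.full((4, 4), 0.2)  # a dark, noisy shadow region
frames = [truth + rng.normal(0, 0.05, truth.shape) for _ in range(8)]
merged = merge_frames(frames)

# The merged frame sits much closer to the true scene than any
# single noisy frame does.
print(abs(merged - truth).mean() < abs(frames[0] - truth).mean())
```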
In the iPhone 12, Apple’s Deep Fusion technology exercises all the major parts of the A14 Bionic chip, including the main CPU, image signal processor, graphics processor and neural engine. That means Apple can apply Deep Fusion to all cameras on all iPhone 12 models, Apple said. And it means iPhone portrait shots now work in night mode, a capability Google added with its Pixel 5.
Best iPhone camera hardware
The larger sensor in the iPhone 12 Pro Max, 47% larger than the main-camera sensor in the iPhone 11, allows for larger pixels. That engineering choice increases the cost of the sensor but lets it collect more light for better color, less noise and better low-light performance.
The Pro Max also stabilizes images by shifting the sensor rather than the lens elements, which, according to Apple, lets you take handheld photos with exposures as long as 2 seconds.
All iPhone 12 models also benefit from a wider f/1.6 aperture on the main camera for better light gathering. And the ultrawide camera now gets optical image stabilization.
The phones also feature better video capabilities, with 10-bit encoding designed to better capture color and brightness, as well as support for Dolby Vision HDR video technology. The iPhone 12 Pro models can shoot HDR at 60 frames per second, but the iPhone 12 and 12 Mini max out at 30fps.
Ordinary 4K and 1080p video records at up to 60fps, and 1080p slow motion reaches 240fps. Time-lapse videos are now stabilized.
What the iPhone doesn’t do
But Apple hasn’t gone as far as some in trying to grab photo headlines.
The iPhone 12 does not employ pixel binning, for example, a technique that pairs much higher-resolution sensors with photographic flexibility. Pixel binning combines data from groups of four or nine neighboring pixels to produce the color information for a single pixel in the finished photo. Or, if there is enough light when the photo is taken, the phone can skip binning and shoot at much higher resolution. That offers more detail, or more flexibility to crop to the important part of the scene.
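The four-pixel case described above reduces to averaging (or summing) each 2x2 block of sensor pixels into one output pixel; here is a toy sketch of that trade-off, illustrative rather than any vendor's actual implementation:

```python
# Hedged sketch of 2x2 pixel binning: combine each non-overlapping
# 2x2 block of sensor pixels into one output pixel. A 4-pixel bin
# gathers roughly 4x the light per output pixel at the cost of half
# the resolution in each dimension; with plenty of light, a phone
# can skip binning and keep the full-resolution readout instead.
import numpy as np

def bin_2x2(sensor: np.ndarray) -> np.ndarray:
    """Average non-overlapping 2x2 blocks of a (2H, 2W) array."""
    h, w = sensor.shape
    return sensor.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

raw = np.arange(16, dtype=float).reshape(4, 4)  # toy 4x4 "sensor"
binned = bin_2x2(raw)
print(binned.shape)  # (2, 2): a quarter of the pixels remain
```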
Another newer trick the iPhone skipped is a telephoto camera with much higher magnification. The Huawei P40 Pro Plus has an impressive 10x optical zoom, for example. That’s difficult, since the laws of physics make telephoto cameras physically large, but smartphone makers like Huawei and Samsung are tackling the problem with a mirror that bends the light path inside the phone.
However, Apple might have other tricks up its sleeve. In 2017, Apple acquired image sensor startup InVisage, whose QuantumFilm technology showed promise for making image sensors smaller or improving image quality.
And Apple has done a lot with computational photography alone, in particular a portrait mode that simulates the blurred-background “bokeh” of high-end cameras, plus lighting effects that can be applied to those portraits.