Apple's new image processing feature, Deep Fusion, will be part of the next iOS 13 beta. Deep Fusion is designed to deliver better photos, especially in medium-light conditions such as indoors or shortly after sunset, by building a composite image pixel by pixel on the processor.
Apple announced Deep Fusion during the keynote for the iPhone 11 and iPhone 11 Pro (Max). The feature is not yet included in iOS 13 or its updates so far and is scheduled to arrive in an update later this year. Developers and experimental users can already test Deep Fusion in the upcoming developer beta, which was originally supposed to be released during the course of yesterday. However, Apple decided to release it later; according to the US outlet TechCrunch, the previously specific date was changed to a vague "will come".
Deep Fusion only on the iPhone 11 (Pro)
Deep Fusion is a new image processing pipeline that does not replace Apple's Smart HDR but is designed to deliver better photos in different lighting conditions. Deep Fusion works exclusively on the new iPhone 11 and iPhone 11 Pro (Max) with Apple's A13 Bionic processor; older iPhones are left out.
Pixel by pixel to the final photo
The new image processing works as follows: with Deep Fusion, the iPhone camera first shoots a photo with a negative exposure value (EV), in order to extract as much sharpness as possible from this fast frame. This frame is followed by three shots at normal exposure and one long exposure. The underexposed frame and the combination of the four other frames are each merged into a 12-megapixel photo, giving a total of 24 megapixels of image information.
The final result, in turn, is a 12-megapixel photo generated from these 24 megapixels, whose image information has previously been processed pixel by pixel by four different neural networks from Apple that optimize areas such as sky, skin tones, clothing, and foliage. Apple expects this to produce better transitions between different skin tones, more detail in clothing, and a higher level of detail at the edges of moving objects.
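Apple has not published the actual algorithm, but the described idea of blending a sharp underexposed frame with a low-noise merge of the other frames can be sketched roughly. The following Python snippet is a minimal illustration under assumed simplifications (grayscale frames, a simple gradient-based detail weight); it is not Apple's implementation:

```python
import numpy as np

def fuse_frames(short_ev_frame, normal_frames, long_frame):
    """Illustrative sketch of pixel-by-pixel exposure fusion.

    short_ev_frame: underexposed frame, kept for its sharpness
    normal_frames:  list of frames at normal exposure
    long_frame:     long exposure, contributing shadow detail
    All frames are assumed to be 2-D float arrays of equal shape.
    """
    # Merge the normal and long exposures into one low-noise reference.
    reference = np.mean(np.stack(normal_frames + [long_frame]), axis=0)

    # Estimate local detail in the sharp frame via gradient magnitude.
    gy, gx = np.gradient(short_ev_frame.astype(float))
    detail = np.sqrt(gx ** 2 + gy ** 2)
    weight = detail / (detail.max() + 1e-8)  # per-pixel weight in [0, 1]

    # Pixel-by-pixel blend: detail from the sharp frame,
    # tonality and low noise from the merged reference.
    return weight * short_ev_frame + (1.0 - weight) * reference
```

In the real pipeline, neural networks rather than a hand-crafted gradient weight decide per pixel which source contributes, but the blending structure is analogous.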
Deep Fusion works automatically
Deep Fusion is not a new photo mode and cannot be activated or deactivated by the user. The entire process happens automatically in the background. Apple wants to take the process of capturing the best possible photo off the user's hands as much as possible and let the smartphone make the decisions.
Smart HDR, Deep Fusion and Night Mode
Deep Fusion is supposed to activate automatically on the main wide-angle camera starting at around 10 lux, i.e. above the range in which Night Mode automatically kicks in. For bright to medium-bright scenes, however, the familiar Smart HDR remains the automatic image processing of choice. The telephoto lens will mostly use Deep Fusion; Smart HDR is used there only in very bright scenes. The ultra-wide-angle camera always uses Smart HDR, as it supports neither Deep Fusion nor the new Night Mode.
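The per-lens behavior described above can be summarized as a small decision function. This is only a sketch of the article's description: the ~10 lux Night Mode boundary comes from the text, while the "very bright scene" threshold (here 600 lux) is a made-up placeholder, since Apple has not published exact values:

```python
from enum import Enum

class Mode(Enum):
    SMART_HDR = "Smart HDR"
    DEEP_FUSION = "Deep Fusion"
    NIGHT_MODE = "Night Mode"

NIGHT_MODE_LUX = 10    # approximate boundary mentioned in the article
BRIGHT_SCENE_LUX = 600  # hypothetical "very bright" threshold

def select_mode(lens: str, lux: float) -> Mode:
    """Illustrative mode selection per lens and scene brightness."""
    if lens == "ultra_wide":
        # Supports neither Deep Fusion nor Night Mode.
        return Mode.SMART_HDR
    if lens == "wide":  # main camera
        if lux < NIGHT_MODE_LUX:
            return Mode.NIGHT_MODE
        if lux < BRIGHT_SCENE_LUX:
            return Mode.DEEP_FUSION
        return Mode.SMART_HDR
    if lens == "telephoto":
        # Mostly Deep Fusion; Smart HDR only in very bright scenes.
        return Mode.SMART_HDR if lux >= BRIGHT_SCENE_LUX else Mode.DEEP_FUSION
    raise ValueError(f"unknown lens: {lens}")
```

For example, `select_mode("wide", 5)` yields Night Mode, while the same brightness on the ultra-wide camera falls back to Smart HDR.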
Update 28.10.2019, 19:14