Apple should continue to offer one or more photo sensors paired with a 7P lens system. TF International believes that Apple will not deliver any major innovation in this area before 2023.
Is Apple resting on its laurels? This year, the photo improvements of the iPhone 12 seem fairly marginal compared to the iPhone 11, which may come as a surprise: after all, Apple did implement a handful of innovations in the iPhone 12's camera system.
First, there is the arrival of a LiDAR sensor and of larger sensors on the iPhone 12 Pro and 12 Pro Max, but also a seven-element (7P) lens assembly over the wide-angle sensor. This new system is meant to improve light capture while reducing chromatic aberration.
Apple benefits from strong demand for 7P optics and competition between its subcontractors
While the six-element lens assembly of previous generations delivered an aperture of f/1.8, the new optics open up to f/1.6. However, in a note to investors, TF International estimates that Apple won't innovate much over the next two generations.
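The gain from that aperture change can be quantified: light throughput through a lens scales with the inverse square of the f-number, so a quick calculation (a standard optics relation, not a figure from the analyst note) shows what f/1.8 to f/1.6 buys:

```python
# Light gathered by a lens is proportional to 1 / f_number^2.
old_f = 1.8  # six-element lens on earlier iPhones
new_f = 1.6  # seven-element (7P) lens on the iPhone 12

gain = (old_f / new_f) ** 2
print(f"Relative light gain: {gain:.2f}x (~{(gain - 1) * 100:.0f}% more light)")
# Relative light gain: 1.27x (~27% more light)
```

In other words, the wider f/1.6 aperture lets in roughly a quarter more light than f/1.8, all else being equal.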
According to analyst Ming-Chi Kuo, we will have to wait until 2022 or even 2023 before seeing a major new photo innovation at Apple. It must be said that the suppliers, in particular Largan and Genius Electronics, intend to capitalize on their current production capacities.
Especially since Apple is not the only firm buying 7P optical assemblies from them; these should quickly spread among the Android competition. Suppliers are therefore waging a real price war, which should continue into 2021. The situation thus encourages Apple to stick with 7P lenses while waiting for another technology to become viable.
So the photo sensors should not change much until the 2023 iPhone, with the possible exception of a periscope telephoto lens, as many Android competitors have adopted in recent years, but also new computational photography techniques via iOS and these smartphones' future chips.
Apple should also take fuller advantage of the LiDAR sensor: for the time being, and according to several independent tests, this sensor, exclusive to the Pro models, is not used much yet. It only kicks in to help autofocus in the dark and to make augmented reality experiences more precise.
Also read: iPhone 12 – 21% more expensive to produce than iPhone 11
We are still waiting for this LiDAR sensor to deliver on its promises, in particular precise depth measurement to avoid subject cut-out errors, for example when applying the bokeh effect.
Source: Apple Insider