Finally, a real change to the photo sensor since the iPhone 6s!
New rumors suggest that Apple could increase the resolution of the main sensor on the iPhone 14 Pro and Pro Max to 48 megapixels. The resolution of iPhone sensors has been stuck at 12 megapixels since the iPhone 6s, while the competition opts for 50, 64 or even 108-megapixel sensors. But will this new sensor really improve photo quality on these iPhones? Not necessarily.
In recent years, the resolution of smartphone image sensors has risen sharply, now exceeding 100 megapixels. We have had the opportunity to test several such phones, like the Samsung Galaxy S21 Ultra 5G, the Xiaomi Mi 11i or the Honor 50. This year, the first smartphones with 200-megapixel sensors could arrive, and the Xiaomi Note 11 could be the first smartphone to take advantage of such a component.
While sensor resolution climbs ever higher at most phone brands, it remains very low at one stubborn American holdout: Apple. Since the iPhone 6s, the Cupertino company has stayed true to a 12-megapixel resolution. Admittedly, the nature of the sensors has changed since then, as has the quality of the lenses, stabilization and autofocus. But Apple has refused to increase the resolution of the main sensor so as not to unbalance it against the secondary sensors.
Apple will choose a 48-megapixel sensor to equip the iPhone 14 Pro and Pro Max
But this could change. Rumors say that some of the models expected in September 2022 may feature 48-megapixel sensors. The information was confirmed today by TrendForce, which specifies that the change will only concern the Pro range, namely the iPhone 14 Pro and the iPhone 14 Pro Max. Once again, Apple would give the more expensive Pro line a photographic advantage. It was the first to benefit from a secondary sensor, optical zoom, optical stabilization, or even LiDAR-assisted autofocus.
If this leak is confirmed, will it improve the images? In fact, the leak indicates that the sensor will be 'Quad Pixel' compatible, meaning that four adjacent pixels are combined to form one larger pixel. The default image resolution would therefore remain at 12 megapixels, i.e. the current resolution, preserving the balance mentioned above.
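To make the 'Quad Pixel' idea concrete, here is a minimal sketch of 2x2 pixel binning in Python with NumPy. The sensor dimensions and the simple summing are hypothetical illustrations of the general technique, not Apple's actual image pipeline.

```python
import numpy as np

def bin_quad_pixels(raw: np.ndarray) -> np.ndarray:
    """Combine each 2x2 block of photosites into one larger 'quad pixel'.

    raw: a (H, W) array of raw pixel values, with H and W even.
    Returns an array of shape (H // 2, W // 2) where each output pixel
    is the sum of the four input pixels it covers: more collected light
    per output pixel, at a quarter of the resolution.
    """
    h, w = raw.shape
    blocks = raw.reshape(h // 2, 2, w // 2, 2)
    return blocks.sum(axis=(1, 3))

# Hypothetical 48-megapixel sensor: 8000 x 6000 photosites.
sensor = np.random.randint(0, 1024, size=(6000, 8000), dtype=np.int64)
binned = bin_quad_pixels(sensor)
print(binned.shape)  # (3000, 4000) -> 12 million pixels, as in today's iPhones
```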
So why the change? There are two hypotheses. Either Apple chooses a physically larger sensor, which would increase the pixel size: the 'quad pixel' of the new sensor would be larger than a single pixel of the current component, and more light captured per pixel generally means better quality. Or Apple keeps the sensor the same size, in which case the move would be motivated solely by the marketing appeal of a bigger number, which would be rather disappointing.
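As a rough back-of-the-envelope illustration of the two hypotheses (all sensor dimensions below are made up for the example, not leaked specs), the pixel-pitch arithmetic could look like this:

```python
import math

def binned_pixel_pitch(sensor_width_mm: float, sensor_height_mm: float,
                       megapixels: float, bin_factor: int = 2) -> float:
    """Return the effective pixel pitch in micrometres after binning.

    Assumes square pixels spread evenly over the sensor area.
    """
    pixel_area_mm2 = (sensor_width_mm * sensor_height_mm) / (megapixels * 1e6)
    return math.sqrt(pixel_area_mm2) * 1000 * bin_factor

# Hypothesis 2: same sensor area, now divided into 48 MP and binned 2x2.
same_size_sensor = binned_pixel_pitch(7.0, 5.2, 48)              # ~1.74 µm
# Hypothesis 1: a physically larger 48 MP sensor, binned 2x2.
larger_sensor    = binned_pixel_pitch(9.8, 7.3, 48)              # ~2.44 µm
# Today's situation: 12 MP on the smaller area, no binning.
current_12mp     = binned_pixel_pitch(7.0, 5.2, 12, bin_factor=1)  # ~1.74 µm
print(same_size_sensor, larger_sensor, current_12mp)
```

With the same sensor area, the binned 'quad pixel' ends up roughly the same size as today's 12-megapixel pixel; only a physically larger sensor yields a larger effective pixel and therefore more light.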
Source: TrendForce