Understanding iPhone Cameras

 
Understanding iPhone and iPad camera systems: Single camera, Dual Camera, and Triple Camera on iPhone 11 Pro.


Introduction to iPhone Cameras

In 2007, Apple announced the first iPhone with a single 2.0-megapixel camera. Over the next few generations, the megapixel count climbed while sensor quality improved.

The next benchmark moment came with the reveal of the iPhone 4 - a 5-megapixel camera with a back-side-illuminated sensor, plus an additional 0.3-megapixel, 480p front selfie camera. By this point the iPhone had become the most popular digital camera, surpassing dedicated digital cameras on the market, and it could record high-definition video. iPads, now past their first generation, also began to ship with cameras.

The megapixel count kept climbing over the next few generations, along with the quality of the sensors and lens elements. In 2016, with the iPhone 7 Plus, iOS gained its first multi-camera rear system, and the megapixel count had risen to 12. The second camera provides 2x optical zoom; however, that is the least interesting part of this system.

The front selfie camera has now evolved into the TrueDepth camera system, which is capable of depth selfies using additional sensors.

Multiple Cameras

iPhone 11 Pro Max Triple-camera system

Though many refer to them as multiple lenses, these are actual, separate, physical cameras with individual sensors. Current high-end iPhones contain a triple-camera system with 0.5x, 1x, and 2x optical zoom. In 35mm equivalents, the cameras have focal lengths of 13mm (ultra-wide), 26mm (wide-angle), and 52mm (telephoto), respectively. What do all these choices mean, other than giving the device an exciting look? The ability to capture high-quality, native, 12-megapixel images at each point of view (POV).

Many of the other iPhones and iPads still contain a dual-camera system, but the camera combination varies. For example, the iPhone X has wide-angle and telephoto cameras, while the iPhone 11 and the 2020 iPad Pro pair ultra-wide and wide-angle cameras.
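For developers, each of these physical cameras shows up as its own AVCaptureDevice. A minimal sketch of enumerating them with AVFoundation (assuming iOS 13 or later for the ultra-wide device type) might look like this:

```swift
import AVFoundation

// Discover the individual physical rear cameras on this device.
// The session returns only the cameras the hardware actually has.
let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.builtInUltraWideCamera, .builtInWideAngleCamera, .builtInTelephotoCamera],
    mediaType: .video,
    position: .back
)

for camera in discovery.devices {
    print(camera.localizedName)   // e.g. "Back Ultra Wide Camera"
}
```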



Fusion Cameras

Apple wanted the multiple cameras to do more: to be intelligent and act as one, transitioning smoothly between each other in different situations. With the introduction of the dual-camera system, the Dual camera was born, combining the wide-angle and telephoto cameras into a single fusion camera.

iOS Camera app with Triple camera

The native iOS Camera app uses this as the default: users can zoom seamlessly while the Dual camera switches physical cameras behind the scenes. In low-light situations, even at 2x zoom, the Dual camera may capture the image with the wide-angle camera, since its superior sensor and stabilization produce a cleaner, sharper image. With the introduction of the triple-camera system, there are three fusion cameras in total: Dual, Dual Wide, and Triple.
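In AVFoundation these fusion cameras appear as virtual devices (builtInDualCamera, builtInDualWideCamera, and builtInTripleCamera). A rough sketch of opting into the Triple camera and letting it hand off between its constituent cameras as the zoom factor changes (iOS 13 or later assumed):

```swift
import AVFoundation

// Ask for the Triple fusion camera first, then fall back to simpler devices.
let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.builtInTripleCamera, .builtInDualCamera, .builtInDualWideCamera, .builtInWideAngleCamera],
    mediaType: .video,
    position: .back
)

if let camera = discovery.devices.first {
    // Zoom factors at which the virtual camera switches to the next physical camera.
    print(camera.virtualDeviceSwitchOverVideoZoomFactors)

    // Zooming on a fusion camera may simply hand off to a longer physical camera.
    do {
        try camera.lockForConfiguration()
        camera.videoZoomFactor = 2.0
        camera.unlockForConfiguration()
    } catch {
        print("Could not lock camera for configuration: \(error)")
    }
}
```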

iPhone 11 Pro Physical and Fusion Cameras

So, fusion cameras are great, but what is the catch? They do not support manual controls such as manual focus, and RAW capture is not possible. Because of these limitations, manual camera apps avoid fusion cameras as much as possible.
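One way a manual camera app might work around this (a sketch, not necessarily Camera M's actual implementation) is to unwrap each virtual device into its constituent physical cameras:

```swift
import AVFoundation

// Replace any fusion (virtual) camera with the physical cameras behind it,
// so manual focus, exposure, and RAW remain available where the hardware allows.
func physicalCameras(from devices: [AVCaptureDevice]) -> [AVCaptureDevice] {
    devices.flatMap { device -> [AVCaptureDevice] in
        device.isVirtualDevice ? device.constituentDevices : [device]
    }
}
```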


Not All Cameras Are Equal

Though these individual cameras have impressive capabilities, they are not equal. For example, the Ultra Wide camera cannot change its focus distance and does not offer RAW output, much like the front selfie camera. The Wide Angle camera, which is the primary and default camera, tends to have the best sensor and optics, with a larger aperture.
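Because capabilities vary per camera, apps generally have to query each device rather than assume. A small sketch of such a check (RAW availability is reported by AVCapturePhotoOutput rather than by the device itself, so only focus is shown here):

```swift
import AVFoundation

// Print a few per-camera capabilities; the Ultra Wide and front cameras
// typically report no support for custom lens positions (manual focus).
func describeFocusSupport(of camera: AVCaptureDevice) {
    print(camera.localizedName)
    print("  autofocus:", camera.isFocusModeSupported(.autoFocus))
    print("  manual lens position:", camera.isLockingFocusWithCustomLensPositionSupported)
}
```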

Introduction to Depth

The introduction of multiple cameras gave birth to depth photo capture, which embeds depth information derived from the disparity between two cameras' views. This disparity information is used to create a depth map, which in turn drives depth blur for a look comparable to a DSLR with a long lens and a wide-open aperture.
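At the API level, apps opt into this by enabling depth data delivery on the photo output. A minimal sketch with AVFoundation (assuming the session's active camera, such as a Dual or TrueDepth camera, can produce depth):

```swift
import AVFoundation

// photoOutput must already be attached to a configured AVCaptureSession
// whose input is a depth-capable camera.
let photoOutput = AVCapturePhotoOutput()

if photoOutput.isDepthDataDeliverySupported {
    photoOutput.isDepthDataDeliveryEnabled = true
}

let settings = AVCapturePhotoSettings()
settings.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliveryEnabled
// The delivered AVCapturePhoto then carries an AVDepthData depth/disparity map.
```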

TrueDepth Camera

With the iPhone X, the TrueDepth camera was born, primarily for the Face ID feature. Its additional sensors project and read an infrared dot pattern to measure depth directly - and depth selfies were born!
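The TrueDepth camera is exposed to apps as its own device type; a short sketch of selecting it and checking which of its formats can deliver depth:

```swift
import AVFoundation

// Select the front TrueDepth camera (iPhone X and later).
if let trueDepth = AVCaptureDevice.default(.builtInTrueDepthCamera,
                                           for: .video,
                                           position: .front) {
    // Formats on this camera that can deliver depth data alongside video.
    let depthFormats = trueDepth.activeFormat.supportedDepthDataFormats
    print("TrueDepth depth formats: \(depthFormats.count)")
}
```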

Incredible Processing Power

Even before the introduction of the Apple Neural Engine, iOS cameras had incredible processing power. With the Image Signal Processor (ISP) built directly into the main processor, iOS devices are capable of mind-blowing image processing. A sample of the processing handled by the ISP:

  • Auto exposure

  • Auto white balance

  • Auto focus

  • Noise reduction

  • Local tone mapping

  • Highlight details

  • Image fusion

  • Face detection

  • Facial landmarking

  • Segmentation mask

  • Semantic rendering

With the iPhone 11 Pro, Night mode and Deep Fusion were introduced, showcasing the ever-improving signal processing pipeline.
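Apps do not drive Deep Fusion directly; they hint at how much processing they want through photo quality prioritization, and the system decides when to apply it. A minimal sketch (iOS 13 or later assumed):

```swift
import AVFoundation

// Allow the heaviest processing tier; on supported hardware this is where
// the system may apply Deep Fusion behind the scenes.
let photoOutput = AVCapturePhotoOutput()
photoOutput.maxPhotoQualityPrioritization = .quality

let settings = AVCapturePhotoSettings()
settings.photoQualityPrioritization = .quality
// Pass `settings` to capturePhoto(with:delegate:) as usual.
```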

 