The iPhone 11 is a wonderful surprise. It brings more advanced technology – namely in its camera capabilities and the processing power under the hood – and yet offers it for less than the iPhone XR cost in 2018. It combines a large 6.1-inch display with a premium-feeling body, and comes in an array of colors too. The most eye-catching feature of the new iPhone is its imaging capability: with two sensors on the rear, you can now take wider-angle snaps alongside the 'normal' main images. These sensors are 12MP each, and are raised from the rear of the phone in a square glass enclosure – which we're not enamored with visually. Night mode is the most impressive part of the iPhone 11's imaging quality, bringing brightness and clarity to impossibly dark scenes, and Portrait mode, which defocuses the background, is improved on the new iPhone too.
This isn't something we normally do, but we're going to get right to the simple fact that the iPhone 11 camera is easily the standout feature on this handset. Apple has doubled the number of lenses on offer here: where the iPhone XR had one porthole-like sensor on the rear, things are much more grandiose for 2019, with a whole window on the rear containing two 12MP sensors.

Apple's clearly going for an iconic and uniform look with the iPhone 11 range, with the Pro and Pro Max packing the same square lens bump on the rear. It takes some getting used to, almost to the point of being visually obtrusive, with your fingers playing across it far more when you're holding the iPhone in landscape. But it actually isn't as obtrusive as the bump on 2018's iPhones, thanks to being 'layered' up from the back – the glass housing around the lenses is raised a small amount from the rear glass, and the sensors themselves a little more.

It's a wide-angle array – that's to say you get the 'regular' camera you'll find on every phone, plus an ultra-wide-angle lens that brings more of the scene you're shooting into the frame. It's a setup that's pretty easy to use: a toggle at the bottom of the camera interface enables you to move between focal lengths, and you can hold this down to activate a scroll wheel with which you can zoom in and out more smoothly. There's a slight judder when transitioning between the two cameras, and if you look closely you can also see a difference in the light sensitivity of the two sensors in the preview.
One thing that's supposed to be simple is fixing your too-narrow photos when you could have been using the ultra-wide lens. We saw in a demo how the iPhone 11 could take a shot using the standard lens while capturing a wider shot at the same time, so you can change the composition after the snap – but during our testing we couldn't work out how to access that wider image. We activated all the right settings, but widening a picture after taking it isn't going to be easy for most users.

Side note: iOS 13 brings a feature we've been after for a long time: the ability to change the aspect ratio when you're snapping. You can choose square or 16:9 alongside the standard 4:3 image. However, no matter the ratio chosen, the phone still stores a 4:3 image – but let's talk about something that does work well: the low-light performance.

Historically, Apple's iPhone cameras have never been great here, but with its improved AI smarts the iPhone 11 is capable of rendering some amazing night snaps. Whether you're in a sort-of-dark situation, or pointing a tripod-mounted phone at the night sky, there's a setting that enables you to make what would normally be a badly-lit photo look as clear as… well, not quite as clear as day, but wonderfully bright.

This works by the iPhone 11 automatically telling you to hold the handset steady for 2-5 seconds so that the shutter can stay open for longer; the phone then captures a number of photos at different exposures and sharpness levels, before merging the data to produce the best photo possible. If you've braced or mounted the phone securely, the capture time can be extended to up to 30 seconds – this is only really necessary if you're going to be photographing the night sky, and for general night shots we saw very little difference in brightness between photos taken over 5 seconds and 30 seconds.
The results are startling, elevating Apple to the level of Huawei, Samsung and Google when it comes to taking low-light and night photos – and in some ways enabling it to surpass those rivals. Night mode can make photos shot at 1am look as if they were taken in late afternoon, and if you can get your subjects to remain still, you'll take great snaps. However, try to photograph a scene that includes motion – people dancing at a concert, for instance – and it's a world of blur. You'll need to manually turn off night mode, and that's a bit of a nuisance when you're trying to get a quick snap.

Talking of speed, there's a nice new feature in iOS 13 whereby pressing and holding the shutter button takes a quick, Instagram-style video instead of burst-mode photos (you can still shoot a burst by sliding your finger left; slide right instead and recording is locked, allowing you to take your finger off the shutter button to adjust exposure and zoom). It's a feature that's going to appeal to those who want to share video clips to social media with ease. You don't get the same low-light capabilities for video (more on that in a moment), but it's smooth and defaults to the settings you've already chosen, so you can be shooting high-end 4K footage in a matter of seconds.

We did notice on occasion that the iPhone 11 would show a black screen when we fired up the camera, meaning we needed to flick into another mode (like video or slow-mo) to jolt the viewfinder into showing something. It seems to be a bug in the camera app at launch; we'll keep an eye on it, as it's likely something that will be fixed soon via an update.
There was one feature Apple made a huge deal of at the iPhone launch event, and it could be the thing that propels the iPhone to the head of our list of the best camera phones, or at least gets it very close: Deep Fusion. This feature takes nine photos before you press the shutter button, goes through the information in each, and then decides, on a pixel-by-pixel basis, how best to light and optimize the snap when you do take it. It was called "mad science" on stage – and if it works, we'll be happy to go along with Apple's description. We say 'could' because Deep Fusion isn't actually available yet – curiously, Apple is adding the feature later this year, and it won't even show up in the camera app… the pics will just get better, according to Apple. Why wasn't it available at launch? We're in the dark on that one, given that the necessary power all seems to be in the iPhone already. Either way, we're looking forward to re-reviewing the iPhone 11's camera when it lands.
With the addition of the second camera, Apple has made Portrait mode on the iPhone 11 far better than it was on last year's iPhone XR – where previously software alone helped the iPhone work out which part of a scene was foreground and which was background, the extra sensor now provides physical depth information to help. It's not perfect – where a scene is divided into a foreground subject and a background, it sometimes leaves some blur around the object that's supposed to be in focus (especially with hair) – but it can take some decent snaps.