iPhone 7 depth effect. The whole truth about the iPhone XR camera and Apple portrait modes. How to use the Depth feature

The new iPhone XR once again made Apple fans and others marvel at the camera's capabilities. We've translated an article by Ben Sandofsky, blog author and developer of the Halide app, in which he talks about how Apple's dual cameras work, how they create blur, and how it works with a single camera on the iPhone XR. Read the previous article about the iPhone XS camera.

With the introduction of the iPhone XR, every phone in Apple's lineup now supports depth capture. But the XR is unique: it's the first iPhone that can do this with a single lens. We began testing and optimizing the Halide application for XR and found both advantages and disadvantages.

In this post, we'll talk about three different ways to capture iPhone depth data, what makes the iPhone XR so special, and show you the new Halide 1.11 update that will allow you to do things on the iPhone XR that the regular camera app can't.

Depth Capture Method #1: Disparity between two cameras

Humans perceive depth using two eyes. They may be just inches apart, but our brain detects subtle differences between the two images: the greater the difference, or disparity, the closer the object.

The iPhone 7 Plus introduced a dual-camera system that captures depth in a similar way. By taking two photographs at the same time, each from a slightly different position, the software can build a depth map.
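To make the geometry concrete, here is a minimal sketch of the idea in Swift. The function name and the focal length and baseline numbers are purely illustrative, not Apple's actual values; the real pipeline works on whole disparity maps after calibrating and rectifying both lenses.

```swift
import Foundation

/// Simplified pinhole-stereo model: for two cameras separated by `baseline`
/// metres with a focal length of `focalLength` (in pixels), an object that
/// shifts by `disparity` pixels between the two images sits at this distance.
func estimatedDepth(focalLength: Double, baseline: Double, disparity: Double) -> Double? {
    guard disparity > 0 else { return nil }   // zero disparity ≈ effectively at infinity
    return focalLength * baseline / disparity
}

// A nearby object shifts a lot between the two views, a distant one barely moves:
let near = estimatedDepth(focalLength: 2800, baseline: 0.01, disparity: 40)  // ≈ 0.7 m
let far  = estimatedDepth(focalLength: 2800, baseline: 0.01, disparity: 4)   // ≈ 7 m
```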

When comparing the two images, the software has to do a lot of guesswork, which introduces noise and can make the result quite rough. A lot of processing power goes into filtering that data, smoothing edges properly, and filling in the "holes."

This requires a lot of computation and wasn't feasible until the iPhone X, which can process depth maps at 30 frames per second. It also takes up a lot of memory: for a while, most of our application crashes happened because the system was using too much memory on depth processing.

Disadvantages of a dual camera


The first limitation, and an odd one, is that you can only generate depth for the part of the scene where the two images overlap. In other words, if you have a wide-angle lens and a telephoto lens, you can only create depth data for the telephoto lens's field of view.

Another limitation is that you cannot use manual controls, because the system must perfectly synchronize the frame output and exposure of each camera. Trying to manage those settings by hand would be like trying to drive two cars at once.

And finally, while the color data of the 2D image may be 12 megapixels, the disparity map is only about half a megapixel. Used as-is in portrait mode, that gives you blurry edges that ruin the whole effect. You can refine the edges using contrast in the 2D image, but that is still not enough for fine details such as hair.
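One way to see what bridging the gap between a roughly half-megapixel depth map and a 12-megapixel photo can look like is Core Image's CIEdgePreserveUpsampleFilter, which upsamples a small image using the full-resolution photo as a guide. This is a hedged sketch of that approach, not necessarily what Apple's camera pipeline does; the helper name is made up.

```swift
import CoreImage

/// Upscale a low-resolution disparity/depth map to the photo's resolution while
/// keeping edges aligned with the colour image. The filter uses the full-size
/// photo as a guide, which helps with hard edges but, as noted above, still
/// struggles with fine detail such as hair.
func upsampledDepth(smallDepth: CIImage, guidePhoto: CIImage) -> CIImage? {
    let filter = CIFilter(name: "CIEdgePreserveUpsampleFilter")
    filter?.setValue(guidePhoto, forKey: kCIInputImageKey)   // 12 MP colour image
    filter?.setValue(smallDepth, forKey: "inputSmallImage")  // ~0.5 MP disparity map
    return filter?.outputImage
}
```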

Depth Capture Method #2: TrueDepth Sensor


With the iPhone X, Apple introduced the TrueDepth camera. Instead of measuring disparity, it uses infrared light to project more than 30,000 dots. However, depth data is not based entirely on those infrared points. Sit in a black-and-white room and watch how TrueDepth behaves:


Obviously, the system uses color data as part of its calculations.

Disadvantages of TrueDepth

One of the disadvantages of TrueDepth is its sensitivity to infrared interference. This means that bright sunlight affects the quality.

Why not add a TrueDepth sensor on the back of the XR? I think there are three simple reasons for this: cost, range and complexity.

People are willing to pay extra for Face ID, which requires an IR sensor, or for a telephoto lens, but they aren't willing to pay extra to improve the depth effect of photos.

Add to this the fact that an infrared sensor is much worse at sensing depth at greater distances: the farther away people are, the less accurate the depth map becomes. Now you can understand why Apple is rather hesitant to use TrueDepth for the rear camera.

Depth Capture Method #3: Focus Pixels and PEM

At the iPhone XR presentation, Apple said:

“Our team was able to combine hardware and software to create a depth segmentation map using focus pixels and neural network software so you can create portrait photos on the all-new iPhone XR.”

Apple's marketing department coined the term "Focus Pixels." The real term is Dual Pixel Auto Focus (DPAF), a common feature in full-fledged cameras and smartphones today, which first appeared on the iPhone with the iPhone 6.

DPAF was designed for very fast focusing, which matters when shooting video of moving subjects. However, the way it works also lets you use it to calculate disparities, from which a depth map can be built.

Using it for depth capture is a fairly new idea. The Google Pixel 2 was the first phone to capture depth with a single camera using DPAF, and Google has written about it in more detail.

In a DPAF system, each pixel on the sensor is made up of two sub-pixels, each with its own tiny lens. The hardware determines focus somewhat like a rangefinder camera: if the two sub-pixels match, the pixel is in focus. Imagine the disparity diagram we showed earlier, but at an absolutely miniature scale.


If you captured two separate images, one from each set of sub-pixels, you would get two images taken about a millimeter apart. It turns out that even this tiny disparity is enough to produce depth information, although it comes out very jagged and rough.
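To illustrate what "computing disparity" means at this miniature scale, here is a toy block-matching sketch in Swift. It is not the actual DPAF or Pixel pipeline, just the basic idea: slide a small patch from the left view across the right view and keep the shift with the smallest difference.

```swift
/// Toy block matching on two 1-D rows of brightness values (left/right
/// sub-pixel views). For each position in the left row, find the horizontal
/// shift in the right row whose patch differs least; that shift is the
/// disparity. Real DPAF disparities are tiny and noisy, which is why Google
/// averages burst frames before running its stereo algorithm.
func disparityRow(left: [Double], right: [Double], patch: Int = 3, maxShift: Int = 4) -> [Int] {
    var result = [Int]()
    for x in 0..<max(0, left.count - patch) {
        var best = (shift: 0, cost: Double.infinity)
        for shift in 0...maxShift where x + shift + patch <= right.count {
            var cost = 0.0
            for i in 0..<patch {
                let diff = left[x + i] - right[x + shift + i]
                cost += diff * diff  // sum of squared differences
            }
            if cost < best.cost { best = (shift: shift, cost: cost) }
        }
        result.append(best.shift)  // larger shift ⇒ closer object
    }
    return result
}
```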

As I mentioned, this technology is also used by Google, and the Pixel team had to do a ton of work to make it usable:

“Another detail: Because the left and right sides captured by the Pixel 2 camera are so close to each other, the depth information we receive is inaccurate, especially in low light, due to high noise in the images. To reduce this noise and improve depth accuracy, we capture burst images on the left and right sides, then align and average them before applying our stereo algorithm."

The depth map resolution on the iPhone XR is approximately 0.12 megapixels: that's about 1/4 the resolution of a dual-camera system. This is really small, and that is why the best portraits produced by the iPhone XR are largely due to the use of a neural network.

Portrait Effects Matte


This year, Apple introduced an important feature that greatly improves the quality of portrait photos, which they call the "Portrait Effects Matte," or PEM. It uses machine learning to create a highly detailed matte that is ideal for adding background effects. At the moment, the model can only detect people.

With PEM, Apple feeds the 2D color image and the depth map into a machine learning system, and the software predicts what a high-resolution matte should look like. It identifies which parts of the image belong to a person, and pays extra attention to individual hairs, glasses, and other details that often get lost when a portrait effect is applied.
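For developers, iOS 12 exposes this matte as auxiliary data in the photo file. Below is a rough sketch of reading it with Image I/O and AVFoundation; the helper name is hypothetical, and it assumes the photo was shot in Portrait mode on a device that produces a PEM.

```swift
import AVFoundation
import ImageIO
import CoreImage

/// Read the Portrait Effects Matte that iOS embeds as auxiliary data in a
/// portrait photo. Assumes `url` points to a HEIC/JPEG shot in Portrait mode
/// on iOS 12 or later; returns nil if no matte is present.
func portraitMatte(at url: URL) -> CIImage? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let info = CGImageSourceCopyAuxiliaryDataInfoAtIndex(
              source, 0, kCGImageAuxiliaryDataTypePortraitEffectsMatte
          ) as? [AnyHashable: Any],
          let matte = try? AVPortraitEffectsMatte(fromDictionaryRepresentation: info)
    else { return nil }
    // The matte itself: a high-resolution, single-channel mask of the person.
    return CIImage(cvPixelBuffer: matte.mattingImage)
}
```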

Photos in portrait mode always looked good. However, PEM makes them look great. This is a pretty powerful effect that makes the iPhone XR's very low resolution source data look really good.

This is why the camera app on the iPhone XR won't activate portrait mode until it "sees" a person. In iOS 12.1, PEM can only recognize people, but this may change in the future with a software update.

Without PEM the depth data is a bit rough. However, when combined with PEM, the XR produces great photos.

So does the iPhone XR take better portrait photos?


Yes and no. The iPhone XR seems to have two advantages over the iPhone XS: it can take wider-angle photos with depth, and because the wide-angle lens collects more light, photos turn out better in low light and have less noise.

Remember how we said that the XR's portrait mode only works on people? When it comes to faces, you never want to photograph someone up close with a wide-angle lens, because it distorts the face in an unflattering way. The photos above show this perfectly (the iPhone XR has a focal length equivalent to 26mm).

This means that portraits on the iPhone XR are best taken from the waist up. If you want a headshot like on the iPhone XS, you'll have to crop the photo, which will result in a loss of resolution. A wide-angle lens is not always a plus.


However, the XR's wide-angle lens captures significantly more light than the XS's telephoto lens. This means you'll see less noise reduction (yes, that nonexistent "beauty filter" people thought they saw) and generally more detail. Additionally, the sensor behind the wide-angle lens on the XR and XS is approximately 30% larger than the one behind the telephoto lens, allowing it to capture even more light and detail.

So, yes: sometimes the iPhone XR will take better-looking portraits than any other iPhone, including the XS and XS Max.

But otherwise, the XS will probably give you a better result. A more accurate and clear depth map, combined with a focal length that's better suited for portraits, means people will look better even if the image is a little darker. It can also blur the background of almost anything, not just people.

As for why Apple won't let you use portrait mode on the iPhone XS with its exact same wide-angle camera, we've got some ideas.

Most likely, Apple faces a serious interface conundrum: how do you explain to people why one camera can suddenly take portraits of more than just people, while the other cannot? Still, adding more object types to the PEM model suggests that we may eventually get portraits from the wide-angle camera on dual-camera iPhones as well.

Halide 1.11 brings portrait effects to iPhone XR


We're excited to "unlock" powerful phone features that people didn't have access to before. Now we're doing it again: Halide 1.11 will let you take portrait photos of almost any subject, not just people.

We do this by capturing a focus pixel disparity map and running the image through software blur. When you open Halide on your iPhone XR, simply tap Depth to turn on depth capture. Any photo you take will have a depth map, and if there is enough data to determine the foreground and background, the image will display beautiful bokeh, just like footage on the iPhone XS.
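For reference, the standard AVFoundation route to depth capture looks roughly like the sketch below: enable depth delivery on the photo output, request it per shot, and read `photo.depthData` in the capture delegate. This is not Halide's actual code, and whether a given device and iOS version supports depth delivery on the rear wide-angle camera is exactly what the runtime `isDepthDataDeliverySupported` check answers.

```swift
import AVFoundation

/// Minimal AVFoundation configuration for capturing a photo with depth data on
/// a single-lens device. Error handling and the AVCapturePhotoCaptureDelegate
/// that receives `photo.depthData` are omitted for brevity.
func makeDepthCaptureSession() -> (AVCaptureSession, AVCapturePhotoOutput)? {
    let session = AVCaptureSession()
    session.sessionPreset = .photo

    guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                               for: .video, position: .back),
          let input = try? AVCaptureDeviceInput(device: camera),
          session.canAddInput(input)
    else { return nil }
    session.addInput(input)

    let output = AVCapturePhotoOutput()
    guard session.canAddOutput(output) else { return nil }
    session.addOutput(output)

    // Depth delivery must be enabled on the output before each capture request.
    guard output.isDepthDataDeliverySupported else { return nil }
    output.isDepthDataDeliveryEnabled = true
    return (session, output)
}

// Per shot: ask for depth in the photo settings, then capture.
// let settings = AVCapturePhotoSettings()
// settings.isDepthDataDeliveryEnabled = true
// output.capturePhoto(with: settings, delegate: photoDelegate)
```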

You'll notice that turning on Depth Capture mode doesn't allow you to preview the portrait blur effect in real time or even automatically detect people. Unfortunately, the iPhone XR doesn't allow this. You'll have to look at the photo a little later after processing, just like with the Google Pixel.

Is it perfect? No. As we mentioned, the XR's depth data is lower resolution than that of the dual-camera iPhones. But in many situations it is enough to get great photos.




Want to try it? Halide 1.11 has been submitted to Apple and will be released as soon as it passes App Store review (editor's note: it has already been approved!).

With that, the iPhone XR finally loses one small flaw: the inability to take a great photo of your beautiful cat, dog, or anything else. We hope you enjoy it!

Updating to iOS 10.1 will allow iPhone 7 Plus smartphones to take full advantage of their dual-lens camera to create a sense of greater depth when shooting portraits.

The camera's single standout feature at launch was 2x optical zoom, which lets you take closer shots than ever before on an iPhone. And while the official software update isn't scheduled until late October, a public beta is already available. To get iOS 10.1 now, you must join Apple's beta program, which you can sign up for at this link.

In the new Portrait mode, the camera uses the 12-megapixel wide-angle lens, the same as on the regular iPhone 7, together with the 12-megapixel telephoto lens to distinguish the background from the foreground. This is what keeps a subject sharp while blurring the background, achieving the bokeh effect usually associated with much more expensive DSLR cameras.

The iPhone 7 Plus isn't the first phone to add this feature, but it's something new for longtime iPhone owners that's been promised for a long time.

What can you photograph?

Portrait mode isn't just for taking pictures of people. During our testing, we used it on pets, plants, and inanimate objects using the same basic principles.

Getting started with portrait mode.

Once you've updated to the iOS 10.1 beta, open the Camera app and swipe through the mode wheel until you see "Portrait." Frame your subject and pay attention to the guidance on screen. Because the telephoto lens is used for the foreground, your subject will appear closer than with the regular camera, so you may have to step back.

Once you've found the best angle, a yellow "Depth Effect" badge will appear. Now lock focus on your subject, adjust the brightness, and press the shutter button when you're ready. Keep your phone steady until it finishes shooting; processing in this mode takes a little longer.

Bokeh Basics.

The effect works in almost any scenario, but if you want a more dramatic blur, you need more depth in the scene, meaning more space between the subject and the background. The closer you can get to the subject, the stronger the blur. And finally, pay attention to the lighting: make sure the subject is well lit, but avoid backlighting.

Final result.

Now go to your Photos app to check your masterpiece. As with HDR mode, the camera saves a copy both with and without the depth effect, so you can see the difference as you scroll. If you're having trouble figuring out which is which, tap the image and look for the "Depth Effect" label in the top left corner.

The image will retain the effect even if you share it with other devices, and not just other iPhone 7 Pluses.

More possibilities for the dual lens in the future?

Future software updates could add even more features to the iPhone 7 Plus, such as 3D displays and augmented reality capabilities similar to those found in Google's Project Tango. But we may have to wait for that until closer to the iPhone's 10th anniversary next year.

There is a new feature called "Depth": after you take a portrait photo, you can adjust the level of background blur. Apple promotes it as one of the best features of the iPhone XS. However, you can also get it on the iPhone X, iPhone 8 Plus, and even iPhone 7 Plus using a third-party app.

The iPhone XS's new sensor blurs different parts of a photo at different levels. The effect is similar to that of DSLR cameras. Smartphones like the Pixel 2 work differently.

How to adjust background blur on iPhone X, iPhone 8 Plus, and iPhone 7 Plus

The Focos app has already made it into our app picks, and there is a reason for this. It fixes one of the biggest shortcomings of portrait mode: on older iPhone models, it very often blurred parts of the photo that should have stayed sharp.

Portrait mode does not always recognize ears, glasses, and hairstyle edges. Focos allows you to not only adjust the blur level, but also adjust the focal point. You can open a photo and change its focus, as well as adjust the depth effect. The application does its job very well. Moreover, basic functions are available for free.

Here's how it works. Once you install the app, simply open it. You will see a camera view, and below it a grid of your existing photos. Select a photo to open it.

There will be a slider under the photo. Move it to the left to reduce the blur, and to the right to increase it. That's all there is to it! The app uses the photo's own depth data, so the edges of the subject will not get blurred.
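Conceptually, an adjustable blur like this can be approximated with Core Image by using the disparity map as a mask and tying the blur radius to the slider. The sketch below uses CIMaskedVariableBlur; it is only an illustration of the idea, not Focos's actual algorithm, and in practice the mask would also need to be scaled to match the photo.

```swift
import CoreImage

/// Approximate an adjustable background blur: invert the disparity map so the
/// background (small disparity) becomes bright and gets the most blur, then
/// vary the maximum radius with the slider value.
func backgroundBlur(photo: CIImage, disparity: CIImage, sliderAmount: Double) -> CIImage? {
    // Invert: near subjects have large disparity, so after inversion they stay dark (sharp).
    let mask = disparity.applyingFilter("CIColorInvert")

    let filter = CIFilter(name: "CIMaskedVariableBlur")
    filter?.setValue(photo, forKey: kCIInputImageKey)
    filter?.setValue(mask, forKey: "inputMask")
    // Slider 0…1 mapped to a 0…20 pixel maximum blur radius.
    filter?.setValue(sliderAmount * 20, forKey: kCIInputRadiusKey)
    return filter?.outputImage
}
```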

The Focos app allows you to change the aperture from f/20 to f/1.4. You can even change the shape of the blur effect. There's even a 3D editing tool available.

All this can be done in the application for free, and these functions will be enough for most users. To gain access to all features, you can purchase a paid subscription.

Fans of cool photos will certainly appreciate the capabilities of the new camera on the iPhone 7 Plus. After all, it has a new mode called Portrait.

Typically, portrait photography is done with very good cameras. You can also achieve this effect if you play around a little in Photoshop.

But now you don't need to do that, because you can just buy an iPhone 7 Plus and enjoy this mode to the fullest. Let's figure out what this mode is, since some are unfamiliar with it, and how exactly to use it.

Quite often you come across photos in your social media feed that have a subject in the foreground while everything behind it is blurred.

This is exactly the effect known as "bokeh," which on the iPhone goes by the name "depth" effect. Apple has put serious work into the Plus camera, and now you can create such miracles yourself.

How popular this will be remains to be seen. After all, everything is still implemented in a beta version and users cannot fully experience all the benefits of such portrait photography.

How to create the depth/bokeh effect on the iPhone 7 Plus

To find portrait mode on the iPhone 7 Plus, just open the camera and scroll through the modes until you find the one you need.


Next, find an interesting subject and point the camera at it. Try different angles, and you will see the background blur.

Keep in mind that if the object or person is too far away, the mode simply will not work.


I think Apple will keep working in this direction and will eventually bring the mode to maturity. For now, we're just waiting for some great photos.

Results

The new dual camera of the iPhone 7 Plus opens up quite a lot of possibilities: there is now a bokeh effect, and there is also 2x zoom.

Perhaps the next iPhone will have an even better camera, giving portrait photography even more room to develop. In the meantime, we are testing what we have.


iPhone 7 Plus owners have one major advantage when it comes to photography: the dual camera lens system. With this upgraded camera, users can use 2x optical zoom, opening up even more possibilities. Another equally interesting feature is portrait mode, which allows you to get a bokeh effect (focusing attention on a specific object by blurring the background).

Portrait mode is still in beta at the moment, but you can always get the new feature ahead of schedule. Of course, this is possible through third-party software. Moreover, for this you will not need to have an iPhone 7 Plus - any other iPhone model on which you can install a specialized application is enough.

In fact, the App Store has a huge number of applications that use various photo effects and simulate “bokeh”. But we will pay attention to only one, which is the best in terms of getting a blurred background - Tadaa SLR.

Tadaa SLR App Review: Bokeh Effect

Let us note right away that Tadaa SLR is available on the App Store for 299 rubles, although the developer often makes it free for a while. Using Tadaa SLR is not very difficult. The idea is that you upload a finished photo to the app and apply the bokeh effect to it. After that, make sure the Mask and Edges options are enabled. You can turn the latter off if you would rather handle edge detection yourself in more detail, but the function works very well, so I use it.

Next, use your finger to draw a mask over the photo - over whatever you want to be the center of attention. If necessary, zoom in on the photo; this gives much better control over the automatic edge detection.

When you are finished, click the Next button in the upper right corner of the screen. You'll be taken to a screen where you can play with the blur effect settings.

You can choose a linear or circular blur style, or blur everything evenly. Once you achieve the desired result, tap the Apply button.
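Under the hood, this kind of hand-masked bokeh can be built from two standard Core Image filters: blur the whole photo, then use the drawn mask to keep the subject sharp. The sketch below is an assumption about the general approach, not Tadaa SLR's actual code; the function name and mask convention are illustrative.

```swift
import CoreImage

/// Sketch of a hand-masked bokeh effect: blur the whole photo, then use the
/// user-drawn mask to keep the subject sharp. The mask is assumed to be white
/// over the subject and black elsewhere.
func maskedBokeh(photo: CIImage, subjectMask: CIImage, blurRadius: Double) -> CIImage? {
    // Blur everything, cropping back to the original extent to remove the
    // soft border that CIGaussianBlur introduces.
    let blurred = photo
        .applyingFilter("CIGaussianBlur", parameters: [kCIInputRadiusKey: blurRadius])
        .cropped(to: photo.extent)

    // White areas of the mask show the sharp photo, black areas the blurred one.
    let blend = CIFilter(name: "CIBlendWithMask")
    blend?.setValue(photo, forKey: kCIInputImageKey)
    blend?.setValue(blurred, forKey: kCIInputBackgroundImageKey)
    blend?.setValue(subjectMask, forKey: kCIInputMaskImageKey)
    return blend?.outputImage
}
```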

After that, you can add filters to the photo, adjust brightness, contrast, saturation and a number of other characteristics. Save the photo to your feed - you're done!