blendexposure

Create well-exposed image from images with different exposures

Description

J = blendexposure(I1,I2,...,In) blends grayscale or RGB images that have different exposures. blendexposure blends the images based on their contrast, saturation, and well-exposedness, and returns the well-exposed image, J.

J = blendexposure(I1,I2,...,In,Name=Value) blends images that have different exposures, using name-value arguments to adjust how each input image contributes to the blended image.

Examples

Read a series of images with different exposures that were captured from a fixed camera with no moving objects in the scene.

I1 = imread('car_1.jpg');
I2 = imread('car_2.jpg');
I3 = imread('car_3.jpg');
I4 = imread('car_4.jpg');

Display the images. In the underexposed images, only bright regions like headlights have informative details. Conversely, the headlights are saturated in the overexposed images, and the best contrast comes from darker regions such as the brick floor and the roof.

montage({I1,I2,I3,I4})

Blend the images using exposure fusion. By default, the blendexposure function attempts to suppress highlights from strong light sources. For comparison, also blend the images without suppressing the highlights. Display the two results.

E = blendexposure(I1,I2,I3,I4);
F = blendexposure(I1,I2,I3,I4,'ReduceStrongLight',false);
montage({E,F})
title('Exposure Fusion With (Left) and Without (Right) Strong Light Suppression')

In the fused images, bright regions and dark regions retain informative details. With strong light suppression, the shape of the headlights is identifiable, and saturated pixels do not extend past the boundary of the headlights. Without strong light suppression, the shape of the headlights is not identifiable, and there are saturated pixels in the reflection of the headlights on the ground and on some parts of the other cars.

Read a series of images with different exposures. The images were captured from a fixed camera, and there are no moving objects in the scene.

I1 = imread('office_1.jpg');
I2 = imread('office_2.jpg');
I3 = imread('office_3.jpg');
I4 = imread('office_4.jpg');
I5 = imread('office_5.jpg');
I6 = imread('office_6.jpg');
montage({I1,I2,I3,I4,I5,I6})
title('Images with Different Exposures')

Blend the images using exposure fusion, varying the relative weights of contrast, saturation, and well-exposedness in the fusion, and without reducing strong light sources. Display the result.

E = blendexposure(I1,I2,I3,I4,I5,I6,'Contrast',0.8,...
    'Saturation',0.8,'WellExposedness',0.8,'ReduceStrongLight',false);
imshow(E)
title('Blended Image Using Exposure Fusion')

Input Arguments

I1,I2,...,In — Grayscale or RGB images, specified as a series of m-by-n numeric matrices or m-by-n-by-3 numeric arrays. All images must have the same size and data type.

Data Types: single | double | uint8 | uint16

Name-Value Arguments

Specify optional pairs of arguments as Name1=Value1,...,NameN=ValueN, where Name is the argument name and Value is the corresponding value. Name-value arguments must appear after other arguments, but the order of the pairs does not matter.

Example: blendexposure(I1,I2,I3,Contrast=0.5,Saturation=0.9)

Before R2021a, use commas to separate each name and value, and enclose Name in quotes.

Example: blendexposure(I1,I2,I3,"Contrast",0.5,"Saturation",0.9)

Contrast — Relative weight given to contrast during blending, specified as a number in the range [0, 1].

Saturation — Relative weight given to saturation during blending, specified as a number in the range [0, 1].

WellExposedness — Relative weight given to exposure quality during blending, specified as a number in the range [0, 1]. The exposure quality of each image is based on the divergence of the pixel intensities from a model of pixels with good exposure.
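As a rough illustration (not the toolbox implementation), the exposure quality of each pixel can be scored with a Gaussian centered on mid-gray, matching the model described in the Algorithms section:

```matlab
% Sketch only: score how well-exposed each pixel is, using a Gaussian
% with mean 0.5 and standard deviation 0.2 on intensities in [0, 1].
I = im2double(imread('car_1.jpg'));      % example file name from above
sigma = 0.2;
w = exp(-(I - 0.5).^2 ./ (2*sigma^2));   % near 1 for mid-tones, near 0 at extremes
wellExposedness = prod(w,3);             % combine the RGB channels per pixel
```

Pixels near saturation (close to 0 or 1) receive small weights, so the blend favors whichever input image exposes that region closest to mid-gray.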

ReduceStrongLight — Reduce strong light, specified as a numeric or logical 1 (true) or 0 (false). If ReduceStrongLight is true, then blendexposure attempts to suppress highlights from strong light sources in the images.

Note

If the input images do not have strong light sources and you specify ReduceStrongLight as true, then the output image J has less contrast.

Output Arguments

J — Fused image, returned as a numeric matrix or array of the same size and data type as the input images I1,I2,...,In.

Tips

  • To blend images of moving scenes or with camera jitter, first register the images by using the imregmtb function. imregmtb considers only translations, not rotations or other types of geometric transformations, when registering the images.
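For example, a handheld exposure series with camera jitter might be registered and then fused as follows. This is a sketch: the file names are placeholders, and it assumes the imregmtb calling form in which the first input is the reference image and registered versions of the remaining images are returned along with the estimated shifts.

```matlab
% Sketch: register a jittery exposure series, then fuse it.
% File names are placeholders; I1 serves as the reference image.
I1 = imread('shot_1.jpg');
I2 = imread('shot_2.jpg');
I3 = imread('shot_3.jpg');
[R2,R3,shift] = imregmtb(I1,I2,I3);   % translation-only registration
J = blendexposure(I1,R2,R3);
imshow(J)
```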

Algorithms

The blendexposure function computes the weight of each quality measure as follows:

  • Contrast weights are computed using Laplacian filtering.

  • Saturation weights are computed from the standard deviation of each image.

  • Well-exposedness is determined by comparing parts of the image to a Gaussian distribution with a mean of 0.5 and a standard deviation of 0.2.

  • Strong light reduction weights are computed as a mixture of the other three weights, multiplied by a Gaussian distribution with a fixed mean and variance.

The weights are decomposed using Gaussian pyramids for seamless blending with a Laplacian pyramid of the corresponding image, which helps preserve scene details.
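A minimal two-image version of this pyramid blending scheme might look as follows. This is an illustrative sketch in the spirit of [1], not the toolbox implementation: pyramidBlend is a hypothetical helper, and it assumes grayscale double images in [0, 1] with a precomputed, normalized weight map W for the first image.

```matlab
function out = pyramidBlend(A,B,W,nLevels)
% Hypothetical sketch of multiscale blending in the spirit of [1].
% A, B: grayscale double images in [0,1]; W: normalized weight map for A.
    blended = cell(1,nLevels);
    GA = A; GB = B; GW = W;
    for k = 1:nLevels
        % Next coarser Gaussian pyramid levels
        gA = impyramid(GA,'reduce');
        gB = impyramid(GB,'reduce');
        gW = impyramid(GW,'reduce');
        % Laplacian band = current level minus upsampled coarser level
        LA = GA - imresize(impyramid(gA,'expand'),size(GA));
        LB = GB - imresize(impyramid(gB,'expand'),size(GB));
        blended{k} = GW.*LA + (1-GW).*LB;  % blend bands with smoothed weights
        GA = gA; GB = gB; GW = gW;
    end
    out = GW.*GA + (1-GW).*GB;             % blend the coarsest residual
    for k = nLevels:-1:1                   % collapse the blended pyramid
        out = imresize(impyramid(out,'expand'),size(blended{k})) + blended{k};
    end
end
```

Blending each Laplacian band with a Gaussian-smoothed weight map, rather than blending the images directly, avoids visible seams where the weight maps change abruptly.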

References

[1] Mertens, T., J. Kautz, and F. V. Reeth. "Exposure Fusion." Pacific Graphics 2007: Proceedings of the Pacific Conference on Computer Graphics and Applications. Maui, HI, 2007, pp. 382–390.

Version History

Introduced in R2018a