Patent abstract:
ENCODE, DECODE, AND REPRESENT HIGH DYNAMIC RANGE IMAGES. The present invention relates to techniques for encoding and decoding image data that comprise a tone-mapped (TM) image with HDR reconstruction data in the form of luminance ratios and residual color values. In an exemplary embodiment, luminance ratio values and residual values in color channels of a color space are generated on an individual pixel basis, based on a high dynamic range (HDR) image and a tone-mapped (TM) image derived from it, where the TM image comprises one or more color changes that would not be recoverable from the TM image with a luminance ratio image alone. The TM image with HDR reconstruction data derived from the luminance ratio values and the residual color channel values can be output in an image file to a downstream device, for example, for decoding, processing, and/or storage. The image file can be decoded to generate a restored HDR image free from the color changes.
Publication number: BR112013026191A2
Application number: R112013026191-9
Filing date: 2012-04-16
Publication date: 2020-11-03
Inventors: Wenhui Jia; Ajit Ninan; Arkady Ten; Gregory John WARD; Gaven Wang
Applicant: Dolby Laboratories Licensing Corporation
IPC primary class:
Patent description:

Invention Patent Descriptive Report for "ENCODE, DECODE, AND REPRESENT HIGH DYNAMIC RANGE IMAGES".
CROSS REFERENCE TO RELATED APPLICATIONS This application claims priority to US Provisional Patent Application No. 61/476,174, filed on April 15, 2011, and US Provisional Patent Application No. 61/552,868, filed on October 28, 2011, both of which are incorporated herein by reference in their entirety.
TECHNICAL FIELD The present invention generally relates to image processing and, in particular, to encoding, decoding and representing high dynamic range images.
BACKGROUND OF THE INVENTION Display technologies being developed by the assignee and others are able to reproduce images that have a high dynamic range (HDR). Such displays can reproduce images that more faithfully represent real-world scenes than conventional displays. To support backward compatibility as well as new HDR display technologies, an HDR image may be represented by a tone-mapped image with additional metadata that comprises grayscale luminance ratios. On the one hand, the tone-mapped image can be used to present a normal dynamic range image (for example, on a legacy display). On the other hand, the additional metadata can be used with the tone-mapped image to generate, recover, or present an HDR image (for example, via an HDR display). However, a tone-mapped image may comprise one or more color changes caused for various reasons in connection with a user who manipulates the image, or with a particular tone mapping operator used to generate the tone-mapped image. For example, the user may change hue information in some or all of the pixels in the image in order to create an image with a more artistic expression. In addition, a tone mapping operator may perform different black or white clippings in different color channels, and may introduce color changes, for example, in relatively underexposed or highly exposed regions of the image. Under existing techniques, these color changes in the tone-mapped image are difficult or impossible to remove when a downstream decoder attempts to reconstruct an HDR image from a tone-mapped image and its accompanying grayscale luminance ratios. The approaches described in this section are approaches that could be pursued, but not necessarily approaches that have been previously conceived or pursued. Therefore, unless otherwise indicated, it should not be assumed that any of the approaches described in this section qualifies as prior art merely by virtue of its inclusion in this section.
Similarly, it should not be assumed that problems identified in relation to one or more approaches have been recognized in any prior art based on this section, unless otherwise indicated.
BRIEF DESCRIPTION OF DRAWINGS The present invention is illustrated by way of example, and not by way of limitation, in the Figures of the accompanying drawings and in which similar reference numerals refer to similar elements and in which: FIG. 1 illustrates an exemplary HDR image encoder, according to some possible embodiments of the present invention; FIG. 2 illustrates an exemplary HDR image decoder, according to some possible embodiments of the invention; FIG. 3A and FIG. 3B illustrate exemplary process flows, according to a possible embodiment of the present invention; and FIG. 4 illustrates an exemplary hardware platform on which a computer or computing device can be deployed as described in this document, in accordance with a possible embodiment of the present invention.
FIG. 5 illustrates the variable saturation characteristics of different color sensors in a typical camera over a black-to-white gradient. FIGS. 6A and 6B illustrate embodiments of the present system that transform image data from a camera sensor color space to a monitor color space while performing post-hoc white balance. FIG. 7 is an embodiment of a white balance correction technique.
DESCRIPTION OF EXEMPLARY EMBODIMENTS Exemplary embodiments, which relate to image processing techniques, are described in this document. In the following description, for the purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the present invention. It will be apparent, however, that the present invention may be practiced without these specific details. In other instances, well-known devices and structures are not described in exhaustive detail, in order to avoid unnecessarily occluding, obscuring, or obfuscating the present invention. Exemplary embodiments are described in this document according to the following outline:
1. OVERVIEW
2. HDR IMAGE ENCODER
3. WHITE BALANCE CORRECTION
4. HDR IMAGE DECODER
5. EXEMPLIFICATIVE PROCESS FLOW
6. IMPLEMENTATION MECHANISMS - OVERVIEW OF HARDWARE
7. EQUIVALENTS, EXTENSIONS, ALTERNATIVES AND MISCELLANEOUS
1. OVERVIEW This overview presents a basic description of some aspects of a possible embodiment of the present invention.
It should be noted that this overview is not an extensive or exhaustive summary of aspects of the possible embodiment.
Furthermore, it should be noted that this overview is not intended to be understood as identifying any particularly significant aspects or elements of the possible embodiment, nor as delineating any scope of the particular possible embodiment or of the invention in general.
This overview merely presents some concepts that relate to the possible exemplary embodiment in a condensed and simplified format, and should be understood merely as a conceptual prelude to the more detailed description of possible exemplary embodiments that follows below.
In order to display images on a wide variety of image rendering devices, tone mapping operators (TMOs) process input HDR images into tone-mapped (TM) base images. Base TM images may comprise color changes (for example, hue changes, color clippings, artistic expressions, etc.) relative to the input image.
In some techniques, base TM images are provided to downstream image decoders along with luminance ratios to reconstruct HDR images equivalent to the input HDR images.
However, a downstream image decoder that relies on a base TM image and grayscale luminance ratios alone would not have the ability to remove color changes in a reconstructed HDR image.
As a result, color changes will remain noticeable in the reconstructed HDR image.
In contrast, an HDR image encoder under the techniques described in this document creates not only luminance ratios, but also residual color values, based on an input HDR image and a base TM image.
Luminance ratios and residual color values can be collectively denoted as HDR reconstruction data.
Optionally and/or additionally, the luminance ratios are transformed into a logarithmic domain to support a relatively wide range of luminance values.
Optionally and/or additionally, the resulting logarithmic luminance ratios and the residual color values are quantized.
Optionally and/or additionally, the logarithmic ratios and the quantized residual color values are stored in a residual image.
In some embodiments, the logarithmic ratios and the quantized residual color values, or the residual image, are provided with the base TM image to a downstream image decoder.
Optionally and/or additionally, parameters related to the logarithmic ratios and the quantized residual color values (for example, range limits, etc.) are also provided with the base TM image.
A TMO, as described in this document, may freely perform color clippings in color channels for individual pixels with low (black) or high (white) luminance levels. Likewise, a TMO is not required to maintain hue at each pixel, as described in this document.
Under the techniques described in this document, a user is free to select a TMO based on the image content (for example, human figures, an indoor image, an outdoor scene, a night view, a sunset, etc.) or the application (for example, use in a film, a poster, a wedding photo, a magazine, etc.). Color clippings or modifications can be deliberately and freely used to create artistic expressions in images.
The HDR image decoders and encoders in this document support TMOs deployed by different types of editing software and camera manufacturers that can introduce a wide range of possible color changes.
In terms of techniques described in this document, HDR encoders provide residual color values for HDR decoders.
HDR decoders, in turn, make use of residual color values to prevent (or minimize) color changes from being present in reconstructed HDR images.
Under the techniques described in this document, bit streams and/or image files can be used to store and deliver base TM images and their corresponding HDR reconstruction data to downstream image viewers or decoders for decoding and/or rendering.
An image format in terms of the techniques described in this document supports TMOs deployed by different types of editing software and camera manufacturers.
Examples of image formats described in this document may include, but are not limited to, standard JPEG image formats (which include, for example, JPEG-HDR), etc.
In an exemplary embodiment, a JPEG-HDR image format is used to support storing a base TM image with luminance ratios and residual color values.
Additionally and/or optionally, one or both of the base TM image and the residual image stored in an image file are compressed.
As described in this document, compression can be performed using the JPEG standard or a different method.
An image viewer or decoder that does not support HDR image processing in terms of the techniques described in this document simply opens the base TM image in an image file.
On the other hand, HDR image decoders under the techniques described in this document are configured to read/parse the image file into the base TM image and its corresponding luminance ratios and residual color values, and to recover/reconstruct an HDR image.
The reconstructed HDR image described in this document is free from color changes that were absent from the original HDR input image but were introduced into the base TM image by a TMO.
In some possible embodiments, mechanisms as described in this document form a part of an image encoder, including, but not limited to, a handheld device, game machine, cinema system, home entertainment system, television, laptop computer, netbook computer, cellular radiotelephone, electronic book reader, point-of-sale terminal, tablet computer, computer workstation, computer kiosk, and various other types of terminals and processing units.
Various modifications to the preferred embodiments and to the generic principles and attributes described in this document will be readily apparent to those skilled in the art. Thus, the description is not intended to be limited to the embodiments shown, but is to be accorded the broadest scope consistent with the principles and features described in this document.
2. HDR IMAGE ENCODER FIG. 1 illustrates an exemplary HDR image encoder, according to some possible embodiments of the present invention. In some possible embodiments, the HDR image encoder is implemented by one or more computing devices, and configured with software and/or hardware components that implement image processing techniques to encode an input HDR image into a TM image with HDR reconstruction data in a standard or proprietary image format. The HDR image encoder comprises software and/or hardware components configured to receive an input HDR image. As used herein, an "input HDR image" refers to any HDR image that may comprise floating-point or fixed-point high dynamic range image data. The input HDR image may be in any color space that supports a high dynamic range color gamut. In an exemplary embodiment, the input HDR image is an RGB image (for example, input HDR RGB 102 as illustrated in FIG. 1) in an RGB color space. In one example, each pixel in the input HDR image comprises floating-point pixel values for all channels (for example, red, green, and blue color channels in the RGB color space) defined in the color space. In another example, each pixel in the input HDR image comprises fixed-point pixel values for all channels (for example, fixed-point pixel values of 16 bits or larger/smaller bit depths for the red, green, and blue channels in the RGB color space) defined in the color space. Each pixel may optionally and/or alternatively comprise downsampled pixel values for one or more of the channels in the color space.
In an exemplary embodiment, the HDR image encoder comprises software and / or hardware components configured to perform a number of pre-processing steps.
Optionally and / or alternatively, the pre-processing steps include, but are not limited to, zero or more integrity checks on the HDR input image, etc.
For example, an HDR input image may or may not comprise pixel values that imply negative luminance values, for example, introduced by an upstream step or by local data corruption introduced in encoding or transmission.
To prevent an underlying luminance value from being negative, thereby causing problems in subsequent tone mapping operations, the underlying luminance values in the input HDR image are checked with an integrity check (Negative Lum Check 104). If a pixel's underlying luminance value is not positive, the pixel values for all of the pixel's color channels may be set to zero.
In possible embodiments where a luminance value is not provided directly for a pixel in a color space, a luminance value for the pixel can be (indirectly) derived from the pixel's values in the color space.
In an exemplary embodiment, the pixel values R, G, and B in an RGB color space for a pixel can be used to compute the luminance value Y for the pixel, as follows:

Y = 0.30078125 * R + 0.59765625 * G + 0.1015625 * B        expression (1)

Before the input HDR image, which may or may not have been preprocessed, is provided to a tone mapping operator (TMO 106), the input HDR image data is passed through a white balance operator (105). As will be discussed in more detail below, correcting and/or adjusting the white balance in HDR images may be desirable as part of HDR encoding, as well as prior to the tone mapping operation.
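As an illustration of expression (1) and of the negative-luminance integrity check described above, a minimal sketch might look as follows; the function names are illustrative, not from the patent:

```python
import numpy as np

# Luminance coefficients from expression (1).
LUMA = np.array([0.30078125, 0.59765625, 0.1015625])

def luminance(rgb):
    """Per-pixel luminance Y for an (H, W, 3) linear RGB image."""
    return rgb @ LUMA

def negative_lum_check(rgb):
    """Zero all channels of any pixel whose luminance is not positive."""
    out = rgb.copy()
    out[luminance(out) <= 0.0] = 0.0
    return out

hdr = np.array([[[0.5, 0.25, 0.1],      # ordinary pixel
                 [-0.4, 0.1, 0.05]]])   # corrupt pixel: negative luminance
clean = negative_lum_check(hdr)
```

Note that the three coefficients sum exactly to 1.0 (they are 77/256, 153/256, and 26/256), so a neutral white pixel maps to a luminance equal to its channel value.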
In an exemplary embodiment, the TMO 106 comprises software and/or hardware components configured to generate, based on the input HDR image (which may have been preprocessed), a tone-mapped (TM) image that can be rendered on a wide variety of display devices.
In terms of techniques described in this document, TMO 106 is treated as a black box in the HDR image encoder.
The TMO 106, or a user who employs the TMO 106 to manipulate the input HDR image, can freely introduce one or more color changes that affect hue or chroma properties in some or all portions of the output TM image of the TMO 106. Under the techniques described in this document, the TM image with color changes freely made by the TMO 106 or the user can be provided as a base image to downstream devices, along with HDR reconstruction data created under the techniques described in this document that can be used to reproduce/render the HDR image.
The HDR reconstruction data provides enough information for a downstream recipient device to reproduce the HDR image free from the color changes made by the TMO 106. Optionally and/or alternatively, the HDR image encoder comprises software and/or hardware components (Black Mod 110) configured to perform black modifications on the output of the TMO 106, the TM image (R'G'B' 108). The HDR image encoder, or the Black Mod 110 block therein, finds zero pixel values in the TM image (R'G'B' 108). In one example, if a pixel value for a color channel is zero, the pixel value is replaced with a small value such as 1, 2, 10, or another greater or lesser value.
In another example, if the luminance value for a pixel is zero, small values such as 1, 2, 10, or other smaller or larger values are provided for the pixel values of one or more color channels.
A small pixel value (for example, below 10) may not make a noticeable visual difference.
The TM image (R'G'B '108) may or may not be a gamma-corrected 8-bit image.
For the purpose of illustration only, the tone-mapped image (R'G'B' 108) output by the TMO 106 has been gamma-corrected within the TMO 106. Optionally and/or alternatively, the HDR image encoder comprises software and/or hardware components (Inverse Gamma 112) configured to convert the tone-mapped image (R'G'B' 108) to an intermediate tone-mapped image (RGBt) in a linear domain if an output parameter or a return value of the TMO 106 indicates that gamma correction was performed within the TMO 106. The gamma curve used for gamma correction and/or gamma conversion can be related to a standard color space such as sRGB or AdobeRGB, which can be indicated by the TMO 106 using one or more output parameters or return values.
In an exemplary embodiment, RGBt can be used to further derive luminance ratios and residual values, as described in this document.
In an exemplary embodiment, luminance values (Yh) in the input HDR image (RGBh) and luminance values (Yt) in RGBt can be calculated.
In some possible embodiments, Yh, Yt, and the luminance ratios (r) between Yh and Yt can be calculated on an individual pixel basis as follows:

Yh = L(RGBh) = 0.30078125 * Rh + 0.59765625 * Gh + 0.1015625 * Bh
Yt = L(RGBt) = 0.30078125 * Rt + 0.59765625 * Gt + 0.1015625 * Bt
r = Yh / Yt        expressions (2)

where Yh comprises a plurality of luminance values, each of which corresponds to a different pixel in the input HDR image, Yt comprises a plurality of luminance values, each of which corresponds to a different pixel in the tone-mapped image, and r comprises a plurality of luminance ratios, each of which is defined as the ratio between a luminance value in Yh and the corresponding luminance value in Yt.
In some possible embodiments, Yh, Yt, and r can be expressed as matrices of the same dimensions.
A position in a matrix described in this document, as indicated by a row index and a column index, can indicate a pixel in an image (for example, the input HDR image, the tone-mapped image, or a luminance ratio image formed by r). Luminance values in Yh and Yt and luminance ratios in r correspond to one another if their positions share the same row index and column index in the matrices.
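The per-pixel computation of expressions (2) can be sketched as follows; the array shapes, sample values, and names are illustrative assumptions:

```python
import numpy as np

# Luminance coefficients shared by Yh and Yt in expressions (2).
LUMA = np.array([0.30078125, 0.59765625, 0.1015625])

def luminance(rgb):
    """Luminance matrix for an (H, W, 3) linear RGB image."""
    return rgb @ LUMA

rgb_h = np.array([[[8.0, 4.0, 2.0]]])   # HDR input pixel (linear)
rgb_t = np.array([[[0.8, 0.4, 0.2]]])   # tone-mapped pixel (linear)

y_h = luminance(rgb_h)   # Yh
y_t = luminance(rgb_t)   # Yt
r = y_h / y_t            # luminance ratio matrix, same dimensions as Yh, Yt
```

In this single-pixel example, the tone-mapped pixel is a uniformly scaled-down copy of the HDR pixel, so the ratio r is simply the scale factor (10).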
In an alternative embodiment, the division operations (Div 116) illustrated in FIG. 1 to compute r based on Yh and Yt are performed as subtractions in a logarithmic domain.
In techniques as described in this document, the luminance ratios r are computed using the mapped tone image (RGBt) that comprises the result of color change operations performed by the TMO 106 or by the user.
The luminance ratios, as computed in expressions (2), when multiplied by the tone-mapped image, produce an image whose luminance values correspond to the luminance values of the input HDR image.
If color balance is maintained by the TMO 106, and if no color clipping is performed on the tone-mapped image by the TMO 106 or the user, a combined image created by multiplying the tone-mapped image by the luminance ratios r matches the input HDR image color by color.
On the other hand, if the tone-mapped image comprises color distortions/changes, for example, when the color balance of the input HDR image is changed by the TMO 106 in the tone-mapped image, or if color clipping occurs in the white balance 105 or the TMO 106, the combined image created by multiplying the tone-mapped image by the luminance ratios r does not match the input HDR image color by color.
Under the techniques described in this document, differences in the color channels other than the luminance channel are computed between the combined image and the input HDR image to produce residual values included in the HDR reconstruction data.
The HDR reconstruction data generated under the techniques described in this document provide the extra color information that was lost in the white balance 105, in the TMO 106, or in operations performed by the user.
When a downstream device such as an HDR image decoder or an HDR rendering device receives the TM image with the color distortions/changes and the HDR reconstruction data, the color distortions/changes in the TM image are compensated for with the HDR reconstruction data.
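One way this decoder-side compensation can work, sketched under the assumption that the residual color values are simply added back to the TM image before the luminance ratios are applied (mirroring the encoder-side division and subtraction), is:

```python
import numpy as np

r = np.array([[10.0]])                   # luminance ratio per pixel
rgb_t = np.array([[[0.8, 0.5, 0.2]]])    # TM pixel carrying a hue change
rgb_e = np.array([[[0.0, -0.1, 0.0]]])   # residual color values for that pixel

# Compensate the color change, then undo the tone mapping.
rgb_ht = rgb_t + rgb_e                   # remapped image, color change removed
rgb_h = rgb_ht * r[..., None]            # reconstructed HDR pixel
```

With these illustrative values, the residual cancels the green-channel shift in the TM pixel, so the reconstructed HDR pixel recovers the original color balance.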
As used in this document, clipping refers to a type of color change that alters/modifies out-of-range pixel values in color channels so that the resulting pixel values fall within the represented ranges.
Clipping can happen in any color channel (for example, pixel values R, G, and B in an RGB color space in a certain portion of the HDR image may be clipped in the TM image). Clipping amounts may or may not vary between color channels (for example, more clipping for green, less clipping for blue, etc.). Using the luminance ratios r, the input HDR image can be remapped to generate an intermediate remapped image (RGBht) whose color balance is not changed.
RGBht can be calculated with a division operation (Div 118) as follows:

RGBht = RGBh / r        expression (3)

As explained above, if color balance was maintained by the TMO 106 and if there is no color clipping, the remapped image (RGBht) will be the same as the tone-mapped image (RGBt). Otherwise, there will be differences between these two images.
The differences between the images are residual values in the tone-mapped image space (for example, a space that comprises all possible tone-mapped images). In an exemplary embodiment, the residual values (RGBe) are calculated with subtractions (Sub 132) in the linear domain as follows:

RGBe = RGBht - RGBt        expression (4)

The residual values (RGBe) can be converted (by a CSC block 134 shown in FIG. 1) to a YCbCr color space.
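A minimal sketch of the encoder-side computations of expressions (3) and (4), with illustrative names and a single-pixel example:

```python
import numpy as np

r = np.array([[10.0]])                   # per-pixel luminance ratios
rgb_h = np.array([[[8.0, 4.0, 2.0]]])    # HDR input
rgb_t = np.array([[[0.8, 0.5, 0.2]]])    # TM image (green channel altered)

rgb_ht = rgb_h / r[..., None]            # expression (3): RGBht = RGBh / r
rgb_e = rgb_ht - rgb_t                   # expression (4): RGBe = RGBht - RGBt
```

Here the tone-mapped pixel carries a deliberate green-channel change, and the residual isolates exactly that difference while the other channels yield zero residuals.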
Under the techniques described in this document, the tone-mapped image (RGBt) and the remapped image (RGBht) have the same luminance values. The residual luminance values between these two images are all zeros. The residual values (RGBe) in the YCbCr color space therefore comprise only the chroma information (Diff CbCr 154) that needs to be saved.
The conversion from RGBe to Diff CbCr can be performed as follows:

[Ye, Cbe, Cre]T = MCSC * [Re, Ge, Be]T        expression (5)

where MCSC (with inverse MCSC-1) is a 3x3 color space conversion matrix whose first (luminance) row is:

[0.30078125  0.59765625  0.1015625]        expression (6)

Under the techniques described in this document, the conversion coefficients used to compute luminance values in the input HDR image and in the tone-mapped image are exactly the same as those in the first row of MCSC in expressions (5) and (6). Under these techniques, the residual luminance values (Ye) in RGBe are all zeros, as shown below:

Ye = 0.30078125 * Re + 0.59765625 * Ge + 0.1015625 * Be
   = L(RGBht) - L(RGBt) = Yt - Yt = 0        expressions (7)
The luminance ratios r in the linear domain, as computed in expressions (2), have a wide range, because the ratios carry the HDR information of the input image. In an exemplary embodiment, for purposes of efficient quantization, as illustrated in expression (8) below, the luminance ratios r are first converted (for example, by a Log block 130 of FIG. 1) into a logarithmic domain. The maximum and minimum luminance ratio values in the logarithmic domain can be used to determine a logarithmic range (for example, by a Min Max block in FIG. 1) with upper and lower limits lrmax and lrmin, respectively. Then, the logarithmic luminance ratios can be quantized (for example, uniformly or according to a particular curve) (for example, by a Quant 8b block 136 of FIG. 1) into 8-bit values Log Yt (or H in expression (8)) based on the logarithmic range. In one example, the natural logarithm is used for the logarithmic domain. In other examples, logarithms with bases other than the natural logarithm are used for the logarithmic domain. Shown below for the uniform case:

H = 255 * (ln(r) - lrmin) / (lrmax - lrmin)        expression (8)

In an example YCbCr color space, the residual values of Cb and Cr (denoted as U and V in expressions (9) and (10)) in Diff CbCr can be quantized to 8-bit values (CbCr 158), respectively, in a similar manner, as follows:

U = 255 * (Cbe - Cbmin) / (Cbmax - Cbmin)        expression (9)
V = 255 * (Cre - Crmin) / (Crmax - Crmin)        expression (10)

In an exemplary embodiment, after quantization, the HDR reconstruction data comprises three sets of two-dimensional data, H, U, and V (Log Yt and CbCr 158 in FIG. 1). The data sets in the HDR reconstruction data can be saved/stored in a single YCbCr container 144 (for example, a YUV image) that comprises the luminance ratio values (in the luminance channel of the example YUV color space) and the residual values (in the chroma difference channels of the example YUV color space), as if they formed an image (for example, YUV). In the end, two images are obtained.
One is the tone-mapped image in the RGB color space, and the other is the HUV image in the YUV color space.
The tone-mapped image can be the output (R'G'B' 108) of the TMO 106, after black modification (Black Mod 110) and/or with optional desaturation (for example, 150). Both the tone-mapped and HUV images can comprise 8-bit data, and can be compressed, for example, using a standard JPEG compression method.
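The 8-bit quantization of the logarithmic luminance ratios, as in expression (8), can be sketched as follows, assuming a uniform quantization curve over the [min, max] range (the same pattern applies to the Cb and Cr residuals of expressions (9) and (10)); the function names and sample values are illustrative:

```python
import numpy as np

def quantize8(x, lo, hi):
    """Uniformly quantize values in [lo, hi] to 8-bit codes 0..255."""
    return np.round(255.0 * (x - lo) / (hi - lo)).astype(np.uint8)

def dequantize8(q, lo, hi):
    """Approximate inverse of quantize8."""
    return lo + (q.astype(np.float64) / 255.0) * (hi - lo)

lr = np.log(np.array([1.0, 10.0, 100.0]))   # log-domain luminance ratios
lrmin, lrmax = lr.min(), lr.max()
H = quantize8(lr, lrmin, lrmax)             # 8-bit Log Yt values
lr_back = dequantize8(H, lrmin, lrmax)      # decoder-side reconstruction
```

The range limits lrmin and lrmax must travel with the image (as the parameters mentioned above) so that the decoder can invert the quantization.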
The HDR reconstruction data can be output in an application segment (APP SEG 146) with the tone-mapped image in a single image file.
The single image file can be in a standard or proprietary image file format (for example, JPEG-HDR). The application segment can be a marker field (for example, the APP11 marker) in the image file format (for example, as defined by the JPEG standard). In one example, the TM image forms a base image (RGB TM Base 148) after JPEG compression, and the HUV image is attached to the base TM image in an application segment (APP SEG 146), such as the APP11 marker, in an output HDR image file (for example, 156).
The techniques described in this document can be used to process both floating-point and fixed-point HDR images (for example, a 16-bit linear image, a 14-bit gamma-corrected image, etc.). In an exemplary embodiment, the base TM image and the HUV image are stored in a standard JPEG format under JPEG-HDR techniques, commercially available from Dolby Laboratories, San Francisco, California. The base TM image is stored in an entropy-coded data segment. The HUV image, with parameters and auxiliary data, is stored in an application segment, such as the APP11 application segment under JPEG-HDR, with an appropriate ID string (for example, "DD"). Minimum and maximum values of the quantization value ranges in the HUV image can be stored in a type I segment. These minimum and maximum values include the maximum and minimum luminance ratio values in the logarithmic domain, the maximum and minimum values for the residual values of Cb, and the maximum and minimum values for the residual values of Cr. Optionally and/or alternatively, other information specifying the base image color space (for example, sRGB, AdobeRGB) and the residual mode (for example, luminance ratio only) is included in the type I segment. If the residual mode is luminance ratio only, data and parameters related to Cb and Cr can be ignored in later decoding. In an exemplary embodiment, the HUV image is stored in a type II segment, and can be divided into multiple type II segments with index information in a segment header if the data size of the HUV image exceeds a certain size, for example, 64k bytes.
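The division of an oversized HUV payload into multiple application segments might be sketched as follows; the two-byte index header and the per-segment payload budget are assumptions for illustration only, not the actual JPEG-HDR segment layout:

```python
import struct

APP11 = 0xFFEB               # JPEG APP11 marker code
MAX_PAYLOAD = 64000          # illustrative per-segment payload budget

def split_app11(payload: bytes, id_string: bytes = b"DD"):
    """Return (marker, segment_bytes) pairs carrying the payload in order."""
    chunks = [payload[i:i + MAX_PAYLOAD]
              for i in range(0, len(payload), MAX_PAYLOAD)]
    segments = []
    for index, chunk in enumerate(chunks):
        # Hypothetical header: ID string plus (segment index, segment count).
        header = id_string + struct.pack(">BB", index, len(chunks))
        body = header + chunk
        # JPEG segment length counts the two length bytes themselves.
        segments.append((APP11, struct.pack(">H", len(body) + 2) + body))
    return segments

segs = split_app11(bytes(150000))
```

A reader would reassemble the payload by sorting segments on the index field, which is the role of the "index information in a segment header" mentioned above.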
3. WHITE BALANCE CORRECTION Often, a user captures an image with a digital camera and desires to render an HDR image derived from the captured image. If the Camera RAW format of the camera that captured the image were known, then it might be possible to create an HDR image with high chroma fidelity.
However, post-hoc white balance can be difficult to perform correctly with standard JPEG for at least two reasons: (1) the exact mapping from the rendered color space back to the sensor colors is not known, and (2) clipping was applied unevenly to the different color channels.
Without knowing how to get to a color space that is linearly related to the original sensor values, the white balance correction cannot be done correctly.
However, if the Camera RAW format is known, it is possible to use the Camera RAW data to adjust exposure and white balance after the fact (that is, after the images are captured by the camera). Although standard JPEG images can have their brightness boosted to some degree, it is not possible to recover lost highlights, and white balancing either increases overall image brightness or causes highlights to become colored to some degree.
One embodiment of the present system, however, has the ability to adjust the exposure and white balance of JPEG-HDR images, and may have advantages over standard JPEG and (in many cases) Camera RAW image encodings.
A standard image encoding, such as JPEG or 24-bit TIFF, has, in practice, a maximum representable value on each channel (255,255,255) which corresponds to "white". As long as the camera's exposure and white balance are correctly set, the reproduced image will be acceptable for normal viewing and printing purposes.
However, if the image is slightly overexposed or the white balance is improperly set by the user or the camera firmware, it will be difficult to correct the image during post-processing.
Once a primary value has been clipped to 255, the information needed to recover the original color tends to be lost.
Although there is no widely adopted standard for Camera RAW, most of these formats contain enough information at each pixel to make moderate adjustments to exposure and white balance.
Given the original sensor A/D output values, it is possible to know almost as much as the camera knew when the image was taken.
In particular, it is possible to tell when each sensor is saturated, which makes it possible to adjust the exposure and white balance within the captured range. In fact, if Camera RAW files are available, the situation is helped for two reasons: (1) Camera RAW has a little extra headroom in each channel beyond "white", and (2) Camera RAW reports exactly when each channel clips, so that it is possible to take care not to exceed this maximum in the output mapping.
Typically, then, post-hoc white point correction works on Camera RAW, since (1) the linear sensor range and color space are known, and (2) no information is lost to clipping.
Any rendered white point available at the time of capture is available from the Camera RAW data.
In fact, if white balancing is performed on Camera RAW data, the resulting image is as good as the original, since the steps taken by the camera in firmware are repeated, although possibly performed in standalone software.
However, there are limits to how much highlight information can be retrieved.
For example, in some cases, if one of the channels has reached its maximum value in a given patch, the RAW converter may clip the other two to ensure a neutral result.
Clipping the primary colors to the wrong white point makes post-hoc correction problematic even if it is known how to arrive at linear values, since the full range of the sensor data has been compromised.
Such post-hoc processing can either discolor the highlights or increase image brightness in a way that desaturates color and loses detail elsewhere.
The situation is illustrated by FIG. 5, which depicts a graph of a hypothetical digital camera's behavior when capturing image data across a black-to-white gradient.
According to the graph of FIG. 5, the green channel saturates at a first white luminance value (at digital value 4095), and the red channel then saturates at a higher white luminance value.
The blue channel, however, needs a much higher white luminance value for it to saturate.
Without knowing those points where the various color channels become saturated, correcting for an appropriate white balance is particularly tricky.
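The per-channel saturation behavior of FIG. 5 can be illustrated with a short sketch. This is only a hypothetical model - the sensitivity values, function names, and the 12-bit maximum are assumptions chosen for illustration, not values from the specification:

```python
# Hypothetical 12-bit sensor (max digital value 4095), with the green
# channel saturating first, then red, then blue -- as in FIG. 5.
SENSOR_MAX = 4095
# Assumed relative channel sensitivities: a higher value saturates sooner.
SENSITIVITY = {"R": 0.8, "G": 1.0, "B": 0.5}

def capture(luminance):
    """Map a scene luminance (arbitrary linear units) to raw digital values."""
    return {ch: min(SENSOR_MAX, int(round(s * luminance)))
            for ch, s in SENSITIVITY.items()}

def saturated_channels(raw):
    """A channel at SENSOR_MAX has clipped: its true value is unknowable."""
    return [ch for ch, v in raw.items() if v >= SENSOR_MAX]

# Green clips first as luminance rises; once any channel clips, a neutral
# (gray) patch can no longer be distinguished from a tinted one.
print(saturated_channels(capture(3000)))   # []
print(saturated_channels(capture(4500)))   # ['G']
print(saturated_channels(capture(6000)))   # ['R', 'G']
```

Once a channel reports SENSOR_MAX, the information needed for white balance correction at that pixel is gone, which is why knowing the clip points matters.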
3. JPEG-HDR EXTENSION
JPEG-HDR was created as an extension of the JPEG standard in order to allow the storage of high dynamic range images.
With the existing JPEG-HDR format, JPEG-HDR effectively has two image layers that affect white balance correction - namely, the base-layer (mapped tone) JPEG image and a residual layer that can be used to recover a calibrated linear color space - thus giving many of the benefits of camera RAW.
The additional dynamic range actually allows the system to correct color values beyond what is rendered in the base layer.
However, it may be desirable to add some additional information in the event that the captured range does not cover the full range of scene values, as is often the case even for HDR.
In one embodiment, the system can store the upper sensor range for the captured HDR in each of the RGB color channels.
This can take the form of 2 or 3 white balance scaling factors and a matrix (for example, 3x3) that produces effective linear sensor values over a nominal range (for example, 0 to 1). The "effective linear sensor values" can be interpreted as the full range of a hypothetical sensor that could capture in a single exposure what can be fused from multiple exposures in an HDR capture.
These effective linear sensor values can be used to produce the sensor color space values of such a hypothetical sensor.
"White balance scaling factors" (or "white balance multipliers") can be used to tell the system what the original white balance setting was during conversion - that is, the one used to obtain the current output.
In another embodiment, a change can be made to JPEG-HDR to allow the highlights of the recovered HDR image to extend beyond the common maximum for non-white values - where one or more effective sensor values have reached their upper limit.
Because these pixels can be rendered as white in the mapped tone base layer, the new residual CbCr color channels, as further described in this document, can be used.
When an application requires the recombined HDR image from a JPEG-HDR file, then white clipping can be performed.
This can be achieved by transforming the HDR image back to the "effective linear sensor values" and applying the (corrected) white balance multipliers. In one embodiment, these multipliers can be normalized so that the smallest of the three channel multipliers is exactly 1. This can be followed by a clamping step in which any sensor x multiplier values greater than 1 are clipped to 1. Finally, the modified effective sensor values at each pixel can be transformed back into the target color space using the inverse of the supplied sensor matrix.
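The white-clipping steps just described can be sketched as follows. This is a minimal illustration, not the specified implementation: the sensor matrix values, the function name, and the array shapes are assumptions:

```python
import numpy as np

# Hypothetical 3x3 matrix mapping effective linear sensor values to the
# target color space (stored with the file in one embodiment).
SENSOR_TO_TARGET = np.array([[1.6, -0.4, -0.2],
                             [-0.3, 1.5, -0.2],
                             [-0.1, -0.3, 1.4]])
TARGET_TO_SENSOR = np.linalg.inv(SENSOR_TO_TARGET)

def white_clip(hdr_rgb, wb_multipliers):
    """hdr_rgb: (N, 3) pixels in the target color space.
    wb_multipliers: per-channel white balance correction factors."""
    m = np.asarray(wb_multipliers, dtype=float)
    m = m / m.min()                         # normalize: smallest multiplier == 1
    sensor = hdr_rgb @ TARGET_TO_SENSOR.T   # back to effective sensor values
    sensor = np.minimum(sensor * m, 1.0)    # clip sensor * multiplier > 1
    return sensor @ SENSOR_TO_TARGET.T      # inverse matrix back to target space
```

With unit multipliers and in-range pixels the round trip is lossless; a large multiplier pushes a channel past 1 and the clamp takes effect, mirroring the clipping step in the text.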
In yet another embodiment, an optimization can check whether HDR pixel values are large enough to approach clipping.
If a pixel is within the captured color gamut limit, then no clipping is required, and the two color transformations and the white balance adjustment can be combined into one transformation.
This will be the identity matrix in cases where the original white balance is unchanged.
Since most pixels can be within the captured color range, this can reduce the computing requirements for this method.
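For in-gamut pixels, the two color transforms and the white balance scaling collapse into a single 3x3 matrix, as the text notes. A sketch under assumed values (the matrix entries and names are hypothetical, not from the specification):

```python
import numpy as np

# Hypothetical sensor matrix; its inverse maps the target space back to
# effective linear sensor values.
SENSOR_TO_TARGET = np.array([[1.6, -0.4, -0.2],
                             [-0.3, 1.5, -0.2],
                             [-0.1, -0.3, 1.4]])
TARGET_TO_SENSOR = np.linalg.inv(SENSOR_TO_TARGET)

def combined_matrix(wb_multipliers):
    """Collapse target->sensor, white balance scaling, and sensor->target
    into one 3x3 transform, valid when no channel can clip."""
    m = np.asarray(wb_multipliers, dtype=float)
    m = m / m.min()
    return SENSOR_TO_TARGET @ np.diag(m) @ TARGET_TO_SENSOR

# When the original white balance is unchanged, the combined transform is
# the identity matrix, so most pixels need no per-pixel work at all.
assert np.allclose(combined_matrix([1.0, 1.0, 1.0]), np.eye(3))
```

This is why the optimization pays off: one matrix multiply per in-gamut pixel (or nothing, for the identity case) replaces the transform-clamp-transform sequence.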
If re-rendering the mapped tone base layer is desired, the corrected and clipped HDR image created in this way can then be used as the source for recomputing the mapped tone image.
To illustrate part of the discussion above, FIG. 6A shows one possible set of mapping operations from a camera sensor color space to a monitor color space. Camera sensor color space 602 sends image data to color channel multipliers 604. In order to guarantee a white point, clamping to a minimum value 606 can be employed before these values are fed into a color transformation matrix 608 for rendering on a target monitor.
FIG. 6B shows a different embodiment of a possible set of mappings. As can be seen, if the captured image is in HDR format, then the clamping at 606 can be skipped (610) and the values can be fed directly into the color transformation matrix 608. With either of the embodiments of FIGURES 6A and/or 6B, the output of transformation matrix 608 can be emitted from the White Balance operator 105 in FIG. 1 along the HDR path to Div 118 and block Yh.
In one embodiment, the camera may be able to guess (possibly with or without user input) the ambient lighting conditions under which the image was captured in order to effect, say, a tungsten white balance or a daylight white balance. This can be taken into account in the appropriate configuration of the color channel multipliers. Alternatively, if there is no such information in the camera settings, then having the user make an appropriate guess on a post-hoc basis, or choose a neutral surface (gray reference) in the image, may be sufficient.
4. HDR IMAGE DECODER
FIG. 2 illustrates an exemplary HDR image decoder, according to some possible embodiments of the present invention. In an exemplary embodiment, the HDR image decoder is deployed by one or more computing devices, and configured with software and/or hardware components that implement image processing techniques to decode HDR image data (denoted as HDR 202 in FIG. 2) comprising an RGB mapped tone base image and HDR reconstruction data. In an exemplary embodiment, HDR reconstruction data refers to luminance ratio values, residual Cb and Cr values, and auxiliary parameters and data related to the aforementioned data.
In an exemplary embodiment, the image data to be decoded by the HDR image decoder is in an image file in an image format (for example, JPEG-HDR). The HDR image decoder may comprise a parser (for example, 204) configured to receive HDR image data 202 (for example, a JPEG-HDR image file in a format enhanced to store residual Cb and Cr in addition to luminance ratios), and to parse HDR image data 202 into the RGB mapped tone base image (denoted as Base image 206 in FIG. 2) and one or more application segments (SEG of APP 208) that store the HDR reconstruction data.
In an exemplary embodiment, parser 204 is a standard JPEG decoder.
In an exemplary embodiment, the HDR image decoder comprises software and/or hardware components configured to parse the one or more application segments (SEG of APP 208) into a luminance ratio image (ratio image 210) and quantized residual Cb and Cr values (Residual of CbCr 212). The luminance ratio image (ratio image 210) comprises quantized logarithmic luminance ratios.
In an exemplary embodiment, the HDR image decoder comprises a de-quantization processing block (De-quant 214) configured to de-quantize the quantized logarithmic luminance ratios into logarithmic luminance ratios.
The HDR image decoder comprises an inverse logarithm processing block (exp 216) configured to convert the logarithmic luminance ratios into luminance ratios in a non-logarithmic domain.
In an exemplary embodiment, the HDR image decoder comprises a de-quantization processing block (De-quant 218) configured to de-quantize the quantized residual Cb and Cr values into residual Cb and Cr values.
The HDR image decoder comprises a color space conversion processing block (CSC 220) configured to convert the residual Cb and Cr values into residual RGB values in the linear domain.
In an exemplary embodiment, the HDR image decoder comprises a resaturation block (232) configured to perform the reverse process of desaturation, optionally and/or additionally, if the mapped tone base image was desaturated by the encoder.
In an exemplary embodiment, the HDR image decoder comprises a gamma decoding processing block (Gamma Decoding 224) configured to perform gamma decoding on the RGB mapped tone base image (Base image 206), optionally and/or additionally, if the mapped tone base image (Base image 206) is gamma encoded.
For example, a parameter in a type I segment of an application segment can indicate that the mapped tone base image is a gamma-encoded RGB image (for example, an sRGB image). The output of the gamma decoding processing block (Gamma Decoding 224) is multiplied with the luminance ratios of the ratio image on an individual pixel basis to derive an intermediate HDR image in a Mul 226 processing block, while the residual RGB values are multiplied with the same luminance ratios of the ratio image on an individual pixel basis to derive a residual RGB image in a Mul 222 processing block (which can be the same as 226). The intermediate HDR image and the residual RGB image can be added on an individual pixel basis by a sum processing block (Add 228) to derive an HDR RGB image (RGB 230), which can be a restored version of the input HDR RGB image in FIG. 1. In an alternative embodiment, the pixel values in the TM base image and the residual RGB values are added first.
The sum results are then multiplied by the luminance ratios to derive the HDR RGB image.
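The per-pixel reconstruction arithmetic just described can be sketched as follows. Function and parameter names here are assumptions for illustration; an 8-bit quantization and a linear de-quantization mapping are assumed:

```python
import numpy as np

def dequantize(q, lo, hi, levels=255):
    """Map quantized codes in [0, levels] back to the range [lo, hi]."""
    return lo + (np.asarray(q, dtype=float) / levels) * (hi - lo)

def reconstruct_hdr(base_rgb_linear, q_log_ratio, residual_rgb_linear,
                    log_lo, log_hi):
    """base_rgb_linear: gamma-decoded TM base image, shape (H, W, 3).
    q_log_ratio: quantized log luminance ratios, shape (H, W).
    residual_rgb_linear: residual values converted to linear RGB, (H, W, 3)."""
    # De-quant 214 followed by exp 216: recover per-pixel luminance ratios.
    ratio = np.exp(dequantize(q_log_ratio, log_lo, log_hi))
    # Per the text, the two orderings are equivalent: multiply each part by
    # the ratio and add (Mul 226 / Mul 222 / Add 228), or add first and
    # multiply the sum by the ratio.
    return (base_rgb_linear + residual_rgb_linear) * ratio[..., None]
```

With zero residuals the result reduces to base image times luminance ratio, i.e., the behavior of a ratio-image-only decoder.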
In yet another embodiment, White Balance 232 can be performed to effect a post-hoc white balance correction. FIG. 7 depicts one possible embodiment of processing HDR image data and adjusting the data to an appropriate white balance. At 702, the system inputs JPEG-HDR image data with any available sensor color space data and white balance multipliers. At 704, the system checks (for example, from the user, or perhaps from metadata embedded in the image) whether a new white point is assumed for the present image. If so, then new white balance multipliers are computed. Otherwise, the old white balance multipliers are used. The image data is transformed back to the sensor color space at 708. For each color channel, the current white balance multipliers (either the old ones or the recently calculated ones) are applied to the image data. If necessary, image values are clamped to the minimum of the maximum (i.e., min-max) of the channel values at 712. At 714, the image data is transformed into the monitor color space. After that, the image data is output at 716 as a restored HDR RGB output.
In one embodiment, the white balance operation of FIG. 1 and FIGURES 6A and/or 6B in the encoder and the white balance operation of FIG. 2 and FIG. 7 in the decoder can be implemented as paired operations, in which the encoder white balance operation works to effect the decoder's appropriate white balance correction. It should also be appreciated that the processing of FIG. 7 merely incorporates part of the features described in this section and that many other features and/or refinements can be added.
5. EXEMPLARY PROCESS FLOW
FIG. 3A illustrates an exemplary process flow according to a possible embodiment of the present invention. In some possible embodiments, one or more computing devices or components, such as an HDR image encoder (for example, as shown in FIG. 1), can perform this process flow. The HDR image encoder can be deployed by adding one or more new processing blocks to, and/or modifying one or more existing processing blocks in, a standards-based image encoder such as a JPEG image encoder.
In block 302, the HDR image encoder receives a high dynamic range (HDR) image. In an exemplary embodiment, the HDR image is one of a fixed point image or a floating point image.
In an exemplary embodiment, the HDR image is encoded in a JPEG, JPEG-2000, MPEG, AVI, TIFF, BMP, GIF, or other image format.
At block 304, the HDR image encoder also receives a mapped tone (TM) image that was generated based on the HDR image.
The TM image comprises one or more color changes that are not recoverable from the TM image with a luminance ratio image.
In an exemplary embodiment, at least one of the one or more color changes in the TM image is caused by a clipping (for example, in pixel values of R, G or B) or hue changes in one or more pixels.
In block 306, the HDR image encoder computes luminance ratio values, on an individual pixel basis, by dividing the luminance values of the HDR image by luminance values of the TM image on the individual pixel base.
In block 308, the HDR image encoder applies the luminance ratio values to the HDR image to create a remapped image.
In an exemplary embodiment, the HDR image encoder converts at least one of the remapped image and the TM image from one color space to a different color space.
In block 310, the HDR image encoder determines residual values in color channels of a color space based on the remapped image and the TM image.
If the original color is changed, at least one of the residual values is different from zero.
In an exemplary embodiment, the color space is a YCbCr color space; the color channels of the color space comprise a Cb color channel and a Cr color channel.
The residual values in the color channels of the color space are calculated as differences between first pixel values, as derived from the remapped image, in the color channels and second pixel values, as derived from the TM image, in the color channels.
In block 312, the HDR image encoder outputs a version of the TM image with HDR reconstruction data.
HDR reconstruction data are derived from luminance ratio values and residual color channel values.
In an exemplary modality, the HDR reconstruction data comprise a residual image with quantified values derived from the luminance ratio and residual values in the color channels of the color space.
The HDR reconstruction data can additionally comprise parameters that specify ranges of the quantified values.
In an exemplary mode, the HDR reconstruction data is stored in an application segment of an image file with the TM image as a base image in the image file.
In an exemplary embodiment, the image file is in a JPEG-HDR format.
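The encoder steps of blocks 302-312 can be sketched as follows. This is an illustrative sketch only: the Rec. 709 luma weights, the CbCr scale factors, and the epsilon guard for zero luminances are assumptions, and quantization and file packaging are omitted:

```python
import numpy as np

REC709_Y = np.array([0.2126, 0.7152, 0.0722])   # assumed luma weights

def rgb_to_ycbcr(rgb):
    """Illustrative BT.709-style RGB -> (Y, Cb, Cr) conversion."""
    y = rgb @ REC709_Y
    cb = (rgb[..., 2] - y) * 0.5389   # (B - Y) scaled
    cr = (rgb[..., 0] - y) * 0.6350   # (R - Y) scaled
    return y, cb, cr

def encode(hdr_rgb, tm_rgb, eps=1e-6):
    """hdr_rgb, tm_rgb: (H, W, 3) linear images; returns per-pixel
    luminance ratios and residual Cb / Cr values."""
    y_hdr = hdr_rgb @ REC709_Y
    # Zero TM luminances are guarded (cf. replacing zero values with small
    # values below a threshold) so the division is well defined.
    y_tm = np.maximum(tm_rgb @ REC709_Y, eps)
    ratio = y_hdr / y_tm                                     # block 306
    remapped = hdr_rgb / np.maximum(ratio, eps)[..., None]   # block 308
    _, cb_r, cr_r = rgb_to_ycbcr(remapped)
    _, cb_t, cr_t = rgb_to_ycbcr(tm_rgb)
    # Block 310: residuals are nonzero exactly where colors were changed.
    return ratio, cb_r - cb_t, cr_r - cr_t
```

When the TM image carries no color changes relative to the HDR image, the residuals come out zero, matching the statement that at least one residual is nonzero only if the original color is changed.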
In an exemplary embodiment, the HDR image encoder can perform one or more integrity checks on the HDR image, for example, before the HDR image is manipulated by a tone mapping operator (TMO) or a user.
In an exemplary embodiment, the HDR image encoder replaces zero, one, or more zero-valued color channel values in the TM image with values less than a threshold value.
This threshold value can be 1, 2, 3, 10, 11, etc., in various possible embodiments.
In an exemplary embodiment, any tone mapping operations with any TMO and / or any color changes to any number of pixels in the TM image can be performed in the process of generating the TM image.
In an exemplary embodiment, the HDR image encoder applies a color space conversion to at least one of the HDR image, the TM image or the remapped image.
In an exemplary modality, residual luminance values between the TM image and the remapped image are all zeros.
For example, in a color space (for example, YUV) with a luminance channel (for example, Y) and two color channels (for example, Cb and Cr), differences in luminance values between the TM image and the remapped image (for example, before or alternatively after a color space conversion into the color space) can all be zeros.
FIG. 3B illustrates an exemplary process flow according to a possible embodiment of the present invention.
In some possible modalities, one or more computing devices or components such as an HDR image decoder (for example, as shown in FIG. 2) can perform this process flow.
The HDR image decoder can be deployed by adding one or more new processing blocks to, and/or modifying one or more existing processing blocks in, a standards-based image decoder such as a JPEG image decoder.
In block 322, the HDR image decoder analyzes an image file that comprises a mapped tone (TM) image and HDR reconstruction data.
In an exemplary embodiment, the TM base image comprises the results of any tone mapping operations with any tone mapping operator and/or any color changes in any number of pixels.
In an exemplary embodiment, the HDR reconstruction data comprise quantized luminance ratio values (for example, in a Y channel) and quantized residual values in color channels (for example, Cb and Cr channels) of a color space (for example, YUV). The TM base image comprises one or more color changes that are not recoverable from the TM base image with a luminance ratio image.
In an exemplary embodiment, the image file is encoded in one of JPEG, JPEG-2000, MPEG, AVI, TIFF, BMP, GIF, or another image format. In an exemplary embodiment, the image file is parsed with a standards-based image decoder, for example, a JPEG decoder.
In block 324, the HDR image decoder extracts quantization parameters that refer to the quantized luminance ratio values and the quantized residual values in the color channels of the color space.
In block 326, the HDR image decoder converts, based at least in part on the quantization parameters, the quantized luminance ratio values and the quantized residual values into luminance ratio values and residual values in the color channels of the color space. In an exemplary embodiment, the quantized luminance ratios and the quantized residual values are stored in a residual image. In an exemplary embodiment, the residual image and the TM base image are de-quantized and decompressed using a common procedure.
In block 328, the HDR image decoder reconstructs an HDR image using the TM base image and the luminance ratio values and residual values in the color channels of the color space. The HDR image can be either a fixed point image or a floating point image. In an exemplary embodiment, the HDR image decoder performs at least one of a color space conversion, gamma encoding, gamma decoding, resolution reduction, or resolution increase on at least one of the TM base image, the residual image, the HDR image, or an intermediate image.
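The role of the quantization parameters in blocks 324-326 can be illustrated with a round-trip sketch. The [lo, hi] range parameters and 8-bit code assumption are hypothetical choices for illustration, not values mandated by the format:

```python
import numpy as np

def quantize(values, lo, hi, levels=255):
    """Encoder side: map values in [lo, hi] to integer codes [0, levels]."""
    v = np.clip((np.asarray(values, dtype=float) - lo) / (hi - lo), 0.0, 1.0)
    return np.round(v * levels).astype(np.uint8)

def dequantize(codes, lo, hi, levels=255):
    """Decoder side (block 326): map codes back using the stored range."""
    return lo + (codes.astype(float) / levels) * (hi - lo)

ratios = np.array([0.25, 1.0, 7.5])
lo, hi = ratios.min(), ratios.max()   # quantization parameters written by the encoder
codes = quantize(ratios, lo, hi)
restored = dequantize(codes, lo, hi)
# The round trip is exact only up to the quantization step size.
assert np.allclose(restored, ratios, atol=(hi - lo) / 255)
```

Without the stored range parameters, the decoder could not map the integer codes back to meaningful luminance ratio or residual values, which is why block 324 extracts them before block 326 runs.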
6. IMPLEMENTATION MECHANISMS - HARDWARE OVERVIEW
According to one embodiment, the techniques described in this document are implemented by one or more special-purpose computing devices. Special-purpose computing devices can be hard-wired to perform the techniques, or may include digital electronic devices such as one or more application-specific integrated circuits (ASICs) or field programmable gate arrays (FPGAs) that are persistently programmed to perform the techniques, or may include one or more general-purpose hardware processors programmed to perform the techniques as per program instructions in firmware, memory, other storage, or a combination.
Such special-purpose computing devices can also combine custom hard-wired logic, ASICs, or FPGAs with custom programming to perform the techniques.
Special-purpose computing devices can be desktop computer systems, portable computer systems, handheld devices, network devices, or any other device that incorporates hard-wired and/or program logic to implement the techniques.
For example, FIG. 4 is a block diagram illustrating a computer system 400 by which a modality of the invention can be implemented.
Computer system 400 includes a bus 402 or other communication mechanism for communicating information, and a hardware processor 404 coupled to bus 402 for processing information.
Hardware processor 404 can be, for example, a general purpose microprocessor.
Computer system 400 also includes a main memory 406, such as a random access memory (RAM) or other dynamic storage device, coupled to bus 402 for storing information and instructions to be executed by processor 404. Main memory 406 can also be used to store temporary variables or other intermediate information during execution of instructions to be executed by processor 404. Such instructions, when stored in a non-transitory storage medium accessible to processor 404, render computer system 400 into a special-purpose machine that is customized to perform the operations specified in the instructions.
Computer system 400 additionally includes a read-only memory (ROM) 408 or other static storage device coupled to bus 402 for storing static information and instructions for processor 404. A storage device 410, such as a magnetic disk or optical disc, is provided and coupled to bus 402 for storing information and instructions.
Computer system 400 can be coupled via bus 402 to a display 412, such as a liquid crystal display, for displaying information to a computer user. An input device 414, including alphanumeric and other keys, is coupled to bus 402 for communicating information and command selections to processor 404. Another type of user input device is cursor control 416, such as a mouse, a trackball-type pointing device, or cursor direction keys for communicating direction information and command selections to processor 404 and for controlling cursor movement on display 412. This input device typically has two degrees of freedom on two axes, a first axis (for example, x) and a second axis (for example, y), which allow the device to specify positions on a plane.
Computer system 400 can implement the techniques described in this document using custom hard-wired logic, one or more ASICs or FPGAs, firmware, and/or program logic which, in combination with the computer system, causes or programs computer system 400 to be a special-purpose machine. According to one embodiment, the techniques in this document are performed by computer system 400 in response to processor 404 executing one or more sequences of one or more instructions contained in main memory 406. Such instructions can be read into main memory 406 from another storage medium, such as storage device 410. Execution of the instruction sequences contained in main memory 406 causes processor 404 to perform the process steps described in this document. In alternative embodiments, hard-wired circuitry can be used in place of, or in combination with, software instructions.
As used in this document, the term "storage medium" refers to any non-transitory medium that stores data and / or instructions that cause a machine to operate in a specific manner.
Such a storage medium may comprise non-volatile and / or volatile media.
Non-volatile media include, for example, optical or magnetic disks, such as storage device 410. Volatile media include dynamic memory, such as main memory 406. Common forms of storage media include, for example, a floppy disk, a flexible disk, a hard disk, a solid state drive, magnetic tape or any other magnetic data storage medium, a CD-ROM, any other optical data storage medium, any physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, NVRAM, and any other memory chip or cartridge.
The storage medium is distinct from, but can be used in conjunction with, the transmission medium.
The transmission medium participates in transferring information between storage media.
For example, transmission media include coaxial cables, copper wire, and optical fibers, including the wires that comprise bus 402. Transmission media can also take the form of acoustic or light waves, such as those generated during radio wave or infrared data communications.
Various forms of medium can be involved in carrying one or more sequences of one or more instructions to the 404 processor for execution.
For example, instructions can be carried initially on a magnetic disk or solid state drive on a remote computer.
The remote computer can load the instructions in its dynamic memory and send the instructions over a telephone line using a modem.
A modem local to computer system 400 can receive the data on the telephone line and use an infrared transmitter to convert the data into an infrared signal.
An infrared detector can receive the data carried in the infrared signal, and appropriate circuitry can place the data on bus 402. Bus 402 carries the data to main memory 406, from which processor 404 retrieves and executes the instructions.
The instructions received by main memory 406 can optionally be stored on storage device 410, either before or after execution by processor 404. Computer system 400 also includes a communication interface 418 coupled to bus 402. Communication interface 418 provides a two-way data communication coupling to a network link 420 that is connected to a local network 422. For example, communication interface 418 can be an integrated services digital network (ISDN) card, a cable modem, a satellite modem, or a modem to provide a data communication connection to a corresponding type of telephone line.
As another example, communication interface 418 can be a local area network (LAN) card to provide a data communication connection to a compatible LAN.
Wireless connections can also be deployed.
In any such deployment, communication interface 418 sends and receives electrical, electromagnetic or optical signals that carry digital data streams that represent various types of information.
The network link 420 typically provides data communication across one or more networks to other data devices.
For example, network link 420 may provide a connection through local network 422 to a host computer 424 or to data equipment operated by an Internet Service Provider (ISP) 426. ISP 426, in turn, provides data communication services through the worldwide packet data communication network now commonly referred to as the "Internet" 428. Local network 422 and Internet 428 both use electrical, electromagnetic, or optical signals that carry digital data streams.
The signals through the various networks, and the signals on network link 420 and through communication interface 418, which carry the digital data to and from computer system 400, are exemplary forms of transmission media. Computer system 400 can send messages and receive data, including program code, through the network(s), network link 420, and communication interface 418. In the Internet example, a server 430 can transmit a requested code for an application program through Internet 428, ISP 426, local network 422, and communication interface 418. The received code can be executed by processor 404 as it is received, and/or stored in storage device 410 or other non-volatile storage for later execution.
7. EQUIVALENTS, EXTENSIONS, ALTERNATIVES AND MISCELLANEOUS
In the foregoing specification, possible embodiments of the invention have been described with reference to numerous specific details that may vary from implementation to implementation. Thus, the sole and exclusive indicator of what the invention is, and what is intended by the applicants to be the invention, is the set of claims that issues from this application, in the specific form in which such claims issue, including any subsequent correction. Any definitions expressly set forth in this document for terms contained in such claims shall govern the meaning of such terms as used in the claims. Hence, no limitation, element, property, feature, advantage, or attribute that is not expressly recited in a claim should limit the scope of such claim in any way. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.
For the purpose of illustration, it has been described that in some possible embodiments, HDR reconstruction data comprise quantized luminance ratios and quantized residual Cb and Cr values. In some possible embodiments, HDR reconstruction data may comprise unquantized luminance ratios and/or unquantized residual Cb and Cr values, which may be, for example, floating point or fixed point values.
For example, one or more application segments in an image file can store these unquantified values.
An HDR image decoder under the techniques in this document can parse the image file and recover the unquantized values. These unquantized values can be used in combination with a mapped tone base image extracted from the image file to reconstruct an HDR image.
For the purpose of illustration, it has been described that in some possible embodiments, possible pre-processing may include resolution reduction.
In some possible embodiments, pre-processing in this document may avoid reducing resolution, for the purpose of maintaining the image details and/or color accuracy of HDR images processed by the techniques in this document.
For example, image encoding of a residual HUV image can be performed by a JPEG image encoder with a mode that avoids downsampling.
For the purpose of illustration, it has been described that in some possible modalities, a JPEG image file format and / or JPEG codec can be used in an HDR image encoder and / or decoder.
For the purpose of the present invention, an image codec other than a JPEG codec can be used in an HDR image encoder and / or decoder.
For the purpose of illustration, it has been described that in some possible modalities, an HDR input image and a mapped tone base image are RGB images.
For the purpose of the present invention, other types of images can be used to store an HDR image and a mapped tone base image in the present document.
For example, an HDR input image in a YUV color space can be used instead of in an RGB color space.
Zero, one or more color space conversions in the HDR image encoding or decoding process can be implemented in terms of techniques, as described in this document.
For the purpose of illustration, it has been described that in some possible modalities, an HUV (or YUV) file in a YCbCr color space can be used to store luminance ratios and different residual values of luminance ratios.
For the purpose of the present invention, other types of color spaces and other types of image files can be used to store information equivalent to luminance ratios and residual values.
For example, luminance ratios and residual values can be converted to a different color space than the YCbCr color space.
Similarly, an image file other than the YUV file can be used to store values converted from luminance ratios and residual values.
In some possible embodiments, reversible transformations can be used to perform color space conversion or pixel value conversions in terms of techniques, as described in this document.
In some possible embodiments, an image file that comprises a mapped tone base image with luminance ratios and residual Cb and Cr values has a file size similar to that of another image file that comprises a mapped tone base image with luminance ratios but without residual Cb and Cr values.
In a particular embodiment, images with residual Cb and Cr values are on average only 10% larger than counterpart images without residual Cb and Cr values.
For the purpose of illustration only, it has been described that a mapped tone image is generated by a single TMO.
For the purpose of the present invention, more than one TMO may be used together to generate a mapped tone image, as described in this document.
A detailed description of one or more embodiments of the invention, read along with the accompanying figures that illustrate the principles of the invention, has been provided.
It is to be appreciated that the invention is described in connection with such embodiments, but the invention is not limited to any embodiment.
The scope of the invention is limited only by the claims, and the invention encompasses numerous alternatives, modifications, and equivalents.
Numerous specific details have been set forth in this description in order to provide a thorough understanding of the invention.
These details are provided for the purpose of example, and the invention can be practiced according to the claims without some or all of these specific details.
For the sake of clarity, technical material that is known in the fields of art related to the invention has not been described in detail, so that the invention is not unnecessarily obscured.
Claims (30)
[1]
1. Method comprising the steps of: receiving a high dynamic range (HDR) image; receiving a mapped tone (TM) image generated based on the HDR image, the TM image comprising one or more color changes that are not recoverable from the TM image with a luminance ratio image; computing luminance ratio values, on an individual pixel basis, by dividing luminance values of the HDR image by luminance values of the TM image on the individual pixel basis; applying the luminance ratio values to the HDR image to create a remapped image; determining residual values in color channels of a color space based on the remapped image and the TM image; and outputting the TM image with HDR reconstruction data, wherein the HDR reconstruction data are derived from the luminance ratio values and the residual values.
[2]
2. Method according to claim 1, further comprising converting at least one of the remapped image and the TM image from a different color space to the residual color space.
[3]
3. Method according to claim 1, wherein the color space is a YCbCr color space, and wherein the color channels of the color space comprise a Cb color channel and a Cr color channel.
[4]
4. Method according to claim 3, wherein the residual values in the color channels of the color space are calculated as differences between first pixel values, as derived from the remapped image, in the color channels and second pixel values, as derived from the TM image, in the color channels.
[5]
5. Method according to claim 1, wherein at least one of the one or more color changes in the TM image is caused by one of clippings or hue changes in one or more pixels.
[6]
6. Method according to claim 1, wherein the HDR reconstruction data comprises a residual image with quantized values derived from the luminance ratio values and the residual values in the color channels of the color space.
[7]
7. Method according to claim 1, wherein the HDR reconstruction data is stored in an application segment of an image file with the TM image as a base image in the image file.
[8]
8. Method according to claim 1, further comprising creating an image file, wherein the image file is in a JPEG-HDR format.
[9]
9. Method according to claim 1, further comprising allowing any tone mapping operations with any tone mapping operator and/or any color changes in any number of pixels in the TM image.
[10]
10. Method according to claim 1, further comprising replacing zero, one, or more zero color channel values in the TM image with values less than a threshold value.
[11]
11. Method according to claim 1, further comprising applying a color space conversion to at least one of the HDR image, the TM image, or the remapped image.
[12]
12. The method of claim 1, wherein the residual luminance values between the TM image and the remapped image are all zeros.
[13]
13. The method of claim 1, wherein the HDR image is one of a fixed point image or a floating point image.
[14]
14. Method according to claim 1, further comprising performing a pre-processing operation required by a tone mapping operator on the HDR image, wherein the pre-processing operation is one of a color space conversion, a gamma encoding, a gamma decoding, a reduction in resolution, or an increase in resolution.
[15]
15. Method according to claim 1, further comprising receiving information that refers to gamma encoding performed on the TM image, and performing an inverse gamma encoding operation on the TM image to generate a TM image before determining the residual values in the color channels of the color space.
[16]
16. Method according to claim 1, wherein the HDR image is encoded in one of JPEG, JPEG-2000, MPEG, AVI, TIFF, BMP, GIFF, or other image format.
[17]
17. Method that comprises the steps of: analyzing an image file that comprises a base mapped tone (TM) image and HDR reconstruction data, the HDR reconstruction data comprising quantized luminance ratio values and quantized residual values in color channels of a color space, the base TM image comprising one or more color changes that are not recoverable from the base TM image with a luminance ratio image; extracting quantization parameters that refer to the quantized luminance ratio values and the quantized residual values in the color channels of the color space; converting, based at least in part on the quantization parameters, the quantized luminance ratio values and the quantized residual values into luminance ratio values and residual values in the color channels of the color space; and reconstructing an HDR image using the base TM image and the luminance ratio values and residual values in the color channels of the color space.
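The conversion of quantized values back into luminance ratios using extracted quantization parameters can be sketched as follows. The log-domain 8-bit mapping and the parameter names (`log_min`, `log_max`) are assumptions for illustration, not the format actually specified by the claim:

```python
import numpy as np

def quantize_log_ratio(ratio, log_min, log_max):
    """Encoder side (for context): map ratios to 8-bit codes in the log domain."""
    t = (np.log(ratio) - log_min) / (log_max - log_min)
    return np.clip(np.round(t * 255.0), 0, 255).astype(np.uint8)

def dequantize_log_ratio(code, log_min, log_max):
    """Decoder side: recover luminance ratios from codes using the
    quantization parameters extracted from the image file."""
    t = code.astype(np.float64) / 255.0
    return np.exp(log_min + t * (log_max - log_min))
```

A decoder that has parsed the quantization parameters from the image file can invert the quantization to within half a code step, after which the base TM image, the ratios, and the residuals yield the reconstructed HDR image.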
[18]
18. Method according to claim 17, wherein the image file is analyzed with a standards-based image decoder.
[19]
19. Method according to claim 17, wherein the quantized luminance ratios and the quantized residual values are stored in a residual image, and wherein the residual image and the base TM image are dequantized and decompressed using a common procedure.
[20]
20. The method of claim 17, wherein the base TM image comprises results of any tone mapping operations with any tone mapping operator and/or any color changes at any number of pixels.
[21]
21. The method of claim 17, wherein the HDR image is one of a fixed point image or a floating point image.
[22]
22. Method for performing white balance correction on initial image data, the initial image data comprising mapped tone (TM) image data and residual image data, the residual image data further comprising high dynamic range (HDR) image data, effective linear sensor values, and white balance multipliers, the method comprising the following steps: retrieving the effective linear sensor values and white balance multipliers from the residual image data; producing sensor color space values from the effective linear sensor values and the initial image data to form first intermediate image data; for each color channel, applying the white balance multipliers to the first intermediate image data to produce second intermediate image data; clamping the second intermediate image data to a minimum of maximum color channel values, if HDR image data is not available from the initial image data, to form third intermediate image data; if clamping was not performed, transforming the second intermediate image data into a target monitor color space to form final image data; and if clamping was performed, transforming the third intermediate image data into a target monitor color space to form the final image data.
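A minimal sketch of the per-channel white balance and clamping steps, assuming a 12-bit sensor (maximum value 4095, a figure taken from the drawing text); the function and parameter names are hypothetical, and the clamp limit is interpreted here as the minimum over channels of multiplier times sensor maximum:

```python
import numpy as np

def white_balance_and_clamp(sensor_rgb, multipliers, sensor_max=4095.0,
                            have_hdr=False):
    """Apply white balance multipliers per channel; clamp when HDR data
    is unavailable so no channel exceeds what every channel could reach."""
    m = np.asarray(multipliers, dtype=np.float64)
    out = sensor_rgb.astype(np.float64) * m          # second intermediate data
    if not have_hdr:
        # Clamp to the minimum of the maximum color channel values
        # (assumed: min over channels of multiplier * sensor_max).
        out = np.minimum(out, float(np.min(m) * sensor_max))  # third intermediate data
    return out
```

Without HDR data, a saturated pixel stays neutral after white balancing (all channels hit the common clamp limit); with HDR data available, the scaled values pass through unclamped.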
[23]
23. The method of claim 22, wherein the initial image data is formatted in JPEG-HDR format.
[24]
24. The method of claim 23, wherein the initial image data comprises an upper sensor range for the HDR captured in each of the RGB color channels.
[25]
25. The method of claim 24, wherein the upper sensor range comprises white balance multipliers and a matrix, the matrix further comprising effective linear sensor values.
[26]
26. The method of claim 25, wherein the final image data is inserted into an HDR path of an image encoder.
[27]
27. Method for processing initial image data, comprising the steps of: inputting the initial image data; inputting sensor color space data and white balance multipliers; if a new white point for the image data is presumed, computing new white balance multipliers to produce current white balance multipliers; for each pixel in the image data, transforming the initial image data back into the sensor color space to form first intermediate image data; for each color channel in the intermediate image data, applying the current white balance multipliers to produce second intermediate image data; clamping the second intermediate image data to the minimum of the maximum color channel values to produce third intermediate image data; and transforming the third intermediate image data into a desired monitor color space to produce fourth intermediate image data.
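When a new white point is presumed, new white balance multipliers can be computed from the sensor response to that white. The green-normalized convention below is a common choice and an assumption for illustration, not something mandated by the claim:

```python
import numpy as np

def new_white_balance_multipliers(sensor_white_rgb):
    """Multipliers that make the presumed white point neutral: scale each
    channel so the white response is equalized (normalized to green)."""
    w = np.asarray(sensor_white_rgb, dtype=np.float64)
    return w[1] / w
```

Applying these multipliers to the white response yields equal channel values, after which the claimed per-pixel transform, clamp, and monitor color space conversion proceed as before.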
[28]
28. The method of claim 27, wherein the fourth intermediate image data is input to a tone mapping operator.
[29]
29. The method of claim 27, wherein the initial image data is captured by an image capture device and, in addition, wherein camera RAW data from the image capture device is not available.
[30]
30. The method of claim 29, wherein the image capture device is capable of capturing image data in a plurality of color channels and, in addition, wherein at least two different color channels saturate at different luminance values along a black-to-white gradient.
[Figure residue, sheets 1/7 and 2/7: encoder block diagram (input HDR RGB, negative luminance check, white balance, ratio + residual, black mod, desaturation, base TM, optional APP segment) and decoder block diagram (image analyzer, base image decoding, resaturation, optional white balance, CbCr residual, restored HDR RGB)]
[Flowchart: encoding method]
Receive a high dynamic range (HDR) image
Receive a mapped tone (TM) image generated based on the HDR image
Compute luminance ratio values, on an individual pixel basis, by dividing the luminance values of the HDR image by the luminance values of the TM image
Apply the luminance ratio values to the HDR image to create a remapped image
Determine residual values in color channels of a color space based on the remapped image and the TM image
Output the TM image with HDR reconstruction data, the HDR reconstruction data being derived from the luminance ratio values and the residual values
[Flowchart: decoding method]
Analyze an image file that comprises a base mapped tone (TM) image and HDR reconstruction data comprising quantized luminance ratio values and quantized residual values in color channels of a color space, the base TM image comprising one or more color changes that are not recoverable from the base TM image with a luminance ratio image
Extract quantization parameters with respect to the quantized luminance ratio values and the quantized residual values in the color channels of the color space
Convert, based at least in part on the quantization parameters, the quantized luminance ratio values and the quantized residual values into luminance ratio values and residual values in the color channels of the color space
Reconstruct an HDR image using the base TM image and the luminance ratio values and residual values in the color channels of the color space
[Figure residue, sheet 4/7: computer system block diagram (processor, main memory, storage device, server, main display, input device, cursor control, bus, network interface, communication link, local area network, host)]
[Figure residue, sheets 5/7 and 6/7: white balance block diagrams (luminance values, red/green/blue white balance multipliers, camera sensor color space transformation, matrix transformation, clamp to minimum of maximum sensor value = 4095, branch on availability of HDR data)]
[Flowchart: white balance processing]
Input JPEG-HDR data with sensor color space data and white balance multipliers
If a new white point is presumed: compute new white balance multipliers; otherwise keep the current multipliers
For each pixel, transform image data back into the sensor color space
For each color channel, apply the current white balance multipliers
Clamp values to the minimum of the maximum color channel values
Transform image data into the monitor color space
Output image data
Similar documents:
Publication number | Publication date | Patent title
BR112013026191A2|2020-11-03|encode, decode, and represent high dynamic range images
US10264259B2|2019-04-16|Encoding, decoding, and representing high dynamic range images
US10992936B2|2021-04-27|Encoding, decoding, and representing high dynamic range images
Patent family:
Publication number | Publication date
TW201624999A|2016-07-01|
KR20150098245A|2015-08-27|
KR101632596B1|2016-06-22|
US8248486B1|2012-08-21|
EP2697962A2|2014-02-19|
KR102029670B1|2019-10-08|
JP5762615B2|2015-08-12|
CN103503429A|2014-01-08|
HK1188354A1|2014-04-25|
CA2830678C|2018-05-22|
JP6469631B2|2019-02-13|
EP3376749B1|2021-09-15|
KR101552479B1|2015-09-11|
TWI690211B|2020-04-01|
CA2830678A1|2012-10-18|
EP3166298A1|2017-05-10|
TWI580275B|2017-04-21|
KR20150085100A|2015-07-22|
IL259069A|2019-03-31|
IL228761A|2016-08-31|
KR101684334B1|2016-12-08|
IL238138A|2018-05-31|
JP2017050028A|2017-03-09|
JP2015062291A|2015-04-02|
RU2589857C2|2016-07-10|
KR101825347B1|2018-02-05|
JP2015084546A|2015-04-30|
CN103888743B|2016-05-25|
TWI624182B|2018-05-11|
ES2686980T3|2018-10-23|
TW201526658A|2015-07-01|
KR101695633B1|2017-01-13|
JP5893708B2|2016-03-23|
WO2012142589A2|2012-10-18|
CA3000434A1|2012-10-18|
KR101939135B1|2019-01-17|
KR20160072274A|2016-06-22|
JP5661218B2|2015-01-28|
KR20190009412A|2019-01-28|
KR20150056874A|2015-05-27|
KR20190114026A|2019-10-08|
KR102128233B1|2020-06-30|
RU2013150934A|2015-05-20|
JP2014510988A|2014-05-01|
TWI521973B|2016-02-11|
KR20130141676A|2013-12-26|
IL228761D0|2013-12-31|
RU2640717C1|2018-01-11|
EP2697962A4|2014-11-19|
CN103503429B|2014-11-19|
WO2012142589A3|2013-01-10|
EP2697962B1|2016-12-21|
TWI513327B|2015-12-11|
CN103888743A|2014-06-25|
CA3000434C|2020-12-15|
KR101751226B1|2017-06-27|
TW201304558A|2013-01-16|
KR20180014233A|2018-02-07|
TW201720143A|2017-06-01|
KR20140140655A|2014-12-09|
TW201820861A|2018-06-01|
EP3166298B1|2018-07-18|
EP3376749A1|2018-09-19|
JP2016129396A|2016-07-14|
JP6058183B2|2017-01-11|
Cited references:
Publication number | Application date | Publication date | Applicant | Patent title

US4649568A|1984-10-22|1987-03-10|Polaroid Corporation|Reconstitution of images|
JP2663189B2|1990-01-29|1997-10-15|富士写真フイルム株式会社|Image dynamic range compression processing method|
US5414527A|1991-08-14|1995-05-09|Fuji Xerox Co., Ltd.|Image encoding apparatus sensitive to tone variations|
JP3222577B2|1992-09-16|2001-10-29|古河電気工業株式会社|Aluminum alloy fin material for heat exchanger|
US5621660A|1995-04-18|1997-04-15|Sun Microsystems, Inc.|Software-based encoder for a software-implemented end-to-end scalable video delivery system|
US5742892A|1995-04-18|1998-04-21|Sun Microsystems, Inc.|Decoder for a software-implemented end-to-end scalable video delivery system|
US6282313B1|1998-09-28|2001-08-28|Eastman Kodak Company|Using a set of residual images to represent an extended color gamut digital image|
US6335983B1|1998-09-28|2002-01-01|Eastman Kodak Company|Representing an extended color gamut digital image in a limited color gamut color space|
US6282312B1|1998-09-28|2001-08-28|Eastman Kodak Company|System using one or more residual image to represent an extended color gamut digital image|
US6285784B1|1998-09-28|2001-09-04|Eastman Kodak Company|Method of applying manipulations to an extended color gamut digital image|
US6282311B1|1998-09-28|2001-08-28|Eastman Kodak Company|Using a residual image to represent an extended color gamut digital image|
US6301393B1|2000-01-21|2001-10-09|Eastman Kodak Company|Using a residual image formed from a clipped limited color gamut digital image to represent an extended color gamut digital image|
US6748106B1|2000-03-28|2004-06-08|Eastman Kodak Company|Method for representing an extended color gamut digital image on a hard-copy output medium|
US6738427B2|2000-09-15|2004-05-18|International Business Machines Corporation|System and method of processing MPEG streams for timecode packet insertion|
EP1816603A1|2000-11-30|2007-08-08|Canon Kabushiki Kaisha|Image processing device, image processing method, storage medium, and program|
US6606418B2|2001-01-16|2003-08-12|International Business Machines Corporation|Enhanced compression of documents|
JP3966117B2|2001-09-06|2007-08-29|富士ゼロックス株式会社|Image processing apparatus, image encoding apparatus, image printing apparatus, and methods thereof|
JP2004029639A|2002-06-28|2004-01-29|Canon Inc|Method for reducing the number of bits|
US7260265B2|2002-10-04|2007-08-21|International Business Machines Corporation|Enhancing compression while transcoding JPEG images|
KR20060108709A|2003-11-12|2006-10-18|콸콤 인코포레이티드|High data rate interface with improved link control|
US7492375B2|2003-11-14|2009-02-17|Microsoft Corporation|High dynamic range image viewing on low dynamic range displays|
US8218625B2|2004-04-23|2012-07-10|Dolby Laboratories Licensing Corporation|Encoding, decoding and representing high dynamic range images|
US20050259729A1|2004-05-21|2005-11-24|Shijun Sun|Video coding with quality scalability|
JP2006019847A|2004-06-30|2006-01-19|Fuji Photo Film Co Ltd|Image processor, processing method and program|
KR100657268B1|2004-07-15|2006-12-14|학교법인 대양학원|Scalable encoding and decoding method of color video, and apparatus thereof|
JP2006157453A|2004-11-29|2006-06-15|Sony Corp|Image processing apparatus, image processing method, and imaging apparatus|
ES2551561T3|2006-01-23|2015-11-19|MAX-PLANCK-Gesellschaft zur Förderung der Wissenschaften e.V.|High dynamic range codecs|
US8014445B2|2006-02-24|2011-09-06|Sharp Laboratories Of America, Inc.|Methods and systems for high dynamic range video coding|
EP1871113A1|2006-06-20|2007-12-26|THOMSON Licensing|Method and apparatus for encoding video enhancement layer with multiresolution color scalability|
JP4991851B2|2006-07-17|2012-08-01|トムソンライセンシング|Method and apparatus for encoding video color enhancement data and method and apparatus for decoding video color enhancement data|
WO2008043198A1|2006-09-30|2008-04-17|Thomson Licensing|Method and device for encoding and decoding color enhancement layer for video|
CA2570090C|2006-12-06|2014-08-19|Brightside Technologies Inc.|Representing and reconstructing high dynamic range images|
US8237865B2|2006-12-18|2012-08-07|Emanuele Salvucci|Multi-compatible low and high dynamic range and high bit-depth texture and video encoding system|
US7826673B2|2007-01-23|2010-11-02|Sharp Laboratories Of America, Inc.|Methods and systems for inter-layer image prediction with color-conversion|
TW200845723A|2007-04-23|2008-11-16|Thomson Licensing|Method and apparatus for encoding video data, method and apparatus for decoding encoded video data and encoded video signal|
US7961983B2|2007-07-18|2011-06-14|Microsoft Corporation|Generating gigapixel images|
US7940311B2|2007-10-03|2011-05-10|Nokia Corporation|Multi-exposure pattern for enhancing dynamic range of images|
US8175158B2|2008-01-04|2012-05-08|Sharp Laboratories Of America, Inc.|Methods and systems for inter-layer image prediction parameter determination|
JP4544308B2|2008-01-11|2010-09-15|ソニー株式会社|Image processing apparatus, imaging apparatus, method, and program|
US8953673B2|2008-02-29|2015-02-10|Microsoft Corporation|Scalable video coding and decoding with sample bit depth and chroma high-pass residual layers|
JP2009224971A|2008-03-14|2009-10-01|Omron Corp|Image processing device|
JP5395500B2|2008-07-22|2014-01-22|キヤノン株式会社|Measuring apparatus and image forming apparatus|
US8373718B2|2008-12-10|2013-02-12|Nvidia Corporation|Method and system for color enhancement with color volume adjustment and variable shift along luminance axis|
KR101007101B1|2009-01-07|2011-01-10|한양대학교 산학협력단|Adaptive tone mapping apparatus and method, and image processing system using the method|
US8406569B2|2009-01-19|2013-03-26|Sharp Laboratories Of America, Inc.|Methods and systems for enhanced dynamic range images and video from multiple exposures|
US8290295B2|2009-03-03|2012-10-16|Microsoft Corporation|Multi-modal tone-mapping of images|
WO2010105036A1|2009-03-13|2010-09-16|Dolby Laboratories Licensing Corporation|Layered compression of high dynamic range, visual dynamic range, and wide color gamut video|
CN101626454B|2009-04-10|2011-01-05|黄宝华|Method for intensifying video visibility|
US8525900B2|2009-04-23|2013-09-03|Csr Technology Inc.|Multiple exposure high dynamic range image capture|
US8766999B2|2010-05-20|2014-07-01|Aptina Imaging Corporation|Systems and methods for local tone mapping of high dynamic range images|
CN102986214A|2010-07-06|2013-03-20|皇家飞利浦电子股份有限公司|Generation of high dynamic range images from low dynamic range images|
TWI580275B|2011-04-15|2017-04-21|杜比實驗室特許公司|Encoding, decoding, and representing high dynamic range images|
US8891863B2|2011-06-13|2014-11-18|Dolby Laboratories Licensing Corporation|High dynamic range, backwards-compatible, digital cinema|
JP5383360B2|2009-07-15|2014-01-08|キヤノン株式会社|Image processing apparatus and control method thereof|
TWI580275B|2011-04-15|2017-04-21|杜比實驗室特許公司|Encoding, decoding, and representing high dynamic range images|
US8891863B2|2011-06-13|2014-11-18|Dolby Laboratories Licensing Corporation|High dynamic range, backwards-compatible, digital cinema|
KR101845231B1|2011-06-14|2018-04-04|삼성전자주식회사|Image processing apparatus and method|
TWI523500B|2012-06-29|2016-02-21|私立淡江大學|Dynamic range compression method for image and image processing device|
US9489706B2|2012-07-02|2016-11-08|Qualcomm Technologies, Inc.|Device and algorithm for capturing high dynamic rangevideo|
KR101970122B1|2012-08-08|2019-04-19|돌비 레버러토리즈 라이쎈싱 코오포레이션|Image processing for hdr images|
CN109064433A|2013-02-21|2018-12-21|皇家飞利浦有限公司|Improved HDR image coding and decoding methods and equipment|
TWI711310B|2013-06-21|2020-11-21|日商新力股份有限公司|Transmission device, high dynamic range image data transmission method, reception device, high dynamic range image data reception method and program|
TWI676389B|2013-07-15|2019-11-01|美商內數位Vc專利控股股份有限公司|Method for encoding and method for decoding a colour transform and corresponding devices|
US10218917B2|2013-07-16|2019-02-26|Koninklijke Philips N.V.|Method and apparatus to create an EOTF function for a universal code mapping for an HDR image, method and process to use these images|
US9264683B2|2013-09-03|2016-02-16|Sony Corporation|Decoding device and decoding method, encoding device, and encoding method|
TWI646828B|2013-09-03|2019-01-01|日商新力股份有限公司|Decoding device and decoding method, encoding device and encoding method|
US9036908B2|2013-09-30|2015-05-19|Apple Inc.|Backwards compatible extended image format|
CN109889843A|2014-01-07|2019-06-14|杜比实验室特许公司|Technology for being encoded, being decoded and being indicated to high dynamic range images|
CN103747225B|2014-01-23|2015-11-18|福州大学|Based on the high dynamic range images double-screen display method of color space conversion|
EP3111644A1|2014-02-25|2017-01-04|Apple Inc.|Adaptive transfer function for video encoding and decoding|
WO2015180854A1|2014-05-28|2015-12-03|Koninklijke Philips N.V.|Methods and apparatuses for encoding an hdr images, and methods and apparatuses for use of such encoded images|
US20150350641A1|2014-05-29|2015-12-03|Apple Inc.|Dynamic range adaptive video coding system|
JP5948619B2|2014-06-10|2016-07-06|パナソニックIpマネジメント株式会社|Display system, display method, and display device|
EP2958330A1|2014-06-20|2015-12-23|Thomson Licensing|Method and device for decoding a HDR picture from a bitstream representing a LDR picture and an illumination picture|
WO2015193117A1|2014-06-20|2015-12-23|Thomson Licensing|Method and device for decoding a hdr picture from a bitstream representing a ldr picture and an illumination picture|
WO2015193116A1|2014-06-20|2015-12-23|Thomson Licensing|Method and device for decoding a hdr picture from a bitstream representing a ldr picture and an illumination picture|
EP2958328A1|2014-06-20|2015-12-23|Thomson Licensing|Method and device for signaling in a bitstream a picture/video format of an LDR picture and a picture/video format of a decoded HDR picture obtained from said LDR picture and an illumination picture|
CN105493490B|2014-06-23|2019-11-29|松下知识产权经营株式会社|Transform method and converting means|
EP2961168A1|2014-06-27|2015-12-30|Thomson Licensing|Method and apparatus for predicting image samples for encoding or decoding|
CN107005720B|2014-08-08|2020-03-06|皇家飞利浦有限公司|Method and apparatus for encoding HDR images|
US10277771B1|2014-08-21|2019-04-30|Oliver Markus Haynold|Floating-point camera|
JP6331882B2|2014-08-28|2018-05-30|ソニー株式会社|Transmitting apparatus, transmitting method, receiving apparatus, and receiving method|
JP6194427B2|2014-10-06|2017-09-06|テレフオンアクチーボラゲット エルエム エリクソン(パブル)|Coding and derivation of quantization parameters|
JP6617142B2|2014-10-07|2019-12-11|トレリス・ユーロプ・ソチエタ・ア・レスポンサビリタ・リミタータTRELLIS EUROPE SrL|Improved video and image encoding process|
US10225485B1|2014-10-12|2019-03-05|Oliver Markus Haynold|Method and apparatus for accelerated tonemapping|
MX2017005983A|2014-11-10|2017-06-29|Koninklijke Philips Nv|Method for encoding, video processor, method for decoding, video decoder.|
US9589313B2|2015-01-09|2017-03-07|Vixs Systems, Inc.|Dynamic range converter with pipelined architecture and methods for use therewith|
US9654755B2|2015-01-09|2017-05-16|Vixs Systems, Inc.|Dynamic range converter with logarithmic conversion and methods for use therewith|
US9860504B2|2015-01-09|2018-01-02|Vixs Systems, Inc.|Color gamut mapper for dynamic range conversion and methods for use therewith|
EP3051821A1|2015-01-30|2016-08-03|Thomson Licensing|Method and apparatus for encoding and decoding high dynamic rangevideos|
CN111654697A|2015-01-30|2020-09-11|交互数字Vc控股公司|Method and apparatus for encoding and decoding color picture|
CN107409213B|2015-03-02|2020-10-30|杜比实验室特许公司|Content adaptive perceptual quantizer for high dynamic range images|
CN104835131B|2015-04-20|2018-05-15|中国科学技术大学先进技术研究院|A kind of method and system that HDR image generation and tone mapping are realized based on IC|
JP6663214B2|2015-05-26|2020-03-11|パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America|Display method and display device|
WO2016189774A1|2015-05-26|2016-12-01|パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ|Display method and display device|
US10674182B2|2015-06-05|2020-06-02|Telefonaktiebolaget Lm Ericsson |Pixel pre-processing and encoding|
EP3113495A1|2015-06-30|2017-01-04|Thomson Licensing|Methods and devices for encoding and decoding a hdr color picture|
EP3113496A1|2015-06-30|2017-01-04|Thomson Licensing|Method and device for encoding both a hdr picture and a sdr picture obtained from said hdr picture using color mapping functions|
EP3119088A1|2015-07-16|2017-01-18|Thomson Licensing|Method and device for encoding an image, method and device for decoding an image|
US10244249B2|2015-09-21|2019-03-26|Qualcomm Incorporated|Fixed point implementation of range adjustment of components in video coding|
US10200690B2|2015-09-22|2019-02-05|Qualcomm Incorporated|Video decoder conformance for high dynamic rangevideo coding using a core video standard|
US10080005B2|2015-11-09|2018-09-18|Netflix, Inc.|High dynamic range color conversion correction|
EP3182691B1|2015-12-17|2018-10-31|Thomson Licensing|Method of encoding raw color coordinates provided by a camera representing colors of a scene having two different illuminations|
US10148972B2|2016-01-08|2018-12-04|Futurewei Technologies, Inc.|JPEG image to compressed GPU texture transcoder|
CN108781290A|2016-03-07|2018-11-09|皇家飞利浦有限公司|HDR videos are coded and decoded|
GB2549696A|2016-04-13|2017-11-01|Sony Corp|Image processing method and apparatus, integrated circuitry and recording medium|
CN107767838B|2016-08-16|2020-06-02|北京小米移动软件有限公司|Color gamut mapping method and device|
GB2554669A|2016-09-30|2018-04-11|Apical Ltd|Image processing|
US10244244B2|2016-10-26|2019-03-26|Dolby Laboratories Licensing Corporation|Screen-adaptive decoding of high dynamic range video|
US10218952B2|2016-11-28|2019-02-26|Microsoft Technology Licensing, Llc|Architecture for rendering high dynamic range video on enhanced dynamic range display devices|
US10104334B2|2017-01-27|2018-10-16|Microsoft Technology Licensing, Llc|Content-adaptive adjustment of display device brightness levels when rendering high dynamic range content|
US10176561B2|2017-01-27|2019-01-08|Microsoft Technology Licensing, Llc|Content-adaptive adjustments to tone mapping operations for high dynamic range content|
CN110337667A|2017-02-15|2019-10-15|杜比实验室特许公司|The tint ramp of high dynamic range images maps|
EP3367658A1|2017-02-24|2018-08-29|Thomson Licensing|Method and device for reconstructing an hdr image|
EP3373585A1|2017-03-09|2018-09-12|Thomson Licensing|Method for inverse tone mapping of an image with visual effects|
JP6866224B2|2017-05-09|2021-04-28|キヤノン株式会社|Image coding device, image decoding device, image coding method and program|
KR102344334B1|2017-06-27|2021-12-29|삼성전자주식회사|Display apparatus and method for processing image|
US10755392B2|2017-07-13|2020-08-25|Mediatek Inc.|High-dynamic-range video tone mapping|
TWI644264B|2017-12-22|2018-12-11|晶睿通訊股份有限公司|Image identification method and image identification device|
US10957024B2|2018-10-30|2021-03-23|Microsoft Technology Licensing, Llc|Real time tone mapping of high dynamic range image data at time of playback on a lower dynamic range display|
WO2020185022A1|2019-03-12|2020-09-17|주식회사 엑스리스|Method for encoding/decoding image signal, and device therefor|
CN110232669A|2019-06-19|2019-09-13|湖北工业大学|A kind of tone mapping method and system of high dynamic range images|
CN113271449B|2021-07-21|2021-09-28|北京小鸟科技股份有限公司|Conversion system, method and equipment for multiple HDR videos|
Legal status:
2020-11-10| B06U| Preliminary requirement: requests with searches performed by other patent offices: procedure suspended [chapter 6.21 patent gazette]|
2020-11-10| B15K| Others concerning applications: alteration of classification|Free format text: THE PREVIOUS CLASSIFICATIONS WERE: H04N 1/41 , H04N 1/40 , H04N 1/46 , H04N 9/04 Ipc: H04N 1/407 (2006.01), H04N 1/60 (2006.01), H04N 9/ |
2021-12-07| B350| Update of information on the portal [chapter 15.35 patent gazette]|
Priority:
Application number | Application date | Patent title
US201161476174P| true| 2011-04-15|2011-04-15|
US61/476,174|2011-04-15|
US201161552868P| true| 2011-10-28|2011-10-28|
US61/552,868|2011-10-28|
PCT/US2012/033795|WO2012142589A2|2011-04-15|2012-04-16|Encoding, decoding, and representing high dynamic range images|