METHODS AND APPARATUS FOR DERIVING A TEMPORAL MOTION VECTOR PREDICTOR
Patent abstract:
Temporal motion vector prediction method and apparatus. An apparatus and method for motion vector prediction for a current block in an image are disclosed. In video coding systems, spatial and temporal redundancy is exploited using spatial and temporal prediction to reduce the information to be transmitted. Motion vector prediction (MVP) has been used to further conserve the bit rate associated with motion vectors. In conventional temporal MVP, the predictor is often based on a single candidate, such as the motion vector co-located in the previous frame/image. If the motion vector co-located in the previous frame/image does not exist, the predictor for the current block is not available. An improved MVP technique is disclosed in which the MVP utilizes multiple candidates based on co-located motion vectors of future and/or past reference images. The candidates are arranged in order of priority to provide better MVP availability and also to provide more accurate prediction. Furthermore, the disclosed MVP technique can be operated in a closed-loop fashion so that no additional side information, or only minimal additional side information, is required.
Publication number: BR112012027263B1
Application number: R112012027263-2
Filing date: 2011-04-22
Publication date: 2021-08-17
Inventors: Yu-Wen Huang; Shaw-Min Lei; Yu-Pao Tsai; Jian-Liang Lin
Applicant: Hfi Innovation Inc.
Primary IPC class:
Patent description:
BACKGROUND OF THE INVENTION
CROSS REFERENCE TO RELATED APPLICATIONS
The present invention claims priority to Provisional Patent Application No. US 61/363,557, filed July 12, 2010, entitled "Video encoding methods for B-frame referencing colocalized motion vector", and Provisional Patent Application No. US 61/431,454, filed January 11, 2011, entitled "Enhanced Advanced Motion Vector Prediction". The Provisional Patent Applications are hereby incorporated by reference in their entirety.
FIELD OF THE INVENTION
The present invention relates to video encoding. In particular, the present invention relates to coding techniques associated with motion vector prediction.
DESCRIPTION OF THE RELATED ART
In video coding systems, spatial and temporal redundancy is exploited using spatial and temporal prediction to reduce the information to be transmitted. Spatial and temporal prediction use decoded pixels from the same image and from reference images, respectively, in order to form the prediction for the current pixels to be encoded. In a conventional coding system, side information associated with spatial and temporal prediction may have to be transmitted, which will take up some bandwidth of the compressed video data. The transmission of motion vectors for temporal prediction may require a noticeable portion of the compressed video data, particularly in low bit rate applications. To further reduce the bit rate associated with motion vectors, a technique called Motion Vector Prediction (MVP) has been used in the field of video encoding in recent years. The MVP technique exploits the statistical redundancy between spatially and temporally neighboring motion vectors. When MVP is used, a predictor for the current motion vector is chosen and the motion vector residue is transmitted instead of the motion vector itself to conserve the bit rate associated with motion vector transmission. The MVP scheme can be applied in a closed-loop arrangement where the predictor can be derived at the decoder based on decoded information, so that no side information has to be transmitted. Alternatively, side information can be explicitly transmitted in the bit stream to inform the decoder of the selected motion vector predictor type. While MVP can be used for inter-coded blocks to conserve bandwidth, it can also be used for SKIP and DIRECT coded blocks to substantially reduce the bit rate for the underlying blocks. In conventional temporal MVP, the predictor is often based on a single candidate, such as the motion vector co-located in the previous frame/image. If the motion vector co-located in the previous frame/image does not exist, the predictor for the current block is not available. It is desirable to improve MVP performance in order to reduce the bit rate of the encoding system. Improvement can be achieved by designing a better MVP that can provide more accurate prediction and improve predictor availability. Furthermore, it is desirable that the MVP can be operated in a closed-loop fashion so that no side information, or only minimal side information, is needed.
BRIEF SUMMARY OF THE INVENTION
An apparatus and method for motion vector prediction for a current block in an image are disclosed.
In an embodiment according to the present invention, the apparatus and method for motion vector prediction comprise the steps of: receiving two or more motion vectors associated with at least one temporal block of a group consisting of one or more future reference images and one or more past reference images; determining a candidate set based on said two or more motion vectors, wherein the candidate set consists of at least two elements; arranging the candidate set in a priority order; and determining a motion vector predictor or a motion vector predictor candidate or a motion vector or a motion vector candidate for the current block from the candidate set according to the priority order. The priority order is predefined in one embodiment according to the present invention, and the priority order is determined according to an adaptive scheme in another embodiment according to the present invention. The step of determining the candidate set based on said two or more motion vectors may further comprise a step of deriving scaled motion vectors corresponding to the two or more motion vectors in an alternative embodiment in accordance with the present invention. The candidate set can be selected from the two or more motion vectors, the scaled motion vectors, or a combination of both, in various embodiments in accordance with the present invention. The adaptive scheme can be based on a criterion selected from a group consisting of the statistics of reconstructed motion vectors from previous blocks, the current block partition type, motion vector correlation, motion vector directions, the distance of the motion vectors, and whether the motion vectors cross the current block in the case of a temporal motion vector, in different embodiments according to the present invention. When scaled motion vectors are used, the adaptive scheme can be based on whether the scaled motion vector directions are interpolation or extrapolation, in an alternative embodiment according to the present invention. In another embodiment according to the present invention, the apparatus and method for motion vector prediction comprise the steps of: receiving two or more motion vectors associated with at least one temporal block from a first reference picture list; determining a candidate set consisting of at least two elements based on the two or more motion vectors; arranging the candidate set in a priority order; and determining a motion vector predictor or a motion vector predictor candidate or a motion vector or a motion vector candidate for the current block from the candidate set according to the priority order. Information associated with the priority order is incorporated into a sequence header, an image header, or a slice header. The first reference image list can be either reference image list 0 or reference image list 1.
BRIEF DESCRIPTION OF THE DRAWINGS
Figure 1 illustrates DIRECT mode motion vector scaling in B-slice encoding according to the prior art.
Figure 2 illustrates motion vector scaling in B-slice encoding based on a co-located motion vector of the first previous B image according to the prior art.
Figure 3 illustrates an exemplary motion vector prediction using motion vectors from blocks co-located in list 0 and list 1 reference images with a predefined order of priority.
Figure 4 illustrates an example of a predefined order of priority for the four motion vectors in Figure 3.
Figure 5 illustrates an alternative predefined priority order for the four motion vectors in Figure 3.
Figure 6 illustrates an example of a predefined order of priority for the four motion vectors of Figure 3, where only three of the four motion vectors are used as candidates for the motion vector predictor.
Figure 7 illustrates an alternative predefined priority order for the four motion vectors of Figure 3, where only three of the four motion vectors are used as candidates for the motion vector predictor.
Figure 8 illustrates an example of a predefined order of priority for the four motion vectors of Figure 3, where only two of the four motion vectors are used as candidates for the motion vector predictor.
Figure 9 illustrates an alternative predefined priority order for the four motion vectors of Figure 3, where only two of the four motion vectors are used as candidates for the motion vector predictor.
Figure 10A illustrates an example of motion vector prediction using scaled motion vectors derived for the current block with the current list equal to list 0 and reference image RefIdxL0 = 0.
Figure 10B illustrates an example of motion vector prediction using scaled motion vectors derived for the current block with the current list equal to list 1 and reference image RefIdxL1 = 0.
Figure 11 illustrates an example of a predefined priority order for the four scaled motion vectors of Figure 10A.
Figure 12 illustrates an example of a predefined priority order for the four scaled motion vectors of Figure 10B.
Figure 13 illustrates an example of a predefined order of priority for the scaled motion vectors mvL0l and mvL1l of Figure 10A.
Figure 14 illustrates an example of a predefined priority order for the scaled motion vectors mvL1j and mvL0j of Figure 10A.
Figure 15A illustrates an example of motion vector prediction using the scaled motion vectors derived for the current block, with the current list equal to list 0 and reference image RefIdxL0 = 0, and the unscaled motion vectors of the co-located blocks of the list 0 and list 1 reference images.
Figure 15B illustrates an example of motion vector prediction using the scaled motion vectors derived for the current block, with the current list equal to list 1 and reference image RefIdxL1 = 0, and the unscaled motion vectors of the co-located blocks of the list 0 and list 1 reference images.
Figure 16 illustrates an example of the predefined order of priority for the scaled and unscaled motion vectors of Figure 15A.
Figure 17 illustrates an example of the predefined order of priority for the scaled and unscaled motion vectors of Figure 15B.
DETAILED DESCRIPTION OF THE INVENTION
In video coding systems, spatial and temporal redundancy is exploited using spatial and temporal prediction to reduce the bit rate to be transmitted. Spatial prediction uses decoded pixels from the same image to form the prediction for the current pixels to be encoded. Spatial prediction is often operated on a block-by-block basis, such as a 16x16 or 4x4 block for the luminance signal in H.264/AVC intra coding. In video sequences, neighboring images often have great similarities, and simply using image differences can effectively reduce the transmitted information associated with static background areas. However, moving objects in the video stream can result in substantial residues and will require a higher bit rate to encode the residues. Motion Compensated Prediction (MCP) is a popular technique for exploiting temporal correlation in video sequences.
Motion Compensated Prediction can be used in a forward prediction mode, where a current image block is predicted using a decoded image or images that are before the current image in display order. In addition to forward prediction, backward prediction can also be used to improve the performance of motion compensated prediction. Backward prediction uses a decoded image or images after the current image in display order. Since the first version of H.264/AVC was finalized in 2003, forward prediction and backward prediction have been extended to list 0 prediction and list 1 prediction, respectively, where both list 0 and list 1 can contain multiple images before or after the current image in display order. The following describes the default reference image list construction. For list 0, reference images before the current image have lower reference image indices than those after the current image. For list 1, reference images after the current image have lower reference image indices than those before the current image. For both list 0 and list 1, after applying the previously described rule, the temporal distance is also considered in determining the reference image index. Generally speaking, a reference image closer to the current image has a lower reference image index. For example, suppose the current image is image 5, and images 0, 2, 4, 6, and 8 are reference images, where the numbers indicate the display order. List 0 of reference images, with ascending reference image indices starting from zero, consists of images 4, 2, 0, 6, and 8. List 1 of reference images, with ascending reference image indices starting from zero, consists of images 6, 8, 4, 2, and 0 (this default ordering is illustrated in the sketch below). The reference image with index equal to 0 is called the co-located image; in this example, with image 5 as the current image, image 6 is the list 1 co-located image and image 4 is the list 0 co-located image. When a block in a list 0 or list 1 co-located image has the same block location as the current block in the current image, it is called a list 0 or list 1 co-located block, or simply a block co-located in list 0 or list 1. The unit used for motion estimation in earlier video standards such as MPEG-1, MPEG-2 and MPEG-4 is primarily the macroblock. For H.264/AVC, a 16x16 macroblock can be segmented into 16x16, 16x8, 8x16 and 8x8 blocks for motion estimation. In addition, an 8x8 block can be segmented into 8x8, 8x4, 4x8 and 4x4 blocks for motion estimation. For the High Efficiency Video Coding (HEVC) standard under development, the unit for motion estimation/compensation is called a Prediction Unit (PU), where a PU is hierarchically partitioned from a maximum block size. The MCP type is selected for each slice in the H.264/AVC standard. A slice in which motion compensated prediction is constrained to list 0 prediction is called a P-slice. For a B-slice, motion compensated prediction also includes list 1 prediction and bidirectional prediction in addition to list 0 prediction. In video encoding systems, the motion vectors and encoded residues are transmitted to a decoder to reconstruct the video on the decoder side. Furthermore, in a system with a flexible reference image structure, information associated with the selected reference images may also have to be transmitted. The transmission of motion vectors may require a noticeable portion of the transmitted bit rate, particularly in low bit rate applications or in systems where motion vectors are associated with smaller blocks or higher motion precision.
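The following is a minimal C++ sketch of the default reference image list ordering in the example above (current image 5, reference images 0, 2, 4, 6, and 8), assuming display-order picture numbers; the buildList helper and its data layout are illustrative only and not part of any standard or reference codec API.

```cpp
#include <algorithm>
#include <cstdio>
#include <functional>
#include <vector>

// Default reference image list ordering for the example above:
// past images come first in list 0 and future images come first in list 1,
// with images closer to the current image given lower indices.
static std::vector<int> buildList(const std::vector<int>& refs, int cur, bool isList1) {
    std::vector<int> past, future;
    for (int pic : refs)
        (pic < cur ? past : future).push_back(pic);
    std::sort(past.begin(), past.end(), std::greater<int>()); // closer past images first
    std::sort(future.begin(), future.end());                  // closer future images first
    std::vector<int> list = isList1 ? future : past;
    const std::vector<int>& tail = isList1 ? past : future;
    list.insert(list.end(), tail.begin(), tail.end());
    return list;  // index 0 of each list is the co-located image of that list
}

int main() {
    std::vector<int> refs = {0, 2, 4, 6, 8};  // decoded reference images
    int current = 5;                          // current image (display order)
    for (bool isList1 : {false, true}) {
        std::vector<int> lst = buildList(refs, current, isList1);
        std::printf("list %d:", isList1 ? 1 : 0);
        for (int pic : lst) std::printf(" %d", pic);
        std::printf("\n");  // list 0: 4 2 0 6 8   list 1: 6 8 4 2 0
    }
    return 0;
}
```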
To further reduce the bit rate associated with motion vectors, a technique called motion vector prediction (MVP) has been used in the field of video encoding in recent years. The MVP technique exploits the statistical redundancy between spatially and temporally neighboring motion vectors. When MVP is used, a predictor for the current motion vector is chosen and the motion vector residue, that is, the difference between the motion vector and the predictor, is transmitted. The MVP scheme can be applied in a closed-loop arrangement where the predictor is derived at the decoder based on decoded information and no side information has to be transmitted. Alternatively, side information can be explicitly transmitted in the bit stream to inform the decoder of the selected motion vector predictor type. In the H.264/AVC standard, there is also a SKIP mode in addition to the conventional intra and inter modes for macroblocks in a P-slice. SKIP mode is a very effective method to achieve large compression, since no quantized error signal, motion vector, or reference index parameter has to be transmitted. The only information needed for a 16x16 macroblock in SKIP mode is a signal to indicate that SKIP mode is being used, and therefore a substantial bit rate reduction is achieved. The motion vector used to reconstruct the SKIP macroblock is similar to the motion vector predictor of a macroblock. In the H.264/AVC standard, four different types of inter prediction are supported for B-slices, including list 0, list 1, bipredictive and DIRECT prediction, where list 0 and list 1 refer to prediction using reference image list 0 and reference image list 1, respectively. For the bipredictive mode, the prediction signal is formed by a weighted average of the list 0 and list 1 motion compensated prediction signals. The DIRECT prediction mode is inferred from previously transmitted syntax elements and can be list 0 or list 1 prediction or bipredictive. Therefore, there is no need to transmit motion vector information in DIRECT mode. In the case where no quantized error signal is transmitted, the DIRECT macroblock mode is referred to as the B SKIP mode and the block can be efficiently coded. In the HEVC standard being developed, some motion vector prediction improvements over H.264/AVC are being considered. In this disclosure, a motion vector prediction system and method for a B-frame/picture/slice based on temporal blocks in the past and/or future reference pictures are disclosed. The motion vector for a current block is predicted effectively by the motion vectors of temporal blocks in the past and/or future reference images, so that the coding efficiency for the motion vector can be improved. Temporal motion vectors are considered as predictor candidates for the current block and the candidates are arranged in order of priority. A candidate with a higher priority will be considered as the predictor before a candidate with a lower priority. The advantage of priority-based MVP derivation is that it increases the chance that a temporal MVP candidate for the current block exists. For example, a conventional video coding system only considers the motion vector co-located in the previous frame/image as the candidate. If that candidate does not exist, then the coding system will consider the MVP candidate for the co-located block to be unavailable. Therefore, it is desirable to increase the availability of the MVP candidate in order to improve the coding efficiency of the coding system.
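As a minimal sketch of the closed-loop arrangement described above, the fragment below shows an encoder transmitting only the motion vector residue while the decoder reproduces the same predictor from already-decoded information; the predictFromDecodedInfo placeholder and all numeric values are illustrative assumptions.

```cpp
#include <cassert>
#include <cstdio>

// Closed-loop MVP: only the motion vector difference (mvd) is transmitted;
// encoder and decoder derive the identical predictor from decoded data.
struct MV { int x, y; };

// Placeholder for a predictor derivation both sides can perform, e.g. the
// priority-based temporal MVP described in this disclosure.
static MV predictFromDecodedInfo() { return {3, -1}; }

int main() {
    MV mv  = {7, 2};                        // motion vector found by the encoder
    MV mvp = predictFromDecodedInfo();      // predictor (derivable at both ends)
    MV mvd = {mv.x - mvp.x, mv.y - mvp.y};  // residue actually written to the bitstream

    // Decoder side: the same predictor plus the received residue restores mv exactly,
    // so no side information about the predictor has to be transmitted.
    MV decoderMvp = predictFromDecodedInfo();
    MV rec = {decoderMvp.x + mvd.x, decoderMvp.y + mvd.y};
    assert(rec.x == mv.x && rec.y == mv.y);
    std::printf("mvd = (%d, %d), reconstructed mv = (%d, %d)\n", mvd.x, mvd.y, rec.x, rec.y);
    return 0;
}
```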
In the H.264/AVC standard, temporal DIRECT mode is used for B-slices, where the motion vectors for a current block 110 in a B-slice are derived from the motion vector of the co-located block 120 in the first list 1 reference image, as shown in Figure 1. Motion vector derivation for temporal DIRECT mode is described in "Direct Mode Coding for Bipredictive Slices in the H.264 Standard", authored by Tourapis et al., in IEEE Trans. on Circuits and Systems for Video Technology, Vol. 15, No. 1, pp. 119-126, Jan. 2005. The motion vector of the co-located block of the first list 1 reference image is denoted as MV. The motion vectors for the current block are denoted as MVL0 and MVL1 with respect to list 0 and list 1 of reference images, respectively. The temporal distance between the current image and the list 0 reference image is denoted as TDB, and the temporal distance between the list 0 reference image and the list 1 reference image is denoted as TDD. The motion vectors for the current block can be derived according to:
MVL0 = (TDB / TDD) × MV, and
MVL1 = MVL0 - MV = ((TDB - TDD) / TDD) × MV.
The above equations were later replaced by:
X = (16384 + |TDD / 2|) / TDD,
ScaleFactor = clip(-1024, 1023, (TDB × X + 32) >> 6),
MVL0 = (ScaleFactor × MV + 128) >> 8, and
MVL1 = MVL0 - MV,
so that X and ScaleFactor can be pre-calculated at the slice/image level. In temporal DIRECT mode, the motion vector predictor is based only on the motion vector of the co-located block of the first list 1 reference image. In another prior art reference, entitled "RD Optimized Coding for Motion Vector Predictor Selection", by Laroche et al., in IEEE Trans. on Circuits and Systems for Video Technology, Vol. 18, No. 12, pp. 1681-1691, Dec. 2008, motion vector predictor selection based on motion vector competition is disclosed. The motion vector competition scheme uses RD optimization to determine the best motion vector predictor from the motion vector predictor candidates. For example, as shown in Figure 2, the temporal motion vector predictor candidates can include the list 0 motion vector corresponding to the block co-located in the list 1 co-located image Ref1, and the list 0 and list 1 motion vectors of the block co-located in the list 0 co-located image, B-1. The list 0 motion vector corresponding to the block co-located in the list 1 co-located image Ref1 can be calculated in the same way as defined in the H.264/AVC standard, i.e., using the temporal scaling given above. The list 0 and list 1 motion vectors of the block co-located in the list 0 co-located image, B-1, can be used to derive the motion vector predictor of the current block. If only the motion vector co-located in the B-1 image pointing to a forward P-image exists, the mv3L0 and mv3L1 motion vector predictors can be calculated by scaling this co-located motion vector according to the corresponding temporal distances, where the co-located motion vector is represented in Figure 2 and dL0B-1 is the temporal distance between the forward P-frame and frame B-1. In the case of backward prediction, the mv4L0 and mv4L1 predictors can be calculated in a similar manner from the motion vector co-located in image B-1 pointing to the past P-frame, as illustrated in Figure 2. Depending on the availability of these co-located motion vectors, the corresponding predictors can be used for the current block, and RD optimization is applied to select the best motion vector predictor. The motion vector prediction scheme according to Laroche et al. requires side information to be transmitted to the decoder side to indicate the particular motion vector predictor selected. Transmitting side information associated with the selected motion vector predictor will consume some bandwidth.
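A minimal sketch of the fixed-point temporal DIRECT mode scaling recalled above, with X and ScaleFactor pre-computed as in the cited scheme; the variable names follow the description (TDB, TDD), while mvCol and the numeric distances are arbitrary example values.

```cpp
#include <algorithm>
#include <cstdio>
#include <cstdlib>

// Fixed-point temporal DIRECT mode scaling: X and ScaleFactor can be
// pre-computed once per slice/image, then applied to each co-located MV.
struct MV { int x, y; };

static int clip3(int lo, int hi, int v) { return std::min(hi, std::max(lo, v)); }

int main() {
    MV  mvCol = {24, -8};  // motion vector of the co-located block (list 1 image)
    int TDB   = 2;         // temporal distance: current image to list 0 reference
    int TDD   = 4;         // temporal distance: list 0 reference to list 1 reference

    int X           = (16384 + std::abs(TDD / 2)) / TDD;        // pre-computable
    int ScaleFactor = clip3(-1024, 1023, (TDB * X + 32) >> 6);  // pre-computable

    // Integer right shift; sign handling simplified for illustration.
    MV mvL0 = {(ScaleFactor * mvCol.x + 128) >> 8,
               (ScaleFactor * mvCol.y + 128) >> 8};
    MV mvL1 = {mvL0.x - mvCol.x, mvL0.y - mvCol.y};

    // Roughly mvL0 = (TDB/TDD) * mvCol = (12, -4) and mvL1 = mvL0 - mvCol = (-12, 4).
    std::printf("mvL0 = (%d, %d), mvL1 = (%d, %d)\n", mvL0.x, mvL0.y, mvL1.x, mvL1.y);
    return 0;
}
```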
Regardless of whether the motion vector competition scheme is turned on or off, temporal motion vector prediction can be beneficial in reducing motion vector residues. It is desirable to develop a temporal motion vector prediction technique to improve the temporal motion vector predictor when motion vector competition is not used, as well as to improve the temporal motion vector predictor candidates when motion vector competition is used. Furthermore, it is also desirable that the improved temporal motion vector prediction require no additional side information or only minimal additional side information. Accordingly, a motion vector prediction technique for a current block in an image is developed. The motion vector prediction technique of the present invention can be used to determine a motion vector predictor, motion vector predictor candidate, motion vector, or motion vector candidate for the current block. The technique uses motion vectors associated with temporal blocks of the future and past reference images as predictor candidates and selects a predictor according to a priority order. Figure 3 illustrates exemplary motion vector prediction using motion vectors from blocks co-located in the past and future reference images according to a priority order. The future image example used in Figure 3 is the image belonging to list 1 with reference image index RefIdxL1 equal to 0, and this image is labeled "l". The past image example used in Figure 3 is the image belonging to list 0 with reference image index RefIdxL0 equal to 0, and this image is labeled "j". In addition, the current image is labeled "k", the image belonging to list 0 with RefIdxL0 ≥ 1 is labeled "i", and the image belonging to list 1 with RefIdxL1 ≥ 1 is labeled "w". Motion vectors mvL0 and mvL1 for a current block 310 are determined from motion vectors mvL0j, mvL1j, mvL0l, and mvL1l of co-located block 320 of image j and co-located block 330 of image l, respectively, according to the order of priority. Motion vectors mvL0j and mvL1j refer to the motion vectors of co-located block 320 of image j pointing to a list 0 image and a list 1 image, respectively. Motion vectors mvL0l and mvL1l refer to the motion vectors of co-located block 330 of image l pointing to a list 0 image and a list 1 image, respectively. These motion vector candidates are arranged in order of priority so that a predictor can be chosen accordingly. Figure 4 illustrates an example of a predefined priority order for the four motion vector candidates of Figure 3. Motion vector predictor determination based on motion vectors co-located in the past and future reference images for the current block 310 using a reference image in list 0 is shown below:
• If mvL0l exists and the corresponding reference image (pic. id = i - 1) is in list 0 of the current image, then mvL0 = mvL0l (reference pic. id = i - 1);
• Otherwise, if mvL1j exists and the corresponding reference image (pic. id = m) is in list 0 of the current image, then mvL0 = mvL1j (reference pic. id = m);
• Otherwise, if mvL0j exists and the corresponding reference image (pic. id = j) is in list 0 of the current image, then mvL0 = mvL0j (reference pic. id = j);
• Otherwise, if mvL1l exists and the corresponding reference image (pic. id = l) is in list 0 of the current image, then mvL0 = mvL1l (reference pic. id = l);
• Otherwise, mvL0 is not available.
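A minimal sketch of the priority-ordered derivation listed above: candidates are scanned in a fixed priority order and the first one that exists and whose reference image belongs to the current list is taken; the Candidate structure, picture ids, and example values are illustrative assumptions rather than a normative data layout.

```cpp
#include <algorithm>
#include <cstdio>
#include <optional>
#include <vector>

// Priority-based temporal MVP derivation: scan the candidates in priority
// order and take the first one that exists and refers to an image in the
// current reference image list.
struct MV { int x, y; };

struct Candidate {
    const char*       name;    // e.g. "mvL0l", "mvL1j", ...
    std::optional<MV> mv;      // empty if the co-located block has no such MV
    int               refPic;  // picture id the candidate points to
};

static std::optional<MV> deriveTemporalMvp(const std::vector<Candidate>& priorityOrdered,
                                           const std::vector<int>& currentList) {
    for (const Candidate& c : priorityOrdered) {
        bool refInList = std::find(currentList.begin(), currentList.end(), c.refPic)
                         != currentList.end();
        if (c.mv && refInList) {
            std::printf("selected %s\n", c.name);
            return c.mv;                  // highest-priority usable candidate
        }
    }
    return std::nullopt;                  // MVP not available
}

int main() {
    // Priority order of Figure 4: mvL0l, mvL1j, mvL0j, mvL1l.
    std::vector<Candidate> cands = {
        {"mvL0l", std::nullopt, 3},       // first choice missing in this example
        {"mvL1j", MV{5, -2},    7},
        {"mvL0j", MV{1,  4},    6},
        {"mvL1l", MV{0,  0},    9},
    };
    std::vector<int> list0OfCurrentImage = {6, 7, 4};  // hypothetical picture ids
    std::optional<MV> mvp = deriveTemporalMvp(cands, list0OfCurrentImage);
    if (mvp) std::printf("mvL0 = (%d, %d)\n", mvp->x, mvp->y);  // expects mvL1j
    return 0;
}
```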
The predefined priority order shown above illustrates an example of determining the motion vector predictor or motion vector predictor candidate mvL0 for the current block 310 using a reference image in list 0, where the candidates for the predictor are based on co-located motion vectors of the list 0 and list 1 reference images. A person skilled in the art can use other predefined priority orders to achieve the same or a similar objective. Figure 5 illustrates another example of a predefined priority order for the four motion vector candidates of Figure 3. Motion vector predictor determination based on motion vectors co-located in the future and past reference images for the current block 310 using a reference image in list 1 is shown as follows:
• If mvL1j exists and the corresponding reference image (pic. id = m) is in list 1 of the current image, then mvL1 = mvL1j (reference pic. id = m);
• Otherwise, if mvL0l exists and the corresponding reference image (pic. id = i - 1) is in list 1 of the current image, then mvL1 = mvL0l (reference pic. id = i - 1);
• Otherwise, if mvL1l exists and the corresponding reference image (pic. id = l) is in list 1 of the current image, then mvL1 = mvL1l (reference pic. id = l);
• Otherwise, if mvL0j exists and the corresponding reference image (pic. id = j) is in list 1 of the current image, then mvL1 = mvL0j (reference pic. id = j);
• Otherwise, mvL1 is not available.
The predefined priority order shown above illustrates an example of determining the motion vector predictor or motion vector predictor candidate mvL1 for the current block 310 using a reference image in list 1, where the candidates for the predictor are based on motion vectors co-located in the list 0 and list 1 reference images. A person skilled in the art can select other predefined priority orders to achieve the same or a similar goal. Although the four motion vectors are considered as candidates for the predictor or predictor candidate of the current block, not all motion vector candidates have to be used. For example, three of the four motion vectors of Figure 3 can be used to determine the motion vector predictor or motion vector predictor candidate for the current block 310 using a reference image in list 0, as shown in Figure 6:
• If mvL0l exists and the corresponding reference image is in list 0 of the current image, then mvL0 = mvL0l (reference pic. id = i - 1);
• Otherwise, if mvL1j exists and the corresponding reference image is in list 0 of the current image, then mvL0 = mvL1j (reference pic. id = m);
• Otherwise, if mvL0j exists and the corresponding reference image is in list 0 of the current image, then mvL0 = mvL0j (reference pic. id = j);
• Otherwise, mvL0 is not available.
Likewise, three of the four motion vectors of Figure 3 can be used to determine the motion vector predictor or motion vector predictor candidate for the current block 310 using a reference image in list 1, as shown in Figure 7:
• If mvL1j exists and the corresponding reference image is in list 1 of the current image, then mvL1 = mvL1j (reference pic. id = m);
• Otherwise, if mvL0l exists and the corresponding reference image is in list 1 of the current image, then mvL1 = mvL0l (reference pic. id = i - 1);
• Otherwise, if mvL1l exists and the corresponding reference image is in list 1 of the current image, then mvL1 = mvL1l (reference pic. id = l);
• Otherwise, mvL1 is not available.
In another example, two of the four motion vectors of Figure 3 can be used to determine the motion vector predictor or motion vector predictor candidate for the current block 310 using a reference image in list 0, as shown in Figure 8:
• If mvL0l exists and the corresponding reference image is in list 0 of the current image, then mvL0 = mvL0l (reference pic. id = i - 1);
• Otherwise, if mvL1j exists and the corresponding reference image is in list 0 of the current image, then mvL0 = mvL1j (reference pic. id = m);
• Otherwise, mvL0 is not available.
Likewise, two of the four motion vectors of Figure 3 can be used to determine the motion vector predictor or motion vector predictor candidate for the current block 310 using a reference image in list 1, as shown in Figure 9:
• If mvL1j exists and the corresponding reference image is in list 1 of the current image, then mvL1 = mvL1j (reference pic. id = m);
• Otherwise, if mvL0l exists and the corresponding reference image is in list 1 of the current image, then mvL1 = mvL0l (reference pic. id = i - 1);
• Otherwise, mvL1 is not available.
The technique is not restricted to using at least one motion vector associated with a temporal block from a list 0 reference image and at least one motion vector associated with a temporal block from a list 1 reference image to determine the motion vector predictor or motion vector predictor candidate for the current block. In some other embodiments, a priority order is incorporated into the bit stream, such as in a sequence header, an image header, or a slice header, and the motion vector predictor or motion vector predictor candidate for the current block 310 is determined from the two motion vectors mvL0l and mvL1l associated with the list 1 co-located block 330 according to the priority order. In another example, the motion vector predictor or motion vector predictor candidate for the current block 310 is determined from the two motion vectors mvL0j and mvL1j associated with the list 0 co-located block 320 according to a priority order embedded in a sequence header, image header, or slice header. While the examples illustrated above use co-located motion vectors of the future reference image with RefIdxL1 = 0 and of the past reference image with RefIdxL0 = 0 as motion vector predictor candidates or motion vector predictor candidate for the current block, scaled co-located motion vectors of the corresponding reference images can also be used as candidates. A motion vector scaling method has been described with reference to Figures 1 and 2 and the associated text. The same scaling method can be used to derive the scaled motion vectors for the present invention. Figure 10A illustrates an example of motion vector prediction based on scaled motion vectors of the co-located blocks of list 0 and list 1 when the current list is list 0 and RefIdxL0 = 0. The scaled motion vectors corresponding to mvL0l, mvL1j, mvL0j and mvL1l are labeled 1002, 1004, 1006 and 1008, respectively. Each scaled motion vector represents the motion vector between the current block and the reference image in list 0 with RefIdxL0 = 0, obtained by scaling the co-located motion vector of list 0 or list 1. So instead of using mvL0l, mvL1j, mvL0j, and mvL1l as candidates for the motion vector predictor or motion vector predictor candidate, the scaled motion vectors 1002, 1004, 1006, and 1008 are used as motion vector predictor candidates or motion vector predictor candidate. Scaled motion vectors can provide better prediction than unscaled motion vectors.
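A minimal sketch of deriving a scaled candidate such as the scaled mvL0l of Figure 10A: the co-located motion vector is rescaled by the ratio of temporal distances so that it points from the current block to the reference image with RefIdxL0 = 0. The picture-order counts and the simple floating-point rounding used here are illustrative assumptions; the disclosure itself refers to the scaling method described with Figures 1 and 2, which uses fixed-point arithmetic.

```cpp
#include <cmath>
#include <cstdio>

// Scaling a co-located motion vector so that it points from the current
// block to the target reference image (here RefIdxL0 = 0), by the ratio of
// temporal distances measured in display order.
struct MV { int x, y; };

static MV scaleMv(MV mvCol, int pocColPic, int pocColRef, int pocCur, int pocTargetRef) {
    double tdCol = pocColPic - pocColRef;  // distance covered by the co-located MV
    double tdCur = pocCur - pocTargetRef;  // distance the scaled MV must cover
    double s = tdCur / tdCol;
    return {(int)std::lround(mvCol.x * s), (int)std::lround(mvCol.y * s)};
}

int main() {
    // Example consistent with the Figure 3/10A labels: past image j = 4 (RefIdxL0 = 0),
    // current image k = 5, future co-located image l = 6 whose mvL0l points to image i = 2.
    MV mvL0l  = {16, -8};                  // MV of the co-located block in image l
    MV scaled = scaleMv(mvL0l, /*pocColPic=*/6, /*pocColRef=*/2,
                        /*pocCur=*/5, /*pocTargetRef=*/4);
    std::printf("scaled mvL0l = (%d, %d)\n", scaled.x, scaled.y);  // (4, -2): 1/4 of the distance
    return 0;
}
```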
Figure 10B illustrates an example of motion vector prediction based on scaled motion vectors of the co-located blocks of list 0 and list 1 when the current list is list 1 with RefIdxL1 = 0. The scaled motion vectors corresponding to mvL1j, mvL0l, mvL1l and mvL0j are labeled 1012, 1014, 1016 and 1018, respectively. Each scaled motion vector represents the motion vector between the current block and the reference image in list 1 with RefIdxL1 = 0, obtained by scaling the co-located motion vector of list 0 or list 1. So instead of using mvL1j, mvL0l, mvL1l and mvL0j as motion vector predictor candidates or motion vector predictor candidate, the scaled motion vectors 1012, 1014, 1016, and 1018 are used as motion vector predictor candidates or motion vector predictor candidate. Scaled motion vectors can provide better prediction than unscaled motion vectors. As in the case of unscaled motion vectors, a predefined order of priority can be assigned to the scaled motion vectors for list 0 motion vector prediction. The list 0 reference image index RefIdxL0 can be implicitly derived or explicitly signaled in the bitstream. In the case where the current list is list 0 and RefIdxL0 = 0, the predefined priority order for the scaled motion vectors of Figure 10A is shown in Figure 11. The motion vector predictor or motion vector predictor candidate is determined according to the following priority order:
• If mvL0l exists, then mvL0 = scaled mvL0l (motion vector 1002);
• Otherwise, if mvL1j exists, then mvL0 = scaled mvL1j (motion vector 1004);
• Otherwise, if mvL0j exists, then mvL0 = scaled mvL0j (motion vector 1006);
• Otherwise, if mvL1l exists, then mvL0 = scaled mvL1l (motion vector 1008);
• Otherwise, mvL0 is not available.
In another embodiment, list 0 motion vector prediction (the current list is list 0 and RefIdxL0 = 0) for deriving the motion vector predictor or motion vector predictor candidate for the current block 310 is determined according to the following priority order:
• If mvL0l exists, then mvL0 = scaled mvL0l (motion vector 1002);
• Otherwise, if mvL1l exists, then mvL0 = scaled mvL1l (motion vector 1008);
• Otherwise, mvL0 is not available.
In this case, only the scaled motion vectors associated with the list 1 co-located block 330 are considered. Information associated with the order of priority is embedded in a sequence header, an image header, or a slice header. Yet another embodiment of list 0 motion vector prediction (the current list is list 0 and RefIdxL0 = 0) determines the motion vector predictor or motion vector predictor candidate for the current block 310 according to the following order of priority:
• If mvL1j exists, then mvL0 = scaled mvL1j (motion vector 1004);
• Otherwise, if mvL0j exists, then mvL0 = scaled mvL0j (motion vector 1006);
• Otherwise, mvL0 is not available.
In this case, only the scaled motion vectors associated with the list 0 co-located block 320 are considered. Information associated with the order of priority is embedded in a sequence header, an image header, or a slice header. A predefined priority order can also be assigned to the scaled motion vectors for list 1 motion vector prediction. The list 1 reference image index RefIdxL1 can be either implicitly derived or explicitly signaled in the bitstream. In the case where the current list is list 1 and RefIdxL1 = 0, the predefined priority order for the scaled motion vectors of Figure 10B is shown in Figure 12.
The motion vector predictor or motion vector predictor candidate is determined according to the following priority order:
• If mvL1j exists, then mvL1 = scaled mvL1j;
• Otherwise, if mvL0l exists, then mvL1 = scaled mvL0l;
• Otherwise, if mvL1l exists, then mvL1 = scaled mvL1l;
• Otherwise, if mvL0j exists, then mvL1 = scaled mvL0j;
• Otherwise, mvL1 is not available.
Likewise, list 1 motion vector prediction (the current list is list 1 and RefIdxL1 = 0) can consider only the scaled motion vectors associated with the list 0 co-located block 320 (i.e., motion vectors 1012 and 1018) according to a predefined priority order incorporated in the bitstream, or only the scaled motion vectors associated with the list 1 co-located block 330 (i.e., motion vectors 1014 and 1016) according to a predefined priority order embedded in the bitstream. It is also possible to choose two or three of the scaled motion vectors in Figure 11 or Figure 12 as candidates for the motion vector predictor or motion vector predictor candidate, and the motion vector predictor or motion vector predictor candidate for the current block is then determined from the candidates according to a priority order. For example, the candidates may include the two motion vectors 1002 and 1004 of Figure 11, and according to a priority order, motion vector 1002 will be considered before motion vector 1004. In another embodiment, the candidates include the three motion vectors 1012, 1014, and 1016 of Figure 12, and according to a priority order, motion vector 1012 will be considered first, then motion vector 1014, and lastly motion vector 1016. As described earlier, while the example in Figure 11 considers all scaled motion vectors corresponding to the list 0 and list 1 co-located motion vectors as motion vector predictor or motion vector predictor candidates, the candidates can be limited to the scaled motion vectors corresponding to the block co-located in list 0 or list 1 only. As before, the list 0 reference image index RefIdxL0 can be implicitly derived or explicitly signaled in the bitstream. In the case that the current list is list 0 and RefIdxL0 = 0, only the scaled motion vectors associated with the block co-located in list 1 may be considered as candidates. An example of the predefined priority order for this case is shown in Figure 13. Information about the current list and the list 0 index can be embedded or defined for each prediction unit (PU). The motion vector predictor or motion vector predictor candidate is determined according to the following order of priority:
• If mvL0l exists, then mvL0 = scaled mvL0l;
• Otherwise, if mvL1l exists, then mvL0 = scaled mvL1l;
• Otherwise, mvL0 is not available.
If, instead, only the motion vectors associated with the block co-located in list 0 are considered as candidates, an example of the predefined order of priority for this case is shown in Figure 14. The motion vector predictor or motion vector predictor candidate is determined according to the following priority order:
• If mvL1j exists, then mvL0 = scaled mvL1j;
• Otherwise, if mvL0j exists, then mvL0 = scaled mvL0j;
• Otherwise, mvL0 is not available.
In the examples above, the motion vector predictor is based on either the scaled motion vectors or the unscaled motion vectors of the list 0 and list 1 co-located motion vectors. The scaled motion vectors and the unscaled motion vectors of the list 0 and list 1 co-located motion vectors can also be combined to provide more options for better motion vector prediction.
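A minimal sketch of this combined use, matching the per-candidate fallback detailed in the priority lists that follow: an existing candidate is taken unscaled when its reference image is in the current list, and otherwise its scaled version is taken; the Candidate structure and the example values are illustrative assumptions.

```cpp
#include <cstdio>
#include <optional>
#include <vector>

// Combined use of unscaled and scaled candidates: each candidate is tried in
// priority order; the unscaled co-located MV is used when its reference image
// is in the current list, otherwise its scaled version is used instead.
struct MV { int x, y; };

struct Candidate {
    const char*       name;       // e.g. "mvL0l"
    std::optional<MV> mv;         // unscaled co-located MV (empty if absent)
    bool              refInList;  // does its reference image belong to the current list?
    MV                scaled;     // pre-derived scaled version of the same MV
};

static std::optional<MV> deriveMvp(const std::vector<Candidate>& priorityOrdered) {
    for (const Candidate& c : priorityOrdered) {
        if (!c.mv) continue;                       // candidate does not exist
        if (c.refInList) {
            std::printf("using unscaled %s\n", c.name);
            return c.mv;
        }
        std::printf("using scaled %s\n", c.name);  // fall back to the scaled MV
        return c.scaled;
    }
    return std::nullopt;                           // no candidate available
}

int main() {
    std::vector<Candidate> cands = {
        {"mvL0l", MV{16, -8}, false, MV{4, -2}},   // exists, but its reference is not in the current list
        {"mvL1j", MV{5, 2},   true,  MV{5, 2}},
    };
    std::optional<MV> mvp = deriveMvp(cands);
    if (mvp) std::printf("mvL0 = (%d, %d)\n", mvp->x, mvp->y);  // scaled mvL0l: (4, -2)
    return 0;
}
```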
Figure 15A illustrates an example of motion vector prediction using scaled motion vectors and unscaled motion vectors. Both the scaled motion vectors 1002, 1004, 1006, and 1008 between the current block 310 and the reference image in list 0 with image id = j, and the unscaled motion vectors mvL0l, mvL1j, mvL0j, and mvL1l of the co-located blocks in the future and past reference images are used as candidates for the motion vector predictor or motion vector predictor candidate. Figure 15B illustrates another example of motion vector prediction using scaled motion vectors and unscaled motion vectors. Both the scaled motion vectors 1012, 1014, 1016 and 1018 between the current block 310 and the reference image in list 1 with image id = l, and the unscaled motion vectors mvL1j, mvL0l, mvL1l and mvL0j are used as candidates for the motion vector predictor or motion vector predictor candidate. Figure 16 illustrates an example of motion vector prediction using both the scaled motion vectors and the unscaled motion vectors of the co-located blocks in the future and past reference images as candidates for the motion vector predictor, as shown in Figure 15A. The motion vector predictor candidates or motion vector predictor candidate include the scaled motion vectors 1002, 1004, 1006, and 1008 between the current block 310 and the list 0 reference image with image id = j, and the unscaled motion vectors mvL0l, mvL1j, mvL0j, and mvL1l of the blocks co-located in the future and past reference images. The determination of the predictor or predictor candidate is according to the following predefined priority order:
• If mvL0l exists and the corresponding reference image (pic. id = i - 1) is in list 0 of the current image, then mvL0 = mvL0l (reference pic. id = i - 1); otherwise, mvL0 = scaled mvL0l (reference pic. id = j);
• Otherwise, if mvL1j exists and the corresponding reference image (pic. id = m + 1) is in list 0 of the current image, then mvL0 = mvL1j (reference pic. id = m + 1); otherwise, mvL0 = scaled mvL1j (reference pic. id = j);
• Otherwise, if mvL0j exists and the corresponding reference image (pic. id = j) is in list 0 of the current image, then mvL0 = mvL0j (reference pic. id = j); otherwise, mvL0 = scaled mvL0j (reference pic. id = j);
• Otherwise, if mvL1l exists and the corresponding reference image (pic. id = l) is in list 0 of the current image, then mvL0 = mvL1l (reference pic. id = l); otherwise, mvL0 = scaled mvL1l (reference pic. id = j);
• Otherwise, mvL0 is not available.
Figure 17 illustrates an example of motion vector prediction using both the scaled motion vectors and the unscaled motion vectors of the co-located blocks in the future and past reference images as motion vector predictor candidates, as depicted in Figure 15B. The motion vector predictor candidates or motion vector predictor candidate include the scaled motion vectors 1012, 1014, 1016, and 1018 between the current block 310 and the reference image in list 1 with image id = l, and the unscaled motion vectors mvL1j, mvL0l, mvL1l, and mvL0j of the co-located blocks in the future and past reference images. The determination of the predictor or predictor candidate is according to the following predefined priority order:
• If mvL1j exists and the corresponding reference image (pic. id = m + 1) is in list 1 of the current image, then mvL1 = mvL1j (reference pic. id = m + 1); otherwise, mvL1 = scaled mvL1j (reference pic. id = l);
• Otherwise, if mvL0l exists and the corresponding reference image (pic. id = i - 1) is in list 1 of the current image, then mvL1 = mvL0l (reference pic. id = i - 1); otherwise, mvL1 = scaled mvL0l (reference pic. id = l);
• Otherwise, if mvL1l exists and the corresponding reference image (pic. id = l) is in list 1 of the current image, then mvL1 = mvL1l (reference pic. id = l); otherwise, mvL1 = scaled mvL1l (reference pic. id = l);
• Otherwise, if mvL0j exists and the corresponding reference image (pic. id = j) is in list 1 of the current image, then mvL1 = mvL0j (reference pic. id = j); otherwise, mvL1 = scaled mvL0j (reference pic. id = l);
• Otherwise, mvL1 is not available.
In the above examples of motion vector prediction according to a predefined priority order, a respective priority order is used in each example to illustrate the process of determining a motion vector predictor or a motion vector predictor candidate from the motion vector candidates. The particular priority order used is in no way to be interpreted as a limitation of the present invention. A person skilled in the art can opt for different priority orders for the motion vector candidates to implement the present invention. Furthermore, although the above examples illustrate that the order among the motion vector candidates is determined according to a predefined priority order, the candidate priority ordering can also be performed according to an adaptive scheme. The adaptive priority ordering scheme can be based on the statistics of reconstructed motion vectors from previous blocks, the current block partition type, the correlation of the motion vectors, the directions of the motion vectors, whether the scaled motion vector direction corresponds to interpolation or extrapolation, the distance of the motion vectors, or whether the motion vector crosses the current block, in the case of a temporal motion vector. Furthermore, the adaptive scheme may also be based on a combination of two or more of the factors mentioned above. When the statistics of the reconstructed motion vectors from previous blocks are used for the adaptive scheme, the statistics can be related, as an example, to the counts of the motion vector candidates. The order of priority is adapted to the counts of the motion vector candidates, where a motion vector candidate with a higher count will be assigned a higher priority for the motion vector predictor. When the current block partition type is used for the adaptive scheme, for example, if a current coding unit of size 2Nx2N is split into two rectangular prediction units of size Nx2N and the current block is the left prediction unit, the motion vector with the greatest similarity to the left neighbor of the current coding unit will be assigned a higher priority; and if a current coding unit of size 2Nx2N is divided into two rectangular prediction units of size Nx2N and the current block is the right prediction unit, the motion vector with the greatest similarity to the top right neighbor of the current coding unit will be assigned a higher priority. When the correlation between motion vectors is used for the adaptive scheme, the motion vector with the highest correlation will be assigned a higher priority. For example, if two motion vectors in the priority list are exactly the same, the motion vector is considered to have the highest correlation. When the motion vector direction is used for the adaptive scheme, the motion vector that points in the direction of the target reference image, as an example, will be assigned a higher priority.
When whether the scaled motion vector direction corresponds to interpolation or extrapolation is used for the adaptive scheme, the scaled motion vectors derived by the interpolation process, as an example, will be assigned a higher priority. When the distance between motion vectors is used for the adaptive scheme, a motion vector with a shorter temporal distance from the current block to the target reference image, as an example, will be assigned a higher priority. When whether a temporal motion vector crosses the current block is used for the adaptive scheme, a motion vector that crosses the current block will be assigned a higher priority. Note that the invention can be applied not only to Inter mode, but also to Skip, Direct, and Merge modes. In Inter mode, given a current list, a motion vector predictor is used to predict the motion vector of a PU, and a motion vector residue is transmitted. The present invention can be applied to derive the motion vector predictor when the motion vector competition scheme is not used, or to derive a motion vector predictor candidate when the motion vector competition scheme is used. As for Skip, Direct, and Merge, they can be considered as special cases of the Inter mode, where the motion vector residue is not transmitted and is always inferred as zero. In such cases, the present invention can be applied to determine the motion vector when the motion vector competition scheme is not used, or to derive a motion vector candidate when the motion vector competition scheme is used. The motion vector prediction method according to the present invention as described above can be implemented in hardware, in software code, or in a combination of both. For example, an embodiment of the present invention may be circuitry integrated into a video compression chip or program code integrated into video compression software to perform the processing described herein. An embodiment of the present invention may also be program code executed on a digital signal processor (DSP) to perform the processing described herein. The invention may also involve a number of functions to be performed by a computer processor, a digital signal processor, a microprocessor, or a field programmable gate array (FPGA). These processors can be configured to perform specific tasks in accordance with the invention by executing computer-readable software code or firmware code that defines the particular methods embodied by the present invention. The software code or firmware code can be developed in different programming languages and different formats or styles. The software code can also be compiled for different target platforms. However, different code formats, styles, and languages of software code, as well as other means of configuring code to perform the tasks in accordance with the invention, will not depart from the spirit and scope of the invention. The invention may be embodied in other specific forms without departing from its spirit or essential characteristics. The described examples are to be considered in all respects only as illustrative and not restrictive. The scope of the invention is therefore indicated by the appended claims rather than by the foregoing description. All changes that come within the meaning and range of equivalency of the claims are to be embraced within their scope.
Claims:
Claims (34) [0001] 1. Method for deriving a motion vector predictor or motion vector predictor candidate or motion vector or motion vector candidate for a current block in a current image, the method characterized in that it comprises: receiving two or more motion vectors of a temporal block in a first reference image in a first list selected from a group consisting of list 0 and list 1, wherein a first motion vector of said two or more motion vectors points to another reference image in the first list and a second motion vector of said two or more motion vectors points to a second reference image in a second list different from the first list; determining a candidate set based on said two or more motion vectors, wherein said determining the candidate set comprises deriving a first scaled motion vector from the first motion vector and a second scaled motion vector from the second motion vector for the pool of candidates; and determining the motion vector predictor or motion vector predictor candidate or motion vector or motion vector candidate for the current block of the candidate set according to a priority order of said at least two motion vectors. [0002] 2. Method according to claim 1, characterized in that the order of priorities is a predefined order of priority. [0003] 3. Method according to claim 2, characterized in that the information associated with the predefined priority order is incorporated in a sequence header, an image header, or a slice header. [0004] 4. Method according to claim 1, characterized in that the order of priority is determined according to an adaptive scheme. [0005] 5. Method according to claim 4, characterized in that the adaptive scheme is based on a criterion selected from a group consisting of: statistic of reconstructed motion vectors from previous blocks, partition type of the current block, correlation of motion vectors, directions of motion vectors, distance of motion vectors, and whether motion vectors cross the current block in case of temporal motion vector. [0006] 6. Method according to claim 1, characterized in that the set of candidates is selected from said two or more motion vectors. [0007] 7. Method according to claim 1, characterized in that said determination of the set of candidates comprises deriving scaled motion vectors corresponding to said two or more motion vectors, wherein the set of candidates is selected from the staggered motion vectors or a combination of said two or more motion vectors and staggered motion vectors. [0008] 8. Method according to claim 7, characterized in that the set of candidates is selected from the scaled motion vectors. [0009] 9. Method according to claim 7, characterized in that the set of candidates is selected from said two or more motion vectors and the scaled motion vectors. [0010] 10. Method according to claim 7, characterized in that the order of priority is according to an adaptive scheme based on whether directions of scaled motion vectors are interpolation or extrapolation. [0011] 11. Method according to claim 1, characterized in that the order of priority is determined based on individual blocks. [0012] 12. Method according to claim 1, characterized in that the order of priority is determined based on an individual slice. [0013] 13. Method according to claim 1, characterized in that the first list corresponds to list 1 and the second scaled motion vector has a higher priority than the first scaled motion vector. [0014] 14. 
Method according to claim 1, characterized in that the first list corresponds to list 0 and the second scaled motion vector has a higher priority than the first scaled motion vector. [0015] 15. Method according to claim 1, characterized in that the first list corresponds to list 0 and the first scaled motion vector has a higher priority than the second scaled motion vector. [0016] 16. Apparatus for deriving a motion vector predictor or motion vector predictor candidate or motion vector or motion vector candidate for a current block in a current image, the apparatus characterized in that it comprises: means for receiving two or more motion vectors of a temporal block in a first reference image in a first list selected from a group consisting of list 0 and list 1, wherein a first motion vector of said two or more vectors of motion points to another reference image in the first list and a second motion vector of said two or more motion vectors points to a second reference image in a second list different from the first list; means for determining a set of candidates based on said two or more motion vectors, said means for determining the set of candidates comprises means for deriving a first scaled motion vector from the first motion vector and a second motion vector scaled from the second motion vector to the candidate set; and means for determining the motion vector predictor or motion vector predictor candidate or motion vector or motion vector candidate for the current block of candidate set according to a priority order of said two motion vectors. [0017] 17. Apparatus according to claim 16, characterized in that the order of priority is a predefined order of priority. [0018] 18. Apparatus according to claim 17, characterized in that the information associated with the predefined priority order is incorporated in a sequence header, an image header, or a slice header. [0019] 19. Apparatus according to claim 16, characterized in that the order of priority is determined according to an adaptive scheme. [0020] 20. Apparatus according to claim 19, characterized in that the adaptive scheme is based on a criterion selected from a group consisting of: reconstructed motion vector statistics from previous blocks, partition type of the current block, correlation of motion vectors, directions of motion vectors, distance of motion vectors, and whether motion vectors cross the current block in a temporal motion vector case. [0021] 21. Apparatus according to claim 16, characterized in that the set of candidates is selected from said two or more motion vectors. [0022] 22. Apparatus according to claim 16, characterized in that said means for determining the set of candidates comprises means for deriving scaled motion vectors corresponding to said two or more motion vectors, wherein the set of candidates is selected from the scaled motion vectors or a combination of said two or more motion vectors and the scaled motion vectors. [0023] 23. Apparatus according to claim 22, characterized in that the set of candidates is selected from the scaled motion vectors. [0024] 24. Apparatus according to claim 22, characterized in that the set of candidates is selected from said two or more motion vectors and the scaled motion vectors. [0025] 25. Apparatus according to claim 22, characterized in that the order of priority is according to an adaptive scheme based on whether directions of scaled motion vectors are interpolation or extrapolation. [0026] 26. 
Method for deriving a motion vector predictor or motion vector predictor candidate or motion vector or motion vector candidate for a current block in an image, the method characterized in that it comprises: receiving two or more motion vectors of at least one temporal block in a first reference image of a first reference image list selected from a group consisting of list 0 and list 1, wherein a first motion vector of said two or more vectors a motion vector points to another reference image in the first list and a second motion vector of said two or more motion vectors points to a second reference image in a second list different from the first list; determining a candidate set based on said two or more motion vectors, wherein said determining the candidate set comprises deriving a first scaled motion vector from the first motion vector and a second scaled motion vector from the second vector of movement for the set of candidates; and determining the motion vector predictor or motion vector predictor candidate or motion vector or motion vector candidate for the current block of the candidate set in accordance with a priority order of said at least two motion vectors; where information associated with the order of priority is incorporated into a sequence header, an image header, or a slice header. [0027] 27. Method according to claim 26, characterized in that the set of candidates is selected from said two or more motion vectors. [0028] 28. Method according to claim 26, characterized in that said determination of the set of candidates comprises deriving the scaled motion vectors corresponding to said two or more motion vectors, wherein the set of candidates is selected from of the scaled motion vectors or a combination of said two or more motion vectors and the scaled motion vectors. [0029] 29. Method according to claim 28, characterized in that the set of candidates is selected from the scaled motion vectors. [0030] 30. Method according to claim 26, characterized in that the first reference image list is a list 0 of reference images or a list 1 of reference images. [0031] 31. Apparatus for deriving a motion vector predictor or motion vector predictor candidate or motion vector or motion vector candidate for a current block in an image, the apparatus characterized in that it comprises: means for receiving two or more motion vectors of at least one temporal block in a first reference image list selected from a group consisting of list 0 and list 1, wherein a first motion vector of said two or more motion vectors points to another reference picture in the first list and a second motion vector of said two or more motion vectors points to a second reference picture in a second list different from the first list; means for determining a set of candidates based on said two or more motion vectors, said means for determining the set of candidates comprises means for deriving a first scaled motion vector from the first motion vector and a second motion vector scaled from the second motion vector to the candidate set; and means for determining the motion vector predictor or motion vector predictor candidate or motion vector or motion vector candidate for the current block of the candidate set in accordance with the order of priority of said at least two vectors of movement; where information associated with the order of priority is incorporated into a sequence header, an image header, or a slice header. [0032] 32. 
[0032] 32. Apparatus according to claim 31, characterized in that the candidate set is selected from said two or more motion vectors.
[0033] 33. Apparatus according to claim 31, characterized in that said means for determining the candidate set comprises means for deriving scaled motion vectors corresponding to said two or more motion vectors, wherein the candidate set is selected from the scaled motion vectors or from a combination of said two or more motion vectors and the scaled motion vectors.
[0034] 34. Apparatus according to claim 33, characterized in that the candidate set is selected from the scaled motion vectors.
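Claims 19, 20, and 25 allow the priority order to be chosen adaptively, for example from whether a scaled candidate corresponds to interpolation or extrapolation. The rule sketched below, which simply tries interpolated candidates before extrapolated ones, is one possible reading offered for illustration; the claims do not fix a particular rule.

```python
# Illustrative adaptive ordering: candidates whose co-located motion vector
# crosses the current picture (interpolation) are placed ahead of those that
# do not (extrapolation). The specific rule is an assumption.

def crosses_current_picture(cur_poc, col_poc, col_ref_poc):
    # Interpolation: the current picture lies between the co-located picture
    # and the reference picture its motion vector points to.
    return (col_poc - cur_poc) * (col_ref_poc - cur_poc) < 0

def adaptive_priority(candidates, cur_poc):
    """candidates: list of (scaled_mv, col_poc, col_ref_poc) tuples."""
    return sorted(candidates,
                  key=lambda c: 0 if crosses_current_picture(cur_poc, c[1], c[2]) else 1)

# Example: at current POC 8, a candidate derived from a co-located MV spanning
# POC 10 -> POC 4 (it crosses POC 8) outranks one spanning POC 10 -> POC 12.
ordered = adaptive_priority([((12, -4), 10, 12), ((5, -3), 10, 4)], cur_poc=8)
```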