Patent abstract:
The aim of the invention is to calibrate an on-board stereo system in a reproducible manner, without being sensitive to the irregularities of the road or to the image noise produced by the cameras. To do this, the invention proposes to correlate the depth difference of a point of a scene observed by the system with respect to a supposedly flat scene and the corresponding yaw deviation between the cameras of the system, and then to deduce a yaw calibration correction for the cameras. According to one embodiment, the comparison between the scene as observed and as expected consists in determining, by spatio-temporal filtering, a depth difference (ΔZa) between the observed depth of at least one point (P1a) of a plane-scene image formed in the image plane of a first camera (11), as positioned in the frame of the other camera, and the expected depth of that point (P2a) projected onto the plane scene (120) from the first camera (11), and then in determining the yaw calibration deviation between the cameras according to the depth deviation averaged over a sufficient set of points.
Publication number: FR3026591A1
Application number: FR1459047
Filing date: 2014-09-25
Publication date: 2016-04-01
Inventor: Lucien Garcia
Applicant: Continental Automotive GmbH; Continental Automotive France SAS
Primary IPC class:
Patent description:

[0001] The present invention relates to a method for the extrinsic calibration of cameras used in an on-board stereoscopic imaging system, in particular one intended to equip a motor vehicle. In the field of motor-vehicle safety, driver-assistance systems can include visual assistance through on-board stereoscopic systems. These systems are generally used to detect obstacles located in the forward field of view of the vehicle. A stereoscopic system makes it possible to determine the distance between the vehicle and these obstacles from two on-board cameras, arranged close to each other so as to provide pairs of stereo images to a digital processing unit. By analyzing the disparity between the images thus provided, the system identifies obstacles and their distance to the vehicle with precision. The recognized obstacles are, moreover, brought to the attention of the driver by the driver-assistance system. The reliability of the cameras can become decisive, for example when it comes to knowing in real time whether, in the absence of obstacles reported elsewhere, the road is free of obstacles. The accuracy depends on the calibration of the cameras and on the knowledge of any drift from an initial calibration. Calibration of the cameras concerns their intrinsic parameters, such as the adjustment of focal length or zoom, as well as their extrinsic parameters, namely the position of each camera with respect to the vehicle and the position of the cameras relative to one another. Each camera is initially calibrated intrinsically at the factory and, for the supplier, the intrinsic parameters are considered constant for the duration of use.
Conventionally, one of the cameras being considered the reference camera, the extrinsic calibration consists in adjusting the position and rotation of this reference camera with respect to the vehicle, and the rotation of the other camera, called the slave camera, with respect to the reference. The cameras must be positioned relative to each other very precisely to avoid any angular misalignment relative to their spacing. But it is difficult to achieve an assembly precise enough to obtain a zero yaw offset and thus avoid this defect. In addition, the extrinsic parameters vary over time due to variations in the conditions of use, in particular temperature variations or mechanical vibrations experienced by the cameras.
[0002] With reference to an OXYZ orthogonal coordinate system of a stereoscopic system, the calibration of the relative rotation of the cameras around the transverse axis OX (pitch angle), the longitudinal axis OZ (roll angle) and the elevation axis OY (yaw angle) can advantageously be performed by applying the epipolar constraint used in the search for the stereoscopic pairing of points in a so-called epipolar geometry space. This geometry establishes the relationships between the points of different images of the same scene (image points), made from different points of view, these image points corresponding to the projections, in the image space of the cameras, of the same object point of the scene.
[0003] The epipolar constraint makes it possible to limit the search, in a given image, for the image point of an object point to a projection line called the epipolar line, when only the position of the image point in the other image is known. The epipolar constraint thus guides the construction of a stereoscopic image through the search for pairings between each point of a first monocular image, produced by a first camera, and the points of the epipolar line of the other image, produced simultaneously by the other camera. The epipolar geometry makes it possible to deduce, by simple relations, the corresponding image points together with their depth of field, in order to reconstruct stereoscopic images, that is to say three-dimensional vision.
[0004] However, the pixelation of images has an impact on their quality. This impact is measurable for roll or pitch calibration, because the calibration error is then directly visible in the image: a roll or pitch rotation of one degree causes a difference of about 10 pixels in the image. But the yaw deviation cannot be corrected from the image, because the sensitivity is too low: the projection gap on the epipolar line remains too small compared with the image noise - less than 0.1 pixel on average for an offset of one degree - and the epipolar constraint cannot be exploited. To remedy this problem, and more generally to overcome imprecise assembly of the cameras - which particularly affects the yaw calibration - it could be envisaged to use additional external information, such as the speed of the vehicle or the depth of the scene measured by another sensor. For example, a radar makes it possible to locate an object - for example a vehicle - at a given distance. The same vehicle is then observed with a first camera of the stereo system, and the yaw angle with the other camera is calibrated so that the vehicle indeed appears at the given distance. However, the radar is not precise enough and therefore requires a significant number of reference measurements. In addition, a radar entails an additional cost.
[0005] Other developments have been explored that avoid the constraint of using a radar, using only the image-processing system. Thus, patent document FR 2 986 358 describes the calibration of a camera mounted on a vehicle from the capture of particular target points marked on a test pattern. By solving a system of nonlinear equations with six unknowns - three components in translation and three in rotation - a point of given coordinates in the image plane of the camera is then positioned in the scene. This solution is difficult to apply to the yaw calibration between two cameras of a stereoscopic system, because the complexity of the system does not make it possible to arrive at unambiguous determinations for two moving cameras from the same test pattern. International application WO 2011/079258 proposes to determine the real-time decalibration of a multi-camera system, more particularly its extrinsic decalibration, and to re-calibrate it, from the correspondence between the observed data of an image - their measurements - and those predicted by applying the previously set calibration. The data matching, which refers to characteristics of typical object models, is stored as historical statistics of alignment scores measured in real time. However, this solution relates to systems of at least three cameras and uses several model objects, such as circles, or a 3D model, such as a cylinder, to implement the method. The use of templates restricts the use of this method. In addition, the performance of the decalibration determination, in particular of the yaw decalibration for an on-board stereoscopic system, is not measurable. The object of the invention is to calibrate an on-board stereoscopic camera system in a reproducible manner, without being sensitive to the irregularities of the road or to the image noise produced by the cameras.
To do this, the invention proposes to correlate the depth difference of a point of a scene observed by the system with respect to a supposedly flat scene and the corresponding yaw deviation between the cameras of the system, and then to deduce a yaw calibration correction for the cameras. To this end, the subject of the present invention is an extrinsic calibration method for a first and a second camera of a stereoscopic image-formation system on board a motor vehicle, comprising the following steps: - acquisition of the intrinsic mono calibrations and the extrinsic stereo calibrations of roll, pitch and yaw of the cameras; - acquisition of successive stereoscopic images of a scene in a field of view by reconstruction of three-dimensional images calibrated by the preceding calibration step, each stereoscopic image being reconstructed from left and right images of the first and second cameras produced simultaneously and digitally stored as pixels; - filtering of the stereoscopic images thus reconstructed; - comparison, by spatio-temporal filtering over a predetermined number of points, between a profile of the scene as observed in each reconstructed image and a linear profile of the scene as expected; - establishment of a yaw calibration deviation from the preceding comparison; - correction of the yaw calibration of the cameras from the yaw calibration deviation determined in the preceding step; - and iteration of the correction of the preceding step until a zero calibration deviation is obtained by repeating the preceding steps.
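The iterative steps above can be read as a simple control loop. The sketch below is only an illustrative reading of the method, not the patent's implementation; all function names, the gain and the convergence threshold are assumptions made for this sketch.

```python
# Illustrative sketch of the iterative yaw-calibration loop (steps 30 to 61).
# grab_stereo_pair, reconstruct_3d, depth_deviation and yaw_gain are
# hypothetical callables/parameters supplied by the surrounding system.

def yaw_calibration_loop(grab_stereo_pair, reconstruct_3d, depth_deviation,
                         yaw_gain, tolerance=1e-4, max_iter=50):
    """Iterate: reconstruct the scene, compare it with the flat-road model,
    and accumulate a yaw correction until the deviation vanishes."""
    yaw_correction = 0.0
    for _ in range(max_iter):
        left, right = grab_stereo_pair()                     # image acquisition
        cloud = reconstruct_3d(left, right, yaw_correction)  # step 30
        d_z = depth_deviation(cloud)                         # steps 40-50 (filtered, averaged)
        d_yaw = yaw_gain * d_z                               # step 60: depth -> yaw deviation
        if abs(d_yaw) < tolerance:                           # zero deviation reached: stop
            break
        yaw_correction += d_yaw                              # step 61: inject the correction
    return yaw_correction
```

Provided the measured depth deviation shrinks as the accumulated correction approaches the true yaw offset, the loop converges in a few iterations, which mirrors the "iterate until a zero calibration deviation" step.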
According to preferred embodiments: - the comparison between the profiles of the scene as observed and as expected consists in determining a depth-of-field difference between a depth of field of at least one point of the scene as observed and a depth of field of the corresponding point of the linear-profile scene as expected, and then determining the yaw calibration deviation between the first and second cameras as a function of the depth deviation averaged over a predetermined set of points; - the observed depth of field of an image point of the scene formed by the first camera is determined according to a disparity between the images of the cameras; - the distance between said first camera and the image point is equal to the product of the focal length of the first camera and the distance between the cameras, divided by the disparity between the images of the cameras; - the expected depth is determined by a transformation matrix between a frame of said first camera and a frame of the vehicle; - the yaw calibration deviation is, for each image point, proportional to the depth difference and inversely proportional to the square of the expected depth; - the yaw calibration deviation between the first and the second camera corresponds to a convergence, respectively a divergence, between these cameras, depending on whether the profile of the observed scene is inclined with a globally positive, respectively globally negative, slope with respect to the expected scene of globally zero slope; - an elementary yaw-angle variation dL between the first and the second camera, as a function of an elementary depth difference dZ between the same point as observed and as expected, respectively on the profile of the observed scene and on the linear profile of the expected scene, is given by the following formula: dL = -dZ·b / (Z²·(1 + Un²)), "b" being the distance between the cameras, "Z" the expected depth of the point in the linear-profile scene, and "Un" the normalized transverse coordinate of the image point corresponding to the observed point in a two-dimensional image frame of the first camera (with Un = (Ui - U0)/f, "Ui" being the transverse coordinate of the image point in the image plane of the first camera, "U0" the corresponding coordinate of the image center, and "f" the focal length of said first camera). Other data, characteristics and advantages of the present invention will become apparent on reading the nonlimiting detailed description below, with reference to the appended figures, which represent, respectively: - FIG. 1, a flow diagram of the yaw calibration according to the method of the present invention; - FIGS. 2a and 2b, side and top views of a vehicle in which the cameras of a stereoscopic system are assembled on an on-board support; - FIG. 3a, a side view of different road profiles: the profile assumed to be linear, the actual profile traveled by the vehicle, and the profile as observed by a stereoscopic system on board the vehicle - through a first camera in the frame of the second camera - in the case of a convergent yaw defect between the cameras of the system; - FIG. 3b, a side view of the same road profiles as FIG. 3a, in the case of a divergent yaw defect between the cameras; - and FIG. 4, a perspective view of the image plane of the first camera of the stereoscopic system. The flow diagram of FIG. 1 illustrates the implementation of the extrinsic calibration method according to the invention for a stereoscopic camera system on board a vehicle, starting from a step 10 of acquiring images of the road as observed by the system and a parallel step of acquiring images of the road as expected (step 20), that is to say images of the road considered as flat. The on-board stereoscopic system includes, in addition to the cameras, a unit for processing the data from these cameras in order to construct three-dimensional images.
[0006] Each image of the observed road (step 10) is derived from a calibrated three-dimensional (3D) reconstructed image made from a large number of points - for example 100,000 points - seen by the cameras of the stereoscopic system. The depth of these points is determined from a map of disparities between the left and right images formed by the corresponding cameras.
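For a rectified stereo pair, the depth of each point follows from the disparity map through the relation restated in claim 4, Z = f·b/d. A minimal sketch, with illustrative numeric values only:

```python
# Depth from disparity for a rectified stereo pair: Z = f * b / d, with
# f the focal length in pixels, b the baseline in metres and d the
# disparity in pixels. Values used below are examples, not from the patent.

def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Depth (m) of a point whose left/right images are disparity_px apart."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# Example: f = 1200 px, b = 0.30 m, disparity = 18 px  ->  Z = 20 m
z = depth_from_disparity(18.0, 1200.0, 0.30)
```

The hyperbolic dependence on d is why depth errors grow quickly with distance: at large Z a sub-pixel disparity bias already shifts the reconstructed road profile noticeably.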
[0007] To reconstruct each 3D image in step 30, an initial intrinsic mono and extrinsic stereo calibration is acquired (step 31), and the reconstructed images are filtered in step 40. Each observed and reconstructed 3D image of step 10 is then compared in depth with the corresponding expected image of step 20 of the supposedly planar road. Each expected image is formed from the reference two-dimensional (2D) image of one of the cameras of the stereoscopic system, this same reference image serving to produce the corresponding reconstructed 3D image. The image of the road is assumed to be flat after spatio-temporal filtering over a predetermined number of points: several points are used over several consecutive images in order to obtain an averaged profile of the road image. This predetermined number is at least equal to one but, taking into account the acquisition noise and the more or less irregular profile of the road, it is advantageous to take several points over several consecutive images. The number of useful images depends on the speed of the vehicle and on the desired calibration accuracy. A depth difference is then established in step 50 from this comparison.
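The spatio-temporal averaging of step 50 can be sketched as pooling per-point depth deviations over consecutive frames and rejecting gross outliers (obstacles, road irregularities) before averaging. The outlier rule and threshold below are assumptions for illustration, not the patent's filtering.

```python
# Sketch of step 50: pool depth deviations from many points over several
# consecutive frames, discard outliers, and average the rest. The simple
# mean/spread rejection rule is a hypothetical choice for this sketch.

def averaged_depth_deviation(deviations_per_frame, outlier_factor=3.0):
    """Mean depth deviation over frames, excluding gross outliers."""
    pooled = [dz for frame in deviations_per_frame for dz in frame]
    mean = sum(pooled) / len(pooled)
    spread = (sum((dz - mean) ** 2 for dz in pooled) / len(pooled)) ** 0.5
    kept = [dz for dz in pooled
            if abs(dz - mean) <= outlier_factor * spread] or pooled
    return sum(kept) / len(kept)
```

With a tight `outlier_factor`, a point sitting on an obstacle (a large isolated deviation) is dropped, while the consistent road points dominate the average, which is the role the description assigns to the filtering of step 40.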
[0008] A prior calibration can be conducted to establish empirically the correlation between a depth deviation and the corresponding yaw-angle deviation. A deterministic approach to this correlation is proposed below. The yaw calibration deviation due to a yaw variation between the cameras is thus determined in step 60 and injected in step 61, in the form of an image correction, into the reconstruction of the 3D image as established in step 30. With each establishment of a new yaw calibration deviation, the calibration is successively corrected. This iteration is continued as long as the correction to be made in step 30, determined in step 60, is non-zero. As illustrated by the side and top views of FIGS. 2a and 2b, the cameras 11 and 12 of the stereoscopic system are assembled on an on-board support 110 disposed on the upper part of the windshield 1a of the vehicle 1. In these figures appear the frames O1X1Y1Z1, O2X2Y2Z2 and OXYZ, respectively centered on the optical centers O1, O2 and O of the first camera 11, the second camera 12 and the stereoscopic system, these frames being respectively linked to the cameras 11 and 12 and to the vehicle 1. The distance between the cameras is measured by that between their optical centers O1 and O2, denoted "b" in the figure. More particularly, the angular offset ΔL between the two-dimensional sub-frames X1O1Z1 and X2O2Z2 of the cameras 11 and 12 (FIG. 2b) represents the yaw deviation between the cameras 11 and 12. In FIG. 2b, this yaw deviation ΔL is illustrated between the axes O1X1 and O2X2 of said sub-frames, but could equally well be represented by the angular offset between the axes O1Z1 and O2Z2, which form the optical axes of the cameras 11 and 12. Now, the depth difference is considered, according to the invention, as revealing a yaw calibration defect between the cameras due to this angular offset ΔL. With reference to the side views of the road profiles according to FIGS.
3a and 3b, this depth difference is illustrated in the Y1O1Z1 plane of the vehicle 1, perpendicular to the image plane of the first camera 11. In these figures, the actual profile 121 of the road defines on average - by spatio-temporal filtering - a linear profile 120 called the reference profile. In these figures also appears the profile 130 of the road as observed by the on-board stereoscopic system, according to a globally ascending profile 13a (FIG. 3a) or descending profile 13b (FIG. 3b), corresponding to a yaw-angle defect of the cameras, respectively convergent or divergent, which generates the yaw calibration defect. The yaw-angle deviation of the cameras results from an installation error or a variation of one or other of the cameras of the system, such as the camera 11, on their on-board support 110. And the inclination of the road profile as observed, 13a or 13b in the two cases (FIGS. 3a and 3b), is all the stronger as the yaw-angle defect, and therefore the yaw calibration deviation, is large. In the first case (FIG. 3a), any point P1a of the observed road profile 13a is seen by the stereoscopic system as closer than the corresponding point P2a of the linear reference profile 120 as expected, the points P1a and P2a being on the line Da passing through the optical center "O" of the stereoscopic system. This optical center coincides, in side view according to FIGS. 2a and 2b, with the optical center O1 of the camera 11. This "approaching" of the point P1a results from the positive inclination of the profile 13a.
[0009] The depth difference ΔZa between the points P1a and P2a is measured along the axis O1Z1. This difference ΔZa increases with the distance of the point P1a considered along the axis O1Z1, because of the inclination of the observed profile 13a relative to the linear reference profile 120. Obstacles can then appear in the field of view of the stereoscopic system on the real road corresponding to the profile 121; they can be excluded, for example, by filtering (step 40, FIG. 1). In the second case (FIG. 3b), any point P1b of the observed road profile 13b is seen by the stereoscopic system as more distant than the corresponding point P2b of the linear reference profile 120, the points P1b and P2b being on the line Db passing, in projection on this side view, through the optical center O1 of the camera 11.
[0010] This "moving away" of the point P1b is due to the negative inclination of the profile 13b. The depth difference ΔZb between the points P1b and P2b is also measured along the axis OZ. This difference ΔZb increases with the distance of the point P1b considered, because of the inclination of the observed profile 13b relative to the linear reference profile 120. It therefore appears important to be able to correct the "convergent" or "divergent" yaw calibration deviation of the stereoscopic system (due to vibrations, initial calibration and/or thermal effects), which induces profiles of respectively positive or negative inclination. In addition to the empirical methods mentioned above, the invention also proposes a method of quantified correction of the yaw calibration by correlation between the yaw variation ΔL formed between the cameras - causing the yaw calibration deviation - and the depth deviation ΔZ, namely ΔZa or ΔZb according to the two cases described above, as follows. To establish such a correlation, one starts from the positioning of an image point Pi of the observed scene in the image plane I1 of the camera 11, as illustrated by the perspective view of FIG. 4. The camera 11 (symbolically indicated by a dotted-line contour) has the optical center O1, center of the frame O1X1Y1Z1, defining an optical axis O1Z1, here assumed to be parallel to the longitudinal roll axis OZ of the stereoscopic system (FIGS. 2a, 2b), a transverse axis O1X1, here assumed to be parallel to the pitch axis OX of this system, and an elevation axis O1Y1, parallel to the yaw axis OY of the stereoscopic system. The camera 11 also has a focal length "f" equal to the distance between its image plane I1 and its optical center O1.
[0011] In the frame (U, V) of the image plane I1, the camera 11 forms the image point Pi of an object point Ps of the scene, with coordinates Ui and Vi, Ui being the transverse coordinate (parallel to the axis O1X1) and Vi the elevation coordinate (parallel to the axis O1Y1). Normalized coordinates Un = (Ui - U0)/f and Vn = (Vi - V0)/f of the point Pi are defined with reference to the coordinates U0 and V0 of the principal point P0, where the optical axis O1Z1 of the camera 11 pierces the image plane I1 perpendicularly. The other camera 12 of the stereoscopic system, illustrated similarly to the camera 11, forms - in a manner equivalent to the camera 11 - an image point P'i of the object point Ps, of coordinates U'i and V'i in the frame (U', V') of its image plane I2, of principal point P'0. The reference frame O2X2Y2Z2 of the camera 12 is centered on the optical center O2 of this camera, the axis O2Z2 forming its optical axis. An elementary yaw-angle variation dL between the two cameras 11 and 12 of the system is then determined as a function of an elementary depth difference dZ - corresponding to the finite deviations ΔZa or ΔZb of FIGS. 3a and 3b - formed on the longitudinal axis O1Z1. The elementary depth difference dZ is measured by the difference between the depth Zi of the point Pi on the axis O1Z1 and the depth Z'i of the point P'i, which is the image point of the point Ps as observed in the image plane I2 of the camera 12, by the epipolar method of stereoscopic image reconstruction. The correlation between an elementary yaw-angle variation dL and the corresponding elementary depth deviation dZ is then given by the formula: dL = -dZ·b / (Z²·(1 + Un²)), b being the distance between the optical centers of the cameras (see FIG. 2b) and Z the expected depth of the object point Ps in a scene considered as flat. It thus appears that the yaw-angle variation dL depends on the square of the expected depth Z.
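A minimal numeric sketch of this correlation, assuming the formula dL = -dZ·b / (Z²·(1 + Un²)) as written in claim 8; the numeric values below are illustrative only and do not come from the patent.

```python
# Numerical illustration of the correlation between an elementary depth
# deviation dZ and the elementary yaw variation dL, per claim 8:
#     dL = -dZ * b / (Z**2 * (1 + Un**2))
# The baseline, depth and deviation values are hypothetical examples.

def yaw_increment(d_z, baseline_m, depth_m, u_n):
    """Elementary yaw variation dL (rad) for an elementary depth deviation dZ (m)."""
    return -d_z * baseline_m / (depth_m ** 2 * (1.0 + u_n ** 2))

# For b = 0.30 m and an on-axis point (Un = 0) at Z = 20 m, a depth deviation
# of -0.5 m (point seen closer: the convergent case of FIG. 3a) maps to a
# small positive yaw increment.
d_l = yaw_increment(-0.5, 0.30, 20.0, 0.0)   # 3.75e-4 rad
```

Read the other way round, the Z² dependence shows the sensitivity the method exploits: at Z = 20 m with b = 0.30 m, a yaw error of only 1 mrad already shifts the apparent depth by about 1.3 m, far easier to detect than the sub-0.1-pixel epipolar residual mentioned in paragraph [0004].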
This strong dependence explains the steepness of the road profiles as observed by the stereoscopic system on board the vehicle in the case of a convergent or divergent yaw angle between the cameras of the system (see FIGS. 3a and 3b). The invention is not limited to the examples described and shown. Thus, the invention can be applied to systems of more than two cameras, by using the method for each set of cameras of the system (pair, triplet, quadruplet, etc.). Moreover, it is possible to use any three-dimensional image-reconstruction method capable of producing disparity maps to determine the depths of the points of a scene from the images provided by the cameras of a stereoscopic system, for example local, global and semi-global methods, which differ in the mode of determination of the matching scores, the division of the images and the mode of expression of the disparities.
[0012] Local methods rely on matching scores computed, for each pair of pixels to be matched, from the pixels that immediately surround them in each image. Various correlation functions can be used (sum of squared deviations, sum of absolute deviations, zero-mean normalized cross-correlation, etc.) to then determine the disparities of the paired pixels. For each pair of pixels analyzed, the disparity corresponding to the best score is selected. Global methods consist in optimizing an energy function defined over the entire reference image. The energy function defines the constraints that the disparity map must respect, for example the continuity of the disparity on the objects; all the disparities that minimize this energy function are then sought. Graph cuts and belief propagation are the most studied global methods. Semi-global methods are based on the same principle as global methods but on sub-parts of the image, i.e. lines or blocks. Splitting the optimization of the energy function into sub-problems reduces the computation and memory requirements compared with global methods.
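As an illustration of the local methods described first, here is a minimal one-dimensional sketch of window matching along the epipolar line of rectified images using the sum of absolute deviations (SAD). Real matchers use 2-D windows and sub-pixel refinement; this only shows the principle, and the function name and parameters are choices made for the sketch.

```python
# Minimal local matching along one epipolar line (rectified images):
# pick, for a left-image column x, the disparity d minimizing the sum of
# absolute deviations (SAD) between small windows of the two rows.

def sad_disparity(left_row, right_row, x, window, max_disp):
    """Disparity at column x of the left row minimizing the SAD score."""
    half = window // 2
    patch = left_row[x - half: x + half + 1]
    best_d, best_score = 0, float("inf")
    # Candidate disparities are clamped so the right-image window stays in bounds.
    for d in range(min(max_disp, x - half) + 1):
        cand = right_row[x - d - half: x - d + half + 1]
        score = sum(abs(a - b) for a, b in zip(patch, cand))
        if score < best_score:
            best_d, best_score = d, score
    return best_d
```

Applied to every pixel of the reference image, this yields the disparity map from which the depths of paragraph [0006] are computed; the global and semi-global methods replace the per-pixel "best score" rule with the optimization of an image-wide energy.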
Claims:
Claims (8)
[0001]
1. Method for the extrinsic calibration of a first (11) and a second camera (12) of a stereoscopic imaging system on board a motor vehicle (1), comprising the following steps: - acquisition of the intrinsic mono and extrinsic stereo calibrations of roll, pitch and yaw of the cameras (step 31); - acquisition of successive stereoscopic images of a scene in a field of view by reconstruction of three-dimensional images calibrated by the preceding calibration step, each stereoscopic image being reconstructed from left and right images of the first and second cameras (11, 12) produced simultaneously and digitally stored as pixels (step 30); - filtering of the stereoscopic images thus reconstructed (step 40); - comparison, by spatio-temporal filtering over a predetermined number of points, between a profile of the scene (130; 13a, 13b) as observed in each reconstructed image (10) and a linear profile of the scene (120) as expected (step 50); - establishment of a yaw calibration deviation (ΔL) from the preceding comparison (step 60); - correction of the yaw calibration of the cameras (11, 12) from the yaw calibration deviation determined in the preceding step (step 61); - and iteration of the correction of the preceding step until a zero calibration deviation is obtained by repeating the preceding steps (steps 31 to 61).
[0002]
2. Calibration method according to claim 1, characterized in that the comparison between the profiles of the scene as observed and as expected consists in determining a depth-of-field difference (ΔZa; ΔZb) between a depth of field of at least one point (P1a; P1b) of the scene as observed (130; 13a, 13b) and a depth of field of at least one point (P2a; P2b) of the linear-profile scene as expected (120), then determining the yaw calibration deviation (ΔL) between the first and second cameras (11, 12) as a function of the depth deviation averaged over a determined set of points.
[0003]
3. Calibration method according to the preceding claim, wherein the observed depth of field of an image point (Pi) of the plane scene formed by the first camera (11) is determined according to a disparity between the images of the cameras (11, 12).
[0004]
4. Calibration method according to claim 3, wherein the distance between said first camera (11) and the image point (Pi) is equal to the product of the focal length (f) of the first camera (11) and the distance (b) between the cameras (11, 12), divided by the disparity between the images of the cameras.
[0005]
5. Calibration method according to any one of claims 2 to 4, wherein the expected depth (Zs) is determined by a transformation matrix between a frame (O1X1Y1Z1) of said first camera (11) and a frame (OXYZ) of the vehicle (1).
[0006]
6. Calibration method according to any one of claims 2 to 5, wherein the yaw calibration deviation (ΔL) is, for each image point (Pi), proportional to the depth difference (ΔZa; ΔZb) and inversely proportional to the square of the expected depth (Zs).
[0007]
7. Calibration method according to any one of claims 2 to 6, wherein the yaw calibration deviation (ΔL) between the first (11) and the second camera (12) corresponds to a convergence, respectively a divergence, between the optical axes (O1Z1, O2Z2) of the cameras (11, 12), depending on whether the profile of the observed scene (130) is inclined with a globally positive (13a), respectively globally negative (13b), slope with respect to the expected scene (120) of globally zero slope.
[0008]
8. Calibration method according to any one of claims 2 to 7, wherein an elementary yaw-angle variation dL between the first (11) and the second camera (12), as a function of an elementary depth deviation dZ between the same point as observed (P1a; P1b) and as expected (P2a; P2b), respectively on the profile of the observed scene (130; 13a, 13b) and on the linear profile of the expected scene (120), is given by the following formula: dL = -dZ·b / (Z²·(1 + Un²)), "b" being the distance between the cameras, "Z" the expected depth of the point (P2a; P2b) in the linear-profile scene (120), and "Un" the normalized transverse coordinate of the image point (Pi) corresponding to the observed point in a two-dimensional image frame of the first camera, with Un = (Ui - U0)/f, "Ui" being the transverse coordinate of the image point in the image plane of the first camera (11), "U0" the corresponding coordinate of the image center, and "f" the focal length of said first camera (11).
Similar technologies:
Publication No. | Publication date | Title
FR3026591A1|2016-04-01|EXTRINSIC CALIBRATION METHOD FOR CAMERAS OF AN ON-BOARD STEREOSCOPIC IMAGE FORMATION SYSTEM
EP2923330B1|2016-12-21|Method of 3d reconstruction and 3d panoramic mosaicing of a scene
FR3014553A1|2015-06-12|METHOD FOR ANGULAR CALIBRATION OF THE POSITION OF AN ON-BOARD VIDEO CAMERA IN A MOTOR VEHICLE
CN106370884B|2019-12-24|Vehicle speed measuring method based on binocular camera computer vision technology
EP1779677A1|2007-05-02|Method for the automatic calibration of a stereovision system
CN103424105B|2016-02-10|Method for checking object and device
FR3024127A1|2016-01-29|AUTONOMOUS AUTOMATIC LANDING METHOD AND SYSTEM
WO2012129421A2|2012-09-27|Dynamic stereo camera calibration system and method
KR101431373B1|2014-08-18|Apparatus for estimating of vehicle movement using stereo matching
EP1999484B1|2010-06-30|Obstacle detection
WO2011117539A1|2011-09-29|Method and equipment for detecting the presence and extent of defects in an optical component
EP2405399A1|2012-01-11|Method for calibrating the orientation of an on-board video camera
CN109444916A|2019-03-08|The unmanned travelable area determining device of one kind and method
FR3052581B1|2019-07-12|METHOD FOR MAKING A DEPTH CARD FROM SUCCESSIVE IMAGES OF A SINGLE CAMERA | EMBARKED IN A MOTOR VEHICLE
Jutzi et al.2013|Improved UAV-borne 3D mapping by fusing optical and laserscanner data
FR3053554A1|2018-01-05|METHOD FOR RAPID CALIBRATION OF A MOTOR VEHICLE CAMERA
EP2208963B1|2011-05-18|Method for measuring gaps and flushes
Bourzeix et al.2014|Speed estimation using stereoscopic effect
WO2018041978A1|2018-03-08|Device for determining a speed limit, on-board system comprising such a device, and method for determining a speed limit
FR2838514A1|2003-10-17|Motor vehicle parallactic telemetry system error correction method in which a distance measured using the system is compared with one independently determined using a relative velocity method so that corrections can be applied
FR3069689B1|2019-08-02|METHOD FOR ESTIMATING THE MOVEMENT OF A POINT IN AN IMAGE SEQUENCE
Lukić et al.2011|Stereoscopic vehicle speed measurement-System calibration and synchronization errors analysis
EP3008664B1|2019-10-23|Method and system for monitoring objects in motion
FR3084864A1|2020-02-14|METHOD FOR DENSIFYING A DEPTH MAP
FR3078183A1|2019-08-23|METHOD FOR DETECTING AN OBJECT FROM AN INBOARD CAMERA IN A MOTOR VEHICLE
Family patents:
Publication number | Publication date
CN106716489A|2017-05-24|
US10672147B2|2020-06-02|
WO2016045764A1|2016-03-31|
CN106716489B|2020-08-25|
FR3026591B1|2016-10-21|
US20170287169A1|2017-10-05|
Cited references:
Publication number | Application date | Publication date | Applicant | Patent title
US20080036576A1|2006-05-31|2008-02-14|Mobileye Technologies Ltd.|Fusion of far infrared and visible images in enhanced obstacle detection in automotive applications|
JP2009182879A|2008-01-31|2009-08-13|Konica Minolta Holdings Inc|Calibrating apparatus and calibrating method|
DE102008008619A1|2008-02-12|2008-07-31|Daimler Ag|Method for calibrating a stereo camera system by iteratively rectifying a pair of images of the stereo camera system, the pair of images being checked twice with different rectification parameters for the pitch angle|
EP2541498A1|2011-06-30|2013-01-02|Harman Becker Automotive Systems GmbH|Method of determining extrinsic parameters of a vehicle vision system and vehicle vision system|
DE102004062275A1|2004-12-23|2006-07-13|Aglaia Gmbh|Method and device for determining a calibration parameter of a stereo camera|
WO2008075271A2|2006-12-18|2008-06-26|Koninklijke Philips Electronics N.V.|Calibrating a camera system|
DE102008040985B4|2008-08-05|2021-05-27|Robert Bosch Gmbh|Method for calibrating a multi-camera system|
FR2952743A3|2009-11-19|2011-05-20|Astrium Sas|METHOD OF ESTIMATING THE MOTION OF A SCANNING OBSERVATION INSTRUMENT OVERFLYING A CELESTIAL BODY|
US20110157373A1|2009-12-24|2011-06-30|Cognex Corporation|System and method for runtime determination of camera miscalibration|
JP5491235B2|2010-03-02|2014-05-14|東芝アルパイン・オートモティブテクノロジー株式会社|Camera calibration device|
CN101876533B|2010-06-23|2011-11-30|北京航空航天大学|Microscopic stereovision calibrating method|
US9237331B2|2011-01-18|2016-01-12|Disney Enterprises, Inc.|Computational stereoscopic camera system|
JP2013059016A|2011-08-12|2013-03-28|Sony Corp|Image processing device, method, and program|
JP5898475B2|2011-11-28|2016-04-06|クラリオン株式会社|In-vehicle camera system, calibration method thereof, and calibration program thereof|
FR2986358B1|2012-01-31|2014-03-07|Continental Automotive France|METHOD FOR CALIBRATING THE EXTRINSIC CHARACTERISTICS OF A CAMERA SECURED TO THE WINDSHIELD OF A MOTOR VEHICLE|
Cited by:
US10764561B1|2016-04-04|2020-09-01|Compound Eye Inc|Passive stereo depth sensing|
CN109242901B|2017-07-11|2021-10-22|Shenzhen Autel Intelligent Aviation Technology Co., Ltd.|Image calibration method and device applied to three-dimensional camera|
CN108765484B|2018-05-18|2021-03-05|北京航空航天大学|Living insect motion acquisition and data reconstruction method based on two high-speed cameras|
DE102018215491A1|2018-09-12|2020-03-12|Robert Bosch Gmbh|Procedure for calibrating an acquisition system|
TWI680436B|2018-12-07|2019-12-21|財團法人工業技術研究院|Depth camera calibration device and method thereof|
WO2021150784A1|2020-01-21|2021-07-29|Compound Eye Inc.|System and method for camera calibration|
Legal status:
2015-09-22| PLFP| Fee payment|Year of fee payment: 2 |
2016-04-01| PLSC| Search report ready|Effective date: 20160401 |
2016-09-21| PLFP| Fee payment|Year of fee payment: 3 |
2017-09-28| PLFP| Fee payment|Year of fee payment: 4 |
2018-09-24| PLFP| Fee payment|Year of fee payment: 5 |
2019-09-25| PLFP| Fee payment|Year of fee payment: 6 |
2020-09-14| PLFP| Fee payment|Year of fee payment: 7 |
2021-09-21| PLFP| Fee payment|Year of fee payment: 8 |
Priority:
Application number | Publication number | Priority date | Filing date | Patent title
FR1459047A| FR3026591B1|2014-09-25|2014-09-25|EXTRINSIC CALIBRATION METHOD FOR CAMERAS OF AN ON-BOARD SYSTEM FOR FORMATION OF STEREO IMAGES|
US15/512,687| US10672147B2|2014-09-25|2015-08-24|Extrinsic calibration method for cameras of an on-board system for formation of stereo images|
PCT/EP2015/001726| WO2016045764A1|2014-09-25|2015-08-24|Extrinsic calibration method for cameras of an on-board system for formation of stereo images|
CN201580051885.0A| CN106716489B|2014-09-25|2015-08-24|Extrinsic calibration method for a camera of an onboard system for forming stereoscopic images|