Patent abstract:
PERIODIC FIXED OBJECT DETECTION SYSTEM AND PERIODIC FIXED OBJECT DETECTION METHOD. A periodic fixed object detection system that extracts a feature point of a three-dimensional object from the image data in a predetermined region of a bird's eye view image for each of multiple subregions included in the predetermined region, calculates waveform data corresponding to the distribution of the feature points in the predetermined region of the bird's eye view image, and judges whether the three-dimensional object with the extracted feature point is a candidate periodic fixed object based on whether peak information of the waveform data is equal to or greater than a predetermined threshold value.
Publication number: BR112013007085B1
Application number: R112013007085-4
Filing date: 2012-02-17
Publication date: 2021-02-23
Inventors: Chikao Tsuchiya; Yasuhisa Hayakawa; Shinya Tanaka; Hiroyuki Furushou; Osamu Fukata
Applicant: Nissan Motor Co., Ltd.
IPC main classification:
Patent description:

Technical Field
[001] This invention relates to a periodic fixed object detection system and a periodic fixed object detection method.
Background of the Invention
[002] An object detection system has so far been proposed that creates a difference image from multiple images captured by a camera and, when the shape of a region where a difference is present in the difference image changes along the principal axis of the direction in which the camera captures the images, judges that the difference indicates a fixed three-dimensional object (see Patent Literature 1).
Citation List
Patent Literature
Patent Literature 1: Japanese Patent Application Publication No. 2007-129560
Summary of the Invention
Technical Problem
[003] The object detection system described in Patent Literature 1 only judges whether a single fixed three-dimensional object is present. It is therefore difficult to distinguish and recognize three-dimensional fixed objects present periodically along a road, such as landmarks, protection pillars, or utility poles (hereinafter referred to as periodic fixed objects), from other three-dimensional objects.
[004] The present invention was made to solve the problem described above. An object of the present invention is to provide a periodic fixed object detection system and a periodic fixed object detection method that enable high-precision detection of periodic fixed objects.
Solution to the Problem
[005] One aspect of the present invention is a periodic fixed object detection system for detecting a periodic fixed object in the vicinity of a moving object. The periodic fixed object detection system includes: an image capture device mounted on the moving object and capable of capturing an image of the moving object's surroundings; a point of view conversion unit configured to perform point of view conversion on the image captured by the image capture device to create an aerial view image; a feature point extraction unit configured to extract a feature point of a three-dimensional object from image data in a predetermined region of the aerial view image for each subregion included in the predetermined region; a waveform data calculation unit configured to calculate waveform data corresponding to a distribution of the feature points extracted by the feature point extraction unit in the predetermined region of the aerial view image; a peak information detection unit configured to detect peak information of the waveform data; a candidate periodic fixed object detection unit configured to judge whether the three-dimensional object with the feature point extracted by the feature point extraction unit is a candidate periodic fixed object based on whether the peak information is equal to or greater than a first predetermined threshold value; and a periodic fixed object judgment unit configured to determine that the candidate periodic fixed object is a periodic fixed object when the candidate periodic fixed object is detected by the candidate periodic fixed object detection unit and the detection is made under a predetermined condition.
[006] Another aspect of the present invention is a periodic fixed object detection method for detecting a periodic fixed object in the vicinity of a moving object. The periodic fixed object detection method includes: an image capture step of capturing an image of the vicinity of the moving object using an image capture device mounted on the moving object; a point of view conversion step of performing point of view conversion on the image captured by the image capture device to create an aerial view image; a feature point extraction step of extracting a feature point of a three-dimensional object from image data in a predetermined region of the aerial view image for each subregion included in the predetermined region; a waveform data calculation step of calculating waveform data corresponding to a distribution of the feature points extracted in the feature point extraction step in the predetermined region of the aerial view image; a peak information detection step of detecting peak information of the waveform data; a candidate periodic fixed object detection step of judging whether the three-dimensional object with the feature point extracted in the feature point extraction step is a candidate periodic fixed object based on whether the peak information is equal to or greater than a first predetermined threshold value; and a periodic fixed object judgment step of determining that the candidate periodic fixed object is a periodic fixed object when the candidate periodic fixed object is detected in the candidate periodic fixed object detection step and the detection is made under a predetermined condition.
Brief Description of Drawings
[007] Figure 1 is a schematic configuration diagram of a periodic fixed object detection system according to a first embodiment of the present invention, which illustrates an example where the periodic fixed object detection system is mounted on a vehicle.
[008] Figure 2 is a top view to illustrate a state of operation of the vehicle itself illustrated in Figure 1.
[009] Figure 3 is a block diagram to illustrate the details of a calculator illustrated in Figure 1.
[010] Figure 4 illustrates top views to illustrate an overview of processing by an alignment unit illustrated in Figure 3; (a) illustrates a state of travel of the vehicle itself; and (b) illustrates an overview of alignment.
[011] Figure 5 illustrates views to illustrate the details of processing by a candidate route quantity calculation unit illustrated in Figure 3; (a) illustrates an image of difference PDt at time t; and (b) illustrates an image of difference PDt-1 at time t-1.
[012] Figure 6 is a flow chart to illustrate processing by the alignment unit and a three-dimensional object detection unit illustrated in Figure 3.
[013] Figure 7 is a flow chart to illustrate processing by a periodicity determination unit illustrated in Figure 3.
[014] Figure 8 is a graph illustrating a histogram generated by a counting unit illustrated in Figure 3.
[015] Figure 9 is a flow chart to illustrate processing by a path variation calculation unit and a periodic fixed object judgment unit illustrated in Figure 3.
[016] Figure 10 illustrates drawings to illustrate details of step S27 illustrated in Figure 9; (a) illustrates a case where another vehicle comes in front of the periodic fixed objects; (b) illustrates a histogram in the case of (a); (c) illustrates a case where another vehicle comes on the opposite side of the periodic fixed objects; and (d) illustrates a histogram in the case of (c).
[017] Figure 11 is a schematic configuration diagram of a periodic fixed object detection system according to a second embodiment of the present invention, which illustrates an example where the periodic fixed object detection system is mounted on a vehicle.
[018] Figure 12 is a view that illustrates a state of operation of the vehicle itself illustrated in Figure 11 and a capture range of an image capture device.
[019] Figure 13 is a block diagram to illustrate the details of a calculator illustrated in Figure 11.
[020] Figure 14 is a view to illustrate detailed operations of an edge distribution calculation unit, a counting unit, a candidate periodic fixed object detection unit, and a periodic fixed object judgment unit illustrated in Figure 13.
[021] Figure 15 is a flow chart to illustrate the details of a periodic fixed object detection method according to the second embodiment of the present invention.
[022] Figure 16 is a flow chart to illustrate the details of the periodic fixed object detection method according to the second embodiment of the present invention, illustrating the processing after Figure 15.
[023] Figure 17 is a block diagram to illustrate the details of a calculator for a periodic fixed object detection system according to a third embodiment of the present invention.
[024] Figure 18 illustrates graphs to illustrate detailed operations of an alignment unit in Figure 17.
[025] Figure 19 illustrates graphs to illustrate detailed operations of a difference calculation unit in Figure 17; (a) illustrates a difference in a case where an edge distribution waveform is derived from periodic fixed objects, and (b) illustrates a difference in a case where an edge distribution waveform is derived from moving objects.
[026] Figure 20 is a flow chart to illustrate the details of a periodic fixed object detection method according to the third embodiment of the present invention, corresponding to Figure 16.
[027] Figure 21 is a flow chart to illustrate the details of a periodic fixed object detection method according to a modified example of the third embodiment of the present invention, corresponding to Figure 16.
Description of Embodiments
[First Embodiment]
[028] Next, the preferred embodiments of the present invention will be described based on the drawings. Figure 1 is a schematic configuration diagram of a periodic fixed object detection system 1 according to a first embodiment of the present invention, which illustrates an example where the periodic fixed object detection system 1 is mounted on a vehicle V. The periodic fixed object detection system 1 shown in Figure 1 is configured to detect periodic fixed objects in the vicinity of the vehicle V itself, and specifically to detect fixed objects present periodically along a road, such as landmarks, protection pillars, or utility poles. Incidentally, in the examples that follow, the vehicle V itself will be described as an example of a moving object. However, the moving object is not limited to the vehicle V itself, and can be any other moving object such as a motorcycle or a bicycle.
[029] The periodic fixed object detection system 1 includes a camera (image capture device) 10, a vehicle speed sensor 20 (speed detector), and a calculator 30. The camera 10 illustrated in Figure 1 is installed at a height h on a rear part of the vehicle V itself in such a way that its optical axis forms a downward angle θ with respect to a horizontal line. The camera 10 is configured to capture an image of a predetermined detection region from that position. The vehicle speed sensor 20 is configured to detect the operating speed of the vehicle V itself, and calculates the speed from a wheel speed detected by, for example, a wheel speed sensor provided on a wheel to detect its number of revolutions. The calculator 30 is configured to detect a periodic fixed object in the vicinity of the vehicle V itself based on an image captured by the camera 10 and a signal from the vehicle speed sensor 20.
[030] Figure 2 is a top view to illustrate a state of operation of the vehicle V itself illustrated in Figure 1. As shown in Figure 2, the camera 10 captures an image of the area behind the vehicle at a predetermined viewing angle a. The camera 10 has a wide viewing angle a, and is capable of capturing an image of the lane in which the vehicle V itself is traveling as well as the road along which the periodic fixed object is present.
[031] Figure 3 is a block diagram to illustrate the details of the calculator 30 illustrated in Figure 1. It should be noted that, in Figure 3, the camera 10 and the vehicle speed sensor 20 are also illustrated to clarify the connection relationship.
[032] As shown in Figure 3, the calculator 30 includes a point of view conversion unit 31, an alignment unit 32, a three-dimensional object detection unit 33, a candidate path quantity calculation unit 34, a counting unit 35, a path variation calculation unit 36, a periodic fixed object judgment unit 37, and a lane change detection unit (lateral motion detection unit) 38.
[033] The point of view conversion unit 31 is configured to receive captured image data obtained through capture by the camera 10, and to perform point of view conversion to convert the received captured image data into aerial view image data representing a state of being viewed from a bird's eye. The state of being viewed from a bird's eye refers to a state of being viewed from a virtual camera that looks, for example, vertically downward from the sky. Such point of view conversion is performed as described, for example, in Patent Literature 1.
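As a rough illustration of such a point of view conversion, each pixel coordinate can be mapped through a 3x3 homography onto the ground plane. The sketch below is a hypothetical minimal example (the matrix `H` and the helper `warp_points` are illustrative, not from the patent); a real system would derive H from the camera height h and downward angle θ and warp the entire image:

```python
import numpy as np

def warp_points(H, pts):
    """Apply a 3x3 homography H to an (N, 2) array of pixel coordinates.

    Returns the transformed (N, 2) coordinates. A full bird's-eye
    conversion would warp every pixel of the captured image this way.
    """
    pts = np.asarray(pts, dtype=float)
    homog = np.hstack([pts, np.ones((len(pts), 1))])  # to homogeneous coords
    mapped = homog @ H.T
    return mapped[:, :2] / mapped[:, 2:3]             # back to Cartesian

# Hypothetical homography: scale x by 2, shift y by 10 ground units.
H = np.array([[2.0, 0.0, 0.0],
              [0.0, 1.0, 10.0],
              [0.0, 0.0, 1.0]])
warp_points(H, [[3.0, 4.0]])  # maps pixel (3, 4) to ground point (6, 14)
```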
[034] The alignment unit 32 is configured to sequentially receive the aerial view image data obtained through point of view conversion by the point of view conversion unit 31, and to align the positions of the aerial view image data at multiple different times. Figure 4 illustrates top views to illustrate an overview of processing by the alignment unit 32 shown in Figure 3; (a) illustrates a state of travel of the vehicle V itself; and (b) illustrates an overview of alignment.
[035] As illustrated in Figure 4 (a), suppose that the vehicle V itself is located at V1 at the current time and was located at V2 at the time one time segment earlier. In addition, another vehicle V0 is located in a lateral region behind the vehicle V itself and is running parallel to it; suppose that the other vehicle V0 is located at V01 at the current time and was located at V02 at the time one time segment earlier. Furthermore, it is assumed that the vehicle V itself has moved a distance d in one time segment. It should be noted that the time one time segment earlier can be a time a predetermined time period (for example, one control cycle) before the current time, or a time an arbitrary time period before the current time.
[036] In such a case, an aerial view image PBt at the current time is illustrated in Figure 4 (b). In this aerial view image PBt, a white line drawn on the road has a rectangular shape and is in a state of being seen with relative precision from above. Likewise, in the aerial view image PBt-1 at the time one time segment earlier, the white line drawn on the road has a rectangular shape and is in a state of being seen with relative precision from above. However, the other vehicle V0 located at V02 is tilted.
[037] The alignment unit 32 aligns the data of the aerial view images PBt, PBt-1 described above. In doing so, the alignment unit 32 offsets the aerial view image PBt-1 at the time one time segment earlier so that its position coincides with the position of the aerial view image PBt at the current time. The offset amount d' corresponds to the travel distance d shown in Figure 4 (a), and is determined based on a signal from the vehicle speed sensor 20 and the time period from the time one time segment earlier to the current time.
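The alignment by the offset amount d' can be sketched as a row shift of the previous aerial view frame. This is a minimal illustration assuming pure forward motion along the image rows, an integer pixel shift, and a hypothetical `metres_per_px` resolution:

```python
import numpy as np

def align_previous(prev_img, speed_mps, dt_s, metres_per_px):
    """Shift the previous aerial view image by the host vehicle's travel
    distance d so it lines up with the current frame (rows = travel axis).

    Illustrative sketch only: assumes pure forward motion and integer
    pixel shift; vacated rows are filled with zeros.
    """
    d_px = int(round(speed_mps * dt_s / metres_per_px))  # offset d'
    if d_px == 0:
        return prev_img.copy()
    shifted = np.zeros_like(prev_img)
    shifted[d_px:] = prev_img[:-d_px]
    return shifted
```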
[038] The three-dimensional object detection unit 33 is configured to detect multiple three-dimensional objects from the data of a difference image PDt. Specifically, the three-dimensional object detection unit 33 finds a difference between the aerial view images PBt, PBt-1 and creates the data of the difference image PDt. Here, a pixel value of the difference image PDt may be defined as the absolute value of the difference between the corresponding pixel values of the aerial view images PBt, PBt-1, or it may be set to "1" when that absolute value exceeds a predetermined value and to "0" when it does not, in order to cope with changes in ambient lighting. The three-dimensional object detection unit 33 then judges that a three-dimensional object is present in each region detected as "1" in the data of the difference image PDt.
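The creation of the binarized difference image PDt described above can be sketched as follows; the threshold of 25 is an arbitrary illustrative value standing in for the patent's unspecified "predetermined value":

```python
import numpy as np

def difference_image(curr, prev_aligned, thresh=25):
    """Create the binarized difference image PDt: 1 where the absolute
    pixel difference exceeds the predetermined value, 0 elsewhere.

    thresh=25 is an illustrative choice, not taken from the patent.
    """
    diff = np.abs(curr.astype(int) - prev_aligned.astype(int))
    return (diff > thresh).astype(np.uint8)
```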
[039] Reference is again made to Figure 3. The candidate path quantity calculation unit 34 is configured to calculate candidate path quantities for the multiple three-dimensional objects detected by the three-dimensional object detection unit 33. Figure 5 illustrates views to illustrate the details of processing by the candidate path quantity calculation unit 34 shown in Figure 3; (a) illustrates the difference image PDt at time t; and (b) illustrates the difference image PDt-1 at time t-1.
[040] First, the candidate path quantity calculation unit 34 detects a grounding point (feature point) of a three-dimensional object from the data of the difference image PDt-1 at time t-1, as shown in Figure 5 (b). The grounding point is the point of contact between the three-dimensional object and the road. The candidate path quantity calculation unit 34 detects, as the grounding point, the position of the detected three-dimensional object closest to the camera 10 of the vehicle V itself. It detects a grounding point in each region (subregion) of the data of the difference image PDt-1 that is judged by the three-dimensional object detection unit 33 to have a three-dimensional object present.
[041] Specifically, the candidate path quantity calculation unit 34 detects a grounding point P1 for a three-dimensional object O1, a grounding point P2 for a three-dimensional object O2, and a grounding point P3 for a three-dimensional object O3. Then, the candidate path quantity calculation unit 34 sets a region T with a width W in the difference image PDt at time t, as shown in Figure 5 (a). The regions T are set at positions corresponding to the grounding points P1 to P3 in the data of the difference image PDt-1 at time t-1.
[042] Next, the candidate path quantity calculation unit 34 detects a grounding point of a three-dimensional object from the data of the difference image PDt at time t. Here too, it detects a grounding point in each region (subregion) of the data of the difference image PDt judged by the three-dimensional object detection unit 33 to have a three-dimensional object present, taking as the grounding point the position of the detected three-dimensional object closest to the camera 10 of the vehicle V itself. Specifically, it detects a grounding point P4 for a three-dimensional object O4, a grounding point P5 for a three-dimensional object O5, and a grounding point P6 for a three-dimensional object O6. In this way, the alignment unit 32, the three-dimensional object detection unit 33, and the candidate path quantity calculation unit 34 function as a feature point extraction unit configured to extract a feature point (grounding point) of a three-dimensional object from the image data in a predetermined region of the aerial view image (the image data in a rear lateral region of the difference image) for each subregion (each region judged to have a three-dimensional object present in the image data of the difference image) included in the predetermined region.
[043] In addition, the candidate path quantity calculation unit 34 links the grounding points to one another. Specifically, it links the grounding point P4 to the grounding point P1, links the grounding point P5 to the grounding point P1, and links the grounding point P6 to the grounding point P1. Similarly, it links the grounding points P4 to P6 to the grounding points P2 and P3.
[044] Next, the candidate path quantity calculation unit 34 calculates the distances between the grounding points P1 to P6 thus linked, and sets the calculated distances as candidate path quantities. In this way, the candidate path quantity calculation unit 34 calculates multiple candidate path quantities for each three-dimensional object. This prevents the problem of wrongly calculating the travel amount of a periodic fixed object, whose similar image features appear periodically, which would occur if only a single travel amount were determined for each three-dimensional object.
[045] It should be noted that the reason the regions T are set is so that the grounding points P1 to P6 can be linked to one another in a stable manner even if there is an error in the alignment of the aerial view images PBt, PBt-1 due to pitching, yawing, and the like of the vehicle V itself. In addition, the linking of the grounding points P1 to P6 is determined by a matching process on the luminance distributions around the grounding points in the aerial view images PBt, PBt-1.
[046] Reference is made again to Figure 3. The counting unit 35 is configured to count the candidate path quantities calculated by the candidate path quantity calculation unit 34, and generates a histogram (waveform data) through this counting. For example, the counting unit 35 counts a value of "3" when the distance between the grounding points P1 and P4, the distance between the grounding points P2 and P5, and the distance between the grounding points P3 and P6 are all equal. By counting the candidate path quantities and generating a histogram in this way, the counting unit 35 functions as a waveform data calculation unit configured to calculate waveform data (based on the relative positional relationship of the grounding points) corresponding to the distribution of the grounding points in the rear lateral region of the difference image.
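The counting performed by the counting unit 35 amounts to building a frequency table over the candidate path quantities. A minimal sketch, assuming the grounding-point distances have already been quantized to integer pixel values:

```python
from collections import Counter

def travel_amount_histogram(candidate_amounts):
    """Count how often each candidate path quantity (grounding-point
    distance) occurs; the result is the histogram / waveform data."""
    return Counter(candidate_amounts)

# e.g. three grounding-point pairs all separated by 12 px, plus two others:
hist = travel_amount_histogram([12, 12, 12, 7, 20])
```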
[047] The path variation calculation unit 36 is configured to calculate a path variation of a periodic fixed object in the aerial view image based on the capture interval of the camera 10 and the travel speed of the vehicle V itself detected by the vehicle speed sensor 20. Specifically, the path variation calculation unit 36 calculates a path variation with a margin within a predetermined range around the speed of the vehicle V; the margin is, for example, ±10 km/h. When the capture interval of the camera 10 is 33 ms and the actual distance in the vehicle's operating direction covered by one pixel is 5 cm, a three-dimensional object that moves one pixel per control cycle corresponds to a speed of approximately 5.5 km/h. Considering that the precision of the aerial view images PBt, PBt-1 is degraded by the movement of the vehicle, the ±10 km/h margin is needed to tolerate errors of this order of approximately 5.5 km/h.
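The approximately 5.5 km/h figure follows from a simple unit conversion, which can be checked as follows (the function name is illustrative):

```python
def pixel_speed_kmh(metres_per_px, interval_s):
    """Speed represented by a one-pixel shift per control cycle,
    converted from m/s to km/h."""
    return metres_per_px / interval_s * 3.6

# 5 cm per pixel at a 33 ms capture interval:
print(round(pixel_speed_kmh(0.05, 0.033), 1))  # -> 5.5
```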
[048] The periodic fixed object judgment unit 37 is configured to judge whether the multiple three-dimensional objects detected by the three-dimensional object detection unit 33 are periodic fixed objects. The periodic fixed object judgment unit 37 includes a candidate periodic fixed object detection unit 37a and a periodicity determination unit 37b, and makes this judgment based on the histogram generated by the counting unit 35, the path variation calculated by the path variation calculation unit 36, the candidate periodic fixed objects (fixed objects that may be periodic fixed objects) detected by the candidate periodic fixed object detection unit 37a, and the periodicity determined by the periodicity determination unit 37b.
[049] Next, a periodic fixed object detection method will be described with reference to flowcharts. Figure 6 is a flow chart to illustrate the processing by the alignment unit 32 and the three-dimensional object detection unit 33 illustrated in Figure 3. First, the alignment unit 32 receives the data of the aerial view images PBt, PBt-1 at different time points created by the point of view conversion unit 31, and aligns them (S1). Then, the three-dimensional object detection unit 33 finds the difference between the data of the aerial view images PBt, PBt-1 aligned in step S1 (S2). Then, the three-dimensional object detection unit 33 performs binarization based on a predetermined value and creates the data of the difference image PDt (S3). The processing by the alignment unit 32 and the three-dimensional object detection unit 33 is thus completed.
[050] Figure 7 is a flow chart to illustrate the processing by the candidate periodic fixed object detection unit 37a and the periodicity determination unit 37b illustrated in Figure 3. Figure 8 is a graph that illustrates a histogram generated by the counting unit 35 shown in Figure 3. As shown in Figure 8, the counting unit 35 counts candidate path quantities that are equal to one another. Specifically, since the path quantities m1, m2, m3, m4 are each detected multiple times in the example illustrated in Figure 8, their count values are high.
[051] As shown in Figures 7 and 8, the candidate periodic fixed object detection unit 37a first detects the maximum value M (peak value; peak information) of the histogram (S11). Then, the candidate periodic fixed object detection unit 37a sets a predetermined threshold value Th1 based on the maximum value M detected in step S11 (S12). Here, the predetermined threshold value Th1 is set to 70% of the maximum value M. For example, when the count value of the maximum value M is "7", the predetermined threshold value Th1 is set to "4.9". Because the predetermined threshold value Th1 is obtained from the maximum value M of the count values in this way, an appropriate threshold value can be set even if the magnitude of the count values changes with the positional relationship between the vehicle itself and the three-dimensional objects, the sunlight conditions, and the like. It should be noted that, in the present embodiment, the predetermined threshold value Th1 is 70% of the maximum value M, but it is not limited to this.
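Setting Th1 as 70% of the maximum count M is plain arithmetic; a sketch with the example values from the text (the function name is illustrative):

```python
def threshold_from_peak(max_count, ratio=0.7):
    """Th1 is set to 70% of the histogram's maximum count M (step S12)."""
    return max_count * ratio

print(round(threshold_from_peak(7), 1))  # -> 4.9 (as in the example above)
```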
[052] Next, the candidate periodic fixed object detection unit 37a detects local maximum values M1 to M3 (peak values; peak information) equal to or greater than the predetermined threshold value Th1 (S13). When the maximum value M is, for example, "7", the candidate periodic fixed object detection unit 37a detects the local maximum values M1 to M3 with a count value of "5" or greater. In this way, the candidate periodic fixed object detection unit 37a functions as a peak information detection unit configured to detect peak information of the histogram (waveform data). In addition, the candidate periodic fixed object detection unit 37a judges whether the three-dimensional object with the detected grounding point is a candidate periodic fixed object based on whether the peak information is equal to or greater than the predetermined threshold value. Specifically, the candidate periodic fixed object detection unit 37a determines that three-dimensional objects linked by a candidate path quantity corresponding to one of the local maximum values M and M1 to M3 (including the maximum value M) are candidate periodic fixed objects (for example, when the distance between two grounding points coincides with any of the maximum values M and M1 to M3, the two three-dimensional objects with those grounding points are candidate periodic fixed objects).
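Detecting the local maximum values equal to or greater than Th1 (step S13) can be sketched as a scan for local peaks in the histogram. The helper `local_maxima_at_or_above` and the sample histogram below are illustrative, not from the patent:

```python
def local_maxima_at_or_above(hist, th):
    """Return (position, count) pairs where the histogram has a local
    maximum whose count is >= th. hist is a list indexed by travel amount."""
    peaks = []
    for i in range(1, len(hist) - 1):
        if hist[i] >= th and hist[i] >= hist[i - 1] and hist[i] >= hist[i + 1]:
            peaks.append((i, hist[i]))
    return peaks

# Invented sample data: peaks of 5, 7, 5 with M = 7, Th1 = 4.9.
local_maxima_at_or_above([0, 1, 5, 1, 0, 7, 0, 5, 0], 4.9)
```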
[053] Then, the periodicity determination unit 37b detects the intervals (peak information) between the local maximum values M and M1 to M3 (including the maximum value M), and casts votes for the detected intervals (S14). Specifically, in the example illustrated in Figure 8, the number of votes for the interval D1 is "2", and the number of votes for the interval D2 is "1".
[054] Then, the periodicity determination unit 37b determines whether or not there is a periodicity (S15). It does so based on whether the number of votes in step S14 is equal to or greater than a predetermined number of votes. Here, the predetermined number of votes is half the number of three-dimensional objects detected from the aerial view image PBt. Consequently, when the number of three-dimensional objects detected from the aerial view image PBt is "4", the predetermined number of votes is "2". It should be noted that the predetermined number of votes is not limited to the above and can be a fixed value.
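The interval voting of steps S14 to S15 can be sketched as follows, using the "half the number of detected objects" criterion from the text; the helper and the sample peak positions are illustrative:

```python
from collections import Counter

def has_periodicity(peak_positions, n_objects):
    """Vote on the intervals between adjacent peak positions (S14);
    periodicity is judged present (S15) when some interval receives at
    least half as many votes as there are detected 3-D objects."""
    intervals = [b - a for a, b in zip(peak_positions, peak_positions[1:])]
    votes = Counter(intervals)
    required = n_objects / 2
    return any(v >= required for v in votes.values())

# Peaks at 2, 5, 8, 12 -> intervals 3, 3, 4 -> interval "3" gets 2 votes.
has_periodicity([2, 5, 8, 12], 4)
```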
[055] When it is judged that there is a periodicity (S15: YES), the periodicity determination unit 37b decreases the predetermined threshold value Th1 of step S12 (S16), and processing proceeds to step S17. Consequently, whereas the predetermined threshold value Th1 was, for example, 70% of the maximum value M, it is now set to 60% of the maximum value M or another value. The period during which the predetermined threshold value Th1 is kept decreased is around 1 second, and each time a periodicity is judged the predetermined threshold value Th1 is restored. In this way, whether or not there is a periodicity is judged from the positions where the local maximum values M and M1 to M3 of the count values appear, that is, from their intervals. When it is judged that there is a periodicity, the predetermined threshold value Th1 is decreased; therefore, once a periodicity has been judged, the judgment of periodic fixed objects is facilitated. Until a periodicity is judged, however, the predetermined threshold value Th1 is not decreased, which prevents wrong detection of three-dimensional objects due to misalignment or the like.
[056] However, when it is judged that there is no periodicity (S15: NO), processing proceeds to step S17 without decreasing the predetermined threshold value Th1.
[057] Thus, the periodicity determination unit 37b judges whether there is a periodicity based on the number of votes (peak information) for the positions (intervals) where the local maximum values M and M1 to M3 appear, the local maximum values M and M1 to M3 being equal to or greater than the predetermined threshold value Th1 set based on the maximum value M of the count values of the candidate path quantities. This makes it possible to ignore a local maximum value that is relatively small (for example, the one with reference sign M4 in Figure 8), so that a periodicity can be judged precisely with little influence from noise.
[058] In step S17, the periodicity determination unit 37b judges whether or not the lane change detection unit 38 detects a lateral movement of a specified distance or longer (S17). Specifically, the lane change detection unit 38 judges that a lateral movement of the specified distance or longer is detected when a turn signal is on and the steering angle is equal to or larger than an angle specified for the vehicle speed detected by the vehicle speed sensor 20.
[059] When it is judged that a lateral movement of the specified distance or longer is detected (S17: YES), and the predetermined threshold value Th1 has been decreased in step S16, the periodicity determination unit 37b initializes the decreased threshold value Th1 (S18). This makes it possible to properly detect periodic fixed objects in accordance with a change in the environment after the lane change. The processing illustrated in Figure 7 is then completed. However, when it is judged that no lateral movement of the specified distance or longer is detected (S17: NO), the processing illustrated in Figure 7 is completed without initializing the predetermined threshold value Th1.
[060] Figure 9 is a flow chart to illustrate the processing by the travel amount calculation unit 36 and the periodic fixed object judgment unit 37 illustrated in Figure 3. As shown in Figure 9, first, the travel amount calculation unit 36 calculates a travel amount corresponding to fixed objects (S21). Specifically, the travel amount calculation unit 36 calculates the travel amount of a periodic fixed object in the aerial view image based on the capture interval of the camera 10 and the travel speed of the vehicle V itself detected by the vehicle speed sensor 20. In this case, the travel amount calculation unit 36 calculates the travel amount with a margin within a predetermined range for the speed of the vehicle V. Then, the periodic fixed object judgment unit 37 determines that a candidate periodic fixed object is a periodic fixed object when the candidate periodic fixed object is detected by the candidate periodic fixed object detection unit 37a and the detection is made under a predetermined condition. Specifically, the periodic fixed object judgment unit 37 judges whether the local maximum values M and M1 to M3 (local peaks in the histogram) are present within the range of the travel amount calculated in step S21 (S22). When judging that the local maximum values M and M1 to M3 are present within the range of the travel amount (S22: YES), the periodic fixed object judgment unit 37 judges that a periodic fixed object is present (that is, it judges that the candidate periodic fixed object detected by the candidate periodic fixed object detection unit 37a is a periodic fixed object) (S23). In other words, periodic fixed objects are often aligned at equal intervals, and a specific count value tends to be large.
In addition, since a periodic fixed object is fixed, the count value of its candidate travel amount must be within the range established considering the speed of the moving object. Therefore, when the judgment in step S22 is "YES", it can be said that the multiple three-dimensional objects are periodic fixed objects. Then, the processing illustrated in Figure 9 is completed.
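The sequence of steps S21 to S23 described above can be sketched in code as follows. This is only a minimal illustration of the idea, not the embodiment's implementation; the function name, the margin, the default limit value, and all of the concrete numbers are hypothetical assumptions.

```python
def is_periodic_fixed_object(local_maxima, vehicle_speed, capture_interval,
                             margin=0.1, th1=4):
    # S21: a fixed object should appear to move backward by roughly
    # (vehicle speed x capture interval), so the expected travel amount
    # is computed with a small margin around that value.
    expected = vehicle_speed * capture_interval
    lo, hi = expected * (1 - margin), expected * (1 + margin)
    # S22: check whether local maxima of the histogram lie within that
    # range and whether their count values reach the limit value Th1.
    in_range = [count for travel, count in local_maxima if lo <= travel <= hi]
    # S23: if so, the candidate is judged to be a periodic fixed object.
    return any(count >= th1 for count in in_range)

# Hypothetical histogram maxima: (candidate travel amount, count value).
maxima = [(10.0, 6), (20.1, 5), (35.0, 2)]
print(is_periodic_fixed_object(maxima, vehicle_speed=200.0, capture_interval=0.05))
```

With the hypothetical values above the expected travel amount is 10.0, the first local maximum falls inside the range, and its count value 6 exceeds the assumed Th1 of 4, so the sketch reports a periodic fixed object.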
[061] However, when judging that none of the local maximum values M and M1 to M3 is present within the range described above (S22: NO), the periodic fixed object judgment unit 37 judges whether the periodicity determination unit 37b has determined that there is a periodicity (S24). When judging that the periodicity determination unit 37b has not determined that there is a periodicity (S24: NO), the periodic fixed object judgment unit 37 judges that the three-dimensional object is a moving object (S25). Then, the processing illustrated in Figure 9 is completed.
[062] When judging that the periodicity determination unit 37b has determined that there is a periodicity (S24: YES), the periodic fixed object judgment unit 37 detects a non-periodic local maximum value among the local maximum values equal to or greater than the predetermined limit value Th1 (S26). Such a local maximum value is, for example, the local maximum value M3 illustrated in Figure 8. The interval from this local maximum value M3 to an adjacent local maximum value differs from those of the local maximum values M, M1, M2. Therefore, the periodic fixed object judgment unit 37 determines that the local maximum value M3 is a non-periodic local maximum value without periodicity.
[063] In addition, when no non-periodic local maximum value is detected (S26: NO), there is a periodicity and no non-periodic local maximum value. Therefore, the periodic fixed object judgment unit 37 judges that a periodic fixed object is present (S23).
[064] However, when a non-periodic local maximum value is detected (S26: YES), the periodic fixed object judgment unit 37 judges whether the periodic local maximum values M, M1, M2 are lower than their previous values (S27). In this processing, the periodic fixed object judgment unit 37 calculates an average value of the periodic local maximum values M, M1, M2 in the current processing, and also calculates an average value of the periodic local maximum values in the previous processing. Then, the periodic fixed object judgment unit 37 judges whether the average value in the current processing is lower than the average value in the previous processing by a predetermined value or more.
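The comparison of step S27 can be sketched as follows. The function name, the drop threshold, and the count values are illustrative assumptions only; the embodiment specifies neither concrete values nor this exact formulation.

```python
def occluded_by_vehicle(current_maxima, previous_maxima, drop_threshold=2.0):
    # S27: compare the average of the periodic local maximum values
    # (e.g. M, M1, M2) in the current processing with the average from
    # the previous processing.
    cur_avg = sum(current_maxima) / len(current_maxima)
    prev_avg = sum(previous_maxima) / len(previous_maxima)
    # If the current average dropped by the predetermined value or more,
    # the periodic fixed objects are likely blocked by another vehicle
    # in front of them (S27: YES -> a moving object is detected in S25).
    return prev_avg - cur_avg >= drop_threshold

print(occluded_by_vehicle([3, 3, 4], [7, 8, 7]))   # count values fell sharply
print(occluded_by_vehicle([7, 7, 8], [7, 8, 7]))   # count values unchanged
```

In the first call the average drops from about 7.3 to about 3.3, so the sketch signals an occluding vehicle; in the second call the averages match and the periodic fixed objects are kept.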
[065] When judging that the periodic local maximum values M, M1, M2 are lower than the previous values (S27: YES), the periodic fixed object judgment unit 37 judges that another vehicle or the like has come between the vehicle V itself and the periodic fixed objects, and detects the moving object (S25). Then, the processing illustrated in Figure 9 is completed.
[066] However, when judging that the periodic local maximum values M, M1, M2 are not lower than the previous values (S27: NO), the periodic fixed object judgment unit 37 judges that another vehicle or the like has come on the opposite side of the periodic fixed objects when viewed from the vehicle V itself, and detects the periodic fixed objects (S23). Then, the processing illustrated in Figure 9 is completed.
[067] Figure 10 shows drawings to illustrate the details of step S27 illustrated in Figure 9; (a) illustrates a case where another vehicle VO comes in front of the periodic fixed objects, and (b) illustrates a histogram in the case of (a). In addition, (c) illustrates a case in which the other vehicle VO comes on the opposite side of the periodic fixed objects, and (d) illustrates a histogram in the case of (c). It should be noted that, in Figures 10 (b) and (d), the dashed line represents the histogram before the arrival of the other vehicle, and the solid line represents the histogram after the arrival of the other vehicle.
[068] First, assume that the other vehicle VO comes in front of the periodic fixed objects as shown in Figure 10 (a). In this case, since the periodic fixed objects are blocked by the other vehicle VO, the count values of the periodic local maximum values tend to become smaller, as shown in Figure 10 (b). In particular, when the other vehicle VO comes in front of the periodic fixed objects, the other vehicle VO may be present in a position to which the vehicle V itself could change lanes. Therefore, in this case, the periodic fixed object judgment unit 37 detects the other vehicle VO (moving object).
[069] In comparison, suppose that the other vehicle VO comes on the opposite side of the periodic fixed objects, as shown in Figure 10 (c). In this case, the periodic fixed objects are not blocked by the other vehicle VO. Therefore, the count values of the periodic local maximum values are hardly influenced and do not become so small. When the other vehicle VO comes on the side opposite the periodic fixed objects, the other vehicle VO is not present in a position to which the vehicle V itself could change lanes. Therefore, in such a case, the periodic fixed object judgment unit 37 detects the periodic fixed objects.
[070] In the periodic fixed object detection system 1 and the periodic fixed object detection method according to the present embodiment, a ground point (characteristic point) of a three-dimensional object is extracted from the image data of a difference image in a rear lateral region (predetermined region) of an aerial view image, for each region judged to contain a three-dimensional object in the image data of the difference image (for each of the multiple subregions included in the predetermined region); a histogram (waveform data) corresponding to a distribution of the ground points in the rear lateral region of the aerial view image is calculated; and whether the three-dimensional object with the extracted ground point is a candidate periodic fixed object is judged based on whether the peak information of the histogram (peak value, the number of votes related to the peak intervals, and so on) is equal to or greater than a predetermined limit value. Therefore, the periodic fixed object detection system 1 and the periodic fixed object detection method enable clearer extraction of the periodicity (repetition) of periodic fixed objects as the peak information of the waveform data, and candidate periodic fixed objects can be more easily extracted from the three-dimensional objects included in a captured image. In this way, more accurate extraction of periodic fixed objects is possible.
[071] Periodic fixed objects are often fixed objects with similar appearances aligned at substantially equal intervals. When an image capture device captures images of such periodic fixed objects while in motion, it is difficult to tell which parts in the current image correspond to the elements of the periodic fixed objects in a previous image. In addition, in this case, it is also difficult to judge whether the periodic fixed objects thus captured are fixed objects or moving objects. Moreover, periodic fixed objects can be mistakenly recognized as moving objects, depending on conditions such as the travel speed of the moving object, the capture interval of the image capture device, and the spacing of the periodic fixed objects.
[072] The periodic fixed object detection system 1 and the periodic fixed object detection method according to the present embodiment enable more accurate extraction of periodic fixed objects from the three-dimensional objects included in a captured image, and can prevent the erroneous recognition of periodic fixed objects as moving objects described above.
[073] Periodic fixed objects generate difference regions that are periodically present in a difference image. It is therefore difficult to calculate travel amounts while linking these periodic difference regions respectively to the corresponding parts in the previous image, and it is difficult to judge whether or not they are fixed objects.
[074] In the periodic fixed object detection system 1 and the periodic fixed object detection method according to the present embodiment, the candidate travel amounts of the multiple detected three-dimensional objects are calculated, and the candidate travel amounts are counted. Hence, the counting is performed even though it is not clear which periodic difference regions correspond to which parts in the previous image. Then, when it is judged that a count value within the travel range of the moving object, among the count values of the candidate travel amounts thus counted, is equal to or greater than the limit value Th1, it is judged that the multiple three-dimensional objects are periodic fixed objects. Here, periodic fixed objects are often aligned at equal intervals, and a specific count value tends to be large. In addition, since the periodic fixed objects are fixed, the count values of the candidate travel amounts must be within the range established considering the speed of the moving object and so on. Therefore, when the specific count value within the range established considering the speed of the moving object and so on is equal to or greater than the predetermined limit value Th1, it can be said that the multiple three-dimensional objects are periodic fixed objects. Consequently, detection of periodic fixed objects with higher precision is possible.
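The voting idea in the paragraph above can be sketched as follows: since it is unclear which difference region matches which part of the previous image, every pairing yields a candidate travel amount, and all candidates are counted. The function name, positions, and bin size are hypothetical assumptions for illustration only.

```python
from collections import Counter

def vote_candidate_travel_amounts(prev_positions, cur_positions, bin_size=1.0):
    # Every pairing of a current feature position with a previous one is
    # taken as a candidate travel amount, and all candidates are counted
    # into a histogram of votes.
    votes = Counter()
    for cur in cur_positions:
        for prev in prev_positions:
            candidate = cur - prev          # candidate travel amount
            votes[round(candidate / bin_size) * bin_size] += 1
    return votes

# Hypothetical positions of three-dimensional objects aligned at interval 5,
# shifted by a true travel amount of 2 between the two frames.
prev = [0.0, 5.0, 10.0, 15.0]
cur = [p + 2.0 for p in prev]
votes = vote_candidate_travel_amounts(prev, cur)
print(votes.most_common(1))
```

Because the objects are equally spaced, the true travel amount of 2.0 collects one vote per object and becomes the dominant count value, exactly the "specific count value tends to be large" behavior the text describes.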
[075] In addition, in the periodic fixed object detection system 1 and the periodic fixed object detection method according to the present embodiment, multiple candidate travel amounts are calculated for each three-dimensional object. This makes it possible to avoid the problem of erroneously calculating the travel amount of periodic fixed objects, whose similar image characteristics appear periodically, which would result from determining only one travel amount per three-dimensional object.
[076] In addition, in the periodic fixed object detection system 1 and the periodic fixed object detection method according to the present embodiment, the predetermined limit value Th1 is obtained from the maximum value M of the count values thus counted. Therefore, it is possible to establish the limit value Th1 appropriately even if the magnitude of the count values changes greatly due to the positional relationship between the moving object and the three-dimensional objects, the sunlight condition, or the like.
[077] In addition, in the periodic fixed object detection system 1 and the periodic fixed object detection method according to the present embodiment, whether or not there is a periodicity is judged from the positions where the local maximum values M and M1 to M3 of the count values thus counted appear. When it is judged that there is a periodicity, the predetermined limit value Th1 is decreased. Therefore, once the periodicity is judged, the judgment of periodic fixed objects can be facilitated. However, until a periodicity is judged, the predetermined limit value Th1 is not decreased. This can prevent erroneous detection of three-dimensional objects due to misalignment or the like.
[078] In addition, in the periodic fixed object detection system 1 and the periodic fixed object detection method according to the present embodiment, a periodicity is judged from the positions where the local maximum values M and M1 to M3, equal to or greater than the predetermined limit value Th1 based on the maximum value M of the count values, appear. This makes it possible to ignore a local maximum value that is relatively small, so the periodicity can be judged precisely with little influence from noise.
[079] In addition, in the periodic fixed object detection system 1 and the periodic fixed object detection method according to the present embodiment, when a lateral movement of the specified distance or longer is detected and the predetermined limit value Th1 has been decreased, the decreased limit value Th1 is initialized. Accordingly, the limit value Th1 is initialized when the vehicle V itself changes lanes. This makes it possible to detect periodic fixed objects appropriately according to a change in the environment after a lane change.
[080] Additionally, in the periodic fixed object detection system 1 and the periodic fixed object detection method according to the present embodiment, in a case where a local maximum value M3 equal to or greater than the predetermined limit value Th1 is detected besides the local maximum values M, M1, M2 judged to have a periodicity in the current processing, it is judged that the multiple three-dimensional objects are periodic fixed objects when the average value of the local maximum values M, M1, M2 judged to have a periodicity in the current processing is not lower than the average value of the local maximum values judged to have a periodicity in the previous processing by the predetermined value or more. However, when the former is lower than the latter by the predetermined value or more, it is judged that a moving object is present.
[081] Furthermore, as a case of detecting a local maximum value M3 equal to or greater than the predetermined limit value Th1 besides the local maximum values M, M1, M2 judged to have a periodicity in the current processing, a case is conceivable, for example, where another vehicle or the like comes within the viewing angle. Such a conceivable case includes a case where the other vehicle or the like comes on the opposite side of the periodic fixed objects when viewed from the vehicle V itself, and a case where it comes on the front side.
[082] When the other vehicle or the like comes on the opposite side, the periodic local maximum values M, M1, M2 of the periodic fixed objects are hardly influenced, and the non-periodic local maximum value M3 tends to be detected. However, when the other vehicle or the like comes on the front side, the periodic fixed objects are blocked by the other vehicle or the like, and the count values of the periodic local maximum values M, M1, M2 tend to become lower.
[083] Therefore, when the average value of the local maximum values M, M1, M2 judged to have a periodicity in the current processing is not lower than the average value of the local maximum values judged to have a periodicity in the previous processing by the predetermined value or more, the other vehicle or the like is located on the opposite side of the periodic fixed objects, where the vehicle V itself cannot change lanes. Accordingly, it is not necessary to detect a moving object such as the other vehicle. However, when the average value of the local maximum values M, M1, M2 judged to have a periodicity in the current processing is lower than the average value of the local maximum values judged to have a periodicity in the previous processing by the predetermined value or more, the other vehicle is present in front of the periodic fixed objects, where the vehicle V itself could change lanes. Consequently, the moving object is detected.
[084] Thus, the periodic fixed object detection system 1 and the periodic fixed object detection method according to the present embodiment can make an appropriate judgment according to the actual phenomenon.
[085] It should be noted that, in the embodiment described above, the images captured at the present time and at a time one time segment earlier are converted into aerial views, and the difference image PDt is created by performing alignment on the aerial views. However, the present invention is not limited to this. For example, only the image at a time one time segment earlier may be converted into an aerial view; the converted aerial view may be subjected to alignment and then converted back into an image equivalent to the originally captured image; and this image and the image at the present time may be used to create a difference image. In other words, aerial views do not always have to be created explicitly, as long as the image at the current time and the image at a time one time segment earlier are aligned, and the difference image PDt is created from the difference between the two images thus aligned. [Second Embodiment]
[086] In the following, a second embodiment of the present invention will be described based on the drawings. It should be noted that components equivalent to those described in the first embodiment are given the same reference signs, and their description is omitted.
[087] Figure 11 is a schematic configuration diagram of a periodic fixed object detection system 2 according to the present embodiment. The present embodiment describes an example where the periodic fixed object detection system 2 is mounted on the vehicle V. As illustrated in Figure 11, the periodic fixed object detection system 2 includes the camera 10 and a calculator 40.
[088] Figure 12 is a view to illustrate the capture range of the camera 10 illustrated in Figure 11, and so on. As shown in Figure 12, the camera 10 is configured to capture an image of a side region behind the vehicle V itself at a predetermined viewing angle, as in the first embodiment. The viewing angle a of the camera 10 is set so that the capture range of the camera 10 can include an adjacent lane or a roadside in addition to the lane where the vehicle V itself travels.
[089] The calculator 40 performs various processing on the parts corresponding to detection regions A1, A2 for the periodic fixed objects in the image captured by the camera 10. In this way, the calculator 40 determines whether the three-dimensional objects present in the detection regions A1, A2 are periodic fixed objects. The detection regions A1, A2 have a rectangular shape when viewed from above. The positions of the detection regions A1, A2 can be established based on positions relative to the vehicle V itself, or can be established based on the position of a white line by using an existing white line recognition technique or the like. The shape of the detection regions A1, A2 in an aerial view image is not limited to the rectangular shape. When a detection region is a rectangular region in real space, the shape of the detection regions A1, A2 in an aerial view image may be a trapezoidal shape.
[090] The sides of the detection regions A1, A2 close to the vehicle V itself (the sides along the direction of travel) are established as ground lines L1, L2. The ground lines L1, L2 mean positions where another vehicle VO present on a lane adjacent to the lane where the vehicle V itself travels, or a periodic fixed object present along a roadside, comes into contact with the ground.
[091] A distance in the direction of movement of the vehicle from a rear end part of the vehicle V to front end parts of detection regions A1, A2 is determined so that at least the detection regions A1, A2 are within the viewing angle a of the camera 10.
[092] In addition, the extension of each of the detection regions A1, A2 in the direction of the vehicle's movement and its width in a direction orthogonal to the direction of the vehicle's movement are determined based on the size of a periodic fixed object to be detected. In the present embodiment, to distinguish a periodic fixed object from the other vehicle VO, the extension in the direction of movement of the vehicle is established as an extension that can include at least the other vehicle VO. In addition, the width in the direction orthogonal to the direction of movement of the vehicle has an extension that does not include the lanes (that is, the second adjacent lanes) that are further adjacent to the adjacent lanes on the right and left sides in the aerial view image.
[093] Figure 13 is a block diagram to illustrate the details of the calculator 40 shown in Figure 11. As shown in Figure 13, the calculator 40 includes a point of view conversion unit 41, an edge distribution calculation unit 42, a counting unit 43, a candidate periodic fixed object detection unit 44, and a periodic fixed object judgment unit 45. It should be noted that the calculator 40 is a computer consisting of a CPU, a RAM, a ROM, and so on. The calculator 40 performs image processing and the like according to a program established in advance to thereby implement the function of each unit, namely the point of view conversion unit 41, the edge distribution calculation unit 42, the counting unit 43, the candidate periodic fixed object detection unit 44, and the periodic fixed object judgment unit 45.
[094] The point of view conversion unit 41 is configured to receive captured image data of a predetermined region obtained through capture by the camera 10. The point of view conversion unit 41 is configured to perform point of view conversion to convert the received captured image data into aerial view image data in a state of being viewed from a bird's eye. The state of being viewed from a bird's eye refers to a state of being viewed from a virtual camera that looks, for example, vertically downward (or slightly obliquely downward) from the sky.
[095] Figure 14 is a view to illustrate detailed operations of the edge distribution calculation unit 42, the counting unit 43, the candidate periodic fixed object detection unit 44, and the periodic fixed object judgment unit 45. It should be noted that, although Figure 14 illustrates only the right side in the direction of movement of the vehicle, including the detection region A1, the edge distribution calculation unit 42, the counting unit 43, the candidate periodic fixed object detection unit 44, and the periodic fixed object judgment unit 45 perform the same processing for the region on the left side in the direction of vehicle movement, including the detection region A2.
[096] As shown in Figure 13, the edge distribution calculation unit 42 includes an edge element extractor 42a and an edge distribution waveform calculator 42b. The edge element extractor 42a is configured to calculate luminance differences in the aerial view image data obtained by the point of view conversion by the point of view conversion unit 41, to detect an edge-constituting element (hereinafter referred to as an edge element (characteristic point)) of a periodic fixed object included in an aerial view image. The edge element extractor 42a calculates a luminance difference between two pixels near each of multiple positions along a vertical virtual line that extends in the vertical direction in real space.
[097] Specifically, the edge element extractor 42a establishes a first vertical virtual line corresponding to a line segment that extends in the vertical direction in real space and a second vertical virtual line corresponding to a line segment that extends in the vertical direction in real space, in the aerial view image after the point of view conversion. The edge element extractor 42a sequentially obtains luminance differences along the first vertical virtual line between points on the first vertical virtual line and points on the second vertical virtual line.
[098] The edge distribution waveform calculator 42b is configured to add up the number of edge elements extracted by the edge element extractor 42a for each vertical virtual line, and to calculate an edge distribution waveform based on the number of edge elements thus added.
[099] The operations of the edge element extractor 42a and the edge distribution waveform calculator 42b will be described more specifically.
[100] As shown in Figure 14, the edge element extractor 42a establishes multiple vertical virtual lines Lai (hereinafter referred to as attention lines Lai), which are line segments that extend in the vertical direction from points on the ground line L1 in real space and pass through the detection region A1. The number of attention lines Lai is not particularly limited. In the description below, a case will be described in which n attention lines Lai (i = 1 to n) are established.
[101] In addition, the edge element extractor 42a establishes multiple vertical virtual lines Lri (hereinafter referred to as reference lines Lri), which respectively correspond to the multiple attention lines Lai, are line segments that extend in the vertical direction from points on the ground line L1 in real space, and pass through the detection region A1. Each reference line Lri is established at a position away from the corresponding attention line Lai by a predetermined distance (for example, 10 cm) in real space. It should be noted that the lines corresponding to the line segments extending in the vertical direction in real space become lines that spread radially from the position Ps of the camera 10 in the aerial view image.
[102] Next, the edge element extractor 42a establishes multiple attention points Paj on each attention line Lai. In the example shown in Figure 14, attention points Pa1 to Pa8 are established, but the number of attention points Paj is not particularly limited. In the description below, a case will be described in which k attention points Paj (j = 1 to k) are established.
[103] In addition, the edge element extractor 42a establishes multiple reference points Prj on each reference line Lri, the reference points Prj respectively corresponding to the attention points Paj. The attention points Paj and reference points Prj that correspond to each other are established at substantially the same height in real space. It should be noted that the attention points Paj and the reference points Prj do not always have to be at exactly the same height; some difference in height is of course allowed, as long as the heights of the attention points Paj and the reference points Prj can be regarded as the same.
[104] The edge element extractor 42a sequentially obtains luminance differences along each of the attention lines Lai between the attention points Paj and the reference points Prj corresponding to each other. In the example illustrated in Figure 14, the edge element extractor 42a calculates a luminance difference between the first attention point Pa1 and the first reference point Pr1, and calculates a luminance difference between the second attention point Pa2 and the second reference point Pr2. Then, similarly, the edge element extractor 42a sequentially obtains the luminance differences between the third to eighth attention points Pa3 to Pa8 and the third to eighth reference points Pr3 to Pr8.
[105] When the luminance difference between an attention point Paj and a reference point Prj is equal to or greater than a predetermined value, the edge element extractor 42a determines that an edge element is present between the attention point Paj and the reference point Prj. In this way, the edge element extractor 42a functions as a characteristic point extraction unit configured to extract an edge element (characteristic point) present along each of the multiple vertical virtual lines that extend in the vertical direction in real space. In other words, the edge element extractor 42a extracts a characteristic point (edge element) of a three-dimensional object from the image data in a predetermined region (detection region) of the aerial view image for each of the multiple subregions (each region near the vertical virtual lines) included in the predetermined region.
[106] The edge distribution waveform calculator 42b counts how many of the edge elements extracted by the edge element extractor 42a are present along a single attention line Lai. The edge distribution waveform calculator 42b stores the number of edge elements thus counted as an attribute of each attention line Lai.
[107] The edge distribution waveform calculator 42b counts the edge elements for all the attention lines Lai. It should be noted that the extensions of the parts of the attention lines Lai that overlap the detection region A1 differ from one another, depending on where each attention line Lai is located. The number of edge elements thus counted may be normalized by dividing it by the extension of the overlapping part of the corresponding attention line Lai.
[108] In the example illustrated in Figure 14, the other vehicle VO is displayed in the detection region A1. Suppose that an attention line Lai is established on the rubber part of a tire of the other vehicle VO, and that the reference line Lri is established at a position approximately 10 cm away, on the wheel of the tire. In this case, since the first attention point Pa1 and the first reference point Pr1 are located on the same tire part, the luminance difference between them is small. However, the second to eighth attention points Pa2 to Pa8 are located on the rubber part of the tire, and the second to eighth reference points Pr2 to Pr8 are located on the wheel part of the tire, which makes the luminance differences between them large. When the luminance differences between the second to eighth attention points Pa2 to Pa8 and the second to eighth reference points Pr2 to Pr8 are equal to or greater than the predetermined value, the edge element extractor 42a detects that edge elements are present between the second to eighth attention points Pa2 to Pa8 and the second to eighth reference points Pr2 to Pr8. Additionally, since the number of the second to eighth attention points Pa2 to Pa8 present along the attention line Lai is 7, the edge element extractor 42a detects edge elements 7 times. In this case, the edge distribution waveform calculator 42b counts the edge element value as "7".
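The edge element count for a single attention line can be sketched as follows, reproducing the tire example above. The function name, the luminance values, and the difference threshold are hypothetical assumptions chosen only so that the first point pair stays below the threshold and the remaining seven exceed it.

```python
def count_edge_elements(attention_lum, reference_lum, diff_threshold=25):
    # An edge element is judged present between an attention point Paj and
    # its reference point Prj when their luminance difference is equal to
    # or greater than the predetermined value.
    return sum(1 for pa, pr in zip(attention_lum, reference_lum)
               if abs(pa - pr) >= diff_threshold)

# Hypothetical luminances for Pa1..Pa8 and Pr1..Pr8: the first pair lies on
# the same rubber part (small difference), while the remaining seven pairs
# straddle the tire rubber and the wheel (large difference).
attention = [40, 40, 42, 41, 40, 43, 40, 41]
reference = [42, 180, 175, 182, 178, 181, 179, 180]
print(count_edge_elements(attention, reference))  # counts 7 edge elements
```

The count value of "7" stored for this attention line is then one sample of the edge distribution waveform described next.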
[109] In addition, the edge distribution waveform calculator 42b plots the edge element count value obtained for each attention line Lai, and obtains an edge distribution waveform (waveform data). Specifically, the edge distribution waveform calculator 42b marks the edge element count value in a plane where the vertical axis represents the edge element count value and the horizontal axis represents the position of the attention line Lai on the ground line L1 in real space. In a case where the attention lines La1 to Lan are established at equal intervals on the ground line L1 in real space, the edge distribution waveform can be obtained simply by aligning the edge element count values obtained for each attention line Lai in the order of the attention lines La1 to Lan. In the example illustrated in Figure 14, the attention line Lai established on the rubber part of the tire of the other vehicle VO has an edge element count value of "7" at the position where it crosses the ground line L1 in the aerial view image.
[110] In this way, the edge distribution waveform calculator 42b functions as a waveform data calculation unit configured to add up the number of edge elements extracted by the edge element extractor 42a for each of the multiple lines extending in the vertical direction in real space, and to calculate an edge distribution waveform (waveform data) based on the number of edge elements thus added. In other words, the edge distribution waveform calculator 42b calculates waveform data corresponding to the distribution of the characteristic points (edge elements) in the predetermined region (detection region) of the aerial view image, based on the relative positional relationship of the edge elements.
[111] The counting unit 43 is configured to detect peaks of the edge distribution waveform calculated by the edge distribution waveform calculator 42b of the edge distribution calculation unit 42. A peak is a point on the edge distribution waveform where the edge element count value changes from increasing to decreasing. The counting unit 43 detects the peaks after noise reduction processing is performed on the edge distribution waveform with, for example, a low-pass filter, a moving average filter, or the like. Here, the peaks to be detected may be only the peaks with a value equal to or greater than a predetermined limit value. The predetermined limit value can be set, for example, to 60% of the maximum value of the edge distribution waveform.
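The peak detection described above can be sketched as follows: smooth the waveform with a moving average filter, then keep points where the value turns from increasing to decreasing and is at least 60% of the maximum. The 60% ratio follows the text; the window size, the waveform values, and the function name are hypothetical assumptions.

```python
def detect_peaks(waveform, window=3, threshold_ratio=0.6):
    # Noise reduction with a simple moving average filter.
    half = window // 2
    smoothed = [sum(waveform[max(0, i - half):i + half + 1]) /
                len(waveform[max(0, i - half):i + half + 1])
                for i in range(len(waveform))]
    # Keep only peaks at or above 60% of the maximum value.
    limit = threshold_ratio * max(smoothed)
    return [i for i in range(1, len(smoothed) - 1)
            if smoothed[i - 1] < smoothed[i] >= smoothed[i + 1]
            and smoothed[i] >= limit]

# Hypothetical edge distribution waveform with peaks at equal intervals.
wave = [0, 1, 7, 1, 0, 1, 7, 1, 0, 1, 7, 1, 0]
print(detect_peaks(wave))  # peak positions 2, 6, 10
```

The detected positions are then handed to the equal-interval counting described in the next paragraph.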
[112] In addition, the counting unit 43 counts, among the peaks thus detected, the number of peaks aligned at equal intervals (peak information). Specifically, the counting unit 43 calculates the distances between the peaks thus detected, extracts peaks whose calculated peak distance lies within a predetermined range, and counts the number of those peaks. The "predetermined range" for the peak distance may be a fixed value established in advance according to the type of periodic fixed objects to be detected, or it may be a variable value established based on peak distances detected sequentially for a predetermined period of time or longer. It should be noted that, when detecting the peaks of the edge distribution waveform, the counting unit 43 misses a peak detected at an earlier time in some cases. In such a case, the peak interval is detected as being twice, three times, or more the size of the actual interval. Consequently, to prevent the peaks aligned at equal intervals from being miscounted, the "predetermined range" is established to include values corresponding to multiples of the peak interval to be extracted first. For example, when the interval of the peaks to be extracted first is X, the "predetermined range" for the peak distance is established as X ± 10%, 2X ± 20%, and 3X ± 30%. In this way, the counting unit 43 functions as a peak information detection unit configured to detect peak information from the waveform data.
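The tolerance scheme above (X ± 10%, 2X ± 20%, 3X ± 30%) can be sketched as follows. This is a non-authoritative illustration: it counts the number of peak intervals that fall inside the predetermined range (an approximation of the peak count described in the text), and the function name and tolerance parameter are assumptions.

```python
def count_equal_interval_peaks(peak_positions, base_interval, tolerance=0.10):
    """Count consecutive peak intervals whose length is close to the base
    interval X, or to 2X / 3X to tolerate a missed peak, using the widening
    tolerances X±10%, 2X±20%, 3X±30% described in the text."""
    count = 0
    for a, b in zip(peak_positions, peak_positions[1:]):
        d = b - a
        for k in (1, 2, 3):
            # k-th multiple of the base interval, with tolerance k * 10%
            if abs(d - k * base_interval) <= k * tolerance * base_interval:
                count += 1
                break
    return count
```

With peaks at positions `[0, 10, 20, 30, 50, 55]` and a base interval of 10, the interval 30→50 is accepted as a doubled interval (a missed peak), while the spurious interval 50→55 is rejected.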
[113] The candidate periodic fixed object detection unit 44 is configured to judge whether a three-dimensional object having an extracted edge element is a candidate periodic fixed object based on whether the number of peaks (peak information) counted by the counting unit 43 is equal to or greater than a predetermined threshold value Th2. Specifically, when the number of peaks counted by the counting unit 43 is equal to or greater than the predetermined threshold value Th2, the candidate periodic fixed object detection unit 44 judges that the objects corresponding to the counted peaks are candidate periodic fixed objects. The threshold value Th2 is a value determined according to the type of periodic fixed objects to be detected, such as milestones, protection pillars, or utility poles, for example, and can be obtained through experiments or the like. Specifically, the threshold value Th2 is set to a value of, for example, 3 to 100 inclusive.
[114] When a candidate periodic fixed object is continuously detected for a predetermined period of time, the periodic fixed object judgment unit 45 is configured to judge that the candidate periodic fixed object is a periodic fixed object. Specifically, when the candidate periodic fixed object detection unit 44 continuously detects, for the predetermined period of time, a state in which the number of peaks is equal to or greater than the predetermined threshold value Th2, the periodic fixed object judgment unit 45 judges that there is a sufficiently high possibility that the detected candidate periodic fixed object is a periodic fixed object. The periodic fixed object judgment unit 45 then judges that the objects corresponding to the counted peaks are periodic fixed objects. The "predetermined period of time" is a value determined according to the type of periodic fixed objects to be detected, and can be obtained through experiments or the like. The value can be a fixed value, or it can vary according to the capture interval of the camera 10 and the travel speed of the vehicle V itself. Specifically, the "predetermined period of time" is set to, for example, 0.1 to 5 seconds.
[115] Next, the periodic fixed object detection method according to the present embodiment will be described. Figures 15 and 16 are flowcharts illustrating the details of the periodic fixed object detection method according to the present embodiment. It should be noted that, in Figures 15 and 16, a description will be given of processing targeting the detection region A1 for convenience; however, the same processing can also be performed on the detection region A2.
[116] As shown in Figure 15, first, in step S31, the point-of-view conversion unit 41 receives captured image data of the predetermined region obtained through capture by the camera 10, and performs point-of-view conversion on the data to create aerial view image data.
[117] Then, in step S32, the edge distribution calculation unit 42 establishes n attention lines Lai in the detection region A1, and establishes n reference lines Lri which respectively correspond to the n attention lines Lai (i = 1 to n). The edge distribution calculation unit 42 establishes, as the attention lines Lai, line segments extending in the vertical direction from points on the ground line L1 in real space. In addition, the edge distribution calculation unit 42 establishes, as the reference lines Lri, line segments which extend in the vertical direction from points on the ground line L1 in real space and are spaced away from the corresponding attention lines Lai by a predetermined distance in real space.
[118] Then, in step S33, the edge distribution calculation unit 42 sets k attention points Paj on each of the attention lines Lai, and sets k reference points Prj (j = 1 to k) on each of the reference lines Lri, the k reference points Prj respectively corresponding to the attention points Paj. The edge distribution calculation unit 42 performs the setting so that the attention points Paj and the reference points Prj corresponding to each other are located at substantially the same height in real space.
[119] Then, in step S34, the edge distribution calculation unit 42 judges whether the luminance difference between the attention point Paj and the reference point Prj corresponding to each other is equal to or greater than the predetermined value. When judging that the luminance difference is equal to or greater than the predetermined value, the edge distribution calculation unit 42 judges that an edge element is present between the attention point Paj and the reference point Prj thus judged, and substitutes "1" into the count value (bincount(i)) of the i-th attention line Lai in step S35. In step S34, when judging that the luminance difference is less than the predetermined value, the edge distribution calculation unit 42 judges that no edge element is present between the attention point Paj and the reference point Prj thus judged, and advances the processing to step S36.
[120] In step S36, the edge distribution calculation unit 42 judges whether or not the processing of step S34 has been performed on all the attention points Paj on the currently processed attention line Lai. When judging that the processing of step S34 has not been performed on all the attention points Paj, the edge distribution calculation unit 42 returns the processing to step S34, obtains a luminance difference between the next attention point Paj+1 and the reference point Prj+1, and judges whether the luminance difference is equal to or greater than the predetermined value. In this way, the edge distribution calculation unit 42 sequentially obtains the luminance differences between the attention points Paj and the reference points Prj along the attention line Lai. When the luminance difference thus obtained becomes equal to or greater than the predetermined value, the edge distribution calculation unit 42 judges that an edge element is present.
[121] After substituting "1" into the count value (bincount(i)) of the i-th attention line Lai in step S35, the edge distribution calculation unit 42 advances the processing to step S37. Then, the edge distribution calculation unit 42 obtains a luminance difference between the next attention point Paj+1 and the reference point Prj+1, and judges whether the luminance difference is equal to or greater than the predetermined value. When judging that the luminance difference is equal to or greater than the predetermined value, the edge distribution calculation unit 42 judges that an edge element is present between the attention point Paj+1 and the reference point Prj+1 thus judged, and increments the count value (bincount(i)) of the i-th attention line Lai in step S38.
[122] In step S37, when judging that the luminance difference is less than the predetermined value, the edge distribution calculation unit 42 judges that no edge element is present between the attention point Paj+1 and the reference point Prj+1 thus judged, and advances the processing to step S39, skipping step S38.
[123] Then, in step S39, the edge distribution calculation unit 42 judges whether the processing of step S34 or S37 has been performed on all the attention points Paj on the currently processed attention line Lai. When judging that the processing has not been performed on all the attention points Paj, the edge distribution calculation unit 42 returns the processing to step S37, obtains a luminance difference between the next attention point Paj+1 and the reference point Prj+1, and judges whether or not the luminance difference is equal to or greater than the predetermined value. In step S39, when judging that the processing has been performed on all the attention points Paj, the edge distribution calculation unit 42 advances the processing to step S41. In this way, the edge distribution calculation unit 42 counts how many edge elements are present along the same attention line Lai, and stores the number of edge elements thus counted as an attribute (bincount(i)) of the attention line Lai.
[124] It should be noted that, in step S36, when judging that the processing of step S34 has been performed on all the attention points Paj, the edge distribution calculation unit 42 judges that no edge element is present on the currently processed attention line Lai. Then, the edge distribution calculation unit 42 substitutes "0" into bincount(i) in step S40, and advances the processing to step S41.
[125] Then, in step S41, the edge distribution calculation unit 42 judges whether the above processing has been performed on all the n attention lines Lai. When judging that the processing has not been performed on all the attention lines Lai, the edge distribution calculation unit 42 returns the processing to step S34 and performs the processing on the next attention line Lai+1. In step S41, when judging that the processing has been performed on all the attention lines Lai, the edge distribution calculation unit 42 advances the processing to step S42.
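The per-line edge counting of steps S33 to S41 can be sketched as the following loop. Python is used for illustration; representing each attention line and reference line as a list of luminance values, and the function and parameter names, are assumptions for the sketch.

```python
def edge_distribution(attention_lines, reference_lines, diff_threshold):
    """For each attention line Lai, count the attention points Paj whose
    luminance difference from the corresponding reference point Prj is equal
    to or greater than the predetermined value, yielding bincount(i) for
    i = 1..n (steps S33-S41).  Each line is a list of luminance samples."""
    bincounts = []
    for attention, reference in zip(attention_lines, reference_lines):
        count = 0
        for pa, pr in zip(attention, reference):
            if abs(pa - pr) >= diff_threshold:
                count += 1  # an edge element is present between Paj and Prj
        bincounts.append(count)
    return bincounts
```

A line crossing a strong luminance edge accumulates a high count, while a line over a uniform road surface yields bincount(i) = 0, matching step S40.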
[126] Then, in step S42, the edge distribution calculation unit 42 plots the edge element count value bincount(i) (i = 1 to n) obtained for each of the n attention lines Lai, and obtains an edge distribution waveform. Specifically, the edge distribution calculation unit 42 plots the edge element count value bincount(i) (i = 1 to n) on a plane in which the vertical axis represents the edge element count value and the horizontal axis represents the position of the attention line Lai on the ground line L1 in real space.
[127] Subsequently, in step S43, the counting unit 43 detects peaks of the edge distribution waveform calculated by the edge distribution calculation unit 42.
[128] Subsequently, in step S44, the counting unit 43 calculates distances between the peaks thus detected.
[129] Subsequently, in step S45, counting unit 43 extracts peaks with the calculated peak distance within a predetermined range, and counts the number of peaks.
[130] Subsequently, in step S46, the candidate periodic fixed object detection unit 44 judges whether or not the number of peaks counted by the counting unit 43 is equal to or greater than the predetermined threshold value Th2. When judging that the number of peaks is equal to or greater than the predetermined threshold value Th2, the candidate periodic fixed object detection unit 44 judges that the objects corresponding to the counted peaks are candidate periodic fixed objects, and advances the processing to step S47.
[131] In step S47, the periodic fixed object judgment unit 45 judges whether the state in which the number of peaks is equal to or greater than the predetermined threshold value Th2 has been sequentially detected a predetermined number of times or more. When judging that the state in which the number of peaks is equal to or greater than the predetermined threshold value Th2 has been sequentially detected the predetermined number of times or more, the periodic fixed object judgment unit 45 judges that the objects corresponding to the counted peaks are periodic fixed objects, and substitutes "1" into an f_shuki flag in step S48. On the other hand, in step S47, when judging that the state in which the number of peaks is equal to or greater than the predetermined threshold value Th2 has not been sequentially detected the predetermined number of times or more, the periodic fixed object judgment unit 45 skips step S48 and maintains the value of the f_shuki flag. Then, the processing in Figures 15 and 16 is completed.
[132] In step S46, when judging that the number of peaks is less than the predetermined threshold value Th2, the candidate periodic fixed object detection unit 44 advances the processing to step S49.
[133] In step S49, the periodic fixed object judgment unit 45 judges whether the state in which the number of peaks is less than the predetermined threshold value Th2 has been sequentially detected a predetermined number of times or more. When judging that the state in which the number of peaks is less than the predetermined threshold value Th2 has been sequentially detected the predetermined number of times or more, the periodic fixed object judgment unit 45 judges that the objects corresponding to the counted peaks are not periodic fixed objects, and substitutes "0" into the f_shuki flag in step S50. On the other hand, in step S49, when judging that the state in which the number of peaks is less than the predetermined threshold value Th2 has not been sequentially detected the predetermined number of times or more, the periodic fixed object judgment unit 45 skips step S50 and maintains the value of the f_shuki flag. Then, the processing in Figures 15 and 16 is completed.
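The f_shuki flag logic of steps S47 to S50 behaves as a hysteresis: the flag is set only after a sustained run of positive judgments, cleared only after a sustained run of negative ones, and otherwise kept. A minimal sketch, with the class name and the frame-count stand-in for the "predetermined period of time" being assumptions:

```python
class PeriodicFixedObjectJudge:
    """Set f_shuki to 1 after the candidate state has been detected a
    predetermined number of consecutive times, clear it to 0 after the
    opposite state has persisted equally long, and keep it otherwise."""
    def __init__(self, required_times):
        self.required = required_times
        self.pos_run = 0   # consecutive detections of the candidate state
        self.neg_run = 0   # consecutive detections of the opposite state
        self.f_shuki = 0

    def update(self, candidate_detected):
        if candidate_detected:
            self.pos_run += 1
            self.neg_run = 0
            if self.pos_run >= self.required:
                self.f_shuki = 1   # step S48
        else:
            self.neg_run += 1
            self.pos_run = 0
            if self.neg_run >= self.required:
                self.f_shuki = 0   # step S50
        return self.f_shuki
```

A brief dropout of the candidate state therefore does not clear the flag, which is what suppresses erroneous detection due to noise.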
[134] In the periodic fixed object detection system 2 and the periodic fixed object detection method according to the present embodiment, an edge element (characteristic point) of a three-dimensional object is extracted from the image data of a predetermined region of an aerial view image for each of the regions (sub-regions) along the multiple vertical virtual lines in the predetermined region; an edge distribution waveform (waveform data) corresponding to a distribution of the edge elements in the predetermined region is calculated; and whether the three-dimensional object having the extracted edge element is a candidate periodic fixed object is judged based on whether the number of peaks (peak information) of the edge distribution waveform is equal to or greater than a predetermined threshold value. This, similarly to the first embodiment, allows clearer extraction of the periodicity (repeatability) of periodic fixed objects as the peak information of the waveform data, and candidate periodic fixed objects can be extracted more easily from the three-dimensional objects included in a captured image. In this way, more accurate extraction of periodic fixed objects is possible.
[135] In addition, in the periodic fixed object detection system 2 and the periodic fixed object detection method according to the present embodiment, the number of edge elements present along each of the multiple vertical virtual lines extending in the vertical direction in real space is added up, and the edge distribution waveform is obtained based on the number of edge elements thus added. In addition, when the number of peaks of the edge distribution waveform is equal to or greater than the predetermined threshold value Th2, it is judged that the three-dimensional object having the extracted edge element is a candidate periodic fixed object. Therefore, a case in which edges extending in the vertical direction are aligned with high density is reliably detected, without judging whether the detected three-dimensional objects are fixed objects or moving objects. Therefore, it is possible to more easily detect candidate periodic fixed objects that are likely to be periodic fixed objects.
[136] Particularly, in the periodic fixed object detection system 2 and the periodic fixed object detection method according to the present embodiment, the number of peaks aligned at equal intervals is counted among the peaks of the edge distribution waveform. Consequently, it is also possible to reliably detect candidate periodic fixed objects which have edges extending in the vertical direction and aligned with high density at equal intervals, and which are thus likely to be periodic fixed objects.
[137] In addition, in the periodic fixed object detection system 2 and the periodic fixed object detection method according to the present embodiment, when a candidate periodic fixed object is continuously detected for a predetermined period of time, it is judged that the candidate periodic fixed object is a periodic fixed object. Consequently, erroneous detection due to noise is prevented, and periodic fixed objects can be detected with greater reliability. [Third Embodiment]
[138] In the following, a third embodiment of the present invention will be described based on the drawings. It should be noted that components equivalent to those described in the first and second embodiments are denoted by the same reference signs, and description thereof is omitted.
[139] A periodic fixed object detection system 3 according to the present embodiment has the same schematic configuration as the periodic fixed object detection system 1 illustrated in Figure 1, but includes a calculator 40' in place of the calculator 30. Specifically, the periodic fixed object detection system 3 according to the present embodiment includes the camera 10, the vehicle speed sensor 20, and the calculator 40'.
[140] Figure 17 is a block diagram illustrating the details of the calculator 40' according to the present embodiment. As shown in Figure 17, the calculator 40' includes the point-of-view conversion unit 41, the edge distribution calculation unit 42, a counting unit 43', the candidate periodic fixed object detection unit 44, an alignment unit 51, a difference calculation unit 52, and a periodic fixed object judgment unit 53. It should be noted that the calculator 40' is a computer composed of a CPU, a RAM, a ROM, and so on. The calculator 40' performs image processing and the like according to a program established in advance to implement the function of each unit, namely the point-of-view conversion unit 41, the edge distribution calculation unit 42, the counting unit 43', the candidate periodic fixed object detection unit 44, the alignment unit 51, the difference calculation unit 52, and the periodic fixed object judgment unit 53.
[141] The counting unit 43' according to the present embodiment is configured to detect peaks of an edge distribution waveform calculated by the edge distribution calculation unit 42, and to count the number of peaks. The counting unit 43' differs from the counting unit 43 according to the second embodiment in that the counting unit 43' counts the number of peaks without excluding peaks whose peak distance falls outside a predetermined range.
[142] The candidate periodic fixed object detection unit 44 is configured to judge whether or not objects corresponding to the peaks are candidate periodic fixed objects based on whether the number of peaks (peak information) counted by the counting unit 43' is equal to or greater than a predetermined threshold value Th3. Specifically, when the number of peaks counted by the counting unit 43' is equal to or greater than the predetermined threshold value Th3, the candidate periodic fixed object detection unit 44 judges that the corresponding objects are candidate periodic fixed objects. The threshold value Th3 is a value determined according to the type of periodic fixed objects to be detected, such as milestones, protection pillars, or utility poles, for example, and can be obtained through experiments or the like. Specifically, the threshold value Th3 is set to a value of, for example, 3 to 100 inclusive.
[143] Figure 18 illustrates graphs for describing detailed operations of the alignment unit 51. The alignment unit 51 is configured to sequentially receive edge distribution waveforms calculated by the edge distribution calculation unit 42, and to align the positions of the edge distribution waveforms received at different times based on the travel speed of the vehicle V itself detected by the vehicle speed sensor 20. For example, suppose that the alignment unit 51 receives an edge distribution waveform Et−Δt calculated by the edge distribution calculation unit 42 at time t−Δt (second time) and an edge distribution waveform Et calculated at time t (first time). Furthermore, suppose that, as the vehicle V itself moves during one time segment (Δt), an edge distribution waveform moves by δ with respect to the coordinate system. In this case, the alignment unit 51 moves the edge distribution waveform Et by δ along the horizontal axis, as shown in Figure 18, to thereby match the position of the edge distribution waveform Et with the position of the edge distribution waveform Et−Δt. In this way, the alignment unit 51 obtains an edge distribution waveform Et′. Here, aligning the positions of the edge distribution waveforms means that, when an attention line Lai corresponding to a point (for example, G1) on one edge distribution waveform and an attention line Lai corresponding to a point (for example, G2) on the other edge distribution waveform are present at the same or approximately the same position in real space, the edge distribution waveform(s) are moved in parallel so that the horizontal coordinate values of the point G1 and the point G2 coincide with each other. Incidentally, the length of a time segment (Δt) can be, for example, a predetermined time period such as a control cycle, or it can be an arbitrary time period.
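The shift by δ can be sketched as follows when the waveform is held as an array of bins along the ground line. This is an illustration only: the bin width, the direction of the shift, and zero-filling of the vacated bins are assumptions not fixed by the text.

```python
def align_waveform(waveform, speed_mps, dt_s, bin_width_m):
    """Shift the edge distribution waveform Et by the distance delta the
    vehicle V travelled during dt, expressed as a whole number of bins, so
    that it can be compared with the waveform Et-dt (yielding Et')."""
    # delta = v * dt, converted to bins; capped at the waveform length
    delta_bins = min(round(speed_mps * dt_s / bin_width_m), len(waveform))
    # shift toward lower positions (direction is an assumption),
    # zero-filling the bins that enter the region; no wrap-around
    return list(waveform[delta_bins:]) + [0] * delta_bins
```

For example, at 10 m/s with Δt = 0.1 s and 0.5 m bins, the waveform is shifted by two bins; at standstill it is returned unchanged.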
[144] Figure 19 illustrates graphs for describing detailed operations of the difference calculation unit 52. The difference calculation unit 52 is configured to receive the edge distribution waveform Et−Δt and the edge distribution waveform Et′ calculated by the alignment unit 51, and to calculate a distribution of absolute values of the difference between them, |Et−Δt − Et′|. When the edge distribution waveforms are derived from a periodic fixed object, the edge distribution waveform Et−Δt coincides with the edge distribution waveform Et′. Therefore, the absolute values of the difference |Et−Δt − Et′| are small overall, as shown in Figure 19(a). However, when an edge distribution waveform is derived from moving objects, the edge distribution waveform Et−Δt does not match the edge distribution waveform Et′, and the absolute values of the difference |Et−Δt − Et′| vary greatly compared with Figure 19(a), as illustrated in Figure 19(b).
[145] The periodic fixed object judgment unit 53 is configured to integrate the absolute values |Et−Δt − Et′| calculated by the difference calculation unit 52 to calculate an integrated value ID1 (first integrated value) thereof, and to calculate an integrated value I1 (second integrated value) of the edge distribution waveform Et′. In addition, the periodic fixed object judgment unit 53 is configured to calculate a ratio of the integrated value ID1 to the integrated value I1 (ID1/I1) and to judge whether or not a candidate periodic fixed object detected by the candidate periodic fixed object detection unit 44 is stationary based on whether or not the value of the ratio is less than a predetermined threshold value Th4. The threshold value Th4 is a value determined according to the type of periodic fixed objects to be detected, such as milestones, protection pillars, or utility poles, for example, and can be obtained through experiments or the like. When the ratio of the integrated value ID1 to the integrated value I1 (ID1/I1) is less than the predetermined threshold value Th4, the periodic fixed object judgment unit 53 judges that the candidate periodic fixed object is stationary.
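The stationarity test on the ratio ID1/I1 can be sketched as below. Summation over bins stands in for integration; the value of Th4 and the guard against an all-zero waveform are assumptions for the sketch.

```python
def is_stationary(prev_waveform, aligned_waveform, th4):
    """Integrate |Et-dt - Et'| (ID1) and Et' (I1), and judge the candidate
    periodic fixed object stationary when ID1 / I1 is below Th4."""
    id1 = sum(abs(a - b) for a, b in zip(prev_waveform, aligned_waveform))
    i1 = sum(aligned_waveform)
    if i1 == 0:
        return False  # no edge elements at all: nothing to judge stationary
    return (id1 / i1) < th4
```

With a stationary scene the two aligned waveforms nearly coincide, ID1 is small, and the ratio falls below Th4; with moving objects the mismatch drives the ratio above it.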
[146] When a candidate periodic fixed object is continuously detected for a predetermined period of time, the periodic fixed object judgment unit 53 judges that the candidate periodic fixed object is a periodic fixed object. Specifically, when a state in which the ratio ID1/I1 is less than the predetermined threshold value Th4 is continuously detected for the predetermined period of time, the periodic fixed object judgment unit 53 judges that there is a sufficiently high possibility that the detected candidate periodic fixed object is a periodic fixed object. The periodic fixed object judgment unit 53 then judges that the objects corresponding to the counted peaks are periodic fixed objects. The "predetermined period of time" is a value determined according to the type of periodic fixed objects to be detected, and can be obtained through experiments or the like. The value can be a fixed value, or it can vary according to the capture interval of the camera 10 and the travel speed of the vehicle V itself. Specifically, the "predetermined period of time" is set to, for example, 0.1 to 5 seconds. This time period can ensure the reliability of the judgment that candidate periodic fixed objects are periodic fixed objects, and is shorter than the average interval at which an error occurs in measuring the travel speed of the vehicle itself or the like.
[147] Next, a periodic fixed object detection method according to the present embodiment will be described. Figure 20 is a flowchart illustrating the details of the periodic fixed object detection method according to the present embodiment, and corresponds to Figure 16 of the second embodiment. It should be noted that, since the processing of steps S31 to S41 of the periodic fixed object detection method according to the present embodiment is the same as the processing of steps S31 to S41 of the second embodiment, its description and illustration are omitted. In addition, a description of processing targeting the detection region A1 will be given below for convenience; however, the same processing can also be performed on the detection region A2.
[148] As shown in Figure 20, in step S51 subsequent to step S41 (see Figure 15), the edge distribution calculation unit 42 plots the edge element count value bincount(i) (i = 1 to n) obtained for each of the n attention lines Lai, and obtains an edge distribution waveform. Specifically, the edge distribution calculation unit 42 plots the edge element count value bincount(i) (i = 1 to n) on a plane in which the vertical axis represents the edge element count value and the horizontal axis represents the position of the attention line Lai on the ground line L1 in real space.
[149] Subsequently, in step S52, the counting unit 43 'detects peaks in the edge distribution waveform calculated by the edge distribution calculation unit 42, and counts the number of peaks.
[150] Subsequently, in step S53, the candidate periodic fixed object detection unit 44 judges whether the number of peaks counted by the counting unit 43' is equal to or greater than the predetermined threshold value Th3. When judging that the number of peaks is equal to or greater than the predetermined threshold value Th3, the candidate periodic fixed object detection unit 44 judges that the objects corresponding to the peaks are candidate periodic fixed objects, and advances the processing to step S54. In step S53, when judging that the number of peaks counted by the counting unit 43' is less than the predetermined threshold value Th3, the processing in Figure 20 is completed.
[151] Subsequently, in step S54, the alignment unit 51 aligns the positions of the edge distribution waveforms calculated at different times by the edge distribution calculation unit 42 based on the travel speed of the vehicle V itself detected by the vehicle speed sensor 20. Specifically, when an edge distribution waveform moves by δ with respect to the graph coordinate system as the vehicle V itself moves during one time segment (Δt), the alignment unit 51 shifts the edge distribution waveform Et by δ along the horizontal axis, and obtains the edge distribution waveform Et′.
[152] Then, in step S55, the difference calculation unit 52 calculates a distribution of the absolute values of the difference |Et−Δt − Et′| between the edge distribution waveform Et−Δt and the edge distribution waveform Et′ calculated by the alignment unit 51.
[153] Next, the periodic fixed object judgment unit 53 calculates the integrated value I1 of the edge distribution waveform Et′ in step S56, and calculates the integrated value ID1 of the absolute values |Et−Δt − Et′| in the subsequent step S57.
[154] Then, in step S58, the periodic fixed object judgment unit 53 calculates the ratio (ID1/I1) of the integrated value ID1 to the integrated value I1, and judges whether the candidate periodic fixed object detected by the candidate periodic fixed object detection unit 44 is stationary based on whether the value of the ratio is less than the predetermined threshold value Th4. When judging that the ratio ID1/I1 is less than the predetermined threshold value Th4, the periodic fixed object judgment unit 53 judges that the candidate periodic fixed object is stationary, and advances the processing to step S59.
[155] In step S59, the periodic fixed object judgment unit 53 judges whether the state in which the ratio ID1/I1 is less than the predetermined threshold value Th4 has been sequentially detected a predetermined number of times or more, in other words, whether the state has been continuously detected for the predetermined period of time. When judging that the state in which the ratio ID1/I1 is less than the predetermined threshold value Th4 has been sequentially detected the predetermined number of times or more, the periodic fixed object judgment unit 53 judges that the objects corresponding to the counted peaks are periodic fixed objects, and substitutes "1" into the f_shuki flag in step S60. On the other hand, in step S59, when judging that the state in which the ratio ID1/I1 is less than the predetermined threshold value Th4 has not been sequentially detected the predetermined number of times or more, the periodic fixed object judgment unit 53 skips step S60 and maintains the value of the f_shuki flag. Then, the processing in Figure 20 is completed.
[156] In step S58, when determining that the ratio ID1 / I1 is equal to or greater than the predetermined threshold value Th4, the periodic fixed object judgment unit 53 advances processing to step S61.
[157] In step S61, the periodic fixed object judgment unit 53 judges whether the state in which the ratio ID1/I1 is equal to or greater than the predetermined threshold value Th4 has been sequentially detected a predetermined number of times or more. When judging that the state in which the ratio ID1/I1 is equal to or greater than the predetermined threshold value Th4 has been sequentially detected the predetermined number of times or more, the periodic fixed object judgment unit 53 judges that the objects corresponding to the counted peaks are not periodic fixed objects, and substitutes "0" into the f_shuki flag in step S62. However, in step S61, when judging that the state in which the ratio ID1/I1 is equal to or greater than the predetermined threshold value Th4 has not been sequentially detected the predetermined number of times or more, the periodic fixed object judgment unit 53 skips step S62 and maintains the value of the f_shuki flag. Then, the processing in Figure 20 is completed.
[158] In the periodic fixed object detection system 3 and the periodic fixed object detection method according to the present embodiment, an edge element (characteristic point) of a three-dimensional object is extracted from the image data in a predetermined region of an aerial view image for each of the regions (sub-regions) along the multiple vertical virtual lines included in the predetermined region, and an edge distribution waveform (waveform data) corresponding to a distribution of the edge elements in the predetermined region is calculated; it is then judged whether the three-dimensional object with the extracted edge element is a candidate periodic fixed object based on whether the number of peaks (peak information) of the edge distribution waveform is equal to or greater than a predetermined threshold value. This, as in the first and second embodiments, allows a clearer extraction of the periodicity (repetition) of periodic fixed objects as the peak information of the waveform data, and candidate periodic fixed objects can be more easily extracted from the three-dimensional objects included in a captured image. In this way, more accurate extraction of periodic fixed objects is possible.
[159] In addition, in the periodic fixed object detection system 3 and the periodic fixed object detection method according to the present embodiment, as in the second embodiment, the number of edge elements present along multiple vertical virtual lines extending in the vertical direction in real space is added up, and the edge distribution waveform is obtained based on the number of edge elements thus added together. Furthermore, when the number of peaks of the edge distribution waveform is equal to or greater than the predetermined threshold value Th3, the three-dimensional object with the extracted edge element is determined to be a candidate periodic fixed object. Consequently, similarly to the second embodiment, a case where edges extending in the vertical direction are aligned with high density is reliably detected, regardless of whether the detected three-dimensional objects are fixed objects or moving objects. Therefore, it becomes possible to more easily detect candidate periodic fixed objects that are more likely to be periodic fixed objects.
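As a concrete illustration of the waveform-and-peak test described above, the edge distribution waveform can be modeled as a per-column sum of a binary edge mask of the aerial view region, with the candidate judgment made by comparing the peak count against Th3. This is only a simplified sketch (one vertical virtual line per image column, a naive local-maximum peak detector), not the patented implementation:

```python
import numpy as np

def edge_distribution_waveform(edge_mask: np.ndarray) -> np.ndarray:
    """Sum edge elements along each vertical virtual line, here modeled
    as one image column of a binary edge mask; one bin per line."""
    return edge_mask.sum(axis=0)

def count_peaks(waveform: np.ndarray) -> int:
    """Count local maxima of the edge distribution waveform (naive detector)."""
    peaks = 0
    for i in range(1, len(waveform) - 1):
        if waveform[i] > waveform[i - 1] and waveform[i] >= waveform[i + 1] and waveform[i] > 0:
            peaks += 1
    return peaks

def is_candidate_periodic_fixed_object(edge_mask: np.ndarray, th3: int) -> bool:
    """Candidate judgment: peak count equal to or greater than Th3."""
    return count_peaks(edge_distribution_waveform(edge_mask)) >= th3
```

For a mask whose edge columns repeat at regular intervals (e.g. guardrail posts), the waveform shows one peak per post, so the count easily clears a threshold such as Th3 = 3.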
[160] In addition, in the periodic fixed object detection system 3 and the periodic fixed object detection method according to the present embodiment, the position of the edge distribution waveform Et at time t is aligned with the position of the edge distribution waveform Et-Δt at time t-Δt based on the travel speed of the moving object, and the distribution waveform difference |Et-Δt - Et'| between the edge distribution waveform Et-Δt at time t-Δt and the edge distribution waveform Et' thus aligned is calculated. Then, this distribution waveform difference |Et-Δt - Et'| is integrated to calculate the integrated value ID1, and the edge distribution waveform Et' is integrated to calculate the integrated value I1. The ratio of the integrated value ID1 to the integrated value I1 (ID1/I1) is calculated, and it is determined whether the candidate periodic fixed object is fixed based on whether the ratio value is less than the predetermined threshold value Th4. Consequently, it is possible to detect fixed candidate periodic fixed objects that are highly likely to be periodic fixed objects, and periodic fixed objects can be detected more reliably.
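The alignment-and-ratio computation just described can be sketched numerically by treating the waveforms as 1-D arrays and the alignment as a bin shift derived from the travel speed. The shift direction, the bin granularity, and the handling of an empty waveform are illustrative assumptions:

```python
import numpy as np

def stationarity_ratio(w_prev: np.ndarray, w_now: np.ndarray, shift_bins: int) -> float:
    """Compute ID1/I1: align Et with Et-Δt by the number of bins the host
    vehicle travelled, integrate the absolute difference (ID1) and the
    aligned waveform itself (I1), and return their ratio."""
    w_aligned = np.roll(w_now, shift_bins)           # Et': Et shifted back by the travel distance
    id1 = float(np.abs(w_prev - w_aligned).sum())    # integral of |Et-Δt - Et'|
    i1 = float(w_aligned.sum())                      # integral of Et'
    return id1 / i1 if i1 > 0 else float("inf")

# A stationary periodic object reproduces the earlier waveform after
# alignment, so the ratio falls below Th4.
w_prev = np.array([0, 3, 0, 0, 3, 0])
w_now = np.roll(w_prev, -1)   # the pattern appears shifted by one bin at time t
```

Here `stationarity_ratio(w_prev, w_now, 1)` returns 0.0, below any reasonable Th4, so the candidate would be judged fixed; for a waveform that does not move relative to the camera (an object keeping pace with the vehicle) the aligned difference stays large and the ratio stays high.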
[161] In addition, in the periodic fixed object detection system 3 and the periodic fixed object detection method according to the present embodiment, when it is continuously detected for a predetermined period of time that a candidate periodic fixed object is fixed, the candidate periodic fixed object is determined to be a periodic fixed object. Thus, erroneous detection due to noise is prevented, and periodic fixed objects can be detected with greater reliability.
[Modified example]
[162] In the third embodiment above, the ratio of the integrated value ID1 to the integrated value I1 (ID1/I1) is calculated, and whether a candidate periodic fixed object is fixed is determined based on whether the ratio value is less than the predetermined threshold value Th4. Nevertheless, the determination method is not limited to this.
[163] In this modified example, the difference calculation unit 52 is configured to calculate a distribution of the absolute values of the difference (first difference) |Et-Δt - Et'| between the edge distribution waveform Et' and the edge distribution waveform Et-Δt, and to calculate a distribution of the absolute values of a difference (second difference) |Et-Δt - Et| between the edge distribution waveform Et and the edge distribution waveform Et-Δt.
[164] In addition, the periodic fixed object judgment unit 53 is configured to integrate the absolute values |Et-Δt - Et'| calculated by the difference calculation unit 52 to calculate the integrated value ID1 (first integrated value) thereof, and to integrate the absolute values |Et-Δt - Et| to calculate an integrated value ID2 (second integrated value) thereof.
[165] In addition, the periodic fixed object judgment unit 53 is configured to calculate a ratio of the integrated value ID1 to the integrated value ID2 (ID1/ID2), and to determine whether the candidate periodic fixed object detected by the candidate periodic fixed object detection unit 44 is fixed based on whether the ratio value is less than a predetermined threshold value Th5. The threshold value Th5 is a value determined according to the type of periodic fixed objects to be detected, such as landmarks, protection pillars, or utility poles, for example, and can be obtained through experiments or the like. When the ratio of the integrated value ID1 to the integrated value ID2 (ID1/ID2) is less than the predetermined threshold value Th5, the periodic fixed object judgment unit 53 determines that the candidate periodic fixed object is fixed.
[166] When a fixed candidate periodic fixed object is continuously detected for a predetermined period of time, the periodic fixed object judgment unit 53 determines that the candidate periodic fixed object is a periodic fixed object. Specifically, when the state where the ratio ID1/ID2 is less than the predetermined threshold value Th5 is continuously detected for a predetermined period of time, the periodic fixed object judgment unit 53 judges that there is a sufficiently high possibility that the detected candidate periodic fixed object is a periodic fixed object. Then, the periodic fixed object judgment unit 53 determines that the objects corresponding to the counted peaks are periodic fixed objects.
[167] Next, a periodic fixed object detection method according to this modified example will be described. Figure 21 is a flowchart illustrating the details of the periodic fixed object detection method of this modified example, corresponding to Figures 16 and 20. It should be noted that the processing of steps S31 to S41 of the periodic fixed object detection method according to this modified example is the same as the processing of steps S31 to S41 of the embodiment described above, and its illustration and description will be omitted. In addition, in the periodic fixed object detection method according to this modified example, processing equivalent to the processing described in the second and third embodiments is indicated by the same reference numerals, and its description is omitted.
[168] In this modified example, as shown in Figure 21, in step S55' subsequent to step S55, the periodic fixed object judgment unit 53 calculates a distribution of the absolute values of the difference |Et-Δt - Et| between the edge distribution waveform Et-Δt and the edge distribution waveform Et calculated by the alignment unit 51.
[169] Next, the periodic fixed object judgment unit 53 calculates the integrated value ID1 of the absolute values |Et-Δt - Et'| in step S57, and calculates the integrated value ID2 of the absolute values |Et-Δt - Et| in the subsequent step S57'.
[170] Then, in step S58', the periodic fixed object judgment unit 53 calculates the ratio of the integrated value ID1 to the integrated value ID2 (ID1/ID2), and determines whether the candidate periodic fixed object detected by the candidate periodic fixed object detection unit 44 is fixed based on whether the ratio value is less than the predetermined threshold value Th5. When determining that the ratio ID1/ID2 is less than the predetermined threshold value Th5, the periodic fixed object judgment unit 53 determines that the candidate periodic fixed object is fixed, and advances the processing to step S59. However, in step S58', when determining that ID1/ID2 is equal to or greater than the predetermined threshold value Th5, the periodic fixed object judgment unit 53 advances the processing to step S61.
[171] The processing after step S59 and the processing after step S61 are the same as those in the third embodiment, and their description is omitted.
[172] In this modified example, the first difference distribution waveform |Et-Δt - Et'| between the edge distribution waveform Et-Δt at time t-Δt and the edge distribution waveform Et' thus aligned, and the second difference distribution waveform |Et-Δt - Et| between the edge distribution waveform Et at time t and the edge distribution waveform Et-Δt at time t-Δt, are calculated. Then, the first difference distribution waveform |Et-Δt - Et'| is integrated to calculate the integrated value ID1, and the second difference distribution waveform |Et-Δt - Et| is integrated to calculate the integrated value ID2. The ratio of the integrated value ID1 to the integrated value ID2 (ID1/ID2) is calculated, and it is determined whether the candidate periodic fixed object is fixed based on whether the ratio value is less than the predetermined threshold value Th5. Since the denominator of the ratio is the integrated value ID2 of the difference between the edge distribution waveform Et at time t and the edge distribution waveform Et-Δt at time t-Δt, this creates a difference between the ratio value obtained when the edge distribution waveforms Et, Et-Δt are derived from moving objects and the ratio value obtained when they are derived from periodic fixed objects, and a fixed candidate periodic fixed object can be detected more reliably.
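The modified-example ratio can be sketched in the same 1-D array terms as before. Again, the bin-shift alignment and the handling of a zero denominator are illustrative assumptions, not details from the patent:

```python
import numpy as np

def ratio_id1_id2(w_prev: np.ndarray, w_now: np.ndarray, shift_bins: int) -> float:
    """Modified-example test: aligned difference ID1 over the raw
    frame-to-frame difference ID2. A stationary periodic object gives a
    small ID1 but a large ID2 (the pattern has shifted between frames),
    so ID1/ID2 drops well below Th5; an object keeping pace with the
    camera gives similar waveforms before alignment, so ID2 is small
    and the ratio stays high."""
    w_aligned = np.roll(w_now, shift_bins)
    id1 = float(np.abs(w_prev - w_aligned).sum())   # integral of |Et-Δt - Et'|
    id2 = float(np.abs(w_prev - w_now).sum())       # integral of |Et-Δt - Et|
    return id1 / id2 if id2 > 0 else float("inf")   # zero ID2: treat as non-stationary

w_prev = np.array([0, 3, 0, 0, 3, 0])
w_now = np.roll(w_prev, -1)   # stationary periodic pattern, shifted by travel
```

This makes the contrast in the paragraph above concrete: for the stationary pattern `ratio_id1_id2(w_prev, w_now, 1)` is 0.0, while for identical unshifted waveforms (a pacing moving object) the denominator vanishes and the ratio is unbounded.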
[173] It should be noted that, in the third embodiment and in the modified example, what is compared with the predetermined threshold values in determining whether a candidate periodic fixed object is fixed is not limited to the ratio ID1/I1 and the ratio ID1/ID2. The ratio can be, for example, a ratio of the integrated value ID1 to an integrated value I2 of the edge distribution waveform Et (ID1/I2), or a ratio of the integrated value ID1 to an integrated value I3 of the edge distribution waveform Et-Δt calculated at time t-Δt (ID1/I3).
[174] In addition, all the integrated values I1, I2, I3, ID1, ID2 that constitute the denominators and numerators of the above ratios are obtained by integrating the edge distribution waveforms or the absolute values of a difference therebetween, but they may instead be obtained by integrating a waveform that results from squaring the edge distribution waveforms or a difference therebetween.
[175] The embodiments of the present invention have been described above. However, these embodiments are only examples described to facilitate the understanding of the present invention, and the present invention is not limited to them. The technical scope of the present invention includes not only the specific technical matters described in the above embodiments, but also various modifications, variations, alternative techniques, and the like, which can be readily derived therefrom.
[176] For example, in the above embodiments, the vehicle speed of the host vehicle V is determined based on a signal from the vehicle speed sensor 20. However, without being limited to this, the speed can be estimated from multiple images captured at different times. In that case, the vehicle speed sensor is no longer needed, and the configuration can be simplified.
[177] This application claims priority based on Japanese Patent Application No. 2011-034097 filed on February 21, 2011, the description of which is incorporated herein by reference.
Industrial Applicability
[178] In a periodic fixed object detection system and a periodic fixed object detection method according to the present invention, a characteristic point of a three-dimensional object is extracted from the image data in a predetermined region of an aerial view image for each of multiple regions included in the predetermined region; waveform data corresponding to a distribution of the characteristic points in the predetermined region in the aerial view image is calculated; and it is judged whether the three-dimensional object with the extracted characteristic point is a candidate periodic fixed object based on whether the peak information of the waveform data is equal to or greater than a predetermined threshold value. This allows a clearer extraction of the periodicity (repetition) of periodic fixed objects as the peak information of the waveform data, and the candidate periodic fixed object can be more easily extracted from the three-dimensional objects included in a captured image. In this way, more accurate extraction of periodic fixed objects is possible.
Reference Signal List
1, 2, 3: periodic fixed object detection system
10: camera (image capture device)
20: vehicle speed sensor (speed detector)
30, 40: calculator
31, 41: viewpoint conversion unit
32, 51: alignment unit
33: three-dimensional object detection unit
34: candidate path quantity calculation unit
35, 43: counting unit
36: path variation calculation unit
37, 45, 53: periodic fixed object judgment unit
37a: candidate periodic fixed object detection unit
37b: periodicity determination unit
38: lane change detection unit (lateral movement detection unit)
42: edge distribution calculation unit
44: candidate periodic fixed object detection unit
52: difference calculation unit
PBt: aerial view image
PDt: difference image
V: host vehicle
Claims (14)
[0001]
1. Repetitive fixed object detection system to detect a repetitive fixed object adjacent to a moving object (V), CHARACTERIZED by the fact that it comprises: an image capture device (10) mounted on the moving object (V) and able to capture an image of the surroundings of the moving object (V); a point of view conversion unit (31, 41) configured to perform point of view conversion on the image captured by the image capture device (10) to create an aerial view image; a feature point extraction unit (32, 33, 34, 42a) configured to extract a feature point from a three-dimensional object from the image data in a predetermined region of the aerial view image for each of a plurality of sub-regions included in the predetermined region; a waveform data calculation unit (35, 42b) configured to calculate waveform data corresponding to a distribution of the characteristic points extracted by the characteristic point extraction unit (32, 33, 34, 42a) in the predetermined region in the aerial view image; a peak information detection unit (37a, 43) configured to detect peak information from waveform data; a candidate repetitive fixed object detection unit (37a, 44) configured to judge whether the three-dimensional object with the characteristic point extracted by the characteristic point extraction unit (32, 33, 34, 42a) is a candidate repetitive fixed object or not based on whether the peak information is equal to or greater than a first predetermined limit value (Th1, Th2, Th3); and a repetitive fixed object judgment unit (37, 45, 53) configured to determine that the candidate repetitive fixed object is a repetitive fixed object when the candidate repetitive fixed object is detected by the candidate repetitive fixed object detection unit (37a, 44) and the detection is made under a predetermined condition, in which the characteristic point extraction unit (42a) is configured to extract edge elements present along each of a plurality of vertical virtual lines (Lai) that extend in 
a vertical direction in a real space, the waveform data calculation unit (42b) is configured to add up the number of edge elements extracted by the feature point extraction unit (42a) for each of the plurality of vertical virtual lines (Lai) and to calculate an edge distribution waveform based on the number of edge elements thus added, the peak information detection unit (43) is a counting unit (43, 43') configured to detect peaks of the edge distribution waveform calculated by the waveform data calculation unit (42b) and to count the number of peaks thus detected, and the candidate repetitive fixed object detection unit (44) is configured to determine that the three-dimensional object with the feature point extracted by the feature point extraction unit (42a) is a candidate repetitive fixed object when the number of peaks counted by the counting unit (43, 43') is equal to or greater than a second predetermined limit value (Th2, Th3).
[0002]
2. Repetitive fixed object detection system, according to claim 1, CHARACTERIZED by the fact that the counting unit (43) is configured to detect peaks of the edge distribution waveform calculated by the waveform data calculation unit (42b) and to count the number of peaks aligned at equal intervals among the peaks thus detected.
[0003]
3. Repetitive fixed object detection system according to claim 1 or 2, CHARACTERIZED by the fact that the repetitive fixed object judgment unit (45) is configured to determine that the candidate repetitive fixed object is a repetitive fixed object when the candidate repetitive fixed object is continuously detected for a predetermined period of time.
[0004]
4. Repetitive fixed object detection system, according to claim 1, CHARACTERIZED by the fact that it comprises: a speed detector (20) configured to detect a travel speed of the moving object (V); an alignment unit (51) configured to align a position of an edge distribution waveform (Et) at a first time (t) calculated by the waveform data calculation unit (42b) with a position of an edge distribution waveform (Et-Δt) at a second time (t-Δt) different from the first time (t) based on the travel speed detected by the speed detector (20); and a difference calculation unit (52) configured to calculate a distribution waveform difference (|Et-Δt - Et'|) between the edge distribution waveform (Et-Δt) at the second time (t-Δt) and the edge distribution waveform (Et') aligned by the alignment unit (51), in which the repetitive fixed object judgment unit (53) is configured to integrate the distribution waveform difference (|Et-Δt - Et'|) calculated by the difference calculation unit (52) to calculate a first integrated value (ID1), to integrate any of the edge distribution waveform (Et) at the first time (t), the edge distribution waveform (Et-Δt) at the second time (t-Δt), and the edge distribution waveform (Et') aligned by the alignment unit (51) to calculate a second integrated value (I1, I2, I3), to calculate a ratio of the first integrated value (ID1) to the second integrated value (I1, I2, I3), and to determine whether the candidate repetitive fixed object is fixed based on whether a ratio value (ID1/I1, ID1/I2, ID1/I3) is or is not less than a third predetermined limit value.
[0005]
5. Repetitive fixed object detection system, according to claim 1, CHARACTERIZED by the fact that it additionally comprises: a speed detector (20) configured to detect a travel speed of the moving object (V); an alignment unit (51) configured to align a position of an edge distribution waveform (Et) at a first time (t) calculated by the waveform data calculation unit (42b) with a position of an edge distribution waveform (Et-Δt) at a second time (t-Δt) different from the first time (t) based on the travel speed detected by the speed detector (20); and a difference calculation unit (52) configured to calculate a first distribution waveform difference (|Et-Δt - Et'|) between the edge distribution waveform (Et-Δt) at the second time (t-Δt) and the edge distribution waveform (Et') aligned by the alignment unit (51), and a second distribution waveform difference (|Et-Δt - Et|) between the edge distribution waveform (Et) at the first time (t) and the edge distribution waveform (Et-Δt) at the second time (t-Δt), in which the repetitive fixed object judgment unit (53) is configured to integrate the first distribution waveform difference (|Et-Δt - Et'|) to calculate a first integrated value (ID1), to integrate the second distribution waveform difference (|Et-Δt - Et|) to calculate a second integrated value (ID2), to calculate a ratio of the first integrated value (ID1) to the second integrated value (ID2), and to determine whether the candidate repetitive fixed object is fixed or not based on whether a ratio value (ID1/ID2) is or is not less than a fourth predetermined limit value (Th5).
[0006]
6. Repetitive fixed object detection system, according to claim 4 or 5, CHARACTERIZED by the fact that the repetitive fixed object judgment unit (53) determines that the candidate repetitive fixed object is a repetitive fixed object when it continuously detects that the candidate repetitive fixed object is fixed for a predetermined period of time.
[0007]
7. Repetitive fixed object detection system, according to claim 1, CHARACTERIZED by the fact that it additionally comprises: a speed detector (20) configured to detect a travel speed of the moving object (V); and a path variation calculation unit (36) configured to calculate a path variation of the repetitive fixed objects in the aerial view image based on a capture interval of the image capture device (10) and the travel speed detected by the speed detector (20), in which the feature point extraction unit (32, 33, 34) includes: an alignment unit (32) configured to align positions of image data at different times created by the point of view conversion unit (31); a three-dimensional object detection unit (33) configured to detect a plurality of three-dimensional objects based on data of a difference image from the image data at different times aligned by the alignment unit (32); and a candidate path quantity calculation unit (34) configured to calculate the candidate path quantity of the plurality of three-dimensional objects detected by the three-dimensional object detection unit (33), the waveform data calculation unit includes a counting unit (35) configured to count the candidate path quantity calculated by the candidate path quantity calculation unit (34) and to generate a histogram as the waveform data, when the peak values (M, M1, M2, M3) of the histogram generated by the counting unit (35) are equal to or greater than a fifth predetermined limit value (Th1), the candidate repetitive fixed object detection unit (37a) is configured to determine that the three-dimensional objects linked to the candidate path quantities corresponding to the peaks (M, M1, M2, M3) are candidate repetitive fixed objects, and the repetitive fixed object judgment unit (37) judges that the candidate repetitive fixed objects are repetitive fixed objects when the peaks (M, M1, M2, M3) of the histogram are present within the path variation calculated by the path variation calculation unit (36).
[0008]
8. Repetitive fixed object detection system, according to claim 7, CHARACTERIZED by the fact that the candidate path quantity calculation unit (34) is configured to calculate a plurality of candidate path quantities for each of the three-dimensional objects.
[0009]
9. Repetitive fixed object detection system according to claim 7 or 8, CHARACTERIZED by the fact that the candidate repetitive fixed object detection unit (37a) is configured to set the fifth predetermined limit value (Th1) based on a maximum value of the peaks (M, M1, M2, M3).
[0010]
10. Repetitive fixed object detection system according to any one of claims 7 to 9, CHARACTERIZED by the fact that it additionally comprises a repeatability determination unit (37b) configured to judge whether there is a repeatability from the positions where the peaks (M, M1, M2, M3) appear, and to decrease the fifth predetermined limit value (Th1) when judging that there is repeatability.
[0011]
11. Repetitive fixed object detection system, according to claim 10, CHARACTERIZED by the fact that the repeatability determination unit (37b) is configured to judge whether there is a repeatability from the positions where the peaks (M, M1, M2, M3) appear, the peaks having values equal to or greater than the fifth predetermined limit value (Th1) adjusted from the maximum value of the peaks (M, M1, M2, M3).
[0012]
12. Repetitive fixed object detection system, according to claim 10 or 11, CHARACTERIZED by the fact that it additionally comprises a lateral movement detection unit (38) configured to detect a lateral movement of the moving object (V), and when the lateral movement detection unit (38) detects lateral movement of a specific distance or longer and the fifth predetermined limit value (Th1) has been decreased, the repeatability determination unit (37b) is configured to initialize the decreased fifth limit value (Th1).
[0013]
13. Repetitive fixed object detection system according to any one of claims 7 to 12, CHARACTERIZED by the fact that the repetitive fixed object judgment unit (37) is configured to store the positions where the peaks (M, M1, M2, M3) appear and their peak values when it is judged that there is a repeatability in the previous processing, to judge that the candidate repetitive fixed objects are repetitive fixed objects in the case of detecting a peak with a value equal to or greater than the fifth predetermined limit value (Th1) other than the peaks (M, M1, M2, M3) judged to be repeatable in the current processing, when an average value of the peak values judged to be repeatable in the current processing is not less than an average value of the peak values judged to be repeatable in the previous processing by a predetermined value or more, and to judge that moving objects are present when the average value of the peak values judged to be repeatable in the current processing is less than the average value of the peak values judged to be repeatable in the previous processing by the predetermined value or more.
[0014]
14. Repetitive fixed object detection method to detect repetitive fixed objects in the vicinity of a moving object (V), CHARACTERIZED by the fact that it comprises: an image capture step of capturing an image of the surroundings of the moving object (V) using an image capture device (10) mounted on the moving object (V); a point of view conversion step of performing point of view conversion on the image captured by the image capture device (10) to create an aerial view image; a feature point extraction step of extracting a feature point of a three-dimensional object from the image data in a predetermined region of the aerial view image for each of a plurality of sub-regions included in the predetermined region; a waveform data calculation step of calculating waveform data corresponding to a distribution of the feature points extracted in the feature point extraction step in the predetermined region in the aerial view image; a peak information detection step of detecting peak information from the waveform data; a candidate repetitive fixed object detection step of judging whether the three-dimensional object with the feature point extracted in the feature point extraction step is or is not a candidate repetitive fixed object based on whether or not the peak information is equal to or greater than a first predetermined limit value (Th1, Th2, Th3); and a repetitive fixed object judgment step of determining that the candidate repetitive fixed object is a repetitive fixed object when the candidate repetitive fixed object is detected in the candidate repetitive fixed object detection step and the detection is made under a predetermined condition, in which the feature point extraction step extracts edge elements present along each of a plurality of vertical virtual lines (Lai) that extend in a vertical direction in a real space, the waveform data calculation step adds up the number of edge elements extracted by the feature point extraction step for each of the plurality of vertical virtual lines (Lai) and calculates an edge distribution waveform based on the number of edge elements thus added, the peak information detection step is a counting step that detects peaks of the edge distribution waveform calculated by the waveform data calculation step and counts the number of peaks thus detected, and the candidate repetitive fixed object detection step determines that the three-dimensional object with the feature point extracted by the feature point extraction step is a candidate repetitive fixed object when the number of peaks counted by the counting step is equal to or greater than a second predetermined limit value (Th2, Th3).
类似技术:
公开号 | 公开日 | 专利标题
BR112013007085B1|2021-02-23|REPETITIVE FIXED OBJECT DETECTION SYSTEM AND METHOD
US11062155B2|2021-07-13|Monocular cued detection of three-dimensional structures from depth images
Zhao et al.2012|A novel multi-lane detection and tracking system
Mei et al.2009|A constant-time efficient stereo slam system
US10956756B2|2021-03-23|Hazard detection from a camera in a scene with moving shadows
US9558556B2|2017-01-31|Three-dimensional object detection device
BR112013003851B1|2021-03-09|device and three-dimensional object detection method
BR112014001824B1|2021-04-20|solid object detection device and method
Jodoin et al.2016|Tracking all road users at multimodal urban traffic intersections
BR112014020404B1|2021-08-31|THREE-DIMENSIONAL OBJECT DETECTION DEVICE
Cui et al.2014|Real-time global localization of intelligent road vehicles in lane-level via lane marking detection and shape registration
BR112014020407B1|2021-09-14|THREE-DIMENSIONAL OBJECT DETECTION DEVICE
JP2012159469A|2012-08-23|Vehicle image recognition device
Kanhere et al.2008|Automatic camera calibration using pattern detection for vision-based speed sensing
BR112015001861A2|2019-12-31|Solid thing detecting device
JP2007101690A|2007-04-19|Map update device
US20180204077A1|2018-07-19|Traffic Light Recognition Device and Traffic Light Recognition Method
Cheng et al.2019|Obstacles detection and depth estimation from monocular vision for inspection robot of high voltage transmission line
JP2017045261A|2017-03-02|Information acquisition device, information acquisition method, and information acquisition program
JP2012159470A|2012-08-23|Vehicle image recognition device
CN103456026A|2013-12-18|Method for detecting ground moving object under road landmark constraints
BR112015001872B1|2021-11-03|VEHICLE IMAGE RECOGNITOR
JP2007164565A|2007-06-28|System and device of vehicle sensing for traffic-actuated control
JP2007164564A|2007-06-28|System and device of vehicle sensing for traffic-actuated control
Weng et al.2012|A segmentation and data association annotation system for laser-based multi-target tracking evaluation
同族专利:
公开号 | 公开日
US8903133B2|2014-12-02|
MX2013005980A|2013-07-15|
EP2680247A1|2014-01-01|
JP5783243B2|2015-09-24|
WO2012115009A1|2012-08-30|
CN103124995A|2013-05-29|
BR112013007085A2|2016-06-14|
CN103124995B|2015-07-01|
EP2680247B1|2018-12-26|
EP2680247A4|2017-11-22|
JPWO2012115009A1|2014-07-07|
RU2013118701A|2015-03-27|
US20130322688A1|2013-12-05|
MX321872B|2014-07-11|
RU2549595C2|2015-04-27|
MY166665A|2018-07-18|
引用文献:
公开号 | 申请日 | 公开日 | 申请人 | 专利标题

JP3085094B2|1994-07-14|2000-09-04|Toyota Motor Corporation|Moving object detection method|
JP3304687B2|1995-05-24|2002-07-22|Nissan Motor Co., Ltd.|Vehicle lane recognition device, obstacle detection device, road departure notification device|
US5963664A|1995-06-22|1999-10-05|Sarnoff Corporation|Method and system for image combination using a parallax-based technique|
JP2000161915A|1998-11-26|2000-06-16|Matsushita Electric Ind Co Ltd|On-vehicle single-camera stereoscopic vision system|
US6411898B2|2000-04-24|2002-06-25|Matsushita Electric Industrial Co., Ltd.|Navigation device|
DE10035223A1|2000-07-20|2002-01-31|Daimler Chrysler Ag|Device and method for monitoring the surroundings of an object|
JP3645196B2|2001-02-09|2005-05-11|Matsushita Electric Industrial Co., Ltd.|Image synthesizer|
US6999620B1|2001-12-10|2006-02-14|Hewlett-Packard Development Company, L.P.|Segmenting video input using high-level feedback|
US7003136B1|2002-04-26|2006-02-21|Hewlett-Packard Development Company, L.P.|Plan-view projections of depth image data for object tracking|
JP3864945B2|2003-09-24|2007-01-10|Aisin Seiki Co., Ltd.|Road lane detection device|
CA2455359C|2004-01-16|2013-01-08|Geotango International Corp.|System, computer program and method for 3d object measurement, modeling and mapping from single imagery|
US7298247B2|2004-04-02|2007-11-20|Denso Corporation|Vehicle periphery monitoring system|
JP4899424B2|2005-11-04|2012-03-21|Toyota Motor Corporation|Object detection device|
JP4901275B2|2006-04-07|2012-03-21|Fuji Heavy Industries Ltd.|Travel guidance obstacle detection device and vehicle control device|
JP4887932B2|2006-06-23|2012-02-29|Nissan Motor Co., Ltd.|Road edge recognition device, automobile, and road edge recognition method|
DE102006060045A1|2006-12-19|2008-06-26|Imi Intelligent Medical Implants Ag|Visual aid with three-dimensional image capture|
JP2008219063A|2007-02-28|2008-09-18|Sanyo Electric Co Ltd|Apparatus and method for monitoring vehicle's surrounding|
JP4378571B2|2007-05-31|2009-12-09|NEC System Technologies, Ltd.|MAP CHANGE DETECTION DEVICE, MAP CHANGE DETECTION METHOD, AND PROGRAM|
JP5109691B2|2008-01-31|2012-12-26|Konica Minolta Holdings, Inc.|Analysis device|
JP2009266136A|2008-04-29|2009-11-12|Mitsubishi Electric Corp|Road structure abnormality detector|
US8306269B2|2009-03-12|2012-11-06|Honda Motor Co., Ltd.|Lane recognition device|
JP4806045B2|2009-03-12|2011-11-02|Honda Motor Co., Ltd.|Lane recognition device|
MY160274A|2010-08-19|2017-02-28|Nissan Motor|Three-dimensional object detection device and three-dimensional object detection method|
JP5954712B2|2011-01-13|2016-07-20|Panasonic IP Management Co., Ltd.|Image processing apparatus, image processing method, and program thereof|
JP5869239B2|2011-06-21|2016-02-24|Hamamatsu Photonics K.K.|Light measurement apparatus, light measurement method, and light measurement program|
EP2725346B1|2011-06-21|2017-03-08|Hamamatsu Photonics K.K.|Light measurement device, light measurement method, and light measurement program|
MY171030A|2011-09-12|2019-09-23|Nissan Motor|Three-dimensional object detection device|
CN104246821B|2012-04-16|2016-08-17|日产自动车株式会社|Three-dimensional body detection device and three-dimensional body detection method|
JP5884771B2|2013-05-22|2016-03-15|Denso Corporation|Collision mitigation device|
US9230366B1|2013-12-20|2016-01-05|Google Inc.|Identification of dynamic objects based on depth data|
US9342747B2|2014-04-14|2016-05-17|Bendix Commercial Vehicle Systems Llc|Vehicle driver assistance apparatus for assisting a vehicle driver in maneuvering the vehicle relative to an object|
CN106339659A|2015-07-10|2017-01-18|株式会社理光|Road segment detecting method and device|
US9676326B2|2015-09-25|2017-06-13|Ford Global Technologies, Llc|Drive history parking barrier alert|
JP6793448B2|2015-10-26|2020-12-02|Denso Ten Limited|Vehicle condition determination device, display processing device and vehicle condition determination method|
KR20170075474A|2015-12-23|2017-07-03|Mando Corporation|Method for detecting target object and apparatus thereof|
US9836673B2|2015-12-30|2017-12-05|International Business Machines Corporation|System, method and computer program product for training a three dimensional object identification system and identifying three dimensional objects using semantic segments|
JP6572862B2|2016-10-14|2019-09-11|Denso Corporation|Display control device|
JP6794243B2|2016-12-19|2020-12-02|Hitachi Automotive Systems, Ltd.|Object detector|
CN110782459B|2019-01-08|2021-02-19|Beijing Didi Infinity Technology and Development Co., Ltd.|Image processing method and device|
Legal status:
2018-12-18| B06F| Objections, documents and/or translations needed after an examination request according [chapter 6.6 patent gazette]|
2019-10-15| B06U| Preliminary requirement: requests with searches performed by other patent offices: procedure suspended [chapter 6.21 patent gazette]|
2020-10-13| B06A| Patent application procedure suspended [chapter 6.1 patent gazette]|
2021-01-12| B09A| Decision: intention to grant [chapter 9.1 patent gazette]|
2021-02-23| B16A| Patent or certificate of addition of invention granted [chapter 16.1 patent gazette]|Free format text: TERM OF VALIDITY: 20 (TWENTY) YEARS COUNTED FROM 17/02/2012, SUBJECT TO THE LEGAL CONDITIONS. |
Priority:
Application No. | Filing date | Patent title
JP2011034097|2011-02-21|
JP2011-034097|2011-02-21|
PCT/JP2012/053834|WO2012115009A1|2011-02-21|2012-02-17|Periodic stationary object detection device and periodic stationary object detection method|