Patent abstract:
The subject of the invention is a method for detecting and classifying events of a scene by means of a monopupil imaging system equipped with a VisPIR detector in the 0.6 µm–1.1 µm band and a SWIR detector, which comprises steps of acquiring successive synchronized VisPIR and SWIR 2D images, displaying the VisPIR images, and processing these images. The processing consists in comparing the SWIR images to determine, for each pixel, the variation of illumination from one SWIR image to the next and the peak value of these SWIR illuminations. If this variation of SWIR illumination is greater than a threshold, then an event associated with said pixel is detected and: its date, its temporal form and its duration are determined; the coordinates of the corresponding pixel are determined in the VisPIR images, for which the variation of illumination from one VisPIR image to the next and the peak value of these VisPIR illuminations are calculated; these SWIR and VisPIR illuminance measurements and their peak values are compared to estimate a temperature of the event; the distance of the corresponding point of the scene is estimated in order to calculate the intensity of the event from the SWIR and VisPIR illuminations and from this distance; the total energy of the event is estimated from its temporal form and its intensity; the event is classified according to its duration, temperature, intensity and energy; and the preceding steps are reiterated for another pixel of the SWIR images.
Publication number: FR3017480A1
Application number: FR1400350
Filing date: 2014-02-07
Publication date: 2015-08-14
Inventors: Thierry Midavaine; Olivier Garin
Applicant: Thales SA
IPC main class:
Patent description:

[0001] FIELD OF THE INVENTION The field of the invention is that of the classification of events in an observed scene. This classification of events is part of the detection of threats, before or after firing, for the self-protection or collateral protection of vehicles, boats, or aircraft such as helicopters or airplanes, or of military or civil infrastructure. These platforms are likely to be exposed to different types of threats that can come from all directions, over 360° in azimuth around the platform, or within solid-angle domains of 2π steradians or more (supra-hemispherical) for ground platforms, or 4π steradians for aerial platforms. These threats are: ammunition fire, laser illumination, weapon targeting systems, the men who operate them (shooters) or the platforms in which they are integrated. Several "Soft Kill" or "Hard Kill" systems that incorporate threat detection, response, protection or survival features for the vehicle or its crew require real-time detection functions with very short reaction times. Today, little equipment is available to respond to the diversity of threats. There are, however: gunshot detectors (or "HFI"), based on acoustic solutions, or radar, or band-2 or MWIR (Mid-Wave InfraRed, 3 µm–5 µm) infrared, or UV (Ultra-Violet), or a combination of these techniques; sniper detectors based on acoustic detection of the muzzle wave and the supersonic wave of the ammunition, or on "cat's eye" detection (DOP) of sighting means; Missile Warning Sensors (MSDs) based on radar, UV or MWIR detection; laser warning detectors (or LADs) based on optical detection at the wavelengths of lasers used for military applications. These types of receivers are sometimes encountered on high-value aircraft such as combat aircraft. On armored vehicles they are rarer because of the unit costs of this equipment and the cost of their integration in a protection system compared to the value of the platform.
LADs or MSDs have been on the market for more than 20 years and have not found real opportunities on land vehicles. Some companies offer equipment at different stages of maturity for detecting missile launches or cannon shots, based on techniques associating N cameras, for example 4 cameras each covering a 90° sector to ensure 360° coverage. These networks of N distributed and synchronized cameras remain costly solutions, complex to integrate, and in practice unaffordable. In fact, today most armored vehicles have no means of protection other than their armor. In the terrestrial field the existing solutions are unsatisfactory, mainly because of the costs and the difficulty of building a complete protection system. Indeed, it is useless to detect threats if one does not know how to initiate a reaction that neutralizes the threat or its effects, ensures the survival of the crew and the platform, or allows the mission to continue. The object of the invention is to overcome these disadvantages. More specifically, the subject of the invention is a method for detecting and classifying events of a scene by means of a monopupil imaging system equipped with several detectors, including a detector in the 0.6 µm–1.1 µm wavelength band, called the VisPIR detector (Visible and Near InfraRed), and a detector in the 0.9 µm–1.7 µm wavelength band, or band 1, called the SWIR detector (Short Wave InfraRed), associated with a processing unit. The method comprises a step of acquiring successive 2D images of the scene from the VisPIR detector, called VisPIR images, and successive 2D images of the scene from the SWIR detector, synchronized with the VisPIR images, called SWIR images; a step of displaying the VisPIR images; and a step of processing the VisPIR and SWIR images by the processing unit. It is mainly characterized in that this image-processing step comprises the following sub-steps:
- the successive SWIR images are compared to determine, for each pixel (x1, y1) and its neighboring pixels, the variation of illumination from one SWIR image to the next and the peak value e1(t) of these so-called SWIR illuminations;
- if this variation of SWIR illumination is greater than a predetermined illumination threshold, then an event i associated with said pixel (x1, y1), or with the barycenter of the considered pixels, is detected and: the date ti of the event is determined; the temporal form and the duration δti of the event i are determined; the coordinates of the pixel (x2, y2) and neighboring pixels corresponding to the pixel (x1, y1) or to the barycenter are determined in the VisPIR images synchronized with the SWIR images, and for this pixel (x2, y2) and these neighboring pixels the variation of illumination from one VisPIR image to the next and the peak value e2(t) of these so-called VisPIR illuminations are calculated; these SWIR and VisPIR illumination variations and their peak values e1(t) and e2(t) are compared to estimate a temperature of the event from a predetermined correspondence table;
- if the scene is a daytime scene and the temperature is greater than a temperature threshold, then the event is a false alarm (the event is rejected) and the previous steps are reiterated with another pixel;
- otherwise, the distance Ri of the corresponding point of the scene is estimated in order to calculate the intensity Ii of the event i from the SWIR and VisPIR illuminations of this pixel and from this distance Ri; the total energy Ei of the event is estimated from its temporal form and its intensity; and the event i is classified according to its duration δti, its temperature Ti, its intensity Ii and its energy Ei;
- the preceding steps are reiterated for another pixel of the SWIR images.
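The chain of sub-steps above can be sketched in Python. This is a minimal illustration, not part of the patent: the Event fields mirror the four attributes, while the class labels and all numeric thresholds are hypothetical (the actual class boundaries are given in the table of figure 5).

```python
from dataclasses import dataclass

@dataclass
class Event:
    duration_s: float      # delta-t_i, from the SWIR temporal profile (step A)
    temperature_k: float   # T_i, from the VisPIR/SWIR illuminance ratio (step B)
    intensity_w_sr: float  # I_i, from illumination and estimated distance (step D)
    energy_j: float        # E_i, from temporal form and intensity (step F)

def classify(ev: Event, daytime: bool, t_reject_k: float = 5000.0) -> str:
    """Classify an event from its four attributes (hypothetical class limits)."""
    if daytime and ev.temperature_k > t_reject_k:
        return "false alarm (solar)"          # step C: hot daytime source rejected
    if ev.duration_s < 0.005 and ev.energy_j < 1.0:
        return "laser pulse"
    if ev.duration_s < 0.1:
        return "small-calibre shot start"
    return "missile / large-calibre shot start"

print(classify(Event(0.002, 1800.0, 10.0, 0.1), daytime=True))  # -> laser pulse
```

The false-alarm rejection runs first, exactly as in the method: a daytime event hotter than the threshold never reaches the duration/energy tests.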
[0002] Thanks to this method, the VisPIR detector provides the basic function of high-resolution day and night vision, but also the additional functions of detection, angular localization and classification of events (or threats), or false-alarm rejection, through the use of one (or more, possibly multispectral) SWIR detector(s) and the determination of duration, temperature and intensity attributes of these events. When an event is associated with several adjacent pixels, more than 4x4 pixels for example, then the luminance of the event in the SWIR band is determined, and the event is also classified according to its luminance in the SWIR domain. In the visible-near IR band, it is also possible to measure the apparent luminance associated with the event. These two luminances, SWIR and VisPIR, make it possible to estimate the temperature of the event (already estimated as indicated previously) with more precision. Moreover, having estimated the distance of the event, it is possible to estimate its dimension, its intensity and its total energy. Preferably, the event is marked on the displayed VisPIR image, possibly with its associated distance. It can be highlighted in augmented reality by embedding a symbology, for example a colored reticle with associated data. This distance information makes it possible to make the classification of the detected threat more robust, to assess its degree of threat or priority, to assess whether we are within the lethal range of this threat, and possibly to launch an appropriate response, automatic or supervised by an operator (movement, masking by smoke firing, riposte to neutralize the threatening firing post, riposte to neutralize the ammunition, for example).
[0003] According to one characteristic of the invention, the processing unit is connected to an event management system, and the detection and classification method further comprises a step of triggering this management system from the classified event and its distance.
[0004] The invention also relates to a computer program product, said computer program comprising code instructions for performing the steps of the method as described, when said program is executed on a computer. The invention also relates to a system for detecting and classifying events of a scene, which comprises: a monopupil imaging system of the scene, equipped with several detectors including a visible-near IR detector and a SWIR detector; a processing unit connected to the detectors; means for estimating the distances between points of the scene and the imaging system; and a display device connected to the processing unit; characterized in that the processing unit comprises means for implementing the method of detection and classification as described. Other features and advantages of the invention will appear on reading the detailed description which follows, given by way of nonlimiting example and with reference to the appended drawings, in which: FIG. 1 schematically represents an example of a system used to implement the invention; FIG. 2 represents an exemplary flow diagram of the steps of the method according to the invention; FIG. 3 shows examples of images obtained by the SWIR detector (fig 3a) and by the VisPIR detector without inlay (fig 3b) and with inlay (fig 3c) of an event detected on the SWIR image, with the distance associated with this event; figure 4 shows an example of an enlarged SWIR image of a resolved event; figure 5 shows an example of a threat classification determined according to its attributes. From one figure to another, the same elements are identified by the same references. The method of detection and classification according to the invention is implemented by means of a monopupil imaging system which can be seen in FIG.
1, comprising an optic 10 designed to accommodate at least two focal planes, with on one a visible-PIR detector 22 (0.6 µm–1.1 µm, or 0.4 µm–1.1 µm including the visible range) and on the other a detector 21 in band 1 or SWIR (0.9 µm–1.7 µm). This optic is preferably designed to form on each detector a panoramic image of the scene to be observed; it comprises a head module 11 with a hemispherical field or more, followed by optical elements 12 for forming the images, a dichroic splitter 13 and, on each spectral channel, matching optics 14 and 15. It is, for example, a catadioptric combination (mirrors and glass diopters) enabling it to operate in a very broad visible, near-IR (NIR) and band-1 (SWIR) spectral band. The use of an optic common to both detectors makes it possible to avoid the effect of parallax on the images coming from these detectors. The SWIR matrix detector 21, for example based on InGaAs, typically has a format of 320 lines x 256 columns with a 30 µm pitch and an image rate of 400 Hz, or 640 lines x 512 columns with a 15 µm pitch and a frame rate of 300 Hz, which correspond to the current state of the art. This detector typically has a spatial resolution of 4 mrad per pixel in the 640x512 format. Larger formats may be considered. However, the need for image rate and temporal response for the classification of threats, which are for the most part very short (generally < 100 ms in duration), requires acquisition rates equal to or greater than 200 Hz, or even 1000 Hz if one seeks to classify small calibers. The detector integrates its read-out and multiplexing circuit, called the ROIC (acronym for "Read Out Integrated Circuit"). The pixels are provided with one or more suitable filters, such as: a single filter on the entire pixel array 211, simultaneously passing a narrow band around 1.064 µm and a wide band above 1.4 µm up to the cut-off wavelength of the detector, for example 1.7 µm or beyond.
Several different filters, each adapted to a pixel, for example on a 2x2 tiling with: a sub-band for the detection of shot-start signatures (1.4–1.7 µm); narrow bands tuned to the laser lines (1.06 µm, 1.5 µm, ...). The read-out circuit or ROIC is connected to the proximity electronics (or E-prox) 210 of the detector; the ROIC and/or the E-prox integrate special functions, such as for example: a high-rate read-out, greater than 200 Hz, i.e. less than 5 ms of period, with an integration time without dead time; the integration time is equal or very close to the acquisition period of the image frames.
[0005] A high dynamic range (HDR mode, acronym for High Dynamic Range) can be achieved in different ways: bi-slope response, non-linear logarithmic or root response for example, counted resets before saturation, etc.
[0006] A CMOS silicon detector is for example used as the visible detector 22. It has a high spatial resolution (for example 1 mrad per pixel) and operates at a rate of 25 Hz, or more generally between 25 Hz and 100 Hz. As an example of such a detector, one may mention a CMOS 4T or 5T matrix 221 (with 4 or 5 transistors in the pixel) with low noise, amplification and parallel analog-to-digital conversion per column, such as an s-CMOS. The CMOS detector 221 integrates its read-out and multiplexing circuit (or ROIC), which is connected to the proximity electronics (or E-prox) 220 of the detector. This proximity electronics associated with the ROIC performs all the analog-to-digital conversion and signal restoration (NUC, Non-Uniformity Corrections) operations, to enable the acquired images to be used with maximum performance day and night. It is connected to a display device 40 so as to provide a vision function of the scene for an operator. The visible image 42 is displayed, but not necessarily the SWIR image 41. These detectors 22, 21 are connected to a processing unit 50 for the visible images 42 and SWIR images 41 respectively obtained by the two detectors, capable of implementing the steps described below in connection with FIG. 1 and the flowchart of FIG. 2. The threats (or events) are characterized by optical signatures on the acquired images. According to the invention, an event is detected and then four attributes are defined from which these optical signatures will be classified. All threats are characterized by very short optical signatures, which are therefore very difficult to discern by eye for an operator looking at a screen, or even in direct vision because of their spectral emission domain; their detection must therefore be ensured by automatic processing. The temporal profiles of the signatures, that is their duration, and possibly their rise time and descent profile, constitute a first attribute; it is obtained in the following manner.
An event is first detected by detection means 51, and some of its characteristics are estimated as follows. Images of the scene to be observed, called SWIR images 41, are acquired successively by the SWIR detector. Images of this scene, called VisPIR images 42, are simultaneously acquired by the visible detector, these SWIR and VisPIR images being temporally synchronized with each other by means of an electronics 30 for controlling and synchronizing the detectors, and stored in memory for the purposes of the classification. This synchronization may result from image acquisition at the same rate for both detectors, but the rates are generally different, as previously mentioned. The SWIR images 41 are compared with each other to determine for each pixel (xi1, yi1) the variation of SWIR illumination. The illumination is given by the signal integrated on the pixel on which the image of the threat is formed. If this variation of illumination is greater than a threshold, predetermined or adapted to the signatures of the spatio-temporal background, then it is considered to represent the SWIR signature of an event: an event i is detected, as shown in FIG. 3a. This illumination threshold is typically 2 to 10 times the standard deviation of the background variation, according to the detection probabilities and false-alarm rates desired at this stage. An illumination ei1 in W/m2 is thus determined in the reception spectral band defined by the SWIR detector, its optics 15 and its possible filters.
[0007] Its duration δti is also determined, in seconds, that is to say the duration during which this variation of illumination remains greater than this threshold. This duration δti therefore constitutes the first classification attribute of the event (step A).
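The detection and timing sub-steps above can be sketched as follows. This is a hedged illustration assuming a NumPy stack of SWIR illuminations; the frame rate, the threshold factor k and the number of background frames are arbitrary choices, not values from the patent.

```python
import numpy as np

def detect_and_time(frames, frame_rate_hz=400.0, k=5.0, n_background=10):
    """frames: (T, H, W) stack of SWIR illuminations in W/m2, time-ordered.
    Flags pixels whose frame-to-frame variation exceeds k times the background
    standard deviation, and returns the duration spent above threshold."""
    frames = np.asarray(frames, dtype=float)
    sigma = frames[:n_background].std(axis=0) + 1e-12   # per-pixel background std
    delta = np.abs(np.diff(frames, axis=0))             # frame-to-frame variation
    above = delta > k * sigma                           # detection mask per frame pair
    detected = above.any(axis=0)
    duration_s = above.sum(axis=0) / frame_rate_hz      # delta-t_i: time above threshold
    peak = frames.max(axis=0)                           # peak illumination e_i1 (W/m2)
    return detected, duration_s, peak

rng = np.random.default_rng(0)
stack = rng.normal(0.0, 1e-8, size=(20, 4, 4))          # quiet background
stack[12:14, 2, 1] += 5e-6                              # brief flash at pixel (2, 1)
det, dur, peak = detect_and_time(stack)
print(det[2, 1], dur[2, 1] > 0)                          # -> True True
```

The "2 to 10 times the standard deviation" rule from the text maps directly onto the factor k; raising k lowers the false-alarm rate at the cost of detection probability.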
[0008] From this event detection performed in the SWIR domain, it is possible to measure on the VisPIR images the level of flux collected synchronously in the same field of view. The procedure is as follows. The coordinates of the pixel (xi2, yi2) corresponding to this event in the VisPIR images 42 synchronized with said SWIR images are determined by means 53. When the dimensions of the SWIR and VisPIR images are the same, the coordinates of the pixel in the visible images are identical to those in the SWIR images: xi1 = xi2 and yi1 = yi2. If the VisPIR images have a higher resolution than the SWIR images, because the spatial resolution of the VisPIR detector is greater than that of the SWIR detector, as in the example of FIG. 3, the coordinates of the VisPIR pixels are determined in correspondence with the SWIR pixel on which the detection was made. This operation of matching the VisPIR and SWIR pixels can be carried out at the factory, after integration of the detector arrays on the optics. If the detection is made on adjacent SWIR pixels, the position of their barycenter is determined and the VisPIR pixels corresponding to this barycenter are located. By measuring the peak signal in the VisPIR band, or by adding the contribution of the adjacent pixels illuminated by the event, the VisPIR illumination ei2 of the event, in W/m2, is determined in the spectral band of this receiver. It is possible that the VisPIR channel cannot detect the event; in this case the illumination ei2 is zero, or of an amplitude too small compared with the noise or the uncertainties to be determined. For this pixel, the ratio ei2/ei1 is calculated. This ratio of the VisPIR illumination to the SWIR illumination makes it possible to estimate a temperature Ti (in K) of the event (step B), by the means 54 for calculating the attributes. This temperature is the second attribute of the optical signature.
For this purpose, a predetermined table is used which establishes a correspondence between these illuminance ratios and the temperature of the corresponding black body or gray body, for example using Planck's law, whose contribution in each of the two spectral bands, SWIR and VisPIR, is integrated as a function of temperature. The digital signal from both channels is calibrated in W/m2 to allow measurement in this unit. Another way to do this is to calibrate the detectors by measuring the signal they deliver when aimed at a calibration black body in the laboratory. This temperature is typically used to reject a source of false alarms constituted by the Sun, its modulation or diffusion by clouds, or its reflection by mirrors, metal contours (such as those of windows or road signs) or by reflections on water. Indeed, on a mobile platform these solar reflections are likely to generate spatio-temporal signatures in the SWIR domain very close to those of the ammunition sought. The Sun or its reflections, having a black-body temperature close to 5800 K, will generate an intense signature at shorter wavelengths (in the visible and the near IR), much stronger than those associated with the pyrotechnic signatures of shot starts, which have a black-body temperature below 2000 K. Thus the detection of intense signatures in the visible or near-IR range makes it possible to neutralize any detections in the SWIR on the pixels covering the same instantaneous field of view. The level of flux collected on the visible detector makes it possible, via the temperature, to validate the detection of the event or to reject it: for a daytime scene, if the temperature is higher than a threshold (5000 K for example), then this event is a false alarm; otherwise the event is validated (step C). This validated event may also be embedded in the visible image for the attention of the operator.
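Such a correspondence table can be built by integrating Planck's law over the two spectral bands. The sketch below is a non-authoritative illustration: it uses the band edges quoted in the text (VisPIR 0.6–1.1 µm, SWIR 0.9–1.7 µm), a plain trapezoidal integration, and an arbitrary 500–7000 K table range.

```python
import numpy as np

H, C, KB = 6.626e-34, 2.998e8, 1.381e-23    # Planck, speed of light, Boltzmann

def band_radiance(t_kelvin, lam_lo_um, lam_hi_um, n=500):
    """Black-body spectral radiance integrated over a band (W/m2/sr)."""
    lam = np.linspace(lam_lo_um * 1e-6, lam_hi_um * 1e-6, n)
    b = (2 * H * C**2 / lam**5) / np.expm1(H * C / (lam * KB * t_kelvin))
    return float(((b[:-1] + b[1:]) * np.diff(lam)).sum() / 2)  # trapezoid rule

def temperature_from_ratio(ratio_vispir_over_swir):
    """Build the VisPIR/SWIR ratio lookup table and invert it by interpolation."""
    temps = np.linspace(500.0, 7000.0, 200)
    ratios = [band_radiance(t, 0.6, 1.1) / band_radiance(t, 0.9, 1.7) for t in temps]
    return float(np.interp(ratio_vispir_over_swir, ratios, temps))  # ratio grows with T
```

A pyrotechnic flash near 1800 K yields a small VisPIR/SWIR ratio, while the Sun (about 5800 K) yields a much larger one, which is exactly what the daytime rejection test exploits.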
This temperature attribute is advantageously determined by the processing unit in parallel with the first, duration attribute. All the devices that cause these threats are characterized by lethal or effectiveness ranges beyond which the threat is no longer effective: of the 100 m class for an RPG or short-range anti-tank rocket, and 500 m to 8 km for anti-tank missiles according to their type, or for shell fire according to its caliber and load. According to the invention, the processing unit comprises means 52 for estimating the distance Ri of a point of the visible image 42 (distance from the object point of the scene to the imaging system). The imaging system is installed on a platform, for example a terrestrial one. Knowing the elevation and bearing directions in the VisPIR detector's reference frame, and knowing its position in height and orientation with respect to the platform, or better with respect to the ground (by exploiting the relative position information of the platform with respect to the ground), it is possible to estimate the distance of the points of the ground image according to their apparent elevation, by assuming a horizontal plane ground or, better, by exploiting a digital terrain model (DTM) recalibrated for example by the GPS position of the vehicle and by data from the images delivered by the visible detector, so as to have a horizon profile or the location of landmarks in the field of the image. When the platform is fixed on an infrastructure, it is possible during its installation to associate its distance with every point of the visible image of the terrestrial scene. When the platform is mobile, the measurement of the angular velocities of the characteristic points of the scene is possible between successive images with good accuracy, thanks to the angular resolution of the imaging system. This field of angular velocities throughout the scene is called the optical flow.
It makes it possible to measure the rotation(s) of the field and the direction of the velocity vector of the platform (after derotation of the field), the direction for which the optical flow is zero. A large coverage of the angular field of the panoramic imaging system makes it possible to ensure that the direction of the velocity vector of the platform lies in the field of observation and coincides with a pixel. By knowing the speed of the platform (information delivered by its driving and propulsion systems, or by exploiting the information of an inertial unit, or possibly measured by exploiting the optical flow on the ground at short distance), the measurement of the angular velocity of the elements of the scene, together with the measurement of the angular difference between the pixel coinciding with the velocity vector of the platform and the direction associated with the image of a scene element whose angular velocity is estimated, makes it possible to estimate its distance Ri. It is possible to display in real time on the visible image 42 the distance Ri associated with the event i, as shown in FIG. 3c; the event i is embedded in the visible image using, for example, a predetermined symbol such as a colored reticle. The SWIR image 41 shown in FIG. 3a is generally not displayed. From this distance and the illumination obtained for this event, the means 54 calculate its intensity Ii (in W/sr), which is the third attribute (step D). Indeed, it is recalled that the amplitude of the SWIR illumination depends on the distance R with a 1/R2 law and on the atmospheric attenuation, which provides a transmission coefficient Tatm dependent on the distance R: Ii = (1/Tatm) ei Ri2.
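The intensity relation Ii = (1/Tatm) ei Ri2 can be written out directly. In the sketch below, the Beer-Lambert transmission model and its extinction coefficient are assumptions chosen for illustration, not values from the patent.

```python
import math

def atmospheric_transmission(distance_m, extinction_per_km=0.2):
    """Simple Beer-Lambert model; the extinction coefficient is an assumption."""
    return math.exp(-extinction_per_km * distance_m / 1000.0)

def event_intensity(e_swir_w_m2, distance_m):
    """I_i = (1 / T_atm) * e_i * R_i^2, in W/sr (step D)."""
    t_atm = atmospheric_transmission(distance_m)
    return e_swir_w_m2 * distance_m**2 / t_atm

# A 1 uW/m2 illumination measured at 500 m:
print(round(event_intensity(1e-6, 500.0), 4))   # -> 0.2763
```

Note how the 1/R2 fall-off of the received illumination is undone by the R2 factor, so the result characterizes the source rather than the observation geometry.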
In the SWIR images 41, the optical signatures of the sources are either resolved (extended over several pixels), or unresolved (the image of the source is formed on a single pixel), or in an intermediate situation, very small in relation to the instantaneous field of view defined by the pixel size, the focal length of the optics and its MTF (Modulation Transfer Function) in band 1. When the signature is spatially resolved, as shown in the example of FIG. 3 or 4, the luminance of the signature during the integration time is determined from the illumination ei corresponding to the amplitude of the peak signal seen on a pixel (step E), and obtained by the means 54. This value constitutes a fourth attribute for the classification of the threat. This attribute is very robust because it does not depend on the distance R through a 1/R2 law, as in the case of the illumination delivered by an unresolved source. In this case, the luminance Li, in W/sr/m2, of the event is given by the peak illumination seen on a pixel by the relation: Li = ei / [Tatm (IFOV)2], the IFOV being the instantaneous field of view of the detector. Alternatively, the average luminance can be given by the integration of the signal collected by the adjacent pixels on which the image of the source is formed, divided by the solid angle Ω, in steradians, under which the resolved event is seen. Because of the dependence of the received illumination on the distance R (1/R2 law), the illumination itself is not an attribute for the classification of the threat. Only the barycenter of this signature has a meaning, by making it possible to locate the event angularly in elevation and bearing in the reference frame of the SWIR detector, or in the reference frame of the platform, knowing the movement of the former relative to the latter if the imaging system is mounted on a body articulated on the platform (turret, pan-and-tilt platform). Such a "resolved" event is classified according to its duration, temperature, intensity and luminance.
[0009] Finally, the intensity of the event, or its luminance associated with its dimension, together with its duration, make it possible to estimate the energy Ei (in J) of the event (step F), which can be related to an estimate of the range of the threat and of its ammunition. They are calculated by the means 54.
[0010] When the event i is resolved we have: Ei = Li Ω Ri2 4π δti. When the event is unresolved we have: Ei = Ii 4π δti. These four (for an unresolved signature) or five (for a resolved signature) attributes make it possible to achieve, by means 55, a robust classification of the threat. When the event has been validated from its temperature, or rather has not been considered as a false alarm, it is classified from its duration, its temperature, its intensity, its energy and possibly its luminance, using predetermined classes, as shown in the table of FIG. 5. These steps are then reiterated for every other pixel of the SWIR image.
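The two energy estimates above can be sketched as follows, assuming isotropic emission over 4π steradians as the relations state; the numeric example values are illustrative only.

```python
import math

def energy_resolved(l_w_sr_m2, omega_sr, r_m, duration_s):
    """E_i = L_i * Omega * R_i^2 * 4*pi * delta-t_i (J), resolved event."""
    return l_w_sr_m2 * omega_sr * r_m**2 * 4.0 * math.pi * duration_s

def energy_unresolved(i_w_sr, duration_s):
    """E_i = I_i * 4*pi * delta-t_i (J), unresolved event."""
    return i_w_sr * 4.0 * math.pi * duration_s

# A 10 W/sr unresolved flash lasting 10 ms radiates about 1.26 J in total:
print(round(energy_unresolved(10.0, 0.01), 2))   # -> 1.26
```

Dimensionally, L·Ω·R2 reduces to an intensity in W/sr (Ω·R2 being the source area), so both branches multiply an intensity by 4π sr and the duration to obtain joules.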
[0011] The combination of the threat classification and the estimates of the distance and energy of the event makes it possible to decide whether the imaging system or the observer is within the lethal domain of the threat, and thus to engage, if necessary and if possible, a response adapted in real time. This riposte can be triggered automatically by the means 56 of the processing unit, as shown in FIG. 1, or manually by the operator. This method of detection and classification can in particular be implemented by a computer program product, the computer program comprising code instructions for performing the steps of the method. It is recorded on a computer-readable medium, also used to synchronize the SWIR images and the visible images. The medium can be electronic, magnetic, optical, electromagnetic, or be an infrared-type diffusion medium. Examples of such media are Random Access Memories (RAM), Read-Only Memories (ROM), tapes, floppy disks, or magnetic or optical disks (Compact Disk - Read Only Memory (CD-ROM), Compact Disk - Read/Write (CD-R/W) and DVD).
Claims:
Claims (8)
[0001]
CLAIMS 1. Method for detecting and classifying events of a scene by means of a monopupil imaging system equipped with several detectors, including a detector in the 0.6 µm–1.1 µm wavelength band, called the VisPIR detector (22), and a detector in the 0.9 µm–1.7 µm wavelength band, called the SWIR detector (21), associated with a processing unit (50), which comprises a step of acquiring successive 2D images of the scene from the VisPIR detector, called VisPIR images (42), and successive 2D images of the scene from the SWIR detector, synchronized with the VisPIR images, called SWIR images (41), a step of displaying the VisPIR images, and a step of processing the VisPIR and SWIR images by the processing unit, mainly characterized in that this image-processing step comprises the following sub-steps: - the successive SWIR images are compared to determine, for each pixel (x1, y1) and its neighboring pixels, the variation of illumination from one SWIR image to the other and the peak value e1(t) of these so-called SWIR illuminations; - if this variation of SWIR illumination is greater than a predetermined illumination threshold, then an event i associated with said pixel (x1, y1), or with the barycenter of the considered pixels, is detected and: the date ti of the event i is determined; the temporal form and the duration δti of the event i are determined (step A); the coordinates of the pixel (x2, y2) and neighboring pixels corresponding to the pixel (x1, y1) or to the barycenter are determined in the VisPIR images synchronized with the SWIR images, and for this pixel (x2, y2) and these neighboring pixels the variation of illumination from one VisPIR image to the other and the peak value e2(t) of these so-called VisPIR illuminations are calculated; these variations of SWIR and VisPIR illumination and their peak values e1(t) and e2(t) are compared to estimate a temperature of the event from a predetermined correspondence table (step B); - if the scene is a daytime scene and the temperature is greater than a temperature threshold, then the event is a false alarm (step C) and the previous steps are repeated with another pixel; - otherwise, the distance Ri of the corresponding point of the scene is estimated in order to calculate the intensity Ii of the event i from the SWIR and VisPIR illuminations of this pixel and from this distance Ri (step D), the total energy Ei of the event is estimated from its temporal form and its intensity (step F), and the event i is classified according to its duration δti, its temperature Ti, its intensity Ii and its energy Ei; - the preceding steps are reiterated for another pixel of the SWIR images.
[0002]
2. A method for classifying events of a scene according to the preceding claim, characterized in that, when an event is associated with adjacent pixels, the luminance of the event is determined (step E), and the event is also classified according to its luminance.
[0003]
3. A method of classifying events of a scene according to one of the preceding claims, characterized in that the event is embedded on the displayed VisPIR image (42).
[0004]
4. A method for classifying events of a scene according to the preceding claim, characterized in that the distance associated with the event is shown on the displayed VisPIR image (42).
[0005]
5. A method for classifying events of a scene according to one of the preceding claims, characterized in that the processing unit (50) is connected to an event management system, and in that the method further comprises a step of triggering this management system on the basis of the classified event and its distance.
[0006]
6. A computer program product comprising code instructions for performing the steps of the method according to any one of claims 1 to 5, when said program is executed on a computer.
[0007]
7. A system for detecting and classifying events of a scene, which comprises: a monopupil imaging system equipped with a plurality of detectors including a VisPIR detector (22) and a SWIR detector (21), a processing unit (50) connected to the detectors, means for estimating the distance between points of the scene and the imaging system, and a display device (40) connected to the processing unit, characterized in that the processing unit comprises means for implementing the detection and classification method according to one of claims 1 to 5.
[0008]
8. A detection and classification system according to the preceding claim, characterized in that the means for implementing the detection and classification method are a computer program according to claim 6.
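Claim 1's "predetermined correspondence table" mapping the two-band measurement to a temperature can be illustrated with blackbody physics: integrating Planck's law over the two claimed bands gives a VisPIR/SWIR radiance ratio that grows monotonically with temperature and can therefore be inverted by table lookup. This sketch is not the patent's table, only a plausible way to build one from first principles; band limits are taken from claim 1.

```python
import numpy as np

# Spectral bands from claim 1, in metres.
VISPIR_BAND = (0.6e-6, 1.1e-6)
SWIR_BAND = (0.9e-6, 1.7e-6)

H = 6.62607015e-34   # Planck constant (J s)
C = 2.99792458e8     # speed of light (m/s)
KB = 1.380649e-23    # Boltzmann constant (J/K)


def planck(wavelength_m, temp_k):
    """Blackbody spectral radiance (W · sr^-1 · m^-3)."""
    a = 2.0 * H * C**2 / wavelength_m**5
    return a / (np.exp(H * C / (wavelength_m * KB * temp_k)) - 1.0)


def in_band_radiance(band, temp_k, n=200):
    """Trapezoidal integral of the Planck radiance over a band."""
    lam = np.linspace(band[0], band[1], n)
    vals = planck(lam, temp_k)
    return float(np.sum(0.5 * (vals[1:] + vals[:-1]) * np.diff(lam)))


def build_ratio_table(temps_k):
    """VisPIR/SWIR in-band radiance ratio per temperature: the ratio
    increases with temperature over this range, so a measured ratio
    can be matched against the table to estimate the temperature."""
    return {t: in_band_radiance(VISPIR_BAND, t) / in_band_radiance(SWIR_BAND, t)
            for t in temps_k}
```

In use, the ratio e2(t)/e1(t) measured in step B would be compared against the precomputed table entries to recover the event temperature Ti.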
Similar technologies:
Publication number | Publication date | Patent title
FR3017480A1|2015-08-14|METHOD FOR DETECTING AND CLASSIFYING EVENTS OF A SCENE
US8642961B2|2014-02-04|Apparatus and methods for use in flash detection
Repasi et al.2009|Advanced short-wavelength infrared range-gated imaging for ground applications in monostatic and bistatic configurations
EP2616864A1|2013-07-24|Optoelectronic system having super-hemispheric vision
EP2593904A1|2013-05-22|Multifunctional bispectral imaging method and device
EP2689403A1|2014-01-29|Monitoring system
US10257472B2|2019-04-09|Detecting and locating bright light sources from moving aircraft
EP1718067A1|2006-11-02|Airborne surveillance procedure and system using analysis of infrared images
Steinvall et al.2019|Remote detection and size estimation of optical apertures
EP3415949B1|2021-05-12|On-board observation system comprising a lidar for obtaining high-resolution three-dimensional images
Schwering et al.2007|Optical characterization of small surface targets
FR2969273A1|2012-06-22|Portable bi-spectral fire control system for launching surface to air missile toward aircraft, has camera detecting electromagnetic radiation with spectral band that is chosen so as to maximize contrast and/or signal to noise ratio
EP3268691A1|2018-01-17|Airborne device for detecting shots and for piloting assistance
FR3040848A1|2017-03-10|METHOD AND SYSTEM FOR BISTATIC IMAGING
FR2942062A1|2010-08-13|System for detecting or video monitoring presence and displacement of e.g. human, has scanning module oriented with respect to imager such that ray source and monitoring device are pointed in direction corresponding to part of image
Hewitt et al.2021|optimizing sensor design using a time-limited search model for moving sensor
Dulski et al.2013|Application of infrared uncooled cameras in surveillance systems
EP0540395B1|1996-09-18|Passive artillary fire or analogue detector
FR2974432A1|2012-10-26|LASER ACTIVE IMAGING DIFFERENTIAL GUIDING DEVICE
FR3110226A1|2021-11-19|Video combat shooting simulator and associated shooting simulation method
FR3098903A1|2021-01-22|OPTICAL PULSE DETECTION DEVICE
FR2570835A1|1986-03-28|Optical device for detecting the passing of a moving object and locating the point at which the passage occurs
FR3049066A1|2017-09-22|SYSTEM FOR MONITORING AND DETECTING AN EVENT ON THE GROUND SURFACE BY A SATELLITE CONSTELLATION
FR3073280A1|2019-05-10|METHOD FOR OBSERVING THE GROUND SURFACE AND DEVICE FOR IMPLEMENTING IT
van Dorp et al.2008|TActical sensor network TEst bed |
Patent family:
Publication number | Publication date
JP6449894B2|2019-01-09|
IL246633A|2019-03-31|
CA2938227A1|2015-08-13|
EP3103062A1|2016-12-14|
RU2016135936A3|2018-09-17|
US20160335504A1|2016-11-17|
WO2015118075A1|2015-08-13|
US9953226B2|2018-04-24|
FR3017480B1|2017-09-08|
JP2017516049A|2017-06-15|
RU2686566C2|2019-04-29|
RU2016135936A|2018-03-13|
Cited documents:
Publication number | Filing date | Publication date | Applicant | Patent title
EP3151032A1|2015-10-02|2017-04-05|MBDA France|Method and device for determining the distance separating an aerial receiver and a fixed ground transmitter|
US5233541A|1990-08-10|1993-08-03|Kaman Aerospace Corporation|Automatic target detection process|
US5930202A|1996-11-20|1999-07-27|Gte Internetworking Incorporated|Acoustic counter-sniper system|
US6178141B1|1996-11-20|2001-01-23|Gte Internetworking Incorporated|Acoustic counter-sniper system|
US5970024A|1997-04-30|1999-10-19|Smith; Thomas|Acousto-optic weapon location system and method|
GB2340222B|1998-07-14|2000-07-26|Infrared Integrated Syst Ltd|Multi-array sensor and method of identifying events using same|
JP2001194092A|2000-01-05|2001-07-17|Mitsubishi Electric Corp|Safety monitoring system for firing range|
WO2004001667A2|2002-06-21|2003-12-31|The Trustees Of Columbia University In The City Of New York|Systems and methods for de-blurring motion blurred images|
WO2007056753A2|2005-11-08|2007-05-18|General Atomics|Apparatus and methods for use in flash detection|
WO2007124539A1|2006-04-28|2007-11-08|Orica Explosives Technology Pty Ltd|Wireless electronic booster, and methods of blasting|
BRPI0806968B1|2007-02-08|2018-09-18|Behavioral Recognition Sys Inc|method for processing video frame stream and associated system|
US8585567B2|2007-12-11|2013-11-19|Tokitae Llc|Systems, devices, and methods including paramagnetic oscillation, rotation and translation of hemozoin asymmetric nanoparticles in response to multi-harmonic optical detection of the presence of hemozoin|
US8385997B2|2007-12-11|2013-02-26|Tokitae Llc|Spectroscopic detection of malaria via the eye|
US9044141B2|2010-02-10|2015-06-02|Tokitae Llc|Systems, devices, and methods including a dark-field reflected-illumination apparatus|
CN102077580B|2009-04-30|2013-11-06|索尼公司|Display control device, display control method|
FR2964818B1|2010-09-14|2012-09-28|Thales Sa|OPTRONIC SYSTEM WITH SUPRA HEMISPHERIC VISION|
US9692991B2|2011-11-04|2017-06-27|Qualcomm Incorporated|Multispectral imaging system|
US9117277B2|2012-04-04|2015-08-25|Canon Kabushiki Kaisha|Determining a depth map from images of a scene|
FR3017480B1|2014-02-07|2017-09-08|Thales Sa|METHOD FOR DETECTING AND CLASSIFYING EVENTS OF A SCENE|
IL231111A|2014-02-24|2016-06-30|Ori Afek|Flash detection|
FR3017480B1|2014-02-07|2017-09-08|Thales Sa|METHOD FOR DETECTING AND CLASSIFYING EVENTS OF A SCENE|
US9626569B2|2014-11-06|2017-04-18|Bae Systems Information And Electronics Systems Integration Inc.|Filtered image data recovery using lookback|
FR3033649B1|2015-03-12|2018-06-15|Sagem Defense Securite|AIRPROOF FIRE DETECTION EQUIPMENT AND STEERING AID|
KR20180010581A|2016-07-21|2018-01-31|한화테크윈 주식회사|Surveillance Data Providing System and Method Thereof|
JP6774603B2|2017-03-06|2020-10-28|株式会社Jvcケンウッド|Laser light irradiation detection device, laser light irradiation detection method, laser light irradiation detection system|
Legal status:
2016-01-25| PLFP| Fee payment|Year of fee payment: 3 |
2017-01-26| PLFP| Fee payment|Year of fee payment: 4 |
2018-01-26| PLFP| Fee payment|Year of fee payment: 5 |
2020-01-27| PLFP| Fee payment|Year of fee payment: 7 |
2021-01-26| PLFP| Fee payment|Year of fee payment: 8 |
2022-01-27| PLFP| Fee payment|Year of fee payment: 9 |
Priority:
Application number | Filing date | Patent title
FR1400350A|FR3017480B1|2014-02-07|2014-02-07|METHOD FOR DETECTING AND CLASSIFYING EVENTS OF A SCENE|
JP2016545990A| JP6449894B2|2014-02-07|2015-02-05|How to detect and classify scene events|
RU2016135936A| RU2686566C2|2014-02-07|2015-02-05|Method for detecting and classifying scene events|
CA2938227A| CA2938227A1|2014-02-07|2015-02-05|Method for detecting and classifying events of a scene|
US15/110,937| US9953226B2|2014-02-07|2015-02-05|Method for detecting and classifying events of a scene that include firing threat type events|
EP15705929.6A| EP3103062A1|2014-02-07|2015-02-05|Method for detecting and classifying events of a scene|
PCT/EP2015/052430| WO2015118075A1|2014-02-07|2015-02-05|Method for detecting and classifying events of a scene|
IL246633A| IL246633A|2014-02-07|2016-07-06|Method for detecting and classifying events of a scene|