VALIDATION METHOD FOR VALIDATING THAT AN ELEMENT IS COVERED WITH A REAL SKIN
Abstract:
The invention relates to a validation method for validating that an element (10) is covered with a real skin, implemented by a validation device (100) comprising a light source (102) having at least one wavelength, a sensor (104), an analysis module (106) and a decision-making module (108), said validation method comprising: - a positioning step during which a surface of said element (10) is placed in front of the light source (102) and the sensor (104), - an illumination step during which the light source (102) illuminates the surface of said element (10), - a capture step during which the sensor (104) captures, for the or each wavelength, an image of the surface thus illuminated, having a reflection zone where the light beam emitted by the light source (102) is directly reflected by said element (10) and a diffusion zone where part of the light beam is diffused through said element (10), - an analysis step during which the reflection zone and the diffusion zone of the or each image thus captured are analyzed, and - a decision-making step during which the decision-making module (108) decides whether said element (10) is covered with a real skin based on the results of the analysis step.
Publication number: FR3015728A1
Application number: FR1362977
Filing date: 2013-12-19
Publication date: 2015-06-26
Inventors: Adrien Besson; Alain Thiebot; Denis Dumont; Jerome Lorenzi
Applicant: Morpho SA
IPC main class:
Description:
[0001] The present invention relates to a validation method for validating that an element is covered with a real skin, in particular in the context of a biometric analysis, and to a validation device implementing such a method. A device for identifying an individual by his fingerprint or palm print consists of a sensor, a comparison means and a decision-making means. The sensor has a lighting device and an acquisition device for capturing images of one or more fingerprints or palm prints. A template is then extracted from each fingerprint or palm print, highlighting its discriminant characteristics. The comparison means compares the captured image, or the biometric templates derived from it, with the biometric images or templates of a database that gathers the images or templates of persons previously registered in the identification device. The decision-making means is intended to make a decision as to the identification of the individual from the result of the comparisons.

[0002] Some malicious people try to be fraudulently identified by using decoys to mislead the identification device. Various validation methods are known for validating that the skin present in front of the sensor is real and therefore that the finger bearing the fingerprint is a real finger.

[0003] Some known methods rely entirely on the analysis of images, in particular on the identification of artifacts introduced when carrying out the fraud. However, these methods are not robust against careful frauds. Other methods are also known that capture a series of images of the finger and measure, for example, sweating, pulse, oximetry or the whitening of the finger while it presses on the capture surface. Such methods require an incompressible acquisition time, tied to the rate of evolution of the observed phenomenon, which degrades the ergonomics of the sensor. An object of the present invention is to propose a validation method intended to validate that an element is covered with a real skin and which does not have the disadvantages of the state of the art.

For this purpose, a validation method is proposed for validating that an element is covered with a real skin and implemented by a validation device comprising a light source with at least one wavelength, a sensor, an analysis module and a decision-making module, said validation method comprising: - a positioning step during which a surface of said element is placed in front of the light source and the sensor, - an illumination step during which the light source illuminates the surface of said element, - a capture step during which the sensor captures, for the or each wavelength, an image of the surface thus illuminated, having a reflection zone where the light beam emitted by the light source is directly reflected by said element and a diffusion zone where a part of the light beam is diffused through said element, - an analysis step during which the reflection zone and the diffusion zone of the or each captured image are analyzed, and - a decision-making step during which the decision-making module makes a decision as to whether said element is covered with a real skin based on the results of the analysis step.

Advantageously, the analysis step consists, for the or each image, in cutting an analysis zone covering the reflection zone and the diffusion zone into several calculation zones, in establishing an average intensity for each calculation zone of said analysis zone, as well as an intensity curve and an intensity gradient curve as a function of the distance from the calculation zone to the limit of the reflection zone, and in comparing characteristics of these curves with those extracted from reference curves.
Advantageously, when the light source emits in at least two distinct wavelengths, the analysis step further consists in establishing, for each calculation zone of said analysis zone, the curve of the intensity ratio for two distinct wavelengths as a function of the distance from said calculation zone to the limit of the reflection zone, and the curve of the intensity gradient ratio for two distinct wavelengths as a function of the distance from said calculation zone to the limit of the reflection zone, and in comparing characteristics of these curves with those extracted from reference curves.

According to a particular embodiment, the light source is rectangular, and the division of the analysis zone consists, for the or each image, in cutting said image into several rectangular strips whose edges are equidistant from the edges of the reflection zone. According to another particular embodiment, the light source is circular, and the division of the analysis zone consists, for the or each image, in cutting said image into several concentric rings centered on the center of the reflection zone. Advantageously, the light source emits at a wavelength between 350 and 550 nm and at a wavelength greater than 600 nm.

The invention also proposes a validation device intended to validate that an element is covered with a real skin, said validation device comprising: - a light source emitting at least one wavelength and intended to illuminate a surface of said element, - a sensor intended to capture, for the or each wavelength, an image of the surface thus illuminated, having a reflection zone where the light beam emitted by the light source is directly reflected by said element and a diffusion zone where a part of the light beam is diffused through said element, - an analysis module intended to receive the or each image captured by the sensor and to analyze it, and - a decision-making module intended to decide whether said element is covered with a real skin, from the information transmitted by the analysis module.

Advantageously, the analysis module comprises, for the or each image, means for cutting an analysis zone covering the reflection zone and the diffusion zone into several calculation zones, means for establishing, for each calculation zone, the average intensity of said calculation zone, and for establishing the intensity curve and the intensity gradient curve as a function of the distance from the calculation zone to the limit of the reflection zone, and means for comparing characteristics of these curves with those extracted from reference curves.

Advantageously, when the light source emits in at least two distinct wavelengths, the analysis module further comprises means for establishing, for each calculation zone of the analysis zone, the curve of the intensity ratio for two distinct wavelengths as a function of the distance from said calculation zone to the limit of the reflection zone, and the curve of the intensity gradient ratio for two distinct wavelengths as a function of the distance from said calculation zone to the limit of the reflection zone, and means for comparing characteristics of these curves with those extracted from reference curves.
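By way of illustration only, the construction of these per-zone curves can be sketched in Python as follows. This is a minimal sketch rather than the patented implementation: the greyscale image array, the boolean mask of the reflection zone (obtainable, for example, from the calibration described further on) and the zone width step_px are assumptions, and binning pixels by their distance to the limit of the reflection zone reproduces both the concentric rings of a circular source and the strips of a rectangular one.

```python
import numpy as np
from scipy import ndimage

def intensity_profile(image, reflection_mask, step_px=5, n_inside=2, n_outside=15):
    """Mean-intensity and intensity-gradient curves over calculation zones.

    image           : 2-D float array (greyscale intensities of one capture)
    reflection_mask : boolean array, True inside the directly lit zone
    step_px         : assumed width of each calculation zone, in pixels
    n_inside        : zones kept inside the reflection zone
    n_outside       : zones kept in the diffusion zone
    """
    # Signed distance to the limit of the reflection zone:
    # negative inside the reflection zone, positive in the diffusion zone.
    dist_out = ndimage.distance_transform_edt(~reflection_mask)
    dist_in = ndimage.distance_transform_edt(reflection_mask)
    signed_dist = dist_out - dist_in

    distances, means = [], []
    for k in range(-n_inside, n_outside):
        lo, hi = k * step_px, (k + 1) * step_px
        zone = (signed_dist >= lo) & (signed_dist < hi)   # one calculation zone
        if zone.any():
            distances.append((lo + hi) / 2.0)
            means.append(float(image[zone].mean()))

    distances = np.asarray(distances, dtype=float)
    means = np.asarray(means, dtype=float)
    gradient = np.gradient(means, distances)              # intensity gradient curve
    return distances, means, gradient
```

The arrays returned correspond to the intensity curve and the intensity gradient curve that are compared with the reference curves in the decision-making step.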
According to a particular embodiment, the light source is rectangular, and the analysis module comprises, for the or each image, means for cutting said image into several rectangular strips whose edges are equidistant from the edges of the reflection zone. According to another particular embodiment, the light source is circular, and the analysis module comprises, for the or each image, means for cutting said image into several concentric rings centered on the center of the reflection zone.

The characteristics of the invention mentioned above, as well as others, will appear more clearly on reading the following description of an exemplary embodiment, said description being given in relation to the attached drawings, among which: Fig. 1 is a schematic representation of a validation device according to the invention, Fig. 2a is an image of a fingerprint seen by the validation device according to one embodiment of the invention, Fig. 2b is an image of a fingerprint seen by the validation device according to another embodiment of the invention, Fig. 3 is an algorithm of a validation method according to the invention, Fig. 4 shows the reflectance of a real finger as a function of the wavelength of the luminous flux which illuminates it, Fig. 5 shows an example of an intensity curve, and Fig. 6 shows a curve representative of the penetration, in cm, of light into the skin as a function of the wavelength.

The principle of the invention consists in validating that an element, in particular the lower surface of one or more fingers or the palm, is covered with a real skin and therefore is a part of a human body. More precisely, the principle of the invention consists in illuminating the surface of the element with the aid of a light source making it possible to illuminate only a well-defined zone of the element while preserving a zone without direct illumination, in capturing an image of these two zones, and in analyzing this image to deduce whether said element is covered with a real skin or a false skin.

[0004] Illumination can be achieved with contactless image capture or with contact image capture through a contact surface, in particular a prism or an optical plate. In the rest of the description, the element is a finger, but the invention applies to all other parts of the body, such as several fingers, a palm or a face. In particular, an advantageous embodiment provides a clear transition between the directly lit zone and the zone without direct lighting. An exemplary embodiment uses one or more light sources of small extent, or one or more lines of light.

[0005] Fig. 1 shows a validation device 100 which is intended to validate that an element 10, here a finger, is covered with a real skin and is therefore a real finger 10.
The validation device 100 comprises: - a light source 102 with at least one wavelength, for illuminating a surface of the finger 10, - a sensor 104 for capturing, for the or each wavelength, an image 200a-b which, as described hereinafter, comprises both the surface illuminated directly by the light source 102 and the unlit surface immediately adjacent to the illuminated surface, that is to say an image 200a-b having a reflection zone 202a-b where the light beam emitted by the light source 102 is directly reflected by said element 10 and a diffusion zone 204a-b where part of the light beam is diffused through said element 10, - an analysis module 106 intended to receive the or each image 200a-b captured by the sensor 104 and to analyze it as described below, and - a decision-making module 108 intended to make a decision on the fact that the finger 10 is covered with a real skin, from the information transmitted by the analysis module 106.

Fig. 2a shows an image 200a captured by the sensor 104 in the case of a circular light source 102. Fig. 2b shows an image 200b captured by the sensor 104 in the case of a rectangular light source 102. The light source 102 has a reduced angular aperture in at least one direction, so that the illumination of the surface is not uniform throughout the image 200a-b.

[0006] Thus, the surface of the finger 10 is illuminated at a reflection zone 202a-b, where the light beam emitted by the light source 102 is directly reflected by the surface (the scattered and absorbed part being of second order), and the surface of the finger 10 glows at a peripheral zone called the diffusion zone 204a-b, which is not illuminated directly by the light source 102: a portion of the light beam is transmitted into the finger 10 and is then diffused out at the diffusion zone 204a-b, which extends around the periphery of the reflection zone 202a-b. The image 200a-b captured by the sensor 104 shows the reflection zone 202a-b and the associated diffusion zone 204a-b, and the simultaneous analysis of these zones 202a-b and 204a-b makes it possible to conclude whether the finger 10 is true or false. The illumination of the surface is said to be non-uniform throughout the image 200a-b when the reflection zone 202a-b and the diffusion zone 204a-b are simultaneously visible by the sensor 104 but the diffusion zone 204a-b does not undergo direct illumination from the light source 102, unlike the reflection zone 202a-b. The analysis therefore takes into account both the reflection on the finger 10 and the diffusion by the finger 10, and not just one of these two phenomena. Thus, even if a decoy is used in place of a real finger, it will be difficult to produce a decoy with the optical characteristics of a real finger for both reflection and diffusion.

In the particular case of a capture with contact, and to facilitate the positioning and holding of the finger 10 during the capture of the image 200a-b, the validation device 100 may comprise a capture surface on which the finger 10 rests and which guides the light between the light source 102, the finger 10 and the sensor 104. In some embodiments with contact, a translucent plate is interposed between the finger 10 on one side and the sensor 104 and the light source 102 on the other. Fig. 3 is an algorithm of a validation method 300 intended to validate that the finger 10 is covered with a real skin and implemented by the validation device 100.
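The two zones that must be simultaneously visible can be isolated, purely as an illustration, with the short sketch below. The reflection-zone mask is assumed to be available (for example from the calibration described further on), and the halo width halo_px, the noise floor and the function name are illustrative assumptions rather than values taken from the patent.

```python
import numpy as np
from scipy import ndimage

def split_zones(image, reflection_mask, halo_px=40, noise_floor=8.0):
    """Separate a capture into its reflection zone and its diffusion zone.

    reflection_mask : boolean mask of the directly lit zone
    halo_px         : assumed maximum width of the diffusion halo, in pixels
    noise_floor     : assumed intensity below which a pixel is considered dark
    Returns the two masks plus a flag telling whether both zones are usable.
    """
    # The diffusion zone is the ring surrounding the reflection zone, up to halo_px.
    dilated = ndimage.binary_dilation(reflection_mask, iterations=halo_px)
    diffusion_mask = dilated & ~reflection_mask

    # Both zones must actually carry signal for the simultaneous analysis.
    usable = (reflection_mask.any()
              and diffusion_mask.any()
              and image[reflection_mask].mean() > noise_floor
              and image[diffusion_mask].mean() > noise_floor)
    return reflection_mask, diffusion_mask, usable
```

Such a check merely confirms that the directly lit plateau and its diffusion halo both carry usable signal before the profiling and comparison described below are applied.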
The validation method 300 comprises: - a positioning step 302 during which a surface of the finger 10 is placed in front of the validation device 100, and more particularly in front of the light source 102 and the sensor 104, - an illumination step 304 during which the light source 102 illuminates the surface of the finger 10, - a capture step 306 during which the sensor 104 captures, for the or each wavelength, an image 200a-b of the surface thus illuminated, having the reflection zone 202a-b and the associated diffusion zone 204a-b, - an analysis step 308 during which the reflection zone 202a-b and the diffusion zone 204a-b of the or each image 200a-b thus captured are analyzed, and - a decision-making step 310 during which the decision-making module 108 makes a decision as to whether the finger 10 is covered with a real skin according to the results of the analysis step 308.

The analysis step 308 consists, for example, for the or each image 200a-b, in cutting an analysis zone 206a-b covering the reflection zone 202a-b and the diffusion zone 204a-b into several calculation zones 208a-b, in establishing an average intensity for each calculation zone 208a-b, an intensity curve and an intensity gradient curve as a function of the distance from said calculation zone 208a-b to an origin, and in comparing characteristics of these curves with those extracted from reference curves. The origin of the analysis zone 206a-b depends on the shape of the latter and is determined by using shape analysis methods known to those skilled in the art, such as skeletonization and pruning methods. These methods make it possible to ensure that each point of the origin is equidistant from the nearest point of the reflection zone 202a-b.

In the particular case of circular illumination (Fig. 2a), the reflection zone 202a and the diffusion zone 204a are concentric circles, and the origin is fixed on the boundary of the reflection zone 202a. The analysis zone 206a is divided into calculation zones 208a, which here are rings 208a concentric with the reflection zone 202a. In the particular case of rectangular lighting (Fig. 2b), the reflection zone 202b is a rectangle and the diffusion zone 204b is a larger rectangle whose edges are equidistant from the edges of the reflection zone 202b, and the origin is fixed on the boundary of the reflection zone 202b. The analysis zone 206b is divided into calculation zones 208b, which here are rectangular strips 208b whose edges are equidistant from the edges of the reflection zone 202b.

[0007] The limit of the reflection zone 202a-b which serves as the origin is determined, for example, by a calibration process based on the acquisition, through the sensor, of an image of a non-diffusing white target under the non-uniform illumination defined previously. The image obtained by capturing this material reflects only the direct reflection effect and not the diffusion effect, owing to the nature of the material. Illumination defects that can be observed (for example, non-uniformity in the direct reflection zone) are compensated for when calculating the intensity profiles, so as to obtain a plateau of intensity in the direct reflection zone. The limit of the direct reflection zone is defined at each point of the source contour, that is to say at each limit point of the intensity plateau measured on the non-diffusing material.
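One possible reading of this calibration, given only as a sketch, is shown below: the plateau of direct reflection is segmented on the white-target image with an assumed threshold (plateau_fraction), the boundary of the resulting mask serves as the origin of the distance axis, and a gain map is derived to compensate the illumination defects inside the plateau. The function name, the Gaussian smoothing and the 99th-percentile plateau estimate are assumptions.

```python
import numpy as np
from scipy import ndimage

def calibrate_reflection_zone(white_target_image, plateau_fraction=0.8):
    """Locate the direct-reflection plateau on a non-diffusing white target.

    white_target_image : 2-D float array captured under the device's
                         non-uniform illumination
    plateau_fraction   : assumed fraction of the plateau level used to
                         segment the directly lit zone
    Returns the reflection-zone mask (whose boundary serves as the origin of
    the distance axis) and a gain map compensating illumination defects.
    """
    smoothed = ndimage.gaussian_filter(white_target_image, sigma=2.0)
    plateau_level = np.percentile(smoothed, 99)            # plateau intensity
    reflection_mask = smoothed >= plateau_fraction * plateau_level

    # Keep only the largest connected component (the footprint of the source).
    labels, n = ndimage.label(reflection_mask)
    if n > 1:
        sizes = ndimage.sum(reflection_mask, labels, index=range(1, n + 1))
        reflection_mask = labels == (1 + int(np.argmax(sizes)))

    # Gain map: flattens residual non-uniformity inside the plateau; the same
    # gains are applied to finger images before the intensity profiles are built.
    gain = np.ones_like(smoothed)
    gain[reflection_mask] = plateau_level / smoothed[reflection_mask]
    return reflection_mask, gain
```

The boolean mask returned here plays the role of the reflection_mask used in the profiling sketch given earlier, and the gain map would be applied to the finger images before the intensity profiles are computed.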
The analysis module 106 then comprises, for the or each image 200a-b, means for cutting the analysis zone 206a-b into several calculation zones 208a-b, means for establishing, for each calculation zone 208a-b, the average intensity of said calculation zone 208a-b, and then for establishing the intensity curve as a function of the distance from the calculation zone 208a-b to the origin and the intensity gradient curve as a function of the distance from the calculation zone 208a-b to the origin, and means for comparing characteristics of these curves with those extracted from reference curves. The characteristics to be compared are extracted from each image 200a-b, which is advantageously pre-processed beforehand so as not to take into account the background pixels. An example of such processing is the application of a simple frequency filter, or the use of a ridge pixel location mask generated by a template extractor. The characteristics to be compared are thus, in particular, the absolute value of the intensity of each calculation zone 208a-b and the intensity gradient in each calculation zone 208a-b. Fig. 5 shows an example of an intensity curve 502 as a function of the distance from the limit of the reflection zone 202a-b, which corresponds to the origin defined by the method described above.

[0008] For the finger 10 to be validated, the intensity curve and the gradient curve as a function of the distance must each remain between two reference bounds extracted from the reference curves. It is also possible to measure, at each point, by how much the intensity value and the gradient value at said point fall outside the two corresponding reference bounds, to sum these differences in absolute value, and to compare this sum with an acceptable threshold. The reference curves are here intensity curves and intensity gradient curves that have been established with a large panel of real fingers and for the wavelength considered.

In the case of a circular light source 102, the cutting of the analysis zone 206a consists, for the or each image 200a, in cutting said image 200a into a plurality of concentric rings 208a centered on the center of the reflection zone 202a, of which at least one ring 208a is in the reflection zone 202a and at least one ring 208a is in the diffusion zone 204a, in calculating the average intensity on each ring 208a of the previously processed images, in establishing a curve of the average intensity thus calculated as a function of the distance from the ring 208a to the limit of the reflection zone 202a and a curve of the intensity gradient as a function of the distance from the ring 208a to the limit of the reflection zone 202a, and in comparing these curves with reference curves, for example by difference and verification against a threshold. The reference curves are here a curve of the average intensity as a function of the distance from the ring 208a to the limit of the reflection zone 202a and a curve of the intensity gradient as a function of the distance from the ring 208a to the limit of the reflection zone 202a, which have been established with real fingers for the wavelength considered. The analysis module 106 then further comprises, for the or each image 200a, means for cutting said image 200a into a plurality of concentric rings 208a centered on the center of the reflection zone 202a.
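The comparison with the reference curves can be read, for illustration, as the band test sketched below: each measured curve must stay between a lower and an upper reference bound, and a non-zero tolerance reproduces the variant in which the summed out-of-bound deviations are compared with an acceptable threshold. The reference bounds, established from a panel of real fingers, are assumed to be available; the names and the dictionary layout are illustrative.

```python
import numpy as np

def curve_within_reference(measured, ref_low, ref_high, tolerance=0.0):
    """Band test on one curve (intensity or gradient, one wavelength).

    measured, ref_low, ref_high : 1-D arrays sampled at the same distances
    tolerance : allowed sum of absolute out-of-band deviations
                (0.0 reproduces the strict 'stay between the bounds' test)
    """
    below = np.clip(ref_low - measured, 0.0, None)    # amount under the band
    above = np.clip(measured - ref_high, 0.0, None)   # amount over the band
    return float(np.sum(below + above)) <= tolerance

def finger_is_genuine(intensity, gradient, refs, tolerance=0.0):
    """Combine the band tests on the intensity and gradient curves.
    `refs` is an assumed dict: {'intensity': (low, high), 'gradient': (low, high)}."""
    return (curve_within_reference(intensity, *refs['intensity'], tolerance=tolerance)
            and curve_within_reference(gradient, *refs['gradient'], tolerance=tolerance))
```

A stricter or looser policy is obtained simply by changing the tolerance, without altering the curves themselves.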
[0009] In the case of a rectangular light source 102, the principle is similar, except that the calculation zones 208b are rectangular strips 208b and not rings 208a. The division of the analysis zone 206b then consists, for the or each image 200b, in cutting said image 200b into several rectangular strips 208b. The analysis module 106 then further comprises, for the or each image 200b, means for cutting said image 200b into several rectangular strips 208b whose edges are equidistant from the edges of the reflection zone 202b.

When the light source 102 emits in at least two distinct wavelengths, it is possible to combine the results obtained for each wavelength with the analysis results for each pair of distinct wavelengths.

[0010] The analysis step 308 then furthermore consists in establishing, for each calculation zone 208a-b of said analysis zone 206a-b, the curve of the intensity ratio for two distinct wavelengths as a function of the distance from said calculation zone 208a-b to the limit of the reflection zone 202a-b and the curve of the intensity gradient ratio for two distinct wavelengths as a function of the distance from said calculation zone 208a-b to the limit of the reflection zone 202a-b, and in comparing characteristics of these curves with those extracted from reference curves. The reference curves are here the intensity ratio and intensity gradient ratio curves which have been established with a large panel of real fingers 10 and for the two wavelengths considered. The analysis module 106 then further comprises means for establishing, for each calculation zone 208a-b of the analysis zone 206a-b, the curve of the intensity ratio for two distinct wavelengths as a function of the distance from said calculation zone 208a-b to the limit of the reflection zone 202a-b, and the curve of the intensity gradient ratio for two distinct wavelengths as a function of the distance from said calculation zone 208a-b to the limit of the reflection zone 202a-b, and means for comparing characteristics of these curves with those extracted from reference curves.

In the various embodiments presented, there are for example 5 to 50 rings, and the difference in radii between two consecutive rings is of the order of 0.1 to 0.3 mm depending on the wavelength used. In the case of a light source 102 with several wavelengths, the sensor 104 preferably takes the form of a camera having a signal output for each Red-Green-Blue wavelength. The camera is for example equipped with a Bayer filter, which makes it possible, in a single capture, to recover the image 200a-b corresponding to each wavelength on the appropriate signal output. According to another particular embodiment, the light source 102 emits a white light. In another embodiment with several wavelengths, it is possible to use a monochrome camera and a light source emitting the different wavelengths one after the other. An image is then obtained for each wavelength, and this embodiment makes it possible in particular to use a wavelength in the near infrared range.

[0011] It is also conceivable that, for each wavelength, the area of the finger that is illuminated is different. For example, there may be several light sources 102, each illuminating at a particular wavelength and illuminating a different zone of the finger 10.

[0012] Fig. 4 shows the curve of the reflectance 400, as a percentage, of a real finger as a function of the wavelength in nm of the luminous flux which illuminates it.
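For the two-wavelength analysis described above, the ratio curves can be sketched as follows, assuming that the intensity and gradient profiles of each wavelength have already been computed, for instance with the intensity_profile sketch given earlier applied to the two channels extracted from a Bayer-filter capture; the guard value eps and the function name are assumptions.

```python
import numpy as np

def ratio_curves(distances, mean_a, grad_a, mean_b, grad_b, eps=1e-6):
    """Inter-wavelength ratio curves for a two-wavelength capture (sketch).

    distances          : common distances to the limit of the reflection zone
    mean_a, grad_a     : intensity and gradient curves at wavelength 'a'
    mean_b, grad_b     : intensity and gradient curves at wavelength 'b'
    """
    intensity_ratio = mean_a / np.maximum(mean_b, eps)
    # Guard the gradient ratio against near-zero denominators.
    safe_grad_b = np.where(np.abs(grad_b) < eps, eps, grad_b)
    gradient_ratio = grad_a / safe_grad_b
    return distances, intensity_ratio, gradient_ratio
```

The resulting ratio curves are then compared with their reference counterparts using the same kind of band test as for the single-wavelength curves.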
Fig. 6 shows the penetration curve 602, in cm, of light into the skin as a function of the wavelength. This curve is related to the reflectance curve 400 in that the wavelengths that are not absorbed by the epidermis reach the dermis, which is a less absorbent tissue, and can therefore be diffused over a greater distance than the wavelengths remaining in the epidermis. To maximize the difference in behavior between reflection and diffusion, it is preferable to choose wavelengths with very different behaviors on a finger 10.

[0013] The reflectance curve 400 and the penetration depth curve 602 highlight these very different behaviors. At short wavelengths, that is for wavelengths ranging from 350 to 550 nm (UV, blue, green), the light remains in the superficial layers of the skin and is absorbed. At longer wavelengths, that is for wavelengths of 600 to 900 nm and beyond (amber/red, near IR), light enters the dermis and diffuses further. Thus, when two wavelengths are used, the choice will preferentially be a wavelength between 350 and 550 nm and a wavelength greater than 600 nm.

[0014] Of course, the present invention is not limited to the examples and embodiments described and shown, but it is capable of numerous variants accessible to those skilled in the art. For example, the invention has been more particularly described in the case of a single light source 102, but it is possible to have several light sources 102, each pointing at a different surface of the element 10 to be validated. There will then be as many reflection zones 202a and diffusion zones 204a, which will undergo identical processing and which will allow consolidation between several values, thus defeating complex frauds having parts of real skin and parts of false skin.
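Finally, the consolidation between the zones illuminated by several light sources can be illustrated by the minimal rule below; requiring every zone to be validated is only one possible fusion rule, given here as an assumption, and other rules (majority vote, weighted scores) are equally conceivable.

```python
def consolidate_zones(zone_results):
    """Consolidate the per-zone decisions obtained with several light sources.

    zone_results : iterable of booleans, one per illuminated zone of the element
    Requiring every zone to pass is one simple way to reject frauds that mix
    parts of real skin and parts of false skin.
    """
    zone_results = list(zone_results)
    return bool(zone_results) and all(zone_results)
```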
Claims:
[0001] 1) Validation method (300) for validating that an element (10) is covered with a real skin and implemented by a validation device (100) comprising a light source (102) having at least one wavelength, a sensor (104), an analysis module (106) and a decision-making module (108), said validation method (300) comprising: - a positioning step (302) during which a surface of said element (10) is placed in front of the light source (102) and the sensor (104), - an illumination step (304) during which the light source (102) illuminates the surface of said element (10), - a capture step (306) during which the sensor (104) captures, for the or each wavelength, an image (200a-b) of the surface thus illuminated, having a reflection zone (202a-b) where the light beam emitted by the light source (102) is directly reflected by said element (10) and a diffusion zone (204a-b) where a part of the light beam is diffused through said element (10), - an analysis step (308) during which the reflection zone (202a-b) and the diffusion zone (204a-b) of the or each captured image (200a-b) are analyzed, and - a decision-making step (310) during which the decision-making module (108) makes a decision as to whether said element (10) is covered with a real skin according to the results of the analysis step (308).

[0002] 2) Validation method (300) according to claim 1, characterized in that the analysis step (308) consists, for the or each image (200a-b), in cutting an analysis zone (206a-b) covering the reflection zone (202a-b) and the diffusion zone (204a-b) into several calculation zones (208a-b), in establishing an average intensity for each calculation zone (208a-b) of said analysis zone (206a-b), an intensity curve and an intensity gradient curve as a function of the distance from the calculation zone (208a-b) to the limit of the reflection zone (202a-b), and in comparing characteristics of these curves with those extracted from reference curves.

[0003] 3) Validation method (300) according to claim 2, characterized in that, when the light source (102) emits in at least two distinct wavelengths, the analysis step (308) further consists in establishing, for each calculation zone (208a-b) of said analysis zone (206a-b), the curve of the intensity ratio for two distinct wavelengths as a function of the distance from said calculation zone (208a-b) to the limit of the reflection zone (202a-b), and the curve of the intensity gradient ratio for two distinct wavelengths as a function of the distance from said calculation zone (208a-b) to the limit of the reflection zone (202a-b), and in comparing characteristics of these curves with those extracted from reference curves.

[0004] 4) Validation method (300) according to one of claims 2 or 3, characterized in that the light source (102) is rectangular, and in that the cutting of the analysis zone (206b) consists, for the or each image (200b), in cutting said image (200b) into several rectangular strips (208b) whose edges are equidistant from the edges of the reflection zone (202b).

[0005] 5) Validation method (300) according to one of claims 2 or 3, characterized in that the light source (102) is circular, and in that the cutting of the analysis zone (206a) consists, for the or each image (200a), in cutting said image (200a) into a plurality of concentric rings (208a) centered on the center of the reflection zone (202a).
[0006] 6) Validation method (300) according to one of claims 1 to 5, characterized in that the light source (102) emits at a wavelength between 350 and 550 nm and at a wavelength greater than 600 nm.

[0007] 7) Validation device (100) for validating that an element (10) is covered with a real skin, said validation device (100) comprising: - a light source (102) emitting at least one wavelength and intended to illuminate a surface of said element (10), - a sensor (104) for capturing, for the or each wavelength, an image (200a-b) of the illuminated surface, having a reflection zone (202a-b) where the light beam emitted by the light source (102) is directly reflected by said element (10) and a diffusion zone (204a-b) where a portion of the light beam is diffused through said element (10), - an analysis module (106) for receiving the or each image (200a-b) captured by the sensor (104) and for analyzing it, and - a decision-making module (108) for taking a decision as to whether said element (10) is covered with a real skin, based on the information transmitted by the analysis module (106).

[0008] 8) Validation device (100) according to claim 7, characterized in that the analysis module (106) comprises, for the or each image (200a-b), means for cutting an analysis zone (206a-b) covering the reflection zone (202a-b) and the diffusion zone (204a-b) into a plurality of calculation zones (208a-b), means for establishing, for each calculation zone (208a-b), the average intensity of said calculation zone (208a-b), and for establishing the intensity curve and the intensity gradient curve as a function of the distance from the calculation zone (208a-b) to the limit of the reflection zone (202a-b), and means for comparing characteristics of these curves with those extracted from reference curves.

[0009] 9) Validation device (100) according to claim 8, characterized in that, when the light source (102) emits in at least two distinct wavelengths, the analysis module (106) further comprises means for establishing, for each calculation zone (208a-b) of the analysis zone (206a-b), the curve of the intensity ratio for two distinct wavelengths as a function of the distance from said calculation zone (208a-b) to the limit of the reflection zone (202a-b), and the curve of the intensity gradient ratio for two distinct wavelengths as a function of the distance from said calculation zone (208a-b) to the limit of the reflection zone (202a-b), and means for comparing characteristics of these curves with those extracted from reference curves.

[0010] 10) Validation device (100) according to one of claims 8 or 9, characterized in that the light source (102) is rectangular, and in that the analysis module (106) comprises, for the or each image (200b), means for cutting said image (200b) into a plurality of rectangular strips (208b) whose edges are equidistant from the edges of the reflection zone (202b).

[0011] 11) Validation device (100) according to one of claims 8 or 9, characterized in that the light source (102) is circular, and in that the analysis module (106) comprises, for the or each image (200a), means for cutting said image (200a) into a plurality of concentric rings (208a) centered on the center of the reflection zone (202a).
Patent family:
Publication number | Publication date
US9898646B2 | 2018-02-20
EP3044729A1 | 2016-07-20
FR3015728B1 | 2019-04-19
US20160275335A1 | 2016-09-22
WO2015091701A1 | 2015-06-25
ZA201604053B | 2019-10-30
Priority:
Application number | Filing date | Title
FR1362977A | 2013-12-19 | VALIDATION METHOD FOR VALIDATING THAT AN ELEMENT IS COVERED WITH A REAL SKIN