Patent abstract:
The present invention relates to a method for identifying/authenticating an individual, comprising: segmenting an acquired image of an eye of the individual so as to isolate a region containing the texture of the iris and to determine an associated mask; dividing said region containing the iris texture and said mask into N2 concentric rings and normalizing them; determining an iris binary code IC of size equal to N2 * P * F and an associated mask code IM by applying F Gabor filters at each of P positions of each ring; matching at least one ring of the iris of the acquired image with a ring of a reference iris, so as to minimize a correspondence distance (HDViterbi) between the acquired image and a reference image, said distance being obtained from the determined iris code and the stored reference and associated mask codes by calculating a sum of distances (Dist) between each of said rings of the iris of the matched acquired image and the corresponding ring of the reference iris; and identifying/authenticating the individual if said minimized correspondence distance is less than a predetermined threshold.
Publication number: FR3037422A1
Application number: FR1555440
Filing date: 2015-06-15
Publication date: 2016-12-16
Inventors: Emine Krichen; Pierre Gacon; Stephane Gentric
Applicant: Morpho SA
IPC main classification:
Patent description:

[0001] TECHNICAL FIELD The invention relates to an improved method of biometric identification or authentication by iris recognition.
[0002] STATE OF THE ART In order to compare the iris of an individual with iris data stored in a reference database, the iris of this individual must be characterized numerically. For this, a commonly used method is the Daugman method (US5291560). This method provides that the image of the eye to be analyzed is segmented in order to isolate the part of the image corresponding to the iris of the individual, as shown in FIG. 1. In order to be able to compare this part of the image with the reference iris data, the area of the iris to be analyzed can then be cut into a number of angular sectors, and cut radially into a number of rings, as shown in FIG. 2. In order to effectively compare the iris to be analyzed with the reference irises despite a dilation that varies with the shooting conditions, the image of the iris to be analyzed can be normalized in the form of a rectangular image of fixed size, thus transforming the Cartesian coordinates of the original image into polar coordinates, as shown in FIGs. 3a to 3c. In order to transform this normalized image into a binary code, called the iris code, a filtering, for example a Gabor filtering, can be applied to various zones of this image, for example to each zone corresponding to the intersection of a ring and an angular sector. A binary value 0 or 1 can then be assigned to each zone according to the sign of the result of the filtering on this zone. Such a binary code can then easily be compared bit by bit with one or more reference binary codes stored in a database in order to identify or authenticate the individual whose iris has been analyzed. Such a binary comparison amounts to comparing zones of the iris positioned at the same position in the normalized image. Note that the left and right edges of the normalized image actually correspond to adjacent areas of the iris and should be treated as such in post-normalization analyses. Such a normalization would perfectly compensate for a linear dilation/contraction of the iris as a function of brightness. In practice, the dilation of the iris observed is far from linear. A ring of the iris can then be at different positions in the normalized image depending on the dilation of the pupil. In this case, the bit-by-bit comparison of the binary codes may amount to comparing normalized image areas that do not correspond to identical zones of the iris. In addition, the movement of the iris during the dilation of the pupil is three-dimensional. Some rings of the iris may appear or disappear during dilation because of the overlap of different areas of the iris. Complex models have been proposed in order to model the movement of the iris more finely as a function of dilation, such as the Wyatt model (Harry Wyatt, "A minimum wear-and-tear meshwork for the iris", Vision Research 40 (2000) 2167-2176). But such a model does not take into account the appearance or disappearance of areas of the iris. A model such as the Clark model (A. Clark, S. Kulp, I. Herron and A. Ross, "A theoretical model for describing iris dynamics", Handbook of Iris Recognition, pp. 129-150, 2013) makes it possible to take into account both a non-linear variation of the iris and the appearance/disappearance of certain areas of the iris. Nevertheless, the prediction of the movement of the iris by such a model is a function of a parameter, called the Poisson parameter, whose value varies from one individual to another.
To be able to recognize an individual by his iris using such a model, it is therefore necessary to perform specific learning to determine the value of the Poisson parameter for that individual. In the article by Inmaculada Tomeo-Reyes, Arun Ross, Antwan D. Clark and Vinod Chandran, "A Biomechanical Approach to Iris Normalization", the authors propose a methodology for normalizing the texture of the iris based on the Clark model. As stated by the authors, this methodology requires knowledge of the dilation ratio of the pupil of the reference image in order to normalize the test image. While this approach may be acceptable for a verification scenario (where the reference image is known), it becomes impossible to implement for an identification scenario (where the reference image to be compared is unknown). The authors also set the Poisson parameter at 0.49 for all persons.
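For illustration only, the Daugman-style normalization recalled at the beginning of this section can be sketched as follows in Python; the pupil and iris boundaries are assumed here to be circular and concentric, and already obtained by segmentation (function name, sampling grid and nearest-neighbour interpolation are illustrative choices, not the patent's implementation):

```python
import numpy as np

def normalize_iris(image, cx, cy, r_pupil, r_iris, n_rings=8, n_angles=64):
    """Map the annular iris region to a fixed-size rectangle (rings x angles).

    Rows correspond to concentric rings, columns to angular positions; this is
    the Cartesian-to-polar "rubber sheet" normalization described above.
    """
    normalized = np.zeros((n_rings, n_angles), dtype=image.dtype)
    for i in range(n_rings):
        # radial position of ring i, linearly interpolated between the boundaries
        r = r_pupil + (i + 0.5) * (r_iris - r_pupil) / n_rings
        for j in range(n_angles):
            theta = 2.0 * np.pi * j / n_angles
            x = int(round(cx + r * np.cos(theta)))
            y = int(round(cy + r * np.sin(theta)))
            normalized[i, j] = image[y, x]
    return normalized
```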
[0003] In order to limit the dependence on such complex models of iris dilation, methods of identification or authentication by iris recognition have been developed in which the dilation of the pupil is taken into account not during the normalization step but during the step of comparing the binary codes. Such methods generally attempt, when comparing two iris images, to determine the most likely movement of the iris rings between these two images from the iris codes, before calculating a distance between the images from the iris codes taking this movement into account. Each zone of the iris in the normalized image is then no longer necessarily compared with the corresponding zone of the reference image, but can be compared with an area of this image that corresponds to it better. Such a comparison better takes into account the nonlinear nature of the movement of the iris as a function of the dilation of the pupil.
[0004] It can also take into account the appearance or disappearance of certain areas of the iris. The distance between the compared binary codes can then be reduced, thus improving the efficiency of the recognition method. By way of example, the method described in Uhl & Wild (Andreas Uhl, Peter Wild, "Enhancing Iris Matching Using Levenshtein Distance with Alignment Constraints", in G. Bebis, R. Boyle, B. Parvin, D. Koracin, R. Chung, R. Hammoud, editors, Advances in Visual Computing: 6th International Symposium (ISVC 2010), pp. 469-479, LNCS 6453, Springer Verlag, 2010) proposes to compare bit by bit the binary codes of an image to be analyzed and of a reference image, and to take into account, when there is a discrepancy between these codes, the possibility that this is the result of an insertion or disappearance of an iris zone in one of the images, or of a substitution between the iris zones corresponding to the differing bits between these two images. Each operation having a predetermined cost, the method proposes to determine the set of substitution, disappearance and appearance operations explaining the differences between the compared iris codes at the lowest cost. Such a method, however, processes the binary codes bit by bit and does not take into account the correlation between the motion of an area of the iris and that of the other areas belonging to the same ring of the iris. Such a correlation is important because of the isotropic nature of the movement of the iris. Such bitwise processing thus greatly increases the probability of false positives when comparing two iris images of different individuals.
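As a purely illustrative aside, the kind of substitution/insertion/deletion accounting used by such Levenshtein-based comparisons can be sketched as a plain dynamic-programming edit distance; the alignment constraints and specific costs of the Uhl & Wild method are not reproduced here:

```python
def edit_distance(bits_a, bits_b, c_sub=1, c_ins=1, c_del=1):
    """Classic edit distance between two bit sequences.

    d[i][j] is the minimum cost of turning the first i bits of bits_a into
    the first j bits of bits_b using substitutions, insertions and deletions.
    """
    n, m = len(bits_a), len(bits_b)
    d = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        d[i][0] = i * c_del
    for j in range(1, m + 1):
        d[0][j] = j * c_ins
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            sub = d[i - 1][j - 1] + (0 if bits_a[i - 1] == bits_b[j - 1] else c_sub)
            d[i][j] = min(sub, d[i - 1][j] + c_del, d[i][j - 1] + c_ins)
    return d[n][m]
```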
[0005] There is therefore a need for an effective method of identifying or authenticating individuals by iris recognition that takes into account the non-linear nature of the iris displacement and the appearance or disappearance of certain areas of the iris depending on the dilation of the pupil, while taking advantage of the isotropic nature of this movement, without requiring specific learning for each individual.
[0006] SUMMARY OF THE INVENTION The present invention thus relates, according to a first aspect, to a method for identifying and/or authenticating an individual comprising the following steps implemented by an identification and/or authentication system of an individual, said identification and/or authentication system comprising a storage device storing at least one binary code of a reference iris obtained from a reference iris image divided into a first predetermined number N1 of concentric rings, and an associated reference mask binary code identifying regions of the reference iris in which eyelids, eyelashes, reflections or other sources of noise have been detected:
- acquisition of an image of an eye of the individual to be identified and/or authenticated,
- segmentation of the image of the acquired eye so as to isolate a region containing the texture of the iris and to determine an associated mask,
- division of said region containing the iris texture and said mask into a second number N2 of concentric rings,
- normalization of said region containing the texture of the iris and said mask from Cartesian coordinates into polar coordinates,
- determination of an iris binary code IC of size equal to N2 * P * F encoding said region containing the normalized iris texture and of a mask binary code IM of size equal to N2 * P * F encoding the associated mask, by applying a third predetermined number F of Gabor filters at each of a fourth predetermined number P of positions of each ring,
- matching of at least one ring of the iris of the acquired image with a ring of one of said reference irises, so as to minimize a correspondence distance between the acquired image and the reference image, said correspondence distance being obtained from the determined iris code and the stored reference and associated mask codes by calculating a sum of distances between each of said iris rings of the matched acquired image and the ring of the reference iris corresponding thereto,
- calculation of said minimized correspondence distance,
- identification and/or authentication of the individual if said minimized correspondence distance is less than a predetermined threshold.
The matching step may implement a Viterbi algorithm.
The matching step may comprise:
- a calculation of a comparison matrix M of size N1 * N2, each coefficient M(i, j) of said comparison matrix coding the result of the comparison between the ith ring of the stored reference iris and the jth ring of the iris of the acquired image using said determined and reference iris codes and the associated mask codes,
- a construction of a Viterbi lattice in which each mapping of a ring i of the reference iris and a ring j of the iris of the acquired image has the following cost:
for i = 1: Cost(i, j) = Prob(i, j) * M(i, j)
for all i > 1: Cost(i, j) = Prob(i, j) * M(i, j) + min over L = li..ls and k = kmin..kmax of [ ProbT((i, j), (i - L, j + k)) * Cost(i - L, j + k) ]
with:
o li, ls, kmin, kmax predetermined relative integers, with i - L >= 1 and 0 < j + k <= N2,
o Prob(i, j) a predetermined probability of correspondence of the ith ring of the reference iris and the jth ring of the iris of the acquired image,
o ProbT((i, j), (p, q)) a predetermined probability of correspondence of the pth ring of the reference iris and the qth ring of the iris of the acquired image knowing the mapping of the ith ring of the reference iris and the jth ring of the iris of the acquired image;
- a determination of a path in said Viterbi lattice comprising:
o a mapping of the Mth ring of the reference iris with the Path(M)th ring of the iris of the acquired image, where Path(M) is the index belonging to [1; N2] minimizing Cost(M, j) over j, with M an integer less than or equal to the first predetermined number N1,
o a mapping of at least one other ring of the reference iris of index M' with a ring of the iris of the acquired image of index Path(M'), where M' < M, as a function of a ring of the reference iris of index D already mapped to a ring of the iris of the acquired image of index Path(D), with M' = D - Z(D, Path(D), 1) and Path(M') = Path(D) + Z(D, Path(D), 2), the function Z being defined from R3 into R such that Z(i, j, 1) and Z(i, j, 2) are respectively the integers L in [li, ls] and k in [kmin, kmax] minimizing the function ProbT((i, j), (i - L, j + k)) * Cost(i - L, j + k).
The calculation of the minimized correspondence distance can be performed using the formula:
HDViterbi = min over t = -T..T of [ Σ_{i=1..N1} Dist(i, Path(i), t) / (1 + Σ_{i=1..N1} Mask(i, Path(i), t)) ]
with:
o T a predetermined maximum tolerance to a rotation between the acquired image and the reference image,
o IC1, IC2, IM1 and IM2 matrices of size N1 x (P * F) or N2 x (P * F) respectively corresponding to the code of the reference iris, to the iris code of the acquired image, and to the associated mask codes,
o Dist(i, j, t) = Σ_{p=1..P} Σ_{f=0..F-1} | IC1(i, p + f) - IC2(j, p + t + f) |,
o Mask(i, j, t) = Σ_{p=1..P} Σ_{f=0..F-1} IM1(i, p + f) * IM2(j, p + t + f).
The individual can thus be identified or authenticated by iris recognition by taking into account the non-linear nature of the iris displacement as well as the appearance or disappearance of certain areas of the iris as a function of pupil dilation, while taking advantage of the isotropic nature of this movement, without requiring specific learning for this individual. Each coefficient M(i, j) of said comparison matrix M can be the result of a transformation LT of the ratio between Dist(i, j, t) and Mask(i, j, t) for a fixed t. Such a transformation makes it possible to quantify the degree of resemblance between each of the rings of the reference iris and of the iris of the acquired image, without taking into account the zones of the images disturbed by the presence of obstacles or noise. Said concentric rings may be of various thicknesses.
This makes it possible to adapt the thickness of the rings to the radial variability of the iris texture, which can be homogeneous over a large radial distance in certain parts of the iris and vary rapidly radially on other portions of the iris.
[0007] At least two of said concentric rings may overlap. The matching of the rings of two images then makes it possible to take into account the disappearance or the appearance in an iris of a crown of thickness less than that of a ring. The probabilities Prob and ProbT can be determined by automatic learning or by using a model of distortion of the texture of the iris. The Prob and ProbT probabilities can be a function of the level of dilation of the pupil. This makes it possible to use functions adapted to the dilation of the pupil found in the acquired image and to obtain functions more suitable than generic functions that would be defined whatever the level of dilation.
[0008] According to a second aspect, the present invention relates to a computer program product comprising code instructions for the execution of a method for identifying and / or authenticating an individual according to the first aspect when this program is executed by a processor.
[0009] According to a third aspect, the present invention relates to a data processing device characterized in that it is capable of being connected to a storage device storing at least one binary code of a reference iris obtained from a reference iris image divided into a first predetermined number N1 of concentric rings, and an associated reference mask binary code identifying regions of the reference iris in which eyelids, eyelashes, reflections or other sources of noise have been detected, and in that it comprises:
- a communication interface configured to receive an acquired image of an eye of an individual to be identified and/or authenticated;
- a computer configured to:
o segment the image of the received eye so as to isolate a region containing the texture of the iris and to determine an associated mask,
o divide said region containing the iris texture and said mask into a second number N2 of concentric rings,
o normalize said region containing the texture of the iris and said mask from Cartesian coordinates into polar coordinates,
o determine an iris binary code IC of size equal to N2 * P * F encoding said region containing the normalized iris texture and a mask binary code IM of size equal to N2 * P * F encoding the associated mask, by applying a third predetermined number F of Gabor filters at each of a fourth predetermined number P of positions of each ring,
o match at least one ring of the iris of the acquired image with a ring of one of said reference irises, so as to minimize a correspondence distance (HDViterbi) between the acquired image and the reference image, said correspondence distance being obtained from the determined iris code and the stored reference and associated mask codes by calculating a sum of distances (Dist) between each of said iris rings of the matched acquired image and the ring of the reference iris corresponding thereto,
o calculate said minimized correspondence distance,
o compare said minimized correspondence distance with a predetermined threshold.
According to a fourth aspect, the present invention relates to a system for identifying and/or authenticating an individual comprising:
- a storage device storing at least one binary code of a reference iris obtained from a reference iris image divided into a first predetermined number N1 of concentric rings, and an associated reference mask binary code identifying regions of the reference iris in which eyelids, eyelashes, reflections or other sources of noise have been detected,
- an image capture device configured to acquire an image of an eye of the individual to be identified and/or authenticated,
- a data processing device according to the third aspect, configured to minimize a correspondence distance (HDViterbi) between the acquired image and the reference image and to identify and/or authenticate the individual if said minimized correspondence distance is less than a predetermined threshold.
Such a computer program product, data processing device and identification and/or authentication system of an individual have the same advantages as those mentioned for the method according to the first aspect.
[0010] PRESENTATION OF THE FIGURES Other features and advantages will become apparent upon reading the following description of an embodiment. This description will be given with reference to the appended drawings, in which:
- FIG. 1 illustrates the implementation of a step of segmenting an image of an eye of an individual to be identified/authenticated;
- FIG. 2 illustrates the implementation of a step of dividing the texture of an iris into concentric rings and angular sectors;
- FIGs. 3a to 3c illustrate the implementation of a step of normalizing a region containing the texture of an iris;
- FIG. 4 schematically illustrates hardware means in a system for identifying and/or authenticating an individual according to one embodiment of the invention;
- FIG. 5 is a diagram schematizing a method for identifying and/or authenticating an individual according to an embodiment of the invention;
- FIGs. 6a to 6c represent images of an eye of an individual having different pupil dilations;
- FIGs. 6d and 6e show examples of comparison matrices corresponding to the comparison of the two eye images having the pupil dilations shown respectively in FIGs. 6a and 6b and in FIGs. 6a and 6c;
- FIG. 7 illustrates the implementation of the determination of a path in said Viterbi trellis.
DETAILED DESCRIPTION An embodiment of the invention relates to a method for identifying and/or authenticating an individual 1 implemented by a system 100 for identifying and/or authenticating an individual, illustrated in FIG. 4. Such a system 100 comprises an image capture device 2 dedicated to the acquisition of biometric data used to identify or authenticate the individual 1. Such biometric data may notably consist of an image of an eye of the individual 1. The image capture device 2 may in particular consist of a still camera or a video camera capturing an image in the visible range or in the infrared domain. The system 100 further comprises a storage device 3. Such a storage device is configured to store at least one binary code of a reference iris obtained from a reference iris image divided into a first predetermined number N1 of concentric rings, and an associated reference mask binary code identifying regions of the reference iris in which eyelids, eyelashes, reflections or other sources of noise have been detected. It can be any type of non-volatile memory such as a flash memory or a hard disk. Such a storage device can store such binary codes in the form of a database.
[0011] These reference iris images are iris images of individuals registered in the system in order to be identified or authenticated by it. These reference iris images were processed to obtain the corresponding reference iris codes. The processing applied to these images is the same as that described below and applied to the image of the iris of the individual to be authenticated acquired by the image capture device 2. The system 100 also comprises a data processing device 4 configured to process the image of the iris of the individual 1 acquired by the image capture device 2 in order to deduce therefrom a corresponding iris code and to compare it with the stored reference iris codes. Such processing notably comprises cutting the iris into a second predetermined number N2 of concentric rings. Said processing device is connected to the image capture device 2 and to the storage device 3. Such a data processing device may comprise a computer 5 and a communication interface 6. Such a computer 5 may consist of a processor or microprocessor, of the x86 or RISC type for example, a controller or microcontroller, a DSP, an integrated circuit such as an ASIC or a programmable circuit such as an FPGA, a combination of such elements or any other combination of components making it possible to implement the calculation steps of the method described below. Such a communication interface 6 may be any interface, analog or digital, allowing the computer 5 to exchange information with the other elements of the system 100 described above. Such an interface may for example be an RS232 serial interface, a USB, Firewire or HDMI interface, or a wired or wireless communication network interface of Ethernet or Wifi type. In one embodiment, said processing device, the image capture device 2 and the storage device 3 may be integrated in a common housing, for example in the case of integration into a mobile phone or into an integrated access control system controlling the opening of an exit. Alternatively, one or more of these devices may be remote from the other device(s), for example if the image capture device 2 consists of a surveillance camera connected via a communication network to a processing device consisting of a central server of a computer system. The invention proposes to improve the comparison of the image of the iris of an individual to be identified or authenticated with an image of a reference iris by determining an optimal matching between the rings of the iris in an acquired image of the eye of the individual and the rings of the reference iris. Such a matching makes it possible to take into account the distortion of the iris between the acquired image and the reference image due to the dilation of the pupil. This dilation may itself be the consequence of a difference in light exposure between the captures of the two images. The method does not necessarily match all the rings of the two irises, in order to take into account any appearance or disappearance of rings during the distortion of the iris. More specifically, such an optimal matching is determined so as to minimize a distance, called the correspondence distance (HDViterbi), between the acquired image and the reference image. This distance is calculated from the iris code determined for the acquired image and the reference iris code, taking into account the mapping chosen between the iris rings. Such a mapping is determined in one embodiment using a Viterbi algorithm.
[0012] The method of identification and/or authentication can be implemented according to the steps shown in FIG. 5. During an acquisition step 101, an image of an eye of the individual 1 is acquired by the image capture device 2. This image can be two-dimensional or three-dimensional, in color or in black and white. In the case of a digital image, it can be encoded according to various formats such as bitmap, jpeg, png, tiff, raw... In the case of a black and white image, it may have a certain number of gray levels. Then the data processing device implements steps of segmentation 102, division 103, normalization 104, binary code determination 105, mapping 106 and distance calculation 107. In the segmentation step 102, the image of the acquired eye is segmented so as to isolate a region containing the texture of the iris. This is, in FIG. 1, the region 7 between the circles. The iris region is thus isolated from the white of the eye 8 and from the pupil 9. In this step an associated mask identifying regions of the iris in which eyelids, eyelashes, reflections or other sources of noise have been detected can also be determined. Such a mask can in particular identify a region of the eye in which a light spot is reflected, used to impose a predetermined light exposure during the acquisition of the image of the eye. In a division step 103, said region containing the iris texture and said mask may be divided into N2 concentric rings as shown in FIG. 2. These concentric rings may be of constant or varied thicknesses. These concentric rings can also overlap. In this case, the matching of the rings of two images makes it possible to take into account the disappearance or the appearance in an iris of a crown of thickness less than that of a ring. In a normalization step 104, the region containing the texture of the iris and said mask resulting from the segmentation and division steps 102 and 103 are normalized into a fixed-size rectangular image, thereby transforming the Cartesian coordinates of the original image into polar coordinates, as represented in FIGs. 3a to 3c, according to the polar transformation proposed by Daugman: I(x(r, θ), y(r, θ)) -> I(r, θ), with I the acquired iris image. In this normalized image, the subdivision rings of the iris form lines.
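As a minimal sketch of the division step 103, the ring boundaries can for example be defined as follows; the linear radial spacing and the overlap parameter are illustrative choices, not values taken from the patent:

```python
def ring_boundaries(r_pupil, r_iris, n_rings, overlap=0.0):
    """Return (inner radius, outer radius) pairs for N2 concentric rings.

    With overlap > 0, consecutive rings share a fraction of their thickness;
    replacing the linear spacing below would give rings of varied thicknesses.
    """
    edges = [r_pupil + k * (r_iris - r_pupil) / n_rings for k in range(n_rings + 1)]
    rings = []
    for k in range(n_rings):
        thickness = edges[k + 1] - edges[k]
        r_in = max(r_pupil, edges[k] - overlap * thickness)
        r_out = min(r_iris, edges[k + 1] + overlap * thickness)
        rings.append((r_in, r_out))
    return rings
```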
[0013] In order to transform this normalized image into a binary code, referred to as the iris code, a filtering by a third predetermined number F of filters can be applied, during a binary code determination step 105, at each position among a fourth predetermined number P of positions of each ring. The iris may for example be angularly subdivided into P angular sectors and such filtering may then be applied to each zone corresponding to the intersection of a ring and an angular sector. In the normalized image of the iris each of these angular sectors constitutes a column. Such filters may be Gabor filters, characterizing the variations in brightness of the texture of the iris in the zone to which they are applied. A binary value 0 or 1 can then be assigned to each position according to the sign of the result of the filtering at this position. A similar operation can be performed on the mask associated with the acquired image. An iris binary code IC of size equal to N2 * P * F encoding said region containing the normalized and filtered iris texture can thus be determined. A mask binary code IM of size equal to N2 * P * F encoding the associated mask may also be determined. These binary codes may be in the form of two matrices of size N2 x (P * F), each line of which corresponds to a ring (as in the normalized image) and each group of P columns of which corresponds to an angular sector of the iris image. Line 1 of each of these matrices may for example be associated with the ring closest to the pupil. In one embodiment, N1 may be equal to N2 and may for example be equal to 8. By way of example, the iris may be subdivided into 32 rings and 64 angular sectors and the number of filters applied at each position may be equal to 4. Then, in a mapping step 106, at least one ring of the iris of the acquired image is mapped to a ring of one of said reference irises, so as to minimize a distance, called the correspondence distance (HDViterbi), between the acquired image and the reference image, said correspondence distance being obtained from the determined iris code and the stored reference and associated mask codes by calculating a sum of distances (Dist) between each of said at least one ring of the iris of the matched acquired image and the ring of the reference iris corresponding thereto. The mapping step 106 may implement a Viterbi algorithm.
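A sketch of the binary code determination step 105 is given below; the shape and wavelengths of the Gabor-like kernels and the (position, filter) column ordering are illustrative assumptions, not values taken from the patent:

```python
import numpy as np

def iris_code(norm_iris, norm_mask, n_positions=64, n_filters=4):
    """Compute an iris code IC and a mask code IM of size N2 x (P * F).

    Each ring (row of the normalized image) is filtered by F complex
    Gabor-like kernels at P angular positions; one bit is kept per position
    and per filter according to the sign of the real part of the response.
    """
    n_rings, width = norm_iris.shape
    half = 8                                   # half-width of the 1-D kernel support
    x = np.arange(-half, half + 1)
    kernels = [np.exp(-x ** 2 / (2.0 * (2.0 * (f + 1)) ** 2)) *
               np.exp(2j * np.pi * x / (4.0 * (f + 1)))
               for f in range(n_filters)]
    ic = np.zeros((n_rings, n_positions * n_filters), dtype=np.uint8)
    im = np.zeros((n_rings, n_positions * n_filters), dtype=np.uint8)
    for i in range(n_rings):
        ring = norm_iris[i].astype(float)
        mask = norm_mask[i].astype(bool)
        for p in range(n_positions):
            center = (p * width) // n_positions
            idx = (center + x) % width         # circular indexing along the ring
            for f, kernel in enumerate(kernels):
                response = np.sum(ring[idx] * kernel)
                ic[i, p * n_filters + f] = 1 if response.real > 0 else 0
                im[i, p * n_filters + f] = 1 if mask[idx].all() else 0
    return ic, im
```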
[0014] More specifically, the mapping step 106 comprises the following steps. A comparison matrix M of size N1 * N2 is first calculated 1061. This comparison matrix reflects the degree of resemblance of each of the N2 rings of the iris of the acquired image with the N1 rings of the reference iris to which it is compared. Each coefficient M(i, j) of said comparison matrix encodes the result of the comparison between the ith ring of the stored reference iris and the jth ring of the iris of the acquired image by means of said determined and reference iris codes and the associated mask codes. Each coefficient M(i, j) of said comparison matrix M can be the result of a transformation LT of the ratio between Dist(i, j, t) and Mask(i, j, t) for a fixed t, where:
Dist(i, j, t) = Σ_{p=1..P} Σ_{f=0..F-1} | IC1(i, p + f) - IC2(j, p + t + f) |
Mask(i, j, t) = Σ_{p=1..P} Σ_{f=0..F-1} IM1(i, p + f) * IM2(j, p + t + f)
with IC1, IC2, IM1 and IM2 matrices of size N1 x (P * F) or N2 x (P * F) respectively corresponding to the code of the reference iris, to the code of the iris of the acquired image, and to the associated mask codes. The rotation parameter t makes it possible to take into account a rotation of the iris around the axis of symmetry of the eye passing through the center of the pupil between the acquired image and the reference image, for example due to a different inclination of the head of the individual. Such a matrix can be calculated for each value of the rotation parameter t to be taken into account, for example for all the values of t that are multiples of F included in the interval [-T; T], with T a predetermined maximum tolerance to a rotation between the acquired image and the reference image. The result of the LT transformation may correspond to a gray level representative of the level of resemblance between the compared rings. For example, the transformation may assign the value 255 to a zero distance between the compared rings and a zero value when the ratio between Dist(i, j, t) and Mask(i, j, t) reaches 0.5. The more similar two compared rings are, the closer to white the gray level associated with their comparison will be. Two examples of comparison matrices are shown in FIG. 6d and FIG. 6e, FIG. 6d corresponding to the comparison of the two eye images having the pupil dilations shown in FIG. 6a and FIG. 6b, and FIG. 6e corresponding to the comparison of the eye image of FIG. 6a with the eye image of FIG. 6c, which has a much more pronounced pupil dilation. In FIG. 6d, the whitest pixels, corresponding to the smallest distances between rings, are mainly located along the diagonal of the matrix. This is a consequence of the low distortion between the irises of FIGs. 6a and 6b. Each zone of the iris of the acquired image then corresponds approximately to the zone of the reference iris placed at the same position in the reference image. Conversely, the lightest areas of FIG. 6e are not located on the diagonal of the image, because of the large difference in pupil dilation between the images 6a and 6c.
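A direct transcription of the Dist and Mask sums and of one possible LT transformation is sketched below; the rotation t is applied here as a circular shift of the columns of the acquired-image codes, and the linear ramp used for LT (255 at zero distance, 0 at a ratio of 0.5 or more) is only one plausible choice consistent with the example above:

```python
import numpy as np

def dist_and_mask(ic1, im1, ic2, im2, i, j, t):
    """Dist(i, j, t) and Mask(i, j, t) for one pair of rings (illustrative)."""
    row2 = np.roll(ic2[j].astype(int), -t)     # shift acquired-image code by t columns
    m2 = np.roll(im2[j].astype(int), -t)
    dist = int(np.sum(np.abs(ic1[i].astype(int) - row2)))
    mask = int(np.sum(im1[i].astype(int) * m2))
    return dist, mask

def comparison_matrix(ic1, im1, ic2, im2, t):
    """Comparison matrix M (size N1 x N2) for a fixed rotation t."""
    n1, n2 = ic1.shape[0], ic2.shape[0]
    m = np.zeros((n1, n2), dtype=np.uint8)
    for i in range(n1):
        for j in range(n2):
            dist, mask = dist_and_mask(ic1, im1, ic2, im2, i, j, t)
            ratio = dist / mask if mask > 0 else 1.0
            # LT: 255 for identical rings, 0 once the normalized distance reaches 0.5
            m[i, j] = int(round(255 * max(0.0, 1.0 - 2.0 * ratio)))
    return m
```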
Then a Viterbi lattice is constructed 1062, in which each mapping of a ring i of the reference iris and a ring j of the iris of the acquired image has the cost:
for i = 1: Cost(i, j) = Prob(i, j) * M(i, j)
for all i > 1: Cost(i, j) = Prob(i, j) * M(i, j) + min over L = li..ls and k = kmin..kmax of [ ProbT((i, j), (i - L, j + k)) * Cost(i - L, j + k) ]
with:
o li, ls, kmin, kmax predetermined relative integers, with i - L >= 1 and 0 < j + k <= N2,
o Prob(i, j) a predetermined probability of correspondence of the ith ring of the reference iris and the jth ring of the iris of the acquired image,
o ProbT((i, j), (p, q)) a predetermined probability of correspondence of the pth ring of the reference iris and the qth ring of the iris of the acquired image knowing the mapping of the ith ring of the reference iris and the jth ring of the iris of the acquired image,
the function ProbT being moreover defined so as to return a high fixed value for the combinations of L and k corresponding to non-physical cases, such as L = 1 and k = 1, corresponding to crossed correspondences, for example the ring j + 1 of the iris of the acquired image put in correspondence with the ring i - 1 of the reference iris while the ring j of the iris of the acquired image has been mapped to the ring i of the reference iris. Preferably L = li = ls = 1 and (kmin, kmax) = (0, 1) or (1, 0). The cost associated with the matching of two rings (i, j) is thus a function of the intrinsic probability Prob(i, j) that the ith ring of the reference iris has moved to the position of the ring j of the iris of the acquired image due to the distortion of the iris caused by the dilation of the pupil in the acquired image. This cost also reflects the degree of similarity between these rings through M(i, j). This cost also takes into account the correspondence established for a preceding ring of the reference iris, so as to respect an overall coherence in the mappings and to avoid correspondences that are numerically attractive but correspond to no physical reality, for example the mapping of the ith ring of the reference iris with a very central ring of the iris of the acquired image while the i-1th ring of the reference image has been associated with a peripheral ring of the iris of the acquired image.
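The cost recursion above can be transcribed directly; in this sketch indices are 0-based, Prob and ProbT are supplied as callables (for example learned tables wrapped in functions), and the default bounds correspond to the preferred values li = ls = 1 and (kmin, kmax) = (0, 1):

```python
import numpy as np

def viterbi_costs(m, prob, prob_t, l_min=1, l_max=1, k_min=0, k_max=1):
    """Fill the Viterbi lattice Cost(i, j) from the comparison matrix m (N1 x N2)."""
    n1, n2 = m.shape
    cost = np.zeros((n1, n2))
    for j in range(n2):
        cost[0, j] = prob(0, j) * m[0, j]           # first reference ring: no predecessor
    for i in range(1, n1):
        for j in range(n2):
            best = np.inf
            for l in range(l_min, l_max + 1):
                for k in range(k_min, k_max + 1):
                    if i - l >= 0 and 0 <= j + k < n2:
                        c = prob_t((i, j), (i - l, j + k)) * cost[i - l, j + k]
                        best = min(best, c)
            cost[i, j] = prob(i, j) * m[i, j] + (best if best < np.inf else 0.0)
    return cost
```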
[0015] The Prob and ProbT functions can be determined by automatic learning on a database of images of irises photographed at different stages of dilation. Alternatively, these functions can be determined analytically using an iris texture distortion model such as the Clark model cited above. The Prob and ProbT functions can be tabulated according to the level of pupil dilation. Such a Viterbi lattice can be constructed for each of the comparison matrices M calculated for the different values of the rotation parameter t. Then a path in each of said Viterbi lattices is determined 1063 by:
o mapping the Mth ring of the reference iris with the Path(M)th ring of the iris of the acquired image, where Path(M) is the index belonging to [1; N2] minimizing Cost(M, j) over j, with M an integer less than or equal to the first predetermined number N1, that is to say Path(M) = Argmin_j(Cost(M, j)), with Argmin(x_j) the function returning the index of the minimum value among the series of values x_j. This Mth ring of the reference iris will be the outermost ring of the reference iris mapped to a ring of the iris of the acquired image;
o mapping at least one other ring of the reference iris of index M' with a ring of the iris of the acquired image of index Path(M'), where M' < M, as a function of a ring of the reference iris of index D already mapped to a ring of the iris of the acquired image of index Path(D), with M' = D - Z(D, Path(D), 1) and Path(M') = Path(D) + Z(D, Path(D), 2), the function Z being defined from R3 into R such that Z(i, j, 1) and Z(i, j, 2) are respectively the integers L in [li, ls] and k in [kmin, kmax] minimizing the function ProbT((i, j), (i - L, j + k)) * Cost(i - L, j + k), that is to say (k, L) = Argmin over k = kmin..kmax and L = li..ls of ProbT((i, j), (i - L, j + k)) * Cost(i - L, j + k).
Preferably li = ls = 1. We then have: M' = D - 1 and Path(M') = Path(D) + k, with k minimizing ProbT[(D, Path(D)), (D - 1, Path(D) + k)] * Cost(D - 1, Path(D) + k). This mapping can be determined successively for several rings of the reference iris, in descending order of their indices, until i = 1 or until Path(i) = 1 (the ith ring of the iris of the acquired image, respectively of the reference image, then corresponds to the central ring of the reference iris, respectively of the acquired image, which means that the dilation of the pupil in the reference image, respectively in the acquired image, has made all the rings corresponding to the rings 1 to i-1 of the acquired image, respectively of the reference image, disappear).
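The path determination of step 1063 can then be sketched as a backtracking over the lattice; indices are again 0-based and the stopping condition is simplified here to reaching the innermost reference ring or running out of admissible (L, k) candidates:

```python
def viterbi_path(cost, prob_t, l_min=1, l_max=1, k_min=0, k_max=1):
    """Return a mapping {reference ring index: acquired-image ring index}.

    Starts from the outermost reference ring, then at each step picks the
    (L, k) pair minimizing ProbT * Cost, as the Z function does above.
    """
    n1, n2 = cost.shape
    i = n1 - 1
    j = int(min(range(n2), key=lambda q: cost[i, q]))   # Path(M) = argmin_j Cost(M, j)
    path = {i: j}
    while i > 0:
        candidates = []
        for l in range(l_min, l_max + 1):
            for k in range(k_min, k_max + 1):
                if i - l >= 0 and j + k < n2:
                    candidates.append((prob_t((i, j), (i - l, j + k)) * cost[i - l, j + k], l, k))
        if not candidates:
            break
        _, l, k = min(candidates)
        i, j = i - l, j + k
        path[i] = j
    return path
```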
[0016] This Viterbi path makes it possible to determine a correspondence minimizing the total cost of the path and thus corresponding to the most probable matching between the rings of the compared irises, given their similarity and the correspondence probabilities Prob(i, j) and ProbT((i, j), (p, q)). Such a path is illustrated in FIG. 7, in which the circles correspond to the rings of the iris of the acquired image and of the reference iris, and in which each cross represents the mapping between one of said rings of the iris of the acquired image and one of said rings of the reference iris. The correspondence distance minimized by this mapping can be a Hamming distance calculated during a calculation step 107 using the following formula:
HDViterbi = min over t = -T..T of [ Σ_{i=1..N1} Dist(i, Path(i), t) / (1 + Σ_{i=1..N1} Mask(i, Path(i), t)) ]
with T a predetermined maximum tolerance to a rotation between the acquired image and the reference image. The correspondence distance thus obtained between the two iris images is minimized by the Viterbi path chosen in the Viterbi graph so as to retain the most probable correspondence between the rings of the compared images. Finally, during an identification and/or authentication step 108, the individual whose eye image has been acquired is identified and/or authenticated, as a function of the minimized correspondence distance obtained during the calculation step 107. The individual can be identified or authenticated if this minimized correspondence distance is less than a predetermined threshold. The individual whose image of the iris has been acquired can thus be identified or authenticated by recognizing his iris while taking into account the non-linear distortion as well as the appearance or disappearance of certain zones of the iris as a function of the dilation of the pupil, while taking advantage of the isotropic nature of this movement, without requiring specific learning for this individual.
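Finally, the minimized correspondence distance can be sketched as follows, reusing the dist_and_mask helper from the earlier sketch; path_for_t is assumed to be a callable returning the ring mapping obtained for a given rotation t, and the sums are taken over the mapped reference rings:

```python
def viterbi_hamming_distance(ic1, im1, ic2, im2, path_for_t, t_max):
    """Minimum over t of sum(Dist) / (1 + sum(Mask)) along the Viterbi path."""
    best = float("inf")
    for t in range(-t_max, t_max + 1):
        path = path_for_t(t)                   # {reference ring i: acquired ring Path(i)}
        dist_sum, mask_sum = 0, 0
        for i, j in path.items():
            d, m = dist_and_mask(ic1, im1, ic2, im2, i, j, t)  # defined in the earlier sketch
            dist_sum += d
            mask_sum += m
        best = min(best, dist_sum / (1 + mask_sum))
    return best
```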
Claims:
Claims (12)
[0001]
CLAIMS 1. A method of identifying and/or authenticating an individual (1) comprising the following steps implemented by an identification and/or authentication system (100) of an individual, said identification and/or authentication system of an individual comprising a storage device (3) storing at least one binary code of a reference iris obtained from a reference iris image divided into a first predetermined number N1 of concentric rings, and an associated reference mask binary code identifying regions of the reference iris in which eyelids, eyelashes, reflections or other sources of noise have been detected:
- acquisition (101) of an image of an eye of the individual to be identified and/or authenticated,
- segmentation (102) of the image of the acquired eye so as to isolate a region containing the texture of the iris and to determine an associated mask,
- division (103) of said region containing the iris texture and said mask into a second number N2 of concentric rings,
- normalization (104) of said region containing the texture of the iris and said mask from Cartesian coordinates into polar coordinates,
- determination (105) of an iris binary code IC of size equal to N2 * P * F encoding said region containing the normalized iris texture and of a mask binary code IM of size equal to N2 * P * F encoding the associated mask, by applying a third predetermined number F of Gabor filters at each of a fourth predetermined number P of positions of each ring,
- mapping (106) of at least one ring of the iris of the acquired image with a ring of one of said reference irises, so as to minimize a correspondence distance (HDViterbi) between the acquired image and the reference image, said correspondence distance being obtained from the determined iris code and the stored reference and associated mask codes by calculating a sum of distances (Dist) between each of said rings of the iris of the matched acquired image and the ring of the reference iris corresponding thereto,
- calculation (107) of said minimized correspondence distance,
- identification and/or authentication (108) of the individual if said minimized correspondence distance is less than a predetermined threshold.
[0002]
2. Method of identification and/or authentication according to the preceding claim, wherein the mapping step (106) implements a Viterbi algorithm.
[0003]
3. Identification and/or authentication method according to the preceding claim, wherein the mapping step (106) comprises:
- a calculation (1061) of a comparison matrix M of size N1 * N2, each coefficient M(i, j) of said comparison matrix coding the result of the comparison between the ith ring of the stored reference iris and the jth ring of the iris of the acquired image with the aid of said determined and reference iris codes and the associated mask codes,
- a construction of a Viterbi lattice (1062) in which each mapping of a ring i of the reference iris and a ring j of the iris of the acquired image has for cost:
for i = 1: Cost(i, j) = Prob(i, j) * M(i, j)
for all i > 1: Cost(i, j) = Prob(i, j) * M(i, j) + min over L = li..ls and k = kmin..kmax of [ ProbT((i, j), (i - L, j + k)) * Cost(i - L, j + k) ]
with:
o li, ls, kmin, kmax predetermined relative integers, with i - L >= 1 and 0 < j + k <= N2,
o Prob(i, j) a predetermined probability of correspondence of the ith ring of the reference iris and the jth ring of the iris of the acquired image,
o ProbT((i, j), (p, q)) a predetermined probability of correspondence of the pth ring of the reference iris and the qth ring of the iris of the acquired image knowing the mapping of the ith ring of the reference iris and the jth ring of the iris of the acquired image,
- a determination of a path (1063) in said Viterbi lattice comprising:
o a mapping of the Mth ring of the reference iris with the Path(M)th ring of the iris of the acquired image, where Path(M) is the index belonging to [1; N2] minimizing Cost(M, j) over j, with M an integer less than or equal to the first predetermined number N1,
o a mapping of at least one other ring of the reference iris of index M' with a ring of the iris of the acquired image of index Path(M'), where M' < M, as a function of a ring of the reference iris of index D already mapped to a ring of the iris of the acquired image of index Path(D), with M' = D - Z(D, Path(D), 1) and Path(M') = Path(D) + Z(D, Path(D), 2), the function Z being defined from R3 into R such that Z(i, j, 1) and Z(i, j, 2) are respectively the integers L in [li, ls] and k in [kmin, kmax] minimizing the function ProbT((i, j), (i - L, j + k)) * Cost(i - L, j + k).
[0004]
4. Identification and/or authentication method according to the preceding claim, wherein the calculation (107) of the minimized correspondence distance is performed using the formula:
HDViterbi = min over t = -T..T of [ Σ_{i=1..N1} Dist(i, Path(i), t) / (1 + Σ_{i=1..N1} Mask(i, Path(i), t)) ]
with:
o T a predetermined maximum tolerance to a rotation between the acquired image and the reference image,
o IC1, IC2, IM1 and IM2 matrices of size N1 x (P * F) or N2 x (P * F) respectively corresponding to the code of the reference iris, to the iris code of the acquired image, and to the associated mask codes,
o Dist(i, j, t) = Σ_{p=1..P} Σ_{f=0..F-1} | IC1(i, p + f) - IC2(j, p + t + f) |,
o Mask(i, j, t) = Σ_{p=1..P} Σ_{f=0..F-1} IM1(i, p + f) * IM2(j, p + t + f).
[0005]
5. Identification and / or authentication method according to the preceding claim wherein each coefficient M (i, j) of said comparison matrix M is the result of an LT transformation of the ratio between Dist (i, j, t ) and Mask (i, j, t) for a fixed t.
[0006]
6. Identification and / or authentication method according to one of the preceding claims wherein said concentric rings are of varying thicknesses. 25
[0007]
7. The method of identification and/or authentication according to one of the preceding claims wherein at least two of said concentric rings overlap.
[0008]
8. The method of identification and/or authentication according to one of claims 2 to 6, wherein the probabilities Prob and ProbT are determined by automatic learning or by using a model of distortion of the texture of the iris.
[0009]
9. The method of identification and/or authentication according to one of claims 2 to 7, wherein the probabilities Prob and ProbT depend on the level of dilation of the pupil.
[0010]
A computer program product comprising code instructions for executing a method of identifying and / or authenticating an individual according to one of the preceding claims when the program is executed by a processor.
[0011]
11. A data processing device (4) characterized in that it is capable of being connected to a storage device (3) storing at least one binary code of a reference iris obtained from a reference iris image divided into a first predetermined number N1 of concentric rings, and an associated reference mask binary code identifying regions of the reference iris in which eyelids, eyelashes, reflections or other sources of noise have been detected, and in that it comprises:
- a communication interface (6) configured to receive an image of an eye of an individual to be identified and/or authenticated;
- a computer (5) configured to:
o segment the image of the received eye so as to isolate a region containing the texture of the iris and to determine an associated mask,
o divide said region containing the iris texture and said mask into a second number N2 of concentric rings,
o normalize said region containing the texture of the iris and said mask from Cartesian coordinates into polar coordinates,
o determine an iris binary code IC of size equal to N2 * P * F encoding said region containing the normalized iris texture and a mask binary code IM of size equal to N2 * P * F encoding the associated mask, by applying a third predetermined number F of Gabor filters at each position among a fourth predetermined number P of positions of each ring,
o match at least one ring of the iris of the acquired image with a ring of one of said reference irises so as to minimize a correspondence distance (HDViterbi) between the acquired image and the reference image, said correspondence distance being obtained from the determined iris code and the stored reference and associated mask codes by calculating a sum of distances (Dist) between each of said iris rings of the matched acquired image and the ring of the reference iris corresponding thereto,
o calculate said minimized correspondence distance,
o compare said minimized correspondence distance with a predetermined threshold.
[0012]
12. A system for identifying and/or authenticating (100) an individual comprising:
- a storage device (3) storing at least one binary code of a reference iris obtained from a reference iris image divided into a first predetermined number N1 of concentric rings, and an associated reference mask binary code identifying regions of the reference iris in which eyelids, eyelashes, reflections or other sources of noise have been detected,
- an image capture device (2) configured to acquire an image of an eye of the individual to be identified and/or authenticated,
- a data processing device (4) according to the preceding claim, configured to minimize a correspondence distance (HDViterbi) between the acquired image and the reference image, and to identify and/or authenticate the individual if said minimized correspondence distance is less than a predetermined threshold.
Similar technologies:
Publication number | Publication date | Patent title
EP3107035A1|2016-12-21|Method for identifying or authenticating an individual by means of iris recognition
US9836643B2|2017-12-05|Image and feature quality for ocular-vascular and facial recognition
CN108846390B|2020-02-14|Feature extraction and matching and template update for biometric authentication
US8744141B2|2014-06-03|Texture features for biometric authentication
US9672341B2|2017-06-06|Systems and methods for spoof detection in iris based biometric systems
WO2015180042A1|2015-12-03|Learning deep face representation
EP2140401B1|2011-05-25|Method of comparing images, notably for iris recognition, implementing at least one quality measurement determined by applying a statistical learning model
Seal et al.2018|À‐trous wavelet transform‐based hybrid image fusion for face recognition using region classifiers
Eid et al.2015|Development of Iris Security System Using Adaptive Quality-Based Template Fusion
Arora et al.2021|A robust framework for spoofing detection in faces using deep learning
Makwana et al.2011|Evaluation and analysis of illumination normalization methods for face recognition
Rathgeb et al.2012|Iris segmentation methodologies
Rathgeb et al.2012|State-of-the-Art in Iris Biometrics
Ipe et al.2019|CNN Based Periocular Recognition Using Multispectral Images
Rathgeb et al.2012|Iris Biometric Processing
Rathgeb et al.2012|Multi-stage Real-Time Iris Preprocessing
Rathgeb et al.2012|Eye Detection
FR3017732A1|2015-08-21|METHOD FOR IDENTIFYING AND / OR AUTHENTICATING AN INDIVIDUAL FROM ITS VENOUS NETWORK
WO2012175391A1|2012-12-27|Method of establishing a final score of similarity between images
Patent family:
Publication number | Publication date
US20160364611A1|2016-12-15|
TW201706907A|2017-02-16|
TWI710961B|2020-11-21|
FR3037422B1|2017-06-23|
CN106250810B|2021-11-23|
CN106250810A|2016-12-21|
JP2017004525A|2017-01-05|
US10078782B2|2018-09-18|
JP6820672B2|2021-01-27|
EP3107035A1|2016-12-21|
KR20160147675A|2016-12-23|
Cited references:
Publication number | Filing date | Publication date | Applicant | Patent title

US5291560A|1991-07-15|1994-03-01|Iri Scan Incorporated|Biometric personal identification system based on iris analysis|
US6229905B1|1997-03-26|2001-05-08|Oki Electric Industry Co., Ltd.|Animal identification based on irial granule analysis|
JP3586431B2|2001-02-28|2004-11-10|松下電器産業株式会社|Personal authentication method and device|
KR100954640B1|2002-02-05|2010-04-27|파나소닉 주식회사|Personal authentication method and device|
JP2004362619A|2002-11-07|2004-12-24|Matsushita Electric Ind Co Ltd|Iris-registering apparatus|
JP2006136666A|2004-11-15|2006-06-01|Asahi Kasei Corp|Device and method for body motion recognition, and program|
US8375218B2|2004-12-07|2013-02-12|Mitsubishi Electric Research Laboratories, Inc.|Pre-processing biometric parameters before encoding and decoding|
JP5028751B2|2005-06-09|2012-09-19|ソニー株式会社|Action recognition device|
CN101030244B|2006-03-03|2010-08-18|中国科学院自动化研究所|Automatic identity discriminating method based on human-body physiological image sequencing estimating characteristic|
US8023699B2|2007-03-09|2011-09-20|Jiris Co., Ltd.|Iris recognition system, a method thereof, and an encryption system using the same|
US20100202669A1|2007-09-24|2010-08-12|University Of Notre Dame Du Lac|Iris recognition using consistency information|
TWI459309B|2008-11-28|2014-11-01|Chi Mei Comm Systems Inc|System and method for identifying irises|
US8577094B2|2010-04-09|2013-11-05|Donald Martin Monro|Image template masking|
CN101894256B|2010-07-02|2012-07-18|西安理工大学|Iris identification method based on odd-symmetric 2D Log-Gabor filter|
FR2969344B1|2010-12-17|2013-01-04|Thales Sa|METHOD OF COMPARING IRIS IMAGES BY INTELLIGENT SELECTION OF TEXTURED AREAS|
US20150320311A1|2014-05-12|2015-11-12|Here Global B.V.|Method and apparatus for iris recognition using natural light|
CN104166848B|2014-08-28|2017-08-29|武汉虹识技术有限公司|A kind of matching process and system applied to iris recognition|
US9495590B1|2015-04-23|2016-11-15|Global Bionic Optics, Ltd.|Extended depth-of-field biometric system|
WO2018008934A2|2016-07-07|2018-01-11|Samsung Electronics Co., Ltd.|Adaptive quantization method for iris image encoding|
CN109964245A|2016-12-06|2019-07-02|深圳市大疆创新科技有限公司|System and method for correcting wide angle picture|
US10242449B2|2017-01-04|2019-03-26|Cisco Technology, Inc.|Automated generation of pre-labeled training data|
KR102279550B1|2017-01-23|2021-07-20|삼성전자주식회사|Authentication method and authentication apparatus using synthesized code for iris|
JP2020512634A|2017-03-24|2020-04-23|マジック リープ, インコーポレイテッドMagic Leap,Inc.|Iris code storage and reliability allocation|
CN109840461B|2017-11-28|2021-05-25|武汉真元生物数据有限公司|Identification method and device based on dynamic iris image|
KR102035230B1|2018-01-22|2019-10-22|보은에스앤씨 주식회사|Check valve for dry chemical fire extinguisher|
CN108961494A|2018-06-21|2018-12-07|南通大学|A kind of locker open method of iris recognition|
CN110781747B|2019-09-23|2022-03-01|杭州电子科技大学|Eyelash occlusion area pre-detection method based on coefficient of variation|
CN110728251B|2019-10-22|2020-05-12|吉林大学|Lightweight identity verification method based on enhanced universal iris feature label|
Legal status:
2016-05-24| PLFP| Fee payment|Year of fee payment: 2 |
2016-12-16| PLSC| Publication of the preliminary search report|Effective date: 20161216 |
2017-05-23| PLFP| Fee payment|Year of fee payment: 3 |
2018-05-25| PLFP| Fee payment|Year of fee payment: 4 |
2020-05-20| PLFP| Fee payment|Year of fee payment: 6 |
2021-05-19| PLFP| Fee payment|Year of fee payment: 7 |
Priority:
Application number | Filing date | Patent title
FR1555440A|FR3037422B1|2015-06-15|2015-06-15|METHOD FOR IDENTIFYING AND / OR AUTHENTICATING AN INDIVIDUAL BY RECOGNIZING IRIS|FR1555440A| FR3037422B1|2015-06-15|2015-06-15|METHOD FOR IDENTIFYING AND / OR AUTHENTICATING AN INDIVIDUAL BY RECOGNIZING IRIS|
JP2016115028A| JP6820672B2|2015-06-15|2016-06-09|How to identify and / or authenticate an individual by iris recognition|
US15/181,192| US10078782B2|2015-06-15|2016-06-13|Method for identifying and/or authenticating an individual by iris recognition|
TW105118580A| TWI710961B|2015-06-15|2016-06-14|Method and system for identifying and/or authenticating an individual and computer program product and data processing device related thereto|
EP16174661.5A| EP3107035A1|2015-06-15|2016-06-15|Method for identifying or authenticating an individual by means of iris recognition|
KR1020160074607A| KR20160147675A|2015-06-15|2016-06-15|Method for identifying and/or authenticating an individual by iris recognition|
CN201610424184.3A| CN106250810B|2015-06-15|2016-06-15|Method for identifying and/or authenticating an individual by iris recognition|