Patent Abstract:
Embodiments of the present application provide a biometric authentication, identification and detection method and device for a mobile terminal device and an electronic device, the method comprising: obtaining sensor data of a mobile terminal device when the mobile terminal device performs biometric authentication and identification; and determining whether an attack occurs based on the sensor data and an attack determination model; where the attack determination model is obtained based on training with sensor training data.
公开号:EP3693876A1
申请号:EP18887829.2
申请日:2018-09-07
公开日:2020-08-12
发明作者:Xinyi FU;Zhe Li
申请人:Alibaba Group Holding Ltd;
IPC主号:G06F3-00
Patent Description:
[0001] The present application relates to the field of computer software technologies, and in particular to a biometric authentication, identification and detection method and device for a mobile terminal device, and an electronic device.

Background
[0002] The security of a biometric authentication system in use has always been an issue that attracts widespread attention.
[0003] Taking a face recognition system as an example, the most common spoofing method in the face recognition system is impersonation attack or reproduction attack, namely an imposter invades the system by using a fake feature that has the same form of expression as a genuine one. At present, commonly used means of impersonation attack mainly include photos (or processed images), videos, 3D models, and the like.
[0004] Impersonation attacks on face recognition systems are mainly addressed by using liveness detection technologies. Existing liveness detection methods include 3D depth information analysis, detection of physiological behaviors such as blinking, facial expression and speech interaction, Fourier frequency analysis, thermal infrared radiation (IR), and the like. The detection of physiological behaviors such as blinking and the facial expression and speech interaction typically require a user to blink, open the mouth, or read a randomly generated passage of speech. These liveness detection methods are effective, but depend on cooperation by the user.

Summary
[0005] The objective of embodiments of the present application is to provide a biometric authentication, identification and detection method and apparatus for a mobile terminal device, and an electronic device, which can strengthen the security of biometric authentication and identification with reduced dependence on, or without dependence on, user cooperation. The embodiments of the present application provide better security support for users while enhancing user experience.
[0006] To solve the above technical problems, the embodiments of the present application are implemented as follows.
[0007] According to a first aspect, a biometric authentication, identification and detection method for a mobile terminal device is provided, and the method comprises: obtaining sensor data of the mobile terminal device when the mobile terminal device performs biometric authentication and identification; and determining whether an attack occurs based on the sensor data and an attack determination model; where the attack determination model is obtained based on training with sensor training data.
[0008] According to a second aspect, a biometric authentication, identification, and detection apparatus is provided, and the apparatus comprises: an obtaining unit configured to obtain sensor data of a mobile terminal device when the mobile terminal device performs biometric authentication and identification; and an attack determining unit configured to determine whether an attack occurs based on the sensor data and an attack determination model; where the attack determination model is obtained based on training with sensor training data.
[0009] According to a third aspect, an electronic device is provided, comprising: a processor; and a memory configured to store computer executable instructions, where the executable instructions cause, when executed, the processor to execute the following operations: obtaining sensor data of a mobile terminal device when the mobile terminal device performs biometric authentication and identification; and determining whether an attack occurs based on the sensor data and an attack determination model; where the attack determination model is obtained based on training with sensor training data.
[0010] According to a fourth aspect, a computer readable storage medium is provided, where the computer readable storage medium stores one or more programs, and the one or more programs, when executed by an electronic device comprising a plurality of applications, cause the electronic device to execute the following operations: obtaining sensor data of a mobile terminal device when the mobile terminal device performs biometric authentication and identification; and determining whether an attack occurs based on the sensor data and an attack determination model; where the attack determination model is obtained based on training with sensor training data.
[0011] From the technical solutions provided by the embodiments of the present application, it can be seen that the technical solutions according to the embodiments of the present application can at least achieve one of the following technical effects.
[0012] Sensor data collected by a mobile terminal device during biometric authentication and identification is obtained, and attack determination is performed on the sensor data based on a sensor classification model. As a result, with reduced dependence or without dependence on user cooperation, the effect of strengthening the security of biometric authentication and identification can be achieved. The embodiments of the present application provide better security support for users while enhancing user experience.

Brief Description of the Drawings
[0013] To more clearly describe the technical solutions of the embodiments of the present application or of the current technologies, the accompanying drawings to be used in the description of the embodiments or the current technologies will be briefly described below. Apparently, the accompanying drawings in the description below are merely some of the embodiments of the present application. Other drawings may be obtained by one of ordinary skill in the art without creative effort according to these accompanying drawings.
FIG. 1 is a flow chart of a biometric authentication, identification and detection method for a mobile terminal device according to some embodiments of the present application.
FIG. 2 is a flow chart of interactions in biometric authentication, identification and detection by a mobile terminal device according to some embodiments of the present application.
FIG. 3 is a schematic flow chart of training and prediction for biometric authentication and identification according to some embodiments of the present application.
FIG. 4 is a schematic structural diagram of an electronic device according to some embodiments of the present application.
FIG. 5 is a schematic structural diagram of a biometric authentication, identification, and detection apparatus according to some embodiments of the present application.

Detailed Description
[0014] To enable those skilled in the art to better understand the technical solutions of the present application, the technical solutions of the embodiments of the present application will be completely and thoroughly described below with reference to the accompanying drawings in the embodiments of the present application. Apparently, the described embodiments are merely some, but not all, embodiments of the present application. All other embodiments obtained, based on some embodiments of the present application, by one of ordinary skill in the art without creative effort shall fall within the protection scope of the present application.
[0015] FIG. 1 is a flow chart of a biometric authentication, identification and detection method for a mobile terminal device according to some embodiments of the present application. The method shown in FIG. 1 may be implemented by a biometric authentication, identification, and detection apparatus or a biometric authentication, identification and detection application on the mobile terminal device. In other words, the method shown in FIG. 1 may be implemented by the mobile terminal device.
[0016] S110, obtaining sensor data of a mobile terminal device when the mobile terminal device performs biometric authentication and identification.
[0017] It should be understood that, in the embodiments of the present application, there may be various authentication and identification methods for biometric authentication and identification, such as facial feature authentication and identification, speech feature authentication and identification, gesture input feature authentication and identification, etc. The solutions of the embodiments of the present application may be applicable to various biometric authentication and identification, which is not limited by the embodiments of the present application.
[0018] It should be understood that the time length for collecting the sensor data may be the same or different for users having different risk levels. Optionally, in the step S110, a desired time length for collecting the sensor data may be determined based on a risk level of a user. A higher risk level requires a longer time length for sensor data collection. For example, it is assumed that user risk levels are divided into three types, i.e., low risk users, regular risk users, and high risk users. For users at a low risk level, it may be prescribed that the collection time length is 3 seconds; for users at a regular risk level, it may be prescribed that the collection time length is 5 seconds; for users at a high risk level, it may be prescribed that the collection time length is 7 seconds, and the like.
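As a simple illustration (not prescribed by the embodiments), such a mapping from user risk level to collection time length could be expressed as a small lookup table. In the following Python sketch, the level names, the default duration, and the function name are assumptions introduced only for illustration.

```python
# Illustrative sketch only: mapping the example risk levels above to sensor
# collection durations (3 s / 5 s / 7 s). Names and the default value are assumptions.
RISK_COLLECTION_SECONDS = {
    "low": 3.0,       # low risk users
    "regular": 5.0,   # regular risk users
    "high": 7.0,      # high risk users: longest collection window
}

def collection_seconds(risk_level: str) -> float:
    """Return the desired sensor-collection time length for a user's risk level."""
    return RISK_COLLECTION_SECONDS.get(risk_level, 5.0)  # fall back to the regular duration
```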
[0019] FIG. 2 is a schematic diagram of a six-dimensional parameter of the mobile terminal device according to some embodiments of the present application. It should be understood that a three-axis gyroscope may sense comprehensive dynamic information of the mobile terminal device, including the left and right angle of inclination (roll), the front and back angle of inclination (pitch), and the left and right swing angle (yaw); and a three-axis accelerometer may sense acceleration of the mobile terminal device in three directions in a 3D space (the x-axis, y-axis, and z-axis), i.e., the three directions of front and back, left and right, and up and down. The three-axis gyroscope is configured to detect angular rotation and balance, and the data measured thereby may be referred to as gyroscope data. The three-axis accelerometer is configured to detect lateral acceleration, and the data measured thereby may be referred to as acceleration sensor data. The three-axis gyroscope and the three-axis accelerometer may be collectively referred to as a six-axis gyroscope.
[0020] It should be understood that, when the sensor data of the mobile terminal device is obtained in the embodiments of the present application, the gyroscope data and/or the acceleration sensor data as shown in FIG. 2 may be obtained.
[0021] S120, determining whether an attack occurs based on the sensor data and an attack determination model.
[0022] Here, the attack determination model is obtained based on training with sensor training data.
[0023] It should be understood that, when the user performs biometric authentication and identification in the embodiments of the present application, sensor data may be different for a normal user and an imposter user. Taking face authentication and identification as an example, if the imposter user performs authentication and identification by a photo (or a processed image), a video, a 3D model, or other impersonation attack means, the sensor data at the time of authentication thereof is apparently different from sensor data at the time of authentication and identification by the normal user. On the basis of this concept, whether an attack occurs during the process of biometric authentication and identification corresponding to the sensor data may be determined based on the sensor data and an attack determination model.
[0024] It should be understood that determining whether an attack occurs based on the sensor data and an attack determination model may, for example, comprise using the sensor data directly as an input to the attack determination model to determine whether an attack occurs, or performing data processing on the sensor data to obtain processed sensor data and then using the processed sensor data as an input to the attack determination model to determine whether an attack occurs.
[0025] For example, an output result of the attack determination model may be a probability that an attack occurs, a range of probabilities that an attack occurs, a risk level of attack occurrence, or directly an indication of whether an attack occurs. Based on the output result, it may be directly determined whether an attack occurs. Alternatively, based on certain conversion processing on the output result, it may be determined whether an attack occurs. Alternatively, based on the output result and in combination with other factors like models and rules, it may be determined whether an attack occurs. A probability segmentation of the attack determination model and risk levels of attack occurrence are listed in Table 1.
Table 1
Model probability segmentation    Risk level of attack occurrence
[0.5, 1]                          Ultra-high risk
[0.4, 0.5]                        High risk
[0.3, 0.4]                        Medium risk
[0, 0.3]                          Low risk
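For illustration, the mapping of Table 1 from a model output probability to a risk level might be realized as in the following Python sketch; the function name and the treatment of the interval boundaries (which Table 1 leaves open) are assumptions.

```python
# Sketch of mapping the attack determination model's output probability to the
# risk levels of Table 1. Boundary handling at 0.3/0.4/0.5 is an assumption.
def risk_level_from_probability(p: float) -> str:
    """Map an attack probability in [0, 1] to a risk level per Table 1."""
    if p >= 0.5:
        return "ultra-high risk"
    elif p >= 0.4:
        return "high risk"
    elif p >= 0.3:
        return "medium risk"
    else:
        return "low risk"
```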
[0026] According to practical online testing, when the determination threshold is set to 0.4 (i.e., it is determined that an attack occurs when the probability is greater than 0.4 and that no attack occurs when the probability is lower than 0.4), the number of times a risk is identified accounts for one ten-thousandth of the total number of invocations; according to manual verification, most of the identified risks are attack attempts, and the number of misjudgments accounts for three hundred-thousandths of the total number of invocations.
[0027] In the embodiments of the present application, sensor data collected by a mobile terminal device during biometric authentication and identification is obtained, and attack determination is performed on the sensor data based on a sensor classification model. As a result, with reduced dependence or without dependence on user cooperation, the effect of strengthening the security of biometric authentication and identification can be achieved. The embodiments of the present application provide better security support for users while enhancing user experience.
[0028] It should be understood that, in the step S120, data processing may be performed on the sensor data, and the sensor data after the data processing may be used as an input to the attack determination model.
[0029] Optionally, the step S120 may comprise: pre-processing the sensor data to obtain pre-processed sensor data, where the pre-processing comprises at least one of the following: filtering operation, boundary detection operation, and length normalization operation; and determining whether an attack occurs based on the pre-processed sensor data and the attack determination model; where the attack determination model is obtained based on training with the pre-processed sensor training data.
[0030] These three pre-processing methods will be described briefly as follows.
(1) Filtering processing:
[0031] Since noises caused by shaky hands or inherent to the sensors themselves are mostly high-frequency, filtering processing is often performed first when data pre-processing is performed. In the embodiments of the present application, a low-pass filter may be used to suppress high-frequency data. In addition, a moving average method may be adopted in the embodiments of the present application to process the sensor data. For example, taking acceleration data on the x-axis as an example, the processing may be performed using the following equation:

$\mathrm{acc}_x(i) = \frac{1}{M}\sum_{j=0}^{M-1} \mathrm{acc}_x(i-j)$
[0032] The processing of acceleration data on the x-axis may be referenced for similar processing on acceleration data on the y-axis and z-axis and gyroscope data on the x, y, and z axes, which will not be elaborated.
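As one possible reading of the moving-average equation above, the following Python sketch smooths a single sensor axis; applying it to the remaining five dimensions in the same way is assumed, and the window size M is an illustrative choice.

```python
import numpy as np

# Minimal moving-average sketch: each filtered sample is the mean of the current
# sample and the preceding M-1 samples, per the equation above. The window is
# truncated at the start of the sequence; M = 5 is an illustrative value.
def moving_average(samples: np.ndarray, window: int = 5) -> np.ndarray:
    """Smooth a 1-D sensor sequence (e.g. x-axis acceleration) with a trailing moving average."""
    filtered = np.empty(len(samples), dtype=float)
    for i in range(len(samples)):
        start = max(0, i - window + 1)
        filtered[i] = samples[start:i + 1].mean()
    return filtered

# Usage (illustrative): acc_x_filtered = moving_average(acc_x_raw, window=5)
```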
[0033] In addition, if the output of the acceleration sensor comprises the gravitational acceleration, a high-pass filter may be used to remove the gravitational component, or a linear acceleration sensor may be used directly in place of the regular acceleration sensor.
[0034] It should be understood that the filtering processing in the embodiments of the present application is merely exemplary, while other filtering processing methods are not excluded. For example, a variable sliding smoothing algorithm may be adopted for processing, etc.
(2) Boundary detection processing:
[0035] For example, the time points at which the user starts and ends the biometric authentication and identification may be identified by a threshold method based on an added window. One time window is used from each of the front and back ends of the sensor data to explore toward the middle. If the variation of the sensor values within a time window exceeds a threshold, then the first frame of the window that explores from the front toward the back is recorded as the beginning boundary, the last frame of the window that explores from the back toward the front is recorded as the ending boundary, and the beginning boundary and the ending boundary define the beginning frame and the ending frame of the biometric authentication and identification. Similarly, other boundary detection processing methods are not excluded in the process of boundary detection processing, either.
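A possible realization of this windowed threshold method is sketched below in Python; measuring the in-window variation as the difference between the maximum and minimum values, as well as the window length and threshold, are assumptions not fixed by the embodiments.

```python
import numpy as np

# Sketch of boundary detection with two time windows exploring toward the middle.
# "Variation" is taken here as max minus min within the window (an assumption).
def detect_boundaries(signal: np.ndarray, window: int = 20, threshold: float = 0.05):
    """Return (begin_frame, end_frame) of the active segment, or None if not found."""
    n = len(signal)
    begin = end = None
    for i in range(0, n - window + 1):              # explore from the front end
        if np.ptp(signal[i:i + window]) > threshold:
            begin = i                               # first frame of that window
            break
    for i in range(n - window, -1, -1):             # explore from the back end
        if np.ptp(signal[i:i + window]) > threshold:
            end = i + window - 1                    # last frame of that window
            break
    if begin is None or end is None or begin >= end:
        return None
    return begin, end
```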
(3) Length normalization processing:
[0036] The length normalization processing converts the original data with inconsistent variation lengths into data with a consistent variation length. Because sensor-based 3D interactions depend on the amplitude and changing speed of the actions of the user of the mobile terminal device, the data sequence and the changes to the data sequence have inconsistent lengths, leading to discontinuous changes of 3D scenes and jitter in the data.
[0037] There are many methods for length normalization.
[0038] If the sensor data has been smoothed, the linear interpolation method may be adopted in the length normalization processing on the sensor data. The result of linear interpolation may effectively fill the gaps between discontinuous data points, and the computational complexity is low, which also takes the processor performance of the mobile terminal device into consideration.
[0039] For example, the Y value at a position X in the interval [X0, X1] is obtained by using the coordinates of the two endpoints of the discontinuous data, (X0, Y0) and (X1, Y1), based on an exemplary equation as follows:

$Y = Y_0 + \frac{X - X_0}{X_1 - X_0}\,(Y_1 - Y_0)$
[0040] The above equation is merely exemplary and may be modified to a certain extent.
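For illustration, length normalization by linear interpolation can be performed by resampling each sensor sequence onto a fixed number of points, as in the following Python sketch; the target length of 128 points is an assumption.

```python
import numpy as np

# Sketch of length normalization by linear interpolation: sequences of different
# lengths are resampled onto target_length evenly spaced points. The target
# length is an illustrative choice, not prescribed by the embodiments.
def normalize_length(samples: np.ndarray, target_length: int = 128) -> np.ndarray:
    """Resample a 1-D sensor sequence to a fixed length using linear interpolation."""
    original_positions = np.linspace(0.0, 1.0, num=len(samples))
    target_positions = np.linspace(0.0, 1.0, num=target_length)
    return np.interp(target_positions, original_positions, samples)
```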
[0041] Furthermore, in addition to the linear interpolation method, there are many other normalization methods. For example, the sensor data in a first direction among the X, Y, and Z axes may be used as the normalization standard, and the sensor data in the other two directions may be converted according to the first direction, so as to determine the coordinate points corresponding to the normalized data.
[0042] For example, X = X0 + nDx; Y = Y0 + nDy; Z = Z0 + nDz.
[0043] Here, Dx is the distance on the X axis between two adjacent normalized coordinate points, Dy is the distance on the Y axis between two adjacent normalized coordinate points, and Dz is the distance on the Z axis between two adjacent normalized coordinate points.
[0044] In the embodiments of the present application, by performing data pre-processing on the sensor data to eliminate noises due to shaky hands or inherent to the sensors themselves during sensor data collection, the sensor data with relatively high confidence may then be obtained.
[0045] It should be understood that, in addition to performing data pre-processing on the sensor data before using an attack determination model for attack determination, the sensor data may also be pre-processed before training the attack determination model based on sensor training data, so as to obtain the sensor training data.
[0046] Optionally, the step S120 may comprise: performing feature extraction based on the sensor data to obtain the sensor data with features extracted; and determining whether an attack occurs based on the sensor data with features extracted and the attack determination model; where the attack determination model is obtained based on training with sensor training data with features extracted.
[0047] It should be understood that, in the embodiments of the present application, the sensor data before the feature extraction may be the pre-processed sensor data or may be the sensor data of the mobile terminal device when the mobile terminal device performs biometric authentication and identification, which is not limited in the embodiments of the present application.
[0048] For example, the sensor data with features extracted comprises the acceleration sensor data of the mobile terminal device and/or the gyroscope data of the mobile terminal device, where the acceleration sensor data comprises one or more pieces of the following data: an average, a standard deviation, a median, a maximum, a minimum, and an integral of the acceleration sensor of the mobile terminal device in three directions of x, y, and z axes; and the gyroscope data comprises one or more pieces of the following data:
[0049] an average, a standard deviation, a median, a maximum, a minimum, and an integral of the gyroscope of the mobile terminal device in three directions of x, y, and z axes.
[0050] It should be understood that, in the embodiments of the present application, the integral is used to represent the speed and displacement of the mobile terminal device.
[0051] It should be understood that the time length for performing the biometric authentication and identification is also an important parameter of the sensor data. In the embodiments of the present application, the sensor data serving as an input to the attack determination model may further comprise the time length for performing the biometric authentication and identification.
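To make the feature list above concrete, the following Python sketch computes the per-axis statistics, the integral, and the authentication time length from pre-processed sensor sequences; the trapezoidal integration and the dictionary layout are assumptions made for illustration.

```python
import numpy as np

# Sketch of feature extraction: average, standard deviation, median, maximum,
# minimum, and integral per axis, plus the duration of the authentication.
def extract_features(axes: dict, timestamps: np.ndarray) -> dict:
    """axes maps axis names (e.g. 'acc_x', 'gyro_z') to 1-D arrays sampled at `timestamps` (seconds)."""
    features = {}
    for name, values in axes.items():
        values = np.asarray(values, dtype=float)
        features[f"{name}_mean"] = float(np.mean(values))
        features[f"{name}_std"] = float(np.std(values))
        features[f"{name}_median"] = float(np.median(values))
        features[f"{name}_max"] = float(np.max(values))
        features[f"{name}_min"] = float(np.min(values))
        # Integral over time (trapezoidal rule); integrating acceleration
        # approximates the change in velocity, i.e. speed/displacement information.
        features[f"{name}_integral"] = float(
            np.sum((values[1:] + values[:-1]) / 2.0 * np.diff(timestamps))
        )
    # Time length of performing the biometric authentication and identification.
    features["duration"] = float(timestamps[-1] - timestamps[0])
    return features
```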
[0052] The data processing methods before the sensor data is input to the attack determination model are listed above. It should be understood that the above-described pre-processing operation and feature extraction operation are optional.
[0053] It should be understood that, before the attack determination model is used to perform determination on the sensor data, training based on a large amount of sensor training data needs to be performed to obtain the attack determination model. The above-described data processing methods may also be applied in the training phase of the attack determination model to perform data processing based on the sensor data to obtain sensor training data for training the attack determination model.
[0054] Optionally, before the step S110, the method may further comprise: obtaining sensor data of the mobile terminal device collected when the mobile terminal device performed biometric authentication and identification historically to obtain multiple sets of sensor training data for the mobile terminal device; and using the multiple sets of sensor training data as inputs to the attack determination model for training to obtain the attack determination model.
[0055] As described above, in the embodiments of the present application, data processing operations, such as pre-processing, feature extraction, and the like, may be performed on the sensor data collected during biometric authentication and identification to obtain the sensor training data.
[0056] It should be understood that supervised binary classification model training or unsupervised binary classification model training may be adopted during the model training.
[0057] With regard to the supervised model training, training samples with concept labels (classifications) need to be learned, so as to predict labels (classifications) of data outside of the training sample set as much as possible. In other words, when the sensor data is input for model training, the input sensor data needs to be labeled as sensor data with an attack occurring or sensor data without an attack. For example, label (classification) prediction needs to be performed on an output result corresponding to the input sensor data. Algorithms for supervised training of the model may comprise all regression algorithms and classification algorithms, such as linear regression, decision tree, neural network, K-Nearest Neighbor (KNN), Support Vector Machine (SVM), etc.
[0058] At this point, before the step S110, the method may further comprise: obtaining sensor data of the mobile terminal device collected when the mobile terminal device performed biometric authentication and identification historically; determining sensor training data and a corresponding label based on the sensor data and corresponding attack occurrence situation, the label used for indicating whether an attack has occurred to the corresponding sensor training data; and using multiple sets of sensor training data and corresponding labels as inputs to the attack determination model for training to obtain the attack determination model.
[0059] It should be understood that, in the embodiments of the present application, the multiple sets of sensor training data may comprise sensor training data labeled as attack occurring and/or sensor training data labeled as attack not occurring.
[0060] There are many training algorithms for supervised training of the model, and different training algorithms have different parameters set for the training of the model. Taking the Random Forest algorithm as an example, in a particular example, main parameter settings in the attack determination model may comprise the following: the number of decision trees is 100, the minimum number of leaf nodes is 100, etc. It should be understood that these parameter settings are adjustable.
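As one concrete, hedged reading of these parameters, the following Python sketch trains a Random Forest with scikit-learn; the "minimum number of leaf nodes" is interpreted here as the minimum number of samples per leaf (min_samples_leaf), and the synthetic data and threshold merely stand in for real labeled sensor features.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Sketch of supervised training with 100 decision trees and a minimum of 100
# samples per leaf. The synthetic data below stands in for historically collected
# sensor features (rows) with labels (1 = attack occurred, 0 = no attack).
rng = np.random.default_rng(0)
X_train = rng.normal(size=(10_000, 37))     # e.g. 36 per-axis statistics + duration
y_train = rng.integers(0, 2, size=10_000)

model = RandomForestClassifier(n_estimators=100, min_samples_leaf=100)
model.fit(X_train, y_train)

# At prediction time, the probability of the "attack" class can be compared with
# a determination threshold, e.g. the 0.4 mentioned in paragraph [0026].
attack_probability = model.predict_proba(X_train[:5])[:, 1]
is_attack = attack_probability > 0.4
```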
[0061] With regard to the unsupervised model training, training samples without concept labels (classifications) need to be learned, so as to find structural knowledge in the training sample set. In the unsupervised model training, all labels (classifications) are unknown. Algorithms for unsupervised training of the model may comprise all clustering algorithms, such as k-means, Principal Component Analysis (PCA), Gaussian Mixture Model (GMM), etc.
[0062] At this point, before the step S110, the method may further comprise: obtaining sensor data of the mobile terminal device collected when the mobile terminal device performed biometric authentication and identification historically; determining sensor training data based on the sensor data; and using multiple sets of sensor training data as inputs to the attack determination model for performing clustering training on the multiple sets of historical sensor data to obtain the attack determination model.
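For the unsupervised case, a minimal clustering sketch with k-means is shown below; the use of two clusters and the need to decide afterwards which cluster corresponds to attacks are assumptions, since the embodiments do not prescribe a particular clustering configuration.

```python
import numpy as np
from sklearn.cluster import KMeans

# Sketch of unsupervised (clustering) training: historical sensor features are
# grouped into two clusters that roughly correspond to "attack" and "no attack".
# Which cluster represents attacks must be decided separately, e.g. by inspecting
# a small number of known samples.
rng = np.random.default_rng(0)
X_history = rng.normal(size=(10_000, 37))   # stand-in for historical sensor features

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X_history)
cluster_of_new_sample = kmeans.predict(X_history[:1])   # cluster assignment at prediction time
```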
[0063] It should be understood that, during the model training, the training may also be performed based on sensor data of different user risk levels and/or different mobile terminal device models to obtain an attack determination model capable of supporting different user risk levels and/or different mobile terminal device models.
[0064] Optionally, as an exemplary embodiment, the method may further comprise: obtaining a risk level of a user corresponding to the mobile terminal device and/or model data of the mobile terminal device; where the step S120 may be, for example, implemented as follows: determining whether an attack occurs based on the risk level of the user corresponding to the mobile terminal device and/or the model data of the mobile terminal device, the sensor data, and the attack determination model; where the training of the attack determination model further uses sensor training data corresponding to the risk level of the user of the mobile terminal device and/or the model of the mobile terminal device.
[0065] For example, taking the supervised model training as an example, in addition to using the sensor training data as an input feature during the model training, the risk level of the user corresponding to the sensor data may also be used as an input feature, the model of the mobile terminal device corresponding to the sensor data may be used as an input feature, the risk level of the user and the model of the mobile terminal device corresponding to the sensor data may be both used as input features, or the like.
[0066] For example, it is assumed that user risk levels are divided into three types, i.e., low risk users, regular risk users, and high risk users. During the attack model training, the attack determination model may be trained based on sensor training data corresponding to these three risk levels. Accordingly, when attack determination and detection is performed, the sensor training data needs to be used as input features of the attack determination model, and moreover, the user risk level corresponding to the sensor training data also needs to be used as an input feature of the attack determination model to determine whether an attack occurs.
[0067] The method for training the attack determination model according to sensor training data of mobile terminal devices of different models is similar to the method for training the attack determination model according to the user risk level, which will not be elaborated here.
[0068] It should be understood that, in exemplary applications, the mobile terminal device model and the user risk level may be both used as input features of the attack determination model to train the attack determination model.
[0069] The above description is based on the supervised binary classification model that trains the attack determination model according to the model of the mobile terminal device and/or the user risk level. An unsupervised binary classification model may also be adopted to train the attack determination model based on sensor training data of different models of mobile terminal device and/or different user risk levels. Namely, in addition to using the sensor training data as input features, the risk level of the user corresponding to the sensor data may also be used as an input feature, or the model of the mobile terminal device corresponding to the sensor data may also be used as an input feature, alternatively the risk level of the user and the mobile terminal device model corresponding to the sensor data may be both used as input features, or the like.
[0070] An example of an attack determination model that supports various user risk levels and/or different models of mobile terminal device is described above. In exemplary applications, one attack determination model may only correspond to one user risk level and/or one mobile terminal device model.
[0071] It should be understood that, during the model training, the training may be performed based on sensor training data of different user risk levels and/or sensor data of different models of mobile terminal device to obtain attack determination models corresponding to different user risk levels and/or different models of mobile terminal device. For example, the training may be performed by using multiple sets of sensor training data of a mobile terminal device model A to obtain an attack determination model corresponding to the mobile terminal device model A, etc.
[0072] Optionally, as an exemplary embodiment, there are multiple attack determination models, and each attack determination model corresponds to a different mobile terminal device model; where the step S120 may be, for example, implemented as follows: inputting the sensor data into the attack determination model corresponding to the model of the mobile terminal device to determine whether an attack occurs.
[0073] For example, with regard to a mobile terminal device of the model A, training may be performed based on sensor training data corresponding to the mobile terminal device of the model A to obtain an attack determination model corresponding to the mobile terminal device of the model A. With regard to a mobile terminal device of a model B, training may be performed based on sensor training data corresponding to the mobile terminal device of the model B to obtain an attack determination model corresponding to the mobile terminal device of the model B. Accordingly, when attack determination is performed, for the mobile terminal device of the model A, obtained sensor data (or processed sensor data) may be input into the attack determination model corresponding to the mobile terminal device of the model A, so as to obtain an attack determination result; for the mobile terminal device of the model B, obtained sensor data (or processed sensor data) may be input into the attack determination model corresponding to the mobile terminal device of the model B, so as to obtain an attack determination result, and the like.
[0074] Optionally, as an exemplary embodiment, an attack determination model has different determination thresholds, and the determination thresholds are set based on risk levels of users corresponding to mobile terminal devices.
[0075] For example, it is assumed again that user risk levels are divided into three types, i.e., low risk users, regular risk users, and high risk users. With regard to users of different risk levels, different determination thresholds may be set for an attack determination model. For example, in an exemplary application, the determination threshold may be set at 60% for a low risk user; the determination threshold may be set at 50% for a regular risk user; and the determination threshold may be set at 40% for a high risk user. Assuming that the attack probability of a set of sensor data is determined by the attack determination model to be 45%: if the user's risk level is the low risk level, then the determined attack determination result is that no attack occurs, since 45% is lower than the determination threshold of 60% corresponding to the low risk level; if the user's risk level is the regular risk level, then the determined attack determination result is that no attack occurs, since 45% is lower than the determination threshold of 50% corresponding to the regular risk level; and if the user's risk level is the high risk level, then the determined attack determination result is that an attack occurs, since 45% is higher than the determination threshold of 40% corresponding to the high risk level. Therefore, for the same sensor data, the determination results may be different because the users have different risk levels.
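The example above can be summarized in a short Python sketch that applies a risk-level-dependent determination threshold; the dictionary and function names are illustrative.

```python
# Sketch of risk-level-dependent determination thresholds (60% / 50% / 40%).
THRESHOLD_BY_RISK_LEVEL = {"low": 0.60, "regular": 0.50, "high": 0.40}

def is_attack(attack_probability: float, user_risk_level: str) -> bool:
    """The same probability can yield different results for users of different risk levels."""
    return attack_probability > THRESHOLD_BY_RISK_LEVEL[user_risk_level]

# Example from the text, with a 45% attack probability:
# is_attack(0.45, "low") -> False; is_attack(0.45, "regular") -> False; is_attack(0.45, "high") -> True
```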
[0076] It should be understood that an output result of an attack determination model may further be fed back to a system that performs biometric authentication and identification, so as to adjust the process of biometric authentication and identification.
[0077] Furthermore, after the step S120, the method may further comprise: adjusting, based on an output result of the attack determination model, a security authentication level of the biometric authentication and identification; where the higher the probability of attack occurrence indicated by the output result, the higher the security authentication level of the biometric authentication and identification after the adjustment.
[0078] Taking face recognition as an example, multiple sets of liveness detection steps may be prepared in a face recognition system based on facial expression and speech interaction for different security authentication levels. If the output risk level of the attack determination model exceeds a first designated threshold, a liveness detection step with a high security authentication level is output to perform authentication and identification on the user to prevent risks. If the output risk level of the attack determination model is lower than a second designated threshold, a liveness detection step with a low security authentication level is output to perform authentication and identification on the user to improve user experience. For example, in combination with Table 1, assuming that the output result of the attack determination model indicates that the risk level of risk occurrence is ultra-high risk, the security authentication level of the biometric authentication and identification is then adjusted to the highest security authentication level corresponding to the "ultra-high risk," etc.
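A minimal sketch of this feedback step is shown below, assuming numeric security levels 1 through 4 and illustrative values for the first and second designated thresholds; none of these values are fixed by the embodiments.

```python
# Sketch of adjusting the security authentication level from the model output.
# Threshold values and the four-level scale are assumptions for illustration.
FIRST_DESIGNATED_THRESHOLD = 0.5    # at or above: strictest liveness detection steps
SECOND_DESIGNATED_THRESHOLD = 0.3   # below: lightweight steps for better user experience

def select_security_level(attack_probability: float) -> int:
    """Return a security authentication level (1 = lowest, 4 = highest)."""
    if attack_probability >= FIRST_DESIGNATED_THRESHOLD:
        return 4    # ultra-high risk per Table 1
    if attack_probability < SECOND_DESIGNATED_THRESHOLD:
        return 1    # low risk
    return 3 if attack_probability >= 0.4 else 2    # high / medium risk
```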
[0079] The solution of the embodiments of the present application will be further described below with reference to an exemplary embodiment.
[0080] FIG. 3 is a schematic flow chart of training and prediction of biometric authentication and identification according to some embodiments of the present application. Here, the training phase may comprise historical data collection, data pre-processing, feature extraction, and binary classification model training. The prediction phase may comprise online data collection, data pre-processing, feature extraction, determination of whether an attack occurs based on a binary classification model, and the like, as well as output of a determination result to a biometric authentication and identification system. The binary classification model according to some embodiments of the present application is the attack determination model in the embodiment as shown in FIG. 1. For example, the flow chart shown in FIG. 3 may comprise the following.

(1) Historical data collection
A mobile terminal device collects sensor data obtained in a historical process of biometric authentication and identification. The sensor data may comprise the acceleration sensor data and/or the gyroscope data.
For example, the mobile terminal device may obtain sensor data in the entire process of biometric authentication and identification or in a predetermined time period in the process of biometric authentication and identification. The sensor data may comprise sensor data of one or more dimensions in the 6 dimensions as shown in FIG. 2 and time stamp information corresponding to the sensor data.

(2) Data pre-processing
After the collection of the sensor data is completed, pre-processing operations, such as filtering operation, boundary detection operation, and length normalization operation, may be further performed on the sensor data, so as to eliminate noises due to shaky hands or inherent to the sensors themselves during collection, thereby obtaining sensor data with relatively high confidence. See the relevant description of the embodiment as shown in FIG. 1 for exemplary implementations of the filtering operation, boundary detection operation, and length normalization operation, which will not be elaborated.

(3) Feature extraction
Feature extraction is performed based on the sensor data obtained in the step (1) or based on the pre-processed sensor data in the step (2) to obtain sensor data with features extracted.
For example, based on data of an acceleration sensor in three directions of x, y, and z axes in the entire or a predetermined time length in the process of biometric authentication and identification, feature data of the acceleration sensor in three directions of x, y, and z axes, such as an average, a standard deviation, a median, a maximum, a minimum, and an integral, may be obtained. Based on data of a gyroscope in three directions of x, y, and z axes in the entire or a predetermined time length in the process of biometric authentication and identification, feature data of the gyroscope in three directions of x, y, and z axes, such as an average, a standard deviation, a median, a maximum, a minimum, and an integral, may be obtained.
It should be understood that feature data such as the time length of performing the biometric authentication and identification may also be determined based on the data obtained by the acceleration sensor or the gyroscope.
It should be understood that the sensor data with features extracted may comprise one or more pieces of the above-described feature data.
(4) Binary classification model training
During the model training, the sensor data obtained in the step (1) may be used as training data, the sensor data obtained after the pre-processing in the step (2) may be used as training data, or the sensor data obtained after the feature extraction in the step (3) may be used as training data.
Apparently, when the sensor data obtained after the feature extraction in the step (3) is input as training data into the attack determination model, the trained attack determination model may be more accurate.
With regard to the training of the attack determination model, supervised binary classification model training or unsupervised binary classification model training may be adopted, so that two types of application scenarios, i.e., with attack occurrence and without attack occurrence, are identified based on the sensor data. For an exemplary training method, the relevant model training methods in the current technologies may be referenced, which will not be elaborated.
Moreover, it should be understood that, during the training of the attack determination model, the model of the mobile terminal device and/or the user risk level corresponding to the sensor data may be further used as an input feature of the attack determination model, or different attack determination models may be trained based on different models of mobile terminal device and/or different user risk levels.

(5) Online data collection
The step (1) may be referenced for an exemplary implementation of the step (5).

(6) Data pre-processing
The step (2) may be referenced for an exemplary implementation of the step (6).

(7) Feature extraction
The step (3) may be referenced for an exemplary implementation of the step (7).

(8) Determination of whether an attack occurs based on a binary classification model
In this step, the sensor data obtained after the feature extraction may be used as an input to the binary classification model for determining whether an attack occurs.
It should be understood that, if the binary classification model is trained based on directly collected sensor data or is trained based on pre-processed sensor data, corresponding online data is used as the input to the binary classification model.
Furthermore, if the binary classification model further uses the user risk level and/or the model of the mobile terminal device as an input for training, corresponding information of the user risk level and/or the model of the mobile terminal device also needs to be input when determination is performed based on the binary classification model.
If multiple binary classification models exist and each binary classification model corresponds to a user risk level and/or a model of mobile terminal device, the binary classification model corresponding to the user risk level and/or the model of mobile terminal device corresponding to the sensor data is selected for determination of the sensor data.

(9) Output of a determination result to a biometric authentication and identification system
[0081] A determination result may be obtained based on the binary classification model.
[0082] At this point, the determination result may be fed back to the biometric authentication and identification system, so that the biometric authentication and identification system executes a corresponding operation.
[0083] Optionally, the determination result may be fed back to the biometric authentication and identification system, so that the biometric authentication and identification system determines whether an attack occurs, thereby determining whether the current biometric authentication and identification is valid.
[0084] Optionally, the determination result may be fed back to the biometric authentication and identification system, so that the biometric authentication and identification system determines whether to adjust the security level of the biometric authentication and identification, so as to take both security and user experience into consideration. For example, the determination result may be a risk level of the current biometric authentication and identification, and based on this risk level and the current security level of the biometric authentication and identification, the biometric authentication and identification system may determine whether to adjust the security level of the biometric authentication and identification.
[0085] For example, it is assumed that there are four risk levels in determination results, i.e., low risk, regular risk, high risk, and ultra-high risk, and that corresponding security levels of biometric authentication and identification are divided into Level 1, Level 2, Level 3, and Level 4, respectively (the higher the level is, the higher the authentication requirements are for the biometric authentication and identification, and generally speaking, the poorer the user experience becomes). Assuming that the determination result is high risk and the security level of biometric authentication and identification is Level 1, then the biometric authentication and identification system may adjust the security level of biometric authentication and identification to Level 3. Assuming that the determination result is low risk and the security level of biometric authentication and identification is Level 4, then the biometric authentication and identification system may adjust the security level of biometric authentication and identification to Level 1, and the like.
[0086] Particular embodiments of the present specification are described above, and other embodiments fall within the scope of the appended claims. In certain situations, actions or steps stated in the claims may be executed in an order different from those in the embodiments and can still achieve desired results. In addition, a process depicted in the accompanying drawings does not necessarily require the illustrated particular order or continuous order to achieve desired results. In some implementations, multi-task processing and parallel processing may be feasible or may be beneficial.
[0087] FIG. 4 is a schematic structural diagram of an electronic device according to some embodiments of the present application. Referring to FIG. 4, at a hardware level, the electronic device comprises a processor, and optionally may further comprise an internal bus, a network interface, and a memory, where the memory may comprise an internal memory, such as a Random-Access Memory (RAM), and may further comprise a non-volatile memory, such as at least one disk memory. The electronic device may also comprise hardware required for other services.
[0088] The processor, the network interface, and the memory may be connected to each other via the internal bus. The internal bus may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, or an Extended Industry Standard Architecture (EISA) bus. The bus may be divided into an address bus, a data bus, a control bus, and the like. For convenience of description, only one two-way arrow is used in FIG. 4 to represent the bus, which does not indicate that there is only one bus or one type of buses.
[0089] The memory is used for storing programs. For example, the programs may comprise program codes, and the program codes comprise computer operation instructions. The memory may comprise an internal memory and a non-volatile memory, and may provide instructions and data to the processor.
[0090] The processor reads corresponding computer programs from the non-volatile memory into the internal memory, and then executes the computer programs to form a biometric authentication, identification, and detection apparatus on the logical level. The processor executes programs stored in the memory and is configured, for example, to execute the following operations: obtaining sensor data of the mobile terminal device when the mobile terminal device performs biometric authentication and identification; and determining whether an attack occurs based on the sensor data and an attack determination model; where the attack determination model is obtained based on training with sensor training data.
[0091] The method implemented by the biometric authentication, identification, and detection device disclosed by the embodiment as shown in FIG. 1 of the present application may be applied to the processor or may be implemented by the processor. The processor may be an integrated circuit chip with signal processing capabilities. In an implementation process, the steps of the above-described method may be carried out by an integrated logic circuit of the hardware or instructions in the form of software in the processor. The above-described processor may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA), or other programmable logic devices, discrete gates or transistor logic devices, and discrete hardware components, which can implement or execute the methods, steps, and logic block diagrams disclosed in the embodiments of the present application. The general-purpose processor may be a micro-processor, or the processor may also be any conventional processor. Steps of the method disclosed with reference to the embodiments of the present application may be directly executed and completed by a hardware decoding processor or executed and completed by a combination of hardware and software modules in a decoding processor. The software modules may be disposed in a storage medium mature in the art, such as a random access memory, a flash memory, a read-only memory, a programmable read-only memory or an electrically erasable programmable memory, a register, and the like. The storage medium is disposed in the memory, and the processor reads information in the memory and completes the steps of the above-described methods in combination with the hardware thereof.
[0092] The electronic device may further implement the methods shown in FIG. 1 and achieve functions of the biometric authentication, identification, and detection apparatus in the embodiments as shown in FIG. 1, or achieve functions of the training phase and/or the prediction phase in the embodiments as shown in FIG. 2, which will not be elaborated by the embodiments of the present application.
[0093] In addition to the software implementation, the electronic device according to the present application does not exclude other implementations, such as logic devices or a combination of software and hardware. In other words, the execution entities of the following processing flow are not limited to various logic units, but may also be hardware or logic devices.
[0094] The embodiments of the present application further provide a computer readable storage medium, where the computer readable storage medium stores one or more programs, and the one or more programs comprise instructions that, when executed by a portable electronic device comprising a plurality of applications, cause the portable electronic device to implement the method in the embodiment as shown in FIG. 1, and to implement, for example, the following operations: obtaining sensor data of the mobile terminal device when the mobile terminal device performs biometric authentication and identification; and determining whether an attack occurs based on the sensor data and an attack determination model; where the attack determination model is obtained based on training with sensor training data.
[0095] FIG. 5 is a schematic structural diagram of a biometric authentication, identification, and detection apparatus 500 according to some embodiments of the present application. Referring to FIG. 5, in a software implementation, the biometric authentication, identification, and detection apparatus 500 may comprise: an obtaining unit 501 configured to obtain sensor data of a mobile terminal device when the mobile terminal device performs biometric authentication and identification; and an attack determining unit 502 configured to determine whether an attack occurs based on the sensor data and an attack determination model; where the attack determination model is obtained based on training with sensor training data.
[0096] In the embodiments of the present application, sensor data collected by a mobile terminal device during biometric authentication and identification is obtained, and attack determination is performed on the sensor data based on a sensor classification model. As a result, with reduced dependence or without dependence on user cooperation, the effect of strengthening the security of biometric authentication and identification can be achieved. The embodiments of the present application provide better security support for users while enhancing user experience.
[0097] Optionally, as an exemplary embodiment, the obtaining unit 501 is further configured to obtain a risk level of a user corresponding to the mobile terminal device and/or model data of the mobile terminal device; and the attack determining unit 502 is further configured to: determine whether an attack occurs based on the risk level of the user corresponding to the mobile terminal device and/or the model data of the mobile terminal device, the sensor data, and the attack determination model; where the training of the attack determination model uses sensor training data under the risk level of the user corresponding to the mobile terminal device and/or the model of the mobile terminal device.
[0098] Alternatively, as another exemplary embodiment, there are optionally multiple attack determination models, and each attack determination model corresponds to a different mobile terminal device model; the attack determining unit 502 is configured, for example, to determine whether an attack occurs based on the sensor data and the attack determination model corresponding to the model of the mobile terminal device.
[0099] Furthermore, an attack determination model has different determination thresholds, and the determination thresholds are set based on risk levels of users corresponding to mobile terminal devices.
[0100] Alternatively, as another exemplary embodiment, an attack determination model optionally has different determination thresholds, and the determination thresholds are set based on risk levels of users corresponding to mobile terminal devices.
[0101] Optionally, as an exemplary embodiment, the attack determining unit 502 is configured, for example, to pre-process the sensor data to obtain pre-processed sensor data, where the pre-processing comprises at least one of the following: filtering operation, boundary detection operation, and length normalization operation; and to determine whether an attack occurs based on the pre-processed sensor data and the attack determination model; where the attack determination model is obtained based on training with the pre-processed sensor training data.
[0102] Optionally, as an exemplary embodiment, the attack determining unit 502 is configured, for example, to perform feature extraction on the sensor data to obtain the sensor data with features extracted; and to determine whether an attack occurs based on the sensor data with features extracted and the attack determination model; where the attack determination model is obtained based on training with sensor training data with features extracted; the sensor data with features extracted comprises acceleration sensor data of the mobile terminal device and/or gyroscope data of the mobile terminal device; the acceleration sensor data comprises one or more pieces of the following data: an average, a standard deviation, a median, a maximum, a minimum, and an integral of the acceleration sensor of the mobile terminal device in three directions of x, y, and z axes; and the gyroscope data comprises one or more pieces of the following data: an average, a standard deviation, a median, a maximum, a minimum, and an integral of the gyroscope of the mobile terminal device in three directions of x, y, and z axes.
[0103] Furthermore, the sensor data with features extracted also comprises a time length of performing the biometric authentication and identification.
[0104] For example, the sensor data subject to feature extraction may be the pre-processed sensor data or may be the sensor data that is not pre-processed.
[0105] Optionally, the biometric authentication, identification, and detection apparatus 500 may further comprise a training unit 503, where the training unit 503 may be configured, for example, to obtain sensor data of the mobile terminal device collected when the mobile terminal device performed biometric authentication and identification historically to obtain multiple sets of sensor training data of the mobile terminal device; and to use the multiple sets of sensor training data as inputs to the attack determination model for training to obtain the attack determination model.
[0106] Furthermore, the training unit 503 may be configured, for example, to obtain sensor data of the mobile terminal device collected when the mobile terminal device performed biometric authentication and identification historically; determine sensor training data and a corresponding label based on the sensor data and the corresponding attack occurrence situation, the label being used for indicating whether an attack occurred for the corresponding sensor training data; and use multiple sets of sensor training data and the corresponding labels as inputs for training, to obtain the attack determination model.
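By way of a non-limiting illustration only, such labeled (supervised) training could be sketched as follows; the classifier type, feature dimensionality, and random stand-in data are assumptions of this illustration, not features recited in the present application.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical stand-ins for feature vectors and labels from historical sessions.
rng = np.random.default_rng(1)
X_train = rng.normal(size=(500, 37))    # sensor training data (feature vectors)
y_train = rng.integers(0, 2, size=500)  # labels: 1 = attack occurred, 0 = no attack

attack_model = RandomForestClassifier(n_estimators=100, random_state=0)
attack_model.fit(X_train, y_train)

# At detection time, the trained model scores a new session's feature vector.
p_attack = attack_model.predict_proba(rng.normal(size=(1, 37)))[0, 1]
```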
[0107] Furthermore, the training unit 503 may be configured, for example, to obtain sensor data of the mobile terminal device collected when the mobile terminal device performed biometric authentication and identification historically; determine sensor training data based on the sensor data; and perform clustering training on the multiple sets of sensor training data to obtain the attack determination model.
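By way of a non-limiting illustration only, such clustering-based (unlabeled) training could be sketched as follows; the cluster count and the distance rule are assumptions of this illustration, the idea being that a session far from every cluster of normal behavior can be treated as a likely attack.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical stand-in for unlabeled historical feature vectors.
rng = np.random.default_rng(2)
X_hist = rng.normal(size=(500, 37))

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X_hist)
typical_dist = np.median(
    np.linalg.norm(X_hist - kmeans.cluster_centers_[kmeans.labels_], axis=1)
)

def looks_like_attack(features, factor=2.0):
    """Flag a session whose distance to every cluster centre is unusually large."""
    d = np.linalg.norm(kmeans.cluster_centers_ - features, axis=1).min()
    return d > factor * typical_dist

print(looks_like_attack(rng.normal(size=37)))
```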
[0108] Optionally, the biometric authentication, identification, and detection apparatus 500 may further comprise an adjusting unit 504, where the adjusting unit 504 may be configured, for example, to adjust, based on an output result of the attack determination model, a security authentication level of the biometric authentication and identification; where the higher the probability of attack occurrence indicated by the output result, the higher the adjusted security authentication level of the biometric authentication and identification.
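By way of a non-limiting illustration only, the mapping from the model's output to a security authentication level could be sketched as follows; the cut-off values and level names are assumptions of this illustration.

```python
def security_level(p_attack):
    """Map the predicted attack probability to a security authentication level."""
    if p_attack < 0.3:
        return "standard"   # normal biometric check only
    if p_attack < 0.7:
        return "elevated"   # e.g. additionally require a blink or head-turn action
    return "strict"         # e.g. require a secondary factor or reject the attempt

print(security_level(0.85))  # -> "strict"
```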
[0109] In summary, the above descriptions are merely preferred embodiments of the present application and are not intended to limit the scope of the present application. Any modification, equivalent substitution, or improvement made within the spirit and principle of the present application shall fall within the scope of the present application.
[0110] The system, device, module, or unit elaborated in the above-described embodiments may be implemented, for example, by a computer chip or an entity, or by a product having a certain function. One typical implementation apparatus is a computer. For example, the computer may be a personal computer, a laptop computer, a cellular telephone, a camera phone, a smart phone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or a combination of any of the above apparatuses.
[0111] Computer readable media include volatile and non-volatile media, and removable and non-removable media, which can implement information storage through any method or technology. The information may be computer readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, Phase-change Random Access Memories (PRAMs), Static Random Access Memories (SRAMs), Dynamic Random Access Memories (DRAMs), other types of Random Access Memories (RAMs), Read-Only Memories (ROMs), Electrically Erasable Programmable Read-Only Memories (EEPROMs), flash memories or other memory technologies, Compact Disk Read-Only Memories (CD-ROMs), Digital Versatile Discs (DVDs) or other optical memories, magnetic cassettes, magnetic tape or disk storage or other magnetic storage devices, or any other non-transmission media, which can be used for storing information accessible to a computation device. According to the description in the present specification, the computer readable media do not include transitory media, such as modulated data signals and carriers.
[0112] It should be further noted that the terms "including," "comprising," or any other variants thereof are intended to encompass a non-exclusive inclusion, such that a process, method, product, or device comprising a series of elements not only comprises these elements, but also comprises other elements that are not explicitly listed, or further comprises elements that are inherent to the process, method, product, or device. Unless otherwise restricted, an element defined by the statement "comprising one..." does not exclude that a process, method, product, or device comprising the element further comprises additional identical elements.
[0113] The various embodiments in the present specification are described in a progressive manner, the same or similar parts between the various embodiments may refer to each other, and each embodiment focuses on the difference from the other embodiments. In particular, since the system embodiments are substantially similar to the method embodiments, the system embodiments are described in a relatively simple manner, and the description of relevant parts of the method embodiments may be referenced.
Claims (11)
[0001] A biometric authentication, identification, and detection method for a mobile terminal device, comprising:
obtaining sensor data of the mobile terminal device when the mobile terminal device performs biometric authentication and identification; and
determining whether an attack occurs based on the sensor data and an attack determination model;
wherein the attack determination model is obtained based on training with sensor training data.
[0002] The method according to claim 1, further comprising:
obtaining a risk level of a user corresponding to the mobile terminal device and/or model data of the mobile terminal device; wherein
the determining whether an attack occurs based on the sensor data and an attack determination model comprises:
determining whether an attack occurs based on the risk level of a user corresponding to the mobile terminal device and/or model data of the mobile terminal device, the sensor data and the attack determination model;
wherein the attack determination model is trained by using sensor training data under the risk level of the user corresponding to the mobile terminal device and/or a model of the mobile terminal device.
[0003] The method according to claim 1, wherein there are multiple attack determination models, and each attack determination model corresponds to a different mobile terminal device model; and
the determining whether an attack occurs based on the sensor data and an attack determination model comprises:
determining whether an attack occurs based on the sensor data and the attack determination model corresponding to a model of the mobile terminal device.
[0004] The method according to claim 1 or 3, wherein the attack determination model has different determination thresholds, and the determination thresholds are set based on risk levels of users corresponding to mobile terminal devices.
[0005] The method according to claim 1, wherein the determining whether an attack occurs based on the sensor data and an attack determination model comprises:
pre-processing the sensor data to obtain pre-processed sensor data, the pre-processing comprising at least one of the following: filtering operation, boundary detection operation, and length normalization operation; and
determining whether an attack occurs based on the pre-processed sensor data and the attack determination model;
wherein the attack determination model is obtained based on training with the pre-processed sensor training data.
[0006] The method according to claim 1 or 5, wherein the determining whether an attack occurs based on the sensor data and an attack determination model comprises:
performing feature extraction based on the sensor data to obtain the sensor data with features extracted; and
determining whether an attack occurs based on the sensor data with features extracted and the attack determination model;
wherein the attack determination model is obtained based on training with sensor training data with features extracted;
the sensor data with features extracted comprises acceleration sensor data of the mobile terminal device and/or gyroscope data of the mobile terminal device;
the acceleration sensor data comprises one or more pieces of the following data: an average, a standard deviation, a median, a maximum, a minimum, and an integral of an acceleration sensor of the mobile terminal device in three directions of x, y, and z axes; and
the gyroscope data comprises one or more pieces of the following data: an average, a standard deviation, a median, a maximum, a minimum, and an integral of a gyroscope of the mobile terminal device in three directions of x, y, and z axes.
[0007] The method according to claim 6, wherein the sensor data with features extracted also comprises a time length of performing the biometric authentication and identification.
[0008] The method according to claim 1, further comprising:
adjusting, based on an output result of the attack determination model, a security authentication level of the biometric authentication and identification;
wherein the higher a probability of attack occurrence indicated by the output result, the higher the adjusted security authentication level of the biometric authentication and identification.
[0009] A biometric authentication, identification, and detection apparatus, comprising:
an obtaining unit configured to obtain sensor data of a mobile terminal device when the mobile terminal device performs biometric authentication and identification; and
an attack determining unit configured to determine whether an attack occurs based on the sensor data and an attack determination model;
wherein the attack determination model is obtained based on training with sensor training data.
[0010] An electronic device, comprising:
a processor; and
a memory configured to store computer executable instructions, wherein the executable instructions cause, when executed, the processor to execute the following operations:
obtaining sensor data of a mobile terminal device when the mobile terminal device performs biometric authentication and identification; and
determining whether an attack occurs based on the sensor data and an attack determination model;
wherein the attack determination model is obtained based on training with sensor training data.
[0011] A computer readable storage medium, wherein the computer readable storage medium stores one or more programs, and the one or more programs, when executed by an electronic device comprising a plurality of applications, cause the electronic device to execute the following operations:
obtaining sensor data of a mobile terminal device when the mobile terminal device performs biometric authentication and identification; and
determining whether an attack occurs based on the sensor data and an attack determination model;
wherein the attack determination model is obtained based on training with sensor training data.
Patent family:
Publication number | Publication date
US20200210561A1|2020-07-02|
EP3693876A4|2020-12-09|
SG11202004392RA|2020-06-29|
WO2019114337A1|2019-06-20|
CN108280332A|2018-07-13|
TW201928741A|2019-07-16|
Legal status:
2019-06-21| STAA| Information on the status of an ep patent application or granted ep patent|Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
2020-07-10| STAA| Information on the status of an ep patent application or granted ep patent|Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
2020-07-10| PUAI| Public reference made under article 153(3) epc to a published international application that has entered the european phase|Free format text: ORIGINAL CODE: 0009012 |
2020-08-12| 17P| Request for examination filed|Effective date: 20200505 |
2020-08-12| AX| Request for extension of the european patent|Extension state: BA ME |
2020-08-12| AK| Designated contracting states|Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
2020-12-09| A4| Supplementary search report drawn up and despatched|Effective date: 20201105 |
2020-12-09| RIC1| Information provided on ipc code assigned before grant|Ipc: H04L 29/06 20060101AFI20201030BHEP Ipc: G06F 21/57 20130101ALI20201030BHEP Ipc: H04W 12/00 20090101ALI20201030BHEP Ipc: G06F 21/32 20130101ALI20201030BHEP Ipc: G06F 3/0346 20130101ALI20201030BHEP Ipc: H04W 12/06 20090101ALI20201030BHEP Ipc: H04W 12/12 20090101ALI20201030BHEP Ipc: G06F 21/55 20130101ALI20201030BHEP |
2021-01-13| RAP1| Party data changed (applicant data changed or rights of an application transferred)|Owner name: ADVANCED NEW TECHNOLOGIES CO., LTD. |
2021-03-24| DAX| Request for extension of the european patent (deleted)|
2021-03-24| DAV| Request for validation of the european patent (deleted)|
2021-06-28| REG| Reference to a national code|Ref country code: DE Ref legal event code: R079 Ref document number: 602018028663 Country of ref document: DE Free format text: PREVIOUS MAIN CLASS: G06F0021320000 Ipc: H04L0029060000 |
2021-07-14| STAA| Information on the status of an ep patent application or granted ep patent|Free format text: STATUS: GRANT OF PATENT IS INTENDED |
2021-07-14| GRAP| Despatch of communication of intention to grant a patent|Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
2021-08-04| RIC1| Information provided on ipc code assigned before grant|Ipc: H04L 29/06 20060101AFI20210628BHEP Ipc: H04W 12/12 20210101ALI20210628BHEP Ipc: H04W 12/06 20210101ALI20210628BHEP Ipc: H04W 12/00 20210101ALI20210628BHEP Ipc: H04W 12/122 20210101ALI20210628BHEP Ipc: H04W 12/60 20210101ALI20210628BHEP Ipc: G06F 21/32 20130101ALI20210628BHEP Ipc: H04W 12/68 20210101ALI20210628BHEP Ipc: G06F 21/57 20130101ALI20210628BHEP Ipc: G06F 21/55 20130101ALI20210628BHEP Ipc: G06F 3/0346 20130101ALI20210628BHEP Ipc: G06N 20/00 20190101ALI20210628BHEP Ipc: G06N 3/08 20060101ALI20210628BHEP Ipc: G06N 5/00 20060101ALI20210628BHEP |
2021-08-11| INTG| Intention to grant announced|Effective date: 20210715 |
2021-11-17| GRAS| Grant fee paid|Free format text: ORIGINAL CODE: EPIDOSNIGR3 |
2021-11-19| STAA| Information on the status of an ep patent application or granted ep patent|Free format text: STATUS: THE PATENT HAS BEEN GRANTED |
2021-11-19| GRAA| (expected) grant|Free format text: ORIGINAL CODE: 0009210 |
2021-12-22| AK| Designated contracting states|Kind code of ref document: B1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
2021-12-22| REG| Reference to a national code|Ref country code: GB Ref legal event code: FG4D |
2021-12-30| REG| Reference to a national code|Ref country code: CH Ref legal event code: EP |
2022-01-13| REG| Reference to a national code|Ref country code: DE Ref legal event code: R096 Ref document number: 602018028663 Country of ref document: DE |
2022-01-15| REG| Reference to a national code|Ref country code: AT Ref legal event code: REF Ref document number: 1457822 Country of ref document: AT Kind code of ref document: T Effective date: 20220115 |
2022-01-19| REG| Reference to a national code|Ref country code: IE Ref legal event code: FG4D |