Patent abstract:
Method and system for recognizing a subject's mood by means of image analysis. The invention relates to a method for recognizing the mood of a subject (1) from its relationship with facial expressions and movements. The method of the invention focuses on recognizing moods, a concept distinct from emotion. The mapping from the captured images of the subject (1) to facial movements is personalized, learning the particular form of the facial features of the analyzed subject (1). The invention is based on the analysis of a collection of images of a certain number, greater than that used in typical emotion recognition. In this way a more robust mood-recognition method is defined. The method comprises three fundamental stages: definition of data and general prior criteria; definition of personalized resting patterns; and assessment of mood. (Machine translation by Google Translate, not legally binding)
Publication number: ES2633152A1
Application number: ES201730259
Application date: 2017-02-27
Publication date: 2017-09-19
Inventors: Javier Varona Gómez; Diana ARELLANO TÁVARA; Miquel MASCARÓ OLIVER; Cristina MANRESA YEE; Simón GARCÉS RAYO; Juan Sebastián FILIPPINI
Applicant: Universitat de les Illes Balears
IPC main classification:
Patent description:



More preferably, the relationship between the eight moods of the Mehrabian PAD space and the action units activated in each of them is that defined in the Facial Expression Repertoire (FER).
In another preferred embodiment of the method of the invention, one or more resting patterns are defined corresponding to distances between the characteristic facial points of the subject, said distances being one or more of the following: mean right eye-eyebrow distance; inner right-eye distance; mean left eye-eyebrow distance; inner left-eye distance; right eye opening distance; left eye opening distance; horizontal mouth distance; upper mouth-nose distance; jaw-nose distance; lower mouth-outer mouth distance; left eyebrow-upper eyelid distance; left eyebrow-lower eyelid distance; right eyebrow-upper eyelid distance; right eyebrow-lower eyelid distance.
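A personalized resting pattern of this kind can be sketched as the mean of each point-pair distance over a few frames of the subject in a neutral state. The landmark names, coordinates, and frame structure below are illustrative assumptions for the sketch, not values specified by the invention.

```python
import math

def resting_pattern(frames, pairs):
    """Average each point-pair distance over frames of the subject at rest.

    frames: list of dicts mapping landmark name -> (x, y) coordinate
    pairs:  list of (name_a, name_b, label) distance definitions
    """
    pattern = {}
    for a, b, label in pairs:
        dists = [math.dist(f[a], f[b]) for f in frames]
        pattern[label] = sum(dists) / len(dists)
    return pattern

# Illustrative landmarks for two neutral frames (coordinates are made up).
frames = [
    {"right_brow_mid": (120, 80), "right_eye_mid": (118, 100),
     "mouth_left": (100, 180), "mouth_right": (150, 180)},
    {"right_brow_mid": (121, 81), "right_eye_mid": (119, 101),
     "mouth_left": (101, 181), "mouth_right": (151, 181)},
]
pairs = [
    ("right_brow_mid", "right_eye_mid", "mean right eye-eyebrow distance"),
    ("mouth_left", "mouth_right", "horizontal mouth distance"),
]
print(resting_pattern(frames, pairs))
```

Averaging over several neutral frames, rather than taking a single image, is what makes the pattern robust to small per-frame landmark jitter.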
In another preferred embodiment of the method of the invention, a calibration of the subject's moods is performed in a session with known and controlled stimuli, so that one or more action units can be associated with one or more moods of that subject.
Another object of the invention relates to a system for recognizing the mood of a subject by means of the mood-recognition method according to any of the embodiments described herein, comprising:
- a camera adapted for taking facial images of said subject;
- one or more processing means (3) for storing and/or processing the facial images, where said processing means (3) are configured by hardware and/or software to carry out a mood-recognition method according to any of the embodiments described herein.
In a preferred embodiment, the system of the invention further comprises a learning subsystem, configured by hardware and/or software, to establish criteria for classifying the sequences captured by the camera based on results obtained in previous analyses. More preferably, said learning subsystem is connected to the processing means locally or remotely.
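The processing flow the system describes — compare measured distance parameters against the personalized resting pattern, flag deviations as activated action units, and map the activated set to a mood — can be sketched minimally as follows. The relative threshold, the rule table, and the parameter values are assumptions made for the sketch, not values taken from the patent.

```python
def detect_aus(frame_params, resting, threshold=0.15):
    """Flag a parameter as 'activated' when it deviates from the subject's
    personalized resting value by more than `threshold` (relative)."""
    return {name for name, value in frame_params.items()
            if abs(value - resting[name]) / resting[name] > threshold}

# Illustrative rule table mapping an activated-parameter set to a mood label
# of the PAD space; the actual association is learned per subject during the
# calibration session described above.
MOOD_RULES = {
    frozenset({"P1", "P5"}): "exuberant",
    frozenset(): "neutral",
}

resting = {"P1": 20.0, "P5": 12.0}        # from the calibration session
frame_params = {"P1": 25.0, "P5": 15.0}   # measured on a new frame
mood = MOOD_RULES.get(frozenset(detect_aus(frame_params, resting)), "unknown")
print(mood)
```

A learning subsystem such as the one described could refine `threshold` and `MOOD_RULES` from the results of previous analyses.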
DESCRIPTION OF THE FIGURES


- One or more facial images of the subject (1) are registered in a reference mood.
- A plurality of characteristic facial points of the subject (1) are defined in one or more of the images in the reference mood. For example, as Figure 2 shows, 24 facial points or curves can be taken. These characteristic points are strategically associated with the facial points or curves most susceptible to changing position when one or more AUj are activated.
- A plurality of distances between the characteristic facial points chosen in the previous step are defined. These distances are called parameters P.

As an example, Table 5 defines 15 distance parameters that will be used in the detection of AUs.
Parameter | Distance | Description
P1 | point 2 to segment 9-10 | mean right eye-eyebrow distance
P2 | point 5 to point 12 | inner right-eye distance
P3 | point 7 to segment 13-14 | mean left eye-eyebrow distance
P4 | point 4 to point 11 | inner left-eye distance
P5 | segment 9-10 to segment 15-16 | right eye opening distance
P6 | segment 13-14 to segment 17-18 | left eye opening distance
P7 | point 20 to point 21 | horizontal mouth distance
P9 | point 19 to point 22 | upper mouth-nose distance
P10 | point 19 to point 24 | jaw-nose distance
P11 | point 20 to point 23 (vertical) | lower mouth-outer mouth distance
P12 | segment 1-3 to segment 9-10 | left eyebrow-upper eyelid distance
P13 | segment 1-3 to segment 15-16 | left eyebrow-lower eyelid distance
P14 | segment 6-8 to segment 13-14 | right eyebrow-upper eyelid distance
P15 | segment 6-8 to segment 17-18 | right eyebrow-lower eyelid distance

Table 5. Distance parameters for the detection of 15 AUs.
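Several of the parameters in Table 5 measure the distance from a point to a segment (e.g. P1, point 2 to segment 9-10). A standard way to compute this is to project the point onto the segment's line and clamp the projection to the segment; the coordinates used below for point 2 and segment 9-10 are purely illustrative assumptions.

```python
import math

def point_to_segment(p, a, b):
    """Shortest distance from point p to the segment from a to b (2-D)."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    abx, aby = bx - ax, by - ay
    t = ((px - ax) * abx + (py - ay) * aby) / (abx * abx + aby * aby)
    t = max(0.0, min(1.0, t))  # clamp the projection onto the segment
    return math.dist(p, (ax + t * abx, ay + t * aby))

# Hypothetical coordinates: point 2 on the right eyebrow, segment 9-10 on
# the upper eyelid (parameter P1).
print(point_to_segment((110, 70), (100, 95), (125, 95)))  # → 25.0
```

The clamping step matters: without it, an eyebrow point lying beyond the ends of the eyelid segment would be measured against the segment's infinite extension rather than the eyelid itself.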
Claims:
Claims (1)
[1]
Similar technologies:
Publication number | Publication date | Patent title
US9489934B2|2016-11-08|Method for selecting music based on face recognition, music selecting system and electronic apparatus
Potter et al.2010|Picture detection in rapid serial visual presentation: Features or identity?
Woźniak et al.2018|Prioritization of arbitrary faces associated to self: An EEG study
KR20200124280A|2020-11-02|Motion recognition, driving motion analysis method and device, electronic device
US10482333B1|2019-11-19|Mental state analysis using blink rate within vehicles
Loucks et al.2009|Sources of information for discriminating dynamic human actions
Kondo et al.2017|Auditory and visual scene analysis: an overview
ES2633152A1|2017-09-19|Method and system for the recognition of the state of mood by means of image analysis |
Ruffini et al.2018|Deep learning with EEG spectrograms in rapid eye movement behavior disorder
CN110214320A|2019-09-06|The certification compared using face-image
Herrera-Granda et al.2019|Drowsiness Detection in Drivers Through Real-Time Image Processing of the Human Eye
CN107710221A|2018-02-16|A kind of method, apparatus and mobile terminal for being used to detect live subject
Kissler et al.2017|Something always sticks? How emotional language modulates neural processes involved in face encoding and recognition memory
Moll et al.2017|Modality-invariant audio-visual association coding in crow endbrain neurons
KR102286468B1|2021-08-06|Method and system for integrity verification of fake video created through deep learning
Xu et al.2019|Ultra-rapid object categorization in real-world scenes with top-down manipulations
Rakover2012|A feature-inversion effect: can an isolated feature show behavior like the face-inversion effect?
Pérez-Moreno et al.2016|Where to look when looking at faces: Visual scanning is determined by gender, expression and tasks demands
JP6888542B2|2021-06-16|Drowsiness estimation device and drowsiness estimation method
US10242253B2|2019-03-26|Detection apparatus, detection method, and computer program product
CN110688991A|2020-01-14|Intelligent reading method and intelligent learning table thereof
Sabri et al.2020|Student emotion estimation based on facial application in E-learning during COVID-19 pandemic
Chan et al.2015|Eye movement pattern in face recognition is associated with cognitive decline in the elderly
Kreiman2021|Biological and Computer Vision
Gorji et al.2018|A temporal neural network model for object recognition using a biologically plausible decision making layer
Patent family:
Publication number | Publication date
ES2633152B1|2018-05-03|
WO2018154098A1|2018-08-30|
Cited documents:
Publication number | Application date | Publication date | Applicant | Patent title
US20100266213A1|2009-04-16|2010-10-21|Hill Daniel A|Method of assessing people's self-presentation and actions to evaluate personality type, behavioral tendencies, credibility, motivations and other insights through facial muscle activity and expressions|
US20110263946A1|2010-04-22|2011-10-27|Mit Media Lab|Method and system for real-time and offline analysis, inference, tagging of and responding to person experiences|
CN110889908B|2019-12-10|2020-11-27|苏州鱼得水电气科技有限公司|Intelligent sign-in system integrating face recognition and data analysis|
Legal status:
2017-08-29| PC2A| Transfer of patent|Owner name: UNIVERSITAT DE LES ILLES BALEARS Effective date: 20170823 |
2018-05-03| FG2A| Definitive protection|Ref document number: 2633152 Country of ref document: ES Kind code of ref document: B1 Effective date: 20180503 |
Priority applications:
Application number | Application date | Patent title
ES201730259A | ES2633152B1 | 2017-02-27 | 2017-02-27 | METHOD AND SYSTEM FOR THE RECOGNITION OF THE STATE OF MOOD THROUGH IMAGE ANALYSIS
PCT/EP2018/054622| WO2018154098A1|2017-02-27|2018-02-26|Method and system for recognizing mood by means of image analysis|