Patent abstract:

Publication number: SE1000386A1
Application number: SE1000386
Filing date: 2010-04-16
Publication date: 2011-10-17
Inventors: Niclas Wadstroemer; Joergen Ahlberg; Bengt Gustafsson; Amritpal Singh
Applicant:
IPC main classification:
Patent description:

… stay close to crowds for a long time without doing what is normally done in the area, for example shopping in a mall or travelling away from an airport. Pickpockets often operate in pairs or in threes, and the method can detect that three people consistently stay close to each other without being so close that they appear to be in the same company. The method can also detect interesting events indirectly, e.g. by detecting that no one wants to go near a certain person in situations where people normally do not avoid others.
The invention solves the present problem of video surveillance of an area and analysis of events in it, and provides various forms of alarms and decision support to an operator, by being performed in the manner set forth by the subsequent independent patent claims.
The invention will be described in the following with reference to the accompanying drawings, where Fig. 1 shows a schematic diagram of the method and Fig. 2 shows a schematic diagram of a behavior analysis filter in Fig. 1. The system according to the invention uses one or more video cameras that supply an image sequence to a computer system. The computer system automatically analyzes the video sequences frame by frame, preferably time-synchronized if multiple cameras are used.
In addition to video cameras, the different individuals can be located with other known sensors, and the system can also use this information in its analysis. The analysis is improved if there is information on the individuals' group affiliation and definitions of behavioral classes entered into the system by an operator.
The result of the analysis is a situational picture of an area. It constitutes a continuously updated, formalized description of the individuals seen by one or more video cameras and of their motion behaviors. The result can be presented as a descriptive text with characteristics of individuals, groups and the whole area, and can in addition include any class affiliation and any anomaly. The result can be presented to the operator or forwarded to another system for further use.

A main point of the invention is that the analysis takes place in three blocks. First, individuals are analyzed individually. An important feature is that this individual analysis takes place in two parts. First, one tries to associate each observation of an individual with an individual which is known from a previous image or, if this is not possible, the individual is stated to be a new individual. For this purpose, one extracts and uses values of the properties used to determine the state variables needed to be able to perform the association.
Depending on which state variables one is interested in, in order to be able to analyze any group affiliation and the development of the situation in the whole area, additional characteristics are then extracted for the different individuals. When a new observation has been associated with a previously known individual, the state variables that characterize the individual can be updated, which includes various forms of information about the individual's previous history stored in the previous image. The result of the analysis is a description of the individuals moving in the area.
When the individual analysis is completed, groups of individuals are analyzed based on the results of the individual analysis. The analysis of groups takes place in the same way as for individuals, with the difference that groups can be detected either automatically by some common characteristic in the form of a certain value of one or more state variables, or specified by enumerating the individuals who are part of the group. The enumeration is made by an operator or by another part of the system. The division of individuals into groups does not have to be mutually exclusive; one individual can be part of several groups. The result of the analysis is a description of the groups that move in the area.
Finally, the whole area is analyzed, giving a situational picture based on the results of the individual and group analyses and on characteristics extracted to characterize the situation in the whole area.
The mentioned state variables are usually numeric, but can be interpreted into a description in words when presented. The same applies to events, which can be described as numerical changes in one or more state variables but can be described in words in the presentation.

The value of each state variable is described by a probability distribution over the value space. In this way, one can express different degrees of certainty in the assessment of which value applies. One can also give a definite value all the probability, which is one way of expressing that a specific value applies.
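As a hedged illustration (not taken from the patent itself), a state variable with an uncertain value can be held as a discrete probability distribution over a finite value space, with a delta distribution expressing complete certainty; the class name, the speed grid and the step size below are hypothetical choices.

```python
import numpy as np

class StateVariable:
    """A state variable represented as a discrete probability distribution
    over a finite value space (illustrative sketch only)."""

    def __init__(self, values, probabilities=None):
        self.values = np.asarray(values, dtype=float)
        if probabilities is None:
            # No information yet: uniform distribution over the value space.
            probabilities = np.full(len(values), 1.0 / len(values))
        self.p = np.asarray(probabilities, dtype=float)
        self.p /= self.p.sum()            # normalize so probabilities sum to one

    def set_definite(self, value):
        """Give one value (on the grid) all probability: complete certainty."""
        self.p = np.where(np.isclose(self.values, value), 1.0, 0.0)

    def expected_value(self):
        return float(np.dot(self.values, self.p))

# Example: speed of an individual, discretized to 0..10 m/s in 0.5 m/s steps.
speed = StateVariable(np.arange(0.0, 10.5, 0.5))
speed.set_definite(1.5)           # the speed is known to be exactly 1.5 m/s
print(speed.expected_value())     # -> 1.5
```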
The person skilled in the art understands how to select state variables based on the situation, to describe relevant properties of the moving objects. Anyone who wants to study the issue is referred to A. Yilmaz, O. Javed and M. Shah, "Object tracking: a survey", ACM Computing Surveys, Vol. 38, No. 4, Article 13, Dec. 2006.
For an individual, some common examples of state variables are: position, appearance, posture, body movement pattern, and spatial movement, speed and direction. For a group, some common examples of state variables are: the individuals in the group, the property and values that define the group, position and spatial movement, speed and direction. For the whole area, some common examples of state variables are: number of individuals, number of groups, and spatial movement patterns.

Individual analysis, see the individual analysis block in Figure 1, consists of finding, following and characterizing the individuals. In this analysis, no account is taken of the individuals' possible part in any group. Detection, extraction and tracking of moving objects are well known to those skilled in the art. Anyone who wants can study the issue in e.g. A. Yilmaz et al. as above and in W. Hu, T. Tan, L. Wang and S. Maybank, "A survey on visual surveillance of object motion and behaviors", IEEE Trans. on Systems, Man and Cybernetics, Vol. 34, No. 3, August 2004.
The various parts of the individual analysis block in Figure 1 are presented in the following. The analysis is done image by image and in two main parts. The first part consists of detection, extraction and tracking and results in further observations being associated with existing tracks. Tracks may also have been added or have disappeared. The second part of the analysis consists of extracting additional properties and characterizing the state and events in order to update the characterization of the individuals. The input data, 1 in Figure 1, is a sequence of images that are processed one by one. The output, 2 in Figure 1, is tracks and characterizations of the individuals moving in the scene.

Detection takes place in block D. Detection of moving objects can be done e.g. by their differing from the background.
Those skilled in the art are well acquainted with such background detection. The interested party can study the issue in K. Kim, T. H. Chalidabhongse, D. Harwood and L. Davis, "Real-time foreground-background segmentation using codebook model", Real-Time Imaging, Vol. 11, No. 3, pp. 172-185, 2005, and in C. Stauffer and W. E. L. Grimson, "Learning patterns of activity using real-time tracking", IEEE Trans. on Pattern Analysis and Machine Intelligence, Vol. 22, No. 8, August 2000.
Each observation of an object (or of part of an object, or of multiple objects) is a contiguous area in the image that is pointed out with a silhouette or otherwise.
The detection is done for each sensor separately, and several different algorithms can be used for each sensor. The detection module produces a number of hypotheses about the moving objects that can be found in the current image.
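A minimal sketch of this kind of background-subtraction detection, assuming OpenCV is available; the video path, the morphology kernel and the area threshold are hypothetical choices, and the patent does not prescribe this particular algorithm.

```python
import cv2

# Background model in the spirit of the mixture-model methods cited above.
backsub = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16)

cap = cv2.VideoCapture("surveillance.avi")   # hypothetical input sequence
MIN_AREA = 200                               # hypothetical noise threshold (pixels)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = backsub.apply(frame)                      # foreground silhouette mask
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN,    # remove small noise blobs
                            cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (3, 3)))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    # Each remaining contour is one observation hypothesis: a contiguous
    # image area pointed out by its silhouette.
    observations = [c for c in contours if cv2.contourArea(c) >= MIN_AREA]
```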
Extraction of properties takes place in block X. Properties needed for the tracking, such as position, appearance and shape, are extracted. The extraction may look different depending on the detection algorithm used. The results must be comparable to the results from the other extraction modules in the tracking module.
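For illustration only, the properties mentioned here (position, appearance, shape) could be extracted from one detected silhouette roughly as follows; the specific feature choices (centroid, hue histogram, aspect ratio) are assumptions of this sketch, not the patent's prescription.

```python
import cv2

def extract_features(frame, contour):
    """Extract simple position, shape and appearance features from one
    observation (a non-degenerate silhouette contour). Illustrative sketch."""
    x, y, w, h = cv2.boundingRect(contour)
    m = cv2.moments(contour)
    cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]      # centroid = position
    patch = frame[y:y + h, x:x + w]
    hsv = cv2.cvtColor(patch, cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv], [0], None, [16], [0, 180])  # hue histogram = appearance
    hist = cv2.normalize(hist, None).flatten()
    return {
        "position": (cx, cy),
        "bbox": (x, y, w, h),
        "aspect_ratio": w / float(h),                      # crude shape descriptor
        "area": cv2.contourArea(contour),
        "appearance": hist,
    }
```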
Tracking takes place in block T. Each observation of a moving object is associated with an individual (object track). In many cases there is a one-to-one correspondence between observations and known individuals, but it can happen both that an individual is described by several observations and that several individuals are described by one observation. Individuals can be created and can disappear. An individual who has not been observed for a long time, or who is known not to return, is removed. An observation that cannot be associated with an existing individual leads to the birth of a new individual. The individual is in some sense preliminary until several observations have been made.
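As a rough sketch of this association step (not the patent's specific tracker), greedy nearest-neighbour matching on position with track birth and removal after a period without observations could look like this; the gating distance and the time-out are hypothetical parameters.

```python
import numpy as np

class Track:
    def __init__(self, track_id, position, frame_no):
        self.id = track_id
        self.position = np.asarray(position, dtype=float)
        self.last_seen = frame_no
        self.history = [tuple(position)]

def associate(tracks, observations, frame_no, next_id, gate=50.0, max_missed=30):
    """Greedy nearest-neighbour association of observed positions to tracks.
    Returns the surviving tracks and the updated id counter (sketch only)."""
    unmatched = list(observations)
    for track in tracks:
        if not unmatched:
            break
        d = [np.linalg.norm(track.position - np.asarray(o)) for o in unmatched]
        i = int(np.argmin(d))
        if d[i] < gate:                      # close enough: same individual
            track.position = np.asarray(unmatched.pop(i), dtype=float)
            track.history.append(tuple(track.position))
            track.last_seen = frame_no
    for obs in unmatched:                    # births: new, preliminary individuals
        tracks.append(Track(next_id, obs, frame_no))
        next_id += 1
    # Deaths: individuals not observed for a long time are removed.
    alive = [t for t in tracks if frame_no - t.last_seen <= max_missed]
    return alive, next_id
```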
Extraction of additional properties takes place in block Xj. Extraction of additional properties can be done when the detection has been associated with a known individual. Changes over time are extracted if the individual contains data about previous observations. If the individual is classified, properties related to the class can be extracted, e.g. the number of wheels if it is a vehicle or the height of a person. For people, e.g. characteristics that describe body movement are extracted.
Individual behavioral analysis takes place in the IBF block. In IBF, the individual behavioral filter, the individual's properties are updated with a further observation. Classification and anomaly detection are done to detect unwanted and unusual events.
The result of the individual analysis is that the objects moving in the area are detected and followed, so one can see how the objects move in the scene. The behavior of the objects is characterized continuously so that alarms can be given for predefined events and in case of anomalies.
Group analysis consists of finding, following and characterizing groups of individuals. A group is defined by enumeration of individuals or by some common value of some property. The characteristics that form the basis for the group division can be determined partly by the operator and partly by analysis of the individuals.
The different parts of the group analysis block in Figure 1 are presented in the following. The analysis takes place in a similar way as the analysis of individuals. The input data to the block is data from the individual analysis, and the result is that further observations can be associated with the tracks of the groups. Tracks of groups may appear and disappear. The input data can also be user-defined groups, which in that case must be entered into the system alongside the analysis of images.
Detection of groups takes place in block Dg. Groups can be specified by an operator or by another system enumerating the individuals included in each group. In this case, the detection of which individuals are included in the group is trivial. Groups can also be specified by some common property provided by an operator or another system, or the detection module automatically finds groups of individuals with similar characteristics. Groups need not be mutually exclusive; an individual can be part of several groups.
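To illustrate automatic detection of groups of individuals with similar characteristics, one possible (assumed, not patent-mandated) approach is density-based clustering of the individuals' state variables, e.g. position and velocity, using scikit-learn's DBSCAN; the feature weighting and the eps value are hypothetical. Note that this particular clustering yields disjoint groups, whereas the patent allows overlapping groups, e.g. when groups are also enumerated by an operator.

```python
import numpy as np
from sklearn.cluster import DBSCAN

def detect_groups(individuals):
    """Cluster individuals with similar position and velocity into groups.
    `individuals` is a list of dicts with 'id', 'position' and 'velocity'.
    Individuals labelled -1 by DBSCAN belong to no group. Sketch only."""
    features = np.array([
        list(ind["position"]) + [5.0 * v for v in ind["velocity"]]  # weight velocity
        for ind in individuals
    ])
    labels = DBSCAN(eps=40.0, min_samples=2).fit(features).labels_
    groups = {}
    for ind, label in zip(individuals, labels):
        if label != -1:
            groups.setdefault(int(label), []).append(ind["id"])
    return groups   # e.g. {0: ['a', 'b']}

groups = detect_groups([
    {"id": "a", "position": (10, 12), "velocity": (1.0, 0.0)},
    {"id": "b", "position": (14, 15), "velocity": (1.1, 0.1)},
    {"id": "c", "position": (200, 40), "velocity": (-0.5, 0.0)},
])
```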
Extraction of the properties of the group takes place in block Xg. Once a group has been identified, the properties of the group that are needed to associate the group with one of the known groups are extracted.

Tracking takes place in block Tg. Each observation of a group is associated with a known group, or a new group is created.
Extraction of additional properties takes place in block Xh. Now that there is a further observation regarding the group, the properties that are interesting and needed for the characterization of the group and its behavior are extracted.
Group behavioral analysis takes place in the GBF block. In GBF, the group behavioral filter, the group's properties are updated with a further observation. Unwanted and unusual behaviors in the group are detected.
The result of the group analysis is that groups in the area are followed and their behavior is continuously characterized, so that alarms can be given at predefined events and anomalies. The method also analyzes the individuals who move in the scene in order to find any groups, which may be individuals who have a common goal, individuals who have similar behavior, or individuals who look alike, etc.
Based on the individual and group analysis and other information about the entire area, an analysis of the whole area takes place. The different parts of the scene analysis block, see Figure 1, are presented in the following. The input data is the output from the individual and group analysis. This block is analogous to the second part of the individual and group analysis blocks: first, properties that describe the entire area (the scene) are extracted, and then the state variables that describe the scene are updated.
Extraction of properties for the whole area takes place in block XS. When the state variables for individuals and groups have been updated, the properties for the whole area that are interesting and needed to characterize the situation in the area are extracted.
Behavioral analysis of the entire area takes place in the SBF block. In SBF, the scene behavioral filter, the properties for the whole area are updated.

The result of the analysis of the whole area is a situational picture that is created based on the individuals moving in the area and the characterization of their behavior, and on the analysis of possible groups and their behavior. The situational picture could e.g. indicate whether there are several groups that behave threateningly towards others or towards each other, or whether only a few individuals behave suspiciously.
The three filters (for the individual, the group and the whole area) used in the behavioral analysis have the same structure with four components each, see Figure 2. The contents of the three filters differ. Each filter has a method that calculates the current state from new observations and the current time, and a method that predicts the current state at a time after the last observation. There is a method that classifies the current state into either pre-given classes or classes given by current and previous data, and a method that detects anomalies in the current state by keeping statistics on how often different states occur. The first part can e.g. be done with a Kalman filter, which must contain prediction of the next measurement value, updating of the state, and calculation of the distance between two states.
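A minimal sketch of the kind of Kalman filter mentioned here, with the three operations the text asks for: prediction of the next measurement value, updating of the state, and a distance (here a Mahalanobis-style distance between the predicted measurement and a new observation stands in for the distance between states). The constant-velocity model in image coordinates and the noise levels are assumptions of this sketch.

```python
import numpy as np

class ConstantVelocityKalman:
    """Kalman filter over the state [x, y, vx, vy] with position measurements.
    Provides prediction, update and a gating distance (illustrative sketch)."""

    def __init__(self, x0, y0, dt=1.0):
        self.x = np.array([x0, y0, 0.0, 0.0], dtype=float)   # state estimate
        self.P = np.eye(4) * 100.0                            # state covariance
        self.F = np.array([[1, 0, dt, 0],                     # constant-velocity model
                           [0, 1, 0, dt],
                           [0, 0, 1, 0],
                           [0, 0, 0, 1]], dtype=float)
        self.H = np.array([[1, 0, 0, 0],
                           [0, 1, 0, 0]], dtype=float)        # we measure position only
        self.Q = np.eye(4) * 0.1                              # process noise (assumed)
        self.R = np.eye(2) * 4.0                              # measurement noise (assumed)

    def predict(self):
        """Predict the state at the next time step and return the expected measurement."""
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.H @ self.x

    def distance(self, z):
        """Mahalanobis distance between the predicted measurement and observation z."""
        y = np.asarray(z, dtype=float) - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        return float(np.sqrt(y @ np.linalg.solve(S, y)))

    def update(self, z):
        """Update the state estimate with a new position measurement z."""
        y = np.asarray(z, dtype=float) - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
```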
The different parts of a behavioral filter are presented in the following. The filter contains two databases, one with a description of property classes, 5 in Figure 2, which may be estimated from data or given from outside by the user. The second database, 6 in Figure 2, contains a description of normal values in order to be able to detect deviations from such normal values. The description of normal values is estimated continuously from current data.
Prediction takes place in block P. The current state is a function of time and of all or part of the received sensor data. In many cases the state changes over time, and if one wants the current value some time after the last observation, the current value should be predicted based on previous data and the current time.
Updating of states takes place in block T. The second part of the filter consists in characterizing the current state in terms of dynamic or deterministic pre-given classes. The classes can be defined in terms of directly measurable properties but can also be given an explanatory description in the definition of the class.

Classification takes place in block C. The current state is classified in block C in Figure 2, i.e. for each state vector it is calculated to which class the state vector belongs. The definition of the classes is stored in database 5 in Figure 2. The definition of the classes is either given by another part of the system and entered into the database from outside (user-defined classes), or the classes are created in block K in Figure 2. The class definition can be updated for each new state vector that arrives and is then entered into the database. The classes can be named to facilitate interpretation of the current situation. If the class division is updated dynamically, any naming must be reviewed when the classes change.
Classification of data is done to facilitate the interpretation of the current situation and supportthe operator's work to find connections between different individuals and events.
The method can see connections that are difficult for an operator to detect. The values of the state variables provide more detailed information but may be too extensive for an operator to be able to interpret. Classification is done with some method known to the person skilled in the art, such as e.g. K-means. An overview of this method is presented in R. O. Duda, P. E. Hart and D. G. Stork, "Pattern Classification", 2nd Edition, Wiley, 2000.
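As an illustration of classifying state vectors with K-means (the method named above), one could use scikit-learn; the number of classes, the example features and the class names are assumptions of this sketch.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical state vectors, e.g. [speed, direction change, time in area].
state_vectors = np.array([
    [0.2, 0.05, 600.0],
    [1.4, 0.10, 120.0],
    [1.5, 0.08, 110.0],
    [0.1, 0.90, 900.0],
])

# Create behaviour classes from current and previous data (cf. block K),
# then classify each new state vector into one of them (cf. block C).
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(state_vectors)

new_state = np.array([[1.3, 0.07, 130.0]])
class_id = int(kmeans.predict(new_state)[0])   # index of the class the state belongs to

# The classes can be named afterwards to ease interpretation; which index maps
# to which behaviour depends on the fitted clusters (hypothetical naming):
class_names = {0: "loitering", 1: "passing through"}
```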
Anomaly detection takes place in block AD. The method detects states that are predefined as abnormal, or states that deviate from the normal in the current sensor data. Normal behaviors can for example be determined by statistics regarding sensor data. One can e.g. assume that a parameter is normally distributed and estimate the distribution parameters from the data.
Those skilled in the art are familiar with such anomaly detection. O. Boiman and M. Irani, "Detecting irregularities in images and in video", is an example of how anomalous movements can be detected.
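A hedged sketch of the statistical approach described above: assume a scalar parameter (e.g. an individual's speed) is normally distributed, estimate the distribution parameters from observed data, and flag values far out in the tails; the 3-sigma threshold and the minimum sample count are assumed choices.

```python
import numpy as np

class GaussianAnomalyDetector:
    """Keeps running statistics of a scalar parameter assumed to be
    normally distributed and flags values far from the normal range."""

    def __init__(self, threshold_sigma=3.0):
        self.values = []
        self.threshold = threshold_sigma

    def update(self, value):
        """Add a new observation to the running description of 'normal'."""
        self.values.append(float(value))

    def is_anomalous(self, value):
        if len(self.values) < 30:           # too little data to call anything abnormal
            return False
        mu = np.mean(self.values)
        sigma = np.std(self.values) + 1e-9  # avoid division by zero
        return abs(value - mu) / sigma > self.threshold

# Example: speeds in m/s observed in the area so far.
detector = GaussianAnomalyDetector()
for speed in np.random.normal(1.4, 0.3, size=500):   # simulated normal behaviour
    detector.update(speed)
print(detector.is_anomalous(1.5))   # False: within the normal range
print(detector.is_anomalous(6.0))   # True: e.g. someone running through the area
```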
The result of the analysis according to the invention is a situational picture of the area and can be presented as a combination of maps, sensor data and text descriptions. A map or drawing can show the monitored area with symbols for basic conditions and the current state. It is convenient to let the operator select a subset of all the available information. The map can e.g. show geographical constraints and the position and field of view of the sensors. Video sequences can be viewed in real time, as can data from other sensors. Symbols for the current state and the analysis results can be added to the streams.
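For illustration only (the patent does not specify a rendering API), analysis results could be drawn as symbols on a video stream with OpenCV, e.g. a bounding box, track id, class name and an anomaly flag per individual; the colours, labels and dictionary layout are hypothetical.

```python
import cv2

def overlay_results(frame, tracks):
    """Draw current state and analysis results on a video frame.
    `tracks` is a list of dicts with 'bbox', 'id', 'class_name', 'anomalous'."""
    for t in tracks:
        x, y, w, h = t["bbox"]
        colour = (0, 0, 255) if t["anomalous"] else (0, 255, 0)   # red if anomalous
        cv2.rectangle(frame, (x, y), (x + w, y + h), colour, 2)
        label = f'{t["id"]}: {t["class_name"]}'
        cv2.putText(frame, label, (x, y - 5),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.5, colour, 1)
    return frame
```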
An event log can display events in a list in chronological order or with some other sorting. Events can e.g. be given different appearances in the event log according to their priority, in order to facilitate the operator's work. It may also be possible to let the operator freeze the image at a certain time, and to go back in time to see how the situation was at an earlier point.
Claims:
Claims (2)
[1]
A method for video surveillance of an area with behavioral analysis of individuals and groups of individuals, and which generates a situational picture over the whole area, comprising, for said individuals, groups and area, determining the state variables one is interested in to describe said individuals, groups and areas and their behaviors, and also how said state variables depend on properties that can be extracted from an image; that each of said individuals, groups and areas is assigned such a set of state variables; that for each image in an image sequence the individuals present in the image are detected and, by extracting values of the affected properties, the state variables needed to be able to associate an individual with the corresponding individual in any previous image in the image sequence are determined and stored; that when analyzing an image in the image sequence, one tries to carry out the association of each individual in the image, which either succeeds, whereby it is indicated that an individual is the same as an individual in a previous image, or fails, whereby the individual is indicated as a new individual; characterized by extracting values of the properties needed to determine additional state variables that one is interested in for each individual; that based on these properties, and, if said each individual is associated with an individual in a previous image, previous values of the state variables for this individual, said additional state variables are determined and stored; that one classifies each individual and performs anomaly detection; that one assigns individuals to groups, either by designation by an operator or based on state variables determined either in the individual analysis of each individual or on state variables that are determined and stored specifically for the current group analysis, in the latter case extracting values of the relevant properties that determine said state variables determined specifically for the group analysis; that one extracts values of the properties needed to determine additional state variables that one is interested in for said groups; that one, based on these properties, and, for the groups that are associated with groups in a previous image, previous values of the state variables for these groups, determines and stores said additional state variables for said groups; that one classifies the groups and performs anomaly detection; that one extracts values of the properties needed to determine state variables that, in addition to the state variables of individuals and groups, are needed to update the situational picture of the area; and that one, based on the state variables of individuals and groups and the state variables determined specifically for updating the situational picture, updates the state variables describing the situational picture and presents this situational picture.
[2]
A computer system for video surveillance of an area with behavioral analysis of individuals and groups of individuals, and which generates a situational picture over the whole area, comprising the computer system, for said individuals, groups and areas, determining the state variables stated to be of interest to describe said individuals, groups and areas and their behaviors, and also how said state variables depend on properties that can be extracted from an image; that the computer system assigns to each of said individuals, groups and areas such a set of state variables; that the computer system, for each image in an image sequence, detects the individuals present in the image and, by extracting values of the affected properties, determines and stores the state variables needed to be able to associate an individual with the corresponding individual in any previous image in the image sequence; that the computer system, when analyzing an image in the image sequence, tries to carry out the association of each individual in the image, which either succeeds, the computer system determining that an individual is the same as an individual in a previous image, or fails, the computer system indicating the individual as a new individual; characterized in that the computer system extracts values of the properties needed to determine additional state variables that are stated to be of interest for each individual; that the computer system, based on these properties, and, if said each individual is associated with an individual in a previous image, previous values of the state variables for that individual, determines and stores said additional state variables; that the computer system classifies each individual and performs anomaly detection; that the computer system assigns individuals to groups, either by designation by an operator or based on state variables either determined in the individual analysis of each individual or determined and stored especially for the current group analysis, the computer system in the latter case extracting values of the relevant properties that determine said state variables determined specifically for the group analysis; that the computer system extracts values of the properties needed to determine additional state variables that are stated to be of interest for said groups; that the computer system, based on these properties, and, for the groups associated with groups in a previous image, previous values of the state variables for these groups, determines and stores said additional state variables for said groups; that the computer system classifies the groups and performs anomaly detection; that the computer system extracts values of the properties needed to determine state variables that, in addition to the state variables of individuals and groups, are needed to update the situational picture of the area; and that the computer system, based on the state variables of individuals and groups and the state variables determined specifically for updating the situational picture, updates the state variables that describe the situational picture and presents this situational picture.
Similar documents:
Publication number | Publication date | Patent title
Benabbas et al.2010|Motion pattern extraction and event detection for automatic visual surveillance
EP2659456B1|2020-02-19|Scene activity analysis using statistical and semantic feature learnt from object trajectory data
Lau et al.2010|Multi-model hypothesis group tracking and group size estimation
JP2012194967A|2012-10-11|Finding of abnormal pattern
WO2009137118A1|2009-11-12|System architecture and process for assessing multi-perspective multi-context abnormal behavior
Choudhary et al.2017|Real-time crowd behavior detection using SIFT feature extraction technique in video sequences
Ghrab et al.2016|Abnormal events detection based on trajectory clustering
Shukla et al.2016|Survey on image mining, its techniques and application
Zaidi et al.2017|Video anomaly detection and classification for human activity recognition
US20200012866A1|2020-01-09|System and method of video content filtering
Zhang et al.2010|A Multiple Instance Learning and Relevance Feedback Framework for Retrieving Abnormal Incidents in Surveillance Videos.
Toloue et al.2018|Anomalous behavior detection of marine vessels based on Hidden Markov Model
Campo et al.2017|Modeling and classification of trajectories based on a gaussian process decomposition into discrete components
Henrio et al.2018|Anomaly detection in videos recorded by drones in a surveillance context
KR20140106362A|2014-09-03|Method and Apparatus for detecting abnormal behavior
Ramachandran et al.2019|An intelligent system to detect human suspicious activity using deep neural networks
SE1000386A1|2011-10-17|Method for video monitoring of an area of behavioral analysis and computer systems for implementing the method
KR102028930B1|2019-10-07|method of providing categorized video processing for moving objects based on AI learning using moving information of objects
Sheu et al.2019|STAM-CCF: suspicious tracking across multiple camera based on correlation filters
Cota2020|Monitoring COVID-19 prevention measures on CCTV cameras using Deep Learning
Madokoro et al.2012|Classification of behavior patterns with trajectory analysis used for event site
Fernández-Caballero et al.2010|A proposal for local and global human activities identification
Kang et al.2014|Motion pattern study and analysis from video monitoring trajectory
Dilectin et al.2012|Classification and dynamic class detection of real time data for tsunami warning system
Akilandasowmya et al.2015|Human action analysis using K-NN classifier
Patent family:
Publication number | Publication date
SE534706C2|2011-11-22|
WO2011129739A1|2011-10-20|
Cited documents:
Publication number | Filing date | Publication date | Applicant | Patent title

US9892606B2|2001-11-15|2018-02-13|Avigilon Fortress Corporation|Video surveillance system employing video primitives|
US7558404B2|2005-11-28|2009-07-07|Honeywell International Inc.|Detection of abnormal crowd behavior|
BRPI0806968B1|2007-02-08|2018-09-18|Behavioral Recognition Sys Inc|method for processing video frame stream and associated system|
US8195598B2|2007-11-16|2012-06-05|Agilence, Inc.|Method of and system for hierarchical human/crowd behavior detection|
US20100036875A1|2008-08-07|2010-02-11|Honeywell International Inc.| system for automatic social network construction from image data|
Legal status:
2015-12-15| NUG| Patent has lapsed|
Priority:
Application number | Filing date | Patent title
SE1000386A|SE534706C2|2010-04-16|2010-04-16|Method for video monitoring of an area of behavioral analysis and computer systems for implementing the method|SE1000386A| SE534706C2|2010-04-16|2010-04-16|Method for video monitoring of an area of behavioral analysis and computer systems for implementing the method|
PCT/SE2011/000068| WO2011129739A1|2010-04-16|2011-04-14|Method for video surveillance of an area with behavioural analysis and computer system for carrying out the method|