ANALYSIS APPARATUS, ANALYSIS METHOD, AND STORAGE MEDIUM
Provided is an analysis apparatus (10) including a person extraction unit (11) that analyzes video data to extract a person, a time calculation unit (12) that calculates a continuous appearance time period for which the extracted person has been continuously present in a predetermined area and a reappearance time interval until the extracted person reappears in the predetermined area for each extracted person, and an inference unit (13) that infers a characteristic of the extracted person on the basis of the continuous appearance time period and the reappearance time interval.
The present application is a Continuation application of U.S. patent application Ser. No. 16/080,745 filed on Aug. 29, 2018, which is a National Stage Entry of international application PCT/JP2017/005100, filed Feb. 13, 2017, which claims the benefit of priority from Japanese Patent Application 2016-067538 filed on Mar. 30, 2016, the disclosures of all of which are incorporated herein, in their entirety, by this reference.
TECHNICAL FIELD
The invention relates to an analysis apparatus, an analysis method, and a program.
BACKGROUND ART
The related art is disclosed in Patent Document 1. Patent Document 1 discloses a countermeasure system against suspicious persons which detects the face of a person in a captured image of a scene in a surveillance range and determines whether countermeasures are needed, or the degree of countermeasures, on the basis of, for example, the size of the face, the time for which the person has been continuously present in the surveillance range, or the number of times the person appears in the surveillance range. It is assumed that, as the length of time or the number of times described above increases, the possibility that the person is a suspicious person increases.
Patent Documents 2 and 3 disclose an index generation apparatus that generates an index in which a plurality of nodes are hierarchized.
RELATED DOCUMENT
Patent Document
[Patent Document 1] Japanese Patent Application Publication No. 2006-11728
[Patent Document 2] WO2014/109127
[Patent Document 3] Japanese Patent Application Publication No. 2015-49574
SUMMARY OF THE INVENTION
Technical Problem
In a case in which, as in the technique disclosed in Patent Document 1, the possibility that a person is a suspicious person is determined to be higher as the time period for which the person has been continuously present or the number of times the person appears increases, a determination error is likely to occur. For example, a person who continuously works in the surveillance range is erroneously determined to be a suspicious person. In order to solve this problem, it is desirable to have various criteria for determination.
An object of the invention is to provide a new technique for inferring a characteristic of a person extracted from an image.
Solution to Problem
In one exemplary embodiment of the invention, there is provided an analysis apparatus comprising: a person extraction unit that analyzes video data to extract a person; a time calculation unit that calculates a continuous appearance time period for which the extracted person has been continuously present in a predetermined area and a reappearance time interval until the extracted person reappears in the predetermined area for each extracted person; and an inference unit that infers a characteristic of the extracted person on the basis of the continuous appearance time period and the reappearance time interval.
In another exemplary embodiment of the invention, there is provided an analysis method performed by a computer, the method comprising: a person extraction step of analyzing video data to extract a person; a time calculation step of calculating a continuous appearance time period for which the extracted person has been continuously present in a predetermined area and a reappearance time interval until the extracted person reappears in the predetermined area for each extracted person; and an inference step of inferring a characteristic of the extracted person on the basis of the continuous appearance time period and the reappearance time interval.
In still another exemplary embodiment of the invention, there is provided a program that causes a computer to function as: a person extraction unit that analyzes video data to extract a person; a time calculation unit that calculates a continuous appearance time period for which the extracted person has been continuously present in a predetermined area and a reappearance time interval until the extracted person reappears in the predetermined area for each extracted person; and an inference unit that infers a characteristic of the extracted person on the basis of the continuous appearance time period and the reappearance time interval.
Advantageous Effects of Invention
According to the invention, it is possible to provide a new technique for inferring the characteristic of a person extracted from an image.
The above objects and other objects, features, and advantages will become more apparent from the following description of the preferred exemplary embodiments and the accompanying drawings.
First, an example of the hardware configuration of an apparatus (analysis apparatus) according to an exemplary embodiment will be described. Each unit of the apparatus according to the exemplary embodiment is implemented by an arbitrary combination of hardware and software of an arbitrary computer, the hardware including a central processing unit (CPU), a memory, a program loaded into the memory, a storage unit storing the program, such as a hard disk (which can store not only a program stored in the apparatus in advance at the shipment stage but also a program loaded from a storage medium, such as a compact disc (CD), or from a server on the Internet), and an interface for connection to a network. It will be understood by those skilled in the art that there are various modification examples of the implementation method and the apparatus.
The bus 5A is a data transmission path through which the processor 1A, the memory 2A, the peripheral circuit 4A, and the input/output interface 3A transmit and receive data. The processor 1A is an arithmetic processing unit such as a central processing unit (CPU) or a graphics processing unit (GPU). The memory 2A is, for example, a random access memory (RAM) or a read only memory (ROM). The input/output interface 3A includes, for example, an interface for acquiring information from an external apparatus, an external server, and an external sensor. The processor 1A outputs commands to each module and performs calculation on the basis of the calculation results of the modules.
Next, the exemplary embodiment will be described. The functional block diagrams used in the following description of the exemplary embodiment do not illustrate the structure of each hardware unit, but illustrate a functional unit block. In the diagrams, each apparatus is implemented by one device. However, a means of implementing each apparatus is not limited thereto. That is, each apparatus may be physically divided or may be logically divided. The same components are denoted by the same reference numerals and the description thereof will not be repeated.
First Exemplary Embodiment
First, the outline of this exemplary embodiment will be described. An analysis apparatus according to this exemplary embodiment analyzes video data to extract a person. Then, the analysis apparatus calculates the time period (continuous appearance time period) for which the extracted person has been continuously present in a predetermined area and the time interval (reappearance time interval) until the extracted person reappears in the predetermined area after leaving the predetermined area (disappearing from the predetermined area) for each extracted person. Then, the analysis apparatus infers the characteristic of the extracted person on the basis of the continuous appearance time period and the reappearance time interval. The characteristic inferred on this basis is a kind of information that can be recognized from the context or the state of the person. Examples of the characteristic include a traveler, a passerby, a pickpocket, an operator, a migrant worker, a suspicious person, a demonstrator, and a homeless person. These examples are illustrative, and the characteristic is not limited thereto. Hereinafter, this exemplary embodiment will be described in detail.
The person extraction unit 11 analyzes video data and extracts a person from the video data. Any technique can be used as a process of extracting a person.
For example, video data which is captured by one or a plurality of cameras (for example, surveillance cameras) installed at predetermined positions is input to the person extraction unit 11. For example, the person extraction unit 11 processes the video data in time series and extracts a person from the video data.
The person extraction unit 11 may process all of the frames included in the video data, or may process frames at predetermined intervals. Then, the person extraction unit 11 extracts a person from each frame which is a processing target. In addition, the person extraction unit 11 extracts the feature amount (for example, the feature amount of the face) of the outward appearance of the extracted person.
The time calculation unit 12 calculates, for each extracted person, the continuous appearance time period for which the person extracted by the person extraction unit 11 has been continuously present in a predetermined area and the reappearance time interval until the person reappears in the predetermined area after leaving the predetermined area. Any technique can be used to calculate the continuous appearance time period and the reappearance time interval. Hereinafter, an example will be described; however, the invention is not limited thereto.
For example, the time period for which a person appears continuously in the video data may be the continuous appearance time period and the time interval until the person reappears after disappearing from the video data may be the reappearance time interval. That is, when an n-th frame to be processed is represented by Fn, for example, it is assumed that a certain person is extracted from each of frames F1 to F500, is not extracted from each of frames F501 to F1500, and is extracted from a frame F1501 again. In this case, the time elapsed from the frame F1 to the frame F500 may be the continuous appearance time period and the time elapsed from the frame F501 to the frame F1501 may be the reappearance time interval.
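The frame-based reading above can be sketched in code. This is a hypothetical illustration and not part of the disclosed apparatus: the function name, the frame-rate parameter `fps`, and the choice to measure the gap from the first missing frame (F501 in the example) are all assumptions.

```python
def periods_from_frames(present_frames, fps):
    """Return (continuous appearance period, reappearance interval) pairs
    in seconds, given the sorted frame numbers in which one person was
    extracted. A break in consecutive frame numbers ends one appearance."""
    results = []
    run_start = prev = present_frames[0]
    for f in present_frames[1:]:
        if f == prev + 1:
            prev = f                       # the run of appearances continues
        else:
            # Run ended at `prev`; the gap from the first missing frame
            # (prev + 1) to the reappearance frame `f` is the interval.
            results.append(((prev - run_start) / fps, (f - prev - 1) / fps))
            run_start = prev = f
    return results

# The example in the text: extracted in F1-F500, absent in F501-F1500,
# extracted again in F1501 (fps = 1 so the numbers are easy to read).
frames = list(range(1, 501)) + [1501]
print(periods_from_frames(frames, fps=1))  # [(499.0, 1000.0)]
```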
In this case, the area captured by the camera is the predetermined area. Alternatively, when the continuous appearance time period and the reappearance time interval are calculated by the method described below, the predetermined area can be expanded to the area captured by the camera and a peripheral area.
Here, it is assumed that video data captured by one camera is a processing target. The time calculation unit 12 calculates an elapsed time t from the extraction of a person A (first person) from the video data to the next extraction of the person A from the video data. Then, when the elapsed time t is less than a predetermined time ts (a matter of design) (or when the elapsed time t is equal to or less than the predetermined time ts), it is determined that the first person has been continuously present in the predetermined area for the elapsed time t. On the other hand, when the elapsed time t is equal to or greater than the predetermined time ts (or when the elapsed time t is greater than the predetermined time ts), it is determined that the first person has not been present in the predetermined area for the elapsed time t.
A detailed example will be described with reference to
An elapsed time t2 from the second extraction of the person A to the third extraction of the person A is less than the predetermined time ts. Therefore, the time calculation unit 12 determines that the person A has been continuously present in the predetermined area for the time (elapsed time t2) from the second extraction to the third extraction. Then, the time calculation unit 12 determines that the person A has been continuously present in the predetermined area for the time (here, the time from the first extraction to the third extraction) for which the state in which an elapsed time is less than the predetermined time ts continues.
An elapsed time t3 from the third extraction of the person A to the fourth extraction of the person A is greater than the predetermined time ts. Therefore, the time calculation unit 12 determines that the person A has not been present in the predetermined area for the time (elapsed time t3) from the third extraction to the fourth extraction. Then, the time calculation unit 12 sets the elapsed time t3 as the reappearance time interval. In addition, the time calculation unit 12 sets (t1+t2) as the continuous appearance time period.
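The threshold-based calculation just described might be sketched as follows. The function and variable names are illustrative assumptions; `ts` is the design-dependent threshold from the text.

```python
def compute_periods(extraction_times, ts):
    """Return fixed (continuous appearance period, reappearance interval)
    pairs from the timestamps at which one person was extracted.
    Successive extractions closer than `ts` count as continuous presence;
    a gap of `ts` or more fixes the period and becomes the interval."""
    results = []
    continuous = 0
    for prev, curr in zip(extraction_times, extraction_times[1:]):
        t = curr - prev
        if t < ts:
            continuous += t                  # still continuously present
        else:
            results.append((continuous, t))  # period fixed; gap = interval
            continuous = 0                   # a new appearance begins
    return results

# The worked example above: t1 and t2 are less than ts, t3 is not.
times = [0, 10, 25, 1000]             # t1 = 10, t2 = 15, t3 = 975
print(compute_periods(times, ts=60))  # [(25, 975)]
```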
Another example will be described. Here, it is assumed that video data captured by a plurality of cameras is a processing target. All of the plurality of cameras capture images of predetermined positions in the same predetermined area. For example, all of the plurality of cameras may be installed in the same area of a “∘∘” park. The imaging areas captured by the plurality of cameras may partially overlap each other or may not overlap each other.
The time calculation unit 12 calculates the elapsed time t from the extraction of the first person from video data captured by a first camera to the next extraction of the first person from video data by any camera (which may be the first camera or another camera). When the elapsed time t is less than the predetermined time ts (a matter of design) (or when the elapsed time t is equal to or less than the predetermined time ts), it is determined that the first person has been continuously present in the predetermined area for the elapsed time t. On the other hand, when the elapsed time t is equal to or greater than the predetermined time ts (or when the elapsed time t is greater than the predetermined time ts), it is determined that the first person has not been present in the predetermined area for the elapsed time t.
A detailed example will be described with reference to
The elapsed time t2 from the second extraction of the person A from the video data captured by the camera B (Cam B) to the next (third) extraction of the person A from video data captured by a camera C (Cam C) is less than the predetermined time ts. Therefore, the time calculation unit 12 determines that the person A has been continuously present in the predetermined area for the time (elapsed time t2) from the second extraction to the third extraction. Then, the time calculation unit 12 determines that the person A has been continuously present in the predetermined area for the time (the time from the first extraction to the third extraction) for which the state in which an elapsed time is less than the predetermined time ts continues.
The elapsed time t3 from the third extraction of the person A from the video data captured by the camera C (Cam C) to the next (fourth) extraction of the person A from the video data captured by the camera B (Cam B) is greater than the predetermined time ts. Therefore, the time calculation unit 12 determines that the person A has not been present in the predetermined area for the time (elapsed time t3) from the third extraction to the fourth extraction. Then, the time calculation unit 12 sets the elapsed time t3 as the reappearance time interval. In addition, the time calculation unit 12 sets a time (t1+t2) as the continuous appearance time period.
Next, this exemplary embodiment will be described on the assumption that the calculation method described with reference to
Incidentally, in order to perform the above-mentioned process, it is necessary to determine whether a person extracted from a certain frame and a person extracted from a previous frame are the same person. All pairs of the feature amounts of the outward appearance of each person extracted from previous frames and the feature amounts of the outward appearance of each person extracted from the current frame may be compared to perform this determination. However, in the case of this process, as the accumulated data of persons increases, the number of pairs to be compared increases and thus the processing load increases. Therefore, for example, the method described below may be adopted.
For example, the extracted person may be indexed as illustrated in
An extraction identifier (ID) “F∘∘∘-∘∘∘∘” illustrated in
In a third layer, nodes corresponding to all of the extraction IDs obtained from the processed frames are arranged. Among a plurality of nodes arranged in the third layer, nodes with similarity (similarity between the feature amounts of the outward appearance) that is equal to or higher than a first level are grouped. In the third layer, a plurality of extraction IDs which are determined to indicate the same person are grouped. That is, the first level of the similarity is set to a value that can implement the grouping. Person identification information (person ID) is given so as to correspond to each group in the third layer.
In a second layer, one node (representative) which is selected from each of a plurality of groups in the third layer is arranged and is associated with the group in the third layer. Among a plurality of nodes arranged in the second layer, nodes with similarity that is equal to or higher than a second level are grouped. The second level of the similarity is lower than the first level. That is, the nodes which are not grouped together on the basis of the first level may be grouped together on the basis of the second level.
In a first layer, one node (representative) which is selected from each of a plurality of groups in the second layer is arranged and is associated with the group in the second layer.
For example, the time calculation unit 12 indexes a plurality of extraction IDs obtained by the above-mentioned process as illustrated in
Then, when a new extraction ID is obtained from a new frame, the time calculation unit 12 determines whether a person corresponding to the extraction ID is identical to a previously extracted person using the information. In addition, the time calculation unit 12 adds the new extraction ID to the index. Next, this process will be described.
First, the time calculation unit 12 sets a plurality of extraction IDs in the first layer as a comparison target. The time calculation unit 12 makes a pair of the new extraction ID and each of the plurality of extraction IDs in the first layer. Then, the time calculation unit 12 calculates similarity (similarity between the feature amounts of the outward appearance) for each pair and determines whether the calculated similarity is equal to or greater than a first threshold value (is equal to or higher than a predetermined level).
In a case in which an extraction ID with similarity that is equal to or greater than the first threshold value is not present in the first layer, the time calculation unit 12 determines that the person corresponding to the new extraction ID is not identical to any previously extracted person. Then, the time calculation unit 12 adds the new extraction ID to the first to third layers and associates them with each other. In the second and third layers, a new group is generated by the added new extraction ID. In addition, a new person ID is issued in correspondence with the new group in the third layer. Then, the person ID is specified as the person ID of the person corresponding to the new extraction ID.
On the other hand, in a case in which the extraction ID with similarity that is equal to or greater than the first threshold value is present in the first layer, the time calculation unit 12 changes a comparison target to the second layer. Specifically, the group in the second layer associated with the “extraction ID in the first layer which has been determined to have similarity equal to or greater than the first threshold value” is set as a comparison target.
Then, the time calculation unit 12 makes a pair of the new extraction ID and each of a plurality of extraction IDs included in the group to be processed in the second layer. Then, the time calculation unit 12 calculates similarity for each pair and determines whether the calculated similarity is equal to or greater than a second threshold value. The second threshold value is greater than the first threshold value.
In a case in which an extraction ID with similarity that is equal to or greater than the second threshold value is not present in the group to be processed in the second layer, the time calculation unit 12 determines that the person corresponding to the new extraction ID is not identical to any previously extracted person. Then, the time calculation unit 12 adds the new extraction ID to the second and third layers and associates them with each other. In the second layer, the new extraction ID is added to the group to be processed. In the third layer, a new group is generated by the added new extraction ID. In addition, a new person ID is issued in correspondence with the new group in the third layer. Then, the time calculation unit 12 specifies the person ID as the person ID of the person corresponding to the new extraction ID.
On the other hand, in a case in which the extraction ID with similarity that is equal to or greater than the second threshold value is present in the group to be processed in the second layer, the time calculation unit 12 determines that the person corresponding to the new extraction ID is identical to a previously extracted person. Then, the time calculation unit 12 puts the new extraction ID into the group in the third layer associated with the “extraction ID in the second layer which has been determined to have similarity equal to or greater than the second threshold value”. In addition, the time calculation unit 12 determines the person ID corresponding to the group in the third layer as the person ID of the person corresponding to the new extraction ID.
For example, in this way, it is possible to associate a person ID with one extraction ID or each of a plurality of extraction IDs extracted from a new frame.
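As a rough, hypothetical sketch of this layered search, the following collapses the structure to a list of first-layer representatives and their second-layer groups, and uses cosine similarity as a stand-in for the unspecified similarity measure. The class name and all identifiers are invented; the real index also maintains third-layer groups per person, which this sketch folds into the person IDs.

```python
import math

def similarity(a, b):
    """Placeholder similarity measure (cosine similarity of feature vectors)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

class LayeredIndex:
    def __init__(self, th1, th2):
        assert th2 > th1            # the second threshold is the stricter one
        self.th1, self.th2 = th1, th2
        self.layer1 = []            # representatives: (feature, group index)
        self.groups = []            # second-layer groups: lists of (feature, person ID)
        self.next_person_id = 0

    def assign(self, feature):
        """Return the person ID for a new extraction, adding it to the index."""
        # First layer: coarse comparison against group representatives only.
        for rep, gidx in self.layer1:
            if similarity(feature, rep) >= self.th1:
                # Second layer: fine comparison within the matched group only.
                for member, pid in self.groups[gidx]:
                    if similarity(feature, member) >= self.th2:
                        return pid              # same person as one seen before
                # No fine match: new person inside an existing coarse group.
                pid = self.next_person_id
                self.next_person_id += 1
                self.groups[gidx].append((feature, pid))
                return pid
        # No coarse match at all: new representative, new group, new person.
        pid = self.next_person_id
        self.next_person_id += 1
        self.layer1.append((feature, len(self.groups)))
        self.groups.append([(feature, pid)])
        return pid
```

Only representatives are compared first, so the number of comparisons stays small even as the accumulated data grows, which is the point of the layering described above.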
For example, the time calculation unit 12 may manage information illustrated in
The values of the continuous appearance time period and the latest extraction timing are updated as needed. For example, when a certain person is extracted from the video data for the first time and a new person ID is added to the information illustrated in
Then, in a case in which the person is extracted for the second time, it is determined whether the person has been continuously present for the elapsed time t1 on the basis of the magnitude comparison between the elapsed time t1 and the predetermined time ts, as described above. The elapsed time t1 is calculated on the basis of, for example, the value in the field of the latest extraction timing and the extraction timing of the second time. In a case in which it is determined that the person has been present, the value of the continuous appearance time period is updated. Specifically, the sum of the value recorded at that time and the elapsed time t1 is recorded in the field. Here, t1 (=0+t1) is recorded. Then, the latest extraction timing is updated to the extraction timing of the second time. Then, the time calculation unit 12 waits for the next extraction.
Then, in a case in which the person is extracted for the third time, it is determined whether the person has been continuously present for the elapsed time t2 on the basis of the magnitude comparison between the elapsed time t2 and the predetermined time ts, as described above. The elapsed time t2 is calculated on the basis of, for example, the value in the field of the latest extraction timing and the extraction timing of the third time. In a case in which it is determined that the person has been present, the value of the continuous appearance time period is updated. Specifically, the sum (t1+t2) of the value (t1) recorded at that time and the elapsed time t2 is recorded in the field. Then, the latest extraction timing is updated to the extraction timing of the third time. Then, the time calculation unit 12 waits for the next extraction.
Then, in a case in which the person is extracted for the fourth time, it is determined whether the person has been continuously present for the elapsed time t3 on the basis of the magnitude comparison between the elapsed time t3 and the predetermined time ts, as described above. The elapsed time t3 is calculated on the basis of, for example, the value in the field of the latest extraction timing and the extraction timing of the fourth time. In a case in which it is determined that the person has not been present, the value of the continuous appearance time period at that time is fixed as the continuous appearance time period of the person. In addition, the elapsed time t3 is fixed as the reappearance time interval of the person. Then, the pair of the fixed continuous appearance time period and the fixed reappearance time interval is input to the inference unit 13.
In addition, the value of the continuous appearance time period is updated. Specifically, “0” is recorded in the field. Then, the latest extraction timing is updated to the extraction timing of the fourth time. Then, the time calculation unit 12 waits for the next extraction. Then, the same process as described above is repeated.
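The table update described in the last few paragraphs can be sketched as an event-driven routine. The dictionary layout, the function name, and the `on_fixed` callback (standing in for handing the fixed pair to the inference unit 13) are assumptions for illustration.

```python
def update_record(table, person_id, now, ts, on_fixed):
    """Update the per-person table for one extraction at time `now`.

    `table` maps person IDs to records holding the running continuous
    appearance period and the latest extraction timing. `on_fixed(person_id,
    period, interval)` is called when a continuous appearance period and
    reappearance interval become fixed."""
    if person_id not in table:
        # First extraction: period starts at 0, remember the timing.
        table[person_id] = {"period": 0, "latest": now}
        return
    rec = table[person_id]
    elapsed = now - rec["latest"]
    if elapsed < ts:
        rec["period"] += elapsed       # still continuously present
    else:
        # Period is fixed; the gap becomes the reappearance interval.
        on_fixed(person_id, rec["period"], elapsed)
        rec["period"] = 0              # restart for the new appearance
    rec["latest"] = now

# The same t1, t2, t3 example as above, processed one extraction at a time.
fixed = []
table = {}
for now in (0, 10, 25, 1000):
    update_record(table, "A", now, ts=60,
                  on_fixed=lambda p, c, r: fixed.append((c, r)))
print(fixed)  # [(25, 975)]
```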
Returning to
For example, the inference unit 13 may infer the characteristic of the person (hereinafter, referred to as a personal characteristic in some cases) on the basis of correspondence information (correspondence information indicating the relationship between the continuous appearance time period and the reappearance time interval) in which the pair of the continuous appearance time period and the reappearance time interval is associated with the inferred characteristic.
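One plausible way to encode such correspondence information is as rectangular regions on the (continuous appearance time period, reappearance time interval) plane. The regions, units, and labels below are invented for illustration; the actual mapping is a matter of design, and a pair falling in no region yields no characteristic.

```python
# (min period, max period, min interval, max interval, characteristic),
# all times in seconds. These concrete regions are hypothetical.
REGIONS = [
    (8 * 3600, 12 * 3600, 12 * 3600, 24 * 3600, "operator"),
    (0, 1 * 3600, 0, 1 * 3600, "passerby"),
    (2 * 3600, 8 * 3600, 7 * 24 * 3600, float("inf"), "traveler"),
]

def infer_characteristic(period, interval):
    """Return the characteristic whose region contains the pair, or None."""
    for pmin, pmax, imin, imax, label in REGIONS:
        if pmin <= period < pmax and imin <= interval < imax:
            return label
    return None

# A person present about 9 hours a day who reappears about 16 hours later.
print(infer_characteristic(9 * 3600, 16 * 3600))  # operator
```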
In a case in which the correspondence information is used, the inference unit 13 determines in which personal characteristic area the pair of the continuous appearance time period and the reappearance time interval is located, as illustrated in
As another example, the inference unit 13 may infer the personal characteristic on the basis of the correspondence information having different contents for each time slot during which a person appears.
That is, as illustrated in
For example, the inference unit 13 may use correspondence information corresponding to a time slot including a representative timing for the period of time for which the extracted person appears. The representative timing may be, for example, the timing (the first extraction timing in the example illustrated in
In addition, the inference unit 13 may calculate the overlapping period between the time slot corresponding to each correspondence information item and the time period for which the person appears. Then, the inference unit 13 may use the correspondence information corresponding to the longer overlapping period. For example, in a case in which the appearance period is from 2 a.m. to 5 a.m., the overlapping period between the time slot (from 4 a.m. to 10 p.m.) corresponding to the correspondence information illustrated in
In the above-mentioned example, two correspondence information items corresponding to two time slots are used. However, the number of correspondence information items is a matter of design and is not limited thereto.
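The overlap-based selection between per-time-slot correspondence information items might be sketched as follows; the interval representation and all names are assumptions, with the night slot wrapping midnight as in the text's example.

```python
def split(interval):
    """Split a clock-time interval (start hour, end hour) that wraps
    midnight into non-wrapping pieces."""
    s, e = interval
    return [(s, e)] if s <= e else [(s, 24), (0, e)]

def overlap_hours(slot, appearance):
    """Length in hours of the overlap between two clock-time intervals."""
    return sum(
        max(0, min(e1, e2) - max(s1, s2))
        for s1, e1 in split(slot)
        for s2, e2 in split(appearance)
    )

def choose_correspondence(items, appearance):
    """Pick the correspondence information whose time slot overlaps the
    appearance period the longest."""
    return max(items, key=lambda it: overlap_hours(it["slot"], appearance))

items = [
    {"slot": (4, 22), "name": "daytime"},    # 4 a.m. to 10 p.m.
    {"slot": (22, 4), "name": "nighttime"},  # 10 p.m. to 4 a.m.
]
# The example above: an appearance from 2 a.m. to 5 a.m. overlaps the
# daytime slot for 1 hour and the nighttime slot for 2 hours.
print(choose_correspondence(items, (2, 5))["name"])  # nighttime
```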
Furthermore, the inference unit 13 may infer the personal characteristic on the basis of the continuous appearance time period, the reappearance time interval, data indicating a probability distribution which is stored in advance, and the correspondence information.
For example, as illustrated in
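One plausible reading of the probability-distribution-based inference is to treat the calculated pair as the mean of a distribution and accept every characteristic whose region the distribution plausibly reaches. Everything below (the regions, the isotropic normal assumption, the k-sigma cutoff) is invented for illustration, not taken from the text.

```python
# Rectangular characteristic regions on the (continuous appearance period,
# reappearance interval) plane, in hours. Regions and labels are invented.
REGIONS = [
    (2, 8, 100, 200, "traveler"),
    (2, 8, 10, 100, "suspicious person"),
    (0, 1, 0, 1, "passerby"),
]

def possible_characteristics(period, interval, sigma, k=2):
    """Treat the calculated pair as the mean of an isotropic 2-D normal
    distribution and report every characteristic whose region lies within
    k standard deviations of it."""
    hits = set()
    for pmin, pmax, imin, imax, label in REGIONS:
        # Closest point of the rectangle to the observed pair.
        cp = min(max(period, pmin), pmax)
        ci = min(max(interval, imin), imax)
        if ((cp - period) ** 2 + (ci - interval) ** 2) ** 0.5 <= k * sigma:
            hits.add(label)
    return hits

# A pair just inside the traveler region also makes the adjacent
# suspicious-person region a possibility, widening the inference.
print(possible_characteristics(5, 102, sigma=5))
```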
The analysis apparatus 10 may include a notification unit, which is not illustrated in
The analysis apparatus 10 may include a storage unit that stores the inference result of the inference unit 13 and an output unit that outputs the inference result, which is not illustrated in
Then, the output unit may acquire predetermined information from the storage unit and output the predetermined information in accordance with an operation of the operator. For example, when an input specifying a personal characteristic is received, a list of the persons corresponding to the personal characteristic may be displayed. In addition, when an input specifying a personal characteristic and a period is received, a list of the persons who have been inferred to have the personal characteristic within the period may be displayed. The display of the list may be implemented using image data corresponding to each person.
Next, an example of the flow of the process of the analysis apparatus 10 according to this exemplary embodiment will be described with reference to the flowchart illustrated in
In a person extraction step S10, the person extraction unit 11 analyzes video data to extract a person.
In a continuous appearance time period and reappearance time interval calculation step S11, the time calculation unit 12 calculates the continuous appearance time period for which each person extracted in S10 has been continuously present and the reappearance time interval until the person reappears in a predetermined area after leaving the predetermined area.
In a personal characteristic inference step S12, the inference unit 13 infers the characteristic of the extracted person on the basis of the continuous appearance time period and the reappearance time interval calculated in S11.
According to the above-described exemplary embodiment, it is possible to infer the personal characteristic on the basis of the continuous appearance time period for which a person has been continuously present in a predetermined area and the reappearance time interval until the person reappears after leaving the predetermined area. That is, it is possible to infer the personal characteristic on the basis of new information such as the reappearance time interval. Therefore, for example, the accuracy of inference is expected to be improved and inference techniques are expected to progress.
According to this exemplary embodiment, it is possible to infer the personal characteristic on the basis of a pair of the continuous appearance time period and the reappearance time interval. In the case of this exemplary embodiment, the personal characteristic is not inferred on the basis of the simple criterion that “as the continuous appearance time period increases, the possibility that a person is a suspicious person increases”. However, in a case in which the position (a position in the two-dimensional coordinate system illustrated in
According to this exemplary embodiment, it is possible to infer the personal characteristic on the basis of a plurality of correspondence information items with different contents for each time slot during which a person appears. There is presumed to be a large difference in personal characteristics between a person who appears during the day and a person who appears during the night. Since the personal characteristic is inferred in consideration of the appearance timing, it is possible to improve the accuracy of inference.
According to this exemplary embodiment, it is possible to infer the personal characteristic using a probability distribution. The output result of the inference unit 13 is only an inference result and is not 100 percent accurate. Therefore, there is a possibility that a person who essentially should be inferred to be a suspicious person is inferred to have another personal characteristic, such as a traveler. Inference using probability distributions enables a wider range of possible personal characteristics to be inferred. For example, in the case of the above-mentioned example, a suspicious person can be inferred as a possible personal characteristic in addition to a traveler.
According to this exemplary embodiment, the continuous appearance time period and the reappearance time interval can be calculated by the method described above, using the elapsed time t between extractions of the same person and the predetermined time ts.
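The timing rule stated in supplementary note 7 below (a gap shorter than ts counts as continuous presence; a gap of ts or more means the person left the area) can be sketched as follows. The function names, units, and threshold value are illustrative assumptions.

```python
# Hypothetical sketch of the timing logic: detection timestamps of the same
# person are grouped into appearances using the threshold ts. A gap shorter
# than ts is continuous presence; a gap of ts or more ends one appearance
# and starts the reappearance time interval.

TS_SECONDS = 60.0  # assumed value of the predetermined time ts


def split_appearances(timestamps, ts=TS_SECONDS):
    """Group sorted detection timestamps (seconds) into (start, end) pairs."""
    if not timestamps:
        return []
    appearances = []
    start = prev = timestamps[0]
    for t in timestamps[1:]:
        if t - prev < ts:          # gap < ts: still continuously present
            prev = t
        else:                      # gap >= ts: the person had left the area
            appearances.append((start, prev))
            start = prev = t
    appearances.append((start, prev))
    return appearances


def periods_and_intervals(timestamps, ts=TS_SECONDS):
    """Continuous appearance time periods and reappearance time intervals."""
    apps = split_appearances(timestamps, ts)
    periods = [end - start for start, end in apps]
    intervals = [apps[i + 1][0] - apps[i][1] for i in range(len(apps) - 1)]
    return periods, intervals
```

For detections at 0, 30, 50, 200, and 230 seconds with ts = 60, this yields two appearances (0 to 50 and 200 to 230) separated by a 150-second reappearance interval.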
As a modification example of this exemplary embodiment, in the correspondence information, not all pairs of the continuous appearance time period and the reappearance time interval are necessarily associated with personal characteristics; some pairs may be left without an associated personal characteristic, as illustrated in the drawings.
Second Exemplary Embodiment
An analysis apparatus 10 according to this exemplary embodiment stores the personal characteristic inferred by the method described in the first exemplary embodiment in association with each person. In a case in which a certain person appears repeatedly in a predetermined area, the analysis apparatus 10 calculates the continuous appearance time period and the reappearance time interval whenever the person appears. On each such occasion, a personal characteristic is inferred on the basis of the calculation result. The analysis apparatus 10 then counts, for each person, the number of times each personal characteristic is inferred, and calculates the reliability of each inferred personal characteristic on the basis of the count. Hereinafter, this exemplary embodiment will be described in detail.
The count unit 14 counts, for each person, the number of times each personal characteristic is inferred in correspondence with that person.
For example, the count unit 14 manages, for each person, information associating each inferred personal characteristic with the number of times that characteristic has been inferred.
Whenever a certain person appears repeatedly in a predetermined area, the time calculation unit 12 calculates the continuous appearance time period and the reappearance time interval. On each such occasion, the inference unit 13 infers the personal characteristic on the basis of the calculated continuous appearance time period and reappearance time interval.
The count unit 14 updates this information whenever the inference unit 13 infers a personal characteristic for a person.
The reliability calculation unit 15 calculates the reliability of each inferred personal characteristic on the basis of the number of times that characteristic has been inferred in correspondence with a certain person. The larger the number of times of inference, the higher the reliability calculated by the reliability calculation unit 15.
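The count unit 14 and reliability calculation unit 15 can be sketched as follows. The data layout and the choice of reliability measure (the share of a person's inferences naming a characteristic, which grows with the count as the text requires) are assumptions; the patent only states that a larger count yields a higher reliability.

```python
# Minimal sketch of the count unit 14 and reliability calculation unit 15.
# Data layout and normalization are illustrative assumptions.
from collections import defaultdict

# person_id -> characteristic -> number of times inferred
counts = defaultdict(lambda: defaultdict(int))


def record_inference(person_id, characteristic):
    """Count unit: tally one inference result for a person."""
    counts[person_id][characteristic] += 1


def reliability(person_id, characteristic):
    """Reliability calculation unit: share of this person's inferences
    that named this characteristic (0.0 if none recorded)."""
    per_person = counts[person_id]
    total = sum(per_person.values())
    return per_person[characteristic] / total if total else 0.0
```

After three "traveler" inferences and one "suspicious person" inference for the same person, "traveler" has a reliability of 0.75 under this scheme.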
The output unit may output predetermined information on the basis of the inference result of the inference unit 13 and the calculation result of the reliability calculation unit 15. The output unit can output the predetermined information in accordance with an operation of the operator.
For example, when an input specifying a personal characteristic and a reliability condition (for example, reliability equal to or higher than a predetermined level) is received, a list of the persons inferred to have that personal characteristic with reliability equal to or higher than the predetermined level may be displayed. The list may be displayed using image data corresponding to each person.
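The query described above can be sketched as a simple filter over stored per-person reliabilities. The records structure and function name are illustrative assumptions.

```python
# Illustrative sketch of the output operation: given a personal
# characteristic and a minimum reliability, list the matching persons.
# The `records` layout ({person_id: {characteristic: reliability}}) is
# an assumption for this sketch.

def persons_with(records, characteristic, min_reliability):
    """Return the IDs of persons inferred to have `characteristic`
    with reliability at or above `min_reliability`."""
    return sorted(
        person_id
        for person_id, reliabilities in records.items()
        if reliabilities.get(characteristic, 0.0) >= min_reliability
    )
```

An operator's query such as "travelers with reliability 0.5 or higher" then reduces to one call on the stored records.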
According to the above-described exemplary embodiment, it is possible to obtain the same advantageous effect as that in the first exemplary embodiment. In addition, it is possible to calculate the reliability of each inferred personal characteristic on the basis of many inference results which are stored in correspondence with each person. As a result, according to this exemplary embodiment, it is possible to improve the accuracy of inferring the personal characteristic of each person.
Third Exemplary Embodiment
An analysis apparatus 10 according to this exemplary embodiment provides a function of setting correspondence information in which a pair of the continuous appearance time period and the reappearance time interval is associated with a personal characteristic.
The setting unit 16 has a function of setting the correspondence information in which a pair of the continuous appearance time period and the reappearance time interval is associated with a personal characteristic. The setting unit 16 can set the correspondence information in accordance with an input from the user.
For example, the setting unit 16 may output a setting screen to a display and set the correspondence information in accordance with an input to the screen. In addition, the setting unit 16 may output another setting screen and set the correspondence information in accordance with an input to that screen.
The inference unit 13 infers a personal characteristic on the basis of the correspondence information set by the setting unit 16. The other structures of the inference unit 13 are the same as those in the first and second exemplary embodiments.
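The interplay of the setting unit 16 and the inference unit 13 can be sketched as follows. The range-based entry format is an assumption; it also accommodates the earlier modification example in which some pairs have no associated characteristic.

```python
# Sketch of user-set correspondence information (structure assumed): the
# setting unit 16 stores associations from ranges of the continuous
# appearance time period and reappearance time interval to a personal
# characteristic, and the inference unit 13 looks a pair up in them.

correspondence = []  # list of (period_range, interval_range, characteristic)


def set_entry(period_range, interval_range, characteristic):
    """Setting unit 16: register one user-entered association.
    Each range is a (low, high) pair in seconds, low inclusive."""
    correspondence.append((period_range, interval_range, characteristic))


def lookup(period, interval):
    """Inference unit 13: return the characteristic for a pair, or None
    when no characteristic is associated with that pair."""
    for (p_lo, p_hi), (i_lo, i_hi), characteristic in correspondence:
        if p_lo <= period < p_hi and i_lo <= interval < i_hi:
            return characteristic
    return None
```

Returning None for unassociated pairs mirrors the modification example of the first exemplary embodiment, where not every pair needs a characteristic.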
According to the above-described exemplary embodiment, it is possible to obtain the same advantageous effects as those in the first and second exemplary embodiments. In addition, it is possible to freely set various personal characteristics. By setting the personal characteristic of a person to be detected from the video data, it is possible to detect a person having that personal characteristic.
Hereinafter, an example of reference exemplary embodiments will be additionally described.
1. An analysis apparatus including: a person extraction unit that analyzes video data to extract a person; a time calculation unit that calculates a continuous appearance time period for which the extracted person has been continuously present in a predetermined area and a reappearance time interval until the extracted person reappears in the predetermined area for each extracted person; and an inference unit that infers a characteristic of the extracted person on the basis of the continuous appearance time period and the reappearance time interval.
2. The analysis apparatus described in 1, in which the inference unit infers the characteristic of the person on the basis of a relationship between the continuous appearance time period and the reappearance time interval.
3. The analysis apparatus described in 1 or 2 further including: a count unit that counts the number of times each characteristic which is inferred in correspondence with each person is inferred; and a reliability calculation unit that calculates reliability of the inferred characteristic on the basis of the number of times each characteristic which is inferred in correspondence with a certain person is inferred.
4. The analysis apparatus described in any one of 1 to 3, in which the inference unit infers the characteristic of the person on the basis of correspondence information in which a pair of the continuous appearance time period and the reappearance time interval is associated with a characteristic.
5. The analysis apparatus described in 4, in which the inference unit infers the characteristic of the person on the basis of the correspondence information having different contents for each time slot during which the person appears.
6. The analysis apparatus described in 4 or 5, in which the inference unit infers the characteristic of the person on the basis of the continuous appearance time period, the reappearance time interval, a probability distribution, and the correspondence information.
7. The analysis apparatus described in any one of 1 to 6,
in which, in a case in which a time t elapsed from the extraction of a first person from the video data to the next extraction of the first person from the video data is less than a predetermined time ts, the time calculation unit determines that the first person has been continuously present in the predetermined area for the elapsed time t, and
in a case in which the elapsed time t is equal to or greater than the predetermined time ts, the time calculation unit determines that the first person has not been present in the predetermined area for the elapsed time t.
8. An analysis method performed by a computer including: a person extraction step of analyzing video data to extract a person; a time calculation step of calculating a continuous appearance time period for which the extracted person has been continuously present in a predetermined area and a reappearance time interval until the extracted person reappears in the predetermined area for each extracted person; and an inference step of inferring a characteristic of the extracted person on the basis of the continuous appearance time period and the reappearance time interval.
8-2. The analysis method described in 8, in which in the inference step, the characteristic of the person is inferred on the basis of a relationship between the continuous appearance time period and the reappearance time interval.
8-3. The analysis method performed by the computer described in 8 or 8-2, the method further including: a count step of counting the number of times each characteristic which is inferred in correspondence with each person is inferred; and a reliability calculation step of calculating reliability of the inferred characteristic on the basis of the number of times each characteristic which is inferred in correspondence with a certain person is inferred.
8-4. The analysis method described in any one of 8 to 8-3, in which in the inference step, the characteristic of the person is inferred on the basis of correspondence information in which a pair of the continuous appearance time period and the reappearance time interval is associated with a characteristic.
8-5. The analysis method described in 8-4, in which in the inference step, the characteristic of the person is inferred on the basis of the correspondence information having different contents for each time slot during which the person appears.
8-6. The analysis method described in 8-4 or 8-5, in which in the inference step, the characteristic of the person is inferred on the basis of the continuous appearance time period, the reappearance time interval, a probability distribution, and the correspondence information.
8-7. The analysis method described in any one of 8 to 8-6, in which in the time calculation step,
in a case in which a time t elapsed from the extraction of a first person from the video data to the next extraction of the first person from the video data is less than a predetermined time ts, it is determined that the first person has been continuously present in the predetermined area for the elapsed time t, and
in a case in which the elapsed time t is equal to or greater than the predetermined time ts, it is determined that the first person has not been present in the predetermined area for the elapsed time t.
9. A program causing a computer to function as: a person extraction unit that analyzes video data to extract a person; a time calculation unit that calculates a continuous appearance time period for which the extracted person has been continuously present in a predetermined area and a reappearance time interval until the extracted person reappears in the predetermined area for each extracted person; and an inference unit that infers a characteristic of the extracted person on the basis of the continuous appearance time period and the reappearance time interval.
9-2. The program described in 9, in which the inference unit infers the characteristic of the person on the basis of a relationship between the continuous appearance time period and the reappearance time interval.
9-3. The program described in 9 or 9-2 causing the computer to further function as: a count unit that counts the number of times each characteristic which is inferred in correspondence with each person is inferred; and a reliability calculation unit that calculates reliability of the inferred characteristic on the basis of the number of times each characteristic which is inferred in correspondence with a certain person is inferred.
9-4. The program described in any one of 9 to 9-3, in which the inference unit infers the characteristic of the person on the basis of correspondence information in which a pair of the continuous appearance time period and the reappearance time interval is associated with a characteristic.
9-5. The program described in 9-4, in which the inference unit infers the characteristic of the person on the basis of the correspondence information having different contents for each time slot during which the person appears.
9-6. The program described in 9-4 or 9-5, in which the inference unit infers the characteristic of the person on the basis of the continuous appearance time period, the reappearance time interval, a probability distribution, and the correspondence information.
9-7. The program described in any one of 9 to 9-6,
in which, in a case in which a time t elapsed from the extraction of a first person from the video data to the next extraction of the first person from the video data is less than a predetermined time ts, the time calculation unit determines that the first person has been continuously present in the predetermined area for the elapsed time t, and
in a case in which the elapsed time t is equal to or greater than the predetermined time ts, the time calculation unit determines that the first person has not been present in the predetermined area for the elapsed time t.
It is apparent that the present invention is not limited to the above exemplary embodiment, and may be modified and changed without departing from the scope and spirit of the invention.
This application claims priority based on Japanese Patent Application No. 2016-067538 filed on Mar. 30, 2016, the disclosure of which is incorporated herein in its entirety.
Claims
1. An analysis apparatus comprising:
- a memory storing instructions; and
- at least one processor executing the instructions to perform:
- extracting a person from video data;
- calculating a time period for which the extracted person has been continuously present in a predetermined area and a time interval between a first point in time when the extracted person disappears from the predetermined area and a second point in time when the extracted person reappears in the predetermined area;
- inferring one or more personal characteristics that may correspond to the extracted person based on the calculated time period and the calculated time interval;
- calculating a probability that the person corresponds to each personal characteristic; and
- outputting information based on the probability.
2. The analysis apparatus according to claim 1, wherein
- the at least one processor executes the instructions to perform:
- when an input specifying a personal characteristic is received, displaying a list of persons corresponding to the specified personal characteristic.
3. The analysis apparatus according to claim 1, wherein
- the at least one processor executes the instructions to perform:
- when an input specifying a personal characteristic and a period is received, displaying a list of persons corresponding to the specified personal characteristic within the period.
4. An analysis method performed by a computer, the method comprising:
- extracting a person from video data;
- calculating a time period for which the extracted person has been continuously present in a predetermined area and a time interval between a first point in time when the extracted person disappears from the predetermined area and a second point in time when the extracted person reappears in the predetermined area;
- inferring one or more personal characteristics that may correspond to the extracted person based on the calculated time period and the calculated time interval;
- calculating a probability that the person corresponds to each personal characteristic; and
- outputting information based on the probability.
5. The analysis method according to claim 4, wherein
- when an input specifying a personal characteristic is received, displaying a list of persons corresponding to the specified personal characteristic.
6. The analysis method according to claim 4, wherein
- when an input specifying a personal characteristic and a period is received, displaying a list of persons corresponding to the specified personal characteristic within the period.
7. A non-transitory computer readable recording medium storing programs, the programs causing a computer to perform:
- extracting a person from video data;
- calculating a time period for which the extracted person has been continuously present in a predetermined area and a time interval between a first point in time when the extracted person disappears from the predetermined area and a second point in time when the extracted person reappears in the predetermined area;
- inferring one or more personal characteristics that may correspond to the extracted person based on the calculated time period and the calculated time interval;
- calculating a probability that the person corresponds to each personal characteristic; and
- outputting information based on the probability.
8. The non-transitory computer readable recording medium according to claim 7, wherein the programs cause the computer to perform:
- when an input specifying a personal characteristic is received, displaying a list of persons corresponding to the specified personal characteristic.
9. The non-transitory computer readable recording medium according to claim 7, wherein the programs cause the computer to perform:
- when an input specifying a personal characteristic and a period is received, displaying a list of persons corresponding to the specified personal characteristic within the period.
Type: Application
Filed: Jan 28, 2019
Publication Date: May 23, 2019
Applicant: NEC Corporation (Tokyo)
Inventors: Yasufumi HIRAKAWA (Tokyo), Jianquan LIU (Tokyo), Shoji NISHIMURA (Tokyo), Takuya ARAKI (Tokyo)
Application Number: 16/258,863