FACE RECOGNITION SYSTEM, FACE RECOGNITION METHOD, AND STORAGE MEDIUM
A face recognition system, a face recognition method, and a storage medium that can perform face matching smoothly in a short time are provided. The face recognition system includes: a face detection unit that detects a face image from an image including an authentication subject as a detected face image; a storage unit that stores identification information identifying the authentication subject and a registered face image of the authentication subject in association with each other; and a face matching unit that, in response to acquisition of the identification information identifying the authentication subject, matches, against the registered face image corresponding to the acquired identification information, the detected face image detected by the face detection unit from an image captured before the acquisition.
The present invention relates to a face recognition system, a face recognition method, and a storage medium.
BACKGROUND ART
In recent years, biometric authentication, which performs authentication using biometric information, that is, information on a physical feature or behavioral feature of a human, has been utilized in situations requiring identity verification. Face authentication, which is one form of biometric authentication, is advantageous because of the low mental stress on the authentication subject, the ability to authenticate from a distant place, a psychological deterrent effect against fraud, and the like.
Face authentication technologies have been utilized for identity verification in various fields. For example, in a gate system installed at the entrance gate of a facility such as a theme park, a face authentication technology is utilized for identity verification of visitors who use a ticket such as an annual pass (Non Patent Literature 1).
CITATION LIST
Non Patent Literature
NPL 1: NEC Corporation, "Face Authentication: Gate System", [online], [searched on Feb. 12, 2016], Internet <URL: http://jpn.nec.com/ad/usj/entry.html>
SUMMARY OF INVENTION
Technical Problem
In the face authentication technology of the conventional gate system, it is necessary for the staff of a facility to operate a camera and capture a face image of a visitor for face matching after ticket presentation, in which the visitor causes a reading unit of a gate apparatus to read information on the ticket. Since the conventional gate system thus requires the operation of capturing a face image of a visitor after the visitor causes the information on the ticket to be read and before the visitor enters the facility, many visitors waiting for identity verification by face authentication may be held up at the entrance gate of the facility.
The present invention intends to provide a face recognition system, a face recognition method, and a storage medium that can perform face matching smoothly in a short time.
Solution to Problem
According to one example aspect of the present invention, provided is a face recognition system including: a face detection unit that detects a face image from an image including an authentication subject as a detected face image; a storage unit that stores identification information identifying the authentication subject and a registered face image of the authentication subject in association with each other; and a face matching unit that, in response to acquisition of the identification information identifying the authentication subject, matches, against the registered face image corresponding to the acquired identification information, the detected face image detected by the face detection unit from an image captured before the acquisition.
According to another example aspect of the present invention, provided is a face recognition method including: detecting a face image from an image including an authentication subject as a detected face image; and in response to acquisition of the identification information identifying the authentication subject, matching, against the registered face image associated with the acquired identification information, the detected face image detected from an image captured before the acquisition.
According to yet another example aspect of the present invention, provided is a storage medium in which a program is stored, and the program causes a computer to execute: detecting a face image from an image including an authentication subject as a detected face image; and in response to acquisition of the identification information identifying the authentication subject, matching, against the registered face image associated with the acquired identification information, the detected face image detected from an image captured before the acquisition.
Advantageous Effects of Invention
According to the present invention, face matching can be performed smoothly in a short time.
First Example Embodiment
A face recognition system and a face recognition method according to a first example embodiment of the present invention will be described with reference to the drawings.
First, the face recognition system according to the present example embodiment will be described with reference to the drawings.
The face recognition system according to the present example embodiment performs identity verification by face matching at the entrance gate of a facility, where the authentication subject is a visitor who intends to enter the facility by using an admission ticket. For example, the facility may be a theme park, an event hall, a stadium, a concert hall, or the like. The ticket used by a visitor is, for example, an admission ticket called an annual pass, an annual passport, or the like with which the visitor can enter the facility any number of times during a particular period such as a year, although the type thereof is not limited in particular. The admission ticket may be a paper ticket or an electronic ticket as long as it is a medium in which identification information identifying the admission ticket is recorded in a readable manner. A case of identity verification by face matching when a visitor uses an annual pass to enter a facility will be described below.
As illustrated in the drawings, the face recognition system 1 according to the present example embodiment includes a gate apparatus 10, a fixed camera 20, a face matching apparatus 30, and a datacenter server 40. The gate apparatus 10 is installed at an entrance gate 50 of the facility.
At the entrance gate 50 where the gate apparatus 10 and the like are installed, a roof 502 is provided. A lighting apparatus 504 is provided on the roof 502. Further, a guide plate 508 indicating the entrance gate is provided on the roof 502 so as to be located above the gate apparatus 10.
The face matching apparatus 30 and the datacenter server 40 are each connected to a network 60 and can communicate with each other via the network 60. The network 60 may be a Wide Area Network (WAN) or a Local Area Network (LAN), for example, although the type thereof is not limited in particular.
Further, the gate apparatus 10 and the fixed camera 20 are each directly and locally connected to the face matching apparatus 30 in a communicable manner, through a cable connection or the like. The connections among the gate apparatus 10, the fixed camera 20, and the face matching apparatus 30 may be wired or wireless.
An annual pass can be purchased from a web ticket store or a ticket booth. A web server 70 that provides the web ticket store and a ticket booth terminal 80 are connected to the network 60. The web server 70 and the ticket booth terminal 80 can each communicate with the datacenter server 40 via the network 60. The web server 70 is installed, for example, in a datacenter located remote from the entrance gate 50. The ticket booth terminal 80 is installed, for example, in a ticket booth near the entrance gate 50.
Next, each component of the face recognition system 1 according to the present example embodiment will be described in detail.
The gate apparatus 10 has a main unit 102, a fence 104, a gate 106, a reading unit 108, a hand camera 110, and a gate control unit 112.
The main unit 102 and the fence 104 are installed so as to face each other. A path 114 through which a visitor walks to enter the facility runs between the main unit 102 and the fence 104. The entrance 114a of the path 114 faces the outside of the facility, and the exit 114b faces the inside. The main unit 102 is installed on the right side when viewed from the entrance 114a toward the exit 114b of the path 114. On the other hand, the fence 104 is installed on the left side when viewed from the entrance 114a toward the exit 114b of the path 114.
The gate 106 is provided on the side wall of the main unit 102 facing the path 114 so as to block the path 114 in a standby state. When opened from the closed standby state blocking the path 114, the gate 106 allows a visitor to walk through the path 114 and enter the facility. The gate 106 is, for example, a turnstile gate in which three bars rotate. Note that, without being limited to the above, various types of gates may be used as the gate 106. For example, a flapper gate in which two flappers provided on both sides of the path 114, or a single flapper provided on one side, are opened and closed may be used as the gate 106.
As described later, the gate 106 is opened when identity verification by face matching is successful. Thereby, the visitor is allowed to walk through the path 114 and enter the inside of the facility.
Note that the gate 106 may be a gate that is open in the standby state, maintains the opened state when identity verification by face matching succeeds, and is closed when identity verification by face matching fails.
The reading unit 108 is provided on the top of the main unit 102, at a portion closer to the entrance 114a of the path 114 than the gate 106. The reading unit 108 reads information recorded in an annual pass carried by a visitor. Specifically, identification (ID) information uniquely identifying the annual pass is recorded in the annual pass, and the reading unit 108 reads this ID information from the annual pass. The ID information read by the reading unit 108 may be, for example, a member number or a serial number of the annual pass. An annual pass is a medium that is carried by a visitor who is an authentication subject, is required for the visitor to enter the facility, and has recorded thereon ID information uniquely identifying the annual pass. Here, the medium may be a medium, such as a card, a sheet, or a smartphone, which holds information identifying the authentication subject. As described later, information on purchasers who have purchased annual passes is accumulated in the datacenter server 40 in association with the ID information of the annual passes.
The reading unit 108 has a reading scheme in accordance with the recording scheme of the ID information on the annual pass. For example, when an annual pass has ID information recorded in a one-dimensional code such as a barcode or a two-dimensional code such as a QR code (registered trademark), the reading unit 108 is a code reader such as a barcode reader, a QR code reader, or the like. Further, for example, when an annual pass is a non-contact IC card or non-contact IC tag in which ID information is recorded using Radio Frequency Identification (RFID), the reading unit 108 is an RFID reader.
When there is ticket presentation of an annual pass at the reading unit 108, the reading unit 108 reads ID information recorded in the annual pass from the annual pass. Note that ticket presentation here means that a visitor who is an authentication subject causes the reading unit 108 to read information including the ID information recorded in an annual pass.
The reading unit 108 transmits the ID information read from an annual pass to the gate control unit 112 described later. Note that the identification information identifying an authentication subject is not limited to ID information stored in a medium such as an annual pass. The identification information identifying an authentication subject may include biometric information of the authentication subject, such as a fingerprint, a vein, an iris, or the like, and may be any information that can identify the authentication subject. In this case, the reading unit 108 may be a fingerprint scanner, a vein scanner, a camera, or the like that can read biometric information such as a fingerprint, a vein, or an iris of the authentication subject.
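As a rough, non-limiting illustration (not part of the original disclosure) of how the reading unit 108 might decode ID information from a code-based annual pass, the following Python sketch uses OpenCV's QR code detector as a stand-in for the code reader hardware; the function name and the use of a still image file are assumptions made only for illustration.

```python
# Hypothetical sketch: decode ID information from a QR code on an annual pass.
# OpenCV's QRCodeDetector stands in for the reading unit 108; names are illustrative.
from typing import Optional

import cv2


def read_id_information(image_path: str) -> Optional[str]:
    """Return the ID information encoded in a QR code, or None if unreadable."""
    image = cv2.imread(image_path)
    if image is None:
        return None
    detector = cv2.QRCodeDetector()
    id_information, _points, _raw = detector.detectAndDecode(image)
    return id_information or None
```

The returned string (for example, a member number) would then be passed to the gate control unit 112 and forwarded to the face matching apparatus 30.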
The hand camera 110 is provided at a portion near the gate 106 on the top of the main unit 102. The hand camera 110 is, for example, a digital video camera, and can capture and acquire a face image of a visitor who is an authentication subject according to an operation by the staff of the facility. The hand camera 110 is another image capturing unit, which is used when face matching based on a face image captured by the fixed camera 20 fails, as described later. Note that the hand camera 110 may be any camera as long as it can acquire a face image of a visitor, and may be a digital still camera.
The hand camera 110 transmits image data of the captured face image of a visitor to the gate control unit 112 described later.
The gate control unit 112 controls the operation of the gate apparatus 10. The reading unit 108 is connected to the gate control unit 112 so as to be able to communicate therewith. Further, the gate 106 is connected to the gate control unit 112 in a controllable manner. Further, the hand camera 110 is connected to the gate control unit 112 so as to be able to communicate therewith.
ID information of an annual pass read by the reading unit 108 is transmitted to the gate control unit 112 from the reading unit 108. The gate control unit 112 transmits ID information of an annual pass transmitted from the reading unit 108 to the face matching apparatus 30.
Further, the gate control unit 112 controls opening and closing of the gate 106 based on a matching result signal transmitted from the face matching apparatus 30 described later.
Further, image data of a face image of a visitor captured by the hand camera 110 is transmitted to the gate control unit 112 from the hand camera 110. The gate control unit 112 transmits image data of a face image transmitted from the hand camera 110 to the face matching apparatus 30.
The fixed camera 20 is fixed to the upper end of a support pillar 202 installed on the inner side of the facility with respect to the gate apparatus 10. The fixed camera 20 is an image capturing unit that captures images of the area in front of the gate apparatus 10 and whose orientation, facing toward the outside of the facility, is fixed. The fixed camera 20 is fixed at a height above the head of a person around 200 cm tall, for example, measured from the ground at the entrance gate 50, and is directed obliquely downward so as to face the area in front of the gate apparatus 10. Note that the fixing scheme of the fixed camera 20 is not limited to a scheme using the support pillar 202. For example, the fixed camera 20 may be hung from and fixed to the roof 502 of the entrance gate 50.
The fixed camera 20 fixed as described above captures an image of the area in front of the gate apparatus 10, that is, the entrance side of the installation area of the gate apparatus 10 including the reading unit 108. In other words, the fixed camera 20 captures images of the entrance side of the installation area of the reading unit 108. The fixed camera 20 can thereby capture a visitor V in the area in front of the gate apparatus 10, which is the entrance side of the installation area of the reading unit 108, and can therefore capture an image including an authentication subject.
The fixed camera 20 is, for example, a digital video camera, and is able to capture a moving image at a predetermined frame rate to continuously acquire a plurality of images at a predetermined cycle synchronized with the frame rate. For example, the fixed camera 20 is able to capture a moving image at 15 fps and continuously acquire 15 frames of images per second.
Note that the fixed camera 20 may be a digital still camera. In this case, the fixed camera 20 can be configured to continuously capture still images at a predetermined capturing interval and thereby continuously acquire a plurality of images at a predetermined cycle.
Further, the fixed camera 20 may be a visible light camera or an infrared camera. When the fixed camera 20 is an infrared camera, an infrared lighting apparatus 506 that emits infrared light may be provided on the roof 502 of the entrance gate 50 in addition to the normal lighting apparatus 504 that emits illumination light including visible light. By using an infrared camera as the fixed camera 20 under the infrared lighting apparatus 506, face matching based on a face image captured by the fixed camera 20 can be performed while reducing the influence of the surrounding brightness.
Further, the fixed camera 20 is installed in a vertical orientation so as to capture a vertically long image. This enables the fixed camera 20 to capture an image including the face of a visitor over a wide range of heights, from short visitors to tall visitors. Specifically, the fixed camera 20 can capture an image including the face of a visitor whose height is in a range from 99 cm to 220 cm, for example. Note that the fixed camera 20 does not necessarily have to be installed vertically, and may instead be installed horizontally so as to capture a horizontally long image.
Faces of a plurality of visitors in an area in front of the gate apparatus 10 may be included in an image captured by the fixed camera 20.
The fixed camera 20 transmits image data of a plurality of images acquired at a predetermined cycle as described above to the face matching apparatus 30 in synchronization with the cycle.
The face matching apparatus 30 has a face matching control unit 302, a storage unit 304, and a display unit 306.
The face matching control unit 302 performs face matching based on a face image captured by the fixed camera 20. The face matching control unit 302 includes an image data acquisition unit 308, a face detection unit 310, a face feature amount extraction unit 312, and a face matching unit 314.
The image data acquisition unit 308 sequentially acquires image data of images transmitted from the fixed camera 20 at a predetermined cycle. Note that the image data acquisition unit 308 can perform image processing such as a correction process on the acquired image data.
The face detection unit 310 performs face detection on each image of the image data sequentially acquired by the image data acquisition unit 308. Thereby, the face detection unit 310 detects, as detected face images, the face images of visitors in the area in front of the gate apparatus 10 from the images of the image data sequentially acquired by the image data acquisition unit 308. Various algorithms may be used for the face detection performed by the face detection unit 310, without being limited in particular.
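As a non-limiting sketch (not part of the original disclosure) of the capture and detection steps described above, the following Python code pairs an OpenCV video source with a Haar-cascade face detector; the text does not specify a detection algorithm, so the detector, the 15 fps setting, and all names here are assumptions.

```python
# Minimal sketch of the fixed camera 20 capture loop and the face detection unit 310.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")


def detect_faces(frame):
    """Return cropped face images (detected face images) found in one frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    boxes = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return [frame[y:y + h, x:x + w] for (x, y, w, h) in boxes]


def capture_loop(camera_index: int = 0, fps: float = 15.0) -> None:
    """Continuously acquire frames at a fixed cycle and detect faces in each frame."""
    cap = cv2.VideoCapture(camera_index)
    cap.set(cv2.CAP_PROP_FPS, fps)
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            detected_face_images = detect_faces(frame)
            # ...hand the detected face images to feature extraction (unit 312)...
    finally:
        cap.release()
```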
Note that a plurality of visitors in an area in front of the gate apparatus 10 may be captured in one frame of image captured by the fixed camera 20. Further, the same person may be captured in different frames of images captured by the fixed camera 20. These cases will be described later.
The face feature amount extraction unit 312 extracts a face feature amount, that is, a feature amount of the face image, for each face image detected by the face detection unit 310. Note that a face image detected by the face detection unit 310 may be referred to as a detected face image below. The face feature amount is a vector quantity obtained by combining scalar components expressing features of the face image. The components of the feature amount are not limited in particular, and various types may be used. For example, a positional relationship such as a distance or an angle between feature points set at the center or the end of a facial organ such as an eye, a nose, or a mouth, a curvature of the outline of the face, a color distribution or a light-and-shade value of the surface of the face, or the like can be used as a component of the feature amount. The number of components of the feature amount is not limited in particular and may be set as appropriate in accordance with the required matching accuracy, processing speed, or the like.
Further, the face feature amount extraction unit 312 temporarily stores, in the storage unit 304, the face image data, that is, the image data of the detected face image, and the face feature amount extracted from the detected face image in association with each other. Furthermore, for each detected face image, the face feature amount extraction unit 312 temporarily stores, in the storage unit 304, a detection number identifying the image data and the capturing time at which the detected face image was captured, in association with the face image data and the face feature amount thereof.
A relational database is configured in the storage unit 304. In the relational database of the storage unit 304, the face feature amount extracted by the face feature amount extraction unit 312 as described above is temporarily stored in association with a detection number, a capturing time, and face image data. Such mutually associated data is managed by a Relational Database Management System (RDBMS). As an RDBMS, without being limited in particular, Microsoft (registered trademark) SQL Server is used, for example.
In the storage unit 304, the face feature amount and the data related thereto are stored only for a certain period from the capturing time of each detected face image. The face feature amount and related data of a detected face image are sequentially deleted from the relational database of the storage unit 304 once the certain period has elapsed from the capturing time. For example, the face feature amounts and related data of the detected face images captured by the fixed camera 20 within the immediately preceding three minutes are temporarily stored in the storage unit 304.
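The temporary store described above can be pictured as a small relational table with a retention window. The following sketch uses SQLite in place of the SQL Server RDBMS named in the text; the schema, the 180-second retention period, and the function names are illustrative assumptions.

```python
# Sketch of the temporary store in the storage unit 304 (SQLite as a stand-in RDBMS).
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE detected_faces (
        detection_number INTEGER PRIMARY KEY,
        capturing_time   REAL NOT NULL,   -- UNIX timestamp of the capture
        face_image       BLOB NOT NULL,   -- encoded detected face image
        face_feature     BLOB NOT NULL    -- serialized face feature amount
    )
""")

RETENTION_SECONDS = 180  # e.g. keep only the immediate past three minutes


def store_detected_face(detection_number, face_image, face_feature):
    """Temporarily store one detected face image with its feature amount."""
    conn.execute("INSERT INTO detected_faces VALUES (?, ?, ?, ?)",
                 (detection_number, time.time(), face_image, face_feature))


def purge_expired():
    """Sequentially delete detected faces whose retention period has elapsed."""
    conn.execute("DELETE FROM detected_faces WHERE capturing_time < ?",
                 (time.time() - RETENTION_SECONDS,))
```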
When there is ticket presentation of an annual pass to the reading unit 108 of the gate apparatus 10, the face matching unit 314 performs identity verification by face matching for the visitor who performs ticket presentation of an annual pass at the reading unit 108.
The ID information read by the reading unit 108 from the annual pass at the ticket presentation is transmitted to the face matching unit 314. The face matching unit 314 acquires the transmitted ID information and acquires, online, via the network 60, the face feature amount of a registered face image registered in association with that ID information from the datacenter server 40 described later. The person in the registered face image acquired by the face matching unit 314 in this way is a valid user who can validly use the presented annual pass. A valid user of an annual pass is, for example, the purchaser of the annual pass.
Further, the face matching unit 314 refers to the relational database of the storage unit 304 and acquires, offline, the face feature amounts of N detected face images whose capturing times are included in a predetermined period before the ticket presentation, that is, before the acquisition of the ID information. In other words, the face matching unit 314 acquires the face feature amounts of N detected face images captured by the fixed camera 20 before the reading unit 108 reads the ID information from the annual pass. Note that N is typically an integer greater than one, so that a plurality of detected face images are acquired by the face matching unit 314, although there may be a case where N is one and a single detected face image is acquired. The predetermined period before the ticket presentation for which detected face images are acquired may be a period immediately before the ticket presentation, and its length may be set as appropriate in accordance with the required matching accuracy, processing speed, or the like. For example, the predetermined period can be set to several seconds immediately before the ticket presentation.
The face matching unit 314 performs a matching process that sequentially matches the respective face feature amounts of the N detected face images, which were captured before the ticket presentation of the annual pass, against the face feature amount of the registered face image. The matching process here is referred to as N:1 matching because at most N detected face images are matched against one registered face image. As discussed above, the face matching unit 314 matches detected face images, which have been detected by the face detection unit 310 from images captured before the acquisition of the ID information, against the registered face image corresponding to the acquired ID information.
In the N:1 matching, the face matching unit 314 calculates a matching score in accordance with the similarity between the face feature amount of a detected face image and the face feature amount of the registered face image. The matching score takes a larger value for a higher similarity between the two face feature amounts. As a result of the matching for a certain detected face image, if the matching score is less than a predetermined threshold, the face matching unit 314 determines that the matching is unmatched and matches the face feature amount of the next detected face image against the face feature amount of the registered face image. On the other hand, if the matching score is greater than or equal to the predetermined threshold, the face matching unit 314 determines that the matching is matched and completes the matching process.
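The text does not define the matching-score formula, so the following sketch assumes a cosine-similarity score scaled to the range 0-100 and an illustrative threshold; it shows only the shape of the N:1 matching loop described above.

```python
# Sketch of the N:1 matching performed by the face matching unit 314 (assumed score).
import numpy as np

MATCH_THRESHOLD = 80.0  # assumed threshold; tuned per required matching accuracy


def matching_score(detected_feature: np.ndarray, registered_feature: np.ndarray) -> float:
    """Return a score that grows with the similarity of the two feature vectors."""
    cosine = float(np.dot(detected_feature, registered_feature)
                   / (np.linalg.norm(detected_feature) * np.linalg.norm(registered_feature)))
    return 100.0 * (cosine + 1.0) / 2.0


def n_to_one_matching(detected_features, registered_feature) -> bool:
    """Match up to N detected face features against one registered face feature."""
    for feature in detected_features:
        if matching_score(feature, registered_feature) >= MATCH_THRESHOLD:
            return True   # matched: identity verification succeeds
    return False          # all unmatched: identity verification fails
```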
The order in which the face feature amounts of the N detected face images are matched against the registered face image is not limited in particular. For example, the matching may be performed in ascending or descending order of capturing time, or at random. Further, a priority may be determined for each of the N detected face images, and the order of matching against the registered face image may be determined based on the priority. The case where the matching order is determined based on the priority will be described in the second example embodiment.
If the matching performed by the face matching unit 314 is matched, this means that the valid user of the presented annual pass was included among the visitors in front of the gate apparatus 10 before the ticket presentation. It can thus be estimated that the valid user of the annual pass is the person performing the ticket presentation. In this case, therefore, identity verification by face matching succeeds.
On the other hand, if all the matching performed by the face matching unit 314 is unmatched, no valid user of the presented annual pass was included among the visitors in front of the gate apparatus 10 before the ticket presentation. In this case, therefore, identity verification by face matching fails.
A matching result or the like by the face matching unit 314 can be displayed on the display unit 306. The staff of the facility can confirm a matching result or the like by viewing the display on the display unit 306.
The face matching unit 314 transmits a matching result signal that is a signal indicating the matching result described above to the gate apparatus 10. Specifically, the face matching unit 314 transmits, to the gate apparatus 10, a matching-matched signal that is a signal indicating that the matching by the face matching unit 314 is matched or a matching-unmatched signal that is a signal indicating that all the matching performed by the face matching unit 314 is unmatched.
The face matching apparatus 30 described above is formed of a computer apparatus, for example. An example of the hardware configuration of the face matching apparatus 30 will be described with reference to the drawings.
As illustrated in the drawings, the face matching apparatus 30 includes a CPU 3002, a ROM 3004, a RAM 3006, an HDD 3008, a communication I/F 3010, a display controller 3012, a display 3014, and an input device 3016.
The CPU 3002 controls the entire operation of the face matching apparatus 30. Further, the CPU 3002 executes a program that implements the function of each unit of the image data acquisition unit 308, the face detection unit 310, the face feature amount extraction unit 312, and the face matching unit 314 in the face matching control unit 302 described above. The CPU 3002 loads a program stored in the HDD 3008 or the like to the RAM 3006 to implement the function of each unit of the face matching control unit 302.
The ROM 3004 stores a program such as a boot program therein. The RAM 3006 is used as a working area when the CPU 3002 executes a program. Further, the program executed by the CPU 3002 is stored in the HDD 3008.
Further, the HDD 3008 is a storage device that implements the function of the storage unit 304 described above. Note that a storage device that implements the function of the storage unit 304 is not limited to the HDD 3008. Various storage devices can be used for implementing the function of the storage unit 304.
The communication I/F 3010 is connected to the network 60. The communication I/F 3010 controls data communication with the datacenter server 40 connected to the network 60.
The display controller 3012 is connected to the display 3014 that functions as the display unit 306. The display controller 3012 causes a matching result from the face matching unit 314 to be displayed on the display 3014.
The input device 3016 may be a keyboard, a mouse, or the like, for example. Further, the input device 3016 may be a touch panel embedded in the display 3014. The staff of the facility can perform setting of the face matching apparatus 30 or input an instruction of execution of a process via the input device 3016.
Note that the hardware configuration of the face matching apparatus 30 is not limited to the configuration described above, but may be various configurations.
The gate control unit 112 of the gate apparatus 10 controls opening and closing of the gate 106 based on the matching result signal transmitted from the face matching unit 314. That is, the gate control unit 112 opens the gate 106 when a matching-matched signal is transmitted from the face matching unit 314. Thereby, the visitor performing the ticket presentation is allowed to walk through the path 114 of the gate apparatus 10 and enter the facility as a person who has succeeded in identity verification. The gate control unit 112 closes the gate 106 after the visitor has walked through the path 114.
On the other hand, the gate control unit 112 maintains a closed state of the gate 106 when a matching-unmatched signal is transmitted from the face matching unit 314. At this time, the gate control unit 112 can sound an alert sound of a not-shown alarm provided to the gate apparatus 10, turn on an alert light, or the like to output a warning indicating that all the matching results are unmatched.
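For illustration only, the gate-control logic of this and the preceding paragraph can be summarized as follows; the `gate` and `alarm` objects are hypothetical placeholders for the actual hardware drivers of the gate apparatus 10.

```python
# Sketch of the gate control unit 112 handling a matching result signal.
def handle_matching_result(matched: bool, gate, alarm) -> None:
    if matched:
        gate.open()              # allow the visitor to walk through the path 114
        gate.wait_for_passage()  # hypothetical: wait until the visitor has passed
        gate.close()
    else:
        # Keep the gate closed and warn that all matching results were unmatched.
        alarm.sound()
        alarm.turn_on_light()
```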
The datacenter server 40 has a control unit 402 and a storage unit 404.
The control unit 402 controls the operation of the datacenter server 40.
The storage unit 404 accumulates registered face images and face feature amounts thereof that are registered in association with ID information of issued annual passes.
The control unit 402 provides, to the face matching unit 314, a face feature amount of a registered face image registered in association with ID information of an annual pass on ticket presentation in response to a request from the face matching unit 314.
A registered face image can be uploaded to the web server 70 from a purchaser's terminal when an annual pass is purchased at the web ticket store provided by the web server 70. The registered face image uploaded to the web server 70 is transmitted from the web server 70 to the datacenter server 40. In the datacenter server 40 to which the registered face image has been transmitted, the control unit 402 accumulates the transmitted registered face image in the storage unit 404.
Further, a purchaser who has purchased an annual pass at a ticket booth can have his or her face image for registration captured by the hand camera 110 of the gate apparatus 10 when first visiting the facility. The registered face image captured by the hand camera 110 is transmitted by the face matching apparatus 30 to the datacenter server 40 via the network 60 and accumulated in the storage unit 404 thereof.
Further, a purchaser who has purchased an annual pass at a ticket booth may have his or her face image for registration captured on the spot by a hand camera (not shown) at the ticket booth. The registered face image captured by the hand camera of the ticket booth is transmitted by the ticket booth terminal 80 to the datacenter server 40 via the network 60 and accumulated in the storage unit 404 thereof.
For a registered face image accumulated in the storage unit 404 of the datacenter server 40 as described above, a face feature amount of the same kind as the face feature amount extracted by the face feature amount extraction unit 312 of the face matching apparatus 30 is extracted. The extraction of the face feature amount is performed by the control unit 402, which functions as a face feature amount extraction unit. The extracted face feature amount is accumulated in the storage unit 404 by the control unit 402 in association with the ID information of the annual pass with which the registered face image is associated.
A relational database is configured in the storage unit 404. In the relational database of the storage unit 404, face feature amounts of registered face images are stored in association with ID information of annual passes and face image data of the registered face images, as described above. Such mutually associated data is managed by an RDBMS. As an RDBMS, without being limited in particular, Microsoft (registered trademark) SQL Server is used, for example.
Note that, in the relational database of the storage unit 404, in addition to the above, pieces of information such as names, contact addresses, or the like of purchasers of annual passes who are valid users of the annual passes are stored in association with ID information of the annual passes, for example.
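The relational database in the storage unit 404 might be organized along the following lines; SQLite again stands in for the RDBMS, and the table names, columns, and query are assumptions for illustration.

```python
# Sketch of the datacenter server 40 database (storage unit 404), SQLite as a stand-in.
import sqlite3

db = sqlite3.connect("datacenter.db")
db.executescript("""
    CREATE TABLE IF NOT EXISTS annual_passes (
        id_information  TEXT PRIMARY KEY,  -- e.g. member number or serial number
        purchaser_name  TEXT,
        contact_address TEXT
    );
    CREATE TABLE IF NOT EXISTS registered_faces (
        registered_face_id INTEGER PRIMARY KEY,
        id_information     TEXT NOT NULL REFERENCES annual_passes(id_information),
        face_image         BLOB NOT NULL,
        face_feature       BLOB NOT NULL,
        priority           INTEGER          -- used once plural images are registered
    );
""")


def get_registered_features(id_information: str):
    """Return the face feature amounts registered for one annual pass, best first."""
    rows = db.execute(
        "SELECT face_feature FROM registered_faces "
        "WHERE id_information = ? ORDER BY priority",
        (id_information,)).fetchall()
    return [row[0] for row in rows]
```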
The datacenter server 40 described above is formed of a computer apparatus, for example. An example of the hardware configuration of the datacenter server 40 will be described with reference to the drawings.
As illustrated in the drawings, the datacenter server 40 includes a CPU 4002, a ROM 4004, a RAM 4006, an HDD 4008, and a communication I/F 4010.
The CPU 4002 controls the entire operation of the datacenter server 40. Further, the CPU 4002 executes a program that implements the function of the control unit 402 described above. The CPU 4002 loads a program stored in the HDD 4008 or the like to the RAM 4006 to implement the function of the control unit 402.
The ROM 4004 stores a program such as a boot program therein. The RAM 4006 is used as a working area when the CPU 4002 executes a program. Further, the program executed by the CPU 4002 is stored in the HDD 4008.
Further, the HDD 4008 is a storage device that implements the function of the storage unit 404 described above. Note that a storage device that implements the function of the storage unit 404 is not limited to the HDD 4008. Various storage devices can be used for implementing the function of the storage unit 404.
The communication I/F 4010 is connected to the network 60. The communication I/F 4010 controls data communication with the face matching apparatus 30 connected to the network 60.
Note that the hardware configuration of the datacenter server 40 is not limited to the configuration described above, but may be various configurations.
As described above, the face recognition system 1 according to the present example embodiment matches a detected face image captured by the fixed camera 20 before ticket presentation, in which the reading unit 108 reads ID information from an annual pass, against a registered face image registered in association with the ID information of the presented annual pass. That is, in the face recognition system 1 according to the present example embodiment, the detected face image, which is the image of the matching subject to be matched against a registered face image, is acquired in advance, before the ticket presentation of the annual pass.
Thus, according to the present example embodiment, after a visitor presents an annual pass, it is not necessary for the staff of the facility to capture a face image of the visitor as an image of the matching subject to be matched against a registered face image. Further, the visitor neither needs to be conscious of the capturing of his or her face image nor needs to perform a special action such as positioning his or her face for the capturing. Therefore, according to the present example embodiment, face matching can be performed smoothly in a short time.
Next, a face recognition method according to the present example embodiment, using the face recognition system 1 described above, will be further described with reference to the drawings.
First, the entire flow of the face recognition method according to the present example embodiment will be described with reference to the drawings.
The fixed camera 20 captures an area in front of the gate apparatus 10 that is the entrance side to the installation area of the reading unit 108 and continuously acquires a plurality of images at a predetermined cycle (step S102). Further, the fixed camera 20 transmits image data of the plurality of images acquired at a predetermined cycle to the face matching apparatus 30 in synchronization with the cycle (step S104).
In the face matching apparatus 30 to which the image data has been transmitted, the image data acquisition unit 308 sequentially acquires image data transmitted from the fixed camera 20. The face detection unit 310 performs face detection for each image and detects a face image as a detected face image (step S106). The face feature amount extraction unit 312 extracts a face feature amount for each detected face image for temporary storage (step S108).
In the fixed camera 20 and the face matching apparatus 30, steps S102 to S108 described above are repeatedly performed.
On the other hand, when there is ticket presentation of an annual pass at the gate apparatus 10, the reading unit 108 reads ID information of the annual pass on the ticket presentation (step S110). Subsequently, the gate control unit 112 transmits ID information read by the reading unit 108 to the face matching apparatus 30 (step S112).
In the face matching apparatus 30 to which the ID information has been transmitted, the face matching unit 314 transmits the ID information to the datacenter server 40 via the network 60 (step S114) and requests a face feature amount of a registered face image registered in association with the ID information. Thereby, the face matching unit 314 acquires a face feature amount of a registered face image registered in association with the ID information online from the datacenter server 40 via the network 60 (step S116).
Further, the face matching unit 314 refers to the relational database of the storage unit 304 and acquires, offline, the face feature amounts of N detected face images whose capturing times are included in a predetermined period before the ticket presentation (step S118). That is, the face matching unit 314 acquires the face feature amounts of N detected face images captured before the reading unit 108 reads the ID information from the annual pass.
Note that either step S116 or step S118 may be performed first, or both steps may be performed at the same time.
Next, the face matching unit 314 performs N:1 matching based on the face feature amounts of the acquired detected face images and the face feature amount of the registered face image (step S120).
The face matching unit 314 that has performed N:1 matching transmits a matching result signal indicating a matching result to the gate apparatus 10 (step S122).
In the gate apparatus 10 to which the matching result signal has been transmitted, the gate control unit 112 controls opening and closing of the gate 106 based on the matching result signal (step S124).
Every time ticket presentation of an annual pass is performed at the reading unit 108 of the gate apparatus 10, steps S110 to S124 are repeatedly performed.
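As a rough sketch of how steps S110 to S124 could be tied together on the face matching apparatus 30 side, the code below reuses the hypothetical helpers sketched earlier in this description (`get_registered_features`, the temporary store connection `conn`, `n_to_one_matching`, and `handle_matching_result`); the service boundaries and the five-second window are assumptions, not the literal design.

```python
# Sketch of the flow from ID acquisition (step S112) to gate control (step S124).
import time

import numpy as np

PRE_PRESENTATION_WINDOW_SECONDS = 5.0  # "several seconds immediately before"


def on_ticket_presentation(id_information: str, gate, alarm) -> None:
    # Steps S114-S116: acquire, online, the registered face feature(s) for the ID.
    registered_features = [np.frombuffer(blob, dtype=np.float32)
                           for blob in get_registered_features(id_information)]

    # Step S118: acquire, offline, the features of faces detected shortly before
    # the ticket presentation from the temporary store (connection `conn`).
    rows = conn.execute(
        "SELECT face_feature FROM detected_faces WHERE capturing_time >= ?",
        (time.time() - PRE_PRESENTATION_WINDOW_SECONDS,)).fetchall()
    detected_features = [np.frombuffer(row[0], dtype=np.float32) for row in rows]

    # Steps S120-S124: N:1 matching, then gate control based on the result.
    matched = any(n_to_one_matching(detected_features, registered)
                  for registered in registered_features)
    handle_matching_result(matched, gate, alarm)
```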
Next, details of the processes in the face recognition method according to the present example embodiment will be described with reference to the drawings.
First, the process from capturing by the fixed camera 20 to the temporary storage of the face feature amount of a detected face image will be described in detail with reference to the drawings.
The image data of images captured by the fixed camera 20 is periodically transmitted to the face matching apparatus 30. The image data acquisition unit 308 determines whether or not it is the timing of transmission of image data from the fixed camera 20 (step S202). If it is not the timing of transmission of image data (step S202, NO), the image data acquisition unit 308 stands by until that timing arrives.
If it is the timing of transmission of image data from the fixed camera 20 (step S202, YES), the image data acquisition unit 308 acquires image data transmitted from the fixed camera 20 (step S204). Before the next step S206, the image data acquisition unit 308 can perform image processing such as a correction process on the acquired image data.
Next, the face detection unit 310 performs face detection on an image of image data acquired by the image data acquisition unit 308 and, from the image, detects a face image of a visitor in the area in front of the gate apparatus 10 (step S206). If no face image is detected (step S208, NO), the process returns to step S202 and stands by for transmission of next image data from the fixed camera 20.
If a face image is detected by the face detection unit 310 (step S208, YES), the face feature amount extraction unit 312 extracts a face feature amount from the detected face image (step S210).
Further, the face feature amount extraction unit 312 temporarily stores the face image data of the detected face image and the face feature amount extracted from the detected face image in the storage unit 304 in association with each other (step S212). At this time, the face feature amount extraction unit 312 also temporarily stores, in the storage unit 304, a detection number identifying the image data and the capturing time at which the detected face image was captured, in association with the face image data and the face feature amount thereof.
The process described above is repeatedly performed every time image data is transmitted from the fixed camera 20 at the predetermined cycle.
Next, the process from ticket presentation of an annual pass to gate control will be described with reference to the drawings.
In the gate apparatus 10, the reading unit 108 stands by until ticket presentation of an annual pass is performed (step S302, NO). During standby for ticket presentation, the gate 106 is closed.
In response to ticket presentation of an annual pass by a visitor at the reading unit 108 (step S302, YES), the reading unit 108 reads the ID information recorded in the annual pass (step S304).
Subsequently, the gate control unit 112 transmits the ID information read by the reading unit 108 to the face matching apparatus 30 (step S306).
In the face matching apparatus 30, the face matching unit 314 transmits the ID information transmitted from the gate control unit 112 to the datacenter server 40 via the network 60 and requests a face feature amount of a registered face image. Thereby, the face matching unit 314 acquires, online, from the datacenter server 40 via the network 60, the face feature amount of the registered face image registered in association with the transmitted ID information (step S308).
Further, the face matching unit 314 refers to the relational database of the storage unit 304 and acquires, offline, the face feature amounts of N detected face images whose capturing times are included in a predetermined period before the ticket presentation of the annual pass (step S310).
The face matching unit 314 performs N:1 matching, sequentially matching the respective face feature amounts of the acquired N detected face images against the face feature amount of the acquired registered face image (step S312). If the matching score for a certain detected face image is less than a predetermined threshold, the face matching unit 314 determines that the matching is unmatched and matches the face feature amount of the next detected face image against the face feature amount of the registered face image. On the other hand, if the matching score for a certain detected face image is greater than or equal to the predetermined threshold, the face matching unit 314 determines that the matching is matched and completes the matching process.
If the matching performed by the face matching unit 314 is matched (step S314, YES), the face matching unit 314 transmits a matching-matched signal indicating that the matching performed by the face matching unit 314 is matched to the gate apparatus 10 (step S316).
In the gate apparatus 10, in response to transmission of a matching-matched signal from the face matching unit 314, the gate control unit 112 opens the gate 106 (step S318). Thereby, the visitor performing the ticket presentation is allowed to walk through the path 114 of the gate apparatus 10 and enter the facility as a person who has succeeded in identity verification.
The gate control unit 112 closes the gate 106 after the visitor has passed through the path 114 (step S320).
On the other hand, if all the matching performed by the face matching unit 314 is unmatched (step S314, NO), the face matching unit 314 transmits a matching-unmatched signal indicating that all the matching performed by the face matching unit 314 is unmatched to the gate apparatus 10 (step S322).
In the gate apparatus 10, in response to transmission of a matching-unmatched signal from the face matching unit 314, the gate control unit 112 maintains the closed state of the gate 106 (step S324). At this time, the gate control unit 112 outputs a warning by sounding an alert sound of a not-shown alarm provided to the gate apparatus 10, turning on an alert light, or the like (step S326). As a result, the visitor performing the ticket presentation is unable to enter the facility at this stage, as a person who has failed identity verification. In this case, an action such as re-verification of identity by the staff of the facility is then taken, for example.
The process described above is repeatedly performed every time ticket presentation of an annual pass is performed at the reading unit 108 of the gate apparatus 10.
As discussed above, according to the present example embodiment, the face feature amount of a detected face image detected from an image captured by the fixed camera 20 before ticket presentation of an annual pass is matched against the face feature amount of a registered face image registered in association with the ID information of the presented annual pass. Therefore, according to the present example embodiment, face matching can be performed smoothly in a short time.
Note that, while the case where a single visitor is captured in one frame of an image captured by the fixed camera 20 has been described above as an example, a plurality of visitors may be captured in one frame, and the same person may be captured in a plurality of frames. Such cases are addressed in the following example embodiments.
Second Example Embodiment
A face recognition system and a face recognition method according to a second example embodiment of the present invention will be described with reference to the drawings.
The basic configuration of the face recognition system according to the present example embodiment is substantially the same as the configuration of the face recognition system according to the first example embodiment. The face recognition system according to the present example embodiment is different from the face recognition system according to the first example embodiment in that the face matching apparatus 30 further has a priority calculation unit that calculates the priority in performing N:1 matching for a detected face image.
As illustrated in the drawings, in the present example embodiment, the face matching control unit 302 of the face matching apparatus 30 further includes a priority calculation unit 318.
The priority calculation unit 318 calculates a priority used for determining the order of performing matching of a face feature amount against a registered face image for respective detected face images detected by the face detection unit 310. The priority calculation unit 318 can calculate a priority based on various factors.
As a factor by which the priority calculation unit 318 calculates a priority, a positional relationship of detected face images in an image captured by the fixed camera 20 is exemplified. A person of a detected face image located closer to the reading unit 108 of the gate apparatus 10 in an image captured by the fixed camera 20 is more likely to perform ticket presentation of an annual pass at the reading unit 108. Therefore, a higher priority can be set for a detected face image located closer to the reading unit 108 in an image captured by the fixed camera 20.
Further, as a factor by which the priority calculation unit 318 calculates a priority, a facial likeness score that is an evaluation value evaluating a facial likeness of a detected face image can be exemplified. For example, a facial likeness score having a higher value indicates that a detected face image is more likely to be a face. Therefore, a higher priority can be set for a detected face image having a higher facial likeness score.
Further, as a factor by which the priority calculation unit 318 calculates the priority, a quality of a detected face image is exemplified. A higher quality of a detected face image enables a more accurate matching. Therefore, a higher priority can be set for a detected face image having a higher quality.
The priority calculation unit 318 can calculate the priority based on any one of the factors described above or a combination of these factors. For example, the priority may be expressed as a value that is smaller for a higher priority.
The priority calculation unit 318 stores the calculated priorities for respective detected face images in the relational database configured in the storage unit 304 in association with a detection number, a capturing time, and face image data.
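The priority calculation unit 318 could combine the factors above in many ways; the following sketch uses an assumed weighted combination in which a smaller value means a higher priority, with weights that are illustrative only.

```python
# Sketch of the priority calculation unit 318 (assumed weights and sign convention).
def calculate_priority(distance_to_reading_unit: float,
                       facial_likeness_score: float,
                       quality_score: float) -> float:
    """Return a priority value; a smaller value means a higher priority."""
    # Closer to the reading unit 108, more face-like, and higher quality
    # all push the value down, i.e. toward a higher priority.
    w_distance, w_likeness, w_quality = 1.0, 0.5, 0.5  # assumed weights
    return (w_distance * distance_to_reading_unit
            - w_likeness * facial_likeness_score
            - w_quality * quality_score)
```

Detected face images would then be matched in ascending order of this value, that is, in descending order of priority.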
Note that, also in the present example embodiment, the face matching apparatus 30 can have the same hardware configuration as the hardware configuration described in the first example embodiment.
As discussed above, in the present example embodiment, since the face feature amounts of the N detected face images are matched against the registered face image in descending order of priority, the N:1 matching can be performed efficiently. Therefore, according to the present example embodiment, face matching can be performed smoothly in a short time.
Third Example Embodiment
A face recognition system and a face recognition method according to a third example embodiment of the present invention will be described with reference to the drawings.
The basic configuration of the face recognition system according to the present example embodiment is substantially the same as the configuration of the face recognition system according to the first example embodiment. The face recognition system according to the present example embodiment is different from the face recognition system according to the first example embodiment in that the face matching apparatus 30 further has an identical-person processing unit that performs a process for the case where the same person is captured in different frames of the images captured by the fixed camera 20.
When the fixed camera 20 captures the area in front of the gate apparatus 10, the same visitor may be captured in a plurality of different frames.
In the present example embodiment, when the same person is captured in different frames in this way, the detected face images are classified on an identical-person basis. A priority based on the quality or the like is then determined for the detected face images of each identical person, and matching of the face feature amounts against the registered face image is performed in descending order of priority. This allows the N:1 matching to be performed more efficiently.
As illustrated in the drawings, in the present example embodiment, the face matching control unit 302 of the face matching apparatus 30 further includes an identical-person processing unit 320.
The identical-person processing unit 320 classifies detected face images detected by the face detection unit 310 into respective identical persons. Furthermore, the identical-person processing unit 320 calculates a priority for determining the order of performing matching of a face feature amount against the registered face image for a plurality of detected face images classified into identical persons. The identical-person processing unit 320 can calculate the priority based on a positional relationship of detected face images, a facial likeness score, a quality, or the like in a similar manner to the priority calculation unit 318 according to the second example embodiment.
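The text does not specify how detected face images are classified into identical persons; as one assumed possibility, the sketch below groups feature vectors greedily by cosine similarity, with an illustrative threshold.

```python
# Sketch of the identical-person processing unit 320 (assumed grouping rule).
import numpy as np

SAME_PERSON_THRESHOLD = 0.8  # assumed cosine-similarity threshold


def group_by_person(detected_features):
    """Group face feature vectors so that each group corresponds to one person."""
    groups = []  # each group is a list of feature vectors of one (assumed) person
    for feature in detected_features:
        for group in groups:
            representative = group[0]
            cosine = float(np.dot(feature, representative)
                           / (np.linalg.norm(feature) * np.linalg.norm(representative)))
            if cosine >= SAME_PERSON_THRESHOLD:
                group.append(feature)
                break
        else:
            groups.append([feature])
    return groups
```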
Note that, also in the present example embodiment, the face matching apparatus 30 can have the same hardware configuration as the hardware configuration described in the first example embodiment.
In the N:1 matching, the face matching unit 314 matches the face feature amounts against the registered face image in descending order of priority for the plurality of detected face images classified as the same person.
As discussed above, according to the present example embodiment, since the face feature amounts of the detected face images classified as the same person are matched against the registered face image in descending order of priority in the N:1 matching, the N:1 matching can be performed efficiently. Therefore, according to the present example embodiment, face matching can be performed more smoothly in a shorter time.
Note that, while the case where the identical-person processing unit 320 is provided in addition to the configuration of the face recognition system according to the first example embodiment has been described above, the example embodiment is not limited thereto. The identical-person processing unit 320 may also be provided in addition to the configuration of the face recognition system according to the second example embodiment.
Fourth Example Embodiment
A face recognition system and a face recognition method according to a fourth example embodiment of the present invention will be described with reference to the drawings.
The basic configuration of the face recognition system according to the present example embodiment is substantially the same as the configuration of the face recognition system according to the first example embodiment. When the matching performed by the face matching unit 314 of the face matching apparatus 30 is matched, the face recognition system according to the present example embodiment updates a registered face image stored in the datacenter server 40 based on the matching result.
An update process of a registered face image in the face recognition system according to the present example embodiment will be described below with reference to the drawings.
In the face matching apparatus 30, if matching is matched as a result of N:1 matching, the face matching unit 314 stores the matching result (step S402). The face matching unit 314 can store the matching result in the storage unit 304 or can store the matching result in a storage unit provided separately from the storage unit 304.
Next, the face matching unit 314 transmits the stored matching result to the datacenter server 40 regularly or irregularly (step S404). For example, the face matching unit 314 can transmit a matching result to the datacenter server 40 on a daily basis.
In the datacenter server 40 to which the matching result has been transmitted, the control unit 402 stores the transmitted matching result (step S406). The control unit 402 can store the matching result in the storage unit 404 or in a storage unit provided separately from the storage unit 404.
Next, the control unit 402 regularly or irregularly processes the stored matching result. Specifically, the control unit 402 newly adds the detected face image associated with the ID information to the storage unit 404 as a registered face image associated with that ID information (step S408). Further, the face feature amount of the newly added registered face image is stored in the storage unit 404 in association with the ID information. In this way, a plurality of registered face images and their face feature amounts are stored and registered in association with the same ID information. As discussed above, the control unit 402 functions as an update unit that additionally registers, as a new registered face image, the detected face image for which the matching was matched, thereby updating the registered face images.
Next, the control unit 402 updates the priority of a plurality of registered face images stored in association with the same ID information (step S410). The control unit 402 can calculate a priority based on a facial likeness score or the like in a similar manner to the priority calculation unit 318 according to the second example embodiment and update the priority of the plurality of registered face images.
The plurality of registered face images whose priorities have been updated in such a way are subjected to the N:1 matching performed by the face matching unit 314 of the face matching apparatus 30 in descending order of priority.
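A sketch, under assumptions, of the additive registration and priority update on the datacenter side (steps S408 and S410): the in-memory dictionary stands in for the storage unit 404, and likeness_score is a hypothetical placeholder for the facial likeness score used to rank registered face images.

```python
from typing import Dict, List, Tuple

# registered[id_info] -> list of (face_image_data, face_feature, priority)
registered: Dict[str, List[Tuple[bytes, List[float], float]]] = {}


def likeness_score(feature: List[float]) -> float:
    """Hypothetical facial likeness score used to rank registered face images."""
    return sum(abs(x) for x in feature)  # placeholder only


def update_registered_faces(id_info: str,
                            matched_faces: List[Tuple[bytes, List[float]]]) -> None:
    """Additively register detected face images whose matching was matched (step S408)
    and then update the priority of every registered image for the same ID (step S410)."""
    entries = registered.setdefault(id_info, [])
    for image_data, feature in matched_faces:
        entries.append((image_data, feature, 0.0))   # newly added, priority not yet set
    # Recompute priorities for all registered images associated with this ID and keep
    # them in descending order for the subsequent N:1 matching.
    entries[:] = sorted(
        ((img, feat, likeness_score(feat)) for img, feat, _ in entries),
        key=lambda e: e[2],
        reverse=True,
    )
```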
The process illustrated in
As described above, according to the present example embodiment, since the registered face images stored in the datacenter server 40 are updated based on a matching result from the face matching unit 314 of the face matching apparatus 30, the accuracy of face matching can be maintained to be high.
Note that, while the case where the registered face images are updated in the face recognition system according to the first example embodiment has been described above, the example embodiment is not limited thereto. The registered face images can be updated in the same manner as above in the face recognition systems according to the second and third example embodiments.
Fifth Example Embodiment
A face recognition system and a face recognition method according to a fifth example embodiment of the present invention will be described by using
In the present example embodiment, when identity verification by face matching based on an image captured by the fixed camera 20 is failed in the face recognition system according to the first example embodiment, a face image of the visitor is captured by the hand camera 110 to perform face matching. The face recognition method according to the present example embodiment will be described below by using FIG.
Also in the present example embodiment, identity verification by face matching is performed based on an image captured by the fixed camera 20 in the same manner as in the first example embodiment. As illustrated in
When a warning is output in response to a failed identity verification by face matching based on an image captured by the fixed camera 20 (step S326), the staff of the facility uses the hand camera 110 of the gate apparatus 10 to capture the face of the visitor who is failed in identity verification thereof (step S502). Note that, in the following, a face image of a visitor captured by the hand camera 110 may also be referred to as an in-hand face image.
Once an in-hand face image is captured by the hand camera 110, the gate control unit 112 transmits image data of the in-hand face image to the face matching apparatus 30.
In the face matching apparatus 30, the face feature amount extraction unit 312 extracts a face feature amount of the transmitted in-hand face image.
Next, the face matching unit 314 matches the face feature amount of the in-hand face image extracted by the face feature amount extraction unit 312 against a face feature amount of a registered face image registered in association with ID information of the annual pass on ticket presentation (step S504).
If the matching by the face matching unit 314 based on the in-hand face image is matched (step S506, YES), the process transfers to step S316 to open the gate 106 (step S318). The visitor who made ticket presentation is able to walk through the path 114 of the gate apparatus 10 to enter the inside of the facility as a person who is successful in identity verification by face matching based on the in-hand face image. Then, the gate 106 is closed (step S320).
In contrast, if the matching by the face matching unit 314 based on the in-hand face image is unmatched (step S506, NO), the visitor who made ticket presentation is attended by the staff of the facility as a person who is failed in identity verification by face matching based on the in-hand face image (step S508). For example, as an action by the staff of the facility, identity verification is again done by the staff of the facility.
As in the present example embodiment, when identity verification by face matching based on a detected face image by using the fixed camera 20 is failed, identity verification by face matching based on an in-hand face image by using the hand camera 110 can be performed.
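A compact sketch of this fallback path (steps S502 to S508), assuming the camera, the feature extraction, and the matcher are supplied as callables; none of these names or the threshold value come from the description.

```python
from typing import Callable, List

MATCH_THRESHOLD = 0.8  # assumed value


def verify_with_hand_camera(
    registered_feature: List[float],
    capture_in_hand_image: Callable[[], bytes],
    extract_feature: Callable[[bytes], List[float]],
    match_score: Callable[[List[float], List[float]], float],
) -> str:
    """Fallback identity verification based on an in-hand face image."""
    in_hand_image = capture_in_hand_image()           # staff captures the visitor's face (S502)
    in_hand_feature = extract_feature(in_hand_image)  # face feature amount extraction
    if match_score(in_hand_feature, registered_feature) >= MATCH_THRESHOLD:
        return "open_gate"        # matched: proceed as in steps S316 to S320
    return "staff_attendance"     # unmatched: the staff attends to the visitor (S508)
```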
Note that, while the case where face matching based on an in-hand face image by using the hand camera 110 is performed in the face recognition system according to the first example embodiment has been described above, the example embodiment is not limited thereto. The face matching based on an in-hand face image by using the hand camera 110 can be performed in the same manner as above in the face recognition systems according to the second to fourth example embodiments.
Sixth Example Embodiment
A face recognition system and a face recognition method according to a sixth example embodiment of the present invention will be described by using
In the present example embodiment, when identity verification by face matching based on an image captured by the fixed camera 20 is failed in the face recognition system in the first example embodiment, a registered face image and a detected face image are manually matched through visual observation by the staff of the facility. The face recognition method according to the present example embodiment will be described below by using
Also in the present example embodiment, identity verification by face matching is performed based on an image captured by the fixed camera 20 in the same manner as in the first example embodiment. As illustrated in
When a warning is output in response to a failed identity verification by face matching based on an image captured by the fixed camera 20 (step S326), the face matching unit 314 displays a matching window on the display unit 306 to the staff of the facility in the face matching apparatus 30 (step S602). Specifically, the face matching unit 314 displays a matching window on the display unit 306 in which a registered face image and N detected face images on which the N:1 matching has been performed are displayed.
The staff of the facility views the matching window displayed on the display unit 306 to perform matching manually (step S604). Specifically, out of N detected face images displayed in the matching window, the staff of the facility selects a detected face image which can be determined as the same person as a person of the registered face image displayed together. The display unit 306 is formed of a touch panel and functions as an input unit, for example. In this case, the staff of the facility may touch the detected face image displayed in the matching window and input the selected result to the face matching apparatus 30. Note that a selected result may be input to the face matching apparatus 30 by using other input units such as a mouse, a keyboard, or the like.
If the matching is matched as a result of manual matching (step S606, YES), that is, a detected face image which is determined as the same person as a person of the registered face image is selected, the process transfers to step S316 to open the gate 106 (step S318). The visitor who made the ticket presentation is able to walk through the path 114 of the gate apparatus 10 to enter the inside of the facility as a person who is successful in identity verification by face matching based on the manual matching by the staff of the facility. Then, the gate 106 is closed (step S320).
On the other hand, if the matching is unmatched as a result of manual matching (step S606, NO), that is, there is no detected face image which is determined as the same person as a person of the registered face image, an action in which identity verification is again performed by the staff of the facility or the like is taken (step S608).
The detected face image selected by the staff of the facility in the manual matching as described above can be registered as a new registered face image. In this case, the face matching unit 314 of the face matching apparatus 30 transmits image data of the selected detected face image and the face feature amount thereof to the datacenter server 40.
In the datacenter server 40, the control unit 402 stores the transmitted detected face image as a new registered face image in the storage unit 404 in association with the ID information of the annual pass on the ticket presentation. At this time, the control unit 402 also stores the face feature amount of the new registered face image in the storage unit 404 in association with the ID information of the annual pass on the ticket presentation.
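As an illustration of steps S604 to S608 plus the registration described just above, the sketch below records a staff selection and, when one is made, appends the selected detected face image as a new registered face image; the flat dictionary and the function name are assumptions standing in for the relational database of the storage unit 404.

```python
from typing import Dict, List, Optional, Tuple

# registered[id_info] -> list of (face_image_data, face_feature)
registered: Dict[str, List[Tuple[bytes, List[float]]]] = {}


def handle_manual_matching(
    id_info: str,
    detected_faces: List[Tuple[bytes, List[float]]],
    selected_index: Optional[int],
) -> str:
    """Process the staff's selection from the matching window.

    selected_index is the index of the detected face image judged to show the same
    person as the registered face image, or None when no such image exists.
    """
    if selected_index is None:
        return "staff_attendance"   # step S608: identity is verified again by the staff
    image_data, feature = detected_faces[selected_index]
    # Register the selected detected face image as a new registered face image.
    registered.setdefault(id_info, []).append((image_data, feature))
    return "open_gate"              # steps S316 to S320
```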
As in the present example embodiment, when identity verification by face matching based on a detected face image by the fixed camera 20 is failed, identity verification can be performed based on manual matching through visual observation by the staff of the facility.
Note that, while the case where manual matching through visual observation by the staff of the facility is performed in the face recognition system according to the first example embodiment has been described above, the example embodiment is not limited thereto. The manual matching through visual observation by the staff of the facility can be performed in the same manner as above in the face recognition systems according to the second to fifth example embodiments.
Seventh Example Embodiment
A face recognition system and a face recognition method according to a seventh example embodiment of the present invention will be described by using
The basic configuration of the face recognition system according to the present example embodiment is substantially the same as the configuration of the face recognition system according to the first example embodiment. In the face recognition system according to the present example embodiment, a part of the information accumulated in the storage unit 404 of the datacenter server 40 is stored in the face matching apparatus 30. In this regard, the face recognition system according to the present example embodiment is different from the face recognition system according to the first example embodiment.
As illustrated in
The storage unit 324 stores registered face images registered in association with ID information of issued annual passes and the face feature amounts thereof. The registered face images and the face feature amounts thereof stored in the storage unit 324 are a part of the registered face images and the face feature amounts thereof accumulated in the storage unit 404 of the datacenter server 40.
In the storage unit 324, a relational database is configured. In the relational database of the storage unit 324, face feature amounts of registered face images are stored in association with ID information of annual passes and face image data of the registered face images, as described above. Such mutually associated data is managed by an RDBMS. As an RDBMS, without being limited in particular, Microsoft (registered trademark) SQL Server is used, for example.
Also in the present example embodiment, in the same manner as in the first example embodiment, first, the face matching unit 314 attempts to acquire, online, a face feature amount of a registered face image registered in association with ID information of an annual pass on ticket presentation from the datacenter server 40 via the network 60.
At this time, there may be a case where the face matching apparatus 30 cannot be connected to the datacenter server 40 via the network 60 for some reason such as the face matching apparatus 30 being offline. When the face matching apparatus 30 cannot be connected to the datacenter server 40, the face matching unit 314 cannot acquire, online, a face feature amount of a registered face image from the datacenter server 40 via the network 60.
In this case, the face matching unit 314 refers to the relational database in the storage unit 324 of the face matching apparatus 30 and acquires, offline, the face feature amount of the registered face image registered in association with the ID information of the annual pass on the ticket presentation.
The face matching unit 314 can use a registered face image acquired from the storage unit 324 of the face matching apparatus 30 to perform N:1 matching in the same manner as in the first example embodiment.
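The online-first, offline-fallback lookup could be organized as below; fetch_feature_online, the ConnectionError signalling, the SQLite database, and the registered_faces table are all assumptions standing in for the network 60, the datacenter server 40, and the relational database of the storage unit 324 (for which the document names Microsoft SQL Server as one example RDBMS).

```python
import sqlite3
from typing import Callable, Optional


def fetch_feature_offline(db_path: str, id_info: str) -> Optional[bytes]:
    """Look up the registered face feature amount in the local storage unit."""
    with sqlite3.connect(db_path) as conn:
        row = conn.execute(
            "SELECT face_feature FROM registered_faces WHERE id_info = ?",
            (id_info,),
        ).fetchone()
    return row[0] if row else None


def get_registered_feature(
    id_info: str,
    fetch_feature_online: Callable[[str], bytes],
    db_path: str,
) -> Optional[bytes]:
    """Try the datacenter server first; fall back to the local database when offline."""
    try:
        return fetch_feature_online(id_info)          # online, via the network
    except ConnectionError:
        # The datacenter server cannot be reached: acquire the feature amount offline.
        return fetch_feature_offline(db_path, id_info)
```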
As discussed above, according to the present example embodiment, the face matching apparatus 30 has the storage unit 324, in which registered face images registered in association with ID information of the annual passes and the face feature amounts thereof are stored offline. Therefore, according to the present example embodiment, even when face feature amounts of registered face images cannot be acquired from the datacenter server 40 online, face matching can be performed.
Eighth Example Embodiment
A face recognition system and a face recognition method according to an eighth example embodiment of the present invention will be described by using
First, the face recognition system according to the present example embodiment will be described by using
In the above first to seventh example embodiments, the case of performing identity verification by face matching on a visitor, as an authentication subject, who is going to enter the inside of a facility by using an admission ticket at an entrance gate of the facility has been described. However, a situation of performing identity verification by face matching is not limited thereto. In the present example embodiment, a case of performing identity verification by face matching on a shopping customer, as an authentication subject, who performs electronic payment by using a credit card, a debit card, electronic money, or the like at a cashier area such as a register counter where a register terminal is installed in a shop will be described.
As illustrated in
The face matching apparatus 30 and the datacenter server 40 are each connected to the network 60 and can communicate with each other via the network 60 in the same manner as in the first example embodiment.
Further, the register terminal 810 and the electronic payment server 820 are each connected to the network 60 and can communicate with each other via the network 60.
Further, the register terminal 810 and the fixed camera 20 are directly, locally connected to the face matching apparatus 30 in a communicable manner through cable connection or the like, respectively. The connection among the register terminal 810, the fixed camera 20, and the face matching apparatus 30 may be of a wired scheme or a wireless scheme.
Next, each unit of the face recognition system 5 in the present example embodiment will be described in detail.
The register terminal 810 reads price information of a purchase item purchased by a shopping customer as well as other information and performs an accounting process for the purchase item. As described later, the register terminal 810 performs an accounting process based on electronic payment information notified when identity verification is successful by face matching.
A reading unit 830 is connected to the register terminal 810. The reading unit 830 is installed near and adjacent to the register terminal 810. The reading unit 830 reads information recorded in a membership card of the shop carried by a shopping customer. Specifically, in a membership card, ID information that uniquely identifies the membership card is recorded. The reading unit 830 reads the ID information from the membership card. The ID information read by the reading unit 830 is a membership number of the membership card, for example. A shopping customer carries the membership card when using the shop and causes the reading unit 830 to read the membership card at accounting. The membership card is a medium that is carried by a shopping customer, who is an authentication subject, that is required when the shopping customer makes a payment, and in which ID information uniquely identifying the membership card is recorded. As described later, in the datacenter server 40, information on registered members to whom membership cards are issued is accumulated in association with the ID information of the membership cards.
The reading unit 830 has a reading scheme in accordance with a recording scheme of ID information of a membership card. For example, when a membership card has ID information recorded in a one-dimensional code such as a barcode or a two-dimensional code such as a QR code (registered trademark), the reading unit 830 is a code reader such as a barcode reader, a QR code reader, or the like. Further, for example, when a membership card has ID information recorded in a non-contact IC card or a non-contact IC tag with Radio Frequency Identification (RFID), the reading unit 830 is an RFID reader.
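The correspondence between the card's recording scheme and the reader used can be expressed as a simple lookup table; the enum names below are illustrative assumptions, since concrete reader drivers are outside the scope of the description.

```python
from enum import Enum, auto


class RecordingScheme(Enum):
    BARCODE = auto()    # one-dimensional code
    QR_CODE = auto()    # two-dimensional code
    RFID = auto()       # non-contact IC card or IC tag


# Reader type chosen according to how the ID information is recorded on the card.
READER_FOR_SCHEME = {
    RecordingScheme.BARCODE: "barcode reader",
    RecordingScheme.QR_CODE: "QR code reader",
    RecordingScheme.RFID: "RFID reader",
}


def reader_for(scheme: RecordingScheme) -> str:
    return READER_FOR_SCHEME[scheme]
```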
When there is ticket presentation of a membership card on the reading unit 830, the reading unit 830 reads the ID information recorded in the membership card from the membership card. Note that ticket presentation here means that a shopping customer who is an authentication subject causes the reading unit 830 to read information including the ID information recorded in a membership card.
The reading unit 830 transmits ID information read from a membership card to the register terminal 810. The register terminal 810 transmits, to the face matching apparatus 30, the ID information of the membership card transmitted from the reading unit 830.
Further, the register terminal 810 performs an accounting process based on the electronic payment information transmitted from the face matching apparatus 30 described later.
The fixed camera 20 is fixed to the upper end of a support pillar 202 installed on the exit side of the cashier area 850 with respect to the register terminal 810. The fixed camera 20 captures an image of an area in front of the register terminal 810 and is fixed in an orientation facing the entrance side of the cashier area 850. The fixed camera 20 is fixed at a height above the head of a person of a height of around 200 cm, for example, from the floor of the cashier area 850 and is directed obliquely downward to face the area in front of the register terminal 810. Note that a fixing scheme of the fixed camera 20 is not limited to a scheme using the support pillar 202. For example, the fixed camera 20 may be hung from and fixed to the ceiling.
The fixed camera 20 fixed as described above captures an image of the area in front of the register terminal 810, which is the entrance side of the installation area of the register terminal 810 and the reading unit 830. Thereby, the fixed camera 20 can capture a shopping customer C in the area in front of the register terminal 810, which is the entrance side of the installation area of the reading unit 830.
Since other features of the fixed camera 20 are the same as those in the above first example embodiment, the description thereof will be omitted.
The face matching apparatus 30 has the face matching control unit 302, the storage unit 304, and the display unit 306 in the same manner as the first example embodiment. The face matching control unit 302 includes the image data acquisition unit 308, the face detection unit 310, the face feature amount extraction unit 312, and the face matching unit 314.
The image data acquisition unit 308 sequentially acquires image data of images transmitted from the fixed camera 20 at a predetermined cycle in the same manner as the first example embodiment.
The face detection unit 310 performs face detection on respective images of image data sequentially acquired from the image data acquisition unit 308 in the same manner as the first example embodiment. Note that, in the present example embodiment, the face detection unit 310 detects a face image of a shopping customer in an area in front of the register terminal 810.
The face feature amount extraction unit 312 extracts a face feature amount for respective detected face images detected by the face detection unit 310 in the same manner as the first example embodiment. Further, the face feature amount extraction unit 312 temporarily stores face image data, a face feature amount, a detection number, and a capturing time in the storage unit 304 in association with each other for respective detected face images in the same manner as the first example embodiment.
A relational database is configured in the storage unit 304, and face feature amounts and data related thereto are temporarily stored for a certain time period from the capturing time for respective detected face images in the same manner as the first example embodiment.
In response to ticket presentation of a membership card at the reading unit 830 connected to the register terminal 810, the face matching unit 314 performs identity verification by face matching for a shopping customer who made ticket presentation of the membership card at the reading unit 830.
ID information read by the reading unit 830 from the membership card on ticket presentation is transmitted to the face matching unit 314. The face matching unit 314 acquires the transmitted ID information and, via the network 60 from the datacenter server 40 described later, acquires a face feature amount of a registered face image registered in association with the ID information. A person of a registered face image acquired by the face matching unit 314 in such a way is a valid user who is allowed to validly use the membership card on ticket presentation. A valid user of a membership card is a registered member to which the membership card was issued, for example.
Further, in the same manner as the first example embodiment, the face matching unit 314 refers to the relational database of the storage unit 304 and acquires face feature amounts of N detected face images associated with the capturing time included in a predetermined period before ticket presentation. That is, the face matching unit 314 acquires face feature amounts of N detected face images captured by the fixed camera 20 before the reading unit 830 reads ID information from a membership card.
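The retrieval of the N detected face images captured in the predetermined period before ticket presentation might look like the query below; SQLite, the detected_faces table layout, and the 30-second window are assumptions made only to keep the sketch self-contained.

```python
import sqlite3
from datetime import datetime, timedelta
from typing import List, Tuple

WINDOW_SECONDS = 30  # assumed length of the predetermined period before ticket presentation


def detected_features_before(db_path: str,
                             presentation_time: datetime) -> List[Tuple[int, bytes]]:
    """Return (detection_number, face_feature) rows whose capturing time falls within
    the window immediately before the ticket presentation, newest first."""
    window_start = presentation_time - timedelta(seconds=WINDOW_SECONDS)
    with sqlite3.connect(db_path) as conn:
        rows = conn.execute(
            "SELECT detection_number, face_feature FROM detected_faces "
            "WHERE capturing_time BETWEEN ? AND ? "
            "ORDER BY capturing_time DESC",
            (window_start.isoformat(), presentation_time.isoformat()),
        ).fetchall()
    return rows
```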
The face matching unit 314 performs N:1 matching that sequentially matches respective face feature amounts of N detected face images, which have been captured before ticket presentation of a membership card, against a face feature amount of a registered face image in the same manner as the first example embodiment.
If matching by the face matching unit 314 is matched, this means that the valid user of the membership card on ticket presentation is among the shopping customers who were in front of the register terminal 810 before the ticket presentation. Thus, it can be estimated that the valid user of the membership card made the ticket presentation of the membership card. Therefore, in this case, identity verification by face matching is successful.
On the other hand, if all the matching by the face matching unit 314 is unmatched, this means that the valid user of the membership card on ticket presentation is not among the shopping customers who were in front of the register terminal 810 before the ticket presentation. Therefore, in this case, identity verification by face matching is failed.
A matching result or the like by the face matching unit 314 can be displayed on the display unit 306. The staff of the shop can confirm a matching result or the like by viewing the display on the display unit 306.
The face matching unit 314 transmits, to the register terminal 810, a matching result signal that is a signal indicating the matching result described above. Specifically, the face matching unit 314 transmits, to the register terminal 810, a matching-matched signal that is a signal indicating that the matching performed by the face matching unit 314 is matched or a matching-unmatched signal that is a signal indicating that all the matching performed by the face matching unit 314 is unmatched.
Furthermore, if the matching is matched, the face matching unit 314 acquires payment option information via the network 60 from the datacenter server 40 described later. The payment option information is information used for performing an electronic payment, is stored in association with the ID information of the membership card for which the matching is matched, and may be a credit card number or electronic money ID information, for example. The face matching unit 314 transmits the acquired payment option information to the register terminal 810 together with the matching-matched signal.
When a matching-matched signal is transmitted from the face matching unit 314, the register terminal 810 requests an electronic payment to the electronic payment server 820 described later. In this case, the register terminal 810 transmits, to the electronic payment server 820, electronic payment information that is information including the payment option information transmitted together with the matching-matched signal and an accounting price of a purchased item.
On the other hand, when a matching-unmatched signal is transmitted from the face matching unit 314, the register terminal 810 uses a warning display, a warning sound, or the like to notify the shop staff operating the register terminal 810 that the electronic payment cannot be made.
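A sketch of the register terminal's handling of the two result signals; the payload layout, the callable names, and the representation of the signal as a boolean are assumptions, since the document does not define a concrete message format.

```python
from typing import Callable, Optional


def handle_matching_result(
    matched: bool,
    payment_option_info: Optional[str],
    accounting_price: int,
    send_to_payment_server: Callable[[dict], None],
    notify_staff: Callable[[str], None],
) -> None:
    """Register terminal side: request an electronic payment on a matching-matched
    signal, or warn the shop staff on a matching-unmatched signal."""
    if matched and payment_option_info is not None:
        electronic_payment_info = {
            "payment_option": payment_option_info,  # e.g. credit card number or e-money ID
            "amount": accounting_price,             # accounting price of the purchased item
        }
        send_to_payment_server(electronic_payment_info)         # electronic payment request
    else:
        notify_staff("The electronic payment cannot be made.")  # warning display or sound
```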
The datacenter server 40 has the control unit 402 and the storage unit 404 in the same manner as the first example embodiment.
The storage unit 404 accumulates registered face images registered in association with ID information of issued membership cards and face feature amounts thereof. Furthermore, in the storage unit 404, payment option information associated with ID information of membership cards is stored.
In response to a request from the face matching unit 314, the control unit 402 provides, to the face matching unit 314, a face feature amount of a registered face image registered in association with the ID information of the membership card on ticket presentation. Furthermore, in response to a request from the face matching unit 314, the control unit 402 provides, to the face matching unit 314, the payment option information associated with the ID information of the membership card on ticket presentation.
A registered face image can be uploaded to the datacenter server 40 from a terminal in a shop, a terminal of a shopping customer, or the like when the shopping customer acquires a membership card of the shop, for example. In the datacenter server 40, the control unit 402 accumulates the uploaded registered face image in the storage unit 404.
For a registered face image accumulated in the storage unit 404 of the datacenter server 40 as described above, a face feature amount is extracted and accumulated in the storage unit 404 in the same manner as the first example embodiment.
A relational database is configured in the storage unit 404, and face feature amounts of registered face images are stored in association with ID information of membership cards and face image data of the registered face images in the same manner as the first example embodiment. Furthermore, in the storage unit 404, payment option information is stored in association with ID information of membership cards.
Note that, in the relational database of the storage unit 404, for example, information on names, contact addresses, or the like of registered members who are valid users of membership cards is stored in addition to the above in association with ID information of membership cards.
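One possible relational layout for the information described above, written as SQLite DDL purely as an illustration (the document names Microsoft SQL Server as an example RDBMS elsewhere); the exact column names and the split into two tables are assumptions.

```python
import sqlite3

SCHEMA = """
CREATE TABLE IF NOT EXISTS members (
    id_info        TEXT PRIMARY KEY,   -- ID information of the membership card
    member_name    TEXT,               -- name of the registered member
    contact        TEXT,               -- contact address
    payment_option TEXT                -- e.g. credit card number or e-money ID
);
CREATE TABLE IF NOT EXISTS registered_faces (
    id_info      TEXT REFERENCES members(id_info),
    face_image   BLOB,                 -- registered face image data
    face_feature BLOB                  -- extracted face feature amount
);
"""


def initialize(db_path: str) -> None:
    """Create the illustrative tables if they do not exist yet."""
    with sqlite3.connect(db_path) as conn:
        conn.executescript(SCHEMA)
```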
The electronic payment server 820 performs an electronic payment by a credit card, electronic money, or the like. The electronic payment server 820 performs an electronic payment for an item purchased by a shopping customer based on the electronic payment information transmitted from the register terminal 810.
As described above, the face recognition system 5 according to the present example embodiment matches, against a registered face image registered in association with the ID information of the membership card on the ticket presentation, detected face images captured by the fixed camera 20 before ticket presentation in which the reading unit 830 reads ID information from a membership card is performed. That is, in the face recognition system 5 according to the present example embodiment, the detected face image that is an image of a matching subject to be matched against a registered face image is acquired in advance before ticket presentation of a membership card.
Thus, according to the present example embodiment, after a shopping customer makes ticket presentation of a membership card, it is not necessary for the staff of the shop to capture a face image of the shopping customer as an image of a matching subject to be matched against a registered face image. Further, the shopping customer neither needs to be concerned about capturing of the face image thereof nor needs to perform a special movement such as positioning of the face thereof for the capturing. Therefore, according to the present example embodiment, face matching can be made smoothly in a short time.
Next, a face recognition method according to the present example embodiment using the face recognition system 5 according to the present example embodiment will be further described by using
The fixed camera 20 captures an area in front of the register terminal 810 that is the entrance side to the installation area of the reading unit 830 and acquires a plurality of images continuously at a predetermined cycle. Since the process from the capturing by the fixed camera 20 up to temporarily storing a face feature amount of a detected face image is the same as the process illustrated in
A process from the ticket presentation of a membership card to execution of an electronic payment will be described below by using
At the register terminal 810, the reading unit 830 stands by until ticket presentation of a membership card is performed (step S702, NO).
In response to ticket presentation of the membership card at the reading unit 830 by a shopping customer (step S702, YES), the reading unit 830 reads ID information recorded in the membership card from the membership card (step S704).
Subsequently, the register terminal 810 transmits the ID information read by the reading unit 830 to the face matching apparatus 30 (step S706).
In the face matching apparatus 30, the face matching unit 314 transmits, to the datacenter server 40 via the network 60, the ID information transmitted from the register terminal 810 and requests a face feature amount of a registered face image. Thereby, from the datacenter server 40 via the network 60, the face matching unit 314 acquires, online, a face feature amount of a registered face image registered in association with the transmitted ID information (step S708).
Further, the face matching unit 314 refers to the relational database of the storage unit 304 to acquire, offline, a face feature amount of N detected face images associated with the capturing time included in a predetermined period before the ticket presentation of the membership card (step S710). That is, the face matching unit 314 acquires face feature amounts of N detected face images captured by the fixed camera 20 before the reading unit 830 reads ID information from the membership card.
The face matching unit 314 performs N:1 matching that sequentially matches respective face feature amounts of the acquired N detected face images against the face feature amount of the acquired registered face image in the same manner as the first example embodiment (step S712).
If the matching performed by the face matching unit 314 is matched (step S714, YES), the face matching unit 314 acquires payment option information from the datacenter server 40 via the network 60 (step S716). Subsequently, the face matching unit 314 transmits, to the register terminal 810 together with the acquired payment option information, a matching-matched signal indicating that the matching performed by the face matching unit 314 is matched (step S718).
In response to the transmission of the matching-matched signal from the face matching unit 314, the register terminal 810 transmits, to the electronic payment server 820, electronic payment information including the transmitted payment option information and an accounting price of a purchased item (step S720).
The electronic payment server 820 executes the electronic payment for the item purchased by the shopping customer based on the electronic payment information transmitted from the register terminal 810 (step S722).
On the other hand, if all the matching performed by the face matching unit 314 is unmatched (step S714, NO), the face matching unit 314 transmits, to the register terminal 810, a matching-unmatched signal indicating that all the matching performed by the face matching unit 314 is unmatched (step S724).
In response to the transmission of the matching-unmatched signal from the face matching unit 314, the register terminal 810 uses a warning display, a warning sound, or the like to notify the shop staff operating the register terminal 810 that the electronic payment cannot be made (step S726).
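Putting steps S708 through S724 together on the face matching apparatus side as a hedged sketch: every helper passed in (the online feature fetch, the local window query, the matcher, the payment option lookup, and the two signal senders) is hypothetical and merely mirrors the flow described above.

```python
from typing import Callable, List

MATCH_THRESHOLD = 0.8  # assumed value


def on_ticket_presentation(
    id_info: str,
    fetch_registered_feature: Callable[[str], List[float]],
    fetch_detected_features: Callable[[], List[List[float]]],
    match_score: Callable[[List[float], List[float]], float],
    fetch_payment_option_info: Callable[[str], str],
    send_matched: Callable[[str], None],
    send_unmatched: Callable[[], None],
) -> None:
    """N:1 matching triggered by ticket presentation, followed by a result signal."""
    registered = fetch_registered_feature(id_info)   # step S708: acquired online
    detected = fetch_detected_features()             # step S710: acquired offline
    for feature in detected:                         # step S712: N:1 matching
        if match_score(feature, registered) >= MATCH_THRESHOLD:
            payment_option_info = fetch_payment_option_info(id_info)  # step S716
            send_matched(payment_option_info)        # step S718: matching-matched signal
            return
    send_unmatched()                                 # step S724: matching-unmatched signal
```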
The process illustrated in
As discussed above, according to the present example embodiment, a face feature amount of a detected face image detected from an image captured by the fixed camera 20 before ticket presentation of a membership card is matched against a face feature amount of a registered face image registered in association with the ID information of the membership card on the ticket presentation. Therefore, according to the present example embodiment, face matching can be performed smoothly in a short time.
Note that, also when identity verification is performed by face matching at a cashier area where a register terminal is installed in the shop described above, the face recognition system may be configured in the same manner as the second to seventh example embodiments described above to perform the same process as that in the second to seventh example embodiments.
Ninth Example Embodiment
A computer apparatus according to the ninth example embodiment of the present invention will be described by using
The computer apparatus 1000 has a processor 1002, memory 1004, and a storage device 1006. Further, the computer apparatus 1000 has a high-speed controller 1008 including a high-speed interface and a low-speed controller 1010 including a low-speed interface. The memory 1004 and a high-speed expansion port 1012 are connected to the high-speed controller 1008. Further, an external input/output device such as a display 1016 or the like is connected to the high-speed controller 1008. On the other hand, a low-speed expansion port 1014 and the storage device 1006 are connected to the low-speed controller 1010.
The processor 1002, the memory 1004, the storage device 1006, the high-speed controller 1008, the low-speed controller 1010, and the high-speed expansion port 1012 are connected to each other through various buses. Further, the processor 1002, the memory 1004, the storage device 1006, the high-speed controller 1008, the low-speed controller 1010, and the high-speed expansion port 1012 may be implemented on a common motherboard or may be implemented in other forms as appropriate.
The processor 1002 is a central processing unit (CPU), for example, and is able to process instructions executed within the computer apparatus 1000. Such instructions include an instruction that is used for displaying graphics information of a graphical user interface (GUI) on an external input/output device such as the display 1016 and stored in the memory 1004 or the storage device 1006.
Further, a plurality of processors, a plurality of buses, or a plurality of processors and a plurality of buses can be used as appropriate together with a plurality of memory devices and multiple types of memory devices. Further, a plurality of the computer apparatuses 1000 can be connected to each other, with each apparatus performing a part of the necessary processing. For example, a plurality of the computer apparatuses 1000 can be connected to each other as a server bank, a group of blade servers, or a multiprocessor system.
The memory 1004 stores therein information within the computer apparatus 1000. For example, the memory 1004 may be a volatile memory unit or a non-volatile memory unit. The memory 1004 may be another computer readable medium, such as a magnetic disk, an optical disk, or the like, for example.
The storage device 1006 can provide mass storage used for the computer apparatus 1000. The storage device 1006 may be, for example, a computer readable medium such as a floppy (registered trademark) disk device, a hard disk device, an optical disk device, a tape device, a solid state memory device such as a flash memory, a disk array, or the like. Further, the storage device 1006 may include such a computer readable storage medium. The storage device 1006 may include a storage area network or a device with another configuration. A computer program product may be tangibly embodied in an information carrier. The computer program product can also store an instruction that executes one or a plurality of processes as described above when executed. The information carrier may be a memory device such as the memory 1004, the storage device 1006, or the memory on the processor 1002, or may be a computer readable medium or a machine readable medium such as a carrier signal.
The high-speed controller 1008 manages processes in which the bandwidth for the computer apparatus 1000 is intensively used. On the other hand, the low-speed controller 1010 manages processes in which the bandwidth is less intensively used. However, such allocation of the functions is a mere example, and allocation is not limited thereto. Further, a part or a whole of the high-speed controller 1008 may be incorporated in the processor 1002.
The high-speed controller 1008 is connected to the high-speed expansion port 1012 that can accept the memory 1004 and various expansion cards. Further, the high-speed controller 1008 is connected to the display 1016 via a graphics processor or an accelerator, for example.
Further, the low-speed controller 1010 is connected to the storage device 1006 and the low-speed expansion port 1014. The low-speed expansion port 1014 can include, for example, a communication port of various standards such as Universal Serial Bus (USB), Bluetooth (registered trademark), wired or wireless Ethernet (registered trademark), or the like. One or a plurality of input/output devices such as a keyboard, a pointing device, a scanner, or the like can be connected to the low-speed expansion port 1014. Further, one or a plurality of network devices such as a switch, a router, or the like can be connected to the low-speed expansion port 1014 via a network adapter, for example.
The computer apparatus 1000 can be implemented in many different forms. For example, the computer apparatus 1000 can be implemented in the form of a typical server or as a plurality of servers in the form of a group of such servers. Further, the computer apparatus 1000 can be implemented as a part of a rack server system. Furthermore, the computer apparatus 1000 can be implemented in the form of a personal computer such as a laptop computer, a desktop computer, or the like.
The computer apparatus 1000 described above can function as a part of the gate apparatus 10 in the example embodiments described above. In this case, the processor 1002 of the computer apparatus 1000 can function as the gate control unit 112 by executing a program that implements the function of the gate control unit 112 of the gate apparatus 10.
Further, the computer apparatus 1000 can function as the face matching apparatus 30 in the example embodiments described above. In this case, the processor 1002 of the computer apparatus 1000 can function as the face matching control unit 302 by executing a program that implements the function of the face matching control unit 302 of the face matching apparatus 30. That is, the processor 1002 executes programs that implement the functions of respective units of the image data acquisition unit 308, the face detection unit 310, the face feature amount extraction unit 312, the face matching unit 314, the priority calculation unit 318, and the identical-person processing unit 320. Thereby, the processor 1002 can function as each unit of the image data acquisition unit 308, the face detection unit 310, the face feature amount extraction unit 312, the face matching unit 314, the priority calculation unit 318, and the identical-person processing unit 320. Further, the storage device 1006 of the computer apparatus 1000 can function as the storage units 304 and 324 of the face matching apparatus 30.
The computer apparatus 1000 can function as the datacenter server 40 in the example embodiments described above. In this case, the processor 1002 of the computer apparatus 1000 can function as the control unit 402 by executing a program that implements the function of the control unit 402 of the datacenter server 40. Further, the storage device 1006 of the computer apparatus 1000 can function as the storage unit 404 of the datacenter server 40.
Note that a part or a whole of the program executed by the processor 1002 of the computer apparatus 1000 can be provided by a computer readable storage medium storing the above, such as a digital versatile disc-read only memory (DVD-ROM), a compact disc-read only memory (CD-ROM), a flash memory such as a USB memory or the like.
Other Example Embodiments
The face recognition system illustrated in each of the example embodiments described above can be configured as illustrated in
As illustrated in
Further, the face matching apparatus described in each of the above example embodiments can be configured as illustrated in
As illustrated in
Further, the face recognition system described in each of the above example embodiments can be configured as illustrated in
As illustrated in
The present invention is not limited to the example embodiments described above, and various modifications are possible.
For example, while the situations where identity verification by face matching is performed for a visitor who intends to enter the inside of a facility as an authentication subject and for a shopping customer who makes an electronic payment as an authentication subject, respectively, have been described as examples in the above example embodiments, the invention is not limited thereto. An authentication subject refers to a person to be verified as to whether or not the person has some authority, such as a visitor entering a facility, an entrant at an immigration examination, or the like. A situation where identity verification by face matching is performed may be, for example, immigration control, entrance and exit control for a room, or the like.
Further, while the cases where the fixed camera is used as an image capturing unit have been described as examples in the above example embodiments, a camera that functions as the image capturing unit is not limited thereto. For example, instead of the fixed camera 20, a movable camera in which the orientation thereof can be changed by a pan function, a tilt function, or the like may be used as the image capturing unit. In the case of a movable camera, the orientation thereof can be changed by automatic control or remote control.
Further, while the cases where identity verification by face matching is performed after ticket presentation is made by an authentication subject whose annual pass or membership card information is read by the reading unit 108 or 830 have been described as examples in the above example embodiments, the invention is not limited thereto. For example, the reading units 108 and 830 can be configured to read, from a medium carried by an authentication subject, identification information such as ID information uniquely identifying the medium by using wireless communication or the like without requiring action, namely, ticket presentation by an authentication subject.
The whole or part of the example embodiments disclosed above can be described as, but not limited to, the following supplementary notes.
Supplementary Note 1
A face recognition system comprising:
a face detection unit that detects a face image from an image including an authentication subject as a detected face image;
a storage unit that stores identification information identifying the authentication subject and a registered face image of the authentication subject in association with each other; and
a face matching unit that, in response to acquisition of the identification information identifying the authentication subject, matches, against the registered face image corresponding to the acquired identification information, the detected face image detected by the face detection unit from an image captured before the acquisition.
Supplementary Note 2
The face recognition system according to supplementary note 1,
wherein the face detection unit detects a plurality of detected face images from a plurality of frames of images, and
wherein the face matching unit matches the plurality of detected face images against the registered face image.
Supplementary Note 3
The face recognition system according to supplementary note 2 further comprising a priority calculation unit that calculates a priority used for determining order of performing matching against the registered face image for the plurality of detected face images,
wherein the face matching unit matches the plurality of detected face images against the registered face image in descending order of the priority calculated by the priority calculation unit.
Supplementary Note 4
The face recognition system according to supplementary note 2 or 3 further comprising an identical-person processing unit that classifies the plurality of detected face images on an identical person basis and calculates a priority for determining order of performing matching against the registered face image for the detected face image classified into the identical person,
wherein the face matching unit matches the detected face image classified into the identical person against the registered face image in descending order of the priority calculated by the identical-person processing unit.
Supplementary Note 5
The face recognition system according to any one of supplementary notes 1 to 4 further comprising an update unit that registers, as a new registered face image, the detected face image in which matching against the registered face image is matched.
Supplementary Note 6
The face recognition system according to any one of supplementary notes 1 to 5 further comprising:
an image capturing unit that captures an image including the authentication subject; and
another image capturing unit that captures a face image of the authentication subject and acquires the face image of the authentication subject when matching of the detected face image against the registered face image is not matched,
wherein the face matching unit matches, against the registered face image, the face image of the authentication subject acquired by the another image capturing unit.
Supplementary Note 7
The face recognition system according to any one of supplementary notes 1 to 5 further comprising a display unit that displays the detected face image and the registered face image when matching of the detected face image and the registered face image is not matched.
Supplementary Note 8
The face recognition system according to any one of supplementary notes 1 to 7, wherein the face matching unit acquires, offline, the registered face image to be matched against the detected face image.
Supplementary Note 9
The face recognition system according to any one of supplementary notes 1 to 8 further comprising an image capturing unit that captures an image including the authentication subject,
wherein the image capturing unit is installed in a vertical orientation so as to capture the image that is vertically long.
Supplementary Note 10
A face recognition method comprising:
detecting a face image from an image including an authentication subject as a detected face image; and
in response to acquisition of the identification information identifying the authentication subject, matching, against a registered face image registered in association with the acquired identification information, the detected face image detected from an image captured before the acquisition.
Supplementary Note 11
The face recognition method according to supplementary note 10 further comprising:
detecting a plurality of detected face images from a plurality of frames of images; and
matching the plurality of detected face images against the registered face image.
Supplementary Note 12
A storage medium in which a program is stored, wherein the program causes a computer to execute:
detecting a face image from an image including an authentication subject as a detected face image; and
in response to acquisition of the identification information identifying the authentication subject, matching, against a registered face image registered in association with the acquired identification information, the detected face image detected from an image captured before the acquisition.
While the present invention has been described with reference to the example embodiments, the present invention is not limited to the example embodiments described above. Various modifications that can be understood by those skilled in the art can be made to the configuration or the details of the present invention within the scope of the present invention.
This application is based upon and claims the benefit of priority from Japanese Patent Application No. 2016-036406, filed on Feb. 26, 2016, the disclosure of which is incorporated herein in its entirety by reference.
REFERENCE SIGNS LIST
- 1, 2, 3, 4, 5 face recognition system
- 10 gate apparatus
- 20 fixed camera
- 30 face matching apparatus
- 40 datacenter server
- 108 reading unit
- 110 hand camera
- 112 gate control unit
- 302 face matching control unit
- 304 storage unit
- 308 image data acquisition unit
- 310 face detection unit
- 312 face feature amount extraction unit
- 314 face matching unit
- 318 priority calculation unit
- 320 identical-person processing unit
- 324 storage unit
- 402 control unit
- 404 storage unit
- 810 register terminal
- 830 reading unit
Claims
1. A face recognition system comprising:
- a memory comprising instructions; and
- a processor configured to execute the instructions to:
- detect a face image from an image including an authentication subject as a detected face image;
- in response to acquisition of identification information identifying the authentication subject, match, against the registered face image corresponding to the acquired identification information, the detected face image detected from an image captured before the acquisition, the identification information and the registered face image of the authentication subject being stored in association with each other.
2. The face recognition system according to claim 1, wherein the processor is further configured to execute the instructions to:
- detect a plurality of detected face images from a plurality of frames of images, and
- match the plurality of detected face images against the registered face image.
3. The face recognition system according to claim 2, wherein the processor is further configured to execute the instructions to:
- calculate a priority used for determining order of performing matching against the registered face image for the plurality of detected face images; and
- match the plurality of detected face images against the registered face image in descending order of the calculated priority.
4. The face recognition system according to claim 2, wherein the processor is further configured to execute the instructions to:
- classify the plurality of detected face images on an identical person basis and calculate a priority for determining order of performing matching against the registered face image for the detected face image classified into the identical person; and
- match the detected face image classified into the identical person against the registered face image in descending order of the priority.
5. The face recognition system according to claim 1, wherein the processor is further configured to execute the instructions to:
- register, as a new registered face image, the detected face image in which matching against the registered face image is matched.
6. The face recognition system according to claim 1 further comprising:
- an image capturing unit that captures an image including the authentication subject; and
- another image capturing unit that captures a face image of the authentication subject and acquires the face image of the authentication subject when matching of the detected face image against the registered face image is not matched,
- wherein the processor is further configured to execute the instructions to:
- match, against the registered face image, the face image of the authentication subject acquired by the another image capturing unit.
7. The face recognition system according to claim 1 further comprising a display unit that displays the detected face image and the registered face image when matching of the detected face image and the registered face image is not matched.
8. The face recognition system according to claim 1, wherein the processor is further configured to execute the instructions to:
- acquire, offline, the registered face image to be matched against the detected face image.
9. The face recognition system according to claim 1 further comprising an image capturing unit that captures an image including the authentication subject,
- wherein the image capturing unit is installed in a vertical orientation so as to capture the image that is vertically long.
10. A face recognition method comprising:
- detecting a face image from an image including an authentication subject as a detected face image; and
- in response to acquisition of the identification information identifying the authentication subject, matching, against a registered face image registered in association with the acquired identification information, the detected face image detected from an image captured before the acquisition.
11. The face recognition method according to claim 10 further comprising:
- detecting a plurality of detected face images from a plurality of frames of images; and
- matching the plurality of detected face images against the registered face image.
12. A non-transitory storage medium in which a program is stored, wherein the program causes a computer to execute:
- detecting a face image from an image including an authentication subject as a detected face image; and
- in response to acquisition of the identification information identifying the authentication subject, matching, against a registered face image registered in association with the acquired identification information, the detected face image detected from an image captured before the acquisition.
Type: Application
Filed: Feb 23, 2017
Publication Date: Feb 14, 2019
Patent Grant number: 11055513
Applicant: NEC CORPORATION (Tokyo)
Inventors: Noriaki HAYASE (Tokyo), Hiroshi TEZUKA (Tokyo)
Application Number: 16/079,814