REMAINING THERMAL TRACE EXTRACTION METHOD, REMAINING THERMAL TRACE EXTRACTION APPARATUS AND PROGRAM
A heat trace area extraction method executed by a computer, includes: generating a first differential image with respect to a visible image of a background of a certain range for a visible image in which the certain range is captured; generating a second differential image with respect to a thermal image of the background for a thermal image in which the certain range is captured by a thermal camera; and extracting a heat trace area by removing an area of a real object from the thermal image on the basis of the first differential image and the second differential image.
The present invention relates to a heat trace area extraction method, a heat trace area extraction apparatus, and a program.
BACKGROUND ART
With the epidemic of the novel coronavirus infection (COVID-19), various measures have been taken to prevent infection, among which are disinfection and sterilization. Since virus infection occurs when the mouth, nose, or mucous membrane of the eyes is touched with fingers to which the virus adheres, disinfection of objects around the body that are likely to be touched by the fingers of an infected person is recommended. For this reason, in restaurants and sports gyms used by many people, measures are taken such as periodically wiping desks, doors, training equipment, and the like with an alcohol disinfectant or the like.
It is difficult to visually confirm which part of an object has actually been touched, and in such stores, a countermeasure of disinfecting, at every predetermined time, all places that people may touch is often performed. However, the work of periodically disinfecting all places that people may touch involves a large amount of labor. For this reason, a method for improving the efficiency of disinfection using a drone has been proposed (for example, NPL 1).
On the other hand, a method of detecting a place where a person is present using a monitoring camera or the like and disinfecting only that place is conceivable. There are many examples of research on human detection using video, and in recent years a place where a person is present can be identified from a video with considerably high accuracy. If only a place where a person is present can be disinfected using such a technology, the labor of disinfection is considered to be reduced. For example, it is not necessary to disinfect the periphery of a place during a time period in which no person uses it at all. Conversely, if it is known that a plurality of persons have used a place, the place can be disinfected earlier. With disinfection at regular time intervals, infection mediated by an object, that is, infection caused by an infected person touching an object and another person then touching the same object, cannot be prevented within the interval, but it is considered that such spread of infection can be further reduced if disinfection can be performed flexibly according to human use. In addition, such a method can prevent unnecessary disinfection, and thus the effect of reducing the amount of disinfectant used can be expected. That is, if disinfection can be performed in accordance with the use of an object by a person detected by a monitoring camera, it is possible to expect reduced labor, prevention of the spread of infection, and savings in disinfectant as compared to disinfecting, at regular time intervals, everything that may have been used.
It is considered that labor and disinfectant are further reduced if it is possible to disinfect, more precisely, only a place that a person actually touches instead of disinfecting everything around a place where a person is present. However, it is difficult to identify whether or not a person who is present on the spot actually touches an object with a method using a visible video captured by a monitoring camera or the like. For example, assume that a camera is installed facing downward from the ceiling and captures an image of a table. When a hand is put out over the table, it is difficult to determine from the video captured by the installed camera whether or not the hand is touching the table. Therefore, a method of detecting contact with an object using a shadow has been proposed (NPL 2).
CITATION LIST
Non Patent Literature
[NPL 1] "Efficiently disinfect stadium using drone! Devised by a US start-up for the novel coronavirus," [online], Internet <URL: https://techable.jp/archives/124749>
[NPL 2] Ryusei Yoshida, Feng Yaokai, Seiichi Uchida, "Touch sensing with image recognition," 2011 Electrical Association Kyushu Branch Joint Conference, 2011.
SUMMARY OF INVENTION
Technical Problem
However, the method of NPL 2 requires a strong light source such as a projector. Further, recognition accuracy is assumed to be considerably affected by the positional relationship between the camera and the light source. A strong light source cannot be freely installed in many environments, and the method is therefore not considered suitable for the purpose of detecting and presenting a place touched by a person in various places and supporting disinfection.
The present invention was made in view of the aforementioned circumstances and an object of the present invention is to improve the accuracy of detection of places touched by people.
Solution to Problem
Accordingly, in order to solve the above problem, a computer executes a differential visible image generation procedure of generating a first differential image with respect to a visible image of a background of a certain range for a visible image in which the certain range is captured, a differential thermal image generation procedure of generating a second differential image with respect to a thermal image of the background for a thermal image in which the certain range is captured, and an extraction procedure of extracting a heat trace area on the basis of the first differential image and the second differential image.
Advantageous Effects of Invention
It is possible to improve the accuracy of detection of places touched by people.
In the present embodiment, a device, a method, and a program for detecting a place touched by a person using a thermal image in order to help sterilize or disinfect viruses are disclosed. Since humans are homeothermic animals and heat emanates from their hands and feet, heat remains at a contact place for a certain period of time after a person touches something. For example, a method of exploiting such heat traces to infer the passcode of a smartphone has been reported ("Yomna Abdelrahman, Mohamed Khamis, Stefan Schneegass, and Florian Alt. 2017. Stay Cool! Understanding Thermal Attacks on Mobile-based User Authentication. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (CHI '17), pp. 3751-3763, 2017").
Heat traces remain not only on the screen of a smartphone but also on various places such as desks and walls. That is, if a heat trace is identified on the basis of a video (thermal image) of a thermal camera, a place touched by a person indoors or the like can be detected precisely.
A heat trace area can be extracted by background subtraction using, as the background, a thermal image captured before a person touches anything. However, since the human body itself emits heat, the human body area is also extracted along with the heat trace by this method. Therefore, in the present embodiment, a visible image is acquired simultaneously with a thermal image, and a heat trace area is extracted by comparing the thermal image with the visible image.
Specifically, background subtraction is performed for each of the visible image and the thermal image, and a heat trace area is extracted from the difference between the results of background subtraction. Since a heat trace cannot be observed in a visible image (that is, with the naked eye), the heat trace cannot be extracted even if background subtraction is performed on the visible image using, as the background, a visible image captured before a person touches anything. On the other hand, when a person is present on the spot, if background subtraction is performed using, as the background, a visible image captured in a state where the person is not present, the area of the person is extracted. That is, when an area extracted by background subtraction in the thermal image is similarly extracted in the visible image, it can be ascertained that the area is not a heat trace. Conversely, an area extracted in the thermal image by background subtraction but not extracted in the visible image is highly likely to be a heat trace. In the present embodiment, a heat trace area extracted by such a method is visualized, and the place touched by a person is conveyed to a user. Simultaneous acquisition of a thermal image and a visible image may be performed using a device such as a sensor node ("Yoshinari Shirai, Yasue Kishino, Takayuki Suyama, Shin Mizutani: PASNIC: a thermal based privacy-aware sensor node for image capturing, UbiComp/ISWC '19 Adjunct, pp. 202-205, 2019") including a visible light camera and a thermal camera.
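As a rough, non-limiting sketch of this idea, the pixel-wise relationship between the two background subtraction results can be written as follows; the embodiment described below actually compares extracted areas rather than individual pixels, and the NumPy-based form and the array names are assumptions introduced only for illustration.

```python
import numpy as np

# Minimal sketch: given binary foreground masks from background subtraction
# (1 = differs from the background, 0 = same as the background), a candidate
# heat trace pixel is one that changed in the thermal image but not in the
# visible image (a change in the visible image indicates a real object such
# as a hand rather than a heat trace).
def heat_trace_mask(thermal_diff: np.ndarray, visible_diff: np.ndarray) -> np.ndarray:
    return np.logical_and(thermal_diff == 1, visible_diff == 0).astype(np.uint8)
```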
Embodiments of Present Invention
An outline of an embodiment of the present invention will be described with reference to the drawings.
When disinfecting the area actually touched by the hand is considered, it suffices to disinfect the differential area extracted by background subtraction on the thermal image at the time t3. On the other hand, the differential area extracted by background subtraction on the thermal image at the time t2 includes a part that does not touch the door. The differential area extracted in the thermal image at the time t2 is actually an area where the human body is present and is not an area of a heat trace remaining after the door is actually touched. From the viewpoint of disinfection, only the differential area extracted at the time t3 needs to be identified, and the differential area extracted by background subtraction at the time t2 is unnecessary.
Therefore, in the present embodiment, when a similar differential area is extracted by background subtraction even in the visible image, it is determined that the differential area is not a heat trace area. Since the shape of the arm is extracted even in the visible image at the time t2, it is determined that the differential area extracted in the thermal image is not a heat trace area (that is, a part touched by the person). On the other hand, since the area extracted by background subtraction of the thermal image is not extracted in the visible image at the time t3, it is determined that the differential area extracted in the thermal image is a heat trace area (that is, the part touched by the person). When the system presents information indicating the heat trace area extracted on the basis of such determination, a user who has viewed the information can efficiently disinfect the part touched by the person.
Hereinafter, the heat trace area extraction apparatus 10 for realizing the above-described embodiment will be described in detail.
A program that realizes processing of the heat trace area extraction apparatus 10 is provided by a recording medium 101 such as a CD-ROM. When the recording medium 101 storing the program is set in the drive device 100, the program is installed from the recording medium 101 to the auxiliary storage device 102 via the drive device 100. The program may not necessarily be installed from the recording medium 101 and may be downloaded from another computer via a network. The auxiliary storage device 102 stores the installed program and stores necessary files, data, and the like.
The memory device 103 reads the program from the auxiliary storage device 102 and stores the program when an instruction for starting the program is issued. The CPU 104 executes functions relevant to the heat trace area extraction apparatus 10 according to the program stored in the memory device 103. The interface device 105 is used as an interface for connection to a network.
As shown in the drawings, the heat trace area extraction apparatus 10 includes a visible image acquisition unit 11, a background visible image generation unit 12, a differential visible image generation unit 13, a thermal image acquisition unit 14, a background thermal image generation unit 15, a differential thermal image generation unit 16, a heat trace area extraction unit 17, and a heat trace area output unit 18.
In step S101, the visible image acquisition unit 11 acquires a visible image captured by the visible light camera 21 and input from the visible light camera 21, and the thermal image acquisition unit 14 acquires a thermal image captured by the thermal camera 22 and input from the thermal camera 22. In step S101, acquisition of the visible image by the visible image acquisition unit 11 and acquisition of the thermal image by the thermal image acquisition unit 14 may or may not be performed simultaneously. If simultaneous acquisition is not performed, some frames of the camera having the higher frame rate may be ignored in accordance with the camera having the lower frame rate. In addition, as long as the frame rate is sufficiently high, there is also no problem with a method of alternately acquiring still images from the visible light camera 21 and the thermal camera 22 and regarding the acquired images as having been captured simultaneously. The visible image acquisition unit 11 transmits the acquired visible image to the background visible image generation unit 12, and the thermal image acquisition unit 14 transmits the acquired thermal image to the background thermal image generation unit 15.
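As one possible way of handling unsynchronized cameras, frames may be paired by nearest timestamp. The following is a rough sketch under the assumption that each frame is available as a (timestamp, image) pair; this pairing strategy and the data format are illustrative assumptions rather than part of the embodiment itself.

```python
# Sketch of pairing unsynchronized streams: for each frame of the camera with the
# lower frame rate, select the frame of the other camera whose timestamp is closest,
# and ignore the remaining frames of the higher-frame-rate camera.
def pair_frames(low_fps_frames, high_fps_frames):
    pairs = []
    for t_low, frame_low in low_fps_frames:
        _, frame_high = min(high_fps_frames, key=lambda tf: abs(tf[0] - t_low))
        pairs.append((frame_low, frame_high))
    return pairs
```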
Subsequently, the background visible image generation unit 12 stores the visible image transmitted from the visible image acquisition unit 11 in an auxiliary storage device 102, and the background thermal image generation unit 15 stores the thermal image transmitted from the thermal image acquisition unit 14 in the auxiliary storage device 102 (S102).
Steps S101 and S102 are repeated until a predetermined time T1 elapses. The predetermined time T1 may be a period in which one or more visible images and one or more thermal images are accumulated in the auxiliary storage device 102.
When the predetermined time T1 elapses from execution of the first step S101 (Yes in S103), processing proceeds to step S104. In step S104, the background visible image generation unit 12 generates a background image of a photographing range (referred to as a “background visible image” hereinafter) on the basis of a visible image group accumulated in the auxiliary storage device 102 in the predetermined period T1.
In addition, in step S104, the background thermal image generation unit 15 generates a background image of a photographing range (referred to as a “background thermal image” hereinafter) on the basis of a thermal image group accumulated in the auxiliary storage device 102 in the predetermined period T1.
For example, when the visible image group or the thermal image group (the visible image group and the thermal image group are simply referred to as a "captured image group" if they are not distinguished from each other) includes a plurality of captured images, a background image (background visible image and background thermal image) may be generated for each image group using the average value or the median value of the pixel values (RGB) of each captured image group as the pixel value of each pixel. Accordingly, a background image from which a person or the like that has passed through temporarily is removed and which includes only a person or an object that continues to stay on the spot for a long time can be generated. Many studies have been made on how to dynamically create a background from images captured over a predetermined time, and the background image generation method in the present embodiment is not limited to a specific method.
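By way of illustration only, background generation based on the per-pixel median can be sketched as follows; the function name and the list-of-arrays input format are assumptions, and, as noted above, any suitable background generation method may be used.

```python
import numpy as np

# Sketch of background generation from the images accumulated during T1: the
# per-pixel median suppresses a person or the like that passes through temporarily,
# leaving only what stays in the scene for a long time.
def generate_background(captured_images):
    stack = np.stack(captured_images, axis=0)            # shape (N, H, W) or (N, H, W, C)
    return np.median(stack, axis=0).astype(stack.dtype)
```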
The predetermined time T1 corresponds to the time t1 in the drawings.
When the background visible image and the background thermal image are generated, step S105 and subsequent steps are executed. Steps S101 to S104 and step S105 may not be executed synchronously. For example, step S105 and subsequent steps may be started according to an instruction different from the execution instruction for steps S101 to S104.
In step S105, the visible image acquisition unit 11 and the thermal image acquisition unit 14 wait for the elapse of a predetermined time T2. The predetermined time T2 is, for example, an elapsed time from the time t2 to the time t3 in
When the predetermined time T2 elapses (Yes in S105), the visible image acquisition unit 11 acquires a visible image (referred to as a “target visible image” hereinafter) input from the visible light camera 21, and the thermal image acquisition unit 14 acquires a thermal image (referred to as a “target thermal image” hereinafter) input from the thermal camera 22 (S106). It is desirable that the target visible image and the target thermal image be images captured simultaneously (or almost simultaneously).
In subsequent step S107, the differential visible image generation unit 13 compares the background visible image generated by the background visible image generation unit 12 with the target visible image according to a background subtraction method and extracts a differential area with respect to the background visible image (an area different from the background visible image) from the target visible image to generate a differential image representing the difference (referred to as a "differential visible image" hereinafter). In addition, the differential thermal image generation unit 16 compares the background thermal image generated by the background thermal image generation unit 15 with the target thermal image according to the background subtraction method and extracts a differential area with respect to the background thermal image (an area different from the background thermal image) from the target thermal image, thereby generating a differential image representing the difference (referred to as a "differential thermal image" hereinafter).
Here, for example, if the difference in pixel value from the background image is equal to or greater than a certain threshold value, the corresponding pixel is regarded as different from the background, and a binary image in which the pixel value of such a pixel is 1 and the pixel value of a pixel that is the same as the background is 0 is generated as each differential image (differential visible image and differential thermal image). Each differential image is then sent to the heat trace area extraction unit 17.
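A simple threshold-based version of this background subtraction can be sketched as follows; the threshold value and the NumPy-based form are assumptions for illustration, and other background subtraction methods may equally be used.

```python
import numpy as np

# Sketch of threshold-based background subtraction (step S107): a pixel whose absolute
# difference from the background is equal to or greater than the threshold is set to 1,
# all other pixels are set to 0.
def background_subtraction(target, background, threshold=30.0):
    diff = np.abs(target.astype(np.float32) - background.astype(np.float32))
    if diff.ndim == 3:                                    # collapse color channels if present
        diff = diff.max(axis=2)
    return (diff >= threshold).astype(np.uint8)
```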
Subsequently, the heat trace area extraction unit 17 compares the differential visible image with the differential thermal image and extracts a heat trace area in the imaging range (S108).
In order to extract, from the differential thermal image, an area that is not similar to any differential visible area, similarity determination between the differential areas of the two differential images may be used. For example, the heat trace area extraction unit 17 first performs labeling (extraction of connected areas) on each of the binary images, that is, the differential visible image and the differential thermal image. Next, the heat trace area extraction unit 17 compares the degrees of overlap between one or more differential areas obtained by labeling of the differential thermal image (referred to as "differential thermal areas" hereinafter) and one or more differential areas obtained by labeling of the differential visible image (referred to as "differential visible areas" hereinafter). Specifically, the heat trace area extraction unit 17 counts whether or not the differential areas to be compared match in pixel units, and determines that the two compared differential areas are similar when the matching rate is equal to or greater than a certain threshold value. The heat trace area extraction unit 17 extracts a differential thermal area that is not similar to any differential visible area as a heat trace area.
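The labeling and overlap comparison described above can be sketched as follows; the use of OpenCV's connected component labeling and the overlap threshold of 0.5 are assumptions for illustration, and, as noted below, the similarity determination is not limited to this method.

```python
import cv2
import numpy as np

# Sketch of step S108: label connected components in each binary differential image,
# then keep a differential thermal area as a heat trace area only if it does not
# sufficiently overlap any differential visible area.
def extract_heat_trace_areas(diff_thermal, diff_visible, overlap_threshold=0.5):
    n_thermal, labels_thermal = cv2.connectedComponents(diff_thermal)
    n_visible, labels_visible = cv2.connectedComponents(diff_visible)
    heat_trace = np.zeros_like(diff_thermal)
    for t in range(1, n_thermal):                         # label 0 is the image background
        thermal_area = labels_thermal == t
        similar = False
        for v in range(1, n_visible):
            visible_area = labels_visible == v
            matching_rate = np.logical_and(thermal_area, visible_area).sum() / thermal_area.sum()
            if matching_rate >= overlap_threshold:        # similar to a visible area: not a heat trace
                similar = True
                break
        if not similar:
            heat_trace[thermal_area] = 1
    return heat_trace
```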
The heat trace area extraction unit 17 transmits information indicating the heat trace area and the background visible image to the heat trace area output unit 18. In this case, the heat trace area extraction unit 17 may generate a binary image in which the heat trace area part is white and the other part is black and transmit the binary image to the heat trace area output unit 18 as the information indicating the heat trace area. It should be noted that determination of the similarity between areas is actively studied in research on pattern matching and the like, and the present embodiment is not limited to a specific method.
Subsequently, the heat trace area output unit 18 outputs information indicating the extraction result of the heat trace area such that a user can check the information (S109). For example, the heat trace area output unit 18 may output an image obtained by combining the white pixels of the binary image indicating the heat trace area with the background visible image. The output form is not limited to a specific form. For example, display on a display device, storage in the auxiliary storage device 102, or transmission to a user terminal via a network may be performed.
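The combined output image mentioned above can be sketched as follows; overlaying plain white pixels and the three-channel image format are assumptions for illustration, and, as stated, the output form is not limited to this.

```python
import numpy as np

# Sketch of step S109: overlay the white pixels of the binary heat trace image on the
# background visible image so that a user can see where the heat traces are.
def compose_output(background_visible, heat_trace):
    output = background_visible.copy()
    output[heat_trace == 1] = (255, 255, 255)             # mark heat trace pixels in white
    return output
```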
Subsequently to step S109, step S105 and subsequent steps are executed again. Alternatively, step S109 may be executed after steps S105 to S108 are repeatedly executed a plurality of times. In this case, the heat trace areas extracted over the plurality of repetitions can be output collectively.
It is also possible to project the binary image representing the heat trace area onto the photographing range in the environment using a projector or the like. In this case, by projecting the heat trace area image onto the heat trace part in the environment, the part touched by a person can be directly conveyed to each person in the environment. Since a person who comes to the place can be aware of a part touched by another person if this method is used, it is possible to encourage the action of avoiding touching that part rather than disinfecting it.
In addition, steps S101 to S103 may be executed in parallel with step S105 and subsequent steps. In this case, the background visible image and the background thermal image are periodically updated. Therefore, it is possible to expect improvement of resistance to change in the background according to the lapse of time.
Although it is assumed in the above case that the camera is fixed in an indoor place and a heat trace left on a relatively fixed object such as a wall or a desk is extracted, the present embodiment is also applicable to a moving object if the position of the object that a person touches can be identified in the image. For example, if QR codes (registered trademark) for identifying positions are attached to the four corners of the seat surface of a chair and the position of the seat surface can be estimated using the QR codes (registered trademark) as a clue, a heat trace remaining on the seat surface can be estimated with the seat surface as a background even when the chair moves, and the heat trace can be displayed on the seat surface in an image. Various techniques for estimating the position of an object in an image have been proposed, and the present embodiment is not limited to the use of QR codes (registered trademark).
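As one hedged illustration of this idea, the seat surface could be rectified into a fixed coordinate system using QR codes detected at its corners; the corner labels ("TL", "TR", "BR", "BL"), the 200x200 canonical size, and the use of OpenCV's QR code detector are assumptions introduced only for this sketch.

```python
import cv2
import numpy as np

# Sketch: QR codes attached to the four corners of the seat surface encode their own
# corner label. Their detected centers define a perspective transform into a fixed
# seat-surface coordinate system, so background subtraction can be performed on the
# seat surface even when the chair moves.
def warp_seat_surface(image):
    detector = cv2.QRCodeDetector()
    ok, decoded, points, _ = detector.detectAndDecodeMulti(image)
    if not ok or points is None:
        return None
    centers = {label: pts.mean(axis=0) for label, pts in zip(decoded, points)}
    order = ["TL", "TR", "BR", "BL"]
    if any(label not in centers for label in order):
        return None                                       # one of the corner codes was not detected
    src = np.array([centers[label] for label in order], dtype=np.float32)
    dst = np.array([[0, 0], [200, 0], [200, 200], [0, 200]], dtype=np.float32)
    transform = cv2.getPerspectiveTransform(src, dst)
    return cv2.warpPerspective(image, transform, (200, 200))
```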
As described above, according to the present embodiment, it is possible to improve the accuracy of detection of a place touched by a person. As a result, it is possible to efficiently sterilize and disinfect a place where a virus such as a novel coronavirus may adhere, for example.
It should be noted that in the present embodiment, the differential visible image is an example of a first differential image. The differential thermal image is an example of a second differential image. The heat trace area extraction unit 17 is an example of an extraction unit.
Although an embodiment of the present invention has been described in detail above, the present invention is not limited to the specific embodiment described above, and various modifications and changes can be made within the concept of the present invention described in the claims.
REFERENCE SIGNS LIST
- 10 Heat trace area extraction apparatus
- 11 Visible image acquisition unit
- 12 Background visible image generation unit
- 13 Differential visible image generation unit
- 14 Thermal image acquisition unit
- 15 Background thermal image generation unit
- 16 Differential thermal image generation unit
- 17 Heat trace area extraction unit
- 18 Heat trace area output unit
- 21 Visible light camera
- 22 Thermal camera
- 100 Drive device
- 101 Recording medium
- 102 Auxiliary storage device
- 103 Memory device
- 104 CPU
- 105 Interface device
- B Bus
Claims
1. A heat trace area extraction method executed by a computer, the heat trace area extraction method comprising:
- generating a first differential image with respect to a visible image of a background of a certain range for a visible image in which the certain range is captured;
- generating a second differential image with respect to a thermal image of the background for a thermal image in which the certain range is captured by a thermal camera; and
- extracting a heat trace area by removing an area of a real object from the thermal image on the basis of the first differential image and the second differential image.
2. A heat trace area extraction method executed by a computer, the heat trace area extraction method comprising:
- generating a first differential image with respect to a visible image of a background of a certain range for a visible image in which the certain range is captured;
- generating a second differential image with respect to a thermal image of the background for a thermal image in which the certain range is captured by a thermal camera; and
- extracting, as a heat trace area, an area that is not similar to any of one or more differential visible areas among one or more differential thermal areas, with areas different from the visible image of the background in the first differential image as the differential visible areas and with areas different from the thermal image of the background in the second differential image as the differential thermal areas.
3. The heat trace area extraction method according to claim 1, further comprising:
- outputting an image obtained by combining the visible image of the background of the certain range and a binary image in which the heat trace area extracted in the extracting is white and an area other than the heat trace area is black.
4. A heat trace area extraction apparatus comprising:
- a memory; and
- a processor configured to execute
- generating a first differential image with respect to a visible image of a background of a certain range for a visible image in which the certain range is captured;
- generating a second differential image with respect to a thermal image of the background for a thermal image in which the certain range is captured by a thermal camera; and
- extracting a heat trace area by removing an area of a real object from the thermal image on the basis of the first differential image and the second differential image.
5. (canceled)
6. The heat trace area extraction apparatus according to claim 4, wherein the processor is further configured to execute
- outputting an image obtained by combining the visible image of the background of the certain range and a binary image in which the heat trace area extracted in the extracting is white and an area other than the heat trace area is black.
7. (canceled)
8. The heat trace area extraction method according to claim 1, further comprising:
- projecting a binary image in which the heat trace area extracted in the extracting is white and an area other than the heat trace area is black to the certain range.
9. The heat trace area extraction apparatus according to claim 4, wherein the processor is further configured to execute projecting a binary image in which the heat trace area extracted in the extracting is white and an area other than the heat trace area is black to the certain range.
10. A non-transitory computer-readable recording medium having computer-readable instructions stored thereon, which when executed, cause a computer to execute the heat trace area extraction method according to claim 1.
Type: Application
Filed: Oct 16, 2020
Publication Date: Jan 18, 2024
Inventors: Yoshinari SHIRAI (Tokyo), Yasue KISHINO (Tokyo), Takayuki SUYAMA (Tokyo), Shin MIZUTANI (Tokyo), Kazuya OHARA (Tokyo)
Application Number: 18/248,295