OBJECT TRACKING APPARATUS, OBJECT TRACKING METHOD, AND PROGRAM
An object tracking apparatus (10) includes a target determination information acquisition unit (110), an image processing unit (120), and a control unit (130). The target determination information acquisition unit (110) acquires target determination information determining an object to be tracked. The image processing unit (120) detects the object to be tracked and tracks the detected object by performing image processing on a to-be-processed region being a partial region of an image acquired from an imaging apparatus (20). The control unit (130) controls an imaging range of the imaging apparatus (20) in such a way that the object to be tracked is included in the to-be-processed region.
The present invention relates to a technique for detecting and tracking a particular object by using an image.
BACKGROUND ART
One example of a technique for tracking a particular object in an image is disclosed in, for example, PTLs 1 to 3 below. PTL 1 discloses a technique for, when a part of a region including a subject to be tracked is cut out to make a cut-out video, tracking the subject either manually (by user operation) or automatically. PTL 2 discloses a technique for processing the entire region of an image taken by an imaging apparatus attached to an excavator to determine a partial region with a high possibility of the presence of a person, and further processing that partial region to determine whether a person is present in it. PTL 3 discloses a technique for detecting a moving body from an image generated by an imaging unit whose imaging direction can be changed through pan and tilt operations, and performing pan and tilt control in such a way that the moving body is positioned near the center of the image.
CITATION LIST
Patent Literature
[PTL 1] International Publication No. WO 2016/002228
[PTL 2] Japanese Patent Application Publication No. 2017-151815
[PTL 3] Japanese Patent Application Publication No. 2016-092606
SUMMARY OF INVENTION
Technical Problem
A high-performance imaging apparatus can generate an image with a very high resolution such as 4K or 8K. A high-resolution image involves a large amount of data. Thus, when processing a high-resolution image, a machine performing the image processing requires a large amount of resources.
The present invention has been made in view of the above-described problem. One object of the present invention is to provide a technique for reducing a resource needed when object tracking is performed by using a high-resolution image.
Solution to Problem
An object tracking apparatus according to the present invention includes:
a target determination information acquisition unit that acquires target determination information determining an object to be tracked;
an image processing unit that detects the object to be tracked and tracks the detected object by performing image processing on a to-be-processed region being a partial region of an image acquired from an imaging apparatus; and
a control unit that controls an imaging range of the imaging apparatus in such a way that the object to be tracked is included in the to-be-processed region.
An object tracking method executed by a computer according to the present invention includes:
acquiring target determination information determining an object to be tracked;
detecting the object to be tracked and tracking the detected object by performing image processing on a to-be-processed region being a partial region of an image acquired from an imaging apparatus; and
controlling an imaging range of the imaging apparatus in such a way that the object to be tracked is included in the to-be-processed region.
A program according to the present invention causes a computer to execute the above-described object tracking method.
Advantageous Effects of Invention
According to the present invention, a resource needed when object tracking is performed by using a high-resolution image can be reduced.
The above-described object and other objects, features, and advantageous effects become more apparent from the preferred example embodiments described below and the following accompanying drawings.
Hereinafter, example embodiments of the present invention will be described by using the drawings. Note that, a similar component is assigned with a similar reference sign throughout all the drawings, and description therefor will not be repeated as appropriate. Further, in each block diagram, each block represents not a configuration on a hardware basis but a configuration on a function basis, except as particularly described.
SUMMARY OF INVENTION
A summary of the present invention will be described with reference to the drawings.
The object tracking apparatus 10 performs, in response to acquisition of information (hereinafter, also written as “target determination information”) determining an object to be tracked, image processing of tracking, on an image acquired from the imaging apparatus 20, the object determined by the target determination information. The object tracking apparatus 10 tracks, by using a known object tracking algorithm, a particular object among a plurality of images (two or more consecutive frame images) acquired from the imaging apparatus 20.
Herein, as illustrated in the drawings, the object tracking apparatus 10 executes the image processing not on the entire image acquired from the imaging apparatus 20 but on a to-be-processed region AP, which is a partial region of the image.
Note that, when the object to be tracked moves to outside the to-be-processed region AP, the object tracking apparatus 10 is no longer able to track the object. In view of this, the object tracking apparatus 10 controls an imaging range of the imaging apparatus 20 in such a way that the object determined by the target determination information is included in the to-be-processed region AP, by using a position (a position on an image coordinate system) of the object to be tracked. Note that, the object tracking apparatus 10 can change the imaging range of the imaging apparatus 20 by controlling, by using a control signal, an operation of a mechanism (not illustrated) controlling a zoom factor of the imaging apparatus 20, a direction of the imaging apparatus 20, or a position of the imaging apparatus 20.
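Although the publication specifies no code, the idea can be sketched as follows: a known tracking algorithm (here OpenCV's CSRT tracker, assuming the opencv-contrib-python package) runs only on a fixed to-be-processed region cropped from each high-resolution frame. The region coordinates, video source, and initial bounding box are illustrative values, not ones from the publication.

```python
import cv2

# To-be-processed region AP (x, y, width, height) inside a 4K frame -- assumed values.
AP_X, AP_Y, AP_W, AP_H = 1600, 800, 640, 480

cap = cv2.VideoCapture("camera_stream.mp4")   # stand-in for the imaging apparatus 20
tracker = cv2.TrackerCSRT_create()            # one example of a known tracking algorithm

ok, frame = cap.read()
assert ok, "no frame from the video source"
crop = frame[AP_Y:AP_Y + AP_H, AP_X:AP_X + AP_W]
tracker.init(crop, (100, 100, 40, 80))        # initial box, in crop coordinates

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Image processing runs only on the partial region AP, never on the full frame.
    crop = frame[AP_Y:AP_Y + AP_H, AP_X:AP_X + AP_W]
    found, box = tracker.update(crop)
    if found:
        x, y, w, h = map(int, box)
        # Convert back to full-frame coordinates for display or imaging-range control.
        print("target at", (AP_X + x, AP_Y + y, w, h))
```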
Advantageous Effect
As described above, according to the present invention, upon acquisition of information (target determination information) determining an object to be tracked, image processing of tracking the object determined by the target determination information is executed. The image processing is executed on the to-be-processed region AP being a part of an image acquired from the imaging apparatus 20. Limiting the region subjected to the image processing to a part of the image can reduce the computational cost involved in the image processing. Reducing the computational cost, in turn, enables processing of a high-resolution image with less delay (or with no delay), even when the computer performing the image processing has low performance.
However, when a region subjected to the image processing is limited, there arises a problem that an object trackable range in an image is decreased. In view of this, according to the present invention, a zoom factor, a direction, a position, or the like of the imaging apparatus is controlled in such a way that the object to be tracked is included in the to-be-processed region AP. Accordingly, the object trackable range is substantially expanded. In other words, the present invention can have an advantageous effect of reducing the computational cost involved in the image processing while reducing influence on the object trackable range.
First Example Embodiment
<Function Configuration Example>
The target determination information acquisition unit 110 acquires target determination information. The "target determination information" is information for determining (that is, uniquely identifying) an object to be tracked included in an image acquired from the imaging apparatus 20. The target determination information is information extractable from an image, such as a feature value indicating a feature specific to the object to be tracked (for example, a color, a shape, or the like specific to the object). Further, the target determination information may be information indicating an image region from which the feature value is extracted (for example, information specifying a position or a region on an image).
The image processing unit 120 executes, on a to-be-processed region AP, image processing of detecting the object to be tracked determined by the target determination information and tracking the detected object. In other words, the image processing unit 120 does not execute the detection and tracking image processing on any region other than the to-be-processed region AP. Thus, even when some object is included in the image acquired from the imaging apparatus 20, that object is neither detected nor tracked by the image processing unit 120 while it is positioned outside the to-be-processed region AP.
<<Regarding To-Be-Processed Region AP>>
Herein, the to-be-processed region AP is a partial region of an image, as illustrated in the drawings.
As one example, the shape of the to-be-processed region AP can be set to any shape. For example, the object tracking apparatus 10 may accept, via a user terminal 30, a specification input for specifying a region subjected to processing or a region not subjected to processing, and may define the to-be-processed region AP, based on the input. In this case, position coordinates of the to-be-processed region AP computed based on the specification input are stored (set) in a memory or the like.
As another example, a size of the to-be-processed region AP may be determined in advance. Specifically, the to-be-processed region AP may be set to a size capable of ensuring a sufficient processing speed (for example, a size equivalent to a video graphics array (VGA), about 300 thousand pixels). In this case, position coordinates of the to-be-processed region AP are stored (set) as a fixed value in the memory or the like.
According to the present invention, the to-be-processed region AP is set for the purpose of reducing the computational cost of the image processing. Thus, when sufficient resources remain in the apparatus performing the image processing, there is little influence on the processing speed experienced by a user even when the to-be-processed region AP becomes somewhat large. In view of this, the size of the to-be-processed region AP may be determined based on the size of the resources allocatable to the image processing on the to-be-processed region AP. Specifically, the object tracking apparatus 10 first acquires information on its own surplus resources, and determines, based on that information, the size of the resources allocatable to the image processing on the to-be-processed region AP. Then, the object tracking apparatus 10 determines, based on the size of the allocatable resources, a size (position coordinates) of the to-be-processed region AP, and stores (sets) the position coordinates in the memory or the like.
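As a sketch only, the surplus-resource check could be approximated with available memory, as below; the psutil dependency, the per-pixel cost constant, and the policy of allocating half the surplus are assumptions, not the publication's method.

```python
import psutil

BYTES_PER_PIXEL = 200      # assumed working-memory cost of the pipeline per pixel
MIN_PIXELS = 640 * 480     # floor: roughly VGA, in line with the text above

def processed_region_size(aspect=(4, 3)):
    """Pick a width/height for the to-be-processed region AP from surplus memory."""
    budget = psutil.virtual_memory().available // 2   # allocate half the surplus (assumed policy)
    pixels = max(MIN_PIXELS, budget // BYTES_PER_PIXEL)
    unit = int((pixels / (aspect[0] * aspect[1])) ** 0.5)
    return aspect[0] * unit, aspect[1] * unit

print(processed_region_size())   # e.g. a region somewhat larger than VGA on an idle machine
```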
The control unit 130 controls an imaging range of the imaging apparatus 20 in such a way that the object to be tracked determined by the target determination information is included in the to-be-processed region AP. The control unit 130 can control the imaging range of the imaging apparatus 20 in such a way that the object to be tracked is included in the to-be-processed region AP, by acquiring a motion (a displacement of a detected position) of the object to be tracked by using a plurality of images, and operating a mechanism (not illustrated) controlling the imaging range of the imaging apparatus 20 in response to the motion.
<<Specific Example of Mechanism Controlling Imaging Range of Imaging Apparatus 20>>
Specific examples of the "mechanism controlling the imaging range of the imaging apparatus 20" include a mechanism controlling a zoom of the imaging apparatus 20, an electric pan/tilt head controlling a direction of the imaging apparatus 20, an electric slider controlling a capturing position of the imaging apparatus 20, and the like.
<<Specific Example of Control Performed by Control Unit 130>>
The control unit 130 controls the imaging range of the imaging apparatus 20 by operating a mechanism controlling the imaging range as described above. As one example, the control unit 130 can change the imaging range of the imaging apparatus 20 by operating a mechanism controlling a zoom of the imaging apparatus 20 in response to a motion of the object to be tracked on the image. As another example, when the imaging apparatus 20 is mounted on an unillustrated electric pan/tilt head (a mechanism controlling a direction of the imaging apparatus 20), the control unit 130 can change the imaging range by operating the electric pan/tilt head in response to a motion of the object to be tracked on the image. As yet another example, when the imaging apparatus 20 is mounted on an unillustrated electric slider (a mechanism controlling a position of the imaging apparatus 20), the control unit 130 can change the imaging range by operating the electric slider in response to a motion of the object to be tracked on the image. Further, the control unit 130 may control the imaging range of the imaging apparatus 20 by combining operations on a plurality of the mechanisms described above.
Further, in terms of a user's visibility, the control unit 130 is preferably configured to control the imaging range of the imaging apparatus 20 in such a way that the object to be tracked is positioned at a point easily viewed by a user (for example, a vicinity of a central portion of the image). As one example, the control unit 130 controls the imaging range of the imaging apparatus 20 in such a way that the object to be tracked is included in a predetermined region a (for example, a region including a central portion of the image and being a part of the to-be-processed region AP).
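To make the control amount concrete, the sketch below converts the pixel offset of the tracked object from the image center into pan/tilt angle deltas under a simple pinhole-camera assumption. The field-of-view values and the send_pan_tilt() stub are illustrative assumptions; the publication does not define this interface.

```python
import math

IMG_W, IMG_H = 3840, 2160          # 4K frame (assumed)
HFOV_DEG, VFOV_DEG = 60.0, 34.0    # assumed camera field of view

def pan_tilt_delta(target_x, target_y):
    """Angles (degrees) that would bring the target to the image center."""
    dx = target_x - IMG_W / 2
    dy = target_y - IMG_H / 2
    pan = math.degrees(math.atan(2 * dx / IMG_W * math.tan(math.radians(HFOV_DEG / 2))))
    tilt = -math.degrees(math.atan(2 * dy / IMG_H * math.tan(math.radians(VFOV_DEG / 2))))
    return pan, tilt

def send_pan_tilt(pan_deg, tilt_deg):
    # Hypothetical stub for the electric pan/tilt head; not a real device API.
    print(f"control signal: pan {pan_deg:+.2f} deg, tilt {tilt_deg:+.2f} deg")

send_pan_tilt(*pan_tilt_delta(3000, 500))   # target at upper right -> pan right, tilt up
```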
The object tracking apparatus 10 may be achieved by hardware (example: a hard-wired electronic circuit, or the like) achieving each of the function configuration units, or may be achieved by a combination of hardware and software (example: a combination of an electronic circuit and a program controlling the electronic circuit, or the like). Hereinafter, a case will be further described in which the object tracking apparatus 10 is achieved by a combination of hardware and software.
The object tracking apparatus 10 includes a bus 1010, a processor 1020, a memory 1030, a storage device 1040, an input/output interface 1050, and a network interface 1060.
The bus 1010 is a data transmission line through which the processor 1020, the memory 1030, the storage device 1040, the input/output interface 1050, and the network interface 1060 transmit and receive data to and from one another. However, a method of connecting the processor 1020 and the like to one another is not limited to bus connection.
The processor 1020 is a processor achieved by a central processing unit (CPU), a graphics processing unit (GPU), or the like.
The memory 1030 is a main storage achieved by a random access memory (RAM) or the like.
The storage device 1040 is an auxiliary storage achieved by a hard disk drive (HDD), a solid state drive (SSD), a memory card, a read only memory (ROM), or the like. The storage device 1040 stores a program module for achieving each of the functions (the target determination information acquisition unit 110, the image processing unit 120, the control unit 130, and the like) of the object tracking apparatus 10. The processor 1020 achieves a function associated with each program module, by reading each of the program modules into the memory 1030 and executing the program module.
The input/output interface 1050 is an interface for connecting the object tracking apparatus 10 to peripheral equipment 15. The peripheral equipment 15 includes, for example, input equipment such as a keyboard and a mouse, and output equipment such as a display (touch panel display) and a speaker. The object tracking apparatus 10 is connected to the peripheral equipment 15 via the input/output interface 1050.
The network interface 1060 is an interface for connecting the object tracking apparatus 10 to a network. The network is, for example, a local area network (LAN) or a wide area network (WAN). A method by which the network interface 1060 connects to the network may be wireless connection, or may be wired connection. The object tracking apparatus 10 is communicably connected to external apparatuses including the imaging apparatus 20 and the user terminal 30 via the network interface 1060. The imaging apparatus 20 is, for example, a camera on which a charge coupled device (CCD) image sensor or a complementary metal oxide semiconductor (CMOS) image sensor is mounted. The user terminal 30 is a terminal used by a person (user) who performs confirmation work on an image generated by the imaging apparatus 20. The user terminal 30 is a stationary personal computer (PC) or a portable terminal (such as a smartphone or a tablet terminal). The user terminal 30 is not particularly limited.
Note that, the hardware configuration described above is one example, and the hardware configuration of the object tracking apparatus 10 is not limited to the illustrated one.
Hereinafter, processing executed by the object tracking apparatus 10 according to the present example embodiment will be described by using the drawings.
First, the object tracking apparatus 10 communicates with the imaging apparatus 20 via the network interface 1060, and acquires an image generated by the imaging apparatus 20 (S102).
After acquiring the image from the imaging apparatus 20, the image processing unit 120 performs object detection processing on the to-be-processed region AP of the image generated by the imaging apparatus 20 (S104). Note that, the "object detection processing" represents image processing of detecting an object (example: a person, a vehicle, a motorcycle, or the like) defined in advance as a detection target. The image processing unit 120 can use an object detector stored in the memory 1030, the storage device 1040, or the like to detect a predetermined object from the image acquired from the imaging apparatus 20. The object detector is constructed in such a way as to be capable of detecting a particular object (example: a person, a vehicle, a motorcycle, or the like) by, for example, machine learning. The object detector may instead be constructed in such a way as to be capable of detecting a particular object by a rule-based approach. Further, the object detector may be constructed in such a way as to be capable of distinctively detecting an object having a particular attribute from among objects of the same type. Specific examples of the "attribute" include, but are not limited to, the items listed below (a minimal sketch of detection restricted to the to-be-processed region follows the list).
- Attributes of Person: age, sex, a feature of clothes, a feature of belongings, and the like
- Attributes of Vehicle/Motorcycle: a vehicle body color, a vehicle type, a vehicle body shape and size, a number written on a number plate, the number of seated persons, and the like
Further, the object detector may be constructed in such a way as to be capable of discriminating an attribute (for example, staggering, behaving suspiciously, or the like) relating to behavior of an object and detecting a particular object having the attribute.
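For instance, the detection step of S104 restricted to the to-be-processed region could be sketched as below, with OpenCV's stock HOG person detector standing in for whatever learned or rule-based detector the apparatus actually employs.

```python
import cv2

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

def detect_people_in_region(frame, ap):
    """Run person detection only inside the to-be-processed region ap = (x, y, w, h)."""
    x, y, w, h = ap
    crop = frame[y:y + h, x:x + w]
    boxes, _weights = hog.detectMultiScale(crop, winStride=(8, 8))
    # Report detections translated back to full-frame coordinates.
    return [(x + bx, y + by, bw, bh) for (bx, by, bw, bh) in boxes]
```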
After executing the object detection processing, the image processing unit 120 outputs, to a display of the user terminal 30, a result of the object detection acquired through the processing, in such a way as to be superimposed over the image acquired from the imaging apparatus 20, as illustrated in, for example, the drawings.
Then, the object tracking apparatus 10 determines whether target determination information has been acquired by the target determination information acquisition unit 110 (S108). When the target determination information has not yet been acquired (S108: NO), target determination information acquisition processing is executed. A flow of the target determination information acquisition processing is described below.
The target determination information acquisition unit 110 determines whether an input (position specification input) for specifying a position of the object to be tracked has been accepted (S110). For example, when a user selects a position of the object to be tracked on the image displayed on the display of the user terminal 30, information indicating the selected position (the position on image coordinates) is input to the target determination information acquisition unit 110 (an example is illustrated in the drawings).
When a user performs the position specification input, the image processing unit 120 may display information indicating the to-be-processed region AP on the display of the user terminal 30 in such a way as to be superimposed over the image acquired from the imaging apparatus 20. For example, the image processing unit 120 may display a dashed line (the information indicating the to-be-processed region AP), as illustrated in the drawings.
Herein, in order to track an object positioned outside the to-be-processed region AP (in a region on which the image processing is not executed), the control unit 130 needs to control the imaging range of the imaging apparatus 20 in such a way that the object is included in the to-be-processed region AP. Thus, when the target determination information acquisition unit 110 has accepted the position specification input (S110: YES), the control unit 130 further determines whether the position specified by the position specification input is outside the to-be-processed region AP (S112).
When the position specified by the position specification input is outside the to-be-processed region AP (S112: YES), the control unit 130 computes, for at least one of the mechanisms controlling the imaging range of the imaging apparatus 20, a control amount necessary for including the specified position in the to-be-processed region AP (S114). Then, the control unit 130 transmits, to the mechanism to be operated, a control signal based on the computed control amount (S116). As the mechanism operates in response to the control signal, the position specified by the position specification input (and hence the object existing at that position and selected as a target to be tracked) comes to be included in the to-be-processed region AP. Then, the object tracking apparatus 10 acquires a new image (an image in which the object selected as a target to be tracked is included in the to-be-processed region AP) generated by the imaging apparatus 20 after the mechanism operates in response to the control signal (S118). Then, the target determination information acquisition unit 110 acquires, by using information on the position specified by the position specification input, target determination information for determining the object selected as a target to be tracked (S120). The target determination information acquisition unit 110 first determines, on the new image, a position associated with the position specified on the image acquired in the processing of S102. For example, the target determination information acquisition unit 110 can determine that position based on the position specified on the image acquired in the processing of S102 and the control amount (the amount by which the imaging range was moved) in S114. Then, the target determination information acquisition unit 110 acquires, as the target determination information for determining the object to be tracked, a feature value of the object detected at the position determined on the new image. Then, by using the target determination information acquired herein, the object tracking processing described below is executed.
On the other hand, when the position specified by the position specification input is inside the to-be-processed region AP (S112: NO), the processing from S114 to S118 described above is not executed. In this case, the target determination information acquisition unit 110 acquires, by using information on the position specified by the position specification input, a feature value of the object detected at that position, as target determination information for determining the object to be tracked (S120). Then, by using the target determination information acquired herein, the object tracking processing described below is executed.
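A sketch of S120 under stated assumptions: the feature value is taken here to be a color histogram of the detection containing the specified position; the publication does not fix the concrete feature.

```python
import cv2

def target_determination_info(frame, detections, click_xy):
    """Return a feature value for the detected object at the user-specified position."""
    cx, cy = click_xy
    for (x, y, w, h) in detections:
        if x <= cx <= x + w and y <= cy <= y + h:
            roi = frame[y:y + h, x:x + w]
            hsv = cv2.cvtColor(roi, cv2.COLOR_BGR2HSV)
            # Hue/saturation histogram as an assumed stand-in for the feature value.
            hist = cv2.calcHist([hsv], [0, 1], None, [16, 16], [0, 180, 0, 256])
            return cv2.normalize(hist, hist).flatten()
    return None   # the specified position fell outside every detection
```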
When the target determination information acquisition unit 110 has not accepted the position specification input in the determination in S110 (S110: NO), the target determination information acquisition unit 110 further determines whether an object matching a predetermined condition has been detected in the object detection processing in S104 (S122). Herein, the predetermined condition is a condition for determining the object to be tracked. Specific examples of the "predetermined condition" include, but are not limited to, "a person wearing a backpack", "a red sedan-type car", "a bicycle with two riders", and the like. Further, the "predetermined condition" may indicate a feature relating to a suspect's appearance (a hairstyle, clothes, belongings, or the like). Information relating to such a predetermined condition may be stored in advance in the memory 1030 or the storage device 1040, or may be input by a user to the object tracking apparatus 10 via the user terminal 30.
When an object matching the predetermined condition has been detected in the object detection processing in S104 (S122: YES), the target determination information acquisition unit 110 acquires, as target determination information, a feature value of the detected object (S124). Then, by using the target determination information acquired herein, the object tracking processing described below is executed.
The image processing unit 120 determines, as the object to be tracked, an object associated with the target determination information acquired in the preceding processing, from among the objects detected in the object detection processing in S104, and acquires a position (a position on an image coordinate system) of the object to be tracked (S126). Then, the image processing unit 120 outputs, to the display of the user terminal 30, information indicating the position of the object to be tracked, as illustrated in, for example, the drawings.
Then, the control unit 130 determines whether the imaging range of the imaging apparatus 20 needs to be controlled, based on the position of the object to be tracked acquired in the processing of S126 (S130). Specifically, the control unit 130 determines whether control of the imaging range of the imaging apparatus 20 is needed, by comparing the position of the object to be tracked with the position of the predetermined region a (a region including a central portion of the image) illustrated in the drawings.
When it is determined that the imaging range of the imaging apparatus 20 needs to be controlled, for example when the position of the object to be tracked is outside the predetermined region a (S130: YES), the control unit 130 computes, for at least one of the mechanisms controlling the imaging range of the imaging apparatus 20, a control amount necessary for including the position of the object to be tracked in the to-be-processed region AP (S132). Then, the control unit 130 transmits, to the mechanism to be operated, a control signal based on the computed control amount (S134). As the mechanism operates in response to the control signal, the object to be tracked comes to be included in the predetermined region a. On the other hand, when it is determined that the imaging range of the imaging apparatus 20 does not need to be controlled (S130: NO), the control unit 130 does not execute the above-described processing.
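The per-frame decision of S130 to S134 might be sketched as follows; the region bounds, the proportional gain, and the send_control callback are illustrative assumptions rather than the publication's control law.

```python
IMG_W, IMG_H = 3840, 2160
REGION_A = (1280, 720, 1280, 720)   # predetermined central region "a" (x, y, w, h), assumed
GAIN = 0.01                         # assumed degrees of pan/tilt per pixel of offset

def control_step(tracked_pos, send_control):
    x, y, w, h = REGION_A
    px, py = tracked_pos
    if x <= px <= x + w and y <= py <= y + h:
        return                               # S130: NO -> no control needed
    pan = GAIN * (px - IMG_W / 2)            # S132: compute control amount (pan)
    tilt = -GAIN * (py - IMG_H / 2)          # S132: compute control amount (tilt)
    send_control(pan, tilt)                  # S134: transmit the control signal

control_step((3500, 400), lambda p, t: print(f"pan {p:+.1f} deg, tilt {t:+.1f} deg"))
```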
Then, the control unit 130 performs end determination for the object tracking processing (S136). Specific examples of an end condition for the object tracking processing include, but are not limited to, "a processing end instruction is input in the user terminal 30", "the mechanism controlling the imaging range of the imaging apparatus 20 has reached a movable limit", "the object tracking processing has been continued for a predetermined period of time or over a predetermined distance or more", and the like. When the end condition for the object tracking processing is not satisfied (S136: NO), the processing returns to S126, and the object tracking processing is continued by using an image newly acquired from the imaging apparatus 20.
<<Processing when Object to be Tracked is Lost>>
Herein, when the object to be tracked moves drastically, when the object to be tracked and another object similar to the object come close to each other (pass by each other), or the like, the object tracking apparatus 10 may lose sight of the object to be tracked. “Lose sight” herein means that the object tracking apparatus 10 is no longer able to identify which object is an object associated with the target determination information.
In this case, the object tracking apparatus 10 executes processing as described below in order to redetermine the lost object.
When the image processing unit 120 has failed in detecting the object to be tracked (has lost sight of the object to be tracked) in the to-be-processed region AP (S202: YES), the target determination information acquisition unit 110 acquires information (hereinafter, also written as “target position information”) indicating a position of the object in the image acquired from the imaging apparatus 20 (S204). Upon acquisition of the target position information by the target determination information acquisition unit 110, the control unit 130 further determines whether the position indicated by the target position information is outside the to-be-processed region AP (S206).
When the position indicated by the target position information is outside the to-be-processed region AP (S206: YES), the control unit 130 determines that the imaging range of the imaging apparatus 20 needs to be controlled. In this case, the control unit 130 computes, for at least one of the mechanisms controlling the imaging range of the imaging apparatus 20, a control amount necessary for including the position indicated by the target position information in the to-be-processed region AP (S208). Then, the control unit 130 transmits, to the mechanism to be operated, a control signal based on the computed control amount (S210). As the mechanism operates in response to the control signal, the position indicated by the target position information (and hence the lost object existing at that position) comes to be included in the to-be-processed region AP. Then, the object tracking apparatus 10 acquires a new image (an image in which the lost object is included in the to-be-processed region AP) generated by the imaging apparatus 20 after the mechanism operates in response to the control signal (S212). Then, the target determination information acquisition unit 110 acquires, by using the position indicated by the target position information, target determination information for redetermining the lost object (S214). Then, by using the target determination information acquired herein, the object tracking processing described above is resumed.
On the other hand, when the position indicated by the target position information is inside the to-be-processed region AP (S206: NO), the processing from S208 to S212 described above is not executed. In this case, the target determination information acquisition unit 110 acquires, by using the position indicated by the target position information, target determination information for redetermining the lost object (S214). Then, by using the target determination information acquired herein, the object tracking processing described above is resumed.
A specific example of the processing from S206 to S214, including the target position information acquired in that example, is illustrated in the accompanying drawings.
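A hedged sketch of this lost-object flow, assuming the target position information arrives as full-frame coordinates and with the control transmission and feature extraction injected as stand-in callables:

```python
def handle_lost_target(frame, target_pos, ap, send_control, extract_features):
    """S202-S214: re-center on the lost object if needed, then redetermine it."""
    x, y, w, h = ap
    inside = x <= target_pos[0] <= x + w and y <= target_pos[1] <= y + h
    if not inside:                                   # S206: YES
        dx = target_pos[0] - (x + w / 2)             # S208: offset from the AP center
        dy = target_pos[1] - (y + h / 2)
        send_control(dx, dy)                         # S210: move the imaging range
        return None                                  # caller then grabs a new image (S212)
    return extract_features(frame, target_pos)      # S214: redetermine the lost object
```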
Second Example Embodiment
An object tracking apparatus 10 according to the present example embodiment is different from the above-described first example embodiment in that a recording unit as described below is further included.
<Function Configuration Example>
The recording unit 122 stores, as recorded data, an image acquired from the imaging apparatus 20 during a period from when the object to be tracked is detected to when a predetermined end condition is satisfied.
<Flow of Processing>
Hereinafter, processing executed by the object tracking apparatus 10 according to the present example embodiment will be described by using the drawings.
First, the recording unit 122 determines whether object tracking processing has been started (S302). Specifically, the recording unit 122 determines whether target determination information has been acquired by the target determination information acquisition unit 110 through the processing described in the first example embodiment.
When object tracking processing has been started, that is, when target determination information has been acquired by the target determination information acquisition unit 110 (S302: YES), the recording unit 122 buffers an image (video data) acquired from the imaging apparatus 20 in, for example, the memory 1030 or the like (S304). Subsequently, the recording unit 122 continues buffering the images (video data) acquired from the imaging apparatus 20 until the object tracking processing is ended (S306: NO). Then, when the object tracking processing has been ended (S306: YES), the recording unit 122 stores, in the storage device 1040 or the like, the video data buffered in the memory 1030 or the like, as recorded data indicating a result of the object tracking processing (S308).
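As an illustration only, the buffer-then-flush behavior of S304 to S308 might look like the following; the codec, file path, and frame rate are assumed values.

```python
import cv2

class RecordingUnit:
    def __init__(self):
        self.buffer = []                     # in-memory buffer of frames (S304)

    def on_frame(self, frame, tracking_active):
        if tracking_active:
            self.buffer.append(frame)        # keep buffering while tracking runs (S306: NO)

    def on_tracking_end(self, path="tracked.avi", fps=30):
        if not self.buffer:
            return
        h, w = self.buffer[0].shape[:2]
        out = cv2.VideoWriter(path, cv2.VideoWriter_fourcc(*"MJPG"), fps, (w, h))
        for f in self.buffer:                # write out the recorded data (S308)
            out.write(f)
        out.release()
        self.buffer.clear()
```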
According to the present example embodiment described above, along with the start of the object tracking processing described in the first example embodiment, recorded data indicating a result of the object tracking processing is generated and stored, based on the video data acquired from the imaging apparatus 20. Recorded data indicating results of past object tracking processing is useful in confirmation work performed by a user. For example, a user can carefully confirm the motion of an object of interest by reviewing the recorded data on that object.
Herein, in a situation in which object tracking processing has not been performed (that is, target determination information has not been acquired), the recording unit 122 may, for example, generate and store a time-lapse video by using the images (video data) acquired from the imaging apparatus 20. By checking the time-lapse video, a user can roughly recognize the situations before and after the start of tracking processing of a certain object. In this case, the recording unit 122 may store the above-described recorded data and the time-lapse video as separate files in the storage device 1040 or the like, or may store them as one file in the storage device 1040 or the like.
Third Example Embodiment
An object tracking apparatus 10 according to the present example embodiment is different from the first example embodiment in that a report generation unit as described below is further included.
<Function Configuration Example>
The report generation unit 140 generates a report file indicating a result of object tracking processing. The report file generated by the report generation unit 140 includes, for example, an image indicating an object to be tracked, information relating to a period (time and date) for which tracking processing has been performed, information relating to a place where the tracking processing has been executed, a movement trajectory of the object to be tracked, and the like. Further, when the object tracking apparatus 10 includes a recording unit 122 described in the second example embodiment, the report file may further include a link address to recorded data generated according to the object tracking processing. The image indicating the object to be tracked is, for example, a representative image (for example, when recorded data are generated according to the above-described second example embodiment, a thumbnail image of the recorded data) out of images acquired from an imaging apparatus 20, or the like. Besides the image, the report generation unit 140 may further include, in the report file, text information relating to the object to be tracked. The text information relating to the object to be tracked is, for example, information indicating a category (a person, a vehicle, or the like) of the object or an attribute (example: "wearing a backpack" for a person, and "a red sedan-type" for a vehicle) of the object, or the like. The information relating to the place where the tracking processing has been executed is, for example, information indicating a point (an address or the like) at which the imaging apparatus 20 is installed, information determining the individual imaging apparatus 20, such as a number uniquely assigned to each imaging apparatus 20, or the like.
<Flow of Processing>
Hereinafter, processing executed by the object tracking apparatus 10 according to the present example embodiment will be described by using the drawings.
First, the report generation unit 140 determines whether object tracking processing has been started (S402). Specifically, the report generation unit 140 determines whether target determination information has been acquired by the target determination information acquisition unit 110 through the processing described in the first example embodiment.
When object tracking processing has been started, that is, when target determination information has been acquired by the target determination information acquisition unit 110 (S402: YES), the report generation unit 140 collects information to be included in a report file during a period from start to end of the object tracking processing (S404).
For example, the report generation unit 140 determines one representative image from among the images acquired from the imaging apparatus 20 during execution of the object tracking processing. For example, the report generation unit 140 can convert the visibility of the object to be tracked in each image into a numerical value, based on the direction of the object, the degree of blurring, the brightness, and the like, and can include in the report file, as the representative image, the image that is optimal based on that numerical value. Further, the report generation unit 140 can include, in the report file, a result (for example, "a vehicle", "a red sedan-type vehicle", or the like) of inputting an image generated by the imaging apparatus 20 to the object detector, as text information. Further, the report generation unit 140 can acquire a start time and an end time of the object tracking processing, and can include them in the report file as information indicating the period for which the object tracking processing has been performed. Further, the report generation unit 140 can acquire the installation place of the imaging apparatus 20 having generated the images used in the object tracking processing, a specific number unique to the imaging apparatus 20, and the like, and can include them in the report file as information indicating the point at which the object tracking processing has been executed. Further, the report generation unit 140 can include, in the report file, information indicating a movement trajectory of the object to be tracked, which can be acquired by converting the position of the object on each image into a position on a map. In this case, the report generation unit 140 may use, for example, a function that converts a position on the image into a position on the map, based on a pose (a pan/tilt angle) of the imaging apparatus 20 and a position of the imaging apparatus 20. Such a function is stored in advance in, for example, the memory 1030 or the storage device 1040. Further, when the object tracking apparatus 10 further includes the recording unit 122 described in the second example embodiment, the report generation unit 140 can include, in the report file, information indicating a link address to the recorded data output at the end of the object tracking processing.
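One plausible, but assumed, reading of this visibility scoring is sharpness plus exposure, sketched below with OpenCV; the publication does not specify the actual metric.

```python
import cv2

def visibility_score(image):
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    sharpness = cv2.Laplacian(gray, cv2.CV_64F).var()   # less blurring -> higher score
    brightness = 1.0 - abs(gray.mean() / 255.0 - 0.5)   # mid exposure -> higher score
    return sharpness * brightness

def representative_image(frames):
    """Pick the frame with the best visibility score for the report file."""
    return max(frames, key=visibility_score)
```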
Then, the report generation unit 140 outputs, to the storage device 1040, a user terminal 30, or the like, the report file generated based on the information collected in S404 (S406).
According to the present example embodiment described above, when object tracking processing is executed, a report file on the object tracking processing is generated and output. The configuration according to the present example embodiment enables a user to easily manage and recognize performance of the object tracking processing.
While the example embodiments of the present invention have been described with reference to the drawings, the above-described example embodiments are illustrative of the present invention, and various configurations other than the above may be employed.
Further, while a plurality of steps (processing) are described in order in a plurality of sequence diagrams or flowcharts used in the above description, execution order of the steps executed in each of the example embodiments is not limited to the described order. In each of the example embodiments, the order of the illustrated steps can be changed, as long as the change does not interfere with the contents. Further, the above-described example embodiments can be combined, as long as the contents do not contradict each other.
The whole or part of the above-described example embodiments can be described as, but not limited to, the following supplementary notes.
1.
An object tracking apparatus including:
a target determination information acquisition unit that acquires target determination information determining an object to be tracked;
an image processing unit that detects the object to be tracked and tracks the detected object by performing image processing on a to-be-processed region being a partial region of an image acquired from an imaging apparatus; and
a control unit that controls an imaging range of the imaging apparatus in such a way that the object to be tracked is included in the to-be-processed region.
2.
The object tracking apparatus according to supplementary note 1, wherein
the target determination information acquisition unit acquires the target determination information in response to an input of a position of the object to be tracked on the image.
3.
The object tracking apparatus according to supplementary note 2, wherein
the target determination information acquisition unit acquires the target determination information in response to an input for selecting a position of the object to be tracked on the image displayed on a display.
4.
The object tracking apparatus according to any one of supplementary notes 1 to 3, wherein
the target determination information acquisition unit executes processing of acquiring information indicating a position of the object to be tracked on the image, when detection of the object to be tracked fails in the image processing.
5.
The object tracking apparatus according to any one of supplementary notes 1 to 4, wherein
the to-be-processed region is a region including a central portion of the image.
6.
The object tracking apparatus according to supplementary note 5, wherein
the control unit controls the imaging range of the imaging apparatus in such a way that the object to be tracked is included in a predetermined region including a central portion of the image and being a part of the to-be-processed region.
7.
The object tracking apparatus according to any one of supplementary notes 1 to 6, wherein
a size of the to-be-processed region is determined in advance.
8.
The object tracking apparatus according to supplementary note 7, wherein
a size of the to-be-processed region is determined based on a size of a resource allocatable to image processing on the to-be-processed region.
9.
The object tracking apparatus according to any one of supplementary notes 1 to 8, further including
a recording unit that stores, as recorded data, an image acquired from the imaging apparatus during a period from when the object to be tracked is detected to when a predetermined end condition is satisfied.
10.
The object tracking apparatus according to any one of supplementary notes 1 to 9, wherein
the image processing unit displays, on a display, information indicating the to-be-processed region in such a way as to be superimposed over the image.
11.
The object tracking apparatus according to any one of supplementary notes 1 to 10, wherein
the control unit controls the imaging range of the imaging apparatus by operating at least any one of a mechanism controlling a zoom of the imaging apparatus, a mechanism controlling a direction of the imaging apparatus, and a mechanism controlling a position of the imaging apparatus.
12.
An object tracking method executed by a computer, the object tracking method including:
acquiring target determination information determining an object to be tracked;
detecting the object to be tracked and tracking the detected object by performing image processing on a to-be-processed region being a partial region of an image acquired from an imaging apparatus; and
controlling an imaging range of the imaging apparatus in such a way that the object to be tracked is included in the to-be-processed region.
13.
The object tracking method according to supplementary note 12, further including,
acquiring the target determination information in response to an input of a position of the object to be tracked on the image.
14.
The object tracking method according to supplementary note 13, further including,
acquiring the target determination information in response to an input for selecting a position of the object to be tracked on the image displayed on a display.
15.
The object tracking method according to any one of supplementary notes 12 to 14, further including,
executing processing of acquiring information indicating a position of the object to be tracked on the image, when detection of the object to be tracked fails in the image processing.
16.
The object tracking method according to any one of supplementary notes 12 to 15, further including
the to-be-processed region being a region including a central portion of the image.
17.
The object tracking method according to supplementary note 16, further including, controlling the imaging range of the imaging apparatus in such a way that the object to be tracked is included in a predetermined region including a central portion of the image and being a part of the to-be-processed region.
18.
The object tracking method according to any one of supplementary notes 12 to 17, further including
a size of the to-be-processed region being determined in advance.
19.
The object tracking method according to supplementary note 18, further including a size of the to-be-processed region being determined based on a size of a resource allocatable to image processing on the to-be-processed region.
20.
The object tracking method according to any one of supplementary notes 12 to 19, further including,
storing, as recorded data, an image acquired from the imaging apparatus during a period from when the object to be tracked is detected to when a predetermined end condition is satisfied.
21.
The object tracking method according to any one of supplementary notes 12 to 20, further including,
displaying, on a display, information indicating the to-be-processed region in such a way as to be superimposed over the image.
22.
The object tracking method according to any one of supplementary notes 12 to 21, further including,
controlling the imaging range of the imaging apparatus by operating at least any one of a mechanism controlling a zoom of the imaging apparatus, a mechanism controlling a direction of the imaging apparatus, and a mechanism controlling a position of the imaging apparatus.
23.
A program causing a computer to execute the object tracking method according to any one of supplementary notes 12 to 22.
Claims
1. An object tracking apparatus comprising:
- a target determination information acquisition unit that acquires target determination information determining an object to be tracked;
- an image processing unit that detects the object to be tracked and tracks the detected object by performing image processing on a to-be-processed region being a partial region of an image acquired from an imaging apparatus; and
- a control unit that controls an imaging range of the imaging apparatus in such a way that the object to be tracked is included in the to-be-processed region.
2. The object tracking apparatus according to claim 1, wherein
- the target determination information acquisition unit acquires the target determination information in response to an input of a position of the object to be tracked on the image.
3. The object tracking apparatus according to claim 2, wherein
- the target determination information acquisition unit acquires the target determination information in response to an input for selecting a position of the object to be tracked on the image displayed on a display.
4. The object tracking apparatus according to claim 1, wherein
- the target determination information acquisition unit executes processing of acquiring information indicating a position of the object to be tracked on the image when detection of the object to be tracked fails in the image processing.
5. The object tracking apparatus according to claim 1, wherein
- the to-be-processed region is a region including a central portion of the image.
6. The object tracking apparatus according to claim 5, wherein
- the control unit controls the imaging range of the imaging apparatus in such a way that the object to be tracked is included in a predetermined region including a central portion of the image and being a part of the to-be-processed region.
7. The object tracking apparatus according to claim 1, wherein
- a size of the to-be-processed region is determined in advance.
8. The object tracking apparatus according to claim 7, wherein
- a size of the to-be-processed region is determined based on a size of a resource allocatable to image processing on the to-be-processed region.
9. The object tracking apparatus according to claim 1, further comprising
- a recording unit that stores, as recorded data, an image acquired from the imaging apparatus during a period from when the object to be tracked is detected to when a predetermined end condition is satisfied.
10. The object tracking apparatus according to claim 1, wherein
- the image processing unit displays, on a display, information indicating the to-be-processed region in such a way as to be superimposed over the image.
11. The object tracking apparatus according to claim 1, wherein
- the control unit controls the imaging range of the imaging apparatus by operating at least any one of a mechanism controlling a zoom of the imaging apparatus, a mechanism controlling a direction of the imaging apparatus, and a mechanism controlling a position of the imaging apparatus.
12. An object tracking method executed by a computer, the object tracking method comprising:
- acquiring target determination information determining an object to be tracked;
- detecting the object to be tracked and tracking the detected object by performing image processing on a to-be-processed region being a partial region of an image acquired from an imaging apparatus; and
- controlling an imaging range of the imaging apparatus in such a way that the object to be tracked is included in the to-be-processed region.
13. A non-transitory computer readable medium storing a program causing a computer to execute an object tracking method, the method comprising:
- acquiring target determination information determining an object to be tracked;
- detecting the object to be tracked and tracking the detected object by performing image processing on a to-be-processed region being a partial region of an image acquired from an imaging apparatus; and
- controlling an imaging range of the imaging apparatus in such a way that the object to be tracked is included in the to-be-processed region.
Type: Application
Filed: Oct 18, 2018
Publication Date: Aug 19, 2021
Applicant: NEC Corporation (Minato-ku, Tokyo)
Inventor: Shoji YACHIDA (Tokyo)
Application Number: 17/284,590