Patents by Inventor Hiroo Ikeda

Hiroo Ikeda has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20230386243
    Abstract: An information processing apparatus (2000) includes a recognizer (2020). An image (10) is input to the recognizer (2020). The recognizer (2020) outputs, for a crowd included in the input image (10), a label (30) describing a type of the crowd and structure information (40) describing a structure of the crowd. The structure information (40) indicates a location and a direction of an object included in the crowd. The information processing apparatus (2000) acquires training data (50) which includes a training image (52), a training label (54), and training structure information (56). The information processing apparatus (2000) trains the recognizer (2020) using the label (30) and the structure information (40), which are obtained by inputting the training image (52) to the recognizer (2020), together with the training label (54) and the training structure information (56).
    Type: Application
    Filed: August 9, 2023
    Publication date: November 30, 2023
    Applicant: NEC Corporation
    Inventor: Hiroo Ikeda
  • Publication number: 20230377188
    Abstract: A group specification apparatus includes: a first group candidate setting unit that selects a person from among persons within a first shot image and sets a first group candidate, based on a spatial condition and a state condition with reference to the selected person; a second group candidate setting unit that selects a person from among persons within a second shot image having a different shooting time from the first shot image, using an attribute of the selected person, and sets a second group candidate, based on the spatial condition and the state condition; a similarity calculating unit that compares first attribute configuration information about the first group candidate with second attribute configuration information about the second group candidate, and calculates a similarity between the group candidates; and a group specifying unit that specifies the persons constituting the first group candidate as one group according to the similarity.
    Type: Application
    Filed: October 14, 2020
    Publication date: November 23, 2023
    Applicant: NEC Corporation
    Inventor: Hiroo Ikeda
  • Patent number: 11823398
    Abstract: An information processing apparatus (2000) includes a first analyzing unit (2020), a second analyzing unit (2040), and an estimating unit (2060). The first analyzing unit (2020) calculates a flow of a crowd in a capturing range of a fixed camera (10) using a first surveillance image (12). The second analyzing unit (2040) calculates a distribution of an attribute of objects in a capturing range of a moving camera (20) using a second surveillance image (22). The estimating unit (2060) estimates an attribute distribution for a range that is not included in the capturing range of the moving camera (20).
    Type: Grant
    Filed: May 12, 2022
    Date of Patent: November 21, 2023
    Assignee: NEC CORPORATION
    Inventors: Ryoma Oami, Katsuhiko Takahashi, Yusuke Konishi, Hiroshi Yamada, Hiroo Ikeda, Junko Nakagawa, Kosuke Yoshimi, Ryo Kawai, Takuya Ogawa
  • Publication number: 20230351259
    Abstract: At least one processor generates a crowd state image as an image in which a person image corresponding to a person state is synthesized with a previously-prepared image at a predetermined size. The previously-prepared image is a background image that includes no person. The at least one processor specifies a training label for the crowd state image. The at least one processor outputs a pair of the crowd state image and the training label.
    Type: Application
    Filed: July 6, 2023
    Publication date: November 2, 2023
    Applicant: NEC Corporation
    Inventor: Hiroo Ikeda
  • Publication number: 20230353711
    Abstract: An image processing system, an image processing method, and a program capable of implementing an association of a person appearing in a video image through a simple operation are provided. The image processing system includes an input device which accepts input of video images captured by a plurality of video cameras, a display screen generating unit which causes a display device to display at least one video image among the video images inputted from the input device, and a tracked person registering unit which is capable of registering one or more persons appearing in the video image displayed by the display device. When a person appears in the video image displayed by the display device, the display screen generating unit selectably displays person images of one or more persons, which are associable with the person appearing in the video image and which are registered by the tracked person registering unit, in a vicinity of the video image.
    Type: Application
    Filed: July 7, 2023
    Publication date: November 2, 2023
    Applicant: NEC Corporation
    Inventors: Yusuke Takahashi, Hiroo Ikeda
  • Publication number: 20230351258
    Abstract: At least one processor generates a crowd state image as an image in which a person image corresponding to a person state is synthesized with a previously-prepared image at a predetermined size. The at least one processor specifies a training label for the crowd state image. The at least one processor outputs a pair of the crowd state image and the training label.
    Type: Application
    Filed: July 6, 2023
    Publication date: November 2, 2023
    Applicant: NEC Corporation
    Inventor: Hiroo Ikeda
  • Publication number: 20230351617
    Abstract: A crowd type classification system of an aspect of the present invention includes: a staying crowd detection unit that detects a local region indicating a staying crowd from a plurality of local regions determined in an image acquired by an image acquisition device; a crowd direction estimation unit that estimates a direction of the crowd for an image of a part corresponding to the detected local region, and appends the direction of the crowd to the local region; and a crowd type classification unit that classifies a type of the crowd including a plurality of staying persons for the local region to which the direction is appended by using a relative vector indicating a relative positional relationship between two local regions and directions of crowds in the two local regions, and outputs the type and positions of the crowds.
    Type: Application
    Filed: June 29, 2023
    Publication date: November 2, 2023
    Applicant: NEC Corporation
    Inventor: Hiroo Ikeda
  • Patent number: 11727578
    Abstract: A crowd type classification system of an aspect of the present invention includes: a staying crowd detection unit that detects a local region indicating a staying crowd from a plurality of local regions determined in an image acquired by an image acquisition device; a crowd direction estimation unit that estimates a direction of the crowd for an image of a part corresponding to the detected local region, and appends the direction of the crowd to the local region; and a crowd type classification unit that classifies a type of the crowd including a plurality of staying persons for the local region to which the direction is appended by using a relative vector indicating a relative positional relationship between two local regions and directions of crowds in the two local regions, and outputs the type and positions of the crowds.
    Type: Grant
    Filed: December 17, 2020
    Date of Patent: August 15, 2023
    Assignee: NEC CORPORATION
    Inventor: Hiroo Ikeda
  • Publication number: 20230230277
    Abstract: An object position estimation device (1) is provided with: a feature extraction unit (10) including a first feature extraction unit (21) which generates a first feature map by subjecting a target image to a convolution computation process, and a second feature extraction unit (22) which generates a second feature map by also subjecting the first feature map to the convolution computation process; and a likelihood map estimation unit (20) including a first position likelihood estimation unit (23) which, by using the first feature map, estimates a first likelihood map indicating the probability that first objects having a first size are present in the target image, and a second position likelihood estimation unit (24) which, by using the second feature map, estimates a second likelihood map indicating the probability that second objects having a second size, which is greater than the first size, are present in the target image.
    Type: Application
    Filed: June 23, 2020
    Publication date: July 20, 2023
    Applicant: NEC Corporation
    Inventor: Hiroo Ikeda
  • Publication number: 20230186680
    Abstract: In order to acquire recognition environment information impacting the recognition accuracy of a recognition engine, an information processing device 100 comprises a detection unit 101 and an environment acquisition unit 102. The detection unit 101 detects a marker, which has been disposed within a recognition target zone for the purpose of acquiring information, from an image captured by means of an imaging device which captures images of objects located within the recognition target zone. The environment acquisition unit 102 acquires the recognition environment information based on image information of the detected marker. The recognition environment information is information representing the way in which a recognition target object is reproduced in an image captured by the imaging device when said imaging device captures an image of the recognition target object located within the recognition target zone.
    Type: Application
    Filed: January 19, 2023
    Publication date: June 15, 2023
    Applicant: NEC Corporation
    Inventor: Hiroo Ikeda
  • Publication number: 20230093919
    Abstract: A flow-rate information output apparatus (2000) computes the number of objects (20) passing through a surveillance location (30) included in a target image (10) with respect to each of a plurality of surveillance directions (34). The flow-rate information output apparatus (2000) generates, with respect to one or more surveillance directions (34), a flow rate mark (40) relevant to the number of objects (20) passing through the surveillance location (30) toward the surveillance direction (34). The flow-rate information output apparatus (2000) generates a result image (60) by superimposing the flow rate mark (40) on an image including the surveillance location (30).
    Type: Application
    Filed: February 3, 2020
    Publication date: March 30, 2023
    Applicant: NEC Corporation
    Inventor: Hiroo Ikeda
  • Publication number: 20230048567
    Abstract: A queue analysis apparatus (2000) estimates a position and an orientation of each object (20) included in a target image (10). The target image (10) is generated by a camera (50) that captures the object (20). The queue analysis apparatus (2000) generates a queue line (40) that expresses, by a linear shape, a queue included in a queue region (30) being a region representing a queue in the target image (10), based on the position and the orientation estimated for each object (20) included in the queue region (30).
    Type: Application
    Filed: February 7, 2020
    Publication date: February 16, 2023
    Applicant: NEC Corporation
    Inventor: Hiroo Ikeda
  • Patent number: 11580720
    Abstract: In order to acquire recognition environment information impacting the recognition accuracy of a recognition engine, an information processing device 100 comprises a detection unit 101 and an environment acquisition unit 102. The detection unit 101 detects a marker, which has been disposed within a recognition target zone for the purpose of acquiring information, from an image captured by means of an imaging device which captures images of objects located within the recognition target zone. The environment acquisition unit 102 acquires the recognition environment information based on image information of the detected marker. The recognition environment information is information representing the way in which a recognition target object is reproduced in an image captured by the imaging device when said imaging device captures an image of the recognition target object located within the recognition target zone.
    Type: Grant
    Filed: September 25, 2020
    Date of Patent: February 14, 2023
    Assignee: NEC CORPORATION
    Inventor: Hiroo Ikeda
  • Publication number: 20220351522
    Abstract: A guidance processing apparatus (100) includes an information acquisition unit (101) that acquires a plurality of different pieces of guidance information on the basis of states of a plurality of people within one or more images, and a control unit (102) that performs control of a plurality of target devices present in different spaces or time division control of a target device so as to set a plurality of different states corresponding to the plurality of pieces of guidance information.
    Type: Application
    Filed: July 13, 2022
    Publication date: November 3, 2022
    Applicant: NEC Corporation
    Inventors: Ryoma Oami, Hiroyoshi Miyano, Yusuke Takahashi, Hiroo Ikeda, Yukie Ebiyama, Ryo Kawai, Takuya Ogawa, Kazuya Koyama, Hiroshi Yamada
  • Publication number: 20220343516
    Abstract: An information processing apparatus (2000) includes a first analyzing unit (2020), a second analyzing unit (2040), and an estimating unit (2060). The first analyzing unit (2020) calculates a flow of a crowd in a capturing range of a fixed camera (10) using a first surveillance image (12). The second analyzing unit (2040) calculates a distribution of an attribute of objects in a capturing range of a moving camera (20) using a second surveillance image (22). The estimating unit (2060) estimates an attribute distribution for a range that is not included in the capturing range of the moving camera (20).
    Type: Application
    Filed: May 12, 2022
    Publication date: October 27, 2022
    Applicant: NEC Corporation
    Inventors: Ryoma Oami, Katsuhiko Takahashi, Yusuke Konishi, Hiroshi Yamada, Hiroo Ikeda, Junko Nakagawa, Kosuke Yoshimi, Ryo Kawai, Takuya Ogawa
  • Publication number: 20220292706
    Abstract: An object count estimation apparatus (2000) includes a first feature extraction network (2042), a first counting network (2044), a second feature extraction network (2062), and a second counting network (2064). The first feature extraction network (2042) generates a first feature map (20) by performing convolution processing on a target image (10). The first counting network (2044) estimates the number of target objects having a size included in a first predetermined range by performing processing on the first feature map (20). The second feature extraction network (2062) generates a second feature map (30) by performing convolution processing on the first feature map (20). The second counting network (2064) estimates the number of target objects having a size included in a second predetermined range by performing processing on the second feature map (30). A size included in the first predetermined range is smaller than a size included in the second predetermined range.
    Type: Application
    Filed: August 30, 2019
    Publication date: September 15, 2022
    Applicant: NEC Corporation
    Inventor: Hiroo Ikeda
  • Patent number: 11423658
    Abstract: A guidance processing apparatus (100) includes an information acquisition unit (101) that acquires a plurality of different pieces of guidance information on the basis of states of a plurality of people within one or more images, and a control unit (102) that performs control of a plurality of target devices present in different spaces or time division control of a target device so as to set a plurality of different states corresponding to the plurality of pieces of guidance information.
    Type: Grant
    Filed: March 8, 2019
    Date of Patent: August 23, 2022
    Assignee: NEC CORPORATION
    Inventors: Ryoma Oami, Hiroyoshi Miyano, Yusuke Takahashi, Hiroo Ikeda, Yukie Ebiyama, Ryo Kawai, Takuya Ogawa, Kazuya Koyama, Hiroshi Yamada
  • Patent number: 11403771
    Abstract: Provided is an image processing apparatus (2000) including an index value calculation unit (2020) and a presentation unit (2040). The index value calculation unit (2020) acquires a plurality of images captured by a camera (3000) (captured images), and calculates an index value indicating the degree of change in the state of a monitoring target in the captured image, using the acquired captured image. The presentation unit (2040) presents an indication based on the index value calculated by the index value calculation unit (2020) on the captured image captured by the camera (3000).
    Type: Grant
    Filed: December 14, 2020
    Date of Patent: August 2, 2022
    Assignee: NEC CORPORATION
    Inventors: Ryoma Oami, Hiroyoshi Miyano, Yusuke Takahashi, Hiroo Ikeda, Yukie Ebiyama, Ryo Kawai, Takuya Ogawa, Kazuya Koyama, Hiroshi Yamada
  • Patent number: 11373335
    Abstract: A projection image generation unit 91 applies a plurality of projection schemes that use the radius of a visual field region of a fisheye-lens camera to an image that is imaged by the fisheye-lens camera to generate a plurality of projection images. A display unit 92 displays the plurality of projection images. A selection acceptance unit 93 accepts a projection image selected by a user from among the plurality of displayed projection images. A projection scheme determination unit 94 determines a projection scheme on the basis of the selected projection image. An output unit 95 outputs an internal parameter of the fisheye-lens camera that corresponds to the determined projection scheme.
    Type: Grant
    Filed: March 27, 2017
    Date of Patent: June 28, 2022
    Assignee: NEC CORPORATION
    Inventor: Hiroo Ikeda
  • Patent number: 11373408
    Abstract: A state acquisition unit (2020) acquires a state of a monitoring target in a captured image captured by a camera (3040). A monitoring point acquisition unit (2040) acquires, from a monitoring point information storage unit (3020), a monitoring point corresponding to the state of the monitoring target acquired by the state acquisition unit (2020). The monitoring point indicates a position to be monitored in the captured image. A presentation unit (2060) presents the monitoring point on the captured image.
    Type: Grant
    Filed: May 20, 2020
    Date of Patent: June 28, 2022
    Assignee: NEC CORPORATION
    Inventors: Ryoma Oami, Hiroyoshi Miyano, Yusuke Takahashi, Hiroo Ikeda, Yukie Ebiyama, Ryo Kawai, Takuya Ogawa, Kazuya Koyama, Hiroshi Yamada
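Several of the entries above (publication numbers 20230230277 and 20220292706) describe the same hierarchical idea: derive a second, coarser feature map from the first one, and let each scale's counting head handle objects of a matching size. The toy NumPy sketch below illustrates only that scale hierarchy; the patents describe trained convolutional networks, whereas here the "feature extraction" is stand-in average pooling and the names (`avg_pool2`, `count_from_density`) are illustrative, not from the patents.

```python
import numpy as np

def avg_pool2(x):
    """2x2 average pooling (stride 2) -- a stand-in for a strided conv layer."""
    h, w = x.shape
    x = x[: h - h % 2, : w - w % 2]
    return x.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def count_from_density(density):
    """A counting head: sum a per-pixel density map to get an object count."""
    return float(density.sum())

# Toy "image": a density map where each object contributes total mass 1.0.
image = np.zeros((8, 8))
image[1, 1] = 1.0   # one object
image[5, 4] = 1.0   # another object

feat1 = avg_pool2(image)   # first feature map (finer scale)
feat2 = avg_pool2(feat1)   # second feature map, derived from the first (coarser scale)

# Average pooling preserves total mass up to the pooling factor, so each
# head can recover the count from its own level of the feature hierarchy.
count_small = count_from_density(feat1) * 4    # undo one round of 2x2 averaging
count_large = count_from_density(feat2) * 16   # undo two rounds
print(count_small, count_large)  # both 2.0
```

The design point the abstracts make is that the coarse map sees a larger receptive field per cell, which is why the second head is assigned the larger size range.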
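The crowd-state training-data entries (publication numbers 20230351259 and 20230351258) describe synthesizing person images onto a person-free background at a predetermined size and emitting each synthesized image paired with its training label. A minimal sketch of that pairing step, assuming grayscale arrays and a label that simply records count and positions (the function name and label format are hypothetical, not from the patents):

```python
import numpy as np

def synthesize_crowd_state(background, person_patch, positions):
    """Paste person patches (already scaled to the predetermined size) onto a
    person-free background; return the crowd state image and its training label."""
    img = background.copy()
    ph, pw = person_patch.shape
    for (y, x) in positions:
        img[y : y + ph, x : x + pw] = person_patch
    label = {"count": len(positions), "positions": list(positions)}
    return img, label

background = np.zeros((10, 10))   # person-free background image
person = np.ones((3, 2))          # stand-in for a person image at a fixed size
image, label = synthesize_crowd_state(background, person, [(0, 0), (5, 6)])
print(label["count"])  # 2
```

Because both the image and the label come from the same synthesis step, the label is exact by construction, which is the advantage of generated training data over hand annotation.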
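The flow-rate entry (publication number 20230093919) computes, per surveillance direction, how many objects pass a surveillance location. A tiny sketch of the underlying per-direction counting, assuming objects are tracked as sequences of positions along one axis and the surveillance location is a horizontal line (the patent additionally renders the counts as flow rate marks superimposed on the image, which is omitted here):

```python
def count_crossings(tracks, line_y):
    """Count line crossings per direction: each track is a list of successive
    y-coordinates; a crossing is a step whose endpoints straddle line_y."""
    down = up = 0
    for track in tracks:
        for y0, y1 in zip(track, track[1:]):
            if y0 < line_y <= y1:
                down += 1
            elif y1 < line_y <= y0:
                up += 1
    return {"down": down, "up": up}

tracks = [[1, 3, 6], [7, 4, 2], [2, 4, 6]]  # three hypothetical object tracks
print(count_crossings(tracks, 5))  # {'down': 2, 'up': 1}
```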