Patents by Inventor Hiroyoshi Miyano

Hiroyoshi Miyano has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 10614317
    Abstract: A state acquisition unit (2020) acquires a state of a monitoring target in a captured image captured by a camera (3040). A monitoring point acquisition unit (2040) acquires, from a monitoring point information storage unit (3020), a monitoring point corresponding to the state of the monitoring target acquired by the state acquisition unit (2020). The monitoring point indicates a position to be monitored in the captured image. A presentation unit (2060) presents the monitoring point on the captured image.
    Type: Grant
    Filed: March 5, 2019
    Date of Patent: April 7, 2020
    Assignee: NEC Corporation
    Inventors: Ryoma Oami, Hiroyoshi Miyano, Yusuke Takahashi, Hiroo Ikeda, Yukie Ebiyama, Ryo Kawai, Takuya Ogawa, Kazuya Koyama, Hiroshi Yamada
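The state-to-monitoring-point lookup in this abstract can be sketched as a simple table lookup followed by presentation on the image. The states, coordinates, and function names below are illustrative assumptions, not taken from the patent.

```python
# Hypothetical table mapping a monitoring target's state to the image
# positions (monitoring points) that should be watched in that state.
MONITORING_POINTS = {
    "queueing": [(120, 80), (300, 80)],  # e.g. head and tail of a queue
    "crowded":  [(200, 150)],            # e.g. center of the congested area
}

def get_monitoring_points(state):
    """Return the monitoring points registered for the given state."""
    return MONITORING_POINTS.get(state, [])

def present_on_image(image_size, points):
    """Stand-in for the presentation unit: keep points inside the image."""
    w, h = image_size
    return [(x, y) for x, y in points if 0 <= x < w and 0 <= y < h]

points = get_monitoring_points("queueing")
visible = present_on_image((640, 480), points)
```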
  • Publication number: 20200034630
    Abstract: An object tracking apparatus, method and computer-readable medium for detecting an object from output information of sensors, tracking the object on the basis of a plurality of detection results, generating tracking information of the object represented in a common coordinate system, outputting the tracking information, and detecting the object on the basis of the tracking information.
    Type: Application
    Filed: August 6, 2019
    Publication date: January 30, 2020
    Inventors: Ryoma OAMI, Hiroyoshi MIYANO
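The core idea of representing multi-sensor detections in a common coordinate system can be sketched as below. The sensor names, calibration offsets, and averaging fusion are simplifying assumptions; the patent does not specify this model.

```python
# Hypothetical calibration: each sensor's origin expressed in world coordinates.
SENSOR_OFFSETS = {
    "cam_a": (0.0, 0.0),
    "cam_b": (10.0, 5.0),
}

def to_common(sensor, x, y):
    """Convert a sensor-local detection to the common (world) frame."""
    ox, oy = SENSOR_OFFSETS[sensor]
    return (x + ox, y + oy)

def fuse(detections):
    """Average detections of the same object from multiple sensors."""
    pts = [to_common(s, x, y) for s, x, y in detections]
    n = len(pts)
    return (sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n)

# Both cameras see the same object; the fused track lives in world coordinates.
track = fuse([("cam_a", 12.0, 7.0), ("cam_b", 2.0, 2.0)])
```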
  • Publication number: 20190325232
    Abstract: The present invention is a two-wheel vehicle riding person number determination system including an imaging means, installed in a predetermined position, configured to image a two-wheel vehicle that travels on a road, and a two-wheel vehicle riding person number determining means configured to process an image from the imaging means, extract a contour shape of the upper portion of the two-wheel vehicle that travels on the road, detect humped shapes corresponding to the heads of persons riding on the two-wheel vehicle from that contour shape, and determine, on the basis of the humped shapes, whether the number of persons riding on the two-wheel vehicle is two or more.
    Type: Application
    Filed: April 30, 2019
    Publication date: October 24, 2019
    Inventors: Hiroyoshi MIYANO, Tetsuo INOSHITA
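The hump-counting idea can be illustrated by treating the vehicle's upper contour as a 1-D height profile and counting local maxima (rider heads). The profile values and threshold are made-up assumptions for illustration.

```python
def count_humps(profile, min_height=2):
    """Count local maxima in a 1-D height profile that reach min_height."""
    humps = 0
    for i in range(1, len(profile) - 1):
        is_peak = profile[i] > profile[i - 1] and profile[i] > profile[i + 1]
        if is_peak and profile[i] >= min_height:
            humps += 1
    return humps

def two_or_more_riders(profile):
    """Decide, as the abstract does, whether at least two persons ride."""
    return count_humps(profile) >= 2

# Two head-shaped humps in the contour -> flagged as two or more riders.
profile = [0, 1, 3, 1, 1, 4, 1, 0]
flagged = two_or_more_riders(profile)
```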
  • Publication number: 20190272430
    Abstract: A guidance processing apparatus (100) includes an information acquisition unit (101) that acquires a plurality of different pieces of guidance information on the basis of states of a plurality of people within one or more images, and a control unit (102) that performs control of a plurality of target devices present in different spaces or time division control of a target device so as to set a plurality of different states corresponding to the plurality of pieces of guidance information.
    Type: Application
    Filed: March 8, 2019
    Publication date: September 5, 2019
    Inventors: Ryoma OAMI, Hiroyoshi MIYANO, Yusuke TAKAHASHI, Hiroo IKEDA, Yukie EBIYAMA, Ryo KAWAI, Takuya OGAWA, Kazuya KOYAMA, Hiroshi YAMADA
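The control flow described here, deriving per-area guidance from crowd states and setting each area's device accordingly, can be sketched as follows. The areas, density thresholds, and messages are all illustrative assumptions.

```python
def guidance_for_state(density):
    """Map a crowd density (0..1) to a piece of guidance information."""
    if density > 0.8:
        return "Please use the east exit"
    if density > 0.5:
        return "Expect delays ahead"
    return "Proceed normally"

def control_devices(area_densities):
    """Set each area's device to the guidance matching that area's state."""
    return {area: guidance_for_state(d) for area, d in area_densities.items()}

commands = control_devices({"gate_1": 0.9, "gate_2": 0.3})
```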
  • Publication number: 20190272431
    Abstract: A guidance processing apparatus (100) includes an information acquisition unit (101) that acquires a plurality of different pieces of guidance information on the basis of states of a plurality of people within one or more images, and a control unit (102) that performs control of a plurality of target devices present in different spaces or time division control of a target device so as to set a plurality of different states corresponding to the plurality of pieces of guidance information.
    Type: Application
    Filed: March 8, 2019
    Publication date: September 5, 2019
    Inventors: Ryoma OAMI, Hiroyoshi MIYANO, Yusuke TAKAHASHI, Hiroo IKEDA, Yukie EBIYAMA, Ryo KAWAI, Takuya OGAWA, Kazuya KOYAMA, Hiroshi YAMADA
  • Publication number: 20190266413
    Abstract: A state acquisition unit (2020) acquires a state of a monitoring target in a captured image captured by a camera (3040). A monitoring point acquisition unit (2040) acquires, from a monitoring point information storage unit (3020), a monitoring point corresponding to the state of the monitoring target acquired by the state acquisition unit (2020). The monitoring point indicates a position to be monitored in the captured image. A presentation unit (2060) presents the monitoring point on the captured image.
    Type: Application
    Filed: March 5, 2019
    Publication date: August 29, 2019
    Inventors: Ryoma OAMI, Hiroyoshi MIYANO, Yusuke TAKAHASHI, Hiroo IKEDA, Yukie EBIYAMA, Ryo KAWAI, Takuya OGAWA, Kazuya KOYAMA, Hiroshi YAMADA
  • Publication number: 20190266412
    Abstract: A state acquisition unit (2020) acquires a state of a monitoring target in a captured image captured by a camera (3040). A monitoring point acquisition unit (2040) acquires, from a monitoring point information storage unit (3020), a monitoring point corresponding to the state of the monitoring target acquired by the state acquisition unit (2020). The monitoring point indicates a position to be monitored in the captured image. A presentation unit (2060) presents the monitoring point on the captured image.
    Type: Application
    Filed: March 5, 2019
    Publication date: August 29, 2019
    Inventors: Ryoma OAMI, Hiroyoshi MIYANO, Yusuke TAKAHASHI, Hiroo IKEDA, Yukie EBIYAMA, Ryo KAWAI, Takuya OGAWA, Kazuya KOYAMA, Hiroshi YAMADA
  • Publication number: 20190266411
    Abstract: A state acquisition unit (2020) acquires a state of a monitoring target in a captured image captured by a camera (3040). A monitoring point acquisition unit (2040) acquires, from a monitoring point information storage unit (3020), a monitoring point corresponding to the state of the monitoring target acquired by the state acquisition unit (2020). The monitoring point indicates a position to be monitored in the captured image. A presentation unit (2060) presents the monitoring point on the captured image.
    Type: Application
    Filed: March 5, 2019
    Publication date: August 29, 2019
    Inventors: Ryoma OAMI, Hiroyoshi MIYANO, Yusuke TAKAHASHI, Hiroo IKEDA, Yukie EBIYAMA, Ryo KAWAI, Takuya OGAWA, Kazuya KOYAMA, Hiroshi YAMADA
  • Publication number: 20190236418
    Abstract: Provided are an image processing system, an image processing method, and a program for suitably detecting a mobile object. The image processing system includes: an image input unit for receiving, for some image frames at different times among a plurality of image frames constituting a picture, an input indicating, for one or more arbitrarily selected pixels in the image frame being processed, whether the mobile object appears on each pixel; and a mobile object detection model constructing unit for learning a parameter for detecting the mobile object based on the input.
    Type: Application
    Filed: March 1, 2019
    Publication date: August 1, 2019
    Applicant: NEC Corporation
    Inventor: Hiroyoshi MIYANO
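The learning step described here, fitting a detector parameter from per-pixel mobile/background labels, can be sketched minimally. Here the learned parameter is a single intensity threshold placed midway between the two labeled classes; this concrete model is an assumption, not specified by the patent.

```python
def learn_threshold(labeled_samples):
    """Fit a threshold from (pixel_value, is_mobile) labeled samples."""
    fg = [v for v, is_mobile in labeled_samples if is_mobile]
    bg = [v for v, is_mobile in labeled_samples if not is_mobile]
    # Place the boundary midway between the classes' nearest values.
    return (min(fg) + max(bg)) / 2.0

def detect(value, threshold):
    """Classify a pixel as mobile object if it exceeds the threshold."""
    return value > threshold

samples = [(0.9, True), (0.8, True), (0.2, False), (0.1, False)]
thr = learn_threshold(samples)
```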
  • Publication number: 20190220672
    Abstract: [Problem] To provide a motion condition estimation device, a motion condition estimation method and a motion condition estimation program capable of accurately estimating the motion condition of monitored subjects even in a crowded environment. [Solution] A motion condition estimation device according to the present invention is provided with a quantity estimating means 81 and a motion condition estimating means 82. The quantity estimating means 81 uses a plurality of chronologically consecutive images to estimate a quantity of monitored subjects for each local region in each image. The motion condition estimating means 82 estimates the motion condition of the monitored subjects from chronological changes in the quantities estimated in each local region.
    Type: Application
    Filed: March 8, 2019
    Publication date: July 18, 2019
    Applicant: NEC CORPORATION
    Inventor: Hiroyoshi Miyano
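The two-stage estimation in this abstract, per-region quantities first, then motion from their chronological changes, can be sketched as below. The region counts are hand-made inputs, and inferring motion as "toward the region that gained the most" is a deliberately crude stand-in for the patent's method.

```python
def region_flow(prev_counts, curr_counts):
    """Per-region change in estimated quantity between two frames."""
    return [c - p for p, c in zip(prev_counts, curr_counts)]

def dominant_motion(prev_counts, curr_counts):
    """Crude motion condition: index of the region that gained the most."""
    flow = region_flow(prev_counts, curr_counts)
    return flow.index(max(flow))

# Three local regions left-to-right; the crowd shifts toward region 2.
prev = [5, 3, 0]
curr = [2, 4, 2]
direction = dominant_motion(prev, curr)
```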
  • Publication number: 20190213424
    Abstract: To provide an image processing system, an image processing method, and a program capable of detecting a group with high irregularity. An image processing system is provided with: a group detector that detects a group based on an input image captured with an image capturing device at a first time; a repeating group analyzer that determines whether a detected group has been previously detected; and an alert module that issues a report when the detected group has been determined by the repeating group analyzer to have been previously detected.
    Type: Application
    Filed: March 13, 2019
    Publication date: July 11, 2019
    Applicant: NEC Corporation
    Inventors: Ryoma Oami, Yusuke Takahashi, Hiroyoshi Miyano
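The repeating-group analysis can be sketched as keeping a memory of previously detected groups and raising an alert on a repeat. Representing a group as the frozenset of its member IDs is a simplifying assumption.

```python
class RepeatingGroupAnalyzer:
    """Remembers detected groups and flags ones seen before."""

    def __init__(self):
        self.seen = set()

    def is_repeat(self, group_members):
        """Return True if this group was detected previously, then record it."""
        key = frozenset(group_members)
        repeat = key in self.seen
        self.seen.add(key)
        return repeat

analyzer = RepeatingGroupAnalyzer()
first = analyzer.is_repeat(["p1", "p2"])   # new group -> no alert
second = analyzer.is_repeat(["p2", "p1"])  # same members again -> alert
```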
  • Patent number: 10347100
    Abstract: A similarity computation unit (130) derives a first probability P, indicating that a first moving body appearing in the first video is the same as a second moving body appearing in the second video, on the basis of the similarity of the feature values of the moving bodies. A non-appearance probability computation unit (140) derives a second probability Q, indicating that the first moving body is not the same as the second moving body, on the basis of the elapsed time after the first moving body exits from the first video. A person determination unit (150) determines whether the first moving body is the same as the second moving body by comparing the probabilities P and Q.
    Type: Grant
    Filed: April 23, 2014
    Date of Patent: July 9, 2019
    Assignee: NEC Corporation
    Inventor: Hiroyoshi Miyano
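The P-versus-Q decision in this abstract can be sketched as follows: P comes from feature similarity, Q grows with the elapsed time since the first moving body left the first video, and the comparison decides the match. Both probability models below (identity and an exponential) are assumptions; the patent does not prescribe them.

```python
import math

def prob_same(feature_similarity):
    """P: probability the two moving bodies match, from feature similarity."""
    return feature_similarity  # assume similarity is already scaled to [0, 1]

def prob_not_same(elapsed_s, typical_transit_s=30.0):
    """Q: probability of a non-match, growing with the elapsed time."""
    return 1.0 - math.exp(-elapsed_s / typical_transit_s)

def same_person(feature_similarity, elapsed_s):
    """Decide the match by comparing P and Q, as the abstract describes."""
    return prob_same(feature_similarity) > prob_not_same(elapsed_s)
```

A short absence with high similarity matches; a very long absence does not, even with moderate similarity.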
  • Publication number: 20190205660
    Abstract: [Problem] To provide a motion condition estimation device, a motion condition estimation method and a motion condition estimation program capable of accurately estimating the motion condition of monitored subjects even in a crowded environment. [Solution] A motion condition estimation device according to the present invention is provided with a quantity estimating means 81 and a motion condition estimating means 82. The quantity estimating means 81 uses a plurality of chronologically consecutive images to estimate a quantity of monitored subjects for each local region in each image. The motion condition estimating means 82 estimates the motion condition of the monitored subjects from chronological changes in the quantities estimated in each local region.
    Type: Application
    Filed: March 8, 2019
    Publication date: July 4, 2019
    Applicant: NEC CORPORATION
    Inventor: Hiroyoshi Miyano
  • Publication number: 20190206067
    Abstract: Provided is an image processing apparatus (2000) including an index value calculation unit (2020) and a presentation unit (2040). The index value calculation unit (2020) acquires a plurality of images captured by a camera (3000) (captured images), and calculates an index value indicating the degree of change in the state of a monitoring target in the captured images, using the acquired captured images. The presentation unit (2040) presents an indication based on the index value calculated by the index value calculation unit (2020) on the captured image captured by the camera (3000).
    Type: Application
    Filed: March 5, 2019
    Publication date: July 4, 2019
    Inventors: Ryoma OAMI, Hiroyoshi MIYANO, Yusuke TAKAHASHI, Hiroo IKEDA, Yukie EBIYAMA, Ryo KAWAI, Takuya OGAWA, Kazuya KOYAMA, Hiroshi YAMADA
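One way to picture the "degree of change" index value is as a normalized difference between consecutive per-frame estimates of the monitoring target's state. This concrete formula is an assumption for illustration; the abstract does not define the index.

```python
def change_index(prev_count, curr_count):
    """Relative change between two frames' estimated state values."""
    if prev_count == 0:
        return float(curr_count > 0)
    return abs(curr_count - prev_count) / prev_count

# Index per frame pair: no change, 50% growth, appearance from empty.
indices = [change_index(a, b) for a, b in [(10, 10), (10, 15), (0, 4)]]
```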
  • Publication number: 20190205661
    Abstract: A guidance processing apparatus (100) includes an information acquisition unit (101) that acquires a plurality of different pieces of guidance information on the basis of states of a plurality of people within one or more images, and a control unit (102) that performs control of a plurality of target devices present in different spaces or time division control of a target device so as to set a plurality of different states corresponding to the plurality of pieces of guidance information.
    Type: Application
    Filed: March 8, 2019
    Publication date: July 4, 2019
    Inventors: Ryoma OAMI, Hiroyoshi MIYANO, Yusuke TAKAHASHI, Hiroo IKEDA, Yukie EBIYAMA, Ryo KAWAI, Takuya OGAWA, Kazuya KOYAMA, Hiroshi YAMADA
  • Publication number: 20190197708
    Abstract: Provided is an image processing apparatus (2000) including an index value calculation unit (2020) and a presentation unit (2040). The index value calculation unit (2020) acquires a plurality of images captured by a camera (3000) (captured images), and calculates an index value indicating the degree of change in the state of a monitoring target in the captured images, using the acquired captured images. The presentation unit (2040) presents an indication based on the index value calculated by the index value calculation unit (2020) on the captured image captured by the camera (3000).
    Type: Application
    Filed: March 5, 2019
    Publication date: June 27, 2019
    Inventors: Ryoma OAMI, Hiroyoshi MIYANO, Yusuke TAKAHASHI, Hiroo IKEDA, Yukie EBIYAMA, Ryo KAWAI, Takuya OGAWA, Kazuya KOYAMA, Hiroshi YAMADA
  • Patent number: 10325160
    Abstract: [Problem] To provide a motion condition estimation device, a motion condition estimation method and a motion condition estimation program capable of accurately estimating the motion condition of monitored subjects even in a crowded environment. [Solution] A motion condition estimation device according to the present invention is provided with a quantity estimating means 81 and a motion condition estimating means 82. The quantity estimating means 81 uses a plurality of chronologically consecutive images to estimate a quantity of monitored subjects for each local region in each image. The motion condition estimating means 82 estimates the motion condition of the monitored subjects from chronological changes in the quantities estimated in each local region.
    Type: Grant
    Filed: January 13, 2016
    Date of Patent: June 18, 2019
    Assignee: NEC CORPORATION
    Inventor: Hiroyoshi Miyano
  • Publication number: 20190180584
    Abstract: A similarity computation unit (130) derives a first probability P, indicating that a first moving body appearing in the first video is the same as a second moving body appearing in the second video, on the basis of the similarity of the feature values of the moving bodies. A non-appearance probability computation unit (140) derives a second probability Q, indicating that the first moving body is not the same as the second moving body, on the basis of the elapsed time after the first moving body exits from the first video. A person determination unit (150) determines whether the first moving body is the same as the second moving body by comparing the probabilities P and Q.
    Type: Application
    Filed: February 13, 2019
    Publication date: June 13, 2019
    Inventor: Hiroyoshi MIYANO
  • Publication number: 20190180583
    Abstract: A similarity computation unit (130) derives a first probability P, indicating that a first moving body appearing in the first video is the same as a second moving body appearing in the second video, on the basis of the similarity of the feature values of the moving bodies. A non-appearance probability computation unit (140) derives a second probability Q, indicating that the first moving body is not the same as the second moving body, on the basis of the elapsed time after the first moving body exits from the first video. A person determination unit (150) determines whether the first moving body is the same as the second moving body by comparing the probabilities P and Q.
    Type: Application
    Filed: February 13, 2019
    Publication date: June 13, 2019
    Inventor: Hiroyoshi MIYANO
  • Publication number: 20190147282
    Abstract: Provided is a technique for enhancing operability of a mobile apparatus. An information processing apparatus (2000) includes a first processing unit (2020), a second processing unit (2040), and a control unit (2060). The first processing unit (2020) generates information indicating an event detection position in accordance with a position on a surveillance image set in a first operation. The first operation is an operation with respect to the surveillance image displayed on a display screen. The second processing unit (2040) performs a display change process with respect to the surveillance image or a window including the surveillance image. The control unit (2060) causes any one of the first processing unit (2020) and the second processing unit (2040) to process the first operation on the basis of a second operation.
    Type: Application
    Filed: December 18, 2018
    Publication date: May 16, 2019
    Applicant: NEC CORPORATION
    Inventors: Kenichiro IDA, Hiroshi KITAJIMA, Hiroyoshi MIYANO
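The operation routing described in this last abstract, where a second operation decides whether a first operation sets an event detection position or changes the display, can be sketched as a mode switch. The class, mode names, and pan stand-in are illustrative assumptions.

```python
class SurveillanceUI:
    """Routes a click on the surveillance image according to the current mode."""

    def __init__(self):
        self.mode = "detect"       # set by the "second operation"
        self.detection_points = []
        self.view_offset = (0, 0)

    def second_operation(self, mode):
        """Switch which unit will handle the next first operation."""
        self.mode = mode

    def first_operation(self, x, y):
        """Either record an event detection position or change the display."""
        if self.mode == "detect":
            self.detection_points.append((x, y))
        else:
            self.view_offset = (x, y)  # stand-in for a pan/zoom display change

ui = SurveillanceUI()
ui.first_operation(10, 20)    # recorded as an event detection position
ui.second_operation("pan")
ui.first_operation(5, 5)      # now changes the display instead
```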