Patents by Inventor Ryo Kawai

Ryo Kawai has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 10614317
    Abstract: A state acquisition unit (2020) acquires a state of a monitoring target in a captured image captured by a camera (3040). A monitoring point acquisition unit (2040) acquires, from a monitoring point information storage unit (3020), a monitoring point corresponding to the state of the monitoring target acquired by the state acquisition unit (2020). The monitoring point indicates a position to be monitored in the captured image. A presentation unit (2060) presents the monitoring point on the captured image.
    Type: Grant
    Filed: March 5, 2019
    Date of Patent: April 7, 2020
    Assignee: NEC Corporation
    Inventors: Ryoma Oami, Hiroyoshi Miyano, Yusuke Takahashi, Hiroo Ikeda, Yukie Ebiyama, Ryo Kawai, Takuya Ogawa, Kazuya Koyama, Hiroshi Yamada
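    The pipeline this abstract describes — acquire a state, look up the monitoring points registered for that state, and present them on the captured image — could be sketched roughly as below. The state names, point coordinates, and square-marker rendering are purely illustrative assumptions, not details from the patent.

    ```python
    import numpy as np

    # Hypothetical monitoring point information storage: maps a monitoring-target
    # state to image positions (x, y) that should be watched in that state.
    MONITORING_POINTS = {
        "queue_forming": [(120, 80), (200, 80)],
        "crowd_dispersing": [(60, 150)],
    }

    def acquire_monitoring_points(state):
        """Monitoring point acquisition: return the points registered for a state."""
        return MONITORING_POINTS.get(state, [])

    def present_monitoring_points(image, points, size=3):
        """Presentation: mark each monitoring point on a copy of the captured image."""
        out = image.copy()
        h, w = out.shape[:2]
        for x, y in points:
            y0, y1 = max(0, y - size), min(h, y + size + 1)
            x0, x1 = max(0, x - size), min(w, x + size + 1)
            out[y0:y1, x0:x1] = 255  # draw a bright square marker
        return out

    frame = np.zeros((240, 320), dtype=np.uint8)  # stand-in for a captured image
    points = acquire_monitoring_points("queue_forming")
    annotated = present_monitoring_points(frame, points)
    ```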
  • Publication number: 20200074184
    Abstract: An information processing apparatus (2000) includes a summarizing unit (2040) and a display control unit (2060). The summarizing unit (2040) obtains a video (30) generated by each of a plurality of cameras (10). Furthermore, the summarizing unit (2040) performs a summarizing process on the video (30) and generates summary information of the video (30). The display control unit (2060) causes a display system (20) to display the video (30). Here, the display control unit (2060) causes the display system (20) to display the summary information of the video (30) in response to that a change in a display state of the video (30) in the display system (20) satisfies a predetermined condition.
    Type: Application
    Filed: November 5, 2019
    Publication date: March 5, 2020
    Applicant: NEC CORPORATION
    Inventor: Ryo KAWAI
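    A minimal sketch of the display control unit's behavior described above, under one assumed "predetermined condition": the summary is presented when a video tile shrinks below a minimum width. The class name, the threshold, and the shrink condition are hypothetical stand-ins for the patent's more general condition on display-state changes.

    ```python
    class DisplayController:
        """Show a video's summary information when a change in its display
        state satisfies a condition (here: the tile became too small)."""

        def __init__(self, min_width=160):
            self.min_width = min_width
            self.summaries_shown = set()

        def on_display_state_change(self, video_id, new_width, summary):
            # The tile is now too small to watch directly, so present the
            # summary information generated by the summarizing unit instead.
            if new_width < self.min_width:
                self.summaries_shown.add(video_id)
                return summary
            self.summaries_shown.discard(video_id)
            return None

    ctrl = DisplayController()
    shown = ctrl.on_display_state_change("cam-3", 120, "3 persons entered, 14:02-14:05")
    ```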
  • Publication number: 20200065982
    Abstract: An information processing apparatus (2000) includes a first analyzing unit (2020), a second analyzing unit (2040), and an estimating unit (2060). The first analyzing unit (2020) calculates a flow of a crowd in a capturing range of a fixed camera (10) using a first surveillance image (12). The second analyzing unit (2040) calculates a distribution of an attribute of objects in a capturing range of a moving camera (20) using a second surveillance image (22). The estimating unit (2060) estimates an attribute distribution for a range that is not included in the capturing range of the moving camera (20).
    Type: Application
    Filed: June 18, 2019
    Publication date: February 27, 2020
    Applicant: NEC CORPORATION
    Inventors: Ryoma OAMI, Katsuhiko TAKAHASHI, Yusuke KONISHI, Hiroshi YAMADA, Hiroo IKEDA, Junko NAKAGAWA, Kosuke YOSHIMI, Ryo KAWAI, Takuya OGAWA
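    One way to picture the estimating unit above: the fixed camera yields a crowd flow between regions, the moving camera yields an attribute distribution only for the regions it covers, and an uncovered region borrows the distribution carried in by the flow from upstream. The grid-cell model, the single attribute fraction, and the upstream-propagation rule are all assumptions for illustration, not the patent's method.

    ```python
    # Crowd flow from the fixed camera, as "cell -> its upstream cell";
    # attribute fractions observed by the moving camera in covered cells.
    flow = {"B": "A", "C": "B"}
    observed = {"A": 0.30}

    def estimate_attribute(cell, flow, observed):
        """Propagate the observed attribute distribution along the crowd flow
        until a covered cell is reached; None if no upstream cell is covered."""
        seen = set()
        while cell not in observed:
            if cell in seen or cell not in flow:
                return None
            seen.add(cell)
            cell = flow[cell]  # step upstream along the flow
        return observed[cell]

    estimate = estimate_attribute("C", flow, observed)
    ```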
  • Publication number: 20200042778
    Abstract: Provided is a technique for extracting information with which it is possible to track an object to be tracked, even if it happens that the object to be tracked is hidden or the like. This image processing device is provided with: a moving region identification unit which identifies, in a video, the image region associated with a moving object shown in the video; a stationary region identification unit which identifies, in the video, the image region associated with a stationary object shown in the video; and an extraction unit which extracts a feature of a partial image that is included in the image region associated with the stationary object, and that does not overlap the image region associated with the moving object.
    Type: Application
    Filed: October 11, 2019
    Publication date: February 6, 2020
    Applicant: NEC Corporation
    Inventor: Ryo KAWAI
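    The extraction unit in this abstract takes only the part of the stationary-object region that does not overlap the moving-object region. A minimal sketch with boolean masks, assuming a mean-intensity "feature" purely for illustration:

    ```python
    import numpy as np

    def extract_stationary_features(frame, moving_mask, stationary_mask):
        """Extract a feature of the partial image inside the stationary-object
        region that does NOT overlap the moving-object region."""
        usable = stationary_mask & ~moving_mask  # stationary region minus overlap
        if not usable.any():
            return None  # stationary object fully hidden by a moving object
        return float(frame[usable].mean())

    frame = np.arange(16, dtype=np.float64).reshape(4, 4)
    moving = np.zeros((4, 4), dtype=bool); moving[:, :2] = True       # left half moving
    stationary = np.zeros((4, 4), dtype=bool); stationary[0, :] = True  # top row stationary

    feat = extract_stationary_features(frame, moving, stationary)
    ```

    Because the feature ignores pixels occluded by moving objects, it stays usable for matching the tracked object even while part of it is hidden.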
  • Publication number: 20200043181
    Abstract: Provided is a technique for extracting information with which it is possible to track an object to be tracked, even if it happens that the object to be tracked is hidden or the like. This image processing device is provided with: a moving region identification unit which identifies, in a video, the image region associated with a moving object shown in the video; a stationary region identification unit which identifies, in the video, the image region associated with a stationary object shown in the video; and an extraction unit which extracts a feature of a partial image that is included in the image region associated with the stationary object, and that does not overlap the image region associated with the moving object.
    Type: Application
    Filed: October 11, 2019
    Publication date: February 6, 2020
    Applicant: NEC Corporation
    Inventor: Ryo KAWAI
  • Publication number: 20200043177
    Abstract: Provided is a technique for extracting information with which it is possible to track an object to be tracked, even if it happens that the object to be tracked is hidden or the like. This image processing device is provided with: a moving region identification unit which identifies, in a video, the image region associated with a moving object shown in the video; a stationary region identification unit which identifies, in the video, the image region associated with a stationary object shown in the video; and an extraction unit which extracts a feature of a partial image that is included in the image region associated with the stationary object, and that does not overlap the image region associated with the moving object.
    Type: Application
    Filed: October 19, 2016
    Publication date: February 6, 2020
    Applicant: NEC Corporation
    Inventor: Ryo KAWAI
  • Publication number: 20190385315
    Abstract: In order to produce a discriminator that has higher discrimination ability, this image-processing device is provided with a synthesis unit for synthesizing a background image and an object image the hue and/or brightness of which at least partially resembles at least a portion of the background image, a generation unit for generating a difference image between the synthesized image and the background image, and a machine learning unit for performing machine learning using the generated difference image as learning data.
    Type: Application
    Filed: August 21, 2019
    Publication date: December 19, 2019
    Applicant: NEC CORPORATION
    Inventor: Ryo KAWAI
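    The synthesis and generation units above can be sketched as a paste-then-subtract pipeline: composite an object whose brightness is close to the background, then take the difference image as learning data. The array sizes, intensities, and absolute-difference choice are illustrative assumptions.

    ```python
    import numpy as np

    def synthesize(background, obj, mask, top, left):
        """Synthesis unit: paste an object image onto the background where mask is set."""
        out = background.copy()
        h, w = obj.shape
        region = out[top:top + h, left:left + w]
        region[mask] = obj[mask]
        return out

    def difference_image(synthesized, background):
        """Generation unit: per-pixel absolute difference from the background."""
        return np.abs(synthesized.astype(np.int16) - background.astype(np.int16)).astype(np.uint8)

    bg = np.full((8, 8), 100, dtype=np.uint8)
    obj = np.full((3, 3), 110, dtype=np.uint8)  # brightness close to the background
    mask = np.ones((3, 3), dtype=bool)

    synth = synthesize(bg, obj, mask, 2, 2)
    diff = difference_image(synth, bg)  # fed to the machine learning unit as learning data
    ```

    Training the discriminator on such low-contrast difference images is what pushes it to separate objects that barely stand out from the background.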
  • Publication number: 20190378281
    Abstract: In order to produce a discriminator that has higher discrimination ability, this image-processing device is provided with a synthesis unit for synthesizing a background image and an object image the hue and/or brightness of which at least partially resembles at least a portion of the background image, a generation unit for generating a difference image between the synthesized image and the background image, and a machine learning unit for performing machine learning using the generated difference image as learning data.
    Type: Application
    Filed: August 21, 2019
    Publication date: December 12, 2019
    Applicant: NEC CORPORATION
    Inventor: Ryo KAWAI
  • Publication number: 20190333234
    Abstract: An information processing apparatus (2000) includes a first analyzing unit (2020), a second analyzing unit (2040), and an estimating unit (2060). The first analyzing unit (2020) calculates a flow of a crowd in a capturing range of a fixed camera (10) using a first surveillance image (12). The second analyzing unit (2040) calculates a distribution of an attribute of objects in a capturing range of a moving camera (20) using a second surveillance image (22). The estimating unit (2060) estimates an attribute distribution for a range that is not included in the capturing range of the moving camera (20).
    Type: Application
    Filed: June 18, 2019
    Publication date: October 31, 2019
    Applicant: NEC CORPORATION
    Inventors: Ryoma OAMI, Katsuhiko TAKAHASHI, Yusuke KONISHI, Hiroshi YAMADA, Hiroo IKEDA, Junko NAKAGAWA, Kosuke YOSHIMI, Ryo KAWAI, Takuya OGAWA
  • Publication number: 20190325589
    Abstract: An information processing apparatus (2000) includes a first analyzing unit (2020), a second analyzing unit (2040), and an estimating unit (2060). The first analyzing unit (2020) calculates a flow of a crowd in a capturing range of a fixed camera (10) using a first surveillance image (12). The second analyzing unit (2040) calculates a distribution of an attribute of objects in a capturing range of a moving camera (20) using a second surveillance image (22). The estimating unit (2060) estimates an attribute distribution for a range that is not included in the capturing range of the moving camera (20).
    Type: Application
    Filed: June 18, 2019
    Publication date: October 24, 2019
    Applicant: NEC CORPORATION
    Inventors: Ryoma OAMI, Katsuhiko TAKAHASHI, Yusuke KONISHI, Hiroshi YAMADA, Hiroo IKEDA, Junko NAKAGAWA, Kosuke YOSHIMI, Ryo KAWAI, Takuya OGAWA
  • Patent number: 10432877
    Abstract: An image processing system includes: a receiving unit configured to receive an input of a plurality of image frames constituting a video from an imaging apparatus; a detection unit configured to detect a feature point included in an image frame to be processed in the plurality of image frames; and an output unit configured to output an output image obtained by superimposing an area, detected as a feature point, of the image frame to be processed on a background image generated from at least some of the plurality of image frames.
    Type: Grant
    Filed: June 17, 2015
    Date of Patent: October 1, 2019
    Assignee: NEC CORPORATION
    Inventor: Ryo Kawai
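    A compact sketch of the output unit's composition step: build a background from several frames, then overlay the areas of the current frame detected as feature points. The median-based background and the single-point mask are assumptions made for the example, not specifics of the patent.

    ```python
    import numpy as np

    def build_background(frames):
        """Background generated from at least some frames (pixel-wise median)."""
        return np.median(np.stack(frames), axis=0).astype(np.uint8)

    def superimpose_feature_regions(background, frame, feature_mask):
        """Output an image with the frame's feature-point areas on the background."""
        out = background.copy()
        out[feature_mask] = frame[feature_mask]
        return out

    frames = [np.full((4, 4), v, dtype=np.uint8) for v in (98, 100, 102)]
    bg = build_background(frames)                     # stable scene, ~100 everywhere
    current = np.full((4, 4), 200, dtype=np.uint8)    # frame to be processed
    mask = np.zeros((4, 4), dtype=bool); mask[1, 1] = True  # detected feature point

    out = superimpose_feature_regions(bg, current, mask)
    ```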
  • Patent number: 10422249
    Abstract: An exhaust frame includes: an inner casing; an inner diffuser which defines, between the inner diffuser and the inner casing, an annular inner cooling passage connected to a final-stage wheel space; an outer diffuser which defines an exhaust passage between the outer diffuser and the inner diffuser; an outer casing which defines an annular outer cooling passage between the outer casing and the outer diffuser; a strut which connects the inner casing and the outer casing to each other while crossing the exhaust passage; a strut cover which connects the inner diffuser and the outer diffuser to each other, and defines, between the strut cover and the strut, an annular connection passage connecting the inner cooling passage and the outer cooling passage to each other; and a communication hole provided in a wall of the outer cooling passage at a position on the downstream side of a center line of the strut in the flow direction of a combustion gas.
    Type: Grant
    Filed: January 18, 2017
    Date of Patent: September 24, 2019
    Assignee: Mitsubishi Hitachi Power Systems, Ltd.
    Inventors: Takuya Takeda, Tetsuya Nakamura, Ryo Kawai, Kenji Nanataki
  • Publication number: 20190286913
    Abstract: An information processing apparatus (2000) includes a summarizing unit (2040) and a display control unit (2060). The summarizing unit (2040) obtains a video (30) generated by each of a plurality of cameras (10). Furthermore, the summarizing unit (2040) performs a summarizing process on the video (30) and generates summary information of the video (30). The display control unit (2060) causes a display system (20) to display the video (30). Here, the display control unit (2060) causes the display system (20) to display the summary information of the video (30) in response to that a change in a display state of the video (30) in the display system (20) satisfies a predetermined condition.
    Type: Application
    Filed: November 7, 2016
    Publication date: September 19, 2019
    Applicant: NEC CORPORATION
    Inventor: Ryo KAWAI
  • Publication number: 20190272431
    Abstract: A guidance processing apparatus (100) includes an information acquisition unit (101) that acquires a plurality of different pieces of guidance information on the basis of states of a plurality of people within one or more images, and a control unit (102) that performs control of a plurality of target devices present in different spaces or time division control of a target device so as to set a plurality of different states corresponding to the plurality of pieces of guidance information.
    Type: Application
    Filed: March 8, 2019
    Publication date: September 5, 2019
    Inventors: Ryoma OAMI, Hiroyoshi MIYANO, Yusuke TAKAHASHI, Hiroo IKEDA, Yukie EBIYAMA, Ryo KAWAI, Takuya OGAWA, Kazuya KOYAMA, Hiroshi YAMADA
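    The control unit above either drives several devices in different spaces or time-division controls a single device. One hypothetical reading, with invented device names and a naive assignment rule, could look like this:

    ```python
    def plan_device_control(guidance_infos, devices):
        """Assign each piece of guidance information to a target device in a
        different space; if guidance states outnumber devices, the last device
        cycles through the remaining states (time-division control)."""
        if len(devices) >= len(guidance_infos):
            return {dev: info for dev, info in zip(devices, guidance_infos)}
        plan = {dev: info for dev, info in zip(devices[:-1], guidance_infos)}
        plan[devices[-1]] = list(guidance_infos[len(devices) - 1:])  # time slices
        return plan

    plan = plan_device_control(["go left", "go right", "wait"], ["sign_A", "sign_B"])
    ```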
  • Publication number: 20190272430
    Abstract: A guidance processing apparatus (100) includes an information acquisition unit (101) that acquires a plurality of different pieces of guidance information on the basis of states of a plurality of people within one or more images, and a control unit (102) that performs control of a plurality of target devices present in different spaces or time division control of a target device so as to set a plurality of different states corresponding to the plurality of pieces of guidance information.
    Type: Application
    Filed: March 8, 2019
    Publication date: September 5, 2019
    Inventors: Ryoma OAMI, Hiroyoshi MIYANO, Yusuke TAKAHASHI, Hiroo IKEDA, Yukie EBIYAMA, Ryo KAWAI, Takuya OGAWA, Kazuya KOYAMA, Hiroshi YAMADA
  • Publication number: 20190266411
    Abstract: A state acquisition unit (2020) acquires a state of a monitoring target in a captured image captured by a camera (3040). A monitoring point acquisition unit (2040) acquires, from a monitoring point information storage unit (3020), a monitoring point corresponding to the state of the monitoring target acquired by the state acquisition unit (2020). The monitoring point indicates a position to be monitored in the captured image. A presentation unit (2060) presents the monitoring point on the captured image.
    Type: Application
    Filed: March 5, 2019
    Publication date: August 29, 2019
    Inventors: Ryoma OAMI, Hiroyoshi MIYANO, Yusuke TAKAHASHI, Hiroo IKEDA, Yukie EBIYAMA, Ryo KAWAI, Takuya OGAWA, Kazuya KOYAMA, Hiroshi YAMADA
  • Publication number: 20190266412
    Abstract: A state acquisition unit (2020) acquires a state of a monitoring target in a captured image captured by a camera (3040). A monitoring point acquisition unit (2040) acquires, from a monitoring point information storage unit (3020), a monitoring point corresponding to the state of the monitoring target acquired by the state acquisition unit (2020). The monitoring point indicates a position to be monitored in the captured image. A presentation unit (2060) presents the monitoring point on the captured image.
    Type: Application
    Filed: March 5, 2019
    Publication date: August 29, 2019
    Inventors: Ryoma OAMI, Hiroyoshi MIYANO, Yusuke TAKAHASHI, Hiroo IKEDA, Yukie EBIYAMA, Ryo KAWAI, Takuya OGAWA, Kazuya KOYAMA, Hiroshi YAMADA
  • Publication number: 20190266413
    Abstract: A state acquisition unit (2020) acquires a state of a monitoring target in a captured image captured by a camera (3040). A monitoring point acquisition unit (2040) acquires, from a monitoring point information storage unit (3020), a monitoring point corresponding to the state of the monitoring target acquired by the state acquisition unit (2020). The monitoring point indicates a position to be monitored in the captured image. A presentation unit (2060) presents the monitoring point on the captured image.
    Type: Application
    Filed: March 5, 2019
    Publication date: August 29, 2019
    Inventors: Ryoma OAMI, Hiroyoshi MIYANO, Yusuke TAKAHASHI, Hiroo IKEDA, Yukie EBIYAMA, Ryo KAWAI, Takuya OGAWA, Kazuya KOYAMA, Hiroshi YAMADA
  • Publication number: 20190253636
    Abstract: An information processing apparatus (2000) includes a determination unit (2020) and a notification unit (2040). The determination unit (2020) determines whether a field of view of a second camera is correct, on the basis of a first captured image (40), a second captured image (50), and relationship information (information indicating the relationship to be satisfied between a field of view of a first camera and the field of view of the second camera). In a case in which the field of view of the second camera is not correct, the notification unit (2040) notifies that the field of view of the second camera is not correct. The first camera is provided in a head-mounted display worn by a person. The second camera is provided in a part other than the head-mounted display.
    Type: Application
    Filed: April 26, 2019
    Publication date: August 15, 2019
    Applicant: NEC CORPORATION
    Inventors: Ryoma Oami, Katsuhiko Takahashi, Yuusuke Konishi, Hiroshi Yamada, Hiroo Ikeda, Junko Nakagawa, Kosuke Yoshimi, Ryo Kawai, Takuya Ogawa
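    The determination and notification units above can be sketched with one assumed form of the relationship information: the second camera's field of view must overlap the first's by at least half its extent. Modeling fields of view as 1-D intervals, and the overlap threshold itself, are simplifications for illustration only.

    ```python
    def fov_is_correct(first_fov, second_fov, min_overlap=0.5):
        """Determination unit: check the assumed relationship that the second
        camera's view (lo, hi) overlaps the first's by at least min_overlap
        of the second view's width."""
        lo = max(first_fov[0], second_fov[0])
        hi = min(first_fov[1], second_fov[1])
        overlap = max(0.0, hi - lo)
        width = second_fov[1] - second_fov[0]
        return overlap >= min_overlap * width

    def notify_if_incorrect(first_fov, second_fov):
        """Notification unit: report when the second camera's view is not correct."""
        if not fov_is_correct(first_fov, second_fov):
            return "field of view of the second camera is not correct"
        return None

    msg = notify_if_incorrect((0.0, 10.0), (9.0, 19.0))
    ```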
  • Publication number: 20190251346
    Abstract: An information processing apparatus (2000) includes a determination unit (2020) and a notification unit (2040). The determination unit (2020) determines whether a field of view of a second camera is correct, on the basis of a first captured image (40), a second captured image (50), and relationship information (information indicating the relationship to be satisfied between a field of view of a first camera and the field of view of the second camera). In a case in which the field of view of the second camera is not correct, the notification unit (2040) notifies that the field of view of the second camera is not correct. The first camera is provided in a head-mounted display worn by a person. The second camera is provided in a part other than the head-mounted display.
    Type: Application
    Filed: April 26, 2019
    Publication date: August 15, 2019
    Applicant: NEC CORPORATION
    Inventors: Ryoma OAMI, Katsuhiko Takahashi, Yuusuke Konishi, Hiroshi Yamada, Hiroo Ikeda, Junko Nakagawa, Kosuke Yoshimi, Ryo Kawai, Takuya Ogawa