Patents by Inventor Katsuhiko Takahashi

Katsuhiko Takahashi has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 10931923
    Abstract: A surveillance system (1) includes an area information acquisition unit (101), a request information provision unit (102), and a participation consent reception unit (103). The area information acquisition unit (101) acquires information of a surveillance-desired area. The request information provision unit (102) provides participation request information for surveillance that is conducted by using an image capturing unit of a portable terminal in the surveillance-desired area. The participation consent reception unit (103) receives participation consent as a response to the participation request information, from the portable terminal.
    Type: Grant
    Filed: March 13, 2019
    Date of Patent: February 23, 2021
    Assignee: NEC CORPORATION
    Inventors: Ryoma Oami, Katsuhiko Takahashi, Yuusuke Konishi, Hiroo Ikeda, Junko Nakagawa, Kosuke Yoshimi, Yukie Ebiyama, Ryo Kawai, Takuya Ogawa
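
The three units above describe, in effect, a simple request/consent protocol between a surveillance server and portable terminals. The Python sketch below is only an illustration of that flow under assumed names (SurveillanceServer, SurveillanceArea, terminal IDs); it is not taken from the patent itself.

```python
# Hypothetical sketch of units 101-103 described in the abstract above.
from dataclasses import dataclass

@dataclass
class SurveillanceArea:
    name: str
    lat: float
    lon: float
    radius_m: float

@dataclass
class ParticipationRequest:
    area: SurveillanceArea
    message: str

class SurveillanceServer:
    """Models units 101-103: area acquisition, request provision, consent reception."""
    def __init__(self):
        self.area = None
        self.consenting_terminals = []

    def acquire_area(self, area: SurveillanceArea):        # unit 101
        self.area = area

    def provide_request(self) -> ParticipationRequest:     # unit 102
        return ParticipationRequest(
            self.area, f"Please help monitor {self.area.name} with your camera.")

    def receive_consent(self, terminal_id: str):           # unit 103
        self.consenting_terminals.append(terminal_id)

# Example flow
server = SurveillanceServer()
server.acquire_area(SurveillanceArea("Station West Exit", 35.68, 139.76, 200.0))
request = server.provide_request()
server.receive_consent("terminal-042")   # a terminal answers the request
print(request.message, server.consenting_terminals)
```
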
  • Publication number: 20210027110
    Abstract: The first parameter generation unit 811 generates a first parameter, which is a parameter of a first recognizer, using first learning data including a combination of data to be recognized, a correct label of the data, and domain information indicating a collection environment of the data. The second parameter generation unit 812 generates a second parameter, which is a parameter of a second recognizer, using second learning data including a combination of data to be recognized that is collected in a predetermined collection environment, a correct label of the data, and target domain information indicating the predetermined collection environment, based on the first parameter. The third parameter generation unit 813 integrates the first parameter and the second parameter to generate a third parameter to be used for pattern recognition of input data by learning using the first learning data.
    Type: Application
    Filed: May 10, 2018
    Publication date: January 28, 2021
    Applicant: NEC Corporation
    Inventors: Katsuhiko TAKAHASHI, Hiroyoshi MIYANO, Tetsuo INOSHITA
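
The three-stage parameter generation above reads like a domain-adaptation recipe: train on mixed-domain data, adapt to the target domain, then integrate the two parameter sets and relearn. The toy sketch below illustrates that recipe with a tiny logistic-regression "recognizer"; the averaging rule and all names are assumptions, not the method claimed in the application.

```python
# Illustrative sketch of the three-stage parameter generation described above,
# using a toy linear classifier; the integration rule (simple averaging) and the
# variable names are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def train_linear(X, y, epochs=200, lr=0.1, init=None):
    """Tiny logistic-regression trainer used as a stand-in for a 'recognizer'."""
    w = np.zeros(X.shape[1]) if init is None else init.copy()
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w -= lr * X.T @ (p - y) / len(y)
    return w

# First learning data: samples from several collection environments (domains).
X_all = rng.normal(size=(300, 5)); y_all = (X_all[:, 0] > 0).astype(float)
# Second learning data: samples from the target collection environment only.
X_tgt = rng.normal(loc=0.5, size=(60, 5)); y_tgt = (X_tgt[:, 0] > 0.5).astype(float)

theta1 = train_linear(X_all, y_all)                  # first parameter (unit 811)
theta2 = train_linear(X_tgt, y_tgt, init=theta1)     # second parameter (unit 812)
theta3 = train_linear(X_all, y_all,                  # third parameter (unit 813):
                      init=(theta1 + theta2) / 2)    # integrate, then relearn
print(theta3)
```
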
  • Patent number: 10887561
    Abstract: A surveillance system (1) includes an area information acquisition unit (101), a position information acquisition unit (102), a candidate determination unit (103), and a notification unit (104). The area information acquisition unit (101) acquires information of a surveillance-desired area. The position information acquisition unit (102) acquires pieces of position information of a plurality of portable terminals (20), each portable terminal performing surveillance using an image capturing unit. The candidate determination unit (103) determines a candidate portable terminal (20) to be moved to the surveillance-desired area from among the plurality of portable terminals (20) based on the acquired pieces of position information of the plurality of portable terminals (20). The notification unit (104) outputs a notification to the candidate portable terminal requesting to move to the surveillance-desired area.
    Type: Grant
    Filed: March 13, 2019
    Date of Patent: January 5, 2021
    Assignee: NEC CORPORATION
    Inventors: Ryoma Oami, Katsuhiko Takahashi, Yuusuke Konishi, Hiroo Ikeda, Junko Nakagawa, Kosuke Yoshimi, Yukie Ebiyama, Ryo Kawai, Takuya Ogawa
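
One way to read units 101-104 is: given the surveillance-desired area and the terminals' positions, choose a candidate (for example the nearest terminal) and notify it. The sketch below illustrates that reading; the haversine distance and the nearest-terminal policy are assumptions for illustration only.

```python
# Minimal sketch of units 101-104: pick the terminal closest to the
# surveillance-desired area and notify it. Coordinates and names are invented.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points in kilometres."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

area = {"lat": 35.681, "lon": 139.767}                       # unit 101
terminals = {"A": (35.690, 139.700), "B": (35.660, 139.730),
             "C": (35.685, 139.770)}                         # unit 102

candidate = min(terminals,                                   # unit 103
                key=lambda t: haversine_km(area["lat"], area["lon"], *terminals[t]))
print(f"Notify terminal {candidate}: please move to the surveillance area")  # unit 104
```
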
  • Publication number: 20200334801
    Abstract: In the present invention, a first image acquisition means 81 acquires a first image of an inspection target including an abnormal part. A second image acquisition means 82 acquires a second image of the inspection target captured earlier than the time when the first image is captured. A learning data generation means 83 generates learning data indicating that the second image includes an abnormal part. A learning means 84 learns a discrimination dictionary by using the learning data generated by the learning data generation means 83.
    Type: Application
    Filed: December 6, 2017
    Publication date: October 22, 2020
    Applicant: NEC Corporation
    Inventors: Katsuhiko TAKAHASHI, Takashi SHIBATA
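
The core idea above is label propagation backwards in time: once an abnormality is confirmed in a later inspection image, earlier images of the same target are treated as positive training examples as well. A minimal sketch, assuming a stand-in feature extractor and a nearest-class-mean style "discrimination dictionary":

```python
# Sketch of the idea above: the earlier image of the same target inherits the
# abnormal label and is added to the training set. The feature extractor and the
# 'dictionary' (class-mean features) are assumed stand-ins.
import numpy as np
from collections import namedtuple

Sample = namedtuple("Sample", "features label")

def features(image):
    """Hypothetical feature extractor (here just mean/std of pixel values)."""
    return np.array([image.mean(), image.std()])

first_image = np.random.rand(64, 64) + 0.3   # later capture, known abnormal
second_image = np.random.rand(64, 64)        # earlier capture of the same target

# Learning data generation (means 83): the earlier image inherits the abnormal label.
learning_data = [Sample(features(first_image), 1),
                 Sample(features(second_image), 1)]

# Learning (means 84): a simple 'discrimination dictionary' as class-mean features.
dictionary = {1: np.mean([s.features for s in learning_data if s.label == 1], axis=0)}
print(dictionary)
```
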
  • Publication number: 20200311894
    Abstract: An anomaly detection apparatus 100 includes an image transformation unit 103 that calculates an image transformation parameter, based on an inspection image in which an inspection object appears, a reference image indicating a normal state of the inspection object, and a parameter for image transformation parameter calculation, and performs image transformation on the inspection image using the image transformation parameter; an image change detection unit 104 that collates the reference image and the image-transformed inspection image using a change detection parameter, and calculates an anomaly certainty factor indicating whether there is a change in a specific region of the inspection image; a change detection parameter learning unit 106 that learns the change detection parameter, based on a difference between a training image indicating a correct answer value of the change and the anomaly certainty factor; and an image transformation parameter learning unit 108 that learns the parameter for image transformation parameter calculation.
    Type: Application
    Filed: September 29, 2017
    Publication date: October 1, 2020
    Applicant: NEC CORPORATION
    Inventors: Katsuhiko TAKAHASHI, Yuichi NAKATANI
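
The pipeline above alternates image alignment (unit 103) with change detection against a reference image (unit 104), with both stages driven by learned parameters (units 106 and 108). The sketch below shows only the align-then-compare loop with hand-written stand-ins: a brute-force integer shift search for alignment and an absolute difference as the anomaly certainty factor; the learning of the parameters is not modelled.

```python
# Rough sketch of the inspect-align-compare loop described above.
import numpy as np

def align(inspection, reference, max_shift=3):
    """Stand-in for image transformation unit 103: find the integer shift that
    best matches the reference, then apply it to the inspection image."""
    best, best_err = (0, 0), np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(inspection, (dy, dx), axis=(0, 1))
            err = np.mean((shifted - reference) ** 2)
            if err < best_err:
                best, best_err = (dy, dx), err
    return np.roll(inspection, best, axis=(0, 1))

def anomaly_certainty(inspection, reference):
    """Stand-in for image change detection unit 104: per-pixel change score."""
    return np.abs(align(inspection, reference) - reference)

reference = np.zeros((32, 32)); reference[8:12, 8:12] = 1.0   # normal pattern
inspection = np.roll(reference, (2, 1), axis=(0, 1))          # same scene, camera shifted
inspection[20:24, 20:24] = 1.0                                # plus a defect
print(anomaly_certainty(inspection, reference).max())         # 1.0, at the defect only
```
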
  • Publication number: 20200286259
    Abstract: An information processing apparatus (2000) includes a determination unit (2020) and a notification unit (2040). The determination unit (2020) determines whether a field of view of a second camera is correct, on the basis of a first captured image (40), a second captured image (50), and relationship information (information indicating the relationship to be satisfied between a field of view of a first camera and the field of view of the second camera). In a case in which the field of view of the second camera is not correct, the notification unit (2040) notifies that the field of view of the second camera is not correct. The first camera is provided in a head-mounted display worn by a person. The second camera is provided in a part other than the head-mounted display.
    Type: Application
    Filed: November 21, 2016
    Publication date: September 10, 2020
    Applicant: NEC Corporation
    Inventors: Ryoma OAMI, Katsuhiko TAKAHASHI, Yuusuke KONISHI, Hiroshi YAMADA, Hiroo IKEDA, Junko NAKAGAWA, Kosuke YOSHIMI, Ryo KAWAI, Takuya OGAWA
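
The determination unit above checks whether the second (body-worn) camera's field of view satisfies a required relationship with the first (head-mounted) camera's field of view, and raises a notification when it does not. In the sketch below the "relationship information" is reduced to a minimum correlation between the two captured images; the score and threshold are invented stand-ins, not the patent's actual criterion.

```python
# Toy sketch of the determination/notification units above.
import numpy as np

def overlap_score(img_a, img_b):
    """Very rough proxy for shared field of view: correlation of the two images."""
    a = (img_a - img_a.mean()) / (img_a.std() + 1e-8)
    b = (img_b - img_b.mean()) / (img_b.std() + 1e-8)
    return float(np.mean(a * b))

def check_second_camera(first_img, second_img, min_overlap=0.3):
    """Stand-in for determination unit 2020: is the second camera's view 'correct'?"""
    return overlap_score(first_img, second_img) >= min_overlap

scene = np.random.rand(48, 48)
first_img = scene + 0.05 * np.random.rand(48, 48)   # head-mounted camera
second_img = np.random.rand(48, 48)                 # body-worn camera, pointing elsewhere
if not check_second_camera(first_img, second_img):  # notification unit 2040
    print("Warning: field of view of the second camera is not correct")
```
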
  • Patent number: 10764503
    Abstract: An information processing apparatus (2000) includes a determination unit (2020) and a notification unit (2040). The determination unit (2020) determines whether a field of view of a second camera is correct, on the basis of a first captured image (40), a second captured image (50), and relationship information (information indicating the relationship to be satisfied between a field of view of a first camera and the field of view of the second camera). In a case in which the field of view of the second camera is not correct, the notification unit (2040) notifies that the field of view of the second camera is not correct. The first camera is provided in a head-mounted display worn by a person. The second camera is provided in a part other than the head-mounted display.
    Type: Grant
    Filed: April 26, 2019
    Date of Patent: September 1, 2020
    Assignee: NEC CORPORATION
    Inventors: Ryoma Oami, Katsuhiko Takahashi, Yuusuke Konishi, Hiroshi Yamada, Hiroo Ikeda, Junko Nakagawa, Kosuke Yoshimi, Ryo Kawai, Takuya Ogawa
  • Publication number: 20200242391
    Abstract: An object detection apparatus 100 is provided with: a fish-eye image acquisition unit 10 configured to acquire a time series fish-eye image; a horizontal panorama image generation unit 20 configured to, for each frame, perform conversion to a horizontal panorama image in which a vertical direction in a real space is expressed in a perpendicular direction of the frame, and an azimuth is expressed equiangularly in a horizontal direction of the frame; an edge pair extraction unit 30 configured to extract a pair of edges in the perpendicular direction from the horizontal panorama image; a change rate extraction unit 40 configured to extract a change rate of an inter-edge distance between the pair of edges; a lower end region extraction unit 50 configured to extract a region of a lower end of an object providing the pair of edges; a distance change rate extraction unit 60 configured to calculate a distance from the object to the fish-eye camera based on the position of the region of the lower end of the object in
    Type: Application
    Filed: October 6, 2017
    Publication date: July 30, 2020
    Applicant: NEC CORPORATION
    Inventor: Katsuhiko TAKAHASHI
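
The first step above, converting the time-series fish-eye image into a horizontal panorama in which image columns correspond to azimuth and image rows to the vertical direction, is the most concrete part of the pipeline. The sketch below shows one plausible remapping under an assumed equidistant fish-eye model with the optical axis pointing straight down; the later stages (edge pairs, change rates, distance estimation) are omitted.

```python
# Sketch of a fish-eye-to-horizontal-panorama remapping (cf. unit 20 above),
# assuming an equidistant fish-eye camera looking straight down. Sizes and the
# camera model are assumptions.
import numpy as np

def fisheye_to_panorama(fisheye, pano_w=360, pano_h=100):
    """Each panorama column is one degree of azimuth; each row is a radial step
    from the image centre, which stands in for the vertical direction here."""
    h, w = fisheye.shape
    cy, cx = h / 2.0, w / 2.0
    max_r = min(cy, cx) - 1
    pano = np.zeros((pano_h, pano_w), dtype=fisheye.dtype)
    for col in range(pano_w):
        theta = np.deg2rad(col)                  # azimuth, equiangular in width
        for row in range(pano_h):
            r = max_r * row / (pano_h - 1)       # radius ~ vertical direction
            y, x = int(cy + r * np.sin(theta)), int(cx + r * np.cos(theta))
            pano[row, col] = fisheye[y, x]
    return pano

print(fisheye_to_panorama(np.random.rand(200, 200)).shape)   # (100, 360)
```
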
  • Publication number: 20200228756
    Abstract: A surveillance system (1) includes an area information acquisition unit (101), a request information provision unit (102), and a participation consent reception unit (103). The area information acquisition unit (101) acquires information of a surveillance-desired area. The request information provision unit (102) provides participation request information for surveillance that is conducted by using an image capturing unit of a portable terminal in the surveillance-desired area. The participation consent reception unit (103) receives participation consent as a response to the participation request information, from the portable terminal.
    Type: Application
    Filed: August 16, 2016
    Publication date: July 16, 2020
    Applicant: NEC CORPORATION
    Inventors: Ryoma OAMI, Katsuhiko TAKAHASHI, Yuusuke KONISHI, Hiroo IKEDA, Junko NAKAGAWA, Kosuke YOSHIMI, Yukie EBIYAMA, Ryo KAWAI, Takuya OGAWA
  • Patent number: 10699422
    Abstract: An information processing apparatus (2000) includes a first analyzing unit (2020), a second analyzing unit (2040), and an estimating unit (2060). The first analyzing unit (2020) calculates a flow of a crowd in a capturing range of a fixed camera (10) using a first surveillance image (12). The second analyzing unit (2040) calculates a distribution of an attribute of objects in a capturing range of a moving camera (20) using a second surveillance image (22). The estimating unit (2060) estimates an attribute distribution for a range that is not included in the capturing range of the moving camera (20).
    Type: Grant
    Filed: January 26, 2017
    Date of Patent: June 30, 2020
    Assignee: NEC CORPORATION
    Inventors: Ryoma Oami, Katsuhiko Takahashi, Yusuke Konishi, Hiroshi Yamada, Hiroo Ikeda, Junko Nakagawa, Kosuke Yoshimi, Ryo Kawai, Takuya Ogawa
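
The estimating unit above fills in attribute information for areas the moving camera cannot see, using the crowd flow measured by the fixed camera. The sketch below reduces this to one dimension: attribute ratios observed in a few grid cells are pushed downstream along the flow into unobserved cells. The grid, the flow model and the attribute are all invented for illustration.

```python
# Highly simplified 1-D sketch of the estimation idea above.
import numpy as np

flow = np.array([+1, +1, +1, +1])        # unit 2020: people drift one cell to the right
observed_ratio = {0: 0.8, 1: 0.6}        # unit 2040: attribute ratio in cells 0 and 1

def estimate_attribute(observed, flow, n_cells=5):
    """Stand-in for unit 2060: push each observed ratio downstream along the flow."""
    estimate = dict(observed)
    for cell in sorted(observed):
        downstream = cell + int(flow[cell])
        if downstream not in estimate and 0 <= downstream < n_cells:
            estimate[downstream] = observed[cell]
    return estimate

print(estimate_attribute(observed_ratio, flow))   # cell 2 inherits the ratio from cell 1
```
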
  • Patent number: 10679078
    Abstract: The present invention is directed to a helmet wearing determination system including an imaging means that is installed in a predetermined position and images a two-wheel vehicle that travels on a road; and a helmet wearing determination means that processes an image captured by the imaging means, estimates a rider head region corresponding to the head of a person riding on the two-wheel vehicle, compares image characteristics of the rider head region with image characteristics of a head when a helmet is worn and/or when a helmet is not worn, and determines whether or not the rider is wearing a helmet.
    Type: Grant
    Filed: April 5, 2019
    Date of Patent: June 9, 2020
    Assignee: NEC Corporation
    Inventor: Katsuhiko Takahashi
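
The determination step above boils down to: locate the rider's head region in the image and compare its appearance against what a helmeted head and a bare head look like. The sketch below assumes the rider bounding box and the reference histograms are already available and uses a simple intensity-histogram match; none of these choices are taken from the patent.

```python
# Minimal sketch of the comparison step above: take the top of a (given) rider
# bounding box as the head region and match its histogram against helmet /
# no-helmet references. Vehicle detection and the references are assumed to exist.
import numpy as np

def head_region(frame, rider_box):
    """Estimate the head region as the top 25% of the rider bounding box (assumption)."""
    x, y, w, h = rider_box
    return frame[y:y + h // 4, x:x + w]

def histogram(patch, bins=16):
    hist, _ = np.histogram(patch, bins=bins, range=(0.0, 1.0), density=True)
    return hist

def wears_helmet(frame, rider_box, helmet_hist, no_helmet_hist):
    """Compare head-region characteristics with helmet / bare-head references."""
    h = histogram(head_region(frame, rider_box))
    return np.abs(h - helmet_hist).sum() < np.abs(h - no_helmet_hist).sum()

frame = np.random.rand(240, 320)
print(wears_helmet(frame, (100, 40, 60, 160),
                   helmet_hist=np.full(16, 1.0), no_helmet_hist=np.linspace(0, 2, 16)))
```
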
  • Publication number: 20200065982
    Abstract: An information processing apparatus (2000) includes a first analyzing unit (2020), a second analyzing unit (2040), and an estimating unit (2060). The first analyzing unit (2020) calculates a flow of a crowd in a capturing range of a fixed camera (10) using a first surveillance image (12). The second analyzing unit (2040) calculates a distribution of an attribute of objects in a capturing range of a moving camera (20) using a second surveillance image (22). The estimating unit (2060) estimates an attribute distribution for a range that is not included in the capturing range of the moving camera (20).
    Type: Application
    Filed: June 18, 2019
    Publication date: February 27, 2020
    Applicant: NEC CORPORATION
    Inventors: Ryoma OAMI, Katsuhiko TAKAHASHI, Yusuke KONISHI, Hiroshi YAMADA, Hiroo IKEDA, Junko NAKAGAWA, Kosuke YOSHIMI, Ryo KAWAI, Takuya OGAWA
  • Publication number: 20190333234
    Abstract: An information processing apparatus (2000) includes a first analyzing unit (2020), a second analyzing unit (2040), and an estimating unit (2060). The first analyzing unit (2020) calculates a flow of a crowd in a capturing range of a fixed camera (10) using a first surveillance image (12). The second analyzing unit (2040) calculates a distribution of an attribute of objects in a capturing range of a moving camera (20) using a second surveillance image (22). The estimating unit (2060) estimates an attribute distribution for a range that is not included in the capturing range of the moving camera (20).
    Type: Application
    Filed: June 18, 2019
    Publication date: October 31, 2019
    Applicant: NEC CORPORATION
    Inventors: Ryoma OAMI, Katsuhiko TAKAHASHI, Yusuke KONISHI, Hiroshi YAMADA, Hiroo IKEDA, Junko NAKAGAWA, Kosuke YOSHIMI, Ryo KAWAI, Takuya OGAWA
  • Publication number: 20190325589
    Abstract: An information processing apparatus (2000) includes a first analyzing unit (2020), a second analyzing unit (2040), and an estimating unit (2060). The first analyzing unit (2020) calculates a flow of a crowd in a capturing range of a fixed camera (10) using a first surveillance image (12). The second analyzing unit (2040) calculates a distribution of an attribute of objects in a capturing range of a moving camera (20) using a second surveillance image (22). The estimating unit (2060) estimates an attribute distribution for a range that is not included in the capturing range of the moving camera (20).
    Type: Application
    Filed: June 18, 2019
    Publication date: October 24, 2019
    Applicant: NEC CORPORATION
    Inventors: Ryoma OAMI, Katsuhiko TAKAHASHI, Yusuke KONISHI, Hiroshi YAMADA, Hiroo IKEDA, Junko NAKAGAWA, Kosuke YOSHIMI, Ryo KAWAI, Takuya OGAWA
  • Publication number: 20190253636
    Abstract: An information processing apparatus (2000) includes a determination unit (2020) and a notification unit (2040). The determination unit (2020) determines whether a field of view of a second camera is correct, on the basis of a first captured image (40), a second captured image (50), and relationship information (information indicating the relationship to be satisfied between a field of view of a first camera and the field of view of the second camera). In a case in which the field of view of the second camera is not correct, the notification unit (2040) notifies that the field of view of the second camera is not correct. The first camera is provided in a head-mounted display worn by a person. The second camera is provided in a part other than the head-mounted display.
    Type: Application
    Filed: April 26, 2019
    Publication date: August 15, 2019
    Applicant: NEC CORPORATION
    Inventors: Ryoma Oami, Katsuhiko Takahashi, Yuusuke Konishi, Hiroshi Yamada, Hiroo Ikeda, Junko Nakagawa, Kosuke Yoshimi, Ryo Kawai, Takuya Ogawa
  • Publication number: 20190253618
    Abstract: An information processing apparatus (2000) includes a determination unit (2020) and a notification unit (2040). The determination unit (2020) determines whether a field of view of a second camera is correct, on the basis of a first captured image (40), a second captured image (50), and relationship information (information indicating the relationship to be satisfied between a field of view of a first camera and the field of view of the second camera). In a case in which the field of view of the second camera is not correct, the notification unit (2040) notifies that the field of view of the second camera is not correct. The first camera is provided in a head-mounted display worn by a person. The second camera is provided in a part other than the head-mounted display.
    Type: Application
    Filed: April 26, 2019
    Publication date: August 15, 2019
    Applicant: NEC CORPORATION
    Inventors: Ryoma OAMI, Katsuhiko TAKAHASHI, Yuusuke KONISHI, Hiroshi YAMADA, Hiroo IKEDA, Junko NAKAGAWA, Kosuke YOSHIMI, Ryo KAWAI, Takuya OGAWA
  • Publication number: 20190251346
    Abstract: An information processing apparatus (2000) includes a determination unit (2020) and a notification unit (2040). The determination unit (2020) determines whether a field of view of a second camera is correct, on the basis of a first captured image (40), a second captured image (50), and relationship information (information indicating the relationship to be satisfied between a field of view of a first camera and the field of view of the second camera). In a case in which the field of view of the second camera is not correct, the notification unit (2040) notifies that the field of view of the second camera is not correct. The first camera is provided in a head-mounted display worn by a person. The second camera is provided in a part other than the head-mounted display.
    Type: Application
    Filed: April 26, 2019
    Publication date: August 15, 2019
    Applicant: NEC CORPORATION
    Inventors: Ryoma OAMI, Katsuhiko TAKAHASHI, Yuusuke KONISHI, Hiroshi YAMADA, Hiroo IKEDA, Junko NAKAGAWA, Kosuke YOSHIMI, Ryo KAWAI, Takuya OGAWA
  • Publication number: 20190236384
    Abstract: The present invention is directed to a helmet wearing determination system including an imaging means that is installed in a predetermined position and images a two-wheel vehicle that travels on a road; and a helmet wearing determination means that processes an image captured by the imaging means, estimates a rider head region corresponding to the head of a person riding on the two-wheel vehicle, compares image characteristics of the rider head region with image characteristics of a head when a helmet is worn and/or when a helmet is not worn, and determines whether or not the rider is wearing a helmet.
    Type: Application
    Filed: April 5, 2019
    Publication date: August 1, 2019
    Inventor: Katsuhiko TAKAHASHI
  • Publication number: 20190215492
    Abstract: A surveillance system (1) includes an area information acquisition unit (101), a position information acquisition unit (102), a candidate determination unit (103), and a notification unit (104). The area information acquisition unit (101) acquires information of a surveillance-desired area. The position information acquisition unit (102) acquires pieces of position information of a plurality of portable terminals (20), each portable terminal performing surveillance using an image capturing unit. The candidate determination unit (103) determines a candidate portable terminal (20) to be moved to the surveillance-desired area from among the plurality of portable terminals (20) based on the acquired pieces of position information of the plurality of portable terminals (20). The notification unit (104) outputs a notification to the candidate portable terminal requesting to move to the surveillance-desired area.
    Type: Application
    Filed: March 13, 2019
    Publication date: July 11, 2019
    Applicant: NEC CORPORATION
    Inventors: Ryoma OAMI, Katsuhiko TAKAHASHI, Yuusuke KONISHI, Hiroo IKEDA, Junko NAKAGAWA, Kosuke YOSHIMI, Yukie EBIYAMA, Ryo KAWAI, Takuya OGAWA
  • Publication number: 20190215494
    Abstract: A surveillance system (1) includes an area information acquisition unit (101), a request information provision unit (102), and a participation consent reception unit (103). The area information acquisition unit (101) acquires information of a surveillance-desired area. The request information provision unit (102) provides participation request information for surveillance that is conducted by using an image capturing unit of a portable terminal in the surveillance-desired area. The participation consent reception unit (103) receives participation consent as a response to the participation request information, from the portable terminal.
    Type: Application
    Filed: March 13, 2019
    Publication date: July 11, 2019
    Applicant: NEC CORPORATION
    Inventors: Ryoma OAMI, Katsuhiko TAKAHASHI, Yuusuke KONISHI, Hiroo IKEDA, Junko NAKAGAWA, Kosuke YOSHIMI, Yukie EBIYAMA, Ryo KAWAI, Takuya OGAWA