Patents by Inventor Katsuhiko Takahashi
Katsuhiko Takahashi has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 10931923
Abstract: A surveillance system (1) includes an area information acquisition unit (101), a request information provision unit (102), and a participation consent reception unit (103). The area information acquisition unit (101) acquires information of a surveillance-desired area. The request information provision unit (102) provides participation request information for surveillance that is conducted by using an image capturing unit of a portable terminal in the surveillance-desired area. The participation consent reception unit (103) receives participation consent as a response to the participation request information, from the portable terminal.
Type: Grant
Filed: March 13, 2019
Date of Patent: February 23, 2021
Assignee: NEC CORPORATION
Inventors: Ryoma Oami, Katsuhiko Takahashi, Yuusuke Konishi, Hiroo Ikeda, Junko Nakagawa, Kosuke Yoshimi, Yukie Ebiyama, Ryo Kawai, Takuya Ogawa
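The three-unit architecture in this abstract can be sketched as a single coordinator class. This is an illustrative toy, not the patented implementation; the class name, fields, and string identifiers are all hypothetical.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional

@dataclass
class SurveillanceCoordinator:
    # State shared by the three units described in the abstract
    surveillance_area: Optional[str] = None
    pending_requests: Dict[str, dict] = field(default_factory=dict)
    consents: List[str] = field(default_factory=list)

    def acquire_area(self, area_id: str) -> None:
        """Area information acquisition unit (101): record the surveillance-desired area."""
        self.surveillance_area = area_id

    def provide_request(self, terminal_id: str) -> dict:
        """Request information provision unit (102): issue a participation request
        naming the area where the terminal's image capturing unit would be used."""
        request = {"terminal": terminal_id, "area": self.surveillance_area}
        self.pending_requests[terminal_id] = request
        return request

    def receive_consent(self, terminal_id: str, accepted: bool) -> bool:
        """Participation consent reception unit (103): record a terminal's response
        to a previously issued participation request."""
        if accepted and terminal_id in self.pending_requests:
            self.consents.append(terminal_id)
            return True
        return False
```

A terminal that consents without ever having received a request is ignored, mirroring the abstract's framing of consent as a response to the request information.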
-
Publication number: 20210027110
Abstract: The first parameter generation unit 811 generates a first parameter, which is a parameter of a first recognizer, using first learning data including a combination of data to be recognized, a correct label of the data, and domain information indicating a collection environment of the data. The second parameter generation unit 812 generates a second parameter, which is a parameter of a second recognizer, using second learning data including a combination of data to be recognized that is collected in a predetermined collection environment, a correct label of the data, and target domain information indicating the predetermined collection environment, based on the first parameter. The third parameter generation unit 813 integrates the first parameter and the second parameter to generate a third parameter to be used for pattern recognition of input data by learning using the first learning data.
Type: Application
Filed: May 10, 2018
Publication date: January 28, 2021
Applicant: NEC Corporation
Inventors: Katsuhiko TAKAHASHI, Hiroyoshi MIYANO, Tetsuo INOSHITA
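The integration step performed by the third parameter generation unit can be illustrated on flat weight vectors. The elementwise weighted average below is an assumption chosen for the sketch; the publication does not commit to a specific integration rule.

```python
def integrate_parameters(first, second, weight=0.5):
    """Third parameter generation unit (813), sketched: combine the general
    recognizer parameters (first) with the target-domain recognizer
    parameters (second) into a third parameter. The elementwise weighted
    average is an illustrative assumption, not the patented rule."""
    if len(first) != len(second):
        raise ValueError("parameter vectors must have the same length")
    return [(1.0 - weight) * f + weight * s for f, s in zip(first, second)]
```

Raising `weight` pulls the integrated recognizer toward the target collection environment, while `weight=0` reduces it to the general recognizer.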
-
Patent number: 10887561
Abstract: A surveillance system (1) includes an area information acquisition unit (101), a position information acquisition unit (102), a candidate determination unit (103), and a notification unit (104). The area information acquisition unit (101) acquires information of a surveillance-desired area. The position information acquisition unit (102) acquires pieces of position information of a plurality of portable terminals (20), each portable terminal performing surveillance using an image capturing unit. The candidate determination unit (103) determines a candidate portable terminal (20) to be moved to the surveillance-desired area from among the plurality of portable terminals (20) based on the acquired pieces of position information of the plurality of portable terminals (20). The notification unit (104) outputs a notification to the candidate portable terminal requesting to move to the surveillance-desired area.
Type: Grant
Filed: March 13, 2019
Date of Patent: January 5, 2021
Assignee: NEC CORPORATION
Inventors: Ryoma Oami, Katsuhiko Takahashi, Yuusuke Konishi, Hiroo Ikeda, Junko Nakagawa, Kosuke Yoshimi, Yukie Ebiyama, Ryo Kawai, Takuya Ogawa
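The candidate determination step can be sketched as choosing the terminal nearest the surveillance-desired area. "Nearest" is a simplifying assumption for the sketch; the patent only states that the choice is based on the terminals' position information.

```python
import math

def choose_candidate(terminal_positions, area_center):
    """Candidate determination unit (103), sketched: pick the portable
    terminal closest to the surveillance-desired area.

    terminal_positions: mapping of terminal id -> (x, y) position
    area_center: (x, y) centre of the surveillance-desired area
    """
    return min(terminal_positions,
               key=lambda tid: math.dist(terminal_positions[tid], area_center))
```

The notification unit would then message the returned terminal id, asking it to move into the area.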
-
Publication number: 20200334801
Abstract: In the present invention, a first image acquisition means 81 acquires a first image of an inspection target including an abnormal part. A second image acquisition means 82 acquires a second image of the inspection target captured earlier than the time when the first image is captured. A learning data generation means 83 generates learning data indicating that the second image includes an abnormal part. A learning means 84 learns a discrimination dictionary by using the learning data generated by the learning data generation means 83.
Type: Application
Filed: December 6, 2017
Publication date: October 22, 2020
Applicant: NEC Corporation
Inventors: Katsuhiko TAKAHASHI, Takashi SHIBATA
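The data-generation idea, labeling the earlier image of the same target as abnormal because the defect was presumably already present, can be sketched as follows. Images are reduced to opaque scalar "features" purely for illustration; the averaging "dictionary" is a toy stand-in, not the patented learning method.

```python
def generate_learning_data(image_pairs):
    """Learning data generation means (83), sketched: each pair holds a
    first (later) image known to show an abnormal part and a second image
    of the same inspection target captured earlier. The earlier image is
    emitted as a new positive (abnormal, label 1) training example."""
    return [(second, 1) for _first, second in image_pairs]

def learn_dictionary(learning_data):
    """Toy stand-in for learning the discrimination dictionary (means 84):
    average the scalar 'features' of the abnormal examples."""
    abnormal = [feat for feat, label in learning_data if label == 1]
    return sum(abnormal) / len(abnormal)
```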
-
Publication number: 20200311894
Abstract: An anomaly detection apparatus 100 includes an image transformation unit 103 that calculates an image transformation parameter, based on an inspection image in which an inspection object appears, a reference image indicating a normal state of the inspection object and a parameter for image transformation parameter calculation, and performs image transformation on the inspection image using the image transformation parameter, an image change detection unit 104 that collates the reference image and the image-transformed inspection image using a change detection parameter, and calculates an anomaly certainty factor indicating whether there is a change in a specific region of the inspection image, a change detection parameter learning unit 106 that learns the change detection parameter, based on a difference between a training image indicating a correct answer value of the change and the anomaly certainty factor, and an image transformation parameter learning unit 108 that learns the parameter for image transformation parameter calculation.
Type: Application
Filed: September 29, 2017
Publication date: October 1, 2020
Applicant: NEC CORPORATION
Inventors: Katsuhiko TAKAHASHI, Yuichi NAKATANI
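The transform-then-compare pipeline can be illustrated on 1-D "images". The circular shift standing in for the learned image transformation and the threshold standing in for the change detection parameter are both illustrative assumptions; in the publication both are learned.

```python
def transform_image(image, shift):
    """Image transformation unit (103), toy version: align a 1-D 'image'
    with the reference by a circular shift, standing in for the learned
    image transformation parameter."""
    shift %= len(image)
    return image[-shift:] + image[:-shift]

def anomaly_certainty(reference, inspection, shift, threshold):
    """Image change detection unit (104), toy version: transform the
    inspection image, then flag each position whose difference from the
    reference exceeds the change detection parameter (threshold)."""
    aligned = transform_image(inspection, shift)
    return [1.0 if abs(r - a) > threshold else 0.0
            for r, a in zip(reference, aligned)]
```

A training loop in the spirit of units 106 and 108 would compare this certainty vector against a ground-truth change map and adjust `shift` and `threshold` accordingly.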
-
Publication number: 20200286259
Abstract: An information processing apparatus (2000) includes a determination unit (2020) and a notification unit (2040). The determination unit (2020) determines whether a field of view of a second camera is correct, on the basis of a first captured image (40), a second captured image (50), and relationship information (information indicating the relationship to be satisfied between a field of view of a first camera and the field of view of the second camera). In a case in which the field of view of the second camera is not correct, the notification unit (2040) notifies that the field of view of the second camera is not correct. The first camera is provided in a head-mounted display worn by a person. The second camera is provided in a part other than the head-mounted display.
Type: Application
Filed: November 21, 2016
Publication date: September 10, 2020
Applicant: NEC Corporation
Inventors: Ryoma OAMI, Katsuhiko TAKAHASHI, Yuusuke KONISHI, Hiroshi YAMADA, Hiroo IKEDA, Junko NAKAGAWA, Kosuke YOSHIMI, Ryo KAWAI, Takuya OGAWA
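The determination-and-notification logic can be sketched in one dimension. Modeling the "relationship information" as a minimum angular overlap between the two fields of view is an assumption for the sketch; the publication leaves the relationship abstract.

```python
def fov_overlap(fov_a, fov_b):
    """Overlap, in degrees, of two 1-D fields of view given as (start, end)."""
    return max(0.0, min(fov_a[1], fov_b[1]) - max(fov_a[0], fov_b[0]))

def check_second_camera(first_fov, second_fov, min_overlap):
    """Determination unit (2020), sketched: the relationship information is
    modeled as a minimum required overlap between the head-mounted first
    camera's field of view and the second camera's. Returns a notification
    string (mimicking the notification unit, 2040) when the second camera's
    field of view is not correct, else None."""
    if fov_overlap(first_fov, second_fov) < min_overlap:
        return "the field of view of the second camera is not correct"
    return None
```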
-
Patent number: 10764503
Abstract: An information processing apparatus (2000) includes a determination unit (2020) and a notification unit (2040). The determination unit (2020) determines whether a field of view of a second camera is correct, on the basis of a first captured image (40), a second captured image (50), and relationship information (information indicating the relationship to be satisfied between a field of view of a first camera and the field of view of the second camera). In a case in which the field of view of the second camera is not correct, the notification unit (2040) notifies that the field of view of the second camera is not correct. The first camera is provided in a head-mounted display worn by a person. The second camera is provided in a part other than the head-mounted display.
Type: Grant
Filed: April 26, 2019
Date of Patent: September 1, 2020
Assignee: NEC CORPORATION
Inventors: Ryoma Oami, Katsuhiko Takahashi, Yuusuke Konishi, Hiroshi Yamada, Hiroo Ikeda, Junko Nakagawa, Kosuke Yoshimi, Ryo Kawai, Takuya Ogawa
-
Publication number: 20200242391
Abstract: An object detection apparatus 100 is provided with: a fish-eye image acquisition unit 10 configured to acquire a time series fish-eye image; a horizontal panorama image generation unit 20 configured to, for each frame, perform conversion to a horizontal panorama image in which a vertical direction in a real space is expressed in a perpendicular direction of the frame, and an azimuth is expressed equiangularly in a horizontal direction of the frame; an edge pair extraction unit 30 configured to extract a pair of edges in the perpendicular direction from the horizontal panorama image; a change rate extraction unit 40 configured to extract a change rate of an inter-edge distance between the pair of edges; a lower end region extraction unit 50 configured to extract a region of a lower end of an object providing the pair of edges; a distance change rate extraction unit 60 configured to calculate a distance from the object to the fish-eye camera based on the position of the region of the lower end of the object in …
Type: Application
Filed: October 6, 2017
Publication date: July 30, 2020
Applicant: NEC CORPORATION
Inventor: Katsuhiko TAKAHASHI
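The panorama conversion step can be sketched as a per-pixel coordinate mapping. The equidistant fish-eye model assumed below (radial distance proportional to polar angle) is a common choice that matches the abstract's equiangular azimuth axis, but the publication does not name a specific lens model.

```python
import math

def fisheye_to_panorama(x, y, cx, cy, radius, pano_w, pano_h):
    """Horizontal panorama image generation unit (20), sketched: map a
    fish-eye pixel (x, y) to panorama coordinates. Columns cover azimuth
    0..2*pi equiangularly; rows cover the polar angle, so vertical lines
    in real space become perpendicular lines in the panorama.

    (cx, cy) is the fish-eye image centre, radius its image-circle radius.
    Assumes an equidistant fish-eye projection (an illustrative choice)."""
    dx, dy = x - cx, y - cy
    azimuth = math.atan2(dy, dx) % (2 * math.pi)
    polar = math.hypot(dx, dy) / radius * (math.pi / 2)
    col = int(azimuth / (2 * math.pi) * pano_w) % pano_w
    row = int(polar / (math.pi / 2) * (pano_h - 1))
    return col, row
```

Downstream units would then look for vertical edge pairs in the panorama and track the inter-edge distance over the time series to estimate object approach.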
-
Publication number: 20200228756
Abstract: A surveillance system (1) includes an area information acquisition unit (101), a request information provision unit (102), and a participation consent reception unit (103). The area information acquisition unit (101) acquires information of a surveillance-desired area. The request information provision unit (102) provides participation request information for surveillance that is conducted by using an image capturing unit of a portable terminal in the surveillance-desired area. The participation consent reception unit (103) receives participation consent as a response to the participation request information, from the portable terminal.
Type: Application
Filed: August 16, 2016
Publication date: July 16, 2020
Applicant: NEC CORPORATION
Inventors: Ryoma OAMI, Katsuhiko TAKAHASHI, Yuusuke KONISHI, Hiroo IKEDA, Junko NAKAGAWA, Kosuke YOSHIMI, Yukie EBIYAMA, Ryo KAWAI, Takuya OGAWA
-
Patent number: 10699422
Abstract: An information processing apparatus (2000) includes a first analyzing unit (2020), a second analyzing unit (2040), and an estimating unit (2060). The first analyzing unit (2020) calculates a flow of a crowd in a capturing range of a fixed camera (10) using a first surveillance image (12). The second analyzing unit (2040) calculates a distribution of an attribute of objects in a capturing range of a moving camera (20) using a second surveillance image (22). The estimating unit (2060) estimates an attribute distribution for a range that is not included in the capturing range of the moving camera (20).
Type: Grant
Filed: January 26, 2017
Date of Patent: June 30, 2020
Assignee: NEC CORPORATION
Inventors: Ryoma Oami, Katsuhiko Takahashi, Yusuke Konishi, Hiroshi Yamada, Hiroo Ikeda, Junko Nakagawa, Kosuke Yoshimi, Ryo Kawai, Takuya Ogawa
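The estimating unit's extrapolation can be sketched as proportional scaling: assume the attribute proportions observed by the moving camera also hold outside its capturing range, scaled by the crowd size the fixed camera's flow analysis assigns to that range. This proportional rule is an illustrative assumption, not the patented estimator.

```python
def estimate_attribute_distribution(observed_counts, crowd_size_unobserved):
    """Estimating unit (2060), sketched.

    observed_counts: attribute -> count measured inside the moving
        camera's capturing range (second analyzing unit, 2040)
    crowd_size_unobserved: crowd size the fixed camera's flow analysis
        (first analyzing unit, 2020) attributes to the unobserved range
    """
    total = sum(observed_counts.values())
    return {attr: count / total * crowd_size_unobserved
            for attr, count in observed_counts.items()}
```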
-
Patent number: 10679078
Abstract: The present invention is directed to a helmet wearing determination system including an imaging means that is installed in a predetermined position and images a two-wheel vehicle that travels on a road; and a helmet wearing determination means that processes an image imaged by the imaging means, estimates a rider head region corresponding to a head of a person who rides on the two-wheel vehicle that travels on the road, compares image characteristics of the rider head region with image characteristics according to the head at a time when a helmet is worn and/or at a time when a helmet is not worn, and determines whether or not the rider wears the helmet.
Type: Grant
Filed: April 5, 2019
Date of Patent: June 9, 2020
Assignee: NEC Corporation
Inventor: Katsuhiko Takahashi
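The comparison step can be sketched as nearest-template classification of the rider head region. The feature vectors and the squared-distance matching rule below are illustrative assumptions; the patent only requires that the head region's image characteristics be compared against helmet-worn and/or helmet-not-worn characteristics.

```python
def wears_helmet(head_features, helmet_template, bare_head_template):
    """Helmet wearing determination means, sketched: compare image
    characteristics of the estimated rider head region (here, a plain
    feature vector) against templates for a helmeted and a bare head,
    and report whichever is closer."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return sq_dist(head_features, helmet_template) < sq_dist(head_features, bare_head_template)
```

In practice the templates would be learned appearance models (for example color and shape statistics of helmets) rather than fixed vectors.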
-
Publication number: 20200065982
Abstract: An information processing apparatus (2000) includes a first analyzing unit (2020), a second analyzing unit (2040), and an estimating unit (2060). The first analyzing unit (2020) calculates a flow of a crowd in a capturing range of a fixed camera (10) using a first surveillance image (12). The second analyzing unit (2040) calculates a distribution of an attribute of objects in a capturing range of a moving camera (20) using a second surveillance image (22). The estimating unit (2060) estimates an attribute distribution for a range that is not included in the capturing range of the moving camera (20).
Type: Application
Filed: June 18, 2019
Publication date: February 27, 2020
Applicant: NEC CORPORATION
Inventors: Ryoma OAMI, Katsuhiko TAKAHASHI, Yusuke KONISHI, Hiroshi YAMADA, Hiroo IKEDA, Junko NAKAGAWA, Kosuke YOSHIMI, Ryo KAWAI, Takuya OGAWA
-
Publication number: 20190333234
Abstract: An information processing apparatus (2000) includes a first analyzing unit (2020), a second analyzing unit (2040), and an estimating unit (2060). The first analyzing unit (2020) calculates a flow of a crowd in a capturing range of a fixed camera (10) using a first surveillance image (12). The second analyzing unit (2040) calculates a distribution of an attribute of objects in a capturing range of a moving camera (20) using a second surveillance image (22). The estimating unit (2060) estimates an attribute distribution for a range that is not included in the capturing range of the moving camera (20).
Type: Application
Filed: June 18, 2019
Publication date: October 31, 2019
Applicant: NEC CORPORATION
Inventors: Ryoma OAMI, Katsuhiko TAKAHASHI, Yusuke KONISHI, Hiroshi YAMADA, Hiroo IKEDA, Junko NAKAGAWA, Kosuke YOSHIMI, Ryo KAWAI, Takuya OGAWA
-
Publication number: 20190325589
Abstract: An information processing apparatus (2000) includes a first analyzing unit (2020), a second analyzing unit (2040), and an estimating unit (2060). The first analyzing unit (2020) calculates a flow of a crowd in a capturing range of a fixed camera (10) using a first surveillance image (12). The second analyzing unit (2040) calculates a distribution of an attribute of objects in a capturing range of a moving camera (20) using a second surveillance image (22). The estimating unit (2060) estimates an attribute distribution for a range that is not included in the capturing range of the moving camera (20).
Type: Application
Filed: June 18, 2019
Publication date: October 24, 2019
Applicant: NEC CORPORATION
Inventors: Ryoma OAMI, Katsuhiko TAKAHASHI, Yusuke KONISHI, Hiroshi YAMADA, Hiroo IKEDA, Junko NAKAGAWA, Kosuke YOSHIMI, Ryo KAWAI, Takuya OGAWA
-
Publication number: 20190253636
Abstract: An information processing apparatus (2000) includes a determination unit (2020) and a notification unit (2040). The determination unit (2020) determines whether a field of view of a second camera is correct, on the basis of a first captured image (40), a second captured image (50), and relationship information (information indicating the relationship to be satisfied between a field of view of a first camera and the field of view of the second camera). In a case in which the field of view of the second camera is not correct, the notification unit (2040) notifies that the field of view of the second camera is not correct. The first camera is provided in a head-mounted display worn by a person. The second camera is provided in a part other than the head-mounted display.
Type: Application
Filed: April 26, 2019
Publication date: August 15, 2019
Applicant: NEC CORPORATION
Inventors: Ryoma Oami, Katsuhiko Takahashi, Yuusuke Konishi, Hiroshi Yamada, Hiroo Ikeda, Junko Nakagawa, Kosuke Yoshimi, Ryo Kawai, Takuya Ogawa
-
Publication number: 20190253618
Abstract: An information processing apparatus (2000) includes a determination unit (2020) and a notification unit (2040). The determination unit (2020) determines whether a field of view of a second camera is correct, on the basis of a first captured image (40), a second captured image (50), and relationship information (information indicating the relationship to be satisfied between a field of view of a first camera and the field of view of the second camera). In a case in which the field of view of the second camera is not correct, the notification unit (2040) notifies that the field of view of the second camera is not correct. The first camera is provided in a head-mounted display worn by a person. The second camera is provided in a part other than the head-mounted display.
Type: Application
Filed: April 26, 2019
Publication date: August 15, 2019
Applicant: NEC CORPORATION
Inventors: Ryoma OAMI, Katsuhiko TAKAHASHI, Yuusuke KONISHI, Hiroshi YAMADA, Hiroo IKEDA, Junko NAKAGAWA, Kosuke YOSHIMI, Ryo KAWAI, Takuya OGAWA
-
Publication number: 20190251346
Abstract: An information processing apparatus (2000) includes a determination unit (2020) and a notification unit (2040). The determination unit (2020) determines whether a field of view of a second camera is correct, on the basis of a first captured image (40), a second captured image (50), and relationship information (information indicating the relationship to be satisfied between a field of view of a first camera and the field of view of the second camera). In a case in which the field of view of the second camera is not correct, the notification unit (2040) notifies that the field of view of the second camera is not correct. The first camera is provided in a head-mounted display worn by a person. The second camera is provided in a part other than the head-mounted display.
Type: Application
Filed: April 26, 2019
Publication date: August 15, 2019
Applicant: NEC CORPORATION
Inventors: Ryoma OAMI, Katsuhiko Takahashi, Yuusuke Konishi, Hiroshi Yamada, Hiroo Ikeda, Junko Nakagawa, Kosuke Yoshimi, Ryo Kawai, Takuya Ogawa
-
Publication number: 20190236384
Abstract: The present invention is directed to a helmet wearing determination system including an imaging means that is installed in a predetermined position and images a two-wheel vehicle that travels on a road; and a helmet wearing determination means that processes an image imaged by the imaging means, estimates a rider head region corresponding to a head of a person who rides on the two-wheel vehicle that travels on the road, compares image characteristics of the rider head region with image characteristics according to the head at a time when a helmet is worn and/or at a time when a helmet is not worn, and determines whether or not the rider wears the helmet.
Type: Application
Filed: April 5, 2019
Publication date: August 1, 2019
Inventor: Katsuhiko TAKAHASHI
-
Publication number: 20190215492
Abstract: A surveillance system (1) includes an area information acquisition unit (101), a position information acquisition unit (102), a candidate determination unit (103), and a notification unit (104). The area information acquisition unit (101) acquires information of a surveillance-desired area. The position information acquisition unit (102) acquires pieces of position information of a plurality of portable terminals (20), each portable terminal performing surveillance using an image capturing unit. The candidate determination unit (103) determines a candidate portable terminal (20) to be moved to the surveillance-desired area from among the plurality of portable terminals (20) based on the acquired pieces of position information of the plurality of portable terminals (20). The notification unit (104) outputs a notification to the candidate portable terminal requesting to move to the surveillance-desired area.
Type: Application
Filed: March 13, 2019
Publication date: July 11, 2019
Applicant: NEC CORPORATION
Inventors: Ryoma OAMI, Katsuhiko Takahashi, Yuusuke Konishi, Hiroo Ikeda, Junko Nakagawa, Kosuke Yoshimi, Yukie Ebiyama, Ryo Kawai, Takuya Ogawa
-
Publication number: 20190215494
Abstract: A surveillance system (1) includes an area information acquisition unit (101), a request information provision unit (102), and a participation consent reception unit (103). The area information acquisition unit (101) acquires information of a surveillance-desired area. The request information provision unit (102) provides participation request information for surveillance that is conducted by using an image capturing unit of a portable terminal in the surveillance-desired area. The participation consent reception unit (103) receives participation consent as a response to the participation request information, from the portable terminal.
Type: Application
Filed: March 13, 2019
Publication date: July 11, 2019
Applicant: NEC CORPORATION
Inventors: Ryoma OAMI, Katsuhiko TAKAHASHI, Yuusuke KONISHI, Hiroo IKEDA, Junko NAKAGAWA, Kosuke YOSHIMI, Yukie EBIYAMA, Ryo KAWAI, Takuya OGAWA