Patents by Inventor Yasufumi Hirakawa

Yasufumi Hirakawa has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20190197847
    Abstract: A setting assistance device includes an acquisition unit, a calculation unit and a determination unit. The acquisition unit acquires coordinates designated by a user for an image capturing a three-dimensional space. The calculation unit calculates coordinates of a position located at a predetermined distance from a position of a part of the three-dimensional space relating to the acquired coordinates. The determination unit determines a region set for the acquired coordinates based on the calculated coordinates.
    Type: Application
    Filed: March 5, 2019
    Publication date: June 27, 2019
    Applicant: NEC CORPORATION
    Inventor: Yasufumi Hirakawa
  • Publication number: 20190156508
    Abstract: Provided is an analysis apparatus (10) including a person extraction unit (11) that analyzes video data to extract a person, a time calculation unit (12) that calculates a continuous appearance time period for which the extracted person has been continuously present in a predetermined area and a reappearance time interval until the extracted person reappears in the predetermined area for each extracted person, and an inference unit (13) that infers a characteristic of the extracted person on the basis of the continuous appearance time period and the reappearance time interval. (An illustrative sketch of this appearance-time calculation appears after this listing.)
    Type: Application
    Filed: January 28, 2019
    Publication date: May 23, 2019
    Applicant: NEC Corporation
    Inventors: Yasufumi HIRAKAWA, Jianquan LIU, Shoji NISHIMURA, Takuya ARAKI
  • Publication number: 20190156509
    Abstract: Provided is an analysis apparatus (10) including a person extraction unit (11) that analyzes video data to extract a person, a time calculation unit (12) that calculates a continuous appearance time period for which the extracted person has been continuously present in a predetermined area and a reappearance time interval until the extracted person reappears in the predetermined area for each extracted person, and an inference unit (13) that infers a characteristic of the extracted person on the basis of the continuous appearance time period and the reappearance time interval.
    Type: Application
    Filed: January 28, 2019
    Publication date: May 23, 2019
    Applicant: NEC Corporation
    Inventors: Yasufumi HIRAKAWA, Jianquan LIU, Shoji NISHIMURA, Takuya ARAKI
  • Publication number: 20190035106
    Abstract: Provided is an analysis apparatus (10) including a person extraction unit (11) that analyzes video data to extract a person, a time calculation unit (12) that calculates a continuous appearance time period for which the extracted person has been continuously present in a predetermined area and a reappearance time interval until the extracted person reappears in the predetermined area for each extracted person, and an inference unit (13) that infers a characteristic of the extracted person on the basis of the continuous appearance time period and the reappearance time interval.
    Type: Application
    Filed: February 13, 2017
    Publication date: January 31, 2019
    Applicant: NEC Corporation
    Inventors: Yasufumi HIRAKAWA, Jianquan LIU, Shoji NISHIMURA, Takuya ARAKI
  • Publication number: 20180350212
    Abstract: Conventional image surveillance technologies related to the present invention risked failing to detect an intrusion when the intrusion duration time was set too long, allowing an intruder to pass a warning line before the preset intrusion duration had elapsed. The intrusion detection device according to the present invention is provided with: a detection means that detects, in an image, the position at which an object has intruded into a specific region; and a control means that associates the intrusion position with a prescribed time period, wherein the detection means outputs an alert when the object, in the image, has stayed in the specific region for the prescribed time period or longer. (An illustrative sketch of this dwell-time rule appears after this listing.)
    Type: Application
    Filed: December 16, 2016
    Publication date: December 6, 2018
    Applicant: NEC CORPORATION
    Inventor: Yasufumi HIRAKAWA
  • Publication number: 20180330151
    Abstract: A data processing apparatus (1) of the present invention includes a unit that retrieves a predetermined subject from moving image data. The data processing apparatus includes a person extraction unit (10) that analyzes moving image data to be analyzed and extracts a person whose appearance frequency in the moving image data to be analyzed satisfies a predetermined condition among persons detected in the moving image data to be analyzed, and an output unit (20) that outputs information regarding the extracted person. (An illustrative sketch of this appearance-frequency filtering appears after this listing.)
    Type: Application
    Filed: October 25, 2016
    Publication date: November 15, 2018
    Applicant: NEC Corporation
    Inventors: Jianquan LIU, Shoji NISHIMURA, Takuya ARAKI, Yasufumi HIRAKAWA
  • Publication number: 20170013230
    Abstract: A video processing system includes: an object movement information acquiring means for detecting a moving object moving in a plurality of segment regions from video data obtained by shooting a monitoring target area, and acquiring movement segment region information as object movement information, the movement segment region information representing segment regions where the detected moving object has moved; an object movement information and video data storing means for storing the object movement information in association with the video data corresponding to the object movement information; a retrieval condition inputting means for inputting a sequence of the segment regions as a retrieval condition; and a video data retrieving means for retrieving the object movement information in accordance with the retrieval condition and outputting video data stored in association with the retrieved object movement information, the object movement information being stored by the object movement information and video data storing means. (An illustrative sketch of this sequence-based retrieval appears after this listing.)
    Type: Application
    Filed: February 5, 2015
    Publication date: January 12, 2017
    Applicant: NEC Corporation
    Inventor: Yasufumi HIRAKAWA
  • Patent number: 9495754
    Abstract: A person's region is detected from input video of a surveillance camera; a person's direction in the person's region is determined; the separability of person's clothes is determined to generate clothing segment separation information; furthermore, clothing features representing visual features of person's clothes in the person's region are extracted in consideration of the person's direction and the clothing segment separation information. The person's direction is determined based on a person's face direction, person's motion, and clothing symmetry. The clothing segment separation information is generated based on analysis information regarding a geometrical shape of the person's region and visual segment information representing person's clothing segments which are visible based on the person's region and background prior information.
    Type: Grant
    Filed: October 7, 2014
    Date of Patent: November 15, 2016
    Assignee: NEC CORPORATION
    Inventors: Ryoma Oami, Yasufumi Hirakawa, Yusuke Takahashi
  • Patent number: 9462160
    Abstract: The invention makes it possible to easily correct color tone that changes from region to region in an image, without preparing a color chart for each region where the color tone changes due to factors such as a change in the light source.
    Type: Grant
    Filed: November 19, 2013
    Date of Patent: October 4, 2016
    Assignee: NEC CORPORATION
    Inventor: Yasufumi Hirakawa
  • Publication number: 20150334267
    Abstract: The invention makes it possible to easily correct color tone that changes from region to region in an image, without preparing a color chart for each region where the color tone changes due to factors such as a change in the light source.
    Type: Application
    Filed: November 19, 2013
    Publication date: November 19, 2015
    Inventor: Yasufumi Hirakawa
  • Patent number: 8977005
    Abstract: Provided is a carried item region extraction device for accurately extracting a carried item region from an image. This carried item region extraction device has: a string region processing unit for extracting a string region including a string of a carried item from image information; and a carried item region processing unit for extracting a carried item region including a carried item from the image information on the basis of the string region.
    Type: Grant
    Filed: September 15, 2011
    Date of Patent: March 10, 2015
    Assignee: NEC Corporation
    Inventor: Yasufumi Hirakawa
  • Publication number: 20150023596
    Abstract: A person's region is detected from input video of a surveillance camera; a person's direction in the person's region is determined; the separability of person's clothes is determined to generate clothing segment separation information; furthermore, clothing features representing visual features of person's clothes in the person's region are extracted in consideration of the person's direction and the clothing segment separation information. The person's direction is determined based on a person's face direction, person's motion, and clothing symmetry. The clothing segment separation information is generated based on analysis information regarding a geometrical shape of the person's region and visual segment information representing person's clothing segments which are visible based on the person's region and background prior information.
    Type: Application
    Filed: October 7, 2014
    Publication date: January 22, 2015
    Applicant: NEC CORPORATION
    Inventors: Ryoma OAMI, Yasufumi HIRAKAWA, Yusuke TAKAHASHI
  • Patent number: 8913873
    Abstract: Provided is a content reproduction control system equipped with: a signal characterizing quantity sequence extraction means that, for each content item stored in a content group storage unit that stores more than one content item, extracts a signal characterizing quantity sequence which is the sequence of the signal characterizing quantities associated with the content positions on the time axis; a common section group detection means that detects, from the signal characterizing quantity sequence for each content item, a common section group for which the signal characterizing quantities for different content items are similar to each other; a content reproduction quality information collection means that, for each common section group, collects content reproduction quality information which indicates the reproduction quality of the content to which each common section of a common section group belongs; and a content reproduction control means that, when the content contained
    Type: Grant
    Filed: June 22, 2009
    Date of Patent: December 16, 2014
    Assignee: NEC Corporation
    Inventors: Kota Iwamoto, Ryoma Oami, Yuzo Senda, Takahiro Kimoto, Takami Sato, Yasufumi Hirakawa
  • Patent number: 8891880
    Abstract: A person's region is detected from input video of a surveillance camera; a person's direction in the person's region is determined; the separability of person's clothes is determined to generate clothing segment separation information; furthermore, clothing features representing visual features of person's clothes in the person's region are extracted in consideration of the person's direction and the clothing segment separation information. The person's direction is determined based on a person's face direction, person's motion, and clothing symmetry. The clothing segment separation information is generated based on analysis information regarding a geometrical shape of the person's region and visual segment information representing person's clothing segments which are visible based on the person's region and background prior information.
    Type: Grant
    Filed: October 13, 2010
    Date of Patent: November 18, 2014
    Assignee: NEC Corporation
    Inventors: Ryoma Oami, Yasufumi Hirakawa, Yusuke Takahashi
  • Patent number: 8879004
    Abstract: A high-quality content generating system provided with a feature amount extracting means for extracting the feature amounts of a plurality of pieces of content therefrom, a content grouping means for performing matching between the feature amounts of the plurality of pieces of content extracted by the feature amount extracting means, grouping the same content included in the plurality of pieces of content and the derived content produced by using the same content, and calculating same/derived content grouping information, and a high-quality content generating means for selecting pieces of content to be grouped by the same/derived content grouping information from the plurality of pieces of content and generating content with higher quality by using the selected content.
    Type: Grant
    Filed: June 16, 2009
    Date of Patent: November 4, 2014
    Assignee: NEC Corporation
    Inventors: Ryoma Oami, Kota Iwamoto, Takami Sato, Yasufumi Hirakawa, Yuzo Senda, Takahiro Kimoto
  • Patent number: 8819558
    Abstract: An edited section information acquisition unit generates edited section information and edited section association information based on edit history information. A pre-edit content/post-edit content correspondence relationship acquisition unit groups consecutive edited sections based on the edited section information of the edited section information acquisition unit and generates group information. An edited section information display screen generation unit generates screen information based on the edited section information and the edited section association information of the edited section information acquisition unit, the group information, inter-group association information, section/group association information of the pre-edit content/post-edit content correspondence relationship acquisition unit, and input information of an input unit. A provision unit displays the screen information obtained by the edited section information display screen generation unit.
    Type: Grant
    Filed: December 25, 2009
    Date of Patent: August 26, 2014
    Assignee: NEC Corporation
    Inventor: Yasufumi Hirakawa
  • Publication number: 20140193034
    Abstract: The present invention accurately detects an object from a video image in which large image distortion may occur in order to cover a wide field of view.
    Type: Application
    Filed: May 21, 2012
    Publication date: July 10, 2014
    Applicant: NEC CORPORATION
    Inventors: Ryoma Oami, Yasufumi Hirakawa
  • Patent number: 8655147
    Abstract: A system comprising: an extraction unit that extracts, from each content, a signal feature series, being a series of signal features caused to correspond to positions on a time axis of the content; a generator that detects mutual similarity sections, being sections in which the signal features for different contents are similar to each other, from said signal feature series for each content, and generates, as mutual similarity section link information, the content to which each mutual similarity section belongs, information for specifying the position of the mutual similarity section on the time axis of the content, and a link indicative of a correspondence relation of the mutual similarity sections; and a determination unit that determines a reproduction order of the content based upon a relationship of the positions of the mutual similarity sections on the time axis of the content that said mutual similarity section link information indicates.
    Type: Grant
    Filed: June 22, 2009
    Date of Patent: February 18, 2014
    Assignee: NEC Corporation
    Inventors: Kota Iwamoto, Ryoma Oami, Yuzo Senda, Takahiro Kimoto, Takami Sato, Yasufumi Hirakawa
  • Publication number: 20130182962
    Abstract: Provided is a carried item region extraction device for accurately extracting a carried item region from an image. This carried item region extraction device has: a string region processing unit for extracting a string region including a string of a carried item from image information; and a carried item region processing unit for extracting a carried item region including a carried item from the image information on the basis of the string region.
    Type: Application
    Filed: September 15, 2011
    Publication date: July 18, 2013
    Applicant: NEC CORPORATION
    Inventor: Yasufumi Hirakawa
  • Patent number: 8306992
    Abstract: A system for determining content topicality comprises a feature value extracting means which extracts feature values of a plurality of contents from the contents; a content grouping means which compares the feature values of the plurality of contents extracted by the feature value extracting means, obtains identical contents and derived contents created using the contents, both of which are included in the plurality of contents, and groups the identical/derived contents, thereby computing identical/derived content grouping information; and a topicality determining means which totals viewing frequencies of the contents determined to be the identical/derived contents from viewing history information and the identical/derived content grouping information relating to the plurality of contents, computes total viewing frequencies of each of the identical/derived contents, and determines the topicality of the identical/derived contents according to the total viewing frequency. (An illustrative sketch of this grouping and frequency totaling appears after this listing.)
    Type: Grant
    Filed: June 16, 2009
    Date of Patent: November 6, 2012
    Assignee: NEC Corporation
    Inventors: Ryoma Oami, Kota Iwamoto, Takami Sato, Yasufumi Hirakawa, Yuzo Senda, Takahiro Kimoto
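
Publication numbers 20190156508, 20190156509, and 20190035106 describe an analysis apparatus that computes, for each extracted person, a continuous appearance time period in a predetermined area and a reappearance time interval, and infers a characteristic of the person from those two values. The following is a minimal Python sketch of that bookkeeping, not the patented implementation; the detection timestamps, the 60-second gap assumed to end a continuous appearance, and the inferred labels are illustrative assumptions.

```python
from collections import defaultdict

# Hypothetical input: (timestamp_sec, person_id) pairs produced by a person
# extraction unit watching one predetermined area. All values are made up.
detections = [
    (0, "A"), (5, "A"), (10, "A"),           # person A present from 0 s to 10 s
    (300, "A"), (305, "A"),                  # person A reappears ~290 s later
    (0, "B"), (2, "B"), (4, "B"), (6, "B"),  # person B present only briefly
]

GAP_SEC = 60  # assumption: a gap longer than this ends a continuous appearance


def appearance_stats(detections):
    """Per person: continuous appearance periods and reappearance intervals."""
    by_person = defaultdict(list)
    for ts, pid in sorted(detections):
        by_person[pid].append(ts)

    stats = {}
    for pid, times in by_person.items():
        periods, intervals = [], []
        start = prev = times[0]
        for ts in times[1:]:
            if ts - prev > GAP_SEC:           # the person left and came back
                periods.append(prev - start)  # finished continuous appearance
                intervals.append(ts - prev)   # reappearance time interval
                start = ts
            prev = ts
        periods.append(prev - start)
        stats[pid] = {"periods": periods, "intervals": intervals}
    return stats


def infer_characteristic(stat):
    """Toy inference rule (assumed): long stays or quick returns are flagged."""
    if max(stat["periods"]) > 3600:
        return "possible staff or loiterer"
    if any(interval < 600 for interval in stat["intervals"]):
        return "frequent visitor"
    return "one-off visitor"


if __name__ == "__main__":
    for pid, stat in appearance_stats(detections).items():
        print(pid, stat, infer_characteristic(stat))
```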
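
Publication number 20180350212 associates the position at which an object intrudes into a specific region with a prescribed time period and raises an alert when the object stays in the region for that period or longer. The sketch below shows one possible form of that dwell-time rule; the entry zones, their time limits, and the position-to-zone mapping are assumptions made for the example.

```python
import math

# Assumption: the specific region has named entry zones, each mapped to its own
# prescribed dwell time in seconds. Zones and values are made up for the example.
PRESCRIBED_SECONDS = {"near_gate": 5.0, "far_side": 30.0}


def entry_zone(x, y):
    """Toy mapping from an intrusion position to an entry zone (assumed geometry)."""
    return "near_gate" if math.hypot(x, y) < 10.0 else "far_side"


class IntrusionMonitor:
    def __init__(self):
        self.entered = {}  # object_id -> (entry time, prescribed seconds)

    def update(self, object_id, x, y, t, inside):
        """Feed one tracked observation; return True when an alert should fire."""
        if not inside:
            self.entered.pop(object_id, None)  # object left the specific region
            return False
        if object_id not in self.entered:
            # The prescribed time period is chosen from the intrusion position.
            self.entered[object_id] = (t, PRESCRIBED_SECONDS[entry_zone(x, y)])
        entered_at, limit = self.entered[object_id]
        return t - entered_at >= limit


if __name__ == "__main__":
    monitor = IntrusionMonitor()
    for t in range(8):
        if monitor.update("obj-1", 3.0, 4.0, float(t), inside=True):
            print(f"alert: obj-1 stayed past the prescribed time (t={t} s)")
            break
```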
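
Publication number 20180330151 extracts persons whose appearance frequency in the analyzed moving image data satisfies a predetermined condition and outputs information about them. A small hypothetical sketch of that filter follows, with a made-up minimum-appearance threshold standing in for the predetermined condition.

```python
from collections import Counter

# Hypothetical per-frame person IDs detected in the moving image data to be analyzed.
frame_detections = [["p1"], ["p1", "p2"], ["p1"], ["p2"], ["p1"], ["p3"]]

MIN_APPEARANCES = 3  # assumed stand-in for the "predetermined condition"

counts = Counter(pid for frame in frame_detections for pid in frame)
extracted = {pid: n for pid, n in counts.items() if n >= MIN_APPEARANCES}

# Output unit: emit information regarding the extracted person(s).
print(extracted)  # -> {'p1': 4}
```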
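
Publication number 20170013230 stores, per moving object, the sequence of segment regions it has moved through, then retrieves the video data whose stored movement information matches a queried sequence of segment regions. The sketch below is an assumed in-memory version of that retrieval; the stored records, file names, and contiguous-subsequence matching rule are illustrative choices, not taken from the patent.

```python
# Assumed in-memory store: object movement information (the sequence of segment
# regions an object moved through) kept with a reference to the corresponding
# video data. Records and file names below are hypothetical.
movement_store = [
    {"regions": ["A", "B", "C", "D"], "video": "cam1_clip_001.mp4"},
    {"regions": ["C", "B", "A"],      "video": "cam1_clip_002.mp4"},
]


def contains_in_order(regions, query):
    """True if `query` appears in `regions` as a contiguous subsequence."""
    n = len(query)
    return any(regions[i:i + n] == query for i in range(len(regions) - n + 1))


def retrieve(query_regions):
    """Return video data stored in association with matching movement information."""
    return [record["video"] for record in movement_store
            if contains_in_order(record["regions"], query_regions)]


print(retrieve(["B", "C"]))  # -> ['cam1_clip_001.mp4']
```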
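
Patent number 8306992 groups identical and derived contents by matching extracted feature values, totals the viewing frequencies within each group, and determines topicality from the totals. The hypothetical sketch below uses cosine similarity over made-up feature vectors and a greedy grouping rule as stand-ins for the patented means; the threshold and data are assumptions.

```python
import numpy as np

# Hypothetical catalogue: per-content feature vectors and viewing frequencies.
contents = {
    "clip_a":          (np.array([0.90, 0.10, 0.00]), 120),
    "clip_a_reupload": (np.array([0.88, 0.12, 0.01]), 45),
    "clip_b":          (np.array([0.00, 0.20, 0.90]), 300),
}

SIM_THRESHOLD = 0.98  # assumed cut-off for treating contents as identical/derived


def cosine(u, v):
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))


# Greedy grouping: each content joins the first group whose representative it matches.
groups = []  # each group: {"rep": feature vector, "members": [...], "views": int}
for name, (vec, views) in contents.items():
    for group in groups:
        if cosine(group["rep"], vec) >= SIM_THRESHOLD:
            group["members"].append(name)
            group["views"] += views
            break
    else:
        groups.append({"rep": vec, "members": [name], "views": views})

# Topicality here is simply the total viewing frequency of each group (assumed).
for group in sorted(groups, key=lambda g: g["views"], reverse=True):
    print(group["members"], "total views:", group["views"])
```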