Patents by Inventor Kyoji Hirata

Kyoji Hirata has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 9588957
    Abstract: In object recognition using image information, countermeasures are taken to reduce the erroneous judgments that arise when, owing to a temporal change, the visual feature of an object image becomes highly similar to the visual feature stored in the dictionary data of another object. In a dictionary update method for a computer that updates dictionary data in which the visual feature of an object the user wishes to recognize is registered: when a plurality of data pieces have visual features similar to an input inquiry image, the visual features of those objects are judged to be similar and the pair of objects is accumulated in a similar object accumulation section; and when the accumulated objects satisfy a condition set in advance, the data carrying their visual features is judged to require updating, and an update is recommended to the user who registered the object. (A brief code sketch of this trigger follows the entry.)
    Type: Grant
    Filed: July 19, 2013
    Date of Patent: March 7, 2017
    Assignee: Biglobe, Inc.
    Inventors: Michitaro Miyata, Kyoji Hirata, Takeshi Kawasaki, Kazuya Furukawa
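A minimal Python sketch of the update trigger described in the abstract above: pairs of registered objects whose dictionary features both match the same inquiry image are accumulated, and once a pair reaches a preset count an update is recommended. The class name, threshold value, and object identifiers are illustrative assumptions, not details taken from the patent.

```python
from collections import Counter

# Illustrative threshold: recommend an update once a pair of registered
# objects has been confused this many times (an assumed parameter).
PAIR_THRESHOLD = 3

class SimilarObjectAccumulator:
    """Accumulates pairs of registered objects whose dictionary features
    both matched the same inquiry image, and flags pairs for update."""

    def __init__(self, threshold=PAIR_THRESHOLD):
        self.threshold = threshold
        self.pair_counts = Counter()

    def record_query(self, matched_object_ids):
        """Record every pair of objects that matched one inquiry image
        and return the pairs whose dictionary data should be refreshed."""
        ids = sorted(set(matched_object_ids))
        recommendations = []
        for i in range(len(ids)):
            for j in range(i + 1, len(ids)):
                pair = (ids[i], ids[j])
                self.pair_counts[pair] += 1
                if self.pair_counts[pair] >= self.threshold:
                    recommendations.append(pair)
        return recommendations

acc = SimilarObjectAccumulator()
acc.record_query(["mug_A", "mug_B"])          # confused once
acc.record_query(["mug_A", "mug_B"])          # twice
print(acc.record_query(["mug_A", "mug_B"]))   # third time -> [('mug_A', 'mug_B')]
```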
  • Patent number: 9459706
    Abstract: In an AR device, when display control of additional information is performed for an object, a display device acquires a picture including an object in real space. A pointing device outputs, when a user operation points at the object in real space, information indicating a feature for identifying the pointed object. A control device analyzes the picture acquired by the display device and identifies objects in the picture. The control device outputs position information in the picture for each identified object, and also outputs position information in the picture for the pointed object based on the information output by the pointing device. The control device extracts additional information of the target object from an additional information group of objects in real space and performs display control of the additional information based on the calculated position information. The display device displays the additional information. (A brief code sketch of this matching step follows the entry.)
    Type: Grant
    Filed: September 5, 2012
    Date of Patent: October 4, 2016
    Assignee: BIGLOBE, INC.
    Inventors: Tomonari Kamba, Kyoji Hirata
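A minimal Python sketch of the matching described above: the object identified in the picture whose position is closest to the position the user pointed at is selected, and its additional information is overlaid. The nearest-neighbour rule, distance tolerance, and sample data are illustrative assumptions.

```python
import math

# Assumed inputs: objects identified in the picture with 2-D positions,
# the position the user pointed at, and an additional-information catalogue.
detected_objects = {"lamp": (120, 80), "clock": (340, 60), "vase": (200, 210)}
additional_info = {"lamp": "LED desk lamp", "clock": "Wall clock, quartz",
                   "vase": "Ceramic vase, 19th c."}

def pick_pointed_object(pointed_xy, objects, max_dist=50):
    """Return the identified object closest to the pointed position,
    provided it lies within an assumed tolerance."""
    best, best_d = None, float("inf")
    for name, (x, y) in objects.items():
        d = math.hypot(x - pointed_xy[0], y - pointed_xy[1])
        if d < best_d:
            best, best_d = name, d
    return best if best_d <= max_dist else None

target = pick_pointed_object((335, 70), detected_objects)
if target is not None:
    # Display control: overlay the extracted additional information
    # at the target object's position in the picture.
    print(f"Overlay at {detected_objects[target]}: {additional_info[target]}")
```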
  • Patent number: 9386437
    Abstract: A memory unit stores a plurality of transmission destinations. A control unit determines an order of priority for the transmission destinations based on the states of the plurality of transmission destinations stored by the memory unit and then, upon receiving a predetermined input, originates a call to the transmission destination with the highest priority in the determined order. (A brief code sketch of this selection follows the entry.)
    Type: Grant
    Filed: April 11, 2014
    Date of Patent: July 5, 2016
    Assignee: BIGLOBE Inc.
    Inventors: Michitaro Miyata, Tomonari Kamba, Kyoji Hirata, Takahiro Murakami, Kenji Shioume, Tadashi Haneishi
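A minimal Python sketch of the behaviour described above: destinations are ordered by their states and, on a predetermined input, a call is originated to the highest-priority one. The state values and the ordering policy are illustrative assumptions.

```python
# Assumed destination states; the ordering rule (available before busy
# before offline) is an illustrative policy, not taken from the patent.
STATE_PRIORITY = {"available": 0, "busy": 1, "offline": 2}

destinations = [
    {"number": "+81-3-0000-0001", "state": "busy"},
    {"number": "+81-3-0000-0002", "state": "available"},
    {"number": "+81-3-0000-0003", "state": "offline"},
]

def originate_call(destinations):
    """Order the destinations by state and call the highest-priority one."""
    ordered = sorted(destinations, key=lambda d: STATE_PRIORITY[d["state"]])
    top = ordered[0]
    print(f"Originating call to {top['number']} (state: {top['state']})")
    return top

originate_call(destinations)   # calls +81-3-0000-0002
```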
  • Publication number: 20150234881
    Abstract: To appropriately update dictionary information in response to medium-to-long-term temporal changes caused by aging or by a change in the characteristics of the target person, a method of updating a person authentication dictionary that stores biological information of a person to be recognized includes: a storage step of storing and accumulating, in a history accumulation unit, inquiry biological information received within a predetermined term; a generation step of measuring the similarity of the accumulated pieces of inquiry biological information and generating a similarity for each pair of pieces; and an update step of updating the person authentication dictionary on the basis of a pair of biological information pieces judged to have a high similarity. (A brief code sketch of this update follows the entry.)
    Type: Application
    Filed: July 19, 2013
    Publication date: August 20, 2015
    Applicant: BIGLOBE Inc.
    Inventors: Kyoji Hirata, Takeshi Kawasaki, Michitaro Miyata, Kazuya Furukawa
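A minimal Python sketch of the three steps described above: accumulated inquiry features are compared pairwise, and a high-similarity pair drives the dictionary update. The similarity function, threshold, and averaging update rule are illustrative assumptions, not the patented procedure.

```python
from itertools import combinations

# History of inquiry feature vectors accumulated within the predetermined
# term (toy 2-D features; real biometric features would be far larger).
history = {"q1": (0.90, 0.10), "q2": (0.88, 0.12), "q3": (0.20, 0.95)}

def similarity(a, b):
    """Illustrative similarity: inverse of Euclidean distance."""
    dist = sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return 1.0 / (1.0 + dist)

def update_dictionary(dictionary, history, threshold=0.9):
    """Average a high-similarity pair of accumulated queries into the
    stored template (an assumed update rule, not the patented one)."""
    for (ka, fa), (kb, fb) in combinations(history.items(), 2):
        if similarity(fa, fb) >= threshold:
            dictionary["template"] = tuple((x + y) / 2 for x, y in zip(fa, fb))
    return dictionary

print(update_dictionary({"template": (0.95, 0.05)}, history))
# e.g. {'template': (0.89, 0.11)} -- updated from the q1/q2 pair
# (up to floating-point rounding)
```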
  • Publication number: 20150234808
    Abstract: In object recognition using image information, countermeasures are taken to reduce the erroneous judgments that arise when, owing to a temporal change, the visual feature of an object image becomes highly similar to the visual feature stored in the dictionary data of another object. In a dictionary update method for a computer that updates dictionary data in which the visual feature of an object the user wishes to recognize is registered: when a plurality of data pieces have visual features similar to an input inquiry image, the visual features of those objects are judged to be similar and the pair of objects is accumulated in a similar object accumulation section; and when the accumulated objects satisfy a condition set in advance, the data carrying their visual features is judged to require updating, and an update is recommended to the user who registered the object.
    Type: Application
    Filed: July 19, 2013
    Publication date: August 20, 2015
    Applicant: BIGLOBE Inc.
    Inventors: Michitaro Miyata, Kyoji Hirata, Takeshi Kawasaki, Kazuya Furukawa
  • Publication number: 20150229765
    Abstract: To store, in a telephone directory, voice information for identifying the voice of the other party.
    Type: Application
    Filed: January 20, 2015
    Publication date: August 13, 2015
    Applicant: BIGLOBE Inc.
    Inventor: Kyoji Hirata
  • Publication number: 20150215449
    Abstract: An object of the present invention is to forestall phone crimes such as hoax calls and phone fraud. When a call detection section 110 detects a call, a message transmission section 120 transmits a first message to a calling terminal corresponding to the call originator. If a first response is transmitted from the calling terminal in response to the first message, a connection control section 130 connects the calling terminal to a receiving terminal corresponding to the receiver of the call; if a second response is transmitted from the calling terminal instead, the connection control section 130 rejects the connection of the calling terminal to the receiving terminal. If the second response is transmitted from the calling terminal in response to the first message, a registration section 140 registers the telephone number of the calling terminal. (A brief code sketch of this screening flow follows the entry.)
    Type: Application
    Filed: January 20, 2015
    Publication date: July 30, 2015
    Applicant: BIGLOBE INC.
    Inventors: Kyoji Hirata, Kazuya Furukawa
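A minimal Python sketch of the screening flow described above: a challenge (first message) is sent to the calling terminal, the type of response decides whether to connect or reject, and rejected numbers are registered. The message text, response encoding, and registry structure are illustrative assumptions.

```python
# A sketch of the screening flow: the first message is a challenge, the
# type of response decides connect vs. reject, and rejected callers'
# numbers are registered. Message contents and encodings are assumptions.
registered_numbers = set()

def handle_incoming_call(caller_number, respond):
    """`respond` simulates the calling terminal answering the challenge."""
    first_message = "Press 1 to be connected."    # illustrative challenge
    response = respond(first_message)
    if response == "first":       # expected response -> connect the call
        return f"Connecting {caller_number} to the receiving terminal"
    # any other (second) response -> reject and register the number
    registered_numbers.add(caller_number)
    return f"Rejected {caller_number}; number registered"

print(handle_incoming_call("+81-90-1111-2222", lambda msg: "first"))
print(handle_incoming_call("+81-90-3333-4444", lambda msg: "second"))
print(registered_numbers)    # {'+81-90-3333-4444'}
```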
  • Publication number: 20140323078
    Abstract: A memory unit stores a plurality of transmission destinations. A control unit determines an order of priority for the transmission destinations based on the states of the plurality of transmission destinations stored by the memory unit and then, upon receiving a predetermined input, originates a call to the transmission destination with the highest priority in the determined order.
    Type: Application
    Filed: April 11, 2014
    Publication date: October 30, 2014
    Applicant: NEC BIGLOBE, LTD.
    Inventors: Michitaro MIYATA, Tomonari KAMBA, Kyoji HIRATA, Takahiro MURAKAMI, Kenji SHIOUME, Tadashi HANEISHI
  • Publication number: 20140292653
    Abstract: In an AR device, when display control of additional information is performed for an object, a display device acquires a picture including an object in real space. A pointing device outputs, when a user operation points at the object in real space, information indicating a feature for identifying the pointed object. A control device analyzes the picture acquired by the display device and identifies objects in the picture. The control device outputs position information in the picture for each identified object, and also outputs position information in the picture for the pointed object based on the information output by the pointing device. The control device extracts additional information of the target object from an additional information group of objects in real space and performs display control of the additional information based on the calculated position information. The display device displays the additional information.
    Type: Application
    Filed: September 5, 2012
    Publication date: October 2, 2014
    Applicant: NEC BIGLOBE, LTD.
    Inventors: Tomonari Kamba, Kyoji Hirata
  • Patent number: 8364003
    Abstract: During time-shifted playback of a broadcast signal, an immediacy information detection means (121) detects whether the received broadcast signal is a broadcast signal carrying immediacy information that must be provided to the viewer immediately. When a broadcast signal carrying immediacy information is detected, a playback control means (123) interrupts the time-shifted playback of the broadcast signal that does not carry immediacy information and immediately plays back the broadcast signal carrying the immediacy information. (A brief code sketch of this interruption follows the entry.)
    Type: Grant
    Filed: September 25, 2006
    Date of Patent: January 29, 2013
    Assignee: NEC Corporation
    Inventor: Kyoji Hirata
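A minimal Python sketch of the control logic described above: frames are normally played from a time-shift buffer, but a frame carrying immediacy information interrupts that playback and is played immediately. The frame structure and buffering scheme are illustrative assumptions.

```python
# Toy frames of a received broadcast stream; `urgent` marks immediacy
# information (e.g. an emergency bulletin). Field names are assumptions.
stream = [
    {"frame": 1, "urgent": False},
    {"frame": 2, "urgent": True},    # immediacy information arrives live
    {"frame": 3, "urgent": False},
]

def play(time_shift_buffer, incoming):
    """Play from the time-shift buffer, but switch to immediate playback
    whenever a frame carrying immediacy information is detected."""
    for live_frame in incoming:
        if live_frame["urgent"]:
            print(f"Interrupting time-shift: playing frame {live_frame['frame']} immediately")
        else:
            time_shift_buffer.append(live_frame)
            delayed = time_shift_buffer.pop(0)
            print(f"Time-shifted playback of frame {delayed['frame']}")

play([{"frame": 0, "urgent": False}], stream)
```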
  • Patent number: 8290341
    Abstract: A video playing device includes a classification unit that classifies a video or a video section to be played according to the degree of user concentration it requires. A user concentration derivation unit calculates the user's current degree of concentration on the video. A video selection unit selects the video or video section to be played based on the classification result and the current degree of concentration. A video playing unit plays the video or video section selected by the video selection unit and operates so as to preferentially play videos or video sections for which the required degree of concentration is lower than the user's current degree of concentration. (A brief code sketch of this selection follows the entry.)
    Type: Grant
    Filed: May 15, 2007
    Date of Patent: October 16, 2012
    Assignee: NEC Corporation
    Inventors: Kyoji Hirata, Eiji Kasutani, Masumi Okumura
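A minimal Python sketch of the selection described above: sections whose required concentration is lower than the user's current concentration are eligible for playback, and one of them is chosen. The tie-breaking rule (play the most demanding eligible section) and the sample values are illustrative assumptions.

```python
# Assumed catalogue: each section carries the degree of concentration it
# requires (0.0-1.0); the exact semantics are an illustrative guess.
sections = [
    {"title": "Headline recap", "required_concentration": 0.2},
    {"title": "In-depth analysis", "required_concentration": 0.8},
    {"title": "Weather summary", "required_concentration": 0.3},
]

def select_section(sections, current_concentration):
    """Prefer sections whose required concentration is lower than the
    user's current concentration; among those, pick the most demanding
    (the tie-breaking rule is an assumption)."""
    playable = [s for s in sections
                if s["required_concentration"] < current_concentration]
    if not playable:
        return None
    return max(playable, key=lambda s: s["required_concentration"])

print(select_section(sections, current_concentration=0.4))
# {'title': 'Weather summary', 'required_concentration': 0.3}
```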
  • Publication number: 20100191665
    Abstract: A service value calculation system includes: crisis problem inspection means which detects a crisis or problem affecting a service object; and virtual damage amount calculation means which calculates the virtual damage amount that would arise if the detected crisis or problem were left unsolved because the service was not applied, and calculates the service value from that amount. (A brief worked example follows the entry.)
    Type: Application
    Filed: July 10, 2008
    Publication date: July 29, 2010
    Applicants: MONASH UNIVERSITY, POLYCHIP PHARMACEUTICALS PTY LTD,
    Inventors: Kyoji Hirata, Ken Hanazawa, Masahiro Iwadare
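A hedged arithmetic example of the valuation idea in the abstract above: the service's value is taken to be the virtual damage it would avert. The escalation probability, loss figure, and the specific formula are illustrative assumptions; the abstract only names the detection and calculation steps.

```python
def service_value(escalation_probability, loss_if_unsolved):
    """Treat the virtual damage the service would avert as its value
    (an assumed valuation rule; the abstract only names the two steps)."""
    virtual_damage = escalation_probability * loss_if_unsolved
    return virtual_damage

# e.g. a monitoring service for a problem with a 10% chance of causing
# a $50,000 outage if left unsolved:
print(service_value(0.10, 50_000))   # 5000.0
```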
  • Patent number: 7734996
    Abstract: To enable a person to create a document effectively from the image data or audio data of a recorded meeting or lecture, and to let the person who creates the minutes, or a participant, browse a summarized document together with the images or voices, so that a plurality of persons can perform documentation effectively. The audio/image inputting means 10 generates image data by recording a meeting and audio data by recording the contents of the meeting. The document inputting means 20 generates document data, including draft minutes or the like, input by the person who creates the minutes. The relationship deriving means 50 generates correspondence table data by deriving the relationship between the voices or images and the document, based on the audio or image data and the document data. The relationship presenting means 60 displays the voices or images and the document in association with each other based on the correspondence table data. (A brief code sketch of this alignment follows the entry.)
    Type: Grant
    Filed: August 31, 2004
    Date of Patent: June 8, 2010
    Assignee: NEC Corporation
    Inventor: Kyoji Hirata
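A minimal Python sketch of deriving correspondence-table data between recorded utterances and drafted minutes, as described above. The keyword-overlap alignment, data layout, and sample text are assumptions for illustration; the patent does not specify the derivation rule.

```python
# Toy data: recognized utterances with timestamps and drafted minutes
# items. The keyword-overlap alignment is an assumed derivation rule.
utterances = [
    {"time": "10:02", "text": "budget approved for the new server"},
    {"time": "10:15", "text": "hiring plan postponed to next quarter"},
]
minutes = ["Server budget approved", "Hiring plan postponed"]

def derive_correspondence(utterances, minutes):
    """Build correspondence-table data linking each minutes item to the
    utterance sharing the most words with it."""
    table = []
    for item in minutes:
        item_words = set(item.lower().split())
        best = max(utterances,
                   key=lambda u: len(item_words & set(u["text"].lower().split())))
        table.append({"minutes": item, "time": best["time"]})
    return table

for row in derive_correspondence(utterances, minutes):
    print(row)   # {'minutes': 'Server budget approved', 'time': '10:02'} ...
```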
  • Publication number: 20100076938
    Abstract: A protocol mismatch detection system (1) includes: a protocol database (10); input means (11) for inputting data to be accumulated in the protocol database (10); protocol data accumulation means (12) for accumulating the input data in the protocol database (10); announced content reception means (13) for receiving an announced content from outside; protocol mismatch detection means (14) for referencing the protocol database (10) to detect a portion of the received announced content that does not coincide with the protocol of a user of the protocol mismatch detection system (1); and output means (15) for outputting the detected protocol mismatch and the mismatching portion. (A brief code sketch of this check follows the entry.)
    Type: Application
    Filed: December 26, 2007
    Publication date: March 25, 2010
    Inventors: Ken HANAZAWA, Kyoji HIRATA, Masahiro IWADARE
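A minimal Python sketch of the mismatch check described above: the received announced content is compared against the user's protocol database and the non-coinciding portions are output. The dictionary layout and example entries are illustrative assumptions.

```python
# Assumed protocol database: the user's rules keyed by topic; the
# announced content is a simple dict using the same keys.
protocol_db = {"dress_code": "business casual", "start_time": "09:00"}
announcement = {"dress_code": "formal", "start_time": "09:00",
                "venue": "Room 3A"}

def detect_mismatches(protocol_db, announcement):
    """Return the portions of the announced content that do not coincide
    with the user's registered protocol."""
    return {key: value for key, value in announcement.items()
            if key in protocol_db and protocol_db[key] != value}

print(detect_mismatches(protocol_db, announcement))
# {'dress_code': 'formal'} -- the mismatch portion to be output
```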
  • Publication number: 20100063965
    Abstract: To provide a content processing technique which makes it possible to prevent a reader from easily guessing that information has been hidden, or what the hidden information is, and to obtain a content whose information is natural and close to that of the original content before hiding. A content processor includes: a search means which searches for contents having information similar to the part of the original content other than the part to be hidden; an arithmetic means which calculates a non-similarity indicating how dissimilar each content found by the search means is to the part to be hidden; and a selection means which, out of the contents found by the search means, selects the content least similar to the part to be hidden. (A brief code sketch of this selection follows the entry.)
    Type: Application
    Filed: April 25, 2008
    Publication date: March 11, 2010
    Inventors: Ken Hanazawa, Masahiro Iwadare, Kyoji Hirata
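A minimal Python sketch of the selection step described above: among contents similar to the non-hidden part, the one least similar to the part to be hidden is chosen as the replacement. The word-overlap similarity measure and sample strings are illustrative assumptions.

```python
# Toy corpus of candidate contents, the visible part of the original
# content, and the part to be hidden; word-overlap similarity is an
# illustrative stand-in for the patented non-similarity measure.
candidates = ["the committee met on friday afternoon",
              "quarterly profits exceeded expectations",
              "the committee postponed the budget vote"]
visible_part = "the committee met to discuss the"
part_to_hide = "budget vote scheduled for friday"

def word_similarity(a, b):
    wa, wb = set(a.split()), set(b.split())
    return len(wa & wb) / max(len(wa | wb), 1)

def pick_replacement(candidates, visible_part, part_to_hide):
    """Among contents similar to the visible part, choose the one least
    similar to the part being hidden (highest non-similarity)."""
    similar = [c for c in candidates if word_similarity(c, visible_part) > 0]
    return min(similar, key=lambda c: word_similarity(c, part_to_hide))

print(pick_replacement(candidates, visible_part, part_to_hide))
# -> 'the committee met on friday afternoon'
```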
  • Publication number: 20090279850
    Abstract: During time-shifted playback of a broadcast signal, an immediacy information detection means (121) detects whether the received broadcast signal is a broadcast signal carrying immediacy information that must be provided to the viewer immediately. When a broadcast signal carrying immediacy information is detected, a playback control means (123) interrupts the time-shifted playback of the broadcast signal that does not carry immediacy information and immediately plays back the broadcast signal carrying the immediacy information.
    Type: Application
    Filed: September 25, 2006
    Publication date: November 12, 2009
    Inventor: Kyoji Hirata
  • Publication number: 20090097822
    Abstract: A video playing device includes a classification unit 121 that classifies a video or a video section to be played according to the degree of user concentration it requires, a user concentration derivation unit 140 that calculates the user's current degree of concentration on the video, a video selection unit 122 that selects the video or video section to be played based on the classification result and the current degree of concentration, and a video playing unit 130 that plays the video or video section selected by the video selection unit and operates so as to preferentially play videos or video sections for which the required degree of concentration is lower than the user's current degree of concentration on the video.
    Type: Application
    Filed: May 15, 2007
    Publication date: April 16, 2009
    Inventors: Kyoji Hirata, Eiji Kasutani, Masumi Okumura
  • Publication number: 20060294453
    Abstract: To enable a person to create a document effectively from the image data or audio data of a recorded meeting or lecture, and to let the person who creates the minutes, or a participant, browse a summarized document together with the images or voices, so that a plurality of persons can perform documentation effectively. The audio/image inputting means 10 generates image data by recording a meeting and audio data by recording the contents of the meeting. The document inputting means 20 generates document data, including draft minutes or the like, input by the person who creates the minutes. The relationship deriving means 50 generates correspondence table data by deriving the relationship between the voices or images and the document, based on the audio or image data and the document data.
    Type: Application
    Filed: August 31, 2004
    Publication date: December 28, 2006
    Inventor: Kyoji Hirata
  • Publication number: 20060195858
    Abstract: Visual feature information, which numerically represents a visual feature of an object, and additional information, which is information added to the object, are stored in association with each other. Partial image data, which is image data of a partial area of a video image, is extracted, and visual feature information of the extracted partial image data is generated. The visual feature information of the extracted partial image data and the stored visual feature information of an object are compared with each other to calculate a similarity between them. Based on the calculated similarity, an object contained in the video image data is identified, and an annotation made up of the additional information of the identified object is displayed superimposed on the video image on a display device. (A brief code sketch of this matching follows the entry.)
    Type: Application
    Filed: April 15, 2004
    Publication date: August 31, 2006
    Inventors: Yusuke Takahashi, Kyoji Hirata
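A minimal Python sketch of the matching step described above: the visual feature of an extracted partial image is compared with stored features, and the annotation of the most similar object is returned for superimposed display. The Euclidean distance, tolerance, and feature store contents are illustrative assumptions.

```python
# Assumed feature store: object name -> (visual feature vector, annotation).
# A simple Euclidean nearest match stands in for the similarity step.
feature_store = {
    "tower":  ((0.9, 0.1, 0.3), "Tokyo Tower, built 1958"),
    "bridge": ((0.2, 0.8, 0.5), "Rainbow Bridge"),
}

def annotate(partial_feature, store, max_distance=0.5):
    """Compare the partial image's feature with stored features and
    return the annotation of the most similar object, if close enough."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    name, (feature, note) = min(store.items(),
                                key=lambda kv: dist(partial_feature, kv[1][0]))
    return note if dist(partial_feature, feature) <= max_distance else None

print(annotate((0.85, 0.15, 0.25), feature_store))   # 'Tokyo Tower, built 1958'
```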
  • Patent number: 6922485
    Abstract: This patent provides a new image segmentation algorithm for object-based image retrieval. The system partitions multi-dimensional images into disjoint regions of coherent color and texture. In order to distinguish object contour lines from texture edges, the description length of the line is used as the discriminating criterion. Visual attribute values are assigned to each region. (A brief code sketch of region partitioning follows the entry.)
    Type: Grant
    Filed: May 30, 2002
    Date of Patent: July 26, 2005
    Assignee: NEC Corporation
    Inventor: Kyoji Hirata
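A toy Python sketch of partitioning an image into disjoint coherent regions, in the spirit of the segmentation described above. It uses a simple flood fill over exact colour labels; the patented description-length criterion for separating contour lines from texture edges is not implemented here.

```python
# A toy flood-fill segmentation over a 2-D grid of colour labels; it
# illustrates partitioning an image into disjoint coherent regions but
# not the description-length criterion the patent uses to separate
# object contour lines from texture edges.
image = [
    ["r", "r", "b"],
    ["r", "b", "b"],
    ["g", "g", "b"],
]

def segment(image):
    """Group 4-connected pixels of identical colour into numbered regions."""
    h, w = len(image), len(image[0])
    labels = [[None] * w for _ in range(h)]
    region = 0
    for sy in range(h):
        for sx in range(w):
            if labels[sy][sx] is not None:
                continue
            region += 1
            stack = [(sy, sx)]
            while stack:
                y, x = stack.pop()
                if (0 <= y < h and 0 <= x < w and labels[y][x] is None
                        and image[y][x] == image[sy][sx]):
                    labels[y][x] = region
                    stack.extend([(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)])
    return labels

for row in segment(image):
    print(row)   # [1, 1, 2] / [1, 2, 2] / [3, 3, 2] -- three disjoint regions
```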