Patents by Inventor Yuji Kaneda

Yuji Kaneda has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20060222354
    Abstract: At least one exemplary embodiment is directed to an image processing method in which a plurality of images are picked up while the imaging direction of a camera is being changed. The images are combined into a composite image, and a target region is specified within the composite image. An imaging direction and an imaging magnification for capturing an object in the target region at a predetermined size are calculated from the position and size of the target region in the composite image and the imaging parameters set for the camera when the images were picked up. The imaging direction and magnification needed to frame the object can therefore be set easily (a minimal geometric sketch follows this entry).
    Type: Application
    Filed: March 15, 2006
    Publication date: October 5, 2006
    Applicant: Canon Kabushiki Kaisha
    Inventors: Katsuhiko Mori, Masakazu Matsugu, Masami Kato, Yusuke Mitarai, Yuji Kaneda, Hiroshi Sato
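The abstract does not give the exact geometry, but the core calculation can be illustrated with a minimal sketch: assuming the stitched panorama maps pixels to pan/tilt angles roughly linearly (using the per-shot field of view), the pan, tilt, and zoom needed to frame a target region follow from its center and size. The names `CameraParams` and `aim_at_region` and the `desired_fill` parameter are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class CameraParams:
    hfov_deg: float       # horizontal field of view of a single shot
    vfov_deg: float       # vertical field of view of a single shot
    shot_width_px: int    # pixel width of a single shot
    shot_height_px: int   # pixel height of a single shot

def aim_at_region(region, panorama_center, params, desired_fill=0.8):
    """Return (pan_deg, tilt_deg, zoom_factor) that frames `region`.

    region = (x, y, w, h) in panorama pixels; panorama_center = (cx, cy)
    is the panorama pixel corresponding to pan = 0, tilt = 0.  A linear
    pixel-to-angle mapping is assumed, which is only an approximation.
    """
    x, y, w, h = region
    cx, cy = panorama_center
    deg_per_px_x = params.hfov_deg / params.shot_width_px
    deg_per_px_y = params.vfov_deg / params.shot_height_px

    # Point the camera at the center of the target region.
    pan_deg = ((x + w / 2.0) - cx) * deg_per_px_x
    tilt_deg = ((y + h / 2.0) - cy) * deg_per_px_y

    # Zoom so the region fills `desired_fill` of the frame in its
    # tighter dimension (so the whole region stays in view).
    zoom_factor = desired_fill * min(params.shot_width_px / w,
                                     params.shot_height_px / h)
    return pan_deg, tilt_deg, zoom_factor

# Example: frame a region of a panorama shot with a 60° x 45° lens.
cam = CameraParams(hfov_deg=60.0, vfov_deg=45.0,
                   shot_width_px=640, shot_height_px=480)
print(aim_at_region(region=(900, 200, 160, 120),
                    panorama_center=(960, 240), params=cam))
```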
  • Publication number: 20060172947
    Abstract: A peptide having any one of the amino acid sequences of SEQ ID NO: 1 or 13, preferably a peptide having any one of the amino acid sequences of SEQ ID NOS: 2 to 9 or a peptide having any one of the amino acid sequences of SEQ ID NOS: 10 and 15 to 17, is used as an active ingredient of an agent for promoting growth or differentiation of cells such as osteoblasts, chondroblasts, cementoblasts, bone marrow-derived mesenchymal stem cells and periodontal ligament-derived cells.
    Type: Application
    Filed: February 20, 2004
    Publication date: August 3, 2006
    Inventors: Takashi Takata, Shoji Kitagawa, Yuji Kaneda
  • Publication number: 20060115157
    Abstract: An image including a face is input (S201); a plurality of local features are detected from the input image; the region of the face in the image is specified using the detected local features (S202); and the expression of the face is determined based on differences between the detection results for the local features in the face region and reference detection results calculated in advance for the respective local features (S204). A minimal sketch of the reference-comparison step follows this entry.
    Type: Application
    Filed: January 12, 2006
    Publication date: June 1, 2006
    Applicant: Canon Kabushiki Kaisha
    Inventors: Katsuhiko Mori, Yuji Kaneda, Masakazu Matsugu, Yusuke Mitarai, Takashi Suzuki
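As a rough illustration of judging an expression from differences against precomputed references (step S204 in the abstract), the sketch below compares a measured feature-difference vector with per-expression difference templates. The feature names, the template structure, and the Euclidean matching rule are assumptions for illustration only; the abstract does not specify them.

```python
import numpy as np

def classify_expression(features, neutral_reference, expression_templates):
    """Pick the expression whose template best matches the measured differences.

    features: dict of local-feature name -> value measured on the current face.
    neutral_reference: same keys, values precomputed on a reference (neutral) face.
    expression_templates: dict of expression name -> expected (feature - neutral)
    differences.  All names here are hypothetical placeholders.
    """
    keys = sorted(neutral_reference)
    diff = np.array([features[k] - neutral_reference[k] for k in keys])

    best_label, best_score = None, np.inf
    for label, template in expression_templates.items():
        tmpl = np.array([template[k] for k in keys])
        score = np.linalg.norm(diff - tmpl)   # Euclidean distance to template
        if score < best_score:
            best_label, best_score = label, score
    return best_label

# Example with made-up feature values.
neutral = {"eye_open": 1.0, "mouth_corner_y": 0.0, "brow_y": 0.0}
templates = {
    "smile":    {"eye_open": -0.1, "mouth_corner_y": 0.4, "brow_y": 0.0},
    "surprise": {"eye_open":  0.5, "mouth_corner_y": 0.0, "brow_y": 0.3},
}
print(classify_expression(
    {"eye_open": 0.9, "mouth_corner_y": 0.35, "brow_y": 0.02},
    neutral, templates))   # -> "smile"
```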
  • Publication number: 20060008173
    Abstract: A person area is detected from an input image, the category to which the person shown in the area belongs is recognized, a correction area is extracted from the person area, and the correction area is corrected based on the recognized category. The input image is thereby corrected easily and in a manner appropriate to the category of the person, that is, the object being imaged. A minimal sketch of a category-dependent correction follows this entry.
    Type: Application
    Filed: June 24, 2005
    Publication date: January 12, 2006
    Applicant: Canon Kabushiki Kaisha
    Inventors: Masakazu Matsugu, Katsuhiko Mori, Yusuke Mitarai, Yuji Kaneda
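The sketch below shows one way a correction could depend on the recognized category: a per-category lookup of correction parameters applied to the extracted region. The categories, parameter names, and the particular smoothing/brightness operations are placeholders; the abstract only states that the correction area is corrected according to the recognized category.

```python
import numpy as np

# Category-dependent correction settings (illustrative values only).
CATEGORY_PARAMS = {
    "adult": {"smooth": 0.3, "brighten": 1.05},
    "child": {"smooth": 0.1, "brighten": 1.10},
}

def correct_person_region(image, region, category):
    """Apply a category-dependent smoothing/brightness tweak in place.

    image: H x W x 3 uint8 array; region: (x, y, w, h) bounding box of
    the correction area; category: a key of CATEGORY_PARAMS.
    """
    x, y, w, h = region
    p = CATEGORY_PARAMS[category]
    patch = image[y:y + h, x:x + w].astype(np.float32)

    # 3x3 box blur, blended in according to the smoothing strength.
    padded = np.pad(patch, ((1, 1), (1, 1), (0, 0)), mode="edge")
    blurred = sum(padded[i:i + h, j:j + w]
                  for i in range(3) for j in range(3)) / 9.0
    patch = (1 - p["smooth"]) * patch + p["smooth"] * blurred

    # Brightness gain, clipped to the valid pixel range.
    patch = np.clip(patch * p["brighten"], 0, 255)
    image[y:y + h, x:x + w] = patch.astype(np.uint8)
    return image
```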
  • Publication number: 20050201594
    Abstract: A movement evaluation apparatus extracts feature points from a first reference object image and an ideal object image, both obtained by sensing an image including an object with an image sensing unit, and generates ideal action data from the change amounts of the feature points between the first reference object image and the ideal object image. The apparatus likewise extracts feature points from a second reference object image and an evaluation object image sensed by the image sensing unit, and generates measurement action data from the change amounts of the feature points between the second reference object image and the evaluation object image. The movement of the object in the evaluation object image is then evaluated from the ideal action data and the measurement action data (a minimal sketch follows this entry).
    Type: Application
    Filed: February 24, 2005
    Publication date: September 15, 2005
    Inventors: Katsuhiko Mori, Masakazu Matsugu, Yuji Kaneda
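A minimal sketch of the described comparison: action data are the per-point displacement vectors between a reference image and a target image, and the evaluation compares the measured displacements with the ideal ones. The point-pairing assumption and the specific similarity score below are illustrative, not taken from the patent text.

```python
import numpy as np

def action_data(reference_pts, target_pts):
    """Change amounts of feature points between a reference image and a
    target (ideal or evaluation) image.  Both are N x 2 arrays of point
    coordinates in the same order (an assumption for illustration)."""
    return np.asarray(target_pts, float) - np.asarray(reference_pts, float)

def evaluate_movement(ref1_pts, ideal_pts, ref2_pts, eval_pts):
    """Score how closely the measured movement matches the ideal movement.

    Returns a similarity in [0, 1]; 1 means the per-point displacement
    vectors are identical.  The scoring formula is illustrative.
    """
    ideal = action_data(ref1_pts, ideal_pts)      # ideal action data
    measured = action_data(ref2_pts, eval_pts)    # measurement action data
    err = np.linalg.norm(ideal - measured, axis=1).mean()
    scale = np.linalg.norm(ideal, axis=1).mean() + 1e-9
    return float(max(0.0, 1.0 - err / scale))
```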
  • Publication number: 20050187437
    Abstract: An information processing apparatus detects the facial expression and body action of a person in image information and determines the physical/mental condition of the user from the detection results. Presentation of information by a presentation unit, which presents information visually and/or audibly, is then controlled according to the determined physical/mental condition of the user (a minimal sketch follows this entry).
    Type: Application
    Filed: February 24, 2005
    Publication date: August 25, 2005
    Inventors: Masakazu Matsugu, Katsuhiko Mori, Yuji Kaneda
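The sketch below illustrates the overall flow only: map detected expression and body-action labels to a coarse condition, then adjust the presentation accordingly. The condition labels, decision rules, and the `player` interface are hypothetical; the abstract does not enumerate them.

```python
def infer_condition(expression, body_action):
    """Map a detected facial expression and body action to a coarse
    physical/mental condition label (rules are purely illustrative)."""
    if expression == "yawn" or body_action == "slumped":
        return "fatigued"
    if expression == "frown" and body_action == "fidgeting":
        return "stressed"
    return "normal"

def control_presentation(player, condition):
    """Adjust an information-presentation unit to the inferred condition.
    `player` is a hypothetical interface with set_pace()/set_volume()."""
    if condition == "fatigued":
        player.set_pace("slow")
        player.set_volume(0.6)
    elif condition == "stressed":
        player.set_pace("calm")
    else:
        player.set_pace("normal")
```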