Patents by Inventor Yoshiyuki Habashima

Yoshiyuki Habashima has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11861151
    Abstract: A system enables a user to grasp the relationship between the smell or taste of an object and the expressions of that smell or taste, and to discover which smells or tastes they prefer. In the single-sample display mode, the system displays a group of expressions relating to the olfactory sense stimulated by a sample; if the user selects an expression, the system displays a relationship image indicating the relationship between the selected expression and other samples corresponding to the olfactory sense associated with that expression. In the multiple-sample display mode, the system displays, for each of plural samples, a group of expressions relating to the sense of smell stimulated by that sample, and displays any expression common to the plural samples among the groups of expressions.
    Type: Grant
    Filed: December 1, 2020
    Date of Patent: January 2, 2024
    Assignee: JAPAN TOBACCO INC.
    Inventors: Toshiharu Kurisu, Yusuke Okamura, Yoshiyuki Habashima
  • Publication number: 20230004279
    Abstract: A system enables a user to grasp the relationship between the smell or taste of an object and the expressions of that smell or taste, and to discover which smells or tastes they prefer. In the single-sample display mode, display control means displays a group of expressions relating to the olfactory sense stimulated by a sample identified by sample identifying means; if the user selects an expression, it displays a relationship image indicating the relationship between the selected expression and other samples corresponding to the olfactory sense associated with that expression. In the multiple-sample display mode, the display control means displays, for each of plural samples specified by the sample identifying means, a group of expressions relating to the sense of smell stimulated by that sample, and displays any expression common to the plural samples so that it is distinguishable from expressions not common to them.
    Type: Application
    Filed: December 1, 2020
    Publication date: January 5, 2023
    Applicant: JAPAN TOBACCO INC.
    Inventors: Toshiharu KURISU, Yusuke OKAMURA, Yoshiyuki HABASHIMA
  • Publication number: 20220222025
    Abstract: To enable a user to know a smell or taste (of an object) that matches a specified description, when a card is placed on a sensing surface of a sensor, a description specifying unit specifies which card (and thus which description) is located at which position on the sensing surface. A first display control unit controls a projector to display a related description group, which is a group of descriptions related to the description shown on the card. A priority specifying unit specifies a priority for each of the displayed related descriptions. A second display control unit displays a relationship image indicating the relationship between at least one related description included in the displayed related description group and an object that stimulates the sense of smell or taste described by that related description, the relationship being displayed with an appearance in accordance with the specified priority.
    Type: Application
    Filed: March 24, 2020
    Publication date: July 14, 2022
    Applicant: JAPAN TOBACCO INC.
    Inventors: Toshiharu Kurisu, Sayako SAKUMA, Yusuke OKAMURA, Yoshiyuki HABASHIMA
  • Publication number: 20220218263
    Abstract: To provide, via a visual representation, a relationship between a smell or a taste of an object and a verbal description of the smell or taste, a display control unit controls a projector to display, in a display area corresponding to the position of a sample specified by the position specifying unit, one or more descriptions relating to the sense of smell or taste stimulated by the sample. The display control unit displays the descriptions with an appearance corresponding to the sample specified by the sample specifying unit.
    Type: Application
    Filed: March 24, 2020
    Publication date: July 14, 2022
    Inventors: Toshiharu KURISU, Sayako SAKUMA, Yusuke OKAMURA, Yoshiyuki HABASHIMA
  • Publication number: 20220218254
    Abstract: To enable the discovery of a new expression likely to be accepted by many users as an expression of a smell or taste, an acquisition unit acquires an appearance history of expressions used by a plurality of users from an SNS server device via a network. An extraction unit extracts, for each user attribute, appearing expressions whose appearance frequency exceeds a threshold value, on the basis of the appearance history acquired by the acquisition unit; each extracted expression is referred to as a frequently appearing expression. A presentation unit presents the frequently appearing expressions extracted by the extraction unit, as candidate expressions meaning a smell, to a user who uses an input device. A registration unit registers the smell and an expression in association with each other when the frequency with which the user selects the presented expression as an expression meaning the smell satisfies a determined condition.
    Type: Application
    Filed: March 24, 2020
    Publication date: July 14, 2022
    Applicant: JAPAN TOBACCO INC.
    Inventors: Toshiharu KURISU, Sayako SAKUMA, Yusuke OKAMURA, Yoshiyuki HABASHIMA
  • Patent number: 9550419
    Abstract: A method and system for providing an augmented reality vehicle interface. The method and system include providing an augmented reality user interface. The method and system additionally include receiving an image of a vehicle with an image capturing device. The method and system additionally include identifying a user classification category of the user that is capturing an image of the vehicle. Additionally, the method and system include presenting an augmented reality image of the vehicle by overlaying one or more virtual user interface objects on the points of interest. The method and system also include controlling vehicle features via the one or more virtual user interface objects.
    Type: Grant
    Filed: January 21, 2014
    Date of Patent: January 24, 2017
    Assignee: Honda Motor Co., Ltd.
    Inventors: Yoshiyuki Habashima, Fuminobu Kurosawa, Arthur Alaniz, Michael Gleeson-May
  • Publication number: 20160048249
    Abstract: A system and method of remotely controlling a component of a vehicle using a wearable computing device comprising: viewing an identifying characteristic on the vehicle by the wearable computing device; comparing the identifying characteristic viewed to an identifying characteristic image stored in a memory of the wearable computing device; and sending a command signal from the wearable computing device to the vehicle to control the component when the identifying characteristic viewed corresponds to the identifying characteristic image stored in the memory.
    Type: Application
    Filed: August 14, 2014
    Publication date: February 18, 2016
    Inventors: SIYUAN CHEN, GOKULA KRISHNAN, FUMINOBU KUROSAWA, YOSHIYUKI HABASHIMA, MASAYUKI SATO, ARTHUR ALANIZ, MICHAEL EAMONN GLEESON-MAY
  • Patent number: 9117120
    Abstract: A system includes at least one sensor, and a computing device coupled to the at least one sensor. The computing device includes a processor, and a computer-readable storage media having computer-executable instructions embodied thereon. When executed by at least one processor, the computer-executable instructions cause the processor to identify a dominant eye of the occupant, determine a first position associated with the dominant eye of the occupant, determine a second position associated with the occupant, and determine a first line-of-sight by extending a line between the first position and the second position.
    Type: Grant
    Filed: May 24, 2013
    Date of Patent: August 25, 2015
    Assignee: Honda Motor Co., Ltd.
    Inventors: Arthur Alaniz, Fuminobu Kurosawa, Yoshiyuki Habashima
  • Publication number: 20150202962
    Abstract: A method and system for providing an augmented reality vehicle interface. The method and system include providing an augmented reality user interface. The method and system additionally include receiving an image of a vehicle with an image capturing device. The method and system additionally include identifying a user classification category of the user that is capturing an image of the vehicle. Additionally, the method and system include presenting an augmented reality image of the vehicle by overlaying one or more virtual user interface objects on the points of interest. The method and system also include controlling vehicle features via the one or more virtual user interface objects.
    Type: Application
    Filed: January 21, 2014
    Publication date: July 23, 2015
    Applicant: Honda Motor Co., Ltd.
    Inventors: Yoshiyuki Habashima, Fuminobu Kurosawa, Arthur Alaniz, Michael Gleeson-May
  • Publication number: 20150116200
    Abstract: A method and system for gestural control of a vehicle system. The method and system include tracking a motion path of a grasp hand posture upon detecting an initiation dynamic hand gesture in a spatial location associated with the vehicle system, wherein the initiation dynamic hand gesture is a sequence from a first open hand posture to the grasp hand posture; controlling a feature of the vehicle system based on the motion path; and terminating control of the feature upon detecting a termination dynamic hand gesture, wherein the termination dynamic hand gesture is a sequence from the grasp hand posture to a second open hand posture.
    Type: Application
    Filed: January 20, 2014
    Publication date: April 30, 2015
    Applicant: Honda Motor Co., Ltd.
    Inventors: Fuminobu Kurosawa, Yoshiyuki Habashima, Michael Eamonn Gleeson-May, Arthur Alaniz
  • Publication number: 20140348377
    Abstract: A system includes at least one sensor, and a computing device coupled to the at least one sensor. The computing device includes a processor, and a computer-readable storage media having computer-executable instructions embodied thereon. When executed by at least one processor, the computer-executable instructions cause the processor to identify a dominant eye of the occupant, determine a first position associated with the dominant eye of the occupant, determine a second position associated with the occupant, and determine a first line-of-sight by extending a line between the first position and the second position.
    Type: Application
    Filed: May 24, 2013
    Publication date: November 27, 2014
    Applicant: HONDA MOTOR CO., LTD.
    Inventors: Arthur Alaniz, Fuminobu Kurosawa, Yoshiyuki Habashima
  • Patent number: 8731824
    Abstract: A vehicle display system includes a display having a display screen which displays an image and provides a touch screen user interface. A controller of the display includes a touch detector which detects a number of fingers touched to a predetermined portion of the display screen, as well as a direction and distance of movement of the number of fingers away from the predetermined portion of the display screen. A navigation mode determiner of the controller determines a navigation mode based on the detected number of fingers. A navigation feature determiner of the controller determines a navigation direction and navigation rate based on the detected direction and distance the number of fingers is moved. An image navigation controller of the controller controls the display to navigate the image displayed on the display screen according to the determined navigation mode, navigation direction, and navigation rate.
    Type: Grant
    Filed: March 15, 2013
    Date of Patent: May 20, 2014
    Assignee: Honda Motor Co., Ltd.
    Inventors: Arthur Alaniz, Fuminobu Kurosawa, Yoshiyuki Habashima
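
The extraction-and-registration flow described in publication 20220218254 can be sketched in a few lines. This is a minimal illustration only: the data shapes (attribute/expression pairs), threshold semantics, and function names are assumptions, not taken from the patent.

```python
from collections import Counter

def frequent_expressions(history, threshold):
    """Per user attribute, extract expressions whose appearance
    frequency in the SNS history exceeds the threshold."""
    counts = {}
    for attribute, expression in history:
        counts.setdefault(attribute, Counter())[expression] += 1
    return {attr: [e for e, n in c.items() if n > threshold]
            for attr, c in counts.items()}

def maybe_register(registry, smell, expression, selection_count, min_selections):
    """Register the smell/expression association once the user's
    selection frequency satisfies the determined condition."""
    if selection_count >= min_selections:
        registry[smell] = expression
    return registry
```

For example, with a history of `("20s", "fruity")`-style pairs, only expressions seen more often than the threshold for that attribute survive extraction.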
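
The line-of-sight determination in patent 9117120 (and publication 20140348377) amounts to extending a ray from the dominant eye (the first position) through a second tracked position on the occupant. A minimal vector-math sketch, where the `Point3D` type, the `reach` parameter, and the normalization step are assumptions for illustration:

```python
from dataclasses import dataclass

@dataclass
class Point3D:
    x: float
    y: float
    z: float

def line_of_sight(eye: Point3D, second: Point3D, reach: float = 10.0) -> Point3D:
    """Return a point `reach` units along the ray from `eye`
    through `second` (the extended first line-of-sight)."""
    dx, dy, dz = second.x - eye.x, second.y - eye.y, second.z - eye.z
    norm = (dx * dx + dy * dy + dz * dz) ** 0.5
    if norm == 0.0:
        raise ValueError("eye and second position coincide; no ray defined")
    return Point3D(eye.x + reach * dx / norm,
                   eye.y + reach * dy / norm,
                   eye.z + reach * dz / norm)
```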
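
The open-to-grasp initiation and grasp-to-open termination sequence in publication 20150116200 is naturally modeled as a small state machine: entering the grasp posture from an open posture starts path tracking, and releasing to an open posture ends it. The posture labels, class name, and state names below are illustrative assumptions:

```python
class GestureController:
    """Tracks the motion path of a grasp posture between the
    initiation ("open" -> "grasp") and termination
    ("grasp" -> "open") dynamic hand gestures."""

    def __init__(self):
        self.state = "idle"
        self.prev_posture = None
        self.path = []  # motion path recorded while the hand is grasped

    def observe(self, posture: str, position: tuple) -> None:
        if self.state == "idle" and self.prev_posture == "open" and posture == "grasp":
            self.state = "tracking"        # initiation dynamic hand gesture
            self.path = [position]
        elif self.state == "tracking" and posture == "grasp":
            self.path.append(position)     # keep tracking the grasp posture
        elif self.state == "tracking" and posture == "open":
            self.state = "idle"            # termination dynamic hand gesture
        self.prev_posture = posture
```

In a real system the recorded path would drive the controlled vehicle feature (for example, a volume or seat adjustment) on each update.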
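
The controller logic in patent 8731824 maps the detected finger count to a navigation mode and the drag vector to a navigation direction and rate. A hedged sketch, where the mode table and the rate formula are assumed for illustration and not specified by the patent:

```python
NAV_MODES = {1: "pan", 2: "zoom", 3: "rotate"}  # assumed finger-count mapping

def navigation_command(num_fingers, dx, dy, rate_scale=0.1):
    """Derive navigation mode, direction, and rate from the number of
    fingers touched to the predetermined region and their movement."""
    mode = NAV_MODES.get(num_fingers)
    if mode is None:
        return None  # unrecognized finger count: no navigation
    distance = (dx * dx + dy * dy) ** 0.5
    if abs(dx) >= abs(dy):
        direction = "right" if dx > 0 else "left"
    else:
        direction = "down" if dy > 0 else "up"
    return {"mode": mode, "direction": direction, "rate": rate_scale * distance}
```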