Patents by Inventor James E. Holmes

James E. Holmes has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11275447
    Abstract: A user, such as the driver of a vehicle, can retrieve information related to a point of interest (POI) near the vehicle by pointing at the POI or performing some other gesture that identifies it. Gesture recognition is performed on the gesture to generate a target region that includes the POI that the user identified. After generating the target region, information about the POI can be retrieved by querying a server-based POI service with the target region or by searching a locally stored micromap. The retrieved POI information can then be provided to the user via a display and/or speaker in the vehicle. This process beneficially allows a user to rapidly identify and retrieve information about a POI near the vehicle without having to navigate a user interface by manipulating a touchscreen or physical buttons.
    Type: Grant
    Filed: August 8, 2018
    Date of Patent: March 15, 2022
    Assignees: Honda Motor Co., Ltd., Edge 3 Technologies LLC
    Inventors: Pedram Vaghefinazari, Stuart Masakazu Yamamoto, Tarek A. El Dokor, Jordan Cluster, James E. Holmes
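    A minimal Python sketch of the flow this abstract describes, under illustrative assumptions: the recognised pointing gesture has already been reduced to a compass bearing, the target region is an angular window around that bearing, and the lookup searches a locally stored micromap instead of the server-based POI service. All names (POI, target_region, search_micromap) are hypothetical; the patent does not specify an implementation.

        import math
        from dataclasses import dataclass
        from typing import List, Optional, Tuple

        @dataclass
        class POI:
            name: str
            east_m: float   # metres east of the vehicle
            north_m: float  # metres north of the vehicle

        def target_region(bearing_deg: float, half_angle_deg: float = 15.0) -> Tuple[float, float]:
            """Expand a recognised pointing bearing into an angular target region."""
            return (bearing_deg - half_angle_deg, bearing_deg + half_angle_deg)

        def search_micromap(micromap: List[POI], region: Tuple[float, float]) -> Optional[POI]:
            """Return the nearest POI whose bearing from the vehicle lies inside the
            target region (a stand-in for querying the server-based POI service)."""
            lo, hi = region
            hits = []
            for poi in micromap:
                bearing = math.degrees(math.atan2(poi.east_m, poi.north_m)) % 360.0
                if lo <= bearing <= hi:
                    hits.append((math.hypot(poi.east_m, poi.north_m), poi))
            return min(hits, key=lambda h: h[0])[1] if hits else None

        if __name__ == "__main__":
            micromap = [POI("Fuel station", 40.0, 120.0), POI("Coffee shop", -80.0, 30.0)]
            region = target_region(18.0)  # driver points roughly north-north-east
            hit = search_micromap(micromap, region)
            # In the vehicle the result would be spoken and/or shown on the display.
            print("POI in the indicated direction:", hit.name if hit else "none found")
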
  • Patent number: 11169618
    Abstract: A gesture control system includes a processor in communication with a plurality of sensors. The processor is configured to detect, using the sensors, a gesture in a volume occupied by a plurality of occupants, analyze prior knowledge to associate the gesture with one of the occupants, and generate an output determined by both the gesture and the identified occupant.
    Type: Grant
    Filed: December 7, 2020
    Date of Patent: November 9, 2021
    Assignees: Honda Motor Co., Ltd., Edge 3 Technologies, Inc.
    Inventors: Stuart Yamamoto, Matt Conway, Graeme Asher, Matt McElvogue, Churu Yun, Tarek El Dokor, Josh King, James E. Holmes, Jordan Cluster
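    A minimal Python sketch of the occupant-attribution step in this abstract, under illustrative assumptions: the only prior knowledge is a nominal hand position per seat, and the same gesture maps to different outputs depending on which occupant made it. The names (Gesture, SEAT_PRIOR, associate_occupant) are hypothetical.

        from dataclasses import dataclass
        from typing import Dict, Tuple

        @dataclass
        class Gesture:
            kind: str                      # e.g. "swipe_left"
            position: Tuple[float, float]  # (x, y) in cabin coordinates, metres

        # Prior knowledge: nominal hand position for each occupant's seat.
        SEAT_PRIOR: Dict[str, Tuple[float, float]] = {
            "driver": (0.4, 0.5),
            "front_passenger": (-0.4, 0.5),
        }

        def associate_occupant(gesture: Gesture) -> str:
            """Attribute the gesture to the occupant whose prior position is closest."""
            gx, gy = gesture.position
            return min(SEAT_PRIOR, key=lambda seat: (SEAT_PRIOR[seat][0] - gx) ** 2
                                                    + (SEAT_PRIOR[seat][1] - gy) ** 2)

        def generate_output(gesture: Gesture, occupant: str) -> str:
            """The same gesture produces different outputs depending on who made it."""
            if gesture.kind == "swipe_left":
                return "previous_track" if occupant == "driver" else "close_passenger_menu"
            return "ignored"

        if __name__ == "__main__":
            g = Gesture("swipe_left", position=(0.35, 0.55))
            who = associate_occupant(g)
            print(who, "->", generate_output(g, who))   # driver -> previous_track
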
  • Publication number: 20210191521
    Abstract: A gesture control system includes a processor in communication with a plurality of sensors. The processor is configured to detect, using the sensors, a gesture in a volume occupied by a plurality of occupants, analyze prior knowledge to associate the gesture with one of the occupants, and generate an output determined by both the gesture and the identified occupant.
    Type: Application
    Filed: December 7, 2020
    Publication date: June 24, 2021
    Inventors: Stuart Yamamoto, Matt Conway, Graeme Asher, Matt McElvogue, Churu Yun, Tarek El Dokor, Josh King, James E. Holmes, Jordan Cluster
  • Patent number: 10860116
    Abstract: A gesture control system includes a processor in communication with a plurality of sensors. The processor is configured to detect, using the sensors, a gesture in a volume occupied by a plurality of occupants, analyze prior knowledge to associate the gesture with one of the occupants, and generate an output determined by both the gesture and the identified occupant.
    Type: Grant
    Filed: January 13, 2020
    Date of Patent: December 8, 2020
    Assignees: Honda Motor Co., Ltd., Edge 3 Technologies, Inc.
    Inventors: Stuart Yamamoto, Matt Conway, Graeme Asher, Matt McElvogue, Churu Yun, Tarek El Dokor, Josh King, James E. Holmes, Jordan Cluster
  • Publication number: 20200150775
    Abstract: A gesture control system includes a processor in communication with a plurality of sensors. The processor is configured to detect, using the sensors, a gesture in a volume occupied by a plurality of occupants, analyze prior knowledge to associate the gesture with one of the occupants, and generate an output determined by both the gesture and the identified occupant.
    Type: Application
    Filed: January 13, 2020
    Publication date: May 14, 2020
    Inventors: Stuart Yamamoto, Matt Conway, Graeme Asher, Matt McElvogue, Churu Yun, Tarek El Dokor, Josh King, James E. Holmes, Jordan Cluster
  • Patent number: 10551936
    Abstract: A gesture control system includes a processor in communication with a plurality of sensors. The processor is configured to detect, using the sensors, a gesture in a volume occupied by a plurality of occupants, analyze prior knowledge to associate the gesture with one of the occupants, and generate an output determined by both the gesture and the identified occupant.
    Type: Grant
    Filed: October 17, 2018
    Date of Patent: February 4, 2020
    Assignees: Honda Motor Co., Ltd., Edge 3 Technologies, Inc.
    Inventors: Stuart Yamamoto, Matt Conway, Graeme Asher, Matt McElvogue, Churu Yun, Tarek El Dokor, Josh King, James E. Holmes, Jordan Cluster
  • Patent number: 10466657
    Abstract: A method and system for globally updating a plurality of learning implicit gesture control systems. Embodiments can comprise receiving, by a global server, user data from the plurality of learning implicit gesture control systems. The global server is configured to analyze the user data, determine an applicable integration level, and communicate with the plurality of learning implicit gesture control systems. When the global server determines that the applicable integration level is a global integration level, a global parameter is modified and the modified global parameter is transmitted to the plurality of learning implicit gesture control systems.
    Type: Grant
    Filed: September 18, 2014
    Date of Patent: November 5, 2019
    Assignees: Honda Motor Co., Ltd., Edge 3 Technologies, Inc.
    Inventors: Tarek A. El Dokor, Joshua King, Jordan Cluster, James E. Holmes
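    A minimal Python sketch of the global-update loop in this abstract, under illustrative assumptions: each learning system reports a single locally learned threshold, and a toy rule treats broadly agreeing reports as a global integration level. The class and field names (GlobalServer, ClientSystem, UserData) are hypothetical.

        from dataclasses import dataclass, field
        from typing import List

        @dataclass
        class UserData:
            system_id: str
            suggested_gesture_threshold: float  # locally learned value

        @dataclass
        class ClientSystem:
            system_id: str
            gesture_threshold: float = 0.5

            def apply_global_parameter(self, value: float) -> None:
                self.gesture_threshold = value

        @dataclass
        class GlobalServer:
            clients: List[ClientSystem]
            reports: List[UserData] = field(default_factory=list)

            def receive(self, data: UserData) -> None:
                self.reports.append(data)

            def integration_level(self) -> str:
                # Toy rule: if the reported values broadly agree, treat the change as global.
                values = [r.suggested_gesture_threshold for r in self.reports]
                return "global" if len(values) >= 2 and max(values) - min(values) < 0.1 else "local"

            def update(self) -> None:
                if self.integration_level() == "global":
                    new_value = sum(r.suggested_gesture_threshold for r in self.reports) / len(self.reports)
                    for client in self.clients:  # transmit the modified global parameter
                        client.apply_global_parameter(new_value)

        if __name__ == "__main__":
            clients = [ClientSystem("car-1"), ClientSystem("car-2")]
            server = GlobalServer(clients)
            server.receive(UserData("car-1", 0.62))
            server.receive(UserData("car-2", 0.58))
            server.update()
            print([c.gesture_threshold for c in clients])  # both updated to ~0.60
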
  • Publication number: 20190079592
    Abstract: A gesture control system includes a processor in communication with a plurality of sensors. The processor is configured to detect, using the sensors, a gesture in a volume occupied by a plurality of occupants, analyze prior knowledge to associate the gesture with one of the occupants, and generate an output determined by both the gesture and the identified occupant.
    Type: Application
    Filed: October 17, 2018
    Publication date: March 14, 2019
    Inventors: Stuart Yamamoto, Matt Conway, Graeme Asher, Matt McElvogue, Churu Yun, Tarek El Dokor, Josh King, James E. Holmes, Jordan Cluster
  • Patent number: 10156906
    Abstract: A gesture control system includes a processor in communication with a plurality of sensors. The processor is configured to detect, using the sensors, a gesture in a volume occupied by a plurality of occupants, analyze prior knowledge to associate the gesture with one of the occupants, and generate an output determined by both the gesture and the identified occupant.
    Type: Grant
    Filed: October 2, 2015
    Date of Patent: December 18, 2018
    Assignees: Honda Motor Co., Ltd., Edge 3 Technologies, Inc.
    Inventors: Stuart Yamamoto, Tarek El Dokor, Graeme Asher, Jordan Cluster, Matt Conway, Josh King, James E. Holmes, Matt McElvogue, Churu Yun
  • Publication number: 20180348886
    Abstract: A user, such as the driver of a vehicle, can retrieve information related to a point of interest (POI) near the vehicle by pointing at the POI or performing some other gesture that identifies it. Gesture recognition is performed on the gesture to generate a target region that includes the POI that the user identified. After generating the target region, information about the POI can be retrieved by querying a server-based POI service with the target region or by searching a locally stored micromap. The retrieved POI information can then be provided to the user via a display and/or speaker in the vehicle. This process beneficially allows a user to rapidly identify and retrieve information about a POI near the vehicle without having to navigate a user interface by manipulating a touchscreen or physical buttons.
    Type: Application
    Filed: August 8, 2018
    Publication date: December 6, 2018
    Inventors: Pedram Vaghefinazari, Stuart Masakazu Yamamoto, Tarek A. El Dokor, Jordan Cluster, James E. Holmes
  • Patent number: 10073535
    Abstract: A user, such as the driver of a vehicle, can retrieve information related to a point of interest (POI) near the vehicle by pointing at the POI or performing some other gesture that identifies it. Gesture recognition is performed on the gesture to generate a target region that includes the POI that the user identified. After generating the target region, information about the POI can be retrieved by querying a server-based POI service with the target region or by searching a locally stored micromap. The retrieved POI information can then be provided to the user via a display and/or speaker in the vehicle. This process beneficially allows a user to rapidly identify and retrieve information about a POI near the vehicle without having to navigate a user interface by manipulating a touchscreen or physical buttons.
    Type: Grant
    Filed: December 2, 2016
    Date of Patent: September 11, 2018
    Assignees: Honda Motor Co., Ltd., EDGE 3 TECHNOLOGIES, INC.
    Inventors: Pedram Vaghefinazari, Stuart Masakazu Yamamoto, Tarek A. El Dokor, Jordan Cluster, James E. Holmes
  • Patent number: 9694283
    Abstract: A method, computer program, and system for tracking movement of a subject. The method includes receiving data from a distributed network of camera sensors, which employ one or more emitted light sources, to generate a volumetric three-dimensional representation of the subject; identifying a plurality of clusters within that representation that correspond to motion features of the subject; presenting one or more objects on one or more three-dimensional display screens; and using the fixed-position sensors to track motion of the subject's motion features and their manipulation of the volumetric representation, thereby determining interactions between those motion features and the objects on the three-dimensional display.
    Type: Grant
    Filed: February 10, 2013
    Date of Patent: July 4, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Tarek El Dokor, Joshua E King, James E. Holmes, Justin R. Gigliotto, William E. Glomski
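    A minimal Python sketch of the pipeline this abstract describes, under illustrative assumptions: the volumetric representation is a small 3-D point set, the motion features are greedy distance-based clusters, and an interaction is a cluster centroid falling within reach of a displayed object. The clustering rule and thresholds are illustrative only.

        from typing import Dict, List, Tuple

        Point = Tuple[float, float, float]

        def cluster_points(points: List[Point], radius: float = 0.15) -> List[List[Point]]:
            """Greedy single-linkage grouping: a point joins a cluster if it is within
            `radius` of any point already in that cluster."""
            clusters: List[List[Point]] = []
            for p in points:
                for cluster in clusters:
                    if any(sum((a - b) ** 2 for a, b in zip(p, q)) <= radius ** 2 for q in cluster):
                        cluster.append(p)
                        break
                else:
                    clusters.append([p])
            return clusters

        def centroid(cluster: List[Point]) -> Point:
            n = len(cluster)
            return tuple(sum(c[i] for c in cluster) / n for i in range(3))

        def interactions(clusters: List[List[Point]], objects: Dict[str, Point],
                         reach: float = 0.2) -> List[str]:
            """Report which displayed objects a motion feature (cluster centroid) touches."""
            hits = []
            for cluster in clusters:
                cx, cy, cz = centroid(cluster)
                for name, (ox, oy, oz) in objects.items():
                    if (cx - ox) ** 2 + (cy - oy) ** 2 + (cz - oz) ** 2 <= reach ** 2:
                        hits.append(name)
            return hits

        if __name__ == "__main__":
            hand = [(0.50, 1.00, 0.30), (0.52, 1.01, 0.31), (0.49, 0.98, 0.29)]
            noise = [(2.00, 0.10, 0.00)]
            screen_objects = {"button": (0.5, 1.0, 0.3), "slider": (1.5, 1.0, 0.3)}
            print(interactions(cluster_points(hand + noise), screen_objects))  # ['button']
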
  • Patent number: 9684427
    Abstract: A three-dimensional virtual-touch human-machine interface system (20) and a method (100) of operating the system (20) are presented. The system (20) incorporates a three-dimensional time-of-flight sensor (22), a three-dimensional autostereoscopic display (24), and a computer (26) coupled to the sensor (22) and the display (24). The sensor (22) detects a user object (40) within a three-dimensional sensor space (28). The display (24) displays an image (42) within a three-dimensional display space (32). The computer (26) maps a position of the user object (40) within an interactive volumetric field (36) mutually within the sensor space (28) and the display space (32), and determines when the positions of the user object (40) and the image (42) are substantially coincident. Upon detection of coincidence, the computer (26) executes a function programmed for the image (42).
    Type: Grant
    Filed: July 3, 2014
    Date of Patent: June 20, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Tarek El Dokor, Joshua T. King, James E. Holmes, William E. Glomski, Maria N. Ngomba
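    A minimal Python sketch of the coincidence test this abstract describes, under illustrative assumptions: the user object's position has already been mapped into the shared interactive volume, and the function programmed for an image fires when the two positions are substantially coincident. The names and tolerance are hypothetical.

        from dataclasses import dataclass
        from typing import Callable, List, Tuple

        Vec3 = Tuple[float, float, float]

        @dataclass
        class DisplayedImage:
            position: Vec3                 # where the autostereoscopic display renders it
            on_touch: Callable[[], None]   # function programmed for this image

        def substantially_coincident(a: Vec3, b: Vec3, tolerance: float = 0.03) -> bool:
            """True when the two positions are within a small distance of each other."""
            return sum((x - y) ** 2 for x, y in zip(a, b)) <= tolerance ** 2

        def update(user_object_pos: Vec3, images: List[DisplayedImage]) -> None:
            """Called once per sensor frame with the mapped position of the user object."""
            for image in images:
                if substantially_coincident(user_object_pos, image.position):
                    image.on_touch()

        if __name__ == "__main__":
            images = [DisplayedImage((0.0, 0.0, 0.4), lambda: print("play button pressed"))]
            update((0.01, -0.01, 0.41), images)  # fingertip reported by the time-of-flight sensor
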
  • Publication number: 20170083106
    Abstract: A user, such as the driver of a vehicle, can retrieve information related to a point of interest (POI) near the vehicle by pointing at the POI or performing some other gesture that identifies it. Gesture recognition is performed on the gesture to generate a target region that includes the POI that the user identified. After generating the target region, information about the POI can be retrieved by querying a server-based POI service with the target region or by searching a locally stored micromap. The retrieved POI information can then be provided to the user via a display and/or speaker in the vehicle. This process beneficially allows a user to rapidly identify and retrieve information about a POI near the vehicle without having to navigate a user interface by manipulating a touchscreen or physical buttons.
    Type: Application
    Filed: December 2, 2016
    Publication date: March 23, 2017
    Inventors: Pedram Vaghefinazari, Stuart Masakazu Yamamoto, Tarek A. El Dokor, Jordan Cluster, James E. Holmes
  • Patent number: 9541418
    Abstract: A user, such as the driver of a vehicle, can retrieve information related to a point of interest (POI) near the vehicle by pointing at the POI or performing some other gesture that identifies it. Gesture recognition is performed on the gesture to generate a target region that includes the POI that the user identified. After generating the target region, information about the POI can be retrieved by querying a server-based POI service with the target region or by searching a locally stored micromap. The retrieved POI information can then be provided to the user via a display and/or speaker in the vehicle. This process beneficially allows a user to rapidly identify and retrieve information about a POI near the vehicle without having to navigate a user interface by manipulating a touchscreen or physical buttons.
    Type: Grant
    Filed: July 16, 2014
    Date of Patent: January 10, 2017
    Assignees: Honda Motor Co., Ltd., Edge 3 Technologies, Inc.
    Inventors: Tarek A. El Dokor, Jordan Cluster, James E. Holmes, Pedram Vaghefinazari, Stuart M. Yamamoto
  • Patent number: 9346471
    Abstract: An in-vehicle computing system allows a user to control components of the vehicle by performing gestures. The user provides a selecting input to indicate that he wishes to control one of the components. After the component is identified, the user performs a gesture to control the component. The gesture and the component that was previously selected are analyzed to generate a command for the component. Since the command is based on both the gesture and the identified component, the user can perform the same gesture in the same position within the vehicle to control different components.
    Type: Grant
    Filed: October 14, 2014
    Date of Patent: May 24, 2016
    Assignees: Honda Motor Co., Ltd., Edge 3 Technologies, Inc.
    Inventors: Tarek A. El Dokor, Jordan Cluster, James E. Holmes, Pedram Vaghefinazari, Stuart M. Yamamoto
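    A minimal Python sketch of the selection-then-gesture idea in this abstract, under illustrative assumptions: selection and gesture recognition have already happened, and a lookup table maps the (component, gesture) pair to a command. The component names and gesture vocabulary are hypothetical.

        from typing import Dict, Optional, Tuple

        # (selected component, gesture) -> command sent to the vehicle
        COMMAND_TABLE: Dict[Tuple[str, str], str] = {
            ("sunroof", "swipe_up"): "sunroof_close",
            ("sunroof", "swipe_down"): "sunroof_open",
            ("radio", "swipe_up"): "volume_up",
            ("radio", "swipe_down"): "volume_down",
        }

        class GestureController:
            def __init__(self) -> None:
                self.selected: Optional[str] = None

            def select(self, component: str) -> None:
                """Selecting input, e.g. a button press or a spoken component name."""
                self.selected = component

            def on_gesture(self, gesture: str) -> str:
                """Combine the previously selected component with the gesture to form a command."""
                if self.selected is None:
                    return "no_component_selected"
                return COMMAND_TABLE.get((self.selected, gesture), "unsupported")

        if __name__ == "__main__":
            ctrl = GestureController()
            ctrl.select("sunroof")
            print(ctrl.on_gesture("swipe_up"))   # sunroof_close
            ctrl.select("radio")
            print(ctrl.on_gesture("swipe_up"))   # volume_up: same gesture, different command
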
  • Patent number: 9342797
    Abstract: Some embodiments provide systems and methods for enabling a learning implicit gesture control system for use by an occupant of a vehicle. The method includes identifying features received from a plurality of sensors and comparing the features to antecedent knowledge stored in memory. A system output action that corresponds to the features can then be provided in the form of a first vehicle output. The method further includes detecting a second vehicle output from the plurality of sensors and updating the antecedent knowledge to associate the system output action with the second vehicle output.
    Type: Grant
    Filed: April 3, 2014
    Date of Patent: May 17, 2016
    Assignees: Honda Motor Co., Ltd., Edge 3 Technologies, Inc.
    Inventors: Tarek A. El Dokor, Joshua King, Jordan Cluster, James E. Holmes
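    A minimal Python sketch of the learning loop in this abstract, under illustrative assumptions: sensed features are reduced to a string signature, and the antecedent knowledge is a simple mapping that is rewritten when a second vehicle output follows the system's action. All names are hypothetical.

        from typing import Dict, Optional

        class ImplicitGestureLearner:
            def __init__(self) -> None:
                # Antecedent knowledge: feature signature -> system output action.
                self.antecedent: Dict[str, str] = {"hand_near_vent": "increase_fan"}

            def act(self, features: str) -> Optional[str]:
                """Compare features from the sensors to antecedent knowledge and,
                if they match, return the corresponding first vehicle output."""
                return self.antecedent.get(features)

            def observe_followup(self, features: str, second_output: str) -> None:
                """If a second vehicle output follows the system's action, update the
                antecedent knowledge so that output is associated with the features."""
                self.antecedent[features] = second_output

        if __name__ == "__main__":
            learner = ImplicitGestureLearner()
            print(learner.act("hand_near_vent"))          # increase_fan
            # Occupant manually lowers the fan afterwards; learn that instead.
            learner.observe_followup("hand_near_vent", "decrease_fan")
            print(learner.act("hand_near_vent"))          # decrease_fan
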
  • Publication number: 20160103499
    Abstract: A gesture control system includes a processor in communication with a plurality of sensors. The processor is configured to detect, using the sensors, a gesture in a volume occupied by a plurality of occupants, analyze prior knowledge to associate the gesture with one of the occupants, and generate an output determined by both the gesture and the identified occupant.
    Type: Application
    Filed: October 2, 2015
    Publication date: April 14, 2016
    Applicants: Edge3 Technologies, LLC, Honda Motor Co., Ltd.
    Inventors: Stuart Yamamoto, Tarek A. El Dokor, Graeme Asher, Jordan Cluster, Matthew Conway, Joshua T. King, James E. Holmes, Matt McElvogue, Churu Yun
  • Patent number: 9272202
    Abstract: A method, computer program, and system for tracking movement of a subject. The method includes receiving data from a distributed network of camera sensors, which employ one or more emitted light sources, to generate a volumetric three-dimensional representation of the subject; identifying a plurality of clusters within that representation that correspond to motion features of the subject; presenting one or more objects on one or more three-dimensional display screens; and using the fixed-position sensors to track motion of the subject's motion features and their manipulation of the volumetric representation, thereby determining interactions between those motion features and the objects on the three-dimensional display.
    Type: Grant
    Filed: February 10, 2013
    Date of Patent: March 1, 2016
    Assignee: Edge 3 Technologies, Inc.
    Inventors: Tarek El Dokor, Joshua E. King, James E. Holmes, Justin R. Gigliotto, William E. Glomski
  • Publication number: 20150286191
    Abstract: A method and system for globally updating a plurality of learning implicit gesture control systems. Embodiments can comprise receiving, by a global server, user data from the plurality of learning implicit gesture control systems. The global server is configured to analyze the user data, determine an applicable integration level, and communicate with the plurality of learning implicit gesture control systems. When the global server determines that the applicable integration level is a global integration level, a global parameter is modified and the modified global parameter is transmitted to the plurality of learning implicit gesture control systems.
    Type: Application
    Filed: September 18, 2014
    Publication date: October 8, 2015
    Inventors: Tarek A. El Dokor, Joshua King, Jordan Cluster, James E. Holmes