Patents by Inventor Victor Ng

Victor Ng has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 9064420
    Abstract: An augmented reality driver system, device, and method safely guide a vehicle driver to yield to pedestrians. A vehicle navigator determines a turn lane based upon proximity to a vehicle. A target sensor detects a pedestrian entering the turn lane and determines a crosswalk path across the turn lane. An augmented reality controller three-dimensionally maps a forward view including the pedestrian, and spatially overlays an augmented reality display on a volumetric heads-up display for a driver of the vehicle by projecting a yielding indication adjacent to the crosswalk path.
    Type: Grant
    Filed: March 14, 2013
    Date of Patent: June 23, 2015
    Assignee: Honda Motor Co., Ltd.
    Inventors: Lee Beckwith, Victor Ng-Thow-Hing
  • Patent number: 9047703
    Abstract: A method, augmented reality driving system, and device safely guide a vehicle driver to perform a left turn. A vehicle navigator detects a left turn based upon proximity and speed for a vehicle. A target sensor determines a current position and a relative vector for an oncoming vehicle in a lane of opposing traffic that is approaching the left turn. An augmented reality controller three-dimensionally maps a forward view including the oncoming vehicle and spatially overlays an augmented reality display on a volumetric heads-up display for a driver of the vehicle by projecting a target path of the oncoming vehicle, based upon the current position and relative vector, and by projecting a left turn path.
    Type: Grant
    Filed: March 13, 2013
    Date of Patent: June 2, 2015
    Assignee: Honda Motor Co., Ltd.
    Inventors: Lee Beckwith, Victor Ng-Thow-Hing
  • Publication number: 20150062168
    Abstract: A method and system for providing augmented reality based directions. The method and system include receiving a voice input based on verbal cues provided by one or more vehicle occupants in a vehicle. The method and system also include receiving a gesture input and a gaze input based on gestural cues and gaze cues provided by the one or more vehicle occupants in the vehicle. The method and system additionally include determining directives based on the voice input, the gesture input and the gaze input and associating the directives with the surrounding environment of the vehicle. Additionally, the method and system include generating augmented reality graphical elements based on the directives and the association of the directives with the surrounding environment of the vehicle. The method and system further include displaying the augmented reality graphical elements on a heads-up display system of the vehicle.
    Type: Application
    Filed: October 16, 2014
    Publication date: March 5, 2015
    Inventors: Victor Ng-Thow-Hing, Cuong Tran, Karlin Bark
  • Publication number: 20150022426
    Abstract: A system for indicating braking intensity to a main vehicle has an observational device monitoring positional and speed data of at least one vehicle proximate the main vehicle. A control unit is coupled to the observational device. The control unit processes the positional and speed data monitored by the observational device and generates graphical representations of the at least one vehicle proximate the main vehicle and graphical representations of a braking intensity level of the at least one vehicle proximate the main vehicle.
    Type: Application
    Filed: October 3, 2014
    Publication date: January 22, 2015
    Applicant: Honda Motor Co., Ltd.
    Inventors: Victor Ng-Thow-Hing, Karlin Bark, Cuong Tran, Timothy Michael Stutts
  • Publication number: 20140375543
    Abstract: A system includes at least one sensor, at least one display, and a computing device coupled to the at least one sensor and the at least one display. The computing device includes a processor and a computer-readable storage medium having computer-executable instructions embodied thereon. When executed by at least one processor, the computer-executable instructions cause the processor to receive information from at least a first occupant, identify an object based at least partially on the received information, and present, on the at least one display, a first image associated with the object to a second occupant. The first image is aligned substantially between an eye position of the second occupant and the object such that, with respect to the eye position of the second occupant, the at least one display appears to either overlay the first image on the object or position the first image adjacent to the object.
    Type: Application
    Filed: June 25, 2013
    Publication date: December 25, 2014
    Applicant: Honda Motor Co., Ltd.
    Inventors: Victor Ng-Thow-Hing, Karlin Young Ju Bark, Cuong Tran
  • Publication number: 20140362195
    Abstract: One or more embodiments of techniques or systems for 3-dimensional (3-D) navigation are provided herein. A heads-up display (HUD) component can project, render, display, or present graphic elements on focal planes around an environment surrounding a vehicle. The HUD component can cause these graphic elements to appear volumetric or 3-D by moving or adjusting a distance between a focal plane and the vehicle. Objects within the environment may be tracked and identified, and corresponding graphic elements may be projected on, near, or around respective objects. For example, the HUD component may project graphic elements or pointers on pedestrians to alert the driver or operator of the vehicle to their presence. These pointers may stay ‘stuck’ on a pedestrian as he or she walks within the environment. Metadata associated with objects may be presented, such as address information, ratings, telephone numbers, logos, etc.
    Type: Application
    Filed: July 1, 2014
    Publication date: December 11, 2014
    Applicant: Honda Motor Co., Ltd.
    Inventors: Victor Ng-Thow-Hing, Karlin Bark, Cuong Tran
  • Publication number: 20140365228
    Abstract: Various exemplary embodiments relate to a command interpreter for use in a vehicle control system in a vehicle for interpreting user commands, a vehicle interaction system including such a command interpreter, a vehicle including such a vehicle interaction system, and related method and non-transitory machine-readable storage medium, including: a memory and a processor, the processor being configured to: receive, from at least one human via a first input device, a first input having a first type; receive a second input having a second type via a second input device, wherein the second type comprises at least one of sensed information describing a surrounding environment of the vehicle and input received from at least one human; interpret both the first input and the second input to generate a system instruction; and transmit the system instruction to a different system of the vehicle.
    Type: Application
    Filed: August 21, 2014
    Publication date: December 11, 2014
    Inventors: Victor Ng-Thow-Hing, Karlin Bark, Cuong Tran
  • Publication number: 20140354684
    Abstract: An augmented reality driver system, device, and method for providing real-time safety information to a driver by detecting the presence and attributes of pedestrians and other road users in the vicinity of a vehicle. An augmented reality controller spatially overlays an augmented reality display on a volumetric heads-up display by projecting indicators, associated with the social and behavioral states of road users, in the visual field of the driver.
    Type: Application
    Filed: May 28, 2013
    Publication date: December 4, 2014
    Applicant: Honda Motor Co., Ltd.
    Inventors: Lee Beckwith, Victor Ng-Thow-Hing
  • Publication number: 20140354692
    Abstract: A heads-up display device for displaying graphic elements in view of a user while the user views an environment through a display screen. The heads-up display device includes at least one projector that projects a graphic element on a dynamic, frontal focal plane in view of the user while the user views the environment through the display screen, and at least one projector that projects a graphic element on a static, ground-parallel focal plane in view of the user while the user views the environment through the display screen. A controller determines a target graphic element position and a graphic element size based on the target graphic element position for the graphic element projected on the frontal focal plane, so as to provide the user with an immersive three-dimensional heads-up display.
    Type: Application
    Filed: August 19, 2014
    Publication date: December 4, 2014
    Inventors: Victor Ng-Thow-Hing, Tom Zamojdo, Chris Grabowski
  • Publication number: 20140354691
    Abstract: A heads-up display device for displaying graphic elements in view of a user while the user views an environment through a display screen. The heads-up display device includes at least one projector that projects a graphic element on a frontal focal plane in view of the user while the user views the environment through the display screen, and at least one projector that projects a graphic element on a ground-parallel focal plane in view of the user while the user views the environment through the display screen. The projector that projects the graphic element on the frontal focal plane is mounted on an actuator that linearly moves the projector so as to cause the frontal focal plane to move in a direction of a line-of-sight of the user. The projector that projects the ground-parallel focal plane is fixedly arranged such that the ground-parallel focal plane is static.
    Type: Application
    Filed: August 15, 2014
    Publication date: December 4, 2014
    Inventors: Victor Ng-Thow-Hing, Tom Zamojdo, Chris Grabowski
  • Publication number: 20140267402
    Abstract: A heads-up display device for displaying graphic elements in view of a user while the user views an environment through a display screen. The heads-up display device includes at least one projector that projects a graphic element on a frontal focal plane in view of the user while the user views the environment through the display screen, and at least one projector that projects a graphic element on a ground-parallel focal plane in view of the user while the user views the environment through the display screen. The projector that projects the graphic element on the frontal focal plane is mounted on an actuator that linearly moves the projector so as to cause the frontal focal plane to move in a direction of a line-of-sight of the user. The projector that projects the ground-parallel focal plane is fixedly arranged such that the ground-parallel focal plane is static.
    Type: Application
    Filed: March 15, 2013
    Publication date: September 18, 2014
    Applicant: Honda Motor Co., Ltd.
    Inventors: Victor Ng-Thow-Hing, Tom Zamojdo, Chris Grabowski
  • Publication number: 20140282259
    Abstract: Navigating through objects or items on a display device using a first input device to detect a user's finger pointing at an object or item and a second input device to receive the user's confirmation of the selection. An image of the hand is captured by the first input device and processed to determine the location on the display device corresponding to the fingertip of the pointing finger. The object or item at the fingertip location is selected once the second input device receives a predetermined user input.
    Type: Application
    Filed: February 27, 2014
    Publication date: September 18, 2014
    Applicant: Honda Motor Co., Ltd.
    Inventors: Kikuo Fujimura, Victor Ng-Thow-Hing, Behzad Dariush
  • Publication number: 20140267398
    Abstract: An augmented reality driver system, device, and method safely guide a vehicle driver to yield to pedestrians. A vehicle navigator determines a turn lane based upon proximity to a vehicle. A target sensor detects a pedestrian entering the turn lane and determines a crosswalk path across the turn lane. An augmented reality controller three-dimensionally maps a forward view including the pedestrian, and spatially overlays an augmented reality display on a volumetric heads-up display for a driver of the vehicle by projecting a yielding indication adjacent to the crosswalk path.
    Type: Application
    Filed: March 14, 2013
    Publication date: September 18, 2014
    Applicant: Honda Motor Co., Ltd.
    Inventors: Lee Beckwith, Victor Ng-Thow-Hing
  • Publication number: 20140272812
    Abstract: A driver training system includes a training controller, a heads-up display device, and a driving cue adherence controller. The training controller is configured to receive inputs related to an operational state of a vehicle and an environment surrounding the vehicle, and to determine a driving cue based on the received inputs. The heads-up display device is configured to present the driving cue as an augmented reality graphic element in view of a driver by projecting graphic elements on a windshield of the vehicle. The driving cue adherence controller is configured to continuously determine a current level of adherence to the driving cue, and an aggregate level of adherence to the driving cue based on the continuously determined current level of adherence to the driving cue over a predetermined time period. The heads-up display device is configured to present the continuously determined aggregate level of adherence in view of the driver.
    Type: Application
    Filed: March 15, 2013
    Publication date: September 18, 2014
    Applicant: Honda Motor Co., Ltd.
    Inventors: Victor Ng-Thow-Hing, Lee Beckwith
  • Publication number: 20140268353
    Abstract: One or more embodiments of techniques or systems for 3-dimensional (3-D) navigation are provided herein. A heads-up display (HUD) component can project graphic elements on focal planes around an environment surrounding a vehicle. The HUD component can cause these graphic elements to appear volumetric or 3-D by moving or adjusting a distance between a focal plane and the vehicle. Additionally, a target position for graphic elements can be adjusted. This enables the HUD component to project graphic elements as moving avatars. In other words, adjusting the focal plane distance and the target position enables graphic elements to be projected in three dimensions along an x, y, and z axis. Further, a moving avatar can be ‘animated’ by sequentially projecting the avatar on different focal planes, thereby providing an occupant with the perception that the avatar is moving towards or away from the vehicle.
    Type: Application
    Filed: September 30, 2013
    Publication date: September 18, 2014
    Applicant: Honda Motor Co., Ltd.
    Inventors: Kikuo Fujimura, Victor Ng-Thow-Hing
  • Publication number: 20140267263
    Abstract: A method, augmented reality driving system, and device safely guide a vehicle driver to perform a left turn. A vehicle navigator detects a left turn based upon proximity and speed for a vehicle. A target sensor determines a current position and a relative vector for an oncoming vehicle in a lane of opposing traffic that is approaching the left turn. An augmented reality controller three-dimensionally maps a forward view including the oncoming vehicle and spatially overlays an augmented reality display on a volumetric heads-up display for a driver of the vehicle by projecting a target path of the oncoming vehicle, based upon the current position and relative vector, and by projecting a left turn path.
    Type: Application
    Filed: March 13, 2013
    Publication date: September 18, 2014
    Applicant: Honda Motor Co., Ltd.
    Inventors: Lee Beckwith, Victor Ng-Thow-Hing
  • Publication number: 20140266656
    Abstract: A system for warning of potential hazards when a vehicle is turning has a sensor coupled to the vehicle and configured to capture data of objects located around the vehicle. A control unit coupled to the sensor processes the data captured by the sensor to generate graphical representations of the objects captured by the sensor, graphical representations of the projected paths of moving objects, and a graphical representation of a projected turning path of the vehicle.
    Type: Application
    Filed: February 5, 2014
    Publication date: September 18, 2014
    Inventors: Victor Ng-Thow-Hing, Cuong Tran
  • Publication number: 20130293582
    Abstract: Generating a virtual model of the environment in front of a vehicle based on images captured using an image capturing device. The images captured by the image capturing device of a vehicle are processed to extract features of interest. Based on the extracted features, a virtual model of the environment is constructed. The virtual model includes one or more surfaces, each of which may be used as a reference surface to attach and move graphical elements generated to implement augmented reality (AR). As the vehicle moves, the graphical elements move as if affixed to one of the surfaces. By presenting the graphical elements so that they move together with real objects in front of the vehicle, the driver perceives them as part of the actual environment, reducing distraction or confusion associated with the graphical elements.
    Type: Application
    Filed: May 3, 2013
    Publication date: November 7, 2013
    Inventors: Victor Ng-Thow-Hing, Srinath Sridhar
  • Patent number: 8406925
    Abstract: A robot using less storage and computational resources to embody panoramic attention. The robot includes a panoramic attention module with multiple levels that are hierarchically structured to process different levels of information. The top-level of the panoramic attention module receives information about entities detected from the environment of the robot and maps the entities to a panoramic map maintained by the robot. By mapping and storing high-level entity information instead of low-level sensory information in the panoramic map, the amount of storage and computation resources for panoramic attention can be reduced significantly. Further, the mapping and storing of high-level entity information in the panoramic map also facilitates consistent and logical processing of different conceptual levels of information.
    Type: Grant
    Filed: June 18, 2010
    Date of Patent: March 26, 2013
    Assignee: Honda Motor Co., Ltd.
    Inventors: Ravi Kiran Sarvadevabhatla, Victor Ng-Thow-Hing
  • Publication number: 20120314020
    Abstract: A user interface screen for displaying data associated with operation of a robot, where the user interface screen includes one or more windows that can be rotated and then minimized into an icon to free space for other windows. As user input for moving a window is received, the window moves to an edge. As further user input is received, the window is rotated about an axis and then minimized into an icon. In this way, the windows presented on the screen can be operated intuitively by a user.
    Type: Application
    Filed: May 1, 2012
    Publication date: December 13, 2012
    Applicant: Honda Motor Co., Ltd.
    Inventor: Victor Ng-Thow-Hing
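
For readers of publication number 20140362195 above: the pointers that stay 'stuck' to a pedestrian can be understood as projecting a graphic element onto a focal plane matched to the pedestrian's tracked distance. The following is a minimal, purely illustrative Python sketch of that idea; the function name, pinhole-projection model, and coordinate conventions are assumptions for illustration, not the patented implementation.

```python
import math

def hud_pointer(ped_xyz, focal_length=1.0):
    """Project a tracked pedestrian's position (vehicle frame, metres:
    x = lateral, y = vertical, z = forward) onto a HUD image plane and
    report the focal-plane distance at which to render the pointer."""
    x, y, z = ped_xyz
    if z <= 0:
        return None  # behind the vehicle; nothing to render
    # Simple pinhole projection gives the 2-D anchor of the graphic element.
    u = focal_length * x / z
    v = focal_length * y / z
    # Rendering the pointer on a focal plane at the pedestrian's distance
    # makes it appear volumetric and keeps it 'stuck' as the pedestrian moves.
    plane_distance = math.hypot(x, z)
    return (u, v), plane_distance
```

Recomputing the anchor and focal-plane distance every tracking frame is what keeps the pointer attached to a walking pedestrian.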
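Similarly, the two-device selection flow in publication number 20140282259 (a camera locates the fingertip; a second device confirms the selection) can be sketched as below. All names, the proportional camera-to-display mapping, and the rectangle hit test are illustrative assumptions, not the disclosed method.

```python
def fingertip_to_display(tip_px, cam_res, disp_res):
    """Map a fingertip location in camera pixels to display coordinates
    with a simple proportional mapping (a real system would calibrate)."""
    cx, cy = cam_res
    dx, dy = disp_res
    x, y = tip_px
    return (x * dx // cx, y * dy // cy)

def select_item(tip_px, items, cam_res, disp_res, confirmed):
    """Return the item under the fingertip once the second input device
    (e.g. a steering-wheel button) signals confirmation; else None."""
    if not confirmed:
        return None
    px, py = fingertip_to_display(tip_px, cam_res, disp_res)
    for name, (x0, y0, x1, y1) in items.items():
        if x0 <= px <= x1 and y0 <= py <= y1:
            return name
    return None
```

Separating pointing (first device) from confirmation (second device) avoids accidental selections while the hand is merely hovering.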
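Finally, the 'animated' moving avatar of publication number 20140268353 amounts to projecting the avatar on a sequence of focal planes at increasing or decreasing distances. A tiny sketch of one way to generate such a keyframe sequence, assuming linear spacing (the actual system may use any easing or spacing):

```python
def avatar_keyframes(start_m, end_m, steps):
    """Focal-plane distances (metres) for sequentially projecting an
    avatar so it appears to move toward or away from the vehicle."""
    if steps < 2:
        raise ValueError("need at least two keyframes")
    step = (end_m - start_m) / (steps - 1)
    return [round(start_m + i * step, 3) for i in range(steps)]
```

Projecting the avatar on each distance in turn gives the occupant the perception of motion in depth.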