Patents by Inventor Youding Zhu

Youding Zhu has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20230357076
    Abstract: An end-to-end system for data generation, map creation using the generated data, and localization to the created map is disclosed. Mapstreams—or streams of sensor data, perception outputs from deep neural networks (DNNs), and/or relative trajectory data—corresponding to any number of drives by any number of vehicles may be generated and uploaded to the cloud. The mapstreams may be used to generate map data—and ultimately a fused high definition (HD) map—that represents data generated over a plurality of drives. When localizing to the fused HD map, individual localization results may be generated based on comparisons of real-time data from a sensor modality to map data corresponding to the same sensor modality. This process may be repeated for any number of sensor modalities and the results may be fused together to determine a final fused localization result.
    Type: Application
    Filed: May 2, 2023
    Publication date: November 9, 2023
    Inventors: Michael Kroepfl, Amir Akbarzadeh, Ruchi Bhargava, Vaibhav Thukral, Neda Cvijetic, Vadim Cugunovs, David Nister, Birgit Henke, Ibrahim Eden, Youding Zhu, Michael Grabner, Ivana Stojanovic, Yu Sheng, Jeffrey Liu, Enliang Zheng, Jordan Marr, Andrew Carley
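The final fusion step described in this abstract, combining per-modality localization results into a single estimate, can be illustrated with a simple inverse-variance (information-form) combination. This is an editorial sketch in Python/NumPy under assumed Gaussian pose estimates; the modality names, state layout, and fusion rule are illustrative assumptions, not the patented implementation.

```python
import numpy as np

def fuse_localization_results(estimates):
    """Fuse per-sensor-modality localization results into one pose.

    Each estimate is (pose, covariance): a 3-vector (x, y, heading) and a
    3x3 covariance describing its uncertainty.  The results are combined
    with a standard inverse-variance (information-form) average.
    """
    info_total = np.zeros((3, 3))
    info_sum = np.zeros(3)
    for pose, cov in estimates:
        info = np.linalg.inv(cov)      # information (inverse covariance)
        info_total += info
        info_sum += info @ pose
    fused_cov = np.linalg.inv(info_total)
    fused_pose = fused_cov @ info_sum
    return fused_pose, fused_cov

# Hypothetical camera, lidar, and radar localizations against the fused HD map.
camera = (np.array([10.2, 5.1, 0.30]), np.diag([0.25, 0.25, 0.010]))
lidar  = (np.array([10.0, 5.0, 0.28]), np.diag([0.04, 0.04, 0.004]))
radar  = (np.array([10.4, 5.3, 0.33]), np.diag([1.00, 1.00, 0.020]))
pose, cov = fuse_localization_results([camera, lidar, radar])
print("fused pose (x, y, heading):", pose)
```

In this formulation, lower-variance modalities dominate the fused result, so adding a noisier modality refines rather than degrades the estimate.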
  • Patent number: 11698272
    Abstract: An end-to-end system for data generation, map creation using the generated data, and localization to the created map is disclosed. Mapstreams—or streams of sensor data, perception outputs from deep neural networks (DNNs), and/or relative trajectory data—corresponding to any number of drives by any number of vehicles may be generated and uploaded to the cloud. The mapstreams may be used to generate map data—and ultimately a fused high definition (HD) map—that represents data generated over a plurality of drives. When localizing to the fused HD map, individual localization results may be generated based on comparisons of real-time data from a sensor modality to map data corresponding to the same sensor modality. This process may be repeated for any number of sensor modalities and the results may be fused together to determine a final fused localization result.
    Type: Grant
    Filed: August 31, 2020
    Date of Patent: July 11, 2023
    Assignee: NVIDIA Corporation
    Inventors: Michael Kroepfl, Amir Akbarzadeh, Ruchi Bhargava, Vaibhav Thukral, Neda Cvijetic, Vadim Cugunovs, David Nister, Birgit Henke, Ibrahim Eden, Youding Zhu, Michael Grabner, Ivana Stojanovic, Yu Sheng, Jeffrey Liu, Enliang Zheng, Jordan Marr, Andrew Carley
  • Patent number: 10948726
    Abstract: Optimizations are provided for generating passthrough visualizations for Head Mounted Displays. The interpupil distance of a user wearing a head-mounted device is determined and a stereo camera pair with a left and right camera is used to capture raw images. The center-line perspectives of the images captured by the left camera have non-parallel alignments with respect to center-line perspectives of any images captured by the right camera. After the raw images are captured, various camera distortion corrections are applied to the images to create corrected images. Epipolar transforms are then applied to the corrected images to create transformed images having parallel center-line perspectives. Thereafter, a depth map is generated of the transformed images. Finally, left and right passthrough visualizations are generated and rendered by reprojecting the transformed left and right images.
    Type: Grant
    Filed: October 4, 2019
    Date of Patent: March 16, 2021
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Youding Zhu, Michael Bleyer, Denis Claude Pierre Demandolx, Raymond Kirk Price
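Of the pipeline in this abstract, the final reprojection step is the easiest to sketch: after rectification, shifting each pixel by a depth-dependent disparity approximates moving the viewpoint from the camera to the user's eye. The NumPy function below is an illustration only; the pure horizontal-shift model, variable names, and per-eye offset are assumptions, and the full pipeline also includes the distortion correction and stereo depth estimation described above.

```python
import numpy as np

def reproject_to_eye(image, depth_m, fx_px, camera_to_eye_offset_m):
    """Shift a rectified camera image toward an eye position.

    After rectification the camera and the target eye differ (approximately)
    by a horizontal offset, so each pixel moves by a depth-dependent
    disparity: shift_px = fx_px * offset_m / depth_m.
    """
    h, w = depth_m.shape
    out = np.zeros_like(image)
    xs = np.arange(w)
    for y in range(h):
        shift = np.round(fx_px * camera_to_eye_offset_m / depth_m[y]).astype(int)
        src_x = np.clip(xs + shift, 0, w - 1)
        out[y] = image[y, src_x]
    return out
```

The per-eye offset would be derived from the measured interpupil distance and the camera baseline; near pixels shift more than far ones, which is what makes the passthrough view match the wearer's perspective.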
  • Publication number: 20210063200
    Abstract: An end-to-end system for data generation, map creation using the generated data, and localization to the created map is disclosed. Mapstreams—or streams of sensor data, perception outputs from deep neural networks (DNNs), and/or relative trajectory data—corresponding to any number of drives by any number of vehicles may be generated and uploaded to the cloud. The mapstreams may be used to generate map data—and ultimately a fused high definition (HD) map—that represents data generated over a plurality of drives. When localizing to the fused HD map, individual localization results may be generated based on comparisons of real-time data from a sensor modality to map data corresponding to the same sensor modality. This process may be repeated for any number of sensor modalities and the results may be fused together to determine a final fused localization result.
    Type: Application
    Filed: August 31, 2020
    Publication date: March 4, 2021
    Inventors: Michael Kroepfl, Amir Akbarzadeh, Ruchi Bhargava, Vaibhav Thukral, Neda Cvijetic, Vadim Cugunovs, David Nister, Birgit Henke, Ibrahim Eden, Youding Zhu, Michael Grabner, Ivana Stojanovic, Yu Sheng, Jeffrey Liu, Enliang Zheng, Jordan Marr, Andrew Carley
  • Publication number: 20200041799
    Abstract: Optimizations are provided for generating passthrough visualizations for Head Mounted Displays. The interpupil distance of a user wearing a head-mounted device is determined and a stereo camera pair with a left and right camera is used to capture raw images. The center-line perspectives of the images captured by the left camera have non-parallel alignments with respect to center-line perspectives of any images captured by the right camera. After the raw images are captured, various camera distortion corrections are applied to the images to create corrected images. Epipolar transforms are then applied to the corrected images to create transformed images having parallel center-line perspectives. Thereafter, a depth map is generated of the transformed images. Finally, left and right passthrough visualizations are generated and rendered by reprojecting the transformed left and right images.
    Type: Application
    Filed: October 4, 2019
    Publication date: February 6, 2020
    Inventors: Youding Zhu, Michael Bleyer, Denis Claude Pierre Demandolx, Raymond Kirk Price
  • Patent number: 10546426
    Abstract: A virtual reality scene is displayed via a display device. A real-world positioning of a peripheral control device is identified relative to the display device. Video of a real-world scene of a physical environment located behind a display region of the display device is captured via a camera. A real-world portal is selectively displayed via the display device that includes a portion of the real-world scene and simulates a view through the virtual reality scene at a position within the display region that tracks the real-world positioning of the peripheral control device.
    Type: Grant
    Filed: February 1, 2018
    Date of Patent: January 28, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Alexandru Octavian Balan, Youding Zhu, Min shik Park
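A minimal sketch of the compositing described in this abstract: project the tracked controller position onto the display, then blend the camera video into the VR frame inside a window centred at that point. The pinhole projection, circular portal shape, and function names are illustrative assumptions.

```python
import numpy as np

def project_to_screen(controller_pos_m, fx, fy, cx, cy):
    """Project the controller's 3D position (display/camera coordinates,
    Z forward) onto a 2D pixel location in the display region."""
    x, y, z = controller_pos_m
    return int(fx * x / z + cx), int(fy * y / z + cy)

def composite_portal(vr_frame, camera_frame, center_px, radius_px):
    """Blend a circular 'portal' of real-world video into the VR frame,
    centred wherever the tracked controller currently projects."""
    h, w = vr_frame.shape[:2]
    yy, xx = np.mgrid[0:h, 0:w]
    cx, cy = center_px
    mask = (xx - cx) ** 2 + (yy - cy) ** 2 <= radius_px ** 2
    out = vr_frame.copy()
    out[mask] = camera_frame[mask]
    return out
```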
  • Patent number: 10437065
    Abstract: Optimizations are provided for generating passthrough visualizations for Head Mounted Displays. The interpupil distance of a user wearing a head-mounted device is determined and a stereo camera pair with a left and right camera is used to capture raw images. The center-line perspectives of the images captured by the left camera have non-parallel alignments with respect to center-line perspectives of any images captured by the right camera. After the raw images are captured, various camera distortion corrections are applied to the images to create corrected images. Epipolar transforms are then applied to the corrected images to create transformed images having parallel center-line perspectives. Thereafter, a depth map is generated of the transformed images. Finally, left and right passthrough visualizations are generated and rendered by reprojecting the transformed left and right images.
    Type: Grant
    Filed: October 3, 2017
    Date of Patent: October 8, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Youding Zhu, Michael Bleyer, Denis Claude Pierre Demandolx, Raymond Kirk Price
  • Publication number: 20190213793
    Abstract: A virtual reality scene is displayed via a display device. A real-world positioning of a peripheral control device is identified relative to the display device. Video of a real-world scene of a physical environment located behind a display region of the display device is captured via a camera. A real-world portal is selectively displayed via the display device that includes a portion of the real-world scene and simulates a view through the virtual reality scene at a position within the display region that tracks the real-world positioning of the peripheral control device.
    Type: Application
    Filed: February 1, 2018
    Publication date: July 11, 2019
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Alexandru Octavian Balan, Youding Zhu, Min shik Park
  • Publication number: 20190101758
    Abstract: Optimizations are provided for generating passthrough visualizations for Head Mounted Displays. The interpupil distance of a user wearing a head-mounted device is determined and a stereo camera pair with a left and right camera is used to capture raw images. The center-line perspectives of the images captured by the left camera have non-parallel alignments with respect to center-line perspectives of any images captured by the right camera. After the raw images are captured, various camera distortion corrections are applied to the images to create corrected images. Epipolar transforms are then applied to the corrected images to create transformed images having parallel center-line perspectives. Thereafter, a depth map is generated of the transformed images. Finally, left and right passthrough visualizations are generated and rendered by reprojecting the transformed left and right images.
    Type: Application
    Filed: October 3, 2017
    Publication date: April 4, 2019
    Inventors: Youding Zhu, Michael Bleyer, Denis Claude Pierre Demandolx, Raymond Kirk Price
  • Patent number: 9746675
    Abstract: A head-mounted display device is disclosed, which includes an at least partially see-through display, a processor configured to detect a physical feature, generate an alignment hologram based on the physical feature, determine a view of the alignment hologram based on a default view matrix for a first eye of a user of the head-mounted display device, display the view of the alignment hologram to the first eye of the user on the at least partially see-through display, output an instruction to the user to enter an adjustment input to visually align the alignment hologram with the physical feature, determine a calibrated view matrix based on the default view matrix and the adjustment input, and adjust a view matrix setting of the head-mounted display device based on the calibrated view matrix.
    Type: Grant
    Filed: May 28, 2015
    Date of Patent: August 29, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Quentin Simon Charles Miller, Drew Steedly, Denis Demandolx, Youding Zhu, Qi Kuan Zhou, Todd Michael Lyon
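The calibration output described in this abstract is a per-eye view matrix corrected by the user's alignment input. The sketch below models that adjustment as a small view-space translation composed onto the default view matrix; the 4x4 convention and the translational model of the adjustment are assumptions for illustration.

```python
import numpy as np

def calibrated_view_matrix(default_view, adjustment_offset_m):
    """Compose the user's alignment adjustment (modelled here as a small
    view-space translation) onto the default per-eye view matrix."""
    correction = np.eye(4)
    correction[:3, 3] = adjustment_offset_m
    return correction @ default_view

# Example: the user nudged the alignment hologram 2 mm right and 1 mm down.
default_view = np.eye(4)
calibrated = calibrated_view_matrix(default_view, np.array([0.002, -0.001, 0.0]))
```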
  • Patent number: 9658686
    Abstract: Various embodiments relating to using motion based view matrix tuning to calibrate a head-mounted display device are disclosed. In one embodiment, the holograms are rendered with different view matrices, each view matrix corresponding to a different inter-pupillary distance. Upon selection by the user of the most stable hologram, the head-mounted display device can be calibrated to the inter-pupillary distance corresponding to the selected most stable hologram.
    Type: Grant
    Filed: May 28, 2015
    Date of Patent: May 23, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Quentin Simon Charles Miller, Drew Steedly, Denis Demandolx, Youding Zhu, Qi Kuan Zhou, Todd Michael Lyon
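The calibration flow in this abstract renders the same hologram with view matrices built from several candidate inter-pupillary distances and lets the user pick the most stable one. A compact sketch, with the candidate values, matrix convention, and selection mechanism all assumed for illustration:

```python
import numpy as np

def eye_view_matrices(head_pose, ipd_m):
    """Left/right view matrices for one candidate IPD: each eye is offset
    by half the IPD along the head's x axis (the view transform shifts the
    world opposite to the eye)."""
    left, right = np.eye(4), np.eye(4)
    left[0, 3] = +ipd_m / 2.0
    right[0, 3] = -ipd_m / 2.0
    return left @ head_pose, right @ head_pose

candidate_ipds_mm = [58, 61, 64, 67, 70]
head_pose = np.eye(4)
views = [eye_view_matrices(head_pose, d / 1000.0) for d in candidate_ipds_mm]
chosen = 2  # index the user reports as the most stable hologram
calibrated_ipd_mm = candidate_ipds_mm[chosen]
```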
  • Publication number: 20160349837
    Abstract: Various embodiments relating to using motion based view matrix tuning to calibrate a head-mounted display device are disclosed. In one embodiment, the holograms are rendered with different view matrices, each view matrix corresponding to a different inter-pupillary distance. Upon selection by the user of the most stable hologram, the head-mounted display device can be calibrated to the inter-pupillary distance corresponding to the selected most stable hologram.
    Type: Application
    Filed: May 28, 2015
    Publication date: December 1, 2016
    Inventors: Quentin Simon Charles Miller, Drew Steedly, Denis Demandolx, Youding Zhu, Qi Kuan Zhou, Todd Michael Lyon
  • Publication number: 20160349510
    Abstract: A head-mounted display device is disclosed, which includes an at least partially see-through display, a processor configured to detect a physical feature, generate an alignment hologram based on the physical feature, determine a view of the alignment hologram based on a default view matrix for a first eye of a user of the head-mounted display device, display the view of the alignment hologram to the first eye of the user on the at least partially see-through display, output an instruction to the user to enter an adjustment input to visually align the alignment hologram with the physical feature, determine a calibrated view matrix based on the default view matrix and the adjustment input, and adjust a view matrix setting of the head-mounted display device based on the calibrated view matrix.
    Type: Application
    Filed: May 28, 2015
    Publication date: December 1, 2016
    Inventors: Quentin Simon Charles Miller, Drew Steedly, Denis Demandolx, Youding Zhu, Qi Kuan Zhou, Todd Michael Lyon
  • Patent number: 9352230
    Abstract: Techniques for interpreting motions of a motion-sensitive device are described to allow natural and intuitive interfaces for controlling an application (e.g., a video game). A motion-sensitive device includes inertial sensors and generates sensor signals sufficient to derive positions and orientations of the device in six degrees of freedom. The motion of the device in six degrees of freedom is tracked by analyzing sensor data from the inertial sensors in conjunction with data from a secondary source that may be a camera or a non-inertial sensor. Different techniques are provided to correct or minimize errors in deriving the positions and orientations of the device. These techniques include “stop” detection, back tracking, extrapolation of sensor data beyond the sensor ranges, and using constraints from multiple trackable objects in an application being interfaced with the motion-sensitive device.
    Type: Grant
    Filed: May 11, 2011
    Date of Patent: May 31, 2016
    Assignee: AiLive Inc.
    Inventors: Dana Wilkinson, Charles Musick, Jr., William Robert Powers, III, Youding Zhu
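Two of the error-handling ideas named in this abstract, "stop" detection and correction from a secondary source, can be sketched with a toy tracker: integrate inertial data, clamp the velocity when the device appears motionless, and blend toward occasional camera fixes. This is a simplified complementary-filter illustration; the thresholds, frame conventions, and the assumption of gravity-compensated acceleration are all editorial, not details from the patent.

```python
import numpy as np

class InertialTracker:
    """Toy 3D position tracker: integrates gravity-compensated inertial
    data and corrects drift with fixes from a secondary source such as a
    camera."""

    def __init__(self):
        self.pos = np.zeros(3)
        self.vel = np.zeros(3)

    def predict(self, lin_accel_world, gyro_rate, dt):
        # "Stop" detection: near-zero linear acceleration and rotation rate
        # means the device is at rest, so clamp velocity to halt drift.
        if np.linalg.norm(lin_accel_world) < 0.05 and np.linalg.norm(gyro_rate) < 0.02:
            self.vel[:] = 0.0
            return
        self.vel += lin_accel_world * dt
        self.pos += self.vel * dt

    def correct(self, camera_position, gain=0.2):
        # Blend toward the secondary-source fix (complementary-filter style).
        self.pos += gain * (camera_position - self.pos)
```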
  • Patent number: 9292734
    Abstract: Techniques for performing accurate and automatic head pose estimation are disclosed. According to one aspect of the techniques, head pose estimation is integrated with a scale-invariant head tracking method along with facial features detected from a located head in images. Thus the head pose estimation works efficiently even when there are large translational movements resulting from the head motion. Various computation techniques are used to optimize the process of estimation so that the head pose estimation can be applied to control one or more objects in a virtual environment and virtual character gaze control.
    Type: Grant
    Filed: July 11, 2014
    Date of Patent: March 22, 2016
    Assignee: AiLive, Inc.
    Inventors: Youding Zhu, Charles Musick, Jr., Robert Kay, William Robert Powers, III, Dana Wilkinson, Stuart Reynolds
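Recovering a head pose from detected facial features, as described in this abstract, is commonly posed as a perspective-n-point problem. The sketch below uses OpenCV's solver with a generic 3D landmark model; OpenCV, the landmark coordinates, and the feature set are assumptions, not details taken from the patent.

```python
import numpy as np
import cv2

# Rough 3D face landmarks (millimetres): nose tip, chin, eye and mouth corners.
MODEL_POINTS = np.array([
    [  0.0,   0.0,   0.0],   # nose tip
    [  0.0, -63.6, -12.5],   # chin
    [-43.3,  32.7, -26.0],   # left eye outer corner
    [ 43.3,  32.7, -26.0],   # right eye outer corner
    [-28.9, -28.9, -24.1],   # left mouth corner
    [ 28.9, -28.9, -24.1],   # right mouth corner
], dtype=np.float64)

def estimate_head_pose(image_points, focal_px, center_px):
    """Estimate head rotation and translation from 2D facial feature
    locations detected inside the tracked head region."""
    camera_matrix = np.array([[focal_px, 0.0, center_px[0]],
                              [0.0, focal_px, center_px[1]],
                              [0.0, 0.0, 1.0]])
    ok, rvec, tvec = cv2.solvePnP(MODEL_POINTS,
                                  np.asarray(image_points, dtype=np.float64),
                                  camera_matrix, None)
    rotation, _ = cv2.Rodrigues(rvec)
    return rotation, tvec
```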
  • Patent number: 9165199
    Abstract: A system, method, and computer program product for estimating human body pose are described. According to one aspect, anatomical features are detected in a depth image of a human actor. The method detects a head, neck, and trunk (H-N-T) template in the depth image, and detects limbs in the depth image based on the H-N-T template. The anatomical features are detected based on the H-N-T template and the limbs. A pose of a human model is estimated based on the detected features and kinematic constraints of the human model.
    Type: Grant
    Filed: May 29, 2009
    Date of Patent: October 20, 2015
    Assignee: Honda Motor Co., Ltd.
    Inventors: Youding Zhu, Behzad Dariush, Kikuo Fujimura
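Only the last step of this abstract, enforcing the human model's kinematic constraints on the estimated pose, is sketched below; the H-N-T template detection and limb detection are omitted, and the joint set and limits are illustrative values rather than the patent's model.

```python
import numpy as np

# Illustrative joint-angle limits (radians) for a simplified human model.
JOINT_LIMITS = {
    "neck_pitch":     (-0.7, 0.7),
    "shoulder_pitch": (-3.1, 3.1),
    "elbow_flex":     ( 0.0, 2.6),
    "hip_pitch":      (-2.1, 0.5),
    "knee_flex":      ( 0.0, 2.3),
}

def constrain_pose(raw_angles):
    """Project joint angles recovered from the detected anatomical features
    onto the kinematically feasible range of the human model."""
    return {joint: float(np.clip(angle, *JOINT_LIMITS[joint]))
            for joint, angle in raw_angles.items()}
```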
  • Patent number: 9098766
    Abstract: A system, method, and computer program product for estimating upper body human pose are described. According to one aspect, a plurality of anatomical features are detected in a depth image of a human actor. The method detects a head, neck, and torso (H-N-T) template in the depth image, and detects the features in the depth image based on the H-N-T template. A pose of a human model is estimated based on the detected features and kinematic constraints of the human model.
    Type: Grant
    Filed: December 19, 2008
    Date of Patent: August 4, 2015
    Assignee: Honda Motor Co., Ltd.
    Inventors: Behzad Dariush, Youding Zhu, Kikuo Fujimura
  • Patent number: 9007299
    Abstract: Techniques for using a motion sensitive device as a controller are disclosed. A motion controller as an input/control device is used to control an existing electronic device (a.k.a., controlled device) previously configured for taking inputs from a pre-defined controlling device. The signals from the input device are in a different form from the pre-defined controlling device. According to one aspect of the present invention, the controlled device was designed to respond to signals from a pre-defined controlling device (e.g., a touch-screen device). The inputs from the motion controller are converted into touch-screen like signals that are then sent to the controlled device or programs being executed in the controlled device to cause the behavior of the controlled device to change or respond thereto, without reconfiguration of the applications running on the controlled device.
    Type: Grant
    Filed: September 30, 2011
    Date of Patent: April 14, 2015
    Assignee: AiLive Inc.
    Inventors: Charles Musick, Jr., Robert Kay, Stuart Reynolds, Dana Wilkinson, Anupam Chakravorty, William Robert Powers, III, Wei Yen, Youding Zhu
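The core conversion in this abstract, turning motion-controller input into touch-screen-like signals the controlled device already understands, can be sketched as a pointing-angle-to-pixel mapping plus synthetic touch events. The event model, field of view, and screen size below are assumptions made for illustration.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TouchEvent:
    kind: str  # "down", "move", or "up"
    x: int     # screen pixels
    y: int

def motion_to_touch(yaw, pitch, pressed, was_pressed,
                    screen_w=1920, screen_h=1080, fov_rad=0.5) -> Optional[TouchEvent]:
    """Map the controller's pointing direction (yaw/pitch in radians, zero at
    the screen centre) and trigger state to a touch-screen-like event."""
    x = int((0.5 + yaw / fov_rad) * screen_w)
    y = int((0.5 - pitch / fov_rad) * screen_h)
    x = max(0, min(screen_w - 1, x))
    y = max(0, min(screen_h - 1, y))
    if pressed and not was_pressed:
        return TouchEvent("down", x, y)
    if pressed:
        return TouchEvent("move", x, y)
    if was_pressed:
        return TouchEvent("up", x, y)
    return None
```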
  • Publication number: 20140342830
    Abstract: Techniques for providing compatibility between two different game controllers are disclosed. When a new or more advanced controller is introduced, it is important that such a new controller works with a system originally configured for an existing or old controller. The new controller may provide more functionalities than the old one does. In some cases, the new controller provides more sensing signals than the old one does. The new controller is configured to work with the system to transform the sensing signals therefrom to masquerade as though they were coming from the old controller. The transforming of the sensing signals comprises: replicating operational characteristics of the old controller, and virtually relocating the sensing signals so that they appear to have been generated by inertial sensors positioned in the new controller at the location corresponding to where the inertial sensors sit in the old controller.
    Type: Application
    Filed: August 1, 2014
    Publication date: November 20, 2014
    Inventors: Charles Musick, Jr., Robert Kay, William Robert Powers, III, Dana Wilkinson, Youding Zhu
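The "virtual relocation" of inertial signals described in this abstract follows rigid-body kinematics: an acceleration measured at one point on the controller can be re-expressed at another point given the angular velocity and angular acceleration. A sketch under that interpretation (vector names and frames are assumptions):

```python
import numpy as np

def relocate_accelerometer(accel_new, omega, alpha, r_new_to_old):
    """Re-express an accelerometer reading taken at the new controller's
    sensor location as if it were taken at the old controller's sensor
    location on the same rigid body.

    accel_new    : measured acceleration at the new location (m/s^2)
    omega        : angular velocity of the controller (rad/s)
    alpha        : angular acceleration of the controller (rad/s^2)
    r_new_to_old : offset from the new sensor location to the old one (m)
    All vectors are expressed in the controller's body frame.
    """
    return (accel_new
            + np.cross(alpha, r_new_to_old)
            + np.cross(omega, np.cross(omega, r_new_to_old)))
```

Replicating the old controller's operational characteristics (ranges, noise, report rate), also mentioned in the abstract, would be layered on top of this geometric step.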
  • Publication number: 20140320691
    Abstract: Techniques for performing accurate and automatic head pose estimation are disclosed. According to one aspect of the techniques, head pose estimation is integrated with a scale-invariant head tracking method along with facial features detected from a located head in images. Thus the head pose estimation works efficiently even when there are large translational movements resulting from the head motion. Various computation techniques are used to optimize the process of estimation so that the head pose estimation can be applied to control one or more objects in a virtual environment and virtual character gaze control.
    Type: Application
    Filed: July 11, 2014
    Publication date: October 30, 2014
    Inventors: Youding Zhu, Charles Musick, Jr., Robert Kay, William Robert Powers, III, Dana Wilkinson, Stuart Reynolds