Patents by Inventor Hiu Hong Yu

Hiu Hong Yu has filed for patents to protect the following inventions. This listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11892560
    Abstract: A calibration system for multi-sensor extrinsic calibration in a vehicle includes one or more calibration targets provided around an external environment within a threshold distance of the vehicle. Each of the one or more calibration targets includes a combination of sensor targets configured to be measured by and used for calibrating a pair of sensors selected from the group consisting of a first sensor, a second sensor or a third sensor. The system also includes a vehicle placement section configured to accommodate the vehicle on the vehicle placement section for detection of the one or more calibration targets.
    Type: Grant
    Filed: February 3, 2020
    Date of Patent: February 6, 2024
    Assignee: NIO Technology (Anhui) Co., Ltd.
    Inventors: Hiu Hong Yu, Zhenxiang Jian, Tong Lin, Xu Chen, Zhongkui Wang, Antonio Antonellis Rufo, Waylon Chen
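
The entry above describes calibration targets placed around the vehicle, each serving a specific pair of sensors. Below is a minimal, hypothetical sketch of how such a target layout might be represented and checked for sensor-pair coverage; the sensor names, the 10 m threshold distance, and all helper functions are illustrative assumptions, not details taken from the patent.

```python
# Illustrative sketch only -- not the patented calibration system.
from dataclasses import dataclass
from itertools import combinations
from typing import Tuple

SENSORS = ("lidar", "camera", "radar")  # stand-ins for the first/second/third sensor

@dataclass
class CalibrationTarget:
    position_m: Tuple[float, float]   # planar position relative to the vehicle (assumed)
    sensor_pair: Tuple[str, str]      # the pair of sensors this target helps calibrate

def within_threshold(target: CalibrationTarget, threshold_m: float = 10.0) -> bool:
    """Check that a target sits within the (assumed) threshold distance of the vehicle."""
    x, y = target.position_m
    return (x * x + y * y) ** 0.5 <= threshold_m

def pairs_covered(targets):
    """Return the set of sensor pairs that at least one target can calibrate."""
    return {tuple(sorted(t.sensor_pair)) for t in targets}

targets = [
    CalibrationTarget((5.0, 2.0), ("lidar", "camera")),
    CalibrationTarget((-4.0, 3.0), ("lidar", "radar")),
    CalibrationTarget((6.0, -1.5), ("camera", "radar")),
]

assert all(within_threshold(t) for t in targets)
missing = set(map(tuple, map(sorted, combinations(SENSORS, 2)))) - pairs_covered(targets)
print("uncovered sensor pairs:", missing or "none")
```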
  • Patent number: 11821990
    Abstract: Embodiments of the present disclosure are directed to providing a scene perception display that requires reduced processing capabilities. Sensor data indicative of one or more targets from an imaging and ranging subsystem, location data from a positioning subsystem defining a geographical location of the imaging and ranging subsystem, and orientation data from an orientation subsystem defining an orientation of the imaging and ranging subsystem are received. Doppler point cloud data is generated based on Doppler information and point cloud data from the sensor data, the location data and the orientation data. Each target is classified as either static or dynamic. The Doppler point cloud data is then filtered by removing either the dynamic or the static targets from the Doppler point cloud data. The Doppler point cloud data of the remaining targets is further processed, and the further processed Doppler point cloud data is rendered.
    Type: Grant
    Filed: November 7, 2019
    Date of Patent: November 21, 2023
    Assignee: NIO Technology (Anhui) Co., Ltd.
    Inventors: Xu Chen, Tong Lin, Zhenxiang Jian, Hiu Hong Yu
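
The abstract above classifies targets as static or dynamic before filtering the Doppler point cloud. Below is a minimal sketch of one way such a split could be made, assuming NumPy, a simple range-rate sign convention (negative when a point approaches the sensor), and an illustrative 0.5 m/s tolerance; none of these specifics come from the patent.

```python
# Illustrative static/dynamic split using Doppler residuals -- not the patented method.
import numpy as np

def expected_static_doppler(points_xyz, ego_velocity):
    """Radial speed a stationary point would show due only to ego motion.

    points_xyz: (N, 3) positions in the sensor frame; ego_velocity: (3,) m/s.
    Sign convention (assumed): negative means the range is decreasing.
    """
    rays = points_xyz / np.linalg.norm(points_xyz, axis=1, keepdims=True)
    return -rays @ ego_velocity

def split_static_dynamic(points_xyz, measured_doppler, ego_velocity, tol=0.5):
    """Label points whose measured Doppler disagrees with the static prediction."""
    residual = np.abs(measured_doppler - expected_static_doppler(points_xyz, ego_velocity))
    dynamic = residual > tol
    return points_xyz[~dynamic], points_xyz[dynamic]

# Toy example: ego vehicle moving forward at 10 m/s along +x.
pts = np.array([[20.0, 0.0, 0.0], [0.0, 15.0, 0.0], [30.0, 5.0, 0.0]])
doppler = np.array([-10.0, 0.0, 3.0])   # m/s; the last point is inconsistent with a static world
static_pts, dynamic_pts = split_static_dynamic(pts, doppler, np.array([10.0, 0.0, 0.0]))
print(len(static_pts), "static,", len(dynamic_pts), "dynamic")
```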
  • Patent number: 11726189
    Abstract: Embodiments of the present disclosure are directed to calibrating an imaging and ranging subsystem. Sensor data indicative of one or more targets from the imaging and ranging subsystem, location data defining a geographical location of the imaging and ranging subsystem, orientation data defining an orientation of the imaging and ranging subsystem, and stored translation and rotation values of the imaging and ranging subsystem are received. Estimated Doppler values for the targets are provided by the sensor data, and theoretical Doppler values for the targets are also calculated. The estimated Doppler values are compared to the theoretical Doppler values to determine whether calibration of the imaging and ranging subsystem is required. If calibration is necessary, correction translation and correction rotation values are calculated in order to compute updated translation and rotation values used to calibrate the imaging and ranging subsystem.
    Type: Grant
    Filed: December 9, 2019
    Date of Patent: August 15, 2023
    Assignee: NIO Technology (Anhui) Co., Ltd.
    Inventors: Zhenxiang Jian, Tong Lin, Hiu Hong Yu, Xu Chen
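
The abstract above compares estimated Doppler values against theoretical values to decide whether recalibration is needed. The sketch below shows that comparison under simplifying assumptions (translation ignored, rotation-only extrinsics, an illustrative 0.3 m/s RMS threshold, NumPy); it is not the patented calibration procedure.

```python
# Illustrative Doppler-consistency check -- not the patented method.
import numpy as np

def theoretical_doppler(targets_sensor, ego_velocity_world, R_sensor_to_world):
    """Radial speed each static target should show, given ego motion and the
    stored sensor rotation (translation omitted in this simplified sketch)."""
    v_sensor = R_sensor_to_world.T @ ego_velocity_world   # ego velocity in the sensor frame
    rays = targets_sensor / np.linalg.norm(targets_sensor, axis=1, keepdims=True)
    return -rays @ v_sensor

def needs_calibration(measured, predicted, rms_threshold=0.3):
    """Flag recalibration when measured and predicted Doppler disagree on average."""
    rms = np.sqrt(np.mean((measured - predicted) ** 2))
    return rms > rms_threshold, rms

# Toy check: the stored rotation is identity, but the sensor is actually yawed by 5 degrees.
yaw = np.deg2rad(5.0)
R_actual = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
                     [np.sin(yaw),  np.cos(yaw), 0.0],
                     [0.0,          0.0,         1.0]])
R_stored = np.eye(3)
targets = np.array([[25.0, 0.0, 0.0], [10.0, 10.0, 0.0], [0.0, 20.0, 0.0]])
v_ego = np.array([15.0, 0.0, 0.0])

measured = theoretical_doppler(targets, v_ego, R_actual)    # what the miscalibrated sensor reports
predicted = theoretical_doppler(targets, v_ego, R_stored)   # what the stored extrinsics predict
flag, rms = needs_calibration(measured, predicted)
print(f"recalibrate: {flag} (Doppler RMS error {rms:.2f} m/s)")
```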
  • Patent number: 11520024
    Abstract: Extrinsic calibration of a Light Detection and Ranging (LiDAR) sensor and a camera can comprise constructing a first plurality of reconstructed calibration targets in a three-dimensional space based on physical calibration targets detected from input from the LiDAR and a second plurality of reconstructed calibration targets in the three-dimensional space based on physical calibration targets detected from input from the camera. Reconstructed calibration targets in the first and second plurality of reconstructed calibration targets can be matched and a six-degree of freedom rigid body transformation of the LiDAR and camera can be computed based on the matched reconstructed calibration targets. A projection of the LiDAR to the camera can be computed based on the computed six-degree of freedom rigid body transformation.
    Type: Grant
    Filed: December 24, 2019
    Date of Patent: December 6, 2022
    Assignee: NIO Technology (Anhui) Co., Ltd.
    Inventors: Hiu Hong Yu, Tong Lin, Xu Chen, Zhenxiang Jian
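
The abstract above matches reconstructed calibration targets from the LiDAR and the camera and computes a six-degree-of-freedom rigid body transformation from the matches. A common way to solve that step is the Kabsch/SVD algorithm; the sketch below applies it to synthetic matched target centers and is only an illustration, not the patented method.

```python
# Illustrative rigid-body fit between matched 3D target sets (Kabsch/SVD).
import numpy as np

def rigid_transform(src, dst):
    """Least-squares 6-DOF rigid transform mapping matched points src -> dst."""
    src_c, dst_c = src - src.mean(axis=0), dst - dst.mean(axis=0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflection
    R = Vt.T @ D @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

# Synthetic matched target centers reconstructed from LiDAR (src) and from the camera (dst).
lidar_targets = np.array([[2.0, 1.0, 0.5], [3.5, -1.0, 0.4], [5.0, 0.5, 1.2], [4.0, 2.0, 0.9]])
R_true = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
t_true = np.array([0.1, -0.2, 0.05])
camera_targets = lidar_targets @ R_true.T + t_true

R_est, t_est = rigid_transform(lidar_targets, camera_targets)
projected = lidar_targets @ R_est.T + t_est   # LiDAR targets expressed in the camera frame
print("recovered transform reproduces the camera targets:", np.allclose(projected, camera_targets))
```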
  • Patent number: 11340354
    Abstract: Methods and systems are provided for localizing a vehicle precisely and in near real-time. As described, localization of a vehicle using a Global Navigation Satellite System (GNSS) can comprise receiving a signal from each of a plurality of satellites of a GNSS constellation and receiving input from one or more sensors of the vehicle. The input from the sensors can indicate the current physical surroundings of the vehicle. A model of the current physical surroundings of the vehicle can be generated based on the input from the one or more sensors of the vehicle. One or more multipath signals received from the plurality of satellites can be mitigated based on the model, and the vehicle can be localized using the received signals from the plurality of satellites of the GNSS constellation and based on the mitigation of the one or more multipath signals.
    Type: Grant
    Filed: June 26, 2019
    Date of Patent: May 24, 2022
    Assignee: NIO USA, Inc.
    Inventors: Tong Lin, Hiu Hong Yu, Veera Ganesh Yalla, Farzad Cyrus Foroughi Abari, Andre Michelin, Xu Chen
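
The abstract above mitigates multipath by using a sensor-derived model of the vehicle's surroundings. One simplified stand-in for that idea is to convert the model into a per-azimuth elevation mask and drop satellites whose lines of sight are blocked by nearby structures; the sketch below does exactly that, with made-up building geometry, bin sizes, and satellite IDs, and is not the patented method.

```python
# Illustrative line-of-sight screening of GNSS satellites against a toy surroundings model.
import numpy as np

def elevation_mask_from_model(building_edges_deg, n_bins=36):
    """Build a per-azimuth elevation mask (degrees) from a toy surroundings model.

    building_edges_deg: list of (az_start, az_end, top_elevation) spans where nearby
    structures block the sky, e.g. derived from LiDAR/camera data (assumed input format).
    """
    mask = np.zeros(n_bins)
    bin_width = 360.0 / n_bins
    for az0, az1, elev in building_edges_deg:
        lo, hi = int(az0 // bin_width), int(az1 // bin_width)
        mask[lo:hi + 1] = np.maximum(mask[lo:hi + 1], elev)
    return mask

def mitigate_multipath(satellites, mask, n_bins=36):
    """Keep only satellites whose line of sight clears the local elevation mask."""
    bin_width = 360.0 / n_bins
    kept = []
    for sat_id, az, elev in satellites:
        if elev > mask[int(az // bin_width) % n_bins]:
            kept.append(sat_id)
    return kept

# Toy scene: a tall building blocks azimuths 80-140 deg up to 55 deg elevation.
mask = elevation_mask_from_model([(80.0, 140.0, 55.0)])
sats = [("G01", 45.0, 30.0), ("G07", 100.0, 40.0), ("G12", 120.0, 70.0), ("G19", 250.0, 15.0)]
print("satellites used in the fix:", mitigate_multipath(sats, mask))
```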
  • Publication number: 20210372796
    Abstract: Methods and systems are provided for localizing a vehicle precisely and in near real-time. As described, localization of a vehicle using a Global Navigation Satellite System (GNSS) can comprise receiving a signal from each of a plurality of satellites of a GNSS constellation and receiving input from one or more sensors of the vehicle. The input from the sensors can indicate the current physical surroundings of the vehicle. A model of the current physical surroundings of the vehicle can be generated based on the input from the one or more sensors of the vehicle. One or more multipath signals received from the plurality of satellites can be mitigated based on the model, and the vehicle can be localized using the received signals from the plurality of satellites of the GNSS constellation and based on the mitigation of the one or more multipath signals.
    Type: Application
    Filed: June 26, 2019
    Publication date: December 2, 2021
    Inventors: Tong Lin, Hiu Hong Yu, Veera Ganesh Yalla, Farzad Cyrus Foroughi Abari, Andre Michelin, Xu Chen
  • Publication number: 20210239793
    Abstract: A calibration system for multi-sensor extrinsic calibration in a vehicle includes one or more calibration targets provided around an external environment within a threshold distance of the vehicle. Each of the one or more calibration targets includes a combination of sensor targets configured to be measured by and used for calibrating a pair of sensors selected from the group consisting of a first sensor, a second sensor or a third sensor. The system also includes a vehicle placement section configured to accommodate the vehicle on the vehicle placement section for detection of the one or more calibration targets.
    Type: Application
    Filed: February 3, 2020
    Publication date: August 5, 2021
    Inventors: Hiu Hong Yu, Zhenxiang Jian, Tong Lin, Xu Chen, Zhongkui Wang, Antonio Antonellis Rufo, Waylon Chen
  • Publication number: 20210190922
    Abstract: Extrinsic calibration of a Light Detection and Ranging (LiDAR) sensor and a camera can comprise constructing a first plurality of reconstructed calibration targets in a three-dimensional space based on physical calibration targets detected from input from the LiDAR and a second plurality of reconstructed calibration targets in the three-dimensional space based on physical calibration targets detected from input from the camera. Reconstructed calibration targets in the first and second plurality of reconstructed calibration targets can be matched and a six-degree of freedom rigid body transformation of the LiDAR and camera can be computed based on the matched reconstructed calibration targets. A projection of the LiDAR to the camera can be computed based on the computed six-degree of freedom rigid body transformation.
    Type: Application
    Filed: December 24, 2019
    Publication date: June 24, 2021
    Inventors: Hiu Hong Yu, Tong Lin, Xu Chen, Zhenxiang Jian
  • Publication number: 20210173055
    Abstract: Embodiments of the present disclosure are directed to calibrating an imaging and ranging subsystem. Sensor data indicative of one or more targets from the imaging and ranging subsystem, location data defining a geographical location of the imaging and ranging subsystem, orientation data defining an orientation of the imaging and ranging subsystem, and stored translation and rotation values of the imaging and ranging subsystem are received. Estimated Doppler values for the targets are provided by the sensor data, and theoretical Doppler values for the targets are also calculated. The estimated Doppler values are compared to the theoretical Doppler values to determine whether calibration of the imaging and ranging subsystem is required. If calibration is necessary, correction translation and correction rotation values are calculated in order to compute updated translation and rotation values used to calibrate the imaging and ranging subsystem.
    Type: Application
    Filed: December 9, 2019
    Publication date: June 10, 2021
    Inventors: Zhenxiang Jian, Tong Lin, Hiu Hong Yu, Xu Chen
  • Publication number: 20210141092
    Abstract: Embodiments of the present disclosure are directed to providing a scene perception display that requires reduced processing capabilities. Sensor data indicative of one or more targets from an imaging and ranging subsystem, location data from a positioning subsystem defining a geographical location of the imaging and ranging subsystem, and orientation data from an orientation subsystem defining an orientation of the imaging and ranging subsystem are received. Doppler point cloud data is generated based on Doppler information and point cloud data from the sensor data, the location data and the orientation data. Each target is classified as either static or dynamic. The Doppler point cloud data is then filtered by removing either the dynamic or the static targets from the Doppler point cloud data. The Doppler point cloud data of the remaining targets is further processed, and the further processed Doppler point cloud data is rendered.
    Type: Application
    Filed: November 7, 2019
    Publication date: May 13, 2021
    Inventors: Xu Chen, Tong Lin, Zhenxiang Jian, Hiu Hong Yu