Patents by Inventor Yin Yee Chan

Yin Yee Chan has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have been granted by the United States Patent and Trademark Office (USPTO). Minimal illustrative code sketches of the methods described in the abstracts follow the listing.

  • Patent number: 11169832
    Abstract: A method of generating an AR user manual in an electronic 3D viewing environment, comprising: recording a moving trajectory of the 3D viewing environment's optical sensor; receiving landmark location information; executing an iterative target object pose estimation comprising: estimating an estimated target object pose from each of the optical sensor poses in the recorded moving trajectory and the landmark location; calculating an estimation error from a 3D model arranged in the estimated target object pose and projected onto the target object in the real-world scene; calculating a mean of the estimation errors; and reiterating the iterative target object pose estimation to optimize the estimated target object pose for a minimum mean estimation error; and if the minimum mean estimation error is within a predefined estimation error threshold, rendering the AR user manual onto the target object according to the optimized estimated target object pose.
    Type: Grant
    Filed: May 17, 2021
    Date of Patent: November 9, 2021
    Assignee: Hong Kong Applied Science and Technology Research Institute Company Limited
    Inventors: Yat Cheung Ngai, Yin Yee Chan
  • Patent number: 11043038
    Abstract: A method of tracking a point of interest (POI) in an electronic three-dimensional (3D) viewing environment, comprising: capturing via an optical sensor and recording motions of an onsite user, wherein each motion comprises an image of the onsite user's surroundings and a pose of the onsite user; displaying a snapshot to a remote user, wherein the snapshot is one of the recorded motions; receiving a POI indicator in the snapshot from the remote user; estimating a 3D position of the POI in the electronic 3D viewing environment using the POI indicator data, a moving trajectory from each of the recorded motions to the snapshot, and an estimation of the distance from the optical sensor to the POI center; and rendering and superimposing the POI indicator in the electronic 3D viewing environment to be displayed to the onsite user using the estimated 3D position of the POI, the moving trajectory, and the recorded motions.
    Type: Grant
    Filed: March 16, 2020
    Date of Patent: June 22, 2021
    Assignee: Hong Kong Applied Science and Technology Research Institute Company Limited
    Inventors: Yat Cheung Ngai, Yin Yee Chan
  • Patent number: 10616483
    Abstract: A method of generating an electronic 3D walkthrough environment of a targeted environment, comprising: capturing a panorama at a first location point in the targeted environment; measuring an initial orientation of the camera whenever a panorama is captured; moving through the targeted environment and, at each location point, capturing a live image of the surroundings and measuring an orientation of the camera; determining whether a new panorama is required to be captured at the camera's current location point and current orientation, comprising: identifying a sub-image in the panorama last captured that matches the live image captured, and if the sub-image cannot be identified, then a new panorama is required; and generating the electronic 3D walkthrough environment by connecting the panoramas captured.
    Type: Grant
    Filed: February 27, 2019
    Date of Patent: April 7, 2020
    Assignee: Hong Kong Applied Science and Technology Research Institute Company Limited
    Inventors: Yat Cheung Ngai, Yin Yee Chan
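
The iterative target object pose estimation in patent 11169832 can be pictured as minimizing the mean reprojection error of a 3D model over every optical sensor pose in the recorded trajectory. The Python sketch below illustrates that idea only and is not the patented implementation: it assumes 4x4 homogeneous pose matrices, a 3x3 camera intrinsic matrix K, per-pose 2D observations of target-object features, and a simple translation-only random local search standing in for whatever optimizer the patent actually uses; the names refine_object_pose and threshold_px and the seeding of the pose at the landmark location are illustrative assumptions.

import numpy as np

def project(model_pts, object_pose, sensor_pose, K):
    # Project 3D model points (object frame) into the camera at sensor_pose.
    # Poses are 4x4 world-from-frame matrices; K is the 3x3 intrinsic matrix.
    pts_h = np.c_[model_pts, np.ones(len(model_pts))].T               # 4xN
    pts_cam = (np.linalg.inv(sensor_pose) @ object_pose @ pts_h)[:3]  # 3xN
    uv = K @ pts_cam
    return (uv[:2] / uv[2]).T                                         # Nx2 pixels

def mean_reprojection_error(object_pose, trajectory, observations, model_pts, K):
    # Mean pixel error of the projected model over every recorded sensor pose.
    errs = [np.linalg.norm(project(model_pts, object_pose, sp, K) - obs, axis=1).mean()
            for sp, obs in zip(trajectory, observations)]
    return float(np.mean(errs))

def refine_object_pose(landmark_xyz, trajectory, observations, model_pts, K,
                       iters=200, step=0.01, threshold_px=2.0, seed=0):
    # Random local search over the object pose, seeded at the landmark location,
    # minimizing the mean reprojection error across the recorded trajectory.
    rng = np.random.default_rng(seed)
    best = np.eye(4)
    best[:3, 3] = landmark_xyz
    best_err = mean_reprojection_error(best, trajectory, observations, model_pts, K)
    for _ in range(iters):
        cand = best.copy()
        cand[:3, 3] += rng.normal(scale=step, size=3)   # perturb translation only
        err = mean_reprojection_error(cand, trajectory, observations, model_pts, K)
        if err < best_err:
            best, best_err = cand, err
    # Return a pose only if it passes the error threshold test for rendering.
    return (best if best_err <= threshold_px else None), best_err

In this reading, a None result corresponds to the threshold test failing, in which case more trajectory data would be gathered before the AR manual is rendered.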
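
The core estimation step of patent 11043038, placing a remote user's POI mark in 3D, can be sketched as back-projecting the marked pixel along its viewing ray by the estimated sensor-to-POI distance and then projecting that world point into the onsite user's current view. This is a deliberately simplified, assumed reading: the patent also uses the moving trajectory and the full set of recorded motions to refine the estimate, which the sketch omits. The function names backproject_poi and render_poi_for_onsite_user and the 4x4 camera-to-world pose convention are illustrative assumptions.

import numpy as np

def backproject_poi(poi_pixel, snapshot_pose, K, distance_estimate):
    # Lift the POI pixel marked in the snapshot to a 3D world point, assuming
    # the POI lies distance_estimate metres along the viewing ray.
    u, v = poi_pixel
    ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])
    ray_cam /= np.linalg.norm(ray_cam)                   # unit ray in camera frame
    p_cam = ray_cam * distance_estimate                  # point in camera frame
    p_world = snapshot_pose @ np.append(p_cam, 1.0)      # camera-to-world 4x4 pose
    return p_world[:3]

def render_poi_for_onsite_user(poi_world, current_pose, K, image_size):
    # Project the estimated 3D POI into the onsite user's current camera view.
    # Returns pixel coordinates, or None if the POI is behind the camera or off-screen.
    p_cam = (np.linalg.inv(current_pose) @ np.append(poi_world, 1.0))[:3]
    if p_cam[2] <= 0:
        return None
    uv = K @ p_cam
    u, v = uv[:2] / uv[2]
    w, h = image_size
    return (u, v) if 0 <= u < w and 0 <= v < h else None

Because the onsite user keeps moving, the projection would be recomputed from each new current_pose so the indicator stays anchored to the same estimated world point.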
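
The capture decision in patent 10616483, taking a new panorama only when no sub-image of the last panorama matches the current live view, can be illustrated with a simple template-matching check. The sketch below uses OpenCV's matchTemplate purely as a stand-in for whatever matching method the patent employs, ignores the measured camera orientations, and treats capture_panorama, capture_live_image, and the 0.75 match_threshold as hypothetical, illustrative choices.

import cv2

def needs_new_panorama(last_panorama, live_image, match_threshold=0.75):
    # Search the last panorama for a sub-image resembling the live view;
    # if no sufficiently similar region is found, a new panorama is required.
    pano_gray = cv2.cvtColor(last_panorama, cv2.COLOR_BGR2GRAY)
    live_gray = cv2.cvtColor(live_image, cv2.COLOR_BGR2GRAY)
    ph, pw = pano_gray.shape
    lh, lw = live_gray.shape
    # Downscale the live view so the template fits comfortably inside the panorama.
    scale = min(1.0, (pw / 4) / lw, (ph / 2) / lh)
    template = cv2.resize(live_gray, None, fx=scale, fy=scale)
    result = cv2.matchTemplate(pano_gray, template, cv2.TM_CCOEFF_NORMED)
    return float(result.max()) < match_threshold     # True -> capture a new panorama

def build_walkthrough(capture_panorama, capture_live_image, location_points):
    # Walk through the environment, capturing a panorama only where the last
    # one no longer covers the current view; the returned nodes are what the
    # walkthrough environment would connect together.
    first = location_points[0]
    panorama = capture_panorama(first)
    nodes = [(first, panorama)]
    for loc in location_points[1:]:
        live = capture_live_image(loc)
        if needs_new_panorama(panorama, live):
            panorama = capture_panorama(loc)
            nodes.append((loc, panorama))
    return nodes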