Patents by Inventor Anli He

Anli He has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 10657367
    Abstract: A system for hand tracking may comprise a head mounted display wearable by a user and a hand tracking camera module attached to the head mounted display and comprising at least one of a pair of stereo cameras or a depth camera. The hand tracking camera module may be configured to capture images of at least one physical hand of the user. The head mounted display may be configured to render a virtual hand resembling the physical hand in a virtual environment for viewing by the user based at least on the images.
    Type: Grant
    Filed: March 30, 2018
    Date of Patent: May 19, 2020
    Assignee: uSens, Inc.
    Inventors: Yue Fei, Anli He, Wentao Mao, Gengyu Ma, Wenjie Mou, Yu Gao, Eunseok Park
  • Patent number: 10620712
    Abstract: A method for human-machine interaction includes monitoring a movement of an object by a sensor that detects positions of the object over time, generating a time-dependent velocity of the object based on the movement of the object, detecting a tapping event of the object tapping on a surface by detecting a sudden change of the time-dependent velocity, and determining a position of the object at a time when the tapping event occurs as a tapping position of the object.
    Type: Grant
    Filed: February 12, 2019
    Date of Patent: April 14, 2020
    Assignee: uSens, Inc.
    Inventors: Anli He, Yue Fei
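    A minimal sketch of the tap-detection idea in the abstract above, assuming fingertip positions from a generic tracker; the finite-difference velocity, deceleration threshold, and function names are illustrative assumptions, not the patented implementation.

    import numpy as np

    def detect_taps(positions, timestamps, decel_threshold=2.0):
        """Infer taps from tracked positions (N, 3) and timestamps (N,).

        A tap is flagged when the speed drops sharply between consecutive
        samples (the object abruptly stops on a surface); the position at
        that instant is reported as the tapping position.
        """
        positions = np.asarray(positions, dtype=float)
        timestamps = np.asarray(timestamps, dtype=float)

        # Finite-difference velocity between consecutive samples.
        dt = np.diff(timestamps)
        velocities = np.diff(positions, axis=0) / dt[:, None]
        speeds = np.linalg.norm(velocities, axis=1)

        taps = []
        for i in range(1, len(speeds)):
            # "Sudden change" modeled as a large drop in speed (m/s).
            if speeds[i - 1] - speeds[i] > decel_threshold:
                taps.append((timestamps[i + 1], positions[i + 1]))
        return taps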
  • Publication number: 20190179419
    Abstract: A method for human-machine interaction includes monitoring a movement of an object by a sensor that detects positions of the object over time, generating a time-dependent velocity of the object based on the movement of the object, detecting a tapping event of the object tapping on a surface by detecting a sudden change of the time-dependent velocity, and determining a position of the object at a time when the tapping event occurs as a tapping position of the object.
    Type: Application
    Filed: February 12, 2019
    Publication date: June 13, 2019
    Inventors: Anli He, Yue Fei
  • Patent number: 10320437
    Abstract: A foldable apparatus is disclosed. The apparatus may comprise at least one camera configured to acquire an image of a physical environment, an orientation and position determination module configured to determine a change in orientation and/or position of the apparatus with respect to the physical environment based on the acquired image, a housing configured to hold the at least one camera and the orientation and position determination module, and a first strap attached to the housing and configured to attach the housing to a head of a user of the apparatus.
    Type: Grant
    Filed: March 3, 2016
    Date of Patent: June 11, 2019
    Assignee: uSens, Inc.
    Inventors: Yue Fei, Anli He
  • Patent number: 10256859
    Abstract: A method is disclosed. The method may be implemented by an immersive and interactive multimedia generation system. The method may comprise projecting, by a first component of the system, a plurality of patterns to a physical environment in which a second component of the system is located, acquiring, by the second component of the system, a first image and a second image of at least a part of the physical environment, the first and second images including, respectively, first and second pixel data corresponding to at least some of the plurality of projected patterns, and determining, by the second component of the system, a change in at least one of an orientation or a position of the second component within the physical environment based on a relationship between the first and the second pixel data.
    Type: Grant
    Filed: March 3, 2016
    Date of Patent: April 9, 2019
    Assignee: uSens, Inc.
    Inventors: Yue Fei, Anli He
  • Patent number: 10223834
    Abstract: An apparatus comprising a processor and a non-transitory computer-readable storage medium storing instructions that, when executed by the processor, cause the apparatus to perform a method comprising determining a first and second set of coordinates, associated with a first and second feature, respectively, in one or more first and second images, respectively, where the first and second images are captured from a first and second position and/or orientation relative to the physical environment, respectively, re-projecting the first set of coordinates to one or more 2D spaces associated with the second images, comparing the re-projected first set of coordinates with the second set of coordinates in at least one of position closeness, feature closeness, or stereo constraints to determine a correspondence between the first and second features, and determining a change between the first and second orientations and/or positions with respect to the physical environment based on the determined correspondence.
    Type: Grant
    Filed: December 14, 2017
    Date of Patent: March 5, 2019
    Assignee: uSens, Inc.
    Inventors: Yue Fei, Anli He
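    A rough sketch of the re-projection and position-closeness test described in the abstract above, assuming a pinhole camera model with known intrinsics K and a candidate relative pose (R, t); the model, names, and pixel threshold are assumptions for illustration, not the claimed method.

    import numpy as np

    def reproject(points_3d, R, t, K):
        """Project 3D points from the first view's frame into the second
        view's image using relative rotation R (3x3), translation t (3,),
        and camera intrinsics K (3x3); returns pixel coordinates (N, 2)."""
        cam = (R @ points_3d.T).T + t        # move into second camera frame
        pix = (K @ cam.T).T                  # apply pinhole intrinsics
        return pix[:, :2] / pix[:, 2:3]      # perspective divide

    def match_by_position_closeness(projected, observed, max_px=5.0):
        """Pair each re-projected point with the nearest observed 2D feature,
        keeping the pair only if it lies within max_px pixels."""
        matches = []
        for i, p in enumerate(projected):
            d = np.linalg.norm(observed - p, axis=1)
            j = int(np.argmin(d))
            if d[j] < max_px:
                matches.append((i, j))
        return matches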
  • Patent number: 10203765
    Abstract: A method for human-machine interaction includes monitoring a movement of an object by a sensor that detects positions of the object over time, generating a time-dependent velocity of the object based on the movement of the object, detecting a tapping event of the object tapping on a surface by detecting a sudden change of the time-dependent velocity, and determining a position of the object at a time when the tapping event occurs as a tapping position of the object.
    Type: Grant
    Filed: March 21, 2016
    Date of Patent: February 12, 2019
    Assignee: uSens, Inc.
    Inventors: Anli He, Yue Fei
  • Publication number: 20180285636
    Abstract: A system for hand tracking may comprise a head mounted display wearable by a user and a hand tracking camera module attached to the head mounted display and comprising at least one of a pair of stereo cameras or a depth camera. The hand tracking camera module may be configured to capture images of at least one physical hand of the user. The head mounted display may be configured to render a virtual hand resembling the physical hand in a virtual environment for viewing by the user based at least on the images.
    Type: Application
    Filed: March 30, 2018
    Publication date: October 4, 2018
    Inventors: Yue Fei, Anli He, Wentao Mao, Gengyu Ma, Wenjie Mou, Yu Gao, Eunseok Park
  • Publication number: 20180108180
    Abstract: An apparatus comprising a processor and a non-transitory computer-readable storage medium storing instructions that, when executed by the processor, cause the apparatus to perform a method comprising determining a first and second set of coordinates, associated with a first and second feature, respectively, in one or more first and second images, respectively, where the first and second images are captured from a first and second position and/or orientation relative to the physical environment, respectively, re-projecting the first set of coordinates to one or more 2D spaces associated with the second images, comparing the re-projected first set of coordinates with the second set of coordinates in at least one of position closeness, feature closeness, or stereo constraints to determine a correspondence between the first and second features, and determining a change between the first and second orientations and/or positions with respect to the physical environment based on the determined correspondence.
    Type: Application
    Filed: December 14, 2017
    Publication date: April 19, 2018
    Inventors: Yue Fei, Anli He
  • Patent number: 9858722
    Abstract: An apparatus is disclosed. The apparatus comprises an optical sensing system that comprises at least one camera, the at least one camera being configured to acquire an image of a physical environment. The apparatus further comprises a processing system. The processing system comprises an orientation and position determination module configured to detect salient features from the image, and determine a change in orientation and/or position of the apparatus with respect to the physical environment based on the detected salient features. The processing system also comprises a rendering module configured to determine a rendering of the physical environment based on the image and on the determined change in orientation and/or position of the apparatus, and provide data related to the rendering of the physical environment to a display system.
    Type: Grant
    Filed: October 26, 2015
    Date of Patent: January 2, 2018
    Assignee: uSens, Inc.
    Inventors: Yue Fei, Anli He
  • Publication number: 20160261300
    Abstract: A foldable apparatus is disclosed. The apparatus may comprise at least one camera configured to acquire an image of a physical environment, an orientation and position determination module configured to determine a change in orientation and/or position of the apparatus with respect to the physical environment based on the acquired image, a housing configured to hold the at least one camera and the orientation and position determination module, and a first strap attached to the housing and configured to attach the housing to a head of a user of the apparatus.
    Type: Application
    Filed: March 3, 2016
    Publication date: September 8, 2016
    Inventors: Yue Fei, Anli He
  • Publication number: 20160260260
    Abstract: A method is disclosed. The method may be implemented by an immersive and interactive multimedia generation system. The method may comprise projecting, by a first component of the system, a plurality of patterns to a physical environment in which a second component of the system is located, acquiring, by the second component of the system, a first image and a second image of at least a part of the physical environment, the first and second images including, respectively, first and second pixel data corresponding to at least some of the plurality of projected patterns, and determining, by the second component of the system, a change in at least one of an orientation or a position of the second component within the physical environment based on a relationship between the first and the second pixel data.
    Type: Application
    Filed: March 3, 2016
    Publication date: September 8, 2016
    Inventors: Yue Fei, Anli He
  • Publication number: 20160202844
    Abstract: A method for human-machine interaction includes monitoring a movement of an object by a sensor that detects positions of the object over time, generating a time-dependent velocity of the object based on the movement of the object, detecting a tapping event of the object tapping on a surface by detecting a sudden change of the time-dependent velocity, and determining a position of the object at a time when the tapping event occurs as a tapping position of the object.
    Type: Application
    Filed: March 21, 2016
    Publication date: July 14, 2016
    Inventors: Anli He, Yue Fei
  • Publication number: 20160117860
    Abstract: An apparatus is disclosed. The apparatus comprises an optical sensing system that comprises at least one camera, the at least one camera being configured to acquire an image of a physical environment. The apparatus further comprises a processing system. The processing system comprises an orientation and position determination module configured to detect salient features from the image, and determine a change in orientation and/or position of the apparatus with respect to the physical environment based on the detected salient features. The processing system also comprises a rendering module configured to determine a rendering of the physical environment based on the image and on the determined change in orientation and/or position of the apparatus, and provide data related to the rendering of the physical environment to a display system.
    Type: Application
    Filed: October 26, 2015
    Publication date: April 28, 2016
    Inventors: Yue Fei, Anli He
  • Patent number: 9323338
    Abstract: A method for three-dimensional (3D) sensing. The method includes obtaining a first two-dimensional (2D) skeleton of an object, obtaining a second 2D skeleton of the object different from the first 2D skeleton, and calculating a 3D skeleton of the object based on the first and second 2D skeletons.
    Type: Grant
    Filed: September 23, 2013
    Date of Patent: April 26, 2016
    Assignee: uSens, Inc.
    Inventors: Anli He, Yue Fei
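    A simplified sketch of recovering a 3D skeleton from two 2D skeletons of the same object, assuming calibrated cameras with known 3x4 projection matrices P1 and P2; the linear (DLT) triangulation here is an illustrative choice, not necessarily the patented implementation.

    import numpy as np

    def triangulate_joint(p1, p2, P1, P2):
        """DLT triangulation of one joint from its pixel coordinates p1, p2
        in the two views and the 3x4 projection matrices P1, P2."""
        A = np.vstack([
            p1[0] * P1[2] - P1[0],
            p1[1] * P1[2] - P1[1],
            p2[0] * P2[2] - P2[0],
            p2[1] * P2[2] - P2[1],
        ])
        _, _, vt = np.linalg.svd(A)
        X = vt[-1]
        return X[:3] / X[3]                  # de-homogenize

    def skeleton_3d(skel_2d_a, skel_2d_b, P1, P2):
        """Triangulate every pair of corresponding joints of two 2D skeletons."""
        return np.array([triangulate_joint(a, b, P1, P2)
                         for a, b in zip(skel_2d_a, skel_2d_b)])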
  • Publication number: 20150277700
    Abstract: A system for providing a graphical user interface is provided. The system includes a display, at least one imaging sensor configured to capture at least one image associated with a user, one or more processors, and a memory for storing instructions executable by the one or more processors. The one or more processors may be configured to detect a gesture of a target part of the user based on the at least one image, and determine, based on the gesture of the target part of the user, three-dimensional (3D) coordinates of at least one 3D object in a 3D coordinate system. The one or more processors may be further configured to perform a projection of the at least one 3D object onto the display based on the 3D coordinates and render the at least one 3D object on the display according to the projection.
    Type: Application
    Filed: June 16, 2015
    Publication date: October 1, 2015
    Inventor: Anli He
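    A small sketch of projecting a gesture-placed 3D object onto the display, as the abstract above outlines; the pinhole projection and the display-centered coordinate frame are assumptions for illustration.

    import numpy as np

    def project_to_display(p_3d, focal_px, display_center_px):
        """Pinhole-project a 3D point (x, y, z with z > 0 toward the scene,
        in a display-centered frame) to 2D pixel coordinates on the display."""
        x, y, z = p_3d
        return np.array([focal_px * x / z + display_center_px[0],
                         focal_px * y / z + display_center_px[1]])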
  • Publication number: 20140354602
    Abstract: A method for generating and displaying a graphic representation of an object on a display screen. The method includes capturing at least one image of the object using at least one image sensor, determining, according to the at least one image, three-dimensional (3D) coordinates of a 3D point on the object in a 3D coordinate system defined in a space containing the object, defining a touch interactive surface in the space, performing a projection of the 3D point onto a projection point on the touch interactive surface, determining 3D coordinates of the projection point in the 3D coordinate system according to the projection, determining a displaying position of the graphic representation on the display screen according to the 3D coordinates of the projection point, and displaying the graphic representation at the displaying position on the display screen.
    Type: Application
    Filed: August 18, 2014
    Publication date: December 4, 2014
    Inventors: Anli He, Yue Fei
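    A hypothetical sketch of the surface-projection step described in the abstract above: drop a tracked 3D point orthogonally onto a planar touch interactive surface, then map the in-plane coordinates to a display position. The plane parameterization and the linear pixel mapping are assumptions, not the claimed method.

    import numpy as np

    def project_to_surface(point, origin, u_axis, v_axis):
        """Orthogonally project a 3D point onto the plane through `origin`
        spanned by orthonormal vectors u_axis, v_axis; returns (u, v)."""
        d = np.asarray(point, dtype=float) - np.asarray(origin, dtype=float)
        return np.array([np.dot(d, u_axis), np.dot(d, v_axis)])

    def surface_to_display(uv, surface_size_m, display_size_px):
        """Linearly map plane coordinates (meters) to a display position (pixels)."""
        scale = np.array(display_size_px, dtype=float) / np.array(surface_size_m, dtype=float)
        return (np.asarray(uv) * scale).astype(int)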
  • Patent number: D753111
    Type: Grant
    Filed: March 2, 2015
    Date of Patent: April 5, 2016
    Assignee: uSens, Inc.
    Inventors: Yue Fei, Anli He, Lian Zhu, Zhichuan Qian
  • Patent number: D753112
    Type: Grant
    Filed: March 2, 2015
    Date of Patent: April 5, 2016
    Assignee: uSens, Inc.
    Inventors: Yue Fei, Anli He, Lian Zhu, Zhichuan Qian
  • Patent number: D753113
    Type: Grant
    Filed: March 2, 2015
    Date of Patent: April 5, 2016
    Assignee: uSens, Inc.
    Inventors: Yue Fei, Anli He, Lian Zhu, Zhichuan Qian