Patents by Inventor Anli HE
Anli He has filed for patents to protect the following inventions. This listing includes both pending patent applications and patents already granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 10657367
Abstract: A system for hand tracking may comprise a head mounted display wearable by a user and a hand tracking camera module attached to the head mounted display and comprising at least one of a pair of stereo cameras or a depth camera. The hand tracking camera module may be configured to capture images of at least one physical hand of the user. The head mounted display may be configured to render a virtual hand resembling the physical hand in a virtual environment for viewing by the user based at least on the images.
Type: Grant
Filed: March 30, 2018
Date of Patent: May 19, 2020
Assignee: uSens, Inc.
Inventors: Yue Fei, Anli He, Wentao Mao, Gengyu Ma, Wenjie Mou, Yu Gao, Eunseok Park
-
Patent number: 10620712
Abstract: A method for human-machine interaction includes monitoring a movement of an object by a sensor that detects positions of the object over time, generating a time-dependent velocity of the object based on the movement of the object, detecting a tapping event of the object tapping on a surface by detecting a sudden change of the time-dependent velocity, and determining a position of the object at a time when the tapping event occurs as a tapping position of the object.
Type: Grant
Filed: February 12, 2019
Date of Patent: April 14, 2020
Assignee: uSens, Inc.
Inventors: Anli He, Yue Fei
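The tap-detection method this abstract describes (estimate a time-dependent velocity from tracked positions, then flag a tap where the velocity changes suddenly) can be sketched as follows. This is a minimal one-dimensional sketch: the finite-difference velocity estimate, the deceleration threshold, and all names are illustrative assumptions, not details taken from the patent.

```python
def detect_tap(positions, timestamps, decel_threshold=1.0):
    """Return the object's position at a sudden velocity drop, else None.

    positions/timestamps: equal-length samples of a tracked object.
    decel_threshold: minimum drop between consecutive speed samples
    that counts as a "sudden change" (illustrative parameter).
    """
    # Time-dependent speed from successive position samples.
    speeds = []
    for i in range(1, len(positions)):
        dt = timestamps[i] - timestamps[i - 1]
        speeds.append(abs(positions[i] - positions[i - 1]) / dt)
    # A tap is a sudden deceleration between consecutive speed samples.
    for i in range(1, len(speeds)):
        if speeds[i - 1] - speeds[i] > decel_threshold:
            return positions[i + 1]  # position at the moment of the tap
    return None
```

A fingertip approaching a surface at constant speed and then stopping abruptly would trigger the threshold at the stopping sample, and that sample's position is reported as the tapping position.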
-
Publication number: 20190179419
Abstract: A method for human-machine interaction includes monitoring a movement of an object by a sensor that detects positions of the object over time, generating a time-dependent velocity of the object based on the movement of the object, detecting a tapping event of the object tapping on a surface by detecting a sudden change of the time-dependent velocity, and determining a position of the object at a time when the tapping event occurs as a tapping position of the object.
Type: Application
Filed: February 12, 2019
Publication date: June 13, 2019
Inventors: Anli He, Yue Fei
-
Patent number: 10320437
Abstract: A foldable apparatus is disclosed. The apparatus may comprise at least one camera configured to acquire an image of a physical environment, an orientation and position determination module configured to determine a change in orientation and/or position of the apparatus with respect to the physical environment based on the acquired image, a housing configured to hold the at least one camera and the orientation and position determination module, and a first strap attached to the housing and configured to attach the housing to a head of a user of the apparatus.
Type: Grant
Filed: March 3, 2016
Date of Patent: June 11, 2019
Assignee: uSens, Inc.
Inventors: Yue Fei, Anli He
-
Patent number: 10256859
Abstract: A method is disclosed. The method may be implemented by an immersive and interactive multimedia generation system. The method may comprise projecting, by a first component of the system, a plurality of patterns to a physical environment in which a second component of the system is located, acquiring, by the second component of the system, a first image and a second image of at least a part of the physical environment, the first and second images including, respectively, first and second pixel data corresponding to at least some of the plurality of projected patterns, and determining, by the second component of the system, a change in at least one of an orientation or a position of the second component within the physical environment based on a relationship between the first and the second pixel data.
Type: Grant
Filed: March 3, 2016
Date of Patent: April 9, 2019
Assignee: uSens, Inc.
Inventors: Yue Fei, Anli He
-
Patent number: 10223834
Abstract: An apparatus comprising a processor and a non-transitory computer-readable storage medium storing instructions that, when executed by the processor, cause the apparatus to perform a method comprising: determining a first and a second set of coordinates, associated with a first and a second feature, respectively, in one or more first and second images, respectively, where the first and second images are captured from a first and a second position and/or orientation relative to the physical environment, respectively; re-projecting the first set of coordinates to one or more 2D spaces associated with the second images; comparing the re-projected first set of coordinates with the second set of coordinates in at least one of position closeness, feature closeness, or stereo constraints to determine a correspondence between the first and second features; and determining a change between the first and second orientations and/or positions with respect to the physical environment based on the determined correspondence.
Type: Grant
Filed: December 14, 2017
Date of Patent: March 5, 2019
Assignee: uSens, Inc.
Inventors: Yue Fei, Anli He
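The comparison step in this abstract (re-project first-view features into the second view, then match them to second-view features by position closeness) can be sketched as a thresholded nearest-neighbor matcher. The re-projection itself is passed in as a caller-supplied function, and the distance threshold and all names are illustrative assumptions rather than details from the patent:

```python
def match_by_position(first_pts, second_pts, reproject, max_dist=2.0):
    """Pair re-projected first-view points with nearby second-view points.

    reproject: maps a first-view point to its predicted (u, v) location
    in the second image (e.g. via an estimated camera motion).
    Returns a list of (first_index, second_index) correspondences.
    """
    matches = []
    for i, p in enumerate(first_pts):
        u, v = reproject(p)  # predicted location in the second image
        # Position closeness: nearest second-view feature within max_dist.
        best_j, best_d = None, max_dist
        for j, (x, y) in enumerate(second_pts):
            d = ((u - x) ** 2 + (v - y) ** 2) ** 0.5
            if d < best_d:
                best_j, best_d = j, d
        if best_j is not None:
            matches.append((i, best_j))
    return matches
```

The resulting correspondences are what the abstract's final step would consume when solving for the change in orientation and/or position; feature closeness and stereo constraints, which the abstract also mentions, would add further filters on top of this positional check.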
-
Patent number: 10203765
Abstract: A method for human-machine interaction includes monitoring a movement of an object by a sensor that detects positions of the object over time, generating a time-dependent velocity of the object based on the movement of the object, detecting a tapping event of the object tapping on a surface by detecting a sudden change of the time-dependent velocity, and determining a position of the object at a time when the tapping event occurs as a tapping position of the object.
Type: Grant
Filed: March 21, 2016
Date of Patent: February 12, 2019
Assignee: uSens, Inc.
Inventors: Anli He, Yue Fei
-
Publication number: 20180285636
Abstract: A system for hand tracking may comprise a head mounted display wearable by a user and a hand tracking camera module attached to the head mounted display and comprising at least one of a pair of stereo cameras or a depth camera. The hand tracking camera module may be configured to capture images of at least one physical hand of the user. The head mounted display may be configured to render a virtual hand resembling the physical hand in a virtual environment for viewing by the user based at least on the images.
Type: Application
Filed: March 30, 2018
Publication date: October 4, 2018
Inventors: Yue Fei, Anli He, Wentao Mao, Gengyu Ma, Wenjie Mou, Yu Gao, Eunseok Park
-
Publication number: 20180108180
Abstract: An apparatus comprising a processor and a non-transitory computer-readable storage medium storing instructions that, when executed by the processor, cause the apparatus to perform a method comprising: determining a first and a second set of coordinates, associated with a first and a second feature, respectively, in one or more first and second images, respectively, where the first and second images are captured from a first and a second position and/or orientation relative to the physical environment, respectively; re-projecting the first set of coordinates to one or more 2D spaces associated with the second images; comparing the re-projected first set of coordinates with the second set of coordinates in at least one of position closeness, feature closeness, or stereo constraints to determine a correspondence between the first and second features; and determining a change between the first and second orientations and/or positions with respect to the physical environment based on the determined correspondence.
Type: Application
Filed: December 14, 2017
Publication date: April 19, 2018
Inventors: Yue Fei, Anli He
-
Patent number: 9858722
Abstract: An apparatus is disclosed. The apparatus comprises an optical sensing system that comprises at least one camera, the at least one camera being configured to acquire an image of a physical environment. The apparatus further comprises a processing system. The processing system comprises an orientation and position determination module configured to detect salient features from the image, and determine a change in orientation and/or position of the apparatus with respect to the physical environment based on the detected salient features. The processing system also comprises a rendering module configured to determine a rendering of the physical environment based on the image and on the determined change in orientation and/or position of the apparatus, and provide data related to the rendering of the physical environment to a display system.
Type: Grant
Filed: October 26, 2015
Date of Patent: January 2, 2018
Assignee: uSens, Inc.
Inventors: Yue Fei, Anli He
-
Publication number: 20160261300
Abstract: A foldable apparatus is disclosed. The apparatus may comprise at least one camera configured to acquire an image of a physical environment, an orientation and position determination module configured to determine a change in orientation and/or position of the apparatus with respect to the physical environment based on the acquired image, a housing configured to hold the at least one camera and the orientation and position determination module, and a first strap attached to the housing and configured to attach the housing to a head of a user of the apparatus.
Type: Application
Filed: March 3, 2016
Publication date: September 8, 2016
Inventors: Yue Fei, Anli He
-
Publication number: 20160260260
Abstract: A method is disclosed. The method may be implemented by an immersive and interactive multimedia generation system. The method may comprise projecting, by a first component of the system, a plurality of patterns to a physical environment in which a second component of the system is located, acquiring, by the second component of the system, a first image and a second image of at least a part of the physical environment, the first and second images including, respectively, first and second pixel data corresponding to at least some of the plurality of projected patterns, and determining, by the second component of the system, a change in at least one of an orientation or a position of the second component within the physical environment based on a relationship between the first and the second pixel data.
Type: Application
Filed: March 3, 2016
Publication date: September 8, 2016
Inventors: Yue Fei, Anli He
-
Publication number: 20160202844
Abstract: A method for human-machine interaction includes monitoring a movement of an object by a sensor that detects positions of the object over time, generating a time-dependent velocity of the object based on the movement of the object, detecting a tapping event of the object tapping on a surface by detecting a sudden change of the time-dependent velocity, and determining a position of the object at a time when the tapping event occurs as a tapping position of the object.
Type: Application
Filed: March 21, 2016
Publication date: July 14, 2016
Inventors: Anli He, Yue Fei
-
Publication number: 20160117860
Abstract: An apparatus is disclosed. The apparatus comprises an optical sensing system that comprises at least one camera, the at least one camera being configured to acquire an image of a physical environment. The apparatus further comprises a processing system. The processing system comprises an orientation and position determination module configured to detect salient features from the image, and determine a change in orientation and/or position of the apparatus with respect to the physical environment based on the detected salient features. The processing system also comprises a rendering module configured to determine a rendering of the physical environment based on the image and on the determined change in orientation and/or position of the apparatus, and provide data related to the rendering of the physical environment to a display system.
Type: Application
Filed: October 26, 2015
Publication date: April 28, 2016
Inventors: Yue Fei, Anli He
-
Patent number: 9323338
Abstract: A method for three-dimensional (3D) sensing. The method includes obtaining a first two-dimensional (2D) skeleton of an object, obtaining a second 2D skeleton of the object different from the first 2D skeleton, and calculating a 3D skeleton of the object based on the first and second 2D skeletons.
Type: Grant
Filed: September 23, 2013
Date of Patent: April 26, 2016
Assignee: uSens, Inc.
Inventors: Anli He, Yue Fei
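One common way to compute a 3D skeleton from two 2D skeletons is per-joint stereo triangulation. The abstract does not fix a camera model, so this sketch assumes a rectified stereo pinhole pair; the focal length `f`, baseline `b`, and all names are illustrative assumptions:

```python
def triangulate_skeleton(left_joints, right_joints, f=500.0, b=0.06):
    """3D skeleton from two 2D skeletons (rectified stereo assumption).

    left_joints/right_joints: per-joint (x, y) pixel coordinates of the
    same skeleton seen by the left and right cameras.
    f: focal length in pixels; b: baseline in meters (illustrative).
    """
    skeleton_3d = []
    for (xl, yl), (xr, _) in zip(left_joints, right_joints):
        disparity = xl - xr       # horizontal shift between the two views
        z = f * b / disparity     # depth along the optical axis
        x = xl * z / f            # back-project to metric coordinates
        y = yl * z / f
        skeleton_3d.append((x, y, z))
    return skeleton_3d
```

Joints with larger disparity come out closer to the camera; a real system would also handle missing joints and non-rectified geometry, which this sketch omits.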
-
Publication number: 20150277700
Abstract: A system for providing a graphical user interface is provided. The system includes a display, at least one imaging sensor configured to capture at least one image associated with a user, one or more processors, and a memory for storing instructions executable by the one or more processors. The one or more processors may be configured to detect a gesture of a target part of the user based on the at least one image, and determine, based on the gesture of the target part of the user, three-dimensional (3D) coordinates of at least one 3D object in a 3D coordinate system. The one or more processors may be further configured to perform a projection of the at least one 3D object onto the display based on the 3D coordinates and render the at least one 3D object on the display according to the projection.
Type: Application
Filed: June 16, 2015
Publication date: October 1, 2015
Inventor: Anli He
-
Publication number: 20140354602
Abstract: A method for generating and displaying a graphic representation of an object on a display screen. The method includes capturing at least one image of the object using at least one image sensor, determining, according to the at least one image, three-dimensional (3D) coordinates of a 3D point on the object in a 3D coordinate system defined in a space containing the object, defining a touch interactive surface in the space, performing a projection of the 3D point onto a projection point on the touch interactive surface, determining 3D coordinates of the projection point in the 3D coordinate system according to the projection, determining a displaying position of the graphic representation on the display screen according to the 3D coordinates of the projection point, and displaying the graphic representation at the displaying position on the display screen.
Type: Application
Filed: August 18, 2014
Publication date: December 4, 2014
Inventors: Anli He, Yue Fei
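The central geometric step here, projecting a tracked 3D point onto a touch interactive surface, can be sketched for the simplest case of a planar surface. The orthogonal-projection choice and all parameter names are illustrative assumptions; the application's surface need not be a plane:

```python
def project_onto_plane(point, plane_point, plane_normal):
    """Orthogonal projection of a 3D point onto a plane.

    plane_point: any point on the touch interactive surface.
    plane_normal: unit normal of that surface (assumed planar here).
    """
    px, py, pz = point
    ox, oy, oz = plane_point
    nx, ny, nz = plane_normal
    # Signed distance from the point to the plane along the normal.
    d = (px - ox) * nx + (py - oy) * ny + (pz - oz) * nz
    # Slide the point back along the normal by that distance.
    return (px - d * nx, py - d * ny, pz - d * nz)
```

The projection point's in-plane coordinates would then be mapped to a display position, e.g. by a fixed affine transform from the surface's coordinate frame to screen pixels.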
-
Patent number: D753111
Type: Grant
Filed: March 2, 2015
Date of Patent: April 5, 2016
Assignee: uSens, Inc.
Inventors: Yue Fei, Anli He, Lian Zhu, Zhichuan Qian
-
Patent number: D753112
Type: Grant
Filed: March 2, 2015
Date of Patent: April 5, 2016
Assignee: uSens, Inc.
Inventors: Yue Fei, Anli He, Lian Zhu, Zhichuan Qian
-
Patent number: D753113
Type: Grant
Filed: March 2, 2015
Date of Patent: April 5, 2016
Assignee: uSens, Inc.
Inventors: Yue Fei, Anli He, Lian Zhu, Zhichuan Qian