Patents by Inventor Branislav Micusik

Branislav Micusik has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240135555
    Abstract: A method for carving a 3D space using hand tracking is described. In one aspect, a method includes accessing a first frame from a camera of a display device, tracking, using a hand tracking algorithm operating at the display device, hand pixels corresponding to one or more user hands depicted in the first frame, detecting, using a sensor of the display device, depths of the hand pixels, identifying a 3D region based on the depths of the hand pixels, and applying a 3D reconstruction engine to the 3D region.
    Type: Application
    Filed: October 24, 2022
    Publication date: April 25, 2024
    Inventors: Branislav Micusik, Georgios Evangelidis, Daniel Wolf
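The carving step described in the abstract above can be illustrated with a minimal sketch: back-project the tracked hand pixels through a pinhole camera model and bound the resulting 3D points. The function name, intrinsics parameters, and padding margin are illustrative assumptions, not the patented implementation.

```python
import numpy as np

def carve_region_from_hands(depth_map, hand_mask, fx, fy, cx, cy, margin=0.1):
    """Back-project hand pixels to 3D and return an axis-aligned region
    (min/max corners) that bounds them, padded by `margin` metres."""
    vs, us = np.nonzero(hand_mask)          # pixel coordinates of hand pixels
    zs = depth_map[vs, us]                  # sensor depths at those pixels
    valid = zs > 0                          # discard missing depth readings
    us, vs, zs = us[valid], vs[valid], zs[valid]
    xs = (us - cx) * zs / fx                # pinhole back-projection
    ys = (vs - cy) * zs / fy
    pts = np.stack([xs, ys, zs], axis=1)
    return pts.min(axis=0) - margin, pts.max(axis=0) + margin
```

The returned box could then be handed to a 3D reconstruction engine so that only the carved region is reconstructed.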
  • Patent number: 11915453
    Abstract: Eyewear providing an interactive augmented reality experience between two eyewear devices by using alignment between respective 6DOF trajectories, also referred to herein as ego motion alignment. An eyewear device of user A and an eyewear device of user B track the eyewear device of the other user, or an object of the other user, such as on the user's face, to provide the collaborative AR experience. This enables sharing common three-dimensional content between multiple eyewear users without using or aligning the eyewear devices to common image content such as a marker, which is a more lightweight solution with reduced computational burden on a processor. An inertial measurement unit may also be used to align the eyewear devices.
    Type: Grant
    Filed: February 7, 2023
    Date of Patent: February 27, 2024
    Assignee: Snap Inc.
    Inventors: Branislav Micusik, Georgios Evangelidis, Daniel Wagner
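One standard way to realize the 6DOF trajectory (ego motion) alignment described above is a rigid Kabsch/Umeyama-style fit between two corresponding position tracks of the same device, each expressed in a different headset's coordinate frame. This is a generic sketch under that assumption, not the method claimed in the patent.

```python
import numpy as np

def align_trajectories(traj_a, traj_b):
    """Rigid alignment (Kabsch): find R, t such that traj_b ~= traj_a @ R.T + t.
    traj_a, traj_b: (N, 3) corresponding positions of the same tracked device,
    expressed in each eyewear's own coordinate frame."""
    mu_a, mu_b = traj_a.mean(0), traj_b.mean(0)
    H = (traj_a - mu_a).T @ (traj_b - mu_b)    # cross-covariance of centred tracks
    U, _, Vt = np.linalg.svd(H)
    # Force a proper rotation (det = +1), guarding against reflections.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = mu_b - R @ mu_a
    return R, t
```

With R and t known, AR content placed in one user's frame can be re-expressed in the other's without any shared marker.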
  • Publication number: 20230421717
    Abstract: A method for generating a virtual selfie stick image is described. In one aspect, the method includes generating, at a device, an original self-portrait image with an optical sensor of the device, the optical sensor directed at a face of a user of the device, the device being held at arm's length from the face of the user, displaying, on a display of the device, an instruction guiding the user to move the device at arm's length about the face of the user within a limited range at a plurality of poses, accessing, at the device, image data generated by the optical sensor at the plurality of poses, and generating a virtual selfie stick self-portrait image based on the original self-portrait image and the image data.
    Type: Application
    Filed: June 28, 2022
    Publication date: December 28, 2023
    Inventors: Kai Zhou, Branislav Micusik
  • Publication number: 20230401796
    Abstract: A method for aligning coordinate systems from separate augmented reality (AR) devices is described. In one aspect, the method includes generating predicted depths of a first point cloud by applying a pre-trained model to a first single image generated by a first monocular camera of a first augmented reality (AR) device, and first sparse 3D points generated by a first SLAM system at the first AR device, generating predicted depths of a second point cloud by applying the pre-trained model to a second single image generated by a second monocular camera of the second AR device, and second sparse 3D points generated by a second SLAM system at the second AR device, and determining a relative pose between the first AR device and the second AR device by registering the first point cloud with the second point cloud.
    Type: Application
    Filed: August 23, 2022
    Publication date: December 14, 2023
    Inventors: Georgios Evangelidis, Branislav Micusik, Jakob Zillner, Nathan Jacob Litke
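A key detail in the abstract above is that monocular depth predictions are only defined up to scale, while the SLAM points are metric. A minimal sketch of one plausible reconciliation step, a least-squares scale fit between predicted and sparse depths, is shown below; the function name and the closed form are illustrative assumptions, not the claimed method.

```python
import numpy as np

def fit_depth_scale(pred_depths, sparse_depths):
    """Least-squares scale s minimizing ||s * pred - sparse||^2, used to put
    up-to-scale monocular depth predictions into metric SLAM units.
    pred_depths, sparse_depths: (N,) depths at the same pixels."""
    return float(np.dot(pred_depths, sparse_depths) / np.dot(pred_depths, pred_depths))
```

Once both predicted point clouds are in metric units, a standard rigid registration between them yields the relative pose of the two AR devices.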
  • Publication number: 20230359038
    Abstract: Eyewear having unsynchronized rolling shutter (RS) cameras such that images produced by each camera are unaligned, and a state that includes velocity and gravity orientation in the eyewear's reference system is calculated. The state is not limited to these two parameters as other parameters such as the acceleration bias, the gyroscope bias, or both may be included. Math solvers are used such that the processing time to calculate the velocity and gravity orientation is acceptable. Arranging the RS cameras in an unsynchronized configuration allows estimating the motion of the eyewear from just one stereo image pair and removes the requirement of possessing more images.
    Type: Application
    Filed: July 19, 2023
    Publication date: November 9, 2023
    Inventors: Branislav Micusik, Georgios Evangelidis
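The intuition behind the unsynchronized rolling-shutter idea can be reduced to a toy constant-velocity example: the same static scene point, back-projected from the two cameras, is observed at slightly different times, so its apparent displacement over that offset reveals the device's velocity. This is a heavily simplified illustration of the principle, not the solver the patent describes.

```python
import numpy as np

def velocity_from_unsynced_observations(p_left, p_right, dt):
    """Toy constant-velocity estimate: the same static scene point is observed
    by the two unsynchronized rolling-shutter cameras at times separated by dt.
    Its apparent displacement over dt gives the device velocity (sign-flipped:
    if the device moves +v, static points appear to move -v in device frame)."""
    return -(np.asarray(p_right) - np.asarray(p_left)) / dt
```

In practice many such constraints, plus gravity and bias terms, would be stacked into one solvable system per stereo pair.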
  • Publication number: 20230267691
    Abstract: A method for detecting changes in a scene includes accessing a first set of images and corresponding pose data in a first coordinate system associated with a first user session of an augmented reality (AR) device and accessing a second set of images and corresponding pose data in a second coordinate system associated with a second user session. The method identifies the first set of images corresponding to a second image from the second set of images based on the pose data of the first set of images being determined spatially closest to the pose data of the second image after aligning the first coordinate system and the second coordinate system. A trained neural network generates a synthesized image from the first set of images. Features of the second image are subtracted from features of the synthesized image. Areas of change are identified based on the subtracted features.
    Type: Application
    Filed: February 22, 2022
    Publication date: August 24, 2023
    Inventor: Branislav Micusik
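The feature-subtraction step at the end of the abstract above can be sketched as a per-pixel distance between two aligned feature maps, thresholded into a binary change mask. The function name, the L2 distance, and the threshold value are illustrative assumptions.

```python
import numpy as np

def change_mask(feat_synth, feat_obs, thresh=0.5):
    """Per-pixel change score: L2 distance between feature maps of the
    synthesized (previous-session) view and the observed (current) image,
    thresholded into a binary change mask.
    feat_synth, feat_obs: (H, W, C) feature maps, assumed spatially aligned."""
    diff = np.linalg.norm(feat_synth - feat_obs, axis=-1)
    return diff > thresh
```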
  • Patent number: 11726327
    Abstract: Eyewear having unsynchronized rolling shutter (RS) cameras such that images produced by each camera are unaligned, and a state that includes velocity and gravity orientation in the eyewear's reference system is calculated. The state is not limited to these two parameters as other parameters such as the acceleration bias, the gyroscope bias, or both may be included. Math solvers are used such that the processing time to calculate the velocity and gravity orientation is acceptable. Arranging the RS cameras in an unsynchronized configuration allows estimating the motion of the eyewear from just one stereo image pair and removes the requirement of possessing more images.
    Type: Grant
    Filed: September 15, 2020
    Date of Patent: August 15, 2023
    Assignee: Snap Inc.
    Inventors: Branislav Micusik, Georgios Evangelidis
  • Publication number: 20230186521
    Abstract: Eyewear providing an interactive augmented reality experience between two eyewear devices by using alignment between respective 6DOF trajectories, also referred to herein as ego motion alignment. An eyewear device of user A and an eyewear device of user B track the eyewear device of the other user, or an object of the other user, such as on the user's face, to provide the collaborative AR experience. This enables sharing common three-dimensional content between multiple eyewear users without using or aligning the eyewear devices to common image content such as a marker, which is a more lightweight solution with reduced computational burden on a processor. An inertial measurement unit may also be used to align the eyewear devices.
    Type: Application
    Filed: February 7, 2023
    Publication date: June 15, 2023
    Inventors: Branislav Micusik, Georgios Evangelidis, Daniel Wagner
  • Patent number: 11587255
    Abstract: Eyewear providing an interactive augmented reality experience between two eyewear devices by using alignment between respective 6DOF trajectories, also referred to herein as ego motion alignment. An eyewear device of user A and an eyewear device of user B track the eyewear device of the other user, or an object of the other user, such as on the user's face, to provide the collaborative AR experience. This enables sharing common three-dimensional content between multiple eyewear users without using or aligning the eyewear devices to common image content such as a marker, which is a more lightweight solution with reduced computational burden on a processor. An inertial measurement unit may also be used to align the eyewear devices.
    Type: Grant
    Filed: September 25, 2020
    Date of Patent: February 21, 2023
    Assignee: Snap Inc.
    Inventors: Branislav Micusik, Georgios Evangelidis, Daniel Wagner
  • Publication number: 20220375110
    Abstract: A method for AR-guided depth estimation is described. The method includes identifying a virtual object rendered in a first frame that is generated based on a first pose of an augmented reality (AR) device, determining a second pose of the AR device, the second pose following the first pose, identifying an augmentation area in the second frame based on the virtual object rendered in the first frame, and the second pose, determining depth information for the augmentation area in the second frame, and rendering the virtual object in the second frame based on the depth information.
    Type: Application
    Filed: November 18, 2021
    Publication date: November 24, 2022
    Inventors: Georgios Evangelidis, Branislav Micusik, Sagi Katz
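The augmentation-area idea above can be sketched by projecting the virtual object's 3D points into the second frame's camera and keeping a padded bounding box, so depth only needs to be computed inside that box. All names, the 4x4 pose convention, and the padding are illustrative assumptions.

```python
import numpy as np

def augmentation_area(obj_pts_w, T_cam2_w, K, pad=8):
    """Project the virtual object's 3D points (world frame) into the second
    frame's camera and return a padded pixel bounding box (u0, v0, u1, v1).
    T_cam2_w: 4x4 world-to-camera transform for the second pose; K: intrinsics."""
    pts_h = np.hstack([obj_pts_w, np.ones((len(obj_pts_w), 1))])
    pts_c = (T_cam2_w @ pts_h.T).T[:, :3]          # world -> camera 2
    uv = (K @ pts_c.T).T
    uv = uv[:, :2] / uv[:, 2:3]                    # perspective divide
    (u0, v0), (u1, v1) = uv.min(0), uv.max(0)
    return u0 - pad, v0 - pad, u1 + pad, v1 + pad
```

Restricting depth estimation to this box is what makes the rendering step cheaper than computing depth for the full frame.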
  • Patent number: 11450022
    Abstract: A method for detecting a loop closure is described. A device accesses pose information and a three-dimensional map of feature points generated by a visual inertia system of the device. The device splits the pose information into a translational part and a rotational part. The device limits the translational part to two-dimensional coordinates and estimates two-dimensional information of the limited translational part based on an accumulator voting space. The device determines an updated pose of the device based on the estimated two-dimensional information, the rotational part, and the three-dimensional map. The pose information is updated with the updated pose.
    Type: Grant
    Filed: October 27, 2020
    Date of Patent: September 20, 2022
    Assignee: RPX Corporation
    Inventor: Branislav Micusik
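The accumulator-voting step in the loop-closure abstract above can be illustrated with a minimal 2D grid vote: each candidate translation casts a vote into a quantized cell, and the most-voted cell wins. Bin size and function name are illustrative assumptions, not the patented scheme.

```python
import numpy as np

def vote_translation(candidate_xy, bin_size=0.25):
    """Accumulator voting over candidate 2D translations: quantize each
    candidate (x, y) into a grid cell and return the centre of the cell
    with the most votes as the consensus correction."""
    bins = np.floor(np.asarray(candidate_xy) / bin_size).astype(int)
    cells, counts = np.unique(bins, axis=0, return_counts=True)
    best = cells[counts.argmax()]                  # cell with the most votes
    return (best + 0.5) * bin_size                 # centre of the winning cell
```

Voting of this kind is robust to outlier candidates, which is exactly the property a loop-closure correction needs.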
  • Publication number: 20220082827
    Abstract: Eyewear having unsynchronized rolling shutter (RS) cameras such that images produced by each camera are unaligned, and a state that includes velocity and gravity orientation in the eyewear's reference system is calculated. The state is not limited to these two parameters as other parameters such as the acceleration bias, the gyroscope bias, or both may be included. Math solvers are used such that the processing time to calculate the velocity and gravity orientation is acceptable. Arranging the RS cameras in an unsynchronized configuration allows estimating the motion of the eyewear from just one stereo image pair and removes the requirement of possessing more images.
    Type: Application
    Filed: September 15, 2020
    Publication date: March 17, 2022
    Inventors: Branislav Micusik, Georgios Evangelidis
  • Publication number: 20210110572
    Abstract: A method for detecting a loop closure is described. A device accesses pose information and a three-dimensional map of feature points generated by a visual inertia system of the device. The device splits the pose information into a translational part and a rotational part. The device limits the translational part to two-dimensional coordinates and estimates two-dimensional information of the limited translational part based on an accumulator voting space. The device determines an updated pose of the device based on the estimated two-dimensional information, the rotational part, and the three-dimensional map. The pose information is updated with the updated pose.
    Type: Application
    Filed: October 27, 2020
    Publication date: April 15, 2021
    Inventor: Branislav Micusik
  • Patent number: 10861186
    Abstract: A method for detecting a loop closure is described. A device accesses pose information and a three-dimensional map of feature points generated by a visual inertia system of the device. The device splits the pose information into a translational part and a rotational part. The device limits the translational part to two-dimensional coordinates and estimates two-dimensional information of the limited translational part based on an accumulator voting space. The device determines an updated pose of the device based on the estimated two-dimensional information, the rotational part, and the three-dimensional map. The pose information is updated with the updated pose.
    Type: Grant
    Filed: August 28, 2018
    Date of Patent: December 8, 2020
    Assignee: RPX Corporation
    Inventor: Branislav Micusik
  • Patent number: 10586395
    Abstract: A system and method for offloading object detection are described. A server receives first sensor data from a first sensor of an augmented reality (AR) display device. The first sensor data indicates a pose of the AR display device relative to a first reference coordinate system. The server detects a physical object using second sensor data received from a second sensor of the AR display device. The server determines, based on the second sensor data, a pose of the physical object relative to the AR display device. The server then determines the pose of the physical object relative to the first reference coordinate system based on the pose of the physical object relative to the AR display device and the pose of the AR display device relative to the first reference coordinate system.
    Type: Grant
    Filed: May 9, 2018
    Date of Patent: March 10, 2020
    Assignee: DAQRI, LLC
    Inventors: Brian Mullins, Daniel Wolf, Jakob Zillner, Branislav Micusik, William Hoff
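The final pose determination in the abstract above is a plain composition of homogeneous transforms: object-in-reference = device-in-reference composed with object-in-device. A minimal sketch, with hypothetical helper names:

```python
import numpy as np

def make_pose(R, t):
    """Build a 4x4 homogeneous transform from rotation R (3x3) and translation t (3,)."""
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

def object_pose_in_reference(T_ref_dev, T_dev_obj):
    """Compose the two measured poses: the object's pose in the reference
    coordinate system is the device's pose in that system chained with the
    object's pose relative to the device."""
    return T_ref_dev @ T_dev_obj
```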
  • Publication number: 20200074670
    Abstract: A method for detecting a loop closure is described. A device accesses pose information and a three-dimensional map of feature points generated by a visual inertia system of the device. The device splits the pose information into a translational part and a rotational part. The device limits the translational part to two-dimensional coordinates and estimates two-dimensional information of the limited translational part based on an accumulator voting space. The device determines an updated pose of the device based on the estimated two-dimensional information, the rotational part, and the three-dimensional map. The pose information is updated with the updated pose.
    Type: Application
    Filed: August 28, 2018
    Publication date: March 5, 2020
    Inventor: Branislav Micusik
  • Publication number: 20180261012
    Abstract: A system and method for offloading object detection are described. A server receives first sensor data from a first sensor of an augmented reality (AR) display device. The first sensor data indicates a pose of the AR display device relative to a first reference coordinate system. The server detects a physical object using second sensor data received from a second sensor of the AR display device. The server determines, based on the second sensor data, a pose of the physical object relative to the AR display device. The server then determines the pose of the physical object relative to the first reference coordinate system based on the pose of the physical object relative to the AR display device and the pose of the AR display device relative to the first reference coordinate system.
    Type: Application
    Filed: May 9, 2018
    Publication date: September 13, 2018
    Inventors: Brian Mullins, Daniel Wolf, Jakob Zillner, Branislav Micusik, William Hoff