Patents by Inventor Georgios Evangelidis

Georgios Evangelidis has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240135555
    Abstract: A method for carving a 3D space using hand tracking is described. In one aspect, the method includes accessing a first frame from a camera of a display device, tracking, using a hand tracking algorithm operating at the display device, hand pixels corresponding to one or more user hands depicted in the first frame, detecting, using a sensor of the display device, depths of the hand pixels, identifying a 3D region based on the depths of the hand pixels, and applying a 3D reconstruction engine to the 3D region.
    Type: Application
    Filed: October 24, 2022
    Publication date: April 25, 2024
    Inventors: Branislav Micusik, Georgios Evangelidis, Daniel Wolf
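The core idea of the abstract above (back-project tracked hand pixels with their depths and carve out the 3D region to reconstruct) can be illustrated with a short sketch. The Python below is a hypothetical simplification assuming a pinhole camera model; the function name, margin parameter, and axis-aligned-box representation are inventions of this sketch, not the patented implementation:

```python
import numpy as np

def carve_3d_region(hand_pixels, depths, intrinsics, margin=0.1):
    """Back-project hand pixels to 3D camera coordinates and return an
    axis-aligned bounding box (min/max corners) grown by `margin` metres.

    hand_pixels: (N, 2) array of (u, v) image coordinates
    depths:      (N,) array of per-pixel depths in metres
    intrinsics:  (fx, fy, cx, cy) pinhole camera parameters
    """
    fx, fy, cx, cy = intrinsics
    u, v = hand_pixels[:, 0], hand_pixels[:, 1]
    # Pinhole back-projection: pixel + depth -> 3D point in camera frame
    x = (u - cx) * depths / fx
    y = (v - cy) * depths / fy
    pts = np.stack([x, y, depths], axis=1)
    return pts.min(axis=0) - margin, pts.max(axis=0) + margin
```

The returned box would then delimit the volume handed to a 3D reconstruction engine.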
  • Publication number: 20240126084
    Abstract: An energy-efficient adaptive 3D sensing system. The adaptive 3D sensing system includes one or more cameras and one or more projectors. The adaptive 3D sensing system captures images of a real-world scene using the one or more cameras and computes depth estimates and depth estimate confidence values for pixels of the images. The adaptive 3D sensing system computes an attention mask based on the depth estimate confidence values and commands the one or more projectors to send a distributed laser beam into one or more areas of the real-world scene based on the attention mask. The adaptive 3D sensing system captures 3D sensing image data of the one or more areas of the real-world scene and generates 3D sensing data for the real-world scene based on the 3D sensing image data.
    Type: Application
    Filed: April 13, 2023
    Publication date: April 18, 2024
    Inventors: Jian Wang, Sizhuo Ma, Brevin Tilmon, Yicheng Wu, Gurunandan Krishnan Gorumkonda, Ramzi Zahreddine, Georgios Evangelidis
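The attention-mask step of the abstract above (illuminate only where depth confidence is low) can be sketched in a few lines. This is a minimal toy version assuming per-pixel confidences in [0, 1] and a coarse projector-zone grid; the threshold and grid layout are assumptions of this sketch:

```python
import numpy as np

def attention_mask(confidence, threshold=0.5):
    """Mark low-confidence depth pixels as candidates for active
    illumination by the projector's distributed laser beam.

    confidence: (H, W) array of per-pixel depth-confidence values in [0, 1]
    Returns a boolean mask; True pixels need more sensing.
    """
    return confidence < threshold

def select_projector_zones(mask, grid=(4, 4)):
    """Divide the mask into a coarse grid of projector zones and return
    the fraction of low-confidence pixels in each zone."""
    H, W = mask.shape
    gh, gw = grid
    assert H % gh == 0 and W % gw == 0, "grid must tile the image"
    # Block-average the boolean mask: reshape into (gh, bh, gw, bw) blocks
    return mask.reshape(gh, H // gh, gw, W // gw).mean(axis=(1, 3))
```

Zones with a high low-confidence fraction would then receive the steered illumination.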
  • Patent number: 11915453
    Abstract: Eyewear providing an interactive augmented reality experience between two eyewear devices by using alignment between respective 6DOF trajectories, also referred to herein as ego motion alignment. An eyewear device of user A and an eyewear device of user B track the eyewear device of the other user, or an object of the other user, such as the other user's face, to provide the collaborative AR experience. This enables sharing common three-dimensional content between multiple eyewear users without using or aligning the eyewear devices to common image content such as a marker, which is a more lightweight solution with reduced computational burden on a processor. An inertial measurement unit may also be used to align the eyewear devices.
    Type: Grant
    Filed: February 7, 2023
    Date of Patent: February 27, 2024
    Assignee: Snap Inc.
    Inventors: Branislav Micusik, Georgios Evangelidis, Daniel Wagner
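Aligning two devices' 6DOF trajectories, as in the abstract above, amounts to finding the rigid transform that maps one trajectory onto the other. The sketch below uses the classic Kabsch algorithm on corresponding trajectory positions as a stand-in illustration; it is not the patented method, and the function name and point-correspondence assumption are inventions of this sketch:

```python
import numpy as np

def align_trajectories(traj_a, traj_b):
    """Rigid (rotation + translation) alignment of two corresponding 3D
    trajectories via the Kabsch algorithm.

    traj_a, traj_b: (N, 3) arrays of corresponding device positions.
    Returns (R, t) such that R @ traj_b[i] + t ~= traj_a[i].
    """
    ca, cb = traj_a.mean(axis=0), traj_b.mean(axis=0)
    # Cross-covariance of the centred point sets
    H = (traj_b - cb).T @ (traj_a - ca)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection (det = -1) in the recovered rotation
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = ca - R @ cb
    return R, t
```

With (R, t) in hand, 3D content expressed in one device's frame can be re-expressed in the other's.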
  • Publication number: 20240020948
    Abstract: A vision transformer network having extremely low latency and usable on mobile devices, such as smart eyewear devices and other augmented reality (AR) and virtual reality (VR) devices. The transformer network processes an input image, and the network includes a convolution stem configured to patch embed the image. A first stack of stages including at least two stages of 4-Dimension (4D) metablocks (MBs) (MB4D) follows the convolution stem. A second stack of stages including at least two stages of 3-Dimension MBs (MB3D) follows the MB4D stages. Each of the MB4D stages and each of the MB3D stages include different layer configurations, and each of the MB4D stages and each of the MB3D stages include a token mixer. The MB3D stages each additionally include a multi-head self-attention (MHSA) processing block.
    Type: Application
    Filed: July 14, 2022
    Publication date: January 18, 2024
    Inventors: Jian Ren, Yang Wen, Ju Hu, Georgios Evangelidis, Sergey Tulyakov, Yanyu Li, Geng Yuan
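The two token-mixer flavours named in the abstract above can be sketched in plain NumPy: a cheap pooling-based mixer operating on a 4D (spatial) feature map, and an MHSA mixer operating on a 3D (token) sequence. This is a drastically simplified illustration; the 3x3 pooling window, identity Q/K/V projections, and function names are assumptions of this sketch, not the network's actual layers:

```python
import numpy as np

def pool_mixer(x):
    """MB4D-style token mixer: 3x3 average pooling minus identity,
    operating on an (H, W, C) feature map."""
    H, W, C = x.shape
    padded = np.pad(x, ((1, 1), (1, 1), (0, 0)), mode="edge")
    out = np.zeros_like(x)
    for dy in range(3):
        for dx in range(3):
            out += padded[dy:dy + H, dx:dx + W]
    return out / 9.0 - x

def mhsa(x, num_heads=2):
    """MB3D-style token mixer: multi-head self-attention on an (N, C)
    token sequence, with identity projections for brevity."""
    N, C = x.shape
    d = C // num_heads
    out = np.zeros_like(x)
    for h in range(num_heads):
        q = k = v = x[:, h * d:(h + 1) * d]
        # Scaled dot-product attention with a numerically stable softmax
        attn = q @ k.T / np.sqrt(d)
        attn = np.exp(attn - attn.max(axis=1, keepdims=True))
        attn /= attn.sum(axis=1, keepdims=True)
        out[:, h * d:(h + 1) * d] = attn @ v
    return out
```

The pooling mixer has no quadratic cost in token count, which is why such blocks can dominate the early, high-resolution stages.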
  • Publication number: 20230421895
    Abstract: A hand-tracking input pipeline dimming system for an AR system is provided. The AR system deactivates the hand-tracking input pipeline and places a camera component of the hand-tracking input pipeline in a limited operational mode. The AR system uses the camera component to detect initiation of a gesture by a user of the AR system and in response to detecting the initiation of the gesture, the AR system activates the hand-tracking input pipeline and places the camera component in a fully operational mode.
    Type: Application
    Filed: September 19, 2022
    Publication date: December 28, 2023
    Inventors: Jan Bajana, Daniel Colascione, Georgios Evangelidis, Erick Mendez Mendez, Daniel Wolf
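The dimming behaviour in the abstract above is essentially a small state machine: sleep in a limited camera mode, wake the full pipeline on gesture initiation. The toy Python below illustrates that control flow only; the class, mode names, and methods are hypothetical:

```python
from enum import Enum

class CameraMode(Enum):
    LIMITED = "limited"  # reduced frame rate / resolution while dimmed
    FULL = "full"        # full capture for hand tracking

class HandTrackingPipeline:
    """Toy state machine for the power-saving behaviour described:
    the pipeline idles in a limited camera mode and activates fully
    when the camera detects the initiation of a gesture."""

    def __init__(self):
        self.active = False
        self.camera_mode = CameraMode.LIMITED

    def on_frame(self, gesture_initiated: bool):
        # Wake the full pipeline only on a detected gesture initiation
        if not self.active and gesture_initiated:
            self.active = True
            self.camera_mode = CameraMode.FULL

    def dim(self):
        """Deactivate tracking and drop the camera back to limited mode."""
        self.active = False
        self.camera_mode = CameraMode.LIMITED
```

The energy saving comes from how little work `on_frame` does while the pipeline is dimmed.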
  • Publication number: 20230401796
    Abstract: A method for aligning coordinate systems from separate augmented reality (AR) devices is described. In one aspect, the method includes generating predicted depths of a first point cloud by applying a pre-trained model to a first single image generated by a first monocular camera of a first augmented reality (AR) device, and first sparse 3D points generated by a first SLAM system at the first AR device, generating predicted depths of a second point cloud by applying the pre-trained model to a second single image generated by a second monocular camera of a second AR device, and second sparse 3D points generated by a second SLAM system at the second AR device, and determining a relative pose between the first AR device and the second AR device by registering the first point cloud with the second point cloud.
    Type: Application
    Filed: August 23, 2022
    Publication date: December 14, 2023
    Inventors: Georgios Evangelidis, Branislav Micusik, Jakob Zillner, Nathan Jacob Litke
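A key ingredient of the abstract above is fusing a (scale-ambiguous) monocular depth prediction with metric sparse SLAM points before registering the two point clouds. One common, simple way to do that fusion is median scale alignment, sketched below; the function name and median-scale choice are assumptions of this sketch, not necessarily the patented procedure:

```python
import numpy as np

def scale_align_depth(pred_depth, sparse_uv, sparse_depth):
    """Scale-correct a monocular depth prediction using sparse SLAM
    depths observed at known pixel locations.

    pred_depth:   (H, W) network depth map (correct up to scale)
    sparse_uv:    (N, 2) integer (u, v) pixel coordinates of SLAM points
    sparse_depth: (N,) metric depths of those points
    Returns the depth map rescaled to metric units.
    """
    pred_at_pts = pred_depth[sparse_uv[:, 1], sparse_uv[:, 0]]
    # Median ratio is robust to outlier SLAM points
    scale = np.median(sparse_depth / pred_at_pts)
    return pred_depth * scale
```

Once both devices' depth maps are metric, the back-projected point clouds can be registered (e.g., by a rigid alignment) to recover the relative pose between the devices.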
  • Publication number: 20230359038
    Abstract: Eyewear having unsynchronized rolling shutter (RS) cameras such that images produced by each camera are unaligned, and a state that includes velocity and gravity orientation in the eyewear's reference system is calculated. The state is not limited to these two parameters as other parameters such as the acceleration bias, the gyroscope bias, or both may be included. Math solvers are used so that the processing time to calculate the velocity and gravity orientation is acceptable. Arranging the RS cameras in an unsynchronized configuration allows estimating the motion of the eyewear from just one stereo image pair and removes the requirement of capturing more images.
    Type: Application
    Filed: July 19, 2023
    Publication date: November 9, 2023
    Inventors: Branislav Micusik, Georgios Evangelidis
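The reason a single unsynchronized RS stereo pair constrains velocity, as in the abstract above, is that each image row is exposed at a slightly different time, so the rows themselves sample the camera's motion. The sketch below models only that row-time geometry under a constant-velocity assumption; it is an illustration of the rolling-shutter model, not the patented solver:

```python
import numpy as np

def rolling_shutter_row_positions(p0, velocity, row_readout_s, num_rows):
    """Constant-velocity rolling-shutter model: row r of the image is
    exposed at time r * row_readout_s, so the camera centre at that row
    is p0 + velocity * r * row_readout_s.

    p0:            (3,) camera position at the first row's exposure
    velocity:      (3,) camera velocity in m/s
    row_readout_s: per-row readout time in seconds
    Returns an (num_rows, 3) array of per-row camera positions.
    """
    times = np.arange(num_rows) * row_readout_s
    return p0[None, :] + velocity[None, :] * times[:, None]
```

A solver would invert this relation: given feature correspondences across rows of the two unsynchronized images, it recovers the velocity (and gravity direction) that best explains the observed row-by-row displacements.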
  • Publication number: 20230319476
    Abstract: Electronic eyewear device providing simplified audio source separation, also referred to as voice/sound unmixing, using alignment between respective device trajectories. Multiple users of electronic eyewear devices in an environment may simultaneously generate audio signals (e.g., voices/sounds) that are difficult to distinguish from one another. The electronic eyewear device tracks the location of moving remote electronic eyewear devices of other users, or an object of the other users, such as the remote user's face, to provide audio source separation using location of the sound sources. The simplified voice unmixing uses a microphone array of the electronic eyewear device and the known location of the remote user's electronic eyewear device with respect to the user's electronic eyewear device to facilitate audio source separation.
    Type: Application
    Filed: April 1, 2022
    Publication date: October 5, 2023
    Inventors: Georgios Evangelidis, Ashwani Arya, Jennica Pounds, Andrei Rybin
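Using a known source location plus a microphone array for separation, as in the abstract above, is classically done with a delay-and-sum beamformer: compensate each microphone's propagation delay toward the tracked device, then average. The Python below is a bare-bones integer-sample version of that standard technique, offered as an illustration rather than the patented unmixing method:

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s, at room temperature

def delay_and_sum(mic_signals, mic_positions, source_position, fs):
    """Steer a microphone array toward a known source location (here, a
    tracked remote eyewear device) by removing each microphone's
    relative propagation delay, then averaging the aligned signals.

    mic_signals:   (M, T) array of samples
    mic_positions: (M, 3) microphone positions in metres
    source_position: (3,) source position in metres
    fs: sample rate in Hz
    """
    dists = np.linalg.norm(mic_positions - source_position, axis=1)
    delays = (dists - dists.min()) / SPEED_OF_SOUND   # seconds
    shifts = np.round(delays * fs).astype(int)        # whole samples
    M, T = mic_signals.shape
    out = np.zeros(T)
    for m in range(M):
        # Advance each channel by its delay so all channels line up
        out[:T - shifts[m]] += mic_signals[m, shifts[m]:]
    return out / M
```

Signals arriving from the tracked location add coherently while other sources are attenuated, which is the "simplified audio source separation" effect the abstract describes.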
  • Patent number: 11726327
    Abstract: Eyewear having unsynchronized rolling shutter (RS) cameras such that images produced by each camera are unaligned, and a state that includes velocity and gravity orientation in the eyewear's reference system is calculated. The state is not limited to these two parameters as other parameters such as the acceleration bias, the gyroscope bias, or both may be included. Math solvers are used so that the processing time to calculate the velocity and gravity orientation is acceptable. Arranging the RS cameras in an unsynchronized configuration allows estimating the motion of the eyewear from just one stereo image pair and removes the requirement of capturing more images.
    Type: Grant
    Filed: September 15, 2020
    Date of Patent: August 15, 2023
    Assignee: Snap Inc.
    Inventors: Branislav Micusik, Georgios Evangelidis
  • Publication number: 20230186521
    Abstract: Eyewear providing an interactive augmented reality experience between two eyewear devices by using alignment between respective 6DOF trajectories, also referred to herein as ego motion alignment. An eyewear device of user A and an eyewear device of user B track the eyewear device of the other user, or an object of the other user, such as the other user's face, to provide the collaborative AR experience. This enables sharing common three-dimensional content between multiple eyewear users without using or aligning the eyewear devices to common image content such as a marker, which is a more lightweight solution with reduced computational burden on a processor. An inertial measurement unit may also be used to align the eyewear devices.
    Type: Application
    Filed: February 7, 2023
    Publication date: June 15, 2023
    Inventors: Branislav Micusik, Georgios Evangelidis, Daniel Wagner
  • Patent number: 11587255
    Abstract: Eyewear providing an interactive augmented reality experience between two eyewear devices by using alignment between respective 6DOF trajectories, also referred to herein as ego motion alignment. An eyewear device of user A and an eyewear device of user B track the eyewear device of the other user, or an object of the other user, such as the other user's face, to provide the collaborative AR experience. This enables sharing common three-dimensional content between multiple eyewear users without using or aligning the eyewear devices to common image content such as a marker, which is a more lightweight solution with reduced computational burden on a processor. An inertial measurement unit may also be used to align the eyewear devices.
    Type: Grant
    Filed: September 25, 2020
    Date of Patent: February 21, 2023
    Assignee: Snap Inc.
    Inventors: Branislav Micusik, Georgios Evangelidis, Daniel Wagner
  • Publication number: 20220375110
    Abstract: A method for AR-guided depth estimation is described. The method includes identifying a virtual object rendered in a first frame that is generated based on a first pose of an augmented reality (AR) device, determining a second pose of the AR device, the second pose following the first pose, identifying an augmentation area in a second frame based on the virtual object rendered in the first frame and the second pose, determining depth information for the augmentation area in the second frame, and rendering the virtual object in the second frame based on the depth information.
    Type: Application
    Filed: November 18, 2021
    Publication date: November 24, 2022
    Inventors: Georgios Evangelidis, Branislav Micusik, Sagi Katz
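The "augmentation area" in the abstract above can be pictured as the 2D region where the virtual object will land in the next frame: project the object's 3D extent through the new device pose and only estimate depth there. The sketch below does exactly that for a bounding box under a pinhole model; the function name, padding parameter, and box representation are assumptions of this sketch:

```python
import numpy as np

def augmentation_area(object_corners_world, pose_w2c, intrinsics, pad=8):
    """Project a virtual object's 3D bounding-box corners into the new
    frame and return the padded 2D rectangle where depth is needed.

    object_corners_world: (8, 3) box corners in world coordinates
    pose_w2c: (R, t) world-to-camera rotation and translation
    intrinsics: (fx, fy, cx, cy) pinhole camera parameters
    Returns (u_min, v_min, u_max, v_max) in pixels.
    """
    R, t = pose_w2c
    fx, fy, cx, cy = intrinsics
    cam = object_corners_world @ R.T + t      # world -> camera frame
    u = fx * cam[:, 0] / cam[:, 2] + cx       # pinhole projection
    v = fy * cam[:, 1] / cam[:, 2] + cy
    return (u.min() - pad, v.min() - pad, u.max() + pad, v.max() + pad)
```

Restricting depth estimation to this rectangle, instead of the whole frame, is where the computational saving comes from.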
  • Patent number: 11403499
    Abstract: Systems and methods for generating composite sets of data based on sensor data from different sensors are disclosed. Exemplary implementations may capture a color image including chromatic information; capture a depth image; generate inertial signals conveying values that are used to determine motion parameters; determine the motion parameters based on the inertial signals; generate a re-projected depth image as if the depth image had been captured at the same time as the color image, based on the interpolation of motion parameters; and generate a composite set of data based on different kinds of sensor data by combining information from the color image, the re-projected depth image, and one or more motion parameters.
    Type: Grant
    Filed: November 11, 2020
    Date of Patent: August 2, 2022
    Assignee: FACEBOOK TECHNOLOGIES, LLC
    Inventor: Georgios Evangelidis
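The re-projection step in the abstract above hinges on interpolating the motion parameters to the colour image's capture time and moving the depth data accordingly. The sketch below shows the simplest translation-only version of both steps; real devices would interpolate rotation too, and the function names are inventions of this sketch:

```python
import numpy as np

def interpolate_translation(t0, p0, t1, p1, t_query):
    """Linearly interpolate device position between two motion samples
    (e.g., IMU-derived) to the colour camera's capture time."""
    alpha = (t_query - t0) / (t1 - t0)
    return (1 - alpha) * p0 + alpha * p1

def reproject_depth_point(point_cam_depth, delta_translation):
    """Re-express a 3D point from the depth camera's pose at its own
    capture time to the pose at the colour image's capture time,
    assuming translation-only motion for brevity."""
    return point_cam_depth - delta_translation
```

After re-projection, the depth image lines up with the colour image as if both had been captured simultaneously, which is what makes the composite data set consistent.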
  • Publication number: 20220082827
    Abstract: Eyewear having unsynchronized rolling shutter (RS) cameras such that images produced by each camera are unaligned, and a state that includes velocity and gravity orientation in the eyewear's reference system is calculated. The state is not limited to these two parameters as other parameters such as the acceleration bias, the gyroscope bias, or both may be included. Math solvers are used so that the processing time to calculate the velocity and gravity orientation is acceptable. Arranging the RS cameras in an unsynchronized configuration allows estimating the motion of the eyewear from just one stereo image pair and removes the requirement of capturing more images.
    Type: Application
    Filed: September 15, 2020
    Publication date: March 17, 2022
    Inventors: Branislav Micusik, Georgios Evangelidis
  • Publication number: 20210133517
    Abstract: Systems and methods for generating composite sets of data based on sensor data from different sensors are disclosed. Exemplary implementations may capture a color image including chromatic information; capture a depth image; generate inertial signals conveying values that are used to determine motion parameters; determine the motion parameters based on the inertial signals; generate a re-projected depth image as if the depth image had been captured at the same time as the color image, based on the interpolation of motion parameters; and generate a composite set of data based on different kinds of sensor data by combining information from the color image, the re-projected depth image, and one or more motion parameters.
    Type: Application
    Filed: November 11, 2020
    Publication date: May 6, 2021
    Inventor: Georgios Evangelidis
  • Patent number: 10867220
    Abstract: Systems and methods for generating composite sets of data based on sensor data from different sensors are disclosed. Exemplary implementations may capture a color image including chromatic information; capture a depth image; generate inertial signals conveying values that are used to determine motion parameters; determine the motion parameters based on the inertial signals; generate a re-projected depth image as if the depth image had been captured at the same time as the color image, based on the interpolation of motion parameters; and generate a composite set of data based on different kinds of sensor data by combining information from the color image, the re-projected depth image, and one or more motion parameters.
    Type: Grant
    Filed: May 16, 2019
    Date of Patent: December 15, 2020
    Assignee: RPX Corporation
    Inventor: Georgios Evangelidis
  • Publication number: 20200364519
    Abstract: Systems and methods for generating composite sets of data based on sensor data from different sensors are disclosed. Exemplary implementations may capture a color image including chromatic information; capture a depth image; generate inertial signals conveying values that are used to determine motion parameters; determine the motion parameters based on the inertial signals; generate a re-projected depth image as if the depth image had been captured at the same time as the color image, based on the interpolation of motion parameters; and generate a composite set of data based on different kinds of sensor data by combining information from the color image, the re-projected depth image, and one or more motion parameters.
    Type: Application
    Filed: May 16, 2019
    Publication date: November 19, 2020
    Inventor: Georgios Evangelidis