Patents by Inventor Erick Mendez Mendez

Erick Mendez Mendez has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240085166
    Abstract: An eyewear device including a strain gauge sensor to determine when the eyewear device is manipulated by a user, such as being put on, taken off, and interacted with. A processor identifies a signature event based on sensor signals received from the strain gauge sensor and a data table of strain gauge sensor measurements corresponding to signature events. The processor controls the eyewear device as a function of the identified signature event, such as powering on a display of the eyewear device as the eyewear device is being put on a user's head, and then turning off the display when the eyewear device is removed from the user's head.
    Type: Application
    Filed: September 12, 2022
    Publication date: March 14, 2024
    Inventors: Jason Heger, Matthias Kalkgruber, Erick Mendez Mendez
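    Illustrative sketch: the abstract above describes matching strain-gauge sensor signals against a data table of signature events and controlling the display accordingly. The Python outline below is a hedged sketch of that idea, not Snap's implementation; SIGNATURE_TABLE, classify_event, on_event, and all profile values are hypothetical.

      # Hypothetical sketch only -- names and values are not from the patent.
      import numpy as np

      # Data table of reference strain-gauge profiles per signature event.
      SIGNATURE_TABLE = {
          "don":  np.array([0.0, 0.4, 0.9, 0.7]),   # putting the eyewear on
          "doff": np.array([0.7, 0.9, 0.4, 0.0]),   # taking the eyewear off
          "tap":  np.array([0.1, 0.8, 0.1, 0.0]),   # brief interaction
      }

      def classify_event(samples, threshold=0.15):
          """Return the signature event whose stored profile best matches the samples."""
          samples = np.asarray(samples, dtype=float)
          best, best_err = None, np.inf
          for name, profile in SIGNATURE_TABLE.items():
              err = np.mean((samples - profile) ** 2)
              if err < best_err:
                  best, best_err = name, err
          return best if best_err < threshold else None

      def on_event(event, display):
          """Control the eyewear device as a function of the identified event."""
          if event == "don":
              display["power"] = True    # power on the display when worn
          elif event == "doff":
              display["power"] = False   # power off when removed

      display = {"power": False}
      on_event(classify_event([0.05, 0.42, 0.88, 0.71]), display)
      print(display)  # {'power': True}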
  • Publication number: 20230421895
    Abstract: A hand-tracking input pipeline dimming system for an AR system is provided. The AR system deactivates the hand-tracking input pipeline and places a camera component of the hand-tracking input pipeline in a limited operational mode. The AR system uses the camera component to detect initiation of a gesture by a user of the AR system and in response to detecting the initiation of the gesture, the AR system activates the hand-tracking input pipeline and places the camera component in a fully operational mode.
    Type: Application
    Filed: September 19, 2022
    Publication date: December 28, 2023
    Inventors: Jan Bajana, Daniel Colascione, Georgios Evangelidis, Erick Mendez Mendez, Daniel Wolf
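    For illustration, the dimming system can be read as a small state machine: the camera stays in a limited mode until a gesture initiation wakes the full hand-tracking pipeline. The outline below is a hedged sketch only; DimmingController, CameraMode, and the method names are hypothetical.

      # Hypothetical sketch -- class and method names are not from the patent.
      from enum import Enum

      class CameraMode(Enum):
          LIMITED = "limited"   # low-power mode used only to spot gesture initiation
          FULL = "full"         # fully operational mode for hand tracking

      class DimmingController:
          def __init__(self):
              self.pipeline_active = False
              self.camera_mode = CameraMode.LIMITED

          def on_idle(self):
              """Deactivate the hand-tracking input pipeline and dim the camera."""
              self.pipeline_active = False
              self.camera_mode = CameraMode.LIMITED

          def on_frame(self, gesture_initiation_detected: bool):
              """In limited mode, only watch for the start of a gesture."""
              if not self.pipeline_active and gesture_initiation_detected:
                  self.camera_mode = CameraMode.FULL     # restore full camera mode
                  self.pipeline_active = True            # reactivate hand tracking

      ctrl = DimmingController()
      ctrl.on_idle()
      ctrl.on_frame(gesture_initiation_detected=True)
      print(ctrl.pipeline_active, ctrl.camera_mode)      # True CameraMode.FULL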
  • Publication number: 20230297164
    Abstract: A method for improving the startup time of a six-degrees of freedom tracking system is described. An augmented reality system receives a device initialization request and activates a first set of sensors in response to the device initialization request. The augmented reality system receives first tracking data from the first set of sensors. The augmented reality system receives an augmented reality experience request and in response to the augmented reality request, causes display of a set of augmented reality content items based on the first tracking data and simultaneously activates a second set of sensors. The augmented reality system receives second tracking data from the activated second set of sensors. The augmented reality system updates the display of the set of augmented reality content items based on the second tracking data.
    Type: Application
    Filed: May 24, 2023
    Publication date: September 21, 2023
    Inventors: Jeroen Diederik Hol, Matthias Kalkgruber, Erick Mendez Mendez, Niall Murphy, Gerald Nilles, Mathieu Emmanuel Vignau
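    A rough sketch of the staged start-up: render from a quickly available first sensor set while a slower second set is activated in parallel, then refresh once the second set reports. The Python below is a hedged illustration of that flow; the sensor names, activate(), and render() are hypothetical.

      # Hypothetical sketch -- sensor names and helpers are not from the patent.
      import threading, time

      def activate(sensors):
          time.sleep(0.01)                        # stand-in for real bring-up latency
          return {s: f"{s}-data" for s in sensors}

      def render(tag, tracking):
          print(f"render[{tag}] using {sorted(tracking)}")

      state = {}
      state.update(activate(["imu"]))             # device initialization: first sensor set

      second_ready = threading.Event()
      def bring_up_second():
          state.update(activate(["camera_left", "camera_right"]))
          second_ready.set()

      # AR experience request: draw immediately from the first tracking data while
      # the second sensor set is activated simultaneously on another thread.
      threading.Thread(target=bring_up_second).start()
      render("startup", state)                    # content placed with IMU-only tracking

      second_ready.wait()
      render("refined", state)                    # display updated with second tracking data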
  • Patent number: 11681361
    Abstract: A method for improving the startup time of a six-degrees of freedom tracking system is described. An augmented reality system receives a device initialization request and activates a first set of sensors in response to the device initialization request. The augmented reality system receives first tracking data from the first set of sensors. The augmented reality system receives an augmented reality experience request and in response to the augmented reality request, causes display of a set of augmented reality content items based on the first tracking data and simultaneously activates a second set of sensors. The augmented reality system receives second tracking data from the activated second set of sensors. The augmented reality system updates the display of the set of augmented reality content items based on the second tracking data.
    Type: Grant
    Filed: May 12, 2022
    Date of Patent: June 20, 2023
    Assignee: Snap Inc.
    Inventors: Jeroen Diederik Hol, Matthias Kalkgruber, Erick Mendez Mendez, Niall Murphy, Gerald Nilles, Mathieu Emmanuel Vignau
  • Publication number: 20230177708
    Abstract: A depth estimation system to perform operations that include: receiving image data generated by a client device, the image data comprising a depiction of an environment; identifying a set of image features based on the image data; determining a pose of the client device based on the set of features; generating a depth estimation based on the image data and the pose of the client device; and generating a mesh model of the environment based on the depth estimation.
    Type: Application
    Filed: December 5, 2022
    Publication date: June 8, 2023
    Inventors: Erick Mendez Mendez, Isac Andreas Müller Sandvik, Qi Pan, Edward James Rosten, Andrew Tristan Spek, Daniel Wagner, Jakob Zillner
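    As a hedged illustration of the final pipeline stage, turning a depth estimate into a mesh model can be pictured as back-projecting a depth map through a pinhole camera and triangulating the resulting vertex grid. The code below uses made-up intrinsics and depth values and is not the claimed system.

      # Hypothetical sketch -- intrinsics and depth values are illustrative only.
      import numpy as np

      def backproject(depth, fx, fy, cx, cy):
          """Lift a depth map into a grid of 3D vertices (a simple mesh model)."""
          h, w = depth.shape
          u, v = np.meshgrid(np.arange(w), np.arange(h))
          x = (u - cx) * depth / fx
          y = (v - cy) * depth / fy
          return np.stack([x, y, depth], axis=-1)        # (h, w, 3) vertex grid

      def grid_faces(h, w):
          """Two triangles per depth-map cell, indexing the flattened vertex grid."""
          faces = []
          for r in range(h - 1):
              for c in range(w - 1):
                  i = r * w + c
                  faces.append((i, i + 1, i + w))
                  faces.append((i + 1, i + w + 1, i + w))
          return faces

      depth = np.full((4, 4), 2.0)                       # toy depth estimate (metres)
      vertices = backproject(depth, fx=300, fy=300, cx=2, cy=2)
      faces = grid_faces(*depth.shape)
      print(vertices.shape, len(faces))                  # (4, 4, 3) 18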
  • Publication number: 20230156357
    Abstract: Visual-inertial tracking of an eyewear device using one or more rolling shutter cameras. The eyewear device includes a position determining system. Visual-inertial tracking is implemented by sensing motion of the eyewear device. An initial pose is obtained for a rolling shutter camera and an image of an environment is captured. The image includes feature points captured at a particular capture time. A number of poses for the rolling shutter camera is computed based on the initial pose and sensed movement of the device. The number of computed poses is responsive to the sensed movement of the mobile device. A computed pose is selected for each feature point in the image by matching the particular capture time for the feature point to the particular computed time for the computed pose. The position of the mobile device is determined within the environment using the feature points and the selected computed poses for the feature points.
    Type: Application
    Filed: January 19, 2023
    Publication date: May 18, 2023
    Inventors: Matthias Kalkgruber, Erick Mendez Mendez, Daniel Wagner, Daniel Wolf, Kai Zhou
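    To illustrate the pose-selection step, each feature point's row capture time is matched to the nearest pose computed across the rolling-shutter readout. All values and names below are hypothetical; this is not the patented tracker.

      # Hypothetical sketch -- values and names are not from the patent.
      import numpy as np

      initial_pose_t = np.array([0.0, 0.0, 0.0])   # initial camera position
      velocity = np.array([0.5, 0.0, 0.0])         # from sensed (IMU) motion

      # Compute more poses when motion is fast, fewer when it is slow.
      num_poses = max(2, int(1 + 10 * np.linalg.norm(velocity)))

      readout_time = 0.03                          # rolling-shutter readout (s)
      pose_times = np.linspace(0.0, readout_time, num_poses)
      poses = [initial_pose_t + velocity * t for t in pose_times]

      def pose_for_feature(row, image_rows=480):
          """Match a feature's row capture time to the nearest computed pose."""
          t_feature = readout_time * row / (image_rows - 1)
          idx = int(np.argmin(np.abs(pose_times - t_feature)))
          return poses[idx]

      print(pose_for_feature(row=0))               # pose at the start of readout
      print(pose_for_feature(row=479))             # pose at the end of readout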
  • Patent number: 11582409
    Abstract: Visual-inertial tracking of an eyewear device using one or more rolling shutter cameras. The eyewear device includes a position determining system. Visual-inertial tracking is implemented by sensing motion of the eyewear device. An initial pose is obtained for a rolling shutter camera and an image of an environment is captured. The image includes feature points captured at a particular capture time. A number of poses for the rolling shutter camera is computed based on the initial pose and sensed movement of the device. The number of computed poses is responsive to the sensed movement of the mobile device. A computed pose is selected for each feature point in the image by matching the particular capture time for the feature point to the particular computed time for the computed pose. The position of the mobile device is determined within the environment using the feature points and the selected computed poses for the feature points.
    Type: Grant
    Filed: January 29, 2021
    Date of Patent: February 14, 2023
    Assignee: Snap Inc.
    Inventors: Matthias Kalkgruber, Erick Mendez Mendez, Daniel Wagner, Daniel Wolf, Kai Zhou
  • Publication number: 20220365592
    Abstract: A method for improving the startup time of a six-degrees of freedom tracking system is described. An augmented reality system receives a device initialization request and activates a first set of sensors in response to the device initialization request. The augmented reality system receives first tracking data from the first set of sensors. The augmented reality system receives an augmented reality experience request and in response to the augmented reality request, causes display of a set of augmented reality content items based on the first tracking data and simultaneously activates a second set of sensors. The augmented reality system receives second tracking data from the activated second set of sensors. The augmented reality system updates the display of the set of augmented reality content items based on the second tracking data.
    Type: Application
    Filed: May 12, 2022
    Publication date: November 17, 2022
    Inventors: Jeroen Diederik Hol, Matthias Kalkgruber, Erick Mendez Mendez, Niall Murphy, Gerald Nilles, Mathieu Emmanuel Vignau
  • Publication number: 20210409628
    Abstract: Visual-inertial tracking of an eyewear device using one or more rolling shutter cameras. The eyewear device includes a position determining system. Visual-inertial tracking is implemented by sensing motion of the eyewear device. An initial pose is obtained for a rolling shutter camera and an image of an environment is captured. The image includes feature points captured at a particular capture time. A number of poses for the rolling shutter camera is computed based on the initial pose and sensed movement of the device. The number of computed poses is responsive to the sensed movement of the mobile device. A computed pose is selected for each feature point in the image by matching the particular capture time for the feature point to the particular computed time for the computed pose. The position of the mobile device is determined within the environment using the feature points and the selected computed poses for the feature points.
    Type: Application
    Filed: January 29, 2021
    Publication date: December 30, 2021
    Inventors: Matthias Kalkgruber, Erick Mendez Mendez, Daniel Wagner, Daniel Wolf, Kai Zhou
  • Patent number: 10262441
    Abstract: Disclosed is a method and apparatus for using color measurement features at multiple scales for a Color Transfer technique. In one embodiment, the functions implemented include: resizing a ground truth image target frame to a plurality of different scales; selecting one or more color measurement features from the ground truth image target frame at each of the plurality of different scales; making a color measurement for each color measurement feature in the ground truth image target frame; and adjusting colors of a virtual object in an augmented frame based at least in part on the color measurements.
    Type: Grant
    Filed: February 18, 2015
    Date of Patent: April 16, 2019
    Assignee: QUALCOMM Incorporated
    Inventors: Erick Mendez Mendez, Daniel Wagner, Michael Gervautz
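    A simplified sketch of measuring colors at several scales of the ground-truth target and using them to adjust a virtual object's color is given below. The block-subsampling resize, fixed sample locations, and gain-based adjustment are assumptions, not the patented Color Transfer technique.

      # Hypothetical sketch -- a simplification of multi-scale color measurement.
      import numpy as np

      def resize_half(img):
          return img[::2, ::2]                     # crude 2x downscale

      def measure_colors(img, n_points=4):
          """Sample RGB at a few fixed locations (the color measurement features)."""
          h, w, _ = img.shape
          ys = np.linspace(0, h - 1, n_points, dtype=int)
          xs = np.linspace(0, w - 1, n_points, dtype=int)
          return img[ys, xs].astype(float)

      rng = np.random.default_rng(0)
      ground_truth = rng.uniform(0, 255, size=(64, 64, 3))

      # Color measurements taken at several scales of the ground-truth target frame.
      scales = [ground_truth, resize_half(ground_truth), resize_half(resize_half(ground_truth))]
      measurements = np.concatenate([measure_colors(s) for s in scales])

      # Adjust the virtual object's color based on the collected measurements.
      virtual_rgb = np.array([200.0, 120.0, 60.0])
      gain = measurements.mean(axis=0) / 128.0
      print(np.clip(virtual_rgb * gain, 0, 255))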
  • Patent number: 9684970
    Abstract: Disclosed is a method and apparatus for adaptively executing one or more motion blur estimation methods to estimate a motion blur associated with an image target frame in an Augmented Reality environment produced by an Augmented Reality application. In one embodiment, the functions implemented include: applying a first motion blur estimation method to estimate the motion blur; determining whether computational resources are available for a second motion blur estimation method; and applying the second motion blur estimation method to estimate the motion blur in response to a determination that computational resources are available for the second motion blur estimation method.
    Type: Grant
    Filed: February 27, 2015
    Date of Patent: June 20, 2017
    Assignee: QUALCOMM Incorporated
    Inventors: Erick Mendez Mendez, Kiyoung Kim, Youngmin Park
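    For illustration only, the adaptive scheme (always run a cheap estimator, then run a more expensive one only if the frame budget still has room) is outlined below. The two gradient-based estimators are stand-ins, not the motion blur estimation methods claimed in the patent.

      # Hypothetical sketch -- the estimators are stand-ins, not the claimed methods.
      import time
      import numpy as np

      def coarse_blur_estimate(frame):
          """Cheap proxy: low image-gradient energy suggests more blur."""
          gy, gx = np.gradient(frame.astype(float))
          return 1.0 / (1e-6 + np.mean(np.abs(gx)) + np.mean(np.abs(gy)))

      def refined_blur_estimate(frame):
          """More expensive stand-in estimator (gradient magnitude at full precision)."""
          gy, gx = np.gradient(frame.astype(float))
          return 1.0 / (1e-6 + np.hypot(gx, gy).mean())

      def estimate_motion_blur(frame, frame_budget_s=0.033):
          start = time.perf_counter()
          blur = coarse_blur_estimate(frame)             # first estimation method
          remaining = frame_budget_s - (time.perf_counter() - start)
          if remaining > 0.010:                          # computational resources left?
              blur = refined_blur_estimate(frame)        # second estimation method
          return blur

      frame = np.random.default_rng(1).integers(0, 256, size=(120, 160))
      print(estimate_motion_blur(frame))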
  • Patent number: 9626803
    Abstract: Disclosed are a system, apparatus, and method for depth and color camera image synchronization. Depth and color camera input images are received or otherwise obtained unsynchronized and without associated creation timestamps. An image of one type is compared with an image of a different type to determine a match for synchronization. Matches may be determined according to edge detection or depth coordinate detection. When a match is determined a synchronized pair is formed for processing within an augmented reality output. Optionally the synchronized pair may be transformed to improve the match between the image pair.
    Type: Grant
    Filed: December 12, 2014
    Date of Patent: April 18, 2017
    Assignee: QUALCOMM Incorporated
    Inventors: Youngmin Park, Erick Mendez Mendez, Gerhard Reitmayr, Daniel Wagner, Serafin Diaz Spindola
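    A rough outline of matching an unsynchronized depth frame to a color frame by comparing edge maps is shown below; comparing binary edge overlaps directly is a simplification of the edge and depth-coordinate matching described in the abstract, and all names are hypothetical.

      # Hypothetical sketch -- a simplification of edge-based depth/color matching.
      import numpy as np

      def edge_map(img):
          gy, gx = np.gradient(img.astype(float))
          return (np.hypot(gx, gy) > 10.0).astype(float)

      def best_depth_match(color_img, depth_candidates):
          """Pick the depth frame whose edges best overlap the color frame's edges."""
          target = edge_map(color_img)
          scores = [np.sum(edge_map(d) * target) for d in depth_candidates]
          return int(np.argmax(scores))                  # index of the synchronized pair

      rng = np.random.default_rng(2)
      color = rng.integers(0, 256, size=(60, 80)).astype(float)
      depths = [rng.uniform(0, 240, size=(60, 80)) for _ in range(3)]
      print("matched depth frame:", best_depth_match(color, depths))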
  • Publication number: 20160253819
    Abstract: Disclosed is a method and apparatus for adaptively executing one or more motion blur estimation methods to estimate a motion blur associated with an image target frame in an Augmented Reality environment produced by an Augmented Reality application. In one embodiment, the functions implemented include: applying a first motion blur estimation method to estimate the motion blur; determining whether computational resources are available for a second motion blur estimation method; and applying the second motion blur estimation method to estimate the motion blur in response to a determination that computational resources are available for the second motion blur estimation method.
    Type: Application
    Filed: February 27, 2015
    Publication date: September 1, 2016
    Inventors: Erick Mendez Mendez, Kiyoung Kim, Youngmin Park
  • Publication number: 20160239986
    Abstract: Disclosed is a method and apparatus for using color measurement features at multiple scales for a Color Transfer technique. In one embodiment, the functions implemented include: resizing a ground truth image target frame to a plurality of different scales; selecting one or more color measurement features from the ground truth image target frame at each of the plurality of different scales; making a color measurement for each color measurement feature in the ground truth image target frame; and adjusting colors of a virtual object in an augmented frame based at least in part on the color measurements.
    Type: Application
    Filed: February 18, 2015
    Publication date: August 18, 2016
    Inventors: Erick Mendez Mendez, Daniel Wagner, Michael Gervautz
  • Publication number: 20160171768
    Abstract: Disclosed are a system, apparatus, and method for depth and color camera image synchronization. Depth and color camera input images are received or otherwise obtained unsynchronized and without associated creation timestamps. An image of one type is compared with an image of a different type to determine a match for synchronization. Matches may be determined according to edge detection or depth coordinate detection. When a match is determined a synchronized pair is formed for processing within an augmented reality output. Optionally the synchronized pair may be transformed to improve the match between the image pair.
    Type: Application
    Filed: December 12, 2014
    Publication date: June 16, 2016
    Inventors: Youngmin Park, Erick Mendez Mendez, Gerhard Reitmayr, Daniel Wagner, Serafin Diaz Spindola
  • Publication number: 20160086377
    Abstract: Disclosed are example methods, apparatuses, and articles of manufacture for determining and providing a suitability of an image target for Color Transfer. In an example embodiment, a method, which may be implemented using a computing device, may comprise: receiving image data representative of the image target; determining a suitability of the image target for Color Transfer based, at least in part, on one or more colors of the image data; and providing an indication indicative of the suitability of the image target for Color Transfer.
    Type: Application
    Filed: September 19, 2014
    Publication date: March 24, 2016
    Inventors: Erick Mendez Mendez, Gerhard Reitmayr
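    As an illustration, one plausible way to score an image target's suitability for Color Transfer from its colors is shown below; the spread-plus-saturation criterion and the threshold are assumptions, not the test claimed in the application.

      # Hypothetical sketch -- the suitability criterion is an assumption.
      import numpy as np

      def color_transfer_suitability(image):
          """Score an image target by how much usable color variation it has."""
          pixels = image.reshape(-1, 3).astype(float)
          spread = pixels.std(axis=0).mean()             # per-channel variation
          saturation = (pixels.max(axis=1) - pixels.min(axis=1)).mean()
          return 0.5 * spread + 0.5 * saturation

      def suitability_indication(image, threshold=25.0):
          score = color_transfer_suitability(image)
          return "suitable" if score >= threshold else "not suitable"

      rng = np.random.default_rng(3)
      colorful = rng.integers(0, 256, size=(32, 32, 3))
      flat_gray = np.full((32, 32, 3), 128)
      print(suitability_indication(colorful), suitability_indication(flat_gray))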
  • Patent number: 9135735
    Abstract: Methods, apparatuses, and systems are provided to transition 3D space information detected in an Augmented Reality (AR) view of a mobile device to screen aligned information on the mobile device. In at least one implementation, a method includes determining augmentation information associated with an object of interest, including a Modelview (M1) matrix and a Projection (P1) matrix, displaying the augmentation information on top of a video image of the object of interest using the M1 and P1 matrices, generating a second Modelview (M2) matrix and a second Projection (P2) matrix, such that the matrices M2 and P2 represent the screen aligned final position of the augmentation information, and displaying the augmentation information using the M2 and P2 matrices.
    Type: Grant
    Filed: March 12, 2013
    Date of Patent: September 15, 2015
    Assignee: QUALCOMM Incorporated
    Inventors: Scott A. Leazenby, Eunjoo Kim, Per O. Nielsen, Gerald V. Wright, Jr., Erick Mendez Mendez, Michael Gervautz
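    For a rough picture of the transition, the system keeps two modelview/projection pairs: the AR pair (M1, P1) that registers the augmentation to the tracked object, and a screen-aligned pair (M2, P2). The naive per-element matrix blend below only visualizes the idea and is not the claimed animation method; all matrix values are made up.

      # Hypothetical sketch -- matrix values and the blend are illustrative only.
      import numpy as np

      def perspective(fov_deg=60, aspect=16 / 9, near=0.1, far=100.0):
          f = 1.0 / np.tan(np.radians(fov_deg) / 2)
          return np.array([[f / aspect, 0, 0, 0],
                           [0, f, 0, 0],
                           [0, 0, (far + near) / (near - far), 2 * far * near / (near - far)],
                           [0, 0, -1, 0]])

      def ortho(width=1280, height=720):
          return np.array([[2 / width, 0, 0, -1],
                           [0, -2 / height, 0, 1],
                           [0, 0, -1, 0],
                           [0, 0, 0, 1]])

      M1 = np.eye(4); M1[2, 3] = -3.0              # augmentation 3 m in front of the camera
      P1 = perspective()                           # AR view rendered over the video image
      M2 = np.eye(4); M2[:2, 3] = [1080, 40]       # screen-aligned final position (pixels)
      P2 = ortho()                                 # screen-space projection

      def blend(t):
          """Per-element blend between the AR pair and the screen-aligned pair."""
          return (1 - t) * M1 + t * M2, (1 - t) * P1 + t * P2

      for t in (0.0, 0.5, 1.0):
          M, _ = blend(t)
          print(f"t={t}: translation={M[:3, 3]}")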
  • Patent number: 8976191
    Abstract: Disclosed is a method and apparatus for creating a realistic color for a virtual object in an Augmented Reality environment produced by an Augmented Reality application. In one embodiment, the functions implemented include: selecting a reference image target frame; selecting a plurality of sample points in the reference image target frame; acquiring a subsequent new image target frame; determining a plurality of corresponding sample points in the new image target frame wherein the plurality of corresponding sample points correspond to the plurality of sample points in the reference image target frame; comparing a color of each of the plurality of sample points in the reference image target frame with a color of each of the corresponding sample points in the new image target frame and computing a Color Transfer function based at least in part on the comparison; and applying the Color Transfer function to the color of the virtual object.
    Type: Grant
    Filed: March 13, 2014
    Date of Patent: March 10, 2015
    Assignee: QUALCOMM Incorporated
    Inventor: Erick Mendez Mendez
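    A simplified illustration: comparing sample-point colors between the reference target frame and a new target frame and deriving a Color Transfer function could be as simple as a per-channel gain, as sketched below. The gain form, sample points, and simulated lighting change are assumptions, not the patented computation.

      # Hypothetical sketch -- the per-channel gain is an assumed transfer function.
      import numpy as np

      rng = np.random.default_rng(4)
      reference = rng.uniform(50, 200, size=(48, 64, 3))        # reference image target frame
      new_frame = reference * np.array([0.8, 0.9, 1.2])         # simulated lighting change

      # Sample points in the reference frame and their corresponding points
      # in the new frame (identical coordinates here for simplicity).
      points = [(5, 7), (10, 40), (30, 12), (45, 60)]
      ref_samples = np.array([reference[y, x] for y, x in points])
      new_samples = np.array([new_frame[y, x] for y, x in points])

      # Compare colors and compute a simple Color Transfer function (per-channel gain).
      transfer_gain = new_samples.mean(axis=0) / ref_samples.mean(axis=0)

      # Apply the transfer function to the virtual object's color.
      virtual_object_rgb = np.array([180.0, 140.0, 90.0])
      print(np.clip(virtual_object_rgb * transfer_gain, 0, 255))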
  • Publication number: 20130342573
    Abstract: Methods, apparatuses, and systems are provided to transition 3D space information detected in an Augmented Reality (AR) view of a mobile device to screen aligned information on the mobile device. In at least one implementation, a method includes determining augmentation information associated with an object of interest, including a Modelview (M1) matrix and a Projection (P1) matrix, displaying the augmentation information on top of a video image of the object of interest using the M1 and P1 matrices, generating a second Modelview (M2) matrix and a second Projection (P2) matrix, such that the matrices M2 and P2 represent the screen aligned final position of the augmentation information, and displaying the augmentation information using the M2 and P2 matrices.
    Type: Application
    Filed: March 12, 2013
    Publication date: December 26, 2013
    Applicant: QUALCOMM Incorporated
    Inventors: Scott A. Leazenby, Eunjoo Kim, Per O. Nielsen, Gerald V. Wright, Jr., Erick Mendez Mendez, Michael Gervautz