Patents by Inventor Erick Mendez Mendez
Erick Mendez Mendez has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20240085166
Abstract: An eyewear device including a strain gauge sensor to determine when the eyewear device is manipulated by a user, such as being put on, taken off, and interacted with. A processor identifies a signature event based on sensor signals received from the strain gauge sensor and a data table of strain gauge sensor measurements corresponding to signature events. The processor controls the eyewear device as a function of the identified signature event, such as powering on a display of the eyewear device as the eyewear device is being put on a user's head, and then turning off the display when the eyewear device is removed from the user's head.
Type: Application
Filed: September 12, 2022
Publication date: March 14, 2024
Inventors: Jason Heger, Matthias Kalkgruber, Erick Mendez Mendez
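The event matching described in this abstract (comparing strain-gauge readings against a data table of known signatures, then controlling the device accordingly) can be sketched as follows. This is a minimal illustration, not the patented implementation; the `SIGNATURES` table, event names, and sum-of-squared-differences matching are all assumptions.

```python
# Hypothetical table of reference strain-gauge traces per signature event.
SIGNATURES = {
    "don": [0.0, 0.4, 0.9, 0.7],   # glasses being put on
    "doff": [0.9, 0.5, 0.2, 0.0],  # glasses being taken off
}

def identify_signature_event(samples):
    """Return the signature whose reference trace is closest (SSD) to samples."""
    best_event, best_cost = None, float("inf")
    for event, ref in SIGNATURES.items():
        cost = sum((s - r) ** 2 for s, r in zip(samples, ref))
        if cost < best_cost:
            best_event, best_cost = event, cost
    return best_event

def control_display(event, display_on):
    """Power the display as a function of the identified event."""
    if event == "don":
        return True
    if event == "doff":
        return False
    return display_on
```

A noisy "put on" trace still resolves to the nearest stored signature, which is what lets the device power its display without an explicit user command.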
-
Publication number: 20230421895
Abstract: A hand-tracking input pipeline dimming system for an AR system is provided. The AR system deactivates the hand-tracking input pipeline and places a camera component of the hand-tracking input pipeline in a limited operational mode. The AR system uses the camera component to detect initiation of a gesture by a user of the AR system and in response to detecting the initiation of the gesture, the AR system activates the hand-tracking input pipeline and places the camera component in a fully operational mode.
Type: Application
Filed: September 19, 2022
Publication date: December 28, 2023
Inventors: Jan Bajana, Daniel Colascione, Georgios Evangelidis, Erick Mendez Mendez, Daniel Wolf
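The gating behavior this abstract describes (camera in a limited mode until a gesture initiation is detected, then the full pipeline activates) can be sketched as a small state machine. The class name, the `motion_score` input, and the 0.5 threshold are illustrative assumptions, not details from the patent.

```python
class HandTrackingPipeline:
    """Sketch of dimming a hand-tracking pipeline behind a cheap detector."""

    def __init__(self):
        self.camera_mode = "limited"   # low frame rate / resolution
        self.pipeline_active = False

    def on_frame(self, motion_score):
        # In limited mode only an inexpensive detector runs; a score above
        # the (assumed) threshold is treated as the start of a gesture and
        # wakes the full pipeline.
        if not self.pipeline_active and motion_score > 0.5:
            self.pipeline_active = True
            self.camera_mode = "full"
        return self.pipeline_active
```

The design point is power: the expensive tracking stack only runs once the low-cost mode has seen evidence that the user is starting a gesture.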
-
Publication number: 20230297164
Abstract: A method for improving the startup time of a six-degrees of freedom tracking system is described. An augmented reality system receives a device initialization request and activates a first set of sensors in response to the device initialization request. The augmented reality system receives first tracking data from the first set of sensors. The augmented reality system receives an augmented reality experience request and in response to the augmented reality request, causes display of a set of augmented reality content items based on the first tracking data and simultaneously activates a second set of sensors. The augmented reality system receives second tracking data from the activated second set of sensors. The augmented reality system updates the display of the set of augmented reality content items based on the second tracking data.
Type: Application
Filed: May 24, 2023
Publication date: September 21, 2023
Inventors: Jeroen Diederik Hol, Matthias Kalkgruber, Erick Mendez Mendez, Niall Murphy, Gerald Nilles, Mathieu Emmanuel Vignau
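The staged startup in this abstract (a fast-starting first sensor set gives tracking immediately; the second set activates alongside the first rendered content) can be sketched as below. The class, sensor names, and the 3-DoF/6-DoF quality stand-in are illustrative assumptions.

```python
class ARTracker:
    """Sketch of staged sensor activation for faster tracking startup."""

    def __init__(self):
        self.active = []

    def initialize_device(self):
        # First set: sensors that start quickly (e.g. an IMU).
        self.active = ["imu"]
        return self.track()

    def request_experience(self):
        # Display content with the first tracking data while the slower
        # second sensor set is activated in parallel.
        pose = self.track()
        self.active += ["camera_left", "camera_right"]
        return pose

    def track(self):
        # Placeholder: tracking quality grows with the active sensor set.
        return {"dof": 6 if len(self.active) > 1 else 3}
```

The user sees content as soon as the first set reports, and the display is simply updated once the second set comes online, which is what hides the slow sensors' warm-up time.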
-
Patent number: 11681361
Abstract: A method for improving the startup time of a six-degrees of freedom tracking system is described. An augmented reality system receives a device initialization request and activates a first set of sensors in response to the device initialization request. The augmented reality system receives first tracking data from the first set of sensors. The augmented reality system receives an augmented reality experience request and in response to the augmented reality request, causes display of a set of augmented reality content items based on the first tracking data and simultaneously activates a second set of sensors. The augmented reality system receives second tracking data from the activated second set of sensors. The augmented reality system updates the display of the set of augmented reality content items based on the second tracking data.
Type: Grant
Filed: May 12, 2022
Date of Patent: June 20, 2023
Assignee: Snap Inc.
Inventors: Jeroen Diederik Hol, Matthias Kalkgruber, Erick Mendez Mendez, Niall Murphy, Gerald Nilles, Mathieu Emmanuel Vignau
-
Publication number: 20230177708
Abstract: A depth estimation system to perform operations that include: receiving image data generated by a client device, the image data comprising a depiction of an environment; identifying a set of image features based on the image data; determining a pose of the client device based on the set of features; generating a depth estimation based on the image data and the pose of the client device; and generating a mesh model of the environment based on the depth estimation.Type: Application
Filed: December 5, 2022
Publication date: June 8, 2023
Inventors: Erick Mendez Mendez, Isac Andreas Müller Sandvik, Qi Pan, Edward James Rosten, Andrew Tristan Spek, Daniel Wagner, Jakob Zillner
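The pipeline stages this abstract enumerates (features → pose → depth → mesh) can be sketched end to end. Every function body here is a toy stand-in for a real component (feature detection, pose estimation, depth estimation, meshing); only the stage ordering comes from the abstract.

```python
def identify_features(image):
    # Stand-in: treat bright pixels of a grayscale image as features.
    return [(r, c, v) for r, row in enumerate(image)
            for c, v in enumerate(row) if v > 0.8]

def estimate_pose(features):
    # Stand-in: centroid of feature positions as a trivial "pose".
    n = len(features)
    return (sum(f[0] for f in features) / n, sum(f[1] for f in features) / n)

def estimate_depth(image, pose):
    # Stand-in: inverse intensity as a per-pixel depth map.
    return [[1.0 / (v + 0.1) for v in row] for row in image]

def build_mesh(depth):
    # One vertex per pixel; triangle indices between grid neighbours.
    h, w = len(depth), len(depth[0])
    verts = [(r, c, depth[r][c]) for r in range(h) for c in range(w)]
    faces = [(r * w + c, r * w + c + 1, (r + 1) * w + c)
             for r in range(h - 1) for c in range(w - 1)]
    return verts, faces
```

Chaining the stages on a small image yields a mesh whose vertex depths come from the (assumed) depth estimate, mirroring the claimed data flow.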
-
Publication number: 20230156357
Abstract: Visual-inertial tracking of an eyewear device using a rolling shutter camera(s). The eyewear device includes a position determining system. Visual-inertial tracking is implemented by sensing motion of the eyewear device. An initial pose is obtained for a rolling shutter camera and an image of an environment is captured. The image includes feature points captured at a particular capture time. A number of poses for the rolling shutter camera is computed based on the initial pose and sensed movement of the device. The number of computed poses is responsive to the sensed movement of the mobile device. A computed pose is selected for each feature point in the image by matching the particular capture time for the feature point to the particular computed time for the computed pose. The position of the mobile device is determined within the environment using the feature points and the selected computed poses for the feature points.
Type: Application
Filed: January 19, 2023
Publication date: May 18, 2023
Inventors: Matthias Kalkgruber, Erick Mendez Mendez, Daniel Wagner, Daniel Wolf, Kai Zhou
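The core idea of this abstract (a rolling shutter exposes image rows at different times, so each feature point should use the camera pose computed for its own capture time) can be sketched with a constant-velocity pose model. Real systems integrate IMU data; the velocity model and nearest-time selection here are simplifying assumptions.

```python
def interpolate_poses(initial_pose, velocity, times):
    """Compute one pose per sample time from an initial pose and sensed
    motion (constant-velocity sketch of the patent's motion model)."""
    return {t: tuple(p + v * t for p, v in zip(initial_pose, velocity))
            for t in times}

def select_pose(poses, capture_time):
    """Pick the computed pose whose time is closest to a feature point's
    row capture time."""
    t = min(poses, key=lambda k: abs(k - capture_time))
    return poses[t]
```

More poses would be computed when sensed motion is fast (rows differ more) and fewer when the device is still, which matches the abstract's "responsive to the sensed movement" behavior.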
-
Patent number: 11582409
Abstract: Visual-inertial tracking of an eyewear device using a rolling shutter camera(s). The eyewear device includes a position determining system. Visual-inertial tracking is implemented by sensing motion of the eyewear device. An initial pose is obtained for a rolling shutter camera and an image of an environment is captured. The image includes feature points captured at a particular capture time. A number of poses for the rolling shutter camera is computed based on the initial pose and sensed movement of the device. The number of computed poses is responsive to the sensed movement of the mobile device. A computed pose is selected for each feature point in the image by matching the particular capture time for the feature point to the particular computed time for the computed pose. The position of the mobile device is determined within the environment using the feature points and the selected computed poses for the feature points.
Type: Grant
Filed: January 29, 2021
Date of Patent: February 14, 2023
Assignee: Snap Inc.
Inventors: Matthias Kalkgruber, Erick Mendez Mendez, Daniel Wagner, Daniel Wolf, Kai Zhou
-
Publication number: 20220365592
Abstract: A method for improving the startup time of a six-degrees of freedom tracking system is described. An augmented reality system receives a device initialization request and activates a first set of sensors in response to the device initialization request. The augmented reality system receives first tracking data from the first set of sensors. The augmented reality system receives an augmented reality experience request and in response to the augmented reality request, causes display of a set of augmented reality content items based on the first tracking data and simultaneously activates a second set of sensors. The augmented reality system receives second tracking data from the activated second set of sensors. The augmented reality system updates the display of the set of augmented reality content items based on the second tracking data.
Type: Application
Filed: May 12, 2022
Publication date: November 17, 2022
Inventors: Jeroen Diederik Hol, Matthias Kalkgruber, Erick Mendez Mendez, Niall Murphy, Gerald Nilles, Mathieu Emmanuel Vignau
-
Publication number: 20210409628
Abstract: Visual-inertial tracking of an eyewear device using a rolling shutter camera(s). The eyewear device includes a position determining system. Visual-inertial tracking is implemented by sensing motion of the eyewear device. An initial pose is obtained for a rolling shutter camera and an image of an environment is captured. The image includes feature points captured at a particular capture time. A number of poses for the rolling shutter camera is computed based on the initial pose and sensed movement of the device. The number of computed poses is responsive to the sensed movement of the mobile device. A computed pose is selected for each feature point in the image by matching the particular capture time for the feature point to the particular computed time for the computed pose. The position of the mobile device is determined within the environment using the feature points and the selected computed poses for the feature points.
Type: Application
Filed: January 29, 2021
Publication date: December 30, 2021
Inventors: Matthias Kalkgruber, Erick Mendez Mendez, Daniel Wagner, Daniel Wolf, Kai Zhou
-
Patent number: 10262441
Abstract: Disclosed is a method and apparatus for using color measurement features at multiple scales for a Color Transfer technique. In one embodiment, the functions implemented include: resizing a ground truth image target frame to a plurality of different scales; selecting one or more color measurement features from the ground truth image target frame at each of the plurality of different scales; making a color measurement for each color measurement feature in the ground truth image target frame; and adjusting colors of a virtual object in an augmented frame based at least in part on the color measurements.
Type: Grant
Filed: February 18, 2015
Date of Patent: April 16, 2019
Assignee: QUALCOMM Incorporated
Inventors: Erick Mendez Mendez, Daniel Wagner, Michael Gervautz
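The multi-scale measurement loop in this abstract (resize the ground-truth frame to several scales, measure colors at each, then adjust the virtual object) can be sketched on grayscale data. The nearest-neighbour resize, mean-value measurement, and single-gain adjustment are illustrative stand-ins, not the patented technique.

```python
def resize(image, scale):
    """Nearest-neighbour downscale of a grayscale image (illustrative)."""
    step = max(1, int(round(1 / scale)))
    return [row[::step] for row in image[::step]]

def measure_colors(image, scales):
    """One (mean-value) color measurement per scale of the ground-truth frame."""
    out = {}
    for s in scales:
        small = resize(image, s)
        vals = [v for row in small for v in row]
        out[s] = sum(vals) / len(vals)
    return out

def adjust_virtual_color(color, measurements):
    """Scale a virtual object's color by the averaged measurements."""
    gain = sum(measurements.values()) / len(measurements)
    return tuple(min(1.0, c * gain) for c in color)
```

Measuring at several scales makes the adjustment less sensitive to small, high-frequency detail in the target, which is the usual motivation for this kind of pyramid.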
-
Patent number: 9684970
Abstract: Disclosed is a method and apparatus for adaptively executing one or more motion blur estimation methods to estimate a motion blur associated with an image target frame in an Augmented Reality environment produced by an Augmented Reality application. In one embodiment, the functions implemented include: applying a first motion blur estimation method to estimate the motion blur; determining whether computational resources are available for a second motion blur estimation method; and applying the second motion blur estimation method to estimate the motion blur in response to a determination that computational resources are available for the second motion blur estimation method.
Type: Grant
Filed: February 27, 2015
Date of Patent: June 20, 2017
Assignee: QUALCOMM Incorporated
Inventors: Erick Mendez Mendez, Kiyoung Kim, Youngmin Park
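The adaptive selection in this abstract (always run a cheap estimator; run the expensive one only if budget allows) can be sketched directly. The budget parameter, the cost constant, and both estimators here are assumptions passed in as callables.

```python
def estimate_blur_adaptive(frame, budget_ms, fast, precise, precise_cost_ms=8):
    """Run the fast blur estimator; refine with the precise one only when
    the remaining per-frame compute budget (assumed, in ms) allows it."""
    blur = fast(frame)
    if budget_ms >= precise_cost_ms:
        blur = precise(frame)
    return blur
```

Under load the system still gets a usable blur estimate from the fast method, and quality degrades gracefully instead of the frame deadline being missed.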
-
Patent number: 9626803
Abstract: Disclosed are a system, apparatus, and method for depth and color camera image synchronization. Depth and color camera input images are received or otherwise obtained unsynchronized and without associated creation timestamps. An image of one type is compared with an image of a different type to determine a match for synchronization. Matches may be determined according to edge detection or depth coordinate detection. When a match is determined a synchronized pair is formed for processing within an augmented reality output. Optionally the synchronized pair may be transformed to improve the match between the image pair.
Type: Grant
Filed: December 12, 2014
Date of Patent: April 18, 2017
Assignee: QUALCOMM Incorporated
Inventors: Youngmin Park, Erick Mendez Mendez, Gerhard Reitmayr, Daniel Wagner, Serafin Diaz Spindola
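The matching step in this abstract (pairing untimestamped depth and color frames by comparing image content, e.g. edges) can be sketched with a crude edge count as the similarity signal. Both the edge measure and nearest-count pairing are assumptions; the patent's actual matching criteria are richer.

```python
def edge_count(image, threshold=0.2):
    """Cheap edge measure: count horizontal neighbour jumps above a threshold."""
    return sum(1 for row in image for a, b in zip(row, row[1:])
               if abs(a - b) > threshold)

def synchronize(color_frames, depth_frames):
    """Pair each color frame with the depth frame whose edge count is
    closest, standing in for edge/depth-coordinate matching."""
    pairs = []
    for c in color_frames:
        d = min(depth_frames, key=lambda d: abs(edge_count(c) - edge_count(d)))
        pairs.append((c, d))
    return pairs
```

Because depth discontinuities and color edges tend to coincide on object boundaries, a content-based score like this can stand in for the missing timestamps.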
-
Publication number: 20160253819
Abstract: Disclosed is a method and apparatus for adaptively executing one or more motion blur estimation methods to estimate a motion blur associated with an image target frame in an Augmented Reality environment produced by an Augmented Reality application. In one embodiment, the functions implemented include: applying a first motion blur estimation method to estimate the motion blur; determining whether computational resources are available for a second motion blur estimation method; and applying the second motion blur estimation method to estimate the motion blur in response to a determination that computational resources are available for the second motion blur estimation method.
Type: Application
Filed: February 27, 2015
Publication date: September 1, 2016
Inventors: Erick Mendez Mendez, Kiyoung Kim, Youngmin Park
-
Publication number: 20160239986
Abstract: Disclosed is a method and apparatus for using color measurement features at multiple scales for a Color Transfer technique. In one embodiment, the functions implemented include: resizing a ground truth image target frame to a plurality of different scales; selecting one or more color measurement features from the ground truth image target frame at each of the plurality of different scales; making a color measurement for each color measurement feature in the ground truth image target frame; and adjusting colors of a virtual object in an augmented frame based at least in part on the color measurements.
Type: Application
Filed: February 18, 2015
Publication date: August 18, 2016
Inventors: Erick Mendez Mendez, Daniel Wagner, Michael Gervautz
-
Publication number: 20160171768
Abstract: Disclosed are a system, apparatus, and method for depth and color camera image synchronization. Depth and color camera input images are received or otherwise obtained unsynchronized and without associated creation timestamps. An image of one type is compared with an image of a different type to determine a match for synchronization. Matches may be determined according to edge detection or depth coordinate detection. When a match is determined a synchronized pair is formed for processing within an augmented reality output. Optionally the synchronized pair may be transformed to improve the match between the image pair.
Type: Application
Filed: December 12, 2014
Publication date: June 16, 2016
Inventors: Youngmin Park, Erick Mendez Mendez, Gerhard Reitmayr, Daniel Wagner, Serafin Diaz Spindola
-
Publication number: 20160086377
Abstract: Disclosed are example methods, apparatuses, and articles of manufacture for determining and providing a suitability of an image target for Color Transfer. In an example embodiment, a method, which may be implemented using a computing device, may comprise: receiving image data representative of the image target; determining a suitability of the image target for Color Transfer based, at least in part, on one or more colors of the image data; and providing an indication indicative of the suitability of the image target for Color Transfer.
Type: Application
Filed: September 19, 2014
Publication date: March 24, 2016
Inventors: Erick Mendez Mendez, Gerhard Reitmayr
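A suitability check of the kind this abstract describes (score a target's colors, then report whether it is usable for Color Transfer) can be sketched with per-channel color spread as the criterion. The spread heuristic and the 0.2 threshold are pure assumptions; the application does not specify its criterion here.

```python
def color_transfer_suitability(pixels, min_spread=0.2):
    """Score an image target's pixels (RGB tuples in [0, 1]) and return
    (score, indication); wider color spread is assumed to measure better."""
    spreads = []
    for ch in range(3):
        vals = [p[ch] for p in pixels]
        spreads.append(max(vals) - min(vals))
    score = sum(spreads) / 3
    return score, "suitable" if score >= min_spread else "unsuitable"
```

A near-uniform target gives the downstream Color Transfer step little to measure, so flagging it early lets an authoring tool warn the user before deployment.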
-
Patent number: 9135735
Abstract: Methods, apparatuses, and systems are provided to transition 3D space information detected in an Augmented Reality (AR) view of a mobile device to screen aligned information on the mobile device. In at least one implementation, a method includes determining augmentation information associated with an object of interest, including a Modelview (M1) matrix and a Projection (P1) matrix, displaying the augmentation information on top of a video image of the object of interest using the M1 and P1 matrices, generating a second Modelview (M2) matrix and a second Projection (P2) matrix, such that the matrices M2 and P2 represent the screen aligned final position of the augmentation information, and displaying the augmentation information using the M2 and P2 matrices.
Type: Grant
Filed: March 12, 2013
Date of Patent: September 15, 2015
Assignee: QUALCOMM Incorporated
Inventors: Scott A. Leazenby, Eunjoo Kim, Per O. Nielsen, Gerald V. Wright, Jr., Erick Mendez Mendez, Michael Gervautz
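The matrix handoff in this abstract (render with the AR-view pair M1/P1, then with a screen-aligned pair M2/P2) is often animated by interpolating between the two pairs. The elementwise interpolation below is an illustrative sketch using flat 16-element 4x4 matrices; the patent itself only requires the two endpoint states.

```python
def lerp_matrix(a, b, t):
    """Elementwise interpolation between two 4x4 matrices (flat lists of 16)."""
    return [x + (y - x) * t for x, y in zip(a, b)]

def transition(m1, p1, m2, p2, t):
    """Blend the AR-view matrices (M1, P1) toward the screen-aligned
    matrices (M2, P2); t runs from 0 (world-locked) to 1 (screen-aligned)."""
    return lerp_matrix(m1, m2, t), lerp_matrix(p1, p2, t)
```

At t = 0 the augmentation tracks the object in the camera view; at t = 1 it is pinned to the screen, with intermediate t values giving a smooth fly-out.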
-
Patent number: 8976191
Abstract: Disclosed is a method and apparatus for creating a realistic color for a virtual object in an Augmented Reality environment produced by an Augmented Reality application. In one embodiment, the functions implemented include: selecting a reference image target frame; selecting a plurality of sample points in the reference image target frame; acquiring a subsequent new image target frame; determining a plurality of corresponding sample points in the new image target frame wherein the plurality of corresponding sample points correspond to the plurality of sample points in the reference image target frame; comparing a color of each of the plurality of sample points in the reference image target frame with a color of each of the corresponding sample points in the new image target frame and computing a Color Transfer function based at least in part on the comparison; and applying the Color Transfer function to the color of the virtual object.
Type: Grant
Filed: March 13, 2014
Date of Patent: March 10, 2015
Assignee: QUALCOMM Incorporated
Inventor: Erick Mendez Mendez
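The comparison step in this abstract (reference sample colors vs. corresponding new-frame colors, yielding a Color Transfer function applied to the virtual object) can be sketched with a per-channel gain. A single gain per channel is an assumed simplification of whatever form the patented Color Transfer function takes.

```python
def compute_color_transfer(ref_samples, new_samples):
    """Per-channel gain mapping the reference frame's sample colors to the
    new frame's corresponding colors (RGB tuples in [0, 1])."""
    gains = []
    for ch in range(3):
        ref = sum(s[ch] for s in ref_samples) / len(ref_samples)
        new = sum(s[ch] for s in new_samples) / len(new_samples)
        gains.append(new / ref if ref else 1.0)
    return gains

def apply_color_transfer(gains, color):
    """Recolor a virtual object so it matches the new frame's lighting."""
    return tuple(min(1.0, c * g) for c, g in zip(color, gains))
```

If the live frame is darker than the reference (the camera moved into shadow, say), the gains fall below 1 and the virtual object darkens with the scene.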
-
Publication number: 20130342573
Abstract: Methods, apparatuses, and systems are provided to transition 3D space information detected in an Augmented Reality (AR) view of a mobile device to screen aligned information on the mobile device. In at least one implementation, a method includes determining augmentation information associated with an object of interest, including a Modelview (M1) matrix and a Projection (P1) matrix, displaying the augmentation information on top of a video image of the object of interest using the M1 and P1 matrices, generating a second Modelview (M2) matrix and a second Projection (P2) matrix, such that the matrices M2 and P2 represent the screen aligned final position of the augmentation information, and displaying the augmentation information using the M2 and P2 matrices.
Type: Application
Filed: March 12, 2013
Publication date: December 26, 2013
Applicant: QUALCOMM Incorporated
Inventors: Scott A. Leazenby, Eunjoo Kim, Per O. Nielsen, Gerald V. Wright, Jr., Erick Mendez Mendez, Michael Gervautz