Patents Assigned to Snap Inc.
-
Patent number: 12008811
Abstract: Aspects of the present disclosure involve a system comprising a medium storing a program and method for machine-learning based selection of a representative video frame. The program and method provide for receiving a set of video frames; determining a first subset of frames by removing frames outside of an image quality threshold; determining a second subset by removing frames outside of an image stillness threshold; computing feature data for each frame in the second subset; providing, for each frame in the second subset, the feature data to a machine learning model (MLM), the MLM being configured to output a score for each frame in the second subset of frames based on the feature data, the MLM having been trained with a first set of images labeled based on aesthetics, and with a second set of images labeled based on image quality; and selecting a frame based on output scores.
Type: Grant
Filed: December 14, 2021
Date of Patent: June 11, 2024
Assignee: SNAP INC.
Inventors: Kavya Venkata Kota Kopparapu, Benjamin Dodson, Francesc Xavier Drudis Rius, Angus Kong, Richard Leider, Jian Ren, Sergey Tulyakov, Jiayao Yu
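The two-stage filtering and scoring pipeline described in this abstract can be sketched roughly as follows; all function names, thresholds, and data shapes here are hypothetical illustrations, not taken from the patent:

```python
def select_representative_frame(frames, quality_fn, stillness_fn, features_fn,
                                score_fn, quality_threshold=0.5,
                                stillness_threshold=0.5):
    """Pick one representative frame from a set of video frames."""
    # First subset: drop frames outside the image-quality threshold.
    first_subset = [f for f in frames if quality_fn(f) >= quality_threshold]
    # Second subset: drop frames outside the image-stillness threshold.
    second_subset = [f for f in first_subset
                     if stillness_fn(f) >= stillness_threshold]
    if not second_subset:
        return None
    # Compute feature data per frame, score it with the trained model
    # (score_fn stands in for the MLM), and keep the best-scoring frame.
    scored = [(score_fn(features_fn(f)), f) for f in second_subset]
    return max(scored, key=lambda pair: pair[0])[1]
```

The key point is that the cheap quality and stillness filters shrink the candidate set before the (comparatively expensive) model is run.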
-
Patent number: 12007568
Abstract: Systems and methods for eyewear devices with integrated heads-up displays are provided. In one embodiment, an eyewear device provides an integrated heads-up display having a partially reflective element carried by an eyeglass lens to reflect computer-generated imagery projected onto it toward the user, while permitting the passage of light through the reflective surface in the direction of view of the user. The display mechanism further includes a cooperating projector assembly housed by a frame of the eyewear device in an overhead configuration relative to the partially reflective element. The projector assembly is housed by a top bar of the eyewear frame, with the reflective surface being housed wholly within the lens.
Type: Grant
Filed: September 2, 2022
Date of Patent: June 11, 2024
Assignee: SNAP INC.
Inventors: Jonathan M Rodriguez, II, Kimberly A. Phifer
-
Patent number: 12008155
Abstract: A method for improving the startup time of a six-degrees-of-freedom tracking system is described. An augmented reality system receives a device initialization request and activates a first set of sensors in response to the device initialization request. The augmented reality system receives first tracking data from the first set of sensors. The augmented reality system receives an augmented reality experience request and, in response to the augmented reality experience request, causes display of a set of augmented reality content items based on the first tracking data and simultaneously activates a second set of sensors. The augmented reality system receives second tracking data from the activated second set of sensors. The augmented reality system updates the display of the set of augmented reality content items based on the second tracking data.
Type: Grant
Filed: May 24, 2023
Date of Patent: June 11, 2024
Assignee: Snap Inc.
Inventors: Jeroen Diederik Hol, Matthias Kalkgruber, Erick Mendez Mendez, Niall Murphy, Gerald Nilles, Mathieu Emmanuel Vignau
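The staged sensor activation this abstract describes can be sketched as below; the specific sensor names and the split between the two sensor sets are assumptions for illustration, not details from the patent:

```python
class StagedTracker:
    """Sketch of staged sensor bring-up: a minimal first sensor set serves
    the initial AR request while the remaining sensors activate in parallel."""

    def __init__(self):
        self.active_sensors = set()

    def on_device_init(self):
        # Activate only the first (fast-starting) sensor set at init time.
        self.active_sensors |= {"imu"}
        return self.read_tracking(self.active_sensors)

    def on_ar_request(self):
        # Display content immediately from first-stage tracking data,
        # and simultaneously activate the second sensor set.
        first_data = self.read_tracking({"imu"})
        self.active_sensors |= {"camera_left", "camera_right"}
        return first_data

    def on_second_stage_data(self):
        # Once the second set delivers data, update the displayed content.
        return self.read_tracking(self.active_sensors)

    def read_tracking(self, sensors):
        # Stand-in for reading real tracking data from the named sensors.
        return sorted(sensors)
```

The startup win comes from never blocking the first AR frame on the slowest sensor's warm-up.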
-
Patent number: 12003577
Abstract: A machine learning engine identifies training data that includes historical user data and historical content data. A machine learning classifier is trained on the training data to generate a relevancy value for each of a plurality of given content items associated with a given user. The relevancy value for each given content item is indicative of a likelihood that the given user will perform a first user device input action and of a likelihood that the given user will perform a second user device input action, in response to being presented with the given content item. The machine learning classifier receives a plurality of candidate content items associated with a first user. The machine learning classifier generates a relevancy value for each candidate content item. At least one of the candidate content items is identified for inclusion in a first content collection based on the generated relevancy values.
Type: Grant
Filed: January 19, 2023
Date of Patent: June 4, 2024
Assignee: SNAP INC.
Inventors: Jason Brewer, Rodrigo B. Farnham, David B. Lue, Nicholas J. Stucky-Mack
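The final selection step, scoring candidates and keeping those whose relevancy value qualifies them for the collection, can be sketched like this; the inclusion threshold and data shapes are hypothetical, not from the patent:

```python
def build_collection(candidates, relevancy_fn, threshold=0.5):
    """Score each candidate content item for a user with the trained
    classifier (relevancy_fn stands in for it) and keep the items whose
    relevancy value clears an assumed inclusion threshold."""
    scored = [(item, relevancy_fn(item)) for item in candidates]
    return [item for item, score in scored if score >= threshold]
```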
-
Patent number: 12002175
Abstract: Aspects of the present disclosure involve a system comprising a computer-readable storage medium storing at least one program, and a method for performing operations comprising: receiving a video that depicts a person; identifying a set of skeletal joints corresponding to limbs of the person; tracking 3D movement of the set of skeletal joints corresponding to the limbs of the person in the video; causing display of a 3D virtual object that has a plurality of limbs including one or more extra limbs than the limbs of the person in the video; and moving the one or more extra limbs of the 3D virtual object based on the movement of the set of skeletal joints corresponding to the limbs of the person in the video.
Type: Grant
Filed: June 30, 2023
Date of Patent: June 4, 2024
Assignee: SNAP INC.
Inventors: Avihay Assouline, Itamar Berger, Gal Dudovitch, Matan Zohar
-
Patent number: 12001475
Abstract: Systems, methods, devices, media, and computer-readable instructions are described for local image tagging and processing in a resource-constrained environment such as a mobile device. In some embodiments, characteristics associated with images are used to determine whether to store content (e.g., images and video clips) as ephemeral content or non-ephemeral content. Based on the determination, the image is stored in a non-ephemeral camera roll storage of the mobile device, or an ephemeral local application storage. Additional storage operations such as encryption or backup copying may additionally be determined and performed based on the analysis of the content. In some embodiments, such images may be indexed, sorted, and searched based on the image tagging operations used to characterize the content.
Type: Grant
Filed: December 18, 2020
Date of Patent: June 4, 2024
Assignee: Snap Inc.
Inventor: Jonathan Brody
-
Patent number: 12002168
Abstract: A method for reducing motion-to-photon latency for hand tracking is described. In one aspect, a method includes accessing a first frame from a camera of an Augmented Reality (AR) device, tracking a first image of a hand in the first frame, rendering virtual content based on the tracking of the first image of the hand in the first frame, accessing a second frame from the camera before the rendering of the virtual content is completed, the second frame immediately following the first frame, tracking, using the computer vision engine of the AR device, a second image of the hand in the second frame, generating an annotation based on tracking the second image of the hand in the second frame, forming an annotated virtual content based on the annotation and the virtual content, and displaying the annotated virtual content in a display of the AR device.
Type: Grant
Filed: June 20, 2022
Date of Patent: June 4, 2024
Assignee: Snap Inc.
Inventors: Jan Bajana, Bernhard Jung, Daniel Wagner
-
Patent number: 12002135
Abstract: Aspects of the present disclosure involve a system comprising a computer-readable storage medium storing a program and method for adding time-based captions to captured video. The program and method provide for displaying, by a messaging application, a preview user interface for previewing and editing captured video in order to generate a media content item, the preview user interface including an interface element which is selectable to switch to a captions user interface for adding a caption to the media content item; switching, in response to first user input selecting the interface element, to the captions user interface which is configured to receive user input for caption content and a duration to display the caption content; receiving, via the captions user interface, second user input for the caption content and the duration; and generating the media content item based on the caption content and the duration to display the caption content.
Type: Grant
Filed: December 20, 2021
Date of Patent: June 4, 2024
Assignee: Snap Inc.
Inventors: Kaveh Anvaripour, Christine Barron, Nathan Kenneth Boyd, Christie Marie Heikkinen, Ranidu Lankage, Daniel Moreno, Shannon Ward, Tabari Williams
-
Patent number: 12002392
Abstract: Eyewear including a projector having a variable feedback loop controlling a forward current delivered to a colored light source. The colored light source is configured to generate a colored light beam to generate a displayed image. The variable feedback loop in one example has a variable resistance to selectively generate a high brightness image when the eyewear is operated outside, or in a high ambient light setting, and to selectively generate a nominal brightness image when the eyewear is operated inside. A controller selectively controls the drive current delivered to the colored light source to control the brightness mode of the image.
Type: Grant
Filed: April 20, 2023
Date of Patent: June 4, 2024
Assignee: Snap Inc.
Inventors: Jason Heger, Gerald Nilles
-
Patent number: 11998833
Abstract: A method for generating collectible media content items based on location information starts with a processor receiving location information from a location sensor coupled to a first client computing device. The processor causes a map interface to be displayed that includes an avatar of the first user at a location based on the location information, and a subset of a plurality of collectible items associated with geographical coordinates. When the first client computing device is determined to be within a predetermined distance from a selected collectible item, the processor causes a front-facing camera view to be displayed on the first client computing device, causes a lens corresponding to the selected collectible item to be applied to the front-facing camera view, and causes the image of the selected collectible item displayed on the front-facing camera view to change. The lens includes an image of the selected collectible item that is overlaid on the front-facing camera view. Other embodiments are described herein.
Type: Grant
Filed: September 28, 2022
Date of Patent: June 4, 2024
Assignee: SNAP INC.
Inventors: Jonathan Brody, Jill H. Cohen, Bryant Detwiller, Alexander Fung, Evan H K Lin, Walton Lin, Kimberly A. Phifer, Alexandre Valdetaro Porto
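The "within predetermined distance" check is the gating step of this flow. One plausible way to implement it, assuming the patent's "predetermined distance" is a radius in meters (the 50 m default here is an invented value), is a great-circle distance test:

```python
from math import radians, sin, cos, asin, sqrt

def within_collectible_range(user, item, max_meters=50.0):
    """Haversine great-circle distance check between the device location and
    a collectible item's geographical coordinates, both (lat, lon) in degrees.
    The radius value is a hypothetical stand-in for the patent's
    'predetermined distance'."""
    lat1, lon1, lat2, lon2 = map(radians, (user[0], user[1], item[0], item[1]))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    distance_m = 2 * 6371000 * asin(sqrt(a))  # mean Earth radius in meters
    return distance_m <= max_meters
```

When the check passes, the client would switch to the front-facing camera view and apply the item's lens.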
-
Patent number: 11998798
Abstract: Example systems, devices, media, and methods are described for presenting a virtual guided fitness experience using the display of an eyewear device in augmented reality. A guided fitness application implements and controls the capturing of frames of motion data using an inertial measurement unit (IMU) and video data from one or more cameras. The method includes detecting exercise motions (with or without equipment) as well as detecting and counting repetitions. Relevant data about detected motions or equipment is retrieved and used to curate the guided fitness experience. A current rep count is presented on the display along with an avatar for playing messages, performing animated demonstrations, responding to commands and queries using speech recognition, and presenting guided fitness instructions through text, audio, and video.
Type: Grant
Filed: May 14, 2021
Date of Patent: June 4, 2024
Assignee: Snap Inc.
Inventor: Megan Hong
-
Patent number: 12001024
Abstract: An energy-efficient adaptive 3D sensing system. The adaptive 3D sensing system includes one or more cameras and one or more projectors. The adaptive 3D sensing system captures images of a real-world scene using the one or more cameras and computes depth estimates and depth estimate confidence values for pixels of the images. The adaptive 3D sensing system computes an attention mask based on the one or more depth estimate confidence values and commands the one or more projectors to send a distributed laser beam into one or more areas of the real-world scene based on the attention mask. The adaptive 3D sensing system captures 3D sensing image data of the one or more areas of the real-world scene and generates 3D sensing data for the real-world scene based on the 3D sensing image data.
Type: Grant
Filed: April 13, 2023
Date of Patent: June 4, 2024
Assignee: Snap Inc.
Inventors: Jian Wang, Sizhuo Ma, Brevin Tilmon, Yicheng Wu, Gurunandan Krishnan Gorumkonda, Ramzi Zahreddine, Georgios Evangelidis
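The attention-mask step can be illustrated with a minimal sketch: mark the pixels whose depth-estimate confidence is low, so the projector can spend its laser power only where passive depth failed. The threshold value and the binary mask representation are assumptions for illustration, not details from the patent:

```python
def attention_mask(confidences, threshold=0.7):
    """Given a 2D grid of per-pixel depth-estimate confidence values, mark
    with 1 the pixels whose confidence falls below an assumed threshold;
    the projector targets these low-confidence areas for active sensing."""
    return [[1 if c < threshold else 0 for c in row] for row in confidences]
```

Concentrating the projected pattern on low-confidence regions is what makes the system energy-efficient relative to flood illumination.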
-
Patent number: 12002232
Abstract: Various embodiments provide systems, methods, devices, and instructions for performing simultaneous localization and mapping (SLAM) that involve initializing a SLAM process using images from as few as two different poses of a camera within a physical environment. Some embodiments may achieve this by disregarding errors in matching corresponding features depicted in image frames captured by an image sensor of a mobile computing device, and by updating the SLAM process in a way that causes the minimization process to converge to global minima rather than fall into a local minimum.
Type: Grant
Filed: May 12, 2023
Date of Patent: June 4, 2024
Assignee: SNAP INC.
Inventors: David Ben Ezra, Eyal Zak, Ozi Egri
-
Patent number: 12001658
Abstract: Content sharing between a first user, a second user, and a third user is facilitated. A first public content collection includes a first user input content item of the first user. Responsive to receiving an indication of a combination user input from a second user, a combination function is invoked to allow the second user to combine the first user input content item with a second user input content item to create a first combined user input content item. The first combined user input content item is stored in association with the first user input content item in a second public content collection. A third user sends a combination collection presentation user input related to the first user input content item. In response, a combination collection presentation function is invoked to enable the third user to navigate the second public content collection.
Type: Grant
Filed: December 28, 2022
Date of Patent: June 4, 2024
Assignee: SNAP INC.
Inventors: Christie Marie Heikkinen, David Phillip Taitz
-
Patent number: 12001750
Abstract: A location-based shared augmented reality (AR) experience system is configured to permit users who find themselves in the same geographic area to easily join a shared AR experience, by creating respective instances of the shared AR experience for different previously defined geographic areas. When a user requests to launch a shared AR experience accessible via a messaging client, the location-based shared AR experience system obtains from the user device executing the messaging client the location information of the user device, determines a previously defined AR experience area that encompasses the location of the user device, and communicates to the user device an address of an associated instance of the shared AR experience.
Type: Grant
Filed: April 20, 2022
Date of Patent: June 4, 2024
Assignee: Snap Inc.
Inventor: Pawel Wawruch
-
Patent number: 12002146
Abstract: Methods and systems are disclosed for performing operations for generating a 3D model of a scene. The operations include: receiving a set of two-dimensional (2D) images representing a first view of a real-world environment; applying a machine learning model comprising a neural light field network to the set of 2D images to predict pixel values of a target image representing a second view of the real-world environment, the machine learning model being trained to map a ray origin and direction directly to a given pixel value; and generating a three-dimensional (3D) model of the real-world environment based on the set of 2D images and the predicted target image.
Type: Grant
Filed: March 28, 2022
Date of Patent: June 4, 2024
Assignee: Snap Inc.
Inventors: Zeng Huang, Jian Ren, Sergey Tulyakov, Menglei Chai, Kyle Olszewski, Huan Wang
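The distinguishing detail here is that a neural light field maps a ray (origin, direction) directly to a pixel value, with no per-ray sampling and volume integration as in a NeRF-style radiance field. The rendering interface that implies can be sketched as follows, with `ray_fn` and `light_field_fn` as hypothetical stand-ins for the target camera model and the trained network:

```python
def render_target_view(width, height, ray_fn, light_field_fn):
    """Render a novel view by querying a light field function once per pixel.
    ray_fn maps pixel coordinates (x, y) to a (origin, direction) ray for the
    target camera; light_field_fn maps that ray directly to a pixel value."""
    image = []
    for y in range(height):
        row = []
        for x in range(width):
            origin, direction = ray_fn(x, y)
            # One network query per pixel: no sampling along the ray.
            row.append(light_field_fn(origin, direction))
        image.append(row)
    return image
```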
-
Patent number: 12001878
Abstract: Systems, methods, and computer readable media for auto-recovery of an augmented reality (AR) wearable device are disclosed. A pass-through application is invoked as a background process and an application is invoked as a foreground process. The pass-through application includes an on-resume procedure that is called if the operating system or interpreter determines that the foreground process is unresponsive. The on-resume procedure restarts the application as the foreground process and may first reboot the AR wearable device. The pass-through application remains transparent to the user by not displaying output on the display of the AR wearable device. Additionally, an uncaught exception handler is registered with the operating system to be called in the event that an exception occurs that does not have a handler. The exception handler restarts the application as the foreground process and may first reboot the AR wearable device.
Type: Grant
Filed: June 3, 2022
Date of Patent: June 4, 2024
Assignee: Snap Inc.
Inventor: Piotr Gurgul
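The uncaught-exception half of this recovery scheme has a direct analogue in most runtimes: register a last-resort handler that, instead of letting the process die silently, triggers the restart path. A minimal sketch, using Python's `sys.excepthook` purely as an illustration of the pattern (the patent targets an AR wearable's OS, and `restart_fn` is a hypothetical stand-in for its restart/reboot procedure):

```python
import sys

def install_auto_recovery(restart_fn):
    """Register a last-resort handler: any exception that reaches the top of
    the interpreter without a handler triggers restart_fn, which stands in
    for restarting the foreground application (optionally after a reboot)."""
    def handler(exc_type, exc_value, traceback):
        restart_fn()
    sys.excepthook = handler
    return handler
```

The pass-through background process in the abstract covers the complementary failure mode, an unresponsive (but not crashing) foreground process, which an exception hook alone cannot catch.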
-
Patent number: 12001647
Abstract: Aspects of the present disclosure involve a system comprising a computer-readable storage medium storing a program and method for presenting available functions for a captured image. The program and method provide for selecting a subset of functions from among a set of functions for applying to an image captured by a device camera; causing display of a first interface for presenting the subset of functions, the first interface including a group of icons, each of which is user-selectable to perform a respective function within the subset, the first interface further including an additional icon selectable to switch to a second interface; and causing, in response to user selection of the additional icon, to switch to display of the second interface for presenting the set of functions, the second interface including a list of entries, each of which is user-selectable to perform a respective function within the set of functions.
Type: Grant
Filed: June 29, 2022
Date of Patent: June 4, 2024
Assignee: SNAP INC.
Inventors: Kaveh Anvaripour, Laurent Desserrey
-
Patent number: 12003862
Abstract: A method of determining an image capture timestamp offset or error includes generating optical flashes at an optical flash rate. A set of images of the optical flashes are captured at an image capture rate. The image capture rate is different from the optical flash rate, and each image includes an associated image timestamp. Signals associated with the generation of the optical flashes are also timestamped. The intensity of each image in the set of images is determined, and the image having the greatest intensity in the set of images is identified. The timestamp offset or error is determined as the difference between the timestamp of the image having the greatest intensity and the timestamp of the corresponding optical flash.
Type: Grant
Filed: June 21, 2022
Date of Patent: June 4, 2024
Assignee: Snap Inc.
Inventors: Julian Grahsl, Alexander Kane
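The measurement this abstract describes reduces to a small computation once the data is collected: find the brightest image (the one best aligned with a flash) and difference its timestamp against the corresponding flash's. A sketch, assuming simple (timestamp, intensity) pairs and taking "corresponding flash" to mean the nearest one in time:

```python
def timestamp_offset(images, flash_timestamps):
    """images: (timestamp, intensity) pairs captured while flashes fire at a
    rate slightly different from the capture rate; flash_timestamps: signal
    timestamps recorded when each flash was generated. Returns the capture
    timestamp offset/error."""
    # The brightest image is the one whose exposure best overlapped a flash.
    img_ts, _ = max(images, key=lambda pair: pair[1])
    # Assumed interpretation: the corresponding flash is the nearest in time.
    nearest_flash = min(flash_timestamps, key=lambda t: abs(t - img_ts))
    return img_ts - nearest_flash
```

Running the flashes at a rate offset from the capture rate sweeps the flash across the exposure window, which is what guarantees one image lines up almost exactly with one flash.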
-
Patent number: D1029756
Type: Grant
Filed: April 27, 2022
Date of Patent: June 4, 2024
Assignee: Snap Inc.
Inventors: Evan Spiegel, Mathias Hintermann, Kristina Marrero, Klaus Tritschler