Patents by Inventor Jennica Pounds

Jennica Pounds has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 12238495
    Abstract: Electronic eyewear device providing simplified audio source separation, also referred to as voice/sound unmixing, using alignment between respective device trajectories. Multiple users of electronic eyewear devices in an environment may simultaneously generate audio signals (e.g., voices/sounds) that are difficult to distinguish from one another. The electronic eyewear device tracks the location of moving remote electronic eyewear devices of other users, or an object of the other users, such as the remote user's face, to provide audio source separation using location of the sound sources. The simplified voice unmixing uses a microphone array of the electronic eyewear device and the known location of the remote user's electronic eyewear device with respect to the user's electronic eyewear device to facilitate audio source separation.
    Type: Grant
    Filed: April 1, 2022
    Date of Patent: February 25, 2025
    Assignee: Snap Inc.
    Inventors: Georgios Evangelidis, Ashwani Arya, Jennica Pounds, Andrei Rybin
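
The abstract above relies on knowing where the remote wearer's device is relative to the local microphone array. As a rough illustration of that idea only (not the patented method), the sketch below applies simple delay-and-sum beamforming toward a known source position; the array geometry, sample rate, and signals are hypothetical.

```python
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s
SAMPLE_RATE = 16_000     # Hz (hypothetical)

def delay_and_sum(mic_signals, mic_positions, source_position):
    """Steer a microphone array toward a known source location by compensating
    each channel's propagation delay and averaging the aligned channels.

    mic_signals:     (num_mics, num_samples) recorded audio
    mic_positions:   (num_mics, 3) microphone coordinates in metres
    source_position: (3,) known location of the remote wearer's device
    """
    distances = np.linalg.norm(mic_positions - source_position, axis=1)
    delays = (distances - distances.min()) / SPEED_OF_SOUND        # seconds, relative to the closest mic
    delay_samples = np.round(delays * SAMPLE_RATE).astype(int)
    aligned = [np.roll(sig, -d) for sig, d in zip(mic_signals, delay_samples)]
    return np.mean(aligned, axis=0)  # sound from that location adds coherently; other directions do not
```

Aligning the channels toward a tracked device position boosts that wearer's voice relative to sound arriving from elsewhere, which is the intuition behind using device trajectories for unmixing.
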
  • Publication number: 20250028397
    Abstract: A text entry process for an Augmented Reality (AR) system. The AR system detects, using one or more cameras of the AR system, a start text entry gesture made by a user of the AR system. During text entry, the AR system detects, using the one or more cameras, a symbol corresponding to a fingerspelling sign made by the user. The AR system generates entered text data based on the symbol and provides text in a text scene component of an AR overlay provided by the AR system to the user based on the entered text data.
    Type: Application
    Filed: October 7, 2024
    Publication date: January 23, 2025
    Inventors: Austin Vaday, Rebecca Jean Lee, Jennica Pounds
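
The flow described in the abstract above (start gesture, then fingerspelled symbols, then entered text shown in an overlay) can be pictured as a small state machine. The sketch below is a hypothetical stand-in: the gesture labels and the upstream classifier are assumptions, not the AR system's actual interfaces.

```python
from dataclasses import dataclass, field

# Hypothetical labels emitted by an upstream gesture/fingerspelling classifier.
START_GESTURE = "start_text_entry"
STOP_GESTURE = "stop_text_entry"

@dataclass
class TextEntrySession:
    active: bool = False
    entered: list = field(default_factory=list)

    def on_label(self, label: str) -> str:
        """Consume one classifier label per frame; return the text for the AR overlay."""
        if label == START_GESTURE:
            self.active = True
        elif label == STOP_GESTURE:
            self.active = False
        elif self.active and len(label) == 1:
            self.entered.append(label)   # single characters stand in for fingerspelling symbols
        return "".join(self.entered)

session = TextEntrySession()
for label in (START_GESTURE, "h", "i", STOP_GESTURE):
    overlay_text = session.on_label(label)
print(overlay_text)  # -> "hi"
```
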
  • Patent number: 12141362
    Abstract: A text entry process for an Augmented Reality (AR) system. The AR system detects, using one or more cameras of the AR system, a start text entry gesture made by a user of the AR system. During text entry, the AR system detects, using the one or more cameras, a symbol corresponding to a fingerspelling sign made by the user. The AR system generates entered text data based on the symbol and provides text in a text scene component of an AR overlay provided by the AR system to the user based on the entered text data.
    Type: Grant
    Filed: April 27, 2022
    Date of Patent: November 12, 2024
    Assignee: Snap Inc.
    Inventors: Austin Vaday, Rebecca Jean Lee, Jennica Pounds
  • Patent number: 12136433
    Abstract: An eyewear device that performs diarization by segmenting spoken language into different speakers and remembering each speaker over the course of a session. The speech of each speaker is translated to text and the text of each speaker is displayed on an eyewear display. The text of each user has a different attribute such that the eyewear user can distinguish the text of different speakers. Examples of the text attribute can be a text color, font, and font size. The text is displayed on the eyewear display such that it does not substantially obstruct the user's vision.
    Type: Grant
    Filed: May 28, 2020
    Date of Patent: November 5, 2024
    Assignee: Snap Inc.
    Inventors: Jonathan Geddes, Jennica Pounds, Ryan Pruden, Jonathan M. Rodriguez, II, Andrei Rybin
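
A small part of what the abstract above describes, remembering each diarized speaker over a session and giving that speaker's transcript a stable visual attribute, is easy to illustrate. The color palette and speaker IDs below are made up for the example.

```python
# Hypothetical palette; the abstract mentions color, font, and font size as possible attributes.
PALETTE = ["#ff5555", "#55ff55", "#5599ff", "#ffaa00"]

class SpeakerStyles:
    """Remembers each speaker seen during a session and assigns a stable display color."""
    def __init__(self):
        self._styles = {}

    def color_for(self, speaker_id: str) -> str:
        if speaker_id not in self._styles:
            self._styles[speaker_id] = PALETTE[len(self._styles) % len(PALETTE)]
        return self._styles[speaker_id]

styles = SpeakerStyles()
for speaker, text in [("spk_0", "Hello there."), ("spk_1", "Hi!"), ("spk_0", "How are you?")]:
    print(styles.color_for(speaker), text)   # spk_0 keeps its color across the session
```
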
  • Publication number: 20240242541
    Abstract: Eyewear that identifies a hand gesture presenting sign language and then initiates a command indicative of the identified hand gesture. The eyewear uses a convolutional neural network (CNN) to identify the hand gesture by matching a hand motion to a set of hand gestures. The set of hand gestures is a library of hand gestures stored in a memory. The hand gestures can include static hand gestures, moving hand gestures, or both static and moving hand gestures. The eyewear identifies a command corresponding to a hand gesture or a series of hand gestures.
    Type: Application
    Filed: January 13, 2023
    Publication date: July 18, 2024
    Inventors: Zsófia Jáger, Anoosh Kruba Chandar Mahalingam, Jennica Pounds, Jerry Jesada Pua
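
The last step the abstract above describes, mapping an identified gesture to a command, is sketched below with a placeholder confidence check in place of the CNN; the labels, commands, and threshold are hypothetical.

```python
# Hypothetical label-to-command table; the described eyewear matches against a stored gesture library.
GESTURE_COMMANDS = {
    "thumbs_up": "confirm",
    "open_palm": "open_menu",
    "swipe_left": "previous_item",
}

def handle_gesture(label: str, confidence: float, threshold: float = 0.8):
    """Map a classifier output (label plus confidence) to a device command, or None."""
    if confidence < threshold:
        return None   # ignore uncertain predictions
    return GESTURE_COMMANDS.get(label)

print(handle_gesture("open_palm", 0.93))  # -> open_menu
print(handle_gesture("open_palm", 0.40))  # -> None
```
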
  • Publication number: 20240193168
    Abstract: Various embodiments provide for a registry for augmented reality (AR) objects, which can provide AR objects to a client device to support various software or hardware applications. For instance, some embodiments provide for an AR object registry that facilitates or enables registration of one or more AR objects in association with one or more locations across a planet.
    Type: Application
    Filed: February 21, 2024
    Publication date: June 13, 2024
    Inventors: Jennica Pounds, Qi Pan, Brent Michael Barkman, Ozi Egri
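
The abstract above describes registering AR objects against real-world locations so that client devices can retrieve them. A minimal in-memory sketch of that idea follows; the grid-cell key, class names, and example data are hypothetical, and a real registry would use a proper spatial index and persistent storage.

```python
from collections import defaultdict

def cell_key(lat: float, lon: float, precision: float = 0.001) -> tuple:
    """Quantize coordinates into a coarse grid cell (a stand-in for a real spatial index)."""
    return (round(lat / precision), round(lon / precision))

class ARObjectRegistry:
    """Registers AR objects in association with locations and serves them to clients."""
    def __init__(self):
        self._cells = defaultdict(list)

    def register(self, object_id: str, lat: float, lon: float, payload: dict):
        self._cells[cell_key(lat, lon)].append({"id": object_id, "lat": lat, "lon": lon, **payload})

    def query(self, lat: float, lon: float) -> list:
        """Return the AR objects registered in the cell containing (lat, lon)."""
        return list(self._cells[cell_key(lat, lon)])

registry = ARObjectRegistry()
registry.register("object-42", 40.7580, -73.9855, {"model_url": "https://example.com/object-42.glb"})
print(registry.query(40.7580, -73.9855))
```
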
  • Publication number: 20240192780
    Abstract: A multi-System on Chip (SoC) hand-tracking platform is provided. The multi-SoC hand-tracking platform includes a computer vision SoC and one or more application SoCs. The computer vision SoC hosts a hand-tracking input pipeline. The one or more application SoCs host one or more applications that are consumers of input event data generated by the hand-tracking input pipeline. The applications communicate with some components of the hand-tracking input pipeline using a shared-memory buffer and with some of the components of the hand-tracking input pipeline using Inter-Process Communication (IPC) method calls.
    Type: Application
    Filed: December 9, 2022
    Publication date: June 13, 2024
    Inventors: Liviu Marius Coconu, Daniel Colascione, Farid Zare Seisan, Daniel Harris, Jennica Pounds
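
The split described in the abstract above (high-rate tracking data over a shared-memory buffer, discrete input events over IPC method calls) can be illustrated in miniature. The sketch below is an in-process Python analogy; the buffer layout, frame size, and event format are assumptions, not the platform's actual interfaces.

```python
import numpy as np

class FrameRingBuffer:
    """Toy ring buffer standing in for the shared-memory buffer between SoCs.
    The computer-vision side publishes per-frame skeletal data here; application
    consumers read the latest frame without a per-frame IPC round trip."""
    def __init__(self, slots: int = 8, frame_size: int = 63):   # e.g. 21 joints x (x, y, z)
        self.frames = np.zeros((slots, frame_size), dtype=np.float32)
        self.write_index = 0

    def publish(self, frame: np.ndarray):
        self.frames[self.write_index % len(self.frames)] = frame
        self.write_index += 1

    def latest(self) -> np.ndarray:
        return self.frames[(self.write_index - 1) % len(self.frames)]

def send_input_event(event: dict):
    """Stand-in for an IPC method call carrying lower-rate input events."""
    print("IPC event:", event)

buffer = FrameRingBuffer()
buffer.publish(np.zeros(63, dtype=np.float32))           # continuous skeletal frames: shared memory
send_input_event({"type": "gesture", "name": "pinch"})   # discrete recognized events: IPC call
```
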
  • Patent number: 11977553
    Abstract: Various embodiments provide for a registry for augmented reality (AR) objects, which can provide AR objects to a client device to support various software or hardware applications. For instance, some embodiments provide for an AR object registry that facilitates or enables registration of one or more AR objects in association with one or more locations across a planet.
    Type: Grant
    Filed: August 1, 2022
    Date of Patent: May 7, 2024
    Assignee: Snap Inc.
    Inventors: Jennica Pounds, Qi Pan, Brent Michael Barkman, Ozi Egri
  • Publication number: 20240127006
    Abstract: A method for recognizing sign language using collaborative augmented reality devices is described. In one aspect, a method includes accessing a first image generated by a first augmented reality device and a second image generated by a second augmented reality device, the first image and the second image depicting a hand gesture of a user of the first augmented reality device, synchronizing the first augmented reality device with the second augmented reality device, in response to the synchronizing, distributing one or more processes of a sign language recognition system between the first and second augmented reality devices, collecting results from the one or more processes from the first and second augmented reality devices, and displaying, in near real-time in a first display of the first augmented reality device, text indicating a sign language translation of the hand gesture based on the results.
    Type: Application
    Filed: October 17, 2022
    Publication date: April 18, 2024
    Inventors: Kai Zhou, Jennica Pounds, Zsolt Robotka, Márton Gergely Kajtár
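
The abstract above distributes the stages of a sign-language recognition pipeline between two synchronized AR devices and collects the results on the first device. One hypothetical split (detection on the first device, classification on the second) is sketched below with plain stand-in functions.

```python
def detect_hand(frame):
    """Stage assigned to the first device: locate the signing hand (stand-in)."""
    return {"bbox": (10, 10, 64, 64), "frame": frame}

def classify_sign(detection, second_view):
    """Stage assigned to the second device: translate the gesture, with its own
    view of the same hand available for refinement (stand-in)."""
    return "HELLO"

def collaborative_recognition(frame_a, frame_b):
    detection = detect_hand(frame_a)                  # runs on the first device
    translation = classify_sign(detection, frame_b)   # runs on the second device
    return translation                                # collected and displayed on the first device

print(collaborative_recognition(b"frame-from-device-a", b"frame-from-device-b"))
```
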
  • Publication number: 20240126373
    Abstract: A hand-tracking platform generates gesture components for use as user inputs into an application of an Augmented Reality (AR) system. In some examples, the hand-tracking platform generates real-world scene environment frame data based on gestures being made by a user of the AR system using a camera component of the AR system. The hand-tracking platform recognizes a gesture component based on the real-world scene environment frame data and generates gesture component data based on the gesture component. The application utilizes the gesture component data as user input in a user interface of the application.
    Type: Application
    Filed: October 12, 2022
    Publication date: April 18, 2024
    Inventors: Attila Alvarez, Márton Gergely Kajtár, Peter Pocsi, Jennica Pounds, David Retek, Zsolt Robotka
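
The abstract above has the hand-tracking platform emit gesture-component data that an application consumes as user input. The sketch below shows that producer/consumer shape with a made-up pinch heuristic; the component names, thresholds, and callback interface are assumptions.

```python
from typing import Callable

class GestureComponentPipeline:
    """Turns per-frame hand-tracking data into gesture-component events for the app."""
    def __init__(self, on_component: Callable[[dict], None]):
        self.on_component = on_component

    def process_frame(self, frame_data: dict):
        # Stand-in recognition: a real platform analyzes skeletal data from the camera here.
        if frame_data.get("thumb_index_distance", 1.0) < 0.02:
            self.on_component({"component": "pinch", "position": frame_data.get("position")})

def ui_handler(event: dict):
    print("UI received:", event)   # the application treats the event as user input

pipeline = GestureComponentPipeline(ui_handler)
pipeline.process_frame({"thumb_index_distance": 0.01, "position": (0.4, 0.6)})
```
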
  • Patent number: 11943303
    Abstract: Various embodiments provide for a registry for augmented reality (AR) objects, which can provide AR objects to a client device to support various software or hardware applications. For instance, some embodiments provide for an AR object registry that facilitates or enables registration of one or more AR objects in association with one or more locations across a planet.
    Type: Grant
    Filed: May 20, 2022
    Date of Patent: March 26, 2024
    Assignee: Snap Inc.
    Inventors: Jennica Pounds, Brent Mills, Ulf Oscar Michel Loenngren
  • Publication number: 20240070994
    Abstract: An Augmented Reality (AR) system is provided. The AR system uses a combination of gesture and DMVO methodologies to provide for the user's selection and modification of virtual objects of an AR experience. The user indicates that they want to interact with a virtual object of the AR experience by moving their hand to overlap the virtual object. While keeping their hand in an overlapping position, the user makes gestures that cause the user's viewpoint of the virtual object to either zoom in or zoom out. To end the interaction, the user moves their hand such that their hand is no longer overlapping the virtual object.
    Type: Application
    Filed: August 31, 2022
    Publication date: February 29, 2024
    Inventors: Anoosh Kruba Chandar Mahalingam, Jennica Pounds, Andrei Rybin, Pierre-Yves Santerre
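
The interaction in the abstract above, grab by overlapping the object and then zoom while the overlap holds, reduces to two checks. The sketch below is a geometric toy with made-up coordinates and radii, not the AR system's tracking math.

```python
def hand_overlaps(hand_pos, obj_pos, radius: float = 0.1) -> bool:
    """True when the tracked hand is close enough to the virtual object to select it."""
    return sum((h - o) ** 2 for h, o in zip(hand_pos, obj_pos)) ** 0.5 < radius

def zoom_scale(start_pinch_distance: float, current_pinch_distance: float, base_scale: float = 1.0) -> float:
    """While the hand still overlaps the object, map the change in pinch distance to a zoom factor."""
    return base_scale * (current_pinch_distance / start_pinch_distance)

if hand_overlaps((0.02, 0.0, 0.5), (0.0, 0.0, 0.5)):
    print(zoom_scale(0.04, 0.08))   # widening the pinch -> zoom in by 2x
```
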
  • Publication number: 20240070995
    Abstract: An Augmented Reality (AR) system is provided. The AR system uses a combination of gesture and DMVO methodologies to provide for the user's selection and modification of virtual objects of an AR experience. The user indicates that they want to interact with a virtual object of the AR experience by moving their hand to overlap the virtual object. While keeping their hand in an overlapping position, the user rotates their wrist and the virtual object is rotated as well. To end the interaction, the user moves their hand such that their hand is no longer overlapping the virtual object.
    Type: Application
    Filed: August 31, 2022
    Publication date: February 29, 2024
    Inventors: Anoosh Kruba Chandar Mahalingam, Jennica Pounds, Andrei Rybin, Pierre-Yves Santerre
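
This entry's variant replaces zoom with rotation: while the hand overlaps the object, the change in wrist roll is applied to the object. A one-function illustration with hypothetical angles follows.

```python
def rotated_object_angle(base_angle_deg: float, wrist_start_deg: float, wrist_now_deg: float) -> float:
    """While the hand overlaps the object, mirror the change in wrist roll onto the object."""
    return (base_angle_deg + (wrist_now_deg - wrist_start_deg)) % 360

print(rotated_object_angle(base_angle_deg=0.0, wrist_start_deg=10.0, wrist_now_deg=55.0))  # -> 45.0
```
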
  • Patent number: 11900729
    Abstract: An eyewear having an electronic processor configured to identify a hand gesture including a sign language, and to generate speech that is indicative of the identified hand gesture. The electronic processor uses a convolutional neural network (CNN) to identify the hand gesture by matching the hand gesture in the image to a set of hand gestures, wherein the set of hand gestures is a library of hand gestures stored in a memory. The hand gesture can include a static hand gesture, and a moving hand gesture. The electronic processor is configured to identify a word from a series of hand gestures.
    Type: Grant
    Filed: November 18, 2021
    Date of Patent: February 13, 2024
    Assignee: Snap Inc.
    Inventors: Ryan Chan, Brent Mills, Eitan Pilipski, Jennica Pounds, Elliot Solomon
  • Publication number: 20230350495
    Abstract: A text entry process for an Augmented Reality (AR) system. The AR system detects, using one or more cameras of the AR system, a start text entry gesture made by a user of the AR system. During text entry, the AR system detects, using the one or more cameras, a symbol corresponding to a fingerspelling sign made by the user. The AR system generates entered text data based on the symbol and provides text in a text scene component of an AR overlay provided by the AR system to the user based on the entered text data.
    Type: Application
    Filed: April 27, 2022
    Publication date: November 2, 2023
    Inventors: Austin Vaday, Rebecca Jean Lee, Jennica Pounds
  • Publication number: 20230341948
    Abstract: An AR system includes multiple input-modalities. A hand-tracking pipeline supports Direct Manipulation of Virtual Object (DMVO) and gesture input methodologies. In addition, a voice processing pipeline provides for speech inputs. Direct memory buffer access to preliminary hand-tracking data, such as skeletal models, allows for low latency communication of the data for use by DMVO-based user interfaces. A system framework component routes higher level hand-tracking data, such as gesture identification and symbols generated based on hand positions, via a Snips protocol to gesture-based user interfaces.
    Type: Application
    Filed: April 26, 2023
    Publication date: October 26, 2023
    Inventors: Daniel Colascione, Daniel Harris, Andrei Rybin, Anoosh Kruba Chandar Mahalingam, Pierre-Yves Santerre, Jennica Pounds
  • Publication number: 20230319476
    Abstract: Electronic eyewear device providing simplified audio source separation, also referred to as voice/sound unmixing, using alignment between respective device trajectories. Multiple users of electronic eyewear devices in an environment may simultaneously generate audio signals (e.g., voices/sounds) that are difficult to distinguish from one another. The electronic eyewear device tracks the location of moving remote electronic eyewear devices of other users, or an object of the other users, such as the remote user's face, to provide audio source separation using location of the sound sources. The simplified voice unmixing uses a microphone array of the electronic eyewear device and the known location of the remote user's electronic eyewear device with respect to the user's electronic eyewear device to facilitate audio source separation.
    Type: Application
    Filed: April 1, 2022
    Publication date: October 5, 2023
    Inventors: Georgios Evangelidis, Ashwani Arya, Jennica Pounds, Andrei Rybin
  • Publication number: 20230168522
    Abstract: Eyewear providing a visual indicator to a hearing-impaired user indicating a direction of arrival of a sound relative to the eyewear to help the user obtain greater awareness of the surrounding environment. An eyewear optical assembly includes an image display displaying the visual indicator discernable by the user and corresponding to the direction of arrival of the sound, even when a sound source is not viewable through the optical assembly. The image display also displays an image indicative of the sound source. A front portion of an eyewear frame includes a light array, where one or more lights of the light array is illuminated to indicate the direction of arrival and intensity of the sound source. An array of vibrating devices may also be used to indicate the direction of arrival and intensity of the sound source.
    Type: Application
    Filed: December 1, 2021
    Publication date: June 1, 2023
    Inventors: Jennica Pounds, Daniel Harris, Ashwani Arya
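
The abstract above maps a sound's direction of arrival and intensity onto a light array on the front of the frame. The sketch below shows one way such a mapping could look; the light count, angle convention, and intensity scale are assumptions.

```python
NUM_LIGHTS = 8   # hypothetical number of lights across the front of the frame

def light_for_direction(doa_degrees: float) -> int:
    """Map a direction of arrival (0-360 degrees around the wearer) to a light index."""
    return int((doa_degrees % 360) / 360 * NUM_LIGHTS) % NUM_LIGHTS

def brightness_for_intensity(rms: float, max_rms: float = 1.0) -> float:
    """Scale light brightness with sound intensity, clamped to [0, 1]."""
    return max(0.0, min(1.0, rms / max_rms))

print(light_for_direction(95.0), brightness_for_intensity(0.4))   # e.g. light 2 at 0.4 brightness
```
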
  • Publication number: 20230104981
    Abstract: Various embodiments provide for a registry for augmented reality (AR) objects, which can provide AR objects to a client device to support various software or hardware applications. For instance, some embodiments provide for an AR object registry that facilitates or enables registration of one or more AR objects in association with one or more locations across a planet.
    Type: Application
    Filed: August 1, 2022
    Publication date: April 6, 2023
    Inventors: Jennica Pounds, Qi Pan, Brent Michael Barkman, Ozi Egri
  • Publication number: 20220279043
    Abstract: Various embodiments provide for a registry for augmented reality (AR) objects, which can provide AR objects to a client device to support various software or hardware applications. For instance, some embodiments provide for an AR object registry that facilitates or enables registration of one or more AR objects in association with one or more locations across a planet.
    Type: Application
    Filed: May 20, 2022
    Publication date: September 1, 2022
    Inventors: Jennica Pounds, Brent Mills, Ulf Oscar Michel Loenngren