Patents by Inventor Jennica Pounds
Jennica Pounds has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 12238495
Abstract: Electronic eyewear device providing simplified audio source separation, also referred to as voice/sound unmixing, using alignment between respective device trajectories. Multiple users of electronic eyewear devices in an environment may simultaneously generate audio signals (e.g., voices/sounds) that are difficult to distinguish from one another. The electronic eyewear device tracks the location of moving remote electronic eyewear devices of other users, or an object associated with the other users, such as a remote user's face, to provide audio source separation using the locations of the sound sources. The simplified voice unmixing uses a microphone array of the electronic eyewear device and the known location of the remote user's electronic eyewear device with respect to the user's electronic eyewear device to facilitate audio source separation.
Type: Grant
Filed: April 1, 2022
Date of Patent: February 25, 2025
Assignee: Snap Inc.
Inventors: Georgios Evangelidis, Ashwani Arya, Jennica Pounds, Andrei Rybin
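The abstract describes separating overlapping voices by exploiting the known location of each remote device relative to the wearer's microphone array. The patent's "simplified" method is not disclosed here; the sketch below illustrates the general idea with classic delay-and-sum beamforming, where a known source position yields per-microphone delays that align (and thus emphasize) that source. All names and geometry are hypothetical.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s, approximate at room temperature

def steering_delays(mic_positions, source_position):
    """Per-microphone delays (seconds) that align a source at a known
    location, relative to the microphone closest to the source."""
    dists = [math.dist(m, source_position) for m in mic_positions]
    nearest = min(dists)
    return [(d - nearest) / SPEED_OF_SOUND for d in dists]

def delay_and_sum(channels, delays, sample_rate):
    """Average the channels after shifting each by its (rounded) delay.
    Samples shifted past the end of a channel are treated as zero."""
    shifts = [round(d * sample_rate) for d in delays]
    length = len(channels[0])
    out = []
    for n in range(length):
        acc = 0.0
        for ch, s in zip(channels, shifts):
            idx = n + s
            if 0 <= idx < length:
                acc += ch[idx]
        out.append(acc / len(channels))
    return out
```

With the target's position known from device tracking, the delays need no search over candidate directions, which is one way location knowledge simplifies unmixing.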
-
Publication number: 20250028397
Abstract: A text entry process for an Augmented Reality (AR) system. The AR system detects, using one or more cameras of the AR system, a start text entry gesture made by a user of the AR system. During text entry, the AR system detects, using the one or more cameras, a symbol corresponding to a fingerspelling sign made by the user. The AR system generates entered text data based on the symbol and provides text in a text scene component of an AR overlay provided by the AR system to the user based on the entered text data.
Type: Application
Filed: October 7, 2024
Publication date: January 23, 2025
Inventors: Austin Vaday, Rebecca Jean Lee, Jennica Pounds
-
Patent number: 12141362
Abstract: A text entry process for an Augmented Reality (AR) system. The AR system detects, using one or more cameras of the AR system, a start text entry gesture made by a user of the AR system. During text entry, the AR system detects, using the one or more cameras, a symbol corresponding to a fingerspelling sign made by the user. The AR system generates entered text data based on the symbol and provides text in a text scene component of an AR overlay provided by the AR system to the user based on the entered text data.
Type: Grant
Filed: April 27, 2022
Date of Patent: November 12, 2024
Assignee: Snap Inc.
Inventors: Austin Vaday, Rebecca Jean Lee, Jennica Pounds
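The flow the abstract describes (a start gesture opens text entry, then recognized fingerspelling symbols accumulate into entered text) can be sketched as a small state machine. This is an illustrative reading of the abstract, not the patented implementation; the gesture names are hypothetical.

```python
class FingerspellEntry:
    """Sketch of the text-entry flow: a start gesture opens a session,
    and recognized fingerspelling symbols append to the entered text."""

    def __init__(self):
        self.active = False  # whether a text-entry session is open
        self.text = ""       # entered text shown in the AR overlay

    def on_gesture(self, gesture):
        """React to a camera-detected control gesture."""
        if gesture == "start_text_entry":
            self.active = True
            self.text = ""
        elif gesture == "end_text_entry":
            self.active = False

    def on_symbol(self, symbol):
        """Append a recognized fingerspelling symbol, if a session is open."""
        if self.active:
            self.text += symbol
```

Symbols seen outside a session are ignored, which is why the explicit start gesture matters: it keeps ordinary hand movement from being transcribed.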
-
Patent number: 12136433
Abstract: An eyewear device that performs diarization by segmenting spoken language into different speakers and remembering each speaker over the course of a session. The speech of each speaker is translated to text and the text of each speaker is displayed on an eyewear display. The text of each speaker has a different attribute such that the eyewear user can distinguish the text of different speakers. Examples of such attributes include text color, font, and font size. The text is displayed on the eyewear display such that it does not substantially obstruct the user's vision.
Type: Grant
Filed: May 28, 2020
Date of Patent: November 5, 2024
Assignee: Snap Inc.
Inventors: Jonathan Geddes, Jennica Pounds, Ryan Pruden, Jonathan M. Rodriguez, II, Andrei Rybin
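The per-speaker attribute assignment the abstract describes can be sketched as a small mapper that gives each newly seen speaker a stable display attribute for the session. The palette and class name below are hypothetical; the abstract names color, font, and size as example attributes.

```python
# Hypothetical caption palette; any visual attribute (font, size) would work.
PALETTE = ["#FFFFFF", "#FFD166", "#06D6A0", "#118AB2"]

class CaptionStyler:
    """Assigns each newly seen speaker a stable display attribute so the
    wearer can tell speakers apart across a diarization session."""

    def __init__(self, palette=PALETTE):
        self._palette = palette
        self._assigned = {}  # speaker id -> color

    def color_for(self, speaker_id):
        """Return this speaker's color, assigning the next free one if new."""
        if speaker_id not in self._assigned:
            self._assigned[speaker_id] = self._palette[
                len(self._assigned) % len(self._palette)]
        return self._assigned[speaker_id]

styler = CaptionStyler()
```

Keeping the mapping for the whole session is the point of "remembering each speaker": the same person always gets the same caption style.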
-
Publication number: 20240242541
Abstract: Eyewear that identifies a hand gesture representing sign language and then initiates a command indicative of the identified hand gesture. The eyewear uses a convolutional neural network (CNN) to identify the hand gesture by matching a hand motion to a set of hand gestures. The set of hand gestures is a library of hand gestures stored in a memory. The hand gestures can include static hand gestures, moving hand gestures, or both static and moving hand gestures. The eyewear identifies a command corresponding to a hand gesture or a series of hand gestures.
Type: Application
Filed: January 13, 2023
Publication date: July 18, 2024
Inventors: Zsófia Jáger, Anoosh Kruba Chandar Mahalingam, Jennica Pounds, Jerry Jesada Pua
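The match-against-a-stored-library step can be illustrated with a nearest-neighbor lookup over feature vectors, standing in for the CNN the abstract describes. The library contents, feature vectors, and command names below are all hypothetical.

```python
import math

# Hypothetical gesture library: name -> feature vector. A real system
# would store embeddings produced by the CNN from hand images.
GESTURE_LIBRARY = {
    "thumbs_up": [1.0, 0.0, 0.0],
    "open_palm": [0.0, 1.0, 0.0],
    "fist":      [0.0, 0.0, 1.0],
}

# Hypothetical gesture -> command mapping.
COMMANDS = {"thumbs_up": "confirm", "open_palm": "stop", "fist": "select"}

def match_gesture(features, library=GESTURE_LIBRARY):
    """Return the library gesture whose vector is nearest the observation."""
    return min(library, key=lambda name: math.dist(features, library[name]))

def command_for(features):
    """Identify the command corresponding to the matched gesture."""
    return COMMANDS[match_gesture(features)]
```

Moving gestures would extend this by matching a sequence of such vectors rather than a single frame.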
-
Publication number: 20240193168
Abstract: Various embodiments provide for a registry for augmented reality (AR) objects, which can provide AR objects to a client device to support various software or hardware applications. For instance, some embodiments provide for an AR object registry that facilitates or enables registration of one or more AR objects in association with one or more locations across a planet.
Type: Application
Filed: February 21, 2024
Publication date: June 13, 2024
Inventors: Jennica Pounds, Qi Pan, Brent Michael Barkman, Ozi Egri
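A registry that associates AR objects with world locations can be sketched as a spatial index: register an object against coordinates, then answer "what is registered near here?" queries. The coarse grid-cell scheme below is purely illustrative (a production registry might use geohashes or S2 cells), and every name is hypothetical.

```python
CELL_DEG = 0.01  # grid cell size in degrees (roughly 1 km at the equator)

def cell_of(lat, lon):
    """Quantize coordinates into a coarse grid cell key."""
    return (round(lat / CELL_DEG), round(lon / CELL_DEG))

class ARObjectRegistry:
    """Registers AR objects against world locations and serves them to
    clients that ask what is registered near a given position."""

    def __init__(self):
        self._cells = {}  # cell key -> list of (object_id, lat, lon)

    def register(self, object_id, lat, lon):
        self._cells.setdefault(cell_of(lat, lon), []).append(
            (object_id, lat, lon))

    def query(self, lat, lon):
        """Return the ids of objects registered in the querier's cell."""
        return [oid for oid, _, _ in self._cells.get(cell_of(lat, lon), [])]
```

Cell-keyed lookup keeps a planet-scale registry queryable: a client only ever fetches the handful of objects in its own neighborhood.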
-
Publication number: 20240192780
Abstract: A multi-System on Chip (SoC) hand-tracking platform is provided. The multi-SoC hand-tracking platform includes a computer vision SoC and one or more application SoCs. The computer vision SoC hosts a hand-tracking input pipeline. The one or more application SoCs host one or more applications that are consumers of input event data generated by the hand-tracking input pipeline. The applications communicate with some components of the hand-tracking input pipeline using a shared-memory buffer and with some of the components of the hand-tracking input pipeline using Inter-Process Communication (IPC) method calls.
Type: Application
Filed: December 9, 2022
Publication date: June 13, 2024
Inventors: Liviu Marius Coconu, Daniel Colascione, Farid Zare Seisan, Daniel Harris, Jennica Pounds
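The two communication paths the abstract contrasts, a shared-memory buffer for high-rate frame data and IPC method calls for discrete events, can be sketched with in-process stand-ins. These classes only model the pattern; real shared memory and IPC would cross SoC boundaries, and all names here are hypothetical.

```python
from collections import deque

class SharedFrameBuffer:
    """Stand-in for a shared-memory ring buffer: the vision SoC publishes
    hand-tracking frames; application SoCs poll the newest without blocking."""

    def __init__(self, capacity=4):
        self._frames = deque(maxlen=capacity)  # old frames are overwritten

    def publish(self, frame):
        self._frames.append(frame)

    def latest(self):
        return self._frames[-1] if self._frames else None

class IpcChannel:
    """Stand-in for an IPC method-call channel carrying discrete events,
    such as a recognized gesture, to subscribed applications."""

    def __init__(self):
        self._handlers = []

    def subscribe(self, handler):
        self._handlers.append(handler)

    def call(self, event):
        for handler in self._handlers:
            handler(event)
```

The split matches the data's shape: per-frame skeletal data tolerates being overwritten (only the newest matters), while gesture events must each be delivered, hence the method-call path.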
-
Patent number: 11977553
Abstract: Various embodiments provide for a registry for augmented reality (AR) objects, which can provide AR objects to a client device to support various software or hardware applications. For instance, some embodiments provide for an AR object registry that facilitates or enables registration of one or more AR objects in association with one or more locations across a planet.
Type: Grant
Filed: August 1, 2022
Date of Patent: May 7, 2024
Assignee: Snap Inc.
Inventors: Jennica Pounds, Qi Pan, Brent Michael Barkman, Ozi Egri
-
Publication number: 20240127006
Abstract: A method for recognizing sign language using collaborative augmented reality devices is described. In one aspect, a method includes accessing a first image generated by a first augmented reality device and a second image generated by a second augmented reality device, the first image and the second image depicting a hand gesture of a user of the first augmented reality device, synchronizing the first augmented reality device with the second augmented reality device, in response to the synchronizing, distributing one or more processes of a sign language recognition system between the first and second augmented reality devices, collecting results from the one or more processes from the first and second augmented reality devices, and displaying, in near real-time in a first display of the first augmented reality device, text indicating a sign language translation of the hand gesture based on the results.
Type: Application
Filed: October 17, 2022
Publication date: April 18, 2024
Inventors: Kai Zhou, Jennica Pounds, Zsolt Robotka, Márton Gergely Kajtár
-
Publication number: 20240126373
Abstract: A hand-tracking platform generates gesture components for use as user inputs into an application of an Augmented Reality (AR) system. In some examples, the hand-tracking platform generates real-world scene environment frame data based on gestures being made by a user of the AR system using a camera component of the AR system. The hand-tracking platform recognizes a gesture component based on the real-world scene environment frame data and generates gesture component data based on the gesture component. The application utilizes the gesture component data as user input in a user interface of the application.
Type: Application
Filed: October 12, 2022
Publication date: April 18, 2024
Inventors: Attila Alvarez, Márton Gergely Kajtár, Peter Pocsi, Jennica Pounds, David Retek, Zsolt Robotka
-
Patent number: 11943303
Abstract: Various embodiments provide for a registry for augmented reality (AR) objects, which can provide AR objects to a client device to support various software or hardware applications. For instance, some embodiments provide for an AR object registry that facilitates or enables registration of one or more AR objects in association with one or more locations across a planet.
Type: Grant
Filed: May 20, 2022
Date of Patent: March 26, 2024
Assignee: Snap Inc.
Inventors: Jennica Pounds, Brent Mills, Ulf Oscar Michel Loenngren
-
Publication number: 20240070994
Abstract: An Augmented Reality (AR) system is provided. The AR system uses a combination of gesture and DMVO methodologies to provide for the user's selection and modification of virtual objects of an AR experience. The user indicates that they want to interact with a virtual object of the AR experience by moving their hand to overlap the virtual object. While keeping their hand in an overlapping position, the user makes gestures that cause the user's viewpoint of the virtual object to either zoom in or zoom out. To end the interaction, the user moves their hand such that their hand is no longer overlapping the virtual object.
Type: Application
Filed: August 31, 2022
Publication date: February 29, 2024
Inventors: Anoosh Kruba Chandar Mahalingam, Jennica Pounds, Andrei Rybin, Pierre-Yves Santerre
-
Publication number: 20240070995
Abstract: An Augmented Reality (AR) system is provided. The AR system uses a combination of gesture and DMVO methodologies to provide for the user's selection and modification of virtual objects of an AR experience. The user indicates that they want to interact with a virtual object of the AR experience by moving their hand to overlap the virtual object. While keeping their hand in an overlapping position, the user rotates their wrist and the virtual object is rotated as well. To end the interaction, the user moves their hand such that their hand is no longer overlapping the virtual object.
Type: Application
Filed: August 31, 2022
Publication date: February 29, 2024
Inventors: Anoosh Kruba Chandar Mahalingam, Jennica Pounds, Andrei Rybin, Pierre-Yves Santerre
-
Patent number: 11900729
Abstract: Eyewear having an electronic processor configured to identify a hand gesture, including a sign language gesture, and to generate speech that is indicative of the identified hand gesture. The electronic processor uses a convolutional neural network (CNN) to identify the hand gesture by matching the hand gesture in the image to a set of hand gestures, wherein the set of hand gestures is a library of hand gestures stored in a memory. The hand gesture can include a static hand gesture and a moving hand gesture. The electronic processor is configured to identify a word from a series of hand gestures.
Type: Grant
Filed: November 18, 2021
Date of Patent: February 13, 2024
Assignee: Snap Inc.
Inventors: Ryan Chan, Brent Mills, Eitan Pilipski, Jennica Pounds, Elliot Solomon
-
Publication number: 20230350495
Abstract: A text entry process for an Augmented Reality (AR) system. The AR system detects, using one or more cameras of the AR system, a start text entry gesture made by a user of the AR system. During text entry, the AR system detects, using the one or more cameras, a symbol corresponding to a fingerspelling sign made by the user. The AR system generates entered text data based on the symbol and provides text in a text scene component of an AR overlay provided by the AR system to the user based on the entered text data.
Type: Application
Filed: April 27, 2022
Publication date: November 2, 2023
Inventors: Austin Vaday, Rebecca Jean Lee, Jennica Pounds
-
Publication number: 20230341948
Abstract: An AR system includes multiple input modalities. A hand-tracking pipeline supports Direct Manipulation of Virtual Objects (DMVO) and gesture input methodologies. In addition, a voice processing pipeline provides for speech inputs. Direct memory buffer access to preliminary hand-tracking data, such as skeletal models, allows for low-latency communication of the data for use by DMVO-based user interfaces. A system framework component routes higher-level hand-tracking data, such as gesture identification and symbols generated based on hand positions, via a Snips protocol to gesture-based user interfaces.
Type: Application
Filed: April 26, 2023
Publication date: October 26, 2023
Inventors: Daniel Colascione, Daniel Harris, Andrei Rybin, Anoosh Kruba Chandar Mahalingam, Pierre-Yves Santerre, Jennica Pounds
-
Publication number: 20230319476
Abstract: Electronic eyewear device providing simplified audio source separation, also referred to as voice/sound unmixing, using alignment between respective device trajectories. Multiple users of electronic eyewear devices in an environment may simultaneously generate audio signals (e.g., voices/sounds) that are difficult to distinguish from one another. The electronic eyewear device tracks the location of moving remote electronic eyewear devices of other users, or an object associated with the other users, such as a remote user's face, to provide audio source separation using the locations of the sound sources. The simplified voice unmixing uses a microphone array of the electronic eyewear device and the known location of the remote user's electronic eyewear device with respect to the user's electronic eyewear device to facilitate audio source separation.
Type: Application
Filed: April 1, 2022
Publication date: October 5, 2023
Inventors: Georgios Evangelidis, Ashwani Arya, Jennica Pounds, Andrei Rybin
-
Publication number: 20230168522
Abstract: Eyewear providing a visual indicator to a hearing-impaired user indicating a direction of arrival of a sound relative to the eyewear to help the user obtain greater awareness of the surrounding environment. An eyewear optical assembly includes an image display displaying the visual indicator discernable by the user and corresponding to the direction of arrival of the sound, even when a sound source is not viewable through the optical assembly. The image display also displays an image indicative of the sound source. A front portion of an eyewear frame includes a light array, where one or more lights of the light array is illuminated to indicate the direction of arrival and intensity of the sound source. An array of vibrating devices may also be used to indicate the direction of arrival and intensity of the sound source.
Type: Application
Filed: December 1, 2021
Publication date: June 1, 2023
Inventors: Jennica Pounds, Daniel Harris, Ashwani Arya
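Mapping an estimated direction of arrival onto one light of a front-facing array, with brightness tracking intensity, can be sketched as below. The field-of-view span, LED count, and intensity scaling are hypothetical; the abstract does not specify them.

```python
def led_for_direction(azimuth_deg, num_leds=8, fov_deg=180.0):
    """Map a direction of arrival (0 degrees = straight ahead, negative =
    left, positive = right) onto one LED of a front-facing strip."""
    half = fov_deg / 2.0
    clamped = max(-half, min(half, azimuth_deg))
    frac = (clamped + half) / fov_deg  # 0.0 (far left) .. 1.0 (far right)
    return min(num_leds - 1, int(frac * num_leds))

def brightness_for_intensity(rms, max_rms=1.0):
    """Scale LED brightness 0..255 with the sound's measured intensity."""
    return round(255 * max(0.0, min(1.0, rms / max_rms)))
```

The same index-plus-magnitude mapping would drive the vibrating-device array the abstract mentions, substituting vibration amplitude for brightness.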
-
Publication number: 20230104981
Abstract: Various embodiments provide for a registry for augmented reality (AR) objects, which can provide AR objects to a client device to support various software or hardware applications. For instance, some embodiments provide for an AR object registry that facilitates or enables registration of one or more AR objects in association with one or more locations across a planet.
Type: Application
Filed: August 1, 2022
Publication date: April 6, 2023
Inventors: Jennica Pounds, Qi Pan, Brent Michael Barkman, Ozi Egri
-
Publication number: 20220279043
Abstract: Various embodiments provide for a registry for augmented reality (AR) objects, which can provide AR objects to a client device to support various software or hardware applications. For instance, some embodiments provide for an AR object registry that facilitates or enables registration of one or more AR objects in association with one or more locations across a planet.
Type: Application
Filed: May 20, 2022
Publication date: September 1, 2022
Inventors: Jennica Pounds, Brent Mills, Ulf Oscar Michel Loenngren