Patents by Inventor Piotr Gurgul
Piotr Gurgul has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20250251739
Abstract: Apparatuses, computer readable medium, and methods for image capturing while circumnavigating objects using mobile devices are disclosed. Example methods include capturing an image, processing the image to identify an object within the image, determining a path around the object and a number of images to capture of the object, dividing the path by the number of images to determine a number of waypoints, and navigating the mobile device to the waypoints and capturing an image of the object at each waypoint of the waypoints. Examples include a person pointing at an object and the mobile device identifying the object based on the person pointing at the object. The mobile device determines a bounding box and a geometric center of the bounding box to determine the path to circumnavigate the object. The mobile device determines a height above a ground to assist in navigation.
Type: Application
Filed: March 25, 2025
Publication date: August 7, 2025
Inventors: Piotr Gurgul, Sharon Moll
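As a rough illustration of the path-division step described in this abstract, the Python sketch below computes evenly spaced capture waypoints on a circle around the geometric center of the object's bounding box. It is only one reading of the abstract, not the patented implementation; the circular path, the fixed capture height, and all names (`circumnavigation_waypoints`, `Waypoint`) are assumptions made for illustration.

```python
import math
from dataclasses import dataclass

@dataclass
class Waypoint:
    x: float
    y: float
    z: float      # height above the ground
    yaw: float    # heading toward the object's center, in radians

def circumnavigation_waypoints(bbox, num_images, radius, height):
    """Divide a circular path around a bounding box into capture waypoints.

    bbox is (min_x, min_y, max_x, max_y) on the ground plane; the path is a
    circle of `radius` meters around the box's geometric center, split into
    `num_images` evenly spaced waypoints, each facing the center.
    """
    min_x, min_y, max_x, max_y = bbox
    cx, cy = (min_x + max_x) / 2.0, (min_y + max_y) / 2.0  # geometric center
    waypoints = []
    for i in range(num_images):
        angle = 2.0 * math.pi * i / num_images
        x = cx + radius * math.cos(angle)
        y = cy + radius * math.sin(angle)
        yaw = math.atan2(cy - y, cx - x)  # point the camera at the center
        waypoints.append(Waypoint(x, y, height, yaw))
    return waypoints

# Example: eight photos on a 3 m circle, 1.5 m above the ground.
for wp in circumnavigation_waypoints((0, 0, 1, 1), num_images=8, radius=3.0, height=1.5):
    print(wp)
```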
-
Publication number: 20250245040
Abstract: Systems, methods, and computer readable media for auto-recovery of an augmented reality (AR) wearable device are disclosed. A pass-through application is invoked as a background process and an application is invoked as a foreground process. The pass-through application includes an on-resume procedure that is called if the operating system or interpreter determines that the foreground process is unresponsive. The on-resume procedure restarts the application as the foreground process and may first reboot the AR wearable device. The pass-through application remains transparent to the user by not displaying output on the display of the AR wearable device. Additionally, an uncaught exception handler is registered with the operating system to be called in the event that an exception occurs that does not have a handler. The exception handler restarts the application as the foreground process and may first reboot the AR wearable device.
Type: Application
Filed: April 21, 2025
Publication date: July 31, 2025
Inventor: Piotr Gurgul
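The recovery flow described here has two hooks: a background watchdog that notices an unresponsive foreground app, and an uncaught-exception handler. The following minimal Python sketch mirrors that shape using `sys.excepthook` as a stand-in for the wearable OS hook; the function names and the polling watchdog are assumptions, not the actual device APIs.

```python
import sys
import time

def restart_foreground_app(reboot_first=False):
    """Restart the foreground application, optionally rebooting the device first."""
    if reboot_first:
        print("rebooting device ...")        # placeholder for a real device reboot call
    print("relaunching foreground app ...")  # placeholder for the real launch call

def uncaught_exception_handler(exc_type, exc_value, exc_traceback):
    """Called by the runtime when an exception has no handler of its own."""
    print(f"unhandled {exc_type.__name__}: {exc_value}")
    restart_foreground_app(reboot_first=True)

# Register the handler with the interpreter, analogous to registering an
# uncaught-exception handler with the wearable's operating system.
sys.excepthook = uncaught_exception_handler

def on_resume_watchdog(is_foreground_responsive, poll_seconds=5.0):
    """Background 'pass-through' loop: if the foreground app stops responding,
    the on-resume path restarts it (and may reboot the device first)."""
    while True:
        if not is_foreground_responsive():
            restart_foreground_app(reboot_first=False)
        time.sleep(poll_seconds)
```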
-
Publication number: 20250237879
Abstract: Systems, methods, and computer readable media for voice input for augmented reality (AR) wearable devices are disclosed. Embodiments are disclosed that enable a user to interact with the AR wearable device without using physical user interface devices. A keyword is used to indicate that the user is about to speak an action or command. The AR wearable device divides the processing of the audio data into a keyword module that is trained to recognize the keyword and a module to process the audio data after the keyword. In some embodiments, the AR wearable device transmits the audio data after the keyword to a host device to process. The AR wearable device maintains an application registry that associates actions with applications. Applications can be downloaded, and the application registry updated where the applications indicate actions to associate with the application.
Type: Application
Filed: March 14, 2025
Publication date: July 24, 2025
Inventors: Sharon Moll, Piotr Gurgul, Tomasz Zakrzewski
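The split between an on-device keyword module and downstream command processing can be sketched in a few lines. The Python below uses text frames as a stand-in for audio and a lambda as the "host device"; the keyword phrase, registry contents, and function names are illustrative assumptions, not the patented design.

```python
def detect_keyword(audio_frames, keyword="hey glasses"):
    """Lightweight stand-in for a trained keyword-spotting model.

    Returns the index of the frame right after the keyword, or None.
    """
    for i, frame in enumerate(audio_frames):
        if frame.strip().lower() == keyword:
            return i + 1
    return None

APPLICATION_REGISTRY = {          # action phrase -> application to invoke
    "take a photo": "camera_app",
    "start navigation": "maps_app",
}

def handle_audio(audio_frames, send_to_host):
    """Split processing: a keyword module gates the stream, and everything
    after the keyword is forwarded (here, to a host device) for command
    recognition against the application registry."""
    start = detect_keyword(audio_frames)
    if start is None:
        return None
    command_text = send_to_host(audio_frames[start:])      # e.g. remote ASR
    return APPLICATION_REGISTRY.get(command_text.strip().lower())

# Example with text frames standing in for audio and an echoing "host".
frames = ["...", "hey glasses", "take a photo"]
print(handle_audio(frames, send_to_host=lambda f: " ".join(f)))  # -> camera_app
```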
-
Publication number: 20250208811
Abstract: An architecture is provided for packaging visual overlay-based user interfaces (UIs) into mobile device applications to work as user interface extensions that allow certain flows and logic to be displayed on an eyewear device when connected to the mobile device application. The extension of the UIs of the mobile device applications to the display of the eyewear device allows for inexpensive experimentation with augmented reality (AR) UIs for eyewear devices and allows for reusing of business logic across mobile devices and associated eyewear devices. For example, a mobile device application for maps or navigation may be extended to show directions on an associated eyewear device once the destination is chosen in the navigation application on the mobile device. In this example, the business logic would still live in the navigation application on the mobile device but the user would see AR directions overlaid on a display of the eyewear device.
Type: Application
Filed: March 12, 2025
Publication date: June 26, 2025
Inventors: Piotr Gurgul, Sharon Moll
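To make the "business logic stays on the phone, overlays go to the glasses" split concrete, here is a small Python sketch of the navigation example from the abstract. The transport, the `OverlayDirective` payload, and the class names are hypothetical; this is an interpretation of the architecture, not the actual extension mechanism.

```python
from dataclasses import dataclass

@dataclass
class OverlayDirective:
    """Minimal payload the phone app streams to the eyewear display."""
    text: str
    anchor: str  # e.g. "top_left", "center"

class NavigationAppExtension:
    """Business logic stays in the mobile navigation app; the eyewear device
    only renders the overlay directives it is sent."""

    def __init__(self, send_to_eyewear):
        self.send_to_eyewear = send_to_eyewear  # transport to the glasses

    def on_destination_chosen(self, destination, route_steps):
        # Routing is computed on the phone; only display-ready text crosses over.
        for step in route_steps:
            self.send_to_eyewear(OverlayDirective(text=step, anchor="center"))

# Example: a fake transport that just prints what the glasses would show.
ext = NavigationAppExtension(send_to_eyewear=print)
ext.on_destination_chosen("cafe", ["Turn left in 100 m", "Destination on the right"])
```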
-
Publication number: 20250199621
Abstract: A head-worn device system includes one or more cameras, one or more display devices and one or more processors. The system also includes a memory storing instructions that, when executed by the one or more processors, configure the system to detect a gesture made by a user of the computing apparatus and generate gesture data identifying the gesture, select an application or selected action from a set of registered applications and actions based on the gesture data, and invoke the application or selected action.
Type: Application
Filed: March 4, 2025
Publication date: June 19, 2025
Inventors: Sharon Moll, Piotr Gurgul, Francis Patrick Sullivan, Andrei Rybin
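The gesture-to-action lookup reads like a simple registry. A minimal Python sketch of that idea follows; the gesture labels, the registry contents, and the `on_gesture_detected` entry point are assumptions for illustration only.

```python
REGISTERED_ACTIONS = {      # gesture label -> callable to invoke
    "pinch": lambda: print("open quick menu"),
    "swipe_left": lambda: print("dismiss notification"),
    "thumbs_up": lambda: print("send reaction"),
}

def on_gesture_detected(gesture_data):
    """Select an action from the set of registered applications/actions based
    on the gesture data, then invoke it."""
    action = REGISTERED_ACTIONS.get(gesture_data.get("label"))
    if action is not None:
        action()

on_gesture_detected({"label": "pinch", "confidence": 0.93})
```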
-
Patent number: 12335876
Abstract: Systems, methods, and computer readable media that schedule requests for location data of a mobile device, where the methods include selecting a first positioning system based on a power requirement, a latency requirement, and an accuracy requirement, and determining whether a first condition is satisfied for querying the first positioning system. The method further comprises in response to a determination that the first condition is satisfied, querying the first positioning system for first position data. The method further comprises in response to a determination that the first condition is not satisfied, selecting a second positioning system based on the power requirement, the latency requirement, and the accuracy requirement, determining whether a second condition is satisfied for querying the second positioning system, and in response to a determination that the second condition is satisfied, querying the second positioning system for second position data.
Type: Grant
Filed: January 11, 2024
Date of Patent: June 17, 2025
Assignee: Snap Inc.
Inventors: Piotr Gurgul, Lucas Rangit Magasweran
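One way to picture this selection-with-fallback is a filter over candidate positioning systems by the power, latency, and accuracy requirements, followed by a per-system availability condition. The Python sketch below is a plausible reading of the abstract, not the claimed method; the attribute names, the lowest-power-first ordering, and the `is_available` condition are assumptions.

```python
from dataclasses import dataclass

@dataclass
class PositioningSystem:
    name: str
    power_mw: float
    latency_s: float
    accuracy_m: float
    is_available: callable  # the "condition" checked before querying this system

def select_systems(systems, max_power_mw, max_latency_s, max_accuracy_m):
    """Return systems meeting the power, latency, and accuracy requirements,
    lowest power draw first."""
    eligible = [s for s in systems
                if s.power_mw <= max_power_mw
                and s.latency_s <= max_latency_s
                and s.accuracy_m <= max_accuracy_m]
    return sorted(eligible, key=lambda s: s.power_mw)

def get_position(systems, requirements, query):
    """Try the first eligible system whose condition is satisfied; otherwise
    fall back to the next one."""
    for system in select_systems(systems, **requirements):
        if system.is_available():
            return query(system)
    return None

systems = [
    PositioningSystem("wifi", power_mw=40, latency_s=1.0, accuracy_m=20, is_available=lambda: True),
    PositioningSystem("gnss", power_mw=120, latency_s=5.0, accuracy_m=5, is_available=lambda: False),
]
requirements = {"max_power_mw": 150, "max_latency_s": 6.0, "max_accuracy_m": 25}
print(get_position(systems, requirements, query=lambda s: f"fix from {s.name}"))
```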
-
Publication number: 20250191313
Abstract: Systems, methods, and computer readable media for input modalities for an augmented reality (AR) wearable device are disclosed. The AR wearable device captures images using an image capturing device and processes the images to identify objects. The objects may be people, places, things, and so forth. The AR wearable device associates the objects with tags such as the name of the object or a function that can be provided by the selection of the object. The AR wearable device then matches the tags of the objects with tags associated with AR applications. The AR wearable device presents on a display of the AR wearable device indications of the AR applications with matching tags, which provides a user with the opportunity to invoke one of the AR applications. The AR wearable device recognizes a selection of an AR application in a number of different ways including gesture recognition and voice commands.
Type: Application
Filed: February 20, 2025
Publication date: June 12, 2025
Inventors: Piotr Gurgul, Sharon Moll
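The tag-matching step amounts to intersecting the tags of recognized objects with the tags each AR application registers. A short Python sketch of that intersection follows; the application names and tag sets are made up for illustration and do not come from the patent.

```python
AR_APP_TAGS = {                    # AR application -> tags it registers for
    "plant_identifier": {"plant", "flower"},
    "translator_lens": {"sign", "text"},
    "museum_guide": {"painting", "sculpture"},
}

def matching_applications(detected_object_tags):
    """Return AR applications whose registered tags overlap the tags attached
    to objects recognized in the camera image."""
    detected = set(detected_object_tags)
    return [app for app, tags in AR_APP_TAGS.items() if tags & detected]

# Tags produced by the on-device object recognizer for the current frame.
print(matching_applications(["flower", "bench", "sign"]))
# -> ['plant_identifier', 'translator_lens'] (shown to the user to invoke)
```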
-
Publication number: 20250193801
Abstract: Systems, methods, and computer readable media that determine a location of a device using multi-source geolocation data, where the methods include accessing new location data from a location source of a plurality of location sources, where the new location data includes a new position and an accuracy of the new position, and determining a current position and an accuracy of the current position based on the new position, the accuracy of the new position, a previous current position, and an accuracy of the previous current position. The method further includes determining a change in location based on a difference between the current position and the previous current position. Some systems, methods, and computer readable media are directed to scheduling location requests to generate location data where the scheduling and the actual requests are made based on a number of conditions.
Type: Application
Filed: February 24, 2025
Publication date: June 12, 2025
Inventors: Piotr Gurgul, Lucas Rangit Magasweran
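The abstract does not say how the previous and new positions are combined; one common approach is to weight each by its reported accuracy. The Python sketch below uses inverse-variance weighting as an example of such a combination and a simple distance threshold for the change-in-location check; both choices are the editor's assumptions, not necessarily the claimed method.

```python
def fuse_position(prev_pos, prev_acc_m, new_pos, new_acc_m):
    """Blend a previous position estimate with a new reading, weighting each
    by the inverse square of its reported accuracy (smaller accuracy value =
    more trusted). Returns the fused position and its accuracy."""
    w_prev = 1.0 / (prev_acc_m ** 2)
    w_new = 1.0 / (new_acc_m ** 2)
    total = w_prev + w_new
    lat = (prev_pos[0] * w_prev + new_pos[0] * w_new) / total
    lon = (prev_pos[1] * w_prev + new_pos[1] * w_new) / total
    fused_acc = (1.0 / total) ** 0.5
    return (lat, lon), fused_acc

def position_changed(current, previous, threshold_m=10.0, meters_per_degree=111_320):
    """Report a change in location when the (approximate) distance between the
    current and previous positions exceeds a threshold."""
    d_lat = (current[0] - previous[0]) * meters_per_degree
    d_lon = (current[1] - previous[1]) * meters_per_degree
    return (d_lat ** 2 + d_lon ** 2) ** 0.5 > threshold_m

pos, acc = fuse_position((37.7749, -122.4194), 30.0, (37.7751, -122.4190), 10.0)
print(pos, acc, position_changed(pos, (37.7749, -122.4194)))
```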
-
Publication number: 20250180369
Abstract: Systems and methods are provided for package delivery assistance. The systems and methods detect, by a wearable device, an identifier associated with a package and retrieve package delivery information from a package delivery device using the identifier of the package. The systems and methods display, by the wearable device, a portion of the package delivery information and generate, by the wearable device, visual navigational assistance to guide a courier to a delivery location associated with the package.
Type: Application
Filed: November 30, 2023
Publication date: June 5, 2025
Inventors: Piotr Gurgul, Sharon Moll
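The courier-assistance flow (scan identifier, fetch delivery record, display a portion of it, start navigation) can be outlined as a short pipeline. The Python below is only a schematic of that flow; every function, field, and value is a placeholder invented for the example.

```python
def assist_delivery(scan_label, fetch_delivery_info, display, navigate_to):
    """Wearable-side flow: read the package identifier from the label scan,
    look up its delivery record, surface the relevant portion to the courier,
    and start visual navigation to the drop-off location."""
    package_id = scan_label()                      # e.g. barcode / QR decode
    info = fetch_delivery_info(package_id)         # query the package delivery device
    display(f"{info['recipient']} - {info['instructions']}")
    navigate_to(info["delivery_location"])

assist_delivery(
    scan_label=lambda: "PKG-001",
    fetch_delivery_info=lambda pid: {
        "recipient": "Unit 4B",
        "instructions": "leave at door",
        "delivery_location": (47.6062, -122.3321),
    },
    display=print,
    navigate_to=lambda loc: print("navigating to", loc),
)
```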
-
Patent number: 12298768
Abstract: Apparatuses, computer readable medium, and methods for image capturing while circumnavigating objects using mobile devices are disclosed. Example methods include capturing an image, processing the image to identify an object within the image, determining a path around the object and a number of images to capture of the object, dividing the path by the number of images to determine a number of waypoints, and navigating the mobile device to the waypoints and capturing an image of the object at each waypoint of the waypoints. Examples include a person pointing at an object and the mobile device identifying the object based on the person pointing at the object. The mobile device determines a bounding box and a geometric center of the bounding box to determine the path to circumnavigate the object. The mobile device determines a height above a ground to assist in navigation.
Type: Grant
Filed: April 18, 2023
Date of Patent: May 13, 2025
Assignee: Snap Inc.
Inventors: Piotr Gurgul, Sharon Moll
-
Patent number: 12298521
Abstract: Systems, methods, and computer readable media for voice input for augmented reality (AR) wearable devices are disclosed. Embodiments are disclosed that enable a user to interact with the AR wearable device without using physical user interface devices. A keyword is used to indicate that the user is about to speak an action or command. The AR wearable device divides the processing of the audio data into a keyword module that is trained to recognize the keyword and a module to process the audio data after the keyword. In some embodiments, the AR wearable device transmits the audio data after the keyword to a host device to process. The AR wearable device maintains an application registry that associates actions with applications. Applications can be downloaded, and the application registry updated where the applications indicate actions to associate with the application.
Type: Grant
Filed: April 18, 2024
Date of Patent: May 13, 2025
Assignee: Snap Inc.
Inventors: Sharon Moll, Piotr Gurgul, Tomasz Zakrzewski
-
Patent number: 12271647
Abstract: An architecture is provided for packaging visual overlay-based user interfaces (UIs) into mobile device applications to work as user interface extensions that allow certain flows and logic to be displayed on an eyewear device when connected to the mobile device application. The extension of the UIs of the mobile device applications to the display of the eyewear device allows for inexpensive experimentation with augmented reality (AR) UIs for eyewear devices and allows for reusing of business logic across mobile devices and associated eyewear devices. For example, a mobile device application for maps or navigation may be extended to show directions on an associated eyewear device once the destination is chosen in the navigation application on the mobile device. In this example, the business logic would still live in the navigation application on the mobile device but the user would see AR directions overlaid on a display of the eyewear device.
Type: Grant
Filed: August 29, 2022
Date of Patent: April 8, 2025
Assignee: Snap Inc.
Inventors: Piotr Gurgul, Sharon Moll
-
Publication number: 20250111852
Abstract: A system and method for controlling an electronic eyewear device using voice commands receives audio data from a microphone, processes the audio data to identify a wake word, and upon identification of a wake word, processes the audio data to identify at least one action keyword in the audio data. The audio data is provided to one of a plurality of controllers associated with different action keywords or sets of action keywords to implement an action. For example, the audio data may be provided to a settings controller to adjust settings of the electronic eyewear device when the action keyword is indicative of a request to adjust a setting of the electronic eyewear device or to a navigation controller to navigate to the system information of the electronic eyewear device when the action keyword is indicative of a request to navigate to system information of the electronic eyewear device.
Type: Application
Filed: December 13, 2024
Publication date: April 3, 2025
Inventor: Piotr Gurgul
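Dispatching audio to a controller based on an action keyword can be shown with a small routing table. The Python sketch below works on already-transcribed text rather than raw audio, and the wake word, keywords, and controller classes are placeholders chosen for the example rather than the patented components.

```python
class SettingsController:
    def handle(self, command):
        print("adjusting setting:", command)

class NavigationController:
    def handle(self, command):
        print("navigating to system info:", command)

# Sets of action keywords mapped to the controller that implements the action.
CONTROLLERS = {
    ("brightness", "volume"): SettingsController(),
    ("battery", "storage", "about"): NavigationController(),
}

def process_voice_command(audio_text, wake_word="hey glasses"):
    """Gate on the wake word, find an action keyword in what follows, and hand
    the command to the controller registered for that keyword."""
    if not audio_text.lower().startswith(wake_word):
        return
    command = audio_text[len(wake_word):].strip().lower()
    for keywords, controller in CONTROLLERS.items():
        if any(word in command for word in keywords):
            controller.handle(command)
            return

process_voice_command("hey glasses turn the brightness up")
```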
-
Patent number: 12266057
Abstract: Systems, methods, and computer readable media for input modalities for an augmented reality (AR) wearable device are disclosed. The AR wearable device captures images using an image capturing device and processes the images to identify objects. The objects may be people, places, things, and so forth. The AR wearable device associates the objects with tags such as the name of the object or a function that can be provided by the selection of the object. The AR wearable device then matches the tags of the objects with tags associated with AR applications. The AR wearable device presents on a display of the AR wearable device indications of the AR applications with matching tags, which provides a user with the opportunity to invoke one of the AR applications. The AR wearable device recognizes a selection of an AR application in a number of different ways including gesture recognition and voice commands.
Type: Grant
Filed: June 2, 2022
Date of Patent: April 1, 2025
Assignee: Snap Inc.
Inventors: Piotr Gurgul, Sharon Moll
-
Patent number: 12265663
Abstract: A head-worn device system includes one or more cameras, one or more display devices and one or more processors. The system also includes a memory storing instructions that, when executed by the one or more processors, configure the system to detect a gesture made by a user of the computing apparatus and generate gesture data identifying the gesture, select an application or selected action from a set of registered applications and actions based on the gesture data, and invoke the application or selected action.
Type: Grant
Filed: April 4, 2022
Date of Patent: April 1, 2025
Assignee: Snap Inc.
Inventors: Sharon Moll, Piotr Gurgul, Francis Patrick Sullivan, Andrei Rybin
-
Patent number: 12262326
Abstract: Systems, methods, and computer readable media that determine a location of a device using multi-source geolocation data, where the methods include accessing new location data from a location source of a plurality of location sources, where the new location data includes a new position and an accuracy of the new position, and determining a current position and an accuracy of the current position based on the new position, the accuracy of the new position, a previous current position, and an accuracy of the previous current position. The method further includes determining a change in location based on a difference between the current position and the previous current position. Some systems, methods, and computer readable media are directed to scheduling location requests to generate location data where the scheduling and the actual requests are made based on a number of conditions.
Type: Grant
Filed: February 8, 2023
Date of Patent: March 25, 2025
Assignee: Snap Inc.
Inventors: Piotr Gurgul, Lucas Rangit Magasweran
-
Publication number: 20250097548
Abstract: A mixed-reality media content system may be configured to perform operations that include: causing display of image data at a client device, the image data comprising a depiction of an object that includes a graphical code at a position upon the object; detecting the graphical code at the position upon the depiction of the object based on the image data; accessing media content within a media repository based on the graphical code scanned by the client device; and causing display of a presentation of the media content at the position of the graphical code upon the depiction of the object at the client device.
Type: Application
Filed: December 5, 2024
Publication date: March 20, 2025
Inventors: Sharon Moll, Piotr Gurgul, Dawei Zhang
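The detect-code, look-up-media, render-at-position sequence can be outlined in a few lines. In the Python sketch below the "scanner" is a trivial stand-in that reads a pre-decoded payload from a dict, and the repository contents and function names are invented for the example; it is a schematic of the flow, not the patented system.

```python
MEDIA_REPOSITORY = {          # graphical code payload -> media content to show
    "poster-42": "trailer.mp4",
    "menu-7": "daily_specials_overlay.png",
}

def detect_graphical_code(image):
    """Stand-in for a code scanner: returns (payload, (x, y)).
    Here the 'image' is just a dict carrying a pre-decoded code for brevity."""
    return image.get("code"), image.get("code_position")

def present_media_at_code(image, render):
    """Find the graphical code on the depicted object, fetch the matching
    media from the repository, and display it at the code's position."""
    payload, position = detect_graphical_code(image)
    media = MEDIA_REPOSITORY.get(payload)
    if media is not None and position is not None:
        render(media, position)

present_media_at_code(
    {"code": "poster-42", "code_position": (120, 340)},
    render=lambda media, pos: print(f"rendering {media} at {pos}"),
)
```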
-
Publication number: 20250095304
Abstract: Systems, methods, and computer readable media for object counting on augmented reality (AR) wearable devices are disclosed. Embodiments are disclosed that enable display of a count of objects as part of a user view. Upon receipt of a request to count objects, the AR wearable device captures an image of the user view. The AR wearable device transmits the image to a backend for processing to determine the objects in the image. The AR wearable device selects a group of objects of the determined objects to count and overlays boundary boxes over counted objects within the user view. The position of the boundary boxes is adjusted to account for movement of the AR wearable device. A hierarchy of objects is used to group together objects that are related but have different labels or names.
Type: Application
Filed: November 25, 2024
Publication date: March 20, 2025
Inventors: Piotr Gurgul, Sharon Moll, Tomasz Zakrzewski
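Two pieces of this abstract lend themselves to a short sketch: grouping differently labeled detections under one countable name via a hierarchy, and shifting the overlaid boxes to compensate for device movement. The Python below illustrates both; the hierarchy entries, labels, and the simple pixel-offset adjustment are assumptions made for the example.

```python
# Hierarchy used to group differently labeled detections under one countable name.
OBJECT_HIERARCHY = {"cola can": "beverage", "juice box": "beverage", "mug": "beverage"}

def count_objects(detections, requested_group):
    """Count detections whose label (or hierarchy parent) matches the request,
    returning the count plus the bounding boxes to overlay in the user's view."""
    boxes = [d["box"] for d in detections
             if d["label"] == requested_group
             or OBJECT_HIERARCHY.get(d["label"]) == requested_group]
    return len(boxes), boxes

def adjust_boxes(boxes, dx, dy):
    """Shift overlay boxes to compensate for movement of the wearable since
    the frame was captured (dx, dy in display pixels)."""
    return [(x1 + dx, y1 + dy, x2 + dx, y2 + dy) for (x1, y1, x2, y2) in boxes]

detections = [
    {"label": "cola can", "box": (10, 10, 40, 60)},
    {"label": "juice box", "box": (70, 12, 100, 58)},
    {"label": "notebook", "box": (150, 20, 210, 80)},
]
count, boxes = count_objects(detections, "beverage")
print(count, adjust_boxes(boxes, dx=-3, dy=1))
```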
-
Publication number: 20250076990
Abstract: AR-enabled wearable electronic devices such as smart glasses are adapted for use as an Internet of Things (IoT) remote control device where the user can control a pointer on a television screen, computer screen, or other IoT-enabled device to select items by looking at them and making selections using gestures. Built-in six-degrees-of-freedom (6DoF) tracking capabilities are used to move the pointer on the screen to facilitate navigation. The display screen is tracked in real-world coordinates to determine the point of intersection of the user's view with the screen using raycasting techniques. Hand and head gesture detection are used to allow the user to execute a variety of control actions by performing different gestures. The techniques are particularly useful for smart displays that offer AR-enhanced content that can be viewed in the displays of the AR-enabled wearable electronic devices.
Type: Application
Filed: November 18, 2024
Publication date: March 6, 2025
Inventors: Sharon Moll, Piotr Gurgul
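The raycasting step described here reduces to a ray-plane intersection between the user's view direction (from 6DoF tracking) and the tracked screen plane. The Python sketch below shows that geometry in world coordinates; the specific pose values, NumPy representation, and function name are illustrative assumptions rather than the patented pipeline.

```python
import numpy as np

def ray_plane_intersection(origin, direction, plane_point, plane_normal):
    """Point where the gaze ray hits the screen plane, or None if the ray is
    parallel to the plane or points away from it. All arguments are 3-vectors
    in world coordinates."""
    direction = direction / np.linalg.norm(direction)
    denom = float(np.dot(plane_normal, direction))
    if abs(denom) < 1e-6:
        return None
    t = float(np.dot(plane_normal, plane_point - origin)) / denom
    return origin + t * direction if t > 0 else None

# Head pose from 6DoF tracking: 2 m in front of a screen lying in the z=0 plane.
head_position = np.array([0.3, 1.5, 2.0])
gaze_direction = np.array([0.0, 0.0, -1.0])     # looking straight at the screen
screen_point = np.array([0.0, 0.0, 0.0])
screen_normal = np.array([0.0, 0.0, 1.0])

hit = ray_plane_intersection(head_position, gaze_direction, screen_point, screen_normal)
print(hit)   # world-space point that maps to the on-screen pointer position
```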
-
Patent number: 12212820
Abstract: A mixed-reality media content system may be configured to perform operations that include: causing display of image data at a client device, the image data comprising a depiction of an object that includes a graphical code at a position upon the object; detecting the graphical code at the position upon the depiction of the object based on the image data; accessing media content within a media repository based on the graphical code scanned by the client device; and causing display of a presentation of the media content at the position of the graphical code upon the depiction of the object at the client device.
Type: Grant
Filed: February 13, 2024
Date of Patent: January 28, 2025
Assignee: Snap Inc.
Inventors: Sharon Moll, Piotr Gurgul, Dawei Zhang