Patents by Inventor Sven Kratz

Sven Kratz has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11972521
    Abstract: Input indicative of a selection of volumetric content for presentation is received. The volumetric content comprises a volumetric representation of one or more elements of a real-world three-dimensional space. In response to the input, device state data associated with the volumetric content is accessed. The device state data describes a state of one or more network-connected devices associated with the real-world three-dimensional space. The volumetric content is presented. The presentation of the volumetric content includes presentation of the volumetric representation of the one or more elements overlaid on the real-world three-dimensional space by a display device and configuring the one or more network-connected devices using the device state data.
    Type: Grant
    Filed: August 31, 2022
    Date of Patent: April 30, 2024
    Assignee: Snap Inc.
    Inventors: Rajan Vaish, Sven Kratz, Andrés Monroy-Hernández, Brian Anthony Smith
  • Patent number: 11954774
    Abstract: Systems and methods enable users to build augmented reality (AR) experiences with Internet of Things (IoT) devices. The system includes an AR object studio that includes a list of IoT devices and control signals for the respective IoT devices and a list of AR objects (e.g., an AR lens). The AR object studio receives selections from users and correlates at least one IoT device to at least one AR object in response to the user selections. During use, a server receives an indication that an AR object has been activated and interacted with on a display of an AR camera device and, in response, sends a control signal to a correlated IoT device. Conversely, the server may receive a signal from an IoT device and, in response, present and control a correlated AR object on the display of the AR camera device.
    Type: Grant
    Filed: August 29, 2021
    Date of Patent: April 9, 2024
    Assignee: Snap Inc.
    Inventors: Yu Jiang Tham, Rajan Vaish, Andrés Monroy-Hernández, Sven Kratz, Ana Maria Cardenas Gasca
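The two-way correlation this abstract describes — AR objects paired with IoT devices, with events routed in both directions — can be sketched roughly as below. All class, method, and identifier names (`ARObjectStudio`, `correlate`, the device and lens ids) are illustrative assumptions, not taken from the patent.

```python
# Hypothetical sketch of the AR object studio described in the abstract:
# it correlates AR objects with IoT devices and routes events both ways.

class ARObjectStudio:
    def __init__(self):
        self._ar_to_iot = {}   # AR object id -> (IoT device id, control signal)
        self._iot_to_ar = {}   # IoT device id -> AR object id

    def correlate(self, ar_object_id, iot_device_id, control_signal):
        """Record a user-selected pairing between an AR object and an IoT device."""
        self._ar_to_iot[ar_object_id] = (iot_device_id, control_signal)
        self._iot_to_ar[iot_device_id] = ar_object_id

    def on_ar_interaction(self, ar_object_id):
        """AR object activated on the camera device -> control signal to the IoT device."""
        device_id, signal = self._ar_to_iot[ar_object_id]
        return {"device": device_id, "signal": signal}

    def on_iot_signal(self, iot_device_id):
        """Signal from an IoT device -> AR object to present on the camera display."""
        return self._iot_to_ar[iot_device_id]


studio = ARObjectStudio()
studio.correlate("rain_lens", "smart_sprinkler", "toggle_on")
print(studio.on_ar_interaction("rain_lens"))   # interaction routed to the sprinkler
print(studio.on_iot_signal("smart_sprinkler"))  # device signal routed back to the lens
```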
  • Patent number: 11941231
    Abstract: Systems and methods for interfacing augmented reality (AR) camera devices to Internet of Things (IoT) devices include pairing an AR camera device to an IoT device to establish which actions recognized by the AR camera device may control the IoT device, receiving an action identifier representing a gesture or position of a body part that has been recognized by the AR camera device, and sending a command to the IoT device paired with the AR camera device to perform a requested action in response to the action identifier received from the AR camera device. Context information relating to the action identifier from the AR camera may be used to modify the command sent to the IoT device. The AR camera device may be paired with an IoT device pointed at by the AR camera device, selected through a user interface of the AR camera device, or determined from context information.
    Type: Grant
    Filed: August 29, 2021
    Date of Patent: March 26, 2024
    Assignee: Snap Inc.
    Inventors: Yu Jiang Tham, Rajan Vaish, Andrés Monroy-Hernández, Sven Kratz, Ana Maria Cardenas Gasca
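The command flow in this abstract — a recognized gesture yields an action identifier, which maps to a command for the paired IoT device, possibly modified by context — might look like the following minimal sketch. The action identifiers, command strings, and context keys are invented for illustration.

```python
# Illustrative gesture-to-command dispatch for a paired AR camera / IoT device.

PAIRED_ACTIONS = {
    # (paired IoT device, action identifier) -> base command
    ("living_room_lamp", "thumbs_up"): "power_on",
    ("living_room_lamp", "thumbs_down"): "power_off",
}

def command_for(device_id, action_id, context=None):
    """Translate a recognized action identifier into an IoT command,
    optionally modified by context (e.g., ambient light level)."""
    command = PAIRED_ACTIONS[(device_id, action_id)]
    if context and command == "power_on" and context.get("ambient") == "dark":
        command = "power_on_dim"  # context-modified variant of the base command
    return command

print(command_for("living_room_lamp", "thumbs_up"))
print(command_for("living_room_lamp", "thumbs_up", {"ambient": "dark"}))
```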
  • Patent number: 11935198
    Abstract: A virtual mailbox is established for a first user based on a virtual mailbox definition specified by the first user. A message directed to the first user is received from a first device associated with a second user. A location within a real-world environment of the first user corresponding to the virtual mailbox is identified. A marker associated with the virtual mailbox is detected within the real-world environment of the first user. Based on detecting the marker, a second device associated with the first user presents the message overlaid on the real-world environment at the location corresponding to the virtual mailbox.
    Type: Grant
    Filed: June 29, 2021
    Date of Patent: March 19, 2024
    Assignee: Snap Inc.
    Inventors: Rajan Vaish, Yu Jiang Tham, Brian Anthony Smith, Sven Kratz, Karen Stolzenberg, David Meisenholder
  • Publication number: 20240069637
    Abstract: The present disclosure relates to methods and systems for providing a touch-based augmented reality (AR) experience. During a capture phase, a first user may grip an object. An intensity of a force applied on the object in the grip and/or a duration of the grip may be recorded. A volumetric representation of the first user holding the object may also be captured. During an experience phase, when a second user touches the object, the object may provide haptic feedback (e.g., a vibration) to the second user at an intensity and a duration corresponding to the intensity of the force applied on the object and the duration of the grip. If a volumetric representation of the first user holding the object was captured, touching the object may also trigger a presentation of the first user's volumetric body holding the object.
    Type: Application
    Filed: November 16, 2022
    Publication date: February 29, 2024
    Inventors: Rajan Vaish, Sven Kratz, Andrés Monroy-Hernández, Brian Anthony Smith
  • Publication number: 20240071006
    Abstract: A volumetric content presentation system includes a head-worn display device, which includes one or more processors, and a memory storing instructions that, when executed by the one or more processors, configure the display device to access AR content items that correspond to either real-world objects or virtual objects, mix and match these AR content items, and present volumetric content that includes these mixed and matched AR content items overlaid on a real-world environment to create a new AR scene that a user can experience.
    Type: Application
    Filed: November 22, 2022
    Publication date: February 29, 2024
    Inventors: Sven Kratz, Andrés Monroy-Hernández, Brian Anthony Smith, Rajan Vaish
  • Publication number: 20240071007
    Abstract: The present disclosure relates to methods and systems for providing a presentation of an experience (e.g., a journey) to a user using augmented reality (AR). During a capture phase, persons in the journey may take videos or pictures using their smartphones, GoPros, and/or smart glasses. A drone may also take videos or pictures during the journey. During an experience phase, an AR topographical rendering of the real-world environment of the journey may be rendered on a tabletop, highlighting/animating a path persons took in the journey. The persons may be rendered as miniature avatars/dolls overlaid on the representation of the real-world environment. When the user clicks on a point in the presentation of the journey, a perspective (e.g., the videos or pictures) at that point is presented.
    Type: Application
    Filed: February 15, 2023
    Publication date: February 29, 2024
    Inventors: Rajan Vaish, Sven Kratz, Andrés Monroy-Hernández, Brian Anthony Smith
  • Publication number: 20240071004
    Abstract: A system monitors a user environment via one or more sensors included in a computing device and detects, via a trigger, that event data is stored in a data store based on the monitoring. The system further detects one or more participants in the event data and invites them to share in an augmented reality event and/or a virtual reality event. The system also creates, based on the event data, augmented reality event data and/or virtual reality event data, and presents it to the one or more participants in a synchronous mode and/or an asynchronous mode via the computing device.
    Type: Application
    Filed: August 31, 2022
    Publication date: February 29, 2024
    Inventors: Rajan Vaish, Sven Kratz, Andrés Monroy-Hernández, Brian Anthony Smith
  • Publication number: 20240071008
    Abstract: A two-dimensional element is identified from one or more two-dimensional images. A volumetric content item is generated based on the two-dimensional element identified from the one or more two-dimensional images. A display device presents the volumetric content item overlaid on a real-world environment that is within a field of view of a user of the display device.
    Type: Application
    Filed: February 16, 2023
    Publication date: February 29, 2024
    Inventors: Rajan Vaish, Sven Kratz, Andrés Monroy-Hernández, Brian Anthony Smith
  • Publication number: 20240073404
    Abstract: A display device presents volumetric content comprising a volumetric video. The volumetric video comprises a volumetric representation of one or more elements of a three-dimensional space. Input indicative of a control operation associated with the presentation of the volumetric video is received. The presentation of the volumetric video by the display device is controlled by executing the control operation. While the control operation is being executed, the volumetric representation of the one or more elements of the three-dimensional space is displayed from multiple perspectives based on movement of a user.
    Type: Application
    Filed: August 31, 2022
    Publication date: February 29, 2024
    Inventors: Rajan Vaish, Sven Kratz, Andrés Monroy-Hernández, Brian Anthony Smith
  • Publication number: 20240069627
    Abstract: A system monitors an environment via one or more sensors included in a computing device and applies a trigger to detect that a memory experience is stored in a data store based on the monitoring. The system creates an augmented reality memory experience, a virtual reality memory experience, or a combination thereof, based on the trigger if the memory experience is detected. The system additionally projects the augmented reality memory experience, the virtual reality memory experience, or the combination thereof, via the computing device.
    Type: Application
    Filed: August 31, 2022
    Publication date: February 29, 2024
    Inventors: Rajan Vaish, Sven Kratz, Andrés Monroy-Hernández, Brian Anthony Smith
  • Publication number: 20240069626
    Abstract: A system captures via one or more sensors of a computing device, data of an environment observed by the one or more sensors at a first timeslot, and stores the data in a data store as a first portion of a timelapse memory experience. The system also captures, via the one or more sensors of a computing device, data of the environment observed by the one or more sensors at a second timeslot, and stores the data in a data store as a second portion of the timelapse memory experience. The system additionally associates the timelapse memory experience with a memory experience trigger, wherein the memory experience trigger can initiate a presentation of the timelapse memory experience.
    Type: Application
    Filed: August 31, 2022
    Publication date: February 29, 2024
    Inventors: Rajan Vaish, Sven Kratz, Andrés Monroy-Hernández, Brian Anthony Smith
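The capture-and-trigger structure this abstract describes — sensor data captured at successive timeslots, stored as portions of one timelapse experience, and replayed when an associated trigger fires — can be sketched as follows. The data model, field names, and trigger strings are assumptions for illustration only.

```python
# Minimal sketch of a timelapse memory experience with a replay trigger.

from dataclasses import dataclass, field

@dataclass
class TimelapseMemory:
    portions: list = field(default_factory=list)  # one entry per captured timeslot
    trigger: str = ""

    def capture(self, timeslot, sensor_data):
        """Store one timeslot's sensor data as a portion of the experience."""
        self.portions.append({"timeslot": timeslot, "data": sensor_data})

    def associate_trigger(self, trigger):
        """Associate the experience with a trigger that can initiate presentation."""
        self.trigger = trigger

    def present_if(self, observed_trigger):
        """Return the ordered portions when the trigger fires, else None."""
        if observed_trigger and observed_trigger == self.trigger:
            return sorted(self.portions, key=lambda p: p["timeslot"])
        return None


memory = TimelapseMemory()
memory.capture(1, "garden_scan_spring")
memory.capture(2, "garden_scan_summer")
memory.associate_trigger("user_enters_garden")
print(memory.present_if("user_enters_garden"))  # both portions, in timeslot order
```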
  • Publication number: 20240073402
    Abstract: The present disclosure relates to methods and systems for providing a multi-perspective augmented reality experience. A volumetric video of a three-dimensional space is captured. The volumetric video of the three-dimensional space includes a volumetric representation of a first user within the three-dimensional space. The volumetric video is displayed by a display device worn by a second user, and the second user sees the volumetric representation of the first user within the three-dimensional space. Input indicative of an interaction (e.g., entering or leaving) of the second user with the volumetric representation of the first user is detected. Based on detecting the input indicative of the interaction, the display device switches to a display of a recorded perspective of the first user. Thus, by interacting with a volumetric representation of the first user in a volumetric video, the second user views the first user's perspective of the three-dimensional space.
    Type: Application
    Filed: August 31, 2022
    Publication date: February 29, 2024
    Inventors: Rajan Vaish, Sven Kratz, Andrés Monroy-Hernández, Brian Anthony Smith
  • Publication number: 20240070969
    Abstract: Input indicative of a selection of volumetric content for presentation is received. The volumetric content comprises a volumetric representation of one or more elements of a real-world three-dimensional space. In response to the input, device state data associated with the volumetric content is accessed. The device state data describes a state of one or more network-connected devices associated with the real-world three-dimensional space. The volumetric content is presented. The presentation of the volumetric content includes presentation of the volumetric representation of the one or more elements overlaid on the real-world three-dimensional space by a display device and configuring the one or more network-connected devices using the device state data.
    Type: Application
    Filed: August 31, 2022
    Publication date: February 29, 2024
    Inventors: Rajan Vaish, Sven Kratz, Andrés Monroy-Hernández, Brian Anthony Smith
  • Publication number: 20240056524
    Abstract: An olfactory sticker is used in chats between electronic devices to make messaging more immersive, personalized, and authentic by integrating olfactory information with traditional text-based formats and AR messaging. Olfactory stickers indicate to a recipient that a message includes olfactory information. The olfactory sticker presents a graphical representation of a particular scent that can be sent and received via a chat or an AR message. Olfactory stickers give the recipient control over when to access the transmitted scent, which is particularly useful since certain olfactory information, i.e., scents, can be very direct and intrusive. Olfactory stickers are activated (i.e., release their scent) when they are tapped or rubbed by the recipient.
    Type: Application
    Filed: October 26, 2023
    Publication date: February 15, 2024
    Inventors: Andrés Monroy-Hernández, Sven Kratz, Rajan Vaish
  • Patent number: 11887480
    Abstract: A context-aware safety device includes a wireless transceiver, a memory storing an application, and one or more processors. When executing the application, the one or more processors are configured to determine a context of a safety device, configure an alert based on the determined context, and broadcast the configured alert using the wireless transceiver.
    Type: Grant
    Filed: August 13, 2020
    Date of Patent: January 30, 2024
    Assignee: Harman International Industries, Incorporated
    Inventors: Joseph Verbeke, Sven Kratz, Stefan Marti
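The three-step flow in this abstract — determine a context, configure an alert for it, broadcast the alert via a wireless transceiver — might be sketched as below. The contexts, sensor keys, and alert payloads are invented examples, not from the patent.

```python
# Hedged sketch of a context-aware safety device: context determination,
# context-dependent alert configuration, and broadcast via a send function.

def determine_context(sensor_readings):
    """Very rough context classification from assumed sensor readings."""
    if sensor_readings.get("speed_kmh", 0) > 10:
        return "cycling"
    return "walking"

def configure_alert(context):
    """Pick alert parameters appropriate to the determined context."""
    profiles = {
        "cycling": {"range_m": 100, "message": "cyclist ahead"},
        "walking": {"range_m": 30, "message": "pedestrian nearby"},
    }
    return profiles[context]

def broadcast(alert, transmit):
    """Hand the configured alert to a transceiver send function."""
    transmit(alert)


sent = []
context = determine_context({"speed_kmh": 22})
broadcast(configure_alert(context), sent.append)
print(sent)  # one configured alert for the 'cycling' context
```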
  • Publication number: 20240027394
    Abstract: An electronic device, such as a smartphone, smart eyewear, or a smart watch, includes an olfactory detector with an array of olfactory sensors for determining scents proximate the device. Each sensor in the array is tuned to detect the presence and concentration of one or more specific chemical compounds or molecules. A fan creates airflow of ambient air across the olfactory sensors. An analog-to-digital (A/D) converter receives and processes the sensor outputs and provides the processed output to a processor for further processing. Scent type and intensity can be classified by using the sensor information as input to a machine learning model generated through supervised training on labeled example measurements from the sensor array.
    Type: Application
    Filed: July 19, 2022
    Publication date: January 25, 2024
    Inventors: Sven Kratz, Andrés Monroy-Hernández, Rajan Vaish
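The classification step this abstract describes — per-channel sensor responses fed to a supervised model that labels scent type and intensity — can be illustrated with a toy nearest-centroid classifier standing in for the trained model. The channel count, training vectors, and scent labels are invented for illustration.

```python
# Toy stand-in for the supervised scent classifier: nearest centroid over
# labeled example measurements, with intensity taken as mean response.

import math

# Labeled example measurements (one reading per sensor channel).
TRAINING = {
    "coffee": [0.9, 0.1, 0.2],
    "citrus": [0.1, 0.8, 0.3],
}

def classify(reading):
    """Return (scent_label, intensity); intensity ~ overall response magnitude."""
    label = min(TRAINING, key=lambda k: math.dist(TRAINING[k], reading))
    intensity = round(sum(reading) / len(reading), 2)
    return label, intensity

print(classify([0.85, 0.15, 0.25]))  # close to the 'coffee' centroid
```

A production model would of course be trained on many labeled measurements per scent rather than a single centroid each, but the input/output shape is the same.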
  • Patent number: 11878714
    Abstract: In various embodiments, while a self-driving model operates a vehicle, a user monitoring subsystem acquires sensor data associated with a user of the vehicle and a vehicle observation subsystem acquires sensor data associated with the vehicle. The user monitoring subsystem computes values for a psychological metric based on the sensor data associated with the user. Based on the values for the psychological metric, a feedback application determines a description of the user over a first time period. The feedback application generates a dataset based on the description and the sensor data associated with the vehicle. Subsequently, a training application performs machine learning operation(s) on the self-driving model based on the dataset to generate a modified self-driving model. Advantageously, the dataset enables the training application to automatically modify the self-driving model to account for the impact different driving actions have on the user.
    Type: Grant
    Filed: April 6, 2020
    Date of Patent: January 23, 2024
    Assignee: Harman International Industries, Incorporated
    Inventors: Joseph Verbeke, Sven Kratz, Stefan Marti
  • Patent number: 11875022
    Abstract: Systems and methods for interfacing augmented reality (AR) camera devices to Internet of Things (IoT) devices include pairing an AR camera device to an IoT device to establish which actions recognized by the AR camera device may control the IoT device, receiving an action identifier representing a gesture or position of a body part that has been recognized by the AR camera device, and sending a command to the IoT device paired with the AR camera device to perform a requested action in response to the action identifier received from the AR camera device. Context information relating to the action identifier from the AR camera may be used to modify the command sent to the IoT device. The AR camera device may be paired with an IoT device pointed at by the AR camera device, selected through a user interface of the AR camera device, or determined from context information.
    Type: Grant
    Filed: August 29, 2021
    Date of Patent: January 16, 2024
    Assignee: Snap Inc.
    Inventors: Yu Jiang Tham, Rajan Vaish, Andrés Monroy-Hernández, Sven Kratz, Ana Maria Cardenas Gasca
  • Publication number: 20230409158
    Abstract: An augmented reality (AR) device is interfaced to an Internet of Things (IoT) device by receiving IoT device information in a standardized schema from the IoT device. The IoT device information includes device inputs and available output information for the IoT device. A predetermined AR user interface widget to render for the IoT device is determined from the received IoT device information. The predetermined AR user interface widget converts user inputs to the AR device into the device inputs for the IoT device. Upon selection of the IoT device by the AR device, the predetermined AR user interface widget for the selected IoT device is provided to the AR device as an overlay on a display of the AR device. Device input received in response to user interaction with the AR user interface widget is provided to the IoT device in an input type expected by the IoT device.
    Type: Application
    Filed: June 15, 2022
    Publication date: December 21, 2023
    Inventors: Sven Kratz, Andrés Monroy-Hernández, Yu Jiang Tham
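The schema-to-widget mapping this abstract describes — a standardized device schema declares input types, a predetermined AR widget is chosen from them, and widget values are converted into the input type the device expects — could be sketched as follows. The schema fields, widget names, and value ranges are assumptions, not from the patent.

```python
# Illustrative mapping from a standardized IoT device schema to a
# predetermined AR user-interface widget, with input conversion.

WIDGET_FOR_INPUT = {
    "boolean": "toggle_widget",   # on/off devices
    "range":   "slider_widget",   # dimmers, thermostats
    "enum":    "picker_widget",   # mode selectors
}

def widget_for_device(schema):
    """Choose an AR widget from the device's declared input type."""
    return WIDGET_FOR_INPUT[schema["input"]["type"]]

def to_device_input(schema, user_value):
    """Convert an AR widget value into the input type the device expects."""
    kind = schema["input"]["type"]
    if kind == "range":
        lo, hi = schema["input"]["min"], schema["input"]["max"]
        return lo + (hi - lo) * user_value  # widget emits 0.0..1.0
    return user_value


lamp = {"input": {"type": "range", "min": 0, "max": 255}}
print(widget_for_device(lamp))     # slider for a dimmable lamp
print(to_device_input(lamp, 0.5))  # mid-slider -> 127.5
```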