Patents by Inventor Daniel Wagner

Daniel Wagner has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20230194062
    Abstract: A lighting apparatus for a motor vehicle in which a first structural component of the lighting apparatus is connected to a second structural component of the lighting apparatus by a twist-on connection. The second structural component comprises at least two coupling elements that, in the fully assembled state, at least partly fit over the first structural component, in particular at its perimeter; the first structural component reaches this engaged position through a rotation relative to the second structural component.
    Type: Application
    Filed: December 19, 2022
    Publication date: June 22, 2023
    Applicant: Marelli Automotive Lighting Reutlingen (Germany) GmbH
    Inventors: Ralf Stopper, Christian Lange, Fan Lu, Lothar Pfitzner, Daniel Wagner
  • Publication number: 20230194859
    Abstract: A method for configuring a digital light projector (DLP) of an augmented reality (AR) display device is described. A light source component of the DLP is configured to generate a single red-green-blue color sequence repetition per image frame. The AR display device identifies a color sequence of the light source component of the DLP and tracks a motion of the AR display device. The AR display device adjusts an operation of the DLP based on the single red-green-blue color sequence repetition, the identified color sequence, and the motion of the AR display device.
    Type: Application
    Filed: February 23, 2023
    Publication date: June 22, 2023
    Inventors: Jeffrey Michael DeWall, Dominik Schnitzer, Amit Singh, Daniel Wagner
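
Field-sequential DLP displays draw red, green, and blue in separate time slices, so head motion smears the color fields apart ("color break-up"); adjusting the projector's operation using the tracked motion, as the abstract describes, amounts to compensating each color field for the motion accumulated during the frame. A minimal sketch of that idea, in which the pixels-per-radian mapping, field timings, and function names are all illustrative assumptions rather than values from the publication:

```python
# Hypothetical sketch: compensating color break-up in a field-sequential
# DLP display, assuming one red-green-blue repetition per frame and a
# known head angular velocity.

PIXELS_PER_RADIAN = 1500.0  # assumed display mapping

def field_offsets(field_times_s, angular_velocity_rad_s):
    """Per-color horizontal pixel shifts that counteract head motion.

    Each color field is shifted opposite to the rotation accumulated
    since the start of the frame, so the fields land on the same
    world-fixed position.
    """
    t0 = field_times_s[0]
    return [-(t - t0) * angular_velocity_rad_s * PIXELS_PER_RADIAN
            for t in field_times_s]

# One R-G-B repetition spread over a ~16.7 ms (60 Hz) frame:
offsets = field_offsets([0.0, 0.00556, 0.01111],
                        angular_velocity_rad_s=0.5)
```

With a single color-sequence repetition per frame, the fields are far apart in time, which is exactly why per-field compensation of this kind matters.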
  • Publication number: 20230188691
    Abstract: A miniaturized active dual pixel stereo system and method for close-range depth extraction includes a projector adapted to project a locally distinct pattern onto a scene and a dual pixel sensor including a dual pixel sensor array that generates respective displaced images of the scene. A three-dimensional image is generated from the displaced images by projecting the locally distinct pattern onto the scene, capturing the respective displaced images using the dual pixel sensor, generating disparity images from the displaced images, determining the depth of each pixel of the disparity images, and generating the three-dimensional image from the determined depths. A three-dimensional image of a user's hands generated by the active dual pixel stereo system may be processed by gesture-recognition software to provide an input to an electronic eyewear device.
    Type: Application
    Filed: December 14, 2021
    Publication date: June 15, 2023
    Inventors: Robert John Hergert, Sagi Katz, Gilad Refael, Daniel Wagner, Weston Welge, Ramzi Zahreddine
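
The final depth step in the pipeline above follows the standard stereo relation depth = focal length × baseline / disparity. A sketch of that conversion, noting that the sub-millimeter baseline of a dual-pixel sensor is what restricts such systems to close range; the focal length and baseline values are assumptions, not figures from the publication:

```python
# Illustrative disparity-to-depth conversion for a dual-pixel stereo
# system with a very small baseline.

def disparity_to_depth(disparity_px, focal_length_px, baseline_m):
    """Convert per-pixel disparities to metric depths.

    Returns None where disparity is zero (unmatched or at infinity).
    """
    return [focal_length_px * baseline_m / d if d else None
            for d in disparity_px]

# e.g. 700 px focal length and a 0.5 mm dual-pixel baseline:
depths = disparity_to_depth([2.0, 1.0, 0.0],
                            focal_length_px=700.0,
                            baseline_m=0.0005)
```

Even one pixel of disparity here corresponds to only 35 cm of depth, which illustrates why the locally distinct projected pattern is needed to make matching reliable at close range.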
  • Publication number: 20230186521
    Abstract: Eyewear providing an interactive augmented reality experience between two eyewear devices by using alignment between respective 6DOF trajectories, also referred to herein as ego motion alignment. An eyewear device of user A and an eyewear device of user B track the eyewear device of the other user, or an object of the other user, such as on the user's face, to provide the collaborative AR experience. This enables sharing common three-dimensional content between multiple eyewear users without using or aligning the eyewear devices to common image content such as a marker, which is a more lightweight solution with reduced computational burden on a processor. An inertial measurement unit may also be used to align the eyewear devices.
    Type: Application
    Filed: February 7, 2023
    Publication date: June 15, 2023
    Inventors: Branislav Micusik, Georgios Evangelidis, Daniel Wagner
  • Publication number: 20230177708
    Abstract: A depth estimation system to perform operations that include: receiving image data generated by a client device, the image data comprising a depiction of an environment; identifying a set of image features based on the image data; determining a pose of the client device based on the set of image features; generating a depth estimation based on the image data and the pose of the client device; and generating a mesh model of the environment based on the depth estimation.
    Type: Application
    Filed: December 5, 2022
    Publication date: June 8, 2023
    Inventors: Erick Mendez Mendez, Isac Andreas Müller Sandvik, Qi Pan, Edward James Rosten, Andrew Tristan Spek, Daniel Wagner, Jakob Zillner
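
One step in the pipeline above, turning a per-pixel depth estimate into 3D points that could seed a mesh of the environment, can be sketched with a standard pinhole back-projection. The intrinsics below are illustrative assumptions, not values from the publication:

```python
# Minimal pinhole back-projection: pixel (u, v) with an estimated depth
# becomes a camera-space 3D point that a meshing step could consume.

def unproject(u, v, depth, fx, fy, cx, cy):
    """Back-project pixel (u, v) at the given depth into camera-space
    coordinates (x, y, z) using pinhole intrinsics."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

# A pixel at the principal point lies on the optical axis:
p = unproject(320.0, 240.0, 2.0, fx=500.0, fy=500.0, cx=320.0, cy=240.0)
```

Composing such points with the device pose determined from the image features places them in a common world frame before meshing.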
  • Patent number: 11662589
    Abstract: An eyewear device with flexible frame for Augmented Reality (AR) is disclosed. At least two sensors and a display are mounted on the flexible frame. When in use, the real time geometry of the eyewear device may change from factory calibrated geometry, resulting in low quality AR rendering. A modeling module is provided to model the real time geometry of the eyewear device on the fly using sensor information of the at least two sensors. The modeled real time geometry is then provided to a rendering module to accurately display the AR to the user.
    Type: Grant
    Filed: January 31, 2022
    Date of Patent: May 30, 2023
    Assignee: Snap Inc.
    Inventors: Clemens Birklbauer, Georg Halmetschlager-Funek, Jeroen Hol, Matthias Kalkgruber, Daniel Wagner
  • Publication number: 20230156357
    Abstract: Visual-inertial tracking of an eyewear device using one or more rolling shutter cameras. The eyewear device includes a position determining system. Visual-inertial tracking is implemented by sensing motion of the eyewear device. An initial pose is obtained for a rolling shutter camera and an image of an environment is captured. The image includes feature points captured at particular capture times. A number of poses for the rolling shutter camera are computed based on the initial pose and the sensed movement of the device; the number of computed poses is responsive to the sensed movement. A computed pose is selected for each feature point in the image by matching the feature point's capture time to the computed pose's time. The position of the device within the environment is determined using the feature points and their selected computed poses.
    Type: Application
    Filed: January 19, 2023
    Publication date: May 18, 2023
    Inventors: Matthias Kalkgruber, Erick Mendez Mendez, Daniel Wagner, Daniel Wolf, Kai Zhou
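
A rolling shutter reads the frame out row by row, so each feature point carries its own capture time; the abstract's idea is to compute a small set of poses across the readout and match each feature to the nearest-in-time pose. A sketch of that selection step, using a 1-D constant-velocity "pose" as a stand-in for the real 6-DOF case (all names and numbers are illustrative):

```python
# Sketch of per-feature pose selection for a rolling-shutter camera.

def compute_poses(initial_pose, velocity, readout_s, n_poses):
    """Poses at evenly spaced times across the frame readout, assuming
    constant velocity. In the described system, the number of poses
    would scale with the sensed motion."""
    times = [i * readout_s / (n_poses - 1) for i in range(n_poses)]
    return [(t, initial_pose + velocity * t) for t in times]

def select_pose(poses, capture_time_s):
    """Pick the computed pose whose time is closest to the feature
    point's capture time."""
    return min(poses, key=lambda p: abs(p[0] - capture_time_s))[1]

poses = compute_poses(initial_pose=0.0, velocity=10.0,
                      readout_s=0.03, n_poses=4)
pose_mid = select_pose(poses, capture_time_s=0.021)
```

Making the pose count responsive to motion keeps the computation cheap when the device is still and accurate when it moves fast.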
  • Publication number: 20230140291
    Abstract: Polymer compositions comprising stiff and tough star shaped styrene butadiene block-copolymers A1 and A2 can be used for making shrink films. Block copolymer A2 preferably has the structure with hard blocks Se and Si, hard random copolymer blocks (B/S)Ae, soft random copolymer blocks (B/S)B coupled by a coupling agent X.
    Type: Application
    Filed: February 19, 2018
    Publication date: May 4, 2023
    Inventors: Michiel VERSWYVEL, Norbert NIESSNER, Daniel WAGNER, Michael SCHUSTER, Geert VERLINDEN, Bart VAN-DEN-BOSSCHE, Konrad KNOLL
  • Publication number: 20230117690
    Abstract: Eyewear devices that include two SoCs that share the processing workload. Instead of using a single SoC located on either the left or right side of the eyewear device, the two SoCs are assigned different responsibilities, operating different devices and performing different processes to balance the workload. In one example, the eyewear device utilizes a first SoC to operate a first color camera, a second color camera, a first display, and a second display. The first SoC and a second SoC are configured to selectively run first and second computer vision (CV) camera algorithms. The first SoC is configured to perform visual-inertial odometry (VIO), track the user's hand gestures, and provide depth from stereo images. This division of responsibilities operates the various features efficiently with balanced power consumption.
    Type: Application
    Filed: October 14, 2021
    Publication date: April 20, 2023
    Inventors: Jason Heger, Gerald Nilles, Dmitry Ryuma, Patrick Timothy McSweeney Simons, Daniel Wagner
  • Publication number: 20230109916
    Abstract: Eyewear devices that include two SoCs that share the processing workload. Instead of using a single SoC located on either the left or right side of the eyewear device, the two SoCs are assigned different responsibilities, operating different devices and performing different processes to balance the workload. In one example, the eyewear device utilizes a first SoC to operate the OS, a first color camera, a second color camera, a first display, and a second display. A second SoC is configured to run computer vision (CV) algorithms and visual-inertial odometry (VIO), track the user's hand gestures, and provide depth from stereo. This division of responsibilities operates the various features efficiently with balanced power consumption.
    Type: Application
    Filed: October 7, 2021
    Publication date: April 13, 2023
    Inventors: Jason Heger, Gerald Nilles, Dmitry Ryuma, Patrick Timothy McSweeney Simons, Daniel Wagner
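
The dual-SoC workload split in the two entries above can be summarized as a static task-to-SoC assignment. The mapping below paraphrases this entry's example split; the data structure and task names are an invented illustration, not the patent's terminology:

```python
# Illustrative task-to-SoC assignment for a dual-SoC eyewear device.

WORKLOAD = {
    "soc1": ["os", "color_camera_left", "color_camera_right",
             "display_left", "display_right"],
    "soc2": ["cv_algorithms", "vio", "hand_tracking", "stereo_depth"],
}

def soc_for(task):
    """Look up which SoC is responsible for a given task."""
    for soc, tasks in WORKLOAD.items():
        if task in tasks:
            return soc
    raise KeyError(task)
```

Pinning I/O-heavy camera and display duties to one SoC and compute-heavy vision duties to the other is what balances thermal load and power draw across the frame.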
  • Patent number: 11614618
    Abstract: A method for configuring a digital light projector (DLP) of an augmented reality (AR) display device is described. A light source component of the DLP is configured to generate a single red-green-blue color sequence repetition per image frame. The AR display device identifies a color sequence of the light source component of the DLP and tracks a motion of the AR display device. The AR display device adjusts an operation of the DLP based on the single red-green-blue color sequence repetition, the identified color sequence, and the motion of the AR display device.
    Type: Grant
    Filed: April 9, 2021
    Date of Patent: March 28, 2023
    Assignee: SNAP INC.
    Inventors: Jeffrey Michael DeWall, Dominik Schnitzer, Amit Singh, Daniel Wagner
  • Patent number: 11587255
    Abstract: Eyewear providing an interactive augmented reality experience between two eyewear devices by using alignment between respective 6DOF trajectories, also referred to herein as ego motion alignment. An eyewear device of user A and an eyewear device of user B track the eyewear device of the other user, or an object of the other user, such as on the user's face, to provide the collaborative AR experience. This enables sharing common three-dimensional content between multiple eyewear users without using or aligning the eyewear devices to common image content such as a marker, which is a more lightweight solution with reduced computational burden on a processor. An inertial measurement unit may also be used to align the eyewear devices.
    Type: Grant
    Filed: September 25, 2020
    Date of Patent: February 21, 2023
    Assignee: Snap Inc.
    Inventors: Branislav Micusik, Georgios Evangelidis, Daniel Wagner
  • Patent number: 11582409
    Abstract: Visual-inertial tracking of an eyewear device using one or more rolling shutter cameras. The eyewear device includes a position determining system. Visual-inertial tracking is implemented by sensing motion of the eyewear device. An initial pose is obtained for a rolling shutter camera and an image of an environment is captured. The image includes feature points captured at particular capture times. A number of poses for the rolling shutter camera are computed based on the initial pose and the sensed movement of the device; the number of computed poses is responsive to the sensed movement. A computed pose is selected for each feature point in the image by matching the feature point's capture time to the computed pose's time. The position of the device within the environment is determined using the feature points and their selected computed poses.
    Type: Grant
    Filed: January 29, 2021
    Date of Patent: February 14, 2023
    Assignee: Snap Inc.
    Inventors: Matthias Kalkgruber, Erick Mendez Mendez, Daniel Wagner, Daniel Wolf, Kai Zhou
  • Publication number: 20230007064
    Abstract: Techniques are disclosed for allowing remote participation in collaborative video review based on joint state data, for a video collaboration session, maintained by a video collaboration service. A user at a participant client device may provide one or more annotations, such as a drawing annotation, for the video data via a client application. The client application transmits computer-readable instructions for re-creating the drawing annotation to the service, which distributes the drawing instructions to the other participant client devices. Using the drawing instructions, the client applications at the client devices are configured to re-create the drawing annotation on the associated video frame displayed at the client devices. The joint state data communicated to client devices for a given session may include co-presence data that efficiently increases communication among the participants in a session.
    Type: Application
    Filed: June 30, 2021
    Publication date: January 5, 2023
    Inventors: Siya Yang, Alan Rogers, Daniel Wagner, Irene Ma, Jason Stakelon
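
The distribution pattern described above relays machine-readable drawing instructions rather than rendered pixels: the annotating client serializes the stroke, the service forwards it, and each peer re-creates the annotation locally on the same video frame. A hedged sketch of the two endpoints, where the instruction schema is an invented illustration, not the patent's actual format:

```python
# Sketch: serialize a drawing annotation as re-creation instructions,
# then decode it on a peer client.
import json

def make_drawing_instructions(frame, points, color):
    """Annotator side: serialize a freehand stroke on a video frame."""
    return json.dumps({"type": "draw", "frame": frame,
                       "points": points, "color": color})

def recreate_annotation(payload):
    """Peer side: decode the instructions and return the stroke to
    render on the associated frame."""
    msg = json.loads(payload)
    assert msg["type"] == "draw"
    return msg["frame"], [tuple(p) for p in msg["points"]], msg["color"]

wire = make_drawing_instructions(42, [[0.1, 0.2], [0.3, 0.4]], "#ff0000")
frame, stroke, color = recreate_annotation(wire)
```

Sending instructions instead of images keeps each annotation a few bytes, which is what makes per-participant fan-out through the service practical.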
  • Publication number: 20230006852
    Abstract: Techniques are disclosed for avoiding conflicting user actions while users synchronously participate in collaborative video review based on joint state data for a video collaboration session. User actions may conflict, e.g., when one user submits a video playback instruction to change the current frame of the session while another user is performing a frame-specific action on that frame. The video collaboration service freezes the current frame in the joint state data upon detecting that a frame-specific action is being performed or is likely imminent. A freeze condition may be detected implicitly or explicitly, and may likewise be lifted implicitly or explicitly; the current frame of the joint state data can be unfrozen only once no freeze conditions remain in effect. A visual freeze indication may be displayed by one or more client applications participating in the session while a freeze condition is active.
    Type: Application
    Filed: August 10, 2022
    Publication date: January 5, 2023
    Inventors: Siya Yang, Alan Rogers, Daniel Wagner, Irene Ma, Jason Stakelon
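
The freeze logic above amounts to a small state machine: playback instructions that would change the current frame are rejected while any freeze condition is active, and the frame unfreezes only once every condition is lifted. A minimal sketch, with invented class and method names:

```python
# Sketch of freeze-condition handling for a shared video session.

class JointState:
    def __init__(self, frame=0):
        self.current_frame = frame
        self.freeze_conditions = set()  # e.g. {"user-3:drawing"}

    def add_freeze(self, condition):
        self.freeze_conditions.add(condition)

    def lift_freeze(self, condition):
        self.freeze_conditions.discard(condition)

    def seek(self, frame):
        """Apply a playback instruction unless the frame is frozen."""
        if self.freeze_conditions:
            return False  # conflicting action rejected
        self.current_frame = frame
        return True

state = JointState(frame=10)
state.add_freeze("user-3:drawing")
rejected = state.seek(25)        # frozen: frame stays at 10
state.lift_freeze("user-3:drawing")
accepted = state.seek(25)        # unfrozen: frame advances
```

Using a set of named conditions is one way to satisfy the "no active freeze conditions" requirement: several users can freeze independently, and the frame thaws only when the set is empty.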
  • Patent number: D991945
    Type: Grant
    Filed: September 2, 2021
    Date of Patent: July 11, 2023
    Assignee: Endress+Hauser Flowtec AG
    Inventors: Johannes Sprenger, Markus Hoh, Daniel Wagner, Andreas Wiedmer, Kevin Martin
  • Patent number: D991946
    Type: Grant
    Filed: September 2, 2021
    Date of Patent: July 11, 2023
    Assignee: Endress+Hauser Flowtec AG
    Inventors: Johannes Sprenger, Markus Hoh, Daniel Wagner, Andreas Wiedmer, Kevin Martin
  • Patent number: D991947
    Type: Grant
    Filed: September 2, 2021
    Date of Patent: July 11, 2023
    Assignee: Endress+Hauser Flowtec AG
    Inventors: Johannes Sprenger, Markus Hoh, Daniel Wagner, Andreas Wiedmer, Kevin Martin
  • Patent number: D991948
    Type: Grant
    Filed: September 2, 2021
    Date of Patent: July 11, 2023
    Assignee: Endress+Hauser Flowtec AG
    Inventors: Johannes Sprenger, Markus Hoh, Daniel Wagner, Andreas Wiedmer, Kevin Martin
  • Patent number: D991965
    Type: Grant
    Filed: September 2, 2021
    Date of Patent: July 11, 2023
    Assignee: Endress+Hauser Flowtec AG
    Inventors: Johannes Sprenger, Markus Hoh, Daniel Wagner, Andreas Wiedmer, Kevin Martin