Patents by Inventor Daniel Wagner

Daniel Wagner has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11768577
    Abstract: Systems and methods for providing and/or presenting, to a user, a user interface for an environment that includes virtual objects are disclosed. Exemplary implementations may: obtain, from electronic storage, information regarding virtual objects in a virtual three-dimensional space that has a virtual three-dimensional volume; determine a subset of voxels from the set of voxels such that the subset of voxels encompasses a three-dimensional volume that includes at least part of a first external surface of the first virtual object; determine proximity information for the first virtual object; determine a manipulation granularity; adjust the manipulation granularity based on the proximity information; receive particular user input from the user having a particular input magnitude; manipulate the first virtual object within the virtual three-dimensional space in accordance with the received particular user input; and effectuate presentation of the user interface to the user through a client computing platform.
    Type: Grant
    Filed: October 7, 2021
    Date of Patent: September 26, 2023
    Assignee: Meta Platforms Technologies, LLC
    Inventors: Kai Zhou, Daniel Wagner
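The abstract above describes scaling the manipulation granularity by how close the manipulated object is to other virtual objects, so a drag of the same magnitude moves the object in finer steps near potential contacts. The patent publishes no source code; the following is a minimal Python sketch of that idea, in which the proximity falloff, the granularity bounds, and the linear scaling rule are illustrative assumptions rather than values from the patent.

```python
import numpy as np

def manipulation_granularity(distance_to_nearest, coarse=0.10, fine=0.005, falloff=0.5):
    """Map proximity (in scene units) to a translation step size.

    Far from other objects the step is coarse; within `falloff` of another
    object it shrinks linearly toward `fine`. All constants are illustrative.
    """
    if distance_to_nearest >= falloff:
        return coarse
    t = distance_to_nearest / falloff          # 0 (touching) .. 1 (at falloff)
    return fine + t * (coarse - fine)

def manipulate(position, input_direction, input_magnitude, distance_to_nearest):
    """Move a virtual object by user input scaled by the adjusted granularity."""
    step = manipulation_granularity(distance_to_nearest)
    direction = np.asarray(input_direction, dtype=float)
    direction /= np.linalg.norm(direction)
    return np.asarray(position, dtype=float) + direction * input_magnitude * step

# The same drag magnitude moves the object less when another object is 2 cm away.
print(manipulate([0, 0, 0], [1, 0, 0], input_magnitude=3.0, distance_to_nearest=1.0))
print(manipulate([0, 0, 0], [1, 0, 0], input_magnitude=3.0, distance_to_nearest=0.02))
```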
  • Publication number: 20230296902
    Abstract: An eyewear device with a flexible frame for Augmented Reality (AR) is disclosed. At least two sensors and a display are mounted on the flexible frame. When in use, the real-time geometry of the eyewear device may change from the factory-calibrated geometry, resulting in low-quality AR rendering. A modeling module is provided to model the real-time geometry of the eyewear device on the fly using sensor information from the at least two sensors. The modeled real-time geometry is then provided to a rendering module to accurately display the AR to the user.
    Type: Application
    Filed: April 21, 2023
    Publication date: September 21, 2023
    Inventors: Clemens Birklbauer, Georg Halmetschlager-Funek, Jeroen Hol, Matthias Kalkgruber, Daniel Wagner
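The modeling module described above must replace the factory extrinsic calibration between the sensors with an estimate of the frame's current, bent geometry before rendering. How that estimate is obtained is not spelled out in the abstract; the sketch below only shows the bookkeeping step, updating a factory camera-to-camera transform with an estimated bend rotation about the frame's hinge axis. The bend model and all numbers are illustrative assumptions.

```python
import numpy as np

def rotation_about_axis(axis, angle_rad):
    """Rodrigues rotation matrix for a unit axis and angle."""
    axis = np.asarray(axis, dtype=float)
    axis = axis / np.linalg.norm(axis)
    k = np.array([[0, -axis[2], axis[1]],
                  [axis[2], 0, -axis[0]],
                  [-axis[1], axis[0], 0]])
    return np.eye(3) + np.sin(angle_rad) * k + (1 - np.cos(angle_rad)) * (k @ k)

def realtime_extrinsic(factory_r, factory_t, bend_axis, bend_angle_rad):
    """Update the factory left-to-right sensor transform with an estimated frame bend."""
    bend = rotation_about_axis(bend_axis, bend_angle_rad)
    return bend @ factory_r, bend @ factory_t

# Factory geometry: right sensor 12 cm to the right, no rotation; frame bends 2 deg about Y.
r, t = realtime_extrinsic(np.eye(3), np.array([0.12, 0.0, 0.0]),
                          bend_axis=[0, 1, 0], bend_angle_rad=np.radians(2.0))
print(np.round(t, 4))   # the right sensor's modeled position shifts slightly in Z
```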
  • Publication number: 20230283831
    Abstract: The present technology pertains to synchronized video viewing that is supported by the use of a pending changes count to keep the client devices in synchronization while providing a user experience that matches the expectations of the user. A second client device can receive input to change some aspect of the playback of a video that is being viewed synchronously with at least one other device. The second client device can process the event so the video can reflect the received input. The second client device can also send a collaboration message to a synchronized video viewing service to inform other client devices of the command. Since the second client device is aware of its own event, the second client device can ignore processing any other collaboration messages until it receives the collaboration message it initiated echoed back to it. Thereafter it can resume processing received collaboration messages.
    Type: Application
    Filed: March 3, 2022
    Publication date: September 7, 2023
    Inventors: Alan Rogers, Siya Yang, Daniel Wagner, Dylan Nelson, Jason Stakelon
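The abstract above turns on a simple echo rule: after a client applies its own playback change locally and sends a collaboration message, it ignores incoming collaboration messages until its own message comes back from the synchronized viewing service. A minimal Python sketch of that client-side rule follows; the message fields and the single pending-changes counter are assumptions made for illustration.

```python
class SyncPlaybackClient:
    """Client-side bookkeeping for synchronized video viewing (illustrative)."""

    def __init__(self, client_id):
        self.client_id = client_id
        self.pending_changes = 0      # messages we sent and have not yet seen echoed
        self.position_s = 0.0

    def local_seek(self, position_s, send):
        """Apply a local seek immediately, then notify the service."""
        self.position_s = position_s
        self.pending_changes += 1
        send({"origin": self.client_id, "cmd": "seek", "pos": position_s})

    def on_collaboration_message(self, msg):
        """Apply remote commands, but skip them while our own change is in flight."""
        if msg["origin"] == self.client_id:
            # Our own message echoed back: resume processing remote commands.
            self.pending_changes = max(0, self.pending_changes - 1)
            return
        if self.pending_changes > 0:
            return                    # ignore until our echo arrives
        if msg["cmd"] == "seek":
            self.position_s = msg["pos"]

# Usage: the client ignores the remote seek that raced with its own change.
outbox = []
c = SyncPlaybackClient("B")
c.local_seek(42.0, outbox.append)
c.on_collaboration_message({"origin": "A", "cmd": "seek", "pos": 10.0})   # ignored
c.on_collaboration_message(outbox[0])                                     # echo: resume
c.on_collaboration_message({"origin": "A", "cmd": "seek", "pos": 10.0})   # now applied
print(c.position_s)   # 10.0
```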
  • Publication number: 20230267552
    Abstract: The method and system are for the management of contractual liability risk transfer via the Internet and consist of the following: a method and system for (a) verification of vendor compliance with contractual insurance purchasing obligations, (b) transmitting notice of claims and/or potential claims to a client that has issued a policy to said vendor, and (c) creation of contractual liability risk transfer obligations between an upstream party and a non-party vendor consistent with the liability risk transfer requirements specified in the construction agreement and/or insurance policy issued to the upstream party by the client.
    Type: Application
    Filed: February 24, 2023
    Publication date: August 24, 2023
    Inventor: Daniel Wagner London
  • Patent number: 11722630
    Abstract: Disclosed are systems, methods, and non-transitory computer-readable media for varied depth determination using stereo vision and phase detection auto focus (PDAF). Computer stereo vision (stereo vision) is used to extract three-dimensional information from digital images. To utilize stereo vision, two optical sensors are displaced horizontally from one another and used to capture images depicting two differing views of a real-world environment from two different vantage points. The relative depth of the objects captured in the images is determined using triangulation by comparing the relative positions of the objects in the two images. For example, the relative positions of matching objects (e.g., features) identified in the captured images are used along with the known orientation of the optical sensors (e.g., distance between the optical sensors, vantage points of the optical sensors) to estimate the depth of the objects.
    Type: Grant
    Filed: May 17, 2022
    Date of Patent: August 8, 2023
    Assignee: Snap Inc.
    Inventors: Sagi Katz, Daniel Wagner, Weston Welge
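The triangulation step described above, comparing the horizontal displacement (disparity) of a matched feature between the two images against the known baseline and focal length, reduces to a one-line formula for a rectified stereo pair. A short sketch with illustrative focal length and baseline values is shown below; the PDAF refinement the patent combines with stereo vision is not reproduced here.

```python
def stereo_depth(x_left_px, x_right_px, focal_px=800.0, baseline_m=0.06):
    """Depth of a matched feature from a rectified stereo pair.

    disparity = x_left - x_right (pixels); depth = f * B / disparity.
    The focal length and baseline are illustrative numbers.
    """
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        raise ValueError("matched feature must have positive disparity")
    return focal_px * baseline_m / disparity

# A feature displaced 12 px between the two views is about 4 m away.
print(round(stereo_depth(412.0, 400.0), 2))
```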
  • Publication number: 20230239423
    Abstract: Disclosed are systems, methods, and non-transitory computer-readable media for varied depth determination using stereo vision and phase detection auto focus (PDAF). Computer stereo vision (stereo vision) is used to extract three-dimensional information from digital images. To utilize stereo vision, two optical sensors are displaced horizontally from one another and used to capture images depicting two differing views of a real-world environment from two different vantage points. The relative depth of the objects captured in the images is determined using triangulation by comparing the relative positions of the objects in the two images. For example, the relative positions of matching objects (e.g., features) identified in the captured images are used along with the known orientation of the optical sensors (e.g., distance between the optical sensors, vantage points of the optical sensors) to estimate the depth of the objects.
    Type: Application
    Filed: March 30, 2023
    Publication date: July 27, 2023
    Inventors: Sagi Katz, Daniel Wagner, Weston Welge
  • Publication number: 20230194062
    Abstract: A lighting apparatus for a motor vehicle, wherein a first structural component of the lighting apparatus is connected to a second structural component of the lighting apparatus, and the connection between the first and second structural components is formed by a twist-on connection. The second structural component comprises at least two coupling elements that, in the fully assembled state, at least partially fit over the first structural component, in particular at its perimeter, and the first structural component is brought into this engaged position, in particular at its perimeter, through a rotation relative to the second structural component.
    Type: Application
    Filed: December 19, 2022
    Publication date: June 22, 2023
    Applicant: Marelli Automotive Lighting Reutlingen (Germany) GmbH
    Inventors: Ralf Stopper, Christian Lange, Fan Lu, Lothar Pfitzner, Daniel Wagner
  • Publication number: 20230194859
    Abstract: A method for configuring a digital light projector (DLP) of an augmented reality (AR) display device is described. A light source component of the DLP projector is configured to generate a single red-green-blue color sequence repetition per image frame. The AR display device identifies a color sequence of the light source component of the DLP projector and tracks a motion of the AR display device. The AR display device adjusts an operation of the DLP projector based on the single red-green-blue color sequence repetition, the color sequence of the light source component of the DLP projector, and the motion of the AR display device.
    Type: Application
    Filed: February 23, 2023
    Publication date: June 22, 2023
    Inventors: Jeffrey Michael DeWall, Dominik Schnitzer, Amit Singh, Daniel Wagner
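One way to read the adjustment described above is as compensation for color breakup on a field-sequential DLP: because the red, green, and blue fields of a single repetition are shown at slightly different times within the frame, head motion during the frame smears them apart unless each field is shifted by the motion accumulated up to its slot in the color sequence. The sketch below illustrates such a per-field shift for one red-green-blue repetition; the even field spacing, the angular-velocity input, and the pixels-per-degree factor are assumptions, not values from the publication.

```python
def per_color_shifts(color_sequence, frame_time_s, yaw_rate_dps, px_per_degree):
    """Horizontal pixel shift to apply to each color field of one frame.

    Fields are assumed evenly spaced across the frame; each is shifted by the
    head rotation accumulated between the frame start and its display slot.
    """
    n = len(color_sequence)
    shifts = {}
    for i, color in enumerate(color_sequence):
        field_time = (i + 0.5) / n * frame_time_s      # mid-point of the field's slot
        shifts[color] = yaw_rate_dps * field_time * px_per_degree
    return shifts

# 60 Hz frame, one R-G-B repetition, 30 deg/s head rotation, 40 px per degree of FOV.
print(per_color_shifts(["R", "G", "B"], frame_time_s=1 / 60,
                       yaw_rate_dps=30.0, px_per_degree=40.0))
```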
  • Publication number: 20230186521
    Abstract: Eyewear providing an interactive augmented reality experience between two eyewear devices by using alignment between respective 6DOF trajectories, also referred to herein as ego motion alignment. An eyewear device of user A and an eyewear device of user B track the eyewear device of the other user, or an object of the other user, such as on the user's face, to provide the collaborative AR experience. This enables sharing common three-dimensional content between multiple eyewear users without using or aligning the eyewear devices to common image content such as a marker, which is a more lightweight solution with reduced computational burden on a processor. An inertial measurement unit may also be used to align the eyewear devices.
    Type: Application
    Filed: February 7, 2023
    Publication date: June 15, 2023
    Inventors: Branislav Micusik, Georgios Evangelidis, Daniel Wagner
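The collaborative AR described above depends on expressing both users' 6DOF trajectories in a common frame without a shared marker. The publication does not include an algorithm listing; one standard way to align two trajectories, assumed here purely for illustration, is a least-squares rigid fit (Kabsch) of time-matched position samples:

```python
import numpy as np

def align_trajectories(traj_a, traj_b):
    """Rigid transform (R, t) mapping trajectory B's positions onto trajectory A's.

    traj_a, traj_b: (N, 3) arrays of time-matched positions. Least-squares fit
    via SVD (Kabsch); scale is assumed equal since both are metric tracks.
    """
    a = np.asarray(traj_a, dtype=float)
    b = np.asarray(traj_b, dtype=float)
    a_c, b_c = a - a.mean(axis=0), b - b.mean(axis=0)
    u, _, vt = np.linalg.svd(b_c.T @ a_c)
    d = np.sign(np.linalg.det(vt.T @ u.T))          # guard against a reflection
    r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
    t = a.mean(axis=0) - r @ b.mean(axis=0)
    return r, t

# Example: B's track is A's track rotated 90 degrees about Z and shifted.
a = np.array([[0, 0, 0], [1, 0, 0], [1, 1, 0], [2, 1, 0.5]])
rz = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], dtype=float)
b = (np.linalg.inv(rz) @ (a - np.array([0.3, -0.2, 0.1])).T).T
r, t = align_trajectories(a, b)
print(np.allclose((b @ r.T) + t, a))   # True
```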
  • Publication number: 20230188691
    Abstract: A miniaturized active dual pixel stereo system and method for close range depth extraction includes a projector adapted to project a locally distinct projected pattern onto an image of a scene and a dual pixel sensor including a dual pixel sensor array that generates respective displaced images of the scene. A three-dimensional image is generated from the displaced images of the scene by projecting the locally distinct projected pattern onto the image of the scene, capturing the respective displaced images of the scene using the dual pixel sensor, generating disparity images from the respective displaced images of the scene, determining depth to each pixel of the disparity images, and generating the three-dimensional image from the determined depth to each pixel. A three-dimensional image of a user's hands generated by the active dual pixel stereo system may be processed by gesture recognition software to provide an input to an electronic eyewear device.
    Type: Application
    Filed: December 14, 2021
    Publication date: June 15, 2023
    Inventors: Robert John Hergert, Sagi Katz, Gilad Refael, Daniel Wagner, Weston Welge, Ramzi Zahreddine
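The last two steps described above, turning per-pixel disparities between the displaced dual-pixel images into depth and then into a three-dimensional image, follow the same triangulation geometry as a conventional stereo pair, only with the very small effective baseline of the split pixels. The sketch below converts a disparity map into a point cloud under a pinhole-camera assumption; the focal length, effective baseline, and principal point are placeholder values, and the pattern projection and disparity matching stages are not reproduced.

```python
import numpy as np

def disparity_to_points(disparity_px, focal_px=700.0, baseline_m=0.001,
                        cx=320.0, cy=240.0):
    """Back-project a dual-pixel disparity map into a 3D point cloud.

    depth = f * B / disparity per pixel, then (x, y, z) via the pinhole model.
    The constants are illustrative; a dual pixel sensor's effective baseline
    depends on the lens aperture and pixel design.
    """
    d = np.asarray(disparity_px, dtype=float)
    valid = d > 1e-3                                  # avoid division by ~0
    z = np.full(d.shape, np.nan)
    z[valid] = focal_px * baseline_m / d[valid]
    v, u = np.indices(d.shape)
    x = (u - cx) * z / focal_px
    y = (v - cy) * z / focal_px
    return np.dstack([x, y, z])

# A synthetic 4x4 disparity map: larger disparity -> closer surface.
disp = np.full((4, 4), 2.0)
disp[:, 2:] = 4.0
points = disparity_to_points(disp)
print(points[0, 0, 2], points[0, 3, 2])   # ~0.35 m vs ~0.175 m
```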
  • Publication number: 20230177708
    Abstract: A depth estimation system to perform operations that include: receiving image data generated by a client device, the image data comprising a depiction of an environment; identifying a set of image features based on the image data; determining a pose of the client device based on the set of features; generating a depth estimation based on the image data and the pose of the client device; and generating a mesh model of the environment based on the depth estimation.
    Type: Application
    Filed: December 5, 2022
    Publication date: June 8, 2023
    Inventors: Erick Mendez Mendez, Isac Andreas Müller Sandvik, Qi Pan, Edward James Rosten, Andrew Tristan Spek, Daniel Wagner, Jakob Zillner
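The final step in the pipeline above, turning a depth estimation into a mesh model of the environment, is commonly done by back-projecting the depth image into a vertex grid and triangulating neighboring pixels. The publication does not include code; the following is a minimal sketch of that grid-triangulation idea under a pinhole-camera assumption, with illustrative intrinsics.

```python
import numpy as np

def depth_to_mesh(depth_m, focal_px=500.0, cx=None, cy=None):
    """Triangulate a depth image into (vertices, faces).

    Each pixel becomes a vertex by pinhole back-projection; every 2x2 block of
    pixels contributes two triangles. Intrinsics are illustrative defaults.
    """
    depth = np.asarray(depth_m, dtype=float)
    h, w = depth.shape
    cx = (w - 1) / 2 if cx is None else cx
    cy = (h - 1) / 2 if cy is None else cy
    v_idx, u_idx = np.indices((h, w))
    x = (u_idx - cx) * depth / focal_px
    y = (v_idx - cy) * depth / focal_px
    vertices = np.dstack([x, y, depth]).reshape(-1, 3)

    faces = []
    for r in range(h - 1):
        for c in range(w - 1):
            i = r * w + c
            faces.append([i, i + 1, i + w])           # upper-left triangle
            faces.append([i + 1, i + w + 1, i + w])   # lower-right triangle
    return vertices, np.array(faces)

# A flat 3x3 depth image at 2 m becomes a planar mesh of 8 triangles.
verts, faces = depth_to_mesh(np.full((3, 3), 2.0))
print(verts.shape, faces.shape)   # (9, 3) (8, 3)
```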
  • Patent number: 11662589
    Abstract: An eyewear device with a flexible frame for Augmented Reality (AR) is disclosed. At least two sensors and a display are mounted on the flexible frame. When in use, the real-time geometry of the eyewear device may change from the factory-calibrated geometry, resulting in low-quality AR rendering. A modeling module is provided to model the real-time geometry of the eyewear device on the fly using sensor information from the at least two sensors. The modeled real-time geometry is then provided to a rendering module to accurately display the AR to the user.
    Type: Grant
    Filed: January 31, 2022
    Date of Patent: May 30, 2023
    Assignee: Snap Inc.
    Inventors: Clemens Birklbauer, Georg Halmetschlager-Funek, Jeroen Hol, Matthias Kalkgruber, Daniel Wagner
  • Publication number: 20230156357
    Abstract: Visual-inertial tracking of an eyewear device using one or more rolling shutter cameras. The eyewear device includes a position determining system. Visual-inertial tracking is implemented by sensing motion of the eyewear device. An initial pose is obtained for a rolling shutter camera and an image of an environment is captured. The image includes feature points captured at a particular capture time. A number of poses for the rolling shutter camera is computed based on the initial pose and sensed movement of the device. The number of computed poses is responsive to the sensed movement of the mobile device. A computed pose is selected for each feature point in the image by matching the particular capture time for the feature point to the particular computed time for the computed pose. The position of the mobile device is determined within the environment using the feature points and the selected computed poses for the feature points.
    Type: Application
    Filed: January 19, 2023
    Publication date: May 18, 2023
    Inventors: Matthias Kalkgruber, Erick Mendez Mendez, Daniel Wagner, Daniel Wolf, Kai Zhou
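The selection step described above can be read as a nearest-timestamp lookup: a set of poses is computed across the image readout interval (more poses when the device moves faster), and each feature point picks the computed pose whose time is closest to the row time at which it was captured. A small Python sketch of that matching follows; the row-time model and the pose-count rule are illustrative assumptions.

```python
import bisect

def compute_pose_times(frame_start_s, readout_s, speed):
    """More computed poses for a faster-moving device (illustrative rule)."""
    n = max(2, min(10, int(round(2 + 8 * speed))))
    return [frame_start_s + readout_s * i / (n - 1) for i in range(n)]

def select_pose_index(pose_times, feature_row, image_rows, frame_start_s, readout_s):
    """Pick the computed pose closest in time to the row at which a feature was read out."""
    capture_time = frame_start_s + readout_s * feature_row / (image_rows - 1)
    i = bisect.bisect_left(pose_times, capture_time)
    if i == 0:
        return 0
    if i == len(pose_times):
        return len(pose_times) - 1
    # choose the nearer of the two bracketing computed times
    return i if pose_times[i] - capture_time < capture_time - pose_times[i - 1] else i - 1

times = compute_pose_times(frame_start_s=0.0, readout_s=0.03, speed=0.5)
print(len(times), select_pose_index(times, feature_row=240, image_rows=480,
                                    frame_start_s=0.0, readout_s=0.03))
```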
  • Publication number: 20230140291
    Abstract: Polymer compositions comprising stiff and tough star-shaped styrene-butadiene block copolymers A1 and A2 can be used for making shrink films. Block copolymer A2 preferably has a structure with hard blocks Se and Si, hard random copolymer blocks (B/S)Ae, and soft random copolymer blocks (B/S)B, coupled by a coupling agent X.
    Type: Application
    Filed: February 19, 2018
    Publication date: May 4, 2023
    Inventors: Michiel Verswyvel, Norbert Niessner, Daniel Wagner, Michael Schuster, Geert Verlinden, Bart Van-Den-Bossche, Konrad Knoll
  • Publication number: 20230117690
    Abstract: Eyewear devices that include two SoCs that share processing workload. Instead of using a single SoC located either on the left or right side of the eyewear devices, the two SoCs have different assigned responsibilities to operate different devices and perform different processes to balance workload. In one example, the eyewear device utilizes a first SoC to operate a first color camera, a second color camera, a first display, and a second display. The first SoC and a second SoC are configured to selectively operate first and second computer vision (CV) camera algorithms. The first SoC is configured to perform visual odometry (VIO), track hand gestures of the user, and provide depth from stereo images. This configuration organizes the workload to operate the various features efficiently with balanced power consumption.
    Type: Application
    Filed: October 14, 2021
    Publication date: April 20, 2023
    Inventors: Jason Heger, Gerald Nilles, Dmitry Ryuma, Patrick Timothy McSweeney Simons, Daniel Wagner
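The abstract above is essentially a workload-partitioning scheme: each subsystem of the eyewear device is bound to one of two SoCs so that camera, display, and computer-vision work (VIO, hand tracking, stereo depth) does not pile onto a single chip. The mapping below is a small illustrative configuration in the spirit of that description; the second SoC's duties listed here are placeholders, not the actual assignment from the publication.

```python
# Illustrative dual-SoC workload assignment for an eyewear device.
WORKLOAD_MAP = {
    "soc_1": [
        "first_color_camera",
        "second_color_camera",
        "first_display",
        "second_display",
        "visual_inertial_odometry",
        "hand_gesture_tracking",
        "stereo_depth",
        "first_cv_camera_algorithm",    # CV algorithms run selectively on either SoC
    ],
    "soc_2": [
        "second_cv_camera_algorithm",
        "connectivity",                 # assumed placeholder duty for the second SoC
        "audio",                        # assumed placeholder duty for the second SoC
    ],
}

def soc_for(task):
    """Return which SoC a task is assigned to, or None if unassigned."""
    for soc, tasks in WORKLOAD_MAP.items():
        if task in tasks:
            return soc
    return None

print(soc_for("visual_inertial_odometry"))   # soc_1
```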
  • Patent number: D991945
    Type: Grant
    Filed: September 2, 2021
    Date of Patent: July 11, 2023
    Assignee: Endress+Hauser Flowtec AG
    Inventors: Johannes Sprenger, Markus Hoh, Daniel Wagner, Andreas Wiedmer, Kevin Martin
  • Patent number: D991946
    Type: Grant
    Filed: September 2, 2021
    Date of Patent: July 11, 2023
    Assignee: Endress+Hauser Flowtec AG
    Inventors: Johannes Sprenger, Markus Hoh, Daniel Wagner, Andreas Wiedmer, Kevin Martin
  • Patent number: D991947
    Type: Grant
    Filed: September 2, 2021
    Date of Patent: July 11, 2023
    Assignee: Endress+Hauser Flowtec AG
    Inventors: Johannes Sprenger, Markus Hoh, Daniel Wagner, Andreas Wiedmer, Kevin Martin
  • Patent number: D991948
    Type: Grant
    Filed: September 2, 2021
    Date of Patent: July 11, 2023
    Assignee: Endress+Hauser Flowtec AG
    Inventors: Johannes Sprenger, Markus Hoh, Daniel Wagner, Andreas Wiedmer, Kevin Martin
  • Patent number: D991965
    Type: Grant
    Filed: September 2, 2021
    Date of Patent: July 11, 2023
    Assignee: Endress+Hauser Flowtec AG
    Inventors: Johannes Sprenger, Markus Hoh, Daniel Wagner, Andreas Wiedmer, Kevin Martin