Patents by Inventor Daniel Wagner
Daniel Wagner has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20250252602
Abstract: Eyewear providing an interactive augmented reality experience between two eyewear devices by using alignment between respective 6DOF trajectories, also referred to herein as ego motion alignment. An eyewear device of user A and an eyewear device of user B track the eyewear device of the other user, or an object of the other user, such as on the user's face, to provide the collaborative AR experience. This enables sharing common three-dimensional content between multiple eyewear users without using or aligning the eyewear devices to common image content such as a marker, which is a more lightweight solution with reduced computational burden on a processor. An inertial measurement unit may also be used to align the eyewear devices.
Type: Application
Filed: April 21, 2025
Publication date: August 7, 2025
Inventors: Branislav Micusik, Georgios Evangelidis, Daniel Wagner
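The abstract does not publish the alignment math; as a loose illustration of registering one device's 6DOF trajectory onto another's, here is a minimal Kabsch/Umeyama-style rigid alignment over time-synchronized position samples. The function name and the synchronization assumption are hypothetical, not the patented method:

```python
# Hypothetical sketch only: the patent does not disclose its alignment
# algorithm. This estimates the rigid transform (R, t) mapping device B's
# trajectory onto device A's, assuming time-synchronized position samples.
import numpy as np

def align_trajectories(traj_a: np.ndarray, traj_b: np.ndarray):
    """traj_a, traj_b: (N, 3) position samples of the two devices."""
    mu_a, mu_b = traj_a.mean(axis=0), traj_b.mean(axis=0)
    # Cross-covariance of the centered trajectories.
    H = (traj_b - mu_b).T @ (traj_a - mu_a)
    U, _, Vt = np.linalg.svd(H)
    # Guard against reflections so R stays a proper rotation (det = +1).
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_a - R @ mu_b
    return R, t  # p_a ≈ R @ p_b + t
```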
-
Patent number: 12380583
Abstract: Disclosed are systems, methods, and non-transitory computer-readable media for continuous surface and depth estimation. A continuous surface and depth estimation system determines the depth and surface normal of physical objects by using stereo vision limited within a predetermined window.
Type: Grant
Filed: February 7, 2024
Date of Patent: August 5, 2025
Assignee: Snap Inc.
Inventors: Olha Borys, Ilteris Kaan Canberk, Daniel Wagner, Jakob Zillner
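As a rough sketch of the windowed surface-and-depth idea, one common approach (an assumption here, not necessarily the patented one) is to triangulate stereo points inside the window and take the normal of their least-squares plane:

```python
# Illustrative sketch, not the patented method: estimate depth and a
# surface normal from 3D points triangulated inside a small stereo window.
import numpy as np

def window_depth_and_normal(points: np.ndarray):
    """points: (N, 3) camera-frame points inside the window (N >= 3)."""
    centroid = points.mean(axis=0)
    # Right-singular vector with the smallest singular value = normal of
    # the least-squares plane through the centered points.
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]
    if normal[2] > 0:
        normal = -normal  # orient toward the camera (looking down +Z)
    return centroid[2], normal  # (mean depth, unit surface normal)
```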
-
Publication number: 20250233959
Abstract: Disclosed are systems, methods, and non-transitory computer-readable media for varied depth determination using stereo vision and phase detection auto focus (PDAF). Computer stereo vision (stereo vision) is used to extract three-dimensional information from digital images. To utilize stereo vision, two optical sensors are displaced horizontally from one another and used to capture images depicting two differing views of a real-world environment from two different vantage points. The relative depth of the objects captured in the images is determined using triangulation by comparing the relative positions of the objects in the two images. For example, the relative positions of matching objects (e.g., features) identified in the captured images are used along with the known orientation of the optical sensors (e.g., distance between the optical sensors, vantage points of the optical sensors) to estimate the depth of the objects.
Type: Application
Filed: April 1, 2025
Publication date: July 17, 2025
Inventors: Sagi Katz, Daniel Wagner, Weston Welge
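For a rectified horizontal stereo pair, the triangulation the abstract describes reduces to the textbook relation depth = focal length × baseline / disparity. A minimal sketch, with illustrative numbers:

```python
# The textbook triangulation relation behind the abstract, for a rectified
# horizontal stereo pair. Names and example numbers are illustrative.

def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth in meters of a feature with the given left/right disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# 500 px focal length, 10 cm sensor baseline, 25 px disparity -> 2.0 m.
print(stereo_depth(500.0, 0.10, 25.0))
```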
-
Patent number: 12356095
Abstract: Visual-inertial tracking of an eyewear device using a rolling shutter camera(s). The device includes a position determining system. Visual-inertial tracking is implemented by sensing motion of the device. An initial pose is obtained for a rolling shutter camera and an image of an environment is captured. The image includes feature points captured at a particular capture time. A number of poses for the rolling shutter camera is computed based on the initial pose and sensed movement of the device. The number of computed poses is responsive to the sensed movement of the mobile device. A computed pose is selected for each feature point in the image by matching the particular capture time for the feature point to the particular computed time for the computed pose. The position of the mobile device is determined within the environment using the feature points and the selected computed poses for the feature points.
Type: Grant
Filed: May 29, 2024
Date of Patent: July 8, 2025
Assignee: Snap Inc.
Inventors: Matthias Kalkgruber, Erick Mendez Mendez, Daniel Wagner, Daniel Wolf, Kai Zhou
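A minimal sketch of the pose-selection step: rows of a rolling-shutter image are exposed at slightly different times, so each feature point is paired with the computed pose nearest its row's capture time. The linear row-timing model below is an assumption:

```python
# Illustrative sketch: pair each feature with the computed pose closest to
# its row's capture time. The linear row-timing model is an assumption.
import bisect

def select_pose(row: int, frame_start: float, row_period: float,
                pose_times: list, poses: list):
    """pose_times must be sorted; returns the pose nearest the row time."""
    capture_time = frame_start + row * row_period
    i = bisect.bisect_left(pose_times, capture_time)
    if i == 0:
        return poses[0]
    if i == len(pose_times):
        return poses[-1]
    # Choose the nearer of the two neighboring pose timestamps.
    before, after = pose_times[i - 1], pose_times[i]
    return poses[i] if after - capture_time < capture_time - before else poses[i - 1]
```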
-
Patent number: 12332452
Abstract: An eyewear device with flexible frame for Augmented Reality (AR) is disclosed. At least two sensors and a display are mounted on the flexible frame. When in use, the real time geometry of the eyewear device may change from factory calibrated geometry, resulting in low quality AR rendering. A modeling module is provided to model the real time geometry of the eyewear device on the fly using sensor information of the at least two sensors. The modeled real time geometry is then provided to a rendering module to accurately display the AR to the user.
Type: Grant
Filed: June 13, 2024
Date of Patent: June 17, 2025
Assignee: Snap Inc.
Inventors: Clemens Birklbauer, Georg Halmetschlager-Funek, Jeroen Hol, Matthias Kalkgruber, Daniel Wagner
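A minimal sketch of what such a modeling module could hand the renderer: the current relative transform between two frame-mounted sensors, re-derived from their live world poses rather than the fixed factory extrinsics. The 4x4 pose representation and how per-sensor poses are obtained are assumptions:

```python
# Hypothetical sketch: derive the *current* sensor-to-sensor extrinsics
# from live per-sensor world poses (4x4 homogeneous matrices), instead of
# trusting the fixed factory calibration of a flexing frame.
import numpy as np

def realtime_extrinsics(T_world_a: np.ndarray, T_world_b: np.ndarray) -> np.ndarray:
    """Transform expressing sensor B's pose in sensor A's frame."""
    return np.linalg.inv(T_world_a) @ T_world_b
```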
-
Patent number: 12299933
Abstract: Eyewear providing an interactive augmented reality experience between two eyewear devices by using alignment between respective 6DOF trajectories, also referred to herein as ego motion alignment. An eyewear device of user A and an eyewear device of user B track the eyewear device of the other user, or an object of the other user, such as on the user's face, to provide the collaborative AR experience. This enables sharing common three-dimensional content between multiple eyewear users without using or aligning the eyewear devices to common image content such as a marker, which is a more lightweight solution with reduced computational burden on a processor. An inertial measurement unit may also be used to align the eyewear devices.
Type: Grant
Filed: February 15, 2024
Date of Patent: May 13, 2025
Assignee: Snap Inc.
Inventors: Branislav Micusik, Georgios Evangelidis, Daniel Wagner
-
Patent number: 12294806
Abstract: Disclosed are systems, methods, and non-transitory computer-readable media for varied depth determination using stereo vision and phase detection auto focus (PDAF). Computer stereo vision (stereo vision) is used to extract three-dimensional information from digital images. To utilize stereo vision, two optical sensors are displaced horizontally from one another and used to capture images depicting two differing views of a real-world environment from two different vantage points. The relative depth of the objects captured in the images is determined using triangulation by comparing the relative positions of the objects in the two images. For example, the relative positions of matching objects (e.g., features) identified in the captured images are used along with the known orientation of the optical sensors (e.g., distance between the optical sensors, vantage points of the optical sensors) to estimate the depth of the objects.
Type: Grant
Filed: March 15, 2024
Date of Patent: May 6, 2025
Assignee: Snap Inc.
Inventors: Sagi Katz, Daniel Wagner, Weston Welge
-
Patent number: 12277103
Abstract: The present disclosure is directed toward systems and methods that provide users with efficient and effective user experiences when browsing, selecting, or inspecting content items. More specifically, systems and methods described herein provide users the ability to easily and effectively select multiple content items via a single touch gesture (e.g., swipe gesture). Additionally, systems and methods disclosed herein can allow users to select various content items in a selection interface, preview a content item, and return to the selection interface with the previous selections remaining intact. Still further, systems and methods described herein allow users to transition between a content graphical user interface and a selection interface based on touch gestures with content items. Embodiments of the present invention further provide improved navigability by collapsing views of content items.
Type: Grant
Filed: February 22, 2024
Date of Patent: April 15, 2025
Assignee: Dropbox, Inc.
Inventors: Stephan Jätzold, Daniel Wagner
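A toy sketch of the selection behavior described above, assuming a simple set-based selection model (the data model and hit-testing are hypothetical): one swipe selects every item it passes over, and a preview round-trip leaves the selection intact.

```python
# Toy model of the described behavior; the data model and hit-testing are
# hypothetical, not Dropbox's implementation.

class SelectionModel:
    def __init__(self) -> None:
        self.selected: set = set()

    def apply_swipe(self, touched_item_ids: list) -> None:
        """Select every content item a single swipe gesture passed over."""
        self.selected.update(touched_item_ids)

    def open_preview(self, item_id: str) -> None:
        """Previewing deliberately does not touch self.selected."""
        pass

model = SelectionModel()
model.apply_swipe(["photo_1", "photo_2", "photo_3"])  # one gesture, 3 items
model.open_preview("photo_2")                          # back to selection UI
assert model.selected == {"photo_1", "photo_2", "photo_3"}
```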
-
Patent number: 12279006
Abstract: The present technology pertains to synchronized video viewing that is supported by the use of a pending changes count to keep the client devices in synchronization while providing a user experience that matches the expectations of the user. A second client device can receive input to change some aspect of the playback of a video that is being viewed synchronously with at least one other device. The second client device can process the event so the video can reflect the received input. The second client device can also send a collaboration message to a synchronized video viewing service to inform other client devices of the command. Since the second client device is aware of its own event, the second client device can ignore processing any other collaboration messages until it receives the collaboration message it initiated echoed back to it. Thereafter it can resume processing received collaboration messages.
Type: Grant
Filed: September 8, 2023
Date of Patent: April 15, 2025
Assignee: Dropbox, Inc.
Inventors: Alan Rogers, Siya Yang, Daniel Wagner, Dylan Nelson, Jason Stakelon
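The echo-back rule in the abstract can be sketched directly: after sending its own collaboration message, a client ignores incoming messages until that message comes back, then resumes processing. The message shape and transport below are assumptions:

```python
# Sketch of the echo-back rule; message fields and transport are assumed.

class SyncViewerClient:
    def __init__(self, client_id: str) -> None:
        self.client_id = client_id
        self.awaiting_echo = None  # id of our in-flight message, if any

    def send_command(self, msg_id: str, command: dict) -> dict:
        """The client applies its own command locally; from here on it
        ignores incoming messages until this one is echoed back."""
        self.awaiting_echo = msg_id
        return {"id": msg_id, "sender": self.client_id, **command}

    def receive(self, msg: dict) -> None:
        if self.awaiting_echo is not None:
            if msg["id"] == self.awaiting_echo:
                self.awaiting_echo = None  # our echo: resume processing
            return  # ignore everything else until the echo arrives
        self.apply(msg)

    def apply(self, msg: dict) -> None:
        print(f"{self.client_id}: applying {msg['id']}")
```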
-
Publication number: 20240369829
Abstract: A method for configuring a digital light projector (DLP) of an augmented reality (AR) display device is described. A light source component of the DLP projector is configured to generate a single red-green-blue color sequence repetition per image frame. The AR display device identifies a color sequence of the light source component of the DLP projector and tracks a motion of the AR display device. The AR display device adjusts an operation of the DLP projector based on the single red-green-blue color sequence repetition, the color sequence of the light source component of the DLP projector, and the motion of the AR display device.
Type: Application
Filed: July 17, 2024
Publication date: November 7, 2024
Inventors: Jeffrey Michael DeWall, Dominik Schnitzer, Amit Singh, Daniel Wagner
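For intuition on why the color sequence and device motion matter: with a single R-G-B repetition per frame, each color field is displayed at a slightly different time, so head rotation smears the fields apart unless each one is pre-shifted. A back-of-envelope sketch, with all numbers illustrative rather than from the patent:

```python
# Back-of-envelope sketch; parameter names and numbers are assumptions.
# Each color field is shown at a slightly different time, so head rotation
# lands the fields on different display positions unless compensated.

def color_field_offset_px(angular_velocity_rad_s: float,
                          field_delay_s: float,
                          pixels_per_rad: float) -> float:
    """Pixel shift to apply to one color field to counter head rotation."""
    return angular_velocity_rad_s * field_delay_s * pixels_per_rad

# E.g., 2 rad/s head turn, green field 5.5 ms after red, 700 px/rad optics:
print(color_field_offset_px(2.0, 0.0055, 700.0))  # ~7.7 px of separation
```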
-
Publication number: 20240362743
Abstract: A method for minimizing latency of moving objects in an augmented reality (AR) display device is described. In one aspect, the method includes determining an initial pose of a visual tracking device, identifying an initial location of an object in an image that is generated by an optical sensor of the visual tracking device, the image corresponding to the initial pose of the visual tracking device, rendering virtual content based on the initial pose and the initial location of the object, retrieving an updated pose of the visual tracking device, tracking an updated location of the object in an updated image that corresponds to the updated pose, and applying a time warp transformation to the rendered virtual content based on the updated pose and the updated location of the object to generate transformed virtual content.
Type: Application
Filed: July 10, 2024
Publication date: October 31, 2024
Inventors: Bernhard Jung, Daniel Wagner
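A simplified sketch of a per-object time warp: reuse the already-rendered content but shift it by the object's on-screen motion between render time and display time. A full implementation would warp in 3D using the updated pose; the 2D translation here is a simplifying assumption:

```python
# Simplified 2D stand-in for a per-object time warp; a real implementation
# would reproject in 3D using the updated pose. The (H, W, 4) RGBA array
# layout is assumed.
import numpy as np

def time_warp_2d(rendered: np.ndarray, old_xy, new_xy) -> np.ndarray:
    """Shift already-rendered content by the object's on-screen motion."""
    dx = int(round(new_xy[0] - old_xy[0]))
    dy = int(round(new_xy[1] - old_xy[1]))
    h, w = rendered.shape[:2]
    warped = np.zeros_like(rendered)
    # Copy the overlapping region so warped[y + dy, x + dx] = rendered[y, x].
    src = rendered[max(0, -dy):h - max(0, dy), max(0, -dx):w - max(0, dx)]
    warped[max(0, dy):max(0, dy) + src.shape[0],
           max(0, dx):max(0, dx) + src.shape[1]] = src
    return warped
```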
-
Patent number: 12115128
Abstract: The present disclosure relates to a closure cap for sealing an outlet end of a barrel of a medicament container, the outlet end having a radially widened rim and the outlet end being sealable by an elastomeric seal, wherein the elastomeric seal comprises a flange portion configured to abut in a longitudinal direction with the outlet end, the closure cap comprising: a cap body comprising a retainer portion and a fastening portion, wherein the retainer portion is configured to engage with the elastomeric seal, wherein the fastening portion comprises a resiliently and radially deformable fastener comprising a snap feature configured to releasably engage with the radially widened rim of the outlet end, and wherein a longitudinal distance between the retainer portion and the snap feature is sized to receive the radially widened rim and the flange portion of the elastomeric seal between the retainer portion and the snap feature.
Type: Grant
Filed: August 17, 2020
Date of Patent: October 15, 2024
Assignee: Sanofi
Inventor: Daniel Wagner
-
Patent number: 12119684
Abstract: A battery pack supplies an electrically driven treatment apparatus with an electric driving power and includes a plurality of accumulator cells having cell contacts; at least one circuit board electrically connected to the cell contacts; and a battery pack housing having a first battery pack housing part and a second battery pack housing part. The first battery pack housing part and the second battery pack housing part are closed by each other. The accumulator cells are disposed within the battery pack housing. The cell contacts and the circuit board are disposed within the second battery pack housing part. The second battery pack housing part is configured as a mold for a casting compound, and the cell contacts and the at least one circuit board are enclosed by the casting compound.
Type: Grant
Filed: May 8, 2020
Date of Patent: October 15, 2024
Assignees: Andreas Stihl AG & Co. KG, Murata Manufacturing Co., Ltd.
Inventors: Daniel Sauerteig, Julien Scholl, Kenichi Ozawa, Takayuki Sato, Daiki Misawa, Daniel Wagner, Marcel Wilka, Martin Schurr, Uwe Bossmann, Dirk Liepold
-
Publication number: 20240329409
Abstract: An eyewear device with flexible frame for Augmented Reality (AR) is disclosed. At least two sensors and a display are mounted on the flexible frame. When in use, the real time geometry of the eyewear device may change from factory calibrated geometry, resulting in low quality AR rendering. A modeling module is provided to model the real time geometry of the eyewear device on the fly using sensor information of the at least two sensors. The modeled real time geometry is then provided to a rendering module to accurately display the AR to the user.
Type: Application
Filed: June 13, 2024
Publication date: October 3, 2024
Inventors: Clemens Birklbauer, Georg Halmetschlager-Funek, Jeroen Hol, Matthias Kalkgruber, Daniel Wagner
-
Publication number: 20240314461
Abstract: Visual-inertial tracking of an eyewear device using a rolling shutter camera(s). The device includes a position determining system. Visual-inertial tracking is implemented by sensing motion of the device. An initial pose is obtained for a rolling shutter camera and an image of an environment is captured. The image includes feature points captured at a particular capture time. A number of poses for the rolling shutter camera is computed based on the initial pose and sensed movement of the device. The number of computed poses is responsive to the sensed movement of the mobile device. A computed pose is selected for each feature point in the image by matching the particular capture time for the feature point to the particular computed time for the computed pose. The position of the mobile device is determined within the environment using the feature points and the selected computed poses for the feature points.
Type: Application
Filed: May 29, 2024
Publication date: September 19, 2024
Inventors: Matthias Kalkgruber, Erick Mendez Mendez, Daniel Wagner, Daniel Wolf, Kai Zhou
-
Publication number: 20240309902
Abstract: A spacing fastener comprises a head and a shaft adjoining the head and ending with a tip. The shaft comprises an anchoring screw thread, an under-head screw thread closest to and ending near the head, and a non-threaded shaft portion therebetween. A support element at a circumferential distance from the screw-thread end of the under-head screw thread forms contact point (Au) therewith. In the region overlapping with the under-head screw thread in top view, the support element is spaced apart therefrom and forms a contact point (As) for an outer shell lying between the head and the under-head screw thread. The distance of contact point (As) from the head is greater than or equal to the difference between the distance of contact point (Au) from the head and a tolerance distance of at most 15% of the major diameter of the under-head screw thread. The support element extends lengthwise over less than 150°.
Type: Application
Filed: July 8, 2022
Publication date: September 19, 2024
Inventors: Uwe Siegemund, Daniel Wagner
-
Patent number: 12072486
Abstract: A method for configuring a digital light projector (DLP) of an augmented reality (AR) display device is described. A light source component of the DLP projector is configured to generate a single red-green-blue color sequence repetition per image frame. The AR display device identifies a color sequence of the light source component of the DLP projector and tracks a motion of the AR display device. The AR display device adjusts an operation of the DLP projector based on the single red-green-blue color sequence repetition, the color sequence of the light source component of the DLP projector, and the motion of the AR display device.
Type: Grant
Filed: February 23, 2023
Date of Patent: August 27, 2024
Assignee: Snap Inc.
Inventors: Jeffrey Michael DeWall, Dominik Schnitzer, Amit Singh, Daniel Wagner
-
Patent number: 12067693
Abstract: A method for minimizing latency of moving objects in an augmented reality (AR) display device is described. In one aspect, the method includes determining an initial pose of a visual tracking device, identifying an initial location of an object in an image that is generated by an optical sensor of the visual tracking device, the image corresponding to the initial pose of the visual tracking device, rendering virtual content based on the initial pose and the initial location of the object, retrieving an updated pose of the visual tracking device, tracking an updated location of the object in an updated image that corresponds to the updated pose, and applying a time warp transformation to the rendered virtual content based on the updated pose and the updated location of the object to generate transformed virtual content.
Type: Grant
Filed: November 4, 2021
Date of Patent: August 20, 2024
Assignee: Snap Inc.
Inventors: Bernhard Jung, Daniel Wagner
-
Patent number: D1050150
Type: Grant
Filed: September 2, 2021
Date of Patent: November 5, 2024
Assignee: Endress+Hauser Flowtec AG
Inventors: Johannes Sprenger, Markus Hoh, Daniel Wagner, Andreas Wiedmer, Kevin Martin
-
Patent number: D1050176
Type: Grant
Filed: September 2, 2021
Date of Patent: November 5, 2024
Assignee: Endress+Hauser Flowtec AG
Inventors: Johannes Sprenger, Markus Hoh, Daniel Wagner, Andreas Wiedmer, Kevin Martin