Patents by Inventor David Meisenholder
David Meisenholder has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20220155594
Abstract: System, method, and non-transitory computer readable medium for presenting audio-based visual overlays on see-through optical assemblies. Overlays are presented by capturing, via a camera of an eyewear device, initial images of a scene, receiving an audio signal, modifying the initial images responsive to the audio signal to create overlay images, and displaying, via a see-through optical assembly of the eyewear device, the overlay images to a wearer of the eyewear device over the scene in a viewing area of the eyewear device.
Type: Application
Filed: January 31, 2022
Publication date: May 19, 2022
Inventor: David Meisenholder
-
Publication number: 20220147139
Abstract: An eyewear device presents, via an image display, an initial displayed image. The initial displayed image has an initial field of view corresponding to an initial head direction or an initial eye direction. The eyewear device detects movement of a user of the eyewear device by: (i) tracking, via a head movement tracker, a head movement of a head of the user, or (ii) tracking, via an eye movement tracker, an eye movement of an eye of the user of the eyewear device. The eyewear device determines a field of view adjustment to the initial field of view of the initial displayed image based on the detected movement of the user. The field of view adjustment includes a successive field of view corresponding to a successive head direction or a successive eye direction. The eyewear device generates a successive displayed image based on the field of view adjustment and presents the successive displayed image.
Type: Application
Filed: December 30, 2021
Publication date: May 12, 2022
Inventors: Ilteris Canberk, David Meisenholder, Jonathan M. Rodriguez, II
-
Patent number: 11327310
Abstract: The present application discloses examples of various apparatuses and systems that can be utilized for augmented reality. According to one example, a wearable device can optionally comprise: a frame configured for wearing by a user; one or more optical elements mounted on the frame; an array having a plurality of light emitting diodes coupled to the one or more optical elements, wherein the one or more optical elements and the array are mounted within a field of view of the user when the frame is worn by the user; and additional onboard electronic components carried by the frame, including at least a battery that is configured to provide for electrically powered operation of the array.
Type: Grant
Filed: December 27, 2019
Date of Patent: May 10, 2022
Assignee: Snap Inc.
Inventors: Robert Matthew Bates, Ilteris Canberk, Brandon Carrillo, David G. Fliszar, Adam Douglas Greengard, Kenneth Kubala, David Meisenholder, Jonathan M. Rodriguez, II, Amit Singh, Samuel Thompson
-
Patent number: 11307412
Abstract: System, method, and non-transitory computer readable medium for presenting audio-based visual overlays on see-through optical assemblies. Overlays are presented by capturing, via a camera of an eyewear device, initial images of a scene, receiving an audio signal, modifying the initial images responsive to the audio signal to create overlay images, and displaying, via a see-through optical assembly of the eyewear device, the overlay images to a wearer of the eyewear device over the scene in a viewing area of the eyewear device.
Type: Grant
Filed: December 30, 2019
Date of Patent: April 19, 2022
Assignee: Snap Inc.
Inventor: David Meisenholder
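The modification step this abstract describes (adjusting initial images responsive to an audio signal to create overlay images) could be sketched as follows. This is a minimal illustrative model, not the patent's implementation; the function names and the 0-to-1 amplitude convention are assumptions.

```python
def rms_amplitude(samples):
    """Root-mean-square level of an audio frame, clamped to 0..1."""
    if not samples:
        return 0.0
    return min(1.0, (sum(s * s for s in samples) / len(samples)) ** 0.5)

def modulate_overlay(image, samples):
    """Scale each pixel of a grayscale image (list of rows) by the
    current audio level to produce an overlay image for display."""
    level = rms_amplitude(samples)
    return [[int(px * level) for px in row] for row in image]

frame = [[200, 100], [50, 255]]
quiet = modulate_overlay(frame, [0.0, 0.0])   # silent audio: overlay goes dark
loud = modulate_overlay(frame, [1.0, -1.0])   # full-scale audio: overlay at full brightness
```

In a real pipeline the audio level would drive whatever visual parameter the overlay designer chooses (brightness, scale, color); the per-pixel scaling here is only one possibility.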
-
Patent number: 11300814
Abstract: Apparatuses, systems, and methods for wearable devices such as eyewear are described. According to one embodiment, the wearable device can include a frame, a temple, electronics, and a linkage assembly. The frame can have two opposing connection portions and the temple can be adapted to selectively interface with one of the two connection portions. The electronics can be mounted to at least one of the temple and the frame. The linkage assembly can be pivotally coupled at a first end portion to the temple and pivotally coupled at a second end portion to the one of the two connection portions. The linkage assembly can be configured for movement of the temple between a wearable position and a folded position.
Type: Grant
Filed: November 26, 2019
Date of Patent: April 12, 2022
Assignee: Snap Inc.
Inventor: David Meisenholder
-
Publication number: 20220084259
Abstract: Systems, methods, and non-transitory computer readable media for augmenting scenes viewed through displays of eyewear devices with audio-related image information. Scenes may be augmented by capturing, via a camera of the eyewear device, initial images of a scene, identifying features within the initial images, receiving audio-related image information (e.g., lyrics and/or images), registering the audio-related image information to the identified features, creating audio-based visual overlays including the audio-related image information registered to the identified features, and displaying the audio-based visual overlays over the scene.
Type: Application
Filed: November 29, 2021
Publication date: March 17, 2022
Inventor: David Meisenholder
-
Patent number: 11269402
Abstract: An eyewear device presents, via an image display, an initial displayed image. The initial displayed image has an initial field of view corresponding to an initial head direction or an initial eye direction. The eyewear device detects movement of a user of the eyewear device by: (i) tracking, via a head movement tracker, a head movement of a head of the user, or (ii) tracking, via an eye movement tracker, an eye movement of an eye of the user of the eyewear device. The eyewear device determines a field of view adjustment to the initial field of view of the initial displayed image based on the detected movement of the user. The field of view adjustment includes a successive field of view corresponding to a successive head direction or a successive eye direction. The eyewear device generates a successive displayed image based on the field of view adjustment and presents the successive displayed image.
Type: Grant
Filed: July 30, 2019
Date of Patent: March 8, 2022
Assignee: Snap Inc.
Inventors: Ilteris Canberk, David Meisenholder, Jonathan M. Rodriguez, II
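The field-of-view adjustment described above can be illustrated with a minimal sketch assuming a simple horizontal, yaw-only model: the detected head-direction change shifts the center of the displayed field of view. The function name, the degree-based convention, and the single-axis simplification are all assumptions for illustration, not the patent's method.

```python
def adjust_field_of_view(center_deg, width_deg, yaw_delta_deg):
    """Compute a successive field of view by shifting the current
    horizontal center by the detected head-yaw change.

    Returns (left_edge, right_edge) in degrees, with the center
    wrapped into [0, 360)."""
    new_center = (center_deg + yaw_delta_deg) % 360.0
    half = width_deg / 2.0
    return (new_center - half, new_center + half)

# A 10-degree head turn shifts a 40-degree-wide view by 10 degrees.
fov = adjust_field_of_view(0.0, 40.0, 10.0)
```

A real implementation would track pitch as well and interpolate between successive images rather than jumping, but the core bookkeeping is this shift of the view window.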
-
Publication number: 20220068033
Abstract: System, method, and non-transitory computer readable medium for presenting images on a mobile device. Images are presented by monitoring the location and the orientation of the mobile device, requesting, from a server, previously captured images corresponding to where they were captured in relation to the location, receiving the requested previously captured images from the server, selecting images from the requested previously captured images responsive to the location and the orientation of the mobile device, generating overlay images from the selected images including image icons associated with the selected images, presenting the overlay images on an optical assembly, receiving an image selection identifying one of the image icons in the presented overlay images, and displaying the selected image associated with the identified image icon in the viewing area of the optical assembly.
Type: Application
Filed: October 13, 2021
Publication date: March 3, 2022
Inventors: David Meisenholder, Celia Mourkogiannis, Donald Giovannini
-
Patent number: 11232601
Abstract: Systems, methods, and non-transitory computer readable media for augmenting scenes viewed through displays of eyewear devices with audio-related image information. Scenes may be augmented by capturing, via a camera of the eyewear device, initial images of a scene, identifying features within the initial images, receiving audio-related image information (e.g., lyrics and/or images), registering the audio-related image information to the identified features, creating audio-based visual overlays including the audio-related image information registered to the identified features, and displaying the audio-based visual overlays over the scene.
Type: Grant
Filed: December 30, 2019
Date of Patent: January 25, 2022
Assignee: Snap Inc.
Inventor: David Meisenholder
-
Publication number: 20210382503
Abstract: Systems, devices, media, and methods are presented for detecting and interpreting motion of a device and a remote object to control operations of the device. The systems and methods identify a sensor input within a drone. The sensor input indicates movement of the drone within a three dimensional space. The systems and methods determine one or more movement attributes from the sensor input and, in response to the one or more movement attributes, select one or more maneuvers corresponding to at least one movement attribute. The systems and methods then execute the one or more maneuvers by controlling one or more drone control components to move the drone within the three dimensional space.
Type: Application
Filed: August 24, 2021
Publication date: December 9, 2021
Inventors: David Meisenholder, Steven Horowitz
-
Patent number: 11175516
Abstract: A wearable or a mobile device includes a camera to capture an image of a scene with an unknown object. Execution of programming by a processor configures the device to perform functions, including a function to capture, via the camera, the image of the scene with the unknown object. To create lightweight human-machine user interactions, execution of programming by the processor further configures the device to determine a recognized object-based adjustment and produce visible output to the user via the graphical user interface presented on the image display of the device based on the recognized object-based adjustment. Examples of recognized object-based adjustments include launch, hide, or display of an application for the user to interact with or utilize; display of a menu of applications related to the recognized object for execution; or enabling or disabling of a system level feature.
Type: Grant
Filed: February 15, 2019
Date of Patent: November 16, 2021
Assignee: Snap Inc.
Inventors: Ilteris Canberk, David Meisenholder
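The recognized object-based adjustments this abstract lists (launch, hide, or display of an application; display of a menu; enabling or disabling a feature) amount to a lookup from a recognized object to a UI action. The sketch below is purely illustrative: the mapping table, object labels, and app names are hypothetical and do not come from the patent.

```python
# Hypothetical object-to-adjustment table; the patent names the
# adjustment kinds (launch, display_menu, etc.) but not these entries.
OBJECT_ACTIONS = {
    "keyboard": ("launch", "notes_app"),
    "guitar": ("display_menu", ["tuner_app", "metronome_app"]),
}

def adjustment_for(recognized_object):
    """Return the UI adjustment for a recognized object, or a no-op
    default when the object has no associated adjustment."""
    return OBJECT_ACTIONS.get(recognized_object, ("none", None))
```

The value of the lookup pattern is that recognition and UI behavior stay decoupled: the table can be updated without touching the recognizer.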
-
Patent number: 11176751
Abstract: System, method, and non-transitory computer readable medium for presenting images on a mobile device. Images are presented by monitoring the location and the orientation of the mobile device, requesting, from a server, previously captured images corresponding to where they were captured in relation to the location, receiving the requested previously captured images from the server, selecting images from the requested previously captured images responsive to the location and the orientation of the mobile device, generating overlay images from the selected images including image icons associated with the selected images, presenting the overlay images on an optical assembly, receiving an image selection identifying one of the image icons in the presented overlay images, and displaying the selected image associated with the identified image icon in the viewing area of the optical assembly.
Type: Grant
Filed: March 17, 2020
Date of Patent: November 16, 2021
Assignee: Snap Inc.
Inventors: David Meisenholder, Celia Mourkogiannis, Donald Giovannini
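The selection step in this abstract (choosing among previously captured images responsive to the device's location and orientation) can be sketched with a flat 2D model: keep images captured near the device and roughly in the direction it faces. The coordinate system, distance units, and thresholds are illustrative assumptions, not values from the patent.

```python
import math

def select_images(device_pos, device_heading_deg, candidates,
                  max_dist=100.0, half_angle=45.0):
    """Keep candidate images captured near the device and within a
    cone in front of it. Each candidate is (image_id, (x, y))."""
    selected = []
    for image_id, (x, y) in candidates:
        dx, dy = x - device_pos[0], y - device_pos[1]
        if math.hypot(dx, dy) > max_dist:
            continue  # captured too far from the device's location
        bearing = math.degrees(math.atan2(dy, dx)) % 360.0
        # Smallest signed angle between bearing and device heading.
        diff = abs((bearing - device_heading_deg + 180.0) % 360.0 - 180.0)
        if diff <= half_angle:
            selected.append(image_id)
    return selected

# Device at the origin, facing along +x: only the image ahead survives.
picked = select_images((0.0, 0.0), 0.0,
                       [("a", (50.0, 0.0)),    # near, ahead
                        ("b", (0.0, 50.0)),    # near, but 90 degrees off
                        ("c", (500.0, 0.0))])  # ahead, but too far
```

A production system would use geodetic coordinates and the device's full pose, but the filter structure (distance gate, then bearing gate) is the same.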
-
Publication number: 20210314489
Abstract: In a camera-enabled electronic device, photo capture is triggered by a press-and-hold input only if the holding duration of the press-and-hold input is greater than a predefined threshold duration. A press-and-hold input shorter in duration than the threshold triggers video capture. Thus, a short press triggers video capture, while a long press triggers photo capture.
Type: Application
Filed: April 20, 2021
Publication date: October 7, 2021
Inventors: Matthew Hanover, Justin Huang, David Meisenholder
-
Publication number: 20210297811
Abstract: System, method, and non-transitory computer readable medium for presenting images on a mobile device. Images are presented by monitoring the location and the orientation of the mobile device, requesting, from a server, previously captured images corresponding to where they were captured in relation to the location, receiving the requested previously captured images from the server, selecting images from the requested previously captured images responsive to the location and the orientation of the mobile device, generating overlay images from the selected images including image icons associated with the selected images, presenting the overlay images on an optical assembly, receiving an image selection identifying one of the image icons in the presented overlay images, and displaying the selected image associated with the identified image icon in the viewing area of the optical assembly.
Type: Application
Filed: March 17, 2020
Publication date: September 23, 2021
Inventors: David Meisenholder, Celia Mourkogiannis, Donald Giovannini
-
Publication number: 20210294102
Abstract: System, method, and non-transitory computer readable medium for presenting images on a mobile device. Images are presented by monitoring one or more physical characteristics surrounding the mobile device using at least one sensor, determining a contextual state of the mobile device based on the monitored one or more physical characteristics, selecting an image from a plurality of related images associated with the determined contextual state, generating at least one overlay image from the selected image, and presenting the at least one overlay image with an optical assembly of the mobile device.
Type: Application
Filed: March 19, 2020
Publication date: September 23, 2021
Inventor: David Meisenholder
-
Patent number: 11126206
Abstract: Systems, devices, media, and methods are presented for detecting and interpreting motion of a device and a remote object to control operations of the device. The systems and methods identify a sensor input within a drone. The sensor input indicates movement of the drone within a three dimensional space. The systems and methods determine one or more movement attributes from the sensor input and, in response to the one or more movement attributes, select one or more maneuvers corresponding to at least one movement attribute. The systems and methods then execute the one or more maneuvers by controlling one or more drone control components to move the drone within the three dimensional space.
Type: Grant
Filed: April 13, 2020
Date of Patent: September 21, 2021
Assignee: Snap Inc.
Inventors: David Meisenholder, Steven Horowitz
-
Patent number: 11006043
Abstract: In a camera-enabled electronic device, photo capture is triggered by a press-and-hold input only if the holding duration of the press-and-hold input is greater than a predefined threshold duration. A press-and-hold input shorter in duration than the threshold triggers video capture. Thus, a short press triggers video capture, while a long press triggers photo capture.
Type: Grant
Filed: April 3, 2019
Date of Patent: May 11, 2021
Assignee: Snap Inc.
Inventors: Matthew Hanover, Justin Huang, David Meisenholder
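The capture logic in this abstract reduces to a single duration comparison: per the abstract, a hold shorter than the threshold starts video capture, while a longer hold takes a photo. The 0.6-second threshold below is a hypothetical value; the patent only says the threshold is predefined.

```python
SHORT_PRESS_MAX = 0.6  # hypothetical threshold, in seconds

def classify_press(hold_duration):
    """Classify a press-and-hold input by its duration, following the
    abstract: short hold -> video capture, long hold -> photo capture."""
    return "photo" if hold_duration > SHORT_PRESS_MAX else "video"
```

Note the mapping is deliberately inverted from the common convention (tap for photo, hold for video); the abstract states it explicitly.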
-
Patent number: 10768639
Abstract: Systems, devices, media, and methods are presented for detecting and interpreting motion of a device and a remote object to control operations of the device. The systems and methods identify a sensor input within a drone. The sensor input indicates movement of the drone within a three dimensional space. The systems and methods determine one or more movement attributes from the sensor input and, in response to the one or more movement attributes, select one or more maneuvers corresponding to at least one movement attribute. The systems and methods then execute the one or more maneuvers by controlling one or more drone control components to move the drone within the three dimensional space.
Type: Grant
Filed: June 30, 2017
Date of Patent: September 8, 2020
Assignee: Snap Inc.
Inventors: David Meisenholder, Steven Horowitz
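The pipeline this abstract describes (sensor input, then movement attributes, then maneuver selection) can be sketched as two small stages. The attribute names, thresholds, and maneuver table are illustrative assumptions; the patent does not specify them.

```python
def movement_attributes(accel):
    """Derive coarse movement attributes from a 3-axis accelerometer
    sample (x, y, z), gravity already removed. Thresholds are made up."""
    ax, ay, az = accel
    attrs = []
    if az > 2.0:
        attrs.append("tossed_up")
    if abs(ax) > 2.0:
        attrs.append("flicked_sideways")
    return attrs

# Hypothetical attribute-to-maneuver table.
MANEUVERS = {
    "tossed_up": "hover_in_place",
    "flicked_sideways": "orbit_user",
}

def select_maneuvers(accel):
    """Map each detected movement attribute to the maneuver that the
    drone control components would then execute."""
    return [MANEUVERS[a] for a in movement_attributes(accel)]
```

Keeping attribute detection separate from maneuver selection mirrors the abstract's structure and lets either stage be tuned independently.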
-
Publication number: 20200241575
Abstract: Systems, devices, media, and methods are presented for detecting and interpreting motion of a device and a remote object to control operations of the device. The systems and methods identify a sensor input within a drone. The sensor input indicates movement of the drone within a three dimensional space. The systems and methods determine one or more movement attributes from the sensor input and, in response to the one or more movement attributes, select one or more maneuvers corresponding to at least one movement attribute. The systems and methods then execute the one or more maneuvers by controlling one or more drone control components to move the drone within the three dimensional space.
Type: Application
Filed: April 13, 2020
Publication date: July 30, 2020
Inventors: David Meisenholder, Steven Horowitz
-
Publication number: 20200241329
Abstract: Apparatuses, systems, and methods for wearable devices such as eyewear are described. According to one embodiment, the wearable device includes a body, electronics, and a connector. The body is configured to hold one or more optical elements, the body being disposable between a collapsed condition and a wearable condition in which the device is wearable by a user to hold the one or more optical elements within user view. The electronics are carried by the body. The connector is configured to enable establishment of an electrical and/or electronic connection with the electronics, the connector being housed by the body such that it is substantially obscured from view when the body is in the wearable condition, and such that it is exposed for connective access when the body is in the collapsed condition.
Type: Application
Filed: April 20, 2020
Publication date: July 30, 2020
Inventors: Matthew Hanover, Qiaokun Huang, David Meisenholder, Lauryn Morris