Patents by Inventor Matthew Kammerait

Matthew Kammerait has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20160248995
    Abstract: A head mounted device includes different types of sensors for obtaining sensor data of objects in a physical environment near the head mounted device. The sensors include millimeter wave sensors disposed with the head mounted device that are automatically or manually engageable. The millimeter wave sensors may be automatically engaged based on the location of the head mounted device or when the head mounted device receives sensor data indicating an abnormality. The millimeter wave sensors may further be manually engaged based on an instruction received from a user of the head mounted device via an input device, such as a wearable device, or audio command, such as a command received from a microphone coupled with the head mounted device. The millimeter wave sensors provide millimeter wave sensor data that the head mounted device uses to construct millimeter wave sensor images.
    Type: Application
    Filed: February 19, 2016
    Publication date: August 25, 2016
    Inventors: Brian Mullins, Matthew Kammerait
  • Publication number: 20160231573
    Abstract: A head mounted device includes a helmet, an ambient light sensor, a pupil dimension sensor, a lighting element, and a dynamic lighting system. The ambient light sensor is disposed in an outside surface of the helmet and measures ambient light outside the helmet. The pupil dimension sensor is disposed in a housing of the helmet and measures a size of a pupil of a wearer of the helmet. The lighting element is disposed in the outside surface of the helmet. The dynamic lighting system controls the lighting element and adjusts an intensity of the lighting element based on the ambient light and the pupil size of the wearer of the helmet.
    Type: Application
    Filed: February 9, 2016
    Publication date: August 11, 2016
    Applicant: DAQRI, LLC
    Inventors: Brian Mullins, Matthew Kammerait
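The dynamic lighting system in the entry above adjusts a lighting element from two inputs: ambient light measured outside the helmet and the wearer's pupil size. A minimal sketch of such a control function follows; the function name, the equal weighting of the two cues, and the normalization ranges are illustrative assumptions, not details from the patent.

```python
def lamp_intensity(ambient_lux, pupil_mm,
                   max_lux=10000.0, min_pupil=2.0, max_pupil=8.0):
    """Return a lamp intensity in [0, 1] from ambient light and pupil size.

    Darker surroundings (low ambient_lux) and wider pupils (the eye
    compensating for low light) both push the intensity up.  The weights
    and ranges here are illustrative assumptions.
    """
    # Normalize ambient light: 1.0 in darkness, 0.0 in full daylight.
    darkness = 1.0 - min(max(ambient_lux / max_lux, 0.0), 1.0)
    # Normalize pupil dilation: 0.0 fully constricted, 1.0 fully dilated.
    clamped = min(max(pupil_mm, min_pupil), max_pupil)
    dilation = (clamped - min_pupil) / (max_pupil - min_pupil)
    # Blend the two cues; either one alone can drive the lamp.
    return 0.5 * darkness + 0.5 * dilation
```

In practice such a controller would also smooth the output over time to avoid flicker as the sensors fluctuate.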
  • Publication number: 20160227868
    Abstract: A head mounted device includes a helmet and a substantially arc-shaped visor. The helmet has an augmented reality device disposed in a housing of the helmet. A first set of magnets is embedded and disposed along a periphery of a front portion of the helmet. The substantially arc-shaped visor has a top part and a bottom part. The top part is removably attached to the front portion of the helmet. A second set of magnets is embedded and disposed along a periphery of the top part of the visor to match the first set of magnets.
    Type: Application
    Filed: February 8, 2016
    Publication date: August 11, 2016
    Applicant: DAQRI, LLC
    Inventors: Brian Mullins, Matthew Kammerait, Timotheos Leahy
  • Patent number: 9412205
    Abstract: A system and method for extracting data for augmented reality content are described. A device identifies a sensing device using an image captured with at least one camera of the device. Visual data are extracted from the sensing device. The device generates AR content based on the extracted visual data, then maps and displays the AR content in its display to form a layer over the sensing device.
    Type: Grant
    Filed: August 25, 2014
    Date of Patent: August 9, 2016
    Assignee: DAQRI, LLC
    Inventors: Brian Mullins, Matthew Kammerait, Christopher Broaddus
  • Patent number: 9406171
    Abstract: A system and method for visual inertial navigation for augmented reality are described. In some embodiments, at least one camera of a wearable device generates a plurality of video frames. At least one inertial measurement unit (IMU) sensor of the wearable device generates IMU data. Features in the plurality of video frames for each camera are tracked. The plurality of video frames for each camera are synchronized and aligned based on the IMU data. A dynamic state of the wearable device is computed based on the synchronized plurality of video frames with the IMU data for each camera. Augmented reality content is generated and positioned in a display of the wearable device based on the dynamic state of the wearable device.
    Type: Grant
    Filed: August 25, 2014
    Date of Patent: August 2, 2016
    Assignee: DAQRI, LLC
    Inventors: Christopher Broaddus, Brian Mullins, Matthew Kammerait, Austin Eliazar
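The visual inertial navigation entry above describes synchronizing and aligning video frames with the IMU stream before computing the device's dynamic state. One generic way to pair the two streams is nearest-timestamp alignment, sketched below; the abstract does not specify the method, so this approach and all names in it are assumptions for illustration.

```python
from bisect import bisect_left

def align_frames_to_imu(frame_times, imu_samples):
    """Pair each video frame timestamp with the nearest IMU sample.

    frame_times: sorted list of frame timestamps (seconds).
    imu_samples: sorted list of (timestamp, imu_reading) tuples.
    Returns a list of (frame_time, imu_reading) pairs.
    """
    imu_times = [t for t, _ in imu_samples]
    pairs = []
    for ft in frame_times:
        i = bisect_left(imu_times, ft)
        # Choose whichever neighboring IMU sample is closer in time.
        if i == 0:
            j = 0
        elif i == len(imu_times):
            j = len(imu_times) - 1
        else:
            j = i if imu_times[i] - ft < ft - imu_times[i - 1] else i - 1
        pairs.append((ft, imu_samples[j][1]))
    return pairs
```

A production pipeline would typically interpolate between IMU samples rather than snap to the nearest one, since IMUs sample far faster than cameras.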
  • Publication number: 20160217590
    Abstract: A system and method for real-time texture mapping for an augmented reality system are described. A viewing device includes an optical sensor to capture an image of a real-world object. A texture extraction module extracts a texture from the image of the real-world object. A recognition module identifies the real-world object based on the captured image. A texture mapping module retrieves a virtual object corresponding to the identified real-world object, maps the texture to the virtual object, dynamically updates the mapped texture in real time, and generates a visualization of the virtual object in a display of the viewing device.
    Type: Application
    Filed: January 26, 2015
    Publication date: July 28, 2016
    Inventors: Brian Mullins, Matthew Kammerait, Mark Anthony Sararu, Andrew Thomas Krage, Gregory Khachaturyan
  • Publication number: 20160189397
    Abstract: A system and method for sampling-based color extraction for augmented reality are described. A viewing device includes an optical sensor to capture an image of a real-world object. Color extraction software divides the captured image into multiple regions, or recognizes pre-defined regions, and identifies a color value for each region. A color-based augmented reality effect module retrieves virtual content based on the color values for the regions and delivers the virtual content in the viewing device.
    Type: Application
    Filed: December 29, 2014
    Publication date: June 30, 2016
    Inventors: Brian Mullins, Matthew Kammerait, Mark Sararu, Andrew Krage
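The color extraction entry above divides a captured image into regions and identifies a color value for each. A minimal sketch of one such scheme — a uniform grid with per-region average color — is below; the grid layout and the use of a mean as the "color value" are assumptions, since the abstract leaves both open.

```python
def region_colors(pixels, rows, cols):
    """Split an image (a 2D list of (r, g, b) tuples) into a rows x cols
    grid and return the average color of each region, in row-major order."""
    h, w = len(pixels), len(pixels[0])
    colors = []
    for gr in range(rows):
        for gc in range(cols):
            # Pixel bounds of this grid region.
            y0, y1 = gr * h // rows, (gr + 1) * h // rows
            x0, x1 = gc * w // cols, (gc + 1) * w // cols
            region = [pixels[y][x]
                      for y in range(y0, y1) for x in range(x0, x1)]
            n = len(region)
            # Integer-average each channel across the region.
            colors.append(tuple(sum(c[i] for c in region) // n
                                for i in range(3)))
    return colors
```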
  • Publication number: 20160070109
    Abstract: A head mounted device includes a helmet with a guide, a lens frame, and at least one display surface mounted to the lens frame. The guide extends from a cavity of the helmet. The lens frame is moveably connected to the guide and moves along an axis of the guide between a first position within the cavity of the helmet and a second position outside the cavity of the helmet. The display surface is transparent and configured to display augmented reality content.
    Type: Application
    Filed: September 4, 2014
    Publication date: March 10, 2016
    Applicant: DAQRI, LLC
    Inventors: Brian Mullins, Matthew Kammerait, Christopher Broaddus, Timotheos Leahy, Christopher Michaels Garcia
  • Publication number: 20160055674
    Abstract: A system and method for extracting data for augmented reality content are described. A device identifies a sensing device using an image captured with at least one camera of the device. Visual data are extracted from the sensing device. The device generates AR content based on the extracted visual data, then maps and displays the AR content in its display to form a layer over the sensing device.
    Type: Application
    Filed: August 25, 2014
    Publication date: February 25, 2016
    Inventors: Brian Mullins, Matthew Kammerait, Christopher Broaddus
  • Publication number: 20160057511
    Abstract: An application generates instructions to a wearable device to remotely activate a sensor in the wearable device and to receive sensor data from the sensor. A query related to a physical object is received. Instructions to wearable devices are generated to remotely activate at least one sensor of the wearable devices in response to the query. Sensor data is received from at least one of the wearable devices in response to that wearable device being within a range of the physical object.
    Type: Application
    Filed: August 25, 2014
    Publication date: February 25, 2016
    Inventors: Brian Mullins, Matthew Kammerait, Christopher Broaddus
  • Publication number: 20160055673
    Abstract: A system and method for visual inertial navigation for augmented reality are described. In some embodiments, at least one camera of a wearable device generates a plurality of video frames. At least one inertial measurement unit (IMU) sensor of the wearable device generates IMU data. Features in the plurality of video frames for each camera are tracked. The plurality of video frames for each camera are synchronized and aligned based on the IMU data. A dynamic state of the wearable device is computed based on the synchronized plurality of video frames with the IMU data for each camera. Augmented reality content is generated and positioned in a display of the wearable device based on the dynamic state of the wearable device.
    Type: Application
    Filed: August 25, 2014
    Publication date: February 25, 2016
    Inventors: Christopher Broaddus, Brian Mullins, Matthew Kammerait, Austin Eliazar
  • Publication number: 20160054791
    Abstract: A system and method for navigating augmented reality (AR) content with a watch are described. A head mounted device identifies a watch, maps and generates a display of an AR menu in a transparent display of the head mounted device. The AR menu is displayed as a layer on the watch. The head mounted device detects a physical user interaction on the watch. The head mounted device navigates the AR menu in response to detecting the physical user interaction on the watch.
    Type: Application
    Filed: August 25, 2014
    Publication date: February 25, 2016
    Inventors: Brian Mullins, Matthew Kammerait, Christopher Broaddus
  • Publication number: 20160049004
    Abstract: A remote expert application identifies a manipulation of virtual objects displayed in a first wearable device. The virtual objects are rendered based on a physical object viewed with a second wearable device. A manipulation of the virtual objects is received from the first wearable device. A visualization of the manipulation of the virtual objects is generated for a display of the second wearable device. The visualization of the manipulation of the virtual objects is communicated to the second wearable device.
    Type: Application
    Filed: August 15, 2014
    Publication date: February 18, 2016
    Inventors: Brian Mullins, Matthew Kammerait, Christopher Broaddus
  • Publication number: 20160049005
    Abstract: A system and method for visualization of physical interactions are described. Objects in a scene are captured with a viewing device. Physical characteristics of the objects are computed using data from at least one sensor corresponding to the objects. A physics model of predicted interactions between the objects is generated using the physical characteristics of the objects. An interaction visualization is generated based on the physics model of the predicted interactions between the objects. An image of the objects is augmented with the interaction visualization in a display of the viewing device.
    Type: Application
    Filed: August 15, 2014
    Publication date: February 18, 2016
    Inventors: Brian Mullins, Matthew Kammerait, Christopher Broaddus
  • Publication number: 20160048515
    Abstract: A system and method for spatial data processing are described. Path bundle data packages from a viewing device are accessed and processed. The path bundle data packages identify a user interaction of the viewing device with an augmented reality content relative to and based on a physical object captured by the viewing device. The path bundle data packages are generated based on the sensor data using a data model comprising a data header and a data payload. The data header comprises a contextual header having data identifying the viewing device and a user of the viewing device, a path header having data identifying the path of the interaction with the augmented reality content, and a sensor header having data identifying the plurality of sensors. The data payload comprises dynamically sized sampling data from the sensor data. The path bundle data packages are normalized and aggregated. Analytics computation is performed on the normalized and aggregated path bundle data packages.
    Type: Application
    Filed: August 15, 2014
    Publication date: February 18, 2016
    Inventors: Brian Mullins, Matthew Kammerait, Frank Chester Irving, Jr.
  • Publication number: 20160049007
    Abstract: A system and method for spatial data visualization are described. An analytics computation of users' interactions with an augmented reality content is performed based on a physical object captured by a viewing device. The analytics computation comprises a computation of geometric paths of the users' interactions with the augmented reality content. A display of a visualization of the analytics computation is displayed based on the computation of the geometric paths of the users' interactions with the augmented reality content.
    Type: Application
    Filed: August 15, 2014
    Publication date: February 18, 2016
    Inventors: Brian Mullins, Matthew Kammerait, Frank Chester Irving, Jr.
  • Publication number: 20160049006
    Abstract: A system and method for spatial data collection are described. Sensor data related to a position and an orientation of a device are generated over time using sensors of the device. Augmented reality content is generated based on a physical object captured by the device. A path bundle data package identifying a user interaction of the device with the augmented reality content relative to the physical object is generated. The user interaction identifies a spatial path of an interaction with the augmented reality content. The path bundle data package is generated based on the sensor data using a data model comprising a data header and a data payload. The data header comprises a contextual header having data identifying the device and a user of the device. A path header includes data identifying the path of the interaction with the augmented reality content. A sensor header includes data identifying the sensors. The data payload comprises dynamically sized sampling data from the sensor data.
    Type: Application
    Filed: August 15, 2014
    Publication date: February 18, 2016
    Inventors: Brian Mullins, Matthew Kammerait, Frank Chester Irving, Jr.
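The three spatial-data entries above (processing, visualization, and collection) share one data model: a path bundle whose data header comprises a contextual header, a path header, and a sensor header, followed by a dynamically sized data payload. That structure can be sketched as plain data classes; the field names beyond those four components are illustrative assumptions.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ContextualHeader:
    device_id: str   # identifies the viewing device
    user_id: str     # identifies the user of the device

@dataclass
class PathHeader:
    path_id: str     # identifies the spatial path of the AR interaction

@dataclass
class SensorHeader:
    sensor_ids: List[str]  # identifies the sensors behind the payload

@dataclass
class PathBundle:
    """A path bundle data package: a composite data header plus a
    dynamically sized payload of sampled sensor data."""
    contextual: ContextualHeader
    path: PathHeader
    sensors: SensorHeader
    payload: List[float] = field(default_factory=list)  # dynamic samples
```

Keeping the three headers separate mirrors the abstracts' split between who generated the data, which interaction path it describes, and which sensors sampled it.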
  • Patent number: D750329
    Type: Grant
    Filed: April 8, 2015
    Date of Patent: February 23, 2016
    Assignee: DAQRI, LLC
    Inventors: Timotheos Leahy, Brian Mullins, Matthew Kammerait