Patents by Inventor Matthew Kammerait

Matthew Kammerait has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 9652047
    Abstract: Visual gestures in a display device allow a user to select and activate features in a display of the display device. A sensor of the display device tracks an eye gaze of a user directed at a display of the display device. A visual gesture module identifies a predefined trigger zone in the display. A virtual object application displays a virtual object in the display based on the eye gaze of the user and the predefined trigger zone.
    Type: Grant
    Filed: February 24, 2016
    Date of Patent: May 16, 2017
    Assignee: DAQRI, LLC
    Inventors: Brian Mullins, Matthew Kammerait, Christopher Broaddus, Sterling Crispin, Lucas Kazansky
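The gaze-activation mechanism in the abstract above can be illustrated with a minimal sketch: a tracked gaze point selects a virtual object when it falls inside a predefined trigger zone. All names and coordinates here are illustrative assumptions, not taken from the patent.

```python
# Hypothetical sketch of gaze-triggered selection: a gaze point in
# display coordinates activates whichever trigger zone contains it.
from dataclasses import dataclass

@dataclass
class TriggerZone:
    x: float        # left edge of the zone in display coordinates
    y: float        # top edge
    width: float
    height: float

    def contains(self, gaze_x: float, gaze_y: float) -> bool:
        """Return True if the gaze point lies inside this zone."""
        return (self.x <= gaze_x <= self.x + self.width
                and self.y <= gaze_y <= self.y + self.height)

def select_virtual_object(gaze, zones):
    """Return the name of the first zone the eye gaze falls into, or None."""
    gx, gy = gaze
    for name, zone in zones.items():
        if zone.contains(gx, gy):
            return name     # the virtual object to display
    return None

zones = {"menu": TriggerZone(0, 0, 100, 50),
         "tool": TriggerZone(200, 0, 100, 50)}
print(select_virtual_object((30, 20), zones))   # gaze inside the "menu" zone
```

A real system would smooth the gaze samples and require a dwell time before activating, but the containment test above is the core of the trigger-zone idea.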
  • Publication number: 20170109923
    Abstract: An application generates instructions to a wearable device to remotely activate a sensor in the wearable device and to receive sensor data from the sensor. A query related to a physical object is received. Instructions to wearable devices are generated to remotely activate at least one sensor of the wearable devices in response to the query. Sensor data is received from at least one of the wearable devices in response to that wearable device being within a range of the physical object.
    Type: Application
    Filed: December 29, 2016
    Publication date: April 20, 2017
    Inventors: Brian Mullins, Matthew Kammerait, Christopher Broaddus
  • Patent number: 9625724
    Abstract: A head mounted device includes a helmet with a guide, a lens frame, and at least one display surface mounted to the lens frame. The guide extends from a cavity of the helmet. The lens frame is moveably connected to the guide and moves along an axis of the guide between a first position within the cavity of the helmet and a second position outside the cavity of the helmet. The display surface is transparent and configured to display augmented reality content.
    Type: Grant
    Filed: September 4, 2014
    Date of Patent: April 18, 2017
    Assignee: DAQRI, LLC
    Inventors: Brian Mullins, Matthew Kammerait, Christopher Broaddus, Timotheos Leahy, Christopher Michaels Garcia
  • Publication number: 20170103581
    Abstract: An imaging device of a device captures an image. The device identifies first and second objects in the image, determines physical properties of the first and second objects, generates a model of a predicted interaction between the first and second objects based on their physical properties, identifies an area in the image based on the model of the predicted interaction, and augments the image at that area in a display of the device.
    Type: Application
    Filed: December 22, 2016
    Publication date: April 13, 2017
    Inventors: Brian Mullins, Matthew Kammerait, Christopher Broaddus
  • Patent number: 9619712
    Abstract: A head mounted device (HMD) includes a transparent display, sensors to generate sensor data, and a processor. The processor identifies a threat condition based on a threat pattern and the sensor data, and generates a warning notification in response to the identified threat condition. The threat pattern includes preconfigured thresholds for the sensor data. The HMD displays AR content comprising the warning notification in the transparent display.
    Type: Grant
    Filed: May 18, 2016
    Date of Patent: April 11, 2017
    Assignee: DAQRI, LLC
    Inventors: Brian Mullins, Matthew Kammerait
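The threat-detection loop described above can be sketched simply: sensor readings are compared against preconfigured thresholds (the "threat pattern"), and a warning notification is generated when any threshold is exceeded. The sensor names and limits below are hypothetical.

```python
# Illustrative threat pattern: one preconfigured threshold per sensor.
THREAT_PATTERN = {
    "temperature_c": 60.0,
    "gas_ppm": 50.0,
    "noise_db": 95.0,
}

def check_threat(sensor_data: dict) -> list:
    """Return warning notifications for readings above their thresholds."""
    warnings = []
    for sensor, limit in THREAT_PATTERN.items():
        reading = sensor_data.get(sensor)
        if reading is not None and reading > limit:
            warnings.append(f"WARNING: {sensor} at {reading} exceeds {limit}")
    return warnings

print(check_threat({"temperature_c": 72.0, "gas_ppm": 12.0}))
```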
  • Publication number: 20170092220
    Abstract: A system and method for generating a dynamic sensor array for an augmented reality system is described. A head mounted device includes one or more sensors, an augmented reality (AR) application, and a sensor array module. The sensor array module identifies available sensors from other head mounted devices that are geographically located within a predefined area. A dynamic sensor array is formed based on the available sensors and the one or more sensors. The dynamic sensor array is updated based on an operational status of the available sensors and the one or more sensors. The AR application generates AR content based on data from the dynamic sensor array. A display of the head mounted device displays the AR content.
    Type: Application
    Filed: September 30, 2015
    Publication date: March 30, 2017
    Applicant: DAQRI, LLC
    Inventors: Brian Mullins, Matthew Kammerait
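A rough sketch of the dynamic sensor array idea: local sensors are pooled with sensors from nearby head mounted devices, and the array is refreshed as those devices report their operational status. The planar distance check and all field names are assumptions for illustration.

```python
# Hypothetical sketch: pool local sensors with operational sensors
# from peer devices inside a predefined geographic area.
def within_area(pos, center, radius):
    """Planar distance check standing in for geographic containment."""
    return ((pos[0] - center[0]) ** 2 + (pos[1] - center[1]) ** 2) ** 0.5 <= radius

def build_sensor_array(local_sensors, peers, center, radius):
    """Combine local sensors with in-range, operational peer sensors."""
    array = list(local_sensors)
    for peer in peers:
        if within_area(peer["pos"], center, radius) and peer["operational"]:
            array.extend(peer["sensors"])
    return array

peers = [
    {"pos": (1, 1), "operational": True,  "sensors": ["thermal_cam"]},
    {"pos": (9, 9), "operational": True,  "sensors": ["lidar"]},       # out of range
    {"pos": (2, 0), "operational": False, "sensors": ["gas_sensor"]},  # offline
]
print(build_sensor_array(["imu", "rgb_cam"], peers, center=(0, 0), radius=5))
```

Re-running `build_sensor_array` whenever a peer joins, leaves, or changes status is what makes the array "dynamic".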
  • Publication number: 20170092002
    Abstract: A head-mounted device (HMD) includes sensors, a transparent display, a processor comprising an augmented reality (AR) application and an AR user interface module. The AR application generates virtual content based on sensing data from the sensors, and displays the virtual content in the transparent display. The AR user interface module generates an AR user interface that is positioned outside a field of view of a user based on the HMD being oriented at a reference pitch angle, causes a display of the AR user interface in the transparent display within the field of view of the user based on the HMD being oriented at a second pitch angle greater than the reference pitch angle, and causes a navigation of the AR user interface in response to a motion of the HMD and an eye gaze of the user matching a user interface trigger pattern.
    Type: Application
    Filed: September 30, 2015
    Publication date: March 30, 2017
    Applicant: DAQRI, LLC
    Inventors: Brian Mullins, Matthew Kammerait, Michael French, Ian Crombie
  • Publication number: 20170092232
    Abstract: An augmented reality device includes a transparent optical display for displaying one or more depth-encoded images. The transparent optical display leverages an optical true time delay circuit communicatively coupled to a multi-layered optical element for displaying the one or more depth-encoded images. A light source is modified or modulated, which is then directed by the optical true time delay circuit, to create the depth-encoded images. A dynamic depth encoder determines which layers of the multi-layered optical element are to be energized, and the optical true time delay circuit is directed accordingly. In this manner, the optical true time delay circuit uses the inherent delay of transmitting the light as a controlled proxy for complex processing.
    Type: Application
    Filed: September 30, 2016
    Publication date: March 30, 2017
    Inventors: Brian Mullins, Matthew Kammerait
  • Publication number: 20170094265
    Abstract: A device can determine a distance to an object. The device can use the determined distance to vary a focal length of a first adjustable element so that the first adjustable element directs light from the object into a first waveguide and onto a detector, and forms an image of the object at the detector. The device can produce an image, such as augmented content, on a panel. The device can direct light from the panel into a second waveguide. The device can use the determined distance to vary a focal length of a second adjustable element so that the second adjustable element directs light out of the second waveguide and forms a virtual image of the panel in a plane coincident with the object. The device can operate as an augmented reality headset. The adjustable elements can be phase modulators, or acoustically responsive material with surface acoustic wave transducers.
    Type: Application
    Filed: September 30, 2016
    Publication date: March 30, 2017
    Inventors: Brian Mullins, Matthew Kammerait
  • Patent number: 9578399
    Abstract: An application generates instructions to a wearable device to remotely activate a sensor in the wearable device and to receive sensor data from the sensor. A query related to a physical object is received. Instructions to wearable devices are generated to remotely activate at least one sensor of the wearable devices in response to the query. Sensor data is received from at least one of the wearable devices in response to that wearable device being within a range of the physical object.
    Type: Grant
    Filed: August 25, 2014
    Date of Patent: February 21, 2017
    Assignee: DAQRI, LLC
    Inventors: Brian Mullins, Matthew Kammerait, Christopher Broaddus
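The query flow in this abstract (and its related application above) can be sketched as: a query about a physical object sends activation instructions to wearable devices, and sensor data is collected only from devices within range of the object. Device IDs, positions, and the range value are illustrative assumptions.

```python
# Hypothetical sketch: activate wearables' sensors in response to a
# query, then collect data from those within range of the object.
def handle_query(object_pos, wearables, sensor_range=10.0):
    """Send activation instructions; return data from in-range devices."""
    results = {}
    for device in wearables:
        device["sensor_active"] = True          # remote activation instruction
        dx = device["pos"][0] - object_pos[0]
        dy = device["pos"][1] - object_pos[1]
        if (dx * dx + dy * dy) ** 0.5 <= sensor_range:
            results[device["id"]] = device["read"]()
    return results

wearables = [
    {"id": "hmd-1", "pos": (3, 4),   "read": lambda: {"temp": 21.5}},
    {"id": "hmd-2", "pos": (40, 40), "read": lambda: {"temp": 19.0}},  # too far
]
print(handle_query((0, 0), wearables))   # only hmd-1 is within range
```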
  • Patent number: 9558592
    Abstract: A system and method for visualization of physical interactions are described. Objects in a scene are captured with a viewing device. Physical characteristics of the objects are computed using data from at least one sensor corresponding to the objects. A physics model of predicted interactions between the objects is generated using their physical characteristics. An interaction visualization is generated based on the physics model of the predicted interactions, and an image of the objects is augmented with the interaction visualization in a display of the viewing device.
    Type: Grant
    Filed: August 15, 2014
    Date of Patent: January 31, 2017
    Assignee: DAQRI, LLC
    Inventors: Brian Mullins, Matthew Kammerait, Christopher Broaddus
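The prediction step behind such a physics model can be illustrated with the simplest possible case: given positions and velocities inferred for two objects, a constant-velocity model estimates whether and where they will meet, marking the area of the image to augment. One-dimensional motion is assumed for brevity; this is a sketch of the general idea, not the patented method.

```python
# Minimal 1-D constant-velocity interaction prediction: two objects
# collide where their positions coincide at a future time.
def predict_collision(p1, v1, p2, v2):
    """Return (time, position) of collision, or None if they never meet."""
    rel_v = v1 - v2
    if rel_v == 0:
        return None                  # same velocity: the gap never closes
    t = (p2 - p1) / rel_v
    if t < 0:
        return None                  # objects are moving apart
    return t, p1 + v1 * t

print(predict_collision(0.0, 2.0, 10.0, -3.0))   # meet at t=2.0, x=4.0
```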
  • Publication number: 20160341961
    Abstract: A head mounted device (HMD) includes a transparent display, a first set of sensors, a second set of sensors, and a processor. The first set of sensors measures first sensor data related to a user of the HMD. The second set of sensors measures second sensor data related to the HMD. The processor determines a user-based context based on the first sensor data, determines an ambient-based context based on the second sensor data, and accesses AR content based on the user-based context and the ambient-based context. The HMD displays the AR content on the transparent display.
    Type: Application
    Filed: May 18, 2016
    Publication date: November 24, 2016
    Inventors: Brian Mullins, Matthew Kammerait
  • Publication number: 20160342782
    Abstract: A head mounted device (HMD) includes a transparent display, a biometric sensor, and a processor. The transparent display displays virtual objects. The biometric sensor measures biometric data of a user of the HMD. The processor generates and renders a sequence of virtual objects in the transparent display, records the biometric data of the user of the HMD for each virtual object of the sequence of virtual objects displayed in the transparent display, and authenticates the user based on the biometric data.
    Type: Application
    Filed: May 18, 2016
    Publication date: November 24, 2016
    Inventors: Brian Mullins, Matthew Kammerait, Christopher Broaddus
  • Publication number: 20160343168
    Abstract: A head mounted device (HMD) includes a transparent display, a first set of sensors, a second set of sensors, and a processor. The first set of sensors measures first sensor data including an identification of a user of the HMD and a biometric state of the user of the HMD. The second set of sensors measures second sensor data including a location of the HMD and ambient metrics based on the location of the HMD. The HMD determines a user-based context based on the first sensor data, determines an ambient-based context based on the second sensor data, determines an application context within an AR application implemented by the processor, identifies a virtual fictional character based on a combination of the user-based context, the ambient-based context, and the application context, and displays the virtual fictional character in the transparent display.
    Type: Application
    Filed: May 19, 2016
    Publication date: November 24, 2016
    Inventors: Brian Mullins, Matthew Kammerait
  • Publication number: 20160343169
    Abstract: A system and method for measuring depth using an optical radar system are described. The system includes an optical radar, a camera, a display, and a processor. The optical radar emits a signal towards an object. The processor identifies an object depicted in an image captured with the camera. The processor generates the signal with a non-repeating pattern of amplitude and frequency, and computes a depth of the object based on a difference in phase angle between the signal emitted from the optical radar and a return signal received at the optical radar. The depth includes a distance between the optical radar and the object. The processor generates AR content based on the identified object and adjusts a characteristic of the AR content in the display based on the computed depth of the object.
    Type: Application
    Filed: May 19, 2016
    Publication date: November 24, 2016
    Inventors: Brian Mullins, Matthew Kammerait, Mark Anthony Sararu
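The depth computation described above follows the standard phase-shift ranging relation: the phase difference between the emitted and returned signal, at a known modulation frequency, gives the round-trip path length, half of which is the distance to the object. The formula below is that textbook relation, not taken from the patent itself.

```python
# Phase-shift ranging: distance = c * delta_phi / (4 * pi * f),
# i.e. half the round-trip path implied by the measured phase shift.
import math

C = 299_792_458.0          # speed of light, m/s

def depth_from_phase(delta_phi_rad: float, freq_hz: float) -> float:
    """Distance to the object from phase shift and modulation frequency."""
    return C * delta_phi_rad / (4 * math.pi * freq_hz)

# A pi/2 phase shift at a 10 MHz modulation frequency:
print(round(depth_from_phase(math.pi / 2, 10e6), 3))   # about 3.747 m
```

The non-repeating amplitude/frequency pattern mentioned in the abstract serves to disambiguate which cycle the phase shift belongs to, since a bare phase measurement wraps every wavelength.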
  • Publication number: 20160342840
    Abstract: A head mounted device (HMD) includes a transparent display, sensors to generate sensor data, and a processor. The processor identifies a threat condition based on a threat pattern and the sensor data, and generates a warning notification in response to the identified threat condition. The threat pattern includes preconfigured thresholds for the sensor data. The HMD displays AR content comprising the warning notification in the transparent display.
    Type: Application
    Filed: May 18, 2016
    Publication date: November 24, 2016
    Inventors: Brian Mullins, Matthew Kammerait
  • Publication number: 20160247324
    Abstract: A system and method for generating a virtual content to a physical object is described. A processor includes an augmented reality application. The augmented reality application creates virtual content at the head mounted device, and associates the virtual content with predefined conditions based on data from sensors embedded in the head mounted device at a time of creation of the virtual content. The virtual content is displayed in a display of the head mounted device in response to sensor data satisfying the predefined conditions.
    Type: Application
    Filed: February 15, 2016
    Publication date: August 25, 2016
    Inventors: Brian Mullins, Matthew Kammerait
  • Publication number: 20160246384
    Abstract: Visual gestures in a display device allow a user to select and activate features in a display of the display device. A sensor of the display device tracks an eye gaze of a user directed at a display of the display device. A visual gesture module identifies a predefined trigger zone in the display. A virtual object application displays a virtual object in the display based on the eye gaze of the user and the predefined trigger zone.
    Type: Application
    Filed: February 24, 2016
    Publication date: August 25, 2016
    Inventors: Brian Mullins, Matthew Kammerait, Christopher Broaddus, Sterling Crispin, Lucas Kazansky
  • Patent number: D777379
    Type: Grant
    Filed: August 25, 2014
    Date of Patent: January 24, 2017
    Assignee: DAQRI, LLC
    Inventors: Brian Mullins, Matthew Kammerait, Christopher Broaddus, Timotheos Leahy
  • Patent number: D777381
    Type: Grant
    Filed: April 24, 2015
    Date of Patent: January 24, 2017
    Assignee: DAQRI, LLC
    Inventors: Senka Agic Bergman, Timotheos Leahy, Brian Mullins, Matthew Kammerait