Patents by Inventor Brian Mullins

Brian Mullins has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 9990759
    Abstract: A system and method for offloading augmented reality processing is described. A first sensor of a server generates a first set of sensor data corresponding to a location and an orientation of a display device. The server receives a request from the display device to offload a combination of at least one of a tracking process and a rendering process from the display device. The server generates offloaded processed data based on a combination of at least one of the first set of sensor data and a second set of sensor data. The second set of sensor data is generated by a second sensor at the display device. The server streams the offloaded processed data to the display device.
    Type: Grant
    Filed: May 11, 2017
    Date of Patent: June 5, 2018
    Assignee: DAQRI, LLC
    Inventor: Brian Mullins
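
The offload flow summarized in patent 9990759 above can be pictured with a small sketch. The Python below is a minimal, hypothetical rendering of that flow; the class names and the simple pose-averaging fusion step are illustrative assumptions, not details taken from the patent.

```python
# Minimal sketch of an AR-processing offload flow: the server fuses its own
# sensor estimate with the display device's estimate, then runs whichever
# processes the device asked to offload. All names are illustrative.
from dataclasses import dataclass

@dataclass
class OffloadRequest:
    device_id: str
    processes: set          # e.g. {"tracking"}, {"rendering"}, or both

@dataclass
class SensorData:
    location: tuple          # (x, y, z) of the display device
    orientation: tuple       # (pitch, yaw, roll)

def combine_sensor_data(server_data: SensorData, device_data: SensorData) -> SensorData:
    # Hypothetical fusion: average the two pose estimates.
    loc = tuple((a + b) / 2 for a, b in zip(server_data.location, device_data.location))
    ori = tuple((a + b) / 2 for a, b in zip(server_data.orientation, device_data.orientation))
    return SensorData(loc, ori)

def handle_offload(request: OffloadRequest, server_data: SensorData, device_data: SensorData):
    fused = combine_sensor_data(server_data, device_data)
    results = {}
    if "tracking" in request.processes:
        results["pose"] = (fused.location, fused.orientation)   # offloaded tracking result
    if "rendering" in request.processes:
        results["frame"] = f"rendered@{fused.location}"         # placeholder rendered frame
    return results                                              # streamed back to the device
```
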
  • Publication number: 20180150957
    Abstract: A device for multi-spectrum segmentation for computer vision is described. A first optical sensor operates within a first spectrum range and generates first image data corresponding to a first image captured by the first optical sensor. A second optical sensor operates within a second spectrum range different from the first spectrum range and generates second image data corresponding to a second image captured by the second optical sensor. The device identifies a first region in the first image, maps a first portion of the first image to a second portion of the second image data, and provides the second portion of the second image data to a server that generates augmented reality content based on the second portion of the second image data. The device displays the augmented reality content.
    Type: Application
    Filed: January 26, 2018
    Publication date: May 31, 2018
    Inventors: Brian Mullins, Nalin Senthamil, Eric Douglas Lundquist
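
The region-mapping step in publication 20180150957 above can be sketched as a coordinate transform between two aligned sensors. The snippet below is only an illustration under the assumption that the sensors differ solely in resolution; the function names and scale factors are invented.

```python
# Hypothetical sketch: map a region detected in one spectrum (e.g. thermal)
# onto the corresponding pixels of a second-spectrum (e.g. visible) image.
def map_region(region, scale_x, scale_y):
    """Map a bounding box (x, y, w, h) from the first image into the
    coordinate space of the second image."""
    x, y, w, h = region
    return (int(x * scale_x), int(y * scale_y), int(w * scale_x), int(h * scale_y))

def crop(image, box):
    x, y, w, h = box
    return [row[x:x + w] for row in image[y:y + h]]   # image as a list of rows

# Region found in the first-spectrum image ...
thermal_region = (10, 20, 4, 3)
# ... mapped into the second-spectrum image, whose resolution is 4x larger.
visible_box = map_region(thermal_region, scale_x=4, scale_y=4)
visible_image = [[0] * 640 for _ in range(480)]
portion_for_server = crop(visible_image, visible_box)   # sent to the AR server
```
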
  • Patent number: 9984508
    Abstract: A system and method for measuring depth using an optical radar system are described. The system includes an optical radar, a camera, a display, and a processor. The optical radar emits a signal towards an object. The processor identifies an object depicted in an image captured with the camera. The processor generates the signal with a non-repeating pattern of amplitude and frequency, and computes a depth of the object based on a difference in phase angle between the signal emitted from the optical radar and a return signal received at the optical radar. The depth includes a distance between the optical radar and the object. The processor generates AR content based on the identified object and adjusts a characteristic of the AR content in the display based on the computed depth of the object.
    Type: Grant
    Filed: May 19, 2016
    Date of Patent: May 29, 2018
    Assignee: DAQRI, LLC
    Inventors: Brian Mullins, Matthew Kammerait, Mark Anthony Sararu
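
The depth computation described in patent 9984508 above follows the standard continuous-wave ranging relation, where distance is proportional to the phase shift between the emitted and returned signals. The worked example below is a generic illustration of that relation and ignores phase-wrap ambiguity; the exact formulation in the patent may differ.

```python
# Worked example of phase-difference ranging: d = c * delta_phi / (4 * pi * f).
import math

C = 299_792_458.0   # speed of light, m/s

def depth_from_phase(phase_emitted: float, phase_returned: float, freq_hz: float) -> float:
    """Distance between the optical radar and the object, assuming the phase
    shift stays within one modulation period (no phase-wrap ambiguity)."""
    delta_phi = (phase_returned - phase_emitted) % (2 * math.pi)
    round_trip_time = delta_phi / (2 * math.pi * freq_hz)
    return C * round_trip_time / 2

# A 30-degree shift at a 10 MHz modulation frequency corresponds to about 1.25 m.
print(depth_from_phase(0.0, math.radians(30), 10e6))
```
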
  • Patent number: 9978174
    Abstract: An application generates instructions to a wearable device to remotely activate a sensor in the wearable device and to receive sensor data from the sensor. A query related to a physical object is received. Instructions to wearable devices are generated to remotely activate at least one sensor of the wearable devices in response to the query. Sensor data is received from at least one of the wearable devices in response to that wearable device being within a range of the physical object.
    Type: Grant
    Filed: December 29, 2016
    Date of Patent: May 22, 2018
    Assignee: DAQRI, LLC
    Inventors: Brian Mullins, Matthew Kammerait, Christopher Broaddus
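
The query-driven activation in patent 9978174 above amounts to fanning out an activation command and then filtering the responses by proximity to the physical object. The sketch below assumes invented device and range types purely for illustration.

```python
# Illustrative sketch: remotely activate wearable sensors in response to a
# query, then keep data only from devices within range of the object.
from dataclasses import dataclass

@dataclass
class WearableDevice:
    device_id: str
    position: tuple            # (x, y) for simplicity
    sensor_active: bool = False

    def activate_sensor(self):
        self.sensor_active = True

    def read_sensor(self):
        return {"device": self.device_id, "reading": 42}   # placeholder sensor data

def within_range(device, obj_position, radius):
    dx = device.position[0] - obj_position[0]
    dy = device.position[1] - obj_position[1]
    return (dx * dx + dy * dy) ** 0.5 <= radius

def handle_query(devices, obj_position, radius=10.0):
    for d in devices:
        d.activate_sensor()
    return [d.read_sensor() for d in devices if within_range(d, obj_position, radius)]
```
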
  • Patent number: 9934754
    Abstract: A system and method for generating a dynamic sensor array for an augmented reality system is described. A head mounted device includes one or more sensors, an augmented reality (AR) application, and a sensor array module. The sensor array module identifies available sensors from other head mounted devices that are geographically located within a predefined area. A dynamic sensor array is formed based on the available sensors and the one or more sensors. The dynamic sensor array is updated based on an operational status of the available sensors and the one or more sensors. The AR application generates AR content based on data from the dynamic sensor array. A display of the head mounted device displays the AR content.
    Type: Grant
    Filed: September 30, 2015
    Date of Patent: April 3, 2018
    Assignee: DAQRI, LLC
    Inventors: Brian Mullins, Matthew Kammerait
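
The dynamic sensor array of patent 9934754 above can be read as discovery plus a liveness filter. The rough sketch below uses invented data structures and a simple radius test; none of it is drawn from the patent's claims.

```python
# Rough sketch: form a sensor array from the local HMD's sensors plus sensors
# advertised by nearby head mounted devices, and drop non-operational ones.
def discover_nearby_sensors(all_devices, own_position, area_radius):
    nearby = []
    for dev in all_devices:
        dx = dev["position"][0] - own_position[0]
        dy = dev["position"][1] - own_position[1]
        if (dx * dx + dy * dy) ** 0.5 <= area_radius:
            nearby.extend(dev["sensors"])
    return nearby

def update_array(local_sensors, remote_sensors):
    """Keep only sensors that report an operational status."""
    return [s for s in local_sensors + remote_sensors if s.get("operational", False)]

local = [{"name": "imu", "operational": True}]
remote_devices = [{"position": (3, 4), "sensors": [{"name": "thermal", "operational": True},
                                                   {"name": "lidar", "operational": False}]}]
array = update_array(local, discover_nearby_sensors(remote_devices, (0, 0), area_radius=10))
# The AR application would generate content from `array` (imu + thermal here).
```
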
  • Patent number: 9916664
    Abstract: A system and method for multi-spectrum segmentation for computer vision is described. A first sensor captures an image within a first spectrum range and generates first sensor data. A second sensor captures an image within a second spectrum range different than the first spectrum range and generates second sensor data. A multi-spectrum segmentation module identifies a segmented portion of the image within the second spectrum range based on: the second sensor data, a subset of the first sensor data corresponding to the segmented portion of the image within the second spectrum range, and a segmented portion of the image at the second spectrum range corresponding to the subset of the first sensor data. The multi-spectrum segmentation module identifies a physical object in the segmented portion of the image within the second spectrum range, and a device generates augmented reality content based on the identified physical object.
    Type: Grant
    Filed: February 9, 2016
    Date of Patent: March 13, 2018
    Assignee: DAQRI, LLC
    Inventors: Brian Mullins, Nalin Senthamil, Eric Douglas Lundquist
  • Patent number: 9907104
    Abstract: Systems, apparatus, methods, and non-transitory media for programmatically associating nearby users are discussed herein. Some embodiments may include a user-wearable gesture exchange device including a motion sensor and circuitry. The motion sensor may be configured to generate motion data values indicating motion of the gesture exchange device. The circuitry may be configured to monitor motion data values generated by the motion sensor for detection of a gesture exchange signature, such as a handshake, being performed by the user. In some embodiments, the gesture exchange device may be configured to establish a wireless communication connection with a second gesture exchange device for exchange of user data, such as in response to detecting user performance of the gesture exchange signature and/or determining that the second gesture exchange device is within a predetermined proximity.
    Type: Grant
    Filed: January 15, 2016
    Date of Patent: February 27, 2018
    Assignee: LOOPD INC.
    Inventors: Brian Mullin Friedman, Allen Houng, Sambhav Galada
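
The gesture-signature monitoring in patent 9907104 above can be illustrated with a toy detector. The sketch below counts oscillations in a stream of acceleration samples and then selects a nearby device to pair with; the threshold, window, and proximity values are invented and no specific detection method from the patent is implied.

```python
# Hypothetical sketch: detect a handshake-like signature in motion samples and,
# on detection, pick a sufficiently close device for a data exchange.
def looks_like_handshake(samples, threshold=1.5, min_oscillations=3):
    """Count sign changes in vertical acceleration that exceed a threshold;
    a handshake produces several rapid up/down oscillations."""
    oscillations = 0
    prev = samples[0]
    for s in samples[1:]:
        if abs(s) > threshold and (s > 0) != (prev > 0):
            oscillations += 1
        prev = s
    return oscillations >= min_oscillations

def maybe_exchange(samples, nearby_devices, proximity_m=1.0):
    if not looks_like_handshake(samples):
        return None
    close = [d for d in nearby_devices if d["distance_m"] <= proximity_m]
    return close[0]["id"] if close else None    # device to open a wireless link with

print(maybe_exchange([0.1, 2.0, -2.1, 1.9, -1.8, 0.2],
                     [{"id": "badge-42", "distance_m": 0.4}]))
```
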
  • Patent number: 9898844
    Abstract: A system and method for augmented reality content adapted to changes in real world space geometry are described. A device captures an image of a local environment and maps a real world space geometry of the local environment using the image of the local environment. The device generates a visualization of a virtual object in the display relative to the mapped real world space geometry of the local environment. The content of the virtual object is adjusted in response to changes in the real world space geometry of the local environment.
    Type: Grant
    Filed: December 31, 2013
    Date of Patent: February 20, 2018
    Assignee: DAQRI, LLC
    Inventor: Brian Mullins
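
One way to picture the adaptation step in patent 9898844 above is re-fitting a virtual object whenever the mapped geometry changes. The sketch below reduces "space geometry" to a free-space bounding box and is purely illustrative.

```python
# Illustrative sketch: re-fit a virtual object to the currently mapped
# free space whenever the real-world geometry changes.
from dataclasses import dataclass

@dataclass
class SpaceGeometry:
    free_width: float
    free_height: float

@dataclass
class VirtualObject:
    width: float
    height: float

def fit_object(obj: VirtualObject, space: SpaceGeometry) -> VirtualObject:
    """Shrink (never enlarge) the virtual object so it fits the mapped space."""
    scale = min(1.0, space.free_width / obj.width, space.free_height / obj.height)
    return VirtualObject(obj.width * scale, obj.height * scale)

obj = VirtualObject(2.0, 1.0)
print(fit_object(obj, SpaceGeometry(1.0, 3.0)))   # space narrowed -> object halves
```
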
  • Publication number: 20180047216
    Abstract: A first display device determines a first device configuration of the first display device and a first context of a first AR application operating on the first display device. The first display device detects a second display device that comprises a head up display (HUD) system of a vehicle. A second device configuration of the second display device and a second context of a second AR application operating on the second display device are determined. The first display device generates a first collaboration configuration for the first display device and a second collaboration configuration for the second display device. A task of the first AR application is allocated to the second AR application based on the first collaboration configuration. The second display device performs the allocated task based on the second collaboration configuration.
    Type: Application
    Filed: October 26, 2017
    Publication date: February 15, 2018
    Inventor: Brian Mullins
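
The task allocation between a display device and a vehicle HUD in publication 20180047216 above reduces to a policy over exchanged configurations and contexts. The sketch below encodes one invented policy ("give navigation rendering to the HUD while driving") purely to make the data flow concrete.

```python
# Hypothetical sketch: build collaboration configurations and allocate a task
# to whichever display is better placed to perform it.
def build_collaboration(first_cfg, second_cfg, first_ctx, second_ctx):
    """Return (first_collab, second_collab) dictionaries."""
    first_collab, second_collab = {"tasks": []}, {"tasks": []}
    if second_cfg.get("type") == "vehicle_hud" and first_ctx.get("activity") == "driving":
        second_collab["tasks"].append("render_navigation")
    else:
        first_collab["tasks"].append("render_navigation")
    return first_collab, second_collab

first, second = build_collaboration(
    first_cfg={"type": "hmd"}, second_cfg={"type": "vehicle_hud"},
    first_ctx={"activity": "driving"}, second_ctx={"activity": "idle"})
print(second)   # {'tasks': ['render_navigation']} -> task performed by the HUD
```
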
  • Publication number: 20180046925
    Abstract: A method, apparatus and computer program product are provided for calculating closing metrics regarding a contract between a promotion service and provider. A promotional system may calculate a probability of closing, and an estimated time to close. The promotion service may offer a promotion to consumers for a discounted product or service, to be honored by the provider. A category, lead source, historical data, stage in sales, and/or size of the provider may be used in calculating a probability of close and/or time to close. An example method may comprise supplying a classifying model with a dataset, wherein the dataset comprises an identification of a provider and attributes corresponding to the provider and identifying a class of the provider in accordance with the plurality of corresponding attributes, wherein the identification is determined based on one or more patterns determinative of a return rate by the classifying model.
    Type: Application
    Filed: August 3, 2017
    Publication date: February 15, 2018
    Inventors: Brian Mullins, Matt DeLand, Zahra Ferdowsi, Stephen Lang, John Stokvis, Nolan Finn, Shafiq Shariff
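
The closing metrics in publication 20180046925 above rest on historical deals with similar attributes. The sketch below estimates a probability of close and an average time to close from such deals; the attribute names and data are invented and no particular model from the publication is implied.

```python
# Small sketch: estimate probability-of-close and time-to-close for a provider
# from historical deals in the same category and sales stage.
def estimate_close(provider, history):
    similar = [h for h in history
               if h["category"] == provider["category"] and h["stage"] == provider["stage"]]
    if not similar:
        return None, None
    p_close = sum(h["closed"] for h in similar) / len(similar)
    closed = [h["days_to_close"] for h in similar if h["closed"]]
    avg_days = sum(closed) / len(closed) if closed else None
    return p_close, avg_days

history = [
    {"category": "restaurant", "stage": "negotiation", "closed": True,  "days_to_close": 12},
    {"category": "restaurant", "stage": "negotiation", "closed": False, "days_to_close": None},
    {"category": "restaurant", "stage": "negotiation", "closed": True,  "days_to_close": 20},
]
print(estimate_close({"category": "restaurant", "stage": "negotiation"}, history))
# -> (0.666..., 16.0)
```
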
  • Publication number: 20180047154
    Abstract: A mobile device identifies a user task provided by an augmented reality application at a mobile device. The mobile device identifies a first physical tool valid for performing the user task from a tool compliance library based on the user task. The mobile device detects and identifies a second physical tool present at the mobile device. The mobile device determines whether the second physical tool matches the first physical tool. The mobile device displays augmented reality content that identifies at least one of a missing physical tool, an unmatched physical tool, or a matched physical tool based on whether the second physical tool matches the first physical tool.
    Type: Application
    Filed: October 23, 2017
    Publication date: February 15, 2018
    Inventor: Brian Mullins
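
The compliance check in publication 20180047154 above is essentially a lookup and comparison. The minimal sketch below uses an invented tool library and status vocabulary to show the shape of that check.

```python
# Minimal sketch of a tool-compliance check: look up the tool a task requires,
# compare it against the tool detected in view, and report the result.
TOOL_LIBRARY = {"replace_filter": "torque_wrench_10nm"}   # task -> required tool (invented)

def check_tools(task, detected_tool):
    required = TOOL_LIBRARY.get(task)
    if required is None:
        return {"status": "unknown_task"}
    if detected_tool is None:
        return {"status": "missing", "required": required}
    if detected_tool == required:
        return {"status": "matched", "tool": detected_tool}
    return {"status": "unmatched", "required": required, "detected": detected_tool}

print(check_tools("replace_filter", "screwdriver"))
# -> {'status': 'unmatched', ...}; the AR overlay would flag the mismatch
```
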
  • Publication number: 20180012364
    Abstract: A survey application generates a survey of components associated with a three-dimensional model of an object. The survey application receives video feeds, location information, and orientation information from wearable devices in proximity to the object. The three-dimensional model of the object is generated based on the video feeds, sensor data, location information, and orientation information received from the wearable devices. Analytics is performed from the video feeds to identify a manipulation on the object. The three-dimensional model of the object is updated based on the manipulation on the object. A dynamic status related to the manipulation on the object is generated with respect to reference data related to the object. A survey of components associated with the three-dimensional model of the object is generated.
    Type: Application
    Filed: September 21, 2017
    Publication date: January 11, 2018
    Inventor: Brian Mullins
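
The survey generation in publication 20180012364 above (and the granted patent 9865058 below, which shares its abstract) can be thought of as comparing observed component states against reference data. The sketch below is a coarse illustration with invented structures; the real system works from video analytics rather than pre-labeled states.

```python
# Coarse sketch: aggregate observed component states and flag components whose
# state diverges from reference data, producing a survey of components.
def build_survey(observations, reference):
    """observations: {component: observed_state}; reference: {component: expected_state}."""
    survey = []
    for component, expected in reference.items():
        observed = observations.get(component)
        survey.append({
            "component": component,
            "expected": expected,
            "observed": observed,
            "status": "ok" if observed == expected else "deviation",
        })
    return survey

reference = {"valve_a": "closed", "panel_b": "installed"}
observations = {"valve_a": "open", "panel_b": "installed"}   # manipulation detected on valve_a
for row in build_survey(observations, reference):
    print(row)
```
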
  • Patent number: 9865093
    Abstract: A first display device determines a first device configuration of the first display device and a first context of a first AR application operating on the first display device. The first display device connects to a second display device and determines a second device configuration of the second display device and a second context of a second AR application operating on the second display device. The first display device generates a first collaboration configuration for the first display device and a second collaboration configuration for the second display device. The first and second collaboration configurations are based on the first device configuration, the first context, the second device configuration, and the second context. The second display device modifies an operation of the second AR application based on the second collaboration configuration. The first display device modifies an operation of the first AR application based on the first collaboration configuration.
    Type: Grant
    Filed: June 30, 2016
    Date of Patent: January 9, 2018
    Assignee: DAQRI, LLC
    Inventor: Brian Mullins
  • Patent number: 9865058
    Abstract: A survey application generates a survey of components associated with a three-dimensional model of an object. The survey application receives video feeds, location information, and orientation information from wearable devices in proximity to the object. The three-dimensional model of the object is generated based on the video feeds, sensor data, location information, and orientation information received from the wearable devices. Analytics is performed from the video feeds to identify a manipulation on the object. The three-dimensional model of the object is updated based on the manipulation on the object. A dynamic status related to the manipulation on the object is generated with respect to reference data related to the object. A survey of components associated with the three-dimensional model of the object is generated.
    Type: Grant
    Filed: February 19, 2014
    Date of Patent: January 9, 2018
    Assignee: DAQRI, LLC
    Inventor: Brian Mullins
  • Patent number: 9864910
    Abstract: A head mounted device (HMD) includes a transparent display, sensors to generate sensor data, and a processor. The processor identifies a threat condition based on a threat pattern and the sensor data, and generates a warning notification in response to the identified threat condition. The threat pattern includes preconfigured thresholds for the sensor data. The HMD displays AR content comprising the warning notification in the transparent display.
    Type: Grant
    Filed: March 2, 2017
    Date of Patent: January 9, 2018
    Assignee: DAQRI, LLC
    Inventors: Brian Mullins, Matthew Kammerait
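
The threat detection in patent 9864910 above is driven by preconfigured thresholds over sensor data. The sketch below shows one plausible shape for that check; the pattern names and threshold values are invented.

```python
# Sketch: match live sensor data against preconfigured threshold patterns and
# emit warning notifications for AR display. Thresholds are illustrative only.
THREAT_PATTERNS = {
    "high_temperature": {"sensor": "temperature_c", "op": "gt", "threshold": 60.0},
    "gas_leak":         {"sensor": "co_ppm",        "op": "gt", "threshold": 35.0},
}

def detect_threats(sensor_data):
    warnings = []
    for name, pattern in THREAT_PATTERNS.items():
        value = sensor_data.get(pattern["sensor"])
        if value is None:
            continue
        if pattern["op"] == "gt" and value > pattern["threshold"]:
            warnings.append(f"WARNING: {name} ({pattern['sensor']}={value})")
    return warnings   # rendered as AR content on the transparent display

print(detect_threats({"temperature_c": 72.5, "co_ppm": 3.0}))
```
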
  • Publication number: 20180005442
    Abstract: A first display device determines a first device configuration of the first display device and a first context of a first AR application operating on the first display device. The first display device connects to a second display device and determines a second device configuration of the second display device and a second context of a second AR application operating on the second display device. The first display device generates a first collaboration configuration for the first display device and a second collaboration configuration for the second display device. The first and second collaboration configurations are based on the first device configuration, the first context, the second device configuration, and the second context. The second display device modifies an operation of the second AR application based on the second collaboration configuration. The first display device modifies an operation of the first AR application based on the first collaboration configuration.
    Type: Application
    Filed: June 30, 2016
    Publication date: January 4, 2018
    Inventor: Brian Mullins
  • Publication number: 20180005444
    Abstract: A device executes an augmented reality (AR) application at the device. The device detects a failure of the AR application and enters a failsafe mode in response to detecting the failure of the AR application. The failsafe mode includes a minimum set of AR content and AR application functionalities relative to the full set of AR content and AR application functionalities when the AR application is being executed.
    Type: Application
    Filed: June 30, 2016
    Publication date: January 4, 2018
    Inventor: Brian Mullins
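
The failsafe behavior in publication 20180005444 above is a detect-and-degrade pattern. The sketch below simulates a failure and falls back to a reduced content set; the function names and content list are hypothetical.

```python
# Illustrative failsafe wrapper: run the full AR pipeline, and on failure enter
# a failsafe mode with a minimum set of AR content and functionality.
MINIMUM_AR_CONTENT = ["exit_route_overlay", "battery_indicator"]

def run_full_ar():
    raise RuntimeError("tracking subsystem crashed")   # simulated failure

def run_ar_application():
    try:
        return {"mode": "full", "content": run_full_ar()}
    except Exception as err:
        return {"mode": "failsafe", "content": MINIMUM_AR_CONTENT, "reason": str(err)}

print(run_ar_application())
```
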
  • Publication number: 20180005440
    Abstract: An application programming interface (API) server accesses first data of a non-augmented reality (AR) application. The first data including first content data, first control data, first user interface data, and a first data format. The API server maps the first data from the non-AR application to second data compatible with an AR application in a display device. The second data includes AR content data, AR control data, AR user interface data, and an AR data format. The second data is generated using an API module of the API server. The API server provides the second data to the AR application in the display device. The display device is configured to display the AR content data, to operate on the content data based on the AR control data, and to generate an AR user interface for the display device with the AR user interface data.
    Type: Application
    Filed: June 30, 2016
    Publication date: January 4, 2018
    Inventor: Brian Mullins
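
The API mapping in publication 20180005440 above translates conventional application data into AR content, control, and UI data. The sketch below shows one invented mapping for a work-order-like record; the field names and format label are assumptions for illustration.

```python
# Sketch of an API layer translating non-AR application data into an
# AR-oriented structure for a display device.
def map_to_ar(non_ar_record):
    """non_ar_record: data from a conventional (non-AR) application."""
    return {
        "ar_content": {"label": non_ar_record["title"], "body": non_ar_record["description"]},
        "ar_control": {"actions": list(non_ar_record.get("allowed_actions", []))},
        "ar_ui": {"anchor": "object",          # attach the panel to the recognized object
                  "panel_size": "small" if len(non_ar_record["description"]) < 80 else "large"},
        "format": "ar-v1",
    }

work_order = {"title": "Inspect pump", "description": "Check seals for leaks.",
              "allowed_actions": ["complete", "defer"]}
print(map_to_ar(work_order))   # handed to the AR application on the display device
```
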
  • Patent number: 9858707
    Abstract: A server receives video data and location data from mobile devices. Each mobile device records a video of a target. The location data identifies a position of the corresponding mobile device relative to the target and a distance between the corresponding mobile device to the target. The location data is associated with a corresponding video frame from the video data. The server identifies video frames from the video data captured from the mobile devices. The server scales parts of the identified video frames based on the position and distance of the corresponding mobile devices to the target. The server extracts the scaled parts of the identified video frames and generates a three-dimensional model of the target based on the extracted scaled parts of the identified video frames from the plurality of mobile devices.
    Type: Grant
    Filed: December 29, 2016
    Date of Patent: January 2, 2018
    Assignee: DAQRI, LLC
    Inventor: Brian Mullins
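
The scaling step in patent 9858707 above normalizes views captured at different distances before reconstruction. The simplified sketch below applies a distance-proportional scale to each frame's region of interest; it is purely illustrative and omits the actual 3-D reconstruction.

```python
# Simplified sketch: scale each frame's region of interest by the recording
# device's distance to the target so views contribute at a common scale.
def scale_factor(distance_m, reference_distance_m=1.0):
    """Frames shot from farther away are scaled up proportionally."""
    return distance_m / reference_distance_m

def prepare_frames(frames):
    """frames: list of dicts with 'pixels' (width placeholder), 'distance_m',
    and 'position'. Returns scaled parts ready for 3-D reconstruction."""
    prepared = []
    for f in frames:
        s = scale_factor(f["distance_m"])
        prepared.append({"position": f["position"],
                         "scaled_width": f["pixels"] * s})
    return prepared   # a real pipeline would then triangulate these views

frames = [{"pixels": 100, "distance_m": 2.0, "position": "north"},
          {"pixels": 100, "distance_m": 4.0, "position": "east"}]
print(prepare_frames(frames))
```
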
  • Patent number: 9844119
    Abstract: A head mounted device includes a helmet, an ambient light sensor, a pupil dimension sensor, a lighting element, and a dynamic lighting system. The ambient light sensor is disposed in an outside surface of the helmet and measures ambient light outside the helmet. The pupil dimension sensor is disposed in a housing of the helmet and measures a size of a pupil of a wearer of the helmet. The lighting element is disposed in the outside surface of the helmet. The dynamic lighting system controls the lighting element and adjusts an intensity of the lighting element based on the ambient light and the pupil size of the wearer of the helmet.
    Type: Grant
    Filed: February 9, 2016
    Date of Patent: December 12, 2017
    Assignee: DAQRI, LLC
    Inventors: Brian Mullins, Matthew Kammerait
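
The dynamic lighting rule in patent 9844119 above combines ambient light and pupil size into a lighting adjustment. The sketch below is one possible blend of the two inputs; the constants and weighting are invented, not taken from the patent.

```python
# Rough sketch: raise the helmet lighting when ambient light is low or the
# wearer's pupils are dilated. Constants are illustrative only.
def lighting_intensity(ambient_lux, pupil_diameter_mm,
                       max_lux=500.0, min_pupil=2.0, max_pupil=8.0):
    """Return an intensity in [0, 1] for the helmet's lighting element."""
    darkness = 1.0 - min(ambient_lux, max_lux) / max_lux           # 0 bright .. 1 dark
    dilation = (min(max(pupil_diameter_mm, min_pupil), max_pupil) - min_pupil) \
               / (max_pupil - min_pupil)                            # 0 constricted .. 1 dilated
    return round(0.5 * darkness + 0.5 * dilation, 2)

print(lighting_intensity(ambient_lux=50, pupil_diameter_mm=6.5))   # dim room -> ~0.8
```
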