Patents by Inventor Brian Mullins

Brian Mullins has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20180365893
    Abstract: A first augmented-reality (AR) device comprises an optical sensor, a geographic location sensor, an orientation sensor, and a display. The first AR device accesses a first geographic location of the first AR device and an orientation of the first AR device, and captures a picture at the first geographic location of the first AR device, associated with the orientation of the first AR device. The first AR device retrieves, from a server, transportation information from a second AR device in a vehicle. The server assigns the second AR device to the first AR device. The first AR device forms transportation AR content based on the transportation information and displays the transportation AR content in the display based on the first geographic location and orientation of the first AR device, the transportation information, and the picture captured at the first geographic location of the first AR device.
    Type: Application
    Filed: June 16, 2017
    Publication date: December 20, 2018
    Inventor: Brian Mullins
  • Publication number: 20180354509
    Abstract: Disclosed are systems, methods, and non-transitory computer-readable media for presenting an augmented reality (AR) visualization of an advanced driver-assistance system (ADAS). A viewing device integrated into a vehicle gathers sensor data from one or more sensors. The sensor data describes a speed and trajectory of the vehicle and a geographic location of a physical object in relation to the vehicle. The viewing device determines, based on the sensor data, that a threat level of the vehicle colliding with the physical object meets or exceeds a threshold threat level. In response to determining that the threat level of the vehicle colliding with the physical object meets or exceeds the threshold threat level, the viewing device determines an automated movement of the vehicle to avoid colliding with the physical object. The viewing device then presents, on a heads-up display of the vehicle, virtual content depicting the automated movement, and executes the automated movement.
    Type: Application
    Filed: June 8, 2017
    Publication date: December 13, 2018
    Inventor: Brian Mullins
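The threshold check described in publication 20180354509 above reduces to a meets-or-exceeds comparison on a computed threat score, followed by presenting and executing an avoidance maneuver. The sketch below illustrates that flow only; the threat-scoring formula, the 0.7 threshold, and every name (SensorData, maybe_avoid, render_on_hud) are illustrative assumptions, not taken from the patent.

```python
# Hypothetical sketch of the meets-or-exceeds check in publication 20180354509.
from dataclasses import dataclass

@dataclass
class SensorData:
    speed_mps: float            # vehicle speed
    obstacle_distance_m: float  # distance to the detected physical object

THREAT_THRESHOLD = 0.7  # assumed normalized threat score in [0, 1]

def threat_level(data: SensorData) -> float:
    """Toy threat score: closer objects at higher speed score higher."""
    if data.obstacle_distance_m <= 0:
        return 1.0
    time_to_collision_s = data.obstacle_distance_m / max(data.speed_mps, 0.1)
    return min(1.0, 2.0 / time_to_collision_s)  # 2 s to impact or less -> maximum threat

def render_on_hud(maneuver):  # stand-in for the heads-up display
    print("HUD:", maneuver)

def execute(maneuver):        # stand-in for the vehicle control interface
    print("executing:", maneuver)

def maybe_avoid(data: SensorData):
    """Meets-or-exceeds check, then present virtual content and execute the movement."""
    if threat_level(data) >= THREAT_THRESHOLD:
        maneuver = {"action": "brake", "target_speed_mps": 0.0}  # placeholder plan
        render_on_hud(maneuver)  # virtual content depicting the automated movement
        execute(maneuver)        # then execute the automated movement

maybe_avoid(SensorData(speed_mps=15.0, obstacle_distance_m=20.0))
```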
  • Patent number: 10147239
    Abstract: A server for content creation is described. A content creation tool of the server receives, from a first device, a content identifier of a physical object, a virtual object content, and a selection of a template corresponding to an interactive feature for the virtual object content. The content creation tool generates a content dataset based on the content identifier of the physical object, the virtual object content, and the selected template. The content creation tool provides the content dataset to a second device, the second device configured to display the interactive feature corresponding to the selected template.
    Type: Grant
    Filed: June 8, 2017
    Date of Patent: December 4, 2018
    Assignee: DAQRI, LLC
    Inventor: Brian Mullins
  • Publication number: 20180332266
    Abstract: Disclosed are systems, methods, and non-transitory computer-readable media for presenting spatially translated dimensions of an unseen object. A viewing device captures, with one or more sensors affixed to a vehicle, sensor data describing a distance between the vehicle and a physical object. The viewing device generates, based on the sensor data, a virtual model representing a position of at least a portion of the vehicle in relation to the physical object. The viewing device presents the virtual model on a display in the vehicle. The viewing device captures, with the one or more sensors affixed to the vehicle, updated sensor data describing an updated distance between the vehicle and the physical object, and updates the virtual model presented on the display based on the updated sensor data.
    Type: Application
    Filed: May 15, 2017
    Publication date: November 15, 2018
    Inventor: Brian Mullins
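Publication 20180332266 above describes a capture, model, present, re-capture loop. The following is a minimal sketch of that loop under assumed names (read_distance_m, build_model, present); the "virtual model" is deliberately reduced to a single distance value for illustration.

```python
# Illustrative update loop for publication 20180332266; all names are hypothetical.
import time

def read_distance_m() -> float:
    """Stand-in for the one or more sensors affixed to the vehicle."""
    return 1.8  # metres between the vehicle and the physical object

def build_model(distance_m: float) -> dict:
    """Trivial 'virtual model': the relative separation along one axis."""
    return {"vehicle_to_object_m": distance_m}

def present(model: dict) -> None:
    print("in-vehicle display:", model)

def run(poll_hz: float = 10.0, iterations: int = 3) -> None:
    for _ in range(iterations):
        present(build_model(read_distance_m()))  # refresh with the latest sensor data
        time.sleep(1.0 / poll_hz)

run()
```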
  • Publication number: 20180330531
    Abstract: Disclosed are systems, methods, and non-transitory computer-readable media for adjusting the depth of augmented reality (AR) content on a heads-up display (HUD). A viewing device identifies, based on sensor data, a physical object visible through a transparent display of a vehicle. The sensor data indicates an initial distance of the physical object from the vehicle. The viewing device gathers virtual content corresponding to the physical object and generates an initial presentation of the virtual content based on the initial distance. The viewing device presents the initial presentation of the virtual content on the transparent display at a position on the transparent display corresponding to the physical object. The viewing device determines, based on updated sensor data, an updated distance of the physical object and generates an updated presentation of the virtual content based on the updated distance. The viewing device presents the updated presentation of the virtual content on the transparent display of the vehicle.
    Type: Application
    Filed: May 15, 2017
    Publication date: November 15, 2018
    Inventor: Brian Mullins
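The initial and updated presentations in publication 20180330531 above depend on the object's distance. One plausible way to realize that dependence is simple pinhole scaling, sketched below; the pinhole model, the focal length, and the content width are assumptions for illustration only, not the patent's method.

```python
# Hypothetical distance-based scaling for HUD content (publication 20180330531).

FOCAL_LENGTH_PX = 800.0   # assumed pinhole-camera focal length for the HUD view
CONTENT_WIDTH_M = 0.5     # assumed real-world width the virtual content should span

def on_screen_width_px(distance_m: float) -> float:
    """Pinhole projection: apparent size shrinks as 1/distance."""
    return FOCAL_LENGTH_PX * CONTENT_WIDTH_M / max(distance_m, 0.01)

initial = on_screen_width_px(20.0)   # initial presentation with the object at 20 m
updated = on_screen_width_px(12.0)   # updated presentation as the object nears
print(f"{initial:.1f}px at 20 m -> {updated:.1f}px at 12 m")
```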
  • Publication number: 20180332335
    Abstract: An augmented-reality device comprises an optical sensor and a transparent display. The augmented-reality device detects, using the optical sensor, a first physical display located within a display distance of the augmented-reality device. The first physical display is connected to a computer. The augmented-reality device generates a virtual display configured to operate as a second physical display. The computer controls the second physical display. The augmented-reality device displays the virtual display in the transparent display. The virtual display appears adjacent to the first physical display.
    Type: Application
    Filed: May 15, 2017
    Publication date: November 15, 2018
    Inventor: Brian Mullins
  • Patent number: 10110883
    Abstract: A device can determine a distance to an object. The device can use the determined distance to vary a focal length of a first adjustable element so that the first adjustable element directs light from the object into a first waveguide and onto a detector, and forms an image of the object at the detector. The device can produce an image, such as augmented content, on a panel. The device can direct light from the panel into a second waveguide. The device can use the determined distance to vary a focal length of a second adjustable element so that the second adjustable element directs light out of the second waveguide and forms a virtual image of the panel in a plane coincident with the object. The device can operate as an augmented reality headset. The adjustable elements can be phase modulators, or acoustically responsive material with surface acoustic wave transducers.
    Type: Grant
    Filed: September 30, 2016
    Date of Patent: October 23, 2018
    Assignee: DAQRI, LLC
    Inventors: Brian Mullins, Matthew Kammerait
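Patent 10110883 above ties the adjustable elements' focal lengths to the measured object distance. The textbook thin-lens relation is one way to express that dependence; the patent does not specify this formula, so the snippet below is only a worked example of the standard relation under assumed distances.

```python
# Thin-lens relation as an illustration of distance-driven focal length (patent 10110883).

def required_focal_length(object_distance_m: float, image_distance_m: float) -> float:
    """Thin-lens equation: 1/f = 1/d_o + 1/d_i."""
    return 1.0 / (1.0 / object_distance_m + 1.0 / image_distance_m)

# Object measured at 2.0 m, detector assumed 25 mm behind the adjustable element.
print(f"f = {required_focal_length(2.0, 0.025) * 1000:.2f} mm")  # ~24.69 mm
```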
  • Publication number: 20180290057
    Abstract: An Augmented Reality (AR)-based gaming application identifies a plurality of AR markers. Each AR marker identifies a corresponding geolocation and corresponding AR content. A gaming geographic area is formed based on a profile of a user of the AR-based gaming application. The geolocations of the plurality of AR markers are mapped to the gaming geographic area. A playspace of the user of the AR-based gaming application is dynamically scaled to the gaming geographic area.
    Type: Application
    Filed: April 10, 2017
    Publication date: October 11, 2018
    Inventor: Brian Mullins
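The mapping and scaling step in publication 20180290057 above, which fits marker geolocations into a bounded playspace, can be pictured as a rescaling of coordinates. The linear latitude/longitude rescaling below (no map projection) and all names are assumptions for illustration.

```python
# Hypothetical sketch of scaling AR-marker geolocations into a playspace
# (publication 20180290057). Plain linear rescaling; a real system would project coordinates.

def scale_to_playspace(markers, playspace_w=10.0, playspace_h=10.0):
    """Map (lat, lon) marker positions into a playspace of the given size in metres."""
    lats = [m[0] for m in markers]
    lons = [m[1] for m in markers]
    lat_span = max(lats) - min(lats) or 1e-9
    lon_span = max(lons) - min(lons) or 1e-9
    return [
        ((lon - min(lons)) / lon_span * playspace_w,
         (lat - min(lats)) / lat_span * playspace_h)
        for lat, lon in markers
    ]

markers = [(34.0522, -118.2437), (34.0530, -118.2410), (34.0515, -118.2425)]
print(scale_to_playspace(markers))
```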
  • Patent number: 10089791
    Abstract: A predictive augmented reality assistance system is described. A device generates and renders augmented reality content in a display of the device. The device tracks interactions of a user of the device with the augmented reality content. A context of the user interactions with the augmented reality content is determined based on the user interactions. A behavioral analysis of the user of the device is generated based on the context of the user interactions. A predictive model of the user of the device is generated based on the behavioral analysis. The augmented reality content is modified based on the predictive model of the user of the device.
    Type: Grant
    Filed: December 29, 2016
    Date of Patent: October 2, 2018
    Assignee: DAQRI, LLC
    Inventor: Brian Mullins
  • Publication number: 20180268607
    Abstract: A system and method for data manipulation based on real world object manipulation is described. A device captures an image of a physical object. The image is communicated via a network to a remote server. The remote server includes virtual object data associated with the image and a communication notification for a user of the device. The device receives the virtual object data and displays a virtual image in a virtual landscape using the virtual object data. In response to relative movement between the device and the physical object caused by the user, the virtual image is modified.
    Type: Application
    Filed: March 15, 2017
    Publication date: September 20, 2018
    Inventor: Brian Mullins
  • Patent number: 10078839
    Abstract: This disclosure describes, in part, techniques for storing transaction data at a central service, and techniques for querying information associated with the data from the central service when authorizing payment instruments for transactions. For instance, a central service may receive historical transaction data from multiple payment services that authorize payment instruments for merchants, and then store the historical transaction data in one or more databases. A payment service may then receive a request to authorize a payment instrument for a transaction between a merchant and a customer. Based on receiving the request, the payment service can send the central service a message that includes a query for information associated with historical transaction data corresponding to the payment instrument. In response, the payment service can receive the information from the central service and authorize the payment instrument using the information.
    Type: Grant
    Filed: August 30, 2017
    Date of Patent: September 18, 2018
    Assignee: Square, Inc.
    Inventors: Brian Mullins, Fredrick Lee, Timothy Yip
  • Publication number: 20180261012
    Abstract: A system and method for offloading object detection are described. A server receives first sensor data from a first sensor of an augmented reality (AR) display device. The first sensor data indicates a pose of the AR display device relative to a first reference coordinate system. The server detects a physical object using second sensor data received from a second sensor of the AR display device. The server determines, based on the second sensor data, a pose of the physical object relative to the AR display device. The server then determines the pose of the physical object relative to the first reference coordinate system based on the pose of the physical object relative to the AR display device and the pose of the AR display device relative to the first reference coordinate system.
    Type: Application
    Filed: May 9, 2018
    Publication date: September 13, 2018
    Inventors: Brian Mullins, Daniel Wolf, Jakob Zillner, Branislav Micusik, William Hoff
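The pose composition in publication 20180261012 above is the usual chain of rigid transforms: the object's pose in the reference frame equals the device's pose in that frame composed with the object's pose relative to the device. A minimal numpy sketch, with illustrative example poses rather than real sensor data:

```python
# Pose composition with 4x4 homogeneous transforms (publication 20180261012).
# numpy and the example rotations/translations are illustrative choices.
import numpy as np

def pose(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

# Pose of the AR display device in the reference coordinate system (from the first sensor).
T_ref_device = pose(np.eye(3), np.array([1.0, 0.0, 0.5]))
# Pose of the physical object relative to the device (from the second sensor).
T_device_object = pose(np.eye(3), np.array([0.0, 2.0, 0.0]))

# Compose: pose of the physical object in the reference coordinate system.
T_ref_object = T_ref_device @ T_device_object
print(T_ref_object[:3, 3])  # -> [1.  2.  0.5]
```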
  • Publication number: 20180253900
    Abstract: A device has an optical sensor, an inertial sensor, and a hardware processor. The optical sensor generates image data. The inertial sensor generates inertia data. The hardware processor receives an augmented reality (AR) authoring template authored at a client device, generates media content using the image data, and receives a selection of spatial coordinates within a three-dimensional region using the inertia data and the image data. The three-dimensional region is identified in the AR authoring template. The hardware processor further identifies an entry in the AR authoring template corresponding to the media content, places the media content at the selected spatial coordinates in the entry in the AR authoring template, and forms AR content using the media content at the selected spatial coordinates placed in the entry in the AR authoring template.
    Type: Application
    Filed: March 2, 2017
    Publication date: September 6, 2018
    Inventors: Samuel Finding, Brian Mullins, Noopur Gupta, Anthony L. Reyes, Neil Aalto
  • Publication number: 20180190020
    Abstract: A predictive augmented reality assistance system is described. A device generates and renders augmented reality content in a display of the device. The device tracks interactions of a user of the device with the augmented reality content. A context of the user interactions with the augmented reality content is determined based on the user interactions. A behavioral analysis of the user of the device is generated based on the context of the user interactions. A predictive model of the user of the device is generated based on the behavioral analysis. The augmented reality content is modified based on the predictive model of the user of the device.
    Type: Application
    Filed: December 29, 2016
    Publication date: July 5, 2018
    Inventor: Brian Mullins
  • Publication number: 20180188684
    Abstract: A printing device includes a dynamic holography printing application configured to generate a laser control signal and an LCOS-SLM (Liquid Crystal on Silicon Spatial Light Modulator) control signal based on two-dimensional content corresponding to a lithography mask. A laser source generates a plurality of incident laser beams based on the laser control signal. An LCOS-SLM modulates the plurality of incident laser beams based on the LCOS-SLM control signal and generates a plurality of holographic wavefronts, each holographic wavefront forming at least one corresponding focal point. The LCOS-SLM generates a plurality of distinct focused light field regions at interference points of focal points of the plurality of holographic wavefronts. The plurality of distinct focused light field regions correspond to the two-dimensional content.
    Type: Application
    Filed: December 22, 2016
    Publication date: July 5, 2018
    Applicant: Dualitas Ltd.
    Inventor: Brian Mullins
  • Publication number: 20180190017
    Abstract: An augmented reality (AR) display application generates mapped visualization content overlaid on a real world physical environment. The AR display application receives sensor feeds, location information, and orientation information from wearable devices within the environment. A tessellation surface is visually mapped to surfaces of the environment based on a depth-based point cloud. A texture is applied to the tessellation surface and the tessellation may be viewed overlaying the surfaces of the environment via a wearable device.
    Type: Application
    Filed: January 3, 2018
    Publication date: July 5, 2018
    Inventors: Erick Mendez, Dominik Schnitzer, Bernhard Jung, Clemens Birklbauer, Kai Zhou, Kiyoung Kim, Daniel Wagner, Roy Lawrence Ashok Inigo, Frank Chester Irving, Jr., Brian Mullins, Lucas Kazansky, Jonathan Trevor Freeman
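Publication 20180190017 above maps a tessellation surface to environment surfaces from a depth-based point cloud. A toy version of that step, which bins points into a grid and emits two triangles per occupied cell so a texture could then be applied, is sketched below; the gridding approach and cell size are assumptions, not the patent's method.

```python
# Toy point-cloud-to-tessellation step for publication 20180190017; names and
# the uniform-grid strategy are illustrative assumptions.
import numpy as np

def height_grid(points: np.ndarray, cell: float = 0.25) -> dict:
    """Average z of depth-cloud points falling into each (x, y) grid cell."""
    ix = np.floor(points[:, 0] / cell).astype(int)
    iy = np.floor(points[:, 1] / cell).astype(int)
    cells = {}
    for cx, cy, z in zip(ix, iy, points[:, 2]):
        cells.setdefault((int(cx), int(cy)), []).append(float(z))
    return {key: sum(zs) / len(zs) for key, zs in cells.items()}

def tessellate(cells: dict, cell: float = 0.25) -> list:
    """Emit two triangles per occupied cell; a texture can then be applied."""
    tris = []
    for (cx, cy), z in cells.items():
        x, y = cx * cell, cy * cell
        a, b = (x, y, z), (x + cell, y, z)
        c, d = (x, y + cell, z), (x + cell, y + cell, z)
        tris += [(a, b, c), (b, d, c)]
    return tris

cloud = np.array([[0.1, 0.1, 0.0], [0.2, 0.15, 0.02], [0.6, 0.1, 0.5]])
print(len(tessellate(height_grid(cloud))), "triangles")
```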
  • Patent number: 9996155
    Abstract: A system and method for manipulating a virtual object based on thought is described. A display of a device displays a first virtual object based on an identification of a physical object detected by the device. The device monitors brain activity data of a user of the device. The brain activity data is generated in response to the first virtual object being displayed in the display. The device identifies a second virtual object based on the brain activity data of the user of the device and displays the second virtual object in the display.
    Type: Grant
    Filed: May 2, 2016
    Date of Patent: June 12, 2018
    Assignee: DAQRI, LLC
    Inventor: Brian Mullins
  • Patent number: 9996983
    Abstract: A system and method for manipulating a virtual object based on intent is described. A reference identifier from a physical object is captured by a computing device. The reference identifier is communicated via a network to a remote server. The remote server includes virtual object data associated with the reference identifier. The virtual object data is received at the computing device. A virtual image is displayed in a virtual landscape using the virtual object data. In response to relative movement between the computing device and the physical object caused by a user, the virtual image is modified. Brain activity data of the user is received. A state of the virtual object in the virtual landscape is changed based on the brain activity data.
    Type: Grant
    Filed: June 2, 2016
    Date of Patent: June 12, 2018
    Assignee: DAQRI, LLC
    Inventor: Brian Mullins
  • Patent number: D820318
    Type: Grant
    Filed: April 12, 2017
    Date of Patent: June 12, 2018
    Assignee: DAQRI, LLC
    Inventors: Brian Mullins, Lucas Kazansky, Christopher Michaels Garcia, Eduardo Salazar, Daniel Smitasin, Ryan Ries, Arash Kalantari, Frank Chester Irving, Jr., Michael French, Chin-Lin Huang
  • Patent number: D820319
    Type: Grant
    Filed: April 21, 2017
    Date of Patent: June 12, 2018
    Assignee: DAQRI, LLC
    Inventors: Brian Mullins, Christopher Broaddus, Muzaffer Kal, Wenyi Zhao, Ali M. Tatari, Saud Akram, Pip Tompkin, Denny Liao, Madison Smith, Samuel McClellan