Patents by Inventor Michael John Ebstyne

Michael John Ebstyne has filed for patents to protect the following inventions. This listing includes pending patent applications as well as patents that have been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20150317833
    Abstract: An augmented reality device includes a plurality of sensors configured to output pose information indicating a pose of the device. The augmented reality device further includes a band-agnostic filter and a band-specific filter. The band-specific filter includes an error correction algorithm configured to receive pose information as filtered by the band-agnostic filter and to reduce a tracking error of the pose information in a selected frequency band. The augmented reality device further includes a display engine configured to position a virtual object on a see-through display as a function of the pose information as filtered by the band-agnostic and band-specific filters. (See the code sketch following this entry.)
    Type: Application
    Filed: May 1, 2014
    Publication date: November 5, 2015
    Inventors: Michael John Ebstyne, Frederik Schaffalitzky, Drew Steedly, Calvin Chan, Ethan Eade, Alex Kipman, Georg Klein
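
To make the two-stage filtering concrete, here is a minimal Python sketch. It assumes a 1-D pose signal, uses exponential smoothing as a stand-in for the band-agnostic filter, and uses FFT-based band suppression as a stand-in for the band-specific error correction; the patent does not specify the actual algorithms, the sample rate `fs`, or the band edges used here.

```python
import numpy as np

def band_agnostic_filter(pose_samples, alpha=0.5):
    """Exponential smoothing applied across all frequencies (a stand-in
    for the band-agnostic stage; the actual filter is not specified)."""
    out = np.empty_like(pose_samples)
    acc = pose_samples[0]
    for i, p in enumerate(pose_samples):
        acc = alpha * p + (1 - alpha) * acc
        out[i] = acc
    return out

def band_specific_correction(pose_samples, fs, band=(5.0, 15.0)):
    """Suppress tracking error inside one selected frequency band by
    zeroing that band's FFT coefficients (a crude stand-in for the
    patent's error correction algorithm)."""
    spectrum = np.fft.rfft(pose_samples)
    freqs = np.fft.rfftfreq(len(pose_samples), d=1.0 / fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    spectrum[mask] = 0.0
    return np.fft.irfft(spectrum, n=len(pose_samples))

# Cascade order mirrors the abstract: the band-specific stage consumes
# pose information already filtered by the band-agnostic stage.
fs = 120.0  # hypothetical sensor sample rate in Hz
t = np.arange(0, 1, 1 / fs)
raw_pose = np.sin(2 * np.pi * 1.0 * t) + 0.1 * np.sin(2 * np.pi * 10.0 * t)
filtered = band_specific_correction(band_agnostic_filter(raw_pose), fs)
```
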
  • Publication number: 20150316767
    Abstract: Embodiments related to mapping an environment of a machine-vision system are disclosed. For example, one disclosed method includes acquiring image data resolving one or more reference features of an environment and computing a parameter value based on the image data, wherein the parameter value is responsive to physical deformation of the machine-vision system. (See the code sketch following this entry.)
    Type: Application
    Filed: May 1, 2014
    Publication date: November 5, 2015
    Inventors: Michael John Ebstyne, Frederik Schaffalitzky, Drew Steedly, Georg Klein, Ethan Eade, Michael Grabner
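
As one illustration of computing a deformation-responsive parameter, the sketch below compares the pairwise geometry of reference features recovered from image data against their calibrated locations. The feature coordinates and the use of mean pairwise-distance deviation as the parameter are assumptions for illustration; the patent does not fix a particular formula.

```python
import numpy as np

def deformation_parameter(observed_points, calibrated_points):
    """Estimate a scalar deformation value by comparing the pairwise
    geometry of observed reference features against the factory
    calibration. Rigid hardware yields a value near zero; bending or
    flexing of the device shifts the observed geometry."""
    obs = np.asarray(observed_points, dtype=float)
    cal = np.asarray(calibrated_points, dtype=float)
    obs_d = np.linalg.norm(obs[:, None] - obs[None, :], axis=-1)
    cal_d = np.linalg.norm(cal[:, None] - cal[None, :], axis=-1)
    return float(np.abs(obs_d - cal_d).mean())

# Hypothetical 2-D feature locations recovered from image data.
calibrated = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
observed = [(0.0, 0.0), (1.02, 0.01), (0.0, 0.98)]
print(deformation_parameter(observed, calibrated))
```
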
  • Publication number: 20150317832
    Abstract: Embodiments are disclosed that relate to communicating, to a user of a head-mounted display device, an estimated quality level of a world-lock display mode. For example, in one disclosed embodiment sensor data is received from one or more sensors of the device. Using the sensor data, an estimated pose of the device is determined. Using the estimated pose, one or more virtual objects are displayed via the device in either the world-lock display mode or a body-lock display mode. Input uncertainty values of the sensor data and/or pose uncertainty values of the estimated pose are determined. The input uncertainty values and/or pose uncertainty values are mapped to the estimated quality level of the world-lock display mode. Feedback of the estimated quality level is communicated to the user via the device. (See the code sketch following this entry.)
    Type: Application
    Filed: May 1, 2014
    Publication date: November 5, 2015
    Inventors: Michael John Ebstyne, Frederik Schaffalitzky, Drew Steedly, Ethan Eade, Martin Shetter, Michael Grabner
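
A minimal sketch of the uncertainty-to-quality mapping described above. The thresholds, the high/medium/low levels, and the policy of combining the two uncertainties with `max` are illustrative assumptions; the abstract states only that uncertainty values are mapped to an estimated quality level.

```python
def estimate_quality_level(input_uncertainty, pose_uncertainty):
    """Map uncertainty values to a coarse quality level for the
    world-lock display mode (thresholds are illustrative only)."""
    worst = max(input_uncertainty, pose_uncertainty)
    if worst < 0.1:
        return "high"
    if worst < 0.5:
        return "medium"
    return "low"

def choose_display_mode(quality_level):
    """Fall back to body-lock rendering when world-lock tracking
    quality is too low to anchor holograms convincingly."""
    return "world-lock" if quality_level != "low" else "body-lock"

quality = estimate_quality_level(input_uncertainty=0.08, pose_uncertainty=0.3)
print(quality, choose_display_mode(quality))  # medium world-lock
```
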
  • Publication number: 20150317831
    Abstract: Various embodiments relating to controlling a see-through display are disclosed. In one embodiment, virtual objects may be displayed on the see-through display. The virtual objects transition between a body-locked position and a world-locked position based on various transition events. (See the code sketch following this entry.)
    Type: Application
    Filed: May 1, 2014
    Publication date: November 5, 2015
    Inventors: Michael John Ebstyne, Frederik Schaffalitzky, Stephen Latta, Paul Albert Lalonde, Drew Steedly, Alex Kipman, Ethan Eade
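
The body-lock/world-lock switching could be modeled as a small state machine, as in this sketch. The specific transition events (`tracking_acquired`, `tracking_lost`, `user_walked_away`) are hypothetical; the abstract deliberately leaves the event set open.

```python
from enum import Enum

class LockMode(Enum):
    BODY_LOCKED = "body-locked"
    WORLD_LOCKED = "world-locked"

# Hypothetical transition events mapped to the resulting lock mode.
TRANSITIONS = {
    (LockMode.BODY_LOCKED, "tracking_acquired"): LockMode.WORLD_LOCKED,
    (LockMode.WORLD_LOCKED, "tracking_lost"): LockMode.BODY_LOCKED,
    (LockMode.WORLD_LOCKED, "user_walked_away"): LockMode.BODY_LOCKED,
}

def next_mode(current, event):
    """Return the lock mode after an event; unknown events keep the mode."""
    return TRANSITIONS.get((current, event), current)

mode = LockMode.BODY_LOCKED
mode = next_mode(mode, "tracking_acquired")   # -> WORLD_LOCKED
mode = next_mode(mode, "tracking_lost")       # -> BODY_LOCKED
```
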
  • Publication number: 20150228118
    Abstract: Embodiments are disclosed that relate to determining a pose of a device. One disclosed embodiment provides a method comprising receiving sensor information from one or more sensors of the device, and selecting a motion-family model from a plurality of different motion-family models based on the sensor information. The method further comprises providing the sensor information to the selected motion-family model and outputting an estimated pose of the device according to the selected motion-family model. (See the code sketch following this entry.)
    Type: Application
    Filed: February 12, 2014
    Publication date: August 13, 2015
    Inventors: Ethan Eade, Michael John Ebstyne, Frederik Schaffalitzky, Drew Steedly
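
A toy sketch of motion-family model selection: the variance of recent sensor samples picks between a "stationary" and a "walking" model, each of which produces a pose estimate differently. Both models and the variance threshold are invented for illustration; the patent does not enumerate the motion families or the selection criterion.

```python
# Hypothetical motion-family models: each maps raw sensor readings to
# a pose estimate under different assumed dynamics.
def stationary_model(samples):
    return sum(samples) / len(samples)        # average out jitter

def walking_model(samples):
    return samples[-1]                        # trust the latest reading

MODELS = {"stationary": stationary_model, "walking": walking_model}

def select_motion_family(samples):
    """Pick a model from the sensor data itself: high variance in the
    readings suggests locomotion, low variance suggests the device is
    at rest (the threshold is illustrative only)."""
    mean = sum(samples) / len(samples)
    var = sum((s - mean) ** 2 for s in samples) / len(samples)
    return "walking" if var > 0.01 else "stationary"

samples = [0.0, 0.001, -0.002, 0.001]         # near-still accelerometer trace
family = select_motion_family(samples)
pose = MODELS[family](samples)
```
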
  • Publication number: 20140325457
    Abstract: Embodiments relate to gesture-based searching of a line pattern representation amongst a collection of line pattern representations. Upon detecting an input gesture, a computing system matches the input gesture against each of multiple line pattern representations. Each line pattern representation represents a line pattern having a changing value in a first dimension as a function of a value in a second dimension. At least some of the matched set may then be visualized to the user. The input gesture may be a literal line pattern to match against, or might be a gesture that has semantic meaning describing search parameters of a line pattern to search for. The matched set may be presented so that a display parameter conveys the closeness of each match. (See the code sketch following this entry.)
    Type: Application
    Filed: April 24, 2013
    Publication date: October 30, 2014
    Applicant: Microsoft Corporation
    Inventors: Adam Smolinski, Michael John Ebstyne
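
One plausible reading of the matching step, sketched below: resample the input gesture and each stored line pattern to a common length, then rank stored patterns by Euclidean distance, which could in turn drive a display parameter conveying closeness. Resampling and Euclidean distance are assumptions; the patent does not name a specific matcher.

```python
import numpy as np

def resample(pattern, n=32):
    """Resample a line pattern (value as a function of position) to a
    fixed length so patterns of different lengths are comparable."""
    pattern = np.asarray(pattern, dtype=float)
    src = np.linspace(0, 1, len(pattern))
    dst = np.linspace(0, 1, n)
    return np.interp(dst, src, pattern)

def match_gesture(gesture, collection):
    """Rank stored line patterns by closeness to the input gesture;
    a smaller distance means a closer match."""
    g = resample(gesture)
    scored = [(np.linalg.norm(g - resample(p)), name)
              for name, p in collection.items()]
    return sorted(scored)

collection = {
    "rising": [0, 1, 2, 3, 4],
    "falling": [4, 3, 2, 1, 0],
    "spike": [0, 0, 5, 0, 0],
}
for dist, name in match_gesture([0, 1, 2, 3, 3.5], collection):
    print(f"{name}: distance {dist:.2f}")
```
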
  • Publication number: 20140320503
    Abstract: Embodiments relate to the encoding of a line pattern representation. The line pattern representation has a changing value in a first dimension as a function of a value in a second dimension. The line pattern representation is segmented into multiple segments along the second dimension. The line pattern representation is then encoded by assigning a quantized value to each of the segments based on the changing value of the line pattern in the first dimension as present within the corresponding segment. If the line pattern generally falls within a given range within a segment, the segment is assigned a quantized value corresponding to that range. The encoding may be used to assign the line pattern representation to a category. (See the code sketch following this entry.)
    Type: Application
    Filed: April 24, 2013
    Publication date: October 30, 2014
    Applicant: Microsoft Corporation
    Inventors: Adam Smolinski, Michael John Ebstyne
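
The segmentation-and-quantization scheme is concrete enough to sketch directly. Assumptions here: each segment's average decides which range it falls in, the value range is split into equal-width levels, and the segment and level counts are illustrative parameters rather than values taken from the patent.

```python
def encode_line_pattern(values, num_segments=8, num_levels=4):
    """Split the pattern along the second dimension into segments and
    assign each segment a quantized level based on where its values
    fall in the first dimension."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0
    seg_len = max(1, len(values) // num_segments)
    code = []
    for i in range(0, len(values), seg_len):
        segment = values[i:i + seg_len]
        avg = sum(segment) / len(segment)
        level = min(int((avg - lo) / span * num_levels), num_levels - 1)
        code.append(level)
    return code

# A rough sine-like pattern encodes to a short symbol sequence that
# can key the pattern into a category bucket.
pattern = [0, 2, 4, 5, 4, 2, 0, -2, -4, -5, -4, -2, 0, 2, 4, 5]
print(encode_line_pattern(pattern))
```
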
  • Publication number: 20140325405
    Abstract: Embodiments relate to auto-completion of an input partial line pattern. Upon detecting that the user has input the partial line pattern, the scope of the input partial line pattern is matched against corresponding line patterns from a collection of line pattern representations to form a scoped match set of line pattern representations. For one or more of the line pattern representations in the scoped match set, a visualization of completion options is then provided. For example, the corresponding line pattern representation might be displayed in a distinct portion of the display as compared to the input partial line pattern, or in the same portion, in which case the remaining portion of the line pattern representation might extend off of the input partial line pattern representation. (See the code sketch following this entry.)
    Type: Application
    Filed: April 24, 2013
    Publication date: October 30, 2014
    Applicant: Microsoft Corporation
    Inventors: Adam Smolinski, Michael John Ebstyne
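
A minimal sketch of scoped prefix matching for auto-completion: only the span covered by the partial input is compared against each stored pattern's leading portion, and the remainders of the best matches are returned as completion options. The distance measure and the `top_k` cutoff are assumptions for illustration.

```python
import numpy as np

def complete_partial_pattern(partial, collection, top_k=2):
    """Match the scope of the partial input against the leading portion
    of each stored pattern, then return the stored patterns whose
    prefixes fit best, along with their remaining (completion) spans."""
    p = np.asarray(partial, dtype=float)
    scored = []
    for name, stored in collection.items():
        stored = np.asarray(stored, dtype=float)
        if len(stored) < len(p):
            continue  # too short to complete the input
        prefix_dist = float(np.linalg.norm(stored[: len(p)] - p))
        scored.append((prefix_dist, name, stored[len(p):]))
    scored.sort(key=lambda item: item[:2])
    return [(name, remainder) for _, name, remainder in scored[:top_k]]

collection = {
    "ramp_up": [0, 1, 2, 3, 4, 5, 6],
    "plateau": [0, 1, 2, 2, 2, 2, 2],
    "decay": [5, 4, 3, 2, 1, 0, 0],
}
for name, remainder in complete_partial_pattern([0, 1, 2], collection):
    print(name, "continues with", list(remainder))
```
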