Patents by Inventor Tobias Kinnebrew

Tobias Kinnebrew has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20140368534
    Abstract: A see-through head mounted display apparatus includes code that performs a method of choosing an optimal viewing location and perspective for shared-view virtual objects rendered for multiple users in a common environment. Multiple objects and multiple users are taken into account in determining the optimal, common viewing location. The technology allows each user to have a common view of the relative position of the object in the environment.
    Type: Application
    Filed: June 18, 2013
    Publication date: December 18, 2014
    Inventors: Tom G. Salter, Ben J. Sugden, Daniel Deptford, Robert L. Crocco, Jr., Brian E. Keane, Laura K. Massey, Alex Aben-Athar Kipman, Peter Tobias Kinnebrew, Nicholas Ferianc Kamuda
  • Publication number: 20140372957
    Abstract: A head mounted display allows user selection of a virtual object through multi-step focusing by the user. Focus on the selectable object is determined, and a validation object is then displayed. When user focus moves to the validation object, a timeout determines that a selection of the validation object, and thus of the selectable object, has occurred. The technology can be used in see-through head mounted displays to allow a user to effectively navigate an environment containing a multitude of virtual objects without unintended selections.
    Type: Application
    Filed: June 18, 2013
    Publication date: December 18, 2014
    Inventors: Brian E. Keane, Ben J. Sugden, Robert L. Crocco, Jr., Daniel Deptford, Tom G. Salter, Laura K. Massey, Alex Aben-Athar Kipman, Peter Tobias Kinnebrew, Nicholas Ferianc Kamuda
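The two-step focus-then-dwell flow in the abstract above can be sketched as a small state machine. This is an illustrative reading, not the patented implementation: the class name, the `("validation", candidate)` encoding of the validation object, and the 1.0 s timeout are all assumptions.

```python
class DwellSelector:
    """Sketch of multi-step selection: focusing a selectable object makes it
    the candidate (conceptually causing a validation object to be displayed);
    dwelling on the validation object past a timeout confirms the selection."""

    def __init__(self, timeout=1.0):
        self.timeout = timeout       # seconds of dwell required to confirm
        self.candidate = None        # object the user first focused on
        self.dwell_start = None      # when focus moved to the validation object

    def update(self, gaze_target, now):
        """Feed the object currently under the user's gaze and the current
        time; returns the selected object once the dwell completes, else None."""
        if self.candidate is None:
            if gaze_target is not None:
                # Step 1: user focuses a selectable object.
                self.candidate = gaze_target
            return None
        if gaze_target == ("validation", self.candidate):
            # Step 2: focus is on the validation object; run the dwell timer.
            if self.dwell_start is None:
                self.dwell_start = now
            elif now - self.dwell_start >= self.timeout:
                selected = self.candidate
                self.candidate = self.dwell_start = None
                return selected
        else:
            # Looking away from the validation object resets the dwell,
            # which is what prevents unintended selections.
            self.dwell_start = None
        return None
```

Requiring the deliberate second fixation is what distinguishes this from plain dwell selection on the object itself.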
  • Publication number: 20140368535
    Abstract: A system and method are disclosed for displaying virtual objects in a mixed reality environment in a way that is optimal and most comfortable for a user to interact with the virtual objects. When a user is not focused on the virtual object, which may be a heads-up display, or HUD, the HUD may remain body locked to the user. As such, the user may explore and interact with a mixed reality environment presented by the head mounted display device without interference from the HUD. When a user wishes to view and/or interact with the HUD, the user may look at the HUD. At this point, the HUD may change from a body locked virtual object to a world locked virtual object. The user is then able to view and interact with the HUD from different positions and perspectives.
    Type: Application
    Filed: June 18, 2013
    Publication date: December 18, 2014
    Inventors: Tom G. Salter, Ben J. Sugden, Daniel Deptford, Robert L. Crocco, Jr., Brian E. Keane, Laura K. Massey, Alex Aben-Athar Kipman, Peter Tobias Kinnebrew, Nicholas Ferianc Kamuda
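The body-locked/world-locked transition described above can be sketched as follows. This is a minimal illustration of the idea, not the patent's method; the class, the 2D position tuples, and the fixed body-relative offset are assumptions.

```python
class HUD:
    """Sketch: a HUD that follows the user's body until looked at, then
    pins itself to its current world position while the user views it."""

    def __init__(self, offset):
        self.offset = offset          # HUD position relative to the user's body
        self.mode = "body_locked"
        self.world_position = None

    def update(self, user_position, user_focused_on_hud):
        """Return the HUD's current world-space position for this frame."""
        if user_focused_on_hud and self.mode == "body_locked":
            # User looked at the HUD: freeze it where it currently sits
            # in the world so it can be inspected from other viewpoints.
            self.mode = "world_locked"
            self.world_position = tuple(
                u + o for u, o in zip(user_position, self.offset))
        elif not user_focused_on_hud and self.mode == "world_locked":
            # Focus left the HUD: resume following the body.
            self.mode = "body_locked"
            self.world_position = None
        if self.mode == "body_locked":
            return tuple(u + o for u, o in zip(user_position, self.offset))
        return self.world_position
```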
  • Publication number: 20140368532
    Abstract: A method and apparatus for the creation of a perspective-locked virtual object in world space. The virtual object may be consumed by another user with a consumption device at a location, position, and orientation which is the same as, or proximate to, the location, position, and orientation where the virtual object is created.
    Type: Application
    Filed: June 18, 2013
    Publication date: December 18, 2014
    Inventors: Brian E. Keane, Ben J. Sugden, Robert L. Crocco, Jr., Daniel Deptford, Tom G. Salter, Laura K. Massey, Alex Aben-Athar Kipman, Peter Tobias Kinnebrew, Nicholas Ferianc Kamuda
  • Publication number: 20140368537
    Abstract: A system and method are disclosed for displaying virtual objects in a mixed reality environment including shared virtual objects and private virtual objects. Multiple users can collaborate together in interacting with the shared virtual objects. A private virtual object may be visible to a single user. In examples, private virtual objects of respective users may facilitate the users' collaborative interaction with one or more shared virtual objects.
    Type: Application
    Filed: June 18, 2013
    Publication date: December 18, 2014
    Inventors: Tom G. Salter, Ben J. Sugden, Daniel Deptford, Robert L. Crocco, Jr., Brian E. Keane, Laura K. Massey, Alex Aben-Athar Kipman, Peter Tobias Kinnebrew, Nicholas Ferianc Kamuda
  • Publication number: 20140347390
    Abstract: Embodiments are disclosed that relate to placing virtual objects in an augmented reality environment. For example, one disclosed embodiment provides a method comprising receiving sensor data comprising one or more of motion data, location data, and orientation data from one or more sensors located on a head-mounted display device, and based upon the motion data, determining a body-locking direction vector that is based upon an estimated direction in which a body of a user is facing. The method further comprises positioning a displayed virtual object based on the body-locking direction vector.
    Type: Application
    Filed: May 22, 2013
    Publication date: November 27, 2014
    Inventors: Adam G. Poulos, Tony Ambrus, Jeffrey Cole, Ian Douglas McIntyre, Stephen Latta, Peter Tobias Kinnebrew, Nicholas Kamuda, Robert Pengelly, Jeffrey C. Fong, Aaron Woo, Udiyan I. Padmanahan, Andrew Wyman MacDonald, Olivia M. Janik
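One way to picture the body-locking direction vector above: estimate the body's facing as a smoothed average of head-yaw samples, so a brief glance sideways does not drag body-locked content with it. The exponential smoothing, unit-vector averaging, and function names below are illustrative assumptions, not the claimed method.

```python
import math

def estimate_body_direction(head_yaws, smoothing=0.1):
    """Sketch: estimate the direction the body is facing as an exponentially
    smoothed average of head yaw samples (radians). Averaging unit vectors
    rather than raw angles keeps wrap-around (e.g. -pi vs pi) correct."""
    bx, by = math.cos(head_yaws[0]), math.sin(head_yaws[0])
    for yaw in head_yaws[1:]:
        hx, hy = math.cos(yaw), math.sin(yaw)
        bx = (1 - smoothing) * bx + smoothing * hx
        by = (1 - smoothing) * by + smoothing * hy
    norm = math.hypot(bx, by)
    return (bx / norm, by / norm)   # unit body-locking direction vector

def position_object(user_pos, body_dir, distance=2.0):
    """Place a displayed virtual object `distance` meters along the
    body-locking direction vector, as the final step of the abstract."""
    return (user_pos[0] + distance * body_dir[0],
            user_pos[1] + distance * body_dir[1])
```

Because the smoothing factor is small, a short head turn barely moves the estimate, while a sustained turn eventually reorients the body-locked content.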
  • Publication number: 20140347391
    Abstract: A system and method are disclosed for displaying virtual objects in a mixed reality environment in a way that is optimal and most comfortable for a user to interact with the virtual objects. When a user is moving through the mixed reality environment, the virtual objects may remain world-locked, so that the user can move around and explore the virtual objects from different perspectives. When the user is motionless in the mixed reality environment, the virtual objects may rotate to face the user so that the user can easily view and interact with the virtual objects.
    Type: Application
    Filed: May 23, 2013
    Publication date: November 27, 2014
    Inventors: Brian E. Keane, Ben J. Sugden, Robert L. Crocco, Jr., Daniel Deptford, Tom G. Salter, Laura K. Massey, Alex Aben-Athar Kipman, Peter Tobias Kinnebrew, Nicholas Ferianc Kamuda
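The moving/motionless behavior above reduces to a conditional billboard rotation, sketched here under assumed names and a 2D layout; the patent itself does not specify this API.

```python
import math

def object_yaw(object_pos, current_yaw, user_pos, user_is_moving):
    """Sketch of the abstract's behavior: while the user moves, the object
    stays world-locked (orientation unchanged) so it can be explored from
    different perspectives; when the user is motionless, the object rotates
    to face the user for comfortable viewing and interaction."""
    if user_is_moving:
        return current_yaw                 # world-locked: keep orientation
    dx = user_pos[0] - object_pos[0]
    dy = user_pos[1] - object_pos[1]
    return math.atan2(dy, dx)              # rotate to face toward the user
```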
  • Publication number: 20140204077
    Abstract: Embodiments that relate to presenting a textured shared world model of a physical environment are disclosed. One embodiment includes receiving geo-located crowd-sourced structural data items of the physical environment. The structural data items are stitched together to generate a 3D spatial shared world model. Geo-located crowd-sourced texture data items are also received and include time-stamped images or video. User input of a temporal filter parameter is used to temporally filter the texture data items. The temporally-filtered texture data items are applied to surfaces of the 3D spatial shared world model to generate a textured shared world model of the physical environment. The textured shared world model is then provided for display by a display device.
    Type: Application
    Filed: January 22, 2013
    Publication date: July 24, 2014
    Inventors: Nicholas Kamuda, Peter Tobias Kinnebrew
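The temporal-filter step in the abstract above is essentially a timestamp window applied to crowd-sourced texture items before they are draped over the 3D model. A minimal sketch, with assumed field names and function signature:

```python
from datetime import datetime

def temporal_filter(texture_items, start, end):
    """Sketch: keep only geo-located texture data items (time-stamped images
    or video) whose timestamps fall inside the user-selected window, so the
    textured shared world model shows the environment as of that period."""
    return [item for item in texture_items if start <= item["timestamp"] <= end]
```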
  • Publication number: 20140204117
    Abstract: Embodiments that relate to selectively filtering geo-located data items in a mixed reality environment are disclosed. For example, in one disclosed embodiment a mixed reality filtering program receives a plurality of geo-located data items and selectively filters the data items based on one or more modes. The modes comprise one or more of a social mode, a popular mode, a recent mode, a work mode, a play mode, and a user interest mode. Such filtering yields a filtered collection of the geo-located data items. The filtered collection of data items is then provided to a mixed reality display program for display by a display device.
    Type: Application
    Filed: January 22, 2013
    Publication date: July 24, 2014
    Inventors: Peter Tobias Kinnebrew, Nicholas Kamuda
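The mode-based filtering above can be pictured as one predicate per mode applied over the incoming items. The mode names come from the abstract; the item fields, context dictionary, and thresholds are illustrative assumptions.

```python
from datetime import datetime, timedelta

# One predicate per filtering mode; each takes a geo-located data item
# and a context dict describing the current user/session.
MODES = {
    "social":  lambda item, ctx: item["author"] in ctx["friends"],
    "popular": lambda item, ctx: item["views"] >= ctx["popular_threshold"],
    "recent":  lambda item, ctx: ctx["now"] - item["created"] <= timedelta(days=7),
}

def filter_items(items, active_modes, ctx):
    """Sketch: keep only items matching every active mode's predicate,
    yielding the filtered collection handed to the display program."""
    return [item for item in items
            if all(MODES[mode](item, ctx) for mode in active_modes)]
```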
  • Publication number: 20140192084
    Abstract: A mixed reality accommodation system and related methods are provided. In one example, a head-mounted display device includes a plurality of sensors and a display system for presenting holographic objects. A mixed reality safety program is configured to receive a holographic object and associated content provider ID from a source. The program assigns a trust level to the object based on the content provider ID. If the trust level is less than a threshold, the object is displayed according to a first set of safety rules that provide a protective level of display restrictions. If the trust level is greater than or equal to the threshold, the object is displayed according to a second set of safety rules that provide a permissive level of display restrictions that are less than the protective level of display restrictions.
    Type: Application
    Filed: January 10, 2013
    Publication date: July 10, 2014
    Inventors: Stephen Latta, Peter Tobias Kinnebrew
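The threshold logic above maps a content provider's trust level to one of two rule sets. A minimal sketch; the trust scores, the 0.5 threshold, and the contents of the rule sets are assumptions for illustration, not values from the patent.

```python
# Trust assigned per content provider ID; unknown providers default to 0.0.
TRUST_LEVELS = {"first-party": 0.9, "unknown-vendor": 0.2}
TRUST_THRESHOLD = 0.5

# Protective rules restrict display; permissive rules relax those limits.
PROTECTIVE_RULES = {"max_size_m": 0.5, "may_occlude_real_objects": False}
PERMISSIVE_RULES = {"max_size_m": 5.0, "may_occlude_real_objects": True}

def safety_rules_for(content_provider_id):
    """Sketch: pick the rule set under which a holographic object from
    this provider is displayed, based on its assigned trust level."""
    trust = TRUST_LEVELS.get(content_provider_id, 0.0)
    if trust < TRUST_THRESHOLD:
        return PROTECTIVE_RULES     # below threshold: protective restrictions
    return PERMISSIVE_RULES         # at/above threshold: permissive display
```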
  • Publication number: 20140160001
    Abstract: Embodiments that relate to presenting a mixed reality environment via a mixed reality display device are disclosed. For example, one disclosed embodiment provides a method for presenting a mixed reality environment via a head-mounted display device. The method includes using head pose data to generally identify one or more gross selectable targets within a sub-region of a spatial region occupied by the mixed reality environment. The method further includes specifically identifying a fine selectable target from among the gross selectable targets based on eye-tracking data. Gesture data is then used to identify a gesture, and an operation associated with the identified gesture is performed on the fine selectable target.
    Type: Application
    Filed: December 6, 2012
    Publication date: June 12, 2014
    Inventors: Peter Tobias Kinnebrew, Alex Kipman
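The coarse-to-fine selection above (head pose narrows to a sub-region, eye tracking picks the exact target) can be sketched as a two-stage lookup. The target representation, region labels, and nearest-point tie-breaking are assumptions.

```python
def select_target(targets, head_region, gaze_point):
    """Sketch of two-stage selection: head pose data generally identifies
    the gross selectable targets in one sub-region; eye-tracking data then
    specifically identifies the fine target nearest the gaze point."""
    gross = [t for t in targets if t["region"] == head_region]
    if not gross:
        return None
    return min(gross, key=lambda t: (t["pos"][0] - gaze_point[0]) ** 2 +
                                    (t["pos"][1] - gaze_point[1]) ** 2)
```

A recognized gesture would then trigger the operation associated with the returned fine target.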
  • Publication number: 20140071163
    Abstract: A holographic object presentation system and related methods for presenting a holographic object having a selective information detail level are provided. In one example, a holographic object presentation program may receive user behavior information and physical environment information. Using one or more of the user behavior information and the physical environment information, the program may adjust the selective information detail level of the holographic object to an adjusted information detail level. The program may then provide the holographic object at the adjusted information detail level to an augmented reality display program for display on a display device.
    Type: Application
    Filed: September 11, 2012
    Publication date: March 13, 2014
    Inventors: Peter Tobias Kinnebrew, Nicholas Kamuda
  • Publication number: 20140049559
    Abstract: Systems and related methods for presenting a holographic object that self-adapts to a mixed reality environment are provided. In one example, a holographic object presentation program captures physical environment data from a destination physical environment and creates a model of the environment including physical objects having associated properties. The program identifies a holographic object for display on a display of a display device, the holographic object including one or more rules linking a detected environmental condition and/or properties of the physical objects with a display mode of the holographic object. The program applies the one or more rules to select the display mode for the holographic object based on the detected environmental condition and/or the properties of the physical objects.
    Type: Application
    Filed: August 17, 2012
    Publication date: February 20, 2014
    Inventors: Rod G. Fleck, Nicholas Kamuda, Stephen Latta, Peter Tobias Kinnebrew
  • Publication number: 20130342564
    Abstract: A display apparatus and method for creating and displaying configured virtual environments based on real world source environments. A mixed reality environment includes real and virtual objects, and a set of one or more virtual objects may be associated with a source environment and stored as a configured environment for later rendering by the display apparatus. Configured environments may be associated with users, environments or locations.
    Type: Application
    Filed: June 25, 2012
    Publication date: December 26, 2013
    Inventors: Peter Tobias Kinnebrew, Nicholas Ferianc Kamuda
  • Publication number: 20130342570
    Abstract: A see-through, near-eye, mixed reality display apparatus providing a mixed reality environment in which one or more virtual objects and one or more real objects exist within the view of the device. Each of the real and virtual objects has a commonly defined set of attributes understood by the mixed reality system, allowing the system to manage relationships and interactions between virtual objects and other virtual objects, and between virtual and real objects.
    Type: Application
    Filed: June 25, 2012
    Publication date: December 26, 2013
    Inventors: Peter Tobias Kinnebrew, Nicholas Ferianc Kamuda
  • Publication number: 20130342571
    Abstract: A see-through, near-eye, mixed reality display apparatus providing a mixed reality environment in which one or more virtual objects and one or more real objects exist within the view of the device. Each of the real and virtual objects has a commonly defined set of attributes understood by the mixed reality system, allowing the system to manage relationships and interactions between virtual objects and other virtual objects, and between virtual and real objects.
    Type: Application
    Filed: June 25, 2012
    Publication date: December 26, 2013
    Inventors: Peter Tobias Kinnebrew, Nicholas Ferianc Kamuda
  • Publication number: 20130194304
    Abstract: A method for presenting real and virtual images correctly positioned with respect to each other. The method includes, in a first field of view, receiving a first real image of an object and displaying a first virtual image. The method also includes, in a second field of view oriented independently relative to the first field of view, receiving a second real image of the object and displaying a second virtual image, the first and second virtual images positioned coincidently within a coordinate system.
    Type: Application
    Filed: February 1, 2012
    Publication date: August 1, 2013
    Inventors: Stephen Latta, Darren Bennett, Peter Tobias Kinnebrew, Kevin Geisner, Brian Mount, Arthur Tomlin, Mike Scavezze, Daniel McCulloch, David Nister, Drew Steedly, Jeffrey Alan Kohler, Ben Sugden, Sebastian Sylvan
  • Publication number: 20110319148
    Abstract: A system and method combining real-world actions and virtual actions in a gaming environment. In one aspect, a massively multiplayer environment combines the real-world actions and virtual actions of a participant to influence both character metrics and game play within one or more games provided by the service. In real-world or location-based events, game play occurs in and around links explicitly created between real-world locations and virtual representations of those locations within the game.
    Type: Application
    Filed: June 24, 2010
    Publication date: December 29, 2011
    Applicant: MICROSOFT CORPORATION
    Inventors: Peter Tobias Kinnebrew, Nicholas Ferianc Kamuda, Aron B. Kantor
  • Publication number: 20110314084
    Abstract: A system automatically and continuously finds and aggregates the most relevant and current information about the people and things that a user cares about. The information gathering is based on current context (e.g., where the user is, what the user is doing, what the user is saying/typing, etc.). The result of the context based information gathering is presented ubiquitously on user interfaces of any of the various physical devices operated by the user.
    Type: Application
    Filed: June 17, 2010
    Publication date: December 22, 2011
    Inventors: Cesare John Saretto, Peter Tobias Kinnebrew, Nicholas Ferianc Kamuda, Henry Hooper Somuah, Matthew John McCloskey, Douglas C. Hebenthal, Kathleen P. Mulcahy