Patents by Inventor Peter Tobias Kinnebrew

Peter Tobias Kinnebrew has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 9573064
    Abstract: A system and method combining real-world actions and virtual actions in a gaming environment. In one aspect, a massively multiplayer environment combines the real-world actions and virtual actions of a participant to influence both character metrics and game play within one or more games provided by the service. In real-world or location-based events, game play occurs in and around links explicitly created between real-world locations and virtual representations of those locations within the game.
    Type: Grant
    Filed: June 24, 2010
    Date of Patent: February 21, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Peter Tobias Kinnebrew, Nicholas Ferianc Kamuda, Aron B. Kantor
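    Sketch: purely illustrative Python (not from the patent) showing the general idea of crediting both real-world and in-game actions toward a single character metric; the Character fields, point values, and action types are invented for the example.
```python
from dataclasses import dataclass, field

@dataclass
class Character:
    experience: int = 0                  # hypothetical metric; the patent names none
    history: list = field(default_factory=list)

def apply_action(character: Character, action_type: str, points: int) -> None:
    """Credit an action toward the character, whether it happened in the
    real world (e.g. visiting a linked location) or inside a game."""
    assert action_type in ("real_world", "virtual")
    character.experience += points
    character.history.append((action_type, points))

hero = Character()
apply_action(hero, "real_world", 50)     # e.g. checked in at a linked real location
apply_action(hero, "virtual", 20)        # e.g. completed an in-game objective
print(hero.experience)                   # 70
```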
  • Patent number: 9501873
    Abstract: Embodiments are disclosed that relate to operating a user interface on an augmented reality computing device comprising a see-through display system. For example, one disclosed embodiment includes identifying one or more objects located outside a field of view of a user, and for each object of the one or more objects, providing to the user an indication of positional information associated with the object.
    Type: Grant
    Filed: July 22, 2015
    Date of Patent: November 22, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Thomas George Salter, Ben Sugden, Daniel Deptford, Robert Crocco, Jr., Brian Keane, Laura Massey, Alex Kipman, Peter Tobias Kinnebrew, Nicholas Kamuda, Zachary Quarles, Michael Scavezze, Ryan Hastings, Cameron Brown, Tony Ambrus, Jason Scott, John Bevis, Jamie B. Kirschenbaum, Nicholas Gervase Fajt, Michael Klucher, Relja Markovic, Stephen Latta, Daniel McCulloch
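    Sketch: a minimal, illustrative Python example of indicating where an out-of-view object lies relative to the user's gaze; the 2D simplification, the 60-degree field of view, and the function name are assumptions, not details taken from the patent.
```python
import math

def out_of_view_indicator(user_pos, gaze_dir, obj_pos, half_fov_deg=30.0):
    """Return None if the object is inside the (horizontal) field of view;
    otherwise return the signed angle in degrees from the gaze direction to
    the object (positive = counter-clockwise), which a UI could turn into an
    edge-of-display arrow or similar positional cue."""
    to_obj = (obj_pos[0] - user_pos[0], obj_pos[1] - user_pos[1])
    gaze_angle = math.atan2(gaze_dir[1], gaze_dir[0])
    obj_angle = math.atan2(to_obj[1], to_obj[0])
    delta = math.degrees(obj_angle - gaze_angle)
    delta = (delta + 180.0) % 360.0 - 180.0      # wrap into (-180, 180]
    return None if abs(delta) <= half_fov_deg else delta

# An object 90 degrees off the gaze direction falls outside a 60-degree FOV,
# so an indicator would be shown for it.
print(out_of_view_indicator((0, 0), (1, 0), (0, 1)))   # 90.0
```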
  • Patent number: 9429912
    Abstract: Systems and related methods for presenting a holographic object that self-adapts to a mixed reality environment are provided. In one example, a holographic object presentation program captures physical environment data from a destination physical environment and creates a model of the environment including physical objects having associated properties. The program identifies a holographic object for display on a display of a display device, the holographic object including one or more rules linking a detected environmental condition and/or properties of the physical objects with a display mode of the holographic object. The program applies the one or more rules to select the display mode for the holographic object based on the detected environmental condition and/or the properties of the physical objects.
    Type: Grant
    Filed: August 17, 2012
    Date of Patent: August 30, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Rod G. Fleck, Nicholas Kamuda, Stephen Latta, Peter Tobias Kinnebrew
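    Sketch: an illustrative Python rendering of the rule idea in the abstract (a detected condition linked to a display mode); the specific conditions, thresholds, and mode names are invented for the example.
```python
# Each rule pairs a predicate over the sensed environment with the display
# mode to use when that predicate matches; the first matching rule wins.
RULES = [
    (lambda env: env["ambient_light"] > 0.8, "high_contrast"),
    (lambda env: env["free_floor_area_m2"] < 1.0, "miniature"),
    (lambda env: True, "default"),               # fallback rule
]

def select_display_mode(environment: dict, rules=RULES) -> str:
    for condition, mode in rules:
        if condition(environment):
            return mode
    return "default"

print(select_display_mode({"ambient_light": 0.9, "free_floor_area_m2": 5.0}))
# -> "high_contrast"
```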
  • Patent number: 9412201
    Abstract: Embodiments that relate to selectively filtering geo-located data items in a mixed reality environment are disclosed. For example, in one disclosed embodiment a mixed reality filtering program receives a plurality of geo-located data items and selectively filters the data items based on one or more modes. The modes comprise one or more of a social mode, a popular mode, a recent mode, a work mode, a play mode, and a user interest mode. Such filtering yields a filtered collection of the geo-located data items. The filtered collection of data items is then provided to a mixed reality display program for display by a display device.
    Type: Grant
    Filed: January 22, 2013
    Date of Patent: August 9, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Peter Tobias Kinnebrew, Nicholas Kamuda
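    Sketch: a toy Python version of mode-based filtering of geo-located data items; only three of the modes named in the abstract are shown, and the thresholds and data fields are assumptions.
```python
import time
from dataclasses import dataclass

@dataclass
class GeoItem:
    label: str
    lat: float
    lon: float
    author: str
    timestamp: float
    popularity: int

def filter_items(items, mode, user_friends=(), now=None):
    """Return the filtered collection of geo-located items for one mode."""
    now = now if now is not None else time.time()
    if mode == "social":
        return [i for i in items if i.author in user_friends]
    if mode == "popular":
        return [i for i in items if i.popularity >= 100]
    if mode == "recent":
        return [i for i in items if now - i.timestamp < 24 * 3600]
    return list(items)                   # unrecognized mode: no filtering
```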
  • Patent number: 9367960
    Abstract: Embodiments are disclosed that relate to placing virtual objects in an augmented reality environment. For example, one disclosed embodiment provides a method comprising receiving sensor data comprising one or more of motion data, location data, and orientation data from one or more sensors located on a head-mounted display device, and based upon the motion data, determining a body-locking direction vector that is based upon an estimated direction in which a body of a user is facing. The method further comprises positioning a displayed virtual object based on the body-locking direction vector.
    Type: Grant
    Filed: May 22, 2013
    Date of Patent: June 14, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Adam G. Poulos, Tony Ambrus, Jeffrey Cole, Ian Douglas McIntyre, Stephen Latta, Peter Tobias Kinnebrew, Nicholas Kamuda, Robert Pengelly, Jeffrey C. Fong, Aaron Woo, Udiyan I. Padmanahan, Andrew Wyman MacDonald, Olivia M. Janik
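    Sketch: illustrative Python for estimating a body-facing direction from head-yaw samples and placing a virtual object along it; the exponential smoothing, smoothing factor, and 2D simplification are assumptions rather than the patented method.
```python
import math

class BodyLockEstimator:
    """Low-pass filter head-yaw samples so the estimate tracks the slower
    motion of the body rather than quick head turns."""
    def __init__(self, alpha=0.05):
        self.alpha = alpha               # small alpha -> slow, body-like response
        self.yaw = 0.0                   # estimated body yaw, radians

    def update(self, head_yaw: float) -> float:
        # Blend on the unit circle to avoid wrap-around problems near +/-pi.
        x = (1 - self.alpha) * math.cos(self.yaw) + self.alpha * math.cos(head_yaw)
        y = (1 - self.alpha) * math.sin(self.yaw) + self.alpha * math.sin(head_yaw)
        self.yaw = math.atan2(y, x)
        return self.yaw

def place_object(user_pos, body_yaw, distance=2.0):
    """Position a virtual object a fixed distance along the body-locking vector."""
    return (user_pos[0] + distance * math.cos(body_yaw),
            user_pos[1] + distance * math.sin(body_yaw))

est = BodyLockEstimator()
for head_yaw in (0.0, 0.1, 0.2, 1.5):    # the head glances away; the estimate lags
    body_yaw = est.update(head_yaw)
print(place_object((0.0, 0.0), body_yaw))
```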
  • Patent number: 9342929
    Abstract: Embodiments that relate to presenting a textured shared world model of a physical environment are disclosed. One embodiment includes receiving geo-located crowd-sourced structural data items of the physical environment. The structural data items are stitched together to generate a 3D spatial shared world model. Geo-located crowd-sourced texture data items are also received and include time-stamped images or video. User input of a temporal filter parameter is used to temporally filter the texture data items. The temporally-filtered texture data items are applied to surfaces of the 3D spatial shared world model to generate a textured shared world model of the physical environment. The textured shared world model is then provided for display by a display device.
    Type: Grant
    Filed: January 22, 2013
    Date of Patent: May 17, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Nicholas Kamuda, Peter Tobias Kinnebrew
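    Sketch: a small Python illustration of temporally filtering crowd-sourced texture items before texturing the shared world model; the data fields and the keep-newest-per-surface policy are assumptions made for the example.
```python
from dataclasses import dataclass

@dataclass
class TextureItem:
    surface_id: str        # which surface of the shared world model it covers
    timestamp: float       # when the image or video frame was captured
    image: bytes

def textures_for_model(items, start_time, end_time):
    """Apply the user's temporal filter, then keep the newest texture per
    surface so it can be applied to the 3D shared world model."""
    in_window = [t for t in items if start_time <= t.timestamp <= end_time]
    newest = {}
    for t in sorted(in_window, key=lambda t: t.timestamp):
        newest[t.surface_id] = t         # later items overwrite earlier ones
    return newest
```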
  • Patent number: 9329682
    Abstract: A head mounted display allows user selection of a virtual object through multi-step focusing by the user. Focus on the selectable object is determined and then a validation object is displayed. When user focus moves to the validation object, a timeout determines that a selection of the validation object, and thus of the selectable object, has occurred. The technology can be used in see-through head mounted displays to allow a user to effectively navigate an environment with a multitude of virtual objects without unintended selections.
    Type: Grant
    Filed: June 18, 2013
    Date of Patent: May 3, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Brian E. Keane, Ben J. Sugden, Robert L. Crocco, Jr., Daniel Deptford, Tom G. Salter, Laura K. Massey, Alex Aben-Athar Kipman, Peter Tobias Kinnebrew, Nicholas Ferianc Kamuda
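    Sketch: a toy Python state machine for the two-step (focus, then validation-object timeout) selection described in the abstract; the timeout value and the way focus targets are identified are invented for the example.
```python
class DwellSelector:
    def __init__(self, timeout=1.0):
        self.timeout = timeout
        self.validating = None      # id of the object whose validation object is shown
        self.dwell_start = None

    def update(self, focused, now):
        """`focused` is the id of the object under the user's gaze, the tuple
        ("validator", object_id) when the gaze is on a validation object, or
        None. Returns the id of a confirmed selection, or None."""
        if self.validating is None:
            if focused is not None and not isinstance(focused, tuple):
                self.validating = focused           # step 1: show the validation object
            return None
        if focused == ("validator", self.validating):
            if self.dwell_start is None:
                self.dwell_start = now
            elif now - self.dwell_start >= self.timeout:
                selected = self.validating          # step 2: timeout confirms selection
                self.validating = self.dwell_start = None
                return selected
        else:
            self.dwell_start = None
            if focused != self.validating:
                self.validating = None              # gaze left both objects
        return None

sel = DwellSelector(timeout=1.0)
sel.update("button_a", 0.0)                         # focus reveals the validation object
sel.update(("validator", "button_a"), 0.2)          # dwell begins
print(sel.update(("validator", "button_a"), 1.3))   # -> "button_a"
```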
  • Patent number: 9235051
    Abstract: A see-through head mounted display apparatus includes a display and a processor. The processor determines geo-located positions of points of interest within a field of view and generates markers indicating that information regarding an associated real world object is available to the user. Markers are rendered in the display relative to the geo-located position and the field of view of the user. When a user selects a marker through a user gesture, the device displays a near-field virtual object having a visual tether to the marker simultaneously with the marker. The user may interact with the marker to view, add or delete information associated with the point of interest.
    Type: Grant
    Filed: June 18, 2013
    Date of Patent: January 12, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Tom G. Salter, Ben J. Sugden, Daniel Deptford, Robert L. Crocco, Jr., Brian E. Keane, Laura K. Massey, Alex Aben-Athar Kipman, Peter Tobias Kinnebrew, Nicholas Ferianc Kamuda
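    Sketch: illustrative Python for placing the near-field virtual object and its visual tether once a marker is selected; the 0.6 m panel distance and the straight-line tether are assumptions, not values from the patent.
```python
import math

def near_field_panel(user_pos, marker_pos, panel_distance=0.6):
    """Place an info panel a short, comfortable distance from the user along
    the line toward the selected marker, and return the endpoints of the
    visual tether joining the panel to the marker."""
    dx, dy, dz = (marker_pos[i] - user_pos[i] for i in range(3))
    norm = math.sqrt(dx * dx + dy * dy + dz * dz) or 1.0
    panel_pos = tuple(user_pos[i] + panel_distance * d / norm
                      for i, d in enumerate((dx, dy, dz)))
    return panel_pos, (panel_pos, marker_pos)

panel, tether = near_field_panel((0, 0, 0), (10, 0, 5))
print(panel)        # a point 0.6 m from the user, toward the marker
```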
  • Patent number: 9230368
    Abstract: A system and method are disclosed for displaying virtual objects in a mixed reality environment in a way that is optimal and most comfortable for a user to interact with the virtual objects. When a user is moving through the mixed reality environment, the virtual objects may remain world-locked, so that the user can move around and explore the virtual objects from different perspectives. When the user is motionless in the mixed reality environment, the virtual objects may rotate to face the user so that the user can easily view and interact with the virtual objects.
    Type: Grant
    Filed: May 23, 2013
    Date of Patent: January 5, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Brian E. Keane, Ben J. Sugden, Robert L. Crocco, Jr., Daniel Deptford, Tom G. Salter, Laura K. Massey, Alex Aben-Athar Kipman, Peter Tobias Kinnebrew, Nicholas Ferianc Kamuda
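    Sketch: a minimal Python expression of the behavior in the abstract (world-locked while the user moves, rotate to face the user when they stop); the speed threshold and the 2D yaw representation are assumptions.
```python
import math

def object_yaw(object_pos, user_pos, user_speed, current_yaw,
               moving_threshold=0.2):
    """Return the yaw the virtual object should use this frame."""
    if user_speed > moving_threshold:
        return current_yaw                   # user is moving: stay world-locked
    dx = user_pos[0] - object_pos[0]
    dy = user_pos[1] - object_pos[1]
    return math.atan2(dy, dx)                # user is still: rotate to face them

print(object_yaw((0, 0), (0, 2), user_speed=0.0, current_yaw=0.0))   # ~1.57 rad
```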
  • Publication number: 20150325054
    Abstract: Embodiments are disclosed that relate to operating a user interface on an augmented reality computing device comprising a see-through display system. For example, one disclosed embodiment includes identifying one or more objects located outside a field of view of a user, and for each object of the one or more objects, providing to the user an indication of positional information associated with the object.
    Type: Application
    Filed: July 22, 2015
    Publication date: November 12, 2015
    Inventors: Thomas George Salter, Ben Sugden, Daniel Deptford, Robert Crocco, Jr., Brian Keane, Laura Massey, Alex Kipman, Peter Tobias Kinnebrew, Nicholas Kamuda, Zachary Quarles, Michael Scavezze, Ryan Hastings, Cameron Brown, Tony Ambrus, Jason Scott, John Bevis, Jamie B. Kirschenbaum, Nicholas Gervase Fajt, Michael Klucher, Relja Markovic, Stephen Latta, Daniel McCulloch
  • Patent number: 9129430
    Abstract: Embodiments are disclosed that relate to operating a user interface on an augmented reality computing device comprising a see-through display system. For example, one disclosed embodiment includes identifying one or more objects located outside a field of view of a user, and for each object of the one or more objects, providing to the user an indication of positional information associated with the object.
    Type: Grant
    Filed: June 25, 2013
    Date of Patent: September 8, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Thomas George Salter, Ben Sugden, Daniel Deptford, Robert Crocco, Jr., Brian Keane, Laura Massey, Alex Kipman, Peter Tobias Kinnebrew, Nicholas Kamuda, Zachary Quarles, Michael Scavezze, Ryan Hastings, Cameron Brown, Tony Ambrus, Jason Scott, John Bevis, Jamie B. Kirschenbaum, Nicholas Gervase Fajt, Michael Klucher, Relja Markovic, Stephen Latta, Daniel McCulloch
  • Publication number: 20150149288
    Abstract: A system automatically and continuously finds and aggregates the most relevant and current information about the people and things that a user cares about. The information gathering is based on current context (e.g., where the user is, what the user is doing, what the user is saying/typing, etc.). The result of the context-based information gathering is presented ubiquitously on user interfaces of any of the various physical devices operated by the user.
    Type: Application
    Filed: February 2, 2015
    Publication date: May 28, 2015
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Cesare John Saretto, Peter Tobias Kinnebrew, Nicholas Ferianc Kamuda, Henry Hooper Somuah, Matthew John McCloskey, Douglas C. Hebenthal, Kathleen P. Mulcahy
  • Patent number: 9002924
    Abstract: A system automatically and continuously finds and aggregates the most relevant and current information about the people and things that a user cares about. The information gathering is based on current context (e.g., where the user is, what the user is doing, what the user is saying/typing, etc.). The result of the context-based information gathering is presented ubiquitously on user interfaces of any of the various physical devices operated by the user.
    Type: Grant
    Filed: June 17, 2010
    Date of Patent: April 7, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Cesare John Saretto, Peter Tobias Kinnebrew, Nicholas Ferianc Kamuda, Henry Hooper Somuah, Matthew John McCloskey, Douglas C. Hebenthal, Kathleen P. Mulcahy
  • Publication number: 20150035861
    Abstract: Embodiments that relate to presenting a plurality of visual information density levels for a plurality of geo-located data items in a mixed reality environment are disclosed. For example, in one disclosed embodiment a graduated information delivery program receives information for a selected geo-located data item and provides a minimum visual information density level for the item to a head-mounted display device. The program receives via the head-mounted display device a user input corresponding to the selected geo-located data item. Based on the input, the program provides an increasing visual information density level for the selected item to the head-mounted display device for display within the mixed reality environment.
    Type: Application
    Filed: July 31, 2013
    Publication date: February 5, 2015
    Inventors: Thomas George Salter, Ben Sugden, Daniel Deptford, Robert Crocco, Jr., Brian Keane, Laura Massey, Alex Kipman, Peter Tobias Kinnebrew, Nicholas Kamuda
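    Sketch: a toy Python model of graduated visual information density; the level names and the one-step-per-input policy are invented for the example, not taken from the application.
```python
DENSITY_LEVELS = ["icon_only", "title", "summary", "full_details"]   # hypothetical

def next_density_level(current: str, user_requested_more: bool) -> str:
    """Start items at the minimum density level and step up one level each
    time the user's input asks for more about the selected item."""
    i = DENSITY_LEVELS.index(current)
    if user_requested_more and i + 1 < len(DENSITY_LEVELS):
        return DENSITY_LEVELS[i + 1]
    return current

print(next_density_level("icon_only", True))    # -> "title"
```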
  • Publication number: 20140375683
    Abstract: Embodiments are disclosed that relate to operating a user interface on an augmented reality computing device comprising a see-through display system. For example, one disclosed embodiment includes identifying one or more objects located outside a field of view of a user, and for each object of the one or more objects, providing to the user an indication of positional information associated with the object.
    Type: Application
    Filed: June 25, 2013
    Publication date: December 25, 2014
    Inventors: Thomas George Salter, Ben Sugden, Daniel Deptford, Robert Crocco, Jr., Brian Keane, Laura Massey, Alex Kipman, Peter Tobias Kinnebrew, Nicholas Kamuda, Zachary Quarles, Michael Scavezze, Ryan Hastings, Cameron Brown, Tony Ambrus, Jason Scott, John Bevis, Jamie B. Kirschenbaum, Nicholas Gervase Fajt, Michael Klucher, Relja Markovic, Stephen Latta, Daniel McCulloch
  • Publication number: 20140368534
    Abstract: A see-through head mounted display apparatus includes code performing a method of choosing an optimal viewing location and perspective for shared-view virtual objects rendered for multiple users in a common environment. Multiple objects and multiple users are taken into account in determining the optimal, common viewing location. The technology allows each user to have a common view of the relative position of the object in the environment.
    Type: Application
    Filed: June 18, 2013
    Publication date: December 18, 2014
    Inventors: Tom G. Salter, Ben J. Sugden, Daniel Deptford, Robert L. Crocco, Jr., Brian E. Keane, Laura K. Massey, Alex Aben-Athar Kipman, Peter Tobias Kinnebrew, Nicholas Ferianc Kamuda
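    Sketch: a stand-in Python heuristic for choosing one common placement for a shared-view object; the centroid is only a placeholder for whatever optimization the application describes, which would also weigh each user's view direction, occluders, and other placed objects.
```python
def shared_viewing_position(user_positions):
    """Return a single world position for a shared-view virtual object:
    here, simply the centroid of all users' positions."""
    n = len(user_positions)
    return tuple(sum(p[axis] for p in user_positions) / n for axis in range(3))

print(shared_viewing_position([(0, 0, 0), (2, 0, 0), (1, 0, 3)]))   # (1.0, 0.0, 1.0)
```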
  • Publication number: 20140372957
    Abstract: A head mounted display allows user selection of a virtual object through multi-step focusing by the user. Focus on the selectable object is determined and then a validation object is displayed. When user focus moves to the validation object, a timeout determines that a selection of the validation object, and thus of the selectable object, has occurred. The technology can be used in see-through head mounted displays to allow a user to effectively navigate an environment with a multitude of virtual objects without unintended selections.
    Type: Application
    Filed: June 18, 2013
    Publication date: December 18, 2014
    Inventors: Brian E. Keane, Ben J. Sugden, Robert L. Crocco, Jr., Daniel Deptford, Tom G. Salter, Laura K. Massey, Alex Aben-Athar Kipman, Peter Tobias Kinnebrew, Nicholas Ferianc Kamuda
  • Publication number: 20140368533
    Abstract: A see-through head mounted display apparatus includes a display and a processor. The processor determines geo-located positions of points of interest within a field of view and generates markers indicating that information regarding an associated real world object is available to the user. Markers are rendered in the display relative to the geo-located position and the field of view of the user. When a user selects a marker through a user gesture, the device displays a near-field virtual object having a visual tether to the marker simultaneously with the marker. The user may interact with the marker to view, add or delete information associated with the point of interest.
    Type: Application
    Filed: June 18, 2013
    Publication date: December 18, 2014
    Inventors: Tom G. Salter, Ben J. Sugden, Daniel Deptford, Robert L. Crocco, Jr., Brian E. Keane, Laura K. Massey, Alex Aben-Athar Kipman, Peter Tobias Kinnebrew, Nicholas Ferianc Kamuda
  • Publication number: 20140368535
    Abstract: A system and method are disclosed for displaying virtual objects in a mixed reality environment in a way that is optimal and most comfortable for a user to interact with the virtual objects. When a user is not focused on the virtual object, which may be a heads-up display, or HUD, the HUD may remain body locked to the user. As such, the user may explore and interact with a mixed reality environment presented by the head mounted display device without interference from the HUD. When a user wishes to view and/or interact with the HUD, the user may look at the HUD. At this point, the HUD may change from a body locked virtual object to a world locked virtual object. The user is then able to view and interact with the HUD from different positions and perspectives of the HUD.
    Type: Application
    Filed: June 18, 2013
    Publication date: December 18, 2014
    Inventors: Tom G. Salter, Ben J. Sugden, Daniel Deptford, Robert L. Crocco, Jr., Brian E. Keane, Laura K. Massey, Alex Aben-Athar Kipman, Peter Tobias Kinnebrew, Nicholas Ferianc Kamuda
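    Sketch: illustrative Python for the per-frame HUD update implied by the abstract; the dictionary fields, fixed offset, and 2D simplification are assumptions made for the example.
```python
def update_hud_pose(hud, user_pose, gazing_at_hud):
    """While body-locked, keep the HUD at a fixed offset from the user; once
    the user gazes at it, switch it to world-locked and stop moving it."""
    if gazing_at_hud:
        hud["locked"] = "world"
    if hud["locked"] == "body":
        hud["position"] = (user_pose["x"] + hud["offset"][0],
                           user_pose["y"] + hud["offset"][1])
    return hud

hud = {"locked": "body", "offset": (0.3, -0.2), "position": (0.0, 0.0)}
print(update_hud_pose(hud, {"x": 1.0, "y": 2.0}, gazing_at_hud=False))
```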
  • Publication number: 20140368532
    Abstract: A method and apparatus for the creation of a perspective-locked virtual object in world space. The virtual object may be consumed by another user with a consumption device at a location, position, and orientation which is the same as, or proximate to, the location, position, and orientation where the virtual object is created.
    Type: Application
    Filed: June 18, 2013
    Publication date: December 18, 2014
    Inventors: Brian E. Keane, Ben J. Sugden, Robert L. Crocco, Jr., Daniel Deptford, Tom G. Salter, Laura K. Massey, Alex Aben-Athar Kipman, Peter Tobias Kinnebrew, Nicholas Ferianc Kamuda
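    Sketch: a small Python check of whether a consuming device's pose is the same as, or proximate to, the creation pose of a perspective-locked object; the distance and angle thresholds are illustrative, not from the application.
```python
import math

def can_consume(creation_pose, consumer_pose,
                max_distance=1.0, max_angle_deg=30.0):
    """Allow consumption only when the consuming device is close to the
    creation pose in both position and orientation."""
    dx = consumer_pose["x"] - creation_pose["x"]
    dy = consumer_pose["y"] - creation_pose["y"]
    close_enough = math.hypot(dx, dy) <= max_distance
    dyaw = abs(consumer_pose["yaw_deg"] - creation_pose["yaw_deg"]) % 360.0
    dyaw = min(dyaw, 360.0 - dyaw)
    return close_enough and dyaw <= max_angle_deg

print(can_consume({"x": 0.0, "y": 0.0, "yaw_deg": 90.0},
                  {"x": 0.4, "y": 0.2, "yaw_deg": 100.0}))   # True
```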