Patents by Inventor Nicholas Kamuda

Nicholas Kamuda is a named inventor on the following patent filings. This listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 9761057
    Abstract: Embodiments are disclosed that relate to operating a user interface on an augmented reality computing device comprising a see-through display system. For example, one disclosed embodiment includes identifying one or more objects located outside a field of view of a user, and for each object of the one or more objects, providing to the user an indication of positional information associated with the object.
    Type: Grant
    Filed: November 21, 2016
    Date of Patent: September 12, 2017
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Thomas George Salter, Ben Sugden, Daniel Deptford, Robert Crocco, Jr., Brian Keane, Laura Massey, Alex Kipman, Peter Tobias Kinnebrew, Nicholas Kamuda, Zachary Quarles, Michael Scavezze, Ryan Hastings, Cameron Brown, Tony Ambrus, Jason Scott, John Bevis, Jamie B. Kirschenbaum, Nicholas Gervase Fajt, Michael Klucher, Relja Markovic, Stephen Latta, Daniel McCulloch
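The out-of-view indication this abstract describes can be illustrated with a minimal 2-D sketch. All names, the 90° field of view, and the turn-angle/distance form of the indication are illustrative assumptions, not the patented implementation:

```python
import math

def out_of_view_indicators(user_pos, view_dir_deg, fov_deg, objects):
    """For each object outside the user's horizontal field of view,
    return an indication of its position: the signed angle (degrees)
    the user would turn to face it, plus its distance."""
    indications = {}
    half_fov = fov_deg / 2.0
    for name, (ox, oy) in objects.items():
        dx, dy = ox - user_pos[0], oy - user_pos[1]
        bearing = math.degrees(math.atan2(dy, dx))
        # Signed offset from the view direction, wrapped into (-180, 180].
        offset = (bearing - view_dir_deg + 180.0) % 360.0 - 180.0
        if abs(offset) > half_fov:  # object lies outside the field of view
            indications[name] = {
                "turn_degrees": round(offset, 1),
                "distance": round(math.hypot(dx, dy), 2),
            }
    return indications
```

A real system would render these indications as, e.g., edge-of-display arrows; the sketch only shows the "identify objects outside the field of view and compute positional information" step.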
  • Patent number: 9734636
    Abstract: Embodiments that relate to presenting a plurality of visual information density levels for a plurality of geo-located data items in a mixed reality environment are disclosed. For example, in one disclosed embodiment a graduated information delivery program receives information for a first geo-located data item and provides a first visual information density level for the item to a head-mounted display device. When a spatial information density of geo-located data item information is below a threshold, the program provides a second visual information density level, greater than the first, for a second displayed geo-located data item.
    Type: Grant
    Filed: March 7, 2017
    Date of Patent: August 15, 2017
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Thomas George Salter, Ben Sugden, Daniel Deptford, Robert Crocco, Jr., Brian Keane, Laura Massey, Alex Kipman, Peter Tobias Kinnebrew, Nicholas Kamuda
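The density-threshold idea in this abstract can be sketched as a small selector. The level names, the items-per-area density measure, and the half-threshold tier are assumptions for illustration only:

```python
def choose_density_level(num_items_in_view, view_area, threshold,
                         levels=("minimal", "summary", "full")):
    """Pick a visual information density level for a geo-located data item.
    When the spatial density of displayed item information is below the
    threshold, a richer (higher-density) level can be shown without
    cluttering the mixed reality view."""
    density = num_items_in_view / view_area
    if density < threshold / 2:
        return levels[2]   # sparse scene: show full detail
    if density < threshold:
        return levels[1]   # moderate density: show a summary
    return levels[0]       # crowded scene: show only a minimal marker
```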
  • Publication number: 20170178412
    Abstract: Embodiments that relate to presenting a plurality of visual information density levels for a plurality of geo-located data items in a mixed reality environment are disclosed. For example, in one disclosed embodiment a graduated information delivery program receives information for a first geo-located data item and provides a first visual information density level for the item to a head-mounted display device. When a spatial information density of geo-located data item information is below a threshold, the program provides a second visual information density level, greater than the first, for a second displayed geo-located data item.
    Type: Application
    Filed: March 7, 2017
    Publication date: June 22, 2017
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Thomas George Salter, Ben Sugden, Daniel Deptford, Robert Crocco, Jr., Brian Keane, Laura Massey, Alex Kipman, Peter Tobias Kinnebrew, Nicholas Kamuda
  • Patent number: 9619939
    Abstract: Embodiments that relate to presenting a plurality of visual information density levels for a plurality of geo-located data items in a mixed reality environment are disclosed. For example, in one disclosed embodiment a graduated information delivery program receives information for a selected geo-located data item and provides a minimum visual information density level for the item to a head-mounted display device. The program receives via the head-mounted display device a user input corresponding to the selected geo-located data item. Based on the input, the program provides an increasing visual information density level for the selected item to the head-mounted display device for display within the mixed reality environment.
    Type: Grant
    Filed: July 31, 2013
    Date of Patent: April 11, 2017
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Thomas George Salter, Ben Sugden, Daniel Deptford, Robert Crocco, Jr., Brian Keane, Laura Massey, Alex Kipman, Peter Tobias Kinnebrew, Nicholas Kamuda
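This abstract's progressive disclosure (minimum level first, then increasing density on user input) can be sketched as a tiny state machine. The level names and the one-step-per-input policy are illustrative assumptions:

```python
LEVELS = ["icon", "title", "summary", "full"]

class GraduatedItem:
    """Tracks the visual information density level for one selected
    geo-located data item: it starts at the minimum level and increases
    with each user input (e.g., a gaze dwell or tap on the headset)."""

    def __init__(self):
        self.level_index = 0  # minimum visual information density

    @property
    def level(self):
        return LEVELS[self.level_index]

    def on_user_input(self):
        # Each input steps to the next, denser level, capped at "full".
        self.level_index = min(self.level_index + 1, len(LEVELS) - 1)
        return self.level
```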
  • Publication number: 20170069143
    Abstract: Embodiments are disclosed that relate to operating a user interface on an augmented reality computing device comprising a see-through display system. For example, one disclosed embodiment includes identifying one or more objects located outside a field of view of a user, and for each object of the one or more objects, providing to the user an indication of positional information associated with the object.
    Type: Application
    Filed: November 21, 2016
    Publication date: March 9, 2017
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Thomas George Salter, Ben Sugden, Daniel Deptford, Robert Crocco, Jr., Brian Keane, Laura Massey, Alex Kipman, Peter Tobias Kinnebrew, Nicholas Kamuda, Zachary Quarles, Michael Scavezze, Ryan Hastings, Cameron Brown, Tony Ambrus, Jason Scott, John Bevis, Jamie B. Kirschenbaum, Nicholas Gervase Fajt, Michael Klucher, Relja Markovic, Stephen Latta, Daniel McCulloch
  • Patent number: 9501873
    Abstract: Embodiments are disclosed that relate to operating a user interface on an augmented reality computing device comprising a see-through display system. For example, one disclosed embodiment includes identifying one or more objects located outside a field of view of a user, and for each object of the one or more objects, providing to the user an indication of positional information associated with the object.
    Type: Grant
    Filed: July 22, 2015
    Date of Patent: November 22, 2016
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Thomas George Salter, Ben Sugden, Daniel Deptford, Robert Crocco, Jr., Brian Keane, Laura Massey, Alex Kipman, Peter Tobias Kinnebrew, Nicholas Kamuda, Zachary Quarles, Michael Scavezze, Ryan Hastings, Cameron Brown, Tony Ambrus, Jason Scott, John Bevis, Jamie B. Kirschenbaum, Nicholas Gervase Fajt, Michael Klucher, Relja Markovic, Stephen Latta, Daniel McCulloch
  • Patent number: 9429912
    Abstract: Systems and related methods for presenting a holographic object that self-adapts to a mixed reality environment are provided. In one example, a holographic object presentation program captures physical environment data from a destination physical environment and creates a model of the environment including physical objects having associated properties. The program identifies a holographic object for display on a display of a display device, the holographic object including one or more rules linking a detected environmental condition and/or properties of the physical objects with a display mode of the holographic object. The program applies the one or more rules to select the display mode for the holographic object based on the detected environmental condition and/or the properties of the physical objects.
    Type: Grant
    Filed: August 17, 2012
    Date of Patent: August 30, 2016
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Rod G. Fleck, Nicholas Kamuda, Stephen Latta, Peter Tobias Kinnebrew
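The rule-linking step in this abstract — rules that map detected environmental conditions to a display mode — can be sketched as a first-match rule table. Representing rules as (predicate, mode) pairs is an assumption for illustration:

```python
def select_display_mode(rules, environment, default="default"):
    """Apply rules linking detected environmental conditions (and/or
    properties of physical objects) to a holographic object's display
    mode; the first matching rule wins."""
    for condition, mode in rules:
        if condition(environment):
            return mode
    return default

# Hypothetical rules for a self-adapting holographic object.
RULES = [
    (lambda e: e["ambient_light"] > 0.8, "high_contrast"),
    (lambda e: e["nearby_surface"] == "wall", "wall_projected"),
]
```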
  • Patent number: 9412201
    Abstract: Embodiments that relate to selectively filtering geo-located data items in a mixed reality environment are disclosed. For example, in one disclosed embodiment a mixed reality filtering program receives a plurality of geo-located data items and selectively filters the data items based on one or more modes. The modes comprise one or more of a social mode, a popular mode, a recent mode, a work mode, a play mode, and a user interest mode. Such filtering yields a filtered collection of the geo-located data items. The filtered collection of data items is then provided to a mixed reality display program for display by a display device.
    Type: Grant
    Filed: January 22, 2013
    Date of Patent: August 9, 2016
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Peter Tobias Kinnebrew, Nicholas Kamuda
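The mode-based filtering this abstract describes can be sketched as a set intersection over mode tags. Tagging each item with its applicable modes is an assumed data model, not the patented one:

```python
def filter_items(items, active_modes):
    """Selectively filter geo-located data items: keep an item when it is
    tagged with at least one of the active modes (e.g., social, popular,
    recent, work, play, user interest)."""
    active = set(active_modes)
    return [item for item in items if active & set(item["modes"])]
```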
  • Patent number: 9367960
    Abstract: Embodiments are disclosed that relate to placing virtual objects in an augmented reality environment. For example, one disclosed embodiment provides a method comprising receiving sensor data comprising one or more of motion data, location data, and orientation data from one or more sensors located on a head-mounted display device, and based upon the motion data, determining a body-locking direction vector that is based upon an estimated direction in which a body of a user is facing. The method further comprises positioning a displayed virtual object based on the body-locking direction vector.
    Type: Grant
    Filed: May 22, 2013
    Date of Patent: June 14, 2016
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Adam G. Poulos, Tony Ambrus, Jeffrey Cole, Ian Douglas McIntyre, Stephen Latta, Peter Tobias Kinnebrew, Nicholas Kamuda, Robert Pengelly, Jeffrey C. Fong, Aaron Woo, Udiyan I. Padmanahan, Andrew Wyman MacDonald, Olivia M. Janik
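One plausible reading of the "body-locking direction vector" in this abstract is a slow average of head yaw from the headset's motion sensors: the smoothed value tracks the torso while ignoring quick head turns. The exponential smoothing and its coefficient are assumptions, not the patented method:

```python
import math

def body_lock_vector(yaw_samples_deg, smoothing=0.2):
    """Estimate the direction the user's body is facing by exponentially
    smoothing head-yaw samples (degrees) from motion data.
    Returns a unit direction vector (x, y)."""
    est = yaw_samples_deg[0]
    for yaw in yaw_samples_deg[1:]:
        # Shortest angular difference, so smoothing wraps correctly at 360.
        diff = (yaw - est + 180.0) % 360.0 - 180.0
        est += smoothing * diff
    rad = math.radians(est)
    return (math.cos(rad), math.sin(rad))
```

A displayed virtual object would then be positioned along this vector, so it stays roughly in front of the user's body rather than glued to the gaze.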
  • Patent number: 9342929
    Abstract: Embodiments that relate to presenting a textured shared world model of a physical environment are disclosed. One embodiment includes receiving geo-located crowd-sourced structural data items of the physical environment. The structural data items are stitched together to generate a 3D spatial shared world model. Geo-located crowd-sourced texture data items are also received and include time-stamped images or video. User input of a temporal filter parameter is used to temporally filter the texture data items. The temporally-filtered texture data items are applied to surfaces of the 3D spatial shared world model to generate a textured shared world model of the physical environment. The textured shared world model is then provided for display by a display device.
    Type: Grant
    Filed: January 22, 2013
    Date of Patent: May 17, 2016
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Nicholas Kamuda, Peter Tobias Kinnebrew
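The temporal-filtering step in this abstract can be sketched as a time-window filter over time-stamped texture items. The numeric timestamps and newest-first ordering are illustrative assumptions:

```python
def temporally_filter_textures(texture_items, start, end):
    """Filter crowd-sourced texture data items (time-stamped images or
    video) to those captured within the user-chosen time window, newest
    first, ready to be applied to surfaces of the shared 3D world model."""
    in_window = [t for t in texture_items if start <= t["timestamp"] <= end]
    return sorted(in_window, key=lambda t: t["timestamp"], reverse=True)
```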
  • Publication number: 20150325054
    Abstract: Embodiments are disclosed that relate to operating a user interface on an augmented reality computing device comprising a see-through display system. For example, one disclosed embodiment includes identifying one or more objects located outside a field of view of a user, and for each object of the one or more objects, providing to the user an indication of positional information associated with the object.
    Type: Application
    Filed: July 22, 2015
    Publication date: November 12, 2015
    Inventors: Thomas George Salter, Ben Sugden, Daniel Deptford, Robert Crocco, Jr., Brian Keane, Laura Massey, Alex Kipman, Peter Tobias Kinnebrew, Nicholas Kamuda, Zachary Quarles, Michael Scavezze, Ryan Hastings, Cameron Brown, Tony Ambrus, Jason Scott, John Bevis, Jamie B. Kirschenbaum, Nicholas Gervase Fajt, Michael Klucher, Relja Markovic, Stephen Latta, Daniel McCulloch
  • Patent number: 9129430
    Abstract: Embodiments are disclosed that relate to operating a user interface on an augmented reality computing device comprising a see-through display system. For example, one disclosed embodiment includes identifying one or more objects located outside a field of view of a user, and for each object of the one or more objects, providing to the user an indication of positional information associated with the object.
    Type: Grant
    Filed: June 25, 2013
    Date of Patent: September 8, 2015
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Thomas George Salter, Ben Sugden, Daniel Deptford, Robert Crocco, Jr., Brian Keane, Laura Massey, Alex Kipman, Peter Tobias Kinnebrew, Nicholas Kamuda, Zachary Quarles, Michael Scavezze, Ryan Hastings, Cameron Brown, Tony Ambrus, Jason Scott, John Bevis, Jamie B. Kirschenbaum, Nicholas Gervase Fajt, Michael Klucher, Relja Markovic, Stephen Latta, Daniel McCulloch
  • Publication number: 20150035861
    Abstract: Embodiments that relate to presenting a plurality of visual information density levels for a plurality of geo-located data items in a mixed reality environment are disclosed. For example, in one disclosed embodiment a graduated information delivery program receives information for a selected geo-located data item and provides a minimum visual information density level for the item to a head-mounted display device. The program receives via the head-mounted display device a user input corresponding to the selected geo-located data item. Based on the input, the program provides an increasing visual information density level for the selected item to the head-mounted display device for display within the mixed reality environment.
    Type: Application
    Filed: July 31, 2013
    Publication date: February 5, 2015
    Inventors: Thomas George Salter, Ben Sugden, Daniel Deptford, Robert Crocco, Jr., Brian Keane, Laura Massey, Alex Kipman, Peter Tobias Kinnebrew, Nicholas Kamuda
  • Publication number: 20140375683
    Abstract: Embodiments are disclosed that relate to operating a user interface on an augmented reality computing device comprising a see-through display system. For example, one disclosed embodiment includes identifying one or more objects located outside a field of view of a user, and for each object of the one or more objects, providing to the user an indication of positional information associated with the object.
    Type: Application
    Filed: June 25, 2013
    Publication date: December 25, 2014
    Inventors: Thomas George Salter, Ben Sugden, Daniel Deptford, Robert Crocco, Jr., Brian Keane, Laura Massey, Alex Kipman, Peter Tobias Kinnebrew, Nicholas Kamuda, Zachary Quarles, Michael Scavezze, Ryan Hastings, Cameron Brown, Tony Ambrus, Jason Scott, John Bevis, Jamie B. Kirschenbaum, Nicholas Gervase Fajt, Michael Klucher, Relja Markovic, Stephen Latta, Daniel McCulloch
  • Publication number: 20140347390
    Abstract: Embodiments are disclosed that relate to placing virtual objects in an augmented reality environment. For example, one disclosed embodiment provides a method comprising receiving sensor data comprising one or more of motion data, location data, and orientation data from one or more sensors located on a head-mounted display device, and based upon the motion data, determining a body-locking direction vector that is based upon an estimated direction in which a body of a user is facing. The method further comprises positioning a displayed virtual object based on the body-locking direction vector.
    Type: Application
    Filed: May 22, 2013
    Publication date: November 27, 2014
    Inventors: Adam G. Poulos, Tony Ambrus, Jeffrey Cole, Ian Douglas McIntyre, Stephen Latta, Peter Tobias Kinnebrew, Nicholas Kamuda, Robert Pengelly, Jeffrey C. Fong, Aaron Woo, Udiyan I. Padmanahan, Andrew Wyman MacDonald, Olivia M. Janik
  • Publication number: 20140204077
    Abstract: Embodiments that relate to presenting a textured shared world model of a physical environment are disclosed. One embodiment includes receiving geo-located crowd-sourced structural data items of the physical environment. The structural data items are stitched together to generate a 3D spatial shared world model. Geo-located crowd-sourced texture data items are also received and include time-stamped images or video. User input of a temporal filter parameter is used to temporally filter the texture data items. The temporally-filtered texture data items are applied to surfaces of the 3D spatial shared world model to generate a textured shared world model of the physical environment. The textured shared world model is then provided for display by a display device.
    Type: Application
    Filed: January 22, 2013
    Publication date: July 24, 2014
    Inventors: Nicholas Kamuda, Peter Tobias Kinnebrew
  • Publication number: 20140204117
    Abstract: Embodiments that relate to selectively filtering geo-located data items in a mixed reality environment are disclosed. For example, in one disclosed embodiment a mixed reality filtering program receives a plurality of geo-located data items and selectively filters the data items based on one or more modes. The modes comprise one or more of a social mode, a popular mode, a recent mode, a work mode, a play mode, and a user interest mode. Such filtering yields a filtered collection of the geo-located data items. The filtered collection of data items is then provided to a mixed reality display program for display by a display device.
    Type: Application
    Filed: January 22, 2013
    Publication date: July 24, 2014
    Inventors: Peter Tobias Kinnebrew, Nicholas Kamuda
  • Publication number: 20140071163
    Abstract: A holographic object presentation system and related methods for presenting a holographic object having a selective information detail level are provided. In one example, a holographic object presentation program may receive user behavior information and physical environment information. Using one or more of the user behavior information and the physical environment information, the program may adjust the selective information detail level of the holographic object to an adjusted information detail level. The program may then provide the holographic object at the adjusted information detail level to an augmented reality display program for display on a display device.
    Type: Application
    Filed: September 11, 2012
    Publication date: March 13, 2014
    Inventors: Peter Tobias Kinnebrew, Nicholas Kamuda
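This abstract's adjustment of an information detail level from user behavior and physical environment information can be sketched as below. The specific signals (gaze dwell, walking speed), thresholds, and level range are assumptions for illustration:

```python
def adjust_detail_level(base_level, gaze_dwell_s, walking_speed_mps,
                        min_level=0, max_level=3):
    """Adjust a holographic object's selective information detail level:
    sustained gaze (user behavior information) raises it, while moving
    quickly through the space (physical environment information) lowers
    it. The result is clamped to the supported level range."""
    level = base_level
    if gaze_dwell_s >= 2.0:
        level += 1          # sustained attention: show more detail
    if walking_speed_mps > 1.0:
        level -= 1          # user is on the move: reduce detail
    return max(min_level, min(max_level, level))
```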
  • Publication number: 20140049559
    Abstract: Systems and related methods for presenting a holographic object that self-adapts to a mixed reality environment are provided. In one example, a holographic object presentation program captures physical environment data from a destination physical environment and creates a model of the environment including physical objects having associated properties. The program identifies a holographic object for display on a display of a display device, the holographic object including one or more rules linking a detected environmental condition and/or properties of the physical objects with a display mode of the holographic object. The program applies the one or more rules to select the display mode for the holographic object based on the detected environmental condition and/or the properties of the physical objects.
    Type: Application
    Filed: August 17, 2012
    Publication date: February 20, 2014
    Inventors: Rod G. Fleck, Nicholas Kamuda, Stephen Latta, Peter Tobias Kinnebrew