Patents by Inventor Cameron G. Brown

Cameron G. Brown has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 9165381
    Abstract: A system and method are disclosed for augmenting a reading experience in a mixed reality environment. In response to predefined verbal or physical gestures, the mixed reality system is able to answer a user's questions or provide additional information relating to what the user is reading. Responses may be displayed to the user on virtual display slates in a border around the reading material without obscuring text or interfering with the user's reading experience.
    Type: Grant
    Filed: May 31, 2012
    Date of Patent: October 20, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Stephen G. Latta, Ryan L. Hastings, Cameron G. Brown, Aaron Krauss, Daniel J. McCulloch, Ben J. Sugden
  • Publication number: 20150206321
    Abstract: Methods for controlling the display of content as the content is being viewed by an end user of a head-mounted display device (HMD) are described. In some embodiments, an HMD may display the content using a virtual content reader for reading the content. The content may comprise text and/or images, such as text or images associated with an electronic book, an electronic magazine, a word processing document, a webpage, or an email. The virtual content reader may provide automated content scrolling based on a rate at which the end user reads a portion of the displayed content on the virtual content reader. In one embodiment, an HMD may combine automatic scrolling of content displayed on the virtual content reader with user controlled scrolling (e.g., via head tracking of the end user of the HMD).
    Type: Application
    Filed: January 23, 2014
    Publication date: July 23, 2015
    Inventors: Michael J. Scavezze, Adam G. Poulos, Johnathan Robert Bevis, Nicholas Gervase Fajt, Cameron G. Brown, Daniel J. McCulloch, Jeremy Lee
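    The automated scrolling described in this abstract — pacing the content to the rate at which the user reads, with head-tracked input able to override it — can be illustrated with a minimal sketch. This is not the patented implementation; the pixel-based pacing model and all names here are illustrative assumptions.

    ```python
    def auto_scroll_rate(words_per_minute: float, words_per_line: float,
                         line_height_px: float) -> float:
        """Estimate an auto-scroll speed in pixels/second from a measured reading rate.

        Assumed model: the reader consumes words_per_minute words, each displayed
        line holds about words_per_line words and occupies line_height_px pixels,
        so the reader's pace converts directly to a vertical scroll speed.
        """
        lines_per_second = (words_per_minute / 60.0) / words_per_line
        return lines_per_second * line_height_px

    def combined_scroll(auto_px_per_s: float, user_px_per_s: float,
                        user_active: bool) -> float:
        """Combine automatic scrolling with user-controlled (e.g. head-tracked)
        scrolling: while the user actively scrolls, their input takes over."""
        return user_px_per_s if user_active else auto_px_per_s
    ```

    For example, a reader at 240 words per minute on lines of 8 words, each 20 px tall, would be paced at 10 px/s until a head-tracking gesture overrides the rate.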
  • Publication number: 20150205494
    Abstract: Methods for enabling hands-free selection of virtual objects are described. In some embodiments, a gaze swipe gesture may be used to select a virtual object. The gaze swipe gesture may involve an end user of a head-mounted display device (HMD) performing head movements that are tracked by the HMD to detect whether a virtual pointer controlled by the end user has swiped across two or more edges of the virtual object. In some cases, the gaze swipe gesture may comprise the end user using their head movements to move the virtual pointer through two edges of the virtual object while the end user gazes at the virtual object. In response to detecting the gaze swipe gesture, the HMD may determine a second virtual object to be displayed on the HMD based on a speed of the gaze swipe gesture and a size of the virtual object.
    Type: Application
    Filed: January 23, 2014
    Publication date: July 23, 2015
    Inventors: Jason Scott, Arthur C. Tomlin, Mike Thomas, Matthew Kaplan, Cameron G. Brown, Jonathan Plumb, Nicholas Gervase Fajt, Daniel J. McCulloch, Jeremy Lee
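    The core detection step in this abstract — deciding whether a head-controlled virtual pointer has swiped across two or more edges of a virtual object — reduces to counting segment/edge intersections. The sketch below works in 2D screen coordinates and is an illustrative assumption, not the patented method.

    ```python
    def _ccw(a, b, c):
        """Signed area test: positive if a->b->c turns counter-clockwise."""
        return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])

    def _segments_cross(p1, p2, q1, q2):
        """True if open segments p1-p2 and q1-q2 properly intersect."""
        d1, d2 = _ccw(q1, q2, p1), _ccw(q1, q2, p2)
        d3, d4 = _ccw(p1, p2, q1), _ccw(p1, p2, q2)
        return (d1 > 0) != (d2 > 0) and (d3 > 0) != (d4 > 0)

    def gaze_swipe_selected(path, rect):
        """Return True if the gaze-pointer path crosses two or more edges of
        an axis-aligned rectangle rect = (xmin, ymin, xmax, ymax)."""
        xmin, ymin, xmax, ymax = rect
        edges = [((xmin, ymin), (xmax, ymin)), ((xmax, ymin), (xmax, ymax)),
                 ((xmax, ymax), (xmin, ymax)), ((xmin, ymax), (xmin, ymin))]
        crossed = set()
        for a, b in zip(path, path[1:]):
            for i, (q1, q2) in enumerate(edges):
                if _segments_cross(a, b, q1, q2):
                    crossed.add(i)
        return len(crossed) >= 2
    ```

    A pointer path passing straight through the object crosses its left and right edges and triggers selection; a path that enters but stops inside crosses only one edge and does not.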
  • Patent number: 9053483
    Abstract: A system provides a recommendation of food items to a user based on nutritional preferences of the user, using a head-mounted display device (HMDD) worn by the user. In a store, a forward-facing camera of the HMDD captures an image of a food item. The food item can be identified by the image, such as based on packaging of the food item. Nutritional parameters of the food item are compared to nutritional preferences of the user to determine whether the food item is recommended. The HMDD displays an augmented reality image to the user indicating whether the food item is recommended. If the food item is not recommended, a substitute food item can be identified. The nutritional preferences can indicate food allergies, preferences for low calorie foods and so forth. In a restaurant, the HMDD can recommend menu selections for a user.
    Type: Grant
    Filed: March 30, 2012
    Date of Patent: June 9, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kevin A. Geisner, Kathryn Stone Perez, Stephen G. Latta, Ben J. Sugden, Benjamin I. Vaught, Alex Aben-Athar Kipman, Cameron G. Brown, Holly A. Hirzel, Brian J. Mount, Daniel McCulloch
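    The comparison at the heart of this abstract — matching an identified food item's nutritional parameters against the user's stated preferences, such as allergies or a calorie limit — can be sketched as a simple filter. The field names and rules below are illustrative assumptions, not the patented logic.

    ```python
    def recommend_food(food: dict, preferences: dict):
        """Compare a food item's nutritional parameters to the user's preferences.

        food: e.g. {"name": ..., "allergens": set, "calories": int}
        preferences: e.g. {"allergies": set, "max_calories": int}
        Returns (recommended, reasons); an empty reasons list means recommended,
        otherwise a substitute item could be searched for instead.
        """
        reasons = []
        if food["allergens"] & preferences["allergies"]:
            reasons.append("contains an allergen")
        if food["calories"] > preferences["max_calories"]:
            reasons.append("exceeds calorie preference")
        return (not reasons, reasons)
    ```

    The HMDD would then render the boolean result as an augmented reality overlay next to the item, and fall back to scanning for a substitute when the item is rejected.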
  • Publication number: 20150007114
    Abstract: Technology is described for a web-like hierarchical menu interface which displays a menu in a web-like hierarchical menu display configuration in a near-eye display (NED). The web-like hierarchical menu display configuration links menu levels and menu items within a menu level with flexible spatial dimensions for menu elements. One or more processors executing the interface select a web-like hierarchical menu display configuration based on the available menu space and user head view direction determined from a 3D mapping of the NED field of view data and stored user head comfort rules. Activation parameters in menu item selection criteria are adjusted to be user specific based on user head motion data tracked based on data from one or more sensors when the user wears the NED. Menu display layout may be triggered by changes in head view direction of the user and available menu space about the user's head.
    Type: Application
    Filed: June 28, 2013
    Publication date: January 1, 2015
    Inventors: Adam G. Poulos, Anthony J. Ambrus, Cameron G. Brown, Jason Scott, Brian J. Mount, Daniel J. McCulloch, John Bevis, Wei Zhang
  • Publication number: 20140306891
    Abstract: Methods for providing real-time feedback to an end user of a mobile device as they are interacting with or manipulating one or more virtual objects within an augmented reality environment are described. The real-time feedback may comprise visual feedback, audio feedback, and/or haptic feedback. In some embodiments, a mobile device, such as a head-mounted display device (HMD), may determine an object classification associated with a virtual object within an augmented reality environment, detect an object manipulation gesture performed by an end user of the mobile device, detect an interaction with the virtual object based on the object manipulation gesture, determine a magnitude of a virtual force associated with the interaction, and provide real-time feedback to the end user of the mobile device based on the interaction, the magnitude of the virtual force applied to the virtual object, and the object classification associated with the virtual object.
    Type: Application
    Filed: April 12, 2013
    Publication date: October 16, 2014
    Inventors: Stephen G. Latta, Adam G. Poulos, Cameron G. Brown, Daniel J. McCulloch, Matthew Kaplan, Arnulfo Zepeda Navratil, Jon Paulovich, Kudo Tsunoda
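    The feedback step this abstract describes — scaling real-time feedback by the magnitude of the virtual force and the object's classification — can be illustrated in a few lines. The gain table and clamping are illustrative assumptions, not the patented behavior.

    ```python
    # Hypothetical per-classification gains: a fragile object gives stronger
    # feedback for the same virtual force than a soft one.
    FEEDBACK_GAIN = {"rigid": 1.0, "soft": 0.5, "fragile": 1.5}

    def feedback_intensity(virtual_force: float, classification: str) -> float:
        """Map an interaction's virtual force and the manipulated object's
        classification to a normalized feedback intensity in [0, 1], which
        could then drive visual, audio, and/or haptic channels."""
        gain = FEEDBACK_GAIN.get(classification, 1.0)
        return min(1.0, max(0.0, virtual_force * gain))
    ```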
  • Publication number: 20140306993
    Abstract: Methods for positioning virtual objects within an augmented reality environment using snap grid spaces associated with real-world environments, real-world objects, and/or virtual objects within the augmented reality environment are described. A snap grid space may comprise a two-dimensional or three-dimensional virtual space within an augmented reality environment in which one or more virtual objects may be positioned. In some embodiments, a head-mounted display device (HMD) may identify one or more grid spaces within an augmented reality environment, detect a positioning of a virtual object within the augmented reality environment, determine a target grid space of the one or more grid spaces in which to position the virtual object, determine a position of the virtual object within the target grid space, and display the virtual object within the augmented reality environment based on the position of the virtual object within the target grid space.
    Type: Application
    Filed: April 12, 2013
    Publication date: October 16, 2014
    Inventors: Adam G. Poulos, Jason Scott, Matthew Kaplan, Christopher Obeso, Cameron G. Brown, Daniel J. McCulloch, Abby Lee, Brian J. Mount, Ben J. Sugden
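    The positioning step in this abstract — determining where within a target grid space a virtual object should sit — amounts to quantizing a position to the grid. The sketch below snaps a 3D point to the nearest cell boundary of a uniform grid; the uniform-cell assumption and the function itself are illustrative, not the patented method.

    ```python
    def snap_to_grid(position, grid_origin, cell_size: float):
        """Snap a 3-D position to the nearest point of a uniform snap grid.

        position, grid_origin: (x, y, z) tuples in the same world units;
        cell_size: edge length of one grid cell.
        """
        return tuple(
            o + round((p - o) / cell_size) * cell_size
            for p, o in zip(position, grid_origin)
        )
    ```

    An HMD applying this would detect where the user drops the object, snap that point into the target grid space, and render the object at the snapped position.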
  • Publication number: 20140306994
    Abstract: Methods for generating and displaying personalized virtual billboards within an augmented reality environment are described. The personalized virtual billboards may facilitate the sharing of personalized information between persons within an environment who have varying degrees of acquaintance (e.g., ranging from close familial relationships to strangers). In some embodiments, a head-mounted display device (HMD) may detect a mobile device associated with a particular person within an environment, acquire a personalized information set corresponding with the particular person, generate a virtual billboard based on the personalized information set, and display the virtual billboard on the HMD. The personalized information set may include information associated with the particular person such as shopping lists and classified advertisements.
    Type: Application
    Filed: April 12, 2013
    Publication date: October 16, 2014
    Inventors: Cameron G. Brown, Abby Lee, Brian J. Mount, Daniel J. McCulloch, Michael J. Scavezze, Ryan L. Hastings, John Bevis, Mike Thomas, Ron Amador-Leon
  • Publication number: 20130321390
    Abstract: A system and method are disclosed for augmenting a reading experience in a mixed reality environment. In response to predefined verbal or physical gestures, the mixed reality system is able to answer a user's questions or provide additional information relating to what the user is reading. Responses may be displayed to the user on virtual display slates in a border around the reading material without obscuring text or interfering with the user's reading experience.
    Type: Application
    Filed: May 31, 2012
    Publication date: December 5, 2013
    Inventors: Stephen G. Latta, Ryan L. Hastings, Cameron G. Brown, Aaron Krauss, Daniel J. McCulloch, Ben J. Sugden
  • Publication number: 20130085345
    Abstract: A system provides a recommendation of food items to a user based on nutritional preferences of the user, using a head-mounted display device (HMDD) worn by the user. In a store, a forward-facing camera of the HMDD captures an image of a food item. The food item can be identified by the image, such as based on packaging of the food item. Nutritional parameters of the food item are compared to nutritional preferences of the user to determine whether the food item is recommended. The HMDD displays an augmented reality image to the user indicating whether the food item is recommended. If the food item is not recommended, a substitute food item can be identified. The nutritional preferences can indicate food allergies, preferences for low calorie foods and so forth. In a restaurant, the HMDD can recommend menu selections for a user.
    Type: Application
    Filed: March 30, 2012
    Publication date: April 4, 2013
    Inventors: Kevin A. Geisner, Kathryn Stone Perez, Stephen G. Latta, Ben J. Sugden, Benjamin I. Vaught, Alex Aben-Athar Kipman, Cameron G. Brown, Holly A. Hirzel, Brian J. Mount, Daniel McCulloch