Patents by Inventor Kathryn Stone Perez

Kathryn Stone Perez has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 9274747
    Abstract: A system and method are disclosed for combining interactive gaming aspects into a linear story. A user may interact with the linear story via a natural user interface (NUI) system to alter the story and the images that are presented to the user. In an example, the user may alter the story by performing a predefined exploration gesture. This gesture brings the user into the 3-D world of the displayed image. In particular, the image displayed on the screen changes to create the impression that the user is stepping into the 3-D virtual world, allowing the user to examine virtual objects from different perspectives or to peer around them.
    Type: Grant
    Filed: February 19, 2013
    Date of Patent: March 1, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Andrew Fuller, Rudy Poat, Alex Aben-Athar Kipman, Kathryn Stone Perez
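The "step into the 3-D world" transition described in the abstract can be pictured as moving a virtual camera from the story's fixed viewpoint toward a point inside the scene. The following is a minimal illustrative sketch of that idea only, not the patented method; all names and coordinates are made up.

```python
def lerp(a, b, t):
    """Linearly interpolate between two 3-D points."""
    return tuple(a[i] + (b[i] - a[i]) * t for i in range(3))

def camera_path(story_viewpoint, target, steps):
    """Return intermediate camera positions for a step-in transition,
    from the story's viewpoint to the user's chosen target in the scene."""
    return [lerp(story_viewpoint, target, s / steps) for s in range(steps + 1)]

# Hypothetical coordinates: camera starts behind the screen plane and
# moves to a point beside a virtual object the user wants to examine.
path = camera_path((0.0, 1.7, -3.0), (0.5, 1.7, 0.0), steps=4)
```

Rendering one frame per intermediate position would produce the impression of walking into the displayed scene.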
  • Patent number: 9274594
    Abstract: A system and method are disclosed for sensing, storing and using personal trait profile data. Once sensed and stored, this personal trait profile data may be used for a variety of purposes. In one example, a user's personal trait profile data may be accessed and downloaded to the different computing systems with which the user interacts, so that each system may be instantly tuned to the user's personal traits and manner of interaction. In a further example, a user's personal trait profile data may also be used for authentication purposes.
    Type: Grant
    Filed: May 28, 2010
    Date of Patent: March 1, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kathryn Stone Perez, Alex Aben-Athar Kipman, John Clavin, Joseph Molnar, Aaron E. Kornblum
  • Patent number: 9268406
    Abstract: Technology is described for providing a virtual spectator experience for a user of a personal A/V apparatus including a near-eye, augmented reality (AR) display. A position volume of an event object participating in an event in a first 3D coordinate system for a first location is received and mapped to a second position volume in a second 3D coordinate system at a second location remote from where the event is occurring. A display field of view of the near-eye AR display at the second location is determined, and real-time 3D virtual data representing the one or more event objects positioned within the display field of view is displayed in the near-eye AR display. A user may select a viewing position from which to view the event. Additionally, virtual data of a second user may be displayed at a position relative to a first user.
    Type: Grant
    Filed: June 29, 2012
    Date of Patent: February 23, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kevin A. Geisner, Kathryn Stone Perez, Stephen G. Latta, Ben J. Sugden, Benjamin I. Vaught, Alex Aben-Athar Kipman, Michael J. Scavezze, Daniel J. McCulloch, Darren Bennett, Jason Scott, Ryan L. Hastings, Brian E. Keane, Christopher E. Miles, Robert L. Crocco, Jr., Mathew J. Lamb
  • Patent number: 9245177
    Abstract: Technology determines whether a gesture of an avatar depicts one of a set of prohibited gestures. An example of a prohibited gesture is a lewd gesture. If the gesture is determined to be a prohibited gesture, the image data for display of the gesture is altered. Some examples of alteration are substitution of image data for the prohibited gesture, or application of a filtering technique to the image data depicting the gesture in order to visually obscure it.
    Type: Grant
    Filed: June 2, 2010
    Date of Patent: January 26, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventor: Kathryn Stone Perez
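The two alteration strategies the abstract names, substitution and obscuring, can be sketched as follows. This is an illustrative toy, not the disclosed implementation; the gesture identifiers, the frame representation, and the coarse-quantization "filter" are all assumptions.

```python
# Hypothetical set of gesture identifiers flagged as prohibited.
PROHIBITED = {"lewd_gesture_a", "lewd_gesture_b"}

def render_gesture(gesture_id, frame, substitute_frame=None):
    """Return the frame to display, altering it if the gesture is prohibited.

    frame: a 2-D list of pixel intensities (a stand-in for real image data).
    """
    if gesture_id not in PROHIBITED:
        return frame                      # allowed gestures pass through
    if substitute_frame is not None:
        return substitute_frame           # substitution of image data
    # Crude obscuring filter: quantize intensities to blur out detail.
    return [[px // 8 * 8 for px in row] for row in frame]

clean = render_gesture("wave", [[255, 128]])
```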
  • Patent number: 9229231
    Abstract: The technology provides for updating printed content with personalized virtual data using a see-through, near-eye, mixed reality display device system. A printed content item, for example a book or magazine, is identified from image data captured by cameras on the display device, and user selection of a printed content selection within the printed content item is identified based on physical action user input, for example eye gaze or a gesture. Virtual data is selected from available virtual data for the printed content selection based on user profile data, and the display device system displays the selected virtual data in a position registered to the position of the printed content selection. In some examples, a task related to the printed content item is determined based on physical action user input, and personalized virtual data is displayed registered to the printed content item in accordance with the task.
    Type: Grant
    Filed: January 9, 2012
    Date of Patent: January 5, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Sheridan Martin Small, Alex Aben-Athar Kipman, Benjamin I. Vaught, Kathryn Stone Perez
  • Patent number: 9213163
    Abstract: The technology provides for automatic alignment of a see-through, near-eye, mixed reality device with an inter-pupillary distance (IPD). A determination is made as to whether a see-through, near-eye, mixed reality display device is aligned with an IPD of a user. If the display device is not aligned with the IPD, the display device is automatically adjusted. In some examples, the alignment determination is based on determinations of whether an optical axis of each display optical system positioned to be seen through by a respective eye is aligned with a pupil of the respective eye in accordance with an alignment criterion. The pupil alignment may be determined based on an arrangement of gaze detection elements for each display optical system, including at least one sensor for capturing data of the respective eye, and the captured data. The captured data may be image data, a combination of image and glint data, or glint data only.
    Type: Grant
    Filed: August 30, 2011
    Date of Patent: December 15, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: John R. Lewis, Yichen Wei, Robert L. Crocco, Benjamin I. Vaught, Kathryn Stone Perez, Alex Aben-Athar Kipman
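Numerically, the alignment check described above amounts to comparing each pupil's detected position with the optical axis of its display optical system and adjusting when the offset exceeds the criterion. The sketch below is a hypothetical one-axis illustration; the tolerance value and coordinates are invented, and the patent's actual criterion and geometry are not specified in the abstract.

```python
# Assumed tolerance for pupil/optical-axis offset (not from the patent).
ALIGNMENT_CRITERION_MM = 1.0

def ipd_adjustment(optical_axis_x_mm, pupil_x_mm):
    """Return the horizontal shift a display optical system would need,
    or 0.0 if the pupil is already within the alignment criterion."""
    offset = pupil_x_mm - optical_axis_x_mm
    return offset if abs(offset) > ALIGNMENT_CRITERION_MM else 0.0

# Each eye's display optical system is checked independently:
left_shift = ipd_adjustment(optical_axis_x_mm=-31.0, pupil_x_mm=-32.5)
right_shift = ipd_adjustment(optical_axis_x_mm=31.0, pupil_x_mm=31.4)
```

Here the left system would be shifted outward while the right one is left alone.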
  • Patent number: 9202443
    Abstract: A see-through head-mounted display and method for operating the display to optimize its performance by referencing a user profile automatically. The identity of the user is determined by performing an iris scan and recognition of the user, enabling user profile information to be retrieved and used to enhance the user's experience with the see-through head-mounted display. The user profile may contain user preferences regarding services providing augmented reality images to the see-through head-mounted display, as well as display adjustment information optimizing the position of display elements in the see-through head-mounted display.
    Type: Grant
    Filed: November 29, 2012
    Date of Patent: December 1, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kathryn Stone Perez, Bob Crocco, Jr., John R. Lewis, Ben Vaught, Alex Aben-Athar Kipman
  • Patent number: 9201243
    Abstract: Embodiments for interacting with an executable virtual object associated with a real object are disclosed. In one example, a method for interacting with an executable virtual object associated with a real object includes receiving sensor input from one or more sensors attached to a portable see-through display device, and obtaining information regarding a location of the user based on the sensor input. The method also includes, if the location includes a real object comprising an associated executable virtual object, determining an intent of the user to interact with the executable virtual object, and, if the intent to interact is determined, interacting with the executable virtual object.
    Type: Grant
    Filed: January 22, 2015
    Date of Patent: December 1, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Ben Sugden, John Clavin, Ben Vaught, Stephen Latta, Kathryn Stone Perez, Daniel McCulloch, Jason Scott, Wei Zhang, Darren Bennett, Ryan Hastings, Arthur Tomlin, Kevin Geisner
  • Patent number: 9182814
    Abstract: A depth image of a scene may be received, observed, or captured by a device. The depth image may include a human target that may have, for example, a portion thereof non-visible or occluded. For example, a user may be turned such that a body part may not be visible to the device, may have one or more body parts partially outside a field of view of the device, may have a body part or a portion of a body part behind another body part or object, or the like, such that the human target associated with the user may also have a body part, or a portion of a body part, that is non-visible or occluded in the depth image. A position or location of the non-visible or occluded portion or body part of the human target associated with the user may then be estimated.
    Type: Grant
    Filed: June 26, 2009
    Date of Patent: November 10, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Alex A. Kipman, Kathryn Stone Perez, Mark J. Finocchio, Ryan Michael Geiss, Kudo Tsunoda
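One simple way to picture the estimation step the abstract ends with: carry an occluded joint forward using the observed motion of a neighboring visible joint. This is an illustrative heuristic only, not the disclosed algorithm, and all joint names and coordinates are assumptions.

```python
def estimate_occluded(last_joint, last_neighbor, current_neighbor):
    """Estimate an occluded joint's 3-D position by translating its last
    known position with the displacement of a visible neighboring joint
    (assuming the limb moved roughly rigidly between frames)."""
    dx = current_neighbor[0] - last_neighbor[0]
    dy = current_neighbor[1] - last_neighbor[1]
    dz = current_neighbor[2] - last_neighbor[2]
    return (last_joint[0] + dx, last_joint[1] + dy, last_joint[2] + dz)

# Hypothetical use: the hand is hidden behind the torso, but the elbow
# is still visible in the depth image, so the hand follows the elbow.
hand = estimate_occluded((0.4, 1.0, 2.0), (0.3, 1.2, 2.0), (0.35, 1.25, 2.1))
```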
  • Patent number: 9183807
    Abstract: The technology provides embodiments for displaying virtual data as printed content by a see-through, near-eye, mixed reality display device system. One or more literary content items registered to a reading object in a field of view of the display device system are displayed with print layout characteristics. Print layout characteristics from a publisher of each literary content item are selected if available. The reading object has a type like a magazine, book, journal or newspaper and may be a real object or a virtual object displayed by the display device system. The reading object type of the virtual object is based on a reading object type associated with a literary content item to be displayed. Virtual augmentation data registered to a literary content item is displayed responsive to detecting user physical action in image data. An example of a physical action is a page flipping gesture.
    Type: Grant
    Filed: January 10, 2012
    Date of Patent: November 10, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Sheridan Martin Small, Alex Aben-Athar Kipman, Benjamin I. Vaught, Kathryn Stone Perez
  • Patent number: 9182815
    Abstract: The technology provides embodiments for making static printed content being viewed through a see-through, mixed reality display device system more dynamic with display of virtual data. A printed content item, for example a book or magazine, is identified from image data captured by cameras on the display device, and user selection of a printed content selection within the printed content item is identified based on physical action user input, for example eye gaze or a gesture. A task in relation to the printed content selection can also be determined based on physical action user input. Virtual data for the printed content selection is displayed in accordance with the task. Additionally, virtual data can be linked to a work embodied in a printed content item. Furthermore, a virtual version of the printed material may be displayed at a more comfortable reading position and with improved visibility of the content.
    Type: Grant
    Filed: December 7, 2011
    Date of Patent: November 10, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Sheridan Martin Small, Alex Aben-Athar Kipman, Benjamin I. Vaught, Kathryn Stone Perez
  • Patent number: 9159151
    Abstract: Data captured with respect to a human may be analyzed and applied to a visual representation of a user such that the visual representation begins to reflect the behavioral characteristics of the user. For example, a system may have a capture device that captures data about the user in the physical space. The system may identify the user's characteristics, tendencies, voice patterns, behaviors, gestures, etc. Over time, the system may learn a user's tendencies and intelligently apply animations to the user's avatar such that the avatar behaves and responds in accordance with the identified behaviors of the user. The animations applied to the avatar may be animations selected from a library of pre-packaged animations, or the animations may be entered and recorded by the user into the avatar's avatar library.
    Type: Grant
    Filed: July 13, 2009
    Date of Patent: October 13, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kathryn Stone Perez, Alex Kipman, Nicholas D. Burton, Andrew Wilson
  • Patent number: 9153195
    Abstract: The technology provides contextual personal information by a mixed reality display device system being worn by a user. A user inputs person selection criteria, and the display system sends a request for data identifying at least one person in a location of the user who satisfies the person selection criteria to a cloud-based application with access to user profile data for multiple users. Upon receiving data identifying the at least one person, the display system outputs data identifying the person if he or she is within the field of view; otherwise, an identifier and an indicator of the person's position in the location are output. Directional sensors on the display device may also be used for determining a position of the person. Cloud-based software can identify and track the positions of people based on image and non-image data from display devices in the location.
    Type: Grant
    Filed: January 30, 2012
    Date of Patent: October 6, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kevin A. Geisner, Darren Bennett, Relja Markovic, Stephen G. Latta, Daniel J. McCulloch, Jason Scott, Ryan L. Hastings, Alex Aben-Athar Kipman, Andrew John Fuller, Jeffrey Neil Margolis, Kathryn Stone Perez, Sheridan Martin Small
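The cloud-side matching step this abstract describes, finding people in the user's location who satisfy the selection criteria, can be sketched as a simple profile filter. The profile fields and criteria format below are entirely made up for illustration; the patent abstract does not specify them.

```python
def match_people(profiles, criteria):
    """Return names of profiles satisfying every key/value criterion."""
    return [p["name"] for p in profiles
            if all(p.get(k) == v for k, v in criteria.items())]

# Hypothetical profiles of people known to be in the user's location.
profiles = [
    {"name": "A", "employer": "Contoso", "interest": "hiking"},
    {"name": "B", "employer": "Fabrikam", "interest": "hiking"},
]
matches = match_people(profiles, {"interest": "hiking", "employer": "Contoso"})
```

The display system would then mark each match in the field of view, or show an identifier and position indicator for matches outside it.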
  • Patent number: D740303
    Type: Grant
    Filed: October 11, 2013
    Date of Patent: October 6, 2015
    Assignee: Microsoft Corporation
    Inventors: Kathryn Stone Perez, Sebastian Andreas Sheldon Grinke, Kristie Joy Fisher, Mathew Lee McInelly, Dana Ludwig, Roy Lewis Herrod, Jason Michael Hewitt, Jeffrey Scott Blazier
  • Patent number: D741371
    Type: Grant
    Filed: October 11, 2013
    Date of Patent: October 20, 2015
    Assignee: Microsoft Corporation
    Inventors: Kathryn Stone Perez, Sebastian Andreas Sheldon Grinke, Kristie Joy Fisher, Mathew Lee McInelly, Dana Ludwig, Roy Lewis Herrod, Jason Michael Hewitt, Jeffrey Scott Blazier
  • Patent number: D742409
    Type: Grant
    Filed: October 4, 2013
    Date of Patent: November 3, 2015
    Assignee: Microsoft Corporation
    Inventors: Kathryn Stone Perez, Sebastian Andreas Sheldon Grinke, Kristie Joy Fisher, Mathew Lee McInelly, Dana Ludwig, Roy Lewis Herrod, Jason Michael Hewitt, Jeffrey Scott Blazier
  • Patent number: D742917
    Type: Grant
    Filed: October 11, 2013
    Date of Patent: November 10, 2015
    Assignee: Microsoft Corporation
    Inventors: Kathryn Stone Perez, Sebastian Andreas Sheldon Grinke, Kristie Joy Fisher, Mathew Lee McInelly, Dana Ludwig, Roy Lewis Herrod, Jason Michael Hewitt, Jeffrey Scott Blazier
  • Patent number: D749128
    Type: Grant
    Filed: October 4, 2013
    Date of Patent: February 9, 2016
    Assignee: Microsoft Corporation
    Inventors: Kathryn Stone Perez, Sebastian Andreas Sheldon Grinke, Kristie Joy Fisher, Mathew Lee McInelly, Dana Ludwig, Roy Lewis Herrod, Jason Michael Hewitt, Jeffrey Scott Blazier
  • Patent number: D749633
    Type: Grant
    Filed: October 4, 2013
    Date of Patent: February 16, 2016
    Assignee: Microsoft Corporation
    Inventors: Kathryn Stone Perez, Sebastian Andreas Sheldon Grinke, Kristie Joy Fisher, Mathew Lee McInelly, Dana Ludwig, Roy Lewis Herrod, Jason Michael Hewitt, Jeffrey Scott Blazier
  • Patent number: D750131
    Type: Grant
    Filed: October 11, 2013
    Date of Patent: February 23, 2016
    Assignee: Microsoft Corporation
    Inventors: Kathryn Stone Perez, Sebastian Andreas Sheldon Grinke, Kristie Joy Fisher, Mathew Lee McInelly, Dana Ludwig, Roy Lewis Herrod, Jason Michael Hewitt, Jeffrey Scott Blazier