Patents by Inventor Kathryn Stone Perez

Kathryn Stone Perez has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20130307855
    Abstract: A system for generating and displaying holographic visual aids associated with a story to an end user of a head-mounted display device while the end user is reading the story or perceiving the story being read aloud is described. The story may be embodied within a reading object (e.g., a book) in which words of the story may be displayed to the end user. The holographic visual aids may include a predefined character animation that is synchronized to a portion of the story corresponding with the character being animated. A reading pace of a portion of the story may be used to control the playback speed of the predefined character animation in real-time such that the character is perceived to be lip-syncing the story being read aloud. In some cases, an existing book without predetermined AR tags may be augmented with holographic visual aids.
    Type: Application
    Filed: May 16, 2012
    Publication date: November 21, 2013
    Inventors: Mathew J. Lamb, Ben J. Sugden, Robert L. Crocco, Jr., Brian E. Keane, Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Alex Aben-Athar Kipman
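The pace-driven playback in the abstract above reduces to a simple scaling rule: the animation clock advances at the ratio of the measured reading pace to the pace the animation was authored for. A minimal sketch follows; the names and the 150 wpm authoring pace are illustrative assumptions, not details from the application.

```python
def playback_rate(measured_wpm: float, authored_wpm: float = 150.0) -> float:
    """Scale factor applied to the predefined character animation."""
    return measured_wpm / authored_wpm

class CharacterAnimation:
    def __init__(self, duration_s: float):
        self.duration_s = duration_s
        self.position_s = 0.0  # current playback position

    def advance(self, dt_s: float, measured_wpm: float) -> None:
        # Advance the animation clock faster or slower than wall time so the
        # character's lip-sync tracks the reader's pace in real time.
        self.position_s = min(
            self.duration_s,
            self.position_s + dt_s * playback_rate(measured_wpm),
        )
```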
  • Publication number: 20130307856
    Abstract: A system for generating and displaying holographic visual aids associated with a story to an end user of a head-mounted display device while the end user is reading the story or perceiving the story being read aloud is described. The story may be embodied within a reading object (e.g., a book) in which words of the story may be displayed to the end user. The holographic visual aids may include a predefined character animation that is synchronized to a portion of the story corresponding with the character being animated. A reading pace of a portion of the story may be used to control the playback speed of the predefined character animation in real-time such that the character is perceived to be lip-syncing the story being read aloud. In some cases, an existing book without predetermined AR tags may be augmented with holographic visual aids.
    Type: Application
    Filed: May 16, 2012
    Publication date: November 21, 2013
    Inventors: Brian E. Keane, Ben J. Sugden, Robert L. Crocco, Jr., Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Mathew J. Lamb, Alex Aben-Athar Kipman
  • Publication number: 20130300653
    Abstract: The technology provides various embodiments for gaze determination within a see-through, near-eye, mixed reality display device. In some embodiments, the boundaries of a gaze detection coordinate system can be determined from a spatial relationship between a user eye and gaze detection elements such as illuminators and at least one light sensor positioned on a support structure such as an eyeglasses frame. The gaze detection coordinate system allows for determination of a gaze vector from each eye based on data representing glints on the user eye, or a combination of image and glint data. A point of gaze may be determined in a three-dimensional user field of view including real and virtual objects. The spatial relationship between the gaze detection elements and the eye may be checked and may trigger a re-calibration of training data sets if the boundaries of the gaze detection coordinate system have changed.
    Type: Application
    Filed: July 12, 2013
    Publication date: November 14, 2013
    Inventors: John R. Lewis, Yichen Wei, Robert L. Crocco, Benjamin I. Vaught, Alex Aben-Athar Kipman, Kathryn Stone Perez
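The "point of gaze ... in a three-dimensional user field of view" from the abstract above can be illustrated with standard geometry: once a gaze ray has been estimated for each eye, the point of gaze is approximately the midpoint of the shortest segment joining the two rays. This is a generic closest-approach computation, not the patented calibration method; all names are illustrative.

```python
import numpy as np

def point_of_gaze(o_l, d_l, o_r, d_r):
    """Midpoint of the shortest segment between two gaze rays.

    o_l/o_r: eye (ray origin) positions; d_l/d_r: gaze directions;
    all 3-element numpy arrays in one display coordinate system.
    """
    d_l = d_l / np.linalg.norm(d_l)
    d_r = d_r / np.linalg.norm(d_r)
    w0 = o_l - o_r
    a, b, c = d_l @ d_l, d_l @ d_r, d_r @ d_r
    d, e = d_l @ w0, d_r @ w0
    denom = a * c - b * b
    if abs(denom) < 1e-9:            # near-parallel rays: gaze at "infinity"
        return None
    t = (b * e - c * d) / denom      # parameter along the left-eye ray
    s = (a * e - b * d) / denom      # parameter along the right-eye ray
    return (o_l + t * d_l + o_r + s * d_r) / 2.0
```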
  • Publication number: 20130293577
    Abstract: A see-through, near-eye, mixed reality display apparatus for providing translations of real world data for a user. A wearer's location and orientation with the apparatus are determined, and input data for translation is selected, using sensors of the apparatus. Input data can be audio or visual in nature and is selected by reference to the gaze of a wearer. The input data is translated for the user with reference to user profile information bearing on the accuracy of a translation, and the system determines from the input data whether a linguistic translation, a knowledge-addition translation, or a context translation is useful.
    Type: Application
    Filed: May 4, 2012
    Publication date: November 7, 2013
    Inventors: Kathryn Stone Perez, John Clavin, Kevin A. Geisner, Stephen G. Latta, Brian J. Mount, Arthur C. Tomlin, Adam G. Poulos
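A toy dispatcher for the three translation kinds the abstract above distinguishes. The classification logic is not specified in the abstract; the profile fields and rules below are assumptions for illustration only.

```python
def choose_translation(text: str, source_lang: str, profile: dict) -> str:
    """Pick which of the three translation kinds to apply."""
    if source_lang not in profile.get("known_languages", ["en"]):
        return "linguistic"          # e.g. translate a foreign-language menu
    if any(t in profile.get("unfamiliar_terms", set()) for t in text.split()):
        return "knowledge_addition"  # e.g. define jargon the wearer won't know
    return "context"                 # e.g. explain what the data means here
```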
  • Publication number: 20130293530
    Abstract: An augmented reality system that provides augmented product and environment information to a wearer of a see-through head-mounted display. The augmentation information may include advertising, inventory, pricing, and other information about products a wearer may be interested in. Interest is determined from wearer actions and a wearer profile. The information may be used to incentivize purchases of real-world products by a wearer, or to allow the wearer to make better purchasing decisions. The augmentation information may enhance a wearer's shopping experience by giving the wearer easy access to important product information while shopping in a retail establishment. Through virtual rendering, a wearer may be provided with feedback on how an item would appear in a wearer environment, such as the wearer's home.
    Type: Application
    Filed: May 4, 2012
    Publication date: November 7, 2013
    Inventors: Kathryn Stone Perez, John Clavin, Kevin A. Geisner, Stephen G. Latta, Brian J. Mount, Arthur C. Tomlin, Adam G. Poulos
  • Publication number: 20130293468
    Abstract: A see-through, near-eye, mixed reality display device and system for collaboration among various users of other such devices and of personal audio/visual devices of more limited capabilities. One or more wearers of a see-through head-mounted display apparatus define a collaboration environment. For the collaboration environment, a selection of collaboration data and the scope of the environment are determined. Virtual representations of the collaboration data are rendered in the field of view of the wearer and of other device users. The wearer defines which persons in the wearer's field of view are included in the collaboration environment and which are entitled to share information within it. If permitted, input on a virtual object from other users in the collaboration environment may be received and applied to change the virtual object.
    Type: Application
    Filed: May 4, 2012
    Publication date: November 7, 2013
    Inventors: Kathryn Stone Perez, John Clavin, Kevin A. Geisner, Stephen G. Latta, Brian J. Mount, Arthur C. Tomlin, Adam G. Poulos
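A minimal data-structure sketch of the wearer-defined collaboration environment in the abstract above: who participates, who may share changes, and a gate applied before another user's input can alter a virtual object. All names and the schema are assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class CollaborationEnvironment:
    owner: str
    participants: set = field(default_factory=set)
    may_edit: set = field(default_factory=set)  # subset allowed to change objects

    def admit(self, person: str, can_edit: bool = False) -> None:
        self.participants.add(person)
        if can_edit:
            self.may_edit.add(person)

    def apply_input(self, user: str, obj: dict, change: dict) -> bool:
        # Input from other users manipulates the virtual object only if allowed.
        if user in self.may_edit:
            obj.update(change)
            return True
        return False
```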
  • Publication number: 20130286178
    Abstract: The technology provides various embodiments for gaze determination within a see-through, near-eye, mixed reality display device. In some embodiments, the boundaries of a gaze detection coordinate system can be determined from a spatial relationship between a user eye and gaze detection elements such as illuminators and at least one light sensor positioned on a support structure such as an eyeglasses frame. The gaze detection coordinate system allows for determination of a gaze vector from each eye based on data representing glints on the user eye, or a combination of image and glint data. A point of gaze may be determined in a three-dimensional user field of view including real and virtual objects. The spatial relationship between the gaze detection elements and the eye may be checked and may trigger a re-calibration of training data sets if the boundaries of the gaze detection coordinate system have changed.
    Type: Application
    Filed: March 15, 2013
    Publication date: October 31, 2013
    Inventors: John R. Lewis, Yichen Wei, Robert L. Crocco, Benjamin I. Vaught, Alex Aben-Athar Kipman, Kathryn Stone Perez
  • Patent number: 8542252
    Abstract: Techniques may comprise identifying surfaces, textures, and object dimensions from unorganized point clouds derived from a capture device, such as a depth sensing device. Employing target digitization may comprise surface extraction, identifying points in a point cloud, labeling surfaces, computing object properties, tracking changes in object properties over time, and increasing confidence in the object boundaries and identity as additional frames are captured. If the point cloud data includes an object, a model of the object may be generated. Feedback of the model associated with a particular object may be generated and provided to the user in real time. Further, the model of the object may be tracked in response to any movement of the object in the physical space such that the model may be adjusted to mimic changes or movement of the object, or to increase the fidelity of the target's characteristics.
    Type: Grant
    Filed: May 29, 2009
    Date of Patent: September 24, 2013
    Assignee: Microsoft Corporation
    Inventors: Kathryn Stone Perez, Alex Kipman, Nicholas Burton, Andrew Wilson, Diego Fernandes Nehab
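One piece of the abstract above, "increasing confidence in the object boundaries and identity as additional frames are captured," can be sketched as a per-object confidence accumulator: frames that re-observe the object nudge confidence up, missed observations decay it. The gain and decay constants are invented for the sketch.

```python
def update_confidence(conf: float, observed: bool,
                      gain: float = 0.1, decay: float = 0.05) -> float:
    """Per-frame confidence update, clamped to [0, 1]."""
    conf = conf + gain * (1.0 - conf) if observed else conf - decay * conf
    return max(0.0, min(1.0, conf))

# e.g. confidence rises asymptotically toward 1.0 over repeated observations:
# c = 0.5
# for _ in range(10): c = update_confidence(c, observed=True)
```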
  • Patent number: 8537075
    Abstract: An environmental-light filter removably coupled to an optical see-through head-mounted display (HMD) device is disclosed. The environmental-light filter couples to the HMD device between a display component and a real-world scene. Coupling features are provided to allow the filter to be easily and removably attached to the HMD device when desired by a user. The filter increases the primacy of a provided augmented-reality image with respect to a real-world scene and reduces brightness and power consumption requirements for presenting the augmented-reality image. A plurality of filters of varied light transmissivity may be provided from which to select a desired filter based on environmental lighting conditions and user preference. The light transmissivity of the filter may range from about 70% transmissive to substantially or completely opaque.
    Type: Grant
    Filed: December 3, 2012
    Date of Patent: September 17, 2013
    Inventors: Robert Crocco, Ben Sugden, Kathryn Stone-Perez
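The "plurality of filters of varied light transmissivity" selected "based on environmental lighting conditions" in the abstract above suggests a simple lookup. The lux thresholds and transmissivity values below are invented for the sketch; the patent does not specify them.

```python
def pick_filter(ambient_lux: float) -> float:
    """Suggest a filter transmissivity (fraction of environmental light passed)."""
    if ambient_lux < 500:        # typical indoor lighting
        return 0.70
    if ambient_lux < 10_000:     # overcast daylight
        return 0.40
    return 0.15                  # bright sun; darker filters remain available
```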
  • Publication number: 20130194164
    Abstract: Embodiments for interacting with an executable virtual object associated with a real object are disclosed. In one example, a method for interacting with an executable virtual object associated with a real object includes receiving sensor input from one or more sensors attached to the portable see-through display device, and obtaining information regarding a location of the user based on the sensor input. The method also includes, if the location includes a real object comprising an associated executable virtual object, then determining an intent of the user to interact with the executable virtual object, and if the intent to interact is determined, then interacting with the executable object.
    Type: Application
    Filed: January 27, 2012
    Publication date: August 1, 2013
    Inventors: Ben Sugden, John Clavin, Ben Vaught, Stephen Latta, Kathryn Stone Perez, Daniel McCulloch, Jason Scott, Wei Zhang, Darren Bennett, Ryan Hastings, Arthur Tomlin, Kevin Geisner
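A sketch of the two-step flow in the abstract above: find a real object with an associated executable virtual object, infer intent to interact (here approximated by sustained gaze), then execute. The dwell threshold and registry shape are assumptions.

```python
import time

DWELL_S = 1.5  # assumed dwell threshold; not specified by the abstract

def maybe_interact(gazed_object, gaze_start: float, registry: dict) -> bool:
    """registry maps real-object IDs to executable virtual objects (callables)."""
    virtual = registry.get(gazed_object)
    if virtual is None:
        return False
    if time.monotonic() - gaze_start >= DWELL_S:  # intent inferred from dwell
        virtual()                                 # interact: run the virtual object
        return True
    return False
```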
  • Patent number: 8487838
    Abstract: The technology provides various embodiments for gaze determination within a see-through, near-eye, mixed reality display device. In some embodiments, the boundaries of a gaze detection coordinate system can be determined from a spatial relationship between a user eye and gaze detection elements such as illuminators and at least one light sensor positioned on a support structure such as an eyeglasses frame. The gaze detection coordinate system allows for determination of a gaze vector from each eye based on data representing glints on the user eye, or a combination of image and glint data. A point of gaze may be determined in a three-dimensional user field of view including real and virtual objects. The spatial relationship between the gaze detection elements and the eye may be checked and may trigger a re-calibration of training data sets if the boundaries of the gaze detection coordinate system have changed.
    Type: Grant
    Filed: August 30, 2011
    Date of Patent: July 16, 2013
    Inventors: John R. Lewis, Yichen Wei, Robert L. Crocco, Benjamin I. Vaught, Alex Aben-Athar Kipman, Kathryn Stone Perez
  • Publication number: 20130169683
    Abstract: A see-through head-mounted display and method for operating the display to optimize its performance by automatically referencing a user profile. The identity of the user is determined by performing an iris scan and recognition of the user, enabling user profile information to be retrieved and used to enhance the user's experience with the see-through head-mounted display. The user profile may contain user preferences regarding services providing augmented reality images to the see-through head-mounted display, as well as display adjustment information optimizing the position of display elements in the see-through head-mounted display.
    Type: Application
    Filed: November 29, 2012
    Publication date: July 4, 2013
    Inventors: Kathryn Stone Perez, Bob Crocco, Jr., John R. Lewis, Ben Vaught, Alex Aben-Athar Kipman
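An illustrative flow for the abstract above: an iris code captured at wear time is matched against enrolled users, and the matching profile drives display adjustment and service preferences. The matching metric shown (Hamming distance over binary iris codes) is a common generic choice, and the `display` methods and schema are hypothetical, not the patent's API.

```python
def hamming(a: bytes, b: bytes) -> int:
    """Bit-level distance between two equal-length iris codes."""
    return sum(bin(x ^ y).count("1") for x, y in zip(a, b))

def on_device_worn(iris_code: bytes, enrolled: dict, display) -> None:
    # enrolled: user_id -> {"code": bytes, "profile": dict}  (assumed schema)
    user_id = min(enrolled, key=lambda u: hamming(iris_code, enrolled[u]["code"]))
    profile = enrolled[user_id]["profile"]
    display.set_element_positions(profile["display_adjustment"])  # hypothetical
    display.use_services(profile["ar_service_preferences"])       # hypothetical
```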
  • Publication number: 20130162505
    Abstract: An environmental-light filter removably coupled to an optical see-through head-mounted display (HMD) device is disclosed. The environmental-light filter couples to the HMD device between a display component and a real-world scene. Coupling features are provided to allow the filter to be easily and removably attached to the HMD device when desired by a user. The filter increases the primacy of a provided augmented-reality image with respect to a real-world scene and reduces brightness and power consumption requirements for presenting the augmented-reality image. A plurality of filters of varied light transmissivity may be provided from which to select a desired filter based on environmental lighting conditions and user preference. The light transmissivity of the filter may range from about 70% transmissive to substantially or completely opaque.
    Type: Application
    Filed: December 3, 2012
    Publication date: June 27, 2013
    Inventors: Robert Crocco, Ben Sugden, Kathryn Stone-Perez
  • Publication number: 20130147838
    Abstract: The technology provides for updating printed content with personalized virtual data using a see-through, near-eye, mixed reality display device system. A printed content item, for example a book or magazine, is identified from image data captured by cameras on the display device, and user selection of a printed content selection within the printed content item is identified based on physical action user input, for example eye gaze or a gesture. Virtual data is selected from available virtual data for the printed content selection based on user profile data, and the display device system displays the selected virtual data in a position registered to the position of the printed content selection. In some examples, a task related to the printed content item is determined based on physical action user input, and personalized virtual data is displayed registered to the printed content item in accordance with the task.
    Type: Application
    Filed: January 9, 2012
    Publication date: June 13, 2013
    Inventors: Sheridan Martin Small, Alex Aben-Athar Kipman, Benjamin I. Vaught, Kathryn Stone Perez
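A much-simplified sketch of the personalization step in the abstract above: candidate virtual data for a recognized printed selection is scored against user profile fields, and the winner is displayed registered to the selection's tracked position. The field names and scoring are assumptions, not the patent's schema.

```python
def select_virtual_data(candidates: list, profile: dict):
    """Rank candidate virtual data items against the user profile."""
    def score(c):
        s = 0
        s += c.get("reading_level") == profile.get("reading_level")
        s += bool(set(c.get("topics", [])) & set(profile.get("interests", [])))
        return s
    return max(candidates, key=score, default=None)

def display_for_selection(selection, candidates, profile, renderer) -> None:
    data = select_virtual_data(candidates, profile)
    if data:
        # register the virtual data to the tracked pose of the print selection
        renderer.draw(data["content"], pose=selection.pose)  # hypothetical API
```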
  • Publication number: 20130147836
    Abstract: The technology provides embodiments for making static printed content being viewed through a see-through, mixed reality display device system more dynamic with display of virtual data. A printed content item, for example a book or magazine, is identified from image data captured by cameras on the display device, and user selection of a printed content selection within the printed content item is identified based on physical action user input, for example eye gaze or a gesture. A task in relation to the printed content selection can also be determined based on physical action user input. Virtual data for the printed content selection is displayed in accordance with the task. Additionally, virtual data can be linked to a work embodied in a printed content item. Furthermore, a virtual version of the printed material may be displayed at a more comfortable reading position and with improved visibility of the content.
    Type: Application
    Filed: December 7, 2011
    Publication date: June 13, 2013
    Inventors: Sheridan Martin Small, Alex Aben-Athar Kipman, Benjamin I. Vaught, Kathryn Stone Perez
  • Publication number: 20130147687
    Abstract: The technology provides embodiments for displaying virtual data as printed content by a see-through, near-eye, mixed reality display device system. One or more literary content items registered to a reading object in a field of view of the display device system are displayed with print layout characteristics. Print layout characteristics from a publisher of each literary content item are selected if available. The reading object has a type, such as a magazine, book, journal, or newspaper, and may be a real object or a virtual object displayed by the display device system. The reading object type of the virtual object is based on a reading object type associated with a literary content item to be displayed. Virtual augmentation data registered to a literary content item is displayed responsive to detecting user physical action in image data. An example of a physical action is a page flipping gesture.
    Type: Application
    Filed: January 10, 2012
    Publication date: June 13, 2013
    Inventors: Sheridan Martin Small, Alex Aben-Athar Kipman, Benjamin I. Vaught, Kathryn Stone Perez
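The layout-selection rule in the abstract above, "print layout characteristics from a publisher ... are selected if available," suggests a prefer-publisher-else-default lookup keyed on the reading object type. The structures and values below are illustrative only.

```python
DEFAULT_LAYOUT = {
    "book":      {"columns": 1, "font_pt": 11, "margin_mm": 20},
    "magazine":  {"columns": 2, "font_pt": 9,  "margin_mm": 12},
    "newspaper": {"columns": 5, "font_pt": 8,  "margin_mm": 8},
}

def layout_for(item: dict) -> dict:
    """Prefer the publisher's layout; fall back to reading-object-type defaults."""
    publisher_layout = item.get("publisher_layout")  # supplied by publisher, if any
    if publisher_layout:
        return publisher_layout
    return DEFAULT_LAYOUT.get(item.get("reading_object_type", "book"),
                              DEFAULT_LAYOUT["book"])
```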
  • Publication number: 20130137076
    Abstract: Technology disclosed herein provides for the use of HMDs in a classroom setting, including for holographic instruction. In one embodiment, the HMD is used for social coaching. User profile information may be used to tailor instruction to a specific user based on known skills, learning styles, and/or characteristics. One or more individuals may be monitored based on sensor data, which may come from an HMD. The monitoring results may be analyzed to determine how to enhance an experience, for example by presenting an image in at least one head-mounted display worn by the one or more individuals.
    Type: Application
    Filed: November 30, 2011
    Publication date: May 30, 2013
    Inventors: Kathryn Stone Perez, Kevin A. Geisner, Ben J. Sugden, Daniel J. McCulloch, John Clavin
  • Publication number: 20130095924
    Abstract: Technology is described for providing a personalized sport performance experience with three-dimensional (3D) virtual data displayed by a near-eye, augmented reality display of a personal audiovisual (A/V) apparatus. A physical movement recommendation is determined for the user performing a sport based on skills data for the user for the sport, physical characteristics of the user, and 3D space positions for at least one or more sport objects. 3D virtual data depicting one or more visual guides for assisting the user in performing the physical movement recommendation may be displayed from a user perspective associated with a display field of view of the near-eye AR display. An avatar performing a sport may also be displayed by the near-eye AR display. The avatar may perform the sport interactively with the user or may replay a prior performance of the individual represented by the avatar.
    Type: Application
    Filed: September 28, 2012
    Publication date: April 18, 2013
    Inventors: Kevin A. Geisner, Kathryn Stone Perez, Stephen G. Latta, Ben J. Sugden, Benjamin I. Vaught, Alex Aben-Athar Kipman, John Clavin
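A toy version of the recommendation inputs named in the abstract above: skills data and 3D positions of sport objects feed a movement recommendation, here a golf-style club suggestion by distance. The carry table and skill factor are invented for illustration.

```python
import math

def recommend(skills: dict, player_pos, target_pos) -> str:
    """Pick a club from distance to target, scaled by a per-user skill factor."""
    dist_m = math.dist(player_pos, target_pos)
    factor = skills.get("drive_factor", 1.0)  # from the user's skills data
    for club, carry_m in [("wedge", 100), ("7-iron", 140), ("driver", 220)]:
        if dist_m <= carry_m * factor:
            return club
    return "driver"
```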
  • Patent number: 8418085
    Abstract: A capture device may capture a user's motion and a display device may display a model that maps to the user's motion, including gestures that are applicable for control. A user may be unfamiliar with a system that maps the user's motions, or may not know or understand how to perform the gestures that are applicable for an executing application. User motion data and/or outputs of filters corresponding to gestures may be analyzed to determine those cases where assistance to the user in performing the gesture is appropriate.
    Type: Grant
    Filed: May 29, 2009
    Date of Patent: April 9, 2013
    Assignee: Microsoft Corporation
    Inventors: Gregory N. Snook, Stephen Latta, Kevin Geisner, Darren Alexander Bennett, Kudo Tsunoda, Alex Kipman, Kathryn Stone Perez
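One way to read the trigger logic the abstract above implies: when a gesture filter's output repeatedly lands just below its recognition threshold, the user is probably attempting the gesture and assistance is appropriate. A sketch follows; the thresholds and window size are assumptions.

```python
from collections import deque

class AssistanceDetector:
    def __init__(self, recognize_at=0.8, near_miss_at=0.5, window=30, misses=10):
        self.recognize_at, self.near_miss_at = recognize_at, near_miss_at
        self.recent = deque(maxlen=window)   # recent gesture-filter outputs
        self.misses = misses

    def update(self, filter_output: float) -> bool:
        """Return True when help with performing the gesture should be offered."""
        self.recent.append(filter_output)
        near = sum(self.near_miss_at <= v < self.recognize_at
                   for v in self.recent)
        return near >= self.misses
```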
  • Publication number: 20130083018
    Abstract: A system for generating an augmented reality environment using state-based virtual objects is described. A state-based virtual object may be associated with a plurality of different states. Each state of the plurality of different states may correspond with a unique set of triggering events different from those of any other state. The set of triggering events associated with a particular state may be used to determine when a state change from the particular state is required. In some cases, each state of the plurality of different states may be associated with a different 3-D model or shape. The plurality of different states may be defined using a predetermined and standardized file format that supports state-based virtual objects. In some embodiments, one or more potential state changes from a particular state may be predicted based on one or more triggering probabilities associated with the set of triggering events.
    Type: Application
    Filed: March 27, 2012
    Publication date: April 4, 2013
    Inventors: Kevin A. Geisner, Stephen G. Latta, Ben J. Sugden, Benjamin I. Vaught, Alex Aben-Athar Kipman, Kathryn Stone Perez
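A compact sketch of the state-based virtual object described in the abstract above: each state names a 3-D model and maps triggering events to next states, with optional triggering probabilities usable for predicting (and pre-loading) likely transitions. The schema and values are assumptions, not the "predetermined and standardized file format" itself.

```python
STATES = {
    "closed": {"model": "chest_closed.obj",
               "on": {"user_touch": "open"},          # triggering events
               "p":  {"user_touch": 0.7}},            # triggering probabilities
    "open":   {"model": "chest_open.obj",
               "on": {"user_touch": "closed", "timeout": "closed"}},
}

def step(state: str, event: str) -> str:
    """Apply a triggering event; stay put if the event is not in this state's set."""
    return STATES[state]["on"].get(event, state)

def predicted_next(state: str, threshold: float = 0.5) -> list:
    """States worth pre-loading, ranked by triggering probability."""
    probs = STATES[state].get("p", {})
    return [STATES[state]["on"][e] for e, p in probs.items() if p >= threshold]

# e.g. step("closed", "user_touch") -> "open";
#      predicted_next("closed")     -> ["open"]  (pre-load chest_open.obj)
```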