Patents by Inventor Stephen G. Latta

Stephen G. Latta has filed for patents to protect the following inventions. This listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 10645525
    Abstract: Examples are disclosed that relate to devices and methods for sharing geo-located information between different devices. In one example, a method comprises receiving the geo-located information from a first user device having a first data type compatible with a first output component of the device, receiving first sensor data from the first device, determining a location of the geo-located information within a coordinate system in a physical environment, determining that a second user device is located in the physical environment, determining that the second device does not comprise an output component that is compatible with the first data type, transforming the geo-located information into a second data type compatible with a second output component of the second device, determining that the second device is proximate to the location of the geo-located information, and sending the geo-located information to the second device for output by the second output component.
    Type: Grant
    Filed: July 24, 2019
    Date of Patent: May 5, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kenneth Liam Kiemele, Donna Katherine Long, Bryant Daniel Hawthorne, Anthony Ernst, Kendall Clark York, Jeffrey Sipko, Janet Lynn Schneider, Christian Michael Sadak, Stephen G. Latta
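The transform-and-forward flow this abstract describes (check proximity, check output compatibility, transcode the geo-located information if needed) can be sketched as follows. This is a minimal illustration; the class names, the transform table, and the 5 m proximity threshold are assumptions, not details from the patent.

```python
from dataclasses import dataclass
from math import dist

PROXIMITY_M = 5.0  # assumed proximity threshold


@dataclass
class Device:
    name: str
    position: tuple    # (x, y, z) in the shared physical coordinate system
    output_types: set  # data types the device's output components can render


def share_geo_info(info_type, info_pos, target: Device, transforms):
    """Return the data type to send to `target`, transcoding if needed.

    `transforms` is a set of (source_type, target_type) pairs the system
    knows how to convert between. Returns None if the target is not near
    the geo-located information or no compatible type exists.
    """
    if dist(info_pos, target.position) > PROXIMITY_M:
        return None  # target device is not proximate to the information
    if info_type in target.output_types:
        return info_type  # directly compatible with an output component
    # Transform into a second data type the target's output supports
    for out_type in target.output_types:
        if (info_type, out_type) in transforms:
            return out_type
    return None
```

For example, a hologram annotation shared by an HMD could be transcoded to audio for a nearby phone that lacks a holographic display.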
  • Publication number: 20190349706
    Abstract: Examples are disclosed that relate to devices and methods for sharing geo-located information between different devices. In one example, a method comprises receiving the geo-located information from a first user device having a first data type compatible with a first output component of the device, receiving first sensor data from the first device, determining a location of the geo-located information within a coordinate system in a physical environment, determining that a second user device is located in the physical environment, determining that the second device does not comprise an output component that is compatible with the first data type, transforming the geo-located information into a second data type compatible with a second output component of the second device, determining that the second device is proximate to the location of the geo-located information, and sending the geo-located information to the second device for output by the second output component.
    Type: Application
    Filed: July 24, 2019
    Publication date: November 14, 2019
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Kenneth Liam Kiemele, Donna Katherine Long, Bryant Daniel Hawthorne, Anthony Ernst, Kendall Clark York, Jeffrey Sipko, Janet Lynn Schneider, Christian Michael Sadak, Stephen G. Latta
  • Publication number: 20190342696
    Abstract: Examples are disclosed that relate to devices and methods for sharing geo-located information between different devices. In one example, a method comprises receiving the geo-located information from a first user device having a first data type compatible with a first output component of the device, receiving first sensor data from the first device, determining a location of the geo-located information within a coordinate system in a physical environment, determining that a second user device is located in the physical environment, determining that the second device does not comprise an output component that is compatible with the first data type, transforming the geo-located information into a second data type compatible with a second output component of the second device, determining that the second device is proximate to the location of the geo-located information, and sending the geo-located information to the second device for output by the second output component.
    Type: Application
    Filed: May 4, 2018
    Publication date: November 7, 2019
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Kenneth Liam Kiemele, Donna Katherine Long, Bryant Daniel Hawthorne, Anthony Ernst, Kendall Clark York, Jeffrey Sipko, Janet Lynn Schneider, Christian Michael Sadak, Stephen G. Latta
  • Patent number: 10455351
    Abstract: Examples are disclosed that relate to devices and methods for sharing geo-located information between different devices. In one example, a method comprises receiving the geo-located information from a first user device having a first data type compatible with a first output component of the device, receiving first sensor data from the first device, determining a location of the geo-located information within a coordinate system in a physical environment, determining that a second user device is located in the physical environment, determining that the second device does not comprise an output component that is compatible with the first data type, transforming the geo-located information into a second data type compatible with a second output component of the second device, determining that the second device is proximate to the location of the geo-located information, and sending the geo-located information to the second device for output by the second output component.
    Type: Grant
    Filed: May 4, 2018
    Date of Patent: October 22, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kenneth Liam Kiemele, Donna Katherine Long, Bryant Daniel Hawthorne, Anthony Ernst, Kendall Clark York, Jeffrey Sipko, Janet Lynn Schneider, Christian Michael Sadak, Stephen G. Latta
  • Patent number: 10223832
    Abstract: The technology provides contextual personal information via a mixed reality display device system worn by a user. A user inputs person selection criteria, and the display system sends a request for data identifying at least one person, in the user's location, who satisfies the person selection criteria to a cloud-based application with access to user profile data for multiple users. Upon receiving data identifying the at least one person, the display system outputs data identifying the person if he or she is within the field of view; if not, an identifier and a position indicator of the person in the location are output. Directional sensors on the display device may also be used for determining a position of the person. Cloud-based software can identify and track the positions of people based on image and non-image data from display devices in the location.
    Type: Grant
    Filed: September 25, 2015
    Date of Patent: March 5, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kevin A. Geisner, Darren Bennett, Relja Markovic, Stephen G. Latta, Daniel J. McCulloch, Jason Scott, Ryan L. Hastings, Alex Aben-Athar Kipman, Andrew John Fuller, Jeffrey Neil Margolis, Kathryn Stone Perez, Sheridan Martin Small
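The selection-and-output flow in this abstract (filter people at the user's location by the selection criteria, then report either an in-view identity or an identifier plus position indicator) could look roughly like this. All field names and the dictionary-based profile model are illustrative assumptions.

```python
def find_matches(people, criteria):
    """People in the location whose profile satisfies every criterion."""
    return [p for p in people
            if all(p["profile"].get(k) == v for k, v in criteria.items())]


def describe(person, field_of_view):
    """Output for one matched person.

    If the person is within the field of view, identify them directly;
    otherwise return an identifier plus a position indicator instead.
    """
    if person["id"] in field_of_view:
        return {"id": person["id"], "in_view": True}
    return {"id": person["id"], "in_view": False,
            "position": person["position"]}
```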
  • Patent number: 10132633
    Abstract: The technology causes disappearance of a real object in a field of view of a see-through, mixed reality display device system based on user disappearance criteria. Image data is tracked to the real object in the field of view of the see-through display for implementing an alteration technique on the real object causing its disappearance from the display. A real object may satisfy user disappearance criteria by being associated with subject matter that the user does not wish to see or by not satisfying relevance criteria for a current subject matter of interest to the user. In some embodiments, based on a 3D model of a location of the display device system, an alteration technique may be selected for a real object based on a visibility level associated with the position within the location. Image data for alteration may be prefetched based on a location of the display device system.
    Type: Grant
    Filed: December 29, 2015
    Date of Patent: November 20, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: James C. Liu, Stephen G. Latta, Benjamin I. Vaught, Christopher M. Novak, Darren Bennett
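The two decision points in this abstract, whether an object satisfies the user's disappearance criteria and which alteration technique to apply given the position's visibility level, can be sketched as below. The criteria model, technique names, and thresholds are all assumptions for illustration.

```python
def should_disappear(obj_subjects, blocked, interests):
    """A real object disappears if its subject matter is one the user
    does not wish to see, or if it fails the relevance criteria for the
    user's current subject matter of interest."""
    if obj_subjects & blocked:
        return True
    return not (obj_subjects & interests)


def pick_alteration(visibility_level):
    """Choose an alteration technique for the object's position; more
    visible positions get a more careful technique (names invented)."""
    if visibility_level >= 0.7:
        return "redirect_attention"
    if visibility_level >= 0.3:
        return "blur_out"
    return "erase"
```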
  • Patent number: 10062213
    Abstract: A system for generating a virtual gaming environment based on features identified within a real-world environment, and adapting the virtual gaming environment over time as the features identified within the real-world environment change is described. Utilizing the technology described, a person wearing a head-mounted display device (HMD) may walk around a real-world environment and play a virtual game that is adapted to that real-world environment. For example, the HMD may identify environmental features within a real-world environment such as five grassy areas and two cars, and then spawn virtual monsters based on the location and type of the environmental features identified. The location and type of the environmental features identified may vary depending on the particular real-world environment in which the HMD exists and therefore each virtual game may look different depending on the particular real-world environment.
    Type: Grant
    Filed: August 11, 2016
    Date of Patent: August 28, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Brian J. Mount, Jason Scott, Ryan L. Hastings, Darren Bennett, Stephen G. Latta, Daniel J. McCulloch, Kevin A. Geisner, Jonathan T. Steed, Michael J. Scavezze
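The core mapping this abstract describes, from environmental features the HMD identifies to spawned game entities, reduces to a lookup over (feature type, position) pairs. The spawn table and monster names are invented for illustration.

```python
# Assumed mapping from identified real-world feature types to game entities
SPAWN_TABLE = {
    "grass": "swamp_monster",
    "car": "robot_boss",
}


def spawn_monsters(features):
    """features: list of (feature_type, position) pairs from the HMD's
    environment scan. Returns (entity, position) pairs to place in the
    virtual gaming environment; unknown feature types spawn nothing."""
    return [(SPAWN_TABLE[ftype], pos)
            for ftype, pos in features if ftype in SPAWN_TABLE]
```

With the abstract's example of five grassy areas and two cars, this yields seven spawned entities whose placement depends on the particular real-world environment.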
  • Patent number: 10007349
    Abstract: Methods for recognizing gestures using adaptive multi-sensor gesture recognition are described. In some embodiments, a gesture recognition system receives a plurality of sensor inputs from a plurality of sensor devices and a plurality of confidence thresholds associated with the plurality of sensor inputs. A confidence threshold specifies a minimum confidence value for which it is deemed that a particular gesture has occurred. Upon detection of a compensating event, such as excessive motion involving one of the plurality of sensor devices, the gesture recognition system may modify the plurality of confidence thresholds based on the compensating event. Subsequently, the gesture recognition system generates a multi-sensor confidence value based on whether at least a subset of the plurality of confidence thresholds has been satisfied. The gesture recognition system may also modify the plurality of confidence thresholds based on the plugging and unplugging of sensor inputs from the gesture recognition system.
    Type: Grant
    Filed: May 4, 2015
    Date of Patent: June 26, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Stephen G. Latta, Brian J. Mount, Adam G. Poulos, Jeffrey A. Kohler, Arthur C. Tomlin, Jonathan T. Steed
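The threshold-adaptation scheme in this abstract can be sketched as a small recognizer that raises a sensor's confidence threshold on a compensating event (such as excessive motion) and drops it entirely when the sensor is unplugged. This is a simplified sketch: it requires every remaining sensor to meet its threshold, whereas the patent allows detection when at least a subset of thresholds is satisfied; the penalty value and sensor names are assumptions.

```python
class GestureRecognizer:
    def __init__(self, thresholds):
        # sensor name -> minimum confidence for deeming a gesture occurred
        self.thresholds = dict(thresholds)

    def compensate(self, sensor, penalty=0.2):
        """Raise a sensor's threshold after a compensating event, e.g.
        excessive motion involving that sensor device."""
        if sensor in self.thresholds:
            self.thresholds[sensor] = min(1.0, self.thresholds[sensor] + penalty)

    def unplug(self, sensor):
        """Stop considering a sensor that has been unplugged."""
        self.thresholds.pop(sensor, None)

    def detected(self, confidences):
        """Combine per-sensor confidence values into a detection decision."""
        return all(confidences.get(s, 0.0) >= t
                   for s, t in self.thresholds.items())
```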
  • Patent number: 9898865
    Abstract: A method for operating a computing device is described herein. The method includes determining a user's gaze direction based on a gaze input, determining an intersection between the user's gaze direction and an identified environmental surface in a 3-dimensional environment, and generating a drawing surface based on the intersection within a user interface on a display.
    Type: Grant
    Filed: June 22, 2015
    Date of Patent: February 20, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Joe Thompson, Dan Osborn, Tarek Hefny, Stephen G. Latta, Forest Woodcroft Gouin, James Nakashima, Megan Saunders, Anatolie Gavriliuc, Alberto E. Cerriteno, Shawn Crispin Wright
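The geometric core of this method, finding where the gaze direction meets an identified environmental surface so a drawing surface can be generated there, is a ray-plane intersection. A minimal sketch, treating the surface as a plane given by a point and a normal (function and parameter names are assumptions):

```python
def gaze_surface_intersection(origin, direction, plane_point, plane_normal):
    """Return the 3D point where the gaze ray hits the plane, or None if
    the gaze is parallel to the surface or the surface is behind the user.
    All arguments are 3-tuples; `direction` need not be normalized."""
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    denom = dot(direction, plane_normal)
    if abs(denom) < 1e-9:
        return None  # gaze direction parallel to the surface
    diff = tuple(p - o for p, o in zip(plane_point, origin))
    t = dot(diff, plane_normal) / denom
    if t < 0:
        return None  # surface lies behind the viewer
    return tuple(o + t * d for o, d in zip(origin, direction))
```

Looking straight ahead from the origin at a wall two meters away would place the drawing surface at that wall.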
  • Patent number: 9767524
    Abstract: Technology is provided for transferring a right to a digital content item based on one or more physical actions detected in data captured by a see-through, augmented reality display device system. A digital content item may be represented by a three-dimensional (3D) virtual object displayed by the device system. A user can hold the virtual object in some examples, and transfer a right to the content item the object represents by handing the object to another user within a defined distance, who indicates acceptance of the right based upon one or more physical actions including taking hold of the transferred object. Other examples of physical actions performed by a body part of a user may also indicate offer and acceptance in the right transfer. Content may be transferred from display device to display device while rights data is communicated via a network with a service application executing remotely.
    Type: Grant
    Filed: May 18, 2015
    Date of Patent: September 19, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Ryan L. Hastings, Stephen G. Latta, Benjamin I. Vaught, Darren Bennett
  • Patent number: 9606992
    Abstract: Technology is described for resource management based on data including image data of a resource captured by at least one capture device of at least one personal audiovisual (A/V) apparatus including a near-eye, augmented reality (AR) display. A resource is automatically identified from image data captured by at least one capture device of at least one personal A/V apparatus and object reference data. A location in which the resource is situated and a 3D space position or volume of the resource in the location is tracked. A property of the resource is also determined from the image data and tracked. A function of a resource may also be stored for determining whether the resource is usable for a task. Responsive to notification criteria for the resource being satisfied, image data related to the resource is displayed on the near-eye AR display.
    Type: Grant
    Filed: June 27, 2012
    Date of Patent: March 28, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kevin A. Geisner, Kathryn Stone Perez, Stephen G. Latta, Ben J. Sugden, Benjamin I. Vaught, Alex Aben-Athar Kipman, Jeffrey A. Kohler, Daniel J. McCulloch
  • Patent number: 9583032
    Abstract: Technology is disclosed herein to help a user navigate through large amounts of content while wearing a see-through, near-eye, mixed reality display device such as a head mounted display (HMD). The user can use a physical object such as a book to navigate through content being presented in the HMD. In one embodiment, a book has markers on the pages that allow the system to organize the content. The book could have real content, but it could be blank other than the markers. As the user flips through the book, the system recognizes the markers and presents content associated with the respective marker in the HMD.
    Type: Grant
    Filed: June 5, 2012
    Date of Patent: February 28, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Mathew J. Lamb, Ben J. Sugden, Robert L. Crocco, Jr., Brian E. Keane, Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Alex Aben-Athar Kipman, Sheridan Martin Small, Stephen G. Latta
  • Patent number: 9584766
    Abstract: Techniques for implementing an integrative interactive space are described. In implementations, video cameras that are positioned to capture video at different locations are synchronized such that aspects of the different locations can be used to generate an integrated interactive space. The integrated interactive space can enable users at the different locations to interact, such as via video interaction, audio interaction, and so on. In at least some embodiments, techniques can be implemented to adjust an image of a participant during a video session such that the participant appears to maintain eye contact with other video session participants at other locations. Techniques can also be implemented to provide a virtual shared space that can enable users to interact with the space, and can also enable users to interact with one another and/or objects that are displayed in the virtual shared space.
    Type: Grant
    Filed: June 3, 2015
    Date of Patent: February 28, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Vivek Pradeep, Stephen G. Latta, Steven Nabil Bathiche, Kevin Geisner, Alice Jane Bernheim Brush
  • Publication number: 20160371886
    Abstract: A method for operating a computing device is described herein. The method includes determining a user's gaze direction based on a gaze input, determining an intersection between the user's gaze direction and an identified environmental surface in a 3-dimensional environment, and generating a drawing surface based on the intersection within a user interface on a display.
    Type: Application
    Filed: June 22, 2015
    Publication date: December 22, 2016
    Inventors: Joe Thompson, Dan Osborn, Tarek Hefny, Stephen G. Latta, Forest Woodcroft Gouin, James Nakashima, Megan Saunders, Anatolie Gavriliuc, Alberto E. Cerriteno, Shawn Crispin Wright
  • Patent number: 9519640
    Abstract: A see-through, near-eye, mixed reality display apparatus for providing translations of real-world data for a user. The wearer's location and orientation with the apparatus are determined, and input data for translation is selected using sensors of the apparatus. Input data can be audio or visual in nature and is selected by reference to the wearer's gaze. The input data is translated for the user using user profile information bearing on the accuracy of a translation, and the system determines from the input data whether a linguistic translation, a knowledge-addition translation, or a context translation is useful.
    Type: Grant
    Filed: May 4, 2012
    Date of Patent: December 13, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kathryn Stone Perez, John Clavin, Kevin A. Geisner, Stephen G. Latta, Brian J. Mount, Arthur C. Tomlin, Adam G. Poulos
  • Publication number: 20160350978
    Abstract: A system for generating a virtual gaming environment based on features identified within a real-world environment, and adapting the virtual gaming environment over time as the features identified within the real-world environment change is described. Utilizing the technology described, a person wearing a head-mounted display device (HMD) may walk around a real-world environment and play a virtual game that is adapted to that real-world environment. For example, the HMD may identify environmental features within a real-world environment such as five grassy areas and two cars, and then spawn virtual monsters based on the location and type of the environmental features identified. The location and type of the environmental features identified may vary depending on the particular real-world environment in which the HMD exists and therefore each virtual game may look different depending on the particular real-world environment.
    Type: Application
    Filed: August 11, 2016
    Publication date: December 1, 2016
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Brian J. Mount, Jason Scott, Ryan L. Hastings, Darren Bennett, Stephen G. Latta, Daniel J. McCulloch, Kevin A. Geisner, Jonathan T. Steed, Michael J. Scavezze
  • Patent number: 9498718
    Abstract: Disclosed herein are systems and methods for altering a view perspective within a display environment. For example, gesture data corresponding to a plurality of inputs may be stored. These inputs may be provided to a game or application implemented by a computing device. Images of a user of the game or application may be captured; for example, a suitable capture device may capture several images of the user over a period of time. The images may be analyzed and processed to detect the user's gesture. Aspects of the user's gesture may be compared to the stored gesture data to determine an intended gesture input for the user. The comparison may be part of an analysis for determining inputs corresponding to the gesture data, where one or more of the inputs are provided to the game or application and cause a view perspective within the display environment to be altered.
    Type: Grant
    Filed: May 29, 2009
    Date of Patent: November 22, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Stephen G. Latta, Gregory N. Snook, Justin McBride, Arthur Charles Tomlin, Peter Sarrett, Kevin Geisner, Relja Markovic, Christopher Vuchetich
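The comparison step this abstract describes, matching an observed gesture against stored gesture data to determine the intended view-perspective input, can be sketched as a nearest-prototype lookup. The stored prototypes, the 2D feature vectors, the input names, and the distance cutoff are all invented for illustration.

```python
# Stored gesture data: gesture name -> prototype feature vector (assumed 2D)
STORED = {
    "lean_left": (-1.0, 0.0),
    "lean_right": (1.0, 0.0),
    "step_forward": (0.0, 1.0),
}

# Mapping from recognized gesture to a view-perspective input (assumed)
VIEW_INPUT = {
    "lean_left": "pan_left",
    "lean_right": "pan_right",
    "step_forward": "zoom_in",
}


def intended_input(observed, max_dist=0.5):
    """Compare the observed gesture vector to each stored prototype and
    return the view-perspective input for the closest match, or None if
    no prototype is close enough."""
    best, best_d = None, float("inf")
    for name, proto in STORED.items():
        d = sum((a - b) ** 2 for a, b in zip(observed, proto)) ** 0.5
        if d < best_d:
            best, best_d = name, d
    return VIEW_INPUT[best] if best_d <= max_dist else None
```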
  • Patent number: 9498720
    Abstract: A game can be created, shared and played using a personal audio/visual apparatus such as a head-mounted display device (HMDD). Rules of the game, and a configuration of the game space, can be standard or custom. Boundary points of the game can be defined by a gaze direction of the HMDD, by the user's location, by a model of a physical game space such as an instrumented court or by a template. Players can be identified and notified of the availability of a game using a server push technology. For example, a user in a particular location may be notified of the availability of a game at that location. A server manages the game, including storing the rules, boundaries and a game state. The game state can identify players and their scores. Real world objects can be imaged and provided as virtual objects in the game space.
    Type: Grant
    Filed: April 12, 2012
    Date of Patent: November 22, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kevin A. Geisner, Stephen G. Latta, Ben J. Sugden, Benjamin I. Vaught, Alex Aben-Athar Kipman, Kathryn Stone Perez, Ryan L. Hastings, Jason Scott, Darren A. Bennett, John Clavin, Daniel McCulloch
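The server-managed game state this abstract describes, holding the rules, the boundary points of the game space, and the players with their scores, could be modeled as a single record like the sketch below. The field layout and method names are assumptions, not from the patent.

```python
from dataclasses import dataclass, field


@dataclass
class GameState:
    rules: dict                 # standard or custom rules of the game
    boundary: list              # boundary points defining the game space
    scores: dict = field(default_factory=dict)  # player -> score

    def join(self, player):
        """Register a notified player who accepts the game."""
        self.scores.setdefault(player, 0)

    def award(self, player, points):
        """Update the game state the server manages for this player."""
        self.scores[player] = self.scores.get(player, 0) + points
```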
  • Publication number: 20160292850
    Abstract: The technology described herein includes a see-through, near-eye, mixed reality display device for providing customized experiences for a user. The system can be used in various entertainment, sports, shopping and theme-park situations to provide a mixed reality experience.
    Type: Application
    Filed: March 4, 2016
    Publication date: October 6, 2016
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Kathryn Stone Perez, Stephen G. Latta, Ben J. Sugden, Benjamin I. Vaught, Kevin A. Geisner, Alex Aben-Athar Kipman, Jennifer A. Karr
  • Patent number: 9454849
    Abstract: A system for generating a virtual gaming environment based on features identified within a real-world environment, and adapting the virtual gaming environment over time as the features identified within the real-world environment change is described. Utilizing the technology described, a person wearing a head-mounted display device (HMD) may walk around a real-world environment and play a virtual game that is adapted to that real-world environment. For example, the HMD may identify environmental features within a real-world environment such as five grassy areas and two cars, and then spawn virtual monsters based on the location and type of the environmental features identified. The location and type of the environmental features identified may vary depending on the particular real-world environment in which the HMD exists and therefore each virtual game may look different depending on the particular real-world environment.
    Type: Grant
    Filed: November 29, 2012
    Date of Patent: September 27, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Brian J. Mount, Jason Scott, Ryan L. Hastings, Darren Bennett, Stephen G. Latta, Daniel J. McCulloch, Kevin A. Geisner, Jonathan T. Steed, Michael J. Scavezze