Patents by Inventor Brian E. Keane

Brian E. Keane has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240119959
    Abstract: The present disclosure generally relates to a system and method for obtaining a diagnosis of a mental health condition. An exemplary system can receive an audio input; convert the audio input into a text string; identify a speaker associated with the text string; based on at least a portion of the audio input, determine a predefined audio characteristic of a plurality of predefined audio characteristics; based on the determined audio characteristic, identify an emotion corresponding to the portion of the audio input; generate a set of structured data based on the text string, the speaker, the predefined audio characteristic, and the identified emotion; and provide an output for obtaining the diagnosis of the mental health condition, wherein the output is indicative of at least a portion of the set of structured data.
    Type: Application
    Filed: December 11, 2023
    Publication date: April 11, 2024
    Applicant: The MITRE Corporation
    Inventors: Qian HU, Brian P. Marx, Patricia D. King, Seth-David Donald Dworman, Matthew E. Coarr, Keith A. Crouch, Stelios Melachrinoudis, Cheryl Clark, Terence M. Keane
  • Patent number: 10955665
    Abstract: A see-through head-mounted display apparatus includes code performing a method of choosing an optimal viewing location and perspective for shared-view virtual objects rendered for multiple users in a common environment. Multiple objects and multiple users are taken into account in determining the optimal, common viewing location. The technology allows each user to have a common view of the relative position of the object in the environment.
    Type: Grant
    Filed: June 18, 2013
    Date of Patent: March 23, 2021
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Tom G. Salter, Ben J. Sugden, Daniel Deptford, Robert L. Crocco, Jr., Brian E. Keane, Laura K. Massey, Alex Aben-Athar Kipman, Peter Tobias Kinnebrew, Nicholas Ferianc Kamuda
  • Patent number: 10643389
    Abstract: A system for allowing a virtual object to interact with other virtual objects across different spaces within an augmented reality (AR) environment and to transition between the different spaces is described. An AR environment may include a plurality of spaces, each comprising a bounded area or volume within the AR environment. In one example, an AR environment may be associated with a three-dimensional world space and a two-dimensional object space corresponding with a page of a book within the AR environment. A virtual object within the AR environment may be assigned to the object space and transition from the two-dimensional object space to the three-dimensional world space upon the detection of a space transition event. In some cases, a dual representation of the virtual object may be used to detect interactions between the virtual object and other virtual objects in both the world space and the object space.
    Type: Grant
    Filed: March 28, 2016
    Date of Patent: May 5, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Mathew J. Lamb, Ben J. Sugden, Robert L. Crocco, Jr., Brian E. Keane, Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Alex Aben-Athar Kipman
  • Patent number: 10175483
    Abstract: A system and method are disclosed for displaying virtual objects in a mixed reality environment in a way that is optimal and comfortable for a user to interact with the virtual objects. When a user is not focused on the virtual object, which may be a heads-up display, or HUD, the HUD may remain body locked to the user. As such, the user may explore and interact with a mixed reality environment presented by the head mounted display device without interference from the HUD. When a user wishes to view and/or interact with the HUD, the user may look at the HUD. At this point, the HUD may change from a body locked virtual object to a world locked virtual object. The user is then able to view and interact with the HUD from different positions and perspectives.
    Type: Grant
    Filed: June 18, 2013
    Date of Patent: January 8, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Tom G. Salter, Ben J. Sugden, Daniel Deptford, Robert L. Crocco, Jr., Brian E. Keane, Laura K. Massey, Alex Aben-Athar Kipman, Peter Tobias Kinnebrew, Nicholas Ferianc Kamuda
  • Patent number: 10139623
    Abstract: A method and apparatus for the creation of a perspective-locked virtual object in world space. The virtual object may be consumed by another user with a consumption device at a location, position, and orientation which is the same as, or proximate to, the location, position, and orientation where the virtual object was created. Objects may have one, a few, or many allowable consumption locations, positions, and orientations defined by their creator.
    Type: Grant
    Filed: June 18, 2013
    Date of Patent: November 27, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Brian E. Keane, Ben J. Sugden, Robert L. Crocco, Jr., Daniel Deptford, Tom G. Salter, Laura K. Massey, Alex Aben-Athar Kipman, Peter Tobias Kinnebrew, Nicholas Ferianc Kamuda
  • Patent number: 9583032
    Abstract: Technology is disclosed herein to help a user navigate through large amounts of content while wearing a see-through, near-eye, mixed reality display device such as a head mounted display (HMD). The user can use a physical object such as a book to navigate through content being presented in the HMD. In one embodiment, a book has markers on the pages that allow the system to organize the content. The book could have real content, but it could be blank other than the markers. As the user flips through the book, the system recognizes the markers and presents content associated with the respective marker in the HMD.
    Type: Grant
    Filed: June 5, 2012
    Date of Patent: February 28, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Mathew J. Lamb, Ben J. Sugden, Robert L. Crocco, Jr., Brian E. Keane, Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Alex Aben-Athar Kipman, Sheridan Martin Small, Stephen G. Latta
  • Patent number: 9524081
    Abstract: A system for generating and displaying holographic visual aids associated with a story to an end user of a head-mounted display device while the end user is reading the story or perceiving the story being read aloud is described. The story may be embodied within a reading object (e.g., a book) in which words of the story may be displayed to the end user. The holographic visual aids may include a predefined character animation that is synchronized to a portion of the story corresponding with the character being animated. A reading pace of a portion of the story may be used to control the playback speed of the predefined character animation in real-time such that the character is perceived to be lip-syncing the story being read aloud. In some cases, an existing book without predetermined AR tags may be augmented with holographic visual aids.
    Type: Grant
    Filed: April 15, 2015
    Date of Patent: December 20, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Brian E. Keane, Ben J. Sugden, Robert L. Crocco, Jr., Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Mathew J. Lamb, Alex Aben-Athar Kipman
  • Patent number: 9417692
    Abstract: Techniques are provided for rendering, in a see-through, near-eye mixed reality display, a virtual object within a virtual hole, window or cutout. The virtual hole, window or cutout may appear to be within some real world physical object such as a book, table, etc. The virtual object may appear to be just below the surface of the physical object. In a sense, the virtual hole could be considered to be a virtual container that provides developers with additional locations for presenting virtual objects. For example, rather than rendering a virtual object, such as a lamp, in a mixed reality display such that it appears to sit on top of a real world desk, the virtual object is rendered such that it appears to be located below the surface of the desk.
    Type: Grant
    Filed: June 29, 2012
    Date of Patent: August 16, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Mathew J. Lamb, Ben J. Sugden, Robert L. Crocco, Jr., Brian E. Keane, Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Alex Aben-Athar Kipman, Jeffrey Neil Margolis
  • Publication number: 20160210789
    Abstract: A system for allowing a virtual object to interact with other virtual objects across different spaces within an augmented reality (AR) environment and to transition between the different spaces is described. An AR environment may include a plurality of spaces, each comprising a bounded area or volume within the AR environment. In one example, an AR environment may be associated with a three-dimensional world space and a two-dimensional object space corresponding with a page of a book within the AR environment. A virtual object within the AR environment may be assigned to the object space and transition from the two-dimensional object space to the three-dimensional world space upon the detection of a space transition event. In some cases, a dual representation of the virtual object may be used to detect interactions between the virtual object and other virtual objects in both the world space and the object space.
    Type: Application
    Filed: March 28, 2016
    Publication date: July 21, 2016
    Inventors: Mathew J. Lamb, Ben J. Sugden, Robert L. Crocco, Jr., Brian E. Keane, Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Alex Aben-Athar Kipman
  • Patent number: 9384737
    Abstract: A system for generating one or more enhanced audio signals such that one or more sound levels corresponding with sounds received from one or more sources of sound within an environment may be dynamically adjusted based on contextual information is described. The one or more enhanced audio signals may be generated by a head-mounted display device (HMD) worn by an end user within the environment and outputted to earphones associated with the HMD such that the end user may listen to the one or more enhanced audio signals in real-time. In some cases, each of the one or more sources of sound may correspond with a priority level. The priority level may be dynamically assigned depending on whether the end user of the HMD is focusing on a particular source of sound or has specified a predetermined level of importance corresponding with the particular source of sound.
    Type: Grant
    Filed: June 29, 2012
    Date of Patent: July 5, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Mathew J. Lamb, Ben J. Sugden, Robert L. Crocco, Jr., Brian E. Keane, Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Alex Aben-Athar Kipman
  • Patent number: 9329682
    Abstract: A head-mounted display allows user selection of a virtual object through multi-step focusing by the user. Focus on the selectable object is determined and then a validation object is displayed. When user focus moves to the validation object, a timeout determines that a selection of the validation object, and thus of the selectable object, has occurred. The technology can be used in see-through head-mounted displays to allow a user to effectively navigate an environment with a multitude of virtual objects without unintended selections.
    Type: Grant
    Filed: June 18, 2013
    Date of Patent: May 3, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Brian E. Keane, Ben J. Sugden, Robert L. Crocco, Jr., Daniel Deptford, Tom G. Salter, Laura K. Massey, Alex Aben-Athar Kipman, Peter Tobias Kinnebrew, Nicholas Ferianc Kamuda
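    The two-step selection flow described in the abstract above (focus reveals a validation object; dwelling on it past a timeout confirms the selection) can be sketched as a small state machine. This is an illustrative sketch only; the state names, timeout value, and update interface are assumptions, not drawn from the patent:

    ```python
    class GazeSelector:
        """Two-step gaze selection: focusing a selectable object reveals a
        validation object; dwelling on the validation object for `timeout`
        seconds confirms the selection."""

        def __init__(self, timeout=1.0):
            self.timeout = timeout
            self.state = "idle"  # idle -> validating -> selected
            self.dwell = 0.0

        def update(self, focused_target, dt):
            """Advance the state machine by one frame of dt seconds,
            given which target (if any) the user is focusing on."""
            if self.state == "idle":
                if focused_target == "selectable":
                    self.state = "validating"  # display the validation object
                    self.dwell = 0.0
            elif self.state == "validating":
                if focused_target == "validation":
                    self.dwell += dt
                    if self.dwell >= self.timeout:
                        self.state = "selected"
                elif focused_target != "selectable":
                    self.state = "idle"  # gaze left both objects; cancel
            return self.state

    # Focus the object, then dwell on the validation object past the timeout.
    sel = GazeSelector(timeout=0.5)
    sel.update("selectable", 0.1)          # validation object appears
    sel.update("validation", 0.3)
    state = sel.update("validation", 0.3)  # dwell reaches 0.6 s >= 0.5 s
    ```

    The intermediate validation step is what prevents the unintended selections the abstract mentions: merely glancing at a selectable object never selects it.
    
    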
  • Patent number: 9317971
    Abstract: A system for allowing a virtual object to interact with other virtual objects across different spaces within an augmented reality (AR) environment and to transition between the different spaces is described. An AR environment may include a plurality of spaces, each comprising a bounded area or volume within the AR environment. In one example, an AR environment may be associated with a three-dimensional world space and a two-dimensional object space corresponding with a page of a book within the AR environment. A virtual object within the AR environment may be assigned to the object space and transition from the two-dimensional object space to the three-dimensional world space upon the detection of a space transition event. In some cases, a dual representation of the virtual object may be used to detect interactions between the virtual object and other virtual objects in both the world space and the object space.
    Type: Grant
    Filed: June 29, 2012
    Date of Patent: April 19, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Mathew J. Lamb, Ben J. Sugden, Robert L. Crocco, Jr., Brian E. Keane, Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Alex Aben-Athar Kipman
  • Patent number: 9268406
    Abstract: Technology is described for providing a virtual spectator experience for a user of a personal A/V apparatus including a near-eye, augmented reality (AR) display. A position volume of an event object participating in an event in a first 3D coordinate system for a first location is received and mapped to a second position volume in a second 3D coordinate system at a second location remote from where the event is occurring. A display field of view of the near-eye AR display at the second location is determined, and real-time 3D virtual data representing the one or more event objects which are positioned within the display field of view are displayed in the near-eye AR display. A user may select a viewing position from which to view the event. Additionally, virtual data of a second user may be displayed at a position relative to a first user.
    Type: Grant
    Filed: June 29, 2012
    Date of Patent: February 23, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kevin A. Geisner, Kathryn Stone Perez, Stephen G. Latta, Ben J. Sugden, Benjamin I. Vaught, Alex Aben-Athar Kipman, Michael J. Scavezze, Daniel J. McCulloch, Darren Bennett, Jason Scott, Ryan L. Hastings, Brian E. Keane, Christopher E. Miles, Robert L. Crocco, Jr., Mathew J. Lamb
  • Patent number: 9235051
    Abstract: A see-through head mounted display apparatus includes a display and a processor. The processor determines geo-located positions of points of interest within a field of view and generates markers indicating information regarding an associated real world object is available to the user. Markers are rendered in the display relative to the geo-located position and the field of view of the user. When a user selects a marker through a user gesture, the device displays a near-field virtual object having a visual tether to the marker simultaneously with the marker. The user may interact with the marker to view, add or delete information associated with the point of interest.
    Type: Grant
    Filed: June 18, 2013
    Date of Patent: January 12, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Tom G. Salter, Ben J. Sugden, Daniel Deptford, Robert L. Crocco, Jr., Brian E. Keane, Laura K. Massey, Alex Aben-Athar Kipman, Peter Tobias Kinnebrew, Nicholas Ferianc Kamuda
  • Patent number: 9230368
    Abstract: A system and method are disclosed for displaying virtual objects in a mixed reality environment in a way that is optimal and most comfortable for a user to interact with the virtual objects. When a user is moving through the mixed reality environment, the virtual objects may remain world-locked, so that the user can move around and explore the virtual objects from different perspectives. When the user is motionless in the mixed reality environment, the virtual objects may rotate to face the user so that the user can easily view and interact with the virtual objects.
    Type: Grant
    Filed: May 23, 2013
    Date of Patent: January 5, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Brian E. Keane, Ben J. Sugden, Robert L. Crocco, Jr., Daniel Deptford, Tom G. Salter, Laura K. Massey, Alex Aben-Athar Kipman, Peter Tobias Kinnebrew, Nicholas Ferianc Kamuda
  • Patent number: 9116666
    Abstract: Techniques are provided for allowing a user to select a region within virtual imagery, such as a hologram, being presented in an HMD. The user could select the region by using their hands to form a closed loop such that from the perspective of the user, the closed loop corresponds to the region the user wishes to select. The user could select the region by using a prop, such as a picture frame. In response to the selection, the selected region could be presented using a different rendering technique than other regions of the virtual imagery. Various rendering techniques such as zooming, filtering, etc. could be applied to the selected region. The identification of the region by the user could also serve as a selection of an element in that portion of the virtual image.
    Type: Grant
    Filed: June 1, 2012
    Date of Patent: August 25, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Tom G. Salter, Alex Aben-Athar Kipman, Ben J. Sugden, Robert L. Crocco, Jr., Brian E. Keane, Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Mathew J. Lamb
  • Patent number: 9105210
    Abstract: A system for identifying an AR tag and determining a location for a virtual object within an augmented reality environment corresponding with the AR tag is described. In some environments, including those with viewing obstructions, the identity of the AR tag and the location of a corresponding virtual object may be determined by aggregating individual identity and location determinations from a plurality of head-mounted display devices (HMDs). The virtual object may comprise a shared virtual object that is viewable from each of the plurality of HMDs as existing at a shared location within the augmented reality environment. The shared location may comprise a weighted average of individual location determinations from each of the plurality of HMDs. By aggregating and analyzing individual identity and location determinations, a particular HMD of the plurality of HMDs may display a virtual object without having to identify a corresponding AR tag directly.
    Type: Grant
    Filed: June 29, 2012
    Date of Patent: August 11, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Mathew J. Lamb, Ben J. Sugden, Robert L. Crocco, Jr., Brian E. Keane, Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Alex Aben-Athar Kipman
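    The aggregation step this abstract describes, combining per-HMD location determinations into one shared location via a weighted average, can be sketched as follows. The data layout and the idea of using per-device confidence as the weight are assumptions for illustration, not details taken from the patent:

    ```python
    def aggregate_shared_location(estimates):
        """Combine per-HMD (x, y, z) location estimates into a single
        shared location using a weighted average.

        estimates: list of ((x, y, z), weight) pairs, one per HMD, where
        weight might reflect that device's confidence in its estimate.
        """
        total = sum(w for _, w in estimates)
        if total == 0:
            raise ValueError("no estimate carries any weight")
        return tuple(
            sum(point[i] * w for point, w in estimates) / total
            for i in range(3)
        )

    # Three HMDs report slightly different positions for the same virtual
    # object; the second device is twice as confident as the others.
    shared = aggregate_shared_location([
        ((1.0, 0.0, 2.0), 1.0),
        ((1.2, 0.0, 2.0), 2.0),
        ((0.8, 0.0, 2.0), 1.0),
    ])  # x = (1.0 + 2.4 + 0.8) / 4 = 1.05
    ```

    Averaging across devices is what lets an HMD with an obstructed view place the shared object consistently without directly identifying the AR tag itself.
    
    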
  • Publication number: 20150220231
    Abstract: A system for generating and displaying holographic visual aids associated with a story to an end user of a head-mounted display device while the end user is reading the story or perceiving the story being read aloud is described. The story may be embodied within a reading object (e.g., a book) in which words of the story may be displayed to the end user. The holographic visual aids may include a predefined character animation that is synchronized to a portion of the story corresponding with the character being animated. A reading pace of a portion of the story may be used to control the playback speed of the predefined character animation in real-time such that the character is perceived to be lip-syncing the story being read aloud. In some cases, an existing book without predetermined AR tags may be augmented with holographic visual aids.
    Type: Application
    Filed: April 15, 2015
    Publication date: August 6, 2015
    Inventors: Brian E. Keane, Ben J. Sugden, Robert L. Crocco, Jr., Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Mathew J. Lamb, Alex Aben-Athar Kipman
  • Patent number: 9035970
    Abstract: A system for identifying an AR tag and determining a location for a virtual object within an augmented reality environment corresponding with the AR tag is described. In some environments, the location of a virtual object corresponding with a particular AR tag may be determined by identifying a predefined object, determining an orientation and a scale of the predefined object relative to a head-mounted display device (HMD) based on a model of the predefined object, and inferring the location of the virtual object based on the orientation and the scale of the predefined object. In some cases, an identification of the particular AR tag corresponding with the virtual object may be acquired by aggregating and analyzing individual identity determinations from a plurality of HMDs within an augmented reality environment.
    Type: Grant
    Filed: June 29, 2012
    Date of Patent: May 19, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Mathew J. Lamb, Ben J. Sugden, Robert L. Crocco, Jr., Brian E. Keane, Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Alex Aben-Athar Kipman
  • Patent number: 9035955
    Abstract: A system for generating and displaying holographic visual aids associated with a story to an end user of a head-mounted display device while the end user is reading the story or perceiving the story being read aloud is described. The story may be embodied within a reading object (e.g., a book) in which words of the story may be displayed to the end user. The holographic visual aids may include a predefined character animation that is synchronized to a portion of the story corresponding with the character being animated. A reading pace of a portion of the story may be used to control the playback speed of the predefined character animation in real-time such that the character is perceived to be lip-syncing the story being read aloud. In some cases, an existing book without predetermined AR tags may be augmented with holographic visual aids.
    Type: Grant
    Filed: May 16, 2012
    Date of Patent: May 19, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Brian E. Keane, Ben J. Sugden, Robert L. Crocco, Jr., Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Mathew J. Lamb, Alex Aben-Athar Kipman
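    The pacing mechanism in this abstract, driving a character animation's playback speed from the user's reading pace so the character appears to lip-sync, reduces to a simple ratio. The function below is a minimal sketch; the words-per-minute framing and the `nominal_wpm` parameter are assumptions, not specified in the patent:

    ```python
    def playback_speed(words_read, elapsed_seconds, nominal_wpm=150.0):
        """Return a playback-speed multiplier for a character animation,
        where 1.0 means the animation runs at its authored rate.

        nominal_wpm is the reading pace the animation was authored for
        (an assumed parameter for illustration).
        """
        if elapsed_seconds <= 0:
            return 1.0  # no pace information yet; play at authored rate
        actual_wpm = words_read / (elapsed_seconds / 60.0)
        return actual_wpm / nominal_wpm

    # A reader covering 50 words in 15 seconds reads at 200 wpm, so an
    # animation authored for 150 wpm plays at 200/150 of normal speed.
    speed = playback_speed(50, 15.0)
    ```

    Recomputing this ratio over a sliding window of recent words would let the animation track pace changes in real time, as the abstract describes.
    
    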