Patents by Inventor Mathew J. Lamb
Mathew J. Lamb has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 9317971
Abstract: A system for allowing a virtual object to interact with other virtual objects across different spaces within an augmented reality (AR) environment and to transition between the different spaces is described. An AR environment may include a plurality of spaces, each comprising a bounded area or volume within the AR environment. In one example, an AR environment may be associated with a three-dimensional world space and a two-dimensional object space corresponding with a page of a book within the AR environment. A virtual object within the AR environment may be assigned to the object space and transition from the two-dimensional object space to the three-dimensional world space upon the detection of a space transition event. In some cases, a dual representation of the virtual object may be used to detect interactions between the virtual object and other virtual objects in both the world space and the object space.
Type: Grant
Filed: June 29, 2012
Date of Patent: April 19, 2016
Assignee: Microsoft Technology Licensing, LLC
Inventors: Mathew J. Lamb, Ben J. Sugden, Robert L. Crocco, Jr., Brian E. Keane, Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Alex Aben-Athar Kipman
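The dual-representation idea in the abstract above can be sketched as an object that keeps both a 2D object-space pose and a 3D world-space pose and switches between them on a space transition event. This is an illustrative sketch only; the class, field names, and the toggle behavior are assumptions, not taken from the patent.

```python
# Hypothetical sketch of a dual-representation virtual object: it holds a
# pose in the 2D object space (a book page) and a pose in the 3D world
# space, and a space transition event switches which space is active.
class VirtualObject:
    def __init__(self, page_uv, world_xyz):
        self.page_uv = page_uv      # 2D pose on the page (object space)
        self.world_xyz = world_xyz  # 3D pose in the room (world space)
        self.space = "object"       # assigned to the object space initially

    def on_space_transition(self):
        # e.g., the character "jumps off the page" into the room, or back
        self.space = "world" if self.space == "object" else "object"

    def active_pose(self):
        # interactions are tested against the pose in the active space
        return self.page_uv if self.space == "object" else self.world_xyz

obj = VirtualObject(page_uv=(0.3, 0.7), world_xyz=(1.0, 1.2, 0.5))
obj.on_space_transition()  # transition event: now interacts in world space
```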
-
Patent number: 9268406
Abstract: Technology is described for providing a virtual spectator experience for a user of a personal A/V apparatus including a near-eye, augmented reality (AR) display. A position volume of an event object participating in an event in a first 3D coordinate system for a first location is received and mapped to a second position volume in a second 3D coordinate system at a second location remote from where the event is occurring. A display field of view of the near-eye AR display at the second location is determined, and real-time 3D virtual data representing the one or more event objects which are positioned within the display field of view is displayed in the near-eye AR display. A user may select a viewing position from which to view the event. Additionally, virtual data of a second user may be displayed at a position relative to a first user.
Type: Grant
Filed: June 29, 2012
Date of Patent: February 23, 2016
Assignee: Microsoft Technology Licensing, LLC
Inventors: Kevin A. Geisner, Kathryn Stone Perez, Stephen G. Latta, Ben J. Sugden, Benjamin I. Vaught, Alex Aben-Athar Kipman, Michael J. Scavezze, Daniel J. McCulloch, Darren Bennett, Jason Scott, Ryan L. Hastings, Brian E. Keane, Christopher E. Miles, Robert L. Crocco, Jr., Mathew J. Lamb
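The coordinate-system mapping described above can be sketched as a rigid transform plus a scale: a point in the event venue's coordinate frame is rotated, scaled, and translated into the remote viewer's frame. A real system would derive the transform from calibration data; the yaw-only rotation and the specific numbers here are assumptions for illustration.

```python
import math

# Hypothetical sketch: map a point from the event venue's 3D coordinate
# system into a remote viewer's coordinate system. A yaw rotation about
# the up axis, a uniform scale, and a translation are assumed here.
def map_position(p, yaw_deg, scale, offset):
    x, y, z = p
    c = math.cos(math.radians(yaw_deg))
    s = math.sin(math.radians(yaw_deg))
    xr, zr = c * x + s * z, -s * x + c * z  # rotate about the y (up) axis
    ox, oy, oz = offset
    return (scale * xr + ox, scale * y + oy, scale * zr + oz)

# A player at midfield, shrunk onto a tabletop model of the field.
table_pos = map_position((50.0, 0.0, 25.0), yaw_deg=90.0, scale=0.01,
                         offset=(0.0, 0.75, 0.0))
```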
-
Patent number: 9116666
Abstract: Techniques are provided for allowing a user to select a region within virtual imagery, such as a hologram, being presented in an HMD. The user could select the region by using their hands to form a closed loop such that from the perspective of the user, the closed loop corresponds to the region the user wishes to select. The user could select the region by using a prop, such as a picture frame. In response to the selection, the selected region could be presented using a different rendering technique than other regions of the virtual imagery. Various rendering techniques such as zooming, filtering, etc. could be applied to the selected region. The identification of the region by the user could also serve as a selection of an element in that portion of the virtual image.
Type: Grant
Filed: June 1, 2012
Date of Patent: August 25, 2015
Assignee: Microsoft Technology Licensing, LLC
Inventors: Tom G. Salter, Alex Aben-Athar Kipman, Ben J. Sugden, Robert L. Crocco, Jr., Brian E. Keane, Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Mathew J. Lamb
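Once the closed loop formed by the user's hands (or a prop) is projected into view space, deciding which parts of the virtual imagery fall inside it is a point-in-polygon test. The sketch below uses the standard ray-casting algorithm; treating the loop as a view-space polygon is an assumption about how such a selection could be implemented, not a description of the patented method.

```python
# Hypothetical sketch: treat the closed loop as a polygon in view space
# and test which points of the virtual image fall inside it, using the
# classic ray-casting (even-odd) point-in-polygon test.
def point_in_loop(pt, loop):
    x, y = pt
    inside = False
    # walk each edge of the polygon, wrapping from the last vertex to the first
    for (x1, y1), (x2, y2) in zip(loop, loop[1:] + loop[:1]):
        if (y1 > y) != (y2 > y):  # edge straddles the horizontal ray
            if x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                inside = not inside
    return inside

square = [(0, 0), (4, 0), (4, 4), (0, 4)]
point_in_loop((2, 2), square)  # a point at the center of the loop is selected
```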
-
Patent number: 9105210
Abstract: A system for identifying an AR tag and determining a location for a virtual object within an augmented reality environment corresponding with the AR tag is described. In some environments, including those with viewing obstructions, the identity of the AR tag and the location of a corresponding virtual object may be determined by aggregating individual identity and location determinations from a plurality of head-mounted display devices (HMDs). The virtual object may comprise a shared virtual object that is viewable from each of the plurality of HMDs as existing at a shared location within the augmented reality environment. The shared location may comprise a weighted average of individual location determinations from each of the plurality of HMDs. By aggregating and analyzing individual identity and location determinations, a particular HMD of the plurality of HMDs may display a virtual object without having to identify a corresponding AR tag directly.
Type: Grant
Filed: June 29, 2012
Date of Patent: August 11, 2015
Assignee: Microsoft Technology Licensing, LLC
Inventors: Mathew J. Lamb, Ben J. Sugden, Robert L. Crocco, Jr., Brian E. Keane, Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Alex Aben-Athar Kipman
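The abstract says the shared location may be a weighted average of the individual per-HMD location determinations. That aggregation can be sketched directly; the per-HMD confidence weights used here are an assumed stand-in for whatever weighting the system actually applies.

```python
# Hypothetical sketch: aggregate per-HMD location estimates for a shared
# virtual object into one shared location via a weighted average.
def shared_location(estimates):
    """estimates: list of ((x, y, z), weight) tuples, one per HMD."""
    total = sum(w for _, w in estimates)
    return tuple(
        sum(p[i] * w for p, w in estimates) / total
        for i in range(3)
    )

# Three HMDs report slightly different positions for the same AR tag,
# each paired with an assumed confidence weight.
loc = shared_location([((1.0, 0.0, 2.0), 0.5),
                       ((1.2, 0.1, 2.1), 0.3),
                       ((0.9, -0.1, 1.9), 0.2)])
```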
-
Publication number: 20150220231
Abstract: A system for generating and displaying holographic visual aids associated with a story to an end user of a head-mounted display device while the end user is reading the story or perceiving the story being read aloud is described. The story may be embodied within a reading object (e.g., a book) in which words of the story may be displayed to the end user. The holographic visual aids may include a predefined character animation that is synchronized to a portion of the story corresponding with the character being animated. A reading pace of a portion of the story may be used to control the playback speed of the predefined character animation in real-time such that the character is perceived to be lip-syncing the story being read aloud. In some cases, an existing book without predetermined AR tags may be augmented with holographic visual aids.
Type: Application
Filed: April 15, 2015
Publication date: August 6, 2015
Inventors: Brian E. Keane, Ben J. Sugden, Robert L. Crocco, Jr., Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Mathew J. Lamb, Alex Aben-Athar Kipman
-
Publication number: 20150212576
Abstract: Methods for enabling hands-free selection of objects within an augmented reality environment are described. In some embodiments, an object may be selected by an end user of a head-mounted display device (HMD) based on detecting a vestibulo-ocular reflex (VOR) in the end user's eyes while the end user is gazing at the object and performing a particular head movement for selecting the object. The object selected may comprise a real object or a virtual object. The end user may select the object by gazing at the object for a first time period and then performing a particular head movement in which the VOR is detected for one or both of the end user's eyes. In one embodiment, the particular head movement may involve the end user moving their head away from a direction of the object at a particular head speed while gazing at the object.
Type: Application
Filed: January 28, 2014
Publication date: July 30, 2015
Inventors: Anthony J. Ambrus, Adam G. Poulos, Lewey Alec Geselowitz, Dan Kroymann, Arthur C. Tomlin, Roger Sebastian-Kevin Sylvan, Mathew J. Lamb, Brian J. Mount
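During the vestibulo-ocular reflex the eyes rotate opposite to the head so that gaze stays on the target, which suggests one simple check: eye angular velocity should approximately cancel head angular velocity while the head is moving. The detector below is a minimal sketch of that idea; the speed threshold and cancellation tolerance are assumed values, not parameters from the publication.

```python
# Hypothetical sketch of a VOR check: during the reflex the eyes
# counter-rotate against the head so gaze stays fixed, so eye velocity
# should roughly cancel head velocity (both in degrees per second).
def vor_detected(head_vel_dps, eye_vel_dps,
                 min_head_speed=20.0, cancel_tol=0.2):
    if abs(head_vel_dps) < min_head_speed:
        return False  # head not moving fast enough to count as the gesture
    residual = head_vel_dps + eye_vel_dps  # ~0 when eyes counter-rotate
    return abs(residual) <= cancel_tol * abs(head_vel_dps)

vor_detected(40.0, -38.0)  # eyes nearly cancel the head turn: VOR-like
```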
-
Patent number: 9035970
Abstract: A system for identifying an AR tag and determining a location for a virtual object within an augmented reality environment corresponding with the AR tag is described. In some environments, the location of a virtual object corresponding with a particular AR tag may be determined by identifying a predefined object, determining an orientation and a scale of the predefined object relative to a head-mounted display device (HMD) based on a model of the predefined object, and inferring the location of the virtual object based on the orientation and the scale of the predefined object. In some cases, an identification of the particular AR tag corresponding with the virtual object may be acquired by aggregating and analyzing individual identity determinations from a plurality of HMDs within an augmented reality environment.
Type: Grant
Filed: June 29, 2012
Date of Patent: May 19, 2015
Assignee: Microsoft Technology Licensing, LLC
Inventors: Mathew J. Lamb, Ben J. Sugden, Robert L. Crocco, Jr., Brian E. Keane, Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Alex Aben-Athar Kipman
-
Patent number: 9035955
Abstract: A system for generating and displaying holographic visual aids associated with a story to an end user of a head-mounted display device while the end user is reading the story or perceiving the story being read aloud is described. The story may be embodied within a reading object (e.g., a book) in which words of the story may be displayed to the end user. The holographic visual aids may include a predefined character animation that is synchronized to a portion of the story corresponding with the character being animated. A reading pace of a portion of the story may be used to control the playback speed of the predefined character animation in real-time such that the character is perceived to be lip-syncing the story being read aloud. In some cases, an existing book without predetermined AR tags may be augmented with holographic visual aids.
Type: Grant
Filed: May 16, 2012
Date of Patent: May 19, 2015
Assignee: Microsoft Technology Licensing, LLC
Inventors: Brian E. Keane, Ben J. Sugden, Robert L. Crocco, Jr., Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Mathew J. Lamb, Alex Aben-Athar Kipman
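The pacing control in the abstract above amounts to scaling the animation's playback speed by the ratio of the measured reading pace to a baseline pace. The sketch below illustrates that; the baseline of 150 words per minute and the clamping range are assumed values.

```python
# Hypothetical sketch: scale a character animation's playback speed to
# the measured reading pace so the animation stays lip-synced with the
# story being read aloud. Baseline pace and clamp limits are assumed.
def playback_speed(words_per_min, baseline_wpm=150.0,
                   min_speed=0.5, max_speed=2.0):
    speed = words_per_min / baseline_wpm
    # clamp so a pause or a skim doesn't freeze or blur the animation
    return max(min_speed, min(max_speed, speed))

playback_speed(180.0)  # reader is faster than the animation's baseline pace
```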
-
Publication number: 20140198017
Abstract: A see-through display apparatus includes a see-through, head-mounted display and sensors on the display which detect audible and visual data in a field of view of the apparatus. A processor cooperates with the display to provide information to a wearer of the device using a behavior-based real object mapping system. At least a global zone and an egocentric behavioral zone relative to the apparatus are established, and real objects are assigned behaviors that are mapped to the respective zones they occupy. The behaviors assigned to the objects can be used by applications that provide services to the wearer, using the behaviors as the foundation for evaluating the type of feedback to provide in the apparatus.
Type: Application
Filed: January 12, 2013
Publication date: July 17, 2014
Inventors: Mathew J. Lamb, Alex Aben-Athar Kipman
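The zone-to-behavior mapping above can be sketched as a lookup keyed by which zone a real object currently occupies relative to the wearer. The zone radius and the behavior names below are invented for illustration; the publication only establishes that behaviors are mapped per zone.

```python
# Hypothetical sketch of behavior-based zone mapping: a real object is
# assigned behaviors according to the zone it occupies relative to the
# wearer. Zone radius and behavior names are assumed for illustration.
ZONE_BEHAVIORS = {
    "egocentric": ["collision_warning", "touch_feedback"],
    "global": ["identify", "navigation_hint"],
}

def behaviors_for(distance_m, ego_radius_m=2.0):
    zone = "egocentric" if distance_m <= ego_radius_m else "global"
    return zone, ZONE_BEHAVIORS[zone]

behaviors_for(1.5)  # a nearby object falls in the egocentric zone
```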
-
Publication number: 20140002492
Abstract: Techniques are provided for propagating real world properties into mixed reality images in a see-through, near-eye mixed reality display device. A physical property from the real world may be propagated into a virtual image to be rendered in the display device. Thus, the physics depicted in the mixed reality images may be influenced by a physical property in the environment. Therefore, the user wearing the mixed reality display device is provided a better sense that it is mixed reality, as opposed to simply virtual reality. The mixed reality image may be linked to a real world physical object. This movable physical object can be, for example, a book, a sheet of paper, or a cellular telephone. Forces on the physical object may be propagated into the virtual image.
Type: Application
Filed: June 29, 2012
Publication date: January 2, 2014
Inventors: Mathew J. Lamb, Ben J. Sugden, Robert L. Crocco, Jr., Brian E. Keane, Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Alex Aben-Athar Kipman, Tom G. Salter
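Propagating a force on the tracked physical object into the linked virtual image can be sketched as feeding the measured acceleration into a simple integration step for the virtual object's state. This is a generic Euler-integration sketch under assumed damping and timestep values, not the publication's actual physics model.

```python
# Hypothetical sketch: propagate a force on the tracked physical object
# (e.g., the user tilting a book) into the linked virtual object's motion
# via one explicit Euler step. Damping and timestep values are assumed.
def step(virtual_vel, applied_accel, dt=1 / 60, damping=0.98):
    vx, vy, vz = virtual_vel
    ax, ay, az = applied_accel
    # integrate acceleration, then damp so motion settles when forces stop
    return (damping * (vx + ax * dt),
            damping * (vy + ay * dt),
            damping * (vz + az * dt))

# Book tips over: gravity along the page starts the virtual object moving.
vel = step((0.0, 0.0, 0.0), (0.0, -9.8, 0.0))
```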
-
Publication number: 20140002496
Abstract: A system for identifying an AR tag and determining a location for a virtual object within an augmented reality environment corresponding with the AR tag is described. In some environments, the location of a virtual object corresponding with a particular AR tag may be determined by identifying a predefined object, determining an orientation and a scale of the predefined object relative to a head-mounted display device (HMD) based on a model of the predefined object, and inferring the location of the virtual object based on the orientation and the scale of the predefined object. In some cases, an identification of the particular AR tag corresponding with the virtual object may be acquired by aggregating and analyzing individual identity determinations from a plurality of HMDs within an augmented reality environment.
Type: Application
Filed: June 29, 2012
Publication date: January 2, 2014
Inventors: Mathew J. Lamb, Ben J. Sugden, Robert L. Crocco, Jr., Brian E. Keane, Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Alex Aben-Athar Kipman
-
Publication number: 20140002491
Abstract: Techniques are provided for rendering, in a see-through, near-eye mixed reality display, a virtual object within a virtual hole, window or cutout. The virtual hole, window or cutout may appear to be within some real world physical object such as a book, table, etc. The virtual object may appear to be just below the surface of the physical object. In a sense, the virtual world could be considered to be a virtual container that provides developers with additional locations for presenting virtual objects. For example, rather than rendering a virtual object, such as a lamp, in a mixed reality display such that it appears to sit on top of a real world desk, the virtual object is rendered such that it appears to be located below the surface of the desk.
Type: Application
Filed: June 29, 2012
Publication date: January 2, 2014
Inventors: Mathew J. Lamb, Ben J. Sugden, Robert L. Crocco, Jr., Brian E. Keane, Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Alex Aben-Athar Kipman, Jeffrey Neil Margolis
-
Publication number: 20140002495
Abstract: A system for identifying an AR tag and determining a location for a virtual object within an augmented reality environment corresponding with the AR tag is described. In some environments, including those with viewing obstructions, the identity of the AR tag and the location of a corresponding virtual object may be determined by aggregating individual identity and location determinations from a plurality of head-mounted display devices (HMDs). The virtual object may comprise a shared virtual object that is viewable from each of the plurality of HMDs as existing at a shared location within the augmented reality environment. The shared location may comprise a weighted average of individual location determinations from each of the plurality of HMDs. By aggregating and analyzing individual identity and location determinations, a particular HMD of the plurality of HMDs may display a virtual object without having to identify a corresponding AR tag directly.
Type: Application
Filed: June 29, 2012
Publication date: January 2, 2014
Inventors: Mathew J. Lamb, Ben J. Sugden, Robert L. Crocco, Jr., Brian E. Keane, Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Alex Aben-Athar Kipman
-
Publication number: 20140006026
Abstract: A system for generating one or more enhanced audio signals such that one or more sound levels corresponding with sounds received from one or more sources of sound within an environment may be dynamically adjusted based on contextual information is described. The one or more enhanced audio signals may be generated by a head-mounted display device (HMD) worn by an end user within the environment and outputted to earphones associated with the HMD such that the end user may listen to the one or more enhanced audio signals in real-time. In some cases, each of the one or more sources of sound may correspond with a priority level. The priority level may be dynamically assigned depending on whether the end user of the HMD is focusing on a particular source of sound or has specified a predetermined level of importance corresponding with the particular source of sound.
Type: Application
Filed: June 29, 2012
Publication date: January 2, 2014
Inventors: Mathew J. Lamb, Ben J. Sugden, Robert L. Crocco, Jr., Brian E. Keane, Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Alex Aben-Athar Kipman
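The priority-driven adjustment described above can be sketched as mapping each source's priority level to a gain and mixing the gained signals. The specific levels and the gain mapping are assumptions; the abstract only says priority may come from the user's focus or a stated importance.

```python
# Hypothetical sketch: adjust per-source sound levels from assigned
# priority levels, then mix into one enhanced signal. The three priority
# levels and the gain mapping are assumed for illustration.
def mix(samples_by_source, priority):
    """samples_by_source: {source: [float samples]}; priority: {source: 0..2}."""
    gains = {0: 0.2, 1: 0.6, 2: 1.0}  # assumed: ducked, normal, focused
    n = max(len(s) for s in samples_by_source.values())
    out = [0.0] * n
    for src, samples in samples_by_source.items():
        g = gains[priority[src]]
        for i, v in enumerate(samples):
            out[i] += g * v
    return out

# The focused speaker stays at full level; background chatter is ducked.
mixed = mix({"speaker": [0.5, 0.5], "crowd": [0.5, 0.5]},
            {"speaker": 2, "crowd": 0})
```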
-
Publication number: 20140002442
Abstract: A system for allowing a virtual object to interact with other virtual objects across different spaces within an augmented reality (AR) environment and to transition between the different spaces is described. An AR environment may include a plurality of spaces, each comprising a bounded area or volume within the AR environment. In one example, an AR environment may be associated with a three-dimensional world space and a two-dimensional object space corresponding with a page of a book within the AR environment. A virtual object within the AR environment may be assigned to the object space and transition from the two-dimensional object space to the three-dimensional world space upon the detection of a space transition event. In some cases, a dual representation of the virtual object may be used to detect interactions between the virtual object and other virtual objects in both the world space and the object space.
Type: Application
Filed: June 29, 2012
Publication date: January 2, 2014
Inventors: Mathew J. Lamb, Ben J. Sugden, Robert L. Crocco, Jr., Brian E. Keane, Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Alex Aben-Athar Kipman
-
Publication number: 20130321255
Abstract: Technology is disclosed herein to help a user navigate through large amounts of content while wearing a see-through, near-eye, mixed reality display device such as a head mounted display (HMD). The user can use a physical object such as a book to navigate through content being presented in the HMD. In one embodiment, a book has markers on the pages that allow the system to organize the content. The book could have real content, but it could be blank other than the markers. As the user flips through the book, the system recognizes the markers and presents content associated with the respective marker in the HMD.
Type: Application
Filed: June 5, 2012
Publication date: December 5, 2013
Inventors: Mathew J. Lamb, Ben J. Sugden, Robert L. Crocco, Jr., Brian E. Keane, Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Alex Aben-Athar Kipman, Sheridan Martin Small, Stephen G. Latta
-
Publication number: 20130321462
Abstract: Techniques are provided for allowing a user to select a region within virtual imagery, such as a hologram, being presented in an HMD. The user could select the region by using their hands to form a closed loop such that from the perspective of the user, the closed loop corresponds to the region the user wishes to select. The user could select the region by using a prop, such as a picture frame. In response to the selection, the selected region could be presented using a different rendering technique than other regions of the virtual imagery. Various rendering techniques such as zooming, filtering, etc. could be applied to the selected region. The identification of the region by the user could also serve as a selection of an element in that portion of the virtual image.
Type: Application
Filed: June 1, 2012
Publication date: December 5, 2013
Inventors: Tom G. Salter, Alex Aben-Athar Kipman, Ben J. Sugden, Robert L. Crocco, Jr., Brian E. Keane, Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Mathew J. Lamb
-
Publication number: 20130307855
Abstract: A system for generating and displaying holographic visual aids associated with a story to an end user of a head-mounted display device while the end user is reading the story or perceiving the story being read aloud is described. The story may be embodied within a reading object (e.g., a book) in which words of the story may be displayed to the end user. The holographic visual aids may include a predefined character animation that is synchronized to a portion of the story corresponding with the character being animated. A reading pace of a portion of the story may be used to control the playback speed of the predefined character animation in real-time such that the character is perceived to be lip-syncing the story being read aloud. In some cases, an existing book without predetermined AR tags may be augmented with holographic visual aids.
Type: Application
Filed: May 16, 2012
Publication date: November 21, 2013
Inventors: Mathew J. Lamb, Ben J. Sugden, Robert L. Crocco, Jr., Brian E. Keane, Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Alex Aben-Athar Kipman
-
Publication number: 20130307856
Abstract: A system for generating and displaying holographic visual aids associated with a story to an end user of a head-mounted display device while the end user is reading the story or perceiving the story being read aloud is described. The story may be embodied within a reading object (e.g., a book) in which words of the story may be displayed to the end user. The holographic visual aids may include a predefined character animation that is synchronized to a portion of the story corresponding with the character being animated. A reading pace of a portion of the story may be used to control the playback speed of the predefined character animation in real-time such that the character is perceived to be lip-syncing the story being read aloud. In some cases, an existing book without predetermined AR tags may be augmented with holographic visual aids.
Type: Application
Filed: May 16, 2012
Publication date: November 21, 2013
Inventors: Brian E. Keane, Ben J. Sugden, Robert L. Crocco, Jr., Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Mathew J. Lamb, Alex Aben-Athar Kipman
-
Publication number: 20130083173
Abstract: Technology is described for providing a virtual spectator experience for a user of a personal A/V apparatus including a near-eye, augmented reality (AR) display. A position volume of an event object participating in an event in a first 3D coordinate system for a first location is received and mapped to a second position volume in a second 3D coordinate system at a second location remote from where the event is occurring. A display field of view of the near-eye AR display at the second location is determined, and real-time 3D virtual data representing the one or more event objects which are positioned within the display field of view is displayed in the near-eye AR display. A user may select a viewing position from which to view the event. Additionally, virtual data of a second user may be displayed at a position relative to a first user.
Type: Application
Filed: June 29, 2012
Publication date: April 4, 2013
Inventors: Kevin A. Geisner, Kathryn Stone Perez, Stephen G. Latta, Ben J. Sugden, Benjamin I. Vaught, Alex Aben-Athar Kipman, Michael J. Scavezze, Daniel J. McCulloch, Darren Bennett, Jason Scott, Ryan L. Hastings, Brian E. Keane, Christopher E. Miles, Robert L. Crocco, Jr., Mathew J. Lamb