Patents by Inventor Christopher E. Miles
Christopher E. Miles has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 10997512
Abstract: Systems and methods for inferring user traits based on indirect questions. Indirect questions may be generated based on one or more triggers. The answers to the indirect questions provide cues to a system as to whether a user has one or more attributes associated with a trait. This information may be used to personalize a computing device.
Type: Grant
Filed: July 6, 2015
Date of Patent: May 4, 2021
Inventors: Margaret Novotny, Jacob Miller, William Wagner, Yelisaveta Pesenson, Aryn Shelander, Sheena Stevens, Claudio Russo, Thore Graepel, Andrew D. Gordon, Christopher E. Miles
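The attribute-cue mechanism this abstract describes might be sketched as a simple evidence-scoring loop. Everything below (the question names, attribute names, trait names, and threshold) is invented for illustration and does not come from the patent:

```python
# Hypothetical sketch: map answers to indirect questions onto attribute cues,
# then infer a trait once enough of its associated attributes are supported.

# Illustrative mapping from (question, answer) pairs to attribute cues.
ATTRIBUTE_CUES = {
    ("prefers_evening_events", "yes"): {"night_owl"},
    ("reads_long_articles", "yes"): {"detail_oriented"},
    ("skips_tutorials", "yes"): {"expert_user"},
}

# Each trait is associated with a set of attributes.
TRAIT_ATTRIBUTES = {
    "power_user": {"detail_oriented", "expert_user"},
}

def infer_traits(answers, threshold=1.0):
    """Return traits whose attribute coverage meets the threshold."""
    observed = set()
    for qa in answers:
        observed |= ATTRIBUTE_CUES.get(qa, set())
    traits = []
    for trait, attrs in TRAIT_ATTRIBUTES.items():
        coverage = len(observed & attrs) / len(attrs)
        if coverage >= threshold:
            traits.append(trait)
    return traits

print(infer_traits([("reads_long_articles", "yes"), ("skips_tutorials", "yes")]))
```

The inferred traits could then feed whatever personalization layer the computing device exposes.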
-
Patent number: 10643389
Abstract: A system for allowing a virtual object to interact with other virtual objects across different spaces within an augmented reality (AR) environment and to transition between the different spaces is described. An AR environment may include a plurality of spaces, each comprising a bounded area or volume within the AR environment. In one example, an AR environment may be associated with a three-dimensional world space and a two-dimensional object space corresponding with a page of a book within the AR environment. A virtual object within the AR environment may be assigned to the object space and transition from the two-dimensional object space to the three-dimensional world space upon the detection of a space transition event. In some cases, a dual representation of the virtual object may be used to detect interactions between the virtual object and other virtual objects in both the world space and the object space.
Type: Grant
Filed: March 28, 2016
Date of Patent: May 5, 2020
Assignee: Microsoft Technology Licensing, LLC
Inventors: Mathew J. Lamb, Ben J. Sugden, Robert L. Crocco, Jr., Brian E. Keane, Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Alex Aben-Athar Kipman
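The space transition the abstract describes might be sketched as an object holding a dual representation: 2-D coordinates within the page's object space and 3-D coordinates in the world space. The class, field names, and the simple translation used here are illustrative assumptions, not the patent's method:

```python
# Hypothetical sketch: a virtual object assigned to a 2-D object space (a book
# page) transitions into the 3-D world space when a space-transition event fires.

class VirtualObject:
    def __init__(self, page_xy, page_origin):
        self.space = "object"           # current space: "object" or "world"
        self.page_xy = page_xy          # 2-D position within the page
        self.page_origin = page_origin  # the page's origin in world coordinates
        self.world_xyz = None

    def on_space_transition(self):
        """Lift the 2-D page position into 3-D world coordinates."""
        if self.space == "object":
            ox, oy, oz = self.page_origin
            px, py = self.page_xy
            self.world_xyz = (ox + px, oy + py, oz)
            self.space = "world"
        return self.world_xyz

obj = VirtualObject(page_xy=(0.5, 0.25), page_origin=(1.0, 0.0, 0.75))
print(obj.on_space_transition())
```

Keeping both representations live, as the abstract's "dual representation" suggests, would let collision checks run in whichever space the other object occupies.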
-
Patent number: 9583032
Abstract: Technology is disclosed herein to help a user navigate through large amounts of content while wearing a see-through, near-eye, mixed reality display device such as a head mounted display (HMD). The user can use a physical object such as a book to navigate through content being presented in the HMD. In one embodiment, a book has markers on the pages that allow the system to organize the content. The book may contain real content, or it may be blank except for the markers. As the user flips through the book, the system recognizes the markers and presents content associated with the respective marker in the HMD.
Type: Grant
Filed: June 5, 2012
Date of Patent: February 28, 2017
Assignee: Microsoft Technology Licensing, LLC
Inventors: Mathew J. Lamb, Ben J. Sugden, Robert L. Crocco, Jr., Brian E. Keane, Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Alex Aben-Athar Kipman, Sheridan Martin Small, Stephen G. Latta
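At its core, the marker-driven navigation the abstract describes is a lookup from a recognized page marker to the content to present. The marker IDs and content strings below are invented placeholders:

```python
# Hypothetical sketch: a marker recognized on a physical page keys a lookup
# into the content the HMD should present for that page.

PAGE_CONTENT = {
    "marker_001": "Chapter 1 overlay",
    "marker_002": "Chapter 2 overlay",
}

def content_for_marker(marker_id, default="(no content)"):
    """Return the HMD content associated with a recognized page marker."""
    return PAGE_CONTENT.get(marker_id, default)

print(content_for_marker("marker_002"))
```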
-
Patent number: 9524081
Abstract: A system for generating and displaying holographic visual aids associated with a story to an end user of a head-mounted display device while the end user is reading the story or perceiving the story being read aloud is described. The story may be embodied within a reading object (e.g., a book) in which words of the story may be displayed to the end user. The holographic visual aids may include a predefined character animation that is synchronized to a portion of the story corresponding with the character being animated. A reading pace of a portion of the story may be used to control the playback speed of the predefined character animation in real-time such that the character is perceived to be lip-syncing the story being read aloud. In some cases, an existing book without predetermined AR tags may be augmented with holographic visual aids.
Type: Grant
Filed: April 15, 2015
Date of Patent: December 20, 2016
Assignee: Microsoft Technology Licensing, LLC
Inventors: Brian E. Keane, Ben J. Sugden, Robert L. Crocco, Jr., Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Mathew J. Lamb, Alex Aben-Athar Kipman
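The pacing relationship the abstract describes reduces to a ratio: the animation plays faster or slower in proportion to how the observed reading pace compares to the pace the animation was authored for. The words-per-minute figures below are illustrative assumptions:

```python
# Hypothetical sketch: scale a character animation's playback speed by the
# ratio of the observed reading pace to the authored pace, so the character
# appears to lip-sync the narration in real time.

def playback_speed(observed_wpm, authored_wpm=150.0):
    """Playback multiplier that keeps the animation tracking the reader."""
    if authored_wpm <= 0:
        raise ValueError("authored pace must be positive")
    return observed_wpm / authored_wpm

print(playback_speed(180.0))  # a faster reader speeds up the animation
```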
-
Publication number: 20160350667
Abstract: Systems and methods for inferring user traits based on indirect questions. Indirect questions may be generated based on one or more triggers. The answers to the indirect questions provide cues to a system as to whether a user has one or more attributes associated with a trait. This information may be used to personalize a computing device.
Type: Application
Filed: July 6, 2015
Publication date: December 1, 2016
Applicant: Microsoft Technology Licensing, LLC
Inventors: Margaret Novotny, Jacob Miller, William Wagner, Yelisaveta Pesenson, Aryn Shelander, Sheena Stevens, Claudio Russo, Thore Graepel, Andrew D. Gordon, Christopher E. Miles
-
Patent number: 9417692
Abstract: Techniques are provided for rendering, in a see-through, near-eye mixed reality display, a virtual object within a virtual hole, window or cutout. The virtual hole, window or cutout may appear to be within some real world physical object such as a book, table, etc. The virtual object may appear to be just below the surface of the physical object. In a sense, the virtual hole could be considered to be a virtual container that provides developers with additional locations for presenting virtual objects. For example, rather than rendering a virtual object, such as a lamp, in a mixed reality display such that it appears to sit on top of a real world desk, the virtual object is rendered such that it appears to be located below the surface of the desk.
Type: Grant
Filed: June 29, 2012
Date of Patent: August 16, 2016
Assignee: Microsoft Technology Licensing, LLC
Inventors: Mathew J. Lamb, Ben J. Sugden, Robert L. Crocco, Jr., Brian E. Keane, Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Alex Aben-Athar Kipman, Jeffrey Neil Margolis
-
Publication number: 20160210789
Abstract: A system for allowing a virtual object to interact with other virtual objects across different spaces within an augmented reality (AR) environment and to transition between the different spaces is described. An AR environment may include a plurality of spaces, each comprising a bounded area or volume within the AR environment. In one example, an AR environment may be associated with a three-dimensional world space and a two-dimensional object space corresponding with a page of a book within the AR environment. A virtual object within the AR environment may be assigned to the object space and transition from the two-dimensional object space to the three-dimensional world space upon the detection of a space transition event. In some cases, a dual representation of the virtual object may be used to detect interactions between the virtual object and other virtual objects in both the world space and the object space.
Type: Application
Filed: March 28, 2016
Publication date: July 21, 2016
Inventors: Mathew J. Lamb, Ben J. Sugden, Robert L. Crocco, Jr., Brian E. Keane, Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Alex Aben-Athar Kipman
-
Patent number: 9384737
Abstract: A system for generating one or more enhanced audio signals such that one or more sound levels corresponding with sounds received from one or more sources of sound within an environment may be dynamically adjusted based on contextual information is described. The one or more enhanced audio signals may be generated by a head-mounted display device (HMD) worn by an end user within the environment and outputted to earphones associated with the HMD such that the end user may listen to the one or more enhanced audio signals in real-time. In some cases, each of the one or more sources of sound may correspond with a priority level. The priority level may be dynamically assigned depending on whether the end user of the HMD is focusing on a particular source of sound or has specified a predetermined level of importance corresponding with the particular source of sound.
Type: Grant
Filed: June 29, 2012
Date of Patent: July 5, 2016
Assignee: Microsoft Technology Licensing, LLC
Inventors: Mathew J. Lamb, Ben J. Sugden, Robert L. Crocco, Jr., Brian E. Keane, Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Alex Aben-Athar Kipman
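The priority-driven level adjustment the abstract describes might be sketched as a per-source gain computed from each source's priority, with a boost for the source the user is focusing on. The source names, priority values, and boost factor are all illustrative assumptions:

```python
# Hypothetical sketch: derive a gain for each sound source from its priority
# level, boosting the source the end user is currently focusing on.

def source_gains(sources, focused=None, focus_boost=2.0):
    """Map each source name to a gain in [0, 1] scaled by its priority."""
    max_priority = max(priority for _, priority in sources)
    gains = {}
    for name, priority in sources:
        gain = priority / max_priority
        if name == focused:
            gain = min(1.0, gain * focus_boost)  # clamp to avoid clipping
        gains[name] = gain
    return gains

print(source_gains([("speaker", 3), ("traffic", 1)], focused="speaker"))
```

A real system would apply these gains per audio frame before mixing the enhanced signal sent to the earphones.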
-
Patent number: 9317971
Abstract: A system for allowing a virtual object to interact with other virtual objects across different spaces within an augmented reality (AR) environment and to transition between the different spaces is described. An AR environment may include a plurality of spaces, each comprising a bounded area or volume within the AR environment. In one example, an AR environment may be associated with a three-dimensional world space and a two-dimensional object space corresponding with a page of a book within the AR environment. A virtual object within the AR environment may be assigned to the object space and transition from the two-dimensional object space to the three-dimensional world space upon the detection of a space transition event. In some cases, a dual representation of the virtual object may be used to detect interactions between the virtual object and other virtual objects in both the world space and the object space.
Type: Grant
Filed: June 29, 2012
Date of Patent: April 19, 2016
Assignee: Microsoft Technology Licensing, LLC
Inventors: Mathew J. Lamb, Ben J. Sugden, Robert L. Crocco, Jr., Brian E. Keane, Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Alex Aben-Athar Kipman
-
Patent number: 9268406
Abstract: Technology is described for providing a virtual spectator experience for a user of a personal A/V apparatus including a near-eye, augmented reality (AR) display. A position volume of an event object participating in an event in a first 3D coordinate system for a first location is received and mapped to a second position volume in a second 3D coordinate system at a second location remote from where the event is occurring. A display field of view of the near-eye AR display at the second location is determined, and real-time 3D virtual data representing the one or more event objects which are positioned within the display field of view are displayed in the near-eye AR display. A user may select a viewing position from which to view the event. Additionally, virtual data of a second user may be displayed at a position relative to a first user.
Type: Grant
Filed: June 29, 2012
Date of Patent: February 23, 2016
Assignee: Microsoft Technology Licensing, LLC
Inventors: Kevin A. Geisner, Kathryn Stone Perez, Stephen G. Latta, Ben J. Sugden, Benjamin I. Vaught, Alex Aben-Athar Kipman, Michael J. Scavezze, Daniel J. McCulloch, Darren Bennett, Jason Scott, Ryan L. Hastings, Brian E. Keane, Christopher E. Miles, Robert L. Crocco, Jr., Mathew J. Lamb
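The coordinate mapping the abstract describes (venue space to the remote spectator's space) might be sketched, in the simplest case, as a uniform scale plus a translation. A real mapping would also involve rotation and the full position volume; the scale factor and origins below are invented for illustration:

```python
# Hypothetical sketch: map an event object's position from the venue's 3-D
# coordinate system into the remote spectator's local coordinate system,
# using a uniform scale and a translation (no rotation, for simplicity).

def map_position(event_xyz, scale, local_origin):
    """Map a venue-space point into the spectator's local space."""
    return tuple(scale * c + o for c, o in zip(event_xyz, local_origin))

# A player at (10, 0, 5) in venue coordinates, rendered tabletop-sized
# around a local origin at (0.5, 0.9, 0.5).
print(map_position((10.0, 0.0, 5.0), scale=0.01, local_origin=(0.5, 0.9, 0.5)))
```

Mapping the whole position volume would apply the same transform to each corner of the volume before the field-of-view test.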
-
Patent number: 9116666
Abstract: Techniques are provided for allowing a user to select a region within virtual imagery, such as a hologram, being presented in an HMD. The user could select the region by using their hands to form a closed loop such that, from the perspective of the user, the closed loop corresponds to the region the user wishes to select. The user could also select the region by using a prop, such as a picture frame. In response to the selection, the selected region could be presented using a different rendering technique than other regions of the virtual imagery. Various rendering techniques such as zooming, filtering, etc. could be applied to the selected region. The identification of the region by the user could also serve as a selection of an element in that portion of the virtual image.
Type: Grant
Filed: June 1, 2012
Date of Patent: August 25, 2015
Assignee: Microsoft Technology Licensing, LLC
Inventors: Tom G. Salter, Alex Aben-Athar Kipman, Ben J. Sugden, Robert L. Crocco, Jr., Brian E. Keane, Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Mathew J. Lamb
-
Patent number: 9105210
Abstract: A system for identifying an AR tag and determining a location for a virtual object within an augmented reality environment corresponding with the AR tag is described. In some environments, including those with viewing obstructions, the identity of the AR tag and the location of a corresponding virtual object may be determined by aggregating individual identity and location determinations from a plurality of head-mounted display devices (HMDs). The virtual object may comprise a shared virtual object that is viewable from each of the plurality of HMDs as existing at a shared location within the augmented reality environment. The shared location may comprise a weighted average of individual location determinations from each of the plurality of HMDs. By aggregating and analyzing individual identity and location determinations, a particular HMD of the plurality of HMDs may display a virtual object without having to identify a corresponding AR tag directly.
Type: Grant
Filed: June 29, 2012
Date of Patent: August 11, 2015
Assignee: Microsoft Technology Licensing, LLC
Inventors: Mathew J. Lamb, Ben J. Sugden, Robert L. Crocco, Jr., Brian E. Keane, Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Alex Aben-Athar Kipman
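The weighted-average aggregation the abstract mentions can be sketched directly; the confidence weights here (higher for HMDs with an unobstructed view) are an illustrative assumption about how the weights might be chosen:

```python
# Hypothetical sketch: compute the shared location of a virtual object as a
# weighted average of per-HMD location estimates.

def shared_location(estimates):
    """estimates: list of ((x, y, z), weight) pairs, one per HMD."""
    total = sum(weight for _, weight in estimates)
    if total == 0:
        raise ValueError("at least one estimate must have positive weight")
    return tuple(
        sum(point[i] * weight for point, weight in estimates) / total
        for i in range(3)
    )

# Two HMDs with a clear view (weight 1.0) and one partly obstructed (0.2):
# the outlier from the obstructed HMD barely shifts the shared location.
print(shared_location([((1.0, 0.0, 2.0), 1.0),
                       ((1.2, 0.0, 2.0), 1.0),
                       ((3.0, 0.0, 2.0), 0.2)]))
```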
-
Publication number: 20150220231
Abstract: A system for generating and displaying holographic visual aids associated with a story to an end user of a head-mounted display device while the end user is reading the story or perceiving the story being read aloud is described. The story may be embodied within a reading object (e.g., a book) in which words of the story may be displayed to the end user. The holographic visual aids may include a predefined character animation that is synchronized to a portion of the story corresponding with the character being animated. A reading pace of a portion of the story may be used to control the playback speed of the predefined character animation in real-time such that the character is perceived to be lip-syncing the story being read aloud. In some cases, an existing book without predetermined AR tags may be augmented with holographic visual aids.
Type: Application
Filed: April 15, 2015
Publication date: August 6, 2015
Inventors: Brian E. Keane, Ben J. Sugden, Robert L. Crocco, Jr., Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Mathew J. Lamb, Alex Aben-Athar Kipman
-
Patent number: 9035970
Abstract: A system for identifying an AR tag and determining a location for a virtual object within an augmented reality environment corresponding with the AR tag is described. In some environments, the location of a virtual object corresponding with a particular AR tag may be determined by identifying a predefined object, determining an orientation and a scale of the predefined object relative to a head-mounted display device (HMD) based on a model of the predefined object, and inferring the location of the virtual object based on the orientation and the scale of the predefined object. In some cases, an identification of the particular AR tag corresponding with the virtual object may be acquired by aggregating and analyzing individual identity determinations from a plurality of HMDs within an augmented reality environment.
Type: Grant
Filed: June 29, 2012
Date of Patent: May 19, 2015
Assignee: Microsoft Technology Licensing, LLC
Inventors: Mathew J. Lamb, Ben J. Sugden, Robert L. Crocco, Jr., Brian E. Keane, Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Alex Aben-Athar Kipman
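The inference step the abstract describes (placing the virtual object from the observed scale of a recognized predefined object) might be sketched, ignoring orientation for simplicity, as a scaled model-space offset. The offset, scale, and positions below are invented for illustration:

```python
# Hypothetical sketch: infer a virtual object's location from the observed
# position and scale of a recognized predefined object, using a known
# model-space offset (orientation omitted for simplicity).

def infer_virtual_location(object_position, object_scale, model_offset):
    """Place the virtual object at the predefined object's position plus the
    model-space offset scaled to the object's observed size."""
    return tuple(p + object_scale * o
                 for p, o in zip(object_position, model_offset))

# A book recognized at (0, 1, 2), observed at half its model size; the
# virtual object belongs 0.2 model units above the book's origin.
print(infer_virtual_location((0.0, 1.0, 2.0), 0.5, (0.0, 0.2, 0.0)))
```

Handling orientation would additionally rotate the offset by the predefined object's observed rotation before adding it.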
-
Patent number: 9035955
Abstract: A system for generating and displaying holographic visual aids associated with a story to an end user of a head-mounted display device while the end user is reading the story or perceiving the story being read aloud is described. The story may be embodied within a reading object (e.g., a book) in which words of the story may be displayed to the end user. The holographic visual aids may include a predefined character animation that is synchronized to a portion of the story corresponding with the character being animated. A reading pace of a portion of the story may be used to control the playback speed of the predefined character animation in real-time such that the character is perceived to be lip-syncing the story being read aloud. In some cases, an existing book without predetermined AR tags may be augmented with holographic visual aids.
Type: Grant
Filed: May 16, 2012
Date of Patent: May 19, 2015
Assignee: Microsoft Technology Licensing, LLC
Inventors: Brian E. Keane, Ben J. Sugden, Robert L. Crocco, Jr., Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Mathew J. Lamb, Alex Aben-Athar Kipman
-
Publication number: 20140152558
Abstract: Methods for controlling an augmented reality environment associated with a head-mounted display device (HMD) are described. In some embodiments, a virtual pointer may be displayed to an end user of the HMD and controlled by the end user using motion and/or orientation information associated with a secondary device (e.g., a mobile phone). Using the virtual pointer, the end user may select and manipulate virtual objects within the augmented reality environment, select real-world objects within the augmented reality environment, and/or control a graphical user interface of the HMD. In some cases, the initial position of the virtual pointer within the augmented reality environment may be determined based on a particular direction in which the end user is gazing and/or a particular object on which the end user is currently focusing or has recently focused.
Type: Application
Filed: November 30, 2012
Publication date: June 5, 2014
Inventors: Tom Salter, Ben J. Sugden, Daniel Deptford, Robert L. Crocco, Jr., Brian E. Keane, Christopher E. Miles, Laura K. Massey, Alex Aben-Athar Kipman
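Determining the pointer's initial position from the gaze direction might be sketched as placing the pointer a default distance along the normalized gaze ray; the default distance and coordinates below are illustrative assumptions:

```python
# Hypothetical sketch: seed the virtual pointer a fixed distance along the
# end user's normalized gaze direction from the HMD's position.

def initial_pointer_position(head_pos, gaze_dir, distance=2.0):
    """Start the pointer along the gaze ray at the given distance."""
    norm = sum(c * c for c in gaze_dir) ** 0.5
    if norm == 0:
        raise ValueError("gaze direction must be non-zero")
    unit = tuple(c / norm for c in gaze_dir)
    return tuple(h + distance * u for h, u in zip(head_pos, unit))

# Eyes at 1.6 m height, gazing straight ahead along +z.
print(initial_pointer_position((0.0, 1.6, 0.0), (0.0, 0.0, 1.0)))
```

Seeding from a focused object instead would simply replace the fixed distance with that object's distance along the same ray.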
-
Publication number: 20140002492
Abstract: Techniques are provided for propagating real world properties into mixed reality images in a see-through, near-eye mixed reality display device. A physical property from the real world may be propagated into a virtual image to be rendered in the display device. Thus, the physics depicted in the mixed reality images may be influenced by a physical property in the environment. Therefore, the user wearing the mixed reality display device is provided a better sense that it is mixed reality, as opposed to simply virtual reality. The mixed reality image may be linked to a real world physical object. This physical object can be movable such as a book, paper, cellular telephone, etc. Forces on the physical object may be propagated into the virtual image.
Type: Application
Filed: June 29, 2012
Publication date: January 2, 2014
Inventors: Mathew J. Lamb, Ben J. Sugden, Robert L. Crocco, Jr., Brian E. Keane, Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Alex Aben-Athar Kipman, Tom G. Salter
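Propagating a force on the physical object into the linked virtual image might be sketched as one integration step of a simple physics update; the explicit-Euler scheme and the numbers below are illustrative assumptions, not the publication's method:

```python
# Hypothetical sketch: a force sensed on a real, movable object (e.g. a book
# being tilted) is applied to the virtual object linked to it, so the virtual
# image obeys the same physics. One explicit-Euler velocity update.

def propagate_force(velocity, force, mass, dt):
    """Advance the linked virtual object's velocity under the sensed force."""
    if mass <= 0:
        raise ValueError("mass must be positive")
    ax, ay, az = (f / mass for f in force)
    vx, vy, vz = velocity
    return (vx + ax * dt, vy + ay * dt, vz + az * dt)

# A downward force on a 0.5 kg virtual object over a 0.1 s frame.
print(propagate_force((0.0, 0.0, 0.0), (0.0, -4.9, 0.0), 0.5, 0.1))
```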
-
Publication number: 20140006026
Abstract: A system for generating one or more enhanced audio signals such that one or more sound levels corresponding with sounds received from one or more sources of sound within an environment may be dynamically adjusted based on contextual information is described. The one or more enhanced audio signals may be generated by a head-mounted display device (HMD) worn by an end user within the environment and outputted to earphones associated with the HMD such that the end user may listen to the one or more enhanced audio signals in real-time. In some cases, each of the one or more sources of sound may correspond with a priority level. The priority level may be dynamically assigned depending on whether the end user of the HMD is focusing on a particular source of sound or has specified a predetermined level of importance corresponding with the particular source of sound.
Type: Application
Filed: June 29, 2012
Publication date: January 2, 2014
Inventors: Mathew J. Lamb, Ben J. Sugden, Robert L. Crocco, Jr., Brian E. Keane, Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Alex Aben-Athar Kipman
-
Publication number: 20140002442
Abstract: A system for allowing a virtual object to interact with other virtual objects across different spaces within an augmented reality (AR) environment and to transition between the different spaces is described. An AR environment may include a plurality of spaces, each comprising a bounded area or volume within the AR environment. In one example, an AR environment may be associated with a three-dimensional world space and a two-dimensional object space corresponding with a page of a book within the AR environment. A virtual object within the AR environment may be assigned to the object space and transition from the two-dimensional object space to the three-dimensional world space upon the detection of a space transition event. In some cases, a dual representation of the virtual object may be used to detect interactions between the virtual object and other virtual objects in both the world space and the object space.
Type: Application
Filed: June 29, 2012
Publication date: January 2, 2014
Inventors: Mathew J. Lamb, Ben J. Sugden, Robert L. Crocco, Jr., Brian E. Keane, Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Alex Aben-Athar Kipman
-
Publication number: 20140002495
Abstract: A system for identifying an AR tag and determining a location for a virtual object within an augmented reality environment corresponding with the AR tag is described. In some environments, including those with viewing obstructions, the identity of the AR tag and the location of a corresponding virtual object may be determined by aggregating individual identity and location determinations from a plurality of head-mounted display devices (HMDs). The virtual object may comprise a shared virtual object that is viewable from each of the plurality of HMDs as existing at a shared location within the augmented reality environment. The shared location may comprise a weighted average of individual location determinations from each of the plurality of HMDs. By aggregating and analyzing individual identity and location determinations, a particular HMD of the plurality of HMDs may display a virtual object without having to identify a corresponding AR tag directly.
Type: Application
Filed: June 29, 2012
Publication date: January 2, 2014
Inventors: Mathew J. Lamb, Ben J. Sugden, Robert L. Crocco, Jr., Brian E. Keane, Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Alex Aben-Athar Kipman