Patents by Inventor Robert L. Crocco

Robert L. Crocco has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 9105210
    Abstract: A system for identifying an AR tag and determining a location for a virtual object within an augmented reality environment corresponding with the AR tag is described. In some environments, including those with viewing obstructions, the identity of the AR tag and the location of a corresponding virtual object may be determined by aggregating individual identity and location determinations from a plurality of head-mounted display devices (HMDs). The virtual object may comprise a shared virtual object that is viewable from each of the plurality of HMDs as existing at a shared location within the augmented reality environment. The shared location may comprise a weighted average of individual location determinations from each of the plurality of HMDs. By aggregating and analyzing individual identity and location determinations, a particular HMD of the plurality of HMDs may display a virtual object without having to identify a corresponding AR tag directly.
    Type: Grant
    Filed: June 29, 2012
    Date of Patent: August 11, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Mathew J. Lamb, Ben J. Sugden, Robert L. Crocco, Jr., Brian E. Keane, Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Alex Aben-Athar Kipman
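
As a rough illustration of the weighted-average aggregation described in the abstract above, here is a minimal sketch; the per-HMD confidence weights and all names are illustrative assumptions, not taken from the patent:

```python
# Hypothetical sketch: combine per-HMD location estimates for a shared virtual
# object into one confidence-weighted average position, as the abstract describes.

def aggregate_shared_location(estimates):
    """estimates: list of ((x, y, z), weight) pairs, one per HMD."""
    total_weight = sum(w for _, w in estimates)
    if total_weight == 0:
        raise ValueError("no usable location estimates")
    return tuple(
        sum(pos[i] * w for pos, w in estimates) / total_weight
        for i in range(3)
    )

# Three HMDs with different confidence in their view of the AR tag.
reports = [((1.0, 0.0, 2.0), 0.9), ((1.2, 0.1, 2.1), 0.5), ((0.9, -0.1, 1.9), 0.7)]
print(aggregate_shared_location(reports))  # -> roughly (1.01, -0.01, 1.99)
```
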
  • Publication number: 20150220231
    Abstract: A system for generating and displaying holographic visual aids associated with a story to an end user of a head-mounted display device while the end user is reading the story or perceiving the story being read aloud is described. The story may be embodied within a reading object (e.g., a book) in which words of the story may be displayed to the end user. The holographic visual aids may include a predefined character animation that is synchronized to a portion of the story corresponding with the character being animated. A reading pace of a portion of the story may be used to control the playback speed of the predefined character animation in real-time such that the character is perceived to be lip-syncing the story being read aloud. In some cases, an existing book without predetermined AR tags may be augmented with holographic visual aids.
    Type: Application
    Filed: April 15, 2015
    Publication date: August 6, 2015
    Inventors: Brian E. Keane, Ben J. Sugden, Robert L. Crocco, Jr., Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Mathew J. Lamb, Alex Aben-Athar Kipman
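
The reading-pace control described above suggests a simple mapping from observed words per minute to an animation playback rate. A minimal sketch, assuming a baseline authoring pace and clamp range that are purely illustrative:

```python
BASELINE_WPM = 150.0  # assumed words-per-minute at which the animation was authored

def playback_speed(words_read, elapsed_seconds):
    """Return a playback-rate multiplier derived from the observed reading pace."""
    if elapsed_seconds <= 0:
        return 1.0
    observed_wpm = words_read / (elapsed_seconds / 60.0)
    # Clamp so a pause or a skim neither freezes nor races the character animation.
    return max(0.5, min(2.0, observed_wpm / BASELINE_WPM))

print(playback_speed(words_read=50, elapsed_seconds=25))  # -> 0.8 for a slower reader
```
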
  • Patent number: 9035970
    Abstract: A system for identifying an AR tag and determining a location for a virtual object within an augmented reality environment corresponding with the AR tag is described. In some environments, the location of a virtual object corresponding with a particular AR tag may be determined by identifying a predefined object, determining an orientation and a scale of the predefined object relative to a head-mounted display device (HMD) based on a model of the predefined object, and inferring the location of the virtual object based on the orientation and the scale of the predefined object. In some cases, an identification of the particular AR tag corresponding with the virtual object may be acquired by aggregating and analyzing individual identity determinations from a plurality of HMDs within an augmented reality environment.
    Type: Grant
    Filed: June 29, 2012
    Date of Patent: May 19, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Mathew J. Lamb, Ben J. Sugden, Robert L. Crocco, Jr., Brian E. Keane, Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Alex Aben-Athar Kipman
  • Patent number: 9035955
    Abstract: A system for generating and displaying holographic visual aids associated with a story to an end user of a head-mounted display device while the end user is reading the story or perceiving the story being read aloud is described. The story may be embodied within a reading object (e.g., a book) in which words of the story may be displayed to the end user. The holographic visual aids may include a predefined character animation that is synchronized to a portion of the story corresponding with the character being animated. A reading pace of a portion of the story may be used to control the playback speed of the predefined character animation in real-time such that the character is perceived to be lip-syncing the story being read aloud. In some cases, an existing book without predetermined AR tags may be augmented with holographic visual aids.
    Type: Grant
    Filed: May 16, 2012
    Date of Patent: May 19, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Brian E. Keane, Ben J. Sugden, Robert L. Crocco, Jr., Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Mathew J. Lamb, Alex Aben-Athar Kipman
  • Patent number: 9025252
    Abstract: The technology provides for adjusting a see-through, near-eye, mixed reality display device for alignment with an inter-pupillary distance (IPD) of a user by different examples of display adjustment mechanisms. The see-through, near-eye, mixed reality system includes for each eye a display optical system having an optical axis. Each display optical system is positioned to be seen through by a respective eye, and is supported on a respective movable support structure. A display adjustment mechanism attached to the display device also connects with each movable support structure for moving the structure. A determination is automatically made as to whether the display device is aligned with an IPD of a user. If not aligned, one or more adjustment values for a position of at least one of the display optical systems are automatically determined. The display adjustment mechanism moves the at least one display optical system in accordance with the adjustment values.
    Type: Grant
    Filed: August 30, 2011
    Date of Patent: May 5, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: John R. Lewis, Kathryn Stone Perez, Robert L. Crocco, Alex Aben-Athar Kipman
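
The automatic alignment check above can be illustrated with a small sketch. The millimeter units, the tolerance, and the even split of the correction between the two display optical systems are assumptions for illustration:

```python
ALIGNMENT_TOLERANCE_MM = 1.0  # assumed acceptable misalignment

def ipd_adjustments(user_ipd_mm, left_axis_mm, right_axis_mm):
    """Return (left_shift, right_shift) in mm, or (0, 0) if already aligned.

    Axis positions are horizontal offsets from the device centerline;
    left is negative, right is positive.
    """
    current_separation = right_axis_mm - left_axis_mm
    error = user_ipd_mm - current_separation
    if abs(error) <= ALIGNMENT_TOLERANCE_MM:
        return (0.0, 0.0)
    # Split the correction evenly: move each optical system half the error.
    return (-error / 2.0, error / 2.0)

print(ipd_adjustments(user_ipd_mm=64.0, left_axis_mm=-30.0, right_axis_mm=30.0))
# -> (-2.0, 2.0): move each display optical system 2 mm outward
```
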
  • Patent number: 8928659
    Abstract: Described herein is a telepresence system where a real-time virtual hologram of a user is displayed at a remote display screen and is rendered from a vantage point that is different than the vantage point from which images of the user are captured via a video camera. The virtual hologram is based at least in part upon data acquired from a sensor unit at the location of the user. The sensor unit includes a color video camera that captures 2-D images of the user including surface features of the user. The sensor unit also includes a depth sensor that captures 3-D geometry data indicative of the relative position of surfaces on the user in 3-D space. The virtual hologram is rendered to orient the gaze of the virtual hologram's eyes toward the eyes of a second user viewing the remote display screen.
    Type: Grant
    Filed: June 23, 2010
    Date of Patent: January 6, 2015
    Assignee: Microsoft Corporation
    Inventors: Avi Bar-Zeev, Christian F. Huitema, Alex Aben-Athar Kipman, Robert L. Crocco, Jr., John Allen Tardif, Eric G. Lang
  • Patent number: 8928558
    Abstract: The technology provides various embodiments for gaze determination within a see-through, near-eye, mixed reality display device. In some embodiments, the boundaries of a gaze detection coordinate system can be determined from a spatial relationship between a user eye and gaze detection elements such as illuminators and at least one light sensor positioned on a support structure such as an eyeglasses frame. The gaze detection coordinate system allows for determination of a gaze vector from each eye based on data representing glints on the user eye, or a combination of image and glint data. A point of gaze may be determined in a three-dimensional user field of view including real and virtual objects. The spatial relationship between the gaze detection elements and the eye may be checked and may trigger a re-calibration of training data sets if the boundaries of the gaze detection coordinate system have changed.
    Type: Grant
    Filed: July 12, 2013
    Date of Patent: January 6, 2015
    Assignee: Microsoft Corporation
    Inventors: John R. Lewis, Yichen Wei, Robert L. Crocco, Benjamin I. Vaught, Alex Aben-Athar Kipman, Kathryn Stone Perez
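
One plausible reading of determining a point of gaze from per-eye gaze vectors is triangulation: find where the two gaze rays pass closest to each other. A minimal sketch under that assumption (the patented gaze-detection pipeline itself is not reproduced here):

```python
import numpy as np

def point_of_gaze(origin_l, dir_l, origin_r, dir_r):
    """Midpoint of the shortest segment between the two gaze rays."""
    d1 = dir_l / np.linalg.norm(dir_l)
    d2 = dir_r / np.linalg.norm(dir_r)
    w0 = origin_l - origin_r
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b
    if abs(denom) < 1e-9:
        return None  # parallel gaze rays: no well-defined convergence point
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    return (origin_l + s * d1 + origin_r + t * d2) / 2.0

left_eye, right_eye = np.array([-0.03, 0.0, 0.0]), np.array([0.03, 0.0, 0.0])
print(point_of_gaze(left_eye, np.array([0.1, 0.0, 1.0]),
                    right_eye, np.array([-0.1, 0.0, 1.0])))
# -> approximately [0, 0, 0.3]: the gaze converges ~30 cm in front of the eyes
```
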
  • Publication number: 20140368532
    Abstract: A method and apparatus for the creation of a perspective-locked virtual object in world space are described. The virtual object may be consumed by another user with a consumption device at a location, position, and orientation which is the same as, or proximate to, the location, position, and orientation where the virtual object is created.
    Type: Application
    Filed: June 18, 2013
    Publication date: December 18, 2014
    Inventors: Brian E. Keane, Ben J. Sugden, Robert L. Crocco, Jr., Daniel Deptford, Tom G. Salter, Laura K. Massey, Alex Aben-Athar Kipman, Peter Tobias Kinnebrew, Nicholas Ferianc Kamuda
  • Publication number: 20140368537
    Abstract: A system and method are disclosed for displaying virtual objects in a mixed reality environment including shared virtual objects and private virtual objects. Multiple users can collaborate together in interacting with the shared virtual objects. A private virtual object may be visible to a single user. In examples, private virtual objects of respective users may facilitate the users' collaborative interaction with one or more shared virtual objects.
    Type: Application
    Filed: June 18, 2013
    Publication date: December 18, 2014
    Inventors: Tom G. Salter, Ben J. Sugden, Daniel Deptford, Robert L. Crocco, Jr., Brian E. Keane, Laura K. Massey, Alex Aben-Athar Kipman, Peter Tobias Kinnebrew, Nicholas Ferianc Kamuda
  • Publication number: 20140368535
    Abstract: A system and method are disclosed for displaying virtual objects in a mixed reality environment in a way that is optimal and most comfortable for a user to interact with the virtual objects. When a user is not focused on the virtual object, which may be a heads-up display, or HUD, the HUD may remain body locked to the user. As such, the user may explore and interact with a mixed reality environment presented by the head mounted display device without interference from the HUD. When a user wishes to view and/or interact with the HUD, the user may look at the HUD. At this point, the HUD may change from a body locked virtual object to a world locked virtual object. The user is then able to view and interact with the HUD from different positions and perspectives of the HUD.
    Type: Application
    Filed: June 18, 2013
    Publication date: December 18, 2014
    Inventors: Tom G. Salter, Ben J. Sugden, Daniel Deptford, Robert L. Crocco, Jr., Brian E. Keane, Laura K. Massey, Alex Aben-Athar Kipman, Peter Tobias Kinnebrew, Nicholas Ferianc Kamuda
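
A minimal sketch of the body-locked/world-locked switch described above; the head-offset constant and the omission of any dismissal logic are illustrative simplifications:

```python
HUD_OFFSET = (0.0, -0.2, 1.5)  # assumed offset: below and in front of the view

class Hud:
    def __init__(self):
        self.world_locked = False
        self.position = None

    def update(self, head_position, user_gazing_at_hud):
        if user_gazing_at_hud:
            # World locked: the pose stops tracking the head, so the user can
            # walk around and inspect the HUD from different perspectives.
            self.world_locked = True
        if not self.world_locked:
            # Body locked: keep the HUD at a fixed offset from the head.
            self.position = tuple(h + o for h, o in zip(head_position, HUD_OFFSET))
        return self.position

hud = Hud()
print(hud.update((0.0, 1.7, 0.0), user_gazing_at_hud=False))  # follows the head
print(hud.update((0.5, 1.7, 0.2), user_gazing_at_hud=True))   # freezes in place
```
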
  • Publication number: 20140372957
    Abstract: A head mounted display allows user selection of a virtual object through multi-step focusing by the user. Focus on the selectable object is determined and then a validation object is displayed. When user focus moves to the validation object, a timeout determines that a selection of the validation object, and thus of the selectable object, has occurred. The technology can be used in see-through head mounted displays to allow a user to effectively navigate an environment with a multitude of virtual objects without unintended selections.
    Type: Application
    Filed: June 18, 2013
    Publication date: December 18, 2014
    Inventors: Brian E. Keane, Ben J. Sugden, Robert L. Crocco, Jr., Daniel Deptford, Tom G. Salter, Laura K. Massey, Alex Aben-Athar Kipman, Peter Tobias Kinnebrew, Nicholas Ferianc Kamuda
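
The two-step dwell selection above maps naturally onto a small state machine. A minimal sketch, with an assumed timeout and assumed object identifiers:

```python
import time

VALIDATION_TIMEOUT_S = 1.0  # assumed dwell time needed to confirm a selection

class DwellSelector:
    """Two-step selection: focus reveals a validation object; dwelling on it selects."""

    def __init__(self):
        self.validation_shown = False
        self.dwell_start = None

    def update(self, focused_object):
        """Call each frame with the focused object; returns a selection or None."""
        if focused_object == "selectable":
            self.validation_shown = True   # step 1: reveal the validation object
            self.dwell_start = None
        elif focused_object == "validation" and self.validation_shown:
            if self.dwell_start is None:
                self.dwell_start = time.monotonic()
            elif time.monotonic() - self.dwell_start >= VALIDATION_TIMEOUT_S:
                self.validation_shown = False
                return "selectable"        # step 2: dwell elapsed, selection confirmed
        else:
            self.validation_shown = False  # focus left both objects: reset
            self.dwell_start = None
        return None

selector = DwellSelector()
selector.update("selectable")          # validation object appears
selector.update("validation")          # dwell timer starts
time.sleep(VALIDATION_TIMEOUT_S)
print(selector.update("validation"))   # -> "selectable"
```
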
  • Publication number: 20140368534
    Abstract: A see-through head mounted display apparatus includes code performing a method of choosing an optimal viewing location and perspective for shared-view virtual objects rendered for multiple users in a common environment. Multiple objects and multiple users are taken into account in determining the optimal, common viewing location. The technology allows each user to have a common view of the relative position of the object in the environment.
    Type: Application
    Filed: June 18, 2013
    Publication date: December 18, 2014
    Inventors: Tom G. Salter, Ben J. Sugden, Daniel Deptford, Robert L. Crocco, Jr., Brian E. Keane, Laura K. Massey, Alex Aben-Athar Kipman, Peter Tobias Kinnebrew, Nicholas Ferianc Kamuda
  • Publication number: 20140368533
    Abstract: A see-through head mounted display apparatus includes a display and a processor. The processor determines geo-located positions of points of interest within a field of view and generates markers indicating that information regarding an associated real world object is available to the user. Markers are rendered in the display relative to the geo-located position and the field of view of the user. When a user selects a marker through a user gesture, the device displays a near-field virtual object having a visual tether to the marker simultaneously with the marker. The user may interact with the marker to view, add or delete information associated with the point of interest.
    Type: Application
    Filed: June 18, 2013
    Publication date: December 18, 2014
    Inventors: Tom G. Salter, Ben J. Sugden, Daniel Deptford, Robert L. Crocco, Jr., Brian E. Keane, Laura K. Massey, Alex Aben-Athar Kipman, Peter Tobias Kinnebrew, Nicholas Ferianc Kamuda
  • Publication number: 20140347391
    Abstract: A system and method are disclosed for displaying virtual objects in a mixed reality environment in a way that is optimal and most comfortable for a user to interact with the virtual objects. When a user is moving through the mixed reality environment, the virtual objects may remain world-locked, so that the user can move around and explore the virtual objects from different perspectives. When the user is motionless in the mixed reality environment, the virtual objects may rotate to face the user so that the user can easily view and interact with the virtual objects.
    Type: Application
    Filed: May 23, 2013
    Publication date: November 27, 2014
    Inventors: Brian E. Keane, Ben J. Sugden, Robert L. Crocco, Jr., Daniel Deptford, Tom G. Salter, Laura K. Massey, Alex Aben-Athar Kipman, Peter Tobias Kinnebrew, Nicholas Ferianc Kamuda
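
A minimal sketch of the motionless-user behavior above, reduced to a yaw-only "billboard" rotation; the speed threshold and the 2-D treatment are illustrative assumptions:

```python
import math

SPEED_THRESHOLD = 0.05  # assumed m/s below which the user counts as motionless

def object_yaw(object_pos, user_pos, user_speed, current_yaw):
    """Yaw (radians, about the vertical axis) the virtual object should adopt."""
    if user_speed > SPEED_THRESHOLD:
        return current_yaw  # moving: leave the object world-locked as-is
    dx = user_pos[0] - object_pos[0]
    dz = user_pos[2] - object_pos[2]
    return math.atan2(dx, dz)  # motionless: rotate the object to face the user

print(object_yaw((0, 0, 0), (1, 0, 1), user_speed=0.0, current_yaw=0.0))
# -> 0.785... (45 degrees): the object turns toward the user
```
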
  • Patent number: 8752963
    Abstract: The technology provides various embodiments for controlling brightness of a see-through, near-eye, mixed reality display device based on the light intensity of what the user is gazing at. The opacity of the display can be altered, such that external light is reduced if the wearer is looking at a bright object. The wearer's pupil size may be determined and used to adjust the brightness used to display images, as well as the opacity of the display. A suitable balance between opacity and brightness used to display images may be determined that allows real and virtual objects to be seen clearly, while not causing damage or discomfort to the wearer's eyes.
    Type: Grant
    Filed: November 4, 2011
    Date of Patent: June 17, 2014
    Assignee: Microsoft Corporation
    Inventors: Daniel J. McCulloch, Ryan L. Hastings, Kevin A. Geisner, Robert L. Crocco, Alexandru O. Balan, Derek L. Knee, Michael J. Scavezze, Stephen G. Latta, Brian J. Mount
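
A minimal sketch of the brightness/opacity balance described above; the linear mappings, luminance scale, and pupil-size heuristic are all illustrative assumptions, not values from the patent:

```python
MAX_COMFORT_NITS = 400.0  # assumed ceiling for comfortable display brightness

def display_settings(gazed_luminance_nits, pupil_diameter_mm):
    """Return (display opacity in 0..1, display brightness in nits)."""
    # Brighter gazed region -> more opacity, so external light is reduced.
    opacity = min(1.0, gazed_luminance_nits / 1000.0)
    # Brighter scenes need brighter virtual imagery to remain visible...
    brightness = min(MAX_COMFORT_NITS, gazed_luminance_nits * 0.5)
    # ...but a constricted pupil suggests glare, so back the brightness off.
    if pupil_diameter_mm < 3.0:
        brightness *= 0.8
    return opacity, brightness

print(display_settings(gazed_luminance_nits=800.0, pupil_diameter_mm=2.5))
# -> (0.8, 320.0): mostly opaque, brightness held inside the comfort range
```
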
  • Publication number: 20140152558
    Abstract: Methods for controlling an augmented reality environment associated with a head-mounted display device (HMD) are described. In some embodiments, a virtual pointer may be displayed to an end user of the HMD and controlled by the end user using motion and/or orientation information associated with a secondary device (e.g., a mobile phone). Using the virtual pointer, the end user may select and manipulate virtual objects within the augmented reality environment, select real-world objects within the augmented reality environment, and/or control a graphical user interface of the HMD. In some cases, the initial position of the virtual pointer within the augmented reality environment may be determined based on a particular direction in which the end user is gazing and/or a particular object on which the end user is currently focusing or has recently focused.
    Type: Application
    Filed: November 30, 2012
    Publication date: June 5, 2014
    Inventors: Tom Salter, Ben J. Sugden, Daniel Deptford, Robert L. Crocco, Jr., Brian E. Keane, Christopher E. Miles, Laura K. Massey, Alex Aben-Athar Kipman
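
The initial pointer placement described above can be sketched as a gaze ray cast against simple sphere proxies for scene objects; the default depth and the sphere-proxy scene model are illustrative assumptions:

```python
DEFAULT_POINTER_DEPTH = 2.0  # assumed meters along the gaze ray when nothing is hit

def initial_pointer_position(gaze_origin, gaze_dir, objects):
    """gaze_dir is assumed unit length; objects are (center, radius) sphere proxies."""
    best_t = None
    for center, radius in objects:
        oc = [c - o for c, o in zip(center, gaze_origin)]
        # Distance along the ray at which it passes closest to the sphere center.
        t = sum(a * b for a, b in zip(oc, gaze_dir))
        closest = [o + t * d for o, d in zip(gaze_origin, gaze_dir)]
        dist2 = sum((c - p) ** 2 for c, p in zip(center, closest))
        if t > 0 and dist2 <= radius ** 2 and (best_t is None or t < best_t):
            best_t = t  # the ray passes through this proxy; keep the nearest
    t = best_t if best_t is not None else DEFAULT_POINTER_DEPTH
    return tuple(o + t * d for o, d in zip(gaze_origin, gaze_dir))

# Gazing straight ahead at an object 1.5 m away places the pointer on it.
print(initial_pointer_position((0, 0, 0), (0, 0, 1), [((0, 0, 1.5), 0.3)]))
```
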
  • Publication number: 20140002492
    Abstract: Techniques are provided for propagating real world properties into mixed reality images in a see-through, near-eye mixed reality display device. A physical property from the real world may be propagated into a virtual image to be rendered in the display device. Thus, the physics depicted in the mixed reality images may be influenced by a physical property in the environment. Therefore, the user wearing the mixed reality display device is provided a better sense that it is mixed reality, as opposed to simply virtual reality. The mixed reality image may be linked to a real world physical object. This physical object can be movable, such as a book, paper, cellular telephone, etc. Forces on the physical object may be propagated into the virtual image.
    Type: Application
    Filed: June 29, 2012
    Publication date: January 2, 2014
    Inventors: Mathew J. Lamb, Ben J. Sugden, Robert L. Crocco, Jr., Brian E. Keane, Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Alex Aben-Athar Kipman, Tom G. Salter
  • Publication number: 20140002442
    Abstract: A system for allowing a virtual object to interact with other virtual objects across different spaces within an augmented reality (AR) environment and to transition between the different spaces is described. An AR environment may include a plurality of spaces, each comprising a bounded area or volume within the AR environment. In one example, an AR environment may be associated with a three-dimensional world space and a two-dimensional object space corresponding with a page of a book within the AR environment. A virtual object within the AR environment may be assigned to the object space and transition from the two-dimensional object space to the three-dimensional world space upon the detection of a space transition event. In some cases, a dual representation of the virtual object may be used to detect interactions between the virtual object and other virtual objects in both the world space and the object space.
    Type: Application
    Filed: June 29, 2012
    Publication date: January 2, 2014
    Inventors: Mathew J. Lamb, Ben J. Sugden, Robert L. Crocco, Jr., Brian E. Keane, Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Alex Aben-Athar Kipman
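
A minimal sketch of the dual representation described above: the object keeps both a 2-D page-space pose and a 3-D world-space pose, and a space-transition event switches which one drives rendering. All names are illustrative:

```python
class DualSpaceObject:
    """A virtual object with poses in both the 2-D object space and 3-D world space."""

    def __init__(self, page_uv, world_xyz):
        self.page_uv = page_uv      # 2-D pose on the book page (object space)
        self.world_xyz = world_xyz  # 3-D pose in the room (world space)
        self.space = "object"       # which space currently drives rendering

    def on_transition_event(self):
        # e.g. the object crosses the page boundary: switch rendering spaces.
        self.space = "world" if self.space == "object" else "object"

    def render_pose(self):
        return self.world_xyz if self.space == "world" else self.page_uv

dragon = DualSpaceObject(page_uv=(0.4, 0.6), world_xyz=(1.0, 0.8, 2.0))
print(dragon.render_pose())   # -> (0.4, 0.6): rendered on the page
dragon.on_transition_event()  # space transition event detected
print(dragon.render_pose())   # -> (1.0, 0.8, 2.0): rendered in the room
```
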
  • Publication number: 20140006026
    Abstract: A system for generating one or more enhanced audio signals such that one or more sound levels corresponding with sounds received from one or more sources of sound within an environment may be dynamically adjusted based on contextual information is described. The one or more enhanced audio signals may be generated by a head-mounted display device (HMD) worn by an end user within the environment and outputted to earphones associated with the HMD such that the end user may listen to the one or more enhanced audio signals in real-time. In some cases, each of the one or more sources of sound may correspond with a priority level. The priority level may be dynamically assigned depending on whether the end user of the HMD is focusing on a particular source of sound or has specified a predetermined level of importance corresponding with the particular source of sound.
    Type: Application
    Filed: June 29, 2012
    Publication date: January 2, 2014
    Inventors: Mathew J. Lamb, Ben J. Sugden, Robert L. Crocco, Jr., Brian E. Keane, Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Alex Aben-Athar Kipman
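
The priority-driven mixing described above might be sketched as deriving per-source gains from dynamically boosted priorities; the gain curve and priority values are illustrative assumptions:

```python
def source_gains(sources, focused_id=None):
    """sources: dict of source_id -> base priority (higher = more important).
    Returns a dict of source_id -> linear gain in (0, 1]."""
    priorities = dict(sources)
    if focused_id in priorities:
        priorities[focused_id] += 2  # dynamic boost while the user focuses on it
    top = max(priorities.values())
    # Attenuate each source by how far its priority falls below the top one.
    return {sid: 1.0 / (1 + top - p) for sid, p in priorities.items()}

mix = source_gains({"speaker": 3, "tv": 2, "street_noise": 0}, focused_id="speaker")
print(mix)  # -> speaker at 1.0, tv at 0.25, street_noise at ~0.17
```
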
  • Publication number: 20140002496
    Abstract: A system for identifying an AR tag and determining a location for a virtual object within an augmented reality environment corresponding with the AR tag is described. In some environments, the location of a virtual object corresponding with a particular AR tag may be determined by identifying a predefined object, determining an orientation and a scale of the predefined object relative to a head-mounted display device (HMD) based on a model of the predefined object, and inferring the location of the virtual object based on the orientation and the scale of the predefined object. In some cases, an identification of the particular AR tag corresponding with the virtual object may be acquired by aggregating and analyzing individual identity determinations from a plurality of HMDs within an augmented reality environment.
    Type: Application
    Filed: June 29, 2012
    Publication date: January 2, 2014
    Inventors: Mathew J. Lamb, Ben J. Sugden, Robert L. Crocco, Jr., Brian E. Keane, Christopher E. Miles, Kathryn Stone Perez, Laura K. Massey, Alex Aben-Athar Kipman