Patents by Inventor Robert L. Crocco, JR.

Robert L. Crocco, JR. has filed for patents to protect the following inventions. This listing includes both pending patent applications and patents that have been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20130154918
    Abstract: Systems, methods, and computer media for estimating user eye gaze are provided. A plurality of images of a user's eye are acquired. At least one image of at least part of the user's field of view is acquired. At least one gaze target area in the user's field of view is determined based on the plurality of images of the user's eye. An enhanced user eye gaze is then estimated by narrowing a database of eye information and corresponding known gaze lines to a subset of the eye information having gaze lines corresponding to a gaze target area. User eye information derived from the images of the user's eye is then compared with the narrowed subset of the eye information, and an enhanced estimated user eye gaze is identified as the known gaze line of a matching eye image.
    Type: Application
    Filed: December 20, 2011
    Publication date: June 20, 2013
    Inventors: Benjamin Isaac Vaught, Robert L. Crocco, JR., John Lewis, Jian Sun, Yichen Wei
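The abstract for publication 20130154918 above describes a two-step idea: narrow a database of eye information to entries whose known gaze lines fall inside the determined gaze target area, then match the current eye data only against that smaller subset. The Python sketch below illustrates that general flow under assumed data structures; the names (EyeRecord, estimate_gaze), the feature representation, and the simple nearest-neighbor matching are hypothetical and are not taken from the patent.

```python
# Illustrative sketch (not the patented implementation): narrow a database of
# eye-feature records with known gaze lines to those inside a detected gaze
# target area, then match the current eye features against that subset.
from dataclasses import dataclass
from typing import List, Tuple
import math

@dataclass
class EyeRecord:
    features: Tuple[float, ...]      # eye-image descriptor (hypothetical)
    gaze_line: Tuple[float, float]   # known gaze direction as (yaw, pitch) in degrees

def _distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def estimate_gaze(database: List[EyeRecord],
                  current_features: Tuple[float, ...],
                  target_area: Tuple[float, float, float, float]) -> Tuple[float, float]:
    """Return the known gaze line of the best-matching record.

    target_area is (yaw_min, yaw_max, pitch_min, pitch_max), the gaze target
    area determined from the field-of-view image.
    """
    yaw_min, yaw_max, pitch_min, pitch_max = target_area
    # Step 1: narrow the database to records whose gaze lines fall in the target area.
    subset = [r for r in database
              if yaw_min <= r.gaze_line[0] <= yaw_max
              and pitch_min <= r.gaze_line[1] <= pitch_max] or database
    # Step 2: compare the current eye features only against the narrowed subset.
    best = min(subset, key=lambda r: _distance(r.features, current_features))
    return best.gaze_line

if __name__ == "__main__":
    db = [EyeRecord((0.10, 0.20), (5.0, -2.0)),
          EyeRecord((0.40, 0.10), (20.0, 3.0)),
          EyeRecord((0.35, 0.15), (22.0, 1.0))]
    print(estimate_gaze(db, (0.38, 0.12), (15.0, 25.0, -5.0, 5.0)))
```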
  • Publication number: 20130083173
    Abstract: Technology is described for providing a virtual spectator experience for a user of a personal A/V apparatus including a near-eye, augmented reality (AR) display. A position volume of an event object participating in an event in a first 3D coordinate system for a first location is received and mapped to a second position volume in a second 3D coordinate system at a second location remote from where the event is occurring. A display field of view of the near-eye AR display at the second location is determined, and real-time 3D virtual data representing the one or more event objects positioned within the display field of view is displayed in the near-eye AR display. A user may select a viewing position from which to view the event. Additionally, virtual data of a second user may be displayed at a position relative to a first user.
    Type: Application
    Filed: June 29, 2012
    Publication date: April 4, 2013
    Inventors: Kevin A. Geisner, Kathryn Stone Perez, Stephen G. Latta, Ben J. Sugden, Benjamin I. Vaught, Alex Aben-Athar Kipman, Michael J. Scavezze, Daniel J. McCulloch, Darren Bennett, Jason Scott, Ryan L. Hastings, Brian E. Keane, Christopher E. Miles, Robert L. Crocco, JR., Mathew J. Lamb
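Publication 20130083173 above turns on two operations: mapping an event object's position from the event site's 3D coordinate system into a second coordinate system at the remote viewer's location, and displaying only the objects that fall within the near-eye display's field of view. The sketch below illustrates those two steps with a simple homogeneous transform and an angular field-of-view test; the function names, transform parameters, and FOV threshold are assumptions for illustration only, not details from the patent.

```python
# Illustrative sketch (hypothetical names and parameters): map an event object's
# position into the remote viewer's coordinate system, then test whether it
# falls inside the near-eye display's field of view.
import numpy as np

def make_transform(scale: float, rotation_deg: float, translation) -> np.ndarray:
    """4x4 homogeneous transform: scale, rotate about Z, then translate."""
    t = np.radians(rotation_deg)
    m = np.eye(4)
    m[:3, :3] = scale * np.array([[np.cos(t), -np.sin(t), 0],
                                  [np.sin(t),  np.cos(t), 0],
                                  [0,          0,         1]])
    m[:3, 3] = translation
    return m

def map_to_remote(position_event, event_to_remote: np.ndarray) -> np.ndarray:
    """Map a 3D point from the event coordinate system to the remote one."""
    p = np.append(np.asarray(position_event, dtype=float), 1.0)
    return (event_to_remote @ p)[:3]

def in_display_fov(point_remote, eye_pos, forward, half_angle_deg=20.0) -> bool:
    """Rough FOV test: the angle between the view direction and the point is small enough."""
    v = np.asarray(point_remote, dtype=float) - np.asarray(eye_pos, dtype=float)
    f = np.asarray(forward, dtype=float)
    cos_angle = v @ f / (np.linalg.norm(v) * np.linalg.norm(f))
    return np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0))) <= half_angle_deg

if __name__ == "__main__":
    xform = make_transform(scale=0.02, rotation_deg=90.0, translation=[1.0, 0.0, 0.0])
    player_remote = map_to_remote([50.0, 10.0, 0.0], xform)
    visible = in_display_fov(player_remote, eye_pos=[0, 0, 1.6], forward=[1, 0, 0])
    print(player_remote, visible)
```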
  • Publication number: 20120206452
    Abstract: Technology is described for providing realistic occlusion between a virtual object displayed by a head mounted, augmented reality display system and a real object visible to the user's eyes through the display. A spatial occlusion in a user field of view of the display is typically a three dimensional occlusion determined based on a three dimensional space mapping of real and virtual objects. An occlusion interface between a real object and a virtual object can be modeled at a level of detail determined based on criteria such as distance within the field of view, display size or position with respect to a point of gaze. Technology is also described for providing three dimensional audio occlusion based on an occlusion between a real object and a virtual object in the user environment.
    Type: Application
    Filed: April 10, 2012
    Publication date: August 16, 2012
    Inventors: Kevin A. Geisner, Brian J. Mount, Stephen G. Latta, Daniel J. McCulloch, Kyungsuk David Lee, Ben J. Sugden, Jeffrey N. Margolis, Kathryn Stone Perez, Sheridan Martin Small, Mark J. Finocchio, Robert L. Crocco, JR.
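Publication 20120206452 above describes modeling the occlusion interface between a real and a virtual object at a level of detail chosen from criteria such as distance from the point of gaze and display size. The sketch below shows one plausible way to express such a selection rule; the LOD tiers, threshold values, and names are hypothetical and not drawn from the patent.

```python
# Illustrative sketch (hypothetical tiers and thresholds): pick a level of detail
# for modeling the occlusion interface between a real and a virtual object,
# based on distance from the point of gaze and apparent (display) size.
from enum import Enum

class OcclusionLOD(Enum):
    BOUNDING_VOLUME = 1   # coarse: occlude using bounding volumes only
    COARSE_MESH = 2       # medium: decimated boundary of the occlusion interface
    FULL_MESH = 3         # fine: detailed boundary where the objects overlap

def select_occlusion_lod(angular_distance_from_gaze_deg: float,
                         apparent_size_deg: float) -> OcclusionLOD:
    """Choose more detail near the point of gaze and for larger on-display objects."""
    if angular_distance_from_gaze_deg < 10.0 and apparent_size_deg > 5.0:
        return OcclusionLOD.FULL_MESH
    if angular_distance_from_gaze_deg < 30.0 and apparent_size_deg > 1.0:
        return OcclusionLOD.COARSE_MESH
    return OcclusionLOD.BOUNDING_VOLUME

if __name__ == "__main__":
    print(select_occlusion_lod(angular_distance_from_gaze_deg=5.0, apparent_size_deg=8.0))
    print(select_occlusion_lod(angular_distance_from_gaze_deg=45.0, apparent_size_deg=0.5))
```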
  • Publication number: 20110316853
    Abstract: Described herein is a telepresence system where a real-time virtual hologram of a user is displayed at a remote display screen and is rendered from a vantage point that is different from the vantage point from which images of the user are captured via a video camera. The virtual hologram is based at least in part upon data acquired from a sensor unit at the location of the user. The sensor unit includes a color video camera that captures 2-D images of the user including surface features of the user. The sensor unit also includes a depth sensor that captures 3-D geometry data indicative of the relative position of surfaces on the user in 3-D space. The virtual hologram is rendered to orient the gaze of the eyes of the virtual hologram towards the eyes of a second user viewing the remote display screen.
    Type: Application
    Filed: June 23, 2010
    Publication date: December 29, 2011
    Applicant: Microsoft Corporation
    Inventors: Avi Bar-Zeev, Christian F. Huitema, Alex Aben-Athar Kipman, Robert L. Crocco, JR., John Allen Tardif, Eric G. Lang
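Publication 20110316853 above hinges on rendering a view of the user from a vantage point different from the capture camera's, using color plus depth data. A minimal way to picture that is depth-image reprojection: back-project depth pixels to 3D points, transform them into a virtual camera pose, and project them again. The sketch below illustrates only that geometric core, with hypothetical camera intrinsics and function names; it is not the patented telepresence pipeline, which also covers surface texturing and gaze orientation.

```python
# Illustrative sketch (hypothetical intrinsics and names): back-project a depth
# image into 3D points, move them into a virtual camera at a different vantage
# point, and project them back to 2D pixel coordinates.
import numpy as np

def backproject(depth: np.ndarray, fx: float, fy: float, cx: float, cy: float) -> np.ndarray:
    """Turn an HxW depth map (meters) into an Nx3 array of camera-space points."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth.reshape(-1)
    x = (u.reshape(-1) - cx) * z / fx
    y = (v.reshape(-1) - cy) * z / fy
    return np.stack([x, y, z], axis=1)

def reproject(points_cam: np.ndarray, cam_to_virtual: np.ndarray,
              fx: float, fy: float, cx: float, cy: float) -> np.ndarray:
    """Transform points into the virtual camera frame and project to pixel coordinates."""
    homo = np.hstack([points_cam, np.ones((points_cam.shape[0], 1))])
    p = (cam_to_virtual @ homo.T).T[:, :3]
    z = np.clip(p[:, 2], 1e-6, None)
    return np.stack([fx * p[:, 0] / z + cx, fy * p[:, 1] / z + cy], axis=1)

if __name__ == "__main__":
    depth = np.full((4, 4), 2.0)                      # toy 4x4 depth map, 2 m away
    pts = backproject(depth, fx=500, fy=500, cx=2, cy=2)
    shift = np.eye(4)
    shift[0, 3] = 0.1                                 # virtual camera 10 cm to the side
    print(reproject(pts, shift, fx=500, fy=500, cx=2, cy=2)[:3])
```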