Patents by Inventor Victor Adrian Prisacariu

Victor Adrian Prisacariu has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11957978
    Abstract: The present disclosure describes approaches to camera re-localization that improve the speed and accuracy with which pose estimates are generated by fusing output of a computer vision algorithm with data from a prior model of a geographic area in which a user is located. For each candidate pose estimate output by the algorithm, a game server maps the estimate to a position on the prior model (e.g., a specific cell on a heatmap-style histogram) and retrieves a probability corresponding to the mapped position. A data fusion module fuses, for each candidate pose estimate, a confidence score generated by the computer vision algorithm with the location probability from the prior model to generate an updated confidence score. If an updated confidence score meets or exceeds a score threshold, a re-localization module initiates a location-based application (e.g., a parallel reality game) based on the associated candidate pose estimate.
    Type: Grant
    Filed: March 3, 2022
    Date of Patent: April 16, 2024
    Assignee: NIANTIC, INC.
    Inventors: Ben Benfold, Victor Adrian Prisacariu
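The fusion step described in the abstract above can be sketched as follows. All values here are illustrative, not taken from the patent: the prior grid, cell size, confidence scores, and threshold are hypothetical, and the prior is modeled as a simple 2D histogram of location probabilities.

```python
import numpy as np

# Hypothetical candidate pose estimates: (x, y) positions with
# confidence scores from a computer vision re-localization algorithm.
candidates = [
    {"pos": (12.0, 3.0), "confidence": 0.70},
    {"pos": (45.0, 9.0), "confidence": 0.65},
]

# Prior model: a heatmap-style histogram over the geographic area.
# Each cell holds the probability that the user is in that cell.
CELL_SIZE = 10.0
prior = np.array([
    [0.05, 0.10, 0.05, 0.02, 0.01],
    [0.10, 0.30, 0.15, 0.05, 0.02],
])

SCORE_THRESHOLD = 0.05

def fused_score(candidate):
    """Map the pose estimate onto a prior-model cell and fuse the
    CV confidence with the cell's location probability."""
    x, y = candidate["pos"]
    col = int(x // CELL_SIZE)
    row = int(y // CELL_SIZE)
    location_prob = prior[row, col]
    return candidate["confidence"] * location_prob

for c in candidates:
    score = fused_score(c)
    if score >= SCORE_THRESHOLD:
        # A re-localization module would initiate the location-based
        # application with this candidate pose.
        print(f"re-localize at {c['pos']} (score={score:.3f})")
```

A multiplicative fusion is the simplest choice consistent with the abstract's description; the patent may use a different fusion function.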
  • Publication number: 20240071018
    Abstract: An augmented reality (“AR”) device applies smooth correction methods to correct the location of the virtual objects presented to a user. The AR device may apply an angular threshold to determine whether a virtual object can be moved from an original location to a target location. An angular threshold is a maximum angle by which a line from the AR device to the virtual object can change within a timestep. Similarly, the AR device may apply a motion threshold, which is a maximum on the distance that a virtual object's location can be corrected based on the motion of the virtual object. Furthermore, the AR device may apply a pixel threshold to the correction of the virtual object's location. A pixel threshold is a maximum on the distance that a pixel projection of the virtual object can change based on the virtual object's change in location.
    Type: Application
    Filed: November 8, 2023
    Publication date: February 29, 2024
    Inventors: Ben Benfold, Victor Adrian Prisacariu
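The motion and angular thresholds described above can be sketched roughly as below. The limits, 2D geometry, and function names are illustrative (the pixel threshold is omitted for brevity), and the angular clamp shown here is only approximate, since scaling the step does not scale the subtended angle exactly.

```python
import math

# Illustrative per-timestep limits (not taken from the patent).
MAX_ANGLE_RAD = math.radians(2.0)   # angular threshold
MAX_MOVE_M = 0.05                   # motion threshold (metres)

def clamp_correction(device_pos, obj_pos, target_pos):
    """Move obj_pos toward target_pos, but no further than the motion
    threshold and no more than the angular threshold as seen from the
    device. 2D positions for clarity."""
    dx = target_pos[0] - obj_pos[0]
    dy = target_pos[1] - obj_pos[1]
    dist = math.hypot(dx, dy)
    if dist == 0:
        return obj_pos
    # Motion threshold: cap the step length.
    step = min(dist, MAX_MOVE_M)
    candidate = (obj_pos[0] + dx / dist * step,
                 obj_pos[1] + dy / dist * step)
    # Angular threshold: cap the change in bearing from the device.
    a0 = math.atan2(obj_pos[1] - device_pos[1], obj_pos[0] - device_pos[0])
    a1 = math.atan2(candidate[1] - device_pos[1], candidate[0] - device_pos[0])
    dang = (a1 - a0 + math.pi) % (2 * math.pi) - math.pi
    if abs(dang) > MAX_ANGLE_RAD:
        # Shrink the step further (approximate for small angles).
        scale = MAX_ANGLE_RAD / abs(dang)
        candidate = (obj_pos[0] + dx / dist * step * scale,
                     obj_pos[1] + dy / dist * step * scale)
    return candidate
```

Applying the clamp every frame moves the object smoothly toward its corrected location instead of snapping it there.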
  • Publication number: 20240046564
    Abstract: The present disclosure describes approaches to camera re-localization that improve the accuracy of re-localization determinations by performing simulated consistency checks for three-dimensional maps. Client devices associated with users of a location-based application transmit image scans to a game server, which divides the received scan data into mapping sets used to generate 3D maps of environments and validation sets used to test the accuracy of the maps. To perform the testing, the game server identifies query scans in the validation set having GPS coordinates within a threshold distance of the mapped location and uses the 3D map of the environment to generate a pose estimate for each frame. The results of the localization queries are analyzed by comparing differences between the localization pose estimates and differences between the poses of independent pairs of frames in the query scan to evaluate the accuracy of the 3D map.
    Type: Application
    Filed: August 2, 2022
    Publication date: February 8, 2024
    Inventors: Ben Benfold, Victor Adrian Prisacariu
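The consistency check described above can be sketched as a comparison of pairwise pose differences. This simplified version compares only positions; a full version would compare SE(3) poses, and the function name and inputs are hypothetical.

```python
import numpy as np

def pairwise_consistency_error(localized_positions, scan_positions):
    """Compare differences between localization pose estimates against
    differences between the same frame pairs in the query scan's own
    trajectory. A small mean error suggests the 3D map localizes the
    scan's frames consistently. Inputs are Nx3 position arrays."""
    loc = np.asarray(localized_positions, dtype=float)
    scan = np.asarray(scan_positions, dtype=float)
    errors = []
    n = len(loc)
    for i in range(n):
        for j in range(i + 1, n):
            d_loc = loc[j] - loc[i]    # difference per the 3D map
            d_scan = scan[j] - scan[i] # difference per the scan itself
            errors.append(np.linalg.norm(d_loc - d_scan))
    return float(np.mean(errors))
```

Because only relative differences are compared, a map that localizes every frame with a consistent offset still scores an error of zero, which is the point of using independent frame pairs rather than absolute poses.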
  • Publication number: 20240033631
    Abstract: A method of determining a position for a virtual object is described. A location of a client device is determined, and, based on the determined location, a set of map segments is retrieved. A virtual object is determined to be displayed on the client device. Relation vectors between the virtual object and each map segment of the retrieved set of map segments are obtained. Each relation vector is weighted based on object parameters of the virtual object. A position to display the virtual object is determined based on the weighted relation vectors. The virtual object is provided for display on the client device at the determined position.
    Type: Application
    Filed: July 29, 2022
    Publication date: February 1, 2024
    Inventors: Ben Benfold, Victor Adrian Prisacariu, Daniel Knoblauch
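The weighted relation vectors described above can be sketched as a weighted average of per-segment position suggestions. The segment data, weights, and function name here are hypothetical; the patent does not specify this exact combination rule.

```python
import numpy as np

# Hypothetical map segments near the client's location, each with a
# relation vector toward the virtual object and a weight derived from
# the object's parameters (e.g. size, surface affinity).
segments = [
    {"anchor": np.array([0.0, 0.0]), "relation": np.array([1.0, 2.0]), "weight": 0.7},
    {"anchor": np.array([4.0, 0.0]), "relation": np.array([-1.0, 2.0]), "weight": 0.3},
]

def place_object(segments):
    """Each segment suggests a position (anchor + relation vector);
    combine the suggestions as a weighted average."""
    total_w = sum(s["weight"] for s in segments)
    pos = sum(s["weight"] * (s["anchor"] + s["relation"]) for s in segments)
    return pos / total_w
```

With the sample data, the first segment suggests (1, 2) and the second (3, 2), so the weighted result lands closer to the more heavily weighted suggestion.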
  • Publication number: 20230410349
    Abstract: A method or a system for map-free visual relocalization of a device. The system obtains a reference image of an environment captured by a reference camera at a known reference pose. The system also receives a query image taken by a camera of the device. The system determines a relative pose of the device's camera with respect to the reference camera based in part on the reference image and the query image. The system determines the pose of the device's camera in the environment based on the reference pose and the relative pose.
    Type: Application
    Filed: June 20, 2023
    Publication date: December 21, 2023
    Inventors: Eduardo Henrique Arnold, Jamie Michael Wynn, Guillermo Garcia-Hernando, Sara Alexandra Gomes Vicente, Aron Monszpart, Victor Adrian Prisacariu, Daniyar Turmukhambetov, Eric Brachmann, Axel Barroso-Laguna
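The final step described above, determining the query camera's pose from the reference pose and the relative pose, is a composition of rigid transforms. A minimal sketch with 4x4 homogeneous matrices (the poses here are made-up values):

```python
import numpy as np

def make_pose(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation R and a
    translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Known pose of the reference camera in the environment frame.
ref_pose = make_pose(np.eye(3), [2.0, 0.0, 1.0])

# Relative pose of the query camera w.r.t. the reference camera, as
# estimated from the reference image and query image.
rel_pose = make_pose(np.eye(3), [0.5, 0.0, 0.0])

# Query camera pose in the environment: chain the two transforms.
query_pose = ref_pose @ rel_pose
```

This is what makes the approach "map-free": a single posed reference image plus a relative-pose estimate replaces a prebuilt 3D map.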
  • Patent number: 11847750
    Abstract: An augmented reality (“AR”) device applies smooth correction methods to correct the location of the virtual objects presented to a user. The AR device may apply an angular threshold to determine whether a virtual object can be moved from an original location to a target location. An angular threshold is a maximum angle by which a line from the AR device to the virtual object can change within a timestep. Similarly, the AR device may apply a motion threshold, which is a maximum on the distance that a virtual object's location can be corrected based on the motion of the virtual object. Furthermore, the AR device may apply a pixel threshold to the correction of the virtual object's location. A pixel threshold is a maximum on the distance that a pixel projection of the virtual object can change based on the virtual object's change in location.
    Type: Grant
    Filed: May 18, 2022
    Date of Patent: December 19, 2023
    Assignee: NIANTIC, INC.
    Inventors: Ben Benfold, Victor Adrian Prisacariu
  • Publication number: 20230377278
    Abstract: An augmented reality (“AR”) device applies smooth correction methods to correct the location of the virtual objects presented to a user. The AR device may apply an angular threshold to determine whether a virtual object can be moved from an original location to a target location. An angular threshold is a maximum angle by which a line from the AR device to the virtual object can change within a timestep. Similarly, the AR device may apply a motion threshold, which is a maximum on the distance that a virtual object's location can be corrected based on the motion of the virtual object. Furthermore, the AR device may apply a pixel threshold to the correction of the virtual object's location. A pixel threshold is a maximum on the distance that a pixel projection of the virtual object can change based on the virtual object's change in location.
    Type: Application
    Filed: May 18, 2022
    Publication date: November 23, 2023
    Inventors: Ben Benfold, Victor Adrian Prisacariu
  • Publication number: 20230360241
    Abstract: A depth estimation module may receive a reference image and a set of source images of an environment. The depth module may receive image features of the reference image and the set of source images. The depth module may generate a 4D feature volume that includes the image features and metadata associated with the reference image and set of source images. The image features and the metadata may be arranged in the feature volume based on relative pose distances between the reference image and the set of source images. The depth module may reduce the 4D feature volume to generate a 3D cost volume. The depth module may apply a depth estimation model to the 3D cost volume and data based on the reference image to generate a two-dimensional (2D) depth map for the reference image.
    Abstract: A depth estimation module may receive a reference image and a set of source images of an environment. The depth module may receive image features of the reference image and the set of source images. The depth module may generate a 4D feature volume that includes the image features and metadata associated with the reference image and set of source images. The image features and the metadata may be arranged in the feature volume based on relative pose distances between the reference image and the set of source images. The depth module may reduce the 4D feature volume to generate a 3D cost volume. The depth module may apply a depth estimation model to the 3D cost volume and data based on the reference image to generate a two-dimensional (2D) depth map for the reference image.
    Type: Application
    Filed: May 5, 2023
    Publication date: November 9, 2023
    Inventors: Mohamed Sayed, John Gibson, James Watson, Victor Adrian Prisacariu, Michael David Firman, Clément Godard
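The volume shapes involved above can be sketched with arrays. The dimensions, the mean reduction, and the argmax decoder stand-in are all illustrative; in the described system the reduction and decoding would be learned, and the source views are ordered by relative pose distance before stacking.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 4D feature volume: matching features for S source views
# and D hypothesised depth planes over an H x W reference image.
S, D, H, W = 4, 8, 6, 6
feature_volume = rng.random((S, D, H, W))

# Reduce over the source-view axis (a plain mean here; a learned
# reduction would be used in practice) to get a 3D cost volume.
cost_volume = feature_volume.mean(axis=0)   # shape (D, H, W)

# A depth estimation model would decode this together with reference-
# image data; as a stand-in, pick the best-scoring depth plane per
# pixel to form a 2D depth map.
depth_map = cost_volume.argmax(axis=0)      # shape (H, W)
```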
  • Publication number: 20230277934
    Abstract: The present disclosure describes approaches to camera re-localization that improve the speed and accuracy with which pose estimates are generated by fusing output of a computer vision algorithm with data from a prior model of a geographic area in which a user is located. For each candidate pose estimate output by the algorithm, a game server maps the estimate to a position on the prior model (e.g., a specific cell on a heatmap-style histogram) and retrieves a probability corresponding to the mapped position. A data fusion module fuses, for each candidate pose estimate, a confidence score generated by the computer vision algorithm with the location probability from the prior model to generate an updated confidence score. If an updated confidence score meets or exceeds a score threshold, a re-localization module initiates a location-based application (e.g., a parallel reality game) based on the associated candidate pose estimate.
    Type: Application
    Filed: March 3, 2022
    Publication date: September 7, 2023
    Inventors: Ben Benfold, Victor Adrian Prisacariu
  • Publication number: 20220189049
    Abstract: A multi-frame depth estimation model is disclosed. The model is trained and configured to receive an input image and an additional image. The model outputs a depth map for the input image based on the input image and the additional image. The model may extract a feature map for the input image and an additional feature map for the additional image. For each of a plurality of depth planes, the model warps the feature map to the depth plane based on relative pose between the input image and the additional image, the depth plane, and camera intrinsics. The model builds a cost volume from the warped feature maps for the plurality of depth planes. A decoder of the model inputs the cost volume and the input image to output the depth map.
    Type: Application
    Filed: December 8, 2021
    Publication date: June 16, 2022
    Inventors: James Watson, Oisin Mac Aodha, Victor Adrian Prisacariu, Gabriel J. Brostow, Michael David Firman
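The per-depth-plane warping described above is the classic plane-sweep construction. A rough sketch, assuming fronto-parallel planes, pinhole intrinsics K, and a relative pose (R, t) mapping reference-frame points into the source frame; the nearest-neighbour sampling and L1 cost are simplifications (a learned cost and bilinear sampling would be used in practice), and sign conventions vary between formulations.

```python
import numpy as np

def plane_homography(K, R, t, depth):
    """Homography mapping reference-image pixels to source-image pixels
    for a fronto-parallel plane (normal n = [0, 0, 1]) at the given depth."""
    n = np.array([0.0, 0.0, 1.0])
    return K @ (R + np.outer(t, n) / depth) @ np.linalg.inv(K)

def warp_features(src_feat, K, R, t, depth):
    """Warp a source feature map onto the reference view assuming all
    scene points lie on the given depth plane (nearest-neighbour)."""
    h, w = src_feat.shape
    H = plane_homography(K, R, t, depth)
    out = np.zeros_like(src_feat)
    for y in range(h):
        for x in range(w):
            p = H @ np.array([x, y, 1.0])
            u, v = int(round(p[0] / p[2])), int(round(p[1] / p[2]))
            if 0 <= u < w and 0 <= v < h:
                out[y, x] = src_feat[v, u]
    return out

def build_cost_volume(ref_feat, src_feat, K, R, t, depth_planes):
    """Stack a per-plane comparison of the reference features against
    the warped source features (L1 distance here)."""
    return np.stack([
        np.abs(ref_feat - warp_features(src_feat, K, R, t, d))
        for d in depth_planes
    ])
```

The depth plane whose warp best aligns the two feature maps yields the lowest cost at each pixel, which is the signal the decoder turns into a depth map.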