Patents by Inventor Sudipta Narayan Sinha

Sudipta Narayan Sinha has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11941751
    Abstract: Techniques for aligning images generated by two cameras are disclosed. This alignment is performed by computing a relative 3D orientation between the two cameras. A first gravity vector for a first camera and a second gravity vector for a second camera are determined. A first camera image is obtained from the first camera, and a second camera image is obtained from the second camera. A first alignment process is performed to partially align the first camera's orientation with the second camera's orientation. This process is performed by aligning the gravity vectors, thereby resulting in two degrees of freedom of the relative 3D orientation being eliminated. Visual correspondences between the two images are identified. A second alignment process is performed to fully align the orientations. This process is performed by using the identified visual correspondences to identify and eliminate a third degree of freedom of the relative 3D orientation.
    Type: Grant
    Filed: March 30, 2023
    Date of Patent: March 26, 2024
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Raymond Kirk Price, Michael Bleyer, Christopher Douglas Edmonds, Sudipta Narayan Sinha
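The two-stage alignment this abstract describes (it recurs in related entries below) can be sketched in plain Python. The Rodrigues formulation and all helper names are illustrative choices for this sketch, not taken from the patent: stage one rotates one gravity vector onto the other (removing two rotational degrees of freedom), and stage two recovers the remaining rotation about gravity from a visual correspondence.

```python
import math

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def dot(a, b):
    return sum(x*y for x, y in zip(a, b))

def normalize(v):
    n = math.sqrt(dot(v, v))
    return tuple(x / n for x in v)

def gravity_alignment(g1, g2):
    """Rodrigues rotation taking gravity vector g1 onto g2. Aligning the
    gravity directions fixes two of the three rotational degrees of
    freedom; only the rotation about gravity remains undetermined."""
    a, b = normalize(g1), normalize(g2)
    v = cross(a, b)
    c = dot(a, b)
    s = math.sqrt(dot(v, v))
    if s < 1e-12:
        if c > 0:
            return [[1, 0, 0], [0, 1, 0], [0, 0, 1]]  # already aligned
        raise ValueError("antiparallel gravity vectors: axis is ambiguous")
    k = tuple(x / s for x in v)  # unit rotation axis
    K = [[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]]
    R = [[0.0] * 3 for _ in range(3)]
    for i in range(3):
        for j in range(3):
            K2 = sum(K[i][m] * K[m][j] for m in range(3))
            R[i][j] = (1.0 if i == j else 0.0) + s * K[i][j] + (1 - c) * K2
    return R

def residual_yaw(d1, d2, g):
    """Signed angle about gravity g between two corresponding view
    directions, after projecting both into the horizontal plane. This is
    the third degree of freedom fixed by the visual correspondences."""
    g = normalize(g)
    p1 = tuple(x - dot(d1, g) * gi for x, gi in zip(d1, g))
    p2 = tuple(x - dot(d2, g) * gi for x, gi in zip(d2, g))
    return math.atan2(dot(g, cross(p1, p2)), dot(p1, p2))
```

Applying `gravity_alignment` first means `residual_yaw` only ever has to resolve a single angle, which is why one visual correspondence (rather than a full set) suffices in principle.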
  • Publication number: 20240061251
    Abstract: Techniques for updating a position of overlaid image content using IMU data to reflect subsequent changes in camera positions to minimize latency effects are disclosed. A “system camera” refers to an integrated camera that is a part of an HMD. An “external camera” is a camera that is separated from the HMD. The system camera and the external camera generate images. These images are overlaid on one another and aligned to form an overlaid image. Content from the external camera image is surrounded by a bounding element in the overlaid image. IMU data associated with both the system camera and the external camera is obtained. Based on that IMU data, an amount of movement that the system camera and/or the external camera have moved since the images were originally generated is determined. Based on that movement, the bounding element is shifted to a new position in the overlaid image.
    Type: Application
    Filed: October 27, 2023
    Publication date: February 22, 2024
    Inventors: Michael BLEYER, Sudipta Narayan SINHA, Christopher Douglas EDMONDS, Raymond Kirk PRICE
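The repositioning step this abstract describes (shared with the granted counterparts below) can be sketched as follows. The pinhole mapping `dx = f * tan(dyaw)` and the `(yaw, pitch)` IMU summary are assumptions of this sketch, not details from the patent:

```python
import math

def shift_bounding_element(box, sys_delta, ext_delta, focal_px):
    """Shift a bounding element (x, y, w, h) in the overlaid image to
    compensate camera rotation since the frames were captured.
    sys_delta / ext_delta are (yaw, pitch) changes in radians reported
    by the system camera's and external camera's IMUs."""
    dyaw = sys_delta[0] - ext_delta[0]      # net relative rotation
    dpitch = sys_delta[1] - ext_delta[1]
    x, y, w, h = box
    return (x + focal_px * math.tan(dyaw),  # pinhole small-rotation model
            y + focal_px * math.tan(dpitch),
            w, h)
```

Because only the bounding element is moved (not the full image re-rendered), the correction can run at IMU rate, which is how latency between capture and display is hidden.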
  • Patent number: 11841509
    Abstract: Techniques for updating a position of overlaid image content using IMU data to reflect subsequent changes in camera positions to minimize latency effects are disclosed. A “system camera” refers to an integrated camera that is a part of an HMD. An “external camera” is a camera that is separated from the HMD. The system camera and the external camera generate images. These images are overlaid on one another and aligned to form an overlaid image. Content from the external camera image is surrounded by a bounding element in the overlaid image. IMU data associated with both the system camera and the external camera is obtained. Based on that IMU data, an amount of movement that the system camera and/or the external camera have moved since the images were originally generated is determined. Based on that movement, the bounding element is shifted to a new position in the overlaid image.
    Type: Grant
    Filed: November 14, 2022
    Date of Patent: December 12, 2023
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Michael Bleyer, Sudipta Narayan Sinha, Christopher Douglas Edmonds, Raymond Kirk Price
  • Publication number: 20230260204
    Abstract: Techniques for aligning images generated by two cameras are disclosed. This alignment is performed by computing a relative 3D orientation between the two cameras. A first gravity vector for a first camera and a second gravity vector for a second camera are determined. A first camera image is obtained from the first camera, and a second camera image is obtained from the second camera. A first alignment process is performed to partially align the first camera’s orientation with the second camera’s orientation. This process is performed by aligning the gravity vectors, thereby resulting in two degrees of freedom of the relative 3D orientation being eliminated. Visual correspondences between the two images are identified. A second alignment process is performed to fully align the orientations. This process is performed by using the identified visual correspondences to identify and eliminate a third degree of freedom of the relative 3D orientation.
    Type: Application
    Filed: March 30, 2023
    Publication date: August 17, 2023
    Inventors: Raymond Kirk PRICE, Michael BLEYER, Christopher Douglas EDMONDS, Sudipta Narayan SINHA
  • Publication number: 20230154032
    Abstract: In various embodiments there is a method for camera localization within a scene. An image of a scene captured by the camera is input to a machine learning model, which has been trained for the particular scene to detect a plurality of 3D scene landmarks. The 3D scene landmarks are pre-specified in a pre-built map of the scene. The machine learning model outputs a plurality of predictions, each prediction comprising: either a 2D location in the image which is predicted to depict one of the 3D scene landmarks, or a 3D bearing vector, being a vector originating at the camera and pointing towards a predicted 3D location of one of the 3D scene landmarks. Using the predictions, an estimate of a position and orientation of the camera in the pre-built map of the scene is computed.
    Type: Application
    Filed: February 3, 2022
    Publication date: May 18, 2023
    Inventors: Sudipta Narayan SINHA, Ondrej MIKSIK, Joseph Michael DEGOL, Tien DO
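The geometric back end of this abstract — turning bearing vectors toward known 3D scene landmarks into a camera position — can be sketched as a linear least squares problem. This sketch assumes a known camera orientation (so bearings are in world coordinates) and stands in for neither the learned landmark detector nor a full pose solver; each landmark X with unit bearing b contributes the constraint (I − bbᵀ)(X − C) = 0 on the camera center C:

```python
import math

def solve3x3(A, b):
    """Cramer's-rule solve for a 3x3 linear system."""
    def det(m):
        return (m[0][0] * (m[1][1]*m[2][2] - m[1][2]*m[2][1])
              - m[0][1] * (m[1][0]*m[2][2] - m[1][2]*m[2][0])
              + m[0][2] * (m[1][0]*m[2][1] - m[1][1]*m[2][0]))
    d = det(A)
    return tuple(
        det([[A[i][k] if k != j else b[i] for k in range(3)]
             for i in range(3)]) / d
        for j in range(3))

def camera_center_from_bearings(landmarks, bearings):
    """Least-squares camera position from world-frame bearing vectors to
    known 3D scene landmarks: accumulate (I - b b^T) C = (I - b b^T) X
    over all landmark/bearing pairs and solve the 3x3 normal system."""
    A = [[0.0] * 3 for _ in range(3)]
    rhs = [0.0, 0.0, 0.0]
    for X, b in zip(landmarks, bearings):
        n = math.sqrt(sum(x * x for x in b))
        b = [x / n for x in b]
        for i in range(3):
            for j in range(3):
                m = (1.0 if i == j else 0.0) - b[i] * b[j]
                A[i][j] += m
                rhs[i] += m * X[j]
    return solve3x3(A, rhs)
```

Since the landmarks are pre-specified in the map, only their predicted bearings change per frame, keeping the per-frame geometry a small fixed-size solve.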
  • Publication number: 20230148231
    Abstract: Techniques for aligning images generated by two cameras are disclosed. This alignment is performed by computing a relative 3D orientation between the two cameras. A first gravity vector for a first camera and a second gravity vector for a second camera are determined. A first camera image is obtained from the first camera, and a second camera image is obtained from the second camera. A first alignment process is performed to partially align the first camera's orientation with the second camera's orientation. This process is performed by aligning the gravity vectors, thereby resulting in two degrees of freedom of the relative 3D orientation being eliminated. Visual correspondences between the two images are identified. A second alignment process is performed to fully align the orientations. This process is performed by using the identified visual correspondences to identify and eliminate a third degree of freedom of the relative 3D orientation.
    Type: Application
    Filed: November 11, 2021
    Publication date: May 11, 2023
    Inventors: Raymond Kirk PRICE, Michael BLEYER, Christopher Douglas EDMONDS, Sudipta Narayan SINHA
  • Patent number: 11636645
    Abstract: Techniques for aligning images generated by two cameras are disclosed. This alignment is performed by computing a relative 3D orientation between the two cameras. A first gravity vector for a first camera and a second gravity vector for a second camera are determined. A first camera image is obtained from the first camera, and a second camera image is obtained from the second camera. A first alignment process is performed to partially align the first camera's orientation with the second camera's orientation. This process is performed by aligning the gravity vectors, thereby resulting in two degrees of freedom of the relative 3D orientation being eliminated. Visual correspondences between the two images are identified. A second alignment process is performed to fully align the orientations. This process is performed by using the identified visual correspondences to identify and eliminate a third degree of freedom of the relative 3D orientation.
    Type: Grant
    Filed: November 11, 2021
    Date of Patent: April 25, 2023
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Raymond Kirk Price, Michael Bleyer, Christopher Douglas Edmonds, Sudipta Narayan Sinha
  • Publication number: 20230076331
    Abstract: Techniques for updating a position of overlaid image content using IMU data to reflect subsequent changes in camera positions to minimize latency effects are disclosed. A “system camera” refers to an integrated camera that is a part of an HMD. An “external camera” is a camera that is separated from the HMD. The system camera and the external camera generate images. These images are overlaid on one another and aligned to form an overlaid image. Content from the external camera image is surrounded by a bounding element in the overlaid image. IMU data associated with both the system camera and the external camera is obtained. Based on that IMU data, an amount of movement that the system camera and/or the external camera have moved since the images were originally generated is determined. Based on that movement, the bounding element is shifted to a new position in the overlaid image.
    Type: Application
    Filed: November 14, 2022
    Publication date: March 9, 2023
    Inventors: Michael BLEYER, Sudipta Narayan SINHA, Christopher Douglas EDMONDS, Raymond Kirk PRICE
  • Patent number: 11543665
    Abstract: Techniques for updating a position of overlaid image content using IMU data to reflect subsequent changes in camera positions to minimize latency effects are disclosed. A “system camera” refers to an integrated camera that is a part of an HMD. An “external camera” is a camera that is separated from the HMD. The system camera and the external camera generate images. These images are overlaid on one another and aligned to form an overlaid image. Content from the external camera image is surrounded by a bounding element in the overlaid image. IMU data associated with both the system camera and the external camera is obtained. Based on that IMU data, an amount of movement that the system camera and/or the external camera have moved since the images were originally generated is determined. Based on that movement, the bounding element is shifted to a new position in the overlaid image.
    Type: Grant
    Filed: December 1, 2020
    Date of Patent: January 3, 2023
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Michael Bleyer, Sudipta Narayan Sinha, Christopher Douglas Edmonds, Raymond Kirk Price
  • Patent number: 11450014
    Abstract: A system for continuous image alignment of separate cameras identifies a reference camera transformation matrix between a base reference camera pose and an updated reference camera pose. The system also identifies a match camera transformation matrix between a base match camera pose and an updated match camera pose and an alignment matrix based on visual correspondences between one or more reference frames captured by the reference camera and one or more match frames captured by the match camera. The system also generates a motion model configured to facilitate mapping of a set of pixels of a reference frame captured by the reference camera to a corresponding set of pixels of a match frame captured by the match camera based on the reference camera transformation matrix, the match camera transformation matrix, and the alignment matrix.
    Type: Grant
    Filed: July 22, 2020
    Date of Patent: September 20, 2022
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Michael Bleyer, Christopher Douglas Edmonds, Michael Edward Samples, Sudipta Narayan Sinha, Matthew Beaudoin Karr, Raymond Kirk Price
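The motion model this abstract describes composes three transforms. A minimal sketch, treating all three as 3x3 pixel homographies — the composition order (undo the reference camera's motion, apply the alignment, apply the match camera's motion) is this sketch's assumption:

```python
def mat3_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def mat3_inv(A):
    """3x3 inverse via the adjugate (cofactors by cyclic indexing)."""
    c = [[A[(i+1) % 3][(j+1) % 3] * A[(i+2) % 3][(j+2) % 3]
        - A[(i+1) % 3][(j+2) % 3] * A[(i+2) % 3][(j+1) % 3]
          for j in range(3)] for i in range(3)]
    d = sum(A[0][j] * c[0][j] for j in range(3))
    return [[c[j][i] / d for j in range(3)] for i in range(3)]

def warp(H, uv):
    """Apply a homography to a pixel, with the homogeneous divide."""
    x = H[0][0]*uv[0] + H[0][1]*uv[1] + H[0][2]
    y = H[1][0]*uv[0] + H[1][1]*uv[1] + H[1][2]
    w = H[2][0]*uv[0] + H[2][1]*uv[1] + H[2][2]
    return (x / w, y / w)

def motion_model(T_ref, T_match, align):
    """Homography mapping reference-frame pixels to match-frame pixels,
    built from the reference camera transform, the match camera
    transform, and the visually estimated alignment matrix."""
    return mat3_mul(T_match, mat3_mul(align, mat3_inv(T_ref)))
```

Updating `T_ref` and `T_match` from head tracking lets the mapping stay current between the (slower) visual re-estimations of `align`, which is what makes the alignment "continuous".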
  • Publication number: 20220284233
    Abstract: One example provides a computing system comprising a storage machine storing instructions executable by a logic machine to extract features from a source and target images to form source and target feature maps, form a correlation map comprising a plurality of similarity scores, form an initial correspondence map comprising initial mappings between pixels of the source feature map and corresponding pixels of the target feature map, refine the initial correspondence map by, for each of one or more pixels of the source feature map, for each of a plurality of candidate correspondences, inputting a four-dimensional patch into a trained scoring function, the trained scoring function being configured to output a correctness score, and selecting a refined correspondence based at least upon the correctness scores, and output a refined correspondence map comprising a refined correspondence for each of the one or more pixels of the source feature map.
    Type: Application
    Filed: March 3, 2021
    Publication date: September 8, 2022
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Joseph Michael DEGOL, Jae Yong LEE, Sudipta Narayan SINHA, Victor Manuel FRAGOSO ROJAS
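The pipeline in this abstract — correlation map, initial argmax correspondences, then score-based refinement — can be sketched with a hand-crafted stand-in for the trained scoring function (the real system learns its correctness score; the dict-based feature maps here are illustrative):

```python
def cosine(a, b):
    num = sum(x * y for x, y in zip(a, b))
    da = sum(x * x for x in a) ** 0.5
    db = sum(x * x for x in b) ** 0.5
    return num / (da * db)

def initial_correspondences(src, tgt):
    """src/tgt: dicts mapping pixel -> feature vector. For each source
    pixel, take the target pixel with the highest similarity score
    (the argmax over that pixel's slice of the correlation map)."""
    return {p: max(tgt, key=lambda q: cosine(f, tgt[q]))
            for p, f in src.items()}

def refine(src_pixel, candidates, score_fn):
    """Stand-in for the trained scoring function: among the candidate
    correspondences for a source pixel, keep the one with the highest
    correctness score."""
    return max(candidates, key=lambda q: score_fn(src_pixel, q))
```

In the patented system the candidates come with 4D patches of the correlation map; here `score_fn` abstracts that away so the selection logic stays visible.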
  • Patent number: 11436742
    Abstract: A system for reducing a search area for identifying correspondences identifies an overlap region within a first match frame captured by a match camera. The overlap region includes one or more points of the first match frame that are associated with one or more same portions of an environment as one or more corresponding points of a first reference frame captured by a reference camera. The system obtains a second reference frame captured by the reference camera and a second match frame captured by the match camera. The system identifies a reference camera transformation matrix, and/or a match camera transformation matrix. The system defines a search area within the second match frame based on the overlap region and the reference camera transformation matrix and/or the match camera transformation matrix.
    Type: Grant
    Filed: July 22, 2020
    Date of Patent: September 6, 2022
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Sudipta Narayan Sinha, Michael Bleyer, Christopher Douglas Edmonds, Raymond Kirk Price
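The search-area step in this abstract (shared with its published counterpart below) can be sketched by warping the overlap region forward and padding its bounding box. Using a 3x3 pixel homography as the inter-frame transform, and a fixed margin, are assumptions of this sketch:

```python
def predict_search_area(overlap_points, H_match, margin):
    """Warp the overlap-region points through the match camera's
    inter-frame transform and return a padded axis-aligned bounding box
    (x_min, y_min, x_max, y_max) as the reduced search area in the
    second match frame."""
    warped = []
    for (u, v) in overlap_points:
        x = H_match[0][0]*u + H_match[0][1]*v + H_match[0][2]
        y = H_match[1][0]*u + H_match[1][1]*v + H_match[1][2]
        w = H_match[2][0]*u + H_match[2][1]*v + H_match[2][2]
        warped.append((x / w, y / w))
    xs = [p[0] for p in warped]
    ys = [p[1] for p in warped]
    return (min(xs) - margin, min(ys) - margin,
            max(xs) + margin, max(ys) + margin)
```

Restricting correspondence search to this box is what cuts the matching cost relative to scanning the whole second match frame.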
  • Publication number: 20220171187
    Abstract: Techniques for updating a position of overlaid image content using IMU data to reflect subsequent changes in camera positions to minimize latency effects are disclosed. A “system camera” refers to an integrated camera that is a part of an HMD. An “external camera” is a camera that is separated from the HMD. The system camera and the external camera generate images. These images are overlaid on one another and aligned to form an overlaid image. Content from the external camera image is surrounded by a bounding element in the overlaid image. IMU data associated with both the system camera and the external camera is obtained. Based on that IMU data, an amount of movement that the system camera and/or the external camera have moved since the images were originally generated is determined. Based on that movement, the bounding element is shifted to a new position in the overlaid image.
    Type: Application
    Filed: December 1, 2020
    Publication date: June 2, 2022
    Inventors: Michael BLEYER, Sudipta Narayan SINHA, Christopher Douglas EDMONDS, Raymond Kirk PRICE
  • Publication number: 20220028095
    Abstract: A system for continuous image alignment of separate cameras identifies a reference camera transformation matrix between a base reference camera pose and an updated reference camera pose. The system also identifies a match camera transformation matrix between a base match camera pose and an updated match camera pose and an alignment matrix based on visual correspondences between one or more reference frames captured by the reference camera and one or more match frames captured by the match camera. The system also generates a motion model configured to facilitate mapping of a set of pixels of a reference frame captured by the reference camera to a corresponding set of pixels of a match frame captured by the match camera based on the reference camera transformation matrix, the match camera transformation matrix, and the alignment matrix.
    Type: Application
    Filed: July 22, 2020
    Publication date: January 27, 2022
    Inventors: Michael BLEYER, Christopher Douglas EDMONDS, Michael Edward SAMPLES, Sudipta Narayan SINHA, Matthew Beaudoin KARR, Raymond Kirk PRICE
  • Publication number: 20220028093
    Abstract: A system for reducing a search area for identifying correspondences identifies an overlap region within a first match frame captured by a match camera. The overlap region includes one or more points of the first match frame that are associated with one or more same portions of an environment as one or more corresponding points of a first reference frame captured by a reference camera. The system obtains a second reference frame captured by the reference camera and a second match frame captured by the match camera. The system identifies a reference camera transformation matrix, and/or a match camera transformation matrix. The system defines a search area within the second match frame based on the overlap region and the reference camera transformation matrix and/or the match camera transformation matrix.
    Type: Application
    Filed: July 22, 2020
    Publication date: January 27, 2022
    Inventors: Sudipta Narayan SINHA, Michael BLEYER, Christopher Douglas EDMONDS, Raymond Kirk PRICE
  • Patent number: 10964053
    Abstract: Computing devices and methods for estimating a pose of a user computing device are provided. In one example a 3D map comprising a plurality of 3D points representing a physical environment is obtained. Each 3D point is transformed into a 3D line that passes through the point to generate a 3D line cloud. A query image of the environment captured by a user computing device is received, the query image comprising query features that correspond to the environment. Using the 3D line cloud and the query features, a pose of the user computing device with respect to the environment is estimated.
    Type: Grant
    Filed: July 2, 2018
    Date of Patent: March 30, 2021
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Sudipta Narayan Sinha, Pablo Alejandro Speciale, Sing Bing Kang, Marc Andre Leon Pollefeys
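The lifting step at the heart of this abstract — replacing each 3D map point with a 3D line through it — can be sketched directly; the random-direction choice and function names are illustrative, and the downstream pose solver is out of scope here:

```python
import math
import random

def lift_to_line_cloud(points, seed=0):
    """Replace each 3D map point by a 3D line (point, unit direction)
    through it with a random direction. The line cloud reveals where
    each line passes, but no longer pins down where on its line the
    original point was - the privacy property motivating the lift."""
    rng = random.Random(seed)
    lines = []
    for p in points:
        d = (rng.gauss(0, 1), rng.gauss(0, 1), rng.gauss(0, 1))
        n = math.sqrt(sum(x * x for x in d))
        lines.append((p, tuple(x / n for x in d)))
    return lines

def point_to_line_distance(line, q):
    """Distance from q to the line (p, unit d): |(q - p) x d|. A pose
    solver would drive such residuals to zero for matched features."""
    (p, d) = line
    v = tuple(qi - pi for qi, pi in zip(q, p))
    c = (v[1]*d[2] - v[2]*d[1],
         v[2]*d[0] - v[0]*d[2],
         v[0]*d[1] - v[1]*d[0])
    return math.sqrt(sum(x * x for x in c))
```

Each original 3D point still lies exactly on its line, so pose estimation against the line cloud remains geometrically consistent with the hidden point cloud.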
  • Patent number: 10951819
    Abstract: One or more techniques and/or systems are provided for ordering images for panorama stitching and/or for providing a focal point indicator for image capture. For example, one or more images, which may be stitched together to create a panorama of a scene, may be stored within an image stack according to one or more ordering preferences, such as where manually captured images are stored within a first/higher priority region of the image stack as compared to automatically captured images. One or more images within the image stack may be stitched according to a stitching order to create the panorama, such as using images in the first region for a foreground of the panorama. Also, a current position of a camera may be tracked and compared with a focal point of a scene to generate a focal point indicator to assist with capturing a new/current image of the scene.
    Type: Grant
    Filed: March 28, 2018
    Date of Patent: March 16, 2021
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Blaise Aguera y Arcas, Markus Unger, Donald A. Barnett, David Maxwell Gedye, Sudipta Narayan Sinha, Eric Joel Stollnitz, Johannes Kopf
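The ordering preference this abstract describes — manually captured images ahead of automatic ones in the image stack — reduces to a sort key. The dict field names below are this sketch's own:

```python
def order_image_stack(images):
    """Order captured images so manually captured ones occupy the
    first (higher-priority) region of the stack; within each region,
    preserve capture order. Each image is a dict with 'manual' (bool)
    and 'seq' (capture index) keys."""
    return sorted(images, key=lambda im: (not im['manual'], im['seq']))
```

A stitcher walking this stack front-to-back then naturally favors the first region's images for the panorama foreground, as the abstract describes.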
  • Patent number: 10878590
    Abstract: Stereo image reconstruction can be achieved by fusing a plurality of proposal cost volumes computed from a pair of stereo images, using a predictive model operating on pixelwise feature vectors that include disparity and cost values sparsely sampled from the proposal cost volumes to compute disparity estimates for the pixels within the image.
    Type: Grant
    Filed: May 25, 2018
    Date of Patent: December 29, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Sudipta Narayan Sinha, Marc André Léon Pollefeys, Johannes Lutz Schönberger
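The fusion step this abstract describes can be sketched with a hand-crafted selection rule standing in for the learned predictive model (the patent's model operates on richer sparsely sampled feature vectors; the dict-per-volume layout is illustrative):

```python
def fuse_proposals(proposal_volumes):
    """proposal_volumes: list of dicts, each mapping a pixel to a
    (disparity, cost) sample drawn from one proposal cost volume.
    Fuse them per pixel by keeping the disparity of the lowest-cost
    proposal - a min-cost stand-in for the learned predictive model."""
    pixels = proposal_volumes[0].keys()
    fused = {}
    for px in pixels:
        samples = [vol[px] for vol in proposal_volumes]
        disparity, _ = min(samples, key=lambda s: s[1])
        fused[px] = disparity
    return fused
```

Sampling only a few (disparity, cost) pairs per pixel, rather than the full cost volumes, is what keeps the fusion input compact.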
  • Patent number: 10839556
    Abstract: A method for estimating a camera pose includes recognizing a three-dimensional (3D) map representing a physical environment, the 3D map including 3D map features defined as 3D points. An obfuscated image representation is received, the representation derived from an original unobfuscated image of the physical environment captured by a camera. The representation includes a plurality of obfuscated features, each including (i) a two-dimensional (2D) line that passes through a 2D point in the original unobfuscated image at which an image feature was detected, and (ii) a feature descriptor that describes the image feature associated with the 2D point that the 2D line of the obfuscated feature passes through. Correspondences are determined between the obfuscated features and the 3D map features of the 3D map of the physical environment. Based on the determined correspondences, a six degree of freedom pose of the camera in the physical environment is estimated.
    Type: Grant
    Filed: October 23, 2018
    Date of Patent: November 17, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Sudipta Narayan Sinha, Marc Andre Leon Pollefeys, Sing Bing Kang, Pablo Alejandro Speciale
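The obfuscation and matching steps in this abstract can be sketched as follows; the random line direction and descriptor-distance matching are this sketch's choices, and the six-degree-of-freedom solver itself is out of scope:

```python
import math
import random

def obfuscate_features(keypoints, seed=0):
    """Replace each detected 2D keypoint ((x, y), descriptor) by a 2D
    line through the point - represented as (point, unit direction) -
    plus the unchanged descriptor, so image content cannot be
    reconstructed from exact feature positions."""
    rng = random.Random(seed)
    out = []
    for (pt, desc) in keypoints:
        t = rng.uniform(0, math.pi)
        out.append(((pt, (math.cos(t), math.sin(t))), desc))
    return out

def match_by_descriptor(obfuscated, map_features):
    """Pair each obfuscated feature with the 3D map feature whose
    descriptor is nearest in Euclidean distance; the 2D line (not the
    hidden 2D point) is what the pose solver would then constrain."""
    def d2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    matches = []
    for (line, desc) in obfuscated:
        best = min(range(len(map_features)),
                   key=lambda i: d2(desc, map_features[i][1]))
        matches.append((line, map_features[best][0]))
    return matches
```

Descriptors survive obfuscation untouched, which is why correspondence search still works even though the 2D point locations are hidden.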
  • Publication number: 20200311396
    Abstract: Examples are disclosed that relate to representing recorded hand motion. One example provides a computing device comprising instructions executable by a logic subsystem to receive video data capturing hand motion relative to an object, determine a first pose of the object, and associate a first coordinate system with the object based on the first pose. The instructions are further executable to determine a representation of the hand motion in the first coordinate system, the representation having a time-varying pose relative to the first pose of the object, and configure the representation for display relative to a second instance of the object having a second pose in a second coordinate system, with a time-varying pose relative to the second pose that is spatially consistent with the time-varying pose relative to the first pose.
    Type: Application
    Filed: March 25, 2019
    Publication date: October 1, 2020
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Marc Andre Leon POLLEFEYS, Sudipta Narayan SINHA, Harpreet Singh SAWHNEY, Bugra TEKIN, Federica BOGO
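The re-anchoring idea in this abstract — express the hand motion in the first object's coordinate system, then replay it relative to a second instance of the object — can be sketched in the plane; a 2D `(x, y, theta)` pose stands in here for the full 6-DoF case:

```python
import math

def to_object_frame(p, pose):
    """Express world point p in the frame of an object pose (x, y, theta)."""
    x, y, th = pose
    dx, dy = p[0] - x, p[1] - y
    c, s = math.cos(th), math.sin(th)
    return (c*dx + s*dy, -s*dx + c*dy)

def from_object_frame(q, pose):
    """Map an object-frame point back to world coordinates."""
    x, y, th = pose
    c, s = math.cos(th), math.sin(th)
    return (x + c*q[0] - s*q[1], y + s*q[0] + c*q[1])

def retarget_hand_motion(trajectory, pose_a, pose_b):
    """Record a hand trajectory relative to the first object instance
    (pose_a), then replay it relative to a second instance (pose_b),
    keeping the motion spatially consistent with the object."""
    return [from_object_frame(to_object_frame(p, pose_a), pose_b)
            for p in trajectory]
```

Because the trajectory is stored in the object's own frame, any later instance of the object — at any position and orientation — inherits a motion that is consistent with how the hand originally moved around it.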