Patents by Inventor Madhurani R. Sapre

Madhurani R. Sapre has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20230298278
    Abstract: Various implementations disclosed herein include devices, systems, and methods that determine how to present a three-dimensional (3D) photo in an extended reality (XR) environment (e.g., in 3D, in 2D, blurred, or not at all) based on the viewing position of a user active in the XR environment relative to the placement of the 3D photo in that environment. In some implementations, an electronic device having a processor obtains a 3D photo, which is an incomplete 3D representation created from one or more images captured by an image capture device. In some implementations, a viewing position of the electronic device relative to a placement position of the 3D photo is determined, and a presentation mode for the 3D photo is determined based on the viewing position. In some implementations, the 3D photo is provided at the placement position in the XR environment based on the presentation mode. (A minimal illustrative sketch of this presentation-mode selection appears after this listing.)
    Type: Application
    Filed: November 4, 2022
    Publication date: September 21, 2023
    Inventors: Alexandre Da Veiga, Jeffrey S. Norris, Madhurani R. Sapre, Spencer H. Ray
  • Publication number: 20200233212
    Abstract: Implementations described and claimed herein provide systems and methods for interaction between a user and a machine. In one implementation, a system receives an input from a user of a mobile machine that indicates or describes an object in the world. In one example, the user gestures toward the object, and the gesture is detected by a visual sensor. In another example, the user verbally describes the object, and the description is detected by an audio sensor. The system receiving the input may then determine which object near the user's location the user is indicating. Such a determination may include using known objects near the geographic location of the user or of the autonomous or mobile machine. (A minimal illustrative sketch of this object-resolution step appears after this listing.)
    Type: Application
    Filed: February 10, 2020
    Publication date: July 23, 2020
    Inventors: Patrick S. Piemonte, Wolf Kienzle, Douglas Bowman, Shaun D. Budhram, Madhurani R. Sapre, Vyacheslav Leizerovich, Daniel De Rocha Rosario
  • Patent number: 10558037
    Abstract: Implementations described and claimed herein provide systems and methods for interaction between a user and a machine. In one implementation, a system receives an input from a user of a mobile machine that indicates or describes an object in the world. In one example, the user gestures toward the object, and the gesture is detected by a visual sensor. In another example, the user verbally describes the object, and the description is detected by an audio sensor. The system receiving the input may then determine which object near the user's location the user is indicating. Such a determination may include using known objects near the geographic location of the user or of the autonomous or mobile machine.
    Type: Grant
    Filed: September 20, 2017
    Date of Patent: February 11, 2020
    Inventors: Patrick S. Piemonte, Wolf Kienzle, Douglas Bowman, Shaun D. Budhram, Madhurani R. Sapre, Vyacheslav Leizerovich, Daniel De Rocha Rosario
  • Publication number: 20180088324
    Abstract: Implementations described and claimed herein provide systems and methods for interaction between a user and a machine. In one implementation, a system receives an input from a user of a mobile machine that indicates or describes an object in the world. In one example, the user gestures toward the object, and the gesture is detected by a visual sensor. In another example, the user verbally describes the object, and the description is detected by an audio sensor. The system receiving the input may then determine which object near the user's location the user is indicating. Such a determination may include using known objects near the geographic location of the user or of the autonomous or mobile machine.
    Type: Application
    Filed: September 20, 2017
    Publication date: March 29, 2018
    Inventors: Patrick S. Piemonte, Wolf Kienzle, Douglas Bowman, Shaun D. Budhram, Madhurani R. Sapre, Vyacheslav Leizerovich, Daniel De Rocha Rosario
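
The abstract for publication 20230298278 above describes choosing how to present an incomplete 3D photo (in 3D, in 2D, blurred, or not at all) from the viewer's position relative to where the photo is placed. The following is a minimal Python sketch of that kind of decision, not the patented implementation: the mode names, the angular thresholds, and the use of a capture-direction vector are assumptions made here for illustration.

```python
# Illustrative sketch only: pick a presentation mode for an "incomplete" 3D photo
# from where the viewer stands relative to its placement in the XR scene.
# The thresholds, mode names, and capture-direction heuristic are assumptions.
import math
from enum import Enum

class PresentationMode(Enum):
    FULL_3D = "3d"       # viewer is near the original capture viewpoint
    FLAT_2D = "2d"       # viewer is off-axis; fall back to a flat rendering
    BLURRED = "blurred"  # viewer is far off-axis; blur to hide missing data
    HIDDEN = "hidden"    # viewer is behind the photo; do not render it

def angle_between(a, b):
    """Angle in degrees between two 3D direction vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (norm_a * norm_b)))))

def choose_presentation_mode(viewer_pos, placement_pos, capture_dir):
    """Select a mode from the viewer's position relative to the placed 3D photo.

    viewer_pos, placement_pos: (x, y, z) points in the shared XR coordinate space.
    capture_dir: unit vector from the photo toward the original capture viewpoint.
    """
    to_viewer = tuple(v - p for v, p in zip(viewer_pos, placement_pos))
    offset = angle_between(to_viewer, capture_dir)
    if offset < 20.0:    # nearly on-axis: the reconstruction appears complete
        return PresentationMode.FULL_3D
    if offset < 60.0:    # moderately off-axis: holes would show, go flat
        return PresentationMode.FLAT_2D
    if offset < 90.0:    # strongly off-axis: soften the missing geometry
        return PresentationMode.BLURRED
    return PresentationMode.HIDDEN  # viewer is behind the photo plane

# Example: a viewer standing slightly to the side of the capture axis.
mode = choose_presentation_mode((0.5, 1.6, 2.0), (0.0, 1.5, 0.0), (0.0, 0.0, 1.0))
print(mode)  # PresentationMode.FULL_3D for this geometry
```

The idea the sketch illustrates is that an incomplete reconstruction looks best near its original capture viewpoint, so the presentation can degrade gracefully (flat, blurred, hidden) as the viewer moves off that axis.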
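
The shared abstract of publication 20200233212, patent 10558037, and publication 20180088324 above describes resolving which real-world object a user of a mobile or autonomous machine is indicating by gesture or by speech, using known objects near the user's geographic location. Below is a minimal Python sketch of one way such disambiguation could be scored; the data structures, weights, and field names are assumptions for illustration and are not taken from the patents.

```python
# Illustrative sketch only: resolve which nearby known object a user is indicating,
# either by pointing (a gesture ray) or by a spoken description. Object data,
# scoring weights, and field names are assumptions, not the patented method.
import math
from dataclasses import dataclass

@dataclass
class KnownObject:
    name: str
    position: tuple      # (x, y, z) relative to the machine
    keywords: frozenset  # words a user might use to describe the object

def _angle_to(ray_dir, target):
    """Angle in degrees between a pointing ray and the direction to a target."""
    dot = sum(r * t for r, t in zip(ray_dir, target))
    norm_r = math.sqrt(sum(r * r for r in ray_dir))
    norm_t = math.sqrt(sum(t * t for t in target))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (norm_r * norm_t)))))

def resolve_indicated_object(objects, gesture_ray=None, utterance=None):
    """Score each known object near the user and return the best candidate.

    gesture_ray: unit vector of the detected pointing direction, if any.
    utterance:   transcribed verbal description, if any.
    """
    best, best_score = None, 0.0
    words = set(utterance.lower().split()) if utterance else set()
    for obj in objects:
        score = 0.0
        if gesture_ray is not None:
            # Reward objects that lie close to the pointing direction.
            score += max(0.0, 1.0 - _angle_to(gesture_ray, obj.position) / 45.0)
        if words:
            # Reward objects whose descriptive keywords appear in the utterance.
            score += len(words & obj.keywords) / max(1, len(obj.keywords))
        if score > best_score:
            best, best_score = obj, score
    return best

# Example: the user points slightly to the left and says "that red building".
nearby = [
    KnownObject("red building", (-3.0, 0.0, 10.0), frozenset({"red", "building"})),
    KnownObject("parked truck", (4.0, 0.0, 8.0), frozenset({"truck", "parked"})),
]
target = resolve_indicated_object(nearby, gesture_ray=(-0.2, 0.0, 1.0),
                                  utterance="that red building")
print(target.name)  # "red building"
```

In this sketch the visual-sensor cue (gesture alignment) and the audio-sensor cue (keyword overlap with the description) are simply summed; the candidate set would come from known objects near the machine's geographic location, as the abstract describes.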