Patents by Inventor Rahul A. Sheth

Rahul A. Sheth has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20230407328
    Abstract: The present invention provides a process for enriching adeno-associated virus particles using anion exchange chromatography and zonal ultracentrifugation.
    Type: Application
    Filed: November 2, 2021
    Publication date: December 21, 2023
    Applicant: BioMarin Pharmaceutical Inc.
    Inventors: John Maga, Harmit Vora, Rahul Sheth, Daniel Gold, Anant Rishi, Yanhong Zhang, Kieu Tran
  • Publication number: 20230004708
    Abstract: A system according to various exemplary embodiments includes a processor and a user interface, communication module, and memory coupled to the processor. The memory stores instructions that, when executed by the processor, cause the system to: retrieve a digital image from a server using the communication module; present the digital image on a display of the user interface; receive edits to the digital image via the user interface; generate, based on the edits, a modified digital image, wherein generating the modified digital image includes transforming a format of the digital image to include a field containing an identifier associated with the modified digital image; and transmit the modified digital image to the server using the communication module.
    Type: Application
    Filed: July 11, 2022
    Publication date: January 5, 2023
    Inventors: Rahul Sheth, Kevin Dechau Tang, Ning Zhang
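
The abstract above (and its granted counterparts below) describes an edit-and-republish loop for a digital image: fetch it from a server, apply edits, transform the format to carry an identifier for the modified image, and send it back. As a rough illustration, the Python sketch below follows that shape; the server URL, the endpoints, and the "modified_image_id" metadata field are assumptions for the sketch, not details from the patent.

```python
"""Sketch of the edit-and-republish loop: fetch an image from a server, apply
a client-side edit, embed an identifier for the modified image in the file's
metadata, and post it back.  The SERVER URL, the endpoints, and the
"modified_image_id" field are assumptions, not details from the patent."""
import io
import uuid

import requests
from PIL import Image, ImageOps
from PIL.PngImagePlugin import PngInfo

SERVER = "https://example.com/images"   # hypothetical image server


def fetch_image(image_id: str) -> Image.Image:
    """Retrieve the original digital image from the server."""
    resp = requests.get(f"{SERVER}/{image_id}", timeout=10)
    resp.raise_for_status()
    return Image.open(io.BytesIO(resp.content))


def edit_and_upload(image_id: str) -> str:
    img = fetch_image(image_id)

    # Stand-in for edits received via the user interface: mirror the image.
    edited = ImageOps.mirror(img)

    # Transform the format so it carries a field identifying the modified image.
    edit_id = str(uuid.uuid4())
    meta = PngInfo()
    meta.add_text("modified_image_id", edit_id)

    buf = io.BytesIO()
    edited.save(buf, format="PNG", pnginfo=meta)

    # Transmit the modified image back to the server.
    requests.post(f"{SERVER}/{image_id}/edits", data=buf.getvalue(),
                  headers={"Content-Type": "image/png"}, timeout=10)
    return edit_id
```
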
  • Patent number: 11386261
    Abstract: A system according to various exemplary embodiments includes a processor and a user interface, communication module, and memory coupled to the processor. The memory stores instructions that, when executed by the processor, cause the system to: retrieve a digital image from a server using the communication module; present the digital image on a display of the user interface; receive edits to the digital image via the user interface; generate, based on the edits, a modified digital image, wherein generating the modified digital image includes transforming a format of the digital image to include a field containing an identifier associated with the modified digital image; and transmit the modified digital image to the server using the communication module.
    Type: Grant
    Filed: July 13, 2020
    Date of Patent: July 12, 2022
    Assignee: Snap Inc.
    Inventors: Rahul Sheth, Kevin Dechau Tang, Ning Zhang
  • Patent number: 11380051
    Abstract: Systems and methods for image based location estimation are described. In one example embodiment, a first positioning system is used to generate a first position estimate. Point cloud data describing an environment is then accessed. A two-dimensional surface of an image of an environment is captured, and a portion of the image is matched to a portion of key points in the point cloud data. An augmented reality object is then aligned within one or more images of the environment based on the match of the point cloud with the image. In some embodiments, building façade data may additionally be used to determine a device location and place the augmented reality object within an image.
    Type: Grant
    Filed: February 10, 2021
    Date of Patent: July 5, 2022
    Assignee: Snap Inc.
    Inventors: Nathan Jurgenson, Linjie Luo, Jonathan M. Rodriguez, II, Rahul Sheth, Jia Li, Xutao Lv
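
The point-cloud abstract above boils down to refining a coarse position estimate from matches between image keypoints and point-cloud key points, then placing an augmented reality object with the recovered camera pose. The sketch below shows only that refinement step with OpenCV's PnP solver; the correspondence matching is assumed to have been done already, and the array shapes and camera matrix are illustrative rather than taken from the patent.

```python
"""Sketch of the pose-refinement step only: given 2D-3D correspondences
between image keypoints and point-cloud key points (the matching itself is
assumed done), solve for the camera pose and project an AR anchor into the
frame.  Array contents and the camera matrix are illustrative."""
import cv2
import numpy as np


def refine_pose(points_3d: np.ndarray, points_2d: np.ndarray,
                camera_matrix: np.ndarray):
    """PnP + RANSAC pose from at least four matched points.

    points_3d: (N, 3) float32 key points from the point cloud.
    points_2d: (N, 2) float32 pixel locations matched in the image.
    """
    ok, rvec, tvec, _inliers = cv2.solvePnPRansac(
        points_3d, points_2d, camera_matrix, None)
    if not ok:
        raise RuntimeError("pose refinement failed")
    return rvec, tvec


def place_ar_object(anchor_3d: np.ndarray, rvec, tvec, camera_matrix):
    """Project a 3D anchor point into the frame; returns (u, v) pixels."""
    pts, _ = cv2.projectPoints(anchor_3d.reshape(1, 1, 3).astype(np.float32),
                               rvec, tvec, camera_matrix, None)
    return pts.reshape(2)
```
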
  • Patent number: 11315331
    Abstract: Systems and methods for image based location estimation are described. In one example embodiment, a first positioning system is used to generate a first position estimate. A set of structure façade data describing one or more structure façades associated with the first position estimate is then accessed. A first image of an environment is captured, and a portion of the image is matched to part of the structure façade data. A second position is then estimated based on a comparison of the structure façade data with the portion of the image matched to the structure façade data.
    Type: Grant
    Filed: June 26, 2020
    Date of Patent: April 26, 2022
    Assignee: Snap Inc.
    Inventors: Nathan Jurgenson, Linjie Luo, Jonathan M. Rodriguez, II, Rahul Sheth, Jia Li, Xutao Lv
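
The façade variant above matches a portion of the captured image against stored structure-façade data to produce a second, refined position estimate. The sketch below is a heavily simplified stand-in: it scores candidate façade templates against the frame with normalized cross-correlation and returns the location tagged on the best match, whereas the patented method recovers an actual refined pose. The `Facade` record and the candidate list are hypothetical.

```python
"""Heavily simplified façade matching: score each candidate façade template
against the current frame and return the location tagged on the best match
as the "second" position estimate.  The Facade record and the candidate list
are hypothetical; the patented method recovers an actual refined pose."""
from dataclasses import dataclass

import cv2
import numpy as np


@dataclass
class Facade:
    template: np.ndarray   # grayscale façade image (smaller than the frame)
    location: tuple        # (latitude, longitude) of the structure


def second_position_estimate(frame_gray: np.ndarray, candidates: list):
    """Return (score, location) for the best-matching façade template."""
    best_score, best_location = -1.0, None
    for facade in candidates:
        result = cv2.matchTemplate(frame_gray, facade.template,
                                   cv2.TM_CCOEFF_NORMED)
        score = float(result.max())
        if score > best_score:
            best_score, best_location = score, facade.location
    return best_score, best_location
```
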
  • Patent number: 11091872
    Abstract: The present disclosure relates to a garment care device. The garment care device may include a housing and a garment bag. The garment bag can be disposed around a frame and attached to the housing. The garment bag can receive air in a first air flow from the housing and a mixture of steam and a chemical in a second air flow from the housing during a freshening process. A safety puck can engage a fastener, allowing the garment bag to be opened and securely closed, and can magnetically engage an interface.
    Type: Grant
    Filed: July 31, 2020
    Date of Patent: August 17, 2021
    Assignee: Unwrinkly, Inc.
    Inventors: Vishva Somaya, Naren Inukoti, Rahul Sheth
  • Publication number: 20210174578
    Abstract: Systems and methods for image based location estimation are described. In one example embodiment, a first positioning system is used to generate a first position estimate. Point cloud data describing an environment is then accessed. A two-dimensional surface of an image of an environment is captured, and a portion of the image is matched to a portion of key points in the point cloud data. An augmented reality object is then aligned within one or more images of the environment based on the match of the point cloud with the image. In some embodiments, building façade data may additionally be used to determine a device location and place the augmented reality object within an image.
    Type: Application
    Filed: February 10, 2021
    Publication date: June 10, 2021
    Inventors: Nathan Jurgenson, Linjie Luo, Jonathan M. Rodriguez, II, Rahul Sheth, Jia Li, Xutao Lv
  • Patent number: 10997783
    Abstract: Systems and methods for image based location estimation are described. In one example embodiment, a first positioning system is used to generate a first position estimate. Point cloud data describing an environment is then accessed. A two-dimensional surface of an image of an environment is captured, and a portion of the image is matched to a portion of key points in the point cloud data. An augmented reality object is then aligned within one or more images of the environment based on the match of the point cloud with the image. In some embodiments, building façade data may additionally be used to determine a device location and place the augmented reality object within an image.
    Type: Grant
    Filed: March 19, 2020
    Date of Patent: May 4, 2021
    Assignee: Snap Inc.
    Inventors: Nathan Jurgenson, Linjie Luo, Jonathan M. Rodriguez, II, Rahul Sheth, Jia Li, Xutao Lv
  • Publication number: 20200342166
    Abstract: A system according to various exemplary embodiments includes a processor and a user interface, communication module, and memory coupled to the processor. The memory stores instructions that, when executed by the processor, cause the system to: retrieve a digital image from a server using the communication module; present the digital image on a display of the user interface; receive edits to the digital image via the user interface; generate, based on the edits, a modified digital image, wherein generating the modified digital image includes transforming a format of the digital image to include a field containing an identifier associated with the modified digital image; and transmit the modified digital image to the server using the communication module.
    Type: Application
    Filed: July 13, 2020
    Publication date: October 29, 2020
    Inventors: Rahul Sheth, Kevin Dechau Tang, Ning Zhang
  • Publication number: 20200327738
    Abstract: Systems and methods for image based location estimation are described. In one example embodiment, a first positioning system is used to generate a first position estimate. A set of structure façade data describing one or more structure façades associated with the first position estimate is then accessed. A first image of an environment is captured, and a portion of the image is matched to part of the structure façade data. A second position is then estimated based on a comparison of the structure façade data with the portion of the image matched to the structure façade data.
    Type: Application
    Filed: June 26, 2020
    Publication date: October 15, 2020
    Inventors: Nathan Jurgenson, Linjie Luo, Jonathan M. Rodriguez, II, Rahul Sheth, Jia Li, Xutao Lv
  • Patent number: 10755036
    Abstract: A system according to various exemplary embodiments includes a processor and a user interface, communication module, and memory coupled to the processor. The memory stores instructions that, when executed by the processor, cause the system to: retrieve a digital image from a server using the communication module; present the digital image on a display of the user interface; receive edits to the digital image via the user interface; generate, based on the edits, a modified digital image, wherein generating the modified digital image includes transforming a format of the digital image to include a field containing an identifier associated with the modified digital image; and transmit the modified digital image to the server using the communication module.
    Type: Grant
    Filed: May 5, 2016
    Date of Patent: August 25, 2020
    Assignee: Snap Inc.
    Inventors: Rahul Sheth, Kevin Dechau Tang, Ning Zhang
  • Patent number: 10733802
    Abstract: Systems and methods for image based location estimation are described. In one example embodiment, a first positioning system is used to generate a first position estimate. A set of structure façade data describing one or more structure façades associated with the first position estimate is then accessed. A first image of an environment is captured, and a portion of the image is matched to part of the structure façade data. A second position is then estimated based on a comparison of the structure façade data with the portion of the image matched to the structure façade data.
    Type: Grant
    Filed: June 11, 2019
    Date of Patent: August 4, 2020
    Assignee: Snap Inc.
    Inventors: Nathan Jurgenson, Linjie Luo, Jonathan M. Rodriguez, II, Rahul Sheth, Jia Li, Xutao Lv
  • Publication number: 20200219312
    Abstract: Systems and methods for image based location estimation are described. In one example embodiment, a first positioning system is used to generate a first position estimate. Point cloud data describing an environment is then accessed. A two-dimensional surface of an image of an environment is captured, and a portion of the image is matched to a portion of key points in the point cloud data. An augmented reality object is then aligned within one or more images of the environment based on the match of the point cloud with the image. In some embodiments, building façade data may additionally be used to determine a device location and place the augmented reality object within an image.
    Type: Application
    Filed: March 19, 2020
    Publication date: July 9, 2020
    Inventors: Nathan Jurgenson, Linjie Luo, Jonathan M. Rodriguez, II, Rahul Sheth, Jia Li, Xutao Lv
  • Patent number: 10657708
    Abstract: Systems and methods for image based location estimation are described. In one example embodiment, a first positioning system is used to generate a first position estimate. 3D point cloud data describing an environment is then accessed. A first image of an environment is captured, and a portion of the image is matched to a portion of key points in the 3D point cloud data. An augmented reality object is then aligned within one or more images of the environment based on the match of the 3D point cloud with the image. In some embodiments, building façade data may additionally be used to determine a device location and place the augmented reality object within an image.
    Type: Grant
    Filed: May 4, 2018
    Date of Patent: May 19, 2020
    Assignee: Snap Inc.
    Inventors: Nathan Jurgenson, Linjie Luo, Jonathan M. Rodriguez, II, Rahul Sheth, Jia Li, Xutao Lv
  • Patent number: 10605736
    Abstract: An optical imaging method for analyzing an ex vivo tissue sample of a subject is provided. The method includes obtaining the ex vivo tissue sample, preparing the ex vivo tissue sample onto a sample receptacle of an optical imaging system, and emitting excitation light toward the ex vivo tissue sample. The method also includes acquiring imaging data of light emitted by the ex vivo tissue sample in response to the excitation light, analyzing the imaging data to determine whether the ex vivo tissue sample contains pathologic tissue, and generating an output indicating to an operator whether the ex vivo tissue sample contains pathologic tissue.
    Type: Grant
    Filed: July 14, 2015
    Date of Patent: March 31, 2020
    Assignee: The General Hospital Corporation
    Inventors: Rahul Sheth, Umar Mahmood, Anthony Samir
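
The imaging method above acquires emission data from an excited ex vivo sample and analyzes it to decide whether pathologic tissue is present. Purely as an illustration of what such an analysis step might look like, the toy rule below flags a sample when too large a fraction of pixels exceeds an intensity threshold; both thresholds are invented numbers, and this is not the analysis described in the patent.

```python
"""Illustration only: a toy decision rule on acquired emission-intensity data.
The sample is flagged when the fraction of pixels brighter than a threshold
exceeds a small area fraction.  Both thresholds are invented numbers and this
is not the analysis method claimed in the patent."""
import numpy as np


def flag_pathologic(intensity: np.ndarray,
                    pixel_threshold: float = 0.6,
                    area_fraction: float = 0.05) -> bool:
    """intensity: 2-D array of emission intensities normalized to [0, 1]."""
    bright = intensity > pixel_threshold
    return bool(bright.mean() > area_fraction)
```
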
  • Publication number: 20190295326
    Abstract: Systems and methods for image based location estimation are described. In one example embodiment, a first positioning system is used to generate a first position estimate. A set of structure façade data describing one or more structure façades associated with the first position estimate is then accessed. A first image of an environment is captured, and a portion of the image is matched to part of the structure façade data. A second position is then estimated based on a comparison of the structure façade data with the portion of the image matched to the structure façade data.
    Type: Application
    Filed: June 11, 2019
    Publication date: September 26, 2019
    Inventors: Nathan Jurgenson, Linjie Luo, Jonathan M. Rodriguez, II, Rahul Sheth, Jia Li, Xutao Lv
  • Patent number: 10366543
    Abstract: Systems and methods for image based location estimation are described. In one example embodiment, a first positioning system is used to generate a first position estimate. A set of structure façade data describing one or more structure façades associated with the first position estimate is then accessed. A first image of an environment is captured, and a portion of the image is matched to part of the structure façade data. A second position is then estimated based on a comparison of the structure façade data with the portion of the image matched to the structure façade data.
    Type: Grant
    Filed: September 20, 2018
    Date of Patent: July 30, 2019
    Assignee: Snap Inc.
    Inventors: Nathan Jurgenson, Linjie Luo, Jonathan M. Rodriguez, II, Rahul Sheth, Jia Li, Xutao Lv
  • Patent number: 10055895
    Abstract: Systems and methods for local augmented reality (AR) tracking of an AR object are disclosed. In one example embodiment a device captures a series of video image frames. A user input is received at the device associating a first portion of a first image of the video image frames with an AR sticker object and a target. A first target template is generated to track the target across frames of the video image frames. In some embodiments, global tracking based on a determination that the target is outside a boundary area is used. The global tracking comprises using a global tracking template for tracking movement in the video image frames captured following the determination that the target is outside the boundary area. When the global tracking determines that the target is within the boundary area, local tracking is resumed along with presentation of the AR sticker object on an output display of the device.
    Type: Grant
    Filed: January 29, 2016
    Date of Patent: August 21, 2018
    Assignee: Snap Inc.
    Inventors: Jia Li, Linjie Luo, Rahul Sheth, Ning Xu, Jianchao Yang
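
The AR-sticker abstract above alternates between local template tracking of a user-selected target and global tracking once the target leaves a boundary area, resuming local tracking (and sticker rendering) when it returns. The sketch below captures only the switching logic around a single local template match; the global-template tracking itself is omitted, and the 0.5 score threshold and the box conventions are assumptions.

```python
"""Sketch of the local/global switch only: track a user-selected target with
a small template; if the best match falls outside the boundary area (or the
match is weak), signal that global tracking should take over, and resume
local tracking when the target is back inside.  The 0.5 score threshold and
the boundary convention are assumptions."""
import cv2
import numpy as np


class StickerTracker:
    def __init__(self, first_frame: np.ndarray, target_box, boundary):
        x, y, w, h = target_box                        # user-selected region
        self.template = first_frame[y:y + h, x:x + w]  # local target template
        self.boundary = boundary                       # (x0, y0, x1, y1)

    def update(self, frame: np.ndarray):
        """Return (local, (x, y)): whether local tracking holds, and where."""
        result = cv2.matchTemplate(frame, self.template, cv2.TM_CCOEFF_NORMED)
        _, score, _, (x, y) = cv2.minMaxLoc(result)
        x0, y0, x1, y1 = self.boundary
        local = (x0 <= x <= x1 and y0 <= y <= y1) and score > 0.5
        # When local is True the caller renders the AR sticker at (x, y);
        # otherwise it would switch to a global frame-level template (omitted).
        return local, (x, y)
```
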
  • Patent number: 9984499
    Abstract: Systems and methods for image based location estimation are described. In one example embodiment, a first positioning system is used to generate a first position estimate. 3D point cloud data describing an environment is then accessed. A first image of an environment is captured, and a portion of the image is matched to a portion of key points in the 3D point cloud data. An augmented reality object is then aligned within one or more images of the environment based on the match of the 3D point cloud with the image. In some embodiments, building façade data may additionally be used to determine a device location and place the augmented reality object within an image.
    Type: Grant
    Filed: November 30, 2015
    Date of Patent: May 29, 2018
    Assignee: Snap Inc.
    Inventors: Nathan Jurgenson, Linjie Luo, Jonathan M. Rodriguez, II, Rahul Sheth, Jia Li, Xutao Lv
  • Publication number: 20180007286
    Abstract: Systems and methods are described for receiving, at a computing device, a video comprising a plurality of frames and determining, by the computing device, that vertical cropping should be performed for the video. For each frame of the plurality of frames, the computing device processes the video by analyzing the frame to determine a region of interest in the frame, wherein the frame is a first frame, cropping the first frame based on the region of interest in the frame to produce a vertically cropped frame for the video, determining a second frame immediately preceding the first frame, and smoothing a trajectory from the second frame to the vertically cropped frame. The vertically cropped frame is displayed to a user instead of the first frame.
    Type: Application
    Filed: July 1, 2016
    Publication date: January 4, 2018
    Inventors: Jia Li, Nathan Litke, Jose Jesus (Joseph) Paredes, Rahul Sheth, Daniel Szeto, Ning Xu, Jianchao Yang
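
The vertical-cropping abstract above crops each frame around a region of interest and smooths the crop trajectory against the preceding frame so the window does not jitter. The sketch below illustrates that shape of the computation with a deliberately crude region of interest (the brightest image column) and exponential smoothing of the crop center; the patent's actual ROI detection and smoothing are not specified in the abstract.

```python
"""Per-frame vertical crop with a smoothed trajectory.  The region of
interest here is deliberately crude (the brightest image column); the
patent's ROI detection is not described in the abstract.  Assumes a
landscape BGR frame wider than the 9:16 crop."""
import cv2
import numpy as np


def smooth_center(prev_cx, cx, alpha: float = 0.2):
    """Exponentially smooth the crop center against the previous frame."""
    return cx if prev_cx is None else (1 - alpha) * prev_cx + alpha * cx


def vertical_crop(frame: np.ndarray, prev_cx=None, aspect: float = 9 / 16):
    h, w = frame.shape[:2]
    crop_w = int(h * aspect)

    # Toy region of interest: the column with the highest mean brightness.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    cx = int(np.argmax(gray.mean(axis=0)))

    cx = int(smooth_center(prev_cx, cx))
    left = int(np.clip(cx - crop_w // 2, 0, w - crop_w))
    return frame[:, left:left + crop_w], cx
```
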