Patents by Inventor Aaron Yehuda Sarna

Aaron Yehuda Sarna has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20230153629
    Abstract: The present disclosure provides an improved training methodology that enables supervised contrastive learning to be simultaneously performed across multiple positive and negative training examples. In particular, example aspects of the present disclosure are directed to an improved, supervised version of the batch contrastive loss, which has been shown to be very effective at learning powerful representations in the self-supervised setting. Thus, the proposed techniques adapt contrastive learning to the fully supervised setting and also enable learning to occur simultaneously across multiple positive examples. (An illustrative sketch of this kind of loss appears after the listing.)
    Type: Application
    Filed: April 12, 2021
    Publication date: May 18, 2023
    Inventors: Dilip Krishnan, Prannay Khosla, Piotr Teterwak, Aaron Yehuda Sarna, Aaron Joseph Maschinot, Ce Liu, Phillip John Isola, Yonglong Tian, Chen Wang
  • Patent number: 11347975
    Abstract: The present disclosure provides an improved training methodology that enables supervised contrastive learning to be simultaneously performed across multiple positive and negative training examples. In particular, example aspects of the present disclosure are directed to an improved, supervised version of the batch contrastive loss, which has been shown to be very effective at learning powerful representations in the self-supervised setting. Thus, the proposed techniques adapt contrastive learning to the fully supervised setting and also enable learning to occur simultaneously across multiple positive examples.
    Type: Grant
    Filed: April 21, 2021
    Date of Patent: May 31, 2022
    Assignee: GOOGLE LLC
    Inventors: Dilip Krishnan, Prannay Khosla, Piotr Teterwak, Aaron Yehuda Sarna, Aaron Joseph Maschinot, Ce Liu, Phillip John Isola, Yonglong Tian, Chen Wang
  • Publication number: 20210326660
    Abstract: The present disclosure provides an improved training methodology that enables supervised contrastive learning to be simultaneously performed across multiple positive and negative training examples. In particular, example aspects of the present disclosure are directed to an improved, supervised version of the batch contrastive loss, which has been shown to be very effective at learning powerful representations in the self-supervised setting. Thus, the proposed techniques adapt contrastive learning to the fully supervised setting and also enable learning to occur simultaneously across multiple positive examples.
    Type: Application
    Filed: April 21, 2021
    Publication date: October 21, 2021
    Inventors: Dilip Krishnan, Prannay Khosla, Piotr Teterwak, Aaron Yehuda Sarna, Aaron Joseph Maschinot, Ce Liu, Phillip John Isola, Yonglong Tian, Chen Wang
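
For context on what the abstracts above describe, the following is a minimal illustrative sketch of a batch supervised contrastive loss in which every same-label example in a batch serves as a positive for a given anchor and all remaining batch examples serve as negatives. It is written against assumed NumPy inputs (embeddings of shape [batch, dim], integer labels of shape [batch], and a temperature hyperparameter); the function and variable names are illustrative assumptions, and this is not the specific implementation claimed in the filings above.

    import numpy as np

    def supervised_contrastive_loss(embeddings, labels, temperature=0.1):
        """Illustrative batch supervised contrastive loss: for each anchor,
        all other same-label examples in the batch are positives and all
        remaining examples are negatives."""
        # L2-normalize so pairwise dot products are cosine similarities
        z = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
        sim = z @ z.T / temperature                      # scaled pairwise similarities
        n = labels.shape[0]
        not_self = ~np.eye(n, dtype=bool)                # exclude each anchor from its own contrast set
        pos_mask = (labels[:, None] == labels[None, :]) & not_self
        # log-softmax of each anchor's similarities over all other batch examples
        sim_max = np.max(np.where(not_self, sim, -np.inf), axis=1, keepdims=True)
        exp_sim = np.exp(sim - sim_max) * not_self
        log_prob = sim - sim_max - np.log(exp_sim.sum(axis=1, keepdims=True))
        # average the log-probability over every positive of each anchor,
        # which is what lets learning use multiple positives simultaneously
        pos_counts = pos_mask.sum(axis=1)
        has_pos = pos_counts > 0                         # skip anchors with no positive in the batch
        mean_log_prob_pos = (pos_mask * log_prob).sum(axis=1)[has_pos] / pos_counts[has_pos]
        return -mean_log_prob_pos.mean()

    # Example usage with random embeddings and three classes
    rng = np.random.default_rng(0)
    z = rng.normal(size=(8, 16))
    y = np.array([0, 0, 1, 1, 2, 2, 0, 1])
    print(supervised_contrastive_loss(z, y))

Averaging the per-positive log-probabilities before negating is one common way of extending the self-supervised batch contrastive loss to multiple positives; the exact formulation covered by these filings may differ.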