Patents by Inventor Avi CACIULARU

Avi CACIULARU has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20230418909
    Abstract: Embodiments are described for automatically generating threshold values based on a target metric value that specifies a desired precision or recall performance of an ML model. For instance, a trained ML model is executed against a data set using possible threshold values. One or more accuracy metrics of the ML model are determined based on the execution. Using the accuracy metrics, evaluation metrics are modeled. A probability that a first modeled evaluation metric value has a relationship with a target metric value is determined. A determination is made that the probability has a relationship with a confidence level. Responsive to determining that the probability has the relationship with the confidence level, the threshold value is added to a set of candidate threshold values. The threshold value is then selected from among the set of candidate threshold values by choosing the candidate associated with the largest second modeled evaluation metric value.
    Type: Application
    Filed: June 24, 2022
    Publication date: December 28, 2023
    Inventors: Oren BARKAN, Avi CACIULARU, Noam KOENIGSTEIN, Nir NICE
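The selection loop this abstract describes (sweep candidate thresholds, model an evaluation metric, keep thresholds that meet the target with sufficient confidence, pick the survivor with the best secondary metric) can be sketched as follows. This is a minimal illustration under stated assumptions, not the patented procedure: the function names are invented, the "modeled evaluation metric" is taken to be a Beta posterior over precision, and recall is used as the second metric.

```python
from math import gamma

def beta_sf(x, a, b, steps=2000):
    """P(P > x) for P ~ Beta(a, b), by midpoint-rule integration."""
    norm = gamma(a + b) / (gamma(a) * gamma(b))
    dx = (1.0 - x) / steps
    total = 0.0
    for i in range(steps):
        t = x + (i + 0.5) * dx
        total += norm * t ** (a - 1) * (1.0 - t) ** (b - 1) * dx
    return total

def select_threshold(scores, labels, target_precision=0.9, confidence=0.95):
    """Return a threshold confidently meeting the precision target,
    breaking ties in favor of the highest recall (illustrative only)."""
    candidates = []
    for t in sorted(set(scores)):
        preds = [s >= t for s in scores]
        tp = sum(p and y for p, y in zip(preds, labels))
        fp = sum(p and not y for p, y in zip(preds, labels))
        fn = sum((not p) and y for p, y in zip(preds, labels))
        # Beta(tp+1, fp+1) posterior over precision; keep t only if we
        # are confident the true precision exceeds the target.
        if beta_sf(target_precision, tp + 1, fp + 1) >= confidence:
            recall = tp / (tp + fn) if (tp + fn) else 0.0
            candidates.append((recall, t))
    return max(candidates)[1] if candidates else None
```

For instance, with mostly well-separated scores the sketch keeps every threshold whose posterior precision clears the target with the requested confidence and returns the one with the largest recall.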
  • Publication number: 20230177111
    Abstract: A method of training a machine learning model is provided. The method includes receiving labeled training data in the machine learning model, the received labeled training data including content data for items accessible to a user and input usage data representing recorded interaction between the user and the items, wherein the received content data for each item includes data representing intrinsic attributes of the item. The method further includes selecting a set of the input usage data that excludes input usage data for a proper subset of the items and training the machine learning model based on both the content data and the selected set of input usage data of the received labeled training data for the items.
    Type: Application
    Filed: December 6, 2021
    Publication date: June 8, 2023
    Inventors: Oren BARKAN, Roy HIRSCH, Ori KATZ, Avi CACIULARU, Yonathan WEILL, Noam KOENIGSTEIN, Nir NICE
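The data-selection step in this abstract (keep content data for every item, but exclude usage data for a proper subset of the items) can be sketched roughly as below, so the model must rely on content alone for the held-out "cold" items. The function and argument names are illustrative assumptions, not the claimed method.

```python
import random

def select_training_usage(content, usage, holdout_frac=0.2, seed=0):
    """content: {item_id: intrinsic-attribute data}; usage: list of
    (user_id, item_id) interaction records. Drops usage records for a
    random proper subset of items while keeping all content data."""
    rng = random.Random(seed)
    items = sorted(content)                       # all item ids
    k = max(1, int(len(items) * holdout_frac))
    held_out = set(rng.sample(items, k))          # proper subset of items
    selected_usage = [(u, i) for (u, i) in usage if i not in held_out]
    return selected_usage, held_out
```

Training then proceeds on the full content data plus only the selected usage records, which is one plausible way to read the "proper subset" exclusion.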
  • Publication number: 20230137718
    Abstract: A relational similarity determination engine receives as input a dataset including a set of entities and co-occurrence data that defines co-occurrence relations for pairs of the entities. The relational similarity determination engine also receives as input side information defining explicit relations between the entities. The relational similarity determination engine jointly models the co-occurrence relations and the explicit relations for the entities to compute a similarity metric for each different pair of entities within the dataset. Based on the computed similarity metrics, the relational similarity determination engine identifies a most similar replacement entity from the dataset for each of the entities within the dataset. For a select entity received as an input, the relational similarity determination engine outputs the identified most similar replacement entity.
    Type: Application
    Filed: October 29, 2021
    Publication date: May 4, 2023
    Inventors: Oren BARKAN, Avi CACIULARU, Idan REJWAN, Yonathan WEILL, Noam KOENIGSTEIN, Ori KATZ, Itzik MALKIEL, Nir NICE
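The core idea here (combine co-occurrence similarity with explicit side-information relations, then pick each entity's most similar replacement) can be sketched with a fixed-weight linear blend. The blend weight and function names are assumptions standing in for the joint model the abstract describes.

```python
from math import sqrt

def cosine(u, v):
    num = sum(a * b for a, b in zip(u, v))
    den = sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v))
    return num / den if den else 0.0

def most_similar_replacements(cooc, relations, alpha=0.7):
    """cooc: {entity: co-occurrence count vector}; relations: set of
    frozenset entity pairs with an explicit relation. Scores every pair
    by alpha * cosine similarity + (1 - alpha) * explicit-relation flag
    and returns each entity's best replacement."""
    entities = sorted(cooc)
    best = {}
    for e in entities:
        scored = []
        for o in entities:
            if o == e:
                continue
            s = alpha * cosine(cooc[e], cooc[o])
            s += (1 - alpha) * (frozenset((e, o)) in relations)
            scored.append((s, o))
        best[e] = max(scored)[1]
    return best
```

An entity with no co-occurrence overlap can still surface as a replacement through an explicit relation, which is the role the side information plays in the abstract.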
  • Publication number: 20230137692
    Abstract: A computing system scores importance of a number of tokens in an input token sequence to one or more prediction scores computed by a neural network model on the input token sequence. The neural network model includes multiple encoding layers. Self-attention matrices of the neural network model are received into an importance evaluator. The self-attention matrices are generated by the neural network model while computing the one or more prediction scores based on the input token sequence. Each self-attention matrix corresponds to one of the multiple encoding layers. The importance evaluator generates an importance score for one or more of the tokens in the input token sequence. Each importance score is based on a summation as a function of the self-attention matrices, the summation being computed across the tokens in the input token sequence, across the self-attention matrices, and across the multiple encoding layers in the neural network model.
    Type: Application
    Filed: October 29, 2021
    Publication date: May 4, 2023
    Inventors: Oren BARKAN, Edan HAUON, Ori KATZ, Avi CACIULARU, Itzik MALKIEL, Omri ARMSTRONG, Amir HERTZ, Noam KOENIGSTEIN, Nir NICE
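The summation the abstract describes (across tokens, across self-attention matrices, and across encoding layers) reduces to a few axis sums over the attention tensors. The sketch below is a simplified reading, not the patented scoring function; shapes and names are assumptions.

```python
import numpy as np

def attention_importance(attn_layers):
    """attn_layers: list of arrays, one per encoding layer, each of
    shape (num_heads, seq_len, seq_len) with rows summing to 1.
    Importance of token j = total attention it receives, summed over
    query positions, heads, and layers."""
    total = np.zeros(attn_layers[0].shape[-1])
    for layer in attn_layers:
        # sum over heads (axis 0) and query positions (axis 1),
        # leaving one score per key token j
        total += layer.sum(axis=(0, 1))
    return total
```

Under uniform attention every token scores equally; a token that many queries attend to accumulates a proportionally larger importance score.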
  • Publication number: 20230138579
    Abstract: An anchor-based collaborative filtering system receives a training dataset including user-item interactions each identifying a user and an item that the user has positively interacted with. The system defines a vector space and distributes the items of the training dataset within the vector space based on a determined similarity of the items. The system further defines a set of taste anchors that are each associated in memory with a subgroup of the items in a same neighborhood of the vector space. To make a recommendation to an individual user, the system identifies an anchor-based representation for the individual user that includes a subset of the defined taste anchors that best represents the types of items that the user has favorably interacted with in the past. The taste anchors included in the identified anchor-based representation for the individual user are used to make recommendations to the user in the future.
    Type: Application
    Filed: October 29, 2021
    Publication date: May 4, 2023
    Inventors: Oren BARKAN, Roy HIRSCH, Ori KATZ, Avi CACIULARU, Noam KOENIGSTEIN, Nir NICE
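The step of building a user's anchor-based representation (find the taste anchors closest to the items the user has interacted with) can be sketched as a proximity ranking. Names, shapes, and the negative-distance scoring are illustrative assumptions rather than the claimed mechanism.

```python
import numpy as np

def user_taste_anchors(item_vecs, anchors, user_items, k=2):
    """item_vecs: (num_items, dim) item embeddings; anchors:
    (num_anchors, dim) taste-anchor positions; user_items: indices of
    items the user positively interacted with. Scores each anchor by
    its total closeness to the user's items and keeps the k best."""
    scores = []
    for a_idx, a in enumerate(anchors):
        closeness = sum(-np.linalg.norm(item_vecs[i] - a)
                        for i in user_items)
        scores.append((closeness, a_idx))
    top = sorted(scores, reverse=True)[:k]
    return sorted(idx for _, idx in top)
```

Recommendations would then be drawn from items in the neighborhoods of the returned anchors, which is the downstream role the abstract assigns to the representation.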
  • Publication number: 20230137744
    Abstract: A method of generating an aggregate saliency map using a convolutional neural network. Convolutional activation maps of the convolutional neural network model are received into a saliency map generator, the convolutional activation maps being generated by the neural network model while computing the one or more prediction scores based on unlabeled input data. Each convolutional activation map corresponds to one of the multiple encoding layers. The saliency map generator generates a layer-dependent saliency map for each encoding layer of the unlabeled input data, each layer-dependent saliency map being based on a summation of element-wise products of the convolutional activation maps and their corresponding gradients. The layer-dependent saliency maps are combined into the aggregate saliency map indicating the relative contributions of individual components of the unlabeled input data to the one or more prediction scores computed by the convolutional neural network model on the unlabeled input data.
    Type: Application
    Filed: October 29, 2021
    Publication date: May 4, 2023
    Inventors: Oren BARKAN, Omri ARMSTRONG, Amir HERTZ, Avi CACIULARU, Ori KATZ, Itzik MALKIEL, Noam KOENIGSTEIN, Nir NICE
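The per-layer computation the abstract names (a summation of element-wise products of convolutional activation maps and their gradients) resembles Grad-CAM-style saliency, and can be sketched as below. The ReLU, per-layer normalization, and averaging used to combine layers are assumptions; the abstract does not specify the combination rule.

```python
import numpy as np

def aggregate_saliency(activations, gradients):
    """activations/gradients: parallel lists, one pair per encoding
    layer, each array of shape (channels, H, W). Per layer: channel-wise
    sum of activation * gradient, clipped at zero and normalized; the
    aggregate map is the mean over layers."""
    agg = np.zeros(activations[0].shape[1:])
    for act, grad in zip(activations, gradients):
        layer_map = np.maximum((act * grad).sum(axis=0), 0.0)
        if layer_map.max() > 0:
            layer_map /= layer_map.max()
        agg += layer_map
    return agg / len(activations)
```

The resulting map assigns each spatial position a relative contribution to the prediction scores, which matches the stated purpose of the aggregate saliency map.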
  • Publication number: 20220231785
    Abstract: Disclosed herein is a neural network based pre-decoder comprising a permutation embedding engine and a permutation classifier, each comprising one or more trained neural networks, and a selection unit. The permutation embedding engine is trained to compute a plurality of permutation embedding vectors, each for a respective one of a plurality of permutations of a received codeword encoded using an error correction code and transmitted over a transmission channel subject to interference. The permutation classifier is trained to compute a decode score for each of the plurality of permutations, expressing its probability of being successfully decoded, based on classification of the plurality of permutation embedding vectors coupled with the plurality of permutations. The selection unit is configured to output one or more selected permutations having a highest decode score. One or more decoders may then be applied to recover the encoded codeword by decoding the one or more selected permutations.
    Type: Application
    Filed: January 10, 2022
    Publication date: July 21, 2022
    Applicants: Ramot at Tel-Aviv University Ltd., Bar-Ilan University
    Inventors: Yair BEERY, Nir RAVIV, Tomer RAVIV, Jacob GOLDBERGER, Avi CACIULARU
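The selection stage of the pre-decoder (score every candidate permutation of the received word, then pass the highest-scoring ones to the decoder) can be sketched as below. The trained embedding engine and classifier are replaced here by a pluggable `scorer` callable, and all names are assumptions; this is the selection plumbing only, not the learned scoring.

```python
def select_permutations(llrs, permutations, scorer, top_k=1):
    """llrs: per-bit channel reliabilities of the received word;
    permutations: candidate index permutations (e.g. code
    automorphisms); scorer: stand-in for the trained
    embedding + classifier, mapping a permuted word to a decode score.
    Returns the top_k permutations by decode score."""
    scored = []
    for idx, perm in enumerate(permutations):
        permuted = [llrs[p] for p in perm]
        scored.append((scorer(permuted), idx))
    scored.sort(reverse=True)
    return [permutations[i] for _, i in scored[:top_k]]
```

A decoder would then be run on each selected permutation and the inverse permutation applied to recover the codeword, mirroring the final step of the abstract.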