Patents by Inventor Noam RAZIN
Noam RAZIN has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 12327085
Abstract: The disclosure herein describes a system and method for attentive sentence similarity scoring. A distilled sentence embedding (DSE) language model is trained by decoupling a transformer language model using knowledge distillation. The trained DSE language model calculates sentence embeddings for a plurality of candidate sentences for sentence similarity comparisons. An embedding component associated with the trained DSE language model generates a plurality of candidate sentence representations, one for each candidate sentence, which are stored for use in analyzing input sentences associated with queries or searches. A representation is created for a selected sentence, and this selected sentence representation is used with the stored candidate sentence representations to create a similarity score for each candidate sentence-selected sentence pair.
Type: Grant
Filed: June 20, 2022
Date of Patent: June 10, 2025
Assignee: Microsoft Technology Licensing, LLC
Inventors: Oren Barkan, Noam Razin, Noam Koenigstein
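The scoring pipeline in this abstract (precompute candidate representations once, embed the selected sentence at query time, score each candidate-selected pair) can be sketched as follows. This is a toy illustration, not the patented system: the `embed` function here is a hypothetical stand-in using word counts, where the patent uses the trained DSE language model, and cosine similarity stands in for whatever scoring function the claims cover.

```python
import math
from collections import Counter

def embed(sentence):
    """Hypothetical stand-in encoder: the patent uses the trained DSE
    language model; word counts keep this sketch runnable."""
    return Counter(sentence.lower().split())

def cosine(u, v):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(u[k] * v[k] for k in u if k in v)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

# Candidate representations are computed once and stored, as the abstract describes.
candidates = ["the cat sat on the mat",
              "stock prices fell sharply",
              "a cat rested on a rug"]
candidate_reps = [embed(c) for c in candidates]

def score_against_candidates(selected_sentence):
    """Embed the selected sentence, then score each candidate-selected pair."""
    rep = embed(selected_sentence)
    return [cosine(rep, c) for c in candidate_reps]

scores = score_against_candidates("the cat sat on a mat")
```

Because the expensive transformer work is distilled into an encoder that runs once per sentence, candidate embeddings can be precomputed offline and only the selected sentence needs encoding at query time.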
-
Publication number: 20240403353
Abstract: Machine learning multiple features of an item depicted in images. Upon accessing multiple images that depict the item, a neural network is trained on those images to generate an embedding vector for each of multiple features of the item. For each feature, in each iteration of the machine learning, the embedding vector is converted into a probability vector representing the probabilities that the feature has respective values. That probability vector is then compared with a value vector representing the actual value of that feature in the depicted item, and an error between the two vectors is determined. That error is used to adjust the parameters of the neural network that generates the embedding vector, enabling the next iteration of embedding vector generation. These iterative updates continue, thereby training the neural network.
Type: Application
Filed: August 9, 2024
Publication date: December 5, 2024
Inventors: Oren BARKAN, Noam RAZIN, Noam KOENIGSTEIN, Roy HIRSCH, Nir NICE
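The training iteration this family of filings describes (embedding vector, converted to a probability vector, compared with the actual-value vector, error driving a parameter adjustment) is the familiar softmax-plus-cross-entropy loop. A minimal sketch, with hypothetical names and a single linear layer standing in for the neural network, and one feature ("colour") with three possible values standing in for the multiple item features:

```python
import math, random

random.seed(0)

# Toy setup (hypothetical): each "image" is a 4-dim summary, and the
# feature takes one of 3 values. A linear layer plays the role of the
# network that emits the feature's embedding vector.
N_IN, N_VALUES = 4, 3
weights = [[random.uniform(-0.1, 0.1) for _ in range(N_IN)]
           for _ in range(N_VALUES)]

def softmax(z):
    """Convert an embedding vector into a probability vector."""
    m = max(z)
    e = [math.exp(v - m) for v in z]
    s = sum(e)
    return [v / s for v in e]

def train_step(image, value_index, lr=0.5):
    """One iteration: embedding -> probability vector -> error against
    the one-hot value vector -> parameter adjustment."""
    embedding = [sum(w * x for w, x in zip(row, image)) for row in weights]
    probs = softmax(embedding)
    one_hot = [1.0 if i == value_index else 0.0 for i in range(N_VALUES)]
    error = [p - t for p, t in zip(probs, one_hot)]  # softmax cross-entropy gradient
    for i in range(N_VALUES):
        for j in range(N_IN):
            weights[i][j] -= lr * error[i] * image[j]
    return probs

# Iterate over images depicting the item; repeated passes train the network.
data = [([1, 0, 0, 1], 0), ([0, 1, 1, 0], 2)]
for _ in range(200):
    for image, value in data:
        train_step(image, value)
```

In the actual filings the per-feature probability vector would come from a deep network over the image, and one such loop runs for each feature of the depicted item; the error-to-update step is the same shape.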
-
Patent number: 12093305
Abstract: Machine learning multiple features of an item depicted in images. Upon accessing multiple images that depict the item, a neural network is trained on those images to generate an embedding vector for each of multiple features of the item. For each feature, in each iteration of the machine learning, the embedding vector is converted into a probability vector representing the probabilities that the feature has respective values. That probability vector is then compared with a value vector representing the actual value of that feature in the depicted item, and an error between the two vectors is determined. That error is used to adjust the parameters of the neural network that generates the embedding vector, enabling the next iteration of embedding vector generation. These iterative updates continue, thereby training the neural network.
Type: Grant
Filed: June 19, 2023
Date of Patent: September 17, 2024
Assignee: Microsoft Technology Licensing, LLC
Inventors: Oren Barkan, Noam Razin, Noam Koenigstein, Roy Hirsch, Nir Nice
-
Publication number: 20230334085
Abstract: Machine learning multiple features of an item depicted in images. Upon accessing multiple images that depict the item, a neural network is trained on those images to generate an embedding vector for each of multiple features of the item. For each feature, in each iteration of the machine learning, the embedding vector is converted into a probability vector representing the probabilities that the feature has respective values. That probability vector is then compared with a value vector representing the actual value of that feature in the depicted item, and an error between the two vectors is determined. That error is used to adjust the parameters of the neural network that generates the embedding vector, enabling the next iteration of embedding vector generation. These iterative updates continue, thereby training the neural network.
Type: Application
Filed: June 19, 2023
Publication date: October 19, 2023
Inventors: Oren BARKAN, Noam RAZIN, Noam KOENIGSTEIN, Roy HIRSCH, Nir NICE
-
Patent number: 11720622
Abstract: Machine learning multiple features of an item depicted in images. Upon accessing multiple images that depict the item, a neural network is trained on those images to generate an embedding vector for each of multiple features of the item. For each feature, in each iteration of the machine learning, the embedding vector is converted into a probability vector representing the probabilities that the feature has respective values. That probability vector is then compared with a value vector representing the actual value of that feature in the depicted item, and an error between the two vectors is determined. That error is used to adjust the parameters of the neural network that generates the embedding vector, enabling the next iteration of embedding vector generation. These iterative updates continue, thereby training the neural network.
Type: Grant
Filed: June 9, 2022
Date of Patent: August 8, 2023
Inventors: Oren Barkan, Noam Razin, Noam Koenigstein, Roy Hirsch, Nir Nice
-
Publication number: 20220318507
Abstract: The disclosure herein describes a system and method for attentive sentence similarity scoring. A distilled sentence embedding (DSE) language model is trained by decoupling a transformer language model using knowledge distillation. The trained DSE language model calculates sentence embeddings for a plurality of candidate sentences for sentence similarity comparisons. An embedding component associated with the trained DSE language model generates a plurality of candidate sentence representations, one for each candidate sentence, which are stored for use in analyzing input sentences associated with queries or searches. A representation is created for a selected sentence, and this selected sentence representation is used with the stored candidate sentence representations to create a similarity score for each candidate sentence-selected sentence pair.
Type: Application
Filed: June 20, 2022
Publication date: October 6, 2022
Inventors: Oren BARKAN, Noam RAZIN, Noam KOENIGSTEIN
-
Publication number: 20220300814
Abstract: Machine learning multiple features of an item depicted in images. Upon accessing multiple images that depict the item, a neural network is trained on those images to generate an embedding vector for each of multiple features of the item. For each feature, in each iteration of the machine learning, the embedding vector is converted into a probability vector representing the probabilities that the feature has respective values. That probability vector is then compared with a value vector representing the actual value of that feature in the depicted item, and an error between the two vectors is determined. That error is used to adjust the parameters of the neural network that generates the embedding vector, enabling the next iteration of embedding vector generation. These iterative updates continue, thereby training the neural network.
Type: Application
Filed: June 9, 2022
Publication date: September 22, 2022
Inventors: Oren BARKAN, Noam RAZIN, Noam KOENIGSTEIN, Roy HIRSCH, Nir NICE
-
Patent number: 11392770
Abstract: The disclosure herein describes a system and method for attentive sentence similarity scoring. A distilled sentence embedding (DSE) language model is trained by decoupling a transformer language model using knowledge distillation. The trained DSE language model calculates sentence embeddings for a plurality of candidate sentences for sentence similarity comparisons. An embedding component associated with the trained DSE language model generates a plurality of candidate sentence representations, one for each candidate sentence, which are stored for use in analyzing input sentences associated with queries or searches. A representation is created for a selected sentence, and this selected sentence representation is used with the stored candidate sentence representations to create a similarity score for each candidate sentence-selected sentence pair.
Type: Grant
Filed: February 12, 2020
Date of Patent: July 19, 2022
Assignee: Microsoft Technology Licensing, LLC
Inventors: Oren Barkan, Noam Razin, Noam Koenigstein
-
Patent number: 11373095
Abstract: Machine learning multiple features of an item depicted in images. Upon accessing multiple images that depict the item, a neural network is trained on those images to generate an embedding vector for each of multiple features of the item. For each feature, in each iteration of the machine learning, the embedding vector is converted into a probability vector representing the probabilities that the feature has respective values. That probability vector is then compared with a value vector representing the actual value of that feature in the depicted item, and an error between the two vectors is determined. That error is used to adjust the parameters of the neural network that generates the embedding vector, enabling the next iteration of embedding vector generation. These iterative updates continue, thereby training the neural network.
Type: Grant
Filed: December 23, 2019
Date of Patent: June 28, 2022
Inventors: Oren Barkan, Noam Razin, Noam Koenigstein, Roy Hirsch, Nir Nice
-
Publication number: 20210192338
Abstract: Machine learning multiple features of an item depicted in images. Upon accessing multiple images that depict the item, a neural network is trained on those images to generate an embedding vector for each of multiple features of the item. For each feature, in each iteration of the machine learning, the embedding vector is converted into a probability vector representing the probabilities that the feature has respective values. That probability vector is then compared with a value vector representing the actual value of that feature in the depicted item, and an error between the two vectors is determined. That error is used to adjust the parameters of the neural network that generates the embedding vector, enabling the next iteration of embedding vector generation. These iterative updates continue, thereby training the neural network.
Type: Application
Filed: December 23, 2019
Publication date: June 24, 2021
Inventors: Oren BARKAN, Noam RAZIN, Noam KOENIGSTEIN, Roy HIRSCH, Nir NICE
-
Publication number: 20210192000
Abstract: Computerized searching for an item based on a previously viewed item. A displayed item is identified as the query input item to be used in searching for a target item. That input item has an associated set of embedding vectors, each representing a respective feature of the input item. Target features of the search are then identified based on the input item. For each feature of the target item that is desired to be the same as in the input item, the input item's embedding vector for that feature is used in the search. For each feature that is desired to be different from the input item, a special vector associated with the desired value of that feature is used instead. These vectors are then compared against target items to find close matches.
Type: Application
Filed: December 23, 2019
Publication date: June 24, 2021
Inventors: Oren BARKAN, Noam RAZIN, Roy HIRSCH, Noam KOENIGSTEIN, Nir NICE
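The query construction in this abstract (keep the input item's vector for features that should match, swap in a special vector for features that should differ, then rank targets by vector closeness) can be sketched as below. The catalogue, feature names, and special vectors are all hypothetical toy data; cosine similarity is a stand-in for whatever distance the filing actually claims.

```python
import math

def cosine(u, v):
    """Cosine similarity between two dense vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

# Hypothetical catalogue: each item carries one embedding vector per feature.
catalogue = {
    "red_sneaker":  {"colour": [1.0, 0.0], "style": [0.9, 0.1]},
    "blue_sneaker": {"colour": [0.0, 1.0], "style": [0.9, 0.1]},
    "blue_boot":    {"colour": [0.0, 1.0], "style": [0.1, 0.9]},
}

# Special vectors keyed by (feature, desired value), for features the user wants changed.
special_vectors = {("colour", "blue"): [0.0, 1.0]}

def search(input_item, change=None):
    """Build the query from the input item's feature vectors, swapping in
    the special vector for each feature desired to be different, then
    return the closest other item."""
    change = change or {}
    query = {}
    for feature, vec in catalogue[input_item].items():
        if feature in change:
            query[feature] = special_vectors[(feature, change[feature])]
        else:
            query[feature] = vec
    scores = {}
    for name, feats in catalogue.items():
        if name == input_item:
            continue
        scores[name] = sum(cosine(query[f], feats[f]) for f in query)
    return max(scores, key=scores.get)
```

For example, starting from the red sneaker and asking for a different colour ("blue") should surface the blue sneaker rather than the blue boot, because the style vector is still matched against the input item.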
-
Publication number: 20210182489
Abstract: The disclosure herein describes a system and method for attentive sentence similarity scoring. A distilled sentence embedding (DSE) language model is trained by decoupling a transformer language model using knowledge distillation. The trained DSE language model calculates sentence embeddings for a plurality of candidate sentences for sentence similarity comparisons. An embedding component associated with the trained DSE language model generates a plurality of candidate sentence representations, one for each candidate sentence, which are stored for use in analyzing input sentences associated with queries or searches. A representation is created for a selected sentence, and this selected sentence representation is used with the stored candidate sentence representations to create a similarity score for each candidate sentence-selected sentence pair.
Type: Application
Filed: February 12, 2020
Publication date: June 17, 2021
Inventors: Oren BARKAN, Noam RAZIN, Noam KOENIGSTEIN