Patents by Inventor Wojciech Zaremba

Wojciech Zaremba has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240020096
    Abstract: Disclosed herein are methods, systems, and computer-readable media for generating computer code based on natural language input. In an embodiment, a method may comprise one or more of: receiving a docstring representing natural language text specifying a digital programming result; generating, using a trained machine learning model, and based on the docstring, a computer code sample configured to produce respective candidate results; causing the computer code sample to be executed; identifying, based on the executing, a computer code sample configured to produce a particular candidate result associated with the digital programming result; performing at least one of outputting, via a user interface, the identified computer code sample, compiling the identified computer code sample, transmitting the identified computer code sample to a recipient device, storing the identified computer code sample, and/or re-executing the identified computer code sample.
    Type: Application
    Filed: May 23, 2023
    Publication date: January 18, 2024
    Applicant: OpenAI Opco, LLC
    Inventors: Mark CHEN, Jerry TWOREK, Ilya SUTSKEVER, Wojciech ZAREMBA, Heewoo JUN, Henrique PONDE DE OLIVEIRA PINTO
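The generate-execute-filter loop this abstract describes can be sketched in a few lines. This is a minimal illustration, not the patented implementation: `toy_model` is a hypothetical stub standing in for the trained machine learning model, and candidate selection is reduced to executing each sample and keeping the first one that passes a caller-supplied check.

```python
def toy_model(docstring):
    # Hypothetical stub for the trained model: a real system would sample
    # candidate code from a model conditioned on the docstring.
    return [
        "def f(x):\n    return x + x",   # wrong: doubles its input
        "def f(x):\n    return x * x",   # squares its input
    ]

def select_sample(docstring, check):
    """Generate candidates, execute each, keep the first passing `check`."""
    for sample in toy_model(docstring):
        namespace = {}
        try:
            exec(sample, namespace)       # cause the code sample to be executed
            if check(namespace["f"]):     # identify the sample producing the result
                return sample
        except Exception:
            continue                      # discard samples that crash
    return None

best = select_sample("Return the square of x.", lambda f: f(3) == 9)
```

The selected sample could then be output, compiled, stored, or re-executed, as the abstract enumerates.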
  • Publication number: 20240020116
    Abstract: Disclosed herein are methods, systems, and computer-readable media for generating natural language based on computer code input. In an embodiment, a method may comprise one or more of: accessing a docstring generation model configured to generate docstrings from computer code; receiving one or more computer code samples; generating, using the docstring generation model and based on the received one or more computer code samples, one or more candidate docstrings representing natural language text, each of the one or more candidate docstrings being associated with at least a portion of the one or more computer code samples; identifying at least one of the one or more candidate docstrings that provides an intent of the at least a portion of the one or more computer code samples; and/or outputting, via a user interface, the at least one identified docstring with the at least a portion of the one or more computer code samples.
    Type: Application
    Filed: May 23, 2023
    Publication date: January 18, 2024
    Applicant: OpenAI Opco, LLC
    Inventors: Mark CHEN, Jerry TWOREK, Ilya SUTSKEVER, Wojciech ZAREMBA, Heewoo JUN, Henrique PONDE DE OLIVEIRA PINTO
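The inverse direction, code to docstring, can be sketched similarly. Again a toy illustration under stated assumptions: `candidate_docstrings` is a hypothetical stub for the docstring generation model, and "identifying the candidate that provides the intent" is approximated by a crude token-overlap score.

```python
import re

def candidate_docstrings(code):
    # Hypothetical stub for the docstring generation model.
    return ["Compute the factorial of n.", "Sort a list in place."]

def identify_intent(code, candidates):
    """Pick the candidate sharing the most word tokens with the code."""
    code_tokens = set(re.findall(r"[a-z]+", code.lower()))
    return max(
        candidates,
        key=lambda d: len(code_tokens & set(re.findall(r"[a-z]+", d.lower()))),
    )

code = "def factorial(n):\n    return 1 if n < 2 else n * factorial(n - 1)"
best = identify_intent(code, candidate_docstrings(code))
```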
  • Patent number: 11080594
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for augmenting neural networks with an external memory using reinforcement learning. One of the methods includes providing an output derived from the system output portion of the neural network output as a system output in the sequence of system outputs; selecting a memory access process from a predetermined set of memory access processes for accessing the external memory from the reinforcement learning portion of the neural network output; writing and reading data from locations in the external memory in accordance with the selected memory access process using the differentiable portion of the neural network output; and combining the data read from the external memory with a next system input in the sequence of system inputs to generate a next neural network input in the sequence of neural network inputs.
    Type: Grant
    Filed: December 30, 2016
    Date of Patent: August 3, 2021
    Assignee: DeepMind Technologies Limited
    Inventors: Ilya Sutskever, Ivo Danihelka, Alexander Benjamin Graves, Gregory Duncan Wayne, Wojciech Zaremba
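The control flow in this abstract can be sketched with a tiny discrete memory. This is a schematic only: the reinforcement-learned policy that selects an access process is replaced by an explicit `action` argument, and the "memory access processes" are reduced to a write and a pointer shift.

```python
MEMORY_SIZE = 4

def step(memory, system_input, pointer, action):
    """One controller step: apply the selected memory access process,
    then combine the value read with the next system input."""
    if action == "write":
        memory[pointer] = system_input       # write at the current location
    elif action == "shift":
        pointer = (pointer + 1) % MEMORY_SIZE  # move to the next location
    value = memory[pointer]                   # read from the external memory
    next_network_input = (system_input, value)  # combined next input
    return pointer, next_network_input

memory = [0] * MEMORY_SIZE
pointer = 0
pointer, nxt = step(memory, 7, pointer, "write")
```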
  • Patent number: 10936828
Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for neural translation systems with rare word processing. One of the methods is a method of training a neural network translation system to track the source in source sentences of unknown words in target sentences, in a source language and a target language, respectively, and includes deriving alignment data from a parallel corpus, the alignment data identifying, in each pair of source and target language sentences in the parallel corpus, aligned source and target words; annotating the sentences in the parallel corpus according to the alignment data and a rare word model to generate a training dataset of paired source and target language sentences; and training a neural network translation model on the training dataset.
    Type: Grant
    Filed: November 16, 2018
    Date of Patent: March 2, 2021
    Assignee: Google LLC
    Inventors: Quoc V. Le, Minh-Thang Luong, Ilya Sutskever, Oriol Vinyals, Wojciech Zaremba
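The annotation step this abstract describes can be illustrated with a toy corpus. A sketch under assumptions: the vocabulary, alignment, and relative-position unknown-word token are simplified stand-ins for what the rare word model actually produces.

```python
VOCAB = {"the", "cat", "sat"}   # toy target-side vocabulary

def annotate(source, target, alignment):
    """Replace out-of-vocabulary target words with positional unk tokens
    pointing at the aligned source word, per the alignment data."""
    out = []
    for t_idx, word in enumerate(target):
        if word in VOCAB:
            out.append(word)
        else:
            s_idx = alignment[t_idx]               # aligned source position
            out.append(f"<unk:{s_idx - t_idx}>")   # relative-position token
    return out

src = ["le", "chat", "Zaremba"]
tgt = ["the", "cat", "Zaremba"]
annotated = annotate(src, tgt, {0: 0, 1: 1, 2: 2})
```

A translation model trained on such annotated pairs can emit the positional token and let a post-processing step copy the aligned source word through.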
  • Patent number: 10657435
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for processing an input sequence using a recurrent neural network to generate an output for the input sequence. One of the methods includes receiving the input sequence; generating a doubled sequence comprising a first instance of the input sequence followed by a second instance of the input sequence; and processing the doubled sequence using the recurrent neural network to generate the output for the input sequence.
    Type: Grant
    Filed: October 7, 2015
    Date of Patent: May 19, 2020
    Assignee: Google LLC
    Inventors: Ilya Sutskever, Wojciech Zaremba
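The doubling trick here is simple enough to sketch directly. The recurrent network is replaced by a toy accumulator step; the point is only the input transformation, where the second pass lets the network revisit each element after having seen the whole sequence once.

```python
def doubled(seq):
    """A first instance of the input followed by a second instance."""
    return seq + seq

def process(seq, rnn_step, state=0):
    # Feed the doubled sequence through a toy recurrent step.
    for x in doubled(seq):
        state = rnn_step(state, x)
    return state

result = process([1, 2, 3], lambda s, x: s + x)
```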
  • Patent number: 10402752
    Abstract: A system for training a model to predict a sequence (e.g. a sequence of words) given a context is disclosed. A model can be trained to make these predictions using a combination of individual predictions compared to base truth and sequences of predictions based on previous predictions, where the resulting sequence is compared to the base truth sequence. In particular, the model can initially use the individual predictions to train the model. The model can then be further trained over the training data in multiple iterations, where each iteration includes two processes for each training element. In the first process, an initial part of the sequence is predicted, and the model and model parameters are updated after each prediction. In the second process, the entire remaining amount of the sequence is predicted and compared to the corresponding training sequence to adjust model parameters to encourage or discourage each prediction.
    Type: Grant
    Filed: November 18, 2016
    Date of Patent: September 3, 2019
    Assignee: Facebook, Inc.
    Inventors: Marc Aurelio Ranzato, Sumit Chopra, Michael Auli, Wojciech Zaremba
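The two-process iteration in this abstract can be sketched numerically. This is a deliberately toy model (a single scalar bias as its only parameter) meant only to show the structure: per-step updates against ground truth for an initial prefix, then a free-running prediction of the remainder scored as a whole.

```python
def train_element(target, model, split, lr=0.5):
    """One training element, two processes: teacher-forced updates for the
    first `split` steps, then free-running prediction of the remainder."""
    # Process 1: predict each prefix token; update after every prediction.
    for t in range(split):
        pred = model["bias"]
        model["bias"] += lr * (target[t] - pred)   # per-step correction
    # Process 2: predict the entire remainder from the model's own output,
    # then compare the resulting sequence to the ground-truth sequence.
    preds = [model["bias"]] * (len(target) - split)
    error = sum(abs(p - g) for p, g in zip(preds, target[split:]))
    model["bias"] += lr * sum(g - p for p, g in zip(preds, target[split:])) / max(len(preds), 1)
    return error

model = {"bias": 0.0}
err = train_element([1.0, 1.0, 1.0, 1.0], model, split=2)
```

Over multiple iterations the prefix handled by process 1 would typically shrink, shifting training from per-token to sequence-level feedback.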
  • Patent number: 10380482
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for training a neural network. One of the methods includes obtaining partitioned training data for the neural network, wherein the partitioned training data comprises a plurality of training items each of which is assigned to a respective one of a plurality of partitions, wherein each partition is associated with a respective difficulty level; and training the neural network on each of the partitions in a sequence from a partition associated with an easiest difficulty level to a partition associated with a hardest difficulty level, wherein, for each of the partitions, training the neural network comprises: training the neural network on a sequence of training items that includes training items selected from the training items in the partition interspersed with training items selected from the training items in all of the partitions.
    Type: Grant
    Filed: October 7, 2015
    Date of Patent: August 13, 2019
    Assignee: Google LLC
    Inventors: Ilya Sutskever, Wojciech Zaremba
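The curriculum schedule this abstract describes can be sketched as follows. A minimal illustration, assuming partitions are already ordered easiest to hardest; the interspersal rate `mix_prob` is an invented parameter for the sketch.

```python
import random
random.seed(0)

def curriculum(partitions, mix_prob=0.25):
    """Yield training items partition by partition, easiest first, with
    items drawn from all partitions interspersed along the way."""
    everything = [item for part in partitions for item in part]
    schedule = []
    for part in partitions:                  # ordered easy -> hard
        for item in part:
            schedule.append(item)
            if random.random() < mix_prob:   # intersperse a global sample
                schedule.append(random.choice(everything))
    return schedule

parts = [["e1", "e2"], ["m1", "m2"], ["h1", "h2"]]
sched = curriculum(parts)
```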
  • Publication number: 20190188268
Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for neural translation systems with rare word processing. One of the methods is a method of training a neural network translation system to track the source in source sentences of unknown words in target sentences, in a source language and a target language, respectively, and includes deriving alignment data from a parallel corpus, the alignment data identifying, in each pair of source and target language sentences in the parallel corpus, aligned source and target words; annotating the sentences in the parallel corpus according to the alignment data and a rare word model to generate a training dataset of paired source and target language sentences; and training a neural network translation model on the training dataset.
    Type: Application
    Filed: November 16, 2018
    Publication date: June 20, 2019
    Inventors: Quoc V. Le, Minh-Thang Luong, Ilya Sutskever, Oriol Vinyals, Wojciech Zaremba
  • Patent number: 10133739
Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for neural translation systems with rare word processing. One of the methods is a method of training a neural network translation system to track the source in source sentences of unknown words in target sentences, in a source language and a target language, respectively, and includes deriving alignment data from a parallel corpus, the alignment data identifying, in each pair of source and target language sentences in the parallel corpus, aligned source and target words; annotating the sentences in the parallel corpus according to the alignment data and a rare word model to generate a training dataset of paired source and target language sentences; and training a neural network translation model on the training dataset.
    Type: Grant
    Filed: October 23, 2015
    Date of Patent: November 20, 2018
    Assignee: Google LLC
    Inventors: Quoc V. Le, Minh-Thang Luong, Ilya Sutskever, Oriol Vinyals, Wojciech Zaremba
  • Publication number: 20180144264
    Abstract: A system for training a model to predict a sequence (e.g. a sequence of words) given a context is disclosed. A model can be trained to make these predictions using a combination of individual predictions compared to base truth and sequences of predictions based on previous predictions, where the resulting sequence is compared to the base truth sequence. In particular, the model can initially use the individual predictions to train the model. The model can then be further trained over the training data in multiple iterations, where each iteration includes two processes for each training element. In the first process, an initial part of the sequence is predicted, and the model and model parameters are updated after each prediction. In the second process, the entire remaining amount of the sequence is predicted and compared to the corresponding training sequence to adjust model parameters to encourage or discourage each prediction.
    Type: Application
    Filed: November 18, 2016
    Publication date: May 24, 2018
    Inventors: Marc Aurelio Ranzato, Sumit Chopra, Michael Auli, Wojciech Zaremba
  • Publication number: 20170323201
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for augmenting neural networks with an external memory using reinforcement learning. One of the methods includes providing an output derived from the system output portion of the neural network output as a system output in the sequence of system outputs; selecting a memory access process from a predetermined set of memory access processes for accessing the external memory from the reinforcement learning portion of the neural network output; writing and reading data from locations in the external memory in accordance with the selected memory access process using the differentiable portion of the neural network output; and combining the data read from the external memory with a next system input in the sequence of system inputs to generate a next neural network input in the sequence of neural network inputs.
    Type: Application
    Filed: December 30, 2016
    Publication date: November 9, 2017
    Inventors: Ilya Sutskever, Ivo Danihelka, Alexander Benjamin Graves, Gregory Duncan Wayne, Wojciech Zaremba
  • Publication number: 20160117316
Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for neural translation systems with rare word processing. One of the methods is a method of training a neural network translation system to track the source in source sentences of unknown words in target sentences, in a source language and a target language, respectively, and includes deriving alignment data from a parallel corpus, the alignment data identifying, in each pair of source and target language sentences in the parallel corpus, aligned source and target words; annotating the sentences in the parallel corpus according to the alignment data and a rare word model to generate a training dataset of paired source and target language sentences; and training a neural network translation model on the training dataset.
    Type: Application
    Filed: October 23, 2015
    Publication date: April 28, 2016
    Inventors: Quoc V. Le, Minh-Thang Luong, Ilya Sutskever, Oriol Vinyals, Wojciech Zaremba
  • Publication number: 20160098632
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for training a neural network. One of the methods includes obtaining partitioned training data for the neural network, wherein the partitioned training data comprises a plurality of training items each of which is assigned to a respective one of a plurality of partitions, wherein each partition is associated with a respective difficulty level; and training the neural network on each of the partitions in a sequence from a partition associated with an easiest difficulty level to a partition associated with a hardest difficulty level, wherein, for each of the partitions, training the neural network comprises: training the neural network on a sequence of training items that includes training items selected from the training items in the partition interspersed with training items selected from the training items in all of the partitions.
    Type: Application
    Filed: October 7, 2015
    Publication date: April 7, 2016
    Inventors: Ilya Sutskever, Wojciech Zaremba
  • Patent number: 8411854
Abstract: A method of generating a private key for use in an authentication protocol comprises, at a client: receiving a user specific identifier; converting the identifier through a one-way function to a string of a pre-determined length; and mapping said string to a permutation πpriv of a pre-determined order, said permutation being operable with a first graph G1 to generate a second graph G2=πpriv(G1).
    Type: Grant
    Filed: December 10, 2008
    Date of Patent: April 2, 2013
    Assignee: National University of Ireland, Galway
    Inventors: Slawomir Grzonkowski, Wojciech Zaremba
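The key-derivation pipeline in this abstract (identifier → one-way function → permutation → permuted graph) can be sketched with standard-library primitives. This is an illustrative sketch only: SHA-256 stands in for the one-way function, and the digest-driven shuffle is a simplified, non-uniform stand-in for the patented string-to-permutation mapping.

```python
import hashlib

def derive_permutation(identifier, n):
    """Hash the user-specific identifier, then map the digest bytes to a
    permutation of {0..n-1} via a digest-driven Fisher-Yates shuffle."""
    digest = hashlib.sha256(identifier.encode()).digest()
    perm = list(range(n))
    for i in range(n - 1, 0, -1):
        j = digest[i % len(digest)] % (i + 1)  # swap index from the digest
        perm[i], perm[j] = perm[j], perm[i]
    return perm

def apply_permutation(perm, edges):
    """G2 = perm(G1): relabel every vertex of the edge set."""
    return {(perm[u], perm[v]) for (u, v) in edges}

perm = derive_permutation("alice@example.com", 4)   # private permutation
g1 = {(0, 1), (1, 2), (2, 3)}                       # public first graph G1
g2 = apply_permutation(perm, g1)                    # second graph G2
```

The same identifier always yields the same permutation, so the client can re-derive its private key on demand instead of storing it.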
  • Publication number: 20100290618
Abstract: A method of generating a private key for use in an authentication protocol comprises, at a client: receiving a user specific identifier; converting the identifier through a one-way function to a string of a pre-determined length; and mapping said string to a permutation πpriv of a pre-determined order, said permutation being operable with a first graph G1 to generate a second graph G2=πpriv(G1).
    Type: Application
    Filed: December 10, 2008
    Publication date: November 18, 2010
Applicant: National University of Ireland, Galway
Inventors: Slawomir Grzonkowski, Wojciech Zaremba