Patents by Inventor Colin Abraham Raffel

Colin Abraham Raffel has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11354574
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for increasing the security of a neural network by discretizing neural network inputs. One of the methods includes receiving a network input for a neural network; processing the network input using a discretization layer, wherein the discretization layer is configured to generate a discretized network input comprising a respective discretized vector for each of the numeric values in the network input; and processing the discretized network input using a plurality of additional neural network layers to generate a network output for the network input.
    Type: Grant
    Filed: April 27, 2020
    Date of Patent: June 7, 2022
    Assignee: Google LLC
    Inventors: Aurko Roy, Ian Goodfellow, Jacob Buckman, Colin Abraham Raffel
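    The discretization layer this abstract describes (it also appears in publication 20200257978 below) can be sketched roughly as follows. The bucket count, value range, and plain one-hot bucket encoding are illustrative assumptions, not details taken from the patent claims:

    ```python
    import numpy as np

    def discretize(x, num_levels=10, low=0.0, high=1.0):
        """Hypothetical sketch of a discretization layer.

        Each scalar in `x` is replaced by a vector marking which of
        `num_levels` evenly spaced buckets it falls into; an attacker's
        small input perturbation then often maps to the same discrete
        vector, which is the intuition behind the security claim.
        """
        x = np.asarray(x, dtype=np.float64)
        # Interior bucket edges, e.g. [0.1, 0.2, ..., 0.9] for 10 levels.
        edges = np.linspace(low, high, num_levels + 1)[1:-1]
        idx = np.digitize(np.clip(x, low, high), edges)
        # One-hot encode each value: output shape (*x.shape, num_levels).
        out = np.zeros(x.shape + (num_levels,))
        np.put_along_axis(out, idx[..., None], 1.0, axis=-1)
        return out
    ```

    The discretized vectors would then be fed to the remaining network layers in place of the raw scalars.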
  • Publication number: 20220083743
    Abstract: A method includes receiving a sequence of audio features characterizing an utterance and processing, using an encoder neural network, the sequence of audio features to generate a sequence of encodings. At each of a plurality of output steps, the method also includes determining a corresponding hard monotonic attention output to select an encoding from the sequence of encodings, identifying a proper subset of the sequence of encodings based on a position of the selected encoding in the sequence of encodings, and performing soft attention over the proper subset of the sequence of encodings to generate a context vector at the corresponding output step. The method also includes processing, using a decoder neural network, the context vector generated at the corresponding output step to predict a probability distribution over possible output labels at the corresponding output step.
    Type: Application
    Filed: November 30, 2021
    Publication date: March 17, 2022
    Applicant: Google LLC
    Inventors: Chung-Cheng Chiu, Colin Abraham Raffel
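    The two-stage procedure in this abstract, a hard monotonic selection followed by soft attention over a window ending at the selected encoding, can be sketched as below (patent 11210475 further down describes the same mechanism). The sigmoid selection score, the 0.5 inference-time threshold, the fixed chunk size, and the dot-product energies are all illustrative assumptions:

    ```python
    import numpy as np

    def softmax(x):
        e = np.exp(x - x.max())
        return e / e.sum()

    def chunkwise_attention(encodings, query, prev_pos, chunk_size=3, threshold=0.5):
        """Hypothetical sketch of hard-monotonic + chunkwise soft attention.

        Scan forward from the previously selected position and "select"
        the first encoding whose sigmoid score clears the threshold, then
        soft-attend over the `chunk_size` encodings ending there to form
        the context vector for this output step.
        """
        T = len(encodings)
        pos = None
        for t in range(prev_pos, T):
            score = 1.0 / (1.0 + np.exp(-encodings[t] @ query))  # sigmoid energy
            if score >= threshold:
                pos = t
                break
        if pos is None:
            pos = T - 1  # nothing selected: fall back to the final encoding
        start = max(0, pos - chunk_size + 1)
        window = encodings[start:pos + 1]
        weights = softmax(window @ query)  # soft attention within the chunk
        context = weights @ window
        return context, pos
    ```

    Because `prev_pos` only ever moves forward, the attention head traverses the encodings monotonically, which is what makes this usable for streaming decoding.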
  • Patent number: 11210475
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for enhanced attention mechanisms. In some implementations, data indicating an input sequence is received. The data is processed using an encoder neural network to generate a sequence of encodings. A series of attention outputs is determined using one or more attender modules. Determining each attention output can include (i) selecting an encoding from the sequence of encodings and (ii) determining attention over a proper subset of the sequence of encodings, where the proper subset of encodings is determined based on a position of the selected encoding in the sequence of encodings. The selections of encodings are also monotonic through the sequence of encodings. An output sequence is generated by processing the attention outputs using a decoder neural network. An output is provided that indicates a language sequence determined from the output sequence.
    Type: Grant
    Filed: July 22, 2019
    Date of Patent: December 28, 2021
    Assignee: Google LLC
    Inventors: Chung-Cheng Chiu, Colin Abraham Raffel
  • Patent number: 11080589
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for generating a target sequence including a respective output at each of multiple output time steps from respective encoded representations of inputs in an input sequence. The method includes, for each output time step, starting from the position, in the input order, of the encoded representation that was selected as a preceding context vector at a preceding output time step, traversing the encoded representations until an encoded representation is selected as a current context vector at the output time step. A decoder neural network processes the current context vector and a preceding output at the preceding output time step to generate a respective output score for each possible output and to update the hidden state of the decoder recurrent neural network. An output is selected for the output time step using the output scores.
    Type: Grant
    Filed: July 8, 2019
    Date of Patent: August 3, 2021
    Assignee: Google LLC
    Inventors: Ron J. Weiss, Thang Minh Luong, Peter J. Liu, Colin Abraham Raffel, Douglas Eck
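    The traversal this abstract describes, resuming at each output step from the position selected at the preceding step, can be sketched as follows (publication 20190332919 below carries the same abstract). Here `select_prob` is a hypothetical stand-in for the decoder-state-dependent selection probability, and the 0.5 hard-decision threshold at inference time is an assumption:

    ```python
    def monotonic_attention_steps(encodings, select_prob, num_outputs):
        """Hypothetical sketch of monotonic context-vector selection.

        At each output step, start from the position selected at the
        previous step and traverse forward until an encoding is selected
        as the current context vector. Selections therefore move
        monotonically left to right, so decoding is linear-time overall.
        """
        positions = []
        pos = 0
        for i in range(num_outputs):
            t = pos
            # Treat select_prob >= 0.5 as a hard "select" decision; if no
            # encoding is selected, fall back to the final encoding.
            while t < len(encodings) - 1 and select_prob(t, i) < 0.5:
                t += 1
            positions.append(t)
            pos = t  # the next output step resumes the scan here
        return [encodings[p] for p in positions], positions
    ```

    A decoder network would then consume each selected context vector together with the preceding output to score the possible outputs at that step.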
  • Publication number: 20200257978
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for increasing the security of a neural network by discretizing neural network inputs. One of the methods includes receiving a network input for a neural network; processing the network input using a discretization layer, wherein the discretization layer is configured to generate a discretized network input comprising a respective discretized vector for each of the numeric values in the network input; and processing the discretized network input using a plurality of additional neural network layers to generate a network output for the network input.
    Type: Application
    Filed: April 27, 2020
    Publication date: August 13, 2020
    Inventors: Aurko Roy, Ian Goodfellow, Jacob Buckman, Colin Abraham Raffel
  • Publication number: 20200026760
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for enhanced attention mechanisms. In some implementations, data indicating an input sequence is received. The data is processed using an encoder neural network to generate a sequence of encodings. A series of attention outputs is determined using one or more attender modules. Determining each attention output can include (i) selecting an encoding from the sequence of encodings and (ii) determining attention over a proper subset of the sequence of encodings, where the proper subset of encodings is determined based on a position of the selected encoding in the sequence of encodings. The selections of encodings are also monotonic through the sequence of encodings. An output sequence is generated by processing the attention outputs using a decoder neural network. An output is provided that indicates a language sequence determined from the output sequence.
    Type: Application
    Filed: July 22, 2019
    Publication date: January 23, 2020
    Inventors: Chung-Cheng Chiu, Colin Abraham Raffel
  • Publication number: 20190332919
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for generating a target sequence including a respective output at each of multiple output time steps from respective encoded representations of inputs in an input sequence. The method includes, for each output time step, starting from the position, in the input order, of the encoded representation that was selected as a preceding context vector at a preceding output time step, traversing the encoded representations until an encoded representation is selected as a current context vector at the output time step. A decoder neural network processes the current context vector and a preceding output at the preceding output time step to generate a respective output score for each possible output and to update the hidden state of the decoder recurrent neural network. An output is selected for the output time step using the output scores.
    Type: Application
    Filed: July 8, 2019
    Publication date: October 31, 2019
    Inventors: Ron J. Weiss, Thang Minh Luong, Peter J. Liu, Colin Abraham Raffel, Douglas Eck