Patents by Inventor Peter J. Liu

Peter J. Liu is a named inventor on the following patent filings. This listing includes both pending patent applications and patents already granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240185065
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for training a text summarization neural network.
    Type: Application
    Filed: October 12, 2023
    Publication date: June 6, 2024
    Inventors: Mohammad Saleh, Jingqing Zhang, Yao Zhao, Peter J. Liu
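
The abstract above is truncated to a single sentence in this listing, so the sketch below is only a generic illustration of training a text summarization network: one teacher-forced, cross-entropy gradient step for a sequence-to-sequence model. It is not the patented method; the model interface and argument names are assumptions.

```python
import torch
import torch.nn.functional as F

def train_step(model, optimizer, article_ids, summary_ids):
    """One gradient step of supervised summarization training.

    Assumes model(article_ids, decoder_input) returns logits of shape
    [batch, target_len, vocab] for predicting the next summary token.
    """
    # Teacher forcing: feed the gold summary shifted right, predict it shifted left.
    logits = model(article_ids, summary_ids[:, :-1])
    loss = F.cross_entropy(
        logits.reshape(-1, logits.size(-1)),  # flatten batch and time dims
        summary_ids[:, 1:].reshape(-1),       # next-token targets
    )
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return float(loss)
```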
  • Patent number: 11886998
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for generating an output sequence from an input sequence. One of the methods includes, at each of a plurality of generation time steps: generating a combined sequence for the generation time step that includes the input sequence followed by the output tokens that have already been generated as of the generation time step; processing the combined sequence using a self-attention decoder neural network to generate a time step output that defines a score distribution over a set of possible output tokens; and selecting, using the time step output, an output token from the set of possible output tokens as the next output token in the output sequence.
    Type: Grant
    Filed: January 13, 2023
    Date of Patent: January 30, 2024
    Assignee: Google LLC
    Inventors: Noam M. Shazeer, Lukasz Mieczyslaw Kaiser, Etienne Pot, Mohammad Saleh, Ben Goodrich, Peter J. Liu, Ryan Sepassi
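
The abstract of this grant describes a concrete generation loop: concatenate the input sequence with the tokens generated so far, run the combined sequence through a self-attention decoder, and pick the next token from the resulting score distribution. Below is a minimal greedy-decoding sketch of that loop; the `decoder` callable and its interface are assumptions, and greedy argmax stands in for whatever selection procedure (sampling, beam search) the claims may cover.

```python
import torch

@torch.no_grad()
def generate(decoder, input_ids, eos_id, max_len=128):
    """Greedy decoding with a decoder-only self-attention model.

    Assumes decoder(combined) returns logits of shape [len(combined), vocab]
    for a 1-D tensor of token ids.
    """
    output_ids = []
    for _ in range(max_len):
        # Combined sequence: the input followed by the tokens generated so far.
        combined = torch.cat(
            [input_ids, torch.tensor(output_ids, dtype=torch.long)]
        )
        logits = decoder(combined)
        # The final position's logits define the score distribution
        # over possible next tokens.
        next_token = int(logits[-1].argmax())
        output_ids.append(next_token)
        if next_token == eos_id:
            break
    return output_ids
```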
  • Patent number: 11803751
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for training a text summarization neural network.
    Type: Grant
    Filed: January 4, 2021
    Date of Patent: October 31, 2023
    Assignee: Google LLC
    Inventors: Mohammad Saleh, Jingqing Zhang, Yao Zhao, Peter J. Liu
  • Publication number: 20230153613
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for generating an output sequence from an input sequence. One of the methods includes, at each of a plurality of generation time steps: generating a combined sequence for the generation time step that includes the input sequence followed by the output tokens that have already been generated as of the generation time step; processing the combined sequence using a self-attention decoder neural network to generate a time step output that defines a score distribution over a set of possible output tokens; and selecting, using the time step output, an output token from the set of possible output tokens as the next output token in the output sequence.
    Type: Application
    Filed: January 13, 2023
    Publication date: May 18, 2023
    Inventors: Noam M. Shazeer, Lukasz Mieczyslaw Kaiser, Etienne Pot, Mohammad Saleh, Ben Goodrich, Peter J. Liu, Ryan Sepassi
  • Patent number: 11556786
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for generating an output sequence from an input sequence. One of the methods includes, at each of a plurality of generation time steps: generating a combined sequence for the generation time step that includes the input sequence followed by the output tokens that have already been generated as of the generation time step; processing the combined sequence using a self-attention decoder neural network to generate a time step output that defines a score distribution over a set of possible output tokens; and selecting, using the time step output, an output token from the set of possible output tokens as the next output token in the output sequence.
    Type: Grant
    Filed: October 29, 2018
    Date of Patent: January 17, 2023
    Assignee: Google LLC
    Inventors: Noam M. Shazeer, Lukasz Mieczyslaw Kaiser, Etienne Pot, Mohammad Saleh, Ben David Goodrich, Peter J. Liu, Ryan Sepassi
  • Publication number: 20210350229
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for training a text summarization neural network.
    Type: Application
    Filed: January 4, 2021
    Publication date: November 11, 2021
    Inventors: Mohammad Saleh, Jingqing Zhang, Yao Zhao, Peter J. Liu
  • Patent number: 11080589
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for generating a target sequence including a respective output at each of multiple output time steps from respective encoded representations of inputs in an input sequence. The method includes, for each output time step, starting from the position, in the input order, of the encoded representation that was selected as a preceding context vector at a preceding output time step, traversing the encoded representations until an encoded representation is selected as a current context vector at the output time step. A decoder neural network processes the current context vector and a preceding output at the preceding output time step to generate a respective output score for each possible output and to update the hidden state of the decoder recurrent neural network. An output is selected for the output time step using the output scores.
    Type: Grant
    Filed: July 8, 2019
    Date of Patent: August 3, 2021
    Assignee: Google LLC
    Inventors: Ron J. Weiss, Thang Minh Luong, Peter J. Liu, Colin Abraham Raffel, Douglas Eck
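
The abstract of this grant describes a hard monotonic attention scheme: at each output step the model scans the encoder states forward from the previously attended position, stops at the first state it selects as the current context vector, and feeds that context plus the preceding output to a recurrent decoder cell. The sketch below follows that structure; the sigmoid "select" gate and all module names are assumptions, since the abstract does not specify how a state is selected.

```python
import torch

def monotonic_decode(encoded, stop_gate, cell, out_proj, embed,
                     max_steps, bos_id=0):
    """Greedy decoding with hard monotonic attention (illustrative only).

    encoded:   [T, d] encoder states, in input order
    stop_gate: nn.Linear mapping [hidden + d] -> 1 selection logit
    cell:      nn.GRUCell with input size d + emb and hidden size h
    out_proj:  nn.Linear mapping hidden -> vocabulary scores
    embed:     nn.Embedding for output tokens
    """
    hidden = torch.zeros(1, cell.hidden_size)
    prev = embed(torch.tensor([bos_id]))
    pos = 0  # position of the previously selected context vector
    outputs = []
    for _ in range(max_steps):
        # Traverse the encoder states in input order, starting from the
        # previous position, until one is selected as the current context.
        for j in range(pos, encoded.size(0)):
            logit = stop_gate(torch.cat([hidden[0], encoded[j]]))
            if torch.sigmoid(logit) > 0.5 or j == encoded.size(0) - 1:
                pos = j
                break
        context = encoded[pos]
        # The decoder RNN cell processes the context vector and the
        # preceding output, updating its hidden state.
        hidden = cell(torch.cat([context, prev[0]]).unsqueeze(0), hidden)
        scores = out_proj(hidden)           # scores over possible outputs
        token = int(scores.argmax(dim=-1))  # select the output for this step
        outputs.append(token)
        prev = embed(torch.tensor([token]))
    return outputs
```

Because the scan never moves backward from `pos`, attention advances monotonically through the input, which is what allows this style of decoder to run online as encoder states arrive.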
  • Patent number: 10885436
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for training a text summarization neural network.
    Type: Grant
    Filed: May 7, 2020
    Date of Patent: January 5, 2021
    Assignee: Google LLC
    Inventors: Mohammad Saleh, Jingqing Zhang, Yao Zhao, Peter J. Liu
  • Publication number: 20200342316
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for generating an output sequence from an input sequence. One of the methods includes, at each of a plurality of generation time steps: generating a combined sequence for the generation time step that includes the input sequence followed by the output tokens that have already been generated as of the generation time step; processing the combined sequence using a self-attention decoder neural network to generate a time step output that defines a score distribution over a set of possible output tokens; and selecting, using the time step output, an output token from the set of possible output tokens as the next output token in the output sequence.
    Type: Application
    Filed: October 29, 2018
    Publication date: October 29, 2020
    Inventors: Noam M. Shazeer, Lukasz Mieczyslaw Kaiser, Etienne Pot, Mohammad Saleh, Ben David Goodrich, Peter J. Liu, Ryan Sepassi
  • Publication number: 20190332919
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for generating a target sequence including a respective output at each of multiple output time steps from respective encoded representations of inputs in an input sequence. The method includes, for each output time step, starting from the position, in the input order, of the encoded representation that was selected as a preceding context vector at a preceding output time step, traversing the encoded representations until an encoded representation is selected as a current context vector at the output time step. A decoder neural network processes the current context vector and a preceding output at the preceding output time step to generate a respective output score for each possible output and to update the hidden state of the decoder recurrent neural network. An output is selected for the output time step using the output scores.
    Type: Application
    Filed: July 8, 2019
    Publication date: October 31, 2019
    Inventors: Ron J. Weiss, Thang Minh Luong, Peter J. Liu, Colin Abraham Raffel, Douglas Eck