Patents by Inventor Xilai LI

Xilai LI has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11645509
    Abstract: Embodiments for training a neural network using sequential tasks are provided. A plurality of sequential tasks are received. For each task in the plurality of tasks, a copy of the neural network that includes a plurality of layers is generated. From the copy of the neural network, a task-specific neural network is generated by performing an architectural search on the plurality of layers in the copy of the neural network. The architectural search identifies a plurality of candidate choices in the layers of the task-specific neural network. Parameters in the task-specific neural network that correspond to the plurality of candidate choices and that maximize architectural weights at each layer are identified. The parameters are retrained and merged with the neural network. The neural network trained on the plurality of sequential tasks is a trained neural network.
    Type: Grant
    Filed: October 31, 2018
    Date of Patent: May 9, 2023
    Assignee: Salesforce.com, Inc.
    Inventors: Yingbo Zhou, Xilai Li, Caiming Xiong
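    The procedure in the abstract above (copy the network per task, search over candidate choices per layer, keep the max-weight choice, merge back) can be sketched as follows. This is a minimal illustrative sketch, not the patented implementation: the choice labels ("reuse" / "adapt" / "new"), the dictionary-based network representation, and the toy `toy_search` weighting are all assumptions made for the example.

    ```python
    def select_architecture(arch_weights):
        """For each layer, keep the candidate choice with the maximum
        architectural weight, as the abstract describes."""
        return [max(layer.items(), key=lambda kv: kv[1])[0] for layer in arch_weights]

    def grow_network(network, tasks, search_fn):
        """Train on sequential tasks: copy the network, run an architectural
        search on the copy, pick the max-weight choice per layer, and merge
        the resulting task-specific decision back into the shared network."""
        for task in tasks:
            copy = [dict(layer) for layer in network]    # copy of the neural network
            arch_weights = search_fn(copy, task)         # architectural search per layer
            choices = select_architecture(arch_weights)  # maximize architectural weights
            for layer, choice in zip(network, choices):  # merge with the neural network
                layer.setdefault("choices", []).append(choice)
        return network

    def toy_search(copy, task):
        """Hypothetical stand-in for the architectural search: weights 'new'
        highest on the first task and 'reuse' highest on later tasks."""
        later = 0.5 if task > 0 else 0.0
        return [{"reuse": 0.2 + later, "adapt": 0.3, "new": 0.6 - later}
                for _ in copy]
    ```

    Running `grow_network([{}, {}], [0, 1], toy_search)` records a "new" choice for each layer on the first task and a "reuse" choice on the second, mirroring how per-layer decisions accumulate across sequential tasks.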
  • Publication number: 20220004709
    Abstract: The exemplified methods and systems provide a deep neural network configured with a deep compositional grammatical architecture (e.g., to facilitate end-to-end representation learning). The instant deep compositional grammatical architecture beneficially integrates the compositionality and reconfigurability of grammar models with the capability of learning rich features of deep neural networks in a principled way (e.g., for a convolutional neural network or a recurrent neural network). The instant deep compositional grammatical architecture utilizes AND-OR grammars to form an AND-OR grammar network.
    Type: Application
    Filed: November 14, 2019
    Publication date: January 6, 2022
    Inventors: Tianfu WU, Xilai LI
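    The AND-OR grammar idea in the abstract above can be sketched with a toy bottom-up evaluation: AND-nodes compose all of their children, while OR-nodes select among alternative decompositions. This is an illustrative sketch only; the class names and the specific composition (sum) and selection (max) operations are assumptions, not the patent's network operators, which compose learned feature maps rather than scalars.

    ```python
    class Terminal:
        """Leaf of the grammar: holds a fixed value (stand-in for an input feature)."""
        def __init__(self, value):
            self.value = value
        def evaluate(self):
            return self.value

    class AndNode:
        """AND-node: composes all children (here, by summation)."""
        def __init__(self, children):
            self.children = children
        def evaluate(self):
            return sum(c.evaluate() for c in self.children)

    class OrNode:
        """OR-node: selects among alternative decompositions (here, by max)."""
        def __init__(self, children):
            self.children = children
        def evaluate(self):
            return max(c.evaluate() for c in self.children)
    ```

    For example, `OrNode([AndNode([Terminal(1), Terminal(2)]), Terminal(5)])` evaluates the AND-branch to 3 and the terminal branch to 5, then selects 5, showing how reconfigurable alternatives and compositions interleave in one network.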
  • Publication number: 20200104699
    Abstract: Embodiments for training a neural network using sequential tasks are provided. A plurality of sequential tasks are received. For each task in the plurality of tasks, a copy of the neural network that includes a plurality of layers is generated. From the copy of the neural network, a task-specific neural network is generated by performing an architectural search on the plurality of layers in the copy of the neural network. The architectural search identifies a plurality of candidate choices in the layers of the task-specific neural network. Parameters in the task-specific neural network that correspond to the plurality of candidate choices and that maximize architectural weights at each layer are identified. The parameters are retrained and merged with the neural network. The neural network trained on the plurality of sequential tasks is a trained neural network.
    Type: Application
    Filed: October 31, 2018
    Publication date: April 2, 2020
    Inventors: Yingbo ZHOU, Xilai LI, Caiming XIONG