Patents by Inventor Jason Zhi Liang

Jason Zhi Liang has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11507844
    Abstract: The technology disclosed proposes a novel asynchronous evaluation strategy (AES) that increases the throughput of evolutionary algorithms by continuously maintaining a queue of K individuals ready to be sent to the worker nodes for evaluation and evolving the next generation once a fraction Mi of the K individuals have been evaluated by the worker nodes, where Mi << K. A suitable value for Mi is determined experimentally, balancing diversity and efficiency. The technology disclosed is extended to coevolution of deep neural network supermodules and blueprints in the form of AES for cooperative evolution of deep neural networks (CoDeepNEAT-AES). Applied to the image captioning domain, a threefold speedup is observed on 200 graphics processing unit (GPU) worker nodes, demonstrating that the disclosed AES and CoDeepNEAT-AES are promising techniques for evolving complex systems with long and variable evaluation times.
    Type: Grant
    Filed: March 7, 2018
    Date of Patent: November 22, 2022
    Assignee: Cognizant Technology Solutions U.S. Corporation
    Inventors: Jason Zhi Liang, Hormoz Shahrzad, Babak Hodjat, Risto Miikkulainen
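The queue-based scheme described in the abstract above can be sketched in a few lines of Python. Everything here is illustrative (toy genomes as floats, a toy fitness, workers simulated in-process), not the patented implementation; the point is the core loop: keep K candidates queued and evolve as soon as M << K evaluations return.

```python
import random
from collections import deque

def fitness(x):
    return -abs(x)  # toy objective: closer to zero is fitter

def evolve(parents, n_children):
    # Toy variation operator: perturb a randomly chosen parent.
    return [random.choice(parents) + random.gauss(0, 0.1) for _ in range(n_children)]

def aes(K=20, M=4, generations=10, seed=0):
    """Asynchronous evaluation strategy, simplified to one process.

    A queue of K candidates is kept full; as soon as M (<< K) evaluations
    come back, the next generation is bred from those M results instead of
    waiting for the whole batch.
    """
    random.seed(seed)
    queue = deque(random.uniform(-1, 1) for _ in range(K))
    best = float("-inf")
    for _ in range(generations):
        # Simulate M worker results arriving (real workers run in parallel).
        evaluated = [(x, fitness(x)) for x in (queue.popleft() for _ in range(M))]
        best = max(best, max(f for _, f in evaluated))
        # Breed replacements from the two fittest of the M and refill the queue.
        parents = [x for x, _ in sorted(evaluated, key=lambda p: p[1], reverse=True)[:2]]
        queue.extend(evolve(parents, M))
    return best
```

In the real system the M results arrive asynchronously from GPU worker nodes; they are popped synchronously here only to keep the sketch runnable.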
  • Patent number: 11250327
    Abstract: The technology disclosed relates to evolving deep neural network structures. A deep neural network structure includes a plurality of modules with submodules and interconnections among the modules and the submodules. In particular, the technology disclosed relates to storing candidate genomes that identify respective values for a plurality of hyperparameters of a candidate genome. The hyperparameters include global topology hyperparameters, global operational hyperparameters, local topology hyperparameters, and local operational hyperparameters. It further includes evolving the hyperparameters by training, evaluating, and procreating the candidate genomes and corresponding modules and submodules.
    Type: Grant
    Filed: October 26, 2017
    Date of Patent: February 15, 2022
    Assignee: Cognizant Technology Solutions U.S. Corporation
    Inventors: Jason Zhi Liang, Risto Miikkulainen
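The four hyperparameter categories named in the abstract above can be pictured as a genome record. The field names below are hypothetical examples chosen to illustrate each category, not taken from the patent.

```python
from dataclasses import dataclass, field
import random

@dataclass
class CandidateGenome:
    """One candidate genome; fields mirror the four categories in the abstract."""
    # Global topology hyperparameters: shape of the whole network.
    num_modules: int = 3
    # Global operational hyperparameters: how the whole network is trained.
    learning_rate: float = 1e-3
    # Local topology hyperparameters: structure within each module.
    layers_per_module: list = field(default_factory=lambda: [2, 2, 2])
    # Local operational hyperparameters: per-module settings.
    dropout_per_module: list = field(default_factory=lambda: [0.5, 0.5, 0.5])

def mutate(g: CandidateGenome, rng: random.Random) -> CandidateGenome:
    """Procreation step: perturb a couple of hyperparameters (toy example)."""
    child = CandidateGenome(
        num_modules=g.num_modules,
        learning_rate=g.learning_rate * rng.choice([0.5, 1.0, 2.0]),
        layers_per_module=list(g.layers_per_module),
        dropout_per_module=list(g.dropout_per_module),
    )
    i = rng.randrange(len(child.layers_per_module))
    child.layers_per_module[i] = max(1, child.layers_per_module[i] + rng.choice([-1, 1]))
    return child
```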
  • Patent number: 11250328
    Abstract: The technology disclosed relates to evolving a deep neural network based solution to a provided problem. In particular, it relates to providing an improved cooperative evolution technique for deep neural network structures. It includes creating blueprint structures that include a plurality of supermodule structures. The supermodule structures include a plurality of modules. The modules are neural networks. A first loop of evolution executes at the blueprint level. A second loop of evolution executes at the supermodule level. Further, multiple mini-loops of evolution execute at each of the subpopulations of the supermodules. The first loop, the second loop, and the mini-loops execute in parallel.
    Type: Grant
    Filed: October 26, 2017
    Date of Patent: February 15, 2022
    Assignee: Cognizant Technology Solutions U.S. Corporation
    Inventors: Jason Zhi Liang, Risto Miikkulainen
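The nested loops described above (a blueprint-level loop, a supermodule-level loop, and per-subpopulation mini-loops) can be sketched sequentially. In the disclosed system these loops run in parallel and the individuals are network structures; the floats and toy fitness below are stand-ins to keep the sketch runnable.

```python
import random

def coevolve(blueprints, supermodule_subpops, generations, rng):
    """Three nested evolutionary loops, run sequentially as a toy version."""
    def fitness(x):
        return -abs(x)  # toy objective: drive values toward zero

    def step(pop):
        # One generation: keep the fitter half, refill with mutated copies.
        pop = sorted(pop, key=fitness, reverse=True)
        half = pop[: len(pop) // 2]
        return half + [x + rng.gauss(0, 0.1) for x in half]

    for _ in range(generations):
        blueprints = step(blueprints)  # first loop: blueprint level
        # Second loop at the supermodule level; each subpopulation's
        # step() call plays the role of one mini-loop.
        supermodule_subpops = [step(sp) for sp in supermodule_subpops]
    return blueprints, supermodule_subpops
```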
  • Patent number: 11030529
    Abstract: Evolution and coevolution of neural networks via multitask learning is described. The foundation is (1) the original soft ordering, which uses a fixed architecture for the modules and a fixed routing (i.e. network topology) that is shared among all tasks. This architecture is then extended in two ways with CoDeepNEAT: (2) by coevolving the module architectures (CM), and (3) by coevolving both the module architectures and a single shared routing for all tasks using (CMSR). An alternative evolutionary process (4) keeps the module architecture fixed, but evolves a separate routing for each task during training (CTR). Finally, approaches (2) and (4) are combined into (5), where both modules and task routing are coevolved (CMTR).
    Type: Grant
    Filed: December 13, 2018
    Date of Patent: June 8, 2021
    Assignee: Cognizant Technology Solutions U.S. Corporation
    Inventors: Jason Zhi Liang, Elliot Meyerson, Risto Miikkulainen
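The "soft ordering" foundation (variant (1)) that the other four variants build on can be sketched as shared modules blended at each depth by softmax-normalized routing weights. This scalar sketch is illustrative only; in the CTR and CMTR variants each task would carry its own routing, whereas a single shared routing is shown here.

```python
import math

def soft_ordering_forward(x, modules, routing):
    """Soft ordering forward pass, sketched for scalar inputs.

    `modules` is a list of shared functions; `routing[d][m]` weights
    module m at depth d. All names here are illustrative.
    """
    for depth_weights in routing:
        # Softmax-normalize the routing weights at this depth.
        exps = [math.exp(w) for w in depth_weights]
        total = sum(exps)
        # Blend every shared module's output according to the routing.
        x = sum((e / total) * m(x) for e, m in zip(exps, modules))
    return x
```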
  • Patent number: 11003994
    Abstract: A system and method for evolving a deep neural network structure that solves a provided problem includes: a memory storing a candidate supermodule genome database having a pool of candidate supermodules having values for hyperparameters for identifying a plurality of neural network modules in the candidate supermodule and further storing fixed multitask neural networks; a training module that assembles and trains N enhanced fixed multitask neural networks and trains each enhanced fixed multitask neural network using training data; an evaluation module that evaluates a performance of each enhanced fixed multitask neural network using validation data; a competition module that discards supermodules in accordance with assigned fitness values and saves others in an elitist pool; an evolution module that evolves the supermodules in the elitist pool; and a solution harvesting module providing for deployment of a selected one of the enhanced fixed multitask neural networks, instantiated with supermodules selected fr
    Type: Grant
    Filed: December 7, 2018
    Date of Patent: May 11, 2021
    Assignee: Cognizant Technology Solutions U.S. Corporation
    Inventors: Jason Zhi Liang, Elliot Meyerson, Risto Miikkulainen
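The competition and evolution modules described above amount to a cull-and-refill cycle over an elitist pool. The sketch below uses floats as stand-in candidates and a caller-supplied fitness function; it is a toy version of that cycle, not the claimed system.

```python
import random

def evolve_elitist_pool(pool, evaluate, elite_frac=0.5, rng=None):
    """One competition + evolution cycle over an elitist pool.

    Competition: discard the worst candidates by fitness.
    Evolution: refill the pool by mutating the survivors.
    """
    rng = rng or random.Random(0)
    scored = sorted(pool, key=evaluate, reverse=True)
    n_elite = max(1, int(len(pool) * elite_frac))
    elite = scored[:n_elite]                            # survivors of competition
    children = [e + rng.gauss(0, 0.1) for e in elite]   # evolution step
    return (elite + children)[: len(pool)]
```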
  • Publication number: 20200143243
    Abstract: An evolutionary AutoML framework called LEAF optimizes hyperparameters, network architectures, and the size of the network. LEAF makes use of both evolutionary algorithms (EAs) and distributed computing frameworks. A multiobjective evolutionary algorithm is used to simultaneously maximize the performance and minimize the complexity of the evolved networks by calculating the Pareto front given a group of individuals that have been evaluated for multiple objectives.
    Type: Application
    Filed: November 1, 2019
    Publication date: May 7, 2020
    Applicant: Cognizant Technology Solutions U.S. Corporation
    Inventors: Jason Zhi Liang, Elliot Meyerson, Risto Miikkulainen
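The multiobjective selection step described above hinges on computing a Pareto front over (performance, complexity) pairs, maximizing the first and minimizing the second. A minimal non-dominated-set sketch (not LEAF's actual implementation):

```python
def pareto_front(points):
    """Return the non-dominated subset of (performance, complexity) pairs,
    where performance is maximized and complexity is minimized."""
    def dominates(a, b):
        # a dominates b: at least as good on both objectives, strictly
        # better on at least one.
        return (a[0] >= b[0] and a[1] <= b[1]) and (a[0] > b[0] or a[1] < b[1])
    return [p for p in points if not any(dominates(q, p) for q in points)]
```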
  • Publication number: 20190180186
    Abstract: A system and method for evolving a deep neural network structure that solves a provided problem includes: a memory storing a candidate supermodule genome database having a pool of candidate supermodules having values for hyperparameters for identifying a plurality of neural network modules in the candidate supermodule and further storing fixed multitask neural networks; a training module that assembles and trains N enhanced fixed multitask neural networks and trains each enhanced fixed multitask neural network using training data; an evaluation module that evaluates a performance of each enhanced fixed multitask neural network using validation data; a competition module that discards supermodules in accordance with assigned fitness values and saves others in an elitist pool; an evolution module that evolves the supermodules in the elitist pool; and a solution harvesting module providing for deployment of a selected one of the enhanced fixed multitask neural networks, instantiated with supermodules selected fr
    Type: Application
    Filed: December 7, 2018
    Publication date: June 13, 2019
    Applicant: Sentient Technologies (Barbados) Limited
    Inventors: Jason Zhi Liang, Elliot Meyerson, Risto Miikkulainen
  • Publication number: 20190180188
    Abstract: Evolution and coevolution of neural networks via multitask learning is described. The foundation is (1) the original soft ordering, which uses a fixed architecture for the modules and a fixed routing (i.e. network topology) that is shared among all tasks. This architecture is then extended in two ways with CoDeepNEAT: (2) by coevolving the module architectures (CM), and (3) by coevolving both the module architectures and a single shared routing for all tasks using (CMSR). An alternative evolutionary process (4) keeps the module architecture fixed, but evolves a separate routing for each task during training (CTR). Finally, approaches (2) and (4) are combined into (5), where both modules and task routing are coevolved (CMTR).
    Type: Application
    Filed: December 13, 2018
    Publication date: June 13, 2019
    Applicant: Cognizant Technology Solutions U.S. Corporation
    Inventors: Jason Zhi Liang, Elliot Meyerson, Risto Miikkulainen
  • Publication number: 20180260713
    Abstract: The technology disclosed proposes a novel asynchronous evaluation strategy (AES) that increases the throughput of evolutionary algorithms by continuously maintaining a queue of K individuals ready to be sent to the worker nodes for evaluation and evolving the next generation once a fraction Mi of the K individuals have been evaluated by the worker nodes, where Mi << K. A suitable value for Mi is determined experimentally, balancing diversity and efficiency. The technology disclosed is extended to coevolution of deep neural network supermodules and blueprints in the form of AES for cooperative evolution of deep neural networks (CoDeepNEAT-AES). Applied to the image captioning domain, a threefold speedup is observed on 200 graphics processing unit (GPU) worker nodes, demonstrating that the disclosed AES and CoDeepNEAT-AES are promising techniques for evolving complex systems with long and variable evaluation times.
    Type: Application
    Filed: March 7, 2018
    Publication date: September 13, 2018
    Applicant: Sentient Technologies (Barbados) Limited
    Inventors: Jason Zhi Liang, Hormoz Shahrzad, Babak Hodjat, Risto Miikkulainen
  • Publication number: 20180114115
    Abstract: The technology disclosed relates to evolving deep neural network structures. A deep neural network structure includes a plurality of modules with submodules and interconnections among the modules and the submodules. In particular, the technology disclosed relates to storing candidate genomes that identify respective values for a plurality of hyperparameters of a candidate genome. The hyperparameters include global topology hyperparameters, global operational hyperparameters, local topology hyperparameters, and local operational hyperparameters. It further includes evolving the hyperparameters by training, evaluating, and procreating the candidate genomes and corresponding modules and submodules.
    Type: Application
    Filed: October 26, 2017
    Publication date: April 26, 2018
    Applicant: Sentient Technologies (Barbados) Limited
    Inventors: Jason Zhi Liang, Risto Miikkulainen
  • Publication number: 20180114116
    Abstract: The technology disclosed relates to evolving a deep neural network based solution to a provided problem. In particular, it relates to providing an improved cooperative evolution technique for deep neural network structures. It includes creating blueprint structures that include a plurality of supermodule structures. The supermodule structures include a plurality of modules. The modules are neural networks. A first loop of evolution executes at the blueprint level. A second loop of evolution executes at the supermodule level. Further, multiple mini-loops of evolution execute at each of the subpopulations of the supermodules. The first loop, the second loop, and the mini-loops execute in parallel.
    Type: Application
    Filed: October 26, 2017
    Publication date: April 26, 2018
    Inventors: Jason Zhi Liang, Risto Miikkulainen