Patents by Inventor Samyam Rajbhandari

Samyam Rajbhandari has filed for patents to protect the following inventions. This listing includes both pending patent applications and patents already granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11526728
    Abstract: Systems, methods, and computer-executable instructions for determining a computation schedule for a recurrent neural network (RNN). A matrix multiplication (MM) directed acyclic graph (DAG) is received for the RNN. Valid phased computation schedules for the RNN are generated, each of which includes an ordering of MM operations. For each valid phased computation schedule, each MM operation is partitioned across processor cores based on L3-to-L2 cache data movement. The RNN is executed based on the valid phased computation schedules, and a final computation schedule is stored and used for future executions of the RNN. (A minimal code sketch of this scheduling idea appears at the end of this listing.)
    Type: Grant
    Filed: June 26, 2018
    Date of Patent: December 13, 2022
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Minjia Zhang, Samyam Rajbhandari, Wenhan Wang, Yuxiong He
  • Publication number: 20190311245
    Abstract: Systems, methods, and computer-executable instructions for determining a computation schedule for a recurrent neural network (RNN). A matrix multiplication (MM) directed acyclic graph (DAG) is received for the RNN. Valid phased computation schedules for the RNN are generated, each of which includes an ordering of MM operations. For each valid phased computation schedule, each MM operation is partitioned across processor cores based on L3-to-L2 cache data movement. The RNN is executed based on the valid phased computation schedules, and a final computation schedule is stored and used for future executions of the RNN.
    Type: Application
    Filed: June 26, 2018
    Publication date: October 10, 2019
    Inventors: Minjia Zhang, Samyam Rajbhandari, Wenhan Wang, Yuxiong He
  • Publication number: 20170193361
    Abstract: A neural network training tool selects from a plurality of parallelizing techniques and from a plurality of forward-propagation computation techniques, and performs a forward-propagation phase to train a neural network using the selected parallelizing technique and the selected forward-propagation computation technique based on one or more inputs. The training tool likewise selects from a plurality of computation techniques and from a plurality of parallelizing techniques for a backward-propagation phase, and performs that phase using the selected backward-propagation parallelizing technique and the selected backward-propagation computation technique to generate error gradients and weight deltas and to update weights associated with one or more layers of the neural network. (A minimal code sketch of per-phase technique selection appears at the end of this listing.)
    Type: Application
    Filed: December 31, 2015
    Publication date: July 6, 2017
    Inventors: Trishul A. Chilimbi, Olatunji Ruwase, Samyam Rajbhandari, Michael Carbin, Yuxiong He
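
Illustrative sketch for patent 11526728: the abstract describes generating candidate orderings of matrix multiplication (MM) operations from an MM DAG and partitioning each MM across processor cores based on L3-to-L2 cache data movement. The Python below is a minimal sketch of that idea under a deliberately simplified traffic model; the constants, the (weight_id, m, k, n) layout of mm_ops, and the weight-reuse heuristic are assumptions made here for illustration and are not taken from the patent.

    from itertools import permutations

    FLOAT_BYTES = 4   # assume 32-bit floats
    NUM_CORES = 4     # assumed core count, not taken from the patent

    def op_traffic(op, weight_resident):
        # Modeled L3->L2 bytes for one (m x k) @ (k x n) matrix multiplication whose
        # output rows are partitioned across NUM_CORES cores: each core streams its
        # row block of the input and output, and the k x n weight matrix is fetched
        # into every core's L2 unless it is still resident from the previous op.
        _, m, k, n = op
        input_and_output = FLOAT_BYTES * (m * k + m * n)
        weights = 0 if weight_resident else FLOAT_BYTES * k * n * NUM_CORES
        return input_and_output + weights

    def schedule_cost(schedule):
        # Total modeled L3-to-L2 data movement for one ordering of MM operations.
        cost, last_weight = 0, None
        for op in schedule:
            cost += op_traffic(op, weight_resident=(op[0] == last_weight))
            last_weight = op[0]
        return cost

    # Toy MM "DAG" for two RNN time steps: (weight_id, m, k, n). Ops that share a
    # weight_id reuse the same weight matrix, as successive steps of an RNN cell do.
    mm_ops = [("W_ih", 64, 256, 512), ("W_hh", 64, 512, 512),
              ("W_ih", 64, 256, 512), ("W_hh", 64, 512, 512)]

    # Enumerate candidate orderings (every permutation is treated as valid for these
    # independent toy ops) and keep the one with the least modeled traffic.
    best = min(permutations(mm_ops), key=schedule_cost)
    print("chosen schedule:", [op[0] for op in best])
    print("modeled L3-to-L2 bytes:", schedule_cost(best))

In an actual schedule, the dependences in the MM DAG would constrain which orderings are valid, and the final stored schedule would be reused on later executions of the RNN rather than recomputed each time.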
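
Illustrative sketch for publication 20170193361: the abstract describes a training tool that selects among candidate parallelizing and computation techniques separately for the forward-propagation and backward-propagation phases. The Python below shows one plausible selection mechanism, timing hypothetical candidate computation techniques on a sample workload and picking the fastest for each phase; the candidate names and the timing heuristic are assumptions, and the parallelizing-technique choice is omitted for brevity.

    import time
    import numpy as np

    def time_it(fn, *args):
        # Wall-clock time of one call to fn(*args).
        start = time.perf_counter()
        fn(*args)
        return time.perf_counter() - start

    def select_technique(candidates, *sample_args):
        # Pick the candidate that runs fastest on the sample workload.
        return min(candidates, key=lambda name: time_it(candidates[name], *sample_args))

    # Hypothetical forward-propagation computation techniques for one dense ReLU layer.
    forward_techniques = {
        "dense_matmul": lambda x, w: np.maximum(x @ w, 0.0),
        "blocked_matmul": lambda x, w: np.maximum(
            np.vstack([blk @ w for blk in np.array_split(x, 4)]), 0.0),
    }

    # Hypothetical backward-propagation computation techniques (weight gradient).
    backward_techniques = {
        "dense_grad": lambda x, grad_out: x.T @ grad_out,
        "blocked_grad": lambda x, grad_out: sum(
            xb.T @ gb for xb, gb in zip(np.array_split(x, 4), np.array_split(grad_out, 4))),
    }

    rng = np.random.default_rng(0)
    x = rng.standard_normal((256, 128))   # sample inputs
    w = rng.standard_normal((128, 64))    # weights of one layer

    # Forward phase: select a computation technique, then compute activations.
    fwd_name = select_technique(forward_techniques, x, w)
    activations = forward_techniques[fwd_name](x, w)

    # Backward phase: select independently, then compute error gradients and weight deltas.
    grad_out = rng.standard_normal(activations.shape) * (activations > 0)  # upstream gradient masked by the ReLU derivative
    bwd_name = select_technique(backward_techniques, x, grad_out)
    weight_delta = backward_techniques[bwd_name](x, grad_out)

    w -= 0.01 * weight_delta  # update the layer's weights with the computed delta
    print("forward technique:", fwd_name, "backward technique:", bwd_name)

A production system would presumably profile once and cache the chosen techniques, and would make the analogous choice among parallelizing techniques (for example, data-parallel versus model-parallel execution) for each phase.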