Patents by Inventor Adam Procter

Adam Procter is a named inventor on the patent filings listed below. The listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO). Illustrative code sketches of the two techniques covered by these filings follow the listing.

  • Patent number: 11966843
    Abstract: Methods, apparatus, systems and articles of manufacture for distributed training of a neural network are disclosed. An example apparatus includes a neural network trainer to select a plurality of training data items from a training data set based on a toggle rate of each item in the training data set. A neural network parameter memory is to store neural network training parameters. A neural network processor is to generate training data results from distributed training over multiple nodes of the neural network using the selected training data items and the neural network training parameters. The neural network trainer is to synchronize the training data results and to update the neural network training parameters.
    Type: Grant
    Filed: June 13, 2022
    Date of Patent: April 23, 2024
    Assignee: Intel Corporation
    Inventors: Meenakshi Arunachalam, Arun Tejusve Raghunath Rajan, Deepthi Karkada, Adam Procter, Vikram Saletore
  • Publication number: 20220309349
    Abstract: Methods, apparatus, systems and articles of manufacture for distributed training of a neural network are disclosed. An example apparatus includes a neural network trainer to select a plurality of training data items from a training data set based on a toggle rate of each item in the training data set. A neural network parameter memory is to store neural network training parameters. A neural network processor is to generate training data results from distributed training over multiple nodes of the neural network using the selected training data items and the neural network training parameters. The neural network trainer is to synchronize the training data results and to update the neural network training parameters.
    Type: Application
    Filed: June 13, 2022
    Publication date: September 29, 2022
    Inventors: Meenakshi Arunachalam, Arun Tejusve Raghunath Rajan, Deepthi Karkada, Adam Procter, Vikram Saletore
  • Patent number: 10922610
    Abstract: Systems, apparatuses and methods may provide for technology that conducts a first timing measurement of a blockage timing of a first window of the training of the neural network. The blockage timing measures a time that processing is impeded at layers of the neural network during the first window of the training due to synchronization of one or more synchronizing parameters of the layers. Based upon the first timing measurement, the technology is to determine whether to modify a synchronization barrier policy to include a synchronization barrier to impede synchronization of one or more synchronizing parameters of one of the layers during a second window of the training. The technology is further to impede the synchronization of the one or more synchronizing parameters of the one of the layers during the second window if the synchronization barrier policy is modified to include the synchronization barrier.
    Type: Grant
    Filed: September 14, 2017
    Date of Patent: February 16, 2021
    Assignee: Intel Corporation
    Inventors: Adam Procter, Vikram Saletore, Deepthi Karkada, Meenakshi Arunachalam
  • Publication number: 20190080233
    Abstract: Systems, apparatuses and methods may provide for technology that conducts a first timing measurement of a blockage timing of a first window of the training of the neural network. The blockage timing measures a time that processing is impeded at layers of the neural network during the first window of the training due to synchronization of one or more synchronizing parameters of the layers. Based upon the first timing measurement, the technology is to determine whether to modify a synchronization barrier policy to include a synchronization barrier to impede synchronization of one or more synchronizing parameters of one of the layers during a second window of the training. The technology is further to impede the synchronization of the one or more synchronizing parameters of the one of the layers during the second window if the synchronization barrier policy is modified to include the synchronization barrier.
    Type: Application
    Filed: September 14, 2017
    Publication date: March 14, 2019
    Inventors: Adam Procter, Vikram Saletore, Deepthi Karkada, Meenakshi Arunachalam
  • Publication number: 20190042934
    Abstract: Methods, apparatus, systems and articles of manufacture for distributed training of a neural network are disclosed. An example apparatus includes a neural network trainer to select a plurality of training data items from a training data set based on a toggle rate of each item in the training data set. A neural network parameter memory is to store neural network training parameters. A neural network processor is to generate training data results from distributed training over multiple nodes of the neural network using the selected training data items and the neural network training parameters. The neural network trainer is to synchronize the training data results and to update the neural network training parameters.
    Type: Application
    Filed: December 1, 2017
    Publication date: February 7, 2019
    Inventors: Meenakshi Arunachalam, Arun Tejusve Raghunath Rajan, Deepthi Karkada, Adam Procter, Vikram Saletore
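
The first technique, covered by patent 11966843 and the related publications 20220309349 and 20190042934, selects training data items by their toggle rate before distributing training across nodes. The sketch below is a minimal illustration under stated assumptions, not the patented implementation: the toggle-rate formula (fraction of bit flips across an item's byte representation), the keep-the-lowest-rates selection rule, the toy linear-regression gradient, and gradient averaging as the synchronization step are all assumptions made for the example.

```python
import numpy as np

def toggle_rate(item: np.ndarray) -> float:
    """Fraction of bits that flip between consecutive bytes of the item.

    Hypothetical stand-in for the per-item toggle-rate metric named in the
    abstract; the listing does not give the actual formula.
    """
    raw = np.frombuffer(item.astype(np.float32).tobytes(), dtype=np.uint8)
    flips = np.unpackbits(raw[:-1] ^ raw[1:]).sum()
    return flips / (8 * (raw.size - 1))

def select_training_items(dataset, keep_fraction=0.5):
    """Keep the items with the lowest toggle rates (selection rule is assumed)."""
    scored = sorted(dataset, key=toggle_rate)
    return scored[: max(1, int(len(scored) * keep_fraction))]

def local_gradient(params: np.ndarray, item: np.ndarray) -> np.ndarray:
    """Toy per-node gradient for a linear least-squares objective."""
    x, y = item[:-1], item[-1]
    return (params @ x - y) * x

def distributed_step(params, dataset, num_nodes=4, lr=0.01):
    """Select items by toggle rate, shard them across nodes, then synchronize
    the per-node results by averaging gradients (an all-reduce stand-in)."""
    selected = select_training_items(dataset)
    shards = [selected[i::num_nodes] for i in range(num_nodes)]
    node_grads = [
        np.mean([local_gradient(params, it) for it in shard], axis=0)
        for shard in shards if shard
    ]
    synced = np.mean(node_grads, axis=0)   # synchronize training data results
    return params - lr * synced            # update the shared training parameters

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    data = [rng.normal(size=5) for _ in range(64)]
    params = np.zeros(4)
    for _ in range(10):
        params = distributed_step(params, data)
    print(params)
```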
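
The second technique, covered by patent 10922610 and publication 20190080233, measures per-layer blockage time during one training window and decides whether to add a synchronization barrier that impedes a layer's parameter synchronization in the next window. Below is a minimal sketch of that measure-then-decide loop; the fixed time budget used as the decision rule, and "skip the layer's synchronization for the window" as the effect of a barrier, are assumptions made for the example.

```python
import time
from collections import defaultdict

class SyncBarrierPolicy:
    """Sketch of the measure-then-decide loop from the abstract: stall time
    spent on each layer's parameter synchronization is accumulated over one
    training window, and layers whose stall exceeds a budget get a barrier
    that impedes their synchronization in the next window. The budget value
    and the skip-on-barrier behavior are assumptions, not taken from the
    patent text."""

    def __init__(self, blockage_budget_s: float = 0.05):
        self.blockage_budget_s = blockage_budget_s
        self.blockage = defaultdict(float)   # per-layer stall time this window
        self.barriers = set()                # layers whose sync is impeded

    def record_sync(self, layer: str, sync_fn) -> bool:
        """Run (or impede) a layer's parameter synchronization, timing the stall."""
        if layer in self.barriers:
            return False                     # barrier in place: skip sync this window
        start = time.perf_counter()
        sync_fn()                            # e.g. an all-reduce of the layer's gradients
        self.blockage[layer] += time.perf_counter() - start
        return True

    def end_window(self):
        """Window's timing measurement done; modify the policy for the next window."""
        self.barriers = {
            layer for layer, stalled in self.blockage.items()
            if stalled > self.blockage_budget_s
        }
        self.blockage.clear()

# Usage: wrap each layer's synchronization call with record_sync() and call
# end_window() once per training window.
policy = SyncBarrierPolicy(blockage_budget_s=0.01)
for step in range(3):                        # one window of three steps
    for layer in ("conv1", "fc1", "fc2"):
        policy.record_sync(layer, lambda: time.sleep(0.005))  # fake sync stall
policy.end_window()
print("layers with a synchronization barrier next window:", policy.barriers)
```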