Patents by Inventor Antonios Kornilios Kourtis

Antonios Kornilios Kourtis has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11886960
Abstract: Parallel training of a machine learning model on a computerized system may be provided. Computing tasks can be assigned to multiple workers of a system. A method may include accessing training data. A parallel training of the machine learning model can be started based on the accessed training data, so as for the training to be distributed through a first number K of workers, K>1. Responsive to detecting a change in a temporal evolution of a quantity indicative of a convergence rate of the parallel training (e.g., where said change reflects a deterioration of the convergence rate), the parallel training of the machine learning model is scaled-in, so as for the parallel training to be subsequently distributed through a second number K′ of workers, where K>K′≥1. Related computerized systems and computer program products may be provided.
    Type: Grant
    Filed: May 7, 2019
    Date of Patent: January 30, 2024
    Assignee: International Business Machines Corporation
    Inventors: Michael Kaufmann, Thomas Parnell, Antonios Kornilios Kourtis
  • Patent number: 11562270
Abstract: Embodiments of the present invention provide computer-implemented methods, computer program products and systems. Embodiments of the present invention can run preemptable tasks distributed according to a distributed environment, wherein each task of a plurality of preemptable tasks has been assigned two or more of the training data samples to process during each iteration. Embodiments of the present invention can, upon verifying that a preemption condition for each iteration is satisfied: preempt any task of the preemptable tasks that has started processing the training data samples assigned to it, and update the cognitive model based on outputs obtained from completed tasks, including both the preempted tasks and the tasks that have finished processing all training data samples assigned to them.
    Type: Grant
    Filed: April 2, 2020
    Date of Patent: January 24, 2023
    Assignee: International Business Machines Corporation
    Inventors: Michael Kaufmann, Thomas Parnell, Antonios Kornilios Kourtis, Celestine Mendler-Duenner
  • Publication number: 20210312241
Abstract: Embodiments of the present invention provide computer-implemented methods, computer program products and systems. Embodiments of the present invention can run preemptable tasks distributed according to a distributed environment, wherein each task of a plurality of preemptable tasks has been assigned two or more of the training data samples to process during each iteration. Embodiments of the present invention can, upon verifying that a preemption condition for each iteration is satisfied: preempt any task of the preemptable tasks that has started processing the training data samples assigned to it, and update the cognitive model based on outputs obtained from completed tasks, including both the preempted tasks and the tasks that have finished processing all training data samples assigned to them.
    Type: Application
    Filed: April 2, 2020
    Publication date: October 7, 2021
    Inventors: Michael Kaufmann, Thomas Parnell, Antonios Kornilios Kourtis, Celestine Mendler-Duenner
  • Publication number: 20200356893
Abstract: Parallel training of a machine learning model on a computerized system may be provided. Computing tasks can be assigned to multiple workers of a system. A method may include accessing training data. A parallel training of the machine learning model can be started based on the accessed training data, so as for the training to be distributed through a first number K of workers, K>1. Responsive to detecting a change in a temporal evolution of a quantity indicative of a convergence rate of the parallel training (e.g., where said change reflects a deterioration of the convergence rate), the parallel training of the machine learning model is scaled-in, so as for the parallel training to be subsequently distributed through a second number K′ of workers, where K>K′≥1. Related computerized systems and computer program products may be provided.
    Type: Application
    Filed: May 7, 2019
    Publication date: November 12, 2020
    Inventors: Michael Kaufmann, Thomas Parnell, Antonios Kornilios Kourtis
  • Patent number: 10733114
    Abstract: Performance of a data cache is controlled; the cache implements a garbage collection process for maintaining free storage blocks in a data store of the cache and an eviction policy for selecting data to be evicted from the cache. A cache performance control method defines a performance target for operation of the cache and, in operation of the cache, monitors performance of the cache in relation to the performance target. The garbage collection process is selectively performed in a relocation mode and an eviction mode so as to promote compliance with the performance target. In the relocation mode, data contained in a set of storage blocks selected for garbage collection is relocated in the data store. In the eviction mode, a set of storage blocks for garbage collection is selected in dependence on the eviction policy and data contained in each selected storage block is evicted from the cache.
    Type: Grant
    Filed: August 28, 2018
    Date of Patent: August 4, 2020
    Assignee: International Business Machines Corporation
    Inventors: Antonios Kornilios Kourtis, Nikolas Ioannou, Ioannis Koltsidas
  • Publication number: 20200073823
    Abstract: Performance of a data cache is controlled; the cache implements a garbage collection process for maintaining free storage blocks in a data store of the cache and an eviction policy for selecting data to be evicted from the cache. A cache performance control method defines a performance target for operation of the cache and, in operation of the cache, monitors performance of the cache in relation to the performance target. The garbage collection process is selectively performed in a relocation mode and an eviction mode so as to promote compliance with the performance target. In the relocation mode, data contained in a set of storage blocks selected for garbage collection is relocated in the data store. In the eviction mode, a set of storage blocks for garbage collection is selected in dependence on the eviction policy and data contained in each selected storage block is evicted from the cache.
    Type: Application
    Filed: August 28, 2018
    Publication date: March 5, 2020
    Inventors: Antonios Kornilios Kourtis, Nikolas Ioannou, Ioannis Koltsidas
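The abstracts of patent 11886960 and publication 20200356893 describe scaling-in a parallel training run from K workers to K′ workers (K>K′≥1) when the temporal evolution of a convergence-rate indicator deteriorates. A minimal Python sketch of that idea follows; it is illustrative only, not the patented method, and every name in it (`detect_deterioration`, `scale_in`, the loss-based indicator, the halving factor) is a hypothetical choice, not taken from the patent:

```python
# Illustrative sketch: shrink the worker count when per-iteration loss
# improvements keep getting smaller, i.e. the convergence rate worsens.

def detect_deterioration(loss_history, window=3):
    """True when the most recent loss improvements are strictly shrinking
    over a trailing window (a simple stand-in for a convergence-rate
    indicator's temporal evolution)."""
    if len(loss_history) < window + 1:
        return False
    # Per-iteration improvements over the trailing window.
    deltas = [loss_history[i - 1] - loss_history[i]
              for i in range(len(loss_history) - window + 1, len(loss_history))]
    return all(later < earlier for earlier, later in zip(deltas, deltas[1:]))

def scale_in(k, factor=2):
    """Reduce the worker count K to K' with K > K' >= 1."""
    return max(1, k // factor)

# Toy driver: losses flatten out, repeatedly triggering scale-in from K=8.
k = 8
history = []
for loss in [10.0, 6.0, 3.5, 2.8, 2.5, 2.4]:
    history.append(loss)
    if k > 1 and detect_deterioration(history):
        k = scale_in(k)
print(k)  # worker count after all scale-in events
```

In this toy run the improvements shrink monotonically, so the run scales in at every eligible iteration until a single worker remains.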
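Patent 11562270 and publication 20210312241 describe preemptable tasks, each assigned multiple training samples per iteration; when a preemption condition is satisfied, running tasks are preempted and the model update uses outputs from both the preempted tasks and the tasks that finished all of their samples. A hedged Python sketch of that flow (all names and the sample-count "preemption condition" are hypothetical illustrations, not the patent's mechanism):

```python
# Illustrative sketch: tasks process their assigned samples one at a time;
# once a per-iteration preemption budget is exhausted, remaining work is
# preempted, and the update aggregates partial as well as complete outputs.

def run_iteration(tasks, preemption_budget):
    """`tasks` is a list of per-task sample outputs (gradients, as floats);
    `preemption_budget` caps the total samples processed this iteration."""
    outputs = []
    processed = 0
    for task in tasks:
        for grad in task:
            if processed >= preemption_budget:
                break              # preempt this task mid-way
            outputs.append(grad)   # partial and complete outputs both count
            processed += 1
    return outputs

# Three tasks, two samples each; the condition preempts after 5 samples.
tasks = [[0.1, 0.2], [0.3, 0.4], [0.5, 0.6]]
grads = run_iteration(tasks, preemption_budget=5)
update = sum(grads) / len(grads)  # model update from 5 of the 6 outputs
print(len(grads))
```

The point of the scheme is that preemption does not discard work already done: the five outputs produced before the cutoff still contribute to the update.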
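Patent 10733114 and publication 20200073823 describe garbage collection for a cache data store that runs in a relocation mode (valid data is relocated within the data store) or an eviction mode (the eviction policy selects blocks whose data is evicted) depending on compliance with a performance target. A minimal Python sketch under assumed simplifications; the hit-rate target, the block representation, and all names are hypothetical, not from the patent:

```python
# Illustrative sketch: dual-mode garbage collection for a cache data store.
# Relocation mode keeps cached data by compacting a victim block's valid
# entries elsewhere; eviction mode drops the data chosen by the policy.

def collect(blocks, hit_rate, target_hit_rate, eviction_policy):
    """`blocks` maps block id -> list of valid entries.
    Returns (relocated entries, evicted entries) and frees one block."""
    if hit_rate >= target_hit_rate:
        # Relocation mode: target is met, so reclaim the emptiest block
        # and relocate its valid entries within the data store.
        victim = min(blocks, key=lambda b: len(blocks[b]))
        relocated = blocks.pop(victim)
        blocks.setdefault("relocation_target", []).extend(relocated)
        return relocated, []
    # Eviction mode: target is missed, so the eviction policy selects the
    # victim block and its data leaves the cache instead of being relocated.
    victim = eviction_policy(blocks)
    return [], blocks.pop(victim)

blocks = {"b0": ["a", "b"], "b1": ["c"]}
# Hit rate below target: eviction mode; the policy here just picks "b0".
kept, evicted = collect(blocks, hit_rate=0.6, target_hit_rate=0.8,
                        eviction_policy=lambda bs: "b0")
print(evicted)
```

Switching modes lets the collector free blocks cheaply (relocation) while performance is acceptable, and fall back to policy-driven eviction only when the target is being missed.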