Patents by Inventor Jonas Furtado Dias

Jonas Furtado Dias has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11403525
    Abstract: Reinforcement learning is used to dynamically tune cache policy parameters. The current state of a workload on a cache is provided to a reinforcement learning process. The reinforcement learning process uses the cache workload characterization to select an action to be taken to adjust a value of one of multiple parameterized cache policies used to control operation of a cache. The adjusted value is applied to the cache for an upcoming time interval. At the end of the time interval, a reward associated with the action is determined, which may be computed by comparing the cache hit rate during the interval with a baseline hit rate. The process iterates until the end of an episode, at which point the parameters of the cache control policies are reset. The episode is used to train the reinforcement learning policy so that the reinforcement learning process converges to a trained state.
    Type: Grant
    Filed: June 1, 2020
    Date of Patent: August 2, 2022
    Assignee: Dell Products, L.P.
    Inventors: Vinicius Michel Gottin, Tiago Salviano Calmon, Jonas Furtado Dias, Alex Laier Bordignon, Daniel Sadoc Menasché
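The tuning loop described in the abstract above can be sketched as a small contextual-bandit-style learner. This is a minimal illustration, not the patented method: `simulated_hit_rate` is a hypothetical stand-in for measuring the cache over one interval, the three-value workload state and the parameter step sizes are invented for the example, and the reward is the hit-rate change against the previous interval's baseline.

```python
import random

ACTIONS = [-0.1, 0.0, 0.1]  # decrease, keep, or increase the policy parameter


def simulated_hit_rate(param, workload_state):
    # Hypothetical stand-in for the measured hit rate during one interval:
    # each coarse workload state has a different optimal parameter value.
    optimum = {0: 0.2, 1: 0.5, 2: 0.8}[workload_state]
    return max(0.0, 1.0 - abs(param - optimum))


def run_episode(q, param=0.5, steps=50, eps=0.2, alpha=0.5):
    state = random.randrange(3)  # coarse workload characterization
    baseline = simulated_hit_rate(param, state)
    for _ in range(steps):
        # Epsilon-greedy action selection over the learned values.
        if random.random() < eps:
            a = random.randrange(len(ACTIONS))
        else:
            a = max(range(len(ACTIONS)), key=lambda i: q[(state, i)])
        param = min(1.0, max(0.0, param + ACTIONS[a]))
        hit = simulated_hit_rate(param, state)
        reward = hit - baseline  # compare against the baseline hit rate
        q[(state, a)] += alpha * (reward - q[(state, a)])
        baseline = hit
    return param


q = {(s, a): 0.0 for s in range(3) for a in range(3)}
random.seed(0)
for _ in range(200):  # episodes; the policy parameter resets each episode
    final_param = run_episode(q)
```

In the patented scheme the episode boundary is where cache-policy parameters are reset and the accumulated experience trains the reinforcement learning policy; the sketch compresses that into a tabular value update.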
  • Publication number: 20220237338
    Abstract: A system and method for implementing design cycles for developing a hardware component including receiving sets of experimental data, each set of experimental data resulting from an application of a set of variables to the hardware component during a common or a different design cycle of the hardware component, where each variable represents an aspect of the hardware component, determining discretized classes of the experimental data based on one or more quality metrics, and obtaining statistical measurements of the variables to determine correlations between the discretized classes of the quality metrics and the statistical measurements of variables for determining a pattern of the quality metrics to reduce the number of design cycles implemented on the hardware component during the developing of the hardware component.
    Type: Application
    Filed: January 28, 2021
    Publication date: July 28, 2022
    Inventors: Paulo Abelha Ferreira, Adriana Bechara Prado, Jonas Furtado Dias
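The core steps in the abstract above, discretizing a quality metric into classes and correlating design variables against those classes, can be sketched as follows. The variable names, the median-split discretization, and the sample data are all hypothetical; the patent does not prescribe these specific choices.

```python
import statistics


def discretize(values, threshold=None):
    # Split quality-metric values into two classes around the median
    # (a hypothetical discretization choice for illustration).
    threshold = statistics.median(values) if threshold is None else threshold
    return [1 if v >= threshold else 0 for v in values]


def pearson(xs, ys):
    # Plain Pearson correlation between a variable and the class labels.
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)


# Hypothetical design-cycle data: each column position is one experiment,
# pairing variable settings with a measured quality metric.
variables = {"clock": [1.0, 1.2, 1.4, 1.6], "voltage": [0.9, 0.9, 1.1, 1.1]}
quality = [0.2, 0.3, 0.7, 0.9]

classes = discretize(quality)
correlations = {name: pearson(vals, classes) for name, vals in variables.items()}
```

Variables whose correlation with the quality classes is strong would then guide which experiments to run next, which is how the method aims to cut the number of design cycles.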
  • Publication number: 20220172075
    Abstract: Decoding random forest problem solving through node labeling and subtree distributions. Random forests, like any other type of machine learning algorithm, are designed and configured to solve classification, regression, and/or prediction problems. Solutions (or outputs) provided by random forests, given inputs in the form of values for a set of features, may sometimes be inaccurate, unexpected, or undesirable. Understanding or decoding how a random forest solves a given problem may be a way to correct or improve the random forest. The disclosed method, accordingly, proposes decoding random forest problem solving through the identification of subtrees (by way of node labeling) amongst a random forest, as well as the frequencies with which these subtrees appear (or distributions thereof) throughout the random forest.
    Type: Application
    Filed: November 30, 2020
    Publication date: June 2, 2022
    Inventors: Paulo Abelha Ferreira, Jonas Furtado Dias, Adriana Bechara Prado
  • Patent number: 11275987
    Abstract: A method for optimizing performance of a storage system includes creating a structured state index from a universe of I/O traces of memory access operations in a storage system. The structured state index is validated against a target metric operational parameter of the storage system. If the structured state index has correlation against the target metric operational parameter of the storage system, the structured state index is used as input to a decision-making task. The decision-making task may be implemented as a deep neural network and the structured state index is used as input training data for the deep neural network. Once the decision-making task has been trained using the structured state index, the decision-making task is used in a predictive manner to generate a predicted target metric operational parameter of the storage system given a proposed storage policy.
    Type: Grant
    Filed: July 9, 2019
    Date of Patent: March 15, 2022
    Assignee: Dell Products, L.P.
    Inventors: Vinicius Michel Gottin, Jonas Furtado Dias, Tiago Salviano Calmon, Alex Laier Bordignon, Daniel Sadoc Menasché
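The validation step in the abstract above, checking that a structured state index correlates with a target metric before using it as training input, can be sketched as follows. The index definition (fraction of repeated addresses in a trace window), the trace data, and the hit rates are hypothetical; the deep-neural-network training stage is omitted for brevity.

```python
import statistics


def state_index(trace):
    # Hypothetical structured index: fraction of repeated addresses in
    # the window, i.e. a crude locality summary of the I/O trace.
    return 1.0 - len(set(trace)) / len(trace)


def validate(indices, metric, threshold=0.7):
    # Accept the index as decision-making input only if it correlates
    # with the target metric operational parameter.
    mi, mm = statistics.fmean(indices), statistics.fmean(metric)
    cov = sum((i - mi) * (m - mm) for i, m in zip(indices, metric))
    si = sum((i - mi) ** 2 for i in indices) ** 0.5
    sm = sum((m - mm) ** 2 for m in metric) ** 0.5
    r = cov / (si * sm)
    return abs(r) >= threshold, r


# Hypothetical universe of I/O trace windows and their measured hit rates.
traces = [[1, 1, 1, 2], [1, 2, 3, 4], [5, 5, 6, 5], [7, 8, 9, 7]]
hit_rates = [0.6, 0.1, 0.7, 0.3]

indices = [state_index(t) for t in traces]
ok, r = validate(indices, hit_rates)
```

Only when `ok` holds would the index be handed to the downstream decision-making task (in the patent, a deep neural network trained to predict the target metric under a proposed storage policy).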
  • Publication number: 20210374523
    Abstract: Reinforcement learning is used to dynamically tune cache policy parameters. The current state of a workload on a cache is provided to a reinforcement learning process. The reinforcement learning process uses the cache workload characterization to select an action to be taken to adjust a value of one of multiple parameterized cache policies used to control operation of a cache. The adjusted value is applied to the cache for an upcoming time interval. At the end of the time interval, a reward associated with the action is determined, which may be computed by comparing the cache hit rate during the interval with a baseline hit rate. The process iterates until the end of an episode, at which point the parameters of the cache control policies are reset. The episode is used to train the reinforcement learning policy so that the reinforcement learning process converges to a trained state.
    Type: Application
    Filed: June 1, 2020
    Publication date: December 2, 2021
    Inventors: Vinicius Michel Gottin, Tiago Salviano Calmon, Jonas Furtado Dias, Alex Laier Bordignon, Daniel Sadoc Menasché
  • Patent number: 11138118
    Abstract: The sizes of cache partitions, in a partitioned cache, are dynamically adjusted by determining, for each request, how many cache misses will occur in connection with implementing the request against the cache partition. The cache partition associated with the current request is increased in size by the number of cache misses and one or more other cache partitions are decreased in size, causing cache evictions to occur from the other cache partitions rather than from the current cache partition. The other cache partitions, that are to be decreased in size, may be determined by ranking the cache partitions according to frequency of use and selecting the least frequently used cache partition to be reduced in size.
    Type: Grant
    Filed: January 13, 2020
    Date of Patent: October 5, 2021
    Assignee: EMC IP Holding Company LLC
    Inventors: Hugo de Oliveira Barbalho, Jonas Furtado Dias, Vinícius Michel Gottin
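The resizing rule in the abstract above, growing the requesting partition by its miss count and shrinking the least-frequently-used other partition, can be sketched in a few lines. The partition names, sizes, and access frequencies are hypothetical sample data.

```python
def rebalance(sizes, use_freq, requester, misses):
    # Grow the requester's partition by its miss count, then shrink the
    # least-frequently-used other partition by the same amount so that
    # evictions fall on the victim rather than on the current partition.
    sizes = dict(sizes)
    sizes[requester] += misses
    victim = min((p for p in sizes if p != requester), key=use_freq.get)
    sizes[victim] = max(0, sizes[victim] - misses)
    return sizes, victim


# Hypothetical partitioned cache: slot counts and per-partition use frequency.
sizes = {"A": 100, "B": 100, "C": 100}
freq = {"A": 50, "B": 5, "C": 20}

new_sizes, victim = rebalance(sizes, freq, requester="A", misses=10)
```

Because the growth of one partition is always matched by an equal shrink elsewhere, the total cache allocation stays constant across requests.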
  • Patent number: 11113192
    Abstract: A method of dynamically adjusting sizes of cache partitions includes, for each cache partition, estimating a number of hits that would occur on the cache partition for a set of potential size increases of the cache partition and a set of potential size decreases of the cache partition. Based on these estimates, a determination is made, for each cache partition, whether to increase the size of the cache partition, maintain a current size of the cache partition, or decrease the size of the cache partition. Cache partition size increases are balanced with cache partition size decreases to allocate the entirety of the cache to the set of cache partitions without over allocating cache resources and while optimizing a sum of total cache hit rates of the set of cache partitions. A set of data structures is used to efficiently determine the estimated hit increases and decreases for each cache partition.
    Type: Grant
    Filed: November 22, 2019
    Date of Patent: September 7, 2021
    Assignee: EMC IP Holding Company LLC
    Inventors: Hugo de Oliveira Barbalho, Jonas Furtado Dias
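The balancing step in the abstract above, pairing an estimated-hit gain from growing one partition with an estimated-hit loss from shrinking another so the total allocation is unchanged, can be sketched greedily. The gain/loss estimates and the single fixed step size `delta` are hypothetical simplifications; the patent describes dedicated data structures for producing such estimates efficiently.

```python
def plan_resizes(gain, loss, delta=8):
    # gain[p]: estimated extra hits if partition p grows by delta slots.
    # loss[p]: estimated hits lost if partition p shrinks by delta slots.
    # Greedily pair the best grower with the cheapest shrinker so the
    # total cache allocation stays constant and net hits increase.
    plan = {p: 0 for p in gain}
    grower = max(gain, key=gain.get)
    shrinker = min(loss, key=loss.get)
    if grower != shrinker and gain[grower] > loss[shrinker]:
        plan[grower] += delta
        plan[shrinker] -= delta
    return plan


# Hypothetical per-partition hit estimates for a +/- delta size change.
gain = {"A": 40, "B": 5, "C": 12}
loss = {"A": 30, "B": 2, "C": 9}

plan = plan_resizes(gain, loss)
```

Any partition not chosen as grower or shrinker keeps its current size, matching the "increase, maintain, or decrease" decision made per partition in the abstract.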
  • Publication number: 20210216460
    Abstract: The sizes of cache partitions, in a partitioned cache, are dynamically adjusted by determining, for each request, how many cache misses will occur in connection with implementing the request against the cache partition. The cache partition associated with the current request is increased in size by the number of cache misses and one or more other cache partitions is decreased in size causing cache evictions to occur from the other cache partitions rather than from the current cache partition. The other cache partitions, that are to be decreased in size, may be determined by ranking the cache partitions according to frequency of use and selecting the least frequently used cache partition to be reduced in size.
    Type: Application
    Filed: January 13, 2020
    Publication date: July 15, 2021
    Inventors: Hugo de Oliveira Barbalho, Jonas Furtado Dias, Vinícius Michel Gottin
  • Publication number: 20210157725
    Abstract: A method of dynamically adjusting sizes of cache partitions includes, for each cache partition, estimating a number of hits that would occur on the cache partition for a set of potential size increases of the cache partition and a set of potential size decreases of the cache partition. Based on these estimates, a determination is made, for each cache partition, whether to increase the size of the cache partition, maintain a current size of the cache partition, or decrease the size of the cache partition. Cache partition size increases are balanced with cache partition size decreases to allocate the entirety of the cache to the set of cache partitions without over allocating cache resources and while optimizing a sum of total cache hit rates of the set of cache partitions. A set of data structures is used to efficiently determine the estimated hit increases and decreases for each cache partition.
    Type: Application
    Filed: November 22, 2019
    Publication date: May 27, 2021
    Inventors: Hugo de Oliveira Barbalho, Jonas Furtado Dias
  • Publication number: 20200341899
    Abstract: A data processing device includes persistent storage, a cache for the persistent storage, and a cache manager. The persistent storage is divided into logical units. The cache manager obtains persistent storage use data; selects model parameters for a cache prediction model based on the persistent storage use data; trains the cache prediction model based on the persistent storage use data using the selected model parameters to obtain a trained cache prediction model; and manages the cache based on logical units of the persistent storage using the trained cache prediction model.
    Type: Application
    Filed: April 26, 2019
    Publication date: October 29, 2020
    Inventors: Jonas Furtado Dias, Rômulo Teixeira de Abreu Pinho, Adriana Bechara Prado, Vinicius Michel Gottin, Tiago Salviano Calmon, Owen Martin
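The cache-manager pipeline in the abstract above, selecting model parameters from use data, training a per-logical-unit prediction model, and managing the cache from its predictions, can be sketched with a deliberately simple model. The moving-average predictor, the spread-based window selection, and the LUN names and counts are all hypothetical stand-ins for the unspecified cache prediction model.

```python
def select_window(use_data):
    # Hypothetical parameter selection: use a longer history window when
    # the observed access counts vary more widely.
    spread = max(use_data) - min(use_data) if use_data else 0
    return 3 if spread > 10 else 2


def train(history, window):
    # "Train" a moving-average predictor of per-LUN access counts over
    # the last `window` intervals.
    return {lun: sum(counts[-window:]) / window for lun, counts in history.items()}


def manage_cache(predictions, capacity):
    # Keep the logical units with the highest predicted access in cache.
    ranked = sorted(predictions, key=predictions.get, reverse=True)
    return set(ranked[:capacity])


# Hypothetical per-logical-unit access counts, one entry per interval.
history = {
    "lun0": [5, 7, 6],
    "lun1": [50, 60, 55],
    "lun2": [1, 0, 2],
}

all_counts = [c for counts in history.values() for c in counts]
w = select_window(all_counts)
preds = train(history, w)
cached = manage_cache(preds, capacity=2)
```

The point of the sketch is the division of labor, not the model: persistent-storage use data drives both the choice of model parameters and the training, and the trained model then decides which logical units the cache serves.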