Patents by Inventor Pooja Garg

Pooja Garg has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 10320907
    Abstract: A system and method are disclosed for scheduling the pre-loading, into a faster storage tier, of long-term data predicted to be requested in future time epochs. For each future epoch, which may be on the order of minutes or hours, the data chunks that may be accessed are predicted. Intersections are taken between the predicted chunk sets of adjacent epochs, starting with the furthest predicted epoch and working back toward the nearest; the results are then intersected with their neighbors in turn, up a hierarchy, until a single intersection spanning all of the predicted epochs remains. Commands are generated to pre-load the data chunks predicted to have the most recurring accesses, and the chunks are pre-loaded into the cache in that order until either the last predicted data set has been pre-loaded or the cache is determined to have run out of space. (An illustrative sketch of this scheme follows the listing below.)
    Type: Grant
    Filed: September 26, 2016
    Date of Patent: June 11, 2019
    Assignee: NETAPP, INC.
    Inventors: Sai Rama Krishna Susarla, Pooja Garg
  • Publication number: 20180091593
    Abstract: A system and method are disclosed for scheduling the pre-loading, into a faster storage tier, of long-term data predicted to be requested in future time epochs. For each future epoch, which may be on the order of minutes or hours, the data chunks that may be accessed are predicted. Intersections are taken between the predicted chunk sets of adjacent epochs, starting with the furthest predicted epoch and working back toward the nearest; the results are then intersected with their neighbors in turn, up a hierarchy, until a single intersection spanning all of the predicted epochs remains. Commands are generated to pre-load the data chunks predicted to have the most recurring accesses, and the chunks are pre-loaded into the cache in that order until either the last predicted data set has been pre-loaded or the cache is determined to have run out of space.
    Type: Application
    Filed: September 26, 2016
    Publication date: March 29, 2018
    Inventors: Sai Rama Krishna Susarla, Pooja Garg
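
Both records above describe the same pre-loading scheme. The following Python sketch illustrates one plausible reading of the abstract: per-epoch predictions held as sets of chunk IDs, pairwise intersections folded up a hierarchy, chunks ranked by how often they recur, and pre-load commands emitted until a chunk-counted cache fills. Every name here (build_intersection_hierarchy, preload_order, schedule_preloads, cache_capacity_chunks) is a hypothetical placeholder; this is not NetApp's implementation or the claimed method.

```python
# Hypothetical sketch of the intersection-hierarchy preload scheduler
# described in the abstract; names and data shapes are assumptions.
from collections import Counter


def build_intersection_hierarchy(epoch_predictions):
    """Fold the per-epoch chunk-set predictions pairwise, starting from
    the furthest epoch and working back, until a single set remains:
    the chunks predicted to recur across every epoch."""
    levels = [list(reversed(epoch_predictions))]  # furthest epoch first
    while len(levels[-1]) > 1:
        prev = levels[-1]
        levels.append([prev[i] & prev[i + 1] for i in range(len(prev) - 1)])
    return levels


def preload_order(epoch_predictions):
    """Rank chunks by how many sets in the hierarchy they survive into,
    a proxy for the abstract's "most recurring accesses"."""
    recurrence = Counter()
    for level in build_intersection_hierarchy(epoch_predictions):
        for chunk_set in level:
            recurrence.update(chunk_set)
    return [chunk for chunk, _ in recurrence.most_common()]


def schedule_preloads(epoch_predictions, cache_capacity_chunks):
    """Emit pre-load commands in ranked order, stopping when the plan is
    exhausted or the (hypothetical, chunk-counted) cache runs out of space."""
    commands = []
    for chunk in preload_order(epoch_predictions):
        if len(commands) >= cache_capacity_chunks:
            break  # cache full: stop pre-loading
        commands.append(("PRELOAD", chunk))
    return commands
```

A small worked example under the same assumptions: with three predicted epochs (nearest first), chunk "C" appears in every epoch, so it survives to the top of the hierarchy and is pre-loaded first.

```python
epochs = [{"A", "B", "C"}, {"B", "C", "D"}, {"C", "D", "E"}]
print(schedule_preloads(epochs, cache_capacity_chunks=3))
# e.g. [('PRELOAD', 'C'), ('PRELOAD', 'B'), ('PRELOAD', 'D')]
# (order among equally ranked chunks depends on set iteration order)
```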