Patents by Inventor Lokesh M. Gupta

Lokesh M. Gupta has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20210263862
    Abstract: A method for maintaining statistics for data elements in a cache is disclosed. The method maintains a heterogeneous cache comprising a higher performance portion and a lower performance portion. The method maintains, within the lower performance portion, a ghost cache containing statistics for data elements that are currently contained in the heterogeneous cache, and data elements that have been demoted from the heterogeneous cache within a specified time interval. The method maintains updates to the statistics in an update area within the higher performance portion. The method determines whether the updates have reached a specified threshold and, in the event the updates have reached the specified threshold, flushes the updates from the update area to the ghost cache to update the statistics. A corresponding system and computer program product are also disclosed.
    Type: Application
    Filed: February 22, 2020
    Publication date: August 26, 2021
    Applicant: International Business Machines Corporation
    Inventors: Lokesh M. Gupta, Kyler A. Anderson, Kevin J. Ash, Matthew G. Borlick
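A minimal Python sketch of the update-area idea in this abstract (publication 20210263862): statistics updates are buffered in the higher performance portion and flushed to the ghost cache once an assumed threshold is reached. All class, attribute, and threshold names are hypothetical.

```python
# Hypothetical sketch: buffer ghost-cache statistic updates and flush them in batches.
UPDATE_FLUSH_THRESHOLD = 1000  # assumed threshold; the abstract does not give a value

class GhostCacheWithUpdateArea:
    def __init__(self):
        self.ghost_stats = {}   # ghost cache kept in the lower performance portion
        self.update_area = []   # pending updates kept in the higher performance portion

    def record_access(self, element_id):
        """Queue a statistics update instead of writing the ghost cache directly."""
        self.update_area.append(element_id)
        if len(self.update_area) >= UPDATE_FLUSH_THRESHOLD:
            self.flush()

    def flush(self):
        """Apply all buffered updates to the ghost cache, then clear the buffer."""
        for element_id in self.update_area:
            self.ghost_stats[element_id] = self.ghost_stats.get(element_id, 0) + 1
        self.update_area.clear()
```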
  • Publication number: 20210263860
    Abstract: A method for maintaining statistics for data elements in a cache is disclosed. The method maintains a heterogeneous cache comprising a higher performance portion and a lower performance portion. The method maintains, within the lower performance portion, a ghost cache containing statistics for data elements that are currently contained in the heterogeneous cache, and data elements that have been demoted from the heterogeneous cache within a specified time interval. The method calculates a size of the ghost cache based on an amount of frequently accessed data that is stored in backend storage volumes behind the heterogeneous cache. The method alters the size of the ghost cache as the amount of frequently accessed data changes. A corresponding system and computer program product are also disclosed.
    Type: Application
    Filed: February 22, 2020
    Publication date: August 26, 2021
    Applicant: International Business Machines Corporation
    Inventors: Lokesh M. Gupta, Kyler A. Anderson, Kevin J. Ash, Matthew G. Borlick
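A hedged sketch of the sizing rule described above: the ghost cache is sized to the amount of frequently accessed data behind the cache and resized as that amount changes. The per-entry and per-track byte counts are assumed values, not figures from the patent.

```python
from dataclasses import dataclass

# Hypothetical sketch: size the ghost cache in proportion to the amount of
# frequently accessed ("hot") data stored on the backend volumes.
STATS_ENTRY_BYTES = 64      # assumed size of one statistics entry
TRACK_BYTES = 64 * 1024     # assumed track size

@dataclass
class GhostCache:
    capacity_bytes: int = 0

def ghost_cache_size(hot_data_bytes: int) -> int:
    """One statistics entry per frequently accessed track."""
    return (hot_data_bytes // TRACK_BYTES) * STATS_ENTRY_BYTES

def resize_ghost_cache(cache: GhostCache, hot_data_bytes: int) -> None:
    """Alter the ghost-cache size as the amount of hot data changes."""
    cache.capacity_bytes = ghost_cache_size(hot_data_bytes)
```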
  • Publication number: 20210263861
    Abstract: A method for demoting data elements from a cache is disclosed. The method maintains a heterogeneous cache comprising a higher performance portion and a lower performance portion. The method maintains, within the lower performance portion, a ghost cache containing statistics for data elements that are currently contained in the heterogeneous cache, and data elements that have been demoted from the heterogeneous cache within a specified time interval. The method maintains, for the ghost cache, multiple LRU lists that designate an order in which data elements are demoted from the lower performance portion. The method utilizes the statistics to determine in which LRU lists the data elements are referenced. A corresponding system and computer program product are also disclosed.
    Type: Application
    Filed: February 22, 2020
    Publication date: August 26, 2021
    Applicant: International Business Machines Corporation
    Inventors: Lokesh M. Gupta, Matthew G. Borlick, Kyler A. Anderson, Kevin J. Ash
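One way the multiple-LRU-list arrangement could look, assuming the ghost-cache statistics are simple access counts and the boundaries between lists are fixed thresholds; both assumptions go beyond the abstract.

```python
from collections import OrderedDict

# Hypothetical sketch: place each data element on one of several LRU lists
# according to its access statistics; demotion drains the lowest list first.
class MultiListGhostCache:
    def __init__(self, thresholds=(1, 4, 16)):
        self.thresholds = thresholds  # assumed access-count boundaries between lists
        self.lru_lists = [OrderedDict() for _ in range(len(thresholds) + 1)]

    def _list_for(self, access_count):
        for i, bound in enumerate(self.thresholds):
            if access_count < bound:
                return i
        return len(self.thresholds)

    def reference(self, element_id, access_count):
        """Move the element to the MRU end of the list its statistics select."""
        for lst in self.lru_lists:
            lst.pop(element_id, None)
        self.lru_lists[self._list_for(access_count)][element_id] = True

    def demote_one(self):
        """Demote from the LRU end of the least-valuable non-empty list."""
        for lst in self.lru_lists:
            if lst:
                return lst.popitem(last=False)[0]
        return None
```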
  • Publication number: 20210263781
    Abstract: A method for dispatching tasks on processor cores based on memory access efficiency is disclosed. The method identifies a task and a memory area to be accessed by the task. The method may use one or more of a compiler, code knowledge, and run-time statistics to identify the memory area that is accessed by the task. The method identifies multiple processor cores that are candidates to execute the task and identifies a particular processor core from the multiple processor cores that provides most efficient access to the memory area. The method dispatches the task to execute on the particular processor core that is deemed most efficient. A corresponding system and computer program product are also disclosed.
    Type: Application
    Filed: February 22, 2020
    Publication date: August 26, 2021
    Applicant: International Business Machines Corporation
    Inventors: Lokesh M. Gupta, Matthew J. Kalos, Kevin J. Ash, Trung N. Nguyen
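A small illustrative sketch of the dispatch decision: among the candidate cores, pick the one with the cheapest access to the task's memory area. The cost table stands in for whatever compiler, code-knowledge, or run-time statistics the method actually uses.

```python
# Hypothetical sketch: pick the candidate core with the cheapest access to the
# task's memory area. The cost table is assumed input (e.g. from NUMA topology).
def pick_core(candidate_cores, memory_node, access_cost):
    """access_cost[(core, memory_node)] is an assumed relative latency value."""
    return min(candidate_cores, key=lambda core: access_cost[(core, memory_node)])

# Example usage with made-up numbers: core 1 is local to node 0, so it wins.
cost = {(0, 0): 30, (1, 0): 10, (2, 0): 30}
assert pick_core([0, 1, 2], memory_node=0, access_cost=cost) == 1
```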
  • Patent number: 11099743
    Abstract: Provided are a computer program product, system, and method for using a machine learning module to determine when to replace a storage device. Input on attributes of the storage device is provided to a machine learning module to produce an output value. A determination is made whether the output value indicates to replace the storage device. Indication is made to replace the storage device in response to determining that the output value indicates to replace the storage device.
    Type: Grant
    Filed: June 29, 2018
    Date of Patent: August 24, 2021
    Assignee: INTERNATIONAL BUSINESS MACHINES CORPORATION
    Inventors: Matthew G. Borlick, Karl A. Nielsen, Clint A. Hardy, Lokesh M. Gupta
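A minimal sketch of the decision step, assuming the machine learning module exposes a predict call returning a failure-likelihood score and that a fixed threshold converts the score into a replace/keep decision; both are assumptions layered on the abstract.

```python
# Hypothetical sketch: feed drive attributes to a trained model and flag the
# drive for replacement when the output crosses an assumed threshold.
REPLACE_THRESHOLD = 0.8   # assumed decision boundary

def should_replace(model, drive_attributes):
    """model.predict is assumed to return a failure-likelihood score in [0, 1]."""
    score = model.predict(drive_attributes)
    return score >= REPLACE_THRESHOLD
```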
  • Patent number: 11093395
    Abstract: Provided are a computer program product, system, and method for adjusting insertion points used to determine locations in a cache list at which to indicate tracks based on the number of tracks added at the insertion points. There are a plurality of insertion points to a cache list for the cache having a least recently used (LRU) end and a most recently used (MRU) end. Each insertion point of the insertion points identifies a track in the cache list. A plurality of tracks are indicated at positions in the cache list with respect to insertion points. For each track indicated at an insertion point of the insertion points, at least one insertion point counter for at least one insertion point with respect to the insertion point at which the track is indicated is incremented. A plurality of the insertion points are adjusted to point to different tracks in the cache list based on insertion point counters for the insertion points.
    Type: Grant
    Filed: August 7, 2019
    Date of Patent: August 17, 2021
    Assignee: International Business Machines Corporation
    Inventors: Lokesh M. Gupta, Kyler A. Anderson, Kevin J. Ash, Matthew J. Kalos
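A rough sketch of insertion points with per-point counters. The rebalancing rule shown (re-spacing the points evenly) is only one plausible adjustment policy; the abstract does not spell out the exact adjustment, and all names are hypothetical.

```python
# Hypothetical sketch: count how many tracks were added at each insertion point
# and re-point the insertion points when the counts show the intervals drifting.
class InsertionPointList:
    def __init__(self, cache_list, num_points):
        self.cache_list = cache_list              # track ids; index 0 is the LRU end
        step = max(1, len(cache_list) // num_points)
        self.points = [i * step for i in range(num_points)]
        self.counters = [0] * num_points

    def insert_at(self, point_index, track_id):
        """Indicate a track at an insertion point and increment its counter."""
        pos = self.points[point_index]
        self.cache_list.insert(pos, track_id)
        self.counters[point_index] += 1
        for i in range(point_index + 1, len(self.points)):
            self.points[i] += 1                   # later points shift with the insert

    def adjust(self):
        """Re-point the insertion points so the intervals are even again."""
        step = max(1, len(self.cache_list) // len(self.points))
        self.points = [i * step for i in range(len(self.points))]
        self.counters = [0] * len(self.points)
```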
  • Patent number: 11093399
    Abstract: Provided are a computer program product, system, and method for selecting resources to make available in local queues for processors to use. Each processor of a plurality of processors maintains a queue of resources for the processor to use when needed for processor operations. One of the processors is selected. The selected processor accesses at least one available resource and includes the accessed at least one resource in the queue of the selected processor.
    Type: Grant
    Filed: June 24, 2019
    Date of Patent: August 17, 2021
    Assignee: INTERNATIONAL BUSINESS MACHINES CORPORATION
    Inventors: Kevin J. Ash, Matthew G. Borlick, Lokesh M. Gupta, Trung N. Nguyen
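A small sketch of per-processor resource queues, assuming the selected processor simply pulls a batch of available resources into its own local queue; the batch size and names are invented for illustration.

```python
from collections import deque

# Hypothetical sketch: each processor keeps a local queue of resources; one
# processor is selected to pull available resources into its own queue.
def refill_local_queue(local_queues, selected_cpu, available_resources, batch=4):
    """Move up to `batch` available resources into the selected processor's queue."""
    queue = local_queues[selected_cpu]
    for _ in range(min(batch, len(available_resources))):
        queue.append(available_resources.pop())

cpus = {0: deque(), 1: deque()}
free = ["bufA", "bufB", "bufC"]   # hypothetical resource names
refill_local_queue(cpus, selected_cpu=1, available_resources=free)
```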
  • Publication number: 20210248079
    Abstract: A method to prevent starvation of non-favored volumes in cache is disclosed. In one embodiment, such a method includes storing, in a cache of a storage system, non-favored storage elements and favored storage elements. A cache demotion algorithm is used to retain the favored storage elements in the cache longer than the non-favored storage elements. The method designates a maximum amount of storage space that the favored storage elements are permitted to consume in the cache. In preparation to free storage space in the cache, the method determines whether an amount of storage space consumed by the favored storage elements in the cache has reached the maximum amount. If so, the method frees storage space in the cache by demoting favored storage elements. If not, the method frees storage space in the cache in accordance with the cache demotion algorithm. A corresponding system and computer program product are also disclosed.
    Type: Application
    Filed: February 9, 2020
    Publication date: August 12, 2021
    Applicant: International Business Machines Corporation
    Inventors: Lokesh M. Gupta, Kevin J. Ash, Matthew G. Borlick, Beth A. Peterson
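A compact sketch of the starvation-avoidance check described above: favored storage elements are demoted only once they occupy their designated maximum share of the cache. The function and parameter names are assumptions.

```python
# Hypothetical sketch: when freeing space, demote favored elements only if they
# already occupy their maximum allowed share of the cache.
def choose_demotion_source(favored_bytes, favored_max_bytes,
                           favored_lru, non_favored_lru):
    """Return the LRU list to demote from, per the starvation-avoidance rule."""
    if favored_bytes >= favored_max_bytes:
        return favored_lru        # favored elements have reached their cap
    return non_favored_lru        # otherwise follow the normal demotion algorithm
```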
  • Publication number: 20210247930
    Abstract: A method for pinning selected volumes within a heterogeneous cache is disclosed. The method maintains a heterogeneous cache made up of a higher performance portion and a lower performance portion. A list of pinned volumes is received that are provided higher priority than other volumes within the heterogeneous cache. The method dedicates, within the lower performance portion, a storage area to accommodate the pinned volumes and prestages the pinned volumes within the storage area. In certain embodiments, an LRU list is maintained that indicates an order in which storage elements of the pinned volumes are demoted from the storage area. A corresponding system and computer program product are also disclosed.
    Type: Application
    Filed: February 9, 2020
    Publication date: August 12, 2021
    Applicant: International Business Machines Corporation
    Inventors: Lokesh M. Gupta, Matthew G. Borlick, Kevin J. Ash, Beth A. Peterson
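A hedged sketch of the dedicated pinned-volume area: elements of pinned volumes are prestaged into their own storage area and tracked on their own LRU list. The capacity handling and interfaces are assumptions, not details from the patent.

```python
from collections import OrderedDict

# Hypothetical sketch: dedicate a storage area in the lower performance portion
# to pinned volumes, prestage their elements, and track them on their own LRU list.
class PinnedVolumeArea:
    def __init__(self, capacity_elements):
        self.capacity = capacity_elements
        self.lru = OrderedDict()   # element_id -> data, in LRU order for the pinned area

    def prestage(self, volume, read_element):
        """Prestage every element of a pinned volume into the dedicated area."""
        for element_id in volume:                  # `volume` is an iterable of element ids
            if len(self.lru) >= self.capacity:
                self.lru.popitem(last=False)       # demote per the pinned-area LRU list
            self.lru[element_id] = read_element(element_id)
```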
  • Publication number: 20210248082
    Abstract: A list of a first type of tracks in a cache is generated. A list of a second type of tracks in the cache is generated, wherein I/O operations are completed relatively faster to the first type of tracks than to the second type of tracks. A determination is made as to whether to demote a track from the list of the first type of tracks or from the list of the second type of tracks.
    Type: Application
    Filed: April 26, 2021
    Publication date: August 12, 2021
    Inventors: Kyler A. Anderson, Kevin J. Ash, Lokesh M. Gupta
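One possible policy for the demotion choice between the two lists. The abstract only states that a determination is made, not how, so the proportional rule below is purely illustrative.

```python
# Hypothetical sketch: keep separate lists for "fast" and "slow" tracks and pick
# which list to demote from; keeping the lists near an assumed target ratio is
# only one possible policy.
def pick_demotion_list(fast_list, slow_list, target_fast_fraction=0.5):
    total = len(fast_list) + len(slow_list)
    if total == 0:
        return None
    fast_fraction = len(fast_list) / total
    return fast_list if fast_fraction > target_fast_fraction else slow_list
```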
  • Publication number: 20210248086
    Abstract: A method for improving cache hit ratios dedicates, within a cache, a portion of the cache to prefetched data elements. The method maintains a high priority LRU list designating an order in which high priority prefetched data elements are demoted, and a low priority LRU list designating an order in which low priority prefetched data elements are demoted. The method calculates, for the high priority LRU list, a first score based on a first priority and a first cache hit metric. The method calculates, for the low priority LRU list, a second score based on a second priority and a second cache hit metric. The method demotes, from the cache, a prefetched data element from the high priority LRU list or the low priority LRU list depending on which of the first score and the second score is lower. A corresponding system and computer program product are also disclosed.
    Type: Application
    Filed: February 9, 2020
    Publication date: August 12, 2021
    Applicant: International Business Machines Corporation
    Inventors: Lokesh M. Gupta, Matthew G. Borlick, Beth A. Peterson, Kyler A. Anderson
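A sketch of the scoring comparison, assuming the score is simply the product of a priority weight and a cache-hit ratio; the abstract only says the score is "based on" those inputs, so the exact formula is an assumption.

```python
# Hypothetical sketch: score each prefetch LRU list from its priority weight and
# its cache-hit metric, and demote from whichever list scores lower.
def score(priority_weight, hit_ratio):
    return priority_weight * hit_ratio

def list_to_demote_from(high_lru, low_lru,
                        high_priority=2.0, low_priority=1.0,
                        high_hit_ratio=0.0, low_hit_ratio=0.0):
    high_score = score(high_priority, high_hit_ratio)
    low_score = score(low_priority, low_hit_ratio)
    return high_lru if high_score < low_score else low_lru
```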
  • Publication number: 20210248080
    Abstract: A method for depopulating data from cache includes receiving a command to depopulate the cache of selected data. The command has an application identifier as a parameter. The application identifier is associated with an application that previously accessed the data. The method searches the cache for data elements that are marked with the application identifier and removes the data elements from the cache. In certain embodiments, the data elements are marked with a first application identifier associated with an application that staged the data elements into the cache, and a second application identifier associated with an application that last accessed the data elements. In certain embodiments, removing the data elements from the cache comprises only removing the data elements from the cache if the application identifier matches one or more of the first application identifier and the second application identifier. A corresponding system and computer program product are also disclosed.
    Type: Application
    Filed: February 9, 2020
    Publication date: August 12, 2021
    Applicant: International Business Machines Corporation
    Inventors: Lokesh M. Gupta, Matthew G. Borlick, Kyler A. Anderson, Beth A. Peterson
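A minimal sketch of the depopulate command, assuming each cached element records the identifier of the application that staged it and the identifier of the application that last accessed it.

```python
# Hypothetical sketch: drop cached elements whose recorded application identifiers
# match the identifier passed with the depopulate command.
def depopulate(cache, app_id):
    """cache maps element_id -> (stager_app_id, last_access_app_id); both are checked."""
    victims = [eid for eid, (stager, last) in cache.items()
               if app_id in (stager, last)]
    for eid in victims:
        del cache[eid]
    return victims
```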
  • Patent number: 11086784
    Abstract: Provided are a computer program product, system, and method for invalidating track format information for tracks in cache. Demoted tracks demoted from the cache are indicated in a demoted track list. Track format information is saved for the demoted tracks. The track format information indicates a layout of data in the demoted tracks, wherein the track format information for the demoted tracks is used when the demoted tracks are staged back into the cache. An operation is initiated to invalidate a metadata track of the metadata tracks in the storage. Demoted tracks indicated in the demoted track list having metadata in the metadata track to invalidate are removed. The track format information for the demoted tracks having metadata in the metadata track to invalidate is removed.
    Type: Grant
    Filed: January 14, 2020
    Date of Patent: August 10, 2021
    Assignee: International Business Machines Corporation
    Inventors: Kyler A. Anderson, Kevin J. Ash, Lokesh M. Gupta, Matthew J. Kalos
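A small sketch of the invalidation step, assuming a lookup from each demoted track to the metadata track that describes it; the data structures are invented for illustration.

```python
# Hypothetical sketch: when a metadata track is invalidated, drop the demoted
# tracks (and their saved track-format information) that depend on it.
def invalidate_metadata_track(metadata_track_id, demoted_list,
                              track_format_info, track_to_metadata):
    """track_to_metadata maps a track id to the metadata track that describes it."""
    stale = [t for t in demoted_list
             if track_to_metadata.get(t) == metadata_track_id]
    for track in stale:
        demoted_list.remove(track)
        track_format_info.pop(track, None)
```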
  • Patent number: 11080622
    Abstract: Provided are a computer program product, system, and method for determining sectors of a track to stage into cache by training a machine learning module. A machine learning module receives as input performance attributes of system components affected by staging tracks from the storage to the cache and outputs a staging strategy, comprising one of a plurality of staging strategies, indicating at least one of a plurality of sectors of a track to stage into the cache. A margin of error is determined based on a current value of a performance attribute and a threshold of the performance attribute. An adjusted staging strategy is determined based on the margin of error. The machine learning module is retrained with current performance attributes to output the adjusted staging strategy.
    Type: Grant
    Filed: August 1, 2018
    Date of Patent: August 3, 2021
    Assignee: International Business Machines Corporation
    Inventors: Lokesh M. Gupta, Kyler A. Anderson, Matthew G. Borlick, Kevin J. Ash
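A loose sketch of the adjustment loop, with assumed strategy names, margin bounds, and model interface; it only illustrates the shape of "margin of error drives an adjusted strategy, which drives retraining".

```python
# Hypothetical sketch: compute a margin of error from a performance attribute and
# its threshold, pick an adjusted staging strategy, then retrain toward it.
STRATEGIES = ["first_sector", "requested_sectors", "full_track"]  # assumed names

def adjusted_strategy(current_value, threshold, current_strategy):
    margin = (threshold - current_value) / threshold   # positive margin = headroom
    idx = STRATEGIES.index(current_strategy)
    if margin > 0.25 and idx < len(STRATEGIES) - 1:
        return STRATEGIES[idx + 1]     # plenty of headroom: stage more of the track
    if margin < 0.05 and idx > 0:
        return STRATEGIES[idx - 1]     # near the threshold: stage less
    return current_strategy

def retrain(model, performance_attributes, target_strategy):
    """model.fit is assumed to accept attribute/label pairs for one training step."""
    model.fit([performance_attributes], [STRATEGIES.index(target_strategy)])
```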
  • Patent number: 11080397
    Abstract: Provided are a computer program product, system, and method for using trap cache segments to detect malicious processes. A trap cache segment is added to the cache for data in the storage and indicated as a trap cache segment. Cache segments are added to the cache having data from the storage that are not indicated as trap cache segments. A memory function call from a process executing in the computer system reads data from a region of a memory device to output the read data to a buffer of the memory device. A determination is made as to whether the region of the memory device includes the trap cache segment. The memory function call is blocked and the process is treated as a potentially malicious process in response to determining that the region includes the trap cache segment.
    Type: Grant
    Filed: September 12, 2018
    Date of Patent: August 3, 2021
    Assignee: International Business Machines Corporation
    Inventors: Brian A. Rinaldi, Clint A. Hardy, Lokesh M. Gupta, Kevin J. Ash
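A simplified sketch of the trap check: if the memory region being read overlaps a trap cache segment, the read is blocked and the calling process is flagged. The handler and interfaces are hypothetical.

```python
# Hypothetical sketch: mark some cache segments as traps; a memory read that
# touches a trap region flags the calling process as potentially malicious.
def read_region(region_segments, trap_segments, do_read, process):
    """Block the read and flag the process if the region overlaps a trap segment."""
    if any(seg in trap_segments for seg in region_segments):
        quarantine(process)        # hypothetical handler for a suspect process
        return None                # the memory function call is blocked
    return do_read(region_segments)

def quarantine(process):
    print(f"process {process} flagged as potentially malicious")
```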
  • Patent number: 11080149
    Abstract: Provided are a computer program product, system, and method for restoring tracks in cache. A restore operation is initiated to restore a track in the cache from a non-volatile storage to which tracks in the cache are backed-up. The non-volatile storage includes a current version of the track and wherein a previous version of the track subject to the restore operation is stored in a first location in the cache. A second location in the cache is allocated for the current version of the track to restore from the non-volatile storage. The data for the current version of the track is transferred from the non-volatile storage to the second location in the cache. Data for the track is merged from the second location into the first location in the cache to complete restoring to the current version of the track in the first location from the non-volatile storage.
    Type: Grant
    Filed: June 18, 2019
    Date of Patent: August 3, 2021
    Assignee: INTERNATIONAL BUSINESS MACHINES CORPORATION
    Inventors: Kyler A. Anderson, Kevin J. Ash, Matthew G. Borlick, Lokesh M. Gupta
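A toy, runnable sketch of the restore-and-merge flow, using plain dictionaries in place of real cache locations and the non-volatile backup.

```python
# Hypothetical sketch: the current version of a track is staged from the
# non-volatile backup into a second cache location and then merged into the
# first location holding the previous version.
def restore_track(cache, nvs_backup, track_id):
    """cache maps location -> sector dict; nvs_backup maps track_id -> sector dict."""
    first_loc = ("track", track_id)                 # location of the previous version
    second_loc = ("restore", track_id)              # newly allocated second location
    cache[second_loc] = dict(nvs_backup[track_id])  # transfer current version from NVS
    cache[first_loc].update(cache.pop(second_loc))  # merge into the first location

cache = {("track", 7): {0: b"old", 1: b"old"}}
nvs = {7: {1: b"new"}}
restore_track(cache, nvs, 7)
assert cache[("track", 7)] == {0: b"old", 1: b"new"}
```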
  • Publication number: 20210232973
    Abstract: Provided are a computer program product, system, and method for determining sectors of a track to stage into cache using a machine learning module. Performance attributes of system components affected by staging tracks from the storage to the cache are provided to a machine learning module. An output is received, from the machine learning module having processed the provided performance attributes, indicating a staging strategy indicating sectors of a track to stage into the cache comprising one of a plurality of staging strategies. Sectors of an accessed track that is not in the cache are staged into the cache according to the staging strategy indicated in the output.
    Type: Application
    Filed: April 12, 2021
    Publication date: July 29, 2021
    Inventors: Lokesh M. Gupta, Kyler A. Anderson, Matthew G. Borlick, Kevin J. Ash
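A hedged sketch of the inference path: the module's output selects one of several assumed staging strategies, which then decides which sectors are staged. The strategy names and storage interface are not from the patent.

```python
# Hypothetical sketch: map current performance attributes to a staging strategy,
# then stage the sectors that strategy calls for.
STRATEGIES = ("requested_sectors_only", "requested_to_end_of_track", "full_track")

def choose_staging_strategy(model, performance_attributes):
    """model.predict is assumed to return an index into STRATEGIES."""
    return STRATEGIES[model.predict(performance_attributes)]

def stage_track(cache, storage, track_id, requested_sectors, strategy):
    """storage is assumed to expose sectors_per_track and read(track, sector)."""
    if strategy == "requested_sectors_only":
        sectors = requested_sectors
    elif strategy == "requested_to_end_of_track":
        sectors = range(min(requested_sectors), storage.sectors_per_track)
    else:
        sectors = range(storage.sectors_per_track)
    cache[track_id] = {s: storage.read(track_id, s) for s in sectors}
```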
  • Patent number: 11074185
    Abstract: Provided are a computer program product, system, and method for adjusting a number of insertion points used to determine locations in a cache list at which to indicate tracks. Tracks added to the cache are indicated in a cache list. The cache list has a least recently used (LRU) end and a most recently used (MRU) end. In response to indicating an insertion point interval number of tracks of the cache in the cache list, an insertion point is set to indicate one of the tracks of the insertion point interval number of tracks indicated in the cache list. Insertion points to tracks in the cache list are used to determine locations in the cache list at which to indicate tracks in the cache in the cache list.
    Type: Grant
    Filed: August 7, 2019
    Date of Patent: July 27, 2021
    Assignee: International Business Machines Corporation
    Inventors: Lokesh M. Gupta, Kyler A. Anderson, Kevin J. Ash, Matthew J. Kalos
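A minimal sketch of the interval rule: after an assumed number of tracks have been added at the MRU end, the most recently added track becomes a new insertion point. The interval value and class layout are assumptions.

```python
# Hypothetical sketch: set a new insertion point each time an assumed interval
# number of tracks has been added to the cache list.
INSERTION_POINT_INTERVAL = 100   # assumed interval; not specified in the abstract

class CacheList:
    def __init__(self):
        self.tracks = []            # MRU end is the end of the list
        self.insertion_points = []  # indexes of tracks that serve as insertion points
        self.added_since_point = 0

    def add_to_mru(self, track_id):
        self.tracks.append(track_id)
        self.added_since_point += 1
        if self.added_since_point == INSERTION_POINT_INTERVAL:
            self.insertion_points.append(len(self.tracks) - 1)
            self.added_since_point = 0
```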
  • Patent number: 11074118
    Abstract: A method for reporting incidents of data loss in a storage environment comprising redundant arrays of independent disks (RAIDs) is disclosed. In one embodiment, such a method monitors storage drive failures in a storage environment. For a storage drive failure detected in the storage environment, the method reports the RAID type in which the storage drive failure occurred and whether data loss occurred in the RAID as a result of the storage drive failure. In certain embodiments, the method reports whether the data loss could have been prevented had the RAID type been converted to a more robust RAID type. In other or the same embodiments, the method reports whether the data loss was prevented by the RAID type. A corresponding system and computer program product are also disclosed.
    Type: Grant
    Filed: June 15, 2019
    Date of Patent: July 27, 2021
    Assignee: International Business Machines Corporation
    Inventors: Lokesh M. Gupta, Matthew G. Borlick, Karl A. Nielsen, Clint A. Hardy, Brian A. Rinaldi
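An illustrative sketch of the failure report, using a deliberately simplified table of how many concurrent drive failures each RAID type tolerates; the table and field names are assumptions.

```python
# Hypothetical sketch: on a drive failure, report the RAID type, whether the
# failure caused data loss, and whether a more robust RAID type would have survived.
FAILURES_TOLERATED = {"RAID-0": 0, "RAID-5": 1, "RAID-6": 2}   # simplified model

def report_drive_failure(raid_type, concurrent_failures):
    lost = concurrent_failures > FAILURES_TOLERATED[raid_type]
    preventable = lost and concurrent_failures <= max(FAILURES_TOLERATED.values())
    return {"raid_type": raid_type, "data_loss": lost,
            "preventable_by_more_robust_raid": preventable}

print(report_drive_failure("RAID-5", concurrent_failures=2))
```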
  • Patent number: 11074197
    Abstract: A computational device receives indications of a minimum retention time and a maximum retention time in cache for a first plurality of tracks, wherein no indications of a minimum retention time or a maximum retention time in the cache are received for a second plurality of tracks. A cache management application demotes a track of the first plurality of tracks from the cache, in response to determining that the track is a least recently used (LRU) track in a LRU list of tracks in the cache and the track has been in the cache for a time that exceeds the minimum retention time. The cache management application demotes the track of the first plurality of tracks, in response to determining that the track has been in the cache for a time that exceeds the maximum retention time.
    Type: Grant
    Filed: June 26, 2018
    Date of Patent: July 27, 2021
    Assignee: INTERNATIONAL BUSINESS MACHINES CORPORATION
    Inventors: Lokesh M. Gupta, Joseph Hayward, Kyler A. Anderson, Matthew G. Borlick
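A small sketch of the two retention checks, assuming each track record carries its cache-entry time, optional minimum and maximum retention times, and whether it is currently the LRU track; the dictionary layout is invented for illustration.

```python
import time

# Hypothetical sketch: demote the LRU track only once its minimum retention time
# has passed, and demote any track that has exceeded its maximum retention time.
def may_demote(track, now=None):
    """track is assumed to carry cached_at plus optional min/max retention seconds."""
    now = now if now is not None else time.monotonic()
    age = now - track["cached_at"]
    if track.get("max_retention") is not None and age > track["max_retention"]:
        return True                   # overstayed: demote regardless of position
    if track.get("min_retention") is not None and age < track["min_retention"]:
        return False                  # protected: skip even if it is the LRU track
    return track["is_lru"]            # otherwise only the LRU track is demoted
```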