Patents by Inventor Kevin J. Ash

Kevin J. Ash has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11940920
    Abstract: Provided are a computer program product, system, and method for determining tracks to prestage into cache from a storage. Information related to determining tracks to prestage from the storage to the cache in a stage group of sequential tracks, including a trigger track comprising a track number in the stage group at which to start prestaging tracks, and Input/Output (I/O) activity information are provided to a machine learning module. A new trigger track in the stage group at which to start prestaging tracks is received from the machine learning module having processed the provided information. The trigger track is set to the new trigger track. Tracks are prestaged in response to processing an access request to the trigger track in the stage group.
    Type: Grant
    Filed: June 20, 2018
    Date of Patent: March 26, 2024
    Assignee: International Business Machines Corporation
    Inventors: Lokesh M. Gupta, Kyler A. Anderson, Matthew G. Borlick, Kevin J. Ash
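    The abstract above (shared with patent 11663129 further down this list) describes prestaging driven by a trigger track that a machine learning module keeps adjusting. Below is a minimal sketch of that idea; the class names, the callable standing in for the machine learning module, the dict-like cache, and the fixed stage-group size are illustrative assumptions, not the patented implementation.
    ```python
    # Illustrative sketch: prestage a group of sequential tracks once an access
    # reaches a trigger track, with a pluggable model suggesting a new trigger
    # track from recent I/O activity. All names are hypothetical.

    class StageGroup:
        def __init__(self, first_track, size=128, trigger_offset=100):
            self.first_track = first_track
            self.size = size
            self.trigger_offset = trigger_offset  # offset at which prestage starts

    class Prestager:
        def __init__(self, cache, storage, model):
            self.cache, self.storage, self.model = cache, storage, model

        def on_access(self, group, track, io_stats):
            # Ask the (stand-in) machine learning module for a new trigger track
            # based on the current trigger and recent I/O activity information.
            group.trigger_offset = self.model(group.trigger_offset, io_stats)
            # Prestage the next stage group when the access reaches the trigger.
            if track - group.first_track >= group.trigger_offset:
                next_group = StageGroup(group.first_track + group.size,
                                        group.size, group.trigger_offset)
                for t in range(next_group.first_track,
                               next_group.first_track + next_group.size):
                    if t not in self.cache:
                        self.cache[t] = self.storage.read(t)
                return next_group
            return group
    ```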
  • Patent number: 11822482
    Abstract: Provided is a computer program product for managing tracks in a storage in a cache. An active track data structure indicates tracks in the cache that have an active status. An active bit in a cache control block for a track is set to indicate active for the track indicated as active in the active track data structure. In response to processing the cache control block, a determination is made, from the cache control block for the track, whether the track is active or inactive to determine processing for the cache control block.
    Type: Grant
    Filed: November 10, 2022
    Date of Patent: November 21, 2023
    Inventors: Lokesh Mohan Gupta, Kyler A. Anderson, Kevin J. Ash, Matthew J. Kalos, Brian Anthony Rinaldi, Beth Ann Peterson, Matthew G. Borlick
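    The abstract above (the same disclosure also appears below as publications 20230075922 and 20230036755 and patent 11550726) describes an active track data structure mirrored into an active bit in each cache control block. A minimal sketch follows; the field names and the set-based active track structure are assumptions for illustration only.
    ```python
    # Illustrative sketch: an active track data structure plus an active bit in
    # each cache control block, consulted when the control block is processed.

    from dataclasses import dataclass

    @dataclass
    class CacheControlBlock:
        track_id: int
        active: bool = False  # active bit, set from the active track structure
        data: bytes = b""

    class Cache:
        def __init__(self):
            self.control_blocks = {}    # track_id -> CacheControlBlock
            self.active_tracks = set()  # active track data structure

        def mark_active(self, track_id):
            self.active_tracks.add(track_id)
            ccb = self.control_blocks.get(track_id)
            if ccb:
                ccb.active = True       # set the active bit for the track

        def mark_inactive(self, track_id):
            self.active_tracks.discard(track_id)
            ccb = self.control_blocks.get(track_id)
            if ccb:
                ccb.active = False

        def process(self, track_id):
            ccb = self.control_blocks[track_id]
            # The active/inactive decision is made from the control block itself,
            # without rescanning the active track data structure.
            return "process-active" if ccb.active else "process-inactive"
    ```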
  • Patent number: 11797448
    Abstract: A computer-implemented method, according to one embodiment, includes: in response to a determination that an available capacity of one or more buffers in a primary cache is not outside a predetermined range, using the one or more buffers in the primary cache to satisfy all incoming I/O requests. In response to a determination that the available capacity of the one or more buffers in the primary cache is outside the predetermined range, one or more buffers in a secondary cache are allocated, and the one or more buffers in the secondary cache are used to satisfy at least some of the incoming I/O requests.
    Type: Grant
    Filed: July 5, 2022
    Date of Patent: October 24, 2023
    Assignee: International Business Machines Corporation
    Inventors: Beth Ann Peterson, Kevin J. Ash, Lokesh Mohan Gupta, Warren Keith Stanley, Roger G. Hathorn
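    The abstract above (the same family appears below as publication 20220334970 and patent 11474941) describes falling back to secondary-cache buffers when primary-cache buffer capacity leaves a predetermined range. A minimal sketch follows; the buffer counts, the single low-watermark check standing in for the predetermined range, and the routing rule are assumptions.
    ```python
    # Illustrative sketch: satisfy I/O from primary-cache buffers until their
    # available capacity falls outside a predetermined range, then allocate and
    # use secondary-cache buffers for some of the incoming requests.

    class BufferedCache:
        def __init__(self, primary_free, low_watermark, secondary_pool):
            self.primary_free = primary_free      # free primary-cache buffers
            self.low_watermark = low_watermark    # lower bound of the allowed range
            self.secondary_pool = secondary_pool  # allocatable secondary buffers
            self.secondary_free = 0

        def primary_outside_range(self):
            return self.primary_free < self.low_watermark

        def route_request(self, request):
            if not self.primary_outside_range():
                self.primary_free -= 1
                return ("primary", request)
            # Allocate secondary-cache buffers on demand and use them for at
            # least some of the incoming I/O requests.
            if self.secondary_free == 0 and self.secondary_pool > 0:
                grabbed = min(16, self.secondary_pool)
                self.secondary_pool -= grabbed
                self.secondary_free += grabbed
            if self.secondary_free > 0:
                self.secondary_free -= 1
                return ("secondary", request)
            self.primary_free -= 1
            return ("primary", request)
    ```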
  • Publication number: 20230306111
    Abstract: Provided are a computer program product, system, and method for using trap cache segments to detect malicious processes. A trap cache segment is added to the cache for data in the storage and indicated as a trap cache segment. Cache segments are added to the cache having data from the storage that are not indicated as trap cache segments. A memory function call from a process executing in the computer system reads data from a region of a memory device to output the read data to a buffer of the memory device. A determination is made as to whether the region of the memory device includes the trap cache segment. The memory function call is blocked and the process is treated as a potentially malicious process in response to determining that the region includes the trap cache segment.
    Type: Application
    Filed: May 30, 2023
    Publication date: September 28, 2023
    Inventors: Brian A. Rinaldi, Clint A. Hardy, Lokesh M. Gupta, Kevin J. Ash
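    The abstract above (granted as patent 11681799, also listed below) describes trap cache segments used to flag processes whose memory reads touch regions they should never access. A minimal sketch follows; the segment layout, the read function, and the flagging mechanism are hypothetical.
    ```python
    # Illustrative sketch: cache segments flagged as traps are never handed to
    # normal I/O, so a memory read whose source region overlaps a trap segment
    # is blocked and the calling process is flagged as potentially malicious.

    class TrapDetectingCache:
        def __init__(self):
            self.segments = {}    # segment start address -> (length, is_trap)
            self.flagged = set()  # process ids treated as potentially malicious

        def add_segment(self, addr, length, is_trap=False):
            self.segments[addr] = (length, is_trap)

        def _region_has_trap(self, start, size):
            end = start + size
            return any(is_trap and addr < end and start < addr + length
                       for addr, (length, is_trap) in self.segments.items())

        def mem_read(self, process_id, src, size):
            # Block the call and treat the process as potentially malicious when
            # the source region includes a trap cache segment.
            if self._region_has_trap(src, size):
                self.flagged.add(process_id)
                raise PermissionError("read overlaps a trap cache segment")
            return bytes(size)  # placeholder for copying data to the output buffer
    ```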
  • Patent number: 11768773
    Abstract: Provided are I/O request type specific cache directories in accordance with the present description. In one embodiment, by limiting track entries of a cache directory to a specific I/O request type, the size of the cache directory may be reduced as compared to general cache directories for I/O requests of all types, for example. As a result, look-up operations directed to such smaller size I/O request type specific cache directories may be completed in each directory more quickly. In addition, look-ups may frequently be successfully completed after a look-up of a single I/O request type specific cache directory, improving the speed of cache look-ups and providing a significant improvement in system performance. Other aspects and advantages are provided, depending upon the particular application.
    Type: Grant
    Filed: March 3, 2020
    Date of Patent: September 26, 2023
    Assignee: INTERNATIONAL BUSINESS MACHINES CORPORATION
    Inventors: Gail Spear, Lokesh Mohan Gupta, Kevin J. Ash, Kyler A. Anderson
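    The abstract above describes keeping a separate, smaller cache directory per I/O request type so most look-ups probe only one small directory. A minimal sketch follows; the request-type names and the dictionary-per-type representation are assumptions.
    ```python
    # Illustrative sketch: one small cache directory per I/O request type, so a
    # look-up usually probes only the directory for that request type instead of
    # a single large directory covering every type.

    class TypedCacheDirectories:
        def __init__(self, request_types=("read", "write", "prestage")):
            self.directories = {t: {} for t in request_types}  # type -> {track: entry}

        def insert(self, request_type, track, entry):
            self.directories[request_type][track] = entry

        def lookup(self, request_type, track):
            # Fast path: probe only the directory specific to this request type.
            entry = self.directories[request_type].get(track)
            if entry is not None:
                return entry
            # Fallback: probe the remaining, smaller directories only if needed.
            for t, directory in self.directories.items():
                if t != request_type and track in directory:
                    return directory[track]
            return None
    ```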
  • Patent number: 11704209
    Abstract: Provided are a computer program product, system, and method for using a track format code in a cache control block for a track in a cache to process read and write requests to the track in the cache. A track format table associates track format codes with track format metadata. A determination is made as to whether the track format table has track format metadata matching the track format metadata of a track staged into the cache. When the track format table has the matching track format metadata, the track format code associated with that metadata is determined from the track format table, and a cache control block for the track being added to the cache is generated that includes the determined track format code.
    Type: Grant
    Filed: February 2, 2022
    Date of Patent: July 18, 2023
    Assignee: INTERNATIONAL BUSINESS MACHINES CORPORATION
    Inventors: Kyler A. Anderson, Kevin J. Ash, Lokesh M. Gupta, Matthew J. Kalos, Beth A. Peterson
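    The abstract above describes replacing per-track format metadata in the cache control block with a compact code looked up in a track format table. A minimal sketch follows; representing metadata as a hashable value and the dictionary-based table are assumptions.
    ```python
    # Illustrative sketch: a track format table maps compact track format codes to
    # track format metadata; when a staged track's metadata matches an entry, the
    # code is stored in the track's cache control block instead of the metadata.

    class TrackFormatTable:
        def __init__(self):
            self.code_to_metadata = {}
            self.metadata_to_code = {}

        def add(self, code, metadata):
            self.code_to_metadata[code] = metadata
            self.metadata_to_code[metadata] = code

        def code_for(self, metadata):
            return self.metadata_to_code.get(metadata)  # None if no match

    def build_cache_control_block(track_id, metadata, table):
        code = table.code_for(metadata)
        if code is not None:
            # Matching metadata: record only the compact track format code.
            return {"track": track_id, "format_code": code}
        # No match: the control block must carry the full metadata (or be handled
        # by a slower path); this sketch simply stores the metadata directly.
        return {"track": track_id, "format_metadata": metadata}
    ```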
  • Patent number: 11681799
    Abstract: Provided are a computer program product, system, and method for using trap cache segments to detect malicious processes. A trap cache segment is added to the cache for data in the storage and indicated as a trap cache segment. Cache segments are added to the cache having data from the storage that are not indicated as trap cache segments. A memory function call from a process executing in the computer system reads data from a region of a memory device to output the read data to a buffer of the memory device. A determination is made as to whether the region of the memory device includes the trap cache segment. The memory function call is blocked and the process is treated as a potentially malicious process in response to determining that the region includes the trap cache segment.
    Type: Grant
    Filed: December 23, 2020
    Date of Patent: June 20, 2023
    Assignee: INTERNATIONAL BUSINESS MACHINES CORPORATION
    Inventors: Brian A. Rinaldi, Clint A. Hardy, Lokesh M. Gupta, Kevin J. Ash
  • Patent number: 11663144
    Abstract: A method for improving cache hit ratios for selected storage elements within a storage system includes storing, in a cache of a storage system, non-favored storage elements and favored storage elements. The favored storage elements are retained in the cache longer than the non-favored storage elements. The method maintains a first LRU list containing entries associated with non-favored storage elements and designating an order in which the non-favored storage elements are evicted from the cache, and a second LRU list containing entries associated with favored storage elements and designating an order in which the favored storage elements are evicted from the cache. The method periodically scans the first LRU list for non-favored storage elements that have changed to favored storage elements, and scans the second LRU list for favored storage elements that have changed to non-favored storage elements. A corresponding system and computer program product are also disclosed.
    Type: Grant
    Filed: January 20, 2020
    Date of Patent: May 30, 2023
    Assignee: International Business Machines Corporation
    Inventors: Lokesh M. Gupta, Kevin J. Ash, Matthew G. Borlick, Beth A. Peterson
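    The abstract above describes separate LRU lists for favored and non-favored storage elements plus periodic scans that move entries whose status has changed. A minimal sketch follows; the OrderedDict-based lists and the eviction preference shown are assumptions.
    ```python
    # Illustrative sketch: separate LRU lists for favored and non-favored storage
    # elements, with a periodic scan that moves entries whose favored status changed.

    from collections import OrderedDict

    class FavoredAwareCache:
        def __init__(self):
            self.non_favored = OrderedDict()  # LRU order: oldest first
            self.favored = OrderedDict()

        def access(self, element, is_favored):
            lru = self.favored if is_favored else self.non_favored
            lru.pop(element, None)
            lru[element] = is_favored         # re-insert as most recently used

        def evict_one(self):
            # Favored elements are retained longer by evicting non-favored first.
            source = self.non_favored if self.non_favored else self.favored
            return source.popitem(last=False) if source else None

        def rescan(self, is_favored_now):
            # Periodic scan: move elements whose favored status has changed.
            for element in list(self.non_favored):
                if is_favored_now(element):
                    del self.non_favored[element]
                    self.favored[element] = True
            for element in list(self.favored):
                if not is_favored_now(element):
                    del self.favored[element]
                    self.non_favored[element] = False
    ```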
  • Patent number: 11663129
    Abstract: Provided are a computer program product, system, and method for determining tracks to prestage into cache from a storage. Information related to determining tracks to prestage from the storage to the cache in a stage group of sequential tracks, including a trigger track comprising a track number in the stage group at which to start prestaging tracks, and Input/Output (I/O) activity information are provided to a machine learning module. A new trigger track in the stage group at which to start prestaging tracks is received from the machine learning module having processed the provided information. The trigger track is set to the new trigger track. Tracks are prestaged in response to processing an access request to the trigger track in the stage group.
    Type: Grant
    Filed: January 7, 2021
    Date of Patent: May 30, 2023
    Assignee: INTERNATIONAL BUSINESS MACHINES CORPORATION
    Inventors: Lokesh M. Gupta, Kyler A. Anderson, Matthew G. Borlick, Kevin J. Ash
  • Patent number: 11620226
    Abstract: A method to prevent starvation of non-favored volumes in cache is disclosed. In one embodiment, such a method includes storing, in a cache of a storage system, non-favored storage elements and favored storage elements. A cache demotion algorithm is used to retain the favored storage elements in the cache longer than the non-favored storage elements. The method designates a maximum amount of storage space that the favored storage elements are permitted to consume in the cache. In preparation to free storage space in the cache, the method determines whether an amount of storage space consumed by the favored storage elements in the cache has reached the maximum amount. If so, the method frees storage space in the cache by demoting favored storage elements. If not, the method frees storage space in the cache in accordance with the cache demotion algorithm. A corresponding system and computer program product are also disclosed.
    Type: Grant
    Filed: February 9, 2020
    Date of Patent: April 4, 2023
    Assignee: International Business Machines Corporation
    Inventors: Lokesh M. Gupta, Kevin J. Ash, Matthew G. Borlick, Beth A. Peterson
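    The abstract above describes capping the cache space that favored storage elements may consume so non-favored elements are not starved. A minimal sketch of that decision follows; the function and method names are hypothetical.
    ```python
    # Illustrative sketch: cap the space favored elements may occupy; when the cap
    # is reached, free space by demoting favored elements instead of following the
    # normal demotion algorithm that would otherwise starve non-favored elements.

    def free_space(cache, favored_bytes, favored_cap_bytes):
        """Choose which demotion path to use when space must be freed.

        cache             -- object exposing demote_favored() / demote_normal()
        favored_bytes     -- space currently consumed by favored elements
        favored_cap_bytes -- designated maximum space for favored elements
        """
        if favored_bytes >= favored_cap_bytes:
            return cache.demote_favored()  # favored elements have hit their cap
        return cache.demote_normal()       # usual cache demotion algorithm
    ```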
  • Patent number: 11620219
    Abstract: In one embodiment, storage drive dependent track removal processing logic performs destage tasks for tracks cached in a cache as a function of whether the storage drive is classified as a fast class or a slow class of storage drives, for example. In one embodiment, a destage task configured for a slow class storage drive transfers an entry for a track selected for destaging from a main cache list to a wait cache list to await destaging to the slow class drive. A destage task configured for a fast class storage drive allows the cache list entry for the selected track to remain on the main cache list while the selected track is being destaged to the fast class storage drive, thereby bypassing the transfer of the entry to a wait cache list. Other features and aspects may be realized, depending upon the particular application.
    Type: Grant
    Filed: November 19, 2020
    Date of Patent: April 4, 2023
    Assignee: INTERNATIONAL BUSINESS MACHINES CORPORATION
    Inventors: Kevin J. Ash, Matthew G. Borlick, Lokesh M. Gupta, Trung N. Nguyen
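    The abstract above describes destage handling that differs by drive class: slow-class destages move the cache list entry to a wait list, while fast-class destages leave it on the main list. A minimal sketch follows; the synchronous write call and the deque-based lists are simplifying assumptions.
    ```python
    # Illustrative sketch: destage handling that depends on the drive class. For a
    # slow-class drive the cache list entry moves to a wait list until the destage
    # completes; for a fast-class drive the entry stays on the main cache list.

    from collections import deque

    class DestageManager:
        def __init__(self):
            self.main_list = deque()  # cache list entries (track ids)
            self.wait_list = deque()  # entries awaiting destage to slow drives

        def destage(self, track, drive_is_fast, write_to_drive):
            if drive_is_fast:
                # Fast class: destage in place; the entry remains on the main list.
                write_to_drive(track)
                return "stayed-on-main-list"
            # Slow class: park the entry on the wait list while the destage runs.
            self.main_list.remove(track)
            self.wait_list.append(track)
            write_to_drive(track)
            self.wait_list.remove(track)  # destage finished; leave the wait list
            return "via-wait-list"
    ```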
  • Publication number: 20230075922
    Abstract: Provided is a computer program product for managing tracks in a storage in a cache. An active track data structure indicates tracks in the cache that have an active status. An active bit in a cache control block for a track is set to indicate active for the track indicated as active in the active track data structure. In response to processing the cache control block, a determination is made, from the cache control block for the track, whether the track is active or inactive to determine processing for the cache control block.
    Type: Application
    Filed: November 10, 2022
    Publication date: March 9, 2023
    Inventors: Lokesh Mohan Gupta, Kyler A. Anderson, Kevin J. Ash, Matthew J. Kalos, Brian Anthony Rinaldi, Beth Ann Peterson, Matthew G. Borlick
  • Publication number: 20230036755
    Abstract: Provided is a computer program product for managing tracks in a storage in a cache. An active track data structure indicates tracks in the cache that have an active status. An active bit in a cache control block for a track is set to indicate active for the track indicated as active in the active track data structure. In response to processing the cache control block, a determination is made, from the cache control block for the track, whether the track is active or inactive to determine processing for the cache control block.
    Type: Application
    Filed: July 29, 2021
    Publication date: February 2, 2023
    Inventors: Lokesh Mohan Gupta, Kyler A. Anderson, Kevin J. Ash, Matthew J. Kalos, Brian Anthony Rinaldi, Beth Ann Peterson, Matthew G. Borlick
  • Publication number: 20230036075
    Abstract: Provided are a computer program product, system, and method for indicating extents of tracks in mirroring queues based on information gathered on tracks in extents in cache. Extent information on an extent of tracks in a cache indicated in an active cache list is processed in response to destaging a track from the active cache list to add it to a demote list used to determine tracks to remove from the cache. The extent information is related to a number of modified tracks in an extent destaged from the active cache list. The extent information for the extent is used to determine one of a plurality of mirroring queues to indicate the extent including modified tracks. A mirroring queue having a higher priority than another mirroring queue is processed at a higher rate to determine extents of tracks to mirror from the cache to the secondary storage.
    Type: Application
    Filed: October 7, 2022
    Publication date: February 2, 2023
    Inventors: Lokesh Mohan Gupta, Kevin J. Ash, Kyler A. Anderson, Matthew J. Kalos
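    The abstract above (granted as patent 11494304, also listed below) describes routing extents to prioritized mirroring queues based on how many modified tracks each extent holds. A minimal sketch follows; the three-queue layout, the thresholds, and the drain schedule are assumptions.
    ```python
    # Illustrative sketch: when a track is destaged off the active cache list, the
    # extent it belongs to is placed on one of several mirroring queues according
    # to how many modified tracks the extent holds; higher-priority queues are
    # drained at a higher rate.

    from collections import deque

    class MirroringQueues:
        def __init__(self, thresholds=(32, 8)):
            # queues[0] is highest priority; thresholds map modified-track counts
            # to queues (counts >= 32 -> queue 0, >= 8 -> queue 1, else queue 2).
            self.queues = [deque(), deque(), deque()]
            self.thresholds = thresholds

        def enqueue_extent(self, extent_id, modified_track_count):
            for i, t in enumerate(self.thresholds):
                if modified_track_count >= t:
                    self.queues[i].append(extent_id)
                    return i
            self.queues[-1].append(extent_id)
            return len(self.queues) - 1

        def next_extent_to_mirror(self, tick):
            # Give queue 0 the first chance on two of every three ticks; this
            # stands in for "processed at a higher rate" in the abstract.
            order = [0, 1, 2] if tick % 3 else [1, 0, 2]
            for i in order:
                if self.queues[i]:
                    return self.queues[i].popleft()
            return None
    ```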
  • Patent number: 11550732
    Abstract: A method for maintaining statistics for data elements in a cache is disclosed. The method maintains a heterogeneous cache comprising a higher performance portion and a lower performance portion. The method maintains, within the lower performance portion, a ghost cache containing statistics for data elements that are currently contained in the heterogeneous cache, and data elements that have been demoted from the heterogeneous cache within a specified time interval. The method calculates a size of the ghost cache based on an amount of frequently accessed data that is stored in backend storage volumes behind the heterogeneous cache. The method alters the size of the ghost cache as the amount of frequently accessed data changes. A corresponding system and computer program product are also disclosed.
    Type: Grant
    Filed: February 22, 2020
    Date of Patent: January 10, 2023
    Assignee: International Business Machines Corporation
    Inventors: Lokesh M. Gupta, Kyler A. Anderson, Kevin J. Ash, Matthew G. Borlick
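    The abstract above describes a ghost cache whose size follows the amount of frequently accessed data behind the heterogeneous cache. A minimal sketch follows; the entries-per-gigabyte scaling and the oldest-entry trimming are assumptions.
    ```python
    # Illustrative sketch: a ghost cache keeps statistics for resident tracks and
    # for recently demoted tracks, and its size tracks the amount of frequently
    # accessed ("hot") data on the backend volumes behind the cache.

    class GhostCache:
        def __init__(self, entries_per_gb=1024):
            self.entries_per_gb = entries_per_gb
            self.max_entries = 0
            self.stats = {}  # track id -> access count (insertion ordered)

        def resize(self, hot_data_gb):
            # Grow or shrink with the amount of frequently accessed backend data.
            self.max_entries = int(hot_data_gb * self.entries_per_gb)
            while len(self.stats) > self.max_entries:
                self.stats.pop(next(iter(self.stats)))  # drop the oldest entry

        def record_access(self, track_id):
            self.stats[track_id] = self.stats.get(track_id, 0) + 1
            if len(self.stats) > self.max_entries:
                self.stats.pop(next(iter(self.stats)))
    ```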
  • Patent number: 11550726
    Abstract: Provided is a computer program product for managing tracks in a storage in a cache. An active track data structure indicates tracks in the cache that have an active status. An active bit in a cache control block for a track is set to indicate active for the track indicated as active in the active track data structure. In response to processing the cache control block, a determination is made, from the cache control block for the track, whether the track is active or inactive to determine processing for the cache control block.
    Type: Grant
    Filed: July 29, 2021
    Date of Patent: January 10, 2023
    Assignee: INTERNATIONAL BUSINESS MACHINES CORPORATION
    Inventors: Lokesh Mohan Gupta, Kyler A. Anderson, Kevin J. Ash, Matthew J. Kalos, Brian Anthony Rinaldi, Beth Ann Peterson, Matthew G. Borlick
  • Patent number: 11494304
    Abstract: Provided are a computer program product, system, and method for indicating extents of tracks in mirroring queues based on information gathered on tracks in extents in cache. Extent information on an extent of tracks in a cache indicated in an active cache list is processed in response to destaging a track from the active cache list to add it to a demote list used to determine tracks to remove from the cache. The extent information is related to a number of modified tracks in an extent destaged from the active cache list. The extent information for the extent is used to determine one of a plurality of mirroring queues to indicate the extent including modified tracks. A mirroring queue having a higher priority than another mirroring queue is processed at a higher rate to determine extents of tracks to mirror from the cache to the secondary storage.
    Type: Grant
    Filed: March 13, 2020
    Date of Patent: November 8, 2022
    Assignee: INTERNATIONAL BUSINESS MACHINES CORPORATION
    Inventors: Lokesh Mohan Gupta, Kevin J. Ash, Kyler A. Anderson, Matthew J. Kalos
  • Patent number: 11494309
    Abstract: A list of a first type of tracks in a cache is generated. A list of a second type of tracks in the cache is generated, wherein I/O operations are completed relatively faster to the first type of tracks than to the second type of tracks. A determination is made as to whether to demote a track from the list of the first type of tracks or from the list of the second type of tracks.
    Type: Grant
    Filed: April 26, 2021
    Date of Patent: November 8, 2022
    Assignee: INTERNATIONAL BUSINESS MACHINES CORPORATION
    Inventors: Kyler A. Anderson, Kevin J. Ash, Lokesh M. Gupta
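    The abstract above describes keeping separate lists for tracks with relatively fast and relatively slow I/O completion and choosing which list to demote from. A minimal sketch follows; the balance-the-longer-list rule is an assumption, not the patented selection criterion.
    ```python
    # Illustrative sketch: two demotion lists, one for tracks whose I/O completes
    # quickly and one for slower tracks, with a simple rule choosing which list to
    # demote from next.

    from collections import deque

    class TwoListDemoter:
        def __init__(self):
            self.fast_tracks = deque()  # first type: relatively fast I/O completion
            self.slow_tracks = deque()  # second type: relatively slow I/O completion

        def add(self, track, is_fast):
            (self.fast_tracks if is_fast else self.slow_tracks).append(track)

        def demote_one(self):
            # Demote from whichever list is currently longer, falling back to the
            # non-empty list when one of them is empty.
            if len(self.fast_tracks) >= len(self.slow_tracks) and self.fast_tracks:
                return self.fast_tracks.popleft()
            if self.slow_tracks:
                return self.slow_tracks.popleft()
            return self.fast_tracks.popleft() if self.fast_tracks else None
    ```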
  • Publication number: 20220334970
    Abstract: A computer-implemented method, according to one embodiment, includes: in response to a determination that an available capacity of one or more buffers in a primary cache is not outside a predetermined range, using the one or more buffers in the primary cache to satisfy all incoming I/O requests. In response to a determination that the available capacity of the one or more buffers in the primary cache is outside the predetermined range, one or more buffers in a secondary cache are allocated, and the one or more buffers in the secondary cache are used to satisfy at least some of the incoming I/O requests.
    Type: Application
    Filed: July 5, 2022
    Publication date: October 20, 2022
    Inventors: Beth Ann Peterson, Kevin J. Ash, Lokesh Mohan Gupta, Warren Keith Stanley, Roger G. Hathorn
  • Patent number: 11474941
    Abstract: A computer-implemented method, according to one approach, includes: receiving a stream of incoming I/O requests, all of which are satisfied using one or more buffers in a primary cache. However, in response to determining that the available capacity of the one or more buffers in the primary cache is outside a predetermined range: one or more buffers in the secondary cache are allocated. These one or more buffers in the secondary cache are used to satisfy at least some of the incoming I/O requests, while the one or more buffers in the primary cache are used to satisfy a remainder of the incoming I/O requests. Moreover, in response to determining that the available capacity of the one or more buffers in the primary cache is not outside the predetermined range: the one or more buffers in the primary cache are again used to satisfy all of the incoming I/O requests.
    Type: Grant
    Filed: March 9, 2020
    Date of Patent: October 18, 2022
    Assignee: International Business Machines Corporation
    Inventors: Beth Ann Peterson, Kevin J. Ash, Lokesh Mohan Gupta, Warren Keith Stanley, Roger G. Hathorn