Patents by Inventor Cagdas Dirik

Cagdas Dirik has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20230238046
    Abstract: An energy-efficient and area-efficient mitigation of errors in a memory media device caused by row hammer attacks and the like is described. The detection of errors is deterministically performed while maintaining, in an SRAM, a number of row access counters that is smaller than the total number of rows protected in the memory media device. The reduction in the number of required counters is achieved by aliasing a plurality of the protected rows to each counter. The mitigation may be implemented on a per-bank, per-channel, or per-memory-media-device basis. The memory media device may be DRAM.
    Type: Application
    Filed: September 9, 2022
    Publication date: July 27, 2023
    Applicant: Micron Technology, Inc.
    Inventors: Edmund Gieske, Cagdas Dirik, Robert M. Walker, Sujeet Ayyapureddi, Niccolo Izzo, Markus Geiger, Yang Lu, Ameen Akel, Elliott C. Cooper-Balis, Danilo Caraccio
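
Below is a minimal Python sketch of the counter-aliasing idea in publication 20230238046: many protected rows map onto a small fixed pool of counters, and an activation that pushes a row's aliased counter past a threshold triggers mitigation (for example, refreshing physically adjacent rows). The pool size, aliasing function, and threshold are illustrative assumptions, not values from the filing.

```python
# Hypothetical sketch: many protected rows alias onto a small counter pool,
# so the SRAM only needs NUM_COUNTERS entries instead of one per row.

NUM_COUNTERS = 1024           # assumed SRAM counter pool size (much smaller than row count)
MITIGATION_THRESHOLD = 4096   # assumed activation threshold before mitigation fires


class AliasedRowCounters:
    def __init__(self, num_counters=NUM_COUNTERS, threshold=MITIGATION_THRESHOLD):
        self.counters = [0] * num_counters   # models the per-bank SRAM counter array
        self.threshold = threshold

    def _index(self, row_addr):
        # Aliasing function: every protected row maps to exactly one counter.
        return row_addr % len(self.counters)

    def on_activate(self, row_addr):
        """Called on each row activation; returns True if mitigation is needed."""
        idx = self._index(row_addr)
        self.counters[idx] += 1
        if self.counters[idx] >= self.threshold:
            self.counters[idx] = 0           # reset after mitigation is issued
            return True                      # e.g., refresh physically adjacent rows
        return False


if __name__ == "__main__":
    tracker = AliasedRowCounters()
    hot_row = 0x1F3A
    hits = sum(tracker.on_activate(hot_row) for _ in range(10_000))
    print(f"mitigations triggered for row {hot_row:#x}: {hits}")
```

Because several rows share a counter, an aliased count can only overestimate any single row's activation count, so the scheme trades occasional extra mitigations for a much smaller counter array.
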
  • Publication number: 20230236735
    Abstract: Systems and methods for area-efficient mitigation of errors that are caused by row hammer attacks and the like in a memory media device are described. The counters for counting row accesses are maintained in a content addressable memory (CAM) that provides fast access times. The detection of errors is deterministically performed while maintaining a number of row access counters that is smaller than the total number of rows protected in the memory media device. The circuitry for the detection and mitigation may be in the memory media device or in a memory controller to which the memory media device attaches. The memory media device may be dynamic random access memory (DRAM).
    Type: Application
    Filed: August 29, 2022
    Publication date: July 27, 2023
    Applicant: Micron Technology, Inc.
    Inventors: Sujeet Ayyapureddi, Yang Lu, Edmund Gieske, Cagdas Dirik, Ameen D. Akel, Elliott C. Cooper-Balis, Amitava Majumdar, Danilo Caraccio, Robert M. Walker
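
A companion sketch for publication 20230236735, assuming the CAM behaves like a small, fully associative table keyed by row address; the table size, threshold, and lowest-count eviction policy are assumptions made for illustration.

```python
# Hypothetical sketch of a CAM-like row-access counter table: a small,
# fully associative structure keyed by row address. Table size, threshold,
# and the eviction policy (drop the smallest count) are assumptions.

TABLE_ENTRIES = 256
THRESHOLD = 4096


class CamRowCounter:
    def __init__(self, entries=TABLE_ENTRIES, threshold=THRESHOLD):
        self.table = {}          # row_addr -> access count (models CAM entries)
        self.entries = entries
        self.threshold = threshold

    def on_activate(self, row_addr):
        """Returns True when the row's count crosses the mitigation threshold."""
        if row_addr not in self.table and len(self.table) >= self.entries:
            # Table full: evict the entry with the lowest count to make room.
            victim = min(self.table, key=self.table.get)
            del self.table[victim]
        self.table[row_addr] = self.table.get(row_addr, 0) + 1
        if self.table[row_addr] >= self.threshold:
            del self.table[row_addr]     # entry retired once mitigation is issued
            return True
        return False


if __name__ == "__main__":
    cam = CamRowCounter(entries=4, threshold=3)
    print([cam.on_activate(0xA0) for _ in range(3)])   # [False, False, True]
```
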
  • Publication number: 20230205701
    Abstract: A method includes receiving, at a direct memory access (DMA) controller of a memory device, a first command from a first cache controller coupled to the memory device to prefetch first data from the memory device and sending the prefetched first data, in response to receiving the first command, to a second cache controller coupled to the memory device. The method can further include receiving a second command from a second cache controller coupled to the memory device to prefetch second data from the memory device, and sending the prefetched second data, in response to receiving the second command, to a third cache controller coupled to the memory device.
    Type: Application
    Filed: March 6, 2023
    Publication date: June 29, 2023
    Inventors: Laurent Isenegger, Robert M. Walker, Cagdas Dirik
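
The prefetch flow in publication 20230205701 can be pictured roughly as follows; the class names and the byte-slice "memory device" are hypothetical, and the sketch only shows a DMA controller taking a prefetch command and handing the fetched data to a different cache controller.

```python
# Hypothetical sketch: a memory-side DMA controller that services prefetch
# commands from one cache controller and pushes the fetched data to another.
# Class and method names are illustrative, not taken from the filing.

class CacheController:
    def __init__(self, name):
        self.name = name
        self.received = []

    def deliver(self, addr, data):
        # Destination controller receives the prefetched data.
        self.received.append((addr, data))


class DmaController:
    def __init__(self, memory):
        self.memory = memory                      # backing memory device contents

    def prefetch(self, addr, length, destination):
        """Handle a prefetch command: read from memory, send to the destination controller."""
        data = self.memory[addr:addr + length]
        destination.deliver(addr, data)


if __name__ == "__main__":
    memory = bytes(range(256))
    dma = DmaController(memory)
    second = CacheController("L2")
    third = CacheController("L3")
    dma.prefetch(0x10, 16, destination=second)   # first command, issued by a first controller
    dma.prefetch(0x40, 16, destination=third)    # second command, routed to a third controller
    print(second.received, third.received)
```
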
  • Publication number: 20230102184
    Abstract: Exemplary methods, apparatuses, and systems include detecting an operation to write dirty data to a cache. The cache is divided into a plurality of channels. In response to the operation, the dirty data is written to a first cache line in the cache, the first cache line being accessed via a first channel. Additionally, a redundant copy of the dirty data is written to a second cache line in the cache. The second cache line serves as a redundant write buffer and is accessed via a second channel, the first and second channels differing from one another. A metadata entry for the second cache line is updated to reference a location of the dirty data in the first cache line.
    Type: Application
    Filed: December 7, 2022
    Publication date: March 30, 2023
    Inventors: Cagdas Dirik, Robert M. Walker
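
A rough sketch of the redundant write-buffer idea in publication 20230102184: a dirty write lands in its primary cache line on one channel, a copy lands in a line on a different channel, and that second line's metadata records where the primary dirty data lives. The two-channel layout and field names are assumptions.

```python
# Hypothetical sketch: on a dirty write, store the data in its primary cache
# line on one channel and mirror it into a redundant write-buffer line on a
# different channel, recording the primary location in the buffer's metadata.

class ChanneledCache:
    def __init__(self, num_channels=2, lines_per_channel=4):
        self.lines = {(ch, ln): None
                      for ch in range(num_channels)
                      for ln in range(lines_per_channel)}
        self.metadata = {}        # redundant line -> location of the primary dirty line
        self.num_channels = num_channels

    def write_dirty(self, channel, line, data):
        primary = (channel, line)
        redundant = ((channel + 1) % self.num_channels, line)  # a line on a different channel
        self.lines[primary] = data                 # normal dirty write
        self.lines[redundant] = data               # redundant copy for recovery
        self.metadata[redundant] = primary         # metadata points back at the dirty line
        return primary, redundant


if __name__ == "__main__":
    cache = ChanneledCache()
    print(cache.write_dirty(0, 2, b"dirty-block"))
```
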
  • Patent number: 11599472
    Abstract: A method includes receiving, at a direct memory access (DMA) controller of a memory device, a first command from a first cache controller coupled to the memory device to prefetch first data from the memory device and sending the prefetched first data, in response to receiving the first command, to a second cache controller coupled to the memory device. The method can further include receiving a second command from a second cache controller coupled to the memory device to prefetch second data from the memory device, and sending the prefetched second data, in response to receiving the second command, to a third cache controller coupled to the memory device.
    Type: Grant
    Filed: September 1, 2021
    Date of Patent: March 7, 2023
    Assignee: Micron Technology, Inc.
    Inventors: Laurent Isenegger, Robert M. Walker, Cagdas Dirik
  • Publication number: 20230064745
    Abstract: An access tracker configured to receive a request to access a page, determine whether a page identification (ID) associated with the page is in the access tracker, increment an access count of the page in response to determining the page ID is in the access tracker, sort a number of page IDs based on an access count of each page ID, and determine whether a different page is hot or cold in response to sorting the number of page IDs.
    Type: Application
    Filed: August 25, 2021
    Publication date: March 2, 2023
    Inventors: Cagdas Dirik, Robert M. Walker, Elliott C. Cooper-Balis
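
The access tracker in publication 20230064745 might look roughly like this in Python: count accesses per page ID, evict the least-accessed entry when the tracker is full, and classify a page as hot if it ranks inside an assumed top-N after sorting by count. The capacity and hot-set size are illustrative.

```python
# Hypothetical sketch of the access tracker: count accesses per page ID,
# sort by count, and classify pages as hot or cold by rank. The tracker
# capacity and the hot-set size are assumptions for illustration.

class AccessTracker:
    def __init__(self, capacity=1024, hot_set_size=64):
        self.counts = {}              # page_id -> access count
        self.capacity = capacity
        self.hot_set_size = hot_set_size

    def record_access(self, page_id):
        if page_id not in self.counts and len(self.counts) >= self.capacity:
            # Tracker full: drop the least-accessed page to make room.
            coldest = min(self.counts, key=self.counts.get)
            del self.counts[coldest]
        self.counts[page_id] = self.counts.get(page_id, 0) + 1

    def is_hot(self, page_id):
        """A page is 'hot' if it ranks inside the top hot_set_size by access count."""
        ranked = sorted(self.counts, key=self.counts.get, reverse=True)
        return page_id in ranked[:self.hot_set_size]


if __name__ == "__main__":
    tracker = AccessTracker(capacity=8, hot_set_size=2)
    for page in [1, 1, 1, 2, 2, 3]:
        tracker.record_access(page)
    print(tracker.is_hot(1), tracker.is_hot(3))   # True False
```
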
  • Publication number: 20230063747
    Abstract: A method includes receiving, at a direct memory access (DMA) controller of a memory device, a first command from a first cache controller coupled to the memory device to prefetch first data from the memory device and sending the prefetched first data, in response to receiving the first command, to a second cache controller coupled to the memory device.
    Type: Application
    Filed: September 1, 2021
    Publication date: March 2, 2023
    Inventors: Laurent Isenegger, Robert M. Walker, Cagdas Dirik
  • Patent number: 11550725
    Abstract: Exemplary methods, apparatuses, and systems include detecting an operation to write dirty data to a cache. The cache is divided into a plurality of channels. In response to the operation, the dirty data is written to a first cache line in the cache, the first cache line being accessed via a first channel. Additionally, a redundant copy of the dirty data is written to a second cache line in the cache. The second cache line serves as a redundant write buffer and is accessed via a second channel, the first and second channels differing from one another. A metadata entry for the second cache line is updated to reference a location of the dirty data in the first cache line.
    Type: Grant
    Filed: May 18, 2020
    Date of Patent: January 10, 2023
    Assignee: Micron Technology, Inc.
    Inventors: Cagdas Dirik, Robert M. Walker
  • Patent number: 11397683
    Abstract: Systems and methods are disclosed including a first memory device, a second memory device coupled to the first memory device, where the second memory device has a lower access latency than the first memory device and acts as a cache for the first memory device. A processing device operatively coupled to the first and second memory devices can track access statistics of segments of data stored at the second memory device, the segments having a first granularity, and determine to update, based on the access statistics, a segment of data stored at the second memory device from the first granularity to a second granularity. The processing device can further retrieve additional data associated with the segment of data from the first memory device and store the additional data at the second memory device to form a new segment having the second granularity.
    Type: Grant
    Filed: August 26, 2020
    Date of Patent: July 26, 2022
    Assignee: Micron Technology, Inc.
    Inventors: Horia C. Simionescu, Paul Stonelake, Chung Kuang Chin, Narasimhulu Dharanikumar Kotte, Robert M. Walker, Cagdas Dirik
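
A sketch of the granularity-promotion idea in patent 11,397,683: segments cached in the faster device start at a small granularity, and once a segment has been accessed an assumed number of times, the surrounding data is pulled from the slower device to form a larger segment. The sizes, threshold, and data structures are assumptions.

```python
# Hypothetical sketch: segments cached in the faster device start small; once
# a segment is accessed often enough, surrounding data is fetched from the
# slower device to form a larger-granularity segment.

SMALL_GRANULARITY = 64      # bytes, assumed
LARGE_GRANULARITY = 256     # bytes, assumed
PROMOTE_THRESHOLD = 8       # accesses before a segment is widened, assumed


class TieredMemory:
    def __init__(self, slow_memory):
        self.slow = slow_memory            # the larger, slower device
        self.fast = {}                     # base_addr -> (data, granularity)
        self.hits = {}                     # base_addr -> access count

    def access(self, addr):
        base = addr - (addr % SMALL_GRANULARITY)
        if base not in self.fast:
            # Fill from the slow device at the small granularity.
            self.fast[base] = (self.slow[base:base + SMALL_GRANULARITY], SMALL_GRANULARITY)
        self.hits[base] = self.hits.get(base, 0) + 1
        if self.hits[base] == PROMOTE_THRESHOLD:
            # Promote: fetch the surrounding data to build a larger segment.
            wide_base = addr - (addr % LARGE_GRANULARITY)
            self.fast[wide_base] = (self.slow[wide_base:wide_base + LARGE_GRANULARITY],
                                    LARGE_GRANULARITY)
        return self.fast[base][0]


if __name__ == "__main__":
    tm = TieredMemory(bytes(1024))
    for _ in range(PROMOTE_THRESHOLD):
        tm.access(0x100)
    print({base: gran for base, (_, gran) in tm.fast.items()})
```
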
  • Patent number: 11301383
    Abstract: A method is described for managing the issuance and fulfillment of memory commands. The method includes receiving, by a cache controller of a memory subsystem, a first memory command corresponding to a set of memory devices. In response, the cache controller adds the first memory command to a cache controller command queue such that the cache controller command queue stores a first set of memory commands and sets a priority of the first memory command to either a high or low priority based on (1) whether the first memory command is of a first or second type and (2) an origin of the first memory command.
    Type: Grant
    Filed: July 14, 2020
    Date of Patent: April 12, 2022
    Assignee: Micron Technology, Inc.
    Inventors: Patrick A. La Fratta, Cagdas Dirik, Laurent Isenegger, Robert M. Walker
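
The priority rule in patent 11,301,383 can be illustrated with a small priority queue; the concrete policy below (host-originated reads are high priority, everything else low) is an assumption standing in for the type-and-origin test described in the abstract.

```python
import heapq

# Hypothetical sketch: a cache controller queues incoming memory commands and
# marks each high or low priority from (1) the command type and (2) its origin.
# The concrete rule (host reads are high priority) is an assumption.

HIGH, LOW = 0, 1     # smaller value pops first from the heap


class CacheControllerQueue:
    def __init__(self):
        self._queue = []
        self._seq = 0                      # keeps FIFO order within a priority level

    def submit(self, command_type, origin, payload):
        # Assumed policy: reads coming directly from the host are urgent;
        # everything else (writebacks, internally generated traffic) can wait.
        priority = HIGH if (command_type == "read" and origin == "host") else LOW
        heapq.heappush(self._queue, (priority, self._seq, payload))
        self._seq += 1

    def next_command(self):
        return heapq.heappop(self._queue)[2] if self._queue else None


if __name__ == "__main__":
    q = CacheControllerQueue()
    q.submit("write", "internal", "evict line 12")
    q.submit("read", "host", "load addr 0x80")
    print(q.next_command())   # the host read is issued first
```
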
  • Publication number: 20220083236
    Abstract: The present disclosure includes apparatuses and methods related to a memory system with cache line data. An example apparatus can store data in a number of cache lines in the cache, wherein each of the number of lines includes a number of chunks of data that are individually accessible.
    Type: Application
    Filed: November 29, 2021
    Publication date: March 17, 2022
    Inventors: Cagdas Dirik, Robert M. Walker
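
A sketch of the individually accessible chunks described in publication 20220083236: each cache line is modeled as a few chunks with per-chunk valid bits, so a single chunk can be filled or read without touching the rest of the line. The chunk count is an assumption.

```python
# Hypothetical sketch: each cache line holds several chunks that can be read
# or filled individually, tracked by per-chunk valid bits.

CHUNKS_PER_LINE = 4   # assumed


class ChunkedCacheLine:
    def __init__(self, chunks_per_line=CHUNKS_PER_LINE):
        self.chunks = [None] * chunks_per_line   # chunk data, filled on demand
        self.valid = [False] * chunks_per_line   # per-chunk valid bits

    def fill_chunk(self, index, data):
        # Only the requested chunk is brought in; the rest of the line is untouched.
        self.chunks[index] = data
        self.valid[index] = True

    def read_chunk(self, index):
        if not self.valid[index]:
            raise LookupError(f"chunk {index} not present in this line")
        return self.chunks[index]


if __name__ == "__main__":
    line = ChunkedCacheLine()
    line.fill_chunk(2, b"\xde\xad\xbe\xef")
    print(line.read_chunk(2), line.valid)
```
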
  • Publication number: 20220019533
    Abstract: A method is described for managing the issuance and fulfillment of memory commands. The method includes receiving, by a cache controller of a memory subsystem, a first memory command corresponding to a set of memory devices. In response, the cache controller adds the first memory command to a cache controller command queue such that the cache controller command queue stores a first set of memory commands and sets a priority of the first memory command to either a high or low priority based on (1) whether the first memory command is of a first or second type and (2) an origin of the first memory command.
    Type: Application
    Filed: July 14, 2020
    Publication date: January 20, 2022
    Applicant: Micron Technology, Inc.
    Inventors: Patrick A. La Fratta, Cagdas Dirik, II, Laurent Isenegger, Robert M. Walker
  • Patent number: 11188234
    Abstract: The present disclosure includes apparatuses and methods related to a memory system with cache line data. An example apparatus can store data in a number of cache lines in the cache, wherein each of the number of lines includes a number of chunks of data that are individually accessible.
    Type: Grant
    Filed: August 30, 2017
    Date of Patent: November 30, 2021
    Assignee: Micron Technology, Inc.
    Inventors: Cagdas Dirik, Robert M. Walker
  • Publication number: 20210357332
    Abstract: Exemplary methods, apparatuses, and systems include detecting an operation to write dirty data to a cache. The cache is divided into a plurality of channels. In response to the operation, the dirty data is written to a first cache line in the cache, the first cache line being accessed via a first channel. Additionally, a redundant copy of the dirty data is written to a second cache line in the cache. The second cache line serves as a redundant write buffer and is accessed via a second channel, the first and second channels differing from one another. A metadata entry for the second cache line is updated to reference a location of the dirty data in the first cache line.
    Type: Application
    Filed: May 18, 2020
    Publication date: November 18, 2021
    Inventors: Cagdas Dirik, Robert M. Walker
  • Publication number: 20210089454
    Abstract: Systems and methods are disclosed including a first memory device, a second memory device coupled to the first memory device, where the second memory device has a lower access latency than the first memory device and acts as a cache for the first memory device. A processing device operatively coupled to the first and second memory devices can track access statistics of segments of data stored at the second memory device, the segments having a first granularity, and determine to update, based on the access statistics, a segment of data stored at the second memory device from the first granularity to a second granularity. The processing device can further retrieve additional data associated with the segment of data from the first memory device and store the additional data at the second memory device to form a new segment having the second granularity.
    Type: Application
    Filed: August 26, 2020
    Publication date: March 25, 2021
    Inventors: Horia C. Simionescu, Paul Stonelake, Chung Kuang Chin, Narasimhulu Dharanikumar Kotte, Robert M. Walker, Cagdas Dirik
  • Publication number: 20190065072
    Abstract: The present disclosure includes apparatuses and methods related to a memory system with cache line data. An example apparatus can store data in a number of cache lines in the cache, wherein each of the number of lines includes a number of chunks of data that are individually accessible.
    Type: Application
    Filed: August 30, 2017
    Publication date: February 28, 2019
    Inventors: Cagdas Dirik, Robert M. Walker
  • Publication number: 20190065373
    Abstract: The present disclosure includes apparatuses and methods related to a cache buffer. An example apparatus can store data associated with a request in one of a number of buffers and service a subsequent request for data associated with the request using the one of the number of buffers. The subsequent request can be serviced while the request is being serviced by the cache controller.
    Type: Application
    Filed: August 30, 2017
    Publication date: February 28, 2019
    Inventors: Cagdas Dirik, Robert M. Walker
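
Finally, a sketch of the cache buffer in publication 20190065373: data tied to an in-flight request is staged in one of a small set of buffers, and a later request for the same address can be served from that buffer while the cache controller is still working on the original request. The buffer count and interfaces are assumptions.

```python
# Hypothetical sketch: data for an in-flight request is staged in a small set
# of buffers, and a subsequent request for the same address is served from the
# buffer instead of waiting on the cache controller.

class RequestBuffers:
    def __init__(self, num_buffers=4):
        self.buffers = [None] * num_buffers      # each slot: (addr, data) or None

    def stage(self, addr, data):
        """Place data for an in-flight request into a free buffer slot."""
        for i, slot in enumerate(self.buffers):
            if slot is None:
                self.buffers[i] = (addr, data)
                return i
        raise RuntimeError("no free buffer")

    def service(self, addr):
        """Serve a subsequent request from a buffer if the address is staged."""
        for slot in self.buffers:
            if slot is not None and slot[0] == addr:
                return slot[1]
        return None                              # miss: fall back to the cache controller

    def release(self, index):
        self.buffers[index] = None               # original request completed


if __name__ == "__main__":
    bufs = RequestBuffers()
    idx = bufs.stage(0x200, b"line-data")
    print(bufs.service(0x200))                   # served while the original request is in flight
    bufs.release(idx)
```
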