Patents by Inventor Gabriel H. Loh

Gabriel H. Loh has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20170177492
    Abstract: Systems, apparatuses, and methods for implementing a hybrid cache. A processor may include a hybrid L2/L3 cache which allows the processor to dynamically adjust a size of the L2 cache and a size of the L3 cache. In some embodiments, the processor may be a multi-core processor and there may be a single cache partitioned into a logical L2 cache and a logical L3 cache for use by the cores. In one embodiment, the processor may track the cache hit rates of the logical L2 and L3 caches and adjust the sizes of the logical L2 and L3 caches based on the cache hit rates. In another embodiment, the processor may adjust the sizes of the logical L2 and L3 caches based on which application is currently being executed by the processor.
    Type: Application
    Filed: December 17, 2015
    Publication date: June 22, 2017
    Inventor: Gabriel H. Loh
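
The dynamic L2/L3 resizing in publication 20170177492 above lends itself to a short illustration. The Python sketch below is a loose model under assumed parameters: the way pool size, the resize step, and the grow-the-lower-hit-rate heuristic are not taken from the publication. It splits a fixed pool of ways between a logical L2 and a logical L3 and periodically rebalances the split from observed hit rates.

```python
# Minimal sketch (not from the publication): a fixed pool of ways is split
# between a logical L2 and a logical L3 and periodically rebalanced based on
# the hit rates observed for each logical cache.

TOTAL_WAYS = 16      # assumed size of the shared physical cache, in ways
RESIZE_STEP = 1      # assumed granularity of each adjustment
MIN_WAYS = 2         # assumed lower bound for either logical cache

def rebalance(l2_ways, l2_hits, l2_accesses, l3_hits, l3_accesses):
    """Return a new L2 way count; the logical L3 gets the remaining ways."""
    l2_rate = l2_hits / l2_accesses if l2_accesses else 0.0
    l3_rate = l3_hits / l3_accesses if l3_accesses else 0.0
    if l2_rate < l3_rate and l2_ways + RESIZE_STEP <= TOTAL_WAYS - MIN_WAYS:
        return l2_ways + RESIZE_STEP          # grow L2, shrink L3
    if l3_rate < l2_rate and l2_ways - RESIZE_STEP >= MIN_WAYS:
        return l2_ways - RESIZE_STEP          # shrink L2, grow L3
    return l2_ways                            # comparable hit rates: no change

# Example: the logical L2 is missing more often, so it is given one more way.
print(rebalance(l2_ways=4, l2_hits=60, l2_accesses=100,
                l3_hits=90, l3_accesses=100))   # -> 5
```
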
  • Publication number: 20170160955
    Abstract: A die-stacked hybrid memory device implements a first set of one or more memory dies implementing first memory cell circuitry of a first memory architecture type and a second set of one or more memory dies implementing second memory cell circuitry of a second memory architecture type different than the first memory architecture type. The die-stacked hybrid memory device further includes a set of one or more logic dies electrically coupled to the first and second sets of one or more memory dies, the set of one or more logic dies comprising a memory interface and a page migration manager, the memory interface coupleable to a device external to the die-stacked hybrid memory device, and the page migration manager to transfer memory pages between the first set of one or more memory dies and the second set of one or more memory dies.
    Type: Application
    Filed: November 16, 2016
    Publication date: June 8, 2017
    Inventors: Nuwan S. Jayasena, Gabriel H. Loh, James M. O'Connor, Niladrish Chatterjee
  • Patent number: 9672161
    Abstract: The described embodiments include a cache controller that configures a cache management mechanism. In the described embodiments, the cache controller is configured to monitor at least one structure associated with a cache to determine at least one cache block that may be accessed during a future access in the cache. Based on the determination of the at least one cache block that may be accessed during a future access in the cache, the cache controller configures the cache management mechanism.
    Type: Grant
    Filed: December 9, 2012
    Date of Patent: June 6, 2017
    Assignee: ADVANCED MICRO DEVICES, INC.
    Inventors: Gabriel H. Loh, Yasuko Eckert
  • Publication number: 20170132147
    Abstract: A processing device includes a cache implementing a set of at least three cache slices. Each cache slice is to store a corresponding set of cache lines. The cache further includes cache control logic coupled to the set of at least three cache slices. The cache control logic is to map addresses of an address space to the cache such that each address within the address space maps to a corresponding strict subset of two or more cache slices of the set of cache slices.
    Type: Application
    Filed: November 6, 2015
    Publication date: May 11, 2017
    Inventor: Gabriel H. Loh
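
The slice mapping in publication 20170132147 above can be pictured with a small sketch. The slice count, the bit-slicing hashes, and the choice of exactly two slices per address are assumptions for illustration; the publication only requires that each address map to a strict subset of two or more slices out of a set of at least three.

```python
# Minimal sketch (assumed details, not from the publication): each block
# address is mapped to a strict subset of two slices out of NUM_SLICES >= 3,
# using two different address hashes so the pair varies across the space.

NUM_SLICES = 4   # assumed slice count (the publication requires at least three)

def slice_subset(block_address):
    """Return the strict subset (here: a pair) of slice indices for an address."""
    index_bits = block_address >> 6                      # drop an assumed 64-byte offset
    first = index_bits % NUM_SLICES                      # hash 0: low index bits
    second = (index_bits // NUM_SLICES) % (NUM_SLICES - 1)   # hash 1: higher bits
    if second >= first:              # skip over `first` so the two slices differ
        second += 1
    return {first, second}

# A block may be placed in, and looked up from, either slice of its subset.
print(slice_subset(0x7F00_1234_5678))
```
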
  • Publication number: 20170085472
    Abstract: A communication device includes a data source that generates data for transmission over a bus, and a data encoder that receives and encodes outgoing data. An encoder system receives outgoing data from the data source and stores the outgoing data in a first queue. An encoder encodes outgoing data with a header type that is based upon a header type indication from a controller and stores the encoded data, which may be a packet or a data word with at least one layered header, in a second queue for transmission. The device is configured to receive, at a payload extractor, a packet protocol change command from the controller, to remove the encoded data, to re-encode the data to create a re-encoded data packet, and to place the re-encoded data packet in the second queue for transmission.
    Type: Application
    Filed: September 21, 2015
    Publication date: March 23, 2017
    Applicant: ADVANCED MICRO DEVICES, INC.
    Inventors: David A. Roberts, Michael Ignatowski, Nuwan Jayasena, Gabriel H. Loh
  • Patent number: 9552294
    Abstract: The described embodiments include a main memory and a cache memory (or “cache”) with a cache controller that includes a mode-setting mechanism. In some embodiments, the mode-setting mechanism is configured to dynamically determine an access pattern for the main memory. Based on the determined access pattern, the mode-setting mechanism configures at least one region of the main memory in a write-back mode and configures other regions of the main memory in a write-through mode. In these embodiments, when performing a write operation in the cache memory, the cache controller determines whether a region in the main memory where the cache block is from is configured in the write-back mode or the write-through mode and then performs a corresponding write operation in the cache memory.
    Type: Grant
    Filed: January 7, 2013
    Date of Patent: January 24, 2017
    Assignee: ADVANCED MICRO DEVICES, INC.
    Inventors: Jaewoong Sim, Mithuna S. Thottethodi, Gabriel H. Loh
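
A rough model of the mode-setting mechanism in patent 9552294 above is sketched below. The region granularity, the write-fraction heuristic, and the threshold are assumptions; the patent states only that regions are configured write-back or write-through based on a dynamically determined access pattern.

```python
# Minimal sketch (assumed policy and names, not from the patent): regions whose
# recent accesses are write-heavy are configured write-back; the rest stay
# write-through. The cache write path consults the owning region's mode.

from collections import defaultdict

REGION_SIZE = 4096                 # assumed region granularity (one page)
WRITE_FRACTION_THRESHOLD = 0.5     # assumed cutoff for "write-heavy"

reads = defaultdict(int)
writes = defaultdict(int)
mode = defaultdict(lambda: 'write-through')   # assumed default mode

def record_access(address, is_write):
    region = address // REGION_SIZE
    if is_write:
        writes[region] += 1
    else:
        reads[region] += 1

def reconfigure():
    """Periodically re-derive each region's mode from its observed access pattern."""
    for region in set(reads) | set(writes):
        total = reads[region] + writes[region]
        frac = writes[region] / total if total else 0.0
        mode[region] = 'write-back' if frac >= WRITE_FRACTION_THRESHOLD else 'write-through'

def cache_write(address):
    region = address // REGION_SIZE
    if mode[region] == 'write-back':
        return 'mark block dirty, write memory later'
    return 'write cache and memory now'

record_access(0x1000, is_write=True)
record_access(0x1004, is_write=True)
record_access(0x2000, is_write=False)
reconfigure()
print(cache_write(0x1000), '|', cache_write(0x2000))
```
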
  • Patent number: 9535831
    Abstract: A die-stacked hybrid memory device implements a first set of one or more memory dies implementing first memory cell circuitry of a first memory architecture type and a second set of one or more memory dies implementing second memory cell circuitry of a second memory architecture type different than the first memory architecture type. The die-stacked hybrid memory device further includes a set of one or more logic dies electrically coupled to the first and second sets of one or more memory dies, the set of one or more logic dies comprising a memory interface and a page migration manager, the memory interface coupleable to a device external to the die-stacked hybrid memory device, and the page migration manager to transfer memory pages between the first set of one or more memory dies and the second set of one or more memory dies.
    Type: Grant
    Filed: January 10, 2014
    Date of Patent: January 3, 2017
    Assignee: Advanced Micro Devices, Inc.
    Inventors: Nuwan S. Jayasena, Gabriel H. Loh, James M. O'Connor, Niladrish Chatterjee
  • Publication number: 20160378655
    Abstract: Systems, apparatuses, and methods for sorting memory pages in a multi-level heterogeneous memory architecture. The system may classify pages into a first "hot" category or a second "cold" category. The system may attempt to place the "hot" pages into the memory level(s) closest to the system's processor cores. The system may track parameters associated with each page, with the parameters including number of accesses, types of accesses, power consumed per access, temperature, wearability, and/or other parameters. Based on these parameters, the system may generate a score for each page. Then, the system may compare the score of each page to a threshold. If the score of a given page is greater than the threshold, the given page may be designated as "hot". If the score of the given page is less than the threshold, the given page may be designated as "cold".
    Type: Application
    Filed: June 26, 2015
    Publication date: December 29, 2016
    Inventors: Sergey Blagodurov, Gabriel H. Loh, Mitesh R. Meswani
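
The hot/cold classification in publication 20160378655 above reduces to scoring and thresholding, as in the hypothetical sketch below. The parameter weights and the threshold value are assumptions; the publication names the tracked parameters but not how they are combined.

```python
# Minimal sketch (weights and threshold are assumptions, not from the
# publication): each page gets a score from its tracked parameters, pages above
# the threshold are "hot", and hot pages are candidates for the memory level
# nearest the processor cores.

HOT_THRESHOLD = 100.0   # assumed

# Assumed per-parameter weights.
WEIGHTS = {'accesses': 1.0, 'writes': 2.0, 'energy_per_access': -0.5, 'wear': -1.0}

def page_score(stats):
    return sum(WEIGHTS[name] * stats.get(name, 0.0) for name in WEIGHTS)

def classify(pages):
    """pages: dict of page_id -> parameter dict; returns (hot_ids, cold_ids)."""
    hot, cold = [], []
    for page_id, stats in pages.items():
        (hot if page_score(stats) > HOT_THRESHOLD else cold).append(page_id)
    return hot, cold

pages = {
    0x1000: {'accesses': 500, 'writes': 40, 'energy_per_access': 2.0, 'wear': 3.0},
    0x2000: {'accesses': 12,  'writes': 1,  'energy_per_access': 2.0, 'wear': 0.0},
}
print(classify(pages))   # page 0x1000 is "hot", page 0x2000 is "cold"
```
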
  • Patent number: 9529718
    Abstract: To transfer data efficiently from a cache to a memory, it is desirable that more of the data being written correspond to the memory page currently loaded in the row buffer. Writing data to a memory page that is not currently loaded in a row buffer requires closing an old page and opening a new page. Both operations consume energy and clock cycles and potentially delay more critical memory read requests. Hence it is desirable to have more than one write going to the same DRAM page to amortize the cost of opening and closing DRAM pages. A desirable approach is to batch write backs to the same DRAM page by retaining modified blocks in the cache until a sufficient number of modified blocks belonging to the same memory page are ready for write back.
    Type: Grant
    Filed: December 12, 2014
    Date of Patent: December 27, 2016
    Assignee: Advanced Micro Devices, Inc.
    Inventors: Syed Ali R. Jafri, Yasuko Eckert, Srilatha Manne, Mithuna S. Thottethodi, Gabriel H. Loh
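
The write-back batching in patent 9529718 above can be sketched as follows. The DRAM page size, the batch threshold, and the data structures are assumptions used only to show the idea of retaining modified blocks until several of them target the same DRAM page.

```python
# Minimal sketch (assumed structures, not from the patent): dirty cache blocks
# are retained and grouped by DRAM page; a page's blocks are written back only
# once enough of them have accumulated, amortizing the row open/close cost.

from collections import defaultdict

DRAM_PAGE_SIZE = 2048   # assumed row-buffer (DRAM page) size in bytes
BATCH_THRESHOLD = 4     # assumed number of dirty blocks that triggers a batch

pending = defaultdict(list)   # DRAM page id -> dirty block addresses awaiting write-back

def mark_dirty(block_address):
    page = block_address // DRAM_PAGE_SIZE
    pending[page].append(block_address)
    if len(pending[page]) >= BATCH_THRESHOLD:
        return write_back_batch(page)
    return []

def write_back_batch(page):
    """Drain all dirty blocks of one DRAM page in a single open-row burst."""
    blocks = pending.pop(page)
    # open row `page` once, write all blocks, close the row
    return blocks

# Example: the fourth dirty block in page 0 triggers one batched write-back.
for addr in (0x000, 0x040, 0x080, 0x0C0):
    batch = mark_dirty(addr)
print(batch)   # -> [0, 64, 128, 192]
```
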
  • Publication number: 20160359973
    Abstract: A technique for source-side memory request network admission control includes adjusting, by a first node, the rate at which the first node injects memory requests into a network coupled to a memory system. The first node adjusts its rate of injection according to an injection policy for the first node and memory request efficiency indicators. The injection policy may be based on an injection rate limit for the first node, including an injection rate limit per memory channel for the first node.
    Type: Application
    Filed: June 4, 2015
    Publication date: December 8, 2016
    Inventors: Gabriel H. Loh, Eric Christopher Morton
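
The source-side admission control in publication 20160359973 above can be approximated with a simple rate-limit adjustment loop. The efficiency indicator and the increase/decrease rule below are assumptions, not the publication's policy.

```python
# Minimal sketch (the efficiency indicator and adjustment rule are assumptions,
# not from the publication): a node's injection rate limit is raised while its
# memory requests are being serviced efficiently and lowered when they are not.

MAX_RATE = 64     # assumed ceiling on requests injected per time window
MIN_RATE = 1      # assumed floor

def adjust_rate(current_limit, efficiency):
    """efficiency: fraction of this node's recent requests deemed efficient
    (e.g., row-buffer hits), an assumed indicator in the range [0, 1]."""
    if efficiency > 0.75 and current_limit < MAX_RATE:
        return current_limit + 1                       # additive increase
    if efficiency < 0.25 and current_limit > MIN_RATE:
        return max(MIN_RATE, current_limit // 2)       # multiplicative decrease
    return current_limit

limit = 16
for eff in (0.9, 0.9, 0.1):
    limit = adjust_rate(limit, eff)
print(limit)   # 16 -> 17 -> 18 -> 9
```
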
  • Patent number: 9489321
    Abstract: A memory accessing agent includes a memory access generating circuit and a memory controller. The memory access generating circuit is adapted to generate multiple memory accesses in a first ordered arrangement. The memory controller is coupled to the memory access generating circuit and has an output port, for providing the multiple memory accesses to the output port in a second ordered arrangement based on the memory accesses and characteristics of an external memory. The memory controller determines the second ordered arrangement by calculating an efficient row burst value and interrupting multiple row-hit requests to schedule a row-miss request based on the efficient row burst value.
    Type: Grant
    Filed: June 13, 2013
    Date of Patent: November 8, 2016
    Assignee: Advanced Micro Devices, Inc.
    Inventors: James M. O'Connor, Niladrish Chatterjee, Nuwan S. Jayasena, Gabriel H. Loh
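
The reordering in patent 9489321 above, where a stream of row-hit requests is interrupted by a row-miss once an efficient row burst value is reached, is sketched below. The request representation and the scheduling loop are simplified assumptions for illustration.

```python
# Minimal sketch (assumed representation, not from the patent): requests that
# arrive in one order are issued in another, streaming at most
# `efficient_row_burst` row hits before a pending row-miss interrupts them.

def reorder(requests, efficient_row_burst):
    """requests: list of row ids in arrival order; returns the issue order."""
    pending = list(requests)
    order = []
    open_row = None
    burst = 0
    while pending:
        hits = [r for r in pending if r == open_row]
        misses = [r for r in pending if r != open_row]
        if hits and (burst < efficient_row_burst or not misses):
            nxt = hits[0]                 # keep streaming hits to the open row
            burst += 1
        else:
            nxt = misses[0]               # interrupt the burst with the oldest miss
            open_row = nxt
            burst = 1
        pending.remove(nxt)
        order.append(nxt)
    return order

# Row 7's hits are grouped, but only up to the burst value before row 3 is served.
print(reorder([7, 3, 7, 7, 7], efficient_row_burst=2))   # -> [7, 7, 3, 7, 7]
```
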
  • Patent number: 9477605
    Abstract: A system includes a first memory and a device coupleable to the first memory. The device includes a second memory to cache data from the first memory. The second memory includes a plurality of rows, each row including a corresponding set of compressed data blocks of non-uniform sizes and a corresponding set of tag blocks. Each tag block represents a corresponding compressed data block of the row. The device further includes decompression logic to decompress data blocks accessed from the second memory. The device further includes compression logic to compress data blocks to be stored in the second memory.
    Type: Grant
    Filed: July 11, 2013
    Date of Patent: October 25, 2016
    Assignee: Advanced Micro Devices, Inc.
    Inventors: Gabriel H. Loh, James M. O'Connor
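
The compressed-row organization in patent 9477605 above can be illustrated with a small packing and lookup sketch. The row size, tag size, offset bookkeeping, and the use of zlib as the compressor are all assumptions for illustration.

```python
# Minimal sketch (layout details are assumptions, not from the patent): each
# cache row holds a set of tag blocks plus compressed data blocks of
# non-uniform size, packed back to back until the row is full.

import zlib

ROW_SIZE = 2048       # assumed row capacity in bytes
TAG_SIZE = 8          # assumed fixed size of each tag block in bytes

def pack_row(blocks):
    """blocks: dict of tag -> uncompressed bytes. Returns (tags, offsets, payload)
    for one row, or None if the compressed blocks plus their tags do not fit."""
    tags, offsets, payload = [], [], b''
    for tag, data in blocks.items():
        compressed = zlib.compress(data)
        if (len(tags) + 1) * TAG_SIZE + len(payload) + len(compressed) > ROW_SIZE:
            return None
        tags.append(tag)
        offsets.append((len(payload), len(compressed)))
        payload += compressed
    return tags, offsets, payload

def read_block(row, tag):
    tags, offsets, payload = row
    start, length = offsets[tags.index(tag)]                 # tag lookup selects the block
    return zlib.decompress(payload[start:start + length])    # decompress on access

row = pack_row({0xA: b'A' * 512, 0xB: bytes(range(256)) * 2})
print(read_block(row, 0xA) == b'A' * 512)   # True
```
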
  • Patent number: 9454419
    Abstract: A method and a system are provided for partitioning a system data bus. The method can include partitioning off a portion of a system data bus that includes one or more faulty bits to form a partitioned data bus. Further, the method includes transferring data over the partitioned data bus to compensate for data loss due to the one or more faulty bits in the system data bus.
    Type: Grant
    Filed: September 3, 2013
    Date of Patent: September 27, 2016
    Assignee: Advanced Micro Devices, Inc.
    Inventors: Gabriel H. Loh, Yi Xu, James M. O'Connor
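
The faulty-lane partitioning in patent 9454419 above can be sketched as transferring a word over the remaining good lanes in multiple beats. The bus width, lane bookkeeping, and beat format below are assumptions.

```python
# Minimal sketch (assumed scheme, not from the patent): lanes containing faulty
# bits are partitioned off, and a word is transferred over the remaining good
# lanes in as many beats as needed to compensate for the lost width.

BUS_WIDTH = 16                       # assumed number of bit lanes
faulty_lanes = {3, 9}                # lanes partitioned off due to faulty bits
good_lanes = [i for i in range(BUS_WIDTH) if i not in faulty_lanes]

def send_word(word):
    """Split a BUS_WIDTH-bit word into beats that use only the good lanes."""
    bits = [(word >> i) & 1 for i in range(BUS_WIDTH)]
    beats = []
    for start in range(0, len(bits), len(good_lanes)):
        chunk = bits[start:start + len(good_lanes)]
        beats.append({lane: bit for lane, bit in zip(good_lanes, chunk)})
    return beats

def receive_word(beats):
    bits = [bit for beat in beats for bit in beat.values()]
    return sum(bit << i for i, bit in enumerate(bits[:BUS_WIDTH]))

word = 0xBEEF
assert receive_word(send_word(word)) == word
print(len(send_word(word)), 'beats over', len(good_lanes), 'good lanes')   # 2 beats over 14
```
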
  • Publication number: 20160246715
    Abstract: A memory module is responsive to control signaling for a random access memory (RAM) module, and performs translation of received memory addresses so that it can map a relatively small address space of an operating system to a larger physical address space of its storage arrays. The memory module can therefore be employed in systems requiring a large amount of memory, such as systems using many processors, without requiring specialized operating systems for addressing the larger physical address space.
    Type: Application
    Filed: February 23, 2015
    Publication date: August 25, 2016
    Inventors: Michael Ignatowski, Gabriel H. Loh, Nuwan S. Jayasena
  • Publication number: 20160232097
    Abstract: An integrated circuit (IC) package includes a stacked-die memory device. The stacked-die memory device includes a set of one or more stacked memory dies implementing memory cell circuitry. The stacked-die memory device further includes a set of one or more logic dies electrically coupled to the memory cell circuitry. The set of one or more logic dies includes a query controller and a memory controller. The memory controller is coupleable to at least one device external to the stacked-die memory device. The query controller is to perform a query operation on data stored in the memory cell circuitry responsive to a query command received from the external device.
    Type: Application
    Filed: February 5, 2016
    Publication date: August 11, 2016
    Inventors: Gabriel H. Loh, Nuwan S. Jayasena, James M. O'Connor, Yasuko Eckert
  • Publication number: 20160231933
    Abstract: A processor maintains a count of accesses to each memory page. When the accesses to a memory page exceed a threshold amount for that memory page, the processor sets an indicator for the page. Based on the indicators for the memory pages, the processor manages data at one or more levels of the processor's memory hierarchy.
    Type: Application
    Filed: February 6, 2015
    Publication date: August 11, 2016
    Inventors: Gabriel H. Loh, David A. Roberts, Mitesh R. Meswani, Mark R. Nutter, John R. Slice, Prashant Nair, Michael Ignatowski
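
The per-page tracking in publication 20160231933 above is essentially a counter, a per-page threshold, and an indicator bit, as in the hypothetical sketch below. The default threshold value and the data structures are assumptions.

```python
# Minimal sketch (assumed names, not from the publication): a counter is kept
# per page, each page has its own threshold, and exceeding the threshold sets
# an indicator that the memory hierarchy's management logic can consult.

DEFAULT_THRESHOLD = 64      # assumed default per-page threshold

counts = {}
thresholds = {}
indicators = set()          # pages whose access counts exceeded their thresholds

def touch(page):
    counts[page] = counts.get(page, 0) + 1
    if counts[page] > thresholds.get(page, DEFAULT_THRESHOLD):
        indicators.add(page)

thresholds[0x2000] = 2      # a page can be given a lower (or higher) threshold
for _ in range(3):
    touch(0x1000)
    touch(0x2000)
print(0x1000 in indicators, 0x2000 in indicators)   # False True
```
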
  • Patent number: 9406403
    Abstract: A memory subsystem employs spare memory cells external to one or more memory devices. In some embodiments, a processing system uses the spare memory cells to replace individual selected cells at the protected memory, whereby the selected cells are replaced on a cell-by-cell basis, rather than exclusively on a row-by-row, column-by-column, or block-by-block basis. This allows faulty memory cells to be replaced efficiently, thereby improving memory reliability and manufacturing yields, without requiring large blocks of spare memory cells.
    Type: Grant
    Filed: June 25, 2013
    Date of Patent: August 2, 2016
    Assignee: Advanced Micro Devices, Inc.
    Inventors: Gabriel H. Loh, Vilas Sridharan, James M. O'Connor
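
The cell-level sparing in patent 9406403 above can be modeled as a small remap table in front of the protected memory. The table layout and helper names below are assumptions for illustration.

```python
# Minimal sketch (assumed structures, not from the patent): a remap table
# redirects individual faulty cell addresses to spare cells held outside the
# protected memory, so replacement happens cell by cell rather than row by row,
# column by column, or block by block.

remap = {}        # faulty cell address -> index into the spare cell array
spares = []       # spare cell storage, external to the protected memory

def mark_faulty(cell_address):
    if cell_address not in remap:
        spares.append(0)                      # allocate one spare cell
        remap[cell_address] = len(spares) - 1

def write_cell(memory, cell_address, value):
    if cell_address in remap:
        spares[remap[cell_address]] = value   # redirected to the spare cell
    else:
        memory[cell_address] = value

def read_cell(memory, cell_address):
    return spares[remap[cell_address]] if cell_address in remap else memory[cell_address]

memory = [0] * 1024
mark_faulty(17)
write_cell(memory, 17, 5)
write_cell(memory, 18, 6)
print(read_cell(memory, 17), read_cell(memory, 18))   # 5 6
```
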
  • Patent number: 9405703
    Abstract: The described embodiments include a translation lookaside buffer (“TLB”) that is used for performing virtual address to physical address translations when making memory accesses in a memory in a computing device. In the described embodiments, the TLB includes a hierarchy of tables that are each used for performing virtual address to physical address translations based on the arrangement of pages of memory in corresponding regions of the memory. When performing a virtual address to physical address translation, the described embodiments perform a lookup in each of the tables in parallel for the virtual address to physical address translation and use a physical address that is returned from a lowest table in the hierarchy as the translation.
    Type: Grant
    Filed: June 4, 2014
    Date of Patent: August 2, 2016
    Assignee: ADVANCED MICRO DEVICES, INC.
    Inventor: Gabriel H. Loh
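
The parallel multi-granularity lookup in patent 9405703 above is sketched below. The specific region sizes and table layout are assumptions; the patent specifies only that the tables are probed in parallel and that the translation from the lowest table in the hierarchy is used.

```python
# Minimal sketch (region sizes and table shapes are assumptions, not from the
# patent): translation tables at several region granularities are probed in
# parallel for the same virtual address, and the match from the lowest
# (finest-grained) table in the hierarchy wins.

tables = [
    {'region_size': 4096,      'entries': {}},   # lowest table: 4 KiB pages
    {'region_size': 2 * 2**20, 'entries': {}},   # 2 MiB regions
    {'region_size': 1 * 2**30, 'entries': {}},   # highest table: 1 GiB regions
]

def translate(virtual_address):
    results = []
    for table in tables:                          # conceptually probed in parallel
        base = virtual_address - (virtual_address % table['region_size'])
        phys_base = table['entries'].get(base)
        results.append(None if phys_base is None else
                       phys_base + (virtual_address - base))
    for result in results:                        # lowest table in the hierarchy wins
        if result is not None:
            return result
    return None                                   # TLB miss: fall back to a page walk

tables[2]['entries'][0] = 0x8000_0000             # a 1 GiB mapping covering the address
tables[0]['entries'][0x5000] = 0x1234_5000        # a finer 4 KiB mapping takes precedence
print(hex(translate(0x5008)))                     # -> 0x12345008
```
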
  • Publication number: 20160188456
    Abstract: In one form, a computer system includes a central processing unit, a memory controller coupled to the central processing unit and capable of accessing non-volatile random access memory (NVRAM), and an NVRAM-aware operating system. The NVRAM-aware operating system causes the central processing unit to selectively execute selected ones of a plurality of application programs, and is responsive to a predetermined operation to cause the central processing unit to execute a memory persistence procedure using the memory controller to access the NVRAM.
    Type: Application
    Filed: December 31, 2014
    Publication date: June 30, 2016
    Applicants: ATI TECHNOLOGIES ULC, ADVANCED MICRO DEVICES, INC.
    Inventors: Sergey Blagodurov, Gabriel H. Loh, Mauricio Breternitz
  • Publication number: 20160179382
    Abstract: A processor modifies the memory management mode for a range of memory locations of a multilevel memory hierarchy based on changes in an application phase of an application executing at the processor. The processor monitors the application phase (e.g., computation-bound phase, input/output phase, or memory access phase) of the executing application and, in response to a change in phase, consults a management policy to identify a memory management mode. The processor automatically reconfigures a memory controller and other modules so that a range of memory locations of the multilevel memory hierarchy is managed according to the identified memory management mode. By changing the memory management mode for the range of memory locations according to the application phase, the processor improves processing efficiency and flexibility.
    Type: Application
    Filed: December 19, 2014
    Publication date: June 23, 2016
    Inventors: Sergey Blagodurov, Mitesh Ramesh Meswani, Gabriel H. Loh, Mauricio Breternitz, JR., Mark Richard Nutter, John Robert Slice, David Andrew Roberts, Michael Ignatowski, Mark Henry Oskin
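
The phase-driven reconfiguration in publication 20160179382 above can be sketched as a policy table keyed by application phase. The phase names, modes, and policy entries below are assumptions for illustration, not the publication's actual policy.

```python
# Minimal sketch (phase names and modes are assumptions, not from the
# publication): when the observed application phase changes, a management
# policy is consulted and the memory controller settings for the affected
# range of memory locations are reconfigured.

# Assumed policy table mapping application phase -> memory management mode.
POLICY = {
    'compute-bound': 'hardware-managed cache mode',
    'memory-bound':  'software-managed (page migration) mode',
    'io-bound':      'power-saving mode',
}

class MemoryRange:
    def __init__(self, start, size):
        self.start, self.size = start, size
        self.mode = POLICY['compute-bound']      # assumed initial mode

    def apply_mode(self, mode):
        # Stand-in for reprogramming the memory controller for this range.
        self.mode = mode

def on_phase_change(new_phase, managed_range, current_phase):
    if new_phase != current_phase:
        managed_range.apply_mode(POLICY[new_phase])
    return new_phase

r = MemoryRange(start=0x0, size=1 << 30)
phase = 'compute-bound'
phase = on_phase_change('memory-bound', r, phase)
print(r.mode)   # -> software-managed (page migration) mode
```
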