Patents by Inventor David Andrew Roberts

David Andrew Roberts has filed for patents to protect the following inventions. This listing includes both pending patent applications and patents already granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11422934
    Abstract: Described apparatuses and methods track access metadata pertaining to activity within respective address ranges. The access metadata can be used to inform prefetch operations within the respective address ranges. The prefetch operations may involve deriving access patterns from access metadata covering the respective ranges. Suitable address range sizes for accurate pattern detection, however, can vary significantly from region to region of the address space based on, inter alia, workloads produced by programs utilizing the regions. Advantageously, the described apparatuses and methods can adapt the address ranges covered by the access metadata for improved prefetch performance. A data structure may be used to manage the address ranges in which access metadata are tracked. The address ranges can be adapted to improve prefetch performance through low-overhead operations implemented within the data structure.
    Type: Grant
    Filed: July 24, 2020
    Date of Patent: August 23, 2022
    Assignee: Micron Technology, Inc.
    Inventor: David Andrew Roberts
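To make the mechanism in the entry above (patent 11422934) concrete, the following is a minimal Python sketch of tracking access metadata per address range and adaptively refining hot ranges into finer ranges so access patterns can be detected at a useful granularity. The split policy, thresholds, and class name are assumptions chosen for illustration; the patent describes low-overhead operations on a managing data structure rather than this exact scheme.

```python
# A minimal, illustrative model of tracking access metadata per address range
# and adaptively splitting hot ranges into finer ranges. Thresholds, the
# split policy, and all names here are assumptions for illustration only.

class RangeTracker:
    def __init__(self, start, size, split_threshold=64, min_size=4096):
        self.split_threshold = split_threshold   # accesses before refining
        self.min_size = min_size                 # smallest tracked range
        self.ranges = {(start, size): 0}         # (base, length) -> access count

    def _find(self, addr):
        for (base, length) in self.ranges:
            if base <= addr < base + length:
                return (base, length)
        return None

    def record_access(self, addr):
        key = self._find(addr)
        if key is None:
            return
        self.ranges[key] += 1
        base, length = key
        # Refine hot ranges so pattern detection can work at a finer grain.
        if self.ranges[key] >= self.split_threshold and length > self.min_size:
            count = self.ranges.pop(key)
            half = length // 2
            self.ranges[(base, half)] = count // 2
            self.ranges[(base + half, length - half)] = count - count // 2

tracker = RangeTracker(start=0, size=1 << 20)
for a in range(0, 4096 * 70, 64):      # a streaming workload in one region
    tracker.record_access(a)
print(sorted(tracker.ranges))          # the hot region is now tracked at finer ranges
```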
  • Patent number: 11409657
    Abstract: Described apparatuses and methods track access metadata pertaining to activity within respective address ranges. The access metadata can be used to inform prefetch operations within the respective address ranges. The prefetch operations may involve deriving access patterns from access metadata covering the respective ranges. Suitable address range sizes for accurate pattern detection, however, can vary significantly from region to region of the address space based on, inter alia, workloads produced by programs utilizing the regions. Advantageously, the described apparatuses and methods can adapt the address ranges covered by the access metadata for improved prefetch performance. A data structure may be used to manage the address ranges in which access metadata are tracked. The address ranges can be adapted to improve prefetch performance through low-overhead operations implemented within the data structure.
    Type: Grant
    Filed: July 14, 2020
    Date of Patent: August 9, 2022
    Assignee: Micron Technology, Inc.
    Inventor: David Andrew Roberts
  • Publication number: 20220222180
    Abstract: Described apparatuses and methods form adaptive cache lines having a configurable capacity from hardware cache lines having a fixed capacity. The adaptive cache lines can be formed in accordance with a programmable cache-line parameter. The programmable cache-line parameter can specify a capacity for the adaptive cache lines. The adaptive cache lines may be formed by combining respective groups of fixed-capacity hardware cache lines. The quantity of fixed-capacity hardware cache lines included in respective adaptive cache lines may be based on the programmable cache-line parameter. The programmable cache-line parameter can be selected in accordance with characteristics of the cache workload.
    Type: Application
    Filed: April 4, 2022
    Publication date: July 14, 2022
    Applicant: Micron Technology, Inc.
    Inventors: David Andrew Roberts, Joseph Thomas Pawlowski
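A minimal sketch of the adaptive cache-line idea in the entry above (publication 20220222180): fixed-capacity hardware lines are grouped into larger adaptive lines according to a programmable cache-line parameter. The sizes, the direct-mapped indexing, and the function names are assumptions for illustration only.

```python
# A small sketch of forming adaptive cache lines by grouping fixed-capacity
# hardware cache lines, per a programmable cache-line parameter. Sizes and
# the direct-mapped layout are assumptions made only for illustration.

HW_LINE_BYTES = 64          # fixed hardware cache-line capacity
NUM_HW_LINES = 1024         # hardware lines available in the cache

def adaptive_line_groups(adaptive_line_bytes):
    """Group consecutive hardware lines into adaptive lines of the requested size."""
    assert adaptive_line_bytes % HW_LINE_BYTES == 0
    lines_per_group = adaptive_line_bytes // HW_LINE_BYTES
    return [list(range(i, i + lines_per_group))
            for i in range(0, NUM_HW_LINES, lines_per_group)]

def adaptive_index(addr, adaptive_line_bytes):
    """Direct-mapped index of the adaptive line that would hold this address."""
    num_groups = (NUM_HW_LINES * HW_LINE_BYTES) // adaptive_line_bytes
    return (addr // adaptive_line_bytes) % num_groups

# A workload with large sequential transfers might select 256-byte adaptive lines.
groups = adaptive_line_groups(256)
print(len(groups), groups[0])          # 256 groups, the first uses hw lines 0..3
print(adaptive_index(0x12345, 256))
```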
  • Publication number: 20220223201
    Abstract: Systems, devices, and methods related to a Deep Learning Accelerator and memory are described. For example, the accelerator can have processing units to perform at least matrix computations of an artificial neural network via execution of instructions. The processing units have a local memory to store operands of the instructions. The accelerator can access a random access memory via a system buffer, or without going through the system buffer. A fetch instruction can request an item, available at a memory address in the random access memory, to be loaded into the local memory at a local address. The fetch instruction can include a hint for the caching of the item in the system buffer. During execution of the fetch instruction, the hint can be used to determine whether to load the item through the system buffer or to bypass the system buffer in loading the item.
    Type: Application
    Filed: January 11, 2021
    Publication date: July 14, 2022
    Inventors: Aliasger Tayeb Zaidy, Patrick Alan Estep, David Andrew Roberts
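The fetch-with-hint behavior in the entry above (publication 20220223201) can be modeled in a few lines: the hint selects whether the loaded item passes through (and is retained in) a system buffer or bypasses it. The class and method names below are assumptions, not the accelerator's actual instruction set.

```python
# An illustrative model of a fetch instruction that carries a caching hint:
# the item is loaded into the accelerator's local memory either through a
# system buffer (which keeps a copy) or by bypassing the buffer. The class
# and method names are assumptions, not the actual instruction set.

class AcceleratorModel:
    def __init__(self, ram):
        self.ram = ram                 # backing random access memory (dict)
        self.system_buffer = {}        # shared buffer between RAM and the accelerator
        self.local_memory = {}         # processing units' operand store

    def fetch(self, mem_addr, local_addr, cache_in_buffer):
        if cache_in_buffer:
            # Load through the system buffer; keep a copy for likely reuse.
            if mem_addr not in self.system_buffer:
                self.system_buffer[mem_addr] = self.ram[mem_addr]
            value = self.system_buffer[mem_addr]
        else:
            # Streaming data with no reuse: bypass the buffer entirely.
            value = self.ram[mem_addr]
        self.local_memory[local_addr] = value

acc = AcceleratorModel(ram={0x100: 3.5, 0x108: -1.25})
acc.fetch(0x100, local_addr=0, cache_in_buffer=True)    # weights: likely reused
acc.fetch(0x108, local_addr=1, cache_in_buffer=False)   # activations: streamed
print(acc.local_memory, acc.system_buffer)
```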
  • Patent number: 11379376
    Abstract: Techniques and devices are described for embedding data in an address stream on an interconnect, such as a memory bus. Addresses in an address stream indicate at least part of a location in memory (e.g., a memory page and offset), whereas data embedded in the address stream can indicate when metadata or other information is available to lend context to the addresses in the address stream. The indication of data in the address stream can be communicated using, for example, a mailbox, a preamble message in a messaging protocol, a checksum, repetitive transmission, or combinations thereof. The indication of data can be recorded from the address stream and may later be used to interpret memory traces recorded during a test or can be used to communicate with a memory device or other recipient of the data during testing or regular operations.
    Type: Grant
    Filed: May 20, 2020
    Date of Patent: July 5, 2022
    Assignee: Micron Technology, Inc.
    Inventor: David Andrew Roberts
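As a toy illustration of the entry above (patent 11379376), the sketch below frames a metadata payload in an address stream using a reserved mailbox address as a preamble, a length word, and a simple checksum. The mailbox value, word widths, and framing are assumptions; the patent also mentions repetitive transmission and other signaling options.

```python
# A toy encoder/decoder for embedding metadata in an address stream using a
# reserved "mailbox" address as a preamble, followed by a length word and a
# simple checksum. The mailbox address, word width, and framing are
# assumptions for illustration.

MAILBOX = 0xFFFF_0000          # reserved address announcing embedded data

def embed(addresses, payload_words):
    """Interleave a framed metadata payload into a stream of real addresses."""
    checksum = sum(payload_words) & 0xFFFFFFFF
    frame = [MAILBOX, len(payload_words)] + list(payload_words) + [checksum]
    return frame + list(addresses)

def extract(stream):
    """Return (payload, remaining_address_stream); verify the checksum."""
    if not stream or stream[0] != MAILBOX:
        return [], list(stream)
    n = stream[1]
    payload, checksum = stream[2:2 + n], stream[2 + n]
    assert sum(payload) & 0xFFFFFFFF == checksum, "corrupted embedded data"
    return list(payload), list(stream[3 + n:])

stream = embed([0x1000, 0x1040, 0x1080], payload_words=[0xBEEF, 0x01])
payload, addrs = extract(stream)
print(payload, addrs)
```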
  • Publication number: 20220188250
    Abstract: Techniques are described for implementing and/or operating an apparatus that includes a host system, a memory system, and a shared memory bus. The memory system includes a first memory type that is subject to a first memory type-specific timing constraint and a second memory type that is subject to a second memory type-specific timing constraint. Additionally, the shared memory bus is shared by the first memory type and the second memory type. Furthermore, the apparatus utilizes a first time period to communicate with the first memory type via the shared memory bus at least in part by enforcing the first memory type-specific timing constraint during the first time period and utilizes a second time period to communicate with the second memory type via the shared memory bus at least in part by enforcing the second memory type-specific timing constraint during the second time period.
    Type: Application
    Filed: February 28, 2022
    Publication date: June 16, 2022
    Inventors: David Andrew Roberts, Joseph Thomas Pawlowski, Elliott Cooper-Balis
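A simplified model of the shared-bus arrangement in the entry above (publication 20220188250): bus time is divided into periods owned by one memory type, and each type's own command-spacing constraint is enforced only within its periods. The period length and timing values are illustrative assumptions.

```python
# A simplified model of time-multiplexing one shared memory bus between two
# memory types, each with its own minimum command spacing. The period length
# and timing numbers are illustrative assumptions, not values from the patent.

PERIOD_NS = 100                            # length of each time slice on the bus
MIN_SPACING_NS = {"dram": 10, "nvm": 30}   # per-type timing constraint

def schedule(requests):
    """Assign issue times so each request runs in its type's slice and
    respects that type's minimum command spacing."""
    now = 0
    last_issue = {"dram": -10**9, "nvm": -10**9}
    issued = []
    for mem_type, name in requests:
        # Advance to the start of a slice owned by this memory type.
        slice_owner = "dram" if (now // PERIOD_NS) % 2 == 0 else "nvm"
        if slice_owner != mem_type:
            now = (now // PERIOD_NS + 1) * PERIOD_NS
        # Enforce the memory type-specific spacing for this type's commands.
        now = max(now, last_issue[mem_type] + MIN_SPACING_NS[mem_type])
        issued.append((now, mem_type, name))
        last_issue[mem_type] = now
    return issued

reqs = [("dram", "rd A"), ("dram", "rd B"), ("nvm", "wr C"), ("dram", "rd D")]
for t, mem_type, name in schedule(reqs):
    print(f"t={t:4d} ns  {mem_type:4s}  {name}")
```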
  • Publication number: 20220188606
    Abstract: Systems, devices, and methods related to a Deep Learning Accelerator and memory are described. For example, an integrated circuit (IC) device includes a first stack of IC dies connected to a plurality of second stacks of IC dies. The first stack has a first die with a memory controller and processing units of the Deep Learning Accelerator, and at least one second die that is stacked on the first die to provide a first type of memory. Each of the second stacks has a base die and at least a third die and a fourth die having different types of memory. The base die has a logic circuit configured to copy data within the same stack in response to commands from the memory controller and has a second type of memory usable as a die cross buffer.
    Type: Application
    Filed: December 14, 2020
    Publication date: June 16, 2022
    Inventor: David Andrew Roberts
  • Patent number: 11354246
    Abstract: Techniques are described for implementing and/or operating an apparatus that includes a memory system coupled to a processing system via a memory bus. The memory system includes hierarchical memory levels and a memory controller. The memory controller receives a memory access request at least in part by receiving an address parameter indicative of a memory address associated with a data block from the memory bus during a first clock cycle and receiving a context parameter indicative of context information associated with current targeting of the data block from the memory bus during a second clock cycle, instructs the memory system to output the data block to the memory bus based on the memory address indicated in the address parameter, and predictively controls data storage in the hierarchical memory levels based at least in part on the context information indicated in the context parameter of the memory access request.
    Type: Grant
    Filed: December 28, 2020
    Date of Patent: June 7, 2022
    Assignee: Micron Technology, Inc.
    Inventor: David Andrew Roberts
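The entry above (patent 11354246) splits one memory access request across two bus cycles: the address parameter first, then a context parameter. The sketch below assumes the context is a small tag identifying the requesting code and uses it to decide whether to keep the block in a faster memory level; the encoding and the placement policy are assumptions for illustration.

```python
# A minimal model of a memory access request split across two bus cycles:
# the address parameter arrives first, the context parameter second, and the
# context is used to predictively place the block in a faster memory level.
# The context encoding and the placement policy are assumptions.

class HierarchicalController:
    def __init__(self):
        self.fast_level = {}                   # e.g., an on-package cache level
        self.slow_level = {addr: addr * 2 for addr in range(0, 1024, 64)}
        self.reuse_count = {}                  # reuse observed per context tag

    def access(self, address_param, context_param):
        # In hardware, address_param arrives on cycle 1 and context_param on
        # cycle 2 of the same request; here they are passed together.
        data = self.fast_level.get(address_param,
                                   self.slow_level[address_param])
        # Predictive control: contexts that have shown reuse get cached.
        self.reuse_count[context_param] = self.reuse_count.get(context_param, 0) + 1
        if self.reuse_count[context_param] >= 2:
            self.fast_level[address_param] = data
        return data

ctrl = HierarchicalController()
ctrl.access(0x40, context_param="loop_A")
ctrl.access(0x80, context_param="loop_A")      # loop_A now considered reusing
print(0x80 in ctrl.fast_level, 0x40 in ctrl.fast_level)
```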
  • Patent number: 11334260
    Abstract: Described apparatuses and methods control a voltage or a temperature of a memory domain to balance memory performance and energy use. In some aspects, an adaptive controller monitors memory performance metrics of a host processor that correspond to commands made to a memory domain of a memory system, including one operating at cryogenic temperatures. Based on the memory performance metrics, the adaptive controller can determine memory performance demand of the host processor, such as latency demand or bandwidth demand, for the memory domain. The adaptive controller may alter, using the determined performance demand, a voltage or a temperature of the memory domain to enable memory access performance that is tailored to meet the demand of the host processor. By so doing, the adaptive controller can manage various settings of the memory domain to address short- or long-term changes in memory performance demand.
    Type: Grant
    Filed: October 14, 2020
    Date of Patent: May 17, 2022
    Assignee: Micron Technology, Inc.
    Inventor: David Andrew Roberts
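A rough control-loop sketch for the entry above (patent 11334260): the memory domain's supply voltage is nudged up when observed demand (bandwidth utilization or latency sensitivity) is high and down when it is low. All numbers and the control rule are assumptions; the patent also covers adjusting temperature.

```python
# A rough sketch of an adaptive controller that nudges a memory domain's
# supply voltage up when observed performance demand is high and down when
# demand is low, trading energy for access performance. All numbers and the
# control rule are illustrative assumptions.

def adjust_voltage(voltage_v, bandwidth_util, latency_sensitive,
                   v_min=0.7, v_max=1.1, step=0.05):
    """One control-loop step: return the next voltage setting for the domain."""
    demand_high = bandwidth_util > 0.8 or latency_sensitive
    demand_low = bandwidth_util < 0.3 and not latency_sensitive
    if demand_high:
        voltage_v = min(v_max, voltage_v + step)    # faster, more energy
    elif demand_low:
        voltage_v = max(v_min, voltage_v - step)    # slower, less energy
    return round(voltage_v, 3)

v = 0.9
for util, latency in [(0.95, True), (0.9, False), (0.2, False), (0.1, False)]:
    v = adjust_voltage(v, util, latency)
    print(f"util={util:.2f} latency_sensitive={latency} -> {v} V")
```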
  • Publication number: 20220129196
    Abstract: Various embodiments enable versioning of data stored on a memory device, where the versioning allows the memory device to maintain different versions of data within a set of physical memory locations (e.g., a row) of the memory device. In particular, some embodiments provide for a memory device or a memory sub-system that uses versioning of stored data to facilitate a rollback operation/behavior, a checkpoint operation/behavior, or both. Additionally, some embodiments provide for a transactional memory device or a transactional memory sub-system that uses versioning of stored data to enable rollback of a memory transaction, commitment of a memory transaction, or handling of a read or write command associated with a memory transaction.
    Type: Application
    Filed: October 28, 2020
    Publication date: April 28, 2022
    Inventors: David Andrew Roberts, Sean Stephen Eilert
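A compact model of the versioning behavior in the entry above (publication 20220129196): a row keeps a committed version and a speculative working version, so a transaction's writes can later be committed or rolled back. The two-version layout and API names are assumptions for illustration.

```python
# A compact model of keeping multiple versions of the data in a row so a
# memory transaction can be rolled back or committed. The two-version layout
# and the API names are assumptions used only to illustrate the behavior.

class VersionedRow:
    def __init__(self, value):
        self.committed = value        # last checkpointed / committed version
        self.working = None           # speculative version, if any

    def write(self, value):
        self.working = value          # writes within a transaction are speculative

    def read(self):
        return self.working if self.working is not None else self.committed

    def commit(self):
        if self.working is not None:
            self.committed, self.working = self.working, None

    def rollback(self):
        self.working = None           # discard speculative data, keep the checkpoint

row = VersionedRow(value=b"old")
row.write(b"new")
print(row.read())        # b'new'  (the transaction sees its own write)
row.rollback()
print(row.read())        # b'old'  (rolled back to the committed version)
```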
  • Publication number: 20220113881
    Abstract: Described apparatuses and methods control a voltage or a temperature of a memory domain to balance memory performance and energy use. In some aspects, an adaptive controller monitors memory performance metrics of a host processor that correspond to commands made to a memory domain of a memory system, including one operating at cryogenic temperatures. Based on the memory performance metrics, the adaptive controller can determine memory performance demand of the host processor, such as latency demand or bandwidth demand, for the memory domain. The adaptive controller may alter, using the determined performance demand, a voltage or a temperature of the memory domain to enable memory access performance that is tailored to meet the demand of the host processor. By so doing, the adaptive controller can manage various settings of the memory domain to address short- or long-term changes in memory performance demand.
    Type: Application
    Filed: October 14, 2020
    Publication date: April 14, 2022
    Applicant: Micron Technology, Inc.
    Inventor: David Andrew Roberts
  • Publication number: 20220114093
    Abstract: Described apparatuses and methods balance memory-portion accessing. Some memory architectures are designed to accelerate memory accesses using schemes that may be at least partially dependent on memory access requests being distributed roughly equally across multiple memory portions of a memory. Examples of such memory portions include cache sets of cache memories and memory banks of multibank memories. Some code, however, may execute in a manner that concentrates memory accesses in a subset of the total memory portions, which can reduce memory responsiveness in these memory types. To account for such behaviors, described techniques can shuffle memory addresses based on a shuffle map to produce shuffled memory addresses. The shuffle map can be determined based on a count of the occurrences of a reference bit value at bit positions of the memory addresses. Using the shuffled memory address for memory requests can substantially balance the accesses across the memory portions.
    Type: Application
    Filed: October 14, 2020
    Publication date: April 14, 2022
    Applicant: Micron Technology, Inc.
    Inventor: David Andrew Roberts
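To illustrate the entry above (publication 20220114093), the sketch below counts how often a reference bit value appears at each bit position of sampled addresses, builds a shuffle map that moves the most evenly toggling bits into the low-order, portion-selecting positions, and applies it as a bit permutation. Treating "most balanced bits become index bits" as the policy is an assumption made for illustration.

```python
# An illustrative take on building a shuffle map from per-bit-position counts
# of a reference bit value, then using it to spread accesses across memory
# portions (e.g., banks or cache sets). The policy of making the most evenly
# toggling bits the new low-order bits is an assumption.

ADDR_BITS = 16

def build_shuffle_map(sampled_addresses, reference_bit=1):
    """Order bit positions so the most balanced bits (count closest to half
    of the samples) become the low-order, portion-selecting bits."""
    half = len(sampled_addresses) / 2
    counts = [sum(((a >> b) & 1) == reference_bit for a in sampled_addresses)
              for b in range(ADDR_BITS)]
    return sorted(range(ADDR_BITS), key=lambda b: abs(counts[b] - half))

def shuffle(address, shuffle_map):
    """Permute address bits according to the shuffle map."""
    out = 0
    for new_pos, old_pos in enumerate(shuffle_map):
        out |= ((address >> old_pos) & 1) << new_pos
    return out

# A strided workload that would otherwise land in a single bank:
samples = [i * 1024 for i in range(64)]
smap = build_shuffle_map(samples)
banks_before = {a % 8 for a in samples}
banks_after = {shuffle(a, smap) % 8 for a in samples}
print(banks_before, banks_after)       # {0} versus all eight banks
```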
  • Patent number: 11294808
    Abstract: Described apparatuses and methods form adaptive cache lines having a configurable capacity from hardware cache lines having a fixed capacity. The adaptive cache lines can be formed in accordance with a programmable cache-line parameter. The programmable cache-line parameter can specify a capacity for the adaptive cache lines. The adaptive cache lines may be formed by combining respective groups of fixed-capacity hardware cache lines. The quantity of fixed-capacity hardware cache lines included in respective adaptive cache lines may be based on the programmable cache-line parameter. The programmable cache-line parameter can be selected in accordance with characteristics of the cache workload.
    Type: Grant
    Filed: May 21, 2020
    Date of Patent: April 5, 2022
    Assignee: Micron Technology, Inc.
    Inventors: David Andrew Roberts, Joseph Thomas Pawlowski
  • Publication number: 20220091990
    Abstract: Systems, apparatuses, and methods for memory management are described. For example, these may include a first memory level including memory pages in a memory array, a second memory level including a cache, a pre-fetch buffer, or both, and a memory controller that determines state information associated with a memory page in the memory array targeted by a memory access request. The state information may include a first parameter indicative of a current activation state of the memory page and a second parameter indicative of statistical likelihood (e.g., confidence) that a subsequent memory access request will target the memory page. The memory controller may disable storage of data associated with the memory page in the second memory level when the first parameter associated with the memory page indicates that the memory page is activated and the second parameter associated with the memory page is greater than or equal to a threshold.
    Type: Application
    Filed: December 6, 2021
    Publication date: March 24, 2022
    Inventor: David Andrew Roberts
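The caching decision in the entry above (publication 20220091990) reduces to a small predicate: if the targeted page is already activated and the likelihood of further hits to it is high, skip storing the block in the second memory level. The threshold and parameter names below are assumptions.

```python
# A small sketch of the caching decision described above: if the targeted
# memory page is already activated (open row) and the likelihood of another
# hit to it is high, skip storing the data in the cache / pre-fetch level.
# The threshold and the state encoding are assumptions for illustration.

CONFIDENCE_THRESHOLD = 0.75

def store_in_second_level(page_is_activated, reuse_confidence):
    """Return True if the data block should also be kept in the cache or
    pre-fetch buffer (the second memory level)."""
    if page_is_activated and reuse_confidence >= CONFIDENCE_THRESHOLD:
        # The open page already gives fast access; caching it again would
        # mostly waste second-level capacity.
        return False
    return True

print(store_in_second_level(page_is_activated=True,  reuse_confidence=0.9))   # False
print(store_in_second_level(page_is_activated=False, reuse_confidence=0.9))   # True
print(store_in_second_level(page_is_activated=True,  reuse_confidence=0.4))   # True
```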
  • Patent number: 11281604
    Abstract: Techniques are described for implementing and/or operating an apparatus that includes a host system, a memory system, and a shared memory bus. The memory system includes a first memory type that is subject to a first memory type-specific timing constraint and a second memory type that is subject to a second memory type-specific timing constraint. Additionally, the shared memory bus is shared by the first memory type and the second memory type. Furthermore, the apparatus utilizes a first time period to communicate with the first memory type via the shared memory bus at least in part by enforcing the first memory type-specific timing constraint during the first time period and utilizes a second time period to communicate with the second memory type via the shared memory bus at least in part by enforcing the second memory type-specific timing constraint during the second time period.
    Type: Grant
    Filed: February 28, 2020
    Date of Patent: March 22, 2022
    Assignee: Micron Technology, Inc.
    Inventors: David Andrew Roberts, Joseph Thomas Pawlowski, Elliott Cooper-Balis
  • Publication number: 20220058132
    Abstract: Described apparatuses and methods partition a cache memory based, at least in part, on a metric indicative of prefetch performance. The amount of cache memory allocated for metadata related to prefetch operations versus cache storage can be adjusted based on operating conditions. Thus, the cache memory can be partitioned into a first portion allocated for metadata pertaining to an address space (prefetch metadata) and a second portion allocated for data associated with the address space (cache data). The amount of cache memory allocated to the first portion can be increased under workloads that are suitable for prefetching and decreased otherwise. The first portion may include one or more cache units, cache lines, cache ways, cache sets, or other resources of the cache memory.
    Type: Application
    Filed: August 19, 2020
    Publication date: February 24, 2022
    Applicant: Micron Technology, Inc.
    Inventors: David Andrew Roberts, Joseph Thomas Pawlowski
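A minimal sketch of the partitioning policy in the entry above (publication 20220058132): the share of cache ways holding prefetch metadata grows when a prefetch-performance metric (here, prefetch accuracy) is high and shrinks when it is low. The way counts and thresholds are illustrative assumptions.

```python
# A minimal sketch of repartitioning a set-associative cache between prefetch
# metadata and ordinary cache data, driven by a prefetch-performance metric
# (here, prefetch accuracy). The way counts and thresholds are assumptions.

TOTAL_WAYS = 16
MAX_METADATA_WAYS = 8

def repartition(metadata_ways, useful_prefetches, issued_prefetches):
    """Grow the metadata partition when prefetching is paying off, shrink it
    when it is not; the remaining ways hold cache data."""
    accuracy = useful_prefetches / issued_prefetches if issued_prefetches else 0.0
    if accuracy > 0.6 and metadata_ways < MAX_METADATA_WAYS:
        metadata_ways += 1          # workload is prefetch-friendly
    elif accuracy < 0.2 and metadata_ways > 0:
        metadata_ways -= 1          # reclaim capacity for cache data
    return metadata_ways, TOTAL_WAYS - metadata_ways

meta, data = 2, 14
for useful, issued in [(90, 100), (85, 100), (10, 100)]:
    meta, data = repartition(meta, useful, issued)
    print(f"accuracy={useful/issued:.2f} -> metadata ways={meta}, data ways={data}")
```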
  • Publication number: 20220019530
    Abstract: Described apparatuses and methods track access metadata pertaining to activity within respective address ranges. The access metadata can be used to inform prefetch operations within the respective address ranges. The prefetch operations may involve deriving access patterns from access metadata covering the respective ranges. Suitable address range sizes for accurate pattern detection, however, can vary significantly from region to region of the address space based on, inter alia, workloads produced by programs utilizing the regions. Advantageously, the described apparatuses and methods can adapt the address ranges covered by the access metadata for improved prefetch performance. A data structure may be used to manage the address ranges in which access metadata are tracked. The address ranges can be adapted to improve prefetch performance through low-overhead operations implemented within the data structure.
    Type: Application
    Filed: July 24, 2020
    Publication date: January 20, 2022
    Applicant: Micron Technology, Inc.
    Inventor: David Andrew Roberts
  • Publication number: 20220019537
    Abstract: Described apparatuses and methods track access metadata pertaining to activity within respective address ranges. The access metadata can be used to inform prefetch operations within the respective address ranges. The prefetch operations may involve deriving access patterns from access metadata covering the respective ranges. Suitable address range sizes for accurate pattern detection, however, can vary significantly from region to region of the address space based on, inter alia, workloads produced by programs utilizing the regions. Advantageously, the described apparatuses and methods can adapt the address ranges covered by the access metadata for improved prefetch performance. A data structure may be used to manage the address ranges in which access metadata are tracked. The address ranges can be adapted to improve prefetch performance through low-overhead operations implemented within the data structure.
    Type: Application
    Filed: July 14, 2020
    Publication date: January 20, 2022
    Applicant: Micron Technology, Inc.
    Inventor: David Andrew Roberts
  • Publication number: 20220019538
    Abstract: Systems, apparatuses, and methods for predictive memory access are described. Memory control circuitry instructs a memory array to read a data block from or write the data block to a location targeted by a memory access request, determines memory access information including a data value correlation parameter determined based on data bits used to indicate a raw data value in the data block and/or an inter-demand delay correlation parameter determined based on a demand time of the memory access request, predicts that read access to another location in the memory array will subsequently be demanded by another memory access request based on the data value correlation parameter and/or the inter-demand delay correlation parameter, and instructs the memory array to output another data block stored at the other location to a different memory level that provides faster data access speed before the other memory access request is received.
    Type: Application
    Filed: September 30, 2021
    Publication date: January 20, 2022
    Inventor: David Andrew Roberts
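As a loose illustration of the entry above (publication 20220019538), the sketch below implements only the data-value correlation part: it remembers which address tended to be demanded after a given raw data value was read (for example, a pointer field) and prefetches that address the next time the value appears. The single-entry correlation table is an assumption, and the inter-demand delay correlation is omitted for brevity.

```python
# A toy value-correlated prefetcher: it remembers which address tended to be
# demanded after a given data value was read, and prefetches that address the
# next time the same value is seen. The table layout is an assumption.

class ValueCorrelatedPrefetcher:
    def __init__(self):
        self.next_addr_after_value = {}     # raw data value -> next demanded address
        self.last_value = None
        self.prefetched = set()

    def on_demand_read(self, address, value):
        # Learn: the previous read's value is correlated with this demand.
        if self.last_value is not None:
            self.next_addr_after_value[self.last_value] = address
        # Predict: if this value was seen before, prefetch what followed it.
        predicted = self.next_addr_after_value.get(value)
        if predicted is not None:
            self.prefetched.add(predicted)
        self.last_value = value

pf = ValueCorrelatedPrefetcher()
# A tiny linked structure: reading the node at 0x100 yields the pointer 0x200,
# and the next demand goes to 0x200, and so on.
trace = [(0x100, 0x200), (0x200, 0x300), (0x100, 0x200), (0x200, 0x300)]
for addr, val in trace:
    pf.on_demand_read(addr, val)
print(sorted(hex(a) for a in pf.prefetched))
```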
  • Publication number: 20210390053
    Abstract: Methods, apparatuses, and techniques related to a host-assisted memory-side prefetcher are described herein. In general, prefetchers monitor the pattern of memory-address requests by a host device and use the pattern information to determine or predict future memory-address requests and fetch data associated with those predicted requests into a faster memory. In many cases, prefetchers that can make predictions with high performance use appreciable processing and computing resources, power, and cooling. Generally, however, producing a prefetching configuration that the prefetcher uses involves more resources than making predictions. The described host-assisted memory-side prefetcher uses the greater computing resources of the host device to produce at least an updated prefetching configuration.
    Type: Application
    Filed: June 15, 2020
    Publication date: December 16, 2021
    Applicant: Micron Technology, Inc.
    Inventor: David Andrew Roberts
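A split-of-work sketch for the entry above (publication 20210390053): the memory-side prefetcher only applies a small configuration (here a per-region stride table), while the host periodically analyzes the recorded address history and pushes an updated configuration. The region size and the stride-detection rule are assumptions chosen for brevity.

```python
# The host derives a prefetching configuration from recorded demand addresses;
# the memory-side prefetcher applies that configuration with a cheap lookup.
# Region size and stride detection are illustrative assumptions.

REGION_SHIFT = 12                      # 4 KiB regions (an assumption)

def host_train(history):
    """Host-side: derive a stride per region from the observed addresses."""
    per_region = {}
    for addr in history:
        per_region.setdefault(addr >> REGION_SHIFT, []).append(addr)
    config = {}
    for region, addrs in per_region.items():
        deltas = [b - a for a, b in zip(addrs, addrs[1:])]
        if deltas and all(d == deltas[0] for d in deltas):
            config[region] = deltas[0]  # constant stride detected
    return config

def memory_side_prefetch(addr, config):
    """Memory-side: cheap table lookup, no pattern analysis on the device."""
    stride = config.get(addr >> REGION_SHIFT)
    return addr + stride if stride else None

history = [0x1000, 0x1040, 0x1080, 0x10C0]      # recorded demand addresses
config = host_train(history)                     # produced on the host
print(config, hex(memory_side_prefetch(0x1100, config)))
```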