Patents Examined by Aracelis Ruiz
  • Patent number: 11977489
    Abstract: Apparatuses, systems, and techniques for memory management are disclosed. In at least one embodiment, memory management is provided for a heterogeneous system, for example, a system including a CPU and a GPU, in which redundant or unnecessary memory transfers are reduced.
    Type: Grant
    Filed: July 19, 2022
    Date of Patent: May 7, 2024
    Assignee: NVIDIA Corporation
    Inventors: Weixi Zhu, Guilherme Cox
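The abstract above leaves the mechanism open, so the following is only a minimal software sketch, assuming a hypothetical page-residency table: a host-to-GPU copy is issued only when the GPU does not already hold a valid copy of the page, which is one way redundant transfers can be avoided.

```cpp
// Hypothetical page-residency tracker (not the patented design): a transfer to
// the GPU is issued only when the requesting processor does not already hold a
// valid copy, so repeated accesses skip redundant copies.
#include <cstdint>
#include <iostream>
#include <unordered_map>

enum class Residency { CpuOnly, GpuOnly, Shared };

class ResidencyTable {
public:
    // Returns true if a CPU->GPU transfer is actually required.
    bool requestOnGpu(uint64_t page) {
        Residency& r = table_[page];          // default-inserts CpuOnly
        if (r == Residency::CpuOnly) {
            r = Residency::Shared;            // copy once, then reuse
            return true;
        }
        return false;                         // already resident on the GPU
    }

    // Marks a page as modified by the GPU, invalidating the CPU copy.
    void writtenOnGpu(uint64_t page) { table_[page] = Residency::GpuOnly; }

private:
    std::unordered_map<uint64_t, Residency> table_;
};

int main() {
    ResidencyTable t;
    std::cout << t.requestOnGpu(0x42) << '\n';  // 1: first touch copies
    std::cout << t.requestOnGpu(0x42) << '\n';  // 0: redundant copy avoided
}
```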
  • Patent number: 11972140
    Abstract: In an embodiment, a system may support programmable hashing of address bits at a plurality of levels of granularity to map memory addresses to memory controllers and ultimately at least to memory devices. The hashing may be programmed to distribute pages of memory across the memory controllers, and consecutive blocks of the page may be mapped to physically distant memory controllers. In an embodiment, address bits may be dropped from each level of granularity, forming a compacted pipe address to save power within the memory controller. In an embodiment, a memory folding scheme may be employed to reduce the number of active memory devices and/or memory controllers in the system when the full complement of memory is not needed.
    Type: Grant
    Filed: December 20, 2022
    Date of Patent: April 30, 2024
    Assignee: Apple Inc.
    Inventors: Steven Fishwick, Jeffry E. Gonion, Per H. Hammarlund, Eran Tamari, Lior Zimet, Gerard R. Williams, III
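A rough illustration of the two ideas named in the abstract for patent 11972140, programmable hashing of selected address bits and dropping those bits to form a compacted pipe address; the mask value, bit positions, and function names are invented for the example.

```cpp
// Illustrative only (not the patented circuit): a programmable bit mask selects
// which address bits are XOR-folded into a controller-select bit, and the
// selected bits are then squeezed out to form a compacted address.
#include <cstdint>
#include <iostream>

// XOR-fold the bits selected by 'mask' down to a single select bit.
uint32_t hashBit(uint64_t addr, uint64_t mask) {
    uint64_t x = addr & mask;
    uint32_t parity = 0;
    while (x) { parity ^= 1; x &= x - 1; }   // fold one set bit at a time
    return parity;
}

// Drop the bits covered by 'mask', packing the remaining bits together.
uint64_t compactAddress(uint64_t addr, uint64_t mask) {
    uint64_t out = 0;
    int outBit = 0;
    for (int bit = 0; bit < 64; ++bit) {
        if (mask & (1ULL << bit)) continue;        // hashed bit: dropped
        out |= ((addr >> bit) & 1ULL) << outBit++; // kept bit: repacked
    }
    return out;
}

int main() {
    const uint64_t addr  = 0x123456780ULL;
    const uint64_t mask0 = 0x3C0ULL;               // example mask: bits 6..9
    std::cout << "controller bit " << hashBit(addr, mask0)
              << ", compacted address 0x" << std::hex
              << compactAddress(addr, mask0) << '\n';
}
```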
  • Patent number: 11972148
    Abstract: Example storage systems, storage devices, and methods provide proactive management of storage operations using thermal states. Host storage requests are received and used to determine storage commands for a data storage device. For each storage command, a temperature index value corresponding to an estimated change in thermal state for executing the storage command may be determined. The storage commands are allocated to command queues based on the temperature index values and then executed from the command queues by the data storage device without triggering thermal throttling of storage commands.
    Type: Grant
    Filed: June 14, 2022
    Date of Patent: April 30, 2024
    Assignee: Western Digital Technologies, Inc.
    Inventor: Ramanathan Muthiah
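A simplified software model of the allocation step described above; the temperature-index formula, threshold, and class names are assumptions rather than anything taken from the patent.

```cpp
// Commands are admitted to the active queue only while the projected device
// temperature stays below the throttling threshold; otherwise they wait in a
// deferred queue. All numbers here are invented.
#include <cstdint>
#include <deque>
#include <iostream>

struct Command { uint32_t id; uint32_t lengthBlocks; bool isWrite; };

// Hypothetical estimate: writes heat the device more than reads.
double temperatureIndex(const Command& c) {
    return c.lengthBlocks * (c.isWrite ? 0.020 : 0.008);
}

class ThermalScheduler {
public:
    explicit ThermalScheduler(double throttleLimit) : limit_(throttleLimit) {}

    void submit(const Command& c, double currentTemp) {
        if (currentTemp + temperatureIndex(c) < limit_)
            active_.push_back(c);     // safe to execute now
        else
            deferred_.push_back(c);   // hold until the device cools down
    }

    std::deque<Command> active_, deferred_;

private:
    double limit_;
};

int main() {
    ThermalScheduler sched(85.0);                 // throttle at 85 C
    sched.submit({1, 2048, true},  70.0);         // large write -> deferred
    sched.submit({2,  128, false}, 70.0);         // small read  -> active
    std::cout << sched.active_.size() << " active, "
              << sched.deferred_.size() << " deferred\n";
}
```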
  • Patent number: 11971786
    Abstract: A backup processing method and a server are provided. The method is applied to a backup system, and the backup system includes a plurality of backup servers. The method includes: dividing, by a master backup server in the plurality of backup servers, a backup task into a plurality of child backup tasks; allocating, by the master backup server, each child backup task to one of the plurality of backup servers; and sending, by the master backup server, the plurality of child backup tasks to the respective corresponding backup servers. According to this method, the data backup rate can be improved.
    Type: Grant
    Filed: February 23, 2022
    Date of Patent: April 30, 2024
    Assignee: HUAWEI TECHNOLOGIES CO., LTD.
    Inventors: Lei Zhang, Maopeng Xu, Xianglin Wang
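A minimal sketch of the task-division step in the abstract above, assuming the backup task is a contiguous byte range and one child task goes to each backup server; the data model is invented.

```cpp
// The master splits a backup job covering a byte range into one child task per
// backup server so the servers can copy their slices in parallel.
#include <algorithm>
#include <cstdint>
#include <iostream>
#include <vector>

struct ChildTask { int serverId; uint64_t offset; uint64_t length; };

std::vector<ChildTask> divideBackup(uint64_t totalBytes, int serverCount) {
    std::vector<ChildTask> tasks;
    uint64_t chunk = (totalBytes + serverCount - 1) / serverCount;  // ceil
    for (int s = 0; s < serverCount; ++s) {
        uint64_t off = static_cast<uint64_t>(s) * chunk;
        if (off >= totalBytes) break;
        tasks.push_back({s, off, std::min(chunk, totalBytes - off)});
    }
    return tasks;
}

int main() {
    for (const ChildTask& t : divideBackup(10ULL << 30, 4))  // 10 GiB, 4 servers
        std::cout << "server " << t.serverId << ": offset " << t.offset
                  << ", length " << t.length << '\n';
}
```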
  • Patent number: 11960725
    Abstract: Embodiments of the present disclosure generally relate to an NVMe storage device having a controller memory manager and a method of accessing an NVMe storage device having a controller memory manager. In one embodiment, a storage device comprises a non-volatile memory, a volatile memory, and a controller memory manager. The controller memory manager is operable to store one or more NVMe data structures within the non-volatile memory and the volatile memory.
    Type: Grant
    Filed: December 7, 2022
    Date of Patent: April 16, 2024
    Assignee: Western Digital Technologies, Inc.
    Inventor: Shay Benisty
  • Patent number: 11960405
    Abstract: Graphics processors for implementing multi-tile memory management are disclosed. In one embodiment, a graphics processor includes a first graphics device having a local memory, a second graphics device having a local memory, and a graphics driver to provide a single virtual allocation with a common virtual address range to mirror a resource to each local memory of the first and second graphics devices.
    Type: Grant
    Filed: December 30, 2022
    Date of Patent: April 16, 2024
    Assignee: Intel Corporation
    Inventors: Zack S. Waters, Travis Schluessler, Michael Apodaca, Ankur Shah
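A loose host-side analogue of the mirroring idea described above, with tile local memories modeled as host buffers; the structure and names are assumptions, not Intel's driver interface.

```cpp
// One virtual allocation record covers a common virtual address range, and the
// resource behind it is mirrored into every graphics device's local memory
// (modeled here as plain host buffers).
#include <cstdint>
#include <cstring>
#include <iostream>
#include <vector>

struct MirroredAllocation {
    uint64_t virtualBase;
    std::vector<std::vector<uint8_t>> tileCopies;   // one buffer per tile

    MirroredAllocation(uint64_t base, std::size_t bytes, int tileCount)
        : virtualBase(base), tileCopies(tileCount, std::vector<uint8_t>(bytes)) {}

    // A write through the single virtual range is applied to every mirror.
    void write(uint64_t va, const uint8_t* src, std::size_t len) {
        for (auto& copy : tileCopies)
            std::memcpy(copy.data() + (va - virtualBase), src, len);
    }
};

int main() {
    MirroredAllocation alloc(0x10000, 4096, /*tileCount=*/2);
    uint8_t payload[4] = {1, 2, 3, 4};
    alloc.write(0x10010, payload, sizeof payload);
    std::cout << int(alloc.tileCopies[1][0x10]) << '\n';   // 1: mirrored on tile 1
}
```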
  • Patent number: 11954364
    Abstract: According to an embodiment, a memory system includes memory chips operable in parallel and a memory controller. The memory chips each include first storage areas. The memory controller generates first groups, each including first storage areas selected from different memory chips. The memory controller generates second groups, each consisting of a minimum number of first storage areas formed by excluding one or more first storage areas from the corresponding first group. The minimum number of first storage areas is capable of storing at least a first amount of data received from the host. The memory controller writes the data to all of the first storage areas constituting one second group.
    Type: Grant
    Filed: March 8, 2022
    Date of Patent: April 9, 2024
    Assignee: Kioxia Corporation
    Inventor: Hironobu Miyamoto
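One possible reading of the grouping step, sketched below with invented sizes: a second group keeps only the minimum number of first storage areas from a first group needed to hold the amount of data received from the host.

```cpp
// First groups take one storage area from each chip; a second group keeps only
// the smallest prefix of those areas whose capacity covers the host data.
#include <cstddef>
#include <cstdint>
#include <iostream>
#include <vector>

struct StorageArea { int chip; int block; };

std::vector<StorageArea> makeSecondGroup(const std::vector<StorageArea>& firstGroup,
                                         uint64_t areaBytes, uint64_t dataBytes) {
    std::size_t needed =
        static_cast<std::size_t>((dataBytes + areaBytes - 1) / areaBytes);  // ceil
    if (needed > firstGroup.size()) needed = firstGroup.size();
    return std::vector<StorageArea>(firstGroup.begin(),
                                    firstGroup.begin() + needed);
}

int main() {
    // One first storage area taken from each of four chips.
    std::vector<StorageArea> firstGroup = {{0, 5}, {1, 9}, {2, 2}, {3, 7}};
    std::vector<StorageArea> secondGroup =
        makeSecondGroup(firstGroup, /*areaBytes=*/64 * 1024, /*dataBytes=*/150 * 1024);
    std::cout << "second group uses " << secondGroup.size() << " areas\n";  // 3
}
```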
  • Patent number: 11947802
    Abstract: The present disclosure relates to utilizing a buffer management system to efficiently manage and deallocate memory buffers utilized by multiple processing roles on computer hardware devices. For example, the buffer management system utilizes distributed decentralized memory buffer monitoring in connection with augmented buffer pointers to deallocate memory buffers accurately and efficiently. In this manner, the buffer management system provides an efficient approach for multiple processing roles to consume source data stored in a memory buffer and to deallocate the buffer only after all roles have finished using it.
    Type: Grant
    Filed: September 13, 2022
    Date of Patent: April 2, 2024
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Yi Yuan, Narayanan Ravichandran, Robert Groza, Jr., Yevgeny Yankilevich, Hari Daas Angepat
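The core idea reads like distributed reference counting, so here is a hedged software sketch: an augmented pointer carries a count of outstanding roles, and the buffer is freed only when the last role releases it. The structure and function names are invented.

```cpp
// Each processing role receives an augmented pointer sharing a count of
// remaining consumers; the underlying buffer is deallocated only when the
// last role releases its pointer.
#include <atomic>
#include <cstddef>
#include <cstdlib>
#include <iostream>

struct AugmentedBuffer {
    void* data;
    std::atomic<int> remainingRoles;
};

AugmentedBuffer* allocate(std::size_t bytes, int roleCount) {
    return new AugmentedBuffer{std::malloc(bytes), roleCount};
}

// Called by each role when it has finished consuming the source data.
void release(AugmentedBuffer* buf) {
    if (buf->remainingRoles.fetch_sub(1) == 1) {  // last consumer frees it
        std::free(buf->data);
        delete buf;
        std::cout << "buffer deallocated\n";
    }
}

int main() {
    AugmentedBuffer* buf = allocate(4096, 3);  // three roles share the buffer
    release(buf);   // role A done
    release(buf);   // role B done
    release(buf);   // role C done -> actual deallocation happens here
}
```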
  • Patent number: 11947461
    Abstract: A method, programming product, processor, and/or system for prefetching data is disclosed that includes: receiving a request for data at a cache; identifying whether the request is a demand request or a prefetch request; and determining, in response to identifying that the request is a prefetch request, whether to terminate the prefetch request. Determining whether to terminate the prefetch request comprises: determining how many hits have occurred for the prefetch stream corresponding to the prefetch request; and determining, based upon that number of hits, whether to terminate the prefetch request.
    Type: Grant
    Filed: January 10, 2022
    Date of Patent: April 2, 2024
    Assignee: International Business Machines Corporation
    Inventors: Mohit Karve, Naga P. Gorti, Guy L. Guthrie, Sanjeev Ghai
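A hedged illustration of the termination decision; the hit threshold and data structure are assumptions, but the logic follows the abstract: a prefetch is dropped when its stream has produced too few hits.

```cpp
// A prefetch request is terminated when its stream has produced too few demand
// hits, signaling that the stream is no longer paying off.
#include <iostream>
#include <unordered_map>

class PrefetchFilter {
public:
    void recordDemandHit(int streamId) { hits_[streamId]++; }

    // Decide whether an incoming prefetch for this stream should be terminated.
    bool shouldTerminate(int streamId, int minHits = 2) {
        return hits_[streamId] < minHits;
    }

private:
    std::unordered_map<int, int> hits_;
};

int main() {
    PrefetchFilter f;
    std::cout << f.shouldTerminate(7) << '\n';  // 1: no hits yet, terminate
    f.recordDemandHit(7);
    f.recordDemandHit(7);
    std::cout << f.shouldTerminate(7) << '\n';  // 0: stream is useful, keep it
}
```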
  • Patent number: 11940919
    Abstract: System and techniques for recall pending cache line eviction are described herein. A queue that includes a deferred memory request is kept for a cache line. Metadata for the queue is stored in a cache line tag. When a recall is needed, the metadata is written from the tag to a first recall storage, referenced by a memory request ID. After the recall request is transmitted, the memory request ID is written to a second recall storage referenced by the message ID of the recall request. Upon receipt of a response to the recall request, the queue for the cache line can be restored by using the message ID in the response to lookup the memory request ID from the second recall storage, then using the memory request ID to lookup the metadata from the first recall storage, and then writing the metadata into the tag for the cache line.
    Type: Grant
    Filed: August 30, 2022
    Date of Patent: March 26, 2024
    Assignee: Micron Technology, Inc.
    Inventors: Dean E. Walker, Tony M. Brewer
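A schematic version of the two lookups described in the abstract, with simplified types: the recall response's message ID is mapped back to the memory request ID, which in turn recovers the queue metadata to restore into the cache line tag.

```cpp
// First recall storage: request ID -> saved queue metadata from the tag.
// Second recall storage: recall message ID -> request ID.
#include <cstdint>
#include <iostream>
#include <unordered_map>

struct QueueMetadata { uint32_t headIndex; uint32_t count; };

std::unordered_map<uint32_t, QueueMetadata> firstRecallStorage;
std::unordered_map<uint32_t, uint32_t>      secondRecallStorage;

void startRecall(uint32_t reqId, uint32_t msgId, QueueMetadata tagMeta) {
    firstRecallStorage[reqId] = tagMeta;   // save metadata taken from the tag
    secondRecallStorage[msgId] = reqId;    // remember which request the recall serves
}

QueueMetadata onRecallResponse(uint32_t msgId) {
    uint32_t reqId = secondRecallStorage.at(msgId);   // msgId -> reqId
    return firstRecallStorage.at(reqId);              // reqId -> metadata for the tag
}

int main() {
    startRecall(/*reqId=*/11, /*msgId=*/99, {4, 3});
    QueueMetadata m = onRecallResponse(99);
    std::cout << "restored queue: head " << m.headIndex
              << ", " << m.count << " deferred requests\n";
}
```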
  • Patent number: 11940917
    Abstract: Methods and systems for managing storage of data in a distributed system are disclosed. To manage storage of data in a distributed system, a data processing system may include a network interface controller (NIC). The network interface controller may present emulated storage devices that may be used for data storage. The emulated storage devices may utilize storage resources of storage devices. The storage devices may be remote to the NIC. To reduce communication bandwidth and/or use of resources of the storage devices, the NIC and/or NICs of other data processing systems may implement a distributed cache for data stored in the storage devices. The NICs may implement a method of managing the distributed cache to maintain synchronization between the distributed cache and the data stored in the storage devices.
    Type: Grant
    Filed: July 12, 2022
    Date of Patent: March 26, 2024
    Assignee: Dell Products L.P.
    Inventor: Boris Glimcher
  • Patent number: 11940910
    Abstract: As one aspect of the present disclosure, a byte-addressable device is disclosed. The device may include: a volatile memory device; and a controller configured to be connected with a host processor, the volatile memory device, and a non-volatile storage device, wherein the controller may be further configured to communicate with the volatile memory device and the non-volatile storage device based on address information included in a request received from the host processor.
    Type: Grant
    Filed: October 6, 2023
    Date of Patent: March 26, 2024
    Assignee: METISX CO., LTD.
    Inventors: Ju Hyun Kim, Jin Yeong Kim, Do Hun Kim
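A simplified routing sketch; the address window is an assumption. The controller inspects the address in each host request and forwards it either to the volatile memory device or to the non-volatile storage device.

```cpp
// Route a byte-addressable request by address: a hypothetical low window maps
// to the volatile memory device, everything above it to non-volatile storage.
#include <cstdint>
#include <iostream>

enum class Target { VolatileMemory, NonVolatileStorage };

constexpr uint64_t kVolatileWindowEnd = 1ULL << 32;  // e.g. first 4 GiB is DRAM

Target route(uint64_t address) {
    return address < kVolatileWindowEnd ? Target::VolatileMemory
                                        : Target::NonVolatileStorage;
}

int main() {
    std::cout << (route(0x1000) == Target::VolatileMemory) << '\n';            // 1
    std::cout << (route(0x200000000ULL) == Target::NonVolatileStorage) << '\n';// 1
}
```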
  • Patent number: 11934310
    Abstract: In one embodiment, a microprocessor comprises: plural cores, each of the cores comprising a level 1 (L1) cache and a level 2 (L2) cache; and a shared level 3 (L3) cache comprising plural L3 tag array entries, wherein a first portion of the plural L3 tag array entries is associated with data and a second portion of the plural L3 tag array entries is decoupled from data, and wherein each L3 tag array entry comprises tag information and data zero information, the data zero information indicating whether any data associated with the tag information is known to be zero.
    Type: Grant
    Filed: January 21, 2022
    Date of Patent: March 19, 2024
    Assignee: CENTAUR TECHNOLOGY, INC.
    Inventors: Douglas Raye Reed, Al Loper, Terry Parks
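A toy model of the tag-entry idea with invented field widths: an entry whose data-zero bit is set answers a read with zeros and needs no backing data entry, which is what lets part of the tag array stay decoupled from data.

```cpp
// An L3 tag entry with a data-zero flag; zero lines are served without
// consuming a slot in the data array.
#include <cstdint>
#include <iostream>
#include <optional>
#include <vector>

struct L3TagEntry {
    uint64_t tag = 0;
    bool valid = false;
    bool dataKnownZero = false;   // set -> no backing data entry is needed
    int dataIndex = -1;           // only meaningful when dataKnownZero is false
};

std::optional<uint64_t> readWord(const L3TagEntry& e,
                                 const std::vector<uint64_t>& dataArray) {
    if (!e.valid) return std::nullopt;        // miss
    if (e.dataKnownZero) return 0;            // hit, line is known to be all zeros
    return dataArray.at(e.dataIndex);         // hit with backing data
}

int main() {
    std::vector<uint64_t> dataArray = {0xDEADBEEF};
    L3TagEntry zeroLine{0x40, true, true, -1};
    L3TagEntry normalLine{0x80, true, false, 0};
    std::cout << std::hex << *readWord(zeroLine, dataArray) << '\n';   // 0
    std::cout << std::hex << *readWord(normalLine, dataArray) << '\n'; // deadbeef
}
```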
  • Patent number: 11928054
    Abstract: As one aspect of the present disclosure, an electronic device is disclosed. The device may include: a volatile memory device; and a controller configured to be connected with a host processor and the volatile memory device, wherein the controller may be further configured to receive a request related to data access from the host processor, determine whether data corresponding to address information is compressed based on the address information included in the request, and communicate with the volatile memory device and process the request based on a result of determining whether the data is compressed.
    Type: Grant
    Filed: October 5, 2023
    Date of Patent: March 12, 2024
    Assignee: METISX CO., LTD.
    Inventors: Ju Hyun Kim, Jin Yeong Kim, Jae Wan Yeon
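The abstract does not say how the compressed/uncompressed determination is made, so the sketch below assumes a per-region metadata table keyed by address; the layout is purely illustrative.

```cpp
// The controller consults per-region metadata keyed by the request address to
// decide whether the data there is stored compressed.
#include <cstdint>
#include <iostream>
#include <iterator>
#include <map>

// Start address of each region -> whether that region holds compressed data.
std::map<uint64_t, bool> regionCompressed = {
    {0x00000000, false},
    {0x10000000, true},
};

bool isCompressed(uint64_t address) {
    auto it = regionCompressed.upper_bound(address);  // first region past the address
    return std::prev(it)->second;                     // region containing the address
}

int main() {
    std::cout << isCompressed(0x00004000) << '\n';  // 0: plain region
    std::cout << isCompressed(0x12345678) << '\n';  // 1: compressed region
}
```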
  • Patent number: 11922018
    Abstract: An optimum compressor is generated irrespective of the number of dimensions and the format of a multidimensional dataset. A storage system refers to dimension setting information, which is information representing an attribute of each data dimension of the multidimensional dataset, and generates a compressor based on the dimension setting information.
    Type: Grant
    Filed: December 18, 2020
    Date of Patent: March 5, 2024
    Assignee: Hitachi, Ltd.
    Inventors: Hiroaki Akutsu, Takahiro Naruko, Akifumi Suzuki
  • Patent number: 11914521
    Abstract: A mechanism for cache quota control is disclosed. A cache memory is configured to receive access requests from a plurality of agents, wherein a given request from a given agent of the plurality of agents specifies an identification value associated with the given agent of the plurality of agents. A cache controller is coupled to the cache memory, and is configured to store indications of current allocations of the cache memory to individual ones of the plurality of agents. The cache controller is further configured to track requests to the cache memory based on identification values specified in the requests and determine whether to update allocations of the cache memory to the individual ones of the plurality of agents based on the tracked requests.
    Type: Grant
    Filed: June 29, 2022
    Date of Patent: February 27, 2024
    Assignee: Apple Inc.
    Inventors: Wolfgang H. Klingauf, Muhammad Umer Amjad, Connie W. Cheung, Yueh-Ta Wu, Muditha Kanchana, John H. Kelm
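A software analogue of the quota bookkeeping described above; the growth policy and numbers are made up. Requests are tracked per agent ID, and an agent's allocation grows only while unallocated capacity remains.

```cpp
// The controller counts cache requests per agent ID and grows an agent's
// allocation (in ways) based on the tracked requests.
#include <iostream>
#include <unordered_map>

class CacheQuotaController {
public:
    explicit CacheQuotaController(unsigned totalWays) : freeWays_(totalWays) {}

    void onRequest(unsigned agentId) {
        unsigned& count = requests_[agentId];
        ++count;
        // Example policy: every 100 tracked requests earns one more way,
        // as long as unallocated ways remain.
        if (count % 100 == 0 && freeWays_ > 0) {
            ++allocation_[agentId];
            --freeWays_;
        }
    }

    unsigned allocation(unsigned agentId) { return allocation_[agentId]; }

private:
    unsigned freeWays_;
    std::unordered_map<unsigned, unsigned> requests_, allocation_;
};

int main() {
    CacheQuotaController ctrl(8);
    for (int i = 0; i < 250; ++i) ctrl.onRequest(3);
    std::cout << "agent 3 holds " << ctrl.allocation(3) << " ways\n";  // 2
}
```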
  • Patent number: 11907583
    Abstract: Apparatus, methods, media, and systems for multiple sets of trim parameters are described. A non-volatile memory device may comprise a first register, a second register, a multiplexer, a first set of I/O lines, each coupled to the first register and the multiplexer and each associated with a particular trim set among multiple trim sets stored in the first register, and one or more second I/O lines, each coupled to the second register and the multiplexer. The multiplexer is configured to receive a control signal. The multiplexer is configured to output, based on the control signal, a particular trim set among the multiple trim sets to the second register using the one or more second I/O lines.
    Type: Grant
    Filed: June 7, 2022
    Date of Patent: February 20, 2024
    Assignee: Western Digital Technologies, Inc.
    Inventors: Tomer Tzvi Eliash, Asaf Gueta, Inon Cohen, Yuval Grossman
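A behavioral stand-in for the multiplexer, with invented widths and set counts: the control signal selects one trim set held in the first register, and that set is what reaches the second register.

```cpp
// The control signal picks one of the trim sets in the first register; the
// selected set is driven to the second register.
#include <array>
#include <cstdint>
#include <iostream>

using TrimSet = std::array<uint8_t, 4>;   // e.g. four trim parameters per set

TrimSet muxSelect(const std::array<TrimSet, 4>& firstRegister, unsigned control) {
    return firstRegister.at(control);
}

int main() {
    std::array<TrimSet, 4> firstRegister = {{
        {10, 11, 12, 13}, {20, 21, 22, 23}, {30, 31, 32, 33}, {40, 41, 42, 43}}};
    TrimSet secondRegister = muxSelect(firstRegister, 2);
    std::cout << int(secondRegister[0]) << '\n';   // 30: trim set 2 selected
}
```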
  • Patent number: 11907753
    Abstract: An apparatus includes a CPU core, a first cache subsystem coupled to the CPU core, and a second memory coupled to the cache subsystem. The first cache subsystem includes a configuration register, a first memory, and a controller. The controller is configured to: receive a request directed to an address in the second memory and, in response to the configuration register having a first value, operate in a non-caching mode. In the non-caching mode, the controller is configured to provide the request to the second memory without caching data returned by the request in the first memory. In response to the configuration register having a second value, the controller is configured to operate in a caching mode. In the caching mode the controller is configured to provide the request to the second memory and cache data returned by the request in the first memory.
    Type: Grant
    Filed: November 7, 2022
    Date of Patent: February 20, 2024
    Assignee: Texas Instruments Incorporated
    Inventors: Abhijeet Ashok Chachad, Timothy David Anderson, David Matthew Thompson
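A condensed model of the mode switch with invented interfaces: one configuration-register value passes requests straight through to the second memory, the other also fills the first (cache) memory with the returned data.

```cpp
// In non-caching mode the request goes to the second memory and nothing is
// retained; in caching mode the returned data is also stored in the first memory.
#include <cstdint>
#include <iostream>
#include <unordered_map>

struct CacheSubsystem {
    uint32_t configRegister = 0;                          // 0 = non-caching, 1 = caching
    std::unordered_map<uint64_t, uint32_t> firstMemory;   // the cache

    uint32_t read(uint64_t addr,
                  const std::unordered_map<uint64_t, uint32_t>& secondMemory) {
        auto hit = firstMemory.find(addr);
        if (hit != firstMemory.end()) return hit->second;
        uint32_t value = secondMemory.at(addr);           // forward to second memory
        if (configRegister == 1) firstMemory[addr] = value;  // cache only in caching mode
        return value;
    }
};

int main() {
    std::unordered_map<uint64_t, uint32_t> secondMemory = {{0x100, 7}};
    CacheSubsystem cs;
    cs.read(0x100, secondMemory);                     // non-caching: nothing retained
    std::cout << cs.firstMemory.count(0x100) << '\n'; // 0
    cs.configRegister = 1;
    cs.read(0x100, secondMemory);                     // caching mode fills the cache
    std::cout << cs.firstMemory.count(0x100) << '\n'; // 1
}
```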
  • Patent number: 11907125
    Abstract: A computer-implemented method is provided. The method includes determining whether a rejection of a request is required and determining whether the request is software forward progress (SFP)-likely or SFP-unlikely upon determining that the rejection of the request is required. The method also includes executing a first pseudo random decision to set or not set a requested state of the request in an event the request is SFP-likely, or executing a second pseudo random decision to set or not set the requested state in an event the request is SFP-unlikely, and rejecting the request following execution of the first or second pseudo random decision.
    Type: Grant
    Filed: April 5, 2022
    Date of Patent: February 20, 2024
    Assignee: International Business Machines Corporation
    Inventors: Gregory William Alexander, Tu-An T. Nguyen, Deanna Postles Dunn Berger, Timothy Bronson, Christian Jacobi
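An interpretive sketch only; the probabilities and the meaning of "requested state" are assumptions. It shows a pseudo random draw that favors SFP-likely requests when deciding whether to set the requested state before rejecting.

```cpp
// Before a required rejection, a pseudo random draw decides whether the
// requested state is set, with SFP-likely requests favored.
#include <iostream>
#include <random>

bool decideRequestedState(bool sfpLikely, std::mt19937& rng) {
    std::bernoulli_distribution setState(sfpLikely ? 0.9 : 0.1);  // invented odds
    return setState(rng);   // true -> requested state is set before rejecting
}

int main() {
    std::mt19937 rng(12345);
    int setLikely = 0, setUnlikely = 0;
    for (int i = 0; i < 1000; ++i) {
        setLikely   += decideRequestedState(true,  rng);
        setUnlikely += decideRequestedState(false, rng);
    }
    std::cout << "SFP-likely set state " << setLikely
              << " times, SFP-unlikely " << setUnlikely << " times\n";
}
```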
  • Patent number: 11868270
    Abstract: A storage device includes a storage controller and a host interface which sends an address translation service request to a host. The host interface includes an address translation cache which stores first address information included in the address translation service request, and an address translation service latency storage which stores latency-related information including a first time until the address translation cache receives an address translation service response corresponding to the address translation service request from the host. After the host interface sends the address translation service request to the host, and after the first time indicated by the latency-related information elapses, the storage controller polls the host interface.
    Type: Grant
    Filed: August 18, 2022
    Date of Patent: January 9, 2024
    Assignee: SAMSUNG ELECTRONICS CO., LTD.
    Inventors: Seung Moon Woo, Seon Bong Kim, Han-Ju Lee
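A simplified timing sketch with an invented API: the latency observed for an earlier translation is stored as the first time, and after sending the next request the controller waits at least that long before polling.

```cpp
// The latency of a previous ATS translation is remembered; polling starts only
// after that interval has elapsed for the next request.
#include <chrono>
#include <iostream>
#include <thread>

class AtsLatencyStorage {
public:
    void record(std::chrono::milliseconds observed) { firstTime_ = observed; }
    std::chrono::milliseconds firstTime() const { return firstTime_; }

private:
    std::chrono::milliseconds firstTime_{5};   // default guess before any measurement
};

int main() {
    AtsLatencyStorage latency;
    latency.record(std::chrono::milliseconds(12));   // measured on an earlier request

    // ... host interface sends the ATS request here ...
    std::this_thread::sleep_for(latency.firstTime());  // wait out the expected latency
    std::cout << "polling host interface for the ATS response now\n";
}
```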