Patents Examined by Aracelis Ruiz
  • Patent number: 12045508
    Abstract: A data storage device and method for device-initiated hibernation are provided. In one embodiment, the data storage device comprises a non-volatile memory and a controller. The controller is configured to: receive, from a host during a set-up phase of a hibernation process, a plurality of write commands with a current state of a volatile memory in the host; store the plurality of write commands in a queue, wherein the plurality of write commands are not executed during the set-up phase of the hibernation process; receive a trigger from the host to perform an execution phase of the hibernation process; and in response to receiving the trigger, execute the plurality of write commands to store the current state of the host's volatile memory in the non-volatile memory of the data storage device. Other embodiments are possible, and each of the embodiments can be used alone or together in combination.
    Type: Grant
    Filed: May 24, 2022
    Date of Patent: July 23, 2024
    Assignee: Sandisk Technologies, Inc.
    Inventors: Judah Gamliel Hahn, Ariel Navon, Shay Benisty
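A minimal Python sketch of the two-phase flow described in the abstract of patent 12045508 above: write commands carrying the host's volatile-memory state are queued during the set-up phase and executed only when the host sends the trigger. The `HibernationController` and `WriteCommand` names, and the dictionary standing in for non-volatile memory, are illustrative assumptions, not taken from the patent.

```python
from collections import deque
from dataclasses import dataclass

@dataclass
class WriteCommand:
    lba: int          # logical block address for this chunk of host RAM state
    data: bytes       # snapshot of the corresponding volatile-memory range

class HibernationController:
    """Queues hibernation writes during set-up; flushes them on the host trigger."""

    def __init__(self):
        self.queue = deque()   # commands held but not executed during the set-up phase
        self.nvm = {}          # stand-in for the non-volatile memory (LBA -> data)

    def receive_write(self, cmd: WriteCommand) -> None:
        # Set-up phase: store the command in the queue without executing it.
        self.queue.append(cmd)

    def trigger_execution(self) -> None:
        # Execution phase: persist every queued command to non-volatile memory.
        while self.queue:
            cmd = self.queue.popleft()
            self.nvm[cmd.lba] = cmd.data

# Example: queue two chunks of host RAM state, then execute on the trigger.
ctrl = HibernationController()
ctrl.receive_write(WriteCommand(lba=0, data=b"ram-page-0"))
ctrl.receive_write(WriteCommand(lba=1, data=b"ram-page-1"))
ctrl.trigger_execution()
assert ctrl.nvm[0] == b"ram-page-0"
```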
  • Patent number: 12032835
    Abstract: Aspects of the innovations herein are consistent with a storage system for storing variable sized objects. According to certain implementations, the storage system may be a transaction-based system that uses variable sized objects to store data, and/or may be implemented using data stores, such as arrays of disks arranged in ranks. In some exemplary implementations, each rank may include multiple stripes, each stripe may be read and written as a convenient unit for maximum performance, and/or a rank manager may be provided to dynamically configure the ranks. In certain implementations, the storage system may include a stripe space table that contains entries describing the amount of space used in each stripe. Further, an object map may provide entries for each object in the storage system describing the location, the length and/or version of the object.
    Type: Grant
    Filed: April 18, 2023
    Date of Patent: July 9, 2024
    Assignee: Primos Storage Technology, LLC
    Inventor: Robert E. Cousins
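A short Python sketch of the two bookkeeping structures named in the abstract of patent 12032835 above: a stripe space table tracking bytes used per stripe, and an object map recording each object's location, length, and version. The `VariableObjectStore` class, the first-fit placement policy, and the sizes used in the example are illustrative assumptions only.

```python
from dataclasses import dataclass

@dataclass
class ObjectEntry:
    stripe: int    # which stripe holds the object
    offset: int    # byte offset within the stripe
    length: int    # object length in bytes
    version: int   # object version

class VariableObjectStore:
    """Tracks per-stripe space usage and per-object location/length/version."""

    def __init__(self, num_stripes: int, stripe_size: int):
        self.stripe_size = stripe_size
        self.stripe_space = [0] * num_stripes   # stripe space table: bytes used per stripe
        self.object_map = {}                    # object id -> ObjectEntry

    def put(self, obj_id: str, length: int) -> ObjectEntry:
        # First-fit placement: pick the first stripe with enough free space.
        for stripe, used in enumerate(self.stripe_space):
            if used + length <= self.stripe_size:
                prior = self.object_map.get(obj_id)
                entry = ObjectEntry(stripe, used, length, prior.version + 1 if prior else 1)
                self.stripe_space[stripe] += length
                self.object_map[obj_id] = entry
                return entry
        raise RuntimeError("no stripe has enough free space")

store = VariableObjectStore(num_stripes=4, stripe_size=1 << 20)
print(store.put("photo-1", 4096))   # ObjectEntry(stripe=0, offset=0, length=4096, version=1)
```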
  • Patent number: 12026102
    Abstract: Systems, apparatuses, and methods related to isolating virtual machines in a memory device are described. A memory apparatus includes a memory device and a controller coupled to the memory device, wherein the controller is configured to provide a plurality of Peripheral Component Interconnect express (PCIe) functions of the memory device and isolate access to each of the plurality of PCIe functions via respective passwords and digital signatures created from host keys.
    Type: Grant
    Filed: September 7, 2022
    Date of Patent: July 2, 2024
    Assignee: Micron Technology, Inc.
    Inventors: Michael Burk, Lance Dover
  • Patent number: 12019562
    Abstract: An apparatus comprising a processor unit comprising circuitry to generate, for a first network host, a request for an object of a second network host, wherein the request comprises an address comprising a routable host ID of the second network host and an at least partially encrypted object ID, wherein the address uniquely identifies the object within a distributed computing domain; and a memory element to store at least a portion of the object.
    Type: Grant
    Filed: September 22, 2021
    Date of Patent: June 25, 2024
    Assignee: Intel Corporation
    Inventors: Michael D. LeMay, David M. Durham, Anjo Lucas Vahldiek-Oberwagner, Anna Trikalinou
  • Patent number: 12019563
    Abstract: Systems, apparatuses and methods provide for technology that determines that first data associated with a first security domain is to be stored in a first permuted cache set, where the first permuted cache set is identified based on a permutation function that permutes at least one of a plurality of first cache indexes. The technology further determines that second data associated with a second security domain is to be stored in a second permuted cache set, where the second permuted cache set is identified based on the permutation function. The second permuted cache set may intersect the first permuted cache set at one data cache line to cause an eviction of first data associated with the first security domain from the one data cache line and bypass eviction of data associated with the first security domain from at least one other data cache line of the first permuted cache set.
    Type: Grant
    Filed: September 25, 2020
    Date of Patent: June 25, 2024
    Assignee: Intel Corporation
    Inventors: Scott Constable, Thomas Unterluggauer
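A hedged Python sketch of the idea behind patent 12019563 above: a keyed permutation of cache indexes gives each security domain its own cache-set layout, so two domains' sets overlap in only a few lines. The use of SHA-256 as the permutation function, the set count, and the domain keys are assumptions for illustration, not the patent's mechanism.

```python
import hashlib

NUM_SETS = 64   # illustrative number of cache sets

def permuted_indexes(domain_key: bytes, base_indexes: list[int]) -> list[int]:
    """Keyed permutation of cache indexes: each security domain sees its own layout."""
    out = []
    for idx in base_indexes:
        digest = hashlib.sha256(domain_key + idx.to_bytes(2, "big")).digest()
        out.append(int.from_bytes(digest[:2], "big") % NUM_SETS)
    return out

base = list(range(8))                           # cache indexes for one address group
set_a = permuted_indexes(b"domain-A", base)     # cache set as seen by security domain A
set_b = permuted_indexes(b"domain-B", base)     # cache set as seen by security domain B

# The two permuted sets typically overlap in at most a few lines, so an access from
# domain B can evict domain A's data only from the shared line(s), not the whole set.
print(sorted(set(set_a) & set(set_b)))
```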
  • Patent number: 12013794
    Abstract: According to a first aspect, execution logic is configured to perform a linear capability transfer operation which transfers a physical capability from a partition of a first software module to a partition of a second software module without retaining it in the partition of the first. According to a second, alternative or additional aspect, the execution logic is configured to perform a sharding operation whereby a physical capability is divided into at least two instances, which may later be combined.
    Type: Grant
    Filed: October 28, 2020
    Date of Patent: June 18, 2024
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: David T. Chisnall, Sylvan W. Clebsch, Cédric Alain Marie Christophe Fournet
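A rough Python sketch of the two operations named in the abstract of patent 12013794 above: a linear transfer that moves a capability between module partitions without leaving a copy behind, and a sharding step that splits a capability into instances which can later be recombined. Modeling partitions as dictionaries and the `.0`/`.1` naming of shards are illustrative assumptions only.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Capability:
    base: int      # start of the physical range the capability grants access to
    length: int    # size of the range

def linear_transfer(src: dict, dst: dict, name: str) -> None:
    """Move a capability to another partition without retaining it in the source."""
    dst[name] = src.pop(name)

def shard(partition: dict, name: str) -> None:
    """Divide a capability into two instances that can later be recombined."""
    cap = partition.pop(name)
    partition[name + ".0"] = cap
    partition[name + ".1"] = cap

def combine(partition: dict, name: str) -> None:
    a, b = partition.pop(name + ".0"), partition.pop(name + ".1")
    assert a == b, "only instances of the same capability can be recombined"
    partition[name] = a

module_a = {"dma_buffer": Capability(base=0x1000, length=0x100)}
module_b = {}
linear_transfer(module_a, module_b, "dma_buffer")   # module_a no longer holds the capability
shard(module_b, "dma_buffer")                        # two instances now exist in module_b
combine(module_b, "dma_buffer")                      # ...and are merged back into one
print(module_a, module_b)
```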
  • Patent number: 12007889
    Abstract: Methods, systems, and devices for valid data identification for garbage collection are described. In connection with writing data to a block of memory cells, a memory system may identify a portion of a logical address space that includes a logical address for the data. The memory system may set a bit of a bitmap, which may indicate that the block includes data having a logical address within a portion of the logical address space corresponding to the bit. The logical address space may be divided into any quantity of portions, each corresponding to a different subset of a logical-to-physical (L2P) table, and the bitmap may include any quantity of corresponding bits. To perform garbage collection on the block, the bitmap may be used to identify one or more subsets of the L2P table to evaluate to determine whether different sets of data within the block are valid or invalid.
    Type: Grant
    Filed: October 18, 2022
    Date of Patent: June 11, 2024
    Assignee: Micron Technology, Inc.
    Inventor: David Aaron Palmer
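A minimal Python sketch of the bitmap described in the abstract of patent 12007889 above: each bit of a per-block bitmap corresponds to one portion of the logical address space (one subset of the L2P table), and garbage collection scans only the subsets whose bits are set. The subset count, address-space size, and `BlockBitmap` name are illustrative assumptions.

```python
L2P_SUBSETS = 8                       # logical address space divided into 8 portions (assumed)
LOGICAL_SPACE = 1 << 20               # total number of logical addresses (assumed)
SUBSET_SIZE = LOGICAL_SPACE // L2P_SUBSETS

class BlockBitmap:
    """One bit per L2P-table subset: set when the block holds data from that subset."""

    def __init__(self):
        self.bits = 0

    def record_write(self, logical_addr: int) -> None:
        # Set the bit for the portion of the logical address space this write falls in.
        self.bits |= 1 << (logical_addr // SUBSET_SIZE)

    def subsets_to_scan(self) -> list[int]:
        # Garbage collection only walks the L2P subsets whose bit is set.
        return [i for i in range(L2P_SUBSETS) if self.bits & (1 << i)]

bm = BlockBitmap()
bm.record_write(10)                   # falls in subset 0
bm.record_write(SUBSET_SIZE * 5 + 3)  # falls in subset 5
print(bm.subsets_to_scan())           # [0, 5] -> only these L2P subsets need evaluating
```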
  • Patent number: 12001349
    Abstract: A storage device includes a memory device including a first memory region having a lowest bit density, a second memory region having a medium bit density, and a third memory region having a highest bit density, and a controller configured to control the memory device. The controller is configured to classify data from a host as hot, warm, or cold data; store the hot data in the first memory region, the warm data in the second memory region, and the cold data in the third memory region; select a source block from among first memory blocks included in the first memory region; select destination blocks in each of the second and third memory regions; and migrate each piece of unit data stored in the source block to one of the destination blocks according to the degree of hotness of that piece of unit data.
    Type: Grant
    Filed: June 29, 2022
    Date of Patent: June 4, 2024
    Assignee: Samsung Electronics Co., Ltd.
    Inventors: Chanha Kim, Gyeongmin Nam, Seungryong Jang
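A small Python sketch of the placement and migration policy summarized in the abstract of patent 12001349 above: data is classified as hot, warm, or cold and placed in regions of increasing bit density, and unit data leaving a source block in the first region is migrated to the second or third region by its hotness. The access-count thresholds and region model are invented for illustration.

```python
def classify(access_count: int) -> str:
    """Illustrative hotness policy; the thresholds are assumptions, not taken from the patent."""
    if access_count >= 100:
        return "hot"      # -> first memory region (lowest bit density)
    if access_count >= 10:
        return "warm"     # -> second memory region (medium bit density)
    return "cold"         # -> third memory region (highest bit density)

regions = {"hot": [], "warm": [], "cold": []}   # stand-ins for the three memory regions

def store(data: bytes, access_count: int) -> None:
    regions[classify(access_count)].append(data)

def migrate_source_block(source_block: list) -> None:
    # Unit data leaving a source block in the first region moves to the second or
    # third region, chosen by each piece's current degree of hotness.
    for data, access_count in source_block:
        dest = "cold" if classify(access_count) == "cold" else "warm"
        regions[dest].append(data)

store(b"journal-entry", access_count=250)                   # hot -> low-density region
migrate_source_block([(b"old-page", 12), (b"stale-page", 1)])
print({name: len(blocks) for name, blocks in regions.items()})   # {'hot': 1, 'warm': 1, 'cold': 1}
```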
  • Patent number: 12001308
    Abstract: A method of creating a classification model for a hard disk efficiency problem comprises, by an analyzing device: obtaining a plurality of pieces of measurement data of a plurality of hard disk devices, each piece comprising a plurality of values of a plurality of vibration parameters; binarizing the plurality of pieces of measurement data based on a plurality of preset conditions respectively corresponding to the plurality of vibration parameters; and obtaining the classification model for the hard disk efficiency problem based on the plurality of pieces of binarized measurement data and a decision tree algorithm.
    Type: Grant
    Filed: June 15, 2022
    Date of Patent: June 4, 2024
    Assignees: INVENTEC (PUDONG) TECHNOLOGY CORPORATION, INVENTEC CORPORATION
    Inventors: Yi-Ju Liao, Jen-Yuan Chang, Po-Hsiu Chen, Hsieh-Liang Tsai
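A hedged Python sketch of the pipeline in the abstract of patent 12001308 above: binarize vibration measurements against per-parameter preset thresholds, then fit a decision tree on the binarized data. scikit-learn stands in for "a decision tree algorithm"; the parameter names, thresholds, measurements, and labels below are invented for illustration.

```python
from sklearn.tree import DecisionTreeClassifier

# Preset condition per vibration parameter: value above the threshold -> 1, otherwise 0.
thresholds = {"rms_velocity": 0.5, "peak_accel": 2.0, "resonance_hz": 120.0}

def binarize(sample: dict) -> list:
    return [1 if sample[name] > limit else 0 for name, limit in thresholds.items()]

measurements = [
    {"rms_velocity": 0.7, "peak_accel": 2.5, "resonance_hz": 90.0},
    {"rms_velocity": 0.2, "peak_accel": 1.0, "resonance_hz": 60.0},
    {"rms_velocity": 0.6, "peak_accel": 0.8, "resonance_hz": 150.0},
    {"rms_velocity": 0.1, "peak_accel": 0.5, "resonance_hz": 40.0},
]
labels = [1, 0, 1, 0]   # 1 = efficiency problem observed, 0 = none (illustrative)

X = [binarize(m) for m in measurements]
model = DecisionTreeClassifier().fit(X, labels)

# Classify a new drive from its binarized vibration measurements.
print(model.predict([binarize({"rms_velocity": 0.9, "peak_accel": 2.2, "resonance_hz": 130.0})]))
```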
  • Patent number: 11995000
    Abstract: A packet cache system includes a cache memory allocator for receiving a memory address corresponding to a non-cache memory and allocated to a packet, and associating the memory address with a cache memory address; a hash table for storing the memory address and the cache memory address, with the memory address as a key and the cache memory address as a value; a cache memory for storing the packet at a location indicated by the cache memory address; and an eviction engine for determining one or more cached packets to remove from the cache memory and place in the non-cache memory when occupancy of the cache memory is high.
    Type: Grant
    Filed: June 7, 2022
    Date of Patent: May 28, 2024
    Assignee: Google LLC
    Inventors: Jiazhen Zheng, Srinivas Vaduvatha, Hugh McEvoy Walsh, Prashant R. Chandra, Abhishek Agarwal, Weihuang Wang, Weiwei Jiang
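A compact Python sketch of the packet cache in the abstract of patent 11995000 above: a hash table keyed by the packet's non-cache memory address mapping to a cache slot, a simple slot allocator, and an eviction step that writes the oldest packet back to non-cache memory when occupancy is high. The FIFO eviction order, slot count, and watermark are illustrative assumptions.

```python
from collections import OrderedDict

class PacketCache:
    """Associates a packet's non-cache memory address with a cache slot; evicts when nearly full."""

    def __init__(self, num_slots: int, high_watermark: float = 0.75):
        self.num_slots = num_slots
        self.high_watermark = high_watermark
        self.free_slots = list(range(num_slots))   # cache memory allocator: unused slot indexes
        self.table = OrderedDict()                 # hash table: non-cache address -> cache slot
        self.slots = {}                            # cache slot -> packet bytes
        self.non_cache_memory = {}                 # stand-in for the backing (non-cache) memory

    def insert(self, mem_addr: int, packet: bytes) -> None:
        if not self.free_slots or len(self.table) / self.num_slots >= self.high_watermark:
            self._evict_one()                      # occupancy is high: make room first
        slot = self.free_slots.pop()
        self.table[mem_addr] = slot
        self.slots[slot] = packet

    def _evict_one(self) -> None:
        # Eviction engine: move the oldest cached packet out to its non-cache address.
        mem_addr, slot = self.table.popitem(last=False)
        self.non_cache_memory[mem_addr] = self.slots.pop(slot)
        self.free_slots.append(slot)

cache = PacketCache(num_slots=4)
for i in range(6):
    cache.insert(i * 0x100, bytes([i]) * 64)
print(len(cache.table), len(cache.non_cache_memory))   # 3 packets cached, 3 written back
```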
  • Patent number: 11989454
    Abstract: A semiconductor device includes a programming control signal generation circuit configured to generate a programming control signal and a programming termination signal based on programming data when a programming operation is performed, and a programming control circuit configured to program a command, an address, and an operation signal, based on the programming control signal to generate a programming command, a programming address, and a programming operation signal.
    Type: Grant
    Filed: March 16, 2022
    Date of Patent: May 21, 2024
    Assignee: SK hynix Inc.
    Inventors: Choung Ki Song, Woo Yeong Cho
  • Patent number: 11989137
    Abstract: Logging cache line lifetime hints when recording an execution trace. A microprocessor detects occurrence of a first cache event that initiates a lifetime of a cache line within a memory cache, and initiates logging first trace information indicating a beginning of the lifetime of the cache line within the memory cache. Subsequently, the microprocessor detects occurrence of a second cache event that ends the lifetime of the cache line within the memory cache. Based on detecting the second cache event, the microprocessor initiates logging second trace information indicating an ending of the lifetime of the cache line within the memory cache.
    Type: Grant
    Filed: March 21, 2022
    Date of Patent: May 21, 2024
    Assignee: Microsoft Technology Licensing, LLC
    Inventor: Jordi Mola
  • Patent number: 11989129
    Abstract: Systems, apparatuses and methods may provide for technology that identifies a NUMA node, defines a first virtual proximity domain within the NUMA node, and defines a second virtual proximity domain within the NUMA node, wherein the first virtual proximity domain and the second virtual proximity domain are defined via one or more OS interface tables.
    Type: Grant
    Filed: August 6, 2020
    Date of Patent: May 21, 2024
    Assignee: Intel Corporation
    Inventors: Kyle Delehanty, Sridharan Sakthivelu, Janardhana Yoga Narasimhaswamy, Vijay Bahirji, Toby Opferman
  • Patent number: 11983121
    Abstract: Provided is a cache memory device including a command reception unit that packetizes read commands and write commands and classifies each as even or odd; a cache scheduler comprising a first reorder scheduling queue that receives the even-classified commands from the command reception unit and schedules them for cache memory accesses, and a second reorder scheduling queue that receives the odd-classified commands from the command reception unit and schedules them for cache memory accesses; and an access execution unit that performs cache memory accesses, via a cache tag, on the scheduled even-classified and odd-classified commands.
    Type: Grant
    Filed: November 15, 2023
    Date of Patent: May 14, 2024
    Assignee: METISX CO., LTD.
    Inventors: Do Hun Kim, Keebum Shin, Kwangsun Lee
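A minimal Python sketch of the split described in the abstract of patent 11983121 above: incoming commands are classified as even or odd and placed in two separate scheduling queues, which are then drained against a cache tag. Classifying by address parity, the `DualQueueScheduler` name, and the dictionary cache tag are assumptions made for illustration.

```python
from collections import deque
from dataclasses import dataclass

@dataclass
class CacheCommand:
    op: str        # "read" or "write"
    address: int

class DualQueueScheduler:
    """Splits commands into even/odd streams so two reorder queues can be scheduled in parallel."""

    def __init__(self):
        self.even_queue = deque()   # first reorder scheduling queue
        self.odd_queue = deque()    # second reorder scheduling queue

    def receive(self, cmd: CacheCommand) -> None:
        # Command reception: classify as even or odd (address parity is assumed here).
        (self.even_queue if cmd.address % 2 == 0 else self.odd_queue).append(cmd)

    def execute_all(self, cache_tag: dict) -> list[str]:
        results = []
        # Access execution: drain both queues, checking each command against the cache tag.
        for queue in (self.even_queue, self.odd_queue):
            while queue:
                cmd = queue.popleft()
                hit = cmd.address in cache_tag
                results.append(f"{cmd.op} 0x{cmd.address:x}: {'hit' if hit else 'miss'}")
        return results

sched = DualQueueScheduler()
for addr in (0x10, 0x11, 0x12, 0x13):
    sched.receive(CacheCommand("read", addr))
print(sched.execute_all(cache_tag={0x10: "lineA", 0x13: "lineB"}))
```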
  • Patent number: 11983122
    Abstract: A system includes a multi-port RAM configured to store an instruction table. The instruction table specifies a regular expression for application to a data stream. The system includes a regular expression engine (engine) that processes the data stream using the instruction table. The engine includes a decoder circuit that determines validity of active states output from the multi-port RAM and a plurality of priority FIFO memories (PFIFOs) operating concurrently. Each PFIFO can initiate a read from a different port of the multi-port RAM. Each PFIFO can track a plurality of active paths for the regular expression and a priority of each active path by, at least in part, storing entries corresponding to active states in each respective PFIFO in decreasing priority order. The engine includes switching circuitry that selectively routes the active states from the decoder circuit to the plurality of PFIFOs according to the priority order.
    Type: Grant
    Filed: April 26, 2022
    Date of Patent: May 14, 2024
    Assignee: Xilinx, Inc.
    Inventors: David K. Liddell, Sachin Kumawat
  • Patent number: 11977489
    Abstract: Apparatuses, systems, and techniques for memory management are disclosed. In at least one embodiment, memory management is provided for a heterogeneous system, for example, a system including a CPU and a GPU, in which redundant or unnecessary memory transfers are reduced.
    Type: Grant
    Filed: July 19, 2022
    Date of Patent: May 7, 2024
    Assignee: NVIDIA Corporation
    Inventors: Weixi Zhu, Guilherme Cox
  • Patent number: 11972140
    Abstract: In an embodiment, a system may support programmable hashing of address bits at a plurality of levels of granularity to map memory addresses to memory controllers and ultimately at least to memory devices. The hashing may be programmed to distribute pages of memory across the memory controllers, and consecutive blocks of the page may be mapped to physically distant memory controllers. In an embodiment, address bits may be dropped from each level of granularity, forming a compacted pipe address to save power within the memory controller. In an embodiment, a memory folding scheme may be employed to reduce the number of active memory devices and/or memory controllers in the system when the full complement of memory is not needed.
    Type: Grant
    Filed: December 20, 2022
    Date of Patent: April 30, 2024
    Assignee: Apple Inc.
    Inventors: Steven Fishwick, Jeffry E. Gonion, Per H. Hammarlund, Eran Tamari, Lior Zimet, Gerard R. Williams, III
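A small Python sketch of the kind of programmable address-bit hashing described in the abstract of patent 11972140 above: a selectable group of address bits is folded together to choose a memory controller, so consecutive blocks of a page spread across controllers. The bit positions, XOR-fold hash, controller count, and interleave granularity are assumptions for illustration.

```python
NUM_CONTROLLERS = 8        # illustrative controller count
BLOCK_SHIFT = 7            # 128-byte interleave granularity (assumed, not from the patent)

def controller_for(address: int, hash_bits=(7, 10, 13)) -> int:
    """Programmable hash: XOR a selectable group of address bits to pick a memory controller."""
    h = 0
    for bit in hash_bits:                     # which bits participate is the "programmable" part
        h ^= (address >> bit) & 0x7           # fold 3 bits at a time into a 3-bit controller id
    return h % NUM_CONTROLLERS

# Consecutive blocks within a page land on different memory controllers.
page_base = 0x4000_0000
print([controller_for(page_base + (i << BLOCK_SHIFT)) for i in range(8)])   # [0, 1, ..., 7]
```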
  • Patent number: 11972148
    Abstract: Example storage systems, storage devices, and methods provide proactive management of storage operations using thermal states. Host storage requests are received and used to determine storage commands for a data storage device. For each storage command, a temperature index value corresponding to an estimated change in thermal state for executing the storage command may be determined. The storage commands are allocated to command queues based on the temperature index values and then executed from the command queues by the data storage device without triggering thermal throttling of storage commands.
    Type: Grant
    Filed: June 14, 2022
    Date of Patent: April 30, 2024
    Assignee: Western Digital Technologies, Inc.
    Inventor: Ramanathan Muthiah
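A hedged Python sketch of the allocation step in the abstract of patent 11972148 above: each storage command gets a temperature index estimating its thermal cost, and commands are spread across command queues so no queue exceeds a thermal budget in a scheduling window. The per-command costs, per-queue budgets, and the "deferred" overflow queue are invented for illustration.

```python
from collections import defaultdict

# Estimated thermal cost per command type (illustrative values, not from the patent).
THERMAL_INDEX = {"read": 1, "write": 3, "erase": 5}

def allocate(commands: list, queue_budgets: dict) -> dict:
    """Spread commands across queues so no queue exceeds its thermal budget per window."""
    queues = defaultdict(list)
    load = defaultdict(int)
    for op, lba in commands:
        cost = THERMAL_INDEX[op]                       # temperature index for this command
        # Pick the queue with the most remaining thermal headroom.
        q = max(queue_budgets, key=lambda name: queue_budgets[name] - load[name])
        if load[q] + cost > queue_budgets[q]:
            q = "deferred"                             # would overheat: defer to a later window
        queues[q].append((op, lba))
        load[q] += cost
    return dict(queues)

cmds = [("write", 10), ("read", 11), ("erase", 12), ("write", 13), ("read", 14)]
print(allocate(cmds, queue_budgets={"q0": 6, "q1": 6}))
```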
  • Patent number: 11971786
    Abstract: A backup processing method and a server are provided. The method is applied to a backup system, and the backup system includes a plurality of backup servers. The method includes: dividing, by a master backup server in the plurality of backup servers, a backup task into a plurality of child backup tasks; allocating, by the master backup server, each child backup task to one of the plurality of backup servers; and sending, by the master backup server, the plurality of child backup tasks to the respective corresponding backup servers. According to this method, the data backup rate can be improved.
    Type: Grant
    Filed: February 23, 2022
    Date of Patent: April 30, 2024
    Assignee: HUAWEI TECHNOLOGIES CO., LTD.
    Inventors: Lei Zhang, Maopeng Xu, Xianglin Wang
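A minimal Python sketch of the division step described in the abstract of patent 11971786 above: the master backup server splits one backup task into child tasks, allocates one to each backup server, and sends them out. The round-robin allocation by file, the server names, and the print stand-in for sending are illustrative assumptions.

```python
def divide_backup_task(files: list, backup_servers: list) -> dict:
    """Master server splits one backup task into child tasks, one per backup server."""
    child_tasks = {server: [] for server in backup_servers}
    for i, path in enumerate(files):
        # Round-robin allocation is an assumption; the patent only requires allocating
        # each child backup task to one of the backup servers.
        child_tasks[backup_servers[i % len(backup_servers)]].append(path)
    return child_tasks

def send_child_tasks(child_tasks: dict) -> None:
    for server, paths in child_tasks.items():
        # Stand-in for the master server sending each child task to its backup server.
        print(f"send to {server}: {len(paths)} files")

tasks = divide_backup_task([f"/data/vol0/file{i}" for i in range(10)],
                           backup_servers=["backup-1", "backup-2", "backup-3"])
send_child_tasks(tasks)
```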
  • Patent number: 11960405
    Abstract: Graphics processors for implementing multi-tile memory management are disclosed. In one embodiment, a graphics processor includes a first graphics device having a local memory, a second graphics device having a local memory, and a graphics driver to provide a single virtual allocation with a common virtual address range to mirror a resource to each local memory of the first and second graphics devices.
    Type: Grant
    Filed: December 30, 2022
    Date of Patent: April 16, 2024
    Assignee: Intel Corporation
    Inventors: Zack S. Waters, Travis Schluessler, Michael Apodaca, Ankur Shah