Patents by Inventor Matthew Tomei

Matthew Tomei has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 12200139
    Abstract: An electronic system for calculating and mining digital currency using a circuit layout optimized for power consumption, performance level, and integrated circuit surface area. A circuit simulation system simulates and evaluates circuit layouts retrieved from a circuit database to identify circuit parameters to compare against threshold values. The simulation varies operational parameters of the simulated circuits to evaluate the active circuit parameters. The operational parameters include voltage levels, clock frequencies, thermal characteristics, and layout characteristics of dedicated components and sub-modules. The active circuit parameters include the effective hash rate, power, performance, and surface area.
    Type: Grant
    Filed: July 25, 2024
    Date of Patent: January 14, 2025
    Assignee: Auradine, Inc.
    Inventors: Matthew Tomei, Raju Rakha, Saptadeep Pal
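    A minimal sketch, in Python, of the kind of layout/operating-point sweep the abstract above describes: candidate circuits are evaluated across voltage and frequency settings and kept only if they clear efficiency and area thresholds. The layout records, threshold values, and the toy hash-rate/power model are invented for illustration, not Auradine's actual simulation flow.

      from itertools import product

      # Hypothetical candidate layouts: (name, base hash rate GH/s, base power W, area mm^2)
      layouts = [
          ("layout_a", 120.0, 3.0, 10.5),
          ("layout_b", 140.0, 3.6, 12.0),
      ]

      # Operational parameters to sweep (illustrative values only).
      voltages = [0.65, 0.70, 0.75]        # volts
      frequencies = [500, 600, 700]        # MHz

      # Threshold values an evaluator might compare against (placeholders).
      MIN_HASH_PER_WATT = 35.0             # GH/s per watt
      MAX_AREA_MM2 = 11.0

      def simulate(base_hash, base_power, v, f):
          """Toy model: hash rate scales with frequency, power with f * v^2."""
          hash_rate = base_hash * (f / 600.0)
          power = base_power * (f / 600.0) * (v / 0.70) ** 2
          return hash_rate, power

      best = None
      for (name, h0, p0, area), v, f in product(layouts, voltages, frequencies):
          hash_rate, power = simulate(h0, p0, v, f)
          efficiency = hash_rate / power
          if efficiency >= MIN_HASH_PER_WATT and area <= MAX_AREA_MM2:
              if best is None or efficiency > best[1]:
                  best = ((name, v, f), efficiency)

      print("best configuration:", best)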
  • Publication number: 20240413974
    Abstract: Dynamically calculating an optimal operational efficiency configuration of a plurality of digital currency mining systems based on trending information related to the digital currency and extrinsic factors affecting the plurality of digital currency mining systems. The plurality of digital currency mining systems are sent configuration settings to achieve the optimal operational efficiency configuration.
    Type: Application
    Filed: June 7, 2024
    Publication date: December 12, 2024
    Inventors: Saptadeep Pal, Patrick Xu, David Carlson, Nicholas Cabi, Aditya Batra, Raju Rakha, Barun Kar, Rajiv Khemani, Robert Ashley, Matthew Tomei, Sridhar Chirravuri
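    A hedged illustration of the fleet-tuning idea in the abstract above: a power mode is chosen from trending inputs (coin price, network difficulty, electricity cost) and pushed to each mining system. The input names, the breakeven rule, and the configuration values are assumptions, not the claimed method.

      def choose_mode(price_usd, difficulty_factor, energy_usd_per_kwh):
          # Rough revenue-per-kWh proxy; higher price or lower difficulty -> push harder.
          revenue_per_kwh = price_usd / (difficulty_factor * 1000.0)
          if revenue_per_kwh > 2.0 * energy_usd_per_kwh:
              return {"mode": "turbo", "freq_mhz": 700, "voltage_v": 0.75}
          if revenue_per_kwh > energy_usd_per_kwh:
              return {"mode": "normal", "freq_mhz": 600, "voltage_v": 0.70}
          return {"mode": "eco", "freq_mhz": 450, "voltage_v": 0.62}

      def push_config(miners, settings):
          # Stand-in for whatever fleet-management channel actually delivers settings.
          for miner_id in miners:
              print(f"sending {settings} to {miner_id}")

      push_config(["miner-001", "miner-002"],
                  choose_mode(price_usd=60000, difficulty_factor=90, energy_usd_per_kwh=0.06))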
  • Patent number: 12164924
    Abstract: A method includes, in response to receiving an instruction to perform a first operation on first data stored in a memory device, obtaining first compression metadata from the memory device based on an address for the first data, and reducing a number of operations in a set of operations based on the first operation and one or more matching addresses, the one or more matching addresses corresponding to second compression metadata matching the first compression metadata.
    Type: Grant
    Filed: September 25, 2020
    Date of Patent: December 10, 2024
    Assignee: Advanced Micro Devices, Inc.
    Inventors: Matthew Tomei, Shomit Das
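    One way to picture the operation-reduction idea from the abstract above: pending operations whose addresses carry compression metadata matching the target address's metadata can be dropped from the set. The metadata table and operation format below are simplified placeholders, not the patented mechanism.

      # Per-address compression metadata, as it might be read back from the memory device.
      compression_metadata = {
          0x1000: "all_zero",
          0x1040: "all_zero",
          0x1080: "pattern_7",
      }

      def reduce_operations(target_addr, pending_ops):
          """Drop pending ops whose address has the same metadata as target_addr."""
          target_meta = compression_metadata.get(target_addr)
          return [op for op in pending_ops
                  if compression_metadata.get(op["addr"]) != target_meta]

      pending = [{"op": "write", "addr": 0x1040}, {"op": "write", "addr": 0x1080}]
      print(reduce_operations(0x1000, pending))   # the 0x1040 write is elided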
  • Publication number: 20240184738
    Abstract: A densely integrated, chiplet/dielet-based networked memory pool with very high intra-pool bandwidth is provided. Chiplets are used to provide a common interface to the network, so all memories (even those built with different process technologies) look the same from the network's perspective and vice versa: memory can be assembled in many different configurations while changing only the configuration at a high level of abstraction. The memory pool can easily be scaled in capacity, and custom configurations that were previously impossible because of incompatible technologies or levels of integration become achievable.
    Type: Application
    Filed: April 13, 2022
    Publication date: June 6, 2024
    Applicants: The Regents of the University of California, The Board of Trustees of the University of Illinois
    Inventors: Saptadeep Pal, Matthew Tomei, Puneet Gupta, Rakesh Kumar
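    A small sketch of the "common interface" idea from the abstract above: heterogeneous memory devices sit behind identical chiplet adapters, so the pool presents one uniform read/write interface and scales by adding nodes. Class and field names are illustrative, not taken from the publication.

      class MemoryChiplet:
          """Adapter chiplet giving any memory technology the same network-facing API."""
          def __init__(self, node_id, technology, capacity_gib):
              self.node_id = node_id
              self.technology = technology      # e.g. "DRAM", "HBM", "NVM"
              self.capacity_gib = capacity_gib
              self._store = {}

          def read(self, addr):
              return self._store.get(addr, 0)

          def write(self, addr, value):
              self._store[addr] = value

      # A pool assembled from different technologies; capacity scales by appending chiplets.
      pool = [
          MemoryChiplet(0, "DRAM", 32),
          MemoryChiplet(1, "HBM", 16),
          MemoryChiplet(2, "NVM", 128),
      ]
      pool[2].write(0x10, 42)
      print(sum(c.capacity_gib for c in pool), "GiB total;", pool[2].read(0x10))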
  • Patent number: 12001237
    Abstract: Systems, methods, and devices for performing pattern-based cache block compression and decompression. An uncompressed cache block is input to the compressor. Byte values are identified within the uncompressed cache block. A cache block pattern is searched for in a set of cache block patterns based on the byte values. A compressed cache block is output based on the byte values and the cache block pattern. A compressed cache block is input to the decompressor. A cache block pattern is identified based on metadata of the cache block. The cache block pattern is applied to a byte dictionary of the cache block. An uncompressed cache block is output based on the cache block pattern and the byte dictionary. A subset of cache block patterns is determined from a training cache trace based on a set of compressed sizes and a target number of patterns for each size.
    Type: Grant
    Filed: September 23, 2020
    Date of Patent: June 4, 2024
    Assignee: Advanced Micro Devices, Inc.
    Inventors: Matthew Tomei, Shomit N. Das, David A. Wood
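    A toy version of the pattern-based scheme described above: a cache block is reduced to a pattern index plus a byte dictionary of its distinct byte values, and decompression re-expands the pattern over that dictionary. The two patterns below are invented examples, not the trained pattern set the patent derives from a cache trace.

      # Each pattern maps block positions to byte-dictionary slots.
      PATTERNS = {
          0: [0, 0, 0, 0, 0, 0, 0, 0],          # one byte value repeated across the block
          1: [0, 1, 0, 1, 0, 1, 0, 1],          # two alternating byte values
      }

      def compress(block):
          dictionary, slots = [], []
          for b in block:
              if b not in dictionary:
                  dictionary.append(b)
              slots.append(dictionary.index(b))
          for pid, pattern in PATTERNS.items():
              if slots == pattern:
                  return {"pattern_id": pid, "byte_dictionary": dictionary}
          return None                            # no pattern matched: store uncompressed

      def decompress(meta):
          pattern = PATTERNS[meta["pattern_id"]]
          return [meta["byte_dictionary"][slot] for slot in pattern]

      block = [0xAB, 0xCD, 0xAB, 0xCD, 0xAB, 0xCD, 0xAB, 0xCD]
      meta = compress(block)
      assert decompress(meta) == block
      print(meta)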
  • Patent number: 11604738
    Abstract: A processing device is provided which includes memory comprising data cache memory configured to store compressed data and metadata cache memory configured to store metadata, each portion of metadata comprising an encoding used to compress a portion of data. The processing device also includes at least one processor configured to compress portions of data and select, based on one or more utility level metrics, portions of metadata to be stored in the metadata cache memory. The at least one processor is also configured to store, in the metadata cache memory, the portions of metadata selected to be stored in the metadata cache memory, and to store, in the data cache memory, each portion of compressed data having a selected portion of corresponding metadata stored in the metadata cache memory. Each portion of compressed data, having the selected portion of corresponding metadata stored in the metadata cache memory, is decompressed.
    Type: Grant
    Filed: September 28, 2018
    Date of Patent: March 14, 2023
    Assignee: Advanced Micro Devices, Inc.
    Inventors: Shomit N. Das, Matthew Tomei, David A. Wood
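    A simplified sketch of the utility-driven selection described above: only data whose compression metadata scores above a utility cutoff keeps that metadata in the metadata cache, and only that data is stored compressed. The utility metric used here (access count times bytes saved) is an assumption for illustration.

      METADATA_CACHE_ENTRIES = 2

      candidates = [
          # (address, access count, bytes saved by compression, encoding)
          (0x100, 50, 32, "zero-run"),
          (0x140, 5, 48, "dictionary"),
          (0x180, 90, 16, "delta"),
      ]

      def utility(access_count, bytes_saved):
          return access_count * bytes_saved

      ranked = sorted(candidates, key=lambda c: utility(c[1], c[2]), reverse=True)
      metadata_cache = {addr: enc for addr, _, _, enc in ranked[:METADATA_CACHE_ENTRIES]}

      for addr, *_ in candidates:
          kept = addr in metadata_cache          # compressed only if its metadata is cached
          print(hex(addr), "compressed" if kept else "uncompressed")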
  • Publication number: 20220100518
    Abstract: A method includes, in response to receiving an instruction to perform a first operation on first data stored in a memory device, obtaining first compression metadata from the memory device based on an address for the first data, and reducing a number of operations in a set of operations based on the first operation and one or more matching addresses, the one or more matching addresses corresponding to second compression metadata matching the first compression metadata.
    Type: Application
    Filed: September 25, 2020
    Publication date: March 31, 2022
    Inventors: Matthew Tomei, Shomit Das
  • Publication number: 20210157485
    Abstract: Systems, methods, and devices for performing pattern-based cache block compression and decompression. An uncompressed cache block is input to the compressor. Byte values are identified within the uncompressed cache block. A cache block pattern is searched for in a set of cache block patterns based on the byte values. A compressed cache block is output based on the byte values and the cache block pattern. A compressed cache block is input to the decompressor. A cache block pattern is identified based on metadata of the cache block. The cache block pattern is applied to a byte dictionary of the cache block. An uncompressed cache block is output based on the cache block pattern and the byte dictionary. A subset of cache block patterns is determined from a training cache trace based on a set of compressed sizes and a target number of patterns for each size.
    Type: Application
    Filed: September 23, 2020
    Publication date: May 27, 2021
    Applicant: Advanced Micro Devices, Inc.
    Inventors: Matthew Tomei, Shomit N. Das, David A. Wood
  • Patent number: 10884940
    Abstract: A method of operating a cache in a computing device includes, in response to receiving a memory access request at the cache, determining compressibility of data specified by the request, selecting in the cache a destination portion for storing the data based on the compressibility of the data and a persistent fault history of the destination portion, and storing a compressed copy of the data in a non-faulted subportion of the destination portion, wherein the persistent fault history indicates that the non-faulted subportion excludes any persistent faults.
    Type: Grant
    Filed: December 21, 2018
    Date of Patent: January 5, 2021
    Assignee: Advanced Micro Devices, Inc.
    Inventors: John Kalamatianos, Shrikanth Ganapathy, Shomit Das, Matthew Tomei
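    A sketch of the compressibility- and fault-aware placement described above: data is directed to a partially faulty destination only if it compresses enough to fit in that destination's non-faulted bytes. The way size, fault map, and stand-in run-length compressor are made up for illustration.

      WAY_SIZE = 64                              # bytes per candidate destination

      # Persistent fault history: byte offsets known to be bad in each destination.
      fault_history = {
          0: set(),                              # fully healthy destination
          1: set(range(32, 64)),                 # upper half of destination 1 is faulty
      }

      def compressed_size(data):
          # Stand-in compressor: collapse runs of identical bytes to (value, count) pairs.
          runs, prev = 0, None
          for b in data:
              if b != prev:
                  runs, prev = runs + 1, b
          return 2 * runs

      def select_destination(data):
          size = compressed_size(data)
          for dest, faults in fault_history.items():
              if size <= WAY_SIZE - len(faults): # fits in the non-faulted subportion
                  return dest, size
          return None

      print(select_destination(bytes(64)))       # highly compressible: fits anywhere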
  • Patent number: 10860489
    Abstract: Techniques are disclosed for designing cache compression algorithms that control how data in caches are compressed. The techniques generate a custom "byte select algorithm" by applying repeated transforms to an initial compression algorithm until a set of suitability criteria is met. The suitability criteria include that the "cost" is below a threshold and that a metadata constraint is met. The "cost" is the number of blocks that can be compressed by an algorithm as compared with the "ideal" algorithm. The metadata constraint is the number of bits required for metadata.
    Type: Grant
    Filed: October 31, 2018
    Date of Patent: December 8, 2020
    Assignee: Advanced Micro Devices, Inc.
    Inventors: Shomit N. Das, Matthew Tomei, David A. Wood
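    A schematic of the search loop the abstract above describes: a candidate byte-select scheme is repeatedly transformed until its cost (coverage lost relative to an "ideal" compressor) falls below a threshold and its metadata fits the bit budget. The transform, cost model, and threshold values are placeholder assumptions.

      import math

      COST_THRESHOLD = 0.05          # accept at most 5% fewer compressed blocks than "ideal"
      METADATA_BIT_BUDGET = 4        # metadata constraint: bits available per block

      def cost(num_patterns):
          # Toy model: coverage improves as the pattern set grows.
          return max(0.0, 0.30 - 0.04 * num_patterns)

      def metadata_bits(num_patterns):
          return math.ceil(math.log2(max(num_patterns, 2)))

      algorithm = {"num_patterns": 2}
      while True:
          c, m = cost(algorithm["num_patterns"]), metadata_bits(algorithm["num_patterns"])
          if c <= COST_THRESHOLD and m <= METADATA_BIT_BUDGET:
              break                              # suitability criteria met
          algorithm["num_patterns"] += 1         # the "transform": grow the pattern set

      print(algorithm, "cost:", c, "metadata bits:", m)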
  • Patent number: 10838727
    Abstract: A processing device is provided which includes memory and at least one processor. The memory includes main memory and cache memory in communication with the main memory via a link. The at least one processor is configured to receive a request for a cache line and read the cache line from main memory. The at least one processor is also configured to compress the cache line according to a compression algorithm and, when the compressed cache line includes at least one byte predicted not to be accessed, drop the at least one byte from the compressed cache line based on whether the compression algorithm is determined to successfully compress the cache line according to a compression parameter.
    Type: Grant
    Filed: December 14, 2018
    Date of Patent: November 17, 2020
    Assignee: Advanced Micro Devices, Inc.
    Inventors: Shomit N. Das, Kishore Punniyamurthy, Matthew Tomei, Bradford M. Beckmann
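    A rough illustration of the byte-dropping fallback described above: when a cache line does not compress to the target size on its own, bytes predicted not to be accessed are dropped and their offsets recorded. The predictor input and run-length size model below are stand-ins for illustration.

      TARGET_SIZE = 32                           # compression parameter, in bytes

      def compressed_size(line):
          runs, prev = 0, None
          for b in line:
              if b != prev:
                  runs, prev = runs + 1, b
          return 2 * runs

      def compress_with_byte_drop(line, predicted_unused):
          """predicted_unused: byte offsets the predictor expects will not be read."""
          if compressed_size(line) <= TARGET_SIZE:
              return line, []                    # compression alone meets the parameter
          kept = [b for i, b in enumerate(line) if i not in predicted_unused]
          return kept, sorted(predicted_unused)  # dropped offsets travel as metadata

      line = list(range(64))                     # worst case: every byte differs
      kept, dropped = compress_with_byte_drop(line, predicted_unused=set(range(48, 64)))
      print(len(kept), "bytes kept,", len(dropped), "bytes dropped")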
  • Publication number: 20200201777
    Abstract: A method of operating a cache in a computing device includes, in response to receiving a memory access request at the cache, determining compressibility of data specified by the request, selecting in the cache a destination portion for storing the data based on the compressibility of the data and a persistent fault history of the destination portion, and storing a compressed copy of the data in a non-faulted subportion of the destination portion, wherein the persistent fault history indicates that the non-faulted subportion excludes any persistent faults.
    Type: Application
    Filed: December 21, 2018
    Publication date: June 25, 2020
    Inventors: John Kalamatianos, Shrikanth Ganapathy, Shomit Das, Matthew Tomei
  • Publication number: 20200192671
    Abstract: A processing device is provided which includes memory and at least one processor. The memory includes main memory and cache memory in communication with the main memory via a link. The at least one processor is configured to receive a request for a cache line and read the cache line from main memory. The at least one processor is also configured to compress the cache line according to a compression algorithm and, when the compressed cache line includes at least one byte predicted not to be accessed, drop the at least one byte from the compressed cache line based on whether the compression algorithm is determined to successfully compress the cache line according to a compression parameter.
    Type: Application
    Filed: December 14, 2018
    Publication date: June 18, 2020
    Applicant: Advanced Micro Devices, Inc.
    Inventors: Shomit N. Das, Kishore Punniyamurthy, Matthew Tomei, Bradford M. Beckmann
  • Publication number: 20200133866
    Abstract: The disclosure herein provides techniques for designing cache compression algorithms that control how data in caches are compressed. The techniques generate a custom "byte select algorithm" by applying repeated transforms to an initial compression algorithm until a set of suitability criteria is met. The suitability criteria include that the "cost" is below a threshold and that a metadata constraint is met. The "cost" is the number of blocks that can be compressed by an algorithm as compared with the "ideal" algorithm. The metadata constraint is the number of bits required for metadata.
    Type: Application
    Filed: October 31, 2018
    Publication date: April 30, 2020
    Applicant: Advanced Micro Devices, Inc.
    Inventors: Shomit N. Das, Matthew Tomei, David A. Wood
  • Publication number: 20200104262
    Abstract: A processing device is provided which includes memory comprising data cache memory configured to store compressed data and metadata cache memory configured to store metadata, each portion of metadata comprising an encoding used to compress a portion of data. The processing device also includes at least one processor configured to compress portions of data and select, based on one or more utility level metrics, portions of metadata to be stored in the metadata cache memory. The at least one processor is also configured to store, in the metadata cache memory, the portions of metadata selected to be stored in the metadata cache memory, and to store, in the data cache memory, each portion of compressed data having a selected portion of corresponding metadata stored in the metadata cache memory. Each portion of compressed data, having the selected portion of corresponding metadata stored in the metadata cache memory, is decompressed.
    Type: Application
    Filed: September 28, 2018
    Publication date: April 2, 2020
    Applicant: Advanced Micro Devices, Inc.
    Inventors: Shomit N. Das, Matthew Tomei, David A. Wood
  • Publication number: 20200073845
    Abstract: Systems, apparatuses, and methods for reliably transmitting data over voltage scaled links are disclosed. A computing system includes at least first and second devices connected via a link. In one implementation, if a data block can be compressed to less than or equal to half the original size of the data block, then the data block is compressed and sent on the link in a single clock cycle rather than two clock cycles. If the data block cannot be compressed to half the original size, but if the data block can be compressed enough to include error correction code (ECC) bits without exceeding the original size, then ECC bits are added to the compressed block which is sent on the link at a reduced voltage. The ECC bits help to correct for any errors that are generated as a result of operating the link at the reduced voltage.
    Type: Application
    Filed: August 30, 2018
    Publication date: March 5, 2020
    Inventors: Shomit N. Das, Matthew Tomei, Shrikanth Ganapathy, John Kalamatianos
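    A sketch of the link-transfer decision in the abstract above: a block that halves under compression is sent in one cycle; a block that merely leaves room for ECC is sent with ECC added at reduced voltage; anything else goes out uncompressed at nominal voltage. Block size, ECC overhead, and the run-length compressor are illustrative assumptions.

      BLOCK_SIZE = 64          # bytes per uncompressed block
      ECC_BYTES = 8            # assumed ECC overhead per protected block

      def compress(block):
          # Toy run-length compressor standing in for whatever the link actually uses.
          out, prev, count = [], None, 0
          for b in block:
              if b == prev:
                  count += 1
              else:
                  if prev is not None:
                      out.append((prev, count))
                  prev, count = b, 1
          out.append((prev, count))
          return out, 2 * len(out)

      def plan_transfer(block):
          _, size = compress(block)
          if size <= BLOCK_SIZE // 2:
              return {"cycles": 1, "voltage": "nominal", "ecc": False}
          if size + ECC_BYTES <= BLOCK_SIZE:
              return {"cycles": 2, "voltage": "reduced", "ecc": True}
          return {"cycles": 2, "voltage": "nominal", "ecc": False}

      print(plan_transfer(bytes(64)))            # all zeros: single-cycle transfer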
  • Patent number: 10558606
    Abstract: Systems, apparatuses, and methods for reliably transmitting data over voltage scaled links are disclosed. A computing system includes at least first and second devices connected via a link. In one implementation, if a data block can be compressed to less than or equal to half the original size of the data block, then the data block is compressed and sent on the link in a single clock cycle rather than two clock cycles. If the data block cannot be compressed to half the original size, but if the data block can be compressed enough to include error correction code (ECC) bits without exceeding the original size, then ECC bits are added to the compressed block which is sent on the link at a reduced voltage. The ECC bits help to correct for any errors that are generated as a result of operating the link at the reduced voltage.
    Type: Grant
    Filed: August 30, 2018
    Date of Patent: February 11, 2020
    Assignee: Advanced Micro Devices, Inc.
    Inventors: Shomit N. Das, Matthew Tomei, Shrikanth Ganapathy, John Kalamatianos
  • Patent number: 10411731
    Abstract: A processing device is provided which includes a plurality of encoders each configured to compress a portion of data using a different compression algorithm. The processing device also includes one or more processors configured to cause an encoder, of the plurality of encoders, to compress the portion of data when it is determined that the portion of data, which is compressed by another encoder configured to compress the portion of data prior to the encoder in an encoder hierarchy, is not successfully compressed according to a compression metric by the other encoder in the encoder hierarchy. The one or more processors are also configured to prevent the encoder from compressing the portion of data when it is determined that the portion of data is successfully compressed according to the compression metric by the other encoder in the encoder hierarchy.
    Type: Grant
    Filed: September 24, 2018
    Date of Patent: September 10, 2019
    Assignee: Advanced Micro Devices, Inc.
    Inventors: Shomit N. Das, Matthew Tomei
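    A minimal sketch of the encoder hierarchy described above: each encoder is invoked only if the one before it failed to compress the data according to the compression metric. The two encoders and the 50% metric below are placeholders, not the patented encoder set.

      def zero_encoder(data):
          return b"" if not any(data) else None  # succeeds only on all-zero data

      def rle_encoder(data):
          out, prev, count = bytearray(), None, 0
          for b in data:
              if b == prev and count < 255:
                  count += 1
              else:
                  if prev is not None:
                      out += bytes([prev, count])
                  prev, count = b, 1
          out += bytes([prev, count])
          return bytes(out)

      def compress_hierarchical(data, metric=0.5):
          for encoder in (zero_encoder, rle_encoder):   # cheapest encoder first
              result = encoder(data)
              if result is not None and len(result) <= metric * len(data):
                  return encoder.__name__, result       # later encoders never invoked
          return "uncompressed", bytes(data)

      print(compress_hierarchical(bytes([7] * 64))[0])  # rle_encoder handles this one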