Patents by Inventor Evan Lawrence

Evan Lawrence has filed for patents to protect the following inventions. This listing includes both pending patent applications and patents already granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20250199965
    Abstract: Techniques for providing secure cross-host memory sharing are described herein. A memory buffer device has processing circuitry to receive a first request from a first initiator to share a region of memory associated with the memory buffer device with a second initiator. The processing circuitry may identify a first passcode associated with the first initiator. The processing circuitry may receive a second request from the second initiator to access the region of memory. The second request includes a second passcode. The processing circuitry may authenticate the second request using the first passcode and the second passcode. Responsive to authentication of the second request, the processing circuitry may generate a mapping between a host physical address space associated with the second initiator and a physical memory address space associated with the region of memory to enable the second initiator to access the region of memory.
    Type: Application
    Filed: December 3, 2024
    Publication date: June 19, 2025
    Inventors: Evan Lawrence Erickson, Michael Alexander Hamburg, Helena Handschuh, Mark Evan Marson, Taeksang Song
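    Sketch (not from the patent): a minimal Python model of the passcode-gated sharing flow the abstract describes; the class, method, and field names below are hypothetical illustrations.

      import secrets
      from hmac import compare_digest

      class MemoryBufferSharing:
          """Toy model of passcode-gated cross-host sharing (all names hypothetical)."""

          def __init__(self):
              self._passcodes = {}   # region_id -> passcode established via the first request
              self._mappings = {}    # (initiator, region_id) -> physical base address
              self._region_base = {0: 0x4000_0000}

          def request_share(self, first_initiator, region_id):
              # First request: a passcode is associated with the sharing (first) initiator.
              passcode = secrets.token_bytes(16)
              self._passcodes[region_id] = passcode
              return passcode        # conveyed to the second initiator out of band

          def request_access(self, second_initiator, region_id, passcode):
              # Second request: authenticate using the first and second passcodes.
              expected = self._passcodes.get(region_id)
              if expected is None or not compare_digest(expected, passcode):
                  raise PermissionError("authentication failed")
              # On success, map the second initiator's host physical addresses onto the region.
              self._mappings[(second_initiator, region_id)] = self._region_base[region_id]
              return self._mappings[(second_initiator, region_id)]

      buf = MemoryBufferSharing()
      code = buf.request_share("hostA", region_id=0)
      base = buf.request_access("hostB", region_id=0, passcode=code)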
  • Patent number: 12327042
    Abstract: Technologies for securing dynamic random access memory contents to non-volatile memory in a persistent memory module are described. One persistent memory module includes an inline memory encryption (IME) circuit that receives a data stream from a host, encrypts the data stream into encrypted data, and stores the encrypted data in DRAM. A management processor transfers the encrypted data from the DRAM to persistent storage memory responsive to a signal associated with a power-loss or power-down event.
    Type: Grant
    Filed: April 25, 2023
    Date of Patent: June 10, 2025
    Assignee: Rambus Inc.
    Inventors: Taeksang Song, Evan Lawrence Erickson, Craig E. Hampel
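    Sketch (not from the patent): a toy Python model of the data path the abstract describes, with a keyed-hash keystream standing in for the inline memory encryption circuit; all names are hypothetical.

      import hashlib

      def keystream(key: bytes, nonce: int, length: int) -> bytes:
          # Stand-in for the inline memory encryption (IME) circuit, not the actual cipher.
          out = b""
          counter = 0
          while len(out) < length:
              block = nonce.to_bytes(8, "little") + counter.to_bytes(8, "little")
              out += hashlib.blake2b(block, key=key, digest_size=32).digest()
              counter += 1
          return out[:length]

      class PersistentMemoryModule:
          def __init__(self, key: bytes):
              self._key = key
              self._dram = {}     # address -> ciphertext; data is encrypted before it lands in DRAM
              self._flash = {}    # persistent storage memory

          def host_write(self, address: int, data: bytes):
              ks = keystream(self._key, address, len(data))
              self._dram[address] = bytes(a ^ b for a, b in zip(data, ks))

          def on_power_loss(self):
              # Management processor moves the already-encrypted DRAM contents to persistent storage.
              self._flash.update(self._dram)

      pmm = PersistentMemoryModule(key=b"\x00" * 32)
      pmm.host_write(0x1000, b"example data")
      pmm.on_power_loss()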
  • Patent number: 12321502
    Abstract: A buffer integrated circuit (IC) chip is disclosed. The buffer IC chip includes host interface circuitry to receive a request from at least one host. The request includes at least one command to access a memory. Memory interface circuitry couples to the memory. Message authentication circuitry performs a verification operation on the received request. Selective containment circuitry, during a containment mode of operation, (1) inhibits changes to the memory in response to the at least one command until completion of the verification operation, and (2) during performance of the verification operation, carries out at least one non-memory modifying sub-operation associated with the at least one command.
    Type: Grant
    Filed: April 3, 2023
    Date of Patent: June 3, 2025
    Assignee: Rambus Inc.
    Inventors: Evan Lawrence Erickson, John Eric Linstadt
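    Sketch (not from the patent): a toy Python model of the containment idea, holding a write until a MAC over the request verifies; HMAC-SHA256 stands in for the message authentication circuitry, and all names are hypothetical.

      import hashlib
      import hmac

      class ContainedBuffer:
          """Toy containment model: a write is held until the request's MAC verifies."""

          def __init__(self, key: bytes):
              self._key = key
              self._memory = {}

          def handle_write(self, address: int, data: bytes, tag: bytes) -> bool:
              # Non-memory-modifying sub-operations (decode, staging) can proceed immediately.
              staged_addr, staged_data = address, bytes(data)
              # The memory update itself is inhibited until verification completes.
              msg = address.to_bytes(8, "little") + data
              expected = hmac.new(self._key, msg, hashlib.sha256).digest()
              if not hmac.compare_digest(expected, tag):
                  return False                      # containment: memory left untouched
              self._memory[staged_addr] = staged_data
              return True

      key = b"\x02" * 32
      buf = ContainedBuffer(key)
      msg = (0x200).to_bytes(8, "little") + b"payload"
      tag = hmac.new(key, msg, hashlib.sha256).digest()
      buf.handle_write(0x200, b"payload", tag)       # verified: write commits
      buf.handle_write(0x200, b"payload", b"bad")    # fails verification: write is dropped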
  • Publication number: 20250147904
    Abstract: A memory buffer device and memory module for accurate hot and cold page detection are disclosed. The memory buffer device is coupled to a device memory including a plurality of regions. The memory buffer device identifies frequently accessed regions of the plurality of regions by counting accesses to the plurality of regions at a first granularity. The frequently accessed regions are associated with counters that satisfy a threshold criterion and are further tracked using a filter at a second granularity. The second granularity is smaller than the first granularity.
    Type: Application
    Filed: October 28, 2024
    Publication date: May 8, 2025
    Inventors: Taeksang Song, Evan Lawrence Erickson
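    Sketch (not from the patent): a minimal Python model of two-level access counting, where only regions whose coarse counters cross a threshold are also tracked at the finer granularity; the thresholds and granularities are made-up examples.

      from collections import Counter

      class HotRegionTracker:
          """Toy two-level tracker: coarse counters first, finer tracking only for hot regions."""

          def __init__(self, coarse_shift=30, fine_shift=21, threshold=1000):
              self.coarse_shift = coarse_shift   # e.g. 1 GiB regions (first granularity)
              self.fine_shift = fine_shift       # e.g. 2 MiB sub-regions (second granularity)
              self.threshold = threshold
              self.coarse = Counter()
              self.fine = Counter()              # plays the role of the finer-grained filter

          def record_access(self, phys_addr: int):
              region = phys_addr >> self.coarse_shift
              self.coarse[region] += 1
              if self.coarse[region] >= self.threshold:
                  # Frequently accessed region: also track it at the smaller granularity.
                  self.fine[phys_addr >> self.fine_shift] += 1

      tracker = HotRegionTracker(threshold=3)
      for _ in range(5):
          tracker.record_access(0x4000_1000)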
  • Publication number: 20250130892
    Abstract: Technologies for configurable adaptive double device data correction (ADDDC) are described. A memory buffer device includes error detection and correction (EDC) logic to autonomously detect errors in a region of memory and, upon detection of an error, calculate additional ECC check symbols for cache lines in the region and store the additional ECC check symbols as metadata associated with the cache lines.
    Type: Application
    Filed: October 4, 2024
    Publication date: April 24, 2025
    Inventors: Evan Lawrence Erickson, Taeksang Song, Thomas Vogelsang, John Eric Linstadt
  • Publication number: 20250117138
    Abstract: Disclosed are techniques for a memory buffer to track access to paged regions of a memory system at a configurable granularity finer than the size of the paged regions, providing more detailed statistics on memory access. The memory buffer may advertise its capabilities for fine-grained cold page tracking. The memory buffer may receive from the host information to configure a granularity of sub-regions of a paged region and a size of counters used to track access to the sub-regions. The memory buffer may track access requests to the sub-regions using the counters and provide the sub-region tracking information to the host to identify individual hot or cold sub-regions. The host may then make migration decisions for the paged regions using this more granular information, such as compacting sub-regions to create a cold page or treating each sub-region as a separately compressible entity to compress a mostly cold page.
    Type: Application
    Filed: October 4, 2024
    Publication date: April 10, 2025
    Inventors: Evan Lawrence Erickson, Taeksang Song
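    Sketch (not from the patent): a minimal Python model of host-configurable sub-region tracking with saturating counters of a configurable width; the sizes and names are hypothetical.

      class SubRegionTracker:
          """Toy model: sub-region granularity and counter width are configured by the host."""

          def __init__(self, region_base: int, region_size: int,
                       subregion_size: int, counter_bits: int):
              assert region_size % subregion_size == 0
              self.base = region_base
              self.subregion_size = subregion_size
              self.max_count = (1 << counter_bits) - 1          # counters saturate here
              self.counters = [0] * (region_size // subregion_size)

          def record_access(self, addr: int):
              idx = (addr - self.base) // self.subregion_size
              if 0 <= idx < len(self.counters):
                  self.counters[idx] = min(self.counters[idx] + 1, self.max_count)

          def report(self):
              # Returned to the host so it can pick out hot or cold sub-regions within the page.
              return list(self.counters)

      # Example: a 2 MiB paged region tracked in 4 KiB sub-regions with 8-bit counters.
      t = SubRegionTracker(region_base=0x1000_0000, region_size=2 << 20,
                           subregion_size=4096, counter_bits=8)
      t.record_access(0x1000_0040)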
  • Publication number: 20250110670
    Abstract: A memory system enables a host device to flexibly allocate compressed storage managed by a memory buffer device. The host device allocates a first block of host-visible addresses associated with the compressed region and a memory buffer device allocates a corresponding second block of host-visible memory. The host device may migrate uncompressed data to and from compressed storage by referencing an address in the second block (with compression and decompression managed by the memory buffer device) and may migrate compressed data to and from compressed storage (bypassing compression and decompression on the memory buffer device) by instead referencing an address in the first block.
    Type: Application
    Filed: September 17, 2024
    Publication date: April 3, 2025
    Inventor: Evan Lawrence Erickson
  • Publication number: 20250110917
    Abstract: A multi-processor device is disclosed. The multi-processor device includes interface circuitry to receive requests from at least one host device. A primary processor is coupled to the interface circuitry to process the requests in the absence of a failure event associated with the primary processor. A secondary processor processes operations on behalf of the primary processor and selectively receives the requests from the interface circuitry based on detection of the failure event associated with the primary processor.
    Type: Application
    Filed: October 17, 2024
    Publication date: April 3, 2025
    Inventors: Michael Raymond Miller, Evan Lawrence Erickson
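    Sketch (not from the patent): a toy Python model of steering requests to a secondary processor once a failure event is detected on the primary; the class and method names are hypothetical.

      class MultiProcessorDevice:
          """Toy failover model: requests reach the primary unless a failure event was detected."""

          def __init__(self):
              self.primary_failed = False

          def _primary_handle(self, request):
              return f"primary handled {request}"

          def _secondary_handle(self, request):
              return f"secondary handled {request}"

          def handle_request(self, request):
              # Interface circuitry steers the request based on the failure event.
              if self.primary_failed:
                  return self._secondary_handle(request)
              return self._primary_handle(request)

      dev = MultiProcessorDevice()
      dev.handle_request("read 0x100")   # primary path
      dev.primary_failed = True          # failure event detected
      dev.handle_request("read 0x100")   # secondary takes over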
  • Publication number: 20250103508
    Abstract: Described are computational systems in which hosts share pooled memory on the same memory module. A memory buffer with access to the pooled memory manages which regions of the memory are allocated to the different hosts such that memory regions, and thus the data they contain, can be exchanged between hosts. Unidirectional or bidirectional data exchanges between hosts swap regions of equal size so the amount of memory allocated to each host is not changed as a result of the exchange.
    Type: Application
    Filed: August 22, 2024
    Publication date: March 27, 2025
    Inventors: Evan Lawrence Erickson, Taeksang Song, Christopher Haywood
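    Sketch (not from the patent): a toy Python model of an equal-size region exchange between two hosts sharing pooled memory; the addresses and names are hypothetical.

      class PooledMemoryBuffer:
          """Toy model: swap equally sized regions so each host's total allocation is unchanged."""

          def __init__(self):
              # host -> list of (base_address, size) regions currently allocated to it
              self.allocations = {
                  "hostA": [(0x0000_0000, 1 << 30)],
                  "hostB": [(0x4000_0000, 1 << 30)],
              }

          def exchange(self, host_a, region_a, host_b, region_b):
              if region_a[1] != region_b[1]:
                  raise ValueError("exchanged regions must be the same size")
              self.allocations[host_a].remove(region_a)
              self.allocations[host_b].remove(region_b)
              self.allocations[host_a].append(region_b)
              self.allocations[host_b].append(region_a)

      pool = PooledMemoryBuffer()
      pool.exchange("hostA", (0x0000_0000, 1 << 30), "hostB", (0x4000_0000, 1 << 30))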
  • Publication number: 20250053521
    Abstract: A memory system selectively compresses and/or decompresses pages of a memory array based on requests from a host device. Upon performing compression, the memory buffer device returns compression context metadata to the host device for storing in the page table of the host device to enable the host device to subsequently obtain data from the compressed page. The host device may subsequently send a request for the memory buffer device to perform decompression to a free page in the memory array for accessing by the host device, or the host device may directly access the compressed page for local decompression and storage.
    Type: Application
    Filed: July 26, 2024
    Publication date: February 13, 2025
    Inventor: Evan Lawrence Erickson
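    Sketch (not from the patent): a toy Python model in which the buffer compresses a page on request and returns context metadata for the host's page table; zlib stands in for whatever compression the device actually uses, and the metadata fields are hypothetical.

      import zlib

      class CompressingBuffer:
          """Toy model: compress a page on request, return context metadata for the host page table."""

          def __init__(self):
              self.memory = {}                    # page frame number -> bytes

          def compress_page(self, pfn: int) -> dict:
              raw = self.memory[pfn]
              self.memory[pfn] = zlib.compress(raw)
              # Compression context metadata the host records in its page table entry.
              return {"pfn": pfn, "algorithm": "deflate",
                      "compressed_len": len(self.memory[pfn]), "original_len": len(raw)}

          def decompress_page(self, context: dict, free_pfn: int) -> int:
              # Decompress into a free page so the host can access the data uncompressed.
              self.memory[free_pfn] = zlib.decompress(self.memory[context["pfn"]])
              return free_pfn

      buf = CompressingBuffer()
      buf.memory[7] = b"\x00" * 4096
      ctx = buf.compress_page(7)
      buf.decompress_page(ctx, free_pfn=9)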
  • Publication number: 20250047469
    Abstract: Techniques for providing reduced latency metadata encryption and decryption are described herein. A memory buffer device has a cryptographic circuit to receive first data and first metadata associated with the first data. The cryptographic circuit can encrypt or decrypt the first metadata using a first cryptographic algorithm. The cryptographic circuit can encrypt or decrypt the first data using a second cryptographic algorithm. The first data and the first metadata can be stored at the same location within a memory device, corresponding to a memory address.
    Type: Application
    Filed: May 21, 2024
    Publication date: February 6, 2025
    Inventors: Evan Lawrence Erickson, Michael Alexander Hamburg, Taeksang Song, Wendy Elsasser
  • Patent number: 12204446
    Abstract: A buffer/interface device of a memory node reads a block of data (e.g., page). As each unit of data (e.g., cache line sized) of the block is read, it is compared against one or more predefined patterns (e.g., all 0's, all 1's, etc.). If the block (page) is only storing one of the predefined patterns, a flag in the page table entry for the block is set to indicate the block is only storing one of the predefined patterns. The physical memory the block was occupying may then be deallocated so other data may be stored using those physical memory addresses.
    Type: Grant
    Filed: April 27, 2023
    Date of Patent: January 21, 2025
    Assignee: Rambus Inc.
    Inventors: Evan Lawrence Erickson, Christopher Haywood
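    Sketch (not from the patent): a minimal Python model of scanning a page cache-line by cache-line for a predefined pattern, flagging the page table entry, and freeing the physical frame; the data structures are hypothetical.

      LINE = 64
      PATTERNS = {b"\x00" * LINE: "all_zeros", b"\xff" * LINE: "all_ones"}

      def scan_page(page: bytes):
          """Return a pattern name if every cache-line-sized unit of the page holds that pattern."""
          first = page[:LINE]
          if first not in PATTERNS:
              return None
          for offset in range(LINE, len(page), LINE):
              if page[offset:offset + LINE] != first:
                  return None
          return PATTERNS[first]

      # Hypothetical bookkeeping: flag the pattern in the page table entry and free the frame.
      page_table = {0x42: {"pfn": 1234, "pattern": None}}
      physical_memory = {1234: b"\x00" * 4096}

      pattern = scan_page(physical_memory[1234])
      if pattern is not None:
          page_table[0x42]["pattern"] = pattern      # flag set in the page table entry
          page_table[0x42]["pfn"] = None
          del physical_memory[1234]                  # physical addresses can be reused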
  • Patent number: 12197602
    Abstract: A device includes interface circuitry to receive requests from at least one host system, a primary processor coupled to the interface circuitry, and a secure processor coupled to the primary processor. In response to a failure of the primary processor, the secure processor is to: verify a log retrieval command received via the interface circuitry, wherein the log retrieval command is cryptographically signed; in response to the verification, retrieve crash dump data stored in memory that is accessible by the primary processor; generate a log file that comprises the retrieved crash dump data; and cause the log file to be transmitted to the at least one host system over a sideband link that is coupled externally to the interface circuitry.
    Type: Grant
    Filed: October 18, 2022
    Date of Patent: January 14, 2025
    Assignee: Rambus Inc.
    Inventor: Evan Lawrence Erickson
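    Sketch (not from the patent): a toy Python model of the verified log-retrieval flow, with HMAC standing in for the cryptographic signature check and a list standing in for the sideband link; all names are hypothetical.

      import hashlib
      import hmac
      import json

      SIDEBAND_LOG = []          # stand-in for the sideband link to the host

      def handle_log_retrieval(command: bytes, tag: bytes, key: bytes, crash_dump: bytes):
          # HMAC is used here only as a stand-in for the device's signature verification.
          expected = hmac.new(key, command, hashlib.sha256).digest()
          if not hmac.compare_digest(expected, tag):
              return                                   # reject unverified commands
          log_file = json.dumps({"command": command.decode(), "dump": crash_dump.hex()})
          SIDEBAND_LOG.append(log_file)                # "transmit" over the sideband link

      key = b"\x01" * 32
      cmd = b"RETRIEVE_CRASH_LOG"
      handle_log_retrieval(cmd, hmac.new(key, cmd, hashlib.sha256).digest(), key, b"\xde\xad\xbe\xef")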
  • Patent number: 12174749
    Abstract: The creation, maintenance, and accessing of page tables is done by a virtual machine monitor running on a computing system rather than the guest operating systems. This allows page table walks to be completed in fewer memory accesses when compared to the guest operating system's maintenance of the page tables. In addition, the virtual machine monitor may utilize additional resources to offload page table access and maintenance functions from the CPU to another device, such as a page table management device or page table management node. Offloading some or all page table access and maintenance functions to a specialized device or node enables the CPU to perform other tasks during page table walks and/or other page table maintenance functions.
    Type: Grant
    Filed: January 14, 2022
    Date of Patent: December 24, 2024
    Assignee: Rambus Inc.
    Inventors: Steven C. Woo, Christopher Haywood, Evan Lawrence Erickson
  • Publication number: 20240388420
    Abstract: Systems and techniques for cryptographically protecting data in a computer memory are disclosed. The techniques include dividing the data into a first portion and a second portion, encrypting the first portion of the data to create a first stored form of the data, encrypting the second portion of the data, and storing, in the computer memory, the first stored form of the data and a second stored form of the data. The techniques include, to encrypt the second portion, calculating a hash based on the first stored form of the data, applying a first pseudorandom function to the hash to obtain a bit sequence, and combining the bit sequence with the second portion of the data to obtain the second stored form of the data.
    Type: Application
    Filed: May 9, 2024
    Publication date: November 21, 2024
    Inventors: Michael Alexander Hamburg, Evan Lawrence Erickson, Ajay Kapoor
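    Sketch (not from the patent): a toy Python rendering of the two-portion construction, with a keyed BLAKE2b keystream as a stand-in cipher and HMAC-SHA256 as the pseudorandom function; it only illustrates the data flow, not the actual scheme.

      import hashlib
      import hmac

      def protect(data: bytes, key1: bytes, key2: bytes, split: int):
          """Toy rendering of the two-portion flow; stand-in primitives, short inputs only."""
          first, second = data[:split], data[split:]
          assert 0 < len(first) <= 64 and len(second) <= 32   # keeps the single-block stand-ins valid
          # Encrypt the first portion (keyed BLAKE2b keystream as an illustrative stand-in cipher).
          ks = hashlib.blake2b(b"first-portion", key=key1, digest_size=len(first)).digest()
          first_stored = bytes(a ^ b for a, b in zip(first, ks))
          # Hash the first stored form, apply a PRF (HMAC-SHA256 here), combine with the second portion.
          h = hashlib.sha256(first_stored).digest()
          bit_sequence = hmac.new(key2, h, hashlib.sha256).digest()[:len(second)]
          second_stored = bytes(a ^ b for a, b in zip(second, bit_sequence))
          return first_stored, second_stored             # both stored forms go to memory

      protect(b"0123456789abcdef" * 2, b"k1" * 16, b"k2" * 16, split=16)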
  • Patent number: 12147351
    Abstract: Memory pages are background-relocated from a low-latency local operating memory of a server computer to a higher-latency memory installation that enables high-resolution access monitoring and thus access-demand differentiation among the relocated memory pages. Higher access-demand memory pages are background-restored to the low-latency operating memory, while lower access-demand pages are maintained in the higher-latency memory installation and yet-lower access-demand pages are optionally moved to a yet-higher-latency memory installation.
    Type: Grant
    Filed: April 25, 2023
    Date of Patent: November 19, 2024
    Assignee: Rambus Inc.
    Inventors: Evan Lawrence Erickson, Christopher Haywood, Mark D. Kellam
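    Sketch (not from the patent): a toy Python model of a background pass that restores high-demand pages to the fast tier while low-demand pages stay in the monitored, higher-latency tier; the threshold and names are hypothetical.

      from collections import Counter

      class TieredMemory:
          """Toy model: pages migrate between a fast local tier and a slower, monitored tier."""

          def __init__(self, hot_threshold=100):
              self.fast_tier = set()            # low-latency local operating memory
              self.slow_tier = set()            # higher-latency installation with fine-grained monitoring
              self.access_counts = Counter()    # high-resolution monitoring only on the slow tier
              self.hot_threshold = hot_threshold

          def record_access(self, page):
              if page in self.slow_tier:
                  self.access_counts[page] += 1

          def background_pass(self):
              # Restore high-demand pages to the fast tier; lower-demand pages stay put.
              for page, count in list(self.access_counts.items()):
                  if count >= self.hot_threshold:
                      self.slow_tier.discard(page)
                      self.fast_tier.add(page)
                      del self.access_counts[page]

      mem = TieredMemory(hot_threshold=2)
      mem.slow_tier.add("page7")
      mem.record_access("page7"); mem.record_access("page7")
      mem.background_pass()                    # "page7" is restored to the fast tier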
  • Patent number: 12130772
    Abstract: A multi-processor device is disclosed. The multi-processor device includes interface circuitry to receive requests from at least one host device. A primary processor is coupled to the interface circuitry to process the requests in the absence of a failure event associated with the primary processor. A secondary processor processes operations on behalf of the primary processor and selectively receives the requests from the interface circuitry based on detection of the failure event associated with the primary processor.
    Type: Grant
    Filed: October 24, 2022
    Date of Patent: October 29, 2024
    Assignee: Rambus Inc.
    Inventors: Michael Raymond Miller, Evan Lawrence Erickson
  • Patent number: 12072504
    Abstract: Head-mounted display assemblies may include a first eyecup and a second eyecup that are configured for respectively positioning a first lens and a second lens in front of intended locations of a user's eyes when the head-mounted display assembly is donned. The first eyecup and the second eyecup may be movable relative to each other to adjust for an interpupillary distance of the user's eyes. A single near-eye display screen may be configured for displaying an image to the user through the first and second eyecups. An enclosure over the single near-eye display screen may include a first transparent component positioned between the first lens and the single near-eye display screen and a second transparent component positioned between the second lens and the single near-eye display screen. Various other methods, devices, systems, and assemblies are also disclosed.
    Type: Grant
    Filed: April 17, 2023
    Date of Patent: August 27, 2024
    Assignee: Meta Platforms Technologies, LLC
    Inventors: Jeffrey Taylor Stellman, Nirav Rajendra Patel, Samuel Redmond D'Amico, Wei Rong, Evan Lawrence Coons, Joseph Patrick Sullivan
  • Patent number: D1080860
    Type: Grant
    Filed: March 30, 2023
    Date of Patent: June 24, 2025
    Assignee: Aeronics, Inc.
    Inventors: Blake Dube, Evan Lawrence, Mark Spitz, John Puskar-Pasewicz, William Grosskopf, Joshua Lederer
  • Patent number: D1080861
    Type: Grant
    Filed: March 30, 2023
    Date of Patent: June 24, 2025
    Assignee: Aeronics, Inc.
    Inventors: Blake Dube, Evan Lawrence, Mark Spitz, John Puskar-Pasewicz, William Grosskopf, Joshua Lederer