Patents by Inventor Michael Raymond Miller
Michael Raymond Miller has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20250036304
Abstract: A control component implements pipelined data processing operations in either of two timing domains bridged by a domain-crossing circuit according to one or more configuration signals that indicate relative clock frequencies of the two domains and/or otherwise indicate which of the two timing domains will complete the data processing operations with the lowest latency.
Type: Application
Filed: July 29, 2024
Publication date: January 30, 2025
Inventors: Michael Raymond Miller, Dongyun Lee
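The abstract describes steering pipelined operations into whichever of two clock domains finishes soonest, based on configuration signals encoding relative clock frequencies. The behavioral sketch below is illustrative only; the function name, frequencies, and stage count are assumptions, not details from the patent.

```python
# Behavioral sketch: choose the timing domain that completes a fixed
# number of pipeline stages soonest, given the two clock frequencies.
# All names and numbers are illustrative assumptions, not from the patent.

def pick_domain(freq_a_mhz: float, freq_b_mhz: float, stages: int = 5) -> str:
    """Return which domain ('A' or 'B') finishes `stages` cycles first."""
    latency_a_ns = stages * 1000.0 / freq_a_mhz   # time = cycles / frequency
    latency_b_ns = stages * 1000.0 / freq_b_mhz
    return "A" if latency_a_ns <= latency_b_ns else "B"

# Example: a 1600 MHz domain beats an 800 MHz domain for the same 5 stages.
print(pick_domain(1600.0, 800.0))   # -> 'A'
```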
-
Publication number: 20250028467
Abstract: An interconnected stack of one or more Dynamic Random Access Memory (DRAM) die also has one or more custom logic, controller, or processor die. The custom die(s) of the stack include direct channel interfaces that allow direct access to memory regions on one or more DRAMs in the stack. The direct channels are time-division multiplexed such that each DRAM die is associated with a time slot on a direct channel. The custom die configures a first DRAM die to read a block of data and transmit it via the direct channel using a time slot that is assigned to a second DRAM die. The custom die also configures the second DRAM die to receive that block of data in its assigned time slot and write it.
Type: Application
Filed: August 5, 2024
Publication date: January 23, 2025
Inventors: Michael Raymond Miller, Steven C. Woo, Thomas Vogelsang
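To make the time-slot idea concrete, here is a minimal Python sketch of a die-to-die copy over a time-division-multiplexed direct channel. The class, slot assignments, and addresses are hypothetical; the patent does not specify this interface.

```python
# Sketch of die-to-die copy over a time-division-multiplexed direct channel.
# Slot numbers, die names, and method names are illustrative assumptions.

class DramDie:
    def __init__(self, name):
        self.name = name
        self.rows = {}

    def read(self, addr):
        return self.rows.get(addr)

    def write(self, addr, data):
        self.rows[addr] = data

def copy_block(channel_slots, src: DramDie, dst: DramDie, src_addr, dst_addr):
    """Custom logic die schedules src to transmit in dst's slot; dst captures it."""
    dst_slot = channel_slots[dst.name]        # slot normally assigned to dst
    data = src.read(src_addr)                 # src reads the block
    bus = {dst_slot: data}                    # src drives the channel in dst's slot
    dst.write(dst_addr, bus[dst_slot])        # dst samples its slot and writes

slots = {"die0": 0, "die1": 1}
die0, die1 = DramDie("die0"), DramDie("die1")
die0.write(0x100, b"block")
copy_block(slots, die0, die1, 0x100, 0x200)
print(die1.read(0x200))                       # -> b'block'
```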
-
Publication number: 20250021484
Abstract: Disclosed is a dynamic random access memory (DRAM) that has columns, data rows, tag rows and comparators. Each comparator compares address bits and tag information bits from the tag rows to determine a cache hit and generate address bits to access data information in the DRAM as a multiway set associative cache.
Type: Application
Filed: July 24, 2024
Publication date: January 16, 2025
Inventors: Thomas Vogelsang, Frederick A. Ware, Michael Raymond Miller, Collins Williams
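As a rough software analogy to the per-way comparators, the sketch below performs a multiway set-associative lookup: the address tag is compared against each way's stored tag, and the matching way index is what would drive the generated address bits. The way count and tag values are assumptions for illustration.

```python
# Sketch of a multiway set-associative lookup via per-way tag comparison.
# Way count and tag values are illustrative assumptions.

NUM_WAYS = 4

def lookup(tag_rows, set_index: int, addr_tag: int):
    """Compare the address tag against each way's stored tag for the set.
    Returns (hit, way) so the way index can form the data-access address bits."""
    for way in range(NUM_WAYS):
        if tag_rows[set_index][way] == addr_tag:
            return True, way
    return False, None

tags = {0: [0x12, 0x34, 0x56, 0x78]}
print(lookup(tags, 0, 0x56))   # -> (True, 2): hit in way 2
print(lookup(tags, 0, 0x99))   # -> (False, None): miss
```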
-
Publication number: 20240394195
Abstract: A dynamic random access memory (DRAM) device includes functions configured to aid with operating the DRAM device as part of data caching functions. The DRAM is configured to respond to at least two types of commands. A first type of command (cache data access command) seeks to access a cache line of data, if present in the DRAM cache. A second type of command (cache probe command) seeks to determine whether a cache line of data is present, but does not request that the data be returned in response. In response to these types of access commands, the DRAM device is configured to receive cache tag query values and to compare stored cache tag values with the cache tag query values. A hit/miss (HM) interface/bus may indicate the result of the cache tag compare and stored cache line status bits to a controller.
Type: Application
Filed: May 15, 2024
Publication date: November 28, 2024
Inventors: Steven C. Woo, Michael Raymond Miller, Taeksang Song, Wendy Elsasser, Maryam Babaie
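The distinction between the two command types can be modeled as follows; this is a hedged sketch in which the cache structure, status encoding, and return format are assumptions rather than anything specified by the application.

```python
# Sketch of the two command types: a data-access command returns the cache
# line on a hit, while a probe command reports hit/miss and status bits on
# an HM-style interface without returning data. Encodings are assumptions.

cache = {0x40: ("line-data", {"valid": 1, "dirty": 0})}   # tag -> (data, status)

def cache_access(tag_query):
    if tag_query in cache:
        data, status = cache[tag_query]
        return {"hm": "hit", "status": status, "data": data}
    return {"hm": "miss", "status": None, "data": None}

def cache_probe(tag_query):
    # Same tag compare, but no data is driven back; only HM + status bits.
    if tag_query in cache:
        _, status = cache[tag_query]
        return {"hm": "hit", "status": status}
    return {"hm": "miss", "status": None}

print(cache_access(0x40))   # hit: data returned
print(cache_probe(0x41))    # miss: hit/miss indication only
```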
-
Patent number: 12130772
Abstract: A multi-processor device is disclosed. The multi-processor device includes interface circuitry to receive requests from at least one host device. A primary processor is coupled to the interface circuitry to process the requests in the absence of a failure event associated with the primary processor. A secondary processor processes operations on behalf of the primary processor and selectively receives the requests from the interface circuitry based on detection of the failure event associated with the primary processor.
Type: Grant
Filed: October 24, 2022
Date of Patent: October 29, 2024
Assignee: Rambus Inc.
Inventors: Michael Raymond Miller, Evan Lawrence Erickson
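A minimal failover sketch of the request routing described above is shown below. The `Processor` class, the failure flag, and the dispatch function are hypothetical stand-ins for the interface circuitry and failure detection; they are not taken from the patent.

```python
# Sketch of routing host requests to a secondary processor only after a
# failure event on the primary is detected. Names are illustrative assumptions.

class Processor:
    def __init__(self, name):
        self.name = name
        self.failed = False

    def handle(self, request):
        return f"{self.name} handled {request}"

def dispatch(request, primary: Processor, secondary: Processor):
    """Interface circuitry: requests go to the primary unless it has failed."""
    target = secondary if primary.failed else primary
    return target.handle(request)

primary, secondary = Processor("primary"), Processor("secondary")
print(dispatch("read A", primary, secondary))    # primary handles it
primary.failed = True                            # failure event detected
print(dispatch("read B", primary, secondary))    # secondary takes over
```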
-
Publication number: 20240354014
Abstract: A memory system includes two or more memory controllers capable of accessing the same dynamic, random-access memory (DRAM), one controller having access to the DRAM or a subset of the DRAM at a time. Different subsets of the DRAM are supported with different refresh-control circuitry, including respective refresh-address counters. Whichever controller has access to a given subset of the DRAM issues refresh requests to the corresponding refresh-address counter. Counters are synchronized before control of a given subset of the DRAM is transferred between controllers to avoid a loss of stored data.
Type: Application
Filed: May 6, 2024
Publication date: October 24, 2024
Inventors: Thomas Vogelsang, Steven C. Woo, Michael Raymond Miller
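The counter-synchronization step can be illustrated with a short model: the incoming controller's refresh-address counter is set to the outgoing controller's position before the handoff, so refresh resumes where it left off. Row count and the handoff function are assumptions for the sketch.

```python
# Sketch of synchronizing refresh-address counters before handing control of
# a DRAM subset between controllers, so no rows miss their refresh.
# The counter width and handoff protocol are illustrative assumptions.

ROWS_PER_SUBSET = 8192

class RefreshCounter:
    def __init__(self):
        self.next_row = 0

    def refresh_request(self):
        row = self.next_row
        self.next_row = (self.next_row + 1) % ROWS_PER_SUBSET
        return row

def hand_off(owning: RefreshCounter, incoming: RefreshCounter):
    """Copy the current refresh position so the new owner resumes where the
    previous owner stopped, avoiding skipped (and thus decayed) rows."""
    incoming.next_row = owning.next_row

ctrl_a, ctrl_b = RefreshCounter(), RefreshCounter()
for _ in range(100):
    ctrl_a.refresh_request()        # controller A refreshes rows 0..99
hand_off(ctrl_a, ctrl_b)            # synchronize before transferring control
print(ctrl_b.refresh_request())     # -> 100: controller B continues seamlessly
```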
-
Publication number: 20240354191
Abstract: Accessing data and error correction information may involve accessing multiple data channels (e.g., 8) and one error detection and correction channel concurrently. This technique requires a total of N+1 row requests for each access, where N is the number of data channels (e.g., 8 data row accesses plus 1 error detection and correction row access equals 9 row accesses). Instead, a single data channel row (or at least fewer than N data channel rows) may be accessed concurrently with a single error detection and correction row. This reduces the number of row requests to two: one for the data and one for the error detection and correction information. Because row requests consume power, reducing the number of row requests is more power efficient.
Type: Application
Filed: April 29, 2024
Publication date: October 24, 2024
Inventors: Michael Raymond Miller, Stephen Magee, John Eric Linstadt
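The savings are just the arithmetic from the abstract's own example, worked through below (the channel count is the abstract's example value, not a claim about any particular device).

```python
# Worked comparison of row-activation counts for the two access patterns,
# using the abstract's example of 8 data channels plus 1 EDC channel.

data_channels = 8

rows_all_channels = data_channels + 1   # one row per data channel + 1 EDC row
rows_single_channel = 1 + 1             # one data row + one EDC row

print(rows_all_channels)     # -> 9 row activations per access
print(rows_single_channel)   # -> 2 row activations per access
```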
-
Publication number: 20240311334
Abstract: A stacked processor-plus-memory device includes a processing die with an array of processing elements of an artificial neural network. Each processing element multiplies a first operand (e.g., a weight) by a second operand to produce a partial result that is passed to a subsequent processing element. To prepare for these computations, a sequencer loads the weights into the processing elements as a sequence of operands that step through the processing elements, each operand stored in the corresponding processing element. The operands can be sequenced directly from memory to the processing elements or can be stored first in cache. The processing elements include streaming logic that disregards interruptions in the stream of operands.
Type: Application
Filed: April 2, 2024
Publication date: September 19, 2024
Inventors: Steven C. Woo, Michael Raymond Miller
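The weight-loading step resembles a shift-register fill: each new operand enters the first element while earlier operands step downstream until every element holds one. The sketch below models only that loading behavior; chain length and values are assumptions.

```python
# Sketch of a sequencer stepping a stream of weights through a chain of
# processing elements: each step, existing operands shift downstream and the
# next weight enters element 0. Sizes and values are illustrative assumptions.

def load_weights(weights, num_elements):
    """Step the weight stream through the PE chain until each PE holds one."""
    pes = [None] * num_elements
    for w in weights:
        pes = [w] + pes[:-1]     # shift downstream, inject new weight at PE 0
    return pes

print(load_weights([3, 5, 7, 9], 4))   # -> [9, 7, 5, 3]: last weight sits at PE 0
```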
-
Publication number: 20240311301
Abstract: A dynamic random access memory (DRAM) device includes functions configured to aid with operating the DRAM device as part of data caching functions. In response to some write and/or read access commands, the DRAM device is configured to copy a cache line (e.g., dirty cache line) from the main DRAM memory array, place it in a flush buffer, and replace the copied cache line in the main DRAM memory array with a new (e.g., different) cache line of data. In response to conditions and/or events (e.g., explicit command, refresh, write-to-read command sequence, unused data bus bandwidth, full flush buffer, etc.), the DRAM device transmits the cache line from the flush buffer to the controller. The controller may then transmit the cache line to other cache levels.
Type: Application
Filed: March 7, 2024
Publication date: September 19, 2024
Inventors: Michael Raymond Miller, Steven C. Woo, Wendy Elsasser, Taeksang Song
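A small model of the flush-buffer flow: the resident (possibly dirty) line is moved to the buffer, the new line takes its place, and the buffer is drained later when some trigger fires. The data structures and trigger choice below are assumptions made for illustration.

```python
# Sketch of the flush-buffer behavior: evict the resident line into a flush
# buffer, install the new line, and drain the buffer back to the controller
# later (e.g., on an explicit command, refresh, or idle bus). Illustrative only.

dram_array = {0: "dirty-line-A"}    # set index -> resident cache line
flush_buffer = []

def install_new_line(set_index, new_line):
    flush_buffer.append(dram_array[set_index])   # move old line into the buffer
    dram_array[set_index] = new_line              # replace it with the new line

def drain_flush_buffer():
    """Triggered by an explicit command, refresh, or unused bus bandwidth."""
    return [flush_buffer.pop(0) for _ in range(len(flush_buffer))]

install_new_line(0, "line-B")
print(dram_array[0])          # -> 'line-B'
print(drain_flush_buffer())   # -> ['dirty-line-A'] sent to the controller
```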
-
Patent number: 12093180
Abstract: A device includes a memory controller and a cache memory coupled to the memory controller. The cache memory has a first set of cache lines associated with a first memory block and comprising a first plurality of cache storage locations, as well as a second set of cache lines associated with a second memory block and comprising a second plurality of cache storage locations. A first location of the second plurality of cache storage locations comprises cache tag data for both the first set of cache lines and the second set of cache lines.
Type: Grant
Filed: June 29, 2022
Date of Patent: September 17, 2024
Assignee: Rambus Inc.
Inventors: Michael Raymond Miller, Dennis Doidge, Collins Williams
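One way to picture the shared tag-storage layout is a single location in the second set holding the tag metadata for both sets, as in the sketch below. The field names and which location is reserved are assumptions, not details from the patent.

```python
# Sketch of the shared tag-storage layout: one location in the second set of
# cache lines holds tag data for both sets. Field names are illustrative.

second_set_locations = [
    {"tags_set1": [0x1A, 0x1B], "tags_set2": [0x2A, 0x2B]},  # shared tag entry
    "set2-line-0",
    "set2-line-1",
]
first_set_locations = ["set1-line-0", "set1-line-1"]

def tags_for(set_id: int):
    shared = second_set_locations[0]
    return shared["tags_set1"] if set_id == 1 else shared["tags_set2"]

print(tags_for(1))   # tag data for the first set, read from the second set's location
```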
-
Patent number: 12086441
Abstract: An interconnected stack of one or more Dynamic Random Access Memory (DRAM) die also has one or more custom logic, controller, or processor die. The custom die(s) of the stack include direct channel interfaces that allow direct access to memory regions on one or more DRAMs in the stack. The direct channels are time-division multiplexed such that each DRAM die is associated with a time slot on a direct channel. The custom die configures a first DRAM die to read a block of data and transmit it via the direct channel using a time slot that is assigned to a second DRAM die. The custom die also configures the second DRAM die to receive that block of data in its assigned time slot and write it.
Type: Grant
Filed: August 30, 2021
Date of Patent: September 10, 2024
Assignee: Rambus Inc.
Inventors: Michael Raymond Miller, Steven C. Woo, Thomas Vogelsang
-
Publication number: 20240295961
Abstract: An integrated circuit (IC) memory device includes an array of storage cells configured into multiple banks. Interface circuitry receives refresh commands from a host memory controller to refresh the multiple banks for a first refresh mode. On-die refresh control circuitry selectively generates local refresh commands to refresh the multiple banks in cooperation with the host memory controller during a designated hidden refresh interval in a second refresh mode. Mode register circuitry stores a value indicating whether the on-die refresh control circuitry is enabled for use during the second refresh mode. The interface circuitry includes backchannel control circuitry to transmit a corrective action control signal during operation in the second refresh mode.
Type: Application
Filed: March 7, 2024
Publication date: September 5, 2024
Inventors: Michael Raymond Miller, Steven C. Woo, Thomas Vogelsang
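The second refresh mode can be summarized in a few lines of behavioral pseudocode: if the mode-register bit is set and the device is inside the hidden interval, the on-die logic issues its own refresh commands and can raise a corrective-action signal on the backchannel. The register name and flag strings below are assumptions for the sketch.

```python
# Sketch of the hidden-refresh mode: a mode-register bit enables on-die
# refresh generation during the hidden interval, with a backchannel flag for
# corrective action. Names and encodings are illustrative assumptions.

mode_register = {"hidden_refresh_enable": 1}

def refresh_tick(in_hidden_interval: bool, needs_corrective_action: bool):
    actions = []
    if mode_register["hidden_refresh_enable"] and in_hidden_interval:
        actions.append("local refresh command")            # generated on-die
        if needs_corrective_action:
            actions.append("backchannel: corrective action")
    return actions

print(refresh_tick(True, False))   # -> ['local refresh command']
print(refresh_tick(True, True))    # -> includes the backchannel signal
print(refresh_tick(False, False))  # -> []: host controls refresh as usual
```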
-
Patent number: 12073111
Abstract: A control component implements pipelined data processing operations in either of two timing domains bridged by a domain-crossing circuit according to one or more configuration signals that indicate relative clock frequencies of the two domains and/or otherwise indicate which of the two timing domains will complete the data processing operations with the lowest latency.
Type: Grant
Filed: September 8, 2022
Date of Patent: August 27, 2024
Assignee: Rambus Inc.
Inventors: Michael Raymond Miller, Dongyun Lee
-
Patent number: 12072807
Abstract: Disclosed is a dynamic random access memory (DRAM) that has columns, data rows, tag rows and comparators. Each comparator compares address bits and tag information bits from the tag rows to determine a cache hit and generate address bits to access data information in the DRAM as a multiway set associative cache.
Type: Grant
Filed: May 31, 2019
Date of Patent: August 27, 2024
Assignee: Rambus Inc.
Inventors: Thomas Vogelsang, Frederick A. Ware, Michael Raymond Miller, Collins Williams
-
Publication number: 20240241670
Abstract: An interconnected stack of one or more Dynamic Random Access Memory (DRAM) die has a base logic die and one or more custom logic or processor die. The processor logic die snoops commands sent to and through the stack. In particular, the processor logic die may snoop mode setting commands (e.g., mode register set (MRS) commands). At least one mode setting command that is ignored by the DRAM in the stack is used to communicate a command to the processor logic die. In response, the processor logic die may prevent commands, addresses, and data from reaching the DRAM die(s). This enables the processor logic die to send commands/addresses and communicate data with the DRAM die(s). While able to send commands/addresses and communicate data with the DRAM die(s), the processor logic die may execute software using the DRAM die(s) for program and/or data storage and retrieval.
Type: Application
Filed: January 30, 2024
Publication date: July 18, 2024
Inventors: Thomas Vogelsang, Michael Raymond Miller, Steven C. Woo
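The snooping mechanism can be modeled simply: the logic die watches the command stream, and when it sees a mode-setting command the DRAM ignores, it takes over the command/address/data path. The reserved MRS encoding and command format below are assumptions made only for the sketch.

```python
# Sketch of command snooping: a reserved mode-register-set command (ignored
# by the DRAM) signals the logic die to claim the interface for its own
# accesses. Encodings and command tuples are illustrative assumptions.

RESERVED_MRS = ("MRS", 0x3F)     # assumed encoding the DRAM does not act on

class LogicDie:
    def __init__(self):
        self.owns_dram = False

    def snoop(self, command):
        if command == RESERVED_MRS:
            self.owns_dram = True            # take over command/address/data
            return "blocked from DRAM"
        if self.owns_dram:
            return "handled by logic die"    # logic die runs its own accesses
        return "forwarded to DRAM"

die = LogicDie()
print(die.snoop(("ACT", 0x10)))      # -> forwarded to DRAM
print(die.snoop(RESERVED_MRS))       # -> blocked from DRAM (takeover signal)
print(die.snoop(("RD", 0x10)))       # -> handled by logic die
```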
-
Patent number: 12001697
Abstract: A memory system includes two or more memory controllers capable of accessing the same dynamic, random-access memory (DRAM), one controller having access to the DRAM or a subset of the DRAM at a time. Different subsets of the DRAM are supported with different refresh-control circuitry, including respective refresh-address counters. Whichever controller has access to a given subset of the DRAM issues refresh requests to the corresponding refresh-address counter. Counters are synchronized before control of a given subset of the DRAM is transferred between controllers to avoid a loss of stored data.
Type: Grant
Filed: October 15, 2021
Date of Patent: June 4, 2024
Assignee: Rambus Inc.
Inventors: Thomas Vogelsang, Steven C. Woo, Michael Raymond Miller
-
Patent number: 12001283
Abstract: Accessing data and error correction information may involve accessing multiple data channels (e.g., 8) and one error detection and correction channel concurrently. This technique requires a total of N+1 row requests for each access, where N is the number of data channels (e.g., 8 data row accesses plus 1 error detection and correction row access equals 9 row accesses). Instead, a single data channel row (or at least fewer than N data channel rows) may be accessed concurrently with a single error detection and correction row. This reduces the number of row requests to two: one for the data and one for the error detection and correction information. Because row requests consume power, reducing the number of row requests is more power efficient.
Type: Grant
Filed: April 4, 2023
Date of Patent: June 4, 2024
Assignee: Rambus Inc.
Inventors: Michael Raymond Miller, Stephen Magee, John Eric Linstadt
-
Publication number: 20240153548
Abstract: Disclosed is a memory system including a memory component having at least one tag row, at least one data row, and multiple ways to hold a data group as a cache-line or cache-block. The memory system includes a memory controller connectable to the memory component to implement a cache, with the memory controller and the memory component operable in each of a plurality of operating modes, including first and second operating modes having differing addressing and timing requirements for accessing the data group. The first operating mode places each of at least two ways of a data group in differing rows in the memory component, with tag access and data access not overlapped. The second operating mode places all ways of a data group in the same row in the memory component, with tag access and data access overlapped.
Type: Application
Filed: November 6, 2023
Publication date: May 9, 2024
Inventors: Frederick A. Ware, Thomas Vogelsang, Michael Raymond Miller
-
Patent number: 11960438
Abstract: A stacked processor-plus-memory device includes a processing die with an array of processing elements of an artificial neural network. Each processing element multiplies a first operand (e.g., a weight) by a second operand to produce a partial result that is passed to a subsequent processing element. To prepare for these computations, a sequencer loads the weights into the processing elements as a sequence of operands that step through the processing elements, each operand stored in the corresponding processing element. The operands can be sequenced directly from memory to the processing elements or can be stored first in cache. The processing elements include streaming logic that disregards interruptions in the stream of operands.
Type: Grant
Filed: August 24, 2021
Date of Patent: April 16, 2024
Assignee: Rambus Inc.
Inventors: Steven C. Woo, Michael Raymond Miller
-
Publication number: 20240119989
Abstract: Row hammer is mitigated by issuing, to a memory device, mitigation operation (MOP) commands in order to cause the refresh of rows in a specified vicinity of a suspected aggressor row. These mitigation operation commands are each associated with a respective row address that indicates the suspected aggressor row and an indicator of which neighbor row in the vicinity of the suspected aggressor row is to be refreshed. The mitigation operation commands are issued in response to a fixed number of activate commands. The suspected aggressor row is selected by randomly choosing, with equal probability, one of the N previous activate commands to supply its associated row address as the suspected aggressor row address. The neighbor row may be selected randomly with a probability that diminishes inversely with the distance between the suspected aggressor row and the neighbor row.
Type: Application
Filed: October 2, 2023
Publication date: April 11, 2024
Inventors: Steven C. Woo, Michael Raymond Miller
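The sampling scheme described in the abstract can be sketched directly: choose the suspected aggressor uniformly from the last N activated rows, then choose a neighbor with probability that falls off roughly as 1/distance. The values of N, the distance range, and the command dictionary below are assumptions for illustration, not parameters from the application.

```python
# Sketch of the row hammer mitigation sampling: uniform choice over the last
# N activates for the aggressor, ~1/distance choice for the neighbor to
# refresh. N, MAX_DISTANCE, and the MOP format are illustrative assumptions.
import random

N = 8                 # depth of the recent-activate history
MAX_DISTANCE = 4      # how far from the aggressor a refreshed neighbor may be

def pick_mitigation(recent_activates):
    aggressor = random.choice(recent_activates[-N:])    # uniform over last N
    distances = list(range(1, MAX_DISTANCE + 1))
    weights = [1.0 / d for d in distances]              # ~1/distance falloff
    offset = random.choices(distances, weights=weights)[0]
    side = random.choice([-1, 1])                       # row above or below
    return {"mop": "refresh", "aggressor": aggressor,
            "victim": aggressor + side * offset}

history = [100, 205, 310, 205, 417, 205, 523, 610]
print(pick_mitigation(history))   # e.g. {'mop': 'refresh', 'aggressor': 205, 'victim': 204}
```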