Patents by Inventor Shane Keil
Shane Keil has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 9490847
Abstract: One embodiment of the present invention sets forth a technique for protecting data with an error correction code (ECC). The data is accessed by a processing unit and stored in an external memory, such as dynamic random access memory (DRAM). Application data and related ECC data are advantageously stored in a common page within a common DRAM device. Application data and ECC data are transmitted between the processor and the external common DRAM device over a common set of input/output (I/O) pins. Eliminating I/O pins and DRAM devices conventionally associated with transmitting and storing ECC data advantageously reduces system complexity and cost.
Type: Grant
Filed: October 25, 2012
Date of Patent: November 8, 2016
Assignee: NVIDIA Corporation
Inventors: Fred Gruner, Shane Keil, John S. Montrym
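The co-location of application data and ECC data in one DRAM page lends itself to a short illustration. The following is a minimal sketch, assuming a 2 KB page and one ECC byte per eight data bytes (both figures are assumptions, not taken from the patent; the names `PageLocation` and `mapAddress` are hypothetical), of how a data byte and its ECC byte could be mapped into the same page so that both are reached over the same I/O pins.

```cpp
// Minimal sketch: co-locating application data and its ECC bytes in one
// DRAM page. Page size, ECC ratio, and helper names are assumptions for
// illustration, not the patented layout.
#include <cstdint>
#include <cstdio>

constexpr uint32_t kPageBytes = 2048;                          // assumed DRAM page size
constexpr uint32_t kEccRatio  = 8;                             // assumed: 1 ECC byte per 8 data bytes
constexpr uint32_t kEccBytes  = kPageBytes / (kEccRatio + 1);  // bytes reserved per page for ECC
constexpr uint32_t kDataBytes = kPageBytes - kEccBytes;        // usable data bytes per page

struct PageLocation {
    uint32_t page;        // DRAM page (row) index
    uint32_t dataOffset;  // offset of the data byte inside the page
    uint32_t eccOffset;   // offset of its ECC byte, in the SAME page
};

// Map a linear application address to a data byte and its ECC byte, both
// inside one page, so a single page activation covers both accesses.
PageLocation mapAddress(uint64_t appAddr) {
    PageLocation loc;
    loc.page       = static_cast<uint32_t>(appAddr / kDataBytes);
    loc.dataOffset = static_cast<uint32_t>(appAddr % kDataBytes);
    loc.eccOffset  = kDataBytes + (loc.dataOffset / kEccRatio);
    return loc;
}

int main() {
    PageLocation loc = mapAddress(5000);
    std::printf("page=%u data@%u ecc@%u\n", loc.page, loc.dataOffset, loc.eccOffset);
}
```

Because the ECC byte lives in the tail of the same page as the data it covers, no separate DRAM device or extra pin group is needed to fetch it.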
-
Patent number: 8656093
Abstract: One embodiment of the invention sets forth a mechanism to transmit commands received from an L2 cache to a bank page within the DRAM. An arbiter unit determines which commands from a command sorter to transmit to a command queue. An activate command associated with the bank page related to the commands is also transmitted to an activate queue. The last command in the command queue is marked as “last.” An interlock counter stores a count of “last” commands in the read/write command queue. A DRAM controller transmits activate commands and read/write commands from the activate queue and the command queue to the DRAM. Each time a command marked as “last” is encountered, the DRAM controller decrements the interlock counter. If the count in the interlock counter is zero, then the command marked as “last” is marked as “auto-precharge.” The “auto-precharge” command, when processed, causes the bank page to be closed.
Type: Grant
Filed: December 1, 2008
Date of Patent: February 18, 2014
Assignee: NVIDIA Corporation
Inventors: John H. Edmondson, Shane Keil
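As a rough illustration of the “last”/interlock/auto-precharge flow described in this abstract, here is a minimal software sketch. The `Command` fields and `DramController` methods are hypothetical names modeling the queue behavior, not the patented hardware.

```cpp
// Minimal sketch of the "last"/auto-precharge handshake: a "last" command
// only gains auto-precharge when no further "last" commands remain queued
// for the bank page. Names and queue layout are illustrative assumptions.
#include <cstdint>
#include <cstdio>
#include <deque>

struct Command {
    uint32_t bank;
    uint32_t row;
    bool     isLast;         // last command of its group for this bank page
    bool     autoPrecharge;  // close the page after this command completes
};

class DramController {
public:
    void enqueue(Command cmd) {
        if (cmd.isLast) ++interlock_;  // interlock counts "last" commands in the queue
        queue_.push_back(cmd);
    }

    // Pop the next command; when a "last" command is issued and the interlock
    // count drops to zero, convert it to auto-precharge so the bank page closes.
    bool issueNext(Command* out) {
        if (queue_.empty()) return false;
        Command cmd = queue_.front();
        queue_.pop_front();
        if (cmd.isLast) {
            --interlock_;
            if (interlock_ == 0) cmd.autoPrecharge = true;
        }
        *out = cmd;
        return true;
    }

private:
    std::deque<Command> queue_;
    uint32_t interlock_ = 0;
};

int main() {
    DramController ctrl;
    ctrl.enqueue({0, 7, false, false});
    ctrl.enqueue({0, 7, true,  false});  // last command of the first group
    ctrl.enqueue({0, 7, false, false});
    ctrl.enqueue({0, 7, true,  false});  // last command of the final group
    Command c;
    while (ctrl.issueNext(&c))
        std::printf("bank=%u row=%u last=%d autoPre=%d\n", c.bank, c.row, c.isLast, c.autoPrecharge);
}
```

In this sketch only the final “last” command picks up auto-precharge, since an earlier group still pending for the same page keeps the interlock count above zero.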
-
Patent number: 8489839
Abstract: The memory splitter chip couples multiple DRAM units to the PPU, thereby expanding the memory capacity available to the PPU for storing data and increasing the overall performance of the graphics processing system. The memory splitter chip includes logic for managing the transmission of data between the PPU and the DRAM units when the transmission frequencies and the burst lengths of the PPU interface and the DRAM interfaces differ. Specifically, the memory splitter chip implements an overlapping transmission mode, a pairing transmission mode, or a combination of the two modes when the transmission frequencies or the burst lengths differ.
Type: Grant
Filed: December 16, 2009
Date of Patent: July 16, 2013
Assignee: Nvidia Corporation
Inventors: Ashish Karandikar, Kaustubh Sanghani, Jonah M. Alben, Shane Keil
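The pairing transmission mode mentioned above can be pictured with a small sketch. Assuming a DRAM-side burst of 4 beats and a PPU-side burst of 8 beats (arbitrary figures, not from the patent), two DRAM bursts from separate units could be paired into one processor-side burst roughly as follows; the names are illustrative.

```cpp
// Minimal sketch of a "pairing" transmission mode: two shorter DRAM bursts
// are combined into one longer burst on the processor interface when the
// burst lengths differ. Burst lengths and names are assumptions.
#include <array>
#include <cstdint>
#include <cstdio>

constexpr size_t kDramBurst = 4;  // assumed DRAM-side burst length (beats)
constexpr size_t kPpuBurst  = 8;  // assumed PPU-side burst length (beats)

// Pair one burst from DRAM unit A with one from DRAM unit B.
std::array<uint32_t, kPpuBurst> pairBursts(const std::array<uint32_t, kDramBurst>& a,
                                           const std::array<uint32_t, kDramBurst>& b) {
    std::array<uint32_t, kPpuBurst> out{};
    for (size_t i = 0; i < kDramBurst; ++i) {
        out[i]              = a[i];  // first half of the PPU burst from unit A
        out[kDramBurst + i] = b[i];  // second half from unit B
    }
    return out;
}

int main() {
    std::array<uint32_t, kDramBurst> a{0, 1, 2, 3}, b{10, 11, 12, 13};
    auto burst = pairBursts(a, b);
    for (uint32_t v : burst) std::printf("%u ", v);
    std::printf("\n");
}
```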
-
Patent number: 8375163
Abstract: One embodiment of the invention sets forth a mechanism to transmit commands received from an L2 cache to a bank page within the DRAM. An arbiter unit determines which commands from a command sorter to transmit to a command queue. An activate command associated with the bank page related to the commands is also transmitted to an activate queue. The last command in the command queue is marked as “last.” An interlock counter stores a count of “last” commands in the read/write command queue. A DRAM controller transmits activate commands and read/write commands from the activate queue and the command queue to the DRAM. Each time a command marked as “last” is encountered, the DRAM controller decrements the interlock counter. If the count in the interlock counter is zero, then the command marked as “last” is marked as “auto-precharge.” The “auto-precharge” command, when processed, causes the bank page to be closed.
Type: Grant
Filed: December 1, 2008
Date of Patent: February 12, 2013
Assignee: NVIDIA Corporation
Inventors: John H. Edmondson, Shane Keil
-
Patent number: 8365015
Abstract: The present disclosure provides memory-level error correction methods and apparatus. A memory controller sits between the memory devices, such as DRAM chips or memory modules, and a processor, such as a graphics processor or a main processor. The memory controller can provide error correction. In an example, the memory controller includes a buffer to store instructions and data for execution by the controller and a replay buffer to store the instructions so that operations can be replayed to a prior state before the error. An error detector receives data read from the memory devices and, if no error is detected, outputs the data. If an error is detected, the error detector signals the memory controller to replay the instructions stored in the replay buffer.
Type: Grant
Filed: August 9, 2010
Date of Patent: January 29, 2013
Assignee: Nvidia Corporation
Inventors: Shu-Yi Yu, Shane Keil, John Edmondson
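A minimal sketch of the replay-buffer idea follows, assuming a simplified software model in which the controller records issued commands and re-issues them when the error detector flags a read; the types and method names here are hypothetical, not the patented design.

```cpp
// Minimal sketch of replay-on-error: the controller keeps recently issued
// commands in a replay buffer; when the error detector flags a read, the
// buffered commands are re-issued from the saved state. Names are assumed.
#include <cstdint>
#include <cstdio>
#include <vector>

struct MemCommand {
    bool     isWrite;
    uint64_t addr;
    uint32_t data;  // used only for writes
};

class ReplayingController {
public:
    // Issue a command and remember it so the sequence can be replayed.
    void issue(const MemCommand& cmd) {
        replayBuffer_.push_back(cmd);
        send(cmd);
    }

    // Called when read data returns, after the error detector checks it.
    void onReadReturn(bool eccError) {
        if (!eccError) {
            replayBuffer_.clear();  // state is known good; drop the history
            return;
        }
        std::puts("ECC error: replaying buffered commands");
        for (const MemCommand& cmd : replayBuffer_)
            send(cmd);              // re-issue the sequence from the prior state
    }

private:
    void send(const MemCommand& cmd) {
        std::printf("%s addr=0x%llx\n", cmd.isWrite ? "WR" : "RD",
                    static_cast<unsigned long long>(cmd.addr));
    }
    std::vector<MemCommand> replayBuffer_;
};

int main() {
    ReplayingController ctrl;
    ctrl.issue({true, 0x100, 42});
    ctrl.issue({false, 0x100, 0});
    ctrl.onReadReturn(/*eccError=*/true);  // triggers a replay of both commands
}
```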
-
Patent number: 8321618
Abstract: One embodiment of the present invention sets forth a mechanism to schedule read data transmissions and write data transmissions between a cache and frame buffer logic on the L2 bus. When processing a read or a write command, a scheduling arbiter examines a bus schedule to determine whether a read-read conflict, a read-write conflict, or a write-read conflict exists, and allocates an available memory space in a read buffer to store the read data causing the conflict until the read return data transmission can be scheduled. In the case of a write command, the scheduling arbiter then transmits a write request to a request buffer. When processing a write request, the request arbiter examines the request buffers to determine whether a write-write conflict exists. If so, then the request arbiter allocates a memory space in a request buffer to store the write request until the write data transmission can be scheduled.
Type: Grant
Filed: July 28, 2009
Date of Patent: November 27, 2012
Assignee: NVIDIA Corporation
Inventors: Shane Keil, John H. Edmondson
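The conflict check described in this abstract could look roughly like the following sketch, which models the bus schedule as a set of occupied slots and defers a conflicting transfer to the next free slot (standing in for parking it in a read or request buffer). The slot granularity and names are assumptions for illustration.

```cpp
// Minimal sketch of bus-conflict scheduling: before a read return or write
// data goes on the shared bus, check the schedule and defer on conflict.
// Slot model and names are illustrative assumptions.
#include <cstdint>
#include <cstdio>
#include <set>

class BusArbiter {
public:
    // Try to place a transfer at `wantedSlot`; on conflict, scan forward for
    // the next free slot (modeling the data waiting in a buffer until then).
    uint32_t schedule(uint32_t wantedSlot) {
        uint32_t slot = wantedSlot;
        while (busy_.count(slot)) ++slot;  // conflict: defer to a later slot
        busy_.insert(slot);
        if (slot != wantedSlot)
            std::printf("conflict at slot %u, buffered until slot %u\n", wantedSlot, slot);
        return slot;
    }

private:
    std::set<uint32_t> busy_;  // slots already committed on the L2 bus
};

int main() {
    BusArbiter arb;
    arb.schedule(3);  // read return lands at slot 3
    arb.schedule(3);  // write data conflicts, deferred to slot 4
    arb.schedule(4);  // deferred again, to slot 5
}
```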
-
Patent number: 8307165
Abstract: One embodiment of the invention sets forth a mechanism for increasing the number of read commands or write commands transmitted to an activated bank page in the DRAM. Read requests and dirty notifications are organized in a read request sorter or a dirty notification sorter, respectively, and each sorter includes multiple sets with entries that may be associated with different bank pages in the DRAM. Read requests and dirty notifications are stored in read request lists and dirty notification lists, where each list is associated with a specific bank page. When a bank page is activated to process read requests, read commands associated with read requests stored in a particular read request list are transmitted to the bank page. When a bank page is activated to process dirty notifications, write commands associated with dirty notifications stored in a particular dirty notification list are transmitted to the bank page.
Type: Grant
Filed: July 10, 2009
Date of Patent: November 6, 2012
Assignee: Nvidia Corporation
Inventors: Shane Keil, John H. Edmondson, Sean J. Treichler
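A minimal sketch of the per-bank-page sorting idea follows, assuming a map from bank page to a list of pending reads that is drained when the page is activated; the container choice and names are illustrative, not the patented sorter structures.

```cpp
// Minimal sketch of per-bank-page request sorting: reads are grouped into
// lists keyed by bank page, and an activated page drains its whole list so
// many commands hit the open page back-to-back. Names are assumed.
#include <cstdint>
#include <cstdio>
#include <unordered_map>
#include <vector>

struct ReadRequest {
    uint64_t addr;
};

class ReadRequestSorter {
public:
    // Group the request under its bank page so it can be issued together
    // with the other requests that hit the same page.
    void add(uint32_t bankPage, ReadRequest req) {
        lists_[bankPage].push_back(req);
    }

    // The bank page has been activated: issue every queued read for it.
    void drain(uint32_t bankPage) {
        auto it = lists_.find(bankPage);
        if (it == lists_.end()) return;
        for (const ReadRequest& req : it->second)
            std::printf("READ page=%u addr=0x%llx\n", bankPage,
                        static_cast<unsigned long long>(req.addr));
        lists_.erase(it);
    }

private:
    std::unordered_map<uint32_t, std::vector<ReadRequest>> lists_;
};

int main() {
    ReadRequestSorter sorter;
    sorter.add(7, {0x1000});
    sorter.add(7, {0x1040});
    sorter.add(9, {0x2000});
    sorter.drain(7);  // two reads issued back-to-back to the open page
}
```

A dirty notification sorter would follow the same pattern with write commands instead of reads.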
-
Patent number: 8301980
Abstract: One embodiment of the present invention sets forth a technique for protecting data with an error correction code (ECC). The data is accessed by a processing unit and stored in an external memory, such as dynamic random access memory (DRAM). Application data and related ECC data are advantageously stored in a common page within a common DRAM device. Application data and ECC data are transmitted between the processor and the external common DRAM device over a common set of input/output (I/O) pins. Eliminating I/O pins and DRAM devices conventionally associated with transmitting and storing ECC data advantageously reduces system complexity and cost.
Type: Grant
Filed: September 28, 2009
Date of Patent: October 30, 2012
Assignee: NVIDIA Corporation
Inventors: Fred Gruner, Shane Keil, John S. Montrym
-
Patent number: 8195858
Abstract: One embodiment of the present invention sets forth a mechanism to schedule read data transmissions and write data transmissions between a cache and frame buffer logic on the L2 bus. When processing a read or a write command, a scheduling arbiter examines a bus schedule to determine whether a read-read conflict, a read-write conflict, or a write-read conflict exists, and allocates an available memory space in a read buffer to store the read data causing the conflict until the read return data transmission can be scheduled. In the case of a write command, the scheduling arbiter then transmits a write request to a request buffer. When processing a write request, the request arbiter examines the request buffers to determine whether a write-write conflict exists. If so, then the request arbiter allocates a memory space in a request buffer to store the write request until the write data transmission can be scheduled.
Type: Grant
Filed: July 28, 2009
Date of Patent: June 5, 2012
Assignee: NVIDIA Corporation
Inventors: Shane Keil, John H. Edmondson
-
Patent number: 8190974
Abstract: One embodiment of the present invention sets forth a technique for protecting data with an error correction code (ECC). The data is accessed by a processing unit and stored in an external memory, such as dynamic random access memory (DRAM). Application data and related ECC data are advantageously stored in a common page within a common DRAM device. Application data and ECC data are transmitted between the processor and the external common DRAM device over a common set of input/output (I/O) pins. Eliminating I/O pins and DRAM devices conventionally associated with transmitting and storing ECC data advantageously reduces system complexity and cost.
Type: Grant
Filed: September 28, 2009
Date of Patent: May 29, 2012
Assignee: NVIDIA Corporation
Inventors: Fred Gruner, Shane Keil, John S. Montrym
-
Publication number: 20110078537
Abstract: One embodiment of the present invention sets forth a technique for protecting data with an error correction code (ECC). The data is accessed by a processing unit and stored in an external memory, such as dynamic random access memory (DRAM). Application data and related ECC data are advantageously stored in a common page within a common DRAM device. Application data and ECC data are transmitted between the processor and the external common DRAM device over a common set of input/output (I/O) pins. Eliminating I/O pins and DRAM devices conventionally associated with transmitting and storing ECC data advantageously reduces system complexity and cost.
Type: Application
Filed: September 28, 2009
Publication date: March 31, 2011
Inventors: Fred Gruner, Shane Keil, John S. Montrym
-
Publication number: 20110078544
Abstract: One embodiment of the present invention sets forth a technique for protecting data with an error correction code (ECC). The data is accessed by a processing unit and stored in an external memory, such as dynamic random access memory (DRAM). Application data and related ECC data are advantageously stored in a common page within a common DRAM device. Application data and ECC data are transmitted between the processor and the external common DRAM device over a common set of input/output (I/O) pins. Eliminating I/O pins and DRAM devices conventionally associated with transmitting and storing ECC data advantageously reduces system complexity and cost.
Type: Application
Filed: September 28, 2009
Publication date: March 31, 2011
Inventors: Fred Gruner, Shane Keil, John S. Montrym