Patents by Inventor Gary Lauterbach

Gary Lauterbach has filed for patents to protect the following inventions. This listing includes both pending patent applications and patents already granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 6098154
    Abstract: A central processing unit (CPU) of a computer has a data caching unit which includes a novel dual-ported prefetch cache configured in parallel with a conventional single-ported data cache. In response to a data cache miss, the requested data is fetched from external memory and loaded into the data cache and into the prefetch cache. Thereafter, if a prefetch cache hit occurs, the physical address of the corresponding data request is provided to a prefetch engine which, in turn, adds a stride to the physical address to derive a prefetch address. This prefetch address identifies data which is predicted to be soon requested in subsequent instructions of the computer program. Data corresponding to the prefetch address is then retrieved from external memory and loaded into the prefetch cache. This prefetching operation frequently results in the prefetch cache storing data that is requested by subsequently executed instructions in a computer program.
    Type: Grant
    Filed: June 25, 1997
    Date of Patent: August 1, 2000
    Assignee: Sun Microsystems, Inc.
    Inventors: Herbert Lopez-Aguado, Denise Chiacchia, Gary Lauterbach
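
The stride-based prefetching in the 6098154 abstract can be illustrated with a small C simulation. The sketch below is not the patented design: the single-line caches, the fixed one-line stride, and the choice to install a prefetch-cache hit into the data cache are simplifying assumptions made only to show the miss / prefetch-hit flow.

    /* Sketch of stride-based prefetching (assumptions noted above). */
    #include <stdio.h>
    #include <stdbool.h>
    #include <stdint.h>

    #define LINE_SIZE 64u                 /* assumed cache-line size in bytes */
    #define STRIDE    LINE_SIZE           /* assumed fixed prefetch stride    */

    typedef struct {                      /* one-line "cache" keeps the sketch tiny */
        uint64_t tag;
        bool     valid;
    } Line;

    static Line data_cache, prefetch_cache;

    static bool hit(const Line *c, uint64_t addr)  { return c->valid && c->tag == addr / LINE_SIZE; }
    static void fill(Line *c, uint64_t addr)       { c->tag = addr / LINE_SIZE; c->valid = true; }

    /* Models one load issued to the data caching unit; both caches are
     * looked up in parallel, as in the abstract. */
    static void load(uint64_t addr)
    {
        bool d_hit = hit(&data_cache, addr);
        bool p_hit = hit(&prefetch_cache, addr);

        if (!d_hit && !p_hit) {
            /* Data-cache miss: fetch from external memory into both caches. */
            printf("0x%llx: miss, fill data cache and prefetch cache\n",
                   (unsigned long long)addr);
            fill(&data_cache, addr);
            fill(&prefetch_cache, addr);
            return;
        }
        if (p_hit) {
            /* Prefetch-cache hit: the prefetch engine adds a stride to the
             * physical address and prefetches the predicted next line. */
            printf("0x%llx: prefetch-cache hit, prefetch 0x%llx\n",
                   (unsigned long long)addr, (unsigned long long)(addr + STRIDE));
            if (!d_hit)
                fill(&data_cache, addr);  /* assumed: hit data also installed */
            fill(&prefetch_cache, addr + STRIDE);
        } else {
            printf("0x%llx: data-cache hit\n", (unsigned long long)addr);
        }
    }

    int main(void)
    {
        load(0x1000);                     /* cold miss fills both caches          */
        load(0x1000);                     /* prefetch-cache hit starts the chain  */
        load(0x1000 + STRIDE);            /* each access hits the prefetched line */
        load(0x1000 + 2 * STRIDE);
        return 0;
    }

Running it prints one cold miss followed by a chain of prefetch-cache hits, since each hit prefetches the line one stride ahead of the current access.
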
  • Patent number: 6044446
    Abstract: A system for reducing query traffic in a multi-processor shared memory system includes an unshared bit in the translation table entries of the address translation system. The query system does not generate queries when the unshared bit indicates that the data has not been shared between the processors.
    Type: Grant
    Filed: July 1, 1997
    Date of Patent: March 28, 2000
    Assignee: Sun Microsystems, Inc.
    Inventors: Bill Joy, Gary Lauterbach
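
A rough C sketch of the unshared bit from the 6044446 abstract follows. The TranslationEntry layout, the send_query stand-in, and the page-level granularity are assumptions for illustration; the point is only that the coherence query is skipped when the translation entry marks the data as unshared.

    /* Sketch of the unshared-bit query filter (assumptions noted above). */
    #include <stdio.h>
    #include <stdbool.h>
    #include <stdint.h>

    typedef struct {
        uint64_t vpn;        /* virtual page number                        */
        uint64_t ppn;        /* physical page number                       */
        bool     unshared;   /* set while only one processor maps the page */
    } TranslationEntry;

    /* Stand-in for broadcasting a coherence query to the other processors. */
    static void send_query(uint64_t paddr)
    {
        printf("query sent for physical address 0x%llx\n",
               (unsigned long long)paddr);
    }

    /* Translate an address and query the other CPUs only if the page may
     * have been shared. */
    static uint64_t access_memory(const TranslationEntry *te, uint64_t vaddr)
    {
        uint64_t paddr = (te->ppn << 12) | (vaddr & 0xfff);
        if (te->unshared)
            printf("0x%llx: unshared page, query suppressed\n",
                   (unsigned long long)paddr);
        else
            send_query(paddr);
        return paddr;
    }

    int main(void)
    {
        TranslationEntry private_page = { .vpn = 0x10, .ppn = 0x80, .unshared = true  };
        TranslationEntry shared_page  = { .vpn = 0x20, .ppn = 0x90, .unshared = false };

        access_memory(&private_page, (0x10ULL << 12) | 0x100);  /* no query traffic */
        access_memory(&shared_page,  (0x20ULL << 12) | 0x200);  /* query broadcast  */
        return 0;
    }
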
  • Patent number: 6035118
    Abstract: A technique for eliminating the performance penalty of implementing jump instructions in a deeply pipelined processor includes a pipeline having a signal indicating that the top of the return address stack (RAS) has been updated by an address moved to the return register. An instruction moving a previously computed jump target address to the return register is included in the code to be executed. The pipeline uses the address at the top of the RAS as a guess of the target of a fetched jump instruction and immediately begins fetching the instructions indicated by that guess.
    Type: Grant
    Filed: June 23, 1997
    Date of Patent: March 7, 2000
    Assignee: Sun Microsystems, Inc.
    Inventors: Gary Lauterbach, Kit Tam
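
The return-address-stack guess in the 6035118 abstract can be sketched as follows. The RAS depth, the move_to_return_register helper, and the flush-on-mispredict handling are illustrative assumptions rather than details taken from the patent.

    /* Sketch of RAS-based jump-target guessing (assumptions noted above). */
    #include <stdio.h>
    #include <stdint.h>

    #define RAS_DEPTH 8                  /* assumed stack depth */

    static uint64_t ras[RAS_DEPTH];
    static int      ras_top = -1;

    /* Models the instruction that moves a previously computed jump target
     * into the return register, which also updates the top of the RAS. */
    static void move_to_return_register(uint64_t target)
    {
        ras_top = (ras_top + 1) % RAS_DEPTH;
        ras[ras_top] = target;
        printf("RAS updated: top = 0x%llx\n", (unsigned long long)target);
    }

    /* Models fetching a jump: the address on top of the RAS is used as a
     * guess of the target so fetch can begin before the jump resolves. */
    static void fetch_jump(uint64_t actual_target)
    {
        uint64_t guess = ras[ras_top];
        printf("fetching from guessed target 0x%llx\n",
               (unsigned long long)guess);
        if (guess == actual_target)
            printf("  guess correct: no pipeline bubble\n");
        else
            printf("  guess wrong: flush and refetch from 0x%llx\n",
                   (unsigned long long)actual_target);
    }

    int main(void)
    {
        move_to_return_register(0x4000); /* compiler places this well before the jump */
        fetch_jump(0x4000);              /* guess matches: no penalty                 */
        fetch_jump(0x5000);              /* guess wrong: normal misprediction cost    */
        return 0;
    }
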
  • Patent number: 5996061
    Abstract: A central processing unit (CPU) of a computer includes a novel prefetch cache configured in parallel with a conventional data cache. If a data cache miss occurs, the requested data is fetched from external memory and loaded into the data cache and into the prefetch cache. Thereafter, if a prefetch cache hit occurs, a prefetch address is derived, and data corresponding to the prefetch address is prefetched into the prefetch cache. This prefetching operation frequently results in the prefetch cache storing data that is requested by subsequently executed instructions in a computer program, thereby eliminating latencies associated with external memory. A software compiler of the computer ensures the validity of data stored in the prefetch cache. The software compiler alerts the prefetch cache that data stored within the prefetch cache is to be rewritten and, in response thereto, the prefetch cache invalidates the data.
    Type: Grant
    Filed: June 25, 1997
    Date of Patent: November 30, 1999
    Assignee: Sun Microsystems, Inc.
    Inventors: Herbert Lopez-Aguado, Denise Chiacchia, Gary Lauterbach
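
Since the prefetching itself mirrors the 6098154 sketch above, this sketch for 5996061 covers only the compiler-managed invalidation: a compiler-inserted hint invalidates any prefetch-cache copy of a line that is about to be rewritten. The prefetch_invalidate hook and the one-line cache are assumptions made for illustration.

    /* Sketch of compiler-managed prefetch-cache invalidation (assumptions noted above). */
    #include <stdio.h>
    #include <stdbool.h>
    #include <stdint.h>

    #define LINE_SIZE 64u

    static struct {
        uint64_t tag;
        bool     valid;
    } prefetch_cache;                     /* one line keeps the sketch tiny */

    static void prefetch_fill(uint64_t addr)
    {
        prefetch_cache.tag = addr / LINE_SIZE;
        prefetch_cache.valid = true;
    }

    /* Compiler-inserted hint: the line at addr is about to be rewritten,
     * so any copy held by the prefetch cache must be invalidated. */
    static void prefetch_invalidate(uint64_t addr)
    {
        if (prefetch_cache.valid && prefetch_cache.tag == addr / LINE_SIZE) {
            prefetch_cache.valid = false;
            printf("0x%llx: prefetch-cache copy invalidated\n",
                   (unsigned long long)addr);
        }
    }

    static void store(uint64_t addr)
    {
        prefetch_invalidate(addr);        /* hint emitted before the store */
        printf("0x%llx: store performed\n", (unsigned long long)addr);
    }

    static void load(uint64_t addr)
    {
        if (prefetch_cache.valid && prefetch_cache.tag == addr / LINE_SIZE)
            printf("0x%llx: prefetch-cache hit\n", (unsigned long long)addr);
        else
            printf("0x%llx: prefetch-cache miss, fetch from memory\n",
                   (unsigned long long)addr);
    }

    int main(void)
    {
        prefetch_fill(0x2000);   /* line brought in by an earlier prefetch  */
        load(0x2000);            /* hit                                      */
        store(0x2000);           /* rewrite: the hint invalidates the copy   */
        load(0x2000);            /* miss, so the stale copy is never served  */
        return 0;
    }
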
  • Patent number: 5912906
    Abstract: On-chip delivery of data from an on-chip or off-chip cache is separated into two buses. A fast fill bus provides data to latency-critical caches without ECC error detection and correction. A slow fill bus provides the data to latency-insensitive caches with ECC error detection and correction. Because the latency-critical caches receive the data without error detection, they receive it at least one clock cycle before the latency-insensitive caches, enhancing performance when there is no ECC error. If an ECC error is detected, a software trap is executed that flushes the external cache and the latency-critical caches that received the data before the trap was generated. If the error is correctable, ECC circuitry corrects it and rewrites the corrected data back to the external cache. If the error is not correctable, the data is read from main memory into the external cache.
    Type: Grant
    Filed: June 23, 1997
    Date of Patent: June 15, 1999
    Assignee: Sun Microsystems, Inc.
    Inventors: Chang-Hong Wu, Gary Lauterbach
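
The split fill buses in the 5912906 abstract can be sketched as two delivery paths: a fast path that hands data to the latency-critical caches unchecked, and a slow path that checks the data and flushes those caches if an error is found. Single-bit parity stands in for real ECC here, all names are illustrative assumptions, and the software trap, correction, and refetch steps are reduced to a print.

    /* Sketch of the split fill-bus scheme (assumptions noted above). */
    #include <stdio.h>
    #include <stdbool.h>
    #include <stdint.h>

    static uint64_t critical_cache;      /* latency-critical cache line    */
    static uint64_t insensitive_cache;   /* latency-insensitive cache line */
    static bool     critical_valid;

    /* Single-bit parity stands in for real ECC in this sketch. */
    static unsigned parity64(uint64_t x)
    {
        unsigned p = 0;
        while (x) { p ^= 1u; x &= x - 1; }
        return p;
    }

    /* Fast fill bus: hand the data over immediately, with no error check. */
    static void fast_fill(uint64_t data)
    {
        critical_cache = data;
        critical_valid = true;
        printf("fast fill: latency-critical cache loaded\n");
    }

    /* Slow fill bus: deliver at least one cycle later, after the check. */
    static void slow_fill(uint64_t data, unsigned check_bit)
    {
        if (parity64(data) == check_bit) {
            insensitive_cache = data;
            printf("slow fill: data checked and loaded\n");
        } else {
            /* Error detected: flush the cache that took the unchecked data
             * (the abstract's software trap and refetch are not modeled). */
            critical_valid = false;
            printf("error detected: flush latency-critical cache\n");
        }
    }

    static void deliver(uint64_t data, unsigned check_bit)
    {
        fast_fill(data);              /* arrives first           */
        slow_fill(data, check_bit);   /* arrives a cycle later   */
    }

    int main(void)
    {
        uint64_t word = 0xDEADBEEFULL;
        deliver(word, parity64(word));        /* clean transfer            */
        deliver(word ^ 1u, parity64(word));   /* flipped bit: error caught */
        return 0;
    }

The design point the sketch reflects is that the fast path saves a cycle in the common no-error case, while the rare error case pays for a flush of the caches that consumed the unchecked data.
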