Patents by Inventor John Michael Greer

John Michael Greer has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 9293953
    Abstract: The invention relates to a component such as a rotor or stator for an electrical machine. The component includes a plurality of axially adjacent stacks of laminations. At least one pair of adjacent stacks are spaced apart in the axial direction by spacer means such that a passageway or duct for cooling fluid, e.g. air, is formed therebetween. The spacer means comprises a porous structural mat of metal fibers. The cooling fluid may flow through the spaces or voids between the fibers.
    Type: Grant
    Filed: March 24, 2011
    Date of Patent: March 22, 2016
    Assignee: GE Energy Power Conversion Technology, Ltd.
    Inventor: John Michael Greer
  • Patent number: 9251083
    Abstract: A microprocessor includes first and second hardware data prefetchers configured to prefetch data into the microprocessor according to first and second respective algorithms, which are different. The second prefetcher is configured to detect a memory access pattern within a memory region and responsively prefetch data from the memory region according to the second algorithm. The second prefetcher is further configured to provide to the first prefetcher a descriptor of the memory region. The first prefetcher is configured to stop prefetching data from the memory region in response to receiving the descriptor of the memory region from the second prefetcher. The second prefetcher also provides to the first prefetcher a communication to resume prefetching data from the memory region, such as when the second prefetcher subsequently detects that a predetermined number of memory accesses to the memory region are not in the memory access pattern.
    Type: Grant
    Filed: March 11, 2013
    Date of Patent: February 2, 2016
    Assignee: VIA TECHNOLOGIES, INC.
    Inventors: Rodney E. Hooker, John Michael Greer
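The cooperation between the two prefetchers described in patent 9251083 above can be sketched in C as follows. This is a minimal software model, not the patented hardware: the structure and function names (region_desc, primary_prefetcher, and so on) and the fixed suppression-list capacity are assumptions for illustration only.

```c
#include <stdbool.h>
#include <stdint.h>

#define MAX_SUPPRESSED 8   /* assumed capacity, not from the patent */

/* Descriptor of a memory region the second prefetcher has claimed. */
typedef struct {
    uint64_t base;   /* region start address                     */
    uint64_t size;   /* region size in bytes                     */
    bool     active; /* first prefetcher must skip this region   */
} region_desc;

/* State the first ("primary") prefetcher keeps about claimed regions. */
typedef struct {
    region_desc suppressed[MAX_SUPPRESSED];
} primary_prefetcher;

/* Second prefetcher -> first prefetcher: stop prefetching in this region. */
static void primary_receive_descriptor(primary_prefetcher *p, region_desc d) {
    for (int i = 0; i < MAX_SUPPRESSED; i++) {
        if (!p->suppressed[i].active) {
            p->suppressed[i] = d;
            p->suppressed[i].active = true;
            return;
        }
    }
}

/* Second prefetcher -> first prefetcher: resume prefetching in the region,
 * e.g. after it sees a predetermined number of accesses that fall outside
 * the pattern it detected. */
static void primary_receive_resume(primary_prefetcher *p, uint64_t base) {
    for (int i = 0; i < MAX_SUPPRESSED; i++)
        if (p->suppressed[i].active && p->suppressed[i].base == base)
            p->suppressed[i].active = false;
}

/* The first prefetcher consults this gate before prefetching an address. */
static bool primary_may_prefetch(const primary_prefetcher *p, uint64_t addr) {
    for (int i = 0; i < MAX_SUPPRESSED; i++)
        if (p->suppressed[i].active &&
            addr >= p->suppressed[i].base &&
            addr <  p->suppressed[i].base + p->suppressed[i].size)
            return false;  /* region is owned by the second prefetcher */
    return true;
}
```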
  • Patent number: 9032159
    Abstract: A hardware data prefetcher includes a queue of indexed storage elements into which are queued strides associated with a stream of temporally adjacent load requests. Each stride is a difference between cache line offsets of memory addresses of respective adjacent load requests. Hardware logic calculates a current stride between a current load request and a newest previous load request. The hardware logic compares the current stride and a stride M in the queue and compares the newest of the queued strides with a queued stride M+1, which is older than and adjacent to stride M. When the comparisons match, the hardware logic prefetches a cache line whose offset is the sum of the offset of the current load request and a stride M-1. Stride M-1 is newer than and adjacent to stride M in the queue.
    Type: Grant
    Filed: June 27, 2012
    Date of Patent: May 12, 2015
    Assignee: Via Technologies, Inc.
    Inventors: Meera Ramani-Augustin, John Michael Greer
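A minimal C sketch of the stride-queue matching that patent 9032159 above describes: strides between adjacent load requests are queued newest-first, the current stride is compared against a queued stride M while the newest queued stride is compared against the older stride M+1, and a double match triggers a prefetch using stride M-1. The queue depth and all identifiers are illustrative assumptions.

```c
#include <stdbool.h>
#include <stdint.h>

#define QDEPTH 8   /* assumed queue depth */

/* Queue of recent strides, index 0 = newest.  Strides are differences
 * between cache-line offsets of temporally adjacent load requests. */
typedef struct {
    int     stride[QDEPTH];
    int     count;         /* number of valid strides in the queue */
    int64_t last_offset;   /* cache-line offset of the newest load */
    bool    have_last;     /* last_offset holds a valid offset     */
} stride_queue;

/* Returns true and sets *prefetch_offset when a prefetch should be issued. */
static bool sq_on_load(stride_queue *q, int64_t offset, int64_t *prefetch_offset) {
    bool issue = false;
    if (q->have_last) {
        int cur = (int)(offset - q->last_offset);
        /* Look for a position M where the current stride matches stride[M]
         * and the newest queued stride matches the older stride[M+1]; on a
         * double match the pattern is repeating, so prefetch the line at
         * current offset + stride[M-1], the stride that followed stride[M]
         * the last time around. */
        for (int m = 1; m + 1 < q->count; m++) {
            if (cur == q->stride[m] && q->stride[0] == q->stride[m + 1]) {
                *prefetch_offset = offset + q->stride[m - 1];
                issue = true;
                break;
            }
        }
        /* Push the current stride as the newest queue entry (index 0). */
        for (int i = (q->count < QDEPTH ? q->count : QDEPTH - 1); i > 0; i--)
            q->stride[i] = q->stride[i - 1];
        q->stride[0] = cur;
        if (q->count < QDEPTH) q->count++;
    }
    q->last_offset = offset;
    q->have_last = true;
    return issue;
}
```

For example, with line offsets 0, 1, 4, 5, 8, ... (strides 1, 3, 1, 3, ...), the double match first fires on the access to offset 8 and prefetches offset 9, the next line the pattern will touch.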
  • Publication number: 20140365753
    Abstract: A microprocessor includes a predicting unit and a control unit. The control unit controls the predicting unit to accumulate a history of characteristics of executed instructions and makes predictions related to subsequent instructions based on the history while the microprocessor is running a first thread. The control unit also detects a transition from running the first thread to running a second thread and controls the predicting unit to selectively suspend accumulating the history and making the predictions using the history while running the second thread. The predicting unit makes static predictions while running the second thread. The selectivity may be based on the privilege level, identity or length of the second thread, static prediction effectiveness during a previous execution instance of the thread, whether the transition was made due to a system call, and whether the second thread is an interrupt handler.
    Type: Application
    Filed: January 27, 2014
    Publication date: December 11, 2014
    Inventors: Rodney E. Hooker, Terry Parks, John Michael Greer
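The suspend-or-continue decision described in publication 20140365753 above might be modeled as in the sketch below. The specific thresholds and the particular combination of criteria are assumptions; the abstract only enumerates the kinds of inputs the selectivity may be based on.

```c
#include <stdbool.h>
#include <stdint.h>

/* Reason the microprocessor switched from one thread to another. */
typedef enum { SWITCH_SYSCALL, SWITCH_INTERRUPT, SWITCH_OTHER } switch_reason;

typedef struct {
    int      privilege_level;       /* e.g. 0 = kernel, 3 = user            */
    uint64_t expected_length;       /* rough instruction-count estimate     */
    bool     static_worked_before;  /* static prediction was effective in a
                                       previous execution of this thread    */
} thread_info;

/* Decide whether the predicting unit should suspend accumulating history
 * and fall back to static predictions while the new (second) thread runs.
 * The thresholds below are illustrative assumptions. */
static bool suspend_dynamic_prediction(switch_reason reason,
                                       const thread_info *next) {
    if (reason == SWITCH_INTERRUPT)        return true;  /* interrupt handler */
    if (reason == SWITCH_SYSCALL &&
        next->privilege_level == 0)        return true;  /* kernel entry      */
    if (next->expected_length < 10000)     return true;  /* too short to warm
                                                            up the history    */
    if (next->static_worked_before)        return true;
    return false;  /* keep accumulating history and predicting dynamically */
}
```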
  • Publication number: 20140349043
    Abstract: A structural component that experiences a changing magnetic field during use is described. The structural component can be a stator bore tube, a stator tooth or stator support, for example. A stator bore tube is made of a multi-layered composite material with layers of unidirectional carbon fibre reinforced polymer (CFRP). The eddy current direction is along the axis of the stator bore tube. The carbon fibres in the CFRP layers are orientated along the circumferential direction of the stator bore tube.
    Type: Application
    Filed: May 22, 2014
    Publication date: November 27, 2014
    Applicant: GE Energy Power Conversion Technology Ltd
    Inventor: John Michael Greer
  • Patent number: 8880807
    Abstract: A data prefetcher in a microprocessor. The data prefetcher includes a plurality of period match counters associated with a corresponding plurality of different pattern periods. The data prefetcher also includes control logic that updates the plurality of period match counters in response to accesses to a memory block by the microprocessor, determines a clear pattern period based on the plurality of period match counters, and prefetches into the microprocessor non-fetched cache lines within the memory block based on a pattern having that clear pattern period.
    Type: Grant
    Filed: May 20, 2014
    Date of Patent: November 4, 2014
    Assignee: VIA Technologies, Inc.
    Inventors: Rodney E. Hooker, John Michael Greer
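A rough C model of the period match counters described in patent 8880807 above: each candidate pattern period gets a counter that is bumped whenever a new access agrees with that period, and a period is treated as the clear pattern period once its counter clearly leads the others. The set of candidate periods, the 64-line block size, and the lead margin are assumptions.

```c
#include <stdint.h>

#define NPERIODS 5
static const int periods[NPERIODS] = {1, 2, 3, 4, 5};  /* candidate periods */

/* Per-memory-block tracking state: a bitmap of the cache lines accessed so
 * far and one period match counter per candidate pattern period. */
typedef struct {
    uint64_t accessed;          /* bit i set => line i of the block accessed */
    int      match[NPERIODS];   /* the plurality of period match counters    */
} period_tracker;

/* On an access to cache line `idx` within the block, bump the counter of
 * every period P for which the line P positions back was also accessed. */
static void period_on_access(period_tracker *t, int idx) {
    t->accessed |= 1ULL << idx;
    for (int p = 0; p < NPERIODS; p++)
        if (idx >= periods[p] && ((t->accessed >> (idx - periods[p])) & 1))
            t->match[p]++;
}

/* A pattern period is "clear" when its counter leads the runner-up by a
 * margin (the margin of 2 is an assumption).  Returns the clear period, or
 * 0 if none; the prefetcher would then replay the accessed-line bitmap one
 * period forward to pick the non-fetched lines to prefetch. */
static int clear_pattern_period(const period_tracker *t) {
    int best = 0, second = -1;
    for (int p = 1; p < NPERIODS; p++) {
        if (t->match[p] > t->match[best]) { second = best; best = p; }
        else if (second < 0 || t->match[p] > t->match[second]) { second = p; }
    }
    if (second >= 0 && t->match[best] - t->match[second] >= 2)
        return periods[best];
    return 0;
}
```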
  • Publication number: 20140310479
    Abstract: A microprocessor includes a first hardware data prefetcher that prefetches data into the microprocessor according to a first algorithm. The microprocessor also includes a second hardware data prefetcher that prefetches data into the microprocessor according to a second algorithm, wherein the first and second algorithms are different. The second prefetcher detects that it is prefetching data into the microprocessor according to the second algorithm in excess of a first predetermined rate and, in response, sends a throttle indication to the first prefetcher. The first prefetcher prefetches data into the microprocessor according to the first algorithm at below a second predetermined rate in response to receiving the throttle indication from the second prefetcher.
    Type: Application
    Filed: June 25, 2014
    Publication date: October 16, 2014
    Inventors: Rodney E. Hooker, John Michael Greer
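The throttling handshake described in publication 20140310479 above reduces to a small amount of shared state, sketched below in C. The sampling interval and both rate thresholds are illustrative assumptions.

```c
#include <stdbool.h>

#define SECOND_PF_HIGH_RATE 32   /* first predetermined rate  - assumed */
#define FIRST_PF_LOW_RATE    4   /* second predetermined rate - assumed */

/* Per-prefetcher prefetch count over the current sampling interval. */
typedef struct {
    int  issued_this_interval;
    bool throttled;              /* set on the first prefetcher only */
} prefetcher_state;

/* Called each time the second prefetcher issues a prefetch.  When it runs
 * above the first predetermined rate, it asserts a throttle indication
 * toward the first prefetcher. */
static void second_pf_issued(prefetcher_state *second, prefetcher_state *first) {
    if (++second->issued_this_interval > SECOND_PF_HIGH_RATE)
        first->throttled = true;          /* the throttle indication */
}

/* The first prefetcher consults this gate before issuing a prefetch: while
 * throttled it stays below the (lower) second predetermined rate. */
static bool first_pf_may_issue(prefetcher_state *first) {
    int limit = first->throttled ? FIRST_PF_LOW_RATE : SECOND_PF_HIGH_RATE;
    if (first->issued_this_interval >= limit)
        return false;
    first->issued_this_interval++;
    return true;
}

/* At the end of each sampling interval the counters and the throttle
 * indication are cleared so the decision is re-evaluated. */
static void end_interval(prefetcher_state *first, prefetcher_state *second) {
    first->issued_this_interval = second->issued_this_interval = 0;
    first->throttled = false;
}
```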
  • Publication number: 20140289479
    Abstract: A data prefetcher in a microprocessor. The data prefetcher includes a plurality of period match counters associated with a corresponding plurality of different pattern periods. The data prefetcher also includes control logic that updates the plurality of period match counters in response to accesses to a memory block by the microprocessor, determines a clear pattern period based on the plurality of period match counters, and prefetches into the microprocessor non-fetched cache lines within the memory block based on a pattern having that clear pattern period.
    Type: Application
    Filed: May 20, 2014
    Publication date: September 25, 2014
    Applicant: VIA TECHNOLOGIES, INC.
    Inventors: Rodney E. Hooker, John Michael Greer
  • Patent number: 8762649
    Abstract: A data prefetcher in a microprocessor having a cache memory receives memory accesses each to an address within a memory block. The access addresses are non-monotonically increasing or decreasing as a function of time. As the accesses are received, the prefetcher maintains a largest address and a smallest address of the accesses, counts of changes to the largest and smallest addresses, and a history of recently accessed cache lines implicated by the access addresses within the memory block. The prefetcher also determines a predominant access direction based on the counts and a predominant access pattern based on the history. The prefetcher also prefetches into the cache memory, in the predominant access direction according to the predominant access pattern, cache lines of the memory block which the history indicates have not been recently accessed.
    Type: Grant
    Filed: February 24, 2011
    Date of Patent: June 24, 2014
    Assignee: VIA Technologies, Inc.
    Inventors: Rodney E. Hooker, John Michael Greer
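A simplified C model of the per-block tracking described in patent 8762649 above: the prefetcher keeps the smallest and largest accessed line, counts how often each bound moves, and records which lines have been touched; the bound that has moved more often gives the predominant direction, and untouched lines ahead of that bound become prefetch candidates. The 64-line block geometry is an assumption, and the sketch omits applying the detected access pattern to filter candidates.

```c
#include <stdbool.h>
#include <stdint.h>

#define BLOCK_LINES 64   /* 64-byte lines in a 4 KB block - an assumption */

/* Per-memory-block state: the abstract's largest and smallest accessed
 * lines, counts of changes to each, and a history bitmap of touched lines. */
typedef struct {
    int      min_line, max_line;
    int      min_changes, max_changes;
    uint64_t history;               /* bit i set => line i was accessed */
    bool     valid;
} block_state;

static void bbox_on_access(block_state *b, int line) {
    if (!b->valid) {
        b->min_line = b->max_line = line;
        b->min_changes = b->max_changes = 0;
        b->history = 0;
        b->valid = true;
    }
    if (line > b->max_line) { b->max_line = line; b->max_changes++; }
    if (line < b->min_line) { b->min_line = line; b->min_changes++; }
    b->history |= 1ULL << line;
}

/* Predominant direction: +1 (upward) if the largest address has moved more
 * often than the smallest, -1 (downward) otherwise. */
static int predominant_direction(const block_state *b) {
    return (b->max_changes >= b->min_changes) ? +1 : -1;
}

/* Walk from the current bound in the predominant direction and emit lines
 * the history says have not been recently accessed.  Returns the number of
 * candidate lines written to out[]. */
static int lines_to_prefetch(const block_state *b, int out[], int max_out) {
    int dir = predominant_direction(b);
    int n = 0;
    int line = (dir > 0) ? b->max_line + 1 : b->min_line - 1;
    while (line >= 0 && line < BLOCK_LINES && n < max_out) {
        if (!((b->history >> line) & 1))
            out[n++] = line;
        line += dir;
    }
    return n;
}
```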
  • Patent number: 8719510
    Abstract: A microprocessor includes a cache memory and a data prefetcher. The data prefetcher detects a pattern of memory accesses within a first memory block and prefetches into the cache memory cache lines from the first memory block based on the pattern. The data prefetcher also observes a new memory access request to a second memory block. The data prefetcher also determines that the first memory block is virtually adjacent to the second memory block and that the pattern, when continued from the first memory block to the second memory block, predicts an access to a cache line implicated by the new request within the second memory block. The data prefetcher also responsively prefetches into the cache memory cache lines from the second memory block based on the pattern.
    Type: Grant
    Filed: February 24, 2011
    Date of Patent: May 6, 2014
    Assignee: VIA Technologies, Inc.
    Inventors: Rodney E. Hooker, John Michael Greer
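The cross-block continuation described in patent 8719510 above might be checked as in the sketch below: the new request's block must be virtually adjacent to the tracked block, and the learned pattern, carried into that block, must predict the line just accessed. The block size, the bitmap representation of the pattern, and the simplifying assumption that the pattern repeats unshifted across the block boundary are all illustrative.

```c
#include <stdbool.h>
#include <stdint.h>

#define BLOCK_SIZE 4096ULL   /* memory block size - an assumption */
#define LINES      64        /* 64-byte lines per block           */

/* Pattern learned for the first memory block: a bitmap of the line offsets
 * the detected access pattern predicts will be touched. */
typedef struct {
    uint64_t block_base;     /* virtual base address of the block */
    uint64_t pattern;        /* bit i set => pattern predicts line i */
    bool     valid;
} block_pattern;

/* A new request falls outside the tracked (first) block.  Return true when
 * the new block is virtually adjacent to the first block and the continued
 * pattern predicts the line just accessed, so the caller should start
 * prefetching the new block using the same pattern. */
static bool continue_pattern(const block_pattern *first, uint64_t new_addr) {
    if (!first->valid)
        return false;
    uint64_t new_base = new_addr & ~(BLOCK_SIZE - 1);
    bool adjacent = (new_base == first->block_base + BLOCK_SIZE) ||
                    (new_base + BLOCK_SIZE == first->block_base);
    if (!adjacent)
        return false;
    /* Simplification: treat the pattern as repeating with the block, so the
     * same line offset is predicted in the adjacent block.  (A fuller model
     * would shift the pattern by its detected period across the boundary.) */
    int line = (int)((new_addr & (BLOCK_SIZE - 1)) / (BLOCK_SIZE / LINES));
    return (first->pattern >> line) & 1;
}
```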
  • Patent number: 8656111
    Abstract: A data prefetcher in a microprocessor having a cache memory receives memory accesses each to an address within a memory block. The access addresses are non-monotonically increasing or decreasing as a function of time. As the accesses are received, the prefetcher maintains a largest address and a smallest address of the accesses, counts of changes to the largest and smallest addresses, and a history of recently accessed cache lines implicated by the access addresses within the memory block. The prefetcher also determines a predominant access direction based on the counts and a predominant access pattern based on the history. The prefetcher also prefetches into the cache memory, in the predominant access direction according to the predominant access pattern, cache lines of the memory block which the history indicates have not been recently accessed.
    Type: Grant
    Filed: February 24, 2011
    Date of Patent: February 18, 2014
    Assignee: VIA Technologies, Inc.
    Inventors: Rodney E. Hooker, John Michael Greer
  • Patent number: 8645631
    Abstract: A microprocessor includes a first-level cache memory, a second-level cache memory, and a data prefetcher that detects a predominant direction and pattern of recent memory accesses presented to the second-level cache memory and prefetches cache lines into the second-level cache memory based on the predominant direction and pattern. The data prefetcher also receives from the first-level cache memory an address of a memory access received by the first-level cache memory, wherein the address implicates a cache line. The data prefetcher also determines one or more cache lines indicated by the pattern beyond the implicated cache line in the predominant direction. The data prefetcher also causes the one or more cache lines to be prefetched into the first-level cache memory.
    Type: Grant
    Filed: February 24, 2011
    Date of Patent: February 4, 2014
    Assignee: VIA Technologies, Inc.
    Inventors: Rodney E. Hooker, John Michael Greer
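The L1-directed prefetch described in patent 8645631 above can be approximated with the small C helper below: given the pattern and predominant direction the second-level-cache prefetcher has learned for a block, and the line index of an access forwarded by the first-level cache, it returns the next pattern-predicted lines beyond that access for prefetching into the first-level cache. The names and 64-line block geometry are assumptions.

```c
#include <stdint.h>

#define LINES 64   /* cache lines per tracked memory block - an assumption */

/* State the L2 prefetcher keeps for a memory block: the pattern of lines it
 * expects to be accessed and the predominant direction of the accesses. */
typedef struct {
    uint64_t pattern;     /* bit i set => pattern predicts line i */
    int      direction;   /* +1 upward, -1 downward               */
} l2_block_state;

/* The first-level cache forwards the line index implicated by an access it
 * received.  Starting beyond that line, walk in the predominant direction
 * and return up to max_out pattern-predicted lines to prefetch into the
 * first-level cache. */
static int l1_prefetch_candidates(const l2_block_state *s, int accessed_line,
                                  int out[], int max_out) {
    int n = 0;
    for (int line = accessed_line + s->direction;
         line >= 0 && line < LINES && n < max_out;
         line += s->direction) {
        if ((s->pattern >> line) & 1)
            out[n++] = line;   /* candidate to bring into the L1 cache */
    }
    return n;
}
```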
  • Publication number: 20140006718
    Abstract: A hardware data prefetcher includes a queue of indexed storage elements into which are queued strides associated with a stream of temporally adjacent load requests. Each stride is a difference between cache line offsets of memory addresses of respective adjacent load requests. Hardware logic calculates a current stride between a current load request and a newest previous load request. The hardware logic compares the current stride and a stride M in the queue and compares the newest of the queued strides with a queued stride M+1, which is older than and adjacent to stride M. When the comparisons match, the hardware logic prefetches a cache line whose offset is the sum of the offset of the current load request and a stride M-1. Stride M-1 is newer than and adjacent to stride M in the queue.
    Type: Application
    Filed: June 27, 2012
    Publication date: January 2, 2014
    Applicant: VIA TECHNOLOGIES, INC.
    Inventors: Meera Ramani-Augustin, John Michael Greer
  • Publication number: 20130200734
    Abstract: The invention relates to a component such as a rotor or stator for an electrical machine. The component includes a plurality of axially adjacent stacks of laminations. At least one pair of adjacent stacks are spaced apart in the axial direction by spacer means such that a passageway or duct for cooling fluid, e.g. air, is formed therebetween. The spacer means comprises a porous structural mat of metal fibres. The cooling fluid may flow through the spaces or voids between the fibres.
    Type: Application
    Filed: March 24, 2011
    Publication date: August 8, 2013
    Applicant: GE Energy Power Conversion Technology Ltd.
    Inventor: John Michael Greer
  • Patent number: 8364902
    Abstract: A microprocessor includes an instruction decoder for decoding a repeat prefetch indirect instruction that includes address operands used to calculate an address of a first entry in a prefetch table having a plurality of entries, each including a prefetch address. The repeat prefetch indirect instruction also includes a count specifying a number of cache lines to be prefetched. The memory address of each of the cache lines is specified by the prefetch address in one of the entries in the prefetch table. A count register, initially loaded with the count specified in the prefetch instruction, stores a remaining count of the cache lines to be prefetched. Control logic fetches the prefetch addresses of the cache lines from the table into the microprocessor and prefetches the cache lines from the system memory into a cache memory of the microprocessor using the count register and the prefetch addresses fetched from the table.
    Type: Grant
    Filed: October 15, 2009
    Date of Patent: January 29, 2013
    Assignee: VIA Technologies, Inc.
    Inventors: Rodney E. Hooker, John Michael Greer
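Patent 8364902 above describes a single repeat prefetch indirect instruction; in software its effect is roughly the loop below, written in C with the GCC/Clang __builtin_prefetch intrinsic standing in for the hardware prefetch of each cache line. The function name and table layout are assumptions.

```c
#include <stdint.h>

/* Software model of the repeat-prefetch-indirect operation: `table` points
 * at the first entry of an in-memory prefetch table whose entries each hold
 * the address of a cache line to prefetch, and `count` (the value loaded
 * into the count register) is the number of cache lines to prefetch. */
static void repeat_prefetch_indirect(const uint64_t *table, uint64_t count) {
    while (count-- > 0) {
        /* Fetch the next prefetch address from the table ... */
        uintptr_t line_addr = (uintptr_t)*table++;
        /* ... and prefetch the implicated cache line for reading. */
        __builtin_prefetch((const void *)line_addr, 0 /* read */, 1);
    }
}
```

In the patent this whole loop is a single decoded instruction whose control logic walks the table and counts down the count register in hardware; the C loop is only a behavioral stand-in.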
  • Publication number: 20110238920
    Abstract: A microprocessor includes a cache memory and a data prefetcher. The data prefetcher detects a pattern of memory accesses within a first memory block and prefetches into the cache memory cache lines from the first memory block based on the pattern. The data prefetcher also observes a new memory access request to a second memory block. The data prefetcher also determines that the first memory block is virtually adjacent to the second memory block and that the pattern, when continued from the first memory block to the second memory block, predicts an access to a cache line implicated by the new request within the second memory block. The data prefetcher also responsively prefetches into the cache memory cache lines from the second memory block based on the pattern.
    Type: Application
    Filed: February 24, 2011
    Publication date: September 29, 2011
    Applicant: VIA Technologies, Inc.
    Inventors: Rodney E. Hooker, John Michael Greer
  • Publication number: 20110238923
    Abstract: A microprocessor includes a first-level cache memory, a second-level cache memory, and a data prefetcher that detects a predominant direction and pattern of recent memory accesses presented to the second-level cache memory and prefetches cache lines into the second-level cache memory based on the predominant direction and pattern. The data prefetcher also receives from the first-level cache memory an address of a memory access received by the first-level cache memory, wherein the address implicates a cache line. The data prefetcher also determines one or more cache lines indicated by the pattern beyond the implicated cache line in the predominant direction. The data prefetcher also causes the one or more cache lines to be prefetched into the first-level cache memory.
    Type: Application
    Filed: February 24, 2011
    Publication date: September 29, 2011
    Applicant: VIA Technologies, Inc.
    Inventors: Rodney E. Hooker, John Michael Greer
  • Publication number: 20110238922
    Abstract: A data prefetcher in a microprocessor having a cache memory receives memory accesses each to an address within a memory block. The access addresses are non-monotonically increasing or decreasing as a function of time. As the accesses are received, the prefetcher maintains a largest address and a smallest address of the accesses, counts of changes to the largest and smallest addresses, and a history of recently accessed cache lines implicated by the access addresses within the memory block. The prefetcher also determines a predominant access direction based on the counts and a predominant access pattern based on the history. The prefetcher also prefetches into the cache memory, in the predominant access direction according to the predominant access pattern, cache lines of the memory block which the history indicates have not been recently accessed.
    Type: Application
    Filed: February 24, 2011
    Publication date: September 29, 2011
    Applicant: VIA Technologies, Inc.
    Inventors: Rodney E. Hooker, John Michael Greer
  • Publication number: 20110035551
    Abstract: A microprocessor includes an instruction decoder for decoding a repeat prefetch indirect instruction that includes address operands used to calculate an address of a first entry in a prefetch table having a plurality of entries, each including a prefetch address. The repeat prefetch indirect instruction also includes a count specifying a number of cache lines to be prefetched. The memory address of each of the cache lines is specified by the prefetch address in one of the entries in the prefetch table. A count register, initially loaded with the count specified in the prefetch instruction, stores a remaining count of the cache lines to be prefetched. Control logic fetches the prefetch addresses of the cache lines from the table into the microprocessor and prefetches the cache lines from the system memory into a cache memory of the microprocessor using the count register and the prefetch addresses fetched from the table.
    Type: Application
    Filed: October 15, 2009
    Publication date: February 10, 2011
    Inventors: Rodney E. Hooker, John Michael Greer
  • Publication number: 20110010506
    Abstract: A data prefetcher includes a table of entries to maintain a history of load operations. Each entry stores a tag and a corresponding next stride. The tag comprises a concatenation of first and second strides. The next stride comprises the first stride. The first stride comprises a first cache line address subtracted from a second cache line address. The second stride comprises the second cache line address subtracted from a third cache line address. The first, second and third cache line addresses each comprise a memory address of a cache line implicated by respective first, second and third temporally preceding load operations. Control logic calculates a current stride by subtracting a previous cache line address from a new load cache line address, looks up in the table a concatenation of a previous stride and the current stride, and prefetches a cache line using the next stride of the hitting table entry.
    Type: Application
    Filed: October 5, 2009
    Publication date: January 13, 2011
    Applicant: VIA Technologies, Inc.
    Inventors: John Michael Greer, Rodney E. Hooker, Albert J. Loper, Jr.
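The stride-pair table in publication 20110010506 above can be modeled in C as below: each entry is tagged with a concatenated pair of consecutive strides and stores a next stride, the lookup key on a new load is (previous stride, current stride), and a hit prefetches the new line plus the entry's next stride. The table size, the hash used for indexing, and the training details are assumptions beyond what the abstract states.

```c
#include <stdbool.h>
#include <stdint.h>

#define TABLE_SIZE 64   /* number of table entries - an assumption */

/* One table entry: the tag is the concatenation of two consecutive strides;
 * the next stride is the stride predicted to follow that pair. */
typedef struct {
    int32_t tag_first, tag_second;   /* the concatenated stride pair */
    int32_t next_stride;
    bool    valid;
} table_entry;

typedef struct {
    table_entry entry[TABLE_SIZE];
    int64_t last_line;      /* cache line address of the newest load */
    int32_t prev_stride;    /* stride between the two newest loads   */
    int     seen;           /* loads observed so far, capped at 2    */
} stride_pair_prefetcher;

/* Direct-mapped index derived from the stride pair - an assumption. */
static unsigned index_of(int32_t a, int32_t b) {
    return ((unsigned)a * 31u + (unsigned)b) % TABLE_SIZE;
}

/* On a new load to cache line address `line`: look up the (previous stride,
 * current stride) pair and, on a hit, return true with the line to prefetch.
 * The entry for the pair is then installed as the abstract describes, with
 * the next stride set to the first stride of the tagged pair. */
static bool spp_on_load(stride_pair_prefetcher *p, int64_t line,
                        int64_t *prefetch_line) {
    bool hit = false;
    if (p->seen >= 1) {
        int32_t cur = (int32_t)(line - p->last_line);
        if (p->seen >= 2) {
            table_entry *e = &p->entry[index_of(p->prev_stride, cur)];
            if (e->valid && e->tag_first == p->prev_stride && e->tag_second == cur) {
                *prefetch_line = line + e->next_stride;   /* hit: prefetch */
                hit = true;
            }
            /* Train: tag = (first, second) strides of the three newest loads,
             * next stride = the first stride, per the abstract. */
            e->tag_first = p->prev_stride;
            e->tag_second = cur;
            e->next_stride = p->prev_stride;
            e->valid = true;
        }
        p->prev_stride = cur;
    }
    p->last_line = line;
    if (p->seen < 2) p->seen++;
    return hit;
}
```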