Patents by Inventor John Michael Greer
John Michael Greer has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
- Patent number: 9293953
  Abstract: The invention relates to a component such as a rotor or stator for an electrical machine. The component includes a plurality of axially adjacent stacks of laminations. At least one pair of adjacent stacks are spaced apart in the axial direction by spacer means such that a passageway or duct for cooling fluid, e.g. air, is formed therebetween. The spacer means comprises a porous structural mat of metal fibers. The cooling fluid may flow through the spaces or voids between the fibers.
  Type: Grant
  Filed: March 24, 2011
  Date of Patent: March 22, 2016
  Assignee: GE Energy Power Conversion Technology, Ltd.
  Inventor: John Michael Greer
- Patent number: 9251083
  Abstract: A microprocessor includes first and second hardware data prefetchers configured to prefetch data into the microprocessor according to first and second respective algorithms, which are different. The second prefetcher is configured to detect a memory access pattern within a memory region and responsively prefetch data from the memory region according to the second algorithm. The second prefetcher is further configured to provide to the first prefetcher a descriptor of the memory region. The first prefetcher is configured to stop prefetching data from the memory region in response to receiving the descriptor of the memory region from the second prefetcher. The second prefetcher also provides to the first prefetcher a communication to resume prefetching data from the memory region, such as when the second prefetcher subsequently detects that a predetermined number of memory accesses to the memory region are not in the memory access pattern.
  Type: Grant
  Filed: March 11, 2013
  Date of Patent: February 2, 2016
  Assignee: VIA Technologies, Inc.
  Inventors: Rodney E. Hooker, John Michael Greer
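The region hand-off this abstract describes can be modeled in a few lines. This is an illustrative sketch, not the hardware: the class name, the set-based bookkeeping, and the 4 KiB region size are assumptions; the abstract only specifies that a descriptor suspends, and a later communication resumes, the first prefetcher's activity in that region.

```python
class FirstPrefetcher:
    """Sketch of the hand-off: the first prefetcher skips any address
    that falls inside a region claimed by the second prefetcher."""

    REGION_SIZE = 4096  # illustrative region granularity (assumption)

    def __init__(self):
        self.suspended = set()  # base addresses of claimed regions

    def on_descriptor(self, region_base):
        # Second prefetcher sent a region descriptor: stop prefetching there.
        self.suspended.add(region_base)

    def on_resume(self, region_base):
        # Second prefetcher signaled that the pattern broke down: resume.
        self.suspended.discard(region_base)

    def should_prefetch(self, addr):
        base = (addr // self.REGION_SIZE) * self.REGION_SIZE
        return base not in self.suspended
```

In this model the two prefetchers never coordinate per access; the descriptor and the resume message are the only communication, which matches the division of labor the abstract outlines.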
- Patent number: 9032159
  Abstract: A hardware data prefetcher includes a queue of indexed storage elements into which are queued strides associated with a stream of temporally adjacent load requests. Each stride is a difference between cache line offsets of memory addresses of respective adjacent load requests. Hardware logic calculates a current stride between a current load request and a newest previous load request. The hardware logic compares the current stride and a stride M in the queue and compares the newest of the queued strides with a queued stride M+1, which is older than and adjacent to stride M. When the comparisons match, the hardware logic prefetches a cache line whose offset is the sum of the offset of the current load request and a stride M-1. Stride M-1 is newer than and adjacent to stride M in the queue.
  Type: Grant
  Filed: June 27, 2012
  Date of Patent: May 12, 2015
  Assignee: VIA Technologies, Inc.
  Inventors: Meera Ramani-Augustin, John Michael Greer
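The stride-queue match can be sketched in software. This is a behavioral model under assumptions (a plain list standing in for the hardware queue, newest stride at index 0, linear search for the match position M); the patent describes dedicated hardware logic, not a loop.

```python
def predict_next_offset(stride_queue, prev_offset, cur_offset):
    """Sketch of the match described in the abstract.

    stride_queue: past strides, newest first (index 0).
    Returns the cache-line offset to prefetch, or None if no match.
    """
    cur_stride = cur_offset - prev_offset
    # Find an index M where the current stride matches queued stride M
    # and the newest queued stride matches stride M+1 (older, adjacent).
    for m in range(1, len(stride_queue) - 1):
        if (cur_stride == stride_queue[m]
                and stride_queue[0] == stride_queue[m + 1]):
            # Stride M-1 is newer than and adjacent to stride M: in the
            # recorded history it is the stride that came next.
            return cur_offset + stride_queue[m - 1]
    return None
```

For example, loads at offsets 0, 1, 3, 4 leave the stride queue [1, 2, 1] (newest first); a load at offset 6 (stride 2) matches at M=1, so the predicted offset is 6 + 1 = 7, continuing the alternating 1, 2 pattern.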
- Publication number: 20140365753
  Abstract: A microprocessor includes a predicting unit and a control unit. The control unit controls the predicting unit to accumulate a history of characteristics of executed instructions and makes predictions related to subsequent instructions based on the history while the microprocessor is running a first thread. The control unit also detects a transition from running the first thread to running a second thread and controls the predicting unit to selectively suspend accumulating the history and making the predictions using the history while running the second thread. The predicting unit makes static predictions while running the second thread. The selectivity may be based on the privilege level, identity or length of the second thread, static prediction effectiveness during a previous execution instance of the thread, whether the transition was made due to a system call, and whether the second thread is an interrupt handler.
  Type: Application
  Filed: January 27, 2014
  Publication date: December 11, 2014
  Inventors: Rodney E. Hooker, Terry Parks, John Michael Greer
- Publication number: 20140349043
  Abstract: A structural component that experiences a changing magnetic field during use is described. The structural component can be a stator bore tube, a stator tooth or stator support, for example. A stator bore tube is made of a multi-layered composite material with layers of unidirectional carbon fibre reinforced polymer (CFRP). The eddy current direction is along the axis of the stator bore tube. The carbon fibres in the CFRP layers are orientated along the circumferential direction of the stator bore tube.
  Type: Application
  Filed: May 22, 2014
  Publication date: November 27, 2014
  Applicant: GE Energy Power Conversion Technology Ltd
  Inventor: John Michael Greer
- Patent number: 8880807
  Abstract: A data prefetcher in a microprocessor. The data prefetcher includes a plurality of period match counters associated with a corresponding plurality of different pattern periods. The data prefetcher also includes control logic that updates the plurality of period match counters in response to accesses to a memory block by the microprocessor, determines a clear pattern period based on the plurality of period match counters and prefetches into the microprocessor non-fetched cache lines within the memory block based on a pattern having the clear pattern period determined based on the plurality of period match counters.
  Type: Grant
  Filed: May 20, 2014
  Date of Patent: November 4, 2014
  Assignee: VIA Technologies, Inc.
  Inventors: Rodney E. Hooker, John Michael Greer
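The period-match idea can be illustrated with a small model: one counter per candidate period, incremented whenever an access lands a period-length away from an earlier access, with the winning period accepted only when it clearly dominates. The candidate periods, the set representation, and the "at least twice the runner-up" threshold are all assumptions for illustration; the patent does not disclose those specific values here.

```python
def clear_pattern_period(accessed_offsets, periods=(2, 3, 4, 5)):
    """Sketch: update one match counter per candidate period and pick
    the period whose counter clearly dominates.

    accessed_offsets: set of cache-line offsets already accessed in
    the memory block. Returns the clear pattern period, or None.
    """
    counters = {p: 0 for p in periods}
    for off in accessed_offsets:
        for p in periods:
            # An access p lines after an earlier access votes for period p.
            if off - p in accessed_offsets:
                counters[p] += 1
    best = max(periods, key=lambda p: counters[p])
    runner_up = max(counters[p] for p in periods if p != best)
    # "Clear" here means strictly dominant (illustrative threshold).
    if counters[best] >= 2 * max(runner_up, 1):
        return best
    return None
```

With accesses at offsets {0, 1, 4, 5, 8, 9} the period-4 counter collects four votes while no other period collects more than two, so period 4 is the clear pattern period; the prefetcher would then replicate the two-line pattern forward through the rest of the block.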
- Publication number: 20140310479
  Abstract: A microprocessor includes a first hardware data prefetcher that prefetches data into the microprocessor according to a first algorithm. The microprocessor also includes a second hardware data prefetcher that prefetches data into the microprocessor according to a second algorithm, wherein the first and second algorithms are different. The second prefetcher detects that it is prefetching data into the microprocessor according to the second algorithm in excess of a first predetermined rate and, in response, sends a throttle indication to the first prefetcher. The first prefetcher prefetches data into the microprocessor according to the first algorithm at below a second predetermined rate in response to receiving the throttle indication from the second prefetcher.
  Type: Application
  Filed: June 25, 2014
  Publication date: October 16, 2014
  Inventors: Rodney E. Hooker, John Michael Greer
- Publication number: 20140289479
  Abstract: A data prefetcher in a microprocessor. The data prefetcher includes a plurality of period match counters associated with a corresponding plurality of different pattern periods. The data prefetcher also includes control logic that updates the plurality of period match counters in response to accesses to a memory block by the microprocessor, determines a clear pattern period based on the plurality of period match counters and prefetches into the microprocessor non-fetched cache lines within the memory block based on a pattern having the clear pattern period determined based on the plurality of period match counters.
  Type: Application
  Filed: May 20, 2014
  Publication date: September 25, 2014
  Applicant: VIA Technologies, Inc.
  Inventors: Rodney E. Hooker, John Michael Greer
- Patent number: 8762649
  Abstract: A data prefetcher in a microprocessor having a cache memory receives memory accesses each to an address within a memory block. The access addresses are non-monotonically increasing or decreasing as a function of time. As the accesses are received, the prefetcher maintains a largest address and a smallest address of the accesses and counts of changes to the largest and smallest addresses and maintains a history of recently accessed cache lines implicated by the access addresses within the memory block. The prefetcher also determines a predominant access direction based on the counts and determines a predominant access pattern based on the history. The prefetcher also prefetches into the cache memory, in the predominant access direction according to the predominant access pattern, cache lines of the memory block which the history indicates have not been recently accessed.
  Type: Grant
  Filed: February 24, 2011
  Date of Patent: June 24, 2014
  Assignee: VIA Technologies, Inc.
  Inventors: Rodney E. Hooker, John Michael Greer
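The direction vote at the heart of this abstract (track the largest and smallest offsets seen, count how often each bound moves, and let the more active bound decide) can be sketched directly. The function name and the tie-handling are illustrative assumptions; the abstract only specifies the bounds, the change counts, and a predominant direction derived from them.

```python
def predominant_direction(offsets):
    """Sketch of the direction vote: each time a new access pushes the
    largest offset up, the "up" count grows; each time it pushes the
    smallest offset down, the "down" count grows. The busier bound
    gives the predominant access direction."""
    largest = smallest = offsets[0]
    up = down = 0
    for off in offsets[1:]:
        if off > largest:
            largest, up = off, up + 1
        elif off < smallest:
            smallest, down = off, down + 1
    if up > down:
        return "up"
    if down > up:
        return "down"
    return "none"
```

Because only the bounds vote, out-of-order accesses inside the already-covered range (common when loads complete non-monotonically) do not disturb the direction estimate, which is the point of this scheme.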
- Patent number: 8719510
  Abstract: A microprocessor includes a cache memory and a data prefetcher. The data prefetcher detects a pattern of memory accesses within a first memory block and prefetches into the cache memory cache lines from the first memory block based on the pattern. The data prefetcher also observes a new memory access request to a second memory block. The data prefetcher also determines that the first memory block is virtually adjacent to the second memory block and that the pattern, when continued from the first memory block to the second memory block, predicts an access to a cache line implicated by the new request within the second memory block. The data prefetcher also responsively prefetches into the cache memory cache lines from the second memory block based on the pattern.
  Type: Grant
  Filed: February 24, 2011
  Date of Patent: May 6, 2014
  Assignee: VIA Technologies, Inc.
  Inventors: Rodney E. Hooker, John Michael Greer
- Patent number: 8656111
  Abstract: A data prefetcher in a microprocessor having a cache memory receives memory accesses each to an address within a memory block. The access addresses are non-monotonically increasing or decreasing as a function of time. As the accesses are received, the prefetcher maintains a largest address and a smallest address of the accesses and counts of changes to the largest and smallest addresses and maintains a history of recently accessed cache lines implicated by the access addresses within the memory block. The prefetcher also determines a predominant access direction based on the counts and determines a predominant access pattern based on the history. The prefetcher also prefetches into the cache memory, in the predominant access direction according to the predominant access pattern, cache lines of the memory block which the history indicates have not been recently accessed.
  Type: Grant
  Filed: February 24, 2011
  Date of Patent: February 18, 2014
  Assignee: VIA Technologies, Inc.
  Inventors: Rodney E. Hooker, John Michael Greer
- Patent number: 8645631
  Abstract: A microprocessor includes a first-level cache memory, a second-level cache memory, and a data prefetcher that detects a predominant direction and pattern of recent memory accesses presented to the second-level cache memory and prefetches cache lines into the second-level cache memory based on the predominant direction and pattern. The data prefetcher also receives from the first-level cache memory an address of a memory access received by the first-level cache memory, wherein the address implicates a cache line. The data prefetcher also determines one or more cache lines indicated by the pattern beyond the implicated cache line in the predominant direction. The data prefetcher also causes the one or more cache lines to be prefetched into the first-level cache memory.
  Type: Grant
  Filed: February 24, 2011
  Date of Patent: February 4, 2014
  Assignee: VIA Technologies, Inc.
  Inventors: Rodney E. Hooker, John Michael Greer
- Publication number: 20140006718
  Abstract: A hardware data prefetcher includes a queue of indexed storage elements into which are queued strides associated with a stream of temporally adjacent load requests. Each stride is a difference between cache line offsets of memory addresses of respective adjacent load requests. Hardware logic calculates a current stride between a current load request and a newest previous load request. The hardware logic compares the current stride and a stride M in the queue and compares the newest of the queued strides with a queued stride M+1, which is older than and adjacent to stride M. When the comparisons match, the hardware logic prefetches a cache line whose offset is the sum of the offset of the current load request and a stride M-1. Stride M-1 is newer than and adjacent to stride M in the queue.
  Type: Application
  Filed: June 27, 2012
  Publication date: January 2, 2014
  Applicant: VIA Technologies, Inc.
  Inventors: Meera Ramani-Augustin, John Michael Greer
- Publication number: 20130200734
  Abstract: The invention relates to a component such as a rotor or stator for an electrical machine. The component includes a plurality of axially adjacent stacks of laminations. At least one pair of adjacent stacks are spaced apart in the axial direction by spacer means such that a passageway or duct for cooling fluid, e.g. air, is formed therebetween. The spacer means comprises a porous structural mat of metal fibres. The cooling fluid may flow through the spaces or voids between the fibres.
  Type: Application
  Filed: March 24, 2011
  Publication date: August 8, 2013
  Applicant: GE Energy Power Conversion Technology Ltd.
  Inventor: John Michael Greer
- Patent number: 8364902
  Abstract: A microprocessor includes an instruction decoder for decoding a repeat prefetch indirect instruction that includes address operands used to calculate an address of a first entry in a prefetch table having a plurality of entries, each including a prefetch address. The repeat prefetch indirect instruction also includes a count specifying a number of cache lines to be prefetched. The memory address of each of the cache lines is specified by the prefetch address in one of the entries in the prefetch table. A count register, initially loaded with the count specified in the prefetch instruction, stores a remaining count of the cache lines to be prefetched. Control logic fetches the prefetch addresses of the cache lines from the table into the microprocessor and prefetches the cache lines from the system memory into a cache memory of the microprocessor using the count register and the prefetch addresses fetched from the table.
  Type: Grant
  Filed: October 15, 2009
  Date of Patent: January 29, 2013
  Assignee: VIA Technologies, Inc.
  Inventors: Rodney E. Hooker, John Michael Greer
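The architectural semantics of the instruction (walk a table of prefetch addresses, issuing one prefetch per entry until the count register reaches zero) can be written as a behavioral sketch. Everything here is a model: the function name, the Python list standing in for the in-memory table, and the callback standing in for the hardware prefetch of a cache line are all assumptions.

```python
def repeat_prefetch_indirect(table, first_index, count, prefetch):
    """Behavioral sketch of the repeat prefetch indirect instruction.

    table: prefetch table; each entry holds a prefetch address.
    first_index: index of the first entry (from the address operands).
    count: number of cache lines to prefetch.
    prefetch: stand-in for the hardware cache-line prefetch.
    """
    remaining = count          # models the count register
    index = first_index
    while remaining > 0:
        addr = table[index]    # prefetch address fetched from the table
        prefetch(addr)         # cache line prefetched from system memory
        index += 1
        remaining -= 1
```

The indirection is the notable design point: unlike a conventional prefetch hint, the addresses need not follow any stride, since each one is read from a software-populated table.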
- Publication number: 20110238920
  Abstract: A microprocessor includes a cache memory and a data prefetcher. The data prefetcher detects a pattern of memory accesses within a first memory block and prefetches into the cache memory cache lines from the first memory block based on the pattern. The data prefetcher also observes a new memory access request to a second memory block. The data prefetcher also determines that the first memory block is virtually adjacent to the second memory block and that the pattern, when continued from the first memory block to the second memory block, predicts an access to a cache line implicated by the new request within the second memory block. The data prefetcher also responsively prefetches into the cache memory cache lines from the second memory block based on the pattern.
  Type: Application
  Filed: February 24, 2011
  Publication date: September 29, 2011
  Applicant: VIA Technologies, Inc.
  Inventors: Rodney E. Hooker, John Michael Greer
- Publication number: 20110238923
  Abstract: A microprocessor includes a first-level cache memory, a second-level cache memory, and a data prefetcher that detects a predominant direction and pattern of recent memory accesses presented to the second-level cache memory and prefetches cache lines into the second-level cache memory based on the predominant direction and pattern. The data prefetcher also receives from the first-level cache memory an address of a memory access received by the first-level cache memory, wherein the address implicates a cache line. The data prefetcher also determines one or more cache lines indicated by the pattern beyond the implicated cache line in the predominant direction. The data prefetcher also causes the one or more cache lines to be prefetched into the first-level cache memory.
  Type: Application
  Filed: February 24, 2011
  Publication date: September 29, 2011
  Applicant: VIA Technologies, Inc.
  Inventors: Rodney E. Hooker, John Michael Greer
- Publication number: 20110238922
  Abstract: A data prefetcher in a microprocessor having a cache memory receives memory accesses each to an address within a memory block. The access addresses are non-monotonically increasing or decreasing as a function of time. As the accesses are received, the prefetcher maintains a largest address and a smallest address of the accesses and counts of changes to the largest and smallest addresses and maintains a history of recently accessed cache lines implicated by the access addresses within the memory block. The prefetcher also determines a predominant access direction based on the counts and determines a predominant access pattern based on the history. The prefetcher also prefetches into the cache memory, in the predominant access direction according to the predominant access pattern, cache lines of the memory block which the history indicates have not been recently accessed.
  Type: Application
  Filed: February 24, 2011
  Publication date: September 29, 2011
  Applicant: VIA Technologies, Inc.
  Inventors: Rodney E. Hooker, John Michael Greer
- Publication number: 20110035551
  Abstract: A microprocessor includes an instruction decoder for decoding a repeat prefetch indirect instruction that includes address operands used to calculate an address of a first entry in a prefetch table having a plurality of entries, each including a prefetch address. The repeat prefetch indirect instruction also includes a count specifying a number of cache lines to be prefetched. The memory address of each of the cache lines is specified by the prefetch address in one of the entries in the prefetch table. A count register, initially loaded with the count specified in the prefetch instruction, stores a remaining count of the cache lines to be prefetched. Control logic fetches the prefetch addresses of the cache lines from the table into the microprocessor and prefetches the cache lines from the system memory into a cache memory of the microprocessor using the count register and the prefetch addresses fetched from the table.
  Type: Application
  Filed: October 15, 2009
  Publication date: February 10, 2011
  Inventors: Rodney E. Hooker, John Michael Greer
- Publication number: 20110010506
  Abstract: A data prefetcher includes a table of entries to maintain a history of load operations. Each entry stores a tag and a corresponding next stride. The tag comprises a concatenation of first and second strides. The next stride comprises the first stride. The first stride comprises a first cache line address subtracted from a second cache line address. The second stride comprises the second cache line address subtracted from a third cache line address. The first, second and third cache line addresses each comprise a memory address of a cache line implicated by respective first, second and third temporally preceding load operations. Control logic calculates a current stride by subtracting a previous cache line address from a new load cache line address, looks up in the table a concatenation of a previous stride and the current stride, and prefetches a cache line using the next stride of the hitting table entry.
  Type: Application
  Filed: October 5, 2009
  Publication date: January 13, 2011
  Applicant: VIA Technologies, Inc.
  Inventors: John Michael Greer, Rodney E. Hooker, Albert J. Loper, Jr.
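The two-stride history table can be modeled with a dictionary keyed by the pair of preceding strides. The function name, the dictionary standing in for the tagged hardware table, and the combined train-then-predict flow are assumptions made for a compact illustration; in hardware, training and lookup are of course separate, concurrent operations.

```python
def train_and_predict(line_addrs):
    """Sketch of the two-stride history table: each entry is keyed by
    the pair (stride before last, last stride) and stores the stride
    that followed them. The two most recent strides are then looked
    up to predict the next cache line address.

    line_addrs: cache line addresses of temporally adjacent loads,
    oldest first. Returns the predicted next address, or None.
    """
    table = {}
    strides = [b - a for a, b in zip(line_addrs, line_addrs[1:])]
    # Train: record which stride followed each adjacent stride pair.
    for s1, s2, s_next in zip(strides, strides[1:], strides[2:]):
        table[(s1, s2)] = s_next
    # Predict from the two newest strides, if that pair has been seen.
    if len(strides) >= 2 and (strides[-2], strides[-1]) in table:
        return line_addrs[-1] + table[(strides[-2], strides[-1])]
    return None
```

With loads at addresses 0, 1, 3, 6, 7, 9, 12, 13 (a repeating 1, 2, 3 stride pattern), the newest stride pair (3, 1) hits an entry recording that stride 2 came next, so the predicted address is 13 + 2 = 15. Keying on two strides rather than one lets the table follow such multi-stride patterns that a single-stride predictor would miss.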