Patents by Inventor Steven Gerard LeMire
Steven Gerard LeMire has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 9336154
Abstract: Embodiments of the current invention permit a user to allocate cache memory to main memory more efficiently. The processor or a user allocates the cache memory and associates the cache memory with the main memory location, but suppresses or bypasses reading the main memory data into the cache memory. Some embodiments of the present invention permit the user to specify how many cache lines are allocated at a given time. Further, embodiments of the present invention may initialize the cache memory to a specified pattern. The cache memory may be zeroed or set to some desired pattern, such as all ones. Alternatively, a user may determine the initialization pattern through the processor.
Type: Grant
Filed: August 26, 2015
Date of Patent: May 10, 2016
Assignee: Avago Technologies General IP (Singapore) Pte. Ltd.
Inventors: Steven Gerard LeMire, Vuong Cao Nguyen
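A minimal sketch of the allocation scheme described in this abstract, assuming a 64-byte cache line and a hypothetical cache-control primitive (cache_line_allocate_init and cache_alloc_region are illustrative names, not functions from the patent or any real driver API): cache lines covering a buffer are allocated, tagged, and filled with a chosen pattern without first reading the existing main-memory contents into the cache.

```c
#include <stddef.h>
#include <stdint.h>

#define CACHE_LINE_SIZE 64  /* assumed line size in bytes */

/*
 * Hypothetical cache-control primitive: allocate and tag one cache line
 * for the given main-memory address and fill it with a pattern, WITHOUT
 * fetching the existing contents of main memory. On real hardware this
 * would map to a cache-control instruction or register write (e.g. a
 * "zero cache line" style operation); here it is only a stub.
 */
static void cache_line_allocate_init(void *addr, uint8_t pattern)
{
    /* Stub: a real implementation would issue a cache-allocate/zero op. */
    (void)addr;
    (void)pattern;
}

/*
 * Allocate 'nlines' cache lines covering the buffer at 'base' and
 * initialize them to 'pattern' (0x00, 0xFF, or any user-chosen value),
 * bypassing the usual read of main-memory data into the cache.
 */
void cache_alloc_region(void *base, size_t nlines, uint8_t pattern)
{
    uint8_t *p = (uint8_t *)base;

    for (size_t i = 0; i < nlines; i++) {
        cache_line_allocate_init(p + i * CACHE_LINE_SIZE, pattern);
    }
}
```

The point of the scheme is that the data in main memory is about to be overwritten anyway, so fetching it into the cache first is wasted bandwidth; the caller instead specifies how many lines to allocate and what pattern they should start with.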
-
Publication number: 20150378923
Abstract: Embodiments of the current invention permit a user to allocate cache memory to main memory more efficiently. The processor or a user allocates the cache memory and associates the cache memory with the main memory location, but suppresses or bypasses reading the main memory data into the cache memory. Some embodiments of the present invention permit the user to specify how many cache lines are allocated at a given time. Further, embodiments of the present invention may initialize the cache memory to a specified pattern. The cache memory may be zeroed or set to some desired pattern, such as all ones. Alternatively, a user may determine the initialization pattern through the processor.
Type: Application
Filed: August 26, 2015
Publication date: December 31, 2015
Inventors: Steven Gerard LeMire, Vuong Cao Nguyen
-
Patent number: 9195605
Abstract: Embodiments of the current invention permit a user to allocate cache memory to main memory more efficiently. The processor or a user allocates the cache memory and associates the cache memory with the main memory location, but suppresses or bypasses reading the main memory data into the cache memory. Some embodiments of the present invention permit the user to specify how many cache lines are allocated at a given time. Further, embodiments of the present invention may initialize the cache memory to a specified pattern. The cache memory may be zeroed or set to some desired pattern, such as all ones. Alternatively, a user may determine the initialization pattern through the processor.
Type: Grant
Filed: April 15, 2015
Date of Patent: November 24, 2015
Assignee: Emulex Corporation
Inventors: Steven Gerard LeMire, Vuong Cao Nguyen
-
Publication number: 20150220443
Abstract: Embodiments of the current invention permit a user to allocate cache memory to main memory more efficiently. The processor or a user allocates the cache memory and associates the cache memory with the main memory location, but suppresses or bypasses reading the main memory data into the cache memory. Some embodiments of the present invention permit the user to specify how many cache lines are allocated at a given time. Further, embodiments of the present invention may initialize the cache memory to a specified pattern. The cache memory may be zeroed or set to some desired pattern, such as all ones. Alternatively, a user may determine the initialization pattern through the processor.
Type: Application
Filed: April 15, 2015
Publication date: August 6, 2015
Inventors: Steven Gerard LeMire, Vuong Cao Nguyen
-
Patent number: 9043558
Abstract: Embodiments of the current invention permit a user to allocate cache memory to main memory more efficiently. The processor or a user allocates the cache memory and associates the cache memory with the main memory location, but suppresses or bypasses reading the main memory data into the cache memory. Some embodiments of the present invention permit the user to specify how many cache lines are allocated at a given time. Further, embodiments of the present invention may initialize the cache memory to a specified pattern. The cache memory may be zeroed or set to some desired pattern, such as all ones. Alternatively, a user may determine the initialization pattern through the processor.
Type: Grant
Filed: October 17, 2014
Date of Patent: May 26, 2015
Assignee: Emulex Corporation
Inventors: Steven Gerard LeMire, Vuong Cao Nguyen
-
Publication number: 20150039839
Abstract: Embodiments of the current invention permit a user to allocate cache memory to main memory more efficiently. The processor or a user allocates the cache memory and associates the cache memory with the main memory location, but suppresses or bypasses reading the main memory data into the cache memory. Some embodiments of the present invention permit the user to specify how many cache lines are allocated at a given time. Further, embodiments of the present invention may initialize the cache memory to a specified pattern. The cache memory may be zeroed or set to some desired pattern, such as all ones. Alternatively, a user may determine the initialization pattern through the processor.
Type: Application
Filed: October 17, 2014
Publication date: February 5, 2015
Inventors: Steven Gerard LeMire, Vuong Cao Nguyen
-
Patent number: 8892823
Abstract: Embodiments of the current invention permit a user to allocate cache memory to main memory more efficiently. The processor or a user allocates the cache memory and associates the cache memory with the main memory location, but suppresses or bypasses reading the main memory data into the cache memory. Some embodiments of the present invention permit the user to specify how many cache lines are allocated at a given time. Further, embodiments of the present invention may initialize the cache memory to a specified pattern. The cache memory may be zeroed or set to some desired pattern, such as all ones. Alternatively, a user may determine the initialization pattern through the processor.
Type: Grant
Filed: December 28, 2007
Date of Patent: November 18, 2014
Assignee: Emulex Corporation
Inventors: Steven Gerard LeMire, Vuong Cao Nguyen
-
Patent number: 8111696
Abstract: A method is disclosed for indicating a status of a transfer of data from a first device to a second device over a network. In one embodiment, the data includes one or more data frames. Each frame includes a header having one or more bits. The method includes setting a last bit of the one or more bits in the header of a last frame of the one or more data frames to a first value if the status of the transfer of data is good and setting the value of the last bit of the last data frame to a second value if the transfer of data failed. This results in a less congested, more efficient network.
Type: Grant
Filed: October 14, 2008
Date of Patent: February 7, 2012
Assignee: Emulex Design & Manufacturing Corporation
Inventors: Vuong Cao Nguyen, Steven Gerard LeMire, Raul Bersamin Oteyza, Jeff Junwei Zheng
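A minimal sketch of the status-signalling idea in this abstract, assuming the status is carried in the low-order bit of a flags word; the struct layout, field name, and bit position are illustrative assumptions, not the actual frame format from the patent. The receiver learns whether the whole transfer succeeded from the last frame itself, so no separate status frame has to cross the network.

```c
#include <stdbool.h>
#include <stdint.h>

#define STATUS_BIT 0x01u   /* assumed position of the status bit in the header */

/* Simplified frame header; the field name is illustrative only. */
struct frame_header {
    uint32_t flags;        /* low-order bit carries the transfer status */
};

/*
 * Mark the final frame of a transfer with its completion status:
 * the bit takes a first value when the transfer completed successfully
 * and a second value when it failed, letting the receiver learn the
 * outcome without an extra status exchange on the wire.
 */
void set_transfer_status(struct frame_header *last_frame_hdr, bool transfer_ok)
{
    if (transfer_ok)
        last_frame_hdr->flags |= STATUS_BIT;   /* first value: transfer good */
    else
        last_frame_hdr->flags &= ~STATUS_BIT;  /* second value: transfer failed */
}
```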
-
Patent number: 7805572
Abstract: Embodiments of the present invention are directed to a scheme in which information as to the future behavior of particular software is used in order to optimize cache management and reduce cache pollution. Accordingly, a certain type of data can be defined as “short life data” by using knowledge of the expected behavior of particular software. Short life data can be a type of data which, according to the ordinary expected operation of the software, is not expected to be used by the software often in the future. Data blocks which are to be stored in the cache can be examined to determine if they are short life data blocks. If the data blocks are in fact short life data blocks, they can be stored only in a particular short life area of the cache.
Type: Grant
Filed: June 29, 2007
Date of Patent: September 28, 2010
Assignee: Emulex Design & Manufacturing Corporation
Inventors: Steven Gerard LeMire, Eddie Miller, Eric David Peel
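A rough sketch of the short-life placement idea, under assumed parameters: a set with eight ways of which one is reserved for short-life data, and a placeholder classification rule based on an address range. The way counts, the classify_block criterion, and choose_way are illustrative assumptions, not the patent's actual mechanism; the point shown is only that blocks tagged as short-life are confined to a dedicated region so they cannot evict long-lived data.

```c
#include <stddef.h>
#include <stdint.h>

#define MAIN_WAYS        7   /* ways available for ordinary data (assumed)      */
#define SHORT_LIFE_WAYS  1   /* ways reserved for short-life data (assumed)     */

enum data_class { DATA_NORMAL, DATA_SHORT_LIFE };

/*
 * Classification hook: software that knows its own access pattern tags
 * blocks it does not expect to reuse soon (for example, one-shot
 * descriptors) as short-life. The rule below is a placeholder.
 */
static enum data_class classify_block(uint64_t addr)
{
    /* Illustrative rule only: treat one marked address range as short-life. */
    return (addr >= 0x8000000ULL) ? DATA_SHORT_LIFE : DATA_NORMAL;
}

/*
 * Choose where a block may be cached. Short-life blocks go only to the
 * reserved way, limiting cache pollution; normal blocks use the rest.
 */
size_t choose_way(uint64_t addr)
{
    if (classify_block(addr) == DATA_SHORT_LIFE)
        return MAIN_WAYS;                  /* index of the single short-life way */
    return (size_t)(addr % MAIN_WAYS);     /* simple placement for normal data   */
}
```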
-
Publication number: 20100091658
Abstract: A method is disclosed for indicating a status of a transfer of data from a first device to a second device over a network. In one embodiment, the data includes one or more data frames. Each frame includes a header having one or more bits. The method includes setting a last bit of the one or more bits in the header of a last frame of the one or more data frames to a first value if the status of the transfer of data is good and setting the value of the last bit of the last data frame to a second value if the transfer of data failed. This results in a less congested, more efficient network.
Type: Application
Filed: October 14, 2008
Publication date: April 15, 2010
Inventors: Vuong Cao Nguyen, Steven Gerard LeMire, Raul Bersamin Oteyza, Jeff Junwei Zheng
-
Publication number: 20090172287
Abstract: Embodiments of the current invention permit a user to allocate cache memory to main memory more efficiently. The processor or a user allocates the cache memory and associates the cache memory with the main memory location, but suppresses or bypasses reading the main memory data into the cache memory. Some embodiments of the present invention permit the user to specify how many cache lines are allocated at a given time. Further, embodiments of the present invention may initialize the cache memory to a specified pattern. The cache memory may be zeroed or set to some desired pattern, such as all ones. Alternatively, a user may determine the initialization pattern through the processor.
Type: Application
Filed: December 28, 2007
Publication date: July 2, 2009
Inventors: Steven Gerard LeMire, Vuong Cao Nguyen
-
Publication number: 20090006761
Abstract: Embodiments of the present invention are directed to a scheme in which information as to the future behavior of particular software is used in order to optimize cache management and reduce cache pollution. Accordingly, a certain type of data can be defined as “short life data” by using knowledge of the expected behavior of particular software. Short life data can be a type of data which, according to the ordinary expected operation of the software, is not expected to be used by the software often in the future. Data blocks which are to be stored in the cache can be examined to determine if they are short life data blocks. If the data blocks are in fact short life data blocks, they can be stored only in a particular short life area of the cache.
Type: Application
Filed: June 29, 2007
Publication date: January 1, 2009
Applicant: Emulex Design & Manufacturing Corporation
Inventors: Steven Gerard LeMire, Eddie Miller, Eric David Peel