Patents by Inventor Dongpei Su

Dongpei Su has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 9531920
    Abstract: An example embodiment may involve obtaining an a×b pixel macro-cell from an input image. The a×b pixel macro-cell may contain 4 non-overlapping m×n pixel cells. The a×b pixels in the a×b pixel macro-cell may have respective color values and may be associated with respective object type tags. The example embodiment may also include selecting a compression technique to either (i) compress the a×b pixel macro-cell as a whole, or (ii) compress the a×b pixel macro-cell by compressing each of the 4 non-overlapping m×n pixel cells separately. The example embodiment may further include compressing the a×b pixel macro-cell according to the selected compression technique, and writing a representation of the compressed a×b pixel macro-cell to a computer-readable output medium.
    Type: Grant
    Filed: January 30, 2015
    Date of Patent: December 27, 2016
    Assignee: KYOCERA Document Solutions Inc.
    Inventors: Michael M. Chang, Kenneth A. Schmidt, Dongpei Su, Sheng Li, Kendrick Wong, Alfred Abkarian, Stephen L. Schafer
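A minimal sketch of the selection step described in 9531920 above, assuming an 8×8 macro-cell made of four 4×4 cells and using zlib purely as a stand-in for the patent's actual encoders: compress the macro-cell both ways and keep whichever representation is smaller.

```python
import zlib

def cells_of(macro, m=4, n=4):
    """Return the 4 non-overlapping m x n cells of the macro-cell."""
    return [[row[c:c + n] for row in macro[r:r + m]]
            for r in range(0, len(macro), m)
            for c in range(0, len(macro[0]), n)]

def compress_macro_cell(macro):
    # Technique (i): compress the macro-cell as a whole.
    whole = zlib.compress(bytes(v for row in macro for v in row))
    # Technique (ii): compress each of the 4 cells separately.
    per_cell = b"".join(zlib.compress(bytes(v for row in cell for v in row))
                        for cell in cells_of(macro))
    # Select the technique that yields the smaller output.
    if len(whole) <= len(per_cell):
        return "whole", whole
    return "per_cell", per_cell

macro = [[(3 * r + 5 * c) % 256 for c in range(8)] for r in range(8)]
print(compress_macro_cell(macro)[0])
```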
  • Patent number: 9489709
    Abstract: A system and method for implementing a real-time state machine with a microcontroller is disclosed. The method includes using a two-stage process, including a configuration stage and a run-time stage, for processing objects for a printing device. The configuration stage is executed prior to the run-time stage, which operates in real-time. During the configuration stage, the system predetermines a state transition list, devices that need to be monitored, devices that need to be controlled, and other variables used during the run-time stage. Once the configuration stage is complete, the system executes the run-time stage in real-time to complete processing of an object for a printing device. By pre-calculating items during the configuration stage, the system reduces the execution time of the run-time stage in real-time. As a result, the performance of the microcontroller in real-time is enhanced.
    Type: Grant
    Filed: March 27, 2015
    Date of Patent: November 8, 2016
    Assignee: KYOCERA Document Solutions Inc.
    Inventors: Dongpei Su, Masayoshi Nakamura, Christa Neil, Kenneth A. Schmidt
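A minimal sketch of the two-stage idea in 9489709 above: a configuration stage precomputes the state-transition table and the list of monitored devices, so the run-time stage reduces to table lookups. The states, events, and device names here are hypothetical placeholders, not taken from the patent.

```python
def configure():
    # Configuration stage: precompute everything the run-time loop will need.
    transitions = {
        ("idle", "paper_detected"): "feeding",
        ("feeding", "at_transfer"): "printing",
        ("printing", "page_done"): "idle",
    }
    monitored_devices = ["paper_sensor", "transfer_sensor"]
    return transitions, monitored_devices

def run(transitions, events):
    # Run-time stage: only dictionary lookups, nothing is recomputed.
    state = "idle"
    for ev in events:
        state = transitions.get((state, ev), state)
        yield state

transitions, _ = configure()
print(list(run(transitions, ["paper_detected", "at_transfer", "page_done"])))
```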
  • Publication number: 20160284042
    Abstract: A system and method for implementing a real-time state machine with a microcontroller is disclosed. The method includes using a two-stage process, including a configuration stage and a run-time stage, for processing objects for a printing device. The configuration stage is executed prior to the run-time stage, which operates in real-time. During the configuration stage, the system predetermines a state transition list, devices that need to be monitored, devices that need to be controlled, and other variables used during the run-time stage. Once the configuration stage is complete, the system executes the run-time stage in real-time to complete processing of an object for a printing device. By pre-calculating items during the configuration stage, the system reduces the execution time of the run-time stage in real-time. As a result, the performance of the microcontroller in real-time is enhanced.
    Type: Application
    Filed: March 27, 2015
    Publication date: September 29, 2016
    Inventors: Dongpei Su, Masayoshi Nakamura, Christa Neil, Kenneth A. Schmidt
  • Publication number: 20160286083
    Abstract: A computer-implemented method comprises receiving a data stream that includes a series of code words that encodes a respective series of pixel data according to a first entropy coding lookup table, and processing the data stream to determine if there is a match between a first code word and a consecutive second code word, and a code word entry in a second entropy coding lookup table. The method also includes, if there is a match, decoding the first code word and the second code word using the second entropy coding lookup table. Further, the method includes, if there is not a match, decoding the first code word using the first entropy coding lookup table.
    Type: Application
    Filed: March 20, 2015
    Publication date: September 29, 2016
    Inventors: Dongpei Su, Kenneth A. Schmidt, Thien-Phuc Nguyen Do, Sheng Li
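A minimal sketch of the two-table decode in 20160286083 above, with tiny hypothetical tables: if a pair of consecutive code words matches an entry in the second (pair-oriented) table, both are decoded at once; otherwise the first code word is decoded with the first table.

```python
table1 = {"0": 0, "10": 1, "11": 2}        # one code word -> one pixel value
table2 = {"00": (0, 0), "1010": (1, 1)}    # concatenated pair -> two pixel values

def decode(code_words):
    out, i = [], 0
    while i < len(code_words):
        pair = code_words[i] + code_words[i + 1] if i + 1 < len(code_words) else None
        if pair in table2:
            out.extend(table2[pair])           # match: decode both via table 2
            i += 2
        else:
            out.append(table1[code_words[i]])  # no match: fall back to table 1
            i += 1
    return out

print(decode(["0", "0", "10", "11"]))          # -> [0, 0, 1, 2]
```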
  • Publication number: 20160269726
    Abstract: An example embodiment may involve obtaining an a×b pixel macro-cell from an input image. Pixels in the a×b pixel macro-cell may have respective pixel values and may be associated with respective tags. It may be determined whether at least e of the respective tags indicate that their associated pixels represent edges in the input image. Based on this determination, either a first encoding or a second encoding of the a×b pixel macro-cell may be selected. The first encoding may weigh pixels that represent edges in the input image heavier than pixels that do not represent edges in the input image, and the second encoding might not consider whether pixels represent edges. The selected encoding may be performed and written to a computer-readable output medium.
    Type: Application
    Filed: May 25, 2016
    Publication date: September 15, 2016
    Inventors: Michael M. Chang, Kenneth A. Schmidt, Dongpei Su, Sheng Li, Kendrick Wong
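A minimal sketch of the tag-driven choice in 20160269726 above: if at least e pixels in the macro-cell carry an edge tag, use an encoding that weighs edge pixels more heavily (here an edge-weighted mean); otherwise use an encoding that ignores the tags (a plain mean). The weighting scheme is illustrative only, not the patent's encoding.

```python
def encode_macro_cell(values, tags, e=4, edge_weight=4):
    edge_count = sum(tags)
    if edge_count >= e:
        # First encoding: edge pixels contribute more to the output value.
        weights = [edge_weight if t else 1 for t in tags]
        return sum(w * v for w, v in zip(weights, values)) / sum(weights)
    # Second encoding: edges are not considered.
    return sum(values) / len(values)

values = [10, 200, 12, 11, 198, 13, 12, 205]   # flattened macro-cell pixel values
tags   = [0,   1,   0,  0,  1,   0,  0,  1]    # 1 = pixel sits on an edge
print(encode_macro_cell(values, tags, e=2))
```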
  • Publication number: 20160255246
    Abstract: An example embodiment may involve obtaining (i) an a×b attribute macro-cell, and (ii) a×b pixel macro-cells for each of a luminance plane, a first color plane, and a second color plane of an input image. The a×b pixel macro-cells may each contain 4 non-overlapping m×n pixel cells. The example embodiment may also involve determining 4 attribute-plane output values that represent the 4 non-overlapping m×n attribute cells, 1 to 4 luminance-plane output values that represent the a×b pixel macro-cell of the luminance plane, a first color-plane output value to represent the a×b pixel macro-cell of the first color plane, and a second color-plane output value to represent the a×b pixel macro-cell of the second color plane. The example embodiment may further involve writing an interleaved representation of the output values to a computer-readable output medium.
    Type: Application
    Filed: May 10, 2016
    Publication date: September 1, 2016
    Inventors: Michael M. Chang, Kenneth A. Schmidt, Dongpei Su, Sheng Li, Kendrick Wong, Larry Lubman
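A minimal sketch of the interleaving idea in 20160255246 above: one output value per 4×4 attribute cell, up to four luminance values, and one value for each chroma plane of an 8×8 macro-cell, written one after another. The "output value" here is simply a mean; the patent's actual encodings are more elaborate.

```python
def mean(cell):
    flat = [v for row in cell for v in row]
    return sum(flat) // len(flat)

def cells_of(macro, m=4, n=4):
    return [[row[c:c + n] for row in macro[r:r + m]]
            for r in range(0, len(macro), m)
            for c in range(0, len(macro[0]), n)]

def interleave(attr_macro, y_macro, cb_macro, cr_macro):
    out = [mean(c) for c in cells_of(attr_macro)]   # 4 attribute-plane values
    out += [mean(c) for c in cells_of(y_macro)]     # up to 4 luminance values
    out.append(mean(cb_macro))                      # 1 first-color-plane value
    out.append(mean(cr_macro))                      # 1 second-color-plane value
    return out

flat_plane = [[128] * 8 for _ in range(8)]
print(interleave(flat_plane, flat_plane, flat_plane, flat_plane))
```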
  • Publication number: 20160255247
    Abstract: An example embodiment may involve obtaining an m×n pixel cell from an input image. Each of the m×n pixels in the m×n pixel cell may be associated with at least one color value. An m×n attribute cell may be determined, elements of which may be associated in a one-to-one fashion with respective pixels in the m×n pixel cell. The m×n pixel cell may be compressed in a lossy fashion, and the m×n attribute cell may be compressed in a lossless fashion. Compression of the m×n pixel cell may be based on at least part of the m×n attribute cell. An interleaved representation of the compressed m×n pixel cell and the compressed m×n attribute cell may be written to an output medium.
    Type: Application
    Filed: May 10, 2016
    Publication date: September 1, 2016
    Inventors: Michael M. Chang, Kenneth A. Schmidt, Dongpei Su, Sheng Li, Kendrick Wong, Larry Lubman, Alfred Abkarian, Stephen L. Schafer
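A minimal sketch of the split treatment in 20160255247 above: the 4×4 pixel cell is compressed lossily (here by coarse quantization), the matching 4×4 attribute cell losslessly (here with zlib), and both results are written to one output. The specific codecs and the interleaving format are stand-ins, not the ones claimed in the patent.

```python
import zlib

def compress_pair(pixel_cell, attr_cell):
    flat_pixels = bytes(v for row in pixel_cell for v in row)
    flat_attrs = bytes(v for row in attr_cell for v in row)
    lossy = bytes(v & 0xF0 for v in flat_pixels)   # drop the low nibble: lossy
    lossless = zlib.compress(flat_attrs)           # attributes are kept exactly
    return lossy + lossless                        # both written to one output

pixels = [[17, 33, 200, 250], [18, 34, 199, 251],
          [16, 35, 201, 249], [19, 32, 198, 252]]
attrs = [[1, 1, 2, 2]] * 4
print(len(compress_pair(pixels, attrs)))
```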
  • Publication number: 20160227076
    Abstract: An example embodiment may involve obtaining an a×b pixel macro-cell from an input image. The a×b pixel macro-cell may contain 4 non-overlapping m×n pixel cells. The a×b pixels in the a×b pixel macro-cell may have respective color values and may be associated with respective object type tags. The example embodiment may also include selecting a compression technique to either (i) compress the a×b pixel macro-cell as a whole, or (ii) compress the a×b pixel macro-cell by compressing each of the 4 non-overlapping m×n pixel cells separately. The example embodiment may further include compressing the a×b pixel macro-cell according to the selected compression technique, and writing a representation of the compressed a×b pixel macro-cell to a computer-readable output medium.
    Type: Application
    Filed: January 30, 2015
    Publication date: August 4, 2016
    Inventors: Michael M. Chang, Kenneth A. Schmidt, Dongpei Su, Sheng Li, Kendrick Wong, Alfred Abkarian, Stephen L. Schafer
  • Publication number: 20160227075
    Abstract: An example embodiment may involve obtaining an a×b pixel macro-cell from an image with one or more color planes, and an a×b attribute macro-cell. The a×b pixel macro-cell may contain 4 non-overlapping m×n pixel cells, and the a×b attribute macro-cell may contain 4 non-overlapping m×n attribute cells. The pixels in the a×b pixel macro-cell may be associated with respective color values. The example embodiment may also involve determining 4 attribute output values associated respectively with the 4 non-overlapping m×n attribute cells. The example embodiment may further involve determining 1 to 4 color-plane output values for the non-overlapping m×n pixel cells, and writing an interleaved representation of the 4 attribute output values and the determined color-plane output values.
    Type: Application
    Filed: March 15, 2016
    Publication date: August 4, 2016
    Inventors: Michael M. Chang, Kenneth A. Schmidt, Dongpei Su, Sheng Li, Kendrick Wong, Larry Lubman, Alfred Abkarian, Stephen L. Schafer
  • Patent number: 9380304
    Abstract: An example embodiment may involve obtaining an a×b pixel macro-cell from an input image. Pixels in the a×b pixel macro-cell may have respective pixel values and may be associated with respective tags. It may be determined whether at least e of the respective tags indicate that their associated pixels represent edges in the input image. Based on this determination, either a first encoding or a second encoding of the a×b pixel macro-cell may be selected. The first encoding may weigh pixels that represent edges in the input image heavier than pixels that do not represent edges in the input image, and the second encoding might not consider whether pixels represent edges. The selected encoding may be performed and written to a computer-readable output medium.
    Type: Grant
    Filed: January 30, 2015
    Date of Patent: June 28, 2016
    Assignee: KYOCERA Document Solutions Inc.
    Inventors: Michael M. Chang, Kenneth A. Schmidt, Dongpei Su, Sheng Li, Kendrick Wong
  • Patent number: 9363416
    Abstract: An example embodiment may involve obtaining an m×n pixel cell from an input image. Each of the m×n pixels in the m×n pixel cell may be associated with at least one color value. An m×n attribute cell may be determined, elements of which may be associated in a one-to-one fashion with respective pixels in the m×n pixel cell. The m×n pixel cell may be compressed in a lossy fashion, and the m×n attribute cell may be compressed in a lossless fashion. Compression of the m×n pixel cell may be based on at least part of the m×n attribute cell. An interleaved representation of the compressed m×n pixel cell and the compressed m×n attribute cell may be written to an output medium.
    Type: Grant
    Filed: January 30, 2015
    Date of Patent: June 7, 2016
    Assignee: KYOCERA Document Solutions Inc.
    Inventors: Michael M. Chang, Kenneth A. Schmidt, Dongpei Su, Sheng Li, Kendrick Wong, Larry Lubman, Alfred Abkarian, Stephen L. Schafer
  • Patent number: 9363422
    Abstract: An example embodiment may involve obtaining (i) an a×b attribute macro-cell, and (ii) a×b pixel macro-cells for each of a luminance plane, a first color plane, and a second color plane of an input image. The a×b pixel macro-cells may each contain 4 non-overlapping m×n pixel cells. The example embodiment may also involve determining 4 attribute-plane output values that represent the 4 non-overlapping m×n attribute cells, 1 to 4 luminance-plane output values that represent the a×b pixel macro-cell of the luminance plane, a first color-plane output value to represent the a×b pixel macro-cell of the first color plane, and a second color-plane output value to represent the a×b pixel macro-cell of the second color plane. The example embodiment may further involve writing an interleaved representation of the output values to a computer-readable output medium.
    Type: Grant
    Filed: January 30, 2015
    Date of Patent: June 7, 2016
    Assignee: KYOCERA Document Solutions Inc.
    Inventors: Michael M. Chang, Kenneth A. Schmidt, Dongpei Su, Sheng Li, Kendrick Wong, Larry Lubman
  • Patent number: 9319565
    Abstract: An example embodiment may involve obtaining an a×b pixel macro-cell from an image with one or more color planes, and an a×b attribute macro-cell. The a×b pixel macro-cell may contain 4 non-overlapping m×n pixel cells, and the a×b attribute macro-cell may contain 4 non-overlapping m×n attribute cells. The pixels in the a×b pixel macro-cell may be associated with respective color values. The example embodiment may also involve determining 4 attribute output values associated respectively with the 4 non-overlapping m×n attribute cells. The example embodiment may further involve determining 1 to 4 color-plane output values for the non-overlapping m×n pixel cells, and writing an interleaved representation of the 4 attribute output values and the determined color-plane output values.
    Type: Grant
    Filed: January 30, 2015
    Date of Patent: April 19, 2016
    Assignee: KYOCERA Document Solutions Inc.
    Inventors: Michael M. Chang, Kenneth A. Schmidt, Dongpei Su, Sheng Li, Kendrick Wong, Larry Lubman, Alfred Abkarian, Stephen L. Schafer
  • Patent number: 8848250
    Abstract: Based on an m×n halftone matrix and an m×n pixel block of an image, an m×n halftone version of the m×n pixel block may be determined. An n-way interleave may be performed on rows of the m×n halftone version to create an mn×1 halftone segment. The mn×1 halftone segment may be compared to one or more halftone segments in a buffer. Based on the comparison, a literal code word and a representation of the mn×1 halftone segment may be output, and the representation of the mn×1 halftone segment may be written to the buffer. Alternatively, a repeat code word and a repeat value may be output, and at least one representation of the mn×1 halftone segment may be written to the buffer.
    Type: Grant
    Filed: October 23, 2012
    Date of Patent: September 30, 2014
    Assignee: KYOCERA Document Solutions Inc.
    Inventors: Dongpei Su, Kendrick Wong, Larry Lubman, Michael M. Chang, Stephen L. Schafer
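A minimal sketch of the halftone segment coding in 8848250 above, assuming a simple threshold halftone and a one-segment history: a segment identical to the previous one is emitted as a repeat code word with a count, and a new segment as a literal code word plus the segment itself. The real code words and the n-way interleave details differ from this illustration.

```python
def halftone_segment(block, matrix):
    # Threshold each pixel against the halftone matrix, then flatten the rows
    # of the halftoned block into a single mn x 1 segment.
    ht = [[1 if block[r][c] >= matrix[r][c] else 0
           for c in range(len(block[0]))] for r in range(len(block))]
    return tuple(v for row in ht for v in row)

def encode_blocks(blocks, matrix):
    out, prev, repeat = [], None, 0
    for block in blocks:
        seg = halftone_segment(block, matrix)
        if seg == prev:
            repeat += 1                      # same as buffered segment: count it
            continue
        if repeat:
            out.append(("repeat", repeat))   # flush pending repeat code word
            repeat = 0
        out.append(("literal", seg))         # literal code word + segment
        prev = seg                           # write segment to the buffer
    if repeat:
        out.append(("repeat", repeat))
    return out

matrix = [[64, 128], [192, 32]]
blocks = [[[70, 10], [200, 40]], [[70, 10], [200, 40]], [[0, 255], [0, 255]]]
print(encode_blocks(blocks, matrix))
```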
  • Patent number: 8805069
    Abstract: An m×n pixel cell may be obtained from an input image, each of the pixels having a respective color value. A characterization of the cell may be determined, including determining a lowest color value and a highest color value of the pixels in the cell. A difference between the highest color value and the lowest color value may be calculated. If the difference is less than or equal to a threshold difference, an output color value inclusively between the highest color value and the lowest color value may be selected, and a first representation of the output color value may be written to an output medium. If the difference is greater than the threshold difference, multiple output color values may be selected, and a second representation of the multiple output color values may be written to the output medium.
    Type: Grant
    Filed: June 12, 2012
    Date of Patent: August 12, 2014
    Assignee: KYOCERA Document Solutions Inc.
    Inventors: Guo Li, Kenneth A. Schmidt, Dongpei Su, Stephen L. Schafer, Alfred Abkarian, Sheng Li, Michael M. Chang
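A minimal sketch of the range test in 8805069 above, with a hypothetical threshold and using the midpoint and the extremes as stand-ins for the patent's selected output values: a flat cell collapses to a single color value, a busy cell keeps multiple values.

```python
def encode_cell(cell, threshold=16):
    flat = [v for row in cell for v in row]
    lo, hi = min(flat), max(flat)
    if hi - lo <= threshold:
        # First representation: a single output value between lo and hi.
        return ("single", (lo + hi) // 2)
    # Second representation: multiple output values (here just lo and hi).
    return ("multi", (lo, hi))

print(encode_cell([[100, 104], [102, 98]]))   # -> ('single', 101)
print(encode_cell([[10, 240], [30, 200]]))    # -> ('multi', (10, 240))
```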
  • Publication number: 20140111830
    Abstract: Based on an m×n halftone matrix and an m×n pixel block of an image, an m×n halftone version of the m×n pixel block may be determined. An n-way interleave may be performed on rows of the m×n halftone version to create an mn×1 halftone segment. The mn×1 halftone segment may be compared to one or more halftone segments in a buffer. Based on the comparison, a literal code word and a representation of the mn×1 halftone segment may be output, and the representation of the mn×1 halftone segment may be written to the buffer. Alternatively, a repeat code word and a repeat value may be output, and at least one representation of the mn×1 halftone segment may be written to the buffer.
    Type: Application
    Filed: October 23, 2012
    Publication date: April 24, 2014
    Applicant: KYOCERA DOCUMENT SOLUTIONS, INC.
    Inventors: Dongpei Su, Kendrick Wong, Larry Lubman, Michael M. Chang, Stephen L. Schafer
  • Publication number: 20130329237
    Abstract: An m×n pixel cell may be obtained from an input image, each of the pixels having a respective color value. A characterization of the cell may be determined, including determining a lowest color value and a highest color value of the pixels in the cell. A difference between the highest color value and the lowest color value may be calculated. If the difference is less than or equal to a threshold difference, an output color value inclusively between the highest color value and the lowest color value may be selected, and a first representation of the output color value may be written to an output medium. If the difference is greater than the threshold difference, multiple output color values may be selected, and a second representation of the multiple output color values may be written to the output medium.
    Type: Application
    Filed: June 12, 2012
    Publication date: December 12, 2013
    Applicant: KYOCERA DOCUMENT SOLUTIONS INC.
    Inventors: Guo Li, Kenneth A. Schmidt, Dongpei Su, Stephen L. Schafer, Alfred Abkarian, Sheng Li, Michael M. Chang
  • Patent number: 8199359
    Abstract: A system and method for trapping in electrophotographic color printing and related technologies for printing or display in which the final image is an overlay of multiple components subject to alignment errors. Trapping is based on the cyan (C), magenta (M), and black (K) planes. There are four steps as follows: detect object edges on each of the four color planes; detect coincident and opposing edge transitions on each pair of planes (CM, CK, and KM); determine which plane to trap, i.e., to extend object across edge; and generate trap on that plane using a single, simple trap generation rule.
    Type: Grant
    Filed: April 24, 2007
    Date of Patent: June 12, 2012
    Assignee: Kyocera Mita Corporation
    Inventors: Lakhbir Gandhi, Robert Westervelt, Kenneth A. Schmidt, Dongpei Su, Alfred Abkarian, Mark Raley
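A one-dimensional sketch of the trapping idea in 8199359 above, assuming binary cyan and black scanlines: where the two planes have coincident but opposing edge transitions, the object on the chosen plane is extended by one pixel across the edge so that small registration errors do not leave a white gap. The plane-selection and trap-generation rules here are simplified placeholders, not the patented rules.

```python
def edges(plane):
    # edges[i] = plane[i+1] - plane[i]: +1 for a rising edge, -1 for a falling edge.
    return [plane[i] - plane[i - 1] for i in range(1, len(plane))]

def trap(cyan, black):
    c_edges, k_edges = edges(cyan), edges(black)
    trapped = list(cyan)
    for i, (ce, ke) in enumerate(zip(c_edges, k_edges)):
        if ce and ke and ce == -ke:                 # coincident, opposing transitions
            trapped[i + 1 if ce < 0 else i] = 1     # extend the cyan object across the edge
    return trapped

cyan  = [1, 1, 1, 0, 0, 0]
black = [0, 0, 0, 1, 1, 1]
print(trap(cyan, black))   # -> [1, 1, 1, 1, 0, 0]
```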
  • Publication number: 20080007752
    Abstract: A system and method for trapping in electrophotographic color printing and related technologies for printing or display in which the final image is an overlay of multiple components subject to alignment errors. Trapping is based on the cyan (C), magenta (M), and black (K) planes. There are four steps as follows: detect object edges on each of the four color planes; detect coincident and opposing edge transitions on each pair of planes (CM, CK, and KM); determine which plane to trap, i.e., to extend object across edge; and generate trap on that plane using a single, simple trap generation rule.
    Type: Application
    Filed: April 24, 2007
    Publication date: January 10, 2008
    Inventors: Lakhbir Gandhi, Robert Westervelt, Kenneth Schmidt, Dongpei Su, Alfred Abkarian, Mark Raley
  • Patent number: 6919825
    Abstract: The embodiments of the invention include a system and method for losslessly encoding and compressing a data stream. The data stream may be an image, text, or a combination of the two. The data stream may be received from a computer application or peripheral device. The encoding compresses the data stream by comparing consecutive values of the data stream and encoding the data based on the difference between consecutive data values.
    Type: Grant
    Filed: September 25, 2003
    Date of Patent: July 19, 2005
    Assignee: Peerless Systems Corporation
    Inventors: Dongpei Su, Michele A. Lipman, Raymond B. Robinson, Matthew R. Lipman
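A minimal sketch of difference-based lossless coding in the spirit of 6919825 above, using a simple scheme of my own devising rather than the patent's actual code assignments: each byte is coded relative to the previous one, with a one-byte token for small differences and an escape plus a literal value otherwise.

```python
ESC = 0x80  # escape marker for differences that do not fit in 7 signed bits

def encode(data):
    out, prev = [], 0
    for v in data:
        diff = v - prev
        if -64 <= diff <= 63:
            out.append(diff & 0x7F)      # short code: 7-bit signed difference
        else:
            out.extend([ESC, v])         # escape + literal value
        prev = v
    return out

def decode(codes):
    out, prev, i = [], 0, 0
    while i < len(codes):
        if codes[i] == ESC:
            prev = codes[i + 1]          # literal value follows the escape
            i += 2
        else:
            diff = codes[i] - 128 if codes[i] >= 64 else codes[i]
            prev += diff                 # reconstruct from the previous value
            i += 1
        out.append(prev)
    return out

data = [10, 12, 11, 200, 201, 199]
assert decode(encode(data)) == data      # lossless round trip
print(encode(data))
```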