Patents by Inventor Wai-ho Yeung

Wai-ho Yeung has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20220322136
    Abstract: Described herein are systems, methods, and other techniques for compatible packet separation for communication networks. A block comprising a plurality of packets to be transmitted over a network is received. The block includes a set of batches, and the plurality of packets are distributed between the set of batches. A pseudo interleaver depth is calculated for each of the set of batches to produce a set of pseudo interleaver depths. Blockwise adaptive recoding is performed using the set of pseudo interleaver depths to produce a number of recoded packets for each of the set of batches. A transmission sequence is generated using the number of recoded packets for each of the set of batches.
    Type: Application
    Filed: April 1, 2021
    Publication date: October 6, 2022
    Inventors: Ho Fai Hoover Yin, Ka Hei Ng, Zhuowei Zhong, Raymond Wai Ho Yeung, Shenghao Yang
  • Patent number: 11452003
    Abstract: Described herein are systems, methods, and other techniques for compatible packet separation for communication networks. A block comprising a plurality of packets to be transmitted over a network is received. The block includes a set of batches, and the plurality of packets are distributed between the set of batches. A pseudo interleaver depth is calculated for each of the set of batches to produce a set of pseudo interleaver depths. Blockwise adaptive recoding is performed using the set of pseudo interleaver depths to produce a number of recoded packets for each of the set of batches. A transmission sequence is generated using the number of recoded packets for each of the set of batches.
    Type: Grant
    Filed: April 1, 2021
    Date of Patent: September 20, 2022
    Assignee: The Chinese University of Hong Kong
    Inventors: Ho Fai Hoover Yin, Ka Hei Ng, Zhuowei Zhong, Raymond Wai Ho Yeung, Shenghao Yang
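    The abstract above describes a workflow: compute a pseudo interleaver depth per batch, use blockwise adaptive recoding to decide how many recoded packets each batch gets, then interleave those packets into one transmission sequence. The sketch below is a toy illustration of that surrounding workflow only; the proportional allocation rule and round-robin interleaver here are illustrative assumptions, not the patent's actual depth calculation or recoding optimization.

    ```python
    from itertools import zip_longest

    def allocate_recoded_counts(batch_sizes, total_budget):
        """Toy stand-in for blockwise adaptive recoding: split the
        transmission budget across batches in proportion to batch size,
        handing out any rounding remainder one packet at a time."""
        total = sum(batch_sizes)
        counts = [total_budget * s // total for s in batch_sizes]
        for i in range(total_budget - sum(counts)):
            counts[i % len(counts)] += 1
        return counts

    def transmission_sequence(counts):
        """Interleave the batches' recoded packets round-robin so that
        packets of the same batch are spread apart -- the property a
        pseudo interleaver depth quantifies in the abstract."""
        recoded = [[(b, i) for i in range(n)]  # (batch id, recoded packet id)
                   for b, n in enumerate(counts)]
        seq = []
        for group in zip_longest(*recoded):
            seq.extend(p for p in group if p is not None)
        return seq
    ```

    With three batches of sizes 4, 4, and 8 and a budget of 8 transmissions, the allocation gives counts [2, 2, 4] and the sequence cycles through the batches rather than sending any one batch back-to-back.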
  • Patent number: 10237782
    Abstract: Hardware acceleration for batched sparse (BATS) codes is enabled. Hardware implementation of some timing-critical procedures can effectively offload computationally intensive overheads, for example, finite field arithmetic, Gaussian elimination, and belief propagation (BP) calculations, and this can be done without direct mapping of software codes to a hardware implementation. Suitable acceleration hardware may include pipelined multipliers configured to multiply input data with coefficients of a matrix associated with a random linear network code in a pipelined manner, addition components configured to add multiplier output to feedback data, and switches to direct data flows to and from memory components such that valid result data is not overwritten and such that feedback data corresponds to most recent valid result data. Acceleration hardware components (e.g., number and configuration) may be dynamically adjusted to modify BATS code parameters and adapt to changing network conditions.
    Type: Grant
    Filed: December 30, 2016
    Date of Patent: March 19, 2019
    Assignee: The Chinese University of Hong Kong
    Inventors: Shenghao Yang, Wai-ho Yeung, Tak-lon Chao, Kin-Hong Lee, Chi-lam Ho
  • Publication number: 20170195914
    Abstract: Hardware acceleration for batched sparse (BATS) codes is enabled. Hardware implementation of some timing-critical procedures can effectively offload computationally intensive overheads, for example, finite field arithmetic, Gaussian elimination, and belief propagation (BP) calculations, and this can be done without direct mapping of software codes to a hardware implementation. Suitable acceleration hardware may include pipelined multipliers configured to multiply input data with coefficients of a matrix associated with a random linear network code in a pipelined manner, addition components configured to add multiplier output to feedback data, and switches to direct data flows to and from memory components such that valid result data is not overwritten and such that feedback data corresponds to most recent valid result data. Acceleration hardware components (e.g., number and configuration) may be dynamically adjusted to modify BATS code parameters and adapt to changing network conditions.
    Type: Application
    Filed: December 30, 2016
    Publication date: July 6, 2017
    Inventors: Shenghao Yang, Wai-ho Yeung, Tak-lon Chao, Kin-Hong Lee, Chi-lam Ho
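    The timing-critical inner loop the abstract mentions is finite field multiply-accumulate: each output symbol is an accumulator XORed with a coefficient times an input symbol. The sketch below shows that arithmetic in software using GF(2^8) log/antilog tables; the field size and the primitive polynomial (x^8 + x^4 + x^3 + x^2 + 1, i.e. 0x11D) are assumptions for illustration, since the abstract does not fix them, and a pipelined hardware multiplier array would replace the table lookups.

    ```python
    # Build GF(2^8) exp/log tables from an assumed primitive polynomial.
    EXP = [0] * 512
    LOG = [0] * 256
    x = 1
    for i in range(255):
        EXP[i] = x
        LOG[x] = i
        x <<= 1
        if x & 0x100:          # reduce modulo the primitive polynomial
            x ^= 0x11D
    for i in range(255, 512):  # duplicate so gf_mul needs no modulo
        EXP[i] = EXP[i - 255]

    def gf_mul(a: int, b: int) -> int:
        """Multiply two GF(256) elements via the log/antilog tables."""
        if a == 0 or b == 0:
            return 0
        return EXP[LOG[a] + LOG[b]]

    def gf_mac(acc, coeff, packet):
        """Multiply-accumulate: acc += coeff * packet, symbol by symbol.
        This is the per-symbol operation a pipelined multiplier with a
        feedback adder would offload when recoding packets."""
        return [a ^ gf_mul(coeff, p) for a, p in zip(acc, packet)]
    ```

    Multiplying a packet by a row of coefficients and accumulating the results is exactly the matrix-vector product over the random linear network code's coefficient matrix that the claimed hardware pipelines.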
  • Patent number: 8693501
    Abstract: A method for data encoding and associated decoding is based on the concept of batches that allows transmission of a large data file from a source node to multiple destination nodes through communication networks that may employ network coding wherein sparse matrix codes are employed in a network setting. A batch is a set of packets generated from a subset of the input packets using a sparse matrix encoder. A sparse matrix encoder can be called repeatedly to generate multiple batches. The batches are generally independent of one another. During the transmission in a communication network, network coding can be applied to packets belonging to the same batch to improve the multicast throughput. A decoder recovers all or at least a fixed fraction of the input packets using received batches. The input packets can be pre-coded using a pre-code before applying sparse matrix codes. The data file can then be reconstructed by further decoding the pre-code.
    Type: Grant
    Filed: May 20, 2011
    Date of Patent: April 8, 2014
    Assignee: The Chinese University of Hong Kong
    Inventors: Shenghao Yang, Raymond Wai Ho Yeung
  • Publication number: 20120128009
    Abstract: A method for data encoding and associated decoding is based on the concept of batches that allows transmission of a large data file from a source node to multiple destination nodes through communication networks that may employ network coding wherein sparse matrix codes are employed in a network setting. A batch is a set of packets generated from a subset of the input packets using a sparse matrix encoder. A sparse matrix encoder can be called repeatedly to generate multiple batches. The batches are generally independent of one another. During the transmission in a communication network, network coding can be applied to packets belonging to the same batch to improve the multicast throughput. A decoder recovers all or at least a fixed fraction of the input packets using received batches. The input packets can be pre-coded using a pre-code before applying sparse matrix codes. The data file can then be reconstructed by further decoding the pre-code.
    Type: Application
    Filed: May 20, 2011
    Publication date: May 24, 2012
    Applicant: The Chinese University of Hong Kong
    Inventors: Shenghao Yang, Raymond Wai Ho Yeung
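    The batch-based encoding described in the two entries above can be sketched in a few lines: each encoder call picks a subset of input packets and emits a batch of random linear combinations of them, and intermediate nodes recode by combining packets within one batch. The sketch below works over GF(2) (plain XOR) with a fixed subset size for simplicity; the actual codes sample the subset size from a degree distribution, may use a larger field, and apply a pre-code, so every name and parameter here is an illustrative assumption.

    ```python
    import random

    def make_batch(input_packets, degree, batch_size, rng):
        """One sparse matrix encoder call: choose `degree` input packets,
        then emit `batch_size` random XOR combinations of them."""
        chosen = rng.sample(range(len(input_packets)), degree)
        batch = []
        for _ in range(batch_size):
            pkt = bytes(len(input_packets[0]))
            for idx in chosen:
                if rng.randint(0, 1):           # random GF(2) coefficient
                    pkt = bytes(a ^ b for a, b in zip(pkt, input_packets[idx]))
            batch.append(pkt)
        return chosen, batch

    def recode(batch, n_out, rng):
        """Network coding inside a single batch: each outgoing packet is
        a random XOR combination of the batch's packets. Batches stay
        independent because recoding never mixes two batches."""
        out = []
        for _ in range(n_out):
            pkt = bytes(len(batch[0]))
            for p in batch:
                if rng.randint(0, 1):
                    pkt = bytes(a ^ b for a, b in zip(pkt, p))
            out.append(pkt)
        return out
    ```

    Calling `make_batch` repeatedly yields the stream of mutually independent batches the abstract describes, and `recode` is what an intermediate network node would apply to boost multicast throughput without needing to decode.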