Patents by Inventor Mowei WANG

Mowei WANG has filed for patents to protect the following inventions. This listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO). Brief illustrative sketches of the methods described in the abstracts appear after the listing.

  • Patent number: 12105774
    Abstract: A method for generating traffic demand data of a data center network includes: acquiring traffic demand samples each including a source address, a destination address, a flow interval, and a flow size; acquiring a first interval number by performing equal-frequency binning discretization processing according to the flow interval and acquiring a second interval number by performing equal-frequency binning discretization processing according to the flow size; determining a traffic demand subset according to the source address and the destination address, and acquiring a first parameter matrix and a second parameter matrix by training a latent Dirichlet allocation probability topic model according to the traffic demand subset; and generating the traffic demand data according to the first interval number, the second interval number, the first parameter matrix, and the second parameter matrix.
    Type: Grant
    Filed: April 28, 2023
    Date of Patent: October 1, 2024
    Assignee: TSINGHUA UNIVERSITY
    Inventors: Yong Cui, Zhiwen Liu, Mowei Wang
  • Publication number: 20240163229
    Abstract: An information transmission method and a related device applied to a cloud service system are provided. The cloud service system includes an arbiter, a first switch, and a first server group. The arbiter is connected to the first server group by using the first switch. The method includes: a first server obtains a first identifier of the first server in the first server group. The first server determines, in a first slot, a first time domain location corresponding to the first identifier. The first time domain location is a time domain location at which the first switch receives a first request to be processed by the arbiter, and different identifiers correspond to different time domain locations in the first slot. The first server sends the first request to the first switch based on the first time domain location.
    Type: Application
    Filed: December 21, 2023
    Publication date: May 16, 2024
    Inventors: Yong Cui, Mowei Wang, Cong Liang, Yashe Liu, Ru Liang, Yong Jiang
  • Publication number: 20230421468
    Abstract: A method for generating traffic demand data of a data center network includes: acquiring traffic demand samples each including a source address, a destination address, a flow interval, and a flow size; acquiring a first interval number by performing equal-frequency binning discretization processing according to the flow interval and acquiring a second interval number by performing equal-frequency binning discretization processing according to the flow size; determining a traffic demand subset according to the source address and the destination address, and acquiring a first parameter matrix and a second parameter matrix by training a latent Dirichlet allocation probability topic model according to the traffic demand subset; and generating the traffic demand data according to the first interval number, the second interval number, the first parameter matrix, and the second parameter matrix.
    Type: Application
    Filed: April 28, 2023
    Publication date: December 28, 2023
    Inventors: Yong CUI, Zhiwen LIU, Mowei WANG
  • Patent number: 11818023
    Abstract: A traffic-aware switch-shared cache scheduling method includes: S1, setting a cache threshold of each outgoing port of a switch according to a traffic state of each outgoing port of the switch; S2, monitoring each outgoing port of the switch to determine whether an event of packet entry queue, packet exit queue, packet loss, buffer overflow or port queue state change occurs; S3, determining a traffic state of the outgoing port according to the event that occurs at the outgoing port and corresponding port queue state information; S4, setting a port control state according to the traffic state of the outgoing port; and S5, adjusting the cache threshold corresponding to the outgoing port according to the port control state, and performing S2 to continue monitoring until the switch stops working.
    Type: Grant
    Filed: April 25, 2022
    Date of Patent: November 14, 2023
    Assignee: TSINGHUA UNIVERSITY
    Inventors: Yong Cui, Sijiang Huang, Mowei Wang
  • Publication number: 20220345388
    Abstract: A traffic-aware switch-shared cache scheduling method includes: S1, setting a cache threshold of each outgoing port of a switch according to a traffic state of each outgoing port of the switch; S2, monitoring each outgoing port of the switch to determine whether an event of packet entry queue, packet exit queue, packet loss, buffer overflow or port queue state change occurs; S3, determining a traffic state of the outgoing port according to the event that occurs at the outgoing port and corresponding port queue state information; S4, setting a port control state according to the traffic state of the outgoing port; and S5, adjusting the cache threshold corresponding to the outgoing port according to the port control state, and performing S2 to continue monitoring until the switch stops working.
    Type: Application
    Filed: April 25, 2022
    Publication date: October 27, 2022
    Inventors: Yong CUI, Sijiang HUANG, Mowei WANG
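
The traffic-generation method of patent 12105774 (and its application 20230421468) can be illustrated with a minimal sketch: flow inter-arrival times and flow sizes are discretized by equal-frequency binning, each (source, destination) pair is treated as a document of discretized flows, and a latent Dirichlet allocation model supplies the two parameter matrices from which new demands are sampled. The bin counts, topic count, synthetic input data, and the use of scikit-learn's LatentDirichletAllocation are assumptions for illustration, not the patented implementation.

```python
# Illustrative sketch only; parameters and data are invented, not from the patent.
import numpy as np
import pandas as pd
from sklearn.decomposition import LatentDirichletAllocation

N_BINS, N_TOPICS = 10, 5
rng = np.random.default_rng(0)

# Traffic demand samples: source address, destination address, flow interval, flow size.
samples = pd.DataFrame({
    "src": rng.integers(0, 8, 5000),
    "dst": rng.integers(0, 8, 5000),
    "interval": rng.exponential(1.0, 5000),
    "size": rng.lognormal(8, 2, 5000),
})

# Equal-frequency binning discretization -> first and second interval numbers.
samples["interval_bin"] = pd.qcut(samples["interval"], N_BINS, labels=False, duplicates="drop")
samples["size_bin"] = pd.qcut(samples["size"], N_BINS, labels=False, duplicates="drop")

# Each (src, dst) pair forms a traffic demand subset ("document"); each
# (interval_bin, size_bin) combination is a discrete token ("word").
samples["word"] = samples["interval_bin"] * N_BINS + samples["size_bin"]
vocab_size = N_BINS * N_BINS
docs = samples.groupby(["src", "dst"])["word"]
counts = np.zeros((docs.ngroups, vocab_size), dtype=int)
for i, (_, words) in enumerate(docs):
    np.add.at(counts[i], words.to_numpy(), 1)

# Training the LDA topic model yields the two parameter matrices of the abstract:
# a document-topic matrix and a topic-word matrix.
lda = LatentDirichletAllocation(n_components=N_TOPICS, random_state=0).fit(counts)
doc_topic = lda.transform(counts)                                           # first parameter matrix
topic_word = lda.components_ / lda.components_.sum(axis=1, keepdims=True)   # second parameter matrix

def generate(doc_idx, n_flows=100):
    """Generate synthetic (interval_bin, size_bin) demands for one (src, dst) pair."""
    topics = rng.choice(N_TOPICS, size=n_flows, p=doc_topic[doc_idx])
    words = np.array([rng.choice(vocab_size, p=topic_word[t]) for t in topics])
    return np.stack([words // N_BINS, words % N_BINS], axis=1)

print(generate(0)[:5])
```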
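Publication 20240163229's abstract maps each server's identifier to a distinct time-domain location inside a slot so that requests reach the arbiter through the first switch without colliding. A rough sketch of that mapping follows; the slot duration, group size, and the linear identifier-to-offset rule are assumptions, since the abstract does not specify them.

```python
# Illustrative sketch only; slot length, group size, and the id->offset rule are assumed.
from dataclasses import dataclass

SLOT_US = 100.0      # assumed slot duration in microseconds
GROUP_SIZE = 8       # assumed number of servers in the first server group

@dataclass
class Request:
    server_id: int
    send_time_us: float  # send time within the current slot

def time_domain_location(server_id: int, slot_start_us: float) -> float:
    """Distinct offset per identifier: location_i = slot_start + i * (slot / group_size)."""
    offset = (server_id % GROUP_SIZE) * (SLOT_US / GROUP_SIZE)
    return slot_start_us + offset

def schedule_requests(slot_start_us: float) -> list[Request]:
    """Each server in the group sends its request at its own time-domain location."""
    return [Request(i, time_domain_location(i, slot_start_us)) for i in range(GROUP_SIZE)]

if __name__ == "__main__":
    for req in schedule_requests(slot_start_us=0.0):
        print(f"server {req.server_id} -> sends at t={req.send_time_us:.1f} us")
```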
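Patent 11818023 (and application 20220345388) describes a feedback loop S1 to S5 over a switch's shared cache: per-port thresholds are set, queue events are monitored, a traffic state and then a control state are derived, and the threshold is adjusted before monitoring resumes. The sketch below models that loop; the event set, the two control states, and the doubling/halving adjustment are illustrative assumptions rather than the claimed scheme.

```python
# Illustrative state-machine sketch of steps S1-S5; event names, control states,
# and adjustment factors are assumptions, not the patented scheme.
from enum import Enum, auto

class Event(Enum):
    ENQUEUE = auto()
    DEQUEUE = auto()
    DROP = auto()
    OVERFLOW = auto()
    QUEUE_STATE_CHANGE = auto()

class ControlState(Enum):
    ABSORB = auto()    # grant a congested port more of the shared cache
    EVACUATE = auto()  # shrink the allocation of a draining or idle port

class Port:
    def __init__(self, shared_buffer: int, alpha: float = 1.0):
        self.queue_len = 0
        self.threshold = int(alpha * shared_buffer)   # S1: initial cache threshold
        self.state = ControlState.EVACUATE

    def on_event(self, event: Event, shared_free: int) -> None:
        # S2/S3: infer the traffic state from the event and queue occupancy.
        if event in (Event.DROP, Event.OVERFLOW) or self.queue_len > self.threshold:
            traffic_bursty = True
        elif event is Event.DEQUEUE and self.queue_len == 0:
            traffic_bursty = False
        else:
            return  # no change in traffic state for this event

        # S4: set the port control state from the traffic state.
        self.state = ControlState.ABSORB if traffic_bursty else ControlState.EVACUATE

        # S5: adjust the cache threshold, bounded by the remaining shared cache.
        if self.state is ControlState.ABSORB:
            self.threshold = min(self.threshold * 2, self.threshold + shared_free)
        else:
            self.threshold = max(self.threshold // 2, 1)

if __name__ == "__main__":
    port = Port(shared_buffer=1024)
    port.queue_len = 2000
    port.on_event(Event.DROP, shared_free=512)
    print(port.state, port.threshold)
```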