Patents by Inventor Binbin WU
Binbin WU has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20240345866
Abstract: Provided in the embodiments of the present application are a hardware performance acquisition method, a device, a system, and a storage medium. A virtual machine is additionally provided with a performance service component, which can be used as an interaction interface, and the physical machine where the virtual machine is located is additionally provided with a performance proxy component. The virtual machine can acquire a hardware performance data request by means of the performance service component and provide the hardware performance data request to the performance proxy component by means of a performance server; the performance proxy component can acquire, according to the hardware performance data request, associated data that reflects the hardware performance of the physical machine; and the performance server can process the associated data according to a set data processing mode, so as to obtain the corresponding hardware performance parameters.
Type: Application
Filed: October 25, 2022
Publication date: October 17, 2024
Inventors: Binbin WU, Jinkui REN, Renjiang MIAO, Bang DI
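For illustration only, the following Python sketch mirrors the interaction the abstract describes: a guest-side performance service forwards a request to a host-side performance proxy and reduces the returned data according to a set processing mode. All function names, metrics, and sample values are assumptions, not taken from the filing.

# Hypothetical guest/host interaction; names and sample data are invented.
from statistics import mean

def performance_proxy(request: dict) -> list[float]:
    """Host-side proxy: collect raw samples for the requested hardware metric."""
    # Placeholder readings; a real proxy would query host performance counters.
    samples = {"cpu_util": [0.42, 0.47, 0.45], "mem_bw": [12.1, 11.8, 12.4]}
    return samples.get(request["metric"], [])

def performance_service(metric: str, mode: str = "avg") -> float:
    """Guest-side service: issue the request and apply the set data processing mode."""
    raw = performance_proxy({"metric": metric})
    if not raw:
        raise ValueError(f"no data for metric {metric!r}")
    reducers = {"avg": mean, "max": max, "min": min}
    return reducers[mode](raw)

print(performance_service("cpu_util", mode="max"))  # e.g. 0.47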
-
Patent number: 12090504
Abstract: The present disclosure provides a device for controlling the shape of an aerosol particle condensation growth flow field through an electromagnetic field. The device includes an aerosol growth device and a power supply. The aerosol growth device includes a porous medium, magnetic rubber, and an electromagnet group. The magnetic rubber is sleeved in an inner cavity of the electromagnet group, and the porous medium is sleeved in an inner cavity of the magnetic rubber. The magnetic rubber clings to the porous medium, and the power supply is connected with the electromagnet group. The present disclosure also provides a method for controlling the shape of the aerosol particle condensation growth flow field through the electromagnetic field.
Type: Grant
Filed: August 23, 2021
Date of Patent: September 17, 2024
Assignee: CHINA JILIANG UNIVERSITY
Inventors: Mingzhou Yu, Chenyang Liu, Yueyan Liu, Qianyu Zhang, Binbin Zhu, Taiquan Wu, Yanlong Cao, Yitao Zhang
-
Publication number: 20240231885
Abstract: A method and system for optimizing live migration of a virtual machine (VM) from a source server to a destination server where hardware accelerator virtualization is used. Hardware accelerator performance data is obtained while executing a workload on a virtual function at the source server. It is determined whether to transfer the workload from the source server to the destination server based on the hardware accelerator performance data. The workload is transferred from the source server to the destination server based on the determination. The hardware accelerator performance data may include an amount of output data the workload generates and an amount of input data to the workload. The hardware accelerator may be a graphics processing unit (GPU), and the workload may be a GPU workload.
Type: Application
Filed: March 27, 2024
Publication date: July 11, 2024
Inventors: Zhi WANG, Binbin WU, Guang ZENG
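As a hedged illustration of the decision step the abstract describes, the Python sketch below compares a GPU workload's input and output data rates against a migration-link budget before transferring it; the threshold, field names, and data are invented for the example and are not the claimed method.

# Hypothetical migration decision based on accelerator I/O rates.
from dataclasses import dataclass

@dataclass
class AcceleratorStats:
    input_bytes_per_s: float   # data fed into the GPU workload
    output_bytes_per_s: float  # data the GPU workload generates

def should_migrate(stats: AcceleratorStats, link_bytes_per_s: float,
                   headroom: float = 0.8) -> bool:
    """Migrate only if the workload's I/O traffic fits within the migration link budget."""
    demand = stats.input_bytes_per_s + stats.output_bytes_per_s
    return demand <= headroom * link_bytes_per_s

stats = AcceleratorStats(input_bytes_per_s=2e8, output_bytes_per_s=1e8)
print(should_migrate(stats, link_bytes_per_s=1e9))  # True: 3e8 <= 8e8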
-
Publication number: 20240037059
Abstract: A virtual acceleration device is deployed for a physical machine. The physical machine and the virtual acceleration device are interconnected through a high-speed serial bus. A serial port device for the physical machine can be virtualized and implemented on the virtual acceleration device, and the physical machine can send and receive serial port data through the virtual serial port device. The physical machine only needs to transmit data to the virtual serial port device through the high-speed serial bus, and subsequent transmission actions are completed by the virtual acceleration device. Because of the transmission speed of the high-speed serial bus, the transmission rate of serial port data of the physical machine can be greatly improved, which helps improve the CPU utilization of the physical machine and ensures the stability of its service performance.
Type: Application
Filed: October 5, 2023
Publication date: February 1, 2024
Inventors: Binbin WU, Xiantao Zhang, Junkang Fu, Gan WEN, Jinkui REN
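The offload idea can be pictured with the rough Python sketch below, in which the host only enqueues serial data and a worker standing in for the virtual acceleration device completes the transmission; the queue-based design and all names are assumptions for illustration, not the application's design.

# Hypothetical offload path: host enqueues bytes, the "device" drains them.
import queue
import threading

bus_queue: "queue.Queue[bytes]" = queue.Queue()  # stands in for the high-speed serial bus

def acceleration_device_worker() -> None:
    """Pretend offload device: performs the real (slow) serial transmission."""
    while True:
        data = bus_queue.get()
        if data == b"":          # sentinel to stop the worker in this demo
            break
        # ... forward `data` out of the emulated serial port here ...
        bus_queue.task_done()

worker = threading.Thread(target=acceleration_device_worker, daemon=True)
worker.start()

def host_write_serial(data: bytes) -> None:
    """Host side: hand the bytes to the virtual serial port device and return immediately."""
    bus_queue.put(data)

host_write_serial(b"hello over the virtual serial port\n")
bus_queue.put(b"")   # stop the worker
worker.join()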
-
Publication number: 20240037060
Abstract: A virtualization acceleration device is deployed for a physical machine, and a virtualized peripheral controller for the physical machine is implemented on the virtualization acceleration device, so that the physical machine may call the virtualized peripheral controller to execute an operation related to an external device. In this way, a VNC server is deployed on the virtualization acceleration device instead of on the physical machine; the virtualization acceleration device cooperates with a remote control device, and the virtualized peripheral controller remotely controls the physical machine, thereby reducing the resources of the physical machine consumed in the remote interaction process, which in turn improves the performance of the physical machine.
Type: Application
Filed: October 5, 2023
Publication date: February 1, 2024
Inventors: Jinkui REN, Xiantao Zhang, Binbin WU, Gan WEN, Junkang Fu
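Purely as an illustration of the offloaded remote-control path, the Python sketch below translates a VNC-style input event into calls on a stand-in virtualized peripheral controller; the event format and controller API are invented and do not reflect the actual implementation.

# Hypothetical translation of remote input events into peripheral-controller calls.
from typing import Callable

class VirtualPeripheralController:
    """Stand-in for the virtualized keyboard/mouse controller exposed to the host."""
    def inject_key(self, keycode: int) -> None:
        print(f"inject key {keycode} into physical machine")

    def inject_pointer(self, x: int, y: int, buttons: int) -> None:
        print(f"move pointer to ({x}, {y}) buttons={buttons}")

def handle_remote_event(event: dict, ctrl: VirtualPeripheralController) -> None:
    """Dispatch one remote-control event to the appropriate controller call."""
    handlers: dict[str, Callable[[], None]] = {
        "key": lambda: ctrl.inject_key(event["keycode"]),
        "pointer": lambda: ctrl.inject_pointer(event["x"], event["y"], event["buttons"]),
    }
    handlers[event["type"]]()

ctrl = VirtualPeripheralController()
handle_remote_event({"type": "key", "keycode": 28}, ctrl)
handle_remote_event({"type": "pointer", "x": 640, "y": 360, "buttons": 1}, ctrl)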
-
Patent number: 11409644
Abstract: Methods, systems, and apparatus, including computer programs encoded on computer-readable storage media, for validation of mobile device workflows. In some implementations, a mobile device application to be tested is identified. An installation of the application is initiated on each of a plurality of remote mobile devices, including mobile devices having different hardware configurations and different operating system configurations. Usage of the application is simulated by instructing the remote mobile devices to perform a series of operations using the application. Performance of the respective mobile devices during the simulated usage is measured. A document indicating performance of the application across the different mobile device configurations is generated.
Type: Grant
Filed: March 11, 2020
Date of Patent: August 9, 2022
Assignee: MicroStrategy Incorporated
Inventors: Herminio Carames, Andrew Smith, Binbin Wu, Ying Ma, Jun Peng, David Hagen
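A hedged Python sketch of such a validation run is shown below: the same series of operations is driven on several device configurations, per-device timings are recorded, and a simple report is emitted. The device list, operations, and timing source are placeholders rather than the patented system.

# Hypothetical orchestration of simulated usage across device configurations.
import json
import random
import time

devices = [
    {"name": "phone-a", "os": "Android 13", "ram_gb": 6},
    {"name": "phone-b", "os": "iOS 17", "ram_gb": 4},
]
operations = ["login", "open_dashboard", "run_report"]

def simulate_usage(device: dict) -> dict:
    """Pretend to run each operation on the remote device and measure its duration."""
    timings = {}
    for op in operations:
        start = time.perf_counter()
        time.sleep(random.uniform(0.01, 0.05))   # stands in for the real remote call
        timings[op] = round(time.perf_counter() - start, 3)
    return {"device": device["name"], "os": device["os"], "timings_s": timings}

report = [simulate_usage(d) for d in devices]
print(json.dumps(report, indent=2))   # document comparing performance across configurations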
-
Patent number: 11364401
Abstract: The present disclosure relates to an automatic control type hot smoke testing system which includes a fire source system, a smoke generation system and a control system. The fire source system is used for generating a fire source and includes a first tank and several liquid fuel atomizing jet burners, wherein the first tank includes an air tank for providing air and a fuel control tank for controlling valves, distributing fuel and inspecting the flame of the burners. The smoke generation system is used for generating smoke and includes a second tank, a smoke outlet pipe, smoke cake clamps and smoke cake turntables. Smoke cakes are initially placed in the smoke cake clamps, then moved one by one to an ignition position for ignition, and finally rotated by a servo motor to through-hole positions of a smoke generation box, through which they fall into the box. The control system is used for controlling the fire source system and the smoke generation system.
Type: Grant
Filed: August 9, 2021
Date of Patent: June 21, 2022
Assignee: China Academy of Safety Science and Technology
Inventors: Congling Shi, Honglei Che, Jian Li, Xingkai Zhang, Fei Ren, Xiaodong Qian, Jiehong Shi, Li He, Chen Zhao, Xuan Xu, Binbin Wu
-
Patent number: 11256595
Abstract: A predictive storage management system includes a storage system having storage devices, and a predictive storage management device coupled to the storage system via a network. The predictive storage management device includes a statistical time-series storage device usage sub-engine that retrieves first storage device usage data from a first storage device in the storage system and uses it to generate a first storage device usage trend model. A machine-learning storage system usage sub-engine in the predictive storage management device retrieves storage system implementation information from the storage system and uses it to generate a storage system implementation model. A storage management sub-engine in the predictive storage management device analyzes the first storage device usage trend model and the storage system implementation model to predict future usage of the first storage device and, based on that predicted future usage, performs a management action associated with the first storage device.
Type: Grant
Filed: July 11, 2019
Date of Patent: February 22, 2022
Assignee: Dell Products L.P.
Inventors: Muzhar S. Khokhar, Binbin Wu
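The trend-and-act idea can be illustrated with the simplified Python sketch below, which fits a linear trend to a device's historical usage, predicts usage at a future day, and triggers a management action if the prediction crosses a capacity threshold; the data, threshold, and linear model are assumptions for illustration, not Dell's implementation.

# Hypothetical usage-trend model and threshold-based management action.
import numpy as np

usage_pct = np.array([52.0, 55.5, 58.0, 61.2, 64.9, 68.1])   # daily used-capacity samples
days = np.arange(len(usage_pct))

slope, intercept = np.polyfit(days, usage_pct, deg=1)          # statistical trend model

def predict_usage(day: int) -> float:
    """Predicted used capacity (percent) on a given day index."""
    return slope * day + intercept

horizon = 7
future_day = len(usage_pct) - 1 + horizon
predicted = predict_usage(future_day)
if predicted >= 90.0:
    print(f"predicted {predicted:.1f}% in {horizon} days: schedule capacity expansion")
else:
    print(f"predicted {predicted:.1f}% in {horizon} days: no action needed")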
-
Publication number: 20220040517
Abstract: The present disclosure relates to an automatic control type hot smoke testing system which includes a fire source system, a smoke generation system and a control system. The fire source system is used for generating a fire source and includes a first tank and several liquid fuel atomizing jet burners, wherein the first tank includes an air tank for providing air and a fuel control tank for controlling valves, distributing fuel and inspecting the flame of the burners. The smoke generation system is used for generating smoke and includes a second tank, a smoke outlet pipe, smoke cake clamps and smoke cake turntables. Smoke cakes are initially placed in the smoke cake clamps, then moved one by one to an ignition position for ignition, and finally rotated by a servo motor to through-hole positions of a smoke generation box, through which they fall into the box. The control system is used for controlling the fire source system and the smoke generation system.
Type: Application
Filed: August 9, 2021
Publication date: February 10, 2022
Inventors: Congling Shi, Honglei Che, Jian Li, Xingkai Zhang, Fei Ren, Xiaodong Qian, Jiehong Shi, Li He, Chen Zhao, Xuan Xu, Binbin Wu
-
Patent number: 11185826
Abstract: The present disclosure provides a liquid membrane conveying apparatus for preparing a porous membrane, which includes a transmission unit and a carrier unit. The carrier unit conveys a liquid membrane into a gelling solution by the entrainment of the transmission unit. The carrier unit includes a first carrier and a second carrier, which respectively contact opposite edges of the liquid membrane along the conveying direction of the liquid membrane. The consistency of the pores on the two surfaces of the porous membrane is improved by using the liquid membrane conveying apparatus.
Type: Grant
Filed: July 6, 2018
Date of Patent: November 30, 2021
Assignee: MICROVAST POWER SYSTEMS CO., LTD.
Inventors: Xiang Li, Heji Huang, Wei Li, Binbin Wu
-
Publication number: 20210011830
Abstract: A predictive storage management system includes a storage system having storage devices, and a predictive storage management device coupled to the storage system via a network. The predictive storage management device includes a statistical time-series storage device usage sub-engine that retrieves first storage device usage data from a first storage device in the storage system and uses it to generate a first storage device usage trend model. A machine-learning storage system usage sub-engine in the predictive storage management device retrieves storage system implementation information from the storage system and uses it to generate a storage system implementation model. A storage management sub-engine in the predictive storage management device analyzes the first storage device usage trend model and the storage system implementation model to predict future usage of the first storage device and, based on that predicted future usage, performs a management action associated with the first storage device.
Type: Application
Filed: July 11, 2019
Publication date: January 14, 2021
Inventors: Muzhar S. Khokhar, Binbin Wu
-
Publication number: 20200293436
Abstract: Methods, systems, and apparatus, including computer programs encoded on computer-readable storage media, for validation of mobile device workflows. In some implementations, a mobile device application to be tested is identified. An installation of the application is initiated on each of a plurality of remote mobile devices, including mobile devices having different hardware configurations and different operating system configurations. Usage of the application is simulated by instructing the remote mobile devices to perform a series of operations using the application. Performance of the respective mobile devices during the simulated usage is measured. A document indicating performance of the application across the different mobile device configurations is generated.
Type: Application
Filed: March 11, 2020
Publication date: September 17, 2020
Inventors: Herminio Carames, Andrew Smith, Binbin Wu, Ying Ma, Jun Peng, David Hagen
-
Patent number: 10637672
Abstract: A power over Ethernet apparatus is provided. The power over Ethernet apparatus includes a power sourcing equipment (PSE) controller, a main chip and a redundant controller. The PSE controller is responsible for power allocation among the ports of the power over Ethernet apparatus. The main chip is coupled to the PSE controller through a first branch and controls the PSE controller. The redundant controller is coupled with the main chip, and is coupled to the PSE controller through a second branch to control the PSE controller when the main chip stops sending a heart-beat signal. Methods of power over Ethernet are also provided.
Type: Grant
Filed: October 24, 2016
Date of Patent: April 28, 2020
Assignee: Zhejiang Uniview Technologies Co., Ltd
Inventors: Lingliang Sun, Binbin Wu
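The failover rule can be sketched in Python as follows: a redundant controller watches the main chip's heart-beat signal and takes control of the PSE controller once the signal has been absent for longer than a timeout. The timing values and class design are assumptions for illustration only.

# Hypothetical heart-beat watchdog for the redundant controller.
import time

class RedundantController:
    def __init__(self, timeout_s: float = 3.0):
        self.timeout_s = timeout_s
        self.last_heartbeat = time.monotonic()
        self.in_control = False

    def on_heartbeat(self) -> None:
        """Called whenever the main chip sends its heart-beat signal."""
        self.last_heartbeat = time.monotonic()
        self.in_control = False                 # main chip is alive, it keeps control

    def poll(self) -> str:
        """Periodic check: decide which branch drives the PSE controller."""
        if time.monotonic() - self.last_heartbeat > self.timeout_s:
            self.in_control = True
        return "redundant controller" if self.in_control else "main chip"

ctrl = RedundantController(timeout_s=0.1)
ctrl.on_heartbeat()
print(ctrl.poll())        # "main chip"
time.sleep(0.2)           # heart-beat missed
print(ctrl.poll())        # "redundant controller"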
-
Patent number: 10606798
Abstract: A method for configuring an address table in a field-programmable gate array (FPGA), an FPGA, and a network device applying the FPGA. The FPGA includes k storage blocks, where k is greater than or equal to two. The FPGA is configured to obtain a key, where the key is generated based on a first packet of a data stream and the length of the key is equal to the key bit width of the FPGA; obtain an index number corresponding to the key, where the index number is used to search for a forwarding entry of the data stream; divide the key into k sub-keys, where each of the k sub-keys corresponds to one of the k storage blocks; determine an address entry of each of the k sub-keys in the corresponding storage block; and write a storage address to the address entry based on the index number.
Type: Grant
Filed: December 13, 2018
Date of Patent: March 31, 2020
Assignee: HUAWEI TECHNOLOGIES CO., LTD.
Inventors: Kejian You, Weibo Xiong, Chao Kong, Binbin Wu
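For readability, the configuration steps listed in the abstract can be sketched in Python as below: the key is split into k equal-width sub-keys, each sub-key addresses an entry in its own storage block, and the forwarding-entry index number is written at that address. The block sizing and direct-indexing scheme are simplifying assumptions, not the claimed hardware design.

# Hypothetical software model of the k-block address table configuration.
K = 4                     # number of storage blocks
KEY_BITS = 32             # key bit width of the FPGA in this example
SUB_BITS = KEY_BITS // K  # width of each sub-key
BLOCK_DEPTH = 1 << SUB_BITS

# k storage blocks, each mapping an address entry to a stored index number
blocks = [dict() for _ in range(K)]

def split_key(key: int) -> list[int]:
    """Divide the key into k sub-keys, most-significant sub-key first."""
    mask = BLOCK_DEPTH - 1
    return [(key >> (SUB_BITS * (K - 1 - i))) & mask for i in range(K)]

def write_address_table(key: int, index_number: int) -> None:
    """Write the storage address (index number) to each sub-key's address entry."""
    for block, sub_key in zip(blocks, split_key(key)):
        block[sub_key] = index_number

key = 0xC0A80001             # e.g. a key derived from the first packet of a data stream
write_address_table(key, index_number=7)
print(split_key(key))        # [192, 168, 0, 1]: the four 8-bit sub-keys for this key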
-
Publication number: 20190179797
Abstract: A method for configuring an address table in a field-programmable gate array (FPGA), an FPGA, and a network device applying the FPGA. The FPGA includes k storage blocks, where k is greater than or equal to two. The FPGA is configured to obtain a key, where the key is generated based on a first packet of a data stream and the length of the key is equal to the key bit width of the FPGA; obtain an index number corresponding to the key, where the index number is used to search for a forwarding entry of the data stream; divide the key into k sub-keys, where each of the k sub-keys corresponds to one of the k storage blocks; determine an address entry of each of the k sub-keys in the corresponding storage block; and write a storage address to the address entry based on the index number.
Type: Application
Filed: December 13, 2018
Publication date: June 13, 2019
Inventors: Kejian You, Weibo Xiong, Chao Kong, Binbin Wu
-
Publication number: 20190009222
Abstract: The present disclosure provides a liquid membrane conveying apparatus for preparing a porous membrane, which includes a transmission unit and a carrier unit. The carrier unit conveys a liquid membrane into a gelling solution by the entrainment of the transmission unit. The carrier unit includes a first carrier and a second carrier, which respectively contact opposite edges of the liquid membrane along the conveying direction of the liquid membrane. The consistency of the pores on the two surfaces of the porous membrane is improved by using the liquid membrane conveying apparatus.
Type: Application
Filed: July 6, 2018
Publication date: January 10, 2019
Inventors: XIANG LI, HEJI HUANG, WEI LI, BINBIN WU
-
Patent number: 10129151
Abstract: A traffic management (TM) implementation method and apparatus, and a network device. The TM implementation apparatus is located alongside a processor or a switching fabric chip. It receives a packet management request sent by the processor or the switching fabric chip, where the packet management request includes a queue identifier and the queue identifier is used to identify a flow queue in which the processor or the switching fabric chip stores a data packet; performs traffic management on the packet management request and generates a packet management response according to the management result, where the packet management response includes a management indication and the queue identifier; and sends the packet management response to the processor or the switching fabric chip such that the processor or the switching fabric chip processes, according to the management indication, the data packet in the flow queue corresponding to the queue identifier.
Type: Grant
Filed: September 7, 2016
Date of Patent: November 13, 2018
Assignee: HUAWEI TECHNOLOGIES CO., LTD.
Inventors: Weibo Xiong, Binbin Wu, Minghui Wang, Guichao Huo
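The request/response exchange can be pictured with the hedged Python sketch below: the TM apparatus receives a packet management request carrying a queue identifier, applies a traffic policy, and returns a management indication together with the same queue identifier. The token-bucket policy here is only one possible management rule, not necessarily the one claimed.

# Hypothetical TM apparatus applying a token-bucket rule per request.
import time

class TrafficManager:
    def __init__(self, rate_pps: float, burst: int):
        self.rate_pps = rate_pps
        self.tokens = float(burst)
        self.burst = burst
        self.last = time.monotonic()

    def handle_request(self, request: dict) -> dict:
        """Process one packet management request and build the response."""
        now = time.monotonic()
        self.tokens = min(self.burst, self.tokens + (now - self.last) * self.rate_pps)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            indication = "forward"
        else:
            indication = "drop"
        return {"queue_id": request["queue_id"], "management_indication": indication}

tm = TrafficManager(rate_pps=2.0, burst=1)
print(tm.handle_request({"queue_id": 5}))   # {'queue_id': 5, 'management_indication': 'forward'}
print(tm.handle_request({"queue_id": 5}))   # likely 'drop' until tokens refill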
-
Publication number: 20170118032
Abstract: A power over Ethernet apparatus is provided. The power over Ethernet apparatus includes a power sourcing equipment (PSE) controller, a main chip and a redundant controller. The PSE controller is responsible for power allocation among the ports of the power over Ethernet apparatus. The main chip is coupled to the PSE controller through a first branch and controls the PSE controller. The redundant controller is coupled with the main chip, and is coupled to the PSE controller through a second branch to control the PSE controller when the main chip stops sending a heart-beat signal. Methods of power over Ethernet are also provided.
Type: Application
Filed: October 24, 2016
Publication date: April 27, 2017
Inventors: Lingliang Sun, Binbin Wu
-
Publication number: 20160380895
Abstract: A traffic management (TM) implementation method and apparatus, and a network device. The TM implementation apparatus is located alongside a processor or a switching fabric chip. It receives a packet management request sent by the processor or the switching fabric chip, where the packet management request includes a queue identifier and the queue identifier is used to identify a flow queue in which the processor or the switching fabric chip stores a data packet; performs traffic management on the packet management request and generates a packet management response according to the management result, where the packet management response includes a management indication and the queue identifier; and sends the packet management response to the processor or the switching fabric chip such that the processor or the switching fabric chip processes, according to the management indication, the data packet in the flow queue corresponding to the queue identifier.
Type: Application
Filed: September 7, 2016
Publication date: December 29, 2016
Inventors: Weibo Xiong, Binbin Wu, Minghui Wang, Guichao Huo
-
Publication number: 20140149335
Abstract: A system and method for determining at least one new treatment plan for at least one new patient, comprising: providing at least one representation of the at least one new patient's at least one organ at risk relative to at least one target; searching for at least one prior treatment plan for at least one prior patient with at least one similar representation; and reviewing the at least one prior treatment plan for the at least one prior patient in order to determine whether the at least one new treatment plan can be improved based on information in the at least one prior treatment plan.
Type: Application
Filed: January 31, 2014
Publication date: May 29, 2014
Applicant: The Johns Hopkins University
Inventors: Todd R. MCNUTT, Russell H. TAYLOR, Michael KAZHDAN, Binbin WU, Patricio SIMARI
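As a minimal illustration of the retrieval idea, the Python sketch below represents each prior patient by a feature vector describing an organ at risk relative to the target and fetches the most similar prior plan for review; the feature representation, distance measure, and data are invented for the example, and the application's actual representation is not reproduced here.

# Hypothetical nearest-neighbor lookup over prior patient representations.
import math

prior_patients = {
    "case_01": {"features": [0.12, 0.45, 0.80], "plan": "plan_A"},
    "case_02": {"features": [0.30, 0.10, 0.95], "plan": "plan_B"},
    "case_03": {"features": [0.11, 0.50, 0.78], "plan": "plan_C"},
}

def distance(a: list[float], b: list[float]) -> float:
    """Euclidean distance between two organ-at-risk-vs-target feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def most_similar_prior(new_features: list[float]) -> tuple[str, str]:
    """Return (prior case id, prior plan) with the closest representation."""
    case_id = min(prior_patients,
                  key=lambda c: distance(prior_patients[c]["features"], new_features))
    return case_id, prior_patients[case_id]["plan"]

print(most_similar_prior([0.10, 0.48, 0.79]))   # ('case_03', 'plan_C')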