Patents by Inventor Jingren Zhou
Jingren Zhou has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20240004703
Abstract: A system for multi-modal multi-task processing includes a task representation component configured to determine a task representation element corresponding to a task representation framework that is used to define a content format for describing a to-be-processed task, the task representation element including an element used to define task description information, an element used to define task input information, and an element used to define task output information; and, based on the task representation element, acquire task description information, task input information, and task output information corresponding to each of to-be-processed tasks in different modalities; a data conversion component configured to determine an encoding sequence corresponding to each of the to-be-processed tasks; and a data processing component configured to process each of the to-be-processed tasks based on the encoding sequence corresponding to each of the to-be-processed tasks to obtain a task processing result corresponding to each of the to-be-processed tasks.
Type: Application
Filed: June 12, 2023
Publication date: January 4, 2024
Inventors: Chang ZHOU, Jinze BAI, Peng WANG, An YANG, Junyang LIN, Hongxia YANG, Jingren ZHOU
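The task representation framework this abstract describes can be illustrated with a minimal sketch. All class, field, and marker names below are hypothetical stand-ins, not terms from the patent; the sketch only shows the shape of the idea: one element each for description, input, and output, flattened into an encoding sequence.

```python
from dataclasses import dataclass

@dataclass
class TaskRepresentation:
    """Hypothetical task representation element with one field per
    element the abstract names: description, input, and output."""
    description: str   # what the task is, in natural language
    task_input: str    # the task's input, serialized as text (any modality)
    task_output: str   # the expected output format

def to_encoding_sequence(task: TaskRepresentation) -> str:
    """Flatten a task into a single sequence a unified model could
    consume (the marker tokens are an assumed convention)."""
    return f"[DESC] {task.description} [IN] {task.task_input} [OUT] {task.task_output}"

# Example: an image-captioning task expressed in the shared format.
caption_task = TaskRepresentation(
    description="describe the image",
    task_input="<image tokens>",
    task_output="a natural-language caption",
)
sequence = to_encoding_sequence(caption_task)
```

Because every task, whatever its modality, is reduced to the same three elements, a single data processing component can consume the resulting sequences uniformly.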
-
Publication number: 20230336327
Abstract: Embodiments of this specification disclose computer-implemented methods, apparatuses, systems, mediums, and program products related to batch encryption. In an example computer-implemented method, N first plaintexts are obtained. The N first plaintexts are spliced based on a first predetermined rule to obtain a first target plaintext. The first target plaintext is encrypted by using a predetermined encryption algorithm to obtain a first target ciphertext. N is a positive integer greater than or equal to 2.
Type: Application
Filed: April 11, 2023
Publication date: October 19, 2023
Applicant: Alipay (Hangzhou) Information Technology Co., Ltd.
Inventors: Yufei Lu, Chaofan Yu, Lei Wang, Jingren Zhou
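A minimal sketch of the batch-encryption idea: splice N plaintexts into one target plaintext, then encrypt once. The length-prefix framing is one assumed "predetermined rule" (the patent does not specify it), and the XOR keystream cipher is a deliberately insecure stand-in marking where a real "predetermined encryption algorithm" would plug in.

```python
import hashlib

def splice(plaintexts):
    """Splice N plaintexts into one target plaintext using
    length-prefix framing (an assumed 'predetermined rule')."""
    out = bytearray()
    for p in plaintexts:
        out += len(p).to_bytes(4, "big") + p
    return bytes(out)

def unsplice(blob):
    """Invert the splicing rule to recover the N plaintexts."""
    parts, i = [], 0
    while i < len(blob):
        n = int.from_bytes(blob[i:i + 4], "big")
        parts.append(blob[i + 4:i + 4 + n])
        i += 4 + n
    return parts

def toy_cipher(data, key):
    """Stand-in cipher: XOR with a SHA-256-derived keystream.
    NOT secure; XOR makes it its own inverse, which keeps the demo short."""
    stream, counter = b"", 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

msgs = [b"alpha", b"bravo", b"charlie"]          # N = 3, satisfying N >= 2
ciphertext = toy_cipher(splice(msgs), key=b"secret")   # one encryption call
recovered = unsplice(toy_cipher(ciphertext, key=b"secret"))
```

The point of splicing is amortization: one invocation of the (typically expensive) encryption algorithm covers all N plaintexts instead of N separate invocations.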
-
Publication number: 20230325716
Abstract: A pre-training service system is provided. The pre-training service system includes: a producer service module configured to provide a model producer with a model pre-training process for a pre-training dataset and generate a corresponding pre-training model; an optimizer service module configured to optimize the pre-training model according to a fine-tuning dataset provided by a model optimizer and obtain an optimized model; and a consumer service module configured to provide a model consumer with a service interface for the pre-training model or the optimized model, wherein the pre-training model or the optimized model is configured to perform inference on data provided by the model consumer and output a model prediction result.
Type: Application
Filed: December 23, 2022
Publication date: October 12, 2023
Inventors: Rui MEN, Chang ZHOU, Peng WANG, Yichang ZHANG, Junyang LIN, An YANG, Yong LI, Wei LIN, Ming DING, Xu ZOU, Zhengxiao DU, Jie TANG, Hongxia YANG, Jingren ZHOU
-
Patent number: 11714849
Abstract: Embodiments of this application provide an image generation system and method. In an exemplary manufacturing industry scenario, a style requirement of a product category is automatically captured according to user behavior data and product description information associated with the product category. Based on these data, a style description text may be generated and converted to product images by using a text prediction-based image generation model. The product images are further screened by using an image-text matching model to obtain a product image with high quality. This process spans style description text mining, text-to-image prediction, and image quality evaluation. It provides an automated product image generation capability for the manufacturing industry, shortens the cycle of designing and producing product images, and improves the production efficiency of product images.
Type: Grant
Filed: August 23, 2022
Date of Patent: August 1, 2023
Assignee: Alibaba Damo (Hangzhou) Technology Co., Ltd.
Inventors: Huiling Zhou, Jinbao Xue, Zhikang Li, Jie Liu, Shuai Bai, Chang Zhou, Hongxia Yang, Jingren Zhou
-
Publication number: 20230214385
Abstract: Runtime statistics from the actual performance of operations on a set of data are collected and utilized to dynamically modify the execution plan for processing a set of data. The operations performed are modified to include statistics collection operations, the statistics being tailored to the specific operations being quantified. Optimization policy defines how often optimization is attempted and how much more efficient an execution plan should be to justify transitioning from the current one. Optimization is based on the collected runtime statistics but also takes into account already materialized intermediate data to gain further optimization by avoiding reprocessing.
Type: Application
Filed: March 13, 2023
Publication date: July 6, 2023
Inventors: Nicolas BRUNO, Jingren ZHOU
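The re-optimization policy described in this abstract (and its related entries below) can be sketched as follows. The cost model, the improvement threshold, and all names here are illustrative assumptions; the sketch only shows the two ingredients the abstract names: re-costing with actual runtime statistics, and skipping work whose intermediate output is already materialized.

```python
def should_reoptimize(current_cost, candidate_cost, min_improvement=0.2):
    """Optimization policy: switch plans only when the candidate is at
    least `min_improvement` (a fraction) cheaper than the current plan."""
    return candidate_cost <= current_cost * (1.0 - min_improvement)

def replan_cost(operators, runtime_stats, materialized):
    """Re-cost the remaining operators using actual row counts collected
    at runtime, skipping operators whose output is already materialized
    (that work never needs to be reprocessed)."""
    remaining = [op for op in operators if op not in materialized]
    # Toy cost model: cost proportional to observed input rows.
    return sum(runtime_stats.get(op, 0) for op in remaining)

# Actual (not estimated) row counts gathered by injected statistics
# collection operations; the filter turned out far more selective
# than the original estimate assumed.
stats = {"scan": 1_000_000, "filter": 50, "join": 10}
done = {"scan"}                       # scan output already materialized
new_cost = replan_cost(["scan", "filter", "join"], stats, done)
switch = should_reoptimize(current_cost=200, candidate_cost=new_cost)
```

The threshold matters because switching plans has its own overhead; a marginally cheaper candidate does not justify the transition.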
-
Publication number: 20230145452
Abstract: A method and an apparatus for model training are provided. The method for model training includes: training a first model to obtain a parameter set of the trained first model, in which the first layers in the first model share the same weight parameters; copying the parameter set multiple times as the weight parameters of the second layers of a second model; and training the second model to convergence. The first model and the second model have the same computation graph, and the number of the second layers is equal to or greater than the number of the first layers.
Type: Application
Filed: October 19, 2022
Publication date: May 11, 2023
Inventors: Junyang LIN, An YANG, Rui MEN, Chang ZHOU, Hongxia YANG, Jingren ZHOU
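The initialization step this abstract describes, copying one shared parameter set into every layer of a deeper second model, can be sketched minimally. Representing a layer's weights as a plain list is a stand-in for real tensors, and the trained values are placeholders; only the copy-per-layer structure is the point.

```python
import copy

def train_first_model():
    """Stand-in for training the first model: all of its layers share a
    single weight parameter set, so training yields just that one set."""
    shared_params = [0.1, -0.2, 0.3]   # placeholder trained weights
    return shared_params

def init_second_model(shared_params, num_second_layers):
    """Initialize the second model by copying the trained shared
    parameter set once per layer. Deep copies keep the layers
    independent so they can diverge during the second training phase."""
    return [copy.deepcopy(shared_params) for _ in range(num_second_layers)]

params = train_first_model()                       # first model: shared weights
second = init_second_model(params, num_second_layers=6)  # 6 >= number of first layers
```

Because both models have the same computation graph, the copied weights are a valid (and well-conditioned) starting point, which is what lets the larger second model converge faster than training from scratch.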
-
Publication number: 20230133683
Abstract: An interactive feature generation system may receive a plurality of distinct features that are associated with an application, and associate a plurality of nodes in a feature graph of a first order to the plurality of distinct features. The interactive feature generation system may iteratively generate interactive features of a higher order from interactive features of a lower order to form a plurality of feature graphs of different orders. The interactive feature generation system may then propagate respective interactive features of the plurality of feature graphs of the different orders to a neural network to determine a number of interactive features of one or more orders, the determined number of interactive features of the one or more orders being used for training a predictive model to make inferences for the application.
Type: Application
Filed: July 14, 2020
Publication date: May 4, 2023
Inventors: Yuexiang XIE, Zhen Wang, Bolin Ding, Yaliang Li, Jun Huang, Weidan Kong, Jingren Zhou, Wei Lin
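The iterative step, generating order-(k+1) interactive features from order-k ones, can be sketched as a simple crossing rule. The naming convention (joining feature names with "x") and the crossing rule itself are illustrative assumptions; the patent's graph representation and neural selection step are omitted.

```python
def next_order(lower_order_features, base_features):
    """Generate order-(k+1) interactive features by crossing each
    order-k feature with each base (first-order) feature, skipping
    crossings that would repeat a base feature."""
    higher = set()
    for f in lower_order_features:
        components = f.split("x")
        for b in base_features:
            if b not in components:
                # Sort components so "age x city" and "city x age"
                # collapse to one canonical feature name.
                higher.add("x".join(sorted(components + [b])))
    return sorted(higher)

base = ["age", "city", "device"]          # first-order feature graph nodes
order2 = next_order(base, base)           # all pairwise interactions
order3 = next_order(order2, base)         # third-order interactions
```

In the full system a neural network then scores features across these per-order graphs to decide how many (and which) interactions are worth keeping for the downstream predictive model.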
-
Patent number: 11620286
Abstract: Runtime statistics from the actual performance of operations on a set of data are collected and utilized to dynamically modify the execution plan for processing a set of data. The operations performed are modified to include statistics collection operations, the statistics being tailored to the specific operations being quantified. Optimization policy defines how often optimization is attempted and how much more efficient an execution plan should be to justify transitioning from the current one. Optimization is based on the collected runtime statistics but also takes into account already materialized intermediate data to gain further optimization by avoiding reprocessing.
Type: Grant
Filed: May 27, 2021
Date of Patent: April 4, 2023
Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
Inventors: Nicolas Bruno, Jingren Zhou
-
Publication number: 20230068103
Abstract: Embodiments of this application provide an image generation system and method. In an exemplary manufacturing industry scenario, a style requirement of a product category is automatically captured according to user behavior data and product description information associated with the product category. Based on these data, a style description text may be generated and converted to product images by using a text prediction-based image generation model. The product images are further screened by using an image-text matching model to obtain a product image with high quality. This process spans style description text mining, text-to-image prediction, and image quality evaluation. It provides an automated product image generation capability for the manufacturing industry, shortens the cycle of designing and producing product images, and improves the production efficiency of product images.
Type: Application
Filed: August 23, 2022
Publication date: March 2, 2023
Inventors: Huiling ZHOU, Jinbao XUE, Zhikang LI, Jie LIU, Shuai BAI, Chang ZHOU, Hongxia YANG, Jingren ZHOU
-
Patent number: 11271981
Abstract: A low-latency cloud-scale computation environment includes a query language, optimization, scheduling, fault tolerance, and fault recovery. An event model can be used to extend a declarative query language so that temporal analysis of events in an event stream can be performed. Extractors and outputters can be used to define and implement functions that extend the capabilities of the event-based query language. A script written in the extended query language can be translated into an optimal parallel continuous execution plan. Execution of the plan can be orchestrated by a streaming job manager which schedules vertices on available computing machines. The streaming job manager can monitor overall job execution. Fault tolerance can be provided by tracking execution progress and data dependencies in each vertex. In the event of a failure, another instance of the failed vertex can be scheduled. An optimal recovery point can be determined based on checkpoints and data dependencies.
Type: Grant
Filed: January 16, 2019
Date of Patent: March 8, 2022
Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
Inventors: Jingren Zhou, Zhengping Qian, Peter Zabback, Wei Lin
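The recovery-point selection at the end of this abstract can be sketched in its simplest form: restart a failed vertex from the latest checkpoint at or before the failure, so only the progress after that checkpoint is replayed. This is a simplified sketch that ignores cross-vertex data dependencies, which the full system also consults; offsets and names are illustrative.

```python
def optimal_recovery_point(checkpoints, failed_at):
    """Pick the latest checkpoint taken at or before the failure point.
    Anything after it must be replayed; anything before it is safe.
    Falls back to 0 (replay from the start) if no checkpoint qualifies."""
    eligible = [c for c in checkpoints if c <= failed_at]
    return max(eligible) if eligible else 0

# A vertex checkpointed its progress at these event-stream offsets,
# then failed while processing offset 1700.
recover_from = optimal_recovery_point([0, 500, 1200, 1500], failed_at=1700)
```

In the full system, the streaming job manager schedules a fresh instance of the failed vertex and uses tracked data dependencies to ensure its upstream inputs from `recover_from` onward are still (or again) available.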
-
Publication number: 20210357752
Abstract: A method, an apparatus, a storage medium, and a processor for model processing are disclosed. The method includes: obtaining an original language model; determining a task that needs to be processed by the original language model; and converting the original language model based on features of the task to obtain a target language model for processing the task. The present disclosure addresses the technical problem of the difficulty of using a model effectively.
Type: Application
Filed: May 6, 2021
Publication date: November 18, 2021
Inventors: Daoyuan Chen, Yaliang Li, Minghui Qiu, Zhen Wang, Bofang Li, Bolin Ding, Hongbo Deng, Jun Huang, Wei Lin, Jingren Zhou
-
Publication number: 20210286811
Abstract: Runtime statistics from the actual performance of operations on a set of data are collected and utilized to dynamically modify the execution plan for processing a set of data. The operations performed are modified to include statistics collection operations, the statistics being tailored to the specific operations being quantified. Optimization policy defines how often optimization is attempted and how much more efficient an execution plan should be to justify transitioning from the current one. Optimization is based on the collected runtime statistics but also takes into account already materialized intermediate data to gain further optimization by avoiding reprocessing.
Type: Application
Filed: May 27, 2021
Publication date: September 16, 2021
Inventors: Nicolas BRUNO, Jingren ZHOU
-
Patent number: 11055283
Abstract: Runtime statistics from the actual performance of operations on a set of data are collected and utilized to dynamically modify the execution plan for processing a set of data. The operations performed are modified to include statistics collection operations, the statistics being tailored to the specific operations being quantified. Optimization policy defines how often optimization is attempted and how much more efficient an execution plan should be to justify transitioning from the current one. Optimization is based on the collected runtime statistics but also takes into account already materialized intermediate data to gain further optimization by avoiding reprocessing.
Type: Grant
Filed: August 23, 2017
Date of Patent: July 6, 2021
Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
Inventors: Nicolas Bruno, Jingren Zhou
-
Patent number: 10749984
Abstract: Processing a job request for multiple versions of a distributed computing service. The service processing node does this by at least interleavingly (e.g., via time sharing with rapid context switching, or by actually concurrently) running a first runtime library associated with a first service version of the distributed computerized service and a second runtime library associated with a different service version of the distributed computerized service. While running the first runtime library, job requests of a first service version may be at least partially processed using a first set of one or more executables that interact with the first runtime library. While running the second runtime library, job requests of a second service version may be at least partially processed using a second set of one or more executables that interact with the second runtime library.
Type: Grant
Filed: February 19, 2019
Date of Patent: August 18, 2020
Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
Inventors: Zhicheng Yin, Xiaoyu Chen, Tao Guan, Paul Michael Brett, Nan Zhang, Jaliya N. Ekanayake, Eric Boutin, Anna Korsun, Jingren Zhou, Haibo Lin, Pavel N. Iakovenko
-
Patent number: 10565208
Abstract: Embodiments of the present invention allow multiple data streams to be analyzed as a single data set. The single data set may be described as a stream set herein. The multiple streams that are included in the stream set may be specified through a user script or query. For example, a query may be used to gather all streams created within a date range. The query could include one or more filters to gather certain information from the data streams or to exclude certain data streams that otherwise are in the query's range. A stream may be an unstructured byte stream of data. The stream may be created by append-only writing to the end of the stream. The stream could also be a structured stream that includes metadata that defines column structure and affinity/clustering information.
Type: Grant
Filed: June 11, 2013
Date of Patent: February 18, 2020
Assignee: Microsoft Technology Licensing, LLC
Inventors: Edward John Triou, Jr., Fei Xu, Hiren Patel, Jingren Zhou
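The stream-set idea in this abstract, a query gathering streams by date range with exclusion filters, then reading them as one logical data set, can be sketched as follows. The catalog layout, field names, and concatenation-in-creation-order reading rule are assumptions for illustration, not the patented query language.

```python
from datetime import date

def gather_streams(catalog, start, end, exclude=()):
    """Select all streams created within [start, end], minus any
    excluded names: the sketch of a stream-set query with filters."""
    chosen = [s for s in catalog
              if start <= s["created"] <= end and s["name"] not in exclude]
    return sorted(chosen, key=lambda s: s["created"])

def read_stream_set(streams):
    """Read the stream set as a single byte sequence. Since each stream
    is append-only, concatenating in creation order is well defined."""
    return b"".join(s["data"] for s in streams)

# A toy catalog of unstructured, append-only byte streams.
catalog = [
    {"name": "log-01", "created": date(2024, 1, 1), "data": b"aa"},
    {"name": "log-02", "created": date(2024, 1, 2), "data": b"bb"},
    {"name": "tmp-03", "created": date(2024, 1, 3), "data": b"cc"},
]
streams = gather_streams(catalog, date(2024, 1, 1), date(2024, 1, 31),
                         exclude=("tmp-03",))
combined = read_stream_set(streams)
```

Structured streams would carry extra metadata (column structure, affinity/clustering), letting the same query mechanism project columns instead of raw bytes.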
-
Patent number: 10530892
Abstract: Processing received job requests for a multi-versioned distributed computerized service. For each received job request, the job request is channeled to an appropriate service processing node that depends on the version of the distributed computing service that is to handle the job request. A version of the distributed computing service is assigned to the incoming job request. A service processing node that runs a runtime library for the assigned service version is then identified. The identified service processing node also has an appropriate set of one or more executables that allows the service processing node to play an appropriate role (e.g., compiler, scheduler, worker) in the distributed computing service. The job request is then dispatched to the identified service processing node.
Type: Grant
Filed: June 29, 2016
Date of Patent: January 7, 2020
Assignee: Microsoft Technology Licensing, LLC
Inventors: Zhicheng Yin, Xiaoyu Chen, Tao Guan, Paul Michael Brett, Nan Zhang, Jaliya N. Ekanayake, Eric Boutin, Anna Korsun, Jingren Zhou, Haibo Lin, Pavel N. Iakovenko
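The dispatch flow in this abstract (assign a version, find a node running that version's runtime library with the right role executables, dispatch) can be sketched minimally. The request/node dictionary shapes, role names, and default-version rule are hypothetical; only the routing logic mirrors the abstract.

```python
def assign_version(job_request, default_version="v2"):
    """Assign a service version to an incoming job request: an explicit
    request wins, otherwise a configured default applies."""
    return job_request.get("version", default_version)

def dispatch(job_request, nodes):
    """Channel the request to a node that runs the runtime library for
    the assigned version AND has the executables for the needed role."""
    version = assign_version(job_request)
    role = job_request.get("role", "worker")
    for node in nodes:
        if node["version"] == version and role in node["roles"]:
            return node["name"]
    raise LookupError(f"no node serves version {version} in role {role}")

nodes = [
    {"name": "n1", "version": "v1", "roles": {"compiler", "worker"}},
    {"name": "n2", "version": "v2", "roles": {"scheduler", "worker"}},
]
target = dispatch({"version": "v2", "role": "scheduler"}, nodes)
```

Routing on both version and role is what lets several service versions coexist in one cluster: a v1 job never lands on a node that only carries the v2 runtime library.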
-
Publication number: 20190387073
Abstract: Processing a job request for multiple versions of a distributed computing service. The service processing node does this by at least interleavingly (e.g., via time sharing with rapid context switching, or by actually concurrently) running a first runtime library associated with a first service version of the distributed computerized service and a second runtime library associated with a different service version of the distributed computerized service. While running the first runtime library, job requests of a first service version may be at least partially processed using a first set of one or more executables that interact with the first runtime library. While running the second runtime library, job requests of a second service version may be at least partially processed using a second set of one or more executables that interact with the second runtime library.
Type: Application
Filed: February 19, 2019
Publication date: December 19, 2019
Inventors: Zhicheng YIN, Xiaoyu CHEN, Tao GUAN, Paul Michael BRETT, Nan ZHANG, Jaliya N. EKANAYAKE, Eric BOUTIN, Anna KORSUN, Jingren ZHOU, Haibo LIN, Pavel N. IAKOVENKO
-
Publication number: 20190166173
Abstract: A low-latency cloud-scale computation environment includes a query language, optimization, scheduling, fault tolerance, and fault recovery. An event model can be used to extend a declarative query language so that temporal analysis of events in an event stream can be performed. Extractors and outputters can be used to define and implement functions that extend the capabilities of the event-based query language. A script written in the extended query language can be translated into an optimal parallel continuous execution plan. Execution of the plan can be orchestrated by a streaming job manager which schedules vertices on available computing machines. The streaming job manager can monitor overall job execution. Fault tolerance can be provided by tracking execution progress and data dependencies in each vertex. In the event of a failure, another instance of the failed vertex can be scheduled. An optimal recovery point can be determined based on checkpoints and data dependencies.
Type: Application
Filed: January 16, 2019
Publication date: May 30, 2019
Inventors: Jingren Zhou, Zhengping Qian, Peter Zabback, Wei Lin
-
Patent number: 10225302
Abstract: A low-latency cloud-scale computation environment includes a query language, optimization, scheduling, fault tolerance, and fault recovery. An event model can be used to extend a declarative query language so that temporal analysis of events in an event stream can be performed. Extractors and outputters can be used to define and implement functions that extend the capabilities of the event-based query language. A script written in the extended query language can be translated into an optimal parallel continuous execution plan. Execution of the plan can be orchestrated by a streaming job manager which schedules vertices on available computing machines. The streaming job manager can monitor overall job execution. Fault tolerance can be provided by tracking execution progress and data dependencies in each vertex. In the event of a failure, another instance of the failed vertex can be scheduled. An optimal recovery point can be determined based on checkpoints and data dependencies.
Type: Grant
Filed: April 7, 2017
Date of Patent: March 5, 2019
Assignee: Microsoft Technology Licensing, LLC
Inventors: Jingren Zhou, Zhengping Qian, Peter Zabback, Wei Lin
-
Patent number: 10212255
Abstract: Processing a job request for multiple versions of a distributed computing service. The service processing node does this by at least interleavingly (e.g., via time sharing with rapid context switching, or by actually concurrently) running a first runtime library associated with a first service version of the distributed computerized service and a second runtime library associated with a different service version of the distributed computerized service. While running the first runtime library, job requests of a first service version may be at least partially processed using a first set of one or more executables that interact with the first runtime library. While running the second runtime library, job requests of a second service version may be at least partially processed using a second set of one or more executables that interact with the second runtime library.
Type: Grant
Filed: June 29, 2016
Date of Patent: February 19, 2019
Assignee: Microsoft Technology Licensing, LLC
Inventors: Zhicheng Yin, Xiaoyu Chen, Tao Guan, Paul Michael Brett, Nan Zhang, Jaliya N. Ekanayake, Eric Boutin, Anna Korsun, Jingren Zhou, Haibo Lin, Pavel N. Iakovenko