Patents by Inventor Zheng Hu
Zheng Hu has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 12262563
Abstract: A pixel cell is formed on a semiconductor substrate having a front surface. The pixel cell includes a photodiode, a floating diffusion region, and a transfer gate. The photodiode is disposed in the semiconductor substrate. The floating diffusion region includes a first doped region disposed in the semiconductor substrate, wherein the first doped region extends from the front surface to a first junction depth in the semiconductor substrate. The transfer gate is configured to selectively couple the photodiode to the floating diffusion region, controlling charge transfer between the photodiode and the floating diffusion region. The transfer gate includes a planar gate disposed on the front surface of the semiconductor substrate and a pair of vertical gate electrodes. Each vertical gate electrode extends a gate depth from the planar gate into the semiconductor substrate. The first junction depth is greater than the gate depth.
Type: Grant
Filed: March 22, 2022
Date of Patent: March 25, 2025
Assignee: OmniVision Technologies, Inc.
Inventors: Shiyu Sun, Yuanwei Zheng, Gang Chen, Sing-Chung Hu, Armin Yazdani
-
Publication number: 20250094138
Abstract: Systems, methods, and other embodiments associated with automated fine-tuning of software code generation by large language models are described herein. In one embodiment, a method accesses a collection of software code samples that intermix sample code and human language description. The method generates prompts to an LLM to write code that performs as described by the human language description of the sample code. The method fine-tunes a large language model to generate software code based on a code generation loss function that evaluates code generated by the LLM in response to the prompts. The method generates an evaluation score for performance of the tuned large language model as a code generator based on the code generation loss for second generated code. And, the method automatically signals that fine-tuning of the tuned large language model is complete in response to the evaluation score satisfying a threshold.
Type: Application
Filed: June 14, 2024
Publication date: March 20, 2025
Inventors: Yazhe HU, Mengqing GUO, Zheng WANG, Tao SHENG, Jun QIAN, Vinod MAMTANI
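The gating step this abstract describes, signaling completion once an evaluation score derived from the code-generation loss satisfies a threshold, can be sketched as follows. This is an illustrative stand-in, not the patented method; `evaluation_score`, its inverse-loss mapping, and the threshold value are all assumptions for the sketch.

```python
def evaluation_score(losses):
    """Map per-sample code-generation losses to one score (lower loss -> higher score)."""
    mean_loss = sum(losses) / len(losses)
    return 1.0 / (1.0 + mean_loss)

def fine_tuning_complete(losses, threshold=0.8):
    """Signal that fine-tuning of the tuned LLM is complete when the score satisfies the threshold."""
    return evaluation_score(losses) >= threshold
```

A low mean loss over the second generated code yields a score above the threshold, which is the automatic completion signal.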
-
Publication number: 20250094716
Abstract: Techniques for language model (LM) summarization using semantical clustering are provided. In one technique, a plurality of concepts reflected in text data is identified. A plurality of concept clusters is generated based on similarity among the plurality of concepts. Thus, some concept clusters may include multiple concepts. For each concept cluster of the plurality of concept clusters, an LM generates a summary of the text corresponding to that concept cluster. A summary response of the text data is generated by aggregating the summary of each concept cluster of the plurality of concept clusters. In another technique, an LM generates a summary based on text data. A first set of concepts reflected in the summary is identified and a second set of concepts reflected in the text data is identified. A difference between the two sets may indicate that the summary is missing one or more concepts.
Type: Application
Filed: May 7, 2024
Publication date: March 20, 2025
Inventors: Zheng Wang, Yazhe Hu, Mengqing Guo, Tao Sheng, Jun Qian, Vinod M. Mamtani
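The cluster-then-summarize flow in the first technique can be sketched in a few lines. Here `embed`, the greedy threshold clustering, and `summarize` are toy stand-ins (the real system would use an embedding model and an LM call); only the overall shape, cluster concepts by similarity, summarize per cluster, aggregate, follows the abstract.

```python
def embed(concept):
    # toy 1-D "embedding": the code point of the concept's first letter
    return ord(concept[0].lower())

def cluster_concepts(concepts, tol=3):
    """Greedily group concepts whose embeddings lie within `tol` of the cluster seed."""
    clusters = []
    for c in sorted(concepts, key=embed):
        if clusters and abs(embed(c) - embed(clusters[-1][0])) <= tol:
            clusters[-1].append(c)
        else:
            clusters.append([c])
    return clusters

def summarize(cluster):
    # stand-in for the LM summary of the text belonging to one concept cluster
    return "summary(" + ", ".join(cluster) + ")"

def summary_response(concepts):
    """Aggregate per-cluster summaries into one summary response."""
    return " ".join(summarize(cl) for cl in cluster_concepts(concepts))
```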
-
Publication number: 20250094704
Abstract: Systems, methods, and other embodiments associated with automated fine-tuning of text summarization for large language models are described herein. In one embodiment, a method accesses a collection of text samples. The text samples include a body of text and an example summary. The method fine-tunes a large language model (LLM) based on a loss function that compares the example summary and a generated summary generated by the LLM. The example and generated summaries are compared at sentence, paragraph, and/or article levels. The method generates an evaluation score for performance of the tuned LLM as a text summarizer based on a further comparison of a reference summary and a summary generated by the tuned LLM. The method then automatically determines to deploy the tuned LLM to a text summarization task in response to the evaluation score satisfying a threshold.
Type: Application
Filed: April 5, 2024
Publication date: March 20, 2025
Inventors: Yazhe HU, Mengqing GUO, Zheng WANG, Tao SHENG, Jun QIAN, Vinod MAMTANI
-
Publication number: 20250094687
Abstract: Techniques for generating repetition-free text using a large language model (LLM) are provided. In one technique, textual content that was generated by an LLM is accessed, where the textual content comprises a plurality of sub-components including a first sub-component and a second sub-component. A first embedding that represents the first sub-component is generated and a second embedding that represents the second sub-component is generated. Based on a similarity between the first embedding and the second embedding, it is determined whether the second sub-component is repetitious with respect to the first sub-component. In response to determining that the second sub-component is repetitious with respect to the first sub-component, at least a portion of the second sub-component is removed from the textual content.
Type: Application
Filed: June 28, 2024
Publication date: March 20, 2025
Inventors: Zheng Wang, Yazhe Hu, Mengqing Guo, Tao Sheng, Jun Qian, Vinod Murli Mamtani
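The embed-compare-remove loop this abstract describes can be sketched with a bag-of-words embedding and cosine similarity. The embedding, the 0.8 threshold, and dropping the whole repeated sub-component (rather than a portion) are simplifying assumptions for illustration, not the patent's implementation.

```python
import math
from collections import Counter

def embed(text):
    # toy embedding: bag-of-words counts (a real system would use a model)
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def drop_repetitions(subcomponents, threshold=0.8):
    """Keep each sub-component only if it is not repetitious w.r.t. any kept one."""
    kept = []
    for s in subcomponents:
        if all(cosine(embed(s), embed(k)) < threshold for k in kept):
            kept.append(s)
    return kept
```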
-
Publication number: 20250094814
Abstract: Techniques are provided for fine-tuning large language models (LLMs) to reduce the instability of LLM outputs to prompts. In one technique, a plurality of prompts is stored. For each prompt of the plurality of prompts, a plurality of variants of that prompt is generated. A prompt generating LLM is fine-tuned based on that prompt and the plurality of variants. Each variant-prompt association (where the variant is generated based on the prompt and has an identical or similar meaning) is a training sample that is used to train or fine-tune the prompt generating LLM. The prompt generating LLM is configured to generate standardized prompts based on input prompts. In another technique, a response generating LLM is fine-tuned based on sets of training samples, each training sample in a set comprising a different variant of a prompt and a response that the response generating LLM generated based on the prompt.
Type: Application
Filed: September 4, 2024
Publication date: March 20, 2025
Inventors: Zheng Wang, Yazhe Hu, Mengqing Guo, Tao Sheng, Jun Qian, Vinod M Mamtani
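Building the variant-prompt training samples described in the first technique can be sketched as below. `make_variants` is a toy paraphraser standing in for an LLM-generated set of meaning-preserving variants; the pairing of each variant back to its source prompt is the part the abstract actually specifies.

```python
def make_variants(prompt):
    # trivial surface rewrites with the same meaning (illustrative only)
    return [prompt.lower(), prompt.upper(), prompt + " please"]

def training_samples(prompts):
    """Pair every generated variant with the prompt it was generated from,
    so the prompt-generating LLM learns to map variants to a standardized prompt."""
    return [(variant, p) for p in prompts for variant in make_variants(p)]
```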
-
Publication number: 20250094686
Abstract: Techniques for modifying a narrative point of view for content generated by a machine-learned model, such as a large language model (LLM), are provided. In one technique, a first textual content that was generated by an LLM is accessed. A narrative point of view (NPOV) detection operation is performed on a first portion of the first textual content to identify a first NPOV corresponding to the first portion of the first textual content. Based on an output, of the NPOV detection operation, that indicates that the first NPOV does not meet one or more NPOV criteria, the first portion of the first textual content is modified to generate a modified textual content. The modified textual content is submitted to the LLM, causing the LLM to generate a second textual content.
Type: Application
Filed: June 28, 2024
Publication date: March 20, 2025
Inventors: Zheng Wang, Yazhe Hu, Mengqing Guo, Tao Sheng, Jun Qian, Vinod Murli Mamtani
-
Publication number: 20250094816
Abstract: Systems, methods, and other embodiments associated with automated fine-tuning of text generation for large language models are described herein. In one embodiment, a method accesses a collection of text samples. The text samples include a natural language text prompt that combines content and instructions. The method extracts the instructions from the text prompt. The method fine-tunes a large language model to generate text in natural language based on a text generation loss function that penalizes non-compliance with the extracted instructions by a generated text response to the text prompt. The method generates an evaluation score for performance of the tuned large language model as a text generator based on a value of the text generation loss function for a second generated text response. And, the method automatically signals that the fine-tuning of the tuned large language model is complete in response to the evaluation score satisfying a threshold.
Type: Application
Filed: April 30, 2024
Publication date: March 20, 2025
Inventors: Yazhe HU, Mengqing GUO, Zheng WANG, Tao SHENG, Jun QIAN, Vinod MAMTANI
-
Publication number: 20250097171
Abstract: Systems, methods, and other embodiments associated with automated fine-tuning of chatbot performance for large language models are described herein. In one embodiment, a method accesses a collection of sample conversations between two entities. An individual sample conversation includes one or more rounds of a natural language example prompt by a querent and an example response by an agent. The method fine-tunes an LLM to generate responses in natural language based on a chatbot loss function that evaluates first responses generated by the LLM to the example prompts by the querent. The method generates an evaluation score for performance of the tuned LLM as a chatbot based on second responses generated by the tuned LLM to test prompts from a test conversation. And, the method automatically signals that the fine-tuning of the tuned LLM is complete in response to the evaluation score satisfying a threshold.
Type: Application
Filed: July 10, 2024
Publication date: March 20, 2025
Inventors: Yazhe HU, Mengqing GUO, Zheng WANG, Tao SHENG, Jun QIAN, Vinod MAMTANI
-
POSITIVE ELECTRODE MATERIAL, SECONDARY BATTERY, BATTERY MODULE, BATTERY PACK, AND ELECTRIC APPARATUS
Publication number: 20250096272
Abstract: A positive electrode material, and a secondary battery, battery module, battery pack, and electric apparatus including the same are disclosed. The positive electrode material includes an active substance and a binder, and the binder has the following formula: The fluorine-free binder in the positive electrode material not only has good flexibility and cohesiveness but also good NMP solubility and thus is a good substitute for PVDF.
Type: Application
Filed: December 2, 2024
Publication date: March 20, 2025
Applicant: CONTEMPORARY AMPEREX TECHNOLOGY (HONG KONG) LIMITED
Inventors: Lei LU, Changyuan HU, Zheng WANG, Yalong WANG, Shisong LI, Shunhao DAI
-
Publication number: 20250094865
Abstract: Techniques for ensuring that language models follow instructions indicated in prompts are provided. In one technique, a first language model generates a response based on a prompt. A set of instructions in the prompt is identified. For each instruction in the set, a second language model determines whether the response indicates that the first language model followed the instruction. In another technique, for each prompt of a plurality of prompts: (1) a first language model generates a response based on the prompt; (2) multiple instructions are identified based on the prompt; (3) a second language model generates, based on the plurality of instructions, an output that indicates that the first language model followed each instruction; and (4) the prompt, the response, and the multiple instructions are stored in a training instance. The first language model is fine-tuned based on the training instances.
Type: Application
Filed: April 8, 2024
Publication date: March 20, 2025
Inventors: Zheng Wang, Yazhe Hu, Mengqing Guo, Tao Sheng, Jun Qian, Vinod M. Mamtani
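The per-instruction verification loop in the first technique can be sketched as follows. The sentence-splitting heuristic and the keyword judge in `follows` are toy stand-ins for the second language model; only the flow, extract instructions, then check the first model's response against each one, mirrors the abstract.

```python
def extract_instructions(prompt):
    """Split a prompt into individual instruction sentences (toy heuristic)."""
    return [s.strip() for s in prompt.split(".") if s.strip()]

def follows(response, instruction):
    # stand-in judge: treat the instruction as followed if the response
    # mentions the instruction's last word (e.g. "French" for "Answer in French")
    return instruction.split()[-1].lower() in response.lower()

def verify(prompt, response):
    """Per-instruction report, as the second language model would produce."""
    return {instr: follows(response, instr) for instr in extract_instructions(prompt)}
```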
-
Publication number: 20250094866
Abstract: Techniques for correcting hallucinations produced by generative large language models (LLMs) are provided. In one technique, a computing system accesses first output generated by an LLM. The computing system identifies, within the first output, a plurality of assertions. The computing system determines that a first assertion in the plurality of assertions is false. The computing system generates a prompt that indicates that the first assertion is false. The computing system submits the prompt as input to the LLM. The computing system accesses second output that is generated by the LLM, where the second output includes a second assertion that is different than the first assertion and corresponds to the first assertion.
Type: Application
Filed: May 30, 2024
Publication date: March 20, 2025
Inventors: Zheng Wang, Yazhe Hu, Mengqing Guo, Tao Sheng, Jun Qian, Vinod Murli Mamtani
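The correction loop this abstract describes, split the output into assertions, flag a false one, and prompt the LLM to revise it, can be sketched as below. The sentence-based assertion splitting and the tiny fact store are illustrative assumptions; a real system would use fact-checking rather than set lookup.

```python
# toy fact store (lowercased) standing in for a real fact-checking step
FACTS = {"water boils at 100c", "the sun is a star"}

def assertions(output):
    """Split LLM output into individual assertions (toy sentence split)."""
    return [a.strip() for a in output.split(".") if a.strip()]

def false_assertions(output):
    """Assertions unsupported by the fact store are treated as false."""
    return [a for a in assertions(output) if a.lower() not in FACTS]

def corrective_prompt(output):
    """Build the follow-up prompt that tells the LLM which assertion is false."""
    bad = false_assertions(output)
    if not bad:
        return None
    return "The following assertion is false, please revise: " + bad[0]
```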
-
Publication number: 20250089422
Abstract: Provided is a display device. The display device includes a display panel and a circuit board. The display panel includes a display region and a non-display region. A first wiring is distributed in the non-display region. A second wiring is distributed in the circuit board. The circuit board is bonded to the non-display region, and the second wiring is connected to the first wiring to form a near-field communication antenna.
Type: Application
Filed: May 26, 2023
Publication date: March 13, 2025
Inventors: Mingqiang WANG, Zheng BAO, Dexing HE, Jiaxiang ZHANG, Chang WANG, Hongjin HU, Yuan FANG, Bin ZHANG, Seungyong OH
-
Publication number: 20250083480
Abstract: An external TPMS sensor for heavy-duty trucks comprises a helical antenna and a 1225 cell, and is installed outside a tire of a vehicle. System power consumption is reduced by increasing the communication rate of the TPMS sensor, and/or decreasing the sampling period, and/or decreasing the number of data frames transmitted every time, and/or decreasing RF output power, and/or disposing an LNA circuit at an input terminal of a corresponding receiver antenna. The requirement of the TPMS sensor for a long service life is met by reducing system power consumption. The helical antenna, the matching network, and the chip with 5 dBm output power are used to realize miniaturization of the TPMS sensor; and the 1225 cell is used, which, compared with traditional TPMS sensors, allows the diameter of the TPMS sensor to be reduced by about 8 mm and the weight of the TPMS sensor to be reduced by over ?, such that miniaturization of the TPMS sensor is further realized.
Type: Application
Filed: June 21, 2023
Publication date: March 13, 2025
Inventors: Jianer ZHANG, Zenan HU, Zengchao JI, Mingguang YU, Chuan ZHANG, Zheng TANG
-
Publication number: 20250079531
Abstract: This application relates to an electrode plate structure. With a ratio CB1 of a surface loading of a second cathode material layer to a surface loading of a first anode material layer controlled to be 1:(1.01-1.3) and a ratio CB2 of a surface loading of a first cathode material layer to a surface loading of a second anode material layer controlled to be 1:(1.03-1.4), the surface loadings of the first anode material layer and second anode material layer are slightly higher than the surface loadings of the corresponding second cathode material layer and first cathode material layer, so that during cycling, especially at a high rate, the first anode material layer and the second anode material layer can provide sufficient lithiation capacity, avoiding lithium precipitation on the anode. This application also implements a differential design for the material layers on two sides of the electrode plates.
Type: Application
Filed: November 14, 2024
Publication date: March 6, 2025
Applicant: Contemporary Amperex Technology (Hong Kong) Limited
Inventors: Yalong WANG, Lei LU, Zheng WANG, Kaiming DENG, Linzhen YU, Changyuan HU, Shisong LI, Shunhao DAI
-
Publication number: 20250066517
Abstract: A resin composition includes a main polymer and a crosslinking agent. The crosslinking agent is represented by Formula (A). Ar1 is an aromatic group containing 6 to 50 ring atoms, each L1 is independently selected from one or more of a single bond, a C1 to C50 alkylene, or a dynamic covalent bond group, and each n1 is independently selected from integers from 1 to 3. A structural unit of the main polymer contains a structure represented by Formula (1). The main polymer contains no aromatic group.
Type: Application
Filed: November 15, 2024
Publication date: February 27, 2025
Inventors: Changyuan HU, Lei LU, Yalong WANG, Zheng WANG
-
Patent number: 12234214
Abstract: The present disclosure relates to a method for continuously synthesizing 5-hydroxymethylfurfural (HMF) by using a micro-channel reactor, which belongs to the technical field of micro-chemical engineering. The method includes: separately conveying an aqueous glucose solution, which contains FeCl3 and HCl, and methyl butyl ketone to a T-shaped micro-mixer, the T-shaped micro-mixer being in communication with a capillary tube; then enabling an aqueous glucose solution phase and a methyl butyl ketone phase to flow in the capillary tube in a segmented flow manner while performing an HMF synthesis reaction; and collecting a reaction product flowing out of the capillary tube, wherein HMF generated by the reaction is present in an organic phase of the reaction product. The method is easy to operate, has a controllable process, high product yield, a low by-product amount, and a short synthesis period, and is a green method for efficient and continuous synthesis of HMF.
Type: Grant
Filed: July 19, 2024
Date of Patent: February 25, 2025
Assignee: CHINA CONSTRUCTION INDUSTRIAL & ENERGY ENGINEERING GROUP CO., LTD
Inventors: Kai Zhu, Yiping Huang, Meng Hu, Changhai Yue, Jingjing Huang, Shuangtao Li, Kai Guo, Zheng Fang
-
Publication number: 20250053279
Abstract: The embodiments of the disclosure provide a method, apparatus, device, and storage medium for presenting information, which relate to the technical field of computers. The method includes: obtaining object search information; in response to the object search information being object category information, determining a target object category corresponding to the object search information; and presenting object information of a plurality of target objects corresponding to the target object category in a search result presentation page, where the plurality of target objects are all different from one another, and object information of a target object includes object attribute information and image resource information of the target object. With this technical solution, when the user searches for an object category, object information of a plurality of mutually distinct target objects corresponding to the target object category is presented in the search result page.
Type: Application
Filed: August 9, 2024
Publication date: February 13, 2025
Inventors: Jing LIN, Duanliang ZHOU, Long RU, Chao WU, Conghai YAO, Yelun LIU, Bin QIAN, Siyi YE, Jie WANG, Wenhao LI, Wenjing LIU, Shengan CAI, Tingting YANG, Yiwei WANG, Junjun YAO, Yifei QIU, Ju YANG, Yunfei SONG, Chuan ZHAO, Xianhui WEI, Xiaofeng WANG, Jianwen WU, Meng CHEN, Mang WANG, Peng HE, Kaijian LIU, Liangpeng XU, Yuhang LIU, Xiang XIAO, Runyu CHEN, Da LEI, Xiangnan LUO, Zheng PENG, Shaolong CHEN, Binghua XU, Hongtao XUE, Guorong ZHU, Qinglin XU, Pingping HUANG, Hongtao HU
-
Publication number: 20250048966
Abstract: A self-adaptive control system for height and operation of ratooning rice stubble cutting includes a stubble righting device, a profiling contact sensor, a speed radar, and a controller. The stubble righting device is installed at the rear of the harvester and is connected to the bracket by the stubble righting device hydraulic cylinder. The profiling contact sensor is configured to detect the height of the header above the ground and transmit data to the controller. The speed radar is configured to obtain and transmit speed information of the harvester to the controller. The hydraulic valve block is installed on the chassis frame and connected to the header lifting hydraulic cylinder and the stubble righting device hydraulic cylinder respectively. Input terminals of the controller are connected to the profiling contact sensor and the speed radar, and output terminals of the controller are connected to the motor and hydraulic valve block.
Type: Application
Filed: September 22, 2022
Publication date: February 13, 2025
Applicant: JIANGSU UNIVERSITY
Inventors: Lizhang XU, Jinpeng HU, Peng LIU, Shuaifeng XING, Jiahui PAN, Xiaoyu CHAI, Zheng MA
-
Publication number: 20250031366
Abstract: A semiconductor structure is disclosed. The semiconductor structure includes a staircase structure disposed over a substrate. The staircase structure includes a plurality of layer stacks, where each layer stack is made of a first material layer over a portion of a second material layer. The staircase structure further includes a plurality of landing pads, where each landing pad is disposed over another portion of the second material layer of a respective layer stack.
Type: Application
Filed: October 4, 2024
Publication date: January 23, 2025
Applicant: Yangtze Memory Technologies Co., Ltd.
Inventors: Zhenyu LU, Jun CHEN, Xiaowang DAI, Jifeng ZHU, Qian TAO, Yu Ru HUANG, Si Ping HU, Lan YAO, Li Hong XIAO, A Man ZHENG, Kun BAO, Haohao YANG