Patents by Inventor Steven Zheng

Steven Zheng has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20250086405
    Abstract: Some implementations relate to generating a training and/or evaluation dataset of LLM prompts (e.g., derived from user queries) based on a prompt complexity. An input prompt, for example derived from a user query, is received. The input prompt is decomposed into a prompt tree comprising a plurality of nodes: a plurality of leaf nodes corresponding to simple sub-prompts of the input prompt; a plurality of branch nodes, each corresponding to multiple simple sub-prompts; and a root node corresponding to the input prompt. A prompt complexity is determined based on a path length of the prompt tree and compared to a threshold complexity. If the prompt complexity is above the threshold complexity, the input prompt is included in a set of training prompts and/or a set of evaluation prompts.
    Type: Application
    Filed: October 5, 2023
    Publication date: March 13, 2025
    Inventors: Swaroop Mishra, Ragha Kotikalapudi, Obaid Sarvana, Sahitya Potluri, YaGuang Li, Taylor Bos, Steven Zheng, Hanzhao Lin, Chenkai Kuang, Heng-Tze Cheng, Ed H. Chi, Quoc Le
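
The abstract of publication 20250086405 describes decomposing an input prompt into a tree of sub-prompts and filtering on a path-length-based complexity score. A minimal Python sketch of that idea is shown below, under the assumption that complexity is measured as tree depth in edges; the PromptNode structure, the threshold value, and the example prompt are illustrative assumptions rather than details taken from the filing.

from dataclasses import dataclass, field
from typing import List

@dataclass
class PromptNode:
    text: str
    children: List["PromptNode"] = field(default_factory=list)

def tree_depth(node: PromptNode) -> int:
    """Longest root-to-leaf path, counted in edges (one possible path-length measure)."""
    if not node.children:
        return 0
    return 1 + max(tree_depth(child) for child in node.children)

def select_for_training(root: PromptNode, threshold: int) -> bool:
    """Keep the prompt for the training/evaluation set only if its complexity exceeds the threshold."""
    return tree_depth(root) > threshold

# Hypothetical decomposition of an input prompt into a prompt tree.
root = PromptNode(
    "Summarize the paper and compare its results with prior work",
    children=[
        PromptNode("Summarize the paper"),  # simple sub-prompt (leaf)
        PromptNode(
            "Compare its results with prior work",  # branch node
            children=[
                PromptNode("List the paper's results"),
                PromptNode("List prior-work results"),
            ],
        ),
    ],
)

if select_for_training(root, threshold=1):
    print("Prompt is complex enough; add it to the training/evaluation set.")
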
  • Publication number: 20250045534
    Abstract: Implementations relate to a method implemented by one or more processors, the method including: receiving natural language (NL) based input associated with a client device; generating, using a large language model (LLM) and based on processing the NL based input, LLM output; determining, based on the LLM output, a sequence of LLM responses, the sequence of LLM responses including at least one intermediate LLM response and a final LLM response. In some implementations, the method may further include causing the final LLM response to be rendered at the client device. In additional or alternative implementations, the method may further include storing, as an instance of training data for fine-tuning the LLM or an additional LLM, the NL based input along with the final LLM response.
    Type: Application
    Filed: October 10, 2023
    Publication date: February 6, 2025
    Inventors: Swaroop Mishra, Ragha Kotikalapudi, Sahitya Potluri, Taylor Bos, YaGuang Li, Hanzhao Lin, Steven Zheng, Yu Du, Chen Zhu, Chenkai Kuang, Xinying Song, Heng-Tze Cheng, Ed H. Chi, Quoc Le
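
The abstract of publication 20250045534 describes deriving a sequence of at least one intermediate LLM response and a final LLM response, then either rendering the final response at the client device or storing the input/response pair as fine-tuning data. The Python sketch below illustrates one way such a flow could look; call_llm, respond_in_steps, and the fixed number of intermediate steps are hypothetical placeholders, not details from the filing.

def call_llm(prompt: str) -> str:
    # Placeholder: in a real system this would invoke the LLM serving API.
    return f"response to: {prompt}"

def respond_in_steps(nl_input: str, num_intermediate: int = 2) -> list:
    """Produce intermediate responses followed by a final response."""
    responses = []
    context = nl_input
    for step in range(num_intermediate):
        intermediate = call_llm(f"Step {step + 1} toward answering: {context}")
        responses.append(intermediate)
        context = intermediate
    final = call_llm(f"Give the final answer, building on: {context}")
    responses.append(final)
    return responses

training_data = []

def handle_request(nl_input: str, training_mode: bool = False) -> str:
    responses = respond_in_steps(nl_input)
    final_response = responses[-1]
    if training_mode:
        # Store the input/final-response pair as a fine-tuning instance.
        training_data.append({"input": nl_input, "target": final_response})
    return final_response  # rendered at the client device in serving mode

print(handle_request("How do I reset my router?"))
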
  • Publication number: 20240394471
    Abstract: Implementations relate to improving the instruction-following capabilities of large language models (LLMs) using instruction decomposition, self-evaluation, and optionally progressive refinement. Processor(s) of a system can obtain natural language (NL) based input, use an LLM to generate a plurality of candidate responses and evaluate them against the instructions included in the NL based input, and progressively refine the candidate responses until it is determined that one or more termination criteria are satisfied. In some implementations, the NL based input can be received from a client device. In these implementations, a given candidate response that is progressively refined can be rendered for presentation at the client device, responsive to the NL based input. In additional or alternative implementations, the NL based input can be obtained from database(s). In these implementations, a given candidate response that is progressively refined can be utilized in fine-tuning the LLM.
    Type: Application
    Filed: August 8, 2023
    Publication date: November 28, 2024
    Inventors: Ragha Kotikalapudi, Swaroop Mishra, Sahitya Potluri, Taylor Bos, Yu Du, Chen Zhu, Steven Zheng, Hanzhao Lin, Summer Yue, Heng-Tze Cheng, Quoc Le, Ed H. Chi
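
The abstract of publication 20240394471 describes generating candidate responses, self-evaluating them against the instructions in the input, and progressively refining them until termination criteria are met. Below is a hypothetical Python sketch of such a generate-evaluate-refine loop; the helper functions, the 0-to-1 scoring, and the two termination criteria (a score target and a round limit) are assumptions made for illustration.

import random

def generate(prompt: str, n: int = 3) -> list:
    # Stand-in for sampling n candidate responses from an LLM.
    return [f"candidate {i} for: {prompt}" for i in range(n)]

def evaluate_against_instructions(candidate: str, instructions: list) -> float:
    # Stand-in for an LLM self-critique scoring how well the candidate
    # follows the decomposed instructions (0.0 to 1.0).
    return random.random()

def refine(candidate: str, instructions: list) -> str:
    # Stand-in for asking the LLM to revise the candidate toward the instructions.
    return candidate + " (refined)"

def answer(nl_input: str, instructions: list,
           score_target: float = 0.9, max_rounds: int = 5) -> str:
    candidates = generate(nl_input)
    best, best_score = None, -1.0
    for _ in range(max_rounds):                       # termination criterion 1: round limit
        scored = [(evaluate_against_instructions(c, instructions), c) for c in candidates]
        score, cand = max(scored)
        if score > best_score:
            best, best_score = cand, score
        if best_score >= score_target:                # termination criterion 2: score target
            break
        candidates = [refine(c, instructions) for c in candidates]
    return best

print(answer("Write a two-line poem about rain in French",
             ["exactly two lines", "mentions rain", "written in French"]))
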
  • Publication number: 20240378394
    Abstract: Implementations described herein relate to using self-evaluation when utilizing a large language model (LLM) to generate a response to a natural language (NL) based input. The LLM can be used to process the NL based input to generate a plurality of responses, and to generate a critique of those responses by comparing them to a set of response evaluation criteria. One of the responses can then be selected based on the comparison with the set of response evaluation criteria, which can vary from one NL based input to another. If the NL based input was obtained from a user of a client device during an inference stage, the selected response can be rendered for presentation to the user. If the NL based input was obtained during a training stage, the selected response can be stored as a training instance, optionally in association with additional data.
    Type: Application
    Filed: August 8, 2023
    Publication date: November 14, 2024
    Inventors: Ragha Kotikalapudi, Chen Zhu, Steven Zheng, Sahitya Potluri, Yu Du, Heng-Tze Cheng, Quoc Le, Ed H. Chi
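
The abstract of publication 20240378394 describes generating several candidate responses, critiquing them against a per-input set of response evaluation criteria, and selecting one to either render to the user (inference) or store as a training instance (training). A small illustrative Python sketch follows; generate_responses, the keyword-matching critique, and the storage format are assumptions, not the patent's actual method.

def generate_responses(nl_input: str, n: int = 4) -> list:
    # Stand-in for sampling n candidate responses from an LLM.
    return [f"draft {i}: answer to {nl_input}" for i in range(n)]

def critique(response: str, criteria: list) -> int:
    # Stand-in for an LLM critique; here we simply count criteria that
    # appear to be satisfied via keyword matching.
    return sum(1 for c in criteria if c.lower() in response.lower())

def select_response(nl_input: str, criteria: list,
                    training_stage: bool = False, training_store=None) -> str:
    responses = generate_responses(nl_input)
    best = max(responses, key=lambda r: critique(r, criteria))
    if training_stage and training_store is not None:
        # Store the selected response as a training instance.
        training_store.append({"input": nl_input, "response": best, "criteria": criteria})
    return best  # rendered to the user at inference time

store = []
print(select_response("Summarize the meeting notes in French",
                      ["meeting", "french"], training_stage=True, training_store=store))
print(store)
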
  • Patent number: 10801879
    Abstract: A magnesium zinc oxide (MZO) nanostructure-modified quartz crystal microbalance (MZOnano-QCM) combines the unique sensing ability and biocompatibility of MZO-based nanostructures with the dynamic impedance spectrum capability of bulk acoustic wave (BAW) devices, including the QCM, to form a real-time, noninvasive, and label-free cell-monitoring biosensor. It specifically detects the susceptibility and resistance of bacterial and fungal strains and of cancer cells to various antibiotic and antifungal drugs and anticancer drugs, respectively.
    Type: Grant
    Filed: March 21, 2018
    Date of Patent: October 13, 2020
    Assignee: Rutgers, The State University of New Jersey
    Inventors: Yicheng Lu, Pavel I. Reyes, Steven Zheng, Andrew Zheng, Keyang Yang
  • Patent number: 10551384
    Abstract: Certain embodiments of the invention provide a method of detecting the presence of a biomarker associated with resistance to an mTOR kinase inhibitor in a subject, comprising determining the presence of the biomarker in a physiological sample from the subject, wherein the sample comprises a nucleic acid.
    Type: Grant
    Filed: April 7, 2017
    Date of Patent: February 4, 2020
    Assignee: Rutgers, The State University of New Jersey
    Inventor: Steven Zheng
  • Publication number: 20180209836
    Abstract: A magnesium zinc oxide (MZO) nanostructure-modified quartz crystal microbalance (MZOnano-QCM) combines the unique sensing ability and biocompatibility of MZO-based nanostructures with the dynamic impedance spectrum capability of bulk acoustic wave (BAW) devices, including the QCM, to form a real-time, noninvasive, and label-free cell-monitoring biosensor. It specifically detects the susceptibility and resistance of bacterial and fungal strains and of cancer cells to various antibiotic and antifungal drugs and anticancer drugs, respectively.
    Type: Application
    Filed: March 21, 2018
    Publication date: July 26, 2018
    Inventors: Yicheng Lu, Pavel I. Reyes, Steven Zheng, Andrew Zheng, Keyang Yang
  • Publication number: 20170299596
    Abstract: Certain embodiments of the invention provide a method of detecting the presence of a biomarker associated with resistance to an mTOR kinase inhibitor in a subject, comprising determining the presence of the biomarker in a physiological sample from the subject, wherein the sample comprises a nucleic acid.
    Type: Application
    Filed: April 7, 2017
    Publication date: October 19, 2017
    Applicant: Rutgers, The State University of New Jersey
    Inventor: Steven Zheng
  • Publication number: 20090030189
    Abstract: The present invention relates generally to molecular mechanisms of mTOR-related human diseases. More specifically, the invention relates to two novel related polypeptides, the ER-localization sequence (ELS) and the Golgi-localization sequence (GLS), along with therapeutic, diagnostic, and research utilities for the corresponding polynucleotides and proteins.
    Type: Application
    Filed: December 10, 2007
    Publication date: January 29, 2009
    Inventors: X. F. Steven Zheng, Xiangyu Liu