Patents Examined by Miranda M Huang
  • Patent number: 11847544
    Abstract: A mechanism is provided in a data processing system for preventing data leakage in automated machine learning. The mechanism receives a data set comprising a label for a target variable for a classifier machine learning model and a set of features. For each given feature in the set of features, the mechanism trains a subprime classifier model using the given feature as a target variable and remaining features as independent input features, tests the subprime classifier model, and records results of the subprime classifier model. The mechanism performs statistical analysis on the recorded results to identify an outlier result corresponding to an outlier subprime classifier model.
    Type: Grant
    Filed: July 21, 2020
    Date of Patent: December 19, 2023
    Assignee: International Business Machines Corporation
    Inventor: Kunal Sawarkar
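As context for 11847544, the per-feature "subprime classifier" loop can be sketched briefly in Python. This is a minimal illustration, not the patented implementation: it assumes a pandas DataFrame of numerically encoded, categorical-valued features, uses scikit-learn cross-validation as the test step, and a simple z-score as the statistical analysis.

```python
# Hypothetical sketch: each feature takes a turn as the target of a "subprime"
# classifier trained on the remaining features; an outlier score flags leakage.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def detect_leaky_features(df, features, z_thresh=3.0):
    scores = {}
    for target in features:
        X = df[[f for f in features if f != target]]
        y = df[target]
        clf = RandomForestClassifier(n_estimators=50, random_state=0)
        scores[target] = cross_val_score(clf, X, y, cv=3).mean()  # recorded result

    values = np.array(list(scores.values()))
    z_scores = (values - values.mean()) / (values.std() + 1e-9)
    # A feature that the others predict unusually well is a leakage suspect.
    return [f for f, z in zip(scores, z_scores) if z > z_thresh], scores
```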
  • Patent number: 11847565
    Abstract: Methods and apparatuses are described for automatic refinement of intent classification for virtual assistant applications. A computing device creates utterance groups comprising messages. The device determines, using a first machine learning (ML) classifier, a first predicted intent associated with each utterance group and determines, using a second ML classifier, a second predicted intent associated with each utterance group. The device combines the first and second predicted intents to generate a final predicted intent, determines, for each incomprehensible utterance group, whether the final predicted intent overlaps with the final predicted intent for other utterance groups, and selects messages having no overlapping intent to create new intents.
    Type: Grant
    Filed: February 14, 2023
    Date of Patent: December 19, 2023
    Assignee: FMR LLC
    Inventors: Hua Hao, Tieyi Guo, Byung Chun, Yachao He, Ou Li, Chao Yu
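For 11847565, the combination-and-overlap step can be illustrated with a short Python sketch. The combination rule (agreement, otherwise higher confidence wins), the group fields, and the overlap test below are illustrative assumptions, not the patent's specific method.

```python
# Hypothetical sketch: combine two classifiers' predicted intents per utterance
# group, then keep "incomprehensible" groups whose final intent overlaps with
# no other group as candidates for new intents.
from collections import Counter

def combine_intents(pred_a, conf_a, pred_b, conf_b):
    # Simple combination rule: agree -> keep, disagree -> higher confidence wins.
    return pred_a if pred_a == pred_b or conf_a >= conf_b else pred_b

def propose_new_intents(groups):
    # groups: list of dicts with keys: id, incomprehensible, pred_a, conf_a, pred_b, conf_b
    final = {g["id"]: combine_intents(g["pred_a"], g["conf_a"], g["pred_b"], g["conf_b"])
             for g in groups}
    counts = Counter(final.values())
    # Groups whose final intent appears nowhere else have no overlapping intent.
    return [g["id"] for g in groups
            if g["incomprehensible"] and counts[final[g["id"]]] == 1]
```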
  • Patent number: 11842281
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for training a reinforcement learning system. The method includes: training an action selection policy neural network and, during the training of the action selection policy neural network, training one or more auxiliary control neural networks and a reward prediction neural network. Each of the auxiliary control neural networks is configured to receive a respective intermediate output generated by the action selection policy neural network and generate a policy output for a corresponding auxiliary control task. The reward prediction neural network is configured to receive one or more intermediate outputs generated by the action selection policy neural network and generate a corresponding predicted reward.
    Type: Grant
    Filed: February 24, 2021
    Date of Patent: December 12, 2023
    Assignee: DeepMind Technologies Limited
    Inventors: Volodymyr Mnih, Wojciech Czarnecki, Maxwell Elliot Jaderberg, Tom Schaul, David Silver, Koray Kavukcuoglu
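For 11842281, a compact PyTorch sketch shows how auxiliary control and reward-prediction heads can share a policy network's intermediate output and be trained jointly; the layer sizes, head shapes, and loss weights are placeholders, not the patented architecture.

```python
# Hypothetical sketch: an actor-critic policy network whose shared intermediate
# features also feed an auxiliary control head and a reward-prediction head.
import torch
import torch.nn as nn

class PolicyWithAuxiliaries(nn.Module):
    def __init__(self, obs_dim, n_actions):
        super().__init__()
        self.trunk = nn.Sequential(nn.Linear(obs_dim, 128), nn.ReLU())
        self.policy_head = nn.Linear(128, n_actions)        # main action-selection policy
        self.value_head = nn.Linear(128, 1)
        self.aux_control_head = nn.Linear(128, n_actions)   # auxiliary control task output
        self.reward_pred_head = nn.Linear(128, 1)           # predicts the upcoming reward

    def forward(self, obs):
        h = self.trunk(obs)                                  # shared intermediate output
        return (self.policy_head(h), self.value_head(h),
                self.aux_control_head(h), self.reward_pred_head(h))

# Joint objective during training (weights are illustrative):
# loss = rl_loss + lambda_aux * aux_control_loss + lambda_r * reward_prediction_loss
```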
  • Patent number: 11829874
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for determining neural network architectures. One of the methods includes generating, using a controller neural network, a batch of output sequences, each output sequence in the batch defining a respective architecture of a child neural network that is configured to perform a particular neural network task; for each output sequence in the batch: training a respective instance of the child neural network having the architecture defined by the output sequence; evaluating a performance of the trained instance of the child neural network on the particular neural network task to determine a performance metric for the trained instance of the child neural network on the particular neural network task; and using the performance metrics for the trained instances of the child neural network to adjust the current values of the controller parameters of the controller neural network.
    Type: Grant
    Filed: June 7, 2021
    Date of Patent: November 28, 2023
    Assignee: Google LLC
    Inventors: Barret Zoph, Quoc V. Le
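For 11829874, the controller update can be sketched as a REINFORCE-style step over a batch of sampled child architectures. The `controller.sample()` method and `build_and_train_child` callback below are hypothetical stand-ins for whatever controller network and child training pipeline are in use.

```python
# Hypothetical sketch of the controller loop: sample a batch of architecture
# descriptions, train/evaluate a child network for each, and update the
# controller with a policy-gradient step weighted by the performance metric.
import torch

def nas_step(controller, build_and_train_child, batch_size=8, baseline=0.0):
    log_probs, rewards = [], []
    for _ in range(batch_size):
        arch_seq, log_prob = controller.sample()          # output sequence defining a child architecture
        accuracy = build_and_train_child(arch_seq)        # train the child, evaluate on the task
        log_probs.append(log_prob)
        rewards.append(accuracy)

    rewards_t = torch.tensor(rewards)
    advantages = rewards_t - baseline                     # a simple baseline reduces variance
    loss = -(torch.stack(log_probs) * advantages).mean()  # REINFORCE objective
    loss.backward()                                       # gradients w.r.t. controller parameters
    return rewards_t.mean().item()
```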
  • Patent number: 11829921
    Abstract: A system and method for recommending demand-supply agent pairs for transactions uses a deep neural network on data of demand agents to produce a demand agent vector, which is used to select candidate supply agents based on their likelihood of a future transaction and to find the k nearest neighbor demand agents for each demand agent. The candidate supply agents and the k nearest neighbor demand agents are then combined to produce candidate demand-supply agent pairs, from which recommended demand-supply agent pairs are found by applying machine learning models.
    Type: Grant
    Filed: March 5, 2020
    Date of Patent: November 28, 2023
    Assignee: VMWARE, INC.
    Inventors: Kiran Rama, Francis Chow, Ricky Ho, Sayan Putatunda, Ravi Prasad Kondapalli, Stephen Harris
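For 11829921, the candidate-pair construction can be sketched with numpy and scikit-learn. The array shapes, the top-s cutoff, and the use of NearestNeighbors are assumptions for illustration; the candidate pairs would still be ranked by a downstream ML model as the abstract describes.

```python
# Hypothetical sketch: embed demand agents, pick candidate supply agents by
# predicted transaction likelihood, find k nearest neighbor demand agents, and
# combine both into candidate demand-supply pairs.
import numpy as np
from sklearn.neighbors import NearestNeighbors

def candidate_pairs(demand_vecs, demand_ids, supply_scores, supply_ids, k=5, top_s=10):
    # demand_vecs: (n_demand, d) embeddings from the deep neural network
    # supply_scores: (n_demand, n_supply) predicted likelihood of a future transaction
    nn = NearestNeighbors(n_neighbors=k + 1).fit(demand_vecs)
    _, neigh = nn.kneighbors(demand_vecs)                 # k nearest demand agents (plus self)

    pairs = set()
    for i, d_id in enumerate(demand_ids):
        for s in np.argsort(-supply_scores[i])[:top_s]:   # candidates from likelihood ranking
            pairs.add((d_id, supply_ids[s]))
        for j in neigh[i][1:]:                            # candidates borrowed from similar demand agents
            for s in np.argsort(-supply_scores[j])[:top_s]:
                pairs.add((d_id, supply_ids[s]))
    return pairs                                          # fed to a downstream ranking model
```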
  • Patent number: 11823028
    Abstract: An artificial neural network (ANN) quantization method for generating an output ANN by quantizing an input ANN includes: obtaining second parameters by quantizing first parameters of the input ANN; obtaining a sample distribution from an intermediate ANN in which the obtained second parameters have been applied to the input ANN; and obtaining a fractional length for the sample distribution by quantizing the obtained sample distribution.
    Type: Grant
    Filed: July 24, 2018
    Date of Patent: November 21, 2023
    Assignee: SAMSUNG ELECTRONICS CO., LTD.
    Inventors: Do-yun Kim, Han-young Yim, Byeoung-su Kim, Nak-woo Sung, Jong-han Lim, Sang-hyuck Ha
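For 11823028, the fractional-length step can be illustrated with a small numpy sketch of symmetric fixed-point quantization; the 8-bit width and the max-based rule below are common conventions, not necessarily the patent's exact procedure.

```python
# Hypothetical sketch: derive a fractional length for a fixed-point
# representation from a sample distribution, then quantize with it.
import numpy as np

def fractional_length(samples, bit_width=8):
    # Integer bits needed to cover the largest magnitude in the sample
    # distribution; the rest of the signed word becomes the fractional length.
    max_abs = np.max(np.abs(samples)) + 1e-12
    int_bits = int(np.ceil(np.log2(max_abs)))
    return (bit_width - 1) - int_bits          # one bit reserved for the sign

def quantize(x, frac_len, bit_width=8):
    scale = 2.0 ** frac_len
    lo, hi = -(2 ** (bit_width - 1)), 2 ** (bit_width - 1) - 1
    return np.clip(np.round(x * scale), lo, hi) / scale
```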
  • Patent number: 11823067
    Abstract: The present disclosure relates to system(s) and method(s) for tuning an analytical model. The system builds a global analytical model based on modelling data received from a user. Further, the system analyses a target eco-system to identify a set of target eco-system parameters. The system further selects a sub-set of model parameters, corresponding to the set of target eco-system parameters, from a set of model parameters. Further, the system generates a local analytical model based on updating the global analytical model, based on the sub-set of model parameters and one or more PMML wrappers. The system further deploys the local analytical model at each node, from a set of nodes, associated with the target eco-system. Further, the system gathers test results from each node based on executing the local analytical model. The system further tunes the sub-set of model parameters associated with the local analytical model using federated learning algorithms.
    Type: Grant
    Filed: June 20, 2018
    Date of Patent: November 21, 2023
    Assignee: HCL Technologies Limited
    Inventors: S U M Prasad Dhanyamraju, Satya Sai Prakash Kanakadandi, Sriganesh Sultanpurkar, Karthik Leburi, Vamsi Peddireddy
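For 11823067, the final tuning step can be sketched as a FedAvg-style average restricted to the selected sub-set of model parameters; the dictionary-based parameter layout and plain averaging are simplifying assumptions, and the PMML-wrapper packaging is not reproduced.

```python
# Hypothetical sketch: each node executes the local model, and only the
# selected sub-set of parameters is averaged back into the local model.
import numpy as np

def federated_tune(global_params, tunable_keys, node_results):
    # node_results: per-node dicts of locally tuned values for the parameter sub-set
    tuned = dict(global_params)
    for key in tunable_keys:
        tuned[key] = float(np.mean([r[key] for r in node_results]))
    return tuned

# Example: only the parameters matching the target eco-system are tuned.
global_params = {"threshold": 0.5, "window": 30, "rate": 0.1}
nodes = [{"threshold": 0.42, "rate": 0.12}, {"threshold": 0.47, "rate": 0.09}]
local_model = federated_tune(global_params, ["threshold", "rate"], nodes)
```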
  • Patent number: 11822609
    Abstract: Systems and methods for forecasting the prominence of various attributes in a future subject matter area are disclosed. An attribute is determined based on inputs received by a computing system. A set of indicators is determined based on the attribute and features extracted from an existing document set. The prominence of the attribute in the existing document set is determined. A prominence estimate of the attribute in a future document set is determined.
    Type: Grant
    Filed: January 15, 2016
    Date of Patent: November 21, 2023
    Assignee: SRI INTERNATIONAL
    Inventors: John J Byrnes, Clint Frederickson, Kyle J McIntyre, Tulay Muezzinoglu, Edmond D Chow, William T Deans
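For 11822609, one simple way to picture prominence estimation is to measure the share of documents mentioning an attribute per time period and extrapolate a trend; the linear fit below is purely illustrative and far cruder than the indicator-based approach the abstract describes.

```python
# Hypothetical sketch: prominence = share of documents mentioning the attribute
# per period; a linear trend extrapolates to a future document set.
import numpy as np

def prominence_by_period(docs, attribute):
    # docs: list of (year, text) pairs
    counts, totals = {}, {}
    for year, text in docs:
        totals[year] = totals.get(year, 0) + 1
        counts[year] = counts.get(year, 0) + (attribute.lower() in text.lower())
    years = sorted(totals)
    return years, [counts[y] / totals[y] for y in years]

def forecast_prominence(docs, attribute, future_year):
    years, prom = prominence_by_period(docs, attribute)
    slope, intercept = np.polyfit(years, prom, 1)          # linear trend over observed periods
    return float(np.clip(slope * future_year + intercept, 0.0, 1.0))
```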
  • Patent number: 11816400
    Abstract: The disclosure describes various aspects of techniques for optimal fault-tolerant implementations of controlled-Z^a gates and Heisenberg interactions. Improvements in the implementation of the controlled-Z^a gate can be made by using a clean ancilla and in-circuit measurement. Various examples are described that depend on whether the implementation is with or without measurement and feedforward. The implementation of the Heisenberg interaction can leverage the improved controlled-Z^a gate implementation. These implementations can significantly cut the implementation costs associated with fault-tolerant quantum computing systems.
    Type: Grant
    Filed: February 13, 2019
    Date of Patent: November 14, 2023
    Assignee: IonQ, Inc.
    Inventors: Yunseong Nam, Dmitri Maslov
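For 11816400, the two target operations named in the abstract can be written down directly as unitaries with numpy/scipy. This shows only what the gates do, not the patent's fault-tolerant, ancilla-plus-measurement constructions, and it assumes the conventional reading of controlled-Z^a as a controlled phase of pi*a.

```python
# Hypothetical sketch of the target unitaries: a controlled-Z^a gate and a
# two-qubit Heisenberg (XX+YY+ZZ) interaction.
import numpy as np
from scipy.linalg import expm

def controlled_z_power(a):
    # Applies a phase of exp(i*pi*a) only to the |11> component.
    return np.diag([1, 1, 1, np.exp(1j * np.pi * a)])

X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.diag([1, -1]).astype(complex)

def heisenberg_interaction(theta):
    h = np.kron(X, X) + np.kron(Y, Y) + np.kron(Z, Z)
    return expm(-1j * theta * h)            # two-qubit Heisenberg evolution
```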
  • Patent number: 11809954
    Abstract: An encoding apparatus is connected to a learning circuit that processes learning of a deep neural network. The apparatus is configured to perform encoding for reconfiguring the connection or disconnection of a plurality of edges in a layer of the deep neural network, using an edge sequence generated based on a random number sequence and dropout information indicating a ratio between connected edges and disconnected edges among the plurality of edges included in the layer.
    Type: Grant
    Filed: February 21, 2019
    Date of Patent: November 7, 2023
    Assignee: SAMSUNG ELECTRONICS CO., LTD.
    Inventors: Sungho Kang, Hyungdal Kwon, Cheon Lee, Yunjae Lim
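For 11809954, the encoding idea can be sketched in a few lines of numpy: a layer's connect/disconnect pattern is represented by a seed and a dropout ratio rather than by the full mask. The seeded generator and threshold rule are illustrative assumptions.

```python
# Hypothetical sketch: encode a layer's edge connectivity as a random seed plus
# the dropout ratio; regenerating the random sequence reconstructs the mask.
import numpy as np

def encode_dropout(num_edges, dropout_ratio, seed):
    return {"seed": seed, "dropout_ratio": dropout_ratio, "num_edges": num_edges}

def decode_edge_mask(code):
    rng = np.random.default_rng(code["seed"])              # reproduces the random number sequence
    sequence = rng.random(code["num_edges"])
    return sequence >= code["dropout_ratio"]               # True = connected, False = disconnected

mask = decode_edge_mask(encode_dropout(num_edges=1024, dropout_ratio=0.3, seed=42))
```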
  • Patent number: 11797825
    Abstract: The technology disclosed provides a so-called “joint many-task neural network model” to solve a variety of increasingly complex natural language processing (NLP) tasks using growing depth of layers in a single end-to-end model. The model is successively trained by considering linguistic hierarchies, directly connecting word representations to all model layers, explicitly using predictions in lower tasks, and applying a so-called “successive regularization” technique to prevent catastrophic forgetting. Three examples of lower level model layers are part-of-speech (POS) tagging layer, chunking layer, and dependency parsing layer. Two examples of higher level model layers are semantic relatedness layer and textual entailment layer. The model achieves the state-of-the-art results on chunking, dependency parsing, semantic relatedness and textual entailment.
    Type: Grant
    Filed: May 26, 2021
    Date of Patent: October 24, 2023
    Assignee: Salesforce, Inc.
    Inventors: Kazuma Hashimoto, Caiming Xiong, Richard Socher
  • Patent number: 11798681
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for jointly training an encoder neural network and a decoder neural network. In one aspect, a method comprises: updating current values of a set of encoder parameters and current values of a set of decoder parameters using gradients of a reconstruction loss function that measures an error in a reconstruction of multi-modal data from a training example, wherein: the reconstruction loss function comprises a plurality of scaling factors that each scale a respective term in the reconstruction loss function that measures an error in the reconstruction of a corresponding proper subset of feature dimensions of the multi-modal data from the training example.
    Type: Grant
    Filed: October 5, 2022
    Date of Patent: October 24, 2023
    Assignee: Neumora Therapeutics, Inc.
    Inventors: Tathagata Banerjee, Matthew Edward Kollada
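For 11798681, the scaled reconstruction loss can be sketched in PyTorch: each term covers a proper subset of the multi-modal feature dimensions and carries its own scaling factor. The mean-squared error, the index layout, and the example weights are assumptions for illustration.

```python
# Hypothetical sketch: reconstruction loss with a scaling factor per
# feature-dimension subset; encoder and decoder are updated on its gradients.
import torch

def scaled_reconstruction_loss(x, x_hat, subsets, scales):
    # subsets: list of index tensors, one per modality / feature subset
    # scales: matching list of scaling factors
    loss = torch.tensor(0.0)
    for idx, scale in zip(subsets, scales):
        loss = loss + scale * torch.mean((x[:, idx] - x_hat[:, idx]) ** 2)
    return loss

# e.g. dimensions 0-9 are one modality, 10-14 another
subsets = [torch.arange(0, 10), torch.arange(10, 15)]
scales = [1.0, 5.0]   # up-weight the smaller modality
```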
  • Patent number: 11797864
    Abstract: Systems and methods for training a conditional generator model are described. A method receives a sample and determines a discriminator loss for the received sample. The discriminator loss is based on an ability to determine whether the sample is generated by the conditional generator model or is a ground truth sample. The method determines a secondary loss for the generated sample and updates the conditional generator model based on an aggregate of the discriminator loss and the secondary loss.
    Type: Grant
    Filed: November 16, 2018
    Date of Patent: October 24, 2023
    Inventors: Shabab Bazrafkan, Peter Corcoran
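For 11797864, the generator update can be sketched in PyTorch as an aggregate of the adversarial (discriminator) loss and a secondary loss; the L1 secondary term, the conditional discriminator signature, and the weighting factor are assumptions, not the patent's prescribed choices.

```python
# Hypothetical sketch of the generator step: the discriminator loss and a
# secondary loss are aggregated before back-propagation.
import torch
import torch.nn.functional as F

def generator_step(generator, discriminator, condition, ground_truth, optimizer, lam=10.0):
    fake = generator(condition)
    # Discriminator loss: how well the generated sample passes as ground truth.
    adv_loss = F.binary_cross_entropy_with_logits(
        discriminator(fake, condition), torch.ones(fake.size(0), 1))
    secondary_loss = F.l1_loss(fake, ground_truth)     # task-specific secondary loss
    loss = adv_loss + lam * secondary_loss             # aggregate of the two losses
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```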
  • Patent number: 11790264
    Abstract: The present disclosure is directed to methods and systems for knowledge distillation.
    Type: Grant
    Filed: June 19, 2019
    Date of Patent: October 17, 2023
    Assignee: GOOGLE LLC
    Inventors: Thomas J. Duerig, Hongsheng Wang, Scott Alexander Rudkin
  • Patent number: 11790263
    Abstract: A system for program synthesis using annotations based on enumeration patterns includes a memory device for storing program code, and at least one processor device operatively coupled to the memory device. The at least one processor device is configured to execute program code stored on the memory device to obtain a set of annotated terms including one or more terms each annotated with an enumeration pattern, translate problem text into a formal specification using natural language processing, the formal specification being described as a set of rules associated with predicates, and synthesize one or more terms satisfying the set of rules of the formal specification based on the set of annotated terms to generate a computer program.
    Type: Grant
    Filed: February 25, 2019
    Date of Patent: October 17, 2023
    Assignee: INTERNATIONAL BUSINESS MACHINES CORPORATION
    Inventors: Futoshi Iwama, Takaaki Tateishi, Shin Saito
  • Patent number: 11775854
    Abstract: Systems, computer-implemented methods, and computer program products to facilitate characterizing crosstalk of a quantum computing system based on sparse data collection are provided. According to an embodiment, a system can comprise a memory that stores computer executable components and a processor that executes the computer executable components stored in the memory. The computer executable components can comprise a package component that packs subsets of quantum gates in a quantum device into one or more bins. The computer executable components can further comprise an assessment component that characterizes crosstalk of the quantum device based on a number of the one or more bins into which the subsets of quantum gates are packed.
    Type: Grant
    Filed: November 8, 2019
    Date of Patent: October 3, 2023
    Assignee: INTERNATIONAL BUSINESS MACHINES CORPORATION
    Inventors: Prakash Murali, Ali Javadiabhari, David C. Mckay
  • Patent number: 11769061
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for receiving a request from a client to process a computational graph; obtaining data representing the computational graph, the computational graph comprising a plurality of nodes and directed edges, wherein each node represents a respective operation, wherein each directed edge connects a respective first node to a respective second node that represents an operation that receives, as input, an output of an operation represented by the respective first node; identifying a plurality of available devices for performing the requested operation; partitioning the computational graph into a plurality of subgraphs, each subgraph comprising one or more nodes in the computational graph; and assigning, for each subgraph, the operations represented by the one or more nodes in the subgraph to a respective available device in the plurality of available devices for operation.
    Type: Grant
    Filed: June 11, 2020
    Date of Patent: September 26, 2023
    Assignee: Google LLC
    Inventors: Paul A. Tucker, Jeffrey Adgate Dean, Sanjay Ghemawat, Yuan Yu
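For 11769061, a toy Python sketch conveys the partition-and-assign idea: nodes of the computational graph are grouped into subgraphs and each subgraph's operations are placed on an available device, with cross-device edges marking where communication is needed. The naive contiguous chunking below is only illustrative.

```python
# Hypothetical sketch: partition a computational graph into subgraphs and
# assign each subgraph's operations to an available device.
def partition_and_assign(nodes, edges, devices):
    # nodes: ordered op names; edges: (src, dst) pairs; devices: device names
    chunk = max(1, len(nodes) // len(devices))
    subgraphs = [nodes[i:i + chunk] for i in range(0, len(nodes), chunk)]

    placement = {}
    for i, subgraph in enumerate(subgraphs):
        device = devices[min(i, len(devices) - 1)]
        for op in subgraph:
            placement[op] = device                    # ops in a subgraph share a device

    # Cross-device edges imply send/receive communication between devices.
    cross = [(s, d) for s, d in edges if placement[s] != placement[d]]
    return placement, cross

placement, cross_edges = partition_and_assign(
    ["read", "matmul1", "relu", "matmul2", "softmax"],
    [("read", "matmul1"), ("matmul1", "relu"), ("relu", "matmul2"), ("matmul2", "softmax")],
    ["gpu:0", "gpu:1"])
```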
  • Patent number: 11769036
    Abstract: An apparatus for optimizing a computational network is configured to receive an input at a first processing component. The first processing component may include at least a first programmable processing component and a second programmable processing component. The first programmable processing component is configured to compute a first nonlinear function, and the second programmable processing component is configured to compute a second nonlinear function that is different from the first nonlinear function. The computational network, which may be a recurrent neural network such as a long short-term memory, may be operated to generate an inference based at least in part on outputs of the first programmable processing component and the second programmable processing component.
    Type: Grant
    Filed: April 18, 2018
    Date of Patent: September 26, 2023
    Assignee: QUALCOMM Incorporated
    Inventors: Rosario Cammarota, Michael Goldfarb, Manu Rastogi, Sarang Ozarde
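For 11769036, a small numpy sketch shows an LSTM cell whose two nonlinearities are supplied as interchangeable function arguments, loosely mirroring the two programmable processing components; the weight layout and default functions are assumptions.

```python
# Hypothetical sketch: an LSTM cell whose sigmoid-like and tanh-like
# nonlinearities come from two separate "programmable" components (f1, f2).
import numpy as np

def lstm_cell(x, h, c, W, U, b, f1=lambda z: 1 / (1 + np.exp(-z)), f2=np.tanh):
    # W: (4H, D), U: (4H, H), b: (4H,); f1 and f2 stand in for the components.
    gates = W @ x + U @ h + b
    i, f, o, g = np.split(gates, 4)
    i, f, o = f1(i), f1(f), f1(o)          # gate activations from the first component
    g = f2(g)                              # candidate state from the second component
    c_new = f * c + i * g
    h_new = o * f2(c_new)
    return h_new, c_new
```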
  • Patent number: 11748622
    Abstract: A computing system is configured to access intermediate outputs of a neural network by augmenting a data flow graph generated for the neural network. The data flow graph includes a plurality of nodes interconnected by connections, each node representing an operation to be executed by the neural network. To access the intermediate output, the data flow graph is augmented by inserting a node representing an operation that saves the output of a node which produces the intermediate output. The node representing the save operation is inserted while maintaining all existing nodes and connections in the data flow graph, thereby preserving the behavior of the data flow graph. The augmenting can be performed using a compiler that generates the data flow graph from program code.
    Type: Grant
    Filed: March 4, 2019
    Date of Patent: September 5, 2023
    Assignee: Amazon Technologies, Inc.
    Inventors: Drazen Borkovic, Se jong Oh
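For 11748622, the graph augmentation can be sketched with a plain dictionary representation: a save node and a single new edge are added after the node whose intermediate output is wanted, while every existing node and connection is kept. The dict schema and names below are hypothetical.

```python
# Hypothetical sketch: augment a data flow graph with a node that saves an
# intermediate output, without removing or rewiring any existing node or edge.
def insert_save_node(graph, target_node, buffer_name):
    # graph: {"nodes": {name: op_info}, "edges": [(src, dst)]}
    save_node = f"save_{target_node}"
    graph["nodes"][save_node] = {"op": "save", "dest": buffer_name}
    # Only a new edge is added; all original nodes and connections are
    # preserved, so the graph's observable behavior is unchanged.
    graph["edges"].append((target_node, save_node))
    return graph

g = {"nodes": {"conv1": {"op": "conv"}, "relu1": {"op": "relu"}},
     "edges": [("conv1", "relu1")]}
insert_save_node(g, "conv1", buffer_name="dram_buf_0")
```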
  • Patent number: 11748411
    Abstract: A method, system, and computer-usable medium for providing cognitive insights, comprising: receiving data from a plurality of data sources, the plurality of data sources comprising a blockchain data source that provides blockchain data; processing the data from the plurality of data sources, the processing performing data enriching to provide enriched data; generating a cognitive session graph associated with a session, the cognitive session graph comprising at least some of the enriched data; and associating a cognitive blockchain with the cognitive session graph.
    Type: Grant
    Filed: April 10, 2020
    Date of Patent: September 5, 2023
    Assignee: Tecnotree Technologies, Inc.
    Inventors: Manoj Saxena, Matthew Sanchez, Richard Knuszka