Patents Examined by Selene A. Haedi
  • Patent number: 11829860
    Abstract: In one aspect, this specification describes a recurrent neural network system implemented by one or more computers that is configured to process input sets to generate neural network outputs for each input set. The input set can be a collection of multiple inputs for which the recurrent neural network should generate the same neural network output regardless of the order in which the inputs are arranged in the collection. The recurrent neural network system can include a read neural network, a process neural network, and a write neural network. In another aspect, this specification describes a system implemented as computer programs on one or more computers in one or more locations that is configured to train a recurrent neural network that receives a neural network input and sequentially emits outputs to generate an output sequence for the neural network input.
    Type: Grant
    Filed: February 24, 2022
    Date of Patent: November 28, 2023
    Assignee: Google LLC
    Inventors: Oriol Vinyals, Samuel Bengio
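    A minimal Python sketch of the order-invariant read/process/write pattern described in the abstract above; the attention-style process step, weights, and dimensions are invented for illustration and are not taken from the patent:
      import numpy as np
      rng = np.random.default_rng(0)
      d = 8                                    # embedding size (arbitrary)
      W_read = rng.normal(size=(d, d))         # "read": embeds each set element independently
      W_write = rng.normal(size=(d, 2))        # "write": maps the processed state to an output
      def softmax(z):
          e = np.exp(z - z.max())
          return e / e.sum()
      def process_set(inputs, steps=3):
          # Order-invariant read / process / write over a set of input vectors.
          memory = inputs @ W_read             # read each element of the set
          state = np.zeros(d)                  # processing state
          for _ in range(steps):               # "process": attention over the whole set
              attn = softmax(memory @ state)   # weights sum to 1 over set elements
              state = attn @ memory            # pooled read-out; independent of element order
          return state @ W_write               # "write" the network output
      x = rng.normal(size=(5, d))              # a set of 5 inputs
      assert np.allclose(process_set(x), process_set(x[::-1]))   # same output for any ordering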
  • Patent number: 11823014
    Abstract: A method for machine learning based database management is provided. The method may include training a machine learning model to detect an anomaly that is present and/or developing in a database system. The anomaly in the database system may be detected by at least processing, with a trained machine learning model, one or more performance metrics for the database system. In response to detecting the presence of the anomaly at the database system, one or more remedial actions may be determined for correcting and/or preventing the anomaly at the database system. The one or more remedial actions may further be sent to a database management system associated with the database system. Related systems and articles of manufacture are also provided.
    Type: Grant
    Filed: November 21, 2018
    Date of Patent: November 21, 2023
    Assignee: SAP SE
    Inventors: Rajendra Kumar, Heinz Wolf, Lohit Kumar A. P, Venkatesh R
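    A minimal Python sketch of the idea in the abstract above, assuming scikit-learn's IsolationForest as the trained anomaly model; the metric names, thresholds, and remedial actions are hypothetical:
      import numpy as np
      from sklearn.ensemble import IsolationForest
      rng = np.random.default_rng(1)
      # Hypothetical performance metrics: [cpu_util, memory_util, avg_query_ms].
      normal = rng.normal([0.4, 0.5, 20.0], [0.05, 0.05, 2.0], size=(500, 3))
      model = IsolationForest(random_state=0).fit(normal)   # trained on normal behaviour only
      def check(metrics):
          # Flag an anomaly and pick a (hypothetical) remedial action for it.
          if model.predict([metrics])[0] == 1:              # 1 = inlier, -1 = anomaly
              return "ok"
          cpu_util, memory_util, avg_query_ms = metrics
          if memory_util > 0.9:
              return "remedial action: increase buffer pool size"
          if avg_query_ms > 100:
              return "remedial action: rebuild stale indexes"
          return "remedial action: alert the DBA for review"
      print(check([0.42, 0.48, 21.0]))   # typical metrics -> likely "ok"
      print(check([0.95, 0.97, 250.0]))  # far outside training data -> remedial action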
  • Patent number: 11816427
    Abstract: Aspects of the present disclosure provide techniques for automated data classification error correction through machine learning. Embodiments include receiving a set of predicted labels corresponding to a set of consecutive text strings that appear in a particular order in a document, including: a first text string corresponding to a first predicted label; a second text string that follows the first text string in the particular order and corresponds to a second predicted label; and a third text string that follows the second text string in the particular order and corresponds to a third predicted label. Embodiments include providing inputs to a machine learning model based on: the third text string; the second text string; the second predicted label; and the first predicted label. Embodiments include determining a corrected third label for the third text string based on an output provided by the machine learning model in response to the inputs.
    Type: Grant
    Filed: October 27, 2022
    Date of Patent: November 14, 2023
    Assignee: INTUIT, INC.
    Inventors: Mithun Ghosh, Vignesh Thirukazhukundram Subrahmaniam
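    A minimal Python sketch of the correction step described above: a classifier receives the third text string, the second text string, and the two preceding predicted labels, and proposes a corrected third label. The document fields, labels, and feature encoding are invented for illustration:
      import numpy as np
      from sklearn.feature_extraction.text import CountVectorizer
      from sklearn.linear_model import LogisticRegression
      # Each row: (first_label, second_label, second_text, third_text, corrected_third_label).
      rows = [
          ("header", "date",   "01/02/2023", "$120.00",  "amount"),
          ("header", "date",   "03/04/2023", "$98.50",   "amount"),
          ("date",   "amount", "$45.00",     "Acme Inc", "vendor"),
          ("date",   "amount", "$77.10",     "Globex",   "vendor"),
      ]
      vec = CountVectorizer(analyzer="char_wb", ngram_range=(2, 3))
      X_text = vec.fit_transform([r[2] + " " + r[3] for r in rows]).toarray()
      labels = sorted({r[0] for r in rows} | {r[1] for r in rows})
      def onehot(lab):                             # encode the two preceding predicted labels
          v = np.zeros(len(labels)); v[labels.index(lab)] = 1.0; return v
      X_ctx = np.array([np.concatenate([onehot(r[0]), onehot(r[1])]) for r in rows])
      X = np.hstack([X_text, X_ctx])               # text features + preceding predicted labels
      corrector = LogisticRegression(max_iter=1000).fit(X, [r[4] for r in rows])
      # Propose a corrected label for a third string given its two predecessors' labels.
      new = ("header", "date", "05/06/2023", "$13.37")
      x_new = np.hstack([vec.transform([new[2] + " " + new[3]]).toarray()[0],
                         onehot(new[0]), onehot(new[1])])
      print(corrector.predict([x_new]))            # expected: ['amount']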
  • Patent number: 11797835
    Abstract: An explainable transducer transformer (XTT) may be a finite state transducer, together with an Explainable Transformer. Variants of the XTT may include an explainable Transformer-Encoder and an explainable Transformer-Decoder. An exemplary Explainable Transducer may be used as a partial replacement in trained Explainable Neural Network (XNN) architectures or logically equivalent architectures. An Explainable Transformer may replace black-box model components of a Transformer with white-box model equivalents, in both the sub-layers of the encoder and decoder layers of the Transformer. XTTs may utilize an Explanation and Interpretation Generation System (EIGS), to generate explanations and filter such explanations to produce an interpretation of the answer, explanation, and its justification.
    Type: Grant
    Filed: January 18, 2023
    Date of Patent: October 24, 2023
    Assignee: UMNAI Limited
    Inventors: Angelo Dalli, Matthew Grech, Mauro Pirrone
  • Patent number: 11797893
    Abstract: A non-transitory computer-readable recording medium stores a program that causes a computer to execute a process including: inputting input data including one or more records that have one of a plurality of formats, each of the plurality of formats including a plurality of items; generating conversion data by generating an integrated record having an integrated format from the one or more records; and causing a learner to execute a learning process using the conversion data as an input tensor, the learner performing deep learning by performing tensor decomposition on the input tensor.
    Type: Grant
    Filed: March 25, 2019
    Date of Patent: October 24, 2023
    Assignee: FUJITSU LIMITED
    Inventors: Takuya Nishino, Ryota Kikuchi
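    A minimal Python sketch of the record-integration step described above, with hypothetical record formats and item names; the tensor-decomposition learner itself is omitted:
      import numpy as np
      # Records arriving in two different (hypothetical) formats.
      format_a = [{"amount": 120.0, "days": 30},
                  {"amount": 75.5,  "days": 14}]
      format_b = [{"total": 88.0, "region_code": 3}]
      def integrate(records, rename):
          # Map each record's items onto a common item naming (the integrated format).
          return [{rename.get(k, k): v for k, v in rec.items()} for rec in records]
      integrated = integrate(format_a, {}) + integrate(format_b, {"total": "amount"})
      items = sorted({k for rec in integrated for k in rec})     # items of the integrated format
      tensor = np.array([[rec.get(k, 0.0) for k in items] for rec in integrated])
      print(items)          # ['amount', 'days', 'region_code']
      print(tensor.shape)   # (3, 3): one row per integrated record
      # The conversion data 'tensor' would then be handed to a learner that performs
      # deep learning with tensor decomposition on the input tensor.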
  • Patent number: 11797837
    Abstract: In an example, an apparatus comprises a plurality of execution units comprising at least a first type of execution unit and a second type of execution unit and logic, at least partially including hardware logic, to analyze a workload and assign the workload to one of the first type of execution unit or the second type of execution unit. Other embodiments are also disclosed and claimed.
    Type: Grant
    Filed: April 24, 2017
    Date of Patent: October 24, 2023
    Assignee: Intel Corporation
    Inventors: Altug Koker, Abhishek R. Appu, Kamal Sinha, Joydeep Ray, Balaji Vembu, Elmoustapha Ould-Ahmed-Vall, Sara S. Baghsorkhi, Anbang Yao, Kevin Nealis, Xiaoming Chen, John C. Weast, Justin E. Gottschlich, Prasoonkumar Surti, Chandrasekaran Sakthivel, Farshad Akhbari, Nadathur Rajagopalan Satish, Liwei Ma, Jeremy Bottleson, Eriko Nurvitadhi, Travis T. Schluessler, Ankur N. Shah, Jonathan Kennedy, Vasanth Ranganathan, Sanjeev Jahagirdar
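    A minimal Python sketch of analyzing a workload and assigning it to one of two execution unit types; the workload attributes, threshold, and unit names are invented and do not reflect the patented hardware logic:
      from dataclasses import dataclass
      @dataclass
      class Workload:
          name: str
          flops_per_byte: float   # arithmetic intensity (hypothetical metric)
          simd_friendly: bool
      def assign(w: Workload) -> str:
          # Assign a workload to one of two execution unit types (illustrative rule).
          if w.simd_friendly and w.flops_per_byte > 4.0:
              return "type-1 EU (wide vector / compute-optimized)"
          return "type-2 EU (scalar / latency-optimized)"
      for w in (Workload("gemm", 16.0, True), Workload("pointer_chase", 0.2, False)):
          print(w.name, "->", assign(w))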
  • Patent number: 11797890
    Abstract: Methods and apparatus disclosed herein autonomously evaluate replacement algorithms. An example system associated with a live environment executing a current algorithm includes at least one memory, instructions, and processor circuitry to execute the instructions to manage execution of the current algorithm in the live environment, the live environment associated with a deployment platform, and to manage, based on receipt of at least one potential replacement algorithm, a shadow environment that causes execution of the at least one potential replacement algorithm, the shadow environment instantiated separately from the live environment and utilizing data from the live environment during execution of the at least one potential replacement algorithm.
    Type: Grant
    Filed: October 3, 2022
    Date of Patent: October 24, 2023
    Assignee: GENERAL ELECTRIC COMPANY
    Inventors: Bradford Miller, Kirk Lars Bruns, Michael Kinstrey, Charles Theurer, Vrinda Rajiv
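    A minimal Python sketch of the shadow-evaluation idea above: the candidate algorithm runs separately from the live algorithm but on the same live data, and their metrics are compared. The algorithms, data, and promotion rule are all invented:
      import statistics
      def current_algorithm(x):      # running in the live environment
          return 2.0 * x
      def candidate_algorithm(x):    # potential replacement, executed only in shadow
          return 2.05 * x
      live_inputs = [1.0, 2.0, 3.0, 4.0]      # data produced by the live environment
      targets     = [2.0, 4.1, 6.0, 8.2]      # hypothetical ground truth
      def evaluate(algorithm, inputs, targets):
          # Run an algorithm over the live data and report mean absolute error.
          return statistics.mean(abs(algorithm(x) - t) for x, t in zip(inputs, targets))
      live_err   = evaluate(current_algorithm,   live_inputs, targets)
      shadow_err = evaluate(candidate_algorithm, live_inputs, targets)  # shadow uses live data
      print("promote candidate" if shadow_err < live_err else "keep current algorithm")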
  • Patent number: 11797888
    Abstract: A method includes receiving, by a processor, bias data categories. A data input from a user for classification in data categories is received. A classification machine learning model is utilized to classify the data input in at least one data category and determine a first confidence probability in a classification outcome. A bias filter machine learning model is utilized to determine a second confidence probability that the classification outcome of classifying the data input into the at least one data category is based on at least one bias characteristic associated with at least one bias data category. A gate machine learning model is utilized to determine when to output the classification outcome of classifying the data input into the at least one data category to a computing device of a user based at least in part on the first confidence probability, the second confidence probability, and a predefined bias threshold.
    Type: Grant
    Filed: July 2, 2021
    Date of Patent: October 24, 2023
    Assignee: Capital One Services, LLC
    Inventors: Austin Walters, Mark Watson, Jeremy Goodsitt, Anh Truong
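    A minimal Python sketch of the gating logic described above, with the classification and bias-filter models stubbed out; the confidence values, bias threshold, and category names are invented:
      def classification_model(data):
          # Stub: returns (predicted category, confidence that the classification is correct).
          return "loan_approved", 0.91
      def bias_filter_model(data, category):
          # Stub: confidence that the outcome relied on a bias characteristic.
          return 0.12
      def gate(data, bias_threshold=0.2, min_confidence=0.8):
          # Release the classification only if it is confident and unlikely to be bias-driven.
          category, p_correct = classification_model(data)
          p_biased = bias_filter_model(data, category)
          if p_correct >= min_confidence and p_biased <= bias_threshold:
              return category
          return "withheld for review"
      print(gate({"applicant_id": 42}))   # -> loan_approved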
  • Patent number: 11699064
    Abstract: A neural network system executable on a processor. The neural network system, when executed on the processor, comprises a merged layer shareable between a first neural network and a second neural network. The merged layer is configured to receive input data from a prior layer of at least one of the first and second neural networks. The merged layer is configured to apply a superset of weights to the input data to generate intermediate feature data representative of at least one feature of the input data, the superset of weights being combined from a first set of weights associated with the first neural network and a second set of weights associated with the second neural network. The merged layer is also configured to output the intermediate feature data to at least one subsequent layer, the at least one subsequent layer serving the first and second neural networks.
    Type: Grant
    Filed: April 23, 2019
    Date of Patent: July 11, 2023
    Assignee: Arm Limited
    Inventors: Daren Croxford, Roberto Lopez Mendez
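    A minimal Python sketch of a merged layer whose weight matrix is the superset of the two networks' weight sets, so the shared computation happens once; the dimensions and the slicing of the output are invented for illustration:
      import numpy as np
      rng = np.random.default_rng(0)
      d_in = 6
      W1 = rng.normal(size=(d_in, 4))    # weights the first network would use for this layer
      W2 = rng.normal(size=(d_in, 3))    # weights the second network would use
      W_merged = np.concatenate([W1, W2], axis=1)   # superset of weights, shared once
      x = rng.normal(size=d_in)                     # input from the prior layer
      features = x @ W_merged                       # intermediate feature data, computed once
      # Each subsequent layer takes the slice of the features its own network needs.
      features_net1 = features[:4]   # consumed by the first network's next layer
      features_net2 = features[4:]   # consumed by the second network's next layer
      assert np.allclose(features_net1, x @ W1) and np.allclose(features_net2, x @ W2)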
  • Patent number: 11676035
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for training a neural network. The neural network has a plurality of differentiable weights and a plurality of non-differentiable weights. One of the methods includes determining trained values of the plurality of differentiable weights and the non-differentiable weights by repeatedly performing operations that include determining an update to the current values of the plurality of differentiable weights using a machine learning gradient-based training technique and determining, using an evolution strategies (ES) technique, an update to the current values of a plurality of distribution parameters.
    Type: Grant
    Filed: January 23, 2020
    Date of Patent: June 13, 2023
    Assignee: DeepMind Technologies Limited
    Inventors: Karel Lenc, Karen Simonyan, Tom Schaul, Erich Konrad Elsen
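    A minimal Python sketch of one hybrid training loop: a gradient step for the differentiable weights and a basic evolution-strategies update for the distribution parameters of the non-differentiable weights; the toy loss and all hyperparameters are invented:
      import numpy as np
      rng = np.random.default_rng(0)
      w = np.zeros(3)                  # differentiable weights
      mu, sigma = np.zeros(2), 0.5     # distribution parameters for the non-differentiable weights
      lr_w, lr_mu, pop = 0.1, 0.05, 64
      def loss(w, nd):
          # Toy objective: both weight groups should reach 1.
          return np.sum((w - 1.0) ** 2) + np.sum((nd - 1.0) ** 2)
      for _ in range(200):
          # Gradient-based update for the differentiable weights.
          w -= lr_w * 2.0 * (w - 1.0)
          # Evolution strategies: perturb the distribution, score, estimate a search gradient.
          eps = rng.normal(size=(pop, mu.size))
          scores = np.array([loss(w, mu + sigma * e) for e in eps])
          scores = (scores - scores.mean()) / (scores.std() + 1e-8)
          mu -= lr_mu * (eps.T @ scores) / (pop * sigma)
      print(np.round(w, 2), np.round(mu, 1))   # w reaches [1. 1. 1.]; mu should end near [1. 1.]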
  • Patent number: 11669724
    Abstract: The subject matter described herein regards improving machine learning techniques using informed pseudolabels. A method can include receiving previously assigned labels indicating an expected classification for data, the labels having a specified uncertainty; generating respective pseudolabels for the data based on the previously assigned labels, the data, a class vector determined by an ML model, and a noise model indicating, based on the specified uncertainty, a likelihood of the previously assigned label given the class; and substituting the pseudolabels for the previously assigned labels in a next epoch of training the ML model.
    Type: Grant
    Filed: May 16, 2019
    Date of Patent: June 6, 2023
    Assignee: Raytheon Company
    Inventors: Philip A. Sallee, James Mullen, Franklin Tanner
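    A minimal Python sketch of forming an informed pseudolabel by combining the model's class vector with a noise model p(assigned label | true class); the classes, noise matrix, and combination rule are invented for illustration:
      import numpy as np
      classes = ["car", "truck", "background"]
      # Noise model: rows = true class, columns = previously assigned (noisy) label.
      # Entry [i, j] = p(assigned label j | true class i), reflecting the label uncertainty.
      noise = np.array([[0.80, 0.15, 0.05],
                        [0.20, 0.70, 0.10],
                        [0.05, 0.05, 0.90]])
      def informed_pseudolabel(class_probs, assigned_label):
          # Pseudolabel proportional to p(class | data) * p(assigned label | class).
          j = classes.index(assigned_label)
          posterior = class_probs * noise[:, j]
          return posterior / posterior.sum()
      model_probs = np.array([0.30, 0.60, 0.10])   # the ML model's class vector for one sample
      print(informed_pseudolabel(model_probs, "car"))
      # The pseudolabel blends the model's belief with how trustworthy the "car" label is;
      # it would replace the previously assigned label in the next training epoch.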
  • Patent number: 11657265
    Abstract: Described herein are systems and methods for training first and second neural network models. A system comprises a memory comprising instruction data representing a set of instructions and a processor configured to communicate with the memory and to execute the set of instructions. The set of instructions, when executed by the processor, cause the processor to: set a weight in the second model based on a corresponding weight in the first model; train the second model on a first dataset, wherein the training comprises updating the weight in the second model; and adjust the corresponding weight in the first model based on the updated weight in the second model.
    Type: Grant
    Filed: November 15, 2018
    Date of Patent: May 23, 2023
    Assignee: KONINKLIJKE PHILIPS N.V.
    Inventors: Binyam Gebre, Erik Bresch, Dimitrios Mavroeidis, Teun van den Heuvel, Ulf Grossekathöfer
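    A minimal Python sketch of the three steps described above on a one-parameter toy model; the dataset and the rule for adjusting the first model's weight are invented:
      import numpy as np
      rng = np.random.default_rng(0)
      w_first = 0.3                                  # corresponding weight in the first model
      # Step 1: set the weight in the second model based on the first model's weight.
      w_second = w_first
      # Step 2: train the second model on a first dataset (toy least squares on y = 2x).
      x = rng.normal(size=100)
      y = 2.0 * x + 0.05 * rng.normal(size=100)
      for _ in range(100):
          grad = np.mean(2.0 * (w_second * x - y) * x)
          w_second -= 0.1 * grad                     # updating the weight in the second model
      # Step 3: adjust the corresponding weight in the first model based on the updated
      # weight in the second model (the blending rule here is purely illustrative).
      w_first = 0.5 * w_first + 0.5 * w_second
      print(round(w_second, 2), round(w_first, 2))   # second reaches ~2.0; first moves toward it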
  • Patent number: 11645509
    Abstract: Embodiments for training a neural network using sequential tasks are provided. A plurality of sequential tasks are received. For each task in the plurality of tasks a copy of the neural network that includes a plurality of layers is generated. From the copy of the neural network a task specific neural network is generated by performing an architectural search on the plurality of layers in the copy of the neural network. The architectural search identifies a plurality of candidate choices in the layers of the task specific neural network. Parameters in the task specific neural network that correspond to the plurality of candidate choices and that maximize architectural weights at each layer are identified. The parameters are retrained and merged with the neural network, yielding a single neural network trained on the plurality of sequential tasks.
    Type: Grant
    Filed: October 31, 2018
    Date of Patent: May 9, 2023
    Assignee: Salesforce.com, Inc.
    Inventors: Yingbo Zhou, Xilai Li, Caiming Xiong
  • Patent number: 11640521
    Abstract: A multi-task feature sharing neural network-based intelligent fault diagnosis method has the following steps: (1) separately collecting original vibration acceleration signals of rotating machinery under different experimental conditions, forming samples by intercepting signal segments of a certain length, and labeling them; (2) constructing a multi-task feature sharing neural network having an input layer, a feature extractor, a classification model, and a prediction model; (3) using multi-task joint training to simultaneously train the classification model and the prediction model; and (4) inputting a vibration acceleration signal collected in an actual industrial environment into the trained models to obtain a multi-task diagnosis result.
    Type: Grant
    Filed: October 31, 2019
    Date of Patent: May 2, 2023
    Assignee: SOUTH CHINA UNIVERSITY OF TECHNOLOGY
    Inventors: Weihua Li, Zhen Wang, Ruyi Huang
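    A minimal Python sketch of multi-task joint training with a shared feature extractor feeding a classification head and a prediction head; the data, network sizes, and losses are toy stand-ins, not the patented diagnosis models:
      import numpy as np
      rng = np.random.default_rng(0)
      n, d, h = 200, 16, 8
      X = rng.normal(size=(n, d))                 # stand-in for fixed-length vibration segments
      y_cls = (X[:, 0] > 0).astype(float)         # toy fault class for the classification model
      y_reg = X[:, :4].sum(axis=1)                # toy degradation value for the prediction model
      W_shared = 0.1 * rng.normal(size=(d, h))    # shared feature extractor
      w_cls, w_reg, lr = np.zeros(h), np.zeros(h), 0.05
      for _ in range(300):                        # multi-task joint training
          F = np.tanh(X @ W_shared)               # shared features
          p = 1.0 / (1.0 + np.exp(-(F @ w_cls)))  # classification output
          r = F @ w_reg                           # prediction output
          # Both task losses propagate into the shared feature extractor.
          dF = (np.outer(p - y_cls, w_cls) + np.outer(r - y_reg, w_reg)) / n
          W_shared -= lr * X.T @ (dF * (1.0 - F ** 2))
          w_cls -= lr * F.T @ (p - y_cls) / n
          w_reg -= lr * F.T @ (r - y_reg) / n
      acc = np.mean(((np.tanh(X @ W_shared) @ w_cls) > 0) == y_cls)
      print("toy training accuracy:", round(float(acc), 2))   # should be well above chance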
  • Patent number: 11625570
    Abstract: A determination apparatus generates an interval vector having a plurality of components that are adjacent occurrence intervals between a plurality of events that have occurred in chronological order. The determination apparatus generates a plurality of local variable points each of which includes specific components as one set of coordinates, using a predetermined number of consecutive interval vectors in the chronological order. The determination apparatus generates a Betti sequence by applying persistent homology transform to the plurality of local variable points for which the interval vectors serving as starting points are different. The determination apparatus determines a type of the plurality of events based on the Betti sequence.
    Type: Grant
    Filed: November 28, 2018
    Date of Patent: April 11, 2023
    Assignee: FUJITSU LIMITED
    Inventor: Yuhei Umeda
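    A minimal Python sketch of the first two steps described above: interval vectors from adjacent event intervals, then local point clouds from consecutive interval vectors. The timestamps and window sizes are invented, and the persistent-homology/Betti-sequence step (which would use a TDA library) is omitted:
      import numpy as np
      # Timestamps (seconds) of events that occurred in chronological order.
      times = np.array([0.0, 1.1, 2.0, 3.2, 4.1, 5.3, 6.0, 7.2, 8.1])
      intervals = np.diff(times)                    # adjacent occurrence intervals
      window = 3                                    # components per interval vector
      vectors = np.array([intervals[i:i + window]   # each interval vector has 3 components
                          for i in range(len(intervals) - window + 1)])
      consecutive = 4                               # interval vectors per local point cloud
      point_clouds = [vectors[i:i + consecutive]    # each cloud: points in 3-D coordinate space
                      for i in range(len(vectors) - consecutive + 1)]
      print(vectors.shape, len(point_clouds), point_clouds[0].shape)
      # Persistent homology would then be applied to each point cloud and the resulting
      # Betti sequences used to determine the type of the events.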
  • Patent number: 11625573
    Abstract: A first neural network is operated on a processor and a memory to encode a first natural language string into a first sentence encoding including a set of word encodings. Using a word-based attention mechanism with a context vector, a weight value for a word encoding within the first sentence encoding is adjusted to form an adjusted first sentence encoding. Using a sentence-based attention mechanism, a first relationship encoding corresponding to the adjusted first sentence encoding is determined. An absolute difference between the first relationship encoding and a second relationship encoding is computed. Using a multi-layer perceptron, a degree of analogical similarity between the first relationship encoding and a second relationship encoding is determined.
    Type: Grant
    Filed: October 29, 2018
    Date of Patent: April 11, 2023
    Assignee: INTERNATIONAL BUSINESS MACHINES CORPORATION
    Inventors: Alfio Massimiliano Gliozzo, Gaetano Rossiello, Robert G. Farrell
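    A minimal Python sketch of the pipeline shape described above: word-level attention with a context vector, a relationship encoding per string, their absolute difference, and a small MLP score; all embeddings and weights are random stand-ins:
      import numpy as np
      rng = np.random.default_rng(0)
      d = 16
      context = rng.normal(size=d)                  # context vector for word-based attention
      W_rel = 0.1 * rng.normal(size=(d, d))         # sentence-level step -> relationship encoding
      W_mlp1, W_mlp2 = 0.1 * rng.normal(size=(d, 8)), 0.1 * rng.normal(size=8)
      def softmax(z):
          e = np.exp(z - z.max()); return e / e.sum()
      def relationship_encoding(word_encodings):
          # Word attention (with the context vector), then a sentence-level transform.
          attn = softmax(word_encodings @ context)       # adjusted per-word weights
          sentence = attn @ word_encodings               # adjusted sentence encoding
          return np.tanh(W_rel @ sentence)               # relationship encoding
      def analogical_similarity(words_a, words_b):
          diff = np.abs(relationship_encoding(words_a) - relationship_encoding(words_b))
          return 1.0 / (1.0 + np.exp(-(np.tanh(diff @ W_mlp1) @ W_mlp2)))   # small MLP score
      s1 = rng.normal(size=(5, d))                  # word encodings of the first string
      s2 = rng.normal(size=(7, d))                  # word encodings of the second string
      print(round(float(analogical_similarity(s1, s2)), 3))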
  • Patent number: 11615300
    Abstract: A neural network system includes an input layer, one or more hidden layers, and an output layer. A first layer circuit implements a first layer of the one or more hidden layers. The first layer includes a first weight space including one or more subgroups. A forward path circuit of the first layer circuit includes a multiply and accumulate circuit to receive an input from a layer preceding the first layer and provide a first subgroup weighted sum using the input and a first plurality of weights associated with a first subgroup. A scaling coefficient circuit provides a first scaling coefficient associated with the first subgroup and applies the first scaling coefficient to the first subgroup weighted sum to generate a first subgroup scaled weighted sum. An activation circuit generates an activation based on the first subgroup scaled weighted sum and provides the activation to a layer following the first layer.
    Type: Grant
    Filed: June 13, 2018
    Date of Patent: March 28, 2023
    Assignee: XILINX, INC.
    Inventors: Julian Faraone, Michaela Blott, Nicholas Fraser
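    A minimal Python sketch of a forward path that computes a weighted sum per weight subgroup, scales each subgroup's sum with its own coefficient, and applies an activation; the weight values, scales, and sizes are invented:
      import numpy as np
      rng = np.random.default_rng(0)
      d_in = 8
      # First layer's weight space split into two subgroups, each with its own scale.
      W_sub1 = rng.integers(-1, 2, size=(d_in, 4)).astype(float)   # e.g. low-precision weights
      W_sub2 = rng.integers(-1, 2, size=(d_in, 4)).astype(float)
      scale1, scale2 = 0.12, 0.07                                  # per-subgroup scaling coefficients
      def first_layer(x):
          # Multiply-accumulate per subgroup, scale each subgroup's sum, then activate.
          sum1 = x @ W_sub1                    # first subgroup weighted sum
          sum2 = x @ W_sub2                    # second subgroup weighted sum
          scaled = np.concatenate([scale1 * sum1, scale2 * sum2])  # subgroup scaled weighted sums
          return np.maximum(scaled, 0.0)       # activation passed to the following layer
      x = rng.normal(size=d_in)                # input from the preceding layer
      print(first_layer(x))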
  • Patent number: 11610110
    Abstract: Systems, computer program products, and methods are described herein for de-conflicting data labeling in real-time deep learning systems. The present invention is configured to retrieve one or more dynamically generated expert profiles and determine an optimal expert mix of experts to classify a transaction into a transaction type, wherein the expert profiles comprise: (i) shared information metrics, (ii) divergence metrics, (iii) characteristics associated with the one or more experts, (iv) a predictive accuracy of the one or more experts, (v) an exposure score associated with the one or more experts, and (vi) information associated with the transaction, wherein the optimal expert mix comprises: (i) a best expert for classifying the transaction, (ii) a combination score from at least the portion of the one or more experts evaluating the transaction simultaneously, and (iii) a sequence of at least the portion of the one or more experts analyzing the transaction.
    Type: Grant
    Filed: December 5, 2018
    Date of Patent: March 21, 2023
    Assignee: BANK OF AMERICA CORPORATION
    Inventors: Eren Kursun, William David Kahn
  • Patent number: 11599794
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for few-shot learning-based generator training are disclosed. An exemplary method may start with obtaining a teacher model and a plurality of training samples, as well as a generator for generating more training samples. After generating a plurality of additional training samples using the generator, the method may continue with feeding the plurality of generated additional training samples into the teacher model to obtain a plurality of first statistics, and feeding the plurality of training samples into the teacher model to obtain a plurality of second statistics. The method further includes training the generator to minimize a distance between the plurality of first statistics and the plurality of second statistics.
    Type: Grant
    Filed: October 20, 2021
    Date of Patent: March 7, 2023
    Assignee: Moffett International Co., Limited
    Inventor: Enxu Yan
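    A minimal Python sketch of training a generator so that the teacher's statistics on generated samples match its statistics on the few real samples; the teacher, the statistics, and the random-search update (standing in for gradient-based training) are all toy stand-ins:
      import numpy as np
      rng = np.random.default_rng(0)
      def teacher(x):
          # Stand-in teacher model: per-feature activations whose statistics are matched.
          return np.tanh(x)
      real = rng.normal(loc=1.5, scale=0.5, size=(8, 4))   # the few available training samples
      target_stats = teacher(real).mean(axis=0)            # teacher statistics on the real samples
      def stat_distance(gen_mu):
          # Distance between the teacher's statistics on generated vs. real samples.
          generated = gen_mu + 0.3 * rng.normal(size=(256, 4))   # generated additional samples
          return np.sum((teacher(generated).mean(axis=0) - target_stats) ** 2)
      gen_mu = np.zeros(4)            # generator parameter: the mean of its samples
      for _ in range(300):            # train the generator to shrink the statistic distance
          step = 0.1 * rng.normal(size=4)
          if stat_distance(gen_mu + step) < stat_distance(gen_mu):
              gen_mu = gen_mu + step
      print(np.round(gen_mu, 1))      # should have moved from 0 toward the real-data mean (~1.5)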
  • Patent number: 11593631
    Abstract: An explainable transducer transformer (XTT) may be a finite state transducer, together with an Explainable Transformer. Variants of the XTT may include an explainable Transformer-Encoder and an explainable Transformer-Decoder. An exemplary Explainable Transducer may be used as a partial replacement in trained Explainable Neural Network (XNN) architectures or logically equivalent architectures. An Explainable Transformer may replace black-box model components of a Transformer with white-box model equivalents, in both the sub-layers of the encoder and decoder layers of the Transformer. XTTs may utilize an Explanation and Interpretation Generation System (EIGS), to generate explanations and filter such explanations to produce an interpretation of the answer, explanation, and its justification.
    Type: Grant
    Filed: December 17, 2021
    Date of Patent: February 28, 2023
    Assignee: UMNAI Limited
    Inventors: Angelo Dalli, Matthew Grech, Mauro Pirrone