Patents Examined by Hal Schnee
  • Patent number: 11966831
    Abstract: Embodiments relate to a first processing node that processes input data having a temporal sequence of spatial patterns by retaining a higher-level context of the temporal sequence. The first processing node performs temporal processing based at least on feedback inputs received from a second processing node. The first processing node determines whether learned temporal sequences are included in the input data based on sequence inputs transmitted within the same level of a hierarchy of processing nodes and the feedback inputs received from an upper level of the hierarchy of processing nodes.
    Type: Grant
    Filed: November 5, 2021
    Date of Patent: April 23, 2024
    Assignee: Numenta, Inc.
    Inventors: Jeffrey C. Hawkins, Subutai Ahmad
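    The node described above combines bottom-up spatial input, lateral sequence input from the same hierarchy level, and top-down feedback to judge whether a learned temporal sequence is present. The toy sketch below illustrates only that combination; the sequence store, scoring weights, and threshold are invented for illustration and are not Numenta's algorithm.

```python
# Toy sketch of a processing node that checks whether the current input, together
# with lateral sequence input and top-down feedback, continues a learned temporal
# sequence. Purely illustrative; not the patented method.

class SequenceNode:
    def __init__(self, learned_sequences, threshold=0.5):
        self.learned_sequences = learned_sequences  # list of tuples of symbols
        self.threshold = threshold
        self.history = []                           # recent inputs seen by this node

    def step(self, spatial_input, lateral_sequence_input, feedback_context):
        """Return True if a learned sequence is judged to be present."""
        self.history.append(spatial_input)
        window = tuple(self.history[-3:])           # short temporal window
        # lateral_sequence_input: same-level prediction strength (0..1, assumed)
        # feedback_context: higher-level context signal (0..1, assumed)
        match = max(self._match(window, seq) for seq in self.learned_sequences)
        score = 0.5 * match + 0.3 * lateral_sequence_input + 0.2 * feedback_context
        return score >= self.threshold

    @staticmethod
    def _match(window, seq):
        # Best in-order overlap of the window with any position in the sequence.
        n, best = len(window), 0
        for i in range(len(seq) - n + 1):
            best = max(best, sum(a == b for a, b in zip(window, seq[i:i + n])))
        return best / n if n else 0.0


node = SequenceNode(learned_sequences=[("A", "B", "C", "D")])
print(node.step("A", 0.2, 0.1), node.step("B", 0.8, 0.9), node.step("C", 0.9, 0.9))
```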
  • Patent number: 11948686
    Abstract: A computer-implemented method comprising: obtaining text from text-based messages sent between a patient and a therapist providing psychological therapy; determining at least one feature of the text; and determining a characteristic of the patient and/or the therapist using the at least one feature.
    Type: Grant
    Filed: May 6, 2021
    Date of Patent: April 2, 2024
    Assignee: IESO DIGITAL HEALTH LIMITED
    Inventors: Guy James Proctor Beauchamp, Ann Gail Hayes, Christine Howes, Rosemarie McCabe, Barnaby Adam Perks, Matthew Richard John Purver, Sarah Elisabeth Bateup
  • Patent number: 11922277
    Abstract: A high accuracy information extracting device construction system includes: a feature quantity extraction expression list generating unit for generating a feature quantity extraction expression list; a feature quantity calculating unit for calculating feature quantities of teacher data by means of respective feature quantity extracting expressions; a teacher data supply unit for supplying teacher data; an evaluation value calculating unit for generating information extracting expressions by means of machine learning on the basis of the calculated feature quantities of teacher data and the teacher data, and calculating evaluation values for the respective feature quantity extracting expressions; and a synthesis unit for constructing a high accuracy information extracting device using T weak information extracting parts F(X)t output from the evaluation value calculating unit and confidence levels Ct corresponding thereto.
    Type: Grant
    Filed: July 6, 2018
    Date of Patent: March 5, 2024
    Assignee: OSAKA UNIVERSITY
    Inventor: Aya Nakae
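    The synthesis step above combines T weak information extracting parts F(X)t with their confidence levels Ct. A confidence-weighted, boosting-style combination can be sketched as follows; the weak extractors, data, and confidence values are placeholders, not the patented feature quantity extraction expressions.

```python
import numpy as np

# Confidence-weighted combination of T weak extractors F_t(X) with confidences C_t.
# The weak extractors below are arbitrary placeholders for illustration only.

def weak_extractors():
    return [lambda X: np.sign(X[:, 0] - 0.5),          # F_1: threshold on feature 0
            lambda X: np.sign(X[:, 1] - 0.3),          # F_2: threshold on feature 1
            lambda X: np.sign(X.mean(axis=1) - 0.4)]   # F_3: threshold on the mean

def strong_extractor(X, extractors, confidences):
    # F(X) = sign( sum_t C_t * F_t(X) ), the usual boosted combination.
    votes = sum(c * f(X) for c, f in zip(confidences, extractors))
    return np.sign(votes)

X = np.random.rand(5, 2)
C = [0.9, 0.4, 0.7]                                    # confidence levels C_t (assumed)
print(strong_extractor(X, weak_extractors(), C))
```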
  • Patent number: 11915126
    Abstract: Dynamic data quantization may be applied to minimize the power consumption of a system that implements a convolutional neural network (CNN). Under such a quantization scheme, a quantized representation of a 3×3 array of m-bit activation values may include 9 n-bit mantissa values and one exponent shared between the n-bit mantissa values (n<m); and a quantized representation of a 3×3 kernel with p-bit parameter values may include 9 q-bit mantissa values and one exponent shared between the q-bit mantissa values (q<p). Convolution of the kernel with the activation data may include computing a dot product of the 9 n-bit mantissa values with the 9 q-bit mantissa values, and summing the shared exponents. In a CNN with multiple kernels, multiple computing units (each corresponding to one of the kernels) may receive the quantized representation of the 3×3 array of m-bit activation values from the same quantization-alignment module.
    Type: Grant
    Filed: September 4, 2020
    Date of Patent: February 27, 2024
    Assignee: Recogni Inc.
    Inventors: Jian hui Huang, James Michael Bodwin, Pradeep R. Joginipally, Shabarivas Abhiram, Gary S. Goldman, Martin Stefan Patz, Eugene M. Feinberg, Berend Ozceri
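    The quantization scheme above can be pictured as block floating point: each 3×3 block keeps nine low-bit mantissas plus one shared exponent, so the convolution reduces to an integer dot product of mantissas and a sum of exponents. A rough numpy sketch follows; the bit widths, exponent choice, and rounding are assumptions, not the patented hardware scheme.

```python
import numpy as np

def quantize_block(block, mantissa_bits):
    """Quantize a 3x3 block to signed mantissas sharing one exponent."""
    max_abs = np.max(np.abs(block))
    if max_abs == 0:
        return np.zeros_like(block, dtype=np.int32), 0
    # Shared exponent chosen so the largest value fits in the mantissa range.
    exp = int(np.ceil(np.log2(max_abs))) - (mantissa_bits - 1)
    mantissas = np.round(block / 2.0 ** exp).astype(np.int32)
    limit = 2 ** (mantissa_bits - 1) - 1
    return np.clip(mantissas, -limit, limit), exp

def quantized_conv3x3(activations, kernel, n_bits=4, q_bits=4):
    a_m, a_e = quantize_block(activations, n_bits)     # 9 n-bit mantissas + exponent
    k_m, k_e = quantize_block(kernel, q_bits)          # 9 q-bit mantissas + exponent
    # Integer dot product of mantissas, scaled by the summed shared exponents.
    return int(np.sum(a_m * k_m)) * 2.0 ** (a_e + k_e)

acts = np.random.randn(3, 3).astype(np.float32)
kern = np.random.randn(3, 3).astype(np.float32)
print(quantized_conv3x3(acts, kern), float(np.sum(acts * kern)))  # approximate match
```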
  • Patent number: 11900245
    Abstract: Systems, devices, and methods are disclosed for decision making based on plasticity rules of a neural network. A method may include obtaining a multilayered model. The multilayered model may include an input layer including one or more input units. The multilayered model may include one or more hidden layers including one or more hidden units. Each input unit may have a first connection with at least one hidden unit. The multilayered model may include an output layer including one or more output units. The method may also include receiving an input at a first input unit. The method may include sending a first signal from the first input unit to at least one hidden unit via a first connection comprising a first strength. The method may also include making a decision based on the model receiving the input.
    Type: Grant
    Filed: August 30, 2018
    Date of Patent: February 13, 2024
    Assignee: THE REGENTS OF THE UNIVERSITY OF CALIFORNIA
    Inventors: Steven Skorheim, Maksim Bazhenov, Pavel Sanda
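    A minimal sketch of the layered model above: input units send signals to hidden units over connections with strengths, the decision is read from the output layer, and a simple Hebbian-style rule adjusts the first-connection strengths. The layer sizes and the particular plasticity rule are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Layer sizes input -> hidden -> output are arbitrary for this sketch.
W_in_hid = rng.normal(scale=0.1, size=(4, 8))    # "first connections" with strengths
W_hid_out = rng.normal(scale=0.1, size=(8, 3))

def decide(x):
    hidden = np.tanh(x @ W_in_hid)               # signal sent over first connections
    output = hidden @ W_hid_out
    return int(np.argmax(output)), hidden        # decision = most active output unit

def hebbian_update(x, hidden, lr=0.01):
    # Assumed plasticity rule: strengthen connections between co-active units.
    global W_in_hid
    W_in_hid += lr * np.outer(x, hidden)

x = rng.random(4)
decision, hidden = decide(x)
hebbian_update(x, hidden)
print("decision:", decision)
```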
  • Patent number: 11887016
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for providing actionable suggestions are disclosed. In one aspect, a method includes receiving (i) an indication that an event detection module has determined that a shared event of a particular type is presently occurring or has occurred, and (ii) data referencing an attribute associated with the shared event. The method includes selecting, from among multiple output templates that are each associated with a different type of shared event, a particular output template associated with the particular type of shared event detected by the module. The method generates a notification for output using at least (i) the selected particular output template, and (ii) the data referencing the attribute associated with the shared event. The method then provides, for output to a user device, the notification that is generated.
    Type: Grant
    Filed: November 23, 2020
    Date of Patent: January 30, 2024
    Assignee: GOOGLE LLC
    Inventors: Daniel M. Keysers, Victor Carbune, Thomas Deselaers
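    The selection step above can be pictured as a lookup keyed by the detected event type, with the event attribute filled into the chosen template. The event types, template wording, and attribute names below are invented for illustration.

```python
# Hypothetical output templates keyed by shared-event type; the event types,
# template wording, and attribute names are assumptions for illustration.
OUTPUT_TEMPLATES = {
    "group_dinner": "Your dinner at {attribute} is coming up. Split the bill?",
    "shared_ride":  "Your ride with {attribute} has ended. Rate the trip?",
}

def generate_notification(event_type, attribute):
    template = OUTPUT_TEMPLATES[event_type]        # select template by event type
    return template.format(attribute=attribute)    # fill in the event attribute

print(generate_notification("group_dinner", "Luigi's"))
```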
  • Patent number: 11880758
    Abstract: Disclosed herein are neural networks for generating target classifications for an object from a set of input sequences. Each input sequence includes a respective input at each of multiple time steps, and each input sequence corresponds to a different sensing subsystem of multiple sensing subsystems. For each time step in the multiple time steps and for each input sequence in the set of input sequences, a respective feature representation is generated for the input sequence by processing the respective input from the input sequence at the time step using a respective encoder recurrent neural network (RNN) subsystem for the sensing subsystem that corresponds to the input sequence. For each time step in at least a subset of the multiple time steps, the respective feature representations are processed using a classification neural network subsystem to select a respective target classification for the object at the time step.
    Type: Grant
    Filed: August 2, 2021
    Date of Patent: January 23, 2024
    Assignee: Waymo LLC
    Inventors: Congcong Li, Ury Zhilinsky, Yun Jiang, Zhaoyin Jia
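    A compact PyTorch sketch of the architecture above: one encoder RNN per sensing subsystem produces a per-time-step feature representation, and a classification subnetwork combines those representations into a per-time-step target classification. The use of GRUs, the layer sizes, and the two example subsystems are assumptions.

```python
import torch
import torch.nn as nn

class MultiSensorClassifier(nn.Module):
    def __init__(self, input_sizes, hidden_size=32, num_classes=4):
        super().__init__()
        # One encoder RNN per sensing subsystem (GRUs chosen arbitrarily here).
        self.encoders = nn.ModuleList(
            [nn.GRU(d, hidden_size, batch_first=True) for d in input_sizes])
        self.classifier = nn.Linear(hidden_size * len(input_sizes), num_classes)

    def forward(self, sequences):
        # sequences: list of tensors, each (batch, time, input_size_i)
        feats = [enc(seq)[0] for enc, seq in zip(self.encoders, sequences)]
        fused = torch.cat(feats, dim=-1)        # per-time-step feature fusion
        return self.classifier(fused)           # (batch, time, num_classes)

model = MultiSensorClassifier(input_sizes=[6, 10])   # e.g. lidar + camera features
lidar = torch.randn(2, 5, 6)
camera = torch.randn(2, 5, 10)
logits = model([lidar, camera])
print(logits.argmax(dim=-1).shape)                   # per-time-step classifications
```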
  • Patent number: 11868885
    Abstract: According to an embodiment, a learning device includes a memory and one or more processors coupled to the memory. The one or more processors are configured to: generate a transformation matrix from learning data in which feature quantities and target values are held in a corresponding manner; and learn about parameters of a neural network which includes nodes equal in number to the number of rows of the transformation matrix, a first output layer representing first estimation distribution according to the values of the nodes, and a second output layer representing second estimation distribution decided according to the product of the transformation matrix and the first estimation distribution.
    Type: Grant
    Filed: August 21, 2020
    Date of Patent: January 9, 2024
    Assignee: KABUSHIKI KAISHA TOSHIBA
    Inventors: Yuichi Kato, Kouta Nakata, Susumu Naito, Yasunori Taguchi, Kentaro Takagi
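    The relation between the two output layers above can be written directly: the first layer yields a distribution over the node values, and the second estimation distribution is obtained by multiplying that distribution with the transformation matrix. A numpy sketch follows, with a fixed placeholder matrix standing in for the one generated from learning data and the product interpreted as row vector times matrix (an assumption about orientation).

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Transformation matrix generated (in the patent) from feature/target learning data;
# here it is a fixed placeholder with 3 rows and 5 columns.
T = np.array([[0.6, 0.2, 0.1, 0.1, 0.0],
              [0.1, 0.5, 0.2, 0.1, 0.1],
              [0.0, 0.1, 0.2, 0.3, 0.4]])

node_values = np.array([0.3, -1.2, 0.8])     # one node per row of T
first_estimation = softmax(node_values)      # first output layer (length 3)
second_estimation = first_estimation @ T     # second layer: product with T (length 5)
print(second_estimation)
```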
  • Patent number: 11868888
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for training a document classification neural network. One of the methods includes training an autoencoder neural network to autoencode input documents, wherein the autoencoder neural network comprises the one or more LSTM neural network layers and an autoencoder output layer, and wherein training the autoencoder neural network comprises determining pre-trained values of the parameters of the one or more LSTM neural network layers from initial values of the parameters of the one or more LSTM neural network layers; and training the document classification neural network on a plurality of training documents to determine trained values of the parameters of the one or more LSTM neural network layers from the pre-trained values of the parameters of the one or more LSTM neural network layers.
    Type: Grant
    Filed: December 13, 2021
    Date of Patent: January 9, 2024
    Assignee: Google LLC
    Inventors: Andrew M. Dai, Quoc V. Le
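    The two-phase scheme above (pretrain shared LSTM layers as an autoencoder, then reuse them in a document classifier) can be outlined structurally in PyTorch. The embedding, single LSTM layer, and sizes are assumptions, and the actual training loops are omitted.

```python
import torch
import torch.nn as nn

VOCAB, EMBED, HIDDEN, CLASSES = 1000, 64, 128, 5     # assumed sizes

embedding = nn.Embedding(VOCAB, EMBED)
shared_lstm = nn.LSTM(EMBED, HIDDEN, batch_first=True)  # layers shared by both phases

# Phase 1: autoencoder = shared LSTM + reconstruction output layer.
autoencoder_head = nn.Linear(HIDDEN, VOCAB)          # predicts the input tokens back

# Phase 2: classifier = the same (now pre-trained) LSTM + classification head.
classifier_head = nn.Linear(HIDDEN, CLASSES)

def encode(tokens):
    # tokens: (batch, time) integer ids
    states, _ = shared_lstm(embedding(tokens))
    return states                                    # (batch, time, HIDDEN)

docs = torch.randint(0, VOCAB, (2, 12))
reconstruction_logits = autoencoder_head(encode(docs))   # used during pre-training
class_logits = classifier_head(encode(docs)[:, -1])      # last-step output -> doc class
print(reconstruction_logits.shape, class_logits.shape)
```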
  • Patent number: 11853436
    Abstract: Mechanisms are provided for obfuscating training of trained cognitive model logic. The mechanisms receive input data for classification into one or more classes in a plurality of predefined classes as part of a cognitive operation of the cognitive system. The input data is processed by applying a trained cognitive model to the input data to generate an output vector having values for each of the plurality of predefined classes. A perturbation insertion engine modifies the output vector by inserting a perturbation in a function associated with generating the output vector, to thereby generate a modified output vector. The modified output vector is then output. The perturbation modifies one or more of the values to obfuscate the trained configuration of the trained cognitive model logic while maintaining accuracy of classification of the input data.
    Type: Grant
    Filed: April 15, 2021
    Date of Patent: December 26, 2023
    Assignee: International Business Machines Corporation
    Inventors: Taesung Lee, Ian M. Molloy, Dong Su
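    The key property above is that the inserted perturbation changes the output vector without changing which class wins. The toy sketch below shows one way to enforce that property; the Gaussian noise model is an assumption, not the patented perturbation function.

```python
import numpy as np

rng = np.random.default_rng(42)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def perturbed_output(scores, noise_scale=0.05):
    """Add noise to the class scores but keep the top class unchanged."""
    original = softmax(scores)
    top = int(np.argmax(original))
    for _ in range(100):                             # resample until argmax is preserved
        candidate = softmax(scores + rng.normal(scale=noise_scale, size=scores.shape))
        if int(np.argmax(candidate)) == top:
            return candidate
    return original                                  # fall back to the unperturbed vector

scores = np.array([2.1, 0.3, 1.9, -0.5])
print(perturbed_output(scores))   # obfuscated probabilities, same predicted class
```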
  • Patent number: 11853893
    Abstract: A method includes generating, by a processor of a computing device, a first plurality of models (including a first number of models) based on a genetic algorithm and corresponding to a first epoch of the genetic algorithm. The method includes determining whether to modify an epoch size for the genetic algorithm during a second epoch of the genetic algorithm based on a convergence metric associated with at least one epoch that is prior to the second epoch. The second epoch is subsequent to the first epoch. The method further includes, based on determining to modify the epoch size, generating a second plurality of models (including a second number of models that is different than the first number) based on the genetic algorithm and corresponding to the second epoch. Each model of the first plurality of models and the second plurality of models includes data representative of neural networks.
    Type: Grant
    Filed: June 1, 2021
    Date of Patent: December 26, 2023
    Assignee: SPARKCOGNITION, INC.
    Inventors: Sari Andoni, Keith D. Moore, Elmira M. Bonab, Junhwan Choi
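    The epoch-size logic above can be sketched as a small genetic-algorithm loop that widens the population when a convergence metric stalls and shrinks it when fitness is still improving. The fitness function, mutation scheme, and resizing rule are placeholders, and each "model" is a plain parameter vector rather than an encoded neural network.

```python
import random

def fitness(model):
    # Placeholder objective: negated squared distance from a target value.
    return -sum((x - 0.7) ** 2 for x in model)

def evolve(population):
    # Keep the best half, refill by mutating randomly chosen survivors.
    survivors = sorted(population, key=fitness, reverse=True)[:max(2, len(population) // 2)]
    children = [[x + random.gauss(0, 0.1) for x in random.choice(survivors)]
                for _ in range(len(population) - len(survivors))]
    return survivors + children

population = [[random.random() for _ in range(3)] for _ in range(20)]
best_prev = fitness(max(population, key=fitness))
epoch_size = 20
for epoch in range(10):
    population = evolve(population)
    best = fitness(max(population, key=fitness))
    # Convergence metric: improvement of the best fitness over the previous epoch.
    if best - best_prev < 1e-3:
        epoch_size = min(2 * epoch_size, 200)        # stalled: widen the search
    else:
        epoch_size = max(10, epoch_size // 2)        # improving: shrink to save compute
    population = (population * ((epoch_size // len(population)) + 1))[:epoch_size]
    best_prev = best
    print(epoch, round(best, 4), epoch_size)
```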
  • Patent number: 11842271
    Abstract: Methods and systems for allocating network resources responsive to network traffic include modeling spatial correlations between fine spatial granularity traffic and coarse spatial granularity traffic for different sites and regions to determine spatial feature vectors for one or more sites in a network. Temporal correlations at a fine spatial granularity are modeled across multiple temporal scales, based on the spatial feature vectors. Temporal correlations at a coarse spatial granularity are modeled across multiple temporal scales, based on the spatial feature vectors. A traffic flow prediction is determined for the one or more sites in the network, based on the temporal correlations at the fine spatial granularity and the temporal correlations at the coarse spatial granularity. Network resources are provisioned at the one or more sites in accordance with the traffic flow prediction.
    Type: Grant
    Filed: August 26, 2020
    Date of Patent: December 12, 2023
    Assignee: NEC Corporation
    Inventors: Yanchi Liu, Wei Cheng, Bo Zong, LuAn Tang, Haifeng Chen, Denghui Zhang
  • Patent number: 11829884
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for performing a machine learning task on a network input that is a sequence to generate a network output. In one aspect, one of the methods includes, for each particular sequence of layer inputs: for each attention layer in the neural network: maintaining episodic memory data; maintaining compressed memory data; receiving a layer input to be processed by the attention layer; and applying an attention mechanism over (i) the compressed representation in the compressed memory data for the layer, (ii) the hidden states in the episodic memory data for the layer, and (iii) the respective hidden state at each of the plurality of input positions in the particular network input to generate a respective activation for each input position in the layer input.
    Type: Grant
    Filed: September 25, 2020
    Date of Patent: November 28, 2023
    Assignee: DeepMind Technologies Limited
    Inventors: Jack William Rae, Anna Potapenko, Timothy Paul Lillicrap
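    A numpy sketch of the attention step above: queries from the current layer input attend over the concatenation of the compressed memory, the episodic memory, and the current hidden states. Single-head dot-product attention and the memory sizes are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 16                                         # hidden size (assumed)

def softmax(z, axis=-1):
    e = np.exp(z - z.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

compressed_memory = rng.normal(size=(4, d))    # coarse summaries of old activations
episodic_memory = rng.normal(size=(8, d))      # recent hidden states kept verbatim
layer_input = rng.normal(size=(6, d))          # hidden states of the current input

# Attend over [compressed memory ; episodic memory ; current hidden states].
keys = values = np.concatenate([compressed_memory, episodic_memory, layer_input])
queries = layer_input
weights = softmax(queries @ keys.T / np.sqrt(d))   # (6, 18) attention weights
activations = weights @ values                     # one activation per input position
print(activations.shape)
```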
  • Patent number: 11829876
    Abstract: A trained computer model includes a direct network and an indirect network. The indirect network generates expected weights or an expected weight distribution for the nodes and layers of the direct network. These expected characteristics may be used to regularize training of the direct network weights and encourage the direct network weights towards those expected, or predicted by the indirect network. Alternatively, the expected weight distribution may be used to probabilistically predict the output of the direct network according to the likelihood of different weights or weight sets provided by the expected weight distribution. The output may be generated by sampling weight sets from the distribution and evaluating the sampled weight sets.
    Type: Grant
    Filed: October 28, 2021
    Date of Patent: November 28, 2023
    Assignee: Uber Technologies, Inc.
    Inventors: Zoubin Ghahramani, Douglas Bemis, Theofanis Karaletsos
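    The "indirect network" above acts as a hypernetwork that emits an expected weight distribution for the direct network, and the output can be predicted by sampling weight sets from that distribution and averaging. A numpy sketch with a one-layer direct network and per-weight Gaussians, both assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Indirect-network output, stood in for here by fixed per-weight Gaussian parameters
# (mean and standard deviation) for a 3x2 direct-network weight matrix.
weight_mean = rng.normal(size=(3, 2))
weight_std = 0.1 * np.ones((3, 2))

def direct_network(x, W):
    return np.tanh(x @ W)                      # one-layer direct network (assumed)

def predict(x, n_samples=100):
    # Sample weight sets from the expected weight distribution, average the outputs.
    outputs = [direct_network(x, rng.normal(weight_mean, weight_std))
               for _ in range(n_samples)]
    return np.mean(outputs, axis=0)

x = rng.normal(size=(1, 3))
print(predict(x))
```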
  • Patent number: 11831138
    Abstract: A fault-arc identification method, device, apparatus, and storage medium are disclosed. The method comprises: sampling a target arc at a high frequency to obtain a high-frequency sampling signal (S11); preprocessing the high-frequency sampling signal to obtain a processed sampling signal (S12); performing feature extraction on the processed sampling signal to obtain a target arc feature (S13); and inputting the target arc feature to a neural network model to obtain a target output result, and determining, according to the target output result, whether the target arc is a fault arc (S14). Sampling the target arc at a high frequency captures more arc features from the target arc, and because a neural network model has strong data classification capability, using it to evaluate the target arc improves the accuracy and reliability of the fault-arc detection result.
    Type: Grant
    Filed: December 24, 2020
    Date of Patent: November 28, 2023
    Assignee: QINGDAO TOPSCOMM COMMUNICATION CO., LTD
    Inventors: Huarong Wang, Jianhua Wang, Yue Ma
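    Steps S11-S14 above map onto a small pipeline: sample at high frequency, preprocess, extract features, classify. The filter, feature set, and thresholding classifier below are placeholders; in the patent a trained neural network model performs the final determination.

```python
import numpy as np

rng = np.random.default_rng(0)

def preprocess(signal):
    # S12: simple moving-average denoising (placeholder preprocessing).
    kernel = np.ones(5) / 5
    return np.convolve(signal, kernel, mode="same")

def extract_features(signal):
    # S13: a few assumed arc features computed from the processed samples.
    spectrum = np.abs(np.fft.rfft(signal))
    return np.array([signal.std(),
                     np.abs(np.diff(signal)).mean(),
                     spectrum[1:50].mean()])

def classify(features):
    # S14: placeholder for the trained neural network model.
    return float(features[1] > 0.5)          # 1.0 -> fault arc, 0.0 -> normal

samples = rng.normal(size=10_000)            # S11: high-frequency samples of the arc
features = extract_features(preprocess(samples))
print("fault arc detected:", bool(classify(features)))
```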
  • Patent number: 11823058
    Abstract: A method includes obtaining a set of training samples. During each of a plurality of training iterations, the method also includes sampling a batch of training samples from the set of training samples. The method includes, for each training sample in the batch of training samples, determining, using a data value estimator, a selection probability. The selection probability for the training sample is based on estimator parameter values of the data value estimator. The method also includes selecting, based on the selection probabilities of each training sample, a subset of training samples from the batch of training samples, and determining, using a predictor model with the subset of training samples, performance measurements. The method also includes adjusting model parameter values of the predictor model based on the performance measurements, and updating the estimator parameter values of the data value estimator based on the performance measurements.
    Type: Grant
    Filed: September 18, 2020
    Date of Patent: November 21, 2023
    Assignee: Google LLC
    Inventors: Sercan Omer Arik, Jinsung Yoon, Tomas Jon Pfister
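    One way to picture the training loop above: the data value estimator assigns each sample in the batch a selection probability, a subset is sampled accordingly, a predictor is fit on the subset, and the measured performance drives updates to both models. The linear estimator, the least-squares predictor, and the score-function update below are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 2-feature binary classification.
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)
X_val, y_val = X[:50], y[:50]
X_train, y_train = X[50:], y[50:]

estimator_w = np.zeros(2)                    # data value estimator (linear, assumed)

def selection_probs(Xb):
    return 1.0 / (1.0 + np.exp(-(Xb @ estimator_w)))

def train_predictor(Xs, ys):
    # Placeholder predictor: least-squares linear model on the selected subset.
    w, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
    return w

accuracy = 0.0
for it in range(50):
    idx = rng.choice(len(X_train), size=32, replace=False)   # batch of training samples
    Xb, yb = X_train[idx], y_train[idx]
    probs = selection_probs(Xb)
    selected = rng.random(len(Xb)) < probs                   # sample a subset
    if selected.sum() < 2:
        continue
    w = train_predictor(Xb[selected], yb[selected])
    accuracy = np.mean(((X_val @ w) > 0.5) == y_val)         # performance measurement
    # Score-function update: push selection probabilities up or down depending on
    # whether the measured performance beat a fixed baseline.
    grad = ((accuracy - 0.7) * (selected.astype(float) - probs)[:, None] * Xb).mean(axis=0)
    estimator_w += 0.5 * grad
print("validation accuracy:", accuracy)
```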
  • Patent number: 11823068
    Abstract: An assistant, executing at at least one processor, is described that determines content for a conversation with a user of a computing device and selects, based on the content and information associated with the user, a modality for signaling initiation of the conversation with the user. The assistant is further described as causing, in the selected modality, a signaling of the conversation with the user.
    Type: Grant
    Filed: February 3, 2020
    Date of Patent: November 21, 2023
    Assignee: GOOGLE LLC
    Inventors: Vikram Aggarwal, Deniz Binay
  • Patent number: 11809971
    Abstract: Disclosed are systems and methods for autonomous computing that replace or augment a human user of computer programs without relying on access to the internal operations of the computer program. An application controller can use the display output of a computer program to determine a current state of the computer program. For example, the identities of menu options of the computer program can be determined from image frames obtained from its display output and used to determine the current state. The application controller can then provide input commands that drive the computer program from the current state to a destination state.
    Type: Grant
    Filed: April 16, 2021
    Date of Patent: November 7, 2023
    Inventor: Curtis Ray Robinson, Jr.
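    The control loop above can be sketched as: map the menu options detected in the current frame to a program state, then search a state graph for the input commands that reach the destination state. The option sets, states, transitions, and commands below are invented for illustration.

```python
from collections import deque

# Assumed mapping from the set of menu options detected on screen to a program state.
STATE_BY_OPTIONS = {
    frozenset({"New", "Open", "Exit"}): "main_menu",
    frozenset({"Name", "Address", "Save"}): "edit_form",
    frozenset({"Confirm", "Cancel"}): "confirm_dialog",
}

# Assumed state graph: state -> {input command: next state}.
TRANSITIONS = {
    "main_menu": {"click New": "edit_form"},
    "edit_form": {"click Save": "confirm_dialog", "press Esc": "main_menu"},
    "confirm_dialog": {"click Confirm": "main_menu"},
}

def current_state(detected_options):
    return STATE_BY_OPTIONS[frozenset(detected_options)]

def plan_commands(start, goal):
    """Breadth-first search for the input commands leading from start to goal."""
    queue, seen = deque([(start, [])]), {start}
    while queue:
        state, commands = queue.popleft()
        if state == goal:
            return commands
        for command, nxt in TRANSITIONS.get(state, {}).items():
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, commands + [command]))
    return None

state = current_state({"New", "Open", "Exit"})     # state inferred from the frame
print(plan_commands(state, "confirm_dialog"))      # ['click New', 'click Save']
```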
  • Patent number: 11803737
    Abstract: The present disclosure relates to a neural network system comprising: a controller including a processing unit configured to execute a spiking neural network, and an interface connecting the controller to an external memory. The controller is configured for executing the spiking neural network, the executing comprising generating read instructions and/or write instructions. The interface is configured for: generating read weighting vectors according to the read instructions, coupling read signals, representing the read weighting vectors, into input lines of the memory, thereby retrieving data from the memory, generating write weighting vectors according to the write instructions, coupling write signals, representing the write weighting vectors, into output lines of the memory, thereby writing data into the memory.
    Type: Grant
    Filed: July 2, 2020
    Date of Patent: October 31, 2023
    Assignee: International Business Machines Corporation
    Inventors: Thomas Bohnstingl, Angeliki Pantazi, Stanislaw Andrzej Wozniak, Evangelos Stavros Eleftheriou
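    The read and write paths above resemble weighting-based memory addressing: a read weighting vector combines memory rows into a read-out, and a write weighting vector distributes new data across rows. The numpy sketch below abstracts away the coupling of signals into the memory's input and output lines; the softmax weighting is an assumption.

```python
import numpy as np

rng = np.random.default_rng(0)
memory = rng.normal(size=(8, 16))            # external memory: 8 rows of width 16

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def read(memory, read_scores):
    w = softmax(read_scores)                 # read weighting vector over rows
    return w @ memory                        # weighted combination of memory rows

def write(memory, write_scores, data):
    w = softmax(write_scores)                # write weighting vector over rows
    return memory + np.outer(w, data)        # distribute the data across rows

# Read/write instructions would come from the spiking neural network controller;
# here the scores and data are random stand-ins.
retrieved = read(memory, rng.normal(size=8))
memory = write(memory, rng.normal(size=8), rng.normal(size=16))
print(retrieved.shape, memory.shape)
```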
  • Patent number: 11803741
    Abstract: Provided herein is an integrated circuit including, in some embodiments, a special-purpose host processor, a neuromorphic co-processor, and a communications interface between the host processor and the co-processor configured to transmit information therebetween. The special-purpose host processor can be operable as a stand-alone processor. The neuromorphic co-processor may include an artificial neural network. The co-processor is configured to enhance the special-purpose processing of the host processor through the artificial neural network. In some embodiments, the host processor is a pattern identifier processor configured to transmit one or more detected patterns to the co-processor over the communications interface. The co-processor is configured to transmit the recognized patterns back to the host processor.
    Type: Grant
    Filed: February 13, 2019
    Date of Patent: October 31, 2023
    Assignee: SYNTIANT
    Inventors: Kurt F. Busch, Pieter Vorenkamp, Stephen W. Bailey, Jeremiah H. Holleman, III