Patents Examined by Eric Nilsson
-
Patent number: 12380324
Abstract: An electronic calculator comprises a plurality of electronic calculation blocks, each of which is configured to implement one or more respective processing layers of an artificial neural network. The calculation blocks are of at least two different types among: a first type with fixed topology, fixed operation, and fixed parameters; a second type with fixed topology, fixed operation, and modifiable parameters; and a third type with modifiable topology, modifiable operation, and modifiable parameters. For each processing layer implemented by the respective calculation block, the topology is a connection topology for each artificial neuron, the operation is a type of processing to be performed for each artificial neuron, and the parameters include values determined via training of the neural network.
Type: Grant
Filed: September 15, 2021
Date of Patent: August 5, 2025
Assignee: Commissariat à l'énergie atomique et aux énergies alternatives
Inventors: Vincent Lorrain, Olivier Bichler, David Briand, Johannes Christian Thiele
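As a rough illustration of the block taxonomy in this abstract, the sketch below models the three block types as objects whose topology, operation, and parameters are either fixed or modifiable. The class, field names, and example values are assumptions for illustration, not taken from the patent.

```python
# Minimal sketch (not the patented circuit): three block categories distinguished
# by whether topology, operation, and parameters may be changed after deployment.
from dataclasses import dataclass, field

@dataclass
class CalculationBlock:
    name: str
    topology_fixed: bool
    operation_fixed: bool
    parameters_fixed: bool
    parameters: dict = field(default_factory=dict)

    def set_parameters(self, new_params):
        if self.parameters_fixed:
            raise RuntimeError(f"{self.name}: parameters are fixed")
        self.parameters.update(new_params)

# Hypothetical instances of the three types described in the abstract.
type1 = CalculationBlock("fixed_stage", topology_fixed=True, operation_fixed=True, parameters_fixed=True)
type2 = CalculationBlock("tunable_stage", topology_fixed=True, operation_fixed=True, parameters_fixed=False)
type3 = CalculationBlock("programmable_stage", topology_fixed=False, operation_fixed=False, parameters_fixed=False)

type2.set_parameters({"weights": [0.1, -0.4, 0.7]})    # allowed: type 2 has modifiable parameters
try:
    type1.set_parameters({"weights": [1.0]})           # rejected: type 1 is fully fixed
except RuntimeError as e:
    print(e)
```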
-
Patent number: 12367391
Abstract: Methods, systems, and apparatus for selecting actions to be performed by an agent interacting with an environment. One system includes a high-level controller neural network, a low-level controller neural network, and a subsystem. The high-level controller neural network receives an input observation and processes the input observation to generate a high-level output defining a control signal for the low-level controller. The low-level controller neural network receives a designated component of an input observation and processes the designated component and an input control signal to generate a low-level output that defines an action to be performed by the agent in response to the input observation.
Type: Grant
Filed: December 27, 2023
Date of Patent: July 22, 2025
Assignee: DeepMind Technologies Limited
Inventors: Nicolas Manfred Otto Heess, Timothy Paul Lillicrap, Gregory Duncan Wayne, Yuval Tassa
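The data flow described here (full observation into the high-level controller, designated observation component plus control signal into the low-level controller) can be sketched with toy linear "networks." The layer sizes, the slice used as the designated component, and the random weights are assumptions for illustration only.

```python
# Minimal sketch of the hierarchical control flow in the abstract, using toy
# linear networks in NumPy rather than trained models.
import numpy as np

rng = np.random.default_rng(0)
OBS_DIM, CTRL_DIM, ACT_DIM, DESIGNATED = 8, 3, 2, slice(0, 4)

W_high = rng.normal(size=(CTRL_DIM, OBS_DIM))          # high-level controller weights
W_low  = rng.normal(size=(ACT_DIM, 4 + CTRL_DIM))      # low-level controller weights

def high_level(observation):
    # Processes the full observation into a control signal for the low-level controller.
    return np.tanh(W_high @ observation)

def low_level(observation, control_signal):
    # Uses only a designated component of the observation plus the control signal.
    designated = observation[DESIGNATED]
    return np.tanh(W_low @ np.concatenate([designated, control_signal]))

obs = rng.normal(size=OBS_DIM)
action = low_level(obs, high_level(obs))               # action returned to the agent
print("action:", action)
```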
-
Patent number: 12367427
Abstract: Methods, computing systems, and computer-readable media for robust classification using active learning and domain knowledge are disclosed. In embodiments described herein, global feature data (such as a list of keywords) is generated for use in a classification task (such as an NLP text classification task). Expert knowledge, based on decisions made by human users, is combined with existing domain knowledge, which may be derived from existing trained classification models in the problem domain, such as keyword models trained using various datasets. By combining the expert knowledge with the domain knowledge, global feature data may be generated that is more effective in performing the classification task than either a classifier using the expert knowledge alone or a classifier using the domain knowledge alone.
Type: Grant
Filed: September 3, 2021
Date of Patent: July 22, 2025
Assignee: HUAWEI TECHNOLOGIES CO., LTD.
Inventors: Gopi Krishnan Rajbahadur, Haoxiang Zhang, Jack Zhenming Jiang
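One way to picture combining expert keywords with keywords mined from existing domain models is a simple merged, scored keyword list. The scoring rule, the weight given to expert input, and the example keywords below are assumptions for illustration, not the patented method.

```python
# Minimal sketch of merging expert-provided keywords with keywords derived from
# existing domain models into one global keyword list.
def combine_keywords(expert_keywords, domain_keywords, expert_weight=2.0):
    scores = {}
    for kw in domain_keywords:
        scores[kw] = scores.get(kw, 0.0) + 1.0
    for kw in expert_keywords:
        scores[kw] = scores.get(kw, 0.0) + expert_weight
    # Keep keywords ranked by combined support from both knowledge sources.
    return sorted(scores, key=scores.get, reverse=True)

expert = {"refund", "chargeback"}                       # from decisions made by human users
domain = {"refund", "invoice", "payment", "billing"}    # from existing trained keyword models
print(combine_keywords(expert, domain))
```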
-
Patent number: 12361329
Abstract: An iterative attention-based neural network training and processing method and system iteratively applies a focus of attention of a trained neural network on syntactical elements and generates probabilities associated with representations of the syntactical elements, which in turn inform a subsequent focus of attention of the neural network, resulting in updated probabilities. The updated probabilities are then applied to generate syntactical elements for delivery to a user. The user may respond to the delivered syntactical elements, providing additional training information to the trained neural network.
Type: Grant
Filed: December 6, 2024
Date of Patent: July 15, 2025
Inventor: Steven D. Flinn
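This abstract (which recurs in several of the related patents below) describes an attend, score, deliver, and feed-back loop. The sketch below shows that loop shape only; the softmax scoring, the random weights, and the feedback update rule are illustrative assumptions, not the patented training procedure.

```python
# Minimal sketch of an iterative attend -> score -> deliver -> feedback loop over
# syntactical elements, with a stand-in for learned attention weights.
import math, random

elements = ["the", "model", "updates", "its", "focus"]
weights = {e: random.random() for e in elements}        # stand-in for learned attention

def probabilities(ws):
    z = sum(math.exp(v) for v in ws.values())
    return {e: math.exp(v) / z for e, v in ws.items()}

for step in range(3):
    probs = probabilities(weights)
    delivered = max(probs, key=probs.get)               # element delivered to the user
    feedback = random.choice([+0.5, -0.5])              # hypothetical user response signal
    weights[delivered] += feedback                      # feedback informs the next focus
    print(f"step {step}: delivered '{delivered}' (p={probs[delivered]:.2f})")
```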
-
Patent number: 12361305
Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for determining neural network architectures. One of the methods includes generating, using a controller neural network having controller parameters and in accordance with current values of the controller parameters, a batch of output sequences. The method includes, for each output sequence in the batch: generating an instance of a child convolutional neural network (CNN) that includes multiple instances of a first convolutional cell having an architecture defined by the output sequence; training the instance of the child CNN to perform an image processing task; and evaluating a performance of the trained instance of the child CNN on the task to determine a performance metric for the trained instance of the child CNN; and using the performance metrics for the trained instances of the child CNN to adjust current values of the controller parameters of the controller neural network.
Type: Grant
Filed: April 20, 2023
Date of Patent: July 15, 2025
Assignee: Google LLC
Inventors: Vijay Vasudevan, Barret Zoph, Jonathon Shlens, Quoc V. Le
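The search loop described here (controller proposes cells, child networks are trained and evaluated, and the scores adjust the controller) can be sketched as follows. The operation list, the stand-in scoring function, and the simple preference update are assumptions; a real system would train child CNNs and update a controller network's parameters.

```python
# Minimal sketch of the controller / child-network search loop: sample a cell
# description, "train and evaluate" it with a placeholder scorer, and nudge the
# controller toward higher-scoring choices.
import random

OPERATIONS = ["conv3x3", "conv5x5", "maxpool", "identity"]
controller = {op: 1.0 for op in OPERATIONS}             # unnormalized preferences

def sample_cell():
    total = sum(controller.values())
    r, acc = random.uniform(0, total), 0.0
    for op, w in controller.items():
        acc += w
        if r <= acc:
            return op

def evaluate_child(cell):
    # Placeholder for training a child CNN and measuring validation performance.
    base = {"conv3x3": 0.90, "conv5x5": 0.85, "maxpool": 0.60, "identity": 0.50}
    return base[cell] + random.uniform(-0.05, 0.05)

for _ in range(20):                                     # batches of sampled architectures
    cell = sample_cell()
    reward = evaluate_child(cell)
    controller[cell] += 0.1 * reward                    # reinforce well-performing cells

print("preferred cell:", max(controller, key=controller.get))
```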
-
Patent number: 12361328
Abstract: An iterative attention-based neural network training and processing method and system iteratively applies a focus of attention of a trained neural network on syntactical elements and generates probabilities associated with representations of the syntactical elements, which in turn inform a subsequent focus of attention of the neural network, resulting in updated probabilities. The updated probabilities are then applied to generate syntactical elements for delivery to a user. The user may respond to the delivered syntactical elements, providing additional training information to the trained neural network.
Type: Grant
Filed: December 6, 2024
Date of Patent: July 15, 2025
Inventor: Steven D. Flinn
-
Patent number: 12340442
Abstract: A method for applying a style to an input image to generate a stylized image. The method includes maintaining data specifying respective parameter values for each image style in a set of image styles, receiving an input including an input image and data identifying an input style to be applied to the input image to generate a stylized image that is in the input style, determining, from the maintained data, parameter values for the input style, and generating the stylized image by processing the input image using a style transfer neural network that is configured to process the input image to generate the stylized image.
Type: Grant
Filed: September 6, 2023
Date of Patent: June 24, 2025
Assignee: Google LLC
Inventors: Jonathon Shlens, Vincent Dumoulin, Manjunath Kudlur Venkatakrishna
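The core flow here is: maintain per-style parameter values, look up the parameters for the requested style, and condition the image processing on them. The sketch below illustrates that lookup-and-apply pattern only; the single scale-and-shift transform stands in for a full style transfer network, and the style names and values are assumptions.

```python
# Minimal sketch of maintaining per-style parameters and applying them to an
# input image; the transform is a toy stand-in for a style transfer network.
import numpy as np

style_params = {                                        # maintained per-style parameter values
    "style_a": {"scale": 1.3, "shift": 0.10},
    "style_b": {"scale": 0.8, "shift": -0.05},
}

def stylize(image, style_name):
    p = style_params[style_name]                        # parameters for the requested style
    features = image - image.mean()                     # toy "feature" normalization
    return p["scale"] * features + p["shift"]           # style-conditioned transform

input_image = np.random.rand(4, 4)
stylized = stylize(input_image, "style_a")
print(stylized.shape)
```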
-
Patent number: 12340296
Abstract: An iterative attention-based neural network training and processing method and system iteratively applies a focus of attention of a trained neural network on syntactical elements and generates probabilities associated with representations of the syntactical elements, which in turn inform a subsequent focus of attention of the neural network, resulting in updated probabilities. The updated probabilities are then applied to generate syntactical elements for delivery to a user. The user may respond to the delivered syntactical elements, providing additional training information to the trained neural network.
Type: Grant
Filed: November 27, 2024
Date of Patent: June 24, 2025
Inventor: Steven D. Flinn
-
Patent number: 12333421
Abstract: A synaptic circuit according to an embodiment is a circuit in which a weight value changed by learning is set. The synaptic circuit receives a binary input signal from a pre-synaptic neuron circuit and outputs an output signal to a post-synaptic neuron circuit. The synaptic circuit includes a propagation circuit and a control circuit. The propagation circuit supplies, to the post-synaptic neuron circuit, the output signal obtained by adding an influence of the weight value to the input signal. The control circuit stops output of the output signal from the propagation circuit to the post-synaptic neuron circuit when the weight value is smaller than a predetermined reference value.
Type: Grant
Filed: August 30, 2021
Date of Patent: June 17, 2025
Assignee: Kabushiki Kaisha Toshiba
Inventors: Kumiko Nomura, Yoshifumi Nishi, Takao Marukame, Koichi Mizushima
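The gating behavior described (propagate a weighted contribution only when the weight meets a reference value) reduces to a small conditional, sketched below. The threshold, the binary-input convention, and the way the weight "influence" is added are illustrative assumptions, not the circuit design.

```python
# Minimal sketch of the synaptic gating: the control circuit suppresses the output
# when the weight is below the reference; otherwise the weighted input propagates.
def synapse_output(input_bit, weight, reference=0.2):
    if weight < reference:
        return None                                     # control circuit stops propagation
    return weight if input_bit else 0.0                 # propagation circuit adds weight influence

for w in (0.05, 0.5):
    print(f"weight={w}: output={synapse_output(1, w)}")
```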
-
Patent number: 12333407
Abstract: An iterative attention-based neural network training and processing method and system iteratively applies a focus of attention of a trained neural network on syntactical elements and generates probabilities associated with representations of the syntactical elements, which in turn inform a subsequent focus of attention of the neural network, resulting in updated probabilities. The updated probabilities are then applied to generate syntactical elements for delivery to a user. The user may respond to the delivered syntactical elements, providing additional training information to the trained neural network.
Type: Grant
Filed: November 27, 2024
Date of Patent: June 17, 2025
Inventor: Steven D. Flinn
-
Patent number: 12327166
Abstract: An iterative attention-based neural network training and processing method and system iteratively applies a focus of attention of a trained neural network on syntactical elements and generates probabilities associated with representations of the syntactical elements, which in turn inform a subsequent focus of attention of the neural network, resulting in updated probabilities. The updated probabilities are then applied to generate syntactical elements for delivery to a user. The user may respond to the delivered syntactical elements, providing additional training information to the trained neural network.
Type: Grant
Filed: November 27, 2024
Date of Patent: June 10, 2025
Inventor: Steven D. Flinn
-
Patent number: 12321845
Abstract: An iterative attention-based neural network training and processing method and system iteratively applies a focus of attention of a trained neural network on syntactical elements and generates probabilities associated with representations of the syntactical elements, which in turn inform a subsequent focus of attention of the neural network, resulting in updated probabilities. The updated probabilities are then applied to generate syntactical elements for delivery to a user. The user may respond to the delivered syntactical elements, providing additional training information to the trained neural network.
Type: Grant
Filed: November 27, 2024
Date of Patent: June 3, 2025
Inventor: Steven D. Flinn
-
Patent number: 12314834
Abstract: An iterative attention-based neural network training and processing method and system iteratively applies a focus of attention of a trained neural network on syntactical elements and generates probabilities associated with representations of the syntactical elements, which in turn inform a subsequent focus of attention of the neural network, resulting in updated probabilities. The updated probabilities are then applied to generate syntactical elements for delivery to a user. The user may respond to the delivered syntactical elements, providing additional training information to the trained neural network.
Type: Grant
Filed: August 20, 2024
Date of Patent: May 27, 2025
Inventor: Steven D Flinn
-
Patent number: 12299600
Abstract: In an aspect, provided is a method comprising monitoring one or more data analysis sessions, determining, based on the monitoring, a common data analysis technique performed across common data analysis sessions, identifying the common data analysis technique as a precedent, and providing the precedent to a precedent engine.
Type: Grant
Filed: September 8, 2022
Date of Patent: May 13, 2025
Assignee: QlikTech International AB
Inventors: Mohsen Rais-Ghasem, Elif Tutuk
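A simple way to picture "a common technique performed across sessions" is counting which analysis steps recur. The session logs and the majority-of-sessions rule below are assumptions for illustration; the patent does not specify this particular detection rule.

```python
# Minimal sketch of flagging a recurring analysis technique as a precedent
# candidate across monitored sessions.
from collections import Counter

sessions = [
    ["filter_nulls", "group_by_region", "sum_sales"],
    ["filter_nulls", "group_by_region", "plot_trend"],
    ["group_by_region", "sum_sales"],
]

counts = Counter(step for session in sessions for step in set(session))
precedents = [step for step, n in counts.items() if n > len(sessions) / 2]
print(precedents)                                       # candidates to hand to a precedent engine
```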
-
Patent number: 12293291
Abstract: A system for time series analysis using attention models is disclosed. The system may capture dependencies across different variables through input embedding and may map the order of a sample appearance to a randomized lookup table via positional encoding. The system may capture dependencies within a single sequence through a self-attention mechanism and determine a range of dependency to consider for each position being analyzed. The system may obtain an attention weighting to other positions in the sequence through computation of an inner product and utilize the attention weighting to acquire a vector representation for a position and mask the sequence to enable causality. The system may employ a dense interpolation technique for encoding partial temporal ordering to obtain a single vector representation and a linear layer to obtain logits from the single vector representation. The system may use a type-dependent final prediction layer.
Type: Grant
Filed: July 5, 2023
Date of Patent: May 6, 2025
Assignees: Arizona Board of Regents on Behalf of Arizona State University, Lawrence Livermore National Security, LLC
Inventors: Andreas Spanias, Huan Song, Jayaraman J. Thiagarajan, Deepta Rajan
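The pipeline named here (inner-product self-attention with a causal mask, dense interpolation into a single vector, then a linear layer producing logits) can be sketched numerically as below. The dimensions, random weights, number of interpolation factors, and the particular interpolation weighting are assumptions for illustration, not the claimed system.

```python
# Minimal sketch: causally masked self-attention over a toy sequence, dense
# interpolation into a single vector, and a linear layer producing logits.
import numpy as np

rng = np.random.default_rng(0)
T, D, M = 6, 4, 3                                       # sequence length, model dim, interpolation factors
x = rng.normal(size=(T, D))                             # embedded input sequence

# Self-attention via inner products, masked so each position attends only to the past.
scores = x @ x.T / np.sqrt(D)
mask = np.triu(np.ones((T, T), dtype=bool), k=1)
scores[mask] = -np.inf
attn = np.exp(scores - scores.max(axis=-1, keepdims=True))
attn /= attn.sum(axis=-1, keepdims=True)
h = attn @ x                                            # attention-weighted representations

# Dense interpolation: blend all positions into M factors, then flatten to one vector.
u = np.zeros((M, D))
for t in range(T):
    s = M * (t + 1) / T
    for m in range(1, M + 1):
        u[m - 1] += (1 - abs(s - m) / M) ** 2 * h[t]
single_vector = u.reshape(-1)

logits = single_vector @ rng.normal(size=(M * D, 2))    # linear layer to logits
print(logits)
```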
-
Patent number: 12293270
Abstract: An iterative attention-based neural network training and processing method and system iteratively applies a focus of attention of a trained neural network on syntactical elements and generates probabilities associated with representations of the syntactical elements, which in turn inform a subsequent focus of attention of the neural network, resulting in updated probabilities. The updated probabilities are then applied to generate syntactical elements for delivery to a user. The user may respond to the delivered syntactical elements, providing additional training information to the trained neural network.
Type: Grant
Filed: August 20, 2024
Date of Patent: May 6, 2025
Inventor: Steven D Flinn
-
Patent number: 12282303
Abstract: A system and methods for multivariant learning and optimization repeatedly generate self-organized experimental units (SOEUs) based on one or more assumptions for a randomized multivariate comparison of process decisions to be provided to users of a system. The SOEUs are injected into the system to generate quantified inferences about the process decisions. Responsive to injecting the SOEUs, at least one confidence interval is identified within the quantified inferences, and the SOEUs are iteratively modified based on the at least one confidence interval to identify at least one causal interaction of the process decisions within the system. The causal interaction can be used for testing, diagnosis, and optimization of the system performance.
Type: Grant
Filed: September 11, 2019
Date of Patent: April 22, 2025
Assignee: 3M Innovative Properties Company
Inventors: Gilles J. Benoit, Brian E. Brooks, Peter O. Olson, Tyler W. Olson
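The iterate-on-confidence-intervals idea can be pictured as an inject, measure, and prune loop over candidate process decisions. The simulated responses, the normal-approximation interval, and the pruning rule below are assumptions for illustration, not the patented procedure.

```python
# Minimal sketch of an inject -> measure -> confidence-interval -> refine loop
# over randomized experimental units for two candidate process decisions.
import random, statistics

def run_unit(decision):
    # Placeholder for injecting a self-organized experimental unit into the system.
    effect = {"A": 0.60, "B": 0.45}[decision]
    return effect + random.gauss(0, 0.1)

decisions = ["A", "B"]
for _ in range(3):
    results = {d: [run_unit(d) for _ in range(30)] for d in decisions}
    intervals = {}
    for d, xs in results.items():
        mean = statistics.mean(xs)
        half = 1.96 * statistics.stdev(xs) / len(xs) ** 0.5
        intervals[d] = (mean - half, mean + half)
    # Drop decisions whose upper bound falls below the best lower bound.
    best_lower = max(lo for lo, _ in intervals.values())
    decisions = [d for d, (_, hi) in intervals.items() if hi >= best_lower]
    if len(decisions) == 1:
        break
print("surviving decision(s):", decisions)
```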
-
Patent number: 12265433
Abstract: Aspects of the present disclosure describe techniques for cooling motional states in an ion trap for quantum computers. In an aspect, a method includes performing Doppler cooling and sideband cooling to sweep motional states associated with a motional mode to a zero motional state; applying a gate interaction on a red sideband; detecting a population of non-zero motional states of the motional mode that remains after performing the Doppler cooling and the sideband cooling; and removing at least part of the population. In another aspect, a method includes performing a Doppler cooling; applying a gate interaction on a red sideband; detecting whether a population of non-zero motional states of the motional mode remains after performing the Doppler cooling; and redistributing the population of the non-zero motional states by Doppler cooling when a population is detected. A quantum information processing (QIP) system that performs these methods is also described.
Type: Grant
Filed: June 24, 2021
Date of Patent: April 1, 2025
Assignee: IonQ, Inc.
Inventors: Jason Madjdi Amini, Kenneth Wright, Kristin Marie Beck
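The first method is essentially a fixed control sequence with one detection-conditioned step. The sketch below shows only that control flow; every function is a purely hypothetical stub standing in for real pulse-sequence hardware calls, and the simulated populations and threshold are assumptions.

```python
# Minimal control-flow sketch of the cooling sequence: cool, apply the red-sideband
# gate interaction, detect residual motional population, and remove it if present.
import random

def doppler_cool(mode):       mode["population"] *= 0.2      # stub: coarse cooling
def sideband_cool(mode):      mode["population"] *= 0.05     # stub: sweep toward the zero motional state
def red_sideband_gate(mode):  pass                           # stub: gate interaction on the red sideband
def detect_population(mode):  return mode["population"]      # stub: measure residual excitation
def remove_population(mode):  mode["population"] *= 0.1      # stub: remove part of the remaining population

mode = {"population": random.uniform(0.5, 1.0)}              # simulated motional excitation
doppler_cool(mode)
sideband_cool(mode)
red_sideband_gate(mode)
if detect_population(mode) > 0.001:                          # non-zero motional states still occupied
    remove_population(mode)
print(f"residual population: {mode['population']:.4f}")
```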
-
Patent number: 12265890
Abstract: Techniques are described for identifying successful adversarial attacks for a black box reading comprehension model using an extracted white box reading comprehension model. The system trains a white box reading comprehension model that behaves similarly to the black box reading comprehension model, using the set of queries and corresponding responses from the black box reading comprehension model as training data. The system tests adversarial attacks, involving modified informational content for execution of queries, against the trained white box reading comprehension model. Queries used for successful attacks on the white box model may be applied to the black box model itself as part of a black box improvement process.
Type: Grant
Filed: December 9, 2020
Date of Patent: April 1, 2025
Assignee: Oracle International Corporation
Inventors: Naveen Jafer Nizar, Ariel Gedaliah Kobren
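The workflow is: fit a surrogate ("white box") to observed black-box query/response behavior, probe the surrogate with modified passages, and forward only the successful modifications to the black box. The toy regex-based models and the distractor sentence below are assumptions for illustration, not real reading comprehension models.

```python
# Minimal sketch of extract-then-attack: test a candidate passage modification on
# a surrogate model first, and only transfer it to the black box if it succeeds.
import re

def black_box_qa(question, passage):
    # Stand-in for the target reading comprehension model (opaque to the attacker).
    match = re.search(r"\b(?:19|20)\d{2}\b", passage)
    return match.group(0) if match else "unknown"

def train_surrogate(query_response_pairs):
    # Stand-in for fitting a white-box model to observed black-box behavior; here the
    # "learned" heuristic is simply "answer with the last year mentioned in the passage".
    def surrogate_qa(question, passage):
        years = re.findall(r"\b(?:19|20)\d{2}\b", passage)
        return years[-1] if years else "unknown"
    return surrogate_qa

pairs = [("When did the landing happen?", "The landing happened in 1969.")]
surrogate_qa = train_surrogate(pairs)

q, p = pairs[0]
attacked = p + " The rehearsal happened in 1955."       # candidate adversarial modification
if surrogate_qa(q, attacked) != surrogate_qa(q, p):     # attack succeeds against the white box
    print("transferred to black box, answer:", black_box_qa(q, attacked))
else:
    print("attack did not fool the surrogate; not transferred")
```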
-
Patent number: 12242406
Abstract: A set of quantum controllers is operable to transmit quantum state data to a quantum control switch. The quantum control switch comprises vector processors that operate on the quantum state data from the set of quantum controllers. Each vector processor transmits a result of the operation to a corresponding quantum controller in the set of quantum controllers.
Type: Grant
Filed: May 10, 2021
Date of Patent: March 4, 2025
Assignee: Q.M Technologies Ltd.
Inventors: Itamar Sivan, Yonatan Cohen, Nissim Ofek, Ori Weber, Uri Abend
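The routing pattern described (each controller sends state data to the switch, a per-controller vector processor operates on it, and the result returns to the same controller) can be sketched as below. The Hadamard operation, the class names, and the data shapes are assumptions for illustration, not the hardware design.

```python
# Minimal sketch of the fan-out/fan-back pattern: per-controller vector processors
# inside a switch each operate on one controller's state data and return the result.
import numpy as np

HADAMARD = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

class VectorProcessor:
    def operate(self, state):
        return HADAMARD @ state                         # example operation on the state data

class QuantumControlSwitch:
    def __init__(self, n_controllers):
        self.processors = [VectorProcessor() for _ in range(n_controllers)]
    def route(self, controller_id, state):
        # Result goes back to the controller that sent the data.
        return self.processors[controller_id].operate(state)

states = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]   # data from two controllers
switch = QuantumControlSwitch(len(states))
results = [switch.route(i, s) for i, s in enumerate(states)]
print(results)
```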