Patents Examined by Vincent Gonzales
  • Patent number: 10929781
    Abstract: A method for determining machine learning training parameters is disclosed. The method can include a processor receiving a first input. The processor may receive a first response to the first input, determine a first intent, and identify a first action. The processor can then determine first trainable parameter(s) and determine whether the first trainable parameter(s) is negative or positive. Further, the processor can update a training algorithm based on the first trainable parameter(s). The processor can then receive a second input and determine a second intent for the second input. The processor can also determine a second action for the second intent and transmit the second action to a user. The processor can then determine second trainable parameter(s) and determine whether the second trainable parameter(s) is positive or negative. Finally, the processor can further update the training algorithm based on the second trainable parameter(s).
    Type: Grant
    Filed: October 31, 2019
    Date of Patent: February 23, 2021
    Assignee: CAPITAL ONE SERVICES, LLC
    Inventors: Omar Florez Choque, Erik Mueller, Zachary Kulis
  • Patent number: 10929745
    Abstract: A method and apparatus for constructing one of a neuroscience-inspired artificial neural network and a neural network array comprises one of a neuroscience-inspired dynamic architecture, a dynamic artificial neural network array and a neural network array of electrodes associated with neural tissue such as a brain, the method and apparatus having a special purpose display processor. The special purpose display processor outputs a display over a period of selected reference time units to demonstrate a neural pathway from, for example, one or a plurality of input neurons through intermediate destination neurons to an output neuron in three-dimensional space. The displayed neural network may comprise neurons and synapses in different colors and may be utilized, for example, to show the behavior of a neural network for classifying hand-written digits between values of 0 and 9 or recognizing vertical/horizontal lines in a grid image of lines.
    Type: Grant
    Filed: August 29, 2017
    Date of Patent: February 23, 2021
    Assignee: University of Tennessee Research Foundation
    Inventors: John Douglas Birdwell, Mark Edward Dean, Margaret Grace Drouhard, Catherine Dorothy Schuman
  • Patent number: 10915819
    Abstract: A method is disclosed including presenting a concept to a user via one or more presentation devices and monitoring the user's response to the presentation of the concept by a sensing device. The sensing device may generate sensor data based on the monitored user's response. The method further includes determining, based on the sensor data generated by the sensing device, that the user requires clarification of the presented concept. In response to determining that the user requires clarification of the presented concept, the method further includes identifying an analogy that is configured to clarify the presented concept and presenting the identified analogy to the user via one or more of the presentation devices.
    Type: Grant
    Filed: July 1, 2016
    Date of Patent: February 9, 2021
    Assignee: International Business Machines Corporation
    Inventors: Clifford A. Pickover, Robert J. Schloss, Komminist S. Weldemariam, Lin Zhou
  • Patent number: 10915824
    Abstract: Methods, apparatus, and systems for analyzing data trends are described herein. The present disclosure includes the identification of trending terms in data through the use of an unsupervised algorithm. Trending terms are identified and counted during a first and second time period without reference to a library of pre-defined terms, along with at least one reason for using such trending terms. The set of trending terms and the at least one reason for use of the trending terms are displayed to a user.
    Type: Grant
    Filed: August 25, 2017
    Date of Patent: February 9, 2021
    Assignee: MATTERSIGHT CORPORATION
    Inventors: Christopher Danson, Douglas Brown, Roger Warford, Andrew Traba, Jordana Heller
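    Illustrative sketch: the abstract above describes identifying trending terms across two time periods without a predefined term library. The Python snippet below shows one minimal way such a comparison could work; the tokenization, count threshold, and growth factor are assumptions for the example, not details taken from the patent.

```python
# Illustrative sketch only: flags "trending" terms by comparing raw counts in
# two time windows, with no predefined term library. Thresholds are invented
# for the example and are not taken from the patent.
from collections import Counter

def trending_terms(period1_docs, period2_docs, min_count=5, growth=2.0):
    """Return terms whose frequency in period 2 is at least `growth` times
    their (smoothed) frequency in period 1."""
    c1 = Counter(w for doc in period1_docs for w in doc.lower().split())
    c2 = Counter(w for doc in period2_docs for w in doc.lower().split())
    trending = {}
    for term, n2 in c2.items():
        n1 = c1.get(term, 0)
        if n2 >= min_count and n2 / (n1 + 1) >= growth:
            trending[term] = (n1, n2)
    return trending

print(trending_terms(
    ["the billing site is fine", "question about billing"],
    ["the outage broke checkout", "outage again", "checkout outage today"],
    min_count=2, growth=2.0))
```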
  • Patent number: 10905383
    Abstract: Methods and apparatus for training a classification model and using the trained classification model to recognize gestures performed by a user. An apparatus comprises a processor that is programmed to: receive, via a plurality of neuromuscular sensors, a first plurality of neuromuscular signals from a user as the user performs a first single act of a gesture; train a classification model based on the first plurality of neuromuscular signals, the training including: deriving value(s) from the first plurality of neuromuscular signals, the value(s) indicative of distinctive features of the gesture including at least one feature that linearly varies with a force applied during performance of the gesture; and generating a first categorical representation of the gesture in the classification model based on the value(s); and determine that the user performed a second single act of the gesture, based on the trained classification model and a second plurality of neuromuscular signals.
    Type: Grant
    Filed: February 28, 2019
    Date of Patent: February 2, 2021
    Assignee: Facebook Technologies, LLC
    Inventor: Alexandre Barachant
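    Illustrative sketch: the abstract above describes training a classifier from a single performance of a gesture using at least one feature that varies linearly with applied force. The snippet below is a loose stand-in: per-channel RMS amplitude (which scales roughly with contraction force) as the derived values, and a nearest-template rule as the categorical representation; none of this is the patented model.

```python
# Illustrative sketch, not the patented model: uses per-channel RMS amplitude
# (a feature that scales roughly linearly with contraction force) as the value
# vector derived from one performance of a gesture, and a nearest-template
# rule as a stand-in for the "categorical representation".
import numpy as np

def rms_features(signals):
    """signals: (channels, samples) array of neuromuscular data."""
    return np.sqrt(np.mean(np.square(signals), axis=1))

class SingleShotGestureClassifier:
    def __init__(self):
        self.templates = {}          # gesture name -> normalized RMS template

    def train(self, name, signals):
        feats = rms_features(signals)
        self.templates[name] = feats / (np.linalg.norm(feats) + 1e-9)

    def predict(self, signals):
        feats = rms_features(signals)
        feats = feats / (np.linalg.norm(feats) + 1e-9)
        # Normalizing removes overall force scaling, so a harder second
        # performance of the same gesture still matches its template.
        return max(self.templates, key=lambda g: float(feats @ self.templates[g]))

rng = np.random.default_rng(0)
fist = rng.normal(0, [1.0, 0.2, 0.2, 1.0], size=(200, 4)).T   # channels x samples
clf = SingleShotGestureClassifier()
clf.train("fist", fist)
clf.train("pinch", rng.normal(0, [0.2, 1.0, 1.0, 0.2], size=(200, 4)).T)
print(clf.predict(2.0 * fist))   # same gesture performed with more force
```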
  • Patent number: 10909450
    Abstract: A processing unit can determine a first feature value corresponding to a session by operating a first network computational model (NCM) based in part on information of the session. The processing unit can determine respective second feature values corresponding to individual actions of a plurality of actions by operating a second NCM. The second NCM can use a common set of parameters in determining the second feature values. The processing unit can determine respective expectation values of some of the actions of the plurality of actions based on the first feature value and the respective second feature values. The processing unit can select a first action of the plurality of actions based on at least one of the expectation values. In some examples, the processing unit can operate an NCM to determine expectation values based on information of a session and information of respective actions.
    Type: Grant
    Filed: March 29, 2016
    Date of Patent: February 2, 2021
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Jianshu Chen, Li Deng, Jianfeng Gao, Xiaodong He, Lihong Li, Ji He, Mari Ostendorf
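    Illustrative sketch: one way to realize the two-model scoring described in the abstract above is to embed the session with one network and every candidate action with a second, shared-parameter network, then score each action by an inner product of the two feature vectors. The combination rule and network shapes below are assumptions for the example.

```python
# Illustrative sketch only. One small network embeds the session, a second
# network with a single shared set of parameters embeds every candidate
# action, and the expectation value of each action is taken here as the inner
# product of the two feature vectors (the combination rule is an assumption).
import numpy as np

rng = np.random.default_rng(1)

def mlp(in_dim, out_dim):
    W = rng.normal(0, 0.1, size=(in_dim, out_dim))
    b = np.zeros(out_dim)
    return lambda x: np.tanh(x @ W + b)

session_ncm = mlp(in_dim=8, out_dim=16)       # first NCM: session -> feature
action_ncm = mlp(in_dim=5, out_dim=16)        # second NCM: shared across actions

def select_action(session_info, action_infos):
    s_feat = session_ncm(session_info)                        # first feature value
    a_feats = np.stack([action_ncm(a) for a in action_infos]) # second feature values
    expectations = a_feats @ s_feat                           # one score per action
    return int(np.argmax(expectations)), expectations

session = rng.normal(size=8)
actions = [rng.normal(size=5) for _ in range(4)]
best, scores = select_action(session, actions)
print(best, np.round(scores, 3))
```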
  • Patent number: 10902319
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for processing inputs using a neural network system that includes a batch normalization layer. One of the methods includes receiving a respective first layer output for each training example in the batch; computing a plurality of normalization statistics for the batch from the first layer outputs; normalizing each component of each first layer output using the normalization statistics to generate a respective normalized layer output for each training example in the batch; generating a respective batch normalization layer output for each of the training examples from the normalized layer outputs; and providing the batch normalization layer output as an input to the second neural network layer.
    Type: Grant
    Filed: September 16, 2019
    Date of Patent: January 26, 2021
    Assignee: Google LLC
    Inventors: Sergey Ioffe, Corinna Cortes
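    Illustrative sketch: the forward computation described in the abstract above (per-component batch statistics, normalization of each first-layer output, then generation of the layer output that is fed to the next layer) fits in a few lines; the learned scale and shift used below follow the standard batch-normalization formulation.

```python
# Minimal sketch of the batch-normalization computation described above:
# per-component statistics over the batch, normalization, then a learned
# scale (gamma) and shift (beta) before the output is passed to the next layer.
import numpy as np

def batch_norm_forward(first_layer_outputs, gamma, beta, eps=1e-5):
    """first_layer_outputs: (batch_size, num_components) array."""
    mean = first_layer_outputs.mean(axis=0)           # normalization statistics
    var = first_layer_outputs.var(axis=0)
    normalized = (first_layer_outputs - mean) / np.sqrt(var + eps)
    return gamma * normalized + beta                  # batch-norm layer output

x = np.random.default_rng(0).normal(5.0, 3.0, size=(32, 4))
y = batch_norm_forward(x, gamma=np.ones(4), beta=np.zeros(4))
print(np.round(y.mean(axis=0), 6), np.round(y.std(axis=0), 3))
```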
  • Patent number: 10902062
    Abstract: At an artificial intelligence system, a random cut tree corresponding to a sample of a multi-dimensional data set is traversed to determine a tree-specific vector indicating respective contributions of individual dimensions to an anomaly score of a particular data point. Level-specific vectors of per-dimension contributions obtained using bounding-box analyses at each level during the traversal are aggregated to obtain the tree-specific vector. An overall anomaly score contribution for at least one dimension is obtained using respective tree-specific vectors generated from one or more random cut trees, and an indication of the overall anomaly score contribution is provided.
    Type: Grant
    Filed: August 24, 2017
    Date of Patent: January 26, 2021
    Assignee: Amazon Technologies, Inc.
    Inventors: Sudipto Guha, Nina Mishra
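    Illustrative sketch: the bounding-box traversal that produces the per-tree contribution vectors is involved, so the snippet below shows only the final aggregation step named in the abstract above, with simple averaging as an assumed way to combine tree-specific vectors into an overall per-dimension contribution.

```python
# Illustrative sketch of only the aggregation step described above: given one
# per-dimension contribution vector from each random cut tree in the forest
# (the bounding-box traversal that produces them is omitted), combine them
# into an overall contribution per dimension. Averaging is an assumption made
# for the example.
import numpy as np

def overall_contributions(tree_vectors):
    """tree_vectors: (num_trees, num_dimensions) array where row t holds the
    tree-specific contributions of each dimension to the anomaly score."""
    per_dimension = np.mean(tree_vectors, axis=0)
    ranking = np.argsort(per_dimension)[::-1]        # most blameworthy first
    return per_dimension, ranking

tree_vectors = np.array([
    [0.1, 1.8, 0.2],     # tree 0: dimension 1 explains most of the anomaly
    [0.3, 1.5, 0.1],
    [0.2, 2.0, 0.3],
])
contrib, order = overall_contributions(tree_vectors)
print(np.round(contrib, 2), order)
```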
  • Patent number: 10896384
    Abstract: In an example embodiment, a machine learning algorithm is used to train an objective prediction model to output a prediction value for an input member of a social networking service and a potential objective, based on member attribute information and action information. At prediction time, member attribute information and action information for a first user may be fed to the objective prediction model to obtain prediction values for a plurality of different potential objectives, one of which can be selected based on the prediction values. The selected objective can then be used to optimize coordinates, in a latent representation space, mapped to a plurality of different entities in a social network structure.
    Type: Grant
    Filed: April 28, 2017
    Date of Patent: January 19, 2021
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Uri Merhav, Dan Shacham, Steven Curtis McClung
  • Patent number: 10891540
    Abstract: A method and computer system for managing a neural network. Data is sent into an input layer in a portion of layers of nodes in the neural network. The data moves on an encode path through the portion such that an output layer in the portion outputs encoded data. The encoded data is sent into the output layer on a decode path through the portion back to the input layer to obtain a reconstruction of the data by the input layer. A determination is made as to whether an undesired amount of error has occurred in the output layer based on the data sent into the input layer and the reconstruction of the data. A number of new nodes is added to the output layer when it is determined that the undesired amount of error has occurred, enabling the error to be reduced using the new nodes.
    Type: Grant
    Filed: December 18, 2015
    Date of Patent: January 12, 2021
    Assignee: National Technology & Engineering Solutions of Sandia, LLC
    Inventors: Timothy J. Draelos, James Bradley Aimone
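    Illustrative sketch: the snippet below mimics the growth criterion in the abstract above with a tied-weight linear autoencoder standing in for the encode/decode paths; whenever the reconstruction of the input layer still has an undesired amount of error, nodes are added to the output layer and the weights are refit. The PCA-based fitting and the error threshold are assumptions, not the patented method.

```python
# Illustrative sketch, not the patented procedure: a linear autoencoder whose
# encode/decode weights are refit (here with PCA, as a stand-in for training)
# each time the output layer is widened; new nodes keep being added while the
# reconstruction of the input layer has an undesired amount of error.
import numpy as np

rng = np.random.default_rng(0)

def fit_linear_autoencoder(X, width):
    """Return encode weights W (n_in x width) minimizing ||X - X W W^T||^2."""
    # The top `width` principal directions give the optimal tied linear weights.
    _, _, Vt = np.linalg.svd(X - X.mean(axis=0), full_matrices=False)
    return Vt[:width].T

def reconstruction_error(X, W):
    Xc = X - X.mean(axis=0)
    decoded = Xc @ W @ W.T              # encode to output layer, decode back
    return float(np.mean((Xc - decoded) ** 2))

# Data with ~6 strong latent directions embedded in 16 observed dimensions.
latent = rng.normal(size=(500, 6))
X = latent @ rng.normal(size=(6, 16)) + 0.05 * rng.normal(size=(500, 16))

width = 1
while reconstruction_error(X, fit_linear_autoencoder(X, width)) > 0.01:
    width += 1                          # add a new node to the output layer
print("output layer width:", width)
```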
  • Patent number: 10885449
    Abstract: A mechanism is provided for computing a solution to a plan recognition problem. The plan recognition problem includes a model and a partially ordered sequence of observations or traces. The plan recognition problem is transformed into an AI planning problem so that a planner can be used to compute a solution to it. The approach is general: it addresses unreliable observations, including missing observations, noisy observations (or observations that need to be discarded), and ambiguous observations. The approach does not require plan libraries or a possible set of goals. A planner can find either one solution to the resulting planning problem or multiple ranked solutions, which map to the most plausible solutions to the original problem.
    Type: Grant
    Filed: October 31, 2019
    Date of Patent: January 5, 2021
    Assignee: International Business Machines Corporation
    Inventors: Anton Viktorovich Riabov, Shirin Sohrabi Araghi, Octavian Udrea
  • Patent number: 10885469
    Abstract: In one embodiment, a device trains a machine learning-based malware classifier using a first randomly selected subset of samples from a training dataset. The classifier comprises a random decision forest. The device identifies, using at least a portion of the training dataset as input to the malware classifier, a set of misclassified samples from the training dataset that the malware classifier misclassifies. The device retrains the malware classifier using a second randomly selected subset of samples from the training dataset and the identified set of misclassified samples. The device adjusts prediction labels of individual leaves of the random decision forest of the retrained malware classifier based in part on decision changes in the forest that result from assessing the entire training dataset with the classifier. The device sends the malware classifier with the adjusted prediction labels for deployment into a network.
    Type: Grant
    Filed: October 2, 2017
    Date of Patent: January 5, 2021
    Assignee: Cisco Technology, Inc.
    Inventors: Jan Brabec, Lukas Machlica
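    Illustrative sketch: the first retraining steps in the abstract above (train on a random subset, find misclassified training samples, retrain on a second random subset plus those samples) are shown below using scikit-learn as a stand-in; the per-leaf label adjustment is only indicated by a comment because it requires editing the fitted trees.

```python
# Illustrative sketch of the retraining loop described above, using
# scikit-learn as a stand-in; the final per-leaf label adjustment over the
# whole training set is only indicated by a comment.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=2000, n_features=20, weights=[0.9, 0.1],
                           random_state=0)          # stand-in "malware" dataset

def random_subset(n, size):
    return rng.choice(n, size=size, replace=False)

# 1) Train on a first randomly selected subset of the training data.
first = random_subset(len(X), 500)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X[first], y[first])

# 2) Identify training samples the classifier misclassifies.
misclassified = np.where(clf.predict(X) != y)[0]

# 3) Retrain on a second random subset plus the misclassified samples.
second = np.union1d(random_subset(len(X), 500), misclassified)
clf = RandomForestClassifier(n_estimators=100, random_state=1).fit(X[second], y[second])

# 4) (Not shown) adjust the prediction labels of individual leaves based on
#    the decisions the retrained forest makes on the entire training dataset.
print("errors after retraining:", int((clf.predict(X) != y).sum()))
```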
  • Patent number: 10885439
    Abstract: A method of generating a neural network includes iteratively performing operations including generating, for each neural network of a population, a matrix representation. The matrix representation of a particular neural network includes rows of values, where each row corresponds to a set of layers of the particular neural network and each value specifies a hyperparameter of the set of layers. The operations also include providing the matrix representations as input to a relative fitness estimator that is trained to generate estimated fitness data for neural networks of the population. The estimated fitness data are based on expected fitness of neural networks predicted by the relative fitness estimator. The operations further include generating, based on the estimated fitness data, a subsequent population of neural networks. The method also includes, when a termination condition is satisfied, outputting data identifying a neural network as a candidate neural network.
    Type: Grant
    Filed: May 13, 2020
    Date of Patent: January 5, 2021
    Assignee: SPARKCOGNITION, INC.
    Inventors: Tyler S. McDonnell, Bryson Greenwood
  • Patent number: 10878325
    Abstract: Provided is a method of obtaining state data indicating a state of a user. The method includes: obtaining estimation models for obtaining pieces of state data existing in a plurality of layers from sensor data obtained by a sensor; obtaining at least one piece of sensor data; obtaining state data of a lower layer from the at least one piece of sensor data, based on the estimation models; and obtaining state data of a higher layer from the state data of the lower layer, based on the estimation models.
    Type: Grant
    Filed: December 2, 2015
    Date of Patent: December 29, 2020
    Assignee: SAMSUNG ELECTRONICS CO., LTD.
    Inventors: Yoshiaki Akazawa, Jae-hwan Sim, Kousuke Hirasawa, Min-woo Gil
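    Illustrative sketch: the layered estimation in the abstract above can be pictured as a chain of models, with the lower-layer state estimated from sensor data and the higher-layer state estimated from the lower-layer states. The rule-based models below are placeholders for the learned estimation models.

```python
# Illustrative sketch of the layered estimation described above, with simple
# rule-based models standing in for the learned estimation models:
# sensor data -> lower-layer state -> higher-layer state.
def lower_layer_model(sensor_data):
    """Estimate low-level states directly from sensor readings."""
    return {
        "moving": sensor_data["accelerometer_magnitude"] > 1.2,
        "at_home": sensor_data["wifi_ssid"] == "home_network",
        "late_night": sensor_data["hour"] >= 23 or sensor_data["hour"] < 5,
    }

def higher_layer_model(lower_state):
    """Estimate a higher-level state from the lower-layer states."""
    if lower_state["at_home"] and lower_state["late_night"] and not lower_state["moving"]:
        return "sleeping"
    if lower_state["moving"] and not lower_state["at_home"]:
        return "commuting"
    return "idle"

sensor_data = {"accelerometer_magnitude": 0.3, "wifi_ssid": "home_network", "hour": 23}
lower = lower_layer_model(sensor_data)
print(lower, "->", higher_layer_model(lower))
```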
  • Patent number: 10878316
    Abstract: A circuit for performing neural network computations for a neural network, the circuit comprising: a systolic array comprising a plurality of cells; a weight fetcher unit configured to, for each of a plurality of neural network layers: send, for the neural network layer, a plurality of weight inputs to cells along a first dimension of the systolic array; and a plurality of weight sequencer units, each weight sequencer unit coupled to a distinct cell along the first dimension of the systolic array, the plurality of weight sequencer units configured to, for each of the plurality of neural network layers: shift, for the neural network layer, the plurality of weight inputs to cells along a second dimension of the systolic array over a plurality of clock cycles, where each cell is configured to compute a product of an activation input and a respective weight input using multiplication circuitry.
    Type: Grant
    Filed: March 23, 2020
    Date of Patent: December 29, 2020
    Assignee: Google LLC
    Inventor: Jonathan Ross
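    Illustrative sketch: the snippet below is a heavily simplified software model of the dataflow in the abstract above: weight inputs are shifted cell to cell over several clock cycles until each cell holds its weight, and each cell then multiplies its activation input by that weight and contributes a partial sum. Timing and skewing of a real systolic array are not modeled.

```python
# Heavily simplified simulation of the idea above: weights shift down each
# column over "clock cycles" until every cell holds its weight, then each cell
# multiplies its activation input by the held weight and passes the partial
# sum down the column. Real systolic timing/skewing is not modeled.
import numpy as np

def load_weights(weights):
    """Shift weight rows in from the top of the array, one per clock cycle,
    until cell (i, j) holds weights[i, j] (weight-stationary loading)."""
    rows, cols = weights.shape
    cells = np.zeros((rows, cols))
    for cycle in range(rows):
        # Every cell hands its weight to the cell below; a new row enters on top.
        cells = np.vstack([weights[rows - 1 - cycle], cells[:-1]])
    return cells

def systolic_matmul(activations, weights):
    """activations: (batch, rows); weights: (rows, cols)."""
    cells = load_weights(weights)
    batch, rows = activations.shape
    outputs = np.zeros((batch, cells.shape[1]))
    for j in range(cells.shape[1]):                 # each column of cells
        partial = np.zeros(batch)
        for i in range(rows):                       # partial sums flow downward
            partial = partial + activations[:, i] * cells[i, j]
        outputs[:, j] = partial
    return outputs

A = np.arange(6, dtype=float).reshape(2, 3)
W = np.arange(12, dtype=float).reshape(3, 4)
print(np.allclose(systolic_matmul(A, W), A @ W))    # True
```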
  • Patent number: 10872296
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for training a neural network. One of the methods includes receiving training data; training a neural network on the training data, wherein the neural network is configured to: receive a network input, convert the network input into a latent representation of the network input, and process the latent representation to generate a network output from the network input, and wherein training the neural network on the training data comprises training the neural network on a variational information bottleneck objective that encourages, for each training input, the latent representation generated for the training input to have low mutual information with the training input while the network output generated for the training input has high mutual information with the target output for the training input.
    Type: Grant
    Filed: May 3, 2019
    Date of Patent: December 22, 2020
    Assignee: Google LLC
    Inventor: Alexander Amir Alemi
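    Illustrative sketch: the variational information bottleneck objective referred to above is commonly written as a prediction term (keeping the latent representation informative about the target output) plus a beta-weighted KL term to a prior (limiting the information the latent keeps about the input). The snippet computes that per-example loss for a Gaussian latent; beta and the toy values are placeholders.

```python
# Sketch of the per-example variational information bottleneck objective in
# its standard form: cross-entropy of the prediction made from the latent,
# plus a beta-weighted KL divergence to a unit Gaussian prior.
import numpy as np

def vib_loss(mu, log_var, logits, target, beta=1e-3):
    """mu, log_var: mean and log-variance of the Gaussian latent q(z|x) for
    one training input; logits: class scores computed from a sample of the
    latent; target: index of the target output."""
    # KL( N(mu, diag(exp(log_var))) || N(0, I) ): discourages the latent from
    # keeping information about the input.
    kl = 0.5 * np.sum(np.exp(log_var) + mu ** 2 - 1.0 - log_var)
    # Cross-entropy: encourages the latent to stay informative about the target.
    m = logits.max()
    log_probs = logits - (m + np.log(np.sum(np.exp(logits - m))))
    return -log_probs[target] + beta * kl

mu = np.array([0.3, -0.1])
log_var = np.array([-1.0, -0.5])
logits = np.array([2.0, -1.0, 0.5])
print(round(vib_loss(mu, log_var, logits, target=0), 4))
```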
  • Patent number: 10867251
    Abstract: Provided is an estimation result display system that, when displaying an estimation result derived using a learning model, enables a person to recognize how condition determination was performed to select the learning model. Input means 91 receives input of information associating information indicating a learning model, selected depending on a determination result of whether or not an attribute in estimation data including one or more types of attributes satisfies one or more types of conditions, with an estimation result derived using the learning model. Display means 92 displays the estimation result in association with the information indicating the learning model used for deriving the estimation result and with the condition that was checked against the attribute in the estimation data when selecting the learning model.
    Type: Grant
    Filed: April 30, 2015
    Date of Patent: December 15, 2020
    Assignee: NEC Corporation
    Inventor: Yousuke Motohashi
  • Patent number: 10867246
    Abstract: Training datasets are determined for training neural networks. An input dataset comprising a plurality of samples is provided as a training dataset to the neural network. Vector representations of the samples of the input dataset are obtained from a hidden layer of the neural network. The samples are clustered using the vector representations. Each sample is scored based on a metric that indicates the similarity of the sample to its cluster. A subset of samples is determined by excluding samples that have high similarity with their clusters. The subset of samples is labelled and used for training the neural network.
    Type: Grant
    Filed: August 24, 2017
    Date of Patent: December 15, 2020
    Assignee: ARIMO, LLC
    Inventors: Christopher T. Nguyen, Binh Han
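    Illustrative sketch: the selection step in the abstract above (cluster hidden-layer vectors, score each sample by similarity to its cluster, exclude the highly similar ones before labelling) could look like the snippet below; the random stand-in vectors, cosine similarity metric, and 80th-percentile cutoff are assumptions for the example.

```python
# Illustrative sketch: given vector representations taken from a hidden layer
# (random vectors stand in for them here), cluster the samples, score each by
# cosine similarity to its cluster centroid, and keep for labelling only the
# samples that are NOT highly similar to their cluster.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
hidden_vectors = rng.normal(size=(300, 32))          # stand-in hidden-layer outputs

kmeans = KMeans(n_clusters=8, n_init=10, random_state=0).fit(hidden_vectors)
centroids = kmeans.cluster_centers_[kmeans.labels_]

def cosine(a, b):
    return np.sum(a * b, axis=1) / (np.linalg.norm(a, axis=1) * np.linalg.norm(b, axis=1))

similarity = cosine(hidden_vectors, centroids)
cutoff = np.quantile(similarity, 0.8)                # assumed cutoff for the example
to_label = np.where(similarity < cutoff)[0]          # exclude highly redundant samples
print(f"selected {len(to_label)} of {len(hidden_vectors)} samples for labelling")
```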
  • Patent number: 10860939
    Abstract: An application performance analyzer adapted to analyze the performance of one or more applications running on IT infrastructure, comprises: a data collection engine collecting performance metrics for one or more applications running on the IT infrastructure; an anomaly detection engine analyzing the performance metrics and detecting anomalies, i.e. performance metrics whose values deviate from historic values with a deviation that exceeds a predefined threshold; a correlation engine detecting dependencies between plural anomalies, and generating anomaly clusters, each anomaly cluster consisting of anomalies that are correlated through one or more of the dependencies; a ranking engine ranking anomalies within an anomaly cluster; and a source detection engine pinpointing a problem source from the lowest ranked anomaly in an anomaly cluster.
    Type: Grant
    Filed: May 9, 2019
    Date of Patent: December 8, 2020
    Assignee: New Relic, Inc.
    Inventors: Frederick Ryckbosch, Stijn Polfliet, Bart De Vylder
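    Illustrative sketch: the first two engines in the abstract above can be approximated as below: flag a metric as anomalous when its latest value deviates from its history by more than a threshold, then group anomalies whose series are strongly correlated. The z-score and correlation thresholds are assumptions, not values from the patent.

```python
# Illustrative sketch of the detection and correlation stages described above.
import numpy as np

def detect_anomalies(metrics, z_threshold=3.0):
    """metrics: dict name -> 1-D array of samples (last value = current)."""
    anomalies = {}
    for name, series in metrics.items():
        history, current = series[:-1], series[-1]
        z = (current - history.mean()) / (history.std() + 1e-9)
        if abs(z) > z_threshold:
            anomalies[name] = series
    return anomalies

def cluster_anomalies(anomalies, min_corr=0.8):
    """Group anomalies whose series correlate strongly with a cluster's first member."""
    clusters = []
    for name in anomalies:
        for cluster in clusters:
            ref = anomalies[cluster[0]]
            if abs(np.corrcoef(anomalies[name], ref)[0, 1]) >= min_corr:
                cluster.append(name)
                break
        else:
            clusters.append([name])
    return clusters

rng = np.random.default_rng(0)
base = np.concatenate([rng.normal(100, 2, 50), [140.0]])      # spike at the end
metrics = {
    "latency_ms": base,
    "error_rate": base / 50,                                  # correlated anomaly
    "cpu": np.concatenate([rng.normal(40, 3, 50), [41.0]]),   # normal metric
}
anoms = detect_anomalies(metrics)
print(sorted(anoms), cluster_anomalies(anoms))
```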
  • Patent number: 10853722
    Abstract: Aspects of processing data for Long Short-Term Memory (LSTM) neural networks are described herein. The aspects may include one or more data buffer units configured to store previous output data at a previous timepoint, input data at a current timepoint, one or more weight values, and one or more bias values. The aspects may further include multiple data processing units configured to calculate, in parallel, a portion of an output value at the current timepoint based on the previous output data at the previous timepoint, the input data at the current timepoint, the one or more weight values, and the one or more bias values.
    Type: Grant
    Filed: July 1, 2019
    Date of Patent: December 1, 2020
    Assignee: Shanghai Cambricon Information Technology Co., Ltd.
    Inventors: Yunji Chen, Xiaobing Chen, Shaoli Liu, Tianshi Chen
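    Illustrative sketch: the buffers described above feed the standard LSTM step, which combines the previous output, the input at the current timepoint, the weight values, and the bias values into gate activations and a new output. The snippet below shows that computation in NumPy; the parallel split of the output across data processing units is not modeled.

```python
# Sketch of the LSTM computation those buffers feed: h_prev, the current
# input x, weights, and biases produce the gates and the new output. The
# hardware parallelism over slices of the output is not modeled here.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """W: (4*hidden, input), U: (4*hidden, hidden), b: (4*hidden,).
    Gate order assumed: input, forget, cell candidate, output."""
    hidden = h_prev.shape[0]
    z = W @ x + U @ h_prev + b
    i = sigmoid(z[:hidden])
    f = sigmoid(z[hidden:2 * hidden])
    g = np.tanh(z[2 * hidden:3 * hidden])
    o = sigmoid(z[3 * hidden:])
    c = f * c_prev + i * g
    h = o * np.tanh(c)
    return h, c

rng = np.random.default_rng(0)
n_in, n_hidden = 8, 4
W = rng.normal(0, 0.1, (4 * n_hidden, n_in))
U = rng.normal(0, 0.1, (4 * n_hidden, n_hidden))
b = np.zeros(4 * n_hidden)
h, c = lstm_step(rng.normal(size=n_in), np.zeros(n_hidden), np.zeros(n_hidden), W, U, b)
print(np.round(h, 3))
```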