Patents Examined by M. Smith
  • Patent number: 11610109
    Abstract: In an example embodiment, a system is provided whereby a machine learning model is trained to predict a standardization for a given raw title. A neural network may be trained whose input is a raw title (such as a query string) and a list of candidate titles (either title identifications in a taxonomy, or English strings), which produces a probability that the raw title and each candidate belong to the same title. The model is able to standardize titles in any language included in the training data without first having to perform language identification or normalization of the title. Additionally, the model is able to benefit from the existence of “loan words” (words adopted from a foreign language with little or no modification) and relations between languages.
    Type: Grant
    Filed: September 26, 2018
    Date of Patent: March 21, 2023
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Sebastian Alexander Csar, Uri Merhav, Dan Shacham
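A minimal sketch of the scoring interface described in 11610109's abstract: a raw title and a list of candidates go in, and a per-candidate probability of referring to the same title comes out. The character-trigram hashing and cosine-plus-sigmoid scoring below are stand-ins for the trained neural network, chosen only to illustrate why the approach needs no language identification.

```python
import numpy as np

def embed(title: str, dim: int = 256) -> np.ndarray:
    """Hash character trigrams into a fixed-size vector; loan words shared
    across languages fall into the same buckets, so no language ID is needed."""
    vec = np.zeros(dim)
    text = f"  {title.lower()}  "
    for i in range(len(text) - 2):
        vec[hash(text[i:i + 3]) % dim] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

def same_title_probabilities(raw_title: str, candidates: list) -> list:
    """Return, for each candidate title, a probability that it and the raw
    title refer to the same standardized title (cosine + sigmoid stand-in)."""
    q = embed(raw_title)
    return [float(1.0 / (1.0 + np.exp(-8.0 * (q @ embed(c) - 0.5))))
            for c in candidates]

print(same_title_probabilities("sr. software engr",
                               ["Senior Software Engineer", "Registered Nurse"]))
```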
  • Patent number: 11593703
    Abstract: At least one label prediction model is trained, or learned, using training data that may comprise training instances missing one or more labels. The at least one label prediction model may be used to identify a content item's ground-truth label set, which comprises, for each label in the label set, an indicator of whether or not the label is applicable to the content item.
    Type: Grant
    Filed: June 13, 2019
    Date of Patent: February 28, 2023
    Assignee: YAHOO ASSETS LLC
    Inventors: Jia Li, Yi Chang, Xiangnan Kong
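A sketch of the core idea behind 11593703's training setup, assuming a simple masked binary cross-entropy: missing labels are excluded from the loss rather than treated as negatives. The array names and placeholder values are illustrative, not the patent's implementation.

```python
import numpy as np

def masked_bce_loss(probs, labels, observed):
    """Binary cross-entropy over a label set, ignoring positions where the
    label is missing (observed == 0) in the training instance."""
    probs = np.clip(probs, 1e-7, 1 - 1e-7)
    per_label = -(labels * np.log(probs) + (1 - labels) * np.log(1 - probs))
    mask = observed.astype(float)
    return float((per_label * mask).sum() / max(mask.sum(), 1.0))

# One training instance: three labels, the last one unobserved (missing).
probs    = np.array([0.9, 0.2, 0.6])   # model predictions
labels   = np.array([1.0, 0.0, 0.0])   # placeholder value for the missing label
observed = np.array([1,   1,   0])     # 0 marks a missing label
print(masked_bce_loss(probs, labels, observed))
```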
  • Patent number: 11593817
    Abstract: A demand prediction method using a computer includes calculating an error between a prediction value and a measurement value of a past demand amount. The prediction value is calculated by inputting a measurement value of a past explanatory variable into a prediction model constructed from measurement values of the demand amount at a predetermined location and an explanatory variable serving as an external factor that affects increases or decreases in that demand amount. The method further includes determining whether the calculated error is an abnormal value, acquiring a new explanatory variable if it is, updating the prediction model in accordance with the acquired new explanatory variable, and newly predicting a future demand amount using the updated prediction model.
    Type: Grant
    Filed: June 16, 2016
    Date of Patent: February 28, 2023
    Assignee: PANASONIC INTELLECTUAL PROPERTY CORPORATION OF AMERICA
    Inventors: Ryota Fujimura, Iku Ohama, Hideo Umetani, Yukie Shoda
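The update loop in 11593817's abstract could look roughly like the following sketch, which uses a plain linear regression and a fixed error threshold as stand-ins for the patent's prediction model and abnormal-value test; `event_flag` is a hypothetical newly acquired explanatory variable.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Historical demand at one location and an explanatory variable (e.g. temperature).
temperature = np.array([[20.0], [22.0], [25.0], [27.0], [30.0]])
demand      = np.array([100.0, 110.0, 125.0, 135.0, 150.0])
model = LinearRegression().fit(temperature, demand)

# Compare the prediction for the latest past period against its measurement.
predicted = model.predict([[31.0]])[0]
measured  = 240.0                       # far from the prediction
error     = abs(measured - predicted)

# Treat the error as abnormal if it exceeds a fixed threshold, then refit the
# model with a newly acquired explanatory variable (e.g. an event flag).
if error > 30.0:
    event_flag = np.array([[0], [0], [0], [0], [0], [1]])
    features   = np.hstack([np.vstack([temperature, [[31.0]]]), event_flag])
    demand_ext = np.append(demand, measured)
    model = LinearRegression().fit(features, demand_ext)

print(model.predict([[33.0, 1]])[0])    # new prediction of future demand
```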
  • Patent number: 11593625
    Abstract: Provided is a processor-implemented method that includes performing training or an inference operation with a neural network by obtaining a parameter for the neural network in a floating-point format, applying a fractional length of a fixed-point format to the parameter in the floating-point format, performing an operation with an integer arithmetic logic unit (ALU) to determine whether to round off a fixed point based on a most significant bit among bit values to be discarded after a quantization process, and performing an operation of quantizing the parameter in the floating-point format to a parameter in the fixed-point format, based on a result of the operation with the ALU.
    Type: Grant
    Filed: October 15, 2018
    Date of Patent: February 28, 2023
    Assignee: Samsung Electronics Co., Ltd.
    Inventors: Shinhaeng Kang, Seungwon Lee
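A hedged sketch of the rounding step described in 11593625: the floating-point parameter is scaled with one extra fractional bit so the most significant discarded bit can be inspected with integer operations alone, then added back to implement the round-off. The 8-bit signed format and the saturation step are assumptions, not details from the patent.

```python
def quantize_to_fixed_point(value: float, frac_len: int, total_bits: int = 8) -> int:
    """Quantize a floating-point parameter to a signed fixed-point integer.
    Rounding is decided with integer operations only: the most significant
    bit of the discarded fraction determines whether to round up."""
    # Scale by one extra bit so the would-be-discarded MSB is visible as bit 0.
    scaled = int(value * (1 << (frac_len + 1)))
    round_bit = scaled & 1               # MSB of the bits to be discarded
    fixed = (scaled >> 1) + round_bit    # integer-ALU add implements the round-off
    # Saturate to the representable range of the fixed-point format.
    lo, hi = -(1 << (total_bits - 1)), (1 << (total_bits - 1)) - 1
    return max(lo, min(hi, fixed))

print(quantize_to_fixed_point(0.7734, frac_len=4))   # -> round(0.7734 * 16) = 12
```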
  • Patent number: 11588099
    Abstract: A reservoir element of the first aspect of the present disclosure includes: a first ferromagnetic layer; a plurality of second ferromagnetic layers positioned in a first direction with respect to the first ferromagnetic layer and spaced apart from each other in a plan view from the first direction; and a nonmagnetic layer positioned between the first ferromagnetic layer and the second ferromagnetic layers.
    Type: Grant
    Filed: September 10, 2019
    Date of Patent: February 21, 2023
    Assignee: TDK CORPORATION
    Inventors: Tomoyuki Sasaki, Tatsuo Shibata
  • Patent number: 11586960
    Abstract: Embodiments are directed to a method of performing autonomous learning for updating input features used for an artificial intelligence model, the method comprising receiving updated data of an information space that includes a graph of nodes having a defined topology, the updated data including historical data of requests to the artificial intelligence model and output results associated with the requests, wherein different categories of input data correspond to different input nodes of the graph. The method may further comprise updating edge connections between the nodes of the graph by performing path optimizations that each use a set of agents to explore the information space over cycles to reduce a cost function, each connection including a strength value, wherein during each path optimization, path information is shared among the agents at each cycle to determine a next position value for each of the set of agents in the graph.
    Type: Grant
    Filed: May 9, 2017
    Date of Patent: February 21, 2023
    Assignee: VISA INTERNATIONAL SERVICE ASSOCIATION
    Inventors: Theodore D. Harris, Craig O'Connell, Yue Li, Tatiana Korolevskaya
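One plausible reading of 11586960's path optimization is an ant-colony-style search, sketched below on a toy graph: agents pick edges in proportion to a strength value, and at the end of each cycle the paths found are shared so that cheaper paths are reinforced. The graph, costs, and decay factor are all invented for illustration.

```python
import random

# Toy information space: directed edges with a strength value and a traversal cost.
edges = {("in", "a"): 1.0, ("in", "b"): 1.0, ("a", "out"): 1.0, ("b", "out"): 1.0}
cost  = {("in", "a"): 2.0, ("in", "b"): 5.0, ("a", "out"): 2.0, ("b", "out"): 1.0}

def run_cycle(n_agents=10):
    """Each agent walks from 'in' to 'out', choosing edges in proportion to
    their strength; at the end of the cycle the paths are shared and reinforced."""
    paths = []
    for _ in range(n_agents):
        node, path = "in", []
        while node != "out":
            options = [e for e in edges if e[0] == node]
            e = random.choices(options, weights=[edges[o] for o in options])[0]
            path.append(e)
            node = e[1]
        paths.append(path)
    for path in paths:                       # shared update at the end of the cycle
        total = sum(cost[e] for e in path)
        for e in path:
            edges[e] += 1.0 / total          # stronger reinforcement for cheaper paths
    for e in edges:                          # mild decay keeps strengths bounded
        edges[e] *= 0.9

for _ in range(20):
    run_cycle()
print(edges)   # edges on the cheaper in -> a -> out path end up with higher strength
```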
  • Patent number: 11580440
    Abstract: Methods, computer-readable media and systems are disclosed for building, deploying, operating, and maintaining an intelligent dynamic form in which a trained machine learning (ML) model is embedded. A universe of questions is associated with a plurality of output classifiers, which could represent eligibilities for respective benefits. The questions are partitioned into blocks. Each block can be associated with one or more of the classifiers, and each classifier can have a dependency on one or more blocks. An ML model is trained to make inferences from varied combinations of responses to questions and pre-existing data, and determine probabilities or predictions of values of the output classifiers. Based on outputs of the trained model, blocks of questions can be selectively rendered. The trained model is packaged with the question blocks and other components suitably for offline deployment. Uploading collected responses and maintenance of the dynamic form are also disclosed.
    Type: Grant
    Filed: December 2, 2016
    Date of Patent: February 14, 2023
    Assignee: SAP SE
    Inventor: Markus Schmidt-Karaca
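A small sketch of the selective rendering idea in 11580440, assuming each question block maps to the classifiers that depend on it and that the embedded model exposes per-classifier probabilities; the block names, dependency table, and threshold are hypothetical.

```python
# Hypothetical structure: each question block feeds one or more classifiers,
# and a block is rendered only while some classifier depending on it is still
# plausibly applicable given the model's current probability estimates.
block_dependencies = {
    "income_questions":   ["housing_benefit", "food_assistance"],
    "children_questions": ["child_benefit"],
}

def blocks_to_render(probabilities, threshold=0.10):
    """Render a block if any classifier depending on it still has a
    probability above the threshold (stand-in for the embedded ML model)."""
    return [
        block for block, classifiers in block_dependencies.items()
        if any(probabilities.get(c, 1.0) > threshold for c in classifiers)
    ]

# Model output after the responses collected so far (illustrative values).
print(blocks_to_render({"housing_benefit": 0.02, "food_assistance": 0.03,
                        "child_benefit": 0.64}))
# -> ['children_questions']
```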
  • Patent number: 11574206
    Abstract: The presently disclosed method and system include a network of computer devices, sensors, and actuators operating in concert with application software to actively detect, identify, and localize threats and generate real-time countermeasures designed to delay and/or mitigate damage that may be caused by the threats. Application software, in the form of automated reasoning and logic control, initiates preparatory and countermeasure sequences automatically, which may be used by users or automatically executed by the system to at least delay an attack on a physical asset/area by adversaries through the use of non-lethal actuators. Learned scenarios are generated and continuously adapted via feedback loops and decision rules to provide preparatory and countermeasure sequences that maximize results with minimal expenditure of assets.
    Type: Grant
    Filed: June 3, 2015
    Date of Patent: February 7, 2023
    Assignee: The Security Oracle, Inc.
    Inventors: Charles Lankford Butler, Jr., Samuel McArthur Smith, Vontella Kay Kimball
  • Patent number: 11568205
    Abstract: Techniques are generally described for causal impact estimation using machine learning. A first machine learning model is trained using non-treatment variables during training. A second machine learning model uses learned weights from the first machine learning model for non-treatment variables and is trained on one or more treatment variables. The second machine learning model estimates outcomes based on the presence or absence of an event represented by the treatment variable. Selection bias is reduced by warm-starting the second machine learning model with non-treatment variable weights learned during training of the first machine learning model.
    Type: Grant
    Filed: January 22, 2019
    Date of Patent: January 31, 2023
    Assignee: AMAZON TECHNOLOGIES, INC.
    Inventors: Naveen Sudhakaran Nair, Pragyana K. Mishra
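A sketch of the warm-starting described in 11568205, using gradient-descent linear regression as a stand-in for the two machine learning models: the second model's weights for the non-treatment variables are initialized from the first model, and the learned treatment coefficient recovers the simulated effect.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=(n, 2))             # non-treatment variables
t = rng.integers(0, 2, size=(n, 1))     # treatment variable (event present / absent)
y = 3.0 * x[:, :1] - 2.0 * x[:, 1:] + 1.5 * t + rng.normal(scale=0.1, size=(n, 1))

def fit(features, target, init_w=None, steps=2000, lr=0.05):
    """Plain gradient-descent linear regression, optionally warm-started."""
    w = np.zeros((features.shape[1], 1)) if init_w is None else init_w.copy()
    for _ in range(steps):
        w -= lr * features.T @ (features @ w - target) / len(target)
    return w

# First model: non-treatment variables only.
w1 = fit(x, y)
# Second model: warm-started with w1 for the non-treatment columns, then
# trained with the treatment variable appended.
w2 = fit(np.hstack([x, t]), y, init_w=np.vstack([w1, np.zeros((1, 1))]))

# Estimated impact: difference in predicted outcome with vs. without the event.
print(float(w2[-1, 0]))   # close to the simulated treatment effect of 1.5
```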
  • Patent number: 11568301
    Abstract: A machine learning system includes multiple machine learning models. A target object, such as a file, is scanned for machine learning features. Context information of the target object, such as the type of the object and how the object was received in a computer, is employed to select a machine learning model among the multiple machine learning models. The machine learning model is also selected based on threat intelligence, such as census information of the target object. The selected machine learning model makes a prediction using machine learning features extracted from the target object. The target object is allowed or blocked depending on whether or not the prediction indicates that the target object is malicious.
    Type: Grant
    Filed: January 31, 2018
    Date of Patent: January 31, 2023
    Assignee: Trend Micro Incorporated
    Inventors: Peng-Yuan Yueh, Chia-Yen Chang, Po-I Wang, Te-Ching Chen
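The model-selection step in 11568301 might be sketched as a lookup keyed on context (object type, arrival channel) plus a prevalence check standing in for census-based threat intelligence; the registry, thresholds, and scoring lambdas below are purely illustrative.

```python
# Hypothetical registry of per-context models; each model here is just a
# callable returning a maliciousness score for the extracted features.
models = {
    ("pe_executable", "email_attachment"): lambda f: 0.9 if f["entropy"] > 7 else 0.1,
    ("pdf_document", "web_download"):      lambda f: 0.8 if f["has_js"] else 0.05,
    "default":                             lambda f: 0.5,
}

def select_model(file_type, arrival_channel, prevalence_count, rare_threshold=10):
    """Pick a model from the context (object type, how it arrived) and from
    threat intelligence such as census/prevalence information."""
    if prevalence_count < rare_threshold:        # rarely seen files get the stricter model
        return models.get((file_type, arrival_channel), models["default"])
    return models["default"]

model = select_model("pe_executable", "email_attachment", prevalence_count=2)
verdict = model({"entropy": 7.4})
print("block" if verdict > 0.5 else "allow")
```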
  • Patent number: 11556808
    Abstract: Content delivery optimization and recommendation is disclosed. A manner of delivering a content object to a mobile device may be determined at least in part by applying a behavior model associated with a user of the mobile device to attributes associated with the content object. The behavior model may be generated based at least in part on observed activities of the user. The content object is provided to the mobile device in the determined manner.
    Type: Grant
    Filed: August 29, 2014
    Date of Patent: January 17, 2023
    Assignee: Ivanti, Inc.
    Inventors: Mansu Kim, Suresh Kumar Batchu, Benjamin Markines
  • Patent number: 11544549
    Abstract: A processor-implemented neural network method includes calculating individual update values for a weight assigned to a connection relationship between nodes included in a neural network; generating an accumulated update value by accumulating the individual update values in an accumulation buffer; and training the neural network by updating the weight using the accumulated update value in response to the accumulated update value being equal to or greater than a threshold value.
    Type: Grant
    Filed: August 21, 2018
    Date of Patent: January 3, 2023
    Assignee: Samsung Electronics Co., Ltd.
    Inventors: Junhaeng Lee, Hyunsun Park, Yeongjae Choi
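A minimal sketch of 11544549's accumulation buffer: individual update values too small to matter on their own are accumulated, and the weight is updated only when the buffer reaches a threshold. Whether a remainder is carried over after the update is a design choice the patent may handle differently.

```python
class AccumulatedWeightUpdater:
    """Accumulate individual update values for one weight in a buffer and
    apply them only once the accumulated value reaches a threshold."""
    def __init__(self, weight: float, threshold: float = 0.01):
        self.weight = weight
        self.threshold = threshold
        self.buffer = 0.0

    def step(self, individual_update: float) -> None:
        self.buffer += individual_update
        if abs(self.buffer) >= self.threshold:
            self.weight += self.buffer     # apply the accumulated update
            self.buffer = 0.0              # reset (a remainder could also be kept)

updater = AccumulatedWeightUpdater(weight=0.5)
for grad in [0.002, 0.003, -0.001, 0.007]:   # small updates that would vanish alone
    updater.step(grad)
print(updater.weight)   # updated once the accumulated value crossed the threshold
```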
  • Patent number: 11544628
    Abstract: A transformation parameter for transforming source task learning data is adjusted on the basis of a difference between a first distribution of target task learning data, namely a plurality of learning data belonging to a first category of a first task that is the target task, and a second distribution of a plurality of learning data belonging to the first category of the source task learning data, namely learning data belonging to the first category of a second task different from the first task. The source task learning data is transformed based on the adjusted transformation parameter, and a classifier for the first task is generated based on the transformed source task learning data and the target task learning data.
    Type: Grant
    Filed: June 9, 2017
    Date of Patent: January 3, 2023
    Assignee: CANON KABUSHIKI KAISHA
    Inventor: Yusuke Mitarai
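A sketch of the transformation step in 11544628 under a simple assumption: the transformation parameter is an affine scale-and-shift chosen to align the source data's per-feature mean and standard deviation with the target data's, after which a classifier is trained on the combined data. Real embodiments may adjust the parameter differently.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

# Target task learning data for the first category (few samples).
target_cat1 = rng.normal(loc=5.0, scale=1.0, size=(20, 2))
# Source task learning data for the same category (plentiful but shifted).
source_cat1 = rng.normal(loc=0.0, scale=2.0, size=(500, 2))

# Adjust a simple affine transformation parameter from the distribution difference.
scale = target_cat1.std(axis=0) / source_cat1.std(axis=0)
shift = target_cat1.mean(axis=0) - source_cat1.mean(axis=0) * scale
transformed_source = source_cat1 * scale + shift

# Negative-category data for the target task (illustrative).
target_cat0 = rng.normal(loc=-5.0, scale=1.0, size=(20, 2))

# Generate the classifier for the target task from the transformed source data
# plus the target task learning data.
X = np.vstack([transformed_source, target_cat1, target_cat0])
y = np.concatenate([np.ones(len(transformed_source) + len(target_cat1)),
                    np.zeros(len(target_cat0))])
clf = LogisticRegression(max_iter=1000).fit(X, y)
print(clf.predict([[5.0, 5.0], [-5.0, -5.0]]))   # -> [1. 0.]
```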
  • Patent number: 11537868
    Abstract: A method includes a computing system accessing a training sample that includes first sensor data obtained using a first sensor at a first geographic location, and first metadata comprising information relating to the first sensor. The system may train a machine-learning model by generating first map data by processing the training sample using the model and updating the model based on the generated first map data and target map data associated with the first geographic location. The system may then access second sensor data and second metadata, where the second sensor data is obtained using a second sensor. The system may generate second map data associated with a second geographic location by processing the second sensor data and the second metadata using the trained model. A high-definition map may be generated using the second map data.
    Type: Grant
    Filed: November 13, 2017
    Date of Patent: December 27, 2022
    Assignee: Lyft, Inc.
    Inventor: Gil Arditi
  • Patent number: 11537908
    Abstract: Provided are systems and methods for dynamically determining a discipline-specific knowledge value of an agent. In one example, the method may include receiving from an agent, via a computing device, a formulation of content that corresponds to an inquiry, determining a knowledge value for the agent based on one or more of the content and a property of the inquiry with respect to a dynamically defined solution to the inquiry, and storing the determined knowledge value of the agent within a storage device.
    Type: Grant
    Filed: December 16, 2019
    Date of Patent: December 27, 2022
    Assignee: HUMAN DX, LTD
    Inventors: Jayanth Komarneni, Irving Lin
  • Patent number: 11537840
    Abstract: A neural network classifies an input signal. For example, an accelerometer signal may be classified to detect human activity. In a first convolutional layer, two-valued weights are applied to the input signal. In a first two-valued function layer coupled at input to an output of the first convolutional layer, a two-valued function is applied. In a second convolutional layer coupled at input to an output of the first two-valued function layer, weights of the second convolutional layer are applied. In a fully-connected layer coupled at input to an output of the second convolutional layer, two-valued weights of the fully-connected layer are applied. In a second two-valued function layer coupled at input to an output of the fully-connected layer, a two-valued function of the second two-valued function layer is applied. A classifier classifies the input signal based on an output signal of the second two-valued function layer.
    Type: Grant
    Filed: November 13, 2018
    Date of Patent: December 27, 2022
    Assignee: STMICROELECTRONICS S.R.L.
    Inventors: Danilo Pietro Pau, Emanuele Plebani, Fabio Giuseppe De Ambroggi, Floriana Guido, Angelo Bosco
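A compact numpy sketch of the two-valued pipeline in 11537840, with single-channel layers in place of full convolutional banks: binarized weights in the first convolution and the fully-connected layer, a sign function as the two-valued function, and an argmax classifier on the final two-valued output. Shapes and layer sizes are arbitrary.

```python
import numpy as np

def binarize(x):
    """Two-valued function (+1 / -1) used for weights and activations."""
    return np.where(x >= 0, 1.0, -1.0)

def conv1d(signal, kernel):
    """Valid 1-D convolution of a single-channel signal with one kernel."""
    k = len(kernel)
    return np.array([signal[i:i + k] @ kernel for i in range(len(signal) - k + 1)])

rng = np.random.default_rng(0)
signal = rng.normal(size=64)                    # stand-in accelerometer trace

w1   = binarize(rng.normal(size=5))             # first conv layer: two-valued weights
w2   = rng.normal(size=3)                       # second conv layer weights
w_fc = binarize(rng.normal(size=(2, 58)))       # fully-connected layer: two-valued weights

h = binarize(conv1d(signal, w1))                # first conv -> first two-valued function
h = conv1d(h, w2)                               # second conv
out = binarize(w_fc @ h)                        # FC -> second two-valued function
print("predicted class:", int(np.argmax(out)))  # classifier on the two-valued output
```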
  • Patent number: 11521137
    Abstract: In one aspect there is provided a method. The method may include collecting one or more functions that implement the decision logic of a solution. A snapshot of the one or more functions can be generated. The snapshot can include executable code associated with the one or more functions. The solution can be deployed by at least storing the snapshot of the one or more functions to a repository. Systems and articles of manufacture, including computer program products, are also provided.
    Type: Grant
    Filed: April 25, 2017
    Date of Patent: December 6, 2022
    Assignee: Fair Isaac Corporation
    Inventors: Joshua Prismon, Andrei Palskoi, Andrew K. Holland, Fernando Felipe Campos Donati Jorge, Stuart Clarkson Wells
  • Patent number: 11514349
    Abstract: In some embodiments, an apparatus includes a geometric aggregator that receives data for a set of time periods and a location. The data has a first set of first metric values and a set of second metric values for each time period and lacks a mixture of Gaussian distributions. The geometric aggregator calculates a geometric aggregation of the data for each time period to produce a first metric value from a second set of first metric values, having a mixture of Gaussian distributions. The apparatus includes a Gaussian mixture model that predicts a set of Gaussian distributions, each uniquely associated with a season, within a set of histogram values for the data based on the second set of first metric values. The apparatus includes a presentation portion that produces a set of adjusted first metric values based on the set of histogram values, each uniquely associated with a season.
    Type: Grant
    Filed: September 22, 2020
    Date of Patent: November 29, 2022
    Assignee: Topia Limited
    Inventors: Mervi Sepp Rei, Sten Tamkivi, Ardo Illaste, Venkata Surendra Reddy Polam, Okan Polatkan
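A sketch of 11514349's flow under the assumption that the geometric aggregation is a per-period geometric mean of the two metrics and that "adjusting" means removing each season's component mean; `GaussianMixture` from scikit-learn plays the role of the Gaussian mixture model, and all data values are invented.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Two raw metrics per time period for one location (two seasons concatenated);
# neither metric on its own shows a clear mixture structure here.
metric_a = np.concatenate([rng.normal(10, 1, 90), rng.normal(40, 2, 90)])
metric_b = np.concatenate([rng.normal(20, 2, 90), rng.normal(90, 5, 90)])

# Geometric aggregation of the two metrics for each time period.
aggregated = np.sqrt(metric_a * metric_b)

# Fit a Gaussian mixture to the aggregated values; each component is treated
# as uniquely associated with one season.
gmm = GaussianMixture(n_components=2, random_state=0).fit(aggregated.reshape(-1, 1))
season = gmm.predict(aggregated.reshape(-1, 1))

# Produce adjusted metric values by removing each season's component mean.
adjusted = aggregated - gmm.means_[season].ravel()
print(np.round(adjusted[:3], 2), np.round(adjusted[-3:], 2))
```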
  • Patent number: 11514309
    Abstract: Embodiments of the present invention provide a method and apparatus for accelerating distributed training of a deep neural network. The method comprises: based on parallel training, the training of the deep neural network is designed as a distributed training mode. A deep neural network to be trained is divided into multiple sub-networks. A set of training samples is divided into multiple subsets of samples. The training of the deep neural network to be trained is performed with the multiple subsets of samples based on a distributed cluster architecture and a preset scheduling method. The multiple sub-networks are simultaneously trained so as to fulfill the distributed training of the deep neural network. The utilization of the distributed cluster architecture and the preset scheduling method may reduce, through data localization, the effect of network delay on the sub-networks under distributed training; adapt the training strategy in real time; and synchronize the sub-networks trained in parallel.
    Type: Grant
    Filed: December 10, 2018
    Date of Patent: November 29, 2022
    Assignee: Beijing University of Posts and Telecommunications
    Inventors: Jianxin Liao, Jingyu Wang, Jing Wang, Qi Qi, Jie Xu
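The scheduling in 11514309 could be approximated by the data-parallel sketch below: the sample set is split into subsets, each sub-network trains on its local data, and the workers synchronize by averaging their weights. The sequential loop stands in for the distributed cluster, and averaging is only one of several possible synchronization rules.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(scale=0.05, size=1000)

def train_subnetwork(Xs, ys, w, steps=200, lr=0.1):
    """Train one sub-network (here: a linear model) on its own subset of samples."""
    w = w.copy()
    for _ in range(steps):
        w -= lr * Xs.T @ (Xs @ w - ys) / len(ys)
    return w

# Divide the sample set into subsets, one per worker in the cluster.
subsets = np.array_split(np.arange(len(X)), 4)
w_global = np.zeros(3)

for _ in range(5):                        # scheduling rounds
    # Each worker trains its sub-network on local data (parallel on a real cluster).
    local = [train_subnetwork(X[idx], y[idx], w_global) for idx in subsets]
    # Synchronize the sub-networks trained in parallel by averaging their weights.
    w_global = np.mean(local, axis=0)

print(np.round(w_global, 3))   # close to the true weights [1, -2, 0.5]
```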
  • Patent number: 11494621
    Abstract: Implementations detailed herein include a description of a computer-implemented method.
    Type: Grant
    Filed: June 27, 2018
    Date of Patent: November 8, 2022
    Assignee: Amazon Technologies, Inc.
    Inventors: Sudipta Sengupta, Poorna Chand Srinivas Perumalla, Dominic Rajeev Divakaruni, Nafea Bshara, Leo Parker Dirac, Bratin Saha, Matthew James Wood, Andrea Olgiati, Swaminathan Sivasubramanian