Patents Examined by Vincent Gonzales
  • Patent number: 11373102
    Abstract: Various examples are described for using a movement sensor to detect an activity of an infant. In an example, an activity classification system includes a sensor configured to measure the activity of an infant and an external monitor. The monitor receives, from the sensor, a time series of data comprising an inertial measurement for each time period. The monitor determines, from the time series and by using a predictive model, an activity from a list of identified activities. Examples of identified activities include deep sleep, light sleep, sitting, awake, nursing, and bottle feeding. (A brief illustrative sketch follows this entry.)
    Type: Grant
    Filed: April 12, 2019
    Date of Patent: June 28, 2022
    Assignees: THE PROCTER & GAMBLE COMPANY, VERILY LIFE SCIENCES LLC
    Inventors: Anupam Pathak, David He, Marty Gardner, Blanca Arizti
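    The abstract above describes classifying inertial sensor data into infant activities with a predictive model. The following is a minimal sketch of that kind of pipeline; the window statistics, the random-forest classifier, and all names are assumptions for illustration, not the patented method.

```python
# Minimal sketch of the pipeline described in the abstract: classify windows of
# inertial (accelerometer) time-series data into infant activities with a
# predictive model. The feature set and random-forest choice are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

ACTIVITIES = ["deep_sleep", "light_sleep", "sitting", "awake", "nursing", "bottle_feeding"]

def window_features(imu_window: np.ndarray) -> np.ndarray:
    """Summarize one window of (samples x 3-axis) inertial data as simple statistics."""
    return np.concatenate([
        imu_window.mean(axis=0),                            # mean acceleration per axis
        imu_window.std(axis=0),                             # movement intensity per axis
        np.abs(np.diff(imu_window, axis=0)).mean(axis=0),   # jerkiness per axis
    ])

# Hypothetical training data: one labeled window per time period.
rng = np.random.default_rng(0)
windows = [rng.normal(size=(50, 3)) for _ in range(300)]
labels = rng.integers(len(ACTIVITIES), size=300)

X = np.stack([window_features(w) for w in windows])
model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, labels)

# The external monitor would receive a new window from the sensor and classify it.
new_window = rng.normal(size=(50, 3))
print(ACTIVITIES[model.predict(window_features(new_window)[None, :])[0]])
```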
  • Patent number: 11361252
    Abstract: Methods and systems for using reinforcement learning to optimize promotions. A promotion can be offered to a customer for a prepaid calling card using a reinforcement learning model with a sensitivity parameter. The reinforcement learning model can estimate a time period during which the customer will purchase the prepaid calling card. The customer's reaction to the promotion can be observed. A reward or a penalty can be collected based on the customer's reaction. The reinforcement learning model can be adapted based on the reward or the penalty to optimize the timing of the promotion by estimating a new time period during which the customer will purchase the prepaid calling card. The reward proxy and/or the penalty proxy can comprise frequency of usage. (A brief illustrative sketch follows this entry.)
    Type: Grant
    Filed: December 7, 2020
    Date of Patent: June 14, 2022
    Assignee: THE BOSTON CONSULTING GROUP, INC.
    Inventors: Muhammad Arjumand Masood, Arun Karthik Ravindran
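    The abstract above describes adapting a promotion's timing from observed rewards and penalties. Below is a minimal sketch using an epsilon-greedy bandit over candidate purchase windows; the window choices, the epsilon parameter, and the reward values are assumptions, not the patented model.

```python
# Minimal sketch of the reinforcement-learning loop in the abstract: choose when
# to offer a prepaid-card promotion, observe the customer's reaction, collect a
# reward or penalty, and adapt the timing estimate. The epsilon-greedy bandit
# formulation and all parameter values are assumptions, not the patented method.
import random

TIME_WINDOWS = ["0-7 days", "8-14 days", "15-30 days"]   # candidate purchase windows
values = {w: 0.0 for w in TIME_WINDOWS}                   # estimated value per window
counts = {w: 0 for w in TIME_WINDOWS}
epsilon = 0.1                                             # exploration (sensitivity) parameter, assumed

def choose_window():
    if random.random() < epsilon:
        return random.choice(TIME_WINDOWS)                # explore
    return max(values, key=values.get)                    # exploit the current estimate

def observe_reaction(window):
    """Placeholder for the real observation; the reward proxy could be usage frequency."""
    return random.choice([1.0, -0.5])                     # reward or penalty proxy

for _ in range(1000):
    w = choose_window()
    r = observe_reaction(w)
    counts[w] += 1
    values[w] += (r - values[w]) / counts[w]              # incremental mean update

print("estimated best window to offer the promotion:", max(values, key=values.get))
```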
  • Patent number: 11354585
    Abstract: An approach is provided in which an information handling system uses historical time durations of deprecated resources to compute an increased probability window. The increased probability window corresponds to an increase in probability that a currently active resource is likely to be active at a future point in time. Next, the information handling system identifies a set of active resources that have active time durations within the increased probability window and, in turn, marks the set of resources as a set of forecasted active resources. In turn, the information handling system generates a resource cost forecast based on the set of forecasted active resources. (A brief illustrative sketch follows this entry.)
    Type: Grant
    Filed: March 19, 2019
    Date of Patent: June 7, 2022
    Assignee: International Business Machines Corporation
    Inventors: Ankur Tagra, Harish Nayak
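    The abstract above describes deriving an increased probability window from the historical lifetimes of deprecated resources and forecasting cost from the active resources inside that window. The sketch below uses a simple percentile window, which is an assumption; the patent does not specify how the window is computed.

```python
# Minimal sketch of the forecasting idea in the abstract: use the historical
# lifetimes of deprecated (already-deleted) resources to derive a window of ages
# at which a resource is likely to still be active at a future time, then
# forecast cost from the currently active resources that fall in that window.
# The percentile-based window and the example figures are assumptions.
import numpy as np

historical_lifetimes_days = np.array([3, 5, 40, 45, 60, 90, 120, 150, 180, 365])

# Resources whose current age falls between these percentiles are treated as
# having an increased probability of remaining active at the forecast horizon.
window_low, window_high = np.percentile(historical_lifetimes_days, [25, 75])

active_resources = [                      # (name, current age in days, cost per day)
    ("vm-1", 10, 2.0),
    ("vm-2", 70, 5.0),
    ("db-1", 100, 8.0),
    ("cache-1", 400, 1.5),
]

forecasted_active = [r for r in active_resources if window_low <= r[1] <= window_high]
horizon_days = 30
forecast_cost = sum(cost * horizon_days for _, _, cost in forecasted_active)
print(f"window: {window_low:.0f}-{window_high:.0f} days, 30-day forecast: ${forecast_cost:.2f}")
```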
  • Patent number: 11328218
    Abstract: A system and method for identifying and predicting subjective attributes for entities (e.g., media clips, movies, television shows, images, newspaper articles, blog entries, persons, organizations, commercial businesses, etc.) are disclosed. In one aspect, subjective attributes for a first media item are identified based on a reaction to the first media item, and relevancy scores for the subjective attributes with respect to the first media item are determined. A classifier is trained using (i) a training input comprising a set of features for the first media item, and (ii) a target output for the training input, the target output comprising the respective relevancy scores for the subjective attributes with respect to the first media item. (A brief illustrative sketch follows this entry.)
    Type: Grant
    Filed: November 6, 2017
    Date of Patent: May 10, 2022
    Assignee: Google LLC
    Inventors: Hrishikesh Aradhye, Sanketh Shetty
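    The abstract above describes training a classifier whose target output is a vector of relevancy scores for subjective attributes of a media item. The sketch below uses multi-output ridge regression and made-up attribute names; both are assumptions for illustration.

```python
# Minimal sketch of the training setup in the abstract: the training input is a
# feature vector for a media item and the target output is the vector of
# relevancy scores of subjective attributes (e.g. "funny", "scary") for that item.
# The ridge-regression choice, feature sizes, and attribute names are assumptions.
import numpy as np
from sklearn.linear_model import Ridge

ATTRIBUTES = ["funny", "scary", "heartwarming"]

rng = np.random.default_rng(0)
item_features = rng.normal(size=(200, 16))                   # e.g. audio/visual features per media item
relevancy_scores = rng.random(size=(200, len(ATTRIBUTES)))   # derived from viewer reactions

classifier = Ridge(alpha=1.0).fit(item_features, relevancy_scores)

new_item = rng.normal(size=(1, 16))
predicted = classifier.predict(new_item)[0]
for attr, score in zip(ATTRIBUTES, predicted):
    print(f"{attr}: {score:.2f}")
```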
  • Patent number: 11315152
    Abstract: A method and system for product recommendation. The method includes: defining, by a computing device, a hierarchical Bayesian model having a latent factor; training, by the computing device, the hierarchical Bayesian model using a plurality of training events to obtain a trained hierarchical Bayesian model, each event comprising a feature of a product, a brand of the product, a feature of a user, and an action of the user upon the product; predicting, by the computing device, a possibility of a target user performing an action on a target product using the trained hierarchical Bayesian model; and providing a product recommendation to the target user based on the possibility. (A brief illustrative sketch follows this entry.)
    Type: Grant
    Filed: February 13, 2019
    Date of Patent: April 26, 2022
    Assignees: BEIJING JINGDONG SHANGKE INFORMATION TECHNOLOGY Co., Ltd., JD.COM AMERICAN TECHNOLOGIES CORPORATION
    Inventors: Zhexuan Xu, Yongjun Bao
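    The abstract above describes a hierarchical Bayesian model for predicting the possibility of a user action on a product. The sketch below illustrates the hierarchical flavor with a conjugate Beta-Binomial estimate of per-brand purchase rates shrunk toward a shared prior; the data and the prior strength are assumptions, and the patented model is more general.

```python
# Minimal sketch of a hierarchical Bayesian flavor of the recommendation idea in
# the abstract: per-brand purchase rates share a Beta prior (a brand-level latent
# factor), so brands with few observed events are shrunk toward the global rate.
# The conjugate Beta-Binomial setup and all numbers are illustrative assumptions.
brand_events = {                 # (purchases, impressions) observed per brand
    "brand_a": (30, 100),
    "brand_b": (2, 5),           # few events: the estimate should lean on the prior
    "brand_c": (1, 50),
}

# Shared prior fitted crudely from the pooled data (the higher level of the
# hierarchy); alpha0/beta0 are chosen so the prior mean matches the pooled rate.
pooled_rate = sum(p for p, n in brand_events.values()) / sum(n for p, n in brand_events.values())
prior_strength = 20.0
alpha0, beta0 = pooled_rate * prior_strength, (1 - pooled_rate) * prior_strength

for brand, (purchases, impressions) in brand_events.items():
    posterior_mean = (alpha0 + purchases) / (prior_strength + impressions)
    print(f"P(purchase | {brand}) ~ {posterior_mean:.3f}")   # used to rank recommendations
```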
  • Patent number: 11315035
    Abstract: Computer-implemented methods are provided for implementing training of a machine learning model in a heterogeneous processing system comprising a host computer operatively interconnected with an accelerator unit. The training includes a stochastic optimization process for optimizing a function of a training data matrix X, having data elements X_i,j with row coordinates i = 1 to n and column coordinates j = 1 to m, and a model vector w having elements w_j. For successive batches of the training data, defined by respective subsets of one of the row coordinates and column coordinates, random numbers associated with respective coordinates in a current batch b are generated in the host computer and sent to the accelerator unit. In parallel with generating the random numbers for batch b, batch b is copied from the host computer to the accelerator unit. (A brief illustrative sketch follows this entry.)
    Type: Grant
    Filed: December 10, 2018
    Date of Patent: April 26, 2022
    Assignee: INTERNATIONAL BUSINESS MACHINES CORPORATION
    Inventors: Thomas Parnell, Celestine Duenner, Charalampos Pozidis, Dimitrios Sarigiannis
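    The abstract above describes generating the random numbers for a training batch on the host while that batch is copied to the accelerator. The sketch below imitates the overlap with a thread pool; the data shapes, the coordinate-permutation random numbers, and the placeholder update are assumptions.

```python
# Minimal sketch of the pipelining idea in the abstract: for each training batch b,
# the host generates the random numbers needed for batch b while batch b is being
# copied to the accelerator, so the two steps overlap instead of running serially.
# The thread-pool simulation stands in for real host/accelerator transfers.
from concurrent.futures import ThreadPoolExecutor
import numpy as np

X = np.random.rand(8, 1000, 32)         # training data split into 8 batches (assumed shapes)
w = np.zeros(32)                        # model vector

def generate_random_numbers(batch):
    # Random coordinate order for the stochastic optimizer, one number per column.
    return np.random.permutation(batch.shape[1])

def copy_to_accelerator(batch):
    return batch.copy()                 # placeholder for a host-to-device transfer

def accelerator_update(batch, order, w):
    # Placeholder stochastic update using the pre-generated random ordering.
    for j in order[:10]:
        w[j] -= 0.01 * batch[:, j].mean()
    return w

with ThreadPoolExecutor(max_workers=2) as pool:
    for b in range(X.shape[0]):
        rand_future = pool.submit(generate_random_numbers, X[b])   # on the host
        copy_future = pool.submit(copy_to_accelerator, X[b])       # in parallel with the above
        w = accelerator_update(copy_future.result(), rand_future.result(), w)
```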
  • Patent number: 11315006
    Abstract: A method for inferring patterns in multi-dimensional image data comprises providing a recursive network of sub-networks with a parent feature node and at least two child feature nodes, wherein each sub-network is associated with a distinct subset of the space; configuring nodes of the sub-networks with a posterior distribution component; receiving image data feature input at the final child feature nodes; propagating node activation through the network layer hierarchy in a manner consistent with the node connections of the sub-networks of the network and the posterior predictions of child nodes; and outputting a parent feature node selection to an inferred output.
    Type: Grant
    Filed: March 17, 2017
    Date of Patent: April 26, 2022
    Assignee: Vicarious FPC, Inc.
    Inventors: Dileep George, Kenneth Kansky, D. Scott Phoenix, Bhaskara Marthi, Christopher Laan, Wolfgang Lehrach
  • Patent number: 11295202
    Abstract: An apparatus comprises a mass storage unit and a plurality of circuit modules including a machine learning module, a programmable state machine module, and input/output interfaces. Switching circuitry is configured to selectively couple the circuit modules. Configuration circuitry is configured to access configuration data from the mass storage unit and to operate the switching circuitry to connect the circuit modules according to the configuration data.
    Type: Grant
    Filed: February 19, 2015
    Date of Patent: April 5, 2022
    Assignee: Seagate Technology LLC
    Inventors: Jon Trantham, Kevin Arthur Gomez, Frank Dropps, Antoine Khoueir, Scott Younger
  • Patent number: 11295201
    Abstract: Embodiments of the invention relate to a time-division multiplexed neurosynaptic module with implicit memory addressing for implementing a neural network. One embodiment comprises maintaining neuron attributes for multiple neurons and maintaining incoming firing events for different time steps. For each time step, incoming firing events for said time step are integrated in a time-division multiplexing manner. Incoming firing events are integrated based on the neuron attributes maintained. For each time step, the neuron attributes maintained are updated in parallel based on the integrated incoming firing events for said time step. (A brief illustrative sketch follows this entry.)
    Type: Grant
    Filed: March 29, 2019
    Date of Patent: April 5, 2022
    Assignee: International Business Machines Corporation
    Inventors: John V. Arthur, Bernard V. Brezzo, Leland Chang, Daniel J. Friedman, Paul A. Merolla, Dharmendra S. Modha, Robert K. Montoye, Jae-sun Seo, Jose A. Tierno
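    The abstract above describes time-division multiplexing, where one circuit integrates incoming firing events for many neurons within each time step and then updates the neuron attributes. The sketch below is a toy software analogue; the weights, threshold, and reset rule are assumptions.

```python
# Minimal sketch of the time-division multiplexing idea in the abstract: one
# physical circuit serves many logical neurons by integrating each neuron's
# incoming firing events in turn within a time step, then updating all neuron
# attributes (membrane potentials) for that step. Values and thresholds are assumed.
import numpy as np

num_neurons, num_axons, num_steps = 4, 6, 3
weights = np.random.randint(0, 2, size=(num_axons, num_neurons))   # synaptic connections
potential = np.zeros(num_neurons)                                   # neuron attributes
threshold = 3

# Incoming firing events maintained per time step (one row of axon spikes per step).
incoming = np.random.randint(0, 2, size=(num_steps, num_axons))

for t in range(num_steps):
    for neuron in range(num_neurons):               # multiplexed: one neuron at a time
        potential[neuron] += incoming[t] @ weights[:, neuron]
    fired = potential >= threshold                  # update attributes for this step
    potential[fired] = 0                            # reset the neurons that fired
    print(f"step {t}: fired neurons {np.flatnonzero(fired).tolist()}")
```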
  • Patent number: 11288595
    Abstract: The system presented here can create a new machine learning model by improving and combining existing machine learning models in a modular way. By combining existing machine learning models, the system can avoid the step of training a new machine learning model. Further, by combining existing machine learning models in a modular way, the system can selectively train only a module, i.e., a part, of the new machine learning model. Using the disclosed system, the expensive steps of gathering 8 TB of data and using the data to train the new machine learning model across 16,000 processors for three days can be entirely avoided, or can be reduced by a half, a third, etc., depending on the size of the module requiring training. (A brief illustrative sketch follows this entry.)
    Type: Grant
    Filed: February 13, 2018
    Date of Patent: March 29, 2022
    Assignee: Groq, Inc.
    Inventors: Jonathan Alexander Ross, Douglas Wightman
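    The abstract above describes building a new model by combining existing models and training only the connecting module. The sketch below freezes two stand-in feature extractors and fits only a small combiner; the extractors and the logistic combiner are assumptions for illustration.

```python
# Minimal sketch of the modular-combination idea in the abstract: reuse two
# already-trained models as frozen feature extractors and train only the small
# module that combines their outputs, so expensive end-to-end training is avoided.
# The frozen models are stand-ins; only the combiner's weights are updated.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Pretend these weight matrices are two existing, already-trained models (frozen).
model_a = rng.normal(size=(20, 8))
model_b = rng.normal(size=(20, 8))

X = rng.normal(size=(500, 20))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

# Frozen forward passes: no parameters of model_a / model_b are updated.
features = np.concatenate([np.tanh(X @ model_a), np.tanh(X @ model_b)], axis=1)

# Only this small combining module is trained.
combiner = LogisticRegression(max_iter=1000).fit(features, y)
print("combined-model accuracy:", combiner.score(features, y))
```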
  • Patent number: 11281966
    Abstract: A circuit for performing neural network computations for a neural network, the circuit comprising: a systolic array comprising a plurality of cells; a weight fetcher unit configured to, for each of the plurality of neural network layers: send, for the neural network layer, a plurality of weight inputs to cells along a first dimension of the systolic array; and a plurality of weight sequencer units, each weight sequencer unit coupled to a distinct cell along the first dimension of the systolic array, the plurality of weight sequencer units configured to, for each of the plurality of neural network layers: shift, for the neural network layer, the plurality of weight inputs to cells along the second dimension of the systolic array over a plurality of clock cycles and where each cell is configured to compute a product of an activation input and a respective weight input using multiplication circuitry.
    Type: Grant
    Filed: August 2, 2018
    Date of Patent: March 22, 2022
    Assignee: Google LLC
    Inventor: Jonathan Ross
  • Patent number: 11270230
    Abstract: Provided are a system and methodology for iteratively transforming data between multiple data sets. Normalizing the data in this way enables uniform interpretation and presentation of the data regardless of which machine learning model produced it. (A brief illustrative sketch follows this entry.)
    Type: Grant
    Filed: April 12, 2021
    Date of Patent: March 8, 2022
    Assignee: Socure, Inc.
    Inventors: Pablo Y. Abreu, Omar Gutierrez, Ali Haddad, Stanislav Palatnik
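    The abstract above describes normalizing scores so they read uniformly regardless of the producing model. The sketch below uses percentile (rank) normalization, which is an assumption; the patent only states that normalization enables uniform interpretation.

```python
# Minimal sketch of the normalization idea in the abstract: scores produced by
# different machine learning models are transformed onto a common 0-1 scale so
# they can be interpreted and presented uniformly. Rank normalization is an
# assumption used only for illustration.
import numpy as np

def normalize(scores: np.ndarray) -> np.ndarray:
    """Map raw model scores to their empirical percentile in [0, 1]."""
    ranks = scores.argsort().argsort()              # rank of each score
    return ranks / (len(scores) - 1)

model_a_scores = np.array([0.2, 5.0, 3.1, 9.7])     # one model's raw output range
model_b_scores = np.array([-2.0, 0.1, 0.4, 0.9])    # a different model's range

print(normalize(model_a_scores))                    # both land on the same 0-1 scale
print(normalize(model_b_scores))
```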
  • Patent number: 11269974
    Abstract: Embodiments of the present invention provide a divide-and-conquer algorithm that divides expanded data across a cluster of machines. Each portion of the data is used to train logistic classification models in parallel, and the models are then combined at the end of the training phase to create a single ordinal model. The training scheme removes the need for synchronization between the parallel learning algorithms during the training period, making training on large datasets feasible without supercomputers or computers with specialized processing capabilities. Embodiments of the present invention also provide improved estimation and prediction performance of the learned model compared to existing techniques for training models on large datasets. (A brief illustrative sketch follows this entry.)
    Type: Grant
    Filed: October 31, 2017
    Date of Patent: March 8, 2022
    Assignee: Amazon Technologies, Inc.
    Inventors: Sougata Chaudhuri, Lu Tang, Abraham Hossain Bagherjeiran
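    The abstract above describes partitioning data, training logistic models on each partition independently, and combining them into one model. The sketch below averages per-partition logistic coefficients; the binary simplification of the ordinal expansion and the averaging rule are assumptions.

```python
# Minimal sketch of the divide-and-conquer scheme in the abstract: the (expanded)
# training data is split into partitions, a logistic model is fit on each partition
# independently (no synchronization), and the per-partition parameters are averaged
# into a single final model. The ordinal expansion is simplified here to a single
# binary "label > threshold" problem for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(3000, 5))
y = (X @ np.array([1.0, -0.5, 0.2, 0.0, 0.3]) > 0).astype(int)   # one binary expansion of an ordinal label

partitions = np.array_split(np.arange(len(X)), 4)                 # one partition per "machine"
coefs, intercepts = [], []
for idx in partitions:                                             # would run in parallel in practice
    m = LogisticRegression(max_iter=1000).fit(X[idx], y[idx])
    coefs.append(m.coef_[0])
    intercepts.append(m.intercept_[0])

# Combine at the end of training: average the independently learned parameters.
w = np.mean(coefs, axis=0)
b = np.mean(intercepts)
pred = (X @ w + b > 0).astype(int)
print("combined-model accuracy:", (pred == y).mean())
```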
  • Patent number: 11263539
    Abstract: A method and system for distributed machine learning and model training are disclosed. In particular, a finite asynchronous parallel training scheme is described. The finite asynchronous parallel training takes advantage of the benefits of both asynchronous parallel training and synchronous parallel training. The computation delays in the various distributed computation nodes are further taken into account when training parameters are updated during each round of iterative training. The disclosed method and system facilitate increases in model training speed and efficiency. (A brief illustrative sketch follows this entry.)
    Type: Grant
    Filed: February 4, 2019
    Date of Patent: March 1, 2022
    Assignee: Tencent Technology (Shenzhen) Company Limited
    Inventors: Jiawei Jiang, Bin Cui, Ming Huang, Pin Xiao, Benlong Hu, Lele Yu
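    The abstract above describes a finite (bounded) asynchronous scheme that accounts for computation delays when parameters are updated. The sketch below simulates a bounded-staleness parameter server that down-weights delayed updates; the staleness bound and the weighting rule are assumptions.

```python
# Minimal sketch of a bounded ("finite") asynchronous idea: workers send updates
# asynchronously, but the server only accepts an update if the worker's model
# version is at most MAX_STALENESS steps old, and it down-weights stale updates.
# The staleness weighting and the toy objective are assumptions, not the patent's scheme.
import numpy as np

MAX_STALENESS = 2
TARGET = np.array([1.0, 2.0, 3.0])     # optimum the workers pull the model toward
w = np.zeros(3)                        # shared model parameters on the server
server_version = 0
worker_versions = [0, 0, 0]            # model version each worker last pulled
rng = np.random.default_rng(0)

for step in range(300):
    k = int(rng.integers(3))                         # workers report back in arbitrary order
    staleness = server_version - worker_versions[k]
    if staleness > MAX_STALENESS:
        worker_versions[k] = server_version          # too stale: refresh this worker, skip its update
        continue
    # For simplicity the gradient is computed from the current w rather than a stale copy.
    grad = w - TARGET + rng.normal(scale=0.1, size=3)   # noisy gradient of 0.5*||w - TARGET||^2
    w -= 0.1 * grad / (1 + staleness)                # delay-aware (down-weighted) update
    server_version += 1
    worker_versions[k] = server_version

print("learned parameters:", np.round(w, 2))         # approaches TARGET
```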
  • Patent number: 11263512
    Abstract: A novel and useful neural network (NN) processing core adapted to implement artificial neural networks (ANNs) and incorporating strictly separate control and data planes. The NN processor is constructed from self-contained computational units organized in a hierarchical architecture. The homogeneity enables simpler management and control of similar computational units, aggregated in multiple levels of hierarchy. Computational units are designed with as little overhead as possible, with additional features and capabilities aggregated at higher levels in the hierarchy. On-chip memory provides storage for content inherently required for basic operation at a particular hierarchy level and is coupled with the computational resources in an optimal ratio. Lean control provides just enough signaling to manage only the operations required at a particular hierarchical level. Dynamic resource assignment agility is provided and can be adjusted as required depending on resource availability and the capacity of the device.
    Type: Grant
    Filed: April 3, 2018
    Date of Patent: March 1, 2022
    Inventors: Avi Baum, Or Danon, Hadar Zeitlin, Daniel Ciubotariu, Rami Feig
  • Patent number: 11257005
    Abstract: A training method and a training system for a machine learning system are provided.
    Type: Grant
    Filed: August 31, 2018
    Date of Patent: February 22, 2022
    Inventor: Jun Zhou
  • Patent number: 11257003
    Abstract: The schematic flow chart diagrams included herein are generally set forth as logical flow chart diagrams. As such, the depicted order and labeled steps are indicative of one embodiment of the presented method. Other steps and methods may be conceived that are equivalent in function, logic, or effect to one or more steps, or portions thereof, of the illustrated method. Additionally, the format and symbols employed are provided to explain the logical steps of the method and are understood not to limit the scope of the method. Although various arrow types and line types may be employed in the flow chart diagrams, they are understood not to limit the scope of the corresponding method.
    Type: Grant
    Filed: May 9, 2018
    Date of Patent: February 22, 2022
    Inventors: Kalpit Jain, Neil Reddy Chintala, Zhi Feng Huang, Kaushal Mehta, Alok Nandan
  • Patent number: 11244242
    Abstract: Systems, apparatuses, methods, and computer-readable media are provided for distributed machine learning (ML) training using heterogeneous compute nodes in a heterogeneous computing environment, where the heterogeneous compute nodes are connected to a master node via respective wireless links. ML computations are performed by individual heterogeneous compute nodes on respective training datasets, and the master node combines the outputs of the ML computations obtained from the individual heterogeneous compute nodes. The ML computations are balanced across the heterogeneous compute nodes based on knowledge of the network conditions and operational constraints experienced by the heterogeneous compute nodes. Other embodiments may be described and/or claimed. (A brief illustrative sketch follows this entry.)
    Type: Grant
    Filed: December 28, 2018
    Date of Patent: February 8, 2022
    Assignee: Intel Corporation
    Inventors: Saurav Prakash, Sagar Dhakal, Yair Yona, Nageen Himayat, Shilpa Talwar
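    The abstract above describes balancing ML computations across heterogeneous nodes using knowledge of their network conditions and constraints. The sketch below assigns training-data shares in proportion to an assumed effective-throughput estimate; the formula and the node figures are illustrative assumptions.

```python
# Minimal sketch of the load-balancing idea in the abstract: the master assigns
# each heterogeneous compute node a share of the training data in proportion to
# its effective throughput (compute speed discounted by its wireless link quality),
# so all nodes finish their ML computation at roughly the same time. The specific
# throughput formula and the node figures are assumptions.
nodes = {                    # samples/sec of compute, and link quality in [0, 1]
    "phone":   {"compute": 50.0,  "link": 0.6},
    "laptop":  {"compute": 200.0, "link": 0.9},
    "edge_gw": {"compute": 120.0, "link": 0.4},
}

total_samples = 10_000
throughput = {n: v["compute"] * v["link"] for n, v in nodes.items()}
total = sum(throughput.values())

assignment = {n: round(total_samples * t / total) for n, t in throughput.items()}
print(assignment)            # each node gets a training-data share matched to its capability
```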
  • Patent number: 11245665
    Abstract: Methods are taught for creating training data for a learning algorithm, training the learning algorithm with the training data, and using the trained learning algorithm to suggest domain names to users. A domain name registrar may store activities of a user on a registrar website. Preferably, domain name searches, selected suggested domain names, and domain names registered to the user are stored as the training data in a training database. The training data may be stored so that earlier activities act as inputs to the learning algorithm while later activities are the expected outputs of the learning algorithm. Once trained, the learning algorithm may receive activities of other users and suggest domain names to the other users based on their activities. (A brief illustrative sketch follows this entry.)
    Type: Grant
    Filed: January 28, 2019
    Date of Patent: February 8, 2022
    Assignee: Go Daddy Operating Company, LLC
    Inventors: Wei-Cheng Lai, Yu Tian, Wenbo Wang, Chungwei Yen
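    The abstract above describes turning stored registrar-site activities into training pairs in which earlier activities are inputs and later activities are expected outputs. The sketch below builds such pairs from a toy activity log; the field names and the example log are assumptions.

```python
# Minimal sketch of the training-data construction in the abstract: for each user,
# earlier activities on the registrar site (searches, clicked suggestions) become
# the learning algorithm's input, and later activities (registered domain names)
# become its expected output. Field names and the example log are assumptions.
from collections import defaultdict

activity_log = [   # (user, timestamp, activity_type, value)
    ("u1", 1, "search", "best coffee shop"),
    ("u1", 2, "selected_suggestion", "bestcoffee.shop"),
    ("u1", 3, "registered", "mycoffeebar.com"),
    ("u2", 1, "search", "yoga studio austin"),
    ("u2", 2, "registered", "austinyoga.studio"),
]

by_user = defaultdict(list)
for user, ts, kind, value in sorted(activity_log, key=lambda r: (r[0], r[1])):
    by_user[user].append((kind, value))

training_examples = []
for user, events in by_user.items():
    inputs = [v for k, v in events if k != "registered"]        # earlier activities
    outputs = [v for k, v in events if k == "registered"]       # later registrations
    if inputs and outputs:
        training_examples.append({"input": inputs, "expected_output": outputs})

print(training_examples)     # stored in the training database for the learning algorithm
```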
  • Patent number: 11243994
    Abstract: By formalizing a specific company's internal knowledge and terminology, the ontology programming accounts for linguistic meaning to surface relevant and important content for analysis. The ontology is built on the premise that meaningful terms are detected in the corpus and then classified according to specific semantic concepts, or entities. Once the main terms are defined, direct relations or linkages can be formed between these terms and their associated entities. The relations are then grouped into themes, which are groups or abstractions that contain synonymous relations. The disclosed ontology programming adapts to the language used in a specific domain, including linguistic patterns and properties such as word order, relationships between terms, and syntactical variations. The ontology programming automatically trains itself to understand the domain or environment of the communication data by processing and analyzing a defined corpus of communication data. (A brief illustrative sketch follows this entry.)
    Type: Grant
    Filed: January 9, 2019
    Date of Patent: February 8, 2022
    Assignee: Verint Systems Ltd
    Inventor: Roni Romano
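    The abstract above describes detecting terms, classifying them into entities, linking them into relations, and grouping synonymous relations into themes. The sketch below walks those steps on a toy corpus; the lexicon, synonym table, and corpus are assumptions used only for illustration.

```python
# Minimal sketch of the ontology pipeline in the abstract: detect meaningful terms
# in a corpus, classify them into entities, link co-occurring terms into relations,
# and group synonymous relations into themes. The toy corpus, entity lexicon, and
# synonym table are all assumptions.
from collections import Counter
from itertools import combinations

corpus = [
    "customer cancel account",
    "customer close account",
    "agent reset password",
]
entity_of = {"customer": "PERSON", "agent": "PERSON",
             "account": "PRODUCT", "password": "PRODUCT",
             "cancel": "ACTION", "close": "ACTION", "reset": "ACTION"}
synonyms = {"close": "cancel"}          # maps a term to its canonical synonym

# 1. Detect meaningful terms: words that appear in the corpus and in the lexicon.
terms = [w for doc in corpus for w in doc.split() if w in entity_of]
print("term frequencies:", Counter(terms))

# 2. Form relations: pairs of terms that co-occur in the same utterance.
relations = set()
for doc in corpus:
    doc_terms = [w for w in doc.split() if w in entity_of]
    relations.update(combinations(doc_terms, 2))

# 3. Group relations into themes by collapsing synonymous terms.
themes = Counter(tuple(synonyms.get(t, t) for t in rel) for rel in relations)
print("themes:", themes)     # e.g. ("customer", "cancel") covers both "cancel" and "close"
```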