Abstract: The technology disclosed relates to methods for partitioning sets of features for a Bayesian classifier, finding a data partition that makes classification faster and more accurate while discovering and accounting for feature dependence among sets of features in the data set. It relates to computing class entropy scores for a class label across all tuples that share a feature subset, arranging the tuples in non-decreasing order of entropy score for the class label, and constructing a data partition that offers the highest improvement in predictive accuracy for the data set. Also disclosed is a method for partitioning a complete set of records of features in a batch computation, computing increasing predictive power; and a method that starts with singleton partitions and uses an iterative process to construct a data partition that offers the highest improvement in predictive accuracy for the data set.
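The entropy-ordering step described above can be sketched as follows. This is a minimal illustration, not the patented method: the function names, the toy records, and the choice of Shannon entropy in bits are all assumptions; the abstract only specifies grouping tuples by a shared feature subset, scoring each group's class-label entropy, and ordering groups by non-decreasing entropy.

```python
import math
from collections import Counter, defaultdict

def class_entropy(labels):
    """Shannon entropy (bits) of the class-label distribution."""
    counts = Counter(labels)
    total = len(labels)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def rank_feature_subsets(records, subset, label_key):
    """Group records by their values on `subset`, score each group's
    class entropy, and return (entropy, group-key) pairs in
    non-decreasing entropy order."""
    groups = defaultdict(list)
    for rec in records:
        key = tuple(rec[f] for f in subset)
        groups[key].append(rec[label_key])
    scored = [(class_entropy(labels), key) for key, labels in groups.items()]
    return sorted(scored)

# Hypothetical toy data: one feature subset ("a",) and a class label "y".
data = [
    {"a": 0, "b": 0, "y": "x"},
    {"a": 0, "b": 0, "y": "x"},
    {"a": 1, "b": 0, "y": "x"},
    {"a": 1, "b": 0, "y": "z"},
]
ranked = rank_feature_subsets(data, ("a",), "y")
# The pure group a=0 (entropy 0) sorts before the evenly split group a=1 (entropy 1).
```

Low-entropy groups are those where the feature subset already determines the class well, which is why ordering by non-decreasing entropy surfaces the most predictive partitions first.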
Abstract: Provided is a production amount prediction system including: a storage unit which stores a production amount prediction model based on resources information including a resources amount obtained in a previously drilled wellbore and a resources recovery probability in the vicinity thereof; an input unit which receives a trajectory coordinate of a planned wellbore as an input; a production amount prediction unit which calculates a production amount of the planned wellbore based on the production amount prediction model, using a degree of influence of the previously drilled wellbore on the planned wellbore as at least one parameter; and a display unit which displays the production amount of resources of the planned wellbore calculated by the production amount prediction unit.
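One simple way to use a "degree of influence" of previous wellbores as a model parameter is an influence-weighted average. The sketch below is purely illustrative: the inverse-distance influence model, the `eps` smoothing parameter, and all field names are assumptions, since the abstract does not specify how the prediction model or the influence parameter is computed.

```python
import math

def predict_production(planned_xy, previous_wells, eps=1.0):
    """Estimate the planned wellbore's production as an average of
    previous wellbores' (amount * recovery probability), weighted by an
    assumed inverse-distance degree of influence."""
    num = den = 0.0
    for well in previous_wells:
        dist = math.dist(planned_xy, well["xy"])
        influence = 1.0 / (dist + eps)   # assumed influence model
        num += influence * well["amount"] * well["recovery_prob"]
        den += influence
    return num / den if den else 0.0

# Hypothetical previous wellbores near a planned trajectory coordinate.
wells = [
    {"xy": (0.0, 0.0), "amount": 100.0, "recovery_prob": 0.8},
    {"xy": (10.0, 0.0), "amount": 50.0, "recovery_prob": 0.6},
]
estimate = predict_production((1.0, 0.0), wells)
# The nearby wellbore dominates, so the estimate sits close to 100 * 0.8.
```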
Abstract: Embodiments relate to constructing a tree-shaped Bayesian network from variables associated with conditional dependencies in a given data set, the constructing being performed by a plurality of processors in parallel. An aspect includes assigning a plurality of variables as nodes to a respective plurality of processors. Another aspect includes operating the plurality of processors in a parallel manner to determine a correlation for each pair of nodes. Another aspect includes randomly selecting M variables as primary nodes, defining (M+1) sub-trees. Another aspect includes, in each sub-tree, operating the plurality of processors in a parallel manner to determine a correlation for each remaining node with each of the primary nodes and to allocate each remaining node to one of the (M+1) sub-trees.
Type:
Grant
Filed:
February 3, 2014
Date of Patent:
June 7, 2016
Assignee:
INTERNATIONAL BUSINESS MACHINES CORPORATION
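The parallel allocation step in the abstract above (scoring each remaining node against each primary node and assigning it to a sub-tree) can be sketched roughly as follows. This is a simplified illustration, not the patented method: it uses Pearson correlation, a thread pool in place of the plurality of processors, and assigns each remaining node to its best-correlated primary's sub-tree, ignoring the extra sub-tree in the abstract's (M+1) count; all names and the toy data are assumptions.

```python
from concurrent.futures import ThreadPoolExecutor
import statistics

def correlation(xs, ys):
    """Absolute Pearson correlation between two variables' samples."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return abs(cov / (sx * sy)) if sx and sy else 0.0

def allocate_to_subtrees(data, primaries):
    """Score every (remaining node, primary node) pair in parallel and
    assign each remaining node to its best-correlated primary."""
    remaining = [v for v in data if v not in primaries]
    pairs = [(r, p) for r in remaining for p in primaries]
    with ThreadPoolExecutor() as pool:
        scores = list(pool.map(lambda rp: correlation(data[rp[0]], data[rp[1]]), pairs))
    best = {}
    for (r, p), s in zip(pairs, scores):
        if r not in best or s > best[r][1]:
            best[r] = (p, s)
    return {r: p for r, (p, _) in best.items()}

# Hypothetical samples: B tracks A linearly, D tracks C linearly.
data = {
    "A": [1.0, 2.0, 3.0, 4.0],
    "B": [2.1, 3.9, 6.2, 8.0],
    "C": [4.0, 1.0, 5.0, 2.0],
    "D": [3.9, 1.2, 4.8, 2.1],
}
assignment = allocate_to_subtrees(data, ["A", "C"])
# B is allocated to A's sub-tree and D to C's sub-tree.
```

Because the (node, primary) correlations are independent of one another, the scoring step is embarrassingly parallel, which is what makes the multi-processor formulation in the abstract attractive.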
Abstract: The described implementations relate to automatically performing device actions. One implementation can obtain a contextual value of a contextor. The implementation can decide, using a decision engine, whether to perform an action on a computing device based on the contextual value. In an instance when the decision engine decides that the action is to be performed, the implementation can perform the action on the computing device. The implementation can also update the decision engine using feedback related to the action. As a specific example, the action can be prelaunching an application before a user has requested to execute the application. Prelaunching the application can reduce application latency relative to waiting for the user to request to execute the application before launching the application.
Type:
Grant
Filed:
December 30, 2011
Date of Patent:
November 17, 2015
Assignee:
Microsoft Technology Licensing, LLC
Inventors:
David Chu, Aman Kansal, Jie Liu, Tingxin Yan
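The decide-then-learn-from-feedback loop in the abstract above can be sketched with a toy decision engine. Everything here beyond the abstract's structure is an assumption: the class name, the per-context frequency model, and the 0.5 threshold are illustrative stand-ins for the patent's decision engine and contextor.

```python
from collections import defaultdict

class PrelaunchDecider:
    """Toy decision engine: decide whether to prelaunch an application
    from a single contextual value, and update from feedback about
    whether the user actually used the application."""

    def __init__(self, threshold=0.5):
        self.threshold = threshold
        self.used = defaultdict(int)      # context -> times the app was used
        self.observed = defaultdict(int)  # context -> times the context was seen

    def decide(self, context):
        """Prelaunch when the observed usage rate in this context
        meets the threshold."""
        seen = self.observed[context]
        if seen == 0:
            return False
        return self.used[context] / seen >= self.threshold

    def feedback(self, context, app_was_used):
        """Update the engine with the outcome observed in a context."""
        self.observed[context] += 1
        if app_was_used:
            self.used[context] += 1

engine = PrelaunchDecider()
for _ in range(3):
    engine.feedback("morning", app_was_used=True)
engine.feedback("evening", app_was_used=False)
prelaunch_morning = engine.decide("morning")  # usage rate 3/3 -> prelaunch
prelaunch_evening = engine.decide("evening")  # usage rate 0/1 -> do not prelaunch
```

Prelaunching on a correct prediction hides the launch cost from the user; the feedback path keeps the engine from repeatedly prelaunching in contexts where the app goes unused.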
Abstract: Certain aspects of the present disclosure support a technique for neural learning of natural multi-spike trains in spiking neural networks. A synaptic weight can be adapted depending on a resource associated with the synapse, which can be depleted by weight change and can recover over time. In one aspect of the present disclosure, the weight adaptation may depend on a time since the last significant weight change.
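The resource-dependent weight adaptation described above can be sketched as a single update rule. This is an illustrative reading of the abstract, not the disclosed technique: the exponential recovery with time constant `tau`, the linear scaling of the applied change by the resource, and all constants are assumptions.

```python
import math

def resource_gated_update(w, dw, resource, dt, tau=100.0):
    """Apply one synaptic weight change gated by a per-synapse resource:
    the resource recovers toward 1 over the time dt since the last
    change, scales the applied change, and is depleted by it."""
    resource = 1.0 - (1.0 - resource) * math.exp(-dt / tau)  # recovery over dt
    applied = dw * resource                                  # depleted resource damps the change
    resource = max(0.0, resource - abs(applied))             # depletion by the weight change
    return w + applied, resource

w, r = 0.5, 1.0
w, r = resource_gated_update(w, 0.2, r, dt=0.0)    # full resource: full change applied
# w == 0.7, r == 0.8
w2, r2 = resource_gated_update(w, 0.2, r, dt=0.0)  # depleted resource: damped change
```

The effect matches the abstract's description: a recent significant weight change depletes the resource and suppresses immediate further changes, while a long quiet interval (large `dt`) restores full plasticity.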