Patents by Inventor Tomoharu Iwata
Tomoharu Iwata has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20240169204
Abstract: A learning method executed by a computer including a memory and a processor, that includes: inputting a learning data set including a plurality of pieces of observation data; estimating, by a neural network, parameters of prior distributions of a plurality of pieces of data in a case where the post-missing observation data is expressed by a product of the plurality of pieces of data, using the post-missing observation data in which some values included in the observation data are set as missing values; updating the plurality of pieces of data using the parameters of the prior distributions such that the product of the plurality of pieces of data matches the post-missing observation data; estimating a missing value of the post-missing observation data from the plurality of pieces of updated data; and updating model parameters including parameters of the neural network to increase estimation accuracy of the missing value.
Type: Application
Filed: March 11, 2021
Publication date: May 23, 2024
Inventor: Tomoharu IWATA
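The abstract above expresses partially missing observations as a product of factors and updates the factors until their product matches the observed entries. Below is a minimal NumPy sketch of that factorization-style imputation step; it omits the neural-network priors the patent describes, and all names, sizes, and learning rates are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy observation matrix that is exactly a product of two low-rank factors.
U_true = rng.normal(size=(6, 2))
V_true = rng.normal(size=(2, 5))
X = U_true @ V_true

# "Post-missing" data: mask out some entries (True = observed).
mask = rng.random(X.shape) > 0.3

def impute_by_factorization(X, mask, rank=2, iters=500, lr=0.05):
    """Gradient updates so the factor product matches the observed entries only."""
    U = rng.normal(scale=0.1, size=(X.shape[0], rank))
    V = rng.normal(scale=0.1, size=(rank, X.shape[1]))
    for _ in range(iters):
        R = (U @ V - X) * mask      # residual on observed entries
        U = U - lr * R @ V.T
        R = (U @ V - X) * mask
        V = V - lr * U.T @ R
    return U @ V                    # missing entries are read off the product

X_hat = impute_by_factorization(X, mask)
```

On observed entries the reconstruction fits closely; the missing entries are then estimated from the learned factors, which is the imputation step the abstract's later procedures evaluate.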
-
Patent number: 11940281
Abstract: A learning apparatus includes an input unit configured to input route information indicating a route including one or more paths of a plurality of paths and moving body number information indicating the number of moving bodies for a date and a time on a path to be observed among the plurality of paths, and a learning unit configured to learn a parameter of a model indicating a relationship between the number of moving bodies for each of the plurality of paths and the number of moving bodies for the route and a relationship between the numbers of moving bodies for the route at different dates and times by using the route information and the moving body number information.
Type: Grant
Filed: February 12, 2020
Date of Patent: March 26, 2024
Assignee: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Tomoharu Iwata, Hitoshi Shimizu
-
Publication number: 20240045921
Abstract: A parameter estimation device for estimating a plurality of parameters used for calculating high-resolution data from aggregated data aggregated to coarse granularity, the parameter estimation device comprising: a parameter estimation unit configured to estimate a plurality of parameters that are unknown variables in a model so as to maximize a marginal likelihood based on the assumption that actually observed aggregated data is generated from the model based on a multivariate Gaussian process in which a plurality of latent Gaussian processes for a plurality of types of aggregated data in a plurality of domains are represented by linear mixing; and a storage unit configured to store the plurality of parameters, wherein the plurality of parameters include a hyperparameter of a prior distribution to a mixing coefficient used in the linear mixing.
Type: Application
Filed: December 10, 2020
Publication date: February 8, 2024
Inventors: Yusuke TANAKA, Tomoharu IWATA, Takeshi KURASHIMA, Hiroyuki TODA
-
Publication number: 20240012869
Abstract: An estimation apparatus includes: input means for inputting aggregated data in which a plurality of data are aggregated, and feature data representing a feature of the aggregated data; determination means for determining a parameter of a model of the plurality of data before the aggregation of the aggregated data, using a predetermined function and the feature data; and estimation means for estimating a parameter of the function and the plurality of data by optimizing a predetermined objective function, using the aggregated data.
Type: Application
Filed: November 7, 2019
Publication date: January 11, 2024
Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Tomoharu IWATA, Hitoshi SHIMIZU
-
Patent number: 11859992
Abstract: The present disclosure relates to an apparatus and methods of estimating a traffic volume of moving objects. In particular, the present disclosure estimates the traffic volume based on amounts of traffic volume of the moving objects observed at observation points based on a routing matrix and a visitor matrix. The routing matrix indicates whether moving objects that pass through specific waypoints are to be observed at an observation point. The visitor matrix indicates whether a moving object departs from or arrives at the observation point. The present disclosure enables estimating a traffic volume of moving objects on various routes based on observed data with errors in data and varying lengths in observation periods.
Type: Grant
Filed: March 26, 2019
Date of Patent: January 2, 2024
Assignee: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Hitoshi Shimizu, Tatsushi Matsubayashi, Yusuke Tanaka, Takuma Otsuka, Hiroshi Sawada, Tomoharu Iwata, Naonori Ueda
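The routing-matrix formulation above makes the observed counts a linear function of the per-route traffic volumes. A toy least-squares sketch of that relationship (this is not the patented estimator, which additionally handles observation errors and varying observation periods; the matrix and volumes here are made-up illustrations):

```python
import numpy as np

# Hypothetical routing matrix: entry (i, j) is 1 if traffic on route j
# is observed at observation point i, else 0.
R = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 0.0]])

v_true = np.array([10.0, 20.0, 5.0])  # per-route traffic volumes
o = R @ v_true                        # counts seen at the observation points

# Recover the route volumes from the observed counts by least squares.
v_hat, *_ = np.linalg.lstsq(R, o, rcond=None)
```

With an invertible routing matrix and noise-free counts, least squares recovers the volumes exactly; the patent's contribution lies in the noisy, partially observed setting.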
-
Publication number: 20230419120
Abstract: A learning method according to an embodiment causes a computer to execute: an input step of inputting a plurality of data sets; and a learning step of learning, based on the plurality of input data sets, an estimation model for estimating a parameter of a topic model from a smaller amount of data than an amount of data included in the plurality of data sets.
Type: Application
Filed: October 5, 2020
Publication date: December 28, 2023
Inventor: Tomoharu IWATA
-
Publication number: 20230325661
Abstract: A learning method, executed by a computer including a memory and a processor, includes: inputting a plurality of items of data, and a plurality of labels representing clusters to which the plurality of items of data belong; converting each of the plurality of items of data by a predetermined neural network, to generate a plurality of items of representation data; clustering the plurality of items of representation data; calculating a predetermined evaluation scale indicating performance of the clustering, based on the clustering result and the plurality of labels; and learning a parameter of the neural network, based on the evaluation scale.
Type: Application
Filed: September 18, 2020
Publication date: October 12, 2023
Inventor: Tomoharu IWATA
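The abstract above scores a clustering against given labels with a "predetermined evaluation scale", without naming which one. As one common choice, the sketch below computes cluster purity: count the majority true label inside each cluster, then divide by the total number of items (the data here is illustrative):

```python
from collections import Counter

def cluster_purity(assignments, labels):
    """Fraction of items that carry the majority label of their assigned cluster."""
    matched = 0
    for c in set(assignments):
        members = [lab for a, lab in zip(assignments, labels) if a == c]
        matched += Counter(members).most_common(1)[0][1]
    return matched / len(labels)

assignments = [0, 0, 0, 1, 1, 1]          # cluster ids from the clustering step
labels = ["a", "a", "b", "b", "b", "b"]    # true cluster labels
purity = cluster_purity(assignments, labels)  # (2 + 3) / 6
```

In the patented method such a scale is used as a training signal for the representation network, not just as a final report.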
-
Publication number: 20230274133
Abstract: A learning method includes: receiving as input a set of data sets {D1, ..., DT} wherein Dt for a task t in a task set {1, ..., T} includes feature amount vectors of cases of t; sampling t from the task set, and sampling a first subset from Dt and a second subset from Dt excluding the first subset; generating a task vector representing a property of t corresponding to the first subset by a first neural network; nonlinearly transforming feature amount vectors included in data included in the second subset by a second neural network using the task vector; calculating scores representing degrees of anomaly of the feature amount vectors using the transformed feature amount vectors and a preset center vector; and learning parameters of the first and second neural networks so as to make an index value representing generalized performance of anomaly detection higher using the scores.
Type: Application
Filed: July 6, 2020
Publication date: August 31, 2023
Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventor: Tomoharu IWATA
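The scheme above transforms features with a task-conditioned network and scores anomalies by distance to a preset center vector. The sketch below replaces both neural networks with simple stand-ins (a support-set mean, and a fixed random linear map followed by tanh) just to show how such a score behaves; every function, size, and value here is an illustrative assumption, not the patented model:

```python
import numpy as np

rng = np.random.default_rng(0)

def task_vector(support):
    """Stand-in for the first network: summarize a task by its support-set mean."""
    return support.mean(axis=0)

def transform(x, tvec, W):
    """Stand-in for the second network: nonlinear map conditioned on the task vector."""
    return np.tanh((x - tvec) @ W)

def anomaly_score(x, tvec, W, center):
    """Squared distance of the transformed features from a preset center vector."""
    z = transform(x, tvec, W)
    return float(np.sum((z - center) ** 2))

# Toy task: support set drawn near the origin, plus one far-away outlier.
support = rng.normal(scale=0.1, size=(20, 3))
W = rng.normal(size=(3, 4))
center = np.zeros(4)
tvec = task_vector(support)

normal_score = anomaly_score(support[0], tvec, W, center)
outlier_score = anomaly_score(np.array([5.0, 5.0, 5.0]), tvec, W, center)
```

Points resembling the task's support set land near the center and score low, while the distant outlier saturates the nonlinearity and scores high.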
-
Publication number: 20230244928
Abstract: A learning apparatus includes a memory and a processor to execute: receiving as input, when denoting a set of indices representing response variables of a task r in a set of tasks R, as Cr, a data set Drc composed of pairs of the response variables and explanatory variables; sampling the task r from R, an index c from Cr, and a first subset from Drc and a second subset from a set of Drc excluding the first subset; generating a task vector representing a property of a task corresponding to the first subset with a first neural network; calculating, from the task vector and explanatory variables in the second subset, predicted values of response variables for the explanatory variables with a second neural network; and updating the first and second neural networks using an error between response variables in the second subset and the predicted values thereof.
Type: Application
Filed: June 8, 2020
Publication date: August 3, 2023
Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventor: Tomoharu IWATA
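The abstract above builds a task vector from a first subset of a task's data and predicts response variables for a second subset from it. As a deliberately simple stand-in for the two neural networks, the sketch below uses least-squares coefficients fit on the first subset as the "task vector" and a linear read-out as the predictor; the data is an illustrative assumption:

```python
import numpy as np

def task_vector(xs, ys):
    """Stand-in for the first network: summarize a task by least-squares
    coefficients fit on its first (support) subset."""
    A = np.column_stack([xs, np.ones(len(xs))])
    coef, *_ = np.linalg.lstsq(A, ys, rcond=None)
    return coef

def predict(tvec, x_query):
    """Stand-in for the second network: predict responses from the task vector."""
    A = np.column_stack([x_query, np.ones(len(x_query))])
    return A @ tvec

# Toy task with an exactly linear response: y = 2x + 1.
xs = np.array([0.0, 1.0, 2.0, 3.0])
ys = 2 * xs + 1
tvec = task_vector(xs, ys)
pred = predict(tvec, np.array([4.0, 5.0]))
```

The error between these predictions and the second subset's true responses is what the patent's learning step backpropagates through both networks.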
-
Publication number: 20230222319
Abstract: A learning method, executed by a computer, according to one embodiment includes an input procedure for receiving a set X = {Xd}d∈D of series data sets Xd for learning in a task d ∈ D when a task set is set as D, a sampling procedure for sampling the task d from the task set D and then sampling a first subset from a series data set Xd corresponding to the task d and a second subset from a set obtained by excluding the first subset from the series data set Xd, a generation procedure for generating a task vector representing characteristics of the first subset using parameters of a first neural network, a prediction procedure for calculating, from the task vector and series data included in the second subset, a predicted value of each value included in the series data using parameters of a second neural network, and a learning procedure for updating learning target parameters including the parameters of the first neural network and the parameters of the second neural network using an error between each value included in the series data of the second subset and its predicted value.
Type: Application
Filed: June 8, 2020
Publication date: July 13, 2023
Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventor: Tomoharu IWATA
-
Publication number: 20230222324
Abstract: A method includes receiving data including cases and labels therefor, calculating a predicted value of a label for each case included in the data using parameters of a neural network and information representing cases in which the labels are observed among the cases in the data, selecting one case from the data using parameters of another neural network and information representing the cases where the labels are observed among the cases in the data, training the parameters of the neural network using an error between the predicted value and a value of the label for each case in the data, and training the parameters of the other neural network using the error and another error between a predicted value of a label for each case when the one case is additionally observed and a value of the label for the case.
Type: Application
Filed: June 8, 2020
Publication date: July 13, 2023
Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventor: Tomoharu IWATA
-
Publication number: 20230196097
Abstract: A ranking function generating apparatus includes a memory and a processor configured to execute producing training data including at least a first search log related to a first item included in a search result of a search query, a second search log related to a second item included in the search result, and respective domains of the first search log and the second search log; and learning, using the training data, parameters of a neural network that implements ranking functions for a plurality of domains through multi-task learning regarding each of the domains as a task.
Type: Application
Filed: May 18, 2020
Publication date: June 22, 2023
Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Hitoshi SHIMIZU, Tomoharu IWATA
-
Patent number: 11615273
Abstract: To maintain a classifier's classification accuracy without frequently collecting labeled learning data, a learning unit learns a classification criterion of a classifier at each time point in the past until the present and learns a time series change of the classification criterion by using data for learning to which a label is given and that is collected until the present. A classifier creating unit predicts a classification criterion of a future classifier and creates a classifier that outputs a label representing an attribute of input data by using the learned classification criterion and time series change.
Type: Grant
Filed: January 19, 2017
Date of Patent: March 28, 2023
Assignee: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Atsutoshi Kumagai, Tomoharu Iwata
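The abstract above learns how a classification criterion drifts over time and predicts a future criterion from that drift. A minimal stand-in for that prediction step is linear extrapolation of the weight time series (the patent learns a richer time-series model; the weights below are illustrative):

```python
import numpy as np

def predict_future_weights(weight_history):
    """Predict the next classifier weights by linear extrapolation of the
    observed time series: w_next = w_last + (w_last - w_prev)."""
    w = np.asarray(weight_history, dtype=float)
    return w[-1] + (w[-1] - w[-2])

# Classifier weights drifting linearly over three past time points.
history = [[1.0, 0.0], [1.5, -0.5], [2.0, -1.0]]
future = predict_future_weights(history)
```

The predicted weights then define a classifier for future inputs, so new labeled data need not be collected at every time point.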
-
Publication number: 20230016231
Abstract: A learning device relating to one embodiment includes: an input unit configured to input a plurality of datasets of different feature spaces; a first generation unit configured to generate a feature latent vector indicating a property of an individual feature of the dataset for each of the datasets; a second generation unit configured to generate an instance latent vector indicating the property of observation data for each of observation vectors included in the datasets; a prediction unit configured to predict a solution by a model for solving a machine learning problem of interest by using the feature latent vector and the instance latent vector; and a learning unit configured to learn a parameter of the model by optimizing a predetermined objective function by using the feature latent vector, the instance latent vector and the solution for each of the datasets.
Type: Application
Filed: November 29, 2019
Publication date: January 19, 2023
Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Tomoharu IWATA, Atsutoshi KUMAGAI
-
Publication number: 20220405585
Abstract: A latent representation calculation unit (131) uses a first model to calculate, from samples belonging to a domain, a latent representation representing a feature of the domain. A domain-by-domain objective function generation unit (132) and an all-domain objective function generation unit (133) generate, from the samples belonging to the domain and from the latent representation of the domain calculated by the latent representation calculation unit (131), an objective function related to a second model that calculates an anomaly score of each of the samples. An update unit (134) updates the first model and the second model so as to optimize the objective functions of a plurality of the domains calculated by the domain-by-domain objective function generation unit (132) and the all-domain objective function generation unit (133).
Type: Application
Filed: October 16, 2019
Publication date: December 22, 2022
Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Atsutoshi KUMAGAI, Tomoharu IWATA
-
Publication number: 20220405624
Abstract: An acquisition unit 15a acquires data in a task. The learning unit 15b learns a generation model representing a distribution of a probability of the data in the task so that a mutual information amount between a latent variable and an observed variable is minimized in the model.
Type: Application
Filed: November 21, 2019
Publication date: December 22, 2022
Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Hiroshi TAKAHASHI, Tomoharu IWATA, Sekitoshi KANAI, Atsutoshi KUMAGAI, Yuki YAMANAKA, Masanori YAMADA, Satoshi YAGI
-
Publication number: 20220398497
Abstract: A control device according to one embodiment includes control means that selects an action at for controlling a people flow in accordance with a measure π at each control step "t" of an agent in A2C by using a state st obtained by observation of a traffic condition about the people flow in a simulator, and learning means that learns a parameter of a neural network which realizes an advantage function expressed by an action value function representing a value of selection of the action at in the state st under the measure π and by a state value function representing a value of the state st under the measure π.
Type: Application
Filed: November 6, 2019
Publication date: December 15, 2022
Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Hitoshi SHIMIZU, Tomoharu IWATA
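The advantage function mentioned above combines an action value function and a state value function. In the common one-step form used by A2C, the advantage of taking action a_t in state s_t is estimated as A(s_t, a_t) ≈ r_t + γ·V(s_{t+1}) − V(s_t). A tiny sketch (the γ, reward, and state values are illustrative, not from the patent):

```python
def one_step_advantage(reward, value_s, value_next, gamma=0.99):
    """One-step A2C advantage estimate: r_t + gamma * V(s_{t+1}) - V(s_t)."""
    return reward + gamma * value_next - value_s

# With gamma = 1 and equal state values, the advantage reduces to the reward.
adv = one_step_advantage(1.0, 0.5, 0.5, gamma=1.0)
```

A positive advantage means the chosen action did better than the state's baseline value, which is the signal that drives the policy ("measure") update.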
-
Publication number: 20220351052
Abstract: A training apparatus includes a calculation unit that takes aggregate data obtained by aggregating history data representing a history of second objects for each first object from a predetermined viewpoint, auxiliary data representing auxiliary information regarding the second object, and partial history data that is a part of the history data as inputs and calculates a value of a predetermined objective function, which represents a degree of matching between co-occurrence information representing a co-occurrence relationship of two second objects, and the aggregate data, the auxiliary data, and the partial history data, and a derivative of the objective function with respect to a parameter, and an updating unit that updates the parameter such that the value of the objective function is maximized or minimized using the value of the objective function and the derivative calculated by the calculation unit.
Type: Application
Filed: September 18, 2019
Publication date: November 3, 2022
Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventor: Tomoharu IWATA
-
Publication number: 20220284313
Abstract: A learning device includes a learning unit that learns parameters for determining an occurrence probability of an event at each time and each location on the basis of history information relating to the event, the history information including a time, a location, and an event type, and features of an area corresponding to the location, so that a likelihood expressing a combined effect of the event type and the features of the area on the event is optimized.
Type: Application
Filed: July 4, 2019
Publication date: September 8, 2022
Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Maya OKAWA, Tomoharu IWATA, Hiroyuki TODA, Takeshi KURASHIMA, Yusuke TANAKA
-
Publication number: 20220253736
Abstract: A parameter estimation section 106 is configured to perform estimation, for aggregate data in which values are associated with respective regions obtained by subdividing a space and for a Gaussian process model that expresses a plurality of aggregate data of differing partition granularity. The estimation is performed based on the Gaussian process model including a spatial scale parameter of a correlation function between regions of the aggregate data and including a noise variance parameter of the correlation function, by estimating the spatial scale parameter and the noise variance parameter so as to maximize a function expressing values of the aggregate data by area integrals of a Gaussian process.
Type: Application
Filed: July 10, 2020
Publication date: August 11, 2022
Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Yusuke TANAKA, Tomoharu IWATA, Takeshi KURASHIMA, Hiroyuki TODA, Toshiyuki TANAKA
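The model above expresses each region's aggregate value as an area integral of a Gaussian process, so the covariance between two regions' values becomes a double integral of the pointwise kernel. The sketch below approximates that double integral by averaging an RBF kernel over sample points in each region; the grids and length scale are illustrative assumptions, and this is only a discretized stand-in for the patent's estimator:

```python
import numpy as np

def region_cov(pts_a, pts_b, lengthscale=1.0):
    """Covariance between two regions' aggregate values, approximated by
    averaging a pointwise RBF kernel over sample points in each region."""
    d2 = ((pts_a[:, None, :] - pts_b[None, :, :]) ** 2).sum(axis=-1)
    return float(np.exp(-d2 / (2 * lengthscale ** 2)).mean())

# Three regions, each represented by a small grid of sample points:
# a base region, a nearby region, and a distant one.
grid = np.array([[x, y] for x in (0.0, 0.5) for y in (0.0, 0.5)])
near = grid + np.array([0.5, 0.0])
far = grid + np.array([10.0, 0.0])

c_near = region_cov(grid, near)
c_far = region_cov(grid, far)
```

Nearby regions correlate strongly while distant ones barely correlate, and the rate of that decay is exactly what the spatial scale parameter in the abstract controls.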