Patents by Inventor Kentaro Takagi
Kentaro Takagi has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20250182447
Abstract: According to one embodiment, a data analysis apparatus includes processing circuitry. The processing circuitry acquires a plurality of pieces of first data satisfying a first condition, generates a plurality of first feature vectors by unsupervised learning of the plurality of pieces of first data, generates a first clustering result by clustering the plurality of first feature vectors, acquires a plurality of pieces of second data satisfying a second condition different from the first condition, generates a plurality of second feature vectors by unsupervised learning of at least some of the plurality of pieces of first data and the plurality of pieces of second data, generates a second clustering result by clustering the second feature vectors, and generates a comparison result regarding the plurality of pieces of first data and the plurality of pieces of second data by comparing the first clustering result with the second clustering result.
Type: Application
Filed: August 26, 2024
Publication date: June 5, 2025
Applicant: KABUSHIKI KAISHA TOSHIBA
Inventors: Kouta NAKATA, Kentaro TAKAGI, Yaling TAO
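The two-stage clustering comparison described above can be sketched with stand-ins: the tiny 1-D k-means below takes the place of clustering learned feature vectors, and all names, data, and the center-shift comparison are illustrative, not from the patent.

```python
# Sketch of the claimed pipeline (everything here is an illustrative stand-in):
# 1) cluster data satisfying condition A, 2) cluster it again jointly with
# data satisfying condition B, 3) compare the two clustering results.

def kmeans_1d(xs, k=2, iters=20):
    # deterministic toy k-means on scalars, standing in for clustering
    # feature vectors produced by an unsupervised encoder
    centers = [min(xs), max(xs)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for x in xs:
            i = min(range(k), key=lambda c: abs(x - centers[c]))
            groups[i].append(x)
        centers = [sum(g) / len(g) if g else centers[i]
                   for i, g in enumerate(groups)]
    labels = [min(range(k), key=lambda c: abs(x - centers[c])) for x in xs]
    return labels, centers

first = [0.1, 0.2, 0.15, 2.0, 2.1]      # data satisfying the first condition
second = first + [5.0, 5.2, 5.1]        # second-condition data added in
labels_a, centers_a = kmeans_1d(first)
labels_b, centers_b = kmeans_1d(second)
# "comparison result": how far the cluster centers moved once B was included
shift = max(abs(a - b) for a, b in zip(sorted(centers_a), sorted(centers_b)))
```

A large `shift` would indicate that the second-condition data changed the cluster structure, which is one plausible use of such a comparison result.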
-
Patent number: 12257552
Abstract: The present invention relates to a separation membrane element including a supply-side channel member, in which: the supply-side channel member has a net shape in which plural fibrous rows X including fibrous objects A and plural fibrous rows Y including fibrous objects B cross each other sterically to form intersections; at least one of the fibrous objects A and the fibrous objects B has a large diameter portion and a small diameter portion along a longitudinal direction; at least one of the fibrous objects A and the fibrous objects B includes a thread that is thinner at a central portion located between intersection portions than at the large diameter portion; and a fiber between an arbitrary intersection and an adjacent intersection is a tapered fiber whose diameter increases gradually, in a taper, from one intersection toward the other intersection.
Type: Grant
Filed: August 26, 2020
Date of Patent: March 25, 2025
Assignee: TORAY INDUSTRIES, INC.
Inventors: Shu Taniguchi, Kentaro Takagi, Takeshi Konda
-
Publication number: 20250078250
Abstract: According to one embodiment, a defect classification support apparatus includes a processor. The processor acquires a defect image of an outer appearance of a target object having a defect. The processor extracts a defect patch image from the defect image, the defect patch image being a partial image that includes the defect. The processor extracts a normal patch image from the defect image, the normal patch image being a partial image free of the defect. The processor computes a feature amount of the defect based on the defect patch image and the normal patch image.
Type: Application
Filed: July 1, 2024
Publication date: March 6, 2025
Applicant: KABUSHIKI KAISHA TOSHIBA
Inventors: Kentaro TAKAGI, Toshiyuki OSHIMA, Kouta NAKATA
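The patch-contrast idea can be illustrated on a toy image; `crop`, `mean`, and the difference-of-means scoring below are my own stand-ins for the claimed feature computation, not the patent's method.

```python
# Illustrative sketch: crop a patch containing the defect and a defect-free
# patch from the same inspection image, then score the defect relative to
# its normal background (all names and numbers are invented).

def crop(img, r, c, size):
    # extract a size x size patch with top-left corner at (r, c)
    return [row[c:c + size] for row in img[r:r + size]]

def mean(patch):
    vals = [v for row in patch for v in row]
    return sum(vals) / len(vals)

# toy 4x4 grayscale "inspection image" with a bright defect at the top-left
image = [
    [9, 9, 1, 1],
    [9, 9, 1, 1],
    [1, 1, 1, 1],
    [1, 1, 1, 1],
]
defect_patch = crop(image, 0, 0, 2)   # partial image containing the defect
normal_patch = crop(image, 2, 2, 2)   # partial image free of the defect
# feature amount of the defect: its contrast against the normal appearance
feature = mean(defect_patch) - mean(normal_patch)
```

Using the normal patch from the same image normalizes away per-image variation, which is presumably why both patches come from the one defect image.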
-
Publication number: 20250037038
Abstract: According to one embodiment, an information processing apparatus includes a processor. The processor extracts a plurality of features from a plurality of training data by using a machine learning model. The processor generates a prediction result relating to a task, from the training data and teaching data corresponding to the training data. The processor calculates a similarity between features with respect to the plurality of features. The processor updates a parameter of the machine learning model, based on the prediction result and the similarity, in such a manner that the features become farther from each other.
Type: Application
Filed: February 27, 2024
Publication date: January 30, 2025
Applicant: KABUSHIKI KAISHA TOSHIBA
Inventors: Yuichi KATO, Kentaro TAKAGI, Kouta NAKATA
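A minimal sketch of this training signal, assuming for simplicity that the similarity penalty acts directly on the feature vectors (a real implementation would backpropagate the penalty through the model; all names and data are illustrative):

```python
# Sketch: penalise pairwise similarity between extracted features so that
# features of different samples are pushed apart during training.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def pairwise_similarity(feats):
    # average dot-product similarity over all feature pairs
    n = len(feats)
    pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]
    return sum(dot(feats[i], feats[j]) for i, j in pairs) / len(pairs)

def push_apart(feats, lr=0.1):
    # one gradient step on the similarity penalty; the gradient of
    # sum_j dot(f_i, f_j) with respect to f_i is the sum of the other vectors
    new = []
    for i, f in enumerate(feats):
        grad = [sum(g[d] for j, g in enumerate(feats) if j != i)
                for d in range(len(f))]
        new.append([a - lr * b for a, b in zip(f, grad)])
    return new

features = [[1.0, 0.5], [0.9, 0.6], [1.1, 0.4]]   # nearly collinear features
before = pairwise_similarity(features)
features = push_apart(features)
after = pairwise_similarity(features)
```

In the full scheme this penalty would be added to the task loss computed from the teaching data, so the features spread out without sacrificing prediction accuracy.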
-
Publication number: 20250029356
Abstract: According to one embodiment, an information processing apparatus includes a processor. The processor acquires training data that is used for training of a first feature extractor and a second feature extractor. The processor determines a model size of the second feature extractor. The processor extracts a first feature by inputting the training data to the first feature extractor. The processor extracts a second feature by inputting the first feature to the second feature extractor. The processor trains the first feature extractor in such a manner as to make the first feature closer to the second feature.
Type: Application
Filed: February 27, 2024
Publication date: January 23, 2025
Applicant: KABUSHIKI KAISHA TOSHIBA
Inventors: Yuichi KATO, Kentaro TAKAGI, Kouta NAKATA
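One reading of this scheme, sketched with stand-ins: the truncating `second_extractor` and the explicit gradient step below are my own assumptions, used only to show the shape of the loop in which the first feature is pulled toward the second.

```python
# Sketch: the second extractor consumes the first extractor's feature, and
# the first feature is trained toward the second feature (all choices mine).

def second_extractor(feat, model_size):
    # stand-in "second feature extractor": keep the first `model_size`
    # dimensions and zero the rest (the model size is chosen beforehand)
    return [v if i < model_size else 0.0 for i, v in enumerate(feat)]

def mse(u, v):
    return sum((a - b) ** 2 for a, b in zip(u, v)) / len(u)

def train_step(first_feat, model_size, lr=0.5):
    target = second_extractor(first_feat, model_size)
    # gradient step on ||first - second||^2 with respect to the first feature
    return [a - lr * 2 * (a - b) / len(first_feat)
            for a, b in zip(first_feat, target)]

first_feature = [1.0, 2.0, 3.0, 4.0]   # output of the first extractor
model_size = 2                          # determined for the second extractor
loss_before = mse(first_feature, second_extractor(first_feature, model_size))
first_feature = train_step(first_feature, model_size)
loss_after = mse(first_feature, second_extractor(first_feature, model_size))
```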
-
Publication number: 20240404245
Abstract: A similar image set creating apparatus includes processing circuitry. The processing circuitry acquires a plurality of images. The processing circuitry extracts first features from the images by using a first model that executes an image classification task. The processing circuitry extracts second features from the images by using a second model that executes an image classification task. The second model is trained in such a manner that, compared to the first model, mutually similar images are distributed more continuously in the latent space. The processing circuitry selects, from the images, an image of interest serving as a reference of a similar image set, and an auxiliary image similar to the image of interest, based on the first features and the second features.
Type: Application
Filed: February 28, 2024
Publication date: December 5, 2024
Applicant: KABUSHIKI KAISHA TOSHIBA
Inventors: Kouta NAKATA, Kentaro TAKAGI, Yaling TAO
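The selection step can be sketched as follows, with toy two-dimensional vectors standing in for the two models' embeddings; the image names, features, and score combination are all illustrative assumptions.

```python
# Sketch: combine similarities from two differently trained models to pick
# the auxiliary image most similar to the image of interest.

def cosine(u, v):
    nu = sum(a * a for a in u) ** 0.5
    nv = sum(b * b for b in v) ** 0.5
    return sum(a * b for a, b in zip(u, v)) / (nu * nv)

# first_feats: from the model trained for classification;
# second_feats: from the model whose latent space varies more smoothly
first_feats = {"a": [1.0, 0.0], "b": [0.9, 0.1], "c": [0.0, 1.0]}
second_feats = {"a": [0.8, 0.2], "b": [0.7, 0.3], "c": [0.1, 0.9]}

reference = "a"  # image of interest serving as the reference of the set
scores = {
    name: cosine(first_feats[reference], first_feats[name])
    + cosine(second_feats[reference], second_feats[name])
    for name in first_feats if name != reference
}
auxiliary = max(scores, key=scores.get)  # most similar image to the reference
```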
-
Patent number: 12033370
Abstract: According to an embodiment, a learning device includes one or more processors. The processors calculate a latent vector of each of a plurality of first target data, by using a parameter of a learning model configured to output a latent vector indicating a feature of a target data. The processors calculate, for each first target data, first probabilities that the first target data belongs to virtual classes, on the assumption that the plurality of first target data belong to virtual classes different from each other. The processors update the parameter such that two losses both decrease: a first loss computed from the first probabilities, and a second loss that becomes lower as each of the element classes, to which the elements included in each of the first target data belong, is less related to the other element classes.
Type: Grant
Filed: August 24, 2020
Date of Patent: July 9, 2024
Assignee: KABUSHIKI KAISHA TOSHIBA
Inventors: Yaling Tao, Kentaro Takagi, Kouta Nakata
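The "virtual class" assumption, in which each sample is treated as its own class, can be sketched for the first loss alone; the similarity-based logits and the data below are my own choices, and the second (element-class) loss is omitted.

```python
# Sketch of the first loss: each sample is its own virtual class, and the
# loss is the cross-entropy of classifying each latent vector into its own
# class, using latent-vector similarities as logits (choices are mine).
import math

def softmax(logits):
    m = max(logits)
    exps = [math.exp(l - m) for l in logits]
    s = sum(exps)
    return [e / s for e in exps]

def first_loss(latents):
    loss = 0.0
    for i, z in enumerate(latents):
        # logit of sample i against virtual class j: similarity of z_i, z_j
        logits = [sum(a * b for a, b in zip(z, w)) for w in latents]
        probs = softmax(logits)
        loss += -math.log(probs[i])   # sample i should land in class i
    return loss / len(latents)

tight = [[1.0, 0.0], [0.99, 0.01], [0.0, 1.0]]   # two samples nearly overlap
spread = [[2.0, 0.0], [0.0, 2.0], [-2.0, 0.0]]   # well-separated latents
loss_tight = first_loss(tight)
loss_spread = first_loss(spread)
```

Minimizing this loss favors latent vectors that are distinguishable from one another, which is the usual effect of such instance-level objectives.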
-
Publication number: 20240095520
Abstract: A representation learning apparatus executes: calculating a latent vector Sx in a latent space of target data x using a first model parameter; calculating a non-interest latent vector Zx in a latent space of a non-interest feature included in the target data x, and a non-interest latent vector Zb in the latent space of non-interest data, using a second model parameter; calculating a similarity S1 obtained by correcting a similarity between the latent vector Sx and its representative value S̄x by a similarity between the latent vector Zx and its representative value Z̄x, and a similarity S2 between the latent vector Zb and its representative value Z̄b; and updating the first and/or the second model parameter based on a loss function including the similarities S1 and S2.
Type: Application
Filed: February 28, 2023
Publication date: March 21, 2024
Applicant: KABUSHIKI KAISHA TOSHIBA
Inventors: Kentaro TAKAGI, Toshiyuki Oshima
-
Publication number: 20240086428
Abstract: According to one embodiment, a data labeling work support apparatus includes a processor including hardware. The processor acquires a first label assigned to data. The processor acquires the data. The processor extracts a feature of the data. The processor groups the data based on a similarity or a distance of the feature. The processor assigns a second label to the grouped data. The processor calculates a degree of matching between the first label and the second label. The processor outputs information regarding a combination of the first label and the second label having a low degree of matching.
Type: Application
Filed: February 28, 2023
Publication date: March 14, 2024
Applicant: KABUSHIKI KAISHA TOSHIBA
Inventors: Kouta NAKATA, Kentaro TAKAGI, Yaling TAO
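The grouping-and-matching check might look like this in miniature; the 1-D threshold grouping and the 0.5 cut-off are illustrative stand-ins for feature-based grouping and the "low degree of matching" criterion.

```python
# Sketch: group samples by feature, treat the group id as the second label,
# and flag first labels that rarely co-occur with their group.
from collections import Counter

# (feature, first_label) pairs; one sample looks mislabelled
samples = [(0.1, "cat"), (0.2, "cat"), (0.15, "dog"),
           (5.0, "dog"), (5.1, "dog"), (5.2, "dog")]

def group_id(feature):
    # stand-in for grouping by feature similarity: threshold the 1-D feature
    return "g0" if feature < 1.0 else "g1"

pairs = Counter((group_id(f), label) for f, label in samples)
group_sizes = Counter(group_id(f) for f, _ in samples)
# degree of matching between each first label and its group's second label
matching = {pair: n / group_sizes[pair[0]] for pair, n in pairs.items()}
# combinations with a low degree of matching are surfaced to the labeller
suspicious = [pair for pair, m in matching.items() if m < 0.5]
```

Here the lone "dog" sample sitting in the "cat"-dominated group is flagged, which is exactly the kind of labeling inconsistency such a tool would surface.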
-
Patent number: 11868885
Abstract: According to an embodiment, a learning device includes a memory and one or more processors coupled to the memory. The one or more processors are configured to: generate a transformation matrix from learning data in which feature quantities and target values are held in a corresponding manner; and learn the parameters of a neural network which includes nodes equal in number to the number of rows of the transformation matrix, a first output layer representing a first estimation distribution according to the values of the nodes, and a second output layer representing a second estimation distribution decided according to the product of the transformation matrix and the first estimation distribution.
Type: Grant
Filed: August 21, 2020
Date of Patent: January 9, 2024
Assignee: KABUSHIKI KAISHA TOSHIBA
Inventors: Yuichi Kato, Kouta Nakata, Susumu Naito, Yasunori Taguchi, Kentaro Takagi
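The relationship between the two output layers can be sketched directly; the matrix and distributions below are invented, and the transformation matrix is made square so that "nodes equal in number to its rows" and the matrix-vector product line up without extra machinery.

```python
# Sketch: the first output layer gives a distribution over the nodes, and
# the second estimation distribution is the product of a transformation
# matrix (built from learning data) with the first distribution.

def matvec(mat, vec):
    return [sum(m * v for m, v in zip(row, vec)) for row in mat]

# illustrative transformation matrix; the network has as many nodes as
# this matrix has rows
transform = [
    [1.0, 0.0, 0.0],
    [0.0, 0.5, 0.5],
    [0.0, 0.5, 0.5],
]
first_distribution = [0.25, 0.25, 0.5]   # first output layer, over the nodes
second_distribution = matvec(transform, first_distribution)
```

Because each row of the matrix sums to 1, the product is again a distribution, so the second layer reads as a re-weighted version of the first estimate.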
-
Patent number: 11860716
Abstract: According to one embodiment, an information processing apparatus includes a processing circuit. The processing circuit calculates a first input/output error related to normal data and a second input/output error related to pseudo abnormal data different from the normal data, for each of a plurality of autoencoders having different network structures. The processing circuit outputs relational data indicating a relation between the network structure and the first input/output error and the second input/output error.
Type: Grant
Filed: February 22, 2022
Date of Patent: January 2, 2024
Assignee: KABUSHIKI KAISHA TOSHIBA
Inventors: Yuichi Kato, Kentaro Takagi, Kouta Nakata
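A toy version of the comparison, with dimension truncation standing in for autoencoders of different network structures; the selection heuristic at the end is my own illustration of how such relational data might be used, not something the patent claims.

```python
# Sketch: for each "structure" (bottleneck width), tabulate reconstruction
# error on normal vs pseudo-abnormal data, producing the relational data.

def reconstruct(x, width):
    # stand-in "autoencoder": keep only the first `width` input dimensions
    return [v if i < width else 0.0 for i, v in enumerate(x)]

def error(x, width):
    r = reconstruct(x, width)
    return sum((a - b) ** 2 for a, b in zip(x, r))

normal = [1.0, 2.0, 0.0, 0.0]           # normal data lives in the first dims
pseudo_abnormal = [1.0, 2.0, 3.0, 0.0]  # abnormality appears in a later dim

# relational data: (structure, first error, second error) per structure
relational = [(w, error(normal, w), error(pseudo_abnormal, w))
              for w in (1, 2, 3, 4)]
# illustrative use: prefer a structure whose abnormal error is far above its
# normal error, breaking ties toward the lower normal error
best = max(relational, key=lambda row: (row[2] - row[1], -row[1]))
```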
-
Publication number: 20230252361
Abstract: According to one embodiment, an information processing apparatus includes a processor. The processor generates a machine learning model by coupling one feature extractor to each of a plurality of predictors, the feature extractor being configured to extract a feature amount of data. The processor trains the machine learning model for a specific task using a result of ensembling a plurality of outputs from the predictors.
Type: Application
Filed: September 12, 2022
Publication date: August 10, 2023
Applicant: KABUSHIKI KAISHA TOSHIBA
Inventors: Yuichi KATO, Kentaro TAKAGI, Kouta NAKATA
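A minimal sketch of the architecture: one shared feature extractor feeding several predictor heads whose outputs are ensembled; the particular extractor, heads, and numbers are invented for illustration.

```python
# Sketch: one shared feature extractor, several predictors, and an
# ensembled output used for the task.

def feature_extractor(x):
    # shared feature amount consumed by every predictor head
    return [x, x * x]

predictor_heads = [
    lambda f: 0.5 * f[0] + 0.1 * f[1],
    lambda f: 0.4 * f[0] + 0.2 * f[1],
    lambda f: 0.6 * f[0] + 0.0 * f[1],
]

def ensemble_predict(x):
    f = feature_extractor(x)
    outputs = [head(f) for head in predictor_heads]
    return sum(outputs) / len(outputs)  # ensembled result for the task

prediction = ensemble_predict(2.0)
```

Training against the ensembled output (rather than each head separately) is what distinguishes this setup from an ordinary ensemble of independent models.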
-
Patent number: 11601074
Abstract: An actuator capable of attaining high output is provided. The actuator includes a frame structure part that forms a frame structure surrounding a housing part, and a volume change part housed in the housing part. The volume change part increases a volume thereof by input of external energy. The frame structure part has a higher Young's modulus than a Young's modulus of the volume change part. The housing part has an anisotropic shape, with the maximum width of the housing part in a first direction longer than its maximum width in a second direction different from the first direction.
Type: Grant
Filed: March 24, 2021
Date of Patent: March 7, 2023
Assignees: DENSO CORPORATION, TOKYO INSTITUTE OF TECHNOLOGY, NATIONAL UNIVERSITY CORPORATION TOKAI NATIONAL HIGHER EDUCATION AND RESEARCH SYSTEM
Inventors: Daichi Sakurai, Seiichiro Washino, Shota Chatani, Haruhiko Watanabe, Masatoshi Shioya, Daisuke Kimura, Toshihira Irisawa, Kentaro Takagi, Takashi Hasegawa
-
Publication number: 20220398146
Abstract: According to one embodiment, an information processing apparatus includes a processing circuit. The processing circuit calculates a first input/output error related to normal data and a second input/output error related to pseudo abnormal data different from the normal data, for each of a plurality of autoencoders having different network structures. The processing circuit outputs relational data indicating a relation between the network structure and the first input/output error and the second input/output error.
Type: Application
Filed: February 22, 2022
Publication date: December 15, 2022
Applicant: KABUSHIKI KAISHA TOSHIBA
Inventors: Yuichi KATO, Kentaro TAKAGI, Kouta NAKATA
-
Publication number: 20220314171
Abstract: The present invention relates to a separation membrane element including a supply-side channel member, in which: the supply-side channel member has a net shape in which plural fibrous rows X including fibrous objects A and plural fibrous rows Y including fibrous objects B cross each other sterically to form intersections; at least one of the fibrous objects A and the fibrous objects B has a large diameter portion and a small diameter portion along a longitudinal direction; at least one of the fibrous objects A and the fibrous objects B includes a thread that is thinner at a central portion located between intersection portions than at the large diameter portion; and a fiber between an arbitrary intersection and an adjacent intersection is a tapered fiber whose diameter increases gradually, in a taper, from one intersection toward the other intersection.
Type: Application
Filed: August 26, 2020
Publication date: October 6, 2022
Applicant: Toray Industries, Inc.
Inventors: Shu Taniguchi, Kentaro Takagi, Takeshi Konda
-
Publication number: 20220076049
Abstract: According to one embodiment, an importance analysis apparatus includes an importance calculator and a distribution calculator. Based on a trained model and a plurality of input data samples, the importance calculator calculates an importance of each of a plurality of feature amounts of each of the input data samples. The distribution calculator calculates the distribution of each feature amount's importance across the input data samples.
Type: Application
Filed: February 26, 2021
Publication date: March 10, 2022
Applicant: KABUSHIKI KAISHA TOSHIBA
Inventors: Kentaro TAKAGI, Kouta NAKATA
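A sketch of the two calculators, using a hand-written stand-in model and finite-difference sensitivity as the importance measure; both choices are my assumptions, since the abstract does not specify how importance is computed.

```python
# Sketch: per-sample importance of each feature amount (importance
# calculator), then its distribution across samples (distribution calculator).

def trained_model(x):
    # stand-in trained model: feature 0 matters a lot, feature 1 barely
    return 3.0 * x[0] + 0.1 * x[1] ** 2

def importance(model, x, eps=1e-4):
    # finite-difference sensitivity of the output to each feature
    out = []
    for i in range(len(x)):
        bumped = list(x)
        bumped[i] += eps
        out.append(abs(model(bumped) - model(x)) / eps)
    return out

samples = [[1.0, 0.0], [2.0, 1.0], [0.5, 2.0]]
per_sample = [importance(trained_model, x) for x in samples]
# distribution of each feature's importance across the input data samples
means = [sum(col) / len(col) for col in zip(*per_sample)]
spreads = [max(col) - min(col) for col in zip(*per_sample)]
```

Reporting the distribution, not just a mean, reveals features whose importance varies strongly from sample to sample, which a single averaged score would hide.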
-
Patent number: 11167249
Abstract: An object of the present invention is to provide a separation membrane having high permeability and selective removability of divalent/monovalent ions. The separation membrane of the present invention includes a supporting membrane and a separation functional layer formed on the supporting membrane, in which the separation functional layer contains a polymerized product of a polyfunctional amine with a polyfunctional acid halide, the polyfunctional amine contains a polyfunctional aliphatic amine as a main component, the separation functional layer has a hollow protuberant structure, and the separation functional layer has a relative surface area of 1.1 to 10.0.
Type: Grant
Filed: July 29, 2016
Date of Patent: November 9, 2021
Assignee: TORAY INDUSTRIES, INC.
Inventors: Masakazu Koiwa, Ryoma Miyamoto, Hiroaki Tanaka, Koji Nakatsuji, Kentaro Takagi
-
Publication number: 20210305917
Abstract: An actuator capable of attaining high output is provided. The actuator includes a frame structure part that forms a frame structure surrounding a housing part, and a volume change part housed in the housing part. The volume change part increases a volume thereof by input of external energy. The frame structure part has a higher Young's modulus than a Young's modulus of the volume change part. The housing part has an anisotropic shape, with the maximum width of the housing part in a first direction longer than its maximum width in a second direction different from the first direction.
Type: Application
Filed: March 24, 2021
Publication date: September 30, 2021
Applicants: DENSO CORPORATION, TOKYO INSTITUTE OF TECHNOLOGY, NATIONAL UNIVERSITY CORPORATION TOKAI NATIONAL HIGHER EDUCATION AND RESEARCH SYSTEM
Inventors: Daichi SAKURAI, Seiichiro WASHINO, Shota CHATANI, Haruhiko WATANABE, Masatoshi SHIOYA, Daisuke KIMURA, Toshihira IRISAWA, Kentaro TAKAGI, Takashi HASEGAWA
-
Publication number: 20210264259
Abstract: According to an embodiment, a learning device includes a memory and one or more processors coupled to the memory. The one or more processors are configured to: generate a transformation matrix from learning data in which feature quantities and target values are held in a corresponding manner; and learn the parameters of a neural network which includes nodes equal in number to the number of rows of the transformation matrix, a first output layer representing a first estimation distribution according to the values of the nodes, and a second output layer representing a second estimation distribution decided according to the product of the transformation matrix and the first estimation distribution.
Type: Application
Filed: August 21, 2020
Publication date: August 26, 2021
Applicant: KABUSHIKI KAISHA TOSHIBA
Inventors: Yuichi KATO, Kouta NAKATA, Susumu NAITO, Yasunori TAGUCHI, Kentaro TAKAGI
-
Publication number: 20210248426
Abstract: According to an embodiment, a learning device includes one or more processors. The processors calculate a latent vector of each of a plurality of first target data, by using a parameter of a learning model configured to output a latent vector indicating a feature of a target data. The processors calculate, for each first target data, first probabilities that the first target data belongs to virtual classes, on the assumption that the plurality of first target data belong to virtual classes different from each other. The processors update the parameter such that two losses both decrease: a first loss computed from the first probabilities, and a second loss that becomes lower as each of the element classes, to which the elements included in each of the first target data belong, is less related to the other element classes.
Type: Application
Filed: August 24, 2020
Publication date: August 12, 2021
Applicant: KABUSHIKI KAISHA TOSHIBA
Inventors: Yaling TAO, Kentaro TAKAGI, Kouta NAKATA