Patents by Inventor Kazuya KAKIZAKI
Kazuya KAKIZAKI has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20240062109
Abstract: A robustness evaluation device includes a similarity calculation unit that calculates the similarity between a feature of an input to an authentication model and a feature of a template; a local Lipschitz constant estimation unit that estimates a local Lipschitz constant of a function for calculating similarity between the feature of the input to the authentication model and the feature of the template, in a sphere centered on the input to the authentication model; and an evaluation value estimation unit that estimates the evaluation value of robustness of the authentication model based on the similarity, the determination threshold value for the similarity, and the local Lipschitz constant.
Type: Application
Filed: October 25, 2023
Publication date: February 22, 2024
Applicant: NEC Corporation
Inventor: Kazuya KAKIZAKI
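The evaluation value described here can be illustrated with a small numeric sketch. One standard reading of Lipschitz-based robustness is a certified radius: no perturbation smaller than (similarity − threshold) / L can move the similarity across the decision threshold. The function names and the cosine-similarity choice below are illustrative assumptions, not taken from the patent text.

```python
import numpy as np

def cosine_similarity(x, template):
    """Similarity between the feature of an input and the feature of a template."""
    return float(np.dot(x, template) / (np.linalg.norm(x) * np.linalg.norm(template)))

def robustness_evaluation(similarity, threshold, local_lipschitz):
    """Certified radius: by the Lipschitz bound |s(x) - s(x')| <= L * ||x - x'||,
    no perturbation smaller than this radius can flip the accept/reject decision."""
    return abs(similarity - threshold) / local_lipschitz

# Toy example: an accepted input whose similarity exceeds the threshold by 0.1.
feat = np.array([0.6, 0.8])
template = np.array([1.0, 0.0])
sim = cosine_similarity(feat, template)                      # 0.6
radius = robustness_evaluation(sim, threshold=0.5,
                               local_lipschitz=2.0)          # certified radius ≈ 0.05
```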
-
Publication number: 20230306273
Abstract: An information processing device includes a guide data acquirer that acquires a plurality of guide data items classified into a single target class; and an adversarial sample generator that generates one adversarial sample by using the plurality of guide data items.
Type: Application
Filed: August 20, 2020
Publication date: September 28, 2023
Applicant: NEC Corporation
Inventors: Kazuya KAKIZAKI, Inderjeet SINGH
-
Publication number: 20230281964
Abstract: Deep metric learning models are trained with multi-target adversarial examples by initializing a perturbation applied to a clean sample selected from a training sample set to form an adversarial example, the clean sample associated with a label sample, applying a deep metric learning model to the adversarial example and a plurality of target samples selected from the training sample set to obtain an adversarial feature vector and a plurality of target feature vectors, respectively, adjusting the perturbation to reduce difference among the adversarial feature vector and the plurality of target feature vectors to generate a multi-target adversarial example, applying the deep metric learning model to the clean sample, the label sample, and the multi-target adversarial example to obtain a clean feature vector, a label feature vector, and a multi-target adversarial feature vector, respectively, and adjusting the deep metric learning model based on the clean feature vector, the label feature vector, and the multi-target adversarial feature vector.
Type: Application
Filed: March 4, 2022
Publication date: September 7, 2023
Inventors: Inderjeet SINGH, Kazuya KAKIZAKI, Toshinori ARAKI
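The perturbation-adjustment step above can be sketched in a few lines of NumPy, with a fixed linear map standing in for the deep metric learning model. The model, step size, and iteration count are illustrative assumptions; the point is only that gradient descent on the perturbation pulls the adversarial feature toward all target feature vectors at once.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 8))                  # toy linear embedding (stand-in for the model)
embed = lambda v: W @ v

x = rng.standard_normal(8)                       # clean sample
targets = [embed(rng.standard_normal(8)) for _ in range(3)]   # features of the target samples

delta = np.zeros(8)                              # perturbation, initialised to zero
lr = 0.005
for _ in range(1000):
    feat = embed(x + delta)
    # gradient of sum_i ||feat - t_i||^2 with respect to delta
    grad = sum(2.0 * W.T @ (feat - t) for t in targets)
    delta -= lr * grad

adv_feat = embed(x + delta)                      # multi-target adversarial feature vector
```

At the optimum of this toy objective the adversarial feature sits at the mean of the target features, i.e. the single perturbed sample is close to every target at once.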
-
Publication number: 20230259818
Abstract: Calculate a plurality of feature vectors representing features of an input sample, which is multidimensional data, by using a plurality of feature calculation models. Calculate the similarity between the average value of the plurality of feature vectors and the representative vector corresponding to the class to which the input sample belongs, among a plurality of representative vectors corresponding to a plurality of classes respectively, the representative vector having the same dimensionality as each of the plurality of feature vectors. Learn parameters of the plurality of feature calculation models based on an evaluation function whose value is larger as the similarity between the average value of the plurality of feature vectors and the representative vector corresponding to the class to which the input sample belongs is smaller.
Type: Application
Filed: July 6, 2020
Publication date: August 17, 2023
Applicant: NEC Corporation
Inventors: Takuma AMADA, Kazuya KAKIZAKI
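The evaluation function can be sketched directly: average the feature vectors, compare the average to the class representative, and return a value that grows as that similarity shrinks. The use of cosine similarity and the toy two-dimensional vectors are assumptions for illustration only.

```python
import numpy as np

def evaluation_function(features, representative):
    """Value grows as the similarity between the average of the feature
    vectors and the class representative vector shrinks (1 - cosine)."""
    avg = np.mean(features, axis=0)
    cos = np.dot(avg, representative) / (np.linalg.norm(avg) * np.linalg.norm(representative))
    return 1.0 - cos

# Two feature-calculation models produce two feature vectors for one sample.
feats = [np.array([1.0, 0.0]), np.array([0.8, 0.2])]
rep = np.array([1.0, 0.0])          # representative vector of the sample's class
loss = evaluation_function(feats, rep)
```

Minimising this value with respect to the model parameters pulls the average feature toward the class representative, which is the stated learning objective.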
-
Publication number: 20230222782
Abstract: An adversarial example detection device includes a first feature extraction unit configured to extract first features with respect to input data and comparative data in a first calculation method, a second feature extraction unit configured to extract second features with respect to the input data and the comparative data in a second calculation method different from the first calculation method, and a determination unit configured to determine whether or not at least one piece of the input data and the comparative data is an adversarial example through calculation using the first features and the second features.
Type: Application
Filed: June 5, 2020
Publication date: July 13, 2023
Applicant: NEC Corporation
Inventors: Takuma AMADA, Kazuya KAKIZAKI, Toshinori ARAKI
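One plausible determination rule built on the two feature extractors is a disagreement test: an adversarial example tends to match the comparative data in the attacked feature space while mismatching in an independently computed one. The function names, the sign-based second method, and the thresholds below are all illustrative assumptions.

```python
import numpy as np

def is_adversarial(input_data, comparative, feat1, feat2, thresh1, thresh2):
    """Flag the pair when the two calculation methods disagree about
    whether the input data matches the comparative data."""
    d1 = np.linalg.norm(feat1(input_data) - feat1(comparative))
    d2 = np.linalg.norm(feat2(input_data) - feat2(comparative))
    return bool((d1 <= thresh1) != (d2 <= thresh2))

feat1 = lambda v: v                # first calculation method: raw features
feat2 = lambda v: np.sign(v)       # second, deliberately different method

benign = is_adversarial(np.array([1.0, 1.0]), np.array([1.1, 0.9]),
                        feat1, feat2, 0.5, 0.5)     # close in both spaces
suspect = is_adversarial(np.array([1.0, -0.05]), np.array([1.1, 0.1]),
                         feat1, feat2, 0.5, 0.5)    # close in one space only
```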
-
Publication number: 20220343214
Abstract: A robustness evaluation device includes a similarity calculation unit that calculates the similarity between a feature of an input to an authentication model and a feature of a template; a local Lipschitz constant estimation unit that estimates a local Lipschitz constant of a function for calculating similarity between the feature of the input to the authentication model and the feature of the template, in a sphere centered on the input to the authentication model; and an evaluation value estimation unit that estimates the evaluation value of robustness of the authentication model based on the similarity, the determination threshold value for the similarity, and the local Lipschitz constant.
Type: Application
Filed: August 29, 2019
Publication date: October 27, 2022
Applicant: NEC Corporation
Inventor: Kazuya KAKIZAKI
-
Publication number: 20220335298
Abstract: A robust learning device is a learning device that, with a parameter of n neural networks, training data, and a correct label serving as inputs, outputs the updated parameter, including: a model selection unit that selects neural networks, which are less than n and equal to or more than two, among the n neural networks; a limited objective function calculation unit that calculates, in a calculation process of an objective function including a process in which a value of the objective function becomes smaller as an output of the neural networks to the training data is closer to the correct label and a degree of similarity between the neural networks is smaller, a limited objective function including only the process relating to the neural networks selected by the model selection unit; and an update unit that updates the parameter such that a value of the limited objective function is decreased.
Type: Application
Filed: October 1, 2019
Publication date: October 20, 2022
Applicant: NEC Corporation
Inventors: Takuma AMADA, Kazuya KAKIZAKI, Toshinori ARAKI
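A minimal sketch of the limited objective function, assuming toy scalar-output linear models and an inner-product similarity penalty (both assumptions, not details from the patent): only the selected subset of the n models contributes, and the objective is smaller when the selected models fit the label while disagreeing with one another.

```python
import numpy as np

def limited_objective(models, x, y, selected, weight=0.1):
    """Objective restricted to the selected subset of the n models:
    prediction error plus a term that grows when the selected
    models' outputs are similar (i.e. less diverse)."""
    outs = [models[i] @ x for i in selected]
    pred_loss = sum((o - y) ** 2 for o in outs)
    similarity = sum(outs[i] * outs[j]
                     for i in range(len(outs)) for j in range(i + 1, len(outs)))
    return pred_loss + weight * similarity

models = [np.array([1.0]), np.array([1.0]), np.array([-1.0])]   # n = 3 toy linear models
x, y = np.array([1.0]), 0.0
obj_similar = limited_objective(models, x, y, selected=[0, 1])  # two identical models
obj_diverse = limited_objective(models, x, y, selected=[0, 2])  # two dissimilar models
```

Here both subsets fit the label equally badly, so the diversity term alone makes the similar pair score worse, which is the behaviour the abstract describes.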
-
Publication number: 20220237416
Abstract: A learning apparatus includes: a prediction loss calculating device that calculates a prediction loss function based on an error between outputs of machine learning models to which training data is inputted and a ground truth label; a gradient loss calculating device that calculates a gradient loss function based on a gradient of the prediction loss function; and an updating device that performs an update operation of updating the machine learning models on the basis of the prediction loss function and the gradient loss function, the gradient loss calculating device calculates the gradient loss function based on the gradient when the number of times which the update operation is performed is smaller than a predetermined number, and calculates a function that represents zero as the gradient loss function when the number of times which the update operation is performed is larger than the predetermined number.
Type: Application
Filed: May 21, 2019
Publication date: July 28, 2022
Applicant: NEC Corporation
Inventors: Toshinori Araki, Takuma Amada, Kazuya Kakizaki
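The schedule in this abstract reduces to a simple switch: the gradient-based term is active while the update count is below the predetermined number, and the function representing zero is used afterwards. A minimal sketch, with the cutoff value and function names as assumptions:

```python
def gradient_loss(grad_penalty, update_count, cutoff):
    """Gradient loss is used while the number of updates performed is
    smaller than the predetermined number; afterwards the function
    representing zero is returned instead."""
    return grad_penalty if update_count < cutoff else 0.0

def training_loss(prediction_loss, grad_penalty, update_count, cutoff=100):
    """Total loss driving each update: prediction loss plus the
    (possibly zeroed) gradient loss."""
    return prediction_loss + gradient_loss(grad_penalty, update_count, cutoff)

early = training_loss(0.5, 0.2, update_count=10)    # 0.7: both terms active
late = training_loss(0.5, 0.2, update_count=500)    # 0.5: gradient loss switched off
```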
-
Publication number: 20220188706
Abstract: There is provided a system for computing a secure statistical classifier, comprising: at least one hardware processor executing a code for: accessing code instructions of an untrained statistical classifier, accessing a training dataset, accessing a plurality of cryptographic keys, creating a plurality of instances of the untrained statistical classifier, creating a plurality of trained sub-classifiers by training each of the plurality of instances of the untrained statistical classifier by iteratively adjusting adjustable classification parameters of the respective instance of the untrained statistical classifier according to a portion of the training data serving as input and a corresponding ground truth label, and at least one unique cryptographic key of the plurality of cryptographic keys, wherein the adjustable classification parameters of each trained sub-classifier have unique values computed according to corresponding at least one unique cryptographic key, and providing the statistical classifier, whe…
Type: Application
Filed: March 1, 2022
Publication date: June 16, 2022
Applicants: NEC Corporation Of America, Bar-Ilan University, NEC Corporation
Inventors: Jun FURUKAWA, Joseph KESHET, Kazuma OHARA, Toshinori ARAKI, Hikaru TSUCHIDA, Takuma AMADA, Kazuya KAKIZAKI, Shir AVIV-REUVEN
-
Patent number: 11315037
Abstract: There is provided a system for computing a secure statistical classifier, comprising: at least one hardware processor executing a code for: accessing code instructions of an untrained statistical classifier, accessing a training dataset, accessing a plurality of cryptographic keys, creating a plurality of instances of the untrained statistical classifier, creating a plurality of trained sub-classifiers by training each of the plurality of instances of the untrained statistical classifier by iteratively adjusting adjustable classification parameters of the respective instance of the untrained statistical classifier according to a portion of the training data serving as input and a corresponding ground truth label, and at least one unique cryptographic key of the plurality of cryptographic keys, wherein the adjustable classification parameters of each trained sub-classifier have unique values computed according to corresponding at least one unique cryptographic key, and providing the statistical classifier, whe…
Type: Grant
Filed: March 14, 2019
Date of Patent: April 26, 2022
Assignees: NEC Corporation Of America, Bar-Ilan University, NEC Corporation
Inventors: Jun Furukawa, Joseph Keshet, Kazuma Ohara, Toshinori Araki, Hikaru Tsuchida, Takuma Amada, Kazuya Kakizaki, Shir Aviv-Reuven
-
Publication number: 20220121991
Abstract: A model building apparatus includes: a building unit that builds a generation model that outputs an adversarial example, which causes misclassification by a learned model, when a source sample is entered into the generation model; and a calculating unit that calculates a first evaluation value and a second evaluation value, wherein the first evaluation value is smaller as a difference is smaller between an actual visual feature of the adversarial example outputted from the generation model and a target visual feature of the adversarial example that are set to be different from a visual feature of the source sample, and the second evaluation value is smaller as there is a higher possibility that the learned model misclassifies the adversarial example outputted from the generation model. The building unit builds the generation model by updating the generation model such that an index value based on the first and second evaluation values is smaller.
Type: Application
Filed: February 12, 2019
Publication date: April 21, 2022
Applicant: NEC CorporationInventors: Kazuya Kakizaki, Kosuke Yoshida
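The index value combining the two evaluation values can be sketched as a weighted sum: one term shrinks as the generated example's visual feature approaches the target visual feature, the other shrinks as misclassification becomes more likely. The distance metric, the negative-log form, and the weights are illustrative assumptions.

```python
import numpy as np

def index_value(adv_visual, target_visual, misclass_prob, w1=1.0, w2=1.0):
    """Index value driving the generation-model update: smaller when the
    adversarial example matches the target visual feature (first evaluation
    value) and when the learned model is more likely to misclassify it
    (second evaluation value)."""
    first = np.linalg.norm(adv_visual - target_visual)   # visual-feature mismatch
    second = -np.log(misclass_prob + 1e-12)              # low when misclassification is likely
    return w1 * first + w2 * second

# A well-built generation model scores low; a poor one scores high.
good = index_value(np.array([1.0, 0.0]), np.array([1.0, 0.0]), misclass_prob=0.99)
bad = index_value(np.array([0.0, 1.0]), np.array([1.0, 0.0]), misclass_prob=0.10)
```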
-
Publication number: 20220027677
Abstract: An information processing device includes a sample candidate generation unit that generates a sample candidate to be authenticated to belong to a target class that is a class inducing erroneous authentication, from source data belonging to a class other than the target class, on the basis of a similarity degree with data belonging to the target class in a template that is data registered in advance, and a similarity degree with data not belonging to the target class in the template.
Type: Application
Filed: December 12, 2018
Publication date: January 27, 2022
Applicant: NEC Corporation
Inventor: Kazuya KAKIZAKI
-
Publication number: 20220006824
Abstract: A control system (10) includes plural sensors (14), plural actuators (16), and a controller (18). An information processing apparatus (2000) acquires configuration information representing a configuration of the control system (10), a control rule representing a rule of control of each actuator (16) by the controller (18), and behavior log data indicating a combination of time-series data of an observed value of the sensor (14), and a state of each actuator (16) at each time. The information processing apparatus (2000) generates, for each combination of states of the actuators (16) and for each sensor (14), a behavioral function representing a temporal change of the observed value of the sensor (14) regarding the combination of states of the plural actuators, using the behavior log data, and generates a system model of the control system (10) using the configuration information, the control rule, and the behavioral function.
Type: Application
Filed: November 22, 2018
Publication date: January 6, 2022
Applicant: NEC Corporation
Inventor: Kazuya KAKIZAKI
-
Publication number: 20210241119
Abstract: A pre-trained model update device includes: an alternative example generation unit configured to generate an alternative example and a correct answer label corresponding to the alternative example, based on a generative model representing training data used in generating a pre-trained model; an adversarial example generation unit configured to generate an adversarial example inducing the pre-trained model to misclassify and a correction label corresponding to the adversarial example, based on an attack model and based on the alternative example and the correct answer label generated by the alternative example generation unit; and a model update unit configured to perform additional learning based on a result of generation by the alternative example generation unit and a result of generation by the adversarial example generation unit, and generate an updated model.
Type: Application
Filed: April 27, 2018
Publication date: August 5, 2021
Applicant: NEC Corporation
Inventors: Tsubasa TAKAHASHI, Kazuya KAKIZAKI
-
Publication number: 20200293944
Abstract: There is provided a system for computing a secure statistical classifier, comprising: at least one hardware processor executing a code for: accessing code instructions of an untrained statistical classifier, accessing a training dataset, accessing a plurality of cryptographic keys, creating a plurality of instances of the untrained statistical classifier, creating a plurality of trained sub-classifiers by training each of the plurality of instances of the untrained statistical classifier by iteratively adjusting adjustable classification parameters of the respective instance of the untrained statistical classifier according to a portion of the training data serving as input and a corresponding ground truth label, and at least one unique cryptographic key of the plurality of cryptographic keys, wherein the adjustable classification parameters of each trained sub-classifier have unique values computed according to corresponding at least one unique cryptographic key, and providing the statistical classifier, whe…
Type: Application
Filed: March 14, 2019
Publication date: September 17, 2020
Applicants: NEC Corporation Of America, Bar-Ilan University, NEC Corporation
Inventors: Jun FURUKAWA, Joseph KESHET, Kazuma OHARA, Toshinori ARAKI, Hikaru TSUCHIDA, Takuma AMADA, Kazuya KAKIZAKI, Shir AVIV-REUVEN