Patents by Inventor Shuhei Nitta
Shuhei Nitta has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20240135155
Abstract: In a data processing device, a fixed-point position control unit determines a fixed-point position as first control. The fixed-point position control unit causes a detection calculation unit to perform calculation processing on processing target data at a processing point in time. A saturation rate control unit instructs, as second control to be repeated by the fixed-point position control unit, the fixed-point position control unit to move at least the fixed-point position as control to increase a lower limit saturation rate proportional to a magnitude of a counted lower limit counter value with respect to a result of the first control. The fixed-point position control unit performs, as the second control, a predetermined determination on the basis of the instruction from the saturation rate control unit and the metadata, determines the fixed-point position moved for each layer, and causes calculation processing to be performed.
Type: Application
Filed: December 28, 2020
Publication date: April 25, 2024
Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Saki HATTA, Hiroyuki UZAWA, Shuhei YOSHIDA, Daisuke KOBAYASHI, Yuya OMORI, Ken NAKAMURA, Koyo NITTA
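The general idea of saturation-rate-driven fixed-point control can be illustrated with a small sketch. This is entirely hypothetical code, not the claimed apparatus: it quantizes values to signed fixed point, counts values clipped at the lower limit, and shifts the fixed-point position until the lower-limit saturation rate drops below an allowed threshold.

```python
# Hypothetical sketch (not the patented method): dynamic fixed-point
# quantization that tracks a lower-limit saturation rate and shifts the
# fixed-point position when too many values clip at the lower bound.

def quantize(values, frac_bits, total_bits=8):
    """Quantize to signed fixed point; return codes and a lower-limit clip count."""
    lo = -(1 << (total_bits - 1))
    hi = (1 << (total_bits - 1)) - 1
    scale = 1 << frac_bits
    codes, lower_clips = [], 0
    for v in values:
        q = round(v * scale)
        if q <= lo:          # saturated at (or below) the lower limit
            q = lo
            lower_clips += 1
        elif q > hi:
            q = hi
        codes.append(q)
    return codes, lower_clips

def adjust_position(values, frac_bits, max_lower_rate=0.01):
    """Move the fixed-point position while the lower-limit saturation rate
    exceeds the allowed rate (a loose analogue of the repeated second control)."""
    while frac_bits > 0:
        _, clips = quantize(values, frac_bits)
        if clips / len(values) <= max_lower_rate:
            break
        frac_bits -= 1   # fewer fraction bits widen the representable range
    return frac_bits
```

Reducing the fraction bits trades precision for range, which is exactly the lever such a control loop would pull when lower-limit saturation is too frequent.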
-
Patent number: 11947507
Abstract: A traffic monitoring apparatus that monitors traffic of a monitoring target network includes a statistical information processor that acquires statistical information per specific flow of the traffic, and a packet capture unit that captures a packet of the specific flow, in which the statistical information processor includes a statistical information aggregation unit that aggregates the pieces of statistical information, and a statistical information file generation unit that generates a statistical information file based on the pieces of aggregated statistical information.
Type: Grant
Filed: May 26, 2020
Date of Patent: April 2, 2024
Assignee: Nippon Telegraph and Telephone Corporation
Inventors: Hiroyuki Uzawa, Shuhei Yoshida, Namiko Ikeda, Koyo Nitta
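A minimal software analogue of per-flow aggregation and statistics-file generation might look like the following. All names and the CSV format here are assumptions for illustration, not the patented design.

```python
# Hypothetical sketch (not the patented design): aggregate per-flow
# statistics from observed packets and emit a statistics file as CSV text.
import csv
import io
from collections import defaultdict

def aggregate(packets):
    """packets: iterable of (flow_key, byte_len); returns {flow: [pkts, bytes]}."""
    stats = defaultdict(lambda: [0, 0])
    for flow, nbytes in packets:
        stats[flow][0] += 1
        stats[flow][1] += nbytes
    return stats

def to_stats_file(stats):
    """Render the aggregated statistics as CSV text (a 'statistics file')."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["flow", "packets", "bytes"])
    for flow, (pkts, nbytes) in sorted(stats.items()):
        writer.writerow([flow, pkts, nbytes])
    return buf.getvalue()
```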
-
Patent number: 11926195
Abstract: A gap formed in a front portion of a door is easily and reliably filled so that occurrence of noise is reduced. A frame front side portion retaining a glass run front side portion being inserted is provided in a front portion of a window frame. A front side of an outer panel is provided with a mirror base to which a mirror base cover is attached. A rear edge of the mirror base is provided with a rear plate portion extending to a cabin inner side. A gap is formed between the frame front side portion and the rear plate portion. An outer sealing plate portion overlapping with a cabin outer side of the mirror base is provided with a gap filling portion extending downward.
Type: Grant
Filed: October 22, 2021
Date of Patent: March 12, 2024
Assignee: Nishikawa Rubber Co., Ltd.
Inventors: Haruka Yanoshita, Masaki Motodera, Keizo Matsuoka, Shuhei Nitta, Tatsuya Nagai
-
Patent number: 11916763
Abstract: A traffic monitoring apparatus includes: a header analysis circuit configured to acquire one or more identifiers from a header of a received packet; a rule registration circuit configured to convert a rule table including rules in which one or more rule elements are registered for each of the rules into a predetermined format and register the rule table in a rule matching circuit; and the rule matching circuit configured to search for rules to be matched with the acquired identifiers.
Type: Grant
Filed: July 1, 2019
Date of Patent: February 27, 2024
Assignee: Nippon Telegraph and Telephone Corporation
Inventors: Yuta Ukon, Shuhei Yoshida, Shoko Oteru, Namiko Ikeda, Koyo Nitta
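The header-analysis plus rule-matching pipeline can be sketched in software. This is a hypothetical illustration of the general technique (wildcard rule elements matched against header identifiers), not the patented circuit or its rule-table format.

```python
# Hypothetical sketch (not the patented circuit): software analogue of
# header analysis followed by rule matching, where None acts as a
# wildcard rule element.
ANY = None

def parse_header(header):
    """Extract identifiers (here assumed to be src, dst, proto) from a header dict."""
    return (header["src"], header["dst"], header["proto"])

def match_rules(identifiers, rule_table):
    """Return the names of all rules whose elements match the identifiers."""
    hits = []
    for name, elements in rule_table:
        if all(e is ANY or e == i for e, i in zip(elements, identifiers)):
            hits.append(name)
    return hits
```

A hardware rule-matching circuit would typically evaluate all rules in parallel; the loop above is the sequential equivalent.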
-
Publication number: 20240028902
Abstract: According to one embodiment, a learning apparatus includes a processor. The processor trains a neural network model having a plurality of pathways and generates a trained model. The processor performs pruning on the trained model and calculates a number of remaining parameters of each of the pathways. The processor generates a candidate model for reconstruction, the candidate model for reconstruction being generated by deleting a pathway in which the number of parameters is equal to or less than a threshold. The processor determines whether or not deletion of a further pathway included in the candidate model for reconstruction is possible. If it is determined that deletion of the further pathway is possible, the candidate model for reconstruction is subjected to each of the training, the pruning, and the generating.
Type: Application
Filed: February 24, 2023
Publication date: January 25, 2024
Applicant: KABUSHIKI KAISHA TOSHIBA
Inventors: Albert RODRIGUEZ MULET, Shuhei NITTA, Ryusuke HIRAI
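The prune-then-delete-pathway step can be illustrated with a toy model where each pathway is just a list of weights. This is a hypothetical sketch of generic magnitude pruning followed by pathway deletion, not the claimed procedure.

```python
# Hypothetical sketch (not the patented procedure): magnitude pruning over
# a multi-pathway model, then deletion of pathways whose surviving
# parameter count falls at or below a threshold.

def prune(pathways, magnitude=0.1):
    """Keep only weights with |w| > magnitude in each pathway."""
    return {name: [w for w in ws if abs(w) > magnitude]
            for name, ws in pathways.items()}

def reconstruct(pathways, min_params=1):
    """Delete pathways whose remaining parameter count is <= min_params."""
    return {name: ws for name, ws in pathways.items() if len(ws) > min_params}

# Toy model: pathway names and weights are made up for illustration.
model = {"path_a": [0.9, -0.5, 0.01], "path_b": [0.05, -0.02, 0.03]}
candidate = reconstruct(prune(model))
```

In the abstract's loop, the surviving `candidate` would then be retrained, re-pruned, and re-checked until no further pathway can be deleted.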
-
Publication number: 20240028901
Abstract: According to one embodiment, a learning apparatus includes a processor. The processor performs, on a neural network model, adaptation processing that includes at least either insertion of an activation function or correction of the activation function. The processor generates a trained model by training the neural network model on which the adaptation processing has been performed. The processor performs pruning on the trained model to generate a reconstructed model from which a parameter has been reduced.
Type: Application
Filed: February 22, 2023
Publication date: January 25, 2024
Applicant: KABUSHIKI KAISHA TOSHIBA
Inventors: Albert RODRIGUEZ MULET, Shuhei NITTA, Yoshiyuki KOKOJIMA, Ryusuke HIRAI, Yasutaka FURUSHO, Manabu NISHIYAMA, Yusuke NATSUI
-
Publication number: 20240005172
Abstract: According to one embodiment, a learning system includes a plurality of local devices and a server. Each of the local devices includes a processor. The processor selects a mini-batch from local data. The processor trains a local model using the mini-batch. The processor generates local data information relating to the local data included in the mini-batch and indicating information different from a label. The processor transmits a local model parameter relating to the local model and the local data information to the server. The server includes a processor. The processor calculates an integrated parameter using the local data information acquired from each of the local devices. The processor updates a global model using the integrated parameter and the local model parameter acquired from each of the local devices.
Type: Application
Filed: February 15, 2023
Publication date: January 4, 2024
Applicant: KABUSHIKI KAISHA TOSHIBA
Inventors: Shuhei NITTA, Ryusuke HIRAI, Yoshiyuki KOKOJIMA
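Server-side integration of local updates weighted by non-label data information can be sketched as follows. Here the "local data information" is assumed, purely for illustration, to be each device's mini-batch size; the resulting rule is ordinary size-weighted averaging, not necessarily the claimed method.

```python
# Hypothetical sketch (not the patented protocol): server-side aggregation
# where each local device reports its model parameters plus non-label
# "local data information" (assumed here to be its mini-batch size),
# which the server turns into integration weights.

def integrate(updates):
    """updates: list of (params, batch_size); returns the weighted-average params."""
    total = sum(size for _, size in updates)
    dim = len(updates[0][0])
    return [sum(params[i] * size for params, size in updates) / total
            for i in range(dim)]
```

A device that saw three times as much data in its mini-batch pulls the integrated parameters three times as hard.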
-
Publication number: 20230297811
Abstract: According to one embodiment, a learning apparatus includes a processor. The processor divides target data into pieces of partial data. The processor inputs the pieces of partial data into a first network model to output a first prediction result and calculates a first confidence indicating a degree of contribution to the first prediction result. The processor inputs the target data into a second network model to output a second prediction result and calculates a second confidence indicating a degree of contribution to the second prediction result. The processor updates a parameter of the first network model, based on the first prediction result, the second prediction result, the first confidence and the second confidence.
Type: Application
Filed: August 31, 2022
Publication date: September 21, 2023
Applicant: KABUSHIKI KAISHA TOSHIBA
Inventors: Atsushi YAGUCHI, Shuhei NITTA, Akiyuki TANIZAWA, Ryusuke HIRAI
-
Patent number: 11747424
Abstract: A magnetic resonance imaging apparatus according to an embodiment includes an MRI system and processing circuitry. The MRI system includes a receiving coil to receive a magnetic resonance signal. The processing circuitry is configured to: generate an image based on the magnetic resonance signal, the image including a plurality of pixels; calculate, for each pixel, a feature value corresponding to the signal value of the pixel; correct the feature values based on a sensitivity of the receiving coil; and reduce noise in the image based on the distribution of the corrected feature values.
Type: Grant
Filed: March 2, 2022
Date of Patent: September 5, 2023
Assignee: CANON MEDICAL SYSTEMS CORPORATION
Inventors: Kenzo Isogawa, Toshiyuki Ono, Kenichi Shimoyama, Nobuyuki Matsumoto, Shuhei Nitta, Satoshi Kawata, Toshimitsu Kaneko, Mai Murashima
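The sensitivity-correction step has a simple general form: receive-coil sensitivity shades the signal multiplicatively, so dividing each feature by the local sensitivity removes the shading before the noise statistics are estimated. The following is a hypothetical one-dimensional sketch, not the claimed circuitry.

```python
# Hypothetical sketch (not the patented circuitry): correct per-pixel
# feature values by the receive-coil sensitivity before estimating the
# noise distribution.

def correct_features(features, sensitivity, eps=1e-6):
    """Divide each feature by the local coil sensitivity (shading removal).

    eps guards against division by a near-zero sensitivity estimate.
    """
    return [f / max(s, eps) for f, s in zip(features, sensitivity)]
```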
-
Patent number: 11704570
Abstract: A learning device includes a structure search unit that searches for a first learned model structure obtained by selecting search space information in accordance with a target constraint condition of target hardware for each of a plurality of convolution processing blocks included in a base model structure in a neural network model; a parameter search unit that searches for a learning parameter of the neural network model in accordance with the target constraint condition; and a pruning unit that deletes a unit of at least one of the plurality of convolution processing blocks in the first learned model structure based on the target constraint condition and generates a second learned model structure.
Type: Grant
Filed: February 26, 2020
Date of Patent: July 18, 2023
Assignee: KABUSHIKI KAISHA TOSHIBA
Inventors: Akiyuki Tanizawa, Wataru Asano, Atsushi Yaguchi, Shuhei Nitta, Yukinobu Sakata
-
Patent number: 11654553
Abstract: A robot system according to an embodiment includes one or more processors. The processors acquire first input data predetermined as data affecting an operation of a robot. The processors calculate a calculation cost of inference processing using a machine learning model for inferring control data used for controlling the robot, on the basis of the first input data. The processors infer the control data by the machine learning model set according to the calculation cost. The processors control the robot using the inferred control data.
Type: Grant
Filed: February 25, 2020
Date of Patent: May 23, 2023
Assignee: KABUSHIKI KAISHA TOSHIBA
Inventors: Shuhei Nitta, Atsushi Yaguchi, Yukinobu Sakata, Akiyuki Tanizawa, Yasutoyo Takeyama, Tomoki Watanabe
-
Patent number: 11640530
Abstract: A learning device includes one or more processors. The processors acquire input data and a target label indicating a correct answer of inference based on the input data. The processors add noise to at least one of the input data and intermediate layer data of the neural network and perform inference by the neural network with respect to the input data. The noise is based on contributions of a plurality of elements included in the input data with respect to an inference result when the input data is input to a neural network. The processors update parameters of the neural network so that the inference result by the neural network matches the target label.
Type: Grant
Filed: February 24, 2020
Date of Patent: May 2, 2023
Assignee: KABUSHIKI KAISHA TOSHIBA
Inventor: Shuhei Nitta
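Contribution-scaled noise injection can be shown on a toy linear model, where a natural contribution measure for element i is |w_i * x_i|. The model, the contribution measure, and the Gaussian noise are all illustrative assumptions, not the patented training rule.

```python
# Hypothetical sketch (not the patented training rule): for a toy linear
# model, take each input element's contribution as |w_i * x_i| and inject
# noise whose scale grows with that contribution before the forward pass.
import random

def contributions(w, x):
    """Per-element contribution of input x to the linear output sum(w_i * x_i)."""
    return [abs(wi * xi) for wi, xi in zip(w, x)]

def noisy_forward(w, x, scale=0.1, rng=None):
    """Forward pass with zero-mean noise scaled by each element's contribution."""
    rng = rng or random.Random(0)
    c = contributions(w, x)
    x_noisy = [xi + rng.gauss(0.0, scale * ci) for xi, ci in zip(x, c)]
    return sum(wi * xi for wi, xi in zip(w, x_noisy))
```

Elements that matter most to the prediction receive the strongest perturbation, which is what makes the regularization target the model's actual decision evidence.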
-
Publication number: 20230090616
Abstract: According to one embodiment, a learning system includes a plurality of local devices and a server. Each of the local devices includes a processor. The processor of the local device selects a first parameter set from a plurality of parameters related to the local model, and transmits the first parameter set to the server. At least one of the local devices is different from other local devices in a size of the local model in accordance with a resolution of input data. The server comprises a processor. The processor of the server integrates first parameter sets acquired from the local devices and updates a global model. The processor of the server transmits the second parameter set to a local device that has transmitted the corresponding first parameter set.
Type: Application
Filed: February 23, 2022
Publication date: March 23, 2023
Applicant: KABUSHIKI KAISHA TOSHIBA
Inventors: Shuhei NITTA, Atsushi YAGUCHI, Akiyuki TANIZAWA
-
Patent number: 11604999
Abstract: A learning device according to an embodiment includes one or more hardware processors configured to function as a generation unit, an inference unit, and a training unit. The generation unit generates input data with which an error between a value output from each of one or more target nodes and a preset aimed value is equal to or less than a preset value, the target nodes being in a target layer of a plurality of layers included in a first neural network. The inference unit causes the input data to propagate in a forward direction of the first neural network to generate output data. The training unit trains a second neural network differing from the first neural network by using training data including a set of the input data and the output data.
Type: Grant
Filed: February 26, 2020
Date of Patent: March 14, 2023
Assignee: KABUSHIKI KAISHA TOSHIBA
Inventors: Wataru Asano, Akiyuki Tanizawa, Atsushi Yaguchi, Shuhei Nitta, Yukinobu Sakata
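The input-synthesis step, which amounts to driving a target node toward an aimed value by gradient descent on the node error, can be sketched with a one-node toy teacher. The teacher function, learning rate, and aimed value below are all illustrative assumptions.

```python
# Hypothetical sketch (not the patented method): synthesize an input that
# drives a chosen "node" of a toy teacher toward an aimed value, then use
# (input, teacher output) as a training pair for a separate student network.

def teacher_node(x):
    """Stand-in for one target-layer node of the first network."""
    return 3.0 * x + 1.0

def synthesize(aimed, lr=0.1, steps=200):
    """Gradient descent on (teacher_node(x) - aimed)^2 with respect to x."""
    x = 0.0
    for _ in range(steps):
        err = teacher_node(x) - aimed
        x -= lr * 2.0 * err * 3.0   # chain rule: d node / d x = 3.0
    return x

x_syn = synthesize(10.0)
pair = (x_syn, teacher_node(x_syn))   # one synthetic training example
```

Pairs like `pair` form the training set for the second (student) network, so no real data is needed.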
-
Publication number: 20230056947
Abstract: According to one embodiment, a learning apparatus includes a processing circuit. The processing circuit acquires a first training condition and a first model trained in accordance with the first training condition; sets a second training condition, different from the first training condition, used to reduce the model size of the first model; trains, in accordance with the second training condition and based on the first model, a second model whose model size is smaller than that of the first model; and, in accordance with a third training condition that is not the same as the second training condition and complies with the first training condition, trains a third model based on the second model.
Type: Application
Filed: February 28, 2022
Publication date: February 23, 2023
Applicant: KABUSHIKI KAISHA TOSHIBA
Inventors: Shuhei NITTA, Yasutaka Furusho, Albert Rodriguez Mulet, Atsushi Yaguchi, Akiyuki TANIZAWA
-
Publication number: 20230022566
Abstract: According to one embodiment, a machine learning apparatus includes a processing circuit. The processing circuit trains a first learning parameter of an extraction layer configured to extract feature data of the input data, based on a plurality of training data. The processing circuit trains a second learning parameter of a reconstruction layer configured to generate reconstructed data of the input data, based on a plurality of training feature data obtained by applying the trained extraction layer to the plurality of training data. The second learning parameter represents as many representative vectors as the dimension count of the feature data. Each of the representative vectors is based on a weighted sum of the plurality of training data.
Type: Application
Filed: February 25, 2022
Publication date: January 26, 2023
Applicant: KABUSHIKI KAISHA TOSHIBA
Inventors: Yasutaka FURUSHO, Yukinobu SAKATA, Shuhei NITTA
-
Publication number: 20220284238
Abstract: According to one embodiment, a learning apparatus includes a processor. The processor determines, based on a data resolution of subject data obtained at a subject device, a plurality of data resolutions that differ from one another within a range covering the data resolution of the subject data, the data resolutions each indicating a corresponding amount of information per unit. The processor trains a scalable network with training samples corresponding to each of the plurality of data resolutions, the scalable network being a neural network adapted to change a data resolution of input data.
Type: Application
Filed: August 30, 2021
Publication date: September 8, 2022
Applicant: KABUSHIKI KAISHA TOSHIBA
Inventors: Shuhei NITTA, Akiyuki TANIZAWA
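The resolution-range step can be illustrated for a one-dimensional signal: pick several resolutions around the subject device's resolution, then resample training samples to each. The scaling factors and nearest-neighbour resampling below are illustrative assumptions, not the claimed method.

```python
# Hypothetical sketch (not the patented method): pick several data
# resolutions around a subject device's resolution, then resample a 1-D
# signal to each so one scalable model can be trained on every variant.

def resolution_range(subject_res, factors=(0.5, 1.0, 2.0)):
    """Resolutions (samples per unit) covering the subject resolution."""
    return sorted(max(1, int(subject_res * f)) for f in factors)

def resample(signal, target_len):
    """Nearest-neighbour resampling of a 1-D signal to target_len samples."""
    n = len(signal)
    return [signal[min(n - 1, int(i * n / target_len))]
            for i in range(target_len)]
```

Training on every resolution in `resolution_range(...)` is what lets a single network serve devices whose sensors differ in resolution.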
-
Patent number: 11436490
Abstract: A providing apparatus according to an embodiment of the present disclosure includes a memory and a hardware processor coupled to the memory. The hardware processor is configured to: store, in the memory, a first machine learning model capable of changing an amount of calculation of a model of a neural network; acquire device information; set, based on the device information, extraction conditions representing conditions for extracting second machine learning models from the first machine learning model; extract the second machine learning models from the first machine learning model based on the extraction conditions; and provide the second machine learning models to a device specified by the device information.
Type: Grant
Filed: February 26, 2020
Date of Patent: September 6, 2022
Assignee: KABUSHIKI KAISHA TOSHIBA
Inventors: Akiyuki Tanizawa, Atsushi Yaguchi, Shuhei Nitta, Yukinobu Sakata
-
Publication number: 20220187402
Abstract: A magnetic resonance imaging apparatus according to an embodiment includes an MRI system and processing circuitry. The MRI system includes a receiving coil to receive a magnetic resonance signal. The processing circuitry is configured to: generate an image based on the magnetic resonance signal, the image including a plurality of pixels; calculate, for each pixel, a feature value corresponding to the signal value of the pixel; correct the feature values based on a sensitivity of the receiving coil; and reduce noise in the image based on the distribution of the corrected feature values.
Type: Application
Filed: March 2, 2022
Publication date: June 16, 2022
Applicant: CANON MEDICAL SYSTEMS CORPORATION
Inventors: Kenzo ISOGAWA, Toshiyuki ONO, Kenichi SHIMOYAMA, Nobuyuki MATSUMOTO, Shuhei NITTA, Satoshi KAWATA, Toshimitsu KANEKO, Mai MURASHIMA
-
Patent number: 11327031
Abstract: A photon counting X-ray CT apparatus according to an embodiment includes data acquiring circuitry and processing circuitry. The data acquiring circuitry is configured to allocate energy measured by signals output from a photon counting detector in response to incidence of X-ray photons to any of a plurality of first energy bins so as to acquire a first data group as count data of each of the first energy bins. The processing circuitry is configured to determine a plurality of second energy bins obtained by grouping the first energy bins in accordance with a decomposition target material that is a material to be decomposed in an imaging region, allocate the first data group to any of the second energy bins so as to generate a second data group, and use the second data group to generate an image representing a distribution of the decomposition target material.
Type: Grant
Filed: October 1, 2019
Date of Patent: May 10, 2022
Assignee: CANON MEDICAL SYSTEMS CORPORATION
Inventors: Kenta Moriyasu, Taichiro Shiodera, Shuhei Nitta, Tomoyuki Takeguchi, Hidenori Takeshima, Toshiyuki Ono, Takashi Ida, Hiroaki Nakai
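The regrouping step reduces to summing fine-bin counts into coarse bins whose boundaries are chosen for the decomposition target material. The grouping below (six fine bins into two coarse bins) is an illustrative assumption, not the claimed apparatus.

```python
# Hypothetical sketch (not the patented apparatus): regroup counts from
# fine (first) energy bins into coarse (second) bins chosen for a given
# decomposition target material.

def regroup(first_counts, grouping):
    """grouping: one index list per second bin, partitioning the first
    bins; returns the summed count per second bin."""
    return [sum(first_counts[i] for i in idx) for idx in grouping]

# Example: six fine bins collapsed into two coarse bins, e.g. below and
# above a material-specific energy threshold (a K-edge, for instance).
second = regroup([5, 7, 3, 2, 8, 1], [[0, 1, 2], [3, 4, 5]])
```

Because the grouping depends only on the target material, the same acquired first data group can be regrouped differently for each material to be decomposed.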