Patents by Inventor Takashi Fukuda

Takashi Fukuda has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11641009
    Abstract: A light-emitting device including a solid-state light source that emits light having a peak wavelength in the range of 480 nm or less and a fluorescent film that covers the solid-state light source and includes at least one kind of phosphor, wherein the fluorescent film includes at least one kind of near-infrared phosphor that is excited by light from the solid-state light source, has a peak wavelength in the range exceeding 700 nm, and has an emission spectrum with a full width at half maximum of 100 nm or more in a range including the peak wavelength.
    Type: Grant
    Filed: June 12, 2019
    Date of Patent: May 2, 2023
    Assignees: NATIONAL INSTITUTE OF ADVANCED INDUSTRIAL SCIENCE AND TECHNOLOGY, PHOENIX ELECTRIC CO., LTD., NATIONAL INSTITUTE FOR MATERIALS SCIENCE
    Inventors: Takashi Fukuda, Tetsuya Gouda, Yuta Sakimoto, Naoto Hirosaki, Kohsei Takahashi
  • Patent number: 11610108
    Abstract: A student neural network may be trained by a computer-implemented method including: selecting a teacher neural network from among a plurality of teacher neural networks, inputting input data into the selected teacher neural network to obtain a soft label output generated by the selected teacher neural network, and training the student neural network with at least the input data and the soft label output from the selected teacher neural network.
    Type: Grant
    Filed: July 27, 2018
    Date of Patent: March 21, 2023
    Assignee: INTERNATIONAL BUSINESS MACHINES CORPORATION
    Inventors: Takashi Fukuda, Masayuki Suzuki, Osamu Ichikawa, Gakuto Kurata, Samuel Thomas, Bhuvana Ramabhadran
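The teacher-selection distillation of patent 11610108 above can be illustrated in a few lines. This is a minimal sketch, not the patented method: the confidence-based selection heuristic, the temperature value, and the toy teachers are all assumptions made for the example.

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax producing a "soft label" distribution.
    exps = [math.exp(l / T) for l in logits]
    s = sum(exps)
    return [e / s for e in exps]

def select_teacher(teachers, x):
    # Hypothetical selection heuristic: pick the teacher most confident on x.
    return max(teachers, key=lambda t: max(softmax(t(x))))

def distill_step(teachers, student_update, x):
    teacher = select_teacher(teachers, x)
    soft_label = softmax(teacher(x), T=2.0)  # soften with temperature
    student_update(x, soft_label)            # train student on (x, soft label)
    return soft_label

# Toy teachers that return fixed logits for any input.
teachers = [lambda x: [0.1, 0.2], lambda x: [3.0, 0.0]]
updates = []
label = distill_step(teachers, lambda x, y: updates.append((x, y)), x=[1.0])
```

A real system would select per utterance or per minibatch and use a cross-entropy loss against the soft label; the skeleton above only shows the data flow.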
  • Publication number: 20230048604
    Abstract: The present invention provides a reflective polarized-light separating diffraction element usable in a wide wavelength region including an ultraviolet region, and an optical measurement device comprising the same. The reflective polarized-light separating diffraction element comprises: a substrate (1); a reflection surface (2) formed on a surface of the substrate (1); and a lattice structured body assembly (3) that is provided on the reflection surface (2) and exhibits form birefringence (Δn). The lattice structured body assembly (3) consists of lattice structured bodies (3A, 3B, 3C and 3D) of four patterns having lattice structures of different azimuths. The lattice structured bodies (3A, 3B, 3C and 3D) of the plurality of patterns are aligned on the reflection surface (2) in a predetermined direction such that the azimuths of the lattice structures change in a structurally periodic manner.
    Type: Application
    Filed: December 16, 2019
    Publication date: February 16, 2023
    Applicants: JASCO CORPORATION, NATIONAL INSTITUTE OF ADVANCED INDUSTRIAL SCIENCE AND TECHNOLOGY
    Inventors: Takashi FUKUDA, Akira EMOTO, Yoshihito NARITA, Hiroshi HAYAKAWA, Yuichi MIYOSHI
  • Patent number: 11574181
    Abstract: Fusion of neural networks is performed by obtaining a first neural network and a second neural network. The first and the second neural networks are the result of a parent neural network subjected to different training. A similarity score is calculated of a first component of the first neural network and a corresponding second component of the second neural network. An interpolation weight is determined for the first and the second components by using the similarity score. A neural network parameter of the first component is updated based on the interpolation weight and a corresponding neural network parameter of the second component to obtain a fused neural network.
    Type: Grant
    Filed: May 8, 2019
    Date of Patent: February 7, 2023
    Assignee: INTERNATIONAL BUSINESS MACHINES CORPORATION
    Inventors: Takashi Fukuda, Masayuki Suzuki, Gakuto Kurata
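The similarity-guided fusion of patent 11574181 above can be sketched component by component. The cosine-similarity measure and the mapping from similarity score to interpolation weight are illustrative choices, not details taken from the patent.

```python
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def fuse_components(w1, w2):
    # Interpolation weight from the similarity score: highly similar
    # components are averaged evenly, dissimilar ones keep w1's values
    # (this particular mapping is an assumption for the example).
    sim = cosine_similarity(w1, w2)   # similarity score in [-1, 1]
    alpha = 0.5 * max(sim, 0.0)
    return [(1 - alpha) * a + alpha * b for a, b in zip(w1, w2)]

fused = fuse_components([1.0, 0.0], [0.6, 0.8])
```

Applying this per layer (or per neuron) to two children of the same parent network yields the fused network the abstract describes.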
  • Publication number: 20220414448
    Abstract: Methods and systems for training a neural network include training language-specific teacher models using different respective source language datasets. A student model is trained using the different respective source language datasets and soft labels generated by the language-specific teacher models, including shuffling the source language datasets and shuffling weights of language-dependent layers in language-specific parts of the student model. Weights of language-independent layers of the student model are copied to corresponding layers of a target model to initialize the target model's language-independent layers. The target model is then trained with a target language dataset.
    Type: Application
    Filed: June 24, 2021
    Publication date: December 29, 2022
    Inventors: Takashi Fukuda, Samuel Thomas
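Two mechanics from the multilingual-transfer abstract above (publication 20220414448) are easy to show in isolation: pooling and shuffling the source-language datasets, and copying only language-independent weights into a target model. The `shared.` naming convention used to mark language-independent layers is hypothetical.

```python
import random

def shuffle_training_pool(datasets):
    # Pool and shuffle examples across the source-language datasets
    # (the abstract's dataset shuffling, reduced to its simplest form).
    pool = [ex for ds in datasets for ex in ds]
    random.shuffle(pool)
    return pool

def init_target_from_student(student, target):
    # Copy only language-independent layers into the target model;
    # language-dependent layers of the target stay untouched.
    for name, weights in student.items():
        if name.startswith("shared."):
            target[name] = list(weights)
    return target

student = {"shared.enc1": [0.1, 0.2], "lang.en.out": [0.9]}
target = {"shared.enc1": [0.0, 0.0], "lang.de.out": [0.5]}
target = init_target_from_student(student, target)
```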
  • Publication number: 20220382753
    Abstract: In a method for improving generation and relevancy of search results, a processor receives a search query comprising a search term. A processor generates a document group based on the search query and at least one synonym related to the search term in a synonym dictionary. The synonym dictionary may include search document attributes for base words and synonyms of the base words. A processor extracts, from the document group, an extracted document having a document attribute matching a search document attribute of the at least one synonym. A processor lists the extracted document as a search result.
    Type: Application
    Filed: May 27, 2021
    Publication date: December 1, 2022
    Inventors: Kenta Watanabe, Takahito Tashiro, Takashi Fukuda
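The synonym-dictionary search flow of publication 20220382753 above can be sketched end to end. The shape of the synonym dictionary (pairs of synonym and required document attribute) is inferred from the abstract and should be read as an assumption.

```python
def search(query_term, synonym_dict, documents):
    # synonym_dict maps a base word to (synonym, search_document_attribute)
    # pairs; the attribute gates which synonym matches are listed.
    terms = {query_term}
    attrs = {}
    for syn, attr in synonym_dict.get(query_term, []):
        terms.add(syn)
        attrs[syn] = attr
    # Document group: anything containing the term or one of its synonyms.
    group = [d for d in documents if terms & set(d["text"].split())]
    # Extract documents whose attribute matches the synonym's attribute.
    results = []
    for d in group:
        for syn, attr in attrs.items():
            if syn in d["text"].split() and d.get("attribute") == attr:
                results.append(d["id"])
    return results

docs = [
    {"id": 1, "text": "car maintenance guide", "attribute": "manual"},
    {"id": 2, "text": "automobile history", "attribute": "essay"},
]
hits = search("car", {"car": [("automobile", "essay")]}, docs)
```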
  • Publication number: 20220375484
    Abstract: A method, computer system, and computer program product for audio data augmentation are provided. Sets of audio data from different sources may be obtained. A respective normalization factor may be calculated for at least two of the different sources. The normalization factors from the at least two sources may be mixed to determine a mixed normalization factor. A first set of the sets may be normalized using the mixed normalization factor to obtain training data for training an acoustic model.
    Type: Application
    Filed: May 21, 2021
    Publication date: November 24, 2022
    Inventors: Toru Nagano, Takashi Fukuda, Masayuki Suzuki
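The mixed-normalization augmentation of publication 20220375484 above reduces to a short recipe. The choice of mean absolute amplitude as the per-source normalization factor is a stand-in for whatever statistic the method actually uses.

```python
def mean_norm_factor(dataset):
    # Per-source normalization factor: mean absolute sample amplitude
    # (an illustrative stand-in; gain or cepstral statistics would also fit).
    flat = [abs(s) for utt in dataset for s in utt]
    return sum(flat) / len(flat)

def augment(source_a, source_b, mix=0.5):
    fa, fb = mean_norm_factor(source_a), mean_norm_factor(source_b)
    mixed = (1 - mix) * fa + mix * fb  # mixed normalization factor
    # Normalize source A with the mixed factor to get augmented training data.
    return [[s / mixed for s in utt] for utt in source_a]

train = augment([[2.0, -2.0]], [[4.0, -4.0]])
```

Because the factor is borrowed partly from another source, the normalized data carries some of that source's level characteristics, which is the augmentation effect.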
  • Patent number: 11429676
    Abstract: A first user request is received which specifies a target document set in which a first subset of the documents is flagged by a user. A primary flag table is created for the target document set. A first document subset matching the first user request is created. It is determined whether the number of flagged documents exceeds a first threshold. If so, a secondary flag table is created for the first document subset, and flag data corresponding to the first document subset is stored in the secondary flag table. The flag data in the secondary flag table is then merged into the primary flag table.
    Type: Grant
    Filed: October 18, 2019
    Date of Patent: August 30, 2022
    Assignee: International Business Machines Corporation
    Inventors: Hiroaki Kikuchi, Yuichi Suzuki, Takashi Fukuda
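The primary/secondary flag-table flow of patent 11429676 above can be sketched with dictionaries standing in for database tables. The threshold test and merge order follow the abstract; everything else (data shapes, the no-op editing step) is illustrative.

```python
def run_request(all_docs, flagged, request_filter, threshold):
    # Primary flag table covers the whole target document set.
    primary = {doc: (doc in flagged) for doc in all_docs}
    subset = [doc for doc in all_docs if request_filter(doc)]
    n_flagged = sum(primary[doc] for doc in subset)
    if n_flagged > threshold:
        # Work in a secondary flag table scoped to the subset,
        # then merge it back into the primary table.
        secondary = {doc: primary[doc] for doc in subset}
        # ... flag edits against the secondary table would happen here ...
        primary.update(secondary)  # merge secondary into primary
    return primary, subset

primary, subset = run_request(
    all_docs=["a", "b", "c"],
    flagged={"a", "b"},
    request_filter=lambda d: d != "c",
    threshold=1,
)
```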
  • Patent number: 11416741
    Abstract: A technique for constructing a model supporting a plurality of domains is disclosed. In the technique, a plurality of teacher models is prepared, each of which is specialized for a different one of the plurality of domains. A plurality of training data collections is obtained, each of which is collected for a different one of the plurality of domains. A plurality of soft label sets is generated by inputting each training data item in the plurality of training data collections into the corresponding one of the plurality of teacher models. A student model is trained using the plurality of soft label sets.
    Type: Grant
    Filed: June 8, 2018
    Date of Patent: August 16, 2022
    Assignee: INTERNATIONAL BUSINESS MACHINES CORPORATION
    Inventors: Takashi Fukuda, Osamu Ichikawa, Samuel Thomas, Bhuvana Ramabhadran
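The core pairing in patent 11416741 above, each domain's data collection routed through that domain's own teacher to produce one soft label set per domain, can be sketched as follows. The toy teachers returning fixed distributions are assumptions for the example.

```python
def build_soft_label_sets(teachers, collections):
    # teachers[d] is the teacher specialized for domain d;
    # collections[d] is the training data collected for domain d.
    soft_label_sets = {}
    for domain, data in collections.items():
        teacher = teachers[domain]
        soft_label_sets[domain] = [(x, teacher(x)) for x in data]
    return soft_label_sets

teachers = {"near": lambda x: [0.9, 0.1], "far": lambda x: [0.2, 0.8]}
collections = {"near": [[1.0]], "far": [[2.0], [3.0]]}
sets_ = build_soft_label_sets(teachers, collections)
```

A single student would then be trained on the union of these (input, soft label) pairs, giving it coverage of all domains at once.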
  • Patent number: 11410029
    Abstract: A technique for generating soft labels for training is disclosed. A teacher model having a teacher side class set is prepared. A collection of class pairs for respective data units is obtained. Each class pair includes classes labelled to a corresponding data unit from the teacher side class set and from a student side class set different from the teacher side class set. A training input is fed into the teacher model to obtain a set of outputs for the teacher side class set. A set of soft labels for the student side class set is calculated from the set of outputs by using at least an output obtained for a class within a subset of the teacher side class set having relevance to a member of the student side class set, based at least in part on observations in the collection of class pairs.
    Type: Grant
    Filed: January 2, 2018
    Date of Patent: August 9, 2022
    Assignee: INTERNATIONAL BUSINESS MACHINES CORPORATION
    Inventors: Takashi Fukuda, Samuel Thomas, Bhuvana Ramabhadran
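The mismatched-class-set mapping of patent 11410029 above can be illustrated with toy phone inventories: observed class pairs define which teacher classes are relevant to each student class, and teacher outputs are pooled accordingly. The sum-then-renormalize rule is an illustrative simplification.

```python
def soft_labels_for_student(teacher_probs, class_pairs):
    # class_pairs: observed (teacher_class, student_class) labellings of
    # the same data units; they tell us which teacher classes are
    # relevant to each student class.
    relevance = {}
    for t_cls, s_cls in class_pairs:
        relevance.setdefault(s_cls, set()).add(t_cls)
    scores = {
        s_cls: sum(teacher_probs.get(t, 0.0) for t in t_set)
        for s_cls, t_set in relevance.items()
    }
    total = sum(scores.values()) or 1.0
    return {s: v / total for s, v in scores.items()}

# Toy phone sets: the teacher uses a finer inventory than the student.
teacher_probs = {"AA": 0.5, "AH": 0.3, "IY": 0.2}
pairs = [("AA", "a"), ("AH", "a"), ("IY", "i")]
labels = soft_labels_for_student(teacher_probs, pairs)
```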
  • Publication number: 20220235473
    Abstract: In a hydrogen production apparatus, a front end of a housing is closed by an opening/closing door. In a closed state of the opening/closing door, the upper end of the upper rod and the lower end of the lower rod in the opening/closing switching mechanism are disposed behind the first front frame piece and the second front frame piece of the housing, respectively. The axial distance between the upper end of the lower collar member and the second stopper is greater than the axial distance between the upper end of the upper collar member and the first stopper.
    Type: Application
    Filed: January 24, 2022
    Publication date: July 28, 2022
    Inventors: Kazuyoshi MIYAJIMA, Takashi FUKUDA
  • Publication number: 20220188622
    Abstract: An approach to identifying alternate soft labels for training a student model may be provided. A teacher model may generate a soft label for labeled training data. The training data can be an acoustic file for speech or a spoken natural language. A pool of soft labels previously generated by teacher models can be searched at the label level to identify soft labels that are similar to the generated soft label. The similar soft labels can have a similar length or sequence at the word, phoneme, and/or state level. The identified similar soft labels can be used in conjunction with the generated soft label to train a student model.
    Type: Application
    Filed: December 10, 2020
    Publication date: June 16, 2022
    Inventors: Toru Nagano, Takashi Fukuda, Gakuto Kurata
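The soft-label pool search of publication 20220188622 above can be approximated with two simple similarity criteria, sequence length and frame-wise argmax overlap. Both criteria and their thresholds are assumptions for the sketch; the patent's actual similarity measures may differ.

```python
def similar_soft_labels(generated, pool, max_len_diff=1, min_overlap=0.5):
    # Each soft label is a sequence of per-frame class distributions.
    def argmax_seq(label):
        return [max(range(len(f)), key=f.__getitem__) for f in label]
    target = argmax_seq(generated)
    matches = []
    for cand in pool:
        if abs(len(cand) - len(generated)) > max_len_diff:
            continue  # reject labels with too different a length
        cand_seq = argmax_seq(cand)
        overlap = sum(a == b for a, b in zip(target, cand_seq))
        if overlap / len(target) >= min_overlap:
            matches.append(cand)  # similar sequence of most-likely classes
    return matches

gen = [[0.9, 0.1], [0.2, 0.8]]
pool = [
    [[0.8, 0.2], [0.3, 0.7]],                              # similar
    [[0.1, 0.9], [0.9, 0.1], [0.5, 0.5], [0.5, 0.5]],      # too long
]
found = similar_soft_labels(gen, pool)
```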
  • Publication number: 20220188643
    Abstract: A method of training a student neural network is provided. The method includes feeding a data set including a plurality of input vectors into a teacher neural network to generate a plurality of output values, and converting two of the plurality of output values from the teacher neural network for two corresponding input vectors into two corresponding soft labels. The method further includes combining the two corresponding input vectors to form a synthesized data vector, and forming a masked soft label vector from the two corresponding soft labels. The method further includes feeding the synthesized data vector into the student neural network, using the masked soft label vector to determine an error for modifying weights of the student neural network, and modifying the weights of the student neural network.
    Type: Application
    Filed: December 11, 2020
    Publication date: June 16, 2022
    Inventor: Takashi Fukuda
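The synthesized-input plus masked-soft-label step of publication 20220188643 above resembles mixup applied to distillation targets. The mixing ratio and the confidence-threshold masking rule below are illustrative choices, not details from the patent.

```python
def mixup_distillation_pair(x1, x2, y1, y2, lam=0.5):
    # Combine two input vectors into one synthesized data vector.
    synthesized = [lam * a + (1 - lam) * b for a, b in zip(x1, x2)]
    # Masked soft label vector: mix the two soft labels, then zero out
    # low-confidence entries (the masking rule is an assumed example).
    mixed = [lam * a + (1 - lam) * b for a, b in zip(y1, y2)]
    masked = [v if v >= 0.25 else 0.0 for v in mixed]
    return synthesized, masked

x, y = mixup_distillation_pair(
    [1.0, 0.0], [0.0, 1.0],
    [0.9, 0.1, 0.0], [0.1, 0.0, 0.9],
)
```

The student is then fed `x` and its error is computed against `y`, so the weight update reflects both original examples at once.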
  • Publication number: 20220180206
    Abstract: Methods and systems for training a neural network include clustering a full set of training data samples into specialized training clusters. Specialized teacher neural networks are trained using respective specialized training clusters of the specialized training clusters. Soft labels are generated for the full set of training data samples using the specialized teacher neural networks. A student model is trained using the full set of training data samples, the specialized training clusters, and the soft labels.
    Type: Application
    Filed: December 9, 2020
    Publication date: June 9, 2022
    Inventor: Takashi Fukuda
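The cluster-then-specialize pipeline of publication 20220180206 above is sketched below with a toy one-dimensional bucketing standing in for real clustering of acoustic features, and a trivial membership test standing in for teacher training. All names and the clustering rule are assumptions.

```python
def cluster_by_key(samples, n_clusters):
    # Toy clustering: bucket scalar samples by value range (a stand-in
    # for k-means or similar on real feature vectors).
    lo, hi = min(samples), max(samples)
    width = (hi - lo) / n_clusters or 1.0
    clusters = [[] for _ in range(n_clusters)]
    for s in samples:
        idx = min(int((s - lo) / width), n_clusters - 1)
        clusters[idx].append(s)
    return clusters

def train_specialized_teachers(samples, n_clusters, make_teacher):
    clusters = cluster_by_key(samples, n_clusters)
    # One specialized teacher per cluster; soft labels for the full set.
    teachers = [make_teacher(c) for c in clusters]
    soft_labels = [[t(s) for t in teachers] for s in samples]
    return teachers, soft_labels

samples = [0.1, 0.2, 0.9, 1.0]
teachers, soft = train_specialized_teachers(
    samples, 2, lambda c: (lambda x: min(c) <= x <= max(c)))
```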
  • Publication number: 20220059768
    Abstract: Provided are (i) a solution for forming an organic semiconductor layer which solution has an excellent coating property, (ii) an organic semiconductor which is produced with use of the solution and which has high heat resistance, (iii) a layer which contains the organic semiconductor, and (iv) an organic thin film transistor which exhibits high electrical properties. A composition containing: an organic semiconductor; and a polymer (1) having at least one unit selected from the group consisting of units represented by formulae (1-a), (1-b), and (1-c). A composition containing the organic semiconductor, the polymer (1), and an organic solvent can be suitably used as a solution for forming an organic semiconductor layer.
    Type: Application
    Filed: December 19, 2019
    Publication date: February 24, 2022
    Applicant: TOSOH CORPORATION
    Inventors: Takahiro MORI, Takashi FUKUDA
  • Publication number: 20220051105
    Abstract: Some embodiments of the present invention are directed to techniques for training teacher neural networks (TNNs) and student neural networks (SNNs). A training data set is received with a lossless set of data and a corresponding lossy set of data. Two branches of a TNN are established, with one branch trained using the lossless data (a lossless branch) and one trained using the lossy data (a lossy branch). Weights for the two branches are tied together. The lossy branch, now isolated from the lossless branch, generates a set of soft targets for initializing an SNN. These generated soft targets benefit from the training of the lossless branch through the weights that were tied together between the branches, despite the lossless branch being isolated from the lossy branch during soft-target generation.
    Type: Application
    Filed: August 17, 2020
    Publication date: February 17, 2022
    Inventors: Takashi Fukuda, Samuel Thomas
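The weight-tying idea of publication 20220051105 above can be shown at minimum scale: one shared weight trained jointly on a "lossless" and a "lossy" sample, after which the lossy branch alone produces soft targets that still reflect the lossless data. The scalar least-squares model is purely a stand-in for the patented networks.

```python
def train_tied_branches(lossless_data, lossy_data, steps=200, lr=0.1):
    # A single shared weight stands in for the tied weights of the two
    # branches: each step accumulates gradients from both branches on a
    # toy least-squares objective y ≈ w * x.
    w = 0.0
    n = len(lossless_data) + len(lossy_data)
    for _ in range(steps):
        grad = 0.0
        for x, y in lossless_data + lossy_data:
            grad += 2 * (w * x - y) * x
        w -= lr * grad / n
    return w

def soft_targets_from_lossy_branch(w, lossy_inputs):
    # After training, the lossy branch alone produces the soft targets,
    # yet w already reflects the lossless branch via the weight tying.
    return [w * x for x in lossy_inputs]

w = train_tied_branches([(1.0, 1.0)], [(0.8, 0.8)])
targets = soft_targets_from_lossy_branch(w, [0.8])
```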
  • Patent number: 11227579
    Abstract: A technique for data augmentation for speech data is disclosed. Original speech data including a sequence of feature frames is obtained. A partially prolonged copy of the original speech data is generated by inserting one or more new frames into the sequence of feature frames. The partially prolonged copy is output as augmented speech data for training an acoustic model.
    Type: Grant
    Filed: August 8, 2019
    Date of Patent: January 18, 2022
    Assignee: INTERNATIONAL BUSINESS MACHINES CORPORATION
    Inventors: Toru Nagano, Takashi Fukuda, Masayuki Suzuki, Gakuto Kurata
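The frame-insertion augmentation of patent 11227579 above is simple to demonstrate. Repeating the frame at each insertion point is one plausible way to create the new frames; interpolating between neighbors would be another.

```python
def prolong(frames, positions):
    # Insert a new frame after each listed position by repeating the
    # frame there, locally slowing the utterance without resampling.
    out = []
    for i, frame in enumerate(frames):
        out.append(frame)
        if i in positions:
            out.append(list(frame))  # new frame prolongs this region
    return out

original = [[0.1], [0.5], [0.9]]
augmented = prolong(original, positions={1})
```

The result is a "partially prolonged copy": only the regions named in `positions` are stretched, which mimics locally slower speech in the training data.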
  • Patent number: 11120820
    Abstract: A technique for detecting a signal tone in an audio signal is disclosed. A determination is made as to whether or not a peak modulation frequency in the audio signal is in a specific range, to obtain a determination result. A measure regarding the modulation spectrum of the audio signal is calculated, based on at least the components of the modulation spectrum above a specific limit of modulation frequency. Using the determination result and the measure regarding the modulation spectrum, a judgment is made as to whether or not the audio signal contains a signal tone.
    Type: Grant
    Filed: December 5, 2018
    Date of Patent: September 14, 2021
    Assignee: International Business Machines Corporation
    Inventors: Takashi Fukuda, Masayuki Suzuki
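The two-part test of patent 11120820 above, peak modulation frequency in a range plus a measure of high-modulation-frequency energy, can be sketched on an amplitude envelope. The DFT-based modulation spectrum, the bin ranges, and the 0.3 energy threshold are all assumptions for the example.

```python
import math

def modulation_spectrum(envelope):
    # Magnitude DFT of the amplitude envelope (the modulation spectrum).
    n = len(envelope)
    return [
        abs(sum(envelope[t] * complex(math.cos(2*math.pi*k*t/n),
                                      -math.sin(2*math.pi*k*t/n))
                for t in range(n)))
        for k in range(n // 2)
    ]

def is_signal_tone(envelope, peak_range=(0, 2), limit=3):
    spec = modulation_spectrum(envelope)
    peak = max(range(1, len(spec)), key=spec.__getitem__)  # skip DC bin
    in_range = peak_range[0] <= peak <= peak_range[1]
    # Measure: fraction of energy above the modulation-frequency limit
    # (steady tones modulate slowly, so this stays small for them).
    high_energy = sum(spec[limit:]) / (sum(spec[1:]) or 1.0)
    return in_range and high_energy < 0.3

# A steady tone: slowly varying envelope, so low peak modulation frequency.
tone = [1.0 + 0.1 * math.sin(2*math.pi*t/16) for t in range(16)]
result = is_signal_tone(tone)
```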
  • Patent number: 11106974
    Abstract: A technique for training a neural network including an input layer, one or more hidden layers and an output layer, in which the trained neural network can be used to perform a task such as speech recognition. In the technique, a base of the neural network having at least a pre-trained hidden layer is prepared. A parameter set associated with one pre-trained hidden layer in the neural network is decomposed into a plurality of new parameter sets. The number of hidden layers in the neural network is increased by using the plurality of the new parameter sets. Pre-training for the neural network is performed.
    Type: Grant
    Filed: July 5, 2017
    Date of Patent: August 31, 2021
    Assignee: International Business Machines Corporation
    Inventors: Takashi Fukuda, Osamu Ichikawa
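The depth-growing decomposition of patent 11106974 above can be illustrated with the simplest valid factorization: replacing a pre-trained layer's matrix W with the pair (W, I), whose composition reproduces W exactly before pre-training continues. Matrix factorizations such as SVD would be alternative decompositions; the identity split is just the easiest to verify.

```python
def decompose_layer(weights):
    # Split one pre-trained layer's parameter set W into two new
    # parameter sets whose composition reproduces W: W itself plus an
    # identity layer (a common depth-growing initialization).
    n = len(weights[0])
    identity = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    return weights, identity

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

def deepen(layers, index):
    # Replace layer `index` with its two decomposed layers, increasing
    # the number of hidden layers by one.
    first, second = decompose_layer(layers[index])
    return layers[:index] + [first, second] + layers[index + 1:]

net = [[[1.0, 2.0], [3.0, 4.0]]]
deeper = deepen(net, 0)
composed = matmul(deeper[0], deeper[1])
```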
  • Publication number: 20210249569
    Abstract: A light-emitting device including a solid-state light source that emits light having a peak wavelength in the range of 480 nm or less and a fluorescent film that covers the solid-state light source and includes at least one kind of phosphor, wherein the fluorescent film includes at least one kind of near-infrared phosphor that is excited by light from the solid-state light source, has a peak wavelength in the range exceeding 700 nm, and has an emission spectrum with a full width at half maximum of 100 nm or more in a range including the peak wavelength.
    Type: Application
    Filed: June 12, 2019
    Publication date: August 12, 2021
    Applicants: NATIONAL INSTITUTE OF ADVANCED INDUSTRIAL SCIENCE AND TECHNOLOGY, PHOENIX ELECTRIC CO., LTD., NATIONAL INSTITUTE FOR MATERIALS SCIENCE
    Inventors: Takashi FUKUDA, Tetsuya GOUDA, Yuta SAKIMOTO, Naoto HIROSAKI, Kohsei TAKAHASHI