Patents by Inventor Kazuma Hashimoto

Kazuma Hashimoto has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11967803
    Abstract: A light emitting module includes: a first light emitting device including: a first package, a plurality of first semiconductor laser elements mounted in the first package, and a first lens member having lens portions, a number of the lens portions being the same as a number of the first semiconductor laser elements; and a second light emitting device including: a second package, a plurality of second semiconductor laser elements mounted in the second package, wherein a quantity of the second semiconductor laser elements is fewer than a quantity of the first semiconductor laser elements, and a second lens member which is structured the same as the first lens member; and one or more mounting substrates on which the first light emitting device and the second light emitting device are mounted.
    Type: Grant
    Filed: November 14, 2022
    Date of Patent: April 23, 2024
    Assignee: NICHIA CORPORATION
    Inventors: Kazuma Kozuru, Takuya Hashimoto
  • Patent number: 11942869
    Abstract: A power module is applied to an electric power conversion device in which multiple upper-lower arm circuits are connected to an electric power line in parallel. The power module includes the multiple upper-lower arm circuits; a capacitor connected to each of the multiple upper-lower arm circuits in parallel; an upper wiring that connects an upper arm and a positive electrode terminal of the capacitor; a lower wiring that connects a lower arm and a negative electrode of the capacitor; an upper electric power wiring that is an electric power wiring connected to the electric power line and connects a high potential line of the electric power line and the upper wiring; and a lower electric power wiring that is an electric power wiring connected to the electric power line and connects a lower potential line of the electric power line and the lower wiring.
    Type: Grant
    Filed: January 22, 2021
    Date of Patent: March 26, 2024
    Assignee: DENSO CORPORATION
    Inventors: Tomohisa Sano, Ryota Tanabe, Yuta Hashimoto, Kazuma Fukushima, Mamoru Kuwahara
  • Patent number: 11922305
    Abstract: Embodiments described herein provide safe policy improvement (SPI) in a batch reinforcement learning framework for a task-oriented dialogue. Specifically, a batch reinforcement learning framework for dialogue policy learning is provided, which improves the performance of the dialogue and learns to shape a reward that reasons about the intention behind the human response rather than just imitating the human demonstration.
    Type: Grant
    Filed: November 25, 2020
    Date of Patent: March 5, 2024
    Assignee: Salesforce, Inc.
    Inventors: Govardana Sachithanandam Ramachandran, Kazuma Hashimoto, Caiming Xiong, Richard Socher
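
The entry above describes learning a task-oriented dialogue policy from a fixed batch of logged conversations while shaping the reward around the intention behind human responses. As a rough illustration only, the following Python sketch shows the general shape of batch ("offline") policy improvement with a shaped reward; the logged turns, the shaping heuristic, and the support threshold are invented placeholders, not details from the patent.

```python
# Illustrative sketch only: batch ("offline") policy improvement with a shaped
# reward, in the spirit of the abstract above. All names and numbers are
# hypothetical placeholders, not taken from the patent.
from collections import defaultdict

# A logged batch of dialogue turns: (state, action, environment_reward).
batch = [
    ("greet", "ask_name", 0.0),
    ("greet", "ask_name", 0.0),
    ("greet", "end_call", -1.0),
    ("ask_name", "confirm", 1.0),
    ("ask_name", "end_call", -1.0),
]

def shaped_reward(state, action, env_reward):
    """Combine the sparse logged reward with a dense shaping term that
    (hypothetically) scores how well the action matches the user's intention."""
    intent_bonus = 0.5 if action != "end_call" else 0.0   # placeholder heuristic
    return env_reward + intent_bonus

# Estimate Q(s, a) by averaging the shaped rewards observed in the batch.
q_sum, q_count = defaultdict(float), defaultdict(int)
for s, a, r in batch:
    q_sum[(s, a)] += shaped_reward(s, a, r)
    q_count[(s, a)] += 1
q = {k: q_sum[k] / q_count[k] for k in q_sum}

# Behavior policy = empirical action frequencies in the batch.
behavior = defaultdict(lambda: defaultdict(int))
for s, a, _ in batch:
    behavior[s][a] += 1

# "Safe" improvement: only deviate from the behavior policy when the
# candidate action is both better and well supported by the data.
MIN_SUPPORT = 2
new_policy = {}
for s, actions in behavior.items():
    baseline = max(actions, key=actions.get)
    best = max(q, key=lambda k: q[k] if k[0] == s else float("-inf"))
    if best[0] == s and q_count[best] >= MIN_SUPPORT and q[best] > q.get((s, baseline), 0):
        new_policy[s] = best[1]
    else:
        new_policy[s] = baseline   # fall back to the logged behavior

print(new_policy)
```
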
  • Publication number: 20240070456
    Abstract: Provided are systems and methods for corrective reward optimization for generative sequential labeling. In particular, example aspects of the present disclosure are directed to an effective framework for generative reward optimization of text (or other) data sequences, certain example implementations of which can be referred to as “GROOT”. Example implementations of the proposed framework work by training a generative sequential labeling model to match the decoder output distribution with that of the (possibly black-box) reward function. Using an iterative training regime, the framework can first generate prediction candidates and then correct errors in the candidates. Finally, a loss function can be used that contrasts those candidates based on their reward values (e.g., as measured by a reward function that encodes the specific objectives for a particular setting or application).
    Type: Application
    Filed: August 31, 2023
    Publication date: February 29, 2024
    Inventors: Karthik Raman, Kazuma Hashimoto
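
As a loose illustration of the iterative regime summarized above (generate candidates, correct them, then contrast them by reward), here is a toy Python sketch; the labeling "model", the correction rule, the token-accuracy reward, and the margin-style contrastive loss are all placeholders rather than the publication's actual components.

```python
# Minimal sketch of the training-loop shape described in the abstract:
# generate candidate label sequences, correct them, then contrast the model's
# scores for those candidates according to their rewards. Every component
# below is a toy placeholder.
import random

random.seed(0)
TAGS = ["O", "B-ENT", "I-ENT"]

def generate_candidates(tokens, n=4):
    """Stand-in for sampling label sequences from a generative labeling model."""
    return [[random.choice(TAGS) for _ in tokens] for _ in range(n)]

def correct(candidate):
    """Toy correction pass: an I-ENT tag cannot follow O, so repair it."""
    fixed = list(candidate)
    for i in range(1, len(fixed)):
        if fixed[i] == "I-ENT" and fixed[i - 1] == "O":
            fixed[i] = "B-ENT"
    return fixed

def reward(candidate, gold):
    """Possibly black-box reward; here, simple token-level accuracy."""
    return sum(c == g for c, g in zip(candidate, gold)) / len(gold)

def model_score(candidate):
    """Stand-in for the model's log-likelihood of a candidate sequence."""
    return -0.1 * sum(tag != "O" for tag in candidate)

def contrastive_loss(candidates, gold):
    """Margin loss that pushes the model to score higher-reward candidates
    above lower-reward ones (a simplified contrast, not the exact objective)."""
    ranked = sorted(candidates, key=lambda c: reward(c, gold), reverse=True)
    loss = 0.0
    for better, worse in zip(ranked, ranked[1:]):
        loss += max(0.0, 1.0 - (model_score(better) - model_score(worse)))
    return loss

tokens = ["Acme", "Corp", "hired", "Ana"]
gold = ["B-ENT", "I-ENT", "O", "B-ENT"]

candidates = [correct(c) for c in generate_candidates(tokens)]
print("contrastive loss:", round(contrastive_loss(candidates, gold), 3))
```
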
  • Patent number: 11902221
    Abstract: A conversation engine performs conversations with users using chatbots customized for performing a set of tasks that can be performed using an online system. The conversation engine loads a chatbot configuration that specifies the behavior of a chatbot including the tasks that can be performed by the chatbot, the types of entities relevant to each task, and so on. The conversation may be voice based and use natural language. The conversation engine may load different chatbot configurations to implement different chatbots. The conversation engine receives a conversation engine configuration that specifies the behavior of the conversation engine across chatbots. The system may be a multi-tenant system that allows customization of the chatbots for each tenant.
    Type: Grant
    Filed: September 29, 2020
    Date of Patent: February 13, 2024
    Assignee: Salesforce, Inc.
    Inventors: Xinyi Yang, Tian Xie, Caiming Xiong, Wenhao Liu, Huan Wang, Kazuma Hashimoto, Jin Qu, Feihong Wu, Yingbo Zhou
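
The abstract above centers on a conversation engine driven by per-chatbot configurations that declare the supported tasks and the entity types relevant to each task. Purely as a hypothetical illustration of that idea, the sketch below uses an invented configuration schema and dispatch function; it is not Salesforce's actual configuration format.

```python
# Hypothetical illustration of a declarative chatbot configuration that lists
# tasks and the entities each task needs. The schema and field names are
# invented for this sketch.
chatbot_config = {
    "name": "order_support_bot",
    "tasks": {
        "track_order": {"required_entities": ["order_id"]},
        "cancel_order": {"required_entities": ["order_id", "reason"]},
    },
}

def handle(task, entities, config):
    """Run a task only if the configuration says the chatbot supports it
    and all required entities were collected from the conversation."""
    spec = config["tasks"].get(task)
    if spec is None:
        return f"{config['name']} is not configured for task '{task}'."
    missing = [e for e in spec["required_entities"] if e not in entities]
    if missing:
        return f"Please provide: {', '.join(missing)}."
    return f"Executing '{task}' with {entities}."

print(handle("track_order", {"order_id": "A-123"}, chatbot_config))
print(handle("cancel_order", {"order_id": "A-123"}, chatbot_config))
```
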
  • Patent number: 11887599
    Abstract: A conversation engine performs conversations with users using chatbots customized for performing a set of tasks that can be performed using an online system. The conversation engine loads a chatbot configuration that specifies the behavior of a chatbot including the tasks that can be performed by the chatbot, the types of entities relevant to each task, and so on. The conversation may be voice based and use natural language. The conversation engine may load different chatbot configurations to implement different chatbots. The conversation engine receives a conversation engine configuration that specifies the behavior of the conversation engine across chatbots. The system may be a multi-tenant system that allows customization of the chatbots for each tenant.
    Type: Grant
    Filed: February 10, 2023
    Date of Patent: January 30, 2024
    Assignee: Salesforce, Inc.
    Inventors: Xinyi Yang, Tian Xie, Caiming Xiong, Wenhao Liu, Huan Wang, Kazuma Hashimoto, Yingbo Zhou, Xugang Ye, Jin Qu, Feihong Wu
  • Publication number: 20230419050
    Abstract: Embodiments described herein provide a pipelined natural language question answering system that improves a BERT-based system. Specifically, the natural language question answering system uses a pipeline of neural networks, each trained to perform a particular task. The context selection network identifies premium context from the context provided for the question. The question type network identifies the natural language question as a yes, no, or span question and provides a yes or no answer to the natural language question when the question is a yes or no question. The span extraction model determines an answer span to the natural language question when the question is a span question.
    Type: Application
    Filed: September 7, 2023
    Publication date: December 28, 2023
    Inventors: Akari ASAI, Kazuma HASHIMOTO, Richard SOCHER, Caiming XIONG
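
To make the pipeline structure in the abstract above concrete, the following sketch wires together the three stages it names: context selection, question-type classification, and span extraction. Each stage here is a trivial rule-based stand-in for a trained neural network, so the example only shows the data flow, not the described models.

```python
# Skeleton of the three-stage pipeline described in the abstract. Each stage
# is a toy stand-in for a trained neural network; the heuristics are placeholders.
def select_context(question, contexts):
    """Context selection: keep the passage most likely to contain the answer
    (here, ranked by crude word overlap with the question)."""
    overlap = lambda c: len(set(question.lower().split()) & set(c.lower().split()))
    return max(contexts, key=overlap)

def classify_question(question):
    """Question-type stage: yes/no vs. span question (toy rule)."""
    return "yes_no" if question.lower().startswith(("is", "are", "does", "do")) else "span"

def extract_span(question, context):
    """Span extraction stage: return a candidate answer span (toy rule)."""
    return context.split(",")[0]

def answer(question, contexts):
    context = select_context(question, contexts)
    if classify_question(question) == "yes_no":
        return "yes"            # a real model would also predict "no"
    return extract_span(question, context)

contexts = [
    "Salesforce Research is based in Palo Alto, California.",
    "BERT is a pretrained language model.",
]
print(answer("Where is Salesforce Research based?", contexts))
```
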
  • Patent number: 11822897
    Abstract: Approaches for the translation of structured text include an embedding module for encoding and embedding source text in a first language, an encoder for encoding output of the embedding module, a decoder for iteratively decoding output of the encoder based on generated tokens in translated text from previous iterations, a beam module for constraining output of the decoder with respect to possible embedded tags to include in the translated text for a current iteration using a beam search, and a layer for selecting a token to be included in the translated text for the current iteration. The translated text is in a second language different from the first language. In some embodiments, the approach further includes scoring and pointer modules for selecting the token based on the output of the beam module or copied from the source text or reference text from a training pair best matching the source text.
    Type: Grant
    Filed: August 31, 2021
    Date of Patent: November 21, 2023
    Assignee: salesforce.com, inc.
    Inventors: Kazuma Hashimoto, Raffaella Buschiazzo, James Bradbury, Teresa Anna Marshall, Caiming Xiong, Richard Socher
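
One way to picture the beam-constraint idea summarized above is a filtering step that only lets markup tags from the source text be emitted in a well-formed order. The sketch below is a toy version of such a filter; the regular-expression tag test, the candidate scores, and the single "next tag owed" rule are assumptions made for illustration, not the patented mechanism.

```python
# Toy sketch of constraining beam candidates so that markup tags copied from
# the source may only appear in a well-formed order. Vocabulary, scores, and
# the filtering rule are invented placeholders.
import re

def next_allowed_tag(source, emitted):
    """The single tag the decoder is still 'owed', in source order."""
    source_tags = re.findall(r"<[^>]+>", source)
    emitted_tags = [t for t in emitted if re.fullmatch(r"<[^>]+>", t)]
    return source_tags[len(emitted_tags)] if len(emitted_tags) < len(source_tags) else None

def filter_candidates(candidates, source, emitted):
    """Drop beam candidates that would emit a tag out of order."""
    allowed = next_allowed_tag(source, emitted)
    kept = []
    for token, score in candidates:
        if re.fullmatch(r"<[^>]+>", token) and token != allowed:
            continue
        kept.append((token, score))
    return kept

source = "<b>hello</b> world"
emitted = ["hola"]                       # decoder output so far
candidates = [("mundo", -0.4), ("</b>", -0.6), ("<b>", -0.7)]
print(filter_candidates(candidates, source, emitted))   # "</b>" is dropped, "<b>" kept
```
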
  • Patent number: 11797825
    Abstract: The technology disclosed provides a so-called “joint many-task neural network model” to solve a variety of increasingly complex natural language processing (NLP) tasks using growing depth of layers in a single end-to-end model. The model is successively trained by considering linguistic hierarchies, directly connecting word representations to all model layers, explicitly using predictions in lower tasks, and applying a so-called “successive regularization” technique to prevent catastrophic forgetting. Three examples of lower level model layers are part-of-speech (POS) tagging layer, chunking layer, and dependency parsing layer. Two examples of higher level model layers are semantic relatedness layer and textual entailment layer. The model achieves the state-of-the-art results on chunking, dependency parsing, semantic relatedness and textual entailment.
    Type: Grant
    Filed: May 26, 2021
    Date of Patent: October 24, 2023
    Assignee: Salesforce, Inc.
    Inventors: Kazuma Hashimoto, Caiming Xiong, Richard Socher
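
The joint many-task idea above, with lower-level layers such as POS tagging and chunking feeding their predictions upward to higher-level layers such as semantic relatedness, can be pictured structurally as a stack of functions. The sketch below uses trivial rule-based layers in place of the neural layers, purely to show how lower-layer outputs are reused higher up; none of the rules come from the patent.

```python
# Structural sketch of the "joint many-task" idea: task layers are stacked,
# and each higher layer sees both the words and the lower layers' predictions.
# The layers here are trivial rule-based stand-ins for neural layers.
def pos_layer(words):
    return ["NNP" if w[0].isupper() else "VBZ" if w.endswith("s") else "NN" for w in words]

def chunk_layer(words, pos):
    # Higher layer explicitly reuses the lower-level POS predictions.
    return ["B-NP" if p.startswith("NN") else "B-VP" for p in pos]

def relatedness_layer(words_a, words_b, chunks_a, chunks_b):
    # Semantic layer on top; here, overlap of noun-phrase words as a toy score.
    nps = lambda ws, cs: {w for w, c in zip(ws, cs) if c == "B-NP"}
    a, b = nps(words_a, chunks_a), nps(words_b, chunks_b)
    return len(a & b) / max(len(a | b), 1)

s1, s2 = "Kazuma writes code".split(), "Kazuma reads code".split()
pos1, pos2 = pos_layer(s1), pos_layer(s2)
chunks1, chunks2 = chunk_layer(s1, pos1), chunk_layer(s2, pos2)
print(relatedness_layer(s1, s2, chunks1, chunks2))
```
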
  • Patent number: 11783164
    Abstract: The technology disclosed provides a so-called “joint many-task neural network model” to solve a variety of increasingly complex natural language processing (NLP) tasks using growing depth of layers in a single end-to-end model. The model is successively trained by considering linguistic hierarchies, directly connecting word representations to all model layers, explicitly using predictions in lower tasks, and applying a so-called “successive regularization” technique to prevent catastrophic forgetting. Three examples of lower level model layers are part-of-speech (POS) tagging layer, chunking layer, and dependency parsing layer. Two examples of higher level model layers are semantic relatedness layer and textual entailment layer. The model achieves the state-of-the-art results on chunking, dependency parsing, semantic relatedness and textual entailment.
    Type: Grant
    Filed: October 26, 2020
    Date of Patent: October 10, 2023
    Assignee: Salesforce.com, Inc.
    Inventors: Kazuma Hashimoto, Caiming Xiong, Richard Socher
  • Patent number: 11775775
    Abstract: Embodiments described herein provide a pipelined natural language question answering system that improves a BERT-based system. Specifically, the natural language question answering system uses a pipeline of neural networks, each trained to perform a particular task. The context selection network identifies premium context from the context provided for the question. The question type network identifies the natural language question as a yes, no, or span question and provides a yes or no answer to the natural language question when the question is a yes or no question. The span extraction model determines an answer span to the natural language question when the question is a span question.
    Type: Grant
    Filed: November 26, 2019
    Date of Patent: October 3, 2023
    Assignee: Salesforce.com, Inc.
    Inventors: Akari Asai, Kazuma Hashimoto, Richard Socher, Caiming Xiong
  • Patent number: 11763090
    Abstract: An online system that allows users to interact with it using expressions in natural language form includes an intent inference module allowing it to infer an intent represented by a user expression. The intent inference module has a set of possible intents, along with a small set of example natural language expressions known to represent that intent. When a user interacts with the system using a natural language expression for which the intent is not already known, the intent inference module applies a natural language inference model to compute scores indicating whether the user expression textually entails the various example natural language expressions. Based on the scores, the intent inference module determines an intent that is most applicable for the expression. If an intent cannot be determined with sufficient confidence, the intent inference module may further attempt to determine whether the various example natural language expressions textually entail the user expression.
    Type: Grant
    Filed: December 18, 2019
    Date of Patent: September 19, 2023
    Assignee: Salesforce, Inc.
    Inventors: Tian Xie, Kazuma Hashimoto, Xinyi Yang, Caiming Xiong
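
The inference procedure above, scoring whether a user expression textually entails each intent's example expressions and falling back when confidence is low, can be sketched as follows. The word-overlap scorer stands in for a trained natural language inference model, and the intents, examples, and threshold are invented for the example.

```python
# Minimal sketch of intent inference via entailment scores against example
# expressions. The entailment scorer is a toy word-overlap stand-in for an
# NLI model; intents and examples are hypothetical.
INTENT_EXAMPLES = {
    "check_balance": ["What is my account balance?", "Show my balance"],
    "reset_password": ["I forgot my password", "Reset my password"],
}

def entailment_score(premise, hypothesis):
    """Placeholder for an NLI model's entailment probability."""
    p, h = set(premise.lower().split()), set(hypothesis.lower().split())
    return len(p & h) / max(len(h), 1)

def infer_intent(utterance, threshold=0.4):
    scores = {
        intent: max(entailment_score(utterance, ex) for ex in examples)
        for intent, examples in INTENT_EXAMPLES.items()
    }
    best = max(scores, key=scores.get)
    # If confidence is low, the abstract describes also trying the reverse
    # direction (examples entailing the utterance); omitted in this sketch.
    return best if scores[best] >= threshold else None

print(infer_intent("how much money is in my balance"))   # check_balance (toy result)
```
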
  • Patent number: 11741142
    Abstract: Embodiments described herein provide document summarization systems and methods that utilize fine-tuning of pre-trained abstractive summarization models to produce summaries that more faithfully track the content of the documents. Such abstractive summarization models may be pre-trained using a corpus consisting of pairs of articles and associated summaries. For each article-summary pair, a pseudo label or control code is generated and represents a faithfulness of the summary with respect to the article. The pre-trained model is then fine-tuned based on the article-summary pairs and the corresponding control codes. The resulting fine-tuned models then provide improved faithfulness in document summarization tasks.
    Type: Grant
    Filed: January 31, 2022
    Date of Patent: August 29, 2023
    Assignee: salesforce.com, inc.
    Inventors: Haopeng Zheng, Semih Yavuz, Wojciech Kryscinski, Kazuma Hashimoto, Yingbo Zhou
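
A small sketch of the data-preparation step implied by the abstract above: compute a faithfulness pseudo-label for each article-summary pair and prepend it as a control code before fine-tuning. The token-overlap heuristic and the code names below are illustrative assumptions, not the method's actual scoring or vocabulary.

```python
# Sketch of attaching a faithfulness "control code" to each article-summary
# pair, then building fine-tuning targets prefixed with that code. The
# grounding heuristic and code names are placeholders.
def faithfulness_code(article, summary):
    """Pseudo-label: fraction of summary tokens grounded in the article."""
    art = set(article.lower().split())
    toks = summary.lower().split()
    grounded = sum(t in art for t in toks) / max(len(toks), 1)
    return "<faithful>" if grounded >= 0.8 else "<hallucinated>"

corpus = [
    ("the cat sat on the mat", "the cat sat on the mat"),
    ("the cat sat on the mat", "the dog barked loudly"),
]

# Fine-tuning examples: control code prepended to the target summary.
finetune_data = [
    (article, f"{faithfulness_code(article, summary)} {summary}")
    for article, summary in corpus
]
for _, tgt in finetune_data:
    print(tgt)
```
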
  • Publication number: 20230186916
    Abstract: A conversation engine performs conversations with users using chatbots customized for performing a set of tasks that can be performed using an online system. The conversation engine loads a chatbot configuration that specifies the behavior of a chatbot including the tasks that can be performed by the chatbot, the types of entities relevant to each task, and so on. The conversation may be voice based and use natural language. The conversation engine may load different chatbot configurations to implement different chatbots. The conversation engine receives a conversation engine configuration that specifies the behavior of the conversation engine across chatbots. The system may be a multi-tenant system that allows customization of the chatbots for each tenant.
    Type: Application
    Filed: February 10, 2023
    Publication date: June 15, 2023
    Inventors: Xinyi Yang, Tian Xie, Caiming Xiong, Wenhao Liu, Huan Wang, Kazuma Hashimoto, Yingbo Zhou, Xugang Ye, Jin Qu, Feihong Wu
  • Patent number: 11669712
    Abstract: A method for evaluating robustness of one or more target neural network models using natural typos. The method includes receiving one or more natural typo generation rules associated with a first task associated with a first input document type, receiving a first target neural network model, and receiving a first document and its corresponding ground truth labels. The method further includes generating one or more natural typos for the first document based on the one or more natural typo generation rules, and providing, to the first target neural network model, a test document generated based on the first document and the one or more natural typos as an input document to generate a first output. A robustness evaluation result of the first target neural network model is generated based on a comparison between the first output and the ground truth labels.
    Type: Grant
    Filed: September 3, 2019
    Date of Patent: June 6, 2023
    Assignee: salesforce.com, inc.
    Inventors: Lichao Sun, Kazuma Hashimoto, Jia Li, Richard Socher, Caiming Xiong
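
The evaluation loop described above, generating rule-based natural typos for a document, running the target model on the typo'd version, and comparing its output with the ground-truth label, can be sketched as follows. The character-swap rule and the keyword "classifier" are toy placeholders, not the patented rules or models.

```python
# Sketch of robustness evaluation with rule-based "natural typos": perturb the
# document, run the target model, and compare with the ground-truth label.
import random

random.seed(1)

def swap_adjacent(word):
    """One example typo rule: swap two adjacent characters."""
    if len(word) < 3:
        return word
    i = random.randrange(len(word) - 1)
    return word[:i] + word[i + 1] + word[i] + word[i + 2:]

def add_typos(text, rate=0.3):
    return " ".join(swap_adjacent(w) if random.random() < rate else w for w in text.split())

def toy_model(text):
    """Stand-in for the target neural network (a sentiment classifier here)."""
    return "positive" if "great" in text else "negative"

document, label = "this product is great", "positive"
test_document = add_typos(document)
prediction = toy_model(test_document)
print(test_document, "->", prediction, "| robust:", prediction == label)
```
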
  • Publication number: 20230153542
    Abstract: Embodiments described herein provide a cross-lingual sentence alignment framework that is trained only on rich-resource language pairs. To obtain an accurate aligner, a pretrained multi-lingual language model is used, and a classifier is trained on parallel data from rich-resource language pairs. This trained classifier may then be used for cross-lingual transfer with low-resource languages.
    Type: Application
    Filed: January 21, 2022
    Publication date: May 18, 2023
    Inventors: Tong Niu, Kazuma Hashimoto, Yingbo Zhou, Caiming Xiong
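
As a minimal picture of the setup above, the sketch below encodes two sentences and applies a thresholded similarity as a stand-in for the trained alignment classifier; a bag-of-tokens "encoder" replaces the pretrained multilingual language model so the example stays self-contained, and the threshold is an assumption.

```python
# Sketch of sentence-pair alignment: encode both sentences, then let a
# thresholded similarity stand in for the classifier trained on rich-resource
# language pairs and reused for low-resource pairs.
def encode(sentence):
    """Placeholder for a pretrained multilingual sentence encoder
    (here just a bag of lowercased tokens)."""
    return set(sentence.lower().split())

def similarity(a, b):
    return len(a & b) / max(len(a | b), 1)

def aligned(src, tgt, threshold=0.5):
    """Stand-in for the trained binary alignment classifier."""
    return similarity(encode(src), encode(tgt)) >= threshold

print(aligned("the cat sleeps", "the cat sleeps"))        # True
print(aligned("the cat sleeps", "completely unrelated"))  # False
```
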
  • Publication number: 20230098809
    Abstract: An information processing apparatus includes a control unit configured to execute a scene detection process, a parameter extraction process, and an output process. The scene detection process detects a scene from an input content. The parameter extraction process extracts a realistic sensation parameter for wave control that corresponds to the scene detected by the scene detection process. The output process outputs a wave signal for the content, which is produced by processing the sound data of the input content with the realistic sensation parameter extracted by the parameter extraction process.
    Type: Application
    Filed: March 18, 2022
    Publication date: March 30, 2023
    Applicant: DENSO TEN Limited
    Inventors: Shinichi SHIOTSU, Yoshikuni MIKI, Rei HIROMI, Yohei KAKEE, Iku NAKAJO, Kazuma HASHIMOTO
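
Read as a processing pipeline, the three processes named above (scene detection, parameter extraction, and wave-signal output) could be chained as in the sketch below. The scene names, the keyword-based detector, and the gain-style parameter are invented placeholders, not details from the application.

```python
# Illustrative pipeline: detect a scene in the content, look up a "realistic
# sensation" parameter for that scene, and produce a processed signal from the
# sound data. All values are hypothetical.
SCENE_PARAMS = {"explosion": {"gain": 2.0}, "dialogue": {"gain": 0.8}}

def detect_scene(content):
    """Scene detection process (toy keyword rule)."""
    return "explosion" if "boom" in content["audio_tags"] else "dialogue"

def extract_parameter(scene):
    """Parameter extraction process."""
    return SCENE_PARAMS[scene]

def output_wave(sound_data, params):
    """Output process: apply the parameter to the sound samples."""
    return [s * params["gain"] for s in sound_data]

content = {"audio_tags": ["boom"], "sound": [0.1, -0.2, 0.3]}
params = extract_parameter(detect_scene(content))
print(output_wave(content["sound"], params))
```
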
  • Publication number: 20230054068
    Abstract: Embodiments described herein provide document summarization systems and methods that utilize fine-tuning of pre-trained abstractive summarization models to produce summaries that more faithfully track the content of the documents. Such abstractive summarization models may be pre-trained using a corpus consisting of pairs of articles and associated summaries. For each article-summary pair, a pseudo label or control code is generated and represents a faithfulness of the summary with respect to the article. The pre-trained model is then fine-tuned based on the article-summary pairs and the corresponding control codes. The resulting fine-tuned models then provide improved faithfulness in document summarization tasks.
    Type: Application
    Filed: January 31, 2022
    Publication date: February 23, 2023
    Inventors: Haopeng Zheng, Semih Yavuz, Wojciech Kryscinski, Kazuma Hashimoto, Yingbo Zhou
  • Publication number: 20230055188
    Abstract: Embodiments described herein provide a question answering approach that answers a question by generating an executable logical form. First, a ranking model is used to select a set of good logical forms from a pool of logical forms obtained by searching over a knowledge graph. The selected logical forms are good in the sense that they are close to (or exactly match, in some cases) the intents in the question and final desired logical form. Next, a generation model is adopted conditioned on the question as well as the selected logical forms to generate the target logical form and execute it to obtain the final answer. For example, at inference stage, when a question is received, a matching logical form is identified from the question, based on which the final answer can be generated based on the node that is associated with the matching logical form in the knowledge base.
    Type: Application
    Filed: December 29, 2021
    Publication date: February 23, 2023
    Inventors: Xi Ye, Semih Yavuz, Kazuma Hashimoto, Yingbo Zhou
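
The two-stage approach above, ranking candidate logical forms found in a knowledge graph and then generating the target logical form to execute, is sketched below with a tiny in-memory "knowledge base". The overlap-based ranker and the trivial generator are placeholders for the ranking and generation models.

```python
# Sketch of rank-then-generate question answering over logical forms: rank
# candidates against the question, "generate" a target form conditioned on the
# best candidates, then execute it. The KB, ranker, and generator are toys.
import re

KB = {"capital_of(France)": "Paris", "capital_of(Japan)": "Tokyo"}

def candidate_logical_forms(question):
    """Stand-in for enumerating logical forms by searching a knowledge graph."""
    return list(KB.keys())

def rank(question, candidates, k=1):
    """Ranking model: keep the candidates closest to the question (toy overlap)."""
    q = set(re.findall(r"\w+", question.lower()))
    overlap = lambda lf: len(q & set(re.findall(r"\w+", lf.lower())))
    return sorted(candidates, key=overlap, reverse=True)[:k]

def generate(question, top_candidates):
    """Generation model conditioned on the question and selected forms; here it
    simply returns the best-ranked candidate as the target logical form."""
    return top_candidates[0]

def execute(logical_form):
    return KB.get(logical_form, "unknown")

question = "What is the capital of France?"
target = generate(question, rank(question, candidate_logical_forms(question)))
print(target, "->", execute(target))
```
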
  • Publication number: 20230059870
    Abstract: Embodiments described herein provide a question answering approach that answers a question by generating an executable logical form. First, a ranking model is used to select a set of good logical forms from a pool of logical forms obtained by searching over a knowledge graph. The selected logical forms are good in the sense that they are close to (or exactly match, in some cases) the intents in the question and final desired logical form. Next, a generation model is adopted conditioned on the question as well as the selected logical forms to generate the target logical form and execute it to obtain the final answer. For example, at inference stage, when a question is received, a matching logical form is identified from the question, based on which the final answer can be generated based on the node that is associated with the matching logical form in the knowledge base.
    Type: Application
    Filed: December 29, 2021
    Publication date: February 23, 2023
    Inventors: Xi Ye, Semih Yavuz, Kazuma Hashimoto, Yingbo Zhou