Patents by Inventor Andrew Mattarella-Micke
Andrew Mattarella-Micke has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 11907657
Abstract: Systems and methods for dynamically extracting n-grams for automated vocabulary updates. Text is received. An n-gram extracted from the text is matched to a canonical n-gram from a vocabulary to identify a tag for the text. An n-gram weight is computed for the n-gram extracted from the text. The n-gram weight may be computed by adjusting a term frequency of the n-gram. A relevancy score is computed for the tag using the n-gram weight and using an n-gram frequency of the canonical n-gram. The relevancy score is computed by dividing the n-gram weight by a value proportional to the n-gram frequency of the canonical n-gram. The relevancy score of the n-gram is presented.
Type: Grant
Filed: June 30, 2023
Date of Patent: February 20, 2024
Assignee: Intuit Inc.
Inventors: Byungkyu Kang, Shivakumara Narayanaswamy, Andrew Mattarella-Micke
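The scoring step in this abstract is concrete enough to sketch: weight each extracted n-gram by adjusted term frequency, then divide by a value proportional to the canonical n-gram's corpus frequency. The function name, the vocabulary's dictionary shape, and the `alpha` proportionality constant below are illustrative assumptions, not details from the patent.

```python
from collections import Counter

def relevancy_scores(text_ngrams, vocabulary, alpha=1.0):
    """Score tags for a text from its extracted n-grams.

    vocabulary maps a canonical n-gram to a (tag, corpus_frequency) pair.
    """
    term_freq = Counter(text_ngrams)        # raw term frequency per extracted n-gram
    total = sum(term_freq.values())
    scores = {}
    for ngram, tf in term_freq.items():
        if ngram in vocabulary:             # match against a canonical n-gram
            tag, corpus_freq = vocabulary[ngram]
            weight = tf / total             # n-gram weight: adjusted term frequency
            # divide by a value proportional to the canonical n-gram's frequency
            scores[tag] = weight / (alpha * corpus_freq)
    return scores
```

Common canonical n-grams are thereby down-weighted, so a tag only scores highly when its n-gram is frequent in this text but rare in the vocabulary at large.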
-
Publication number: 20230385087
Abstract: A processor may obtain historic clickstream data indicating a plurality of interactions with a user interface (UI) by a plurality of users. The processor may select at least one user for real-time monitoring by processing, using a machine learning (ML) model, the historic clickstream data and at least one user feature and predicting, from the processing, that the at least one user will utilize a UI resource. The processor may monitor ongoing clickstream data of the selected at least one user and configure the UI resource according to the ongoing clickstream data.
Type: Application
Filed: May 31, 2022
Publication date: November 30, 2023
Applicant: Intuit Inc.
Inventors: Tomer Tal, Prarit Lamba, Clifford Green, Xiaoyu Zeng, Neo Yuchen, Andrew Mattarella-Micke
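The selection step described above can be sketched as thresholding a model's prediction over historic clickstream data. Here `predict_resource_use` stands in for the trained ML model; the field names and the 0.5 threshold are assumptions for illustration.

```python
def select_users_for_monitoring(users, predict_resource_use, threshold=0.5):
    """Select users predicted to utilize the UI resource.

    predict_resource_use(historic_clicks, features) -> probability,
    standing in for the ML model described in the abstract.
    """
    selected = []
    for user in users:
        p = predict_resource_use(user["historic_clicks"], user["features"])
        if p >= threshold:
            selected.append(user["id"])  # these users get real-time monitoring
    return selected
```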
-
Publication number: 20230316157
Abstract: A machine learning system executed by a processor may generate predictions for a variety of natural language processing (NLP) tasks. The machine learning system may include a single deployment implementing a parameter efficient transfer learning architecture. The machine learning system may use adapter layers to dynamically modify a base model to generate a plurality of fine-tuned models. Each fine-tuned model may generate predictions for a specific NLP task. By transferring knowledge from the base model to each fine-tuned model, the ML system achieves a significant reduction in the number of tunable parameters required to generate a fine-tuned NLP model and decreases the fine-tuned model artifact size. Additionally, the ML system reduces training times for fine-tuned NLP models, promotes transfer learning across NLP tasks with lower labeled data volumes, and enables easier and more computationally efficient deployments for multi-task NLP.
Type: Application
Filed: June 2, 2023
Publication date: October 5, 2023
Applicant: Intuit Inc.
Inventors: Terrence J. Torres, Tharathorn Rimchala, Andrew Mattarella-Micke
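The adapter-layer idea behind this parameter-efficient architecture can be sketched as a small bottleneck module inserted into a frozen base model: a down-projection, a nonlinearity, an up-projection, and a residual connection. This is a generic sketch of the published adapter technique, not the patent's specific design; the dimensions and the zero up-projection initialization are assumptions.

```python
import numpy as np

class Adapter:
    """Bottleneck adapter: down-project, nonlinearity, up-project, residual add.

    The base model's weights stay frozen; only these two small matrices
    are tuned per NLP task, shrinking the per-task artifact size.
    """
    def __init__(self, hidden_dim, bottleneck_dim, seed=0):
        rng = np.random.default_rng(seed)
        self.w_down = rng.normal(0.0, 0.02, (hidden_dim, bottleneck_dim))
        self.w_up = np.zeros((bottleneck_dim, hidden_dim))  # starts as an identity map

    def __call__(self, h):
        # residual connection preserves the base model's output at initialization
        return h + np.maximum(h @ self.w_down, 0.0) @ self.w_up

def tunable_params(hidden_dim, bottleneck_dim):
    """Parameters tuned per task: two projection matrices per adapter layer."""
    return 2 * hidden_dim * bottleneck_dim
```

With a bottleneck much smaller than the hidden dimension, the tuned parameters per layer are a small fraction of a full fine-tune, which is the reduction the abstract describes.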
-
Publication number: 20230281399
Abstract: Embodiments disclosed herein provide language-agnostic routing prediction models. The routing prediction models accept text queries in any language and generate a routing prediction for the text queries. For a language that may have sparse training text data, the models, which are machine learning models, are trained using machine translation from a prevalent language (e.g., English) to the language having sparse training text data, with the original text corpus and the translated text corpus being an input to multi-language embedding layers. The trained machine learning model makes routing predictions for text queries for the language having sparse training text data.
Type: Application
Filed: March 3, 2022
Publication date: September 7, 2023
Applicant: Intuit Inc.
Inventors: Prarit Lamba, Clifford Green, Tomer Tal, Andrew Mattarella-Micke
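The corpus-building step can be sketched as pairing each labeled query with its machine translation while keeping the routing label, so both corpora feed the multi-language embedding layers. The function names are illustrative assumptions.

```python
def augment_with_translations(examples, translate):
    """Build a combined training corpus for a sparse-data language.

    examples: (query, route_label) pairs; translate: a machine-translation
    function between the sparse-data language and a prevalent language
    (e.g., English). The translated query keeps the original routing label.
    """
    augmented = []
    for query, route in examples:
        augmented.append((query, route))             # original text corpus
        augmented.append((translate(query), route))  # translated text corpus
    return augmented
```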
-
Patent number: 11704602
Abstract: A machine learning system executed by a processor may generate predictions for a variety of natural language processing (NLP) tasks. The machine learning system may include a single deployment implementing a parameter efficient transfer learning architecture. The machine learning system may use adapter layers to dynamically modify a base model to generate a plurality of fine-tuned models. Each fine-tuned model may generate predictions for a specific NLP task. By transferring knowledge from the base model to each fine-tuned model, the ML system achieves a significant reduction in the number of tunable parameters required to generate a fine-tuned NLP model and decreases the fine-tuned model artifact size. Additionally, the ML system reduces training times for fine-tuned NLP models, promotes transfer learning across NLP tasks with lower labeled data volumes, and enables easier and more computationally efficient deployments for multi-task NLP.
Type: Grant
Filed: January 2, 2020
Date of Patent: July 18, 2023
Assignee: Intuit Inc.
Inventors: Terrence J. Torres, Tharathorn Rimchala, Andrew Mattarella-Micke
-
Patent number: 11610113
Abstract: A data management system trains an analysis model with a machine learning process to understand the semantic meaning of queries received from users of the data management system. The machine learning process includes retrieving assistance documents that each include a query and an answer to the query. A training model analyzes each answer and generates first topic distribution data indicating, for each answer, how relevant each of a plurality of topics is to the answer. The queries are passed to the analysis model and the analysis model is trained to generate second topic distribution data that converges with the first topic distribution data based on analysis of the queries.
Type: Grant
Filed: October 22, 2019
Date of Patent: March 21, 2023
Assignee: Intuit Inc.
Inventor: Andrew Mattarella-Micke
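Training one model's topic distribution to converge with another's is naturally framed as minimizing a divergence between the two distributions. Using KL divergence as the convergence measure is an assumption for illustration; the patent does not name a specific loss.

```python
import math

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) between two topic distributions.

    p: first topic distribution (the training model's view of an answer);
    q: second topic distribution (the analysis model's view of the query).
    Driving this toward zero makes q converge with p.
    """
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))
```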
-
Patent number: 11563846
Abstract: A method including receiving an incoming call from a calling device of a caller and determining identification information for the calling device. The method also includes receiving voice audio data of the caller from the calling device, converting the voice audio data to caller phones, and identifying a customer account associated with the identification information. The method further includes obtaining user phones for multiple candidate users associated with the identified customer account, comparing the caller phones to the user phones for the multiple candidate users, and determining the identity of the caller based on the comparison.
Type: Grant
Filed: May 31, 2022
Date of Patent: January 24, 2023
Assignee: Intuit Inc.
Inventors: Andrew Mattarella-Micke, Neo Yuchen, Xiaoyu Zeng, Manisha Panta
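The comparison step could be implemented by treating each user's phones (phoneme symbols) as a sequence and picking the candidate with the closest match. Edit distance as the comparison metric is an assumption, since the abstract does not specify how the phone sequences are compared.

```python
def edit_distance(a, b):
    """Levenshtein distance between two phone (phoneme) sequences."""
    dp = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        prev, dp[0] = dp[0], i
        for j, cb in enumerate(b, 1):
            prev, dp[j] = dp[j], min(dp[j] + 1,          # delete ca
                                     dp[j - 1] + 1,      # insert cb
                                     prev + (ca != cb))  # substitute
    return dp[-1]

def identify_caller(caller_phones, candidate_phones):
    """Return the candidate user whose stored phones best match the caller's.

    candidate_phones maps a user id to that user's phone sequence.
    """
    return min(candidate_phones,
               key=lambda u: edit_distance(caller_phones, candidate_phones[u]))
```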
-
Publication number: 20210248617
Abstract: A method and system train an analysis model with a machine learning process to predict whether a current user of the data management system will contact customer assistance agents of the data management system. The machine learning process utilizes historical clickstream data indicating actions taken by a plurality of historical users of the data management system while using the data management system. The analysis model predicts whether the current user will contact customer assistance agents by analyzing current clickstream data associated with the current user.
Type: Application
Filed: February 10, 2020
Publication date: August 12, 2021
Applicant: Intuit Inc.
Inventors: Andrew Mattarella-Micke, Pavlo Malynin, David S. Grayson, Tianhao Luo
-
Publication number: 20210209513
Abstract: A machine learning system executed by a processor may generate predictions for a variety of natural language processing (NLP) tasks. The machine learning system may include a single deployment implementing a parameter efficient transfer learning architecture. The machine learning system may use adapter layers to dynamically modify a base model to generate a plurality of fine-tuned models. Each fine-tuned model may generate predictions for a specific NLP task. By transferring knowledge from the base model to each fine-tuned model, the ML system achieves a significant reduction in the number of tunable parameters required to generate a fine-tuned NLP model and decreases the fine-tuned model artifact size. Additionally, the ML system reduces training times for fine-tuned NLP models, promotes transfer learning across NLP tasks with lower labeled data volumes, and enables easier and more computationally efficient deployments for multi-task NLP.
Type: Application
Filed: January 2, 2020
Publication date: July 8, 2021
Applicant: Intuit Inc.
Inventors: Terrence J. Torres, Tharathorn Rimchala, Andrew Mattarella-Micke
-
Publication number: 20210117777
Abstract: A data management system trains an analysis model with a machine learning process to understand the semantic meaning of queries received from users of the data management system. The machine learning process includes retrieving assistance documents that each include a query and an answer to the query. A training model analyzes each answer and generates first topic distribution data indicating, for each answer, how relevant each of a plurality of topics is to the answer. The queries are passed to the analysis model and the analysis model is trained to generate second topic distribution data that converges with the first topic distribution data based on analysis of the queries.
Type: Application
Filed: October 22, 2019
Publication date: April 22, 2021
Applicant: Intuit Inc.
Inventor: Andrew Mattarella-Micke