Patents by Inventor Nandan Gautam Thor
Nandan Gautam Thor has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
- Patent number: 12271833
  Abstract: To automatically identify a sequence of recommended account/product pairs with the highest likelihood of becoming a realized opportunity, an account/product sequence recommender uses an account propensity (AP) model, a reinforcement learning (RL) model, and target engagement sequence generators trained on historical time series data, firmographic data, and product data. The trained AP model assigns propensity values to each product corresponding to received account characteristics. The trained RL model generates an optimal sequence of products that maximizes the reward over future realized opportunities. The target engagement sequence generators create target engagement sequences corresponding to the optimal sequence of products. The recommender prunes the optimal sequence of products based on the propensity values from the trained AP model, the completeness of these target engagement sequences, and a desired product sequence length.
  Type: Grant
  Filed: March 31, 2019
  Date of Patent: April 8, 2025
  Assignee: Palo Alto Networks, Inc.
  Inventors: Jere Armas Michael Helenius, Nandan Gautam Thor, Gorkem Kilic, Juho Pekanpoika Parviainen, Erik Michael Bower
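The abstract above describes the pruning step only in prose, and no implementation is published with the patent. The Python sketch below is a minimal illustration of that step under assumed inputs: the Candidate fields, the threshold values, and the example products are all hypothetical stand-ins for the AP-model propensities, engagement-sequence completeness, and RL-ordered products the abstract refers to.

```python
# Hypothetical sketch of the pruning step described in the abstract; none of
# these names, thresholds, or products come from the patent itself.
from dataclasses import dataclass

@dataclass
class Candidate:
    product: str
    propensity: float      # propensity value from the trained account propensity (AP) model
    completeness: float    # fraction of the target engagement sequence available (0..1)

def prune_product_sequence(rl_sequence, min_propensity=0.5, min_completeness=0.8, max_length=3):
    """Keep RL-ordered products whose AP propensity and engagement-sequence
    completeness clear the thresholds, then truncate to the desired length."""
    kept = [c for c in rl_sequence
            if c.propensity >= min_propensity and c.completeness >= min_completeness]
    return kept[:max_length]

if __name__ == "__main__":
    # Hypothetical output of the RL model: an ordered product sequence for one account.
    rl_sequence = [
        Candidate("Product A", 0.91, 0.95),
        Candidate("Product B", 0.42, 0.90),   # dropped: low propensity
        Candidate("Product C", 0.77, 0.60),   # dropped: incomplete engagement sequence
        Candidate("Product D", 0.68, 0.85),
    ]
    for c in prune_product_sequence(rl_sequence):
        print(c.product)
```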
- Patent number: 12039428
  Abstract: To identify a target engagement sequence with the highest likelihood of realizing an opportunity, a target engagement sequence generator uses models (an artificial recurrent neural network (RNN) and a hidden Markov model (HMM)) trained with historical time series data for a particular combination of values for opportunity characteristics. The trained RNN identifies a sequence of personas for realizing the opportunity described by the opportunity characteristic values. Data from regression analysis indicates key individuals for realizing an opportunity within each organizational classification that occurs in the historical data. The HMM identifies the importance of each persona in the sequence of personas, which is used to communicate with the key individuals. The resulting sequence indicates an optimal set of individuals and the order in which to contact them to realize an opportunity.
  Type: Grant
  Filed: September 15, 2022
  Date of Patent: July 16, 2024
  Assignee: Palo Alto Networks, Inc.
  Inventors: Jere Armas Michael Helenius, Nandan Gautam Thor, Erik Michael Bower, René Bonvanie
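The patent text gives no code, so the following is only a rough Python sketch of how the described outputs might be combined. The RNN-ordered persona sequence, the HMM importance weights, the persona-to-key-individual mapping from the regression analysis, and the importance threshold are all assumed, illustrative inputs rather than anything taken from the filing.

```python
# Hypothetical combination of the three described outputs: RNN persona order,
# HMM persona importance, and regression-derived key individuals.

def build_contact_sequence(persona_sequence, persona_importance, key_individuals,
                           min_importance=0.1):
    """Walk the RNN-ordered personas, drop low-importance ones per the HMM,
    and map each remaining persona to its regression-identified key individual."""
    contacts = []
    for persona in persona_sequence:                    # RNN output: contact order
        weight = persona_importance.get(persona, 0.0)   # HMM output: persona importance
        if weight >= min_importance and persona in key_individuals:
            contacts.append((key_individuals[persona], persona, weight))
    return contacts

if __name__ == "__main__":
    # Hypothetical inputs for one opportunity.
    persona_sequence = ["Security Architect", "CISO", "Procurement Lead"]
    persona_importance = {"Security Architect": 0.45, "CISO": 0.40, "Procurement Lead": 0.15}
    key_individuals = {"Security Architect": "A. Rivera", "CISO": "B. Chen",
                       "Procurement Lead": "C. Okafor"}
    for name, persona, weight in build_contact_sequence(
            persona_sequence, persona_importance, key_individuals):
        print(f"{name} ({persona}, importance {weight:.2f})")
```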
- Publication number: 20230376695
  Abstract: Dynamic content tags are generated as content is received by a dynamic content tagging system. A natural language processor (NLP) tokenizes the content and extracts contextual N-grams based on local or global context for the tokens in each document in the content. The contextual N-grams are used as input to a generative model that computes a weighted vector of likelihood values that each contextual N-gram corresponds to one of a set of unlabeled topics. A tag is generated for each unlabeled topic, comprising the contextual N-gram with the highest likelihood of corresponding to that unlabeled topic. Topic-based deep learning models having tag predictions below a threshold confidence level are retrained using the generated tags, and the retrained topic-based deep learning models dynamically tag the content.
  Type: Application
  Filed: August 1, 2023
  Publication date: November 23, 2023
  Inventors: Nandan Gautam Thor, Vasiliki Arvaniti, Jere Armas Michael Helenius, Erik Michael Bower
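Below is a minimal sketch of the tag-generation step described above, assuming the likelihood matrix has already been produced by some generative topic model. The function name, the example N-grams, and the hand-written likelihood values are hypothetical and do not come from the publication.

```python
# Hypothetical sketch: pick, for each unlabeled topic, the contextual N-gram
# with the highest likelihood of belonging to it; that N-gram becomes the tag.

def generate_topic_tags(ngrams, topic_likelihoods):
    """topic_likelihoods[i][j] = likelihood that ngrams[i] corresponds to topic j."""
    num_topics = len(topic_likelihoods[0])
    tags = {}
    for topic in range(num_topics):
        best = max(range(len(ngrams)), key=lambda i: topic_likelihoods[i][topic])
        tags[topic] = ngrams[best]
    return tags

if __name__ == "__main__":
    # Hand-written example N-grams and weighted likelihood vectors (illustrative only).
    ngrams = ["zero trust network", "cloud access broker", "phishing email detection"]
    topic_likelihoods = [
        [0.80, 0.10, 0.10],
        [0.15, 0.75, 0.10],
        [0.05, 0.15, 0.80],
    ]
    print(generate_topic_tags(ngrams, topic_likelihoods))
```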
- Patent number: 11763091
  Abstract: Dynamic content tags are generated as content is received by a dynamic content tagging system. A natural language processor (NLP) tokenizes the content and extracts contextual N-grams based on local or global context for the tokens in each document in the content. The contextual N-grams are used as input to a generative model that computes a weighted vector of likelihood values that each contextual N-gram corresponds to one of a set of unlabeled topics. A tag is generated for each unlabeled topic, comprising the contextual N-gram with the highest likelihood of corresponding to that unlabeled topic. Topic-based deep learning models having tag predictions below a threshold confidence level are retrained using the generated tags, and the retrained topic-based deep learning models dynamically tag the content.
  Type: Grant
  Filed: February 25, 2020
  Date of Patent: September 19, 2023
  Assignee: Palo Alto Networks, Inc.
  Inventors: Nandan Gautam Thor, Vasiliki Arvaniti, Jere Armas Michael Helenius, Erik Michael Bower
- Publication number: 20230011066
  Abstract: To identify a target engagement sequence with the highest likelihood of realizing an opportunity, a target engagement sequence generator uses models (an artificial recurrent neural network (RNN) and a hidden Markov model (HMM)) trained with historical time series data for a particular combination of values for opportunity characteristics. The trained RNN identifies a sequence of personas for realizing the opportunity described by the opportunity characteristic values. Data from regression analysis indicates key individuals for realizing an opportunity within each organizational classification that occurs in the historical data. The HMM identifies the importance of each persona in the sequence of personas, which is used to communicate with the key individuals. The resulting sequence indicates an optimal set of individuals and the order in which to contact them to realize an opportunity.
  Type: Application
  Filed: September 15, 2022
  Publication date: January 12, 2023
  Inventors: Jere Armas Michael Helenius, Nandan Gautam Thor, Erik Michael Bower, René Bonvanie
- Patent number: 11494610
  Abstract: To identify a target engagement sequence with the highest likelihood of realizing an opportunity, a target engagement sequence generator uses models (an artificial recurrent neural network (RNN) and a hidden Markov model (HMM)) trained with historical time series data for a particular combination of values for opportunity characteristics. The trained RNN identifies a sequence of personas for realizing the opportunity described by the opportunity characteristic values. Data from regression analysis indicates key individuals for realizing an opportunity within each organizational classification that occurs in the historical data. The HMM identifies the importance of each persona in the sequence of personas, which is used to communicate with the key individuals. The resulting sequence indicates an optimal set of individuals and the order in which to contact them to realize an opportunity.
  Type: Grant
  Filed: March 31, 2019
  Date of Patent: November 8, 2022
  Assignee: Palo Alto Networks, Inc.
  Inventors: Jere Armas Michael Helenius, Nandan Gautam Thor, Erik Michael Bower, René Bonvanie
- Publication number: 20210264116
  Abstract: Dynamic content tags are generated as content is received by a dynamic content tagging system. A natural language processor (NLP) tokenizes the content and extracts contextual N-grams based on local or global context for the tokens in each document in the content. The contextual N-grams are used as input to a generative model that computes a weighted vector of likelihood values that each contextual N-gram corresponds to one of a set of unlabeled topics. A tag is generated for each unlabeled topic, comprising the contextual N-gram with the highest likelihood of corresponding to that unlabeled topic. Topic-based deep learning models having tag predictions below a threshold confidence level are retrained using the generated tags, and the retrained topic-based deep learning models dynamically tag the content.
  Type: Application
  Filed: February 25, 2020
  Publication date: August 26, 2021
  Inventors: Nandan Gautam Thor, Vasiliki Arvaniti, Jere Armas Michael Helenius, Erik Michael Bower
- Publication number: 20200311585
  Abstract: To automatically identify a sequence of recommended account/product pairs with the highest likelihood of becoming a realized opportunity, an account/product sequence recommender uses an account propensity (AP) model, a reinforcement learning (RL) model, and target engagement sequence generators trained on historical time series data, firmographic data, and product data. The trained AP model assigns propensity values to each product corresponding to received account characteristics. The trained RL model generates an optimal sequence of products that maximizes the reward over future realized opportunities. The target engagement sequence generators create target engagement sequences corresponding to the optimal sequence of products. The recommender prunes the optimal sequence of products based on the propensity values from the trained AP model, the completeness of these target engagement sequences, and a desired product sequence length.
  Type: Application
  Filed: March 31, 2019
  Publication date: October 1, 2020
  Inventors: Jere Armas Michael Helenius, Nandan Gautam Thor, Gorkem Kilic, Juho Pekanpoika Parviainen, Erik Michael Bower
- Publication number: 20200311513
  Abstract: To identify a target engagement sequence with the highest likelihood of realizing an opportunity, a target engagement sequence generator uses models (an artificial recurrent neural network (RNN) and a hidden Markov model (HMM)) trained with historical time series data for a particular combination of values for opportunity characteristics. The trained RNN identifies a sequence of personas for realizing the opportunity described by the opportunity characteristic values. Data from regression analysis indicates key individuals for realizing an opportunity within each organizational classification that occurs in the historical data. The HMM identifies the importance of each persona in the sequence of personas, which is used to communicate with the key individuals. The resulting sequence indicates an optimal set of individuals and the order in which to contact them to realize an opportunity.
  Type: Application
  Filed: March 31, 2019
  Publication date: October 1, 2020
  Inventors: Jere Armas Michael Helenius, Nandan Gautam Thor, Erik Michael Bower, René Bonvanie