Patents by Inventor Andrew Paul McGovern
Andrew Paul McGovern has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 12050841
Abstract: Various embodiments discussed herein enable client applications to be heavily integrated with a voice assistant, both to perform commands associated with users' voice utterances via voice assistant functionality and to seamlessly cause client applications to automatically perform native functions as part of executing the voice utterance. For example, some embodiments can automatically and intelligently switch to the page the user needs and populate particular fields of that page based on a user view context and the voice utterance.
Type: Grant
Filed: August 8, 2023
Date of Patent: July 30, 2024
Assignee: Microsoft Technology Licensing, LLC
Inventors: Jaclyn Carley Knapp, Andrew Paul McGovern, Harris Syed, Chad Steven Estes, Jesse Daniel Eskes Rusak, David Ernesto Heekin Burkett, Allison Anne O'Mahony, Ashok Kuppusamy, Jonathan Reed Harris, Jose Miguel Rady Allende, Diego Hernan Carlomagno, Talon Edward Ireland, Michael Francis Palermiti, II, Richard Leigh Mains, Jayant Krishnamurthy
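The page-switch-and-populate behavior the abstract above describes can be sketched roughly as follows. Everything here (`ViewContext`, `PageAction`, `resolve_action`, the `send_message` intent) is a hypothetical illustration of the idea, not the patented implementation.

```python
# Rough sketch: given a recognized intent from a voice utterance plus the
# user's current view context, decide which page to switch to and which of
# its fields to pre-populate. All names here are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class ViewContext:
    current_page: str
    visible_fields: dict  # field name -> current on-screen value

@dataclass
class PageAction:
    target_page: str
    field_values: dict    # fields to populate on the target page

def resolve_action(intent: str, entities: dict, ctx: ViewContext) -> PageAction:
    """Map an intent and its extracted entities onto a page switch plus fills."""
    if intent == "send_message":
        # Switch to a compose page and pre-fill its fields from the utterance.
        return PageAction("compose", {"to": entities.get("recipient", ""),
                                      "body": entities.get("message", "")})
    # Unknown intent: stay on the current page and change nothing.
    return PageAction(ctx.current_page, {})

action = resolve_action("send_message",
                        {"recipient": "Sam", "message": "Running late"},
                        ViewContext("inbox", {}))
```

The point of passing `ctx` even for a known intent is that a real system would consult the view context (which page and fields are already visible) before deciding whether any switch is needed at all.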
-
Publication number: 20240241624
Abstract: Various embodiments discussed herein enable client applications to be heavily integrated with a voice assistant in order to perform commands associated with users' voice utterances via voice assistant functionality and to seamlessly cause client applications to automatically perform native functions as part of executing the voice utterance. Such heavy integration also allows particular embodiments to support multi-modal input from a user within a single conversational interaction. In this way, client application user interface interactions, such as clicks, touch gestures, or text inputs, are executed as an alternative, or in addition, to the voice utterances.
Type: Application
Filed: March 27, 2024
Publication date: July 18, 2024
Inventors: Tudor Buzasu KLEIN, Viktoriya TARANOV, Sergiy GAVRYLENKO, Jaclyn Carley KNAPP, Andrew Paul MCGOVERN, Harris SYED, Chad Steven ESTES, Jesse Daniel Eskes RUSAK, David Ernesto Heekin BURKETT, Allison Anne O'MAHONY, Ashok KUPPUSAMY, Jonathan Reed HARRIS, Jose Miguel Rady ALLENDE, Diego Hernan CARLOMAGNO, Talon Edward IRELAND, Michael Francis PALERMITI, II, Richard Leigh MAINS, Jayant KRISHNAMURTHY
-
Patent number: 11972095
Abstract: Various embodiments discussed herein enable client applications to be heavily integrated with a voice assistant in order to perform commands associated with users' voice utterances via voice assistant functionality and to seamlessly cause client applications to automatically perform native functions as part of executing the voice utterance. Such heavy integration also allows particular embodiments to support multi-modal input from a user within a single conversational interaction. In this way, client application user interface interactions, such as clicks, touch gestures, or text inputs, are executed as an alternative, or in addition, to the voice utterances.
Type: Grant
Filed: October 22, 2021
Date of Patent: April 30, 2024
Assignee: Microsoft Technology Licensing, LLC
Inventors: Tudor Buzasu Klein, Viktoriya Taranov, Sergiy Gavrylenko, Jaclyn Carley Knapp, Andrew Paul McGovern, Harris Syed, Chad Steven Estes, Jesse Daniel Eskes Rusak, David Ernesto Heekin Burkett, Allison Anne O'Mahony, Ashok Kuppusamy, Jonathan Reed Harris, Jose Miguel Rady Allende, Diego Hernan Carlomagno, Talon Edward Ireland, Michael Francis Palermiti, II, Richard Leigh Mains, Jayant Krishnamurthy
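The multi-modal interaction the abstract above describes — clicks, touch gestures, and text inputs joining voice utterances in one conversational turn — can be sketched as below. The event model and the resolution rule are invented for illustration, not taken from the patent.

```python
# Hedged sketch: accumulate input events from several modalities into a
# single conversational interaction, so a non-voice event (e.g. a click) can
# refine an earlier voice utterance within the same turn.
from dataclasses import dataclass

@dataclass
class InputEvent:
    modality: str  # "voice", "click", "touch", or "text"
    payload: str

class Interaction:
    """Collects events from any modality into a single conversation turn."""
    def __init__(self):
        self.events = []

    def add(self, event: InputEvent) -> None:
        self.events.append(event)

    def resolve(self) -> str:
        # A non-voice event can supply the concrete item that a spoken
        # pronoun ("this") referred to; here we simply join the payloads.
        voice = [e.payload for e in self.events if e.modality == "voice"]
        other = [e.payload for e in self.events if e.modality != "voice"]
        return " ".join(voice + other)

turn = Interaction()
turn.add(InputEvent("voice", "add this to my calendar"))
turn.add(InputEvent("click", "meeting:design-review"))
```

Here the click stands in for the referent of "this", which is the sense in which UI interactions execute "in addition to" the voice utterance.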
-
Publication number: 20230401031
Abstract: Various embodiments discussed herein enable client applications to be heavily integrated with a voice assistant, both to perform commands associated with users' voice utterances via voice assistant functionality and to seamlessly cause client applications to automatically perform native functions as part of executing the voice utterance. For example, some embodiments can automatically and intelligently switch to the page the user needs and populate particular fields of that page based on a user view context and the voice utterance.
Type: Application
Filed: August 8, 2023
Publication date: December 14, 2023
Inventors: Jaclyn Carley KNAPP, Andrew Paul MCGOVERN, Harris SYED, Chad Steven ESTES, Jesse Daniel Eskes RUSAK, David Ernesto Heekin BURKETT, Allison Anne O'MAHONY, Ashok KUPPUSAMY, Jonathan Reed HARRIS, Jose Miguel Rady ALLENDE, Diego Hernan CARLOMAGNO, Talon Edward IRELAND, Michael Francis PALERMITI, II, Richard Leigh MAINS, Jayant KRISHNAMURTHY
-
Patent number: 11789696
Abstract: Various embodiments discussed herein enable client applications to be heavily integrated with a voice assistant, both to perform commands associated with users' voice utterances via voice assistant functionality and to seamlessly cause client applications to automatically perform native functions as part of executing the voice utterance. For example, some embodiments can automatically and intelligently switch to the page the user needs and populate particular fields of that page based on a user view context and the voice utterance.
Type: Grant
Filed: June 30, 2021
Date of Patent: October 17, 2023
Assignee: Microsoft Technology Licensing, LLC
Inventors: Jaclyn Carley Knapp, Andrew Paul McGovern, Harris Syed, Chad Steven Estes, Jesse Daniel Eskes Rusak, David Ernesto Heekin Burkett, Allison Anne O'Mahony, Ashok Kuppusamy, Jonathan Reed Harris, Jose Miguel Rady Allende, Diego Hernan Carlomagno, Talon Edward Ireland, Michael Francis Palermiti, II, Richard Leigh Mains, Jayant Krishnamurthy
-
Publication number: 20220308828
Abstract: Various embodiments discussed herein enable client applications to be heavily integrated with a voice assistant, both to perform commands associated with users' voice utterances via voice assistant functionality and to seamlessly cause client applications to automatically perform native functions as part of executing the voice utterance. For example, some embodiments can automatically and intelligently switch to the page the user needs and populate particular fields of that page based on a user view context and the voice utterance.
Type: Application
Filed: June 30, 2021
Publication date: September 29, 2022
Inventors: Jaclyn Carley KNAPP, Andrew Paul MCGOVERN, Harris SYED, Chad Steven ESTES, Jesse Daniel Eskes Rusak, David Ernesto Heekin Burkett, Allison Anne O'Mahony, Ashok Kuppusamy, Jonathan Reed Harris, Jose Miguel Rady Allende, Diego Hernan Carlomagno, Talon Edward Ireland, Michael Francis Palermiti, II, Richard Leigh Mains, Jayant Krishnamurthy
-
Publication number: 20220308718
Abstract: Various embodiments discussed herein enable client applications to be heavily integrated with a voice assistant in order to perform commands associated with users' voice utterances via voice assistant functionality and to seamlessly cause client applications to automatically perform native functions as part of executing the voice utterance. Such heavy integration also allows particular embodiments to support multi-modal input from a user within a single conversational interaction. In this way, client application user interface interactions, such as clicks, touch gestures, or text inputs, are executed as an alternative, or in addition, to the voice utterances.
Type: Application
Filed: October 22, 2021
Publication date: September 29, 2022
Inventors: Tudor Buzasu KLEIN, Viktoriya TARANOV, Sergiy GAVRYLENKO, Jaclyn Carley KNAPP, Andrew Paul MCGOVERN, Harris SYED, Chad Steven ESTES, Jesse Daniel Eskes RUSAK, David Ernesto Heekin BURKETT, Allison Anne O'MAHONY, Ashok KUPPUSAMY, Jonathan Reed HARRIS, Jose Miguel Rady ALLENDE, Diego Hernan CARLOMAGNO, Talon Edward IRELAND, Michael Francis PALERMITI, II, Richard Leigh MAINS, Jayant KRISHNAMURTHY
-
Publication number: 20150278370
Abstract: One or more techniques and/or systems are provided for facilitating task completion. For example, a natural language input (e.g., “where should we eat”) may be received from a user of a client device. The natural language input may be evaluated using a set of user contextual signals, opted-in for exposure by the user for facilitating task completion, to identify a user task intent. For example, a user task intent of viewing a local Mexican restaurant menu may be identified based upon a social network post of the user indicating that the user is meeting a friend for Mexican food. Task completion functionality may be exposed to the user based upon the user task intent. For example, a restaurant app may be deep launched to display a menu of a local Mexican restaurant.
Type: Application
Filed: April 1, 2014
Publication date: October 1, 2015
Inventors: Kevin Niels Stratvert, Yu-Ting Kuo, Andrew Paul McGovern, Xiao Wei, Gaurav Anand, Thomas Lin, Adam C. Lusch
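The flow in the abstract above — natural language input plus opted-in contextual signals yields a task intent, which selects an app deep link — can be sketched as follows. The candidate intents, the word-overlap scoring rule, and the deep-link scheme are all assumptions for demonstration, not the patented method.

```python
# Illustrative sketch: pick the task intent whose keyword set best overlaps
# the words in the user's input combined with their contextual signals, then
# map that intent to a (hypothetical) app deep link.
def infer_task_intent(utterance: str, signals: list) -> str:
    """Return the candidate intent sharing the most words with the context."""
    context = set(utterance.lower().split())
    for s in signals:
        context |= set(s.lower().split())
    candidates = {
        "view_restaurant_menu": {"eat", "restaurant", "food", "menu"},
        "book_flight": {"fly", "flight", "airport", "travel"},
    }
    return max(candidates, key=lambda c: len(candidates[c] & context))

def deep_link(intent: str) -> str:
    # Hypothetical deep-link mapping; a real system would target an
    # installed app with the appropriate URI scheme.
    links = {"view_restaurant_menu": "restaurantapp://menu?cuisine=mexican"}
    return links.get(intent, "search://fallback")

intent = infer_task_intent("where should we eat",
                           ["meeting a friend for Mexican food"])
```

With the example input, "eat" and "food" overlap the restaurant intent's keywords while nothing matches the flight intent, so the restaurant menu task wins and its deep link would be launched.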
-
Patent number: 8977625
Abstract: Methods, systems, and media are provided for facilitating generation of an inference index. In embodiments, a canonical entity is referenced. The canonical entity is associated with web documents. One or more queries that, when input, result in a selection of at least one of the web documents are identified. An entity document is generated for the canonical entity. The entity document includes the identified queries and/or associated text from the content of a document or from an entity title that result in the selection of the at least one of the web documents. The entity document and corresponding canonical entity can be combined with additional related entity documents and canonical entities to generate an inference index.
Type: Grant
Filed: December 15, 2010
Date of Patent: March 10, 2015
Assignee: Microsoft Technology Licensing, LLC
Inventors: Gregory T. Buehrer, Li Jiang, Paul Alfred Viola, Andrew Paul McGovern, Jakub Jan Szymanski, Sanaz Ahari
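The entity-document construction step the abstract above describes can be sketched as below: queries whose clicks landed on one of a canonical entity's web documents are gathered into an "entity document" that could later feed an inference index. The data shapes (`entity_to_urls`, a query click log of pairs) are invented for illustration.

```python
# Simplified sketch: invert the entity -> URLs mapping, then attribute each
# clicked query to the entity owning the clicked URL, building one entity
# document (a list of queries) per canonical entity.
from collections import defaultdict

def build_entity_documents(entity_to_urls, query_click_log):
    """entity_to_urls: entity name -> set of its web document URLs.
    query_click_log: iterable of (query, clicked_url) pairs."""
    url_to_entity = {u: e for e, urls in entity_to_urls.items() for u in urls}
    entity_docs = defaultdict(list)
    for query, url in query_click_log:
        entity = url_to_entity.get(url)
        # Keep each query once per entity, in first-seen order.
        if entity is not None and query not in entity_docs[entity]:
            entity_docs[entity].append(query)
    return dict(entity_docs)

docs = build_entity_documents(
    {"Space Needle": {"spaceneedle.com",
                      "en.wikipedia.org/wiki/Space_Needle"}},
    [("seattle tower", "spaceneedle.com"),
     ("space needle hours", "spaceneedle.com")])
```

The abstract also mentions folding in associated text from document content or entity titles; that would simply append further strings to the same per-entity lists before indexing.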
-
Publication number: 20120158738
Abstract: Methods, systems, and media are provided for facilitating generation of an inference index. In embodiments, a canonical entity is referenced. The canonical entity is associated with web documents. One or more queries that, when input, result in a selection of at least one of the web documents are identified. An entity document is generated for the canonical entity. The entity document includes the identified queries and/or associated text from the content of a document or from an entity title that result in the selection of the at least one of the web documents. The entity document and corresponding canonical entity can be combined with additional related entity documents and canonical entities to generate an inference index.
Type: Application
Filed: December 15, 2010
Publication date: June 21, 2012
Applicant: Microsoft Corporation
Inventors: Gregory T. Buehrer, Li Jiang, Paul Alfred Viola, Andrew Paul McGovern, Jakub Jan Szymanski, Sanaz Ahari
-
Publication number: 20110191014
Abstract: Maps of a particular location are often generated with an inset area map, rendered in an inset region of the location map, that shows the general area including the location at a lower zoom level than the location map. However, this presentation may be disadvantageous in some scenarios (e.g., where a user is interested in examining the extended area around the location, the area between two locations, or the spatial layout of the locations). Instead, a composite map may be generated comprising an area map at an area map zoom level, and an inset location map rendered in an inset region of the area map and illustrating a location at a higher zoom level than the area map zoom level. Such composite maps may also be requested programmatically of a map-generating service, and may be provided in an automated manner for use in an application.
Type: Application
Filed: February 4, 2010
Publication date: August 4, 2011
Applicant: Microsoft Corporation
Inventors: Shuangtong Feng, Chad Raynor, Andrew Paul McGovern, Michael John Narayan
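The composite-map structure the abstract above describes — an area map at one zoom level with an inset region rendering the location at a *higher* zoom level, inverting the conventional lower-zoom inset — can be sketched as a small data model. The types, parameter names, and the programmatic request function are hypothetical.

```python
# Sketch of a composite map: an area map plus an inset location map at a
# strictly higher zoom level, placed in a pixel region of the area map.
from dataclasses import dataclass

@dataclass
class MapSpec:
    center: tuple  # (latitude, longitude)
    zoom: int      # higher number = closer in

@dataclass
class CompositeMap:
    area: MapSpec        # extended area around (or between) locations
    inset: MapSpec       # the specific location, zoomed in
    inset_region: tuple  # (x, y, width, height) in pixels within the area map

def make_composite(location, area_zoom=10, inset_zoom=16,
                   inset_region=(10, 10, 160, 120)):
    """Hypothetical programmatic request for a composite map."""
    if inset_zoom <= area_zoom:
        # The defining property: the inset shows MORE detail than the area map.
        raise ValueError("inset must use a higher zoom than the area map")
    return CompositeMap(MapSpec(location, area_zoom),
                        MapSpec(location, inset_zoom),
                        inset_region)

cmap = make_composite((47.6205, -122.3493))
```

The zoom-level check encodes the abstract's key inversion: a request whose inset is not more detailed than its surrounding area map is rejected rather than silently producing a conventional overview inset.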