Patents by Inventor Dan Banay
Dan Banay has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 8660843
Abstract: Systems and methods are described that utilize an interaction manager to manage interactions (also known as requests or dialogues) from one or more applications. The interactions are managed properly even if multiple applications use different grammars. The interaction manager maintains a priority for each interaction, such as via an interaction list, in which priority corresponds to the order in which the interactions are to be processed. Interactions are normally processed in the order in which they are received. However, the systems and methods described herein may provide a grace period after processing a first interaction and before processing a second interaction. If a third interaction that is chained to the first interaction is received during this grace period, then the third interaction may be processed before the second interaction.
Type: Grant
Filed: January 23, 2013
Date of Patent: February 25, 2014
Assignee: Microsoft Corporation
Inventors: Stephen Russell Falcon, Clement Chun Pong Yip, Dan Banay, David Michael Miller
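The grace-period mechanism this abstract describes can be sketched in a few lines. This is a minimal illustration, not the patented implementation; the class and method names are invented for the example.

```python
from collections import deque

class InteractionManager:
    """Sketch: a priority-ordered interaction list with a grace period
    that lets an interaction chained to the one just processed jump
    ahead of interactions already waiting."""

    def __init__(self):
        self.queue = deque()          # interactions in processing order
        self.last_processed = None    # most recently processed interaction
        self.in_grace_period = False  # True right after processing finishes

    def submit(self, interaction_id, chained_to=None):
        # During the grace period, an interaction chained to the one just
        # processed moves to the front; otherwise append in arrival order.
        if self.in_grace_period and chained_to == self.last_processed:
            self.queue.appendleft(interaction_id)
        else:
            self.queue.append(interaction_id)

    def process_next(self):
        if not self.queue:
            return None
        current = self.queue.popleft()
        self.last_processed = current
        self.in_grace_period = True   # window in which a chained follow-up wins
        return current


mgr = InteractionManager()
mgr.submit("first")
processed = mgr.process_next()           # "first" is processed
mgr.submit("second")                     # waiting in line
mgr.submit("third", chained_to="first")  # arrives during the grace period
next_up = mgr.process_next()             # "third" runs before "second"
```

The chained interaction submitted during the grace period is processed before the earlier-queued one, matching the behavior the abstract describes.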
-
Patent number: 8447616
Abstract: Systems and methods are described for a speech system that manages multiple grammars from one or more speech-enabled applications. The speech system includes a speech server that supports different grammars and different types of grammars by exposing several methods to the speech-enabled applications. The speech server supports static grammars that do not change and dynamic grammars that may change after a commit. The speech server provides persistence by supporting persistent grammars that enable a user to issue a command to an application even when the application is not loaded. In such a circumstance, the application is automatically launched and the command is processed. The speech server may enable or disable a grammar in order to limit confusion between grammars. Global and yielding grammars are also supported by the speech server. Global grammars are always active (e.g., “call 9-1-1”) while yielding grammars may be deactivated when an interaction whose grammar requires priority is active.
Type: Grant
Filed: March 31, 2010
Date of Patent: May 21, 2013
Assignee: Microsoft Corporation
Inventors: Stephen Russell Falcon, David Michael Miller, Dan Banay, Clement Chun Pong Yip
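The grammar-management behavior in the abstract (global grammars that stay active, yielding grammars that deactivate under a priority interaction, persistent grammars that auto-launch their application) can be sketched as follows. This is an illustrative model only; the names and data layout are assumptions, not taken from the patent.

```python
class SpeechServer:
    """Sketch of the grammar management described in the abstract."""

    def __init__(self):
        self.grammars = {}      # name -> {"kind": ..., "active": ..., "app": ...}
        self.loaded_apps = set()

    def register(self, name, kind="yielding", app=None):
        self.grammars[name] = {"kind": kind, "active": True, "app": app}

    def begin_priority_interaction(self):
        # Yielding grammars deactivate; global grammars stay live.
        for g in self.grammars.values():
            if g["kind"] == "yielding":
                g["active"] = False

    def end_priority_interaction(self):
        for g in self.grammars.values():
            g["active"] = True

    def recognize(self, name):
        g = self.grammars.get(name)
        if g is None or not g["active"]:
            return None
        if g["kind"] == "persistent" and g["app"] not in self.loaded_apps:
            self.loaded_apps.add(g["app"])  # auto-launch the owning application
        return name


server = SpeechServer()
server.register("call 9-1-1", kind="global")
server.register("play music", kind="yielding", app="media_player")
server.register("check voicemail", kind="persistent", app="phone")

server.begin_priority_interaction()
active_now = server.recognize("call 9-1-1")    # global grammar stays active
blocked = server.recognize("play music")       # yielding grammar is off
server.end_priority_interaction()
launched = server.recognize("check voicemail") # auto-launches "phone"
```

During the priority interaction only the global grammar responds; afterward the persistent grammar is recognized and its owning application is launched on demand.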
-
Patent number: 8374879
Abstract: Systems and methods are described for speech systems that utilize an interaction manager to manage interactions (also known as dialogues) from one or more applications. The interactions are managed properly even if multiple applications use different grammars. The interaction manager maintains an interaction list. An application wishing to utilize the speech system submits one or more interactions to the interaction manager. Interactions are normally processed in the order in which they are received. An exception to this rule is an interaction that an application configures to be processed immediately, which causes the interaction manager to place it at the front of the interaction list. If an application has designated an interaction to interrupt a currently processing interaction, then the newly submitted interaction will interrupt any interaction currently being processed and will therefore be processed immediately.
Type: Grant
Filed: December 16, 2005
Date of Patent: February 12, 2013
Assignee: Microsoft Corporation
Inventors: Stephen Russell Falcon, Clement Yip, Dan Banay, David Miller
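The ordering rules in this abstract (FIFO by default, front-of-list placement for an interaction marked immediate, and preemption for one marked to interrupt) can be sketched as follows. The flags and structure are illustrative assumptions, not the patented design.

```python
from collections import deque

class InteractionList:
    """Sketch: interactions run in arrival order, except that one marked
    'immediate' goes to the front of the list, and one marked 'interrupt'
    also preempts whatever is currently being processed."""

    def __init__(self):
        self.pending = deque()
        self.current = None

    def submit(self, interaction, immediate=False, interrupt=False):
        if immediate or interrupt:
            self.pending.appendleft(interaction)   # front of the list
        else:
            self.pending.append(interaction)       # normal FIFO order
        if interrupt and self.current is not None:
            # Preempt: requeue the in-progress interaction behind the new one.
            self.pending.insert(1, self.current)
            self.current = None

    def next(self):
        self.current = self.pending.popleft() if self.pending else None
        return self.current


il = InteractionList()
il.submit("navigate")
il.next()                                   # "navigate" is now processing
il.submit("weather")
il.submit("incoming call", interrupt=True)  # preempts "navigate"
preempting = il.next()
```

After the interrupt, the preempted interaction waits at the head of the list and resumes before interactions that arrived earlier in normal order.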
-
Publication number: 20100191529
Abstract: Systems and methods are described for a speech system that manages multiple grammars from one or more speech-enabled applications. The speech system includes a speech server that supports different grammars and different types of grammars by exposing several methods to the speech-enabled applications. The speech server supports static grammars that do not change and dynamic grammars that may change after a commit. The speech server provides persistence by supporting persistent grammars that enable a user to issue a command to an application even when the application is not loaded. In such a circumstance, the application is automatically launched and the command is processed. The speech server may enable or disable a grammar in order to limit confusion between grammars. Global and yielding grammars are also supported by the speech server. Global grammars are always active (e.g., “call 9-1-1”) while yielding grammars may be deactivated when an interaction whose grammar requires priority is active.
Type: Application
Filed: March 31, 2010
Publication date: July 29, 2010
Applicant: Microsoft Corporation
Inventors: Stephen Russell Falcon, Clement Chun Pong Yip, David Michael Miller, Dan Banay
-
Patent number: 7742925
Abstract: Systems and methods are described for a speech system that includes one or more speech controls incorporated into one or more speech-enabled applications that run on the speech system. The controls allow applications to be developed with minimal programming effort to incorporate common speech-enabled application functions. A question control provides a customizable template for requesting information from a user. An announcer control allows a speech-enabled application to provide a user with information without having to re-create an entire announcer process each time it is used. A command control provides a simple way to attach command and control functions to speech-enabled applications. A word trainer control provides a way to associate user-selected voice tags with certain information. Providing the controls for use with speech-enabled applications ensures standardized user interfaces across multiple speech-enabled applications.
Type: Grant
Filed: December 19, 2005
Date of Patent: June 22, 2010
Assignee: Microsoft Corporation
Inventors: Stephen R Falcon, Clement Yip, Dan Banay, David M Miller
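The reusable speech controls this abstract describes (question, announcer, and command controls) can be sketched as simple classes. These are hypothetical illustrations of the control pattern, with invented names and stand-in behavior in place of real speech I/O.

```python
class QuestionControl:
    """Customizable template for asking the user for information."""
    def __init__(self, prompt, choices):
        self.prompt = prompt
        self.choices = choices

    def ask(self, answer):
        # A real control would prompt via text-to-speech and listen for a
        # reply; here we just validate an answer against allowed choices.
        return answer if answer in self.choices else None


class AnnouncerControl:
    """Plays informational messages without re-creating an announcer flow."""
    def __init__(self):
        self.log = []

    def announce(self, message):
        self.log.append(message)   # stand-in for text-to-speech output
        return message


class CommandControl:
    """Attaches command-and-control phrases to application callbacks."""
    def __init__(self):
        self.commands = {}

    def register(self, phrase, callback):
        self.commands[phrase] = callback

    def dispatch(self, phrase):
        handler = self.commands.get(phrase)
        return handler() if handler else None


q = QuestionControl("Which city?", {"Seattle", "Portland"})
city = q.ask("Seattle")
a = AnnouncerControl()
msg = a.announce("You have new mail")
cmd = CommandControl()
cmd.register("check mail", lambda: "3 new messages")
result = cmd.dispatch("check mail")
```

Because each application reuses the same control classes rather than reimplementing prompting and dispatch, the user-facing behavior stays consistent across applications, which is the standardization benefit the abstract claims.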
-
Patent number: 7720678
Abstract: Systems and methods are described for a speech system that manages multiple grammars from one or more speech-enabled applications. The speech system includes a speech server that supports different grammars and different types of grammars by exposing several methods to the speech-enabled applications. The speech server supports static grammars that do not change and dynamic grammars that may change after a commit. The speech server provides persistence by supporting persistent grammars that enable a user to issue a command to an application even when the application is not loaded. In such a circumstance, the application is automatically launched and the command is processed. The speech server may enable or disable a grammar in order to limit confusion between grammars. Global and yielding grammars are also supported by the speech server. Global grammars are always active (e.g., “call 9-1-1”) while yielding grammars may be deactivated when an interaction whose grammar requires priority is active.
Type: Grant
Filed: December 16, 2005
Date of Patent: May 18, 2010
Assignee: Microsoft Corporation
Inventors: Steve Falcon, Clement Yip, David Miller, Dan Banay
-
Patent number: 7363229
Abstract: Systems and methods are described for a speech system that manages multiple grammars from one or more speech-enabled applications. The speech system includes a speech server that supports different grammars and different types of grammars by exposing several methods to the speech-enabled applications. The speech server supports static grammars that do not change and dynamic grammars that may change after a commit. The speech server provides persistence by supporting persistent grammars that enable a user to issue a command to an application even when the application is not loaded. In such a circumstance, the application is automatically launched and the command is processed. The speech server may enable or disable a grammar in order to limit confusion between grammars. Global and yielding grammars are also supported by the speech server. Global grammars are always active (e.g., “call 9-1-1”) while yielding grammars may be deactivated when an interaction whose grammar requires priority is active.
Type: Grant
Filed: November 4, 2005
Date of Patent: April 22, 2008
Assignee: Microsoft Corporation
Inventors: Stephen Russell Falcon, Clement Chun Pong Yip, Dan Banay, David Michael Miller
-
Publication number: 20070268317Abstract: A computer system or computing device includes a display for displaying the visual output of any number of software applications. A computer-implemented method of selectively displaying a magnified rendering of a portion of the display screen is executed on the computer system or computing device. The method allows the user to select a portion of the display screen for magnification and then displays a magnified rendering of that portion of the display screen. The magnified rendering retains the functional and interactive aspects of the underlying, non-magnified source content. The method also provides a configurable means of controlling the amount of magnification in the magnified rendering. The method permits using the magnified rendering to pan around within the underlying, non-magnified source content.Type: ApplicationFiled: May 18, 2006Publication date: November 22, 2007Inventor: Dan Banay
-
Patent number: 7299185Abstract: Systems and methods are described for speech systems that utilize an interaction manager to manage interactions—also known as dialogues—from one or more applications. The interactions are managed properly even if multiple applications use different grammars. The interaction manager maintains an interaction list. An application wishing to utilize the speech system submits one or more interactions to the interaction manager. Interactions are normally processed in the order in which they are received. An exception to this rule is an interaction that is configured by an application to be processed immediately, which causes the interaction manager to place the interaction at the front of the interaction list of interactions. If an application has designated an interaction to interrupt a currently processing interaction, then the newly submitted application will interrupt any interaction currently being processed and, therefore, it will be processed immediately.Type: GrantFiled: November 1, 2005Date of Patent: November 20, 2007Assignee: Microsoft CorporationInventors: Stephen Russell Falcon, Clement Chun Pong Yip, Dan Banay, David Michael Miller
-
Patent number: 7257776
Abstract: Systems and methods are described for scaling a graphical user interface (GUI) to fit proportionally in displays of different sizes. Bounds of display objects to be displayed in the graphical user interface are defined in terms of position relative to horizontal and vertical dimensions of a display on which the GUI is rendered. An application defines the GUI in relative terms, but an end user may alter the look and feel of controls in the GUI. A tiered sizing schema is described that provides size constraints for display objects. The end user is limited as to how much a size of a display object can be altered in order to preserve the integrity of the original specifications of the GUI when the GUI is displayed on displays of various dimensions.
Type: Grant
Filed: February 5, 2002
Date of Patent: August 14, 2007
Assignee: Microsoft Corporation
Inventors: Richard St. Clair Bailey, Stephen Russell Falcon, Dan Banay
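The two ideas in this abstract, relative bounds resolved against the actual display dimensions and a tiered schema that limits how far a control can be resized, can be sketched as below. The tier names and pixel limits here are invented for illustration; the patent does not specify these values.

```python
def scale_bounds(rel_bounds, display_w, display_h):
    """Convert relative bounds (fractions of the display) to pixels.
    rel_bounds = (left, top, width, height), each in [0, 1]."""
    l, t, w, h = rel_bounds
    return (round(l * display_w), round(t * display_h),
            round(w * display_w), round(h * display_h))


def clamp_to_tier(size_px, tier):
    """Tiered sizing schema (illustrative tiers and limits): each tier
    bounds how far the user may resize a control, preserving the
    integrity of the GUI's original specification."""
    tiers = {"small": (16, 32), "medium": (24, 64), "large": (48, 128)}
    lo, hi = tiers[tier]
    return max(lo, min(hi, size_px))


# A button defined once in relative terms renders proportionally
# on displays of different sizes.
button = (0.1, 0.1, 0.25, 0.1)
on_small = scale_bounds(button, 800, 600)     # (80, 60, 200, 60)
on_large = scale_bounds(button, 1920, 1080)   # (192, 108, 480, 108)

# A user request to grow the control's height to 200 px is capped
# by its tier's upper limit.
capped = clamp_to_tier(200, "medium")
```

Because the application states only fractions, the same definition yields a proportionally identical layout on both displays, while the tier clamp keeps user customization within bounds the designer sanctioned.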
-
Patent number: 7254545
Abstract: Systems and methods are described for a speech system that includes one or more speech controls incorporated into one or more speech-enabled applications that run on the speech system. The controls allow applications to be developed with minimal programming effort to incorporate common speech-enabled application functions. A question control provides a customizable template for requesting information from a user. An announcer control allows a speech-enabled application to provide a user with information without having to re-create an entire announcer process each time it is used. A command control provides a simple way to attach command and control functions to speech-enabled applications. A word trainer control provides a way to associate user-selected voice tags with certain information. Providing the controls for use with speech-enabled applications ensures standardized user interfaces across multiple speech-enabled applications.
Type: Grant
Filed: November 2, 2005
Date of Patent: August 7, 2007
Assignee: Microsoft Corporation
Inventors: Stephen R Falcon, Clement Chun Pong Yip, Dan Banay, David M Miller
-
Publication number: 20070143115
Abstract: Systems and methods are described for speech systems that utilize an interaction manager to manage interactions (also known as dialogues) from one or more applications. The interactions are managed properly even if multiple applications use different grammars. The interaction manager maintains an interaction list. An application wishing to utilize the speech system submits one or more interactions to the interaction manager. Interactions are normally processed in the order in which they are received. An exception to this rule is an interaction that an application configures to be processed immediately, which causes the interaction manager to place it at the front of the interaction list. If an application has designated an interaction to interrupt a currently processing interaction, then the newly submitted interaction will interrupt any interaction currently being processed and will therefore be processed immediately.
Type: Application
Filed: December 16, 2005
Publication date: June 21, 2007
Applicant: Microsoft Corporation
Inventors: Stephen Falcon, Clement Yip, Dan Banay, David Miller
-
Patent number: 7188066
Abstract: Systems and methods are described for a speech system that includes one or more speech controls incorporated into one or more speech-enabled applications that run on the speech system. The controls allow applications to be developed with minimal programming effort to incorporate common speech-enabled application functions. A question control provides a customizable template for requesting information from a user. An announcer control allows a speech-enabled application to provide a user with information without having to re-create an entire announcer process each time it is used. A command control provides a simple way to attach command and control functions to speech-enabled applications. A word trainer control provides a way to associate user-selected voice tags with certain information. Providing the controls for use with speech-enabled applications ensures standardized user interfaces across multiple speech-enabled applications.
Type: Grant
Filed: February 4, 2002
Date of Patent: March 6, 2007
Assignee: Microsoft Corporation
Inventors: Stephen Russell Falcon, Clement Chun Pong Yip, Dan Banay, David Michael Miller
-
Patent number: 7167831
Abstract: Systems and methods are described for a speech system that manages multiple grammars from one or more speech-enabled applications. The speech system includes a speech server that supports different grammars and different types of grammars by exposing several methods to the speech-enabled applications. The speech server supports static grammars that do not change and dynamic grammars that may change after a commit. The speech server provides persistence by supporting persistent grammars that enable a user to issue a command to an application even when the application is not loaded. In such a circumstance, the application is automatically launched and the command is processed. The speech server may enable or disable a grammar in order to limit confusion between grammars. Global and yielding grammars are also supported by the speech server. Global grammars are always active (e.g., “call 9-1-1”) while yielding grammars may be deactivated when an interaction whose grammar requires priority is active.
Type: Grant
Filed: February 4, 2002
Date of Patent: January 23, 2007
Assignee: Microsoft Corporation
Inventors: Stephen Russell Falcon, Clement Chun Pong Yip, David Michael Miller, Dan Banay
-
Patent number: 7139713
Abstract: Systems and methods are described for speech systems that utilize an interaction manager to manage interactions (also known as dialogues) from one or more applications. The interactions are managed properly even if multiple applications use different grammars. The interaction manager maintains an interaction list. An application wishing to utilize the speech system submits one or more interactions to the interaction manager. Interactions are normally processed in the order in which they are received. An exception to this rule is an interaction that an application configures to be processed immediately, which causes the interaction manager to place it at the front of the interaction list. If an application has designated an interaction to interrupt a currently processing interaction, then the newly submitted interaction will interrupt any interaction currently being processed and will therefore be processed immediately.
Type: Grant
Filed: February 4, 2002
Date of Patent: November 21, 2006
Assignee: Microsoft Corporation
Inventors: Stephen Russell Falcon, Clement Chun Pong Yip, Dan Banay, David Michael Miller
-
Publication number: 20060161429
Abstract: Systems and methods are described for a speech system that manages multiple grammars from one or more speech-enabled applications. The speech system includes a speech server that supports different grammars and different types of grammars by exposing several methods to the speech-enabled applications. The speech server supports static grammars that do not change and dynamic grammars that may change after a commit. The speech server provides persistence by supporting persistent grammars that enable a user to issue a command to an application even when the application is not loaded. In such a circumstance, the application is automatically launched and the command is processed. The speech server may enable or disable a grammar in order to limit confusion between grammars. Global and yielding grammars are also supported by the speech server. Global grammars are always active (e.g., “call 9-1-1”) while yielding grammars may be deactivated when an interaction whose grammar requires priority is active.
Type: Application
Filed: December 16, 2005
Publication date: July 20, 2006
Applicant: Microsoft Corporation
Inventors: Stephen Falcon, Clement Yip, David Miller, Dan Banay
-
Publication number: 20060106617
Abstract: Systems and methods are described for a speech system that includes one or more speech controls incorporated into one or more speech-enabled applications that run on the speech system. The controls allow applications to be developed with minimal programming effort to incorporate common speech-enabled application functions. A question control provides a customizable template for requesting information from a user. An announcer control allows a speech-enabled application to provide a user with information without having to re-create an entire announcer process each time it is used. A command control provides a simple way to attach command and control functions to speech-enabled applications. A word trainer control provides a way to associate user-selected voice tags with certain information. Providing the controls for use with speech-enabled applications ensures standardized user interfaces across multiple speech-enabled applications.
Type: Application
Filed: December 19, 2005
Publication date: May 18, 2006
Applicant: Microsoft Corporation
Inventors: Stephen Falcon, Clement Yip, Dan Banay, David Miller
-
Publication number: 20060069573
Abstract: Systems and methods are described for a speech system that includes one or more speech controls incorporated into one or more speech-enabled applications that run on the speech system. The controls allow applications to be developed with minimal programming effort to incorporate common speech-enabled application functions. A question control provides a customizable template for requesting information from a user. An announcer control allows a speech-enabled application to provide a user with information without having to re-create an entire announcer process each time it is used. A command control provides a simple way to attach command and control functions to speech-enabled applications. A word trainer control provides a way to associate user-selected voice tags with certain information. Providing the controls for use with speech-enabled applications ensures standardized user interfaces across multiple speech-enabled applications.
Type: Application
Filed: November 2, 2005
Publication date: March 30, 2006
Applicant: Microsoft Corporation
Inventors: Stephen Falcon, Clement Yip, Dan Banay, David Miller
-
Publication number: 20060069571Abstract: Systems and methods are described for speech systems that utilize an interaction manager to manage interactions—also known as dialogues—from one or more applications. The interactions are managed properly even if multiple applications use different grammars. The interaction manager maintains an interaction list. An application wishing to utilize the speech system submits one or more interactions to the interaction manager. Interactions are normally processed in the order in which they are received. An exception to this rule is an interaction that is configured by an application to be processed immediately, which causes the interaction manager to place the interaction at the front of the interaction list of interactions. If an application has designated an interaction to interrupt a currently processing interaction, then the newly submitted application will interrupt any interaction currently being processed and, therefore, it will be processed immediately.Type: ApplicationFiled: November 1, 2005Publication date: March 30, 2006Applicant: Microsoft CorporationInventors: Stephen Falcon, Clement Yip, Dan Banay, David Miller
-
Publication number: 20060053016
Abstract: Systems and methods are described for a speech system that manages multiple grammars from one or more speech-enabled applications. The speech system includes a speech server that supports different grammars and different types of grammars by exposing several methods to the speech-enabled applications. The speech server supports static grammars that do not change and dynamic grammars that may change after a commit. The speech server provides persistence by supporting persistent grammars that enable a user to issue a command to an application even when the application is not loaded. In such a circumstance, the application is automatically launched and the command is processed. The speech server may enable or disable a grammar in order to limit confusion between grammars. Global and yielding grammars are also supported by the speech server. Global grammars are always active (e.g., “call 9-1-1”) while yielding grammars may be deactivated when an interaction whose grammar requires priority is active.
Type: Application
Filed: November 4, 2005
Publication date: March 9, 2006
Applicant: Microsoft Corporation
Inventors: Stephen Falcon, Clement Yip, Dan Banay, David Miller