Patents by Inventor Josh A. Clow
Josh A. Clow has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 10606564
Abstract: Architecture that generates a companion window in combination with a source application experience, enabling the user to accomplish a side task without switching away from the context of the source application. The companion window experience is a window that is rendered proximate (e.g., beside) a user's source application experience, in a predictable location, and with a predictable user model for invocation and dismissal. The companion window allows the user to retain full visual context of the associated source application experience, while rendering activities that directly pertain to the source application experience or activities that allow the user to interact with two applications.
Type: Grant
Filed: December 27, 2010
Date of Patent: March 31, 2020
Assignee: Microsoft Technology Licensing, LLC
Inventors: Jonathan Gordner, Stephan Hoefnagels, Josh Clow, Colin Jeanne, Alexander Allen, Kenneth Parker, Nandini Bhattacharya, Jonathan Li, Kieran Snyder
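The placement behavior this abstract describes (a companion window rendered beside the source application, in a predictable location, while the source keeps full visual context) can be illustrated with a short sketch. This is an editor's illustration, not code from the patent; the `Rect` type, the right-then-left fallback, and the 360-pixel default width are all assumptions:

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x: float
    y: float
    w: float
    h: float

def companion_window_rect(source: Rect, screen: Rect, companion_width: float = 360) -> Rect:
    """Place a companion window beside the source window: prefer the space to
    its right, fall back to its left, and match its vertical extent so the
    source application experience stays fully visible."""
    right_space = (screen.x + screen.w) - (source.x + source.w)
    if right_space >= companion_width:
        cx = source.x + source.w  # predictable location: flush against the source's right edge
    else:
        cx = max(screen.x, source.x - companion_width)  # fall back to the left side
    return Rect(cx, source.y, companion_width, source.h)
```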
-
Patent number: 9329768
Abstract: Computer-readable media, computerized methods, and computer systems for intuitively invoking a panning action (e.g., moving content within a content region of a display area) by applying a user-initiated input at the content region rendered at a touchscreen interface are provided. Initially, aspects of the user-initiated input include a location of actuation (e.g., touch point on the touchscreen interface) and a gesture. Upon ascertaining that the actuation location occurred within the content region and that the gesture is a drag operation, based on a distance of uninterrupted tactile contact with the touchscreen interface, a panning mode may be initiated. When in the panning mode, and if the application rendering the content at the display area supports scrolling functionality, the gesture will control movement of the content within the content region. In particular, the drag operation of the gesture will pan the content within the display area when surfaced at the touchscreen interface.
Type: Grant
Filed: February 11, 2013
Date of Patent: May 3, 2016
Assignee: Microsoft Technology Licensing, LLC
Inventors: David A. Matthews, Jan-Kristian Markiewicz, Reed L. Townsend, Pamela De La Torre Baltierra, Todd A. Torset, Josh A. Clow, Xiao Tu, Leroy B. Keely
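The invocation logic in this abstract reduces to three checks: the touch began inside the content region, the uninterrupted contact distance marks the gesture as a drag rather than a tap, and the application supports scrolling. A minimal sketch of that decision, with invented names and an assumed 10-pixel threshold (neither is from the patent):

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

DRAG_THRESHOLD_PX = 10.0  # contact distance separating a drag from a tap (assumed value)

def should_enter_panning_mode(content_region: Rect,
                              touch_down: tuple,
                              touch_current: tuple,
                              supports_scrolling: bool) -> bool:
    """Enter panning mode when the actuation location fell inside the content
    region and the uninterrupted drag distance exceeds the threshold."""
    if not supports_scrolling:
        return False  # panning applies only when the app supports scrolling
    if not content_region.contains(*touch_down):
        return False  # actuation must occur within the content region
    dx = touch_current[0] - touch_down[0]
    dy = touch_current[1] - touch_down[1]
    return (dx * dx + dy * dy) ** 0.5 >= DRAG_THRESHOLD_PX
```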
-
Patent number: 9081439
Abstract: User interfaces, methods, systems, and computer-readable media for activating and/or displaying text input systems on display devices may include: (a) displaying a text input system activation target at a user changeable location on a display device; (b) receiving user input directed to the activation target; and (c) activating a text input system in response to the user input. Such user interfaces, methods, and systems further may include (d) displaying a pre-interaction condition of the activation target; (e) receiving user input directed to the activation target in this pre-interaction condition; and (f) changing an appearance of the activation target from the pre-interaction condition to a larger size and/or a different visual appearance in response to this user input. Additional aspects of this invention relate to computer-readable media for providing user interfaces, systems, and methods as described above.
Type: Grant
Filed: December 27, 2013
Date of Patent: July 14, 2015
Assignee: Microsoft Technology Licensing, LLC
Inventors: Adrian J. Garside, F. David Jones, Josh A. Clow, Judy C. Tandog, Leroy B. Keely, Tracy Dianne Schultz
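Steps (d) through (f) of this abstract describe a small state machine: the activation target sits in a pre-interaction condition until input is directed at it, then takes on a larger size or different appearance, and a completed input activates the text input system. A hedged sketch of that life cycle (class and method names are invented for illustration):

```python
class TextInputActivationTarget:
    """Sketch of the activation-target life cycle: 'pre' (small, unobtrusive)
    -> 'engaged' (larger / different appearance) -> text input system active."""

    def __init__(self, position: float):
        self.position = position         # (a) user-changeable location on the display
        self.state = "pre"               # (d) pre-interaction condition
        self.input_system_active = False

    def on_pointer_over(self):
        # (e)/(f): input directed at the pre-interaction target changes its appearance
        if self.state == "pre":
            self.state = "engaged"

    def on_tap(self):
        # (b)/(c): input directed at the target activates the text input system
        self.input_system_active = True
        self.state = "pre"

    def move_to(self, position: float):
        # (a): the target's location is user changeable
        self.position = position
```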
-
Publication number: 20140111440
Abstract: User interfaces, methods, systems, and computer-readable media for activating and/or displaying text input systems on display devices may include: (a) displaying a text input system activation target at a user changeable location on a display device; (b) receiving user input directed to the activation target; and (c) activating a text input system in response to the user input. Such user interfaces, methods, and systems further may include (d) displaying a pre-interaction condition of the activation target; (e) receiving user input directed to the activation target in this pre-interaction condition; and (f) changing an appearance of the activation target from the pre-interaction condition to a larger size and/or a different visual appearance in response to this user input. Additional aspects of this invention relate to computer-readable media for providing user interfaces, systems, and methods as described above.
Type: Application
Filed: December 27, 2013
Publication date: April 24, 2014
Applicant: Microsoft Corporation
Inventors: Adrian J. Garside, F. David Jones, Josh A. Clow, Judy C. Tandog, Leroy B. Keely, Tracy Dianne Schultz
-
Patent number: 8619053
Abstract: User interfaces, methods, systems, and computer-readable media for activating and/or displaying text input systems on display devices may include: (a) displaying a text input system activation target at a user changeable location on a display device; (b) receiving user input directed to the activation target; and (c) activating a text input system in response to the user input. Such user interfaces, methods, and systems further may include (d) displaying a pre-interaction condition of the activation target; (e) receiving user input directed to the activation target in this pre-interaction condition; and (f) changing an appearance of the activation target from the pre-interaction condition to a larger size and/or a different visual appearance in response to this user input. Additional aspects of this invention relate to computer-readable media for providing user interfaces, systems, and methods as described above.
Type: Grant
Filed: July 27, 2012
Date of Patent: December 31, 2013
Assignee: Microsoft Corporation
Inventors: Adrian J. Garside, F. David Jones, Josh A. Clow, Judy C. Tandog, Leroy B. Keely, Tracy Dianne Schultz
-
Patent number: 8375336
Abstract: Computer-readable media, computerized methods, and computer systems for intuitively invoking a panning action (e.g., moving content within a content region of a display area) by applying a user-initiated input at the content region rendered at a touchscreen interface are provided. Initially, aspects of the user-initiated input include a location of actuation (e.g., touch point on the touchscreen interface) and a gesture. Upon ascertaining that the actuation location occurred within the content region and that the gesture is a drag operation, based on a distance of uninterrupted tactile contact with the touchscreen interface, a panning mode may be initiated. When in the panning mode, and if the application rendering the content at the display area supports scrolling functionality, the gesture will control movement of the content within the content region. In particular, the drag operation of the gesture will pan the content within the display area when surfaced at the touchscreen interface.
Type: Grant
Filed: October 3, 2008
Date of Patent: February 12, 2013
Assignee: Microsoft Corporation
Inventors: David A. Matthews, Jan-Kristian Markiewicz, Reed L. Townsend, Pamela De La Torre Baltierra, Todd A. Torset, Josh A. Clow, Xiao Tu, Leroy B. Keely
-
Publication number: 20120293418
Abstract: User interfaces, methods, systems, and computer-readable media for activating and/or displaying text input systems on display devices may include: (a) displaying a text input system activation target at a user changeable location on a display device; (b) receiving user input directed to the activation target; and (c) activating a text input system in response to the user input. Such user interfaces, methods, and systems further may include (d) displaying a pre-interaction condition of the activation target; (e) receiving user input directed to the activation target in this pre-interaction condition; and (f) changing an appearance of the activation target from the pre-interaction condition to a larger size and/or a different visual appearance in response to this user input. Additional aspects of this invention relate to computer-readable media for providing user interfaces, systems, and methods as described above.
Type: Application
Filed: July 27, 2012
Publication date: November 22, 2012
Applicant: Microsoft Corporation
Inventors: Adrian J. Garside, F. David Jones, Josh A. Clow, Judy C. Tandog, Leroy B. Keely, Tracy Dianne Schultz
-
Patent number: 8253708
Abstract: User interfaces, methods, systems, and computer-readable media for activating and/or displaying text input systems on display devices may include: (a) displaying a text input system activation target at a user changeable location on a display device; (b) receiving user input directed to the activation target; and (c) activating a text input system in response to the user input. Such user interfaces, methods, and systems further may include (d) displaying a pre-interaction condition of the activation target; (e) receiving user input directed to the activation target in this pre-interaction condition; and (f) changing an appearance of the activation target from the pre-interaction condition to a larger size and/or a different visual appearance in response to this user input. Additional aspects of this invention relate to computer-readable media for providing user interfaces, systems, and methods as described above.
Type: Grant
Filed: July 13, 2009
Date of Patent: August 28, 2012
Assignee: Microsoft Corporation
Inventors: Adrian J. Garside, F. David Jones, Josh A. Clow, Judy C. Tandog, Leroy B. Keely, Tracy Dianne Schultz
-
Publication number: 20120167004Abstract: Architecture that generates a companion window in combination with a source application experience to enable the accomplishment of a side task yet not switch away from the context of the source application. The companion window experience is a window that is rendered proximate (e.g., beside) a user's source application experience, in a predictable location, and with a predictable user model for invocation and dismissal. The companion window allows the user to retain full visual context of the associated source application experience, while rendering activities that directly pertain to the source application experience or activities that allow the user to interact with two applications.Type: ApplicationFiled: December 27, 2010Publication date: June 28, 2012Applicant: MICROSOFT CORPORATIONInventors: Jonathan Ian Gordner, Stephan Hoefnagels, Josh Clow, Colin Jeanne, Alexander Allen, Kenneth Parker, Nandini Bhattacharya, Jonathan Li, Kieran Snyder
-
Patent number: 7634738
Abstract: Systems, methods, and computer-readable media process computer input data (such as electronic ink data, speech input data, keyboard input data, etc.), including focus change data, in a manner so that the input insertion range better comports with the user's original intent. More specifically, user input data may be accepted before, during, and/or after a focus change event is initiated, and the systems and methods will process this input data in an intuitive manner, directing the data to areas of an application program or the operating system that better comport with the user's original intent. In this manner, loss of input data may be avoided and misdirected input data may be avoided, thereby lowering user frustration during focus change events.
Type: Grant
Filed: November 19, 2004
Date of Patent: December 15, 2009
Assignee: Microsoft Corporation
Inventors: Josh A. Clow, Adrian J. Garside, David V. Winkler
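The routing behavior this abstract describes (input the user began composing before a focus change should land in the control they were addressing, not the newly focused one) can be sketched as a small router. All names here are illustrative assumptions, not the patent's terminology:

```python
class FocusChangeInputRouter:
    """Route input around a focus change so it reaches the control the user
    intended, avoiding lost or misdirected input."""

    def __init__(self, focused: str):
        self.focused = focused
        self.in_change = False
        self.old_target = None

    def begin_focus_change(self, new_focused: str):
        self.in_change = True
        self.old_target = self.focused  # in-flight input still belongs here
        self.focused = new_focused

    def end_focus_change(self):
        self.in_change = False
        self.old_target = None

    def route(self, data: str, started_before_change: bool):
        """Return (target, data): input begun before the focus change goes to
        the previously focused control; everything else to the current focus."""
        if self.in_change and started_before_change and self.old_target is not None:
            return (self.old_target, data)
        return (self.focused, data)
```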
-
Publication number: 20090292989
Abstract: Computer-readable media, computerized methods, and computer systems for intuitively invoking a panning action (e.g., moving content within a content region of a display area) by applying a user-initiated input at the content region rendered at a touchscreen interface are provided. Initially, aspects of the user-initiated input include a location of actuation (e.g., touch point on the touchscreen interface) and a gesture. Upon ascertaining that the actuation location occurred within the content region and that the gesture is a drag operation, based on a distance of uninterrupted tactile contact with the touchscreen interface, a panning mode may be initiated. When in the panning mode, and if the application rendering the content at the display area supports scrolling functionality, the gesture will control movement of the content within the content region. In particular, the drag operation of the gesture will pan the content within the display area when surfaced at the touchscreen interface.
Type: Application
Filed: October 3, 2008
Publication date: November 26, 2009
Applicant: Microsoft Corporation
Inventors: David A. Matthews, Jan-Kristian Markiewicz, Reed L. Townsend, Pamela De La Torre Baltierra, Todd A. Torset, Josh A. Clow, Xiao Tu, Leroy B. Keely
-
Publication number: 20090273565
Abstract: User interfaces, methods, systems, and computer-readable media for activating and/or displaying text input systems on display devices may include: (a) displaying a text input system activation target at a user changeable location on a display device; (b) receiving user input directed to the activation target; and (c) activating a text input system in response to the user input. Such user interfaces, methods, and systems further may include (d) displaying a pre-interaction condition of the activation target; (e) receiving user input directed to the activation target in this pre-interaction condition; and (f) changing an appearance of the activation target from the pre-interaction condition to a larger size and/or a different visual appearance in response to this user input. Additional aspects of this invention relate to computer-readable media for providing user interfaces, systems, and methods as described above.
Type: Application
Filed: July 13, 2009
Publication date: November 5, 2009
Applicant: Microsoft Corporation
Inventors: Adrian J. Garside, F. David Jones, Josh A. Clow, Judy C. Tandog, Leroy B. Keely, Tracy Dianne Schultz
-
Patent number: 7561145
Abstract: User interfaces, methods, systems, and computer-readable media for activating and/or displaying text input systems on display devices may include: (a) displaying a text input system activation target at a user changeable location on a display device; (b) receiving user input directed to the activation target; and (c) activating a text input system in response to the user input. Such user interfaces, methods, and systems further may include (d) displaying a pre-interaction condition of the activation target; (e) receiving user input directed to the activation target in this pre-interaction condition; and (f) changing an appearance of the activation target from the pre-interaction condition to a larger size and/or a different visual appearance in response to this user input. Additional aspects of this invention relate to computer-readable media for providing user interfaces, systems, and methods as described above.
Type: Grant
Filed: March 18, 2005
Date of Patent: July 14, 2009
Assignee: Microsoft Corporation
Inventors: Adrian J. Garside, F. David Jones, Josh A. Clow, Judy C. Tandog, Leroy B. Keely, Tracy Dianne Schultz
-
Patent number: 7461348
Abstract: Systems, methods, and computer-readable media process computer input data (such as electronic ink data, speech input data, keyboard input data, etc.), including focus change data, in a manner so that the input insertion range better comports with the user's original intent. More specifically, user input data may be accepted before, during, and/or after a focus change event is initiated, and the systems and methods will process this input data in an intuitive manner, directing the data to areas of an application program or the operating system that better comport with the user's original intent. In this manner, loss of input data may be avoided and misdirected input data may be avoided, thereby lowering user frustration during focus change events.
Type: Grant
Filed: November 19, 2004
Date of Patent: December 2, 2008
Assignee: Microsoft Corporation
Inventors: Josh A. Clow, Adrian J. Garside, David V. Winkler
-
Publication number: 20060209040
Abstract: User interfaces, methods, systems, and computer-readable media for activating and/or displaying text input systems on display devices may include: (a) displaying a text input system activation target at a user changeable location on a display device; (b) receiving user input directed to the activation target; and (c) activating a text input system in response to the user input. Such user interfaces, methods, and systems further may include (d) displaying a pre-interaction condition of the activation target; (e) receiving user input directed to the activation target in this pre-interaction condition; and (f) changing an appearance of the activation target from the pre-interaction condition to a larger size and/or a different visual appearance in response to this user input. Additional aspects of this invention relate to computer-readable media for providing user interfaces, systems, and methods as described above.
Type: Application
Filed: March 18, 2005
Publication date: September 21, 2006
Applicant: Microsoft Corporation
Inventors: Adrian Garside, F. Jones, Josh Clow, Judy Tandog, Leroy Keely, Tracy Schultz
-
Publication number: 20060123159
Abstract: Systems, methods, and computer-readable media process computer input data (such as electronic ink data, speech input data, keyboard input data, etc.), including focus change data, in a manner so that the input insertion range better comports with the user's original intent. More specifically, user input data may be accepted before, during, and/or after a focus change event is initiated, and the systems and methods will process this input data in an intuitive manner, directing the data to areas of an application program or the operating system that better comport with the user's original intent. In this manner, loss of input data may be avoided and misdirected input data may be avoided, thereby lowering user frustration during focus change events.
Type: Application
Filed: November 19, 2004
Publication date: June 8, 2006
Applicant: Microsoft Corporation
Inventors: Josh Clow, Adrian Garside, David Winkler
-
Publication number: 20060112349
Abstract: Systems, methods, and computer-readable media process computer input data (such as electronic ink data, speech input data, keyboard input data, etc.), including focus change data, in a manner so that the input insertion range better comports with the user's original intent. More specifically, user input data may be accepted before, during, and/or after a focus change event is initiated, and the systems and methods will process this input data in an intuitive manner, directing the data to areas of an application program or the operating system that better comport with the user's original intent. In this manner, loss of input data may be avoided and misdirected input data may be avoided, thereby lowering user frustration during focus change events.
Type: Application
Filed: November 19, 2004
Publication date: May 25, 2006
Applicant: Microsoft Corporation
Inventors: Josh Clow, Adrian Garside, David Winkler