Patents by Inventor Adrian J. Garside

Adrian J. Garside has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20140109008
    Abstract: This document describes techniques for application reporting in an application-selectable user interface. These techniques permit a user to view reports for applications in a user interface through which these applications may be selected. By so doing, a user may quickly and easily determine which applications to select based on their respective reports and then select them or their content through the user interface.
    Type: Application
    Filed: December 17, 2013
    Publication date: April 17, 2014
    Applicant: Microsoft Corporation
    Inventors: Nazia Zaman, Adrian J. Garside, Christopher T. Bush, Lindsey R. Barcheck, Chantal M. Leonard, Jesse Clay Satterfield
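
To make the idea above concrete, the sketch below models an application-selectable interface in which each selectable entry carries a small report (a badge count or status line) supplied by its application. All type names and the rendering function are hypothetical illustrations, not the implementation covered by the publication.

```typescript
// Hypothetical sketch: an application-selectable UI where each selectable
// entry carries a "report" (e.g., a status line or badge) supplied by the
// application. All names here are illustrative, not from the patent.

interface AppReport {
  badgeCount?: number;   // e.g., unread items
  statusText?: string;   // e.g., "3 new messages"
}

interface SelectableApp {
  id: string;
  title: string;
  report?: AppReport;    // shown alongside the selection target
}

// Render one line per selectable app, including its report, so the user
// can decide which application (or which content) to select.
function renderSelectionUi(apps: SelectableApp[]): string {
  return apps
    .map((app) => {
      const badge = app.report?.badgeCount ? ` (${app.report.badgeCount})` : "";
      const status = app.report?.statusText ? ` - ${app.report.statusText}` : "";
      return `[${app.title}${badge}]${status}`;
    })
    .join("\n");
}

// Example usage with made-up data.
const apps: SelectableApp[] = [
  { id: "mail", title: "Mail", report: { badgeCount: 3, statusText: "3 new messages" } },
  { id: "photos", title: "Photos", report: { statusText: "Vacation album updated" } },
  { id: "calc", title: "Calculator" },
];
console.log(renderSelectionUi(apps));
```
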
  • Patent number: 8689123
    Abstract: This document describes techniques for application reporting in an application-selectable user interface. These techniques permit a user to view reports for applications in a user interface through which these applications may be selected. By so doing, a user may quickly and easily determine which applications to select based on their respective reports and then select them or their content through the user interface.
    Type: Grant
    Filed: December 23, 2010
    Date of Patent: April 1, 2014
    Assignee: Microsoft Corporation
    Inventors: Nazia Zaman, Adrian J. Garside, Christopher T. Bush, Lindsey R. Barcheck, Chantal M. Leonard, Jesse Clay Satterfield
  • Patent number: 8619053
    Abstract: User interfaces, methods, systems, and computer-readable media for activating and/or displaying text input systems on display devices may include: (a) displaying a text input system activation target at a user changeable location on a display device; (b) receiving user input directed to the activation target; and (c) activating a text input system in response to the user input. Such user interfaces, methods, and systems further may include (d) displaying a pre-interaction condition of the activation target; (e) receiving user input directed to the activation target in this pre-interaction condition; and (f) changing an appearance of the activation target from the pre-interaction condition to a larger size and/or a different visual appearance in response to this user input. Additional aspects of this invention relate to computer-readable media for providing user interfaces, systems, and methods as described above.
    Type: Grant
    Filed: July 27, 2012
    Date of Patent: December 31, 2013
    Assignee: Microsoft Corporation
    Inventors: Adrian J. Garside, F. David Jones, Josh A. Clow, Judy C. Tandog, Leroy B. Keely, Tracy Dianne Schultz
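
The entry above describes an activation target that starts small, changes appearance on interaction, and then launches a text input system. The following sketch captures that pre-interaction/expanded/activated progression as a minimal state machine; the state names, pointer handlers, and coordinates are assumptions made for illustration.

```typescript
// Hypothetical sketch of the activation-target behavior: a small target
// sits at a user-changeable location, grows on pointer contact, and
// activates a text input panel on a further tap.

type TargetState = "pre-interaction" | "expanded" | "activated";

class TextInputActivationTarget {
  state: TargetState = "pre-interaction";
  constructor(public x: number, public y: number) {} // user-changeable location

  moveTo(x: number, y: number): void {
    this.x = x;
    this.y = y;
  }

  // Pointer enters the small pre-interaction target: enlarge it.
  onPointerOver(): void {
    if (this.state === "pre-interaction") this.state = "expanded";
  }

  // User taps the expanded target: show the text input system.
  onTap(showTextInputPanel: () => void): void {
    if (this.state === "expanded") {
      this.state = "activated";
      showTextInputPanel();
    }
  }
}

// Example: the user relocates the target, it expands on hover, then activates.
const target = new TextInputActivationTarget(0, 300);
target.moveTo(0, 500);
target.onPointerOver();
target.onTap(() => console.log("text input panel shown at y =", target.y));
```
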
  • Publication number: 20130063443
    Abstract: Tile cache techniques are described. In at least some embodiments, a tile cache is maintained that stores tile content for a plurality of tiles. The tile content is ordered in the tile cache to match a visual order of tiles in a graphical user interface. When tiles are moved (e.g., panned and/or scrolled) in the graphical user interface, tile content can be retrieved from the tile cache and displayed.
    Type: Application
    Filed: September 9, 2011
    Publication date: March 14, 2013
    Inventors: Adrian J. Garside, Milena Salman, Vivek Y. Tripathi
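
As a rough illustration of the tile cache described above, the sketch below keeps cached tile content in the same order as the on-screen tiles, so a pan or scroll can be served by slicing the cache. The class and its methods are invented for this example and are not taken from the publication.

```typescript
// Hypothetical sketch of a tile cache whose entries are kept in the same
// order as the tiles appear on screen, so a pan/scroll can be served by
// slicing the cache rather than regenerating tile content.

interface CachedTile {
  tileId: string;
  content: string; // rendered tile content (a bitmap handle in practice)
}

class TileCache {
  private tiles: CachedTile[] = []; // ordered to match visual order

  setVisualOrder(tiles: CachedTile[]): void {
    this.tiles = [...tiles];
  }

  // Return the cached content for the tiles visible in a viewport
  // [firstVisibleIndex, firstVisibleIndex + count).
  visibleSlice(firstVisibleIndex: number, count: number): CachedTile[] {
    return this.tiles.slice(firstVisibleIndex, firstVisibleIndex + count);
  }
}

// Example: panning right by one tile just shifts the slice.
const cache = new TileCache();
cache.setVisualOrder([
  { tileId: "mail", content: "Mail tile" },
  { tileId: "news", content: "News tile" },
  { tileId: "photos", content: "Photos tile" },
  { tileId: "music", content: "Music tile" },
]);
console.log(cache.visibleSlice(0, 2)); // before pan
console.log(cache.visibleSlice(1, 2)); // after panning one tile
```
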
  • Publication number: 20120304061
    Abstract: Various embodiments enable target disambiguation and correction. In one or more embodiments, target disambiguation includes an entry mode in which attempts are made to disambiguate one or more targets that have been selected by a user, and an exit mode which exits target disambiguation. Entry mode can be triggered in a number of different ways including, by way of example and not limitation, acquisition of multiple targets, selection latency, a combination of multiple target acquisition and selection latency, and the like. Exit mode can be triggered in a number of different ways including, by way of example and not limitation, movement of a target selection mechanism outside of a defined geometry, speed of movement of the target selection mechanism, and the like.
    Type: Application
    Filed: May 27, 2011
    Publication date: November 29, 2012
    Inventors: Paul Armistead Hoover, Michael J. Patten, Theresa B. Pittappilly, Jan-Kristian Markiewicz, Adrian J. Garside, Maxim V. Mazeev, Jarrod Lombardo
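
The following sketch suggests how the entry and exit triggers described above might be organized: entry on acquisition of multiple targets or on selection latency, exit on leaving a defined geometry or on fast pointer movement. The thresholds, geometry test, and class shape are illustrative assumptions.

```typescript
// Hypothetical sketch of the entry/exit logic for target disambiguation.
// Thresholds, geometry, and triggers below are illustrative assumptions.

interface Point { x: number; y: number; }

class DisambiguationController {
  active = false;

  // Entry mode: triggered when a touch overlaps several candidate targets,
  // or when the user lingers (selection latency) over a target.
  maybeEnter(candidateTargets: string[], dwellMs: number): void {
    if (candidateTargets.length > 1 || dwellMs > 500) {
      this.active = true;
    }
  }

  // Exit mode: triggered when the pointer leaves a defined geometry around
  // the disambiguation UI, or when it moves faster than a threshold.
  maybeExit(pointer: Point, geometryCenter: Point, radius: number, speedPxPerMs: number): void {
    const dx = pointer.x - geometryCenter.x;
    const dy = pointer.y - geometryCenter.y;
    const outsideGeometry = Math.hypot(dx, dy) > radius;
    if (outsideGeometry || speedPxPerMs > 2.0) {
      this.active = false;
    }
  }
}

// Example: two overlapping targets trigger entry; a fast flick exits.
const controller = new DisambiguationController();
controller.maybeEnter(["Save", "Save As"], 120);
console.log("disambiguating:", controller.active); // true
controller.maybeExit({ x: 400, y: 40 }, { x: 100, y: 100 }, 80, 3.5);
console.log("disambiguating:", controller.active); // false
```
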
  • Publication number: 20120293418
    Abstract: User interfaces, methods, systems, and computer-readable media for activating and/or displaying text input systems on display devices may include: (a) displaying a text input system activation target at a user changeable location on a display device; (b) receiving user input directed to the activation target; and (c) activating a text input system in response to the user input. Such user interfaces, methods, and systems further may include (d) displaying a pre-interaction condition of the activation target; (e) receiving user input directed to the activation target in this pre-interaction condition; and (f) changing an appearance of the activation target from the pre-interaction condition to a larger size and/or a different visual appearance in response to this user input. Additional aspects of this invention relate to computer-readable media for providing user interfaces, systems, and methods as described above.
    Type: Application
    Filed: July 27, 2012
    Publication date: November 22, 2012
    Applicant: Microsoft Corporation
    Inventors: Adrian J. Garside, F. David Jones, Josh A. Clow, Judy C. Tandog, Leroy B. Keely, Tracy Dianne Schultz
  • Patent number: 8253708
    Abstract: User interfaces, methods, systems, and computer-readable media for activating and/or displaying text input systems on display devices may include: (a) displaying a text input system activation target at a user changeable location on a display device; (b) receiving user input directed to the activation target; and (c) activating a text input system in response to the user input. Such user interfaces, methods, and systems further may include (d) displaying a pre-interaction condition of the activation target; (e) receiving user input directed to the activation target in this pre-interaction condition; and (f) changing an appearance of the activation target from the pre-interaction condition to a larger size and/or a different visual appearance in response to this user input. Additional aspects of this invention relate to computer-readable media for providing user interfaces, systems, and methods as described above.
    Type: Grant
    Filed: July 13, 2009
    Date of Patent: August 28, 2012
    Assignee: Microsoft Corporation
    Inventors: Adrian J. Garside, F. David Jones, Josh A. Clow, Judy C. Tandog, Leroy B. Keely, Tracy Dianne Schultz
  • Publication number: 20120167011
    Abstract: This document describes techniques for application reporting in an application-selectable user interface. These techniques permit a user to view reports for applications in a user interface through which these applications may be selected. By so doing, a user may quickly and easily determine which applications to select based on their respective reports and then select them or their content through the user interface.
    Type: Application
    Filed: December 23, 2010
    Publication date: June 28, 2012
    Applicant: Microsoft Corporation
    Inventors: Nazia Zaman, Adrian J. Garside, Christopher T. Bush, Lindsey R. Barcheck, Chantal M. Leonard, Jesse Clay Satterfield
  • Patent number: 8140994
    Abstract: An object is associated with one or more controls in a software application. An object associated with a control determines the operation of the data entry user interface when the data entry user interface is being employed to enter data into the control. More particularly, the object may communicate interface characteristics to a component that is responsible for providing the user interface to the user. Such a component may be, for example, a shared software module that renders the user interface on a display, receives input data from the user through the user interface, and routes the entered data to a designated destination. Alternately, the object itself may create a user interface having the specified characteristics.
    Type: Grant
    Filed: January 23, 2009
    Date of Patent: March 20, 2012
    Assignee: Microsoft Corporation
    Inventors: Kyril Feldman, Robert L. Chambers, Steve Dodge, Takanobu Murayama, Tobias Zielinski, Todd A. Torset, Thomas R. Wick, Adrian J. Garside
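
One way to picture the per-control object described above is the sketch below, in which each control exposes its preferred input characteristics to a shared module that shows the data entry UI and routes the entered data. The interfaces, the SharedInputModule class, and the example field are all hypothetical.

```typescript
// Hypothetical sketch of a per-control object that tells a shared input
// component how the data entry UI should behave for that control.

interface InputUiCharacteristics {
  inputScope: "text" | "number" | "email"; // preferred input/recognition mode
  autoShowPanel: boolean;                  // show the input panel on focus?
}

interface ControlInputObject {
  controlId: string;
  characteristics(): InputUiCharacteristics;
  routeInput(data: string): void; // designated destination for entered data
}

// Shared component: presents the data entry UI using the characteristics
// supplied by whichever control currently has focus, then routes the input.
class SharedInputModule {
  private focused?: ControlInputObject;

  focus(control: ControlInputObject): void {
    this.focused = control;
    const c = control.characteristics();
    if (c.autoShowPanel) {
      console.log(`showing ${c.inputScope} input panel for ${control.controlId}`);
    }
  }

  commit(data: string): void {
    this.focused?.routeInput(data);
  }
}

// Example: a numeric field asks for a number-oriented input UI.
const quantityField: ControlInputObject = {
  controlId: "quantity",
  characteristics: () => ({ inputScope: "number", autoShowPanel: true }),
  routeInput: (data) => console.log("quantity set to", data),
};
const inputModule = new SharedInputModule();
inputModule.focus(quantityField);
inputModule.commit("42");
```
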
  • Patent number: 7970812
    Abstract: A system and method for redistributing space in ink-to-text conversions are described. In stylus-based computing systems, users often desire to convert ink into text. Sometimes the conversion leaves an interaction region that is too small for effective recognition correction or interaction. A system and procedure are described that adjust the spacing of text to allow easier interaction with the recognition results.
    Type: Grant
    Filed: March 17, 2005
    Date of Patent: June 28, 2011
    Assignee: Microsoft Corporation
    Inventors: Adrian J. Garside, Alice Dai, Takanobu Murayama, Tracy D. Schultz
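
The sketch below gives a simplified version of the spacing adjustment described above: after recognition, words whose interaction regions are narrower than a minimum width are widened, and later words are shifted right to make room. The data shapes and the minimum-width rule are assumptions for illustration only.

```typescript
// Hypothetical sketch of the spacing adjustment: after ink-to-text
// conversion, widen the gaps between recognized words whose interaction
// regions fall below a minimum width. All values are illustrative.

interface RecognizedWord {
  text: string;
  x: number;      // left edge of the word's interaction region
  width: number;  // width of the interaction region
}

// Push subsequent words to the right so every word gets at least
// `minWidth` of horizontal space to interact with.
function redistributeSpacing(words: RecognizedWord[], minWidth: number): RecognizedWord[] {
  let shift = 0;
  return words.map((w) => {
    const adjusted = { ...w, x: w.x + shift, width: Math.max(w.width, minWidth) };
    shift += adjusted.width - w.width; // accumulated extra space
    return adjusted;
  });
}

// Example: a cramped recognition result gains room for correction UI.
const result = redistributeSpacing(
  [
    { text: "to", x: 0, width: 12 },
    { text: "do", x: 12, width: 10 },
    { text: "list", x: 22, width: 20 },
  ],
  24,
);
console.log(result);
```
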
  • Patent number: 7634738
    Abstract: Systems, methods, and computer-readable media process computer input data (such as electronic ink data, speech input data, keyboard input data, etc.), including focus change data, in a manner so that the input insertion range better comports with the user's original intent. More specifically, user input data may be accepted, before, during, and/or after a focus change event is initiated, and the systems and methods will process this input data in an intuitive manner, directing the data to areas of an application program or the operating system that better comport with the user's original intent. In this manner, loss of input data may be avoided and misdirected input data may be avoided, thereby lowering user frustration during focus change events.
    Type: Grant
    Filed: November 19, 2004
    Date of Patent: December 15, 2009
    Assignee: Microsoft Corporation
    Inventors: Josh A. Clow, Adrian J. Garside, David V. Winkler
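
To illustrate the routing idea above, the sketch below sends input that began before a focus change to the previously focused field, and later input to the new one. The timestamp comparison and field names are invented for this example and stand in for whatever intent signals the described systems actually use.

```typescript
// Hypothetical sketch of intent-aware routing around a focus change:
// input that began before the focus change completed is sent to the field
// that was focused when the input started.

interface InputEventRecord {
  startedAt: number; // ms timestamp when the user began this input
  data: string;
}

class FocusChangeRouter {
  private currentField = "fieldA";
  private focusChangedAt = Number.NEGATIVE_INFINITY;
  private previousField?: string;

  changeFocus(toField: string, at: number): void {
    this.previousField = this.currentField;
    this.currentField = toField;
    this.focusChangedAt = at;
  }

  // Route to the previously focused field if the input started before the
  // focus change; otherwise route to the newly focused field.
  route(event: InputEventRecord): string {
    const target =
      event.startedAt < this.focusChangedAt && this.previousField
        ? this.previousField
        : this.currentField;
    console.log(`"${event.data}" -> ${target}`);
    return target;
  }
}

// Example: ink begun before tapping a new field still lands in the old one.
const router = new FocusChangeRouter();
router.changeFocus("fieldB", 1000);
router.route({ startedAt: 900, data: "hello" });  // -> fieldA
router.route({ startedAt: 1100, data: "world" }); // -> fieldB
```
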
  • Publication number: 20090273565
    Abstract: User interfaces, methods, systems, and computer-readable media for activating and/or displaying text input systems on display devices may include: (a) displaying a text input system activation target at a user changeable location on a display device; (b) receiving user input directed to the activation target; and (c) activating a text input system in response to the user input. Such user interfaces, methods, and systems further may include (d) displaying a pre-interaction condition of the activation target; (e) receiving user input directed to the activation target in this pre-interaction condition; and (f) changing an appearance of the activation target from the pre-interaction condition to a larger size and/or a different visual appearance in response to this user input. Additional aspects of this invention relate to computer-readable media for providing user interfaces, systems, and methods as described above.
    Type: Application
    Filed: July 13, 2009
    Publication date: November 5, 2009
    Applicant: Microsoft Corporation
    Inventors: Adrian J. Garside, F. David Jones, Josh A. Clow, Judy C. Tandog, Leroy B. Keely, Tracy Dianne Schultz
  • Patent number: 7562296
    Abstract: A correction tool that displays a correction widget when a user acts to correct text is provided. More particularly, if the user places an insertion point in or to the immediate left of the text, or selects the text, the tool displays the correction widget immediately to the left of the selected text. The user can then quickly access a correction interface for correcting the text simply by moving the pointer the short distance from the insertion point to the correction widget. When the user activates the correction widget, the tool displays the correction interface immediately proximal to the correction widget. Thus, the user need only move the pointer a small distance further to then correct the text using the correction interface.
    Type: Grant
    Filed: July 27, 2005
    Date of Patent: July 14, 2009
    Assignee: Microsoft Corporation
    Inventors: Ravipal Soin, Adrian J. Garside, David V. Winkler, Luis M. Huapaya, Marieke Iwema
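
A minimal sketch of the placement logic described above follows: the widget is positioned immediately to the left of the selected text, and the correction interface opens next to the widget so the pointer only travels a short distance. The rectangle type, sizes, and offsets are illustrative assumptions.

```typescript
// Hypothetical sketch of correction-widget placement: when text is selected
// (or the caret is placed in it), a small widget appears just to the left of
// that text; activating it opens the correction UI next to the widget.

interface Rect { x: number; y: number; width: number; height: number; }

// Place the widget immediately to the left of the selected text.
function placeCorrectionWidget(selectedText: Rect, widgetSize = 16): Rect {
  return { x: selectedText.x - widgetSize, y: selectedText.y, width: widgetSize, height: widgetSize };
}

// Place the correction interface immediately proximal to the widget,
// so the pointer only travels a short distance further.
function placeCorrectionInterface(widget: Rect): Rect {
  return { x: widget.x, y: widget.y + widget.height, width: 200, height: 120 };
}

// Example: selecting the word at (120, 40) puts the widget at x = 104.
const selection: Rect = { x: 120, y: 40, width: 60, height: 18 };
const widget = placeCorrectionWidget(selection);
const correctionUi = placeCorrectionInterface(widget);
console.log(widget, correctionUi);
```
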
  • Patent number: 7561145
    Abstract: User interfaces, methods, systems, and computer-readable media for activating and/or displaying text input systems on display devices may include: (a) displaying a text input system activation target at a user changeable location on a display device; (b) receiving user input directed to the activation target; and (c) activating a text input system in response to the user input. Such user interfaces, methods, and systems further may include (d) displaying a pre-interaction condition of the activation target; (e) receiving user input directed to the activation target in this pre-interaction condition; and (f) changing an appearance of the activation target from the pre-interaction condition to a larger size and/or a different visual appearance in response to this user input. Additional aspects of this invention relate to computer-readable media for providing user interfaces, systems, and methods as described above.
    Type: Grant
    Filed: March 18, 2005
    Date of Patent: July 14, 2009
    Assignee: Microsoft Corporation
    Inventors: Adrian J. Garside, F. David Jones, Josh A. Clow, Judy C. Tandog, Leroy B. Keely, Tracy Dianne Schultz
  • Patent number: 7551779
    Abstract: Described is a computer-implemented system and method that detects and differentiates scratch-out gestures from other electronic ink, e.g., entered via a pen. The system and method use boundary-based criteria to differentiate, which eliminates the need for a specially-shaped scratch-out pattern and instead allows a wide variety of scratch-out styles to be detected. Criteria include boundary-based evaluations such as whether the potential scratch-out gesture intersects previously recognized words or characters, whether the scratch-out gesture has a width that is at least a threshold percentage of the width of a word or character bounding box, whether the electronic ink extends beyond the midpoint of the bounding box, and whether at least some portion of the scratch-out gesture is above a baseline of the word or character. Scratch-out gestures entered in freeform input writing areas and boxed input writing areas are supported.
    Type: Grant
    Filed: March 17, 2005
    Date of Patent: June 23, 2009
    Assignee: Microsoft Corporation
    Inventors: Adrian J. Garside, Takanobu Murayama, Tracy Schultz, Daphne Guericke, Ernest L. Pennington, Shou-Ching Schilling
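
The sketch below approximates the boundary-based checks listed above (intersection with a recognized word, relative width, extension past the word's midpoint, and presence above the baseline) using simple bounding boxes. The 50% width threshold and the box-only geometry are assumptions, not the patent's actual criteria.

```typescript
// Hypothetical sketch of boundary-based checks for a scratch-out gesture
// over a recognized word, using bounding boxes only.

interface Box { left: number; top: number; right: number; bottom: number; }
interface Stroke { box: Box; }            // bounding box of the candidate gesture
interface Word { box: Box; baselineY: number; }

function intersects(a: Box, b: Box): boolean {
  return a.left < b.right && a.right > b.left && a.top < b.bottom && a.bottom > b.top;
}

function isScratchOut(stroke: Stroke, word: Word): boolean {
  const wordWidth = word.box.right - word.box.left;
  const strokeWidth = stroke.box.right - stroke.box.left;
  const midpointX = word.box.left + wordWidth / 2;

  return (
    intersects(stroke.box, word.box) &&          // crosses the recognized word
    strokeWidth >= 0.5 * wordWidth &&            // wide enough relative to the word
    stroke.box.right > midpointX &&              // extends beyond the word's midpoint
    stroke.box.top < word.baselineY              // some portion is above the baseline
  );
}

// Example: a wide back-and-forth stroke over a recognized word counts.
const word: Word = { box: { left: 10, top: 0, right: 110, bottom: 30 }, baselineY: 25 };
const stroke: Stroke = { box: { left: 5, top: 5, right: 100, bottom: 28 } };
console.log(isScratchOut(stroke, word)); // true
```
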
  • Publication number: 20090150777
    Abstract: An object is associated with one or more controls in a software application. An object associated with a control determines the operation of the data entry user interface when the data entry user interface is being employed to enter data into the control. More particularly, the object may communicate interface characteristics to a component that is responsible for providing the user interface to the user. Such a component may be, for example, a shared software module that renders the user interface on a display, receives input data from the user through the user interface, and routes the entered data to a designated destination. Alternately, the object itself may create a user interface having the specified characteristics.
    Type: Application
    Filed: January 23, 2009
    Publication date: June 11, 2009
    Applicant: Microsoft Corporation
    Inventors: Kyril Feldman, Robert L. Chambers, Steve Dodge, Takanobu Murayama, Tobias Zielinski, Todd A. Torset, Thomas R. Wick, Adrian J. Garside
  • Publication number: 20090150776
    Abstract: An object is associated with one or more controls in a software application. An object associated with a control determines the operation of the data entry user interface when the data entry user interface is being employed to enter data into the control. More particularly, the object may communicate interface characteristics to a component that is responsible for providing the user interface to the user. Such a component may be, for example, a shared software module that renders the user interface on a display, receives input data from the user through the user interface, and routes the entered data to a designated destination. Alternately, the object itself may create a user interface having the specified characteristics.
    Type: Application
    Filed: January 23, 2009
    Publication date: June 11, 2009
    Applicant: Microsoft Corporation
    Inventors: Kyril Feldman, Robert L. Chambers, Steve Dodge, Takanobu Murayama, Tobias Zielinski, Todd A. Torset, Thomas R. Wick, Adrian J. Garside
  • Publication number: 20090132951
    Abstract: An object is associated with one or more controls in a software application. An object associated with a control determines the operation of the data entry user interface when the data entry user interface is being employed to enter data into the control. More particularly, the object may communicate interface characteristics to a component that is responsible for providing the user interface to the user. Such a component may be, for example, a shared software module that renders the user interface on a display, receives input data from the user through the user interface, and routes the entered data to a designated destination. Alternately, the object itself may create a user interface having the specified characteristics.
    Type: Application
    Filed: January 23, 2009
    Publication date: May 21, 2009
    Applicant: Microsoft Corporation
    Inventors: Kyril Feldman, Robert L. Chambers, Steve Dodge, Takanobu Murayama, Tobias Zielinski, Todd A. Torset, Thomas R. Wick, Adrian J. Garside
  • Patent number: 7490296
    Abstract: An object is associated with one or more controls in a software application. An object associated with a control determines the operation of the data entry user interface when the data entry user interface is being employed to enter data into that control. More particularly, the object may communicate interface characteristics to a component that is responsible for providing the user interface to the user. Such a component may be, for example, a shared software module that renders the user interface on a display, receives input data from the user through the user interface, and routes the entered data to a designated destination. Alternately, the object itself may create a user interface having the specified characteristics.
    Type: Grant
    Filed: April 30, 2003
    Date of Patent: February 10, 2009
    Assignee: Microsoft Corporation
    Inventors: Kyril Feldman, Robert L. Chambers, Steve Dodge, Takanobu Murayama, Tobiasz Zielinski, Todd A. Torset, Thomas R. Wick, Adrian J. Garside
  • Patent number: 7461348
    Abstract: Systems, methods, and computer-readable media process computer input data (such as electronic ink data, speech input data, keyboard input data, etc.), including focus change data, in a manner so that the input insertion range better comports with the user's original intent. More specifically, user input data may be accepted, before, during, and/or after a focus change event is initiated, and the systems and methods will process this input data in an intuitive manner, directing the data to areas of an application program or the operating system that better comport with the user's original intent. In this manner, loss of input data may be avoided and misdirected input data may be avoided, thereby lowering user frustration during focus change events.
    Type: Grant
    Filed: November 19, 2004
    Date of Patent: December 2, 2008
    Assignee: Microsoft Corporation
    Inventors: Josh A. Clow, Adrian J. Garside, David V. Winkler