Patents by Inventor Sarah G. Williams

Sarah G. Williams has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 9594504
    Abstract: One or more techniques and/or systems are provided for utilizing input data received from an indirect interaction device (e.g., mouse, touchpad, etc.) to launch, engage, and/or close an object within a user interface. For example, a sensory surface of the indirect interaction device may be divided into two (or more) portions, a first portion utilized to launch, engage, and/or close an object and a second portion utilized to navigate (e.g., move a cursor) within the user interface. When an object is launched based upon receipt of a predefined gesture(s), the first portion of the sensory surface may be mapped to the object to provide for interaction with the object via an interaction between a contact (e.g., a finger) and the first portion. Also, the surface area of the first portion may be altered (e.g., enlarged) when it is mapped to the object and/or according to operations performed on the object.
    Type: Grant
    Filed: November 8, 2011
    Date of Patent: March 14, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Sarah G. Williams, Eric Boller Fleegal, William Roger Voss
  • Patent number: 9367230
    Abstract: One or more techniques and/or systems are provided for utilizing input data received from an indirect interaction device (e.g., mouse, touchpad, etc.) as if the data were received from a direct interaction device (e.g., touchscreen). Interaction models are described for handling input data received from an indirect interaction device. For example, the interaction models may provide for the presentation of two or more targets (e.g., cursors) on a display when two or more contacts (e.g., fingers) are detected by the indirect interaction device. Moreover, based upon the number of contacts detected and/or the pressure applied by respective contacts, the presented target(s) may respectively be transitioned between a hover visualization and an engage visualization. Targets in an engage visualization may manipulate the size of an object presented in a user interface, pan the object, drag the object, rotate the object, and/or otherwise engage the object, for example.
    Type: Grant
    Filed: November 8, 2011
    Date of Patent: June 14, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Sarah G. Williams, Scott Honji, Masahiko Kaneko, Jan-Kristian Markiewicz, Vincent Ball, Amish Patel, Paul R. Millsap
  • Publication number: 20140372916
    Abstract: In one embodiment, a graphical user interface may keep an active header present in the view frame while scrolling through an active display item set, with smooth transitions between headers. The graphical user interface for an operating system of the user device may display the active display item set in a grid view. The graphical user interface may automatically adjust a dependent scrolling motion animation of the active display item set based on the form factor of the user device.
    Type: Application
    Filed: June 12, 2013
    Publication date: December 18, 2014
    Inventors: Julien Dollon, Sarah G. Williams
  • Publication number: 20130117715
    Abstract: One or more techniques and/or systems are provided for utilizing input data received from an indirect interaction device (e.g., mouse, touchpad, etc.) to launch, engage, and/or close an object within a user interface. For example, a sensory surface of the indirect interaction device may be divided into two (or more) portions, a first portion utilized to launch, engage, and/or close an object and a second portion utilized to navigate (e.g., move a cursor) within the user interface. When an object is launched based upon receipt of a predefined gesture(s), the first portion of the sensory surface may be mapped to the object to provide for interaction with the object via an interaction between a contact (e.g., a finger) and the first portion. Also, the surface area of the first portion may be altered (e.g., enlarged) when it is mapped to the object and/or according to operations performed on the object.
    Type: Application
    Filed: November 8, 2011
    Publication date: May 9, 2013
    Applicant: Microsoft Corporation
    Inventors: Sarah G. Williams, Eric Boller Fleegal, William Roger Voss
  • Publication number: 20130113716
    Abstract: One or more techniques and/or systems are provided for utilizing input data received from an indirect interaction device (e.g., mouse, touchpad, etc.) as if the data were received from a direct interaction device (e.g., touchscreen). Interaction models are described for handling input data received from an indirect interaction device. For example, the interaction models may provide for the presentation of two or more targets (e.g., cursors) on a display when two or more contacts (e.g., fingers) are detected by the indirect interaction device. Moreover, based upon the number of contacts detected and/or the pressure applied by respective contacts, the presented target(s) may respectively be transitioned between a hover visualization and an engage visualization. Targets in an engage visualization may manipulate the size of an object presented in a user interface, pan the object, drag the object, rotate the object, and/or otherwise engage the object, for example.
    Type: Application
    Filed: November 8, 2011
    Publication date: May 9, 2013
    Applicant: Microsoft Corporation
    Inventors: Sarah G. Williams, Scott Honji, Masahiko Kaneko, Jan-Kristian Markiewicz, Vincent Ball, Amish Patel, Paul R. Millsap
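The divided-touchpad idea in patent 9594504 can be illustrated with a short sketch. This is not the patented implementation; all class names, the edge-portion fraction, and the enlargement factor are illustrative assumptions. It models a sensory surface split into an edge portion (used to launch and then control an object) and a primary portion (used for cursor navigation), with the edge portion enlarging once it is mapped to an object.

```python
# Hypothetical sketch of a touchpad surface divided into two portions, as
# described in patent 9594504. Names and sizes are assumptions for illustration.

class DividedTouchpad:
    def __init__(self, width: float, edge_fraction: float = 0.1):
        self.width = width
        self.edge_fraction = edge_fraction  # share of width given to the edge portion
        self.mapped_object = None           # object the edge portion controls, if any

    @property
    def edge_width(self) -> float:
        # The edge portion is enlarged (here, doubled) while mapped to an object.
        factor = 2.0 if self.mapped_object else 1.0
        return self.width * self.edge_fraction * factor

    def route(self, x: float) -> str:
        """Decide which portion a contact at horizontal position x falls in."""
        if x < self.edge_width:
            return "object" if self.mapped_object else "launch"
        return "navigate"

    def launch(self, obj: str) -> None:
        """A predefined gesture in the edge portion launches and maps an object."""
        self.mapped_object = obj
```

A contact near the edge triggers a launch, after which the enlarged edge region routes input to the launched object while the rest of the surface still drives navigation.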
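The hover/engage interaction model of patent 9367230 maps each detected contact to an on-screen target whose visualization depends on applied pressure. The sketch below is a minimal illustration under assumed names and an assumed normalized pressure threshold; the patent does not specify these values.

```python
# Hypothetical sketch of the hover/engage target model from patent 9367230.
# The threshold and function names are illustrative assumptions.

ENGAGE_PRESSURE = 0.5  # assumed normalized pressure threshold

def target_states(contact_pressures: list[float]) -> list[str]:
    """Map each detected contact to a hover or engage visualization.

    One target is presented per contact; a target whose contact presses
    at or above the threshold transitions from hover to engage.
    """
    return ["engage" if p >= ENGAGE_PRESSURE else "hover"
            for p in contact_pressures]
```

Two engaged targets could then drive manipulations such as pinch-resizing or rotating the object, while hovering targets merely track position.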
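The sticky-header behavior in publication 20140372916 keeps the header of the group currently at the top of the view frame pinned while the item set scrolls, swapping headers as group boundaries pass. A minimal sketch, assuming each group's header sits at a known vertical offset (with the first group at offset 0) and a non-negative scroll position; the group layout and names are illustrative, not from the publication.

```python
# Hypothetical sketch of active-header selection while scrolling, per
# publication 20140372916. Offsets and names are illustrative assumptions.

def active_header(group_offsets: dict[str, int], scroll_y: int) -> str:
    """Return the header of the group containing the current scroll position.

    Picks the header with the greatest offset that does not exceed scroll_y,
    i.e., the header that should stay pinned at the top of the view frame.
    Assumes the first group starts at offset 0 and scroll_y >= 0.
    """
    current = None
    for name, offset in sorted(group_offsets.items(), key=lambda kv: kv[1]):
        if offset <= scroll_y:
            current = name
    return current
```

A renderer would call this on every scroll frame and cross-fade between the outgoing and incoming header whenever the returned name changes, giving the smooth header transitions the abstract describes.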