Patents by Inventor William A. S. Buxton

William A. S. Buxton has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20140250245
    Abstract: Described herein are techniques and systems that allow modification of functionalities based on distances between a shared device (e.g., a shared display, etc.) and an individual device (e.g., a mobile computing device, etc.). The shared device and the individual device may establish a communication to enable exchange of data. In some embodiments, the shared device or the individual device may measure a distance between the shared device and the individual device. Based on the distance, the individual device may operate in a different mode. In some instances, the shared device may then instruct the individual device to modify a functionality corresponding to the mode.
    Type: Application
    Filed: March 14, 2013
    Publication date: September 4, 2014
    Applicant: MICROSOFT CORPORATION
    Inventors: Michel Pahud, Kenneth P. Hinckley, William A.S. Buxton, Gina D. Venolia
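    The distance-based mode switch this abstract describes can be sketched roughly as follows. The mode names and distance thresholds are illustrative assumptions; the filing does not specify values.

```python
# Sketch of distance-based functionality modification between a shared
# device and an individual device. Thresholds and mode names are
# assumed for demonstration, not taken from the filing.

def select_mode(distance_m: float) -> str:
    """Map a measured shared-to-individual device distance to a mode."""
    if distance_m < 1.0:
        return "touch"       # close enough for direct interaction
    elif distance_m < 5.0:
        return "gesture"     # mid-range: remote/gesture control
    return "view-only"       # far away: display-only functionality

def modify_functionality(distance_m: float) -> dict:
    """The shared device instructs the individual device to modify
    functionality corresponding to the selected mode."""
    mode = select_mode(distance_m)
    return {
        "mode": mode,
        "input_enabled": mode != "view-only",
        "direct_touch": mode == "touch",
    }
```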
  • Publication number: 20140247207
    Abstract: Techniques for causing an object to be provided to a specific location on a shared device. These techniques may include connecting the shared device with an individual device. The individual device may transmit the object to the shared device, and the object may be displayed at an initial object position on a display of the shared device. The initial object position may be updated in response to movement of the individual device, and the object may be displayed at the updated object position on the display. The object position may be locked in response to a signal, and the object may be displayed at the locked object position.
    Type: Application
    Filed: March 14, 2013
    Publication date: September 4, 2014
    Applicant: MICROSOFT CORPORATION
    Inventors: Michel Pahud, Pierre P.N. Greborio, Kenneth P. Hinckley, William A.S. Buxton
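    The move-then-lock behavior in this abstract can be sketched as a small state object. The class and method names are a hypothetical interface; the filing describes the behavior, not an API.

```python
class SharedObjectPosition:
    """Track an object's position on a shared display, driven by
    movement of the individual device, with a lock signal
    (hypothetical interface; behavior per the abstract)."""

    def __init__(self, x: float, y: float):
        self.x, self.y = x, y        # initial object position
        self.locked = False

    def on_device_moved(self, dx: float, dy: float) -> None:
        # Movement of the individual device updates the object
        # position until the position is locked.
        if not self.locked:
            self.x += dx
            self.y += dy

    def on_lock_signal(self) -> None:
        # The object is thereafter displayed at the locked position.
        self.locked = True
```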
  • Publication number: 20140123049
    Abstract: The subject disclosure is directed towards a graphical or printed keyboard having keys removed, in which the removed keys are those made redundant by gesture input. For example, a graphical or printed keyboard may be the same overall size and have the same key sizes as other graphical or printed keyboards with no numeric keys, yet, by repurposing the removed keys, may fit numeric and alphabetic keys into the same footprint. Also described is having three or more characters per key, with a tap corresponding to one character, and different gestures on the key differentiating among the other characters.
    Type: Application
    Filed: December 19, 2012
    Publication date: May 1, 2014
    Applicant: MICROSOFT CORPORATION
    Inventors: William A. S. Buxton, Ahmed Sabbir Arif, Michel Pahud, Kenneth P. Hinckley, Finbarr S. Duggan
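    The three-or-more-characters-per-key scheme can be sketched as a lookup from (key, gesture) to character. The key-to-character layout below is an assumed example, not one from the filing.

```python
# Sketch of a multi-character key: a tap yields the primary character,
# while different gestures on the same key select the others.
# The mapping is an assumed example layout.

KEY_MAP = {
    "q": {"tap": "q", "swipe_up": "1", "swipe_down": "!"},
    "w": {"tap": "w", "swipe_up": "2", "swipe_down": "@"},
}

def resolve_key(key: str, gesture: str = "tap") -> str:
    """Return the character produced by a tap or gesture on a key;
    unrecognized gestures fall back to the tap character."""
    return KEY_MAP[key].get(gesture, KEY_MAP[key]["tap"])
```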
  • Patent number: 8514264
    Abstract: Existing remote workspace sharing systems are difficult to use. For example, changes made on a common work product by one user often appear abruptly on displays viewed by remote users. As a result the interaction is perceived as unnatural by the users and is often inefficient. Images of a display of a common work product are received from a camera at a first location. These images may also comprise information about objects between the display and the camera such as a user's hand editing a document on a tablet PC. These images are combined with images of the shared work product and displayed at remote locations. Advance information about remote user actions is then visible and facilitates collaborative mediation between users. Depth information may be used to influence the process of combining the images.
    Type: Grant
    Filed: February 27, 2012
    Date of Patent: August 20, 2013
    Assignee: Microsoft Corporation
    Inventors: Ankur Agarwal, Antonio Criminisi, William A. S. Buxton, Andrew Blake, Andrew William Fitzgibbon
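    The depth-influenced image combination mentioned at the end of this abstract can be sketched as per-pixel blending weighted by depth, so that near objects (such as a hand between the camera and the display) show through more strongly. The grayscale representation and the specific weighting scheme are assumptions for illustration, not the patent's exact method.

```python
def combine(shared: list, camera: list, depth: list, near: float) -> list:
    """Blend a camera image of the remote display over the shared work
    product. Images are flat grayscale pixel lists; pixels whose depth
    is within 'near' are treated as foreground objects and rendered
    more opaque (weights are an assumed scheme)."""
    out = []
    for s, c, d in zip(shared, camera, depth):
        alpha = 1.0 if d <= near else 0.3   # near objects more opaque
        out.append(alpha * c + (1 - alpha) * s)
    return out
```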
  • Publication number: 20130201113
    Abstract: Functionality is described herein for detecting and responding to gestures performed by a user using a computing device, such as, but not limited to, a tablet computing device. In one implementation, the functionality operates by receiving touch input information in response to the user touching the computing device, and movement input information in response to the user moving the computing device. The functionality then determines whether the input information indicates that a user has performed or is performing a multi-touch-movement (MTM) gesture. The functionality can then perform any behavior in response to determining that the user has performed an MTM gesture, such as by modifying a view or invoking a function, etc.
    Type: Application
    Filed: February 7, 2012
    Publication date: August 8, 2013
    Applicant: Microsoft Corporation
    Inventors: Kenneth P. Hinckley, Michel Pahud, William A. S. Buxton
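    The combined touch-plus-movement test at the heart of this abstract can be sketched as a simple predicate over the two input streams. The thresholds are illustrative assumptions.

```python
def is_mtm_gesture(touch_points: int, accel_magnitude: float,
                   touch_thresh: int = 2, accel_thresh: float = 1.5) -> bool:
    """Detect a multi-touch-movement (MTM) gesture: the user is both
    touching the device with multiple fingers and moving the device.
    Threshold values are assumed for demonstration."""
    return touch_points >= touch_thresh and accel_magnitude >= accel_thresh
```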
  • Publication number: 20120306995
    Abstract: A system facilitates managing one or more devices utilized for communicating data within a telepresence session. A telepresence session can be initiated within a communication framework that includes a first user and one or more second users. In response to determining a temporary absence of the first user from the telepresence session, a recordation of the telepresence session is initialized to enable a playback of a portion or a summary of the telepresence session that the first user has missed.
    Type: Application
    Filed: August 13, 2012
    Publication date: December 6, 2012
    Applicant: Microsoft Corporation
    Inventors: Christian Huitema, William A.S. Buxton, Jonathan E. Paff, Zicheng Liu, Rajesh Kutpadi Hegde, Zhengyou Zhang, Kori Marie Quinn, Jin Li, Michel Pahud
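    The absence-triggered recording described above can be sketched with a simplified event model; the class and its event hooks are hypothetical names for illustration.

```python
class TelepresenceRecorder:
    """Record the telepresence session while a participant is absent so
    the missed portion can be played back on return (simplified event
    model; names are assumptions)."""

    def __init__(self):
        self.recording = False
        self.missed = []

    def on_user_absent(self) -> None:
        self.recording = True          # initialize recordation

    def on_event(self, event: str) -> None:
        if self.recording:
            self.missed.append(event)  # capture what the user misses

    def on_user_return(self) -> list:
        self.recording = False
        playback = self.missed[:]      # portion to play back or summarize
        self.missed.clear()
        return playback
```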
  • Patent number: 8325020
    Abstract: Methods and apparatus for uniquely identifying wireless devices in close physical proximity are described. When two wireless devices are brought into close proximity, one of the devices displays an optical indicator, such as a light pattern. This device then sends messages to other devices which are within wireless range to cause them to use any light sensor to detect a signal. In an embodiment, the light sensor is a camera and the detected signal is an image captured by the camera. Each device then sends data identifying what was detected back to the device displaying the pattern. By analyzing this data, the first device can determine which other device detected the indicator that it displayed and therefore determine that this device is within close physical proximity. In an example, the first device is an interactive surface arranged to identify the wireless addresses of devices which are placed on the surface.
    Type: Grant
    Filed: January 31, 2011
    Date of Patent: December 4, 2012
    Assignee: Microsoft Corporation
    Inventors: Shahram Izadi, Malcolm Hall, Stephen Hodges, William A. S. Buxton, David Alexander Butler
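    The matching step of the handshake above, in which the first device compares each nearby device's sensor report against the pattern it displayed, can be sketched as follows. The report format (wireless address mapped to detected pattern) is an assumption for illustration.

```python
# Sketch of the proximity-identification matching step: the surface
# displays an optical pattern, asks wireless devices in range what
# their light sensors detected, and keeps the addresses whose reports
# match. Message/report formats are assumed.

def identify_nearby(displayed_pattern: str, reports: dict) -> list:
    """reports maps wireless address -> detected pattern (or None if
    nothing was detected). Returns addresses in close proximity."""
    return [addr for addr, seen in reports.items()
            if seen == displayed_pattern]
```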
  • Patent number: 8253774
    Abstract: The claimed subject matter provides a system and/or a method that facilitates managing one or more devices utilized for communicating data within a telepresence session. A telepresence session can be initiated within a communication framework that includes two or more virtually represented users that communicate therein. A device can be utilized by at least one virtually represented user that enables communication within the telepresence session, the device includes at least one of an input to transmit a portion of a communication to the telepresence session or an output to receive a portion of a communication from the telepresence session. A detection component can adjust at least one of the input related to the device or the output related to the device based upon the identification of a cue, the cue is at least one of a movement detected, an event detected, or an ambient variation.
    Type: Grant
    Filed: March 30, 2009
    Date of Patent: August 28, 2012
    Assignee: Microsoft Corporation
    Inventors: Christian Huitema, William A. S. Buxton, John E. Paff, Zicheng Liu, Rajesh Kutpadi Hegde, Zhengyou Zhang, Kori Marie Quinn, Jin Li, Michel Pahud
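    The cue-driven input/output adjustment can be sketched as a mapping from an identified cue to device settings. The specific cue names and responses below are assumed examples; the filing names only the cue categories (movement, event, ambient variation).

```python
def adjust_io(cue: str, mic_on: bool = True, speaker_on: bool = True) -> dict:
    """Adjust a telepresence device's input/output based on an
    identified cue. The concrete responses are assumptions."""
    if cue == "movement":             # e.g. user walks away: mute mic
        mic_on = False
    elif cue == "ambient_noise":      # ambient variation: mute mic
        mic_on = False
    elif cue == "event":              # e.g. interrupting event: silence output
        speaker_on = False
    return {"mic_on": mic_on, "speaker_on": speaker_on}
```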
  • Publication number: 20120162093
    Abstract: This document relates to touch screen controls. For instance, the touch screen controls can allow a user to control a computing device by engaging a touch screen associated with the computing device. One implementation can receive at least one tactile contact from a region of a touch screen. This implementation can present a first command functionality on the touch screen proximate the region for a predefined time. It can await user engagement of the first command functionality. Lacking user engagement within the predefined time, the implementation can remove the first command functionality and offer a second command functionality.
    Type: Application
    Filed: December 28, 2010
    Publication date: June 28, 2012
    Applicant: Microsoft Corporation
    Inventors: William A.S. Buxton, Michel Pahud, Kenneth P. Hinckley
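    The timed fallback between the first and second command functionality can be sketched with simulated timestamps rather than a real UI toolkit; the timeout value is an assumption.

```python
def offered_command(elapsed_s: float, engaged: bool,
                    timeout_s: float = 2.0) -> str:
    """Return which command functionality is currently offered: the
    first one is presented near the touched region and, lacking user
    engagement within the timeout, is replaced by the second."""
    if engaged:
        return "first"                 # user engaged in time
    if elapsed_s < timeout_s:
        return "first"                 # still awaiting engagement
    return "second"                    # timed out: offer the fallback
```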
  • Publication number: 20120154255
    Abstract: A computing device is described which includes plural display parts provided on respective plural device parts. The display parts define a display surface which provides interfaces to different tools. The tools, in turn, allow a local participant to engage in an interactive session with one or more remote participants. In one case, the tools include: a shared workspace processing module for providing a shared workspace for use by the participants; an audio-video conferencing module for enabling audio-video communication among the participants; and a reference space module for communicating hand gestures and the like among the participants, etc. In one case, the computing device is implemented as a portable computing device that can be held in a participant's hand during use.
    Type: Application
    Filed: December 17, 2010
    Publication date: June 21, 2012
    Applicant: Microsoft Corporation
    Inventors: Kenneth P. Hinckley, Michel Pahud, William A. S. Buxton
  • Publication number: 20120159401
    Abstract: Workspaces are manipulated on a mobile device having a display screen. A set of two or more discrete workspaces is established. A default discrete workspace is then displayed on the screen, where the default discrete workspace is one of the discrete workspaces in the set. Whenever a user gestures with the mobile device, the gesture is used to select one of the discrete workspaces from the set, and the selected discrete workspace will be displayed on the screen.
    Type: Application
    Filed: December 16, 2010
    Publication date: June 21, 2012
    Applicant: MICROSOFT CORPORATION
    Inventors: Michel Pahud, Ken Hinckley, William A. S. Buxton
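    The gesture-driven selection from a set of discrete workspaces can be sketched as a small switcher; the gesture labels are assumed names for illustration.

```python
class WorkspaceSwitcher:
    """Select a discrete workspace from an established set in response
    to device gestures (gesture names are assumed labels)."""

    def __init__(self, workspaces, default_index: int = 0):
        self.workspaces = list(workspaces)
        self.index = default_index       # default discrete workspace

    def on_gesture(self, gesture: str) -> str:
        if gesture == "tilt_right":
            self.index = (self.index + 1) % len(self.workspaces)
        elif gesture == "tilt_left":
            self.index = (self.index - 1) % len(self.workspaces)
        return self.workspaces[self.index]  # workspace now displayed
```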
  • Patent number: 8171431
    Abstract: The claimed subject matter provides techniques to effectuate and facilitate efficient and flexible selection of display objects. The system can include devices and components that acquire gestures from pointing instrumentalities and thereafter ascertain velocities and proximities in relation to the displayed objects. Based at least upon these ascertained velocities and proximities falling below or within threshold levels, the system displays flags associated with the displayed objects.
    Type: Grant
    Filed: October 5, 2007
    Date of Patent: May 1, 2012
    Assignee: Microsoft Corporation
    Inventors: Tovi Grossman, Patrick M. Baudisch, Kenneth P. Hinckley, William A. S. Buxton, Raman Sarin
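    The threshold test this abstract describes (velocity falling below a level, proximity within a level, triggering display of a flag) can be sketched as a predicate. The threshold values are illustrative assumptions.

```python
def show_flag(velocity: float, proximity: float,
              v_thresh: float = 50.0, p_thresh: float = 30.0) -> bool:
    """Display a flag for a display object when the pointer's velocity
    falls below a threshold and its proximity to the object is within
    a threshold (values are assumed for demonstration)."""
    return velocity < v_thresh and proximity <= p_thresh
```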
  • Publication number: 20120092436
    Abstract: Telepresence of a mobile user (MU) utilizing a mobile device (MD) and remote users who are participating in a telepresence session is optimized. The MD receives video of a first remote user (FRU). Whenever the MU gestures with the MD using a first motion, video of the FRU is displayed. The MD can also receive video and audio of the FRU and a second remote user (SRU), display a workspace, and reproduce the audio of the FRU and SRU in a default manner. Whenever the MU gestures with the MD using the first motion, video of the FRU is displayed and audio of the FRU and SRU is reproduced in a manner that accentuates the FRU. Whenever the MU gestures with the MD using a second motion, video of the SRU is displayed and audio of the FRU and SRU is reproduced in a manner that accentuates the SRU.
    Type: Application
    Filed: October 19, 2010
    Publication date: April 19, 2012
    Applicant: MICROSOFT CORPORATION
    Inventors: Michel Pahud, Ken Hinckley, William A. S. Buxton
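    The gesture-dependent audio accentuation can be sketched as per-user gain selection; the gain values and motion labels are assumptions, since the filing describes the behavior rather than specific levels.

```python
def mix_audio(gesture: str, fru_level: float = 1.0,
              sru_level: float = 1.0) -> dict:
    """Return per-remote-user audio gains: the gestured-toward remote
    user is accentuated and the other attenuated (gain values are
    assumed examples)."""
    if gesture == "first_motion":      # accentuate first remote user
        return {"FRU": 1.0, "SRU": 0.3}
    if gesture == "second_motion":     # accentuate second remote user
        return {"FRU": 0.3, "SRU": 1.0}
    return {"FRU": fru_level, "SRU": sru_level}  # default reproduction
```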
  • Publication number: 20110320114
    Abstract: A map annotation message including map annotation data that represents a route is received by a receiving device. The map annotation data is analyzed to determine a route represented by the map annotation message, and the determined route is presented by the receiving device.
    Type: Application
    Filed: June 28, 2010
    Publication date: December 29, 2011
    Applicant: Microsoft Corporation
    Inventors: William A.S. Buxton, Flora P. Goldthwaite
  • Publication number: 20100322593
    Abstract: One or more embodiments disclosed herein provide methods and systems for indexing video recordings of scenes using artifacts. More specifically, one or more embodiments provide methods that identify artifacts in video recordings and index the video recordings according to those artifacts. These methods and systems also output portions of the video recordings according to the artifact-based indexing.
    Type: Application
    Filed: June 18, 2009
    Publication date: December 23, 2010
    Applicant: Microsoft Corporation
    Inventor: William A.S. Buxton
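    The artifact-based indexing step can be sketched as building an inverted index from artifact labels to frame numbers; the detection of artifacts themselves is outside this sketch, and the frame/label representation is an assumption.

```python
def index_by_artifacts(frames: list) -> dict:
    """Build an artifact -> frame-number index over a video recording,
    where each frame is represented as a set of detected artifact
    labels (detection is assumed to have already happened)."""
    index = {}
    for n, artifacts in enumerate(frames):
        for a in artifacts:
            index.setdefault(a, []).append(n)
    return index  # lookup returns the frames to output for an artifact
```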
  • Publication number: 20100275122
    Abstract: A “Click-Through Controller” uses various mobile electronic devices (e.g., cell phones, media players, digital cameras, etc.) to provide real-time interaction with content (e.g., maps, places, images, documents, etc.) displayed on the device's screen via selection of one or more “overlay menu items” displayed on top of that content. Navigation through displayed contents is provided by recognizing 2D and/or 3D device motions and rotations. This allows users to navigate through the displayed contents by simply moving the mobile device. Overlay menu items activate predefined or user-defined functions to interact with the content that is directly below the selected overlay menu item on the display. In various embodiments, there is a spatial correspondence between the overlay menu items and buttons or keys of the mobile device (e.g., a cell phone dial pad or the like) such that overlay menu items are directly activated by selection of one or more corresponding buttons.
    Type: Application
    Filed: April 27, 2009
    Publication date: October 28, 2010
    Applicant: MICROSOFT CORPORATION
    Inventors: William A. S. Buxton, John SanGiovanni
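    The spatial correspondence between dial-pad buttons and overlay menu items can be sketched as a grid lookup; the particular grid layout is an assumed example, not one from the filing.

```python
# Sketch of the dial-pad-to-overlay correspondence: each phone key maps
# spatially to the overlay menu cell directly above it on the display.
# The grid layout is an assumed example.

DIAL_GRID = {"1": (0, 0), "2": (0, 1), "3": (0, 2),
             "4": (1, 0), "5": (1, 1), "6": (1, 2)}

def overlay_item_for_key(key: str, overlay: list) -> str:
    """Return the overlay menu item activated by pressing a key,
    where overlay is a row-major grid of menu items."""
    row, col = DIAL_GRID[key]
    return overlay[row][col]
```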
  • Publication number: 20100245536
    Abstract: The claimed subject matter provides a system and/or a method that facilitates managing one or more devices utilized for communicating data within a telepresence session. A telepresence session can be initiated within a communication framework that includes two or more virtually represented users that communicate therein. A device can be utilized by at least one virtually represented user that enables communication within the telepresence session, the device includes at least one of an input to transmit a portion of a communication to the telepresence session or an output to receive a portion of a communication from the telepresence session. A detection component can adjust at least one of the input related to the device or the output related to the device based upon the identification of a cue, the cue is at least one of a movement detected, an event detected, or an ambient variation.
    Type: Application
    Filed: March 30, 2009
    Publication date: September 30, 2010
    Applicant: Microsoft Corporation
    Inventors: Christian Huitema, William A.S. Buxton, Jonathan E. Paff, Zicheng Liu, Rajesh Kutpadi Hegde, Zhengyou Zhang, Kori Marie Quinn, Jin Li, Michel Pahud
  • Publication number: 20090094560
    Abstract: The claimed subject matter provides techniques to effectuate and facilitate efficient and flexible selection of display objects. The system can include devices and components that acquire gestures from pointing instrumentalities and thereafter ascertain velocities and proximities in relation to the displayed objects. Based at least upon these ascertained velocities and proximities falling below or within threshold levels, the system displays flags associated with the displayed objects.
    Type: Application
    Filed: October 5, 2007
    Publication date: April 9, 2009
    Applicant: MICROSOFT CORPORATION
    Inventors: Tovi Grossman, Patrick M. Baudisch, Kenneth P. Hinckley, William A.S. Buxton, Raman Sarin
  • Patent number: D577720
    Type: Grant
    Filed: December 8, 2006
    Date of Patent: September 30, 2008
    Assignee: Microsoft Corporation
    Inventors: Robert M. Freed, William A. S. Buxton, Margaret Johnson, Margaret E. Winsor
  • Patent number: D578117
    Type: Grant
    Filed: December 8, 2006
    Date of Patent: October 7, 2008
    Assignee: Microsoft Corporation
    Inventors: Robert M. Freed, William A. S. Buxton, Margaret Johnson, Margaret E. Winsor