Patents by Inventor Kenneth P. Hinckley

Kenneth P. Hinckley has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 10579216
    Abstract: In general, the multi-touch detection implementations described herein use touch detection technologies to provide new and advantageous interactions between users and touch displays using touch, pens, and user-wearable devices (UWDs). These new and advantageous interactions include user-associated mobile menus, combined click-through and radial marking menus, menus to automate and improve drawing or manipulation of content on a display, new menus and methods of selecting objects and text on a display, and new interactions with UWDs and touchscreen displays by using UWDs with gestures. In addition, targeted haptic feedback to the UWD of specific users of a display is enabled. In some multi-touch detection implementations, menus or tools available to act on an object on a display can be ported entirely, or in part, between displays, such as between small and large displays.
    Type: Grant
    Filed: June 29, 2016
    Date of Patent: March 3, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Michel Pahud, Kenneth P. Hinckley, William Buxton
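    Note: As a rough illustration of the user-associated menus described in the abstract above, the sketch below attributes a touch to a UWD wearer and surfaces that user's own menu at the touch point. The names, data shapes, and one-sample correlation step are illustrative assumptions, not the patented implementation, which describes richer kinematic matching.

    ```typescript
    // Hypothetical sketch: attribute a touch to whichever UWD wearer's wrist
    // motion best matches the touch impact, then show that user's menu there.

    interface TouchPoint { x: number; y: number; accel: number } // impact acceleration
    interface UwdSample { userId: string; accel: number }        // wrist accelerometer reading

    function attributeTouch(touch: TouchPoint, samples: UwdSample[]): string | undefined {
      let best: UwdSample | undefined;
      for (const s of samples) {
        if (!best || Math.abs(s.accel - touch.accel) < Math.abs(best.accel - touch.accel)) {
          best = s; // closest wrist-impact magnitude wins (assumed heuristic)
        }
      }
      return best?.userId;
    }

    const menusByUser: Record<string, string[]> = {
      alice: ["pen color", "copy", "paste"],
      bob: ["lasso", "erase"],
    };

    const touch = { x: 120, y: 340, accel: 9.1 };
    const who = attributeTouch(touch, [
      { userId: "alice", accel: 9.0 },
      { userId: "bob", accel: 2.2 },
    ]);
    if (who) console.log(`show ${who}'s menu at (${touch.x}, ${touch.y}):`, menusByUser[who]);
    ```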
  • Patent number: 10558341
    Abstract: The unified system for bimanual interactions provides a lightweight and integrated interface that allows the user to efficiently interact with and manipulate content in the user interface. The system is configured to detect a multi-finger interaction on the touchscreen and to differentiate whether the user intends to pan, zoom or frame a portion of the user interface. Generally, the framing interaction is identified by detection of the user's thumb and forefinger on the touchscreen, which cooperate to define a focus area between vectors extending outwards from the user's thumb and forefinger. Upon a determination that the user intends to interact with or manipulate content within the focus area, the unified system for bimanual interactions provides an indication of the objects that are located within the focus area and contextual menus for interacting with the objects.
    Type: Grant
    Filed: February 20, 2017
    Date of Patent: February 11, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. Hinckley, William Arthur Stewart Buxton, Michel Pahud, Haijun Xia
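    Note: The framing interaction above is geometric at heart: two contacts define a wedge-shaped focus area. Below is a minimal sketch, assuming a simplified model in which the wedge is spanned by rays from a hand-center point through the thumb and forefinger contacts; the patent does not prescribe this particular hit test.

    ```typescript
    type Pt = { x: number; y: number };

    function angle(from: Pt, to: Pt): number {
      return Math.atan2(to.y - from.y, to.x - from.x);
    }

    // Normalize an angle to the range (-PI, PI].
    function norm(a: number): number {
      while (a <= -Math.PI) a += 2 * Math.PI;
      while (a > Math.PI) a -= 2 * Math.PI;
      return a;
    }

    // True if p lies inside the wedge spanned by the hand->thumb and
    // hand->forefinger rays (the smaller of the two possible wedges).
    function inFocusArea(hand: Pt, thumb: Pt, finger: Pt, p: Pt): boolean {
      const a1 = angle(hand, thumb);
      const a2 = angle(hand, finger);
      const span = Math.abs(norm(a2 - a1));
      const offThumb = Math.abs(norm(angle(hand, p) - a1));
      const offFinger = Math.abs(norm(angle(hand, p) - a2));
      return offThumb <= span && offFinger <= span;
    }

    const hand = { x: 0, y: 0 };
    console.log(inFocusArea(hand, { x: 10, y: 0 }, { x: 0, y: 10 }, { x: 5, y: 5 }));  // true
    console.log(inFocusArea(hand, { x: 10, y: 0 }, { x: 0, y: 10 }, { x: -5, y: 5 })); // false
    ```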
  • Patent number: 10416783
    Abstract: Techniques for placing an object at a specific location on a shared device are described. These techniques may include connecting the shared device with an individual device. The individual device may transmit the object to the shared device, where it may be displayed at an initial object position on a display of the shared device. The initial object position may be updated in response to movement of the individual device, and the object may be displayed at the updated object position on the display. The object position may be locked in response to a signal, and the object may be displayed at the locked object position.
    Type: Grant
    Filed: October 17, 2018
    Date of Patent: September 17, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Michel Pahud, Pierre P. N. Greborio, Kenneth P. Hinckley, William A. S. Buxton
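    Note: A hedged sketch of the placement flow this abstract describes: the object lands at an initial position, tracks movement of the individual device, and freezes when a lock signal arrives. The class shape and method names are assumptions for illustration.

    ```typescript
    type Pos = { x: number; y: number };

    class SharedObjectPlacement {
      private locked = false;
      constructor(private pos: Pos) {}          // initial object position

      // Movement of the individual device nudges the object until locked.
      onDeviceMoved(dx: number, dy: number): void {
        if (this.locked) return;
        this.pos = { x: this.pos.x + dx, y: this.pos.y + dy };
      }

      // A signal (e.g., a button release on the personal device) locks position.
      onLockSignal(): void { this.locked = true; }

      position(): Pos { return this.pos; }
    }

    const placement = new SharedObjectPlacement({ x: 100, y: 100 });
    placement.onDeviceMoved(25, -10);
    placement.onLockSignal();
    placement.onDeviceMoved(999, 999);          // ignored after lock
    console.log(placement.position());          // { x: 125, y: 90 }
    ```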
  • Patent number: 10409381
    Abstract: Aspects relate to detecting gestures that correspond to a desired action, wherein the detected gestures are common across users and/or devices within a surface computing environment. Inferred intentions and goals based on context, history, affordances, and objects are employed to interpret gestures. Where there is uncertainty about the intent of gestures for a single device or across multiple devices, the system can communicate that uncertainty, independently or in coordination, or engage users through signaling and/or information gathering.
    Type: Grant
    Filed: August 10, 2015
    Date of Patent: September 10, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Meredith June Morris, Eric J. Horvitz, Andrew David Wilson, F. David Jones, Stephen E. Hodges, Kenneth P. Hinckley, David Alexander Butler, Ian M. Sands, V. Kevin Russ, Hrvoje Benko, Shawn R. LeProwse, Shahram Izadi, William Ben Kunz
  • Patent number: 10289239
    Abstract: A sensing device, such as a user-wearable device (UWD) worn by a user of a touchscreen, may provide kinematic data of the sensing device or UWD and/or identification data of the user to a processor that operates the touchscreen. Such data may allow the processor to perform a number of user-touchscreen interactions, such as displaying user-specific windows or menus, processing user-manipulation of displayed objects, and determining which hand of a user performs a touch event, just to name a few examples.
    Type: Grant
    Filed: December 12, 2016
    Date of Patent: May 14, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Michel Pahud, William Buxton, Kenneth P. Hinckley, Andrew M. Webb, Eyal Ofek
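    Note: One way to read the "which hand performed a touch" determination above is as a correlation between the touch timestamp and motion of the instrumented wrist. The sketch below is illustrative only; the threshold, window, and decision rule are assumed values, not the patented algorithm.

    ```typescript
    // If the worn wrist shows an acceleration spike near the touch time,
    // attribute the touch to the instrumented hand; otherwise to the other hand.

    interface AccelSample { t: number; magnitude: number }

    function touchFromWornHand(
      touchTime: number,
      wrist: AccelSample[],
      windowMs = 50,          // assumed temporal window around the touch
      spikeThreshold = 3.0,   // assumed impact-spike magnitude
    ): boolean {
      return wrist.some(
        (s) => Math.abs(s.t - touchTime) <= windowMs && s.magnitude >= spikeThreshold,
      );
    }

    const samples = [
      { t: 990, magnitude: 0.4 },
      { t: 1002, magnitude: 4.1 },  // impact spike as the finger lands
      { t: 1040, magnitude: 0.6 },
    ];
    console.log(touchFromWornHand(1000, samples) ? "worn hand" : "other hand"); // "worn hand"
    ```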
  • Patent number: 10282086
    Abstract: Techniques involving gestures and other functionality are described. In one or more implementations, the techniques describe gestures that are usable to provide inputs to a computing device. A variety of different gestures are contemplated, including bimodal gestures (e.g., using more than one type of input) and single-modal gestures. Additionally, the gesture techniques may be configured to leverage these different input types to increase the number of gestures that are made available to initiate operations of a computing device.
    Type: Grant
    Filed: June 28, 2016
    Date of Patent: May 7, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. Hinckley, Koji Yatani
  • Patent number: 10268367
    Abstract: Bezel gestures for touch displays are described. In at least some embodiments, the bezel of a device is used to extend functionality that is accessible through the use of so-called bezel gestures. In at least some embodiments, off-screen motion can be used, by virtue of the bezel, to create screen input through a bezel gesture. Bezel gestures can include single-finger bezel gestures, multiple-finger/same-hand bezel gestures, and/or multiple-finger, different-hand bezel gestures.
    Type: Grant
    Filed: June 10, 2016
    Date of Patent: April 23, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. Hinckley, Koji Yatani
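    Note: Below is a minimal sketch of bezel-gesture recognition as the abstract frames it: a stroke that begins in a narrow band along the screen edge and travels inward is treated as a bezel gesture rather than ordinary screen input. The margin width and screen dimensions are assumptions.

    ```typescript
    type Point = { x: number; y: number };

    const SCREEN = { width: 1920, height: 1080 };
    const BEZEL_MARGIN = 12; // px band along each edge treated as "bezel"

    function startsOnBezel(p: Point): boolean {
      return (
        p.x <= BEZEL_MARGIN || p.y <= BEZEL_MARGIN ||
        p.x >= SCREEN.width - BEZEL_MARGIN || p.y >= SCREEN.height - BEZEL_MARGIN
      );
    }

    // Classify a stroke from its first and last sample: off-screen motion that
    // crosses the bezel into the screen becomes a distinct gesture class.
    function classifyStroke(first: Point, last: Point): "bezel-gesture" | "screen-input" {
      const movedInward = !startsOnBezel(last);
      return startsOnBezel(first) && movedInward ? "bezel-gesture" : "screen-input";
    }

    console.log(classifyStroke({ x: 2, y: 500 }, { x: 300, y: 500 }));   // "bezel-gesture"
    console.log(classifyStroke({ x: 400, y: 500 }, { x: 600, y: 500 })); // "screen-input"
    ```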
  • Publication number: 20190050068
    Abstract: Techniques for placing an object at a specific location on a shared device are described. These techniques may include connecting the shared device with an individual device. The individual device may transmit the object to the shared device, where it may be displayed at an initial object position on a display of the shared device. The initial object position may be updated in response to movement of the individual device, and the object may be displayed at the updated object position on the display. The object position may be locked in response to a signal, and the object may be displayed at the locked object position.
    Type: Application
    Filed: October 17, 2018
    Publication date: February 14, 2019
    Inventors: Michel Pahud, Pierre P. N. Greborio, Kenneth P. Hinckley, William A. S. Buxton
  • Patent number: 10198109
    Abstract: A device, including a display and a display surface, and related methods are provided. The device includes logic configured to detect touch events received via the display surface. The display is coupled to a remote computing functionality that is configured to store shared workspace information in a data store associated with the remote computing functionality, to support collaborative tasks performed by participants in an interaction session. The device further includes logic configured to receive touch events and interpret them as a request to drag an object, select a menu option, or display a web page. The device further includes logic configured to receive requests, not via the display surface of the display, but from user devices different from the display. Such requests can include requests to display an object on the display surface or execute an application.
    Type: Grant
    Filed: November 23, 2015
    Date of Patent: February 5, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. Hinckley, Michel Pahud
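    Note: The abstract distinguishes two request paths: touch events arriving via the display surface and requests arriving from separate user devices. The sketch below models that split with two hypothetical request unions and handlers; none of these types come from the patent.

    ```typescript
    type TouchRequest =
      | { kind: "drag"; objectId: string; dx: number; dy: number }
      | { kind: "menu"; option: string }
      | { kind: "web"; url: string };

    type RemoteRequest =
      | { kind: "show-object"; objectId: string }
      | { kind: "run-app"; appId: string };

    // Path 1: touch events detected on the display surface itself.
    function handleTouch(req: TouchRequest): string {
      switch (req.kind) {
        case "drag": return `drag ${req.objectId} by (${req.dx}, ${req.dy})`;
        case "menu": return `select menu option "${req.option}"`;
        case "web": return `open ${req.url}`;
      }
    }

    // Path 2: requests from user devices other than the display.
    function handleRemote(req: RemoteRequest): string {
      return req.kind === "show-object"
        ? `display ${req.objectId} on the shared surface`
        : `launch ${req.appId}`;
    }

    console.log(handleTouch({ kind: "drag", objectId: "note-7", dx: 40, dy: 0 }));
    console.log(handleRemote({ kind: "show-object", objectId: "note-7" }));
    ```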
  • Patent number: 10191940
    Abstract: The claimed subject matter provides a system and/or a method that facilitates in situ searching of data. An interface can receive a flick gesture from an input device. An in situ search component can initiate an in situ search triggered by the flick gesture, wherein the search is executed on at least a portion of the data selected via the input device.
    Type: Grant
    Filed: December 16, 2014
    Date of Patent: January 29, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventor: Kenneth P. Hinckley
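    Note: As a rough sketch of the flick-triggered in situ search above, the code below treats a short, fast stroke over a selection as the trigger and "searches" in place. The velocity threshold and function names are assumptions.

    ```typescript
    interface Stroke { dx: number; dy: number; durationMs: number }

    // A flick is a brief, high-velocity stroke (assumed thresholds).
    function isFlick(s: Stroke, minSpeedPxPerMs = 1.5, maxDurationMs = 200): boolean {
      const dist = Math.hypot(s.dx, s.dy);
      return s.durationMs <= maxDurationMs && dist / s.durationMs >= minSpeedPxPerMs;
    }

    function onPenStroke(stroke: Stroke, selectedText: string | null): void {
      if (selectedText && isFlick(stroke)) {
        // In situ: results would appear beside the selection, not in a new app.
        console.log(`search in place for: "${selectedText}"`);
      }
    }

    onPenStroke({ dx: 180, dy: 20, durationMs: 90 }, "haptic feedback"); // triggers search
    onPenStroke({ dx: 15, dy: 5, durationMs: 400 }, "haptic feedback");  // too slow: ignored
    ```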
  • Patent number: 10162511
    Abstract: Systems and/or methods are provided that facilitate revealing assistance information associated with a user interface. An interface can obtain input information related to interactions between the interface and a user. In addition, the interface can output assistance information in situ with the user interface. Further, a decision component determines the in situ assistance information output by the interface, based at least in part on the obtained input information.
    Type: Grant
    Filed: June 1, 2012
    Date of Patent: December 25, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. Hinckley, Shengdong Zhao, Edward B. Cutrell, Raman K. Sarin, Patrick M. Baudisch, Darryl Yust
  • Patent number: 10139925
    Abstract: Techniques for placing an object at a specific location on a shared device are described. These techniques may include connecting the shared device with an individual device. The individual device may transmit the object to the shared device, where it may be displayed at an initial object position on a display of the shared device. The initial object position may be updated in response to movement of the individual device, and the object may be displayed at the updated object position on the display. The object position may be locked in response to a signal, and the object may be displayed at the locked object position.
    Type: Grant
    Filed: March 14, 2013
    Date of Patent: November 27, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Michel Pahud, Pierre P. N. Greborio, Kenneth P. Hinckley, William A. S. Buxton
  • Publication number: 20180300056
    Abstract: An apparatus includes a keyboard engine that operates a keyboard that accepts shape-writing input and radial entry input. A keyboard input module obtains input data from at least one input sensor of the keyboard. An intention disambiguation engine enables simultaneous use of shape-writing input and radial entry input by a user of the keyboard.
    Type: Application
    Filed: February 23, 2018
    Publication date: October 18, 2018
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: William A. S. Buxton, Richard L. Hughes, Kenneth P. Hinckley, Michel Pahud, Irina Spiridonova
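    Note: One plausible disambiguation rule for the keyboard described above: a short stroke that stays near its starting key reads as radial (marking-menu style) entry, while a longer path across several keys reads as shape-writing. The thresholds below are assumptions, not the engine's actual logic.

    ```typescript
    interface KeyStroke { pathLengthPx: number; keysTouched: number }

    function disambiguate(s: KeyStroke): "radial-entry" | "shape-writing" {
      const SHORT_PATH = 60; // px; assumed radius for a radial flick
      return s.pathLengthPx <= SHORT_PATH && s.keysTouched <= 1
        ? "radial-entry"
        : "shape-writing";
    }

    console.log(disambiguate({ pathLengthPx: 35, keysTouched: 1 }));  // "radial-entry"
    console.log(disambiguate({ pathLengthPx: 420, keysTouched: 7 })); // "shape-writing"
    ```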
  • Publication number: 20180239519
    Abstract: The unified system for bimanual interactions provides a lightweight and integrated interface that allows the user to efficiently interact with and manipulate content in the user interface. The system is configured to detect a multi-finger interaction on the touchscreen and to differentiate whether the user intends to pan, zoom or frame a portion of the user interface. Generally, the framing interaction is identified by detection of the user's thumb and forefinger on the touchscreen, which cooperate to define a focus area between vectors extending outwards from the user's thumb and forefinger. Upon a determination that the user intends to interact with or manipulate content within the focus area, the unified system for bimanual interactions provides an indication of the objects that are located within the focus area and contextual menus for interacting with the objects.
    Type: Application
    Filed: February 20, 2017
    Publication date: August 23, 2018
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. Hinckley, Michel Pahud, William Arthur Stewart Buxton, Haijun Xia
  • Publication number: 20180239482
    Abstract: Thumb+pen inputs are described herein to improve the functionality of touch-enabled devices for accepting bimanual input in situations where the device is gripped or supported by one of the user's hands, leaving only one hand free. The thumb of the engaging hand is identified, and controls are placed within its range of motion to enhance the functionality provided by the free hand. The actions of the thumb can be used to determine how pen actions made using the other hand are interpreted. Alternatively, the pen can indicate an object through pointing, while the thumb indirectly manipulates one or more of its parameters through touch controls. Marking menus, spring-loaded modes, indirect input, and conventional multi-touch interfaces are applied with respect to the bimanual input mode in which one hand holds or supports the device and the other hand is free, improving device operability and accessibility.
    Type: Application
    Filed: February 20, 2017
    Publication date: August 23, 2018
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. Hinckley, Michel Pahud, William Arthur Stewart Buxton, Ken Pfeuffer
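    Note: The thumb+pen pattern above can be sketched as a spring-loaded mode: the thumb of the gripping hand holds a mode while the pen, in the free hand, acts under it, and the mode reverts when the thumb lifts. Mode names and handlers below are illustrative assumptions.

    ```typescript
    type ThumbMode = "ink" | "lasso-select" | "highlight";

    let mode: ThumbMode = "ink";                 // default when the thumb is lifted

    function onThumbDown(selected: ThumbMode): void { mode = selected; }
    function onThumbUp(): void { mode = "ink"; } // spring-loaded: reverts on release

    // The thumb's current mode determines how pen strokes are interpreted.
    function onPenStroke(points: Array<[number, number]>): string {
      switch (mode) {
        case "ink": return `draw ${points.length}-point ink stroke`;
        case "lasso-select": return `select objects inside ${points.length}-point lasso`;
        case "highlight": return `highlight under ${points.length}-point stroke`;
      }
    }

    onThumbDown("lasso-select");                 // thumb presses the lasso control
    console.log(onPenStroke([[0, 0], [10, 4], [20, 9]])); // interpreted as a lasso
    onThumbUp();
    console.log(onPenStroke([[0, 0], [5, 5]]));           // back to plain inking
    ```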
  • Publication number: 20180239520
    Abstract: The unified system for bimanual interactions provides a lightweight and integrated interface that allows the user to efficiently interact with and manipulate content in the user interface. The system is configured to detect a multi-finger interaction on the touchscreen and to differentiate whether the user intends to pan, zoom or frame a portion of the user interface. Generally, the framing interaction is identified by detection of the user's thumb and forefinger on the touchscreen, which cooperate to define a focus area between vectors extending outwards from the user's thumb and forefinger. Upon a determination that the user intends to interact with or manipulate content within the focus area, the unified system for bimanual interactions provides an indication of the objects that are located within the focus area and contextual menus for interacting with the objects.
    Type: Application
    Filed: February 20, 2017
    Publication date: August 23, 2018
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. Hinckley, William Arthur Stewart Buxton, Michel Pahud, Haijun Xia
  • Publication number: 20180239509
    Abstract: Use of pre-interaction context associated with gesture and touch interactions is provided. A user interface is configured to receive an interaction, and a touch or gesture input may be received from a user via a touchscreen. Prior to receiving an interaction on the user interface from the user, pre-interaction context is detected. Based on the pre-interaction context, the user's interaction with the user interface is interpreted and carried out on the user interface.
    Type: Application
    Filed: February 20, 2017
    Publication date: August 23, 2018
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. Hinckley, Michel Pahud, William Arthur Stewart Buxton, Seongkook Heo
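    Note: A minimal sketch of pre-interaction context as described above: what the system senses just before the finger lands (here, whether the pen is hovering nearby, or how fast the finger approached) changes how the touch is interpreted. The rule and field names are assumptions.

    ```typescript
    interface PreContext { penHovering: boolean; approachSpeed: number }

    function interpretTouch(ctx: PreContext): string {
      if (ctx.penHovering) return "touch anchors the canvas for the pen";
      if (ctx.approachSpeed > 2.0) return "fast approach: treat as flick/scroll";
      return "ordinary tap";
    }

    console.log(interpretTouch({ penHovering: true, approachSpeed: 0.5 }));
    console.log(interpretTouch({ penHovering: false, approachSpeed: 3.2 }));
    ```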
  • Publication number: 20180225021
    Abstract: Bezel gestures for touch displays are described. In at least some embodiments, the bezel of a device is used to extend functionality that is accessible through the use of so-called bezel gestures. In at least some embodiments, off-screen motion can be used, by virtue of the bezel, to create screen input through a bezel gesture. Bezel gestures can include single-finger bezel gestures, multiple-finger/same-hand bezel gestures, and/or multiple-finger, different-hand bezel gestures.
    Type: Application
    Filed: March 29, 2018
    Publication date: August 9, 2018
    Inventors: Kenneth P. Hinckley, Koji Yatani
  • Patent number: 10042512
    Abstract: The present invention provides a three-dimensional user interface for a computer system that allows a user to combine and store a group of windows as a task. The image of each task can be positioned within a three-dimensional environment such that the user may utilize spatial memory in order to remember where a particular task is located.
    Type: Grant
    Filed: September 10, 2014
    Date of Patent: August 7, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: George G. Robertson, Mary P. Czerwinski, Kenneth P. Hinckley, Kirsten C. Risden, Daniel C. Robbins, Maarten R. van Dantzich
  • Patent number: 10044790
    Abstract: A unique system and method are provided that facilitate extending input/output capabilities for resource-deficient mobile devices and interactions between multiple heterogeneous devices. The system and method involve an interactive surface to which the desired mobile devices can be connected. The interactive surface can provide an enhanced display space and customization controls for mobile devices that lack adequate displays and input capabilities. In addition, the interactive surface can be employed to permit communication and interaction between multiple mobile devices that otherwise are unable to interact with each other. When connected to the interactive surface, the mobile devices can share information, view information from their respective devices, and store information to the interactive surface. Furthermore, upon re-connection to the surface, the interactive surface can resume the activity states of mobile devices that were previously communicating.
    Type: Grant
    Filed: May 17, 2011
    Date of Patent: August 7, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. Hinckley, Andrew D. Wilson
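    Note: The "resume activity states upon re-connection" behavior above can be sketched as a per-device session store on the surface: state is saved when a device disconnects and restored when it docks again. The store shape and API below are illustrative assumptions.

    ```typescript
    interface ActivityState { openDocument: string; scrollPos: number }

    class InteractiveSurface {
      private saved = new Map<string, ActivityState>();

      // Remember where the device left off when it leaves the surface.
      onDisconnect(deviceId: string, state: ActivityState): void {
        this.saved.set(deviceId, state);
      }

      // Restore the prior session, if any, when the device reconnects.
      onConnect(deviceId: string): ActivityState | undefined {
        return this.saved.get(deviceId);
      }
    }

    const surface = new InteractiveSurface();
    surface.onDisconnect("phone-A", { openDocument: "notes.txt", scrollPos: 120 });
    console.log(surface.onConnect("phone-A")); // { openDocument: "notes.txt", scrollPos: 120 }
    ```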