Patents by Inventor Kenneth P. Hinckley
Kenneth P. Hinckley has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 10579216
Abstract: In general, the multi-touch detection implementations described herein use touch detection technologies to provide new and advantageous interactions between users and touch displays using touch, pens, and user-wearable devices (UWDs). These new and advantageous interactions include user-associated mobile menus, combined click-through and radial marking menus, menus to automate and improve drawing or manipulation of content on a display, new menus and methods of selecting objects and text on a display, and new interactions with UWDs and touchscreen displays by using UWDs with gestures. In addition, targeted haptic feedback to the UWD of specific users of a display is enabled. In some multi-touch detection implementations, menus or tools available to act on objects on a display can be ported, entirely or in part, between displays, such as between small and large displays.
Type: Grant
Filed: June 29, 2016
Date of Patent: March 3, 2020
Assignee: Microsoft Technology Licensing, LLC
Inventors: Michel Pahud, Kenneth P. Hinckley, William Buxton
-
Patent number: 10558341
Abstract: The unified system for bimanual interactions provides a lightweight and integrated interface that allows the user to efficiently interact with and manipulate content in the user interface. The system is configured to detect a multi-finger interaction on the touchscreen and to differentiate whether the user intends to pan, zoom, or frame a portion of the user interface. Generally, the framing interaction is identified by detection of the user's thumb and forefinger on the touchscreen, which cooperate to define a focus area between vectors extending outwards from the user's thumb and forefinger. Upon a determination that the user intends to interact with or manipulate content within the focus area, the unified system for bimanual interactions provides an indication of the objects that are located within the focus area and contextual menus for interacting with the objects.
Type: Grant
Filed: February 20, 2017
Date of Patent: February 11, 2020
Assignee: Microsoft Technology Licensing, LLC
Inventors: Kenneth P. Hinckley, William Arthur Stewart Buxton, Michel Pahud, Haijun Xia
-
Patent number: 10416783
Abstract: Techniques for displaying an object provided to a shared device at a specific location. These techniques may include connecting the shared device with an individual device. The individual device may transmit the object to the shared device, where it is displayed at an initial object position on a display of the shared device. The initial object position may be updated in response to movement of the individual device, and the object may be displayed at the updated object position on the display. The object position may be locked in response to a signal, and the object may be displayed at the locked object position.
Type: Grant
Filed: October 17, 2018
Date of Patent: September 17, 2019
Assignee: Microsoft Technology Licensing, LLC
Inventors: Michel Pahud, Pierre P. N. Greborio, Kenneth P. Hinckley, William A. S. Buxton
-
Patent number: 10409381
Abstract: Aspects relate to detecting gestures that relate to a desired action, wherein the detected gestures are common across users and/or devices within a surface computing environment. Inferred intentions and goals based on context, history, affordances, and objects are employed to interpret gestures. Where there is uncertainty in the intention of the gestures for a single device or across multiple devices, independent or coordinated communication of uncertainty, or engagement of users through signaling and/or information gathering, can occur.
Type: Grant
Filed: August 10, 2015
Date of Patent: September 10, 2019
Assignee: Microsoft Technology Licensing, LLC
Inventors: Meredith June Morris, Eric J. Horvitz, Andrew David Wilson, F. David Jones, Stephen E. Hodges, Kenneth P. Hinckley, David Alexander Butler, Ian M. Sands, V. Kevin Russ, Hrvoje Benko, Shawn R. LeProwse, Shahram Izadi, William Ben Kunz
-
Patent number: 10289239
Abstract: A sensing device, such as a user-wearable device (UWD) worn by a user of a touchscreen, may provide kinematic data of the sensing device or UWD and/or identification data of the user to a processor that operates the touchscreen. Such data may allow the processor to perform a number of user-touchscreen interactions, such as displaying user-specific windows or menus, processing user-manipulation of displayed objects, and determining which hand of a user performs a touch event, just to name a few examples.
Type: Grant
Filed: December 12, 2016
Date of Patent: May 14, 2019
Assignee: Microsoft Technology Licensing, LLC
Inventors: Michel Pahud, William Buxton, Kenneth P. Hinckley, Andrew M. Webb, Eyal Ofek
-
Patent number: 10282086
Abstract: Techniques involving gestures and other functionality are described. In one or more implementations, the techniques describe gestures that are usable to provide inputs to a computing device. A variety of different gestures are contemplated, including bimodal gestures (e.g., using more than one type of input) and single modal gestures. Additionally, the gesture techniques may be configured to leverage these different input types to increase the amount of gestures that are made available to initiate operations of a computing device.
Type: Grant
Filed: June 28, 2016
Date of Patent: May 7, 2019
Assignee: Microsoft Technology Licensing, LLC
Inventors: Kenneth P. Hinckley, Koji Yatani
-
Patent number: 10268367
Abstract: Bezel gestures for touch displays are described. In at least some embodiments, the bezel of a device is used to extend functionality that is accessible through the use of so-called bezel gestures. In at least some embodiments, off-screen motion can be used, by virtue of the bezel, to create screen input through a bezel gesture. Bezel gestures can include single-finger bezel gestures, multiple-finger/same-hand bezel gestures, and/or multiple-finger, different-hand bezel gestures.
Type: Grant
Filed: June 10, 2016
Date of Patent: April 23, 2019
Assignee: Microsoft Technology Licensing, LLC
Inventors: Kenneth P. Hinckley, Koji Yatani
-
Publication number: 20190050068
Abstract: Techniques for displaying an object provided to a shared device at a specific location. These techniques may include connecting the shared device with an individual device. The individual device may transmit the object to the shared device, where it is displayed at an initial object position on a display of the shared device. The initial object position may be updated in response to movement of the individual device, and the object may be displayed at the updated object position on the display. The object position may be locked in response to a signal, and the object may be displayed at the locked object position.
Type: Application
Filed: October 17, 2018
Publication date: February 14, 2019
Inventors: Michel Pahud, Pierre P. N. Greborio, Kenneth P. Hinckley, William A. S. Buxton
-
Patent number: 10198109
Abstract: A device and related methods, including a display and a display surface, are provided. The device includes logic configured to detect touch events received via the display surface. The display is coupled to a remote computing functionality that is configured to store shared work space information in a data store associated with the remote computing functionality to support collaborative tasks performed by participants in an interaction session. The device further includes logic configured to receive touch events and interpret the touch events as a request to drag an object, select a menu option, or display a web page. The device further includes logic configured to receive requests, not via the display surface of the display, but from user devices different from the display. Such requests can include requests to display an object on the display surface or execute an application.
Type: Grant
Filed: November 23, 2015
Date of Patent: February 5, 2019
Assignee: Microsoft Technology Licensing, LLC
Inventors: Kenneth P. Hinckley, Michel Pahud
-
Patent number: 10191940
Abstract: The claimed subject matter provides a system and/or a method that facilitates in situ searching of data. An interface can receive a flick gesture from an input device. An in situ search component can employ an in situ search triggered by the flick gesture, wherein the in situ search is executed on at least a portion of the data selected on the input device.
Type: Grant
Filed: December 16, 2014
Date of Patent: January 29, 2019
Assignee: Microsoft Technology Licensing, LLC
Inventor: Kenneth P. Hinckley
-
Patent number: 10162511
Abstract: Systems and/or methods are provided that facilitate revealing assistance information associated with a user interface. An interface can obtain input information related to interactions between the interface and a user. In addition, the interface can output assistance information in situ with the user interface. Further, a decision component determines the in situ assistance information output by the interface based at least in part on the obtained input information.
Type: Grant
Filed: June 1, 2012
Date of Patent: December 25, 2018
Assignee: Microsoft Technology Licensing, LLC
Inventors: Kenneth P. Hinckley, Shengdong Zhao, Edward B. Cutrell, Raman K. Sarin, Patrick M. Baudisch, Darryl Yust
-
Patent number: 10139925
Abstract: Techniques for displaying an object provided to a shared device at a specific location. These techniques may include connecting the shared device with an individual device. The individual device may transmit the object to the shared device, where it is displayed at an initial object position on a display of the shared device. The initial object position may be updated in response to movement of the individual device, and the object may be displayed at the updated object position on the display. The object position may be locked in response to a signal, and the object may be displayed at the locked object position.
Type: Grant
Filed: March 14, 2013
Date of Patent: November 27, 2018
Assignee: Microsoft Technology Licensing, LLC
Inventors: Michel Pahud, Pierre P. N. Greborio, Kenneth P. Hinckley, William A. S. Buxton
-
Publication number: 20180300056
Abstract: An apparatus includes a keyboard engine that operates a keyboard that accepts shape-writing input and radial entry input. A keyboard input module obtains input data from at least one input sensor of the keyboard. An intention disambiguation engine enables simultaneous use of the shape-writing input and the radial entry input for a user of the keyboard.
Type: Application
Filed: February 23, 2018
Publication date: October 18, 2018
Applicant: Microsoft Technology Licensing, LLC
Inventors: William A. S. Buxton, Richard L. Hughes, Kenneth P. Hinckley, Michel Pahud, Irina Spiridonova
-
Publication number: 20180239519
Abstract: The unified system for bimanual interactions provides a lightweight and integrated interface that allows the user to efficiently interact with and manipulate content in the user interface. The system is configured to detect a multi-finger interaction on the touchscreen and to differentiate whether the user intends to pan, zoom, or frame a portion of the user interface. Generally, the framing interaction is identified by detection of the user's thumb and forefinger on the touchscreen, which cooperate to define a focus area between vectors extending outwards from the user's thumb and forefinger. Upon a determination that the user intends to interact with or manipulate content within the focus area, the unified system for bimanual interactions provides an indication of the objects that are located within the focus area and contextual menus for interacting with the objects.
Type: Application
Filed: February 20, 2017
Publication date: August 23, 2018
Applicant: Microsoft Technology Licensing, LLC
Inventors: Kenneth P. Hinckley, Michel Pahud, William Arthur Stewart Buxton, Haijun Xia
-
Publication number: 20180239482
Abstract: Thumb+pen inputs are described herein, to improve the functionality of touch enabled devices for accepting bimanual input in situations where the device is gripped or supported by one of the user's hands, leaving only one hand free. The thumb of an engaging hand is identified and controls are placed within its range of motion to enhance the functionality provided by the free hand. The actions of the thumb can be used to determine how pen actions made using the other hand are interpreted. Alternatively, the pen can indicate an object through pointing, while the thumb indirectly manipulates one or more of its parameters through touch controls. Marking menus, spring-loaded modes, indirect input, and conventional multi-touch interfaces are applied with respect to the bimanual input mode in which one hand is positioned to hold or support the device, and the other hand is free to improve device operability and accessibility.
Type: Application
Filed: February 20, 2017
Publication date: August 23, 2018
Applicant: Microsoft Technology Licensing, LLC
Inventors: Kenneth P. Hinckley, Michel Pahud, William Arthur Stewart Buxton, Ken Pfeuffer
-
Publication number: 20180239520
Abstract: The unified system for bimanual interactions provides a lightweight and integrated interface that allows the user to efficiently interact with and manipulate content in the user interface. The system is configured to detect a multi-finger interaction on the touchscreen and to differentiate whether the user intends to pan, zoom, or frame a portion of the user interface. Generally, the framing interaction is identified by detection of the user's thumb and forefinger on the touchscreen, which cooperate to define a focus area between vectors extending outwards from the user's thumb and forefinger. Upon a determination that the user intends to interact with or manipulate content within the focus area, the unified system for bimanual interactions provides an indication of the objects that are located within the focus area and contextual menus for interacting with the objects.
Type: Application
Filed: February 20, 2017
Publication date: August 23, 2018
Applicant: Microsoft Technology Licensing, LLC
Inventors: Kenneth P. Hinckley, William Arthur Stewart Buxton, Michel Pahud, Haijun Xia
-
Publication number: 20180239509
Abstract: Use of pre-interaction context associated with gesture and touch interactions is provided. A user interface is configured to receive an interaction. A touch or gesture input may be received from a user via a touchscreen. Prior to receiving an interaction on the user interface from the user, pre-interaction context is detected. Based on the pre-interaction context, the user's interaction with the user interface is interpreted and the interaction is provided on the user interface.
Type: Application
Filed: February 20, 2017
Publication date: August 23, 2018
Applicant: Microsoft Technology Licensing, LLC
Inventors: Kenneth P. Hinckley, Michel Pahud, William Arthur Stewart Buxton, Seongkook Heo
-
Publication number: 20180225021
Abstract: Bezel gestures for touch displays are described. In at least some embodiments, the bezel of a device is used to extend functionality that is accessible through the use of so-called bezel gestures. In at least some embodiments, off-screen motion can be used, by virtue of the bezel, to create screen input through a bezel gesture. Bezel gestures can include single-finger bezel gestures, multiple-finger/same-hand bezel gestures, and/or multiple-finger, different-hand bezel gestures.
Type: Application
Filed: March 29, 2018
Publication date: August 9, 2018
Inventors: Kenneth P. Hinckley, Koji Yatani
-
Patent number: 10042512
Abstract: The present invention provides a three-dimensional user interface for a computer system that allows a user to combine and store a group of windows as a task. The image of each task can be positioned within a three-dimensional environment such that the user may utilize spatial memory in order to remember where a particular task is located.
Type: Grant
Filed: September 10, 2014
Date of Patent: August 7, 2018
Assignee: Microsoft Technology Licensing, LLC
Inventors: George G. Robertson, Mary P. Czerwinski, Kenneth P. Hinckley, Kirsten C. Risden, Daniel C. Robbins, Maarten R. van Dantzich
-
Patent number: 10044790
Abstract: A system and method are provided that facilitate extending input/output capabilities for resource-deficient mobile devices and interactions between multiple heterogeneous devices. The system and method involve an interactive surface to which the desired mobile devices can be connected. The interactive surface can provide an enhanced display space and customization controls for mobile devices that lack adequate displays and input capabilities. In addition, the interactive surface can be employed to permit communication and interaction between multiple mobile devices that otherwise are unable to interact with each other. When connected to the interactive surface, the mobile devices can share information, view information from their respective devices, and store information to the interactive surface. Furthermore, the interactive surface can resume activity states of mobile devices that were previously communicating upon re-connection to the surface.
Type: Grant
Filed: May 17, 2011
Date of Patent: August 7, 2018
Assignee: Microsoft Technology Licensing, LLC
Inventors: Kenneth P. Hinckley, Andrew D. Wilson