Patents by Inventor Kenneth P. Hinckley

Kenneth P. Hinckley has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 9965165
    Abstract: Bezel gestures for touch displays are described. In at least some embodiments, the bezel of a device is used to extend functionality that is accessible through the use of so-called bezel gestures. In at least some embodiments, off-screen motion can be used, by virtue of the bezel, to create screen input through a bezel gesture. Bezel gestures can include single-finger bezel gestures, multiple-finger/same-hand bezel gestures, and/or multiple-finger, different-hand bezel gestures.
    Type: Grant
    Filed: February 19, 2010
    Date of Patent: May 8, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. Hinckley, Koji Yatani
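    A minimal sketch of the idea, not the patented method: a stroke whose first sample lands in a thin band at the screen edge and then moves inward is treated as having entered from the bezel. The screen size, band width, and stroke format below are assumptions for illustration.
    ```python
    # Hedged sketch of an edge-origin heuristic for bezel gestures (all constants assumed).
    SCREEN_W, SCREEN_H = 1920, 1080
    EDGE_BAND = 10  # px; touches born inside this band are assumed to enter from the bezel

    def edge_of_entry(x, y):
        """Return which bezel edge a touch plausibly entered from, or None."""
        if x <= EDGE_BAND:
            return "left"
        if x >= SCREEN_W - EDGE_BAND:
            return "right"
        if y <= EDGE_BAND:
            return "top"
        if y >= SCREEN_H - EDGE_BAND:
            return "bottom"
        return None

    def classify_stroke(samples):
        """samples: list of (x, y) touch points in time order."""
        if len(samples) < 2:
            return "tap"
        edge = edge_of_entry(*samples[0])
        if edge is None:
            return "regular-stroke"
        # Require inward motion so a thumb resting on the edge is not a gesture.
        dx = samples[-1][0] - samples[0][0]
        dy = samples[-1][1] - samples[0][1]
        inward = {"left": dx > 0, "right": dx < 0, "top": dy > 0, "bottom": dy < 0}[edge]
        return f"bezel-gesture-from-{edge}" if inward else "regular-stroke"

    print(classify_stroke([(2, 540), (60, 540), (140, 540)]))  # bezel-gesture-from-left
    ```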
  • Patent number: 9940016
    Abstract: An apparatus includes a keyboard engine that operates a keyboard that accepts shape-writing input and radial entry input. A keyboard input module obtains input data from at least one input sensor of the keyboard. An intention disambiguation engine enables simultaneous use of the shape-writing input and the radial entry input for a user of the keyboard.
    Type: Grant
    Filed: September 13, 2014
    Date of Patent: April 10, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: William A. S. Buxton, Richard L. Hughes, Kenneth P. Hinckley, Michel Pahud, Irina Spiridonova
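    One plausible disambiguation rule, sketched under assumptions: a short flick is read as a radial mark and a longer trace across keys as shape-writing. The 150-pixel threshold and stroke format are invented for illustration, not taken from the patent.
    ```python
    # Hedged sketch: route a stroke to radial entry or the shape-writing recognizer.
    import math

    RADIAL_MAX_PATH = 150.0   # assumed: radial marks are short flicks

    def path_length(samples):
        return sum(math.dist(a, b) for a, b in zip(samples, samples[1:]))

    def disambiguate(samples):
        """Return ('radial', direction-index 0-7) or ('shape-write', None)."""
        if path_length(samples) <= RADIAL_MAX_PATH:
            dx = samples[-1][0] - samples[0][0]
            dy = samples[-1][1] - samples[0][1]
            angle = math.atan2(dy, dx) % (2 * math.pi)
            octant = int((angle + math.pi / 8) / (math.pi / 4)) % 8
            return ("radial", octant)
        return ("shape-write", None)  # hand the stroke to the shape-writing recognizer

    print(disambiguate([(0, 0), (40, 5), (80, 10)]))                 # ('radial', 0)
    print(disambiguate([(0, 0), (120, 80), (260, 40), (400, 120)]))  # ('shape-write', None)
    ```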
  • Publication number: 20180004386
    Abstract: Various technologies described herein pertain to utilizing sensed pre-touch interaction to control a mobile computing device. A pre-touch interaction of a user with the mobile computing device is detected. The pre-touch interaction includes a grip of the user on the mobile computing device and/or a hover of one or more fingers of the user with respect to a touchscreen of the mobile computing device. The finger(s) of the user can be within proximity but not touching the touchscreen as part of the hover. Parameter(s) of the pre-touch interaction of the user with the mobile computing device are identified, and a touch of the user on the touchscreen of the mobile computing device is detected. A computing operation is executed responsive to the touch, where the computing operation is based on the touch and the parameter(s) of the pre-touch interaction of the user with the mobile computing device.
    Type: Application
    Filed: June 30, 2016
    Publication date: January 4, 2018
    Inventors: Kenneth P. Hinckley, Michel Pahud, Hrvoje Benko, William Arthur Stewart Buxton, Seongkook Heo
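    A minimal sketch of the idea with invented parameters: state sensed before contact (hover duration, grip) changes what the eventual touch does. The PreTouch fields and thresholds are assumptions, not the patent's parameter set.
    ```python
    # Hedged sketch: pre-touch parameters alter how the subsequent touch is handled.
    from dataclasses import dataclass

    @dataclass
    class PreTouch:
        hover_ms: float      # how long the finger hovered before contact
        grip: str            # assumed values: "one-handed", "two-handed"

    def on_touch(x, y, pre: PreTouch):
        # A slow, deliberate approach gets a precision magnifier; a quick jab
        # is treated as an ordinary tap. Thresholds are illustrative.
        if pre.hover_ms > 500:
            return f"show-magnifier-at({x},{y})"
        if pre.grip == "one-handed":
            return f"thumb-reachable-menu-at({x},{y})"
        return f"plain-tap-at({x},{y})"

    print(on_touch(300, 700, PreTouch(hover_ms=620, grip="two-handed")))
    ```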
  • Patent number: 9857970
    Abstract: Techniques involving gestures and other functionality are described. In one or more implementations, the techniques describe gestures that are usable to provide inputs to a computing device. A variety of different gestures are contemplated, including bimodal gestures (e.g., using more than one type of input) and single modal gestures. Additionally, the gesture techniques may be configured to leverage these different input types to increase the number of gestures that are made available to initiate operations of a computing device.
    Type: Grant
    Filed: July 28, 2016
    Date of Patent: January 2, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. Hinckley, Koji Yatani, Michel Pahud
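    A toy sketch of one bimodal gesture of the kind described, under an assumed event model: a touch that holds an object changes what a pen stroke on that object means.
    ```python
    # Hedged sketch: touch holds an object; a pen stroke on it then copies
    # instead of drawing. The event names and "copy" mapping are illustrative.
    held_object = None

    def on_touch_down(obj):
        global held_object
        held_object = obj            # the non-dominant hand "holds" the object

    def on_touch_up(obj):
        global held_object
        held_object = None

    def on_pen_stroke(obj):
        # Pen alone draws; pen on a touch-held object copies it (illustrative).
        if held_object == obj:
            return f"copy({obj})"
        return f"draw-on({obj})"

    on_touch_down("photo-1")
    print(on_pen_stroke("photo-1"))  # copy(photo-1)
    on_touch_up("photo-1")
    print(on_pen_stroke("photo-1"))  # draw-on(photo-1)
    ```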
  • Publication number: 20170277367
    Abstract: In general, the multi-touch detection implementations described herein use touch detection technologies to provide new and advantageous interactions between users and touch displays using touch, pens and user-wearable devices (UWDs). These new and advantageous interactions include user-associated mobile menus, combined click-through and radial marking menus, menus to automate and improve drawing or manipulation of content on a display, new menus and methods of selecting objects and text on a display, and new interactions with UWDs and touchscreen displays by using UWDs with gestures. In addition, targeted haptic feedback to the UWD of specific users of a display is enabled. In some multi-touch detection implementations, menus or tools available to act on objects on a display can be ported entirely, or in part, between displays, such as between small and large displays.
    Type: Application
    Filed: June 29, 2016
    Publication date: September 28, 2017
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Michel Pahud, Kenneth P. Hinckley, William Buxton
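    A minimal sketch of one assumed association step: attribute a touch to the wearer of the user-wearable device (UWD) whose wrist motion coincides with the touch, so user-specific menus can follow whoever touches the shared display. The correlation rule is illustrative, not the patented technique.
    ```python
    # Hedged sketch: attribute a touch to a UWD wearer by temporal correlation.
    user_menus = {"alice": ["brush", "eraser"], "bob": ["select", "lasso"]}

    def owner_of_touch(touch_time, uwd_motion_events):
        # Assumed correlation: the UWD whose wrist-motion spike is closest in
        # time to the touch is taken to be the toucher.
        return min(uwd_motion_events, key=lambda e: abs(e["time"] - touch_time))["user"]

    events = [{"user": "alice", "time": 1.02}, {"user": "bob", "time": 3.40}]
    user = owner_of_touch(touch_time=1.00, uwd_motion_events=events)
    print(f"show menu for {user}: {user_menus[user]}")  # alice's menu
    ```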
  • Patent number: 9774653
    Abstract: The subject disclosure is directed towards co-located collaboration/data sharing that is based upon detecting the proxemics of people and/or the proxemics of devices. A federation of devices is established based upon proxemics, such as when the users have entered into a formation based upon distance between them and orientation. User devices may share content with other devices in the federation based upon micro-mobility actions performed on the devices, e.g., tilting and/or otherwise interacting with a sending device.
    Type: Grant
    Filed: March 4, 2016
    Date of Patent: September 26, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. Hinckley, Nicolai Marquardt
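    A rough sketch of a proxemics test of the kind described, with assumed sensing: federate two devices when their users are close together and roughly facing each other. Positions, headings, and thresholds are invented for illustration.
    ```python
    # Hedged sketch: distance-plus-orientation check for establishing a federation.
    import math

    def facing(theta_a, theta_b, tol=math.radians(45)):
        """True if headings (radians) point roughly toward each other."""
        return abs(abs(theta_a - theta_b) - math.pi) <= tol

    def should_federate(pos_a, pos_b, heading_a, heading_b, max_dist=1.5):
        # Users within ~1.5 m of each other, oriented into a shared formation.
        return math.dist(pos_a, pos_b) <= max_dist and facing(heading_a, heading_b)

    # Two users ~1 m apart, facing each other:
    print(should_federate((0, 0), (1.0, 0), 0.0, math.pi))  # True
    ```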
  • Publication number: 20170269793
    Abstract: The description relates to a shared digital workspace. One example includes a display device and sensors. The sensors are configured to detect users proximate the display device and to detect that an individual user is performing an individual user command relative to the display device. The system also includes a graphical user interface configured to be presented on the display device that allows multiple detected users to simultaneously interact with the graphical user interface via user commands.
    Type: Application
    Filed: June 6, 2017
    Publication date: September 21, 2017
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Desney S. Tan, Kenneth P. Hinckley, Steven N. Bathiche, Ronald O. Pessner, Bongshin Lee, Anoop Gupta, Amir Netz, Brett D. Brewer
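    A minimal sketch, under an assumed structure, of routing simultaneous commands on one shared display into per-user sessions so that detected users can act at once without interfering with each other:
    ```python
    # Hedged sketch: per-user sessions on a shared graphical user interface.
    class SharedWorkspace:
        def __init__(self):
            self.sessions = {}  # user id -> that user's state on the shared GUI

        def user_detected(self, user):
            self.sessions.setdefault(user, {"selection": None})

        def command(self, user, action, target):
            # Each command carries the identity of the user who issued it, so
            # two users acting simultaneously never clobber each other's state.
            s = self.sessions[user]
            if action == "select":
                s["selection"] = target
            return f"{user}: {action} {target}"

    ws = SharedWorkspace()
    ws.user_detected("alice")
    ws.user_detected("bob")
    print(ws.command("alice", "select", "chart"))
    print(ws.command("bob", "select", "photo"))
    ```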
  • Patent number: 9740361
    Abstract: The description relates to a shared digital workspace. One example includes a display device and sensors. The sensors are configured to detect users proximate the display device and to detect that an individual user is performing an individual user command relative to the display device. The system also includes a graphical user interface configured to be presented on the display device that allows multiple detected users to simultaneously interact with the graphical user interface via user commands.
    Type: Grant
    Filed: February 7, 2014
    Date of Patent: August 22, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Desney S. Tan, Kenneth P. Hinckley, Steven N. Bathiche, Ronald O. Pessner, Bongshin Lee, Anoop Gupta, Amir Netz, Brett D. Brewer
  • Patent number: 9720559
    Abstract: The description relates to a shared digital workspace. One example includes a display device and sensors. The sensors are configured to detect users proximate the display device and to detect that an individual user is performing an individual user command relative to the display device. The system also includes a graphical user interface configured to be presented on the display device that allows multiple detected users to simultaneously interact with the graphical user interface via user commands.
    Type: Grant
    Filed: February 7, 2014
    Date of Patent: August 1, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Desney S. Tan, Kenneth P. Hinckley, Steven N. Bathiche, Ronald O. Pessner, Bongshin Lee, Anoop Gupta, Amir Netz, Brett D. Brewer
  • Publication number: 20170115782
    Abstract: By correlating user grip information with micro-mobility events, electronic devices can provide support for a broad range of interactions and contextually-dependent techniques. Such correlation allows electronic devices to better identify device usage contexts, and in turn provide a more responsive and helpful user experience, especially in the context of reading and task performance. To allow for accurate and efficient device usage context identification, a model may be used to make device usage context determinations based on the correlated gesture and micro-mobility data. Once a context, device usage context, or gesture is identified, an action can be taken on one or more electronic devices.
    Type: Application
    Filed: October 23, 2015
    Publication date: April 27, 2017
    Inventors: Kenneth P. Hinckley, Hrvoje Benko, Michel Pahud, Dongwook Yoon
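    A rule-based stand-in for the learned model, with assumed features and thresholds: mapping correlated grip and micro-mobility signals to a device usage context.
    ```python
    # Hedged sketch: rules standing in for the model that maps grip +
    # micro-mobility to a usage context. Feature names and cutoffs are assumed.
    def usage_context(grip, tilt_toward_other_user, motion_energy):
        """grip: 'two-handed' | 'one-handed' | 'none'; the others are floats."""
        if grip == "two-handed" and motion_energy < 0.1:
            return "immersive-reading"
        if tilt_toward_other_user > 0.5:
            return "showing-screen-to-partner"   # a micro-mobility event
        if grip == "none":
            return "set-down"
        return "casual-use"

    print(usage_context("two-handed", 0.0, 0.05))  # immersive-reading
    ```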
  • Publication number: 20170090666
    Abstract: A sensing device, such as a user-wearable device (UWD) worn by a user of a touchscreen, may provide kinematic data of the sensing device and/or identification data of the user to a processor that operates the touchscreen. Such data may allow the processor to perform a number of user-touchscreen interactions, such as displaying user-specific windows or menus, processing user-manipulation of displayed objects, and determining which hand of a user performs a touch event, just to name a few examples.
    Type: Application
    Filed: December 12, 2016
    Publication date: March 30, 2017
    Inventors: Michel Pahud, William Buxton, Kenneth P. Hinckley, Andrew M. Webb, Eyal Ofek
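    A minimal sketch of one assumed correlation rule for the handedness example named in the abstract: if the instrumented wrist accelerates at touch time, the touch is attributed to that hand. The window and threshold values are illustrative.
    ```python
    # Hedged sketch: which hand touched, inferred from a wrist-acceleration spike.
    def touching_hand(touch_time, uwd_accel, worn_on="left", window=0.05, thresh=1.5):
        """uwd_accel: list of (time, |acceleration| in g) from the wearable."""
        spike = any(abs(t - touch_time) <= window and a >= thresh for t, a in uwd_accel)
        if spike:
            return worn_on                      # the instrumented wrist moved
        return "right" if worn_on == "left" else "left"

    samples = [(0.98, 0.2), (1.00, 2.1), (1.02, 0.4)]
    print(touching_hand(1.00, samples))  # left
    ```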
  • Publication number: 20170075549
    Abstract: Techniques involving gestures and other functionality are described. In one or more implementations, the techniques describe gestures that are usable to provide inputs to a computing device. A variety of different gestures are contemplated, including bimodal gestures (e.g., using more than one type of input) and single modal gestures. Additionally, the gesture techniques may be configured to leverage these different input types to increase the number of gestures that are made available to initiate operations of a computing device.
    Type: Application
    Filed: November 7, 2016
    Publication date: March 16, 2017
    Inventors: Kenneth P. Hinckley, Koji Yatani, Andrew S. Allen, Jonathan R. Harris, Georg F. Petschnigg
  • Publication number: 20170068387
    Abstract: An electronic device may include a touch screen electronic display configured to offset and/or shift the contact locations of touch implements and/or displayed content based on one or more calculated parallax values. The parallax values may be associated with the viewing angle of an operator relative to the display of the electronic device. In various embodiments, the parallax value(s) may be calculated using three-dimensional location sensors, an angle of inclination of a touch implement, and/or one or more displayed calibration objects. Parallax values may be utilized to remap contact locations by a touch implement, shift and/or offset displayed content, and/or perform other transformations as described herein. Stereoscopically displayed content may be offset such that a default display plane is coplanar with a touch surface rather than a display surface. Contacts by a finger may be remapped using portions of the contact region and/or a centroid of the contact region.
    Type: Application
    Filed: August 2, 2016
    Publication date: March 9, 2017
    Inventors: Steven Bathiche, Jesse R. Cheatham, III, Paul H. Dietz, Matthew G. Dyor, Philip A. Eckhoff, Anoop Gupta, Kenneth P. Hinckley, Roderick A. Hyde, Muriel Y. Ishikawa, Jordin T. Kare, Craig J. Mundie, Nathan P. Myhrvold, Andreas G. Nowatzyk, Robert C. Petroski, Danny Allen Reed, Clarence T. Tegreene, Charles Whitmer, Victoria Y.H. Wood, Lowell L. Wood, Jr.
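    A sketch of the parallax geometry only, with assumed dimensions: when the eye is off-axis by angle theta, a contact on the cover glass sits roughly t*tan(theta) away from the pixel beneath it, where t is the gap between the touch surface and the display plane.
    ```python
    # Hedged sketch: remap a contact by the t*tan(theta) parallax displacement.
    # The 1.5 mm gap and 30-degree viewing angle are assumed example values.
    import math

    def parallax_offset_mm(gap_mm, view_angle_deg):
        return gap_mm * math.tan(math.radians(view_angle_deg))

    def remap_contact(x_mm, y_mm, gap_mm, angle_x_deg, angle_y_deg):
        # Shift the reported contact back toward the pixel the user is aiming at.
        return (x_mm - parallax_offset_mm(gap_mm, angle_x_deg),
                y_mm - parallax_offset_mm(gap_mm, angle_y_deg))

    print(round(parallax_offset_mm(1.5, 30), 2))    # ~0.87 mm of apparent shift
    print(remap_contact(100.0, 60.0, 1.5, 30, 0))
    ```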
  • Patent number: 9582076
    Abstract: The description relates to a smart ring. In one example, the smart ring can be configured to be worn on a first segment of a finger of a user. The example smart ring can include at least one flexion sensor secured to the smart ring in a manner that can detect a distance between the at least one flexion sensor and a second segment of the finger. The example smart ring can also include an input component configured to analyze signals from the at least one flexion sensor to detect a pose of the finger.
    Type: Grant
    Filed: September 17, 2014
    Date of Patent: February 28, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Wolf Kienzle, Kenneth P. Hinckley
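    A minimal sketch with invented thresholds: as the finger curls, its second segment rotates toward the ring and the sensed distance shrinks, so distance bands can stand in for pose classes.
    ```python
    # Hedged sketch: finger pose from a ring-mounted flexion sensor's distance
    # reading to the finger's second segment. Thresholds are illustrative.
    def finger_pose(distance_mm):
        if distance_mm < 5.0:
            return "curled"
        if distance_mm < 12.0:
            return "half-bent"
        return "extended"

    for d in (3.0, 8.0, 20.0):
        print(d, "->", finger_pose(d))
    ```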
  • Publication number: 20170052699
    Abstract: Techniques involving gestures and other functionality are described. In one or more implementations, the techniques describe gestures that are usable to provide inputs to a computing device. A variety of different gestures are contemplated, including bimodal gestures (e.g., using more than one type of input) and single modal gestures. Additionally, the gesture techniques may be configured to leverage these different input types to increase the number of gestures that are made available to initiate operations of a computing device.
    Type: Application
    Filed: November 7, 2016
    Publication date: February 23, 2017
    Inventors: Kenneth P. Hinckley, Koji Yatani, Andrew S. Allen, Jonathan R. Harris, Georg F. Petschnigg
  • Publication number: 20170038965
    Abstract: Techniques involving gestures and other functionality are described. In one or more implementations, the techniques describe gestures that are usable to provide inputs to a computing device. A variety of different gestures are contemplated, including bimodal gestures (e.g., using more than one type of input) and single modal gestures. Additionally, the gesture techniques may be configured to leverage these different input types to increase the number of gestures that are made available to initiate operations of a computing device.
    Type: Application
    Filed: October 19, 2016
    Publication date: February 9, 2017
    Inventors: Kenneth P. Hinckley, Koji Yatani, Andrew S. Allen, Jonathan R. Harris, Georg F. Petschnigg
  • Publication number: 20170010848
    Abstract: In embodiments of multi-device pairing and combined display, mobile devices have device touchscreens, and housing bezels of the mobile devices can be positioned proximate each other to form a combined display from the respective device touchscreens. An input recognition system can recognize an input to initiate pairing the mobile devices, enabling a cross-screen display of an interface on the respective device touchscreens, such as with different parts of the interface displayed on different ones of the device touchscreens. The input to initiate pairing the mobile devices can be recognized as the proximity of the mobile devices, as a gesture input, as a sensor input, and/or as a voice-activated input to initiate the pairing of the mobile devices for the cross-screen display of the interface.
    Type: Application
    Filed: September 26, 2016
    Publication date: January 12, 2017
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. Hinckley, Koji Yatani
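    A toy sketch under an assumed proximity signal: pair two phones placed edge to edge and split one interface across both touchscreens. The gap measurement and the left/right split are illustrative stand-ins for the recognized pairing inputs.
    ```python
    # Hedged sketch: proximity-triggered pairing into a combined cross-screen display.
    def maybe_pair(dev_a, dev_b, gap_mm, threshold_mm=5.0):
        if gap_mm > threshold_mm:
            return None
        # Assign each device half of the combined canvas.
        return {dev_a: "left-half", dev_b: "right-half"}

    layout = maybe_pair("phone-A", "phone-B", gap_mm=2.0)
    print(layout)  # {'phone-A': 'left-half', 'phone-B': 'right-half'}
    ```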
  • Publication number: 20170010695
    Abstract: A user-wearable device (UWD) worn by a user of a touchscreen may provide kinematic data of the UWD and/or identification data of the user to a processor that operates the touchscreen. Such data may allow the processor to perform a number of user-touchscreen interactions, such as displaying user-specific windows or menus, processing user-manipulation of displayed objects, and determining which hand of a user performs a touch event, just to name a few examples.
    Type: Application
    Filed: December 8, 2015
    Publication date: January 12, 2017
    Inventors: Michel Pahud, Kenneth P. Hinckley, William Buxton, Eyal Ofek, Andrew M. Webb
  • Publication number: 20170010733
    Abstract: A user-wearable device (UWD) worn by a user of a touchscreen may provide kinematic data of the UWD and/or identification data of the user to a processor that operates the touchscreen. Such data may allow the processor to perform a number of user-touchscreen interactions, such as displaying user-specific windows or menus, processing user-manipulation of displayed objects, and determining which hand of a user performs a touch event, just to name a few examples.
    Type: Application
    Filed: December 8, 2015
    Publication date: January 12, 2017
    Inventors: Michel Pahud, Kenneth P. Hinckley, William Buxton, Eyal Ofek, Andrew M. Webb
  • Publication number: 20170011681
    Abstract: A display system includes a display, a content component, a focus region component, and a refresh rate component. The display is configured to selectively display information with refresh rates that vary across a plurality of display regions of the display screen. The content component is configured to receive content for display on the display screen and to provide the content to the display. The focus region component is configured to determine a focus region of a user in relation to the display screen. The focus region includes one of the plurality of display regions at which a user is likely looking. The refresh rate component is configured to select the refresh rates of the display elements in the plurality of display regions. A refresh rate in the focus region may be different than a refresh rate in one or more other display regions of the plurality of display regions.
    Type: Application
    Filed: September 12, 2016
    Publication date: January 12, 2017
    Inventors: Steven Bathiche, Jesse R. Cheatham, III, Paul H. Dietz, Matthew G. Dyor, Philip A. Eckhoff, Anoop Gupta, Kenneth P. Hinckley, Roderick A. Hyde, Muriel Y. Ishikawa, Jordin T. Kare, Craig J. Mundie, Nathan P. Myhrvold, Andreas G. Nowatzyk, Robert C. Petroski, Danny Allen Reed, Clarence T. Tegreene, Charles Whitmer, Lowell L. Wood, Jr., Victoria Y.H. Wood
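    A minimal sketch with an assumed region grid and gaze estimate: drive the region the user is looking at at a high refresh rate and the others at a lower one. The two-region layout and rates are illustrative.
    ```python
    # Hedged sketch: per-region refresh rates selected from an estimated focus region.
    def refresh_plan(regions, gaze_xy, high_hz=120, low_hz=30):
        """regions: dict name -> (x0, y0, x1, y1); gaze_xy: estimated gaze point."""
        gx, gy = gaze_xy
        plan = {}
        for name, (x0, y0, x1, y1) in regions.items():
            focused = x0 <= gx < x1 and y0 <= gy < y1
            plan[name] = high_hz if focused else low_hz
        return plan

    regions = {"left": (0, 0, 960, 1080), "right": (960, 0, 1920, 1080)}
    print(refresh_plan(regions, gaze_xy=(1400, 500)))  # {'left': 30, 'right': 120}
    ```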