Patents by Inventor Kenneth P. Hinckley

Kenneth P. Hinckley has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20160077734
    Abstract: An apparatus includes a keyboard engine that operates a keyboard that accepts shape-writing input and radial entry input. A keyboard input module obtains input data from at least one input sensor of the keyboard. An intention disambiguation engine enables simultaneous use of shape-writing input and radial entry input by a user of the keyboard.
    Type: Application
    Filed: September 13, 2014
    Publication date: March 17, 2016
    Inventors: William A. S. Buxton, Richard L. Hughes, Kenneth P. Hinckley, Michel Pahud, Irina Spiridonova
  • Publication number: 20160062467
    Abstract: This document relates to touch screen controls. For instance, the touch screen controls can allow a user to control a computing device by engaging a touch screen associated with the computing device. One implementation can receive at least one tactile contact from a region of a touch screen. This implementation can present a first command functionality on the touch screen proximate the region for a predefined time. It can await user engagement of the first command functionality. Lacking user engagement within the predefined time, the implementation can remove the first command functionality and offer a second command functionality.
    Type: Application
    Filed: October 27, 2015
    Publication date: March 3, 2016
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: William A.S. BUXTON, Michel PAHUD, Kenneth P. HINCKLEY
  • Patent number: 9274682
    Abstract: Bezel gestures for touch displays are described. In at least some embodiments, the bezel of a device is used to extend functionality that is accessible through the use of so-called bezel gestures. In at least some embodiments, off-screen motion can be used, by virtue of the bezel, to create screen input through a bezel gesture. Bezel gestures can include single-finger bezel gestures, multiple-finger/same-hand bezel gestures, and/or multiple-finger, different-hand bezel gestures.
    Type: Grant
    Filed: February 19, 2010
    Date of Patent: March 1, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. Hinckley, Koji Yatani
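The bezel-gesture idea above can be illustrated with a minimal sketch: treat a touch stroke as a bezel gesture when it begins in a sensing band just outside the active screen area and crosses onto the screen. The dimensions, band width, and function names here are assumptions for illustration, not the patented implementation.

```python
# Illustrative sketch (assumed values, not the patented method): a stroke is
# classified as a bezel gesture when it starts in the off-screen bezel band
# and ends within the active display area.

SCREEN_W, SCREEN_H = 1920, 1080   # hypothetical display size in pixels
BEZEL_BAND = 20                   # hypothetical bezel sensing width in pixels

def in_bezel(x, y):
    """True if the point lies in the bezel band just outside a screen edge."""
    return (-BEZEL_BAND <= x < 0 or SCREEN_W <= x < SCREEN_W + BEZEL_BAND or
            -BEZEL_BAND <= y < 0 or SCREEN_H <= y < SCREEN_H + BEZEL_BAND)

def is_bezel_gesture(stroke):
    """Classify a stroke (list of (x, y) points) as a bezel gesture."""
    (x0, y0), (xn, yn) = stroke[0], stroke[-1]
    ends_on_screen = 0 <= xn < SCREEN_W and 0 <= yn < SCREEN_H
    return in_bezel(x0, y0) and ends_on_screen
```

Multi-finger variants would apply the same start-in-bezel test per contact point before combining them.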
  • Patent number: 9223471
    Abstract: This document relates to touch screen controls. For instance, the touch screen controls can allow a user to control a computing device by engaging a touch screen associated with the computing device. One implementation can receive at least one tactile contact from a region of a touch screen. This implementation can present a first command functionality on the touch screen proximate the region for a predefined time. It can await user engagement of the first command functionality. Lacking user engagement within the predefined time, the implementation can remove the first command functionality and offer a second command functionality.
    Type: Grant
    Filed: December 28, 2010
    Date of Patent: December 29, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: William A. S. Buxton, Michel Pahud, Kenneth P. Hinckley
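The timed command-fallback behavior this abstract describes can be sketched as a small state machine: show a primary command near the touch region, and if the user does not engage it within a predefined time, replace it with a secondary command. Class and command names are assumptions for illustration only.

```python
# Hedged sketch of the timeout-driven command swap described in the abstract.
# The names CommandFallback, "first_command", and "second_command" are
# illustrative assumptions, not the patent's implementation.

class CommandFallback:
    def __init__(self, timeout_s=1.5):
        self.timeout_s = timeout_s   # predefined display time for the first command
        self.active = None           # currently offered command, if any
        self.shown_at = None         # timestamp when the first command appeared

    def on_touch(self, now):
        """A tactile contact presents the first command near the touch region."""
        self.active, self.shown_at = "first_command", now

    def tick(self, now):
        """Swap to the fallback command once the timeout elapses unengaged."""
        if self.active == "first_command" and now - self.shown_at >= self.timeout_s:
            self.active = "second_command"
        return self.active
```

A real UI would drive `tick` from its event loop and cancel the timer on user engagement.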
  • Publication number: 20150346837
    Abstract: Aspects relate to detecting gestures that relate to a desired action, wherein the detected gestures are common across users and/or devices within a surface computing environment. Inferred intentions and goals based on context, history, affordances, and objects are employed to interpret gestures. Where there is uncertainty in intention of the gestures for a single device or across multiple devices, independent or coordinated communication of uncertainty or engagement of users through signaling and/or information gathering can occur.
    Type: Application
    Filed: August 10, 2015
    Publication date: December 3, 2015
    Inventors: Meredith June Morris, Eric J. Horvitz, Andrew David Wilson, F. David Jones, Stephen E. Hodges, Kenneth P. Hinckley, David Alexander Butler, Ian M. Sands, V. Kevin Russ, Hrvoje Benko, Shawn R. LeProwse, Shahram Izadi, William Ben Kunz
  • Patent number: 9201539
    Abstract: A computing device includes a fingerprint detection module for detecting fingerprint information that may be contained within touch input event(s) provided by a touch input mechanism. The computing device can leverage the fingerprint information in various ways. In one approach, the computing device can use the fingerprint information to enhance an interpretation of the touch input event(s), such as by rejecting parts of the touch input event(s) associated with an unintended input action. In another approach, the computing device can use the fingerprint information to identify an individual associated with the fingerprint information. The computing device can apply this insight to provide a customized user experience to that individual, such as by displaying content that is targeted to that individual.
    Type: Grant
    Filed: December 17, 2010
    Date of Patent: December 1, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. Hinckley, Michel Pahud
  • Patent number: 9189069
    Abstract: At least one tilt sensor generates a sensor value. A context information server receives the sensor value and sets at least one context attribute. An application uses at least one context attribute to determine that a flinging gesture has been made and to change an image on a display in response to the flinging gesture.
    Type: Grant
    Filed: July 1, 2011
    Date of Patent: November 17, 2015
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventor: Kenneth P. Hinckley
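The tilt-driven "flinging" gesture above can be sketched by thresholding angular velocity derived from successive tilt samples. The sample rate and threshold values below are illustrative assumptions, not figures from the patent.

```python
# Hedged sketch: infer a fling when the tilt angle changes faster than a
# threshold angular velocity. dt and threshold_deg_per_s are assumed values.

def detect_fling(tilt_samples, dt=0.02, threshold_deg_per_s=120.0):
    """Return +1/-1 for a fling in the positive/negative tilt direction,
    0 if no fling is detected, from a sequence of tilt angles in degrees."""
    for prev, cur in zip(tilt_samples, tilt_samples[1:]):
        rate = (cur - prev) / dt          # angular velocity estimate, deg/s
        if rate > threshold_deg_per_s:
            return 1
        if rate < -threshold_deg_per_s:
            return -1
    return 0
```

An application layer would map the returned direction to, e.g., paging an image forward or backward.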
  • Patent number: 9179021
    Abstract: Photos are shared among devices that are in close proximity to one another and for which there is a connection among the devices. The photos can be shared automatically, or alternatively based on various user inputs. Various different controls can also be placed on sharing photos to restrict the other devices with which photos can be shared, the manner in which photos can be shared, and/or how the photos are shared.
    Type: Grant
    Filed: April 25, 2012
    Date of Patent: November 3, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Stephen G. Latta, Kenneth P. Hinckley, Kevin Geisner, Steven Nabil Bathiche, Hrvoje Benko, Vivek Pradeep
  • Patent number: 9134798
    Abstract: Aspects relate to detecting gestures that relate to a desired action, wherein the detected gestures are common across users and/or devices within a surface computing environment. Inferred intentions and goals based on context, history, affordances, and objects are employed to interpret gestures. Where there is uncertainty in intention of the gestures for a single device or across multiple devices, independent or coordinated communication of uncertainty or engagement of users through signaling and/or information gathering can occur.
    Type: Grant
    Filed: December 15, 2008
    Date of Patent: September 15, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Meredith June Morris, Eric J. Horvitz, Andrew David Wilson, F. David Jones, Stephen E. Hodges, Kenneth P. Hinckley, David Alexander Butler, Ian M. Sands, V. Kevin Russ, Hrvoje Benko, Shawn R. LeProwse, Shahram Izadi, William Ben Kunz
  • Patent number: 9134760
    Abstract: An orientation of a device is detected based on a signal from at least one orientation sensor in the device. In response to the detected orientation, the device is placed in a full power mode.
    Type: Grant
    Filed: July 1, 2011
    Date of Patent: September 15, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventor: Kenneth P. Hinckley
  • Patent number: 9075522
    Abstract: Embodiments of a multi-screen bookmark hold gesture are described. In various embodiments, a hold input is recognized at a first screen of a multi-screen system, and the hold input is recognized when held in place proximate an edge of a journal page that is displayed on the first screen. A motion input is recognized at a second screen of the multi-screen system while the hold input remains held in place. A bookmark hold gesture can then be determined from the recognized hold and motion inputs, and the bookmark hold gesture is effective to bookmark the journal page at a location of the hold input on the first screen.
    Type: Grant
    Filed: February 25, 2010
    Date of Patent: July 7, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. Hinckley, Koji Yatani
  • Publication number: 20150153890
    Abstract: Described embodiments include an apparatus and a method. In an apparatus, a touch tracking circuit detects a segment of a path defined by a user contact point moving across a touch sensitive display. A motion analysis circuit determines a parameter descriptive of a motion of the user contact point during its movement across the detected segment of the path (hereafter “motion parameter”). A filter predicts in response to the motion parameter a next contiguous segment of the path defined by the user contact point moving across the touch sensitive display. A compensation circuit initiates a display by the touch sensitive display of the detected segment of the path and the predicted next segment of the path. An updating circuit initiates an update of the detected segment of the path and the predicted next contiguous segment of the path as the user contact point moves across the touch sensitive display.
    Type: Application
    Filed: December 3, 2013
    Publication date: June 4, 2015
    Inventors: Steven Bathiche, Jesse R. Cheatham, III, Paul H. Dietz, Matthew G. Dyor, Philip A. Eckhoff, Anoop Gupta, Kenneth P. Hinckley, Roderick A. Hyde, Muriel Y. Ishikawa, Jordin T. Kare, Craig J. Mundie, Nathan P. Myhrvold, Andreas G. Nowatzyk, Robert C. Petroski, Danny A. Reed, Clarence T. Tegreene, Charles Whitmer, Lowell L. Wood, JR., Victoria Y. H. Wood
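The path-prediction idea in this entry (and the two related applications below) can be sketched with the simplest possible motion model: extrapolate the next segment from the contact point's last observed velocity. The real filter could be Kalman-style or adaptively learned, as the later abstracts suggest; this constant-velocity version is only an illustration.

```python
# Minimal sketch of latency compensation by path prediction, assuming a
# constant-velocity motion model. This is an illustrative stand-in for the
# "filter" the abstract describes, not the actual claimed method.

def predict_next_segment(points, horizon=3):
    """Extrapolate `horizon` future (x, y) points from the last two
    observed contact points, using their difference as the velocity."""
    (x0, y0), (x1, y1) = points[-2], points[-1]
    vx, vy = x1 - x0, y1 - y0              # per-sample velocity estimate
    return [(x1 + vx * k, y1 + vy * k) for k in range(1, horizon + 1)]
```

A compensation stage would draw the detected segment plus this predicted tail, then update both as new samples arrive.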
  • Publication number: 20150153898
    Abstract: Described embodiments include an apparatus and a method. In an apparatus, a tracking circuit detects a segment of a path defined by a user contact point moving across a touch sensitive display. An analysis circuit determines a parameter descriptive of a motion of the user contact point during the detected segment. A selection circuit selects a time-interval forecasted to improve a correspondence between a predicted next segment of the path and a subsequently detected next segment of the path. A filter predicts in response to the motion parameter and the selected time-interval a next segment of the path. A compensation circuit initiates a display of the detected segment of the path and the predicted next segment of the path. An updating circuit initiates an update of the detected segment of the path and the predicted next segment of the path as the user contact point moves across the display.
    Type: Application
    Filed: December 3, 2013
    Publication date: June 4, 2015
    Inventors: Steven Bathiche, Jesse R. Cheatham, III, Paul H. Dietz, Matthew G. Dyor, Philip A. Eckhoff, Anoop Gupta, Kenneth P. Hinckley, Roderick A. Hyde, Muriel Y. Ishikawa, Jordin T. Kare, Craig J. Mundie, Nathan P. Myhrvold, Andreas G. Nowatzyk, Robert C. Petroski, Danny A. Reed, Clarence T. Tegreene, Charles Whitmer, Lowell L. Wood, JR., Victoria Y. H. Wood
  • Publication number: 20150153855
    Abstract: Described embodiments include an apparatus and a method. In an apparatus, a tracking circuit detects a segment of a path defined by a user contact point moving across a touch sensitive display. A filter predicts a next contiguous segment of the path defined by the user contact point in response to an adaptively learned motion parameter. The adaptively learned motion parameter is based on at least two previous instances of the determined motion parameters respectively descriptive of a motion of a user contact point during its movement across the touch sensitive display. A compensation circuit initiates a display by the touch sensitive display of the detected segment of the path and the predicted next contiguous segment of the path. An updating circuit updates the detected segment of the path and the predicted next contiguous segment of the path as the user contact point moves across the touch sensitive display.
    Type: Application
    Filed: December 3, 2013
    Publication date: June 4, 2015
    Inventors: Steven Bathiche, Jesse R. Cheatham, III, Paul H. Dietz, Matthew G. Dyor, Philip A. Eckhoff, Anoop Gupta, Kenneth P. Hinckley, Roderick A. Hyde, Muriel Y. Ishikawa, Jordin T. Kare, Craig J. Mundie, Nathan P. Myhrvold, Andreas G. Nowatzyk, Robert C. Petroski, Danny A. Reed, Clarence T. Tegreene, Charles Whitmer, Lowell L. Wood, JR., Victoria Y. H. Wood
  • Patent number: 9047002
    Abstract: An electronic device may include a touch screen electronic display configured to offset and/or shift the contact locations of touch implements and/or displayed content based on one or more calculated parallax values. The parallax values may be associated with the viewing angle of an operator relative to the display of the electronic device. In various embodiments, the parallax value(s) may be calculated using three-dimensional location sensors, an angle of inclination of a touch implement, and/or one or more displayed calibration objects. Parallax values may be utilized to remap contact locations by a touch implement, shift and/or offset displayed content, and/or perform other transformations as described herein. A stereoscopically displayed content may be offset such that a default display plane is coplanar with a touch surface rather than a display surface. Contacts by a finger may be remapped using portions of the contact region and/or a centroid of the contact region.
    Type: Grant
    Filed: March 15, 2013
    Date of Patent: June 2, 2015
    Assignee: ELWHA LLC
    Inventors: Steven Bathiche, Jesse R. Cheatham, III, Paul H. Dietz, Matthew G. Dyor, Philip A. Eckhoff, Anoop Gupta, Kenneth P. Hinckley, Roderick A. Hyde, Muriel Y. Ishikawa, Jordin T. Kare, Craig J. Mundie, Nathan P. Myhrvold, Andreas G. Nowatzyk, Robert C. Petroski, Danny Allen Reed, Clarence T. Tegreene, Charles Whitmer, Victoria Y. H. Wood, Lowell L. Wood, Jr.
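The parallax compensation described above can be sketched with a simple geometric model: for display glass of thickness t viewed at angle theta from the surface normal, the perceived touch location shifts by roughly t·tan(theta), so the contact point can be remapped by that offset. The model and all values are assumptions for illustration, not the patent's method.

```python
import math

# Hedged sketch of parallax remapping under a thin-glass geometric model
# (an assumption; the patent's calculation is not disclosed in the abstract).

def parallax_offset(glass_thickness_mm, view_angle_deg):
    """Approximate offset (mm) between the touched point and the pixel the
    viewer perceives beneath it, for a viewing angle off the surface normal."""
    return glass_thickness_mm * math.tan(math.radians(view_angle_deg))

def remap_contact(x_mm, glass_thickness_mm, view_angle_deg):
    """Shift a 1-D contact coordinate toward the pixel the viewer is aiming at."""
    return x_mm - parallax_offset(glass_thickness_mm, view_angle_deg)
```

In practice the viewing angle would come from three-dimensional location sensors or stylus inclination, as the abstract notes.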
  • Publication number: 20150102981
    Abstract: The description relates to eye tracking. One example can identify a location at which a user is looking. The example can also identify content at that location.
    Type: Application
    Filed: February 10, 2014
    Publication date: April 16, 2015
    Applicant: MICROSOFT CORPORATION
    Inventors: Bongshin LEE, Kenneth P. HINCKLEY, Steven N. BATHICHE
  • Publication number: 20150106740
    Abstract: The description relates to a shared digital workspace. One example includes a display device and sensors. The sensors are configured to detect users proximate the display device and to detect that an individual user is performing an individual user command relative to the display device. The system also includes a graphical user interface configured to be presented on the display device that allows multiple detected users to simultaneously interact with the graphical user interface via user commands.
    Type: Application
    Filed: February 7, 2014
    Publication date: April 16, 2015
    Applicant: MICROSOFT CORPORATION
    Inventors: Desney S. TAN, Kenneth P. HINCKLEY, Steven N. BATHICHE, Ronald O. PESSNER, Bongshin LEE, Anoop GUPTA, Amir NETZ, Brett D. BREWER
  • Publication number: 20150106386
    Abstract: The description relates to eye tracking. One example can identify a location at which a user is looking. The example can also identify content at that location.
    Type: Application
    Filed: February 10, 2014
    Publication date: April 16, 2015
    Applicant: MICROSOFT CORPORATION
    Inventors: Bongshin LEE, Kenneth P. HINCKLEY, Steven N. BATHICHE
  • Publication number: 20150106399
    Abstract: The claimed subject matter provides a system and/or a method that facilitates in situ searching of data. An interface can receive a flick gesture from an input device. An in situ search component can employ an in situ search triggered by the flick gesture, wherein the in situ search is executed on at least a portion of data selected on the input device.
    Type: Application
    Filed: December 16, 2014
    Publication date: April 16, 2015
    Inventor: Kenneth P. Hinckley
  • Publication number: 20150106739
    Abstract: The description relates to a shared digital workspace. One example includes a display device and sensors. The sensors are configured to detect users proximate the display device and to detect that an individual user is performing an individual user command relative to the display device. The system also includes a graphical user interface configured to be presented on the display device that allows multiple detected users to simultaneously interact with the graphical user interface via user commands.
    Type: Application
    Filed: February 7, 2014
    Publication date: April 16, 2015
    Applicant: MICROSOFT CORPORATION
    Inventors: Desney S. TAN, Kenneth P. HINCKLEY, Steven N. BATHICHE, Ronald O. PESSNER, Bongshin LEE, Anoop GUPTA, Amir NETZ, Brett D. BREWER