Patents by Inventor Kenneth P. Hinckley

Kenneth P. Hinckley has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 8380246
    Abstract: A mobile device connection system is provided. The system includes an input medium to detect a device position or location. An analysis component determines a device type and establishes a connection with the device. The input medium can include a vision system that detects device presence and location, with connections established via wireless technologies.
    Type: Grant
    Filed: August 15, 2007
    Date of Patent: February 19, 2013
    Assignee: Microsoft Corporation
    Inventors: Andrew D. Wilson, Raman K. Sarin, Kenneth P. Hinckley
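    As a rough illustration of this pipeline, the Python sketch below walks a detected device through type analysis to a connection; every name (DetectedDevice, infer_device_type, connect) and the signature lookup are hypothetical, not taken from the patent.

        # Hypothetical flow: the input medium reports a device, an analysis
        # component classifies it, and a wireless link is established.
        from dataclasses import dataclass

        @dataclass
        class DetectedDevice:
            x: float           # position reported by the input medium
            y: float
            signature: str     # visual signature seen by the vision system

        def infer_device_type(device: DetectedDevice) -> str:
            # A lookup table stands in for a real visual classifier.
            known = {"slate-outline": "tablet", "candybar-outline": "phone"}
            return known.get(device.signature, "unknown")

        def connect(device: DetectedDevice) -> None:
            kind = infer_device_type(device)
            if kind != "unknown":
                print(f"wireless link to {kind} at ({device.x}, {device.y})")

        connect(DetectedDevice(x=0.4, y=0.7, signature="slate-outline"))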
  • Patent number: 8274484
    Abstract: In an example embodiment, a method is adapted to track input with a device. The method includes an act of monitoring and acts of activating and displaying if a touch input is detected. The device has a first side and a second side, with the second side opposite the first side. The device has a display screen disposed on the first side, and a screen-reflective interface disposed on the second side. Respective positions on the screen-reflective interface correspond to respective locations of the display screen. The screen-reflective interface of the device is monitored. If a touch input is detected on the screen-reflective interface, the device performs acts of activating and displaying. Specifically, a tracking state is activated for the screen-reflective interface responsive to the detected touch input on the screen-reflective interface. An interface icon is displayed on the display screen to indicate that the tracking state has been activated.
    Type: Grant
    Filed: July 18, 2008
    Date of Patent: September 25, 2012
    Assignee: Microsoft Corporation
    Inventors: Patrick M. Baudisch, Georg F. Petschnigg, David H. Wykes, Albert Yiu-So Shum, Avi Geiger, Kenneth P. Hinckley, Michael J. Sinclair, Joel B. Jacobs, Jonathan D. Friedman, Rosanna H. Ho
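    A minimal Python sketch of the rear-to-display mapping, assuming normalized coordinates and a mirrored horizontal axis (both assumptions; the patent text above does not specify them):

        # The rear surface faces away from the user, so the horizontal axis
        # is mirrored to keep rear-of-device motion intuitive on screen.
        def rear_to_display(x: float, y: float) -> tuple[float, float]:
            return (1.0 - x, y)   # positions normalized to [0, 1]

        tracking_active = False

        def on_rear_touch(x: float, y: float) -> None:
            global tracking_active
            tracking_active = True                 # activate the tracking state
            dx, dy = rear_to_display(x, y)
            print(f"show tracking icon at display position ({dx:.2f}, {dy:.2f})")

        on_rear_touch(0.25, 0.60)   # icon appears at (0.75, 0.60)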
  • Publication number: 20120240043
    Abstract: Systems and/or methods are provided that facilitate revealing assistance information associated with a user interface. An interface can obtain input information related to interactions between the interface and a user. In addition, the interface can output assistance information in situ with the user interface. Further, a decision component determines the in situ assistance information output by the interface, based at least in part on the obtained input information.
    Type: Application
    Filed: June 1, 2012
    Publication date: September 20, 2012
    Applicant: MICROSOFT CORPORATION
    Inventors: Kenneth P. Hinckley, Shengdong Zhao, Edward B. Cutrell, Raman K. Sarin, Patrick M. Baudisch, Darryl Yust
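    The decision component might be approximated as below; the hover-without-click heuristic and the tooltip text are purely illustrative, not details from the application.

        # Decide whether to reveal in situ assistance from recent input events.
        def decide_assistance(events: list[str]) -> str | None:
            hovers = sum(1 for e in events if e == "hover")
            clicks = sum(1 for e in events if e == "click")
            if hovers >= 3 and clicks == 0:
                # Repeated hovering without committing suggests confusion.
                return "tooltip: this control opens the formatting pane"
            return None   # no assistance needed in situ

        print(decide_assistance(["hover", "hover", "hover"]))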
  • Publication number: 20120236026
    Abstract: Techniques involving gestures and other functionality are described. In one or more implementations, the techniques describe gestures that are usable to provide inputs to a computing device. A variety of different gestures are contemplated, including bimodal gestures (e.g., using more than one type of input) and single modal gestures. Additionally, the gesture techniques may be configured to leverage these different input types to increase the number of gestures that are made available to initiate operations of a computing device.
    Type: Application
    Filed: May 30, 2012
    Publication date: September 20, 2012
    Applicant: Microsoft Corporation
    Inventors: Kenneth P. Hinckley, Koji Yatani
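    One way to picture the bimodal/single-modal distinction is a dispatch on which modalities are active, as in this illustrative sketch (the gesture names and pairings are hypothetical):

        # Route input based on which modalities are currently engaged.
        def classify_gesture(touch_active: bool, pen_active: bool) -> str:
            if touch_active and pen_active:
                return "bimodal: touch holds the object, pen cuts or copies it"
            if pen_active:
                return "single modal: pen stroke (e.g., ink or lasso select)"
            if touch_active:
                return "single modal: touch drag (e.g., pan or move)"
            return "no gesture"

        print(classify_gesture(touch_active=True, pen_active=True))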
  • Patent number: 8261213
    Abstract: Techniques involving gestures and other functionality are described. In one or more implementations, the techniques describe gestures that are usable to provide inputs to a computing device. A variety of different gestures are contemplated, including bimodal gestures (e.g., using more than one type of input) and single modal gestures. Additionally, the gesture techniques may be configured to leverage these different input types to increase the number of gestures that are made available to initiate operations of a computing device.
    Type: Grant
    Filed: January 28, 2010
    Date of Patent: September 4, 2012
    Assignee: Microsoft Corporation
    Inventors: Kenneth P. Hinckley, Koji Yatani
  • Publication number: 20120206330
    Abstract: A multi-touch orientation sensing input device may enhance task performance efficiency. The multi-touch orientation sensing input device may include a device body that is partially enclosed or completely enclosed by a multi-touch sensor. The multi-touch orientation sensing input device may further include an inertia measurement unit that is disposed on the device body. The inertia measurement unit may measure a tilt angle of the device body with respect to a horizontal surface, as well as a roll angle of the device body along a length-wise axis of the device body with respect to an initial point on the device body.
    Type: Application
    Filed: February 11, 2011
    Publication date: August 16, 2012
    Applicant: MICROSOFT CORPORATION
    Inventors: Xiang Cao, Minghui Sun, Shahram Izadi, Hrvoje Benko, Kenneth P. Hinckley
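    To make the measurement concrete, here is a minimal Python sketch of recovering tilt and roll from a 3-axis accelerometer at rest; the axis convention (x along the device's length) and the use of accelerometer data alone are assumptions for illustration, not details from the application.

        import math

        def tilt_and_roll(ax: float, ay: float, az: float) -> tuple[float, float]:
            # Tilt: angle of the lengthwise (x) axis above the horizontal plane.
            tilt = math.degrees(math.atan2(ax, math.hypot(ay, az)))
            # Roll: rotation about the lengthwise axis, from the y/z gravity split.
            roll = math.degrees(math.atan2(ay, az))
            return tilt, roll

        print(tilt_and_roll(0.0, 0.0, 1.0))    # lying flat: (0.0, 0.0)
        print(tilt_and_roll(0.5, 0.0, 0.866))  # raised ~30 degrees: (~30.0, 0.0)

    A production device would fuse gyroscope data as well; this sketch omits that and assumes the body is at rest so gravity dominates the readings.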
  • Patent number: 8239785
    Abstract: Techniques involving gestures and other functionality are described. In one or more implementations, the techniques describe gestures that are usable to provide inputs to a computing device. A variety of different gestures are contemplated, including bimodal gestures (e.g., using more than one type of input) and single modal gestures. Additionally, the gesture techniques may be configured to leverage these different input types to increase the number of gestures that are made available to initiate operations of a computing device.
    Type: Grant
    Filed: January 27, 2010
    Date of Patent: August 7, 2012
    Assignee: Microsoft Corporation
    Inventors: Kenneth P. Hinckley, Koji Yatani, Georg F. Petschnigg
  • Publication number: 20120162093
    Abstract: This document relates to touch screen controls. For instance, the touch screen controls can allow a user to control a computing device by engaging a touch screen associated with the computing device. One implementation can receive at least one tactile contact from a region of a touch screen. This implementation can present a first command functionality on the touch screen proximate the region for a predefined time. It can await user engagement of the first command functionality. Lacking user engagement within the predefined time, the implementation can remove the first command functionality and offer a second command functionality.
    Type: Application
    Filed: December 28, 2010
    Publication date: June 28, 2012
    Applicant: Microsoft Corporation
    Inventors: William A.S. Buxton, Michel Pahud, Kenneth P. Hinckley
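    The timeout-and-replace behavior can be sketched as below; the polling loop, the default window, and the command names are assumptions for illustration.

        import time

        def offer_commands(engaged, timeout_s: float = 2.0) -> str:
            # Present the first command near the contact region, then wait.
            deadline = time.monotonic() + timeout_s
            print("showing first command functionality near the contact region")
            while time.monotonic() < deadline:
                if engaged():
                    return "first command engaged"
                time.sleep(0.05)
            # No engagement within the window: swap in the second command.
            print("removing first command, offering second command functionality")
            return "second command offered"

        print(offer_commands(engaged=lambda: False, timeout_s=0.2))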
  • Publication number: 20120154296
    Abstract: A computing device includes a fingerprint detection module for detecting fingerprint information that may be contained within touch input event(s) provided by a touch input mechanism. The computing device can leverage the fingerprint information in various ways. In one approach, the computing device can use the fingerprint information to enhance an interpretation of the touch input event(s), such as by rejecting parts of the touch input event(s) associated with an unintended input action. In another approach, the computing device can use the fingerprint information to identify an individual associated with the fingerprint information. The computing device can apply this insight to provide a customized user experience to that individual, such as by displaying content that is targeted to that individual.
    Type: Application
    Filed: December 17, 2010
    Publication date: June 21, 2012
    Applicant: Microsoft Corporation
    Inventors: Kenneth P. Hinckley, Michel Pahud
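    A toy version of the two uses of fingerprint information (input rejection and personalization) might look like this; the matching step is stubbed with a lookup table, and all names and enrollment data are hypothetical.

        from dataclasses import dataclass

        @dataclass
        class TouchEvent:
            x: int
            y: int
            print_features: str | None   # None when no fingerprint is resolvable

        KNOWN_PRINTS = {"whorl-17": "ken"}   # illustrative enrollment data

        def handle(event: TouchEvent) -> str:
            if event.print_features is None:
                return "rejected: likely unintended contact (no fingerprint)"
            user = KNOWN_PRINTS.get(event.print_features)
            if user:
                return f"accepted: show content targeted to {user}"
            return "accepted: anonymous touch"

        print(handle(TouchEvent(120, 300, "whorl-17")))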
  • Publication number: 20120154255
    Abstract: A computing device is described which includes plural display parts provided on respective plural device parts. The display parts define a display surface which provides interfaces to different tools. The tools, in turn, allow a local participant to engage in an interactive session with one or more remote participants. In one case, the tools include: a shared workspace processing module for providing a shared workspace for use by the participants; an audio-video conferencing module for enabling audio-video communication among the participants; and a reference space module for communicating hand gestures and the like among the participants. In one case, the computing device is implemented as a portable computing device that can be held in a participant's hand during use.
    Type: Application
    Filed: December 17, 2010
    Publication date: June 21, 2012
    Applicant: Microsoft Corporation
    Inventors: Kenneth P. Hinckley, Michel Pahud, William A. S. Buxton
  • Publication number: 20120154295
    Abstract: A computing device is described which allows a user to convey a gesture through the cooperative use of two input mechanisms, such as a touch input mechanism and a pen input mechanism. A user uses a first input mechanism to demarcate content presented on a display surface of the computing device or other part of the computing device, e.g., by spanning the content with two fingers of a hand. The user then uses a second input mechanism to make gestures within the content that is demarcated by the first input mechanism. In doing so, the first input mechanism establishes a context which governs the interpretation of gestures made by the second input mechanism. The computing device can also activate the joint use mode using two applications of the same input mechanism, such as two applications of a touch input mechanism.
    Type: Application
    Filed: December 17, 2010
    Publication date: June 21, 2012
    Applicant: Microsoft Corporation
    Inventors: Kenneth P. Hinckley, Michel Pahud
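    The context-then-gesture pattern can be sketched as follows, assuming the two-finger span defines an axis-aligned rectangle (an illustrative convention, not from the application):

        current_context = None   # region demarcated by the touch input mechanism

        def on_two_finger_span(x1, y1, x2, y2):
            global current_context
            current_context = (min(x1, x2), min(y1, y2), max(x1, x2), max(y1, y2))

        def on_pen_stroke(x, y) -> str:
            if current_context is None:
                return "pen acts normally (e.g., inking)"
            left, top, right, bottom = current_context
            if left <= x <= right and top <= y <= bottom:
                return "gesture interpreted within the demarcated content"
            return "stroke outside the context; not treated as a gesture"

        on_two_finger_span(100, 100, 400, 300)
        print(on_pen_stroke(250, 200))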
  • Publication number: 20120154293
    Abstract: A computing device is described herein which accommodates gestures that involve intentional movement of the computing device, either by establishing an orientation of the computing device and/or by dynamically moving the computing device, or both. The gestures may also be accompanied by contact with a display surface (or other part) of the computing device. For example, the user may establish contact with the display surface via a touch input mechanism and/or a pen input mechanism and then move the computing device in a prescribed manner.
    Type: Application
    Filed: December 17, 2010
    Publication date: June 21, 2012
    Applicant: Microsoft Corporation
    Inventors: Kenneth P. Hinckley, Michel Pahud, Wenqi Shen
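    A minimal sketch of a contact-plus-motion gesture, with a hypothetical tilt threshold and example actions:

        # Holding a touch while tilting the device past a threshold triggers
        # a combined gesture; the threshold and actions are illustrative.
        def motion_gesture(touch_down: bool, tilt_deg: float) -> str:
            TILT_THRESHOLD = 25.0   # degrees, assumed for illustration
            if touch_down and abs(tilt_deg) > TILT_THRESHOLD:
                return "contact + tilt gesture: e.g., pour selected items aside"
            if abs(tilt_deg) > TILT_THRESHOLD:
                return "motion-only gesture: e.g., orientation change"
            return "no device-motion gesture"

        print(motion_gesture(touch_down=True, tilt_deg=40.0))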
  • Publication number: 20120158629
    Abstract: A computing device is described herein for detecting and addressing unintended contact of a hand portion (such as a palm) or other article with a computing device. The computing device uses multiple factors to determine whether input events are accidental, including, for instance, the tilt of a pen device as it approaches a display surface of the computing device. The computing device can also capture and analyze input events which represent a hand that is close to the display surface, but not making physical contact with the display surface. The computing device can execute one or more behaviors to counteract the effect of any inadvertent input actions that it may detect.
    Type: Application
    Filed: December 17, 2010
    Publication date: June 21, 2012
    Applicant: Microsoft Corporation
    Inventors: Kenneth P. Hinckley, Michel Pahud
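    The multi-factor test might be approximated as below; the area and distance thresholds are invented for illustration only.

        # A contact is suspect if it is palm-sized, or if it appears while
        # the pen hovers nearby (where a trailing palm would land).
        def is_unintended(contact_area: float, pen_hovering: bool,
                          distance_from_pen: float) -> bool:
            LARGE_AREA = 4.0    # cm^2; palms are much larger than fingertips
            PALM_RADIUS = 8.0   # cm; typical palm offset from the pen tip
            if contact_area > LARGE_AREA:
                return True
            if pen_hovering and distance_from_pen < PALM_RADIUS:
                return True
            return False

        print(is_unintended(contact_area=6.5, pen_hovering=True,
                            distance_from_pen=5.0))   # True: suppress it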
  • Publication number: 20120154294
    Abstract: A computing device is described herein which collects input event(s) from at least one contact-type input mechanism (such as a touch input mechanism) and at least one movement-type input mechanism (such as an accelerometer and/or gyro device). The movement-type input mechanism can identify the orientation of the computing device and/or the dynamic motion of the computing device. The computing device uses these input events to interpret the type of input action that has occurred, e.g., to assess when at least part of the input action is unintentional. The computing device can then perform behavior based on its interpretation, such as by ignoring part of the input event(s), restoring a pre-action state, correcting at least part of the input event(s), and so on.
    Type: Application
    Filed: December 17, 2010
    Publication date: June 21, 2012
    Applicant: Microsoft Corporation
    Inventors: Kenneth P. Hinckley, Michel Pahud, Wenqi Shen
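    One simple fusion rule consistent with this description: a touch that coincides with a sharp motion spike is treated as accidental (the device was bumped or picked up). The time window below is an assumed value.

        # Compare the touch timestamp against recent accelerometer spikes.
        def interpret_touch(touch_ts: float, accel_spikes: list[float]) -> str:
            WINDOW_S = 0.15   # seconds; assumed coincidence window
            if any(abs(touch_ts - s) < WINDOW_S for s in accel_spikes):
                return "ignore touch and restore pre-action state"
            return "accept touch"

        print(interpret_touch(touch_ts=10.32, accel_spikes=[10.30]))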
  • Patent number: 8196042
    Abstract: Systems and/or methods are provided that facilitate revealing assistance information associated with a user interface. An interface can obtain input information related to interactions between the interface and a user. In addition, the interface can output assistance information in situ with the user interface. Further, a decision component determines the in situ assistance information output by the interface, based at least in part on the obtained input information.
    Type: Grant
    Filed: January 21, 2008
    Date of Patent: June 5, 2012
    Assignee: Microsoft Corporation
    Inventors: Kenneth P. Hinckley, Shengdong Zhao, Edward B. Cutrell, Raman K. Sarin, Patrick M. Baudisch, Darryl Yust
  • Patent number: 8171431
    Abstract: The claimed subject matter provides techniques to effectuate and facilitate efficient and flexible selection of display objects. The system can include devices and components that acquire gestures from pointing instrumentalities and thereafter ascertain velocities and proximities in relation to the displayed objects. Based at least upon these ascertained velocities and proximities falling below or within threshold levels, the system displays flags associated with the display object.
    Type: Grant
    Filed: October 5, 2007
    Date of Patent: May 1, 2012
    Assignee: Microsoft Corporation
    Inventors: Tovi Grossman, Patrick M. Baudisch, Kenneth P. Hinckley, William A. S. Buxton, Raman Sarin
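    The velocity-and-proximity test reduces to a pair of threshold comparisons, as in this sketch (the threshold values are illustrative):

        # Reveal a selection flag when the pointer slows down near an object.
        def should_show_flag(speed_px_s: float, distance_px: float) -> bool:
            SLOW = 150.0   # px/s; below this the user is likely aiming
            NEAR = 40.0    # px; within this radius of the display object
            return speed_px_s < SLOW and distance_px < NEAR

        print(should_show_flag(speed_px_s=90.0, distance_px=25.0))  # True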
  • Patent number: 8120625
    Abstract: In a device having a display, a change in focus for an application is used with a requested usage of a context attribute to change the amount of information regarding the context attribute that is sent to another application. A method of changing the orientation of images on a device's display detects movement followed by an end of movement of the device. The orientation of the device is then determined and is used to set the orientation of images on the display. A method of setting the orientation of a display also includes storing information regarding an item displayed in a first orientation before changing the orientation. When the orientation is returned to the first orientation, the stored information is retrieved and is used to display the item in the first orientation. The stored information can include whether the item is to appear in the particular orientation.
    Type: Grant
    Filed: November 14, 2002
    Date of Patent: February 21, 2012
    Assignee: Microsoft Corporation
    Inventor: Kenneth P. Hinckley
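    The store-and-restore behavior can be sketched with a per-orientation dictionary; the state contents and function names are assumptions for illustration.

        saved_state: dict[str, dict] = {}

        def rotate_to(new_orientation: str, current: str, state: dict) -> dict:
            saved_state[current] = state                     # save before leaving
            restored = saved_state.get(new_orientation, {})  # reapply on return
            print(f"rotating {current} -> {new_orientation}, restoring {restored}")
            return restored

        state = rotate_to("landscape", "portrait", {"toolbar": "visible"})
        state = rotate_to("portrait", "landscape", state)    # toolbar comes back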
  • Patent number: 8063882
    Abstract: A computer input device and computer system are provided that determine if the input device is at an edge of a pattern on a working surface based on an image of the working surface captured by the input device. An audio control message is generated based on the input device being positioned on the edge of the pattern and the audio control message is used to cause a speaker to generate an audio signal.
    Type: Grant
    Filed: March 2, 2009
    Date of Patent: November 22, 2011
    Assignee: Microsoft Corporation
    Inventors: Kenneth P. Hinckley, Michael J. Sinclair, Richard S. Szeliski, Matthew J. Conway, Erik J. Hanson
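    A toy version of the edge test on a binarized image patch; the 0/1 pattern encoding and the message wording are illustrative assumptions.

        # A patch containing both pattern (1) and background (0) pixels
        # straddles an edge of the pattern on the working surface.
        def at_pattern_edge(patch: list[list[int]]) -> bool:
            values = {px for row in patch for px in row}
            return values == {0, 1}

        def poll(patch: list[list[int]]) -> None:
            if at_pattern_edge(patch):
                print("audio control message -> speaker emits edge tone")

        poll([[1, 1, 0], [1, 1, 0]])   # edge in view: tone plays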
  • Publication number: 20110273368
    Abstract: A unique system and method that facilitates extending input/output capabilities for resource deficient mobile devices and interactions between multiple heterogeneous devices is provided. The system and method involve an interactive surface to which the desired mobile devices can be connected. The interactive surface can provide an enhanced display space and customization controls for mobile devices that lack adequate displays and input capabilities. In addition, the interactive surface can be employed to permit communication and interaction between multiple mobile devices that otherwise are unable to interact with each other. When connected to the interactive surface, the mobile devices can share information, view information from their respective devices, and store information to the interactive surface. Furthermore, the interactive surface can resume activity states of mobile devices that were previously communicating upon re-connection to the surface.
    Type: Application
    Filed: May 17, 2011
    Publication date: November 10, 2011
    Applicant: Microsoft Corporation
    Inventors: Kenneth P. Hinckley, Andrew D. Wilson
  • Publication number: 20110267263
    Abstract: Movement of a device is detected using at least one sensor. In response to the detected movement, at least one value is altered to make it easier for a user to select an object on a display.
    Type: Application
    Filed: July 1, 2011
    Publication date: November 3, 2011
    Applicant: MICROSOFT CORPORATION
    Inventor: Kenneth P. Hinckley
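    One plausible reading of "altering a value to ease selection" is growing each target's effective hit radius while the device is moving, as in this hypothetical sketch (the scale factor is assumed):

        # Enlarge hit targets during device motion so they are easier to hit.
        def hit_radius(base_radius: float, device_moving: bool) -> float:
            return base_radius * (1.8 if device_moving else 1.0)

        def hit_test(px, py, tx, ty, base_radius, device_moving) -> bool:
            r = hit_radius(base_radius, device_moving)
            return (px - tx) ** 2 + (py - ty) ** 2 <= r * r

        print(hit_test(105, 100, 100, 100, base_radius=4.0,
                       device_moving=True))   # True only because of the boost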