Patents by Inventor Ken Hinckley
Ken Hinckley has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 10168827
Abstract: Pen and computing device sensor correlation technique embodiments correlate sensor signals received from various grips on a touch-sensitive pen and touches to a touch-sensitive computing device in order to determine the context of such grips and touches and to issue context-appropriate commands to the touch-sensitive pen or the touch-sensitive computing device. A combination of concurrent sensor inputs received from both a touch-sensitive pen and a touch-sensitive computing device is correlated. How the touch-sensitive pen and the touch-sensitive computing device are touched or gripped is used to determine the context of their use and the user's intent. A context-appropriate user interface action can then be initiated. The context can also be used to label metadata.
Type: Grant
Filed: July 1, 2017
Date of Patent: January 1, 2019
Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
Inventors: Ken Hinckley, Hrvoje Benko, Michel Pahud, Andrew D. Wilson, Pourang Polad Irani, Francois Guimbretiere
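The correlation idea in this abstract can be sketched as a lookup from concurrent pen-grip and tablet-touch readings to a context-appropriate action. This is a minimal illustration only; the grip names and actions below are hypothetical and are not taken from the patent.

```python
# Toy correlation table: a pair of concurrent sensor classifications
# (pen grip, tablet touch) maps to an action. All labels are invented
# for illustration.
PEN_TABLET_ACTIONS = {
    ("writing_grip", "palm_rest"): "reject_palm_touch",
    ("writing_grip", "finger_tap"): "show_pen_tools",
    ("tucked_grip", "two_finger_drag"): "pan_canvas",
    ("no_grip", "finger_tap"): "standard_touch",
}

def correlate(pen_grip: str, tablet_touch: str) -> str:
    """Return the action for a pair of concurrent sensor readings,
    or 'ignore' when the combination has no defined meaning."""
    return PEN_TABLET_ACTIONS.get((pen_grip, tablet_touch), "ignore")
```

In this sketch the two sensor streams are assumed to have already been classified into discrete grip/touch labels; the real technique would derive those labels from raw sensor signals.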
-
Patent number: 9870083
Abstract: A grip of a primary user on a touch-sensitive computing device and a grip of a secondary user on the touch-sensitive computing device are sensed and correlated to determine whether the primary user is sharing or handing off the computing device to the secondary user. In the case of handoff, capabilities of the computing device may be restricted, while in a sharing mode only certain content on the computing device is shared. In some implementations, both a touch-sensitive pen and the touch-sensitive computing device are passed from a primary user to a secondary user. Sensor inputs representing the grips of the users on both the pen and the touch-sensitive computing device are correlated to determine the context of the grips and to initiate a context-appropriate command in an application executing on the touch-sensitive pen or the touch-sensitive computing device. Metadata is also derived from the correlated sensor inputs.
Type: Grant
Filed: June 12, 2014
Date of Patent: January 16, 2018
Assignee: Microsoft Technology Licensing, LLC
Inventors: Ken Hinckley, Hrvoje Benko, Michel Pahud, Andrew D. Wilson, Pourang Polad Irani, Francois Guimbretiere
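The sharing-versus-handoff distinction described here can be sketched as a classifier over two concurrent grip readings: if both users hold the device it is being shared, if only the secondary user holds it the primary has handed it off. The labels and the boolean simplification are illustrative assumptions, not the patent's method.

```python
def classify_transfer(primary_gripping: bool, secondary_gripping: bool) -> str:
    """Classify a device transfer from two concurrent grip readings.
    Labels are illustrative, not the patent's terminology."""
    if primary_gripping and secondary_gripping:
        return "sharing"   # both hold the device: expose selected content only
    if secondary_gripping and not primary_gripping:
        return "handoff"   # primary let go: restrict device capabilities
    return "normal"        # only the primary (or nobody) is holding it
```

A real implementation would work from continuous grip-sensor signals rather than two booleans, but the decision structure is the same.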
-
Publication number: 20170300170
Abstract: Pen and computing device sensor correlation technique embodiments correlate sensor signals received from various grips on a touch-sensitive pen and touches to a touch-sensitive computing device in order to determine the context of such grips and touches and to issue context-appropriate commands to the touch-sensitive pen or the touch-sensitive computing device. A combination of concurrent sensor inputs received from both a touch-sensitive pen and a touch-sensitive computing device is correlated. How the touch-sensitive pen and the touch-sensitive computing device are touched or gripped is used to determine the context of their use and the user's intent. A context-appropriate user interface action can then be initiated. The context can also be used to label metadata.
Type: Application
Filed: July 1, 2017
Publication date: October 19, 2017
Inventors: Ken Hinckley, Hrvoje Benko, Michel Pahud, Andrew D. Wilson, Pourang Polad Irani, Francois Guimbretiere
-
Patent number: 9727161
Abstract: Pen and computing device sensor correlation technique embodiments correlate sensor signals received from various grips on a touch-sensitive pen and touches to a touch-sensitive computing device in order to determine the context of such grips and touches and to issue context-appropriate commands to the touch-sensitive pen or the touch-sensitive computing device. A combination of concurrent sensor inputs received from both a touch-sensitive pen and a touch-sensitive computing device is correlated. How the touch-sensitive pen and the touch-sensitive computing device are touched or gripped is used to determine the context of their use and the user's intent. A context-appropriate user interface action can then be initiated. The context can also be used to label metadata.
Type: Grant
Filed: June 12, 2014
Date of Patent: August 8, 2017
Assignee: Microsoft Technology Licensing, LLC
Inventors: Ken Hinckley, Hrvoje Benko, Michel Pahud, Andrew D. Wilson, Pourang Polad Irani, Francois Guimbretiere
-
Patent number: 9659280
Abstract: Information sharing between meeting attendees during a co-located group meeting in a meeting space is democratized using a computer that is operating cooperatively with one or more object sensing devices in the meeting space to identify postures formed by the meeting attendees.
Type: Grant
Filed: May 2, 2014
Date of Patent: May 23, 2017
Assignee: MICROSOFT TECHNOLOGY LICENSING LLC
Inventors: Andrew Bragdon, Robert DeLine, Ken Hinckley, Meredith June Morris
-
Publication number: 20160210452
Abstract: A processor-implemented method for collecting a sequence of security code characters includes: detecting a trajectory through a region proximate the device followed by an instrument; responsive to the trajectory, identifying one of a collection of defined gestures; and interpreting the identified gesture as a portion of the security code.
Type: Application
Filed: January 19, 2015
Publication date: July 21, 2016
Inventors: Michel Pahud, William Buxton, Ken Hinckley, Ahmed Sabbir Arif
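The pipeline in this abstract (trajectory → recognized gesture → code character) can be sketched with a deliberately simple recognizer that classifies each stroke by its dominant direction. The four gesture names, the character mapping, and the recognizer itself are all stand-in assumptions, not the published method.

```python
def classify_stroke(points):
    """Map a 2-D trajectory (list of (x, y) points) to one of four
    defined gestures by its dominant start-to-end direction."""
    dx = points[-1][0] - points[0][0]
    dy = points[-1][1] - points[0][1]
    if abs(dx) >= abs(dy):
        return "swipe_right" if dx >= 0 else "swipe_left"
    return "swipe_down" if dy >= 0 else "swipe_up"

# Each recognized gesture contributes one security-code character.
GESTURE_TO_CHAR = {"swipe_right": "1", "swipe_left": "2",
                   "swipe_up": "3", "swipe_down": "4"}

def collect_code(strokes):
    """Interpret a sequence of strokes as a security code."""
    return "".join(GESTURE_TO_CHAR[classify_stroke(s)] for s in strokes)
```

A production recognizer would use a richer gesture model; the point here is only the structure of trajectory-to-character interpretation.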
-
Patent number: 9294722
Abstract: Telepresence of a mobile user (MU) utilizing a mobile device (MD) and remote users who are participating in a telepresence session is optimized. The MD receives video of a first remote user (FRU). Whenever the MU gestures with the MD using a first motion, video of the FRU is displayed. The MD can also receive video and audio of the FRU and a second remote user (SRU), display a workspace, and reproduce the audio of the FRU and SRU in a default manner. Whenever the MU gestures with the MD using the first motion, video of the FRU is displayed and audio of the FRU and SRU is reproduced in a manner that accentuates the FRU. Whenever the MU gestures with the MD using a second motion, video of the SRU is displayed and audio of the FRU and SRU is reproduced in a manner that accentuates the SRU.
Type: Grant
Filed: October 19, 2010
Date of Patent: March 22, 2016
Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
Inventors: Michel Pahud, Ken Hinckley, William A. S. Buxton
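The motion-to-focus mapping described above can be sketched as a function that returns which party's video to display and a set of audio gains that accentuate that party. The motion names and gain values are illustrative assumptions.

```python
def focus_for_motion(motion: str):
    """Map a device motion to (displayed view, per-user audio gains).
    'tilt_left'/'tilt_right' stand in for the first and second motions;
    the 0.3/1.0 gains are arbitrary illustrative values."""
    if motion == "tilt_left":     # accentuate the first remote user
        return "first_remote_user", {"first_remote_user": 1.0,
                                     "second_remote_user": 0.3}
    if motion == "tilt_right":    # accentuate the second remote user
        return "second_remote_user", {"first_remote_user": 0.3,
                                      "second_remote_user": 1.0}
    # No recognized motion: show the workspace, default audio mix.
    return "workspace", {"first_remote_user": 1.0,
                         "second_remote_user": 1.0}
```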
-
Patent number: 9229539
Abstract: A user is able to triage information on a touch-enabled computing device. Information objects are displayed on a touch-sensitive screen of the device. Whenever the user makes a first gesture on the screen using a first user input modality, the first gesture and that it was made using this first modality are identified, a first information management operation specifically associated with the first gesture being made with this first modality is also identified, and this first operation is implemented on the objects. Whenever the user subsequently makes a second gesture on the screen using a second user input modality which is different than the first modality, the second gesture and that it was made using this second modality are identified, a second information management operation specifically associated with the second gesture being made with this second modality is also identified, and this second operation is implemented on the objects.
Type: Grant
Filed: June 7, 2012
Date of Patent: January 5, 2016
Assignee: Microsoft Technology Licensing, LLC
Inventor: Ken Hinckley
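The key idea, that the same gesture maps to different operations depending on the input modality that made it, can be sketched as a dispatch on (gesture, modality) pairs. The gesture names, modalities, and operations below are invented for illustration.

```python
# The same on-screen gesture triggers a different information-management
# operation depending on whether a finger or a pen made it.
# All labels are hypothetical.
OPERATIONS = {
    ("flick_right", "finger"): "file_in_folder",
    ("flick_right", "pen"): "tag_for_followup",
    ("flick_left", "finger"): "defer",
    ("flick_left", "pen"): "delete",
}

def triage(gesture: str, modality: str) -> str:
    """Dispatch a (gesture, modality) pair to its operation."""
    return OPERATIONS.get((gesture, modality), "no_op")
```

The design point is that modality doubles the gesture vocabulary without adding new gestures for the user to learn.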
-
Publication number: 20150363034
Abstract: A grip of a primary user on a touch-sensitive computing device and a grip of a secondary user on the touch-sensitive computing device are sensed and correlated to determine whether the primary user is sharing or handing off the computing device to the secondary user. In the case of handoff, capabilities of the computing device may be restricted, while in a sharing mode only certain content on the computing device is shared. In some implementations, both a touch-sensitive pen and the touch-sensitive computing device are passed from a primary user to a secondary user. Sensor inputs representing the grips of the users on both the pen and the touch-sensitive computing device are correlated to determine the context of the grips and to initiate a context-appropriate command in an application executing on the touch-sensitive pen or the touch-sensitive computing device. Metadata is also derived from the correlated sensor inputs.
Type: Application
Filed: June 12, 2014
Publication date: December 17, 2015
Inventors: Ken Hinckley, Hrvoje Benko, Michel Pahud, Andrew D. Wilson, Pourang Polad Irani, Francois Guimbretiere
-
Publication number: 20150363035
Abstract: Pen and computing device sensor correlation technique embodiments correlate sensor signals received from various grips on a touch-sensitive pen and touches to a touch-sensitive computing device in order to determine the context of such grips and touches and to issue context-appropriate commands to the touch-sensitive pen or the touch-sensitive computing device. A combination of concurrent sensor inputs received from both a touch-sensitive pen and a touch-sensitive computing device is correlated. How the touch-sensitive pen and the touch-sensitive computing device are touched or gripped is used to determine the context of their use and the user's intent. A context-appropriate user interface action can then be initiated. The context can also be used to label metadata.
Type: Application
Filed: June 12, 2014
Publication date: December 17, 2015
Inventors: Ken Hinckley, Hrvoje Benko, Michel Pahud, Andrew D. Wilson, Pourang Polad Irani, Francois Guimbretiere
-
Publication number: 20140245190
Abstract: Information sharing between meeting attendees during a co-located group meeting in a meeting space is democratized using a computer that is operating cooperatively with one or more object sensing devices in the meeting space to identify postures formed by the meeting attendees.
Type: Application
Filed: May 2, 2014
Publication date: August 28, 2014
Applicant: Microsoft Corporation
Inventors: Andrew Bragdon, Robert DeLine, Ken Hinckley, Meredith June Morris
-
Publication number: 20130328786
Abstract: A user is able to triage information on a touch-enabled computing device. Information objects are displayed on a touch-sensitive screen of the device. Whenever the user makes a first gesture on the screen using a first user input modality, the first gesture and that it was made using this first modality are identified, a first information management operation specifically associated with the first gesture being made with this first modality is also identified, and this first operation is implemented on the objects. Whenever the user subsequently makes a second gesture on the screen using a second user input modality which is different than the first modality, the second gesture and that it was made using this second modality are identified, a second information management operation specifically associated with the second gesture being made with this second modality is also identified, and this second operation is implemented on the objects.
Type: Application
Filed: June 7, 2012
Publication date: December 12, 2013
Applicant: MICROSOFT CORPORATION
Inventor: Ken Hinckley
-
Publication number: 20130103446
Abstract: Information sharing between meeting attendees during a co-located group meeting in a meeting space is democratized using a computer that is operating cooperatively with one or more object sensing devices in the meeting space to identify postures formed by the meeting attendees.
Type: Application
Filed: October 20, 2011
Publication date: April 25, 2013
Applicant: MICROSOFT CORPORATION
Inventors: Andrew Bragdon, Robert DeLine, Ken Hinckley, Meredith June Morris
-
Publication number: 20120159401
Abstract: Workspaces are manipulated on a mobile device having a display screen. A set of two or more discrete workspaces is established. A default discrete workspace is then displayed on the screen, where the default discrete workspace is one of the discrete workspaces in the set. Whenever a user gestures with the mobile device, the gesture is used to select one of the discrete workspaces from the set, and the selected discrete workspace will be displayed on the screen.
Type: Application
Filed: December 16, 2010
Publication date: June 21, 2012
Applicant: MICROSOFT CORPORATION
Inventors: Michel Pahud, Ken Hinckley, William A. S. Buxton
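The workspace mechanism described above can be sketched as a small state machine: a fixed set of workspaces, a current (default) one, and gestures that select another. The gesture names and cycling behavior below are illustrative assumptions; the publication describes gesture-driven selection generally, not this specific scheme.

```python
class WorkspaceSwitcher:
    """Hold a set of discrete workspaces and switch among them in
    response to device gestures (names are illustrative)."""

    def __init__(self, workspaces, default=0):
        self.workspaces = list(workspaces)
        self.index = default  # the default workspace is shown first

    @property
    def current(self):
        return self.workspaces[self.index]

    def on_gesture(self, gesture: str) -> str:
        """Select a workspace from the set based on the gesture and
        return the workspace now displayed."""
        if gesture == "flick_forward":
            self.index = (self.index + 1) % len(self.workspaces)
        elif gesture == "flick_back":
            self.index = (self.index - 1) % len(self.workspaces)
        return self.current
```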
-
Publication number: 20120092436
Abstract: Telepresence of a mobile user (MU) utilizing a mobile device (MD) and remote users who are participating in a telepresence session is optimized. The MD receives video of a first remote user (FRU). Whenever the MU gestures with the MD using a first motion, video of the FRU is displayed. The MD can also receive video and audio of the FRU and a second remote user (SRU), display a workspace, and reproduce the audio of the FRU and SRU in a default manner. Whenever the MU gestures with the MD using the first motion, video of the FRU is displayed and audio of the FRU and SRU is reproduced in a manner that accentuates the FRU. Whenever the MU gestures with the MD using a second motion, video of the SRU is displayed and audio of the FRU and SRU is reproduced in a manner that accentuates the SRU.
Type: Application
Filed: October 19, 2010
Publication date: April 19, 2012
Applicant: MICROSOFT CORPORATION
Inventors: Michel Pahud, Ken Hinckley, William A. S. Buxton
-
Publication number: 20110227947
Abstract: Multi-touch user interface interaction is described. In an embodiment, an object in a user interface (UI) is manipulated by a cursor and a representation of a plurality of digits of a user. At least one parameter, which comprises the cursor location in the UI, is used to determine that multi-touch input is to be provided to the object. Responsive to this, the relative movement of the digits is analyzed and the object manipulated accordingly. In another embodiment, an object in a UI is manipulated by a representation of a plurality of digits of a user. Movement of each digit by the user moves the corresponding representation in the UI, and the movement velocity of the representation is a non-linear function of the digit's velocity. After determining that multi-touch input is to be provided to the object, the relative movement of the representations is analyzed and the object manipulated accordingly.
Type: Application
Filed: March 16, 2010
Publication date: September 22, 2011
Applicant: Microsoft Corporation
Inventors: Hrvoje Benko, Shahram Izadi, Andrew D. Wilson, Daniel Rosenfeld, Ken Hinckley, Xiang Cao, Nicolas Villar, Stephen Hodges
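The non-linear velocity mapping mentioned in the second embodiment can be sketched as a simple transfer function: slow digit motion moves the on-screen representation almost one-to-one (for precision), while fast motion is amplified (for reach). The quadratic form and the gain constant are illustrative assumptions, not values from the publication.

```python
def representation_velocity(digit_velocity: float, gain: float = 0.02) -> float:
    """Non-linear transfer function from digit velocity to the velocity
    of its on-screen representation. The abs() term preserves the sign
    of motion, so the amplification works symmetrically in both
    directions."""
    return digit_velocity + gain * digit_velocity * abs(digit_velocity)
```

At a digit velocity of 1 unit/s the output is 1.02 (nearly direct), while at 10 units/s it is 12.0, so fast motion covers proportionally more screen distance.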
-
Patent number: 7817991
Abstract: A method and apparatus for connecting two wireless devices to share information is disclosed. To connect the wireless devices the users communicate to each other a desire to connect their devices. Following this communication the users electronically identify each device, initiate and propose the connection. Once the connection has been made the users are able to share information across the devices. Alternative embodiments provide the user with expedited methods to identify the wireless device, identify the information to share, or provide additional security in forming the connection.
Type: Grant
Filed: August 29, 2006
Date of Patent: October 19, 2010
Assignee: Microsoft Corporation
Inventors: Ken Hinckley, Raman Sarin
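The identify-propose-connect-share flow in this abstract can be modeled as a minimal handshake between two device objects: one proposes, the other accepts, and only then can information cross the link. The class and method names are hypothetical, invented purely to illustrate the sequencing.

```python
class Device:
    """Minimal model of the propose/accept pairing flow. A device can
    share information only after a proposed connection is accepted."""

    def __init__(self, name: str):
        self.name = name
        self.peer = None      # set once a connection is accepted
        self.pending = None   # an incoming, not-yet-accepted proposal

    def propose(self, other: "Device") -> None:
        """Propose a connection to an identified device."""
        other.pending = self

    def accept(self) -> None:
        """Accept the pending proposal, completing the connection."""
        self.peer = self.pending
        self.pending.peer = self
        self.pending = None

    def send(self, data):
        """Share information across the connection; fails when no
        connection has been made."""
        if self.peer is None:
            raise RuntimeError("not connected")
        return (self.peer.name, data)
```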
-
Publication number: 20070191028
Abstract: A method and apparatus for connecting two wireless devices to share information is disclosed. To connect the wireless devices the users communicate to each other a desire to connect their devices. Following this communication the users electronically identify each device, initiate and propose the connection. Once the connection has been made the users are able to share information across the devices. Alternative embodiments provide the user with expedited methods to identify the wireless device, identify the information to share, or provide additional security in forming the connection.
Type: Application
Filed: August 29, 2006
Publication date: August 16, 2007
Applicant: Microsoft Corporation
Inventors: Ken Hinckley, Raman Sarin