Patents by Inventor Michel Pahud

Michel Pahud has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20170090666
    Abstract: A sensing device, such as a user-wearable device (UWD) worn by a user of a touchscreen, may provide kinematic data of the sensing device or UWD and/or identification data of the user to a processor that operates the touchscreen. Such data may allow the processor to perform a number of user-touchscreen interactions, such as displaying user-specific windows or menus, processing user-manipulation of displayed objects, and determining which hand of a user performs a touch event, just to name a few examples.
    Type: Application
    Filed: December 12, 2016
    Publication date: March 30, 2017
    Inventors: Michel Pahud, William Buxton, Kenneth P. Hinckley, Andrew M. Webb, Eyal Ofek
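    To make the idea concrete, here is a minimal TypeScript sketch of one way wrist-worn kinematic data could be correlated with a touch event to guess which hand touched; the interfaces, the 150 ms window, and the scoring rule are illustrative assumptions, not the claimed method.
    ```typescript
    // Hypothetical data shapes for illustration only.
    interface WristSample { userId: string; hand: "left" | "right"; timestampMs: number; accelMagnitude: number; }
    interface ScreenTouch { timestampMs: number; x: number; y: number; }

    // Score how well a wrist's recent motion lines up with the touch instant:
    // a touch is usually preceded by a short burst of hand acceleration.
    function motionScore(samples: WristSample[], touch: ScreenTouch, windowMs = 150): number {
      return samples
        .filter(s => touch.timestampMs - s.timestampMs >= 0 && touch.timestampMs - s.timestampMs <= windowMs)
        .reduce((sum, s) => sum + s.accelMagnitude, 0);
    }

    // Attribute the touch to whichever instrumented hand moved most just before contact.
    function attributeTouch(streams: Map<string, WristSample[]>, touch: ScreenTouch): string | null {
      let best: { key: string; score: number } | null = null;
      for (const [key, samples] of streams.entries()) {
        const score = motionScore(samples, touch);
        if (score > 0 && (best === null || score > best.score)) best = { key, score };
      }
      return best ? best.key : null; // e.g. "alice:right", or null if no tracked wrist moved
    }
    ```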
  • Publication number: 20170010695
    Abstract: A user-wearable device (UWD) worn by a user of a touchscreen may provide kinematic data of the UWD and/or identification data of the user to a processor that operates the touchscreen. Such data may allow the processor to perform a number of user-touchscreen interactions, such as displaying user-specific windows or menus, processing user-manipulation of displayed objects, and determining which hand of a user performs a touch event, just to name a few examples.
    Type: Application
    Filed: December 8, 2015
    Publication date: January 12, 2017
    Inventors: Michel Pahud, Kenneth P. Hinckley, William Buxton, Eyal Ofek, Andrew M. Webb
  • Publication number: 20170010733
    Abstract: A user-wearable device (UWD) worn by a user of a touchscreen may provide kinematic data of the UWD and/or identification data of the user to a processor that operates the touchscreen. Such data may allow the processor to perform a number of user-touchscreen interactions, such as displaying user-specific windows or menus, processing user-manipulation of displayed objects, and determining which hand of a user performs a touch event, just to name a few examples.
    Type: Application
    Filed: December 8, 2015
    Publication date: January 12, 2017
    Inventors: Michel Pahud, Kenneth P. Hinckley, William Buxton, Eyal Ofek, Andrew M. Webb
  • Publication number: 20160378331
    Abstract: Techniques involving gestures and other functionality are described. In one or more implementations, the techniques describe gestures that are usable to provide inputs to a computing device. A variety of different gestures are contemplated, including bimodal gestures (e.g., using more than one type of input) and single modal gestures. Additionally, the gesture techniques may be configured to leverage these different input types to increase the amount of gestures that are made available to initiate operations of a computing device.
    Type: Application
    Filed: July 28, 2016
    Publication date: December 29, 2016
    Inventors: Kenneth P. Hinckley, Koji Yatani, Michel Pahud
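    As an illustration only, the sketch below shows one way a classifier might tell bimodal gestures (two input types engaged together) from single-modal ones; the modality names and class shape are assumptions, not the patented technique.
    ```typescript
    type Modality = "touch" | "pen";
    interface ModalInput { modality: Modality; action: "down" | "up"; timestampMs: number; }

    // Track which modalities are currently engaged and classify each new "down"
    // event as starting (or joining) a bimodal or single-modal gesture.
    class GestureClassifier {
      private active = new Set<Modality>();

      classify(ev: ModalInput): "bimodal" | "single-modal" | null {
        if (ev.action === "down") {
          this.active.add(ev.modality);
          return this.active.size > 1 ? "bimodal" : "single-modal";
        }
        this.active.delete(ev.modality);
        return null; // releases end gestures rather than start them
      }
    }

    const classifier = new GestureClassifier();
    console.log(classifier.classify({ modality: "pen", action: "down", timestampMs: 0 }));    // "single-modal"
    console.log(classifier.classify({ modality: "touch", action: "down", timestampMs: 40 })); // "bimodal"
    ```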
  • Patent number: 9521364
    Abstract: A system facilitates managing one or more devices utilized for communicating data within a telepresence session. A telepresence session can be initiated within a communication framework that includes a first user and one or more second users. In response to determining a temporary absence of the first user from the telepresence session, a recordation of the telepresence session is initialized to enable a playback of a portion or a summary of the telepresence session that the first user has missed.
    Type: Grant
    Filed: December 16, 2014
    Date of Patent: December 13, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Christian Huitema, William A. S. Buxton, Jonathan E. Paff, Zicheng Liu, Rajesh Kutpadi Hegde, Zhengyou Zhang, Kori Marie Quinn, Jin Li, Michel Pahud
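    A minimal sketch of the bookkeeping such a session might use to record the span a participant misses; the class and method names are hypothetical.
    ```typescript
    // Hypothetical session bookkeeping: when a participant drops out temporarily,
    // note the time so the missed portion can be replayed (or summarized) on return.
    interface MissedClip { startMs: number; endMs: number; }

    class TelepresenceSession {
      private awaySince = new Map<string, number>();

      onParticipantLeft(userId: string, nowMs: number): void {
        if (!this.awaySince.has(userId)) this.awaySince.set(userId, nowMs); // begin recordation
      }

      // On return, hand back the time range to replay or summarize, then stop tracking.
      onParticipantReturned(userId: string, nowMs: number): MissedClip | null {
        const start = this.awaySince.get(userId);
        if (start === undefined) return null;
        this.awaySince.delete(userId);
        return { startMs: start, endMs: nowMs };
      }
    }
    ```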
  • Publication number: 20160283021
    Abstract: A computing device includes a fingerprint detection module for detecting fingerprint information that may be contained within touch input event(s) provided by a touch input mechanism. The computing device can leverage the fingerprint information in various ways. In one approach, the computing device can use the fingerprint information to enhance an interpretation of the touch input event(s), such as by rejecting parts of the touch input event(s) associated with an unintended input action. In another approach, the computing device can use the fingerprint information to identify an individual associated with the fingerprint information. The computing device can apply this insight to provide a customized user experience to that individual, such as by displaying content that is targeted to that individual.
    Type: Application
    Filed: November 23, 2015
    Publication date: September 29, 2016
    Inventors: Kenneth P. Hinckley, Michel Pahud
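    One hedged way to read the two uses of fingerprint information described above is sketched in TypeScript below; the feature vectors and nearest-template matcher are placeholders, not the actual detection module.
    ```typescript
    // Hypothetical use of fingerprint features carried in a touch event: reject
    // contacts with no usable print (e.g. a resting palm) and look up who touched
    // so the experience can be personalized.
    interface TouchWithPrint { x: number; y: number; printFeatures?: number[]; }

    // Toy matcher: nearest enrolled template by squared distance.
    function matchUser(features: number[], enrolled: Map<string, number[]>): string | null {
      let best: { id: string; distance: number } | null = null;
      for (const [id, template] of enrolled.entries()) {
        const distance = template.reduce((sum, v, i) => sum + (v - (features[i] ?? 0)) ** 2, 0);
        if (best === null || distance < best.distance) best = { id, distance };
      }
      return best ? best.id : null;
    }

    function interpretTouch(touch: TouchWithPrint, enrolled: Map<string, number[]>):
        { accepted: boolean; userId: string | null } {
      if (!touch.printFeatures) return { accepted: false, userId: null }; // likely unintended contact
      return { accepted: true, userId: matchUser(touch.printFeatures, enrolled) };
    }
    ```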
  • Publication number: 20160239092
    Abstract: An example system includes a plurality of moveable light emitters, each moveable light emitter configured to independently emit a display light from a current display location within that moveable light emitter's range of motion responsive to activation from a corresponding light activator. The system also includes a location engine to determine, for each light emitter, the current display location of that light emitter, and a mapping engine to map, for each current display location, the light activator activating the light emitter currently located at that current display location.
    Type: Application
    Filed: February 13, 2015
    Publication date: August 18, 2016
    Inventors: Sasa Junuzovic, Michel Pahud, Eyal Ofek, Yoichi Ochiai, Michael J. Sinclair
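    The location-to-activator mapping could look something like the sketch below; the coordinate rounding and data shapes are assumptions for illustration.
    ```typescript
    // Hypothetical location and mapping engines: each moveable emitter reports
    // where it currently sits, and the map answers "which light activator drives
    // the emitter at this display location?".
    interface Emitter { id: string; activatorId: string; x: number; y: number; }

    function buildLocationMap(emitters: Emitter[]): Map<string, string> {
      const map = new Map<string, string>();
      for (const e of emitters) {
        map.set(`${Math.round(e.x)},${Math.round(e.y)}`, e.activatorId); // display location -> activator
      }
      return map;
    }

    const locationMap = buildLocationMap([
      { id: "e1", activatorId: "a7", x: 10.2, y: 4.9 },
      { id: "e2", activatorId: "a3", x: 22.0, y: 4.9 },
    ]);
    console.log(locationMap.get("10,5")); // "a7"
    ```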
  • Patent number: 9411504
    Abstract: Techniques involving gestures and other functionality are described. In one or more implementations, the techniques describe gestures that are usable to provide inputs to a computing device. A variety of different gestures are contemplated, including bimodal gestures (e.g., using more than one type of input) and single modal gestures. Additionally, the gesture techniques may be configured to leverage these different input types to increase the amount of gestures that are made available to initiate operations of a computing device.
    Type: Grant
    Filed: January 28, 2010
    Date of Patent: August 9, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. Hinckley, Koji Yatani, Michel Pahud
  • Publication number: 20160210452
    Abstract: A processor-implemented method for collecting a sequence of security code characters includes: detecting a trajectory through a region proximate the device followed by an instrument; responsive to the trajectory, identifying one of a collection of defined gestures; and interpreting the identified gesture as a portion of the security code.
    Type: Application
    Filed: January 19, 2015
    Publication date: July 21, 2016
    Inventors: Michel Pahud, William Buxton, Ken Hinckley, Ahmed Sabbir Arif
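    A toy version of the gesture-identification step is sketched below; the gesture library, matcher, and threshold are invented for illustration and stand in for whatever recognizer the method actually uses.
    ```typescript
    // Hypothetical trajectory-to-character step: match a sampled trajectory near
    // the device against a small library of defined gestures and report the
    // security-code character the matched gesture stands for.
    type Point = { x: number; y: number };
    interface GestureTemplate { name: string; char: string; template: Point[]; }

    const gestureLibrary: GestureTemplate[] = [
      { name: "circle", char: "3", template: [{ x: 1, y: 0 }, { x: 0, y: 1 }, { x: -1, y: 0 }, { x: 0, y: -1 }] },
      { name: "swipe-right", char: "7", template: [{ x: 0, y: 0 }, { x: 1, y: 0 }, { x: 2, y: 0 }, { x: 3, y: 0 }] },
    ];

    // Toy matcher: mean pointwise distance; a real recognizer would resample and normalize first.
    function matchGesture(trajectory: Point[], maxCost = 1.0): string | null {
      let best: { char: string; cost: number } | null = null;
      for (const g of gestureLibrary) {
        const n = Math.min(trajectory.length, g.template.length);
        let cost = 0;
        for (let i = 0; i < n; i++) {
          cost += Math.hypot(trajectory[i].x - g.template[i].x, trajectory[i].y - g.template[i].y);
        }
        cost /= n;
        if (best === null || cost < best.cost) best = { char: g.char, cost };
      }
      return best && best.cost <= maxCost ? best.char : null; // null: no defined gesture identified
    }
    ```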
  • Patent number: 9294722
    Abstract: Telepresence is optimized between a mobile user (MU), who utilizes a mobile device (MD), and remote users participating in a telepresence session. The MD receives video of a first remote user (FRU). Whenever the MU gestures with the MD using a first motion, video of the FRU is displayed. The MD can also receive video and audio of the FRU and a second remote user (SRU), display a workspace, and reproduce the audio of the FRU and SRU in a default manner. Whenever the MU gestures with the MD using the first motion, video of the FRU is displayed and audio of the FRU and SRU is reproduced in a manner that accentuates the FRU. Whenever the MU gestures with the MD using a second motion, video of the SRU is displayed and audio of the FRU and SRU is reproduced in a manner that accentuates the SRU.
    Type: Grant
    Filed: October 19, 2010
    Date of Patent: March 22, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Michel Pahud, Ken Hinckley, William A. S. Buxton
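    The motion-to-presentation mapping might resemble the following sketch; the motion labels and audio gains are assumptions, not values from the patent.
    ```typescript
    // Hypothetical mapping from a recognized device motion to which remote user is
    // displayed and how the two audio streams are mixed.
    type Motion = "first-motion" | "second-motion" | "neutral";
    interface AvState { shown: "FRU" | "SRU" | "workspace"; gains: { FRU: number; SRU: number }; }

    function applyMotion(motion: Motion): AvState {
      switch (motion) {
        case "first-motion":  return { shown: "FRU", gains: { FRU: 1.0, SRU: 0.3 } }; // accentuate the FRU
        case "second-motion": return { shown: "SRU", gains: { FRU: 0.3, SRU: 1.0 } }; // accentuate the SRU
        default:              return { shown: "workspace", gains: { FRU: 0.6, SRU: 0.6 } }; // default mix
      }
    }

    console.log(applyMotion("first-motion").shown); // "FRU"
    ```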
  • Publication number: 20160077734
    Abstract: An apparatus includes a keyboard engine that operates a keyboard that accepts shape-writing input and radial entry input. A keyboard input module obtains input data from at least one input sensor of the keyboard. An intention disambiguation engine enables simultaneous use of the shape-writing input and the radial entry input acceptance for a user of the keyboard.
    Type: Application
    Filed: September 13, 2014
    Publication date: March 17, 2016
    Inventors: William A. S. Buxton, Richard L. Hughes, Kenneth P. Hinckley, Michel Pahud, Irina Spiridonova
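    One simple disambiguation heuristic consistent with the abstract is sketched below; the thresholds are assumptions, and the actual intention disambiguation engine is not described at this level of detail.
    ```typescript
    // Hypothetical rule: a stroke that wanders across several keys is treated as
    // shape-writing, a short directional flick off a single key as a radial
    // (marking-menu) entry, and anything else as a plain tap.
    interface Stroke { keysTouched: string[]; lengthPx: number; }

    function disambiguate(stroke: Stroke): "shape-writing" | "radial-entry" | "tap" {
      if (stroke.keysTouched.length >= 3) return "shape-writing";
      if (stroke.lengthPx > 30) return "radial-entry";
      return "tap";
    }

    console.log(disambiguate({ keysTouched: ["h", "e", "l", "o"], lengthPx: 240 })); // "shape-writing"
    console.log(disambiguate({ keysTouched: ["e"], lengthPx: 48 }));                 // "radial-entry"
    ```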
  • Publication number: 20160062467
    Abstract: This document relates to touch screen controls. For instance, the touch screen controls can allow a user to control a computing device by engaging a touch screen associated with the computing device. One implementation can receive at least one tactile contact from a region of a touch screen. This implementation can present a first command functionality on the touch screen proximate the region for a predefined time. It can await user engagement of the first command functionality. Lacking user engagement within the predefined time, the implementation can remove the first command functionality and offer a second command functionality.
    Type: Application
    Filed: October 27, 2015
    Publication date: March 3, 2016
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: William A. S. Buxton, Michel Pahud, Kenneth P. Hinckley
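    A minimal sketch of the timed fallback from the first command functionality to the second; the callback shape and the 1500 ms default are assumptions.
    ```typescript
    // Hypothetical timeout flow: present the first command near the touched
    // region; if the user has not engaged it within the window, remove it and
    // offer the second command instead.
    function offerCommands(
      show: (command: "first" | "second") => void,
      remove: () => void,
      userEngaged: () => boolean,
      timeoutMs = 1500,
    ): void {
      show("first");
      setTimeout(() => {
        if (!userEngaged()) { // no engagement within the predefined time
          remove();
          show("second");
        }
      }, timeoutMs);
    }
    ```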
  • Patent number: 9223471
    Abstract: This document relates to touch screen controls. For instance, the touch screen controls can allow a user to control a computing device by engaging a touch screen associated with the computing device. One implementation can receive at least one tactile contact from a region of a touch screen. This implementation can present a first command functionality on the touch screen proximate the region for a predefined time. It can await user engagement of the first command functionality. Lacking user engagement within the predefined time, the implementation can remove the first command functionality and offer a second command functionality.
    Type: Grant
    Filed: December 28, 2010
    Date of Patent: December 29, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: William A. S. Buxton, Michel Pahud, Kenneth P. Hinckley
  • Publication number: 20150363034
    Abstract: A grip of a primary user on a touch-sensitive computing device and a grip of a secondary user on the touch-sensitive computing device are sensed and correlated to determine whether the primary user is sharing or handing off the computing device to the secondary user. In the case of handoff, capabilities of the computing device may be restricted, while in a sharing mode only certain content on the computing device is shared. In some implementations both a touch-sensitive pen and the touch-sensitive computing device are passed from a primary user to a secondary user. Sensor inputs representing the grips of the users on both the pen and the touch-sensitive computing device are correlated to determine the context of the grips and to initiate a context-appropriate command in an application executing on the touch-sensitive pen or the touch-sensitive computing device. Metadata is also derived from the correlated sensor inputs.
    Type: Application
    Filed: June 12, 2014
    Publication date: December 17, 2015
    Inventors: Ken Hinckley, Hrvoje Benko, Michel Pahud, Andrew D. Wilson, Pourang Polad Irani, Francois Guimbretiere
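    A sketch of how correlated grips might be classified into handoff versus sharing, with a restricted capability set in the handoff case; the grip model and capability names are illustrative only.
    ```typescript
    // Hypothetical correlation of two users' grips on one device: if the primary
    // user lets go while the secondary user holds on, treat the exchange as a
    // handoff and restrict capabilities; if both keep hold, treat it as sharing.
    interface Grip { userId: string; holding: boolean; }

    function classifyExchange(primary: Grip, secondary: Grip): "handoff" | "sharing" | "none" {
      if (!secondary.holding) return "none";
      if (!primary.holding) return "handoff"; // device fully passed to the secondary user
      return "sharing";                       // both hold it, e.g. showing content across a table
    }

    function capabilitiesFor(mode: "handoff" | "sharing" | "none"): string[] {
      if (mode === "handoff") return ["view-shared-document"];             // restricted set
      if (mode === "sharing") return ["view-shared-document", "annotate"]; // only certain content is shared
      return ["full-access"];
    }
    ```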
  • Publication number: 20150363035
    Abstract: Pen and computing device sensor correlation technique embodiments correlate sensor signals received from various grips on a touch-sensitive pen and touches to a touch-sensitive computing device in order to determine the context of such grips and touches and to issue context-appropriate commands to the touch-sensitive pen or the touch-sensitive computing device. A combination of concurrent sensor inputs received from both a touch-sensitive pen and a touch-sensitive computing device is correlated. How the touch-sensitive pen and the touch-sensitive computing device are touched or gripped is used to determine the context of their use and the user's intent. A context-appropriate user interface action can then be initiated. The context can also be used to label metadata.
    Type: Application
    Filed: June 12, 2014
    Publication date: December 17, 2015
    Inventors: Ken Hinckley, Hrvoje Benko, Michel Pahud, Andrew D. Wilson, Pourang Polad Irani, Francois Guimbretiere
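    One way to picture the grip-to-command correlation is a small lookup table, as sketched below; the grip categories and commands are hypothetical examples, not the claimed mappings.
    ```typescript
    // Hypothetical lookup from concurrent pen-grip and device-touch readings to a
    // context-appropriate command.
    type PenGrip = "writing" | "tucked" | "baton";
    type DeviceTouch = "thumb-on-bezel" | "flat-palm" | "none";

    const contextCommands: Record<PenGrip, Partial<Record<DeviceTouch, string>>> = {
      writing: { "thumb-on-bezel": "show-pen-tool-palette", none: "ink-on-canvas" },
      tucked:  { "flat-palm": "pan-and-zoom-with-touch", none: "touch-ui" },
      baton:   { none: "pointing-mode" },
    };

    function commandFor(pen: PenGrip, device: DeviceTouch): string {
      return contextCommands[pen][device] ?? "default-mode";
    }

    console.log(commandFor("writing", "thumb-on-bezel")); // "show-pen-tool-palette"
    ```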
  • Patent number: 9201539
    Abstract: A computing device includes a fingerprint detection module for detecting fingerprint information that may be contained within touch input event(s) provided by a touch input mechanism. The computing device can leverage the fingerprint information in various ways. In one approach, the computing device can use the fingerprint information to enhance an interpretation of the touch input event(s), such as by rejecting parts of the touch input event(s) associated with an unintended input action. In another approach, the computing device can use the fingerprint information to identify an individual associated with the fingerprint information. The computing device can apply this insight to provide a customized user experience to that individual, such as by displaying content that is targeted to that individual.
    Type: Grant
    Filed: December 17, 2010
    Date of Patent: December 1, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. Hinckley, Michel Pahud
  • Patent number: 9111263
    Abstract: A template and/or knowledge associated with a synchronous meeting are obtained by a computing device. The computing device then adaptively manages the synchronous meeting based at least in part on the template and/or knowledge.
    Type: Grant
    Filed: June 15, 2009
    Date of Patent: August 18, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Jin Li, James E. Oker, Rajesh K. Hegde, Dinei Afonso Ferreira Florencio, Michel Pahud, Sharon K. Cunnington, Philip A. Chou, Zhengyou Zhang
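    At its simplest, template-driven management could amount to checking elapsed time against a planned agenda, as in this hedged sketch; the template shape and advice strings are assumptions.
    ```typescript
    // Hypothetical template-driven check: compare elapsed time against the agenda
    // carried in a meeting template and suggest moving on when an item runs over.
    interface AgendaItem { title: string; plannedMinutes: number; }

    function adviseFacilitator(template: AgendaItem[], itemIndex: number, elapsedMinutes: number): string {
      const item = template[itemIndex];
      if (!item) return "Agenda complete.";
      return elapsedMinutes > item.plannedMinutes
        ? `"${item.title}" is over its ${item.plannedMinutes}-minute slot; consider moving on.`
        : `"${item.title}" has ${item.plannedMinutes - elapsedMinutes} minutes remaining.`;
    }

    console.log(adviseFacilitator([{ title: "Status updates", plannedMinutes: 10 }], 0, 12));
    ```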
  • Publication number: 20150124046
    Abstract: A system facilitates managing one or more devices utilized for communicating data within a telepresence session. A telepresence session can be initiated within a communication framework that includes a first user and one or more second users. In response to determining a temporary absence of the first user from the telepresence session, a recordation of the telepresence session is initialized to enable a playback of a portion or a summary of the telepresence session that the first user has missed.
    Type: Application
    Filed: December 16, 2014
    Publication date: May 7, 2015
    Inventors: Christian Huitema, William A. S. Buxton, Jonathan E. Paff, Zicheng Liu, Rajesh Kutpadi Hegde, Zhengyou Zhang, Kori Marie Quinn, Jin Li, Michel Pahud
  • Publication number: 20150095843
    Abstract: Systems and methods for presenting a dynamic user-interaction control are presented. The dynamic user-interaction control enables a device user to interact with a touch-sensitive device in a single-handed manner. A triggering event causes the dynamic user-interaction control to be temporarily presented on a display screen. In various embodiments, a dynamic user-interaction control is presented at the location corresponding to the triggering event (i.e., the location of the device user's touch). The dynamic user-interaction control remains present on the display screen and the device user can interact with the control until a dismissal event is encountered. A dismissal event occurs under multiple conditions, including when the device user breaks touch connection with the dynamic user-interaction control for a predetermined amount of time.
    Type: Application
    Filed: September 27, 2013
    Publication date: April 2, 2015
    Applicant: Microsoft Corporation
    Inventors: Pierre Paul Nicolas Greborio, Michel Pahud
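    The control's lifecycle, including dismissal after touch contact has been broken for a while, might be wired up roughly as follows; the 800 ms delay and callback names are assumptions.
    ```typescript
    // Hypothetical lifecycle: present the control at the touch location, keep it
    // while the user stays in contact, and dismiss it only after contact has been
    // broken for a predetermined time.
    class DynamicControl {
      private dismissTimer: ReturnType<typeof setTimeout> | null = null;

      constructor(
        private present: (x: number, y: number) => void,
        private dismiss: () => void,
        private dismissalDelayMs = 800,
      ) {}

      onTriggeringTouch(x: number, y: number): void { this.present(x, y); }

      onTouchBroken(): void {
        this.dismissTimer = setTimeout(() => this.dismiss(), this.dismissalDelayMs);
      }

      onTouchResumed(): void {
        if (this.dismissTimer !== null) { clearTimeout(this.dismissTimer); this.dismissTimer = null; }
      }
    }
    ```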
  • Patent number: 8994646
    Abstract: A computing device is described herein which accommodates gestures that involve intentional movement of the computing device, whether by establishing an orientation of the computing device, by dynamically moving it, or both. The gestures may also be accompanied by contact with a display surface (or other part) of the computing device. For example, the user may establish contact with the display surface via a touch input mechanism and/or a pen input mechanism and then move the computing device in a prescribed manner.
    Type: Grant
    Filed: December 17, 2010
    Date of Patent: March 31, 2015
    Assignee: Microsoft Corporation
    Inventors: Kenneth P. Hinckley, Michel Pahud, Wenqi Shen
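    A rough sketch of combining a held contact with device rotation to detect a motion gesture; the 25-degree threshold and gesture names are assumptions, not claimed values.
    ```typescript
    // Hypothetical combined gesture: while the user holds a contact on the display
    // and then moves the device, a sufficiently large rotation is interpreted as a
    // motion gesture.
    interface DeviceState { contactHeld: boolean; rotationDeg: { x: number; y: number }; }

    function detectMotionGesture(state: DeviceState): string | null {
      if (!state.contactHeld) return null; // gesture requires touch or pen contact
      const { x, y } = state.rotationDeg;
      if (Math.abs(x) > 25) return x > 0 ? "tilt-toward-user" : "tilt-away-from-user";
      if (Math.abs(y) > 25) return y > 0 ? "tilt-right" : "tilt-left";
      return null; // below threshold: ignore incidental motion
    }

    console.log(detectMotionGesture({ contactHeld: true, rotationDeg: { x: 30, y: 2 } })); // "tilt-toward-user"
    ```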