Patents by Inventor Michel Pahud

Michel Pahud has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 8982045
    Abstract: A computing device is described herein which collects input event(s) from at least one contact-type input mechanism (such as a touch input mechanism) and at least one movement-type input mechanism (such as an accelerometer and/or gyro device). The movement-type input mechanism can identify the orientation of the computing device and/or the dynamic motion of the computing device. The computing device uses these input events to interpret the type of input action that has occurred, e.g., to assess when at least part of the input action is unintentional. The computing device can then perform behavior based on its interpretation, such as by ignoring part of the input event(s), restoring a pre-action state, correcting at least part of the input event(s), and so on.
    Type: Grant
    Filed: December 17, 2010
    Date of Patent: March 17, 2015
    Assignee: Microsoft Corporation
    Inventors: Kenneth P. Hinckley, Michel Pahud, Wenqi Shen
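As a rough illustration of the idea in this abstract (combining a contact-type input with a movement-type input to flag unintentional actions), the sketch below fuses a touch contact with motion-sensor readings. All field names and thresholds are hypothetical, not taken from the patent:

```python
from dataclasses import dataclass

# Hypothetical fused input event: one touch contact plus the device
# motion sampled around the moment of contact.
@dataclass
class FusedEvent:
    contact_area: float   # normalized touch footprint, 0..1
    angular_rate: float   # gyro magnitude (rad/s) at contact time
    tilt_change: float    # orientation change (degrees) around contact

def is_unintentional(event: FusedEvent,
                     area_max: float = 0.4,
                     rate_max: float = 2.0,
                     tilt_max: float = 15.0) -> bool:
    """Flag a touch as likely accidental when it coincides with a large
    contact footprint or significant device motion (thresholds are
    illustrative only)."""
    if event.contact_area > area_max:
        return True                       # palm-sized contact
    if event.angular_rate > rate_max:
        return True                       # device being rotated or grabbed
    if event.tilt_change > tilt_max:
        return True                       # orientation shifted mid-touch
    return False

# A fingertip tap on a stationary device passes; a broad contact while
# the device is being picked up is rejected.
tap = FusedEvent(contact_area=0.05, angular_rate=0.1, tilt_change=1.0)
grab = FusedEvent(contact_area=0.6, angular_rate=3.5, tilt_change=40.0)
```

Once an event is flagged, the behaviors the abstract lists (ignoring the event, restoring a pre-action state, correcting the event) would hang off this decision.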
  • Patent number: 8941710
    Abstract: A system facilitates managing one or more devices utilized for communicating data within a telepresence session. A telepresence session can be initiated within a communication framework that includes a first user and one or more second users. In response to determining a temporary absence of the first user from the telepresence session, a recordation of the telepresence session is initialized to enable a playback of a portion or a summary of the telepresence session that the first user has missed.
    Type: Grant
    Filed: August 13, 2012
    Date of Patent: January 27, 2015
    Assignee: Microsoft Corporation
    Inventors: Christian Huitema, William A. S. Buxton, Jonathan E. Paff, Zicheng Liu, Rajesh Kutpadi Hegde, Zhengyou Zhang, Kori Marie Quinn, Jin Li, Michel Pahud
  • Patent number: 8938558
    Abstract: Described herein are techniques and systems that allow modification of functionalities based on distances between a shared device (e.g., a shared display, etc.) and an individual device (e.g., a mobile computing device, etc.). The shared device and the individual device may establish a communication to enable exchange of data. In some embodiments, the shared device or the individual device may measure a distance between the shared device and the individual device. Based on the distance, the individual device may operate in a different mode. In some instances, the shared device may then instruct the individual device to modify a functionality corresponding to the mode.
    Type: Grant
    Filed: March 14, 2013
    Date of Patent: January 20, 2015
    Assignee: Microsoft Corporation
    Inventors: Michel Pahud, Kenneth P. Hinckley, William A. S. Buxton, Gina D. Venolia
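A minimal sketch of the distance-to-mode mapping this abstract describes. The mode names, distance bands, and the idea of toggling an `input_enabled` flag are illustrative assumptions, not details from the patent:

```python
# Hypothetical mapping from measured distance (shared device to
# individual device) to an operating mode on the individual device.
def select_mode(distance_m: float) -> str:
    if distance_m < 1.0:
        return "touch"      # close enough to interact directly
    if distance_m < 5.0:
        return "pointer"    # mid-range: act as a remote pointer
    return "viewer"         # far away: read-only mirroring

def apply_mode(individual_device: dict, distance_m: float) -> dict:
    """The shared device measures the distance and instructs the
    individual device to modify its functionality accordingly."""
    updated = dict(individual_device)
    updated["mode"] = select_mode(distance_m)
    updated["input_enabled"] = updated["mode"] != "viewer"
    return updated

phone = {"name": "phone"}
near = apply_mode(phone, 0.5)   # touch mode, input enabled
far = apply_mode(phone, 8.0)    # viewer mode, input disabled
```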
  • Patent number: 8902181
    Abstract: Functionality is described herein for detecting and responding to gestures performed by a user using a computing device, such as, but not limited to, a tablet computing device. In one implementation, the functionality operates by receiving touch input information in response to the user touching the computing device, and movement input information in response to the user moving the computing device. The functionality then determines whether the input information indicates that a user has performed or is performing a multi-touch-movement (MTM) gesture. The functionality can then perform any behavior in response to determining that the user has performed an MTM gesture, such as by modifying a view or invoking a function, etc.
    Type: Grant
    Filed: February 7, 2012
    Date of Patent: December 2, 2014
    Assignee: Microsoft Corporation
    Inventors: Kenneth P. Hinckley, Michel Pahud, William A. S. Buxton
  • Publication number: 20140247240
    Abstract: A method, system, and one or more computer-readable storage media for providing multi-dimensional haptic touch screen interaction are provided herein. The method includes detecting a force applied to a touch screen by an object and determining a magnitude, direction, and location of the force. The method also includes determining a haptic force feedback to be applied by the touch screen on the object based on the magnitude, direction, and location of the force applied to the touch screen, and displacing the touch screen in a specified direction such that the haptic force feedback is applied by the touch screen on the object.
    Type: Application
    Filed: March 4, 2013
    Publication date: September 4, 2014
    Applicant: MICROSOFT CORPORATION
    Inventors: Michael J. Sinclair, Michel Pahud, Hrvoje Benko
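The steps in this abstract (measure the applied force's magnitude, direction, and location; derive a feedback displacement; displace the screen) could be sketched as below. The proportional constant `k` and the simple "push back opposite the applied direction" rule are illustrative assumptions, not the patented method:

```python
import math

def haptic_response(force_n: float, direction_deg: float,
                    location: tuple, k: float = 0.8) -> dict:
    """Compute a screen displacement applying haptic feedback against
    the object pressing on it, proportional to the sensed force."""
    feedback_mag = k * force_n
    # Displace the screen opposite to the applied force direction.
    opposite = (direction_deg + 180.0) % 360.0
    dx = feedback_mag * math.cos(math.radians(opposite))
    dy = feedback_mag * math.sin(math.radians(opposite))
    return {"at": location, "dx": dx, "dy": dy}

# A 2 N push along +x at (120, 300) yields a push-back along -x.
resp = haptic_response(force_n=2.0, direction_deg=0.0, location=(120, 300))
```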
  • Publication number: 20140247207
    Abstract: Techniques for placing an object at a specific location on a shared device are described herein. These techniques may include connecting the shared device with an individual device. The individual device may transmit the object to the shared device, where it is displayed at an initial object position on the shared device's display. The initial object position may be updated in response to movement of the individual device, and the object may be displayed at the updated object position. The object position may be locked in response to a signal, and the object may then be displayed at the locked object position.
    Type: Application
    Filed: March 14, 2013
    Publication date: September 4, 2014
    Applicant: MICROSOFT CORPORATION
    Inventors: Michel Pahud, Pierre P.N. Greborio, Kenneth P. Hinckley, William A.S. Buxton
  • Publication number: 20140250245
    Abstract: Described herein are techniques and systems that allow modification of functionalities based on distances between a shared device (e.g., a shared display, etc.) and an individual device (e.g., a mobile computing device, etc.). The shared device and the individual device may establish a communication to enable exchange of data. In some embodiments, the shared device or the individual device may measure a distance between the shared device and the individual device. Based on the distance, the individual device may operate in a different mode. In some instances, the shared device may then instruct the individual device to modify a functionality corresponding to the mode.
    Type: Application
    Filed: March 14, 2013
    Publication date: September 4, 2014
    Applicant: MICROSOFT CORPORATION
    Inventors: Michel Pahud, Kenneth P. Hinckley, William A.S. Buxton, Gina D. Venolia
  • Publication number: 20140123049
    Abstract: The subject disclosure is directed towards a graphical or printed keyboard from which keys made redundant by gesture input have been removed. For example, such a keyboard may be the same overall size and have the same key sizes as other graphical or printed keyboards with no numeric keys, yet, by removing the redundant keys, fit numeric and alphabetic keys into the same footprint. Also described is having three or more characters per key, with a tap corresponding to one character and different gestures on the key differentiating among the other characters.
    Type: Application
    Filed: December 19, 2012
    Publication date: May 1, 2014
    Applicant: MICROSOFT CORPORATION
    Inventors: William A. S. Buxton, Ahmed Sabbir Arif, Michel Pahud, Kenneth P. Hinckley, Finbarr S. Duggan
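A sketch of the "three or more characters per key" idea from this abstract: a tap yields the primary character and distinct gestures on the same key select the others, so numeric and symbol input fits in the alphabetic footprint. The gesture names and the two-key layout are hypothetical:

```python
# Hypothetical layout fragment: each key carries several characters,
# differentiated by the gesture performed on it.
KEYMAP = {
    "q": {"tap": "q", "flick_up": "1", "flick_down": "!"},
    "w": {"tap": "w", "flick_up": "2", "flick_down": "@"},
}

def resolve_key(key: str, gesture: str = "tap") -> str:
    """Map a (key, gesture) pair to a character; a plain tap gives the
    primary character, gestures give the alternates."""
    return KEYMAP[key][gesture]

resolve_key("q")              # 'q' — plain tap
resolve_key("q", "flick_up")  # '1' — gesture reuses the same key
```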
  • Patent number: 8660978
    Abstract: A computing device is described herein for detecting and addressing unintended contact of a hand portion (such as a palm) or other article with a computing device. The computing device uses multiple factors to determine whether input events are accidental, including, for instance, the tilt of a pen device as it approaches a display surface of the computing device. The computing device can also capture and analyze input events which represent a hand that is close to the display surface, but not making physical contact with the display surface. The computing device can execute one or more behaviors to counteract the effect of any inadvertent input actions that it may detect.
    Type: Grant
    Filed: December 17, 2010
    Date of Patent: February 25, 2014
    Assignee: Microsoft Corporation
    Inventors: Kenneth P. Hinckley, Michel Pahud
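The multi-factor palm check this abstract describes could look roughly like the following. The cues mirror the abstract (pen tilt on approach, a hand hovering near the surface, contact size), but the thresholds and the simple rule combining them are illustrative assumptions:

```python
def classify_contact(contact_area: float, pen_tilt_deg: float,
                     pen_hovering: bool) -> str:
    """Decide whether a contact is the pen tip or an incidental palm,
    using hover state, pen tilt, and contact footprint (0..1)."""
    if pen_hovering and contact_area > 0.3:
        # A broad contact while the pen approaches is likely the palm
        # planted to steady the writing hand.
        return "palm"
    if pen_tilt_deg > 60.0 and contact_area > 0.3:
        # A steeply tilted pen plus a large footprint also suggests palm.
        return "palm"
    return "pen"

classify_contact(contact_area=0.5, pen_tilt_deg=20.0, pen_hovering=True)   # 'palm'
classify_contact(contact_area=0.02, pen_tilt_deg=20.0, pen_hovering=True)  # 'pen'
```

The counteracting behaviors the abstract mentions (undoing or suppressing the accidental strokes) would be triggered whenever `"palm"` is returned.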
  • Patent number: 8600731
    Abstract: The claimed subject matter provides a system and/or a method that facilitates communication within a telepresence session. A telepresence session can be initiated within a communication framework that includes two or more virtually represented users that communicate therein. The telepresence session can include at least one virtually represented user that communicates in a first language, wherein the communication is at least one of a portion of audio, a portion of video, a portion of graphic, a gesture, or a portion of text. An interpreter component can evaluate the communication to translate the identified first language into a second language within the telepresence session, wherein the translation is automatically provided to at least one virtually represented user within the telepresence session.
    Type: Grant
    Filed: February 4, 2009
    Date of Patent: December 3, 2013
    Assignee: Microsoft Corporation
    Inventors: Sharon Kay Cunnington, Jin Li, Michel Pahud, Rajesh K. Hegde, Zhengyou Zhang
  • Patent number: 8542237
    Abstract: Font animation technique embodiments are presented which animate alpha-numeric characters of a message or document. In one general embodiment this is accomplished by the sender transmitting parametric information and animation instructions pertaining to the display of characters found in the message or document to a recipient. The parametric information identifies where to split the characters and where to rotate the resulting sections. The sections of each character affected are then translated and/or rotated and/or scaled as dictated by the animation instructions to create an animation over time. Additionally, if a gap in a stroke of an animated character exists between the sections of the character, a connecting section is displayed to close the stroke gap, making the character appear contiguous.
    Type: Grant
    Filed: June 23, 2008
    Date of Patent: September 24, 2013
    Assignee: Microsoft Corporation
    Inventors: Michel Pahud, William Buxton, Sharon Cunnington
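The core geometric step in this abstract (split a character into sections, then rotate and translate each section over time) amounts to a 2-D rigid transform about a per-section pivot. The sketch below is a plain rotation-plus-translation helper, offered only as an illustration of that step; interpolating `angle_deg` across frames would produce the animation:

```python
import math

def animate_section(pivot, point, angle_deg, translate=(0.0, 0.0)):
    """Rotate one point of a character section about the section's
    pivot, then translate it."""
    a = math.radians(angle_deg)
    px, py = pivot
    x, y = point[0] - px, point[1] - py
    rx = x * math.cos(a) - y * math.sin(a)
    ry = x * math.sin(a) + y * math.cos(a)
    return (rx + px + translate[0], ry + py + translate[1])

# Rotating a point 90 degrees about its section's origin:
rotated = animate_section(pivot=(0, 0), point=(1, 0), angle_deg=90)
```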
  • Publication number: 20130201113
    Abstract: Functionality is described herein for detecting and responding to gestures performed by a user using a computing device, such as, but not limited to, a tablet computing device. In one implementation, the functionality operates by receiving touch input information in response to the user touching the computing device, and movement input information in response to the user moving the computing device. The functionality then determines whether the input information indicates that a user has performed or is performing a multi-touch-movement (MTM) gesture. The functionality can then perform any behavior in response to determining that the user has performed an MTM gesture, such as by modifying a view or invoking a function, etc.
    Type: Application
    Filed: February 7, 2012
    Publication date: August 8, 2013
    Applicant: Microsoft Corporation
    Inventors: Kenneth P. Hinckley, Michel Pahud, William A. S. Buxton
  • Patent number: 8332755
    Abstract: The claimed subject matter provides a system and/or a method that facilitates replicating a telepresence session with a real world physical meeting. A telepresence session can be initiated within a communication framework that includes two or more virtually represented users that communicate therein. A trigger component can monitor the telepresence session in real time to identify a participant interaction with an object, wherein the object is at least one of a real world physical object or a virtually represented object within the telepresence session. A feedback component can implement a force feedback to at least one participant within the telepresence session based upon the identified participant interaction with the object, wherein the force feedback is employed via a device associated with at least one participant.
    Type: Grant
    Filed: May 27, 2009
    Date of Patent: December 11, 2012
    Assignee: Microsoft Corporation
    Inventors: Zhengyou Zhang, Xuedong D. Huang, Jin Li, Rajesh Kutpadi Hegde, Kori Marie Quinn, Michel Pahud, Jayman Dalal
  • Publication number: 20120306995
    Abstract: A system facilitates managing one or more devices utilized for communicating data within a telepresence session. A telepresence session can be initiated within a communication framework that includes a first user and one or more second users. In response to determining a temporary absence of the first user from the telepresence session, a recordation of the telepresence session is initialized to enable a playback of a portion or a summary of the telepresence session that the first user has missed.
    Type: Application
    Filed: August 13, 2012
    Publication date: December 6, 2012
    Applicant: Microsoft Corporation
    Inventors: Christian Huitema, William A.S. Buxton, Jonathan E. Paff, Zicheng Liu, Rajesh Kutpadi Hegde, Zhengyou Zhang, Kori Marie Quinn, Jin Li, Michel Pahud
  • Publication number: 20120293439
    Abstract: Apparatus and methods for improving touch-screen interface usability and accuracy by determining the trajectory of a pointer as it approaches the touch-screen and modifying the touch-screen display accordingly. The system may predict an object on the display the user is likely to select next. The system may designate this object as a Designated Target Object, or DTO. The system may modify the appearance of the DTO by, for example, changing the size of the DTO, or by changing its shape, style, coloring, perspective, positioning, etc.
    Type: Application
    Filed: August 1, 2012
    Publication date: November 22, 2012
    Applicant: MICROSOFT CORPORATION
    Inventors: Michel Pahud, Oscar E. Murillo, Amy K. Karlson, Benjamin B. Bederson
  • Patent number: 8261211
    Abstract: Apparatus and methods for improving touch-screen interface usability and accuracy by determining the trajectory of a pointer as it approaches the touch-screen and modifying the touch-screen display accordingly. The system may predict an object on the display the user is likely to select next. The system may designate this object as a Designated Target Object, or DTO. The system may modify the appearance of the DTO by, for example, changing the size of the DTO, or by changing its shape, style, coloring, perspective, positioning, etc.
    Type: Grant
    Filed: October 1, 2009
    Date of Patent: September 4, 2012
    Assignee: Microsoft Corporation
    Inventors: Michel Pahud, Oscar E. Murillo, Amy K. Karlson, Benjamin B. Bederson
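The two steps this abstract describes (predict the likely target from the pointer's trajectory, then modify that Designated Target Object's appearance) could be sketched as below. Linear extrapolation from the last two samples, and enlarging the DTO by a fixed scale, are illustrative simplifications:

```python
def predict_dto(pointer_track, targets):
    """Extrapolate the pointer's recent trajectory one step ahead and
    pick the target nearest the extrapolated point as the DTO."""
    (x0, y0), (x1, y1) = pointer_track[-2], pointer_track[-1]
    px, py = x1 + (x1 - x0), y1 + (y1 - y0)
    def dist2(t):
        tx, ty = t["pos"]
        return (tx - px) ** 2 + (ty - py) ** 2
    return min(targets, key=dist2)

def emphasize(target, scale=1.5):
    """Modify the DTO's appearance — here, by enlarging it."""
    updated = dict(target)
    updated["size"] = updated.get("size", 1.0) * scale
    return updated

targets = [{"name": "ok", "pos": (100, 100)},
           {"name": "cancel", "pos": (10, 10)}]
dto = predict_dto([(80, 80), (90, 90)], targets)  # heading toward (100, 100)
big = emphasize(dto)
```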
  • Patent number: 8253774
    Abstract: The claimed subject matter provides a system and/or a method that facilitates managing one or more devices utilized for communicating data within a telepresence session. A telepresence session can be initiated within a communication framework that includes two or more virtually represented users that communicate therein. A device can be utilized by at least one virtually represented user that enables communication within the telepresence session, the device including at least one of an input to transmit a portion of a communication to the telepresence session or an output to receive a portion of a communication from the telepresence session. A detection component can adjust at least one of the input related to the device or the output related to the device based upon the identification of a cue, wherein the cue is at least one of a detected movement, a detected event, or an ambient variation.
    Type: Grant
    Filed: March 30, 2009
    Date of Patent: August 28, 2012
    Assignee: Microsoft Corporation
    Inventors: Christian Huitema, William A. S. Buxton, Jonathan E. Paff, Zicheng Liu, Rajesh Kutpadi Hegde, Zhengyou Zhang, Kori Marie Quinn, Jin Li, Michel Pahud
  • Publication number: 20120162093
    Abstract: This document relates to touch screen controls. For instance, the touch screen controls can allow a user to control a computing device by engaging a touch screen associated with the computing device. One implementation can receive at least one tactile contact from a region of a touch screen. This implementation can present a first command functionality on the touch screen proximate the region for a predefined time. It can await user engagement of the first command functionality. Lacking user engagement within the predefined time, the implementation can remove the first command functionality and offer a second command functionality.
    Type: Application
    Filed: December 28, 2010
    Publication date: June 28, 2012
    Applicant: Microsoft Corporation
    Inventors: William A.S. Buxton, Michel Pahud, Kenneth P. Hinckley
  • Publication number: 20120154296
    Abstract: A computing device includes a fingerprint detection module for detecting fingerprint information that may be contained within touch input event(s) provided by a touch input mechanism. The computing device can leverage the fingerprint information in various ways. In one approach, the computing device can use the fingerprint information to enhance an interpretation of the touch input event(s), such as by rejecting parts of the touch input event(s) associated with an unintended input action. In another approach, the computing device can use the fingerprint information to identify an individual associated with the fingerprint information. The computing device can apply this insight to provide a customized user experience to that individual, such as by displaying content that is targeted to that individual.
    Type: Application
    Filed: December 17, 2010
    Publication date: June 21, 2012
    Applicant: Microsoft Corporation
    Inventors: Kenneth P. Hinckley, Michel Pahud
  • Publication number: 20120154295
    Abstract: A computing device is described which allows a user to convey a gesture through the cooperative use of two input mechanisms, such as a touch input mechanism and a pen input mechanism. A user uses a first input mechanism to demarcate content presented on a display surface of the computing device or other part of the computing device, e.g., by spanning the content with two fingers of a hand. The user then uses a second input mechanism to make gestures within the content that is demarcated by first input mechanism. In doing so, the first input mechanism establishes a context which governs the interpretation of gestures made by the second input mechanism. The computing device can also activate the joint use mode using two applications of the same input mechanism, such as two applications of a touch input mechanism.
    Type: Application
    Filed: December 17, 2010
    Publication date: June 21, 2012
    Applicant: Microsoft Corporation
    Inventors: Kenneth P. Hinckley, Michel Pahud