Patents by Inventor Michel Pahud
Michel Pahud has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20220027118
Abstract: Systems and methods relate to sharing content and data across users and devices. A shared folio includes media primitives and tools as its constituents. The use of shared folios addresses the issue of reliably and efficiently transferring user activities along with their data. The use addresses scenarios of both single-user work utilizing multiple devices and collaborative work among users utilizing multiple devices. The media primitive includes content in various data types. The tool includes data and access to the data by devices and applications. A data distributor manages automatic synchronization of the folios across devices using centralized and distributed transaction logs. The folios are synchronized with resiliency against failures in client devices. The folio and its constituents are interactively accessible through a top-level, semi-transparent user interface. The media primitive and the tools may programmatically access local applications to automatically transfer user activities among users and devices.
Type: Application
Filed: October 5, 2021
Publication date: January 27, 2022
Applicant: Microsoft Technology Licensing, LLC
Inventors: Kenneth P. Hinckley, Michel Pahud, Jonathan D. Goldstein, Frederik Martin Brudy
-
Publication number: 20220011924
Abstract: In various embodiments, methods and systems for implementing a multi-device mixed interactivity system are provided. The interactivity system includes paired mixed-input devices for interacting with and controlling virtual objects. In operation, a selection profile associated with a virtual object is accessed. The selection profile is generated based on a selection input determined using real input associated with a selection device and virtual input associated with a mixed-reality device. The selection device has a first display and the mixed-reality device has a second display; both display the virtual object. An annotation input for the virtual object, based on a selected portion corresponding to the selection profile, is received. An annotation profile based on the annotation input is generated. The annotation profile includes annotation profile attributes for annotating a portion of the virtual object. An annotation of the selected portion of the virtual object is caused to be displayed.
Type: Application
Filed: June 1, 2021
Publication date: January 13, 2022
Inventors: Michel Pahud, Nathalie Riche, Eyal Ofek, Christophe Hurter, Steven Mark Drucker
-
Publication number: 20210397274
Abstract: Methods and systems are provided that are directed to automatically adjusting a user interface based on the tilt position of a digital drawing board. The digital drawing board has a tiltable screen with a sensor. The tiltable screen may be fixed in a stable tilt position. The sensor is used to determine that the digital drawing board is in a first tilt position. The digital drawing board displays a first user interface associated with the first tilt position. The first user interface may be associated with a first use mode. The first user interface may also be based on an application running on the digital drawing board. When the sensor senses that the digital drawing board has moved from the first tilt position to a second tilt position, the board automatically displays a second user interface associated with the second tilt position. The second user interface has different functionality than the first user interface.
Type: Application
Filed: June 19, 2020
Publication date: December 23, 2021
Applicant: Microsoft Technology Licensing, LLC
Inventors: Kenneth Paul Hinckley, Hugo Karl Denis Romat, Christopher Mervin Collins, Nathalie Riche, Michel Pahud, Adam Samuel Riddle, William Arthur Stewart Buxton
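The tilt-to-interface mapping described in this abstract can be sketched roughly as follows. This is an illustrative sketch only, not code from the patent; the mode names and angle thresholds are assumptions.

```python
# Illustrative sketch (not from the patent): map a sensed tilt angle of a
# digital drawing board to a user-interface mode. Thresholds are assumptions.

def ui_mode_for_tilt(angle_degrees: float) -> str:
    """Return a UI mode based on the board's tilt angle from horizontal."""
    if angle_degrees < 15:
        return "drafting"   # near-horizontal: pen-centric drawing UI
    elif angle_degrees < 60:
        return "easel"      # mid-tilt: mixed touch/pen UI
    else:
        return "viewing"    # near-vertical: presentation UI


class DrawingBoard:
    def __init__(self):
        self.mode = None

    def on_tilt_sensor(self, angle_degrees: float) -> None:
        # Only switch interfaces when the sensed angle crosses into a new mode.
        new_mode = ui_mode_for_tilt(angle_degrees)
        if new_mode != self.mode:
            self.mode = new_mode
            self.show_interface(new_mode)

    def show_interface(self, mode: str) -> None:
        print(f"Displaying {mode} interface")
```

A real implementation would also debounce the sensor so that small wobbles around a threshold do not flip the interface back and forth.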
-
Patent number: 11188143
Abstract: In some examples, a surface, such as a desktop, in front of or around a portable electronic device may be used as a relatively large surface for interacting with the portable electronic device, which typically has a small display screen. A user may write or draw on the surface using any object, such as a finger, pen, or stylus. The surface may also be used to simulate a partial or full-size keyboard. The use of a camera to sense the three-dimensional (3D) location or motion of the object may enable use of above-the-surface gestures, entry of directionality, and capture of real objects into a document being processed or stored by the electronic device. One or more objects may be used to manipulate elements displayed by the portable electronic device.
Type: Grant
Filed: January 4, 2016
Date of Patent: November 30, 2021
Assignee: Microsoft Technology Licensing, LLC
Inventors: Eyal Ofek, Michel Pahud, Pourang P. Irani
-
Patent number: 11144275
Abstract: Systems and methods relate to sharing content and data across users and devices. A shared folio includes media primitives and tools as its constituents. The use of shared folios addresses the issue of reliably and efficiently transferring user activities along with their data. The use addresses scenarios of both single-user work utilizing multiple devices and collaborative work among users utilizing multiple devices. The media primitive includes content in various data types. The tool includes data and access to the data by devices and applications. A data distributor manages automatic synchronization of the folios across devices using centralized and distributed transaction logs. The folios are synchronized with resiliency against failures in client devices. The folio and its constituents are interactively accessible through a top-level, semi-transparent user interface. The media primitive and the tools may programmatically access local applications to automatically transfer user activities among users and devices.
Type: Grant
Filed: June 29, 2020
Date of Patent: October 12, 2021
Assignee: Microsoft Technology Licensing, LLC
Inventors: Kenneth P. Hinckley, Michel Pahud, Jonathan D. Goldstein, Frederik Martin Brudy
-
Patent number: 11119581
Abstract: In various embodiments, computerized systems and methods for displacement oriented interaction with objects in a computer-mediated environment are provided. In one embodiment, the system detects a wearable device moved with a displacement transversal to a longitudinal axis of the wearable device. If the system determines that the displacement is within a displacement range associated with an actionable item, the system may select the actionable item or activate an operation associated with the actionable item, such as modifying an object in the computer-mediated reality environment.
Type: Grant
Filed: March 6, 2020
Date of Patent: September 14, 2021
Assignee: Microsoft Technology Licensing, LLC
Inventors: Michel Pahud, Nathalie Riche, Eyal Ofek, Christophe Hurter
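The displacement-range check this abstract describes can be sketched as follows; the range values and item names are illustrative assumptions, not taken from the patent.

```python
# Illustrative sketch (assumed ranges and names): select an actionable item
# when a wearable device's transversal displacement falls within the
# displacement range associated with that item.

# Displacement ranges in millimeters, mapped to actionable items.
ACTION_RANGES = {
    (5.0, 15.0): "select_object",
    (15.0, 30.0): "scale_object",
}

def item_for_displacement(displacement_mm: float):
    """Return the actionable item whose range contains the displacement,
    or None when the displacement matches no configured range."""
    for (low, high), item in ACTION_RANGES.items():
        if low <= displacement_mm < high:
            return item
    return None
```

Displacements below the smallest range act as a dead zone, so incidental wrist motion does not trigger an action.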
-
Patent number: 11068111
Abstract: Systems and methods for enabling user-interactions with virtual objects (VOs) included in immersive environments (IEs) are provided. A head-mounted display (HMD) device is communicatively coupled with a hover-sensing (HS) device, via a communication session. The HMD device provides an IE to a wearer by displaying a field-of-view (FOV) that includes a VO. The user executes user-interactions, such as 2D and/or 3D hand gestures, fingertip gestures, multi-fingertip gestures, stylus gestures, hover gestures, and the like. The HS device detects the user-interactions and generates interaction data. The interaction data is provided to the HMD device via the communication session. The HMD device updates the FOV and/or the VO based on the interaction data. A physical overlay that includes a 3D protrusion is coupled with the HS device. The overlay is transparent to the hover-sensing capabilities of the HS device. The protrusion provides tactile feedback to the user for the user-interactions.
Type: Grant
Filed: November 26, 2019
Date of Patent: July 20, 2021
Assignee: Microsoft Technology Licensing, LLC
Inventors: Michel Pahud, Nathalie Riche, Eyal Ofek, Christophe Hurter, Sasa Junuzovic
-
Patent number: 11054894
Abstract: In various embodiments, methods and systems for implementing an integrated mixed-input system are provided. The integrated mixed-input system includes paired mixed-input devices for interacting with and controlling virtual space input interfaces using real inputs and virtual inputs, sensors, and passive and active haptic feedback associated with the paired mixed-input devices. Real device space tracker data and virtual device space tracker data are accessed via the paired mixed-input devices to determine real input and virtual input, which are processed to determine virtual space input. The real device space tracker data and virtual device space tracker data are also used to generate different interaction contexts. In one embodiment, the integrated mixed-input system supports interface deviation, where a physical mixed-input device interface is a different size from the virtual space input interface. The virtual space input is communicated to control the virtual space input interface.
Type: Grant
Filed: June 30, 2017
Date of Patent: July 6, 2021
Assignee: Microsoft Technology Licensing, LLC
Inventors: Michel Pahud, Eyal Ofek
-
Patent number: 11023109
Abstract: In various embodiments, methods and systems for implementing a multi-device mixed interactivity system are provided. The interactivity system includes paired mixed-input devices for interacting with and controlling virtual objects. In operation, a selection profile associated with a virtual object is accessed. The selection profile is generated based on a selection input determined using real input associated with a selection device and virtual input associated with a mixed-reality device. The selection device has a first display and the mixed-reality device has a second display; both display the virtual object. An annotation input for the virtual object, based on a selected portion corresponding to the selection profile, is received. An annotation profile based on the annotation input is generated. The annotation profile includes annotation profile attributes for annotating a portion of the virtual object. An annotation of the selected portion of the virtual object is caused to be displayed.
Type: Grant
Filed: June 30, 2017
Date of Patent: June 1, 2021
Assignee: Microsoft Technology Licensing, LLC
Inventors: Michel Pahud, Nathalie Riche, Eyal Ofek, Christophe Hurter, Steven Mark Drucker
-
Patent number: 10983694
Abstract: An apparatus includes a keyboard engine that operates a keyboard that accepts shape-writing input and radial entry input. A keyboard input module obtains input data from at least one input sensor of the keyboard. An intention disambiguation engine enables simultaneous use of the shape-writing input and the radial entry input acceptance for a user of the keyboard.
Type: Grant
Filed: February 23, 2018
Date of Patent: April 20, 2021
Assignee: Microsoft Technology Licensing, LLC
Inventors: William A. S. Buxton, Richard L. Hughes, Kenneth P. Hinckley, Michel Pahud, Irina Spiridonova
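One plausible way an intention disambiguation engine could separate shape-writing strokes from radial entry strokes is by examining a stroke's path length and key coverage. This sketch is purely illustrative; the features and thresholds are assumptions, not the patented method.

```python
# Illustrative sketch (assumed thresholds): classify a keyboard stroke as a
# radial entry (short, directional flick near one key) or a shape-writing
# gesture (long path crossing several keys).

def classify_stroke(path, keys_touched) -> str:
    """path: list of (x, y) points; keys_touched: set of key labels hit."""
    length = sum(
        ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
        for (x1, y1), (x2, y2) in zip(path, path[1:])
    )
    # A short stroke confined to a single key reads as a radial entry;
    # a longer stroke spanning multiple keys reads as shape-writing.
    if len(keys_touched) <= 1 and length < 40:
        return "radial_entry"
    return "shape_writing"
```

In practice such an engine would likely defer the decision until enough of the stroke has been observed, so both input modes remain simultaneously available, as the abstract describes.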
-
Patent number: 10976819
Abstract: In some examples, a system senses the location of a stylus or finger of a user relative to a virtual displayed object to determine when a virtual touch occurs. Upon or after such a determination, the system may generate a haptic action that indicates to the user that the virtual touch occurred. The haptic action may be located on a particular portion of a haptic device, which may be a handheld device. The particular portion may correspond to the location where the virtual displayed object was virtually touched. In this way, the user may receive physical feedback associated with the virtual touch of the virtual displayed object. In some examples, the virtual displayed object may change in response to the virtual touch, thus further providing visual and physical feedback associated with the virtual touch of the virtual displayed object to the user.
Type: Grant
Filed: December 28, 2015
Date of Patent: April 13, 2021
Assignee: Microsoft Technology Licensing, LLC
Inventors: Michel Pahud, Johnson T. Apacible, Sasa Junuzovic, David W. Brown
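The two steps the abstract describes, detecting the virtual touch and localizing the haptic action, can be sketched as below. The geometry and region names are assumptions for illustration, not taken from the patent.

```python
# Illustrative sketch (assumed geometry): detect a "virtual touch" when a
# tracked fingertip or stylus tip enters a virtual object's bounding box,
# then choose which region of a handheld haptic device should actuate,
# based on where the object was touched.

def virtual_touch(tip_pos, obj_min, obj_max) -> bool:
    """True when the tracked tip lies inside the object's 3D bounds."""
    return all(lo <= p <= hi for p, lo, hi in zip(tip_pos, obj_min, obj_max))

def haptic_region(tip_pos, obj_min, obj_max) -> str:
    """Map the horizontal touch position to a region of the haptic device,
    so the felt vibration corresponds to where the object was touched."""
    frac = (tip_pos[0] - obj_min[0]) / (obj_max[0] - obj_min[0])
    if frac < 1 / 3:
        return "left"
    elif frac < 2 / 3:
        return "center"
    return "right"
```

A full system would run this check every tracking frame and fire the actuator in the returned region only on the frame where `virtual_touch` first becomes true.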
-
Patent number: 10895966
Abstract: In various embodiments, methods and systems for implementing a multi-device mixed interactivity system are provided. The interactivity system includes paired mixed-input devices for interacting with and controlling virtual objects. In operation, a selection input associated with a virtual object is accessed. The selection input is based on real input associated with a selection device and virtual input associated with a mixed-reality device. The selection device has a first display that displays the virtual object and the mixed-reality device has a second display that displays the virtual object. A selection profile is generated based on the selection input. The selection profile comprises one or more selection profile attributes for isolating a portion of the virtual object. A selected portion of the virtual object is determined based on the selection profile. The selected portion of the virtual object is caused to be displayed on the first display of the selection device.
Type: Grant
Filed: June 30, 2017
Date of Patent: January 19, 2021
Assignee: Microsoft Technology Licensing, LLC
Inventors: Michel Pahud, Nathalie Riche, Eyal Ofek, Christophe Hurter, Steven Mark Drucker
-
Patent number: 10732759
Abstract: Various technologies described herein pertain to utilizing sensed pre-touch interaction to control a mobile computing device. A pre-touch interaction of a user with the mobile computing device is detected. The pre-touch interaction includes a grip of the user on the mobile computing device and/or a hover of one or more fingers of the user with respect to a touchscreen of the mobile computing device. The finger(s) of the user can be within proximity of, but not touching, the touchscreen as part of the hover. Parameter(s) of the pre-touch interaction of the user with the mobile computing device are identified, and a touch of the user on the touchscreen of the mobile computing device is detected. A computing operation is executed responsive to the touch, where the computing operation is based on the touch and the parameter(s) of the pre-touch interaction of the user with the mobile computing device.
Type: Grant
Filed: June 30, 2016
Date of Patent: August 4, 2020
Assignee: Microsoft Technology Licensing, LLC
Inventors: Kenneth P. Hinckley, Michel Pahud, Hrvoje Benko, William Arthur Stewart Buxton, Seongkook Heo
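The idea of letting pre-touch parameters modulate what a subsequent touch does can be sketched as follows. The parameter names and the example policy are assumptions for illustration, not the patented behavior.

```python
# Illustrative sketch (assumed parameter names): use sensed pre-touch state
# (grip plus hover) to decide which operation a touch executes.

from dataclasses import dataclass

@dataclass
class PreTouch:
    grip: str             # e.g. "one_handed" or "two_handed"
    hovering_fingers: int # fingers hovering near the touchscreen

def on_touch(pre: PreTouch, target: str) -> str:
    """Return the operation to execute for a touch on `target`,
    conditioned on the pre-touch parameters captured beforehand."""
    # Example policy: a one-handed grip with a single hovering finger
    # suggests thumb use, so surface a reachable menu instead of the
    # default tap action.
    if pre.grip == "one_handed" and pre.hovering_fingers == 1:
        return f"open_thumb_menu:{target}"
    return f"default_tap:{target}"
```

The key point the abstract makes is that the pre-touch parameters are captured before contact, so the same physical tap can resolve to different operations.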
-
Publication number: 20200209978
Abstract: In various embodiments, computerized systems and methods for displacement oriented interaction with objects in a computer-mediated environment are provided. In one embodiment, the system detects a wearable device moved with a displacement transversal to a longitudinal axis of the wearable device. If the system determines that the displacement is within a displacement range associated with an actionable item, the system may select the actionable item or activate an operation associated with the actionable item, such as modifying an object in the computer-mediated reality environment.
Type: Application
Filed: March 6, 2020
Publication date: July 2, 2020
Inventors: Michel Pahud, Nathalie Riche, Eyal Ofek, Christophe Hurter
-
Patent number: 10699491
Abstract: Systems and techniques for displaying virtual representations of real-world spaces and objects in various environments are disclosed. A source environment at a first location can be scanned by a head-mounted display (HMD) device to generate three-dimensional datasets corresponding to the physical environment at the first location. The three-dimensional datasets can include detected physical properties associated with the physical environment. At a second location, the HMD can re-create the source environment and render for display a virtual representation of the physical environment based on the three-dimensional datasets, where the virtual representation of the source environment is rendered to maintain any one of the detected physical properties associated with the physical environment.
Type: Grant
Filed: April 17, 2019
Date of Patent: June 30, 2020
Assignee: Microsoft Technology Licensing, LLC
Inventors: Michel Pahud, Nathalie Riche, Eyal Ofek, Christophe Hurter
-
Patent number: 10684758
Abstract: The unified system for bimanual interactions provides a lightweight and integrated interface that allows the user to efficiently interact with and manipulate content in the user interface. The system is configured to detect a multi-finger interaction on the touchscreen and to differentiate whether the user intends to pan, zoom or frame a portion of the user interface. Generally, the framing interaction is identified by detection of the user's thumb and forefinger on the touchscreen, which cooperate to define a focus area between vectors extending outwards from the user's thumb and forefinger. Upon a determination that the user intends to interact with or manipulate content within the focus area, the unified system for bimanual interactions provides an indication of the objects that are located within the focus area and contextual menus for interacting with the objects.
Type: Grant
Filed: February 20, 2017
Date of Patent: June 16, 2020
Assignee: Microsoft Technology Licensing, LLC
Inventors: Kenneth P. Hinckley, Michel Pahud, William Arthur Stewart Buxton, Haijun Xia
-
Patent number: 10635291
Abstract: Thumb+pen inputs are described herein, to improve the functionality of touch enabled devices for accepting bimanual input in situations where the device is gripped or supported by one of the user's hands, leaving only one hand free. The thumb of an engaging hand is identified and controls are placed within its range of motion to enhance the functionality provided by the free hand. The actions of the thumb can be used to determine how pen actions made using the other hand are interpreted. Alternatively, the pen can indicate an object through pointing, while the thumb indirectly manipulates one or more of its parameters through touch controls. Marking menus, spring-loaded modes, indirect input, and conventional multi-touch interfaces are applied with respect to the bimanual input mode in which one hand is positioned to hold or support the device, and the other hand is free to improve device operability and accessibility.
Type: Grant
Filed: February 20, 2017
Date of Patent: April 28, 2020
Assignee: Microsoft Technology Licensing, LLC
Inventors: Kenneth P. Hinckley, Michel Pahud, William Arthur Stewart Buxton, Ken Pfeuffer
-
Patent number: 10620710
Abstract: In various embodiments, computerized systems and methods for displacement oriented interaction with objects in a computer-mediated environment are provided. In one embodiment, the system detects a wearable device moved with a displacement transversal to a longitudinal axis of the wearable device. If the system determines that the displacement is within a displacement range associated with an actionable item, the system may select the actionable item or activate an operation associated with the actionable item, such as modifying an object in the computer-mediated reality environment.
Type: Grant
Filed: June 15, 2017
Date of Patent: April 14, 2020
Assignee: Microsoft Technology Licensing, LLC
Inventors: Michel Pahud, Nathalie Riche, Eyal Ofek, Christophe Hurter
-
Publication number: 20200097119
Abstract: Systems and methods for enabling user-interactions with virtual objects (VOs) included in immersive environments (IEs) are provided. A head-mounted display (HMD) device is communicatively coupled with a hover-sensing (HS) device, via a communication session. The HMD device provides an IE to a wearer by displaying a field-of-view (FOV) that includes a VO. The user executes user-interactions, such as 2D and/or 3D hand gestures, fingertip gestures, multi-fingertip gestures, stylus gestures, hover gestures, and the like. The HS device detects the user-interactions and generates interaction data. The interaction data is provided to the HMD device via the communication session. The HMD device updates the FOV and/or the VO based on the interaction data. A physical overlay that includes a 3D protrusion is coupled with the HS device. The overlay is transparent to the hover-sensing capabilities of the HS device. The protrusion provides tactile feedback to the user for the user-interactions.
Type: Application
Filed: November 26, 2019
Publication date: March 26, 2020
Inventors: Michel Pahud, Nathalie Riche, Eyal Ofek, Christophe Hurter, Sasa Junuzovic
-
Patent number: 10579216
Abstract: In general, the multi-touch detection implementations described herein use touch detection technologies to provide new and advantageous interactions between users and touch displays using touch, pens and user-wearable devices (UWDs). These new and advantageous interactions include user-associated mobile menus, combined click-through and radial marking menus, menus to automate and improve drawing or manipulation of content on a display, new menus and methods of selecting objects and text on a display, and new interactions with UWDs and touchscreen displays by using UWDs with gestures. In addition, targeted haptic feedback to the UWD of specific users of a display is enabled. In some multi-touch detection implementations, menus or tools available to act on objects on a display can be ported entirely, or in part, between displays, such as between small and large displays.
Type: Grant
Filed: June 29, 2016
Date of Patent: March 3, 2020
Assignee: Microsoft Technology Licensing, LLC
Inventors: Michel Pahud, Kenneth P. Hinckley, William Buxton