Patents by Inventor Michel Pahud

Michel Pahud has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20230421722
    Abstract: Aspects of the present disclosure relate to headset virtual presence techniques. For example, a participant of a communication session may not have an associated video feed, for example as a result of a user preference to disable video communication or a lack of appropriate hardware. Accordingly, a virtual presence may be generated for such a non-video participant, such that the non-video participant may be represented within the communication session similar to video participants. The virtual presence may be controllable using a headset device, for example such that movements identified by the headset device cause the virtual presence to move. In some instances, user input may be received to control emotions conveyed by the virtual presence, for example specifying an emotion type and/or intensity.
    Type: Application
    Filed: September 7, 2023
    Publication date: December 28, 2023
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. HINCKLEY, Michel PAHUD, Mar Gonzalez FRANCO, Edward Sean Lloyd RINTEL, Eyal OFEK, Jaron Zepel LANIER, Molly Jane NICHOLAS, Payod PANDA
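The emotion-controlled virtual presence this abstract describes can be sketched as a small state object. Everything below (the class name, fields, clamping ranges, and the `presence_for` helper) is an illustrative assumption; the patent does not disclose an implementation:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class VirtualPresence:
    """Stand-in for a non-video participant's virtual presence (hypothetical model)."""
    participant_id: str
    yaw: float = 0.0        # head orientation in degrees, driven by the headset
    pitch: float = 0.0
    emotion: str = "neutral"
    intensity: float = 0.0  # conveyed emotion intensity, clamped to 0..1

    def apply_headset_motion(self, d_yaw: float, d_pitch: float) -> None:
        # Movements identified by the headset device move the virtual presence.
        self.yaw = (self.yaw + d_yaw) % 360.0
        self.pitch = max(-90.0, min(90.0, self.pitch + d_pitch))

    def set_emotion(self, emotion: str, intensity: float) -> None:
        # User input specifying an emotion type and/or intensity.
        self.emotion = emotion
        self.intensity = max(0.0, min(1.0, intensity))

def presence_for(participant_id: str, has_video: bool) -> Optional[VirtualPresence]:
    # A virtual presence is generated only for non-video participants.
    return None if has_video else VirtualPresence(participant_id)
```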
  • Patent number: 11841998
    Abstract: Methods and systems are provided that are directed to automatically adjusting a user interface based on the tilt position of a digital drawing board. The digital drawing board has a tiltable screen with a sensor. The tiltable screen may be fixed in a stable tilt position. The sensor is used to determine that the digital drawing board has a first tilt position. The digital drawing board displays a first user interface associated with the first tilt position. The first user interface may be associated with a first use mode, and may also be based on an application running on the digital drawing board. When the sensor senses that the digital drawing board has moved from the first tilt position to a second tilt position, it automatically displays a second user interface associated with the second tilt position. The second user interface may be associated with a second use mode.
    Type: Grant
    Filed: July 20, 2022
    Date of Patent: December 12, 2023
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kenneth Paul Hinckley, Hugo Karl Denis Romat, Christopher Mervin Collins, Nathalie Riche, Michel Pahud, Adam Samuel Riddle, William Arthur Stewart Buxton
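The tilt-to-interface switch in this abstract might read as follows in code; the angle thresholds, mode names, and class shape are invented for illustration:

```python
from typing import Optional

def ui_for_tilt(angle_deg: float, app: str = "sketch") -> str:
    """Choose a user interface for a sensed tilt position (illustrative thresholds)."""
    if angle_deg < 15.0:
        mode = "drafting"       # near-horizontal: pen-centric drawing UI
    elif angle_deg < 60.0:
        mode = "annotation"
    else:
        mode = "presentation"   # near-vertical: viewing-oriented UI
    return f"{app}:{mode}"

class DrawingBoard:
    """Switches its UI automatically when the tilt sensor reports a new position."""
    def __init__(self, app: str = "sketch"):
        self.app = app
        self.ui: Optional[str] = None

    def on_tilt_sensor(self, angle_deg: float) -> str:
        new_ui = ui_for_tilt(angle_deg, self.app)
        if new_ui != self.ui:   # only switch when the tilt crosses a mode boundary
            self.ui = new_ui
        return self.ui
```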
  • Publication number: 20230350628
    Abstract: Users can intuitively and non-obstructively obtain awareness of fellow collaborator views and/or statuses in an augmented reality environment. An interaction is detected between a first and second HMD. The detected interaction can cause the first HMD to request state information associated with the second HMD. The second HMD can process the request to generate a set of visual data as a response to the request. The second HMD can communicate the set of visual data to the first HMD, so that the first HMD can render the received set of visual data to be displayed concurrently with its own augmented view. Additional computer-generated object(s) can be positioned in accordance with a real-world or virtualized position of the second HMD, such that the now-visible state information associated with the second HMD does not obstruct a view of the first or second HMD's user.
    Type: Application
    Filed: July 12, 2023
    Publication date: November 2, 2023
    Inventors: Michel PAHUD, Nathalie RICHE, Eyal OFEK, Christophe HURTER
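The request/response exchange between the two HMDs can be sketched minimally. The field names and the overlay offset are invented, and the privacy filtering shown here is stated explicitly only in the granted sibling patent (11782669, below):

```python
from typing import Dict, Set, Tuple

def handle_state_request(own_state: Dict[str, str], allowed: Set[str]) -> Dict[str, str]:
    """Second HMD builds the visual-data response, exposing only permitted fields."""
    return {key: value for key, value in own_state.items() if key in allowed}

def overlay_anchor(peer_position: Tuple[float, float, float],
                   offset: Tuple[float, float, float] = (0.0, 0.35, 0.0)):
    """Anchor the rendered state slightly above the peer HMD's real-world position
    so the added objects do not obstruct either user's view (offset is illustrative)."""
    return tuple(p + o for p, o in zip(peer_position, offset))
```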
  • Patent number: 11792364
    Abstract: Aspects of the present disclosure relate to headset virtual presence techniques. For example, a participant of a communication session may not have an associated video feed, for example as a result of a user preference to disable video communication or a lack of appropriate hardware. Accordingly, a virtual presence may be generated for such a non-video participant, such that the non-video participant may be represented within the communication session similar to video participants. The virtual presence may be controllable using a headset device, for example such that movements identified by the headset device cause the virtual presence to move. In some instances, user input may be received to control emotions conveyed by the virtual presence, for example specifying an emotion type and/or intensity.
    Type: Grant
    Filed: May 28, 2021
    Date of Patent: October 17, 2023
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. Hinckley, Michel Pahud, Mar Gonzalez Franco, Edward Sean Lloyd Rintel, Eyal Ofek, Jaron Zepel Lanier, Molly Jane Nicholas, Payod Panda
  • Patent number: 11782669
    Abstract: Various systems and techniques for intuitively and non-obstructively obtaining awareness of fellow collaborator views and/or statuses in an augmented reality environment are disclosed. An interaction is detected between a first and second HMD, both collaborative participants of a shared dataset represented in an augmented reality environment. The detected interaction can cause the first HMD to request state information associated with the second HMD. The second HMD can process the request in accordance with a privacy policy to generate a set of visual data as a response to the request. The second HMD can communicate the generated set of visual data to the first HMD, such that the first HMD can render the received set of visual data as additional computer-generated object(s) to be displayed concurrently with its own augmented view.
    Type: Grant
    Filed: April 28, 2017
    Date of Patent: October 10, 2023
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Michel Pahud, Nathalie Riche, Eyal Ofek, Christophe Hurter
  • Publication number: 20230308505
    Abstract: Aspects of the present disclosure relate to multi-user, multi-device gaze tracking. In examples, a system includes at least one processor, and memory storing instructions that, when executed by the at least one processor, cause the system to perform a set of operations. The set of operations includes identifying a plurality of computing devices, and identifying one or more users. The set of operations may further include receiving gaze input data and load data from two or more of the plurality of computing devices. The set of operations may further include performing load balancing between the plurality of devices, wherein the load balancing comprises assigning one or more tasks from a first of the plurality of computing devices to a second of the plurality of computing devices based upon the gaze input data.
    Type: Application
    Filed: March 22, 2022
    Publication date: September 28, 2023
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Jatin SHARMA, Kenneth P. HINCKLEY, Jay C. BEAVERS, Michel PAHUD
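Assigning tasks "based upon the gaze input data" admits many policies, and the abstract does not fix one. A plausible sketch, with invented device names, a uniform task cost, and the assumption that gazed-at (interactive) devices should shed load to idle ones:

```python
from typing import Dict, List, Set, Tuple

def rebalance(load: Dict[str, float], tasks: Dict[str, List[str]],
              gazed: Set[str], limit: float = 0.8,
              task_cost: float = 0.2) -> List[Tuple[str, str, str]]:
    """Move tasks off overloaded gazed-at devices onto the least-loaded idle device.
    Returns (task, source, target) tuples; the policy and costs are illustrative."""
    moves = []
    idle = [dev for dev in load if dev not in gazed]
    if not idle:
        return moves  # nowhere to offload
    for dev in gazed:
        while load[dev] > limit and tasks[dev]:
            target = min(idle, key=lambda d: load[d])
            task = tasks[dev].pop()
            tasks[target].append(task)
            load[dev] -= task_cost
            load[target] += task_cost
            moves.append((task, dev, target))
    return moves
```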
  • Publication number: 20230266830
    Abstract: Aspects of the present disclosure relate to semantic user input for a computing device. In examples, user input is identified and processed to identify and automatically perform an associated semantic action. The semantic action may be determined based at least in part on an environmental context associated with the user input. Thus, an action determined for a given user input may change according to the environmental context in which the input was received. For example, an association between user input, an environmental context, and an action may be used to affect the behavior of a computing device as a result of identifying the user input in a scenario that has the environmental context. Such associations may be dynamically determined as a result of user interactions associated with manually provided input, for example to create, update, and/or remove semantic actions associated with a variety of user inputs.
    Type: Application
    Filed: February 22, 2022
    Publication date: August 24, 2023
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Eyal OFEK, Michel PAHUD, Edward Sean Lloyd RINTEL, Mar Gonzalez FRANCO, Payod PANDA
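The dynamically maintained association between user input, environmental context, and action can be sketched as a small registry; the gesture and context names below are invented examples:

```python
class SemanticActions:
    """Maps (user input, environmental context) pairs to actions; associations
    can be created, updated, and removed as user interactions dictate."""
    def __init__(self):
        self._rules = {}

    def learn(self, gesture: str, context: str, action: str) -> None:
        self._rules[(gesture, context)] = action   # create or update

    def forget(self, gesture: str, context: str) -> None:
        self._rules.pop((gesture, context), None)  # remove

    def act(self, gesture: str, context: str):
        # The same input yields different actions in different contexts.
        return self._rules.get((gesture, context))
```

The same `wave` gesture could thus trigger `next_slide` during a presentation but nothing at a desk, matching the abstract's point that the determined action changes with the environmental context.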
  • Publication number: 20230259322
    Abstract: Aspects of the present disclosure relate to computing device headset input. In examples, sensor data from one or more sensors of a headset device are processed to identify implicit and/or explicit user input. A context may be determined for the user input, which may be used to process the identified input and generate an action that affects the behavior of a computing device accordingly. As a result, the headset device is usable to control one or more computing devices. As compared to other wearable devices, headset devices may be more prevalent and may therefore enable more convenient and more intuitive user input beyond merely providing audio output.
    Type: Application
    Filed: April 26, 2023
    Publication date: August 17, 2023
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. HINCKLEY, Michel PAHUD, Mar Gonzalez FRANCO, Edward Sean Lloyd RINTEL, Eyal OFEK, Jaron Zepel LANIER, Molly Jane NICHOLAS, Payod PANDA
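One reading of the pipeline this abstract describes (sensor data → identified implicit/explicit input → context → action) in code; the nod thresholds and the rule table are invented for illustration:

```python
def detect_nod(pitch_samples, dip_deg: float = 15.0, rest_deg: float = 5.0) -> bool:
    """Identify an explicit nod from head-pitch readings: the pitch dips below
    -dip_deg and then returns near level (thresholds are illustrative)."""
    dipped = False
    for pitch in pitch_samples:
        if pitch < -dip_deg:
            dipped = True
        elif dipped and abs(pitch) < rest_deg:
            return True
    return False

def action_for(event: str, context: str) -> str:
    """Combine the identified input with a context to pick a device action."""
    rules = {
        ("nod", "incoming_call"): "answer",
        ("headset_removed", "media_playing"): "pause",  # an implicit input
    }
    return rules.get((event, context), "none")
```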
  • Patent number: 11678006
    Abstract: The description relates to cooperatively controlling devices based upon their location and pose. One example can receive first sensor data associated with a first device and second sensor data associated with a second device. This example can analyze the first sensor data and the second sensor data to determine relative locations and poses of the first device relative to the second device and can supply the locations and poses to enable content to be collectively presented across the devices based upon the relative locations and poses.
    Type: Grant
    Filed: June 17, 2021
    Date of Patent: June 13, 2023
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kenneth Paul Hinckley, Michel Pahud, Nathalie Maryline Riche, William Arthur Stewart Buxton, Nicolai Marquardt
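The relative location-and-pose computation can be sketched in two dimensions; the content-spanning heuristic and its thresholds are assumptions, not the patent's method:

```python
import math
from typing import Tuple

Pose = Tuple[float, float, float]  # (x, y, heading in degrees)

def relative_pose(a: Pose, b: Pose) -> Pose:
    """Express device b's location and pose in device a's frame of reference."""
    dx, dy = b[0] - a[0], b[1] - a[1]
    t = math.radians(-a[2])
    rx = dx * math.cos(t) - dy * math.sin(t)
    ry = dx * math.sin(t) + dy * math.cos(t)
    return rx, ry, (b[2] - a[2]) % 360.0

def can_span(rel: Pose, max_gap: float = 0.6, align_deg: float = 10.0) -> bool:
    """Collectively present content across both screens when device b sits just
    beside device a, roughly level and facing the same way (illustrative limits)."""
    rx, ry, rth = rel
    return abs(ry) < 0.1 and 0.0 < rx < max_gap and min(rth, 360.0 - rth) < align_deg
```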
  • Patent number: 11669294
    Abstract: Aspects of the present disclosure relate to computing device headset input. In examples, sensor data from one or more sensors of a headset device are processed to identify implicit and/or explicit user input. A context may be determined for the user input, which may be used to process the identified input and generate an action that affects the behavior of a computing device accordingly. As a result, the headset device is usable to control one or more computing devices. As compared to other wearable devices, headset devices may be more prevalent and may therefore enable more convenient and more intuitive user input beyond merely providing audio output.
    Type: Grant
    Filed: May 28, 2021
    Date of Patent: June 6, 2023
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. Hinckley, Michel Pahud, Mar Gonzalez Franco, Edward Sean Lloyd Rintel, Eyal Ofek, Jaron Zepel Lanier, Molly Jane Nicholas, Payod Panda
  • Publication number: 20230094527
    Abstract: Methods and systems are disclosed for sharing a content item from a secondary computing device to a primary computing device based on a tilt position of the secondary computing device. A sensor on the secondary computing device determines that the secondary computing device has a first tilt position that is associated with a non-sharing mode. The sensor senses that the secondary computing device is tilting from the first tilt position towards a second tilt position that is associated with a full sharing mode. In response, the content item is shared with the primary computing device. An amount of the content item that is shared continues to be increased as long as the sensor continues to sense that the secondary computing device is tilting. Once the secondary computing device has reached the second tilt position, the entire content item is shared with the primary computing device.
    Type: Application
    Filed: December 7, 2022
    Publication date: March 30, 2023
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. HINCKLEY, Michel PAHUD, Nathalie M. RICHE, Molly NICHOLAS, Chunjong PARK, Nicolai MARQUARDT
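The progressive sharing behavior maps naturally onto a tilt-angle to share-fraction function; the start and full angles below are illustrative, not taken from the patent:

```python
def share_fraction(angle_deg: float, start_deg: float = 10.0,
                   full_deg: float = 60.0) -> float:
    """Fraction of the content item shared with the primary device, growing as
    the secondary device tilts from the non-sharing position toward full sharing."""
    if angle_deg <= start_deg:
        return 0.0   # non-sharing mode
    if angle_deg >= full_deg:
        return 1.0   # full sharing mode: the entire content item is shared
    return (angle_deg - start_deg) / (full_deg - start_deg)
```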
  • Publication number: 20230046470
    Abstract: Methods and systems are provided that are directed to automatically adjusting a user interface based on the tilt position of a digital drawing board. The digital drawing board has a tiltable screen with a sensor. The tiltable screen may be fixed in a stable tilt position. The sensor is used to determine that the digital drawing board has a first tilt position. The digital drawing board displays a first user interface associated with the first tilt position. The first user interface may be associated with a first use mode, and may also be based on an application running on the digital drawing board. When the sensor senses that the digital drawing board has moved from the first tilt position to a second tilt position, it automatically displays a second user interface associated with the second tilt position. The second user interface may be associated with a second use mode.
    Type: Application
    Filed: July 20, 2022
    Publication date: February 16, 2023
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Kenneth Paul HINCKLEY, Hugo Karl Denis ROMAT, Christopher Mervin COLLINS, Nathalie RICHE, Michel PAHUD, Adam Samuel RIDDLE, William Arthur Stewart BUXTON
  • Patent number: 11550404
    Abstract: Methods and systems are disclosed for sharing a content item from a secondary computing device to a primary computing device based on a tilt position of the secondary computing device. A sensor on the secondary computing device determines that the secondary computing device has a first tilt position that is associated with a non-sharing mode. The sensor senses that the secondary computing device is tilting from the first tilt position towards a second tilt position that is associated with a full sharing mode. In response, the content item begins to be shared with the primary computing device. The amount of the content item that is shared with the primary computing device continues to increase as long as the sensor continues to sense that the secondary computing device is tilting from the first tilt position towards the second tilt position. When the sensor senses that the secondary computing device has reached the second tilt position, the entire content item is shared with the primary computing device.
    Type: Grant
    Filed: May 14, 2021
    Date of Patent: January 10, 2023
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. Hinckley, Michel Pahud, Nathalie M. Riche, Molly Nicholas, Chunjong Park, Nicolai Marquardt
  • Patent number: 11543947
    Abstract: In various embodiments, methods and systems for implementing a multi-device mixed interactivity system are provided. The interactivity system includes paired mixed-input devices for interacting and controlling virtual objects. In operation, a selection profile associated with a virtual object is accessed. The selection profile is generated based on a selection input determined using real input associated with a selection device and virtual input associated with a mixed-reality device. The selection device has a first display and the mixed-reality device has a second display that both display the virtual object. An annotation input for the virtual object based on a selected portion corresponding to the selection profile is received. An annotation profile based on the annotation input is generated. The annotation profile includes annotation profile attributes for annotating a portion of the virtual object. An annotation of the selected portion of the virtual reality object is caused to be displayed.
    Type: Grant
    Filed: June 1, 2021
    Date of Patent: January 3, 2023
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Michel Pahud, Nathalie Riche, Eyal Ofek, Christophe Hurter, Steven Mark Drucker
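The selection and annotation profiles might be modeled as below. Treating the combination of real input (a touch region on the selection device) and virtual input (a gaze region from the mixed-reality device) as a rectangle intersection is one reading of the abstract, and every name here is invented:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

Box = Tuple[int, int, int, int]  # (x0, y0, x1, y1)

@dataclass
class SelectionProfile:
    object_id: str
    region: Box  # selected portion of the virtual object

@dataclass
class AnnotationProfile:
    selection: SelectionProfile
    text: str
    color: str = "yellow"  # an annotation-profile attribute

def make_selection(object_id: str, touch_box: Box,
                   gaze_box: Box) -> Optional[SelectionProfile]:
    """Selection input = intersection of the touch region with the gaze region."""
    x0, y0 = max(touch_box[0], gaze_box[0]), max(touch_box[1], gaze_box[1])
    x1, y1 = min(touch_box[2], gaze_box[2]), min(touch_box[3], gaze_box[3])
    if x0 >= x1 or y0 >= y1:
        return None  # the two inputs do not agree on any portion
    return SelectionProfile(object_id, (x0, y0, x1, y1))

def annotate(selection: SelectionProfile, text: str) -> AnnotationProfile:
    """Generate an annotation profile for the selected portion."""
    return AnnotationProfile(selection, text)
```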
  • Publication number: 20220408142
    Abstract: The description relates to cooperatively controlling devices based upon their location and pose. One example can receive first sensor data associated with a first device and second sensor data associated with a second device. This example can analyze the first sensor data and the second sensor data to determine relative locations and poses of the first device relative to the second device and can supply the locations and poses to enable content to be collectively presented across the devices based upon the relative locations and poses.
    Type: Application
    Filed: June 17, 2021
    Publication date: December 22, 2022
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Kenneth Paul HINCKLEY, Michel PAHUD, Nathalie Maryline RICHE, William Arthur Stewart BUXTON, Nicolai MARQUARDT
  • Publication number: 20220385855
    Abstract: Aspects of the present disclosure relate to headset virtual presence techniques. For example, a participant of a communication session may not have an associated video feed, for example as a result of a user preference to disable video communication or a lack of appropriate hardware. Accordingly, a virtual presence may be generated for such a non-video participant, such that the non-video participant may be represented within the communication session similar to video participants. The virtual presence may be controllable using a headset device, for example such that movements identified by the headset device cause the virtual presence to move. In some instances, user input may be received to control emotions conveyed by the virtual presence, for example specifying an emotion type and/or intensity.
    Type: Application
    Filed: May 28, 2021
    Publication date: December 1, 2022
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. HINCKLEY, Michel PAHUD, Mar Gonzalez FRANCO, Edward Sean Lloyd RINTEL, Eyal OFEK, Jaron Zepel LANIER, Molly Jane NICHOLAS, Payod PANDA
  • Publication number: 20220382506
    Abstract: Aspects of the present disclosure relate to computing device headset input. In examples, sensor data from one or more sensors of a headset device are processed to identify implicit and/or explicit user input. A context may be determined for the user input, which may be used to process the identified input and generate an action that affects the behavior of a computing device accordingly. As a result, the headset device is usable to control one or more computing devices. As compared to other wearable devices, headset devices may be more prevalent and may therefore enable more convenient and more intuitive user input beyond merely providing audio output.
    Type: Application
    Filed: May 28, 2021
    Publication date: December 1, 2022
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. HINCKLEY, Michel PAHUD, Mar Gonzalez FRANCO, Edward Sean Lloyd RINTEL, Eyal OFEK, Jaron Zepel LANIER, Molly Jane NICHOLAS, Payod PANDA
  • Publication number: 20220365606
    Abstract: Methods and systems are disclosed for sharing a content item from a secondary computing device to a primary computing device based on a tilt position of the secondary computing device. A sensor on the secondary computing device determines that the secondary computing device has a first tilt position that is associated with a non-sharing mode. The sensor senses that the secondary computing device is tilting from the first tilt position towards a second tilt position that is associated with a full sharing mode. In response, the content item begins to be shared with the primary computing device. The amount of the content item that is shared with the primary computing device continues to increase as long as the sensor continues to sense that the secondary computing device is tilting from the first tilt position towards the second tilt position. When the sensor senses that the secondary computing device has reached the second tilt position, the entire content item is shared with the primary computing device.
    Type: Application
    Filed: May 14, 2021
    Publication date: November 17, 2022
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. HINCKLEY, Michel PAHUD, Nathalie M. RICHE, Molly NICHOLAS, Chunjong PARK, Nicolai MARQUARDT
  • Patent number: 11429203
    Abstract: Methods and systems are provided that are directed to automatically adjusting a user interface based on the tilt position of a digital drawing board. The digital drawing board has a tiltable screen with a sensor. The tiltable screen may be fixed in a stable tilt position. The sensor is used to determine that the digital drawing board has a first tilt position. The digital drawing board displays a first user interface associated with the first tilt position. The first user interface may be associated with a first use mode. The first user interface may also be based on an application running on the digital drawing board. When the sensor senses that the digital drawing board has moved from the first tilt position to a second tilt position, it automatically displays a second user interface associated with the second tilt position. The second user interface has different functionality than the first user interface.
    Type: Grant
    Filed: June 19, 2020
    Date of Patent: August 30, 2022
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kenneth Paul Hinckley, Hugo Karl Denis Romat, Christopher Mervin Collins, Nathalie Riche, Michel Pahud, Adam Samuel Riddle, William Arthur Stewart Buxton
  • Publication number: 20220129060
    Abstract: In some examples, a surface, such as a desktop, in front of or around a portable electronic device may be used as a relatively large surface for interacting with the portable electronic device, which typically has a small display screen. A user may write or draw on the surface using any object such as a finger, pen, or stylus. The surface may also be used to simulate a partial or full-size keyboard. The use of a camera to sense the three-dimensional (3D) location or motion of the object may enable use of above-the-surface gestures, entry of directionality, and capture of real objects into a document being processed or stored by the electronic device. One or more objects may be used to manipulate elements displayed by the portable electronic device.
    Type: Application
    Filed: November 4, 2021
    Publication date: April 28, 2022
    Inventors: Eyal Ofek, Michel Pahud, Pourang P. Irani
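The distinction between writing on the surface and above-the-surface gestures reduces, in the simplest reading, to a height threshold on the camera's 3D points; the 10 mm cutoff below is an invented figure:

```python
def classify_points(points, hover_mm: float = 10.0):
    """Split camera-tracked 3-D samples (x, y, height-above-surface in mm) into
    on-surface stroke points and above-the-surface gesture points."""
    strokes, gestures = [], []
    for point in points:
        (strokes if point[2] <= hover_mm else gestures).append(point)
    return strokes, gestures
```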