Patents by Inventor Nathalie Riche

Nathalie Riche has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11841998
    Abstract: Methods and systems are provided that are directed to automatically adjusting a user interface based on tilt position of a digital drawing board. The digital drawing board has a tiltable screen with a sensor. The tiltable screen may be fixed in a stable tilt position. A sensor is used to determine that the digital drawing board has a first tilt position. The digital drawing board displays a first user interface associated with the first tilt position. The first user interface may be associated with a first use mode, and may also be based on an application running on the digital drawing board. When the sensor senses that the digital drawing board has moved from the first tilt position to a second tilt position, it automatically displays a second user interface associated with the second tilt position. The second user interface may be associated with a second use mode.
    Type: Grant
    Filed: July 20, 2022
    Date of Patent: December 12, 2023
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kenneth Paul Hinckley, Hugo Karl Denis Romat, Christopher Mervin Collins, Nathalie Riche, Michel Pahud, Adam Samuel Riddle, William Arthur Stewart Buxton
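
The abstract above describes a sensor-to-interface mapping: a tilt reading is classified into a position range, and the board swaps interfaces only when the sensed tilt crosses into a different range. The sketch below illustrates that flow under stated assumptions; the tilt thresholds, mode names, and the TiltAwareBoard class are hypothetical stand-ins, not the patented implementation.

```python
from dataclasses import dataclass

# Hypothetical tilt ranges (degrees from horizontal) mapped to use modes.
# The actual ranges and modes are not specified in the abstract.
TILT_MODES = [
    (0, 15, "drafting-table UI"),    # nearly flat: pen-centric drawing tools
    (15, 60, "easel UI"),            # mid tilt: mixed pen/touch layout
    (60, 90, "monitor UI"),          # upright: keyboard/mouse-style layout
]


def mode_for_tilt(tilt_degrees: float) -> str:
    """Return the use mode whose tilt range contains the sensed angle."""
    for low, high, mode in TILT_MODES:
        if low <= tilt_degrees < high:
            return mode
    return TILT_MODES[-1][2]


@dataclass
class TiltAwareBoard:
    """Minimal stand-in for a digital drawing board with a tilt sensor."""
    current_mode: str = ""

    def on_tilt_reading(self, tilt_degrees: float) -> None:
        # Only swap the user interface when the sensed tilt falls into a
        # different position range than the one currently displayed.
        new_mode = mode_for_tilt(tilt_degrees)
        if new_mode != self.current_mode:
            self.current_mode = new_mode
            print(f"tilt={tilt_degrees:5.1f} deg -> displaying {new_mode}")


if __name__ == "__main__":
    board = TiltAwareBoard()
    for reading in (5.0, 7.5, 42.0, 44.0, 80.0):
        board.on_tilt_reading(reading)
```
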
  • Publication number: 20230350628
    Abstract: Users can intuitively and non-obstructively obtain awareness of fellow collaborator views and/or statuses in an augmented reality environment. An interaction is detected between a first and second HMD. The detected interaction can cause the first HMD to request state information associated with the second HMD. The second HMD can process the request to generate a set of visual data as a response to the request. The second HMD can communicate the set of visual data to the first HMD, so that the first HMD can render the received set of visual data to be displayed concurrently with its own augmented view. Additional computer-generated object(s) can be positioned in accordance with a real-world or virtualized position of the second HMD, such that the now-visible state information associated with the second HMD does not obstruct a view of the first or second HMD's user.
    Type: Application
    Filed: July 12, 2023
    Publication date: November 2, 2023
    Inventors: Michel PAHUD, Nathalie RICHE, Eyal OFEK, Christophe HURTER
  • Patent number: 11782669
    Abstract: Various systems and techniques for intuitively and non-obstructively obtaining awareness of fellow collaborator views and/or statuses in an augmented reality environment are disclosed. An interaction is detected between a first and second HMD, both collaborative participants of a shared dataset represented in an augmented reality environment. The detected interaction can cause the first HMD to request state information associated with the second HMD. The second HMD can process the request in accordance with a privacy policy to generate a set of visual data as a response to the request. The second HMD can communicate the generated set of visual data to the first HMD, such that the first HMD can render the received set of visual data as additional computer-generated object(s) to be displayed concurrently with its own augmented view.
    Type: Grant
    Filed: April 28, 2017
    Date of Patent: October 10, 2023
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Michel Pahud, Nathalie Riche, Eyal Ofek, Christophe Hurter
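
The two entries above (the published application and the granted patent) describe the same request/response flow between headsets: a detected interaction triggers a state request, the responding headset filters its state through a privacy policy, and the requester renders the returned visual data without obstructing either user's view. The following sketch is a loose illustration of that flow; the Headset class, its fields, and the set-based stand-in for a privacy policy are assumptions made for the example only.

```python
# One headset asks a collaborator's headset for state information, the
# responder filters it through a privacy policy, and the requester renders
# what comes back as an overlay anchored near the other headset.

class Headset:
    def __init__(self, user: str, view_state: dict, shareable_fields: set):
        self.user = user
        self.view_state = view_state              # e.g. current filters, viewpoint
        self.shareable_fields = shareable_fields  # stand-in for a privacy policy
        self.overlays: list = []

    def request_state(self, other: "Headset") -> None:
        # Triggered when an interaction (e.g. a glance or gesture) is detected
        # between the two headsets.
        visual_data = other.respond_to_state_request()
        # Render the collaborator's state as extra objects positioned near the
        # other headset so it does not obstruct either user's view.
        self.overlays.append({"from": other.user, "data": visual_data})

    def respond_to_state_request(self) -> dict:
        # Apply the privacy policy: only share fields the policy allows.
        return {k: v for k, v in self.view_state.items()
                if k in self.shareable_fields}


if __name__ == "__main__":
    alice = Headset("alice", {"filter": "2023 sales", "viewpoint": "north"},
                    shareable_fields={"filter"})
    bob = Headset("bob", {"filter": "2022 sales", "viewpoint": "south"},
                  shareable_fields={"filter", "viewpoint"})
    alice.request_state(bob)   # alice sees bob's filter and viewpoint
    bob.request_state(alice)   # bob sees only alice's filter
    print(alice.overlays)
    print(bob.overlays)
```
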
  • Publication number: 20230046470
    Abstract: Methods and systems are provided that are directed to automatically adjusting a user interface based on tilt position of a digital drawing board. The digital drawing board has a tiltable screen with a sensor. The tiltable screen may be fixed in a stable tilt position. A sensor is used to determine that the digital drawing board has a first tilt position. The digital drawing board displays a first user interface associated with the first tilt position. The first user interface may be associated with a first use mode, and may also be based on an application running on the digital drawing board. When the sensor senses that the digital drawing board has moved from the first tilt position to a second tilt position, it automatically displays a second user interface associated with the second tilt position. The second user interface may be associated with a second use mode.
    Type: Application
    Filed: July 20, 2022
    Publication date: February 16, 2023
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Kenneth Paul HINCKLEY, Hugo Karl Denis ROMAT, Christopher Mervin COLLINS, Nathalie RICHE, Michel PAHUD, Adam Samuel RIDDLE, William Arthur Stewart BUXTON
  • Patent number: 11543947
    Abstract: In various embodiments, methods and systems for implementing a multi-device mixed interactivity system are provided. The interactivity system includes paired mixed-input devices for interacting and controlling virtual objects. In operation, a selection profile associated with a virtual object is accessed. The selection profile is generated based on a selection input determined using real input associated with a selection device and virtual input associated with a mixed-reality device. The selection device has a first display and the mixed-reality device has a second display that both display the virtual object. An annotation input for the virtual object based on a selected portion corresponding to the selection profile is received. An annotation profile based on the annotation input is generated. The annotation profile includes annotation profile attributes for annotating a portion of the virtual object. An annotation of the selected portion of the virtual reality object is caused to be displayed.
    Type: Grant
    Filed: June 1, 2021
    Date of Patent: January 3, 2023
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Michel Pahud, Nathalie Riche, Eyal Ofek, Christophe Hurter, Steven Mark Drucker
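
This abstract describes an annotation flow built on top of a selection profile: an annotation input is received for the selected portion, an annotation profile with its attributes is generated, and the annotation is displayed on both paired devices. A minimal sketch of those data shapes follows; the SelectionProfile and AnnotationProfile classes and their fields are hypothetical, chosen only to show the profile-based structure.

```python
from dataclasses import dataclass, field

# Hypothetical data shapes: a selection profile isolates a portion of a
# virtual object, an annotation profile records how that portion should be
# annotated, and both paired displays can render the result.

@dataclass
class SelectionProfile:
    object_id: str
    selected_region: tuple        # e.g. a bounding box on the virtual object

@dataclass
class AnnotationProfile:
    selection: SelectionProfile
    attributes: dict = field(default_factory=dict)   # e.g. text, color, author

def annotate(selection: SelectionProfile, annotation_input: dict) -> AnnotationProfile:
    """Build an annotation profile for the portion isolated by the selection."""
    return AnnotationProfile(selection=selection, attributes=annotation_input)

def display(profile: AnnotationProfile, display_name: str) -> None:
    # Both paired devices display the same virtual object, so the same
    # annotation profile can be rendered on either display.
    print(f"[{display_name}] annotate {profile.selection.object_id} "
          f"region {profile.selection.selected_region}: {profile.attributes}")

if __name__ == "__main__":
    sel = SelectionProfile(object_id="engine-model",
                           selected_region=(0.2, 0.4, 0.3, 0.5))
    ann = annotate(sel, {"text": "check tolerance here", "color": "red"})
    display(ann, "selection device")       # first display
    display(ann, "mixed-reality device")   # second display
```
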
  • Patent number: 11429203
    Abstract: Methods and systems are provided that are directed to automatically adjusting a user interface based on tilt position of a digital drawing board. The digital drawing board has a tiltable screen with a sensor. The tiltable screen may be fixed in a stable tilt position. A sensor is used to determine that the digital drawing board has a first tilt position. The digital drawing board displays a first user interface associated with the first tilt position. The first user interface may be associated with a first use mode. The first user interface may also be based on an application running on the digital drawing board. When the sensor senses that the digital drawing board has moved from the first tilt position to a second tilt position, it automatically displays a second user interface associated with the second tilt position. The second user interface has different functionality than the first user interface.
    Type: Grant
    Filed: June 19, 2020
    Date of Patent: August 30, 2022
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kenneth Paul Hinckley, Hugo Karl Denis Romat, Christopher Mervin Collins, Nathalie Riche, Michel Pahud, Adam Samuel Riddle, William Arthur Stewart Buxton
  • Publication number: 20220011924
    Abstract: In various embodiments, methods and systems for implementing a multi-device mixed interactivity system are provided. The interactivity system includes paired mixed-input devices for interacting and controlling virtual objects. In operation, a selection profile associated with a virtual object is accessed. The selection profile is generated based on a selection input determined using real input associated with a selection device and virtual input associated with a mixed-reality device. The selection device has a first display and the mixed-reality device has a second display that both display the virtual object. An annotation input for the virtual object based on a selected portion corresponding to the selection profile is received. An annotation profile based on the annotation input is generated. The annotation profile includes annotation profile attributes for annotating a portion of the virtual object. An annotation of the selected portion of the virtual reality object is caused to be displayed.
    Type: Application
    Filed: June 1, 2021
    Publication date: January 13, 2022
    Inventors: Michel PAHUD, Nathalie RICHE, Eyal OFEK, Christophe HURTER, Steven Mark DRUCKER
  • Publication number: 20210397274
    Abstract: Methods and systems are provided that are directed to automatically adjusting a user interface based on tilt position of a digital drawing board. The digital drawing board has a tiltable screen with a sensor. The tiltable screen may be fixed in a stable tilt position. A sensor is used to determine that the digital drawing board has a first tilt position. The digital drawing board displays a first user interface associated with the first tilt position. The first user interface may be associated with a first use mode. The first user interface may also be based on an application running on the digital drawing board. When the sensor senses that the digital drawing board has moved from the first tilt position to a second tilt position, it automatically displays a second user interface associated with the second tilt position. The second user interface has different functionality than the first user interface.
    Type: Application
    Filed: June 19, 2020
    Publication date: December 23, 2021
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Kenneth Paul HINCKLEY, Hugo Karl Denis ROMAT, Christopher Mervin COLLINS, Nathalie RICHE, Michel PAHUD, Adam Samuel RIDDLE, William Arthur Stewart BUXTON
  • Patent number: 11119581
    Abstract: In various embodiments, computerized systems and methods for displacement oriented interaction with objects in a computer-mediated environment are provided. In one embodiment, the system detects a wearable device moved with a displacement transversal to a longitudinal axis of the wearable device. If the system determines that the displacement is within a displacement range associated with an actionable item, the system may select the actionable item or activate an operation associated with the actionable item, such as modifying an object in the computer-mediated reality environment.
    Type: Grant
    Filed: March 6, 2020
    Date of Patent: September 14, 2021
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Michel Pahud, Nathalie Riche, Eyal Ofek, Christophe Hurter
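
The displacement check described in this abstract can be sketched as a lookup from a transversal displacement of the wearable to the actionable item whose displacement range contains it. The ranges, item names, and millimetre units below are assumptions made for illustration, not values from the patent.

```python
# A displacement of the wearable transversal to its longitudinal axis
# (e.g. a band pushed sideways along the wrist) is compared against per-item
# displacement ranges; the matching actionable item is selected.

ACTIONABLE_ITEMS = [
    {"name": "move object",   "range_mm": (2.0, 6.0)},
    {"name": "scale object",  "range_mm": (6.0, 12.0)},
    {"name": "delete object", "range_mm": (12.0, 20.0)},
]

def item_for_displacement(displacement_mm: float):
    """Return the actionable item whose range contains the displacement, if any."""
    for item in ACTIONABLE_ITEMS:
        low, high = item["range_mm"]
        if low <= displacement_mm < high:
            return item
    return None   # displacement too small or too large: no selection

if __name__ == "__main__":
    for d in (1.0, 4.5, 9.0, 15.0, 25.0):
        item = item_for_displacement(d)
        action = item["name"] if item else "no action"
        print(f"transversal displacement {d:4.1f} mm -> {action}")
```
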
  • Patent number: 11068111
    Abstract: Systems and methods for enabling user-interactions with virtual objects (VOs) included in immersive environments (IEs) are provided. A head-mounted display (HMD) device is communicatively coupled with a hover-sensing (HS) device, via a communication session. The HMD device provides an IE to a wearer by displaying a field-of-view (FOV) that includes a VO. The user executes user-interactions, such as 2D and/or 3D hand gestures, fingertip gestures, multi-fingertip gestures, stylus gestures, hover gestures, and the like. The HS device detects the user-interactions and generates interaction data. The interaction data is provided to the HMD device via the communication session. The HMD device updates the FOV and/or the VO based on the interaction data. A physical overlay that includes a 3D protrusion is coupled with the HS device. The overlay is transparent to the hover-sensing capabilities of the HS device. The protrusion provides tactile feedback to the user for the user-interactions.
    Type: Grant
    Filed: November 26, 2019
    Date of Patent: July 20, 2021
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Michel Pahud, Nathalie Riche, Eyal Ofek, Christophe Hurter, Sasa Junuzovic
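
The abstract above describes a pipeline in which the hover-sensing device packages detected gestures as interaction data, sends it over a communication session, and the head-mounted display updates the virtual object in its field of view. The sketch below models that pipeline with an in-process queue standing in for the communication session; the gesture names and update rules are hypothetical.

```python
from queue import Queue

class HoverSensingDevice:
    def __init__(self, session: Queue):
        self.session = session

    def detect(self, gesture: str, position: tuple) -> None:
        # Package the detected user-interaction as interaction data and send
        # it over the communication session.
        self.session.put({"gesture": gesture, "position": position})

class HeadMountedDisplay:
    def __init__(self, session: Queue):
        self.session = session
        self.virtual_object = {"scale": 1.0, "rotation": 0.0}

    def process_interactions(self) -> None:
        # Update the virtual object in the field of view from interaction data.
        while not self.session.empty():
            data = self.session.get()
            if data["gesture"] == "pinch":
                self.virtual_object["scale"] *= 0.9
            elif data["gesture"] == "rotate":
                self.virtual_object["rotation"] += 15.0
            print(f"{data['gesture']} at {data['position']} -> {self.virtual_object}")

if __name__ == "__main__":
    session = Queue()                     # stand-in for the communication session
    hs = HoverSensingDevice(session)
    hmd = HeadMountedDisplay(session)
    hs.detect("pinch", (0.4, 0.6))
    hs.detect("rotate", (0.5, 0.5))
    hmd.process_interactions()
```
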
  • Patent number: 11023109
    Abstract: In various embodiments, methods and systems for implementing a multi-device mixed interactivity system are provided. The interactivity system includes paired mixed-input devices for interacting and controlling virtual objects. In operation, a selection profile associated with a virtual object is accessed. The selection profile is generated based on a selection input determined using real input associated with a selection device and virtual input associated with a mixed-reality device. The selection device has a first display and the mixed-reality device has a second display that both display the virtual object. An annotation input for the virtual object based on a selected portion corresponding to the selection profile is received. An annotation profile based on the annotation input is generated. The annotation profile includes annotation profile attributes for annotating a portion of the virtual object. An annotation of the selected portion of the virtual reality object is caused to be displayed.
    Type: Grant
    Filed: June 30, 2017
    Date of Patent: June 1, 2021
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Michel Pahud, Nathalie Riche, Eyal Ofek, Christophe Hurter, Steven Mark Drucker
  • Patent number: 10895966
    Abstract: In various embodiments, methods and systems for implementing a multi-device mixed interactivity system are provided. The interactivity system includes paired mixed-input devices for interacting and controlling virtual objects. In operation, a selection input associated with a virtual object is accessed. The selection input is based on real input associated with a selection device and virtual input associated with a mixed-reality device. The selection device has a first display that displays the virtual object and the mixed-reality device has a second display that displays the virtual object. A selection profile is generated based on the selection input. The selection profile comprises one or more selection profile attributes for isolating a portion of the virtual object. A selected portion of the virtual object is determined based on the selection profile. The selected portion of the virtual reality object is caused to be displayed on the first display of the selection device.
    Type: Grant
    Filed: June 30, 2017
    Date of Patent: January 19, 2021
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Michel Pahud, Nathalie Riche, Eyal Ofek, Christophe Hurter, Steven Mark Drucker
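
This abstract covers the selection step of the same multi-device system: real input from the selection device and virtual input from the mixed-reality device are combined into a selection profile whose attributes isolate a portion of the virtual object. A small sketch of that combination follows; the attribute names and the bounding-box test are assumptions made for the example.

```python
# Real input from the selection device (e.g. a 2D stroke on its display) is
# combined with virtual input from the mixed-reality device (e.g. which object
# is in view) into a selection profile that isolates a portion of the object.

def build_selection_profile(real_input: dict, virtual_input: dict) -> dict:
    """Combine real and virtual input into selection-profile attributes."""
    return {
        "object_id": virtual_input["gazed_object"],
        "region_2d": real_input["stroke_bbox"],      # from the selection device
        "view_direction": virtual_input["view_dir"], # from the mixed-reality device
    }

def select_portion(virtual_object: dict, profile: dict) -> list:
    """Return the parts of the object that fall inside the profile's 2D region."""
    x0, y0, x1, y1 = profile["region_2d"]
    return [part for part, (x, y) in virtual_object["parts"].items()
            if x0 <= x <= x1 and y0 <= y <= y1]

if __name__ == "__main__":
    obj = {"id": "engine-model",
           "parts": {"bolt": (0.3, 0.4), "pipe": (0.8, 0.9), "valve": (0.35, 0.45)}}
    profile = build_selection_profile(
        real_input={"stroke_bbox": (0.25, 0.35, 0.5, 0.5)},
        virtual_input={"gazed_object": obj["id"], "view_dir": (0.0, 0.0, 1.0)})
    # The selected portion is then shown on the selection device's display.
    print("selected portion:", select_portion(obj, profile))
```
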
  • Publication number: 20200209978
    Abstract: In various embodiments, computerized systems and methods for displacement oriented interaction with objects in a computer-mediated environment are provided. In one embodiment, the system detects a wearable device moved with a displacement transversal to a longitudinal axis of the wearable device. If the system determines that the displacement is within a displacement range associated with an actionable item, the system may select the actionable item or activate an operation associated with the actionable item, such as modifying an object in the computer-mediated reality environment.
    Type: Application
    Filed: March 6, 2020
    Publication date: July 2, 2020
    Inventors: Michel PAHUD, Nathalie RICHE, Eyal OFEK, Christophe HURTER
  • Patent number: 10699491
    Abstract: Systems and techniques for displaying virtual representations of real-world spaces and objects in various environments are disclosed. A source environment at a first location can be scanned by a head-mounted display (HMD) device to generate three-dimensional datasets corresponding to the physical environment at the first location. The three-dimensional datasets can include detected physical properties associated with the physical environment. At a second location, the HMD can re-create the source environment, and render for display a virtual representation of the physical environment based on the three-dimensional datasets, where the virtual representation of the source environment is rendered to maintain any one of the detected physical properties associated with the physical environment.
    Type: Grant
    Filed: April 17, 2019
    Date of Patent: June 30, 2020
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Michel Pahud, Nathalie Riche, Eyal Ofek, Christophe Hurter
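
The scan-then-re-create flow in this abstract can be illustrated as: capture a dataset of objects and their measured physical properties at the first location, then render a virtual copy at the second location that preserves those properties. The object list, property names, and text-only rendering below are hypothetical placeholders for an HMD scan and renderer.

```python
def scan_source_environment() -> list:
    """Stand-in for an HMD scan producing three-dimensional datasets."""
    return [
        {"object": "desk",   "position": (0.0, 0.0, 1.2), "size_m": (1.6, 0.8, 0.75)},
        {"object": "window", "position": (2.0, 1.0, 0.0), "size_m": (1.2, 1.5, 0.05)},
    ]

def recreate_environment(dataset: list, new_origin: tuple) -> None:
    """Render a virtual representation at a second location, preserving the
    detected physical properties (here, each object's size)."""
    ox, oy, oz = new_origin
    for entry in dataset:
        x, y, z = entry["position"]
        placed = (x + ox, y + oy, z + oz)
        print(f"render virtual {entry['object']} at {placed}, size {entry['size_m']}")

if __name__ == "__main__":
    dataset = scan_source_environment()               # captured at the first location
    recreate_environment(dataset, (10.0, 0.0, 5.0))   # re-created at the second
```
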
  • Patent number: 10620710
    Abstract: In various embodiments, computerized systems and methods for displacement oriented interaction with objects in a computer-mediated environment are provided. In one embodiment, the system detects a wearable device moved with a displacement transversal to a longitudinal axis of the wearable device. If the system determines that the displacement is within a displacement range associated with an actionable item, the system may select the actionable item or activate an operation associated with the actionable item, such as modifying an object in the computer-mediated reality environment.
    Type: Grant
    Filed: June 15, 2017
    Date of Patent: April 14, 2020
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Michel Pahud, Nathalie Riche, Eyal Ofek, Christophe Hurter
  • Publication number: 20200097119
    Abstract: Systems and methods for enabling user-interactions with virtual objects (VOs) included in immersive environments (IEs) are provided. A head-mounted display (HMD) device is communicatively coupled with a hover-sensing (HS) device, via a communication session. The HMD device provides an IE to a wearer by displaying a field-of-view (FOV) that includes a VO. The user executes user-interactions, such as 2D and/or 3D hand gestures, fingertip gestures, multi-fingertip gestures, stylus gestures, hover gestures, and the like. The HS device detects the user-interactions and generates interaction data. The interaction data is provided to the HMD device via the communication session. The HMD device updates the FOV and/or the VO based on the interaction data. A physical overlay that includes a 3D protrusion is coupled with the HS device. The overlay is transparent to the hover-sensing capabilities of the HS device. The protrusion provides tactile feedback to the user for the user-interactions.
    Type: Application
    Filed: November 26, 2019
    Publication date: March 26, 2020
    Inventors: Michel PAHUD, Nathalie RICHE, Eyal OFEK, Christophe HURTER, Sasa JUNUZOVIC
  • Patent number: 10514801
    Abstract: Systems and methods for enabling user-interactions with virtual objects (VOs) included in immersive environments (IEs) are provided. A head-mounted display (HMD) device is communicatively coupled with a hover-sensing (HS) device, via a communication session. The HMD device provides an IE to a wearer by displaying a field-of-view (FOV) that includes a VO. The user executes user-interactions, such as 2D and/or 3D hand gestures, fingertip gestures, multi-fingertip gestures, stylus gestures, hover gestures, and the like. The HS device detects the user-interactions and generates interaction data. The interaction data is provided to the HMD device via the communication session. The HMD device updates the FOV and/or the VO based on the interaction data. A physical overlay that includes a 3D protrusion is coupled with the HS device. The overlay is transparent to the hover-sensing capabilities of the HS device. The protrusion provides tactile feedback to the user for the user-interactions.
    Type: Grant
    Filed: June 15, 2017
    Date of Patent: December 24, 2019
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Michel Pahud, Nathalie Riche, Eyal Ofek, Christophe Hurter, Sasa Junuzovic
  • Patent number: 10417827
    Abstract: In various embodiments, computerized methods and systems for syndicating direct and indirect interactions with objects in a computer-mediated environment to facilitate precise interactions with the objects in the computer-mediated environment are provided. The system detects a direct interaction with an object in the computer-mediated reality environment. The direct interaction may be a natural or hypernatural interaction. Subsequently, the system may determine various options of indirect interaction with the object related to the direct interaction. The indirect interaction may be generated by a controller. Upon receiving an indirect interaction, the system may modify the object based on the syndication of the direct interaction and the indirect interaction.
    Type: Grant
    Filed: May 4, 2017
    Date of Patent: September 17, 2019
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Michel Pahud, Nathalie Riche, Eyal Ofek, Christophe Hurter
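
The syndication described in this abstract pairs a detected direct interaction with a set of related indirect options and modifies the object from the combination of the two. The sketch below shows one way such a pairing could look; the interaction names, option table, and adjustment sizes are assumptions, not the patented behavior.

```python
# A direct interaction (a hand grab on the virtual object) opens a set of
# related indirect options (fine adjustments from a hand-held controller),
# and the object is modified by combining the two.

DIRECT_TO_INDIRECT_OPTIONS = {
    "grab":  ["nudge_x", "nudge_y", "nudge_z"],   # precise translation
    "pinch": ["scale_up", "scale_down"],          # precise scaling
}

def indirect_options(direct_interaction: str) -> list:
    """Options of indirect interaction related to the detected direct one."""
    return DIRECT_TO_INDIRECT_OPTIONS.get(direct_interaction, [])

def syndicate(obj: dict, direct: str, indirect: str) -> dict:
    """Modify the object based on the combination of both interactions."""
    if direct == "grab" and indirect.startswith("nudge_"):
        axis = {"nudge_x": 0, "nudge_y": 1, "nudge_z": 2}[indirect]
        pos = list(obj["position"])
        pos[axis] += 0.01                     # small, precise adjustment
        obj["position"] = tuple(pos)
    elif direct == "pinch" and indirect.startswith("scale_"):
        obj["scale"] *= 1.1 if indirect == "scale_up" else 0.9
    return obj

if __name__ == "__main__":
    obj = {"position": (1.0, 0.5, 2.0), "scale": 1.0}
    direct = "grab"                                   # detected direct interaction
    print("indirect options:", indirect_options(direct))
    print(syndicate(obj, direct, "nudge_x"))          # controller-driven refinement
```
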
  • Patent number: 10416841
    Abstract: The claimed subject matter generates animated data visualization videos. A user interface is displayed that includes a clip library panel, a clips panel, and a configurations panel. The clip library panel includes available data clips; the clips panel includes multiple configured data clips; and, the configurations panel includes properties of a selected data clip from the clips panel. A user interface for entering a dataset is displayed. Multiple data clips dragged from the clip library panel to the clips panel, are added to the clips panel. Configuration settings for a dragged and dropped data clip are displayed in the configurations panel. One or more properties of the data clip are updated. An animated data visualization video is generated based on the dragged and dropped data clips and updated properties, and in a sequence specified in the clips panel.
    Type: Grant
    Filed: August 10, 2015
    Date of Patent: September 17, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Nathalie Riche, Bongshin Lee, Andres Monroy Hernandez, Fereshteh Amini
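
The abstract above describes a clip-based authoring model: clips are dragged from a library into an ordered clips panel, configured through a configurations panel, and then played in sequence to produce the video. The following sketch mimics that model in plain code; the ClipsPanel class, clip types, and text-only output are hypothetical stand-ins for the described user interface and video generator.

```python
class ClipsPanel:
    def __init__(self):
        self.clips = []                      # ordered sequence of configured clips

    def add_from_library(self, clip_type: str) -> dict:
        clip = {"type": clip_type, "duration_s": 3, "title": clip_type}
        self.clips.append(clip)              # dragged from library to clips panel
        return clip

    def configure(self, clip: dict, **properties) -> None:
        clip.update(properties)              # edited in the configurations panel

    def generate_video(self, dataset: list) -> None:
        # Render each clip over the dataset, in the order specified in the panel.
        for i, clip in enumerate(self.clips, start=1):
            print(f"segment {i}: {clip['type']} '{clip['title']}' "
                  f"({clip['duration_s']}s) over {len(dataset)} records")

if __name__ == "__main__":
    panel = ClipsPanel()
    bar = panel.add_from_library("bar chart race")
    panel.add_from_library("line chart zoom")
    panel.configure(bar, title="Sales by region", duration_s=5)
    panel.generate_video(dataset=[{"region": "EMEA", "sales": 120},
                                  {"region": "APAC", "sales": 90}])
```
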
  • Publication number: 20190244434
    Abstract: Systems and techniques for displaying virtual representations of real-world spaces and objects in various environments are disclosed. A source environment at a first location can be scanned by a head-mounted display (HMD) device to generate three-dimensional datasets corresponding to the physical environment at the first location. The three-dimensional datasets can include detected physical properties associated with the physical environment. At a second location, the HMD can re-create the source environment, and render for display a virtual representation of the physical environment based on the three-dimensional datasets, where the virtual representation of the source environment is rendered to maintain any one of the detected physical properties associated with the physical environment.
    Type: Application
    Filed: April 17, 2019
    Publication date: August 8, 2019
    Inventors: Michel PAHUD, Nathalie RICHE, Eyal OFEK, Christophe HURTER