Patents by Inventor Christophe HURTER
Christophe HURTER has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20250054247
Abstract: Methods and systems for rendering augmented reality display data to locations of a physical presentation environment based on a presentation configuration are provided. A physical presentation environment configuration may be accessed that includes locations of a physical presentation environment for mapping augmented reality display data. The augmented reality display data may include a plurality of augmented reality objects that are rendered for display. Presentation attributes of the augmented reality display data may be used in conjunction with the presentation configuration for mapping and rendering the augmented reality display data. The rendered augmented reality display data may be dynamically interactive, and may be generated based on previous presentation configurations, mapping preferences, mapping limitations, and/or other factors.
Type: Application
Filed: October 21, 2024
Publication date: February 13, 2025
Inventors: Michel PAHUD, Nathalie RICHE, Eyal OFEK, Christophe HURTER, Steven Mark DRUCKER
-
Publication number: 20240398319
Abstract: A method for selecting data derived from an electroencephalogram, in the form of a set of starting scalograms, each scalogram being calculated from a portion of an electroencephalographic signal. The method includes: extracting, via an artificial neural network, a set of candidate scalograms; and, for some scalograms of the candidate set: calculating characteristics of the electroencephalographic signal portion corresponding to the candidate scalogram; and, when those characteristics are within prerequisite value ranges, selecting the electroencephalographic signal portion of the candidate scalogram within an electroencephalographic signal selection data structure.
Type: Application
Filed: October 20, 2022
Publication date: December 5, 2024
Inventors: Ludovic Gardy, Emmanuel Barbeau, Christophe Hurter
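The final selection step described in this abstract — keep a candidate signal portion only when every computed characteristic falls inside a prerequisite value range — can be sketched as follows. The characteristic names, the toy characteristics function, and the value ranges here are invented for illustration; the patent does not specify them.

```python
# Illustrative sketch (not the patented method): a candidate signal
# portion is kept only when all of its computed characteristics fall
# within prerequisite [low, high] value ranges.

def select_portions(portions, compute_characteristics, value_ranges):
    """Return the portions whose characteristics all lie within range."""
    selected = []
    for portion in portions:
        chars = compute_characteristics(portion)
        if all(low <= chars[name] <= high
               for name, (low, high) in value_ranges.items()):
            selected.append(portion)
    return selected

# Toy usage: portions are lists of samples; the two characteristics
# (peak amplitude and mean value) are stand-ins, not from the patent.
def toy_characteristics(samples):
    return {"peak": max(abs(s) for s in samples),
            "mean": sum(samples) / len(samples)}

ranges = {"peak": (0.5, 10.0), "mean": (-1.0, 1.0)}
kept = select_portions([[0.1, 0.9, -0.2], [20.0, 30.0, 25.0]],
                       toy_characteristics, ranges)
```

The second portion is rejected because its peak amplitude exceeds the prerequisite range, mirroring the filter-then-select flow the abstract describes.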
-
Patent number: 12141927
Abstract: Methods and systems for rendering augmented reality display data to locations of a physical presentation environment based on a presentation configuration are provided. A physical presentation environment configuration may be accessed that includes locations of a physical presentation environment for mapping augmented reality display data. The augmented reality display data may include a plurality of augmented reality objects that are rendered for display. Presentation attributes of the augmented reality display data may be used in conjunction with the presentation configuration for mapping and rendering the augmented reality display data. The rendered augmented reality display data may be dynamically interactive, and may be generated based on previous presentation configurations, mapping preferences, mapping limitations, and/or other factors.
Type: Grant
Filed: June 30, 2017
Date of Patent: November 12, 2024
Assignee: Microsoft Technology Licensing, LLC
Inventors: Michel Pahud, Nathalie Riche, Eyal Ofek, Christophe Hurter, Steven Mark Drucker
-
Publication number: 20230350628
Abstract: Users can intuitively and non-obstructively obtain awareness of fellow collaborator views and/or statuses in an augmented reality environment. An interaction is detected between a first and second HMD. The detected interaction can cause the first HMD to request state information associated with the second HMD. The second HMD can process the request to generate a set of visual data as a response to the request. The second HMD can communicate the set of visual data to the first HMD, so that the first HMD can render the received set of visual data to be displayed concurrently with its own augmented view. Additional computer-generated object(s) can be positioned in accordance with a real-world or virtualized position of the second HMD, such that the now-visible state information associated with the second HMD does not obstruct a view of the first or second HMD's user.
Type: Application
Filed: July 12, 2023
Publication date: November 2, 2023
Inventors: Michel PAHUD, Nathalie RICHE, Eyal OFEK, Christophe HURTER
-
Patent number: 11782669
Abstract: Various systems and techniques for intuitively and non-obstructively obtaining awareness of fellow collaborator views and/or statuses in an augmented reality environment are disclosed. An interaction is detected between a first and second HMD, both collaborative participants of a shared dataset represented in an augmented reality environment. The detected interaction can cause the first HMD to request state information associated with the second HMD. The second HMD can process the request in accordance with a privacy policy to generate a set of visual data as a response to the request. The second HMD can communicate the generated set of visual data to the first HMD, such that the first HMD can render the received set of visual data as additional computer-generated object(s) to be displayed concurrently with its own augmented view.
Type: Grant
Filed: April 28, 2017
Date of Patent: October 10, 2023
Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
Inventors: Michel Pahud, Nathalie Riche, Eyal Ofek, Christophe Hurter
-
Patent number: 11543947
Abstract: In various embodiments, methods and systems for implementing a multi-device mixed interactivity system are provided. The interactivity system includes paired mixed-input devices for interacting with and controlling virtual objects. In operation, a selection profile associated with a virtual object is accessed. The selection profile is generated based on a selection input determined using real input associated with a selection device and virtual input associated with a mixed-reality device. The selection device has a first display and the mixed-reality device has a second display that both display the virtual object. An annotation input for the virtual object, based on a selected portion corresponding to the selection profile, is received. An annotation profile based on the annotation input is generated. The annotation profile includes annotation profile attributes for annotating a portion of the virtual object. An annotation of the selected portion of the virtual object is caused to be displayed.
Type: Grant
Filed: June 1, 2021
Date of Patent: January 3, 2023
Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
Inventors: Michel Pahud, Nathalie Riche, Eyal Ofek, Christophe Hurter, Steven Mark Drucker
-
Publication number: 20220011924
Abstract: In various embodiments, methods and systems for implementing a multi-device mixed interactivity system are provided. The interactivity system includes paired mixed-input devices for interacting with and controlling virtual objects. In operation, a selection profile associated with a virtual object is accessed. The selection profile is generated based on a selection input determined using real input associated with a selection device and virtual input associated with a mixed-reality device. The selection device has a first display and the mixed-reality device has a second display that both display the virtual object. An annotation input for the virtual object, based on a selected portion corresponding to the selection profile, is received. An annotation profile based on the annotation input is generated. The annotation profile includes annotation profile attributes for annotating a portion of the virtual object. An annotation of the selected portion of the virtual object is caused to be displayed.
Type: Application
Filed: June 1, 2021
Publication date: January 13, 2022
Inventors: Michel PAHUD, Nathalie RICHE, Eyal OFEK, Christophe HURTER, Steven Mark DRUCKER
-
Patent number: 11119581
Abstract: In various embodiments, computerized systems and methods for displacement oriented interaction with objects in a computer-mediated environment are provided. In one embodiment, the system detects a wearable device moved with a displacement transversal to a longitudinal axis of the wearable device. If the system determines that the displacement is within a displacement range associated with an actionable item, the system may select the actionable item or activate an operation associated with the actionable item, such as modifying an object in the computer-mediated reality environment.
Type: Grant
Filed: March 6, 2020
Date of Patent: September 14, 2021
Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
Inventors: Michel Pahud, Nathalie Riche, Eyal Ofek, Christophe Hurter
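The core decision in this abstract — select the actionable item whose displacement range contains the measured transversal displacement — reduces to a range lookup. The item names, units, and ranges below are invented for illustration and are not taken from the patent; where ranges share a boundary, the first match wins.

```python
# Illustrative sketch (not the patented method): map a measured
# transversal displacement (here in arbitrary units, e.g. millimetres)
# to the actionable item whose range contains it.

def item_for_displacement(displacement, items):
    """items: list of (name, low, high) tuples.
    Return the name of the first item whose range contains the
    displacement, or None if no range matches."""
    for name, low, high in items:
        if low <= displacement <= high:
            return name
    return None

# Hypothetical menu of actionable items keyed by displacement range.
menu = [("select", 0.0, 5.0),
        ("scroll", 5.0, 15.0),
        ("zoom", 15.0, 30.0)]
```

For example, a 3-unit wrist displacement would select the "select" item, while a displacement outside every range triggers nothing.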
-
Patent number: 11068111
Abstract: Systems and methods for enabling user-interactions with virtual objects (VOs) included in immersive environments (IEs) are provided. A head-mounted display (HMD) device is communicatively coupled with a hover-sensing (HS) device, via a communication session. The HMD device provides an IE to a wearer by displaying a field-of-view (FOV) that includes a VO. The user executes user-interactions, such as 2D and/or 3D hand gestures, fingertip gestures, multi-fingertip gestures, stylus gestures, hover gestures, and the like. The HS device detects the user-interactions and generates interaction data. The interaction data is provided to the HMD device via the communication session. The HMD device updates the FOV and/or the VO based on the interaction data. A physical overlay that includes a 3D protrusion is coupled with the HS device. The overlay is transparent to the hover-sensing capabilities of the HS device. The protrusion provides tactile feedback to the user for the user-interactions.
Type: Grant
Filed: November 26, 2019
Date of Patent: July 20, 2021
Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
Inventors: Michel Pahud, Nathalie Riche, Eyal Ofek, Christophe Hurter, Sasa Junuzovic
-
Patent number: 11023109
Abstract: In various embodiments, methods and systems for implementing a multi-device mixed interactivity system are provided. The interactivity system includes paired mixed-input devices for interacting with and controlling virtual objects. In operation, a selection profile associated with a virtual object is accessed. The selection profile is generated based on a selection input determined using real input associated with a selection device and virtual input associated with a mixed-reality device. The selection device has a first display and the mixed-reality device has a second display that both display the virtual object. An annotation input for the virtual object, based on a selected portion corresponding to the selection profile, is received. An annotation profile based on the annotation input is generated. The annotation profile includes annotation profile attributes for annotating a portion of the virtual object. An annotation of the selected portion of the virtual object is caused to be displayed.
Type: Grant
Filed: June 30, 2017
Date of Patent: June 1, 2021
Assignee: Microsoft Technology Licensing, LLC
Inventors: Michel Pahud, Nathalie Riche, Eyal Ofek, Christophe Hurter, Steven Mark Drucker
-
Patent number: 10895966
Abstract: In various embodiments, methods and systems for implementing a multi-device mixed interactivity system are provided. The interactivity system includes paired mixed-input devices for interacting with and controlling virtual objects. In operation, a selection input associated with a virtual object is accessed. The selection input is based on real input associated with a selection device and virtual input associated with a mixed-reality device. The selection device has a first display that displays the virtual object and the mixed-reality device has a second display that displays the virtual object. A selection profile is generated based on the selection input. The selection profile comprises one or more selection profile attributes for isolating a portion of the virtual object. A selected portion of the virtual object is determined based on the selection profile. The selected portion of the virtual object is caused to be displayed on the first display of the selection device.
Type: Grant
Filed: June 30, 2017
Date of Patent: January 19, 2021
Assignee: Microsoft Technology Licensing, LLC
Inventors: Michel Pahud, Nathalie Riche, Eyal Ofek, Christophe Hurter, Steven Mark Drucker
-
Patent number: 10769841
Abstract: A mechanism is provided for exploring three-dimensional environments, such as those generated from X-ray and tomography scans, in such a way as to "see round" obstacles to an article of interest, without degrading the overall context of the article of interest in terms of its position in relation to other articles and to the viewpoint of the user. This is achieved by defining a light guide curve leading to the user's viewpoint, with respect to which ray tracing is performed to define the image displayed to the user, whereby rays passing within a predetermined distance of the light guide curve, at a relative angle within a predetermined range thereto, are deviated so as to conform therewith to a degree proportional to their distance therefrom.
Type: Grant
Filed: May 15, 2018
Date of Patent: September 8, 2020
Assignee: ECOLE NATIONALE DE L'AVIATION CIVILE
Inventors: Christophe Hurter, Michael Traoré Sompagnimdi
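The ray-bending rule this abstract describes — a ray passing close to the light guide curve, at a small enough relative angle, is deviated toward the curve — can be sketched as a direction blend. The weighting below (stronger conformance for closer rays) is one plausible reading of the proportionality clause and is an assumption, not the patent's actual formula; the thresholds `d_max` and `a_max` are likewise illustrative names.

```python
import math

# Illustrative sketch (not the patented formula): a ray within d_max of
# the guide curve, at a relative angle within a_max of the curve's local
# tangent, has its direction blended toward that tangent. The blend
# weight w = 1 - distance/d_max is an assumed interpretation: closer
# rays conform more strongly.

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def deviate(ray_dir, tangent, distance, angle, d_max, a_max):
    """Blend ray_dir toward the curve tangent when within range."""
    if distance > d_max or angle > a_max:
        return normalize(ray_dir)          # outside influence: unchanged
    w = 1.0 - distance / d_max             # assumed weighting
    blended = tuple(r + w * (t - r) for r, t in zip(ray_dir, tangent))
    return normalize(blended)
```

A ray exactly on the curve (distance 0) fully adopts the tangent direction, while a ray beyond `d_max` or at too steep an angle is left untouched, which matches the "see round obstacles without degrading context" intent.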
-
Publication number: 20200209978
Abstract: In various embodiments, computerized systems and methods for displacement oriented interaction with objects in a computer-mediated environment are provided. In one embodiment, the system detects a wearable device moved with a displacement transversal to a longitudinal axis of the wearable device. If the system determines that the displacement is within a displacement range associated with an actionable item, the system may select the actionable item or activate an operation associated with the actionable item, such as modifying an object in the computer-mediated reality environment.
Type: Application
Filed: March 6, 2020
Publication date: July 2, 2020
Inventors: Michel PAHUD, Nathalie RICHE, Eyal OFEK, Christophe HURTER
-
Patent number: 10699491
Abstract: Systems and techniques for displaying virtual representations of real-world spaces and objects in various environments are disclosed. A source environment at a first location can be scanned by a head-mounted display (HMD) device to generate three-dimensional datasets corresponding to the physical environment at the first location. The three-dimensional datasets can include detected physical properties associated with the physical environment. At a second location, the HMD can re-create the source environment, and render for display a virtual representation of the physical environment based on the three-dimensional datasets, where the virtual representation of the source environment is rendered to maintain any one of the detected physical properties associated with the physical environment.
Type: Grant
Filed: April 17, 2019
Date of Patent: June 30, 2020
Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
Inventors: Michel Pahud, Nathalie Riche, Eyal Ofek, Christophe Hurter
-
Patent number: 10620710
Abstract: In various embodiments, computerized systems and methods for displacement oriented interaction with objects in a computer-mediated environment are provided. In one embodiment, the system detects a wearable device moved with a displacement transversal to a longitudinal axis of the wearable device. If the system determines that the displacement is within a displacement range associated with an actionable item, the system may select the actionable item or activate an operation associated with the actionable item, such as modifying an object in the computer-mediated reality environment.
Type: Grant
Filed: June 15, 2017
Date of Patent: April 14, 2020
Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
Inventors: Michel Pahud, Nathalie Riche, Eyal Ofek, Christophe Hurter
-
Publication number: 20200097119
Abstract: Systems and methods for enabling user-interactions with virtual objects (VOs) included in immersive environments (IEs) are provided. A head-mounted display (HMD) device is communicatively coupled with a hover-sensing (HS) device, via a communication session. The HMD device provides an IE to a wearer by displaying a field-of-view (FOV) that includes a VO. The user executes user-interactions, such as 2D and/or 3D hand gestures, fingertip gestures, multi-fingertip gestures, stylus gestures, hover gestures, and the like. The HS device detects the user-interactions and generates interaction data. The interaction data is provided to the HMD device via the communication session. The HMD device updates the FOV and/or the VO based on the interaction data. A physical overlay that includes a 3D protrusion is coupled with the HS device. The overlay is transparent to the hover-sensing capabilities of the HS device. The protrusion provides tactile feedback to the user for the user-interactions.
Type: Application
Filed: November 26, 2019
Publication date: March 26, 2020
Inventors: Michel PAHUD, Nathalie RICHE, Eyal OFEK, Christophe HURTER, Sasa JUNUZOVIC
-
Patent number: 10559118
Abstract: In a crowded representation of a virtual three-dimensional space defined in terms of voxels, an object of interest will often be occluded by one or more objects of varying densities between the virtual camera defining the user's point of view and the object of interest. To automatically identify an optimal camera position, a number of candidate positions are considered, for example situated at the vertices of a regular polyhedron centred on the object of interest. For each of these candidate positions, a ray is cast towards the object of interest, and the occlusion for each intervening voxel is determined as the product of that voxel's density and a density transfer function. The virtual camera position corresponding to the least occluded path is then selected as the new point of view.
Type: Grant
Filed: March 20, 2017
Date of Patent: February 11, 2020
Assignee: ECOLE NATIONALE DE L'AVIATION CIVILE
Inventor: Christophe Hurter
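The candidate-selection loop this abstract describes can be sketched in a few lines: score each candidate position by accumulating, over the voxels its ray crosses, the product of voxel density and a density transfer function, then keep the least occluded candidate. The ray/voxel traversal is abstracted here into a `voxels_along_ray` callable, which is a stand-in of ours; the patent does not name such a function, and the toy densities and linear transfer function are invented for illustration.

```python
# Illustrative sketch (not the patented implementation) of choosing the
# least occluded virtual camera position among candidates.

def occlusion(densities, transfer):
    """Accumulated occlusion of one ray: sum of density * transfer(density)
    over every intervening voxel the ray crosses."""
    return sum(d * transfer(d) for d in densities)

def best_camera(candidates, voxels_along_ray, transfer):
    """Return the candidate position whose ray toward the object of
    interest accumulates the least occlusion."""
    return min(candidates,
               key=lambda pos: occlusion(voxels_along_ray(pos), transfer))

# Toy usage: three named candidate positions (e.g. polyhedron vertices)
# with precomputed voxel densities per ray, and an identity transfer.
rays = {"front": [0.9, 0.8], "side": [0.1, 0.0], "top": [0.5, 0.5]}
choice = best_camera(rays.keys(), lambda pos: rays[pos], lambda d: d)
```

With a real voxel grid, `voxels_along_ray` would be a 3D line traversal (e.g. a DDA walk) from the candidate position to the object of interest; the scoring and `min` selection stay the same.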
-
Patent number: 10514801
Abstract: Systems and methods for enabling user-interactions with virtual objects (VOs) included in immersive environments (IEs) are provided. A head-mounted display (HMD) device is communicatively coupled with a hover-sensing (HS) device, via a communication session. The HMD device provides an IE to a wearer by displaying a field-of-view (FOV) that includes a VO. The user executes user-interactions, such as 2D and/or 3D hand gestures, fingertip gestures, multi-fingertip gestures, stylus gestures, hover gestures, and the like. The HS device detects the user-interactions and generates interaction data. The interaction data is provided to the HMD device via the communication session. The HMD device updates the FOV and/or the VO based on the interaction data. A physical overlay that includes a 3D protrusion is coupled with the HS device. The overlay is transparent to the hover-sensing capabilities of the HS device. The protrusion provides tactile feedback to the user for the user-interactions.
Type: Grant
Filed: June 15, 2017
Date of Patent: December 24, 2019
Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
Inventors: Michel Pahud, Nathalie Riche, Eyal Ofek, Christophe Hurter, Sasa Junuzovic
-
Patent number: 10438398
Abstract: Objects in a voxel-based computer-generated three-dimensional environment are identified by traversing adjacent voxels meeting a predetermined criterion with respect to a scalar metadata value associated with each voxel, such as opacity or density. These adjacent voxels may be explored in accordance with a tree-crawling algorithm such as a breadth-first or depth-first algorithm. Once all adjacent cells meeting the criterion are identified, these are determined to represent a discrete object, and displayed as such. The starting point for the traversing process may be the voxel closest to a virtual camera position, along the line of sight of that virtual camera, meeting the criterion.
Type: Grant
Filed: March 23, 2017
Date of Patent: October 8, 2019
Assignee: ECOLE NATIONALE DE L'AVIATION CIVILE
Inventors: Christophe Hurter, Michael Traoré Sompagnimdi
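The traversal this abstract describes is essentially a flood fill over the voxel grid: starting from a seed voxel, a breadth-first (or depth-first) walk collects every connected voxel whose scalar value meets the criterion, and the connected set is treated as one discrete object. The sketch below uses 6-connectivity and a sparse dict-of-coordinates grid; the toy grid and density threshold are invented for illustration.

```python
from collections import deque

# Illustrative sketch (not the patented implementation): breadth-first
# flood fill collecting all 6-connected voxels whose scalar metadata
# value (e.g. density) meets the criterion.

def flood_fill(grid, seed, meets):
    """grid: dict mapping (x, y, z) -> scalar value.
    Return the set of 6-connected voxels reachable from seed for which
    meets(grid[voxel]) is true."""
    if seed not in grid or not meets(grid[seed]):
        return set()
    found, queue = {seed}, deque([seed])
    while queue:
        x, y, z = queue.popleft()
        for dx, dy, dz in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                           (0, -1, 0), (0, 0, 1), (0, 0, -1)):
            n = (x + dx, y + dy, z + dz)
            if n in grid and n not in found and meets(grid[n]):
                found.add(n)
                queue.append(n)
    return found

# Toy grid: two dense voxels touching, one dense voxel separated by a
# low-density gap, so it belongs to a different object.
grid = {(0, 0, 0): 0.9, (1, 0, 0): 0.8, (2, 0, 0): 0.1, (3, 0, 0): 0.7}
obj = flood_fill(grid, (0, 0, 0), lambda d: d >= 0.5)
```

Swapping the `deque` for a plain list used as a stack turns this into the depth-first variant the abstract also mentions; the set of voxels found is the same either way.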
-
Patent number: 10417827
Abstract: In various embodiments, computerized methods and systems for syndicating direct and indirect interactions with objects in a computer-mediated environment to facilitate precise interactions with the objects in the computer-mediated environment are provided. The system detects a direct interaction with an object in the computer-mediated reality environment. The direct interaction may be a natural or hypernatural interaction. Subsequently, the system may determine various options of indirect interaction with the object related to the direct interaction. The indirect interaction may be generated by a controller. Upon receiving an indirect interaction, the system may modify the object based on the syndication of the direct interaction and the indirect interaction.
Type: Grant
Filed: May 4, 2017
Date of Patent: September 17, 2019
Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
Inventors: Michel Pahud, Nathalie Riche, Eyal Ofek, Christophe Hurter