Patents by Inventor Christophe HURTER
Christophe HURTER has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 10304251
Abstract: Systems and techniques for displaying virtual representations of real-world spaces and objects in various environments are disclosed. A source environment at a first location can be scanned by a head-mounted display (HMD) device to generate three-dimensional datasets corresponding to the physical environment at the first location. The three-dimensional datasets can include detected physical properties associated with the physical environment. At a second location, the HMD can re-create the source environment, and render for display a virtual representation of the physical environment based on the three-dimensional datasets, where the virtual representation of the source environment is rendered to maintain any one of the detected physical properties associated with the physical environment.
Type: Grant
Filed: June 15, 2017
Date of Patent: May 28, 2019
Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
Inventors: Michel Pahud, Nathalie Riche, Eyal Ofek, Christophe Hurter
-
Patent number: 10217295
Abstract: To better explore a virtual 3D computer-generated environment, objects (which may be voxels, polygons or any other construct) are selectively not displayed so as to better reveal underlying objects. The objects are each associated with a metadata value, such as a density or opacity value, which contributes to determining their visibility. The manner of selection is somewhat analogous to the projection of a beam of light towards the objects from a virtual projector: a display threshold is determined for each object within the field of view of said virtual projector on the basis of a display function having an inverse relation to distance from the virtual projector and further varying as a function of the angle defined by the orientation of the virtual projector and a line drawn from said virtual projector to each said object respectively.
Type: Grant
Filed: April 12, 2017
Date of Patent: February 26, 2019
Assignee: ECOLE NATIONALE DE L'AVIATION CIVILE
Inventors: Christophe Hurter, Michael Traoré Sompagnimdi
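The display-function idea in this abstract can be sketched in a few lines. This is a toy illustration, not the patented formula: the cosine falloff, the 1/distance relation and the constant `k` are assumed shapes chosen only to show a threshold that decreases with distance and with off-axis angle.

```python
import math

def display_threshold(obj_pos, projector_pos, projector_dir, k=1.0):
    """Toy display function: highest on the projector axis close to the
    projector, falling off with distance (1/d) and with the angle between
    the projector's orientation and the line to the object (cosine)."""
    dx = [o - p for o, p in zip(obj_pos, projector_pos)]
    dist = math.sqrt(sum(d * d for d in dx))
    if dist == 0.0:
        return float("inf")
    norm = math.sqrt(sum(d * d for d in projector_dir))
    cos_a = sum(d * u for d, u in zip(dx, projector_dir)) / (dist * norm)
    cos_a = max(cos_a, 0.0)  # objects behind the projector get threshold 0
    return k * cos_a / dist

def hidden(obj, projector_pos, projector_dir):
    """An object in the 'beam' is hidden when its metadata value
    (e.g. density) falls below the threshold at its position,
    revealing what lies behind it."""
    return obj["density"] < display_threshold(obj["pos"], projector_pos, projector_dir)
```

A faint object directly in front of the projector is thus culled, while the same object far away or off-axis stays visible.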
-
Patent number: 10210637
Abstract: Datasets such as two-dimensional raster images or three-dimensional voxel-based representations are often processed for representation using a transfer function defined by a curve. A mechanism for manually adjusting such curves is described, whereby a user adds a second curve. The transfer curve is recalculated so as to draw closer to the second curve. By drawing the second curve in the shape required for the transfer curve, and repeating this gesture as the transfer curve evolves, the user can subtly and interactively develop the transfer curve until the processed representation is exactly as required. The effect of the points of the second curve on those of the first may be attractive or repellent, may vary in any manner as a function of distance, and in particular may imitate the effects of physical forces such as magnetic or elastic forces.
Type: Grant
Filed: March 22, 2017
Date of Patent: February 19, 2019
Assignee: ECOLE NATIONALE DE L'AVIATION CIVILE
Inventor: Christophe Hurter
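One relaxation step of such a curve-attraction mechanism might look like the sketch below; the 1/(1+d) decay and the influence radius are assumptions standing in for whatever force profile (magnetic, elastic, etc.) is chosen.

```python
def attract(transfer, target, strength=0.5, radius=2.0):
    """One relaxation step pulling each point of the transfer curve toward
    the nearest point of a user-drawn target curve; the pull decays with
    distance (here 1/(1+d)), loosely imitating an elastic force, and
    points farther than `radius` from the target are left untouched."""
    result = []
    for x, y in transfer:
        # nearest point on the target curve
        nx, ny = min(target, key=lambda p: (p[0] - x) ** 2 + (p[1] - y) ** 2)
        d = ((nx - x) ** 2 + (ny - y) ** 2) ** 0.5
        if d > radius:
            result.append((x, y))
            continue
        w = strength / (1.0 + d)
        result.append((x + w * (nx - x), y + w * (ny - y)))
    return result
```

Calling `attract` repeatedly as the user redraws the target curve reproduces the interactive "draw closer" behaviour described above.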
-
Publication number: 20190005724
Abstract: Methods and systems for rendering augmented reality display data to locations of a physical presentation environment based on a presentation configuration are provided. A physical presentation environment configuration may be accessed that includes locations of a physical presentation environment for mapping augmented reality display data. The augmented reality display data may include a plurality of augmented reality objects that are rendered for display. Presentation attributes of the augmented reality display data may be used in conjunction with the presentation configuration for mapping and rendering the augmented reality display data. The rendered augmented reality display data may be dynamically interactive, and may be generated based on previous presentation configurations, mapping preferences, mapping limitations, and/or other factors.
Type: Application
Filed: June 30, 2017
Publication date: January 3, 2019
Inventors: Michel PAHUD, Nathalie RICHE, Eyal OFEK, Christophe HURTER, Steven Mark DRUCKER
-
Publication number: 20190004684
Abstract: In various embodiments, methods and systems for implementing a multi-device mixed interactivity system are provided. The interactivity system includes paired mixed-input devices for interacting with and controlling virtual objects. In operation, a selection profile associated with a virtual object is accessed. The selection profile is generated based on a selection input determined using real input associated with a selection device and virtual input associated with a mixed-reality device. The selection device has a first display and the mixed-reality device has a second display, both of which display the virtual object. An annotation input for the virtual object based on a selected portion corresponding to the selection profile is received. An annotation profile based on the annotation input is generated. The annotation profile includes annotation profile attributes for annotating a portion of the virtual object. An annotation of the selected portion of the virtual object is caused to be displayed.
Type: Application
Filed: June 30, 2017
Publication date: January 3, 2019
Inventors: Michel PAHUD, Nathalie RICHE, Eyal OFEK, Christophe HURTER, Steven Mark DRUCKER
-
Publication number: 20190004683
Abstract: In various embodiments, methods and systems for implementing a multi-device mixed interactivity system are provided. The interactivity system includes paired mixed-input devices for interacting with and controlling virtual objects. In operation, a selection input associated with a virtual object is accessed. The selection input is based on real input associated with a selection device and virtual input associated with a mixed-reality device. The selection device has a first display that displays the virtual object and the mixed-reality device has a second display that displays the virtual object. A selection profile is generated based on the selection input. The selection profile comprises one or more selection profile attributes for isolating a portion of the virtual object. A selected portion of the virtual object is determined based on the selection profile. The selected portion of the virtual object is caused to be displayed on the first display of the selection device.
Type: Application
Filed: June 30, 2017
Publication date: January 3, 2019
Inventors: Michel PAHUD, Nathalie RICHE, Eyal OFEK, Christophe HURTER, Steven Mark DRUCKER
-
Publication number: 20180365897
Abstract: Systems and techniques for displaying virtual representations of real-world spaces and objects in various environments are disclosed. A source environment at a first location can be scanned by a head-mounted display (HMD) device to generate three-dimensional datasets corresponding to the physical environment at the first location. The three-dimensional datasets can include detected physical properties associated with the physical environment. At a second location, the HMD can re-create the source environment, and render for display a virtual representation of the physical environment based on the three-dimensional datasets, where the virtual representation of the source environment is rendered to maintain any one of the detected physical properties associated with the physical environment.
Type: Application
Filed: June 15, 2017
Publication date: December 20, 2018
Inventors: MICHEL PAHUD, NATHALIE RICHE, EYAL OFEK, CHRISTOPHE HURTER
-
Publication number: 20180364808
Abstract: In various embodiments, computerized systems and methods for displacement oriented interaction with objects in a computer-mediated environment are provided. In one embodiment, the system detects a wearable device moved with a displacement transversal to a longitudinal axis of the wearable device. If the system determines that the displacement is within a displacement range associated with an actionable item, the system may select the actionable item or activate an operation associated with the actionable item, such as modifying an object in the computer-mediated reality environment.
Type: Application
Filed: June 15, 2017
Publication date: December 20, 2018
Inventors: MICHEL PAHUD, NATHALIE RICHE, EYAL OFEK, CHRISTOPHE HURTER
-
Publication number: 20180364853
Abstract: Systems and methods for enabling user-interactions with virtual objects (VOs) included in immersive environments (IEs) are provided. A head-mounted display (HMD) device is communicatively coupled with a hover-sensing (HS) device, via a communication session. The HMD device provides an IE to a wearer by displaying a field-of-view (FOV) that includes a VO. The user executes user-interactions, such as 2D and/or 3D hand gestures, fingertip gestures, multi-fingertip gestures, stylus gestures, hover gestures, and the like. The HS device detects the user-interactions and generates interaction data. The interaction data is provided to the HMD device via the communication session. The HMD device updates the FOV and/or the VO based on the interaction data. A physical overlay that includes a 3D protrusion is coupled with the HS device. The overlay is transparent to the hover-sensing capabilities of the HS device. The protrusion provides tactile feedback to the user for the user-interactions.
Type: Application
Filed: June 15, 2017
Publication date: December 20, 2018
Inventors: Michel PAHUD, Nathalie RICHE, Eyal OFEK, Christophe HURTER, Sasa JUNUZOVIC
-
Publication number: 20180330532
Abstract: A mechanism is provided for exploring three-dimensional environments such as those generated from X-ray and tomography scans, in such a way as to "see round" obstacles to an article of interest, without degrading the overall context of the article of interest in terms of its position in relation to other articles and to the viewpoint of the user. This is achieved by defining a light guide curve leading to the user's viewpoint, with respect to which ray tracing is performed to define the image displayed to the user, whereby rays passing within a predetermined distance of the light guide curve at a relative angle within a predetermined range thereto are deviated so as to conform therewith to a degree proportional to their distance therefrom.
Type: Application
Filed: May 15, 2018
Publication date: November 15, 2018
Applicant: ECOLE NATIONALE DE L'AVIATION CIVILE
Inventors: Christophe HURTER, Michael TRAORÉ SOMPAGNIMDI
-
Publication number: 20180322701
Abstract: In various embodiments, computerized methods and systems for syndicating direct and indirect interactions with objects in a computer-mediated environment to facilitate precise interactions with the objects in the computer-mediated environment are provided. The system detects a direct interaction with an object in the computer-mediated reality environment. The direct interaction may be a natural or hypernatural interaction. Subsequently, the system may determine various options of indirect interaction with the object related to the direct interaction. The indirect interaction may be generated by a controller. Upon receiving an indirect interaction, the system may modify the object based on the syndication of the direct interaction and the indirect interaction.
Type: Application
Filed: May 4, 2017
Publication date: November 8, 2018
Inventors: Michel PAHUD, Nathalie RICHE, Eyal OFEK, Christophe HURTER
-
Publication number: 20180314484
Abstract: Various systems and techniques for intuitively and non-obstructively obtaining awareness of fellow collaborator views and/or statuses in an augmented reality environment are disclosed. An interaction is detected between a first and second HMD, both collaborative participants of a shared dataset represented in an augmented reality environment. The detected interaction can cause the first HMD to request state information associated with the second HMD. The second HMD can process the request in accordance with a privacy policy to generate a set of visual data as a response to the request. The second HMD can communicate the generated set of visual data to the first HMD, such that the first HMD can render the received set of visual data as additional computer-generated object(s) to be displayed concurrently with its own augmented view.
Type: Application
Filed: April 28, 2017
Publication date: November 1, 2018
Inventors: MICHEL PAHUD, NATHALIE RICHE, EYAL OFEK, CHRISTOPHE HURTER
-
Publication number: 20180032836
Abstract: A method of defining a path model from a set of realistic paths is provided, where each path in the set of realistic paths is expanded on a piece-wise polynomial basis, and a respective centroid function and sequence of eigenfunctions are calculated for each expanded representation. A set of principal paths representing the major variation of this set of paths is obtained, describing the variations of the set of realistic paths with respect to the centroid. The path model thus comprises a linear combination of said principal paths. The path model may be used as the basis for the generation of new curves having similar characteristics to the original set of realistic paths.
Type: Application
Filed: July 27, 2017
Publication date: February 1, 2018
Applicant: ECOLE NATIONALE DE L'AVIATION CIVILE
Inventor: Christophe HURTER
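A discrete analogue of this path model can be sketched with a principal-component decomposition. Note the simplifying assumptions: paths are represented as values sampled at common parameter positions (a stand-in for the piece-wise polynomial expansion), and the "principal paths" are simply the leading right-singular vectors of the centred sample matrix.

```python
import numpy as np

def path_model(paths, n_components=2):
    """paths: array of shape (n_paths, n_samples), each path sampled at the
    same parameter values. Returns the centroid path and the principal
    variation paths (rows of Vt from an SVD of the centred data)."""
    X = np.asarray(paths, dtype=float)
    centroid = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - centroid, full_matrices=False)
    return centroid, Vt[:n_components]

def generate(centroid, principal, coeffs):
    """A new curve is the centroid plus a linear combination of the
    principal paths, weighted by the given coefficients."""
    return centroid + np.tensordot(coeffs, principal, axes=1)
```

Sampling the coefficients from the distribution observed in the original set would yield new curves with similar characteristics, as the abstract suggests.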
-
Publication number: 20170301147
Abstract: To better explore a virtual 3D computer-generated environment, objects (which may be voxels, polygons or any other construct) are selectively not displayed so as to better reveal underlying objects. The objects are each associated with a metadata value, such as a density or opacity value, which contributes to determining their visibility. The manner of selection is somewhat analogous to the projection of a beam of light towards the objects from a virtual projector: a display threshold is determined for each object within the field of view of said virtual projector on the basis of a display function having an inverse relation to distance from the virtual projector and further varying as a function of the angle defined by the orientation of the virtual projector and a line drawn from said virtual projector to each said object respectively.
Type: Application
Filed: April 12, 2017
Publication date: October 19, 2017
Applicant: ECOLE NATIONALE DE L'AVIATION CIVILE
Inventors: Christophe HURTER, Michael TRAORÉ SOMPAGNIMDI
-
Publication number: 20170277418
Abstract: In order to browse between a collection of datasets susceptible of graphical representation, these datasets are associated with points on a sliding scale of one, two or three dimensions. When a point corresponding to a particular dataset is selected by a user via a mouse pointer etc., it is rendered as a graphical representation and presented to the user. When an intermediate point is selected, an interpolation of the datasets corresponding to the nearby points is generated and the resulting dataset rendered as a graphical representation and presented to the user. The interaction may be implemented with a slider-bar type widget having hybrid behaviour, such that clicking on the bar causes the button to jump to the nearest point corresponding to a dataset, while sliding to a chosen intermediate position activates the interpolation of adjacent datasets.
Type: Application
Filed: March 20, 2017
Publication date: September 28, 2017
Applicant: ECOLE NATIONALE DE L'AVIATION CIVILE
Inventor: Christophe HURTER
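The hybrid slider behaviour can be sketched for the one-dimensional case. This is a minimal illustration assuming numeric datasets and simple linear interpolation; the patent covers scales of up to three dimensions and arbitrary graphical representations.

```python
def slider_value(datasets, positions, t):
    """Hybrid slider: if t lands (nearly) on a stored position, return that
    dataset; otherwise linearly blend the two datasets bracketing t.
    datasets[i] is associated with positions[i] (sorted ascending);
    each dataset is a list of numbers."""
    for pos, data in zip(positions, datasets):
        if abs(t - pos) < 1e-9:
            return list(data)
    # find the bracketing pair and blend proportionally
    pairs = zip(zip(positions, datasets), zip(positions[1:], datasets[1:]))
    for (p0, d0), (p1, d1) in pairs:
        if p0 < t < p1:
            w = (t - p0) / (p1 - p0)
            return [(1 - w) * a + w * b for a, b in zip(d0, d1)]
    raise ValueError("t outside the slider range")
```

A click handler would snap `t` to the nearest entry of `positions` before calling this, while a drag handler would pass the raw slider position.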
-
Publication number: 20170278298
Abstract: Objects in a voxel based computer generated three dimensional environment are identified by traversing adjacent voxels meeting a predetermined criterion with respect to a scalar metadata value associated with each voxel, such as opacity or density. These adjacent voxels may be explored in accordance with a tree-crawling algorithm such as a breadth first or depth first algorithm. Once all adjacent cells meeting the criterion are identified, these are determined to represent a discrete object, and displayed as such. The starting point for the traversing process may be the voxel closest to a virtual camera position along the line of sight of that virtual camera meeting the criterion.
Type: Application
Filed: March 23, 2017
Publication date: September 28, 2017
Applicant: ECOLE NATIONALE DE L'AVIATION CIVILE
Inventors: Christophe HURTER, Michael TRAORÉ SOMPAGNIMDI
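The breadth-first variant of this traversal is essentially a 3D flood fill; a minimal sketch follows, assuming a sparse grid stored as a dict and a simple "value at least threshold" criterion with face adjacency.

```python
from collections import deque

def extract_object(grid, start, threshold):
    """Breadth-first traversal of face-adjacent voxels whose scalar value
    (e.g. density or opacity) meets the threshold; returns the set of
    voxel coordinates forming one discrete object.
    grid: dict mapping (x, y, z) -> scalar value."""
    if grid.get(start, 0) < threshold:
        return set()
    seen = {start}
    queue = deque([start])
    while queue:
        x, y, z = queue.popleft()
        # the six face-adjacent neighbours
        for dx, dy, dz in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                           (0, -1, 0), (0, 0, 1), (0, 0, -1)):
            n = (x + dx, y + dy, z + dz)
            if n not in seen and grid.get(n, 0) >= threshold:
                seen.add(n)
                queue.append(n)
    return seen
```

Replacing the queue with a stack would give the depth-first variant mentioned in the abstract; as described, `start` would in practice be the first qualifying voxel along the camera's line of sight.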
-
Publication number: 20170278284
Abstract: Datasets such as two-dimensional raster images or three-dimensional voxel-based representations are often processed for representation using a transfer function defined by a curve. A mechanism for manually adjusting such curves is described, whereby a user adds a second curve. The transfer curve is recalculated so as to draw closer to the second curve. By drawing the second curve in the shape required for the transfer curve, and repeating this gesture as the transfer curve evolves, the user can subtly and interactively develop the transfer curve until the processed representation is exactly as required. The effect of the points of the second curve on those of the first may be attractive or repellent, may vary in any manner as a function of distance, and in particular may imitate the effects of physical forces such as magnetic or elastic forces.
Type: Application
Filed: March 22, 2017
Publication date: September 28, 2017
Applicant: ECOLE NATIONALE DE L'AVIATION CIVILE
Inventor: Christophe HURTER
-
Publication number: 20170278300
Abstract: In a crowded representation of a virtual three dimensional space defined in terms of voxels, an object of interest will often be occluded by one or more objects of varying densities between the virtual camera defining the user's point of view, and the object of interest. To automatically identify an optimal camera position, a number of candidate positions are considered, for example situated at the vertices of a regular polyhedron centred on the object of interest. For each of these candidate positions, a ray is cast towards the object of interest, and the occlusion for each intervening voxel is determined as the product of that voxel's density and a density transfer function. The virtual camera position corresponding to the least occluded path is then selected as the new point of view.
Type: Application
Filed: March 20, 2017
Publication date: September 28, 2017
Applicant: ECOLE NATIONALE DE L'AVIATION CIVILE
Inventor: Christophe HURTER
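The candidate-selection loop can be sketched as follows; the fixed-step ray sampling, and `density` and `transfer` as plain callables, are simplifying assumptions for illustration.

```python
def least_occluded_camera(candidates, target, density, transfer, steps=50):
    """Cast a ray from each candidate camera position toward the target,
    accumulate (voxel density x density transfer function) at each sample
    along the ray, and return the candidate with the smallest total
    occlusion."""
    def occlusion(cam):
        total = 0.0
        for i in range(1, steps):
            t = i / steps
            # sample point at parameter t along the ray cam -> target
            p = tuple(c + t * (g - c) for c, g in zip(cam, target))
            total += density(p) * transfer(density(p))
        return total
    return min(candidates, key=occlusion)
```

With candidates placed at the vertices of a regular polyhedron centred on the object of interest, as the abstract proposes, this picks the viewpoint whose line of sight passes through the least obscuring material.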
-
Publication number: 20170278309
Abstract: The objects present in a particular computer generated 3D environment are represented to a user as distributed amongst a plurality of display areas. The relative positions of the objects are maintained, and whenever an object is removed from one display area it is added to another. The point of view presented to the user may be the same for each display area, with all being controlled together, or separate control may be provided for each area or sub-group of areas.
Type: Application
Filed: March 22, 2017
Publication date: September 28, 2017
Applicant: ECOLE NATIONALE DE L'AVIATION CIVILE
Inventors: Christophe HURTER, Michael TRAORÉ SOMPAGNIMDI
-
Publication number: 20170108923
Abstract: A graphical user interface supporting eye tracking is enriched with graphical representations of the degree of attention afforded respective areas of the user interface. These representations may comprise heatmaps, visual variables and the like. The generation of these representations may furthermore be used to adjust other user interface behaviour, for example by repositioning a mouse cursor to a part of the screen afforded greater attention. The attention information of a plurality of users may be compiled together and used to modify the graphical representation, providing each user with an indication of the focus of attention of their colleagues, thereby establishing a group awareness.
Type: Application
Filed: October 7, 2016
Publication date: April 20, 2017
Applicant: ECOLE NATIONALE DE L'AVIATION CIVILE
Inventors: Christophe HURTER, Raïlane BENHACENE
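The heatmap-style aggregation of gaze samples (from one user or several) can be sketched as a coarse grid of counts; the cell size and plain count-based accumulation are illustrative assumptions, not the patented representation.

```python
def attention_heatmap(gaze_points, width, height, cell=10):
    """Accumulate gaze samples (x, y) in screen coordinates into a coarse
    grid; each cell counts how often attention fell there. A UI could
    render this grid as a heatmap overlay, or use the densest cells to
    drive behaviour such as cursor repositioning."""
    cols, rows = width // cell, height // cell
    grid = [[0] * cols for _ in range(rows)]
    for x, y in gaze_points:
        if 0 <= x < width and 0 <= y < height:
            grid[int(y) // cell][int(x) // cell] += 1
    return grid
```

Merging the grids of several users (summing cell by cell) would give the shared group-awareness view described in the abstract.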