Patents by Inventor Matthieu Fradet
Matthieu Fradet has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20240127552
Abstract: A camera captures part of a surrounding scene to obtain a user view. Disambiguating information is obtained that uniquely identifies at least one controllable device or group of controllable devices in a set of controllable devices of a given type, or at least one location or direction for a mobile controllable device. The user view is displayed on a display with the disambiguating information relating to at least one controllable device overlaid on it. A command intended to control at least one controllable device is received, the command including at least part of the disambiguating information, and a message based on the command is sent towards the corresponding controllable device.
Type: Application
Filed: January 10, 2022
Publication date: April 18, 2024
Inventors: Anthony Laurent, Vincent Alleaume, Caroline Baillard, Matthieu Fradet, Pierrick Jouet
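The disambiguate-then-route flow this abstract describes might look roughly like the following toy Python sketch. Everything here (the labeling scheme, function and field names) is invented for illustration; it is not the patented implementation.

```python
from dataclasses import dataclass

@dataclass
class Device:
    device_id: str
    kind: str

def assign_labels(devices):
    """Assign a unique disambiguating label per device of each type,
    e.g. two lamps become 'lamp #1' and 'lamp #2' in the overlay."""
    counters = {}
    labels = {}
    for d in devices:
        counters[d.kind] = counters.get(d.kind, 0) + 1
        labels[d.device_id] = f"{d.kind} #{counters[d.kind]}"
    return labels

def route_command(command, labels):
    """Return the device whose overlaid label appears in the command, if any."""
    for device_id, label in labels.items():
        if label.lower() in command.lower():
            return device_id
    return None

devices = [Device("lamp-a", "lamp"), Device("lamp-b", "lamp"), Device("tv-1", "tv")]
labels = assign_labels(devices)
target = route_command("turn on lamp #2", labels)
```

The labels would be drawn over the user view at each device's position; a received command carrying part of a label is enough to pick out one device among several of the same type.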
-
Patent number: 11798239
Abstract: A method for a placement of a virtual object of an augmented or mixed reality application in a real-world 3D environment comprises: selecting (14), at a runtime of the augmented or mixed reality application, one of a finite set of at least two candidate insertion areas predetermined in the real-world 3D environment for the placement of the virtual object in the real-world 3D environment, based on criteria combining, for each of the candidate insertion areas, relationships between each of: the real-world 3D environment, the virtual object considered with respect to a placement of that virtual object in that candidate insertion area, and a user position; and inserting (14) the virtual object in the selected candidate insertion area.
Type: Grant
Filed: January 21, 2022
Date of Patent: October 24, 2023
Assignee: INTERDIGITAL CE PATENT HOLDINGS, SAS
Inventors: Caroline Baillard, Pierrick Jouet, Matthieu Fradet
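The selection step the abstract names, scoring each candidate insertion area against the environment, the object, and the user position, can be sketched as below. The concrete criteria and weights are invented for illustration and are not taken from the patent.

```python
import math

def score_area(area, obj, user_pos):
    """Toy score combining the three relationships named in the abstract:
    environment fit (free size), object fit, and distance to the user."""
    if area["free_size"] < obj["size"]:      # object must physically fit
        return -math.inf
    dist = math.dist(area["center"], user_pos)
    return area["visibility"] - 0.1 * dist   # illustrative weighting

def select_area(areas, obj, user_pos):
    """Pick the best of the finite, predetermined candidate insertion areas."""
    return max(areas, key=lambda a: score_area(a, obj, user_pos))

areas = [
    {"name": "table", "center": (1.0, 0.0), "free_size": 0.5, "visibility": 0.9},
    {"name": "floor", "center": (4.0, 3.0), "free_size": 2.0, "visibility": 0.6},
]
best = select_area(areas, {"size": 0.3}, user_pos=(0.0, 0.0))
```

The point of the claimed method is that this choice happens at runtime over a predetermined finite set of areas, rather than searching the whole environment.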
-
Publication number: 20230326147
Abstract: In an augmented reality system, helper data are associated with augmented reality anchors to describe the surroundings of each anchor in the real environment. This makes it possible to verify that positional tracking is correct, in other words, that an augmented reality terminal is localized at the right place in an augmented reality scene. The helper data may be shown on request. Typical examples of helper data are a cropped 2D image or a 3D mesh.
Type: Application
Filed: July 6, 2021
Publication date: October 12, 2023
Inventors: Pierrick Jouet, Caroline Baillard, Matthieu Fradet, Anthony Laurent
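A minimal sketch of the idea, assuming a helper descriptor stored with each anchor and some image-matching function supplied by the system (here stubbed out); all names are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class Anchor:
    anchor_id: str
    pose: tuple                                  # anchor position in the AR scene
    helper: dict = field(default_factory=dict)   # e.g. cropped 2D image or 3D mesh

def attach_helper(anchor, image_crop):
    """Associate helper data describing the anchor's real surroundings."""
    anchor.helper = {"kind": "image_crop", "data": image_crop}

def tracking_plausible(anchor, observed_crop, similarity):
    """Accept the localization only if what the terminal currently observes
    around the anchor matches the stored helper data; `similarity` stands in
    for a real image- or mesh-matching routine."""
    return similarity(anchor.helper["data"], observed_crop) > 0.8

a = Anchor("door", (0, 0, 0))
attach_helper(a, "crop-of-door-frame")
exact = lambda x, y: 1.0 if x == y else 0.0
ok = tracking_plausible(a, "crop-of-door-frame", exact)
```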
-
Publication number: 20230298280
Abstract: In an augmented reality system, a map of the real environment is generated from a 3D textured mesh obtained through captured data representing the real environment. Some processing is done on the mesh to remove unnecessary elements and generate a map that comprises a set of 2D pictures: one picture for the ground level and one picture for the other elements of the scene. The generated map may then be rendered on an augmented reality device. The ground and the non-ground content may be rendered independently; additional elements, such as other users of the augmented reality scene or virtual objects, are then localized and represented in the map in real time using a proxy. The rendering can be adapted to the user poses and to the devices themselves.
Type: Application
Filed: July 6, 2021
Publication date: September 21, 2023
Inventors: Pierrick Jouet, Matthieu Fradet, Vincent Alleaume, Caroline Baillard, Tao Luo, Anthony Laurent
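The ground/non-ground split and the top-down proxy placement could be sketched as follows; the height-threshold heuristic and all names are illustrative assumptions, not the patent's actual mesh processing:

```python
def split_mesh(triangles, ground_height=0.05):
    """Split mesh triangles into ground and non-ground sets, one per 2D map
    layer, based on vertex height (z) against a small threshold."""
    ground, other = [], []
    for tri in triangles:
        if all(v[2] <= ground_height for v in tri):
            ground.append(tri)
        else:
            other.append(tri)
    return ground, other

def project_to_map(point3d):
    """Top-down projection used to localize a user or object proxy on the map."""
    x, y, _z = point3d
    return (x, y)

tris = [
    [(0, 0, 0.0), (1, 0, 0.0), (0, 1, 0.0)],   # floor triangle
    [(0, 0, 0.0), (0, 0, 1.0), (0, 1, 1.0)],   # wall triangle
]
ground, other = split_mesh(tris)
proxy = project_to_map((2.0, 3.0, 1.6))        # e.g. another user's head position
```

Rendering the two layers separately is what lets the real-time proxies be composited over a static map without re-processing the mesh.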
-
Publication number: 20230245400
Abstract: A method of sharing and a method of presenting virtual content in a mixed reality scene rendered on at least two user devices having different viewing positions and/or orientations onto the mixed reality scene, and corresponding apparatus, are described. At a first user device, a user is enabled to select a virtual content to be shared and a second user device with which the virtual content is to be shared. Information related to the virtual content to be shared is provided, wherein the provided information comprises the 3D position of the virtual content to be shared. The information is received by the second user device and the shared virtual content is rendered with regard to the viewing position and/or orientation of the second user device onto the mixed reality scene.
Type: Application
Filed: April 7, 2023
Publication date: August 3, 2023
Inventors: Matthieu Fradet, Caroline Baillard, Anthony Laurent
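A toy sketch of the share/receive flow, assuming a simple message inbox per device and a translation-only pose for brevity (a real system would use full 6-DoF poses); all names are invented:

```python
def share(content_id, position, recipients, inbox):
    """First device publishes shared content together with its 3D position."""
    for r in recipients:
        inbox.setdefault(r, []).append({"content": content_id, "position": position})

def render_for(device_pose, item):
    """Second device renders the shared content relative to its own viewpoint:
    here, the shared 3D position expressed in the receiving device's frame."""
    px, py, pz = item["position"]
    dx, dy, dz = device_pose
    return (px - dx, py - dy, pz - dz)

inbox = {}
share("hologram-1", (2.0, 0.0, 1.0), recipients=["deviceB"], inbox=inbox)
local = render_for((1.0, 0.0, 0.0), inbox["deviceB"][0])
```

The key point of the abstract is that only the content's scene-level 3D position travels between devices; each receiver applies its own viewing position and orientation.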
-
Patent number: 11651576
Abstract: A method of sharing and a method of presenting virtual content in a mixed reality scene rendered on at least two user devices having different viewing positions and/or orientations onto the mixed reality scene, and corresponding apparatus, are described. At a first user device, a user is enabled to select a virtual content to be shared and a second user device with which the virtual content is to be shared. Information related to the virtual content to be shared is provided, wherein the provided information comprises the 3D position of the virtual content to be shared. The information is received by the second user device and the shared virtual content is rendered with regard to the viewing position and/or orientation of the second user device onto the mixed reality scene.
Type: Grant
Filed: August 25, 2022
Date of Patent: May 16, 2023
Assignee: InterDigital CE Patent Holdings
Inventors: Matthieu Fradet, Caroline Baillard, Anthony Laurent
-
Publication number: 20230134130
Abstract: According to embodiments, a (e.g., plurality of) lighting model(s) may be computed from a scene model by a processing device. The model (e.g., a geometric model of the scene complemented by a lighting model(s)) may be stored, for example on the processing device. The processing device may be coupled with a user interface running on any of the processing device or another (e.g., renderer) device. According to embodiments, a (e.g., specific) scene, for example, among a set of possible scenes, may be selected via the user interface, for being rendered by the AR application on any of the processing device and a (e.g., different) renderer device. According to embodiments, a (e.g., specific, virtual) outdoor lighting condition may be selected via the user interface, and a rendering of the selected scene may be adapted by the AR application according to the selected outdoor lighting condition.
Type: Application
Filed: March 8, 2021
Publication date: May 4, 2023
Inventors: Philippe Robert, Vincent Alleaume, Matthieu Fradet, Tao Luo
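A minimal sketch of the select-and-adapt step, assuming lighting models have already been precomputed per scene and per outdoor condition (the dictionary below and its parameters are purely illustrative):

```python
# Precomputed lighting models, keyed by scene and outdoor lighting condition.
LIGHTING_MODELS = {
    "living_room": {"noon": {"intensity": 1.0}, "sunset": {"intensity": 0.4}},
    "office": {"noon": {"intensity": 0.9}},
}

def select_rendering(scene, condition):
    """Return the lighting parameters the AR renderer should apply for the
    scene and outdoor condition chosen via the UI, falling back to 'noon'
    if that condition was not precomputed."""
    models = LIGHTING_MODELS[scene]
    return models.get(condition, models["noon"])

params = select_rendering("living_room", "sunset")
```

Precomputing the models off the hot path is what lets the renderer device switch lighting conditions instantly at selection time.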
-
Publication number: 20230063303
Abstract: A method of sharing and a method of presenting virtual content in a mixed reality scene rendered on at least two user devices having different viewing positions and/or orientations onto the mixed reality scene, and corresponding apparatus, are described. At a first user device, a user is enabled to select a virtual content to be shared and a second user device with which the virtual content is to be shared. Information related to the virtual content to be shared is provided, wherein the provided information comprises the 3D position of the virtual content to be shared. The information is received by the second user device and the shared virtual content is rendered with regard to the viewing position and/or orientation of the second user device onto the mixed reality scene.
Type: Application
Filed: August 25, 2022
Publication date: March 2, 2023
Inventors: Matthieu Fradet, Caroline Baillard, Anthony Laurent
-
Patent number: 11580706
Abstract: Dynamic virtual content(s) to be superimposed on a representation of a real 3D scene complies with a scenario defined before runtime and involving real-world constraints (23). Real-world information (22) is captured in the real 3D scene and the scenario is executed at runtime (14) in presence of the real-world constraints. When the real-world constraints are not identified (12) from the real-world information, a transformation of the representation of the real 3D scene to a virtually adapted 3D scene is carried out (13) before executing the scenario, so that the virtually adapted 3D scene fulfills those constraints, and the scenario is executed in the virtually adapted 3D scene instead of the real 3D scene. Application to mixed reality.
Type: Grant
Filed: August 3, 2021
Date of Patent: February 14, 2023
Assignee: INTERDIGITAL CE PATENT HOLDINGS, SAS
Inventors: Anthony Laurent, Matthieu Fradet, Caroline Baillard
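The check-then-adapt logic might look like this toy Python sketch, where scene, constraints, and the adaptation (adding virtual stand-ins for missing real elements) are all hypothetical simplifications of the claimed transformation:

```python
def run_scenario(scene, constraints, adapt):
    """If the scenario's real-world constraints are not all identified in the
    captured scene, transform it into a virtually adapted scene that fulfills
    them, and execute the scenario there instead of in the real scene."""
    missing = [c for c in constraints if c not in scene["elements"]]
    if missing:
        scene = adapt(scene, missing)
    return scene

def add_virtual_elements(scene, missing):
    """One possible adaptation: insert virtual stand-ins for missing elements."""
    return {"elements": scene["elements"] + [f"virtual:{m}" for m in missing]}

real_scene = {"elements": ["table", "chair"]}
scene = run_scenario(real_scene, constraints=["table", "window"],
                     adapt=add_virtual_elements)
```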
-
Publication number: 20230019181
Abstract: Localization of a user device in a mixed reality environment is described, in which the user device obtains at least one keyframe from a server (which can reside on the user device), displays at least one of the keyframes on a screen, captures an image of the environment with a camera, and obtains a localization result based on at least one feature of at least one keyframe and the image.
Type: Application
Filed: December 10, 2020
Publication date: January 19, 2023
Inventors: Matthieu Fradet, Vincent Alleaume, Anthony Laurent
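A rough sketch of keyframe-based localization, assuming each keyframe stores a known pose and a feature set, and using a trivial feature-overlap score in place of real image matching; all names are illustrative:

```python
def localize(keyframes, captured_features, match_score):
    """Pick the keyframe whose features best match the captured image and
    return it together with its known pose as the localization result."""
    best = max(keyframes, key=lambda k: match_score(k["features"], captured_features))
    return {"pose": best["pose"], "keyframe": best["id"]}

keyframes = [
    {"id": "kf1", "pose": (0, 0, 0), "features": {"door", "poster"}},
    {"id": "kf2", "pose": (5, 0, 0), "features": {"window", "plant"}},
]
overlap = lambda a, b: len(a & b)   # stand-in for a real feature matcher
result = localize(keyframes, {"window", "plant", "lamp"}, overlap)
```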
-
Patent number: 11443492
Abstract: A method of sharing and a method of presenting virtual content in a mixed reality scene rendered on at least two user devices having different viewing positions and/or orientations onto the mixed reality scene, and corresponding apparatus, are described. At a first user device, a user is enabled to select a virtual content to be shared and a second user device with which the virtual content is to be shared. Information related to the virtual content to be shared is provided, wherein the provided information comprises the 3D position of the virtual content to be shared. The information is received by the second user device and the shared virtual content is rendered with regard to the viewing position and/or orientation of the second user device onto the mixed reality scene.
Type: Grant
Filed: June 13, 2019
Date of Patent: September 13, 2022
Assignee: InterDigital CE Patent Holdings
Inventors: Matthieu Fradet, Caroline Baillard, Anthony Laurent
-
Publication number: 20220148278
Abstract: A method for a placement of a virtual object of an augmented or mixed reality application in a real-world 3D environment comprises: selecting (14), at a runtime of the augmented or mixed reality application, one of a finite set of at least two candidate insertion areas predetermined in the real-world 3D environment for the placement of the virtual object in the real-world 3D environment, based on criteria combining, for each of the candidate insertion areas, relationships between each of: the real-world 3D environment, the virtual object considered with respect to a placement of that virtual object in that candidate insertion area, and a user position; and inserting (14) the virtual object in the selected candidate insertion area.
Type: Application
Filed: January 21, 2022
Publication date: May 12, 2022
Inventors: Caroline BAILLARD, Pierrick JOUET, Matthieu FRADET
-
Patent number: 11263816
Abstract: A method for a placement of a virtual object of an augmented or mixed reality application in a real-world 3D environment comprises: selecting (14), at a runtime of the augmented or mixed reality application, one of a finite set of at least two candidate insertion areas predetermined in the real-world 3D environment for the placement of the virtual object in the real-world 3D environment, based on criteria combining, for each of the candidate insertion areas, relationships between each of: the real-world 3D environment, the virtual object considered with respect to a placement of that virtual object in that candidate insertion area, and a user position; and inserting (14) the virtual object in the selected candidate insertion area.
Type: Grant
Filed: November 29, 2017
Date of Patent: March 1, 2022
Assignee: INTERDIGITAL CE PATENT HOLDINGS, SAS
Inventors: Caroline Baillard, Pierrick Jouet, Matthieu Fradet
-
Publication number: 20220036075
Abstract: A method and apparatus are provided for enhancing immersive experiences. The progress of virtual reality content provided in a mixed reality environment is monitored. The mixed reality environment incorporates virtual reality content with images provided from a real environment. In addition, at least one item of virtual acoustic data associated with the virtual reality content is obtained and modified by incorporating said images provided from the real environment.
Type: Application
Filed: September 13, 2019
Publication date: February 3, 2022
Inventors: Matthieu Fradet, Viktor Phoenix, Vincent Alleaume
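One way to read the acoustic-modification step is that surfaces seen in the real-environment images adjust the virtual audio (for example, its reverberation). The absorption table and the scaling rule below are entirely invented for illustration; the patent does not specify them.

```python
# Hypothetical absorption coefficients for surfaces detected in the images.
ABSORPTION = {"carpet": 0.6, "glass": 0.1, "concrete": 0.2}

def adapt_acoustics(virtual_acoustics, detected_surfaces):
    """Scale the virtual content's reverb by the mean absorption of the
    surfaces detected in the real-environment images."""
    if not detected_surfaces:
        return dict(virtual_acoustics)
    mean_absorption = (sum(ABSORPTION.get(s, 0.3) for s in detected_surfaces)
                       / len(detected_surfaces))
    out = dict(virtual_acoustics)
    out["reverb"] = virtual_acoustics["reverb"] * (1.0 - mean_absorption)
    return out

acoustics = adapt_acoustics({"reverb": 1.0, "volume": 0.8}, ["carpet", "glass"])
```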
-
Publication number: 20210366197
Abstract: Dynamic virtual content(s) to be superimposed on a representation of a real 3D scene complies with a scenario defined before runtime and involving real-world constraints (23). Real-world information (22) is captured in the real 3D scene and the scenario is executed at runtime (14) in presence of the real-world constraints. When the real-world constraints are not identified (12) from the real-world information, a transformation of the representation of the real 3D scene to a virtually adapted 3D scene is carried out (13) before executing the scenario, so that the virtually adapted 3D scene fulfills those constraints, and the scenario is executed in the virtually adapted 3D scene instead of the real 3D scene. Application to mixed reality.
Type: Application
Filed: August 3, 2021
Publication date: November 25, 2021
Inventors: Anthony LAURENT, Matthieu FRADET, Caroline BAILLARD
-
Patent number: 11113888
Abstract: Dynamic virtual content(s) to be superimposed on a representation of a real 3D scene complies with a scenario defined before runtime and involving real-world constraints (23). Real-world information (22) is captured in the real 3D scene and the scenario is executed at runtime (14) in presence of the real-world constraints. When the real-world constraints are not identified (12) from the real-world information, a transformation of the representation of the real 3D scene to a virtually adapted 3D scene is carried out (13) before executing the scenario, so that the virtually adapted 3D scene fulfills those constraints, and the scenario is executed in the virtually adapted 3D scene instead of the real 3D scene. Application to mixed reality.
Type: Grant
Filed: December 19, 2017
Date of Patent: September 7, 2021
Assignee: INTERDIGITAL CE PATENT HOLDINGS, SAS
Inventors: Anthony Laurent, Matthieu Fradet, Caroline Baillard
-
Publication number: 20210272373
Abstract: A method of sharing and a method of presenting virtual content in a mixed reality scene rendered on at least two user devices having different viewing positions and/or orientations onto the mixed reality scene, and corresponding apparatus, are described. At a first user device, a user is enabled to select a virtual content to be shared and a second user device with which the virtual content is to be shared. Information related to the virtual content to be shared is provided, wherein the provided information comprises the 3D position of the virtual content to be shared. The information is received by the second user device and the shared virtual content is rendered with regard to the viewing position and/or orientation of the second user device onto the mixed reality scene.
Type: Application
Filed: June 13, 2019
Publication date: September 2, 2021
Inventors: Matthieu Fradet, Caroline Baillard, Anthony Laurent
-
Publication number: 20200394842
Abstract: A method for a placement of a virtual object of an augmented or mixed reality application in a real-world 3D environment comprises: selecting (14), at a runtime of the augmented or mixed reality application, one of a finite set of at least two candidate insertion areas predetermined in the real-world 3D environment for the placement of the virtual object in the real-world 3D environment, based on criteria combining, for each of the candidate insertion areas, relationships between each of: the real-world 3D environment, the virtual object considered with respect to a placement of that virtual object in that candidate insertion area, and a user position; and inserting (14) the virtual object in the selected candidate insertion area.
Type: Application
Filed: November 29, 2017
Publication date: December 17, 2020
Inventors: Caroline BAILLARD, Pierrick JOUET, Matthieu FRADET
-
Patent number: 10825249
Abstract: In order to blur a virtual object in a video in real time as the video is acquired by a device capturing a real scene, the key idea is to estimate an apparent motion vector between two successive images, captured at two successive device poses, where the apparent motion vector estimation is based on a motion of the device. The successive images are then filtered based on the estimated apparent motion vector.
Type: Grant
Filed: September 14, 2017
Date of Patent: November 3, 2020
Assignee: INTERDIGITAL CE PATENT HOLDINGS
Inventors: Pierrick Jouet, Philippe Robert, Matthieu Fradet
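A simplified numeric sketch of the two steps, assuming a pinhole-style scaling from device translation to pixel motion and a 1-D box filter along the motion direction; the focal length, depth handling, and filter are illustrative choices, not the patented method:

```python
def apparent_motion(pose_prev, pose_curr, focal=500.0):
    """Approximate per-frame apparent motion (in pixels) from the device
    translation between two successive poses, scaled by focal/depth."""
    (x0, y0, z0), (x1, y1, z1) = pose_prev, pose_curr
    depth = max(z1, 1e-6)
    return (focal * (x1 - x0) / depth, focal * (y1 - y0) / depth)

def blur_1d(samples, length):
    """Box filter along the motion direction: average `length` trailing
    samples, standing in for filtering the image along the motion vector."""
    n = max(1, int(round(length)))
    return [sum(samples[max(0, i - n + 1): i + 1])
            / len(samples[max(0, i - n + 1): i + 1])
            for i in range(len(samples))]

mv = apparent_motion((0.0, 0.0, 2.0), (0.02, 0.0, 2.0))  # 2 cm sideways at 2 m
blurred = blur_1d([0, 0, 10, 0, 0], length=2)
```

Using the device pose instead of image-based optical flow is what makes the blur cheap enough to apply in real time as frames are captured.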
-
Patent number: 10747307
Abstract: A method is described for selecting an object in an environment that includes a plurality of real and/or virtual objects and is displayed to a user through a display device. A gesture path is assigned to each object of the plurality of objects, the gesture path comprising a series of gestures to be performed by the user to select the object.
Type: Grant
Filed: November 1, 2017
Date of Patent: August 18, 2020
Assignee: InterDigital CE Patent Holdings
Inventors: Vincent Alleaume, Pierrick Jouet, Matthieu Fradet
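The assignment-and-matching idea can be sketched as below. The specific scheme (encoding each object's index as digits over a small gesture alphabet) and all names are invented for illustration; the patent only requires that each object get a distinguishable gesture path.

```python
def assign_gesture_paths(objects, gestures=("swipe_left", "swipe_right", "circle")):
    """Assign a distinct gesture sequence to each displayed object:
    object i gets the base-N digits of i over the gesture alphabet."""
    paths = {}
    for i, obj in enumerate(objects):
        path, n = [], i
        while True:
            path.append(gestures[n % len(gestures)])
            n //= len(gestures)
            if n == 0:
                break
        paths[obj] = tuple(path)
    return paths

def select(performed, paths):
    """Return the object whose assigned gesture path the user performed."""
    for obj, path in paths.items():
        if tuple(performed) == path:
            return obj
    return None

paths = assign_gesture_paths(["lamp", "door", "virtual_cat"])
chosen = select(["circle"], paths)
```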