Patents by Inventor Matthieu Fradet

Matthieu Fradet has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240127552
    Abstract: A camera captures part of a surrounding scene to obtain a user view. Disambiguating information uniquely identifying at least one controllable device or group of controllable devices in a set of controllable devices of a given type, or at least one location or direction for a mobile controllable device, is obtained. The user view, together with the disambiguating information relating to at least one controllable device overlaid on the user view, is displayed on a display. A command intended to control at least one controllable device is received, the command including at least part of the disambiguating information, and a message based on the command is sent towards a corresponding controllable device.
    Type: Application
    Filed: January 10, 2022
    Publication date: April 18, 2024
    Inventors: Anthony Laurent, Vincent Alleaume, Caroline Baillard, Matthieu Fradet, Pierrick Jouet
  • Patent number: 11798239
    Abstract: A method for a placement of a virtual object of an augmented or mixed reality application in a real-world 3D environment, comprises: selecting (14), at a runtime of the augmented or mixed reality application, one of a finite set of at least two candidate insertion areas predetermined in the real-world 3D environment for the placement of the virtual object in the real-world 3D environment, based on criteria combining, for each of the candidate insertion areas, relationships between each of: the real-world 3D environment, the virtual object considered with respect to a placement of that virtual object in that candidate insertion area, and a user position; and inserting (14) the virtual object in the selected candidate insertion area.
    Type: Grant
    Filed: January 21, 2022
    Date of Patent: October 24, 2023
    Assignee: INTERDIGITAL CE PATENT HOLDINGS, SAS
    Inventors: Caroline Baillard, Pierrick Jouet, Matthieu Fradet
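As a rough illustration of the selection step described in the entry above, the sketch below scores a finite set of predetermined candidate insertion areas against the object size and the user position, then picks the best one. The criteria, weights, and data layout (InsertionArea, score_area) are invented for illustration; the patent does not specify them.

```python
from dataclasses import dataclass

@dataclass
class InsertionArea:
    name: str
    center: tuple        # (x, y, z) centre of the candidate area, in world coordinates
    free_width: float    # usable width of the area, in metres
    free_depth: float    # usable depth of the area, in metres

def score_area(area, obj_size, user_pos, w_fit=0.5, w_dist=0.5):
    """Combine object/area fit and user proximity into a single score (higher is better)."""
    if obj_size[0] > area.free_width or obj_size[1] > area.free_depth:
        return float("-inf")                      # the object cannot be placed here at all
    dist = sum((a - u) ** 2 for a, u in zip(area.center, user_pos)) ** 0.5
    fit_ratio = (obj_size[0] * obj_size[1]) / (area.free_width * area.free_depth)
    return w_fit * fit_ratio - w_dist * dist      # prefer snug areas close to the user

def select_area(candidates, obj_size, user_pos):
    """Pick one of the finite, predetermined candidate insertion areas at runtime."""
    return max(candidates, key=lambda a: score_area(a, obj_size, user_pos))

# Example: two predetermined areas, a 0.4 m x 0.3 m virtual object, user at the origin.
areas = [InsertionArea("table", (1.0, 0.0, 0.7), 0.8, 0.6),
         InsertionArea("shelf", (3.0, 2.0, 1.5), 0.5, 0.2)]
print(select_area(areas, (0.4, 0.3), (0.0, 0.0, 0.0)).name)   # -> "table"
```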
  • Publication number: 20230326147
    Abstract: In an augmented reality system, helper data are associated with augmented reality anchors to describe the surroundings of each anchor in the real environment. This makes it possible to verify that positional tracking is correct, in other words, that an augmented reality terminal is localized at the right place in the augmented reality scene. The helper data may be shown on request. Typical examples of helper data are a cropped 2D image or a 3D mesh.
    Type: Application
    Filed: July 6, 2021
    Publication date: October 12, 2023
    Inventors: Pierrick Jouet, Caroline Baillard, Matthieu Fradet, Anthony Laurent
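The sketch below illustrates one plausible way to attach helper data (a cropped 2D image and/or a 3D mesh) to an anchor and to check, on request, that tracking still matches the stored surroundings. The Anchor/HelperData structures, the injected similarity function, and the threshold are assumptions, not the patented design.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class HelperData:
    cropped_image: Optional[bytes] = None   # 2D crop showing the anchor's surroundings
    mesh: Optional[bytes] = None            # serialized 3D mesh of the surroundings

@dataclass
class Anchor:
    anchor_id: str
    position: tuple                          # (x, y, z) position in the AR scene
    helper: Optional[HelperData] = None

def tracking_looks_correct(anchor, observed_image, similarity, threshold=0.7):
    """Compare what the camera currently sees near the anchor with the stored helper image.

    `similarity` is any image-comparison callable returning a value in [0, 1]; it stands in
    for whatever matcher the real system uses. Returns True when the views are consistent.
    """
    if anchor.helper is None or anchor.helper.cropped_image is None:
        return True                          # nothing to verify against
    return similarity(anchor.helper.cropped_image, observed_image) >= threshold
```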
  • Publication number: 20230298280
    Abstract: In an augmented reality system, a map of the real environment is generated from a 3D textured mesh obtained through captured data representing the real environment. Some processing is done on the mesh to remove unnecessary elements and generate the map, which comprises a set of 2D pictures: one picture for the ground level and one picture for the other elements of the scene. The generated map may then be rendered on an augmented reality device. The ground and the non-ground content may be rendered independently; additional elements, such as other users of the augmented reality scene or virtual objects, are then localized and represented in the map in real time using a proxy. The rendering can be adapted to the user poses and to the devices themselves.
    Type: Application
    Filed: July 6, 2021
    Publication date: September 21, 2023
    Inventors: Pierrick Jouet, Matthieu Fradet, Vincent Alleaume, Caroline Baillard, Tao Luo, Anthony Laurent
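Below is a minimal sketch of the ground versus non-ground separation suggested by the abstract above, assuming the ground can be identified in a textured mesh by near-vertical normals at low height; the thresholds and triangle representation are hypothetical.

```python
def split_ground(triangles, max_ground_height=0.05, min_up_normal=0.9):
    """Split textured-mesh triangles into ground and non-ground sets.

    Each triangle is (vertices, normal), with vertices as three (x, y, z) tuples and
    normal as a unit (nx, ny, nz) vector, z pointing up. The two sets would then be
    rasterized top-down into the two 2D pictures of the map.
    """
    ground, other = [], []
    for vertices, normal in triangles:
        height = max(v[2] for v in vertices)
        if normal[2] >= min_up_normal and height <= max_ground_height:
            ground.append((vertices, normal))   # flat and near floor level -> ground picture
        else:
            other.append((vertices, normal))    # walls, furniture, clutter -> second picture
    return ground, other
```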
  • Publication number: 20230245400
    Abstract: A method of sharing and a method of presenting virtual content in a mixed reality scene rendered on at least two user devices having different viewing positions and/or orientations with respect to the mixed reality scene, and corresponding apparatus, are described. At a first user device, a user is enabled to select a virtual content to be shared and a second user device with which the virtual content is to be shared. Information related to the virtual content to be shared is provided, wherein the provided information comprises the 3D position of the virtual content to be shared. The information is received by the second user device and the shared virtual content is rendered according to the viewing position and/or orientation of the second user device with respect to the mixed reality scene.
    Type: Application
    Filed: April 7, 2023
    Publication date: August 3, 2023
    Inventors: Matthieu Fradet, Caroline Baillard, Anthony Laurent
  • Patent number: 11651576
    Abstract: A method of sharing and a method of presenting virtual content in a mixed reality scene rendered on at least two user devices having different viewing positions and/or orientations with respect to the mixed reality scene, and corresponding apparatus, are described. At a first user device, a user is enabled to select a virtual content to be shared and a second user device with which the virtual content is to be shared. Information related to the virtual content to be shared is provided, wherein the provided information comprises the 3D position of the virtual content to be shared. The information is received by the second user device and the shared virtual content is rendered according to the viewing position and/or orientation of the second user device with respect to the mixed reality scene.
    Type: Grant
    Filed: August 25, 2022
    Date of Patent: May 16, 2023
    Assignee: InterDigital CE Patent Holdings
    Inventors: Matthieu Fradet, Caroline Baillard, Anthony Laurent
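A small sketch of the receiving side of the sharing described above: the shared content carries a 3D position in the common scene frame, and the second user device projects it with its own pose, so each device sees the content from its own viewpoint. The pinhole projection and the 4x4 world-to-camera pose format are generic assumptions, not the patent's protocol.

```python
import numpy as np

def project_shared_content(world_pos, device_pose, fx=800.0, fy=800.0, cx=640.0, cy=360.0):
    """Project a shared world-space 3D point into the receiving device's image.

    device_pose is a 4x4 world-to-camera matrix for the second user device, so each
    device renders the same shared content from its own viewpoint.
    """
    p = device_pose @ np.append(np.asarray(world_pos, dtype=float), 1.0)
    if p[2] <= 0:
        return None                           # behind the receiving device's camera
    return fx * p[0] / p[2] + cx, fy * p[1] / p[2] + cy

# Example: identity pose, shared object 2 m in front of the receiving device.
print(project_shared_content([0.0, 0.0, 2.0], np.eye(4)))   # -> (640.0, 360.0)
```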
  • Publication number: 20230134130
    Abstract: According to embodiments, a (e.g., plurality of) lighting model(s) may be computed from a scene model by a processing device. The model (e.g., a geometric model of the scene complemented by a lighting model(s)) may be stored, for example on the processing device. The processing device may be coupled with a user interface running on any of the processing device or another (e.g., renderer) device. According to embodiments, a (e.g., specific) scene, for example, among a set of possible scenes, may be selected via the user interface, for being rendered by the AR application on any of the processing device and a (e.g., different) renderer device. According to embodiments, a (e.g., specific, virtual) outdoor lighting condition may be selected via the user interface, and a rendering of the selected scene may be adapted by the AR application according to the selected outdoor lighting condition.
    Type: Application
    Filed: March 8, 2021
    Publication date: May 4, 2023
    Inventors: Philippe Robert, Vincent Alleaume, Matthieu Fradet, Tao Luo
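A hedged sketch of the selection flow in the abstract above: lighting models precomputed for a stored scene are keyed by an outdoor lighting condition, and the user interface picks one at render time. The condition names and model contents are invented for illustration.

```python
# Hypothetical lighting models precomputed from the stored scene model.
lighting_models = {
    "sunny_noon": {"sun_direction": (0.0, -0.7, -0.7), "intensity": 1.0},
    "overcast":   {"sun_direction": (0.0, 0.0, -1.0),  "intensity": 0.4},
    "sunset":     {"sun_direction": (-0.9, 0.0, -0.3), "intensity": 0.6},
}

def render_settings(scene_name, outdoor_condition):
    """Return the selected scene together with the lighting model for the chosen condition."""
    return {"scene": scene_name, "lighting": lighting_models[outdoor_condition]}

# Example: the user interface selects a scene and a virtual outdoor lighting condition.
print(render_settings("living_room", "sunset"))
```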
  • Publication number: 20230063303
    Abstract: A method of sharing and a method of presenting virtual content in a mixed reality scene rendered on at least two user devices having different viewing positions and/or orientations with respect to the mixed reality scene, and corresponding apparatus, are described. At a first user device, a user is enabled to select a virtual content to be shared and a second user device with which the virtual content is to be shared. Information related to the virtual content to be shared is provided, wherein the provided information comprises the 3D position of the virtual content to be shared. The information is received by the second user device and the shared virtual content is rendered according to the viewing position and/or orientation of the second user device with respect to the mixed reality scene.
    Type: Application
    Filed: August 25, 2022
    Publication date: March 2, 2023
    Inventors: Matthieu Fradet, Caroline Baillard, Anthony Laurent
  • Patent number: 11580706
    Abstract: Dynamic virtual content(s) to be superimposed on a representation of a real 3D scene complies with a scenario defined before runtime and involving real-world constraints (23). Real-world information (22) is captured in the real 3D scene and the scenario is executed at runtime (14) in presence of the real-world constraints. When the real-world constraints are not identified (12) from the real-world information, a transformation of the representation of the real 3D scene to a virtually adapted 3D scene is carried out (13) before executing the scenario, so that the virtually adapted 3D scene fulfills those constraints, and the scenario is executed in the virtually adapted 3D scene instead of the real 3D scene. Application to mixed reality.
    Type: Grant
    Filed: August 3, 2021
    Date of Patent: February 14, 2023
    Assignee: INTERDIGITAL CE PATENT HOLDINGS, SAS
    Inventors: Anthony Laurent, Matthieu Fradet, Caroline Baillard
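The sketch below illustrates the adaptation idea in the abstract above: when a real-world constraint required by the scenario is not identified in the captured scene, a virtual substitute is added so that the scenario can still run. The constraint labels and substitutes are hypothetical.

```python
def adapt_scene(detected_elements, required_constraints, virtual_substitutes):
    """Virtually adapt the scene before executing the scenario.

    detected_elements: labels identified in the real 3D scene.
    required_constraints: labels the scenario needs (e.g. "horizontal_surface").
    virtual_substitutes: label -> virtual object inserted when the label is missing.
    """
    adapted = set(detected_elements)
    inserted = []
    for constraint in required_constraints:
        if constraint not in adapted:
            inserted.append(virtual_substitutes[constraint])   # e.g. a virtual table
            adapted.add(constraint)
    return adapted, inserted

# Example: the scenario needs a horizontal surface, but only walls were detected.
print(adapt_scene({"wall"}, ["horizontal_surface"], {"horizontal_surface": "virtual_table"}))
```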
  • Publication number: 20230019181
    Abstract: Localization of a user device in a mixed reality environment, in which the user device obtains at least one keyframe from a server (which can reside on the user device), displays at least one of the keyframes on a screen, captures an image of the environment with a camera, and obtains a localization result based on at least one feature of at least one keyframe and the image.
    Type: Application
    Filed: December 10, 2020
    Publication date: January 19, 2023
    Inventors: Matthieu Fradet, Vincent Alleaume, Anthony Laurent, Philippe Robert
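One way the keyframe-based localization described above could be approximated is sketched below, using ORB feature matching between a server-provided keyframe and the captured image; the feature type, matcher, and match-count threshold are illustrative choices rather than the patented method.

```python
import cv2

def localize_against_keyframe(keyframe_path, captured_path, min_matches=30):
    """Check whether the captured image matches a server-provided keyframe.

    Returns True when enough ORB features correspond, i.e. the device is plausibly
    located at the pose associated with that keyframe.
    """
    keyframe = cv2.imread(keyframe_path, cv2.IMREAD_GRAYSCALE)
    captured = cv2.imread(captured_path, cv2.IMREAD_GRAYSCALE)
    orb = cv2.ORB_create()
    _, des_key = orb.detectAndCompute(keyframe, None)
    _, des_cap = orb.detectAndCompute(captured, None)
    if des_key is None or des_cap is None:
        return False
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    return len(matcher.match(des_key, des_cap)) >= min_matches
```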
  • Patent number: 11443492
    Abstract: A method of sharing and a method of presenting virtual content in a mixed reality scene rendered on at least two user devices having different viewing positions and/or orientations with respect to the mixed reality scene, and corresponding apparatus, are described. At a first user device, a user is enabled to select a virtual content to be shared and a second user device with which the virtual content is to be shared. Information related to the virtual content to be shared is provided, wherein the provided information comprises the 3D position of the virtual content to be shared. The information is received by the second user device and the shared virtual content is rendered according to the viewing position and/or orientation of the second user device with respect to the mixed reality scene.
    Type: Grant
    Filed: June 13, 2019
    Date of Patent: September 13, 2022
    Assignee: InterDigital CE Patent Holdings
    Inventors: Matthieu Fradet, Caroline Baillard, Anthony Laurent
  • Publication number: 20220148278
    Abstract: A method for a placement of a virtual object of an augmented or mixed reality application in a real-world 3D environment, comprises: selecting (14), at a runtime of the augmented or mixed reality application, one of a finite set of at least two candidate insertion areas predetermined in the real-world 3D environment for the placement of the virtual object in the real-world 3D environment, based on criteria combining, for each of the candidate insertion areas, relationships between each of: the real-world 3D environment, the virtual object considered with respect to a placement of that virtual object in that candidate insertion area, and a user position; and inserting (14) the virtual object in the selected candidate insertion area.
    Type: Application
    Filed: January 21, 2022
    Publication date: May 12, 2022
    Inventors: Caroline BAILLARD, Pierrick JOUET, Matthieu FRADET
  • Patent number: 11263816
    Abstract: A method for a placement of a virtual object of an augmented or mixed reality application in a real-world 3D environment, comprises: selecting (14), at a runtime of the augmented or mixed reality application, one of a finite set of at least two candidate insertion areas predetermined in the real-world 3D environment for the placement of the virtual object in the real-world 3D environment, based on criteria combining, for each of the candidate insertion areas, relationships between each of: the real-world 3D environment, the virtual object considered with respect to a placement of that virtual object in that candidate insertion area, and a user position; and inserting (14) the virtual object in the selected candidate insertion area.
    Type: Grant
    Filed: November 29, 2017
    Date of Patent: March 1, 2022
    Assignee: INTERDIGITAL CE PATENT HOLDINGS, SAS
    Inventors: Caroline Baillard, Pierrick Jouet, Matthieu Fradet
  • Publication number: 20220036075
    Abstract: A method and apparatus are provided for enhancing immersive experiences. The progress of virtual reality content provided in a mixed reality environment is monitored. The mixed reality environment incorporates virtual reality content with images provided from a real environment. In addition, at least one item of virtual acoustic data associated with the virtual reality content is obtained and modified by incorporating said images provided in the real environment.
    Type: Application
    Filed: September 13, 2019
    Publication date: February 3, 2022
    Inventors: Matthieu Fradet, Viktor Phoenix, Vincent Alleaume
  • Publication number: 20210366197
    Abstract: Dynamic virtual content(s) to be superimposed on a representation of a real 3D scene complies with a scenario defined before runtime and involving real-world constraints (23). Real-world information (22) is captured in the real 3D scene and the scenario is executed at runtime (14) in presence of the real-world constraints. When the real-world constraints are not identified (12) from the real-world information, a transformation of the representation of the real 3D scene to a virtually adapted 3D scene is carried out (13) before executing the scenario, so that the virtually adapted 3D scene fulfills those constraints, and the scenario is executed in the virtually adapted 3D scene instead of the real 3D scene. Application to mixed reality.
    Type: Application
    Filed: August 3, 2021
    Publication date: November 25, 2021
    Inventors: Anthony LAURENT, Matthieu FRADET, Caroline BAILLARD
  • Patent number: 11113888
    Abstract: Dynamic virtual content(s) to be superimposed on a representation of a real 3D scene complies with a scenario defined before runtime and involving real-world constraints (23). Real-world information (22) is captured in the real 3D scene and the scenario is executed at runtime (14) in presence of the real-world constraints. When the real-world constraints are not identified (12) from the real-world information, a transformation of the representation of the real 3D scene to a virtually adapted 3D scene is carried out (13) before executing the scenario, so that the virtually adapted 3D scene fulfills those constraints, and the scenario is executed in the virtually adapted 3D scene instead of the real 3D scene. Application to mixed reality.
    Type: Grant
    Filed: December 19, 2017
    Date of Patent: September 7, 2021
    Assignee: INTERDIGITAL CE PATENT HOLDINGS, SAS
    Inventors: Anthony Laurent, Matthieu Fradet, Caroline Baillard
  • Publication number: 20210272373
    Abstract: A method of sharing and a method of presenting virtual content in a mixed reality scene rendered on at least two user devices having different viewing positions and/or orientations with respect to the mixed reality scene, and corresponding apparatus, are described. At a first user device, a user is enabled to select a virtual content to be shared and a second user device with which the virtual content is to be shared. Information related to the virtual content to be shared is provided, wherein the provided information comprises the 3D position of the virtual content to be shared. The information is received by the second user device and the shared virtual content is rendered according to the viewing position and/or orientation of the second user device with respect to the mixed reality scene.
    Type: Application
    Filed: June 13, 2019
    Publication date: September 2, 2021
    Inventors: Matthieu Fradet, Caroline Baillard, Anthony Laurent
  • Publication number: 20200394842
    Abstract: A method for a placement of a virtual object of an augmented or mixed reality application in a real-world 3D environment, comprises: selecting (14), at a runtime of the augmented or mixed reality application, one of a finite set of at least two candidate insertion areas predetermined in the real-world 3D environment for the placement of the virtual object in the real-world 3D environment, based on criteria combining, for each of the candidate insertion areas, relationships between each of: the real-world 3D environment, the virtual object considered with respect to a placement of that virtual object in that candidate insertion area, and a user position; and inserting (14) the virtual object in the selected candidate insertion area.
    Type: Application
    Filed: November 29, 2017
    Publication date: December 17, 2020
    Inventors: Caroline BAILLARD, Pierrick JOUET, Matthieu FRADET
  • Patent number: 10825249
    Abstract: In order to blur a virtual object in a video in real time as the video is acquired by a device capturing a real scene, a salient idea is to estimate an apparent motion vector between two successive images captured at two successive device poses, where the apparent motion vector estimation is based on the motion of the device. The successive images are then filtered based on the estimated apparent motion vector.
    Type: Grant
    Filed: September 14, 2017
    Date of Patent: November 3, 2020
    Assignee: INTERDIGITAL CE PATENT HOLDINGS
    Inventors: Pierrick Jouet, Philippe Robert, Matthieu Fradet
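A compact sketch of the idea in the entry above: the apparent 2D motion of a virtual object's anchor point is derived purely from the change in device pose between two frames, and that vector sets the length and direction of a directional blur. The pinhole intrinsics and pose convention are assumptions.

```python
import numpy as np

def apparent_motion_vector(point_world, pose_prev, pose_curr,
                           fx=800.0, fy=800.0, cx=640.0, cy=360.0):
    """Apparent 2D motion of a world point between two successive device poses.

    Poses are 4x4 world-to-camera matrices; the vector comes from device motion alone,
    so no image-based optical flow is needed.
    """
    def project(pose):
        p = pose @ np.append(np.asarray(point_world, dtype=float), 1.0)
        return np.array([fx * p[0] / p[2] + cx, fy * p[1] / p[2] + cy])
    return project(pose_curr) - project(pose_prev)

def blur_params(motion):
    """Length (pixels) and direction (degrees) of the directional blur to apply."""
    return float(np.linalg.norm(motion)), float(np.degrees(np.arctan2(motion[1], motion[0])))

# Example: the world-to-camera translation changes by 5 cm in x between two frames,
# so the anchored virtual point appears to shift left in the image.
pose_prev = np.eye(4)
pose_curr = np.eye(4); pose_curr[0, 3] = -0.05
print(blur_params(apparent_motion_vector([0.0, 0.0, 2.0], pose_prev, pose_curr)))
```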
  • Patent number: 10747307
    Abstract: A method of selection of an object in an environment including a plurality of real and/or virtual objects is described. The environment, displayed to a user through a display device, includes an assignment of a gesture path to each object of the plurality of objects, and each gesture path includes a series of gestures to be performed by the user to select the corresponding object.
    Type: Grant
    Filed: November 1, 2017
    Date of Patent: August 18, 2020
    Assignee: InterDigital CE Patent Holdings
    Inventors: Vincent Alleaume, Pierrick Jouet, Matthieu Fradet
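To illustrate the gesture-path mechanism described above, the sketch below assigns each object a distinct sequence of gestures drawn from a small alphabet and selects the object whose sequence matches what the user performed; the gesture alphabet and assignment scheme are invented for illustration.

```python
from itertools import product

def build_gesture_paths(objects, gestures=("swipe_left", "swipe_right", "swipe_up", "swipe_down")):
    """Assign each object a distinct gesture path (a fixed-length sequence of gestures)."""
    length = 1
    while len(gestures) ** length < len(objects):
        length += 1                                # grow the path until every object fits
    return dict(zip(objects, product(gestures, repeat=length)))

def select_object(paths, performed):
    """Return the object whose assigned gesture path matches the performed gesture series."""
    performed = tuple(performed)
    for obj, path in paths.items():
        if path == performed:
            return obj
    return None

# Example: five objects need two-gesture paths over a four-gesture alphabet.
paths = build_gesture_paths(["lamp", "tv", "door", "virtual_cat", "speaker"])
print(select_object(paths, paths["tv"]))   # -> "tv"
```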