Patents by Inventor Nicolas Livet

Nicolas Livet has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20220301295
    Abstract: A recurrent multi-task CNN with an encoder and multiple decoders infers single-value outputs and dense (image) outputs such as heatmaps and segmentation masks. Recurrence is obtained by reinjecting (through mere concatenation) heatmaps or masks (or intermediate feature maps) into a next input image (or into next intermediate feature maps) for a next CNN inference. The inference outputs may be refined using specifically trained cascaded refiner blocks. Virtual annotations for training video sequences can be obtained using computer analysis. These approaches allow the depth of the CNN, i.e. the number of layers, to be reduced. They also avoid running parallel independent inferences for different tasks, while keeping similar prediction quality. Multiple-task inference is useful for Augmented Reality applications.
    Type: Application
    Filed: June 18, 2019
    Publication date: September 22, 2022
    Inventor: Nicolas LIVET
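
The recurrence described in this abstract can be sketched in a few lines of PyTorch. Everything concrete below (layer sizes, a single heatmap channel, a single scalar head) is an assumption made for illustration rather than a detail taken from the patent; the point is only the mechanism of concatenating the previous dense output with the next frame before a shared encoder that feeds several task heads.

```python
import torch
import torch.nn as nn

class RecurrentMultiTaskCNN(nn.Module):
    def __init__(self, in_channels=3, hidden=32):
        super().__init__()
        # Encoder sees the RGB frame plus the re-injected single-channel heatmap.
        self.encoder = nn.Sequential(
            nn.Conv2d(in_channels + 1, hidden, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(hidden, hidden, 3, stride=2, padding=1), nn.ReLU(),
        )
        # Dense decoder: upsamples back to input resolution and emits one heatmap.
        self.heatmap_decoder = nn.Sequential(
            nn.ConvTranspose2d(hidden, hidden, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(hidden, 1, 4, stride=2, padding=1),
        )
        # Single-value head, e.g. a presence or visibility score.
        self.scalar_head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(hidden, 1)
        )

    def forward(self, frame, prev_heatmap):
        # Recurrence by mere concatenation of the previous dense output with the new frame.
        x = torch.cat([frame, prev_heatmap], dim=1)
        feats = self.encoder(x)
        return self.heatmap_decoder(feats), self.scalar_head(feats)

# Run over a short clip, feeding each predicted heatmap into the next inference.
model = RecurrentMultiTaskCNN()
heatmap = torch.zeros(1, 1, 64, 64)          # neutral state for the first frame
for frame in torch.rand(5, 1, 3, 64, 64):    # five dummy 64x64 RGB frames
    heatmap, score = model(frame, heatmap)
```

Because the previous heatmap carries the earlier prediction forward, the encoder can stay comparatively shallow, which is the depth reduction the abstract refers to.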
  • Patent number: 8903177
    Abstract: The invention in particular relates to the hybrid tracking of representations of objects in a sequence of images using at least one key image. After acquiring first and second images including a representation of the tracked object, a first image portion is identified in the first image, and a second image portion is retrieved from the key image. The relative pose of a first image portion of said second image, similar to the first image portion of the first image, is estimated. A second image portion of the first or second image, similar to the second image portion of the key image, is sought. The relative pose of the object is then estimated according to the relative poses of the first image portions and the second image portions.
    Type: Grant
    Filed: October 12, 2010
    Date of Patent: December 2, 2014
    Assignee: Total Immersion
    Inventors: Nicolas Livet, Thomas Pasquier, Jérémy Chamoux
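
A rough Python/OpenCV sketch of the hybrid idea above follows. The abstract does not name specific detectors or estimators, so the ORB features, brute-force matching, Lucas-Kanade optical flow, and homography-based pose used here are substitutions chosen purely for illustration; the structure to note is that correspondences against the key image and correspondences against the previous frame are pooled into a single pose estimate.

```python
import cv2
import numpy as np

orb = cv2.ORB_create(500)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def correspondences_from_key_image(key_img, frame):
    """Match features of the key image directly against the current frame."""
    kp1, des1 = orb.detectAndCompute(key_img, None)
    kp2, des2 = orb.detectAndCompute(frame, None)
    if des1 is None or des2 is None:
        return np.empty((0, 2), np.float32), np.empty((0, 2), np.float32)
    matches = matcher.match(des1, des2)
    if not matches:
        return np.empty((0, 2), np.float32), np.empty((0, 2), np.float32)
    src = np.float32([kp1[m.queryIdx].pt for m in matches])  # key-image coordinates
    dst = np.float32([kp2[m.trainIdx].pt for m in matches])  # current-frame coordinates
    return src, dst

def correspondences_from_previous_frame(prev_frame, frame, prev_pts, prev_pts_in_key):
    """Follow points of the previous frame with sparse optical flow.
    prev_pts: (N, 1, 2) float32 points in the previous frame;
    prev_pts_in_key: (N, 2) coordinates of the same points in the key image."""
    next_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_frame, frame, prev_pts, None)
    ok = status.ravel() == 1
    return prev_pts_in_key[ok], next_pts.reshape(-1, 2)[ok]

def hybrid_pose(key_img, prev_frame, frame, prev_pts, prev_pts_in_key):
    # Pool correspondences from both sources and estimate one relative pose.
    src_a, dst_a = correspondences_from_key_image(key_img, frame)
    src_b, dst_b = correspondences_from_previous_frame(prev_frame, frame, prev_pts, prev_pts_in_key)
    src = np.vstack([src_a, src_b])
    dst = np.vstack([dst_a, dst_b])
    if len(src) < 4:
        return None
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    return H  # relative pose of a planar object, expressed as a homography
```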
  • Patent number: 8824736
    Abstract: Systems and devices for augmenting a real scene in a video stream are disclosed herein.
    Type: Grant
    Filed: November 20, 2012
    Date of Patent: September 2, 2014
    Assignee: Total Immersion
    Inventors: Valentin Lefevre, Nicolas Livet
  • Patent number: 8805016
    Abstract: Methods, systems and devices for augmenting a real scene in a video stream are disclosed herein.
    Type: Grant
    Filed: November 20, 2012
    Date of Patent: August 12, 2014
    Assignee: Total Immersion
    Inventors: Valentin Lefevre, Nicolas Livet
  • Patent number: 8675972
    Abstract: The invention relates to a method and a device for determining the pose of a three-dimensional object in an image, characterised in that it comprises the following steps: acquiring a three-dimensional generic model of the object, projecting the three-dimensional generic model according to at least one two-dimensional representation and associating with each two-dimensional representation pose information of the three-dimensional object, selecting and positioning a two-dimensional representation onto the object in said image, and determining the three-dimensional pose of the object in the image from at least the pose information associated with the selected two-dimensional representation.
    Type: Grant
    Filed: February 22, 2008
    Date of Patent: March 18, 2014
    Assignee: Total Immersion
    Inventors: Valentin Lefevre, Nicolas Livet
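
The selection step in the abstract above amounts to comparing the image against a bank of pre-rendered 2D views, each tagged with the pose that produced it. The sketch below assumes those views have already been generated offline from the generic 3D model; the patent leaves the projection and matching methods open, so the normalised cross-correlation via cv2.matchTemplate used here is only an illustrative choice.

```python
import cv2
import numpy as np

def best_pose_from_views(image, views):
    """image: grayscale scene image; views: list of (template, pose) pairs, where each
    template is a 2D projection of the generic 3D model rendered offline for that pose."""
    best_pose, best_score, best_location = None, -1.0, None
    for template, pose in views:
        # Normalised cross-correlation of the candidate view over the whole image.
        result = cv2.matchTemplate(image, template, cv2.TM_CCOEFF_NORMED)
        _, score, _, location = cv2.minMaxLoc(result)
        if score > best_score:
            best_pose, best_score, best_location = pose, score, location
    # The pose associated with the selected and positioned 2D representation.
    return best_pose, best_location
```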
  • Patent number: 8614705
    Abstract: The invention relates to a method and a device for creating at least two key images, each including an image representing at least one three-dimensional object in a three-dimensional environment and the pose of the object in said environment from the viewpoint of the associated image, said method being characterized in that it comprises the following steps: acquiring a first image representing the object in a predetermined initial position; creating a first key image from the first acquired image and the relative pose of the object; acquiring at least one second image representing said object, the viewpoint of at least one said second image being different from the viewpoint of said first image; determining the relative pose of the object in its environment based on the difference between the viewpoints of the first image and at least one said second image, each of said viewpoints being determined relative to a position and an orientation; and creating a second key image based on said at least one acquired second image.
    Type: Grant
    Filed: January 18, 2008
    Date of Patent: December 24, 2013
    Assignee: Total Immersion
    Inventors: Valentin Lefevre, Nicolas Livet, Worou Pierrick Chabi, Yves Quemener
  • Patent number: 8374395
    Abstract: Methods and devices for the real-time tracking of an object in a video stream for an augmented-reality application are disclosed herein.
    Type: Grant
    Filed: September 7, 2012
    Date of Patent: February 12, 2013
    Assignee: Total Immersion
    Inventors: Valentin Lefevre, Nicolas Livet
  • Patent number: 8374394
    Abstract: Methods and devices for the real-time tracking of one or more objects of a real scene in a video stream for an augmented-reality application are disclosed herein.
    Type: Grant
    Filed: September 7, 2012
    Date of Patent: February 12, 2013
    Assignee: Total Immersion
    Inventors: Valentin Lefevre, Nicolas Livet
  • Patent number: 8374396
    Abstract: Methods and devices for the real-time tracking of an object in a video stream for an augmented-reality application are disclosed herein.
    Type: Grant
    Filed: September 7, 2012
    Date of Patent: February 12, 2013
    Assignee: Total Immersion
    Inventors: Valentin Lefevre, Nicolas Livet
  • Publication number: 20130004022
    Abstract: Methods and devices for the real-time tracking of one or more objects of a real scene in a video stream for an augmented-reality application are disclosed herein.
    Type: Application
    Filed: September 7, 2012
    Publication date: January 3, 2013
    Inventors: Valentin Lefevre, Nicolas Livet
  • Publication number: 20120327249
    Abstract: Methods and devices for the real-time tracking of an object in a video stream for an augmented-reality application are disclosed herein.
    Type: Application
    Filed: September 7, 2012
    Publication date: December 27, 2012
    Inventors: Valentin Lefevre, Nicolas Livet
  • Publication number: 20120328158
    Abstract: Methods and devices for the real-time tracking of an object in a video stream for an augmented-reality application are disclosed herein.
    Type: Application
    Filed: September 7, 2012
    Publication date: December 27, 2012
    Inventors: Valentin Lefevre, Nicolas Livet
  • Patent number: 8315432
    Abstract: The invention relates to a method and to devices for the real-time tracking of one or more substantially planar geometrical objects of a real scene in at least two images of a video stream for an augmented-reality application. After receiving a first image of the video stream (300), the first image including the object to be tracked, the position and orientation of the object in the first image are determined from a plurality of previously determined image blocks (320), each image block of said plurality of image blocks being associated with a pose of the object to be tracked. The first image and the position and the orientation of the object to be tracked in the first image define a key image. After receiving a second image from the video stream, the position and orientation of the object to be tracked in the second image are evaluated from the key image (300). The second image and the corresponding position and orientation of the object to be tracked can be stored as a key image.
    Type: Grant
    Filed: January 18, 2008
    Date of Patent: November 20, 2012
    Assignee: Total Immersion
    Inventors: Valentin Lefevre, Nicolas Livet
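
This entry describes a detection-then-tracking loop in which the first successfully processed frame becomes a key image for subsequent frames, and a later frame can be promoted to a new key image. Only the control flow is sketched below; the two helper functions are deliberately trivial placeholders standing in for the patent's block-based detection and frame-to-key tracking.

```python
import numpy as np

def detect_pose_from_blocks(frame, image_blocks):
    """Placeholder for the initial pose estimation from pre-learned image blocks."""
    return np.eye(4) if image_blocks else None

def track_from_key_image(frame, key_image):
    """Placeholder for refining the pose of the current frame against the key image."""
    key_frame, key_pose = key_image
    return key_pose  # a real tracker would update the pose from frame-to-key matches

def track_sequence(frames, image_blocks):
    key_image = None
    for frame in frames:
        if key_image is None:
            pose = detect_pose_from_blocks(frame, image_blocks)   # (re)initialisation
        else:
            pose = track_from_key_image(frame, key_image)         # frame-to-key tracking
        if pose is None:
            key_image = None       # object not found: fall back to detection
            continue
        key_image = (frame, pose)  # the current frame and its pose become the key image
        yield pose

poses = list(track_sequence([np.zeros((240, 320), np.uint8)] * 3, image_blocks=["block"]))
```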
  • Publication number: 20120201469
    Abstract: The invention in particular relates to the hybrid tracking of representations of objects in a sequence of images using at least one key image. After acquiring first and second images including a representation of the tracked object, a first image portion is identified in the first image, and a second image portion is retrieved from the key image. The relative pose of a first image portion of said second image, similar to the first image portion of the first image, is estimated. A second image portion of the first or second image, similar to the second image portion of the key image, is sought. The relative pose of the object is then estimated according to the relative poses of the first image portions and the second image portions.
    Type: Application
    Filed: October 12, 2010
    Publication date: August 9, 2012
    Applicant: TOTAL IMMERSION
    Inventors: Nicolas Livet, Thomas Pasquier, Jérémy Chamoux
  • Publication number: 20120129605
    Abstract: The invention relates in particular to the detection of interactions with a software application according to a movement of an object situated in the field of an image sensor. After having received a first and a second image and having identified a first region of interest in the first image, a second region of interest, corresponding to the first region of interest, is identified in the second image. The first and second regions of interest are compared and a mask of interest characterizing a variation of at least one feature of corresponding points in the first and second regions of interest is determined. A movement of the object is then determined from said mask of interest. The movement is analyzed and, in response, a predetermined action is triggered or not triggered.
    Type: Application
    Filed: November 18, 2011
    Publication date: May 24, 2012
    Applicant: TOTAL IMMERSION
    Inventors: Nicolas Livet, Thomas Pasquier
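
In code, the mask-of-interest test in the abstract above reduces to comparing the same region of interest across two frames and thresholding the change. The grayscale-difference feature and the threshold values in the sketch below are illustrative choices, not values from the patent.

```python
import numpy as np

def detect_movement(first_frame, second_frame, roi, diff_threshold=25, area_threshold=0.05):
    """Return the centre of the movement detected inside roi, or None if there is none.
    first_frame/second_frame: grayscale (H, W) uint8 images; roi: (x, y, w, h)."""
    x, y, w, h = roi
    a = first_frame[y:y + h, x:x + w].astype(np.int16)
    b = second_frame[y:y + h, x:x + w].astype(np.int16)
    mask = np.abs(b - a) > diff_threshold   # mask of interest: pixels whose value changed
    if mask.mean() < area_threshold:        # too little variation: no movement
        return None
    ys, xs = np.nonzero(mask)
    return (x + xs.mean(), y + ys.mean())   # crude summary of where the movement happened

frame_t0 = np.zeros((240, 320), np.uint8)
frame_t1 = frame_t0.copy()
frame_t1[100:140, 150:200] = 255             # simulate an object entering the region
if detect_movement(frame_t0, frame_t1, roi=(140, 90, 80, 60)) is not None:
    print("predetermined action triggered")  # stand-in for the application callback
```

A fuller implementation would analyse the detected movement (direction, speed, gesture) before deciding whether to trigger the action, as the abstract describes.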
  • Publication number: 20110096844
    Abstract: A communication method comprising the display, on a mobile communication terminal (2) equipped with a camera, of a rich video comprising a filmed real scene in which additional visual elements connected with said scene are embedded, the video enrichment operations being carried out within a remote communication system (3) and rendered on the mobile terminal (2) in real time.
    Type: Application
    Filed: March 13, 2009
    Publication date: April 28, 2011
    Inventors: Olivier Poupel, Marin Osmond, Stéphane Saada, Stéphane Dufosse, Valentin Lefevre, Nicolas Livet
  • Publication number: 20100220891
    Abstract: The invention relates to a method and to devices for the real-time tracking of one or more substantially planar geometrical objects of a real scene in at least two images of a video stream for an augmented-reality application. After receiving a first image of the video stream (300), the first image including the object to be tracked, the position and orientation of the object in the first image are determined from a plurality of previously determined image blocks (320), each image block of said plurality of image blocks being associated with a pose of the object to be tracked. The first image and the position and the orientation of the object to be tracked in the first image define a key image. After receiving a second image from the video stream, the position and orientation of the object to be tracked in the second image are evaluated from the key image (300). The second image and the corresponding position and orientation of the object to be tracked can be stored as a key image.
    Type: Application
    Filed: January 18, 2008
    Publication date: September 2, 2010
    Applicant: TOTAL IMMERSION
    Inventors: Valentin Lefevre, Nicolas Livet
  • Publication number: 20100060632
    Abstract: The invention relates to a method and devices for embedding, in at least one so-called first image of an image stream representing a real scene (120), at least one so-called second image extracted from at least one three-dimensional representation of at least one virtual object. After acquiring said at least one first image of said image stream (210), information for determining the position and the orientation of said at least one virtual object in said real scene using position data from said real scene is received (210, 214), at least a portion of this data being received from at least one sensor (135′, 135″) in the real scene, while other data can be determined by analysis of the first image. Said at least one second image is extracted from the three-dimensional representation of said at least one virtual object according to the orientation of said at least one virtual object.
    Type: Application
    Filed: January 3, 2008
    Publication date: March 11, 2010
    Applicant: TOTAL IMMERSION
    Inventors: Valentin Lefevre, Nicolas Livet, Alan Savary
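
The core geometric step in the abstract above is projecting the virtual object into the first image once its pose in the real scene is known. The NumPy sketch below assumes the pose comes directly from a sensor reading and uses made-up camera intrinsics; blending the rendered second image into the first image is left out.

```python
import numpy as np

def project_virtual_object(points_3d, rotation, translation, intrinsics):
    """points_3d: (N, 3) points of the virtual object; rotation: (3, 3); translation: (3,)."""
    cam_points = points_3d @ rotation.T + translation   # object frame -> camera frame
    pixels = cam_points @ intrinsics.T                  # camera frame -> image plane
    return pixels[:, :2] / pixels[:, 2:3]               # perspective division

# Pose as it might be reported by a sensor in the real scene: no rotation, 2 m in front.
rotation = np.eye(3)
translation = np.array([0.0, 0.0, 2.0])
intrinsics = np.array([[800.0, 0.0, 320.0],
                       [0.0, 800.0, 240.0],
                       [0.0, 0.0, 1.0]])
# A 20 cm virtual cube centred on the sensor position.
cube = np.array([[x, y, z] for x in (-0.1, 0.1) for y in (-0.1, 0.1) for z in (-0.1, 0.1)])
pixel_coords = project_virtual_object(cube, rotation, translation, intrinsics)
```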
  • Publication number: 20100045665
    Abstract: The invention relates to a method and a device for creating at least two key images, each including an image representing at least one three-dimensional object in a three-dimensional environment and the pose of the object in said environment from the viewpoint of the associated image, said method being characterised in that it comprises the following steps: acquiring a first image representing the object in a predetermined initial position; creating a first key image from the first acquired image and the relative pose of the object; acquiring at least one second image representing said object, the viewpoint of at least one said second image being different from the viewpoint of said first image; determining the relative pose of the object in its environment based on the difference between the viewpoints of the first image and at least one said second image, each of said viewpoints being determined relative to a position and an orientation; and creating a second key image based on said at least one acquired second image.
    Type: Application
    Filed: January 18, 2008
    Publication date: February 25, 2010
    Applicant: TOTAL IMMERSION
    Inventors: Valentin Lefevre, Nicolas Livet, Worou Pierrick Chabi, Yves Quemener
  • Publication number: 20100045700
    Abstract: The invention relates to a real-time augmented-reality watching device (300), which comprises an image sensor (335) such as a PTZ camera, a visualisation system (320) and a control interface (310). In this device, the camera is controlled by a control interface operated by the user. Once orientation information on the desired line of sight has been received (610) from the control interface, it is transmitted to the camera (615), the camera being powered and capable of movement. The camera then transmits the orientation of its line of sight (620) and, in parallel, a video stream. The position at which data, such as the representation of a virtual three-dimensional object, must be inserted is determined from the calibration data and from the orientation of the camera's line of sight received from the camera (640).
    Type: Application
    Filed: January 10, 2008
    Publication date: February 25, 2010
    Applicant: TOTAL IMMERSION
    Inventors: Valentin Lefevre, Nicolas Livet, Alan Savary
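
For this last entry, the insertion position follows from the camera calibration and the line-of-sight orientation reported back by the PTZ camera. The sketch below uses an assumed pan/tilt convention and made-up intrinsics; it only shows how the reported orientation maps a fixed 3D anchor point to the pixel where the virtual object should be drawn in the video stream.

```python
import numpy as np

def rotation_from_pan_tilt(pan, tilt):
    """Camera orientation from the reported pan (around Y) and tilt (around X) angles."""
    cp, sp, ct, st = np.cos(pan), np.sin(pan), np.cos(tilt), np.sin(tilt)
    pan_rot = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
    tilt_rot = np.array([[1.0, 0.0, 0.0], [0.0, ct, -st], [0.0, st, ct]])
    return tilt_rot @ pan_rot

def insertion_pixel(world_point, pan, tilt, intrinsics):
    cam_point = rotation_from_pan_tilt(pan, tilt) @ world_point   # world -> camera axes
    pixel = intrinsics @ cam_point
    return pixel[:2] / pixel[2]

intrinsics = np.array([[1000.0, 0.0, 640.0],
                       [0.0, 1000.0, 360.0],
                       [0.0, 0.0, 1.0]])
# Virtual object anchored 5 m in front of the camera's home position, slightly to the left.
anchor = np.array([-0.5, 0.0, 5.0])
print(insertion_pixel(anchor, pan=np.radians(5.0), tilt=np.radians(-2.0), intrinsics=intrinsics))
```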