Patents by Inventor Valentin Lefevre

Valentin Lefevre has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 8675972
    Abstract: The invention relates to a method and a device for determining the pose of a three-dimensional object in an image, characterised in that it comprises the following steps: acquiring a three-dimensional generic model of the object; projecting the three-dimensional generic model according to at least one two-dimensional representation and associating with each two-dimensional representation pose information for the three-dimensional object; selecting and positioning a two-dimensional representation onto the object in said image; and determining the three-dimensional pose of the object in the image from at least the pose information associated with the selected two-dimensional representation.
    Type: Grant
    Filed: February 22, 2008
    Date of Patent: March 18, 2014
    Assignee: Total Immersion
    Inventors: Valentin Lefevre, Nicolas Livet
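
The first two steps of the abstract above (patent 8675972) amount to building a bank of pre-rendered 2D views of a generic 3D model, each tagged with the pose used to render it; the last two steps pick the view that best fits the object in the image and read back its pose. A minimal, hedged sketch in Python, assuming the 2D views and the image patch are grayscale arrays of the same size (the matching criterion here is plain normalised correlation, not the patented method):

    from dataclasses import dataclass
    from typing import List
    import numpy as np

    @dataclass
    class TemplateView:
        image: np.ndarray   # pre-rendered 2D projection of the generic 3D model
        pose: np.ndarray    # 4x4 pose (rotation + translation) used for that projection

    def normalised_correlation(a: np.ndarray, b: np.ndarray) -> float:
        a = (a - a.mean()) / (a.std() + 1e-9)
        b = (b - b.mean()) / (b.std() + 1e-9)
        return float((a * b).mean())

    def estimate_pose(object_patch: np.ndarray, views: List[TemplateView]) -> np.ndarray:
        # Select the 2D representation that best fits the object in the image and
        # return the pose information associated with it (steps 3 and 4 above).
        best = max(views, key=lambda v: normalised_correlation(object_patch, v.image))
        return best.pose
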
  • Patent number: 8614705
    Abstract: The invention relates to a method and a device for creating at least two key images, each including an image representing at least one three-dimensional object in a three-dimensional environment and the pose of the object in said environment from the viewpoint of the associated image, said method being characterised in that it comprises the following steps: acquiring a first image representing the object in a predetermined initial position; creating a first key image from the first acquired image and the relative initial pose of the object; acquiring at least one second image representing said object, the viewpoint of said at least one second image being different from the viewpoint of said first image; determining the relative pose of the object in its environment based on the difference between the viewpoints of the first image and said at least one second image, each of said viewpoints being determined relative to a position and an orientation; and creating a second key image based on said at least one acquired second image.
    Type: Grant
    Filed: January 18, 2008
    Date of Patent: December 24, 2013
    Assignee: Total Immersion
    Inventors: Valentin Lefevre, Nicolas Livet, Worou Pierrick Chabi, Yves Quemener
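
The notion of a "key image" in the abstract above (patent 8614705) is simply a frame stored together with the pose of the object seen in it; later key images get their pose from the change of viewpoint relative to the first one. A rough sketch, assuming the viewpoint change between frames is available as a 4x4 transform (how that change is obtained is exactly what the patent addresses and is not reproduced here):

    from dataclasses import dataclass
    import numpy as np

    @dataclass
    class KeyImage:
        frame: np.ndarray   # the stored image
        pose: np.ndarray    # 4x4 pose of the object for the viewpoint of that image

    def make_key_images(first_frame, initial_pose, later_frames, viewpoint_deltas):
        # viewpoint_deltas[i] is the 4x4 transform between the viewpoint of
        # later_frames[i] and the viewpoint of first_frame (position + orientation).
        keys = [KeyImage(first_frame, np.asarray(initial_pose))]
        for frame, delta in zip(later_frames, viewpoint_deltas):
            keys.append(KeyImage(frame, np.asarray(delta) @ keys[0].pose))
        return keys
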
  • Publication number: 20130157690
    Abstract: The invention relates in particular to the real-time interfacing of a plurality of mobile elements with a computing system. After at least one location module integrated into a mobile element has been selected, the at least one location module is activated sequentially. At least one signal is then received from the at least one activated location module, and at least one item of information relating to the position of the mobile element comprising the at least one activated location module is calculated in real time on the basis of the at least one received signal. A single location module may be activated at a given instant.
    Type: Application
    Filed: September 2, 2011
    Publication date: June 20, 2013
    Applicant: EPAWN
    Inventors: Valentin Lefevre, Christophe Duteil
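
The scheme in the publication above (20130157690) keeps a single location module active at any instant and cycles through them, deriving each mobile element's position from the signal of the currently active module. A hedged sketch of that polling loop; the LocationModule interface and the signal-to-position conversion are hypothetical stand-ins, not EPAWN's protocol:

    import itertools
    import time

    class LocationModule:                      # hypothetical hardware interface
        def __init__(self, ident):
            self.ident = ident
        def activate(self):
            pass
        def deactivate(self):
            pass
        def read_signal(self):
            return (self.ident, time.time())

    def position_from_signal(signal):          # hypothetical signal-to-position model
        ident, _timestamp = signal
        return {"module": ident, "x": 0.0, "y": 0.0}

    def track(modules, period_s=0.01):
        # Sequential activation: exactly one module is active at a given instant.
        for module in itertools.cycle(modules):
            module.activate()
            try:
                yield position_from_signal(module.read_signal())
            finally:
                module.deactivate()
            time.sleep(period_s)
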
  • Patent number: 8374395
    Abstract: Methods and devices for the real-time tracking of an object in a video stream for an augmented-reality application are disclosed herein.
    Type: Grant
    Filed: September 7, 2012
    Date of Patent: February 12, 2013
    Assignee: Total Immersion
    Inventors: Valentin Lefevre, Nicolas Livet
  • Patent number: 8374394
    Abstract: Methods and devices for the real-time tracking of one or more objects of a real scene in a video stream for an augmented-reality application are disclosed herein.
    Type: Grant
    Filed: September 7, 2012
    Date of Patent: February 12, 2013
    Assignee: Total Immersion
    Inventors: Valentin Lefevre, Nicolas Livet
  • Patent number: 8374396
    Abstract: Methods and devices for the real-time tracking of an object in a video stream for an augmented-reality application are disclosed herein.
    Type: Grant
    Filed: September 7, 2012
    Date of Patent: February 12, 2013
    Assignee: Total Immersion
    Inventors: Valentin Lefevre, Nicolas Livet
  • Publication number: 20130004022
    Abstract: Methods and devices for the real-time tracking of one or more objects of a real scene in a video stream for an augmented-reality application are disclosed herein.
    Type: Application
    Filed: September 7, 2012
    Publication date: January 3, 2013
    Inventors: Valentin Lefevre, Nicolas Livet
  • Publication number: 20120327249
    Abstract: Methods and devices for the real-time tracking of an object in a video stream for an augmented-reality application are disclosed herein.
    Type: Application
    Filed: September 7, 2012
    Publication date: December 27, 2012
    Inventors: Valentin Lefevre, Nicolas Livet
  • Publication number: 20120328158
    Abstract: Methods and devices for the real-time tracking of an object in a video stream for an augmented-reality application are disclosed herein.
    Type: Application
    Filed: September 7, 2012
    Publication date: December 27, 2012
    Inventors: Valentin Lefevre, Nicolas Livet
  • Patent number: 8315432
    Abstract: The invention relates to a method and to devices for the real-time tracking of one or more substantially planar geometrical objects of a real scene in at least two images of a video stream for an augmented-reality application. After receiving a first image of the video stream (300), the first image including the object to be tracked, the position and orientation of the object in the first image are determined from a plurality of previously determined image blocks (320), each image block of said plurality of image blocks being associated with a pose of the object to be tracked. The first image and the position and the orientation of the object to be tracked in the first image define a key image. After receiving a second image from the video stream, the position and orientation of the object to be tracked in the second image are evaluated from the key image (300). The second image and the corresponding position and orientation of the object to be tracked can be stored as a key image.
    Type: Grant
    Filed: January 18, 2008
    Date of Patent: November 20, 2012
    Assignee: Total Immersion
    Inventors: Valentin Lefevre, Nicolas Livet
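
Structurally, the tracking described in the abstract above (patent 8315432) is a loop that localises the object in the first frame from pre-learned image blocks, tracks subsequent frames against the current key image, and can promote a well-tracked frame to become the new key image. A sketch of that loop only; the two matching callables are placeholders for the actual block-matching and key-image tracking steps:

    def track_stream(frames, learned_blocks, locate_from_blocks, track_from_key_image):
        # locate_from_blocks(blocks, frame) -> (position, orientation)
        # track_from_key_image(key_image, frame) -> (position, orientation, quality)
        frames = iter(frames)
        first = next(frames, None)
        if first is None:
            return
        key_image = (first, locate_from_blocks(learned_blocks, first))
        for frame in frames:
            position, orientation, quality = track_from_key_image(key_image, frame)
            yield frame, position, orientation
            if quality > 0.8:   # arbitrary threshold for this sketch
                key_image = (frame, (position, orientation))   # store as new key image
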
  • Publication number: 20110096844
    Abstract: A communication method comprising the display, on a mobile communication terminal (2) equipped with a camera, of a rich video comprising a real filmed scene in which additional visual elements connected with said scene are embedded, wherein the video enrichment operations are carried out within a remote communication system (3) and are rendered on the mobile terminal (2) in real time.
    Type: Application
    Filed: March 13, 2009
    Publication date: April 28, 2011
    Inventors: Olivier Poupel, Marin Osmond, Stéphane Saada, Stéphane Dufosse, Valentin Lefevre, Nicolas Livet
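
The publication above (20110096844) describes an architecture rather than an algorithm: camera frames leave the mobile terminal, the enrichment happens in a remote system, and the enriched frames come back for display in real time. A minimal client-side sketch, with the transport and the remote service left as hypothetical callables:

    def run_client(camera_frames, send_to_remote, receive_enriched, display):
        for frame in camera_frames:
            send_to_remote(frame)          # raw filmed scene goes to the remote system (3)
            enriched = receive_enriched()  # visual elements are embedded remotely
            display(enriched)              # rendered on the mobile terminal (2)
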
  • Publication number: 20100316281
    Abstract: The invention relates to a method and a device for determining the pose of a three-dimensional object in an image, characterised in that it comprises the following steps: acquiring a three-dimensional generic model of the object; projecting the three-dimensional generic model according to at least one two-dimensional representation and associating with each two-dimensional representation pose information for the three-dimensional object; selecting and positioning a two-dimensional representation onto the object in said image; and determining the three-dimensional pose of the object in the image from at least the pose information associated with the selected two-dimensional representation.
    Type: Application
    Filed: February 22, 2008
    Publication date: December 16, 2010
    Inventor: Valentin Lefevre
  • Publication number: 20100277468
    Abstract: The invention relates to a method and devices for enabling a user to visualise a virtual model in a real environment. According to the invention, a 2D representation of a 3D virtual object is inserted, in real time, into the video stream of a camera aimed at a real environment in order to form an enriched video stream. A plurality of cameras generating a plurality of video streams can be used simultaneously to visualise the virtual object in the real environment from different angles of view. A particular video stream is used to dynamically generate the effects of the real environment on the virtual model. The virtual model can be, for example, a digital copy or virtual enrichments of a real copy. A virtual 2D object, for example the representation of a real person, can be inserted into the enriched video stream.
    Type: Application
    Filed: August 9, 2006
    Publication date: November 4, 2010
    Applicant: TOTAL IMMERSION
    Inventors: Valentin Lefevre, Jean-Marie Vaidie
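
In the publication above (20100277468), each camera gets its own 2D rendering of the 3D virtual model, computed for that camera's viewpoint and blended into that camera's frames. A sketch of the per-camera compositing, with the projection and blending steps left as placeholder callables:

    def enrich_streams(camera_streams, camera_poses, virtual_model, project, blend):
        # camera_streams: dict name -> iterable of frames
        # camera_poses:   dict name -> viewpoint (pose) of that camera
        names = list(camera_streams)
        for frames in zip(*(camera_streams[n] for n in names)):   # stops with the shortest stream
            yield {name: blend(frame, project(virtual_model, camera_poses[name]))
                   for name, frame in zip(names, frames)}
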
  • Publication number: 20100220891
    Abstract: The invention relates to a method and to devices for the real-time tracking of one or more substantially planar geometrical objects of a real scene in at least two images of a video stream for an augmented-reality application. After receiving a first image of the video stream (300), the first image including the object to be tracked, the position and orientation of the object in the first image are determined from a plurality of previously determined image blocks (320), each image block of said plurality of image blocks being associated with a pose of the object to be tracked. The first image and the position and the orientation of the object to be tracked in the first image define a key image. After receiving a second image from the video stream, the position and orientation of the object to be tracked in the second image are evaluated from the key image (300). The second image and the corresponding position and orientation of the object to be tracked can be stored as a key image.
    Type: Application
    Filed: January 18, 2008
    Publication date: September 2, 2010
    Applicant: TOTAL IMMERSION
    Inventors: Valentin Lefevre, Nicolas Livet
  • Publication number: 20100134601
    Abstract: The invention relates to a method for determining the positioning of a video capture means in the reference frame of at least one three-dimensional virtual object, said at least one virtual object being a model corresponding to at least one real object present in the images of the video image stream. The inventive method is characterised in that it comprises the following steps: a video image stream is received from the video capture means; the received video image stream and the at least one virtual object are displayed; points of said at least one virtual object are paired up, in real time, with corresponding points of the at least one real object present in the images of the video image stream; and the positioning of said video capture means is determined according to the points of the at least one virtual object and their paired points in the at least one real object present in the images of the video image stream.
    Type: Application
    Filed: August 9, 2006
    Publication date: June 3, 2010
    Applicant: TOTAL IMMERSION
    Inventors: Valentin Lefevre, Marion Passama
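
The pairing-then-positioning procedure in the publication above (20100134601) is, in modern terms, a Perspective-n-Point problem: given 2D-3D point correspondences, recover the camera pose in the object's reference frame. The abstract does not name a solver; as a hedged illustration only, OpenCV's solvePnP performs that last step once the points have been paired up:

    import numpy as np
    import cv2

    def camera_pose_from_pairs(model_points_3d, image_points_2d, camera_matrix):
        # model_points_3d: Nx3 points on the virtual object (its own reference frame)
        # image_points_2d: Nx2 corresponding points in the video image (N >= 4)
        ok, rvec, tvec = cv2.solvePnP(
            np.asarray(model_points_3d, dtype=np.float64),
            np.asarray(image_points_2d, dtype=np.float64),
            np.asarray(camera_matrix, dtype=np.float64),
            None)                              # no lens-distortion coefficients
        if not ok:
            raise RuntimeError("pose estimation failed")
        rotation, _ = cv2.Rodrigues(rvec)      # 3x3 rotation of the camera w.r.t. the object
        return rotation, tvec
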
  • Publication number: 20100060632
    Abstract: The invention relates to a method and devices for embedding, in at least one so-called first image of an image stream representing a real scene (120), at least one so-called second image extracted from at least one three-dimensional representation of at least one virtual object. After acquiring said at least one first image of said image stream (210), information for determining the position and the orientation of said at least one virtual object in said real scene using position data from said real scene is received (210, 214), at least a portion of this data being received from at least one sensor (135', 135'') in the real scene, while other data can be determined by analysis of the first image. Said at least one second image is extracted from the three-dimensional representation of said at least one virtual object according to the orientation of said at least one virtual object.
    Type: Application
    Filed: January 3, 2008
    Publication date: March 11, 2010
    Applicant: TOTAL IMMERSION
    Inventors: Valentin Lefevre, Nicolas Livet, Alan Savary
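
In the publication above (20100060632), the pose of the virtual object comes partly from sensors placed in the real scene and partly from analysis of the current frame; a 2D view of the 3D model is then extracted for that pose and embedded into the frame. A minimal sketch of that per-frame pipeline, all helpers being placeholders:

    def augment_frame(frame, sensor_pose, refine_pose, render_view, embed):
        # sensor_pose: position/orientation data received from sensors in the scene
        # refine_pose: completes the pose from image analysis, per the abstract
        pose = refine_pose(frame, sensor_pose)
        overlay = render_view(pose)       # the "second image" extracted from the 3D model
        return embed(frame, overlay)      # embedded into the "first image" of the stream
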
  • Publication number: 20100045700
    Abstract: The invention relates to a real-time augmented-reality viewing device (300), which comprises an image sensor (335) such as a PTZ camera, a visualisation system (320) and a control interface (310). In this device, the camera is controlled by a control interface operated by the user. Once orientation information on the desired line of sight has been received (610) from the control interface, the orientation information on the line of sight is transmitted to the camera (615), the camera being motorised and capable of movement. The camera then transmits the orientation of its line of sight (620). In parallel, the camera transmits a video stream. The position at which data, such as the representation of a virtual three-dimensional object, must be inserted is determined from the data resulting from the calibration and from the orientation of the camera's line of sight received from the latter (640).
    Type: Application
    Filed: January 10, 2008
    Publication date: February 25, 2010
    Applicant: TOTAL IMMERSION
    Inventors: Valentin Lefevre, Nicolas Livet, Alan Savary
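
The data flow in the publication above (20100045700) is: the desired line of sight goes from the control interface to the PTZ camera, the camera reports back its actual orientation alongside the video, and the insertion position is computed from the calibration data plus that reported orientation. A sketch of the loop with a hypothetical camera API:

    def watch_loop(control_interface, camera, calibration, project_to_image, draw_overlay):
        for desired_orientation in control_interface:         # step 610
            camera.set_line_of_sight(desired_orientation)     # step 615
            actual_orientation = camera.get_line_of_sight()   # step 620
            frame = camera.read_frame()                       # video stream sent in parallel
            insert_at = project_to_image(calibration, actual_orientation)   # step 640
            yield draw_overlay(frame, insert_at)
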
  • Publication number: 20100045665
    Abstract: The invention relates to a method and a device for creating at least two key images, each including an image representing at least one three-dimensional object in a three-dimensional environment and the pose of the object in said environment from the viewpoint of the associated image, said method being characterised in that it comprises the following steps: acquiring a first image representing the object in a predetermined initial position; creating a first key image from the first acquired image and the relative initial pose of the object; acquiring at least one second image representing said object, the viewpoint of said at least one second image being different from the viewpoint of said first image; determining the relative pose of the object in its environment based on the difference between the viewpoints of the first image and said at least one second image, each of said viewpoints being determined relative to a position and an orientation; and creating a second key image based on said at least one acquired second image.
    Type: Application
    Filed: January 18, 2008
    Publication date: February 25, 2010
    Applicant: TOTAL IMMERSION
    Inventors: Valentin Lefevre, Nicolas Livet, Worou Pierrick Chabi, Yves Quemener
  • Publication number: 20100002909
    Abstract: The invention relates to a system for the real-time detection of interactions between a user and an augmented-reality scene, the interactions resulting from a modification of the appearance of an object present in the image. After having created (110) and processed (115) a reference model in an initialization phase (100), the pose of the object in the image is determined (135) and a comparison model is extracted from the image (160). The reference and comparison models are compared (170), as a function of said determined pose of the object, and, in response to the comparison step, the interactions are detected.
    Type: Application
    Filed: June 30, 2009
    Publication date: January 7, 2010
    Applicant: TOTAL IMMERSION
    Inventors: Valentin Lefevre, Nicolas Livet, Thomas Pasquier
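
The detection loop in the publication above (20100002909) compares a reference model of the object's appearance, built during initialization, with a pose-aligned comparison model extracted from every new frame; a large enough difference is reported as an interaction (for example a hand covering part of the object). A sketch of that loop with placeholder helpers:

    def detect_interactions(frames, build_reference, estimate_pose,
                            extract_model, compare, threshold=0.5):
        frames = iter(frames)
        first = next(frames, None)
        if first is None:
            return
        reference = build_reference(first)              # initialization phase (100-115)
        for frame in frames:
            pose = estimate_pose(frame)                 # step 135
            comparison = extract_model(frame, pose)     # step 160
            difference = compare(reference, comparison, pose)   # step 170
            yield frame, difference > threshold         # interaction detected or not
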
  • Publication number: 20090305198
    Abstract: The invention relates to a gunnery training device using a weapon (1), said device being characterised in that it comprises video capture means (5) for capturing the field of view, angular capture means for detecting the angles defining the insertion position of the computer-generated images, processing means (21, 22) for inserting, in real time, computer-generated images into the captured field of view, and visualisation means (6) for displaying the captured field of view containing said at least one inserted computer-generated image.
    Type: Application
    Filed: August 9, 2006
    Publication date: December 10, 2009
    Applicant: GDI SIMULATION
    Inventors: Valentin Lefevre, Laurent Chabin, Emmanuel Marin
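
The device in the publication above (20090305198) combines three inputs per cycle: the captured field of view, the angles from the angular sensors (which fix where the computer-generated images go), and the rendered images themselves. A hedged sketch of that cycle; every device interface here is a hypothetical placeholder:

    def training_loop(camera, angle_sensor, render_target, composite, display):
        while True:
            field_of_view = camera.read_frame()         # video capture means (5)
            angles = angle_sensor.read()                # angular capture means
            overlay = render_target(angles)             # computer-generated image placed per the angles
            display(composite(field_of_view, overlay))  # visualisation means (6)
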