Patents Assigned to Total Immersion
  • Publication number: 20100141663
    Abstract: A system and method for modifying facial animations to include expression and microexpression information is disclosed. In particular, a system and method for applying actor-generated expression data to a facial animation, either in real time or from storage, is disclosed. The present embodiments may also be incorporated into a larger training program designed to train users to recognize various expressions and microexpressions.
    Type: Application
    Filed: October 2, 2009
    Publication date: June 10, 2010
    Applicant: TOTAL IMMERSION SOFTWARE, INC.
    Inventors: Michael J. Becker, Keith Copenhagen, Murray Taylor
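    The expression transfer described in the abstract above can be pictured as adding weighted offsets to a neutral face mesh. The sketch below is illustrative only (the function and variable names are not from the patent): actor-captured expression data is modeled as per-vertex deltas, and a small weight yields the subtle deformation of a microexpression.

    ```python
    # Hypothetical sketch: actor-generated expression data applied to a base
    # facial animation as weighted per-vertex offsets. Names are illustrative,
    # not taken from the patent.

    def apply_expression(base_vertices, expression_deltas, weight):
        """Offset each base vertex by its expression delta scaled by weight.

        weight=0.0 leaves the neutral face unchanged; a small weight
        approximates a subtle microexpression; weight=1.0 applies the
        full expression.
        """
        return [
            (bx + weight * dx, by + weight * dy, bz + weight * dz)
            for (bx, by, bz), (dx, dy, dz) in zip(base_vertices, expression_deltas)
        ]

    # Two-vertex toy mesh and a "smile" expression as per-vertex deltas.
    neutral = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
    smile_deltas = [(0.0, 0.2, 0.0), (0.0, 0.4, 0.1)]
    micro = apply_expression(neutral, smile_deltas, 0.15)  # subtle microexpression
    ```

    In a real pipeline the same blending would run per frame over a full blendshape rig rather than a two-vertex list.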
  • Publication number: 20100134601
    Abstract: The invention relates to a method for determining the pose of a video capture device in the reference frame of at least one three-dimensional virtual object, said at least one virtual object being a model of at least one real object present in the images of the video stream. The method comprises the following steps: a video image stream is received from the video capture device; the received video stream and at least one virtual object are displayed; points of said at least one virtual object are paired, in real time, with the corresponding points of the at least one real object present in the images of the video stream; and the pose of said video capture device is determined from the points of the at least one virtual object and their paired points in the at least one real object present in the images of the video stream.
    Type: Application
    Filed: August 9, 2006
    Publication date: June 3, 2010
    Applicant: TOTAL IMMERSION
    Inventors: Valentin Lefevre, Marion Passama
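    The pairing step in the abstract above, matching points of the virtual model to points of the real object in the image, can be sketched as a greedy 2D nearest-neighbour search. This is a simplified illustration (function names and the distance threshold are assumptions, not from the patent); a pose solver such as PnP would then consume the resulting pairs.

    ```python
    import math

    # Illustrative sketch of the pairing step: each projected point of the
    # virtual model is matched to its nearest detected image point, subject
    # to a maximum distance. Greedy and allowed to reuse image points; a
    # production tracker would be more careful.

    def pair_points(model_points_2d, image_points, max_dist=10.0):
        pairs = []
        for mp in model_points_2d:
            best, best_d = None, max_dist
            for ip in image_points:
                d = math.hypot(mp[0] - ip[0], mp[1] - ip[1])
                if d < best_d:
                    best, best_d = ip, d
            if best is not None:
                pairs.append((mp, best))
        return pairs

    # Two model points; the third image point is clutter too far to match.
    pairs = pair_points([(0.0, 0.0), (5.0, 5.0)],
                        [(1.0, 0.0), (5.0, 6.0), (100.0, 100.0)])
    ```

    The camera pose would then be estimated from these 2D pairs together with the known 3D positions of the model points.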
  • Publication number: 20100060632
    Abstract: The invention relates to a method and devices for embedding, in at least one first image of an image stream representing a real scene (120), at least one second image extracted from at least one three-dimensional representation of at least one virtual object. After acquiring said at least one first image of said image stream (210), information for determining the position and the orientation of said at least one virtual object in said real scene is received (210, 214), using position data from the real scene; at least a portion of this data is received from at least one sensor (135′, 135″) in the real scene, while other data can be determined by analysis of the first image. Said at least one second image is then extracted from the three-dimensional representation of said at least one virtual object according to the orientation of said at least one virtual object.
    Type: Application
    Filed: January 3, 2008
    Publication date: March 11, 2010
    Applicant: TOTAL IMMERSION
    Inventors: Valentin Lefevre, Nicolas Livet, Alan Savary
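    Once the virtual object's position in the scene is known (partly from sensors, partly from image analysis, per the abstract above), the insertion point of its rendered image in the frame follows from a standard pinhole projection. The sketch below is a generic illustration of that final step, not the patent's method; all names and the camera parameters are assumptions.

    ```python
    # Minimal pinhole-projection sketch (illustrative): given a virtual
    # object's position expressed in camera space, compute the pixel at
    # which its rendered image should be inserted into the video frame.

    def project_point(point_3d, focal, cx, cy):
        """Project a camera-space 3D point (x, y, z), z > 0, onto the image
        plane of a camera with focal length `focal` (pixels) and principal
        point (cx, cy)."""
        x, y, z = point_3d
        return (cx + focal * x / z, cy + focal * y / z)

    # An object 2 m straight ahead lands on the principal point.
    centre = project_point((0.0, 0.0, 2.0), 800.0, 320.0, 240.0)
    ```

    A real renderer would project the whole 3D representation with the full pose (rotation included), but the per-point arithmetic is the same.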
  • Publication number: 20100045700
    Abstract: The invention relates to a real-time augmented-reality viewing device (300) comprising an image sensor (335) such as a PTZ camera, a visualisation system (320) and a control interface (310). In this device, the camera is controlled by a control interface operated by the user. Once orientation information for the desired line of sight has been received (610) from the control interface, it is transmitted to the camera (615), the camera being motorised and capable of movement. The camera then returns the orientation of its line of sight (620) and, in parallel, transmits a video stream. The position at which data, such as the representation of a virtual three-dimensional object, must be inserted is determined from the calibration data and from the orientation of the line of sight received from the camera (640).
    Type: Application
    Filed: January 10, 2008
    Publication date: February 25, 2010
    Applicant: TOTAL IMMERSION
    Inventors: Valentin Lefevre, Nicolas Livet, Alan Savary
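    The placement step described above, turning the pan/tilt the PTZ camera reports into an on-screen insertion point, can be sketched with basic trigonometry. This is a simplified model assuming an ideal calibrated camera; the function name, angle convention, and parameters are illustrative, not from the patent.

    ```python
    import math

    # Hypothetical sketch: given the pan/tilt (degrees) the PTZ camera reports
    # for its line of sight, compute the pixel at which a virtual object lying
    # in a known world direction (azimuth/elevation, degrees) should be drawn.
    # Assumes an ideal pinhole camera with focal length in pixels and
    # principal point (cx, cy); valid only while the object is in view.

    def overlay_position(cam_pan, cam_tilt, obj_azimuth, obj_elevation,
                         focal, cx, cy):
        dx = math.tan(math.radians(obj_azimuth - cam_pan))
        dy = math.tan(math.radians(obj_elevation - cam_tilt))
        return (cx + focal * dx, cy - focal * dy)

    # Object exactly on the line of sight: drawn at the principal point.
    on_axis = overlay_position(30.0, 0.0, 30.0, 0.0, 800.0, 320.0, 240.0)
    ```

    Because the camera reports its own orientation after each move, the overlay position stays correct even while the user pans and zooms.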
  • Publication number: 20100045665
    Abstract: The invention relates to a method and a device for creating at least two key images, each comprising an image representing at least one three-dimensional object in a three-dimensional environment and the pose of the object in said environment from the viewpoint of the associated image. The method comprises the following steps: acquiring a first image representing the object in a predetermined initial position; creating a first key image from the first acquired image and the relative pose of the object; acquiring at least one second image representing said object, the viewpoint of said at least one second image being different from the viewpoint of said first image; determining the relative pose of the object in its environment from the difference between the viewpoints of the first image and said at least one second image, each of said viewpoints being defined by a position and an orientation; and creating a second key image from said at least one acquired second image.
    Type: Application
    Filed: January 18, 2008
    Publication date: February 25, 2010
    Applicant: TOTAL IMMERSION
    Inventors: Valentin Lefevre, Nicolas Livet, Worou Pierrick Chabi, Yves Quemener
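    The key-image structure in the abstract above pairs a captured frame with the object's pose at that viewpoint, and the second pose is derived from the first plus the measured viewpoint change. The sketch below illustrates that bookkeeping with a position plus a single yaw angle; the dictionary layout and names are assumptions for illustration only.

    ```python
    # Illustrative sketch: a key image pairs a frame with the object's pose
    # (position + orientation). The second key image's pose is derived from
    # the first pose and the measured viewpoint difference. Orientation is
    # reduced to a single yaw angle for simplicity.

    def make_key_image(frame, position, yaw_degrees):
        return {"frame": frame, "position": position, "yaw": yaw_degrees}

    def derive_second_pose(first_key_image, delta_position, delta_yaw):
        """Compose the first pose with the relative viewpoint change."""
        px, py, pz = first_key_image["position"]
        dx, dy, dz = delta_position
        return (px + dx, py + dy, pz + dz), first_key_image["yaw"] + delta_yaw

    # First key image at a predetermined initial pose, then a second one
    # derived from a 1 m sideways move and a 15 degree rotation.
    key1 = make_key_image("frame_0", (0.0, 0.0, 0.0), 0.0)
    pos2, yaw2 = derive_second_pose(key1, (1.0, 0.0, 0.0), 15.0)
    key2 = make_key_image("frame_1", pos2, yaw2)
    ```

    A full implementation would store a rotation matrix or quaternion rather than a yaw angle, but the composition pattern is the same.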
  • Publication number: 20100002909
    Abstract: The invention provides a system for real-time detection of interactions between a user and an augmented-reality scene, the interactions resulting from a modification of the appearance of an object present in the image. After a reference model has been created (110) and processed (115) in an initialization phase (100), the pose of the object in the image is determined (135) and a comparison model is extracted from the image (160). The reference and comparison models are compared (170) as a function of said determined pose of the object and, based on this comparison step, the interactions are detected.
    Type: Application
    Filed: June 30, 2009
    Publication date: January 7, 2010
    Applicant: TOTAL IMMERSION
    Inventors: Valentin Lefevre, Nicolas Livet, Thomas Pasquier
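    The comparison step described above can be sketched as a pixel-level difference between the reference model and the comparison model extracted at the object's current pose: a large difference suggests something (for example the user's hand) is covering or altering the object. This is a minimal illustration; the functions, the flat pixel lists, and the threshold are assumptions, not the patent's actual models.

    ```python
    # Illustrative sketch of the detection step: the comparison model (pixels
    # currently covering the tracked object, already aligned using its pose)
    # is compared with the reference model captured at initialization. A mean
    # absolute difference above a threshold signals an interaction.

    def mean_abs_diff(reference, comparison):
        """Mean absolute per-pixel difference between two equal-length
        grayscale pixel lists."""
        return sum(abs(r - c) for r, c in zip(reference, comparison)) / len(reference)

    def interaction_detected(reference, comparison, threshold=30.0):
        return mean_abs_diff(reference, comparison) > threshold

    # Unchanged appearance (sensor noise only) vs. an occluded object.
    quiet = interaction_detected([100, 100, 100, 100], [101, 99, 100, 100])
    occluded = interaction_detected([100, 100, 100, 100], [10, 10, 10, 10])
    ```

    Using the determined pose to align the two models before differencing is what lets the system distinguish a genuine appearance change from ordinary object motion.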
  • Patent number: 7471301
    Abstract: The invention concerns a method and a system for: (i) producing, in a computer processing unit, a stream of synthetic images, and (ii) rendering a scene by creating visual interactions between the synthetic image stream and at least one video image stream. The computer processing unit comprises: a motherboard; a graphics board for scene rendering and display, including a processor for accelerating 2D/3D processing, a work buffer and a texture memory; and an acquisition means for acquiring video images in real time into a video buffer. The rendering of the scene is carried out by copying the video buffer into a memory zone of the graphics board and by drawing the synthetic images in the work buffer.
    Type: Grant
    Filed: July 21, 2003
    Date of Patent: December 30, 2008
    Assignee: Total Immersion
    Inventor: Valentin Lefevre
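    The rendering order the abstract describes, video first, synthetic imagery on top, amounts to compositing the captured frame as the background of the work buffer. The sketch below mimics that order with plain pixel lists; it is a conceptual illustration only, and the use of `None` as the transparent value is an assumption, not part of the patent.

    ```python
    # Hypothetical sketch of the compositing order: the captured video frame
    # is first copied into the work buffer (the background), then the
    # synthetic image is drawn on top wherever it is not transparent.

    def composite(video_frame, synthetic, transparent=None):
        """video_frame and synthetic are equal-length pixel lists; pixels of
        `synthetic` equal to `transparent` let the video show through."""
        out = list(video_frame)             # copy the video buffer into the work buffer
        for i, px in enumerate(synthetic):  # draw synthetic pixels on top
            if px != transparent:
                out[i] = px
        return out

    # Video background with one synthetic pixel overlaid in the middle.
    frame_out = composite([1, 2, 3], [None, 9, None])
    ```

    On real hardware this copy lands in the graphics board's texture memory so the GPU can blend the two streams every frame without a round trip through main memory.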
  • Publication number: 20060074921
    Abstract: The invention concerns a method and a system for: (i) producing, in a computer processing unit, a stream of synthetic images, and (ii) rendering a scene by creating visual interactions between the synthetic image stream and at least one video image stream. The computer processing unit comprises: a motherboard; a graphics board for scene rendering and display, including a processor for accelerating 2D/3D processing, a work buffer and a texture memory; and an acquisition means for acquiring video images in real time into a video buffer. The rendering of the scene is carried out by copying the video buffer into a memory zone of the graphics board and by drawing the synthetic images in the work buffer.
    Type: Application
    Filed: July 21, 2003
    Publication date: April 6, 2006
    Applicant: TOTAL IMMERSION
    Inventor: Valentin Lefevre