Patents Assigned to Total Immersion
  • Patent number: 8903177
    Abstract: The invention in particular relates to the hybrid tracking of representations of objects in a sequence of images using at least one key image. After acquiring first and second images including a representation of the tracked object, a first image portion is identified in the first image, and a second image portion is retrieved from the key image. A relative pose of a first image portion of said second image, similar to the first image portion of the first image, is estimated. A second image portion of the first or second image, similar to the second image portion of the key image, is sought. The relative pose of the object is then estimated according to the relative poses of the first image portions and the second image portions.
    Type: Grant
    Filed: October 12, 2010
    Date of Patent: December 2, 2014
    Assignee: Total Immersion
    Inventors: Nicolas Livet, Thomas Pasquier, Jérémy Chamoux
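    Note: The hybrid tracking described in the abstract above matches portions of the current frame both against the previous frame and against a stored key image before estimating the object's relative pose. The sketch below is a minimal illustration of the key-image matching step only, not the patented method; the key image, the OpenCV feature-matching approach and all names are assumptions for illustration.
```python
# Illustrative sketch (not the patented method): locate a tracked object by
# matching ORB features between a stored key image and the current frame,
# then estimating the 2D transform with RANSAC.
import cv2
import numpy as np

def match_against_key_image(key_image, frame):
    orb = cv2.ORB_create(nfeatures=1000)
    kp1, des1 = orb.detectAndCompute(key_image, None)
    kp2, des2 = orb.detectAndCompute(frame, None)
    if des1 is None or des2 is None:
        return None
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:200]
    if len(matches) < 4:
        return None
    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    return H  # maps key-image coordinates into the current frame
```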
  • Patent number: 8824736
    Abstract: Systems and devices for augmenting a real scene in a video stream are disclosed herein.
    Type: Grant
    Filed: November 20, 2012
    Date of Patent: September 2, 2014
    Assignee: Total Immersion
    Inventors: Valentin Lefevre, Nicolas Livet
  • Patent number: 8805016
    Abstract: Methods, systems and devices for augmenting a real scene in a video stream are disclosed herein.
    Type: Grant
    Filed: November 20, 2012
    Date of Patent: August 12, 2014
    Assignee: Total Immersion
    Inventors: Valentin Lefevre, Nicolas Livet
  • Patent number: 8797352
    Abstract: The invention relates to a method and devices for enabling a user to visualize a virtual model in a real environment. According to the invention, a 2D representation of a 3D virtual object is inserted, in real-time, into the video flows of a camera aimed at a real environment in order to form an enriched video flow. A plurality of cameras generating a plurality of video flows can be simultaneously used to visualize the virtual object in the real environment according to different angles of view. A particular video flow is used to dynamically generate the effects of the real environment on the virtual model. The virtual model can be, for example, a digital copy or virtual enrichments of a real copy. A virtual 2D object, for example the representation of a real person, can be inserted into the enriched video flow.
    Type: Grant
    Filed: August 9, 2006
    Date of Patent: August 5, 2014
    Assignee: Total Immersion
    Inventors: Valentin Lefevre, Jean-Marie Vaidie
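    Note: The central operation described above, inserting a real-time 2D representation of a 3D virtual object into a camera's video flow, can be read as alpha-compositing a rendered overlay onto each frame. The sketch below is a minimal illustration under that reading; the rendered overlay is assumed to come from any 3D renderer, and all names are illustrative rather than taken from the patent.
```python
# Illustrative sketch: alpha-composite a rendered overlay (the 2D
# representation of the virtual object) onto a camera frame. The overlay is
# assumed to share the frame's channel order and carry alpha in channel 3.
import numpy as np

def composite(frame, overlay_rgba):
    alpha = overlay_rgba[..., 3:4].astype(np.float32) / 255.0
    colour = overlay_rgba[..., :3].astype(np.float32)
    blended = alpha * colour + (1.0 - alpha) * frame.astype(np.float32)
    return blended.astype(np.uint8)  # the enriched frame
```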
  • Patent number: 8675972
    Abstract: The invention relates to a method and a device for determining the exposure of a three-dimensional object in an image, characterised in that it comprises the following steps: acquiring a three-dimensional generic model of the object, projecting the three-dimensional generic model according to at least one two-dimensional representation and associating with each two-dimensional representation an exposure information of the three-dimensional object, selecting and positioning a two-dimensional representation onto the object in said image, and determining the three-dimensional exposure of the object in the image from at least the exposure information associated with the selected two-dimensional representation.
    Type: Grant
    Filed: February 22, 2008
    Date of Patent: March 18, 2014
    Assignee: Total Immersion
    Inventors: Valentin Lefevre, Nicolas Livet
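    Note: The method above projects a generic 3D model into a set of 2D representations, each carrying the exposure (pose) information that produced it, then selects and positions the best-matching representation on the object in the image. The sketch below illustrates that select-and-look-up step with simple template matching; the pre-rendered views, their associated poses and all names are assumptions for illustration.
```python
# Illustrative sketch: among pre-rendered 2D views of a generic 3D model,
# pick the one that best matches the object in the image and return the
# exposure (pose) stored with it. Each template must be no larger than the
# image for cv2.matchTemplate.
import cv2

def select_exposure(image_gray, views):
    # views: list of (template_gray, exposure) pairs, one per 2D representation
    best_score, best_exposure, best_location = -1.0, None, None
    for template, exposure in views:
        result = cv2.matchTemplate(image_gray, template, cv2.TM_CCOEFF_NORMED)
        _, score, _, location = cv2.minMaxLoc(result)
        if score > best_score:
            best_score, best_exposure, best_location = score, exposure, location
    return best_exposure, best_location, best_score
```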
  • Publication number: 20140043332
    Abstract: A particular subject of the invention is a method for generating a textured representation of a real object in a system comprising a data processing device linked to an image acquisition device. After receiving (310) a first image representing said real object in a given pose and after obtaining (320, 325, 330, 335) a second image representing said real object in a pose identical to said given pose, said second image being representative of the transparency of said real object in said given pose, said textured representation is generated (340) by combining said first and second images.
    Type: Application
    Filed: November 26, 2012
    Publication date: February 13, 2014
    Applicant: TOTAL IMMERSION
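    Note: The combination step described above merges a first image of the object with a second image that encodes its transparency in the same pose. One plausible reading is the construction of an RGBA texture; the sketch below illustrates that reading only, and the function name is an assumption.
```python
# Illustrative sketch: combine a colour image of the object with an image
# encoding its transparency (same pose, same size) into a single RGBA texture.
import numpy as np

def build_textured_representation(colour_image, transparency_image):
    # colour_image: HxWx3 uint8; transparency_image: HxW uint8 opacity values
    return np.dstack([colour_image, transparency_image])  # HxWx4 RGBA texture
```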
  • Patent number: 8614705
    Abstract: The invention relates to a method and a device for creating at least two key images, each including an image representing at least one three-dimensional object in a three-dimensional environment and the exposure of the object in said environment from the viewpoint of the associated image, said method being characterized in that it comprises the following steps: acquiring a first image representing the object in a predetermined initial position; creating a first key image from the first acquired image and the relative exposure of the object; acquiring at least one second image representing said object, the viewpoint of said at least one second image being different from the viewpoint of said first image; determining the relative exposure of the object in its environment based on the difference between the viewpoints of the first image and said at least one second image, each of said viewpoints being determined relative to a position and an orientation; and creating a second key image based on said at least one acquired second image.
    Type: Grant
    Filed: January 18, 2008
    Date of Patent: December 24, 2013
    Assignee: Total Immersion
    Inventors: Valentin Lefevre, Nicolas Livet, Worou Pierrick Chabi, Yves Quemener
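    Note: A key image, as used in this entry and in the tracking entries below, pairs an acquired image with the exposure (pose) of the object from that image's viewpoint. The record below is a minimal illustrative sketch; the rotation/translation representation and the field names are assumptions, not taken from the patent.
```python
# Illustrative sketch: a key image pairs an acquired image with the exposure
# (pose) of the object from that image's viewpoint.
from dataclasses import dataclass
import numpy as np

@dataclass
class KeyImage:
    image: np.ndarray        # the acquired image
    rotation: np.ndarray     # 3x3 rotation of the object relative to the camera
    translation: np.ndarray  # 3-vector position of the object relative to the camera
```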
  • Publication number: 20130208092
    Abstract: A particular subject of the invention is the creation of three-dimensional representations from real models having similar and predetermined characteristics, using a system comprising a support suitable for receiving such a real object, configured to present the object in a pose similar to that of use of said at least one real object; an image acquisition device configured to obtain at least two distinct images of the real object from at least two separate viewpoints; and a data processing device configured to receive these images, to clip a representation of the real object in each of these images in order to obtain at least two textures of the real object, to obtain a generic three-dimensional model of the real object, and to create a three-dimensional model of the real object from the textures and the generic three-dimensional model obtained.
    Type: Application
    Filed: November 26, 2012
    Publication date: August 15, 2013
    Applicant: TOTAL IMMERSION
  • Publication number: 20130121531
    Abstract: Systems and devices for augmenting a real scene in a video stream are disclosed herein.
    Type: Application
    Filed: November 20, 2012
    Publication date: May 16, 2013
    Applicant: TOTAL IMMERSION
  • Publication number: 20130076790
    Abstract: Methods, systems and devices for augmenting a real scene in a video stream are disclosed herein.
    Type: Application
    Filed: November 20, 2012
    Publication date: March 28, 2013
    Applicant: TOTAL IMMERSION
  • Patent number: 8374395
    Abstract: Methods and devices for the real-time tracking of an object in a video stream for an augmented-reality application are disclosed herein.
    Type: Grant
    Filed: September 7, 2012
    Date of Patent: February 12, 2013
    Assignee: Total Immersion
    Inventors: Valentin Lefevre, Nicolas Livet
  • Patent number: 8374396
    Abstract: Methods and devices for the real-time tracking of an object in a video stream for an augmented-reality application are disclosed herein.
    Type: Grant
    Filed: September 7, 2012
    Date of Patent: February 12, 2013
    Assignee: Total Immersion
    Inventors: Valentin Lefevre, Nicolas Livet
  • Patent number: 8374394
    Abstract: Methods and devices for the real-time tracking of one or more objects of a real scene in a video stream for an augmented-reality application are disclosed herein.
    Type: Grant
    Filed: September 7, 2012
    Date of Patent: February 12, 2013
    Assignee: Total Immersion
    Inventors: Valentin Lefevre, Nicolas Livet
  • Patent number: 8315432
    Abstract: The invention relates to a method and to devices for the real-time tracking of one or more substantially planar geometrical objects of a real scene in at least two images of a video stream for an augmented-reality application. After receiving a first image of the video stream (300), the first image including the object to be tracked, the position and orientation of the object in the first image are determined from a plurality of previously determined image blocks (320), each image block of said plurality of image blocks being associated with an exposure of the object to be tracked. The first image and the position and the orientation of the object to be tracked in the first image define a key image. After receiving a second image from the video stream, the position and orientation of the object to be tracked in the second image are evaluated from the key image (300). The second image and the corresponding position and orientation of the object to be tracked can be stored as a key image.
    Type: Grant
    Filed: January 18, 2008
    Date of Patent: November 20, 2012
    Assignee: Total Immersion
    Inventors: Valentin Lefevre, Nicolas Livet
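    Note: The entry above tracks a substantially planar object from frame to frame against a key image and can promote a newly tracked frame to become the key image for later frames. The loop below is a minimal illustrative sketch of that structure only; the homography-based pose representation, the promotion criterion and the helper estimate_homography are assumptions, not the patented method.
```python
# Illustrative sketch: track a planar object through a frame sequence against
# a key image, promoting a well-tracked frame to become the new key image.
# estimate_homography(img_a, img_b) is assumed to return (H, inlier_count),
# with H mapping img_a coordinates into img_b, or (None, 0) on failure.
import numpy as np

def track_planar(frames, estimate_homography, min_inliers=30):
    key_frame, key_H = frames[0], np.eye(3)   # first frame acts as the initial key image
    poses = [np.eye(3)]                       # object position expressed relative to frame 0
    for frame in frames[1:]:
        H, inliers = estimate_homography(key_frame, frame)
        if H is None:
            poses.append(None)                # tracking lost in this frame
            continue
        pose = H @ key_H                      # frame 0 -> key image -> current frame
        poses.append(pose)
        if inliers >= min_inliers:            # store this frame as the new key image
            key_frame, key_H = frame, pose
    return poses
```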
  • Publication number: 20120201469
    Abstract: The invention in particular relates to the hybrid tracking of representations of objects in a sequence of images using at least one key image. After acquiring first and second images including a representation of the tracked object, a first image portion is identified in the first image, and a second image portion is retrieved from the key image. A relative pose of a first image portion of said second image, similar to the first image portion of the first image, is estimated. A second image portion of the first or second image, similar to the second image portion of the key image, is sought. The relative pose of the object is then estimated according to the relative poses of the first image portions and the second image portions.
    Type: Application
    Filed: October 12, 2010
    Publication date: August 9, 2012
    Applicant: TOTAL IMMERSION
    Inventors: Nicolas Livet, Thomas Pasquier, Jérémy Chamoux
  • Publication number: 20120129605
    Abstract: The invention relates in particular to the detection of interactions with a software application according to a movement of an object situated in the field of an image sensor. After having received a first and a second image and having identified a first region of interest in the first image, a second region of interest, corresponding to the first region of interest, is identified in the second image. The first and second regions of interest are compared and a mask of interest characterizing a variation of at least one feature of corresponding points in the first and second regions of interest is determined. A movement of the object is then determined from said mask of interest. The movement is analyzed and, in response, a predetermined action is triggered or not triggered.
    Type: Application
    Filed: November 18, 2011
    Publication date: May 24, 2012
    Applicant: TOTAL IMMERSION
    Inventors: Nicolas Livet, Thomas Pasquier
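    Note: The detection step described above compares the same region of interest across two images, derives a mask of interest marking changed pixels, and triggers an action depending on the movement inferred from that mask. The sketch below illustrates one simple way to compute such a mask and a coarse trigger decision; the thresholds and names are illustrative assumptions.
```python
# Illustrative sketch: compare the same region of interest in two images,
# build a mask of changed pixels, and decide whether to trigger an action.
# The threshold values are arbitrary illustrative choices.
import cv2
import numpy as np

def detect_movement(prev_gray, curr_gray, roi, diff_threshold=25, area_threshold=0.05):
    x, y, w, h = roi                                   # region of interest in pixels
    a = prev_gray[y:y + h, x:x + w]
    b = curr_gray[y:y + h, x:x + w]
    mask = (cv2.absdiff(a, b) > diff_threshold).astype(np.uint8)  # mask of interest
    changed_fraction = mask.mean()                     # share of changed pixels in the ROI
    triggered = changed_fraction > area_threshold      # trigger the predetermined action or not
    return mask, triggered
```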
  • Publication number: 20120046770
    Abstract: In one aspect, a mobile device comprises a local content store, one or more media playback components, one or more content capture components, and an instructional module agent comprising an authoring application and a playback application. The authoring application is configured to allow an author to create and edit instructional modules each comprising one or more media playback steps, each step comprising media that can be displayed or played, and to use the content capture components to capture content, store the captured content in the local content store, and configure at least one of the steps to display or play the captured content using the media playback components. The playback application is configured to play the instructional modules using the media playback components.
    Type: Application
    Filed: June 23, 2011
    Publication date: February 23, 2012
    Applicant: Total Immersion Software, Inc.
    Inventors: Michael J. Becker, Aaron Cammarata, Peter A. Bonanni, III, David S. Maynard, John Alan Main, John R. Lowell, Lawton Campbell
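    Note: The authoring and playback applications described above revolve around a simple data shape: an instructional module is an ordered list of media playback steps, each step referring to captured content held in the local content store. The structure below is a minimal illustrative sketch; the field names are assumptions, not taken from the patent.
```python
# Illustrative sketch: an instructional module as an ordered list of media
# playback steps, each referring to content in the local content store.
from dataclasses import dataclass, field
from typing import List

@dataclass
class MediaStep:
    title: str
    content_id: str   # key into the local content store
    media_type: str   # e.g. "image", "video", "audio"

@dataclass
class InstructionalModule:
    name: str
    steps: List[MediaStep] = field(default_factory=list)

    def add_captured_content(self, title, content_id, media_type):
        # authoring path: attach newly captured content as a playback step
        self.steps.append(MediaStep(title, content_id, media_type))
```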
  • Patent number: 8010327
    Abstract: One of a plurality of description elements for a scenario is attached to a composite asset. The composite asset comprises one or more data sets, and each data set is associated with a simulation. A simulation asset is then generated for a scenario based at least in part on the composite asset.
    Type: Grant
    Filed: April 25, 2008
    Date of Patent: August 30, 2011
    Assignee: Total Immersion Software, Inc.
    Inventors: Keith Copenhagen, David Nielsen, Steven Pollini
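    Note: The abstract above attaches one of several scenario description elements to a composite asset (a bundle of data sets, each associated with a simulation) and then generates a scenario-specific simulation asset from it. The sketch below illustrates that relationship only; the types and names are assumptions, not taken from the patent.
```python
# Illustrative sketch: a composite asset bundles data sets (one per
# simulation) and can carry scenario description elements; a simulation
# asset is generated for a scenario from the composite asset.
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class CompositeAsset:
    data_sets: Dict[str, dict]                                    # simulation name -> data set
    descriptions: Dict[str, dict] = field(default_factory=dict)   # scenario -> description element

    def attach_description(self, scenario, element):
        self.descriptions[scenario] = element

def generate_simulation_asset(asset, scenario, simulation):
    return {
        "scenario": scenario,
        "description": asset.descriptions.get(scenario, {}),
        "data": asset.data_sets[simulation],
    }
```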
  • Publication number: 20100277468
    Abstract: The invention relates to a method and devices for enabling a user to visualise a virtual model in a real environment. According to the invention, a 2D representation of a 3D virtual object is inserted, in real-time, into the video flows of a camera aimed at a real environment in order to form an enriched video flow. A plurality of cameras generating a plurality of video flows can be simultaneously used to visualise the virtual object in the real environment according to different angles of view. A particular video flow is used to dynamically generate the effects of the real environment on the virtual model. The virtual model can be, for example, a digital copy or virtual enrichments of a real copy. A virtual 2D object, for example the representation of a real person, can be inserted into the enriched video flow.
    Type: Application
    Filed: August 9, 2006
    Publication date: November 4, 2010
    Applicant: TOTAL IMMERSION
    Inventors: Valentin Lefevre, Jean-Marie Vaidie
  • Publication number: 20100220891
    Abstract: The invention relates to a method and to devices for the real-time tracking of one or more substantially planar geometrical objects of a real scene in at least two images of a video stream for an augmented-reality application. After receiving a first image of the video stream (300), the first image including the object to be tracked, the position and orientation of the object in the first image are determined from a plurality of previously determined image blocks (320), each image block of said plurality of image blocks being associated with an exposure of the object to be tracked. The first image and the position and the orientation of the object to be tracked in the first image define a key image. After receiving a second image from the video stream, the position and orientation of the object to be tracked in the second image are evaluated from the key image (300). The second image and the corresponding position and orientation of the object to be tracked can be stored as a key image.
    Type: Application
    Filed: January 18, 2008
    Publication date: September 2, 2010
    Applicant: TOTAL IMMERSION
    Inventors: Valentin Lefevre, Nicolas Livet