Patents by Inventor Hartmut Neven

Hartmut Neven has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 6917703
    Abstract: The present invention may be embodied in a method, and in a related apparatus, for classifying a feature in an image frame. In the method, an original image frame having an array of pixels is transformed using Gabor-wavelet transformations to generate a transformed image frame. Each pixel of the transformed image is associated with a respective pixel of the original image frame and is represented by a predetermined number of wavelet component values. A pixel of the transformed image frame associated with the feature is selected for analysis. A neural network is provided that has an output and a predetermined number of inputs. Each input of the neural network is associated with a respective wavelet component value of the selected pixel. The neural network classifies the local feature based on the wavelet component values, and indicates a class of the feature at an output of the neural network.
    Type: Grant
    Filed: February 28, 2001
    Date of Patent: July 12, 2005
    Assignee: Nevengineering, Inc.
    Inventors: Johannes B. Steffens, Hartwig Adam, Hartmut Neven
  • Patent number: 6876364
    Abstract: The present invention provides a technique for translating facial animation values to head mesh positions for rendering facial features of an animated avatar. In the method, an animation vector of dimension Na is provided. Na is the number of facial animation values in the animation vector. A mapping algorithm F is applied to the animation vector to generate a target mix vector of dimension M. M is the number of targets associated with the head mesh positions. The head mesh positions are deformed based on the target mix vector.
    Type: Grant
    Filed: August 9, 2002
    Date of Patent: April 5, 2005
    Assignee: Vidiator Enterprises Inc.
    Inventors: Ulrich F. Buddemeier, Karin M. Derlich, Hartmut Neven
  • Patent number: 6853379
    Abstract: The present invention provides a technique for translating an animation vector to a target mix vector.
    Type: Grant
    Filed: August 13, 2001
    Date of Patent: February 8, 2005
    Assignee: Vidiator Enterprises Inc.
    Inventors: Ulrich F. Buddemeier, Karin M. Derlich, Hartmut Neven
  • Patent number: 6834115
    Abstract: The present invention relates to a technique for optimizing off-line facial feature tracking. Facial features in a sequence of image frames are automatically tracked while a visual indication of the tracking node locations is presented on the respective image frames. The sequence may be manually paused at a particular image frame if the visual indication shows that at least one tracking node is not adequately tracking its respective facial feature. The tracking node may be reinitialized by manually placing it at a position on the particular image frame, in the monitor window, that corresponds to the respective facial feature. Automatic tracking of the facial feature may then continue based on the reinitialized tracking node location.
    Type: Grant
    Filed: August 13, 2001
    Date of Patent: December 21, 2004
    Assignee: Nevengineering, Inc.
    Inventors: Thomas Maurer, Hartmut Neven, Bjoern Poehlker
  • Patent number: 6714661
    Abstract: The present invention is embodied in a method and system for customizing a visual sensor for facial feature tracking using a neutral face image of an actor. The method may include generating a corrector graph to improve the sensor's performance in tracking an actor's facial features.
    Type: Grant
    Filed: July 24, 2001
    Date of Patent: March 30, 2004
    Assignee: Nevengineering, Inc.
    Inventors: Ulrich F. Buddemeier, Hartmut Neven
  • Patent number: 6580811
    Abstract: The present invention is embodied in an apparatus, and related method, for sensing a person's facial movements, features, characteristics, and the like to generate and animate an avatar image based on facial sensing. The avatar apparatus uses an image processing technique based on model graphs and bunch graphs that efficiently represent image features as jets. The jets are composed of wavelet transforms processed at node or landmark locations on an image corresponding to readily identifiable features. The nodes are acquired and tracked to animate an avatar image in accordance with the person's facial movements. Also, the facial sensing may use jet similarity to determine the person's facial features and characteristics, thus allowing tracking of a person's natural characteristics without any unnatural elements that may interfere with or inhibit those natural characteristics.
    Type: Grant
    Filed: May 31, 2001
    Date of Patent: June 17, 2003
    Assignee: Eyematic Interfaces, Inc.
    Inventors: Thomas Maurer, Egor Valerievich Elagin, Luciano Pasquale Agostino Nocera, Johannes Bernhard Steffens, Hartmut Neven
  • Publication number: 20030043153
    Abstract: The present invention provides a technique for translating facial animation values to head mesh positions for rendering facial features of an animated avatar. In the method, an animation vector of dimension Na is provided. Na is the number of facial animation values in the animation vector. A mapping algorithm F is applied to the animation vector to generate a target mix vector of dimension M. M is the number of targets associated with the head mesh positions. The head mesh positions are deformed based on the target mix vector.
    Type: Application
    Filed: August 9, 2002
    Publication date: March 6, 2003
    Inventors: Ulrich F. Buddemeier, Karin M. Derlich, Hartmut Neven
  • Publication number: 20030034978
    Abstract: The present invention provides a technique for translating an animation vector to a target mix vector.
    Type: Application
    Filed: August 13, 2001
    Publication date: February 20, 2003
    Inventors: Ulrich F. Buddemeier, Karin M. Derlich, Hartmut Neven
  • Publication number: 20030031344
    Abstract: The present invention relates to a technique for optimizing off-line facial feature tracking. Facial features in a sequence of image frames are automatically tracked while a visual indication of the tracking node locations is presented on the respective image frames. The sequence may be manually paused at a particular image frame if the visual indication shows that at least one tracking node is not adequately tracking its respective facial feature. The tracking node may be reinitialized by manually placing it at a position on the particular image frame, in the monitor window, that corresponds to the respective facial feature. Automatic tracking of the facial feature may then continue based on the reinitialized tracking node location.
    Type: Application
    Filed: August 13, 2001
    Publication date: February 13, 2003
    Inventors: Thomas Maurer, Hartmut Neven, Bjoern Poehlker
  • Publication number: 20030031381
    Abstract: The invention relates to a technique for generating an animated three-dimensional video head based on sensed locations of facial features and texture mapping of corresponding two-dimensional video image frames onto a shaped head mesh generated using the sensed locations.
    Type: Application
    Filed: August 13, 2001
    Publication date: February 13, 2003
    Inventors: Randall Ho, David Westwood, James Stewartson, Luciano Pasquale Agostino Nocera, Ulrich F. Buddemeier, Gregory Patrick Lane Lutter, Hartmut Neven
  • Publication number: 20030007666
    Abstract: The present invention is embodied in a method and apparatus for relief texture map flipping. The relief texture map flipping technique provides realistic avatar animation in a computationally efficient manner.
    Type: Application
    Filed: September 9, 2002
    Publication date: January 9, 2003
    Inventors: James A. Stewartson, David Westwood, Hartmut Neven
  • Publication number: 20020118195
    Abstract: Facial animation values are generated using a sequence of facial image frames and synchronously captured audio data of a speaking actor. In the technique, a plurality of visual facial animation values are provided based on tracking of facial features in the sequence of facial image frames of the speaking actor, and a plurality of audio facial animation values are provided based on visemes detected using the synchronously captured audio voice data of the speaking actor. The plurality of visual facial animation values and the plurality of audio facial animation values are combined to generate output facial animation values for use in facial animation.
    Type: Application
    Filed: August 13, 2001
    Publication date: August 29, 2002
    Inventors: Frank Paetzold, Ulrich F. Buddemeier, Yevgeniy V. Dzhurinskiy, Karin M. Derlich, Hartmut Neven
  • Publication number: 20020067362
    Abstract: The present invention is embodied in a method and system for generating an animation transform using a neutral face image. An avatar editor uses a frontal head image and a side head image of a neutral face model for generating an avatar. The avatar is generated by automatically finding head feature locations on the front and side head images using elastic bunch graph matching. Significant time savings may be accomplished by generating an animation transform using the neutral face features. The animation transform for the neutral face features may be applied to the other facial expression avatar meshes to improve the quality of the resulting avatar. The neutral-face-based animation transform provides significant improvement to the facial expression head models without the significant editing time incurred by generating a particular animation transform for each particular facial expression (and/or pose).
    Type: Application
    Filed: July 24, 2001
    Publication date: June 6, 2002
    Inventors: Luciano Pasquale Agostino Nocera, Hartmut Neven
  • Publication number: 20020031253
    Abstract: The present invention is directed to a method and related system for determining a feature location in multiple dimensions including depth. The method includes providing left and right camera images of the feature and locating the feature in the left camera image and in the right camera image using bunch graph matching. The feature location is determined in multiple dimensions including depth based on the feature locations in the left camera image and the right camera image.
    Type: Application
    Filed: July 24, 2001
    Publication date: March 14, 2002
    Inventors: Orang Dialameh, Hartmut Neven
  • Publication number: 20020015512
    Abstract: The present invention is embodied in a method and system for customizing a visual sensor for facial feature tracking using a neutral face image of an actor. The method may include generating a corrector graph to improve the sensor's performance in tracking an actor's facial features.
    Type: Application
    Filed: July 24, 2001
    Publication date: February 7, 2002
    Inventors: Ulrich F. Buddemeier, Hartmut Neven
  • Publication number: 20010033675
    Abstract: The present invention is embodied in an apparatus, and related method, for sensing a person's facial movements, features, characteristics, and the like to generate and animate an avatar image based on facial sensing. The avatar apparatus uses an image processing technique based on model graphs and bunch graphs that efficiently represent image features as jets. The jets are composed of wavelet transforms processed at node or landmark locations on an image corresponding to readily identifiable features. The nodes are acquired and tracked to animate an avatar image in accordance with the person's facial movements. Also, the facial sensing may use jet similarity to determine the person's facial features and characteristics, thus allowing tracking of a person's natural characteristics without any unnatural elements that may interfere with or inhibit those natural characteristics.
    Type: Application
    Filed: May 31, 2001
    Publication date: October 25, 2001
    Inventors: Thomas Maurer, Egor Valerievich Elagin, Luciano Pasquale Agostino Nocera, Johannes Bernhard Steffens, Hartmut Neven
  • Patent number: 6301370
    Abstract: The present invention is embodied in an apparatus, and related method, for detecting and recognizing an object in an image frame. The object may be, for example, a head having particular facial characteristics. The object detection process uses robust and computationally efficient techniques. The object identification and recognition process uses an image processing technique based on model graphs and bunch graphs that efficiently represent image features as jets. The jets are composed of wavelet transforms and are processed at nodes or landmark locations on an image corresponding to readily identifiable features. The system of the invention is particularly advantageous for recognizing a person over a wide variety of pose angles.
    Type: Grant
    Filed: December 4, 1998
    Date of Patent: October 9, 2001
    Assignee: Eyematic Interfaces, Inc.
    Inventors: Johannes Bernhard Steffens, Egor Valerievich Elagin, Luciano Pasquale Agostino Nocera, Thomas Maurer, Hartmut Neven
  • Patent number: 6272231
    Abstract: The present invention is embodied in an apparatus, and related method, for sensing a person's facial movements, features, characteristics, and the like to generate and animate an avatar image based on facial sensing. The avatar apparatus uses an image processing technique based on model graphs and bunch graphs that efficiently represent image features as jets. The jets are composed of wavelet transforms processed at node or landmark locations on an image corresponding to readily identifiable features. The nodes are acquired and tracked to animate an avatar image in accordance with the person's facial movements. Also, the facial sensing may use jet similarity to determine the person's facial features and characteristics, thus allowing tracking of a person's natural characteristics without any unnatural elements that may interfere with or inhibit those natural characteristics.
    Type: Grant
    Filed: November 6, 1998
    Date of Patent: August 7, 2001
    Assignee: Eyematic Interfaces, Inc.
    Inventors: Thomas Maurer, Egor Valerievich Elagin, Luciano Pasquale Agostino Nocera, Johannes Bernhard Steffens, Hartmut Neven
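
The feature classification described in patent 6917703 (wavelet component values at a pixel feeding a neural network) can be illustrated with a minimal sketch. This is not the patented implementation: the kernel parameters, the real-valued Gabor kernel, and the single-layer classifier standing in for the neural network are all illustrative assumptions.

```python
import numpy as np

def gabor_kernel(size, wavelength, theta, sigma):
    """Real part of a Gabor wavelet: a Gaussian-windowed plane wave."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    rotated = x * np.cos(theta) + y * np.sin(theta)
    envelope = np.exp(-(x ** 2 + y ** 2) / (2 * sigma ** 2))
    return envelope * np.cos(2 * np.pi * rotated / wavelength)

def wavelet_components(image, px, py, kernels):
    """Filter responses at one pixel: one component value per kernel."""
    half = kernels[0].shape[0] // 2
    patch = image[py - half:py + half + 1, px - half:px + half + 1]
    return np.array([np.sum(patch * k) for k in kernels])

def classify(components, weights, bias):
    """Toy one-layer classifier over the wavelet component vector."""
    scores = weights @ components + bias
    return int(np.argmax(scores))
```

A real system would use a bank of complex kernels at several scales and orientations, and trained rather than arbitrary network weights.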
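
The animation pipeline of patent 6876364 (and its application 20030043153) maps an Na-dimensional animation vector through an algorithm F to an M-dimensional target mix vector, then deforms the head mesh. The abstract does not say what F is; the sketch below assumes a linear map with clipped weights, and the morph-target blend is a standard technique, not necessarily the patented one.

```python
import numpy as np

def animation_to_target_mix(anim, F):
    """Apply the mapping F (assumed linear here: an M x Na matrix) to the
    Na-dimensional animation vector, yielding an M-dimensional mix."""
    return np.clip(F @ anim, 0.0, 1.0)  # mix weights kept in [0, 1]

def deform_mesh(neutral, targets, mix):
    """Blend per-target vertex displacements into the neutral head mesh.

    neutral: (V, 3) vertex positions; targets: (M, V, 3) displacements.
    """
    return neutral + np.tensordot(mix, targets, axes=1)
```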
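
The pause-and-reinitialize workflow of patent 6834115 (application 20030031344) can be summarized as a tracking loop that accepts operator corrections at chosen frames. The function names and the `corrections` dictionary are hypothetical scaffolding; `track_step` stands in for whatever per-frame tracker is used.

```python
def track_sequence(frames, init_nodes, track_step, corrections=None):
    """Track node locations frame by frame.

    `corrections` maps a frame index to manually re-placed node positions
    (the operator's pause-and-fix step); tracking then continues from the
    reinitialized locations.
    """
    corrections = corrections or {}
    nodes = dict(init_nodes)
    history = []
    for i, frame in enumerate(frames):
        if i in corrections:  # operator paused and re-placed some nodes
            nodes.update(corrections[i])
        nodes = {name: track_step(frame, pos) for name, pos in nodes.items()}
        history.append(dict(nodes))
    return history
```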
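
Patents 6580811 and 6301370 both rely on jets (vectors of wavelet responses) compared by similarity. A common similarity measure for such jets is the normalized dot product of response magnitudes; the sketch below uses that measure as an assumption, since the abstracts do not define one.

```python
import numpy as np

def jet_similarity(a, b):
    """Magnitude-based jet similarity: normalized dot product of two
    jets (vectors of wavelet-response magnitudes)."""
    a, b = np.abs(a), np.abs(b)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def best_match(probe_jet, bunch):
    """Pick the bunch-graph jet most similar to the probe jet."""
    sims = [jet_similarity(probe_jet, j) for j in bunch]
    best = int(np.argmax(sims))
    return best, sims[best]
```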
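
Application 20020118195 combines visual and audio (viseme-derived) facial animation values into one output vector. The abstract does not specify the combination rule; a per-channel weighted blend with a fixed weight is one plausible reading, sketched here purely for illustration.

```python
def combine_animation_values(visual, audio, audio_weight=0.5):
    """Blend per-channel visual and audio facial animation values.

    `audio_weight` is a free parameter of this sketch, not a value
    taken from the patent application.
    """
    assert len(visual) == len(audio)
    w = audio_weight
    return [(1 - w) * v + w * a for v, a in zip(visual, audio)]
```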
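
Application 20020031253 determines a feature's location in multiple dimensions, including depth, from its positions in left and right camera images. For a rectified pinhole stereo pair, the standard triangulation relation is Z = f * B / disparity; the sketch below uses that textbook relation (the bunch-graph matching step that locates the feature in each image is omitted).

```python
def depth_from_stereo(x_left, x_right, focal_px, baseline_m):
    """Depth of a matched feature from its horizontal image positions in
    a rectified stereo pair: Z = f * B / disparity."""
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("feature must have positive disparity")
    return focal_px * baseline_m / disparity

def feature_3d(x_left, y, x_right, focal_px, baseline_m, cx, cy):
    """Back-project the feature to (X, Y, Z), left camera as reference;
    (cx, cy) is the principal point in pixels."""
    z = depth_from_stereo(x_left, x_right, focal_px, baseline_m)
    return ((x_left - cx) * z / focal_px, (y - cy) * z / focal_px, z)
```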