Temporal Interpolation Or Processing Patents (Class 345/475)
  • Patent number: 8928671
    Abstract: In particular embodiments, a method includes generating a 3D display of an avatar of a person, where the avatar can receive inputs identifying a type of a physiological event, a location of the physiological event in or on the person's body in three spatial dimensions, a time range of the physiological event, and a quality of the physiological event; the method further includes rendering the physiological event on the avatar based on the inputs.
    Type: Grant
    Filed: November 24, 2010
    Date of Patent: January 6, 2015
    Assignee: Fujitsu Limited
    Inventors: B. Thomas Adler, David Marvit, Jawahar Jain
  • Patent number: 8902233
    Abstract: Techniques that give animators the direct control they are accustomed to with key frame animation, while providing for path-based motion. A key frame animation-based interface is used to achieve path-based motion with rotation animation variable value correction using additional animation variables for smoothing. The value of the additional animation variables for smoothing can be directly controlled using a tangent handle in a user interface.
    Type: Grant
    Filed: March 4, 2011
    Date of Patent: December 2, 2014
    Assignee: Pixar
    Inventors: Chen Shen, Bena L. Currin, Timothy S. Milliron
  • Patent number: 8902232
    Abstract: Acquisition, modeling, compression, and synthesis of realistic facial deformations using polynomial displacement maps are described. An analysis phase can be included where the relationship between motion capture markers and detailed facial geometry is inferred. A synthesis phase can be included where detailed animated facial geometry is driven by a sparse set of motion capture markers. For analysis, an actor can be recorded wearing facial markers while performing a set of training expression clips. Real-time high-resolution facial deformations are captured, including dynamic wrinkle and pore detail, using interleaved structured light 3D scanning and photometric stereo. Next, displacements are calculated between a neutral mesh driven by the motion capture markers and the high-resolution captured expressions. These geometric displacements are stored in one or more polynomial displacement maps parameterized according to the local deformations of the motion capture dots.
    Type: Grant
    Filed: February 2, 2009
    Date of Patent: December 2, 2014
    Assignee: University of Southern California
    Inventors: Paul E. Debevec, Wan-Chun Ma, Timothy Hawkins
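    A minimal Python sketch of the general idea in the abstract above: fitting per-texel polynomial coefficients to captured displacements as a function of local marker deformation, then evaluating them at synthesis time. The variable names, the two-feature deformation input, and the quadratic basis are illustrative assumptions, not the patented formulation.

      import numpy as np

      def fit_displacement_polynomial(deformations, displacements):
          """Fit d ~ c0 + c1*u + c2*v + c3*u^2 + c4*u*v + c5*v^2 for one texel.

          deformations:  (F, 2) local deformation features (u, v) per training frame
          displacements: (F,)   captured geometric displacement per training frame
          """
          u, v = deformations[:, 0], deformations[:, 1]
          basis = np.column_stack([np.ones_like(u), u, v, u * u, u * v, v * v])
          coeffs, *_ = np.linalg.lstsq(basis, displacements, rcond=None)
          return coeffs  # stored as one texel of the polynomial displacement map

      def evaluate_displacement(coeffs, u, v):
          """Synthesis: recover fine-scale displacement from sparse-marker deformation."""
          return coeffs @ np.array([1.0, u, v, u * u, u * v, v * v])

      # toy data: displacement grows quadratically with the first deformation feature
      feats = np.random.rand(50, 2)
      disp = 0.2 * feats[:, 0] ** 2
      c = fit_displacement_polynomial(feats, disp)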
  • Patent number: 8902235
    Abstract: A computerized device implements an animation coding engine to analyze timeline data defining an animation sequence and generate a code package. The code package can represent the animation sequence using markup code that defines a rendered appearance of a plurality of frames and a structured data object also comprised in the code package and defining a parameter used by a scripting language in transitioning between frames. The markup code can also comprise a reference to a visual asset included within a frame. The code package further comprises a cascading style sheet defining an animation primitive as a style to be applied to the asset to reproduce one or more portions of the animation sequence without transitioning between frames.
    Type: Grant
    Filed: April 7, 2011
    Date of Patent: December 2, 2014
    Assignee: Adobe Systems Incorporated
    Inventor: Alexandru Chiculiță
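    A rough sketch of one part of the code-package idea described above: turning timeline data into a CSS keyframes rule plus a structured data object consumed by a runtime script. The input format, names, and keyframes-only output are assumptions for illustration.

      import json

      def build_code_package(name, timeline, duration_s):
          """timeline: list of (time_fraction, css_property, value) tuples."""
          steps = {}
          for t, prop, value in timeline:
              steps.setdefault(t, {})[prop] = value
          css = [f"@keyframes {name} {{"]
          for t in sorted(steps):
              body = "; ".join(f"{p}: {v}" for p, v in steps[t].items())
              css.append(f"  {int(t * 100)}% {{ {body}; }}")
          css.append("}")
          # structured data object used by the scripting layer when transitioning frames
          params = json.dumps({"animation": name, "duration": duration_s})
          return "\n".join(css), params

      stylesheet, params = build_code_package(
          "slide_in", [(0.0, "left", "0px"), (1.0, "left", "200px")], 2.0)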
  • Patent number: 8897821
    Abstract: A method for providing visual effect messages on a receiving end and associated transmitting end configuration is provided. At the transmitting end, visual effect positions and visual effects of messages are determined according to an input message. The visual effect positions and visual effect information are transmitted to the receiving end, and are displayed at the visual effect positions at the receiving end according to the visual information.
    Type: Grant
    Filed: April 19, 2012
    Date of Patent: November 25, 2014
    Assignee: MStar Semiconductor, Inc.
    Inventors: Chih-Hsien Huang, Sheng-Chi Yu
  • Patent number: 8878880
    Abstract: A method of driving an electrophoretic display device includes changing the gradation level of image data on the basis of correction data corresponding to the gradation level, converting image data with the changed gradation level to a dithering pattern, in which the first color and the second color are combined, corresponding to the changed gradation level for each predetermined region of image data, and driving the electrophoretic particles of the first color and the electrophoretic particles of the second color on the basis of image data converted to the dithering pattern for the plurality of pixels in the display section.
    Type: Grant
    Filed: April 7, 2011
    Date of Patent: November 4, 2014
    Assignee: Seiko Epson Corporation
    Inventors: Tetsuaki Otsuki, Kota Muto
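    A simplified sketch of the two steps named in the abstract above: gradation correction of image data followed by conversion to a two-color dithering pattern. An ordered 4x4 Bayer threshold matrix and an identity correction LUT stand in for the patented correction data and pattern.

      import numpy as np

      BAYER_4x4 = np.array([[ 0,  8,  2, 10],
                            [12,  4, 14,  6],
                            [ 3, 11,  1,  9],
                            [15,  7, 13,  5]]) / 16.0

      def to_two_color(image, correction_lut):
          """image: 2-D array of gradation levels 0..255; correction_lut: 256-entry LUT."""
          corrected = correction_lut[image]                      # gradation correction
          h, w = corrected.shape
          threshold = np.tile(BAYER_4x4, (h // 4 + 1, w // 4 + 1))[:h, :w]
          # True -> drive first-color (white) particles, False -> second-color (black)
          return (corrected / 255.0) > threshold

      lut = np.arange(256, dtype=np.uint8)          # identity correction as a placeholder
      pattern = to_two_color(np.full((8, 8), 128, dtype=np.uint8), lut)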
  • Patent number: 8880044
    Abstract: A mobile terminal is presented. The mobile terminal includes a display including a touchscreen, and a controller for performing an editing operation on information displayed on the touchscreen according to a state of an object in near-proximity to the displayed information.
    Type: Grant
    Filed: January 26, 2009
    Date of Patent: November 4, 2014
    Assignee: LG Electronics Inc.
    Inventor: Jong Hwan Kim
  • Patent number: 8874309
    Abstract: A method for acquiring information from a driving operation of a vehicle, in which first information is acquired with respect to at least one operating state of the vehicle and additional second information is ascertained with respect to this at least one operating state using statistical methods, the first and second information concerning this at least one operating state being stored. A method for the assignment and diagnosis of at least one operating state of a vehicle, a control unit, a computer program and a computer-program product are also provided.
    Type: Grant
    Filed: September 10, 2008
    Date of Patent: October 28, 2014
    Assignee: Robert Bosch GmbH
    Inventors: Andreas Genssle, Michael Kolitsch, Tobias Pfister
  • Patent number: 8866823
    Abstract: Automatically creating a series of intermediate states may include receiving a start state and an end state of a reactive system, identifying one or more components of the start state and the end state and determining one or more events associated with the one or more components. One or more intermediate states between the start state and the end state, and one or more transitions from and to the one or more intermediate states are created using the one or more components of the start state and the end state and the one or more events associated with the one or more components. The one or more intermediate states and the one or more transitions form one or more time-based paths from the start state to the end state occurring in response to applying the one or more events to the associated one or more components.
    Type: Grant
    Filed: October 13, 2010
    Date of Patent: October 21, 2014
    Assignee: International Business Machines Corporation
    Inventors: Rachel K. E. Bellamy, Michael Desmond, Jacquelyn A. Martino, Paul M. Matchen, John T. Richards, Calvin B. Swart
  • Patent number: 8836859
    Abstract: Disclosed herein is an image processing apparatus including an interlace/progressive conversion section configured to carry out interpolation processing on image data of the current field by making use of the image data of the current field and image data of a field leading ahead of the current field by one field period in order to obtain image data of a progressive system with no delay time.
    Type: Grant
    Filed: July 19, 2012
    Date of Patent: September 16, 2014
    Assignee: Sony Corporation
    Inventor: Kazuhide Fujita
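    A generic interlace-to-progressive sketch in the spirit of the abstract above: the current field's lines are woven in, and the missing lines are filled from a vertical average blended with the field one period ahead. The 50/50 blend weights are an assumption, not the claimed method, and the edge wrap from np.roll is ignored for brevity.

      import numpy as np

      def to_progressive(current_field, next_field, top_field_first=True):
          """Each field: (H/2, W) array of lines. Returns an (H, W) progressive frame."""
          h2, w = current_field.shape
          frame = np.zeros((h2 * 2, w), dtype=np.float64)
          own = 0 if top_field_first else 1          # rows owned by the current field
          frame[own::2] = current_field
          # interpolate the missing rows: average of neighbouring current-field lines,
          # blended with the co-located line from the field one period ahead
          spatial = 0.5 * (current_field + np.roll(current_field, -1, axis=0))
          frame[1 - own::2] = 0.5 * spatial + 0.5 * next_field
          return frame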
  • Patent number: 8836707
    Abstract: At least certain embodiments of the present disclosure include a method for animating a display region, windows, or views displayed on a display of a device. The method includes starting at least two animations. The method further includes determining the progress of each animation. The method further includes completing each animation based on a single timer.
    Type: Grant
    Filed: August 26, 2013
    Date of Patent: September 16, 2014
    Assignee: Apple Inc.
    Inventors: Andrew Platzer, John Harper
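    A small sketch of driving several concurrent animations from one shared timer, determining each animation's progress on every tick and completing it when it reaches the end, along the lines of the abstract above. Class and callback names are illustrative.

      import time

      class Animation:
          def __init__(self, duration, apply_progress):
              self.duration = duration
              self.apply_progress = apply_progress   # callback taking progress in [0, 1]
              self.start = None

      class SingleTimerAnimator:
          def __init__(self):
              self.animations = []

          def add(self, animation):
              animation.start = time.monotonic()
              self.animations.append(animation)

          def tick(self):
              """Called from one shared timer; updates and completes every animation."""
              now = time.monotonic()
              still_running = []
              for a in self.animations:
                  progress = min(1.0, (now - a.start) / a.duration)
                  a.apply_progress(progress)
                  if progress < 1.0:
                      still_running.append(a)
              self.animations = still_running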
  • Patent number: 8810582
    Abstract: A lighting module of a hair/fur pipeline may be used to produce lighting effects in a lighting phase for a shot and an optimization module may be used to: determine if a cache hair state file including hair parameters exists; and determine if the cache hair state file includes matching hair parameters to be used in the shot, and if so, the hair parameter values from the cache hair state file are used in the lighting phase.
    Type: Grant
    Filed: May 11, 2007
    Date of Patent: August 19, 2014
    Assignees: Sony Corporation, Sony Pictures Entertainment Inc.
    Inventors: Armin Walter Bruderlin, Francois Chardavoine, Clint Chua, Gustav Melich
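    A minimal sketch of the cache check described in the abstract above: hair values are reused in the lighting phase only when a cache hair state file exists and its stored parameters match the shot's parameters; otherwise the hair is recomputed and the cache rewritten. The JSON file format and names are assumptions.

      import json, os

      def hair_values_for_lighting(shot_params, cache_path, recompute_hair):
          """shot_params: dict of hair parameters; recompute_hair: expensive fallback."""
          if os.path.exists(cache_path):
              with open(cache_path) as f:
                  cached = json.load(f)
              if cached["params"] == shot_params:          # matching hair parameters
                  return cached["values"]                  # reuse in the lighting phase
          values = recompute_hair(shot_params)             # cache miss: simulate again
          with open(cache_path, "w") as f:
              json.dump({"params": shot_params, "values": values}, f)
          return values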
  • Patent number: 8803889
    Abstract: A virtual character such as an on-screen object, an avatar, an on-screen character, or the like may be animated using a live motion of a user and a pre-recorded motion. For example, a live motion of a user may be captured and a pre-recorded motion such as a pre-recorded artist generated motion, a pre-recorded motion of the user, and/or a programmatically controlled transformation may be received. The live motion may then be applied to a first portion of the virtual character and the pre-recorded motion may be applied to a second portion of the virtual character such that the virtual character may be animated with a combination of the live and pre-recorded motions.
    Type: Grant
    Filed: May 29, 2009
    Date of Patent: August 12, 2014
    Assignee: Microsoft Corporation
    Inventors: Kathryn Stone Perez, Alex A. Kipman, Jeffrey Margolis
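    A sketch of applying live capture to one portion of a character and a pre-recorded clip to another, as the abstract above outlines. The upper/lower body split and the pose data layout are illustrative assumptions.

      UPPER_BODY = {"spine", "neck", "head", "l_arm", "r_arm"}
      LOWER_BODY = {"hips", "l_leg", "r_leg"}

      def animate_character(live_pose, prerecorded_pose):
          """Each pose: dict mapping joint name -> rotation (e.g. an Euler angle tuple)."""
          pose = {}
          for joint, rotation in live_pose.items():
              if joint in UPPER_BODY:          # live motion drives the first portion
                  pose[joint] = rotation
          for joint, rotation in prerecorded_pose.items():
              if joint in LOWER_BODY:          # pre-recorded motion drives the second
                  pose[joint] = rotation
          return pose

      frame = animate_character(
          {"head": (0, 10, 0), "hips": (0, 0, 0)},
          {"hips": (5, 0, 0), "l_leg": (30, 0, 0)})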
  • Patent number: 8803887
    Abstract: A computer graphic system and methods for simulating hair is provided. In accordance with aspects of the disclosure a method for hybrid hair simulation using a computer graphics system is provided. The method includes generating a plurality of modeled hair strands using a processor of the computer graphics system. Each hair strand includes a plurality of particles and a plurality of spring members coupled in between the plurality of particles. The method also includes determining a first position and a first velocity for each particle in the plurality of modeled hair strands using the processor and coarsely modeling movement of the plurality of modeled hair strands with a continuum fluid solver. Self-collisions of the plurality of modeled hair strands are computed with a discrete collision model using the processor.
    Type: Grant
    Filed: January 15, 2010
    Date of Patent: August 12, 2014
    Assignee: Disney Enterprises, Inc.
    Inventors: Aleka McAdams, Andrew Selle, Kelly Ward, Eftychios Sifakis, Joseph Teran
  • Patent number: 8803886
    Abstract: The present invention provides a facial image display apparatus that can display moving images concentrated on the face when images of people's faces are displayed. A facial image display apparatus is provided wherein a facial area detecting unit (21) detects facial areas in which faces are displayed from within a target image for displaying a plurality of faces; a dynamic extraction area creating unit (22) creates, based on the facial areas detected by the facial area detecting means, a dynamic extraction area of which at least one of position and surface area varies over time in the target image; and a moving image output unit (27) sequentially extracts images in the dynamic extraction area and outputs the extracted images as a moving image.
    Type: Grant
    Filed: July 31, 2006
    Date of Patent: August 12, 2014
    Assignees: Sony Corporation, Sony Computer Entertainment Inc.
    Inventors: Munetaka Tsuda, Shuji Hiramatsu, Akira Suzuki
  • Patent number: 8797331
    Abstract: An information processing apparatus includes a bio-information obtaining unit configured to obtain bio-information of a subject; a kinetic-information obtaining unit configured to obtain kinetic information of the subject; and a control unit configured to determine an expression or movement of an avatar on the basis of the bio-information obtained by the bio-information obtaining unit and the kinetic information obtained by the kinetic-information obtaining unit and to perform a control operation so that the avatar with the determined expression or movement is displayed.
    Type: Grant
    Filed: August 4, 2008
    Date of Patent: August 5, 2014
    Assignee: Sony Corporation
    Inventors: Akane Sano, Masamichi Asukai, Taiji Ito, Yoichiro Sako
  • Patent number: 8786598
    Abstract: Disclosed herein are methods, apparatuses, and systems for preparing and displaying images in frame-sequential stereoscopic 3D. Frame-sequential stereoscopic display includes an alternating sequence of left- and right-perspective images for display. Disclosed methods include identifying pixels that modulate due to the alternating sequence of left- and right-perspective images of the frame-sequential stereoscopic display. The disclosed methods also include processing the pixels to reduce one or more residual images caused by the alternating sequence of left- and right-perspective images of the frame-sequential stereoscopic display. The disclosed methods may be implemented by a processing unit and the processing unit may be included in a system (such as a computer or video-game console).
    Type: Grant
    Filed: November 19, 2010
    Date of Patent: July 22, 2014
    Assignee: ATI Technologies ULC
    Inventor: Philip L. Swan
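    A rough numpy sketch of the two stages named in the abstract above: identifying pixels that modulate between the left- and right-perspective images and then processing them to weaken the residual (ghost) image. The difference threshold and the pull-toward-the-mean rule are assumptions; real implementations may instead overdrive pixel values.

      import numpy as np

      def reduce_residual_images(left, right, threshold=16, strength=0.5):
          """left, right: float grayscale images for one stereo frame pair."""
          modulating = np.abs(left - right) > threshold     # pixels that flip each frame
          mean = 0.5 * (left + right)
          # pull modulating pixels toward their temporal mean to weaken the ghost
          out_left = np.where(modulating, left + strength * (mean - left), left)
          out_right = np.where(modulating, right + strength * (mean - right), right)
          return out_left, out_right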
  • Patent number: 8786612
    Abstract: An animation editing device includes animation data including time line data that defines frames on the basis of a time line showing temporal display order of the frames, and space line data that defines frames on the basis of a space line for showing a relative positional relationship between a display position of each of animation parts and a reference position shown by a tag by mapping the relative positional relationship onto a one-dimensional straight line, displays the time line and the space line, and the contents of the frames based on the time line and the space line, and accepts an editing command to perform an editing process according to the inputted editing command.
    Type: Grant
    Filed: March 31, 2009
    Date of Patent: July 22, 2014
    Assignee: Mitsubishi Electric Corporation
    Inventors: Akira Toyooka, Hiroki Konaka
  • Patent number: 8786608
    Abstract: Certain embodiments relate to combining or blending animations that are attempting to simultaneously animate the same target. Certain embodiments simplify the blending of animations in the application development environment. For example, certain embodiments allow animations to be used or specified by a developer without the developer having to specifically address the potential for time-overlapping animations. As a few specific examples, an application may specify animations by simply calling a function to change a property of a target or by sending a command to change a public property of the target. Certain embodiments provide a blender that intercepts such function calls and commands. If two animations require a change to the same target at the same time, the blender determines an appropriate blended result and sends an appropriate function call or command to the target. The function calls and commands need not be aware of the blender.
    Type: Grant
    Filed: October 14, 2008
    Date of Patent: July 22, 2014
    Assignee: Adobe Systems Incorporated
    Inventor: Chet S. Haase
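    A small sketch of a blender that intercepts property-change requests aimed at the same target and resolves time-overlapping requests before updating the target, along the lines of the abstract above. The equal-weight average and the tick-based flush are assumptions.

      class Blender:
          def __init__(self, target):
              self.target = target
              self.pending = {}          # property name -> list of requested values

          def request_change(self, prop, value):
              """Called in place of setting the target property directly."""
              self.pending.setdefault(prop, []).append(value)

          def flush(self):
              """Once per tick: resolve overlapping requests and update the target."""
              for prop, values in self.pending.items():
                  blended = sum(values) / len(values)    # blend simultaneous animations
                  setattr(self.target, prop, blended)
              self.pending.clear()

      class Sprite:
          x = 0.0

      sprite = Sprite()
      blender = Blender(sprite)
      blender.request_change("x", 100.0)   # animation A
      blender.request_change("x", 60.0)    # animation B, same tick
      blender.flush()                      # sprite.x is now 80.0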
  • Patent number: 8786613
    Abstract: A method and system for drawing, displaying, editing, animating, simulating and interacting with one or more virtual polygonal, spline, volumetric models, three-dimensional visual models or robotic models. The method and system provide flexible simulation, the ability to combine rigid and flexible simulation on plural portions of a model, rendering of haptic forces and force-feedback to a user.
    Type: Grant
    Filed: March 11, 2013
    Date of Patent: July 22, 2014
    Inventor: Alan Millman
  • Patent number: 8773442
    Abstract: An event, such as a vertical blank interrupt or signal, received from a display adapter in a system is identified. Activation of a timer-driven animation routine that updates a state of an animation and activation of a paint controller module that identifies updates to the state of the animation and composes a frame that includes the updates to the state of the animation are aligned, both being activated based on the identified event in the system.
    Type: Grant
    Filed: July 6, 2012
    Date of Patent: July 8, 2014
    Assignee: Microsoft Corporation
    Inventors: Cenk Ergan, Benjamin C. Constable
  • Patent number: 8760469
    Abstract: A method that incorporates teachings of the present disclosure may include, for example, the steps of transmitting media content to a group of set top boxes for presentation with an overlay superimposed onto the media content, receiving a first comment from a first set top box of the group of set top boxes where the first comment is presentable with the overlay and the media content by the group of set top boxes, determining a first advertisement based on the first comment, and transmitting the first advertisement to the first set top box for presentation with the overlay and the media content. Other embodiments are disclosed.
    Type: Grant
    Filed: November 6, 2009
    Date of Patent: June 24, 2014
    Assignee: AT&T Intellectual Property I, L.P.
    Inventors: Linda Roberts, E-Lee Chang, Ja-Young Sung, Natasha Barrett Schultz, Robert Arthur King
  • Patent number: 8749560
    Abstract: The disclosed systems and methods make the motion of an object in an animation appear smooth by blending a number of subframes of visually adjusted images of the object for each frame of the animation. A request to animate an object along a motion path can be received by a graphics processing system of a device, where the motion path traverses at least a portion of a user interface presented on a display of the device. For each frame of the animation, the graphics processing system blends N subframes of visually adjusted images of the object to create a final blurred image which is rendered on the display. The graphics processing system can determine whether there is more processing time to perform additional blending of subframes prior to rendering a final frame for display and, if so, blend more subframes of images prior to rendering the final frame for display.
    Type: Grant
    Filed: May 18, 2012
    Date of Patent: June 10, 2014
    Assignee: Apple Inc.
    Inventor: Bas Ording
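    A sketch of the per-frame subframe blending loop described above: images of the object at sub-positions along the motion path are accumulated, with more subframes blended while the frame's time budget allows. The budget handling, subframe count, and image model are assumptions.

      import time
      import numpy as np

      def blurred_frame(render_at, t0, t1, frame_budget_s=0.008, max_subframes=16):
          """render_at(t) returns the object's image (2-D array) at motion-path time t."""
          deadline = time.monotonic() + frame_budget_s
          accum = render_at(t0).astype(np.float64)
          n = 1
          while n < max_subframes and time.monotonic() < deadline:
              t = t0 + (t1 - t0) * n / max_subframes   # sub-position along the motion path
              accum += render_at(t)
              n += 1
          return accum / n                              # final blurred image for this frame

      frame = blurred_frame(lambda t: np.full((4, 4), t), 0.0, 1.0)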
  • Patent number: 8743125
    Abstract: Natural inter-viseme animation of 3D head model driven by speech recognition is calculated by applying limitations to the velocity and/or acceleration of a normalized parameter vector, each element of which may be mapped to animation node outputs of a 3D model based on mesh blending and weighted by a mix of key frames.
    Type: Grant
    Filed: March 6, 2009
    Date of Patent: June 3, 2014
    Assignee: Sony Computer Entertainment Inc.
    Inventor: Masanori Omote
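    A compact sketch of limiting the velocity and acceleration of a normalized parameter vector as it moves between viseme targets, as the abstract above describes. The limit values, time-step handling, and clamping order are assumptions.

      import numpy as np

      def step_parameters(current, previous, target, dt, v_max=4.0, a_max=20.0):
          """Advance the normalized parameter vector one time step toward `target`."""
          prev_velocity = (current - previous) / dt
          desired_velocity = (target - current) / dt
          # limit acceleration first, then velocity
          accel = np.clip((desired_velocity - prev_velocity) / dt, -a_max, a_max)
          velocity = np.clip(prev_velocity + accel * dt, -v_max, v_max)
          return np.clip(current + velocity * dt, 0.0, 1.0)   # keep parameters normalized

      p = step_parameters(np.array([0.2]), np.array([0.1]), np.array([1.0]), dt=1 / 30)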
  • Patent number: 8744214
    Abstract: Over the past few years there has been a dramatic proliferation of digital cameras, and it has become increasingly easy to share large numbers of photographs with many other people. These trends have contributed to the availability of large databases of photographs. Effectively organizing, browsing, and visualizing such "seas" of images, as well as finding a particular image, can be difficult tasks. In this paper, we demonstrate that knowledge of where images were taken and where they were pointed makes it possible to visualize large sets of photographs in powerful, intuitive new ways. We present and evaluate a set of novel tools that use location and orientation information, derived semi-automatically using structure from motion, to enhance the experience of exploring such large collections of images.
    Type: Grant
    Filed: May 21, 2013
    Date of Patent: June 3, 2014
    Assignees: Microsoft Corporation, University of Washington
    Inventors: Keith Noah Snavely, Steven Maxwell Seitz, Richard Szeliski
  • Patent number: 8744627
    Abstract: A system of distributed control of an interactive animatronic show includes a plurality of animatronic actors, at least one of the actors having a processor and one or more motors controlled by the processor. The system also includes a network interconnecting each of the actors, and a plurality of sensors providing messages to the network, where the messages are indicative of processed information. Each processor executes software that schedules and/or coordinates an action of the actor corresponding to the processor in accordance with the sensor messages representative of attributes of an audience viewing the show and the readiness of the corresponding actor. Actions of the corresponding actor can include animation movements of the actor, responding to another actor and/or responding to a member of the audience. The actions can result in movement of at least a component of the actor caused by control of the motor.
    Type: Grant
    Filed: September 22, 2011
    Date of Patent: June 3, 2014
    Assignee: Disney Enterprises, Inc.
    Inventor: Alexis Paul Wieland
  • Patent number: 8730245
    Abstract: In a method of defining an animation of a virtual object, during which values for attributes of the virtual object are updated at each of a series of time points, a user specifies a structure representing the update that includes a plurality of items and one or more connections between respective items. Each item represents a respective operation. Each connection represents that data output by the operation represented by one item is input to the operation represented by the connected item. The user specifies that the structure comprises one or more items in a predetermined category associated with a predetermined process that may be executed at most a predetermined number of times at each time point. An item belongs to the predetermined category if performing the respective operation represented by that item requires execution of the predetermined process. One or more rules are applied.
    Type: Grant
    Filed: August 20, 2009
    Date of Patent: May 20, 2014
    Assignee: NaturalMotion Ltd.
    Inventors: Thomas Lowe, Danny Chapman, Timothy Daoust, James Brewster
  • Patent number: 8726168
    Abstract: A system and method hides latency in the display of a subsequent user interface by animating the exit of the current user interface and animating the entrance of the subsequent user interface, causing continuity in the display of the two user interfaces. During either or both animations, information used to produce the user interface, animation of the entrance of the subsequent user interface, or both may be retrieved or processed or other actions may be performed.
    Type: Grant
    Filed: December 5, 2005
    Date of Patent: May 13, 2014
    Assignee: Adobe Systems Incorporated
    Inventor: Andrew Borovsky
  • Patent number: 8711151
    Abstract: A hair pipeline utilizes a surface definition module to define a surface and a control hair, and a hair motion compositor module combines different control hair curve shapes associated with the control hair and the surface. In particular, the hair motion compositor module generates a static node defining a static control hair curve shape; generates an animation node defining an animation control hair curve shape; and combines the static control hair curve shape of the static node with the animation control hair curve shape of the animation node to produce a resultant control hair curve shape for the control hair.
    Type: Grant
    Filed: May 11, 2007
    Date of Patent: April 29, 2014
    Assignees: Sony Corporation, Sony Pictures Entertainment Inc.
    Inventors: Armin Walter Bruderlin, Francois Chardavoine, Clint Chua, Gustav Melich
  • Patent number: 8711178
    Abstract: A method for generating an animated morph between a first image and a second image is provided. The method may include: (i) reading a first set of cephalometric landmark points associated with the first image; (ii) reading a second set of cephalometric landmark points associated with the second image; (iii) defining a first set of line segments by defining a line segment between each of the first set of cephalometric landmarks; (iv) defining a second set of line segments by defining a line segment between each of the second set of cephalometric landmarks such that each line segment of the second set of line segments corresponds to a corresponding line segment of the first set of line segments; and (v) generating an animation progressively warping the first image to the second image based at least on the first set of line segments and the second set of line segments.
    Type: Grant
    Filed: May 19, 2011
    Date of Patent: April 29, 2014
    Assignee: Dolphin Imaging Systems, LLC
    Inventor: Emilio David Cortés Provencio
  • Patent number: 8707151
    Abstract: A user interface method and apparatus for a Rich Media service in a terminal. A decoder decodes a received stream to check a header of the received stream. A renderer adaptively composes a scene using scene composition elements of the received stream, according to adaptation information in the header checked by the decoder, and a display displays the adaptively composed scene.
    Type: Grant
    Filed: April 21, 2009
    Date of Patent: April 22, 2014
    Assignee: Samsung Electronics Co., Ltd
    Inventors: Seo-Young Hwang, Jae-Yeon Song, Kook-Heui Lee
  • Patent number: 8704828
    Abstract: A model is associated with a deep pose. When the model is changed from an attractor pose to a current pose, the current pose and the attractor pose are compared with the deep pose. If any portion of the current pose is more similar to the deep pose than the attractor pose, then the attractor pose is updated. A portion of the attractor pose may be set to the corresponding portion of the current pose. The attractor pose may be modified by a function. Pose attributes of each pose degree of freedom for the attractor pose, the current pose, and the deep pose may be evaluated to potentially modify all or a portion of the attractor pose. The attractor pose and pose constraints are used to determine a pose of the model, for example by an optimization process based on the attractor pose while satisfying pose constraints.
    Type: Grant
    Filed: October 23, 2008
    Date of Patent: April 22, 2014
    Assignee: Pixar
    Inventors: Andrew Witkin, Michael Kass, Hayley Iben
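    A small sketch of the per-degree-of-freedom attractor update the abstract above describes: wherever the current pose is closer to the deep pose than the attractor is, the attractor takes the current value. The scalar-per-DOF pose representation and absolute-difference metric are assumptions.

      def update_attractor(attractor, current, deep):
          """Each pose: dict mapping degree-of-freedom name -> scalar value."""
          updated = dict(attractor)
          for dof, deep_value in deep.items():
              if abs(current[dof] - deep_value) < abs(attractor[dof] - deep_value):
                  updated[dof] = current[dof]     # current pose is nearer the deep pose
          return updated

      attractor = update_attractor(
          {"elbow": 0.9, "knee": 0.2}, {"elbow": 0.4, "knee": 0.8}, {"elbow": 0.0, "knee": 1.0})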
  • Publication number: 20140098108
    Abstract: Dynamic icons are described that can employ animations, such as visual effects, audio, and other content that change with time. If multiple animations are scheduled to occur simultaneously, the timing of the animations can be controlled so that timing overlap of the animations is reduced. For example, the starting times of the animations can be staggered so that multiple animations are not initiated too close in time. It has been found that too much motion in the user interface can be distracting and cause confusion amongst users.
    Type: Application
    Filed: October 21, 2013
    Publication date: April 10, 2014
    Applicant: MICROSOFT CORPORATION
    Inventors: Jeffrey Cheng-Yao Fong, Jeffery G. Arnold, Christopher A. Glein
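    A short sketch of staggering simultaneous icon animations so that no two start closer together than a minimum gap, reducing timing overlap as the abstract above describes. The gap value is illustrative.

      def stagger_start_times(requested_starts, min_gap=0.25):
          """requested_starts: list of requested start times in seconds."""
          scheduled = []
          for start in sorted(requested_starts):
              if scheduled and start - scheduled[-1] < min_gap:
                  start = scheduled[-1] + min_gap     # push back to reduce timing overlap
              scheduled.append(start)
          return scheduled

      print(stagger_start_times([0.0, 0.0, 0.1, 1.0]))   # -> [0.0, 0.25, 0.5, 1.0]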
  • Patent number: 8692831
    Abstract: Provided is a parallel operation processing apparatus and method. The parallel operation processing apparatus and method may generate an interpolated matrix with respect to a character included in each of a current frame and a next frame using a matrix corresponding to each of the current frame and the next frame, generated based on joint information corresponding to a plurality of joints included in the character. Also, the parallel operation processing apparatus and method may display an interpolated frame using the interpolated matrix.
    Type: Grant
    Filed: June 28, 2010
    Date of Patent: April 8, 2014
    Assignees: Samsung Electronics Co., Ltd., Korea University of Technology and Education Industry-University Cooperation Foundation
    Inventors: Hyung Min Yoon, Oh Young Kwon, Byung In Yoo, Chang Mug Lee, Hyo Seok Seo
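    A numpy sketch of building an interpolated frame's joint matrices from the current and next frames, as outlined above. A plain element-wise lerp is used for brevity; production code would normally interpolate decomposed rotations and translations instead.

      import numpy as np

      def interpolated_joint_matrices(current, next_frame, t):
          """current, next_frame: (num_joints, 4, 4) arrays; t in [0, 1]."""
          return (1.0 - t) * current + t * next_frame

      current = np.repeat(np.eye(4)[None, :, :], 3, axis=0)   # 3 joints at identity
      next_frame = current.copy()
      next_frame[:, 0, 3] = 1.0                               # translate +1 in x
      mid = interpolated_joint_matrices(current, next_frame, 0.5)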
  • Patent number: 8687006
    Abstract: A display device includes a display panel having pixels and divided into first and second display regions; first and second image interpolation chips which receive an original image signal and output interpolated ¼, ½, and/or ¾ frames inserted between a previous (n−1)-th frame and a current n-th frame of the original image signal; a first timing unit which receives the interpolated ¼, ½, and/or ¾ frames from the first image interpolation chip and outputs a first quadruple-speed image signal to pixels in the first display region; and a second timing unit which receives the interpolated ¼, ½, and/or ¾ frames from the second image interpolation chip and outputs a second quadruple-speed image signal to pixels in the second display region. The first timing unit transmits data to the second timing unit, and the second timing unit transmits data to the first timing unit.
    Type: Grant
    Filed: June 25, 2009
    Date of Patent: April 1, 2014
    Assignee: Samsung Display Co., Ltd.
    Inventors: Dong-Won Park, Sang-Soo Kim
  • Patent number: 8683429
    Abstract: Methods for runtime control of hierarchical objects are provided. Certain embodiments provide kinematics procedures in a media content runtime environment. Making these procedures available in the runtime environment allows the variables of the kinematics procedures to be specified at runtime, for example by the end user or by a runtime-executed script. One exemplary method comprises receiving a hierarchical object for a piece of media in a media content authoring environment and providing the piece of media to one or more runtime environments. The piece of media provided to the runtime environments comprises both object information about the hierarchical object and kinematics procedural information for performing kinematics on the hierarchical object, such as procedural classes for performing inverse kinematics procedures based on runtime-provided end-effector and target point variables.
    Type: Grant
    Filed: August 25, 2008
    Date of Patent: March 25, 2014
    Assignee: Adobe Systems Incorporated
    Inventor: Eric J. Mueller
  • Patent number: 8674998
    Abstract: The present disclosure includes, among other things, systems, methods and program products for generating animation keyframes and a corresponding 3D animation sequence from a plurality of 2D images.
    Type: Grant
    Filed: August 29, 2008
    Date of Patent: March 18, 2014
    Assignee: Lucasfilm Entertainment Company Ltd.
    Inventors: Adam Schnitzer, Steve Sullivan
  • Patent number: 8665278
    Abstract: Architecture that enhances the visual experience of a slide presentation by animating slide content as "actors" in the same background "scene". This is provided by multi-layered transitions between slides, where a slide is first separated into "layers" (e.g., with a level of transparency). Each layer can then be transitioned independently. All layers are composited together to accomplish the end effect. The layers can comprise one or more content layers, and a background layer. The background layer can further be separated into a background graphics layer and a background fill layer. The transition phase can include a transition effect such as a fade, a wipe, a dissolve effect, and other desired effects. To provide continuity and uniformity of presentation of the content on the same background scene, a transition effect is not applied to the background layer.
    Type: Grant
    Filed: November 23, 2012
    Date of Patent: March 4, 2014
    Assignee: Microsoft Corporation
    Inventors: Jason Zhao, Mark Pearson, Peter Lai
  • Patent number: 8659596
    Abstract: Systems and methods for automatically generating animation-ready 3D character models based upon model parameter and clothing selections are described. One embodiment of the invention includes an application server configured to receive the user defined model parameters and the clothing selection via a user interface.
    Type: Grant
    Filed: November 24, 2009
    Date of Patent: February 25, 2014
    Assignee: Mixamo, Inc.
    Inventors: Stefano Corazza, Emiliano Gambaretto
  • Patent number: 8659606
    Abstract: A computer-implemented method includes identifying a representation of a feature of an animated character by inverting a skinned representation of the feature in one position. The inversion includes a non-linear inversion of the skinned representation of the feature. The method also includes skinning the identified feature representation to produce the animated character in another position.
    Type: Grant
    Filed: March 14, 2013
    Date of Patent: February 25, 2014
    Assignee: LucasFilm Entertainment Company Ltd.
    Inventors: Frederic P. Pighin, Cary Phillips
  • Patent number: 8659623
    Abstract: Providing a three dimensional (3D) virtual world wormhole includes hosting a 3D virtual world, and creating a wormhole at a selected location in the 3D virtual world for automatic transport of an avatar from the selected location to a selected destination in the 3D virtual world. Policies may be defined for the wormhole where the wormhole operates in accordance with the defined policies. An avatar may be automatically transported from the location to any one of a plurality of destinations based on a current capacity of each of the plurality of destinations.
    Type: Grant
    Filed: April 25, 2008
    Date of Patent: February 25, 2014
    Assignee: International Business Machines Corporation
    Inventors: William B. Nicol, II, Brian R. Bokor, Andrew B. Smith, Daniel E. House, Peter F. Haggar
  • Patent number: 8654251
    Abstract: Problem. Current systems for generating synchronized video sequences from multiple video sources are too complex, too restrictive, too cumbersome or too imprecise. There are also few practical ways of faithfully transmitting highly synchronized multivideo between video sources and video sinks. Solution. A system and a method for ensuring precise synchronization of high speed stereovision or multivision systems is disclosed. This method involves the use of matched video sources such as cameras or image sensors which are subjected to a common clock as well as identical operating conditions thereby guaranteeing an identical internal state and synchronized output timing behaviour without relying on the provision of a frame reset pulse or a line reset pulse generated by any one of the video sources in order to guarantee synchronization. This avoids the delays associated with the transmission of such pulses and hence allows much higher frame rates to be achieved.
    Type: Grant
    Filed: September 7, 2009
    Date of Patent: February 18, 2014
    Assignee: University of Malta
    Inventor: Marc Anthony Azzopardi
  • Patent number: 8654130
    Abstract: An animation wireframe is modified with three-dimensional (3D) range and color data having a corresponding shape surface. The animation wireframe is vertically scaled based on distances between consecutive features within the 3D range and color data and corresponding distances within the generic animation wireframe. For each animation wireframe point, the location of the animation wireframe point is adjusted to coincide with a point on the shape surface. The shape surface point lies along a scaling line connecting the animation wireframe point, the shape surface point and an origin point. The scaling line is within a horizontal plane.
    Type: Grant
    Filed: June 6, 2011
    Date of Patent: February 18, 2014
    Assignee: Rakuten, Inc.
    Inventor: Joern Ostermann
  • Patent number: 8648866
    Abstract: A facial animation production method for producing 3-dimensional (3D) facial animation data in response to input video data includes the following steps. First, data positioning and character sorting processes are performed on the input video data to acquire first-layer character data, for indicating multiple first-layer character points, and first-layer model data. Next, first-layer model outline data and first-layer character outline data are respectively obtained according to the first-layer model data and the first-layer character data. Then, the first-layer character outline data is compared with the first-layer model outline data to judge whether a judgment condition is satisfied. If not, output character data are produced according to the first-layer character data, and fundamental facial-mesh transformation data are thus produced. Thereafter, the 3D facial animation data are displayed according to the fundamental facial-mesh transformation data.
    Type: Grant
    Filed: July 7, 2010
    Date of Patent: February 11, 2014
    Assignee: Industrial Technology Research Institute
    Inventors: Wen-Hung Ting, Chen-Lan Yen, Wen-Liang Chi, Duan-Li Liao
  • Patent number: 8629875
    Abstract: Methods and apparatus for animating images using bidirectional constraints are described.
    Type: Grant
    Filed: November 9, 2010
    Date of Patent: January 14, 2014
    Assignee: QUALCOMM Incorporated
    Inventors: Rachid El Guerrab, Andi Terrence Smithers, Baback Elmieh
  • Patent number: 8624904
    Abstract: A system includes a computer system capable of representing one or more animated characters. The computer system includes a blendshape manager that combines multiple blendshapes to produce the animated character. The computer system also includes an expression manager to respectively adjust one or more control parameters associated with each of the plurality of blendshapes for adjusting an expression of the animated character. The computer system also includes a corrective element manager that applies one or more corrective elements to the combined blendshapes based upon at least one of the control parameters. The one or more applied corrective elements are adjustable based upon one or more of the control parameters absent the introduction of one or more additional control parameters.
    Type: Grant
    Filed: June 22, 2012
    Date of Patent: January 7, 2014
    Assignee: Lucasfilm Entertainment Company Ltd.
    Inventors: Michael Koperwas, Frederic P. Pighin, Cary Phillips, Steve Sullivan, Eduardo Hueso
  • Patent number: 8620143
    Abstract: An image processing apparatus, method, and program and program storage medium that enable easy search for a desired part. A plurality of video data are created from video data and are displayed, each as a motion picture, at time intervals in a display order on a plurality of display areas on a display screen.
    Type: Grant
    Filed: April 8, 2010
    Date of Patent: December 31, 2013
    Assignee: Sony Corporation
    Inventor: Junichi Ogikubo
  • Patent number: 8610828
    Abstract: The present invention provides a moving picture reproduction apparatus including a frequency change unit for changing a display frequency to a frequency of a moving picture when a reproduction of the moving picture starts and for changing the display frequency to a frequency prior to the start of the reproduction of the moving picture when the reproduction of the moving picture stops, and a reproduction unit for stopping the reproduction of the moving picture when the frequency change unit changes the display frequency, wherein the frequency change unit maintains the display frequency at the frequency of the moving picture in a case where the reproduction unit stops the reproduction of the moving picture in response to the change of the display frequency by the frequency change unit.
    Type: Grant
    Filed: August 4, 2009
    Date of Patent: December 17, 2013
    Assignee: Sony Corporation
    Inventors: Naoshi Koizumi, Fukukyo Sudo, Daisuke Kurosaki
  • Patent number: 8610713
    Abstract: In general, one or more aspects of the subject matter described in this specification can include associating with each clip in a sequence of one or more clips a copy of a three dimensional (3D) scene that was used to create the clip, where the clip is a sequence of one or more images that depict the clip's respective 3D scene from the perspective of one or more virtual cameras. Input identifying a clip in the sequence is received. In response to the receiving, a copy of the identified clip's associated copy of the 3D scene is presented in an editor.
    Type: Grant
    Filed: June 22, 2012
    Date of Patent: December 17, 2013
    Assignee: Lucasfilm Entertainment Company Ltd.
    Inventors: Steve Sullivan, Max S-Han Chen, Jeffery Bruce Yost
  • Patent number: RE44717
    Abstract: There is provided an edge detecting method, which is capable of preventing a noise influence caused by an imaging device and color interpolation. The edge detecting method includes the steps of: setting a first kernel based on a center pixel in pixel data arranged in a mosaic structure; setting a second kernel based on the center pixel within the first kernel; detecting whether a pixel having a green value in the second kernel is a defective pixel, and correcting the pixel; converting all pixels of the second kernel into pixels having a green value; calculating a slope value by using a mask for detecting an edge in the second kernel; and detecting an edge by adding the slope value to a luminance value obtained by a color space conversion.
    Type: Grant
    Filed: April 29, 2010
    Date of Patent: January 21, 2014
    Assignee: Intellectual Ventures II LLC
    Inventors: Dong-Seob Song, Hyun-Joo Ahn