Temporal Interpolation Or Processing Patents (Class 345/475)
  • Patent number: 8339392
    Abstract: A computer implemented method, a computer program product, and a data processing system manage regions within a virtual universe. A current location of an avatar is identified within a virtual universe, the current location being within a currently populated region. A vectored movement of the avatar is identified. Any adjacent region that may probabilistically encounter a horizon of the avatar within a region activation time of the virtual universe is identified. If the adjacent region is deactivated, then the adjacent region is activated. An unpopulated region that is currently active is identified, wherein the unpopulated region is within an extended distance from the avatar's current location. The unpopulated region is then deactivated.
    Type: Grant
    Filed: September 30, 2008
    Date of Patent: December 25, 2012
    Assignee: International Business Machines Corporation
    Inventors: Boas Betzler, Sean L. Dague, Peter George Finn
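    Illustrative sketch (not from the patent): a minimal Python pass over regions that activates any region the avatar's movement horizon could reach within the region activation time and deactivates active, unpopulated regions beyond an extended distance. The function and parameter names (update_regions, far_distance) are assumed for illustration.

      import math

      def update_regions(regions, active, avatar_pos, velocity, horizon,
                         activation_time, far_distance, populated):
          """Activate any region the avatar's horizon may reach within the region
          activation time; deactivate active, unpopulated regions beyond far_distance."""
          speed = math.hypot(*velocity)
          reach = horizon + speed * activation_time
          for name, center in regions.items():
              d = math.dist(avatar_pos, center)
              if d <= reach:
                  active.add(name)                  # activate ahead of the avatar
              elif name in active and name not in populated and d > far_distance:
                  active.discard(name)              # reclaim resources
          return active

      regions = {"A": (0, 0), "B": (50, 0), "C": (200, 0)}
      print(update_regions(regions, {"A", "C"}, avatar_pos=(0, 0), velocity=(10, 0),
                           horizon=20, activation_time=3, far_distance=120,
                           populated={"A"}))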
  • Patent number: 8339403
    Abstract: Architecture that enhances the visual experience of a slide presentation by animating slide content as “actors” in the same background “scene”. This is provided by multi-layered transitions between slides, where a slide is first separated into “layers” (e.g., with a level of transparency). Each layer can then be transitioned independently. All layers are composited together to accomplish the end effect. The layers can comprise one or more content layers, and a background layer. The background layer can further be separated into a background graphics layer and a background fill layer. The transition phase can include a transition effect such as a fade, a wipe, a dissolve effect, and other desired effects. To provide continuity and uniformity of the content's presentation on the same background scene, a transition effect is not applied to the background layer.
    Type: Grant
    Filed: April 16, 2008
    Date of Patent: December 25, 2012
    Assignee: Microsoft Corporation
    Inventors: Jason Zhao, Mark Pearson, Peter Lai
  • Patent number: 8334872
    Abstract: A method for a computer system comprising receiving a displacement for a first object model surface from a user determined in response to a first physical motion captured pose, determining a weighted combination of a first displacement group and a second displacement group from the displacement, wherein the first displacement group is determined from displacements between the first object model surface and a second object model surface, wherein the second object model surface is determined from a second physical motion captured pose, wherein the second displacement group is determined from displacements between the first object model surface and a third object model surface, wherein the third object model surface is determined from a third physical motion captured pose, determining a fourth object model surface from the first object model surface and the weighted combination, and displaying the fourth object model surface to the user on a display.
    Type: Grant
    Filed: June 30, 2009
    Date of Patent: December 18, 2012
    Assignee: Two Pic MC LLC
    Inventors: Doug Epps, Nate Reid
  • Patent number: 8334871
    Abstract: The present invention provides a computer implemented method and apparatus to project a projected avatar associated with an avatar in a virtual universe. A computer receives a command to project the avatar, the command having a projection point. The computer transmits a request to place a projected avatar at the projection point to a virtual universe host. The computer renders a tab associated with the projected avatar.
    Type: Grant
    Filed: June 6, 2011
    Date of Patent: December 18, 2012
    Assignee: International Business Machines Corporation
    Inventors: Rick Allen Hamilton, II, Brian Marshall O'Connell, Clifford Alan Pickover, Keith Raymond Walker
  • Patent number: 8319778
    Abstract: Variable motion blur is created by varying the evaluation time used to determine the poses of objects according to motion blur parameters when evaluating a blur frame. A blur parameter can be associated with one or more objects, portions of objects, or animation variables. The animation system modifies the time of the blur frame by a function including the blur parameter to determine poses of objects or portions thereof associated with the blur parameter in a blur frame. The animation system determines the values of animation variables at their modified times, rather than at the time of the blur frame, and poses objects or portions thereof accordingly. Multiple blur parameters can be used to evaluate the poses of different portions of a scene at different times for a blur frame. Portions of an object can be associated with different blur parameters, enabling motion blur to be varied within an object.
    Type: Grant
    Filed: January 31, 2008
    Date of Patent: November 27, 2012
    Assignee: Pixar
    Inventors: Rick Sayre, Martin Reddy, Peter Bernard Demoreuille
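    Illustrative sketch (not from the patent): a small Python example of evaluating animation variables at a time modified by a per-variable blur parameter when posing a blur frame; variables without a blur parameter are sampled at the frame time itself. The names (pose_at_blur_frame, blur_params) are assumed.

      def pose_at_blur_frame(anim_vars, frame_time, blur_params):
          """Evaluate each animation variable at a time shifted by its blur parameter."""
          pose = {}
          for name, var in anim_vars.items():
              blur = blur_params.get(name, 0.0)     # 0.0 means no extra blur
              eval_time = frame_time + blur         # modified evaluation time
              pose[name] = var(eval_time)           # sample the curve at that time
          return pose

      # Usage: the arm gets extra blur by being sampled 0.25 frames later than the hip.
      anim_vars = {"hip_x": lambda t: 0.1 * t, "arm_rot": lambda t: 30.0 * t}
      print(pose_at_blur_frame(anim_vars, frame_time=10.0,
                               blur_params={"arm_rot": 0.25}))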
  • Patent number: 8319779
    Abstract: A game developer can “tag” an item in the game environment. When an animated character walks near the “tagged” item, the animation engine can cause the character's head to turn toward the item, and mathematically computes what needs to be done in order to make the action look real and normal. The tag can also be modified to elicit an emotional response from the character. For example, a tagged enemy can cause fear, while a tagged inanimate object may cause only indifference or indifferent interest.
    Type: Grant
    Filed: March 30, 2011
    Date of Patent: November 27, 2012
    Assignee: Nintendo of America, Inc.
    Inventors: Henry Sterchi, Jeff Kalles, Shigeru Miyamoto, Denis Dyack, Carey Murray
  • Patent number: 8319777
    Abstract: A character display attracts user interest by increasing the variety of the on-screen display while reducing data processing, by making the time variation of posture common among a plurality of characters.
    Type: Grant
    Filed: May 16, 2008
    Date of Patent: November 27, 2012
    Assignee: Konami Digital Entertainment Co., Ltd.
    Inventor: Yuichi Asami
  • Patent number: 8314801
    Abstract: Embodiments described herein are directed to automatically generating an animation for a transition between a current state and a new state. In one embodiment, a computer system accesses state properties of a visual element corresponding to a current state the visual element is in and a new state the visual element is to be transitioned to. The state properties include visual properties and transition description information. The computer system determines the differences between the visual properties of the current state and the new state and automatically generates an animation based on the determined differences between the visual properties for the current state and the new state, such that the animation is playable to transition the visual element from the current state to the new state.
    Type: Grant
    Filed: February 29, 2008
    Date of Patent: November 20, 2012
    Assignee: Microsoft Corporation
    Inventors: Kenneth L. Young, Steven Charles White, Christian B. Schormann
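    Illustrative sketch (not from the patent): a minimal Python routine that diffs the visual properties of the current and new states and emits a per-property tween for each difference, which is the gist of automatically generating the transition animation. The flat property dictionaries and names are assumptions.

      def generate_transition(current_props, new_props, duration=0.3):
          """Build an animation as a list of per-property tweens for properties that differ."""
          animation = []
          for prop, new_value in new_props.items():
              old_value = current_props.get(prop)
              if old_value != new_value:
                  animation.append({"property": prop,
                                    "from": old_value,
                                    "to": new_value,
                                    "duration": duration})
          return animation

      print(generate_transition({"opacity": 1.0, "x": 0}, {"opacity": 0.5, "x": 120}))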
  • Patent number: 8311731
    Abstract: A robot is provided with a motion control unit that avoids collision between segments of the robot or between segments of the robot and other objects. The motion control unit of the robot comprises a distance computing module, a whole body control module, a collision avoidance module, and a blending control unit. The distance computing module calculates two closest points of different segments of the robot connected to each other via at least one joint or a segment of the robot and another object. The collision avoidance module is provided with the information about the two closest points. The blending control unit combines the weighted output control signals of the whole body control module and the collision avoidance control module. The weight of the whole body control output signal is higher when the risk of collision is lower. The weight of the collision avoidance control output signal is higher when the risk of collision is higher.
    Type: Grant
    Filed: March 20, 2008
    Date of Patent: November 13, 2012
    Assignee: Honda Research Institute Europe GmbH
    Inventors: Hisashi Sugiura, Herbert Janssen, Michael Gienger
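    Illustrative sketch (not from the patent): a small Python blend of a whole-body control command and a collision-avoidance command, weighted by a collision risk that grows as the closest-point distance shrinks. The linear risk function and the names (blend_control, d_safe) are assumed.

      import numpy as np

      def blend_control(u_body, u_avoid, distance, d_safe=0.5):
          """Blend whole-body and collision-avoidance commands by collision risk.
          Risk rises as the closest-point distance falls below d_safe."""
          risk = np.clip(1.0 - distance / d_safe, 0.0, 1.0)
          return (1.0 - risk) * np.asarray(u_body) + risk * np.asarray(u_avoid)

      # Far apart: whole-body control dominates.  Nearly touching: avoidance dominates.
      print(blend_control([1.0, 0.0], [0.0, 1.0], distance=0.45))
      print(blend_control([1.0, 0.0], [0.0, 1.0], distance=0.05))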
  • Patent number: 8305379
    Abstract: In accordance with one or more embodiments, a method and system of managing animation data and related control data for recording on an enhanced navigation medium is provided. The method comprises constructing animation data comprising first image data into a first graphic MNG file in chunk data format, wherein the first graphic file comprises a first header portion, a second end portion, first control data and a frame containing additional data; and recording the first graphic file on an enhanced navigation medium.
    Type: Grant
    Filed: August 29, 2007
    Date of Patent: November 6, 2012
    Assignee: LG Electronics Inc.
    Inventors: Woo Seong Yoon, Jea Yong Yoo, Limoniv Alexandre, Byung Jin Kim
  • Patent number: 8300054
    Abstract: In accordance with one or more embodiments, a method of managing animation data and related control data for recording on an enhanced navigation medium is provided. The method comprises constructing animation data comprising first image data into a first graphic MNG file in chunk data format, wherein the first graphic file comprises a first header portion, a second end portion, first control data and a frame containing additional data; and recording the first graphic file on an enhanced navigation medium.
    Type: Grant
    Filed: August 29, 2007
    Date of Patent: October 30, 2012
    Assignee: LG Electronics Inc.
    Inventors: Woo Seong Yoon, Jea Yong Yoo, Limoniv Alexandre, Byung Jin Kim
  • Patent number: 8300042
    Abstract: An interactive video display system uses strobing light to allow easy and unencumbered interaction between a person and projected video images. A camera may be synchronized to the strobing light and the strobing light may be in an infrared spectral range. A processing system detects images of a human in the field of view of the camera and controls the display so that the projected video images are changed based on an interaction between the human and the projected video images. The system may project the video images on a surface around the person and may move the projected video images on the surface based on a movement of the person. The projected video images may include computer-generated characters and/or virtual objects that react to the movement of a person.
    Type: Grant
    Filed: October 31, 2007
    Date of Patent: October 30, 2012
    Assignee: Microsoft Corporation
    Inventor: Matthew Bell
  • Patent number: 8284202
    Abstract: A method for a computer system includes receiving global positional data associated with a set of markers from a plurality of markers associated with a surface of an object at one or more time instances, wherein global positional data associated with a first marker from the plurality of markers is absent from a first time instance, using local statistical methods to determine global positional data associated with the first marker at the first time instance in response to the global positional data associated with the set of markers at the one or more time instances, and determining a model of the object in response to the global positional data associated with the set of markers and the global positional data associated with the first marker.
    Type: Grant
    Filed: July 2, 2007
    Date of Patent: October 9, 2012
    Assignee: Two Pic MC LLC
    Inventors: Jessica K. Hodgins, Sang Il Park
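    Illustrative sketch (not from the patent): a much simplified Python stand-in for the local statistical gap filling, estimating a dropped marker from the average motion of neighboring markers between a frame where it was visible and the frame where it is missing. The marker layout and names are assumed.

      import numpy as np

      def fill_missing_marker(markers, missing_id, t_missing, t_ref, neighbor_ids):
          """Estimate a dropped marker at t_missing from the average displacement of
          its neighbors between a reference frame t_ref (where it was seen) and t_missing."""
          offsets = [markers[n][t_missing] - markers[n][t_ref] for n in neighbor_ids]
          return markers[missing_id][t_ref] + np.mean(offsets, axis=0)

      markers = {
          "A": {0: np.array([0.0, 0.0, 0.0]), 1: np.array([0.1, 0.0, 0.0])},
          "B": {0: np.array([1.0, 0.0, 0.0]), 1: np.array([1.1, 0.0, 0.0])},
          "C": {0: np.array([0.5, 1.0, 0.0])},   # marker C is missing at frame 1
      }
      print(fill_missing_marker(markers, "C", t_missing=1, t_ref=0,
                                neighbor_ids=["A", "B"]))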
  • Patent number: 8271204
    Abstract: Provided are a human recognition apparatus and a human recognition method identifying a user based on a walking pattern. The human recognition apparatus includes a detecting unit detecting a vibration according to a user's walking, and outputting an electric signal, a pattern calculating unit acquiring a walker's walking pattern from the electric signal, and a user determining unit comparing the walking pattern with previously measured reference data for each user and identifying the user based on the comparison result. The human recognition apparatus and the human recognition method are robust against peripheral noise and can increase the acceptance rate through a simple structure and procedure by using the walking pattern, which is one-dimensional time information requiring no vast data throughput, as the user identification data.
    Type: Grant
    Filed: January 30, 2009
    Date of Patent: September 18, 2012
    Assignee: Electronics and Telecommunications Research Institute
    Inventors: Seung Min Choi, Ji Ho Chang, Jae Il Cho, Dae Hwan Hwang, Jae Yeon Lee, Do Hyung Kim, Eul Gyoon Lim, Ho Chul Shin
  • Patent number: 8269778
    Abstract: This disclosure relates to computer-generated imagery (CGI) and computer-aided animation. More specifically, this disclosure relates to techniques for preserving the shape of simulated and dynamic objects for use in CGI and computer-aided animation.
    Type: Grant
    Filed: December 3, 2010
    Date of Patent: September 18, 2012
    Assignee: Pixar
    Inventors: David Baraff, Christine Waggoner
  • Patent number: 8259118
    Abstract: A client device receives a user interface event corresponding to a spline curve associated with an object displayed on a mobile device. The user interface triggers creation of a new spline curve based on computation of a new spline tangent associated with the spline curve and phase space-based dynamics of a new state. A scene graph having state information associated with the new state is maintained. A rendering event triggers repainting of the object associated with the spline curve using the scene graph.
    Type: Grant
    Filed: December 12, 2008
    Date of Patent: September 4, 2012
    Assignee: MobiTV, Inc.
    Inventors: James Roseborough, Ian Farmer
  • Patent number: 8253728
    Abstract: In general, one or more aspects of the subject matter described in this specification can include associating with each clip in a sequence of one or more clips a copy of a three dimensional (3D) scene that was used to create the clip, where the clip is a sequence of one or more images that depict the clip's respective 3D scene from the perspective of one or more virtual cameras. Input identifying a clip in the sequence is received. In response to the receiving, a copy of the identified clip's associated copy of the 3D scene is presented in an editor.
    Type: Grant
    Filed: February 25, 2008
    Date of Patent: August 28, 2012
    Assignee: Lucasfilm Entertainment Company Ltd.
    Inventors: Steve Sullivan, Max S-Han Chen, Jeffrey Bruce Yost
  • Patent number: 8253744
    Abstract: The system (13, 15) of the invention for virtually drawing on a physical surface comprises electronic circuitry, wherein the electronic circuitry is operative to detect movements of a physical object over the physical surface and project a drawing (11) corresponding to the movements on the physical surface. The computer program product of the invention enables a programmable device to function as the system of the invention.
    Type: Grant
    Filed: September 27, 2007
    Date of Patent: August 28, 2012
    Assignee: Koninklijke Philips Electronics N.V.
    Inventors: Marko Macura, Thomas Marzano, Kyriakos Mama
  • Patent number: 8243093
    Abstract: Aspects of the present invention relate to creation, modification and implementation of dither pattern structures applied to an image to diminish contouring artifacts. Some aspects relate to dither pattern structures with pixel values in a first color channel pattern that are spatially dispersed from pixel values in a corresponding pattern in a second color channel. Some aspects relate to application. Some aspects relate to systems and apparatus for creation and application of these dither pattern structures comprising pixel values dispersed across color channels.
    Type: Grant
    Filed: August 22, 2003
    Date of Patent: August 14, 2012
    Assignee: Sharp Laboratories of America, Inc.
    Inventors: Xiao-Fan Feng, Scott J. Daly
  • Patent number: 8243079
    Abstract: An event, such as a vertical blank interrupt or signal, received from a display adapter in a system is identified. Activation of a timer-driven animation routine that updates a state of an animation and activation of a paint controller module that identifies updates to the state of the animation and composes a frame that includes the updates to the state of the animation are aligned, both being activated based on the identified event in the system.
    Type: Grant
    Filed: September 14, 2010
    Date of Patent: August 14, 2012
    Assignee: Microsoft Corporation
    Inventors: Cenk Ergan, Benjamin C. Constable
  • Patent number: 8237720
    Abstract: Embodiments for shader-based finite state machine frame detection for implementing alternative graphical processing on an animation scenario are disclosed. In accordance with one embodiment, the embodiment includes assigning an identifier to each shader used to render animation scenarios. The embodiment also includes defining a finite state machine for a key frame in each of the animation scenarios, with each finite state machine representing a plurality of shaders that renders the key frame in each animation scenario. The embodiment further includes deriving a shader ID sequence for each finite state machine based on the identifier assigned to each shader. The embodiment additionally includes comparing an input shader ID sequence of a new frame of a new animation scenario to each derived shader ID sequence. Finally, the embodiment includes executing alternative graphics processing on the new animation scenario when the input shader ID sequence matches one of the derived shader ID sequences.
    Type: Grant
    Filed: February 12, 2009
    Date of Patent: August 7, 2012
    Assignee: Microsoft Corporation
    Inventors: Jinyu Li, Chen Li, Xin Tong
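    Illustrative sketch (not from the patent): a Python simplification that reduces each key frame's finite state machine to a tuple of assigned shader IDs and matches a new frame's shader ID sequence against the derived sequences to trigger alternative processing. The shader names and IDs are assumed.

      def shader_id_sequence(shaders, shader_ids):
          """Map the shaders used to render a frame to a tuple of assigned IDs."""
          return tuple(shader_ids[s] for s in shaders)

      shader_ids = {"skin": 1, "depth": 2, "bloom": 3, "ui": 4}
      known_scenarios = {
          shader_id_sequence(["depth", "skin", "bloom"], shader_ids): "cutscene_intro",
          shader_id_sequence(["depth", "ui"], shader_ids): "menu",
      }

      new_frame = shader_id_sequence(["depth", "skin", "bloom"], shader_ids)
      if new_frame in known_scenarios:
          print("apply alternative processing for", known_scenarios[new_frame])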
  • Patent number: 8232999
    Abstract: The surface of a body of water can be animated by deconstructing a master wave model into several layer models and then reconstructing the layer models to form an optimized wave model. A wave model is obtained, which describes the wave surfaces in a body of water. The wave model comprises a range of wave model frequencies over a given area. Primary, secondary, and tertiary layer models are constructed based on portions of the wave model frequencies. An optimized wave model is constructed by combining the primary, secondary, and tertiary layer models. A wave surface point location is determined within the given area. A wave height value is computed for the wave surface point location using the optimized wave model. The wave height value that is associated with the surface point location is stored.
    Type: Grant
    Filed: January 20, 2009
    Date of Patent: July 31, 2012
    Assignee: DreamWorks Animation LLC
    Inventor: Galen Gerald Gornowicz
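    Illustrative sketch (not from the patent): a tiny Python example that splits a master wave model, represented here as sinusoidal components, into primary, secondary, and tertiary frequency bands and sums the band contributions to get a wave height at a surface point. The sinusoidal representation and band cutoffs are assumptions.

      import math

      def band_height(x, components):
          """Height contribution from one layer model: a sum of sinusoidal components."""
          return sum(a * math.sin(k * x + phase) for a, k, phase in components)

      # Split a master wave model (amplitude, frequency, phase triples) into three bands.
      master = [(1.0, 0.2, 0.0), (0.4, 1.5, 1.0), (0.1, 8.0, 2.0)]
      primary   = [c for c in master if c[1] < 1.0]        # low-frequency swell
      secondary = [c for c in master if 1.0 <= c[1] < 4.0]
      tertiary  = [c for c in master if c[1] >= 4.0]       # high-frequency detail

      x = 3.0
      height = band_height(x, primary) + band_height(x, secondary) + band_height(x, tertiary)
      print(height)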
  • Patent number: 8234572
    Abstract: A media player can have advanced-playlist creation capabilities such as the ability to automatically generate a playlist around a “seed” song selected by a user. In some embodiments, the accessory can determine whether the media player can use a particular song as a seed song for an advanced playlist and can so inform the user. The user can then operate the accessory's user interface to create an advanced playlist based on a particular song, rather than having to interact directly with the media player.
    Type: Grant
    Filed: March 10, 2009
    Date of Patent: July 31, 2012
    Assignee: Apple Inc.
    Inventors: Shailesh Rathi, Lawrence G. Bolton
  • Patent number: 8223154
    Abstract: Systems and methods for integrating graphic animation technologies with fantasy sports contest applications are provided. This invention enables a fantasy sports contest application to depict plays in various sporting events using graphic animation. The fantasy sports contest application may combine graphical representation of real-life elements such as, for example, player facial features, with default elements such as, for example, a generic player body, to create realistic graphic video. The fantasy sports contest application may provide links to animated videos for depicting plays on contest screens in which information associated with the plays may be displayed. The fantasy sports contest application may play the animated video for a user in response to the user selecting such a link. In some embodiments of the present invention, the fantasy sports contest application may also customize animated video based on user-supplied setup information.
    Type: Grant
    Filed: December 10, 2010
    Date of Patent: July 17, 2012
    Assignee: Rovi Technologies Corporation
    Inventors: Patrick J. Hughes, David Barber
  • Patent number: 8223144
    Abstract: A parallelization permission and prohibition management unit of a processor manages the permission or prohibition of the parallelization for each combination of partial spaces in cooperation with another parallelization permission and prohibition management unit of a different processor. Specifically, when any given object is present across the boundary between a first partial space and a second partial space, the parallelization is prohibited between the collision process to be performed by any given processor on the virtual objects in the first partial space and the collision process to be performed by another processor on the virtual object in the second partial space.
    Type: Grant
    Filed: December 5, 2007
    Date of Patent: July 17, 2012
    Assignee: Sony Computer Entertainment Inc.
    Inventors: Tatsuya Ishiwata, Masahiro Yasue
  • Patent number: 8221237
    Abstract: A first sound volume calculation unit (251) obtains a length of a straight line connecting a sound emitting object and a sound detection object, and calculates a first sound volume attenuated from a predetermined reference sound volume in accordance with the length. A second volume calculation unit (252), in a case where on the straight line there is an other object that is an obstacle, calculates a second sound volume attenuated from the first sound volume by a predetermined ratio.
    Type: Grant
    Filed: March 12, 2007
    Date of Patent: July 17, 2012
    Assignee: Konami Digital Entertainment Co., Ltd.
    Inventor: Hiroyuki Nakayama
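    Illustrative sketch (not from the patent): a Python version of the two-stage volume calculation, attenuating a reference volume with the emitter-to-listener distance and then by a fixed ratio when an obstacle lies on the straight line between them. The attenuation curve and the obstruction_ratio value are assumptions.

      import math

      def heard_volume(emitter, listener, reference_volume=1.0,
                       obstacle_on_line=False, obstruction_ratio=0.5):
          """First volume: attenuate with distance.  Second volume: reduce further by a
          fixed ratio when an obstacle lies on the emitter-to-listener line."""
          distance = math.dist(emitter, listener)
          first = reference_volume / (1.0 + distance)      # assumed attenuation curve
          return first * obstruction_ratio if obstacle_on_line else first

      print(heard_volume((0, 0, 0), (3, 4, 0)))                        # clear line
      print(heard_volume((0, 0, 0), (3, 4, 0), obstacle_on_line=True)) # wall in the way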
  • Patent number: 8218909
    Abstract: A method for deformable registration of two digital images includes providing a pair of digital images, including a fixed image and a moving image, extracting a set of edge images from each image of the pair of images, each edge set being extracted at a different resolution, selecting a pair of edge images with a lowest resolution, determining a mapping from edge points of the fixed image to edge points of the moving image using a geodesic thin plate spline interpolation, applying the mapping to a next higher resolution edge point image of the moving image, selecting a pair of edge images at a next higher resolution, where a moving edge image is the moving edge image to which the mapping has been applied, repeating the steps at a next higher resolution for all edge images in the set of edge images, and applying the mapping to an entire moving image.
    Type: Grant
    Filed: August 26, 2008
    Date of Patent: July 10, 2012
    Assignee: Siemens Aktiengesellschaft
    Inventors: Ali Khamene, Fabrice Michel
  • Patent number: 8212823
    Abstract: A data path for a SIMD-based microprocessor is used to perform different simultaneous filter sub-operations in parallel data lanes of the SIMD-based microprocessor. Filter operations for sub-pixel interpolation are performed simultaneously on separate lanes of the SIMD processor's data path. Using a dedicated internal data path, precision higher than the native precision of the SIMD unit may be achieved. Through the data path according to this invention, a single instruction may be used to generate the value of two adjacent sub-pixels located diagonally with respect to integer pixel positions.
    Type: Grant
    Filed: September 28, 2006
    Date of Patent: July 3, 2012
    Assignee: Synopsys, Inc.
    Inventors: Carl Norman Graham, Kar-Lik Wong, Simon Jones, Aris Aristodemou
  • Patent number: 8208067
    Abstract: A method includes receiving a digital video segment simulating motion at one speed, the frames in the segment spaced at a uniform time interval. The method further includes receiving a desired speed of motion, determining the appropriate uniform time interval corresponding to the desired speed, and generating a frame sequence simulating the motion at the desired speed, the frames in the generated sequence spaced at the determined appropriate uniform time interval. If the generated frame sequence includes a frame from the original segment, then only frames from the original segment are included in the generated frame sequence, and if the generated frame sequence includes an interpolated frame then only interpolated frames are included in the generated frame sequence.
    Type: Grant
    Filed: July 11, 2007
    Date of Patent: June 26, 2012
    Assignee: Adobe Systems Incorporated
    Inventors: Harshdeep Singh, Samreen Dhillon
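    Illustrative sketch (not from the patent): a Python retiming routine that resamples a segment at a new uniform interval and, following the all-or-nothing rule in the abstract, returns only original frames when every output time lands on the original grid and only interpolated frames otherwise. Linear interpolation and scalar stand-in frame values are assumptions.

      def retime(frames, dt_in, dt_out, duration):
          """Resample frame times; use only original frames if every output time lands
          on the original grid, otherwise use only interpolated frames."""
          n_out = int(duration / dt_out) + 1
          out_times = [i * dt_out for i in range(n_out)]
          eps = 1e-9

          def on_grid(t):
              return abs((t / dt_in) - round(t / dt_in)) < eps

          if all(on_grid(t) for t in out_times):
              return [("original", frames[round(t / dt_in)]) for t in out_times]
          out = []
          for t in out_times:           # linear interpolation between neighbours
              i = int(t / dt_in)
              i2 = min(i + 1, len(frames) - 1)
              w = (t - i * dt_in) / dt_in
              out.append(("interpolated", (1 - w) * frames[i] + w * frames[i2]))
          return out

      frames = [0.0, 10.0, 20.0, 30.0]          # stand-in per-frame values
      print(retime(frames, dt_in=1.0, dt_out=0.5, duration=3.0))   # all interpolated
      print(retime(frames, dt_in=1.0, dt_out=1.0, duration=3.0))   # all original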
  • Patent number: 8207971
    Abstract: A system includes a computer system capable of representing one or more animated characters. The computer system includes a blendshape manager that combines multiple blendshapes to produce the animated character. The computer system also includes an expression manager to respectively adjust one or more control parameters associated with each of the plurality of blendshapes for adjusting an expression of the animated character. The computer system also includes a corrective element manager that applies one or more corrective elements to the combined blendshapes based upon at least one of the control parameters. The one or more applied corrective elements are adjustable based upon one or more of the control parameters absent the introduction of one or more additional control parameters.
    Type: Grant
    Filed: February 19, 2009
    Date of Patent: June 26, 2012
    Assignee: Lucasfilm Entertainment Company Ltd.
    Inventors: Michael Koperwas, Frederic P. Pighin, Cary Phillips, Steve Sullivan, Eduardo Hueso
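    Illustrative sketch (not from the patent): a small Python blendshape combiner in which a corrective element for a blendshape pair is weighted by the product of the pair's existing control parameters, so no additional control parameter is introduced. The tiny stand-in mesh and the product rule are assumptions.

      import numpy as np

      base = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])       # tiny stand-in mesh
      blendshapes = {
          "smile":    np.array([[0.0, 0.1], [0.2, 0.1], [0.0, 0.0]]),
          "jaw_open": np.array([[0.0, -0.2], [0.0, -0.3], [0.0, 0.0]]),
      }
      # The corrective element fixes the smile+jaw_open combination; its weight is
      # driven by the existing controls (their product), so no new parameter is added.
      corrective = {("smile", "jaw_open"): np.array([[0.0, 0.05], [-0.1, 0.05], [0.0, 0.0]])}

      def pose(weights):
          out = base.copy()
          for name, delta in blendshapes.items():
              out += weights.get(name, 0.0) * delta
          for (a, b), delta in corrective.items():
              out += weights.get(a, 0.0) * weights.get(b, 0.0) * delta
          return out

      print(pose({"smile": 1.0, "jaw_open": 0.5}))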
  • Patent number: 8207970
    Abstract: An information processing apparatus includes, for example, a touch panel placed over a display screen. For example, when a start of a touch input is detected, the first-touch-input coordinates are determined as object generation coordinates and an object is displayed in the coordinates. When an input direction based on coordinates continuously detected after the object generation coordinates until, for example, a touch-off is determined to be a predetermined direction, the object is moved in a specific direction. Alternatively, an input direction is determined based on coordinates continuously detected after the object generation coordinates until a touch-off, and then, based on the input direction, the direction opposite to the input direction, for example, is determined to be a moving direction and the object is moved in the moving direction.
    Type: Grant
    Filed: July 26, 2006
    Date of Patent: June 26, 2012
    Assignee: Nintendo Co., Ltd.
    Inventor: Hirofumi Matsuoka
  • Patent number: 8209612
    Abstract: Some embodiments provide a method of specifying speed effects for playing a video clip. The method defines a set of speed effects for the video clip. It then displays in real-time a presentation of the video clip that accounts for the set of speed effects defined for the video clip. In some embodiments, this method represents the playback speed of a video clip in terms of a graph that is part of a graphical user interface (“GUI”). This graph is defined along two axes, with one axis representing the playback time, and the other axis representing the content-time (i.e., the time within the video clip). In these embodiments, a user can change the playback speed of the video clip by using a set of GUI operations to select and modify the graph. For instance, a user can select and adjust the graph at different instances in time in order to change the playback speed of the video clip at these instances. Different embodiments use different types of graphs to represent playback speed.
    Type: Grant
    Filed: April 19, 2010
    Date of Patent: June 26, 2012
    Assignee: Apple Inc.
    Inventor: Gary Johnson
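    Illustrative sketch (not from the patent): a Python mapping from playback time to content time by linearly interpolating a user-editable speed graph of (playback time, content time) points, which is the core of graph-based speed effects. The point-list format and linear interpolation are assumptions.

      import bisect

      def content_time(playback_time, graph):
          """Map playback time to content time by interpolating the speed graph,
          a list of (playback_time, content_time) points the user drags in the GUI."""
          xs = [p for p, _ in graph]
          i = bisect.bisect_right(xs, playback_time) - 1
          i = max(0, min(i, len(graph) - 2))
          (p0, c0), (p1, c1) = graph[i], graph[i + 1]
          w = (playback_time - p0) / (p1 - p0)
          return c0 + w * (c1 - c0)

      # First second plays at normal speed; the next two seconds replay one second slowly.
      graph = [(0.0, 0.0), (1.0, 1.0), (3.0, 2.0)]
      print(content_time(0.5, graph), content_time(2.0, graph))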
  • Publication number: 20120159360
    Abstract: A timing function that distributes progressive start times of a series of target animations, or staggers animations, is disclosed. The timing function includes a set of selectable parameters that are used to create a customized staggering animation in the user interface. The set of selectable parameters includes a user interface geometry for each of the target animations.
    Type: Application
    Filed: December 17, 2010
    Publication date: June 21, 2012
    Applicant: MICROSOFT CORPORATION
    Inventors: Jesse Bishop, Ruurd Johan Boeke, Terry Adams
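    Illustrative sketch (not from the publication): a Python timing function that staggers the start of each target animation by a delay derived from the element's geometry, here its distance from an origin point in the user interface. The delay_per_unit and max_delay parameters are assumed.

      import math

      def stagger_delays(element_positions, origin=(0.0, 0.0),
                         delay_per_unit=0.02, max_delay=0.5):
          """Give each target animation a start delay that grows with the element's
          distance from an origin point in the user interface."""
          return [min(max_delay, delay_per_unit * math.dist(pos, origin))
                  for pos in element_positions]

      tiles = [(0, 0), (100, 0), (0, 100), (100, 100)]
      print(stagger_delays(tiles))     # tiles farther from the corner start later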
  • Patent number: 8203561
    Abstract: A computer implemented method, computer program product, and a data processing system determine an excursion corridor within a virtual universe. A time-stamped snapshot of a location of at least one avatar within the virtual universe is recorded. An avatar tracking data structure is then updated. The avatar tracking data structure provides a time-based history of avatar locations within the virtual universe. A weighted density map is generated. The weighted density map is then correlated with virtual object locations. Each virtual object location corresponds to a virtual object. Excursion corridors are identified; each excursion corridor identifies frequently taken routes between the virtual object locations. Waypoints are identified. Each waypoint corresponds to a virtual object. Each waypoint is an endpoint for one of the excursion corridors.
    Type: Grant
    Filed: September 10, 2008
    Date of Patent: June 19, 2012
    Assignee: International Business Machines Corporation
    Inventors: William S. Carter, Guido D. Corona
  • Patent number: 8199150
    Abstract: A system for controlling a rendering engine by using specialized commands. The commands are used to generate a production, such as a television show, at an end-user's computer that executes the rendering engine. In one embodiment, the commands are sent over a network, such as the Internet, to achieve broadcasts of video programs at very high compression and efficiency. Commands for setting and moving camera viewpoints, animating characters, and defining or controlling scenes and sounds are described. At a fine level of control, math models and coordinate systems can be used to make specifications. At a coarse level of control, the command language approaches the text format traditionally used in television or movie scripts. Simple names for objects within a scene are used to identify items, directions and paths. Commands are further simplified by having the rendering engine use defaults when specifications are left out.
    Type: Grant
    Filed: March 23, 2005
    Date of Patent: June 12, 2012
    Assignee: Quonsil PL. 3, LLC
    Inventor: Charles J. Kulas
  • Patent number: 8199151
    Abstract: A method of detecting an occurrence of an event of an event type during an animation, in which the animation comprises, for each of a plurality of object parts of an object, data defining the respective movement of that object part at each of a sequence of time-points for the animation, the method comprising: indicating the event type, wherein the event type specifies: one or more of the object parts; and a sequence of two or more event phases that occur during an event of that event type such that, for each event phase, the respective movements of the one or more specified object parts during that event phase are each constrained according to a constraint type associated with that event phase; and detecting an occurrence of an event of the event type by detecting a section of the animation during which the respective movements defined by the animation for the specified one or more object parts are constrained in accordance with the sequence of two or more event phases.
    Type: Grant
    Filed: February 13, 2009
    Date of Patent: June 12, 2012
    Assignee: Naturalmotion Ltd.
    Inventor: Nicholas MacDonald Spencer
  • Patent number: 8194079
    Abstract: A method is described to let animators control the extent by which kinematically scripted character motions affect dynamically simulated objects' motions. The dynamic objects are connected to the kinematic character, such as clothing or hair, and the motion of the dynamic objects is simulated based on the motion of the kinematic character. Such control is important to produce reasonable behavior of dynamic objects in the presence of physically unrealistic kinematic character motion. An Inertial Field Generator (IFG) is employed to compensate for the unreasonable behavior of dynamic objects when the kinematic character undergoes unrealistic motion.
    Type: Grant
    Filed: April 13, 2007
    Date of Patent: June 5, 2012
    Assignee: Pixar
    Inventors: David E. Baraff, Andrew Witkin
  • Patent number: 8194080
    Abstract: Among other disclosed subject matter, a computer-implemented method for generating a surface representation of an item includes identifying, for a point on an item in an animation process, at least first and second transformation points corresponding to respective first and second transformations of the point. Each of the first and second transformations represents an influence on a location of the point of respective first and second joints associated with the item. The method includes determining an axis for a cylindrical coordinate system using the first and second transformations. The method includes performing an interpolation of the first and second transformation points in the cylindrical coordinate system to obtain an interpolated point. The method includes recording the interpolated point in a surface representation of the item in the animation process.
    Type: Grant
    Filed: February 27, 2008
    Date of Patent: June 5, 2012
    Assignee: Lucasfilm Entertainment Company Ltd.
    Inventors: Jason Smith, Frederic P. Pighin, Cary Phillips
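    Illustrative sketch (not from the patent): a Python interpolation of two transformation points in a cylindrical coordinate system about a joint axis, so the blended point keeps its radius instead of collapsing toward the axis as a straight linear blend would. The axis construction and the choice of reference direction are assumptions.

      import numpy as np

      def to_cylindrical(p, origin, axis):
          """Cylindrical (r, theta, h) of point p about a line through origin along axis."""
          axis = axis / np.linalg.norm(axis)
          v = p - origin
          h = np.dot(v, axis)
          radial = v - h * axis
          r = np.linalg.norm(radial)
          ref = np.cross(axis, [0.0, 0.0, 1.0])            # reference direction for theta
          if np.linalg.norm(ref) < 1e-8:
              ref = np.cross(axis, [0.0, 1.0, 0.0])
          ref /= np.linalg.norm(ref)
          theta = np.arctan2(np.dot(np.cross(ref, radial), axis), np.dot(radial, ref))
          return r, theta, h, ref

      def from_cylindrical(r, theta, h, origin, axis, ref):
          axis = axis / np.linalg.norm(axis)
          radial = np.cos(theta) * ref + np.sin(theta) * np.cross(axis, ref)
          return origin + h * axis + r * radial

      def interpolate_cylindrical(p1, p2, w, origin, axis):
          """Blend two transformation points with weight w in cylindrical coordinates."""
          r1, t1, h1, ref = to_cylindrical(p1, origin, axis)
          r2, t2, h2, _ = to_cylindrical(p2, origin, axis)
          dt = (t2 - t1 + np.pi) % (2 * np.pi) - np.pi     # shortest angular path
          return from_cylindrical(r1 + w * (r2 - r1), t1 + w * dt,
                                  h1 + w * (h2 - h1), origin, axis, ref)

      origin, axis = np.array([0.0, 0.0, 0.0]), np.array([0.0, 0.0, 1.0])
      p1, p2 = np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])
      print(interpolate_cylindrical(p1, p2, 0.5, origin, axis))   # stays at radius ~1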
  • Patent number: 8184122
    Abstract: An electronic entertainment system for creating a video sequence by executing video game camera behavior based upon a video game sound file includes a memory configured to store an action event/camera behavior (AE/CB) database, game software such as an action generator module, and one or more sound files. In addition, the system includes a sound processing unit coupled to the memory for processing a selected sound file, and a processor coupled to the memory and the sound processing unit. The processor randomly selects an AE pointer and a CB pointer from the AE/CB database. Upon selection of the CB pointer and the AE pointer, the action generator executes camera behavior corresponding to the selected CB pointer to view an action event corresponding to the selected AE pointer.
    Type: Grant
    Filed: July 22, 2010
    Date of Patent: May 22, 2012
    Assignee: Sony Computer Entertainment America LLC
    Inventor: Ed Annunziata
  • Patent number: 8184102
    Abstract: Touch sensor methods, devices and systems are disclosed. One embodiment of the present invention pertains to a method comprising monitoring a finger movement along a touch sensing surface based on position data of a finger touching the touch sensing surface, where the position data is obtained by locating a position of a force applied by the finger in a coordinate of the touch sensing surface. In addition, the method comprises generating direction data associated with the finger movement if the finger movement travels for more than a threshold distance. Furthermore, the method comprises determining a finger gesture which corresponds to the finger movement using a lookup table having multiple preconfigured finger gestures based on the direction data.
    Type: Grant
    Filed: December 17, 2008
    Date of Patent: May 22, 2012
    Assignee: Cypress Semiconductor Corporation
    Inventors: Tony Park, Luther Lu, Nelson Chow
  • Patent number: 8169438
    Abstract: In various embodiments, deformations caused by kinematic or reference objects to secondary objects such as hair or fur may be computed in parallel using a temporally coherent deformation technique. A single or uniform direction for a deformation may be determined from which the deformation of the secondary object will occur with respect to a reference object. The uniform direction for the deformation may be determined rather than allowing the direction of the deformation to vary along a dimension of the secondary object. The magnitude of the deformation may be determined to vary along the dimension of the secondary object in response to the penetration depth, a measure of how far inside the reference object the secondary object finds itself.
    Type: Grant
    Filed: March 31, 2008
    Date of Patent: May 1, 2012
    Assignee: Pixar
    Inventors: David Baraff, Michael Fong, Christine Waggoner
  • Patent number: 8164590
    Abstract: A method for a computer system includes determining a plurality of illumination modes associated with a plurality of scene descriptors, wherein the plurality of scene descriptors includes a first scene descriptor and a second scene descriptor, determining a first plurality of weights, wherein each weight from the first plurality of weights is associated with an illumination mode from the plurality of illumination modes, determining illumination data associated with the first scene descriptor in response to the first plurality of weights and in response to the plurality of illumination modes, determining a second plurality of weights, wherein each weight from the second plurality of weights is associated with an illumination mode from the plurality of illumination modes, and determining illumination data associated with the second scene descriptor in response to the second plurality of weights and in response to the plurality of illumination modes.
    Type: Grant
    Filed: May 13, 2011
    Date of Patent: April 24, 2012
    Assignee: Pixar
    Inventors: John Anderson, Mark Meyer
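    Illustrative sketch (not from the patent): a short Python relighting step that forms the illumination data for a scene descriptor as a weighted sum of precomputed illumination modes, with a different weight vector per scene descriptor. The per-pixel mode arrays are stand-ins.

      import numpy as np

      def relight(modes, weights):
          """Illumination for a scene descriptor: a weighted sum of precomputed modes."""
          return sum(w * m for w, m in zip(weights, modes))

      modes = [np.array([[1.0, 0.0], [0.0, 0.0]]),     # stand-in per-pixel mode images
               np.array([[0.0, 1.0], [1.0, 0.0]])]
      print(relight(modes, [0.8, 0.2]))    # first scene descriptor
      print(relight(modes, [0.1, 0.9]))    # second scene descriptor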
  • Patent number: 8164596
    Abstract: Techniques are provided for automatically creating style sheet animations including keyframe information. In some embodiments, a style sheet animation creation tool with a timeline-based interface is provided. By interacting with the user-interface, the user can select a point on a timeline for an animation object to add a keyframe to an animation of the animation object. In response to the user's selection of the keyframe time point, the style sheet animation creation tool displays an interactive keyframe indicator on the timeline to indicate the selected time point. With the style sheet animation creation tool, a user can generate a style sheet animation without having to author style sheet language text statements by hand.
    Type: Grant
    Filed: October 6, 2011
    Date of Patent: April 24, 2012
    Assignee: Sencha, Inc.
    Inventor: Arne Nikolai Bech
  • Patent number: 8159504
    Abstract: A system that incorporates teachings of the present disclosure may include, for example, an avatar engine having a controller to retrieve a user profile, present a user an avatar having characteristics that correlate to the user profile, detect a change in a developmental growth of the user, adapt a portion of the characteristics of the avatar responsive to the detected change, and present the user the adapted avatar. Other embodiments are disclosed.
    Type: Grant
    Filed: October 16, 2008
    Date of Patent: April 17, 2012
    Assignee: AT&T Intellectual Property I, L.P.
    Inventors: E-Lee Chang, Horst Schroeter, Linda Roberts, Darnell Clayton, Madhur Khandelwal
  • Patent number: 8154545
    Abstract: The invention relates to a method and a computer-aided modelling system for creating a technical drawing from at least two modelled 3D bodies that collide with one another. In a first step, one or more of the regions of the 3D bodies that are affected by the collision are selected. In a second step, a group of colliding faces of the selected regions of the two or more 3D bodies are combined to form a respective collision group and a technical drawing of the two or more colliding modelled 3D bodies is produced. A 2D edge or its associated boundary of a face that belongs to a collision group is treated by masking the other faces that are associated with the same collision group.
    Type: Grant
    Filed: December 6, 2005
    Date of Patent: April 10, 2012
    Assignee: Parametric Technology Corporation
    Inventors: Manfred Göbel, Hans-Ulrich Becker, Jochen Dürr
  • Patent number: 8144155
    Abstract: An approach to enrich skeleton-driven animations with physically-based secondary deformation in real time is described. To achieve this goal, the technique described employs a surface-based deformable model that can interactively emulate the dynamics of both low- and high-frequency volumetric effects. Given a surface mesh and a few sample sequences of its physical behavior, a set of motion parameters of the material are learned during an off-line preprocessing step. The deformable model is then applicable to any given skeleton-driven animation of the surface mesh. Additionally, the described dynamic skinning technique can be entirely implemented on GPUs and executed with great efficiency. Thus, with minimal changes to the conventional graphics pipeline, the technique can drastically enhance the visual experience of skeleton-driven animations by adding secondary deformation in real time.
    Type: Grant
    Filed: August 11, 2008
    Date of Patent: March 27, 2012
    Assignee: Microsoft Corp.
    Inventors: Kun Zhou, Xiaohan Shi, Baining Guo
  • Patent number: 8139067
    Abstract: Motion capture animation, shape completion and markerless motion capture methods are provided. A pose deformation space model encoding variability in pose is learnt from a three-dimensional (3D) dataset. A body shape deformation space model encoding variability in pose and shape is learnt from another 3D dataset. The learnt pose model is combined with the learnt body shape model. For motion capture animation, given a parameter set, the combined model generates a 3D shape surface of a body in a pose and shape. For shape completion, given a partial surface of a body defined as 3D points, the combined model generates a 3D surface model in the combined spaces that fits the 3D points. For markerless motion capture, given 3D information of a body, the combined model traces the movement of the body using the combined spaces to fit the 3D information, or reconstructs the body's shape or deformations to fit the 3D information.
    Type: Grant
    Filed: July 25, 2007
    Date of Patent: March 20, 2012
    Assignee: The Board of Trustees of the Leland Stanford Junior University
    Inventors: Dragomir D. Anguelov, Praveen Srinivasan, Daphne Koller, Sebastian Thrun
  • Patent number: 8134558
    Abstract: Systems and methods for editing of a computer-generated animation across a plurality of keyframe pairs are provided. Embodiments enable time editing across a plurality of non-roving keyframe pairs. Such non-roving keyframes have fixed references relative to an animation's reference timeline. An author may specify a point on an animation's reference timeline at which each non-roving keyframe is placed. In accordance with embodiments of the present invention, an animation across a plurality of non-roving keyframes is treated as an editable unit. Thus, an author may modify the timing for all or a select portion of such editable unit (which may span a plurality of the non-roving keyframes). For instance, an author may expand or reduce the time span for a plurality of non-roving keyframes, and the timing of the plurality of non-roving keyframes automatically adjusts to maintain their timing proportionality relative to each other in the resulting modified time span.
    Type: Grant
    Filed: December 6, 2007
    Date of Patent: March 13, 2012
    Assignee: Adobe Systems Incorporated
    Inventor: John Mayhew
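    Illustrative sketch (not from the patent): a Python rescaling of non-roving keyframe times within an edited span to a new span length, preserving their proportional spacing and shifting later keyframes by the change in span length. The names and the handling of later keyframes are assumptions.

      def rescale_keyframes(times, span_start, span_end, new_end):
          """Stretch or compress keyframe times inside [span_start, span_end] so the span
          ends at new_end, preserving proportional spacing; later keyframes shift along."""
          scale = (new_end - span_start) / (span_end - span_start)
          out = []
          for t in times:
              if t < span_start:
                  out.append(t)
              elif t <= span_end:
                  out.append(span_start + (t - span_start) * scale)
              else:
                  out.append(t + (new_end - span_end))
          return out

      # Expand a 2-second span of keyframes to 3 seconds; spacing stays proportional.
      print(rescale_keyframes([0.0, 0.5, 1.5, 2.0, 4.0], 0.0, 2.0, 3.0))
      # -> [0.0, 0.75, 2.25, 3.0, 5.0]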
  • Patent number: 8134552
    Abstract: A method, apparatus and medium to render three-dimensional (3D) objects for 3D graphics. The method includes detecting the presence of motion by performing local coordinate calculations with respect to each 3D object, performing global coordinate calculations only with respect to those 3D objects that have motion, and rendering the 3D objects for which local and global coordinate calculations have been performed.
    Type: Grant
    Filed: September 22, 2006
    Date of Patent: March 13, 2012
    Assignee: Samsung Electronics Co., Ltd.
    Inventors: Keechang Lee, Dokyoon Kim, Jeonghwan Ahn, Seyoon Tak
  • Patent number: 8126293
    Abstract: In the present invention, there is provided an image processing apparatus including: a detecting section configured to detect a motion vector from an input image signal acting as the image signal for each of chronologically input pixels; a determining section configured to determine whether the input image signal is cleared; and an interpolating section configured such that if the input image signal is not found cleared, then the interpolating section interpolates and outputs an input image signal intermediate signal interposed at a predetermined point in time between the uncleared input image signal and a preceding input image signal that precedes the uncleared input image signal, in accordance with the motion vector; and if the input image signal is found cleared, then the interpolating section allows the input image signal to be output unchanged as the input image signal intermediate signal.
    Type: Grant
    Filed: April 10, 2008
    Date of Patent: February 28, 2012
    Assignee: Sony Corporation
    Inventors: Masayuki Suematsu, Takayuki Ohe, Masato Usuki, Masanari Yamamoto, Makoto Haitani