Temporal Interpolation Or Processing Patents (Class 345/475)
-
Patent number: 8339392
Abstract: A computer implemented method, a computer program product, and a data processing system manage regions within a virtual universe. A current location of an avatar is identified within a virtual universe, the current location being within a currently populated region. A vectored movement of the avatar is identified. Any adjacent region that may probabilistically encounter a horizon of the avatar within a region activation time of the virtual universe is identified. If the adjacent region is deactivated, then the adjacent region is activated. An unpopulated region that is currently active is identified, wherein the unpopulated region is within an extended distance from the avatar's current location. The unpopulated region is then deactivated.
Type: Grant
Filed: September 30, 2008
Date of Patent: December 25, 2012
Assignee: International Business Machines Corporation
Inventors: Boas Betzler, Sean L. Dague, Peter George Finn
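The region lifecycle described above — activate any region the avatar's horizon may soon reach, deactivate distant unpopulated ones — can be sketched as follows. This is a minimal illustration with deliberately simplified geometry: a projected-position distance test stands in for the probabilistic horizon check, and all names and thresholds are hypothetical, not taken from the patent.

```python
def manage_regions(avatar_pos, velocity, regions, activation_time, far_dist):
    """Activate regions the avatar may reach soon; deactivate far, empty ones.

    `regions` maps region-center coordinates to {'active', 'populated'} flags.
    A region is activated if the avatar, moving at `velocity` for one region
    activation time, would come within `far_dist` of its center; an active but
    unpopulated region farther than `far_dist` from the current location is
    deactivated.
    """
    ax, ay = avatar_pos
    vx, vy = velocity
    # Projected position after one region-activation time.
    px, py = ax + vx * activation_time, ay + vy * activation_time
    for (cx, cy), r in regions.items():
        d_now = ((cx - ax) ** 2 + (cy - ay) ** 2) ** 0.5
        d_proj = ((cx - px) ** 2 + (cy - py) ** 2) ** 0.5
        if d_proj < far_dist and not r["active"]:
            r["active"] = True        # avatar's horizon may encounter it soon
        elif r["active"] and not r["populated"] and d_now > far_dist:
            r["active"] = False       # idle region: reclaim server resources

regions = {(10, 0): {"active": False, "populated": False},
           (100, 0): {"active": True, "populated": False}}
manage_regions((0, 0), (5, 0), regions, activation_time=2.0, far_dist=20.0)
```

With the avatar heading toward (10, 0), that region is activated while the distant empty region at (100, 0) is shut down.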
-
Patent number: 8339403
Abstract: Architecture that enhances the visual experience of a slide presentation by animating slide content as “actors” in the same background “scene”. This is provided by multi-layered transitions between slides, where a slide is first separated into “layers” (e.g., with a level of transparency). Each layer can then be transitioned independently. All layers are composited together to accomplish the end effect. The layers can comprise one or more content layers, and a background layer. The background layer can further be separated into a background graphics layer and a background fill layer. The transition phase can include a transition effect such as a fade, a wipe, a dissolve effect, and other desired effects. To provide continuity and uniformity of presenting the content on the same background scene, a transition effect is not applied to the background layer.
Type: Grant
Filed: April 16, 2008
Date of Patent: December 25, 2012
Assignee: Microsoft Corporation
Inventors: Jason Zhao, Mark Pearson, Peter Lai
-
Patent number: 8334872
Abstract: A method for a computer system comprising receiving a displacement for a first object model surface from a user determined in response to a first physical motion captured pose, determining a weighted combination of a first displacement group and a second displacement group from the displacement, wherein the first displacement group is determined from displacements between the first object model surface and a second object model surface, wherein the second object model surface is determined from a second physical motion captured pose, wherein the second displacement group is determined from displacements between the first object model surface and a third object model surface, wherein the third object model surface is determined from a third physical motion captured pose, determining a fourth object model surface from the first object model surface and the weighted combination, and displaying the fourth object model surface to the user on a display.
Type: Grant
Filed: June 30, 2009
Date of Patent: December 18, 2012
Assignee: Two Pic MC LLC
Inventors: Doug Epps, Nate Reid
-
Patent number: 8334871
Abstract: The present invention provides a computer implemented method and apparatus to project a projected avatar associated with an avatar in a virtual universe. A computer receives a command to project the avatar, the command having a projection point. The computer transmits a request to place a projected avatar at the projection point to a virtual universe host. The computer renders a tab associated with the projected avatar.
Type: Grant
Filed: June 6, 2011
Date of Patent: December 18, 2012
Assignee: International Business Machines Corporation
Inventors: Rick Allen Hamilton, II, Brian Marshall O'Connell, Clifford Alan Pickover, Keith Raymond Walker
-
Patent number: 8319778
Abstract: Variable motion blur is created by varying the evaluation time used to determine the poses of objects according to motion blur parameters when evaluating a blur frame. A blur parameter can be associated with one or more objects, portions of objects, or animation variables. The animation system modifies the time of the blur frame by a function including the blur parameter to determine poses of objects or portions thereof associated with the blur parameter in a blur frame. The animation system determines the values of animation variables at their modified times, rather than at the time of the blur frame, and poses objects or portions thereof accordingly. Multiple blur parameters can be used to evaluate the poses of different portions of a scene at different times for a blur frame. Portions of an object can be associated with different blur parameters, enabling motion blur to be varied within an object.
Type: Grant
Filed: January 31, 2008
Date of Patent: November 27, 2012
Assignee: Pixar
Inventors: Rick Sayre, Martin Reddy, Peter Bernard Demoreuille
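The core idea — shifting an element's evaluation time by a function of its blur parameter instead of sampling everything at the frame time — can be sketched as follows. The linear time-offset function and all names here are illustrative assumptions, not the patented formulation.

```python
def pose_at_blur_frame(animation_var, t_frame, blur_param, sample_offset):
    """Evaluate an animation variable for a blur frame.

    Rather than sampling every variable at the frame time, the evaluation
    time is shifted by a function of the per-element blur parameter, so
    different parts of the scene can be blurred by different amounts.
    A simple linear shift is assumed here.
    """
    t_eval = t_frame + blur_param * sample_offset
    return animation_var(t_eval)

# Example: an animation variable that moves linearly, x(t) = 2*t.
x = lambda t: 2.0 * t

# Full blur (parameter 1.0) samples the shifted time; zero blur freezes
# the element at the frame time, so it receives no motion blur.
full = pose_at_blur_frame(x, t_frame=10.0, blur_param=1.0, sample_offset=0.5)
none = pose_at_blur_frame(x, t_frame=10.0, blur_param=0.0, sample_offset=0.5)
```

Evaluating different objects with different `blur_param` values against the same blur frame is what lets motion blur vary within a single scene, or even within one object.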
-
Patent number: 8319779
Abstract: A game developer can “tag” an item in the game environment. When an animated character walks near the “tagged” item, the animation engine can cause the character's head to turn toward the item, and mathematically compute what needs to be done in order to make the action look real and normal. The tag can also be modified to elicit an emotional response from the character. For example, a tagged enemy can cause fear, while a tagged inanimate object may cause only indifference or disinterest.
Type: Grant
Filed: March 30, 2011
Date of Patent: November 27, 2012
Assignee: Nintendo of America, Inc.
Inventors: Henry Sterchi, Jeff Kalles, Shigeru Miyamoto, Denis Dyack, Carey Murray
-
Patent number: 8319777
Abstract: A character display attracts user interest by increasing the variety of the on-screen display while reducing data processing, by making the time variation of posture common among a plurality of characters.
Type: Grant
Filed: May 16, 2008
Date of Patent: November 27, 2012
Assignee: Konami Digital Entertainment Co., Ltd.
Inventor: Yuichi Asami
-
Patent number: 8314801
Abstract: Embodiments described herein are directed to automatically generating an animation for a transition between a current state and a new state. In one embodiment, a computer system accesses state properties of a visual element corresponding to a current state the visual element is in and a new state the visual element is to be transitioned to. The state properties include visual properties and transition description information. The computer system determines the differences between the visual properties of the current state and the new state and automatically generates an animation based on the determined differences between the visual properties for the current state and the new state, such that the animation is playable to transition the visual element from the current state to the new state.
Type: Grant
Filed: February 29, 2008
Date of Patent: November 20, 2012
Assignee: Microsoft Corporation
Inventors: Kenneth L. Young, Steven Charles White, Christian B. Schormann
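The diff-then-animate step above can be sketched in a few lines: compare the two states' visual properties and generate keyframes only for the properties that differ. Linear interpolation and the property layout are assumptions here; the patent leaves easing to the transition description information.

```python
def generate_transition(current, new, steps):
    """Auto-generate a transition from the property differences of two states.

    Each state maps property names to numeric values. Only properties that
    differ between the states are animated, interpolated linearly over
    `steps` frames.
    """
    # Keep only the properties whose values actually change.
    diffs = {k: (current[k], new[k])
             for k in current if current[k] != new.get(k, current[k])}
    frames = []
    for i in range(1, steps + 1):
        t = i / steps
        frames.append({k: a + (b - a) * t for k, (a, b) in diffs.items()})
    return frames

# Only opacity differs, so only opacity is animated; x stays untouched.
anim = generate_transition({"opacity": 0.0, "x": 5.0},
                           {"opacity": 1.0, "x": 5.0}, steps=2)
```

Playing `anim` frame by frame transitions the element from the current state to the new state without the author hand-writing an animation.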
-
Patent number: 8311731
Abstract: A robot is provided with a motion control unit that avoids collision between segments of the robot or between segments of the robot and other objects. The motion control unit of the robot comprises a distance computing module, a whole body control module, a collision avoidance module, and a blending control unit. The distance computing module calculates two closest points of different segments of the robot connected to each other via at least one joint or a segment of the robot and another object. The collision avoidance module is provided with the information about the two closest points. The blending control unit combines the weighted output control signals of the whole body control module and the collision avoidance control module. The weight of the whole body control output signal is higher when the risk of collision is lower. The weight of the collision avoidance control output signal is higher when the risk of collision is higher.
Type: Grant
Filed: March 20, 2008
Date of Patent: November 13, 2012
Assignee: Honda Research Institute Europe GmbH
Inventors: Hisashi Sugiura, Herbert Janssen, Michael Gienger
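The blending control unit's weighting scheme can be sketched directly: as collision risk rises, weight shifts from the whole-body control signal to the avoidance signal. The linear weighting below is an assumption; the patent only requires the two weights to move in opposite directions with risk.

```python
def blend_control(whole_body_cmd, avoidance_cmd, risk):
    """Blend two control signals (e.g. joint velocities) by collision risk.

    `risk` is clamped to [0, 1]. Low risk lets the whole-body motion
    dominate; high risk lets collision avoidance dominate, matching the
    blending control unit's weighting rule.
    """
    w_avoid = max(0.0, min(1.0, risk))
    w_body = 1.0 - w_avoid
    return [w_body * b + w_avoid * a
            for b, a in zip(whole_body_cmd, avoidance_cmd)]

# No risk: pure whole-body command; certain collision: pure avoidance command.
safe = blend_control([1.0, 0.0], [0.0, 1.0], risk=0.0)
danger = blend_control([1.0, 0.0], [0.0, 1.0], risk=1.0)
```

Intermediate risk values produce a smooth mix, so the robot's task motion degrades gracefully instead of switching abruptly into avoidance.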
-
Patent number: 8305379
Abstract: In accordance with one or more embodiments, a method and system of managing animation data and related control data for recording on an enhanced navigation medium are provided. The method comprises constructing animation data comprising first image data into a first graphic MNG file in chunk data format, wherein the first graphic file comprises a first header portion, a second end portion, first control data and a frame containing additional data; and recording the first graphic file on an enhanced navigation medium.
Type: Grant
Filed: August 29, 2007
Date of Patent: November 6, 2012
Assignee: LG Electronics, Inc.
Inventors: Woo Seong Yoon, Jea Yong Yoo, Limoniv Alexandre, Byung Jin Kim
-
Patent number: 8300054
Abstract: In accordance with one or more embodiments, a method of managing animation data and related control data for recording on an enhanced navigation medium is provided. The method comprises constructing animation data comprising first image data into a first graphic MNG file in chunk data format, wherein the first graphic file comprises a first header portion, a second end portion, first control data and a frame containing additional data; and recording the first graphic file on an enhanced navigation medium.
Type: Grant
Filed: August 29, 2007
Date of Patent: October 30, 2012
Assignee: LG Electronics Inc.
Inventors: Woo Seong Yoon, Jea Yong Yoo, Limoniv Alexandre, Byung Jin Kim
-
Patent number: 8300042
Abstract: An interactive video display system uses strobing light to allow easy and unencumbered interaction between a person and projected video images. A camera may be synchronized to the strobing light and the strobing light may be in an infrared spectral range. A processing system detects images of a human in the field of view of the camera and controls the display so that the projected video images are changed based on an interaction between the human and the projected video images. The system may project the video images on a surface around the person and may move the projected video images on the surface based on a movement of the person. The projected video images may include computer-generated characters and/or virtual objects that react to the movement of a person.
Type: Grant
Filed: October 31, 2007
Date of Patent: October 30, 2012
Assignee: Microsoft Corporation
Inventor: Matthew Bell
-
Patent number: 8284202
Abstract: A method for a computer system includes receiving global positional data associated with a set of markers from a plurality of markers associated with a surface of an object at one or more time instances, wherein global positional data associated with a first marker from the plurality of markers is absent from a first time instance, using local statistical methods to determine global positional data associated with the first marker at the first time instance in response to the global positional data associated with the set of markers at the one or more time instances, and determining a model of the object in response to the global positional data associated with the set of markers and the global positional data associated with the first marker.
Type: Grant
Filed: July 2, 2007
Date of Patent: October 9, 2012
Assignee: Two Pic MC LLC
Inventors: Jessica K. Hodgins, Sang Il Park
-
Patent number: 8271204
Abstract: Provided are a human recognition apparatus and a human recognition method identifying a user based on a walking pattern. The human recognition apparatus includes a detecting unit detecting a vibration according to a user's walking, and outputting an electric signal, a pattern calculating unit acquiring a walker's walking pattern from the electric signal, and a user determining unit comparing the walking pattern with previously measured per-user reference data and identifying the user based on the comparison result. The human recognition apparatus and the human recognition method are robust against peripheral noise and can increase the acceptance rate through a simple structure and procedure by using the walking pattern, which is one-dimensional time information requiring no vast data throughput, as the user identification data.
Type: Grant
Filed: January 30, 2009
Date of Patent: September 18, 2012
Assignee: Electronics and Telecommunications Research Institute
Inventors: Seung Min Choi, Ji Ho Chang, Jae Il Cho, Dae Hwan Hwang, Jae Yeon Lee, Do Hyung Kim, Eul Gyoon Lim, Ho Chul Shin
-
Patent number: 8269778
Abstract: This disclosure relates to computer-generated imagery (CGI) and computer-aided animation. More specifically, this disclosure relates to techniques for preserving the shape of simulated and dynamic objects for use in CGI and computer-aided animation.
Type: Grant
Filed: December 3, 2010
Date of Patent: September 18, 2012
Assignee: Pixar
Inventors: David Baraff, Christine Waggoner
-
Patent number: 8259118
Abstract: A client device receives a user interface event corresponding to a spline curve associated with an object displayed on a mobile device. The user interface event triggers creation of a new spline curve based on computation of a new spline tangent associated with the spline curve and phase space-based dynamics of a new state. A scene graph having state information associated with the new state is maintained. A rendering event triggers repainting of the object associated with the spline curve using the scene graph.
Type: Grant
Filed: December 12, 2008
Date of Patent: September 4, 2012
Assignee: MobiTV, Inc.
Inventors: James Roseborough, Ian Farmer
-
Patent number: 8253728
Abstract: In general, one or more aspects of the subject matter described in this specification can include associating with each clip in a sequence of one or more clips a copy of a three dimensional (3D) scene that was used to create the clip, where the clip is a sequence of one or more images that depict the clip's respective 3D scene from the perspective of one or more virtual cameras. Input identifying a clip in the sequence is received. In response to the receiving, a copy of the identified clip's associated copy of the 3D scene is presented in an editor.
Type: Grant
Filed: February 25, 2008
Date of Patent: August 28, 2012
Assignee: Lucasfilm Entertainment Company Ltd.
Inventors: Steve Sullivan, Max S-Han Chen, Jeffrey Bruce Yost
-
Patent number: 8253744
Abstract: The system (13, 15) for virtually drawing on a physical surface of the invention comprises electronic circuitry, wherein the electronic circuitry is operative to detect movements of a physical object over the physical surface and project a drawing (11) corresponding to the movements on the physical surface. The computer program product of the invention enables a programmable device to function as the system of the invention.
Type: Grant
Filed: September 27, 2007
Date of Patent: August 28, 2012
Assignee: Koninklijke Philips Electronics N.V.
Inventors: Marko Macura, Thomas Marzano, Kyriakos Mama
-
Patent number: 8243093
Abstract: Aspects of the present invention relate to creation, modification and implementation of dither pattern structures applied to an image to diminish contouring artifacts. Some aspects relate to dither pattern structures with pixel values in a first color channel pattern that are spatially dispersed from pixel values in a corresponding pattern in a second color channel. Some aspects relate to application of these dither pattern structures. Some aspects relate to systems and apparatus for creation and application of these dither pattern structures comprising pixel values dispersed across color channels.
Type: Grant
Filed: August 22, 2003
Date of Patent: August 14, 2012
Assignee: Sharp Laboratories of America, Inc.
Inventors: Xiao-Fan Feng, Scott J. Daly
-
Patent number: 8243079
Abstract: An event, such as a vertical blank interrupt or signal, received from a display adapter in a system is identified. Activation of a timer-driven animation routine that updates a state of an animation and activation of a paint controller module that identifies updates to the state of the animation and composes a frame that includes the updates to the state of the animation are aligned, both being activated based on the identified event in the system.
Type: Grant
Filed: September 14, 2010
Date of Patent: August 14, 2012
Assignee: Microsoft Corporation
Inventors: Cenk Ergan, Benjamin C. Constable
-
Patent number: 8237720
Abstract: Embodiments for shader-based finite state machine frame detection for implementing alternative graphical processing on an animation scenario are disclosed. In accordance with one embodiment, the embodiment includes assigning an identifier to each shader used to render animation scenarios. The embodiment also includes defining a finite state machine for a key frame in each of the animation scenarios, with each finite state machine representing a plurality of shaders that renders the key frame in each animation scenario. The embodiment further includes deriving a shader ID sequence for each finite state machine based on the identifier assigned to each shader. The embodiment additionally includes comparing an input shader ID sequence of a new frame of a new animation scenario to each derived shader ID sequence. Finally, the embodiment includes executing alternative graphics processing on the new animation scenario when the input shader ID sequence matches one of the derived shader ID sequences.
Type: Grant
Filed: February 12, 2009
Date of Patent: August 7, 2012
Assignee: Microsoft Corporation
Inventors: Jinyu Li, Chen Li, Xin Tong
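The detection step above reduces to deriving a shader-ID sequence per known scenario and comparing an incoming frame's sequence against them. The sketch below simplifies the per-frame finite state machine to exact sequence equality; the function names and the dictionary layout are illustrative assumptions.

```python
def build_signatures(scenarios):
    """Derive a shader-ID sequence (signature) for each known scenario's key frame."""
    return {name: tuple(shader_ids) for name, shader_ids in scenarios.items()}

def match_scenario(signatures, frame_shader_ids):
    """Return the scenario whose signature matches the new frame, if any.

    A match means the new animation scenario is recognized and alternative
    graphics processing should be executed for it.
    """
    seq = tuple(frame_shader_ids)
    for name, sig in signatures.items():
        if sig == seq:
            return name
    return None

sigs = build_signatures({"intro": [3, 1, 4], "battle": [2, 7]})
hit = match_scenario(sigs, [3, 1, 4])   # recognized key frame
miss = match_scenario(sigs, [3, 1])     # incomplete sequence: no match
```

A production implementation would advance a state machine as each shader is bound during rendering rather than buffering the whole sequence, but the matching logic is the same.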
-
Patent number: 8232999
Abstract: The surface of a body of water can be animated by deconstructing a master wave model into several layer models and then reconstructing the layer models to form an optimized wave model. A wave model is obtained, which describes the wave surfaces in a body of water. The wave model comprises a range of wave model frequencies over a given area. Primary, secondary, and tertiary layer models are constructed based on portions of the wave model frequencies. An optimized wave model is constructed by combining the primary, secondary, and tertiary layer models. A wave surface point location is determined within the given area. A wave height value is computed for the wave surface point location using the optimized wave model. The wave height value that is associated with the surface point location is stored.
Type: Grant
Filed: January 20, 2009
Date of Patent: July 31, 2012
Assignee: DreamWorks Animation LLC
Inventor: Galen Gerald Gornowicz
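The decompose-then-recombine scheme can be sketched by splitting the master model's frequency range into three bands and summing the layers' contributions at a surface point. Representing each component as a sinusoid is an illustrative assumption; the patent does not prescribe the basis.

```python
import math

def wave_height(x, layers, t=0.0):
    """Height of the optimized wave model at surface point x and time t.

    Each layer is a list of (amplitude, frequency, phase) components. The
    primary, secondary, and tertiary layers together cover the master
    model's frequency range, and reconstruction is simply their sum.
    """
    return sum(a * math.sin(f * x + p + t)
               for layer in layers
               for (a, f, p) in layer)

primary   = [(2.0, 0.1, 0.0)]   # low-frequency swell
secondary = [(0.5, 1.0, 0.0)]   # mid-frequency chop
tertiary  = [(0.1, 8.0, 0.0)]   # high-frequency ripple
h = wave_height(0.0, [primary, secondary, tertiary])
```

The practical benefit of the split is that each band can be evaluated, tiled, or level-of-detail-culled independently before being recombined.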
-
Patent number: 8234572
Abstract: A media player can have advanced-playlist creation capabilities such as the ability to automatically generate a playlist around a “seed” song selected by a user. In some embodiments, the accessory can determine whether the media player can use a particular song as a seed song for an advanced playlist and can so inform the user. The user can then operate the accessory's user interface to create an advanced playlist based on a particular song, rather than having to interact directly with the media player.
Type: Grant
Filed: March 10, 2009
Date of Patent: July 31, 2012
Assignee: Apple Inc.
Inventors: Shailesh Rathi, Lawrence G. Bolton
-
Patent number: 8223154
Abstract: Systems and methods for integrating graphic animation technologies with fantasy sports contest applications are provided. This invention enables a fantasy sports contest application to depict plays in various sporting events using graphic animation. The fantasy sports contest application may combine graphical representation of real-life elements such as, for example, player facial features, with default elements such as, for example, a generic player body, to create realistic graphic video. The fantasy sports contest application may provide links to animated videos for depicting plays on contest screens in which information associated with the plays may be displayed. The fantasy sports contest application may play the animated video for a user in response to the user selecting such a link. In some embodiments of the present invention, the fantasy sports contest application may also customize animated video based on user-supplied setup information.
Type: Grant
Filed: December 10, 2010
Date of Patent: July 17, 2012
Assignee: Rovi Technologies Corporation
Inventors: Patrick J. Hughes, David Barber
-
Patent number: 8223144
Abstract: A parallelization permission and prohibition management unit of a processor manages the permission or prohibition of the parallelization for each combination of partial spaces in cooperation with another parallelization permission and prohibition management unit of a different processor. Specifically, when any given object is present across the boundary between a first partial space and a second partial space, the parallelization is prohibited between the collision process to be performed by any given processor on the virtual objects in the first partial space and the collision process to be performed by another processor on the virtual object in the second partial space.
Type: Grant
Filed: December 5, 2007
Date of Patent: July 17, 2012
Assignee: Sony Computer Entertainment Inc.
Inventors: Tatsuya Ishiwata, Masahiro Yasue
-
Patent number: 8221237
Abstract: A first sound volume calculation unit (251) obtains a length of a straight line connecting a sound emitting object and a sound detection object, and calculates a first sound volume attenuated from a predetermined reference sound volume in accordance with the length. A second volume calculation unit (252), in a case where another object that is an obstacle lies on the straight line, calculates a second sound volume attenuated from the first sound volume by a predetermined ratio.
Type: Grant
Filed: March 12, 2007
Date of Patent: July 17, 2012
Assignee: Konami Digital Entertainment Co., Ltd.
Inventor: Hiroyuki Nakayama
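The two-stage attenuation described above is easy to sketch: attenuate the reference volume by the emitter-listener distance, then apply a fixed ratio if an obstacle sits on the connecting line. The linear falloff and the 0.5 ratio below are illustrative assumptions; the patent only says "in accordance with the length" and "by a predetermined ratio".

```python
def heard_volume(reference, distance, obstacle=False,
                 obstacle_ratio=0.5, falloff=0.1):
    """Two-stage sound attenuation between an emitter and a listener.

    Stage 1: the reference volume is attenuated with the length of the
    straight line between the objects (linear falloff, clamped at zero).
    Stage 2: if an obstacle lies on that line, the first volume is
    further reduced by a fixed ratio.
    """
    first = max(0.0, reference - falloff * distance)
    if obstacle:
        return first * obstacle_ratio
    return first

clear = heard_volume(1.0, distance=5.0)                  # distance only
blocked = heard_volume(1.0, distance=5.0, obstacle=True) # distance + obstacle
```

Splitting the calculation into two units keeps the obstacle test (a line-of-sight query) independent of the distance model, so either can be swapped out.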
-
Patent number: 8218909
Abstract: A method for deformable registration of two digital images includes providing a pair of digital images, including a fixed image and a moving image, extracting a set of edge images from each image of the pair of images, each edge set being extracted at a different resolution, selecting a pair of edge images with a lowest resolution, determining a mapping from edge points of the fixed image to edge points of the moving image using a geodesic thin plate spline interpolation, applying the mapping to a next higher resolution edge point image of the moving image, selecting a pair of edge images at a next higher resolution, where a moving edge image is the moving edge image to which the mapping has been applied, repeating the steps at a next higher resolution for all edge images in the set of edge images, and applying the mapping to an entire moving image.
Type: Grant
Filed: August 26, 2008
Date of Patent: July 10, 2012
Assignee: Siemens Aktiengesellschaft
Inventors: Ali Khamene, Fabrice Michel
-
Patent number: 8212823
Abstract: A data path for a SIMD-based microprocessor is used to perform different simultaneous filter sub-operations in parallel data lanes of the SIMD-based microprocessor. Filter operations for sub-pixel interpolation are performed simultaneously on separate lanes of the SIMD processor's data path. Using a dedicated internal data path, precision higher than the native precision of the SIMD unit may be achieved. Through the data path according to this invention, a single instruction may be used to generate the value of two adjacent sub-pixels located diagonally with respect to integer pixel positions.
Type: Grant
Filed: September 28, 2006
Date of Patent: July 3, 2012
Assignee: Synopsys, Inc.
Inventors: Carl Norman Graham, Kar-Lik Wong, Simon Jones, Aris Aristodemou
-
Patent number: 8208067
Abstract: A method includes receiving a digital video segment simulating motion at one speed, the frames in the segment spaced at a uniform time interval. The method further includes receiving a desired speed of motion, determining the appropriate uniform time interval corresponding to the desired speed, and generating a frame sequence simulating the motion at the desired speed, the frames in the generated sequence spaced at the determined appropriate uniform time interval. If the generated frame sequence includes a frame from the original segment, then only frames from the original segment are included in the generated frame sequence, and if the generated frame sequence includes an interpolated frame then only interpolated frames are included in the generated frame sequence.
Type: Grant
Filed: July 11, 2007
Date of Patent: June 26, 2012
Assignee: Adobe Systems Incorporated
Inventors: Harshdeep Singh, Samreen Dhillon
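The all-original-or-all-interpolated rule above can be sketched as follows: compute the new uniform interval from the desired speed, then emit only original frames when every new timestamp lands on one, and only interpolated frames otherwise. Representing frames as dictionary entries and interpolated frames as placeholder tags is an illustrative assumption; a real system would blend pixel data.

```python
def retime(frames, interval, speed):
    """Retime a clip by generating a new uniformly spaced frame sequence.

    `frames` maps original timestamps (multiples of `interval`) to frame
    data. Playing the motion `speed` times faster means consecutive output
    frames sample the content `interval * speed` apart.
    """
    new_interval = interval * speed
    duration = (len(frames) - 1) * interval
    count = int(duration / new_interval) + 1
    times = [i * new_interval for i in range(count)]
    if all(t in frames for t in times):
        return [frames[t] for t in times]    # only original frames
    return [("interp", t) for t in times]    # only interpolated frames

clip = {0.0: "f0", 1.0: "f1", 2.0: "f2", 3.0: "f3"}
double = retime(clip, 1.0, 2.0)       # lands exactly on original frames
half_speed = retime(clip, 1.0, 0.5)   # needs in-between frames
```

Note the half-speed result interpolates even at timestamps that coincide with original frames: mixing the two kinds would make sharpness flicker, which is exactly what the claim's uniformity rule avoids.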
-
Patent number: 8207971
Abstract: A system includes a computer system capable of representing one or more animated characters. The computer system includes a blendshape manager that combines multiple blendshapes to produce the animated character. The computer system also includes an expression manager to respectively adjust one or more control parameters associated with each of the plurality of blendshapes for adjusting an expression of the animated character. The computer system also includes a corrective element manager that applies one or more corrective elements to the combined blendshapes based upon at least one of the control parameters. The one or more applied corrective elements are adjustable based upon one or more of the control parameters absent the introduction of one or more additional control parameters.
Type: Grant
Filed: February 19, 2009
Date of Patent: June 26, 2012
Assignee: Lucasfilm Entertainment Company Ltd.
Inventors: Michael Koperwas, Frederic P. Pighin, Cary Phillips, Steve Sullivan, Eduardo Hueso
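The key detail above — corrective elements driven by the existing control parameters rather than by new ones — can be sketched with a delta-based blendshape rig. The data layout (vertices as flat float lists, blendshapes stored as deltas from the neutral pose) is an illustrative assumption, not the patented representation.

```python
def pose_character(neutral, blendshapes, weights, correctives):
    """Combine blendshapes, then apply weight-driven corrective elements.

    Each blendshape is a delta from the neutral pose scaled by its control
    parameter. Each corrective is a (trigger, delta) pair applied in
    proportion to an existing control parameter, so correctives introduce
    no additional parameters for the animator to manage.
    """
    out = list(neutral)
    for name, delta in blendshapes.items():
        w = weights.get(name, 0.0)
        out = [v + w * d for v, d in zip(out, delta)]
    for trigger, delta in correctives:
        w = weights.get(trigger, 0.0)      # reuse the existing control
        out = [v + w * d for v, d in zip(out, delta)]
    return out

neutral = [0.0, 0.0]
shapes = {"smile": [1.0, 0.0], "blink": [0.0, 1.0]}
fixups = [("smile", [0.0, 0.25])]   # corrective driven by the smile control
posed = pose_character(neutral, shapes, {"smile": 1.0}, fixups)
```

As the animator dials the smile parameter, the corrective fades in and out automatically; nothing new appears in the rig's control set.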
-
Patent number: 8207970
Abstract: An information processing apparatus includes, for example, a touch panel placed over a display screen. For example, when a start of a touch input is detected, the first-touch-input coordinates are determined as object generation coordinates and an object is displayed in the coordinates. When an input direction based on coordinates continuously detected after the object generation coordinates until, for example, a touch-off is determined to be a predetermined direction, the object is moved in a specific direction. Alternatively, an input direction is determined based on coordinates continuously detected after the object generation coordinates until a touch-off, and then, based on the input direction, the direction opposite to the input direction, for example, is determined to be a moving direction and the object is moved in the moving direction.
Type: Grant
Filed: July 26, 2006
Date of Patent: June 26, 2012
Assignee: Nintendo Co., Ltd.
Inventor: Hirofumi Matsuoka
-
Patent number: 8209612
Abstract: Some embodiments provide a method of specifying speed effects for playing a video clip. The method defines a set of speed effects for the video clip. It then displays in real-time a presentation of the video clip that accounts for the set of speed effects defined for the video clip. In some embodiments, this method represents the playback speed of a video clip in terms of a graph that is part of a graphical user interface (“GUI”). This graph is defined along two axes, with one axis representing the playback time, and the other axis representing the content-time (i.e., the time within the video clip). In these embodiments, a user can change the playback speed of the video clip by using a set of GUI operations to select and modify the graph. For instance, a user can select and adjust the graph at different instances in time in order to change the playback speed of the video clip at these instances. Different embodiments use different types of graphs to represent playback speed.
Type: Grant
Filed: April 19, 2010
Date of Patent: June 26, 2012
Assignee: Apple Inc.
Inventor: Gary Johnson
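The two-axis graph described above is, computationally, a mapping from playback time to content time whose local slope is the playback speed. A minimal sketch, assuming piecewise-linear keyframes (the patent allows other graph types):

```python
def content_time(graph, playback_t):
    """Map a playback time to a content time via a speed-effect graph.

    `graph` is a list of (playback_time, content_time) keyframes, one per
    point the user placed on the two-axis GUI graph. Between keyframes the
    mapping is interpolated linearly, so each segment's slope is the
    playback speed over that stretch.
    """
    pts = sorted(graph)
    for (p0, c0), (p1, c1) in zip(pts, pts[1:]):
        if p0 <= playback_t <= p1:
            frac = (playback_t - p0) / (p1 - p0)
            return c0 + frac * (c1 - c0)
    raise ValueError("playback time outside graph range")

# 0-2 s of playback covers 0-1 s of content (half speed),
# then 2-3 s of playback covers 1-3 s of content (double speed).
g = [(0.0, 0.0), (2.0, 1.0), (3.0, 3.0)]
slow = content_time(g, 1.0)
fast = content_time(g, 2.5)
```

Dragging a keyframe in the GUI changes a segment's slope, which immediately changes the speed at those instances without re-encoding the clip.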
-
Publication number: 20120159360
Abstract: A timing function that distributes progressive start times of a series of target animations, or staggers animations, is disclosed. The timing function includes a set of selectable parameters that are used to create a customized staggering animation in the user interface. The set of selectable parameters includes a user interface geometry for each of the target animations.
Type: Application
Filed: December 17, 2010
Publication date: June 21, 2012
Applicant: MICROSOFT CORPORATION
Inventors: Jesse Bishop, Ruurd Johan Boeke, Terry Adams
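A stagger timing function of this kind can be sketched as below: with no geometry it spaces start times uniformly, and with per-target UI positions it scales each delay by the target's normalized distance from the first one, so the stagger sweeps across the interface. The distance metric and parameter names are illustrative assumptions.

```python
def stagger_starts(n, base_delay, positions=None):
    """Distribute progressive start times over n target animations.

    Without geometry, target i starts at i * base_delay. When per-target
    UI positions are supplied, each delay is scaled by the target's
    normalized Euclidean distance from the first target.
    """
    if positions is None:
        return [i * base_delay for i in range(n)]
    ox, oy = positions[0]
    dists = [((x - ox) ** 2 + (y - oy) ** 2) ** 0.5 for x, y in positions]
    maxd = max(dists) or 1.0          # avoid dividing by zero if all coincide
    return [(d / maxd) * base_delay * (n - 1) for d in dists]

uniform = stagger_starts(4, 0.1)
swept = stagger_starts(3, 0.1, positions=[(0, 0), (10, 0), (20, 0)])
```

Because the geometry feeds the timing function directly, rearranging elements in the UI reshapes the stagger without editing any animation timelines.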
-
Patent number: 8203561
Abstract: A computer implemented method, computer program product, and a data processing system determine an excursion corridor within a virtual environment. A time-stamped snapshot of a location of at least one avatar within the virtual universe is recorded. An avatar tracking data structure is then updated. The avatar tracking data structure provides a time-based history of avatar locations within the virtual universe. A weighted density map is generated. The weighted density map is then correlated with virtual object locations. Each virtual object location corresponds to a virtual object. Excursion corridors are identified; each excursion corridor identifies frequently taken routes between the virtual object locations. Waypoints are identified. Each waypoint corresponds to a virtual object. Each waypoint is an endpoint for one of the excursion corridors.
Type: Grant
Filed: September 10, 2008
Date of Patent: June 19, 2012
Assignee: International Business Machines Corporation
Inventors: William S. Carter, Guido D. Corona
-
Patent number: 8199150
Abstract: A system for controlling a rendering engine by using specialized commands. The commands are used to generate a production, such as a television show, at an end-user's computer that executes the rendering engine. In one embodiment, the commands are sent over a network, such as the Internet, to achieve broadcasts of video programs at very high compression and efficiency. Commands for setting and moving camera viewpoints, animating characters, and defining or controlling scenes and sounds are described. At a fine level of control, math models and coordinate systems can be used to make specifications. At a coarse level of control, the command language approaches the text format traditionally used in television or movie scripts. Simple names for objects within a scene are used to identify items, directions and paths. Commands are further simplified by having the rendering engine use defaults when specifications are left out.
Type: Grant
Filed: March 23, 2005
Date of Patent: June 12, 2012
Assignee: Quonsil PL. 3, LLC
Inventor: Charles J. Kulas
-
Patent number: 8199151
Abstract: A method of detecting an occurrence of an event of an event type during an animation, in which the animation comprises, for each of a plurality of object parts of an object, data defining the respective movement of that object part at each of a sequence of time-points for the animation, the method comprising: indicating the event type, wherein the event type specifies: one or more of the object parts; and a sequence of two or more event phases that occur during an event of that event type such that, for each event phase, the respective movements of the one or more specified object parts during that event phase are each constrained according to a constraint type associated with that event phase; and detecting an occurrence of an event of the event type by detecting a section of the animation during which the respective movements defined by the animation for the specified one or more object parts are constrained in accordance with the sequence of two or more event phases.
Type: Grant
Filed: February 13, 2009
Date of Patent: June 12, 2012
Assignee: Naturalmotion Ltd.
Inventor: Nicholas MacDonald Spencer
-
Patent number: 8194079
Abstract: A method is described to let animators control the extent by which kinematically scripted character motions affect dynamically simulated objects' motions. The dynamic objects are connected to the kinematic character, such as clothing or hair, and the motion of the dynamic objects is simulated based on the motion of the kinematic character. Such control is important to produce reasonable behavior of dynamic objects in the presence of physically unrealistic kinematic character motion. An Inertial Field Generator (IFG) is employed to compensate for the unreasonable behavior of dynamic objects when the kinematic character undergoes unrealistic motion.
Type: Grant
Filed: April 13, 2007
Date of Patent: June 5, 2012
Assignee: Pixar
Inventors: David E. Baraff, Andrew Witkin
-
Patent number: 8194080
Abstract: Among other disclosed subject matter, a computer-implemented method for generating a surface representation of an item includes identifying, for a point on an item in an animation process, at least first and second transformation points corresponding to respective first and second transformations of the point. Each of the first and second transformations represents an influence on a location of the point of respective first and second joints associated with the item. The method includes determining an axis for a cylindrical coordinate system using the first and second transformations. The method includes performing an interpolation of the first and second transformation points in the cylindrical coordinate system to obtain an interpolated point. The method includes recording the interpolated point in a surface representation of the item in the animation process.
Type: Grant
Filed: February 27, 2008
Date of Patent: June 5, 2012
Assignee: Lucasfilm Entertainment Company Ltd.
Inventors: Jason Smith, Frederic P. Pighin, Cary Phillips
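The core idea of interpolating candidate skin positions in cylindrical rather than Cartesian coordinates can be sketched briefly. This is a simplified illustration, not the patented method: the joint-derived axis is assumed to be the z-axis, and the weighting scheme is a plain linear blend.

```python
import math

def to_cyl(p):
    """Cartesian (x, y, z) -> cylindrical (r, theta, z) about the z-axis."""
    x, y, z = p
    return (math.hypot(x, y), math.atan2(y, x), z)

def from_cyl(c):
    r, theta, z = c
    return (r * math.cos(theta), r * math.sin(theta), z)

def interp_cyl(p1, p2, t):
    """Blend two transformation points in cylindrical coordinates."""
    r1, a1, z1 = to_cyl(p1)
    r2, a2, z2 = to_cyl(p2)
    # Interpolate the angle along the shorter arc.
    da = (a2 - a1 + math.pi) % (2 * math.pi) - math.pi
    return from_cyl((r1 + t * (r2 - r1), a1 + t * da, z1 + t * (z2 - z1)))

# Halfway between two points at 0 and 90 degrees on the unit circle:
p = interp_cyl((1.0, 0.0, 0.0), (0.0, 1.0, 0.0), 0.5)
```

Unlike a straight Cartesian blend, the cylindrical interpolation sweeps the point along an arc, so the radius from the axis is preserved; this is why such schemes avoid the "collapsing" artifacts of linear blend skinning near bending joints.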
-
Patent number: 8184122
Abstract: An electronic entertainment system for creating a video sequence by executing video game camera behavior based upon a video game sound file includes a memory configured to store an action event/camera behavior (AE/CB) database, game software such as an action generator module, and one or more sound files. In addition, the system includes a sound processing unit coupled to the memory for processing a selected sound file, and a processor coupled to the memory and the sound processing unit. The processor randomly selects an AE pointer and a CB pointer from the AE/CB database. Upon selection of the CB pointer and the AE pointer, the action generator executes camera behavior corresponding to the selected CB pointer to view an action event corresponding to the selected AE pointer.
Type: Grant
Filed: July 22, 2010
Date of Patent: May 22, 2012
Assignee: Sony Computer Entertainment America LLC
Inventor: Ed Annunziata
-
Patent number: 8184102
Abstract: Touch sensor methods, devices and systems are disclosed. One embodiment of the present invention pertains to a method comprising monitoring a finger movement along a touch sensing surface based on position data of a finger touching the touch sensing surface, where the position data is obtained by locating a position of a force applied by the finger in a coordinate of the touch sensing surface. In addition, the method comprises generating direction data associated with the finger movement if the finger movement travels for more than a threshold distance. Furthermore, the method comprises determining a finger gesture which corresponds to the finger movement using a lookup table having multiple preconfigured finger gestures based on the direction data.
Type: Grant
Filed: December 17, 2008
Date of Patent: May 22, 2012
Assignee: Cypress Semiconductor Corporation
Inventors: Tony Park, Luther Lu, Nelson Chow
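The threshold-then-lookup flow this abstract describes can be sketched in a few lines. The threshold value, the gesture table, and the four-way direction quantization below are all hypothetical illustrations, not details from the patent.

```python
import math

THRESHOLD = 10.0  # hypothetical minimum travel before direction data is generated

# Hypothetical lookup table of preconfigured gestures keyed by coarse direction.
GESTURES = {"right": "next", "left": "previous", "up": "scroll_up", "down": "scroll_down"}

def detect_gesture(start, end):
    """Return the gesture for a finger movement, or None if under threshold."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    if math.hypot(dx, dy) < THRESHOLD:
        return None                      # too short: no direction data generated
    # Quantize the movement to its dominant axis to index the lookup table.
    if abs(dx) >= abs(dy):
        direction = "right" if dx > 0 else "left"
    else:
        direction = "up" if dy > 0 else "down"
    return GESTURES[direction]
```

A 20-pixel rightward swipe would map to `"next"`, while a 3-pixel jitter stays below the threshold and produces no gesture at all.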
-
Patent number: 8169438
Abstract: In various embodiments, deformations caused by kinematic or reference objects to secondary objects such as hair or fur may be computed in parallel using a temporally coherent deformation technique. A single or uniform direction for a deformation may be determined from which the deformation of the secondary object will occur with respect to a reference object. The uniform direction for the deformation may be determined rather than allowing the direction of the deformation to vary along a dimension of the secondary object. The magnitude of the deformation may be determined to vary along the dimension of the secondary object in response to the penetration depth, a measure of how far the secondary object lies inside the reference object.
Type: Grant
Filed: March 31, 2008
Date of Patent: May 1, 2012
Assignee: Pixar
Inventors: David Baraff, Michael Fong, Christine Waggoner
-
Patent number: 8164590
Abstract: A method for a computer system includes determining a plurality of illumination modes associated with a plurality of scene descriptors, wherein the plurality of scene descriptors includes a first scene descriptor and a second scene descriptor, determining a first plurality of weights, wherein each weight from the first plurality of weights is associated with an illumination mode from the plurality of illumination modes, determining illumination data associated with the first scene descriptor in response to the first plurality of weights and in response to the plurality of illumination modes, determining a second plurality of weights, wherein each weight from the second plurality of weights is associated with an illumination mode from the plurality of illumination modes, and determining illumination data associated with the second scene descriptor in response to the second plurality of weights and in response to the plurality of illumination modes.
Type: Grant
Filed: May 13, 2011
Date of Patent: April 24, 2012
Assignee: Pixar
Inventors: John Anderson, Mark Meyer
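Stripped of the claim language, the scheme amounts to a weighted linear combination of shared illumination modes, with a different weight vector per scene descriptor. The sketch below illustrates that arithmetic with made-up mode values; it is not the patented pipeline.

```python
# Each illumination mode is a per-point light contribution (3 points, 2 modes;
# the values are invented for illustration).
modes = [
    [1.0, 0.0, 2.0],   # mode 0
    [0.0, 4.0, 2.0],   # mode 1
]

def illumination(weights, modes):
    """Weighted sum of the illumination modes, evaluated per point."""
    return [sum(w * m[i] for w, m in zip(weights, modes))
            for i in range(len(modes[0]))]

scene_a = illumination([1.0, 0.5], modes)   # first scene descriptor's weights
scene_b = illumination([0.0, 2.0], modes)   # second descriptor reuses the same modes
```

The appeal of such a decomposition is that the expensive part (computing the modes) is shared, while each scene descriptor only contributes a cheap weight vector.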
-
Patent number: 8164596
Abstract: Techniques are provided for automatically creating style sheet animations including keyframe information. In some embodiments, a style sheet animation creation tool with a timeline-based interface is provided. By interacting with the user-interface, the user can select a point on a timeline for an animation object to add a keyframe to an animation of the animation object. In response to the user's selection of the keyframe time point, the style sheet animation creation tool displays an interactive keyframe indicator on the timeline to indicate the selected time point. With the style sheet animation creation tool, a user can generate a style sheet animation without having to author style sheet language text statements by hand.
Type: Grant
Filed: October 6, 2011
Date of Patent: April 24, 2012
Assignee: Sencha, Inc.
Inventor: Arne Nikolai Bech
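Generating style-sheet keyframe text from timeline data, so that no style sheet statements are authored by hand, can be sketched as below. The animation name, property names, and values are illustrative assumptions, and the output format follows standard CSS `@keyframes` syntax rather than anything tool-specific.

```python
def css_keyframes(name, keyframes):
    """Render CSS @keyframes text from (percent, {property: value}) pairs."""
    rules = []
    for pct, props in sorted(keyframes):
        body = "; ".join(f"{k}: {v}" for k, v in sorted(props.items()))
        rules.append(f"  {pct}% {{ {body}; }}")
    return "@keyframes %s {\n%s\n}" % (name, "\n".join(rules))

# Two keyframes picked on a timeline become a complete animation rule:
css = css_keyframes("slide-in", [(0, {"left": "0px"}), (100, {"left": "200px"})])
```

Each timeline point the user selects becomes one percentage rule, which is how a timeline UI can stand in for hand-written style sheet text.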
-
Patent number: 8159504
Abstract: A system that incorporates teachings of the present disclosure may include, for example, an avatar engine having a controller to retrieve a user profile, present a user an avatar having characteristics that correlate to the user profile, detect a change in a developmental growth of the user, adapt a portion of the characteristics of the avatar responsive to the detected change, and present the user the adapted avatar. Other embodiments are disclosed.
Type: Grant
Filed: October 16, 2008
Date of Patent: April 17, 2012
Assignee: AT&T Intellectual Property I, L.P.
Inventors: E-Lee Chang, Horst Schroeter, Linda Roberts, Darnell Clayton, Madhur Khandelwal
-
Patent number: 8154545
Abstract: The invention relates to a method and a computer-aided modelling system for creating a technical drawing from at least two modelled 3D bodies that collide with one another. In a first step, one or more of the regions of the 3D bodies that are affected by the collision are selected. In a second step, a group of colliding faces of the selected regions of the two or more 3D bodies are combined to form a respective collision group and a technical drawing of the two or more colliding modelled 3D bodies is produced. A 2D edge or its associated boundary of a face that belongs to a collision group is treated by masking the other faces that are associated with the same collision group.
Type: Grant
Filed: December 6, 2005
Date of Patent: April 10, 2012
Assignee: Parametric Technology Corporation
Inventors: Manfred Göbel, Hans-Ulrich Becker, Jochen Dürr
-
Patent number: 8144155
Abstract: An approach to enrich skeleton-driven animations with physically-based secondary deformation in real time is described. To achieve this goal, the technique described employs a surface-based deformable model that can interactively emulate the dynamics of both low- and high-frequency volumetric effects. Given a surface mesh and a few sample sequences of its physical behavior, a set of motion parameters of the material are learned during an off-line preprocessing step. The deformable model is then applicable to any given skeleton-driven animation of the surface mesh. Additionally, the described dynamic skinning technique can be entirely implemented on GPUs and executed with great efficiency. Thus, with minimal changes to the conventional graphics pipeline, the technique can drastically enhance the visual experience of skeleton-driven animations by adding secondary deformation in real time.
Type: Grant
Filed: August 11, 2008
Date of Patent: March 27, 2012
Assignee: Microsoft Corp.
Inventors: Kun Zhou, Xiaohan Shi, Baining Guo
-
Patent number: 8139067
Abstract: Motion capture animation, shape completion and markerless motion capture methods are provided. A pose deformation space model encoding variability in pose is learnt from a three-dimensional (3D) dataset. A body shape deformation space model encoding variability in pose and shape is learnt from another 3D dataset. The learnt pose model is combined with the learnt body shape model. For motion capture animation, given a parameter set, the combined model generates a 3D shape surface of a body in a pose and shape. For shape completion, given a partial surface of a body defined as 3D points, the combined model generates a 3D surface model in the combined spaces that fits the 3D points. For markerless motion capture, given 3D information of a body, the combined model traces the movement of the body using the combined spaces that fit the 3D information, or reconstructs the body's shape or deformations to fit the 3D information.
Type: Grant
Filed: July 25, 2007
Date of Patent: March 20, 2012
Assignee: The Board of Trustees of the Leland Stanford Junior University
Inventors: Dragomir D. Anguelov, Praveen Srinivasan, Daphne Koller, Sebastian Thrun
-
Patent number: 8134558
Abstract: Systems and methods for editing of a computer-generated animation across a plurality of keyframe pairs are provided. Embodiments enable time editing across a plurality of non-roving keyframe pairs. Such non-roving keyframes have fixed references relative to an animation's reference timeline. An author may specify a point on an animation's reference timeline at which each non-roving keyframe is placed. In accordance with embodiments of the present invention, an animation across a plurality of non-roving keyframes is treated as an editable unit. Thus, an author may modify the timing for all or a select portion of such editable unit (which may span a plurality of the non-roving keyframes). For instance, an author may expand or reduce the time span for a plurality of non-roving keyframes, and the timing of the plurality of non-roving keyframes automatically adjusts to maintain their timing proportionality relative to each other in the resulting modified time span.
Type: Grant
Filed: December 6, 2007
Date of Patent: March 13, 2012
Assignee: Adobe Systems Incorporated
Inventor: John Mayhew
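The proportional retiming described in the last sentence is a simple affine rescale of the keyframe times. The sketch below shows that arithmetic under the assumption of a plain linear mapping; the keyframe times are made up for illustration.

```python
def retime(times, new_start, new_end):
    """Rescale keyframe times to a new span, preserving proportional spacing."""
    old_start, old_end = times[0], times[-1]
    scale = (new_end - new_start) / (old_end - old_start)
    return [new_start + (t - old_start) * scale for t in times]

# Expand keyframes at 0, 1 and 3 seconds to fill a 6-second span:
stretched = retime([0.0, 1.0, 3.0], 0.0, 6.0)
```

The middle keyframe still sits one third of the way through the span after the edit, which is exactly the "timing proportionality" the abstract says is maintained automatically.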
-
Patent number: 8134552
Abstract: A method, apparatus and medium to render three-dimensional (3D) objects for 3D graphics. The method includes detecting the presence of motion by performing local coordinate calculations with respect to each 3D object, performing global coordinate calculations only for those 3D objects that have motion, and rendering the 3D objects for which local and global coordinate calculations have been performed.
Type: Grant
Filed: September 22, 2006
Date of Patent: March 13, 2012
Assignee: Samsung Electronics Co., Ltd.
Inventors: Keechang Lee, Dokyoon Kim, Jeonghwan Ahn, Seyoon Tak
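The motion-gating idea, running the cheap local pass on everything but the global pass only on objects that actually moved, can be sketched as follows. The class, the scalar stand-in for a transform, and the update logic are illustrative assumptions, not the patented implementation.

```python
class Object3D:
    def __init__(self, name, local):
        self.name = name
        self.local = local          # local transform (a scalar stands in here)
        self.prev_local = local     # last value seen by the global pass
        self.world = local          # world-space result of the global pass

def update(objects):
    """Local pass on every object; global pass only where motion was detected."""
    moved = []
    for obj in objects:
        if obj.local != obj.prev_local:      # motion detected in the local pass
            obj.world = obj.local            # global pass (placeholder math)
            obj.prev_local = obj.local
            moved.append(obj.name)
    return moved                             # only these paid for global work

objs = [Object3D("a", 1.0), Object3D("b", 2.0)]
objs[0].local = 5.0                          # only "a" moves this frame
recomputed = update(objs)
```

In a real scene graph the global pass (world-matrix concatenation) dominates the cost, so skipping it for static objects is where the savings come from.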
-
Patent number: 8126293
Abstract: In the present invention, there is provided an image processing apparatus including: a detecting section configured to detect a motion vector from an input image signal acting as the image signal for each of chronologically input pixels; a determining section configured to determine whether the input image signal is cleared; and an interpolating section configured such that if the input image signal is not found cleared, then the interpolating section interpolates and outputs an intermediate image signal, interposed at a predetermined point in time between the uncleared input image signal and the preceding input image signal, in accordance with the motion vector; and if the input image signal is found cleared, then the interpolating section allows the input image signal to be output unchanged as the intermediate signal.
Type: Grant
Filed: April 10, 2008
Date of Patent: February 28, 2012
Assignee: Sony Corporation
Inventors: Masayuki Suematsu, Takayuki Ohe, Masato Usuki, Masanari Yamamoto, Makoto Haitani
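A one-dimensional toy version of motion-compensated interpolation with a "cleared" bypass might look like the sketch below. The half-motion shift and the clamping are simplifying assumptions for illustration; real apparatus of this kind works on 2-D blocks with per-block vectors.

```python
def interpolate(prev, curr, motion, cleared):
    """Return the frame halfway in time between prev and curr (1-D pixel row).

    motion: pixels the content moved rightward from prev to curr.
    cleared: if True, bypass interpolation and pass the input through.
    """
    if cleared:
        return list(curr)                    # output the input signal unchanged
    half = motion // 2
    n = len(prev)
    # Sample the previous frame shifted by half the motion vector (clamped).
    return [prev[min(max(x - half, 0), n - 1)] for x in range(n)]

# A bright pixel moving right by 2 lands halfway between its old and new spots:
mid = interpolate([0, 10, 0, 0], [0, 0, 0, 10], 2, False)
```

Placing the content half a motion vector along its trajectory, rather than blending the two frames in place, is what avoids the double-image ghosting of naive frame averaging.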