Patents by Inventor Mark Pauly
Mark Pauly has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
- Patent number: 11948238
  Abstract: Embodiments relate to a method for real-time facial animation, and a processing device for real-time facial animation. The method includes providing a dynamic expression model, receiving tracking data corresponding to a facial expression of a user, estimating tracking parameters based on the dynamic expression model and the tracking data, and refining the dynamic expression model based on the tracking data and estimated tracking parameters. The method may further include generating a graphical representation corresponding to the facial expression of the user based on the tracking parameters. Embodiments pertain to a real-time facial animation system.
  Type: Grant
  Filed: May 27, 2022
  Date of Patent: April 2, 2024
  Assignee: Apple Inc.
  Inventors: Sofien Bouaziz, Mark Pauly
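
The abstract above describes estimating tracking parameters against a dynamic expression model and then refining that model from the observed data. Purely as an illustration of the first step, the sketch below fits blendshape weights to tracked 3D facial landmarks by regularized least squares; the linear blendshape model, the variable names, and the regularization are assumptions for illustration, not the patented method.

```python
import numpy as np

def estimate_tracking_parameters(landmarks, neutral, blendshapes, reg=1e-3):
    """Fit blendshape weights w so that neutral + B @ w approximates the
    tracked landmarks. Regularized least squares; illustrative only."""
    b = landmarks.ravel() - neutral.ravel()          # observed offset from the neutral face
    B = np.stack([s.ravel() - neutral.ravel() for s in blendshapes], axis=1)
    # Solve (B^T B + reg*I) w = B^T b for the expression weights w.
    A = B.T @ B + reg * np.eye(B.shape[1])
    w = np.linalg.solve(A, B.T @ b)
    return np.clip(w, 0.0, 1.0)                      # blendshape weights usually live in [0, 1]

# Hypothetical usage with random stand-in data (3 blendshapes, 68 landmarks).
rng = np.random.default_rng(0)
neutral = rng.normal(size=(68, 3))
blendshapes = [neutral + 0.1 * rng.normal(size=(68, 3)) for _ in range(3)]
observed = neutral + 0.05 * (blendshapes[1] - neutral)
print(estimate_tracking_parameters(observed, neutral, blendshapes))
```

A real tracker would repeat this fit per frame and, per the abstract, also update the expression model itself as more of the user's face is observed.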
- Patent number: 11836838
  Abstract: A method of animating a digital character according to facial expressions of a user, comprising the steps of, (a) obtaining a 2D image and 3D depth map of the face of the user, (b) determining expression parameters for a user expression model so that a facial expression of the user-specific expression model represents the face of the user shown in the 2D image and 3D depth map (c) using the expression parameters and an animation prior to determine animation parameters usable to animate a digital character, wherein the animation prior is a sequence of animation parameters which represent predefined animations of a digital character (d) using the animation parameters to animate a digital character so that the digital character mimics the face of the user.
  Type: Grant
  Filed: December 3, 2020
  Date of Patent: December 5, 2023
  Assignee: Apple Inc.
  Inventors: Thibaut Weise, Sofien Bouaziz, Hao Li, Mark Pauly
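
This abstract describes mapping user expression parameters to character animation parameters through an animation prior built from predefined animation sequences. One simple way to illustrate the idea of such a prior, shown below, is to project raw expression parameters onto a low-dimensional subspace learned from example animation frames; the PCA-style prior, function names, and data are assumptions for illustration, not the claimed method.

```python
import numpy as np

def build_animation_prior(example_frames, n_components=4):
    """Learn a linear subspace (mean + principal directions) from example
    animation-parameter frames (rows = frames). Illustrative PCA prior."""
    mean = example_frames.mean(axis=0)
    centered = example_frames - mean
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean, vt[:n_components]          # basis rows span the prior subspace

def apply_prior(expression_params, mean, basis):
    """Project raw expression parameters onto the prior subspace to obtain
    animation parameters consistent with the example animations."""
    coeffs = basis @ (expression_params - mean)
    return mean + basis.T @ coeffs

# Hypothetical usage with random stand-in data: 200 frames of 15 parameters.
rng = np.random.default_rng(1)
frames = rng.normal(size=(200, 15))
mean, basis = build_animation_prior(frames)
raw = rng.normal(size=15)
print(apply_prior(raw, mean, basis).shape)
```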
- Publication number: 20230343013
  Abstract: A method of animating a digital character according to facial expressions of a user, comprising the steps of, (a) obtaining a 2D image and 3D depth map of the face of the user, (b) determining expression parameters for a user expression model so that a facial expression of the user-specific expression model represents the face of the user shown in the 2D image and 3D depth map (c) using the expression parameters and an animation prior to determine animation parameters usable to animate a digital character, wherein the animation prior is a sequence of animation parameters which represent predefined animations of a digital character (d) using the animation parameters to animate a digital character so that the digital character mimics the face of the user.
  Type: Application
  Filed: May 9, 2023
  Publication date: October 26, 2023
  Inventors: Thibaut Weise, Sofien Bouaziz, Hao Li, Mark Pauly
- Patent number: 11568601
  Abstract: Technologies are provided herein for modeling and tracking physical objects, such as human hands, within a field of view of a depth sensor. A sphere-mesh model of the physical object can be created and used to track the physical object in real-time. The sphere-mesh model comprises an explicit skeletal mesh and an implicit convolution surface generated based on the skeletal mesh. The skeletal mesh parameterizes the convolution surface and distances between points in data frames received from the depth sensor and the sphere-mesh model can be efficiently determined using the skeletal mesh. The sphere-mesh model can be automatically calibrated by dynamically adjusting positions and associated radii of vertices in the skeletal mesh to fit the convolution surface to a particular physical object.
  Type: Grant
  Filed: August 14, 2017
  Date of Patent: January 31, 2023
  Assignee: UVic Industry Partnerships Inc.
  Inventors: Andrea Tagliasacchi, Anastasia Tkach, Mark Pauly
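
The sphere-mesh model described above pairs an explicit skeletal mesh with an implicit convolution surface so that point-to-model distances can be evaluated from the skeleton alone. As a minimal sketch of the kind of query this makes cheap, the code below approximates the distance from a depth-sensor point to one skeletal edge whose endpoints carry radii; the radius interpolation is a common simplification and an assumption here, not the patented calibration procedure.

```python
import numpy as np

def pill_distance(p, a, b, ra, rb):
    """Approximate signed distance from point p to the surface swept by
    spheres of radius ra..rb along segment a->b. The radius is interpolated
    at the closest point on the segment rather than computed exactly."""
    ab = b - a
    t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
    closest = a + t * ab
    radius = (1.0 - t) * ra + t * rb
    return np.linalg.norm(p - closest) - radius

# Hypothetical usage: distance of a depth-sensor point to one finger segment.
p = np.array([0.02, 0.10, 0.40])
a, b = np.array([0.0, 0.08, 0.38]), np.array([0.0, 0.12, 0.38])
print(pill_distance(p, a, b, ra=0.008, rb=0.006))
```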
- Patent number: 11348299
  Abstract: Embodiments relate to a method for real-time facial animation, and a processing device for real-time facial animation. The method includes providing a dynamic expression model, receiving tracking data corresponding to a facial expression of a user, estimating tracking parameters based on the dynamic expression model and the tracking data, and refining the dynamic expression model based on the tracking data and estimated tracking parameters. The method may further include generating a graphical representation corresponding to the facial expression of the user based on the tracking parameters. Embodiments pertain to a real-time facial animation system.
  Type: Grant
  Filed: January 27, 2020
  Date of Patent: May 31, 2022
  Assignee: Apple Inc.
  Inventors: Sofien Bouaziz, Mark Pauly
- Patent number: 11298966
  Abstract: The invention relates to a thin optical security element comprising a reflective or refractive light-redirecting surface having a relief pattern operable to redirect incident light from a light source and form a projected image on a projection surface, the projected image comprising a caustic pattern reproducing a reference pattern that is easily visually recognizable by a person. The invention also relates to a method for designing a relief pattern of a light-redirecting surface of a thin optical security element.
  Type: Grant
  Filed: September 28, 2018
  Date of Patent: April 12, 2022
  Assignee: SICPA HOLDING SA
  Inventors: Andrea Callegari, Pierre Degott, Todor Dinoev, Christophe Garnier, Alain Mayer, Yuliy Schwartzburg, Romain Testuz, Mark Pauly
- Patent number: 11292283
  Abstract: The invention relates to a thin optical security element comprising a reflective or refractive light-redirecting surface having a relief pattern operable to redirect incident light from a light source and form a projected image on a projection surface, the optical parameters of this optical security element fulfilling a specific projection criterion such that the projected image comprises a caustic pattern reproducing a reference pattern that is easily visually recognizable by a person. The invention also relates to a method for designing a relief pattern of an optical security element.
  Type: Grant
  Filed: September 28, 2018
  Date of Patent: April 5, 2022
  Assignee: SICPA HOLDING SA
  Inventors: Andrea Callegari, Pierre Degott, Todor Dinoev, Christophe Garnier, Alain Mayer, Yuliy Schwartzburg, Romain Testuz, Mark Pauly
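
This and the preceding SICPA patent concern security elements whose caustic projection reproduces a visually recognizable reference pattern. The abstract does not state the projection criterion itself, so the sketch below uses normalized cross-correlation between a (simulated or photographed) caustic image and the reference pattern only as a generic, assumed proxy for "the caustic reproduces the reference"; it is not the criterion claimed in the patent.

```python
import numpy as np

def pattern_similarity(caustic_image, reference_pattern):
    """Normalized cross-correlation between a caustic image and the
    reference pattern; values near 1.0 indicate a faithful reproduction.
    A generic proxy for illustration, not the patented criterion."""
    c = caustic_image - caustic_image.mean()
    r = reference_pattern - reference_pattern.mean()
    return float((c * r).sum() / (np.linalg.norm(c) * np.linalg.norm(r) + 1e-12))

# Hypothetical usage with stand-in 64x64 grayscale images.
rng = np.random.default_rng(2)
ref = rng.random((64, 64))
noisy = ref + 0.1 * rng.normal(size=(64, 64))
print(pattern_similarity(noisy, ref))   # close to 1.0 for a faithful reproduction
```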
- Publication number: 20210174567
  Abstract: A method of animating a digital character according to facial expressions of a user, comprising the steps of, (a) obtaining a 2D image and 3D depth map of the face of the user, (b) determining expression parameters for a user expression model so that a facial expression of the user-specific expression model represents the face of the user shown in the 2D image and 3D depth map (c) using the expression parameters and an animation prior to determine animation parameters usable to animate a digital character, wherein the animation prior is a sequence of animation parameters which represent predefined animations of a digital character (d) using the animation parameters to animate a digital character so that the digital character mimics the face of the user.
  Type: Application
  Filed: December 3, 2020
  Publication date: June 10, 2021
  Inventors: Thibaut Weise, Sofien Bouaziz, Hao Li, Mark Pauly
- Patent number: 10861211
  Abstract: A method of animating a digital character according to facial expressions of a user, comprising the steps of, (a) obtaining a 2D image and 3D depth map of the face of the user, (b) determining expression parameters for a user expression model so that a facial expression of the user-specific expression model represents the face of the user shown in the 2D image and 3D depth map (c) using the expression parameters and an animation prior to determine animation parameters usable to animate a digital character, wherein the animation prior is a sequence of animation parameters which represent predefined animations of a digital character (d) using the animation parameters to animate a digital character so that the digital character mimics the face of the user.
  Type: Grant
  Filed: July 2, 2018
  Date of Patent: December 8, 2020
  Assignee: Apple Inc.
  Inventors: Thibaut Weise, Sofien Bouaziz, Hao Li, Mark Pauly
- Patent number: 10732405
  Abstract: Method of designing a refractive surface, comprising providing a refractive object having a refractive surface with an initial geometry, determining a refraction of incident illumination through the refractive surface with the initial geometry to create a source irradiance E_S distribution on a receiver; and determining a shape of the refractive surface of the refractive object such that a resulting irradiance distribution on the receiver matches a desired target irradiance E_T provided by a user.
  Type: Grant
  Filed: June 11, 2015
  Date of Patent: August 4, 2020
  Assignee: ÉCOLE POLYTECHNIQUE FÉDÉRALE DE LAUSANNE (EPFL)
  Inventors: Mark Pauly, Romain P. Testuz, Yuliy Schwartzburg
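
The abstract's first step is to refract the incident illumination through the initial surface and accumulate the resulting source irradiance E_S on the receiver. The sketch below does only that, tracing parallel rays through a sampled height field with Snell's law and histogramming the hits; the sampling scheme and parameters are assumptions, and the shape-optimization step that matches E_T is not reproduced here.

```python
import numpy as np

def refract(d, n, eta):
    """Refract unit direction d at surface normal n (Snell's law),
    with eta = n_in / n_out. Returns None on total internal reflection."""
    cos_i = -np.dot(n, d)
    k = 1.0 - eta * eta * (1.0 - cos_i * cos_i)
    if k < 0.0:
        return None
    return eta * d + (eta * cos_i - np.sqrt(k)) * n

def receiver_irradiance(points, normals, receiver_z=1.0, bins=32, eta=1.0 / 1.5):
    """Trace parallel rays (0,0,1) through surface samples 'points' with
    per-sample 'normals', intersect the plane z = receiver_z, and histogram
    the hits as a crude estimate of the source irradiance E_S."""
    d = np.array([0.0, 0.0, 1.0])
    hits = []
    for p, n in zip(points, normals):
        r = refract(d, n, eta)
        if r is None or r[2] <= 0:
            continue
        t = (receiver_z - p[2]) / r[2]
        hits.append((p + t * r)[:2])
    hits = np.array(hits)
    E_S, _, _ = np.histogram2d(hits[:, 0], hits[:, 1], bins=bins)
    return E_S

# Hypothetical usage: a flat slab with slightly perturbed normals.
rng = np.random.default_rng(3)
pts = np.column_stack([rng.uniform(-0.5, 0.5, size=(500, 2)), np.zeros(500)])
nrm = np.tile([0.0, 0.0, -1.0], (500, 1)) + 0.05 * rng.normal(size=(500, 3))
nrm /= np.linalg.norm(nrm, axis=1, keepdims=True)
print(receiver_irradiance(pts, nrm).sum())
```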
- Patent number: 10586372
  Abstract: Embodiments relate to a method for real-time facial animation, and a processing device for real-time facial animation. The method includes providing a dynamic expression model, receiving tracking data corresponding to a facial expression of a user, estimating tracking parameters based on the dynamic expression model and the tracking data, and refining the dynamic expression model based on the tracking data and estimated tracking parameters. The method may further include generating a graphical representation corresponding to the facial expression of the user based on the tracking parameters. Embodiments pertain to a real-time facial animation system.
  Type: Grant
  Filed: January 28, 2019
  Date of Patent: March 10, 2020
  Assignee: Apple Inc.
  Inventors: Sofien Bouaziz, Mark Pauly
- Publication number: 20190139287
  Abstract: A method of animating a digital character according to facial expressions of a user, comprising the steps of, (a) obtaining a 2D image and 3D depth map of the face of the user, (b) determining expression parameters for a user expression model so that a facial expression of the user-specific expression model represents the face of the user shown in the 2D image and 3D depth map (c) using the expression parameters and an animation prior to determine animation parameters usable to animate a digital character, wherein the animation prior is a sequence of animation parameters which represent predefined animations of a digital character (d) using the animation parameters to animate a digital character so that the digital character mimics the face of the user.
  Type: Application
  Filed: July 2, 2018
  Publication date: May 9, 2019
  Inventors: Thibaut Weise, Sofien Bouaziz, Hao Li, Mark Pauly
- Patent number: 10192343
  Abstract: Embodiments relate to a method for real-time facial animation, and a processing device for real-time facial animation. The method includes providing a dynamic expression model, receiving tracking data corresponding to a facial expression of a user, estimating tracking parameters based on the dynamic expression model and the tracking data, and refining the dynamic expression model based on the tracking data and estimated tracking parameters. The method may further include generating a graphical representation corresponding to the facial expression of the user based on the tracking parameters. Embodiments pertain to a real-time facial animation system.
  Type: Grant
  Filed: July 5, 2017
  Date of Patent: January 29, 2019
  Assignee: faceshift AG
  Inventors: Sofien Bouaziz, Mark Pauly
- Patent number: 10013787
  Abstract: A method of animating a digital character according to facial expressions of a user, comprising the steps of, (a) obtaining a 2D image and 3D depth map of the face of the user, (b) determining expression parameters for a user expression model so that a facial expression of the user-specific expression model represents the face of the user shown in the 2D image and 3D depth map (c) using the expression parameters and an animation prior to determine animation parameters usable to animate a digital character, wherein the animation prior is a sequence of animation parameters which represent predefined animations of a digital character (d) using the animation parameters to animate a digital character so that the digital character mimics the face of the user.
  Type: Grant
  Filed: December 12, 2011
  Date of Patent: July 3, 2018
  Assignee: Faceshift AG
  Inventors: Thibaut Weise, Sofien Bouaziz, Hao Li, Mark Pauly
- Patent number: 9964942
  Abstract: Embodiments of a system and a method for continuously measuring cementitious board during the continuous manufacture thereof can be used in connection with the manufacture of various cementitious products, including gypsum wallboard, for example. Embodiments of a system and a method for continuously measuring cementitious board during its continuous manufacture can be used online in a continuous manufacturing process to effectively determine the degree to which cementitious slurry has set (e.g., expressed as percent hydration) at a predetermined location, such as, near a cutting station, for example. A height measuring system can be used to determine the relative amount the cementitious board sags as it passes over an unsupported span disposed between the forming station and the cutting station and to correlate the measured sag distance with a value of percent hydration of the cementitious slurry of that particular portion of the cementitious board.
  Type: Grant
  Filed: September 21, 2016
  Date of Patent: May 8, 2018
  Assignee: United States Gypsum Company
  Inventors: Robert Nelson, Christopher Mark Pauly, Charles Whittington, Andrew Rowe
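
The abstract describes measuring how far the board sags over an unsupported span and correlating that sag with percent hydration of the slurry. The sketch below interpolates a measured sag against a calibration curve and applies a go/no-go threshold near the cutting station; the calibration values and threshold are hypothetical placeholders, not figures from the patent.

```python
import numpy as np

# Hypothetical calibration curve: measured sag (mm) vs. percent hydration.
# Placeholder numbers for illustration only, not data from the patent.
CAL_SAG_MM = np.array([12.0, 8.0, 5.0, 3.0, 1.5])
CAL_HYDRATION_PCT = np.array([40.0, 55.0, 70.0, 85.0, 95.0])

def percent_hydration(sag_mm):
    """Estimate percent hydration from a sag measurement by linear
    interpolation; sag decreases as the slurry sets, so the axes are
    negated to give np.interp an increasing abscissa."""
    return float(np.interp(-sag_mm, -CAL_SAG_MM, CAL_HYDRATION_PCT))

def ready_to_cut(sag_mm, threshold_pct=80.0):
    """Crude go/no-go check near the cutting station (threshold assumed)."""
    return percent_hydration(sag_mm) >= threshold_pct

print(percent_hydration(4.0), ready_to_cut(4.0))
```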
- Patent number: 9734617
  Abstract: Embodiments relate to a method for real-time facial animation, and a processing device for real-time facial animation. The method includes providing a dynamic expression model, receiving tracking data corresponding to a facial expression of a user, estimating tracking parameters based on the dynamic expression model and the tracking data, and refining the dynamic expression model based on the tracking data and estimated tracking parameters. The method may further include generating a graphical representation corresponding to the facial expression of the user based on the tracking parameters. Embodiments pertain to a real-time facial animation system.
  Type: Grant
  Filed: May 27, 2016
  Date of Patent: August 15, 2017
  Assignee: faceshift AG
  Inventors: Sofien Bouaziz, Mark Pauly
- Publication number: 20170131701
  Abstract: Embodiments of a system and a method for continuously measuring cementitious board during the continuous manufacture thereof can be used in connection with the manufacture of various cementitious products, including gypsum wallboard, for example. Embodiments of a system and a method for continuously measuring cementitious board during its continuous manufacture can be used online in a continuous manufacturing process to effectively determine the degree to which cementitious slurry has set (e.g., expressed as percent hydration) at a predetermined location, such as, near a cutting station, for example. A height measuring system can be used to determine the relative amount the cementitious board sags as it passes over an unsupported span disposed between the forming station and the cutting station and to correlate the measured sag distance with a value of percent hydration of the cementitious slurry of that particular portion of the cementitious board.
  Type: Application
  Filed: September 21, 2016
  Publication date: May 11, 2017
  Inventors: Robert Nelson, Christopher Mark Pauly, Charles Whittington, Andrew Rowe
- Patent number: 9576553
  Abstract: A method and apparatus for producing a reflective or refractive surface that reflects or refracts light shined thereon and reproduces on a screen a desired greyscale intensity image on which the reflective or refractive surface is based and a corresponding apparatus, wherein the method permits a reproduction of a reference grayscale image with adjustable precision.
  Type: Grant
  Filed: August 23, 2013
  Date of Patent: February 21, 2017
  Assignee: ECOLE POLYTECHNIQUE FEDERALE DE LAUSANNE
  Inventors: Mark Pauly, Thomas Kiser
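
This patent concerns surfaces that redirect light to reproduce a greyscale target image with adjustable precision. A preliminary step common to this class of problem, shown below only as an assumed illustration and not as the patented method, is to normalize the target so its total brightness matches the flux the surface receives, since a passive reflector or refractor can only redistribute light, not create it.

```python
import numpy as np

def normalize_target(target, incoming_flux):
    """Scale a greyscale target image so its pixel sum equals the flux
    available from the light source; matching totals is a prerequisite
    for exact reproduction by a passive surface. Illustrative only."""
    target = np.asarray(target, dtype=float)
    total = target.sum()
    if total <= 0:
        raise ValueError("target image must contain some brightness")
    return target * (incoming_flux / total)

# Hypothetical usage: a 4x4 target and unit incoming flux.
img = np.array([[0, 1, 2, 1], [1, 3, 3, 1], [1, 3, 3, 1], [0, 1, 2, 1]], float)
print(normalize_target(img, incoming_flux=1.0).sum())   # 1.0
```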
- Patent number: 9378576
  Abstract: Embodiments relate to a method for real-time facial animation, and a processing device for real-time facial animation. The method includes providing a dynamic expression model, receiving tracking data corresponding to a facial expression of a user, estimating tracking parameters based on the dynamic expression model and the tracking data, and refining the dynamic expression model based on the tracking data and estimated tracking parameters. The method may further include generating a graphical representation corresponding to the facial expression of the user based on the tracking parameters. Embodiments pertain to a real-time facial animation system.
  Type: Grant
  Filed: June 7, 2013
  Date of Patent: June 28, 2016
  Assignee: faceshift AG
  Inventors: Sofien Bouaziz, Mark Pauly
- Publication number: 20130147788
  Abstract: A method of animating a digital character according to facial expressions of a user, comprising the steps of, (a) obtaining a 2D image and 3D depth map of the face of the user, (b) determining expression parameters for a user expression model so that a facial expression of the user-specific expression model represents the face of the user shown in the 2D image and 3D depth map (c) using the expression parameters and an animation prior to determine animation parameters usable to animate a digital character, wherein the animation prior is a sequence of animation parameters which represent predefined animations of a digital character (d) using the animation parameters to animate a digital character so that the digital character mimics the face of the user.
  Type: Application
  Filed: December 12, 2011
  Publication date: June 13, 2013
  Inventors: Thibaut Weise, Sofien Bouaziz, Hao Li, Mark Pauly