Patents by Inventor Miguel A. Otaduy

Miguel A. Otaduy has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11853659
    Abstract: Modeling cross-sections of yarn may include receiving yarn simulation input comprising a descriptive model of a general curvature followed by the yarn, providing a plurality of fibers distributed radially from the center of a ply, setting a base position based on parameters, applying a strain model to simulate the effect of stretch forces applied to the yarn, and outputting a yarn model indicating position and directionality of fibers in the yarn. The technology also relates to real-time modeling of a garment comprising a fabric. For instance, real-time modeling of a garment may include providing an input associated with one or more parameters of the fabric, receiving frames of a computer simulated garment, the computer simulated garment including a simulation of the fabric, the fabric simulation including yarns simulated based on a yarn model.
    Type: Grant
    Filed: May 4, 2021
    Date of Patent: December 26, 2023
    Assignee: SEDDI, INC.
    Inventors: Carlos Castillo, Miguel A. Otaduy, Carlos Aliaga, Jorge Lopez
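
The yarn-model abstract above outlines a pipeline: take a descriptive model of the yarn's curvature, distribute fibers radially about the ply center, and apply a strain model for stretch, yielding fiber positions and directions. The sketch below is only a loose illustration of that idea under invented assumptions (circular cross-section, volume-preserving thinning under stretch, NumPy naming); it is not SEDDI's patented method.

```python
import numpy as np

def yarn_fiber_model(centerline, n_fibers=16, ply_radius=0.1, stretch=1.0):
    """Toy yarn model: distribute fibers radially around a ply centerline.

    centerline: (N, 3) array of points describing the yarn's general curvature.
    stretch:    scalar stretch factor; a simple strain rule thins the ply
                (volume-preserving assumption) as the yarn is stretched.
    Returns fiber positions (n_fibers, N, 3) and unit tangents (N, 3).
    """
    centerline = np.asarray(centerline, dtype=float)
    # Tangent (directionality) along the yarn via finite differences.
    tangents = np.gradient(centerline, axis=0)
    tangents /= np.linalg.norm(tangents, axis=1, keepdims=True)

    # Orthonormal frame (normal, binormal) at each centerline point.
    ref = np.array([0.0, 0.0, 1.0])
    normals = np.cross(tangents, ref)
    normals /= np.linalg.norm(normals, axis=1, keepdims=True)
    binormals = np.cross(tangents, normals)

    # Simple strain model: stretching reduces the cross-section radius.
    radius = ply_radius / np.sqrt(stretch)

    # Distribute fibers radially around the ply center at each point.
    angles = np.linspace(0.0, 2.0 * np.pi, n_fibers, endpoint=False)
    fibers = np.empty((n_fibers, len(centerline), 3))
    for i, a in enumerate(angles):
        offset = radius * (np.cos(a) * normals + np.sin(a) * binormals)
        fibers[i] = centerline + offset
    return fibers, tangents

# Example: a gently curved yarn segment, stretched by 20%.
t = np.linspace(0.0, 1.0, 50)
centerline = np.stack([t, 0.05 * np.sin(4.0 * np.pi * t), np.zeros_like(t)], axis=1)
fibers, tangents = yarn_fiber_model(centerline, stretch=1.2)
```
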
  • Patent number: 11763536
    Abstract: A learning-based clothing animation method and system for highly efficient virtual try-on simulations is provided. Given a garment, the system preprocesses a rich database of physically-based dressed character simulations, for multiple body shapes and animations. Then, using the database, the system trains a learning-based model of cloth drape and wrinkles, as a function of body shape and dynamics. A model according to embodiments separates global garment fit, due to body shape, from local garment wrinkles, due to both pose dynamics and body shape. A recurrent neural network is provided to regress garment wrinkles, and the system achieves highly plausible nonlinear effects, in contrast to the blending artifacts suffered by previous methods.
    Type: Grant
    Filed: January 13, 2022
    Date of Patent: September 19, 2023
    Assignee: SEDDI, INC.
    Inventors: Igor Santesteban, Miguel A. Otaduy, Dan Casas
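
The abstract above decomposes garment deformation into a global fit term driven by body shape and a local wrinkle term regressed by a recurrent network from pose dynamics and shape. The following PyTorch-style module is a minimal hypothetical sketch of such a decomposition, not the patented system or its trained networks; the layer sizes, tensor layouts, and parameter dimensions are invented for illustration.

```python
import torch
import torch.nn as nn

class ClothAnimator(nn.Module):
    """Toy shape/wrinkle split: displacement = global fit + local wrinkles."""

    def __init__(self, n_verts=1000, shape_dim=10, pose_dim=72, hidden=256):
        super().__init__()
        # Static network: global garment fit from body shape parameters.
        self.fit_net = nn.Sequential(
            nn.Linear(shape_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, n_verts * 3),
        )
        # Recurrent network: local wrinkles from pose dynamics and shape.
        self.rnn = nn.GRU(pose_dim + shape_dim, hidden, batch_first=True)
        self.wrinkle_out = nn.Linear(hidden, n_verts * 3)

    def forward(self, shape, pose_seq):
        # shape:    (batch, shape_dim) body shape parameters
        # pose_seq: (batch, frames, pose_dim) pose sequence
        b, f, _ = pose_seq.shape
        fit = self.fit_net(shape).view(b, 1, -1, 3)               # global fit
        rnn_in = torch.cat([pose_seq, shape.unsqueeze(1).expand(b, f, -1)], dim=-1)
        h, _ = self.rnn(rnn_in)
        wrinkles = self.wrinkle_out(h).view(b, f, -1, 3)          # local wrinkles
        return fit + wrinkles                                     # per-frame vertex displacements

model = ClothAnimator()
shape = torch.randn(2, 10)
pose_seq = torch.randn(2, 30, 72)
displacements = model(shape, pose_seq)   # (2, 30, 1000, 3)
```
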
  • Publication number: 20220139058
    Abstract: A learning-based clothing animation method and system for highly efficient virtual try-on simulations is provided. Given a garment, the system preprocesses a rich database of physically-based dressed character simulations, for multiple body shapes and animations. Then, using the database, the system trains a learning-based model of cloth drape and wrinkles, as a function of body shape and dynamics. A model according to embodiments separates global garment fit, due to body shape, from local garment wrinkles, due to both pose dynamics and body shape. A recurrent neural network is provided to regress garment wrinkles, and the system achieves highly plausible nonlinear effects, in contrast to the blending artifacts suffered by previous methods.
    Type: Application
    Filed: January 13, 2022
    Publication date: May 5, 2022
    Applicant: SEDDI, INC.
    Inventors: Igor SANTESTEBAN, Miguel A. OTADUY, Dan CASAS
  • Patent number: 11250639
    Abstract: A learning-based clothing animation method and system for highly efficient virtual try-on simulations is provided. Given a garment, the system preprocesses a rich database of physically-based dressed character simulations, for multiple body shapes and animations. Then, using the database, the system trains a learning-based model of cloth drape and wrinkles, as a function of body shape and dynamics. A model according to embodiments separates global garment fit, due to body shape, from local garment wrinkles, due to both pose dynamics and body shape. A recurrent neural network is provided to regress garment wrinkles, and the system achieves highly plausible nonlinear effects, in contrast to the blending artifacts suffered by previous methods.
    Type: Grant
    Filed: December 11, 2019
    Date of Patent: February 15, 2022
    Assignee: SEDDI, INC.
    Inventors: Igor Santesteban, Miguel A. Otaduy, Dan Casas
  • Publication number: 20210256172
    Abstract: The technology relates to modeling cross-sections of yarn. For instance, modeling cross-sections of yarn may include receiving yarn simulation input comprising a descriptive model of a general curvature followed by the yarn, providing a plurality of fibers distributed radially from the center of a ply, setting a base position based on parameters, applying a strain model to simulate the effect of stretch forces applied to the yarn, and outputting a yarn model indicating position and directionality of fibers in the yarn. The technology also relates to real-time modeling of a garment comprising a fabric. For instance, real-time modeling of a garment may include providing an input associated with one or more parameters of the fabric, receiving frames of a computer simulated garment, the computer simulated garment including a simulation of the fabric, the fabric simulation including yarns simulated based on a yarn model.
    Type: Application
    Filed: May 4, 2021
    Publication date: August 19, 2021
    Applicant: SEDDI, INC.
    Inventors: Carlos CASTILLO, Miguel A. OTADUY, Carlos ALIAGA, Jorge LOPEZ
  • Publication number: 20210118239
    Abstract: A learning-based clothing animation method and system for highly efficient virtual try-on simulations is provided. Given a garment, the system preprocesses a rich database of physically-based dressed character simulations, for multiple body shapes and animations. Then, using the database, the system trains a learning-based model of cloth drape and wrinkles, as a function of body shape and dynamics. A model according to embodiments separates global garment fit, due to body shape, from local garment wrinkles, due to both pose dynamics and body shape. A recurrent neural network is provided to regress garment wrinkles, and the system achieves highly plausible nonlinear effects, in contrast to the blending artifacts suffered by previous methods.
    Type: Application
    Filed: December 11, 2019
    Publication date: April 22, 2021
    Applicant: SEDDI, INC.
    Inventors: Igor SANTESTEBAN, Miguel A. OTADUY, Dan CASAS
  • Patent number: 9623608
    Abstract: In an object generation system, consumable base materials are characterized in a characterization process wherein an object generation system can use a plurality of so-characterized base materials. User input representing a desired object and a set of characteristics for that desired object is processed, using a computer or computing device, to derive a mapping of locations for placement of portions of the plurality of base materials such that, when the mapping is provided to an object generator, the generated object approximates the desired object and set of characteristics. The characterization of a base material might include elasticity of the base material, the user input might be a desired shape and elasticity, the object generator might be a 3D multi-material printer, and the generated object might at least approximate the desired shape and elasticity as a result of being constructed from the plurality of base materials used by the printer.
    Type: Grant
    Filed: October 21, 2013
    Date of Patent: April 18, 2017
    Assignee: DISNEY ENTERPRISES, INC.
    Inventors: Bernd Bickel, Wojciech Matusik, Miguel A. Otaduy, Markus Gross, Hanspeter Pfister
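
The abstract above maps characterized base materials to locations so that the printed object approximates a desired shape and elasticity. As a greatly simplified stand-in for that idea (not the patented optimization), the sketch below assigns each voxel the characterized material whose elasticity is closest to the per-voxel target; the material library, moduli, and log-space metric are assumptions.

```python
import numpy as np

# Hypothetical characterization of consumable base materials: name -> Young's modulus (MPa).
BASE_MATERIALS = {"soft_rubber": 1.5, "medium_rubber": 12.0, "rigid_plastic": 2400.0}

def map_materials(target_elasticity, materials=BASE_MATERIALS):
    """Toy material mapping: pick, per voxel, the characterized base material
    whose elasticity is closest (in log space) to the desired elasticity.

    target_elasticity: (X, Y, Z) array of desired per-voxel Young's moduli.
    Returns an (X, Y, Z) array of material names for the object generator.
    """
    names = np.array(list(materials.keys()))
    moduli = np.array([materials[n] for n in names])
    # Compare in log space so soft and stiff materials are weighted evenly.
    diff = np.abs(np.log(target_elasticity[..., None]) - np.log(moduli))
    return names[np.argmin(diff, axis=-1)]

# Example: a bar whose desired stiffness grows from soft to rigid along its length.
target = np.logspace(0, 3, 20)[:, None, None] * np.ones((20, 4, 4))
placement = map_materials(target)
print(placement[0, 0, 0], placement[-1, 0, 0])  # softest voxel vs. stiffest voxel
```
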
  • Publication number: 20140046469
    Abstract: In an object generation system, consumable base materials are characterized in a characterization process wherein an object generation system can use a plurality of so-characterized base materials. User input representing a desired object and a set of characteristics for that desired object is processed, using a computer or computing device, to derive a mapping of locations for placement of portions of the plurality of base materials such that, when the mapping is provided to an object generator, the generated object approximates the desired object and set of characteristics. The characterization of a base material might include elasticity of the base material, the user input might be a desired shape and elasticity, the object generator might be a 3D multi-material printer, and the generated object might at least approximate the desired shape and elasticity as a result of being constructed from the plurality of base materials used by the printer.
    Type: Application
    Filed: October 21, 2013
    Publication date: February 13, 2014
    Applicant: Disney Enterprises, Inc.
    Inventors: Bernd Bickel, Wojciech Matusik, Miguel A. Otaduy, Markus Gross, Hanspeter Pfister
  • Patent number: 8565909
    Abstract: In an object generation system, consumable base materials are characterized in a characterization process wherein an object generation system can use a plurality of so-characterized base materials. User input representing a desired object and a set of characteristics for that desired object is processed, using a computer or computing device, to derive a mapping of locations for placement of portions of the plurality of base materials such that, when the mapping is provided to an object generator, the generated object approximates the desired object and set of characteristics. The characterization of a base material might include elasticity of the base material, the user input might be a desired shape and elasticity, the object generator might be a 3D multi-material printer, and the generated object might at least approximate the desired shape and elasticity as a result of being constructed from the plurality of base materials used by the printer.
    Type: Grant
    Filed: February 18, 2011
    Date of Patent: October 22, 2013
    Assignee: Disney Enterprises, Inc.
    Inventors: Bernd Bickel, Wojciech Matusik, Miguel A. Otaduy, Markus Gross, Hanspeter Pfister
  • Publication number: 20120053716
    Abstract: In an object generation system, consumable base materials are characterized in a characterization process wherein an object generation system can use a plurality of so-characterized base materials. User input representing a desired object and a set of characteristics for that desired object is processed, using a computer or computing device, to derive a mapping of locations for placement of portions of the plurality of base materials such that, when the mapping is provided to an object generator, the generated object approximates the desired object and set of characteristics. The characterization of a base material might include elasticity of the base material, the user input might be a desired shape and elasticity, the object generator might be a 3D multi-material printer, and the generated object might at least approximate the desired shape and elasticity as a result of being constructed from the plurality of base materials used by the printer.
    Type: Application
    Filed: February 18, 2011
    Publication date: March 1, 2012
    Applicant: Disney Enterprises, Inc.
    Inventors: Bernd Bickel, Wojciech Matusik, Miguel A. Otaduy, Markus Gross, Hanspeter Pfister
  • Patent number: 7289106
    Abstract: An apparatus comprises a manipulandum, a housing, a sensor and an actuator. The housing has a palpation region spaced apart from the manipulandum. The sensor is coupled to the palpation region of the housing. The sensor is configured to send a signal based on a palpation of the palpation region of the housing. The actuator is coupled to the manipulandum. The actuator is configured to send haptic output to the manipulandum based on the signal.
    Type: Grant
    Filed: May 17, 2004
    Date of Patent: October 30, 2007
    Assignee: Immersion Medical, Inc.
    Inventors: David Bailey, J. Michael Brown, Robert Cohen, Richard L. Cunningham, Robert B. Falk, Miguel A. Otaduy, Victor Wu
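
The abstract above describes a signal path rather than an algorithm: a sensor on the housing's palpation region produces a signal, and an actuator drives haptic output to the manipulandum based on that signal. The toy control-loop sketch below only illustrates that flow with invented placeholder classes; no real device API is implied.

```python
from dataclasses import dataclass

@dataclass
class PalpationSensor:
    """Stand-in for the sensor coupled to the housing's palpation region."""
    pressure: float = 0.0  # most recent palpation pressure reading (N)

    def read(self) -> float:
        return self.pressure

class ManipulandumActuator:
    """Stand-in for the actuator coupled to the manipulandum."""
    def apply_force(self, force: float) -> None:
        print(f"haptic output to manipulandum: {force:.2f} N")

def control_step(sensor: PalpationSensor, actuator: ManipulandumActuator, gain: float = 0.8) -> None:
    """One loop iteration: palpation signal in, haptic output out."""
    signal = sensor.read()
    actuator.apply_force(gain * signal)   # haptic output based on the palpation signal

sensor = PalpationSensor(pressure=2.5)    # user palpates the housing
control_step(sensor, ManipulandumActuator())
```
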
  • Publication number: 20050219205
    Abstract: An apparatus comprises a manipulandum, a housing, a sensor and an actuator. The housing has a palpation region spaced apart from the manipulandum. The sensor is coupled to the palpation region of the housing. The sensor is configured to send a signal based on a palpation of the palpation region of the housing. The actuator is coupled to the manipulandum. The actuator is configured to send haptic output to the manipulandum based on the signal.
    Type: Application
    Filed: May 17, 2004
    Publication date: October 6, 2005
    Inventors: David Bailey, J. Brown, Robert Cohen, Richard Cunningham, Robert Falk, Miguel Otaduy, Victor Wu