Patents by Inventor Evan Patrick Atherton

Evan Patrick Atherton has filed for patents to protect the following inventions. This listing includes both pending patent applications and patents already granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11953879
    Abstract: An agent engine allocates a collection of agents to scan the surface of an object model. Each agent operates autonomously and implements particular behaviors based on the actions of nearby agents. Accordingly, the collection of agents exhibits swarm-like behavior. Over a sequence of time steps, the agents traverse the surface of the object model. Each agent acts to avoid other agents, thereby maintaining a relatively consistent distribution of agents across the surface of the object model over all time steps. At a given time step, the agent engine generates a slice through the object model that intersects each agent in a group of agents. The slice associated with a given time step represents a set of locations where material should be deposited to fabricate a 3D object. Based on a set of such slices, a robot engine causes a robot to fabricate the 3D object.
    Type: Grant
    Filed: September 8, 2020
    Date of Patent: April 9, 2024
    Assignee: AUTODESK, INC.
    Inventors: Evan Patrick Atherton, David Thomasson, Maurice Ugo Conti, Heather Kerrick, Nicholas Cote, Hui Li
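As an illustration of the slicing approach summarized in the abstract for patent 11953879 above, the following is a minimal sketch (not code from the patent): agents constrained to a cylindrical surface repel nearby agents while drifting upward, and the agents' positions at each time step define one slice. The surface, agent count, and update rules are illustrative assumptions only.

```python
import numpy as np

# Minimal sketch: agents on a cylindrical surface (radius 1) repel their
# neighbours while drifting upward; each time step yields one "slice" of
# surface points where material could be deposited. Parameters are illustrative.
rng = np.random.default_rng(0)
num_agents, steps = 64, 50
theta = rng.uniform(0.0, 2.0 * np.pi, num_agents)   # angular position on the surface
z = np.zeros(num_agents)                             # height along the build direction

slices = []
for step in range(steps):
    # Pairwise angular separation (wrapped to [-pi, pi]) drives a simple repulsion.
    dtheta = theta[:, None] - theta[None, :]
    dtheta = (dtheta + np.pi) % (2.0 * np.pi) - np.pi
    near = (np.abs(dtheta) < 0.3) & ~np.eye(num_agents, dtype=bool)
    repulsion = np.where(near, np.sign(dtheta) * 0.01, 0.0).sum(axis=1)

    theta = (theta + repulsion) % (2.0 * np.pi)      # avoid nearby agents
    z += 0.02                                        # traverse the surface upward

    # A slice: the set of surface locations the agents currently occupy.
    pts = np.column_stack([np.cos(theta), np.sin(theta), z])
    slices.append(pts)

print(f"generated {len(slices)} slices; last slice height ~ {slices[-1][:, 2].mean():.2f}")
```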
  • Publication number: 20240070949
    Abstract: In various embodiments, a computer animation application automatically solves inverse kinematic problems when generating object animations. The computer animation application determines a target vector based on a target value for a joint parameter associated with a joint chain and at least one of a target position or a target orientation for an end-effector associated with the joint chain. The computer animation application executes a trained machine learning model on the target vector to generate a predicted vector that includes data associated with multiple joint parameters associated with the joint chain.
    Type: Application
    Filed: August 24, 2022
    Publication date: February 29, 2024
    Inventors: Evan Patrick Atherton, Dieu Linh Tran
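Publication 20240070949 above describes assembling a target vector from a joint-parameter target plus an end-effector position/orientation target and running a trained model on it to predict joint parameters. Below is a minimal sketch of that inference step, assuming a small PyTorch MLP as a stand-in for the trained model; the network, dimensions, and names are illustrative, not the application's.

```python
import torch
from torch import nn

# Illustrative stand-in for the trained model described in the abstract.
# Input = target vector (end-effector position + orientation quaternion + one
# targeted joint parameter); output = predicted vector of joint parameters.
NUM_JOINTS = 6
target_dim = 3 + 4 + 1          # position (xyz) + orientation (quaternion) + joint target

model = nn.Sequential(
    nn.Linear(target_dim, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, NUM_JOINTS),
)
# In practice the weights would come from training (see the sketch after the next entry).

target_position = torch.tensor([0.4, 0.1, 0.6])
target_orientation = torch.tensor([0.0, 0.0, 0.0, 1.0])   # identity quaternion
target_joint_value = torch.tensor([0.25])                  # desired value for one joint

target_vector = torch.cat([target_position, target_orientation, target_joint_value])
with torch.no_grad():
    predicted_joint_vector = model(target_vector)          # joint parameters for the chain
print(predicted_joint_vector)
```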
  • Publication number: 20240070517
    Abstract: In various embodiments, an inverse kinematic (IK) modeling application generates models that are used to solve IK problems for object animations. The IK modeling application generates configuration vectors based on a set of joint parameters associated with a joint chain. The IK modeling application executes forward kinematic operation(s) on the joint chain based on the configuration vectors to generate target vectors. Each target vector includes data associated with at least one of a position or an orientation for an end-effector associated with the joint chain. The IK modeling application performs one or more machine learning (ML) operations on an ML model based on the configuration vectors and the target vectors to generate a trained ML model that computes a predicted joint vector associated with the joint chain based on at least one of a target position or a target orientation for the end-effector.
    Type: Application
    Filed: August 24, 2022
    Publication date: February 29, 2024
    Inventors: Evan Patrick Atherton, Dieu Linh Tran
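Publication 20240070517 above describes generating configuration vectors, running forward kinematics on them to obtain target vectors, and fitting an ML model that maps targets back to joint configurations. Below is a minimal sketch of that pipeline for a planar three-joint chain; the chain, loss, and training loop are assumptions made for illustration, not the application's implementation.

```python
import torch
from torch import nn

def forward_kinematics(angles: torch.Tensor, link_len: float = 1.0) -> torch.Tensor:
    """End-effector (x, y) of a planar chain given joint angles of shape (N, J)."""
    cumulative = torch.cumsum(angles, dim=1)
    x = link_len * torch.cos(cumulative).sum(dim=1)
    y = link_len * torch.sin(cumulative).sum(dim=1)
    return torch.stack([x, y], dim=1)

# 1) Sample configuration vectors and run forward kinematics to get target vectors.
torch.manual_seed(0)
configs = (torch.rand(5000, 3) - 0.5) * torch.pi     # joint angles in [-pi/2, pi/2]
targets = forward_kinematics(configs)                # end-effector positions

# 2) Fit a model that predicts a joint vector from a target position.
model = nn.Sequential(nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, 64), nn.ReLU(), nn.Linear(64, 3))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
for epoch in range(200):
    predicted = model(targets)
    # Train on the FK error of the prediction so any valid IK solution is acceptable.
    loss = ((forward_kinematics(predicted) - targets) ** 2).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

print(f"final FK reconstruction loss: {loss.item():.4f}")
```

Training on the forward-kinematic error of the prediction, rather than on the sampled configurations directly, is one common way to handle the fact that many joint configurations can reach the same target; it is shown here only as a plausible choice, not as the application's method.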
  • Patent number: 11679506
    Abstract: One embodiment of the present invention sets forth a technique for generating simulated training data for a physical process. The technique includes receiving, as input to at least one machine learning model, a first simulated image of a first object, wherein the at least one machine learning model includes mappings between simulated images generated from models of physical objects and real-world images of the physical objects. The technique also includes performing, by the at least one machine learning model, one or more operations on the first simulated image to generate a first augmented image of the first object. The technique further includes transmitting the first augmented image to a training pipeline for an additional machine learning model that controls a behavior of the physical process.
    Type: Grant
    Filed: March 10, 2022
    Date of Patent: June 20, 2023
    Assignee: AUTODESK, INC.
    Inventors: Hui Li, Evan Patrick Atherton, Erin Bradner, Nicholas Cote, Heather Kerrick
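Patent 11679506 above describes a model that maps simulated renderings toward the appearance of real-world images and passes the augmented images to a downstream training pipeline. The sketch below shows the shape of that flow with a stand-in generator network; the model, image sizes, and pipeline hook are hypothetical, and a real system would load learned weights for an image-to-image translation model.

```python
import torch
from torch import nn

# Stand-in for the trained simulated-to-real mapping described in the abstract.
generator = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 3, kernel_size=3, padding=1), nn.Sigmoid(),
)

def augment(simulated_image: torch.Tensor) -> torch.Tensor:
    """Map a simulated image (1, 3, H, W) to a more realistic-looking augmented image."""
    with torch.no_grad():
        return generator(simulated_image)

# Hypothetical hook into the training pipeline for the downstream control model.
training_batch = []
simulated_image = torch.rand(1, 3, 128, 128)     # rendering of an object model
augmented_image = augment(simulated_image)
training_batch.append(augmented_image)           # transmitted to the training pipeline
print(augmented_image.shape)
```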
  • Publication number: 20230182302
    Abstract: A computer-implemented method for generating and evaluating robotic workcell solutions includes: determining a plurality of locations within a workcell volume, wherein each location corresponds to a possible workcell solution; for each location included in the plurality of locations, determining a value for a first robot-motion attribute for a first robot based on position information associated with the location and a trajectory associated with a component of the first robot; and, for each location included in the plurality of locations, computing a first value for a first performance metric based on the value for the first robot-motion attribute.
    Type: Application
    Filed: December 10, 2021
    Publication date: June 15, 2023
    Inventors: Evan Patrick Atherton, Ardavan Bidgoli
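Publication 20230182302 above describes sampling candidate locations in a workcell volume, computing a robot-motion attribute for each location from a trajectory, and scoring a performance metric per location. A minimal NumPy sketch of that scoring loop follows; the specific attribute (total base-to-waypoint travel) and metric are stand-ins chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Candidate locations sampled inside a 2 m x 2 m x 1 m workcell volume.
candidate_locations = rng.uniform([0, 0, 0], [2.0, 2.0, 1.0], size=(100, 3))

# Trajectory of a robot component (e.g., tool waypoints), fixed in the workcell frame.
trajectory = np.array([[0.5, 0.5, 0.5], [1.0, 0.8, 0.6], [1.5, 0.5, 0.4], [1.0, 0.2, 0.5]])

def motion_attribute(location: np.ndarray) -> float:
    """Stand-in robot-motion attribute: summed reach distance from the base location."""
    return float(np.linalg.norm(trajectory - location, axis=1).sum())

def performance_metric(attribute_value: float) -> float:
    """Stand-in performance metric: shorter total reach scores higher."""
    return 1.0 / (1.0 + attribute_value)

scores = np.array([performance_metric(motion_attribute(loc)) for loc in candidate_locations])
best = candidate_locations[scores.argmax()]
print(f"best candidate location: {best.round(2)}, score: {scores.max():.3f}")
```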
  • Patent number: 11654565
    Abstract: One embodiment of the present invention sets forth a technique for controlling the execution of a physical process. The technique includes receiving, as input to a machine learning model that is configured to adapt a simulation of the physical process executing in a virtual environment to a physical world, simulated output for controlling how the physical process performs a task in the virtual environment and real-world data collected from the physical process performing the task in the physical world. The technique also includes performing, by the machine learning model, one or more operations on the simulated output and the real-world data to generate augmented output. The technique further includes transmitting the augmented output to the physical process to control how the physical process performs the task in the physical world.
    Type: Grant
    Filed: July 27, 2020
    Date of Patent: May 23, 2023
    Assignee: AUTODESK, INC.
    Inventors: Hui Li, Evan Patrick Atherton, Erin Bradner, Nicholas Cote, Heather Kerrick
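Patent 11654565 above describes a model that takes simulated control output together with real-world data collected from the physical process and produces augmented output that is transmitted back to control the process. The sketch below shows that data flow with a stand-in correction network; the tensor sizes, model, and send_to_process hook are hypothetical.

```python
import torch
from torch import nn

# Stand-in for the trained adaptation model: it consumes the simulator's control
# output plus real-world measurements and emits augmented control output.
NUM_ACTUATORS, NUM_SENSORS = 6, 12
adapter = nn.Sequential(
    nn.Linear(NUM_ACTUATORS + NUM_SENSORS, 64), nn.ReLU(),
    nn.Linear(64, NUM_ACTUATORS),
)

def send_to_process(commands: torch.Tensor) -> None:
    """Hypothetical transport layer to the physical process (e.g., a robot controller)."""
    print("sending commands:", commands.numpy().round(3))

simulated_output = torch.rand(NUM_ACTUATORS)      # commands from the virtual environment
real_world_data = torch.rand(NUM_SENSORS)         # measurements from the physical task

with torch.no_grad():
    augmented_output = adapter(torch.cat([simulated_output, real_world_data]))

send_to_process(augmented_output)                 # controls the task in the physical world
```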
  • Patent number: 11609547
    Abstract: A robot system is configured to identify gestures performed by an end-user proximate to a work piece. The robot system then determines a set of modifications to be made to the work piece based on the gestures. A projector coupled to the robot system projects images onto the work piece that represent the modifications to be made and/or a CAD model of the work piece. The robot system then performs the modifications.
    Type: Grant
    Filed: December 19, 2016
    Date of Patent: March 21, 2023
    Assignee: AUTODESK, INC.
    Inventors: Evan Patrick Atherton, David Thomasson, Maurice Ugo Conti, Heather Kerrick
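Patent 11609547 above describes recognizing an end-user's gestures near a work piece, mapping them to modifications, projecting the planned modifications (or a CAD model) onto the work piece, and then executing them. The sketch below outlines that loop with hypothetical gesture labels and stand-in projector and robot interfaces; none of these names come from the patent.

```python
from dataclasses import dataclass

@dataclass
class Modification:
    operation: str          # e.g., "drill", "engrave"
    location: tuple         # (x, y) on the work piece, in millimetres

# Hypothetical mapping from recognized gestures to work-piece modifications.
GESTURE_TO_OPERATION = {"point": "drill", "swipe": "engrave"}

def plan_modifications(gestures):
    """Turn a list of (gesture_label, location) observations into modifications."""
    return [Modification(GESTURE_TO_OPERATION[g], loc)
            for g, loc in gestures if g in GESTURE_TO_OPERATION]

def project_overlay(modifications):
    """Stand-in for the projector: render planned modifications onto the work piece."""
    for m in modifications:
        print(f"projecting {m.operation} marker at {m.location}")

def execute(modifications):
    """Stand-in for the robot performing the modifications."""
    for m in modifications:
        print(f"robot performs {m.operation} at {m.location}")

observed = [("point", (120.0, 45.0)), ("swipe", (60.0, 200.0))]
planned = plan_modifications(observed)
project_overlay(planned)   # end-user reviews the projected plan on the work piece
execute(planned)
```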
  • Patent number: 11556108
    Abstract: A robot is configured to assist an end-user with creative tasks. While the end-user modifies the work piece, the robot observes the modifications made by the end-user and determines one or more objectives that the end-user may endeavor to accomplish. The robot then determines a set of actions to perform that assist the end-user with accomplishing the objectives.
    Type: Grant
    Filed: May 4, 2020
    Date of Patent: January 17, 2023
    Assignee: AUTODESK, INC.
    Inventors: Evan Patrick Atherton, David Thomasson, Maurice Ugo Conti, Heather Kerrick
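Patent 11556108 above describes a robot that observes an end-user's modifications to a work piece, infers the objective the user is pursuing, and chooses assisting actions. The sketch below reduces that to a toy heuristic that scores observed strokes against candidate objectives; the objectives, scoring rules, and actions are purely illustrative and not the patent's method.

```python
import numpy as np

# Observed end-user modifications: stroke endpoints on the work piece (x, y).
strokes = np.array([[[0.0, 0.0], [1.0, 0.0]],
                    [[0.0, 0.2], [1.0, 0.2]],
                    [[0.0, 0.4], [1.0, 0.4]]])

def score_hatching(strokes):
    """Objective: parallel hatching. Score = how parallel the strokes are."""
    directions = strokes[:, 1] - strokes[:, 0]
    directions /= np.linalg.norm(directions, axis=1, keepdims=True)
    return float(np.abs(directions @ directions[0]).mean())

def score_outline(strokes):
    """Objective: closed outline. Score = how closely stroke ends meet the next stroke."""
    gaps = np.linalg.norm(strokes[:-1, 1] - strokes[1:, 0], axis=1)
    return float(1.0 / (1.0 + gaps.mean()))

objectives = {"hatching": score_hatching, "outline": score_outline}
inferred = max(objectives, key=lambda name: objectives[name](strokes))

# Choose an assisting action for the inferred objective (hypothetical actions).
actions = {"hatching": "add evenly spaced parallel strokes",
           "outline": "close remaining gaps in the outline"}
print(f"inferred objective: {inferred}; assisting action: {actions[inferred]}")
```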
  • Publication number: 20220193912
    Abstract: One embodiment of the present invention sets forth a technique for generating simulated training data for a physical process. The technique includes receiving, as input to at least one machine learning model, a first simulated image of a first object, wherein the at least one machine learning model includes mappings between simulated images generated from models of physical objects and real-world images of the physical objects. The technique also includes performing, by the at least one machine learning model, one or more operations on the first simulated image to generate a first augmented image of the first object. The technique further includes transmitting the first augmented image to a training pipeline for an additional machine learning model that controls a behavior of the physical process.
    Type: Application
    Filed: March 10, 2022
    Publication date: June 23, 2022
    Inventors: Hui Li, Evan Patrick Atherton, Erin Bradner, Nicholas Cote, Heather Kerrick
  • Patent number: 11273553
    Abstract: One embodiment of the present invention sets forth a technique for generating simulated training data for a physical process. The technique includes receiving, as input to at least one machine learning model, a first simulated image of a first object, wherein the at least one machine learning model includes mappings between simulated images generated from models of physical objects and real-world images of the physical objects. The technique also includes performing, by the at least one machine learning model, one or more operations on the first simulated image to generate a first augmented image of the first object. The technique further includes transmitting the first augmented image to a training pipeline for an additional machine learning model that controls a behavior of the physical process.
    Type: Grant
    Filed: May 31, 2018
    Date of Patent: March 15, 2022
    Assignee: AUTODESK, INC.
    Inventors: Hui Li, Evan Patrick Atherton, Erin Bradner, Nicholas Cote, Heather Kerrick
  • Patent number: 10956739
    Abstract: A technique for displaying a representative path associated with a robotic device. The technique includes detecting at least one reference point within a first image of a workspace, generating the representative path based on path instructions associated with the robotic device and the at least one reference point, and displaying the representative path within the workspace.
    Type: Grant
    Filed: June 27, 2016
    Date of Patent: March 23, 2021
    Assignee: AUTODESK, INC.
    Inventors: David Thomasson, Evan Patrick Atherton, Maurice Ugo Conti, Heather Kerrick
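Patent 10956739 above describes detecting a reference point in an image of the workspace, generating a representative path from the robotic device's path instructions relative to that reference point, and displaying the path in the workspace. The sketch below shows only the geometric core, using a pinhole projection in NumPy; the camera parameters, reference point, and waypoints are made-up values, and the detection and display steps are omitted.

```python
import numpy as np

# Path instructions for the robotic device: waypoints relative to a detected
# reference point in the workspace (metres). Values are illustrative.
path_instructions = np.array([[0.0, 0.0, 0.0], [0.2, 0.0, 0.0], [0.2, 0.1, 0.0], [0.0, 0.1, 0.0]])
reference_point_world = np.array([0.5, 0.0, 1.5])          # detected in the first image

# Simple pinhole camera, assumed calibrated and aligned with the workspace frame.
fx = fy = 800.0
cx, cy = 320.0, 240.0

def project(points_world: np.ndarray) -> np.ndarray:
    """Project 3-D workspace points to pixel coordinates for the display overlay."""
    x, y, z = points_world[:, 0], points_world[:, 1], points_world[:, 2]
    return np.column_stack([fx * x / z + cx, fy * y / z + cy])

# Representative path = path instructions anchored at the reference point,
# projected into the image used for display within the workspace.
representative_path_world = path_instructions + reference_point_world
representative_path_pixels = project(representative_path_world)
print(representative_path_pixels.round(1))
```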
  • Publication number: 20210073445
    Abstract: A robotic assembly cell is configured to generate a physical mesh of physical polygons based on a simulated mesh of simulated triangles. A control application configured to operate the assembly cell selects a simulated polygon in the simulated mesh and then causes a positioning robot in the cell to obtain a physical polygon that is similar to the simulated polygon. The positioning robot positions the polygon on the physical mesh, and a welding robot in the cell then welds the polygon to the mesh. The control application captures data that reflects how the physical polygon is actually positioned on the physical mesh, and then updates the simulated mesh to be geometrically consistent with the physical mesh. In doing so, the control application may execute a multi-objective solver to generate an updated simulated mesh that meets specific design criteria.
    Type: Application
    Filed: November 24, 2020
    Publication date: March 11, 2021
    Inventors: Evan Patrick Atherton, David Thomasson, Maurice Ugo Conti, Heather Kerrick, Nicholas Cote
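Publication 20210073445 above describes placing physical polygons according to a simulated mesh, capturing where each polygon actually ended up, and updating the simulated mesh so it stays geometrically consistent with the physical one. The sketch below captures only that update loop for two triangles; the simulated capture noise and the direct vertex update are stand-ins for the cell's sensing and multi-objective solve.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated mesh: triangles as (3, 3) arrays of vertex coordinates (metres).
simulated_mesh = [np.array([[0.0, 0.0, 0.0], [0.3, 0.0, 0.0], [0.0, 0.3, 0.0]]),
                  np.array([[0.3, 0.0, 0.0], [0.3, 0.3, 0.0], [0.0, 0.3, 0.0]])]

def capture_physical_placement(triangle: np.ndarray) -> np.ndarray:
    """Stand-in for capturing how the welded polygon actually sits in the cell."""
    return triangle + rng.normal(scale=0.002, size=triangle.shape)   # placement error

updated_mesh = []
for simulated_triangle in simulated_mesh:
    # Positioning robot places the polygon; welding robot fixes it in place.
    physical_triangle = capture_physical_placement(simulated_triangle)
    # Fold the measured placement back in so the simulated mesh stays
    # geometrically consistent with the physical mesh.
    updated_mesh.append(physical_triangle)

drift = max(np.abs(u - s).max() for u, s in zip(updated_mesh, simulated_mesh))
print(f"max deviation folded back into the simulated mesh: {drift * 1000:.2f} mm")
```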
  • Publication number: 20200401105
    Abstract: An agent engine allocates a collection of agents to scan the surface of an object model. Each agent operates autonomously and implements particular behaviors based on the actions of nearby agents. Accordingly, the collection of agents exhibits swarm-like behavior. Over a sequence of time steps, the agents traverse the surface of the object model. Each agent acts to avoid other agents, thereby maintaining a relatively consistent distribution of agents across the surface of the object model over all time steps. At a given time step, the agent engine generates a slice through the object model that intersects each agent in a group of agents. The slice associated with a given time step represents a set of locations where material should be deposited to fabricate a 3D object. Based on a set of such slices, a robot engine causes a robot to fabricate the 3D object.
    Type: Application
    Filed: September 8, 2020
    Publication date: December 24, 2020
    Inventors: Evan Patrick Atherton, David Thomasson, Maurice Ugo Conti, Heather Kerrick, Nicholas Cote, Hui Li
  • Patent number: 10853539
    Abstract: A robotic assembly cell is configured to generate a physical mesh of physical polygons based on a simulated mesh of simulated triangles. A control application configured to operate the assembly cell selects a simulated polygon in the simulated mesh and then causes a positioning robot in the cell to obtain a physical polygon that is similar to the simulated polygon. The positioning robot positions the polygon on the physical mesh, and a welding robot in the cell then welds the polygon to the mesh. The control application captures data that reflects how the physical polygon is actually positioned on the physical mesh, and then updates the simulated mesh to be geometrically consistent with the physical mesh. In doing so, the control application may execute a multi-objective solver to generate an updated simulated mesh that meets specific design criteria.
    Type: Grant
    Filed: May 26, 2017
    Date of Patent: December 1, 2020
    Assignee: AUTODESK, INC.
    Inventors: Evan Patrick Atherton, David Thomasson, Maurice Ugo Conti, Heather Kerrick, Nicholas Cote
  • Publication number: 20200353621
    Abstract: One embodiment of the present invention sets forth a technique for controlling the execution of a physical process. The technique includes receiving, as input to a machine learning model that is configured to adapt a simulation of the physical process executing in a virtual environment to a physical world, simulated output for controlling how the physical process performs a task in the virtual environment and real-world data collected from the physical process performing the task in the physical world. The technique also includes performing, by the machine learning model, one or more operations on the simulated output and the real-world data to generate augmented output. The technique further includes transmitting the augmented output to the physical process to control how the physical process performs the task in the physical world.
    Type: Application
    Filed: July 27, 2020
    Publication date: November 12, 2020
    Inventors: Hui Li, Evan Patrick Atherton, Erin Bradner, Nicholas Cote, Heather Kerrick
  • Patent number: 10768606
    Abstract: An agent engine allocates a collection of agents to scan the surface of an object model. Each agent operates autonomously and implements particular behaviors based on the actions of nearby agents. Accordingly, the collection of agents exhibits swarm-like behavior. Over a sequence of time steps, the agents traverse the surface of the object model. Each agent acts to avoid other agents, thereby maintaining a relatively consistent distribution of agents across the surface of the object model over all time steps. At a given time step, the agent engine generates a slice through the object model that intersects each agent in a group of agents. The slice associated with a given time step represents a set of locations where material should be deposited to fabricate a 3D object. Based on a set of such slices, a robot engine causes a robot to fabricate the 3D object.
    Type: Grant
    Filed: June 2, 2017
    Date of Patent: September 8, 2020
    Assignee: AUTODESK, INC.
    Inventors: Evan Patrick Atherton, David Thomasson, Maurice Ugo Conti, Heather Kerrick, Nicholas Cote, Hui Li
  • Patent number: 10751879
    Abstract: One embodiment of the present invention sets forth a technique for controlling the execution of a physical process. The technique includes receiving, as input to a machine learning model that is configured to adapt a simulation of the physical process executing in a virtual environment to a physical world, simulated output for controlling how the physical process performs a task in the virtual environment and real-world data collected from the physical process performing the task in the physical world. The technique also includes performing, by the machine learning model, one or more operations on the simulated output and the real-world data to generate augmented output. The technique further includes transmitting the augmented output to the physical process to control how the physical process performs the task in the physical world.
    Type: Grant
    Filed: May 31, 2018
    Date of Patent: August 25, 2020
    Assignee: AUTODESK, INC.
    Inventors: Hui Li, Evan Patrick Atherton, Erin Bradner, Nicholas Cote, Heather Kerrick
  • Publication number: 20200264583
    Abstract: A robot is configured to assist an end-user with creative tasks. While the end-user modifies the work piece, the robot observes the modifications made by the end-user and determines one or more objectives that the end-user may endeavor to accomplish. The robot then determines a set of actions to perform that assist the end-user with accomplishing the objectives.
    Type: Application
    Filed: May 4, 2020
    Publication date: August 20, 2020
    Inventors: Evan Patrick Atherton, David Thomasson, Maurice Ugo Conti, Heather Kerrick
  • Publication number: 20200147794
    Abstract: An assembly engine is configured to generate, based on a computer-aided design (CAD) assembly, a set of motion commands that causes the robot to manufacture a physical assembly corresponding to the CAD assembly. The assembly engine analyzes the CAD assembly to determine an assembly sequence for various physical components to be included in the physical assembly. The assembly sequence indicates the order in which each physical component should be incorporated into the physical assembly and how those physical components should be physically coupled together. The assembly engine further analyzes the CAD assembly to determine different component paths that each physical component should follow when being incorporated into the physical assembly. Based on the assembly sequence and the component paths, the assembly engine generates a set of motion commands that the robot executes to assemble the physical components into the physical assembly.
    Type: Application
    Filed: October 29, 2019
    Publication date: May 14, 2020
    Inventors: Heather Kerrick, Erin Bradner, Hui Li, Evan Patrick Atherton, Nicholas Cote
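Publication 20200147794 above describes deriving an assembly sequence and per-component paths from a CAD assembly and emitting motion commands for the robot to execute. The sketch below shows a toy version: a topological sort over component dependencies (which parts must already be in place) followed by command generation. The CAD structure, waypoints, and command format are hypothetical.

```python
from graphlib import TopologicalSorter

# Hypothetical CAD assembly: each component lists the components it must be
# mounted onto (its dependencies), plus a simple approach path of waypoints (mm).
cad_assembly = {
    "base":    {"depends_on": [],                "path": [(0, 0, 300), (0, 0, 0)]},
    "bracket": {"depends_on": ["base"],          "path": [(50, 0, 300), (50, 0, 20)]},
    "motor":   {"depends_on": ["bracket"],       "path": [(50, 40, 300), (50, 40, 25)]},
    "cover":   {"depends_on": ["base", "motor"], "path": [(0, 0, 350), (0, 0, 60)]},
}

# Assembly sequence: an order in which every component's dependencies come first.
sequence = list(TopologicalSorter(
    {name: spec["depends_on"] for name, spec in cad_assembly.items()}).static_order())

# Motion commands for the robot: follow each component's path, then fasten it.
motion_commands = []
for component in sequence:
    for waypoint in cad_assembly[component]["path"]:
        motion_commands.append(("move_to", component, waypoint))
    motion_commands.append(("fasten", component))

for command in motion_commands:
    print(command)
```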
  • Patent number: 10642244
    Abstract: A robot is configured to assist an end-user with creative tasks. While the end-user modifies the work piece, the robot observes the modifications made by the end-user and determines one or more objectives that the end-user may endeavor to accomplish. The robot then determines a set of actions to perform that assist the end-user with accomplishing the objectives.
    Type: Grant
    Filed: December 19, 2016
    Date of Patent: May 5, 2020
    Assignee: AUTODESK, INC.
    Inventors: Evan Patrick Atherton, David Thomasson, Maurice Ugo Conti, Heather Kerrick