Patents by Inventor Evan Atherton
Evan Atherton has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 11295400
Abstract: One embodiment of the present invention sets forth a technique for performing tasks associated with a construction project. The technique includes: transmitting to a worker, via a mobile computing device worn by the worker, a first instruction related to performing a first task included in a plurality of tasks associated with the construction project; transmitting to a light-emitting device a command to provide a visual indicator to the worker that facilitates performing the first task; determining, based on an input received from the mobile computing device, that the worker has completed the first task; selecting, from a database that tracks the eligibility of each of the plurality of tasks, a second task that the worker is eligible to perform; and transmitting to the worker, via the mobile computing device, a second instruction related to performing the second task.
Type: Grant
Filed: November 22, 2016
Date of Patent: April 5, 2022
Assignee: AUTODESK, INC.
Inventors: Tovi Grossman, George Fitzmaurice, Anderson Nogueira, Nick Beirne, Justin Frank Matejka, Danil Nagy, Steven Li, Benjamin LaFreniere, Heather Kerrick, Thomas White, Fraser Anderson, Evan Atherton, David Thomasson, Arthur Harsuvanakit, Maurice Ugo Conti
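The task-selection step in the abstract above (pick the next task the worker is eligible for from a database that tracks eligibility) can be sketched as a small dependency check. All names here (`Task`, `TaskDatabase`, the prerequisite-based eligibility rule) are hypothetical illustrations, not the patented implementation.

```python
from dataclasses import dataclass, field

@dataclass
class Task:
    task_id: int
    instruction: str
    prerequisites: set = field(default_factory=set)  # ids that must finish first
    completed: bool = False

class TaskDatabase:
    """Tracks which tasks a worker is currently eligible to perform."""

    def __init__(self, tasks):
        self.tasks = {t.task_id: t for t in tasks}

    def mark_completed(self, task_id):
        self.tasks[task_id].completed = True

    def next_eligible(self):
        # A task is eligible when it is not done and all prerequisites are done.
        for t in self.tasks.values():
            if not t.completed and all(
                self.tasks[p].completed for p in t.prerequisites
            ):
                return t
        return None
```

In this sketch, completing a task simply unlocks its dependents; the second instruction sent to the worker would be `next_eligible().instruction`.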
-
Patent number: 11181886
Abstract: A robot system is configured to fabricate three-dimensional (3D) objects using closed-loop, computer vision-based control. The robot system initiates fabrication based on a set of fabrication paths along which material is to be deposited. During deposition of material, the robot system captures video data and processes that data to determine the specific locations where the material is deposited. Based on these locations, the robot system adjusts future deposition locations to compensate for deviations from the fabrication paths. Additionally, because the robot system includes a 6-axis robotic arm, the robot system can deposit material at any location, along any pathway, or across any surface. Accordingly, the robot system is capable of fabricating a 3D object with multiple non-parallel, non-horizontal, and/or non-planar layers.
Type: Grant
Filed: April 24, 2017
Date of Patent: November 23, 2021
Assignee: AUTODESK, INC.
Inventors: Evan Atherton, David Thomasson, Maurice Ugo Conti, Heather Kerrick, Nicholas Cote
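The closed-loop correction described above (compare observed deposition locations against the planned path, then shift future targets to compensate) can be sketched as a simple feedback step. The function name and the proportional-gain correction are assumptions for illustration; the patent does not specify this control law.

```python
import numpy as np

def corrected_targets(planned, observed, gain=0.5):
    """Shift the remaining planned deposition points against the observed drift.

    planned:  full list of (x, y) path points the robot intends to deposit along
    observed: (x, y) points where material actually landed so far (from vision)
    gain:     fraction of the average error to compensate on the next points
    """
    planned = np.asarray(planned, dtype=float)
    observed = np.asarray(observed, dtype=float)
    error = observed - planned[: len(observed)]   # deviation at each deposited point
    mean_error = error.mean(axis=0)               # average drift so far
    remaining = planned[len(observed):]
    # Steer the remaining targets opposite to the drift.
    return remaining - gain * mean_error
```

Each pass through the vision pipeline would re-estimate the drift and re-issue corrected targets, closing the loop.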
-
Patent number: 11179793
Abstract: A control application implements computer vision techniques to cause a positioning robot and a welding robot to perform fabrication operations. The control application causes the positioning robot to place elements of a structure at certain positions based on real-time visual feedback captured by the positioning robot. The control application also causes the welding robot to weld those elements into place based on real-time visual feedback captured by the welding robot. By analyzing the real-time visual feedback captured by both robots, the control application adjusts the positioning and welding operations in real time.
Type: Grant
Filed: September 12, 2017
Date of Patent: November 23, 2021
Assignee: AUTODESK, INC.
Inventors: Evan Atherton, David Thomasson, Heather Kerrick, Hui Li
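The coordination implied by the abstract above (position an element under visual feedback, verify it is within tolerance, and only then weld) can be sketched as a loop. Every class and method here is a hypothetical stand-in for the robots' real interfaces; the fakes exist only to make the sketch runnable.

```python
def place_and_weld(element, positioner, welder, vision, tol=0.5, max_tries=5):
    """Place an element, correct its pose until within tolerance, then weld."""
    for _ in range(max_tries):
        positioner.place(element)
        error = vision.pose_error(element)   # real-time visual feedback
        if error <= tol:
            welder.weld(element)             # element is held within tolerance
            return True
        positioner.nudge(element, error)     # correct the pose and re-check
    return False

class FakePositioner:
    """Stand-in positioner whose pose error shrinks with each nudge."""
    def __init__(self, initial_error):
        self.error = initial_error
    def place(self, element):
        pass
    def nudge(self, element, error):
        self.error *= 0.2                    # each correction removes most error

class FakeVision:
    def __init__(self, positioner):
        self.positioner = positioner
    def pose_error(self, element):
        return self.positioner.error

class FakeWelder:
    def __init__(self):
        self.welded = []
    def weld(self, element):
        self.welded.append(element)
```

The key design point the abstract emphasizes is that both robots feed the same control application, so the welding step never fires until the positioning feedback confirms the element's pose.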
-
Patent number: 11072071
Abstract: A robot system models the behavior of a user when the user occupies an operating zone associated with a robot. The robot system predicts future behaviors of the user, and then determines whether those predicted behaviors interfere with anticipated behaviors of the robot. When such interference may occur, the robot system generates dynamics adjustments that can be implemented by the robot to avoid such interference. The robot system may also generate dynamics adjustments that can be implemented by the user to avoid such interference.
Type: Grant
Filed: September 19, 2017
Date of Patent: July 27, 2021
Assignee: AUTODESK, INC.
Inventors: Evan Atherton, David Thomasson, Heather Kerrick, Hui Li
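The predict-then-adjust idea above can be sketched as: extrapolate the user's motion, compare the predicted positions against the robot's planned trajectory, and slow the robot when the paths converge. The linear extrapolation and the speed-scaling "dynamics adjustment" are illustrative assumptions, not the patented method.

```python
import numpy as np

def predict_user(positions, steps):
    """Linearly extrapolate the user's recent (x, y) positions `steps` ahead."""
    positions = np.asarray(positions, dtype=float)
    velocity = positions[-1] - positions[-2]
    return [positions[-1] + velocity * (k + 1) for k in range(steps)]

def dynamics_adjustment(user_history, robot_plan, safe_dist=1.0):
    """Return a speed scale in (0, 1]; 1.0 means no predicted interference."""
    predicted = predict_user(user_history, len(robot_plan))
    min_dist = min(
        np.linalg.norm(u - np.asarray(r, dtype=float))
        for u, r in zip(predicted, robot_plan)
    )
    if min_dist >= safe_dist:
        return 1.0
    return max(0.1, min_dist / safe_dist)   # slow down as the paths converge
```

A symmetric adjustment for the user (e.g., a warning to change course) would use the same predicted-distance signal.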
-
Publication number: 20210208563
Abstract: A robot system is configured to fabricate three-dimensional (3D) objects using closed-loop, computer vision-based control. The robot system initiates fabrication based on a set of fabrication paths along which material is to be deposited. During deposition of material, the robot system captures video data and processes that data to determine the specific locations where the material is deposited. Based on these locations, the robot system adjusts future deposition locations to compensate for deviations from the fabrication paths. Additionally, because the robot system includes a 6-axis robotic arm, the robot system can deposit material at any location, along any pathway, or across any surface. Accordingly, the robot system is capable of fabricating a 3D object with multiple non-parallel, non-horizontal, and/or non-planar layers.
Type: Application
Filed: March 22, 2021
Publication date: July 8, 2021
Inventors: Evan ATHERTON, David THOMASSON, Maurice Ugo CONTI, Heather KERRICK, Nicholas COTE
-
Patent number: 10955814
Abstract: A robot system is configured to fabricate three-dimensional (3D) objects using closed-loop, computer vision-based control. The robot system initiates fabrication based on a set of fabrication paths along which material is to be deposited. During deposition of material, the robot system captures video data and processes that data to determine the specific locations where the material is deposited. Based on these locations, the robot system adjusts future deposition locations to compensate for deviations from the fabrication paths. Additionally, because the robot system includes a 6-axis robotic arm, the robot system can deposit material at any location, along any pathway, or across any surface. Accordingly, the robot system is capable of fabricating a 3D object with multiple non-parallel, non-horizontal, and/or non-planar layers.
Type: Grant
Filed: April 24, 2017
Date of Patent: March 23, 2021
Assignee: AUTODESK, INC.
Inventors: Evan Atherton, David Thomasson, Maurice Ugo Conti, Heather Kerrick, Nicholas Cote
-
Patent number: 10708479
Abstract: One embodiment of the present invention sets forth a technique for determining a location of an object that is being manipulated or processed by a robot. The technique includes capturing a digital image of the object while the object is disposed by the robot within an imaging space, wherein the digital image includes a direct view of the object and a reflected view of the object, detecting a visible feature of the object in the direct view and the visible feature of the object in the reflected view, and computing a first location of the visible feature in a first direction based on a position of the visible feature in the direct view. The technique further includes computing a second location of the visible feature in a second direction based on a position of the visible feature in the reflected view and causing the robot to move the object to a processing station based at least in part on the first location and the second location.
Type: Grant
Filed: July 16, 2019
Date of Patent: July 7, 2020
Assignee: Autodesk, Inc.
Inventors: Evan Atherton, David Thomasson, Heather Kerrick, Maurice Conti
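The core idea above is that one image contains both a direct view and a mirror-reflected view of the object, so a single detected feature yields coordinates along two different axes. A minimal sketch, assuming a placeholder pixel-to-millimeter calibration (the function name and scaling are illustrative, not from the patent):

```python
def locate_feature(direct_px, reflected_px, px_per_mm=10.0):
    """Recover two world-axis coordinates of one feature from a single image.

    direct_px:    (u, v) pixel of the feature in the direct view
    reflected_px: (u, v) pixel of the same feature in the reflected view

    The direct view fixes the feature's location along the first direction;
    the reflected view, seen from a mirrored angle, fixes the second.
    """
    x_mm = direct_px[0] / px_per_mm       # first direction, from direct view
    y_mm = reflected_px[0] / px_per_mm    # second direction, from reflection
    return x_mm, y_mm
```

The two coordinates together are what the technique feeds to the robot when moving the object to the processing station.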
-
Patent number: 10579046
Abstract: A robot system is configured to fabricate three-dimensional (3D) objects using closed-loop, computer vision-based control. The robot system initiates fabrication based on a set of fabrication paths along which material is to be deposited. During deposition of material, the robot system captures video data and processes that data to determine the specific locations where the material is deposited. Based on these locations, the robot system adjusts future deposition locations to compensate for deviations from the fabrication paths. Additionally, because the robot system includes a 6-axis robotic arm, the robot system can deposit material at any location, along any pathway, or across any surface. Accordingly, the robot system is capable of fabricating a 3D object with multiple non-parallel, non-horizontal, and/or non-planar layers.
Type: Grant
Filed: April 24, 2017
Date of Patent: March 3, 2020
Assignee: AUTODESK, INC.
Inventors: Evan Atherton, David Thomasson, Maurice Ugo Conti, Heather Kerrick, Nicholas Cote
-
Publication number: 20190337161
Abstract: One embodiment of the present invention sets forth a technique for determining a location of an object that is being manipulated or processed by a robot. The technique includes capturing a digital image of the object while the object is disposed by the robot within an imaging space, wherein the digital image includes a direct view of the object and a reflected view of the object, detecting a visible feature of the object in the direct view and the visible feature of the object in the reflected view, and computing a first location of the visible feature in a first direction based on a position of the visible feature in the direct view. The technique further includes computing a second location of the visible feature in a second direction based on a position of the visible feature in the reflected view and causing the robot to move the object to a processing station based at least in part on the first location and the second location.
Type: Application
Filed: July 16, 2019
Publication date: November 7, 2019
Inventors: Evan ATHERTON, David THOMASSON, Heather KERRICK, Maurice CONTI
-
Patent number: 10363667
Abstract: One embodiment of the present invention sets forth a technique for determining a location of an object that is being manipulated or processed by a robot. The technique includes capturing a digital image of the object while the object is disposed by the robot within an imaging space, wherein the digital image includes a direct view of the object and a reflected view of the object, detecting a visible feature of the object in the direct view and the visible feature of the object in the reflected view, and computing a first location of the visible feature in a first direction based on a position of the visible feature in the direct view. The technique further includes computing a second location of the visible feature in a second direction based on a position of the visible feature in the reflected view and causing the robot to move the object to a processing station based at least in part on the first location and the second location.
Type: Grant
Filed: November 29, 2016
Date of Patent: July 30, 2019
Assignee: AUTODESK, INC.
Inventors: Evan Atherton, David Thomasson, Heather Kerrick, Maurice Conti
-
Publication number: 20190084158
Abstract: A robot system models the behavior of a user when the user occupies an operating zone associated with a robot. The robot system predicts future behaviors of the user, and then determines whether those predicted behaviors interfere with anticipated behaviors of the robot. When such interference may occur, the robot system generates dynamics adjustments that can be implemented by the robot to avoid such interference. The robot system may also generate dynamics adjustments that can be implemented by the user to avoid such interference.
Type: Application
Filed: September 19, 2017
Publication date: March 21, 2019
Inventors: Evan ATHERTON, David THOMASSON, Heather KERRICK, Hui LI
-
Publication number: 20190076949
Abstract: A control application implements computer vision techniques to cause a positioning robot and a welding robot to perform fabrication operations. The control application causes the positioning robot to place elements of a structure at certain positions based on real-time visual feedback captured by the positioning robot. The control application also causes the welding robot to weld those elements into place based on real-time visual feedback captured by the welding robot. By analyzing the real-time visual feedback captured by both robots, the control application adjusts the positioning and welding operations in real time.
Type: Application
Filed: September 12, 2017
Publication date: March 14, 2019
Inventors: Evan ATHERTON, David THOMASSON, Heather KERRICK, Hui LI
-
Publication number: 20180307206
Abstract: A robot system is configured to fabricate three-dimensional (3D) objects using closed-loop, computer vision-based control. The robot system initiates fabrication based on a set of fabrication paths along which material is to be deposited. During deposition of material, the robot system captures video data and processes that data to determine the specific locations where the material is deposited. Based on these locations, the robot system adjusts future deposition locations to compensate for deviations from the fabrication paths. Additionally, because the robot system includes a 6-axis robotic arm, the robot system can deposit material at any location, along any pathway, or across any surface. Accordingly, the robot system is capable of fabricating a 3D object with multiple non-parallel, non-horizontal, and/or non-planar layers.
Type: Application
Filed: April 24, 2017
Publication date: October 25, 2018
Inventors: Evan ATHERTON, David THOMASSON, Maurice Ugo CONTI, Heather KERRICK, Nicholas COTE
-
Publication number: 20180304550
Abstract: A robot system is configured to fabricate three-dimensional (3D) objects using closed-loop, computer vision-based control. The robot system initiates fabrication based on a set of fabrication paths along which material is to be deposited. During deposition of material, the robot system captures video data and processes that data to determine the specific locations where the material is deposited. Based on these locations, the robot system adjusts future deposition locations to compensate for deviations from the fabrication paths. Additionally, because the robot system includes a 6-axis robotic arm, the robot system can deposit material at any location, along any pathway, or across any surface. Accordingly, the robot system is capable of fabricating a 3D object with multiple non-parallel, non-horizontal, and/or non-planar layers.
Type: Application
Filed: April 24, 2017
Publication date: October 25, 2018
Inventors: Evan ATHERTON, David THOMASSON, Maurice Ugo CONTI, Heather KERRICK, Nicholas COTE
-
Publication number: 20180307207
Abstract: A robot system is configured to fabricate three-dimensional (3D) objects using closed-loop, computer vision-based control. The robot system initiates fabrication based on a set of fabrication paths along which material is to be deposited. During deposition of material, the robot system captures video data and processes that data to determine the specific locations where the material is deposited. Based on these locations, the robot system adjusts future deposition locations to compensate for deviations from the fabrication paths. Additionally, because the robot system includes a 6-axis robotic arm, the robot system can deposit material at any location, along any pathway, or across any surface. Accordingly, the robot system is capable of fabricating a 3D object with multiple non-parallel, non-horizontal, and/or non-planar layers.
Type: Application
Filed: April 24, 2017
Publication date: October 25, 2018
Inventors: Evan ATHERTON, David THOMASSON, Maurice Ugo CONTI, Heather KERRICK, Nicholas COTE
-
Publication number: 20170151676
Abstract: One embodiment of the present invention sets forth a technique for determining a location of an object that is being manipulated or processed by a robot. The technique includes capturing a digital image of the object while the object is disposed by the robot within an imaging space, wherein the digital image includes a direct view of the object and a reflected view of the object, detecting a visible feature of the object in the direct view and the visible feature of the object in the reflected view, and computing a first location of the visible feature in a first direction based on a position of the visible feature in the direct view. The technique further includes computing a second location of the visible feature in a second direction based on a position of the visible feature in the reflected view and causing the robot to move the object to a processing station based at least in part on the first location and the second location.
Type: Application
Filed: November 29, 2016
Publication date: June 1, 2017
Inventors: Evan ATHERTON, David THOMASSON, Heather KERRICK, Maurice CONTI
-
Publication number: 20170148116
Abstract: One embodiment of the present invention sets forth a technique for performing tasks associated with a construction project. The technique includes: transmitting to a worker, via a mobile computing device worn by the worker, a first instruction related to performing a first task included in a plurality of tasks associated with the construction project; transmitting to a light-emitting device a command to provide a visual indicator to the worker that facilitates performing the first task; determining, based on an input received from the mobile computing device, that the worker has completed the first task; selecting, from a database that tracks the eligibility of each of the plurality of tasks, a second task that the worker is eligible to perform; and transmitting to the worker, via the mobile computing device, a second instruction related to performing the second task.
Type: Application
Filed: November 22, 2016
Publication date: May 25, 2017
Inventors: Tovi Grossman, George Fitzmaurice, Anderson Nogueira, Nick Beirne, Justin Frank Matejka, Danil Nagy, Steven Li, Benjamin LaFreniere, Heather Kerrick, Thomas White, Fraser Anderson, Evan Atherton, David Thomasson, Arthur Harsuvanakit, Maurice Ugo Conti