Patents by Inventor Jake Sganga
Jake Sganga has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 11958198
Abstract: A lab automation system receives an instruction from a user to perform a protocol within a lab via an interface including a graphical representation of the lab. The lab includes a robot and a set of equipment rendered within the graphical representation of the lab. The lab automation system identifies an ambiguous term of the instruction and pieces of equipment corresponding to the ambiguous term, and modifies the interface to include a predictive text interface element listing the pieces of equipment. Upon a mouseover of a listed piece of equipment within the predictive text interface element, the lab automation system modifies the graphical representation of the lab to highlight the listed piece of equipment corresponding to the mouseover. Upon a selection of the listed piece of equipment within the predictive text interface element, the lab automation system modifies the instruction to include the listed piece of equipment.
Type: Grant
Filed: August 2, 2021
Date of Patent: April 16, 2024
Assignee: Artificial, Inc.
Inventors: Jeff Washington, Geoffrey J. Budd, Nikhita Singh, Jake Sganga, Alexander Li Honda
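The disambiguation flow this abstract describes, matching an ambiguous term to candidate equipment and then splicing the user's selection back into the instruction, can be sketched as below. The equipment names and the substring-matching rule are illustrative assumptions, not details from the patent:

```python
from typing import List

# Hypothetical equipment inventory; names are assumptions for illustration.
LAB_EQUIPMENT = ["centrifuge A", "centrifuge B", "plate reader", "liquid handler"]

def candidate_equipment(ambiguous_term: str, equipment: List[str]) -> List[str]:
    """Return every piece of equipment whose name contains the ambiguous term.

    This stands in for whatever matching the patented system performs; a
    simple case-insensitive substring test is assumed here.
    """
    term = ambiguous_term.lower()
    return [e for e in equipment if term in e.lower()]

def resolve_instruction(instruction: str, ambiguous_term: str, selection: str) -> str:
    """Replace the ambiguous term with the equipment the user selected."""
    return instruction.replace(ambiguous_term, selection)
```

A predictive-text element would show `candidate_equipment(...)` to the user and call `resolve_instruction(...)` on selection, e.g. `resolve_instruction("spin plate in the centrifuge", "centrifuge", "centrifuge A")`.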
-
Patent number: 11919174
Abstract: A lab system identifies a set of steps associated with a protocol for a lab, meant to be performed by a robot within the lab using equipment and reagents. The lab system renders, within a user interface, a virtual representation of the lab, a virtual robot, and virtual equipment and reagents. Responsive to operating in a first mode, the lab system simulates the identified set of steps to identify virtual positions of the virtual robot within the lab as the virtual robot performs the steps, and modifies the virtual representation of the lab to mirror the identified positions of the virtual robot in real-time. Responsive to operating in a second mode, the lab system identifies positions of the robot within the lab as the robot performs the identified set of steps, and modifies the virtual representation of the lab to mirror the identified positions of the robot in real-time.
Type: Grant
Filed: August 2, 2021
Date of Patent: March 5, 2024
Assignee: Artificial, Inc.
Inventors: Jeff Washington, Geoffrey J. Budd, Nikhita Singh, Jake Sganga, Alexander Li Honda
-
Patent number: 11897144
Abstract: A lab system accesses a first protocol for performance by a first robot in a first lab. The first protocol includes a set of steps, each associated with an operation, reagent, and equipment. For each of one or more steps, the lab system modifies the step by: (1) identifying one or more replacement operations that achieve an equivalent or substantially similar result as a performance of the operation, (2) identifying replacement equipment that operates substantially similarly to the equipment, and/or (3) identifying one or more replacement reagents that, when substituted for the reagent, do not substantially affect the performance of the step. The lab system generates a modified protocol by replacing one or more of the set of steps with the modified steps. The lab system selects a second lab including a second robot and configures the second robot to perform the modified protocol in the second lab.
Type: Grant
Filed: August 2, 2021
Date of Patent: February 13, 2024
Assignee: Artificial, Inc.
Inventors: Jeff Washington, Geoffrey J. Budd, Nikhita Singh, Jake Sganga, Alexander Li Honda
-
Publication number: 20220040853
Abstract: A lab system configures robots to perform protocols in labs. The lab system receives, via a user interface, an instruction from a user to perform a protocol within a lab. The instruction may comprise text, and the lab may comprise a robot configured to perform the protocol. The lab system converts, using a machine-learned model, the text into steps and, for each step, identifies one or more of an operation, lab equipment, and reagent associated with the step. In response to detecting an ambiguity or error associated with a step, the lab system notifies the user via the user interface. The lab system may receive one or more indications from the user that resolve the ambiguity or error and update the associated steps. For each step, the lab system configures the robot to perform an identified operation, interact with identified lab equipment, and/or access or use an identified reagent.
Type: Application
Filed: August 2, 2021
Publication date: February 10, 2022
Inventors: Jeff Washington, Geoffrey J. Budd, Nikhita Singh, Jake Sganga, Alexander Li Honda
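The text-to-steps conversion with ambiguity flagging described in this abstract can be sketched as below. The keyword sets and sentence-splitting rule stand in for the machine-learned model and are assumptions for illustration only:

```python
import re
from dataclasses import dataclass, field
from typing import List, Optional

# Toy vocabularies standing in for the machine-learned model; these terms
# are illustrative assumptions, not from the publication.
KNOWN_OPERATIONS = {"aspirate", "dispense", "incubate", "centrifuge"}
KNOWN_REAGENTS = {"buffer", "sample", "ethanol"}

@dataclass
class Step:
    text: str
    operation: Optional[str] = None
    reagent: Optional[str] = None
    errors: List[str] = field(default_factory=list)

def parse_protocol(text: str) -> List[Step]:
    """Split free-text protocol into steps and flag unresolved operations."""
    steps = []
    for sentence in filter(None, (s.strip() for s in re.split(r"[.\n]", text))):
        words = set(sentence.lower().split())
        step = Step(text=sentence)
        step.operation = next(iter(words & KNOWN_OPERATIONS), None)
        step.reagent = next(iter(words & KNOWN_REAGENTS), None)
        if step.operation is None:
            # Surface the ambiguity to the user instead of guessing.
            step.errors.append("ambiguous or unknown operation")
        steps.append(step)
    return steps
```

A user interface built on this would display each step's `errors` and fold the user's corrections back into the step list before configuring the robot.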
-
Publication number: 20220043561
Abstract: A lab automation system receives an instruction from a user to perform a protocol within a lab via an interface including a graphical representation of the lab. The lab includes a robot and a set of equipment rendered within the graphical representation of the lab. The lab automation system identifies an ambiguous term of the instruction and pieces of equipment corresponding to the ambiguous term, and modifies the interface to include a predictive text interface element listing the pieces of equipment. Upon a mouseover of a listed piece of equipment within the predictive text interface element, the lab automation system modifies the graphical representation of the lab to highlight the listed piece of equipment corresponding to the mouseover. Upon a selection of the listed piece of equipment within the predictive text interface element, the lab automation system modifies the instruction to include the listed piece of equipment.
Type: Application
Filed: August 2, 2021
Publication date: February 10, 2022
Inventors: Jeff Washington, Geoffrey J. Budd, Nikhita Singh, Jake Sganga, Alexander Li Honda
-
Publication number: 20220040862
Abstract: A lab system identifies a set of steps associated with a protocol for a lab, meant to be performed by a robot within the lab using equipment and reagents. The lab system renders, within a user interface, a virtual representation of the lab, a virtual robot, and virtual equipment and reagents. Responsive to operating in a first mode, the lab system simulates the identified set of steps to identify virtual positions of the virtual robot within the lab as the virtual robot performs the steps, and modifies the virtual representation of the lab to mirror the identified positions of the virtual robot in real-time. Responsive to operating in a second mode, the lab system identifies positions of the robot within the lab as the robot performs the identified set of steps, and modifies the virtual representation of the lab to mirror the identified positions of the robot in real-time.
Type: Application
Filed: August 2, 2021
Publication date: February 10, 2022
Inventors: Jeff Washington, Geoffrey J. Budd, Nikhita Singh, Jake Sganga, Alexander Li Honda
-
Publication number: 20220040856
Abstract: A lab system accesses a first protocol for performance by a first robot in a first lab. The first protocol includes a set of steps, each associated with an operation, reagent, and equipment. For each of one or more steps, the lab system modifies the step by: (1) identifying one or more replacement operations that achieve an equivalent or substantially similar result as a performance of the operation, (2) identifying replacement equipment that operates substantially similarly to the equipment, and/or (3) identifying one or more replacement reagents that, when substituted for the reagent, do not substantially affect the performance of the step. The lab system generates a modified protocol by replacing one or more of the set of steps with the modified steps. The lab system selects a second lab including a second robot and configures the second robot to perform the modified protocol in the second lab.
Type: Application
Filed: August 2, 2021
Publication date: February 10, 2022
Inventors: Jeff Washington, Geoffrey J. Budd, Nikhita Singh, Jake Sganga, Alexander Li Honda
-
Publication number: 20220040863
Abstract: A lab system calibrates robots and cameras within a lab. The lab system accesses, via a camera within a lab, an image of a robot arm, which comprises a visible tag located on an exterior. The lab system determines a position of the robot arm using position sensors located within the robot arm and determines a location of the camera relative to the robot arm based on the determined position and the location of the tag. The lab system calibrates the camera using the determined location of the camera relative to the robot arm. After calibrating the camera, the lab system accesses, via the camera, a second image of equipment in the lab that comprises a second visible tag on an exterior. The lab system determines, based on a location of the second visible tag within the accessed second image, a location of the equipment relative to the robot arm.
Type: Application
Filed: August 2, 2021
Publication date: February 10, 2022
Inventors: Jeff Washington, Geoffrey J. Budd, Nikhita Singh, Jake Sganga, Alexander Li Honda
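The calibration chain this abstract describes reduces to composing rigid-body transforms: the tag's pose in the robot-base frame (from the arm's position sensors) and the tag's pose in the camera frame (from the image) together fix the camera's pose relative to the robot, after which any tagged equipment can be located in the robot frame. A minimal sketch with homogeneous transforms, under the assumption that tag detection yields full 6-DoF poses:

```python
import numpy as np

def pose(R: np.ndarray, t) -> np.ndarray:
    """Assemble a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def calibrate_camera(T_base_tag: np.ndarray, T_cam_tag: np.ndarray) -> np.ndarray:
    """Camera pose in the robot-base frame: T_base_cam = T_base_tag @ inv(T_cam_tag)."""
    return T_base_tag @ np.linalg.inv(T_cam_tag)

def locate_equipment(T_base_cam: np.ndarray, T_cam_equip_tag: np.ndarray) -> np.ndarray:
    """Equipment-tag pose relative to the robot base via the calibrated camera."""
    return T_base_cam @ T_cam_equip_tag
```

For example, with identity rotations, an arm tag 1 m in front of the base and 2 m ahead of the camera places the camera at (1, 0, -2) in the base frame; a second tag seen 3 m ahead of the camera then lands at (1, 0, 1) relative to the robot.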
-
Publication number: 20200297444
Abstract: Certain aspects relate to systems and techniques for localizing and/or navigating a medical instrument within a luminal network. A medical system can include an elongate body configured to be inserted into the luminal network, as well as an imaging device positioned on a distal portion of the elongate body. The system may include memory and processors configured to receive from the imaging device image data that includes an image captured when the elongate body is within the luminal network. The image can depict one or more branchings of the luminal network. The processor can be configured to access a machine learning model of one or more luminal networks and determine, based on the machine learning model and information regarding the one or more branchings, a location of the distal portion of the elongate body within the luminal network.
Type: Application
Filed: March 20, 2020
Publication date: September 24, 2020
Inventors: David B. Camarillo, David Eng, Jake Sganga
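One way to think about localization from observed branchings is as a belief update over candidate branches of the luminal network. The toy tree, the "branching count" feature, and the Bayesian update below are illustrative assumptions standing in for the machine learning model named in the abstract:

```python
# Toy luminal-network model: each branch maps to the number of child
# openings visible at its end. Branch names and counts are assumptions.
BRANCH_CHILDREN = {"trachea": 2, "left_main": 2, "right_main": 3}

def update_belief(belief: dict, observed_branchings: int, eps: float = 0.1) -> dict:
    """Bayesian update of the branch belief given an observed branching count.

    eps models imperfect detection: a branch inconsistent with the
    observation keeps a small residual likelihood instead of zero.
    """
    posterior = {}
    for branch, prior in belief.items():
        consistent = BRANCH_CHILDREN[branch] == observed_branchings
        posterior[branch] = prior * ((1.0 - eps) if consistent else eps)
    total = sum(posterior.values())
    return {b: p / total for b, p in posterior.items()}
```

Starting from a uniform prior, observing three branch openings would concentrate the belief on the branch whose model predicts three children; a real system would fuse many such observations along the insertion path.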
-
Patent number: 10603124
Abstract: Autonomous closed loop control of a flexible tendon-driven continuum manipulator having a sensor at a distal tip is performed by measuring spatial attributes of a sensor at the distal tip and estimating an orientation of a base of an articulating region of the flexible tendon-driven continuum manipulator from a kinematic model and the spatial attributes of the sensor. The manipulator control in a task space uses the estimated orientation, a desired trajectory in the task space, and the position of the sensor at the distal tip. The sensor at the distal tip may be a magnetic sensor, impedance sensor, or optical sensor.
Type: Grant
Filed: November 8, 2017
Date of Patent: March 31, 2020
Assignee: The Board of Trustees of the Leland Stanford Junior University
Inventors: David B. Camarillo, Jake A. Sganga
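The closed-loop task-space control the abstract describes can be illustrated with a resolved-rate sketch: map the task-space error at the sensed tip through a (pseudo-)inverse Jacobian to tendon commands, and iterate. The 2x2 Jacobian, the gain, and the linearized plant are assumptions for illustration; the patented method additionally estimates the articulating region's base orientation from the kinematic model:

```python
import numpy as np

def control_step(tip_pos, target, jacobian, gain=0.5):
    """One closed-loop update: map the task-space error to tendon displacements."""
    error = np.asarray(target, dtype=float) - np.asarray(tip_pos, dtype=float)
    # Pseudo-inverse handles non-square or ill-conditioned tendon Jacobians.
    return gain * np.linalg.pinv(jacobian) @ error

def simulate(tip_pos, target, jacobian, steps=50):
    """Iterate the loop on a linearized plant model: x <- x + J @ dq."""
    x = np.asarray(tip_pos, dtype=float)
    for _ in range(steps):
        x = x + jacobian @ control_step(x, target, jacobian)
    return x
```

With any stable gain the tip error contracts geometrically toward the desired trajectory point; in practice the Jacobian would be updated online from the distal-tip sensor readings rather than held fixed.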
-
Publication number: 20180125591
Abstract: Autonomous closed loop control of a flexible tendon-driven continuum manipulator having a sensor at a distal tip is performed by measuring spatial attributes of a sensor at the distal tip and estimating an orientation of a base of an articulating region of the flexible tendon-driven continuum manipulator from a kinematic model and the spatial attributes of the sensor. The manipulator control in a task space uses the estimated orientation, a desired trajectory in the task space, and the position of the sensor at the distal tip. The sensor at the distal tip may be a magnetic sensor, impedance sensor, or optical sensor.
Type: Application
Filed: November 8, 2017
Publication date: May 10, 2018
Inventors: David B. Camarillo, Jake A. Sganga