AUTOMATED VISUAL PIPETTING

A system and method are described for Automated Visual Pipetting on a machine such as a three dimensional fabrication device (e.g., a 3D Printer) or another computer numerical control (CNC) machine tool, for improved speed, accuracy and reliability in pipetting procedures. A camera is mounted on a pipette or on another type of tool that may be a deposition or non-deposition tool (e.g., milling). A camera feed and recognition software can enable users to replace existing micro-pipetting techniques with a computerized process that may be controlled with a few simple mouse clicks, while the user directly visualizes a live experimental setup. Embodiments also allow for integration with common molecular biology procedure “kits,” and enable the process to be automated and visualized without requiring constant interaction by the user.

Description
REFERENCE TO RELATED APPLICATIONS

This application claims the benefit of U.S. Provisional Application Ser. No. 61/633,433, filed Feb. 10, 2012 and incorporated by reference herein; U.S. Provisional Application Ser. No. 61/741,368, filed Jul. 18, 2012 and incorporated by reference herein; and U.S. Provisional Application Ser. No. 61/689,963, filed Jun. 18, 2012 and incorporated by reference herein.

Co-pending patent application no. 13/761,272 entitled “MULTI-AXIS, MULTI-PURPOSE ROBOTICS AUTOMATION AND QUALITY ADAPTIVE ADDITIVE MANUFACTURING” filed on Feb. 7, 2013 having named inventor Adam Perry Tow is hereby incorporated by reference in its entirety and for all purposes.

BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention relates to automated visual pipetting (AVP) using a camera and automation procedures using techniques available in three dimensional fabrication systems.

2. Background

There have been many developments in additive manufacturing in recent years, and three dimensional fabrication or “printing” systems have become an increasingly practical means of manufacturing organic and inorganic materials from a digital model. For clarity, three dimensional fabricators may be referred to as an additive manufacturing device or 3D Printer. A description of many such fabrication systems and recent developments in the art can be found in U.S. Pat. No. 7,625,198 to Lipson et al. and the patents and publications referenced therein.

With the proliferation of molecular biology research in recent years, the efficiency of researchers has proven to be an important factor in achieving successful results from research and development endeavors. With manual tasks such as micro-pipetting becoming a limiting factor on progress, a system to quicken and automate routine molecular biology procedures would have significant impacts on modern biology laboratories. A pipette (also called a pipet, pipettor, or chemical dropper) is a laboratory tool used to transport a measured volume of liquid. Pipettes are commonly used in molecular biology, analytical chemistry, and medical tests. Pipettes come in several designs for various purposes with differing levels of accuracy and precision, from single piece glass pipettes to more complex adjustable or electronic pipettes. Many pipette types work by creating a partial vacuum above the liquid-holding chamber and selectively releasing this vacuum to draw up and dispense liquid.

Though many experiments depend on cell geometry, to date, current activities in molecular biology may be hindered by the inability to standardize cell culture across experiments, often relying on inaccurate and subjective measures of cell culture growth and confluence. This has severe consequences because a basic tenet of the scientific method is replication (i.e., repeating an experiment in order to duplicate the results, thus further validating the underlying hypothesis), something which current techniques do not allow in precise terms.

In addition, and often in an effort to achieve more precise and reliable results, current pipetting activities may need to be performed by individuals with sufficient training and developed skills in pipetting procedures, which precludes laboratory assistants or students with less experience from performing experiments that require precision or reliability.

Due to the inherent complexities of performing experiments with identical samples and the shortcomings in currently known techniques, existing pipetting procedures may fail to achieve optimal levels of efficiency, reliability and accuracy. In particular, it would be desirable to have pipetting systems and techniques that increase the speed of conducting such procedures, minimize reliance on the individual skills of personnel involved in the experiments, and reduce the risk of generating flawed, unreliable or imprecise results.

SUMMARY OF THE INVENTION

The shortcomings of the prior art can be overcome and additional advantages can be provided with the Automated Visual Pipetting (“AVP”) systems and techniques described herein. The present invention can improve the accuracy, ease of use and efficiency of pipetting procedures in order to achieve a drastic improvement in performance and quality.

Embodiments of the present invention involving Automated Visual Pipetting are conceived with a few goals in mind, most notably: (1) to be visually consistent with the logic of experiments performed by hand; (2) to be simple enough that a first-day undergraduate research associate could operate such an embodiment and run experiments within minutes of engaging it; and (3) to be capable of being added to machines that may have non-pipetting uses, for example three dimensional fabricators (i.e., additive manufacturing devices) or other computer numerical control (CNC) machine tools such as mills.

In one Automated Visual Pipetting embodiment of the present invention, a camera is mounted on a pipette or another type of tool that may be a deposition or non-deposition tool (e.g., milling). A camera feed and recognition software can enable users to replace existing micro-pipetting techniques with a computerized process that may be controlled with a few simple mouse clicks, while the user can directly visualize a live experimental setup. The embodiment also allows for integration with common molecular biology procedure “kits,” and enables the process to be automated and visualized without requiring constant interaction by the user.

Such embodiments of the present invention may be uniquely capable of pipetting and automating cell culture using a single machine and essentially the same software. Unlike existing pipetting techniques and systems that suffer the problems described above, embodiments of the present invention as disclosed herein can allow for cells to be deposited in exact (or very near exact) geometric patterns, providing several new opportunities such as optimizing repeatable cell line-specific deposition patterns for achieving desired confluence in a specific time frame, and experimenting using novel geometric cell arrangements which could potentially include platforms for studying single neuron synapses or creating entire “manufactured” organ systems.

Embodiments of the present invention may involve a camera and pipetting tool that, for example, is guided by a control unit receiving instructions from a fabrication and/or pipetting command unit (which may be a computer) running either a locally-stored or server-based fabrication and/or pipetting software application. The computer may be integrated into a fabrication and/or pipetting device, or connected to the fabrication and/or pipetting device via a wireless connection such as Bluetooth, WLAN, NFC or other wireless communication technologies, or a wired connection such as Ethernet, USB, FireWire, serial or parallel connection, or other wired communication technologies.
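By way of illustration only, the following Python sketch shows one way a pipetting command unit might drive a control unit over a wired serial connection. A G-code-style command set is assumed here (as is common on 3D printer firmware); the port name, baud rate, and commands shown are hypothetical and not prescribed by this disclosure.

```python
# Minimal sketch of a pipetting command unit driving a control unit over a
# serial link. A G-code-style firmware is assumed; the port, baud rate, and
# command set are illustrative only.
import serial  # pyserial

class PipettingCommandUnit:
    def __init__(self, port="/dev/ttyUSB0", baud=115200):
        self.link = serial.Serial(port, baud, timeout=5)

    def send(self, line):
        """Send one command and wait for the control unit's acknowledgment."""
        self.link.write((line + "\n").encode("ascii"))
        return self.link.readline().decode("ascii").strip()  # e.g., "ok"

    def move_to(self, x, y, z, feed=3000):
        """Position the pipetting tool head above a point on the work surface."""
        return self.send(f"G1 X{x:.2f} Y{y:.2f} Z{z:.2f} F{feed}")

unit = PipettingCommandUnit()    # assumes a control unit on /dev/ttyUSB0
unit.move_to(120.0, 45.5, 10.0)  # move above a container on the tray
```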

Some of the features provided by the system of the present disclosure are described as follows:

A three dimensional pipetting device, having a control unit for receiving instructions from a pipetting command unit and operating a pipetting tool head, and a plurality of interchangeable pipetting tips that can be affixed to the pipetting tool head, such that the control unit can operate the pipetting tool head to selectively use one of the plurality of interchangeable pipetting tips, and selectively draw up or dispense liquid from or to one or more containers on a work surface.

A three dimensional pipetting device, having a control unit for receiving instructions from a pipetting command unit and operating a pipetting tool head, and a camera for recording data including the position of one or more containers on a work surface, such that the control unit can operate the pipetting tool head to selectively draw up or dispense liquid from or to the one or more containers on the work surface. Additionally, the three dimensional pipetting device may also be a three dimensional fabricating system. Additionally, the pipetting command unit is configured to receive and process data recorded by the camera to determine the location and size of at least one container on the work surface. Additionally, the pipetting command unit is configured to receive and process data recorded by the camera to generate a digital image of a plurality of items on the work surface, which may optionally be output to an external monitor to display a virtual arrangement of the items that is different from the physical arrangement of the items on the work surface. Additionally, the pipetting command unit is configured to receive data recorded by the camera and visually simulate the performance of a procedure that can be performed by the pipetting tool head. Additionally, the pipetting command unit is configured to receive and process a visual indicator recorded by the camera to identify an item on the work surface. Additionally, the pipetting command unit is configured to receive and process a visual indicator recorded by the camera as an instruction to use the pipetting tool head and selectively draw up liquid from at least one container on the work surface. Additionally, the pipetting command unit is configured to receive and process a visual indicator recorded by the camera as an instruction to use the pipetting tool head and selectively deposit liquid into at least one container on the work surface.

A method for using a three dimensional pipetting device, including the steps of transmitting instructions for operating a camera from a pipetting command unit to a control unit, operating the camera with the control unit to record image data of one or more containers on a work surface, transmitting recorded image data from the control unit to the pipetting command unit, transmitting instructions for operating a pipetting tool head from a pipetting command unit to a control unit; and operating the pipetting tool head to selectively draw up liquid from the one or more containers on the work surface. Additionally, the pipetting command unit processes recorded image data and determines the location of at least one container on the work surface. Additionally, the pipetting command unit processes recorded image data and generates a digital image of a plurality of items on the work surface, and the digital image may include a virtual arrangement of the items that is different from the physical arrangement of the items on the work surface. Additionally, the pipetting command unit processes recorded image data received from the control unit and simulates performance of a procedure that can be performed by the pipetting tool head. Additionally, the pipetting command unit processes recorded image data and detects a visual indicator (that may be a QR code) in at least one of the recorded images that identifies an item on the work surface. Additionally, the pipetting command unit processes recorded image data, detects a visual indicator (that may be a QR code) in at least one of the recorded images as an instruction to use the pipetting tool head, and selectively draws up or deposits liquid from or into at least one container on the work surface in response to said instruction. Additionally, the pipetting command unit processes recorded image data received from the control unit and detects whether a prescribed amount of liquid has been aspirated from the one or more containers on the work surface by the pipetting tool head.
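By way of illustration only, the method steps above might be orchestrated as in the following Python sketch. Every function and attribute name is a placeholder standing in for camera, image-processing, and tool-head operations; none of these names come from this disclosure.

```python
# Hypothetical orchestration of the described method steps; every function
# name below is a placeholder, not an interface from this disclosure.
def run_pipetting_method(command_unit, control_unit):
    control_unit.operate_camera()                  # record image data
    images = control_unit.fetch_recorded_images()  # control unit -> command unit
    for container in command_unit.locate_containers(images):
        plan = command_unit.plan_aspiration(container)
        control_unit.run_tool_head(plan)           # draw up liquid
```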

One great benefit of the present invention is that many embodiments of Automated Visual Pipetting can be combined with, and implemented in, three dimensional fabricators (3D printers). However, although many embodiments of the present invention described herein relate to a joint three dimensional fabricator and Automated Visual Pipetting device, it should be readily apparent to the reader that stand-alone Automated Visual Pipetting devices are within the scope of the present invention, and that a three dimensional fabricator is not required to implement the present invention. The present invention has many embodiments, some of which are described herein, and others which should be apparent to the reader or inferred from what is taught herein.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a perspective view of a three dimensional fabricator.

FIG. 2 shows an embodiment of the invention in which a camera is affixed to the deposition tool head.

FIG. 3 shows an embodiment of the invention in which a visual code is used by the on-board camera to guide motion and interchange tips, or identify items on the machine's build-tray.

FIG. 4 is a perspective view of a three dimensional pipetting device in accordance with an embodiment of the invention.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT(S)

In order to provide some background regarding three dimensional fabricating systems and illustrate common components in such devices that may be used in connection with the present invention, FIG. 1 provides a perspective view of a prior art three dimensional fabricating system. Fabrication system 100 includes fabricator 101 with material deposition tool head 102 (also referred to herein as deposition tool or deposition head), control unit 103 having one or more actuators and sensors configured to control operating characteristics of material deposition tool 102, and build tray (i.e., build surface) 104. Fabrication command unit 105 may be coupled to fabricator 101 as a component physically inside fabricator 101, or it may be coupled as an external device (e.g., computer) via a wired or wireless connection.

Fabrication command unit 105 includes processor 106, memory 107, and fabrication software application 108 that can be stored in memory 107 and executed by processor 106. It should be appreciated that control unit 103 of fabricator 101 may be configured to receive instructions from fabrication command unit 105 such that fabricator 101 can fabricate an output product on build surface 104 from materials dispensed by material deposition tool 102. Fabrication software application 108 can generate tool path information for fabricator 101 and delineate how material can be used to generate shapes. Complex CAD programs may also be used to generate the intended geometry.

Embodiments of the present invention may be implemented in any three dimensional fabricating system (i.e., additive manufacturing device or 3D Printer), for example, as illustrated in FIG. 1 and described above, that is suitable for performing Automated Visual Pipetting techniques. Other exemplary three dimensional fabricating systems, or components thereof, that may be suitable for Automated Visual Pipetting are described in U.S. Pub. No. 2012/0241993 entitled “SYSTEMS AND METHODS FOR FREEFORM FABRICATION OF FOAMED STRUCTURES” and published on Sep. 27, 2012 (filed as U.S. application Ser. No. 13/356,194 on Jan. 23, 2012) and in U.S. Pat. No. 7,625,198 to Lipson et al.

By way of further explanation, embodiments of the present invention may use Fabrication system 100 to perform Automated Visual Pipetting. Material deposition tool 102 may include a mounted pipetting tool and camera, or be replaced with a pipetting tool and camera. Alternatively, material deposition tool 102 may include a mounted pipetting tool or be replaced with a pipetting tool, and one or more cameras may be mounted on other surfaces of fabricator 101. Control unit 103, having one or more actuators and sensors configured to control operating characteristics of material deposition tool 102, may similarly be configured to control operating characteristics of a pipetting tool and one or more cameras. Experiments to undergo pipetting procedures may be placed onto build tray (i.e., build surface) 104.

Fabrication command unit 105 may be configured to support manual and automated use of a pipetting tool and one or more cameras. Likewise, fabrication software application 108 may include or be replaced with Automated Visual Pipetting software to receive pipetting and/or camera requests from a user, generate pipetting tool and/or camera path information, and direct fabricator 101 and perform manual and automated pipetting procedures, as well as operate the one or more cameras.

Automated Visual Pipetting software included with, or replacing, fabrication software application 108 can be stored in memory 107 and executed by processor 106. It should be appreciated that control unit 103 of fabricator 101 may be configured to receive instructions from fabrication command unit 105 such that fabricator 101 can conduct pipetting procedures on experiments placed upon build surface 104 from a pipetting tool mounted on, or replacing, deposition tool 102. Therefore, in various embodiments of the present invention, fabrication system 100 may be referred to as AVP system 100, fabrication command unit 105 may be referred to as AVP command unit 105, and fabricator 101 may be referred to as a three dimensional pipetting device, as it supports three dimensional movement of the pipetting tool above build surface 104.

Embodiments of the present invention may be implemented in three dimensional fabricating systems (i.e., additive manufacturing devices or 3D Printers), similar to the example illustrated in FIG. 1 and described above. U.S. Pub. No. 2012/0241993 entitled “SYSTEMS AND METHODS FOR FREEFORM FABRICATION OF FOAMED STRUCTURES” and published on Sep. 27, 2012 (filed as U.S. application Ser. No. 13/356,194 on Jan. 23, 2012) provides a similar depiction of FIG. 1 with an accompanying disclosure. As noted above, additional information regarding many components of three dimensional fabricating systems may be found in U.S. Pat. No. 7,625,198 to Lipson et al.

As shown in an embodiment of the present invention depicted in FIG. 2, tool head 201 can accept a plurality of disposable (or reusable) pipette tips (or needles) 202 and function as a pipette or micropipette would in the hand of a biologist. Camera unit 203 can send a live video or image feed from lens 204 to a computer (e.g., fabrication command unit 105 of FIG. 1). The live video or image feed can be used by the computer to map out an experimental setup placed on a build surface (e.g., build tray 104 of FIG. 1) which is accessible to tool head 201. Tool head 201 may be mounted to, or replace, material deposition tool 102 of FIG. 1.

By using the camera, Automated Visual Pipetting software (e.g., included with, or replacing, fabrication software application 108) can assess where items are in a setup by visualizing the items themselves, the particular containers being used, or other visual indicators, such as a particular shape, a QR code, a bar code, one or more numbers and/or letters, a specific packaging design or logo, etc. Camera 203 may be used by AVP system 100 to automatically guide tool head 201, or to assess distance, for example by measuring the pixels between two points of known separation and thereby calculating the distance to those points from how large they appear to the camera (on a pixel basis). Camera 203 may be able to visualize multiple resolutions through the use of digital or optical zoom. This could be used to feed information about experimental results into the software in real time. Camera 203 could also be calibrated to detect different colored circumferences on various types of containers, to easily identify components of a commonly used kit of experimental materials.

Additionally, by using the camera, Automated Visual Pipetting software can assess what pipetting operations to conduct with respect to particular items detected by the camera (e.g., drawing or depositing liquids) by visualizing instructions on, e.g., a container or build tray, such as by recognizing the particular containers being used, or by other visual indicators, such as a particular shape, a QR code, a bar code, one or more numbers and/or letters, a specific packaging design or logo, etc. The visual indicator may directly identify the procedure to be performed, which, with the use of a kit, may require the user to identify a substrate on which to perform the procedure. Automated Visual Pipetting software may read instructions directly from the items, or suggest (and execute) possible procedures based on the identity of items on the tray and their corresponding potential uses in a series of known procedures.
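By way of illustration only, the pixel-based distance estimate described above can be expressed with the pinhole-camera relation, as in the following sketch. The focal length (in pixels) would come from a one-time calibration; the value shown is purely illustrative.

```python
# Pinhole-camera distance estimate from the apparent (pixel) size of a
# feature of known physical size. The focal length in pixels would come
# from calibration; 1400.0 is an illustrative value.
def distance_mm(known_size_mm, apparent_size_px, focal_length_px=1400.0):
    """distance = physical size * focal length / apparent size."""
    return known_size_mm * focal_length_px / apparent_size_px

# Example: a 10 mm target ring spanning 140 pixels in the image
print(distance_mm(10.0, 140.0))  # -> 100.0 (mm from the lens)
```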

In certain embodiments, a user viewing a live video or image feed from camera 203 via the Automated Visual Pipetting software may guide tool head 201 with, e.g., computer commands, a mouse, or a joystick. However, the Automated Visual Pipetting software may be designed to limit the actions of a user in order to prevent errors or unintended consequences resulting from user mistakes. For example, the software may prevent the user from initiating a pipetting action when tool head 201 is not aligned properly over the target to be pipetted. As another example, the software may prevent the user from initiating a pipetting action when tool head 201 is aligned over a sample that should not be pipetted (or has already been pipetted).
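By way of illustration only, such a software interlock might be sketched as follows; the alignment tolerance and function names are assumptions.

```python
# Software interlock: refuse a pipetting action unless the tool head is
# within tolerance of the target and the target has not been processed.
# The 0.5 mm tolerance is an assumed value.
ALIGN_TOLERANCE_MM = 0.5

def can_pipette(head_xy, target_xy, already_pipetted):
    dx, dy = head_xy[0] - target_xy[0], head_xy[1] - target_xy[1]
    aligned = (dx * dx + dy * dy) ** 0.5 <= ALIGN_TOLERANCE_MM
    return aligned and not already_pipetted

assert can_pipette((10.0, 5.0), (10.2, 5.1), already_pipetted=False)
assert not can_pipette((10.0, 5.0), (14.0, 5.0), already_pipetted=False)
```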

Embodiments of the present invention may cover the entire spectrum of AVP functionality, with embodiments on one end of the spectrum completely automating the described procedures, and embodiments at the other end allowing these procedures to be manually directed by a user of the computer running the Automated Visual Pipetting software. Different implementations involving partial automation and partial user interaction will be readily apparent to one of ordinary skill in the art and constitute embodiments of the present invention as well. A single device may be configurable such that it can implement a number of different embodiments, each one involving different levels of automation and user interaction.

FIG. 3 illustrates additional aspects of an embodiment of the present invention. Atop build table 301 (e.g., build tray 104 of FIG. 1), a plurality of kits and accessories can be placed, such as standard pipette tip box 305. Tip box 305 has several holes 306 from which pipette tips 309 are removed by tool head 201 of FIG. 2 (as illustrated at the top of FIG. 3). A camera (e.g., camera 203 of FIG. 2) can identify target 308 on box 305. By knowing the actual size of target 308, AVP system 100 can determine the distance of tool head 201 from box 305 by the relative (pixel) size of target 308. Alternatively, AVP system 100 can measure the distance in pixels between target rings 303 and 304 to determine the distance between tool head 201 and box 305 or build table 301. Having target rings 303 and 304 on build table 301 is one possible way to eliminate the need for target 308 on box 305, since the build table then carries a feature of known dimension. Alternatively, QR code 307 or QR code 302 can be used to identify a particular kit (e.g., box 305) or build table 301, respectively.

Just as the distance can be calculated using target 308, it can also be calculated simply by using the relative (pixel) size of a pipette tip's visible, proximal lumen and comparing it to the known size of a pipette tip's visible lumen in a tip box. The same technique can be used to determine the distance from any other object, such as a conical tube, which could be visually matched by the Automated Visual Pipetting software to a likely, known tube, with the distance then measured by the relative pixel measuring system just described. Even if the tube were not directly in line below the camera, the Automated Visual Pipetting software can compensate by noting how the tube's appearance differs from that of a tube positioned directly beneath the camera.

Likewise, a camera could be used to detect the positioning of material inside a vessel (e.g., a conical tube), position the pipette tip to capture that material, and then measure the volume of material aspirated (e.g., by visualizing the extent to which the pipette tip has been filled) to confirm the aspiration was successful. (Such visualization may require the use of a secondary camera or mirror system, in addition to the primary camera, to visualize the engaged pipette tip.) Often, if there is insufficient liquid to aspirate the amount of fluid prescribed by the user of a micropipette system such as the present invention, a jet of aspirated air will cause the liquid to splash inside the pipette tip. Likewise, knowledge of the fluid's properties (e.g., viscosity, surface tension, density, etc.) allows the present system to predict the level to which a properly aspirated amount of fluid should rise. By visually examining the tip, through direct camera visualization or by way of a mirror angled to allow camera visualization, the present invention can discern whether a uniform aspiration was carried out (e.g., whether there was a jet of air caused by a lack of material) and whether the liquid rises to the proper level in the tip (e.g., in cases both with and without an air jet, as might occur with a viscous fluid). The software may use a comparative algorithm between an empty tip and the aspirated state (or expectations thereof) to perform the above functionality.
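By way of illustration only, the comparative empty-tip/aspirated-tip check described above might be sketched as follows. The difference threshold, the cropped region of interest, and the tolerance are all assumed values.

```python
# Comparative aspiration check: difference an image of the empty tip
# against one taken after aspiration, then estimate what fraction of the
# tip is filled. Both inputs are grayscale numpy arrays cropped to the
# tip region; threshold and tolerance are assumptions.
import numpy as np

def fill_level_fraction(empty_tip, aspirated_tip, diff_threshold=30):
    changed = np.abs(aspirated_tip.astype(int) - empty_tip.astype(int)) > diff_threshold
    rows_with_liquid = changed.mean(axis=1) > 0.5  # rows that now look filled
    return rows_with_liquid.mean()                 # fraction of tip height

def aspiration_ok(empty_tip, aspirated_tip, expected_fraction, tol=0.05):
    """True if liquid rose to (about) the level the fluid model predicts."""
    return abs(fill_level_fraction(empty_tip, aspirated_tip) - expected_fraction) <= tol
```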

Laboratory procedures using an embodiment of the present invention may be performed in a variety of ways: by using the live video or image feed, tool head 201 could be controlled by simple mouse clicks (or other input methods) on a connected computing device, which may be connected directly, over a local wireless connection, or via the Internet. These techniques could replace the need for skilled pipetting procedures by providing similar results with as little as a few computer commands (e.g., a few mouse clicks), allowing a live experiment to be run by a user without special training or manual skills. Moreover, these techniques could be initiated with a virtual run-through, and then performed without requiring any user interaction. For example, a user could make a few selections on a computer to initiate an experiment, the computer could visually present a virtual demonstration of that procedure, await user confirmation to begin, and then perform the process without requiring further user interaction. Alternatively, a procedure can be programmed for a particular experimental setup. In this scenario, the experimenter would simply have to place the appropriate components, kits, and accessories on a build table, and AVP system 100 would recognize these items and simply proceed with performing an experiment based on a preprogrammed set of instructions. These instructions need not be relative to the position of items on the build table, but rather to what the items actually are, which is now possible due to the use of a video camera. AVP system 100 could identify the various components, or allow the user to do so, and then proceed with the experiment as a person would, adapting to the “random” positioning of objects by knowing what and where things are, not just where they should be.
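By way of illustration only, keying a preprogrammed protocol to the identities of recognized items (rather than to fixed positions) might look like the following sketch; the item names, protocol steps, and volumes are hypothetical.

```python
# Hypothetical mapping from the set of items recognized on the build table
# to a preprogrammed protocol; item names, steps, and volumes (in
# microliters) are illustrative only.
PROTOCOLS = {
    frozenset({"tip_box", "reagent_x", "12_well_plate"}): [
        ("pick_tip", "tip_box"),
        ("aspirate", "reagent_x", 50.0),
        ("dispense", "12_well_plate", 50.0),
        ("eject_tip", "waste"),
    ],
}

def select_protocol(detected_items):
    """detected_items: identities found by the camera, wherever they sit."""
    return PROTOCOLS.get(frozenset(detected_items))

for step in select_protocol({"tip_box", "reagent_x", "12_well_plate"}) or []:
    print(step)  # each step would be dispatched to the control unit
```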

These embodiments could use interchangeable (optionally disposable and/or sterilizable) pipette (needle) tips, much in the same way that many of the deposition heads described above can switch tips; though the mechanisms may differ slightly to accommodate the mass-produced pipette tips currently available, implementation of such embodiments would be apparent in view of the disclosures herein.

These embodiments can also automate the use of cell cultures by processing cell growth data and depositing cells in geometries which will lead to a particular result, such as a desired cell confluence (in x days), or yield a particular growth pattern, like a row of neurons.
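By way of illustration only, geometry-driven deposition coordinates, such as a row of neurons or a uniform grid aimed at a target confluence, might be generated as follows; the spacings and dish size are illustrative.

```python
# Geometry-driven deposition coordinates: a row of cells (e.g., neurons)
# or a uniform seeding grid for a dish. Spacing and dish size are
# illustrative assumptions, not prescribed values.
def row_pattern(n_cells, start_xy, spacing_mm):
    x0, y0 = start_xy
    return [(x0 + i * spacing_mm, y0) for i in range(n_cells)]

def grid_pattern(dish_size_mm, spacing_mm):
    steps = int(dish_size_mm // spacing_mm)
    return [(i * spacing_mm, j * spacing_mm)
            for i in range(steps) for j in range(steps)]

print(row_pattern(5, (10.0, 10.0), 0.2))  # a short row of deposition points
print(len(grid_pattern(35.0, 5.0)))       # seed points for a 35 mm dish
```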

For example, an embodiment can run as follows. Several items can be placed on a build tray; each will be digitally identified either by reference to a database of known items, by a scanned code, or by object properties (e.g., opening size, position), all of which can use digital mapping via the camera. Each item can then be superimposed with a digital outline shape, showing the computer's recognition of an opening, and either identifying what an object is or allowing the object's identity to be assigned. With the objects known, a pre-programmed protocol can be performed. Alternatively, placing the items on a tray can be a pre-defined indicator of what protocol the computer should run. If no protocol exists, a series of selections on a computer (i.e., with Automated Visual Pipetting software) can be made to indicate the pipetting action to be taken. This can be performed either in real time or be input into a computer, optionally previewed in a virtual run-through, and then performed. AVP system 100 will take appropriate measurements to assure it pipettes the correct volume, as well as use the camera to help identify optimal pipette placement in a container and the amount of fluid in it (recall that the z-axis is virtually a constant).
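By way of illustration only, the identification pass described above might be sketched with the OpenCV library as follows: decode any QR label, detect circular container openings, and superimpose digital outlines. The file name and the Hough transform parameters are assumptions, not values prescribed by this disclosure.

```python
# Identification pass sketched with OpenCV: read one frame, decode any QR
# label, detect circular openings, and draw digital outlines. The file
# name and Hough parameters are assumptions.
import cv2

frame = cv2.imread("build_tray.png")  # hypothetical frame from the camera feed
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

label, _, _ = cv2.QRCodeDetector().detectAndDecode(frame)
if label:
    print("identified kit/tray:", label)  # e.g., look up a known kit

circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1.2, minDist=40,
                           param1=100, param2=30, minRadius=10, maxRadius=120)
if circles is not None:
    for x, y, r in circles[0]:  # superimpose the recognized openings
        cv2.circle(frame, (int(x), int(y)), int(r), (0, 255, 0), 2)

cv2.imwrite("recognized_tray.png", frame)
```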

Embodiments can also involve creating a custom build tray (e.g., using a plastic deposition tool) and then setting up a unique experimental setup in that tray. After use, the tray can be discarded, replaced, cleaned or reused.

Embodiments could take a crowded build (work) tray with many tube trays and other components on it, and then visually separate those components when displaying them on a computer screen (for example, by identifying component edges or small areas of unused space), so that they are easier for the user to visualize and are not crowded in the displayed video feed despite their physical positioning. This can allow more efficient use of build tray space, and potentially allow for a more logical click-through protocol setup. A user could, for example, digitally rearrange the position of components as they appear on the computer screen (versus how they actually sit on the tray), and customize the visualization describing data about those components on the same screen.
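By way of illustration only, the virtual rearrangement described above might crop each detected component from the crowded tray image and retile the crops in a grid for display, as in the following sketch; the tile size and column count are assumptions.

```python
# Virtual rearrangement: crop each detected component out of the crowded
# tray image and retile the crops in a tidy on-screen grid. Tile size and
# column count are assumptions.
import numpy as np

def rearrange(tray_image, boxes, tile=200, cols=3):
    """boxes: list of (x, y, w, h) bounding boxes of detected components."""
    rows = (len(boxes) + cols - 1) // cols
    canvas = np.zeros((rows * tile, cols * tile, 3), dtype=tray_image.dtype)
    for i, (x, y, w, h) in enumerate(boxes):
        crop = tray_image[y:y + h, x:x + w][:tile, :tile]  # clip to tile
        r, c = divmod(i, cols)
        canvas[r * tile:r * tile + crop.shape[0],
               c * tile:c * tile + crop.shape[1]] = crop
    return canvas
```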

Embodiments could also track the contents of each individual component over a series of experiments. So, for example, a 12-well plate labeled with a QR code (or another visual label) could have reagent X added to it on day one, be incubated overnight, and then, upon replacement onto the build (work) tray on day two, the camera would check the plate data against a database containing recorded actions of AVP system 100 on the previous day, such that the user would be aware that reagent X was added to particular wells on the plate the previous day. A series of tubes could have labels in several areas, for example on the cap. In such a case, the user may wish to leave the closed tubes on the tray, allow the camera to identify the codes, and then remove and replace one tube at a time (to reduce error) and open them. Embodiments can also contain additional cameras at various positions (and/or with various angles) on one or more tool heads or on the device frame itself for a variety of advanced uses, including reading identification codes not easily identified from above the build (work) tray.
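By way of illustration only, cross-day content tracking might log every action against a component's visual label in a small database, so that a returning plate can be looked up by its QR code, as sketched below; the schema and the example label are hypothetical.

```python
# Cross-day tracking: log each action against the component's visual
# label so a returning plate can be looked up by its QR code. The schema
# and the "PLATE-0042" label are hypothetical.
import sqlite3

db = sqlite3.connect("avp_history.db")
db.execute("""CREATE TABLE IF NOT EXISTS actions
              (label TEXT, well TEXT, reagent TEXT, day INTEGER)""")

def log_action(label, well, reagent, day):
    db.execute("INSERT INTO actions VALUES (?, ?, ?, ?)",
               (label, well, reagent, day))
    db.commit()

def history_for(label):
    return db.execute("SELECT well, reagent, day FROM actions WHERE label = ?",
                      (label,)).fetchall()

log_action("PLATE-0042", "A1", "reagent X", day=1)
print(history_for("PLATE-0042"))  # surfaced to the user on day two
```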

Embodiments of the present invention could also be used with an integrated label gun, which would add labels to components in an experiment as needed.

By using a camera feed in certain embodiments, automation and manufacturing tasks can be more precisely controlled and audited. For example, a camera can be used to verify the quality of three dimensional items as they are being printed, or to precisely control pick-and-place, pipetting, or deposition tasks.

FIG. 4 illustrates one possible embodiment with a modified version of Fabrication system 100 as shown in FIG. 1 and described herein. In this embodiment, AVP system 400 can perform Automated Visual Pipetting. Pipetting tool and camera 409 (such as the pipetting tool and camera illustrated in FIG. 2 and described herein) is mounted for use on deposition tool head 402. Control unit 403 has one or more actuators and sensors configured to control operating characteristics of pipetting tool and camera 409. Experiments to undergo pipetting procedures may be placed onto work tray (i.e., work surface) 404. For example, container 410 may be a beaker or petri dish and contain an item such as a liquid, substrate or cell culture. Pipette tip box 411 (such as the pipette tip box illustrated in FIG. 3 and described herein) may also be placed on work tray 404 within reach of pipetting tool and camera 409.

AVP command unit 405 may be configured to support manual and automated use of pipetting tool and camera 409. Likewise, AVP software application 408 may receive pipetting and/or camera requests from a user, generate pipetting tool and/or camera path information, and direct control unit 403 to perform manual and automated pipetting procedures, as well as operate one or more cameras.

Automated Visual Pipetting software can be stored in memory 407 and executed by processor 406. It should be appreciated that control unit 403 of AVP system 400 may be configured to receive instructions from AVP command unit 405 such that AVP system 400 can conduct pipetting procedures on experiments placed upon work surface 404 with pipetting tool and camera 409 mounted on deposition tool 402. Therefore, in various embodiments of the present invention, AVP system 400 may be referred to as a three dimensional pipetting device because it supports two-axis movement of the pipetting tool above work surface 404, as well as up and down movement.

AVP command unit 405 may have a wireless or wired connection to external computer screen (i.e., monitor) 413 to direct output of recorded images from pipetting tool and camera 409 to external computer screen 413, such as to display image 415 of container 410 and image 416 of pipette tip box 411. As discussed above, a computer screen can also be used in a variety of ways to take advantage of the benefits offered by embodiments of the present invention.

It will be appreciated by persons of ordinary skill in the art that the present invention is not limited to the exemplary embodiments illustrated and described herein, nor is it limited to the dimensions or specific physical implementations illustrated and described herein. The present invention may have other embodiments that are readily apparent and enabled as a result of the concepts and descriptions provided herein.

Claims

1. A three dimensional pipetting device, comprising:

a control unit for receiving instructions from a pipetting command unit and operating a pipetting tool head; and
a plurality of interchangeable pipetting tips that can be affixed to the pipetting tool head;
wherein the control unit can operate the pipetting tool head to selectively use one of the plurality of interchangeable pipetting tips; and
wherein the control unit can operate the pipetting tool head to selectively draw up liquid from one or more containers on a work surface;
wherein the control unit can operate the pipetting tool head to selectively dispense liquid into the one or more containers on the work surface.

2. A three dimensional pipetting device, comprising:

a control unit for receiving instructions from a pipetting command unit and operating a pipetting tool head; and
a camera for recording data including the position of one or more containers on a work surface;
wherein the control unit can operate the pipetting tool head to selectively draw up liquid from the one or more containers on the work surface; and
wherein the control unit can operate the pipetting tool head to selectively dispense liquid into the one or more containers on the work surface.

3. The three dimensional pipetting device of claim 2, wherein the pipetting command unit is configured to receive and process data recorded by the camera to determine the location and size of at least one container on the work surface.

4. The three dimensional pipetting device of claim 2, wherein the pipetting command unit is configured to receive and process data recorded by the camera to generate a digital image of a plurality of items on the work surface.

5. The three dimensional pipetting device of claim 4, wherein the digital image of the plurality of items on the work surface can be output to an external monitor to display a virtual arrangement of the items that is different from the physical arrangement of the items on the work surface.

6. The three dimensional pipetting device of claim 2, wherein the pipetting command unit is configured to receive data recorded by the camera and visually simulate the performance of a procedure that can be performed by the pipetting tool head.

7. The three dimensional pipetting device of claim 2, wherein the pipetting command unit is configured to receive and process a visual indicator recorded by the camera to identify an item on the work surface.

8. The three dimensional pipetting device of claim 2, wherein the pipetting command unit is configured to receive and process a visual indicator recorded by the camera as an instruction to use the pipetting tool head and selectively draw up liquid from at least one container on the work surface.

9. The three dimensional pipetting device of claim 2, wherein the pipetting command unit is configured to receive and process a visual indicator recorded by the camera as an instruction to use the pipetting tool head and selectively deposit liquid into at least one container on the work surface.

10. The three dimensional pipetting device of claim 2, wherein the three dimensional pipetting device is also a three dimensional fabricating system.

11. A method for using a three dimensional pipetting device, comprising the steps of:

transmitting instructions for operating a camera from a pipetting command unit to a control unit;
operating the camera with the control unit to record image data of one or more containers on a work surface;
transmitting recorded image data from the control unit to the pipetting command unit;
transmitting instructions for operating a pipetting tool head from a pipetting command unit to a control unit; and
operating the pipetting tool head to selectively draw up liquid from the one or more containers on the work surface.

12. The method of claim 11, further comprising the step of the pipetting command unit processing recorded image data received from the control unit and determining the location of at least one container on the work surface.

13. The method of claim 11, further comprising the step of the pipetting command unit processing recorded image data received from the control unit and generating a digital image of a plurality of items on the work surface.

14. The method of claim 13, wherein the digital image includes a virtual arrangement of the items that is different from the physical arrangement of the items on the work surface.

15. The method of claim 11, wherein after said step of transmitting recorded image data from the control unit to the pipetting command unit, and before said step of transmitting instructions for operating a pipetting tool head from a pipetting command unit to a control unit, the pipetting command unit processes recorded image data received from the control unit and simulates performance of a procedure that can be performed by the pipetting tool head.

16. The method of claim 11, further comprising the steps of:

the pipetting command unit processing recorded image data received from the control unit;
detecting a visual indicator in at least one of the recorded images; and
identifying an item on the work surface.

17. The method of claim 16, wherein said visual indicator is a QR code.

18. The method of claim 11, further comprising the steps of:

the pipetting command unit processing recorded image data received from the control unit;
detecting a visual indicator in at least one of the recorded images as an instruction to use the pipetting tool head; and
selectively drawing up liquid from at least one container on the work surface in response to said instruction.

19. The method of claim 11, further comprising the steps of:

the pipetting command unit processing recorded image data received from the control unit;
detecting a visual indicator in at least one of the recorded images as an instruction to use the pipetting tool head; and
selectively depositing liquid into at least one container on the work surface in response to said instruction.

20. The method of claim 11, further comprising the steps of:

the pipetting command unit processing recorded image data received from the control unit; and
detecting whether a prescribed amount of liquid has been aspirated from the one or more containers on the work surface by the pipetting tool head.
Patent History
Publication number: 20130205920
Type: Application
Filed: Feb 10, 2013
Publication Date: Aug 15, 2013
Inventor: Adam Perry Tow (Boca Raton, FL)
Application Number: 13/763,715
Classifications
Current U.S. Class: Automatic Control (73/863.01); Including Tip Attachment Or Removal (422/511)
International Classification: B01L 3/02 (20060101);