METHODS AND SYSTEMS FOR DIAGNOSING DISEASE, PHYSIOLOGICAL CHANGES, OR OTHER INTERNAL CONDITIONS IN CRUSTACEANS THROUGH NON-INVASIVE MEANS
Methods and systems are disclosed for improvements in aquaculture that allow for increasing the number and harvesting efficiency of crustaceans in an aquaculture setting by identifying and predicting internal conditions and/or physiological conditions of the crustaceans based on external characteristics that are imaged through non-invasive means.
The subject application claims priority to U.S. Provisional Patent Application No. 62/889,313, filed Aug. 20, 2019; U.S. Provisional Application No. 62/956,751, filed Jan. 3, 2020; U.S. Provisional Application No. 62/956,759, filed Jan. 3, 2020; and U.S. Provisional Patent Application No. 62/956,764, filed Jan. 3, 2020. The subject matter of each of these applications is incorporated herein by reference in its entirety.
FIELD OF THE INVENTION

The invention relates to identifying internal conditions in crustaceans.
BACKGROUND

Aquaculture, the farming of fish, crustaceans, mollusks, aquatic plants, algae, and other organisms, is the fastest growing food sector, and will soon provide the majority of crustaceans for human consumption. At the same time, there is rising demand for seafood due to human population growth, increasing disposable income in the developing world (which coincides with an increase in meat-based diets), increasing coastal populations, and general health awareness (which tends to motivate consumers toward crustacean-based protein). Wild crustacean resources are already at their limits, and the world market is increasingly focused on being more sustainable and environmentally responsible, meaning increased harvesting of wild crustaceans is not feasible.
SUMMARY

Given the infeasibility of meeting world demand through harvesting wild crustaceans, aquaculture represents a natural solution. However, increasing the use of aquaculture raises additional concerns such as increased disease, growth and feeding inefficiencies, and waste management. That is, while crustacean farming has increased exponentially in the past decades, the transition from small-scale hatcheries to larger industrial hatcheries and nurseries (necessary to keep up with the rising demand) has introduced both disease and issues with reproduction in crustacean species. For example, increasing the number of crustaceans in a crustacean farm, without mitigating measures, increases the prevalence of disease as the proximity of crustaceans to each other increases. Growth and feed inefficiencies increase as more crustaceans in proximity to each other make tracking crustacean size and distributing feed effectively more difficult, and more crustaceans produce more waste, which increases the environmental impact of a crustacean farm on its immediate area.
For example, most captive conditions for crustaceans (e.g., shrimp species in particular) cause inhibitions in females that prevent them from developing mature ovaries. While techniques such as eyestalk ablation have been developed to stimulate female crustaceans to develop mature ovaries and spawn, this process is labor intensive, invasive, and raises ethical concerns. Likewise, densely populated, monocultural farms allow virus infections to spread rapidly. As the viruses are transmitted through the water itself, they pose a danger to both farmed crustaceans and wild populations. Additionally, bacterial infections such as vibriosis, which may also spread rapidly, have mortality rates approaching 70%. While vibriosis may be detected based on external characteristics (e.g., infected crustaceans may become weak and disoriented, and may have dark wounds on the cuticle), the detection is often too late to save the infected specimen and/or sort out the infected specimen to prevent a spread of the disease to other specimens.
Methods and systems are also disclosed herein for improvements in aquaculture that allow for increasing the number and harvesting efficiency of crustaceans in an aquaculture setting while still mitigating the problems above. For example, the methods and systems may be applied to detecting and predicting physiological changes such as molting, as well as disease infection, stage of development, and/or other internal conditions. For example, the methods and systems may be used to detect and/or predict any function or mechanism (and changes thereto) in the crustacean. For example, the methods and systems are successful both in predicting when a crustacean is likely to molt and in determining whether or not a crustacean has molted based on non-invasive means. Moreover, these non-invasive means rely on exterior images of crustaceans and thus may be scaled for use on entire batches numbering in the thousands of crustaceans.
The methods and systems may also be applied to selective breeding and monitoring of crustaceans, to identify crustaceans with superior traits, experiencing physiological changes (e.g., molting), and/or having a particular internal condition (e.g., disease infection). In conventional systems, crustacean sorting is not possible until crustaceans mature to a grow-out phase and/or reach a particular size. Prior to this, invasive genetic tests can be done to determine gender and identify biomarkers related to certain diseases, but these genetic tests may result in the death of the specimen (e.g., crushing of the fertilized egg) and/or are time-consuming. Moreover, even at the grow-out phase, sorting may be a manual process that is both time and labor intensive.
In contrast to the conventional approaches, methods and systems are described herein for non-invasive procedures for identifying traits in crustaceans. Moreover, based on the methods and systems described herein, certain traits can be determined when the crustacean weighs only 2-4 grams, and/or is still in the egg, and with over 90% accuracy for certain genetic biomarkers. As this procedure is non-invasive and relies on the detection of phenotype characteristics based on external images of crustacean eggs and/or crustacean fry, the viability of the specimen is not threatened. Because the viability of the specimen is not threatened and the genetic traits are identified, the crustaceans may be sorted for a given gender or disease resistance at a size and/or level of maturity unseen in conventional approaches. The efficiencies and number of crustaceans available for farming are thus increased without the drawbacks discussed above.
Key to the advancements discussed above is the detection of certain phenotype characteristics, as discussed below, of the crustacean based on external images. As the number of specimens (e.g., eggs and/or fry) in a single batch may number in the tens of thousands, it is also important to detect the phenotype characteristics quickly and efficiently. To do this, the methods and systems discussed below describe the use of trained artificial neural networks. These artificial neural networks are trained on data sets of crustaceans in different life stages (e.g., fertilized egg, nauplius larval, zoea larval, megalopa larval, post-larval, juvenile, and/or adult), at a particular age (e.g., under 35 days after fertilization, under 90 days after fertilization, and/or under 120 days after fertilization), and/or of a particular size (e.g., under 1 millimeter in length, under 1 centimeter in length, over 10 centimeters in length, etc.), and are based on image sets that include the phenotype characteristics. In some embodiments, the artificial neural networks may be trained on images of crustaceans both before and after a given molting. In some embodiments, the artificial neural networks may be trained on images of crustaceans both before and after the existence of a pathogen.
In one aspect, methods and systems for identifying internal conditions in crustaceans based on external characteristics are described. For example, the system may include receiving an image set of a first crustacean, wherein the image set of the first crustacean includes a phenotype characteristic of the first crustacean. In some embodiments, the crustacean may be a shrimplet, and the image set of the crustacean may include an external first view image of the first shrimplet and an external second view image of the first shrimplet. Additionally or alternatively, the image set of the shrimplet may be generated while the gills of the shrimplet are hydrated or while the shrimplet is sedated in order to reduce stress on the shrimplet. In some embodiments, the crustacean may be in a larval stage, and the image set of the crustacean may include a depth of field of about half of the larva. In some embodiments, the crustacean may be a fertilized egg, and the image set of the crustacean may include a depth of field of about half of the fertilized egg. In some embodiments, the image set may be created using an imaging device that detects electromagnetic radiation with wavelengths between about 400 nanometers and about 1100 nanometers, and/or different genders of crustaceans may be imaged together. For example, by standardizing the collection and preparation of image sets of crustaceans that contain the phenotype characteristics (e.g., by reducing stress, imaging genders together, normalizing specimen size, and/or using consistent image parameters), system bias can be eliminated. For egg and larval images, images may be captured using an ocular micrometer placed in the microscope eyepiece of a compound microscope.
The system may then generate a first pixel array based on the image set of the first crustacean and label the first pixel array with a genotype biomarker for the first crustacean. For example, the detected phenotype characteristics in the image set may be converted into a form that can be quickly and efficiently processed by an artificial neural network. In some embodiments, this may include one or more vectors, arrays, and/or matrices that represent either red, green, and blue (RGB) color or grayscale images. Furthermore, in some embodiments, the system may additionally convert the image set from a set of one or more vectors, arrays, and/or matrices to another set of one or more vectors, arrays, and/or matrices. For example, the system may convert an image set having a red color array, a green color array, and a blue color array to a grayscale color array.
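As one illustrative sketch (not part of the disclosure), the red, green, and blue color arrays may be collapsed into a single grayscale array using standard luminance weights; the BT.601 weights below are one common convention, and the toy image values are hypothetical:

```python
import numpy as np

def rgb_to_grayscale(rgb: np.ndarray) -> np.ndarray:
    """Collapse an H x W x 3 RGB pixel array to a single H x W
    grayscale array using the ITU-R BT.601 luminance weights."""
    weights = np.array([0.299, 0.587, 0.114])
    return rgb[..., :3] @ weights

# A 2 x 2 toy "image": pure red, green, blue, and white pixels.
rgb = np.array([[[255, 0, 0], [0, 255, 0]],
                [[0, 0, 255], [255, 255, 255]]], dtype=float)
gray = rgb_to_grayscale(rgb)
print(gray.shape)  # (2, 2)
```

In practice, each image in the image set would be converted this way before the resulting pixel arrays are labeled and passed to the network.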
The system may then train an artificial neural network to detect the genotype biomarker in crustaceans based on the labeled first pixel array. For example, the system may generate the image set of the first crustacean, and genetically test the first crustacean to determine a genotype biomarker (e.g., sequencing of seven hypervariable regions of the 16S rRNA gene and/or gender) in the first crustacean. Additional discussion of the 16S rRNA gene and its relation to disease resistance, and biomarkers therefor, is available in Cornejo-Granados, F., Lopez-Zavala, A. A., Gallardo-Becerra, L. et al., “Microbiome of Pacific Whiteleg shrimp reveals differential bacterial community composition between Wild, Aquacultured and AHPND/EMS outbreak conditions,” Sci Rep 7, 11783 (2017), which is hereby incorporated by reference in its entirety.
The presence of a particular genotype biomarker is then correlated to one or more phenotype characteristics. For example, the artificial neural network may have classifications for the genotype biomarkers. The artificial neural network is then trained based on a first data set (e.g., including data of the first crustacean and others) to classify a specimen as having a given genotype biomarker when particular phenotype characteristics are present.
The system may then receive an image set of a second crustacean, wherein the image set of the second crustacean includes a phenotype characteristic of the second crustacean. The system may generate a second pixel array based on the image set of the second crustacean, and input the second pixel array into the trained neural network. The system may then receive an output from the trained neural network indicating that the second crustacean has the genotype biomarker. For example, the system may input a second data set (e.g., image sets of crustaceans for which genotype biomarkers are not known) into the trained artificial neural network. The trained artificial neural network may then classify the image sets of crustaceans according to the genotype biomarkers. For example, the genotype biomarker for the first crustacean may be a first classification of the neural network, and the system may generate an output from the neural network indicating the second crustacean has the same genotype biomarker as the first crustacean based on matching the second pixel array to the first classification.
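As a minimal illustration of this train-then-classify flow, the sketch below substitutes a single-layer logistic classifier for the trained artificial neural network and uses randomly generated stand-in data; the feature vectors, biomarker rule, and hyperparameters are all hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training data: flattened pixel arrays labeled with a
# binary genotype biomarker (1 = biomarker present, 0 = absent).
X = rng.normal(size=(64, 16))       # 64 labeled image vectors
y = (X[:, 0] > 0).astype(float)     # toy biomarker tied to one feature

w, b = np.zeros(16), 0.0
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
for _ in range(500):                # plain gradient descent
    p = sigmoid(X @ w + b)
    w -= 0.1 * X.T @ (p - y) / len(y)
    b -= 0.1 * float(np.mean(p - y))

# "Second crustacean": classify a new pixel array against the
# learned biomarker classification.
second = rng.normal(size=16)
has_biomarker = sigmoid(second @ w + b) > 0.5
print(bool(has_biomarker))
```

A production system would replace this single layer with the convolutional architecture discussed below, but the labeling, training, and inference steps follow the same shape.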
Various other aspects, features, and advantages of the invention will be apparent through the detailed description of the invention and the drawings attached hereto. It is also to be understood that both the foregoing general description and the following detailed description are examples and not restrictive of the scope of the invention. As used in the specification and in the claims, the singular forms of “a,” “an,” and “the” include plural referents unless the context clearly dictates otherwise. In addition, as used in the specification and the claims, the term “or” means “and/or” unless the context clearly dictates otherwise.
In the following description, for the purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the embodiments of the invention. It will be appreciated, however, by those having skill in the art that the embodiments of the invention may be practiced without these specific details or with an equivalent arrangement. In other cases, well-known structures and devices are shown in block diagram form in order to avoid unnecessarily obscuring the embodiments of the invention.
The methods and systems described herein may successfully predict and detect traits in crustaceans. Traits may include any internal condition (e.g., genotype, pathogen infection, physiological change) or external condition (e.g., phenotype). For example, the methods and systems discussed herein may be used to predict and detect other phenotypes (e.g., not just genotypes) based on a phenotype. For example, a first phenotype (e.g., color) may be used to predict a second phenotype (e.g., size, weight, etc.). Additionally, the methods and systems may be used to recognize individual specimens based on their phenotypes.
The internal conditions may include a current physiological condition (e.g., a condition occurring normally in the body of the juvenile crustacean) such as a gender of the crustacean (e.g., as determined by the development of sex organs) and/or a stage of development in the crustacean (e.g., the state of molting). The internal conditions may include a predisposition to a future physiological condition such as a growth rate, maturity date, and/or behavioral traits. The internal condition may include a pathological condition (e.g., a condition centered on an abnormality in the body of the crustacean arising in response to a disease) such as whether or not the crustacean is suffering from a given disease and/or is currently infected with a given disease. The internal condition may include a genetic condition (e.g., a condition based on the formation of the genome of the crustacean) such as whether or not the crustacean includes a given genotype. The internal condition may include a presence of a given biomarker (e.g., a measurable substance in an organism whose presence is indicative of a disease, infection, current internal condition, future internal condition, and/or environmental exposure).
In some embodiments, the methods and systems may detect whether or not a crustacean has undergone a physiological change (e.g., molting) and/or predict when the crustacean will undergo a physiological change. Physiological changes may include molting, sexual maturity, and/or other physiological developments. Physiological changes during molting may include altered body shape, changes to the cuticle, and/or behavior changes. This may also include a level of gonadal development, a level of molting, a level of sexual maturation, a level of disease resistance, or a level of robustness.
As shown in diagram 100, molting may have three stages, although additional sub-stages may also be used in some embodiments. During the post-molt, a crustacean recovers from its previous molt and its cuticle begins to harden. While adapting to its new size, the crustacean has little to no activity and/or feeding. Additionally, the crustacean may absorb large amounts of water. Some diseases, such as White Spot Syndrome Virus (WSSV) in shrimp, may emerge during this phase, as well as potential osmotic shock due to the intake of water. It should also be noted that, in some embodiments, along with detecting external characteristics associated with this stage, the system may also monitor other conditions (e.g., salt concentrations in the environment that affect osmoregulation). This information may be recorded and correlated with the specimen as discussed below. During the intermolt phase, the crustacean is stable and the cuticle is functional. The mass growth of the crustacean is continuous, and its feeding activity is at its maximum level. It should be noted that internal conditions of the crustacean (e.g., mass) may be detected by the system even though the external characteristics (e.g., the size of the cuticle) remain the same.
For example, as discussed below, these internal characteristics may be determined based on externally visible traits of the crustacean. These externally visible traits may include phenotype characteristics (e.g., one or more observable characteristics of the crustacean resulting from the interaction of its genotype with the environment). These externally visible traits may include traits corresponding to physiological changes in the crustacean. For example, during molting, externally visible traits related to this physiological change may include altered body shape, increased cuticle hardness, and changes in water intake and behavior (e.g., feeding, mating, etc.). In such cases, the system may compare images of crustaceans before and after these physiological, morphological, and/or behavioral changes to train an artificial neural network used to predict when, and/or detect whether, other crustaceans have undergone, or will begin, the morphological and/or behavioral changes. In some embodiments, a trait may include a weight, presence of an active pathogen, presence of a parasite, or presence of a disease.
Following the intermolt stage, the crustacean will begin the premolt as the new cuticle begins formation and appears. The crustacean will also experience an interregnum between the old and new cuticle. This stage may also coincide with a gradual decrease in feeding and other activities. The system may monitor these changes in order to predict changes as discussed below.
In some embodiments, the system may include receiving an image set of a first crustacean. The image set may include one or more images of the crustacean. If the image set includes multiple images, the multiple images may be captured from different angles (e.g., a top view, side view, bottom view, etc.) and/or may be captured substantially simultaneously. The images in the image set may include separate images (e.g., images stored separately, but linked by a common identifier such as a serial number) or images stored together. An image in an image set may also be a composite image (e.g., an image created by cutting, cropping, rearranging, and/or overlapping two or more images). In some embodiments, the crustacean may be a shrimplet, and the image set of the crustacean may include an external first view image of the first shrimplet and an external second view image of the first shrimplet. Additionally or alternatively, the image set of the shrimplet may be generated while the gills of the shrimplet are hydrated or while the shrimplet is sedated in order to reduce stress on the shrimplet. In some embodiments, the juvenile crustacean may be a fertilized egg, and the image set of the juvenile crustacean may include a depth of field of about half of the fertilized egg and/or a depth of field such that the image captures one or more of the vitelline membrane, chorion, yolk, oil globule, perivitelline space, or embryo.
In some embodiments (e.g., when predicting a date range for a physiological change and/or determining whether or not a physiological change has occurred), image sets may include a series of images taken after predetermined intervals of time and of a predetermined number (e.g., two or more). The intervals of time may vary and may depend on the life stage (or molting stage) of the specimen being imaged (e.g., as discussed below).
For example, in shrimp, the time from fertilization to shrimplet may be twenty-five to thirty-five days. Shrimplet to juvenile may be roughly sixty days. From juvenile to adult may be fifteen days. Adults may require one to three days to mate. The system may peg imaging dates to these periods of reference.
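Assuming the approximate durations above (taking a midpoint where the text gives a range), an imaging schedule pegged to these periods of reference might be computed as follows; the stage names, durations, and dates are illustrative:

```python
from datetime import date, timedelta

# Approximate shrimp life-stage durations from the text; actual
# intervals depend on species and rearing conditions.
STAGE_DAYS = {
    "fertilization_to_shrimplet": 30,  # 25-35 days (midpoint)
    "shrimplet_to_juvenile": 60,       # roughly sixty days
    "juvenile_to_adult": 15,
}

def imaging_schedule(fertilized: date) -> dict:
    """Peg an imaging date to each cumulative stage boundary."""
    schedule, elapsed = {}, 0
    for stage, days in STAGE_DAYS.items():
        elapsed += days
        schedule[stage] = fertilized + timedelta(days=elapsed)
    return schedule

sched = imaging_schedule(date(2020, 1, 3))
print(sched["juvenile_to_adult"])  # 105 days after fertilization
```

A system could then trigger image capture (or shorten the interval between captures) as each boundary approaches.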
In some embodiments, the image set may be created using an imaging device that detects electromagnetic radiation with wavelengths between about 400 nanometers and about 1100 nanometers. In some embodiments, the image set may be created using an imaging device that detects electromagnetic radiation with wavelengths between 400 and 500 nanometers, between 500 and 600 nanometers, between 700 and 900 nanometers, or between 700 and 1100 nanometers.
In some embodiments, strobe lighting may be synced with the imaging device. The strobe light may have a flash energy in the region of 10 to 150 joules, and discharge times as short as a few milliseconds, often resulting in a flash power of several kilowatts. The strobe light source may be a xenon flash lamp, or flashtube, which may have a complex spectrum and a color temperature of approximately 5,600 kelvins. In some embodiments, the system may obtain colored light through the use of colored gels.
The image set may capture an image of a given specimen. The specimen may be a crustacean in any life stage. For example, the crustacean may be an adult crustacean and/or a juvenile crustacean (e.g., a crustacean that has not reached sexual maturity). Juvenile crustaceans may include crustacean shrimplets, eggs, or larvae. It should be noted that while embodiments of this disclosure relate to juvenile crustaceans, these embodiments are also applicable to other specimens. In particular, these specimens may include any type of aquatic life (e.g., organisms that live in aquatic ecosystems) and/or oviparous organisms.
A plurality of imaging devices may be used. However, controlling the imaging devices (e.g., using the same imaging devices and the same settings on the imaging devices) may ensure that the artificial neural network achieves the best results. Alternatively, the imaging devices or settings for the imaging devices may be randomized. Randomizing the imaging devices or settings for the imaging devices may improve the results of the artificial neural network as unintended bias (e.g., from the imaging device or a setting of the imaging device) is not introduced. In some embodiments, the imaging device may be able to capture high-resolution images and may comprise a 21-megapixel camera. The images captured by the imaging device may include 3-6 megapixel images.
In some embodiments, the system may then generate a pixel array based on the image set of the first crustacean. The pixel array may refer to computer data that describes the image (e.g., pixel by pixel). In some embodiments, this may include one or more vectors, arrays, and/or matrices that represent either a red, green, and blue (RGB) color or grayscale image. Furthermore, in some embodiments, the system may additionally convert the image set from a set of one or more vectors, arrays, and/or matrices to another set of one or more vectors, arrays, and/or matrices. For example, the system may convert an image set having a red color array, a green color array, and a blue color array to a grayscale color array.
Each of these devices may also include memory in the form of electronic storage. The electronic storage may include non-transitory storage media that electronically stores information. The electronic storage media of the electronic storages may include one or both of (i) system storage that is provided integrally (e.g., substantially non-removable) with servers or client devices or (ii) removable storage that is removably connectable to the servers or client devices via, for example, a port (e.g., a USB port, a firewire port, etc.) or a drive (e.g., a disk drive, etc.). The electronic storages may include one or more of optically readable storage media (e.g., optical disks, etc.), magnetically readable storage media (e.g., magnetic tape, magnetic hard drive, floppy drive, etc.), electrical charge-based storage media (e.g., EEPROM, RAM, etc.), solid-state storage media (e.g., flash drive, etc.), and/or other electronically readable storage media. The electronic storages may include one or more virtual storage resources (e.g., cloud storage, a virtual private network, and/or other virtual storage resources). The electronic storage may store software algorithms, information determined by the processors, information obtained from servers, information obtained from client devices, or other information that enables the functionality as described herein.
In some embodiments, system 200 may use one or more prediction models to predict internal conditions based on external characteristics. For example, as shown in
As an example, with respect to
Machine learning model 222 may be trained to detect the internal conditions in crustaceans based on a pixel array. For example, client device 202 or 204 may generate the image set of the first crustacean (e.g., via an image capture device), and genetically test the first crustacean to determine a genotype biomarker (e.g., sequencing of seven hypervariable regions of the 16S rRNA gene and gender) in the first crustacean. The presence of a particular genotype biomarker is then correlated to one or more phenotype characteristics. For example, machine learning model 222 may have classifications for the internal conditions (e.g., genotype biomarkers). Machine learning model 222 is then trained based on a first data set (e.g., including data of the first crustacean and others) to classify a specimen as having a given genotype biomarker when particular phenotype characteristics are present.
The system may then receive an image set of a second crustacean, wherein the image set of the second crustacean includes an external characteristic of the second crustacean. Client device 202 or 204 may generate a second pixel array based on the image set of the second crustacean and input the second pixel array into machine learning model 222. The system may then receive an output from machine learning model 222 indicating that the second crustacean has the same internal condition (e.g., genotype biomarker) as the first. For example, the system may input a second data set (e.g., image sets of crustaceans for which genotype biomarkers are not known) into machine learning model 222. Machine learning model 222 may then classify the image sets of crustaceans according to the genotype biomarkers. For example, the genotype biomarker for the first crustacean may be a first classification of machine learning model 222, and the system may generate an output from machine learning model 222 indicating that the second crustacean has the same genotype biomarker as the first crustacean based on matching the second pixel array to the first classification.
In some embodiments, system 200 is further configured to handle, sort, and/or transfer crustaceans (e.g., for vaccination, gender segregation, transfer to sea or breeding area, etc.). In such embodiments, the internal condition may be detected based on external characteristics in real-time (e.g., as the crustaceans are transported along a conveyor belt or otherwise transferred). That is, following the output of an internal condition (e.g., a genotype biomarker as described in
In some embodiments, model 350 may implement an inverted residual structure where the input and output of a residual block (e.g., block 354) are thin bottleneck layers. A residual layer may feed into the next layer and directly into layers that are one or more layers downstream. A bottleneck layer (e.g., block 358) is a layer that contains fewer neural units than the previous layers. Model 350 may use a bottleneck layer to obtain a representation of the input with reduced dimensionality. An example of this is the use of autoencoders with bottleneck layers for nonlinear dimensionality reduction. Additionally, model 350 may remove non-linearities in a narrow layer (e.g., block 358) in order to maintain representational power. In some embodiments, the design of model 350 may also be guided by the metric of computational complexity (e.g., the number of floating point operations). In some embodiments, model 350 may increase the feature map dimension at all units to involve as many locations as possible instead of sharply increasing the feature map dimensions at neural units that perform downsampling. In some embodiments, model 350 may decrease the depth and increase the width of residual layers in the downstream direction.
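A rough numerical sketch of the wide-layer/narrow-bottleneck idea follows; the layer sizes and random weights are arbitrary, and this is an illustration of the structure, not the actual model 350 architecture:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative forward pass: a wide expansion layer feeding a narrow
# linear bottleneck, with a residual (skip) connection around it.
x = rng.normal(size=64)                        # input features
W_expand = rng.normal(size=(64, 256)) * 0.1    # expansion (wide layer)
W_bottleneck = rng.normal(size=(256, 8)) * 0.1 # 256 -> 8 bottleneck
W_project = rng.normal(size=(256, 64)) * 0.1   # projection back down

hidden = np.maximum(0.0, x @ W_expand)   # ReLU kept in the wide layer
bottleneck = hidden @ W_bottleneck       # narrow layer kept linear, to
                                         # preserve representational power
residual = x + hidden @ W_project        # skip connection around the block
print(bottleneck.shape, residual.shape)  # (8,) (64,)
```

The 8-unit bottleneck output is the reduced-dimensionality representation; the residual path lets information bypass the narrow layer.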
In some embodiments, model 300 or model 350 may be a Siamese neural network (e.g., model 390) that uses the same weights while working in tandem on two different input vectors to compute comparable output vectors. For example, in a Siamese artificial neural network, model 300 may include two convolutional neural networks (e.g., two of model 350) that are not two different networks, but are two copies of the same network (e.g., model 350). For example, two input images may pass through model 390 to generate a fixed-length feature vector for each image. If the two input images belong to the same specimen, then their feature vectors will be similar, while if the two input images belong to two different specimens, then their feature vectors will be different. The system may then generate a similarity score using an output sigmoid layer (e.g., layer 370) to detect and predict physiological changes, as well as to diagnose disease, stage of development, and/or condition. Furthermore, as one illustrative example of the algorithm used, the system may rely on a Siamese neural network and/or other neural network that uses the same or similar weights while working on two different input vectors to compute comparable output vectors, typically in tandem. Additionally, the Siamese neural network does not rely on algorithms that require multiple images of each specimen, images from multiple angles, and/or annotation of key features.
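One way such an output similarity score might be computed is sketched below; in a trained Siamese network the distance weights would be learned rather than fixed, and the two-element feature vectors are hypothetical stand-ins for the fixed-length vectors produced by the shared-weight subnetworks:

```python
import numpy as np

def similarity(f1: np.ndarray, f2: np.ndarray) -> float:
    """Sigmoid over the weighted L1 distance between the feature
    vectors from the two shared-weight subnetworks."""
    alpha = np.ones_like(f1)      # learned per-component weights in practice
    d = np.abs(f1 - f2) @ alpha   # weighted L1 distance
    return float(1.0 / (1.0 + np.exp(d)))  # larger distance -> lower score

same = similarity(np.array([0.2, 0.8]), np.array([0.2, 0.8]))
diff = similarity(np.array([0.2, 0.8]), np.array([0.9, 0.1]))
print(same > diff)  # identical feature vectors score higher
```

Thresholding this score yields the same/different decision used for recognizing individual specimens or for before/after comparisons.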
Computer system 400 also includes control circuitry 402. Control circuitry 402 may perform one or more processes (e.g., as described below in relation to
Image 410 is of a shrimp in a post-larval, juvenile stage. For example, after completing the three stages of larval development (e.g., as discussed in
Image 420 is of a shrimp in a juvenile stage during grow-out. For example, when a specimen reaches one gram, the specimen will require additional space and be transferred to a grow-out pond (or other environment) for roughly seven to eight months, until it is harvested (typically at a weight of twenty to thirty grams). During this time, the system may capture image 420.
It should be noted that the image set of a specimen may include one or more angles of the specimen, although only a single view is shown in
It should be noted that multiple views of the specimen may be used. The one or more views may create a standardized series of orthographic two-dimensional images that represent the form of the three-dimensional specimen. For example, six views of the specimen may be used, with each projection plane parallel to one of the coordinate axes of the object. The views may be positioned relative to each other according to either a first-angle projection scheme or a third-angle projection scheme. The views may include a side view, front view, top view, bottom view, and/or end view. The views may also include plan, elevation, and/or section views.
Computer system 500 also includes control circuitry 502. Control circuitry 502 may perform one or more processes (e.g., as described below in relation to
For example, as shown in
In some embodiments, the image set may also include other features used to identify the specimen (e.g., a serial number, order number, and/or batch number), used to determine the scale of the specimen and/or a part of the specimen (e.g., measurement means for height, length, and/or weight), used to provide a reference point for a given phenotype characteristic (e.g., a color palette used to compare the color of the specimen to), and/or used to indicate other information that may be used to classify the specimen (e.g., an indicator of age, maturity level, species, size, etc.). Furthermore, each of images 520, 530, and 540 may be stored together with a profile of a given crustacean. This profile may include information on the crustacean's attributes (e.g., batch, cohort, family, etc.) as well as other information (e.g., a weight) on the crustacean. The system may correlate this information to a time and/or date of the image as well as any imaging conditions (e.g., settings of an imaging device).
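For illustration only, the image-plus-profile bookkeeping described above may be modeled as a small record. The field names and sample values below are hypothetical, not a schema disclosed by this application:

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class SpecimenProfile:
    """Hypothetical profile record pairing an image with the
    specimen attributes and imaging conditions described above."""
    serial_number: str                  # identifies the specimen
    batch: str                          # attribute: batch
    cohort: str                         # attribute: cohort
    weight_grams: float                 # other information (e.g., a weight)
    image_path: str                     # the stored image
    captured_at: datetime               # time/date correlated to the image
    imaging_conditions: dict = field(default_factory=dict)  # device settings

profile = SpecimenProfile(
    serial_number="SN-0042", batch="B7", cohort="C1",
    weight_grams=1.2, image_path="image_520.png",
    captured_at=datetime(2020, 8, 4, 9, 30),
    imaging_conditions={"exposure_ms": 8, "background": "matte"},
)
assert profile.serial_number == "SN-0042"
```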
The system may also capture and analyze images 520, 530, and 540 in a similar manner. Image 520 is an image of a shrimp in the nauplii larval stage, which lasts for under two days after hatching. During this stage, its body may consist of a head, thorax, abdomen, and telson. The specimen may also include a naupliar eye. The nauplii larvae may also include three pairs of appendages: the first and second pairs function as antennae, and the third pair functions as mandibles. The length, position, and use of these appendages (e.g., their use in feeding and propulsion) may be used by the system to classify the specimen.
Image 530 is an image of a shrimp in the zoea larval stage, which lasts for three to five days. During this time, the specimen begins to develop its eyes and extended body length, which may be detected and analyzed by the system. The specimen also begins its behavioral routine of feeding (e.g., on algae). Zoea larvae swim using thoracic appendages (e.g., maxillipeds and pereopods). The zoea larva has two stalked compound eyes (the size of which relative to its body may be measured by the system). The zoea larva also has two maxillipeds. The length and position of these between the rostral and lateral spines may be measured by the system. The anterior-most maxilliped contains the endopodites that are used for feeding, the use of which may be monitored by the system to classify the specimen.
Image 540 is an image of a shrimp in a mysis larval stage, which lasts for three to five days. During this time, the head and thorax have a carapace, and all of the cephalic and thoracic appendages are present. However, the thoracic appendages are alike and biramous with exopodites. In addition to measuring the size, position, and length of these features, the system may monitor the behavior of the specimen to classify the specimen.
At step 602, process 600 receives (e.g., using control circuitry 402 (
In some embodiments, the crustacean may be a shrimplet (as discussed in relation to
At step 604, process 600 generates (e.g., using control circuitry 402 (
At step 606, process 600 labels (e.g., using control circuitry 402 (
At step 608, process 600 trains (e.g., using control circuitry 402 (
At step 610, process 600 receives (e.g., using control circuitry 402 (
At step 612, process 600 generates (e.g., using control circuitry 402 (
At step 614, process 600 inputs (e.g., using control circuitry 402 (
At step 616, process 600 outputs (e.g., using control circuitry 402 (
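For illustration only, the train-then-infer flow of steps 602-616 may be sketched end to end. A single logistic unit stands in for the artificial neural network, and the synthetic arrays stand in for real image sets and genotype-biomarker labels; both are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

def to_pixel_array(image):
    """Steps 604/612: generate a pixel array from an image set."""
    return np.asarray(image, dtype=float).ravel()

# Steps 602/606: received images labeled with a genotype biomarker
# (1 = biomarker present). Synthetic stand-in data, 16 pixels each.
train_images = rng.normal(size=(200, 16))
train_labels = (train_images.sum(axis=1) > 0).astype(float)

# Step 608: train -- batch gradient ascent on a logistic unit,
# a stand-in for training the artificial neural network.
w = np.zeros(16)
for _ in range(500):
    p = 1 / (1 + np.exp(-train_images @ w))
    w += 0.1 * train_images.T @ (train_labels - p) / len(train_labels)

def detect_biomarker(image):
    """Steps 610-616: new image set in, biomarker indication out."""
    p = 1 / (1 + np.exp(-to_pixel_array(image) @ w))
    return p > 0.5

# Step 616: output an indication for a second crustacean's image.
assert detect_biomarker(np.ones(16))
assert not detect_biomarker(-np.ones(16))
```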
In some embodiments, process 600 may further handle, sort, and/or transfer the crustaceans (e.g., for vaccination, gender segregation, transfer to sea or breeding area, etc.) automatically. In such embodiments, the internal condition may be detected based on external characteristics in real-time (e.g., as the crustaceans are transported along a conveyor belt or otherwise transferred). That is, following the output of an internal condition (e.g., a genotype biomarker as described in
It is contemplated that the steps or descriptions of
At step 702, process 700 receives (e.g., using control circuitry of one or more components of system 200 (
At step 704, process 700 generates (e.g., using control circuitry of one or more components of system 200 (
At step 708, process 700 trains (e.g., using control circuitry of one or more components of system 200 (
For example, the artificial neural network may comprise a convolutional neural network, and/or the first date range may correspond to a first classification of the artificial neural network and the third date range may correspond to a second classification of the artificial neural network. The system may then classify an inputted image (e.g., an image of a crustacean) as corresponding to the first and/or second classification. For example, the system may input the second pixel array into the artificial neural network, and the system may determine whether the second pixel array corresponds to the first classification or the second classification.
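For illustration only, the two-class date-range decision described above may be sketched as a linear layer plus softmax standing in for the classification head of the convolutional neural network; the weights, class names, and array sizes are illustrative assumptions:

```python
import numpy as np

# Hypothetical class labels: each classification corresponds to
# a date range for the predicted physiological change.
DATE_RANGES = {0: "first date range", 1: "third date range"}

def classify(pixel_array, weights):
    """Map a pixel array to one of the two date-range classes."""
    logits = weights @ pixel_array            # shape (2,): one logit per class
    exp = np.exp(logits - logits.max())       # numerically stable softmax
    probs = exp / exp.sum()
    return DATE_RANGES[int(np.argmax(probs))], probs

rng = np.random.default_rng(2)
weights = rng.normal(size=(2, 8))             # untrained, for illustration
label, probs = classify(rng.normal(size=8), weights)
assert label in DATE_RANGES.values()
assert abs(probs.sum() - 1.0) < 1e-9          # softmax outputs a distribution
```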
At step 710, process 700 receives (e.g., using control circuitry of one or more components of system 200 (
At step 712, process 700 generates (e.g., using control circuitry of one or more components of system 200 (
At step 714, process 700 inputs (e.g., using control circuitry of one or more components of system 200 (
In some embodiments, the system may further comprise sorting the second crustacean based on the output. For example, the system may control a sorting mechanism that either outputs a signal that causes the mechanism to sort a crustacean and/or label a particular crustacean (e.g., in a profile associated with the crustacean) with the date range.
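For illustration only, the handling of the classifier output described above may be sketched as a small routing function. The signal format, field names, and profile layout are hypothetical:

```python
def route_specimen(output, profile):
    """Hypothetical handler: either signal the sorting mechanism
    and/or label the crustacean's profile with the date range."""
    signals = []
    if output["sort"]:
        # Output a signal that causes the mechanism to sort the crustacean.
        signals.append(("SORT", profile["serial_number"]))
    # Label the particular crustacean's profile with the date range.
    profile["date_range"] = output["date_range"]
    return signals, profile

out = {"sort": True, "date_range": "2020-08-10/2020-08-14"}
prof = {"serial_number": "SN-0042"}
signals, prof = route_specimen(out, prof)
assert signals == [("SORT", "SN-0042")]
assert prof["date_range"] == "2020-08-10/2020-08-14"
```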
It is contemplated that the steps or descriptions of
In some embodiments, the first physiological change may be an absolute cuticle hardness, a predetermined cuticle hardness level, or a change in cuticle hardness. The system may further detect the physiological changes in crustaceans with varying ages, weights, lengths, sexes, cohorts, or other attributes. For example, the system may identify physiological changes in crustaceans up to 2 grams, up to 5 grams, up to 100 grams, up to 200 grams, and/or over 200 grams.
At step 802, process 800 receives (e.g., using control circuitry of one or more components of system 200 (
At step 804, process 800 generates (e.g., using control circuitry of one or more components of system 200 (
For example, the artificial neural network may comprise a convolutional neural network, and/or the first physiological change may correspond to a first classification of the artificial neural network and the second physiological change may correspond to a second classification of the artificial neural network. The system may then classify an inputted image (e.g., an image of a crustacean) as corresponding to the first and/or second classification. For example, the system may input the second pixel array into the artificial neural network, and the system may determine whether the second pixel array corresponds to the first classification or the second classification.
At step 810, process 800 receives (e.g., using control circuitry of one or more components of system 200 (
At step 816, process 800 receives (e.g., using control circuitry of one or more components of system 200 (
In some embodiments, the system may further comprise sorting the second crustacean based on the output. For example, the system may control a sorting mechanism that either outputs a signal that causes the mechanism to sort a crustacean and/or label a particular crustacean (e.g., in a profile associated with the crustacean) with the date range.
It is contemplated that the steps or descriptions of
At step 902, process 900 selects data sets based on a crustacean attribute. For example, the system may be trained to identify crustaceans having a common cohort, age, sex, length, and/or weight. Accordingly, the system may first select a data set of crustaceans having the particular attribute. The particular attribute may be recorded with an identifier (e.g., serial number) associated with each crustacean. In some embodiments, the serial number may be correlated with a label for each crustacean.
At step 904, process 900 prepares sets of images. For example, the system may prepare the data for training the artificial neural network. For example, the system may randomize a first training characteristic or control the first training characteristic. For example, the first training characteristic may be an imaging device used to capture the first training set of images and/or an image background for the first training set of images.
At step 906, process 900 trains an artificial neural network using a first training set of images comprising crustaceans with corresponding labels for a physiological change. For example, the system may train the artificial neural network using a first training set of images comprising 2500 to 3000 crustaceans with corresponding labels for a physiological change, wherein the 2500 to 3000 crustaceans share a first attribute. For example, the system may train the artificial neural network to detect one or more physiological changes, which may include any characteristic that may distinguish one crustacean (or group of crustaceans) from another. Physiological changes may include any quantitative or qualitative description of a characteristic such as early gonadal development, early molting, early-sexual maturation, disease resistance, and/or robustness. Physiological changes may also be described in relation to other crustaceans, a progression in development, and/or a comparison to an average progression of other crustaceans.
In some embodiments, the first training set of images may comprise an image set for each of the 2500 to 3000 crustaceans, and the image set may comprise a series of images of each of the 2500 to 3000 crustaceans at a first time and a second time. The first time and second time may be at predetermined times and may be based on age, weight, and/or size. For example, each of the 2500 to 3000 crustaceans may be juvenile crustaceans or under 200 grams.
In some embodiments, a first training set of images may comprise an image of the 2500 to 3000 crustaceans during a photoperiod treatment. For example, the system may monitor the period of time each day during which the crustaceans receive illumination. The system may then use these periods to determine when to collect the sets of images for each crustacean.
At step 908, process 900 trains the artificial neural network using a second training set of images comprising images of crustaceans without corresponding labels for the physiological change. For example, the system may further train the artificial neural network using a second training set of images comprising 2500 to 20,000 images of un-tagged crustaceans that lack corresponding labels for the physiological change but share the first attribute.
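For illustration only, the two-stage regime of steps 906-908 may be sketched as a supervised pass over the labeled set followed by a pseudo-labeling pass over the unlabeled set. Pseudo-labeling is one common semi-supervised scheme, used here as an assumption since the application does not name a specific method; the logistic unit and synthetic arrays stand in for the network and the image sets:

```python
import numpy as np

rng = np.random.default_rng(3)

def train_logistic(X, y, w=None, epochs=300, lr=0.1):
    """Simple logistic unit standing in for the artificial
    neural network trained in steps 906-908."""
    if w is None:
        w = np.zeros(X.shape[1])
    for _ in range(epochs):
        p = 1 / (1 + np.exp(-X @ w))
        w += lr * X.T @ (y - p) / len(y)
    return w

# Step 906: labeled first training set (synthetic stand-in data;
# label 1 = physiological change present).
X_lab = rng.normal(size=(300, 10))
y_lab = (X_lab[:, 0] > 0).astype(float)
w = train_logistic(X_lab, y_lab)

# Step 908: unlabeled second training set -- assign pseudo-labels
# with the current model, then continue training on them.
X_unlab = rng.normal(size=(500, 10))
pseudo = (1 / (1 + np.exp(-X_unlab @ w)) > 0.5).astype(float)
w = train_logistic(X_unlab, pseudo, w=w)

# Accuracy on the labeled set after both stages.
acc = ((1 / (1 + np.exp(-X_lab @ w)) > 0.5) == y_lab).mean()
assert acc > 0.75
```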
It is contemplated that the steps or descriptions of
Although the present invention has been described in detail for the purpose of illustration based on what is currently considered to be the most practical and preferred embodiments, it is to be understood that such detail is solely for that purpose and that the invention is not limited to the disclosed embodiments, but on the contrary, is intended to cover modifications and equivalent arrangements that are within the scope of the appended claims. For example, it is to be understood that the present invention contemplates that, to the extent possible, one or more features of any embodiment can be combined with one or more features of any other embodiment.
The present techniques will be better understood with reference to the following enumerated embodiments:
1. A method of identifying internal conditions in crustacean based on external characteristics, the method comprising: receiving, using control circuitry, an image set of a first crustacean, wherein the image set of the first crustacean includes a phenotype characteristic of the first crustacean; generating, using the control circuitry, a first pixel array based on the image set of the first crustacean; labeling, using the control circuitry, the first pixel array with a genotype biomarker for the first crustacean; training, using the control circuitry, an artificial neural network to detect the genotype biomarker in the first crustacean based on the labeled first pixel array; receiving, using the control circuitry, an image set of a second crustacean, wherein the image set of the second crustacean includes a phenotype characteristic of the second crustacean; generating, using the control circuitry, a second pixel array based on the image set of the second crustacean; inputting, using the control circuitry, the second pixel array into the trained neural network; and receiving, using the control circuitry, an output from the trained neural network indicating that the second crustacean has the genotype biomarker.
2. The method of embodiment 1, wherein the first crustacean is a first shrimplet and the second crustacean is a second shrimplet, and wherein the image set of the first crustacean includes an external first view image of the first shrimplet and an external second view image of the first shrimplet and the image set of the second crustacean includes an external first view image of the second shrimplet and an external second view image of the second shrimplet.
3. The method of embodiment 2, wherein the image set of the first crustacean is generated while the gills of the first crustacean are hydrated or while the first crustacean is sedated.
4. The method of embodiment 1, wherein the first crustacean is a first fertilized crustacean egg and the second crustacean is a second fertilized crustacean egg, and wherein the image set of the first crustacean includes an image of the first fertilized egg with a depth of field of about half of the first fertilized egg and the image set of the second crustacean includes an image of the second fertilized egg with a depth of field of about half of the second fertilized egg.
5. The method of any of embodiments 1-4, further comprising: receiving an image set of a third crustacean, wherein the image set of the third crustacean includes a phenotype characteristic of the third crustacean; generating a third pixel array based on the image set of the third crustacean; labeling the third pixel array with a genotype biomarker for the third crustacean; training the neural network to detect genotype biomarkers in crustaceans based on the labeled first pixel array and the labeled third pixel array.
6. The method of any of embodiments 1-5, wherein the genotype biomarker for the first crustacean is a first classification of the neural network, and wherein receiving the output from the neural network indicating the genotype biomarker for the second crustacean comprises matching the second pixel array to the first classification.
7. The method of any of embodiments 1-6, wherein the image set of the first crustacean and the image set of the second crustacean were generated together, and wherein the first crustacean is male and the second crustacean is female.
8. The method of any of embodiments 1-7, wherein the image set of the first crustacean is created using an imaging device that detects electromagnetic radiation with wavelengths between about 400 nanometers to about 1100 nanometers.
9. The method of any of embodiments 1-8, wherein the image set of the first crustacean has a red color array, a green color array, and a blue color array, and wherein generating the first pixel array based on the image set of the first crustacean, further comprises: determining a grayscale color array for the image set of the first crustacean; and generating the first pixel array based on the grayscale color array.
10. The method of any of embodiments 1-9, further comprising: generating the image set of the first crustacean; and genetically testing the first crustacean to determine the genotype biomarker in the first crustacean.
11. The method of any of embodiments 1-10, wherein the first crustacean is under 50 grams, and wherein the second crustacean is under 5 grams.
12. A method of predicting dates of physiological changes in crustacean based on external characteristics, the method comprising:
receiving, using control circuitry, an image set of a first crustacean, wherein the image set of the first crustacean includes a first external characteristic on the first crustacean;
generating, using the control circuitry, a first pixel array based on the image set of the first crustacean;
labeling, using the control circuitry, the first pixel array with a first date range for a physiological change for the first crustacean;
training, using the control circuitry, an artificial neural network to determine the first date range based on the labeled first pixel array;
receiving, using the control circuitry, an image set of a second crustacean, wherein the image set of the second crustacean includes the first external characteristic on the second crustacean;
generating, using the control circuitry, a second pixel array based on the image set of the second crustacean;
inputting, using the control circuitry, the second pixel array into the artificial neural network; and
receiving, using the control circuitry, an output from the artificial neural network indicating that the second crustacean has the first date range of the physiological change.
13. The method of embodiment 12, further comprising:
receiving, using control circuitry, an image set of a third crustacean, wherein the image set of the third crustacean includes a second external characteristic on the third crustacean;
generating, using the control circuitry, a third pixel array based on the image set of the third crustacean;
labeling, using the control circuitry, the third pixel array with a third date range for a physiological change for the third crustacean;
training, using the control circuitry, the artificial neural network to determine the third date range based on the labeled third pixel array.
14. The method of any of embodiments 12-13, wherein the first date range corresponds to a first classification of the artificial neural network and the third date range corresponds to a second classification of the artificial neural network.
15. The method of embodiment 14, wherein inputting the second pixel array into the artificial neural network comprises determining whether the second pixel array corresponds to the first classification or the second classification.
16. The method of any of embodiments 12-15, wherein the artificial neural network comprises a convolutional neural network.
17. The method of any of embodiments 12-16, wherein the physiological change is molting.
18. The method of any of embodiments 12-17, wherein the first crustacean is 0-3 grams, 2-6 grams, 10-100 grams, under 200 grams, or over 200 grams.
19. The method of any of embodiments 12-18, wherein the first date range includes a date when molting is complete in the second crustacean.
20. The method of any of embodiments 12-19, wherein the physiological change is sexual maturity, and wherein the first date range includes a date when the second crustacean is sexually mature.
21. The method of any of embodiments 12-20, further comprising sorting the second crustacean based on the output.
22. A method of predicting dates of physiological changes in crustaceans based on external characteristics, the method comprising:
receiving, using control circuitry, an image set of a first crustacean, wherein the image set of the first crustacean includes a first external characteristic on the first crustacean;
generating, using the control circuitry, a first pixel array based on the image set of the first crustacean;
labeling, using the control circuitry, the first pixel array with a first physiological change for the first crustacean;
training, using the control circuitry, an artificial neural network to detect the first physiological change based on the labeled first pixel array;
receiving, using the control circuitry, an image set of a second crustacean, wherein the image set of the second crustacean includes the first external characteristic on the second crustacean;
generating, using the control circuitry, a second pixel array based on the image set of the second crustacean;
inputting, using the control circuitry, the second pixel array into the artificial neural network; and
receiving, using the control circuitry, an output from the artificial neural network indicating that the second crustacean has the first physiological change.
23. The method of embodiment 22, further comprising:
receiving, using control circuitry, an image set of a third crustacean, wherein the image set of the third crustacean includes a second external characteristic on the third crustacean;
generating, using the control circuitry, a third pixel array based on the image set of the third crustacean;
labeling, using the control circuitry, the third pixel array with a second physiological change;
training, using the control circuitry, the artificial neural network to detect the second physiological change.
24. The method of any of embodiments 22-23, wherein the first physiological change corresponds to a first classification of the artificial neural network and the second physiological change corresponds to a second classification of the artificial neural network.
25. The method of embodiment 24, wherein inputting the second pixel array into the artificial neural network comprises determining whether the second pixel array corresponds to the first classification or the second classification.
26. The method of any of embodiments 22-25, wherein the artificial neural network comprises a convolutional neural network.
27. The method of any of embodiments 22-26, wherein the first physiological change is molting.
28. The method of any of embodiments 22-27, wherein the first physiological change is a predetermined cuticle hardness.
29. The method of any of embodiments 22-28, wherein the first physiological change is a predetermined cuticle hardness level or a degree of change in a cuticle.
30. The method of any of embodiments 22-29, wherein the first physiological change is a predetermined water intake amount in the first crustacean or a degree of change in water intake level of the first crustacean.
31. The method of any of embodiments 22-30, wherein the first crustacean is up to 200 grams.
32. An artificial neural network for detecting physiological changes of crustaceans based on external characteristics with over ninety-nine percent accuracy, produced by:
training the artificial neural network using a first training set of images comprising 2500 to 3000 crustaceans with corresponding labels for a physiological change, wherein the 2500 to 3000 crustaceans share a first attribute.
33. The artificial neural network of embodiment 32, further produced by training the artificial neural network using a second training set of images comprising 2500 to 20,000 images of un-tagged crustaceans that lack corresponding labels for the physiological change but share the first attribute.
34. The artificial neural network of any of embodiments 32-33, wherein the first attribute comprises a cohort, age, or weight.
35. The artificial neural network of any of embodiments 32-34, wherein the physiological change comprises a level of gonadal development, a level of molting, a level of sexual maturation, a level of disease resistance, or a level of robustness.
36. The artificial neural network of any of embodiments 32-35, wherein the physiological change comprises a completion of gonadal development, molting, or sexual maturation.
37. The artificial neural network of any of embodiments 32-36, wherein the training comprises randomizing a first training characteristic or controlling the first training characteristic.
38. The artificial neural network of embodiment 36 or 37, wherein the first training characteristic is an imaging device used to capture the first training set of images or an image background for the first training set of images.
39. The artificial neural network of any of embodiments 32-38, wherein the first training set of images comprises an image set for each of the 2500 to 3000 crustaceans, and wherein the image set comprises a series of images of each of the 2500 to 3000 crustaceans at a first time and a second time.
40. The artificial neural network of any of embodiments 32-39, wherein the first training set of images comprises an image of the 2500 to 3000 crustaceans during a photoperiod treatment.
41. The artificial neural network of any of embodiments 32-40, wherein each of the 2500 to 3000 crustaceans is a juvenile crustacean or under 200 grams.
42. A tangible, non-transitory, machine-readable medium storing instructions that, when executed by a data processing apparatus, cause the data processing apparatus to perform operations comprising those of any of embodiments 1-41.
43. A system comprising: one or more processors; and memory storing instructions that, when executed by the processors, cause the processors to effectuate operations comprising those of any of embodiments 1-41.
Claims
1. A method of predicting dates of physiological changes in crustaceans based on external characteristics, the method comprising:
- receiving, using control circuitry, an image set of a first crustacean, wherein the image set of the first crustacean includes a first external characteristic on the first crustacean;
- generating, using the control circuitry, a first pixel array based on the image set of the first crustacean;
- labeling, using the control circuitry, the first pixel array with a first date range for a physiological change for the first crustacean;
- training, using the control circuitry, an artificial neural network to determine the first date range based on the labeled first pixel array;
- receiving, using the control circuitry, an image set of a second crustacean, wherein the image set of the second crustacean includes the first external characteristic on the second crustacean;
- generating, using the control circuitry, a second pixel array based on the image set of the second crustacean;
- inputting, using the control circuitry, the second pixel array into the artificial neural network; and
- receiving, using the control circuitry, an output from the artificial neural network indicating that the second crustacean has the first date range of the physiological change.
2. The method of claim 1, further comprising:
- receiving, using control circuitry, an image set of a third crustacean, wherein the image set of the third crustacean includes a second external characteristic on the third crustacean;
- generating, using the control circuitry, a third pixel array based on the image set of the third crustacean;
- labeling, using the control circuitry, the third pixel array with a third date range for a physiological change for the third crustacean;
- training, using the control circuitry, the artificial neural network to determine the third date range based on the labeled third pixel array.
3. The method of claim 2, wherein the first date range corresponds to a first classification of the artificial neural network and the third date range corresponds to a second classification of the artificial neural network.
4. The method of claim 3, wherein inputting the second pixel array into the artificial neural network comprises determining whether the second pixel array corresponds to the first classification or the second classification.
5. The method of claim 1, wherein the artificial neural network comprises a convolutional neural network.
6. The method of claim 1, wherein the physiological change is molting.
7. The method of claim 1, wherein the first crustacean is under grams.
8. The method of claim 1, wherein the first date range includes a date when molting is complete in the second crustacean.
9. The method of claim 1, wherein the physiological change is sexual maturity, and wherein the first date range includes a date when the second crustacean is sexually mature.
10. The method of claim 1, further comprising sorting the second crustacean based on the output.
11. A method of predicting dates of physiological changes in crustacean based on external characteristics, the method comprising:
- receiving, using control circuitry, an image set of a first crustacean, wherein the image set of the first crustacean includes a first external characteristic on the first crustacean;
- generating, using the control circuitry, a first pixel array based on the image set of the first crustacean;
- labeling, using the control circuitry, the first pixel array with a first physiological change for the first crustacean;
- training, using the control circuitry, an artificial neural network to detect the first physiological change based on the labeled first pixel array;
- receiving, using the control circuitry, an image set of a second crustacean, wherein the image set of the second crustacean includes the first external characteristic on the second crustacean;
- generating, using the control circuitry, a second pixel array based on the image set of the second crustacean;
- inputting, using the control circuitry, the second pixel array into the artificial neural network; and
- receiving, using the control circuitry, an output from the artificial neural network indicating that the second crustacean has the first physiological change.
12. The method of claim 11, further comprising:
- receiving, using control circuitry, an image set of a third crustacean, wherein the image set of the third crustacean includes a second external characteristic on the third crustacean;
- generating, using the control circuitry, a third pixel array based on the image set of the third crustacean;
- labeling, using the control circuitry, the third pixel array with a second physiological change;
- training, using the control circuitry, the artificial neural network to detect the second physiological change.
13. The method of claim 12, wherein the first physiological change corresponds to a first classification of the artificial neural network and the second physiological change corresponds to a second classification of the artificial neural network.
14. The method of claim 13, wherein inputting the second pixel array into the artificial neural network comprises determining whether the second pixel array corresponds to the first classification or the second classification.
15. The method of claim 11, wherein the artificial neural network comprises a convolutional neural network.
16. The method of claim 11, wherein the first physiological change is molting.
17. The method of claim 11, wherein the first physiological change is a cuticle hardness.
18. The method of claim 11, wherein the first physiological change is a level of feeding activity.
19. The method of claim 11, wherein the first physiological change is an amount of water intake of the first crustacean or a degree of change in cuticle hardness.
20. The method of claim 11, wherein the first crustacean is up to 100 grams.
21. An artificial neural network for detecting physiological changes of crustaceans based on external characteristics with over ninety-nine percent accuracy in detecting the physiological changes, produced by:
- training the artificial neural network using a first training set of images comprising 2500 to 3000 crustaceans with corresponding labels for a physiological change, wherein the 2500 to 3000 crustaceans share a first attribute.
22. The artificial neural network of claim 21, further produced by training the artificial neural network using a second training set of images comprising 2500 to 20,000 images of crustaceans that lack corresponding labels for the physiological change but share the first attribute.
23. The artificial neural network of claim 21, wherein the first attribute comprises a cohort, molt stage, life stage, age, size, or weight.
24. The artificial neural network of claim 21, wherein the physiological change comprises a level of gonadal development, a level of molting, a level of sexual maturation, a level of disease resistance, or a level of robustness.
25. The artificial neural network of claim 21, wherein the physiological change comprises a completion of gonadal development, molting, or sexual maturation.
26. The artificial neural network of claim 21, wherein the training comprises randomizing a first training characteristic or controlling the first training characteristic.
27. The artificial neural network of claim 26, wherein the first training characteristic is an imaging device used to capture the first training set of images or an image background for the first training set of images.
28. The artificial neural network of claim 21, wherein the first training set of images comprises an image set for each of the 2500 to 3000 crustaceans, and wherein the image set comprises a series of images of each of the 2500 to 3000 crustaceans at a first time and a second time.
29. The artificial neural network of claim 21, wherein the first training set of images comprises an image of the 2500 to 3000 crustaceans during a photoperiod treatment.
30. The artificial neural network of claim 21, wherein each of the 2500 to 3000 crustaceans is a juvenile crustacean or under 2 grams.
Type: Application
Filed: Aug 4, 2020
Publication Date: Oct 20, 2022
Inventor: Ayal BRENNER (Birkirkara)
Application Number: 17/642,321