METHODS AND SYSTEMS FOR DIAGNOSING DISEASE, PHYSIOLOGICAL CHANGES, OR OTHER INTERNAL CONDITIONS IN CRUSTACEANS THROUGH NON-INVASIVE MEANS

Methods and systems are disclosed for improvements in aquaculture that allow for increasing the number and harvesting efficiency of crustaceans in an aquaculture setting by identifying and predicting internal conditions and/or physiological conditions of the crustaceans based on external characteristics that are imaged through non-invasive means.

Description
CROSS-REFERENCE TO RELATED APPLICATIONS

The subject application claims priority to U.S. Provisional Patent Application No. 62/889,313, filed Aug. 20, 2019; U.S. Provisional Patent Application No. 62/956,751, filed Jan. 3, 2020; U.S. Provisional Patent Application No. 62/956,759, filed Jan. 3, 2020; and U.S. Provisional Patent Application No. 62/956,764, filed Jan. 3, 2020. The subject matter of each of the foregoing applications is incorporated herein by reference in its entirety.

FIELD OF THE INVENTION

The invention relates to identifying internal conditions in crustaceans.

BACKGROUND

Aquaculture, the farming of fish, crustaceans, mollusks, aquatic plants, algae, and other organisms, is the fastest growing food sector and will soon provide the majority of crustaceans for human consumption. Demand for seafood is rising due to human population growth, increasing disposable income in the developing world (which coincides with a shift toward meat-based diets), increasing coastal populations, and general health awareness (which tends to steer consumers toward crustacean-based protein). Wild crustacean resources are already at their limits, and the world market is increasingly focused on sustainability and environmental responsibility, meaning increased harvesting of wild crustaceans is not feasible.

SUMMARY

Given the infeasibility of meeting world demand through harvesting wild crustaceans, aquaculture represents a natural solution. However, scaling up aquaculture raises additional concerns such as increased disease, growth and feeding inefficiencies, and waste management. That is, while crustacean farming has increased exponentially in the past decades, the transition from small-scale hatcheries to larger industrial hatcheries and nurseries (necessary to keep up with the rising demand) has introduced both disease and reproduction issues in crustacean species. For example, increasing the number of crustaceans in a crustacean farm, without mitigating measures, increases the prevalence of disease as the crustaceans are brought into closer proximity to each other. Growth and feed inefficiencies also increase, as more crustaceans in close proximity make tracking crustacean size and effectively distributing feed more difficult, and more crustaceans produce more waste, which increases the environmental impact of a crustacean farm on its immediate area.

For example, most captive conditions for crustaceans (e.g., shrimp species in particular) cause inhibitions in females that prevent them from developing mature ovaries. While techniques such as eyestalk ablation have been developed to stimulate female crustaceans to develop mature ovaries and spawn, this process is labor intensive and invasive, and raises ethical concerns. Likewise, densely populated, monocultural farms allow virus infections to spread rapidly. As the viruses are transmitted through the water itself, they pose a danger to both the farmed crustaceans and wild populations. Additionally, bacterial infections such as vibriosis, which may also spread rapidly, have mortality rates of nearly 70%. While vibriosis may be detected based on external characteristics (e.g., infected crustaceans may become weak and disoriented, and may have dark wounds on the cuticle), the detection is often too late to save the infected specimen and/or sort out the infected specimen to prevent a spread of the disease to other specimens.

Methods and systems are also disclosed herein for improvements in aquaculture that allow for increasing the number and harvesting efficiency of crustaceans in an aquaculture setting while still mitigating the problems above. For example, the methods and systems may be applied to detecting and predicting physiological changes such as molting, as well as disease infection, stage of development, and/or other internal conditions. That is, the methods and systems may be used to detect and/or predict any function or mechanism (and changes thereto) in the crustacean. For example, the methods and systems are successful both in predicting when a crustacean is likely to molt and in determining whether or not a crustacean has molted based on non-invasive means. Moreover, because these non-invasive means rely on exterior images of crustaceans, they may be scaled for use on entire batches numbering in the thousands of crustaceans.

The methods and systems may also be applied to the selective breeding and monitoring of crustaceans to identify crustaceans with superior traits, crustaceans experiencing physiological changes (e.g., molting), and/or crustaceans having a particular internal condition (e.g., a disease infection). In conventional systems, crustacean sorting is not possible until crustaceans mature to a grow-out phase and/or reach a particular size. Prior to this, invasive genetic tests can be done to determine gender and identify biomarkers related to certain diseases, but these genetic tests may result in the death of the specimen (e.g., crushing of the fertilized egg) and/or are time-consuming. Moreover, even at the grow-out phase, sorting may be a manual process that is both time and labor intensive.

In contrast to the conventional approaches, methods and systems are described herein for non-invasive procedures for identifying traits in crustaceans. Moreover, based on the methods and systems described herein, certain traits can be determined when the crustacean is only 2-4 grams, and/or still in the egg, and with over 90% accuracy for certain genetic biomarkers. As this procedure is non-invasive and relies on the detection of phenotype characteristics based on external images of crustacean eggs and/or crustacean fries, the viability of the specimen is not threatened. Because the viability of the specimen is not threatened and the genetic traits are identified, the crustaceans may be sorted for a given gender or disease resistance at a size and/or level of maturity unseen in conventional approaches. The efficiencies and number of crustaceans available for farming are thus increased without the drawbacks discussed above.

Key to the advancements discussed above is the detection of certain phenotype characteristics, as discussed below, of the crustacean based on external images. As the number of specimens (e.g., eggs and/or fries) in a single batch may number in the tens of thousands, also important to these advancements is the ability to detect the phenotype characteristics quickly and efficiently. To do this, the methods and systems discussed below describe the use of trained artificial neural networks. These artificial neural networks are trained on data sets of crustaceans in different life stages (e.g., fertilized egg, nauplius larval, zoea larval, megalopa larval, post-larval, juvenile, and/or adult), at a particular age (e.g., under 35 days after fertilization, under 90 days after fertilization, and/or under 120 days after fertilization), and/or of a particular size (e.g., under 1 millimeter in length, under 1 centimeter in length, over 10 centimeters in length, etc.), and are based on image sets that include the phenotype characteristics. In some embodiments, the artificial neural networks may be trained on images of crustaceans both before and after a given molting. In some embodiments, the artificial neural networks may be trained on images of crustaceans both before and after the existence of a pathogen.

In one aspect, methods and systems for identifying internal conditions in crustaceans based on external characteristics are described. For example, the system may include receiving an image set of a first crustacean, wherein the image set of the first crustacean includes a phenotype characteristic of the first crustacean. In some embodiments, the crustacean may be a shrimplet, and the image set of the crustacean may include an external first view image of the first shrimplet and an external second view image of the first shrimplet. Additionally or alternatively, the image set of the shrimplet may be generated while the gills of the shrimplet are hydrated or while the shrimplet is sedated in order to reduce stress on the shrimplet. In some embodiments, the crustacean may be in a larval stage, and the image set of the crustacean may include a depth of field of about half of the larva. In some embodiments, the crustacean may be a fertilized egg, and the image set of the crustacean may include a depth of field of about half of the fertilized egg. In some embodiments, the image set may be created using an imaging device that detects electromagnetic radiation with wavelengths between about 400 nanometers and about 1100 nanometers, and/or different genders of crustaceans may be imaged together. For example, by standardizing the collection and preparation of image sets of crustaceans that contain the phenotype characteristics (e.g., by reducing stress, imaging genders together, normalizing specimen size, and/or using consistent image parameters), system bias can be eliminated. For egg and larval images, images may be captured using an ocular micrometer placed in the microscope eyepiece of a compound microscope.

The system may then generate a first pixel array based on the image set of the first crustacean and label the first pixel array with a genotype biomarker for the first crustacean. For example, the detected phenotype characteristics in the image set may be converted into a form that can be quickly and efficiently processed by an artificial neural network. In some embodiments, this may include one or more vectors, arrays, and/or matrices that represent either a red, green, blue (RGB) color image or a grayscale image. Furthermore, in some embodiments, the system may additionally convert the image set from a set of one or more vectors, arrays, and/or matrices to another set of one or more vectors, arrays, and/or matrices. For example, the system may convert an image set having a red color array, a green color array, and a blue color array to a grayscale array.
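By way of illustration only, the conversion described above might be implemented as in the following minimal sketch, assuming NumPy and the Pillow imaging library; the file name and the ITU-R BT.601 luminosity weights are illustrative assumptions rather than part of the disclosure.

```python
# Hypothetical sketch: convert an image to an RGB pixel array, then
# collapse the red, green, and blue color arrays into one grayscale array.
import numpy as np
from PIL import Image

def image_to_pixel_array(path: str) -> np.ndarray:
    """Load an image and return an H x W x 3 RGB pixel array."""
    return np.asarray(Image.open(path).convert("RGB"), dtype=np.float32)

def rgb_to_grayscale(rgb: np.ndarray) -> np.ndarray:
    """Weighted sum of the color channels (ITU-R BT.601 weights, assumed)."""
    weights = np.array([0.299, 0.587, 0.114], dtype=np.float32)
    return rgb @ weights  # yields an H x W grayscale array

pixels = image_to_pixel_array("crustacean_egg_top_view.png")  # assumed file name
gray = rgb_to_grayscale(pixels)
```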

The system may then train an artificial neural network to detect the genotype biomarker in crustaceans based on the labeled first pixel array. For example, the system may generate the image set of the first crustacean, and genetically test the first crustacean to determine a genotype biomarker (e.g., sequencing of seven hypervariable regions of the 16S rRNA gene and/or gender) in the first crustacean. Additional discussion of the 16S rRNA gene and its relation to disease resistance, and biomarkers therefor, is available in Cornejo-Granados, F., Lopez-Zavala, A. A., Gallardo-Becerra, L. et al., “Microbiome of Pacific Whiteleg shrimp reveals differential bacterial community composition between Wild, Aquacultured and AHPND/EMS outbreak conditions,” Sci Rep 7, 11783 (2017), which is hereby incorporated by reference in its entirety.

The presence of a particular genotype biomarker is then correlated to one or more phenotype characteristics. For example, the artificial neural network may have classifications for the genotype biomarkers. The artificial neural network is then trained based on a first data set (e.g., including data of the first crustacean and others) to classify a specimen as having a given genotype biomarker when particular phenotype characteristics are present.
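As a hedged example, the training step described above could be sketched in PyTorch as follows. The network shape, the number of biomarker classes, and the placeholder tensors train_images and train_labels are assumptions for illustration; the disclosure does not prescribe a particular architecture.

```python
# Hypothetical sketch: fit a small classifier to labeled pixel arrays so
# that particular phenotype characteristics map to a genotype biomarker class.
import torch
import torch.nn as nn

NUM_BIOMARKER_CLASSES = 2            # e.g., biomarker present / absent (assumed)
model = nn.Sequential(
    nn.Flatten(),                    # pixel array -> feature vector
    nn.Linear(64 * 64, 128),         # assumes 64x64 grayscale pixel arrays
    nn.ReLU(),
    nn.Linear(128, NUM_BIOMARKER_CLASSES),
)
loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Placeholder data standing in for the labeled first data set.
train_images = torch.rand(32, 64, 64)
train_labels = torch.randint(0, NUM_BIOMARKER_CLASSES, (32,))

for epoch in range(10):
    optimizer.zero_grad()
    loss = loss_fn(model(train_images), train_labels)
    loss.backward()                  # backpropagate classification error
    optimizer.step()                 # adjust connection weights
```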

The system may then receive an image set of a second crustacean, wherein the image set of the second crustacean includes a phenotype characteristic of the second crustacean. The system may generate a second pixel array based on the image set of the second crustacean, and input the second pixel array into the trained neural network. The system may then receive an output from the trained neural network indicating that the second crustacean has the genotype biomarker. For example, the system may input a second data set (e.g., image sets of crustaceans for which genotype biomarkers are not known) into the trained artificial neural network. The trained artificial neural network may then classify the image sets of crustaceans according to the genotype biomarkers. For example, the genotype biomarker for the first crustacean may be a first classification of the neural network, and the system may generate an output from the neural network indicating that the second crustacean has the same genotype biomarker as the first crustacean based on matching the second pixel array to the first classification.
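Continuing the hedged sketch above, classifying the second crustacean might look like the following; treating class 0 as the first crustacean's biomarker classification is an assumption for illustration.

```python
# Hypothetical sketch: pass the second pixel array through the trained
# network and match the predicted class to the first classification.
second_pixel_array = torch.rand(1, 64, 64)   # placeholder image set
with torch.no_grad():
    predicted_class = model(second_pixel_array).argmax(dim=1)
has_same_biomarker = (predicted_class.item() == 0)  # class 0: biomarker (assumed)
```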

Various other aspects, features, and advantages of the invention will be apparent through the detailed description of the invention and the drawings attached hereto. It is also to be understood that both the foregoing general description and the following detailed description are examples and not restrictive of the scope of the invention. As used in the specification and in the claims, the singular forms of “a,” “an,” and “the” include plural referents unless the context clearly dictates otherwise. In addition, as used in the specification and the claims, the term “or” means “and/or” unless the context clearly dictates otherwise.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 shows a chart indicating common external characteristics used for detection of molting in crustaceans, in accordance with one or more embodiments.

FIG. 2 shows a system featuring a machine learning model configured to detect internal conditions based on external characteristics, in accordance with one or more embodiments.

FIG. 3 shows graphical representations of artificial neural network models for detecting internal conditions based on external characteristics, in accordance with one or more embodiments.

FIG. 4 shows an illustrative example of detecting internal conditions based on external characteristics in crustacean fries, in accordance with one or more embodiments.

FIG. 5 shows an illustrative example of detecting internal conditions based on external characteristics in fertilized eggs, in accordance with one or more embodiments.

FIG. 6 shows a flowchart for detecting internal conditions based on external characteristics in crustaceans, in accordance with one or more embodiments.

FIG. 7 shows a flowchart for predicting when a crustacean will undergo a physiological change, in accordance with one or more embodiments.

FIG. 8 shows a flowchart for detecting physiological changes in crustaceans, in accordance with one or more embodiments.

FIG. 9 shows a flowchart for training an artificial neural network, in accordance with one or more embodiments.

DETAILED DESCRIPTION OF THE DRAWINGS

In the following description, for the purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the embodiments of the invention. It will be appreciated, however, by those having skill in the art that the embodiments of the invention may be practiced without these specific details or with an equivalent arrangement. In other cases, well-known structures and devices are shown in block diagram form in order to avoid unnecessarily obscuring the embodiments of the invention.

The methods and systems described herein may successfully predict and detect traits in crustaceans. Traits may include any internal condition (e.g., genotype, pathogen infection, physiological change) or external condition (e.g., phenotype). For example, the methods and systems discussed herein may be used to predict and detect other phenotypes (e.g., not just genotypes) based on a phenotype. For example, a first phenotype (e.g., color) may be used to predict a second phenotype (e.g., size, weight, etc.). Additionally, the methods and systems may be used to recognize individual specimens based on their phenotypes.

The internal conditions may include a current physiological condition (e.g., a condition occurring normally in the body of the juvenile crustacean) such as a gender of the crustacean (e.g., as determined by the development of sex organs) and/or a stage of development in the crustacean (e.g., the state of molting). The internal conditions may include a predisposition to a future physiological condition such as a growth rate, maturity date, and/or behavioral traits. The internal condition may include a pathological condition (e.g., a condition centered on an abnormality in the body of the crustacean arising in response to a disease) such as whether or not the crustacean is suffering from a given disease and/or is currently infected with a given disease. The internal condition may include a genetic condition (e.g., a condition based on the formation of the genome of the crustacean) such as whether or not the crustacean includes a given genotype. The internal condition may include a presence of a given biomarker (e.g., a measurable substance in an organism whose presence is indicative of a disease, infection, current internal condition, future internal condition, and/or environmental exposure).

In some embodiments, the methods and systems may detect whether or not a crustacean has undergone a physiological change (e.g., molting) and/or predict when the crustacean will undergo a physiological change. Physiological changes may include molting, sexual maturity, and/or other physiological developments. Physiological changes during molting may include altered body shape, changes to the cuticle, and/or behavior changes. Detected changes may also include a level of gonadal development, a level of molting, a level of sexual maturation, a level of disease resistance, or a level of robustness.

FIG. 1 shows a chart indicating common external characteristics used for detection of molting in crustaceans, in accordance with one or more embodiments. For example, diagram 100 shows various characteristics related to the different stages of molting in crustaceans. Additionally, diagram 100 shows a length of time associated with the stages of molting. The length of time is given as a percentage of the total molting time, as the time per molt may differ between crustacean species as well as with the age of a given specimen. As a point of reference, shrimp typically take three to eight weeks to complete a molt.

As shown in diagram 100, molting may have three stages, although additional sub-stages may also be used in some embodiments. During the post-molt, a crustacean recovers from its previous molt and its cuticle begins to harden. While adapting to its new size, the crustacean has little to no activity and/or feeding. Additionally, the crustacean may absorb large amounts of water. Some diseases such as White Spot Syndrome Virus (WSS) in shrimp may emerge during this phase, as well as potential osmotic shock due to the intake of water. It should also be noted that in some embodiments, along with detecting external characteristics associated with this stage, the system may also monitor other conditions (e.g., salt concentrations in the environment using an osmoregulatory sensor). This information may be recorded and correlated with the specimen as discussed below. During the intermolt phase, the crustacean is stable and the cuticle is functional. The mass growth of the crustacean is continuous, and its feeding activity is at its maximum level. It should be noted that internal conditions of the crustacean (e.g., such as mass) may be detected by the system even though the external characteristics (e.g., the size of the cuticle) remain the same.

For example, as discussed below, these internal characteristics may be determined based on externally visible traits of the crustacean. These externally visible traits may include phenotype characteristics (e.g., one or more observable characteristics of the crustacean resulting from the interaction of its genotype with the environment). These externally visible traits may include traits corresponding to physiological changes in the crustacean. For example, during molting, externally visible traits related to this physiological change may include altered body shape, increased cuticle hardness, changes in water intake, and changes in behavior (e.g., feeding, mating, etc.). In such cases, the system may compare images of crustaceans before and after these physiological, morphological, and/or behavioral changes to train an artificial neural network to be used to predict when, and/or detect whether, other crustaceans have undergone, or will begin, the morphological and/or behavioral changes. In some embodiments, a trait may include a weight, presence of an active pathogen, presence of a parasite, or presence of a disease.

Following the intermolt stage, the crustacean will begin the premolt as the new cuticle begins to form and appear. The crustacean will also experience an interregnum between the old and new cuticle. This stage may also coincide with a gradual decrease in feeding and other activities. The system may monitor these changes in order to predict changes as discussed below.

In some embodiments, the system may include receiving an image set of a first crustacean. The image set may include one or more images of the crustacean. If the image set includes multiple images, the multiple images may be captured from different angles (e.g., a top view, side view, bottom view, etc.) and/or may be captured substantially simultaneously. The images in the image set may include separate images (e.g., images stored separately, but linked by a common identifier such as a serial number) or images stored together. An image in an image set may also be a composite image (e.g., an image created by cutting, cropping, rearranging, and/or overlapping two or more images). In some embodiments, the crustacean may be a shrimplet, and the image set of the crustacean may include an external first view image of the first shrimplet and an external second view image of the first shrimplet. Additionally or alternatively, the image set of the shrimplet may be generated while the gills of the shrimplet are hydrated or while the shrimplet is sedated in order to reduce stress on the shrimplet. In some embodiments, the juvenile crustacean may be a fertilized egg, and the image set of the juvenile crustacean may include a depth of field of about half of the fertilized egg and/or a depth of field such that the image captures one or more of the vitelline membrane, chorion, yolk, oil globule, perivitelline space, or embryo.

In some embodiments (e.g., when predicting a date range for a physiological change and/or determining whether or not a physiological change has occurred), image sets may include a series of images taken after predetermined intervals of time and of a predetermined number (e.g., two or more). The intervals of time may vary and may depend on the life stage (or molting stage) of the specimen being imaged (e.g., as discussed in relation to FIGS. 1, 4, and 5). For example, in some embodiments, the intervals may be from a few weeks up to three months at the juvenile stage, but following the juvenile stage, the intervals may be smaller. The intervals may increase to over a month at the “grow-out” stage when the specimens are at sea. The intervals may also be keyed to particular activities. For example, a first imaging may occur when the specimens are released to sea and a second imaging may occur at harvest.

For example, in shrimp, the time from fertilization to shrimplet may be twenty-five to thirty-five days. Shrimplet to juvenile may be roughly sixty days. From juvenile to adult may be fifteen days. Adults may require one to three days to mate. The system may peg imaging dates to these periods of reference.
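By way of illustration only, the reference periods above could be used to peg imaging dates as in the following sketch; the dictionary of stage durations uses the upper bounds from the text, and the start date is an illustrative assumption.

```python
# Hypothetical sketch: derive one imaging date per reference period.
from datetime import date, timedelta

STAGE_DURATIONS_DAYS = {                 # upper bounds from the text above
    "fertilization_to_shrimplet": 35,
    "shrimplet_to_juvenile": 60,
    "juvenile_to_adult": 15,
    "mating": 3,
}

def imaging_dates(fertilization: date) -> dict:
    """Peg an imaging date to the end of each reference period."""
    schedule, day = {}, fertilization
    for stage, duration in STAGE_DURATIONS_DAYS.items():
        day = day + timedelta(days=duration)
        schedule[stage] = day
    return schedule

print(imaging_dates(date(2020, 1, 3)))   # assumed fertilization date
```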

In some embodiments, the image set may be created using an imaging device that detects electromagnetic radiation with wavelengths between about 400 nanometers and about 1100 nanometers. In some embodiments, the image set may be created using an imaging device that detects electromagnetic radiation with wavelengths between 400 and 500 nanometers, between 500 and 600 nanometers, between 700 and 900 nanometers, or between 700 and 1100 nanometers.

In some embodiments, strobe lighting may be synced with the imaging device. The strobe light may have a flash energy in the region of 10 to 150 joules, and discharge times as short as a few milliseconds, often resulting in a flash power of several kilowatts. The strobe light source may be a xenon flash lamp, or flashtube, which may have a complex spectrum and a color temperature of approximately 5,600 kelvins. In some embodiments, the system may obtain colored light through the use of colored gels.

The image set may capture an image of a given specimen. The specimen may be a crustacean in any life stage. For example, the crustacean may be an adult crustacean and/or a juvenile crustacean (e.g., a crustacean that has not reached sexual maturity). Juvenile crustaceans may include crustacean shrimplets, eggs, or larvae. It should be noted that while embodiments of this disclosure relate to juvenile crustaceans, these embodiments are also applicable to other specimens. In particular, these specimens may include any type of aquatic life (e.g., organisms that live in aquatic ecosystems) and/or oviparous organisms.

A plurality of imaging devices may be used. However, controlling the imaging devices (e.g., using the same imaging devices or the same settings on the imaging devices) may ensure that the artificial neural network achieves the best results. Alternatively, the imaging devices or settings for the imaging devices may be randomized. Randomizing the imaging devices or settings for the imaging devices may improve the results of the artificial neural network as unintended bias (e.g., from the imaging device or a setting of the imaging device) is not introduced. In some embodiments, the imaging device may be able to capture high resolution images and may comprise a 21-megapixel camera. The images captured by the imaging device may include 3 to 6-megapixel images.
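As a hedged illustration of the controlled-versus-randomized capture settings discussed above, the following sketch shows both options; the setting names and value ranges are assumptions, not parameters prescribed by the disclosure.

```python
# Hypothetical sketch: hold capture settings constant, or draw them at
# random per image so that no single configuration biases the network.
import random

FIXED_SETTINGS = {"exposure_ms": 2, "iso": 200, "wavelength_nm": 850}

def randomized_settings() -> dict:
    return {
        "exposure_ms": random.choice([1, 2, 5]),
        "iso": random.choice([100, 200, 400]),
        "wavelength_nm": random.choice([450, 550, 850]),
    }
```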

In some embodiments, the system may then generate a pixel array based on the image set of the first crustacean. The pixel array may refer to computer data that describes the image (e.g., pixel by pixel). In some embodiments, this may include one or more vectors, arrays, and/or matrices that represent either an RGB color image or a grayscale image. Furthermore, in some embodiments, the system may additionally convert the image set from a set of one or more vectors, arrays, and/or matrices to another set of one or more vectors, arrays, and/or matrices. For example, the system may convert an image set having a red color array, a green color array, and a blue color array to a grayscale array.

FIG. 2 shows a computer system featuring a machine learning model configured to detect internal conditions based on external characteristics, in accordance with one or more embodiments. As shown in FIG. 2, system 200 may include client device 202, client device 204, or other components. Each of client devices 202 and 204 may include any type of mobile terminal, fixed terminal, or other device. Each of these devices may receive content and data via input/output (hereinafter “I/O”) paths and may also include processors and/or control circuitry to send and receive commands, requests, and other suitable data using the I/O paths. The control circuitry may comprise any suitable processing circuitry. Each of these devices may also include a user input interface and/or display for use in receiving and displaying data. By way of example, client devices 202 and 204 may include a desktop computer, a server, or other client device. Users may, for instance, utilize one or more client devices 202 and 204 to interact with one another, one or more servers, or other components of system 200. It should be noted that, while one or more operations are described herein as being performed by particular components of system 200, those operations may, in some embodiments, be performed by other components of system 200. As an example, while one or more operations are described herein as being performed by components of client device 202, those operations may, in some embodiments, be performed by components of client device 204. It should be noted that, although some embodiments are described herein with respect to machine learning models, other prediction models (e.g., statistical models or other analytics models) may be used in lieu of or in addition to machine learning models in other embodiments (e.g., a statistical model replacing a machine learning model and a non-statistical model replacing a non-machine-learning model in one or more embodiments).

Each of these devices may also include memory in the form of electronic storage. The electronic storage may include non-transitory storage media that electronically stores information. The electronic storage media of the electronic storages may include one or both of (i) system storage that is provided integrally (e.g., substantially non-removable) with servers or client devices or (ii) removable storage that is removably connectable to the servers or client devices via, for example, a port (e.g., a USB port, a firewire port, etc.) or a drive (e.g., a disk drive, etc.). The electronic storages may include one or more of optically readable storage media (e.g., optical disks, etc.), magnetically readable storage media (e.g., magnetic tape, magnetic hard drive, floppy drive, etc.), electrical charge-based storage media (e.g., EEPROM, RAM, etc.), solid-state storage media (e.g., flash drive, etc.), and/or other electronically readable storage media. The electronic storages may include one or more virtual storage resources (e.g., cloud storage, a virtual private network, and/or other virtual storage resources). The electronic storage may store software algorithms, information determined by the processors, information obtained from servers, information obtained from client devices, or other information that enables the functionality as described herein.

FIG. 2 also includes communication paths 208, 210, and 212. Communication paths 208, 210, and 212 may include the Internet, a mobile phone network, a mobile voice or data network (e.g., a 4G or LTE network), a cable network, a public switched telephone network, or other types of communications network or combinations of communications networks. Communication paths 208, 210, and 212 may separately or together include one or more communications paths, such as a satellite path, a fiber-optic path, a cable path, a path that supports Internet communications (e.g., IPTV), free-space connections (e.g., for broadcast or other wireless signals), or any other suitable wired or wireless communications path or combination of such paths. The computing devices may include additional communication paths linking a plurality of hardware, software, and/or firmware components operating together. For example, the computing devices may be implemented by a cloud of computing platforms operating together as the computing devices.

In some embodiments, system 200 may use one or more prediction models to predict internal conditions based on external characteristics. For example, as shown in FIG. 2, system 200 may predict a genotype characteristic of a specimen (e.g., a crustacean identified by a specimen identification) using machine learning model 222. The determination may be output shown as output 218 on client device 204. The system may include one or more neural networks (e.g., as discussed in relation to FIG. 3) or other machine learning models.

As an example, with respect to FIG. 2, machine learning model 222 may take inputs 224 and provide outputs 226. The inputs may include multiple data sets such as a training data set and a test data set. The data sets may represent images (or image sets) of specimens such as crustaceans. In one use case, outputs 226 may be fed back to machine learning model 222 as input to train machine learning model 222 (e.g., alone or in conjunction with user indications of the accuracy of outputs 226, labels associated with the inputs, or with other reference feedback information). In another use case, machine learning model 222 may update its configurations (e.g., weights, biases, or other parameters) based on its assessment of its prediction (e.g., outputs 226) and reference feedback information (e.g., user indication of accuracy, reference labels, or other information). In another use case, where machine learning model 222 is a neural network, connection weights may be adjusted to reconcile differences between the neural network's prediction and the reference feedback. In a further use case, one or more neurons (or nodes) of the neural network may require that their respective errors are sent backward through the neural network to them to facilitate the update process (e.g., backpropagation of error). Updates to the connection weights may, for example, be reflective of the magnitude of error propagated backward after a forward pass has been completed. In this way, for example, the machine learning model 222 may be trained to generate better predictions.
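The weight-update behavior described above may be illustrated, purely as an assumption-laden sketch, with a single logistic neuron trained by gradient descent in NumPy; the learning rate and placeholder data are not drawn from the disclosure.

```python
# Hypothetical sketch: forward pass, comparison with reference feedback,
# and error-driven adjustment of connection weights.
import numpy as np

rng = np.random.default_rng(0)
x = rng.random((8, 4))                         # eight samples, four features
y = rng.integers(0, 2, 8).astype(float)        # reference labels
w, b, lr = np.zeros(4), 0.0, 0.1               # weights, bias, learning rate

for _ in range(100):
    pred = 1.0 / (1.0 + np.exp(-(x @ w + b)))  # forward pass (sigmoid)
    error = pred - y                           # prediction vs. reference feedback
    w -= lr * (x.T @ error) / len(y)           # propagate error back to weights
    b -= lr * error.mean()
```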

Machine learning model 222 may be trained to detect the internal conditions in crustaceans based on a pixel array. For example, client device 202 or 204 may generate the image set of the first crustacean (e.g., via an image capture device), and genetically test the first crustacean to determine a genotype biomarker (e.g., sequencing of seven hypervariable regions of the 16S rRNA gene and gender) in the first crustacean. The presence of a particular genotype biomarker is then correlated to one or more phenotype characteristics. For example, machine learning model 222 may have classifications for the internal conditions (e.g., genotype biomarkers). Machine learning model 222 is then trained based on a first data set (e.g., including data of the first crustacean and others) to classify a specimen as having a given genotype biomarker when particular phenotype characteristics are present.

The system may then receive an image set of a second crustacean, wherein the image set of the second crustacean includes an external characteristic of the second crustacean. Client device 202 or 204 may generate a second pixel array based on the image set of the second crustacean and input the second pixel array into machine learning model 222. The system may then receive an output from machine learning model 222 indicating that the second crustacean has the same internal condition (e.g., genotype biomarker) as the first. For example, the system may input a second data set (e.g., image sets of crustaceans for which genotype biomarkers are not known) into machine learning model 222. Machine learning model 222 may then classify the image sets of crustaceans according to the genotype biomarkers. For example, the genotype biomarker for the first crustacean may be a first classification of machine learning model 222, and the system may generate an output from machine learning model 222 indicating that the second crustacean has the same genotype biomarker as the first crustacean based on matching the second pixel array to the first classification.

In some embodiments, system 200 is further configured to handle, sort, and/or transfer crustaceans (e.g., for vaccination, gender segregation, transfer to sea or breeding area, etc.). In such embodiments, the internal condition may be detected based on external characteristics in real-time (e.g., as the crustaceans are transported along a conveyor belt or otherwise transferred). That is, following the output of an internal condition (e.g., a genotype biomarker as described in FIG. 6 below), system 200 may sort the crustaceans based on the determined internal condition.

FIG. 3 shows graphical representations of artificial neural network models for internal condition (e.g., genotype biomarker) detection based on external (e.g., phenotype) characteristics, in accordance with one or more embodiments. Model 300 illustrates an artificial neural network. Model 300 includes input layer 302. Image sets of crustaceans may be entered into model 300 at this level. Model 300 also includes one or more hidden layers (e.g., hidden layer 304 and hidden layer 306). Model 300 may be based on a large collection of neural units (or artificial neurons). Model 300 loosely mimics the manner in which a biological brain works (e.g., via large clusters of biological neurons connected by axons). Each neural unit of model 300 may be connected with many other neural units of model 300. Such connections can be enforcing or inhibitory in their effect on the activation state of connected neural units. In some embodiments, each individual neural unit may have a summation function which combines the values of all of its inputs together. In some embodiments, each connection (or the neural unit itself) may have a threshold function that the signal must surpass before it propagates to other neural units. Model 300 may be self-learning and trained, rather than explicitly programmed, and can perform significantly better in certain areas of problem solving, as compared to traditional computer programs. During training, output layer 308 may correspond to a classification of model 300 (e.g., whether or not a given image set corresponds to a genotype biomarker), and an input known to correspond to that classification may be input into input layer 302. In some embodiments, model 300 may include multiple layers (e.g., where a signal path traverses from front layers to back layers). In some embodiments, back propagation techniques may be utilized by model 300 where forward stimulation is used to reset weights on the “front” neural units. In some embodiments, stimulation and inhibition for model 300 may be more free-flowing, with connections interacting in a more chaotic and complex fashion. Model 300 also includes output layer 308. During testing, output layer 308 may indicate whether or not a given input corresponds to a classification of model 300 (e.g., whether or not a given image set corresponds to a genotype biomarker).
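The summation and threshold functions described above can be illustrated with a minimal sketch; the weights and threshold value are assumptions for illustration.

```python
# Hypothetical sketch: one neural unit combining its inputs via a
# summation function and propagating a signal only past a threshold.
import numpy as np

def neural_unit(inputs: np.ndarray, weights: np.ndarray, threshold: float) -> float:
    activation = float(np.dot(inputs, weights))      # summation function
    return 1.0 if activation > threshold else 0.0    # threshold function

signal = neural_unit(np.array([0.2, 0.9]), np.array([0.5, 0.8]), threshold=0.6)
```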

FIG. 3 also includes model 350, which is a convolutional neural network. The convolutional neural network is an artificial neural network that features one or more convolutional layers. Convolution layers extract features from an input image. Convolution preserves the relationship between pixels by learning image features using small squares of input data. For example, the relationship between the individual parts of the fertilized egg (e.g., vitelline membrane, chorion, yolk, oil globule, perivitelline space, and/or embryo) may be preserved. In another example, the relationship between the individual parts of the shrimplet (e.g., rostrum, eye, carapace, abdomen, antennule, antenna, cheliped, pereiopods, pleopods, telson, and uropod) may be preserved. As shown in model 350, input layer 352 may proceed to convolution blocks 354 and 356 before being output to convolutional output 360. In some embodiments, model 350 may itself serve as an input to model 300.

In some embodiments, model 350 may implement an inverted residual structure where the input and output of a residual block (e.g., block 354) are thin bottleneck layers. A residual layer may feed into the next layer and directly into layers that are one or more layers downstream. A bottleneck layer (e.g., block 358) is a layer that contains few neural units compared to the previous layers. Model 350 may use a bottleneck layer to obtain a representation of the input with reduced dimensionality. An example of this is the use of autoencoders with bottleneck layers for nonlinear dimensionality reduction. Additionally, model 350 may remove non-linearities in a narrow layer (e.g., block 358) in order to maintain representational power. In some embodiments, the design of model 350 may also be guided by the metric of computation complexity (e.g., the number of floating point operations). In some embodiments, model 350 may increase the feature map dimension at all units to involve as many locations as possible instead of sharply increasing the feature map dimensions at neural units that perform downsampling. In some embodiments, model 350 may decrease the depth and increase width of residual layers in the downstream direction.
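A hedged PyTorch sketch of the inverted residual structure described above follows, in the style popularized by MobileNetV2; the channel count and expansion factor are illustrative assumptions.

```python
# Hypothetical sketch: expand to a wide hidden layer, apply a depthwise
# convolution, then project back to a thin, linear bottleneck with a
# residual connection around the whole block.
import torch
import torch.nn as nn

class InvertedResidual(nn.Module):
    def __init__(self, channels: int = 16, expansion: int = 6):
        super().__init__()
        hidden = channels * expansion
        self.block = nn.Sequential(
            nn.Conv2d(channels, hidden, kernel_size=1, bias=False),  # expand
            nn.BatchNorm2d(hidden),
            nn.ReLU6(inplace=True),
            nn.Conv2d(hidden, hidden, kernel_size=3, padding=1,
                      groups=hidden, bias=False),                    # depthwise
            nn.BatchNorm2d(hidden),
            nn.ReLU6(inplace=True),
            nn.Conv2d(hidden, channels, kernel_size=1, bias=False),  # project
            nn.BatchNorm2d(channels),     # no non-linearity: linear bottleneck
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.block(x)          # residual connection around the block

out = InvertedResidual()(torch.rand(1, 16, 64, 64))
```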

In some embodiments, model 300 or model 350 may be a siamese neural network (e.g., model 390) that uses the same weights while working in tandem on two different input vectors to compute comparable output vectors. For example, in a siamese artificial neural network, model 300 may include two convolutional neural networks (e.g., two of model 350) that are not two different networks, but are two copies of the same network (e.g., model 350). For example, two input images may pass through model 390 to generate a fixed-length feature vector for each image. If the two input images belong to the same specimen, then their feature vectors will also be similar, while if the two input images belong to two different specimens, then their feature vectors will be different. The system may then use a similarity score generated by an output sigmoid layer (e.g., layer 370) to detect and predict physiological changes and/or to diagnose disease, stage of development, and/or condition. Furthermore, as one illustrative example of the algorithm used, the system may rely on a siamese neural network and/or other neural network that uses the same or similar weights while working on two different input vectors to compute comparable output vectors, typically in tandem. Additionally, the siamese neural network does not rely on algorithms that require multiple images of each specimen, images from multiple angles, and/or annotation of key features.
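A minimal siamese sketch consistent with the description above might read as follows; the encoder shape and input size are assumptions, and the absolute-difference comparison is one common choice rather than a requirement of the disclosure.

```python
# Hypothetical sketch: one shared encoder produces a fixed-length feature
# vector for each input; a sigmoid layer scores their similarity.
import torch
import torch.nn as nn

class SiameseNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(              # shared weights for both inputs
            nn.Flatten(), nn.Linear(64 * 64, 128), nn.ReLU())
        self.head = nn.Sequential(nn.Linear(128, 1), nn.Sigmoid())

    def forward(self, a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
        fa, fb = self.encoder(a), self.encoder(b)  # comparable output vectors
        return self.head(torch.abs(fa - fb))       # similarity score in [0, 1]

score = SiameseNet()(torch.rand(1, 64, 64), torch.rand(1, 64, 64))
```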

FIG. 4 shows an illustrative example of detecting a genotype biomarker based on phenotype characteristics in crustaceans, in accordance with one or more embodiments. FIG. 4 includes computer system 400, which may in some embodiments correspond to system 200 (FIG. 2). For example, computer system 400 may capture an image set (e.g., an image set of a shrimplet including a phenotype characteristic of the shrimplet, juvenile, and/or adult crustacean) using imaging device 406. Imaging device 406 may be incorporated into and/or accessible by computer system 400. Computer system 400 also includes memory 404, which may be incorporated into and/or accessible by computer system 400. In some embodiments, computer system 400 may retrieve the image sets from memory 404.

Computer system 400 also includes control circuitry 402. Control circuitry 402 may perform one or more processes (e.g., as described below in relation to FIG. 6) to detect a genotype biomarker based on phenotype characteristics in a shrimplet, juvenile, and/or adult crustacean. Control circuitry 402 may train an artificial neural network (e.g., as described above) in order to detect the genotype biomarker based on one or more data sets (e.g., stored in memory 404). Computer system 400 may receive user inputs (e.g., via client device 204 (FIG. 2)) to determine a genotype biomarker based on phenotype characteristics as indicated by image sets. Computer system 400 may then output the determined genotype (e.g., as output 218 (FIG. 2)). Alternatively or additionally, computer system 400 may receive user inputs (e.g., via client device 204 (FIG. 2)) to predict or detect a trait and/or physiological change based on external characteristics as indicated by image sets.

FIG. 4 also includes image 410 and image 420 of a specimen (e.g., a shrimp). Image 410 and image 420 are exemplary images of a shrimp in a post-larval, juvenile stage and in a grow-out stage, respectively. As shown in FIG. 4, the specimen has individual external characteristics, and the size, position, color, molt stage, and arrangement of these characteristics, with respect to other external characteristics and the shrimp as a whole, may indicate the presence of internal conditions. These individual external characteristics are identifiable using image 410 and image 420. For example, as shown in FIG. 4, external characteristics of the shrimplet include the size, location, position, coloration, and maturity/development level of the parts of the shrimplet, including the rostrum, eye, carapace, abdomen, antennule, antenna, cheliped, pereiopods, pleopods, telson, and uropod. Imaging device 406 may capture one or more image sets. The image sets may be converted into pixel arrays and used as input for the artificial neural network. In embodiments using the convolutional neural network, the convolution layers may extract features from the input image and preserve the relationship between pixels by learning image features (and their size, location, coloration, and maturity/development level) using small squares of input data. For example, the size of the rostrum with respect to the carapace may indicate an internal condition. In another example, the location of the cheliped may indicate another internal condition. Furthermore, each of images 410 and 420 may be stored together with a profile of a given crustacean. This profile may include information on the crustacean's attributes (e.g., batch, cohort, family, etc.) as well as other information (e.g., a weight) on the crustacean. The system may correlate this information to a time and/or date of the image as well as any imaging conditions (e.g., settings of an imaging device).

Image 410 is of a shrimp in a post-larval, juvenile stage. For example, after completing the three stages of larval development (e.g., as discussed in FIG. 5), the specimen is a fully developed shrimp. In some embodiments, the specimen may remain in a hatchery for about two weeks, and once the specimen has reached a length of roughly 1 centimeter, it is transferred to a juvenile pond to be grown out to roughly 1 gram. During this time, the system may capture image 410.

Image 420 is of a shrimp in a juvenile stage during grow-out. For example, when a specimen reaches one gram, the specimen requires additional space and is transferred to a grow-out pond (or other environment) for roughly seven to eight months, until it is harvested (typically at a weight of twenty to thirty grams). During this time, the system may capture image 420.

It should be noted that the image set of a specimen may include one or more angles of the specimen, although only a single view is shown in FIG. 4. For example, in some embodiments, additional angles may be used based on particular phenotype characteristics of interest. By using multiple angles, computer system 400 may better detect external characteristics of a given specimen. For example, the multiple angles allow for individual phenotype characteristics (e.g., a rostrum) to be imaged from different angles, which provides computer system 400 with more complete data. In some embodiments, the image set may also include other features used to identify the specimen (e.g., a serial number, order number, and/or batch number), used to determine the scale of the specimen and/or a part of the specimen (e.g., measurement means for height, length, and/or weight), used to provide a reference point for a given phenotype characteristic (e.g., a color palette used to compare color of the specimen to), and/or used to indicate other information that may be used to classify the specimen (e.g., an indicator of age, maturity level, species, size, etc.).

It should be noted that multiple views of the specimen may be used. The one or more views may create a standardized series of orthographic two-dimensional images that represent the form of the three-dimensional specimen. For example, six views of the specimen may be used, with each projection plane parallel to one of the coordinate axes of the object. The views may be positioned relative to each other according to either a first-angle projection scheme or a third-angle projection scheme. The views may include a side view, front view, top view, bottom view, and/or end view. The views may also include plan, elevation, and/or section views.

FIG. 5 shows an illustrative example of detecting a genotype biomarker based on phenotype characteristics in fertilized eggs and larval stages, in accordance with one or more embodiments. FIG. 5 includes computer system 500, which may in some embodiments correspond to system 200 (FIG. 2). For example, computer system 500 may capture an image set (e.g., an image set of fertilized eggs including a phenotype characteristic of the fertilized eggs and larval stages) using imaging device 506. Imaging device 506 may be incorporated into and/or accessible by computer system 500. Computer system 500 also includes memory 504, which may be incorporated into and/or accessible by computer system 500. In some embodiments, computer system 500 may retrieve the image sets from memory 504.

Computer system 500 also includes control circuitry 502. Control circuitry 502 may perform one or more processes (e.g., as described below in relation to FIG. 6) to detect a genotype biomarker based on phenotype characteristics in fertilized eggs. Control circuitry 502 may train an artificial neural network (e.g., as described above) in order to detect the genotype biomarker based on one or more data sets (e.g., stored in memory 504). Computer system 500 may receive user inputs (e.g., via client device 204 (FIG. 2)) to determine a genotype biomarker based on phenotype characteristics as indicated by image sets. Computer system 500 may then output the determined genotype (e.g., as output 218 (FIG. 2)). Alternatively or additionally, computer system 500 may receive user inputs (e.g., via client device 204 (FIG. 2)) to predict or detect a trait and/or physiological change based on external characteristics as indicated by image sets.

FIG. 5 also includes four images (e.g., image 510, image 520, image 530, and image 540) that may be processed by computer system 500. Image 510 is an exemplary image of a fertilized egg, and images 520, 530, and 540 are exemplary images of successive larval stages. As shown in FIG. 5, each specimen has individual external characteristics, and the size, position, and arrangement of these, with respect to other external characteristics and the fertilized egg or larva as a whole, may indicate the presence of internal conditions. For example, in the fertilized egg, the size and placement of the egg parts (e.g., vitelline membrane, chorion, yolk, oil globule, perivitelline space, and/or embryo) with respect to the other egg parts provide indications of the genotype biomarkers.

For example, as shown in FIG. 5, external characteristics of the fertilized egg in image 510 include the size, location, position, coloration, and maturity/development level of the parts of the fertilized egg, such as the vitelline membrane, chorion, yolk, oil globule, perivitelline space, and/or embryo. Imaging device 506 may capture one or more image sets. The image sets may be converted into pixel arrays and used as input for an artificial neural network. In embodiments using the convolutional neural network, the convolution layers may extract features from the input image and preserve the relationship between pixels by learning image features (and their size, location, coloration, and/or maturity/development level) using small squares of input data. For example, the size of the vitelline membrane with respect to the chorion may indicate an internal condition. In another example, the location of the oil globule may indicate another internal condition.

In some embodiments, the image set may also include other features used to identify the specimen (e.g., a serial number, order number, and/or batch number), used to determine the scale of the specimen and/or a part of the specimen (e.g., measurement means for height, length, and/or weight), used to provide a reference point for a given phenotype characteristic (e.g., a color palette used to compare the color of the specimen to), and/or used to indicate other information that may be used to classify the specimen (e.g., an indicator of age, maturity level, species, size, etc.). Furthermore, each of images 520, 530, and 540 may be stored together with a profile of a given crustacean. This profile may include information on the crustacean's attributes (e.g., batch, cohort, family, etc.) as well as other information (e.g., a weight) on the crustacean. The system may correlate this information to a time and/or date of the image as well as any imaging conditions (e.g., settings of an imaging device).

The system may also capture and analyze images 520, 530, and 540 in a similar manner. Image 520 is an image of a shrimp in a nauplius larval stage, which lasts for under two days after hatching. During this stage, its body may consist of a head, thorax, abdomen, and telson. The specimen may also include a naupliar eye. The nauplius larva may also include three pairs of appendages, the first and second functioning as antennae and the third functioning as mandibles. The length, position, and use of these appendages (e.g., in feeding and propulsion) may be used by the system to classify the specimen.

Image 530 is an image of a shrimp in a zoea larval stage, which lasts for three to five days. During this time, the specimen begins to develop its eyes and extended body length, which may be detected and analyzed by the system. The specimen also begins its behavioral routine of feeding (e.g., on algae). Zoea larvae swim using thoracic appendages (e.g., maxillipeds and pereopods). The zoea larva has two stalked compound eyes (the size of which relative to its body may be measured by the system). The zoea larva also has two maxillipeds. The length and position of these between the rostral and lateral spines may be measured by the system. The anterior-most maxilliped contains the endopodites that are used for feeding, the use of which may be monitored by the system to classify the specimen.

Image 540 is an image of a shrimp in a mysis larval stage, which lasts for three to five days. During this time, the head and thorax have a carapace, and all of the cephalic and thoracic appendages are present. However, the thoracic appendages are alike and biramous with exopodites. In addition to measuring the size, position, and length of these features, the system may monitor the behavior of the specimen to classify the specimen.

FIG. 6 shows a flowchart of a process for identifying genotype biomarkers in crustaceans, in accordance with one or more embodiments. For example, process 600 may represent the steps taken by one or more devices as shown in FIGS. 1-5 for identifying genotype biomarkers in crustaceans based on phenotype characteristics.

At step 602, process 600 receives (e.g., using control circuitry 402 (FIG. 4) or control circuitry 502 (FIG. 5)) an image set of a first crustacean. The image set of the first crustacean may include an external characteristic (e.g., a phenotype characteristic) of the first crustacean. In some embodiments, the image set may have been captured using an imaging device (e.g., imaging device 406 (FIG. 4) or imaging device 506 (FIG. 5)) incorporated into and/or accessible by a computer system (e.g., computer system 400 (FIG. 4) or computer system 500 (FIG. 5)) for identifying genotype biomarkers in crustaceans. In some embodiments, the image set may have been retrieved from memory (e.g., memory 404 (FIG. 4) or memory 504 (FIG. 5)) incorporated into and/or accessible by the computer system.

In some embodiments, the crustacean may be a shrimplet (as discussed in relation to FIG. 4 above), and the image set of the crustacean may include an external first view image of the first shrimplet and an external second view image of the first shrimplet. Additionally or alternatively, the image set of the shrimplet may be generated while the gills of the shrimplet are hydrated or while the shrimplet is sedated in order to reduce stress on the shrimplet. For example, in order to reduce stress on the shrimplet, the image set may be captured while minimizing the time that the shrimplet has been out of the water. This amount of time (e.g., 1 second, 2 seconds, 5-10 seconds, etc.) should be low enough that the gills of the shrimplet do not become dehydrated such that the breathing of the shrimplet is labored. Likewise, in some embodiments, the shrimplet may first enter a liquid sedative solution in which the shrimplet may be sedated prior to the image set being captured. In some embodiments, the crustacean may be a fertilized egg (as discussed in relation to FIG. 5 above), and the image set of the crustacean may include a depth of field of about half of the fertilized egg.

At step 604, process 600 generates (e.g., using control circuitry 402 (FIG. 4) or control circuitry 502 (FIG. 5)) a first pixel array based on the image set of the first crustacean. For example, the detected external characteristic (e.g., phenotype characteristic) in the image set may be converted into a form that can be quickly and efficiently processed by an artificial neural network. In some embodiments, the first pixel array may include one or more vectors, arrays, and/or matrices that represent either a Red, Green, Blue or grayscale image (e.g., as discussed in relation to FIG. 3). Furthermore, in some embodiments, process 600 may additionally convert the image set from a set of one or more vectors, arrays, and/or matrices to another set of one or more vectors, arrays, and/or matrices. For example, process 600 may convert an image set having a red color array, a green color array, and a blue color array into a grayscale color array.
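
By way of illustration only, the conversion from color arrays to a grayscale pixel array may be sketched as follows (assuming NumPy arrays, a 64x64 image size, and the common luminosity weights, none of which is specified by this disclosure):

    import numpy as np

    def to_grayscale_pixel_array(rgb_image: np.ndarray) -> np.ndarray:
        # Collapse the red, green, and blue color arrays into one
        # grayscale array using standard luminosity weights.
        red, green, blue = rgb_image[..., 0], rgb_image[..., 1], rgb_image[..., 2]
        return 0.299 * red + 0.587 * green + 0.114 * blue

    # Example: a 64x64 RGB image becomes a 64x64 grayscale pixel array.
    rgb = np.random.randint(0, 256, size=(64, 64, 3), dtype=np.uint8)
    first_pixel_array = to_grayscale_pixel_array(rgb)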

At step 606, process 600 labels (e.g., using control circuitry 402 (FIG. 4) or control circuitry 502 (FIG. 5)) the first pixel array with an internal condition (e.g., a genotype biomarker) for the first crustacean. For example, process 600 may include labeling a given specimen with a specimen identifier that can be used to match the genotype biomarker (e.g., as determined from a genetic test of the first crustacean) to the first pixel array. In some embodiments, the specimen identifier may indicate particular x-y coordinates in an image and a batch number for the given specimen. For example, in some embodiments, to eliminate bias, process 600 may image multiple specimens together and track the location of a given specimen in an image of multiple specimens. In another example, the specimen identifier may indicate a particular order within a batch.
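
By way of illustration only, one hypothetical record structure for matching a genetic test result to a pixel array via a specimen identifier is sketched below; every field name is illustrative rather than drawn from this disclosure:

    from dataclasses import dataclass
    import numpy as np

    @dataclass
    class LabeledSpecimen:
        batch_number: int        # batch in which the specimen was imaged
        x: int                   # x coordinate of the specimen in the batch image
        y: int                   # y coordinate of the specimen in the batch image
        pixel_array: np.ndarray  # pixel array generated at step 604
        genotype_biomarker: str  # label from the specimen's genetic test

    specimen = LabeledSpecimen(batch_number=12, x=480, y=320,
                               pixel_array=np.zeros((64, 64)),
                               genotype_biomarker="biomarker_A")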

At step 608, process 600 trains (e.g., using control circuitry 402 (FIG. 4) or control circuitry 502 (FIG. 5)) an artificial neural network to detect the internal condition (e.g., genotype biomarker) in crustaceans based on the labeled first pixel array. For example, the system may generate the image set of the first crustacean, and genetically test the first crustacean to determine a genotype biomarker in the first crustacean. Process 600 may then correlate the presence of the genotype biomarker to one or more phenotype characteristics in the image set using the artificial neural network. For example, as discussed in relation to FIG. 3, the artificial neural network may have classifications for the genotype biomarkers. The artificial neural network is then trained based on a first data set (e.g., including data of the first crustacean and others) to classify a specimen as having a given genotype biomarker when particular phenotype characteristics are present.
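
By way of illustration only, a minimal training sketch is shown below using TensorFlow/Keras; the disclosure does not specify a framework, architecture, input size, or number of classifications, so each of those choices is an assumption:

    import numpy as np
    import tensorflow as tf

    NUM_CLASSES = 4  # hypothetical number of genotype-biomarker classifications

    model = tf.keras.Sequential([
        tf.keras.Input(shape=(64, 64, 1)),           # grayscale pixel arrays
        tf.keras.layers.Conv2D(16, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(32, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])

    # First data set: labeled pixel arrays (stand-in data for illustration).
    x_train = np.random.rand(100, 64, 64, 1)
    y_train = np.random.randint(0, NUM_CLASSES, size=100)
    model.fit(x_train, y_train, epochs=5, verbose=0)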

At step 610, process 600 receives (e.g., using control circuitry 402 (FIG. 4) or control circuitry 502 (FIG. 5)) an image set of a second crustacean. The image set of the second crustacean may include an external characteristic (e.g., a phenotype characteristic) of the second crustacean. In some embodiments, the image set of the second crustacean may feature the second crustacean in the same arrangement and/or position as the first crustacean. Additionally or alternatively, other parameters (e.g., axial alignment, imaging device settings, image quality, etc.) may be the same for the image set of the second crustacean as for the image set of the first crustacean in order to standardize the image sets. Additionally or alternatively, the steps taken to prepare the second crustacean prior to the image set being captured (e.g., time out of water, feeding schedule, sedation level) may be the same in order to standardize the image sets. Additionally or alternatively, the batches of crustaceans for which image sets are to be captured may first be sorted (e.g., based on size, age, maturity level, environmental factors) in order to standardize the image sets. However, in some cases, the crustaceans may not be sorted or may be actively unsorted to prevent the introduction of bias (e.g., from sorting male and female specimens prior to imaging). For example, process 600 may prevent sorting prior to imaging in order to prevent subtle changes in the imaging parameters from affecting the captured images. Process 600 may, in some instances, normalize the image set to account for bias in the system.

At step 612, process 600 generates (e.g., using control circuitry 402 (FIG. 4) or control circuitry 502 (FIG. 5)) a second pixel array based on the image set of the second crustacean. For example, as with the first pixel array, the detected external characteristic (e.g., phenotype characteristic) in the image set may be converted into a form that can be quickly and efficiently processed by an artificial neural network. This form may be the same as the form of the first pixel array. In some embodiments, process 600 may normalize the second pixel array in order to account for bias in the system. In some embodiments, the second pixel array may include one or more vectors, arrays, and/or matrices that represent either a Red, Green, Blue or grayscale image (e.g., as discussed in relation to FIG. 3). Furthermore, in some embodiments, process 600 may additionally convert the image set from a set of one or more vectors, arrays, and/or matrices to another set of one or more vectors, arrays, and/or matrices. For example, process 600 may convert an image set having a red color array, a green color array, and a blue color array into a grayscale color array.
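
By way of illustration only, one simple normalization that could account for such bias is per-image standardization; the disclosure leaves the normalization method open:

    import numpy as np

    def normalize_pixel_array(arr: np.ndarray) -> np.ndarray:
        # Zero-mean, unit-variance scaling reduces bias from, e.g.,
        # differing exposure or lighting between capture sessions.
        return (arr - arr.mean()) / (arr.std() + 1e-8)

    second_pixel_array = normalize_pixel_array(np.random.rand(64, 64))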

At step 614, process 600 inputs (e.g., using control circuitry 402 (FIG. 4) or control circuitry 502 (FIG. 5)) the second pixel array into the trained neural network. For example, as discussed above in relation to FIG. 3, process 600 may input a second data set (e.g., image sets of crustaceans for which genotype biomarkers are not known) into the trained artificial neural network. The trained artificial neural network may then classify the image sets of crustaceans according to the genotype biomarkers. For example, the genotype biomarker for the first crustacean may be a first classification of the neural network, and the system may generate an output from the neural network indicating that the second crustacean has the same genotype biomarker as the first crustacean based on matching the second pixel array to the first classification.
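
By way of illustration only, classification of the second pixel array may be sketched as follows, where the small stand-in model takes the place of the network trained at step 608:

    import numpy as np
    import tensorflow as tf

    # Stand-in for the network trained at step 608.
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(64, 64, 1)),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(4, activation="softmax"),
    ])

    second_pixel_array = np.random.rand(1, 64, 64, 1)
    probabilities = model.predict(second_pixel_array, verbose=0)
    predicted_classification = int(np.argmax(probabilities))
    # A match to the first classification indicates the same genotype biomarker.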

At step 616, process 600 receives (e.g., using control circuitry 402 (FIG. 4) or control circuitry 502 (FIG. 5)) an output from the trained neural network indicating that the second crustacean has the internal condition (e.g., the genotype biomarker). For example, process 600 may determine that, based on the second pixel array, the second crustacean is included in the same classification as the first crustacean. Because the first crustacean is determined to have a given genotype biomarker, process 600 determines that the second crustacean has the same genotype biomarker.

In some embodiments, process 600 may further handle, sort, and/or transfer the crustaceans (e.g., for vaccination, gender segregation, transfer to sea or breeding area, etc.) automatically. In such embodiments, the internal condition may be detected based on external characteristics in real-time (e.g., as the crustaceans are transported along a conveyor belt or otherwise transferred). That is, following the output of an internal condition (e.g., a genotype biomarker as described above in relation to FIG. 6), process 600 may sort the crustaceans based on the determined internal condition. For example, a crustacean determined to have a first internal characteristic may be sorted into a first group, and a crustacean determined to have a second internal characteristic may be sorted into a second group.
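
By way of illustration only, a hypothetical sort hook is sketched below; the disclosure requires only that a sort follow the network output, so the group mapping and interface are assumptions:

    def sort_specimen(predicted_classification: int) -> str:
        # Map each detected internal condition to a destination group,
        # e.g., vaccination, gender segregation, or a breeding area.
        groups = {0: "group_1", 1: "group_2"}
        return groups.get(predicted_classification, "holding")

    assert sort_specimen(0) == "group_1"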

It is contemplated that the steps or descriptions of FIG. 6 may be used with any other embodiment of this disclosure. In addition, the steps and descriptions described in relation to FIG. 6 may be done in alternative orders or in parallel to further the purposes of this disclosure. For example, each of these steps may be performed in any order or in parallel or substantially simultaneously to reduce lag or increase the speed of the system or method. Furthermore, it should be noted that any of the devices or equipment discussed in relation to FIGS. 1-5 could be used to perform one or more of the steps in FIG. 6.

FIG. 7 shows a flowchart of a process for predicting when a crustacean will undergo a physiological change, in accordance with one or more embodiments. For example, process 700 describes a method of predicting dates of physiological changes in crustaceans based on external characteristics. In some embodiments, the physiological change may comprise molting, and/or the date range may indicate when molting will begin, occur, and/or be complete. The system may indicate the date range in a temporal measurement (e.g., a time, day, month, etc.) or may express the date range as a probability (e.g., a likelihood of molting during a particular date range, which may include a level of confidence in the probability). In some embodiments, the physiological change is sexual maturity, and the first date range may include a date (or other temporal and/or probability measurement) when the second crustacean is sexually mature. The system may further detect the physiological changes in crustaceans of varying ages, weights, lengths, sexes, cohorts, or other attributes. For example, the system may identify physiological changes in crustaceans up to 2 grams, up to 5 grams, up to 100 grams, up to 200 grams, and/or over 200 grams.
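
By way of illustration only, one possible encoding of such an output, carrying a temporal range together with a probability and confidence level, is sketched below; the structure is illustrative rather than specified by this disclosure:

    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class DateRangePrediction:
        start: date          # earliest predicted date of the physiological change
        end: date            # latest predicted date of the physiological change
        probability: float   # likelihood of, e.g., molting within the range
        confidence: float    # confidence level attached to the probability

    prediction = DateRangePrediction(start=date(2020, 8, 1), end=date(2020, 8, 5),
                                     probability=0.83, confidence=0.95)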

At step 702, process 700 receives (e.g., using control circuitry of one or more components of system 200 (FIG. 2)) an image set of a first crustacean. For example, the image set of the first crustacean may include a first external characteristic on the first crustacean. The image set may comprise one or more images of a crustacean and/or may comprise one or more views of the crustacean. In some embodiments, the image set may comprise a single top view or a single side view of the crustacean. The image may include the entire length of the crustacean (e.g., from head to tail) or may include only some portions of the crustacean. For example, the image may include only a head portion of the crustacean (e.g., mouth to gills). Alternatively or additionally, the image may focus on a particular portion of a crustacean (e.g., a telson or maxilliped). Each image set is labeled with the crustacean to which it corresponds. Each label may additionally include a serial number or other identifier used to denote the crustacean.

At step 704, process 700 generates (e.g., using control circuitry of one or more components of system 200 (FIG. 2)) a first pixel array based on the image set of the first crustacean (e.g., as discussed above). At step 706, process 700 labels (e.g., using control circuitry of one or more components of system 200 (FIG. 2)) the first pixel array with a first date range for a physiological change for the first crustacean.

At step 708, process 700 trains (e.g., using control circuitry of one or more components of system 200 (FIG. 2)) an artificial neural network to determine the first date range based on the labeled first pixel array. It should be noted that in some embodiments, the system may train the artificial neural network to determine other date ranges based on other labeled pixel arrays. For example, the system may receive an image set of a third crustacean, wherein the image set of the third crustacean includes a second external characteristic on the third crustacean. The system may then generate, using the control circuitry, a third pixel array based on the image set of the third crustacean. The system may label, using the control circuitry, the third pixel array with a third date range for a physiological change for the third crustacean, and train, using the control circuitry, the artificial neural network to determine the third date range based on the labeled third pixel array.

For example, the artificial neural network may comprise a convolutional neural network, and/or the first date range may correspond to a first classification of the artificial neural network and the third date range may correspond to a second classification of the artificial neural network. The system may then classify an inputted image (e.g., an image of a crustacean) as corresponding to the first and/or second classification. For example, the system may input the second pixel array into the artificial neural network, and the system may determine whether the second pixel array corresponds to the first classification or the second classification.

At step 710, process 700 receives (e.g., using control circuitry of one or more components of system 200 (FIG. 2)) an image set of a second crustacean, wherein the image set of the second crustacean includes the first external characteristic on the second crustacean.

At step 712, process 700 generates (e.g., using control circuitry of one or more components of system 200 (FIG. 2)) a second pixel array based on the image set of the second crustacean.

At step 714, process 700 inputs (e.g., using control circuitry of one or more components of system 200 (FIG. 2)) the second pixel array into the artificial neural network. At step 716, process 700 receives (e.g., using control circuitry of one or more components of system 200 (FIG. 2)) an output from the artificial neural network indicating that the second crustacean has the first date range of the physiological change.

In some embodiments, process 700 may further comprise sorting the second crustacean based on the output. For example, the system may output a signal that causes a sorting mechanism to sort the crustacean and/or may label a particular crustacean (e.g., in a profile associated with the crustacean) with the date range.

It is contemplated that the steps or descriptions of FIG. 7 may be used with any other embodiment of this disclosure. In addition, the steps and descriptions described in relation to FIG. 7 may be done in alternative orders or in parallel to further the purposes of this disclosure. For example, each of these steps may be performed in any order or in parallel or substantially simultaneously to reduce lag or increase the speed of the system or method. Furthermore, it should be noted that any of the devices or equipment discussed in relation to FIGS. 1-5 could be used to perform one or more of the steps in FIG. 7.

FIG. 8 shows a flowchart of a process for detecting physiological changes in crustaceans, in accordance with one or more embodiments. For example, process 800 describes a method of detecting physiological changes in crustaceans based on external characteristics. In some embodiments, the physiological change may comprise molting, and/or the date range may indicate when molting will begin, occur, and/or be complete. The system may indicate whether or not a crustacean has begun or completed a physiological change and/or a probability (e.g., a likelihood of molting, which may include a level of confidence in the probability).

In some embodiments, the first physiological change may be an absolute cuticle hardness, a predetermined cuticle hardness level, or a change in cuticle hardness. The system may further detect the physiological changes in crustaceans of varying ages, weights, lengths, sexes, cohorts, or other attributes. For example, the system may identify physiological changes in crustaceans up to 2 grams, up to 5 grams, up to 100 grams, up to 200 grams, and/or over 200 grams.

At step 802, process 800 receives (e.g., using control circuitry of one or more components of system 200 (FIG. 2)) an image set of a first crustacean. For example, the image set of the first crustacean may include a first external characteristic on the first crustacean.

At step 804, process 800 generates (e.g., using control circuitry of one or more components of system 200 (FIG. 2)) a first pixel array based on the image set of the first crustacean. At step 806, process 800 labels (e.g., using control circuitry of one or more components of system 200 (FIG. 2)) the first pixel array with a first physiological change for the first crustacean. At step 808, process 800 trains (e.g., using control circuitry of one or more components of system 200 (FIG. 2)) an artificial neural network to detect the first physiological change based on the labeled first pixel array. It should be noted that in some embodiments, the system may train the artificial neural network to determine whether or not other physiological changes have occurred based on other labeled pixel arrays. For example, the system may receive an image set of a third crustacean, wherein the image set of the third crustacean includes a second external characteristic on the third crustacean. The system may then generate, using the control circuitry, a third pixel array based on the image set of the third crustacean. The system may label, using the control circuitry, the third pixel array with a second physiological change for the third crustacean, and train, using the control circuitry, the artificial neural network to detect the second physiological change of the third crustacean based on the labeled third pixel array.

For example, the artificial neural network may comprise a convolutional neural network, and/or the first physiological change may correspond to a first classification of the artificial neural network and the second physiological change may correspond to a second classification of the artificial neural network. The system may then classify an inputted image (e.g., an image of a crustacean) as corresponding to the first and/or second classification. For example, the system may input the second pixel array into the artificial neural network, and the system may determine whether the second pixel array corresponds to the first classification or the second classification.

At step 810, process 800 receives (e.g., using control circuitry of one or more components of system 200 (FIG. 2)) an image set of a second crustacean. For example, the image set of the second crustacean may include the first external characteristic on the second crustacean. At step 812, process 800 generates (e.g., using control circuitry of one or more components of system 200 (FIG. 2)) a second pixel array based on the image set of the second crustacean. At step 814, process 800 inputs (e.g., using control circuitry of one or more components of system 200 (FIG. 2)) the second pixel array into the artificial neural network.

At step 816, process 800 receives (e.g., using control circuitry of one or more components of system 200 (FIG. 2)) an output from the artificial neural network indicating that the second crustacean has the first physiological change.

In some embodiments, process 800 may further comprise sorting the second crustacean based on the output. For example, the system may output a signal that causes a sorting mechanism to sort the crustacean and/or may label a particular crustacean (e.g., in a profile associated with the crustacean) with the physiological change.

It is contemplated that the steps or descriptions of FIG. 8 may be used with any other embodiment of this disclosure. In addition, the steps and descriptions described in relation to FIG. 8 may be done in alternative orders or in parallel to further the purposes of this disclosure. For example, each of these steps may be performed in any order or in parallel or substantially simultaneously to reduce lag or increase the speed of the system or method. Furthermore, it should be noted that any of the devices or equipment discussed in relation to FIGS. 1-5 could be used to perform one or more of the steps in FIG. 8.

FIG. 9 shows a flowchart for training an artificial neural network, in accordance with one or more embodiments. For example, FIG. 9 describes a process for producing an artificial neural network for detecting physiological changes in crustaceans based on external characteristics with over ninety-nine percent accuracy by training an artificial neural network using a first training set of images comprising 2500 to 3000 crustaceans with corresponding labels for a trait, wherein the 2500 to 3000 crustaceans share a first attribute.

At step 902, process 900 selects data sets based on a crustacean attribute. For example, the system may be trained to identify crustaceans having a common cohort, age, sex, length, and/or weight. Accordingly, the system may first select a data set of crustaceans having the particular attribute. The particular attribute may be recorded with an identifier (e.g., serial number) associated with each crustacean. In some embodiments, the serial number may be correlated with a label for each crustacean.
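
By way of illustration only, this selection step may be sketched as follows, assuming specimen records keyed by serial number (field names and values are hypothetical):

    records = [
        {"serial": "A-001", "cohort": "2020-01", "weight_g": 1.4},
        {"serial": "A-002", "cohort": "2020-01", "weight_g": 1.9},
        {"serial": "B-001", "cohort": "2020-02", "weight_g": 2.3},
    ]

    # Select the data set of crustaceans sharing the particular attribute.
    selected = [r for r in records if r["cohort"] == "2020-01"]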

At step 904, process 900 prepares sets of images. For example, the system may prepare the data for training the artificial neural network. For example, the system may randomize a first training characteristic or control the first training characteristic. For example, the first training characteristic may be an imaging device used to capture the first training set of images and/or an image background for the first training set of images.
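
By way of illustration only, randomizing versus controlling a training characteristic may be sketched as follows (device and background names are hypothetical):

    import random

    DEVICES = ["camera_1", "camera_2"]
    BACKGROUNDS = ["white", "blue"]

    def assign_characteristics(serials, randomize=True):
        if randomize:
            # Randomized: vary device and background across specimens.
            return {s: (random.choice(DEVICES), random.choice(BACKGROUNDS))
                    for s in serials}
        # Controlled: image every specimen with the same device and background.
        return {s: (DEVICES[0], BACKGROUNDS[0]) for s in serials}

    assignments = assign_characteristics(["A-001", "A-002"])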

At step 906, process 900 trains an artificial neural network using a first training set of images comprising crustaceans with corresponding labels for a physiological change. For example, the system may train the artificial neural network using a first training set of images comprising 2500 to 3000 crustaceans with corresponding labels for a physiological change, wherein the 2500 to 3000 crustaceans share a first attribute. For example, the system may train the artificial neural network to detect one or more physiological changes, which may include any characteristic that may distinguish one crustacean (or group of crustaceans) from another. Physiological changes may include any quantitative or qualitative description of a characteristic such as early gonadal development, early molting, early-sexual maturation, disease resistance, and/or robustness. Physiological changes may also be described in relation to other crustaceans, a progression in development, and/or a comparison to an average progression of other crustaceans.

In some embodiments, the first training set of images may comprise an image set for each of the 2500 to 3000 crustaceans, and the image set may comprise a series of images of each of the 2500 to 3000 crustaceans at a first time and a second time. The first time and second time may be predetermined and may be based on age, weight, and/or size. For example, each of the 2500 to 3000 crustaceans may be juvenile crustaceans or under 200 grams.

In some embodiments, a first training set of images may comprise an image of the 2500 to 3000 crustaceans during a photoperiod treatment. For example, the system may monitor the period of time each day during which the crustaceans receive illumination. The system may then use these periods to determine when to collect the sets of images for each crustacean.

At step 908, process 900 trains the artificial neural network using a second training set of images comprising images of crustaceans without corresponding labels for the trait. For example, the system may train the artificial neural network using a second training set of images comprising 2500 to 20,000 images of un-tagged crustaceans that lack corresponding labels for the physiological change but share the first attribute.
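
The disclosure does not name a semi-supervised technique for folding in the unlabeled second training set; by way of illustration only, a minimal pseudo-labeling sketch under that assumption is:

    import numpy as np
    import tensorflow as tf

    # Stand-in for the network already trained on the labeled first set.
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(64, 64, 1)),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(2, activation="softmax"),
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

    unlabeled = np.random.rand(200, 64, 64, 1)   # stand-in second training set
    probs = model.predict(unlabeled, verbose=0)
    confident = probs.max(axis=1) > 0.9          # keep only confident predictions
    if confident.any():
        pseudo_labels = probs.argmax(axis=1)[confident]
        model.fit(unlabeled[confident], pseudo_labels, epochs=1, verbose=0)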

It is contemplated that the steps or descriptions of FIG. 9 may be used with any other embodiment of this disclosure. In addition, the steps and descriptions described in relation to FIG. 9 may be done in alternative orders or in parallel to further the purposes of this disclosure. For example, each of these steps may be performed in any order or in parallel or substantially simultaneously to reduce lag or increase the speed of the system or method. Furthermore, it should be noted that any of the devices or equipment discussed in relation to FIGS. 1-5 could be used to perform one or more of the steps in FIG. 9.

Although the present invention has been described in detail for the purpose of illustration based on what is currently considered to be the most practical and preferred embodiments, it is to be understood that such detail is solely for that purpose and that the invention is not limited to the disclosed embodiments, but on the contrary, is intended to cover modifications and equivalent arrangements that are within the scope of the appended claims. For example, it is to be understood that the present invention contemplates that, to the extent possible, one or more features of any embodiment can be combined with one or more features of any other embodiment.

The present techniques will be better understood with reference to the following enumerated embodiments:

1. A method of identifying internal conditions in crustacean based on external characteristics, the method comprising: receiving, using control circuitry, an image set of a first crustacean, wherein the image set of the first crustacean includes a phenotype characteristic of the first crustacean; generating, using the control circuitry, a first pixel array based on the image set of the first crustacean; labeling, using the control circuitry, the first pixel array with a genotype biomarker for the first crustacean; training, using the control circuitry, an artificial neural network to detect the genotype biomarker in the first crustacean based on the labeled first pixel array; receiving, using the control circuitry, an image set of a second crustacean, wherein the image set of the second crustacean includes a phenotype characteristic of the second crustacean; generating, using the control circuitry, a second pixel array based on the image set of the second crustacean; inputting, using the control circuitry, the second pixel array into the trained neural network; and receiving, using the control circuitry, an output from the trained neural network indicating that the second crustacean has the genotype biomarker.
2. The method of embodiment 1, wherein the first crustacean is a first shrimplet and the second crustacean is a second shrimplet, and wherein the image set of the first crustacean includes an external first view image of the first shrimplet and an external second view image of the first shrimplet and the image set of the second crustacean includes an external first view image of the second shrimplet and an external second view image of the second shrimplet.
3. The method of embodiment 2, wherein the image set of the first crustacean is generated while the gills of the first crustacean are hydrated or while the first crustacean is sedated.
4. The method of embodiment 1, wherein the first crustacean is a first fertilized crustacean egg and the second crustacean is a second fertilized crustacean egg, and wherein the image set of the first crustacean includes an image of the first fertilized egg with a depth of field of about half of the first fertilized egg and the image set of the second crustacean includes an image of the second fertilized egg with a depth of field of about half of the second fertilized egg.
5. The method of any of embodiments 1-4, further comprising: receiving an image set of a third crustacean, wherein the image set of the third crustacean includes a phenotype characteristic of the third crustacean; generating a third pixel array based on the image set of the third crustacean; labeling the third pixel array with a genotype biomarker for the third crustacean; and training the neural network to detect genotype biomarkers in crustaceans based on the labeled first pixel array and the labeled third pixel array.
6. The method of any of embodiments 1-5, wherein the genotype biomarker for the first crustacean is a first classification of the neural network, and wherein receiving the output from the neural network indicating the genotype biomarker for the second crustacean comprises matching the second pixel array to the first classification.
7. The method of any of embodiments 1-6, wherein the image set of the first crustacean and the image set of the second crustacean were generated together, and wherein the first crustacean is male and the second crustacean is female.
8. The method of any of embodiments 1-7, wherein the image set of the first crustacean is created using an imaging device that detects electromagnetic radiation with wavelengths between about 400 nanometers to about 1100 nanometers.
9. The method of any of embodiments 1-8, wherein the image set of the first crustacean has a red color array, a green color array, and a blue color array, and wherein generating the first pixel array based on the image set of the first crustacean further comprises: determining a grayscale color array for the image set of the first crustacean; and generating the first pixel array based on the grayscale color array.
10. The method of any of embodiments 1-9, further comprising: generating the image set of the first crustacean; and genetically testing the first crustacean to determine the genotype biomarker in the first crustacean.
11. The method of any of embodiments 1-10, wherein the first crustacean is under 50 grams, and wherein the second crustacean is under 5 grams.
12. A method of predicting dates of physiological changes in crustacean based on external characteristics, the method comprising:

receiving, using control circuitry, an image set of a first crustacean, wherein the image set of the first crustacean includes a first external characteristic on the first crustacean;

generating, using the control circuitry, a first pixel array based on the image set of the first crustacean;

labeling, using the control circuitry, the first pixel array with a first date range for a physiological change for the first crustacean;

training, using the control circuitry, an artificial neural network to determine the first date range based on the labeled first pixel array;

receiving, using the control circuitry, an image set of a second crustacean, wherein the image set of the second crustacean includes the first external characteristic on the second crustacean;

generating, using the control circuitry, a second pixel array based on the image set of the second crustacean;

inputting, using the control circuitry, the second pixel array into the artificial neural network; and

receiving, using the control circuitry, an output from the artificial neural network indicating that the second crustacean has the first date range of the physiological change.

13. The method of embodiment 12, further comprising:

receiving, using control circuitry, an image set of a third crustacean, wherein the image set of the third crustacean includes a second external characteristic on the third crustacean;

generating, using the control circuitry, a third pixel array based on the image set of the third crustacean;

labeling, using the control circuitry, the third pixel array with a third date range for a physiological change for the third crustacean;

training, using the control circuitry, the artificial neural network to determine the third date range based on the labeled third pixel array.

14. The method of any of embodiments 12-13, wherein the first date range corresponds to a first classification of the artificial neural network and the third date range corresponds to a second classification of the artificial neural network.
15. The method of embodiment 14, wherein inputting the second pixel array into the artificial neural network comprises determining whether the second pixel array corresponds to the first classification or the second classification.
16. The method of any of embodiments 12-15, wherein the artificial neural network comprises a convolutional neural network.
17. The method of any of embodiments 12-16, wherein the physiological change is molting.
18. The method of any of embodiments 12-17, wherein the first crustacean is 0-3 grams, 2-6 grams, 10-100 grams, under 200 grams, or over 200 grams.
19. The method of any of embodiments 12-18, wherein the first date range includes a date when molting is complete in the second crustacean.
20. The method of any of embodiments 12-19, wherein the physiological change is sexual maturity, and wherein the first date range includes a date when the second crustacean is sexually mature.
21. The method of any of embodiments 12-20, further comprising sorting the second crustacean based on the output.
22. A method of detecting physiological changes in crustaceans based on external characteristics, the method comprising:

receiving, using control circuitry, an image set of a first crustacean, wherein the image set of the first crustacean includes a first external characteristic on the first crustacean;

generating, using the control circuitry, a first pixel array based on the image set of the first crustacean;

labeling, using the control circuitry, the first pixel array with a first physiological change for the first crustacean;

training, using the control circuitry, an artificial neural network to detect the first physiological change based on the labeled first pixel array;

receiving, using the control circuitry, an image set of a second crustacean, wherein the image set of the second crustacean includes the first external characteristic on the second crustacean;

generating, using the control circuitry, a second pixel array based on the image set of the second crustacean;

inputting, using the control circuitry, the second pixel array into the artificial neural network; and

receiving, using the control circuitry, an output from the artificial neural network indicating that the second crustacean has the first physiological change.

23. The method of embodiment 22, further comprising:

receiving, using control circuitry, an image set of a third crustacean, wherein the image set of the third crustacean includes a second external characteristic on the third crustacean;

generating, using the control circuitry, a third pixel array based on the image set of the third crustacean;

labeling, using the control circuitry, the third pixel array with a second physiological change;

training, using the control circuitry, the artificial neural network to detect the second physiological change.

24. The method of any of embodiments 22-23, wherein the first physiological change corresponds to a first classification of the artificial neural network and the second physiological change corresponds to a second classification of the artificial neural network.
25. The method of embodiment 24, wherein inputting the second pixel array into the artificial neural network comprises determining whether the second pixel array corresponds to the first classification or the second classification.
26. The method of any of embodiments 22-25, wherein the artificial neural network comprises a convolutional neural network.
27. The method of any of embodiments 22-26, wherein the first physiological change is molting.
28. The method of any of embodiments 22-27, wherein the first physiological change is a predetermined cuticle hardness.
29. The method of any of embodiments 22-28, wherein the first physiological change is a predetermined cuticle hardness level or a degree of change in a cuticle.
30. The method of any of embodiments 22-29, wherein the first physiological change is a predetermined water intake amount in the first crustacean or a degree of change in water intake level of the first crustacean.
31. The method of any of embodiments 22-30, wherein the first crustacean is up to 200 grams.
32. An artificial neural network for detecting physiological changes of crustaceans based on external characteristics with over ninety-nine percent accuracy produced by:

training the artificial neural network using a first training set of images comprising 2500 to 3000 crustaceans with corresponding labels for a physiological change, wherein the 2500 to 3000 crustaceans share a first attribute.

33. The artificial neural network of embodiment 32, further produced by training the artificial neural network using a second training set of images comprising 2500 to 20,000 images of un-tagged crustaceans that lack corresponding labels for the physiological change but share the first attribute.
34. The artificial neural network of any of embodiments 32-33, wherein the first attribute comprises a cohort, age, or weight.
35. The artificial neural network of any of embodiments 32-34, wherein the trait comprises a level of gonadal development, a level of molting, a level of sexual maturation, a level of disease resistance, or a level of robustness.
36. The artificial neural network of any of embodiments 32-35, wherein the physiological change comprises a completion of gonadal development, molting, or sexual maturation.
37. The artificial neural network of any of embodiments 32-36, wherein the training comprises randomizing a first training characteristic or controlling the first training characteristic.
38. The artificial neural network of embodiment 36 or 37, wherein the first training characteristic is an imaging device used to capture the first training set of images or an image background for the first training set of images.
39. The artificial neural network of any of embodiments 32-38, wherein the first training set of images comprises an image set for each of the 2500 to 3000 crustaceans, and wherein the image set comprises a series of images of each of the 2500 to 3000 crustaceans at a first time and a second time.
40. The artificial neural network of any of embodiments 32-39, wherein the first training set of images comprises an image of the 2500 to 3000 crustaceans during a photoperiod treatment.
41. The artificial neural network of any of embodiments 32-40, wherein each of the 2500 to 3000 crustaceans are juvenile crustaceans or under 200 grams.
42. A tangible, non-transitory, machine-readable medium storing instructions that, when executed by a data processing apparatus, cause the data processing apparatus to perform operations comprising those of any of embodiments 1-41.
43. A system comprising: one or more processors; and memory storing instructions that, when executed by the processors, cause the processors to effectuate operations comprising those of any of embodiments 1-41.

Claims

1. A method of predicting dates of physiological changes in crustaceans based on external characteristics, the method comprising:

receiving, using control circuitry, an image set of a first crustacean, wherein the image set of the first crustacean includes a first external characteristic on the first crustacean;
generating, using the control circuitry, a first pixel array based on the image set of the first crustacean;
labeling, using the control circuitry, the first pixel array with a first date range for a physiological change for the first crustacean;
training, using the control circuitry, an artificial neural network to determine the first date range based on the labeled first pixel array;
receiving, using the control circuitry, an image set of a second crustacean, wherein the image set of the second crustacean includes the first external characteristic on the second crustacean;
generating, using the control circuitry, a second pixel array based on the image set of the second crustacean;
inputting, using the control circuitry, the second pixel array into the artificial neural network; and
receiving, using the control circuitry, an output from the artificial neural network indicating that the second crustacean has the first date range of the physiological change.

2. The method of claim 1, further comprising:

receiving, using control circuitry, an image set of a third crustacean, wherein the image set of the third crustacean includes a second external characteristic on the third crustacean;
generating, using the control circuitry, a third pixel array based on the image set of the third crustacean;
labeling, using the control circuitry, the third pixel array with a third date range for a physiological change for the third crustacean;
training, using the control circuitry, the artificial neural network to determine the third date range based on the labeled third pixel array.

3. The method of claim 2, wherein the first date range corresponds to a first classification of the artificial neural network and the third date range corresponds to a second classification of the artificial neural network.

4. The method of claim 3, wherein inputting the second pixel array into the artificial neural network comprises determining whether the second pixel array corresponds to the first classification or the second classification.

5. The method of claim 1, wherein the artificial neural network comprises a convolutional neural network.

6. The method of claim 1, wherein the physiological change is molting.

7. The method of claim 1, wherein the first crustacean is under grams.

8. The method of claim 1, wherein the first date range includes a date when molting is complete in the second crustacean.

9. The method of claim 1, wherein the physiological change is sexual maturity, and wherein the first date range includes a date when the second crustacean is sexually mature.

10. The method of claim 1, further comprising sorting the second crustacean based on the output.

11. A method of detecting physiological changes in crustaceans based on external characteristics, the method comprising:

receiving, using control circuitry, an image set of a first crustacean, wherein the image set of the first crustacean includes a first external characteristic on the first crustacean;
generating, using the control circuitry, a first pixel array based on the image set of the first crustacean;
labeling, using the control circuitry, the first pixel array with a first physiological change for the first crustacean;
training, using the control circuitry, an artificial neural network to detect the first physiological change based on the labeled first pixel array;
receiving, using the control circuitry, an image set of a second crustacean, wherein the image set of the second crustacean includes the first external characteristic on the second crustacean;
generating, using the control circuitry, a second pixel array based on the image set of the second crustacean;
inputting, using the control circuitry, the second pixel array into the artificial neural network; and
receiving, using the control circuitry, an output from the artificial neural network indicating that the second crustacean has the first physiological change.

12. The method of claim 11, further comprising:

receiving, using control circuitry, an image set of a third crustacean, wherein the image set of the third crustacean includes a second external characteristic on the third crustacean;
generating, using the control circuitry, a third pixel array based on the image set of the third crustacean;
labeling, using the control circuitry, the third pixel array with a second physiological change;
training, using the control circuitry, the artificial neural network to detect the second physiological change.

13. The method of claim 12, wherein the first physiological change corresponds to a first classification of the artificial neural network and the second physiological change corresponds to a second classification of the artificial neural network.

14. The method of claim 13, wherein inputting the second pixel array into the artificial neural network comprises determining whether the second pixel array corresponds to the first classification or the second classification.

15. The method of claim 11, wherein the artificial neural network comprises a convolutional neural network.

16. The method of claim 11, wherein the first physiological change is molting.

17. The method of claim 11, wherein the first physiological change is a cuticle hardness.

18. The method of claim 11, wherein the first physiological change is a level of feeding activity.

19. The method of claim 11, wherein the first physiological change is an amount of water intake of the first crustacean or a degree of change in cuticle hardness.

20. The method of claim 11, wherein the first crustacean is up to 100 grams.

21. An artificial neural network for detecting physiological changes of crustaceans based on external characteristics, with over ninety-nine percent accuracy in detecting the traits, produced by:

training the artificial neural network using a first training set of images comprising 2500 to 3000 crustaceans with corresponding labels for a physiological change, wherein the 2500 to 3000 crustaceans share a first attribute.

22. The artificial neural network of claim 21, further produced by training the artificial neural network using a second training set of images comprising 2500 to 20,000 images of crustaceans that lack corresponding labels for the physiological change but share the first attribute.

23. The artificial neural network of claim 21, wherein the first attribute comprises a cohort, molt stage, life stage, age, size, or weight.

24. The artificial neural network of claim 21, wherein the trait comprises a level of gonadal development, a level of molting, a level of sexual maturation, a level of disease resistance, or a level of robustness.

25. The artificial neural network of claim 21, wherein the trait comprises a completion of gonadal development, molting, or sexual maturation.

26. The artificial neural network of claim 21, wherein the training comprises randomizing a first training characteristic or controlling the first training characteristic.

27. The artificial neural network of claim 26, wherein the first training characteristic is an imaging device used to capture the first training set of images or an image background for the first training set of images.

28. The artificial neural network of claim 21, wherein the first training set of images comprises an image set for each of the 2500 to 3000 crustaceans, and wherein the image set comprises a series of images of each of the 2500 to 3000 crustaceans at a first time and a second time.

29. The artificial neural network of claim 21, wherein the first training set of images comprises an image of the 2500 to 3000 crustaceans during a photoperiod treatment.

30. The artificial neural network of claim 21, wherein each of the 2500 to 3000 crustaceans are juvenile crustaceans or under 2 grams.

Patent History
Publication number: 20220335721
Type: Application
Filed: Aug 4, 2020
Publication Date: Oct 20, 2022
Inventor: Ayal BRENNER (Birkirkara)
Application Number: 17/642,321
Classifications
International Classification: G06V 20/50 (20060101); G06V 10/774 (20060101); G06V 10/82 (20060101); G06V 20/70 (20060101); G06V 20/60 (20060101);