Sensors (Basel). 2023 Nov 15;23(22):9197.
doi: 10.3390/s23229197.

Seaweed Growth Monitoring with a Low-Cost Vision-Based System

Jeroen Gerlo et al.
Abstract

In this paper, we introduce a method for automated seaweed growth monitoring by combining a low-cost RGB and stereo vision camera. While current vision-based seaweed growth monitoring techniques focus on laboratory measurements or above-ground seaweed, we investigate the feasibility of underwater imaging of a vertical seaweed farm. We use deep learning-based image segmentation (DeeplabV3+) to determine the size of the seaweed in pixels from recorded RGB images. We convert this pixel size to square meters by using the distance information from the stereo camera. We demonstrate the performance of our monitoring system using measurements in a seaweed farm in the River Scheldt estuary (in the Netherlands). Notwithstanding the poor visibility of the seaweed in the images, we are able to segment the seaweed with an intersection over union (IoU) of 0.9, and we reach a repeatability of 6% and a precision of 18% on the seaweed size.
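As a rough illustration of the geometric step described above (not the authors' implementation), the conversion from a segmented pixel count to square meters could look like the sketch below. It assumes a pinhole camera model, an approximately flat seaweed sheet facing the camera, focal lengths fx_px and fy_px expressed in pixels, and a per-pixel depth map in meters from the stereo camera; all names are illustrative.

```python
import numpy as np

def seaweed_area_m2(mask, depth_m, fx_px, fy_px):
    """Convert a binary seaweed mask plus stereo depth into a surface area in m^2.

    Minimal sketch: under a pinhole model, a pixel at distance z covers
    (z / fx_px) * (z / fy_px) square meters on a fronto-parallel surface.
    """
    seaweed = np.asarray(mask, dtype=bool)
    if not seaweed.any():
        return 0.0
    # Median distance over the segmented pixels is robust to depth dropouts.
    z = np.nanmedian(depth_m[seaweed])
    return seaweed.sum() * (z / fx_px) * (z / fy_px)
```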

Keywords: aquaculture; image segmentation; seaweed monitoring; underwater stereo imaging.


Conflict of interest statement

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results. Author Dennis G. Kooijman was employed by the company Intelligent Autonomous Mobility Center. Author Ivo W. Wieling was employed by the company Aqitec. The remaining authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Figures

Figure 1
Common seaweed farm typology: Floaters or buoys (3) are held in position on the water surface using anchors (1) and tethers (2). A long rope line (4) is suspended between the fixed buoys. The seaweed (6) is seeded on this suspended line, growing downwards and flowing freely in the water current. Attached buoys and/or floaters (3, 5) keep the seaweed in the upper water column in order to catch sunlight.
Figure 2
Seaweed farm schematic during growth. (7) Dense seaweed bundles; (8) delineated surface area of the seaweed, approximating it as a continuous 2D sheet.
Figure 3
Sequence of the automated monitoring process. The separate components are designed to be integrated in the UTOPIA framework.
Figure 4
(Left) Area within the red dashed line shows the waterproof housing and mounting of the Realsense camera. (Right) The camera setup submerged and recording during one of the measurement campaigns at Neeltje Jans.
Figure 5
(Left) Sharp images of seaweed in the foreground, but too close to determine plant size. (Right) The seaweed plant is in frame, but is too far away to be sufficiently sharp due to the haze created by turbidity and lighting conditions at the moment of measurement.
Figure 6
Example of underwater images taken with the Realsense RGB camera. Seaweed plants can take on different colors and shapes depending on current lighting conditions, depth and position within the seaweed bundles.
Figure 7
Example of manual annotation on a seaweed RGB image. (Left): original image. (Right): original image with the annotation superimposed in blue with a black border. Two bundles of seaweed are visible in the foreground, both growing on the closest rope line. Another bundle of seaweed is visible in the background, much more obscured by the water haze.
Figure 8
DeeplabV3+ architecture. The structure of the Atrous Spatial Pyramid Pooling (ASPP) encoder module is shown, followed by a simple decoder to obtain the image prediction. Adapted from [31].
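For readers who want to reproduce a comparable segmentation step, a stand-in model can be instantiated as in the sketch below. Note that torchvision ships DeepLabV3 (an ASPP head without the "+" decoder of Figure 8), so it is only a readily available substitute for the paper's DeeplabV3+; the two-class setup and the 576 px input size are assumptions.

```python
import torch
from torchvision.models.segmentation import deeplabv3_resnet50

# Two output classes: background (0) and seaweed (1); weights would be trained
# on annotated underwater images such as those in Figure 7.
model = deeplabv3_resnet50(weights=None, num_classes=2).eval()

image = torch.rand(1, 3, 576, 576)        # placeholder normalized RGB input
with torch.no_grad():
    logits = model(image)["out"]          # shape (1, 2, H, W)
seaweed_mask = logits.argmax(dim=1) == 1  # boolean segmentation mask
```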
Figure 9
Segmentation results of the DeeplabV3+ model for different hyperparameters. Segmentation masks are shown as a yellow overlay, with the base image in the bottom right.
Figure 10
More examples of seaweed segmentation with the DeeplabV3+ model using our optimal hyperparameters. The model correctly distinguishes a gap between the seaweed plants in the foreground and ignores the seaweed on a second line in the background. Finer details of trailing seaweed at the bottom of the plants are not detected due to limitations on pixel resolution for the segmentation.
Figure 11
Evolution of the intersection over union (IoU) metric of the training images using different hyperparameters. A perfect model would reach an IoU of 1. The 576 px model converges towards an IoU of 0.9 after 100 iterations (epochs).
Figure 12
Intersection over union metric for the DeeplabV3+ model for seaweed segmentation. Training and validation IoU are approximately identical after 100 epochs.
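The IoU reported in Figures 11 and 12 is the standard intersection over union between a predicted mask and its manual annotation. A minimal sketch of the metric, assuming binary numpy masks, is:

```python
import numpy as np

def intersection_over_union(pred, target):
    """IoU of two binary masks (True/1 = seaweed)."""
    pred, target = np.asarray(pred, bool), np.asarray(target, bool)
    union = np.logical_or(pred, target).sum()
    if union == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return np.logical_and(pred, target).sum() / union
```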
Figure 13
Spread of the segmented surface area as a percentage of the total image, for two one-minute steady camera positions. There are no large outliers in either test.
Figure 14
Construction of a disparity map. (top) Original left and right stereo images. (bottom) Normalized disparity map, calculated via the Semi-Global Block Matching algorithm.
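A disparity map like the one in Figure 14 can be computed with OpenCV's Semi-Global Block Matching, as sketched below. The matcher settings, file names, focal length, and baseline are placeholders rather than the RealSense calibration used in the paper.

```python
import cv2
import numpy as np

# Rectified grayscale stereo pair (file names are placeholders).
left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

sgbm = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128, blockSize=5)
disparity = sgbm.compute(left, right).astype(np.float32) / 16.0  # SGBM output is fixed-point x16

# Triangulation: Z = f_px * baseline / disparity (placeholder calibration values).
F_PX, BASELINE_M = 640.0, 0.05
valid = disparity > 0
depth_m = np.full_like(disparity, np.nan)
depth_m[valid] = F_PX * BASELINE_M / disparity[valid]
```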
Figure 15
Spread of the 2D median of the distance measurement for a steady plant target, calculated over 100 images. The red line denotes the mean.
Figure 16
Segmented images, with seaweed crop yield in square meters as calculated by our algorithm. The seaweed pixel area in the left image is larger than in the right image, but the left image was captured at a shorter distance from the seaweed (1.99 m vs. 2.97 m); our method therefore returns a smaller surface area in m² for the left image.
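To see why the closer (left) image can show more seaweed pixels yet a smaller area in square meters, note that under a pinhole model the surface covered by a single pixel grows with the square of the distance. A quick check with the distances quoted in the caption:

```python
# Each pixel at 2.97 m covers roughly (2.97 / 1.99)^2 ~ 2.2x the surface area
# of a pixel at 1.99 m, so the right image's smaller pixel count can still
# correspond to a larger physical area.
scale = (2.97 / 1.99) ** 2   # ~2.23
```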
Figure 17
Additional examples of segmented images, with seaweed crop yield in square meters as calculated by our algorithm.


References

    1. European Commission, Directorate-General for Maritime Affairs and Fisheries, Joint Research Centre; Addamo A., Calvo Santos A., Guillén J. The EU Blue Economy Report 2022. Publications Office of the European Union; Luxembourg: 2022.
    2. FAO. Brief to The State of World Fisheries and Aquaculture 2022. Food and Agriculture Organization of the United Nations; Rome, Italy: 2022.
    3. Ahmed Z.U., Hasan O., Rahman M.M., Akter M., Rahman M.S., Sarker S. Seaweeds for the sustainable blue economy development: A study from the south east coast of Bangladesh. Heliyon. 2022;8:e09079. doi: 10.1016/j.heliyon.2022.e09079.
    4. Campbell I., Macleod A., Sahlmann C., Neves L., Funderud J., Øverland M., Hughes A.D., Stanley M. The Environmental Risks Associated With the Development of Seaweed Farming in Europe—Prioritizing Key Knowledge Gaps. Front. Mar. Sci. 2019;6 doi: 10.3389/fmars.2019.00107.
    5. Bostock J., Lane A., Hough C., Yamamoto K. An assessment of the economic contribution of EU aquaculture production and the influence of policies for its sustainable development. Aquac. Int. 2016;24:699–733. doi: 10.1007/s10499-016-9992-1.
