
IJAT Vol.19 No.3, pp. 248-257 (2025)
doi: 10.20965/ijat.2025.p0248

Research Paper:

Automatic Detection and Distribution Mapping of Epibenthos from Seabed Images

Koichiro Enomoto*1,†, Naohiro Maruo*2, Koji Miyoshi*3, Yasuhiro Kuwahara*4, and Masashi Toda*5

*1Regional ICT Research Center of Human, Industry and Future, The University of Shiga Prefecture
2500 Hassaka-cho, Hikone-shi, Shiga 522-8533, Japan

†Corresponding author

*2The University of Shiga Prefecture
Hikone, Japan

*3Fisheries Research Department, Central Fisheries Research Institute, Hokkaido Research Organization
Yoichi, Japan

*4Fisheries Research Department, Mariculture Fisheries Research Institute, Hokkaido Research Organization
Muroran, Japan

*5Kumamoto University
Kumamoto, Japan

Received: November 25, 2024
Accepted: January 22, 2025
Published: May 5, 2025
Keywords: Mask R-CNN, Mask2Former, epibenthos, fishery investigation, seabed image
Abstract

The efficient investigation of fishery resources is critical for rapidly understanding the effects of abrupt environmental changes. Seabed imagery has been used extensively for resource assessment in scallop fisheries in the Sea of Okhotsk, Hokkaido, Japan; however, the potential of these images for the broader investigation of epibenthos remains unclear. In this paper, we propose an automatic detection method for epibenthos in seabed images using deep learning, specifically the Mask R-CNN and Mask2Former models. We focus on four species: Asterias amurensis, Distolasterias nipon, Halocynthia aurantium, and Patiria pectinifera. The Mask R-CNN X101-FPN 3x model showed the highest overall accuracy, with a mask mAP of 77.8%, whereas Mask2Former excelled at detecting specific species. The trained models were successfully used to generate epibenthic distribution maps, demonstrating the effectiveness of the proposed method for monitoring large-scale marine ecosystems. This approach significantly enhances the ability to conduct comprehensive assessments of benthic communities, providing an effective tool for marine-biodiversity assessment and fishery resource management.
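The final mapping step described in the abstract, aggregating per-image detections into an epibenthic distribution map, can be sketched as below. The grid binning, the coordinate convention (metres relative to a survey origin), and the `distribution_map` helper are illustrative assumptions for this sketch, not the authors' implementation.

```python
from collections import Counter

def distribution_map(detections, origin, cell_size):
    """Bin georeferenced detections into grid cells, giving a
    per-species count map (a simple distribution map).

    detections: iterable of (species, x_m, y_m) tuples, with x_m/y_m
    in metres relative to `origin`; cell_size: cell edge in metres.
    (Names and coordinate scheme are assumptions for illustration.)
    """
    ox, oy = origin
    counts = Counter()
    for species, x, y in detections:
        # Integer grid index of the cell containing this detection
        cell = (int((x - ox) // cell_size), int((y - oy) // cell_size))
        counts[(species, cell)] += 1
    return counts

# Hypothetical detections pooled from several survey images
dets = [
    ("Asterias amurensis", 3.0, 4.0),
    ("Asterias amurensis", 4.5, 4.9),
    ("Patiria pectinifera", 12.0, 1.0),
]
counts = distribution_map(dets, origin=(0.0, 0.0), cell_size=5.0)
# counts[("Asterias amurensis", (0, 0))] → 2
```

Counts per cell can then be rendered as a heat map per species; dividing by the seabed area imaged within each cell would give densities instead of raw counts.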

Cite this article as:
K. Enomoto, N. Maruo, K. Miyoshi, Y. Kuwahara, and M. Toda, “Automatic Detection and Distribution Mapping of Epibenthos from Seabed Images,” Int. J. Automation Technol., Vol.19 No.3, pp. 248-257, 2025.
References
[1] Food and Agriculture Organization of the United Nations, “The State of World Fisheries and Aquaculture 2024,” FAO, 2024. https://doi.org/10.4060/cd0683en
[2] S. Jennings and M. J. Kaiser, “The Effects of Fishing on Marine Ecosystems,” Advances in Marine Biology, Vol.34, pp. 213-352, 1998. https://doi.org/10.1016/S0065-2881(08)60212-6
[3] Marine Stewardship Council, “Japanese scallop hanging and seabed enhanced fisheries.” https://fisheries.msc.org/ [Accessed September 19, 2024]
[4] K. Enomoto, M. Toda, and Y. Kuwahara, “Extraction Method of Scallop Area from Sand Seabed Images,” IEICE Trans. Inf. & Syst., Vol.E97.D, No.1, pp. 130-139, 2014. https://doi.org/10.1587/transinf.E97.D.130
[5] K. Enomoto, M. Toda, and Y. Kuwahara, “Discussion on a Method to Extract Scallop Using Line Convergence Index Filter from Granule-sand Seabed Videos,” IAPR Conf. on Machine Vision Applications (MVA2015), pp. 35-40, 2015. https://doi.org/10.1109/MVA.2015.7153127
[6] J. Kitagawa, K. Enomoto, M. Toda, K. Miyoshi, and Y. Kuwahara, “A Study of Bottom Sediment Classification System Using Seabed Images,” Sensors and Materials, Vol.31, No.3, pp. 823-830, 2019. https://doi.org/10.18494/SAM.2019.2151
[7] K. He, G. Gkioxari, P. Dollar, and R. Girshick, “Mask R-CNN,” 2017 IEEE Int. Conf. on Computer Vision (ICCV), 2017. https://doi.org/10.1109/ICCV.2017.322
[8] B. Cheng, I. Misra, A. G. Schwing, A. Kirillov, and R. Girdhar, “Masked-attention Mask Transformer for Universal Image Segmentation,” 2022 IEEE/CVF Conf. on Computer Vision and Pattern Recognition (CVPR), 2022. https://doi.org/10.1109/CVPR52688.2022.00135
[9] Abashiri Fisheries Research Institute, Hokkaido Research Organization, “Starfish predation damage control for scallop fisheries,” (in Japanese). https://www.hro.or.jp/fisheries/research/abashiri/topics/Hitode_Higai_ver1.html [Accessed October 1, 2024]
[10] J. M. Lawrence et al., “Starfish: Biology and Ecology of the Asteroidea,” Johns Hopkins University Press, 2013.
[11] M. Narita, T. Maoka, Y. Kuwahara, and K. Ebitani, “Proximate Composition and Carotenoids Composition of Halocynthia aurantium in the Okhotsk Sea,” Nippon Suisan Gakkaishi, Vol.83, No.6, pp. 996-1004, 2017 (in Japanese). https://doi.org/10.2331/suisan.17-00040
[12] P. M. Almond, K. Linse, S. Dreutter, S. M. Grant, H. J. Griffiths, R. J. Whittle, M. Mackenzie, and W. D. K. Reid, “In-situ Image Analysis of Habitat Heterogeneity and Benthic Biodiversity in the Prince Gustav Channel, Eastern Antarctic Peninsula,” Front. Mar. Sci., Vol.8, Article No.614496, 2021. https://doi.org/10.3389/fmars.2021.614496
[13] B. Taormina, M. P. Marzloff, N. Desroy, X. Caisey, O. Dugornay, E. M. Thiesse, A. Tancray, and A. Carlier, “Optimizing image-based protocol to monitor macroepibenthic communities colonizing artificial structures,” ICES J. Mar. Sci., Vol.77, No.2, pp. 835-845, 2020. https://doi.org/10.1093/icesjms/fsz249
[14] O. E. Boulais, B. Woodward, B. Schlining, L. Lundsten, K. Barnard, K. C. Bell, and K. Katija, “FathomNet: An underwater image training database for ocean exploration and discovery,” arXiv:2007.00114, 2020. https://doi.org/10.48550/arXiv.2007.00114
[15] A. Marburg and K. Bigham, “Deep learning for benthic fauna identification,” OCEANS 2016 MTS/IEEE Monterey, 2016. https://doi.org/10.1109/OCEANS.2016.7761146
[16] C. Rasmussen, J. Zhao, D. Ferraro, and A. Trembanis, “Deep Census: AUV-Based Scallop Population Monitoring,” IEEE Int. Conf. on Computer Vision Workshops (ICCVW), pp. 2865-2873, 2017. https://doi.org/10.1109/ICCVW.2017.338
[17] I. H. Chen and N. Belbachir, “Using Mask R-CNN for Underwater Fish Instance Segmentation as Novel Objects: A Proof of Concept,” Proc. of the Northern Lights Deep Learning Workshop, Vol.4, 2023. https://doi.org/10.7557/18.6791
[18] P. Muñoz-Benavent, J. Martínez-Peiró, G. Andreu-García, V. Puig-Pons, V. Espinosa, I. Pérez-Arjona, F. De la Gándara, and A. Ortega, “Impact evaluation of deep learning on image segmentation for automatic bluefin tuna sizing,” Aquacultural Engineering, Vol.99, Article No.102299, 2022. https://doi.org/10.1016/j.aquaeng.2022.102299
[19] S. Song, J. Zhu, X. Li, and Q. Huang, “Integrate MSRCR and Mask R-CNN to Recognize Underwater Creatures on Small Sample Datasets,” IEEE Access, Vol.8, pp. 172848-172858, 2020. https://doi.org/10.1109/ACCESS.2020.3025617
[20] J. Aguzzi, A. Manuel, F. Condal, J. Guillen, M. Nogueras, J. del Rio, C. Costa, P. Menesatti, P. Puig, F. Sarda, D. Toma, and A. Palanques, “The new Seafloor Observatory (OBSEA) for Remote and Long-Term Coastal Ecosystem Monitoring,” Sensors, Vol.11, No.6, pp. 5850-5872, 2011. https://doi.org/10.3390/s110605850
[21] jsbroks, “coco-annotator.” https://github.com/jsbroks/coco-annotator [Accessed October 1, 2024]
[22] T.-Y. Lin et al., “Microsoft COCO: Common Objects in Context,” Computer Vision – ECCV 2014, pp. 740-755, 2014. https://doi.org/10.1007/978-3-319-10602-1_48
[23] Y. Wu, A. Kirillov, F. Massa, W. Y. Lo, and R. Girshick, “Detectron2.” https://github.com/facebookresearch/detectron2 [Accessed October 1, 2024]
[24] B. Cheng, A. G. Schwing, and A. Kirillov, “Per-Pixel Classification is Not All You Need for Semantic Segmentation,” 35th Conf. on Neural Information Processing Systems (NeurIPS 2021), 2021.

