Global river monitoring using semantic fusion networks

Zhihao Wei, Kebin Jia, Xiaowei Jia, Ankush Khandelwal, Vipin Kumar

Research output: Contribution to journal › Article › peer-review

4 Scopus citations

Abstract

Global river monitoring is an important mission for the remote sensing community. One of the main challenges in this mission is generating an accurate water mask from remote sensing images (RSI) of rivers (RSIR), especially at a global scale where river characteristics vary widely. Aiming at better water-area classification using semantic information, this paper presents a segmentation method for global river monitoring based on semantic clustering and semantic fusion. First, an encoder-decoder network (AEN)-based architecture is proposed to obtain semantic features from RSIR. Second, a clustering-based semantic fusion method is proposed to divide the semantic features of RSIR into groups and to train a convolutional neural network (CNN) model for each group using data augmentation and semi-supervised learning. Third, a semantic distance-based segmentation fusion method is proposed to fuse the CNN model outputs into the final segmentation mask. We built a global river dataset containing multiple river segments from each continent of the world, based on Sentinel-2 satellite imagery. The results show that the F1-score of the proposed segmentation method is 93.32%, which outperforms several state-of-the-art algorithms and demonstrates that grouping semantic information helps to better segment RSIR at a global scale.
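The abstract describes a three-stage pipeline: semantic feature extraction with an encoder-decoder network, clustering of those features into groups with a per-group CNN, and a semantic distance-based fusion of the group predictions. The sketch below is a minimal illustration of how the grouping and distance-weighted fusion steps could fit together; it is not the authors' implementation. The feature extractor, the use of K-means, the inverse-distance weighting, and all function names are illustrative assumptions.

```python
# Illustrative sketch (not the paper's code): cluster per-image semantic features
# into groups, then fuse per-group segmentation outputs weighted by the inverse
# semantic distance between a test image and each group's cluster centroid.
import numpy as np
from sklearn.cluster import KMeans


def extract_semantic_features(images):
    """Stand-in for the paper's encoder-decoder (AEN) feature extractor:
    here we simply use per-band mean statistics as a semantic vector."""
    return np.stack([img.reshape(-1, img.shape[-1]).mean(axis=0) for img in images])


def fit_semantic_groups(features, n_groups=4):
    """Cluster semantic feature vectors into groups (K-means assumed here)."""
    kmeans = KMeans(n_clusters=n_groups, n_init=10, random_state=0)
    kmeans.fit(features)
    return kmeans


def fuse_predictions(feature, centroids, group_predictions):
    """Fuse the per-group masks, weighting each group's prediction by the
    inverse semantic distance from the image's feature to that group's centroid."""
    dists = np.linalg.norm(centroids - feature, axis=1)
    weights = 1.0 / (dists + 1e-8)
    weights /= weights.sum()
    fused = sum(w * p for w, p in zip(weights, group_predictions))
    return (fused > 0.5).astype(np.uint8)  # binary water mask


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy stand-ins for Sentinel-2 river image patches (H x W x bands).
    images = [rng.random((64, 64, 4)) for _ in range(20)]
    feats = extract_semantic_features(images)
    kmeans = fit_semantic_groups(feats, n_groups=3)

    # Toy stand-ins for each group's CNN output on one test image.
    test_feat = feats[0]
    group_preds = [rng.random((64, 64)) for _ in range(3)]
    mask = fuse_predictions(test_feat, kmeans.cluster_centers_, group_preds)
    print("Water pixels:", int(mask.sum()))
```

In this reading, a test image is never assigned hard to a single group; every group's model contributes to the final mask in proportion to how semantically close the image is to that group, which is one plausible interpretation of "semantic distance-based segmentation fusion."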

Original language: English (US)
Article number: 2258
Journal: Water (Switzerland)
Volume: 12
Issue number: 8
DOIs
State: Published - Aug 2020

Bibliographical note

Publisher Copyright:
© 2020 by the authors.

Keywords

  • Convolution
  • Encoder-decoder network
  • Feature extraction
  • Remote sensing image of river
  • Semantic fusion
  • Semi-supervised learning
