In extracting information from remote sensing images of ship navigation, the large imaging area and complex background mean that relying solely on spectral information while ignoring spatial relationships at the image segmentation stage produces discontinuous edges in the segmentation results. To address this, an information extraction method for ship navigation remote sensing images based on a deep convolutional network is proposed. The images are first preprocessed by image registration; a WGMM-MRF model, which incorporates spatial relationships, is then used to segment the ship navigation images so that the segment edges remain continuous. A deep convolutional neural network is constructed, and with a strategy of deep feature extraction and change-region discrimination, the WGMM-MRF segmentation results are taken as input to extract ship navigation information from the remote sensing images. Experimental results show that the proposed method reaches an extraction accuracy of up to 0.93 with a loss of no more than 0.04, demonstrating high practical value.
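As a hedged illustration of the spectral-plus-spatial segmentation step summarized above, the following minimal sketch combines a Gaussian mixture model (spectral term) with a Potts-style MRF neighbourhood penalty solved by iterated conditional modes. The function name gmm_mrf_segment, the number of classes, and the smoothing weight beta are illustrative assumptions and do not reproduce the paper's exact WGMM-MRF formulation.

```python
# Minimal GMM + MRF (ICM) segmentation sketch for a single-band image.
# Assumptions: 3 classes, 4-connected Potts prior, beta = 1.5 (all illustrative).
import numpy as np
from sklearn.mixture import GaussianMixture

def gmm_mrf_segment(img, n_classes=3, beta=1.5, n_iter=5):
    """img: 2-D float array (one spectral band); returns a label map."""
    h, w = img.shape
    x = img.reshape(-1, 1)

    # Spectral term: a Gaussian mixture models the per-class intensity statistics.
    gmm = GaussianMixture(n_components=n_classes, random_state=0).fit(x)
    post = gmm.predict_proba(x).reshape(h, w, n_classes)
    unary = -np.log(post + 1e-12)        # per-pixel, per-class spectral energy

    labels = unary.argmin(axis=-1)       # initial labeling from the GMM alone

    # ICM iterations: penalize disagreement with the 4-neighbourhood, which is
    # what keeps segment edges spatially continuous rather than speckled.
    for _ in range(n_iter):
        pad = np.pad(labels, 1, mode="edge")
        neigh = np.stack([pad[:-2, 1:-1], pad[2:, 1:-1],
                          pad[1:-1, :-2], pad[1:-1, 2:]], axis=-1)
        disagree = np.stack(
            [(neigh != k).sum(axis=-1) for k in range(n_classes)], axis=-1
        )
        labels = (unary + beta * disagree).argmin(axis=-1)
    return labels

if __name__ == "__main__":
    demo = np.random.rand(128, 128).astype(np.float32)  # stand-in image band
    print(gmm_mrf_segment(demo).shape)                   # (128, 128) label map
```

The segmented label map produced this way would then serve as the input to the deep convolutional network stage described in the abstract.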
2025, 47(8): 171-175    Received: 2024-08-24
DOI:10.3404/j.issn.1672-7649.2025.08.029
CLC number: P237
Funding: 2024 Henan Province Science and Technology Research Project (242102210101)
Author: BAI Lei (1985-), male, lecturer; research interests include applications of computer technology and network security.