PSSNet: Planarity-sensible Semantic Segmentation of large-scale urban meshes

Weixiao Gao*, Liangliang Nan, Bas Boom, Hugo Ledoux

*Corresponding author for this work

Research output: Contribution to journal › Article › Scientific › peer-review

Abstract

We introduce a novel deep learning-based framework to interpret 3D urban scenes represented as textured meshes. Based on the observation that object boundaries typically align with the boundaries of planar regions, our framework achieves semantic segmentation in two steps: planarity-sensible over-segmentation followed by semantic classification. The over-segmentation step generates an initial set of mesh segments that capture the planar and non-planar regions of urban scenes. In the subsequent classification step, we construct a graph that encodes the geometric and photometric features of the segments in its nodes and the multi-scale contextual features in its edges. The final semantic segmentation is obtained by classifying the segments using a graph convolutional network. Experiments and comparisons on two semantic urban mesh benchmarks demonstrate that our approach outperforms the state-of-the-art methods in terms of boundary quality, mean IoU (intersection over union), and generalization ability. We also introduce several new metrics for evaluating mesh over-segmentation methods dedicated to semantic segmentation, and our proposed over-segmentation approach outperforms state-of-the-art methods on all metrics. Our source code is available at https://github.com/WeixiaoGao/PSSNet.
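
To make the two-step pipeline more concrete, the sketch below illustrates the classification step in principle: each segment from the over-segmentation becomes a graph node with a feature vector, adjacent segments are connected by edges, and a graph convolutional network predicts one semantic label per segment. This is an illustrative reconstruction, not the released PSSNet code; the SegmentGCN class, the feature dimensions, the plain two-layer GCN with symmetric (Kipf-Welling) normalization, and the toy adjacency are assumptions made for readability. In the paper, the node features encode the geometric and photometric properties of the segments and the edges carry multi-scale contextual features.

# Illustrative sketch (not the authors' implementation): classify mesh segments
# with a simple graph convolutional network. Node features stand in for the
# per-segment geometric/photometric descriptors; edges stand in for segment
# adjacency. Shapes and the two-layer architecture are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SegmentGCN(nn.Module):
    """Two-layer GCN over the segment adjacency graph (hypothetical example)."""

    def __init__(self, in_dim: int, hidden_dim: int, num_classes: int):
        super().__init__()
        self.lin1 = nn.Linear(in_dim, hidden_dim)
        self.lin2 = nn.Linear(hidden_dim, num_classes)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # Symmetrically normalize the adjacency with self-loops: D^-1/2 (A+I) D^-1/2.
        a = adj + torch.eye(adj.size(0), device=adj.device)
        d_inv_sqrt = a.sum(dim=1).pow(-0.5)
        a_norm = d_inv_sqrt.unsqueeze(1) * a * d_inv_sqrt.unsqueeze(0)
        h = F.relu(a_norm @ self.lin1(x))  # propagate neighbor features, then transform
        return a_norm @ self.lin2(h)       # per-segment class logits


# Toy usage: 6 segments with 16-dim features (e.g. planarity and color statistics);
# the adjacency encodes which segments share a boundary in the over-segmentation.
x = torch.randn(6, 16)
adj = torch.zeros(6, 6)
for i, j in [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5)]:
    adj[i, j] = adj[j, i] = 1.0
logits = SegmentGCN(in_dim=16, hidden_dim=32, num_classes=5)(x, adj)
labels = logits.argmax(dim=1)  # one semantic label per segment

Because labels are predicted per segment rather than per face, predicted class boundaries can only fall on segment boundaries, which is how the planarity-sensible over-segmentation translates into the sharp object boundaries reported in the abstract.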

Original language: English
Pages (from-to): 32-44
Number of pages: 13
Journal: ISPRS Journal of Photogrammetry and Remote Sensing
Volume: 196
DOIs
Publication status: Published - 2023

Keywords

  • Over-segmentation
  • Semantic segmentation
  • Textured meshes
  • Urban scene understanding
