Conference papers

Scale Equivariant Neural Networks with Morphological Scale-Spaces

Abstract: The translation equivariance of convolutions can make convolutional neural networks translation equivariant or invariant. Equivariance to other transformations (e.g. rotations, affine transformations, scalings) may also be desirable whenever we know a priori that transformed versions of the same objects appear in the data. The semigroup cross-correlation, a linear operator equivariant to semigroup actions, was recently proposed and applied in conjunction with the Gaussian scale-space to create architectures equivariant to discrete scalings. In this paper, a generalization using a broad class of liftings, including morphological scale-spaces, is proposed. The architectures obtained from different scale-spaces are tested and compared on supervised classification and semantic segmentation tasks in which objects in test images appear at different scales than in training images. In both tasks, the scale-equivariant architectures dramatically improve generalization to unseen scales compared to a convolutional baseline. Moreover, in our experiments morphological scale-spaces outperformed the Gaussian scale-space on geometrical tasks.
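To make the idea of a lifting concrete, here is a minimal NumPy sketch of a morphological scale-space lifting: the image is dilated by a quadratic structuring function b_t(d) = -||d||²/(4t) (the max-plus analogue of the Gaussian kernel) at a set of scales, producing a stack with an extra scale axis on which a scale-equivariant layer could then act. This is an illustrative assumption about the construction, not the paper's implementation; the function names and the truncation `radius` are hypothetical choices.

```python
import numpy as np

def quadratic_dilation(image, t, radius=3):
    """Dilate a 2-D image by the quadratic structuring function
    b_t(d) = -||d||^2 / (4 t), truncated to a (2*radius+1)^2 window.
    This is the classical morphological (max-plus) scale-space."""
    h, w = image.shape
    # Pad with -inf so out-of-bounds pixels never win the maximum.
    padded = np.pad(image.astype(float), radius, constant_values=-np.inf)
    out = np.full((h, w), -np.inf)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            shifted = padded[radius + dy: radius + dy + h,
                             radius + dx: radius + dx + w]
            # Max-plus "convolution": max over offsets of f(x+d) + b_t(d).
            out = np.maximum(out, shifted - (dx * dx + dy * dy) / (4.0 * t))
    return out

def morphological_lifting(image, scales, radius=3):
    """Lift a 2-D image to a (num_scales, H, W) stack, one dilation level
    per scale; a semigroup cross-correlation would then operate jointly
    over the spatial and scale axes."""
    return np.stack([quadratic_dilation(image, t, radius) for t in scales])
```

Two sanity properties of this lifting: each level dominates the input pointwise (the offset d = 0 contributes f(x) itself), and levels are monotone in t, since the penalty ||d||²/(4t) shrinks as t grows.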
Contributor: Mateus Sangalli
Submitted on: Friday, April 30, 2021

  • HAL Id: hal-03213645, version 1
  • arXiv: 2105.01335


Mateus Sangalli, Samy Blusseau, Santiago Velasco-Forero, Jesus Angulo. Scale Equivariant Neural Networks with Morphological Scale-Spaces. IAPR International Conference on Discrete Geometry and Mathematical Morphology (DGMM 2021), May 2021, Uppsala, Sweden. ⟨hal-03213645⟩


