MorphoActivation: Generalizing ReLU activation function by mathematical morphology

Authors
  • Velasco-Forero, Santiago
  • Angulo, Jesús
Publication Date
Oct 24, 2022
Source
HAL-Mines ParisTech
Language
English
License
Unknown
Abstract

This paper analyses both nonlinear activation functions and spatial max-pooling for Deep Convolutional Neural Networks (DCNNs) by means of the algebraic basis of mathematical morphology. Additionally, a general family of activation functions is proposed by considering both max-pooling and nonlinear operators in the context of morphological representations. The experimental section validates the effectiveness of our approach on classical supervised-learning benchmarks for DCNNs.
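As a rough illustration of the morphological viewpoint sketched in the abstract (not the paper's exact formulation), both ReLU and spatial max-pooling can be written as suprema, i.e. dilation-style operations, and a generalized activation can be expressed as a supremum of affine maps. The function `morpho_activation` below, and its parameters, are hypothetical names chosen for this sketch:

```python
import numpy as np

def relu(x):
    # ReLU(x) = max(x, 0): a supremum of the input and the constant 0
    return np.maximum(x, 0.0)

def max_pool_1d(x, k):
    # Spatial max-pooling: a flat morphological dilation followed by subsampling
    return np.array([x[i:i + k].max() for i in range(0, len(x) - k + 1, k)])

def morpho_activation(x, weights, biases):
    # Illustrative generalized activation: a supremum of affine terms
    # max_j (w_j * x + b_j). With weights = [1, 0] and biases = [0, 0]
    # this recovers ReLU, since max(x, 0) = ReLU(x).
    return np.max(weights[:, None] * x[None, :] + biases[:, None], axis=0)

x = np.array([-2.0, -0.5, 1.0, 3.0])
print(relu(x))            # [0. 0. 1. 3.]
print(max_pool_1d(x, 2))  # [-0.5  3. ]
print(morpho_activation(x, np.array([1.0, 0.0]), np.array([0.0, 0.0])))
```

The last call reproduces ReLU exactly, showing how a fixed nonlinearity arises as a special case of a parametrized supremum of affine maps.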
