Re: activation function

by Nicolas Talabot
Mostly for simplicity. ReLU is simple, fast, generally works well, and is still very often used. But yes, there exist many other non-linearities that can replace ReLU for a potential performance improvement (e.g., https://pytorch.org/docs/stable/nn.html#non-linear-activations-weighted-sum-nonlinearity).
In practice, though, it still boils down to testing a few and selecting the one that works best for a given problem.
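As a rough sketch of what "testing a few" could look like (the MLP, layer sizes, and candidate list here are just illustrative assumptions, not a prescription):

import torch.nn as nn

# Hypothetical small MLP that takes the activation class as an argument,
# so different non-linearities from torch.nn can be swapped in easily.
class SmallMLP(nn.Module):
    def __init__(self, activation_cls):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(64, 128),
            activation_cls(),
            nn.Linear(128, 128),
            activation_cls(),
            nn.Linear(128, 10),
        )

    def forward(self, x):
        return self.net(x)

# A few candidates from torch.nn; train and validate each on your task,
# then keep whichever gives the best validation performance.
candidates = {
    "ReLU": nn.ReLU,
    "LeakyReLU": nn.LeakyReLU,
    "GELU": nn.GELU,
    "SiLU": nn.SiLU,
}

for name, act_cls in candidates.items():
    model = SmallMLP(act_cls)
    # ... run your usual training / validation loop with `model` here ...
    print(name, model)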