C# (CSharp) SuperNeuro.Layers.Activations Namespace

Classes

Name Description
Exp The exponential activation: exp(x).
Softplus The softplus activation: log(exp(x) + 1).
Tanh The hyperbolic tangent activation. Tanh squashes a real-valued input to the range (-1, 1). Like the sigmoid it is non-linear, but unlike the sigmoid its output is zero-centered, so in practice the tanh non-linearity is generally preferred to the sigmoid.
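The three activations listed above can be sketched as plain scalar functions. The sketch below is in Python for brevity (the SuperNeuro classes themselves are C#, so the function names here are illustrative, not the library's API); the softplus uses a standard numerically stable form rather than the literal log(exp(x) + 1).

```python
import math

def exp_act(x: float) -> float:
    # Exponential activation: simply exp(x).
    return math.exp(x)

def softplus(x: float) -> float:
    # Softplus: log(exp(x) + 1), rewritten with log1p and a branch
    # for large positive x so exp() never overflows.
    if x > 0:
        return x + math.log1p(math.exp(-x))
    return math.log1p(math.exp(x))

def tanh_act(x: float) -> float:
    # Hyperbolic tangent: non-linear, zero-centered, range (-1, 1).
    return math.tanh(x)
```

Note that softplus is a smooth approximation of ReLU: for large x it behaves like x, and for very negative x it approaches 0.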