C# (CSharp) SuperNeuro.Layers.Activations Namespace

Classes

Name: Description
Exp: Exponential activation function, which simply returns exp(x).
Softplus: The softplus activation: log(exp(x) + 1).
Tanh: Hyperbolic tangent activation. Tanh squashes a real-valued number into the range [-1, 1]. Like sigmoid it is non-linear, but unlike sigmoid its output is zero-centered, so in practice tanh is generally preferred over sigmoid.
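Since this page only lists the formulas, the following is a minimal Python sketch of the math each of these activation classes computes. It illustrates the documented formulas, not the SuperNeuro C# API itself (the function names here are my own, not part of the library).

```python
import math

def exp_activation(x: float) -> float:
    """Exponential activation: returns exp(x) unchanged."""
    return math.exp(x)

def softplus(x: float) -> float:
    """Softplus: log(exp(x) + 1), written with log1p for accuracy when exp(x) is small."""
    return math.log1p(math.exp(x))

def tanh(x: float) -> float:
    """Hyperbolic tangent: squashes input into [-1, 1] and is zero-centered."""
    return math.tanh(x)
```

For example, softplus(0) is log(2) ≈ 0.693, and tanh(0) is exactly 0, which reflects the zero-centered property noted above.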