C# Class SuperNeuro.Layers.Activations.Tanh

Hyperbolic tangent activation. Tanh squashes a real-valued number into the range [-1, 1]. Like the sigmoid it is non-linear, but unlike the sigmoid its output is zero-centered, so in practice the tanh non-linearity is generally preferred over the sigmoid non-linearity (see the sketch below the class header).
Inheritance: BaseLayer
Open project: tech-quantum/SuperchargedArray
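For reference, the activation computes tanh(x) = (e^x - e^-x) / (e^x + e^-x) element-wise, and its derivative is 1 - tanh(x)^2. The following standalone C# sketch uses plain doubles and Math.Tanh rather than the SuperArray type used by the actual layer; the class and variable names are illustrative only.

    using System;

    // Standalone sketch of the values the Tanh activation produces.
    // Outputs lie in [-1, 1] and are zero-centered for zero-centered inputs.
    class TanhDemo
    {
        static void Main()
        {
            double[] inputs = { -3.0, -1.0, 0.0, 1.0, 3.0 };
            foreach (double x in inputs)
            {
                double y = Math.Tanh(x);  // activation value in [-1, 1]
                double dy = 1.0 - y * y;  // local derivative: 1 - tanh(x)^2
                Console.WriteLine($"x = {x,5:F1}  tanh(x) = {y,7:F4}  dtanh/dx = {dy:F4}");
            }
        }
    }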

Public Methods

Method                                       Description
Backward ( SuperArray outputgrad ) : void    Calculates the gradient of this layer's activation function.
Forward ( SuperArray x ) : void              Forwards the input through the layer and computes the output.
Tanh ( )                                     Initializes a new instance of the Tanh class.
Method Details

Backward() public method

Calculates the gradient of this layer's activation function
public Backward ( SuperArray outputgrad ) : void
outputgrad SuperArray The output gradient received from the previous layer during the backward pass.
return void
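The backward pass of a tanh layer is an element-wise application of the chain rule: the incoming output gradient is multiplied by the local derivative 1 - tanh(x)^2. The sketch below is a conceptual illustration using plain double arrays; the helper name, the array type, and the recomputation of Math.Tanh (instead of reusing a cached forward output) are assumptions, not the actual SuperNeuro implementation.

    using System;

    static class TanhBackwardSketch
    {
        // Conceptual backward pass: inputGrad[i] = outputGrad[i] * (1 - tanh(x[i])^2).
        public static double[] Backward(double[] input, double[] outputGrad)
        {
            var inputGrad = new double[input.Length];
            for (int i = 0; i < input.Length; i++)
            {
                double y = Math.Tanh(input[i]);               // forward activation value
                inputGrad[i] = outputGrad[i] * (1.0 - y * y); // chain rule, element-wise
            }
            return inputGrad;
        }
    }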

Forward() public method

Forwards the input through the layer and computes the output
public Forward ( SuperArray x ) : void
x SuperArray The input SuperArray for this layer.
return void
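Conceptually, the forward pass applies tanh element-wise to the input. The sketch below uses plain double arrays as a stand-in for SuperArray; the helper name is illustrative, not part of the library.

    using System;

    static class TanhForwardSketch
    {
        // Conceptual forward pass: output[i] = tanh(input[i]).
        public static double[] Forward(double[] input)
        {
            var output = new double[input.Length];
            for (int i = 0; i < input.Length; i++)
                output[i] = Math.Tanh(input[i]); // each value squashed into [-1, 1]
            return output;
        }
    }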

Tanh() public method

Initializes a new instance of the Tanh class.
public Tanh ( )