C# Class SuperNeuro.Layers.Activations.Tanh

Hyperbolic tangent activation. Tanh squashes a real-valued number to the range [-1, 1]. Like the sigmoid it is non-linear, but unlike the sigmoid its output is zero-centered, so in practice the tanh non-linearity is often preferred to the sigmoid non-linearity.
Inheritance: BaseLayer
Project: tech-quantum/SuperchargedArray
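The zero-centering is easy to check numerically. The following standalone C# sketch uses only System.Math (it does not touch the SuperNeuro or SuperchargedArray APIs) to compare tanh and sigmoid outputs for a few symmetric inputs:

```csharp
// Standalone illustration: tanh squashes inputs into (-1, 1) and is symmetric
// around zero, while the sigmoid maps everything into (0, 1).
using System;

class TanhDemo
{
    static double Sigmoid(double x) => 1.0 / (1.0 + Math.Exp(-x));

    static void Main()
    {
        foreach (var x in new[] { -5.0, -1.0, 0.0, 1.0, 5.0 })
        {
            Console.WriteLine($"x = {x,5}  tanh(x) = {Math.Tanh(x),8:F4}  sigmoid(x) = {Sigmoid(x),7:F4}");
        }
        // tanh(-x) == -tanh(x), so symmetric inputs produce outputs that average near zero;
        // sigmoid outputs are always positive, which is what makes it non-zero-centered.
    }
}
```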

Public Methods

Method Description
Backward ( SuperArray outputgrad ) : void

Calculates the gradient of this layer's function

Forward ( SuperArray x ) : void

Forwards the inputs and computes the output

Tanh ( )

Initializes a new instance of the Tanh class.

Method Details

Backward() public method

Calculates the gradient of this layer's function
public Backward ( SuperArray outputgrad ) : void
outputgrad SuperArray The output gradient calculated from the previous layer.
Returns void
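The derivative of tanh is 1 - tanh(x)^2, so a backward pass for this activation typically scales the incoming gradient element-wise by that factor. Below is a minimal sketch of that math with plain double[] arrays standing in for SuperArray; it illustrates the rule only and is not the library's implementation:

```csharp
// Sketch of the tanh backward rule: inputGrad[i] = outputGrad[i] * (1 - tanh(x[i])^2).
using System;

class TanhBackwardSketch
{
    static double[] Backward(double[] x, double[] outputGrad)
    {
        var inputGrad = new double[x.Length];
        for (int i = 0; i < x.Length; i++)
        {
            double y = Math.Tanh(x[i]);              // forward output for this element
            inputGrad[i] = outputGrad[i] * (1.0 - y * y);
        }
        return inputGrad;
    }

    static void Main()
    {
        var grad = Backward(new[] { -2.0, 0.0, 2.0 }, new[] { 1.0, 1.0, 1.0 });
        Console.WriteLine(string.Join(", ", grad));  // approx. 0.0707, 1, 0.0707
    }
}
```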

Forward() public method

Forwards the inputs and computes the output
public Forward ( SuperArray x ) : void
x SuperArray The input SuperArray for this layer.
Returns void
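Element-wise, the forward computation is simply y = tanh(x). Here is a standalone sketch with plain arrays in place of SuperArray (an illustration of the element-wise behaviour, not the library's code):

```csharp
// Sketch of the forward pass: map each input element through tanh.
using System;
using System.Linq;

class TanhForwardSketch
{
    static double[] Forward(double[] x) => x.Select(Math.Tanh).ToArray();

    static void Main()
    {
        var y = Forward(new[] { -10.0, -0.5, 0.0, 0.5, 10.0 });
        Console.WriteLine(string.Join(", ", y.Select(v => v.ToString("F4"))));
        // every output lies strictly inside (-1, 1)
    }
}
```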

Tanh() public constructor

Initializes a new instance of the Tanh class.
public Tanh ( )
Returns a new Tanh instance
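A usage sketch based only on the members documented on this page; how the SuperArray inputs are constructed is outside the scope of this page, so those calls are left as comments:

```csharp
// Usage sketch: parameterless constructor plus the two documented methods.
// SuperArray creation is assumed to come from the SuperchargedArray project
// and is intentionally not shown here.
using SuperNeuro.Layers.Activations;

class TanhUsage
{
    static void Build()
    {
        var activation = new Tanh();    // documented parameterless constructor
        // activation.Forward(x);       // x : SuperArray (see Forward above)
        // activation.Backward(grad);   // grad : SuperArray (see Backward above)
    }
}
```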