
Example 1 with HardTanhDerivative

Use of org.nd4j.linalg.api.ops.impl.transforms.gradient.HardTanhDerivative in the project nd4j by deeplearning4j.

The backprop method of the ActivationHardTanH class:

@Override
public Pair<INDArray, INDArray> backprop(INDArray in, INDArray epsilon) {
    // Element-wise derivative of hard tanh with respect to the pre-activations 'in'
    INDArray dLdz = Nd4j.getExecutioner().execAndReturn(new HardTanhDerivative(in));
    // Chain rule: multiply in place by the incoming gradient epsilon (dL/dactivation)
    dLdz.muli(epsilon);
    // The activation has no trainable parameters, so the second element of the pair is null
    return new Pair<>(dLdz, null);
}
Also used: INDArray (org.nd4j.linalg.api.ndarray.INDArray), HardTanhDerivative (org.nd4j.linalg.api.ops.impl.transforms.gradient.HardTanhDerivative), Pair (org.nd4j.linalg.primitives.Pair)
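
For context, here is a minimal usage sketch, not taken from the nd4j sources: it assumes ActivationHardTanH lives at org.nd4j.linalg.activations.impl.ActivationHardTanH (as in the nd4j version above) and that nd4j is on the classpath; the array values and the HardTanhBackpropExample class name are illustrative.

import org.nd4j.linalg.activations.impl.ActivationHardTanH;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;
import org.nd4j.linalg.primitives.Pair;

public class HardTanhBackpropExample {
    public static void main(String[] args) {
        // Pre-activations z and the incoming gradient dL/da must have the same shape
        INDArray preActivations = Nd4j.create(new double[]{-2.0, -0.5, 0.0, 0.5, 2.0}, new int[]{1, 5});
        INDArray epsilon = Nd4j.ones(1, 5);

        // backprop returns (dL/dz, null); the second element is null because
        // hard tanh has no trainable parameters
        Pair<INDArray, INDArray> grads = new ActivationHardTanH().backprop(preActivations, epsilon);

        // With epsilon = 1, this prints the hard-tanh gradient itself:
        // 1 inside (-1, 1), 0 outside, i.e. [0, 1, 1, 1, 0]
        System.out.println(grads.getFirst());
    }
}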
