
Example 1 with LeakyReLUDerivative

Use of org.nd4j.linalg.api.ops.impl.transforms.gradient.LeakyReLUDerivative in the nd4j project by deeplearning4j.

The backprop method of the ActivationLReLU class:

@Override
public Pair<INDArray, INDArray> backprop(INDArray in, INDArray epsilon) {
    // Elementwise derivative of leaky ReLU w.r.t. the pre-activation input:
    // 1 for positive entries of in, alpha for negative entries.
    INDArray dLdz = Nd4j.getExecutioner().execAndReturn(new LeakyReLUDerivative(in, alpha));
    // Chain rule: multiply by the incoming gradient epsilon, in place.
    dLdz.muli(epsilon);
    // The second element of the pair (gradient w.r.t. activation parameters)
    // is null because alpha is a fixed hyperparameter here, not learned.
    return new Pair<>(dLdz, null);
}
Also used: INDArray (org.nd4j.linalg.api.ndarray.INDArray), LeakyReLUDerivative (org.nd4j.linalg.api.ops.impl.transforms.gradient.LeakyReLUDerivative), Pair (org.nd4j.linalg.primitives.Pair)
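
To see what the op computes in isolation, here is a minimal sketch that runs LeakyReLUDerivative on a small array. The class name, alpha value, and input values are illustrative, and it assumes an nd4j version where LeakyReLUDerivative has the same (INDArray, double) constructor used in the snippet above.

import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.api.ops.impl.transforms.gradient.LeakyReLUDerivative;
import org.nd4j.linalg.factory.Nd4j;

public class LeakyReLUDerivativeExample {
    public static void main(String[] args) {
        double alpha = 0.01; // illustrative negative-side slope
        INDArray in = Nd4j.create(new double[] {-2.0, -0.5, 0.5, 2.0});

        // execAndReturn applies the op in place on the array it is given,
        // so dup() is used here to leave `in` untouched.
        INDArray dLdz = Nd4j.getExecutioner().execAndReturn(new LeakyReLUDerivative(in.dup(), alpha));

        System.out.println(dLdz); // expected: [0.01, 0.01, 1.0, 1.0]
    }
}

In the backprop method above, this elementwise derivative is then multiplied by the incoming gradient epsilon (the chain rule), which is exactly what the muli(epsilon) call does.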
