
Example 66 with SDVariable

Use of org.nd4j.autodiff.samediff.SDVariable in project nd4j by deeplearning4j: the doDiff method of the CosineSimilarity class.

public static List<SDVariable> doDiff(SameDiff sameDiff, DifferentialFunctionFactory f, SDVariable x, SDVariable y, SDVariable gradOut, int... dimensions) {
    SDVariable a = sameDiff.sum(x.mul(y), dimensions);
    SDVariable l2x = f.norm2(x, dimensions);
    SDVariable l2y = f.norm2(y, dimensions);
    SDVariable b = l2x.mul(l2y);
    int origRank = Shape.rankFromShape(x.getShape());
    SDVariable broadcastableA = f.reductionBroadcastableWithOrigShape(origRank, dimensions, a);
    SDVariable broadcastableB = f.reductionBroadcastableWithOrigShape(origRank, dimensions, b);
    SDVariable broadcastableL2xSq = f.reductionBroadcastableWithOrigShape(origRank, dimensions, sameDiff.square(l2x));
    SDVariable broadcastableL2ySq = f.reductionBroadcastableWithOrigShape(origRank, dimensions, sameDiff.square(l2y));
    SDVariable broadcastableGrad = f.reductionBroadcastableWithOrigShape(origRank, dimensions, gradOut);
    SDVariable dcdx = y.sub(x.mul(broadcastableA).div(broadcastableL2xSq)).div(broadcastableB);
    SDVariable dcdy = x.sub(y.mul(broadcastableA).div(broadcastableL2ySq)).div(broadcastableB);
    return Arrays.asList(dcdx.mul(broadcastableGrad), dcdy.mul(broadcastableGrad));
}
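The closed-form gradient above follows from c = a/b with a = Σ xᵢyᵢ and b = ‖x‖‖y‖, giving dc/dxᵢ = (yᵢ − xᵢ·a/‖x‖²)/b. As a sanity check, here is a plain-Java sketch (no nd4j; the class and method names are illustrative) that compares this closed form against a central finite difference:

```java
// Finite-difference check of the cosine-similarity gradient used above:
// c = sum(x*y) / (||x|| * ||y||),  dc/dx_i = (y_i - x_i * a / ||x||^2) / b
public class CosineGradCheck {
    static double cos(double[] x, double[] y) {
        double a = 0, nx = 0, ny = 0;
        for (int i = 0; i < x.length; i++) { a += x[i] * y[i]; nx += x[i] * x[i]; ny += y[i] * y[i]; }
        return a / (Math.sqrt(nx) * Math.sqrt(ny));
    }

    // Closed-form dc/dx_i, mirroring dcdx in doDiff (with gradOut == 1)
    static double[] dcdx(double[] x, double[] y) {
        double a = 0, nx = 0, ny = 0;
        for (int i = 0; i < x.length; i++) { a += x[i] * y[i]; nx += x[i] * x[i]; ny += y[i] * y[i]; }
        double b = Math.sqrt(nx) * Math.sqrt(ny);
        double[] g = new double[x.length];
        for (int i = 0; i < x.length; i++) g[i] = (y[i] - x[i] * a / nx) / b;
        return g;
    }

    public static void main(String[] args) {
        double[] x = {1, 2, 3}, y = {4, -5, 6};
        double[] g = dcdx(x, y);
        double eps = 1e-6;
        for (int i = 0; i < x.length; i++) {
            double[] xp = x.clone(); xp[i] += eps;
            double[] xm = x.clone(); xm[i] -= eps;
            double fd = (cos(xp, y) - cos(xm, y)) / (2 * eps);
            if (Math.abs(fd - g[i]) > 1e-6) throw new AssertionError("mismatch at index " + i);
        }
        System.out.println("cosine gradient matches finite difference");
    }
}
```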
Also used: SDVariable (org.nd4j.autodiff.samediff.SDVariable)

Example 67 with SDVariable

Use of org.nd4j.autodiff.samediff.SDVariable in project nd4j by deeplearning4j: the doDiff method of the CumSum class.

@Override
public List<SDVariable> doDiff(List<SDVariable> grad) {
    // Output gradient is the reversed cumulative sum of the reversed input gradient
    SDVariable gradient = sameDiff.setupFunction(grad.get(0));
    SDVariable reverseGrad = sameDiff.reverse(gradient, 1 - dimensions[0]);
    SDVariable ret = sameDiff.cumsum(reverseGrad, exclusive, reverse, dimensions);
    SDVariable reversedRet = sameDiff.reverse(ret, 1 - dimensions[0]);
    return Arrays.asList(reversedRet);
}
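The reverse-cumsum-reverse trick works because if y = cumsum(x), each xᵢ feeds every output at or after position i, so dL/dxᵢ = Σⱼ≥ᵢ gⱼ. A plain-Java sketch of the 1-D case (no nd4j; names here are illustrative):

```java
// Plain-Java sketch of the cumsum backward pass shown above (1-D case):
// if y = cumsum(x), then dL/dx_i = sum over j >= i of dL/dy_j,
// which equals reverse(cumsum(reverse(grad))).
public class CumSumGrad {
    static double[] reverse(double[] a) {
        double[] r = new double[a.length];
        for (int i = 0; i < a.length; i++) r[i] = a[a.length - 1 - i];
        return r;
    }

    static double[] cumsum(double[] a) {
        double[] c = new double[a.length];
        double s = 0;
        for (int i = 0; i < a.length; i++) { s += a[i]; c[i] = s; }
        return c;
    }

    static double[] cumsumBackward(double[] grad) {
        return reverse(cumsum(reverse(grad)));
    }

    public static void main(String[] args) {
        // grad = [1, 1, 1] -> dL/dx = [3, 2, 1]:
        // x[0] feeds every output, x[2] only the last one
        double[] dx = cumsumBackward(new double[]{1, 1, 1});
        System.out.println(java.util.Arrays.toString(dx)); // [3.0, 2.0, 1.0]
    }
}
```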

Example 68 with SDVariable

Use of org.nd4j.autodiff.samediff.SDVariable in project nd4j by deeplearning4j: the doDiff method of the Max class.

@Override
public List<SDVariable> doDiff(List<SDVariable> i_v1) {
    // TODO do we need to handle the "multiple equal maximums" case?
    // TODO code duplication (min/max)
    SDVariable out = outputVariables()[0];
    int origRank = Shape.rankFromShape(arg().getShape());
    SDVariable expandedOut = sameDiff.f().reductionBroadcastableWithOrigShape(origRank, dimensions, out);
    expandedOut = sameDiff.onesLike(arg()).mul(expandedOut);
    SDVariable expandedGrad = sameDiff.f().reductionBroadcastableWithOrigShape(origRank, dimensions, i_v1.get(0));
    SDVariable eq = sameDiff.eq(arg(), expandedOut);
    SDVariable ret = eq.mul(expandedGrad);
    return Arrays.asList(ret);
}
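The eq-and-multiply pattern routes the incoming gradient only to positions that equal the reduction's output. A plain-Java sketch of the 1-D case (no nd4j; names here are illustrative):

```java
// Plain-Java sketch of the max-reduction backward pass above (1-D case):
// the incoming gradient flows only to positions equal to the max.
public class MaxGrad {
    static double[] maxBackward(double[] x, double gradOut) {
        double max = Double.NEGATIVE_INFINITY;
        for (double v : x) max = Math.max(max, v);
        double[] g = new double[x.length];
        for (int i = 0; i < x.length; i++)
            g[i] = (x[i] == max) ? gradOut : 0.0;  // eq(arg, expandedOut) * expandedGrad
        return g;
    }

    public static void main(String[] args) {
        double[] g = maxBackward(new double[]{1, 3, 2}, 1.0);
        System.out.println(java.util.Arrays.toString(g)); // [0.0, 1.0, 0.0]
    }
}
```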

Example 69 with SDVariable

Use of org.nd4j.autodiff.samediff.SDVariable in project nd4j by deeplearning4j: the doDiff method of the Mean class.

@Override
public List<SDVariable> doDiff(List<SDVariable> i_v1) {
    // If out = mean(in), then dL/dIn = 1/N * dL/dOut  (broadcast to appropriate shape)
    // Note that N differs for "along dimension" vs. "whole array" reduce cases
    int n = f().getReductionLength(this);
    int rank = Shape.rankFromShape(arg().getShape());
    SDVariable broadcastableGrad = f().reductionBroadcastableWithOrigShape(rank, dimensions, i_v1.get(0));
    // 1/N with shape equal to input
    SDVariable ret = sameDiff.onesLike(arg()).div(n);
    ret = ret.mul(broadcastableGrad);
    return Arrays.asList(ret);
}
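Since each input contributes with weight 1/N to the mean, every input receives gradOut/N. A plain-Java sketch of the whole-array case (no nd4j; names here are illustrative):

```java
// Plain-Java sketch of the mean backward pass above (whole-array case):
// each of the N inputs contributes 1/N to the output, so dL/dx_i = gradOut / N.
public class MeanGrad {
    static double[] meanBackward(int n, double gradOut) {
        double[] g = new double[n];
        java.util.Arrays.fill(g, gradOut / n);  // onesLike(arg).div(n).mul(grad)
        return g;
    }

    public static void main(String[] args) {
        System.out.println(java.util.Arrays.toString(meanBackward(4, 1.0))); // [0.25, 0.25, 0.25, 0.25]
    }
}
```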

Example 70 with SDVariable

Use of org.nd4j.autodiff.samediff.SDVariable in project nd4j by deeplearning4j: the doDiff method of the Min class.

@Override
public List<SDVariable> doDiff(List<SDVariable> i_v1) {
    // TODO do we need to handle the "multiple equal minimums" case?
    // TODO code duplication (min/max)
    SDVariable out = outputVariables()[0];
    int origRank = Shape.rankFromShape(arg().getShape());
    SDVariable expandedOut = sameDiff.f().reductionBroadcastableWithOrigShape(origRank, dimensions, out);
    expandedOut = sameDiff.onesLike("temp0", arg()).mul("tempmul", expandedOut);
    SDVariable expandedGrad = sameDiff.f().reductionBroadcastableWithOrigShape(origRank, dimensions, i_v1.get(0));
    SDVariable eq = sameDiff.eq(arg(), expandedOut);
    SDVariable ret = eq.mul(expandedGrad);
    return Arrays.asList(ret);
}
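A plain-Java sketch (no nd4j; names here are illustrative) exercising the "multiple equal minimums" case flagged in the TODO: the eq mask marks every tied minimum, so each tied element receives the full incoming gradient, and the per-element gradients sum to more than gradOut.

```java
// Plain-Java sketch of the min backward pass above (1-D case), showing
// the tied-minimum behavior: eq(arg, expandedOut) is 1 at every minimum.
public class MinGrad {
    static double[] minBackward(double[] x, double gradOut) {
        double min = Double.POSITIVE_INFINITY;
        for (double v : x) min = Math.min(min, v);
        double[] g = new double[x.length];
        for (int i = 0; i < x.length; i++)
            g[i] = (x[i] == min) ? gradOut : 0.0;
        return g;
    }

    public static void main(String[] args) {
        // Two tied minima: both receive gradOut, so the gradients sum to 2 * gradOut
        double[] g = minBackward(new double[]{1, 1, 3}, 1.0);
        System.out.println(java.util.Arrays.toString(g)); // [1.0, 1.0, 0.0]
    }
}
```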

Aggregations

- SDVariable (org.nd4j.autodiff.samediff.SDVariable): 104
- SameDiff (org.nd4j.autodiff.samediff.SameDiff): 41
- INDArray (org.nd4j.linalg.api.ndarray.INDArray): 38
- Test (org.junit.Test): 36
- ArrayList (java.util.ArrayList): 18
- DynamicCustomOp (org.nd4j.linalg.api.ops.DynamicCustomOp): 10
- lombok.val (lombok.val): 7
- LossFunctions (org.nd4j.autodiff.loss.LossFunctions): 4
- LossInfo (org.nd4j.autodiff.loss.LossInfo): 4
- BernoulliDistribution (org.nd4j.linalg.api.ops.random.impl.BernoulliDistribution): 4
- Ignore (org.junit.Ignore): 3
- DifferentialFunction (org.nd4j.autodiff.functions.DifferentialFunction): 3
- ND4JIllegalStateException (org.nd4j.linalg.exception.ND4JIllegalStateException): 3
- Triple (org.nd4j.linalg.primitives.Triple): 2
- DataOutputStream (java.io.DataOutputStream): 1
- FileOutputStream (java.io.FileOutputStream): 1
- ByteBuffer (java.nio.ByteBuffer): 1
- NoOpNameFoundException (org.nd4j.imports.NoOpNameFoundException): 1
- NdIndexIterator (org.nd4j.linalg.api.iter.NdIndexIterator): 1
- TruncateDivOp (org.nd4j.linalg.api.ops.impl.transforms.arithmetic.TruncateDivOp): 1