
Example 16 with BroadcastMulOp

Use of org.nd4j.linalg.api.ops.impl.broadcast.BroadcastMulOp in project nd4j by deeplearning4j.

From class StandardizeStrategy, method revert:

/**
 * Denormalize a data array
 *
 * @param array the data to denormalize
 * @param stats statistics of the data population
 */
@Override
public void revert(INDArray array, INDArray maskArray, DistributionStats stats) {
    if (array.rank() <= 2) {
        array.muliRowVector(filteredStd(stats));
        array.addiRowVector(stats.getMean());
    } else {
        // Rank-3 (time series) input: broadcast the per-feature std and mean along dimension 1
        Nd4j.getExecutioner().execAndReturn(new BroadcastMulOp(array, filteredStd(stats), array, 1));
        Nd4j.getExecutioner().execAndReturn(new BroadcastAddOp(array, stats.getMean(), array, 1));
    }
    if (maskArray != null) {
        DataSetUtil.setMaskedValuesToZero(array, maskArray);
    }
}
Also used: BroadcastAddOp (org.nd4j.linalg.api.ops.impl.broadcast.BroadcastAddOp), BroadcastMulOp (org.nd4j.linalg.api.ops.impl.broadcast.BroadcastMulOp)
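
For rank-3 (time series) input the per-feature vectors cannot be applied with muliRowVector/addiRowVector, so the same scaling is expressed as broadcast ops along dimension 1, the feature axis. Below is a minimal standalone sketch of that else branch, assuming a [minibatch, features, timeSteps] layout and made-up std/mean vectors (the real class derives both from DistributionStats, and filteredStd guards against zero standard deviations):

INDArray array = Nd4j.rand(new int[] { 4, 3, 10 });           // rank 3, so the broadcast path is taken
INDArray std = Nd4j.create(new double[] { 0.5, 2.0, 1.5 });   // hypothetical per-feature standard deviations
INDArray mean = Nd4j.create(new double[] { 1.0, -1.0, 0.0 }); // hypothetical per-feature means
// Scale every [minibatch, timeSteps] slice of each feature by that feature's std, then shift by its mean
Nd4j.getExecutioner().execAndReturn(new BroadcastMulOp(array, std, array, 1));
Nd4j.getExecutioner().execAndReturn(new BroadcastAddOp(array, mean, array, 1));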

Example 17 with BroadcastMulOp

Use of org.nd4j.linalg.api.ops.impl.broadcast.BroadcastMulOp in project nd4j by deeplearning4j.

From class MinMaxStrategy, method revert:

/**
 * Denormalize a data array
 *
 * @param array the data to denormalize
 * @param stats statistics of the data population
 */
@Override
public void revert(INDArray array, INDArray maskArray, MinMaxStats stats) {
    // Undo the shift into the target range (subtract the target minimum)
    array.subi(minRange);
    // Undo the scale into the target range (divide by the target width)
    array.divi(maxRange - minRange);
    if (array.rank() <= 2) {
        array.muliRowVector(stats.getRange());
        array.addiRowVector(stats.getLower());
    } else {
        // Rank-3 (time series) input: broadcast the per-feature range and lower bound along dimension 1
        Nd4j.getExecutioner().execAndReturn(new BroadcastMulOp(array, stats.getRange(), array, 1));
        Nd4j.getExecutioner().execAndReturn(new BroadcastAddOp(array, stats.getLower(), array, 1));
    }
    if (maskArray != null) {
        DataSetUtil.setMaskedValuesToZero(array, maskArray);
    }
}
Also used: BroadcastAddOp (org.nd4j.linalg.api.ops.impl.broadcast.BroadcastAddOp), BroadcastMulOp (org.nd4j.linalg.api.ops.impl.broadcast.BroadcastMulOp)
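
The revert simply runs the min-max mapping backwards: undo the shift and scale into the target range, then restore each feature's original spread and lower bound. A short numeric sketch with hypothetical values, assuming a target range of [0, 1] (minRange = 0, maxRange = 1) and the rank-2 row-vector path:

double minRange = 0.0, maxRange = 1.0;
INDArray normalized = Nd4j.create(new double[][] { { 0.25, 0.5 }, { 1.0, 0.0 } });
INDArray lower = Nd4j.create(new double[] { 0.0, 10.0 });   // hypothetical per-feature minimum
INDArray range = Nd4j.create(new double[] { 4.0, 20.0 });   // hypothetical per-feature (max - min)

normalized.subi(minRange);               // undo the shift into the target range
normalized.divi(maxRange - minRange);    // undo the scale into the target range
normalized.muliRowVector(range);         // restore each feature's original spread
normalized.addiRowVector(lower);         // restore each feature's original offset
// normalized is now { { 1.0, 20.0 }, { 4.0, 10.0 } }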

Example 18 with BroadcastMulOp

Use of org.nd4j.linalg.api.ops.impl.broadcast.BroadcastMulOp in project nd4j by deeplearning4j.

From class OpExecutionerTestsC, method testBroadcastMultiDim:

@Test
public void testBroadcastMultiDim() {
    INDArray data = Nd4j.linspace(1, 30, 30).reshape(2, 3, 5);
    System.out.println(data);
    INDArray mask = Nd4j.create(new double[][] { { 1.00, 1.00, 1.00, 1.00, 1.00 }, { 1.00, 1.00, 1.00, 0.00, 0.00 } });
    // Broadcast the [2, 5] mask along dimensions {0, 2} of the [2, 3, 5] data, in place (z == x)
    Nd4j.getExecutioner().exec(new BroadcastMulOp(data, mask, data, 0, 2));
    INDArray assertion = Nd4j.create(new double[] { 1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0, 10.0, 11.0, 12.0, 13.0, 14.0, 15.0, 16.0, 17.0, 18.0, 0.0, 0.0, 21.0, 22.0, 23.0, 0.0, 0.0, 26.0, 27.0, 28.0, 0.0, 0.0 }).reshape(2, 3, 5);
    assertEquals(assertion, data);
}
Also used: INDArray (org.nd4j.linalg.api.ndarray.INDArray), BroadcastMulOp (org.nd4j.linalg.api.ops.impl.broadcast.BroadcastMulOp), BaseNd4jTest (org.nd4j.linalg.BaseNd4jTest), Test (org.junit.Test)
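
The two trailing arguments {0, 2} name the axes of the rank-3 data that the [2, 5] mask lines up with; dimension 1 is the axis being broadcast over, so the same mask value multiplies all three entries at a given [i, k]. A sketch of that element-wise meaning, assuming the same data and mask as the test above and the usual Nd4j/JUnit imports (the explicit loop is only illustrative; the op itself runs natively):

INDArray data = Nd4j.linspace(1, 30, 30).reshape(2, 3, 5);
INDArray mask = Nd4j.create(new double[][] { { 1.00, 1.00, 1.00, 1.00, 1.00 }, { 1.00, 1.00, 1.00, 0.00, 0.00 } });

// expected[i][j][k] = data[i][j][k] * mask[i][k]; the mask index skips dimension 1
INDArray expected = Nd4j.create(2, 3, 5);
for (int i = 0; i < 2; i++)
    for (int j = 0; j < 3; j++)
        for (int k = 0; k < 5; k++)
            expected.putScalar(new int[] { i, j, k }, data.getDouble(i, j, k) * mask.getDouble(i, k));

Nd4j.getExecutioner().exec(new BroadcastMulOp(data, mask, data, 0, 2));
assertEquals(expected, data);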

Aggregations

BroadcastMulOp (org.nd4j.linalg.api.ops.impl.broadcast.BroadcastMulOp): 18 usages
INDArray (org.nd4j.linalg.api.ndarray.INDArray): 16 usages
BroadcastAddOp (org.nd4j.linalg.api.ops.impl.broadcast.BroadcastAddOp): 9 usages
Test (org.junit.Test): 7 usages
BroadcastDivOp (org.nd4j.linalg.api.ops.impl.broadcast.BroadcastDivOp): 6 usages
MultiLayerConfiguration (org.deeplearning4j.nn.conf.MultiLayerConfiguration): 4 usages
MultiLayerNetwork (org.deeplearning4j.nn.multilayer.MultiLayerNetwork): 4 usages
Pair (org.deeplearning4j.berkeley.Pair): 3 usages
BroadcastCopyOp (org.nd4j.linalg.api.ops.impl.broadcast.BroadcastCopyOp): 3 usages
IsMax (org.nd4j.linalg.api.ops.impl.transforms.IsMax): 3 usages
Gradient (org.deeplearning4j.nn.gradient.Gradient): 2 usages
BroadcastSubOp (org.nd4j.linalg.api.ops.impl.broadcast.BroadcastSubOp): 2 usages
Layer (org.deeplearning4j.nn.api.Layer): 1 usage
DefaultGradient (org.deeplearning4j.nn.gradient.DefaultGradient): 1 usage
BaseNd4jTest (org.nd4j.linalg.BaseNd4jTest): 1 usage