Use of org.nd4j.linalg.api.blas.params.MMulTranspose in project nd4j by deeplearning4j.
From the class Nd4jTestsC, method testMmulOp:
@Test
public void testMmulOp() {
    INDArray arr = Nd4j.create(new double[][] { { 1, 2, 3 }, { 4, 5, 6 } });
    INDArray z = Nd4j.create(2, 2);
    INDArray assertion = Nd4j.create(new double[][] { { 14, 32 }, { 32, 77 } });
    // transposeB(true) makes the op compute arr x arr^T (2x3 times 3x2 -> 2x2).
    MMulTranspose mMulTranspose = MMulTranspose.builder().transposeB(true).a(arr).b(arr).build();
    DynamicCustomOp op = new Mmul(arr, arr, z, mMulTranspose);
    Nd4j.getExecutioner().exec(op);
    assertEquals(getFailureMessage(), assertion, z);
}
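For reference, the same product can be checked without MMulTranspose by transposing the second operand explicitly. The following is a minimal standalone sketch using the standard INDArray API; it is not part of the test above.

import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

public class MmulTransposeSketch {
    public static void main(String[] args) {
        // Same 2x3 input as in testMmulOp above.
        INDArray arr = Nd4j.create(new double[][] { { 1, 2, 3 }, { 4, 5, 6 } });

        // transposeB(true) in the op above is equivalent to transposing the
        // second operand up front: arr (2x3) times arr^T (3x2) gives a 2x2 result.
        INDArray expected = arr.mmul(arr.transpose());

        // Prints [[14, 32], [32, 77]], matching the assertion in the test.
        System.out.println(expected);
    }
}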
Use of org.nd4j.linalg.api.blas.params.MMulTranspose in project nd4j by deeplearning4j.
From the class TensorMmul, method initFromTensorFlow:
@Override
public void initFromTensorFlow(NodeDef nodeDef, SameDiff initWith, Map<String, AttrValue> attributesForNode, GraphDef graph) {
    super.initFromTensorFlow(nodeDef, initWith, attributesForNode, graph);
    /**
     * Example of the TensorFlow node being imported:
     * name: "MatMul"
     * op: "MatMul"
     * input: "input"
     * input: "Variable/read"
     * attr {
     *   key: "transpose_b"
     *   value {
     *     b: false
     *   }
     * }
     * attr {
     *   key: "transpose_a"
     *   value {
     *     b: false
     *   }
     * }
     * attr {
     *   key: "T"
     *   value {
     *     type: DT_FLOAT
     *   }
     * }
     */
    // Map the TensorFlow transpose_a/transpose_b attributes onto an MMulTranspose descriptor.
    val isTransposeA = attributesForNode.get("transpose_a").getB();
    val isTransposeB = attributesForNode.get("transpose_b").getB();
    MMulTranspose mMulTranspose = MMulTranspose.builder().transposeA(isTransposeA).transposeB(isTransposeB).build();
    this.mMulTranspose = mMulTranspose;
    val args = args();
}
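Note that attributesForNode.get(...) returns null when an attribute is absent. The sketch below shows a defensive variant of the lookup; transposeFlag is a hypothetical helper, not part of the nd4j codebase, and it assumes a missing transpose_a/transpose_b attribute should default to false, which matches TensorFlow's MatMul defaults.

import java.util.Map;

import org.nd4j.linalg.api.blas.params.MMulTranspose;
import org.tensorflow.framework.AttrValue;

// Hypothetical helper (not in nd4j): read a boolean attribute, defaulting
// to false when it is absent.
static boolean transposeFlag(Map<String, AttrValue> attrs, String key) {
    AttrValue value = attrs.get(key);
    return value != null && value.getB();
}

// Usage inside initFromTensorFlow:
MMulTranspose mMulTranspose = MMulTranspose.builder()
        .transposeA(transposeFlag(attributesForNode, "transpose_a"))
        .transposeB(transposeFlag(attributesForNode, "transpose_b"))
        .build();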