
Example 76 with MultiLayerConfiguration

use of org.deeplearning4j.nn.conf.MultiLayerConfiguration in project deeplearning4j by deeplearning4j.

the class TestCustomLayers method checkInitializationFF.

@Test
public void checkInitializationFF() {
    //Actually create a network with a custom layer; check initialization and forward pass
    MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder().learningRate(0.1).list()
                    .layer(0, new DenseLayer.Builder().nIn(9).nOut(10).build())
                    .layer(1, new CustomLayer(3.14159)) //hard-coded nIn/nOut of 10
                    .layer(2, new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT).nIn(10).nOut(11).build())
                    .pretrain(false).backprop(true).build();
    MultiLayerNetwork net = new MultiLayerNetwork(conf);
    net.init();
    assertEquals(9 * 10 + 10, net.getLayer(0).numParams());
    assertEquals(10 * 10 + 10, net.getLayer(1).numParams());
    assertEquals(10 * 11 + 11, net.getLayer(2).numParams());
    //Check for exceptions...
    net.output(Nd4j.rand(1, 9));
    net.fit(new DataSet(Nd4j.rand(1, 9), Nd4j.rand(1, 11)));
}
Also used : OutputLayer(org.deeplearning4j.nn.conf.layers.OutputLayer) CustomOutputLayer(org.deeplearning4j.nn.layers.custom.testclasses.CustomOutputLayer) CustomLayer(org.deeplearning4j.nn.layers.custom.testclasses.CustomLayer) MultiLayerConfiguration(org.deeplearning4j.nn.conf.MultiLayerConfiguration) DenseLayer(org.deeplearning4j.nn.conf.layers.DenseLayer) DataSet(org.nd4j.linalg.dataset.DataSet) MultiLayerNetwork(org.deeplearning4j.nn.multilayer.MultiLayerNetwork) Test(org.junit.Test)
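
The expected values in the three assertions follow the usual fully connected parameter count, nIn * nOut weights plus nOut biases (the custom layer uses its hard-coded nIn/nOut of 10). A minimal sketch of that arithmetic; the helper name expectedDenseParams is illustrative, not part of the test:

static long expectedDenseParams(long nIn, long nOut) {
    //One weight per input-output pair, plus one bias per output unit
    return nIn * nOut + nOut;
}
//For the network above: 9 * 10 + 10 = 100, 10 * 10 + 10 = 110, 10 * 11 + 11 = 121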

Example 77 with MultiLayerConfiguration

use of org.deeplearning4j.nn.conf.MultiLayerConfiguration in project deeplearning4j by deeplearning4j.

the class TestCustomLayers method testJsonMultiLayerNetwork.

@Test
public void testJsonMultiLayerNetwork() {
    //First: Ensure that the CustomLayer class is registered
    ObjectMapper mapper = NeuralNetConfiguration.mapper();
    AnnotatedClass ac = AnnotatedClass.construct(Layer.class, mapper.getSerializationConfig().getAnnotationIntrospector(), null);
    Collection<NamedType> types = mapper.getSubtypeResolver().collectAndResolveSubtypes(ac, mapper.getSerializationConfig(), mapper.getSerializationConfig().getAnnotationIntrospector());
    Set<Class<?>> registeredSubtypes = new HashSet<>();
    boolean found = false;
    for (NamedType nt : types) {
        System.out.println(nt);
        //            registeredSubtypes.add(nt.getType());
        if (nt.getType() == CustomLayer.class)
            found = true;
    }
    assertTrue("CustomLayer: not registered with NeuralNetConfiguration mapper", found);
    //Second: create a MultiLayerConfiguration containing a CustomLayer, and check that the JSON and YAML round trips actually work
    MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder().learningRate(0.1).list()
                    .layer(0, new DenseLayer.Builder().nIn(10).nOut(10).build())
                    .layer(1, new CustomLayer(3.14159))
                    .layer(2, new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT).nIn(10).nOut(10).build())
                    .pretrain(false).backprop(true).build();
    String json = conf.toJson();
    String yaml = conf.toYaml();
    System.out.println(json);
    MultiLayerConfiguration confFromJson = MultiLayerConfiguration.fromJson(json);
    assertEquals(conf, confFromJson);
    MultiLayerConfiguration confFromYaml = MultiLayerConfiguration.fromYaml(yaml);
    assertEquals(conf, confFromYaml);
}
Also used : OutputLayer(org.deeplearning4j.nn.conf.layers.OutputLayer) CustomOutputLayer(org.deeplearning4j.nn.layers.custom.testclasses.CustomOutputLayer) CustomLayer(org.deeplearning4j.nn.layers.custom.testclasses.CustomLayer) NamedType(org.nd4j.shade.jackson.databind.jsontype.NamedType) MultiLayerConfiguration(org.deeplearning4j.nn.conf.MultiLayerConfiguration) DenseLayer(org.deeplearning4j.nn.conf.layers.DenseLayer) AnnotatedClass(org.nd4j.shade.jackson.databind.introspect.AnnotatedClass) ObjectMapper(org.nd4j.shade.jackson.databind.ObjectMapper) HashSet(java.util.HashSet) Test(org.junit.Test)
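
The same toJson()/fromJson() round trip is what you would use to persist a configuration outside of a test. A minimal, self-contained sketch (the class name ConfJsonRoundTrip and the file name conf.json are illustrative; the custom layer is left out so the sketch needs no extra registration):

import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class ConfJsonRoundTrip {
    public static void main(String[] args) throws Exception {
        //Build a small configuration mirroring the test above (custom layer omitted)
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder().learningRate(0.1).list()
                        .layer(0, new DenseLayer.Builder().nIn(10).nOut(10).build())
                        .layer(1, new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT).nIn(10).nOut(10).build())
                        .pretrain(false).backprop(true).build();

        //Write the JSON form to disk and read it back
        Path path = Paths.get("conf.json");
        Files.write(path, conf.toJson().getBytes(StandardCharsets.UTF_8));
        String json = new String(Files.readAllBytes(path), StandardCharsets.UTF_8);

        MultiLayerConfiguration restored = MultiLayerConfiguration.fromJson(json);
        System.out.println(conf.equals(restored)); //expect true
    }
}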

Example 78 with MultiLayerConfiguration

use of org.deeplearning4j.nn.conf.MultiLayerConfiguration in project deeplearning4j by deeplearning4j.

the class OutputLayerTest method testOutputLayersRnnForwardPass.

@Test
public void testOutputLayersRnnForwardPass() {
    //Test output layer with RNNs
    //Expect all outputs etc. to be 2d
    int nIn = 2;
    int nOut = 5;
    int layerSize = 4;
    int timeSeriesLength = 6;
    int miniBatchSize = 3;
    Random r = new Random(12345L);
    INDArray input = Nd4j.zeros(miniBatchSize, nIn, timeSeriesLength);
    for (int i = 0; i < miniBatchSize; i++) {
        for (int j = 0; j < nIn; j++) {
            for (int k = 0; k < timeSeriesLength; k++) {
                input.putScalar(new int[] { i, j, k }, r.nextDouble() - 0.5);
            }
        }
    }
    MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder().seed(12345L).list()
                    .layer(0, new GravesLSTM.Builder().nIn(nIn).nOut(layerSize)
                                    .weightInit(WeightInit.DISTRIBUTION).dist(new NormalDistribution(0, 1))
                                    .activation(Activation.TANH).updater(Updater.NONE).build())
                    .layer(1, new org.deeplearning4j.nn.conf.layers.OutputLayer.Builder(LossFunction.MCXENT)
                                    .activation(Activation.SOFTMAX).nIn(layerSize).nOut(nOut)
                                    .weightInit(WeightInit.DISTRIBUTION).dist(new NormalDistribution(0, 1))
                                    .updater(Updater.NONE).build())
                    .inputPreProcessor(1, new RnnToFeedForwardPreProcessor()).build();
    MultiLayerNetwork mln = new MultiLayerNetwork(conf);
    mln.init();
    INDArray out2d = mln.feedForward(input).get(2);
    assertArrayEquals(out2d.shape(), new int[] { miniBatchSize * timeSeriesLength, nOut });
    INDArray out = mln.output(input);
    assertArrayEquals(out.shape(), new int[] { miniBatchSize * timeSeriesLength, nOut });
    INDArray act = mln.activate();
    assertArrayEquals(act.shape(), new int[] { miniBatchSize * timeSeriesLength, nOut });
    INDArray preout = mln.preOutput(input);
    assertArrayEquals(preout.shape(), new int[] { miniBatchSize * timeSeriesLength, nOut });
    //As above, but for RnnOutputLayer. Expect all activations etc. to be 3d
    MultiLayerConfiguration confRnn = new NeuralNetConfiguration.Builder().seed(12345L).list()
                    .layer(0, new GravesLSTM.Builder().nIn(nIn).nOut(layerSize)
                                    .weightInit(WeightInit.DISTRIBUTION).dist(new NormalDistribution(0, 1))
                                    .activation(Activation.TANH).updater(Updater.NONE).build())
                    .layer(1, new org.deeplearning4j.nn.conf.layers.RnnOutputLayer.Builder(LossFunction.MCXENT)
                                    .activation(Activation.SOFTMAX).nIn(layerSize).nOut(nOut)
                                    .weightInit(WeightInit.DISTRIBUTION).dist(new NormalDistribution(0, 1))
                                    .updater(Updater.NONE).build())
                    .build();
    MultiLayerNetwork mlnRnn = new MultiLayerNetwork(confRnn);
    mlnRnn.init();
    INDArray out3d = mlnRnn.feedForward(input).get(2);
    assertArrayEquals(out3d.shape(), new int[] { miniBatchSize, nOut, timeSeriesLength });
    INDArray outRnn = mlnRnn.output(input);
    assertArrayEquals(outRnn.shape(), new int[] { miniBatchSize, nOut, timeSeriesLength });
    INDArray actRnn = mlnRnn.activate();
    assertArrayEquals(actRnn.shape(), new int[] { miniBatchSize, nOut, timeSeriesLength });
    INDArray preoutRnn = mlnRnn.preOutput(input);
    assertArrayEquals(preoutRnn.shape(), new int[] { miniBatchSize, nOut, timeSeriesLength });
}
Also used : RnnOutputLayer(org.deeplearning4j.nn.layers.recurrent.RnnOutputLayer) NeuralNetConfiguration(org.deeplearning4j.nn.conf.NeuralNetConfiguration) RnnToFeedForwardPreProcessor(org.deeplearning4j.nn.conf.preprocessor.RnnToFeedForwardPreProcessor) MultiLayerConfiguration(org.deeplearning4j.nn.conf.MultiLayerConfiguration) Random(java.util.Random) INDArray(org.nd4j.linalg.api.ndarray.INDArray) GravesLSTM(org.deeplearning4j.nn.conf.layers.GravesLSTM) NormalDistribution(org.deeplearning4j.nn.conf.distribution.NormalDistribution) MultiLayerNetwork(org.deeplearning4j.nn.multilayer.MultiLayerNetwork) Test(org.junit.Test)
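
The 2d shapes asserted for the first network come from the RnnToFeedForwardPreProcessor, which flattens [miniBatchSize, layerSize, timeSeriesLength] activations into [miniBatchSize * timeSeriesLength, layerSize] rows before the feed-forward output layer, while the RnnOutputLayer variant keeps the 3d shape. A rough, hand-rolled illustration of that flattening using the test's sizes; this is not the preprocessor's actual implementation:

import java.util.Arrays;

import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

public class RnnFlattenSketch {
    public static void main(String[] args) {
        int miniBatchSize = 3, layerSize = 4, timeSeriesLength = 6;
        //[miniBatch, layerSize, timeSteps] activations, as an RNN layer would emit them
        INDArray rnnActivations = Nd4j.rand(new int[] { miniBatchSize, layerSize, timeSeriesLength });
        //Move the time dimension next to the batch dimension, then collapse the two into rows,
        //so a feed-forward layer sees one row per (example, time step) pair
        INDArray asRows = rnnActivations.permute(0, 2, 1).dup('c')
                        .reshape(miniBatchSize * timeSeriesLength, layerSize);
        System.out.println(Arrays.toString(asRows.shape())); //[18, 4]
    }
}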

Example 79 with MultiLayerConfiguration

use of org.deeplearning4j.nn.conf.MultiLayerConfiguration in project deeplearning4j by deeplearning4j.

the class ConvolutionLayerSetupTest method testMnistLenet.

@Test
public void testMnistLenet() throws Exception {
    MultiLayerConfiguration.Builder incomplete = incompleteMnistLenet();
    incomplete.setInputType(InputType.convolutionalFlat(28, 28, 1));
    MultiLayerConfiguration testConf = incomplete.build();
    assertEquals(800, ((FeedForwardLayer) testConf.getConf(4).getLayer()).getNIn());
    assertEquals(500, ((FeedForwardLayer) testConf.getConf(5).getLayer()).getNIn());
    //test instantiation
    DataSetIterator iter = new MnistDataSetIterator(10, 10);
    MultiLayerNetwork network = new MultiLayerNetwork(testConf);
    network.init();
    network.fit(iter.next());
}
Also used : MultiLayerConfiguration(org.deeplearning4j.nn.conf.MultiLayerConfiguration) MnistDataSetIterator(org.deeplearning4j.datasets.iterator.impl.MnistDataSetIterator) MultiLayerNetwork(org.deeplearning4j.nn.multilayer.MultiLayerNetwork) DataSetIterator(org.nd4j.linalg.dataset.api.iterator.DataSetIterator) MnistDataSetIterator(org.deeplearning4j.datasets.iterator.impl.MnistDataSetIterator) RecordReaderDataSetIterator(org.deeplearning4j.datasets.datavec.RecordReaderDataSetIterator) Test(org.junit.Test)
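
The two getNIn() assertions work because setInputType(InputType.convolutionalFlat(28, 28, 1)) lets the builder infer feed-forward nIn values from the preceding convolutional layers. Since incompleteMnistLenet() itself is not shown here, the following is a minimal sketch of the same inference on a deliberately simplified architecture (layer sizes are illustrative; assumed imports are ConvolutionLayer, FeedForwardLayer, InputType, Activation and LossFunctions, as in the other examples):

    //Conv layer followed by an output layer whose nIn is deliberately left unset;
    //setInputType(...) adds the preprocessor and fills in nIn automatically
    MultiLayerConfiguration.Builder builder = new NeuralNetConfiguration.Builder().list()
                    .layer(0, new ConvolutionLayer.Builder(5, 5).nIn(1).nOut(20).build())
                    .layer(1, new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                                    .activation(Activation.SOFTMAX).nOut(10).build());
    builder.setInputType(InputType.convolutionalFlat(28, 28, 1));
    MultiLayerConfiguration conf = builder.build();
    //With default stride and no padding, a 5x5 kernel on 28x28x1 input gives 20 feature maps of 24x24,
    //so layer 1's nIn is inferred as 20 * 24 * 24 = 11520
    System.out.println(((FeedForwardLayer) conf.getConf(1).getLayer()).getNIn());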

Example 80 with MultiLayerConfiguration

use of org.deeplearning4j.nn.conf.MultiLayerConfiguration in project deeplearning4j by deeplearning4j.

the class ConvolutionLayerSetupTest method testConvolutionLayerSetup.

@Test
public void testConvolutionLayerSetup() {
    MultiLayerConfiguration.Builder builder = inComplete();
    new ConvolutionLayerSetup(builder, 28, 28, 1);
    MultiLayerConfiguration completed = complete().build();
    MultiLayerConfiguration test = builder.build();
    assertEquals(completed, test);
}
Also used : MultiLayerConfiguration(org.deeplearning4j.nn.conf.MultiLayerConfiguration) ConvolutionLayerSetup(org.deeplearning4j.nn.conf.layers.setup.ConvolutionLayerSetup) Test(org.junit.Test)
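
new ConvolutionLayerSetup(builder, 28, 28, 1) performs the same shape wiring that the previous example does with setInputType; in later revisions of the API, the input-type call is the usual replacement for this helper. Assuming the same inComplete() builder as in the test above, the equivalent would be a single call:

    //Equivalent of new ConvolutionLayerSetup(builder, 28, 28, 1) via the input-type mechanism
    builder.setInputType(InputType.convolutionalFlat(28, 28, 1));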

Aggregations

MultiLayerConfiguration (org.deeplearning4j.nn.conf.MultiLayerConfiguration): 245
Test (org.junit.Test): 225
MultiLayerNetwork (org.deeplearning4j.nn.multilayer.MultiLayerNetwork): 194
INDArray (org.nd4j.linalg.api.ndarray.INDArray): 132
NeuralNetConfiguration (org.deeplearning4j.nn.conf.NeuralNetConfiguration): 123
DataSet (org.nd4j.linalg.dataset.DataSet): 64
DataSetIterator (org.nd4j.linalg.dataset.api.iterator.DataSetIterator): 59
DenseLayer (org.deeplearning4j.nn.conf.layers.DenseLayer): 46
IrisDataSetIterator (org.deeplearning4j.datasets.iterator.impl.IrisDataSetIterator): 45
OutputLayer (org.deeplearning4j.nn.conf.layers.OutputLayer): 45
NormalDistribution (org.deeplearning4j.nn.conf.distribution.NormalDistribution): 42
ScoreIterationListener (org.deeplearning4j.optimize.listeners.ScoreIterationListener): 32
MnistDataSetIterator (org.deeplearning4j.datasets.iterator.impl.MnistDataSetIterator): 29
ConvolutionLayer (org.deeplearning4j.nn.conf.layers.ConvolutionLayer): 27
Random (java.util.Random): 26
DL4JException (org.deeplearning4j.exception.DL4JException): 20
BaseSparkTest (org.deeplearning4j.spark.BaseSparkTest): 18
InMemoryModelSaver (org.deeplearning4j.earlystopping.saver.InMemoryModelSaver): 17
MaxEpochsTerminationCondition (org.deeplearning4j.earlystopping.termination.MaxEpochsTerminationCondition): 17
SparkDl4jMultiLayer (org.deeplearning4j.spark.impl.multilayer.SparkDl4jMultiLayer): 17