
Example 6 with IrisDataSetIterator

Use of org.deeplearning4j.datasets.iterator.impl.IrisDataSetIterator in the deeplearning4j project.

From the class TestComputationGraphNetwork, method testPreTraining.

@Test
public void testPreTraining() {
    ComputationGraphConfiguration conf = new NeuralNetConfiguration.Builder()
                    .iterations(100)
                    .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT)
                    //the later .iterations(1) overrides the earlier .iterations(100)
                    .iterations(1)
                    .updater(Updater.SGD).learningRate(1e-6)
                    .regularization(true).l2(2e-4)
                    .graphBuilder()
                    .addInputs("in")
                    .addLayer("layer0", new RBM.Builder(RBM.HiddenUnit.GAUSSIAN, RBM.VisibleUnit.GAUSSIAN)
                                    .nIn(4).nOut(3)
                                    .weightInit(WeightInit.DISTRIBUTION).dist(new UniformDistribution(0, 1))
                                    .activation(Activation.TANH)
                                    .lossFunction(LossFunctions.LossFunction.KL_DIVERGENCE).build(), "in")
                    .addLayer("layer1", new RBM.Builder(RBM.HiddenUnit.GAUSSIAN, RBM.VisibleUnit.GAUSSIAN)
                                    .nIn(4).nOut(3)
                                    .weightInit(WeightInit.DISTRIBUTION).dist(new UniformDistribution(0, 1))
                                    .activation(Activation.TANH)
                                    .lossFunction(LossFunctions.LossFunction.KL_DIVERGENCE).build(), "in")
                    .addLayer("layer2", new RBM.Builder(RBM.HiddenUnit.GAUSSIAN, RBM.VisibleUnit.GAUSSIAN)
                                    .nIn(3).nOut(3)
                                    .weightInit(WeightInit.DISTRIBUTION).dist(new UniformDistribution(0, 1))
                                    .activation(Activation.TANH)
                                    .lossFunction(LossFunctions.LossFunction.KL_DIVERGENCE).build(), "layer1")
                    .addLayer("out", new org.deeplearning4j.nn.conf.layers.OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                                    .nIn(3 + 3).nOut(3)
                                    .weightInit(WeightInit.DISTRIBUTION).dist(new UniformDistribution(0, 1))
                                    .activation(Activation.SOFTMAX).build(), "layer0", "layer2")
                    .setOutputs("out")
                    .pretrain(true).backprop(false)
                    .build();
    ComputationGraph net = new ComputationGraph(conf);
    net.init();
    net.setListeners(new ScoreIterationListener(1));
    DataSetIterator iter = new IrisDataSetIterator(10, 150);
    net.fit(iter);
}
Also used: IrisDataSetIterator (org.deeplearning4j.datasets.iterator.impl.IrisDataSetIterator), UniformDistribution (org.deeplearning4j.nn.conf.distribution.UniformDistribution), org.deeplearning4j.nn.conf, ScoreIterationListener (org.deeplearning4j.optimize.listeners.ScoreIterationListener), DataSetIterator (org.nd4j.linalg.dataset.api.iterator.DataSetIterator), RecordReaderMultiDataSetIterator (org.deeplearning4j.datasets.datavec.RecordReaderMultiDataSetIterator), MultiDataSetIterator (org.nd4j.linalg.dataset.api.iterator.MultiDataSetIterator), Test (org.junit.Test)
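
For reference, the IrisDataSetIterator constructor used throughout these tests takes a mini-batch size and a total number of examples (the Iris dataset has 150 rows, 4 features, and 3 classes), and it behaves like any other DataSetIterator. A minimal, self-contained sketch of that usage, separate from the test above (the class name here is only illustrative):

    import org.deeplearning4j.datasets.iterator.impl.IrisDataSetIterator;
    import org.nd4j.linalg.dataset.DataSet;
    import org.nd4j.linalg.dataset.api.iterator.DataSetIterator;

    public class IrisIteratorSketch {
        public static void main(String[] args) {
            //Mini-batches of 10 examples drawn from all 150 Iris rows
            DataSetIterator iter = new IrisDataSetIterator(10, 150);
            while (iter.hasNext()) {
                DataSet batch = iter.next();
                //Each batch holds a 10x4 feature matrix and the matching 10x3 one-hot labels
                System.out.println(java.util.Arrays.toString(batch.getFeatureMatrix().shape()));
            }
            //The iterator can be rewound for another pass over the data
            iter.reset();
        }
    }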

Example 7 with IrisDataSetIterator

Use of org.deeplearning4j.datasets.iterator.impl.IrisDataSetIterator in the deeplearning4j project.

From the class TestComputationGraphNetwork, method testScoringDataSet.

@Test
public void testScoringDataSet() {
    ComputationGraphConfiguration configuration = getIrisGraphConfiguration();
    ComputationGraph graph = new ComputationGraph(configuration);
    graph.init();
    MultiLayerConfiguration mlc = getIrisMLNConfiguration();
    MultiLayerNetwork net = new MultiLayerNetwork(mlc);
    net.init();
    DataSetIterator iris = new IrisDataSetIterator(150, 150);
    DataSet ds = iris.next();
    //Now: set parameters of both networks to be identical. Then feedforward, and check we get the same score
    Nd4j.getRandom().setSeed(12345);
    int nParams = getNumParams();
    INDArray params = Nd4j.rand(1, nParams);
    graph.setParams(params.dup());
    net.setParams(params.dup());
    //Second argument is the 'training' flag; false scores in test/inference mode
    double scoreMLN = net.score(ds, false);
    double scoreCG = graph.score(ds, false);
    assertEquals(scoreMLN, scoreCG, 1e-4);
}
Also used: IrisDataSetIterator (org.deeplearning4j.datasets.iterator.impl.IrisDataSetIterator), INDArray (org.nd4j.linalg.api.ndarray.INDArray), DataSet (org.nd4j.linalg.dataset.DataSet), MultiLayerNetwork (org.deeplearning4j.nn.multilayer.MultiLayerNetwork), DataSetIterator (org.nd4j.linalg.dataset.api.iterator.DataSetIterator), RecordReaderMultiDataSetIterator (org.deeplearning4j.datasets.datavec.RecordReaderMultiDataSetIterator), MultiDataSetIterator (org.nd4j.linalg.dataset.api.iterator.MultiDataSetIterator), Test (org.junit.Test)

Example 8 with IrisDataSetIterator

Use of org.deeplearning4j.datasets.iterator.impl.IrisDataSetIterator in the deeplearning4j project.

From the class TestComputationGraphNetwork, method testCompGraphUnderscores.

@Test
public void testCompGraphUnderscores() {
    //Problem: underscores in names could be problematic for ComputationGraphUpdater, HistogramIterationListener
    ComputationGraphConfiguration conf = new NeuralNetConfiguration.Builder()
                    .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT)
                    .graphBuilder()
                    .addInputs("input")
                    .addLayer("first_layer", new DenseLayer.Builder().nIn(4).nOut(5).build(), "input")
                    .addLayer("output_layer", new OutputLayer.Builder().nIn(5).nOut(3).build(), "first_layer")
                    .setOutputs("output_layer")
                    .pretrain(false).backprop(true)
                    .build();
    ComputationGraph net = new ComputationGraph(conf);
    net.init();
    DataSetIterator iris = new IrisDataSetIterator(10, 150);
    while (iris.hasNext()) {
        net.fit(iris.next());
    }
}
Also used: IrisDataSetIterator (org.deeplearning4j.datasets.iterator.impl.IrisDataSetIterator), DataSetIterator (org.nd4j.linalg.dataset.api.iterator.DataSetIterator), RecordReaderMultiDataSetIterator (org.deeplearning4j.datasets.datavec.RecordReaderMultiDataSetIterator), MultiDataSetIterator (org.nd4j.linalg.dataset.api.iterator.MultiDataSetIterator), Test (org.junit.Test)

Example 9 with IrisDataSetIterator

Use of org.deeplearning4j.datasets.iterator.impl.IrisDataSetIterator in the deeplearning4j project.

From the class GradientCheckTests, method testRbm.

@Test
public void testRbm() {
    //As above (testGradientMLP2LayerIrisSimple()) but with L2, L1, and both L2/L1 applied
    //Need to run gradient through updater, so that L2 can be applied
    RBM.HiddenUnit[] hiddenFunc = { RBM.HiddenUnit.BINARY, RBM.HiddenUnit.RECTIFIED };
    //If true: run some backprop steps first
    boolean[] characteristic = { false, true };
    LossFunction[] lossFunctions = { LossFunction.MSE, LossFunction.KL_DIVERGENCE };
    //i.e., lossFunctions[i] used with outputActivations[i] here
    String[] outputActivations = { "softmax", "sigmoid" };
    DataNormalization scaler = new NormalizerMinMaxScaler();
    DataSetIterator iter = new IrisDataSetIterator(150, 150);
    scaler.fit(iter);
    iter.setPreProcessor(scaler);
    DataSet ds = iter.next();
    INDArray input = ds.getFeatureMatrix();
    INDArray labels = ds.getLabels();
    double[] l2vals = { 0.4, 0.0, 0.4 };
    //i.e., use l2vals[i] with l1vals[i]
    double[] l1vals = { 0.0, 0.5, 0.5 };
    for (RBM.HiddenUnit hidunit : hiddenFunc) {
        for (boolean doLearningFirst : characteristic) {
            for (int i = 0; i < lossFunctions.length; i++) {
                for (int k = 0; k < l2vals.length; k++) {
                    LossFunction lf = lossFunctions[i];
                    String outputActivation = outputActivations[i];
                    double l2 = l2vals[k];
                    double l1 = l1vals[k];
                    MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                                    .regularization(true).l2(l2).l1(l1)
                                    .learningRate(1.0)
                                    .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT)
                                    .seed(12345L)
                                    .list()
                                    .layer(0, new RBM.Builder(hidunit, RBM.VisibleUnit.BINARY)
                                                    .nIn(4).nOut(3)
                                                    .weightInit(WeightInit.UNIFORM)
                                                    .updater(Updater.SGD).build())
                                    .layer(1, new OutputLayer.Builder(lf)
                                                    .nIn(3).nOut(3)
                                                    .weightInit(WeightInit.XAVIER)
                                                    .updater(Updater.SGD)
                                                    .activation(outputActivation).build())
                                    .pretrain(false).backprop(true)
                                    .build();
                    MultiLayerNetwork mln = new MultiLayerNetwork(conf);
                    mln.init();
                    if (doLearningFirst) {
                        //Run a number of iterations of learning
                        mln.setInput(ds.getFeatures());
                        mln.setLabels(ds.getLabels());
                        mln.computeGradientAndScore();
                        double scoreBefore = mln.score();
                        for (int j = 0; j < 10; j++) mln.fit(ds);
                        mln.computeGradientAndScore();
                        double scoreAfter = mln.score();
                        //Can't test in 'characteristic mode of operation' if not learning
                        String msg = "testGradMLP2LayerIrisSimple() - score did not (sufficiently) decrease during learning - activationFn=" + hidunit.toString() + ", lossFn=" + lf + ", outputActivation=" + outputActivation + ", doLearningFirst=" + doLearningFirst + ", l2=" + l2 + ", l1=" + l1 + " (before=" + scoreBefore + ", scoreAfter=" + scoreAfter + ")";
                        assertTrue(msg, scoreAfter < scoreBefore);
                    }
                    if (PRINT_RESULTS) {
                        System.out.println("testGradientMLP2LayerIrisSimpleRandom() - activationFn=" + hidunit.toString() + ", lossFn=" + lf + ", outputActivation=" + outputActivation + ", doLearningFirst=" + doLearningFirst + ", l2=" + l2 + ", l1=" + l1);
                        for (int j = 0; j < mln.getnLayers(); j++) System.out.println("Layer " + j + " # params: " + mln.getLayer(j).numParams());
                    }
                    boolean gradOK = GradientCheckUtil.checkGradients(mln, DEFAULT_EPS, DEFAULT_MAX_REL_ERROR, DEFAULT_MIN_ABS_ERROR, PRINT_RESULTS, RETURN_ON_FIRST_FAILURE, input, labels);
                    String msg = "testGradMLP2LayerIrisSimple() - activationFn=" + hidunit.toString() + ", lossFn=" + lf + ", outputActivation=" + outputActivation + ", doLearningFirst=" + doLearningFirst + ", l2=" + l2 + ", l1=" + l1;
                    assertTrue(msg, gradOK);
                }
            }
        }
    }
}
Also used: NormalizerMinMaxScaler (org.nd4j.linalg.dataset.api.preprocessor.NormalizerMinMaxScaler), IrisDataSetIterator (org.deeplearning4j.datasets.iterator.impl.IrisDataSetIterator), DataSet (org.nd4j.linalg.dataset.DataSet), DataNormalization (org.nd4j.linalg.dataset.api.preprocessor.DataNormalization), MultiLayerConfiguration (org.deeplearning4j.nn.conf.MultiLayerConfiguration), INDArray (org.nd4j.linalg.api.ndarray.INDArray), LossFunction (org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction), MultiLayerNetwork (org.deeplearning4j.nn.multilayer.MultiLayerNetwork), DataSetIterator (org.nd4j.linalg.dataset.api.iterator.DataSetIterator), Test (org.junit.Test)
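
The normalization pattern in this test (fit a DataNormalization on an iterator, then attach it as a pre-processor so every batch is scaled on the fly) is easy to miss inside the loop-heavy test body. A minimal sketch of just that pattern, with an explicit reset() added between fitting and reuse; the class name is illustrative only:

    import org.deeplearning4j.datasets.iterator.impl.IrisDataSetIterator;
    import org.nd4j.linalg.dataset.DataSet;
    import org.nd4j.linalg.dataset.api.iterator.DataSetIterator;
    import org.nd4j.linalg.dataset.api.preprocessor.DataNormalization;
    import org.nd4j.linalg.dataset.api.preprocessor.NormalizerMinMaxScaler;

    public class MinMaxPreprocessorSketch {
        public static void main(String[] args) {
            DataSetIterator iter = new IrisDataSetIterator(150, 150);

            //Pass 1: scan the iterator to collect per-feature min and max
            DataNormalization scaler = new NormalizerMinMaxScaler();
            scaler.fit(iter);

            //Pass 2: with the scaler attached, every DataSet returned by next() is scaled into [0, 1]
            iter.reset();
            iter.setPreProcessor(scaler);
            DataSet ds = iter.next();
            System.out.println(ds.getFeatureMatrix().min(0)); //per-column minima, expected ~0
            System.out.println(ds.getFeatureMatrix().max(0)); //per-column maxima, expected ~1
        }
    }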

Example 10 with IrisDataSetIterator

Use of org.deeplearning4j.datasets.iterator.impl.IrisDataSetIterator in the deeplearning4j project.

From the class GradientCheckTestsComputationGraph, method testBasicIris.

@Test
public void testBasicIris() {
    Nd4j.getRandom().setSeed(12345);
    ComputationGraphConfiguration conf = new NeuralNetConfiguration.Builder().seed(12345)
                    .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT)
                    .weightInit(WeightInit.DISTRIBUTION).dist(new NormalDistribution(0, 1))
                    .updater(Updater.NONE).learningRate(1.0)
                    .graphBuilder()
                    .addInputs("input")
                    .addLayer("firstLayer", new DenseLayer.Builder().nIn(4).nOut(5)
                                    .activation(Activation.TANH).build(), "input")
                    .addLayer("outputLayer", new OutputLayer.Builder()
                                    .lossFunction(LossFunctions.LossFunction.MCXENT)
                                    .activation(Activation.SOFTMAX)
                                    .nIn(5).nOut(3).build(), "firstLayer")
                    .setOutputs("outputLayer")
                    .pretrain(false).backprop(true)
                    .build();
    ComputationGraph graph = new ComputationGraph(conf);
    graph.init();
    Nd4j.getRandom().setSeed(12345);
    int nParams = graph.numParams();
    INDArray newParams = Nd4j.rand(1, nParams);
    graph.setParams(newParams);
    DataSet ds = new IrisDataSetIterator(150, 150).next();
    //Manually min-max scale each feature column into [0, 1]
    INDArray min = ds.getFeatureMatrix().min(0);
    INDArray max = ds.getFeatureMatrix().max(0);
    ds.getFeatureMatrix().subiRowVector(min).diviRowVector(max.sub(min));
    INDArray input = ds.getFeatureMatrix();
    INDArray labels = ds.getLabels();
    if (PRINT_RESULTS) {
        System.out.println("testBasicIris()");
        for (int j = 0; j < graph.getNumLayers(); j++) System.out.println("Layer " + j + " # params: " + graph.getLayer(j).numParams());
    }
    boolean gradOK = GradientCheckUtil.checkGradients(graph, DEFAULT_EPS, DEFAULT_MAX_REL_ERROR, DEFAULT_MIN_ABS_ERROR, PRINT_RESULTS, RETURN_ON_FIRST_FAILURE, new INDArray[] { input }, new INDArray[] { labels });
    String msg = "testBasicIris()";
    assertTrue(msg, gradOK);
}
Also used: INDArray (org.nd4j.linalg.api.ndarray.INDArray), IrisDataSetIterator (org.deeplearning4j.datasets.iterator.impl.IrisDataSetIterator), NormalDistribution (org.deeplearning4j.nn.conf.distribution.NormalDistribution), DataSet (org.nd4j.linalg.dataset.DataSet), ComputationGraphConfiguration (org.deeplearning4j.nn.conf.ComputationGraphConfiguration), ComputationGraph (org.deeplearning4j.nn.graph.ComputationGraph), Test (org.junit.Test)

Aggregations

IrisDataSetIterator (org.deeplearning4j.datasets.iterator.impl.IrisDataSetIterator): 96 usages
Test (org.junit.Test): 91 usages
DataSetIterator (org.nd4j.linalg.dataset.api.iterator.DataSetIterator): 75 usages
DataSet (org.nd4j.linalg.dataset.DataSet): 48 usages
MultiLayerNetwork (org.deeplearning4j.nn.multilayer.MultiLayerNetwork): 47 usages
MultiLayerConfiguration (org.deeplearning4j.nn.conf.MultiLayerConfiguration): 41 usages
NeuralNetConfiguration (org.deeplearning4j.nn.conf.NeuralNetConfiguration): 41 usages
INDArray (org.nd4j.linalg.api.ndarray.INDArray): 37 usages
ScoreIterationListener (org.deeplearning4j.optimize.listeners.ScoreIterationListener): 35 usages
OutputLayer (org.deeplearning4j.nn.conf.layers.OutputLayer): 21 usages
InMemoryModelSaver (org.deeplearning4j.earlystopping.saver.InMemoryModelSaver): 18 usages
MaxEpochsTerminationCondition (org.deeplearning4j.earlystopping.termination.MaxEpochsTerminationCondition): 18 usages
BaseSparkTest (org.deeplearning4j.spark.BaseSparkTest): 16 usages
MaxTimeIterationTerminationCondition (org.deeplearning4j.earlystopping.termination.MaxTimeIterationTerminationCondition): 15 usages
ComputationGraphConfiguration (org.deeplearning4j.nn.conf.ComputationGraphConfiguration): 15 usages
DenseLayer (org.deeplearning4j.nn.conf.layers.DenseLayer): 15 usages
RecordReaderMultiDataSetIterator (org.deeplearning4j.datasets.datavec.RecordReaderMultiDataSetIterator): 13 usages
ComputationGraph (org.deeplearning4j.nn.graph.ComputationGraph): 13 usages
MultiDataSetIterator (org.nd4j.linalg.dataset.api.iterator.MultiDataSetIterator): 13 usages
IEarlyStoppingTrainer (org.deeplearning4j.earlystopping.trainer.IEarlyStoppingTrainer): 12 usages