
Example 1 with LFWDataSetIterator

Uses org.deeplearning4j.datasets.iterator.impl.LFWDataSetIterator in the deeplearning4j project.

From class DataSetIteratorTest, method testLfwModel.

@Test
public void testLfwModel() throws Exception {
    final int numRows = 28;
    final int numColumns = 28;
    int numChannels = 3;
    int outputNum = LFWLoader.SUB_NUM_LABELS;
    int numSamples = 4;
    int batchSize = 2;
    int iterations = 1;
    int seed = 123;
    int listenerFreq = iterations;
    LFWDataSetIterator lfw = new LFWDataSetIterator(batchSize, numSamples, new int[] { numRows, numColumns, numChannels }, outputNum, true, true, 1.0, new Random(seed));
    MultiLayerConfiguration.Builder builder = new NeuralNetConfiguration.Builder()
            .seed(seed)
            .iterations(iterations)
            .gradientNormalization(GradientNormalization.RenormalizeL2PerLayer)
            .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT)
            .list()
            .layer(0, new ConvolutionLayer.Builder(10, 10).nIn(numChannels).nOut(6)
                    .weightInit(WeightInit.XAVIER).activation(Activation.RELU).build())
            .layer(1, new SubsamplingLayer.Builder(SubsamplingLayer.PoolingType.MAX, new int[] { 2, 2 })
                    .stride(1, 1).build())
            .layer(2, new OutputLayer.Builder(LossFunctions.LossFunction.NEGATIVELOGLIKELIHOOD)
                    .nOut(outputNum).weightInit(WeightInit.XAVIER).activation(Activation.SOFTMAX).build())
            .setInputType(InputType.convolutionalFlat(numRows, numColumns, numChannels))
            .backprop(true).pretrain(false);
    MultiLayerNetwork model = new MultiLayerNetwork(builder.build());
    model.init();
    model.setListeners(new ScoreIterationListener(listenerFreq));
    model.fit(lfw.next());
    DataSet dataTest = lfw.next();
    INDArray output = model.output(dataTest.getFeatureMatrix());
    Evaluation eval = new Evaluation(outputNum);
    eval.eval(dataTest.getLabels(), output);
    System.out.println(eval.stats());
}
Also used:
Evaluation (org.deeplearning4j.eval.Evaluation)
SubsamplingLayer (org.deeplearning4j.nn.conf.layers.SubsamplingLayer)
LFWDataSetIterator (org.deeplearning4j.datasets.iterator.impl.LFWDataSetIterator)
DataSet (org.nd4j.linalg.dataset.DataSet)
NeuralNetConfiguration (org.deeplearning4j.nn.conf.NeuralNetConfiguration)
ConvolutionLayer (org.deeplearning4j.nn.conf.layers.ConvolutionLayer)
MultiLayerConfiguration (org.deeplearning4j.nn.conf.MultiLayerConfiguration)
Random (java.util.Random)
INDArray (org.nd4j.linalg.api.ndarray.INDArray)
MultiLayerNetwork (org.deeplearning4j.nn.multilayer.MultiLayerNetwork)
ScoreIterationListener (org.deeplearning4j.optimize.listeners.ScoreIterationListener)
Test (org.junit.Test)

Example 2 with LFWDataSetIterator

Uses org.deeplearning4j.datasets.iterator.impl.LFWDataSetIterator in the deeplearning4j project.

From class DataSetIteratorTest, method testLfwIterator.

@Test
public void testLfwIterator() throws Exception {
    int numExamples = 1;
    int row = 28;
    int col = 28;
    int channels = 1;
    LFWDataSetIterator iter = new LFWDataSetIterator(numExamples, new int[] { row, col, channels }, true);
    assertTrue(iter.hasNext());
    DataSet data = iter.next();
    assertEquals(numExamples, data.getLabels().size(0));
    assertEquals(row, data.getFeatureMatrix().size(2));
}
Also used:
LFWDataSetIterator (org.deeplearning4j.datasets.iterator.impl.LFWDataSetIterator)
DataSet (org.nd4j.linalg.dataset.DataSet)
Test (org.junit.Test)
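The shape assertions in testLfwIterator rely on DL4J's NCHW feature layout for images: dimension 0 is the minibatch size and dimension 2 is the row (height) dimension. A minimal plain-Java sketch of that indexing, independent of ND4J (the class name ShapeDemo is illustrative, not part of the test):

```java
public class ShapeDemo {
    // NCHW layout used by DL4J image iterators: [minibatch, channels, height, width]
    static long[] nchwShape(int minibatch, int channels, int height, int width) {
        return new long[] { minibatch, channels, height, width };
    }

    public static void main(String[] args) {
        // same dimensions as the test: 1 example, 1 channel, 28x28 pixels
        long[] shape = nchwShape(1, 1, 28, 28);
        System.out.println(shape[0]); // minibatch dimension, what size(0) checks
        System.out.println(shape[2]); // row/height dimension, what size(2) checks
    }
}
```

In the test above, `data.getLabels().size(0)` corresponds to `shape[0]` and `data.getFeatureMatrix().size(2)` to `shape[2]`.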

Example 3 with LFWDataSetIterator

Uses org.deeplearning4j.datasets.iterator.impl.LFWDataSetIterator in the deeplearning4j project.

From class LocalUnstructuredDataFormatterTest, method testRearrange.

@Test
public void testRearrange() throws Exception {
    // ensure the LFW dataset exists locally (downloads it on first use)
    new LFWDataSetIterator(new int[] { 28, 28, 3 }).next();
    File destinationDir = new File(System.getProperty("user.home"), "rearrangedlfw");
    if (destinationDir.exists())
        FileUtils.deleteDirectory(destinationDir);
    LocalUnstructuredDataFormatter formatter = new LocalUnstructuredDataFormatter(destinationDir, new File(System.getProperty("user.home"), "lfw"), LocalUnstructuredDataFormatter.LabelingType.DIRECTORY, 0.8);
    formatter.rearrange();
    //train and test in the split directory
    File splitRoot = new File(destinationDir, "split");
    assertEquals(2, splitRoot.listFiles().length);
    Iterator<File> files = FileUtils.iterateFiles(splitRoot, null, true);
    int count = 0;
    while (files.hasNext()) {
        files.next();
        count++;
    }
    Iterator<File> trainFiles = FileUtils.iterateFiles(new File(splitRoot, "train"), null, true);
    int trainFilesCount = 0;
    while (trainFiles.hasNext()) {
        trainFiles.next();
        trainFilesCount++;
    }
    //assert the number of files is the same in the split test train as the original dataset
    assertEquals(formatter.getNumExamplesTotal(), count);
    assertEquals(formatter.getNumExamplesToTrainOn(), trainFilesCount);
    Iterator<File> testFiles = FileUtils.iterateFiles(new File(splitRoot, "test"), null, true);
    int testFilesCount = 0;
    while (testFiles.hasNext()) {
        testFiles.next();
        testFilesCount++;
    }
    assertEquals(formatter.getNumTestExamples(), testFilesCount);
}
Also used:
LFWDataSetIterator (org.deeplearning4j.datasets.iterator.impl.LFWDataSetIterator)
File (java.io.File)
Test (org.junit.Test)

Example 4 with LFWDataSetIterator

Uses org.deeplearning4j.datasets.iterator.impl.LFWDataSetIterator in the deeplearning4j project.

From class ManualTests, method testCNNActivationsVisualization.

/**
 * This test is for manual execution only; it exists to train a working CNN and visualize its layers.
 *
 * @throws Exception
 */
@Test
public void testCNNActivationsVisualization() throws Exception {
    final int numRows = 40;
    final int numColumns = 40;
    int nChannels = 3;
    int outputNum = LFWLoader.NUM_LABELS;
    int numSamples = LFWLoader.NUM_IMAGES;
    boolean useSubset = false;
    // batch size; an alternative is numSamples / 10
    int batchSize = 200;
    int iterations = 5;
    int splitTrainNum = (int) (batchSize * .8);
    int seed = 123;
    int listenerFreq = iterations / 5;
    DataSet lfwNext;
    SplitTestAndTrain trainTest;
    DataSet trainInput;
    List<INDArray> testInput = new ArrayList<>();
    List<INDArray> testLabels = new ArrayList<>();
    log.info("Load data....");
    DataSetIterator lfw = new LFWDataSetIterator(batchSize, numSamples, new int[] { numRows, numColumns, nChannels }, outputNum, useSubset, true, 1.0, new Random(seed));
    log.info("Build model....");
    MultiLayerConfiguration.Builder builder = new NeuralNetConfiguration.Builder()
            .seed(seed)
            .iterations(iterations)
            .activation(Activation.RELU)
            .weightInit(WeightInit.XAVIER)
            .gradientNormalization(GradientNormalization.RenormalizeL2PerLayer)
            .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT)
            .learningRate(0.01)
            .momentum(0.9)
            .regularization(true)
            .updater(Updater.ADAGRAD)
            .useDropConnect(true)
            .list()
            .layer(0, new ConvolutionLayer.Builder(4, 4).name("cnn1").nIn(nChannels).stride(1, 1).nOut(20).build())
            .layer(1, new SubsamplingLayer.Builder(SubsamplingLayer.PoolingType.MAX, new int[] { 2, 2 }).name("pool1").build())
            .layer(2, new ConvolutionLayer.Builder(3, 3).name("cnn2").stride(1, 1).nOut(40).build())
            .layer(3, new SubsamplingLayer.Builder(SubsamplingLayer.PoolingType.MAX, new int[] { 2, 2 }).name("pool2").build())
            .layer(4, new ConvolutionLayer.Builder(3, 3).name("cnn3").stride(1, 1).nOut(60).build())
            .layer(5, new SubsamplingLayer.Builder(SubsamplingLayer.PoolingType.MAX, new int[] { 2, 2 }).name("pool3").build())
            .layer(6, new ConvolutionLayer.Builder(2, 2).name("cnn4").stride(1, 1).nOut(80).build())
            .layer(7, new DenseLayer.Builder().name("ffn1").nOut(160).dropOut(0.5).build())
            .layer(8, new OutputLayer.Builder(LossFunctions.LossFunction.NEGATIVELOGLIKELIHOOD)
                    .nOut(outputNum).activation(Activation.SOFTMAX).build())
            .backprop(true).pretrain(false);
    new ConvolutionLayerSetup(builder, numRows, numColumns, nChannels);
    MultiLayerNetwork model = new MultiLayerNetwork(builder.build());
    model.init();
    log.info("Train model....");
    model.setListeners(Arrays.asList(new ScoreIterationListener(listenerFreq), new ConvolutionalIterationListener(listenerFreq)));
    while (lfw.hasNext()) {
        lfwNext = lfw.next();
        lfwNext.scale();
        // split the batch into train and test portions
        trainTest = lfwNext.splitTestAndTrain(splitTrainNum, new Random(seed));
        // training portion of the split
        trainInput = trainTest.getTrain();
        testInput.add(trainTest.getTest().getFeatureMatrix());
        testLabels.add(trainTest.getTest().getLabels());
        model.fit(trainInput);
    }
    log.info("Evaluate model....");
    Evaluation eval = new Evaluation(lfw.getLabels());
    for (int i = 0; i < testInput.size(); i++) {
        INDArray output = model.output(testInput.get(i));
        eval.eval(testLabels.get(i), output);
    }
    log.info(eval.stats());
    log.info("****************Example finished********************");
}
Also used:
Evaluation (org.deeplearning4j.eval.Evaluation)
DataSet (org.nd4j.linalg.dataset.DataSet)
LFWDataSetIterator (org.deeplearning4j.datasets.iterator.impl.LFWDataSetIterator)
NeuralNetConfiguration (org.deeplearning4j.nn.conf.NeuralNetConfiguration)
ConvolutionalIterationListener (org.deeplearning4j.ui.weights.ConvolutionalIterationListener)
MultiLayerConfiguration (org.deeplearning4j.nn.conf.MultiLayerConfiguration)
INDArray (org.nd4j.linalg.api.ndarray.INDArray)
ConvolutionLayerSetup (org.deeplearning4j.nn.conf.layers.setup.ConvolutionLayerSetup)
MultiLayerNetwork (org.deeplearning4j.nn.multilayer.MultiLayerNetwork)
ScoreIterationListener (org.deeplearning4j.optimize.listeners.ScoreIterationListener)
SplitTestAndTrain (org.nd4j.linalg.dataset.SplitTestAndTrain)
DataSetIterator (org.nd4j.linalg.dataset.api.iterator.DataSetIterator)
MnistDataSetIterator (org.deeplearning4j.datasets.iterator.impl.MnistDataSetIterator)
Test (org.junit.Test)
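The 80/20 split in testCNNActivationsVisualization comes from splitTrainNum = (int) (batchSize * .8), where the cast truncates toward zero rather than rounding. A self-contained sketch of that arithmetic, outside DL4J (the class name SplitMath and method name splitIndex are illustrative):

```java
public class SplitMath {
    // Number of examples assigned to the training set for a given batch size
    // and train fraction, mirroring the cast-to-int truncation in the test.
    static int splitIndex(int batchSize, double trainFraction) {
        return (int) (batchSize * trainFraction);
    }

    public static void main(String[] args) {
        // 200-example batch with an 80% train split, as in the test above
        System.out.println(splitIndex(200, 0.8)); // 160
        // truncation, not rounding: 3 * 0.8 = 2.4 -> 2
        System.out.println(splitIndex(3, 0.8)); // 2
    }
}
```

splitTestAndTrain then uses this index to take the first 160 (shuffled) examples as the training set and the remaining 40 as the test set.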

Aggregations

LFWDataSetIterator (org.deeplearning4j.datasets.iterator.impl.LFWDataSetIterator): 4 uses
Test (org.junit.Test): 4 uses
DataSet (org.nd4j.linalg.dataset.DataSet): 3 uses
Evaluation (org.deeplearning4j.eval.Evaluation): 2 uses
MultiLayerConfiguration (org.deeplearning4j.nn.conf.MultiLayerConfiguration): 2 uses
NeuralNetConfiguration (org.deeplearning4j.nn.conf.NeuralNetConfiguration): 2 uses
MultiLayerNetwork (org.deeplearning4j.nn.multilayer.MultiLayerNetwork): 2 uses
ScoreIterationListener (org.deeplearning4j.optimize.listeners.ScoreIterationListener): 2 uses
INDArray (org.nd4j.linalg.api.ndarray.INDArray): 2 uses
File (java.io.File): 1 use
Random (java.util.Random): 1 use
MnistDataSetIterator (org.deeplearning4j.datasets.iterator.impl.MnistDataSetIterator): 1 use
ConvolutionLayer (org.deeplearning4j.nn.conf.layers.ConvolutionLayer): 1 use
SubsamplingLayer (org.deeplearning4j.nn.conf.layers.SubsamplingLayer): 1 use
ConvolutionLayerSetup (org.deeplearning4j.nn.conf.layers.setup.ConvolutionLayerSetup): 1 use
ConvolutionalIterationListener (org.deeplearning4j.ui.weights.ConvolutionalIterationListener): 1 use
SplitTestAndTrain (org.nd4j.linalg.dataset.SplitTestAndTrain): 1 use
DataSetIterator (org.nd4j.linalg.dataset.api.iterator.DataSetIterator): 1 use