Example 1 with ActivationTANH

Use of org.encog.engine.network.activation.ActivationTANH in project shifu by ShifuML.

From the class NNTrainer, method buildNetwork:

@SuppressWarnings("unchecked")
public void buildNetwork() {
    network = new BasicNetwork();
    network.addLayer(new BasicLayer(new ActivationLinear(), true, trainSet.getInputSize()));
    int numLayers = (Integer) modelConfig.getParams().get(CommonConstants.NUM_HIDDEN_LAYERS);
    List<String> actFunc = (List<String>) modelConfig.getParams().get(CommonConstants.ACTIVATION_FUNC);
    List<Integer> hiddenNodeList = (List<Integer>) modelConfig.getParams().get(CommonConstants.NUM_HIDDEN_NODES);
    if (numLayers != 0 && (numLayers != actFunc.size() || numLayers != hiddenNodeList.size())) {
        throw new RuntimeException("the number of hidden layers does not match the number of activation functions or hidden node counts");
    }
    if (toLoggingProcess)
        LOG.info("    - total " + numLayers + " layers, each layers are: " + Arrays.toString(hiddenNodeList.toArray()) + " the activation function are: " + Arrays.toString(actFunc.toArray()));
    for (int i = 0; i < numLayers; i++) {
        String func = actFunc.get(i);
        Integer numHiddenNode = hiddenNodeList.get(i);
        // java 6
        if ("linear".equalsIgnoreCase(func)) {
            network.addLayer(new BasicLayer(new ActivationLinear(), true, numHiddenNode));
        } else if (func.equalsIgnoreCase("sigmoid")) {
            network.addLayer(new BasicLayer(new ActivationSigmoid(), true, numHiddenNode));
        } else if (func.equalsIgnoreCase("tanh")) {
            network.addLayer(new BasicLayer(new ActivationTANH(), true, numHiddenNode));
        } else if (func.equalsIgnoreCase("log")) {
            network.addLayer(new BasicLayer(new ActivationLOG(), true, numHiddenNode));
        } else if (func.equalsIgnoreCase("sin")) {
            network.addLayer(new BasicLayer(new ActivationSIN(), true, numHiddenNode));
        } else {
            LOG.info("Unsupported activation function: " + func + ", falling back to Sigmoid for this layer");
            network.addLayer(new BasicLayer(new ActivationSigmoid(), true, numHiddenNode));
        }
    }
    network.addLayer(new BasicLayer(new ActivationSigmoid(), false, trainSet.getIdealSize()));
    network.getStructure().finalizeStructure();
    if (!modelConfig.isFixInitialInput()) {
        network.reset();
    } else {
        int numWeight = 0;
        for (int i = 0; i < network.getLayerCount() - 1; i++) {
            numWeight = numWeight + network.getLayerTotalNeuronCount(i) * network.getLayerNeuronCount(i + 1);
        }
        LOG.info("    - You have " + numWeight + " weights to initialize");
        loadWeightsInput(numWeight);
    }
}
Also used:
import java.util.ArrayList;
import java.util.List;
import org.encog.engine.network.activation.ActivationLOG;
import org.encog.engine.network.activation.ActivationLinear;
import org.encog.engine.network.activation.ActivationSIN;
import org.encog.engine.network.activation.ActivationSigmoid;
import org.encog.engine.network.activation.ActivationTANH;
import org.encog.neural.networks.BasicNetwork;
import org.encog.neural.networks.layers.BasicLayer;
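The chain of equalsIgnoreCase branches in buildNetwork is a name-to-activation dispatch with a Sigmoid fallback. A minimal standalone sketch of the same lookup, using java.util functional types instead of the Encog activation classes (the class and method names here are hypothetical, and the table is abbreviated to three of the five functions):

```java
import java.util.Map;
import java.util.function.DoubleUnaryOperator;

public class ActivationLookup {
    // Lowercase name -> activation function; unknown names fall back
    // to sigmoid, mirroring the default branch in buildNetwork.
    static final Map<String, DoubleUnaryOperator> FUNCS = Map.of(
        "linear", x -> x,
        "sigmoid", x -> 1.0 / (1.0 + Math.exp(-x)),
        "tanh", Math::tanh
    );

    static DoubleUnaryOperator forName(String name) {
        return FUNCS.getOrDefault(name.toLowerCase(), FUNCS.get("sigmoid"));
    }

    public static void main(String[] args) {
        // Lookup is case-insensitive, like the equalsIgnoreCase chain.
        System.out.println(forName("TANH").applyAsDouble(1.0));
        System.out.println(forName("unknown").applyAsDouble(0.0)); // sigmoid fallback
    }
}
```

A map lookup keeps the name-to-function wiring in one place, so adding a new activation is a one-line change instead of another else-if branch.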

Example 2 with ActivationTANH

Use of org.encog.engine.network.activation.ActivationTANH in project shifu by ShifuML.

From the class ActivationPTANHTest, method test:

@Test
public void test() {
    ActivationTANH tanh = new ActivationTANH();
    ActivationPTANH ptanh = new ActivationPTANH();
    double[] inputs = new double[] { 0.0d, 1.0d, -1.0d };
    ptanh.activationFunction(inputs, 0, 3);
    Assert.assertTrue(Math.abs(inputs[0] - 0.0) < 1e-6);
    Assert.assertTrue(Math.abs(inputs[1] - 0.7615941559557649d) < 1e-6);
    Assert.assertTrue(Math.abs(inputs[2] + 0.1903985389889412d) < 1e-6);
    double d = ptanh.derivativeFunction(0.0d, inputs[0]);
    Assert.assertTrue(Math.abs(d - 0.25d) < 1e-6);
    d = ptanh.derivativeFunction(1.0d, inputs[1]);
    Assert.assertTrue(Math.abs(d - tanh.derivativeFunction(1.0d, inputs[1])) < 1e-6);
    double[] t = new double[] { -1.0d };
    tanh.activationFunction(t, 0, 1);
    d = ptanh.derivativeFunction(-1.0d, inputs[2]);
    Assert.assertTrue(Math.abs(d - 0.25 * tanh.derivativeFunction(-1.0d, t[0])) < 1e-6);
}
Also used:
import ml.shifu.shifu.core.dtrain.nn.ActivationPTANH;
import org.encog.engine.network.activation.ActivationTANH;
import org.testng.annotations.Test;
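The expected values in this test are consistent with reading ActivationPTANH as a "leaky" tanh: f(x) = tanh(x) for positive inputs and 0.25 * tanh(x) otherwise, with the derivative scaled by the same 0.25 factor on the non-positive side (hence derivativeFunction(0.0, 0.0) = 0.25 * (1 - tanh(0)^2) = 0.25). A standalone sketch of that interpretation, not Shifu's actual implementation:

```java
public class PTanhSketch {
    // Positive-side activation is plain tanh; the negative side
    // is damped by a fixed 0.25 factor ("parametric"/leaky tanh).
    static double activate(double x) {
        double t = Math.tanh(x);
        return x > 0 ? t : 0.25 * t;
    }

    // tanh'(x) = 1 - tanh(x)^2, scaled by the same 0.25 on x <= 0.
    static double derivative(double x) {
        double t = Math.tanh(x);
        double d = 1.0 - t * t;
        return x > 0 ? d : 0.25 * d;
    }

    public static void main(String[] args) {
        System.out.println(activate(1.0));   // = tanh(1), matches the test's 0.7615941559557649
        System.out.println(activate(-1.0));  // = 0.25 * tanh(-1), matches -0.1903985389889412
        System.out.println(derivative(0.0)); // 0.25, matching the first derivative assertion
    }
}
```

The damped negative side keeps a small gradient flowing for negative pre-activations, the same motivation as leaky ReLU.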

Aggregations

ActivationTANH (org.encog.engine.network.activation.ActivationTANH): 2
ActivationPTANH (ml.shifu.shifu.core.dtrain.nn.ActivationPTANH): 1
ActivationLOG (org.encog.engine.network.activation.ActivationLOG): 1
ActivationLinear (org.encog.engine.network.activation.ActivationLinear): 1
ActivationSIN (org.encog.engine.network.activation.ActivationSIN): 1
ActivationSigmoid (org.encog.engine.network.activation.ActivationSigmoid): 1
ArrayList (java.util.ArrayList): 1
List (java.util.List): 1
BasicNetwork (org.encog.neural.networks.BasicNetwork): 1
BasicLayer (org.encog.neural.networks.layers.BasicLayer): 1
Test (org.testng.annotations.Test): 1