Use of org.encog.neural.NeuralNetworkError in project shifu by ShifuML.
From the class FloatNeuralStructure, method finalizeStruct.
/**
 * Build the synapse and layer structure. This method should be called after you are done adding layers to a
 * network, or after changing the network's logic property.
 */
public void finalizeStruct() {
    if (this.getLayers().size() < 2) {
        throw new NeuralNetworkError("There must be at least two layers before the structure is finalized.");
    }
    final FlatLayer[] flatLayers = new FlatLayer[this.getLayers().size()];
    for (int i = 0; i < this.getLayers().size(); i++) {
        final BasicLayer layer = (BasicLayer) this.getLayers().get(i);
        if (layer.getActivation() == null) {
            layer.setActivation(new ActivationLinear());
        }
        flatLayers[i] = layer;
    }
    this.setFlat(new FloatFlatNetwork(flatLayers, true));
    finalizeLimit();
    this.getLayers().clear();
    enforceLimit();
}
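The finalize pattern above — validate the layer count, default any missing activation to linear, then copy the layers into a flat array — can be sketched without Encog as follows. All class names here (DemoLayer, DemoStructure) are illustrative stand-ins, not part of Shifu or Encog:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical stand-in for Encog's BasicLayer, for demonstration only.
class DemoLayer {
    String activation; // null means "not set yet"
    DemoLayer(String activation) { this.activation = activation; }
}

// Hypothetical stand-in for the structure class being finalized.
class DemoStructure {
    private final List<DemoLayer> layers = new ArrayList<>();
    private DemoLayer[] flat;

    void addLayer(DemoLayer l) { layers.add(l); }

    // Mirrors finalizeStruct: reject too-small networks, default null
    // activations (as ActivationLinear does in the real code), then
    // flatten the layer list into an array.
    void finalizeStruct() {
        if (layers.size() < 2) {
            throw new IllegalStateException(
                    "There must be at least two layers before the structure is finalized.");
        }
        flat = new DemoLayer[layers.size()];
        for (int i = 0; i < layers.size(); i++) {
            DemoLayer layer = layers.get(i);
            if (layer.activation == null) {
                layer.activation = "linear"; // default, like ActivationLinear
            }
            flat[i] = layer;
        }
        layers.clear(); // the flat array is now the source of truth
    }

    DemoLayer[] flat() { return flat; }
}
```

A caller would add at least two layers, call finalizeStruct() once, and from then on work only with the flat array; calling it on a single-layer structure fails fast, just as the real method throws NeuralNetworkError.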
Use of org.encog.neural.NeuralNetworkError in project shifu by ShifuML.
From the class CacheBasicFloatNetwork, method compute.
/**
 * Compute the network score (forward pass). If cacheInputOutput is true, the first-layer output is cached in
 * this class. If cacheInputOutput is false, the cached value is read and the current item's contribution is
 * subtracted from the cached sum, saving CPU computation.
 *
 * @param input
 *            the input value array
 * @param cacheInputOutput
 *            whether to cache the first-layer output, or to read from the first-layer output cache
 * @param resetInputIndex
 *            when cacheInputOutput is false, the index of the item whose contribution should be removed
 * @return the output value as a score
 */
public final MLData compute(final MLData input, boolean cacheInputOutput, int resetInputIndex) {
    try {
        final MLData result = new BasicMLData(this.network.getStructure().getFlat().getOutputCount());
        compute(input.getData(), result.getData(), cacheInputOutput, resetInputIndex);
        return result;
    } catch (final ArrayIndexOutOfBoundsException ex) {
        throw new NeuralNetworkError(
                "Index exception: there was likely a mismatch between layer sizes, or the size of the input presented to the network.", ex);
    }
}
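The caching idea behind this method — compute a full weighted sum once, then answer "what if one input were removed" by subtracting that input's contribution from the cached sum rather than recomputing — can be sketched in plain Java. This is a simplified single-neuron illustration of the technique, not the actual CacheBasicFloatNetwork implementation, and the class and method names are hypothetical:

```java
// Sketch of the "cache the sum, subtract the reset item" optimization.
class CachedNeuron {
    private final double[] weights; // one weight per input element
    private double cachedSum;       // cached weighted sum of the full input

    CachedNeuron(double[] weights) { this.weights = weights; }

    // Full pass (cacheInputOutput == true in the real API):
    // compute and cache the weighted sum over all inputs.
    double compute(double[] input) {
        double sum = 0.0;
        for (int i = 0; i < input.length; i++) {
            sum += weights[i] * input[i];
        }
        cachedSum = sum;
        return sum;
    }

    // Cheap pass (cacheInputOutput == false): reuse the cached sum and
    // remove one input's contribution, playing the role of resetInputIndex.
    // This is O(1) instead of O(n) per removed item.
    double computeWithout(double[] input, int resetInputIndex) {
        return cachedSum - weights[resetInputIndex] * input[resetInputIndex];
    }
}
```

This pattern is useful for sensitivity analysis: scoring the same record n times with each input removed costs one full pass plus n constant-time adjustments, instead of n full passes.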