public class SoftmaxLayer extends LinearLayer
Fields inherited from class Layer: bias, gradientInps, gradientOuts, inps, inputPatternListeners, learnable, learning, m_batch, monitor, myLearner, outputPatternListeners, outs, running, step, STOP_FLAG
| Constructor and Description |
|---|
| SoftmaxLayer(): Creates a new instance of SoftmaxLayer |
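The constructor takes no arguments; the layer is sized and wired up afterwards through the inherited setters listed further down. The sketch below shows one plausible way to configure the layer. It uses only methods that appear on this page, but the package name org.joone.engine and the FullSynapse and Monitor classes are assumptions about the surrounding Joone API, so treat it as an illustration rather than the library's prescribed setup.

```java
import org.joone.engine.FullSynapse;
import org.joone.engine.Monitor;
import org.joone.engine.SoftmaxLayer;

public class SoftmaxLayerExample {
    public static void main(String[] args) {
        // Output layer that normalizes its activations with softmax
        SoftmaxLayer output = new SoftmaxLayer();
        output.setLayerName("softmax output");
        output.setRows(3);                     // e.g. three output classes

        // Connect the layer to the rest of the network through a synapse
        // (FullSynapse is an assumed Joone synapse type used for illustration)
        FullSynapse hiddenToOutput = new FullSynapse();
        output.addInputSynapse(hiddenToOutput);

        // Share a Monitor so the layer sees the net-wide training parameters
        Monitor monitor = new Monitor();
        output.setMonitor(monitor);
    }
}
```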
| Modifier and Type | Method and Description |
|---|---|
| void | forward(double[] pattern): Transfer function to recall a result on a trained net |
Methods inherited from class LinearLayer: backward, getBeta, getDefaultState, getDerivative, getMaximumState, getMinimumState, Inspections, setBeta
Methods inherited from class SimpleLayer: getLearningRate, getLrate, getMomentum, setDimensions, setLrate, setMomentum, setMonitor
Methods inherited from class Layer: addInputSynapse, addNoise, addOutputSynapse, adjustSizeToFwdPattern, adjustSizeToRevPattern, check, checkInputEnabled, checkInputs, checkOutputs, copyInto, finalize, fireFwdGet, fireFwdPut, fireRevGet, fireRevPut, fwdRun, getAllInputs, getAllOutputs, getBias, getDimension, getLastGradientInps, getLastGradientOuts, getLastInputs, getLastOutputs, getLayerName, getLearner, getMonitor, getRows, getThreadMonitor, hasStepCounter, init, initLearner, InspectableTitle, isInputLayer, isOutputLayer, isRunning, join, randomize, randomizeBias, randomizeWeights, removeAllInputs, removeAllOutputs, removeInputSynapse, removeListener, removeOutputSynapse, resetInputListeners, revRun, run, setAllInputs, setAllOutputs, setBias, setConnDimensions, setInputDimension, setInputSynapses, setLastInputs, setLastOutputs, setLayerName, setOutputDimension, setOutputSynapses, setRows, start, stop, sumBackInput, sumInput, toString
public void forward(double[] pattern)

Description copied from class: Layer
Transfer function to recall a result on a trained net.

Overrides:
forward in class LinearLayer

Parameters:
pattern - input pattern to which to apply the transfer function
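For orientation, the softmax transfer function maps an input pattern x to outputs exp(x_i) / sum_j exp(x_j), so every output is positive and the outputs sum to 1. The standalone sketch below illustrates that computation (with the usual subtraction of the maximum value for numerical stability); it describes what forward computes conceptually and is not the library's implementation.

```java
import java.util.Arrays;

public class SoftmaxSketch {
    // Illustrative softmax: returns a new array, leaving the input untouched.
    static double[] softmax(double[] pattern) {
        double max = Double.NEGATIVE_INFINITY;
        for (double v : pattern) {
            max = Math.max(max, v);            // shift by the max for numerical stability
        }
        double sum = 0.0;
        double[] out = new double[pattern.length];
        for (int i = 0; i < pattern.length; i++) {
            out[i] = Math.exp(pattern[i] - max);
            sum += out[i];
        }
        for (int i = 0; i < out.length; i++) {
            out[i] /= sum;                     // normalize so the outputs sum to 1
        }
        return out;
    }

    public static void main(String[] args) {
        double[] probs = softmax(new double[] {1.0, 2.0, 3.0});
        System.out.println(Arrays.toString(probs));  // ~[0.09, 0.24, 0.67]
    }
}
```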