public class BiasedLinearLayer extends SimpleLayer implements LearnableLayer
Fields inherited from class org.joone.engine.Layer
bias, gradientInps, gradientOuts, inps, inputPatternListeners, learnable, learning, m_batch, monitor, myLearner, outputPatternListeners, outs, running, step, STOP_FLAG
Constructor and Description |
---|
BiasedLinearLayer() Creates a new instance of BiasedLinearLayer. |
BiasedLinearLayer(java.lang.String anElemName) Creates a new instance of BiasedLinearLayer. |
Modifier and Type | Method and Description |
---|---|
void | backward(double[] pattern) Reverse transfer function of the component. |
void | forward(double[] pattern) Transfer function to recall a result on a trained net. |
double | getDefaultState() Return the default state of a node in this layer, such as 0 for a tanh or 0.5 for a sigmoid layer. |
double | getDerivative(int i) Similar to the backward message and used by RTRL. |
Learner | getLearner() Deprecated. Used only for backward compatibility. |
double | getMaximumState() Return the maximum value of a node in this layer. |
double | getMinimumState() Return the minimum value of a node in this layer. |
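The forward and backward entries above describe a linear transfer function with a per-node bias term (the bias field is inherited from Layer). The stand-alone sketch below shows what such a pair of methods typically computes; it is an illustration only, not Joone's actual implementation, and the field names value, gradient, and bias are assumptions made for the example.

```java
// Illustration only: a minimal, self-contained model of a linear transfer
// function with a per-node bias. Not Joone's source; field names are assumed.
public class BiasedLinearSketch {

    private final double[] bias;   // one bias term per node (assumed layout)
    private double[] value;        // last forward output
    private double[] gradient;     // last backward output

    public BiasedLinearSketch(int rows) {
        this.bias = new double[rows];
    }

    // Transfer function: identity plus a per-node bias.
    public void forward(double[] pattern) {
        value = new double[pattern.length];
        for (int i = 0; i < pattern.length; i++) {
            value[i] = pattern[i] + bias[i];
        }
    }

    // Reverse transfer: the derivative of a linear activation is constant,
    // so the incoming error pattern passes through unchanged.
    public void backward(double[] pattern) {
        gradient = pattern.clone();
    }

    public double[] getLastOutput() {
        return value;
    }

    public double[] getLastGradient() {
        return gradient;
    }
}
```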
Methods inherited from class org.joone.engine.SimpleLayer
getLearningRate, getLrate, getMomentum, setDimensions, setLrate, setMomentum, setMonitor
Methods inherited from class org.joone.engine.Layer
addInputSynapse, addNoise, addOutputSynapse, adjustSizeToFwdPattern, adjustSizeToRevPattern, check, checkInputEnabled, checkInputs, checkOutputs, copyInto, finalize, fireFwdGet, fireFwdPut, fireRevGet, fireRevPut, fwdRun, getAllInputs, getAllOutputs, getBias, getDimension, getLastGradientInps, getLastGradientOuts, getLastInputs, getLastOutputs, getLayerName, getMonitor, getRows, getThreadMonitor, hasStepCounter, init, initLearner, InspectableTitle, Inspections, isInputLayer, isOutputLayer, isRunning, join, randomize, randomizeBias, randomizeWeights, removeAllInputs, removeAllOutputs, removeInputSynapse, removeListener, removeOutputSynapse, resetInputListeners, revRun, run, setAllInputs, setAllOutputs, setBias, setConnDimensions, setInputDimension, setInputSynapses, setLastInputs, setLastOutputs, setLayerName, setOutputDimension, setOutputSynapses, setRows, start, stop, sumBackInput, sumInput, toString
Methods inherited from class java.lang.Object
clone, equals, getClass, hashCode, notify, notifyAll, wait, wait, wait
Methods inherited from interface org.joone.engine.Learnable
getMonitor, initLearner
Methods inherited from interface org.joone.engine.NeuralLayer
addInputSynapse, addNoise, addOutputSynapse, check, copyInto, getAllInputs, getAllOutputs, getBias, getLayerName, getMonitor, getRows, isRunning, removeAllInputs, removeAllOutputs, removeInputSynapse, removeOutputSynapse, setAllInputs, setAllOutputs, setBias, setLayerName, setMonitor, setRows, start
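Below is a minimal, hypothetical usage sketch built only from members listed above (the two constructors plus setLayerName, setRows, getLayerName and getRows from the inherited lists). It assumes the class lives in org.joone.engine alongside Layer and SimpleLayer, and that setRows(int) sets the number of nodes in the layer; the argument values are illustrative only.

```java
import org.joone.engine.BiasedLinearLayer;

public class BiasedLinearLayerDemo {
    public static void main(String[] args) {
        // Construct anonymously, then name the layer afterwards...
        BiasedLinearLayer input = new BiasedLinearLayer();
        input.setLayerName("input");

        // ...or pass the element name straight to the constructor.
        BiasedLinearLayer hidden = new BiasedLinearLayer("hidden");

        // Assumption: setRows(int) sets the number of nodes in the layer.
        input.setRows(4);
        hidden.setRows(3);

        System.out.println(input.getLayerName() + " rows: " + input.getRows());
    }
}
```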
Constructor Detail

public BiasedLinearLayer()
Creates a new instance of BiasedLinearLayer.

public BiasedLinearLayer(java.lang.String anElemName)
Creates a new instance of BiasedLinearLayer.
Parameters:
anElemName - The name of the layer.

Method Detail

public void backward(double[] pattern)
Description copied from class: Layer
Reverse transfer function of the component.
Overrides:
backward in class SimpleLayer
Parameters:
pattern - input pattern on which to apply the transfer function

public double getDerivative(int i)
Similar to the backward message and used by RTRL.
Overrides:
getDerivative in class Layer
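Since getDerivative(int) is described as similar to the backward message and used by RTRL, the helper below shows the usual way such a per-node derivative is combined with an error vector before it is propagated backward. The interface and helper names are invented for the illustration and are not part of Joone; only the three method signatures mirror the ones documented here.

```java
// Hypothetical types for illustration; only the three method signatures
// mirror the ones documented for BiasedLinearLayer.
interface DifferentiableLayer {
    void forward(double[] pattern);
    void backward(double[] pattern);
    double getDerivative(int i);
}

final class GradientSketch {
    // Chain rule: scale each error component by the node's local derivative,
    // then let backward() carry the scaled error toward the inputs.
    static void propagateError(DifferentiableLayer layer, double[] error) {
        double[] scaled = new double[error.length];
        for (int i = 0; i < error.length; i++) {
            scaled[i] = error[i] * layer.getDerivative(i);
        }
        layer.backward(scaled);
    }
}
```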
public void forward(double[] pattern)
Description copied from class: Layer
Transfer function to recall a result on a trained net.

public Learner getLearner()
Deprecated. Used only for backward compatibility.
Description copied from class: Layer
Specified by:
getLearner in interface Learnable
Overrides:
getLearner in class Layer
See Also:
Learnable.getLearner()
public double getDefaultState()
Description copied from class: Layer
Return the default state of a node in this layer, such as 0 for a tanh or 0.5 for a sigmoid layer.
Overrides:
getDefaultState in class Layer

public double getMinimumState()
Description copied from class: Layer
Return the minimum value of a node in this layer.
Overrides:
getMinimumState in class Layer

public double getMaximumState()
Description copied from class: Layer
Return the maximum value of a node in this layer.
Overrides:
getMaximumState in class Layer
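The three state accessors above can be queried directly on an instance. The short sketch below, assuming the org.joone.engine package as in the earlier example, simply prints whatever the layer reports and does not assume any particular values for a linear layer.

```java
import org.joone.engine.BiasedLinearLayer;

public class StateRangeDemo {
    public static void main(String[] args) {
        BiasedLinearLayer layer = new BiasedLinearLayer("demo");
        // Each accessor returns a double describing the node state range.
        System.out.println("default state: " + layer.getDefaultState());
        System.out.println("minimum state: " + layer.getMinimumState());
        System.out.println("maximum state: " + layer.getMaximumState());
    }
}
```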