public class LogarithmicLayer extends SimpleLayer implements LearnableLayer
Fields inherited from class Layer: bias, gradientInps, gradientOuts, inps, inputPatternListeners, learnable, learning, m_batch, monitor, myLearner, outputPatternListeners, outs, running, step, STOP_FLAG
| Constructor | Description |
|---|---|
| `LogarithmicLayer()` | Creates a new instance of LogarithmicLayer. |
| `LogarithmicLayer(java.lang.String elemName)` | Creates a new instance of LogarithmicLayer with the given element name. |
| Modifier and Type | Method | Description |
|---|---|---|
| `protected void` | `backward(double[] pattern)` | Reverse transfer function of the component. |
| `protected void` | `forward(double[] pattern)` | Transfer function to recall a result on a trained net. |
| `double` | `getDefaultState()` | Return the default state of a node in this layer, such as 0 for a tanh or 0.5 for a sigmoid layer. |
| `double` | `getDerivative(int i)` | Similar to the backward message and used by RTRL. |
| `Learner` | `getLearner()` | Deprecated. Used only for backward compatibility. |
| `double` | `getMaximumState()` | Return the maximum value of a node in this layer. |
| `double` | `getMinimumState()` | Return the minimum value of a node in this layer. |
Methods inherited from class SimpleLayer: getLearningRate, getLrate, getMomentum, setDimensions, setLrate, setMomentum, setMonitor
Methods inherited from class Layer: addInputSynapse, addNoise, addOutputSynapse, adjustSizeToFwdPattern, adjustSizeToRevPattern, check, checkInputEnabled, checkInputs, checkOutputs, copyInto, finalize, fireFwdGet, fireFwdPut, fireRevGet, fireRevPut, fwdRun, getAllInputs, getAllOutputs, getBias, getDimension, getLastGradientInps, getLastGradientOuts, getLastInputs, getLastOutputs, getLayerName, getMonitor, getRows, getThreadMonitor, hasStepCounter, init, initLearner, InspectableTitle, Inspections, isInputLayer, isOutputLayer, isRunning, join, randomize, randomizeBias, randomizeWeights, removeAllInputs, removeAllOutputs, removeInputSynapse, removeListener, removeOutputSynapse, resetInputListeners, revRun, run, setAllInputs, setAllOutputs, setBias, setConnDimensions, setInputDimension, setInputSynapses, setLastInputs, setLastOutputs, setLayerName, setOutputDimension, setOutputSynapses, setRows, start, stop, sumBackInput, sumInput, toString
Methods inherited from class java.lang.Object: clone, equals, getClass, hashCode, notify, notifyAll, wait, wait, wait
Methods inherited from interface LearnableLayer: getMonitor, initLearner
Methods inherited from interface NeuralLayer: addInputSynapse, addNoise, addOutputSynapse, check, copyInto, getAllInputs, getAllOutputs, getBias, getLayerName, getMonitor, getRows, isRunning, removeAllInputs, removeAllOutputs, removeInputSynapse, removeOutputSynapse, setAllInputs, setAllOutputs, setBias, setLayerName, setMonitor, setRows, start
public LogarithmicLayer()
public LogarithmicLayer(java.lang.String elemName)
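A quick construction sketch (not part of the original page): it uses only the two constructors above plus the inherited setRows, setLayerName, getLayerName and getRows methods listed in the summary; the org.joone.engine package name is an assumption.

```java
// Minimal sketch of creating and naming a LogarithmicLayer.
// Only methods listed on this page are used; the package name is an assumption.
import org.joone.engine.LogarithmicLayer;

public class CreateLogarithmicLayer {
    public static void main(String[] args) {
        // No-arg constructor plus an explicit layer name...
        LogarithmicLayer layer = new LogarithmicLayer();
        layer.setLayerName("log-hidden");
        layer.setRows(4);  // four nodes in this layer

        // ...or pass the name directly to the second constructor.
        LogarithmicLayer named = new LogarithmicLayer("log-output");
        named.setRows(1);

        System.out.println(layer.getLayerName() + " rows=" + layer.getRows());
    }
}
```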
protected void forward(double[] pattern)
protected void backward(double[] pattern)
Overrides:
backward in class SimpleLayer
Parameters:
pattern - double[] - input pattern on which to apply the transfer function

public double getDerivative(int i)
Overrides:
getDerivative in class Layer
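The page does not state the transfer function itself. A sign-symmetric logarithm is a usual choice for a layer of this name, and the sketch below shows that assumed form together with the derivative that getDerivative(i) would return under it; treat the exact formula as an assumption rather than documented behaviour.

```java
// Assumed form of the logarithmic transfer function and its derivative:
//   y = ln(1 + x)   for x >= 0
//   y = -ln(1 - x)  for x <  0
//   dy/dx = 1 / (1 + |x|)  in both branches.
public final class LogTransferSketch {
    static double forward(double x) {
        return (x >= 0) ? Math.log(1 + x) : -Math.log(1 - x);
    }

    static double derivative(double x) {
        return 1.0 / (1.0 + Math.abs(x));
    }

    public static void main(String[] args) {
        for (double x : new double[] {-2.0, -0.5, 0.0, 0.5, 2.0}) {
            System.out.printf("x=%5.2f  y=%6.3f  dy/dx=%5.3f%n",
                    x, forward(x), derivative(x));
        }
    }
}
```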
public Learner getLearner()
Deprecated. Used only for backward compatibility.
Description copied from class: Layer
Specified by:
getLearner in interface Learnable
Overrides:
getLearner in class Layer
See Also:
Learnable.getLearner()
public double getDefaultState()
Description copied from class: Layer
Return the default state of a node in this layer, such as 0 for a tanh or 0.5 for a sigmoid layer.
Overrides:
getDefaultState in class Layer
public double getMinimumState()
Description copied from class: Layer
Return the minimum value of a node in this layer.
Overrides:
getMinimumState in class Layer
public double getMaximumState()
Description copied from class: Layer
Return the maximum value of a node in this layer.
Overrides:
getMaximumState in class Layer
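As one possible use of the three state accessors, the sketch below clamps a desired target value into the range reported by getMinimumState() and getMaximumState(); the clampToRange helper and the package name are illustrative assumptions, not part of the library.

```java
// Sketch: clamp a target into the range a layer can produce, using only
// the accessors documented above. clampToRange is not a library method.
import org.joone.engine.LogarithmicLayer;  // package name is an assumption

public class StateRangeExample {
    static double clampToRange(LogarithmicLayer layer, double target) {
        double lo = layer.getMinimumState();
        double hi = layer.getMaximumState();
        return Math.max(lo, Math.min(hi, target));
    }

    public static void main(String[] args) {
        LogarithmicLayer layer = new LogarithmicLayer();
        System.out.println("default state: " + layer.getDefaultState());
        System.out.println("clamped target: " + clampToRange(layer, 10.0));
    }
}
```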