org.joone.engine
Class DelayLayer

java.lang.Object
  extended by org.joone.engine.Layer
      extended by org.joone.engine.MemoryLayer
          extended by org.joone.engine.DelayLayer
All Implemented Interfaces:
java.io.Serializable, java.lang.Runnable, Learnable, LearnableLayer, NeuralLayer, Inspectable

public class DelayLayer
extends MemoryLayer

Delay unit to create temporal windows from a time series:
O---> Yk(t-N)
|
...
|
O---> Yk(t-1)
|
O---> Yk(t)
|
|<--------- Xk(t)

Where:
Xk = Input signal
Yk(t)...Yk(t-N+1) = Values of the output temporal window
N = taps
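
Example (not part of the Joone API reference itself): a minimal sketch of how a DelayLayer might be wired in front of a hidden layer so that the next layer sees the temporal window of the input series. It assumes the standard org.joone.engine classes SigmoidLayer and FullSynapse; the rows and taps values are illustrative only.

 import org.joone.engine.DelayLayer;
 import org.joone.engine.FullSynapse;
 import org.joone.engine.SigmoidLayer;

 public class DelayLayerSketch {
     public static void main(String[] args) {
         // The delay layer receives the raw series Xk(t) (one value per step)
         // and exposes the temporal window Yk(t), Yk(t-1), ... to the next layer.
         DelayLayer delay = new DelayLayer("delay");
         delay.setRows(1);   // dimension of the incoming signal
         delay.setTaps(5);   // N = 5 -> size of the temporal window (illustrative)

         SigmoidLayer hidden = new SigmoidLayer();
         hidden.setLayerName("hidden");
         hidden.setRows(10);

         // Fully connected synapse carrying the temporal window to the hidden layer.
         FullSynapse delayToHidden = new FullSynapse();
         delay.addOutputSynapse(delayToHidden);
         hidden.addInputSynapse(delayToHidden);

         // In a complete network the series would be fed into the delay layer
         // through an input synapse, and the layers would be registered with a
         // NeuralNet/Monitor; that plumbing is omitted from this sketch.
     }
 }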

See Also:
Serialized Form

Field Summary
 
Fields inherited from class org.joone.engine.MemoryLayer
backmemory, memory
 
Fields inherited from class org.joone.engine.Layer
bias, gradientInps, gradientOuts, inps, inputPatternListeners, learnable, learning, m_batch, monitor, myLearner, outputPatternListeners, outs, running, step, STOP_FLAG
 
Constructor Summary
DelayLayer()
          Default constructor.
DelayLayer(java.lang.String ElemName)
          Constructor that sets the layer's name.
 
Method Summary
protected  void backward(double[] pattern)
          Reverse transfer function of the component.
 java.util.TreeSet check()
          Get check messages from listeners.
protected  void forward(double[] pattern)
          Transfer function used to recall a result on a trained network.
 double getDefaultState()
          Return the default state of a node in this layer, such as 0 for a tanh layer or 0.5 for a sigmoid layer.
 double getDerivative(int i)
          Similar to the backward message; used by RTRL (real-time recurrent learning).
 double getMaximumState()
          Return the maximum value of a node in this layer.
 double getMinimumState()
          Return the minimum value of a node in this layer.
 java.util.Collection Inspections()
          Biases are not meaningful for this layer, so no inspections are returned.
 
Methods inherited from class org.joone.engine.MemoryLayer
getDimension, getTaps, setDimensions, setOutputDimension, setTaps, sumBackInput
 
Methods inherited from class org.joone.engine.Layer
addInputSynapse, addNoise, addOutputSynapse, adjustSizeToFwdPattern, adjustSizeToRevPattern, checkInputEnabled, checkInputs, checkOutputs, copyInto, finalize, fireFwdGet, fireFwdPut, fireRevGet, fireRevPut, fwdRun, getAllInputs, getAllOutputs, getBias, getLastGradientInps, getLastGradientOuts, getLastInputs, getLastOutputs, getLayerName, getLearner, getMonitor, getRows, getThreadMonitor, hasStepCounter, init, initLearner, InspectableTitle, isInputLayer, isOutputLayer, isRunning, join, randomize, randomizeBias, randomizeWeights, removeAllInputs, removeAllOutputs, removeInputSynapse, removeListener, removeOutputSynapse, resetInputListeners, revRun, run, setAllInputs, setAllOutputs, setBias, setConnDimensions, setInputDimension, setInputSynapses, setLastInputs, setLastOutputs, setLayerName, setMonitor, setOutputSynapses, setRows, start, stop, sumInput, toString
 
Methods inherited from class java.lang.Object
clone, equals, getClass, hashCode, notify, notifyAll, wait, wait, wait
 

Constructor Detail

DelayLayer

public DelayLayer()
Default constructor.


DelayLayer

public DelayLayer(java.lang.String ElemName)
Constructor that sets the layer's name.

Parameters:
ElemName - The layer's name

Method Detail

backward

protected void backward(double[] pattern)
Description copied from class: MemoryLayer
Reverse transfer function of the component.

Overrides:
backward in class MemoryLayer
Parameters:
pattern - double[] - input pattern on which to apply the transfer function

getDerivative

public double getDerivative(int i)
Similar to the backward message; used by RTRL (real-time recurrent learning).

Specified by:
getDerivative in class Layer

forward

protected void forward(double[] pattern)
Description copied from class: MemoryLayer
Transfer function used to recall a result on a trained network.

Overrides:
forward in class MemoryLayer
Parameters:
pattern - double[] - input pattern

check

public java.util.TreeSet check()
Description copied from class: Layer
Get check messages from listeners. Subclasses should call this method from their own check method.

Specified by:
check in interface NeuralLayer
Overrides:
check in class MemoryLayer
Returns:
validation errors.
See Also:
NeuralLayer
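
Example (illustrative only, not part of the Joone sources): the contract above says a subclass's check() should call this method. A minimal sketch of such an override, using a hypothetical subclass name:

 import java.util.TreeSet;
 import org.joone.engine.DelayLayer;

 public class CheckedDelayLayer extends DelayLayer {
     public TreeSet check() {
         // Collect the inherited check messages first, as the contract requires.
         TreeSet messages = super.check();
         // A real subclass would add its own validation errors to 'messages'
         // here (using Joone's check-message objects, omitted in this sketch).
         return messages;
     }
 }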

Inspections

public java.util.Collection Inspections()
Biases are not meaningful for this layer, so no inspections are returned.

Specified by:
Inspections in interface Inspectable
Overrides:
Inspections in class Layer
Returns:
null
See Also:
org.joone.Inspection

getDefaultState

public double getDefaultState()
Description copied from class: Layer
Return the default state of a node in this layer, such as 0 for a tanh layer or 0.5 for a sigmoid layer.

Specified by:
getDefaultState in class Layer

getMinimumState

public double getMinimumState()
Description copied from class: Layer
Return the minimum value of a node in this layer.

Specified by:
getMinimumState in class Layer

getMaximumState

public double getMaximumState()
Description copied from class: Layer
Return the maximum value of a node in this layer.

Specified by:
getMaximumState in class Layer

