org.joone.engine
Class ExtendableLearner

java.lang.Object
  extended by org.joone.engine.AbstractLearner
      extended by org.joone.engine.ExtendableLearner
All Implemented Interfaces:
java.io.Serializable, Learner
Direct Known Subclasses:
BasicLearner, BatchLearner, RpropLearner

public class ExtendableLearner
extends AbstractLearner

Learners that extend this class must implement a fixed set of methods, a so-called skeleton. The benefit is that, because learners share this skeleton, plug-ins can be added to them, for example plug-ins that change the objective function or the delta-update rule. Learners that do not fit into this skeleton still have the opportunity to implement Learner directly (or extend AbstractLearner), but they will not be able to use the extra plug-ins (unless the programmer builds that support into the learner itself). Without any extenders this class behaves as the BasicLearner, but by adding extenders it can provide totally different learning algorithms.
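
The skeleton-plus-extenders idea can be pictured with a small, self-contained sketch. This is an illustrative reduction, not Joone's actual code: the class and interface names below are hypothetical, and the real DeltaRuleExtender interface is richer than the single method shown here.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical, simplified extender interface: it may transform the
// default delta before the weight is updated.
interface DeltaExtenderSketch {
    double getDelta(double defaultDelta);
}

// A momentum-style extender: adds a fraction of the previous delta.
class MomentumSketch implements DeltaExtenderSketch {
    private final double momentum;
    private double lastDelta = 0.0;

    MomentumSketch(double momentum) { this.momentum = momentum; }

    @Override
    public double getDelta(double defaultDelta) {
        double delta = defaultDelta + momentum * lastDelta;
        lastDelta = delta;
        return delta;
    }
}

class SketchLearner {
    private final List<DeltaExtenderSketch> extenders = new ArrayList<>();

    void addDeltaRuleExtender(DeltaExtenderSketch e) { extenders.add(e); }

    // The "skeleton" method: compute the default delta (learning rate
    // times gradient), then give every extender a chance to modify it.
    double getDelta(double learningRate, double gradient) {
        double delta = learningRate * gradient;
        for (DeltaExtenderSketch e : extenders) {
            delta = e.getDelta(delta);
        }
        return delta;
    }
}
```

Because the skeleton owns the loop over extenders, a momentum term, simulated annealing, or any other delta-rule variant plugs into every learner that extends the skeleton, instead of being re-implemented per learner.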

Author:
Boris Jansen
See Also:
Serialized Form

Field Summary
protected  java.util.List theDeltaRuleExtenders
          The list with delta rule extenders, extenders that change the delta w, e.g. a momentum term.
protected  java.util.List theGradientExtenders
          The list with gradient extenders, extenders that change the gradient.
protected  UpdateWeightExtender theUpdateWeightExtender
          The update weight extender, that is, the way to update the weights, online, batch mode, etc.
 
Fields inherited from class org.joone.engine.AbstractLearner
learnable, learnableLayer, learnableSynapse, monitor
 
Constructor Summary
ExtendableLearner()
          Creates a new instance of ExtendableLearner
 
Method Summary
 void addDeltaRuleExtender(DeltaRuleExtender aDeltaRuleExtender)
          Adds a delta extender.
 void addGradientExtender(GradientExtender aGradientExtender)
          Adds a gradient extender.
 double getDefaultDelta(double[] currentGradientOuts, int j)
          Gets the default (normal calculation of) delta.
 double getDefaultDelta(double[] currentInps, int j, double[] currentPattern, int k)
          Gets the default (normal calculation of) delta.
 double getDefaultGradientBias(double[] currentGradientOuts, int j)
          Gets the default (normal calculation of the) gradient for biases.
 double getDefaultGradientWeight(double[] currentInps, int j, double[] currentPattern, int k)
          Gets the default (normal calculation of the) gradient for weights.
protected  double getDelta(double[] currentGradientOuts, int j)
          Computes the delta value for a bias.
protected  double getDelta(double[] currentInps, int j, double[] currentPattern, int k)
          Computes the delta value for a weight.
 double getGradientBias(double[] currentGradientOuts, int j)
          Gets the gradient for biases.
 double getGradientWeight(double[] currentInps, int j, double[] currentPattern, int k)
          Gets the gradient for weights.
protected  double getLearningRate(int j)
          Gets the learning rate.
protected  double getLearningRate(int j, int k)
          Gets the learning rate.
 UpdateWeightExtender getUpdateWeightExtender()
          Gets the update weight extender.
protected  void postBiasUpdate(double[] currentGradientOuts)
          Gives learners and extenders a chance to do some post-computing after the biases are updated.
protected  void postBiasUpdateImpl(double[] currentGradientOuts)
          Gives learners a chance to do some post-computing after the biases are updated.
protected  void postWeightUpdate(double[] currentPattern, double[] currentInps)
          Gives learners and extenders a chance to do some post-computing after the weights are updated.
protected  void postWeightUpdateImpl(double[] currentPattern, double[] currentInps)
          Gives learners a chance to do some post-computing after the weights are updated.
protected  void preBiasUpdate(double[] currentGradientOuts)
          Gives learners and extenders a chance to do some pre-computing before the biases are updated.
protected  void preBiasUpdateImpl(double[] currentGradientOuts)
          Gives learners a chance to do some pre-computing before the biases are updated.
protected  void preWeightUpdate(double[] currentPattern, double[] currentInps)
          Gives learners and extenders a chance to do some pre-computing before the weights are updated.
protected  void preWeightUpdateImpl(double[] currentPattern, double[] currentInps)
          Gives learners a chance to do some pre-computing before the weights are updated.
 void requestBiasUpdate(double[] currentGradientOuts)
          Override this method to implement what should be done to LearnableLayers
 void requestWeightUpdate(double[] currentPattern, double[] currentInps)
          Override this method to implement what should be done to LearnableSynapses
 void setUpdateWeightExtender(UpdateWeightExtender anUpdateWeightExtender)
          Sets an update weight extender.
protected  void updateBias(int j, double aDelta)
          Updates a bias with the calculated delta value.
protected  void updateWeight(int j, int k, double aDelta)
          Updates a weight with the calculated delta value.
 
Methods inherited from class org.joone.engine.AbstractLearner
getLayer, getMonitor, getSynapse, registerLearnable, setMonitor
 
Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
 

Field Detail

theDeltaRuleExtenders

protected java.util.List theDeltaRuleExtenders
The list with delta rule extenders, extenders that change the delta w, e.g. a momentum term.


theGradientExtenders

protected java.util.List theGradientExtenders
The list with gradient extenders, extenders that change the gradient.


theUpdateWeightExtender

protected UpdateWeightExtender theUpdateWeightExtender
The update weight extender, that is, the way to update the weights, online, batch mode, etc.

Constructor Detail

ExtendableLearner

public ExtendableLearner()
Creates a new instance of ExtendableLearner

Method Detail

requestBiasUpdate

public final void requestBiasUpdate(double[] currentGradientOuts)
Description copied from interface: Learner
Override this method to implement what should be done to LearnableLayers


requestWeightUpdate

public final void requestWeightUpdate(double[] currentPattern,
                                      double[] currentInps)
Description copied from interface: Learner
Override this method to implement what should be done to LearnableSynapses


updateBias

protected void updateBias(int j,
                          double aDelta)
Updates a bias with the calculated delta value.

Parameters:
j - the index of the bias to update.
aDelta - the calculated delta value.

updateWeight

protected void updateWeight(int j,
                            int k,
                            double aDelta)
Updates a weight with the calculated delta value.

Parameters:
j - the input index of the weight to update.
k - the output index of the weight to update.
aDelta - the calculated delta value.

getDelta

protected double getDelta(double[] currentGradientOuts,
                          int j)
Computes the delta value for a bias.

Parameters:
currentGradientOuts - the back propagated gradients.
j - the index of the bias.

getDefaultDelta

public double getDefaultDelta(double[] currentGradientOuts,
                              int j)
Gets the default (normal calculation of) delta.

Parameters:
currentGradientOuts - the back propagated gradients.
j - the index of the bias.

getDelta

protected double getDelta(double[] currentInps,
                          int j,
                          double[] currentPattern,
                          int k)
Computes the delta value for a weight.

Parameters:
currentInps - the forwarded input.
j - the input index of the weight.
currentPattern - the back propagated gradients.
k - the output index of the weight.

getDefaultDelta

public double getDefaultDelta(double[] currentInps,
                              int j,
                              double[] currentPattern,
                              int k)
Gets the default (normal calculation of) delta.

Parameters:
currentInps - the forwarded input.
j - the input index of the weight.
currentPattern - the back propagated gradients.
k - the output index of the weight.
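
For orientation, the default computations line up with plain gradient descent: the gradient for weight w_j_k is the forwarded input at index j times the back propagated gradient at index k, and the default delta is that gradient scaled by the learning rate. The sketch below illustrates this textbook rule; it mirrors the documented method signatures but is not a copy of Joone's source.

```java
// Hedged sketch of the standard backprop default computations for a
// weight w_j_k. Array and method names mirror the documentation; the
// arithmetic is the textbook gradient-descent rule.
class DefaultDeltaSketch {
    private final double learningRate;

    DefaultDeltaSketch(double learningRate) { this.learningRate = learningRate; }

    // gradient(w_j_k) = input activation j * back-propagated gradient k
    double getDefaultGradientWeight(double[] currentInps, int j,
                                    double[] currentPattern, int k) {
        return currentInps[j] * currentPattern[k];
    }

    // delta(w_j_k) = learning rate * gradient(w_j_k)
    double getDefaultDelta(double[] currentInps, int j,
                           double[] currentPattern, int k) {
        return learningRate * getDefaultGradientWeight(currentInps, j, currentPattern, k);
    }
}
```

Gradient extenders would hook into the first step and delta rule extenders into the second, which is why the class exposes both getDefaultGradientWeight and getDefaultDelta separately.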

getLearningRate

protected double getLearningRate(int j)
Gets the learning rate.

Parameters:
j - the index of the bias (for which we should get the learning rate).
Returns:
the learning rate for a bias.

getLearningRate

protected double getLearningRate(int j,
                                 int k)
Gets the learning rate.

Parameters:
j - the input index of the weight (for which we should get the learning rate).
k - the output index of the weight (for which we should get the learning rate).
Returns:
the learning rate for a weight.

getGradientBias

public double getGradientBias(double[] currentGradientOuts,
                              int j)
Gets the gradient for biases.

Parameters:
currentGradientOuts - the back propagated gradients.
j - the index of the bias.
Returns:
the gradient for bias b_j.

getDefaultGradientBias

public double getDefaultGradientBias(double[] currentGradientOuts,
                                     int j)
Gets the default (normal calculation of the) gradient for biases.

Parameters:
currentGradientOuts - the back propagated gradients.
j - the index of the bias.
Returns:
the gradient for bias b_j.

getGradientWeight

public double getGradientWeight(double[] currentInps,
                                int j,
                                double[] currentPattern,
                                int k)
Gets the gradient for weights.

Parameters:
currentInps - the forwarded input.
j - the input index of the weight.
currentPattern - the back propagated gradients.
k - the output index of the weight.
Returns:
the gradient for the weight w_j_k.

getDefaultGradientWeight

public double getDefaultGradientWeight(double[] currentInps,
                                       int j,
                                       double[] currentPattern,
                                       int k)
Gets the default (normal calculation of the) gradient for weights.

Parameters:
currentInps - the forwarded input.
j - the input index of the weight.
currentPattern - the back propagated gradients.
k - the output index of the weight.
Returns:
the gradient for the weight w_j_k.

preBiasUpdate

protected final void preBiasUpdate(double[] currentGradientOuts)
Gives learners and extenders a chance to do some pre-computing before the biases are updated.

Parameters:
currentGradientOuts - the back propagated gradients.

preBiasUpdateImpl

protected void preBiasUpdateImpl(double[] currentGradientOuts)
Gives learners a chance to do some pre-computing before the biases are updated.

Parameters:
currentGradientOuts - the back propagated gradients.

preWeightUpdate

protected final void preWeightUpdate(double[] currentPattern,
                                     double[] currentInps)
Gives learners and extenders a chance to do some pre-computing before the weights are updated.

Parameters:
currentPattern - the back propagated gradients.
currentInps - the forwarded input.

preWeightUpdateImpl

protected void preWeightUpdateImpl(double[] currentPattern,
                                   double[] currentInps)
Gives learners a chance to do some pre-computing before the weights are updated.

Parameters:
currentPattern - the back propagated gradients.
currentInps - the forwarded input.

postBiasUpdate

protected final void postBiasUpdate(double[] currentGradientOuts)
Gives learners and extenders a chance to do some post-computing after the biases are updated.

Parameters:
currentGradientOuts - the back propagated gradients.

postBiasUpdateImpl

protected void postBiasUpdateImpl(double[] currentGradientOuts)
Gives learners a chance to do some post-computing after the biases are updated.

Parameters:
currentGradientOuts - the back propagated gradients.

postWeightUpdate

protected final void postWeightUpdate(double[] currentPattern,
                                      double[] currentInps)
Gives learners and extenders a chance to do some post-computing after the weights are updated.

Parameters:
currentPattern - the back propagated gradients.
currentInps - the forwarded input.

postWeightUpdateImpl

protected void postWeightUpdateImpl(double[] currentPattern,
                                    double[] currentInps)
Gives learners a chance to do some post-computing after the weights are updated.

Parameters:
currentPattern - the back propagated gradients.
currentInps - the forwarded input.
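
The final preXxx/postXxx methods paired with overridable ...Impl hooks suggest a template-method pattern: the final method first gives every registered extender its turn, then delegates to the Impl hook that concrete learners override. Here is a hypothetical, self-contained illustration of that shape; the names are illustrative, not Joone's:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the template-method pattern implied by a final
// preBiasUpdate paired with an overridable preBiasUpdateImpl.
class HookSketch {
    interface Extender { void preBiasUpdate(double[] gradOuts); }

    private final List<Extender> extenders = new ArrayList<>();
    final List<String> trace = new ArrayList<>();   // records call order for the demo

    void addExtender(Extender e) { extenders.add(e); }

    // Final: subclasses cannot skip the extender notification...
    final void preBiasUpdate(double[] gradOuts) {
        for (Extender e : extenders) {
            e.preBiasUpdate(gradOuts);
        }
        preBiasUpdateImpl(gradOuts);   // ...they customize via the Impl hook.
    }

    // Overridable hook for concrete learners.
    void preBiasUpdateImpl(double[] gradOuts) {
        trace.add("learner");
    }
}
```

Making the outer method final guarantees that extenders always run, no matter how a concrete learner overrides the Impl hook.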

addDeltaRuleExtender

public void addDeltaRuleExtender(DeltaRuleExtender aDeltaRuleExtender)
Adds a delta extender.

Parameters:
aDeltaRuleExtender - the delta rule extender to add.

addGradientExtender

public void addGradientExtender(GradientExtender aGradientExtender)
Adds a gradient extender.

Parameters:
aGradientExtender - the gradient extender to add.

setUpdateWeightExtender

public void setUpdateWeightExtender(UpdateWeightExtender anUpdateWeightExtender)
Sets an update weight extender.

Parameters:
anUpdateWeightExtender - the update weight extender to set.

getUpdateWeightExtender

public UpdateWeightExtender getUpdateWeightExtender()
Gets the update weight extender.

Returns:
the update weight extender.

