Packages that use NeuralLayer
  org.joone.engine
  org.joone.net
  org.joone.structure
Uses of NeuralLayer in org.joone.engine

Subinterfaces of NeuralLayer in org.joone.engine
  interface LearnableLayer
Classes in org.joone.engine that implement NeuralLayer
  class BiasedLinearLayer
      This layer consists of linear neurons.
  class ContextLayer
      The context layer is similar to the linear layer, except that it has an auto-recurrent connection between its output and input.
  class DelayLayer
      Delay unit used to create temporal windows from a time series.
  class GaussianLayer
      This layer implements the Gaussian Neighborhood SOM strategy.
  class GaussLayer
      The output of a Gauss(ian) layer neuron is the sum of the weighted input values, applied to a Gaussian curve (exp(-x * x)).
  class Layer
      The Layer object is the basic element forming the neural net.
  class LinearLayer
      The output of a linear layer neuron is the sum of the weighted input values, scaled by the beta parameter.
  class LogarithmicLayer
      This layer implements a logarithmic transfer function.
  class MemoryLayer
  class RbfGaussianLayer
      This class implements the nonlinear layer of Radial Basis Function (RBF) networks using Gaussian functions.
  class RbfLayer
      This is the basis (helper) class for radial basis function layers.
  class SigmoidLayer
      The output of a sigmoid layer neuron is the sum of the weighted input values, applied to a sigmoid function.
  class SimpleLayer
      This abstract class represents layers composed of neurons that implement some transfer function.
  class SineLayer
      The output of a sine layer neuron is the sum of the weighted input values, applied to a sine function (sin(x)).
  class SoftmaxLayer
      The outputs of the Softmax layer must be interpreted as probabilities.
  class TanhLayer
      Layer that applies the hyperbolic tangent transfer function to its input patterns.
  class WTALayer
      This layer implements the Winner Takes All SOM strategy.
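A minimal sketch of how these layer classes are typically wired together. It is not taken from this page: setRows, setLayerName, FullSynapse and the addOutputSynapse/addInputSynapse calls are assumed from the wider Joone API.

    import org.joone.engine.FullSynapse;
    import org.joone.engine.LinearLayer;
    import org.joone.engine.SigmoidLayer;

    public class LayerWiringSketch {
        public static void main(String[] args) {
            // Input layer: linear neurons (output = beta * weighted input sum).
            LinearLayer input = new LinearLayer();
            input.setRows(2);              // two input neurons
            input.setLayerName("input");

            // Hidden layer: sigmoid neurons.
            SigmoidLayer hidden = new SigmoidLayer();
            hidden.setRows(3);             // three hidden neurons
            hidden.setLayerName("hidden");

            // Fully connect the two layers: the synapse is registered as the
            // input layer's output and as the hidden layer's input.
            FullSynapse synapse = new FullSynapse();
            input.addOutputSynapse(synapse);
            hidden.addInputSynapse(synapse);
        }
    }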
Methods in org.joone.engine that return NeuralLayer
  NeuralLayer Layer.copyInto(NeuralLayer newLayer)
      Copies one layer into another, to obtain a type-transformation from one kind of Layer to another.
  NeuralLayer NeuralLayer.copyInto(NeuralLayer newLayer)
      Copies one layer into another, to obtain a type-transformation from one kind of Layer to another.

Methods in org.joone.engine with parameters of type NeuralLayer
  NeuralLayer Layer.copyInto(NeuralLayer newLayer)
      Copies one layer into another, to obtain a type-transformation from one kind of Layer to another.
  NeuralLayer NeuralLayer.copyInto(NeuralLayer newLayer)
      Copies one layer into another, to obtain a type-transformation from one kind of Layer to another.
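The copyInto methods above convert a configured layer to a different layer type. A minimal sketch, assuming copyInto returns the populated target layer; TanhLayer stands in here for any class from the table above.

    import org.joone.engine.NeuralLayer;
    import org.joone.engine.SigmoidLayer;
    import org.joone.engine.TanhLayer;

    public class CopyIntoSketch {
        public static void main(String[] args) {
            // An already-configured sigmoid layer.
            SigmoidLayer sigmoid = new SigmoidLayer();
            sigmoid.setRows(4);
            sigmoid.setLayerName("hidden");

            // Type-transformation: copy the layer's settings into a TanhLayer.
            // Assumption: the returned NeuralLayer is the target layer, ready
            // to be used in place of the original one.
            NeuralLayer tanh = sigmoid.copyInto(new TanhLayer());
        }
    }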
Uses of NeuralLayer in org.joone.net

Classes in org.joone.net that implement NeuralLayer
  class NestedNeuralLayer
  class NeuralNet
      This object represents a container for a neural network, giving the developer the ability to manage the neural network as a whole.
Methods in org.joone.net that return NeuralLayer
  NeuralLayer NeuralNet.copyInto(NeuralLayer p1)
      Not implemented.
  NeuralLayer NestedNeuralLayer.copyInto(NeuralLayer p1)

Methods in org.joone.net with parameters of type NeuralLayer
  NeuralLayer NeuralNet.copyInto(NeuralLayer p1)
      Not implemented.
  NeuralLayer NestedNeuralLayer.copyInto(NeuralLayer p1)
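Since NeuralNet itself implements NeuralLayer, individual layers are usually assembled inside it and the whole network is then managed as a unit. A minimal sketch, assuming the addLayer(NeuralLayer, int) overload and the INPUT_LAYER/OUTPUT_LAYER constants seen in common Joone examples.

    import org.joone.engine.FullSynapse;
    import org.joone.engine.LinearLayer;
    import org.joone.engine.SigmoidLayer;
    import org.joone.net.NeuralNet;

    public class NeuralNetSketch {
        public static void main(String[] args) {
            // Two layers joined by a full synapse, as in the earlier sketch.
            LinearLayer input = new LinearLayer();
            input.setRows(2);
            SigmoidLayer output = new SigmoidLayer();
            output.setRows(1);

            FullSynapse synapse = new FullSynapse();
            input.addOutputSynapse(synapse);
            output.addInputSynapse(synapse);

            // The NeuralNet container manages the layers as a whole.
            NeuralNet net = new NeuralNet();
            net.addLayer(input, NeuralNet.INPUT_LAYER);    // assumed constants
            net.addLayer(output, NeuralNet.OUTPUT_LAYER);
        }
    }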
Uses of NeuralLayer in org.joone.structure

Classes in org.joone.structure that implement NeuralLayer
  class NetworkLayer
      Wraps an existing Joone network into a single layer.