Package | Description
---|---
org.joone.engine |
org.joone.net |
org.joone.structure |
Modifier and Type | Interface and Description
---|---
interface | LearnableLayer
Modifier and Type | Class and Description
---|---
class | BiasedLinearLayer: This layer consists of linear neurons with an added bias value.
class | ContextLayer: The context layer is similar to the linear layer, except that it has an auto-recurrent connection between its output and input.
class | DelayLayer: Delay unit that creates temporal windows from time series data.
class | GaussianLayer: This layer implements the Gaussian Neighborhood SOM strategy.
class | GaussLayer: The output of a Gaussian layer neuron is the sum of the weighted input values, applied to a Gaussian curve (exp(-x * x)).
class | Layer: The Layer object is the basic element forming the neural net.
class | LinearLayer: The output of a linear layer neuron is the sum of the weighted input values, scaled by the beta parameter.
class | LogarithmicLayer: This layer implements a logarithmic transfer function.
class | MemoryLayer
class | RbfGaussianLayer: This class implements the nonlinear layer in Radial Basis Function (RBF) networks using Gaussian functions.
class | RbfLayer: This is the base (helper) class for radial basis function layers.
class | SigmoidLayer: The output of a sigmoid layer neuron is the sum of the weighted input values, applied to a sigmoid function.
class | SimpleLayer: This abstract class represents layers composed of neurons that implement some transfer function.
class | SineLayer: The output of a sine layer neuron is the sum of the weighted input values, applied to a sine function (sin(x)).
class | SoftmaxLayer: The outputs of the Softmax layer must be interpreted as probabilities.
class | TanhLayer: Layer that applies the hyperbolic tangent transfer function to its input patterns.
class | WTALayer: This layer implements the Winner Takes All SOM strategy.
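All of the concrete classes above extend Layer and differ mainly in the transfer function they apply, so they are created and wired in the same way. The following is a minimal, illustrative sketch (not taken from the Joone documentation; methods such as setRows, setLayerName, addInputSynapse and addOutputSynapse are the standard Joone Layer API, but verify them against your Joone version):

```java
import org.joone.engine.FullSynapse;
import org.joone.engine.LinearLayer;
import org.joone.engine.SigmoidLayer;

public class LayerWiringSketch {
    public static void main(String[] args) {
        // Input layer: linear neurons, output = beta * weighted sum of inputs.
        LinearLayer input = new LinearLayer();
        input.setLayerName("input");
        input.setRows(2);              // two input neurons

        // Hidden layer: sigmoid transfer function applied to the weighted sum.
        SigmoidLayer hidden = new SigmoidLayer();
        hidden.setLayerName("hidden");
        hidden.setRows(3);             // three sigmoid neurons

        // A fully connected synapse carries patterns from the input layer
        // to the hidden layer.
        FullSynapse synapse = new FullSynapse();
        input.addOutputSynapse(synapse);
        hidden.addInputSynapse(synapse);
    }
}
```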
Modifier and Type | Method and Description
---|---
NeuralLayer | Layer.copyInto(NeuralLayer newLayer): Copies one layer into another, to obtain a type transformation from one kind of Layer to another.
NeuralLayer | NeuralLayer.copyInto(NeuralLayer newLayer): Copies one layer into another, to obtain a type transformation from one kind of Layer to another.
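A rough sketch of how copyInto can be used to change the type of an existing layer, assuming, as the description above states, that the configuration of the source layer is transferred into the freshly created target layer and the target is returned (check the Layer javadoc for exactly which properties are copied):

```java
import org.joone.engine.LinearLayer;
import org.joone.engine.NeuralLayer;
import org.joone.engine.SigmoidLayer;

public class CopyIntoSketch {
    public static void main(String[] args) {
        LinearLayer linear = new LinearLayer();
        linear.setLayerName("hidden");
        linear.setRows(4);

        // Transform the existing linear layer into a sigmoid layer of the
        // same shape: copyInto transfers the configuration into the target
        // layer and returns it.
        NeuralLayer sigmoid = linear.copyInto(new SigmoidLayer());
    }
}
```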
Modifier and Type | Class and Description
---|---
class | NestedNeuralLayer
class | NeuralNet: This object represents a container of a neural network, giving the developer the ability to manage the neural network as a whole.
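A minimal sketch of using NeuralNet as a container for the layers defined above (the addLayer(Layer, int) overload and the INPUT_LAYER/HIDDEN_LAYER/OUTPUT_LAYER constants follow the usual Joone sample code; confirm them against your Joone version):

```java
import org.joone.engine.FullSynapse;
import org.joone.engine.LinearLayer;
import org.joone.engine.SigmoidLayer;
import org.joone.net.NeuralNet;

public class NeuralNetSketch {
    public static void main(String[] args) {
        LinearLayer  input  = new LinearLayer();
        SigmoidLayer hidden = new SigmoidLayer();
        SigmoidLayer output = new SigmoidLayer();
        input.setRows(2);
        hidden.setRows(3);
        output.setRows(1);

        // Connect the layers with fully connected synapses.
        FullSynapse inToHid  = new FullSynapse();
        FullSynapse hidToOut = new FullSynapse();
        input.addOutputSynapse(inToHid);
        hidden.addInputSynapse(inToHid);
        hidden.addOutputSynapse(hidToOut);
        output.addInputSynapse(hidToOut);

        // The NeuralNet container manages the whole network as a single object.
        NeuralNet net = new NeuralNet();
        net.addLayer(input, NeuralNet.INPUT_LAYER);
        net.addLayer(hidden, NeuralNet.HIDDEN_LAYER);
        net.addLayer(output, NeuralNet.OUTPUT_LAYER);
    }
}
```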
Modifier and Type | Method and Description
---|---
NeuralLayer | NeuralNet.copyInto(NeuralLayer p1): Not implemented.
NeuralLayer | NestedNeuralLayer.copyInto(NeuralLayer p1)
Modifier and Type | Class and Description
---|---
class | NetworkLayer: Wraps an existing Joone network into a single layer.