Artificial Neural Networks (ANNs)
- Nineteen different neural network architectures were applied.
- The networks differed from each other in the number of
hidden layers and in the number of neurons per hidden layer.
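A candidate set like this can be generated by enumerating depth/width combinations. The grid below is purely hypothetical: the source states only that 19 architectures were tried, not which depths or widths were used.

```python
from itertools import product

# Hypothetical search grid -- the source does not list the actual
# configurations, only that they differ in depth and width.
hidden_layer_counts = [1, 2]          # assumed numbers of hidden layers
neurons_per_layer = [2, 4, 8, 16]     # assumed neurons per hidden layer

# Each architecture is a tuple of hidden-layer sizes, e.g. (4, 4)
# means two hidden layers of four neurons each.
architectures = [
    (n,) * depth
    for depth, n in product(hidden_layer_counts, neurons_per_layer)
]
print(architectures)
```

With these assumed values the grid yields 8 architectures; the actual study evidently used a richer (or irregular) set to reach 19.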
- In these feed-forward networks (multilayer perceptrons),
every node in a layer (input, hidden, and output) is fully
connected, via adjustable weights, to every node in the
subsequent layer.
- Thus, the total input $net_j$ to node $j$ of the hidden layer
can be written as follows:
  $net_j = \sum_i w_{ij}\, x_i$
- The $w_{ij}$ are the adjustable connection weights and the
$x_i$ are the node's inputs.
- Constant terms, so-called biases, are obtained by introducing
an extra input $x_0 = 1$ to each unit, whose weight then acts
as the bias.
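The net-input computation with the bias realised as an extra constant input can be sketched as follows; the weight and input values are illustrative, not from the source.

```python
# Sketch of the net input to one hidden node: the bias is an extra
# input x_0 = 1 whose weight weights[0] plays the role of the bias.

def net_input(weights, inputs):
    """Weighted sum over inputs, with weights[0] acting on the constant 1."""
    extended = [1.0] + list(inputs)   # prepend the bias input x_0 = 1
    return sum(w * x for w, x in zip(weights, extended))

# Hypothetical values: [bias weight, w_1, w_2] and two inputs.
w = [0.5, -1.0, 2.0]
x = [3.0, 0.25]
print(net_input(w, x))  # 0.5*1 - 1.0*3.0 + 2.0*0.25 = -2.0
```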
- The transfer functions are of the logsig (logistic sigmoid)
type, so each unit has a real-valued output in the interval (0, 1):
  $y_j = \dfrac{1}{1 + e^{-net_j}}$
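The logsig transfer function above is straightforward to state in code; this minimal sketch just evaluates the formula and shows its squashing behaviour.

```python
import math

def logsig(net):
    """Logistic sigmoid: maps any real net input into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-net))

print(logsig(0.0))    # 0.5 exactly
print(logsig(10.0))   # close to 1 for large positive net input
print(logsig(-10.0))  # close to 0 for large negative net input
```

Because the output is bounded and monotonic, the same function is applied at every hidden and output node of the networks described above.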