Constant Index
E
 ERRORFUNC_LINEAR, FANN
 ERRORFUNC_TANH, FANN
F
 FANN_ACTIVATIONFUNC_NAMES
 FANN_COS
 FANN_COS_SYMMETRIC
 FANN_E_CANT_ALLOCATE_MEM
 FANN_E_CANT_OPEN_CONFIG_R
 FANN_E_CANT_OPEN_CONFIG_W
 FANN_E_CANT_OPEN_TD_R
 FANN_E_CANT_OPEN_TD_W
 FANN_E_CANT_READ_CONFIG
 FANN_E_CANT_READ_CONNECTIONS
 FANN_E_CANT_READ_NEURON
 FANN_E_CANT_READ_TD
 FANN_E_CANT_TRAIN_ACTIVATION
 FANN_E_CANT_USE_ACTIVATION
 FANN_E_CANT_USE_TRAIN_ALG
 FANN_E_INDEX_OUT_OF_BOUND
 FANN_E_INPUT_NO_MATCH
 FANN_E_NO_ERROR
 FANN_E_OUTPUT_NO_MATCH
 FANN_E_SCALE_NOT_PRESENT
 FANN_E_TRAIN_DATA_MISMATCH
 FANN_E_TRAIN_DATA_SUBSET
 FANN_E_WRONG_CONFIG_VERSION
 FANN_E_WRONG_NUM_CONNECTIONS
 FANN_ELLIOT
 FANN_ELLIOT_SYMMETRIC
 FANN_ERRORFUNC_LINEAR
 FANN_ERRORFUNC_NAMES
 FANN_ERRORFUNC_TANH
 FANN_GAUSSIAN
 FANN_GAUSSIAN_SYMMETRIC
 FANN_LINEAR
 FANN_LINEAR_PIECE
 FANN_LINEAR_PIECE_SYMMETRIC
 FANN_NETTYPE_LAYER
 FANN_NETTYPE_SHORTCUT
 FANN_NETWORK_TYPE_NAMES
 FANN_SIGMOID
 FANN_SIGMOID_STEPWISE
 FANN_SIGMOID_SYMMETRIC
 FANN_SIN
 FANN_SIN_SYMMETRIC
 FANN_STOPFUNC_BIT
 FANN_STOPFUNC_MSE
 FANN_STOPFUNC_NAMES
 FANN_THRESHOLD
 FANN_THRESHOLD_SYMMETRIC
 FANN_TRAIN_BATCH
 FANN_TRAIN_INCREMENTAL
 FANN_TRAIN_NAMES
 FANN_TRAIN_QUICKPROP
 FANN_TRAIN_RPROP
L
 LAYER, FANN
S
 SHORTCUT, FANN
 STOPFUNC_BIT, FANN
 STOPFUNC_MSE, FANN
T
 TRAIN_BATCH, FANN
 TRAIN_INCREMENTAL, FANN
 TRAIN_QUICKPROP, FANN
 TRAIN_RPROP, FANN
Standard linear error function.
Tanh error function, usually better but can require a lower learning rate.
Constant array consisting of the names of the activation functions, so that the name of an activation function can be looked up by its enum value.
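The lookup described above can be sketched as follows. This is a self-contained stand-in, not the real header: the enum is trimmed to three members for illustration, whereas the actual declarations in fann_data.h list every activation function in order (so the real indices differ).

```c
#include <assert.h>
#include <string.h>

/* Trimmed-down stand-in for fann_data.h: the enum and the name array
 * are kept in the same order, so an enum value indexes its own name. */
enum fann_activationfunc_enum
{
    FANN_LINEAR = 0,
    FANN_THRESHOLD,
    FANN_SIGMOID
};

static char const *const FANN_ACTIVATIONFUNC_NAMES[] =
{
    "FANN_LINEAR",
    "FANN_THRESHOLD",
    "FANN_SIGMOID"
};

/* Name of an activation function, received by indexing with the enum. */
static const char *activation_name(enum fann_activationfunc_enum f)
{
    return FANN_ACTIVATIONFUNC_NAMES[f];
}
```

For example, `activation_name(FANN_SIGMOID)` yields the string `"FANN_SIGMOID"`.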
Periodic cosine activation function.
Unable to allocate memory
Unable to open configuration file for reading
Unable to open configuration file for writing
Unable to open train data file for reading
Unable to open train data file for writing
Error reading info from configuration file
Error reading connections from configuration file
Error reading neuron info from configuration file
Error reading training data from file
Unable to train with the selected activation function
Unable to use the selected activation function
Unable to use the selected training algorithm
Index is out of bounds
The number of input neurons in the ann does not match the number in the data
No error
The number of output neurons in the ann does not match the number in the data
Scaling parameters not present
Irreconcilable differences between two struct fann_train_data structures
Trying to take a subset which is not within the training set
Wrong version of configuration file
Number of connections not equal to the number expected
Fast sigmoid-like activation function defined by David Elliott.
Fast symmetric sigmoid-like activation function defined by David Elliott.
Constant array consisting of the names of the training error functions, so that the name of an error function can be looked up by its enum value.
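A minimal sketch of this lookup; the enum members and their order below are believed to match the declarations in fann_data.h (the error-function enum has exactly these two members), but treat them as an assumption rather than a copy of the header.

```c
#include <assert.h>
#include <string.h>

/* Sketch of the error-function declarations: the enum and the name
 * array are in matching order, so the enum value indexes its name. */
enum fann_errorfunc_enum
{
    FANN_ERRORFUNC_LINEAR = 0,
    FANN_ERRORFUNC_TANH
};

static char const *const FANN_ERRORFUNC_NAMES[] =
{
    "FANN_ERRORFUNC_LINEAR",
    "FANN_ERRORFUNC_TANH"
};

/* Printable name of an error function, e.g. for logging a setting. */
static const char *errorfunc_name(enum fann_errorfunc_enum f)
{
    return FANN_ERRORFUNC_NAMES[f];
}
```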
Gaussian activation function.
Symmetric gaussian activation function.
Linear activation function.
Bounded linear activation function.
Each layer only has connections to the next layer
Each layer has connections to all following layers
Constant array consisting of the names of the network types, so that the name of a network type can be looked up by its enum value.
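A minimal sketch of this lookup; the two enum members and their order are believed to match fann_data.h, but treat them as an assumption rather than a copy of the header.

```c
#include <assert.h>
#include <string.h>

/* Sketch of the network-type declarations, names in matching order. */
enum fann_nettype_enum
{
    FANN_NETTYPE_LAYER = 0, /* each layer connects only to the next layer */
    FANN_NETTYPE_SHORTCUT   /* each layer connects to all following layers */
};

static char const *const FANN_NETWORK_TYPE_NAMES[] =
{
    "FANN_NETTYPE_LAYER",
    "FANN_NETTYPE_SHORTCUT"
};
```

In a real program the index would typically come from the network itself, e.g. `FANN_NETWORK_TYPE_NAMES[fann_get_network_type(ann)]`.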
Sigmoid activation function.
Stepwise linear approximation to sigmoid.
Symmetric sigmoid activation function, aka. tanh.
Periodic sine activation function.
Stop criterion is the number of bits that fail.
Stop criterion is the Mean Square Error (MSE) value.
Constant array consisting of the names of the training stop functions, so that the name of a stop function can be looked up by its enum value.
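A minimal sketch of this lookup; the member order below (MSE first) is believed to match fann_data.h, but treat it as an assumption rather than a copy of the header.

```c
#include <assert.h>
#include <string.h>

/* Sketch of the stop-function declarations, names in matching order. */
enum fann_stopfunc_enum
{
    FANN_STOPFUNC_MSE = 0, /* stop on Mean Square Error value */
    FANN_STOPFUNC_BIT      /* stop on number of bits that fail */
};

static char const *const FANN_STOPFUNC_NAMES[] =
{
    "FANN_STOPFUNC_MSE",
    "FANN_STOPFUNC_BIT"
};
```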
Threshold activation function.
Standard backpropagation algorithm, where the weights are updated after calculating the mean square error for the whole training set.
Standard backpropagation algorithm, where the weights are updated after each training pattern.
Constant array consisting of the names of the training algorithms, so that the name of a training algorithm can be looked up by its enum value.
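A minimal sketch of this lookup; the member order below (incremental, batch, RPROP, quickprop) is believed to match fann_data.h, but treat it as an assumption rather than a copy of the header.

```c
#include <assert.h>
#include <string.h>

/* Sketch of the training-algorithm declarations, names in order. */
enum fann_train_enum
{
    FANN_TRAIN_INCREMENTAL = 0, /* update weights after each pattern */
    FANN_TRAIN_BATCH,           /* update weights after the whole set */
    FANN_TRAIN_RPROP,
    FANN_TRAIN_QUICKPROP
};

static char const *const FANN_TRAIN_NAMES[] =
{
    "FANN_TRAIN_INCREMENTAL",
    "FANN_TRAIN_BATCH",
    "FANN_TRAIN_RPROP",
    "FANN_TRAIN_QUICKPROP"
};
```

In a real program the index would typically come from the network itself, e.g. `FANN_TRAIN_NAMES[fann_get_training_algorithm(ann)]`.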
A more advanced batch training algorithm which achieves good results for many problems.