Sets the (mini-) batch size of your training. Default: 1 (online training)
If set to true, clears the network after every activation. This is useful for training LSTMs, especially for time series prediction.
Sets the fraction of the data that should be set aside for cross validation. If set to 0.4, 40% of the given data will be used for cross validation.
A dataset of input values and ideal output values to train the network with
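As an illustration, such a dataset is typically an array of objects pairing an input array with the ideal output array. The property names (input/output) and the shape below are assumptions based on common usage of libraries of this kind, not taken from this page; check the train() documentation for the exact format expected.

```ts
// Hypothetical sketch: a dataset for learning XOR.
// The `input`/`output` property names are assumed here.
const trainingSet = [
  { input: [0, 0], output: [0] },
  { input: [0, 1], output: [1] },
  { input: [1, 0], output: [1] },
  { input: [1, 1], output: [0] },
];
```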
Dropout rate: the likelihood that any given neuron will be ignored during network training. Must be between zero and one; values closer to one result in more neurons being ignored.
The target error to train for; once the network's error falls below this value, training stops. Lower target errors require more training cycles.
Sets the maximum number of training cycles the process will run, even if the target error has not been reached.
If set to n, outputs the training status every n iterations. Setting log to 1 will log the status every iteration.
The loss function (options.loss) used to determine the network error
Momentum. Adds a fraction of the previous weight update to the current one.
A learning rate policy, i.e. how to change the learning rate during training to improve network performance
Schedules a task to run every n iterations; the task itself is given by options.schedule.function (see the sketch below).
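For example, a schedule might pair an iteration interval with a callback. The object shape and callback signature below are assumptions made for illustration; consult the API reference for the exact fields passed to the callback.

```ts
// Hypothetical sketch: run a task every 500 iterations.
// The callback argument shape ({ error, iteration }) is assumed.
const schedule = {
  iterations: 500, // run the task every 500 iterations
  function: (data: { error: number; iteration: number }) => {
    console.log(`iteration ${data.iteration}, error ${data.error}`);
  },
};
```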
If set to true, shuffles the training data every iteration. A good option to use if the network performs worse on cross validation than on the training data itself.
Options used to train the network
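Taken together, a training call might pass an options object like the sketch below. The exact property keys and the train(dataset, options) signature are assumptions for illustration; only the option descriptions above come from this page.

```ts
// Hypothetical usage sketch combining the options described above.
// Property keys and the train() call are assumed, not confirmed by this page.
const options = {
  batchSize: 10,      // mini-batch of 10 samples per weight update
  clear: true,        // reset network state after every activation (useful for LSTMs)
  crossValidate: 0.4, // hold out 40% of the data for cross validation
  dropout: 0.2,       // 20% chance a neuron is ignored during training
  error: 0.03,        // stop once the error drops below 0.03
  iterations: 10000,  // hard cap on training cycles
  log: 100,           // print training status every 100 iterations
  momentum: 0.9,      // carry 90% of the previous weight update forward
  shuffle: true,      // reshuffle the training data every iteration
};

// network.train(trainingSet, options);
```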