This could be due to the following reasons:
Scenario 1: One algorithm group was selected, but the Number of Models to Train was set to 30.
Explanation: Decanter AI's models are drawn from 5 major groups of algorithms, and each group contains about 60 different algorithms. If the selected group offers fewer hyperparameter combinations than the Number of Models to Train, the number of models actually trained will fall short of the pre-set value, as illustrated in the sketch below.
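The following is a minimal, hypothetical sketch of why this happens; the parameter names and grid are illustrative only and do not reflect Decanter AI's internal search space. The number of trainable models is capped by the number of distinct hyperparameter combinations available.

```python
from itertools import product

# Hypothetical hyperparameter grid for a single algorithm group;
# Decanter AI's real search space is not exposed here.
param_grid = {
    "max_depth": [3, 5, 7],
    "learning_rate": [0.01, 0.1],
}

# Every distinct combination yields one candidate model.
combinations = list(product(*param_grid.values()))

requested_models = 30
# Only as many models can be trained as there are distinct combinations,
# so the effective count is the minimum of the two.
trained_models = min(requested_models, len(combinations))
print(trained_models)  # 6, not 30
```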
Scenario 2: Multiple algorithms were selected and the Number of Models to Train was set to 30.
Explanation: Decanter AI uses Error Tolerance to trigger early stopping and avoid overfitting. If the Number of Models to Train is set to a large value (e.g. more than 25) and the Error Tolerance is also set high, early stopping can halt training before the pre-set number of models is reached, so fewer models are produced. A sketch of this behavior follows.
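Below is a minimal sketch of tolerance-based early stopping, assuming a simple "stop when the improvement over the best model so far falls below the tolerance" rule; the function name, signature, and stopping criterion are assumptions for illustration and may differ from Decanter AI's actual implementation.

```python
def train_with_early_stop(validation_errors, max_models=30, error_tolerance=0.05):
    """Return how many models get trained before early stopping kicks in."""
    best_error = float("inf")
    trained = 0
    for error in validation_errors[:max_models]:
        trained += 1
        if best_error - error < error_tolerance:
            # Improvement over the best model so far is within the
            # tolerance, so training stops before reaching max_models.
            break
        best_error = error
    return trained

# Example: validation error plateaus quickly, so far fewer than 30 models
# are trained even though max_models is 30.
errors = [0.40, 0.30, 0.22, 0.20, 0.19, 0.18] + [0.18] * 24
print(train_with_early_stop(errors))  # 4
```

With a larger error tolerance, the improvement threshold is harder to clear, so training tends to stop after even fewer models.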