Hi again,
I'm reporting a new bug I found.
The Overtraining Threshold doesn't work properly. Instead of doing what it's supposed to do (stop training if no improvement is detected within the number of epochs you set), it simply stops training once it reaches the number you set.
For example, when you set the Overtraining Threshold to 100, training stops at the 100th epoch even if there was an improvement in epoch #99.
I tested this with 4 different model trainings, so I'm sure it stops when it reaches the Overtraining Threshold number.
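To make the expected behavior concrete, here's a minimal, hypothetical sketch of patience-based early stopping (function and variable names are illustrative, not the project's actual code). Training should stop only after `threshold` consecutive epochs without improvement, not at epoch number `threshold` itself:

```python
def should_stop(losses, threshold):
    """Return True once `threshold` consecutive epochs pass without a new best loss."""
    best = float("inf")
    epochs_since_improvement = 0
    for loss in losses:
        if loss < best:
            best = loss
            epochs_since_improvement = 0  # reset the counter on any improvement
        else:
            epochs_since_improvement += 1
        if epochs_since_improvement >= threshold:
            return True
    return False

# A run that keeps improving should never trigger the stop,
# even well past epoch `threshold`:
losses = [1.0 - 0.001 * i for i in range(200)]  # steady improvement
print(should_stop(losses, 100))  # improvement every epoch -> False
```

With the behavior described in this report, training would instead halt unconditionally at epoch 100, which matches what the counter-reset logic above is meant to prevent.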
Right now the feature is still in beta (that's why it isn't enabled by default). I've never run into your problem, but we're going to improve the system in upcoming updates.