
Overtraining Threshold doesn't work properly #381

Closed
UberStorm opened this issue Apr 4, 2024 · 1 comment
Labels
bug Something isn't working enhancement New feature or request

Comments


UberStorm commented Apr 4, 2024

Hi again,
I'm reporting to you about this new bug I found.
Overtraining Threshold doesn't work properly. Instead of doing what it's supposed to do (stop training if no improvement is detected for the number of epochs you set), it stops training as soon as it reaches that epoch number.
For example, when you set the Overtraining Threshold to 100, it stops training at the 100th epoch, even if there's an improvement at epoch #99.

  • I tested it with 4 different model trainings, so I'm sure it stops when it reaches the number of the Overtraining Threshold.
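
The expected behavior described above is a standard early-stopping "patience" counter: the counter resets on every improvement and training stops only after the threshold number of epochs pass with no improvement. A minimal sketch of that logic (a hypothetical illustration, not the project's actual code; the function name and the simulated loss values are assumptions):

```python
def train_with_patience(losses, threshold):
    """Stop only after `threshold` consecutive epochs without improvement.

    `losses` simulates a per-epoch validation loss; `threshold` plays the
    role of the Overtraining Threshold. Returns the epoch training stops at.
    """
    best_loss = float("inf")
    epochs_without_improvement = 0
    for epoch, loss in enumerate(losses, start=1):
        if loss < best_loss:
            best_loss = loss
            epochs_without_improvement = 0  # any improvement resets the counter
        else:
            epochs_without_improvement += 1
        if epochs_without_improvement >= threshold:
            return epoch  # stop: no improvement for `threshold` epochs
    return len(losses)

# Loss improves through epoch 99, then plateaus. With a threshold of 100,
# training should continue well past epoch 100 rather than stop there:
losses = [1.0 / e for e in range(1, 100)] + [0.02] * 150
print(train_with_patience(losses, 100))
```

With the reported buggy behavior, training would end at epoch 100 regardless; with the counter-reset logic above, the improvement at epoch 99 keeps training alive until the plateau has lasted 100 epochs.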
@blaise-tk
Member

Right now the feature is still in beta (that's why it isn't enabled by default). I've never had your problem, but we're going to improve the system in the next updates.

@blaise-tk blaise-tk added bug Something isn't working enhancement New feature or request labels Apr 30, 2024