🚀 Feature
Replace `*_percent_check` with `limit_*_batches` and redesign the `overfit_pct` Trainer arguments.
Motivation
Over the past few days I have been struggling to use these parameters in my tasks. For example, I want to run an overfitting test on a model for an image classification problem: take a few (say, 2) batches from the train dataset, train several epochs on them, and then assert 100% accuracy on those train batches.
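As a concrete illustration, here is a minimal, self-contained sketch of that overfit test; the toy model, fake data, and hyperparameters are hypothetical stand-ins and not part of this issue:

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset
import pytorch_lightning as pl

class LitClassifier(pl.LightningModule):
    """Toy classifier, a hypothetical stand-in for a real image model."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 4))

    def forward(self, x):
        return self.net(x)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return nn.functional.cross_entropy(self(x), y)

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-2)

# Two fixed batches (2 * 8 samples), never reshuffled between epochs.
x, y = torch.randn(16, 32), torch.randint(0, 4, (16,))
loader = DataLoader(TensorDataset(x, y), batch_size=8, shuffle=False)

model = LitClassifier()
trainer = pl.Trainer(max_epochs=200)
trainer.fit(model, loader)

# After enough epochs on the same two batches, the model should fit them exactly.
model.eval()
with torch.no_grad():
    preds = model(x).argmax(dim=1)
assert (preds == y).float().mean().item() == 1.0  # expect 100% train accuracy
```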
The problems I stumbled on are the following:
- `overfit_pct` documentation is misleading. Recently a clarification was made that it sets the `*_percent_check` parameters to a given value, but it still doesn't actually help to overfit a model, since you can't simply run `trainer.test()` or `trainer.run_evaluation()` without manipulating the model's dataloaders after running `trainer.fit(model)`.
- If the value of `val_percent_check` is too small, which can actually happen if you use `overfit_pct` with a small training dataset in mind, you silently skip the validation loop and run into an exception in `model.validation_epoch_end` when trying to accumulate the loss over batches (a small arithmetic sketch of this rounding follows the list). Handling the latter is reasonably on me, since I override this method, but it would be much nicer if such an unexpected loop skip were caught by PyTorch Lightning. You guys are great and I want to love your project even more!
- `train_percent_check` doesn't guarantee training on the same small part of the training dataset every epoch, because it is best practice to shuffle the training data every epoch. As a result, new batches are formed every epoch and thus no overfitting :(
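The silent skip in the second point comes down to truncating arithmetic. The snippet below is a hedged illustration of that rounding, not PyTorch Lightning's actual code:

```python
# If the fraction is small enough, the batch count truncates to zero and the
# validation loop runs no batches at all (illustrative arithmetic only).
num_val_batches = 100      # batches in the validation dataloader
val_percent_check = 0.005  # e.g. set indirectly via overfit_pct
batches_to_run = int(num_val_batches * val_percent_check)
print(batches_to_run)  # 0 -> validation_epoch_end receives an empty outputs
                       # list, so accumulating a loss over it raises
```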
Pitch
- I find redesigning these dataset-limiting parameters as absolute numbers of batches more straightforward and desirable than the current design. After all, I dare say that most researchers and developers think in terms of numbers of batches rather than percentages of a dataset.
- `overfit_pct` is either removed or actually redesigned to help test overfitting, i.e. by replacing the validation and test loaders with the train loader. Ensure that the training dataset isn't shuffled, so the same batches are trained on every epoch (a sketch of such an API follows below).
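For concreteness, here is a hedged sketch of what the pitched Trainer API could look like; `limit_train_batches`, `limit_val_batches`, and `limit_test_batches` follow the names in this issue's title, while `overfit_batches` is just one possible name for the redesigned overfit mode:

```python
import pytorch_lightning as pl

# Absolute batch counts instead of dataset fractions (names taken from the
# issue title; the final API may differ).
trainer = pl.Trainer(
    limit_train_batches=2,  # train on exactly 2 batches per epoch
    limit_val_batches=2,    # run exactly 2 validation batches
    limit_test_batches=2,   # run exactly 2 test batches
)

# Redesigned overfit mode (`overfit_batches` is a hypothetical name here):
# the val/test loaders are replaced with the same 2 train batches, and
# shuffling is disabled so identical batches are seen every epoch.
trainer = pl.Trainer(overfit_batches=2)
```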
Additional notes
I realize that you cannot prohibit shuffling when the plain `*_percent_check` parameters are used. There can be experiments where you would like to see how your model performs when training on only a portion of the data. Therefore, such a prohibition is valid only for the overfit mode.