Conversation

@jakubgalik-digica

What does this PR do?

Fixes #<issue_number>

Does your PR introduce any breaking changes? If yes, please list them.

Before submitting

  • Was this discussed/approved via a GitHub issue? (not for typos and docs)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure your PR does only one thing, instead of bundling different changes together?
  • Did you make sure to update the documentation with your changes? (if necessary)
  • Did you write any new necessary tests? (not for typos and docs)
  • Did you verify new and existing tests pass locally with your changes?
  • Did you list all the breaking changes introduced by this pull request?
  • Did you update the CHANGELOG? (not for typos, docs, test updates, or minor internal changes/refactors)

PR review

Anyone in the community is welcome to review the PR.
Before you start reviewing, make sure you have read the review guidelines. In short, see the following bullet-list:

  • Is this pull request ready for review? (if not, please submit in draft mode)
  • Check that all items from Before submitting are resolved
  • Make sure the title is self-explanatory and the description concisely explains the PR
  • Add labels and milestones (and optionally projects) to the PR so it can be classified

Did you have fun?

Make sure you had fun coding 🙃

@github-actions bot added the pl label (Generic label for PyTorch Lightning package) on Aug 8, 2022
@rohitgr7 (Contributor) commented on Aug 8, 2022

I don't think it's a bug since LightningDataModule doesn't have batch_size as an argument.

The diff under discussion:

      return dataloader(predict_dataset)

    - datamodule = cls()
    + datamodule = cls(batch_size=batch_size)
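For context, a minimal sketch of the failure mode being discussed (hypothetical user code written by this editor, not taken from the PR): a LightningDataModule subclass whose __init__ takes no batch_size breaks if the classmethod forwards batch_size unconditionally.

    from pytorch_lightning import LightningDataModule

    class MyDataModule(LightningDataModule):
        # Like the base class, this subclass accepts no batch_size argument.
        def __init__(self):
            super().__init__()

    try:
        # What the proposed change effectively does for every subclass:
        dm = MyDataModule(batch_size=8)
    except TypeError as err:
        print(err)  # __init__() got an unexpected keyword argument 'batch_size'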
Contributor:

We can't assume the datamodule takes a batch_size argument in general. In which library did you find this problem?

Author:

What do you mean, which library? The PyTorch Lightning library.

Author:

I tried to use the SemanticSegmentationData.from_datasets method

Author (@jakubgalik-digica, Aug 8, 2022):

and I get the following (sorry for a picture of the error instead of text):
[screenshot: error traceback]

Contributor:

Good catch. In that case we need to add another **kwargs parameter and check whether the existing parameters are part of it or not.

Contributor:

From skimming the flash.DataModule source code, it doesn't look like this method is supposed to be compatible with Flash, as the Flash DataModule already manages creating the DataLoaders itself.

https://github.com/Lightning-AI/lightning-flash/blob/644f2b559f87b40a5b623f5260eee0cf924ea0a5/flash/core/data/new_data_module.py

Thoughts @krshrimali?

Contributor (@awaelchli, Aug 12, 2022):

Flash has an extra parameter **datamodule_kwargs in its special from_* methods. Unfortunately this is not a universal solution, because a parameter like batch_size, which is also a parameter of the from_datasets method itself, can't be passed in "twice" through the kwargs. There are a few possible solutions:

  1. Use a proper dict datamodule_kwargs as the parameter instead of kwargs. But this would be inconsistent with how Flash handles it.
  2. Keep datamodule_kwargs as proper kwargs, but do a signature inspection of the __init__ to know whether parameters like batch_size need to be passed in (see the sketch after this list).
  3. Override the from_datasets method in the Flash DataModule and raise an error saying that this method is not supported there.
  4. Something else?
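A minimal sketch of what option 2 could look like (this editor's illustration using Python's standard inspect module; instantiate_datamodule is a hypothetical helper name, not the code that landed in #14185):

    import inspect

    def instantiate_datamodule(cls, batch_size, **datamodule_kwargs):
        # Inspect the subclass' __init__ instead of assuming that every
        # LightningDataModule accepts a batch_size argument.
        params = inspect.signature(cls.__init__).parameters
        accepts_var_kwargs = any(
            p.kind is inspect.Parameter.VAR_KEYWORD for p in params.values()
        )
        if "batch_size" in params or accepts_var_kwargs:
            return cls(batch_size=batch_size, **datamodule_kwargs)
        # The base LightningDataModule.__init__ takes no arguments,
        # so fall back to not forwarding batch_size.
        return cls(**datamodule_kwargs)

This would keep cls(batch_size=batch_size) safe for subclasses that do accept batch_size, while leaving plain subclasses untouched.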

Contributor:

I'd say option 2.

Contributor:

Implemented in #14185

@rohitgr7 (Contributor):

Closing in favor of #14185.

@rohitgr7 closed this on Aug 13, 2022

Labels: pl (Generic label for PyTorch Lightning package)
Projects: none (Status: Done)
5 participants