check for unet_lora_layers in sdxl pipeline's save_lora_weights method #4821
The StableDiffusionXLPipeline's `save_lora_weights` method accepts `unet_lora_layers` as a keyword argument, but does not check whether it is non-None before attempting to include it in the state dict to be saved.
This PR adds a check to ensure that one of unet_lora_layers or text_encoder_lora_layers or text_encoder_2_lora_layers is passed, and raises a ValueError if this is not the case.
I believe this is the right approach, as `unet_lora_layers` should not be strictly required: LoRAs could theoretically target one or both text encoders only, so `unet_lora_layers` can be None as long as at least one of the other LoRA layer arguments is passed.
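The check described above can be sketched as follows. This is a minimal, self-contained illustration with a simplified signature, not the actual diffusers implementation; the real method serializes the merged state dict to disk rather than returning it:

```python
# Hypothetical simplified version of save_lora_weights, illustrating the
# proposed validation: raise if no LoRA layers are passed at all, and only
# include the layer dicts that are actually provided.
def save_lora_weights(
    save_directory,
    unet_lora_layers=None,
    text_encoder_lora_layers=None,
    text_encoder_2_lora_layers=None,
):
    if not (unet_lora_layers or text_encoder_lora_layers or text_encoder_2_lora_layers):
        raise ValueError(
            "You must pass at least one of `unet_lora_layers`, "
            "`text_encoder_lora_layers`, or `text_encoder_2_lora_layers`."
        )

    state_dict = {}
    if unet_lora_layers is not None:
        state_dict.update({f"unet.{k}": v for k, v in unet_lora_layers.items()})
    if text_encoder_lora_layers is not None:
        state_dict.update(
            {f"text_encoder.{k}": v for k, v in text_encoder_lora_layers.items()}
        )
    if text_encoder_2_lora_layers is not None:
        state_dict.update(
            {f"text_encoder_2.{k}": v for k, v in text_encoder_2_lora_layers.items()}
        )
    # The real method would write state_dict to save_directory here.
    return state_dict
```

With this check, calling the method with only text-encoder layers succeeds, while calling it with no layers at all fails fast with a clear error instead of silently saving an empty state dict.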
This is a simple PR that should be mergeable with no problems, tagging @patrickvonplaten / @sayakpaul for review :)