
Conversation

@ErwannMillon
Contributor

The StableDiffusionXLPipeline's save_lora_weights method accepts unet_lora_layers as a keyword argument, but does not check that it is non-null before including it in the state dict to be saved:

Current implementation:

    def save_lora_weights(
        self,
        save_directory: Union[str, os.PathLike],
        unet_lora_layers: Dict[str, Union[torch.nn.Module, torch.Tensor]] = None,
        text_encoder_lora_layers: Dict[str, Union[torch.nn.Module, torch.Tensor]] = None,
        text_encoder_2_lora_layers: Dict[str, Union[torch.nn.Module, torch.Tensor]] = None,
        is_main_process: bool = True,
        weight_name: str = None,
        save_function: Callable = None,
        safe_serialization: bool = True,
    ):
        state_dict = {}

        def pack_weights(layers, prefix):
            layers_weights = layers.state_dict() if isinstance(layers, torch.nn.Module) else layers
            layers_state_dict = {f"{prefix}.{module_name}": param for module_name, param in layers_weights.items()}
            return layers_state_dict

        # Raises AttributeError if unet_lora_layers is None, since pack_weights
        # calls .items() on it unconditionally.
        state_dict.update(pack_weights(unet_lora_layers, "unet"))

This PR adds a check to ensure that at least one of unet_lora_layers, text_encoder_lora_layers, or text_encoder_2_lora_layers is passed, and raises a ValueError otherwise.

I believe this is the right approach: unet_lora_layers should not be strictly required, since LoRAs could theoretically target only one or both text encoders. So unet_lora_layers may be None as long as one of the other LoRA layer arguments is passed.
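The guard described above can be sketched as follows. This is a minimal illustration assuming the method's keyword-argument names; the function name validate_lora_layers is hypothetical and the actual merged diff may differ:

```python
def validate_lora_layers(
    unet_lora_layers=None,
    text_encoder_lora_layers=None,
    text_encoder_2_lora_layers=None,
):
    # At least one set of LoRA layers must be provided; otherwise
    # save_lora_weights would have nothing to serialize.
    if not (unet_lora_layers or text_encoder_lora_layers or text_encoder_2_lora_layers):
        raise ValueError(
            "You must pass at least one of `unet_lora_layers`, "
            "`text_encoder_lora_layers`, or `text_encoder_2_lora_layers`."
        )
```

With the check in place, pack_weights is only called on the arguments that were actually supplied, so a text-encoder-only LoRA can be saved without a UNet state dict.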

This is a simple PR that should merge cleanly. Tagging @patrickvonplaten / @sayakpaul for review :)

Member

@sayakpaul sayakpaul left a comment


Nice, looks good! Thank you!

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint.

Contributor

@patrickvonplaten patrickvonplaten left a comment


Cool!

@patrickvonplaten
Contributor

Before merging can you run:

make style

to fix the linting problems? :-)

@patrickvonplaten
Contributor

Could you now run make fix-copies && make style? It seems there are still inconsistencies.

@patrickvonplaten patrickvonplaten merged commit 2fa4b3f into huggingface:main Sep 4, 2023
yoonseokjin pushed a commit to yoonseokjin/diffusers that referenced this pull request Dec 25, 2023
AmericanPresidentJimmyCarter pushed a commit to AmericanPresidentJimmyCarter/diffusers that referenced this pull request Apr 26, 2024