
Conversation

@Cyrilvallez
Member

What does this PR do?

sys.modules can end up being quite large, especially when using pytest. This avoids looping over all of it while keeping the benefits.

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@ydshieh ydshieh requested a review from Copilot November 18, 2025 17:15
Copilot finished reviewing on behalf of ydshieh November 18, 2025 17:17

Copilot AI left a comment


Pull Request Overview

This PR optimizes the guard_torch_init_functions decorator by replacing an expensive loop over all modules in sys.modules (which can be very large in pytest environments) with a targeted approach that only patches a specific list of known torch modules.

  • Introduces TORCH_MODULES_TO_PATCH constant containing specific torch module paths to patch
  • Refactors guard_torch_init_functions() to iterate over the predefined module list instead of all sys.modules (see the sketch after this list)
  • Renames variables in the patching logic for clarity (name → func_name, module_name)
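
For illustration, here is a minimal sketch of the before/after approach described above. The names are assumptions for the example only: the contents of TORCH_MODULES_TO_PATCH, the patched_funcs mapping, and the _before/_after suffixes are made up and do not reflect the actual transformers implementation.

```python
import sys

# Before (sketch): scan every loaded module; sys.modules can be very large,
# especially under pytest, so this loop gets expensive.
def guard_torch_init_functions_before(patched_funcs):
    for module in list(sys.modules.values()):
        for func_name, func in patched_funcs.items():
            if module is not None and hasattr(module, func_name):
                setattr(module, func_name, func)

# After (sketch): only patch a fixed list of known torch modules.
TORCH_MODULES_TO_PATCH = (  # illustrative contents only
    "torch.nn.init",
    "torch.nn.modules.activation",
)

def guard_torch_init_functions_after(patched_funcs):
    for module_name in TORCH_MODULES_TO_PATCH:
        module = sys.modules.get(module_name)
        if module is None:  # module not imported yet, nothing to patch
            continue
        for func_name, func in patched_funcs.items():
            if hasattr(module, func_name):
                setattr(module, func_name, func)
```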

Reviewed Changes

Copilot reviewed 2 out of 2 changed files in this pull request and generated 2 comments.

File | Description
src/transformers/initialization.py | Adds TORCH_MODULES_TO_PATCH constant and refactors guard_torch_init_functions() to use targeted module patching instead of looping over all sys.modules
setup.py | Contains an accidental debug line that should be removed

# Here, we need to check several modules imported, and hot patch all of them, as sometimes torch does
# something like `from torch.nn.init import xavier_uniform_` in their internals (e.g. in torch.nn.modules.activation,
# where MultiheadAttention lives), so the function name is bound at import time and just doing
# `setattr(torch.nn.init, name, gloabls()[name])` is thus not enough

Copilot AI Nov 18, 2025


Typo in comment: 'gloabls' should be 'globals'.

Suggested change
# `setattr(torch.nn.init, name, gloabls()[name])` is thus not enough
# `setattr(torch.nn.init, name, globals()[name])` is thus not enough

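To make the import-time binding issue described in the code comment above concrete, here is a small illustrative snippet. The noop_ replacement is hypothetical; torch.nn.modules.activation is the example module the comment points at (it does `from torch.nn.init import xavier_uniform_` for MultiheadAttention).

```python
import torch.nn.init
import torch.nn.modules.activation as activation

def noop_(tensor):  # hypothetical replacement init, for demonstration only
    return tensor

# Rebinding the name on torch.nn.init...
setattr(torch.nn.init, "xavier_uniform_", noop_)

# ...does not affect modules that already did `from torch.nn.init import xavier_uniform_`:
# their local name still points at the original function, so it must be patched separately.
print(activation.xavier_uniform_ is torch.nn.init.xavier_uniform_)  # False
```
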
setup.py Outdated
"tqdm",
)

a = 1

Copilot AI Nov 18, 2025


This line `a = 1` appears to be accidentally added debug code and should be removed. It serves no purpose in the setup file.

Suggested change
a = 1

@Cyrilvallez Cyrilvallez merged commit a5c903f into main Nov 18, 2025
12 of 15 checks passed
@Cyrilvallez Cyrilvallez deleted the module-loop branch November 18, 2025 17:35