Commit b60b5ce

Sharon Tan authored and facebook-github-bot committed
Address likely package mismatch issues (#1634)
Summary:
Previous CI runs were encountering issues:

1. https://github.com/.../runs/16867033035/job/47776474001

   ```
   E torch.AcceleratorError: CUDA error: no kernel image is available for execution on the device
   E CUDA kernel errors might be asynchronously reported at some other API call, so the stacktrace below might be incorrect.
   E For debugging consider passing CUDA_LAUNCH_BLOCKING=1
   E Compile with `TORCH_USE_CUDA_DSA` to enable device-side assertions.
   ```

2. https://github.com/pytorch/captum/actions/runs/16867033037/job/47775775171

   `tests/attr/test_llm_attr_hf_compatibility.py:76: error: Argument 1 to "__call__" of "_Wrapped" has incompatible type "str"; expected "PreTrainedModel" [arg-type]`

This is likely due to (1) mismatched PyTorch versions vs. the CUDA version 12.1 specified in test-pip-gpu.yml, triggered by a [new PyTorch release](https://github.com/pytorch/pytorch/releases), and (2) a lack of full typing causing the error thrown by `mypy`, triggered by the loosely defined `mypy>=0.760` probably pulling in a [new MyPy release](https://pypi.org/project/mypy/#history).

Differential Revision: D80120024
1 parent 3ec6da4 commit b60b5ce
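
For context on the first failure: "no kernel image is available for execution on the device" usually means the installed torch wheel was not compiled for the runner's GPU architecture. A minimal diagnostic sketch (not part of this commit; it uses only standard `torch` introspection calls) that surfaces such a mismatch:

```python
# Diagnostic sketch: compare the GPU's compute capability against the kernel
# architectures compiled into the installed torch wheel. A device arch missing
# from the wheel's arch list typically produces
# "no kernel image is available for execution on the device".
import torch

print("torch:", torch.__version__)            # e.g. "2.5.1+cu121"
print("built for CUDA:", torch.version.cuda)  # None for CPU-only wheels

if torch.cuda.is_available():
    major, minor = torch.cuda.get_device_capability(0)
    device_arch = f"sm_{major}{minor}"
    compiled_archs = torch.cuda.get_arch_list()  # archs baked into this wheel
    print("device arch:", device_arch, "| wheel archs:", compiled_archs)
    if device_arch not in compiled_archs:
        print("Mismatch: reinstall torch from the index URL matching the CUDA runtime.")
```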

File tree

2 files changed (+5, -3 lines):

- .github/workflows/test-pip-gpu.yml
- tests/attr/test_llm_attr_hf_compatibility.py

.github/workflows/test-pip-gpu.yml

Lines changed: 3 additions & 1 deletion
```diff
@@ -22,7 +22,9 @@ jobs:
       gpu-arch-version: ${{ matrix.cuda_arch_version }}
       script: |
         python3 -m pip install --upgrade pip --progress-bar off
-        python3 -m pip install -e .[dev] --progress-bar off
+        python3 -m pip install torch==2.5.1 torchvision==0.20.1 --index-url https://download.pytorch.org/whl/cu121
+        python3 -m pip install -e .[dev] --index-url https://download.pytorch.org/whl/cu121 --progress-bar off
+        python3 -m pip install transformers --progress-bar off
 
         # Build package
         python3 -m pip install build --progress-bar off
```
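
As a quick sanity check (illustrative only, not part of the workflow), a step could confirm that the pinned wheels are actually cu121 builds before the test suite runs:

```python
# Illustrative check: the pinned torch wheel should report the CUDA 12.1
# toolchain requested via the cu121 index URL.
import torch

assert torch.__version__.startswith("2.5.1"), torch.__version__  # e.g. "2.5.1+cu121"
assert torch.version.cuda == "12.1", torch.version.cuda
```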

tests/attr/test_llm_attr_hf_compatibility.py

Lines changed: 2 additions & 2 deletions
```diff
@@ -20,7 +20,7 @@
 HAS_HF = True
 try:
     # pyre-ignore[21]: Could not find a module corresponding to import `transformers`
-    from transformers import AutoModelForCausalLM, AutoTokenizer
+    from transformers import AutoModelForCausalLM, AutoTokenizer, PreTrainedModel
 except ImportError:
     HAS_HF = False
 
@@ -69,7 +69,7 @@ def test_llm_attr_hf_compatibility(
         tokenizer = AutoTokenizer.from_pretrained(
             "hf-internal-testing/tiny-random-LlamaForCausalLM"
         )
-        llm = AutoModelForCausalLM.from_pretrained(
+        llm: PreTrainedModel = AutoModelForCausalLM.from_pretrained(
             "hf-internal-testing/tiny-random-LlamaForCausalLM"
         )
 
```
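
The annotation addresses the mypy failure by stating the loaded model's type explicitly, so typed call sites see a `PreTrainedModel` instead of a type mypy cannot infer from `AutoModelForCausalLM.from_pretrained`. A minimal sketch of the same pattern (the `load_tiny_llama` helper is hypothetical, for illustration only):

```python
# Sketch of the typing pattern used in the test: annotate the loaded model as
# PreTrainedModel so downstream code that expects a PreTrainedModel passes mypy.
from transformers import AutoModelForCausalLM, PreTrainedModel


def load_tiny_llama() -> PreTrainedModel:  # hypothetical helper mirroring the test setup
    model: PreTrainedModel = AutoModelForCausalLM.from_pretrained(
        "hf-internal-testing/tiny-random-LlamaForCausalLM"
    )
    return model
```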
