
Commit 124758e

Update max context length LlmConfig validation (#11920)
1 parent: 23e8ff5

1 file changed (+2, -2 lines)


examples/models/llama/config/llm_config.py

Lines changed: 2 additions & 2 deletions
@@ -227,9 +227,9 @@ class ExportConfig:
     export_only: bool = False

     def __post_init__(self):
-        if self.max_context_length > self.max_seq_length:
+        if self.max_context_length < self.max_seq_length:
             raise ValueError(
-                f"max_context_length of {self.max_context_length} cannot be greater than max_seq_length of {self.max_seq_length}"
+                f"max_context_length of {self.max_context_length} cannot be shorter than max_seq_length of {self.max_seq_length}"
             )
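For context, the corrected check requires max_context_length to be at least max_seq_length, i.e. the context window must be able to hold the full sequence; the old comparison was inverted and rejected larger context windows instead. Below is a minimal, illustrative sketch of that behavior, not the actual LlmConfig from examples/models/llama/config/llm_config.py: the field defaults and the stripped-down class body are assumptions, only the validation mirrors the diff above.

# Minimal sketch (assumption): a stripped-down ExportConfig that mirrors
# only the validation changed in this commit; the real class has more fields.
from dataclasses import dataclass


@dataclass
class ExportConfig:
    max_seq_length: int = 128       # illustrative default
    max_context_length: int = 128   # illustrative default
    export_only: bool = False

    def __post_init__(self):
        # After this commit, a context window *smaller* than the maximum
        # sequence length is rejected (previously the comparison was inverted).
        if self.max_context_length < self.max_seq_length:
            raise ValueError(
                f"max_context_length of {self.max_context_length} cannot be "
                f"shorter than max_seq_length of {self.max_seq_length}"
            )


# Usage sketch: a context window longer than the sequence length is accepted,
# while a shorter one raises ValueError.
ExportConfig(max_seq_length=512, max_context_length=1024)   # ok
# ExportConfig(max_seq_length=512, max_context_length=256)  # raises ValueError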