
Commit 08cacb2

[3.13] gh-140576: Fixed crash produced by lexer in case of dedented zero byte (GH-140583) (#140762)
1 parent 62a3b6b commit 08cacb2

File tree: 3 files changed (+6, -0 lines changed)


Lib/test/test_tokenize.py

Lines changed: 1 addition & 0 deletions

```diff
@@ -3097,6 +3097,7 @@ def get_tokens(string):
                 f'__{
             x:d
             }__'""",
+            " a\n\x00",
         ]:
             with self.subTest(case=case):
                 self.assertRaises(tokenize.TokenError, get_tokens, case)
```
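The hunk above adds `" a\n\x00"` (an indented line followed by a NUL byte, i.e. the "dedented zero byte" from the commit title) to the list of inputs expected to raise `tokenize.TokenError`. A minimal standalone sketch of the `get_tokens` helper it relies on (the helper's body is not shown in the hunk, so its exact shape here is an assumption):

```python
# Sketch of a get_tokens helper like the one referenced in the test diff
# (assumed shape: tokenize a source string via its readline callable).
# On fixed interpreters the new case " a\n\x00" raises tokenize.TokenError
# instead of crashing; the demonstration below uses well-formed input only.
import io
import tokenize

def get_tokens(string):
    # Feed the string's readline to generate_tokens and collect all tokens.
    return list(tokenize.generate_tokens(io.StringIO(string).readline))

tokens = get_tokens("a = 1\n")
print([tokenize.tok_name[t.type] for t in tokens])
# → ['NAME', 'OP', 'NUMBER', 'NEWLINE', 'ENDMARKER']
```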
Lines changed: 2 additions & 0 deletions (new file; filename not captured)

```diff
@@ -0,0 +1,2 @@
+Fixed crash in :func:`tokenize.generate_tokens` in case of
+specific incorrect input. Patch by Mikhail Efimov.
```

Parser/lexer/lexer.c

Lines changed: 3 additions & 0 deletions

```diff
@@ -481,6 +481,9 @@ tok_get_normal_mode(struct tok_state *tok, tokenizer_mode* current_tok, struct t
             return MAKE_TOKEN(ERRORTOKEN);
         }
     }
+    else if (c == EOF && PyErr_Occurred()) {
+        return MAKE_TOKEN(ERRORTOKEN);
+    }
     else {
         break;
     }
```

0 commit comments
