18 changes: 17 additions & 1 deletion include/pybind11/subinterpreter.h
@@ -77,6 +77,7 @@ class subinterpreter {
/// @note This function acquires (and then releases) the main interpreter GIL, but the main
/// interpreter and its GIL are not required to be held prior to calling this function.
static inline subinterpreter create(PyInterpreterConfig const &cfg) {

error_scope err_scope;
subinterpreter result;
{
@@ -85,7 +86,21 @@

auto prev_tstate = PyThreadState_Get();

- auto status = Py_NewInterpreterFromConfig(&result.creation_tstate_, &cfg);
+ PyStatus status;
+
+ {
+ /*
+ Several internal CPython modules are lacking proper subinterpreter support in 3.12
+ even though it is "stable" in that version. This most commonly seems to cause
+ crashes when two interpreters concurrently initialize, which imports several things
+ (like builtins, unicode, codecs).
+ */
+ #if PY_VERSION_HEX < 0x030D0000 && defined(Py_MOD_PER_INTERPRETER_GIL_SUPPORTED)
+ static std::mutex one_at_a_time;
+ std::lock_guard<std::mutex> guard(one_at_a_time);
Collaborator:

Could this make us vulnerable to double-locking deadlocks?

Lock 1 = GIL
Lock 2 = this C++ mutex

If Py_NewInterpreterFromConfig releases (and re-acquires) the GIL, we could have a deadlock.

The usual trick is to release the GIL just before acquiring the C++ mutex, then re-acquire the GIL before calling back into the Python C API (Py_NewInterpreterFromConfig in this case).
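
A minimal sketch of that ordering, just to make the proposal concrete (hypothetical helper, not code from this PR; assumes the calling thread holds the main GIL on entry):

```cpp
#include <Python.h>
#include <mutex>

// Hypothetical sketch only: release the GIL, then take the C++ mutex,
// then re-acquire the GIL before touching the Python C API again.
static std::mutex creation_mutex;

PyStatus guarded_new_interpreter(PyThreadState **tstate_out, const PyInterpreterConfig *cfg) {
    PyThreadState *saved = PyEval_SaveThread();          // drop the main GIL first
    std::lock_guard<std::mutex> guard(creation_mutex);   // then take the C++ lock
    PyEval_RestoreThread(saved);                         // re-take the GIL before the C API call
    return Py_NewInterpreterFromConfig(tstate_out, cfg); // mutex released when guard goes out of scope
}
```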

Contributor Author:

Py_NewInterpreterFromConfig requires that the main interpreter GIL be held when being called. Internally it releases the main GIL and acquires the new sub-interpreter's GIL and then performs initialization.

The problem appears to be that the initialization isn't thread-safe, so with independent GILs it sometimes causes crashes.

I believe it's safe to have a mutex here, because it is only ever acquired in this one place, and the main GIL is acquired first, so the mutex can only ever be locked while the main GIL is already held. Inside Py_NewInterpreterFromConfig the main GIL is released while this mutex is still held, which allows another thread to reach the lock here and then wait. Since the function doesn't reacquire the main GIL, the first thread can continue on, return, and release the mutex; it does eventually have to reacquire the main GIL, but only after this mutex has been released (sketched in the toy model below).

Since this crosses the boundary of two different GILs, I can't think of any other way to prevent the problem except with another lock. The only other option would be for us to leave it broken and just document that this is a bit unstable on 3.12 (and also fix the unit test to not exercise this behavior).
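
As a toy model of that lock ordering (plain std::mutex stand-ins for both the main GIL and the creation mutex, no real CPython calls), the following never deadlocks because the creation mutex is always released before the "GIL" is re-acquired:

```cpp
#include <mutex>
#include <thread>

std::mutex gil;            // stand-in for the main interpreter GIL
std::mutex one_at_a_time;  // stand-in for the creation mutex

void create_subinterpreter_model() {
    gil.lock();                                           // caller holds the main GIL
    {
        std::lock_guard<std::mutex> guard(one_at_a_time); // serialize interpreter creation
        gil.unlock();   // Py_NewInterpreterFromConfig drops the main GIL internally
        // ... interpreter initialization happens here, serialized by the mutex ...
    }                   // mutex released without the main GIL held
    gil.lock();         // the main GIL is only re-acquired after the mutex is gone
    gil.unlock();
}

int main() {
    std::thread a(create_subinterpreter_model);
    std::thread b(create_subinterpreter_model);
    a.join();
    b.join();
}
```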

Collaborator:

> I can't think of any other way to prevent the problem except with another lock. The only other option would be for us to leave it broken

The frequent CI failures are pretty distracting. Thanks for the explanation. Let's merge this and see if we actually get any deadlocks.

+ #endif
+ status = Py_NewInterpreterFromConfig(&result.creation_tstate_, &cfg);
+ }

// this doesn't raise a normal Python exception, it provides an exit() status code.
if (PyStatus_Exception(status)) {
@@ -117,6 +132,7 @@
// same as the default config in the python docs
PyInterpreterConfig cfg;
std::memset(&cfg, 0, sizeof(cfg));
+ cfg.allow_threads = 1;
cfg.check_multi_interp_extensions = 1;
cfg.gil = PyInterpreterConfig_OWN_GIL;
return create(cfg);
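
For reference, a caller-side sketch of how the default configuration above might be used (hypothetical usage; only subinterpreter::create and the config fields are taken from the diff, the rest is an assumption):

```cpp
#include <pybind11/embed.h>
#include <pybind11/subinterpreter.h>
namespace py = pybind11;

int main() {
    py::scoped_interpreter main_interp;  // main interpreter and its GIL

    // Zero-initialize, then set the same fields as the default-config overload above.
    PyInterpreterConfig cfg{};
    cfg.allow_threads = 1;
    cfg.check_multi_interp_extensions = 1;
    cfg.gil = PyInterpreterConfig_OWN_GIL;

    py::subinterpreter sub = py::subinterpreter::create(cfg);
    // ... activate `sub` and run Python code in it here ...
    return 0;
}
```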