Applications embedding Python as a scripting engine (think Blender, Maya, etc.)
Applications that provide alternatives to the standard CPython CLI (think the isolated-by-default system-python idea in PEP 432, alternative REPLs, the bootstrapping no-import-system binary used to freeze importlib, etc.)
Apps in the second category currently have a hard time correctly emulating CPython’s argument and environment handling, since a large chunk of it is hidden inside Py_Main. Even if you call Py_Initialize and then Py_Main, it isn’t quite the same thing, since not everything gets reconfigured after Py_Main has had a chance to look at the command line arguments.
So if we expose our command line processing machinery directly, then embedding applications can use that part of the system like a support library, rather than having to try to emulate it themselves.
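To make the pattern concrete, here is a minimal sketch of the embedding approach described above, where the application initializes the interpreter itself and then hands the command line to Py_Main(). The wchar_t conversion via Py_DecodeLocale() and the trimmed error handling are my own illustrative choices, not something prescribed by CPython:

```c
/* Sketch of the pattern discussed above: the embedder initializes the
 * interpreter itself, then hands the command line to Py_Main().
 * Py_DecodeLocale() builds the wchar_t argv that Py_Main() expects;
 * error handling is trimmed for brevity. */
#include <Python.h>

int
main(int argc, char **argv)
{
    Py_Initialize();   /* the interpreter is fully configured here */

    wchar_t **wargv = PyMem_RawMalloc(sizeof(wchar_t *) * argc);
    for (int i = 0; i < argc; i++) {
        wargv[i] = Py_DecodeLocale(argv[i], NULL);
    }

    /* Py_Main() parses the command line and environment, but because
     * Py_Initialize() already ran, much of that configuration is
     * effectively ignored (see the replies below). */
    int exitcode = Py_Main(argc, wargv);

    for (int i = 0; i < argc; i++) {
        PyMem_RawFree(wargv[i]);
    }
    PyMem_RawFree(wargv);
    return exitcode;
}
```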
Yeah, right now, almost all the new “configuration” read by Py_Main() is ignored and will not be applied. Python keeps the “old” configuration read by Py_Initialize().
The problem is that many Python objects are kept alive between Py_Initialize() and Py_Main(). For this reason, it's not possible to change the memory allocator in that case.
I would suggest deprecating the pattern of calling Py_Initialize() before Py_Main().
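To illustrate the allocator point: installing a custom allocator only works reliably before Py_Initialize(), because once the interpreter has allocated objects with one allocator, freeing them through a different one is unsafe. The sketch below uses the real PyMem_SetAllocator() API, but the wrapper functions and the single PYMEM_DOMAIN_RAW domain are just illustrative assumptions:

```c
/* Sketch: installing a custom raw allocator.  This has to happen
 * before Py_Initialize()/Py_Main(); afterwards, objects created by
 * the old allocator would be freed by the wrong free(). */
#include <Python.h>
#include <stdlib.h>

static void *my_malloc(void *ctx, size_t size) { return malloc(size ? size : 1); }
static void *my_calloc(void *ctx, size_t n, size_t size) { return calloc(n ? n : 1, size ? size : 1); }
static void *my_realloc(void *ctx, void *ptr, size_t size) { return realloc(ptr, size ? size : 1); }
static void my_free(void *ctx, void *ptr) { free(ptr); }

int
main(void)
{
    PyMemAllocatorEx alloc = {NULL, my_malloc, my_calloc, my_realloc, my_free};

    /* Must be called before the interpreter allocates anything. */
    PyMem_SetAllocator(PYMEM_DOMAIN_RAW, &alloc);

    Py_Initialize();
    PyRun_SimpleString("print('hello from embedded Python')");
    Py_Finalize();
    return 0;
}
```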
Oh wow, Discourse wouldn't let me post a "4th reply in a row", even though I was replying to 4 different messages. It forced me to edit a previous message… So here is my 4th message:
Most or even all of the issues described in this discussion are solved by my PEP 587: https://www.python.org/dev/peps/pep-0587/ PyConfig_SetBytesString() and PyConfig_SetBytesArgv() decode byte strings for you, and the PEP adds a new "preinitialization" phase with a dedicated PyPreConfig structure to configure the LC_CTYPE locale. The PEP also allows parsing argv as command line arguments. It covers the two use cases described by Nick Coghlan with two separate default configurations: the "Python Configuration" and the "Isolated Configuration".
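For reference, here is a minimal sketch of that flow using the API names from PEP 587 (as they later landed in CPython 3.8): preinitialize, then build a PyConfig from the raw bytes argv and initialize from it. The utf8_mode tweak and the choice of the "Python Configuration" defaults are just example settings:

```c
/* Sketch of the PEP 587 flow: preinitialize (locale, UTF-8 mode,
 * allocator), then build a PyConfig from bytes argv and initialize
 * the interpreter from it. */
#include <Python.h>

int
main(int argc, char **argv)
{
    PyStatus status;

    /* Preinitialization: settings that must be decided before any
     * string decoding or memory allocation (LC_CTYPE locale, UTF-8
     * mode, memory allocator). */
    PyPreConfig preconfig;
    PyPreConfig_InitPythonConfig(&preconfig);  /* "Python Configuration" defaults */
    preconfig.utf8_mode = 1;                   /* example tweak, optional */

    status = Py_PreInitialize(&preconfig);
    if (PyStatus_Exception(status)) {
        Py_ExitStatusException(status);
    }

    /* Main configuration: hand over the raw bytes argv; PyConfig does
     * the decoding and, with parse_argv=1, parses it like the regular
     * python command line. */
    PyConfig config;
    PyConfig_InitPythonConfig(&config);  /* or PyConfig_InitIsolatedConfig()
                                            for the "Isolated Configuration" */
    config.parse_argv = 1;

    status = PyConfig_SetBytesArgv(&config, argc, argv);
    if (PyStatus_Exception(status)) {
        PyConfig_Clear(&config);
        Py_ExitStatusException(status);
    }

    status = Py_InitializeFromConfig(&config);
    PyConfig_Clear(&config);
    if (PyStatus_Exception(status)) {
        Py_ExitStatusException(status);
    }

    /* Run Python as if it were the regular CLI, then finalize. */
    return Py_RunMain();
}
```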