How to "deprecate" modules that are moved to PyPI?

Some of the discussions around PEP 594 are suggesting that there are modules that we may want to move to PyPI, presumably without changing the namespace. This is feasible because all stdlib names are reserved on PyPI anyway, so we can certainly “just” move them there.

However, this brings up a problem: if these modules are deprecated in 3.9 and the recommended mitigation is to depend on the PyPI package, then even people who have followed that mitigation and added the dependency will still see the DeprecationWarning, which does not seem like good UX. I think it would be a good idea to come up with a solution to this problem if we’re going to pursue the “move to PyPI” option. A few possible ways to solve this:

  1. Allow PyPI packages to take precedence over stdlib modules in imports (probably dangerous in many ways).
  2. Have the stdlib module detect in some way whether the PyPI package is installed on import, and make the DeprecationWarning conditional on that.
  3. Structure the packages so that the PyPI package exposes a module `xxx._pypi_module`, and the implementation of the stdlib module moves to `xxx._stdlib_module` in Python 3.9. Then the implementation of the 3.9 stdlib module is:

```python
try:
    import xxx._pypi_module as _module
except ImportError:
    import xxx._stdlib_module as _module  # raises DeprecationWarning
```

Note that this would be an “opt-in” version of #1, since the PyPI version would take precedence over the stdlib version. Don’t get too hung up on that particular example implementation; the gist is to do something like that.
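For comparison, option #2 could look something like the sketch below. This is only an illustration under assumed names: `xxx_pypi` is a hypothetical top-level name the PyPI distribution might ship under, so the stdlib copy can probe for it without importing it.

```python
# Hypothetical body of the stdlib module "xxx" under option #2:
# emit the DeprecationWarning only when the PyPI replacement is absent.
import importlib.util
import warnings

# find_spec() checks whether the (hypothetical) PyPI package "xxx_pypi"
# is importable without actually importing it.
if importlib.util.find_spec("xxx_pypi") is None:
    warnings.warn(
        "the stdlib 'xxx' module is deprecated; "
        "install its PyPI replacement instead",
        DeprecationWarning,
        stacklevel=2,
    )
```

The trade-off versus option #3 is that here a single codebase stays in the stdlib and only the warning is conditional, whereas #3 actually switches which implementation is used.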

I do think this is important for PEP 594, but I “forked” the discussion onto Discourse because the python-dev thread is currently mostly concerned with the specifics of what would stay and what would go, and I didn’t want to add yet another tangentially related topic to it.

CC: @vstinner @tiran


FYI, Ruby has started this movement already: “Gemification”.