PEP 594 has been implemented: Python 3.13 removes 20 stdlib modules

Thanks so much for the terrific summary, Victor, and all your detailed notes!

My advice—just use Git Filter-Repo. It’s officially recommended by Git itself over Git filter-branch, and is an order of magnitude (or more) faster, more powerful and easier to use.

In the past, back when git-filter-repo was unmaintained, I used git filter-branch and BFG (a previously popular tool for this), and it was just a nightmare that took a large chunk of a day merely to extract a few files in one directory. I actually found out from the PEP 594 thread here that git-filter-repo was back, and I’ve used it several times since; it was a breeze by comparison. There was also a recipe posted over there that @jmr used to extract cgi.
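For anyone who hasn’t used it, the general shape of a filter-repo extraction looks roughly like this (just a sketch with illustrative paths, not the exact recipe from that thread); it assumes a fresh clone of CPython, since filter-repo refuses to run on an existing clone without `--force`:

```sh
# Start from a fresh clone; filter-repo rewrites history in place.
git clone https://github.com/python/cpython cgi-history
cd cgi-history
# Keep only the commits touching the listed paths and drop everything else.
git filter-repo --path Lib/cgi.py --path Lib/test/test_cgi.py
```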

Git subtree can work (see e.g. here for an example), but AFAIK only for limited use cases (a single directory), and it’s not the primary purpose of the command; I’d recommend Git Filter-Repo instead, as it is much more powerful and specifically designed for this task.
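For comparison, the single-directory case with subtree looks roughly like this (a sketch only; the prefix path and branch name are illustrative):

```sh
# Split the history of one directory into its own branch,
# which can then be pushed to a new repository.
git subtree split --prefix=Lib/some_package -b extracted-history
```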

Just to note, unless something significant changes (and I really don’t think it should), you and your community will need to decide on a PyPI project name for your version that isn’t the original one (the import module/package name can stay the same, so it is still drop-in compatible). Aside from things that are already on PyPI as backports, etc., stdlib names are not permitted for new PyPI uploads, to protect against major security implications and user mistakes.

See the recent discussion on handling the opposite case (modules on PyPI that are now in the stdlib) and the discussion on whether to allow nntplib on PyPI for more context on that, and on all the problems that result for you, your users, PyPI and the ecosystem as a whole when the same name is used. Here’s my summary of some of them from the former thread, adapted in turn from the latter thread:
