PEP 594 has been implemented: Python 3.13 removes 20 stdlib modules


Zachary Ware and I removed 19 modules from the Python 3.13 stdlib; these modules were deprecated in Python 3.11 by PEP 594 – Removing dead batteries from the standard library.

  • aifc
  • audioop
  • cgi
  • cgitb
  • chunk
  • crypt
  • imghdr
  • mailcap
  • msilib
  • nis
  • nntplib
  • ossaudiodev
  • pipes
  • sndhdr
  • spwd
  • sunau
  • telnetlib
  • uu
  • xdrlib

I also removed the 2to3 program and the lib2to3 module in Python 3.13; they were deprecated in Python 3.11 by PEP 617 – New PEG parser for CPython.

By the way, Python 3.12 removed 5 stdlib modules:

  • asynchat, asyncore, smtpd: PEP 594
  • distutils: PEP 632
  • imp: replaced by importlib (added in Python 3.1)

Of all these removed modules, I suppose the ones that will cause the most trouble will be:

  • distutils (setuptools may save us: it still provides distutils)
  • imp
  • cgi

What’s New in Python 3.12 and What’s New in Python 3.13 list removed modules and suggest replacements. Only a minority of removed modules have no known replacement (for now).
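For instance, the most common use of the removed cgi module, cgi.parse_header(), can be rebuilt on top of email.message, along the lines of the recipe given in the What’s New documents. A minimal sketch, not a full drop-in replacement:

```python
from email.message import Message

def parse_header(line):
    """Rough equivalent of the removed cgi.parse_header()."""
    msg = Message()
    msg["content-type"] = line
    # get_params() returns [(maintype/subtype, ''), (key, value), ...]
    params = msg.get_params()
    return params[0][0], dict(params[1:])

print(parse_header("text/html; charset=utf-8"))
# ('text/html', {'charset': 'utf-8'})
```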

PEP 594 (created in 2019) was controversial: people have good reasons to want a large and functional Python stdlib, and we took them into account. But the Steering Council made a decision and approved it. I now suggest moving on to make this migration as smooth as possible. I chose to remove the PEP 594 modules as early as possible in the Python 3.13 development cycle to get user feedback as soon as possible, and to have more time to prepare the Python ecosystem for this migration: to help projects move away from the removed modules.

For me, PEP 594 moves the maintenance of the removed modules outside Python. People using them now have to organize themselves to decide how to maintain, outside the Python project, the code they rely on. The rationale is that the team maintaining Python is mostly made of volunteers and is too small to maintain 300 stdlib modules and C extensions (that’s a lot of code, tests and documentation). This maintenance is expected to be healthier and more efficient outside the Python project, with fewer constraints on contributions (it’s long and hard to become a Python core developer, PR review in Python is a known bottleneck, and no simple solution has been found so far).

Python 3.12 final release is scheduled in 4 months (October 2023) and Python 3.13 final release is scheduled in 1 year and 4 months (October 2024).

See my notes about the Python standard library, which list the stdlib modules added and removed since Python 3.0, with references to PEPs, issues and deprecations.

If your project is affected, you have different solutions:

  • Do nothing! For now, just remain compatible with Python 3.11 and older. IMO it’s a dangerous long-term choice: the technical debt only becomes more expensive over time. But maybe someone will come up with a solution for you in the meanwhile.
  • We attempt to propose recipes and alternatives in the What’s New documents.
  • Create a group of volunteers and give a new life to the removed modules by maintaining them on PyPI. You will be able to use pip install <module>. Then Linux distributions should package them, and you can add them as new dependencies of your projects.
  • Copy the removed module into your project and maintain it there. Usually it’s a single short .py file, if you choose to skip the documentation and tests. Just be careful about the license and copyright. You’re then on your own to maintain it, but this solution is quick and simple.
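For the last two options, a common pattern is to keep using the stdlib module while it still exists and only fall back to the PyPI or vendored copy once it is gone. A minimal sketch (mypkg._vendor.telnetlib is a hypothetical vendored location, not a real package):

```python
import importlib

def import_with_fallback(stdlib_name, fallback_name):
    """Import a module by name, trying the stdlib first,
    then a vendored or PyPI copy published under another name."""
    try:
        return importlib.import_module(stdlib_name)
    except ImportError:
        return importlib.import_module(fallback_name)

# e.g. telnetlib from the stdlib on <= 3.12, a vendored copy on 3.13+:
# telnetlib = import_with_fallback("telnetlib", "mypkg._vendor.telnetlib")
```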

See the Python license.

About PyPI, the removed nntplib module has already found a new home there: the PyPI nntplib project. So far, user feedback seems to be positive. If possible, I suggest trying to keep the Git history of the copied files: Doc/library/<module>.rst, Lib/<module>.py and Lib/test/test_<module>.py. Again, please be careful about licensing and copyright. To keep the Git history, you can try tools such as git filter-branch or git filter-repo.

Please comment below if you have concrete advice and feedback about these tools.

Another example is the old-demos Git repository which contains demos and scripts removed from Python 3.12 (it seems like this one didn’t keep the Git history).

You can use this discussion to share your recipes for moving away from removed modules, and to organize the creation of a new warm home for some of them.


By the way, thanks a lot to Christian Heimes and Brett Cannon for being stubborn and getting this PEP 594 accepted and implemented :grin:


Thanks so much for the terrific summary, Victor, and all your detailed notes!

My advice—just use Git Filter-Repo. It’s officially recommended by Git itself over Git filter-branch, and is an order of magnitude (or more) faster, more powerful and easier to use.

In the past I used git filter-branch and BFG (a previously popular tool for this) back when GFR was not maintained, and it was a nightmare that took a large chunk of a day just to extract a few files in one directory. I actually found out GFR was back in active development from the PEP 594 thread here, have used it several times since, and it was a breeze by comparison. There was also a recipe posted over there that @jmr used to extract cgi.

Git subtree can work (see e.g. here for an example), but AFAIK only for limited use cases (a single directory), and it’s not the primary purpose of the command; I’d recommend Git Filter-Repo instead, as it is much more powerful and specifically designed for this task.

Just to note: unless something significant changes (and I really don’t think it should), you and your community will need to decide on a PyPI project name for your version that isn’t the original. The import module/package name can stay the same, so it is still drop-in compatible. But at least for things that aren’t already there as backports, stdlib names are not permitted for new PyPI uploads, to protect against major security implications and user mistakes.

See the recent discussion on handling the opposite case, modules on PyPI that are now in the stdlib, and the discussion on whether to allow nntplib on PyPI for some more context on that and all the problems that result, for you, your users, PyPI and the ecosystem as a whole, when using the same name. Here’s my summary of some of them from the former thread, adapted in turn from the latter thread:


Oh, I was told that cgi got a new home: legacy-cgi · PyPI


Also, @_david has published xdrlib


Perhaps I should re-create the critical files in py-xdrlib using git-filter-repo? I’ll look at that this weekend.

Is there any process evolving for redirecting users?

Upon checking again, while I can’t seem to find any public discussion or notification of the decision, despite the many serious concerns expressed in the relevant thread, it seems that that particular continuation of nntplib was in fact handed the original, official name on PyPI after all, as I discuss there.

As I believe we really should openly discuss and decide on a coherent policy on this sooner rather than later with the PyPI admins involved, I’ve opened a new thread in Packaging focusing on this specifically:


Victor added a note to the What’s New entry announcing the deprecation that suggests the nntplib project on PyPI as a drop-in alternative for the eponymous module. The same could be done for py-xdrlib, and this could also be backported to the deprecation notice in the relevant Python versions.

Have you considered placing the removed modules into a community-maintained (mono)repo under the Python organisation, with the understanding that there is no official involvement apart from providing the space? From there, individual packages could be released under the protected module names, removing the question of “ownership”. That would prevent fragmentation, and make it less of a headache for users to decide whether to trust a given replacement package.

Yes, I believe that was considered. The original PEP 594 has extensive discussion of the various options considered, it might be worth reading that if you want to know the background.



Creating/maintaining a separate repo for the deprecated modules

It was previously proposed to create a separate repository containing the deprecated modules packaged for installation. One of the PEP authors went so far as to create a demo repository. In the end, though, it was decided that the added workload to create and maintain such a repo officially wasn’t justified, as the source code will continue to be available in the CPython repository for people to vendor as necessary. Similar work has also not been done when previous modules were deprecated and removed, and it seemingly wasn’t an undue burden on the community.


Additionally, the fact that the source was hosted under the official Python organization would imply ownership to me, but otherwise allowing others to arbitrarily edit it and publish it to PyPI without any “official involvement” would violate the consequent user expectations more severely than just giving a competent-seeming third party the name on PyPI alone.

Furthermore, we would still have to arbitrate who to give write access to the repo to, and who to give the appropriate role(s) on PyPI to allow them to publish releases, which is likewise essentially a superset of that problem discussed for handing out the PyPI names.

Also, hosting explicitly stdlib modules unsupported by the core team and CPython would not be consistent with Python’s Organization Repository Policy. It would also raise logistical difficulties as we’d have to admit everyone with any role in the project as members of the core python organization. And of course, there are all sorts of practical issues with hosting this as a monorepo given GitHub is not really designed for this use case.

Some of these concerns could be at least partially mitigated by creating a new independent GitHub organization, hosting each project as regular repositories under that if someone steps up to maintain it, and if so giving the org owner(s) (presumably, trusted person(s) in the Python community) and the relevant maintainer(s) ownership of the relevant PyPI name. However, it would demand a fair amount of overhead and consensus to set up and manage, still requires handing over control of both all the source repo and PyPI names to someone (a much weightier choice than before), still doesn’t totally solve the basic problems, and wouldn’t accommodate maintainers (like that of nntplib) who prefer to use platforms other than GitHub.


Hi, I published all of the removed pure-Python libraries to PyPI under names with a standard- prefix.

e.g. pip install standard-uu will install uu.

pip install 'standard-uu>=3.12' will install uu from the Python 3.12 release. I hope it can be a smooth migration helper for legacy code users.
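If that scheme suits your project, one way to pull in the backport only on the Python versions that actually lack the module is an environment marker in requirements.txt (sketched here for uu, which was removed in 3.13):

```text
standard-uu; python_version >= "3.13"
```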


Hey @youknowone, I noticed standard-crypt · PyPI only includes the crypt module and not the _crypt extension module with the actual implementation. As it is currently packaged, it will not work:

$ python3.13 -m venv venv3.13

$ venv3.13/bin/pip install standard-crypt
Collecting standard-crypt
  Obtaining dependency information for standard-crypt from
  Downloading standard_crypt-3.12.2-py3-none-any.whl.metadata (3.8 kB)
Downloading standard_crypt-3.12.2-py3-none-any.whl (6.0 kB)
Installing collected packages: standard-crypt
Successfully installed standard-crypt-3.12.2

$ venv3.13/bin/python -c 'import crypt'
Traceback (most recent call last):
  File ".../venv3.13/lib64/python3.13/site-packages/crypt/", line 6, in <module>
    import _crypt
ModuleNotFoundError: No module named '_crypt'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "<string>", line 1, in <module>
    import crypt
  File ".../venv3.13/lib64/python3.13/site-packages/crypt/", line 11, in <module>
    raise ImportError("The required _crypt module was not built as part of CPython")
ImportError: The required _crypt module was not built as part of CPython

I’d like to announce here that due to the need to implement yescrypt in the Fedora installer, we have packaged crypt_r as a standalone package available from PyPI.

It is a copy/fork of the removed standard library crypt module. Unlike crypt, our package always exposes the crypt_r(3) function, not crypt(3). Note that crypt_r is not part of any standard. We tested the package with the crypt_r implementation in Fedora Linux (libxcrypt), and it should work with compatible implementations of crypt_r (such as from older glibc).

To use this module, you can either import crypt_r explicitly or use the old crypt name for backward compatibility. However, on Python older than 3.13, the crypt module from the standard library will usually take precedence on sys.path.

From PEP 594:

  • The module is not available on Windows. Cross-platform applications need an alternative implementation anyway.

This is acknowledged, crypt_r explicitly only supports Linux.

  • Only DES encryption is guaranteed to be available. DES has an extremely limited key space of 2**56.

Other methods are available on modern Linux, such as Fedora Linux.

  • MD5, salted SHA256, salted SHA512, and Blowfish are optional extensions. SSHA256 and SSHA512 are glibc extensions. Blowfish (bcrypt) is the only algorithm that is still secure. However it’s in glibc and therefore not commonly available on Linux.

glibc extensions are available in Fedora Linux.

  • Depending on the platform, the crypt module is not thread safe. Only implementations with crypt_r(3) are thread safe.

crypt_r (the Python package) always uses crypt_r (the C function).

  • The module was never useful to interact with system user and password databases. On BSD, macOS, and Linux, all user authentication and password modification operations must go through PAM (pluggable authentication module); see the spwd deprecation.

The module is useful for the Fedora installer.

We will maintain this package at least as long as the Fedora installer needs it (currently that means indefinitely), but we plan no future development, only bugfixes. Python 3.11+ is supported (Python 3.10 or older did not emit deprecation warnings for the crypt module).