User experience with porting off setup.py

Thanks for this post and your writeup, Gregory.

I have spent at least a week’s worth of time trying to port the Roundup Issue Tracker to something “more modern”. I have zero to show for that.

Since I support Python 2 and Python 3.6+, I use source distributions rather than wheels. This is partly because I want to deploy the test suite and install other files that need to be installed. This includes man pages (not installed on Windows), translation files, tracker template directory trees, alternate front ends, and formatted docs.

AFAICT these categories of files don’t exist in the model of the current packaging tools. So I have had to jump through all sorts of hoops to put the files in places where the installed software can find them later. Hopefully, the users can also find them so they can use the software 8-).

I’m just holding off, hoping that by the time invoking setup.py directly starts failing I’ll be able to find directions on how to move to pyproject.toml without spending more weeks adapting the code, so I can provide a working installation to a user.

I was worried that I was just too stupid to figure out how to do this. Seeing somebody else struggle with packaging a pure library (which seems to be the sweet spot for the packaging tools) gives me hope I may be able to keep Roundup deployable for a few more years.

Thanks for your efforts.

This is where we landed for Bokeh and also what I have had to recommend to a few other projects looking for a mechanism to condition their build in any way.


As much as I empathize with @indygreg’s experience, I think this particular point is pushing expectations too far. Python 2.7 has been EOL for almost 3 years and Python 3.6 for almost 2 years. IMHO, you can’t expect modern packaging tooling to support these, and I would totally expect that constraining yourself to old tool versions makes things harder. For example, PEP 621 was accepted a little less than two years ago.

Edit: If this sounds like I’m contradicting my earlier post, I’m not – asking to be able to build a package on Python 2 or 3.6 is a much bigger ask than installing the package from a wheel on Python 2 or 3.6.


zstandard is not pure Python but rather it builds an extension module, so it is in fact the most complex type of library.


That is quite the tragedy in itself. I doubt that Python would be where it is today, e.g. the glue code holding large parts of the scientific ecosystem together, if distutils and then setuptools had not made it magically easy to build C extensions.


Building a single extension module is not sufficient to place a project in the class of the “most complex type of library”. Also distutils and setuptools did not make it “magically easy to build C extensions”. Both of these statements portray a profound underestimation of the difficulty that non-pure Python projects face with packaging.

To add to the discussion about experiences with new Python packaging: I recently migrated python-flint to Python 3.12, which meant finding an alternative to numpy.distutils. It turns out that, after all these years, it is finally possible to use setuptools, CPython 3.12 and MinGW together, so that I could do:

import sys

if sys.version_info < (3, 12):
    from distutils.core import setup
    from distutils.extension import Extension
    from numpy.distutils.system_info import default_include_dirs, default_lib_dirs
    from distutils.sysconfig import get_config_vars
else:
    from setuptools import setup
    from setuptools.extension import Extension
    from sysconfig import get_config_vars
    default_include_dirs = []
    default_lib_dirs = []

Ten years ago I submitted a CPython patch (thanks to @pitrou for merging!) to make it possible to build extension modules with MinGW, but it subsequently stopped working because CPython changed its C runtime on Windows.

Many thanks to all of the people working on improving MinGW support in recent setuptools, and for finally getting CPython to accept the basic patch needed to make it work. It turned out that migrating from numpy.distutils to setuptools was easy enough now, although we still need the version check in setup.py because MinGW will not work with CPython < 3.12. I had been dreading the prospect of migrating to new packaging systems, but after recent improvements it turns out that using setuptools is plenty good enough.

I would still like to migrate from setuptools to meson and meson-python, because I would like to have an integrated build system that can locate or build the underlying C dependencies rather than just a build system that builds the extension modules. I would also like to add pyproject.toml, but apart from keeping up with “modern” practice I don’t yet see any benefit in doing so.
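For reference, if we do make that move, my understanding from the meson-python docs is that the build-system table would be just:

```toml
[build-system]
requires = ["meson-python"]
build-backend = "mesonpy"
```

with the actual build logic living in meson.build files rather than in setup.py.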


But that’s the problem: not all of us live in your lovely world of “modern packaging tooling” (which is just a euphemism for “we broke your tools and we don’t care”). For example, M2Crypto (which I maintain) is primarily used for legacy purposes (for new development, please use cryptography), and it is with a very heavy heart that I have just decided to abandon Python 2, knowing that I still have users on it. On the other hand, 3.6 is an absolute must (I work for SUSE, and SLE-15 with Python 3.6 will be supported for another ten years or so).

And yes, I want to also support 3.12, so setup.py build will stop working for me very soon.


Please don’t suggest that the packaging people don’t care about breaking things. This is entirely false, and @pf_moore has already explained the technical challenges that setuptools and pip face, which also have to do with lack of resources.

I’m not blaming you for supporting Python 3.6 or even Python 2. There are always good reasons. All I’m saying is that it’s not just packaging tools that are not going to support Python 3.6, but also CPython core (no security fixes anymore!), and plenty of other tools/libraries/distros/things, because the world has basically moved on. If you maintain a library that works on Python 3.6 it means accepting this extra burden, and not just for packaging.


That is a different statement. Of course there are massively complicated projects, and I imagine that is the entire reason for having dedicated build backends. I am not at all saying that the old status quo was sustainable.

But building a single extension module with setuptools was and is amazingly easy (especially if you think what the landscape elsewhere was like when the Python package ecosystem first took off). For the vast majority of projects, and the vast majority of users, everything does just work. And that experience has, as far as I can see, largely been lost for native packages with the modern build system. [1]

  1. I continue to build every C extension I write with setuptools, because everything else (that I have tried) is just so much more complicated to get into. ↩︎


I am sorry for not being kind enough. I shouldn’t have implied (and I don’t actually think) that the developers intentionally break my tools; that’s not true.

And of course, using Python 3.6 (or even older interpreters) assumes somebody else (like me, for SLE) backports fixes for current CVEs to old interpreters.

This is very interesting, and I agree with @pitrou that it is disheartening.

I’m going to refrain here from reiterating too much stuff I said on other threads, but one clear takeaway is that documentation is key. A good part of the pain described in the blog post has to do with being unable to find clear documentation.

Part of the reason for this is that the transition from “use distutils, it’s in the stdlib” to “use setuptools even though it isn’t in the stdlib” to “mix and match build frontend and backend” has resulted in the necessary documentation becoming horrendously fragmented. Some of it is in PEPs, some is on packaging.python.org, and some is in the documentation for individual tools. Trying to figure out how to do anything means ping-ponging back and forth between all these places and never knowing for sure if what you read in one matches up exactly with what you read in another.

Although this problem isn’t in general solvable (because anyone can go out and write a new tool, and document it as well or poorly as they choose), I continue to think that a solid set of official, integrated documentation on how to do the main tasks would go a long way towards easing this pain. And this is potentially something that the contemplated Python Packaging Council could help with, because I do think that, although lack of resources is a big problem, there is at least a bit of the problem which is “no one is perceived to have a clear mandate to make a decision on an integrated workflow that should be given special recommendation and explanation in the docs”.

That’s true, but I think part of the “finding documentation” problem is that no one (not the steering council nor PyPA nor the PyPC nor individual tool authors) can control what Google happens to serve up to someone. That is why I think it is critical that a person be able to start directly from python.org and be able to find, by drilling down through a nicely designed docs structure, a complete description of how to do things, and know that that way is officially endorsed because it’s on a domain that ends with python.org. A big open question, I think, is what “do things” means there, that is, what tasks are considered “normal enough” to be described there, and again I think that’s something the PyPC could potentially help with.


There are tools for migrating/converting to pyproject.toml. PDM has an import command, which I have used and which works OK; some manual tidying up was necessary. I don’t know Hatch as well; it seems that hatch new might work when setting up a new project with an existing [?].

I hope I’m not distracting too much for your (more important) broader points, but I did want to help clarify a few points and questions for you and others.

Yes, more or less, and it is implemented by basically all tools. PEP 517 requires that, if pyproject.toml is present but no build-system table is specified, the backend be assumed to be setuptools.build_meta:__legacy__, which is equivalent to legacy direct execution of setup.py. And by implication, the default build requirements are the minimum versions for which that Setuptools backend is present and fully functional, which are the versions indicated, as also suggested by PEP 518.
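Spelled out, that implied default corresponds to a table roughly like this (the exact minimum pins vary by frontend; the ones below are what pip has used):

```toml
[build-system]
requires = ["setuptools>=40.8.0", "wheel"]
build-backend = "setuptools.build_meta:__legacy__"
```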

However, you should specify your backend explicitly in build-system. build_meta:__legacy__ is only supposed to be used as a fallback for pre-PEP 517/518 projects; if explicitly specifying the build backend, you should just use build_meta, not __legacy__. Additionally, the explicit dependency on wheel is not actually required, or recommended nowadays, as that is a transitive implementation detail of the Setuptools backend. Finally, you’ll need Setuptools >=61.0.0 to support pyproject metadata (in the [project] table of the pyproject.toml, as originally specified in PEP 621, rather than setup.cfg). So, as recommended in the packaging guide, the standard [build-system] table for Setuptools is:

[build-system]
requires = ["setuptools>=61.0"]
build-backend = "setuptools.build_meta"

Of course, you could pin down to a specific version if desired, though I haven’t really seen that recommended anywhere.

I can definitely see why it is confusing, but FTR, Hatch is the user-facing frontend tool, while Hatchling is its associated build backend.

Theoretically, yes, if they install from source/sdist and not a wheel. But pip as far back as pip 10.0, released nearly 6 years ago, supports installing the specified versions of build dependencies in an isolated environment, which will ensure you get up-to-date, compatible versions of setuptools, etc. to build the package.

Realistically, though, this isn’t much of an issue in practice, as some of the most widely-used packages made a superset of that switch years ago, to entirely non-Setuptools build backends that don’t have a setup.py at all, and things have generally worked fine for users, and distros have more or less already adapted as needed.

For your case, the main issue that will arise is that users building from source with CFFI will need to modify their invocation to, e.g., use env variables or --config-settings to pass values instead of CLI flags to setup.py. But generally such users are the most likely to know what they’re doing and adapt to it, at least so long as you provide clear guidance, and as invoking setup.py directly is deprecated, they’ll need to change that soon anyway.
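To illustrate what that change looks like (the paths and flag names below are made up, and the exact --config-settings keys depend on the build backend):

```shell
# Before: deprecated direct setup.py invocation with a custom CLI flag
#   python setup.py build_ext --include-dirs=/opt/libffi/include
#
# After: drive the build through a PEP 517 frontend and pass values via
# environment variables or --config-settings instead
#   CFLAGS="-I/opt/libffi/include" pip install .
#   pip install . --config-settings="--build-option=--include-dirs=/opt/libffi/include"
```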

Depends on which of those you’re talking about. Adoption of build changes nothing for users, it just builds the same artifacts in a more modern fashion. Hatch and Poetry are both frontends; they don’t change anything directly for users, while switching to their respective build backends has no effect on the vast majority of users consuming wheels, though it does mean that they’ll need at least pip 19.0 (released 5 years ago) in order to automatically build from source with that backend.


Yes, I agree that the problem is to a large extent one of documentation. But wasn’t being the one central place for all of this the sole purpose of packaging.python.org?

Thank you, this was apparently what I was missing all the time! Thank you.

/me goes to study setup.cfg documentation.


In relation to the topic of this thread, maybe what is missing from those docs is an up-to-date guide aimed specifically at existing projects looking to migrate to new packaging standards (I just looked quickly and didn’t see one). Most likely an existing project that already uses distutils/setuptools should just be told that they can continue to use setuptools, which is still supported, actively developed, and not deprecated. They can keep their setup.py, but it should use setuptools rather than distutils, and they should add a pyproject.toml with a build-system entry saying that the project uses setuptools as its build backend. The guide could explain what the benefits are of using pyproject.toml and should also say clearly whether there are any potential downsides.

A separate guide could give a short explanation of why a project that currently uses setuptools may or may not want to consider adopting any of the other frontend or backend tools, to be used either together with setuptools or in place of it. The guide could make it clear that there is no need to use anything else if setuptools currently does what is needed, and that setuptools will continue to be supported.


I don’t think it needs a packaging council so much as it needs someone to write the docs they think are needed, and submit them. The person doing the work gets to make the decisions.


That, and more conversion-specific notes, like: if you cannot, for whatever reason, go with setuptools >= 61.0.0, hold on to your setup.cfg; it works just fine and it is as declarative as pyproject.toml.

Also, I like how the standard Python documentation has all those notes like “Changed in version 3.3: …” and “Available since 3.2: …”. Not everybody can run the latest snapshot from the master branch.

I understand the pragmatic aspect of this answer, but I want to mention that everything in this thread points to the general problem that there is no central direction in the packaging ecosystem, and instead a myriad of individuals doing just what they want on a disparate collection of overlapping projects, resulting in the current mess.

So, even if improving the docs might be solved with “the person doing the work gets to make the decisions”, it also perpetuates the pattern that bred the current frustration amongst users.


I’m not blaming you for supporting Python 3.6 or even Python 2.
There are always good reasons. All I’m saying is that it’s not
just packaging tools that are not going to support Python 3.6, but
also CPython core (no security fixes anymore!), and plenty of
other tools/libraries/distros/things, because the world has
basically moved on. If you maintain a library that works on Python
3.6 it means accepting this extra burden, and not just for
packaging.

Everyone lives in a bubble, and to them their bubble is “The World.”
In your bubble, The World has moved on yes. In my bubble, I help
maintain projects intended to run on LTS operating systems and in
some cases support users with CPython 2.7, 3.5-3.12 and toolchains
contemporary to those.

It’s not easy, no, but I also acknowledge that it’s not the Python
community’s job to make that any easier. However, pretending that
The World has moved on from those use cases isn’t helpful. It
hasn’t. It’s fine to consider them out of scope for whatever reason
suits you, but they’re still very much around.