When you kick the packaging hornet's nest on Twitter, the hornets seem to want an opinionated, KISS solution

I ended up kicking off a conversation on Twitter about people’s packaging pain points after Hynek lamented that the new annual release cadence places a major burden on package maintainers, purely because setting up, maintaining, and pushing new releases to PyPI is such a major time sink.

Threads you can read

Thea pointed out that even though she helped write the docs, it still took her a long time to get a project set up:

Ethan pointed out a bunch of papercut scenarios involving setuptools:

Hugo said getting CI running on newer versions of Python can be a problem:

And then Anthony pointed out that he thinks conda-press could help, but he was too busy getting ready to give a talk on that exact topic to chat :laughing:

The conclusions I drew

  1. Setuptools confuses the heck out of people, and they can never remember all the intricate details required to use it properly (I realize there is a lot of historical baggage, but that baggage has stuck around; e.g. how many packaging-related files do you have in your project?)
  2. Tools like cargo for other languages have spoiled people to the point that they want a single tool to drive the whole process (i.e. saying “use Flit” didn’t satisfy, as that’s yet another tool to learn, and without a single shared tool you don’t get the documentation and community support you’d have if everyone used the same one)
  3. Some hope conda-press can help

Basically people wanted a KISS solution. I know this isn’t really anything new for people who participate here, but it seems people really want an opinionated, simple solution for the common packaging cases. After that I would assume they want composable tools for those times when the general opinion on how to do something doesn’t work for their special case.

Now I don’t know what this means in practice: putting more configuration details in pyproject.toml so they are shared across build tools, making the common case consistent and only kicking out to tool-specific settings when you go beyond the simple case; going down the pip build/pip publish route so that building your sdist and wheel and getting them onto PyPI is universal and documented the same everywhere; putting in the effort to simplify setuptools configuration (which may tie back into the “more settings in pyproject.toml” idea); or something else.



Since this thread seems to be my fault, let me use this opportunity to give my perspective on the state of Python packaging.

Python-only packages

I think we’re in a mostly adequate place here. Anybody who wants to publish a Python-only package can realistically get it done, either with my blog post, flit, poetry, or by going setup.cfg-only. There are a bunch of paper cuts left, though.

I personally feel (and I do understand that it’s only my opinion) that flit and poetry are nice approaches, but they only get me 90% there. If people are cool with that, more power to them, but I don’t see myself switching to static metadata because I don’t want data duplication between my code and package metadata. We have well-known dunder variables for almost all fields; why is flit the only one looking at __version__ (and ignoring the rest)? I’ve been told on the poetry tracker to ship my pyproject.toml with my package (!) and parse it using ConfigParser (!) in my __init__.py (!), on import (!).
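For context, the dunder convention referred to here looks like this (package name and values are hypothetical):

```python
# Hypothetical single-module package using the well-known dunder convention.
# flit reads __version__ (and the module docstring) statically; most other
# tools ignore these fields entirely.
__version__ = "19.2.0"
__author__ = "Jane Developer"
__license__ = "MIT"

# A build tool could pick these up without any duplication in a config file.
metadata = {"version": __version__, "author": __author__, "license": __license__}
print(metadata["version"])
```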

I am aware of importlib.metadata, but unless I want to add a runtime dependency, backward compatibility makes it a non-starter for existing projects. I know about setuptools_scm and I respect it, but I don’t like treating SCM metadata as canonical data; YMMV.

I also like to be able to concatenate the latest changes (and only the latest; if you concatenate everything on a long-running project, it gets unwieldy) to my long_description as a service to my users. I guess I could write a setuptools plugin to do all my shenanigans? But that brings me back to why I don’t use flit and poetry: things I’d like to do dynamically keep coming up.

I also don’t see much value in them over a setup.cfg-only setuptools package, except I wish we could finally stop using setup.cfg and put that data into pyproject.toml. This half-transition that’s been dragging on for years now is not great.

Another major problem that won’t allow me to drop setup.py anytime soon is editable installs under PEP 517. I love PEP 517, but I really need proper editable installs, and I’m still a bit surprised that this slipped past PEP 517. You need them for src layouts (I know, that’s my pet peeve, but it’s increasingly popular) and when you have setuptools entry points for CLI tools.

OK these were the good parts. :slight_smile:

Binary packages

The moment you add even the smallest amount of compiled code to your package, flit and poetry are right out, and paper cuts turn into proper guttings.

First, check this beautiful setup.py: https://github.com/hynek/argon2-cffi/blob/master/setup.py

And now look at this arcane heap of terribleness that I have to use to build wheels for argon2-cffi: https://github.com/hynek/argon2-cffi/blob/master/.azure-pipelines/wheel-builder.yml

I’m sorry for the strong words, but this is bullshit. The average maintainer cannot be expected to deal with this.

The only reason I got it working at all, after hours of fiddling, is that @Alex_Gaynor and Paul Kehrer did most of the work for me and I “just” had to adapt a few details.

The ~best~ nugget is certainly that on Linux and macOS we have to pin pip to a version before PEP 517 support (10.0.1).

If I understand it correctly, the logic goes like this:

  1. We need to tell wheel to build an abi3 wheel – but only on Linux and macOS, because there aren’t abi3 wheels for Windows yet.
  2. You cannot pass arguments to wheel through pip wheel in PEP 517 mode (and neither through pep517.build AFAIK – I’m personally very disinterested in the discussion whether pip does everything or not as long as it works).
  3. Disabling PEP 517 with --no-use-pep517 while having PEP 517 configuration in pyproject.toml makes the build abort.
  4. Profit?
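For reference, the usual way to request the limited ABI from setuptools looks roughly like the sketch below. The project and file names are hypothetical; both py_limited_api knobs are the real setuptools/wheel options, and the conditional mirrors the Linux/macOS-only constraint described above:

```python
# Hypothetical setup.py requesting an abi3 wheel, but only off Windows.
# Everything besides the py_limited_api options is illustrative.
import platform

from setuptools import Extension, setup

ABI3 = platform.system() != "Windows"

setup(
    name="example-cffi-pkg",
    ext_modules=[
        Extension("_example", ["src/_example.c"], py_limited_api=ABI3),
    ],
    # Tags the wheel cp36-abi3 instead of tying it to a single CPython ABI.
    options={"bdist_wheel": {"py_limited_api": "cp36"}} if ABI3 else {},
)
```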

If we could set the limited ABI unconditionally, this problem might go away? For that we’d need abi3 for Windows and wheel tolerating/ignoring that option on Python 2. However, abi3 for Windows won’t land before 3.9, so we’re stuck with the mess above for years to come.

Honestly, I have never been closer to abandoning argon2-cffi than on that frustrating Sunday afternoon.


  • Add pyproject.toml support to setuptools so we can drop setup.cfg and setup.py.
  • Consider support for metadata extraction for well-known fields.
  • Get proper pip install -e . with PEP 517 support.
  • Get abi3 for Windows.
  • Figure out abi3 tooling in the meantime. Allow passing arguments to wheel through pep517.build?
  • I love you all, thank you unironically for your hard work. I remember the times of easy_install and eggs and at least for the users, the situation has much improved.

This is a major limitation of PEP 517 at the moment. I don’t personally use editable installs, but they are a huge deal for many projects. The only reason they aren’t in PEP 517 is that everyone was sufficiently burned out getting the basics agreed that we had no appetite at the time for more difficult discussions. I’d strongly recommend that someone pick this up and champion an extension to PEP 517 to add editable support (there have been a few attempts to start such a discussion, but as far as I know they’ve all died down; if anyone knows of one that’s still active, please link to it, as my Discourse searching abilities aren’t up to finding anything at the moment).

From memory, the biggest things that need to be decided are:

  1. Do we make support of editable mode mandatory or optional? AFAIK, at the moment, editable installs are purely a setuptools feature. Do we want to require other backends to offer such a facility, or are we OK with front ends refusing to do editable mode if the backend doesn’t handle it?
  2. How do we implement editable mode anyway? Because it requires things to be installed, the front end needs to be involved as well as the backend, so we need to standardise the mechanism, so that (for example) we can install a project in editable mode with one frontend and then uninstall with another.
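For background on point 2, the core trick behind today’s setuptools editable installs can be emulated in a few lines. This sketch (all names hypothetical) fakes the effect of the .pth file a real editable install drops into site-packages:

```python
# Demo of the mechanism behind editable installs: make the source tree
# itself importable. A real install writes the path into a site-packages
# .pth file; here we emulate that by editing sys.path directly.
import os
import sys
import tempfile

src = tempfile.mkdtemp()  # stand-in for a project's source directory
with open(os.path.join(src, "demo_pkg.py"), "w") as f:
    f.write("VALUE = 42\n")

sys.path.insert(0, src)  # what the .pth file would accomplish at startup
import demo_pkg

print(demo_pkg.VALUE)  # edits to demo_pkg.py are picked up on the next run
```

Part of what needs standardizing is exactly this: which side (frontend or backend) produces the redirection artifact, and how another frontend later finds and uninstalls it.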

Add pyproject.toml support to setuptools so we can drop setup.cfg and setup.py.

+1 on this. Presumably this “just” needs raising as a setuptools feature request and someone writing a PR. That’s not to say doing so will be easy, but as something that only affects one project, it’s at least self-contained.

Consider support for metadata extraction for well-known fields.

This probably needs more clarification. In principle the idea is sound (it’s something a lot of people do, one way or another) but getting the details right without ending up back in a situation where building a wheel involves executing arbitrary code, and metadata that depends on the phase of the moon when the build step ran, is not trivial. Someone who understands the requirement needs to guide a discussion on this one.
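One way to keep extraction static is to only accept literal assignments, roughly what flit does for __version__. A minimal sketch (the helper name is made up):

```python
# Hedged sketch: pull __version__ out of source text without importing
# (and therefore without executing) the module. Only a literal assignment
# is accepted, which sidesteps the arbitrary-code-execution problem.
import ast

def extract_version(source: str) -> str:
    for node in ast.parse(source).body:
        if isinstance(node, ast.Assign):
            for target in node.targets:
                if isinstance(target, ast.Name) and target.id == "__version__":
                    return ast.literal_eval(node.value)
    raise ValueError("no literal __version__ found")

print(extract_version('__version__ = "19.2.0"\nimport os  # never executed\n'))
```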

Get proper pip install -e . with PEP 517 support.

+1. See above.

Get abi3 for Windows.
Figure out abi3 tooling in the meantime. Allow passing arguments to wheel through pep517.build?

I’m pretty sure this is (at least in part) a core Python issue. But I’m not an expert on ABI details, so again, this probably needs someone with a good understanding of the issue to drive a discussion (and likely propose a solution, because there are few enough people with the relevant knowledge, that any discussion is going to be fairly limited).

I love you all, thank you unironically for your hard work. I remember the times of easy_install and eggs and at least for the users, the situation has much improved.

Thanks for this. Even with the improvements we’ve made, there is still a long way to go, and motivation is sometimes hard to sustain. Knowing that we’re at least moving in the right direction is good.


I think for this, pip needs to grow a mechanism to pass config_settings (PEP 517), and setuptools needs to grow some way to pass arguments from config_settings to commands (i.e. --global-option).
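For reference, PEP 517 already reserves a slot for this: every hook takes a config_settings dict whose interpretation is backend-defined. A minimal sketch of the backend side (the hook signature is from PEP 517; the option key and returned filename are invented for illustration):

```python
# Sketch of a PEP 517 backend hook receiving frontend-supplied options.
# The "--py-limited-api" key and the filename are hypothetical.
def build_wheel(wheel_directory, config_settings=None, metadata_directory=None):
    config_settings = config_settings or {}
    limited_api = config_settings.get("--py-limited-api")  # hypothetical key
    # ... a real backend would run the build here, honoring limited_api ...
    tag = "cp36-abi3" if limited_api else "cp38-cp38"
    return f"example-1.0-{tag}-linux_x86_64.whl"

print(build_wheel("dist", {"--py-limited-api": "cp36"}))
```

The missing pieces are on either side of this dict: pip has no flag to populate it, and setuptools has no convention for consuming it.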

This is another of the “PEP 517 is almost there” items (like editables, except this one is standardized). :confused:

I’m driving this. The problem was that when we pointed out some of the APIs that would be in abi3 to the other core devs, many decided they wanted to clean it up before committing to it (and they didn’t believe they were already committed…). According to the PEP 602 acceptance post, that cleanup is complete, so we can now actually make those APIs available on abi3 on Windows. Possibly even on 3.8.1, but certainly in 3.9.


I’m not sure whether this setup.py is doing something else apart from customizing the building of C extensions, but if you call this “beautiful” unironically then I think there’s already a problem :wink:

(I know, some setup.py are much more complex and I happen to work and have worked on some such projects… still)

@pganssle has been doing this in his free time; it’s a WIP AFAIK, but it could probably use a lot of help.

We had this discussion last year at PyCon US; AFAIK we decided it should be mandatory. Of course, backend implementations can always decide to raise an error if they don’t want to support this for now.

If you read last year’s PyCon US packaging summit notes, you can find the common ground we agreed on. Before writing a PEP about this, the suggestion was made that someone should write a POC of the proposal, and that is where @pganssle’s effort stands.

https://github.com/pypa/setuptools/issues/1160 just needs someone willing to do it.


I think the almost-there-ness of PEP 517 is what makes it so frustrating. :expressionless:

I can’t stress enough how ironically that statement was meant.


As much as I can see how cathartic it can be to complain about packaging, it’s not clear to me that there’s anything new here. I think almost all of these things are on our radar already, and most of them were discussed at the Python Packaging mini-summit and have specific action items (see here).

With regards to the complexity of setuptools, I think that’s largely a documentation issue. I’m mostly speaking for myself, but I think the other setuptools maintainers would agree: what we’re looking to do is:

  1. Make it possible for 90-95% of people to specify their configuration in a declarative metadata file (currently that is setup.cfg, but we’re happy to also support pyproject.toml), with first-class support for more complicated workflows in setup.py.
  2. Deprecate and remove the more complicated parts of the setup.py workflow, and get setuptools out of the business of being a command line application.
  3. Move to workflows based on specifications, so that other build tools can be built to those specifications.

The first item on this list is essentially moving closer to how cargo works in Rust - almost everyone specifies their builds in the declarative Cargo.toml, but some small fraction have more complicated builds and have to use build.rs.
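As a concrete illustration of item 1 (the project name and field values are hypothetical), the declarative style looks like this in today’s setup.cfg:

```ini
# Hypothetical fully-declarative configuration; no setup.py logic needed.
[metadata]
name = example-pkg
version = 1.0.0
description = An example of declarative setuptools configuration
license = MIT

[options]
packages = find:
python_requires = >=3.5
install_requires =
    attrs
```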

That is probably not the correct issue; Issue #1688 has the more serious discussion in it. But yes, it is true that if someone is willing to do the work, we would almost certainly accept it.


I think this depends on whether we want to continue to keep common metadata requirements a per-tool thing or if we have any interest in trying to standardize common things. For instance, every project will have a name and version number, so why have every tool have their own way of specifying that? Maybe we want that so tools will pull from __version__ in some instances and not in others, but I don’t know if we want that much duplication of effort either.
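To make that concrete, a shared block in pyproject.toml could look something like the fragment below; to be clear, this is a purely hypothetical sketch, nothing like it is standardized today:

```toml
# Hypothetical shared metadata that any build tool could read the same way.
[metadata]
name = "example-pkg"
version = "1.0.0"  # or a marker telling the tool to read __version__ instead
```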

Well, as much as I understand it. That isn’t in the PEP, so it isn’t anything beyond me stating, at Nick’s request, that people should consider the stable ABI to help with the maintenance cost that comes from the release cadence.

Now hopefully this is all settled and things can move forward, but me saying something in an email most definitely does not make something true. :wink:

Correct, but that doesn’t mean efforts haven’t petered out a bit, e.g. editable installs. So re-raising things to try to kickstart them again doesn’t hurt.

Are we getting to the point that maybe we need to start some packaging WGs to do targeted problem solving and to help keep momentum going on the key issues? For instance, should there be an editable-install WG to go off and try to figure out a proposal to bring back here? Same thing for a universal build tool, to see how feasible that would be (and I’m deliberately not suggesting pip here)? How about other stuff on the list from the summit? Basically, my worry is that we are all trying to help solve all the problems, which leads to all of us being spread thin, making it easier to lose track of things and not head towards resolutions.


I wonder if the long-running almost-there-ness of PEP 517 et al. isn’t just a sign that the limits of volunteer labor for this kind of task have been reached, and we should find a way to pay someone(s) to bring it over the line.


I think it’s a bit of both this and what Brett said - everyone wants to have their say, so people’s attention gets spread too thinly. Add to that the fact that we’re all volunteers and there are significant limits on how much time people can spend on any of this (without burning out).

Groups targeted with working on individual problems, quite possibly involving some sort of funded resource, sounds like a good approach. But for funding to be an option, we’d need those targeted problems clearly defined - the Packaging WG notes are a good start here, but they probably make a lot more sense to the people who were present, if I’m honest, and could do with some tidying up and ongoing maintenance to track progress and/or changing situations. Maybe funding a project manager to co-ordinate planning and prioritisation of all the various work items would be a good solution, with the actual implementation work being handled by volunteer working groups (possibly assisted by further funded resources, if specialised expertise or even just additional manpower is needed).

IMO, the other problem with the “almost there” nature of PEP 517 is that in one sense it’s a solution without a problem - it deliberately looks at providing a standard solution that any frontend can use to work with any backend. But the reality is that the only frontend in serious use is pip, and the only major backend is setuptools (flit is a great example of another backend, but because it only targets simple, pure-python projects, it doesn’t help with more complex problems like compiler configuration or editable installs). So with the more complex problems, we keep hitting cases where we don’t have any experience to address the question of “what was wrong with the old way?” So (for example) pip gets pressure to allow people to opt out of PEP 517 features, rather than there being pressure to find a solution within PEP 517.

This is the case, but it doesn’t have to be. One can write a custom import hook that updates C extensions on demand, if they’ve changed. And this could be exposed as an editable install.
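A very rough sketch of that idea, under the assumption that staleness can be decided from file timestamps (the function and parameter names are made up; a real hook would live on sys.meta_path and call into the build backend):

```python
# Hedged sketch: before importing an extension module, rebuild it if the
# C source is newer than the built artifact. rebuild() is a placeholder
# for "invoke the build backend"; a real version would be an import hook.
import os

def ensure_fresh(c_source, built_ext, rebuild):
    """Trigger rebuild() if the artifact is missing or stale; return
    whether a rebuild happened."""
    if (not os.path.exists(built_ext)
            or os.path.getmtime(c_source) > os.path.getmtime(built_ext)):
        rebuild()
        return True
    return False
```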

Only when you modify extension code, not pure Python code. So my experience is a bit different: I have one command to type when I modify extension code, yes, but that’s the regular experience when modifying any kind of C code (including outside of the Python world) – modify C code, then recompile. When I only modify Python code, though, I don’t have anything to type.

I will add that some extensions can be slow to build, so not having to re-run pip install every time I modify pure Python code is really important.


FYI y’all – https://github.com/takluyver/flit/pull/260 – flit master now has support for src/ directories. :slight_smile:


I hate to pile on but my monkey brain forces me to stress a few things that are important from my perspective:

  • This would mean Python would get an unconditional compile step that you’d have to remember. And if you forget it, things wouldn’t break; they’d just behave weirdly.
  • This workflow is effectively already enforced when you use tox and the reason why most people don’t use tox for their main feedback loop is simply that the installation step is much too slow.
  • Most Python packages have no extensions and from those that have, many just vendor some kind of library (like argon2, rapidjson, uvloop…) that almost never changes.
  • Python packages aren’t just for code you upload to PyPI. There are good reasons to make your apps packages too, even if you never install them beyond pip install -e . in dev and pip install . in prod. I really don’t want the auto-reloader of my web app to have to run pip install .
  • As a side-note: pip install . takes longer than a full compilation of most of my Go projects.

So yeah, there’s really no way around proper editable installs in certain contexts, sorry. :worried:


Apparently I expressed myself quite badly :slightly_smiling_face: Let me try to address the responses. My alternative proposal to -e . is most definitely not to force a compilation step on everyone, but to have a command that knows when to do what (and when not to) before running the actual command, like how you’d use go run or cargo run instead of executing the built binary manually. And that command would go into the hypothetical tool that contains those other commands we don’t want to stuff into pip.

Using pip install directly for this would most definitely be unacceptably slow, but that wouldn’t be necessary. Part of the reason pip install . is slow is exactly that it’s not made for the development workflow; it targets redistribution, so it builds all the intermediates and final redistributables (e.g. wheels) that are totally unneeded during development. For setuptools, the underlying compilers already handle incremental compilation for extension modules (I hope?), and I can’t think of any technical challenges to implementing copy-if-source-is-newer logic to put both built and pure-Python files in the destination, if we know which files are built.
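The copy-if-source-is-newer step mentioned here is simple to sketch (the helper name is made up; shutil.copy2 preserves timestamps, so an unchanged file is skipped on the next run):

```python
# Hedged sketch of the sync step: copy a file into the install target
# only when it is missing there or older than the source.
import os
import shutil

def copy_if_newer(src, dst):
    """Copy src to dst if dst is missing or stale; return whether a copy
    happened. copy2 preserves mtime, so an unchanged src is skipped."""
    if not os.path.exists(dst) or os.path.getmtime(src) > os.path.getmtime(dst):
        os.makedirs(os.path.dirname(dst) or ".", exist_ok=True)
        shutil.copy2(src, dst)
        return True
    return False
```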

So to sum up, I guess the main point I’m trying to make is: instead of bolting more things onto pip so it sort-of-kind-of works as something it’s not designed to be (a development tool), it’d be better to leave it alone and actually build a good development tool, since we already have most of the needed pieces for the latter. And on the other hand (I believe this is known, but I haven’t seen it mentioned here), there are actually important pieces missing to make pip install -e . possible.

If you ask a compiler to compile, it will compile. It doesn’t try to be smart by figuring out whether the sources have changed. That’s the job of the build tool, e.g. setuptools or make or ninja.

Figuring out C/C++ dependencies (e.g. header files) automatically is not that easy. I doubt setuptools knows how to do that. And it’s worse for Cython; I don’t think any build system out there knows how to collect Cython dependencies automatically (Cython can depend on C header files but also on Cython include files and modules!).

You are correct, sorry for messing this up :frowning: I think my point still stands, though: it is better to place this responsibility on build backends (setuptools, meson, etc.), since they know best what they’re building, and let them tell the frontend what they did (and/or what the frontend should do with the result).

Good point, I didn’t think of Cython at all. Some sort of escape hatch would be needed so a user can explicitly skip the build.