User experience with porting off setup.py

After sitting on the pyproject.toml sidelines for several years waiting for build tools to mature, I finally committed to porting after being nudged by deprecation warnings in pip.

I wrote up a detailed description of my experience at Gregory Szorc's Digital Home: My User Experience Porting Off setup.py

Overall, I found the experience to be very lacking / disappointing / frustrating.

Fortunately, I think a lot of the problems are fixable. However, I lack the perspective to suggest concrete improvements, otherwise you may have seen PRs from me instead.

I wanted to share my post with this forum in hopes that my user experience feedback can lead to positive change.


This is disheartening and probably mirrors the experience that many people are having with the endless transition-without-a-clear-goal that the Python packaging ecosystem has become.

With respect to passing parameters to the build backend, in PyArrow we’ve resorted to exclusively using environment variables (which is probably not a good idea security-wise):
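The general pattern looks something like the following hedged sketch (not PyArrow's actual code; the variable name `PYARROW_WITH_PARQUET` is used here purely for illustration): the `setup.py` reads build options from the environment rather than from `--config-settings`.

```python
import os

# Hedged sketch (not PyArrow's actual build code): a setup.py reads
# build options from environment variables instead of --config-settings.
# The variable names below are illustrative assumptions.

def env_flag(name, default=False):
    """Interpret an environment variable as a boolean build flag."""
    value = os.environ.get(name)
    if value is None:
        return default
    return value.strip().lower() in ("1", "true", "on", "yes")

# e.g. `PYARROW_WITH_PARQUET=1 pip install .` would toggle this flag
with_parquet = env_flag("PYARROW_WITH_PARQUET")
```

The upside is that this works identically under every frontend; the downside, as noted above, is that environment variables are invisible to the packaging tooling and can leak between builds.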



The whole transition from setup.py to pyproject.toml seems like an endless exercise in disregard for users. I understand that it is not that difficult to transition simple packages (I did quite a few of those myself), but how do you transition something like M2Crypto's 362-line-long setup.py? And no, that complexity is not there for show; without it, the package couldn't be built on a wild matrix of operating systems (Linux, Mac OS X, and Windows are all strongly supported), Python versions (from 2.7 upwards, inclusive), supporting libraries (has anybody experienced the stability of the OpenSSL API?), and architectures.


Starting out with pyproject.toml isn’t easy, but there are tools that make the journey easier. The key point is that those tools have to fully support pyproject.toml - I don’t know much about build, and have only a passing acquaintance with Poetry, but as far as I know Poetry does not support the standard [project] metadata table (PEP 621) in pyproject.toml.

I would recommend PDM, which is compatible. I have been using it now in a proprietary project for over four months, and it works well - I am able to manage all aspects of dependencies, building and also now publishing. And I no longer need a setup.py.


Actually, is it possible to combine my extensions from setup.py with using build and pyproject.toml?

To everyone replying here, remember to keep things friendly. You can acknowledge that your experience was difficult without putting down the maintainers working on it. Don’t suggest they don’t care about their tools, users, or the ecosystem, as that’s clearly incorrect.


OK, I am sorry, it was frustration talking out of me. I will try to keep things more technical and positive.


Yes, it is! As @abravalheri likes to repeat, setup.py is not deprecated – only running it as a CLI script is deprecated.

See also Quickstart - setuptools 68.2.2.post20231016 documentation.

This doesn’t mean that migrating is as easy as adding a minimal pyproject.toml, because some setup.py files do things like parsing sys.argv, which doesn’t work in a pyproject.toml-based build. I’m not expert enough with setuptools to offer more precise recommendations, though.

It is generally recommended to have the basic metadata (project name, maintainers, etc.) in pyproject.toml, which is the more standard approach today (e.g., there are tools that will help you with auto-updating dependencies). But you can totally keep it in setup.py or setup.cfg and only have a [build-system] table in your pyproject.toml file, without a [project] table.
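For example, a minimal pyproject.toml that opts a project into standard builds while leaving all metadata in setup.py/setup.cfg can be as small as:

```toml
# Minimal pyproject.toml: declare the build backend and nothing else;
# all project metadata stays in setup.py / setup.cfg.
[build-system]
requires = ["setuptools"]
build-backend = "setuptools.build_meta"
```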


Thanks for posting this. I agree, that’s a very frustrating journey. And although you apologise for the lack of actionable suggestions, I agree that a lot of what you’re saying self-evidently needs to be better, so I think there’s plenty to work on here.

The transition process for projects which rely on setuptools is far from ideal. I have my views on why that is, but I’m a pip maintainer and it’s entirely possible to point out the problems with pip’s approach in all this (as you did) - so I don’t want to say anything that would give the impression that I think this is someone else’s fault. It’s a community issue, and one we should solve as a community, not by pointing fingers at individual projects.

I do think, though, that the biggest issue we have here is a lack of resources. All of the projects involved are entirely volunteer-maintained, and critically under-resourced. So we have a problem of a lot of people talking, and recognising the issues, but too few able to do anything about them. If anyone is interested in helping in this area, please feel free to dive in! The problems are hard, so don’t expect to be able to provide a quick fix, and you will get pushback as you learn all the problems that left us where we are now, but help would definitely be welcomed.

The packaging survey gave us a strong impression that users wanted better tools - and that triggered some big discussions over where we wanted to end up. But I think that in doing so[1], we dropped the ball. We need to look at how we handle all the legacy projects, code, and practices that are holding progress back, before we look at where we want to go. A discussion about “what packaging tools should exist 10 years from now” seems a bit optimistic if you consider that after six years, you’re still suggesting that the move to a standardised mechanism for backend options from completely arbitrary user-coded mechanisms is being rushed!

Unfortunately, strategy discussions about legacy support and transition processes simply aren’t very interesting for the community. And tool maintainers don’t have the resources or the authority[2] to force anything to happen, especially not to block people from moving forward simply because they didn’t think enough about transition. So I don’t have any good answers either. The best I, personally, can do is to make sure that any new PEPs have a really good “how do we handle legacy and transition” story. But that’s just keeping things from getting worse. We still need to help setuptools catch up with providing transition mechanisms for people using the old approaches[3], and help update documentation to the point where web searches pull up current and useful information, and where it’s easy to know what’s out of date.

  1. something that’s completely understandable, because who doesn’t want to talk about what fancy new tools we might create? :slightly_smiling_face: ↩︎

  2. or, for that matter, the unity - we’re a group of individual projects, with no overall “management”, despite what people believe the PyPA is… ↩︎

  3. many of which were never supported, but have worked for so long that this isn’t really a viable argument ↩︎


We did something like this with Pillow.

We had already moved static metadata from setup.py to setup.cfg, and replaced invoking setup.py directly with things like pip install .

Then we created a minimal pyproject.toml with only the [build-system], and kept setup.py and setup.cfg. This required a custom backend to pass some config settings to setuptools: Use --config-settings instead of deprecated --global-option by radarhere · Pull Request #7171 · python-pillow/Pillow · GitHub

Finally we migrated the rest of the config from setup.cfg to pyproject.toml and deleted setup.cfg: Move config from `setup.cfg` to `pyproject.toml` by hugovk · Pull Request #7484 · python-pillow/Pillow · GitHub. We still have a ~1,000-line setup.py, but as mentioned, setup.py itself is not deprecated, only invoking it directly.

Some tools which can help:


Sorry for your experience with the transition, Greg :slightly_frowning_face:

I would like to note a few things:

  • Your project uses extension modules, so this doesn’t yet work in your case; but for all others, Hatch provides a single command that can port projects automatically, which is one of the reasons adoption has been so rapid

  • I also view the config settings flag interface to be poor UX, so similar to PyArrow, Hatchling just uses file-based config and environment variables

  • I don’t have time to advocate for this until next year, when Henry Schreiner has more time to spare (teaching obligations), but I plan to again push for the approach taken by extensionlib. For more context, see the discussion I opened for Maturin.

    Basically, there are components that produce extension modules and components that pack files into an archive which we call a build backend. These are two distinct pieces of functionality and my view is that there should be an API that allows backends to consume extension module builders to find out where things got created and where they should be shipped inside archives.

    As I mentioned in Discord recently, I am confident that this is the correct approach and the people that are saying otherwise are doing so because of my inability to express to all audiences why it is the correct approach.

    Next year I plan to push very hard for this and get it implemented in at least 2 backends to concretely show the benefits. Then will come a PEP.
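The split described above can be sketched roughly as follows. This is an entirely hypothetical interface, not extensionlib's actual API: an extension builder compiles modules and reports where the artifacts landed and where they belong in the archive, and a build backend consumes that report when packing the wheel.

```python
# Entirely hypothetical sketch of the builder/backend split described
# above; NOT extensionlib's actual API. All names are illustrative.
from dataclasses import dataclass


@dataclass
class BuiltExtension:
    source_path: str   # where the compiled artifact was written on disk
    install_path: str  # where the backend should place it inside the wheel


class ExtensionBuilder:
    """Interface an extension-module builder would implement."""

    def build(self, build_dir: str) -> list["BuiltExtension"]:
        raise NotImplementedError


class DemoBuilder(ExtensionBuilder):
    def build(self, build_dir):
        # A real builder would invoke a compiler toolchain here.
        return [BuiltExtension(f"{build_dir}/_speedups.so", "pkg/_speedups.so")]
```

Under this model, a backend could call any conforming builder (setuptools, Maturin, scikit-build, ...) and simply copy the reported artifacts into the archive, instead of every backend reimplementing compilation.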


An interesting question in the blog post is

I don’t know basic things like whether my adoption of pyproject.toml will break end-users stuck on older Python versions or what.

(The experts might prove me wrong.)

As far as I know, as long as you upload wheels to PyPI, pip install will keep working whatever the Pip version, because Pip prefers downloading a wheel from PyPI and installing it over downloading an sdist and building it into a wheel locally. (The wheel format dates back to 2012 so it would be hard to find a Pip that doesn’t support it.)

If you remove the setup.py entirely, then things might break for people who do wget https://your-project-source/tarball.tar.gz; tar -xvf tarball.tar.gz; cd tarball; python setup.py install or such in scripts. Not much can be done about that. One group of people who build from source is package maintainers (Linux distros, Homebrew, etc.). Since pyproject.toml-based builds are the standard today, they’ll just have to update the package definition, which should be routine.

(Methinks this could be mentioned in the packaging guide somewhere…)


I still haven’t ported my project metadata from setup.py to pyproject.toml because I don’t understand the implications.

There is little in terms of implications. The metadata ends up represented in the same way in the wheel.

The only downside I can think of is that it might break the deprecated python setup.py install for people with old setuptools versions that won’t understand pyproject.toml metadata.

One reason to prefer declarative metadata is that there can be tools to interact with it (like auto-bumping dependencies, as I mentioned above).
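For illustration, a declarative [project] table looks like the following (the project name and dependencies are made up):

```toml
# Illustrative [project] table in pyproject.toml (names are made up).
# Because this is declarative, tools can read and rewrite it mechanically,
# e.g. to bump the dependency pins below.
[project]
name = "example-project"
version = "1.0.0"
requires-python = ">=3.8"
dependencies = [
    "requests>=2.28",
]
```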


This is essentially my view as well, but I agree that no one seems to notice this view:


Not all platforms have bdist wheels available to download, but in any case current pip versions are PEP 517 build and integration frontends, so everything works seamlessly. Even for packages not on PEP 517, pip falls back to executing setup.py directly.

PyPA’s installer and build projects originally came about because distros, like the one I help maintain the Python build process for, did not find pip acceptable for our use cases and thus needed separate lightweight build and integration frontends. There was a long-winded setuptools PR about this. Most use cases, even CI, can get by with pip.


I don’t have a big problem with this view, but I think that in the light of the current thread, it’s going to be crucial to provide a clean and well-documented transition for people who currently have complex setuptools build processes - otherwise, we’re simply sending out yet another message that says “you’re using a deprecated approach and you should change, but we can’t tell you how to change because it’s complicated…”

I also think that for this model to be successful, the setuptools backend needs to support it, as well - whether that’s as a wheel creator consuming extensions built with builders other than distutils/setuptools, or as a builder that can be used by backends other than setuptools, or both. In fact, I’d consider going as far as saying that setuptools support, and a detailed “transition plan” section in the PEP, need to be available before a PEP gets accepted, in large part because of the issues highlighted in this thread.

Before anyone makes the comment that a policy like this will block innovation, isn’t that the whole point here? Innovation without sufficient regard for transition planning has got us where we are - let’s learn from what didn’t go well and do better in the future.

The issue is that pip is now looking to remove that fallback, on the basis that 6+ years is plenty long enough for the transition. But evidence suggests that it isn’t. We don’t have the means to force the change to happen, setuptools has a bunch of legacy issues they are trying to deal with on limited resources, and pip can’t simply retain the fallback indefinitely as the technical debt is enough to cause us problems that we can’t sustain much longer. There’s really no good answer here at the moment. We (pip) are trying to manage the fallout as best we can, but when the answer is “pip is calling setuptools as per the specs, your issue needs to be addressed by the setuptools project” that’s not a great situation - not for the user, or for pip, or for setuptools :slightly_frowning_face:


So, in the spirit of being positive, I tried to port M2Crypto to pyproject.toml, and the results are not that awesome. The biggest problem is debugging. Apparently, older versions of something (I have no idea what) don’t get the metadata information correctly (see CentOS 7 and Python 3.6; that’s probably Debian/stable). So, I guess, these packages are not compatible with these old Python interpreters, but then pip should know what to install and what not, right?

I’ve encountered the UNKNOWN package name problem (bug?) with older versions of setuptools. I think this might be a problem with an old setuptools version getting installed due to recent setuptools not supporting Python 3.6?

Except setuptools 59.6.0 is installed in centos7 (#5418339170) · Jobs · m2crypto / m2crypto · GitLab, and that version still officially supported 3.6 (see also the 59.6.0 setup.cfg).

To be clear, I didn’t mean to be critical of any project. But clearly, there are differences in how tools comply with the relevant standards, which has consequences for the user experience.

Not really. There are differences in how fast tools are able to implement the relevant standards. That’s caused by lack of resources, legacy features that need to be transitioned, and too many other priorities.

But the packaging community in general is very supportive of our standards - tools are all working towards following the standards we agree to. Reality just gets in the way sometimes.
