I’ve got quite a few personal packages, and they’re all in a monorepo. I’ve got a release script which at its end generates a `setup.py` and uses it to make an sdist, then uses twine to upload it. My script is quite old.
Just as when I first set up my stuff, I’ve spent some days wading through the packaging docs, learning not enough and coming (just now) to the conclusion that I’m thinking about this all wrong.
My objectives: shift from `setup.py` to `setup.cfg` and/or `pyproject.toml`, and stop running `setup.py` directly (which I gather is now discouraged - I’ve read some long articles about why that is so).
What I thought was happening was that I should be making `setup.cfg` and/or `pyproject.toml` files so that my package could be usable by modern tools, i.e. that I was shipping the config files to end users or end user tools, and that `pip install` itself runs my `setup.py` at install time.
What I now think is that maybe that’s not the objective, but instead that I need to upload the required end products, such as the sdist and possibly wheel files, to PyPI. And that those end products contain the metadata, and where it comes from at my end (eg `pyproject.toml`) is irrelevant.
This is (to me) completely unobvious from the packaging docs. Instead, I’ve spent a lot of time wading through the (several, apparently all optional) mechanisms.
So, my questions:
Is the functional approach:
- choose a toolset (flit, poetry, something which can read a `pyproject.toml`?) and use it to generate the sdist and wheels?
- then choose an upload tool (twine seems nice and is working for me now with sdists) and upload to PyPI?
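For what it’s worth, the two-step flow I describe above can be sketched with the standard `build` and `twine` tools (a hedged sketch, assuming both are installed and a `pyproject.toml` exists in the current directory):

```shell
# Step 1: build both an sdist and a wheel from the project config
# in the current directory (pip install build twine, first).
python -m build

# Step 2: the artifacts land in dist/; sanity-check their metadata,
# then upload them to PyPI.
twine check dist/*
twine upload dist/*
```

The point being that only the dist/ artifacts travel; the build tool is a purely local choice.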
I suspect that I’m a user in that annoying middle ground:
- not someone who can be satisfied with a simplistic “get started” recipe that walks through a single static approach - I hate “magic”
- not someone with deep knowledge of the apparently constantly-in-flux packaging landscape
What I feel the packaging site lacks is a core overview of the objectives and flow. Maybe I’ve missed it. For myself, what I would find useful is some short doc which outlines what “publishing a package” actually has to achieve - the essential concepts:
- specify your package (metadata)
- build the upload artifacts (sdist, wheels?)
- upload the artifacts (eg using twine)
Now, I’m just making up the above list because it is not clear to me from the packaging docs, which are Very Many.
Is the above list sensible or useful?
Does it imply that I bring no benefit to myself or others by trying to make multiple config things (`setup.py`, `setup.cfg`, `pyproject.toml`), and that I can just pick one at my end (eg the TOML file) and a suitable tool which can read it and make artifacts? i.e. no end user sees these config files?
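For concreteness, here is a hedged sketch of the pick-one option: a minimal `pyproject.toml` with PEP 621 metadata (the package name, version and dependencies below are invented placeholders, not from any real package):

```toml
# Build backend choice - setuptools here, but flit_core or
# poetry-core are equally valid readers of this file.
[build-system]
requires = ["setuptools>=61"]
build-backend = "setuptools.build_meta"

# The metadata which ends up baked into the sdist and wheel.
[project]
name = "example-package"
version = "0.1.0"
description = "An example package."
requires-python = ">=3.8"
dependencies = ["requests"]
```

A build front end (eg `python -m build`) reads this and produces the dist/ artifacts, and it is the metadata inside those artifacts which installers consume.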
Is there a short page listing working tools and what config inputs they work with?
Cheers,
Cameron Simpson cs@cskk.id.au