How to publish several variants of a PyPI release for several versions of Python

I have a repo with a main branch that requires Python 3.12 or later, and I have a release process for PyPI.
I also have a 3.11 branch which supports Python 3.11.

Is there a way I can publish the two branches as a single release to PyPI, so that when a user pip-installs the package with a 3.11 pip it installs the version built from the 3.11 branch, and with a 3.12 or 3.13 pip it installs the main version?

For the sdist, I guess you would need to put all branches in the same sdist and let the back-end do the work of selecting the correct code depending on the Python version.

Wheels can be tagged for a specific Python version. That part should be easy-ish.
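
For instance, pip chooses among a release’s wheel files by matching each wheel’s tag against the list of tags the running interpreter accepts, in priority order. A quick way to see that list (this sketch assumes the third-party packaging library is installed) is:

# Print the wheel tags the running interpreter accepts, in priority order.
# pip matches each wheel file's tag against this list when choosing what to install.
from itertools import islice

from packaging import tags

for tag in islice(tags.sys_tags(), 10):
    print(tag)  # e.g. cp312-cp312-manylinux_2_17_x86_64 ... py3-none-any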

Do you already have a build back-end in mind?

Which level of detail are you expecting from this discussion?

My publishing process is a GitHub action, triggered by a GitHub release. It’s here (the TOML is also relevant), resulting from discussions here.
I don’t know what an sdist is.

I was using numpy as an example: the files of its latest version list different wheels for different Python versions. But looking at their code, it seems that all those wheels are built from the same branch and that only the dependencies (those installed alongside numpy and those you need to build numpy) change.

So, I’m not that sure that you can actually have several branches as several wheels in the same release…

As far as I know, numpy has different wheels for different Python versions because it contains compiled C extensions, and compiled C extensions are always compiled against one specific Python interpreter version. But it is all done from the same source code branch. I might be completely wrong about numpy specifically, but it is at least true for a whole bunch of other Python projects.

Seems possible to me, but it will require quite a bit of work. What you want to do is pretty atypical (I do not think I have ever heard of such a case), so you will not find any easy, straightforward solution readily available.

Ideally you should always distribute an sdist alongside the wheels. An sdist should contain all code and info necessary to build all wheels. You could decide to not distribute the sdist, if it is too complex. But the recommendation is to always distribute an sdist, if possible.

The build backend is setuptools, so of course you can put pretty much any code in setup.py, whatever needs to be done to build the wheels. And of course you can also write scripts that transform your git repository into a “source tree” ready to be packed into an sdist.

So you could write a script that does the necessary git operations to build a source tree like this (a sketch of such a script follows the setup.py example below):

  • setup.py
    • src310
      • parliamentarch
    • src311
      • parliamentarch
    • src312
      • parliamentarch

where setup.py looks something like this:

import sys

import setuptools
from setuptools import find_packages

# Map each supported Python minor version to its source directory.
srcs = {
    '3.10': 'src310',
    '3.11': 'src311',
    '3.12': 'src312',
}

# Select the sources matching the interpreter running the build.
python_version = f'{sys.version_info[0]}.{sys.version_info[1]}'

setuptools.setup(
    package_dir={'': srcs[python_version]},
    packages=find_packages(where=srcs[python_version]),
)
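
And for the git operations mentioned above, a hypothetical helper (the branch and directory names here are assumptions, not the project’s actual layout) could export each branch into its own directory with git archive:

# Hypothetical helper: export each branch's tree into its own src directory.
import io
import pathlib
import subprocess
import tarfile

branches = {'3.10': 'src310', '3.11': 'src311', 'main': 'src312'}

for branch, dest in branches.items():
    out = pathlib.Path(dest)
    out.mkdir(parents=True, exist_ok=True)
    # "git archive <branch>" writes a tar of that branch's tree to stdout
    tar_bytes = subprocess.run(
        ['git', 'archive', branch], check=True, capture_output=True
    ).stdout
    with tarfile.open(fileobj=io.BytesIO(tar_bytes)) as tar:
        tar.extractall(out)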

That is just one idea off the top of my head; there are probably much better solutions.


You’re right as far as numpy is concerned.

I don’t think I’ll go forward with this in the end: I thought it was a general pattern (given that a single release usually packs several different wheels), but it actually isn’t.
It also requires a setup.py, whereas I’ve managed to do without one so far (only a TOML).
And I’ll come back to your help if I end up needing to do this, but it turns out the backport to 3.11 compatibility was not as extensive as I thought.


I haven’t done anything that really calls for this kind of build automation; but if I had such a project structure and compatibility requirement, I would… check out the branch for one version, build the wheel (and make sure metadata is set up so that the wheel has an appropriate Python version tag), then repeat for each configuration. Presumably that can be automated by GitHub Actions or whatever other system.
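
A rough sketch of that loop (the branch names are hypothetical; it assumes the build frontend is installed and that each branch’s own packaging config sets the appropriate wheel tag, e.g. via setuptools’ [bdist_wheel] python-tag setting):

# Rough sketch: build one wheel per branch; each branch's own config is
# expected to set its wheel's Python tag. Branch names are hypothetical.
import subprocess

for branch in ('3.11', 'main'):
    subprocess.run(['git', 'checkout', branch], check=True)
    subprocess.run(['python', '-m', 'build', '--wheel'], check=True)
# All resulting wheels end up in dist/ and can be uploaded together as one release.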

Short for “source distribution”, it’s essentially a tarball of the source which pip will download, attempt to build a wheel from locally, and then install. Since your design doesn’t involve a single authoritative source for all the wheels, I wouldn’t bother trying to make one. The fact that you publish on GitHub should satisfy any open-source interest.

That said, it’s hard to understand why you have these separate branches if the code is pure Python. Would the 3.11 version really not run on 3.12 either? If there’s any way to do things that’s compatible with both (and that should practically always exist, even if it means you don’t get to use shiny new syntax bells and whistles), it’s generally best to just do so. People used to make extraordinary efforts to produce code bases that could work on both 2.x and 3.x; the differences between 3.11 and 3.12 are far more minor.


Yeah, it basically comes down to that. The backport was just creating two TypeVars for two different parametric functions.
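
For context, the sort of change involved is presumably along these lines (illustrative only, not the project’s actual code):

# Illustrative only: Python 3.12's PEP 695 generic syntax vs. the explicit
# TypeVar needed on 3.11 and earlier.
from typing import TypeVar

T = TypeVar('T')  # required on 3.11

def first(items: list[T]) -> T:  # on 3.12+: def first[T](items: list[T]) -> T:
    return items[0]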

In general, there are tools like pyupgrade and refurb that help you use the best Python syntax and features available for the lowest Python version you support. I am not sure they would help in this particular case (the thing with TypeVar), but in principle, when you decide to drop 3.11, you can run these tools and they will suggest what syntax improvements (and other changes) are available to you now that compatibility with 3.11 is no longer required.
