I propose creating a system which automatically creates GitHub (etc.) issues for projects producing binary wheels when a Python release candidate is available and the project’s binary dependencies have been updated for the new version.
Once manually triggered (i.e. after a Python X.Y.0 release candidate is available in build tools), it would:
scan popular PyPI projects (by some definition, e.g. the top 1000 by downloads) that publish binary wheels
check that there isn’t already a wheel for the new Python version
check that all dependencies which have binary wheels have released a wheel for the new Python version (this passes if there are no binary dependencies)
get the project’s source/home-page/issues link and see if it’s a supported issue tracker (e.g. GitHub, GitLab, Bitbucket)
check that there isn’t already an issue in the tracker (e.g. one whose title contains “Python X.Y”)
create the issue
repeat each day until all projects have released versions, or a time period (of, say, 12 months) has passed
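The wheel checks in steps 2 and 3 boil down to inspecting wheel filenames for a CPython tag. Here is a minimal sketch of that part, assuming the filenames come from somewhere like the PyPI JSON API (`https://pypi.org/pypi/<project>/json`); `has_wheel_for` is a hypothetical helper, not an existing tool, and the parsing is deliberately simplified:

```python
# Sketch of steps 2/3: given the filenames of a project's latest
# release, decide whether a wheel exists for a new CPython version.
# "cp312" is just an example tag; the real tool would derive it from
# the release candidate being tracked.

def has_wheel_for(filenames, cp_tag):
    """Return True if any wheel filename targets the given CPython tag
    or is a pure-Python (py3) wheel usable on any interpreter."""
    for name in filenames:
        if not name.endswith(".whl"):
            continue
        # wheel filename: {dist}-{version}(-{build})?-{python}-{abi}-{platform}.whl
        python_tag = name[:-4].split("-")[-3]
        tags = python_tag.split(".")  # compressed tag sets use "."
        if cp_tag in tags or "py3" in tags:
            return True
    return False

files = [
    "example-1.0-cp311-cp311-manylinux_2_17_x86_64.whl",
    "example-1.0-cp312-cp312-manylinux_2_17_x86_64.whl",
    "example-1.0.tar.gz",
]
print(has_wheel_for(files, "cp312"))  # prints True
```

Running the same check over each project’s binary dependencies gives step 3 for free: the project is a candidate for an issue only once every dependency passes.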
The issue would say something like:
Support Python X.Y
A new version of Python (X.Y) is now available to be built with, and this project’s binary Python dependencies have all released wheels supporting this new version.
I think this would be a less demanding way of notifying package maintainers that a new release is ready to build with, as it’s coming from a system rather than a user. Maintainers are able to lock the issue from further user discussion if they wish.
Do you think this is unnecessary spam? If so, why is it not worth the benefit (of users having access to binary wheels on Python X.Y.0 release)?
If the community is okay with this, I can start working on an implementation and sussing out the details.
There is probably some implementation of this kind of thing that would be okay, but many implementations where it would not.
So I would tread extremely carefully. Maybe keep your script running in dry run mode, manually open and update a few issues and see how it goes for 3.12?
If [it is unnecessary spam], why is it not worth the benefit [of wheels on Python X.Y.0 release]
In general, I think this kind of question is unhealthy to ask in ecosystems that depend on volunteer effort. You’re also presupposing the efficacy of your intervention. For how many projects that respond to open issues quickly is issue opening the bottleneck for new Python support?
You’re right, the benefit I state is not a given. The direct benefit I was thinking of was that maintainers wouldn’t have to keep checking when the build systems have the new Python version available, and when their dependencies have been updated (but this supposes the maintainers know of this system).
I think in larger projects (NumPy etc., which are a minority) maintainers are keeping up to date, but not in smaller projects. Perhaps projects could opt out via a blacklist.
I think this idea is well-intentioned, but I can imagine that there will be quite a few people who would not appreciate this. Personally, I would just find it annoying if it happened to my code. Perhaps I was not planning on maintaining the code further, or perhaps I was already preparing the migration (!), or perhaps the code - while public - might not yet be releasable, or … (lots of other possible reasons). There is also quite a bit of orphaned code on GitHub, and it’s not totally trivial to have a script decide whether a code base is orphaned or still maintained…
My principle would be: If you’re not a user/client/developer of a particular library or app, do not file issues/bugs/requests with it. And if you haven’t actually run into a particular issue, also don’t file issues.
Probably only echoing the above discussion, but: I would only support this if package maintainers opt in to it somehow. The use case seems rather limited: it has to be a project popular enough that people would care about this, but somehow not have anyone eager to raise the issue manually (or perhaps an entire community of users who just don’t pay attention to new Python releases, but who also need to keep up to date eventually… ?).
I would take a totally different approach and upstream a feature in cibuildwheel where an error would be produced if your configuration isn’t explicitly excluding a newly released version and then everyone can rely on Dependabot to upgrade the cibuildwheel action.
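To make the suggestion concrete: the proposed (hypothetical, not currently existing) cibuildwheel behaviour might look roughly like this sketch, where `KNOWN_VERSIONS` and the build/skip pattern format are illustrative stand-ins rather than cibuildwheel’s actual internals:

```python
# Hypothetical sketch of the proposed cibuildwheel feature: error out
# unless every CPython version the tool knows about is either built or
# explicitly skipped.  Dependabot upgrading the cibuildwheel action
# would then surface new Python versions as CI failures.

KNOWN_VERSIONS = ["cp39", "cp310", "cp311", "cp312"]  # cp312 newly added

def check_config(build, skip):
    """Raise if a known version is neither built nor explicitly skipped.
    Substring matching here is a simplification of real pattern matching."""
    unhandled = [v for v in KNOWN_VERSIONS
                 if not any(v in entry for entry in build)
                 and not any(v in entry for entry in skip)]
    if unhandled:
        raise SystemExit(
            f"New CPython version(s) {unhandled} are neither built nor "
            "explicitly skipped; update your configuration."
        )

# A config written before cp312 existed would now fail the build:
try:
    check_config(build=["cp39-*", "cp310-*", "cp311-*"], skip=[])
except SystemExit as exc:
    print(exc)
```

The point of the design is that the error is actionable: the maintainer either adds the new version to the build list or explicitly skips it, so silence is never ambiguous.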
The problem with that is it doesn’t address the main issue, since not all projects have CI constantly building them. I’m not solving this for NumPy and other high-throughput projects, because they likely already have maintainers who are keeping up to date with Python releases.
I’m not sure that this really solves much for projects that are less actively maintained either. I doubt that the general issue of less well-maintained projects not putting out releases/wheels for each new version of CPython is just because the maintainers need reminding that a new version of Python is released. The maintainers will be aware of that but if they don’t have time to put out releases then they don’t have time.
I think it would be better to focus on things that make it easier for those maintainers to put out releases or wheels automatically. In other words, reduce the maintenance burden rather than just putting more pressure on maintainers (especially since this proposal seems to be aimed almost directly at maintainers who likely struggle to find the time).