As for this PEP, I agree with others: this is much too big a hammer for the problem at hand.
Also, as the developer of a third-party project (not listed in the PEP, though), what would help us most for testing a new Python release is to have conda-forge or Anaconda binaries for it in time. Right now 3.8 is available from neither.
We discussed this proposal at the Steering Council meeting this week, and our key conclusion was that we don’t think a PEP is the right vehicle for pursuing this idea.
There’s no “CPython Buildbots” PEP for example, there’s just a note in PEP 101 to check the status of the stable buildbots, and then assorted bits of documentation on managing the Buildbot fleet.
(I’ll put my own suggestions for how to proceed in a separate post from the SC level feedback)
Rather than pursuing this as a release process change, I think it makes more sense to pursue this as a CI enhancement, similar to the refleak hunting build, or the speed.python.org performance benchmarking.
That way the exact set of projects tested can be reasonably fluid (rather than being inflexibly locked down in an approved PEP) and chosen both to exercise various aspects of the standard library and to assess core ecosystem compatibility implications for a change.
If there are any PEP level changes at all, it would just be in the form of an additional note in PEP 101, and even that might not be needed if the standard practice is to file release blocker bug reports when the compatibility testing finds problems.
Sorry for jumping into this a bit late, I was told about this discussion a few days ago by @steve.dower.
I have a script on my home computer that I run a few times a week. It builds me a virtual environment from the master branches of {cpython, cython, numpy, scipy, matplotlib, pandas, ipython/jupyter, …} (basically the whole scientific Python ecosystem through to my day-job code), and I do most of my day-to-day development on that machine in that environment. It is a terrible brute-force bash script, with things like the locations of my various project checkouts hard-coded, but it does have the right build order baked in. I'm happy to share it if people are interested.
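For anyone who wants a starting point, here is a minimal dry-run sketch of what such a script can look like. Everything here is a placeholder (paths, project list, flags), not the actual script described above, and the heavy build commands are left commented out:

```shell
#!/usr/bin/env bash
# Dry-run sketch of a "build the ecosystem from master" script.
# All paths and the project list are illustrative placeholders.
set -euo pipefail

SRC="$HOME/src"                  # where the git checkouts live (assumed)
PREFIX="$HOME/opt/py-master"     # private CPython install prefix (assumed)

# The one thing the script really encodes: the build order.
PROJECTS=(cython numpy scipy matplotlib pandas ipython)

# 1. Build CPython master into the private prefix:
#    (cd "$SRC/cpython" && ./configure --prefix="$PREFIX" && make -j4 && make install)
# 2. Create a venv from it:
#    "$PREFIX/bin/python3" -m venv "$HOME/venvs/master" && . "$HOME/venvs/master/bin/activate"
# 3. Install each project from its checkout, in order:
for p in "${PROJECTS[@]}"; do
    echo "would install: $p"
    # pip install --no-build-isolation "$SRC/$p"
done
```

The only real "logic" is the ordered list; everything else is plumbing around `configure`/`make` and `pip`.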
I think it makes more sense to pursue this as a CI enhancement, similar to the refleak hunting build, or the speed.python.org performance benchmarking.
This makes a lot of sense to me. I think CI running master-branch Python against the latest stable releases of projects (installed from PyPI!) would be a very good thing. Given that there will not be wheels yet, it may be worth using some sort of staged build.
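Concretely, the staging I have in mind could look roughly like this (the stage contents and package names are illustrative, and the commands are echoed rather than executed so this reads as a plan, not a working CI job):

```shell
# Hypothetical staging for such a CI job: with no wheels on PyPI for an
# unreleased Python, everything builds from sdist, so install order matters.
set -euo pipefail

STAGE1=(pip setuptools wheel cython)   # build tooling first
STAGE2=(numpy pandas)                  # compiled sdists, in dependency order
STAGE3=(pandas)                        # projects whose test suites to run

for pkg in "${STAGE1[@]}" "${STAGE2[@]}"; do
    echo "pip install --no-binary :all: $pkg"   # dry run; real CI would execute
done
for pkg in "${STAGE3[@]}"; do
    echo "python -m pytest --pyargs $pkg"       # file release blockers on failures
done
```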
I agree with @pf_moore’s concerns about this putting more burden on under-resourced projects, but we are going to see these failures one way or the other when Python is released so getting notified earlier would be better.
Running the master-cpython against master-project combination on CI is probably less valuable (but easier to get going, as most CI services have a 'nightly' option that many of us already use).
…if Django, why not matplotlib, Flask…
As the lead Matplotlib developer, I am all for that suggestion!
Another thought worth surfacing publicly: it may pay to talk to the conda-forge folks. They have a correct machine-readable dependency graph, know where the stable sources are, and have build scripts for everything. I wonder if a pruned version of the graph they are using to run the py38 rebuild could be re-purposed for this?
Please, no pandas. It's popular and easy to use, but it's slow. You can do the same things pandas does with 2-3 more lines of Python or numpy code. IMHO it should not be considered a blocking project.
This PEP has been rejected. But thanks for the list of projects, we can consider them if someone works on a CI to test these projects on the master branch of Python.
Thank you Stinner; judging from the release notes, I think you're doing a great job with CPython.
And your PEP was not bad. I simply suppose it's too much of a hassle.
Maybe making DeprecationWarning visible by default would be simpler. That way, the people who use those projects would also push the projects' devs to fix them.
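For reference, `DeprecationWarning` is ignored by default outside `__main__` and test runners; turning it on is a one-liner with `warnings.simplefilter` (or `python -W default::DeprecationWarning`). A minimal illustration (`new_api()` is a made-up name):

```python
import warnings

# Surface DeprecationWarning globally -- this is roughly what
# "on by default" would amount to for end users.
warnings.simplefilter("default", DeprecationWarning)

# Demonstrate that a deprecated call is now caught rather than silenced.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always", DeprecationWarning)
    warnings.warn("old API, use new_api() instead", DeprecationWarning)

print(caught[0].category.__name__)  # -> DeprecationWarning
```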
And IMHO some of the projects you listed should simply be in the stdlib, such as numpy, pip, scipy, Sphinx, SQLAlchemy and pytest. For my part, I would add paramiko, pipenv, openpyxl, crc16, psutil and arrow, even if pipenv is perhaps too young (though it was adopted by PyCharm) and openpyxl and arrow have not-so-great implementations…