I was recently reading the documentation for building pyodbc from source, and I saw its advice against running setup.py directly, with a link to an article by Paul Ganssle titled Why you shouldn’t invoke setup.py directly explaining why. He says (emphasis original):
… as of the last few years all direct invocations of setup.py are effectively deprecated …
I read the entire article, which was very well written, and it seems to contain reasonable advice for builders and distributors of packages.
My use case is different, however. I don’t distribute any Python packages, but I do contribute (on a modest scale) by helping fix bugs in some of the packages I have used over the years. My process involves cloning the project’s repository, and then a cycle of:
modify the source code
compile and link the source code with python3 setup.py build_ext
test what I’ve built
analyze the results of the previous step
back to the top: lather, rinse, repeat
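Concretely, steps 2 and 3 of that loop look something like the following sketch. The exact build/lib.* directory name varies by platform, Python version, and setuptools version, so I glob for it rather than spell it out; the paths are illustrative:

```shell
# Rebuild the extension modules into the build/ tree (setuptools only
# recompiles sources whose objects are out of date, so this is cheap
# after small edits):
python3 setup.py build_ext

# Run the tests against the freshly built .so files by putting the
# build output directory first on the import path:
PYTHONPATH=$(ls -d build/lib.*) pytest

# Alternatively, drop the compiled modules next to the sources:
python3 setup.py build_ext --inplace
pytest
```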
I’d like to think that the advice quoted above was for a targeted audience focused on packaging, and not meant for what I’m doing, but there’s no getting around the fact that “all direct invocations” doesn’t leave much room for that interpretation.
So what should I be using instead to get the .so in place in the build tree so I can test it directly without the overhead of packaging or installing? I’d like to be prepared for the day when the ability to directly run setup.py goes away (which is what happens eventually with deprecations).
What I’d really like is to be able to use make (or the equivalent functionality), which understands incremental builds, but I gather that’s intentionally been taken off the table.
For pure Python projects, pip install -e ., as Megalng suggests, is incredibly convenient.
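For a pure-Python clone, the whole workflow collapses to something like this (a sketch, assuming a standard project layout with its tests runnable via pytest):

```shell
# One-time setup: an editable install makes the checkout itself
# importable, so later source edits are picked up without
# reinstalling:
pip install -e .

# After that, the cycle is simply: edit the source, then
pytest
```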
An open source project that is normally run as a binary is not just the source code. At the very least, its documentation should describe at least one way to actually build that binary. As for what contributors “should” do: whatever each project recommends.
Maintainers are under no obligation to support non-standard build chains. If a capable user who wants to compile from source chooses to deviate from the script provided, they need to be prepared to own any ensuing problems themselves.
When trying to compile new projects, I find it’s best not to get too creative and to stick to the well-trodden, tried-and-tested route, even to the extent of running a virtual machine with the exact same OS. If the maintainers have chosen to stick with a setup.py file, so be it.
So basically, without using python3 setup.py build_ext (which lets me alternate in the same shell between pytest … and PYTHONPATH=build/lib… pytest …), there’s no way to build without also installing?
This guide was written to clarify the situation around setup.py:
Unless setuptools deprecates python setup.py build_ext I don’t see any problem with using that directly.
To get this, my suggestion is that the project migrate from setuptools to meson-python. Then you can have incremental rebuilds, parallel builds, proper editable installs, and more. You can also easily keep your out-of-tree workflow with PYTHONPATH; in fact, spin is a frontend development tool that sets up that exact workflow for a meson-based project and wraps it all up with commands like spin test.
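As a sketch of what that looks like once a project declares meson-python as its build backend in pyproject.toml (the commands follow the meson-python and spin documentation; details vary per project):

```shell
# Editable install without build isolation, so meson can keep and
# reuse its build directory; meson-python then rebuilds any changed
# extension modules automatically when the package is imported:
pip install --no-build-isolation --editable .

# The edit/compile/test cycle collapses to:
pytest

# Or, with spin as the development frontend:
spin test
```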
The whole build isolation and not using setup.py thing is somewhat overly dogmatized.
Build isolation was introduced to ensure that packages are harder to publish without their build dependencies properly declared and aren’t built by users with either very old versions of setuptools or the sometimes heavily patched variations of setuptools provided by Linux distros. As long as your copy of setuptools is a recent one installed from PyPI and you test the pip install . flow somewhere, adding --no-build-isolation to local pip install commands or invoking the backend directly is a reasonable way to make them run at a sensible speed.
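In practice that boils down to something like the following (illustrative commands):

```shell
# Day-to-day: skip creating an isolated build environment and use the
# recent, PyPI-installed setuptools already in your environment:
pip install --no-build-isolation .

# Periodically: confirm the fully isolated flow still works, i.e.
# that pyproject.toml really declares all build dependencies:
pip install .
```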
python setup.py install and python setup.py develop use legacy .egg installs, which typically break anything that tries to consume package metadata, and workflow commands like setup.py test and setup.py publish were removed from the scope of what setuptools wanted to be involved with. But not all setup.py commands have these issues, and, as you’ve observed, some don’t have (non-inferior) modern equivalents anyway, so you’re kind of stuck with them despite the noise of the deprecation warnings they emit.
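For reference, the modern replacements for the commands that do have them, roughly following Ganssle’s article:

```shell
# Modern equivalent        #  deprecated setup.py command it replaces
python -m build            #  setup.py sdist / setup.py bdist_wheel
pip install .              #  setup.py install
pip install -e .           #  setup.py develop
twine upload dist/*        #  setup.py upload
# build_ext (as a local edit/compile/test loop) has no direct
# one-to-one replacement, which is the crux of the question above.
```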
I’ve been using setup.py for years, and seeing the deprecation warnings recently has pushed me to find a new way to build my packages. I don’t really have any issues with setup.py myself, so I just wanted to know what I needed to do to get rid of the deprecation warnings.
I had spent several hours reading up on packaging software and had gotten nowhere. This is what I wanted.
One of my packages is a wrapper for libSRT.
What I’m doing now is installing my wrapper package; then, the first time it runs, it downloads and builds libSRT, which is kind of janky.
I just hate telling people, “go install libSRT first and then install the wrapper package”; I see that as problematic. Is that how it’s usually handled?