There is the built-in help that is shown when running make (AFAIK), but I imagine you’re thinking of something more detailed?
In the devguide, maybe under Getting Started, where the current Building guide is and where the Git cheat sheet is, etc. I’m not really sure of a better place: it is too basic for Advanced Tools, Development Workflow focuses on the meta stuff rather than the mechanics, and there isn’t another obvious category, at least given the current structure we have.
Is the documentation build also on topic in this thread?
I would love to have the doc build requirements installed automatically in the background, like tox does, for example. I know there is make venv, but it’s a bit ad hoc. We’ve wrestled a bit with it in the French translation of the documentation.
EDIT: One example problem is that if a dependency is added, the need to recreate the environment will not be automatically detected. Also, there is a Makefile/make.bat duplication similar to the main build.
ISTM that it’s essentially orthogonal to what is discussed here. That’s basically independent of the code build system, and we don’t want to require docs contributors to install and run the latter just to render the docs.
We could have make build depend on make venv such that a venv gets created and used automatically, like we do for e.g. the similar commands in the PEPs repo. But I suggest opening a new topic in the Documentation category to discuss that further, if you’re so inclined.
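For illustration, here’s a minimal sketch of that dependency; the target and path names are hypothetical, not the actual Doc/Makefile rules. As a bonus, making the venv depend on the requirements file also catches the “a dependency was added” problem mentioned above:

```make
# Sketch only: "html", "venv" and the paths are illustrative.
# The venv is rebuilt when requirements.txt changes (a rough check
# based on comparing file timestamps).
.PHONY: html

venv: requirements.txt
	python3 -m venv venv
	./venv/bin/python -m pip install -r requirements.txt
	touch venv

html: venv
	./venv/bin/python -m sphinx -b html . _build/html
```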
make help would be a first step (and I promise I would create a branch for the change). I was thinking of more than that, however. Maybe make -C Doc help is all you need for that directory, but at the top level I count 92 possible targets, most of which will only be of interest to people interested in the details of very esoteric topics (framework builds, configuration regeneration, etc.). Most of those sorts of things wouldn’t be necessary in the short make help output, but they should be documented for those rare occasions where they would be useful.
Edit: One other thing. Documenting the current make targets should be useful when/if a one-ring-to-rule-them-all replacement for make is undertaken.
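For what it’s worth, one rough way to arrive at such a count is to scan rule lines in the generated top-level Makefile. This is only an approximation: it misses pattern rules and will pick up some non-target lines such as := variable assignments.

```sh
# Rough census of make targets: list unique rule names in the
# generated top-level Makefile. An approximation only.
grep -E '^[A-Za-z0-9_.-]+:' Makefile | cut -d: -f1 | sort -u | wc -l
```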
This kind of thing is also nice to have in the Makefile itself. I’ve taken to having a pseudo-target named “_” or “help” or “_help” which recites the common targets and their purpose. Example from my personal kit (where the help is the default target):
```
CSS[~/hg/css(hg:default)]fleet2*> myke
_deploy - deploy current release to /opt/css
_freshmeat - announce current release to freshmeat
_home - deploy scripts to ~/bin etc
_pending - report unreleased changelog and [M]odified files
_pub - push to upstream repos
_publish - deploy current release to /opt/css, ezos and tip to bitbucket
_publish_ezos - deploy current release to /opt/css, ezos
_publish_hg - publish to upstream repos
_release - mark tip as new release
_remote - rsync the Python tree to $RSYNC_TARGET_PYTHON
_tarball - make a tarball of the tip
_test - do syntax checks and self tests (may set $MYKE_TEST_PYTHON_MODULES if desired)
_test3 - do python 3 syntax checks and self tests
_updcss - deploy current tip to /opt/css
_venv - make the Python virtual environment based off "${PYTHON_EXE:-python3}" using /Users/cameron/hg/css/venv-requirements.txt
```
Several of those are obsolete but hopefully the idea’s clear. Typing “make help” would be pretty handy.
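One common way to wire that up (a sketch with hypothetical targets, not CPython’s actual Makefile) is to annotate each rule with a trailing “## description” comment and have a help target scrape them:

```make
.DEFAULT_GOAL := help
.PHONY: help build clean

help:  ## list the common targets and their purpose
	@grep -E '^[a-zA-Z_-]+:.*## ' $(MAKEFILE_LIST) | \
		awk 'BEGIN {FS = ":.*## "} {printf "%-10s - %s\n", $$1, $$2}'

build:  ## build the default configuration
	@echo "building..."

clean:  ## remove build artifacts
	rm -rf build
```

With that in place, a bare “make” prints the annotated targets with their descriptions, much like the listing above.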
not having to rely on an obscure and hostile toolkit / programming language
I think this is underappreciated. The current build system works fine for the default cases, but as soon as you need something custom, you have a really hard time with it. At my company, we are building Python with a custom toolchain for about 20 different targets. This was really difficult to pull off, so much so that after some time we developed a CMake build for Python (not something that is upstreamable, since it is very focused on our needs). I know this thread is not focused on the exact choice of build system, so I’m not saying this to suggest CMake, just to highlight that cross-compilation and any kind of customization is very difficult with autotools as soon as you move off the beaten path. There are build systems that handle custom toolchains much better, so that could be a tangible improvement for Python if it were to adopt a new build system.
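To make “off the beaten path” concrete: even a comparatively routine cross-build with autotools needs the right host triple, a separate build Python, and hand-seeded cache variables for configure checks that cannot run on the build machine. A sketch, with the triple and paths as placeholders:

```sh
# Illustrative CPython cross-build; the triple, paths and cache
# values are placeholders for whatever your toolchain actually needs.
./configure \
    --host=aarch64-linux-gnu \
    --build=x86_64-linux-gnu \
    --with-build-python=python3 \
    ac_cv_file__dev_ptmx=yes \
    ac_cv_file__dev_ptc=no
make -j8
```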
As someone who won’t be hugely affected by any decision here, I will say that all build systems are to an extent “obscure and hostile” if you don’t understand them. It’s certainly true that autoconf/configure is impressively bad in this regard (from what I know of it) but personally, I find CMake pretty difficult to use for anything non-trivial, as well. And that’s after trying to set up a (small) project using it. I haven’t used meson much, but I would be fairly confident I’d feel the same way about that on my first attempt. But I could probably learn meson or CMake if I needed to (I feel like I might find meson easier, but that may just be “it’s Python, so it’s familiar” talking…)
My point here is that it’s awfully easy to get into subjective judgements if we go down this route too far. With that in mind, here are some things I’d like to see in a new build system, trying to capture the point being made here in at least vaguely objective terms:
- Works on all platforms Python supports. This means that all the core devs use the same system, and so we can better share knowledge and understanding.
- Well maintained, with an active community and support channels. This means that if we have problems or questions, we can get answers.
- Ideally, the tool community is aware that CPython is looking to switch to their tool and actively wants to help. This means that we’ll have tool experts available to help with the transition, who have a vested interest in our success.
- Popular in open source projects. This means that CPython contributors coming from other open source communities are more likely to be familiar with the tool. Something that’s more common in commercial contexts is less likely to be familiar to (for example) Linux distributors. Which leads on to…
- Works well with the distribution processes of our main distributors (Linux distros, conda). We don’t want something that breaks the release processes for 3rd party distributors.
If you talk to people who have used both CMake and autotools, the general consensus is that CMake can be tedious and cumbersome, but autotools is really horrible. You can pretty much “get things done” with CMake in a way that’s usually almost impossible with autotools.
That said, I also agree with the objective points in your bullet list.
At least from my Linux experience, the build system very often also takes care of installing docs (commonly in the form of building man pages and installing them in an appropriate location). I’m used to typing stuff like make man && make install-man.
“Better docs integration” seems to me to be a “valid answer” to “What do you want to see in tomorrow’s build system”.
This is a good list (spoken as a maintainer of one build tool, these things come up a lot in evaluations). I’d add another one:
- Works (reasonably) well with external projects which may not share your choice of build tool. If you happen to need to build a local copy, as CPython does for several things, you don’t tend to want to rewrite the build system of someone else’s project; you have better things to do.
I agree, and as someone who’s mostly going to use the build system and not maintain it, I’d support “being able to use the same build system to build the docs and the code” as something I’d like to see in any new build system.
We’ve been doing a similar thing at Google to build CPython for 8 years now. We use our Bazel-derived build system for everything. This means writing our own BUILD (and .bzl Starlark macro) definitions for anything we build.
These are specific to our unique needs and environment and not something that’d be meaningful to share. We rarely retain the upstream build system of anything we use within that environment, not just CPython. Even projects using Bazel themselves in open source can’t be expected to integrate as-is with our internal environment.
For other things, where we do invoke the upstream build: so long as we can set up what effectively amounts to a container, with explicitly specified paths to all build tools (compiler, linker, make, ninja, etc.) and their flags, invoking any other build system as a single action within our larger one is doable. I’d expect that to be true for anyone. It is generally the least efficient way to build something, but it does work.
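A rough sketch of that single opaque action, with every path an illustrative placeholder:

```sh
# Invoke an upstream autotools build as one opaque step, with all
# tools pinned explicitly. Paths are illustrative placeholders.
export PATH=/toolchain/bin
export CC=/toolchain/bin/cc
export CFLAGS="-O2"
./configure --prefix=/build/out
make -j8
make install
```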
So I don’t think integration of CPython’s build system into anyone else’s system, beyond that kind of “make it possible to specify all inputs and tooling and produce a fixed, well-known set of outputs”, should ever be a requirement. Direct integration into the unseen and unknowable is unsolvable. The build tool and its collective inputs are effectively your public build API.
Anything that is done (if anything is done) should allow easy entry for new folks to be able to build CPython from source. Optimally, the flow would be basically the same on any platform.
Right now, when I build on Windows, I like being able to just run build.bat each time I make changes. On Linux I use ./configure then make -j8. Even then, on Linux, I’m almost always missing certain libs. Honestly, it took a bit of messing around and reading to get to this point.
The lower the “cost of initial entry” the better. There are some docs, yes, but it’s still scary for a new contributor.
That’s an interesting point. I run into it as well. I will, from time to time, run make distclean, which also nukes config.status. This requires me to go back and rediscover where one or two required libraries live. On my Mac, that’s currently OpenSSL. It’s not difficult to resolve, just mildly annoying. I can imagine it would be a significantly bigger problem for someone new to the Python dev environment.
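In my case the rediscovery step boils down to something like this (the Homebrew formula name may differ on another machine):

```sh
# Point configure back at Homebrew's OpenSSL after a distclean.
./configure --with-openssl="$(brew --prefix openssl@3)"
```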
In this case, does that mean that any new build system should know all the most common platform-dependent places to find library dependencies (say, /opt/homebrew/Cellar/...)?
If you think about it, this is actually pretty horrid from the point of view of getting to build reproducibility: rootle around and look for things, make some choices based on what you find, and if after a while that breaks, then just redo the rootle, hoping the versions of the things you find are usable ones.
Probably okay for just messing around trying things, though.
People have already grumped about autoconf, so I won’t further belabor the point.
Before we put effort into trying CMake: I just came across a survey of C++ users by the C++ foundation, in which CMake was specifically called out as a large pain point for most cross-platform C++ developers.