Custom build steps / Moving Bokeh off setup.py

Hi, I am interested in moving Bokeh away from setup.py [1] in the near-ish future. However, Bokeh is a cross-language project with compiled TypeScript components that need to be identically included in all published packages (wheel, sdist, conda). Our current build automation does this to build the packages before publishing:

def build_sdist_packages(config: Config, system: System) -> ActionReturn:
    try:
        system.run("python setup.py sdist --install-js --formats=gztar")
        return PASSED("sdist package build succeeded")
    except RuntimeError as e:
        return FAILED("sdist package build did NOT succeed", details=e.args)

def build_wheel_packages(config: Config, system: System) -> ActionReturn:
    try:
        system.run("python setup.py bdist_wheel --install-js")
        return PASSED("wheel package build succeeded")
    except RuntimeError as e:
        return FAILED("wheel package build did NOT succeed", details=e.args)

The question comes down to that --install-js option that we pass in. That option is currently handled by code in our setup.py, and what it does is copy an existing, built BokehJS into the Python source tree for inclusion in the package. Without that option, BokehJS gets built from scratch [2] every time. This is undesirable from a package automation standpoint:

  • It is somewhat time-consuming to build BokehJS, so it’s preferable to do it only once, rather than once for every package type build.
  • It is crucial that every package type (wheel, sdist, conda) has the exact same BokehJS files (i.e. with identical hashes). While the risk of somehow getting slightly different TS build outputs from subsequent BokehJS builds is very small [3], any risk at all here is unacceptable. We simply must use a single source of truth for BokehJS across all packages.
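As an aside, an invariant like "identical files across packages" is easy to spot-check mechanically. Here is a small sketch; the digest helper and the compared paths are purely illustrative, not part of Bokeh's actual tooling:

```python
import hashlib
from pathlib import Path

def tree_digest(root: str) -> str:
    """Stable digest over all files under root (relative path + content)."""
    h = hashlib.sha256()
    for path in sorted(Path(root).rglob("*")):
        if path.is_file():
            h.update(str(path.relative_to(root)).encode())
            h.update(path.read_bytes())
    return h.hexdigest()

# e.g. after unpacking the wheel and the sdist side by side (paths hypothetical):
# assert tree_digest("wheel/bokeh/server/static") == tree_digest("sdist/bokeh/server/static")
```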

So what are our options here? Looking at build, it does not seem sufficient to support a customization like this. Are there other tools that support defining custom steps as part of a build, or have extension APIs that we can leverage? [4]

TL;DR: How can we support custom build steps in these two cases:

  • A default build should build BokehJS from scratch and move it into the Python package.
  • An “install-js” build should move a pre-built BokehJS into the Python package.

  1. Why you shouldn't invoke setup.py directly ↩︎

  2. Essentially: cd bokehjs; node make ↩︎

  3. Maybe some datetime-dependent codegen is erroneously introduced somewhere, etc ↩︎

  4. Certainly we could “shell-script” our way out of this but I would much prefer to stick to community standard commands and tools to the extent possible. ↩︎


The “build backend interface” (PEP 517) offers a “config options” argument that tools can use to pass build configuration information like this to the backend. It was intended to cover this type of custom flag, but I’m pretty sure the setuptools build backend API doesn’t use it like this (yet?)

If you want to use standards-based tools to replace setup.py, then config_options would be the way to go - both pip and build have a UI to pass such settings to the backend. But you’ll need help from the setuptools project to implement the backend side of such custom flags. Until there’s something in place for that, I don’t think you can move off invoking setup.py.
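For concreteness, those frontend UIs look roughly like this; the install-js key here is hypothetical, and as noted above the setuptools backend would still need to actually do something with it:

```shell
# build's -C/--config-setting and pip's --config-settings both forward
# PEP 517 config settings to the build backend
python -m build --config-setting install-js=1 .
pip install --config-settings install-js=1 .
```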

Although thinking further, I guess you could change your setup.py so that, as well as (or instead of) accepting an --install-js command line flag, you checked for an environment variable INSTALL_JS. That wouldn’t need the build API to be involved - you could just set the environment variable and invoke build. Would that be an option for you?
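A minimal sketch of that idea — the INSTALL_JS name matches the suggestion above, but the accepted truthy values are just an assumption:

```python
import os

def install_js_requested() -> bool:
    """True when the packager asked to reuse a pre-built BokehJS."""
    return os.environ.get("INSTALL_JS", "").lower() in {"1", "true", "yes"}
```

setup.py could then branch on this instead of (or in addition to) the --install-js flag.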


Project Jupyter had a similar need and just switched to Hatch.

You could configure a custom build hook by making a file named hatch_build.py (that’s the default name):

from hatchling.builders.hooks.plugin.interface import BuildHookInterface

class CustomHook(BuildHookInterface):
    def initialize(self, version, build_data):
        if self.target_name == 'wheel':
            ...  # e.g. ensure the built JS assets end up in the wheel
        elif self.target_name == 'sdist':
            ...  # e.g. ensure the JS sources/assets are included in the sdist

then in pyproject.toml put:

[tool.hatch.build.hooks.custom]

or to be explicit:

[tool.hatch.build.hooks.custom]
path = "hatch_build.py"
edit: also, note that Hatch creates reproducible sdists.


Hi @bryevdv, please note that having a setup.py file is not deprecated per se. You can still use setup.py to customise setuptools commands and build steps… the part that is deprecated is executing it as a script.

You can use setup.cfg sections to pass options to the commands. Maybe you could try that? If that does not work for you, you can also try to use the --config-setting option on the build command line to pass options…

Hi all, thanks for the replies. Some comments/questions

@abravalheri I am trying to get rid of setup.cfg as well. :slight_smile: I think at this point everything still in there can go in a pyproject.toml and I very much want to reduce the scatter of configurations to make it easier for future contributors. I will say

please note that having a setup.py file is not deprecated per se.

This is actually confusing messaging to me. I don’t use setup.py for anything other than install, develop, sdist etc. So if I’m not using it for those things in the future, I don’t understand why it would be kept around. I guess that’s the frontend/backend thing, so other tools can call setup.py. But that is also confusing: if commands are going away, why is a script necessary just to define some metadata? Anyway, I digress.

@ofek Hatch looks interesting and promising, I will definitely take a close look! Thank you for the reference.

@pf_moore I suppose an env var could be an option, maybe the simplest thing in the short term. I will experiment. Regarding the config_options are there any relevant issues or PRs that I can follow?

I’m somewhat confused here. Presumably, setup.py is where you’re defining your custom logic to handle the --install-js option. If you want to continue using setuptools, you’ll still need a setup.py to hold that logic. What’s deprecated is not having a setup.py, but rather running it as a script.

I don’t know. The setuptools maintainers can probably point you at any documentation that exists for how they handle config_options, and how that ties in with customisations like your --install-js. Or if that’s not yet supported, then maybe they’ll know of any feature requests or PRs to add it. @abravalheri can you help?

To be honest, though, hatch with a custom hook to replace your --install-js code may well be your best approach longer term.

The idea is that you cannot always manage to do everything using only a descriptive approach; for some small number of use cases you will need to write some Python code with “build time logic”. I assume that this is also the reason behind custom hooks in hatch.

The setup.py file can still be used for that, nothing changes in that regard.

If you really want to get rid of setup.cfg, there is an experimental feature right now that you can use:

  • The equivalent of [sdist] in setup.cfg would be [tool.distutils.sdist] in pyproject.toml (with the appropriate INI => TOML syntax changes).
  • This is not stable (so far I haven’t received any feedback, and to be sincere the naming is not great), so likely to change in the future.

However, since you are providing your own implementation for the --install-js flag, you are not limited to this form of passing arguments…

For example, you can read the file yourself and be in total control of the situation regardless of the changes in setuptools:

from pathlib import Path
from setuptools import Command

import tomli  # Dependency to be added to `[build-system] requires`

project_dir = Path(__file__).parent

class YourCommand(Command):
    def finalize_options(self):
        if self.install_js is None:
             config = tomli.loads((project_dir / "pyproject.toml").read_text("utf-8"))
             self.install_js = config.get("tool", {}).get("yourtool", {}).get("install_js")

(disclaimer: untested example, might need some iterations to get it right)
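The pyproject.toml side of that sketch would then be something like this — tool.yourtool and install_js are the placeholder names from the code above:

```toml
[tool.yourtool]
install_js = true
```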

When using build as a frontend, there is a hint on how that can be done in Wheel tags · Issue #202 · pypa/build · GitHub for the --python-tag option of bdist_wheel.

Please feel free to open an issue/PR if you need other features.

I suppose in the long term both solutions should be fine.
If you feel like switching to hatch will be a good thing for your project, go ahead.
If you feel like setuptools is still useful for you and can minimise the amount of changes you have to implement, you can also go for it.

There is no plan to remove support for customisations in setup.py; the only thing being deprecated is the ability to use python setup.py as a CLI tool.


I’m glad this question is being discussed here. In the past similar issues were raised for Panel and sphinx_rtd_theme. In Nixpkgs we notice these types of issues directly, as they result in failed builds.

We prefer to have the build be pure, that is, with no network access. Basically, that means that when creating a wheel the artifacts should already be there. In the case of an sdist, in my opinion the same should apply. That means any artifact collecting should be done prior, outside of the build step.

To avoid artifacts in the repository (like node_modules) I think the best solution is to have a git submodule that contains them or a simple script that can be invoked to create the artifacts prior to using a build frontend for building a wheel or sdist.

There is an increasing number of packages that would like to package these kinds of artifacts. I think it is important that the packaging user guide discourage the bundling of artifacts during the build step.

Maybe we want to at some point standardize some kind of entry point for impure build steps so that distributors know there is an impure build step that they need to handle.

Thanks for sharing thoughts @FRidh but I do not agree that those ideas are universally applicable.

I think the best solution is to have a git submodule

Bokeh has used a monorepo for over ten years and there is zero chance we would move away from that. All the most active contributors prefer it, and the two “halves” of the project need to be kept in lockstep, so having unified commits is vastly preferable. A submodule adds complexity but would buy nothing of note for us (negative value, really), so it’s a non-starter. [1]

I think it is important that in the packaging user guide we discourage the bundling of artifacts during the build step.

I suppose this just comes down to a philosophical difference about where complexity should be distributed. Bokeh has two halves, but it is a single project. We want a single build tool invocation for the project as a whole that can generate everything, in one go, in a repeatable manner. In one sense I agree: We want to build BokehJS once, up front. But we don’t want the BokehJS build to install into the Python source tree, and we also don’t want more steps to explicitly coordinate. I want to point the package build at all the pieces and just say “put everything together”.

But also maybe we are using terminology differently. It’s hard to tell.

  1. In fact, Bokeh started off with submodules but we switched to monorepo after a very short time. It made development (and especially onboarding new contributors) much simpler. ↩︎


I guess what I am saying is that (speculation) for the vast number of users, for the last many years, those two things have been completely identical and indistinguishable. Maybe it would have been cleaner (conceptually) to in fact just deprecate setup.py entirely, and stipulate a new preferred module for the “backend-only” setuptools to consume going forward, because the current messaging (to me as a plain user) has definitely left me confused on some points. But I’m veering off topic at this point.


Interesting to hear you used a submodule in the past. Right, if the assets need to be updated regularly when changes in the Python code occur then that is definitely not going to work.

From a development point of view I understand. You want one entry point to build your entire project. This just gets hard with polyglot projects.

I was chatting with a meson developer about this a bit. If meson were to be used, you could put the npm part in a subproject. That subproject likely would do some run_command invocations, preferably splitting the impure parts (such as downloading with npm) into a separate invocation so they can be easily identified. Subprojects can embed their sources or binaries, which in your case are the node_modules. Downstreams can disable the use of embedded sources if they want to with a flag.

@pradyunsg showed a tool they wrote, GitHub - pradyunsg/sphinx-theme-builder: Streamline the Sphinx theme development workflow. It’s a build-backend specifically for sphinx themes. While I am not sure whether a backend is the right solution for solving this issue, I very much like that it standardizes things. It also comes with a cli for managing those types of projects, including scaffolding using stb new. I wonder whether it would be good to have a template for nodejs + Python packages, say using meson.

(Note I keep pushing for meson because I am afraid we’re otherwise going to see an exponential increase in build systems.)


Let’s hold off on advocacy until we standardize, otherwise it’s still just lock-in.

Speaking from the perspective of the Jupyter project, we’d rather not force all Jupyter extension authors to learn a new build system (meson). That’s why we made jupyter-packaging originally, to abstract the hard parts of setuptools. The new hatch_jupyter_builder plugin will allow extension authors to use declarative config in pyproject.toml and ensure that their JS assets are built and included.


First off, to make this as useful for @bryevdv quickly… Broadly, I’m suggesting changing your release build process from:

npm make build
python setup.py sdist --install-js
python setup.py bdist_wheel --install-js

to something like:

npm make build
INSTALL_JS=1 python -m build

My concrete suggestions are:

  • Don’t try to remove setup.py for now. The blog post you’d linked to as motivation is literally titled “Why you shouldn’t invoke setup.py directly” and not “Why you should get rid of setup.py from your project”. There’s a good reason for the specific wording there.

  • Stop invoking python setup.py ... and instead use python -m build / pip directly.

  • Use an environment variable instead of the --install-js flag. When the environment variable is set and BokehJS is not built locally in the relevant location, error out. If it isn’t set, you can keep the existing behaviour of invoking npm make build.

    The build-system tooling for Python has build configuration mechanisms, but you don’t need them for your usecase (as far as I can tell) – you can move the responsibility of passing this configuration “boolean” to the OS, instead of the Python packaging tooling.

  • There are alternatives to setuptools available but Bokeh doesn’t need them – they can provide a developer experience improvement but switching to them is not a requirement and can bring its own “growing” pains + migration costs.

As for improvements you could make to your build system, I have one suggestion: move the logic that invokes npm make build in setup.py and copies the built JS into a build_py subclass, and override the default build_py class with it (using setuptools.setup's cmdclass argument) – see “Extending the build through an override” below for details.

There are a few things in the discussion already, so I’m gonna try and group them:

  • Moving off of setup.py

    Realistically, setup.py is not going away as a way to configure Python package builds. It has been here for more than a decade, and will likely be around for longer. OTOH, it gives every user a Turing-complete mechanism to describe every possible key-value pair, which is far from ideal.

    That said, we do want people to stop doing setup.py install and setup.py sdist bdist_wheel and move to pip install . and python -m build – they do a few more things to ensure that builds happen correctly and are better solutions in terms of interoperability and available maintenance bandwidth. See also the blog post noted above.

    Personally, I’d like package authors to describe as much of their metadata statically as feasible, in files that don’t need a Turing-complete language to parse, so that dependency resolution mechanisms for Python can get this information cheaply.[1]

    Today, this information can be specified statically in the [project] table in pyproject.toml (which is backed by an interoperability standard) and setup.cfg (which is implementation-defined, as are most legacy things in Python Packaging), but neither is used during dependency resolution today. There are some tooling advantages, eg: it’s easier to parse/modify those files than a setup.py file using an automated tool.

  • Adding a custom build step to setuptools

    1. Moving the build-logic into a dedicated project

      A demonstration for doing this is available in setuptools’ issue tracker, written by one of the maintainers: Support for custom build steps · Issue #2591 · pypa/setuptools · GitHub.

    2. Extending the build through an override (this is what I recommended above)

      You can extend an existing build_py command in setuptools, using cmdclass and do additional build work in there. This has the advantage of being an intended point of extension for the setuptools build system and eliminates the need to look at sys.argv at any point. :slight_smile:

      I recently did something like this in Memray for an example of that (full disclosure: that’s an OSS project from work). That project builds JS assets using an npm run-script build command – it extends build_ext, you can extend build_py since you don’t have extension code. That project has C++, JS and Python build systems and was a fun one to get building correctly.

  • Changing to an alternative build-backend

    As noted by a bunch of folks already, there are a lot of alternatives to setuptools available today. None of them were popular late last year, except for Poetry – which does not have the extensibility you need anyway (AFAIK).

    In my opinion, what you’re seeing is well-meaning enthusiasm (and skepticism) from the folks around here, about the new build-backends in Python’s packaging ecosystem. In broad strokes, it took a lot of effort to get to this point and folks prefer the newer build-backends over setuptools for both “simple” and “complex” use cases; since they’re being built without the backwards compatibility constraints of setuptools and are able to innovate + improve various aspects of the developer experience.

    I’m not familiar with any of the ones relevant for this discussion as a regular user though – so no real suggestions on that front. Mostly just wanted to provide context for why alternatives to setuptools are being enthusiastically mentioned. :slight_smile:

  1. I know a bunch of other folks want this too but I don’t wanna speak for anyone else. ↩︎


@pradyunsg Thanks for the detailed feedback. I have followed your approach in a PR.

I did have a few questions at the end of the PR, in case you (or anyone else) has a few minutes to offer any comments.

I just wanted to circle back on this and give an update for anyone who might be interested in the details, since we did manage to get to a happy place eventually:

A few other miscellaneous cleanups were part of that PR, here are the relevant bits related to this discussion:

  • conda-build 3.22 was just released, adding a load_file_data function that can read TOML files, so we can template the conda recipe dependencies directly from pyproject.toml. The conda recipe also now just builds from the wheel, which is simpler.

    Combined with setuptools support for pyproject.toml this means that all of our runtime dependencies can finally be specified in one place. We even have a Sphinx extension for including dependencies list in the docs.

  • We got rid of versioneer and all the vendored files it adds. We switched to using setuptools-git-versioning instead. We did look at setuptools-scm but could not make its automagical dev version behavior work for us.

  • We updated our setup.py to control the JS build via a BOKEHJS_ACTION environment variable instead of command line args. This is important to be able to have a single source of truth for BokehJS across all package types.

  • Lots of old code could be deleted from setup.py, so we rolled all our previous helper functions from a separate support module directly back into setup.py, and then made a custom build sub-command to wrap it all up, now that it is possible to customize build directly.

    Even pulling everything into one file, the setup.py is still <150 LOC (including the colorama reporting code I can’t give up :smile:)

  • Since the only thing left in setup.cfg was Flake8 configuration, we moved the file to .flake8 which is hidden at least. Maybe some day Flake8 will add pyproject.toml support but I’m not too hopeful since they seem pretty aggressively against it. Still the top level of the repo is much cleaner now!

Putting things together, our dev workflow looks like:

Non-editable local install

# Build and use fresh BokehJS
pip install . 

# Use existing (already built) BokehJS
BOKEHJS_ACTION="install" pip install .

Editable install (editable for Python modules only)

# Build and use fresh BokehJS 
pip install -e . 

# Use existing (already built) BokehJS 
BOKEHJS_ACTION="install" pip install -e .

My only very tiny complaint is that in order to see our nice BokehJS build report output without it being buried in the voluminous output from setuptools, we have to invoke pip like:

BOKEHJS_ACTION="install" pip -v install . --config-settings quiet=true

I do hope maybe this could be improved in the future (e.g. via some logging API we could hook into instead of just using print)

For releases, our automation does something more or less like this:

cd bokehjs
node make build
cd ..

# use the single already-built BokehJS for both
BOKEHJS_ACTION=install python -m build -s -w .

# conda-build using the wheel with the already-built BokehJS
VERSION="3.0.0" conda build conda.recipe  --no-test --output-folder /tmp

which satisfies our main critical requirement. If I could improve one thing here, it would be to find a way to not need to coordinate the VERSION to conda build explicitly, but so far I have not found a way.

All in all, it was a net deletion of ~2700 LOC which is always a good feeling! Many thanks to @pradyunsg and @abravalheri for their help and guidance about the custom build command.


If you drop the -s -w, build will build the wheel from the sdist instead of from the provided directory. This means the sdist is automatically verified as part of your build process.


@EpicWink I think you have actually led to uncovering a slight snag. If I run

BOKEHJS_ACTION=install python -m build

then the resulting sdist (and hence wheel) are broken: they are missing the BokehJS files that were supposed to be installed. It seems that the build step is not called when making an sdist? I guess that makes sense but I am not sure what the best course of action is for this problem in our case (@abravalheri @pradyunsg any thoughts appreciated)

Edit: in the previous instance, the sdist was presumably still broken, but the wheel was not, since the wheel was built from the directory (not the packaged sdist) and thus did copy the necessary BokehJS files over.

Well, maybe more than a little snag. :confused: I guess as of the PR we had: working wheel but broken sdist in CI. I’ve tried several things to get the sdist working, but nothing I have done has gotten it to pick up the JS files that are in the source tree (even though they are listed in the manifest they still seem to get ignored). Is there any reason why it would be bad to just skip sdists altogether? I can build working wheels so we could just publish those and call it a day.

What is even weirder is that python -m build . produces very different results locally than it does in CI, but I really have no idea why that could be. Running locally includes many more directories and files, including the BokehJS files. We are using build isolation and setuptools >= 64 in both places.

That makes sense. Once you start building and compiling, that’s not really a “source distribution” anymore. In my opinion, an sdist doesn’t need to be much more than a Git checkout, minus some unneeded files (CI config, docs source, generated code, and, controversially, tests).

I’m not sure why the JS files are not picked up after declaring them in MANIFEST.in; perhaps it’s a bug with the version of setuptools-git-versioning in CI.
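For reference, the sort of manifest rule being described might look like the following; the exact path is an assumption for illustration:

```
recursive-include bokeh/server/static *
```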

Not distributing sdists means users on unsupported platforms can’t install the package, and companies with a policy (eg Linux distros) to use the source aren’t allowed to use the package.

Usually an environment mismatch, either versions are different or tools/libraries are not present in CI (including pip and system libraries).