Building extension modules, the 2020 way



I have no idea what you are suggesting here, sorry. You replied to my post, but I don’t see anything that’s relevant to what I said. Could you clarify?

How is the PEP517 citation relevant to what you said (and the rest of this thread)?

Perhaps there is a link to where the “optional hook” mechanism is specified.

Ah, I see. No there’s nothing specific, it’s just that PEP 517 notes that backends don’t have to provide all of the hooks (for example, get_requires_for_build_sdist). So installers have to be prepared for backends not having a hook, and any future extension to PEP 517 that adds a hook for editable installs can be handled by frontends in the same way.
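To illustrate that pattern, here is a minimal sketch of how a frontend can probe for an optional hook (a hypothetical toy backend, not pip's actual code):

```python
import types

# Hypothetical backend that only implements the mandatory build_wheel hook.
backend = types.SimpleNamespace(
    build_wheel=lambda wheel_directory, config_settings=None,
                       metadata_directory=None: "pkg-1.0-py3-none-any.whl",
)

# PEP 517: get_requires_for_build_wheel is optional; when it is absent,
# the frontend behaves as if the hook returned an empty list.
hook = getattr(backend, "get_requires_for_build_wheel", None)
extra_requires = hook(None) if hook is not None else []
print(extra_requires)  # []
```

A future editable-install hook could be probed the same way, which is why its absence in older backends is not a problem.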


For my python project (cyminiball), I use Cython to create an extension module.

CI tools such as tox suggest using pep517 for packaging. Therefore, I decided to give the new pep517 tools a try. But like the OP, I had difficulty finding useful documentation about them.

The first steps were still relatively easy. I created pyproject.toml with the following lines, and I was able to build distributable packages (wheels and sdist) by calling python -m pep517.build.

# pyproject.toml
[build-system]
requires = ["setuptools", "wheel", "Cython", "numpy"]
build-backend = "setuptools.build_meta"

To avoid cythonization during setup, I usually follow a two-step approach, nicely described in this blog post. The idea is to first cythonize the .pyx (and .pxd) files, and then create an sdist that uses only the C/C++ files generated by Cython. This has the benefit that package installation does not require Cython and is therefore faster, while still being very portable.

For this purpose, I have to run setup.py twice when building the package the legacy setuptools way:

python setup.py build_ext --inplace
python setup.py sdist bdist_wheel

For this to work, I need a little bit of logic in setup.py:

import sys

cmdclass = {}
subcommand = sys.argv[1] if len(sys.argv) > 1 else None
if subcommand == "build_ext":
    # Cythonize the .pyx sources in place.
    from Cython.Distutils import build_ext
    import Cython.Compiler.Options
    miniball_src = ["bindings/_miniball_wrap.pyx"]
    cmdclass["build_ext"] = build_ext
else:
    # Otherwise use the "pre-compiled" Cython output.
    miniball_src = ["bindings/_miniball_wrap.cpp"]
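A variant of this idea (a common pattern from the Cython documentation, shown here with the same hypothetical file names) keys off whether Cython is importable instead of inspecting sys.argv:

```python
import sys

try:
    from Cython.Build import cythonize
    HAVE_CYTHON = True
except ImportError:
    HAVE_CYTHON = False

# Build from .pyx when Cython is available; otherwise fall back to the
# pre-generated C++ file that ships inside the sdist.
ext = ".pyx" if HAVE_CYTHON else ".cpp"
miniball_src = ["bindings/_miniball_wrap" + ext]
print(miniball_src)
```

This way the sdist installs cleanly on machines without Cython, while developer checkouts with Cython installed regenerate the C++ automatically.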


Question 1: Is it possible to configure this “two-step” approach using the pep517 module? Its --source and --binary options only select the distribution type (sdist or wheel), but don’t work as outlined above.

Question 2: Will setup.py at some point be replaced by pyproject.toml and pep517? If not, what will go into setup.py, and what belongs in pyproject.toml?

If it is, it’s because setuptools decided that and not because the Python ecosystem did.

pyproject.toml contains the build-backend setting that shows you’re using setuptools, and everything else comes from them, which might involve other sections in the pyproject.toml.

For example, you could use pymsbuild instead, which would still have a pyproject.toml but uses a different configuration file than setup.py (and is considerably easier for doing Cython, since that’s what I needed, but doesn’t yet compile C code with gcc…).

No, because the definition of how they work (with regards to your post) is entirely up to the build backend, setuptools in this case.

You might be able to trick it into working, but only by knowing how the backend works, which means you’re better off working WITH the backend rather than against it.


With the recent acceptance of PEP 621, once setuptools implements it, pyproject.toml will, from the end user’s point of view, replace setup.py in most cases. Note that here the decision was made by the Python ecosystem, not setuptools. The caveat is that only the core metadata is defined as belonging in pyproject.toml; backends (setuptools in this case) are free to continue using their own configuration files (setup.py and setup.cfg). So in a sense setup.py is likely not going anywhere, but you’ll probably see more and more content move into pyproject.toml, with setup.py only needed/recommended for advanced use cases.
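Concretely, the statically declared core metadata would move into a [project] table along these lines (illustrative values, not the package’s actual metadata):

```toml
[project]
name = "cyminiball"
version = "2.0.0"
requires-python = ">=3.6"
dependencies = ["numpy"]
```

How to build the extension module itself would remain backend-specific, e.g. in setup.py for setuptools.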


That PEP doesn’t cover specifying the packages to include, which is arguably the most important part :wink:

If setuptools chooses to start accepting lists of source files in pyproject.toml etc, etc.


For PyArrow we have a setup.py (see current version) that drives CMake from a build_ext subclass – and CMake itself drives Cython. It allows building a wheel using python setup.py bdist_wheel. I don’t know if it’s a reasonable choice, however; it’s just what we managed to get working. :grimacing:


I’ve wanted to create a PEP 517 backend to drive CMake or Ninja and create platform wheels for a while now. But alas there are too many things I want to build, and I’m still hoping someone could work on it before I do.


I can tell you that the challenge with that kind of backend is you still need to calculate all the command lines and config variables yourself and pass them in explicitly (for all platforms), because the best built-in support these tools have involves searching for a (different) copy of Python.
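As a sketch of what "passing them in explicitly" means: the interpreter-specific values such a backend would feed to CMake/Ninja can be pulled from the running interpreter via the stdlib sysconfig module (the exact set of variables a given CMake project needs is project-specific):

```python
import sys
import sysconfig

# Values a CMake/Ninja-driving backend must pass explicitly, so the build
# targets *this* interpreter rather than whichever one CMake finds itself.
include_dir = sysconfig.get_paths()["include"]        # where Python.h lives
ext_suffix = sysconfig.get_config_var("EXT_SUFFIX")   # e.g. ".cpython-38-x86_64-linux-gnu.so"
print(sys.executable)
print(include_dir)
print(ext_suffix)
```

These would then be handed to the build system as cache variables (e.g. -D flags for CMake), once per target platform.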

This is why I still haven’t enabled cross-plat support for pymsbuild :slightly_smiling_face: Far more effort than I have brainspace for right now.


I owe the community a PEP517-style implementation of ‘editable’ installs for enscons. The intention in our stalled discussion is to produce a wheel that provides a stub module that redirects to the source. When you import the stub module it adds the correct paths to load the development code.
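The stub described above might look something like this (hypothetical paths and package name, not enscons’ actual implementation):

```python
# mypkg/__init__.py as shipped in the "editable" wheel.
# The location of the source checkout is baked in at wheel build time
# (hypothetical path for illustration).
import sys

_SRC_DIR = "/home/user/dev/mypkg/src"

# Make the development tree importable so later imports resolve to the
# live source code rather than a copied snapshot.
if _SRC_DIR not in sys.path:
    sys.path.insert(0, _SRC_DIR)
```

The wheel itself stays tiny; editing files in the checkout is immediately visible without reinstalling.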

In the meantime it may still be possible to provide a setup.py that is only used for the develop command.

Update: I added editable installs for enscons, in a way that is compatible with @sbidoul’s