Building extension modules, the 2020 way


I have been doing some research on this topic, but I think it is fair to say that the available documentation about how to compile and distribute Python extensions (as in Python modules written in a compiled language using the CPython C API) is not in the best shape. There seems to have been some relatively recent activity in this area, but I have the impression that it has not yet found its way into the documentation (or, if it did, I must have overlooked it).

setuptools and the “standard” ext_modules driven build work fine for “easy” cases. However, I am working on a project that aims at distributing Python extensions written in C++ with non-trivial dependencies. I started to experiment with Meson (and its python module) to build the project and it works very nicely. However, I am a bit lost when it comes to distributing the package in source form and in binary wheel form: integrating an alternative build tool into setuptools seems not easy to do and impossible to maintain.

I have found mesonpep517 and, while I haven’t tried it yet (I am still putting the pieces together), it seems to do exactly what I would like. However, I don’t understand what the development workflow is supposed to look like in this case, namely: what replaces the old python setup.py develop?
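For context, my understanding (an assumption on my part; I haven’t verified the exact backend module path against the mesonpep517 documentation) is that the backend would be enabled with a pyproject.toml along these lines:

```toml
[build-system]
requires = ["mesonpep517"]
build-backend = "mesonpep517.buildapi"
```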

Am I looking at the right tools? Is PEP 517 intended to support a development workflow as well? More generally, is there a document where I can learn about the big picture of the future of Python packaging?

Thank you!



Hello there, I’m maintaining a tiny extension module whose build process integrates CMake into setuptools, and I agree that it’s not at all easy to figure out a working config. I believe PEP 517 backends that directly use the chosen build tool (e.g. Meson, CMake, etc.) should be the future, though I’m sorry that I’m not sure what the current state of each backend is.

As for the workflow, since PEP 517/518 aim to provide build isolation, I think the intent is to build and test the extensions in an isolated environment. Personally, I run Tox if I modify the module, or install the module using pip and run the testing framework (pytest in my case) directly. Notice that editable installs (i.e. develop in the case of setuptools) don’t work for extension modules due to the need for compilation.

As for documentation, so far I rely on PEP 517/518 and the manylinux PEPs, but I share the feeling that it isn’t sufficient.


I don’t think this is true. I use it routinely.

1 Like

Well, technically you’re correct: the command does run, but it doesn’t keep your site-packages in sync with the code base as you edit the code.

1 Like

The technical answer for how you type python setup.py develop in a PEP 517 age is with pip in editable mode, I believe:

pip install --editable ./

As pointed out though, this will require re-running this command each time you need to re-compile your code.

I don’t know of specific resources to point you at though, unfortunately.




Thank you @pelson, but this does not work here:

$ pip install -e .
ERROR: File "setup.py" not found. Directory cannot be installed in editable mode: /Users/daniele/src/foo
(A "pyproject.toml" file was found, but editable mode currently requires a setup.py based build.)

Am I missing something?

1 Like

Editable installs are not supported with PEP 517 at this time.

1 Like

Suggesting another tool for static binary distribution may not be the welcome answer, but e.g. Bazel can drive CMake builds (though plain BUILD files may be sufficient) and has a py_binary build target.

" [Distutils] Re: pip and missing shared system system library" ::

Are you requesting an implementation of autotools / autoconf / pkg-config / libtool in Python, in setuptools?

Existing workarounds for building and distributing portable binaries:

W/ shared library dependencies:

  • auditwheel & manylinux

auditwheel show : shows external shared libraries that the wheel depends on (beyond the libraries included in the manylinux policies), and checks the extension modules for the use of versioned symbols that exceed the manylinux ABI.

auditwheel repair : copies these external shared libraries into the wheel itself, and automatically modifies the appropriate RPATH entries such that these libraries will be picked up at runtime. This accomplishes a similar result as if the libraries had been statically linked without requiring changes to the build system. Packagers are advised that bundling, like static linking, may implicate copyright concerns.


Thank you @westurner. I don’t like Bazel that much, but leaving this (important) aspect aside, how do you generate a distributable Python wheel with Bazel? I haven’t found a way. Meson is perfectly capable of generating the binary extension, but there seems to be no defined story on how to generate a wheel from there.


I understand @pradyunsg, but how is someone supposed to develop a package using PEP 517? I tried, and it seems to detect that the project it is building is under version control and builds only from a copy of the sources without uncommitted changes. While I see why this may be a nice property in some circumstances, it makes (test-driven) development almost impossible, even if one could go through the build and install cycle for each source code change.

1 Like

That will be due to your build tool, not PEP 517 itself.

1 Like

Meson’s documentation specifically says it builds the latest commit, and that it always builds in a separate directory (i.e. no in-place building).

Unfortunately, for in-place development, the most supported solution is to use setuptools, where you subclass Extension to build binaries with Meson and then place the artefacts in the source tree. Sdists and wheels (native and manylinux) are then straightforward using the existing means.

1 Like

Where does the documentation say that? I can build my project just fine with uncommitted changes taken into account. It seems to be the interaction with mesonpep517 that causes that.

In-place development is not a requirement, but it is the current workflow. What is the development story for the tools that are being pushed to replace setuptools? Is anyone maintaining a real package using the new tools?

If I am reading the code right, subclassing Extension is not enough to place artifacts in the source directory. This requires modifying the build_ext command (in a rather fragile way). Furthermore, placing artifacts in the source directory only helps for in-place development, as wheel creation picks up binaries from the build directory. Do you know of a package that does what you describe and that I could use as a guide to implement it?

Thank you.

1 Like

I misread from another part of the documentation:

The main mechanism replacing setup.py is PEP 517. I’m not familiar with these tools for compiling extension modules; however, enscons and the mesonpep517 you suggested look promising.

Yes, I was thinking of a subprocess call and then a copy relative to __file__. A far cry from the declarative intentions of setup(). I did mean to say Distribution, not Extension.

On that, using setup.py is most likely not the 2021 approach, and it seems the ecosystem is moving toward solving extension-module development via PEP 517. For now, as I said above, the most support is for the setuptools method (with cffi, Cython, etc.).

1 Like

Well, “replacing” should be “aiming at replacing”, as it seems that basic functionality is not implemented yet and the details of how the system is supposed to work are not quite defined yet.

enscons seems to be a dead project: development was hosted on a Bitbucket Mercurial repository that no longer exists, and I haven’t been able to find any other web presence for the project other than the PyPI page. All the projects listed as using it seem to be equally dead.

From my tests so far, it seems that mesonpep517 is at best a prototype: it still fails on rather simple cases, and it seems that it is not meant to support a development workflow (or I am using it in the wrong way, but the documentation does not suggest the presence of any knob to turn to get it to behave as I would like).

Do you have an example? I am still not sure I follow what you mean. The only way I see to make it work is to replace the implementation of the build_ext command class with something that builds the extension and then places it where the native build_ext would have placed it. However, there doesn’t seem to be any documented “interface” for how the different parts of the setuptools build process interact, so I am afraid that this solution would be extremely fragile with respect to the version of setuptools, Python (as the underlying distutils library has been distributed with Python for a long time), or the platform.

1 Like

It moved to GitHub recently and the links haven’t yet been updated

import setuptools
import distutils.cmd


class build_ext(distutils.cmd.Command):
    user_options = []

    def initialize_options(self):
        pass

    def finalize_options(self):
        pass

    def run(self):
        import os
        import shutil
        import subprocess
        # Configure the Meson build directory, compile, then copy the
        # built extension into the source tree ("_mymod.so" is a
        # hypothetical artifact name).
        subprocess.run(["meson", "build", "."], check=True)
        subprocess.run(["ninja", "-C", "build"], check=True)
        shutil.copy(
            os.path.join("build", "_mymod.so"),
            os.path.join("src", "mypkg", "_mymod.so"),
        )


setuptools.setup(cmdclass={"build_ext": build_ext}, ext_modules=...)

Warning: completely untested

FWIW, you may be interested in my project, where I build a Python extension based on cairo. On Linux and OSX I “cheat” by declaring a dependency on pycairo instead and stealing the shared library from it at runtime, but on Windows I just explicitly copy the DLLs into the wheels (or into the source directory, to support editable installs). This does require cairo to have already been built, but I guess otherwise that would just be a couple more subprocess calls to invoke the build tool?
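A minimal sketch of that runtime lookup, i.e. finding a shared library that was copied next to the package and loading it with ctypes (the helper names and library name here are hypothetical, for illustration only):

```python
import os
import sys


def bundled_library_candidates(pkg_dir, stem):
    """Return plausible file paths, inside pkg_dir, for a shared
    library named `stem` that was bundled into the wheel."""
    if sys.platform.startswith("win"):
        names = [stem + ".dll"]
    elif sys.platform == "darwin":
        names = ["lib" + stem + ".dylib"]
    else:
        names = ["lib" + stem + ".so"]
    return [os.path.join(pkg_dir, name) for name in names]


def load_bundled(pkg_dir, stem):
    """Load the bundled library with ctypes if present, else return None."""
    import ctypes
    for path in bundled_library_candidates(pkg_dir, stem):
        if os.path.exists(path):
            return ctypes.CDLL(path)
    return None
```

In the package itself, `pkg_dir` would typically be `os.path.dirname(__file__)`, so the copy step in the build only has to drop the DLL next to the extension module.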
(You may also be interested in matplotlib’s setup.py, which fully builds libfreetype with subprocess calls and then links it either statically or dynamically depending on setup configuration options.)
Although this requires a bit of hacking in setup.py, I am quite happy to be able to do all of this in stdlib/setuptools Python plus subprocesses, rather than having to learn a build tool DSL or a third-party library.

1 Like

There are a few experimental and 3rd-party methods for generating wheels with Bazel which could be ported to ninja and/or meson:


1 Like

@pradyunsg - could you clarify a little, please? I get the impression that there is some very specific detail I’m missing, because I do see PEP 517 build dependencies being installed when I do an editable install with pip.

Is it literally “support” which is missing, i.e. the lack of an “editable” standard definition in PEP 517, or is there a whole swathe of functionality which doesn’t yet work? Do you have in your sights a real “editable” extension capability, where you basically get to edit C code and have it compiled and available automatically, perhaps? Is it that, in order for “editable” to be supported, all backends must implement it (not just setuptools)?

(please feel free to link me to the sources if it exists, rather than having to (re-)write tons of stuff)

Of course, the first thing that I would want to do with an editable install is to fix up the binaries à la auditwheel (Auditwheel repair... without the wheel (e.g. developer install)) - but given the separation of these stages for wheels, it is reasonable to expect there to be a separation for development mode also.


1 Like

PEP 517 does not have a hook that allows a frontend to ask the backend to create an editable version of a project.

So frontends have to use backend-specific mechanisms (pip does this for setuptools only, by running setup.py develop) or not support editable installs.

The amount of functionality an “editable” install gives you is backend-specific, and has nothing to do with PEP 517.

No, it would be individual backends that would cover that (or not).

No, PEP 517 has an “optional hook” mechanism. If a backend doesn’t support the editable hook, pip would just report “editable mode is not supported for this project” on projects using such a backend.
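To sketch how such an optional-hook mechanism works in practice: the frontend simply probes the backend module for the hook and falls back when it’s absent. Note that `build_editable` below is the hypothetical editable hook under discussion, which PEP 517 itself does not define, and the backend here is a stand-in object, not a real build backend:

```python
import types

# Stand-in for an imported PEP 517 backend module that implements the
# mandatory wheel hook but no editable hook.
backend = types.SimpleNamespace(
    build_wheel=lambda wheel_directory, config_settings=None,
                       metadata_directory=None: "pkg-1.0-py3-none-any.whl",
)


def install(backend, target_dir, editable=False):
    """Frontend-side sketch: use the (hypothetical) editable hook if the
    backend provides one, otherwise report that editable mode is
    unsupported for this project."""
    if editable:
        hook = getattr(backend, "build_editable", None)
        if hook is None:
            return "editable mode is not supported for this project"
        return hook(target_dir)
    return backend.build_wheel(target_dir)


print(install(backend, "dist"))                 # normal wheel build works
print(install(backend, "dist", editable=True))  # falls back with a message
```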