How to run tests on Python project with C/C++ extensions?

I have a Python project with a single Python module and single C++ module based on pybind11. I have configured it using pyproject.toml and setup.py to:

  • install it in editable mode using python -m pip install -e .[dev]
  • build the package using python -m build

The question is: how can I test the project without installing it into the system? With a pure Python module I can add the parent directory of the module to the PYTHONPATH and run pytest. Before pyproject.toml the proper way was to run python setup.py test. What is the idiomatic way with the new project configuration?

You cannot. A C/C++ extension must be compiled, and this is typically done during an install.

After pip install -e ., you can then import the extension module in your test scripts.

There is no need to modify PYTHONPATH with this method; see Good Integration Practices — pytest documentation

3 Likes

Furthermore, a C/C++ extension compiled with pip install -e does not place the extension library into a system path (e.g., /usr/local/lib, /path/to/python/lib). It is placed, IIRC, where the package’s source code is.

Therefore it is effectively not installed in a “common” path where it may affect other software on your OS.

1 Like

I argue a project should always be installed before testing, so that what you’re testing is a better representation of what a user will have.

The trick is, you don’t have to install to your system: instead, I recommend installing into a virtual environment. If you want to do this during development setup, check out venv (standard library) or virtualenv. If not, check out tox.

4 Likes

Technically speaking, the standards don’t cover testing at all at the moment. As others point out, the idiomatic way to get the extension built for testing is to actually install the package. You have roughly two options there:

  • install in editable mode (pip install -e .), which roughly means placing the compiled extension in the source directory, effectively making it like “building” in the old terminology
  • install into a virtual environment and test from there

If your compiled extensions live inside a Python package (i.e. they are not top-level extensions), please place everything in a subdirectory (such as src) so that it doesn’t get imported implicitly. This ensures that if someone chooses the virtual environment path, the source directory without compiled extensions won’t get onto PYTHONPATH implicitly, and the installed version will be used instead.

For example, instead of frobnicate/__init__.py (and frobnicate/_ext.*.so) put it into src/frobnicate/__init__.py.
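In other words, a typical src layout might look like this (tests/ and pyproject.toml shown for context; the extension filename varies by platform and Python version):

```
project/
├── pyproject.toml
├── src/
│   └── frobnicate/
│       ├── __init__.py
│       └── _ext.*.so        # compiled extension lands here after install
└── tests/
    └── test_frobnicate.py
```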

As for testing itself, you effectively need to call the specific test runner directly, e.g.:

python -m pytest
python -m unittest

Ideally, use a tool like tox that conveniently takes care of installing the project and running the tests in a virtualenv for multiple Python implementations.
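A minimal tox.ini along those lines might look like this (a sketch; the env list is just an example, adjust it to the interpreters you actually support):

```ini
[tox]
envlist = py310, py311, py312

[testenv]
# tox builds the package, installs it into a fresh virtualenv per env,
# then runs the commands below inside that virtualenv.
deps = pytest
commands = pytest {posargs}
```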

2 Likes

Thank you for the answers. Yes, it is clear that the native module should be compiled before testing. The thing that confused me is that there is a build module to build the package under a virtual environment, but there is no “test” module to test the package under a virtual environment, or to test a package which is already built. Sure, it can be done manually: build, make a virtualenv, install, run tests, clean up — but it is strange that there is no standard tool for it.

tox is a nice candidate for such a tool, but it has its own configuration. Also, I am not sure whether it is possible to test packages which are already built with tox? Or, vice versa, is it possible to extract the packages which are built and tested by tox?

This just means nobody has built it yet.

We aren’t a corporate entity with project managers and huge budgets. We’re a group of volunteers doing stuff in our free time. Things are only as “standard” as the community declares them to be, and things are only built when someone desires it enough to spend their weekends on it.

2 Likes

I don’t know exactly what you mean by extract, but usually you can find tox’s “build artifact” under the .tox/.pkg/dist folder (I think it is configurable).

For example:

mkdir -p /tmp/myproj/src/myproj
cd /tmp/myproj

cat <<EOF > pyproject.toml
[build-system]
requires = ["setuptools"]
build-backend = "setuptools.build_meta"

[project]
name = "myproj"
version = "42"
optional-dependencies.test = ["pytest"]
EOF

touch src/myproj/__init__.py
mkdir tests

cat <<EOF > tests/test_myproj.py
import myproj

def test_myproj():
    assert myproj
EOF

cat <<EOF > tox.ini
[testenv]
extras = test
package = wheel
commands = pytest {posargs}
EOF

tox

ls .tox/.pkg/dist  # => myproj-42-py3-none-any.whl

Note that different kinds of “build artifacts” can be produced depending on your configuration (e.g. sdist, editable wheels…). That is why I included package = wheel.

1 Like

I am sorry if it sounds like someone must do something. I didn’t mean that. I just wanted to highlight this issue.

1 Like

Thanks @abravalheri, yes, that is what I mean by extracting. If I can copy artifacts from the .tox directory then it works for me.

1 Like

Hi @vsbogd, please note, however, that if you want the C/C++ extensions that you are building to be compatible with a wide range of systems (e.g. manylinux), you are likely to require a more specialised build process, like the one provided by cibuildwheel.

1 Like

For the record, if you’re looking to use tox’s artifacts as a part of release pipeline, I would advise against that — it’s not a documented interface, i.e. something you can rely on.

I don’t think there’s a really good solution here. build is focused specifically on preparing build artifacts for publishing, and tox is focused specifically on testing your package. You could try to hack some rules to reuse build artifacts as part of a tox environment, or to make tox invoke build and grab its artifacts.

Also note that setuptools currently tends to leave the build tree inside the source directory, so in general subsequent invocations of build and tox will only build the extension once. Though I imagine that’s not the same as ensuring that you’re testing precisely what you’re about to publish.

1 Like

Thanks for the link to cibuildwheel!

I have done something similar to what you’re asking about with a set of SWIG bindings that are generated by the same CMake build tooling as the C++ that is being bound, by changing to the build directory before invoking pytest. However, my experience is with something that uses the deprecated workflow of invoking setup.py directly, and the build/ directory is available after building finishes. PEP 517 does add some wrinkles with this sort of approach, which have been mentioned above, and getting around those wrinkles is probably more trouble than it’s worth.

I would say the best option is to build your package and install it into a (possibly ephemeral) venv to avoid installing “into the system”, and run your tests against that venv. tox is a useful way to automate that process (especially across multiple Python versions or other configurations), but not an obligatory one.
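The manual version of that recipe might look roughly like this (a sketch only; paths are examples, and it assumes build is available and you are in the project root):

```shell
python -m build                          # build sdist + wheel into dist/
python3 -m venv /tmp/testenv             # ephemeral venv, not the system
/tmp/testenv/bin/pip install dist/*.whl pytest
/tmp/testenv/bin/python -m pytest tests/ # tests run against the installed wheel
```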

1 Like

With CMake I could update PYTHONPATH to allow loading the library from its original location. But I would like to allow users to make an editable install of the Python package. That is why I moved the building process into a setuptools Extension. On the other hand, I would also like to allow running tests without manually installing the package into a user environment or virtual environment. There is also the separate question of testing packages before publishing, as @mgorny mentioned.

For now I think I will try building the native module with CMake and calling CMake from a setuptools Extension. tox allows testing it on the development box without explicit installation. As an alternative, one can use an editable install and then run tests by manually calling pytest. Regarding release-build testing, the approach I think should work is using a CI box: making a distribution package, installing it on the box and running pytest. I hope cibuildwheel allows it for some platforms at least.

For the record, scikit-build seems to be the standard tool for combining CMake and setuptools. However, please note that I haven’t used it, so I don’t know if it’ll do what you need it to.

If you aren’t married to CMake, my personal suggestion would be to use Meson instead. It comes with a PEP 517 backend, meson-python, so you wouldn’t have to use two build systems to get the best of both worlds.
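For reference, switching to meson-python is mostly a matter of the build-system table in pyproject.toml (a sketch; your project would also need a meson.build describing the extension):

```toml
[build-system]
requires = ["meson-python"]
build-backend = "mesonpy"
```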

1 Like

Yup, that’s basically exactly what cibuildwheel does, for all major PyPI-supported platforms.

1 Like