Use case supported? 2 namespace packages in single source repo with 1 setup.py

I’ve been struggling to get pip install -e . to work for this scenario with namespace packages … and I’m starting to realize that maybe it’s just not a supported use case. Can someone clarify?

In plain language, I’m basically trying to put modules for the same logical package at different directory paths in a single source repo. See below.

Most examples I’ve seen for packaging things with namespace packages assume that the packages live in different source roots, or that each incarnation of the namespace package gets its own pip distribution. In both of those cases, that leads to two different setup.py files, each packaging a different “part” that happens to want its code namespaced under the same package.

The net result is that if I use the structure below and try to pip install -e . the package for local development, the .../site-packages/easy-install.pth file only gets an entry for /project/repo, since that’s where the setup.py lives and that’s where you told pip the code was (with .).

So specific questions:

1. Can the same namespace package show up at different paths within a single project?
2. If so, can there be a single setup.py at the root of the project that packages the whole project and accounts correctly for each incarnation of the namespace package?
3. If so, how do the package_dir and packages values get set in setuptools.setup(...) in that setup.py?

/project/repo
├── libs
│   ├── my_namespace_pkg
│   │   ├── foo
│   │   │   ├── foo_module.py
│   │   │   └── __init__.py
│   │   └── bar
│   │       ├── bar_module.py
│   │       └── __init__.py
├── src
│   └── my_namespace_pkg
│       ├── baz
│       │   ├── baz_module.py
│       │   └── __init__.py
│       └── qux
│           ├── qux_module.py
│           └── __init__.py
└── setup.py

I’m not able to answer your question directly, but it seems to me that this is a pretty unusual setup, and I’d be unsurprised if there were problems - maybe bugs, maybe things that on reflection we don’t want to support. Or maybe it “just works” - but your experience seems to indicate otherwise.

But one check I would do is to ask how you’d expect your project to get packaged as a wheel - and does it actually get built that way? And if you install that wheel, does the project work? If it does, then I’d say that this is something that you’d need to discuss with the setuptools project to explore whether it’s a supported case for editable mode. If your wheel doesn’t look like you would expect, or doesn’t work when installed, then maybe the problem is more generic.

Thanks Paul – with respect to “how is it packaged as a wheel”, really this is a web app project that is not going to be distributed as a package. But we’re looking to follow the best-practice approach of ensuring you can self-reference (import) your own code, for things like unit tests, running scripts in the code, and the like.

There’s been a recent migration towards the “src/ layout” for projects [1][2], and my attempt was riffing on that: separating e.g. src/ from tests/, but going a few steps further and separating other concerns into top-level project folders (config/, automation/, tools/ … what-have-you). While those concerns are separate, it can be helpful for any Python code under those directories to be importable from any of the other directories, even though they’re not under one uber Python package – and I saw namespace packages as a potential way to do that.

I haven’t been able to get it to work though.

The alternative is to put all those top-level folders into one all-encompassing directory, which becomes a top-level Python package. The trouble there is that your project becomes harder to navigate as a logical structure, since you land right in the source code from the very first directory.

E.g. combining the layout above would lead to the following (replace libs/ with any concern you are trying to separate):

/project/my-namespace-pkg
└── my_namespace_pkg
    ├── __init__.py
    ├── setup.py
    ├── libs
    │   ├── foo
    │   │   ├── foo_module.py
    │   │   └── __init__.py
    │   └── bar
    │       ├── bar_module.py
    │       └── __init__.py
    └── src
        ├── baz
        │   ├── baz_module.py
        │   └── __init__.py
        └── qux
            ├── qux_module.py
            └── __init__.py

And it doesn’t make sense to have src/ as a Python package, so get rid of that, leading to:

/project/my-namespace-pkg
└── my_namespace_pkg
    ├── __init__.py
    ├── setup.py
    ├── libs
    │   ├── foo
    │   │   ├── foo_module.py
    │   │   └── __init__.py
    │   └── bar
    │       ├── bar_module.py
    │       └── __init__.py
    ├── baz
    │   ├── baz_module.py
    │   └── __init__.py
    └── qux
        ├── qux_module.py
        └── __init__.py

… And then you’ve kind of lost any top-level organization you wanted to have, as Python packages and module .py files get conflated with other concerns that may have source-controlled artifacts other than Python code (docs or tests or config or infra-as-code).

Seems like the src/ layout helps with this, but still forces you to put all your python code and packages under the src folder only.

  • Maybe putting all Python code under src/ (and precursors of this approach) is the cleanest and most Pythonic way to do it, and I need to give in to that.
  • Or maybe I need to accept that code separated into different directories should be packaged separately too, and I need to create multiple setup.py files.

[1] https://github.com/pypa/packaging.python.org/issues/320
[2] https://bskinn.github.io/My-How-Why-Pyproject-Src/

Interesting - Python’s packaging tools are fairly fundamentally designed around the idea of developing and publishing a package. (Many things that aren’t obviously packages can be treated as such - many applications are distributed that way, as are a lot of web applications, as I understand it).

So it’s quite possible that you’ll hit some rough edges if you don’t buy into that basic idea, even if it’s not precisely aligned with your needs. Unfortunately, the story for distributing Python code that isn’t structured as a package is fairly weak (possibly because “think of it like a package” works well enough for so many cases), so you may not find an alternative that’s much better…

So sorry, I don’t have any good answers here. But I’m not a web developer, so take what I say with a pinch of salt - hopefully others will have some more specific suggestions for you.

The idea of the src directory is to separate “things that should be distributed” from “things that should not”. It is meant to be added to your PYTHONPATH variable to get your package importable while testing - it isn’t supposed to be a namespace package.

I’m pretty sure all you’re after here is more search paths (via PYTHONPATH or updating sys.path at runtime), though tbh I think you’ll be fine putting everything under a top-level package. For libs that should be loaded by their own top-level name, move them out to a separate repo entirely and treat them like any other dependency.
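As a minimal sketch of the “more search paths” approach - assuming a hypothetical conftest.py (or any startup hook) at the repository root, and the src/ and libs/ directory names from the layout above:

import sys
from pathlib import Path

# Repository root, assuming this file lives at /project/repo/conftest.py.
REPO_ROOT = Path(__file__).resolve().parent

# Put each source root on sys.path so code under src/ and libs/ can import
# each other without being installed.
for source_root in ('src', 'libs'):
    candidate = str(REPO_ROOT / source_root)
    if candidate not in sys.path:
        sys.path.insert(0, candidate)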

Namespace packages are only useful where you have multiple packages (sdists/wheels, rather than importable modules) that can be installed/uninstalled independently. I posted some more about this recently in How to best structure a large project into multiple installable packages.

But as I said, I don’t think you want this. Stick to one big package for all your self-contained code, treat libraries as libraries, and use the simplest packaging approach possible.

If you are still interested in that, I have a setup.py that I believe fulfills these requirements (the sdist and wheel should be installable). But it does not work well with editable installations - not because of namespaces directly, but because of all the package_dir modifications it requires. Editable installations can deal with the typical package_dir modification required for a src layout (package_dir = {'': 'src'}), but anything more than that and it becomes impossible.

Now, as others have already said: I would not recommend any of that. Your actual requirements do not seem to call for such an unusual directory structure. Stick to a standard src layout and then split your code into normal subpackages (I do not even see the need for namespace packages here): my_project.core, my_project.tools, etc.


import setuptools

def _main():
    all_packages = []
    package_dir = {}
    # Collect packages from each source root and record where each one lives.
    for source_dir_name in ['src', 'libs']:
        packages = setuptools.find_namespace_packages(where=source_dir_name)
        for package in packages:
            # Map each package (e.g. 'my_namespace_pkg.foo') to its directory
            # (e.g. 'src/my_namespace_pkg/foo').
            package_dir[package] = '{}/{}'.format(
                source_dir_name,
                package.replace('.', '/'),
            )
        all_packages.extend(packages)

    setuptools.setup(
        packages=all_packages,
        package_dir=package_dir,
        # see 'setup.cfg' for the rest
    )

if __name__ == '__main__':
    _main()

Not fully tested, but it seemed like it would work. No guarantees. There are probably better ways.
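For comparison, the standard src layout recommended above needs none of that. A minimal sketch, assuming a single src/ root containing a regular my_project package with ordinary subpackages:

import setuptools

setuptools.setup(
    # Everything lives under a single source root: src/my_project/...
    package_dir={'': 'src'},
    packages=setuptools.find_packages(where='src'),
    # see 'setup.cfg' for the rest
)

Editable installs handle this case without any trouble.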


Also out of curiosity: what is this story of “it’s a web app, so I do not need to package it correctly” that keeps popping up? Where does that come from?
(By doing that it seems like you are excluding yourself from a huge variety of Python tools that rely on correct packaging.)

So, yes - the goals are twofold:

  1. better separation of concerns/components among a project’s diverse codebase, while importing code from across components under the same root package/namespace.
  2. automatic search path (while developers are developing, they don’t have to inject a PYTHONPATH into their shell, or add sys.path.insert(0 ... in code, in order for the code under development to be recognized). When setting up a virtual environment, the paths to all the source roots get added to sys.path and things just work.

Before you pointed out the thread mentioning the Azure SDK for Python, I had ended up finding a similar solution: embedding a setup.py at each component subdirectory’s source root. Each component then shares a common base package that is a namespace package (like azure in that project). There are few enough of them that I don’t need an uber setup.py, and I just initiate packaging via a Makefile.
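Roughly, each of those embedded setup.py files only packages its own slice of the shared namespace. A sketch (the distribution name is illustrative, and my_namespace_pkg itself has no __init__.py, per PEP 420):

import setuptools

setuptools.setup(
    name='my-project-libs',  # illustrative distribution name for this component
    # Only pick up this component's portion of the shared namespace package.
    packages=setuptools.find_namespace_packages(include=['my_namespace_pkg.*']),
)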

src/ and libs/ probably weren’t the best names to use, especially libs/. It’s all code related to the same project, just componentized. E.g. web APIs, background processes, DevOps scripts, data pipelines / ETLs, etc.

I may end up throwing it all back under one single wide source root if this experiment doesn’t pan out well. The hypothesis being tested here is that in an expansive project, it is easier to reason about where to store and find things (things that aren’t always Python code) if you focus first on the concerns being addressed by the version-controlled assets, come up with a top-level set of folders that address those concerns, and then add Python code beneath those top-level folders wherever a Python app or script can serve a purpose.


Yeah, I had a similar routine in setup.py, but it kept yielding an issue where there were effectively two or more keys with '' as the source root, because there were multiple source roots. That brings me back to the conclusion that there needs to be one setup.py per source root.

It’s not that it doesn’t need to be packaged correctly. The namespacing (packages, folders, modules) is all kosher. It’s just that we don’t distribute it as a reusable package for others beyond publishing the code to our version control system.

Hatch makes file inclusion a lot easier (see the Build page of the Hatch documentation).

You can just do e.g.:

[tool.hatch.build.targets.wheel]
packages = [
  "libs/my_namespace_pkg",
  "src/my_namespace_pkg",
]

Editable installs will work too.
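With the original two-root layout, a quick smoke test after pip install -e . is to import modules from both halves of the namespace (module names taken from the layout earlier in the thread):

# Both source roots contribute to the same namespace package.
import my_namespace_pkg.foo.foo_module   # from libs/
import my_namespace_pkg.baz.baz_module   # from src/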