Do we want to keep the `[build-system]` default for `pyproject.toml`?

I had someone at work ask me why something called setuptools was yelling at them about “Multiple top-level modules discovered in a flat-layout”. It turned out they had run `pip install .` in a directory containing a project with a couple of Python files, without putting the files into a package or a `src/` directory.

I realized this came about because of the default we have for `[build-system]`, and they are not enough of a Python developer to understand what the error meant. Should we consider dropping the default and making it opt-in via your installer? That would probably have made the failure more obvious, as pip could say, “please define a `[build-system]` table in `pyproject.toml`”.

4 Likes

The default for `[build-system]` is required for backwards compatibility with the huge number of past, and probably many current, projects that only provide a `setup.cfg` and/or `setup.py`; pip just stopped special-casing setuptools-specific files.
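For reference, when `pyproject.toml` has no `[build-system]` table, the spec-defined fallback is a setuptools-based build. The effect is roughly as if the project had declared the following (a sketch only; the exact `requires` pin and whether `wheel` is included varies by installer):

```toml
# Roughly what installers assume when [build-system] is absent.
# Sketch only: the exact version pin differs between tools.
[build-system]
requires = ["setuptools>=40.8.0"]
build-backend = "setuptools.build_meta:__legacy__"
```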

But that doesn’t preclude a better error hint when building a source tree without a `[build-system]` table or `pyproject.toml` and hitting a failure.

5 Likes

IMO, it would be a lot better for setuptools to error out with a clear message if there isn’t enough explicit metadata to know what name and version is being built, instead of trying to provide a no-configuration experience.

I do think that a no-configuration experience like that ought to live in a different backend, rather than be the implicit default.

5 Likes

What were they expecting it to do? Was this supposed to be a pip-installable project, or just a collection of modules plus a `pyproject.toml` without any `[project]`/`[build-system]` config? And was this someone somehow misled into thinking `pip install .` might do something meaningful?

4 Likes

I think Brett was saying that the user has a simple enough package that setuptools was working, but they didn’t understand “why” or “how” their package was being built. When they made changes that resulted in setuptools errors and warnings, the name setuptools appeared to this user for the first time.

That implies that there’s not a setup.py or setup.cfg? Or were those in place, but created by some other team member?


I can’t imagine flat out dropping the default for legacy packages with setup.py/setup.cfg.

So I’m not sure I’m clear on the idea and what it would mean for this to become opt-in. Can’t build frontends (including installers) warn in this situation today? So pip could print that message, but still support the default setuptools behavior?

1 Like

For full context, this is what the error message looks like:

    Multiple top-level {packages/modules} discovered in a flat-layout: {list of packages/modules}.

    To avoid accidental inclusion of unwanted files or directories,
    setuptools will not proceed with this build.

    If you are trying to create a single distribution with multiple {packages/modules}
    on purpose, you should not rely on automatic discovery.
    Instead, consider the following options:

    1. set up custom discovery (`find` directive with `include` or `exclude`)
    2. use a `src-layout`
    3. explicitly set `py_modules` or `packages` with a list of names

    To find more information, look for "package discovery" on setuptools docs.
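For concreteness, option 3 from that list can be expressed directly in `pyproject.toml`. A sketch, assuming the single package the user actually wants to ship is called `mypkg` (a hypothetical name):

```toml
# Sketch only: explicitly list what to package instead of relying
# on automatic discovery ("mypkg" is a hypothetical package name).
[build-system]
requires = ["setuptools"]
build-backend = "setuptools.build_meta"

[tool.setuptools]
packages = ["mypkg"]
```

With an explicit `packages` list, stray scripts or scratch directories in the project root no longer trip the flat-layout discovery check.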

Now, something that I personally don’t think should be in the error message: if you don’t want to install the project, don’t run `pip install .`.

For the time being this is not the direction the project wants to follow.

2 Likes

Did the user intend to install the contents of the directory?

If yes, then the error message from setuptools should provide helpful hints about what went wrong. There’s always room for improvement here, and suggestions are welcome.

If not, it doesn’t seem reasonable for setuptools to guess that, despite running `pip install .`, the user did not want the project to be installed.

I’m aware, as I helped write the spec. :wink: But the point I’m going to make below is that I think it’s different when you are doing `pip install .` and pointing directly at a project, versus indirectly trying to make some sdist or source-tree dependency work.

To get their dependencies installed. I realized last night that the reason they were taken aback is that `uv sync` doesn’t run setuptools if `[build-system]` isn’t set; it just installs one’s dependencies. So they were fine when doing `.venv/bin/python app.py`, as the dependencies got installed.

Actually, Anderson was right with the error message they saw.

Not specifically; they just wanted it to work.

I think the bigger thing here is that `pip install .` is the only way to install from `project.dependencies`, but that unconditionally tries to install your project code as well. There’s Add `--only-deps` (and `--only-build-deps`) option(s) · Issue #11440 · pypa/pip · GitHub suggesting a flag to skip installing the project, but I’m wondering if uv’s approach of skipping the project install, when you are pointing directly at a source tree with no build back-end defined, makes sense.

2 Likes

Can you elaborate on the difference you mean between “pointing directly at a project” and “source tree”? It is not clear to me.

If I do pip install <dir>, then I’m pointing pip directly at <dir>. And by “source tree” I mean Source distribution format - Python Packaging User Guide.

Still not sure what the distinction is, sorry. Given that pip only builds using PEP 517 now, when is <dir> not a “source tree” in this context?

I think that is the problem here. The user has somehow misunderstood what pip install . does - possibly by an incorrect analogy with uv sync.

As far as I know (please tell me if I’m wrong so that we can fix it!) pip’s documentation makes no suggestion that pip install <something> will ever not install <something>. The fact that uv allows “non-package” directories that contain a pyproject.toml where you can ask to install what’s in dependencies without installing the “project” is a UI design that uv chose to implement, but it’s not standardised - there have been discussions here about such “non-package projects”, but no consensus or proposal has yet emerged.

PEP 517 is the relevant standard here, and it’s the one that states that when there’s no build-backend key, tools should invoke setuptools. As a standards-based installer, this is the behaviour pip implements (and IMO should continue to implement, until PEP 517 is superseded by a new standard).

IMO, that sort of change of behaviour based on something that subtle is a fairly significant bug magnet. I don’t even know what “pointing directly at a source tree” means - is https://github.com/user/project/main.zip a source tree under your definition? Or \\file\share\project_libs\core_utils?

Pip’s answer to this feature request is pip install --only-deps .. But (a) the exact semantics are still under discussion, and (b) it still works on the model that . is an installable package source tree, not some sort of adhoc “project” that isn’t intended to be built into a wheel.

3 Likes

Sorry, I didn’t mean to suggest it wasn’t a source tree. What I meant to suggest is it should only matter when you point pip at a source tree.

I don’t think it does, but unfortunately blog posts and AI don’t spell that out (these folks are C# developers).

Yep, me bringing this up to see what people currently think about the default we have in the spec and the general expectation around it and whether we should change it.

Pointing pip at a directory that has a pyproject.toml.

That gets into an interesting question of zip files and directories and inline script metadata as well; what is the “scope” of this metadata we have standardized in terms of the file system?

I think there’s several issues:

  1. Setuptools’ heuristic for non-`src/` layouts of picking the only directory or module as the thing to package is fragile in counterintuitive ways. Just having a script or directory in the root of the project that isn’t in the exclude list and doesn’t have a `.` prefix is enough to trigger the Multiple top-level {packages/modules} error. Flit’s model of enforcing import name == install name, or the heuristic that hatchling uses of assuming the distribution and import names are the same[1], are much more robust in this regard.

  2. That Multiple top-level {packages/modules} error is tailored towards the very rare case of the user legitimately wanting multiple top-level packages in a wheel, rather than the much more common scenario of the user merely having a loose directory or scratch file. Maybe it would have worked out better if the error message were tailored towards that scenario, listing the things that it thought might be packages and suggesting to use:

    # pyproject.toml
    [tool.setuptools.packages.find]
    include = ["the-right-one"]
    
  3. There’s the mistaken assumption about what `pip install .` does.

  4. And then there is the issue that having an implicit backend causes – that the user can see a setuptools error without setuptools being mentioned anywhere in the project.

Overall I think 4 is the least significant. It sounds unlikely to me that the user would think to run git grep setuptools, find the reference to it in [build-system] and then be any closer to solving their issue for it.


  1. or sluggify to the same name ↩︎

2 Likes

I think this is a good idea, but I think the opt-in should be via “is there build-related information?” instead. In particular, I do think it’s reasonable to allow build frontends flexibility on the behaviour here to provide a better user experience.

Even more specifically, I think what we should try to enable is allowing a build frontend to refuse to build if a [project] table or [build-system] table does not exist (and a pyproject.toml does). uv prints a warning for this case today, and I’d be personally happy for pip to make this an error as well.

IIUC, that would be sufficient to allow installers to provide the appropriate user experience here - which IMO is an error with guidance on how to add proper metadata.

(yes, I’ve changed my position on where this UX fix should live)

Update: I’ve filed Allow errors for missing `[build-system]` table with no metadata by pradyunsg · Pull Request #1944 · pypa/packaging.python.org · GitHub as a specification update, to permit this.

4 Likes

I think what pip might do for these cases is instead present a message indicating that it’s using setuptools because it’s the implicit default, rather than changing behaviour, so as to fully defer to setuptools on the specific mechanics of the build. We have a few too many projects “in the wild” that do this today[1], and converting them into an error would be an issue IMO.

I feel less strongly about cases where the user didn’t have explicit [project] / setup.py, in which case I do think that it should be an error, because there’s no indication that the project was meant to be installable in the first place.


  1. This == have a pyproject.toml file, without a [build-system] table. I considered checking pypi.org, but it’s grown enough that this isn’t a “quick check”… this point doesn’t need empirical data for people to trust it, though ↩︎

3 Likes

I’m okay with the language in the GitHub issue but two things:

Does it address the original issue here? I thought that the point was there wasn’t a pyproject.toml in the user’s directory, but I’ve misunderstood a lot of the context here, so maybe there was.

And is it possible for setuptools to install correctly without a setup.py, setup.cfg, or pyproject.toml? Because if so, I’m likely going to be -1 on pip throwing an error, as I don’t see this as worth breaking existing workflows, though I would be okay with a warning.

Currently, it doesn’t get invoked by installers in those cases. See https://gist.github.com/pradyunsg/1cb8a7401320136babc8e200d0a5e8eb#file-empty-dir.

Edit: Is it possible? AFAICT, yes: setuptools will create a package out of an empty directory if invoked. It just does not get invoked today, so this behaviour should not be a concern IMO.

I believe so – it’d present the error that OP is asking for here. See https://gist.github.com/pradyunsg/1cb8a7401320136babc8e200d0a5e8eb#file-empty-pyproject-with-two-python-files (which is the error the user sees today, based on the discussion here).

In all the cases in the Gist, I’m trying to make the case that good behaviour would be an error from the installer, instead of implicitly running a build in some of those cases; the installer has no indication that these directories should be built. The specific change I’m suggesting is reflected by the PR linked in my previous post.

They did have a pyproject.toml, but all it had was a [project] table, and it was mostly for project.dependencies (i.e. no [build-system]).
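As a rough illustration (all names hypothetical), the file described would have looked something like this: a `[project]` table used mainly to record dependencies, with no build configuration at all:

```toml
# Hypothetical reconstruction of the kind of file described:
# a [project] table present mostly for its dependency list,
# with no [build-system] table anywhere.
[project]
name = "app"
version = "0.1"
dependencies = ["requests"]
```

This is exactly the shape of file where `uv sync` installs the dependencies without building the project, while `pip install .` falls back to the implicit setuptools backend and tries to build it.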

I keep coming back to this topic but getting stuck on the same thought:
“Why was the user running pip install . if that’s not what they wanted?”

It seems like they really wanted something like pip install $(toml get pyproject.toml project.dependencies).

As a result, it feels like the user would have been “happier” with

[dependency-groups]
main = [...]  # instead of project.dependencies

+ pip install --group main.

But that’s not socialized as one of the canonical ways of handling application dependencies.

And “happier” is in scare quotes because the advantage of this setup is that `pip install .` just hard-fails. In practice, it might be an even worse user experience.

I was incorrect about the user story before, so I’ll ask: am I at least understanding the situation better?

3 Likes