PEP 771: Default Extras for Python Software Packages (Round 2)

Isn’t that also the case if a package adds a new required dependency and says “we split the library part into example-core, example is now the application part”?

I don’t disagree that this makes it possible for a dependency to change in meaning, and probably sooner or later someone will try to start a flame war on Reddit (joke’s on them, I don’t visit Python Reddit!). But I’m not clear on how this is new as a problem, given that the same sorts of changes can happen to packages anyway.

I am most likely missing something, but it looks to me like the extras are trying to achieve something like pyqt || pyside? Is what is missing a way to express a choice between requirements?

1 Like

True, but unlike the case where a package adds a new dependency (where any complaints fall onto that package, because the dependants cannot change anything), default extras require support from installers and the rest of Python packaging. It seems unlikely that both pip and uv will gain support at the same time (putting aside other installers/workflow tools), so different users will see different results, and suddenly the tools you are using and their settings become even more important. That doesn’t mean default extras are inherently bad; I just think using them for discovery (e.g. listing them on the PyPI page, tools like poetry asking users whether they want to include the default extras in their config file, IDEs suggesting their installation) is more likely to have a net-positive effect across the ecosystem than changing what a dependency specifier means.

1 Like

I think this is an important point which we’ve been talking around so far, but have never really addressed head on.

Currently, the requirement foo specifies a minimal working installation of the project foo. Installing this requirement will include all required dependencies for the project and nothing else. That’s behaviour that has been consistent since requirements were first introduced, and people rely on it. This PEP fundamentally changes that meaning, such that foo will now install the required dependencies, plus any recommended dependencies the project specifies. To get a minimal install, the requirement foo[] is needed instead.

Regardless of the technical details, this is a huge change in meaning for the idea of a requirement, and providing a way to get the “old” meaning with a new spelling is not backward compatible, it’s simply a way for users to fix the backward incompatibility (if they are able to modify the requirement, which isn’t always the case).

Like it or not, that’s the reality here. It really doesn’t matter whether the current meaning of foo provides the best user experience, or whether we’d have done something different in hindsight. We’re breaking backward compatibility by changing the meaning of a bare requirement, and we need to decide whether that’s what we want to do.

IMO, that’s too big of a compatibility break for me to be comfortable with, and I’d rather we looked for alternative solutions that kept the current semantics intact. Maybe there aren’t any such options, and we have to reconsider what we’re willing to pay to get a better experience for the “give me the recommended bundle to go with this package” scenario. But right now, I think we’re focusing too heavily on the “default extras” approach, because superficially it feels like a small change when actually it isn’t.

8 Likes

This is certainly what I’m seeing in this discussion.

I’m still convinced that splitting example into example-core and only having metadata for example is the way forward (and maybe we need a build backend for this that doesn’t mind having multiple definitions in the same repo, which is the only limitation I can imagine that makes it harder to upload these two packages than one[1]).

But failing that, perhaps an alternative to the default extras approach is a marker that allows a non-default extra to “reset” the requirements and start fresh? So if a particular extra is specified, the default requirements of the package are ignored and only the extra’s requirements are included. That would let example[core] (or choose your own name) become possible, and won’t affect any users whose setups currently work (even if they’re emulating it using --no-deps equivalents).

The downside is repetitive metadata, but of all the downsides we’ve seen so far, I’m quite happy to impose some repetition on the package developers who want this if it saves everyone else from it.
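
Purely as a hypothetical sketch of that reset idea — the reset-extras key and the tool table are invented here, not real or proposed syntax anywhere, and the package names are made up:

```toml
[project]
name = "example"
# Unchanged: today's full dependency list stays the default install.
dependencies = ["useful-core-lib", "shiny-gui-toolkit"]

[project.optional-dependencies]
# The repetition mentioned above: the minimal set is spelled out again.
core = ["useful-core-lib"]

# Invented key: extras listed here REPLACE `dependencies` rather than
# extending it, so `example[core]` yields a minimal install while a bare
# `example` behaves exactly as it does today.
[tool.hypothetical-backend]
reset-extras = ["core"]
```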


  1. Noting there is no content in one of them! Just dependency specifiers. ↩︎

4 Likes

Default extras as specified already do that - explicitly mention any extra and the default gets disabled. But that doesn’t alter the fact that the meaning of the project name with no extras specified has changed from “package plus essential dependencies” to “package, essential dependencies, plus some non-essential but recommended dependencies”.

Doesn’t that reconstruct the same scenario as the default extras approach? The minimal install is now requested with example[core] rather than example[], but it’s the same thing.


I still think the discussion has focused a little too much on the assumption that users can and should be able to “get back to” a minimal install.

For use cases like my own and the flask/werkzeug one, this is a way of providing a more minimal install without breaking current usage. Current requirements can become defaults and the minimal package footprint can shrink.

But also, it’s not uniformly better for your second and third order dependencies to be specified with “no defaults”. That’s a decision that needs to be made in each package about its dependency relationships. If people start aggressively adding [] everywhere “just in case”, things are going to break for them when one of their dependencies makes a feature they’re using optional.

Right now, if a package I’m using adds a dependency, I’m stuck with that dependency. I could reach into site-packages and surgically remove things, but I’m potentially breaking environment integrity. So “getting back to minimal” is not even on the table in such a case.


I don’t really agree with Paul that adding this feature incompatibly changes the meaning of an unqualified package name. It introduces a new possibility, that extras can remove dependencies. And we’re adding a magical extra named [] which may remove dependencies from packages.

i.e. I think this feature is equivalent or almost equivalent to what Steve suggested above, that extras can replace the dependency list. It’s just phrased differently.
(And maybe the phrasing is what matters? I’m not averse to that idea.)

2 Likes

There are a few things about this that make me a strong -1 on this ever being something I will use as a library author, and I’ll be disappointed if I have to work around this as a user.

  1. My experience with default features in Rust is somewhat negative, and that’s with there being a strong set of feature names like no_std that have expected outcomes.
  2. My experience with “Recommended” dependencies in Linux distribution packaging is even more negative.
  3. I don’t personally believe in bringing in more dependencies than have been evaluated, for multiple reasons ranging from potential security impact to plain bloat, and I don’t want a situation where the ecosystem encourages people to blindly pull in extras just for convenience because “well, the people that care can deal with it”.
  4. There are already gaps in testing for things like “maximum Python version, minimum supported dependency versions”, and I don’t see structuring dependencies in this way as likely to ever have a positive impact on the quality of testing or developer assumptions. I do think decisions made here will have an impact on this, whether they should or not.
2 Likes

I thought one of the motivations here is that this is not necessarily the case. The requirement foo doesn’t specify a minimal installation. It specifies “whatever the authors decided to put in project.dependencies”.

In some cases that means a bunch of default extras, because the authors came to the decision that their users almost always want those other packages installed. That isn’t a great design for the package, but it does seem to happen.

I think the use-case I described above would benefit from this idea. It might just be a different way to get the same feature, but it has a better compatibility story: nobody changes their current dependency list, but they have the option to add [core] as an extra that removes dependencies. So the change can be opt in for downstream users.

5 Likes

I think the current examples centre on “installing a package imperatively” rather than “installing a package declaratively”, where the latter can take many forms, such as PEP 621 project dependencies, PEP 723 script metadata, PEP 735 dependency groups, or requirements.txt files. I want to distinguish these because only one of those declarative forms is a “project” workflow in which there’s an intermediate package. I think the PEP is specifically missing examples for these declarative workflows.

I’m not sure I want to pick at the meaning of “pure install” here, but for the purpose of “direct requests”, I think other tools and workflows do support this, e.g., pipx install example (or uv tool install example).

The direct install behavior becomes complicated to reason about in more forms than just the “project” syncing workflow. I think pip install -e ., pip install -r requirements.in, or pip install --group example are good examples. The definition for a case like uv run pep-723-script.py is also unclear. I think we agree on the outcome here, so I don’t think we need to pick at the details, but I wanted to note more of the relevant cases so they’re considered in future discussions.

I actually care a lot about “projectless” workflows, and want to make sure that those users continue to be served by new tooling. I think we’ll get off-topic going into it here, but we should find some time to talk about this elsewhere. We might be more aligned than it feels.

On this point, I agree — a consideration of more workflows is all I was asking for, not a removal of the pip examples.

2 Likes

Yeah I think people are sort of forced into this, because it’s otherwise too hard to get the additional, common dependencies into user’s hands.

To make sure we’re on the same page here… I think it’s breaking forward compatibility, but not backward compatibility, in the sense that installing a package from a PEP 771 aware tool would change the result. So you could be missing expected packages if you’re switching from an aware tool to an unaware tool, which is breaking; but going the other way, you will still get all your previous dependencies, which isn’t breaking.

Is the forward compatibility breakage what you’re concerned about? It sounds like you’re more concerned about the change in meaning, which I think is something we should carefully consider, but aren’t most new features a change in meaning? e.g., when the [build-system] configuration was added, the behavior of pip install for a package changed and the ecosystem had to add options to opt-out of and control that new behavior.

I think this framing is helpful, especially since people are already declaring non-minimal dependencies in the project.dependencies table.

2 Likes

I think there’s two different types of packages that could adopt this and the compatibility story is different.

  1. foo was previously including a bunch of stuff in project.dependencies. They move those to “default extras”. install foo doesn’t change if your installer supports the new feature. If it doesn’t, then you’re suddenly missing stuff you previously installed. This PEP includes a metadata bump, so installers will warn about it, but warnings can be missed and this will break workflows.
  2. bar only depended on the minimum before, but now that they have this feature they opt to define default-extras for the common use case. If your installer doesn’t support that, you get a warning but nothing changes for you. If your installer does support it, install bar gets you a bunch of extra stuff you didn’t ask for. This is probably not breaking[1] but it would be annoying, especially if bar was a transitive dependency of something else and so it’s hard to specify bar[].
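
For a package like foo in case 1, the metadata change would look roughly like this (the key name reflects my reading of the current PEP 771 draft and may still change; all package names are made up):

```toml
[project]
name = "foo"
# Previously these extras' contents lived in `dependencies`;
# now only the true minimum does.
dependencies = ["foo-core-helper"]
# PEP 771 (draft): extras to install when none are requested explicitly,
# so a bare `pip install foo` still pulls in the recommended set.
default-optional-dependency-keys = ["recommended"]

[project.optional-dependencies]
recommended = ["nice-to-have-one", "nice-to-have-two"]
```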

I think 1 should be considered a backwards-compatibility problem on the part of foo: my project worked before, with an existing tool, and now it doesn’t. But probably that means I should add those packages to my own dependencies, rather than relying on foo to grab them for me…

2 is maybe more of a social problem–if the authors introduce defaults that most users didn’t want, that means they misread their audience (maybe they didn’t realize their package was being used in a certain way).


  1. unless it clobbers a dependency you did want, i.e. there are conflicting package names ↩︎

3 Likes

I’m not sure I understand the distinction. There’s no change in behaviour until a project adds a default-extras key. But when they do, old tools will install the wrong set of packages. That’s the only actual change. But there’s also a semantic change, in that the requirement foo currently installs the foo project’s dependencies and not the optional dependencies. Under PEP 771, you have to start thinking of the requirement foo installing the foo project’s dependencies and any default extras (of which there may be none). That’s less about the behavioural change, and more about how we teach people what the concept of a requirement means.

Backward? Forward? :person_shrugging:

The first incompatibility can be addressed by projects not adding default extras until tools which are not PEP 771 aware have effectively vanished. Or by projects wanting to use default extras increasing the complexity of their install instructions with a bunch of “if you have a pip version older than X…” qualifications.

The second is much harder to address, because changing people’s views of established concepts is hard. It’s mitigated by the fact that the average user probably doesn’t really think too hard about the distinction between required and optional dependencies. But for experts, it’s a lot more difficult - witness all the trouble the experts have in this discussion, trying to figure out the implications of the new semantics…

2 Likes

Declarative installation would definitely help @zanie.

For @bwoodsend and others, it’s not only Qt but also different backends for visualization as well. We do a check at runtime. However, each hurdle to installation leaves the scientist with the opportunity to just go with a paid solution.

Honestly, as a bench scientist, I could imagine someone just putting into an LLM “Give me an environment with Python 3.13, napari, pytorch, and narwhals running on my system which has xyz GPU on a cloud provider instance”.

I honestly don’t know the best way forward. @zanie Maybe a tutorial on declarative installs for maintainers in Scientific Python makes sense since we also pull in some rust optimized stuff too.

3 Likes

I just watched an OmegaBall game which is football/soccer played in a circular field with 3 goals and 3 teams. Thinking about packaging in different ways makes sense. I’m very encouraged by variants, wheel-next, declarative options, and standards to improve opportunities to satisfy multiple user groups.

Thanks @pf_moore for the suggestion about PyPI’s command re: pip install napari. It probably doesn’t help, but I’m not sure the site is causing the issue, since the README has the install instructions and links to the docs.

The resolver would expand flask into flask[werkzeug]. After that, normal extra merging logic applies, so flask[] | flask[werkzeug] = flask[werkzeug] and pulls in werkzeug-dev-server.

Essentially I think the idea is that internally, a bare package name is just shorthand for a dependency on that package with its default extras, so instead of trying to figure out extra subtraction, we keep the existing always-additive extra resolution logic.

But yes, the PEP should explicitly describe how this should work.

3 Likes

I’ve realised there’s another group we haven’t considered so far: non-installers which use the dependency metadata. This includes well-known tools like dependabot, but there’s probably a long tail of tools, some of which would deal with security. So any mismatches in support there could cause additional ecosystem issues. I think this is probably a minor issue overall, but probably needs thought as to how this PEP should interact with such tools.

3 Likes

I started explaining how it was different and realised that it’s still got the same issues.

So I guess I’ll have to remain -1 on the entire idea, and continue to recommend “just split up your package if you care this much” to package developers (which already works and doesn’t require a PEP).

4 Likes

Thanks to everyone who has been chiming in! I’ve been trying to digest this all for the last few days, and I think there are a lot of interesting points being raised. Thanks also to @sirosen for the useful summary of the current main concerns with the PEP!

I completely agree with @willingc and @pf_moore that we should try and focus on how we can change the PEP to reach a consensus, and I am not personally tied to the present technical solution. I am open-minded about switching to a different approach if we can find one. Having said that, we are not rushing this - this is a discussion that started almost five years ago, so there has been plenty of opportunity for another solution to come up, and so far the one presented in the PEP is the one that has had the broadest consensus. Whether it should be accepted or not is another matter, but I just want to make it clear that this is not a rushed idea and that many alternatives have been considered.

There’s a lot of things I’d like to follow up on, but to keep things manageable, I’d like to focus on some of the general concerns that are not specific to extras, because I think we need to make sure we reach consensus on some bigger picture aspects before delving into the detail.

Let’s consider the approach based on the package splitting approach (whereby package-core is a base package with minimal required dependencies, and package is a meta-package that brings in optional additional dependencies). As we’ve discussed here, and is discussed in the PEP, this is already possible although it is not currently a trivial amount of work, which is why a number of people have said here that they have not or will not do it.

Now imagine that we were to write a PEP proposing a mechanism that makes it easy for people to declare additional meta-packages in pyproject.toml that would be built and published at the same time as the main project (and would automatically handle version pinning, etc.). The details of how this would be achieved in pyproject.toml are not important here; what matters is: what if we made it really easy for packages to build meta-packages to solve this problem?

This approach would likely meet some of the same criticism as the current PEP, including that packages switching to it would introduce bloat into the ecosystem, because the default package name package would now mean the package plus optional dependencies. If a single package that a user installs has not updated its dependencies to be package-core instead of package, the user will get the full installation (in fact the situation here is a bit worse than under PEP 771: since package-core would not exist prior to a specific version, it would be difficult to depend on both old versions of package and more recent versions of package-core). Package developers for existing packages would be changing the semantic meaning of package in the process, just as with the extras approach.

But, of course, the split package approach is actually already possible, just not trivial, and so some of the concerns above already apply to the current ecosystem. The difference between the status quo and the hypothetical PEP mentioned above or PEP 771 is that we want to make it easier - as @sirosen already pointed out above:

So forget about extras - there is a fundamental question here about whether we should make it easier for package to potentially change meaning and for getting a minimal installation to require a little more effort. If this is something we can’t reach a consensus on, then the details of how we achieve this don’t really matter.

As a side note, the status quo is not without cost. Imagine that we simply can’t reach consensus on the above general point. That would mean not providing any easy way for developers to make package mean a package appropriate for most users, with some opt-in mechanism for getting a minimal installation. At that point, some projects might decide to make the required dependencies include more than the strict minimum (or keep it that way if it is already the case), which would definitely not be optimal, since it would increase bloat with no way of opting out. Other projects might switch to the split-package approach, which as we saw above shares some of the same concerns as the extras approach. While one might argue that projects that want to do this will already have done it, I think failing to reach consensus here could be a trigger for some packages/projects to adopt other approaches which may not be as optimal.

I also think that while we are worrying a lot about misuse of this feature, and assuming many developers will use it without understanding the consequences, more widely used packages are in general developed more responsibly and carefully, and I don’t expect many of them to switch to default extras (or whatever approach this PEP settles on) blindly.

2 Likes

Thanks for the analysis, which was useful.

Here’s another possibility, which I think deserves a similar analysis/review. Why not just formally adopt a convention that the extra name “full”[1] always means what would go in the default extra under PEP 771? Then there’s no user confusion, and people wanting the default if there is one could use pkg[full] - we could even standardise that tools must ignore an extra named “full” if the package doesn’t provide one to make this usage safe to use all the time, if necessary.

Basically, it’s the same proposal as PEP 771, but with the default extra being opt-in rather than opt-out - the advantage being that opt-in is backward compatible. The cost is that it’s slightly less discoverable, but once the new convention is solidly established, it shouldn’t be too bad[2] (we can educate new users by explaining that [full] is just the same as “with recommended packages” in other package managers).
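
Under such a convention, adopting the opt-in [full] extra would be a purely additive metadata change (package names here are made up):

```toml
[project]
name = "example"
# Unchanged: a bare `pip install example` stays minimal.
dependencies = ["essential-dep"]

[project.optional-dependencies]
# The conventional opt-in "recommended" bundle; the documented install
# command for most users would become `pip install example[full]`.
full = ["recommended-gui", "recommended-cli"]
```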

Putting it in similar terms to how you analysed the split package approach, people have said they don’t like having to choose a particular extra name for the “default” extra, and teach their users to use it. How about if we made that choice, and the process of informing users about the option to have a “recommended” install, easier, rather than changing what a plain install means?

Personally, I’m thinking mainly about compatibility, and how misunderstanding the new feature (rather than misuse as such) could result in unforeseen issues. Maybe this is something that could be addressed under “How to teach this”, by explaining how we ensure that package developers do understand the consequences. And to be clear, I’m not talking about big projects like astropy; I’m thinking of users relatively new to packaging, who have mostly just struggled through the maze of information to find something that works. Or maintainers of older packages who have done the bare minimum over time to keep up with changes to the packaging ecosystem, and who just wish things would work, without “best practices” changing every 5 minutes. Or package authors who don’t care much about the niceties of package management, and are faced with a well-meaning PR from an enthusiastic but short-sighted contributor, that describes itself as “Add support for PEP 771”. But ultimately, I just want the PEP to address the question of what happens when things don’t take the happy path.

Any new feature can be used well. What matters to me is whether it tends to guide people into good patterns of use, and away from bad patterns.

While I appreciate the point that we should stick to the big picture, I think that the question of how to handle multiple independent types of plugin is very much a “big picture” question as well. To take a specific example, consider a data plotting package with a cli extra adding a command line utility, and a series of “display” and “datasource” extras that specify particular rendering and data access backends, respectively.

Artificially limiting the user interface to “here’s the recommended set of all of this”, with no options to change or remove one default while leaving the others as they are, is a technical limitation of the default extra proposal, and it’s something we should address. I don’t think it’s a good design in the abstract sense, and so if we’re looking at general concerns not specific to extras, I’d like to see “we should solve the whole problem, not just a small part of it” as one of those general concerns.


  1. Feel free to bikeshed the name later if this idea goes anywhere ↩︎

  2. I’m happy to accept feedback from maintainers of packages that want default extras, that a known extra name is worse than I’m claiming. But in that case, can we also accept the feedback from people with reservations about PEP 771, that changing the default behaviour is worse than the PEP 771 enthusiasts are suggesting? ↩︎

4 Likes