PEP 771: Default Extras for Python Software Packages

Firstly, I want to say that sunpy would find this very useful, as it would allow us to address the needs of both our direct users (who might write scripts, Jupyter notebooks, etc.) and packages which depend on us (or on parts of our library). Users running pip install sunpy would get a fully functional install where all methods work without additional package installation commands, while packages depending on us could either specify which subpackages they require, as they do currently, or opt out of any extras. (I am :+1: on a default way to opt out.)

Essentially, I think users are either ignoring the docs (in which case they won’t know about the extras), or they are reading them, in which case they can be shown how to make things work.

This is already the case for our users anyway: either they read the docs and install sunpy[all], or they don’t and have to run a load of extra install commands.

2 Likes

I agree that the complex cases need to be handled in documentation. Whether or not it’s strictly necessary for the PEP to walk through this under “How to Teach This”, it would probably be worth calling out the basic scenarios under discussion.

At a minimum, I think it’s important to make sure there’s clear guidance on what maintainers of different packages should be documenting.

  1. A package with minimal and default dependencies.
    • e.g., fastapi[standard]
    • How to install with the default. How to do a minimal install.
  2. A package with pluggable backends.
    • e.g., qtpy (this doesn’t actually define its soft dependencies as extras[1]; does anyone have an example which uses extras?)
    • How to install with the default. How to do a minimal install. How to select a non-default backend.
  3. A package with various non-exclusive soft dependencies.
    • e.g., sunpy (fantastic example of this, thanks @cadair for weighing in!)
    • How to install with the default. How to do a minimal install. How to select various options.

What I’m getting at is that saying “packages have to document it” may be right, but it’s worth thinking through what that documentation will look like and how it will be different for different kinds of packages.
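To make scenario 2 concrete, here is a rough sketch of what a pluggable-backend package’s metadata might look like (the package and extra names are invented, and the exact pyproject.toml key for marking an extra as a default is whatever spelling the PEP ends up with):

[project]
name = "somegui"  # hypothetical package with interchangeable Qt backends

[project.optional-dependencies]
pyqt6 = ["PyQt6"]
pyside6 = ["PySide6"]
# PEP 771 would add a way to declare "pyqt6" as the default extra, so
# `pip install somegui` pulls in PyQt6, `pip install somegui[pyside6]`
# selects the other backend, and the opt-out spelling gives a minimal install.

Scenarios 1 and 3 would look similar, just with non-exclusive extras instead of interchangeable backends, and the documentation question is which of those install commands to show first.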


  1. I tend not to either. I find that since I have to document it, I might as well just document the soft dependencies themselves. So I don’t have good examples from any of my own work. ↩︎

1 Like

I just wanted to chime in that this would be valuable to the Bokeh project. Bokeh currently has a server component that depends on tornado, which makes it difficult for users to install in WebAssembly environments. [1] It would be great for our users to be able to do:

pip install bokeh    # same as today, no breakage for current users
pip install bokeh[]  # does not include tornado, works for webassembly

or whatever syntax gets decided for “minimal”.

Summarizing: we’d be happy with any solution that recognizes that sometimes the minimal case is the special case, and provides a path to achieve that.


  1. AFAIK tornado will probably never be available in WebAssembly. ↩︎

2 Likes

All of these “ooh, I’d use this!” posts are just reaffirming my belief that this feature is going to be overused and exclusively misused, leading to caveat-ridden, bloated and buggy installations being seen as a feature rather than a design problem [1].

For all those who aren’t opposed to this change, please just grab a Docker container for a Linux distribution that already supports some variation of this, try setting up some applications or developer environments (or anything!) in those containers, and really get to experience what you’re asking for:

Take Debian/Ubuntu or Fedora (which have recommended dependencies) with and without the --no-install-recommends/--setopt=install_weak_deps=False flag to apt-get install/dnf install:

  • Notice how the bloat factor starts at about double and grows with the depth of the dependency tree. Even though only a small proportion of packages use recommendations, that proportion compounds at each layer of the dependency tree, giving much higher cumulative bloat than you’d imagine.
  • See how tenuous some of the recommendations are. e.g. fedpkg recommends podman and qemu (+368MB!) even though they’re both at the bottom of the list of recommended virtualisation backends.
  • See if you can find a shred of used functionality lost by dropping the recommended dependencies (in years of doing exactly this, I never have).
  • Look at the delta of minimal vs full package name lists and see if you can figure out why the recommended ones are even there. Notice how the bulk of them come from dependencies of dependencies and that such non-top-level recommendations quickly cease to make any sense.

A post-PEP 771 world will be much worse than the above because we have no single global --no-install-recommends opt-out. It’ll also be much worse in Python since Python has so many more ways to store or pass around or track or install lists of packages in which the opt-out can get lost [2]. (And also because I suspect recommendations will be (over)used more heavily by Python packagers, based on other people’s comments here and on the ways extras are already overused.)

Or look at Void Linux or OpenSUSE (which support interchangeable dependencies):

  • Observe how quickly things break on Void Linux because xbps thinks two packages are interchangeable when one’s really not due to lack of large matrix testing.
  • Rewind to about 6 months ago to when OpenSUSE Tumbleweed’s default [3] build system was incapable of building hello world for over a year because it couldn’t handle the ambiguity of interchangeable dependencies!

This feature will be much worse in Python because we have more tooling that needs to adjust and because the Python community as a whole already massively underestimates the need for [4] top-level packaging-style testing (look at how many projects still don’t even run a pip install . without the -e flag in CI and then are surprised when they publish broken wheels with missing files).

And interchangeable dependencies take so much more environment-level testing than people realise. Take the Qt example already in the PEP:

  • To do this properly, you need to test all backends in isolation plus multiple backends including and excluding the default (4+1+1 test jobs/separate venv environments so far).
  • You’ll quickly discover that people want some way [5] to force a preference, so now you need to cover the cases: default explicitly specified, non-default specified, and non-installed variant specified (+3 = 9 test jobs now).
  • But you’re not done yet. Not many people know that importing more than one Qt variant into a single session can, on Windows, cause one variant to load the DLLs from the other leading to anything up to and including a segfault so you’ll need to do some "PyQt6" in sys.modules trickery to protect users from that [6] and test that too (at least +1 test job).
  • But you’re still not done! You’ve probably been naively using try: import PyQt6; except ImportError: try the next one, but pip install package[PySide2] makes it possible to circumvent the PyQt6 >= minimum_supported_version constraints you (hopefully) put under the default extra [7], so you actually want a whole load of importlib.metadata.version("PyQt6") comparison checks and a for loop over the available backends to find the one that was actually installed with your package, rather than some stale version of a preferred backend that was already there. (+1 test for the for loop and +1 test for the explicitly specified backend being too old. 12 test jobs in total, and we haven’t even got round to multiplying by 5 Python versions and 3 OSs!)

And don’t get me started on what it’s like to triage a bug report with such a package involved. It’s already impossible to get a reproducible example out of a confused user with a matplotlib.backends issue who isn’t fluent in venv. [8] [9] This PEP would make that the norm. :upside_down_face:

WayTooLong/Didn’tRead [10]: Every ecosystem I know of that has some form of this feature is much worse off for it, and a simple comparison shows clear reasons why this proposed Python variant will leave us even worse off than that prior art. So to anyone who has explored other ecosystems’ variations of this feature, please show me how this proposal is supposed to do better than those existing ones, which are already a detriment to their corresponding Linux distributions. And to anyone who hasn’t, I honestly don’t believe you have any idea what you’re asking for. It’s certainly not going to be the well-measured world the PEP alludes to (and, like I say above, the compounding nature of dependency trees makes the threshold for “many default dependencies” much smaller than you’d think).


  1. usually a package that should really be two packages ↩︎

  2. or possibly not even be supported! ↩︎

  3. weirdly not the one they use in production ↩︎

  4. and generally isn’t very good at ↩︎

  5. if you’re really lucky, they might even stop at one way! ↩︎

  6. although you can’t do anything to prevent the user importing a variant after you’ve incorrectly guessed the one they want so you should still expect to get segfaults reported to the issue tracker of any vaguely related collateral package ↩︎

  7. Here I want to write a preemptive apology letter to whichever poor soul has to figure out and implement whatever pip check will even mean in this scenario ↩︎

  8. even a pip freeze is inadequate since half their environment is usually conda packages squashing pip packages squashing conda packages ↩︎

  9. well, short of asking them to clone their hard-drive and FedEx it to me ↩︎

  10. Sorry it got so long. I tried not to write this at all, but this whole proposal has been eating at me for months, ever since I saw the original setuptools issue. ↩︎

5 Likes

Wouldn’t it make more sense for WASM-specific dependency handling to be specified via WASM-specific environment markers? e.g. something like:

dependencies = [
    ...,
    "tornado; sys_platform != 'wasm'",
]

This feels like another one of those tantalising wrong uses of recommended dependencies.

1 Like

As far as I can tell, there is an important difference between APT’s Recommends and PEP 771: With Recommends, there is no way for a dependent package to “turn off” the Recommends of one of its dependencies. So if A depends on B which recommends C, by installing A without --no-install-recommends you’ll get C, even if A’s maintainer was certain that you’re not going to need it to use A and would’ve liked to disable the recommendation for C.

PEP 771 lets package authors do just that. So while Recommends can only ever add and therefore naturally tends towards bloat, PEP 771 allows packages “further up” in the dependency tree to prune it as appropriate, which should prevent the exact situation you describe in which recommendations from packages further down the tree don’t make much sense near the top anymore.

usually a package that should really be two packages

That points towards one of the ways in which PEP 771 could reduce bloat: right now, if my package A depends on B, which depends on scipy (~50 MB) for some of its functionality, but I only use the parts of B that don’t need scipy, my only option short of forking or vendoring it is to ask its author to please split their package in half. That’s a big ask, and I can’t even do most of the work for them (getting the repository and associated infrastructure set up, etc.). With PEP 771, I could just submit a PR turning their scipy dependency into a default extra and, once accepted, disable the default extra for my package’s dependency on B.
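As a rough sketch of that PR (A and B are hypothetical, and the exact pyproject.toml spelling for declaring a default extra, and for opting out of it, is whatever the PEP and installers settle on):

# B's pyproject.toml after the PR
[project]
name = "B"
dependencies = []   # scipy no longer unconditionally required

[project.optional-dependencies]
recommended = ["scipy"]   # declared as a default extra via the PEP 771 mechanism

# A's pyproject.toml
[project]
name = "A"
dependencies = ["B[]"]   # or whatever opt-out spelling is adopted

Plain pip install B keeps pulling in scipy, so B’s existing users see no change, while A gets the slim install.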

2 Likes

Just to be clear, PEP 771 doesn’t actually disallow such a flag, we just discarded the idea of this flag as being the primary way that one controls whether default extras are installed or not. But pip, uv, and other package installers are free to implement such an option if they wish (and it could be useful for some users).

2 Likes

Comparing Python to a Linux distro in a Docker image might be a bit extreme. Nothing about that is minimal, with or without defaults. You could also compare with Conda, which has no concept of extras (default or not) at all, and has a ton of bloat, with basically everything depending on everything possible. Maybe a better comparison would be the Rust ecosystem; Cargo has default-enabled features, and AFAIK everyone is very happy with them. Though I’m not very deeply integrated into the Rust community, so perhaps there’s something I don’t know. I found it extremely user-friendly to just add dependencies, and then later go over them (there’s even a tool to find unused default dependencies for cargo!) and trim out the ones I didn’t end up needing before finalizing a project.

Any feature can be misused, but I think there are lots of places where this is very useful in helping newcomers use Python while allowing maintainers to also provide better support for minimal dependencies. I’m interested in this for build; today, we can’t depend on virtualenv, since we have to support minimal-dependency bootstrapping, but the most common issue is “build can’t make a virtual environment”, which nearly always has the universal solution “install virtualenv and it will work”. It would also be nice to eventually provide a way for users to opt out of a couple of other dependencies we do add to the required dependencies today. That would mean environments could get slimmer, not just fatter, with this feature! Though I think, due to backward compatibility (old pip won’t get the defaults), that would take time.
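As a sketch of how that could look for build (purely hypothetical on my part - the version pin is omitted and the default-extra spelling isn’t final):

# hypothetical fragment of build's pyproject.toml
[project.optional-dependencies]
virtualenv = ["virtualenv"]
# declared as a default extra under PEP 771: `pip install build` would pull in
# virtualenv, while bootstrapping consumers opt out (e.g. `build[]` or an
# installer flag) and fall back to the stdlib venv module.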

Like every packaging feature, it comes with responsibility: if you don’t test with minimal dependencies, it might be broken. You could misuse it to add too much by default or make the minimal case too minimal. But that’s a general issue; minimum version limits have the same problem, along with many other things. This sort of problem can be improved with good documentation on how to test, static analysis, etc. For example, if you ran into the missing-files problem from only ever installing with -e ., my check-sdist tool likely would have caught that. Like most good features, this one adds “package developer” responsibility but simplifies the “package user” experience.

Wasn’t one reason to have package[] not install default extras due to technical limitations in pip? IMO, it helps enforce that package[thing] doesn’t also install [default]. I think the other reason was that some use cases, like specifying a backend, would want to disallow the default case, though IMO someone writing package[] probably has a reason for doing so. I would not be surprised if I tried to install a tool with a required backend, left this empty, and got no backend.

Edit: Another example: I finally caved in to demands and added the pyproject builder extras to the required set for scikit-build-core. But having default extras would not only make it possible to do minimal installs again, but would even mean plugins (like the setuptools and hatchling plugins) would no longer need these extra two packages, making isolated builds faster!

Edit: The cli extra for repo-review and the all extra for validate-pyproject would be good default candidates.

7 Likes

After further investigation, it might actually be possible to do this in pip – so I wouldn’t assume it’s not possible at this point. We’ll continue working on it and report back once we know for sure if it is possible!

2 Likes

If it can’t be done in pip[1], that’s likely a showstopper. But I’d be extremely interested to know why it’s difficult (even if it’s possible) because that might indicate where the problematic areas of the idea lie.

And when I say “showstopper” I don’t mean just to the idea of having a notation for “don’t install default features”. There seems to be a fairly clear consensus that people want that capability, so we need some way of spelling it (and the other ways have problems, as you noted in the PEP).


  1. or in uv, but I assume the issues aren’t tool-specific ↩︎

1 Like

Why would using pip instead of apt mean I care any less about bandwidth or footprint? And why does Conda being unsuitable for low-waste setups (even ignoring extras or this feature) change anything?

I’m seeing very few examples given here where people would use this to turn hard dependencies into soft dependencies [1]. Almost all of them are adding new soft dependencies, ranging from a bonus CLI (which should be a separate package) to an author’s go-to ASGI server (which should stay as an extra so users aren’t misled into thinking the uvicorn command is part of fastapi). So this (I agree possible) world where footprints get smaller just isn’t going to happen. All the evidence indicates we’ll just have to work harder to get the same footprint as before.

So does the ability for packages to put upper bounds on their dependency versions. Do you feel that that ability is used responsibly? (I know this is very much an argument of perspective but my position as maintainer of a packaging tool that takes a lot of collateral damage from other packages abusing good features gives me very little reason to believe that enough package developers are responsible.)

Uhmm, which part? The part where I have to recursively audit my dependency tree every time an (indirect) dependency updates, to see if it’s grown a new recommendation I don’t need?

In that case, can I at least request that successfully implementing this be made a requirement for this PEP being accepted? Likewise for an implementation of pip check that can handle the result of pip install package[alternative-backend]? Otherwise, I will be forced to check the metadata file of every version of every package I come into contact with for the rest of my life to keep the bloat away.

But splitting B is still the right thing to do, even if it’s hard. Package B is trying to do too many things and it’s degrading the user experience. Giving B the soft dependency escape hatch will encourage them to keep on abusing the scope of their package. Let this philosophy perpetuate long enough and we’ll end up with dependency soups instead of trees.


  1. and the few I do all belong in my shouldn’t have been any kind of dependency to begin with mental category ↩︎

3 Likes

I didn’t bother to provide my example because I’m still mulling over whether, or when, I’d actually use this, and it’s tangled up with pre-commit (which has its own issues), but I do have such a case. I have to wait for support in pip to be more or less universal to pull it off, which I think is a common problem for “making hard dependencies into parts of the default extra”.

For check-jsonschema, one of my primary targets is usage under pre-commit. Pre-commit doesn’t have a mechanism[1] which lets me select extras by default for my package, except by creating a new and separate project to serve the hooks, as a wrapper over the original. If I were doing things over from scratch, I’d split the project, but it’s too disruptive for me to justify at this point.

Because almost all of the builtin hooks target YAML files, I’m including a YAML parser by default. That means that even if you just want to use check-jsonschema on, you know, JSON, seeing as the tool is for JSON Schema, not “YAML Schema”, you need to install a YAML parser. That’s silly.

I would like to make the YAML parser optional, but without removing it from the install experience provided to pre-commit users or naive pipx install check-jsonschema users or homebrew users (should brew include the default or not?), etc.

I’d also like to move the vendored schemas provided by check-jsonschema – necessary for those pre-commit hook users – into a separate data package which gets picked up by default. So although I’d be “adding a package” in that case, it’s for the purpose of making it possible to build a slimmer install.
I could even make the requirement for regress (Rust-backed ECMAScript regex engine) into a soft dependency with more work, making for an even more minimal install.

All that is to say: I made some imperfect decisions 3 or 4 years ago, and default extras might give me a low impact way of making the core of my package smaller without negatively impacting users.

Okay; there’s my example. Does that mean that you’re wrong, that adding default extras won’t lead to bloat? No. I really don’t know. I’m actually concerned about exactly what you point at, that many of these cases seem to be taking things which are optional and including them by default. But I think it’s notable that you can only consider moving things around like this once support for default extras is nearly universal, so anyone wanting to make such moves would necessarily be a late adopter.


As a separate matter, I don’t think matrix testing is directly pertinent here? That’s already a massive and frequently underestimated problem in terms of compatibility. I frequently see projects missing testing of their lower bounds for dependencies, the full matrix of extras (or a significant chunk of it), all supported Python versions, or all supported platforms. The PEP could hardly make this worse if it wanted, but it’s not materially impacted by the addition of default extras, is it?


  1. As far as I know. Someone can correct me if I’m wrong. ↩︎

3 Likes

TL;DR: Improving package authors’ ability to express their requirements will help more than it will hurt, with some more removal examples below.

It is clearly misused sometimes, but does that mean we shouldn’t allow it at all? I would not give up the ability to quickly add a cap for a broken package just so I could force everyone else to avoid adding needless upper caps! During the numpy 2 transition, we needed <2 for a while. As bad as needless upper caps are, it’s a really important feature. I think this default extra feature is similar - it can be misused, but overall it’s an important feature that will help far more than it harms.

Again, for a closer comparison, see Rust. I think it works very well there. That’s a similar situation, with first-party distribution and dependencies. Apt, conda, etc. are all third-party repackagers.

It’s much easier to come up with examples of adding packages (like adding virtualenv to build), since we can take advantage of this right away. The removing examples (like removing the two technically optional dependencies of build) will take time, everyone needs to be on a version of pip that supports it before it can be done.

Removing example: I’ve added the pyproject.toml parsing requirements to scikit-build-core (by vendoring pyproject-metadata and making the ignore-file-parsing dep required). But all plugins now have to download this extra package (or two if I unvendor pyproject-metadata again); they don’t need these, since they are only used for the native builder. With this proposal, I could remove them again, and people who don’t read the docs and just pip install scikit-build-core and try to do a non-isolated build (everyone) will still be happy. No one noticed or understood that they had to do pip install scikit-build-core[pyproject]. But plugins could easily depend on scikit-build-core[nopyproject] instead of vanilla scikit-build-core, since that’s not every user, just the few plugins that users use.
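Roughly, treating the extra name and the default-extra mechanism as placeholders:

# hypothetical fragment of scikit-build-core's pyproject.toml
[project.optional-dependencies]
pyproject = ["pyproject-metadata", "pathspec"]  # only needed by the native builder
# marked as a default extra, so a plain `pip install scikit-build-core` still
# supports non-isolated builds, while plugins depend on an opt-out form
# (scikit-build-core[] or a dedicated extra like [nopyproject]) to skip these.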

Nox could also remove virtualenv from its requirements, as it can use venv, like build does. But nox isn’t used in bootstrapping, so virtualenv sits in the requirements. Also argcomplete and maybe colorlog are in nox’s requirements but really could be optional. A lot of packages have added a hard dependency on uv, like hatch and tox-uv. But it’s much better to have a single install of uv (like the one I’m using to manage hatch and tox!) and then not install it in every venv. I could do that if I could opt out of the uv requirement. I’d say many packages have “optional” dependencies in their requirements because there’s no opt-out mechanism.

Splitting packages has a developer cost. Now you have to maintain two packages. Distros might not package the CLI package. You have to handle upper/lower bounds between your CLI and your package. Python tooling is pretty bad at monorepos, so that usually means multiple repos and tags. And making a CLI opt-out instead of opt-in makes the user experience better. My example would be repo-review. I don’t want to make the CLI required, since that would require downloading CLI dependencies when using it as a library, most notably for the WASM version. I really don’t want to download click and rich in a web browser; I want that minimal. But that means pipx run repo-review and uvx repo-review don’t work - it’s pipx run repo-review[cli] and uvx --extra cli repo-review (I think the [cli] syntax is supported now there as well, or is about to be). If I split it into another package, like pipx run repo-review-cli, that’s not a better user experience either, actually. Users still have to find and use the CLI package. (And uvx doesn’t provide a nice way to have mismatched names, so it would be uvx --from repo-review-cli repo-review.) However, I don’t mind depending on repo-review[] or repo-review[minimal] in the web app, or if I add it as a requirement to another package. That’s not something you type all the time (see harmful example below)!

Another example: validate-pyproject has to require minimal dependencies, as it’s used in setuptools. However, the dependencies it’s missing are things like trove-classifier validation, packaging(!), and tomli (on older Pythons). In fact, I’d argue most build backends might like to depend on trove-classifiers by default, but won’t unless there’s a way to opt out.

Harmful example: I think it’s worth writing out how this could be misused. Say someone has a package for solving Rubik’s cubes; we’ll call it cube. It’s really minimal, but has a fancy matplotlib integration that allows it to draw 3D cubes. This PEP comes out, and they add matplotlib as a default extra. That itself is not overly harmful, since anyone who needs the minimal package can still get it (say for WASM, or in a library). The harm comes if someone writes a package that depends on that package and doesn’t know to either disable the default extra or re-export it as a new extra. Note that this new package would also need to be able to handle matplotlib missing, or stripping the default extra is actually wrong! Let’s say the new package is cube-animation - that probably would break horribly if matplotlib were missing, so depending on cube with the default would be correct. It’s only if the package, say cube-algs, didn’t touch the matplotlib parts or made its integration also optional that it would be a bug to depend on cube and not cube[minimal] (if someone was using that new package in WASM or in a library). IMO, this is too convoluted to worry about much, and if it happened it would be an easy PR to suggest. And today, the hypothetical cube package might decide to add matplotlib to the requirements, and then you’d have the same situation except there would be no way to avoid the extra bloat (and in real packages, especially scientific packages, this is pretty common!).
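Spelled out as metadata fragments (all names hypothetical, and the default-extra / opt-out spellings are whatever the PEP settles on):

# cube's pyproject.toml
[project]
name = "cube"
dependencies = []

[project.optional-dependencies]
plotting = ["matplotlib"]   # declared as a default extra under PEP 771

# in cube-animation's pyproject.toml: it really uses the plotting parts,
# so keeping (or explicitly naming) the default is correct:
dependencies = ["cube[plotting]"]

# in cube-algs's pyproject.toml: it never touches matplotlib, so it should opt out:
dependencies = ["cube[minimal]"]   # or cube[], or whatever opt-out form is adopted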

Overall, I think it allows package authors to more accurately describe what they actually require and expect most users to need. It will improve the bootstrapping story and allow some key packages to reduce their footprint. It might even encourage packages to minimize required dependencies. And it will help some cases where going with minimal dependencies has hurt users who are just using pip/pipx/uvx.

3 Likes

Somewhat off-topic, but my impression is Rust also has issues with over-inclusion due to default features (with forks/rewrites occurring due to this, see e.g. Fat Rand: How Many Lines Do You Need To Generate A Random Number? | Armin Ronacher's Thoughts and Writings and the various social media discussions on it), so I’m not sure it’s a good example. Also, Rust lacks the same level of historic bundling that Python libraries have (especially in the scientific space), so the effects are smaller (and usually limited more to long compile times rather than large binaries).

I do think having levels of prioritisation of extras is a good thing; the issue is that by defaulting on, the top-level user has no control over what gets resolved for installation. Let’s say pytest adds a default extra (adding a dependency on an additional library). Unless every package listed at https://docs.pytest.org/en/stable/reference/plugin_list.html (which is 1487 packages as of when I posted this) removes the default extra, then by treating pytest or pytest >=2.0 as including the default, that additional package will be installed no matter what the user does to their list of requirements. Defaulting off, on the other hand, means the user can add the extra, or any package which needs that extra can add it in. This asymmetry is inherent to the fact that PyPI packages cannot be changed. --no-default-extras would be a hack to try to work around this (the inverse --yes-default-extras isn’t inherently required, because you can always add packages to the list to install). You could imagine that “top level packages” (i.e. those passed to pip et al directly) could have their default dependencies included, which avoids the issue of not being able to control what your dependencies pull in, but that’s likely more confusing, and makes it harder to determine what should be a default or not.
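Concretely (the extra and its contents here are hypothetical):

# hypothetical: pytest declares a new default extra
[project.optional-dependencies]
color = ["pygments"]

# a plugin released before the change, never updated:
dependencies = ["pytest >= 7"]
# under the PEP this bare requirement still brings in the default extra, so
# nothing the end user adds to their own requirements can keep pygments out
# of the resolve, short of an installer-level --no-default-extras switch.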

3 Likes

I wonder if one thing that would help with extras usage would be the option to add a description for each extra? Such information could appear on PyPI, in IDEs, etc., and allow users to more easily discover the associated packages.

2 Likes

Quite recently! CPython’s developer container uses install_weak_deps=False, and was missing runtime libs for sanitizer builds, which some people require and others never use. In this case, saving a few MiB is not worth the pain of finding out what package you need on this system and how to install it.

Also, I rather enjoy dnf install tox pulling in all the CPython & PyPy versions it can.

install_weak_deps=False is the option to use if “368MB of stuff that might be useful” is excessive for your use case. It’s a perfectly valid option, but nowadays the default trade-off between disk space and convenience on a dev box does go in the “everything and the kitchen sink” direction.

Maybe Python too should have an env marker for “this is a minimal headless server, no bloat please, I’ll manage any missing functionality myself”. It would be useful even as a single environment-wide boolean.

Maybe it could be recommended that package dependencies use the full expression if possible. Going with the pytest example, say the new default extra is color, and pytest[color] is the default. If plugins actually started using the new color functionality, then they should be encouraged to declare "pytest[color]" instead of the equivalent "pytest" dependency. This would go hand-in-hand with a global “disable all default extras” flag. The flag would only work if packages include the actual extras they depend on, but an end user can still “fix” this by adding the extras themselves, and can still get a truly minimal install. For example, if pytest-pretty needed the color addition, pip install pytest-pretty --no-default-extras would miss some dependencies, but you could fix it with pip install pytest-pretty pytest[color] --no-default-extras. And the “normal” user running pip install pytest-pretty would always get a working install even if the plugin doesn’t update; it’s only the minimal flag that is affected.
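i.e. something like this in pytest-pretty’s metadata (names as in the hypothetical example above):

dependencies = ["pytest[color] >= 7"]
# naming the extra explicitly means a global --no-default-extras run still
# installs what pytest-pretty actually needs, while a plain
# `pip install pytest-pretty` behaves exactly as before.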

2 Likes

Is this PEP really only about improving user experience? I feel like the improvement is not really worth it. I wonder what kind of user is able to pip install foo but not pip install foo[bar]? Maybe it would be more valuable to work on improving the discoverability of extras.

Yes!

Why are extras not displayed on PyPI? Why doesn’t pip show foo display the list of extras that are available? Why can’t I pip show foo[bar] and get a list of the optional dependencies? What UX do all the other tools from IDEs to development workflow tools provide around discoverability of extras?

3 Likes

This argument depends heavily on there actually being a --no-default-extras option. If that’s the case, maybe the PEP needs to say that installers MUST include a way to install packages without the default dependencies. And if that’s going to be part of the PEP, maybe we also need to discuss questions like:

  • Is it necessary to offer a way to install without the default extra on a per-package basis, or is it OK to just disable default extras for everything?
  • What about dependencies? Does disabling apply to dependencies or just to “top level” packages?

If none of this goes in the PEP, that’s fine, but it simply means that pip will have this discussion, and uv will have it separately. That’s perfectly normal, and lets tools give users a choice, but if the arguments in the PEP about usability of the new feature rely on certain options being available in tools, then they won’t be valid if tools don’t implement those options.

FWIW, I can see pip implementing --no-default-extras <pkg>, with :all: to mean “for everything”. But I can also see pip not bothering to add the option, if it’s complex to implement.

2 Likes

Many users struggle with pip install anything. Looking at it from SymPy’s perspective, there are a large number of “end users”, many of whom have very limited Python, CLI, etc. skills and possibly no understanding of the Python software ecosystem. This can include school children, for example, and it is even possible that these users are only using Python so that they can use SymPy, with little understanding of what Python otherwise is.

I would like these end users to just get the “recommended” installation for someone who wants to use SymPy directly with minimum fuss. Even discussing the idea that there are “extras” and different installation options is already complicating things more than I would like.

There are also packages that depend on SymPy though and that typically only need particular parts of its functionality. I expect package authors to be much more capable of figuring out and specifying which extras they want than end users and I would always want to give them the option for a minimal installation.

There are also many other people who lie in between total novice and package author but those are the two extremes. I think of it like a pyramid where there is a very small number of package authors at the top, then a large number of relatively expert Python users, and so on all the way down to the bottom where there is an enormous set of users who find anything to do with Python’s packaging difficult.

To me it seems much nicer to ensure that advanced users have an easy way to ask for a minimal install rather than expecting novices to realise that they likely want more than just the package.

1 Like