I think it is worth attempting to address the concerns raised here before going ahead with the PEP publication and subsequent discussion, since if we can’t get close to consensus here, what hope do we have with the wider community? I doubt we’ll all agree on everything and I don’t think it’s realistic to wait for that to happen, but let’s see what we can converge on.
I’ll try to reply to the various open questions and concerns in no particular order. Please let me know if I have missed anything! I have numbered the different questions/concerns to make it easier for people to refer to them.
Q1: Errors for non-existent extras in Default-Extra
@notatallshaw mentioned:
I would like to see language that says that tools SHOULD throw an error for packages that have a Default-Extra that does not exist
This seems reasonable to me. In fact, the PEP currently says that any Default-Extra entry must match an existing Provides-Extra entry, but I could add a sentence saying explicitly that tools should raise an error if that isn’t the case.
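To make the check concrete, here is a minimal sketch of what such validation could look like. Note that `Default-Extra` is the field *proposed* by the PEP and is not emitted by any tooling yet, and `check_default_extras` is a hypothetical helper name; the point is just that core metadata is an email-style header block, so both fields can be read with the standard library alone.

```python
from email.parser import Parser

def check_default_extras(metadata_text: str) -> list[str]:
    """Return any Default-Extra entries with no matching Provides-Extra."""
    msg = Parser().parsestr(metadata_text)
    provided = set(msg.get_all("Provides-Extra") or [])
    defaults = msg.get_all("Default-Extra") or []
    return [extra for extra in defaults if extra not in provided]

# Made-up metadata for illustration only:
example = """\
Metadata-Version: 2.1
Name: somepackage
Provides-Extra: recommended
Provides-Extra: minimal
Default-Extra: recommended
Default-Extra: typo-extra
"""

print(check_default_extras(example))  # → ['typo-extra']
```

A tool could then raise an error (rather than just warn) whenever this returns a non-empty list.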
Q2: What happens when users ask for non-existent extras?
@notatallshaw also said:
If the package has a default extra and the user specifies a non-existing extra, e.g. package[this-extra-does-not-exist-ecdf6822b], should a tool install the Default-Extra or not?
This is a trickier one, but I believe that yes, the default extras should get installed (I can add this explicitly to the PEP). In the example you give, it is true that package[no-default] will still install defaults for versions 3 and 4, but I don’t think that should act any differently from, say, package[non-existent-extra]. Essentially, if you give a non-existent extra, I think that e.g. pip should emit a warning and then act as if it hadn’t been passed, since it is being ignored. One important reason to install the defaults when an extra is not recognized is that otherwise people could systematically add [no-default] to literally all packages to try to always disable defaults, which would be bad.
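The proposed behavior can be sketched in a few lines. This is only an illustration of the resolution rule described above, and it assumes (as in the draft PEP) that explicitly requesting a recognized extra replaces the defaults; `select_extras` is a hypothetical name, not part of any real installer.

```python
import warnings

def select_extras(requested, provided, defaults):
    # Keep only the extras that actually exist; warn about the rest.
    recognized = [e for e in requested if e in provided]
    for e in requested:
        if e not in provided:
            warnings.warn(f"{e!r} is not a known extra of this package; ignoring it")
    # If no recognized extra remains, behave as if no extras were
    # requested at all, i.e. fall back to the defaults.
    return recognized if recognized else list(defaults)

provided = {"recommended", "minimal"}
defaults = ["recommended"]

print(select_extras([], provided, defaults))              # defaults apply
print(select_extras(["no-default"], provided, defaults))  # unknown: warn, defaults still apply
print(select_extras(["minimal"], provided, defaults))     # recognized extra replaces defaults
```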
Q3: Repackagers of Python libraries
@bwoodsend asked:
What do we do about repackagers of Python libraries?
I agree that this is an important aspect to mention in the PEP.
Some packaging ecosystems such as conda don’t even have the concept of extras, and I’ve seen cases where e.g. a conda recipe actually includes several of the extras by default, so that some conda packages are already bloated compared to the strict minimum they need to function. Some Linux package managers such as apt have the concept of recommended vs minimal dependencies, so in some (but not all) cases where default extras are used for this purpose, they might actually line up well with that model.
However, at the end of the day it’s difficult to give a single answer to this question, and given that the PEP could be applied in different ways (e.g. minimal vs recommended dependencies, default backends/frontends, etc.), I think repackagers will need to judge on a case-by-case basis what to do. The key point is that they already need to make such judgment calls as described above, but it is true that this PEP may add a little extra cognitive work for packages that make use of it. I can edit the PEP to acknowledge this.
In any case I am going to reach out to some repackagers that have not been involved in the discussion so far to ask them how they feel about the PEP.
Q4: pip freeze and pip install -r
@bwoodsend mentioned again the issue that restoring an environment from the output of a pip freeze command may not round-trip if there are packages with default extras, and @pf_moore mentioned that one should normally use --no-deps to restore an environment from a freeze file. Note that this is true even prior to this PEP. Here is a concrete example: astropy has a required dependency on pyyaml, but in fact a lot of astropy can function without it. So if I wanted to make a lightweight install, I could in principle do:
```
pip install astropy
pip uninstall pyyaml
pip freeze > requirements.txt
pip install -r requirements.txt
```
However, in this case pyyaml (which is listed as required by astropy but not strictly required for much of the package) would get installed again, so the final environment wouldn’t match the content of the requirements.txt file. So even right now there are use cases which highlight that one should use --no-deps. I think we should do three things:
- Update the PEP to mention that correctly using --no-deps when installing from a freeze file (and the equivalent for other packaging tools) is going to be especially important if the PEP is accepted.
- Acknowledge in the PEP more clearly that this kind of issue could happen with other tools too.
- Update the pip docs as soon as possible to recommend the use of --no-deps when restoring from a freeze file, regardless of what happens with this PEP.
Q5: Burden to check for minimal installs
@bwoodsend said:
It is a massive pain for anyone who doesn’t want to spend the rest of their life reading small prints or source code to find what [minimal] will avoid installing unnecessary bloat.
This might have a technical solution: in principle it should be simple to write a tool that scans the core metadata for a given package and determines whether it uses default extras and, if so, whether it defines an empty extra that effectively disables the defaults. Interestingly, as a user I sometimes find it a pain to figure out which extras I can enable, so in some ways this is not a new problem.
Perhaps a longer term solution orthogonal to this PEP would be to actually have a way to document extras in project metadata, to provide a one-liner about what each one does, and tools could then do e.g.
pip list-extras astropy
to show what extras are available and this could then show the description too. Perhaps a separate PEP? 
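Part of this would already work today, even without a new PEP for extra descriptions: the extra *names* are exposed via Provides-Extra in core metadata, only the per-extra one-liners are missing. A rough sketch (the `list-extras` subcommand above and the `list_extras` helper below are both hypothetical):

```python
import importlib.metadata

def list_extras(dist_name: str) -> list[str]:
    """Return the extras an installed distribution advertises,
    or an empty list if the distribution is not installed."""
    try:
        meta = importlib.metadata.metadata(dist_name)
    except importlib.metadata.PackageNotFoundError:
        return []
    return meta.get_all("Provides-Extra") or []
```

A real tool would presumably also query an index for packages that are not installed locally, and display the proposed descriptions alongside each name.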
Q6: Interchangeable dependencies
@bwoodsend said:
One area that’s not really covered here is, for interchangeable dependencies case, what installers should do if a non-default choice is already installed
As written, the current PEP would lead to installers being agnostic to what is already installed. So yes, in principle it would be wasteful to install a dependency that isn’t needed, but there’s no other sane option: how would pip (or any other installer) know that an already-installed package fulfills the same needs as the default extras? There is no mechanism for specifying that two dependencies or two extras are equivalent and interchangeable.
Q7: Conflicting dependencies
@pf_moore replied to the above comment and said:
And it’s worse than that. Because there’s no record of what extras got installed, it means that pip install foo[bar]; pip install foo would install both bar and the default extra. And given that the two could easily be mutually incompatible (if they are alternative backends, for example) this could result in a broken installation.
I’d be interested to know whether such a case of actually mutually incompatible alternative dependencies exists. It is mentioned sometimes, but it’s hard for me to address without seeing it in the wild; it seems like poor design for a package to be unable to work when two alternative backends are present (at the very least it should pick one and ignore the other).
Even setting aside the present PEP, there is no way to guarantee that two other packages won’t each pull in one of the mutually incompatible dependencies. Let’s say package A needs package B or C but will crash if B and C are both present. What if a user then installs packages D and E, which depend on B and C respectively? In this sense the PEP is not really introducing a completely new problem.
The PEP as written states:
Note that this PEP does not aim to address the issue of disallowing conflicting
or incompatible extras - for example if a package requires exactly one frontend
or backend package. There is currently no mechanism in Python packaging
infrastructure to disallow conflicting or incompatible extras to be installed,
and this PEP does not change that.
Should anything be added beyond this to address this point?
Q8: Concrete examples
@konstin said:
Some real world examples with popular package, such as fastapi-cli[standard], would make the PEP more tangible and ensure we’re solving those cases.
I’m reluctant to add overly specific examples, because it’s hard to find cases that will actually resonate with everyone (for example, I have no idea what fastapi is!). I’m also worried that specific examples aren’t timeless and could even be out of date before a decision is made on the PEP. I would prefer to describe different general hypothetical cases that people can map onto what they are familiar with, as I’ve tried to do in the usage examples in the PEP (but I’m happy to be convinced otherwise).
Q9: Compatibility with older versions of tooling
@konstin and @dustin also raised the following point:
Wouldn’t this apply with the current approach too, because installing a package relying on default extras for e.g. a good getting started experience would have a broken installation with older tooling, except it’s not shown as an error? [and subsequent comment about authors moving required to optional default dependencies]
I think this is a very interesting and important question, and we should add something about it to the PEP. The key here is that package authors should take the same care with default extras as they currently do with other aspects of dependencies, in the sense that they should think about what will happen for users who don’t have the latest tooling. Not everyone will use default extras to reduce their required dependencies; some may simply add dependencies that were regular extras before.
At the end of the day, authors are the ones responsible for knowing their audience and supporting their users, and this is true even before this PEP. As a package maintainer myself, for packages where I know some users may be on old versions of pip, I’ve had to be careful in the past not to adopt pyproject.toml too soon, or to completely get rid of setup.cfg/setup.py too soon. So while I agree this means it might be hard for some maintainers to adopt default extras in the short term, isn’t it pretty common for established packages to have to wait a while before adopting the latest shiny feature?
From my perspective, the bottom line is that a lot of the responsibility for this specific issue lies with the authors, but the PEP should also describe these potential gotchas in ‘How to teach this’.
Q10: PEP organization
@pf_moore: I have noted your comments regarding splitting out some of the sections from ‘How to teach this’ into an ‘Examples’ section and expanding ‘How to teach this’, I will try and address this on the next edit once we converge on other issues.
Other issues
Anything that uses importlib.metadata.requires() will probably gain a logic error and possibly the same impossible conundrum as the distro maintainers.
@bwoodsend I’m not sure I understand this one above, can you give a specific example?
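For context, my understanding of the current situation: importlib.metadata.requires() returns raw requirement strings, with extra-gated dependencies carrying an `extra == "..."` environment marker. The sample below mimics that output (the requirements are made up, and `split_by_extra` is just an illustrative helper, not a real API); if the concern is that code doing this kind of filtering today would have no way of knowing which extras are defaults, a concrete example of where that breaks would help.

```python
def split_by_extra(requirements):
    """Split requirement strings into base deps and extra-gated deps,
    using a naive scan for the `extra == "..."` marker."""
    base, by_extra = [], {}
    for req in requirements:
        if 'extra == "' in req:
            extra = req.split('extra == "')[1].split('"')[0]
            by_extra.setdefault(extra, []).append(req)
        else:
            base.append(req)
    return base, by_extra

# Mimics what requires("somepackage") might return today:
sample = [
    "numpy>=1.23",
    'pyyaml; extra == "recommended"',
    'matplotlib; extra == "recommended"',
]
print(split_by_extra(sample))
```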