I’m here with a wall of text again. I seem to be doing these a lot lately.
There seem to be some “what exactly is this PEP for, compared to what we already have?” undertones. Here’s my attempt at addressing that, followed by responses to specific questions / comments.
The way I’ve been thinking about this file is that it’s a much better approach for doing what a requirements.txt file (with all the packages pinned, and with hashes for the specific files) wants to achieve. That kind of requirements.txt file is what pip-compile generates today – here’s an example. That’s effectively a lockfile. I’m gonna call these “locked requirements.txt” files for the rest of this thread.
There are many other kinds of requirements.txt files, like weaker attempts at being a lockfile (pinned without hashes), a bundle of requirements (eg: pip/tests/requirements-common_wheels.txt at main · pypa/pip · GitHub), or something else. This PEP does not care about those. They are a different use case from what this PEP is trying to address, but they can be used to generate a valid file in the format this PEP describes. It is also not trying to substitute an individual package’s declaration of install_requires, although that information is taken from the distribution files and encoded into the generated file.
The initial version of this PEP was basically a 1:1 replacement for locked requirements.txt files [footnote 1].
This is meant to decouple the description of “install these exact packages into the environment” from pip’s requirements.txt format (which is only specified as “whatever pip implements”). Creating a format to describe this information is already something that existing tooling has been doing to varying degrees (pdm, poetry, pipenv). The goal here was to have a well-specified common format that describes what needs to be installed into an environment, so that the environment can be recreated in a reproducible manner.
As already mentioned, we did a round of “hey, would this work for you?” feedback prior to posting the PEP here, and as a result of that we made the format much more flexible, to enable it to be platform-agnostic. The behaviors possible with this additional flexibility are not really achievable with locked requirements.txt files today. [footnote 2]
There’s an example below that shows what the “additional flexibility” entails.
Well, you don’t have to be platform-agnostic with this format.
If you’re going to be resolving for a specific platform/environment, you can encode that into the lockfile using metadata.marker and metadata.tags. Those are the only pieces of information about the environment that can affect what dependency or wheel file could be used. By encoding that information, the installer tool can determine whether the lockfile is compatible with the environment it is installing into.
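To make that concrete, here’s a rough sketch (mine, not from the PEP) of the kind of check an installer could do with the packaging library, assuming metadata.marker holds a PEP 508 environment marker string and metadata.tags holds a list of compressed wheel-tag strings – the exact field semantics in the PEP may differ:

```python
# Illustrative sketch only: shows that "is this lockfile usable here?" is a
# straightforward, offline check once the marker and tags are recorded.
from typing import Optional

from packaging.markers import Marker
from packaging.tags import parse_tag, sys_tags


def lockfile_is_compatible(marker: Optional[str], tags: Optional[list]) -> bool:
    """Return True if the recorded marker/tags match the running interpreter."""
    if marker is not None and not Marker(marker).evaluate():
        # e.g. the lockfile was generated for python_version < "3.10"
        return False
    if tags is not None:
        supported = set(sys_tags())
        recorded = {t for compressed in tags for t in parse_tag(compressed)}
        if not (recorded & supported):
            # none of the recorded wheel tags are installable here
            return False
    return True


# Hypothetical values an environment-specific lockfile might record:
print(lockfile_is_compatible("python_version >= '3.8'", ["cp39-cp39-manylinux_2_17_x86_64"]))
```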
Off-topic-ish: poetry’s resolver is PubGrub-based, and it’d probably be a decent idea to take a look at how it handles generating the lockfile.
No. This flexibility seems to be tripping people up, so I’m gonna try giving an example:
Say you have a requirements.txt file like this:
pytest < 4.6
pytest-cov
The potential set of versions of pytest and pytest-cov that are compatible with these exact requirements is pretty huge. You could generate a pyproject-lock.d/very-broad.toml that lists all the potential versions for each of them with all their assets. You could also generate a pyproject-lock.d/strict.toml that has as few versions of each package as possible – usually only the highest version that is possible given the constraints.
The strict.toml is conceptually equivalent to what a locked requirements.txt file, poetry.lock, Pipfile.lock, or pdm.lock is today. This is what people usually mean when they say “lockfile”. It works exactly how you’d expect a lockfile to work.
The very-broad.toml might not seem super useful at first. Why would you want something like that? Well, we’ve transformed what was a pip-specific file format with an implicitly declared package index into an interoperable, tool-agnostic format with exact hashes and URLs. With that, the installer no longer needs to make multiple requests for simple index pages. It can now download these packages in parallel. It can be much simpler than pip / poetry / pdm etc., since it no longer needs package-index interaction logic. Notably, it can also be consumed by a platform, which can create an environment by picking a single version out of the sets of packages provided.
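To show how little logic is left for such an installer, here’s a deliberately naive sketch (mine, not the PEP’s reference implementation). The candidates layout below is made up purely for illustration – it is not the PEP’s schema, and the URLs/hashes are placeholders – but it captures the idea: every candidate file’s exact URL and hash is recorded, so “figure out what to install” is pure local computation and the fetching is embarrassingly parallel.

```python
# Naive installer sketch over a hypothetical mapping of
# name -> version -> [(compressed wheel tag, url, sha256), ...]
# extracted from a "very-broad" lock.
import hashlib
import urllib.request
from concurrent.futures import ThreadPoolExecutor

from packaging.tags import parse_tag, sys_tags
from packaging.version import Version

candidates = {
    "pytest": {
        "4.5.0": [("py2.py3-none-any", "https://files.pythonhosted.org/.../pytest-4.5.0-py2.py3-none-any.whl", "<sha256>")],
        "4.4.2": [("py2.py3-none-any", "https://files.pythonhosted.org/.../pytest-4.4.2-py2.py3-none-any.whl", "<sha256>")],
    },
    # ... pytest-cov and every transitive dependency, recorded the same way
}


def pick_file(versions):
    """Pick the highest version that has a wheel usable on this interpreter."""
    supported = set(sys_tags())
    for version in sorted(versions, key=Version, reverse=True):
        for tag, url, sha256 in versions[version]:
            if set(parse_tag(tag)) & supported:
                return url, sha256
    raise LookupError("no compatible file recorded in the lockfile")


def fetch(url, sha256):
    """Download one file and verify it against the recorded hash."""
    data = urllib.request.urlopen(url).read()
    if hashlib.sha256(data).hexdigest() != sha256:
        raise RuntimeError(f"hash mismatch for {url}")
    return data


if __name__ == "__main__":
    # The selection step is entirely offline and deterministic.
    picked = {name: pick_file(versions) for name, versions in candidates.items()}
    print(picked)
    # Actually fetching (with real URLs and hashes) would then just be:
    # with ThreadPoolExecutor(max_workers=8) as pool:
    #     wheels = dict(zip(picked, pool.map(lambda p: fetch(*p), picked.values())))
```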
And, yes, all these benefits also apply to the strict.toml case. Hopefully this clarifies how this format is both capable of being a lockfile and a lot more flexible than what you would usually expect from a lockfile.
The main reason this flexibility exists, though, is that this format is trying to be platform-agnostic. You can have different dependency requirements for different environments, and some distribution files can be compatible with multiple environments, so the installer needs to be able to deal with that possibility if this format is going to allow generating a single platform-agnostic file.
I hope that’s clear by now – this is definitely the first one. It’s not replacing install_requires metadata or acting as a substitute for it in any way.
Could you elaborate further on this? What use cases do we know of where someone would actively want to specify different dependencies on a per-file basis, and that we’d want to support universally?
FWIW, it is definitely possible to pin only a single file per package in this format, so projects that do this are not excluded from this format in every case – only when the dependencies differ across the pinned files; that may be worth making an explicit, required error. I do think we’d want to figure out a good way to communicate this to users though, with a decent example error message, if we decide to make this an error.
> to have different dependencies due to having different needs across Python versions or platforms
This is what environment markers are for.
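For anyone who hasn’t used them: a marker attached to a requirement already handles the “different dependencies on different Pythons / platforms” case. A quick, purely illustrative example with the packaging library:

```python
# Small illustration: the same requirement string expresses a dependency that
# only applies on older Pythons, so no per-file dependency variation is needed.
from packaging.requirements import Requirement

req = Requirement('importlib-metadata >= 1.0; python_version < "3.8"')
# On Python 3.8+ the marker evaluates to False, so the dependency doesn't apply.
print(req.name, req.marker.evaluate())
```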
> because setup.py makes it trivial to create a package that will have this characteristic
This reads like “it is possible for someone to do this” => “we should actively support this”, which… I disagree with. Not supporting all the misfeatures of existing tooling is a good idea IMO. 
[footnote 1]: Okay, so, uhm… the format was not exactly a 1:1 replacement.
It was making one important optimisation (or assumption, pick your word) – if you know the package name, package version, package index, and set-of-acceptable-file-hashes, then you effectively know the exact asset that you want to download. Right now, there’s one additional network request for getting the simple index page for the package and extracting the exact asset URL out of that. Theoretically, the package index could use a different URL for every request or whatever, but in practice they do not. And those URLs usually need to be stable anyway, if they want to benefit from pip’s caching. The format, instead, encodes the exact URL into the file. This eliminates a network request and, combined with the fact that dependency information is encoded into the format, means that the entire process of “figure out what to install” is offline and deterministic.
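For those less familiar with the current flow, here’s roughly what that extra request looks like today (a sketch of my own, not pip’s actual code): given only a name, version, and set of acceptable hashes, the installer has to fetch the project’s PEP 503 simple index page and scan its links to discover the actual file URL.

```python
# Sketch of the lookup that the exact-URL encoding makes unnecessary.
# (PEP 503 pages carry the file hash as a URL fragment: ...#sha256=<hex>)
import urllib.request
from html.parser import HTMLParser


class LinkCollector(HTMLParser):
    """Collect all href values from a simple index page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.extend(value for name, value in attrs if name == "href" and value)


def find_file_url(project, acceptable_hashes):
    """The extra network round-trip: turn known hashes into an actual URL."""
    with urllib.request.urlopen(f"https://pypi.org/simple/{project}/") as response:
        html = response.read().decode()
    collector = LinkCollector()
    collector.feed(html)
    for href in collector.links:
        if "#sha256=" in href and href.rsplit("#sha256=", 1)[1] in acceptable_hashes:
            return href
    raise LookupError(f"no file matching the given hashes for {project}")


# Usage: find_file_url("pytest", {"<sha256 hex digest from the lock>"})
```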
[footnote 2]: So, theoretically, you could be exceedingly clever (read: hacky) and generate a requirements.txt file that has “merged markers”, but… uhm… I’d argue that it’s better and easier to write your own resolver + a lockfile format like this, given that that is what most people have actually done.
PS: @brettcannon @uranusjr maybe we should’ve bikeshedded the name of the PEP a bit more.
PPS: I’m not proof-reading this (I’ve already spent an hour on this) so there’s a decent chance that there’s some stupid typo or a glaring mistake here – please be kind when pointing those out.