No disagreement from me - a format like that proposed in prior iterations of the PEP would definitely be an improvement over requirements.txt (to solve the lack of standardization, lack of secure-by-default installs, etc.)
The "arguments against" that I laid out above mostly stem from asking the question: if it's not used "natively" by uv or Poetry, when and why would users of those tools use this format? And, consequently, is it important that this be a standard? Or should it just be a feature of pip?
(But, again, I am still supportive of it being a standard.)
You of all people should know that we can't assume pip is the only installer any more. But more importantly, requirements.txt is the requirements-like format that's a feature of pip - if we're not going to standardise the new version, then we can just stick with requirement files...
I'm supportive of a simpler "secure requirements" format being a standard, but I want us to be very clear about the use cases, and about what would be required from project management tools like uv, Poetry, PDM, hatch, etc., for the new format to achieve its goal of addressing those use cases.
For example, if a use case is "a project can set up Dependabot checking and Dependabot will be able to determine the project's dependencies without needing to understand every workflow tool individually", then how will that work if the project's chosen workflow tool doesn't maintain an up-to-date pyproject.lock file, but requires an explicit export step?
To put this another way (continuing with the Dependabot example) - what is uv's expected solution right now for projects that use uv to work with Dependabot? Will that change if we have a "requirements-style" lockfile standard? If it won't, then Dependabot isn't a very good motivating use case for the new lockfile standard...
Repeat as often as necessary for the other motivating use cases for the new format.
Yeah, that's the problem I'm concerned about with only standardizing an "export format". I can export a requirements.txt from PDM's or uv's lock right now, but Dependabot will just issue useless PRs updating it instead of the lock file. So what's the point?
That's why I thought about standardizing some commands that tools like Dependabot could call, along the lines of {executable} list-deps and {executable} update-dep-version {dep} {min-version}. On the other hand, Dependabot could fix this problem on their end by allowing a project to define a workflow that installs their tool of choice and generates an update. That honestly sounds better, but it also seems unlikely to happen.
As a user of workflow tools (Poetry & uv) I also don't think that Dependabot is a great motivating example if we are trending towards "this new standard can't fully satisfy the needs of poetry.lock and uv.lock".
Users of Poetry/uv would still need to check in Poetry/uv-specific lockfiles. Checking in another lockfile just for Dependabot would be a big DRY issue. Also, Dependabot would submit PRs to update the pyproject.lock but not Poetry/uv's lockfiles, so only the alerts would be useful, not the PRs.
As I've been following this PEP (as a user), I've been wondering whether the "let's wait" option might not be terrible, given that unifying everyone at this point looks very difficult.
For Dependabot specifically, I'd expect that for a tool as popular as uv, Dependabot will add support (the issue has 214) like they did for Poetry when it reached a certain popularity.
As someone who works with people new to Python a lot, I do worry a bit about the teachability cost of yet another lockfile that doesn't fully replace the needs of existing workflow tools. There'd then be three things to make new users aware of, even if they stick with a single workflow tool (e.g., uv):
requirements.txt (since old formats donât go away)
pyproject.lock
uv.lock
I wonder if waiting for tools like uv to mature and revisiting this in the future could be an option?
Another option is simply to declare the Dependabot scenario out of scope - after all, people manage at the moment with only requirement files. The problem is, if we have to reduce the scope too much, there's no justification for a standard lockfile format any more.
Is Poetry not mature yet? I don't think the problem is about tools needing time to mature; it's more about there being multiple tools, with multiple philosophies. We could wait for one tool to become dominant, but at that point there's no need to standardise - services like Dependabot won't need to support multiple tools.
IMO, if we don't manage to standardise something in this round, we can pretty much assume there will be no standardised lockfile for the foreseeable future. Which may be fine, but we shouldn't kid ourselves that we're just "waiting a while".
I'm using "Dependabot" as a stand-in for any tool that wants to inspect the dependencies of a project without needing to add specific support for every possible project management tool. If we're saying tools that introspect dependencies are out of scope, what is in scope?
If every tool needs to support every project management tool individually, that immediately puts any new project management tool at a disadvantage in the future. That's the current state of things now, but ideally standardization would level the playing field for novel development.
Were earlier revisions looking to do that for you? I.e., was the "set of packages" approach seemingly going to work for you? I ask because if it would, then I think @frostming and PDM might be supportive as well, which would mean uv is still exporting a file but everyone else is aligned.
The file format is versioned for a reason; extending it later is always possible.
Does that actually have to be explicit after the first time? Nothing is stopping a tool from recording the fact that the user wants a cross-compatible export of their lock data and so any time the tool-specific file is updated the exported version is updated as well.
Let's take this even a step farther back: why do any tools other than pip support requirements.txt files when they have their own lock file format? My guess is that it's because requirements.txt is the closest thing we have to an explicit list of things to install that every tool supports. Otherwise no one would be able to get a foothold, as no one is going to want to rely on a workflow tool that, e.g., no cloud host will support because they don't want to put the time in to support some other bespoke file format (if I'm wrong I hope a tool author will correct me).
To me that says there's a need for a standard, and it's currently being filled by a convention.
Yes, because I have literally spent years on this. Waiting longer isn't going to magically make this any easier unless all but one project shuts down and we are okay letting a tool make decisions via convention (which we did in the past, and we didn't like it).
Yeah, I don't see this going better sometime in the future either.
Yes, so this is essentially the "minimum viable standard" - we accept that all we will do is replace the requirements.txt format with a properly standardised version, and leave it at that. We won't solve any new problems that way, but we will (hopefully) cover all of the use cases that are currently handled by requirements files.
(Sorry, I know this is just restating what you've said previously, but I think it's worth emphasising in the light of @charliermarsh's comment.)
For the same reason they use requirements.txt. If you assume that everywhere uv deals with requirements files it would also be updated to deal with the new lockfile standard, that's basically what the "requirements 2.0" form of the lockfile proposal would look like. Does that help put things into context?
I'll give one small use case for why I both use uv and today only produce a requirements.txt-type file with uv pip compile. My current workplace has some security audit tools. Those tools work on requirements.txt files but not other lock files. So for compliance my team needs a requirements-like file. If I were to add uv.lock I'd be stuck having both.
Although honestly, if we invent a new pyproject.lock file, I'm weakly skeptical of our audit tools adding support for it anytime soon. It'll probably come eventually if it becomes a widely used standard.
Yeah, we support it as an escape hatch. It lets you take the output of the workflow tool and operate on it from other tools and other workflows.
Concretely, it was asked for here, with motivations being (1) install in an environment where uv is unavailable, and (2) Dependabot.
Another comment mentions that they can then run safety over the file (and get Dependabot support). And then, within uv, they can run uv pip install --target from the exported requirements file to get a zip archive for AWS Lambda. (So, they export uv.lock to requirements.txt via uv export, then uv pip install from the requirements.txt, since the uv pip interface doesn't operate on uv.lock.)
Again, I'm fine with a standard that replaces requirements.txt for this kind of interoperability! It makes sense. Our goal, though, is to solve these problems over time so that users don't need to uv export. Taking Lambda as an example: we might want to create a dedicated uv command that bundles all of your dependencies as a zip, so that you don't need to export-and-install with --target.
(I get that this is up to the tool, but I wouldn't be thrilled with this outcome, as a tool author and a user. Keeping them in sync sounds like a pain... I think there's also some complexity here given that projects represent a wide range of scenarios (extras, dependency groups, resolver configuration), but the lockfile itself represents a single scenario - so in your original example, it'd be updating five lockfiles every time?)
As soon as you need to support extras and groups (as in Poetry), doesn't that cause problems with those earlier revisions? Or am I misunderstanding the suggested format?
Even if you total up the alternative formats which currently support a lockfile, that's ~30% to pip's 77%. I worry that we're losing sight of what people are actually using - most users aren't using universal resolution. Similarly, I think much of uv's growth has come from its compatibility with the requirements.txt format.
(I guess we'll probably get the 2024 data sometime in early 2025? That may help inform how much of uv's userbase is using the universal lockfile.)
I don't think this is fair. Yes, Poetry is a mature project management tool, but there are limitations in their universal lockfile. Solutions to these limitations are new and changing rapidly. I don't think that anyone, including uv, has a universal lockfile that's clearly mature enough for standardization.
While I think it'd be great if we could standardize both universal and single-platform lockfiles in one go, I feel like this discussion has made it apparent that there is significantly more complexity in the universal case and a higher risk of failing to actually improve interoperability. I think there's a much simpler path to standardizing a format with the same goals as requirements.txt, but with a modern take on topics like security and reproducibility.
I agree with the sentiments above that it would be pretty disappointing for this effort to go nowhere. Perhaps it would be worth focusing again on how we can design a new single-platform format which doesn't inhibit us from introducing a multi-platform format in the future? I think this still has some problems with how, or whether, you specify the "groups" that are being included in the file - I don't think there's been a clear solution proposed for that, other than arbitrary strings, which breaks interoperability.
Please feel free to point out if I'm missing some critical prior discussion; it's challenging to follow this whole thread.
I know you are, but others like @davidism won't be so thrilled if the outcome is "yet another file to keep track of", so we should avoid that if we can.
Thanks for the segue...
While I was trying to fall asleep last night (yes, this whole endeavour consumes way too much of my mental energy), I started to think about how this compares to pyproject.toml and why that seems to be successful (and please don't argue with me if you think it isn't; that would be a distraction for this topic). And I thought about the three things pyproject.toml gives you:
[project] allows for some tool portability which also helps to prevent lock-in.
[project] allows for some data analysis most of the time (but with dynamic and [tool] there are escape hatches for tools that need them which technically hurt this and point 2, but people seem generally okay with the trade-off).
[build-system] allows for programmatic understanding of how to work with a source tree, in this case to create a wheel.
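For concreteness, here is a minimal pyproject.toml illustrating all three points (the project name and tool settings are just examples):

```toml
# [project] gives portable, statically analysable metadata (points 1 and 2);
# "dynamic" is the escape hatch mentioned above.
[project]
name = "example-app"               # illustrative name
version = "1.0.0"
dependencies = ["requests>=2.31"]
dynamic = ["readme"]               # this field is computed by the build backend

# [build-system] tells any frontend how to build a wheel from the source
# tree without tool-specific knowledge (point 3).
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

# [tool] is free-form, tool-specific data that other tools simply ignore.
[tool.hatch.metadata]
allow-direct-references = true
```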
I think the mistake I have made is trying to meet all three points with lock files. I tried to make this happen with the separation of lockers and installers. But the split ends up hampering lockers so much that they can't really operate very much outside of the spec, and that prevents lockers from filling in gaps or innovating. And that leads us to the current situation where no one is getting what they want from the PEP, because there's always that next use case to support.
So what if we stop trying to separate lockers from installers, and instead let tools write lock files with some tool-specific info that no one else might be able to work with? That tosses out point 1, but I don't know if tool lock-in matters that much for something that's meant to be written by code and not by hand.
You can still have point 2, where basic information is generally available, maybe even in most cases, but in some instances some data will be tool-specific. But based on our experience with dynamic and [tool] in pyproject.toml, it doesn't seem to be a massive concern that you can't always access all data upfront statically (Brainstorming: Eliminating Dynamic Metadata - #110 by ncoghlan notwithstanding).
Point 3 is possible if we make lock files document what tool to install and how to run the tool that created them (think [build-system] but with a CLI API). And this doesn't necessarily need to be extensive and cover all use cases, but I'm sure we can think of some common use cases for what should be specified in the API. And since lock files aren't meant to be hand-written, we don't have to worry about being too verbose or anything.
Does reframing things from this perspective and ditching the locker/installer dichotomy hold enough promise to suggest we could still maybe get that "any and all scenarios" lock file dream to come true? I would bring back allowing for multiple lock files, as exporting with no [tool] tables would be like having a standardized requirements.txt file for when that's critical to someone. The PEP would be changed as necessary so that there was flexibility at all key points for tools to tweak things when they needed to, while still covering the common scenarios.
I'm not sure I follow - isn't that what the [tool] section is for already? Are you simply suggesting that we will no longer require that a lockfile with a [tool] section can be installed by a tool other than the one that produced it?
Where would pip fit in this new framing? I don't see pip ever being a "locker" in the sense of doing a locking operation, so what is pip if not an installer that isn't also a locker? I don't see pip implementing something that reads a lockfile, determines the file was produced by uv, installs uv, and runs uv sync in order to install from a lockfile. So the [build-system] analogy breaks down for me at that point.
I'm not against the idea - I just don't feel that I understand what you're getting at well enough to have a view on it yet.
Would an example here be something like: the spec defines a flat list of packages - name, version, where to find it - but then everything else is up to the locker/installer? E.g., for uv, we'd put all the dependency metadata and graph in [tool.uv]?
(Just as an example: this would allow tools that need to ask "what dependencies might be included", like Dependabot, to function.)
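A sketch of what that split might look like - this layout is purely hypothetical, not from any PEP draft, and the URLs and hashes are elided placeholders:

```toml
version = "1.0"    # hypothetical format version

# Common section: the flat "name, version, where to find it" list.
# Any introspection tool (Dependabot, auditors) can read this without
# knowing which locker wrote the file.
[[package]]
name = "requests"
version = "2.32.3"
url = "https://files.pythonhosted.org/.../requests-2.32.3-py3-none-any.whl"
hash = "sha256:..."

[[package]]
name = "urllib3"
version = "2.2.2"
url = "https://files.pythonhosted.org/.../urllib3-2.2.2-py3-none-any.whl"
hash = "sha256:..."

# Tool-specific section: dependency graph, extras/groups, resolver
# settings, etc. Only the tool that wrote the file needs to read this.
[tool.uv]
resolution = "highest"
```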
Given Brett's statement, wouldn't pip reasonably both read and write a "plain" lock file in this format as an alternative to requirements.txt? Certainly a lock file that doesn't interact with pip would be doomed to failure.
My question is what attempt will be made to have a minimal level of interoperability with the tool-specific lock files? It has to be more than "zero", otherwise it's no different from keeping uv.lock, poetry.lock, etc. in parallel with a next-generation requirements.txt. And would @pf_moore accept this if pip has to error out when handed, e.g., a uv-flavored lock file?
If not "nothing", then what basic data should be required in the common section of the lock file? Certainly some sort of listing of packages. Do we include markers? Should tools be encouraged to make the "common" package listing self-consistent?
As a user, I picture that the "common" section should target some minimum "basic" installation (no extras or optional groups) suitable for CI or for running an application, but supporting at least the major platforms. This avoids the need to deal with potentially conflicting extras or with specifying groups.
That said, I can imagine that trying to implement that is probably much easier said than done.
Pip doesn't write requirements files, which is why I don't expect it to write lockfiles either. I can see pip installing from a lockfile, but only if it can do so "natively" (i.e., without needing to know which tool wrote the lockfile).
That's my point. I think it would have to be "zero", because pip won't include tool-specific knowledge (we're still working on removing the places we did that for setuptools). The best I could imagine would be aborting if the lockfile had a [tool] section.
I honestly thought this was what pip freeze is for. Is there no expectation for pip to provide a similar output for this format?
Agreed that pip (or any tool) should not be expected to have any specific knowledge of any other tool. I interpreted @brettcannon's proposal as striving towards some basic level of interoperability, but perhaps I misunderstood.
Ah, sorry. pip freeze does produce requirements-style output. The information necessary to produce a lockfile (specifically, the URL of the file used to install a package) isn't available, so we can't produce lockfile-style output from that. If PEP 710 gets approved and implemented, we will be able to produce lockfile output, but only as long as all the packages in the environment have provenance data, which in practical terms means newly created environments, built with tools that support PEP 710.
Also, I'll point out that a standalone tool could just as easily (with PEP 710) create a lockfile that specifies how to recreate a given environment. There's nothing really in pip freeze that requires it to be part of pip. So while pip would probably at some point gain a pip freeze --format=lockfile command, that doesn't mean pip necessarily has to be a locker.
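To make the "standalone freeze tool" idea concrete: PEP 710 proposes recording a provenance_url.json file in each installed package's .dist-info directory, and a tool could assemble lockfile-style entries from that. A rough sketch under that assumption (the file name and layout come from the PEP draft and could change; packages installed today will mostly lack the file, returning None for the URL, which is exactly the gap described above):

```python
# Sketch: collect (name, version, provenance URL) for every installed
# distribution. provenance_url.json is the PEP 710 proposal; distributions
# installed without PEP 710 support have no such file, so their URL is None
# and they can't be locked to a concrete artifact.
import json
from importlib import metadata

def freeze_with_provenance():
    entries = []
    for dist in metadata.distributions():
        raw = dist.read_text("provenance_url.json")  # None if the file is absent
        url = json.loads(raw).get("url") if raw else None
        entries.append({
            "name": dist.metadata["Name"],
            "version": dist.version,
            "url": url,
        })
    return entries
```

Nothing here depends on pip itself, which is the point: with provenance data available, any tool can produce this output.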
I'm hoping that's the case, but that's why I asked for clarification - I'm not completely sure what Brett intended here.