Pre-PEP: Include pylock.toml files inside wheels

This is outdated: See Pre-PEP: Add ability to install a package with reproducible dependencies for the new proposal

:page_facing_up: Proposal: Including pylock.toml in Wheels

I would like to propose a PEP that allows including pylock.toml or named lock files like pylock.<name>.toml (as defined in PEP 751) inside a wheel’s *.dist-info/pylock/ folder.
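
For illustration only, a tiny sketch of where such a file would appear inside a built wheel (the wheel filename and package name below are placeholders, not part of the proposal):

import zipfile

# After building, the proposed layout would contain an archive member like
# "mypkg-1.2.0.dist-info/pylock/pylock.toml" (or pylock.<name>.toml).
with zipfile.ZipFile("mypkg-1.2.0-py3-none-any.whl") as whl:
    lock_paths = [n for n in whl.namelist() if "/pylock/" in n]
    print(lock_paths)  # e.g. ['mypkg-1.2.0.dist-info/pylock/pylock.toml']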

As explained in PEP 770, adding new files or folders to a wheel does not require a new metadata version.

:books: Background

Although PEP 751 was accepted, there is currently no standardized way to distribute lock files, apart from placing them in a version control system (VCS) repository.

Most projects rely on CI/CD pipelines to validate their builds.
It therefore makes sense to include the lock file inside the final build artifact — often a wheel — that is later distributed.
These artifacts are typically the result of successful CI/CD runs and therefore signal that the package works as expected with the specified dependencies.

Distributing lock files via wheels eliminates the need for additional infrastructure just to support reproducible installations.

:wrench: Use Cases

  • Packages that are frequently installed as tools (e.g., via pipx or uvx/uv tool) MAY include a lock file, which MAY be used by the package installer to create a virtual environment.
  • Companies maintaining software composed of multiple packages can use their existing infrastructure to install reproducible builds of their software stack — with the option to easily roll back to earlier versions, including dependencies.
  • Library authors can distribute the set of dependencies their library was tested against. This helps less technical users to identify whether issues are dependency-related.

:cross_mark: Rejected Alternatives

  • Lock files could be distributed alongside wheels, similar to how .METADATA files might be handled (warehouse issue #8254):
    • This approach requires an additional HTTP request to check for existence.
    • It becomes difficult or impossible to detect named lock files.
    • It would require changes to wheel-hosting services (e.g., PyPI’s Warehouse or devpi).

:package: How Should Package Installers Handle Lock Files?

Assuming the installer supports lock files in wheels (a rough sketch of this decision logic follows the list below):

  • When installing a wheel as a library:
    • The installer SHALL NOT consider any lock files unless explicitly instructed by the user.
  • When installing a wheel as a tool (e.g. installing it with pipx or uv tool):
    • The installer MAY use the lock file unless the user explicitly opts out.
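
A minimal sketch of this decision logic, using hypothetical names (this is not an existing installer API, just an illustration of the proposed defaults):

from enum import Enum, auto


class InstallKind(Enum):
    LIBRARY = auto()  # e.g. `pip install pkg` into an existing environment
    TOOL = auto()     # e.g. `pipx install pkg` / `uv tool install pkg`


def use_lock_file(kind: InstallKind, user_opt_in: bool, user_opt_out: bool) -> bool:
    """Decide whether a pylock.toml bundled in the wheel should drive resolution.

    Hypothetical policy following the proposal: libraries never use the
    bundled lock unless explicitly requested; tools use it by default
    unless the user opts out.
    """
    if kind is InstallKind.LIBRARY:
        return user_opt_in          # SHALL NOT be used unless explicitly requested
    return not user_opt_out         # MAY be used unless the user opts out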

:thinking: Things to Consider

Python Version Compatibility

As discussed in the thread on Locking a PEP 723 single-file script, special attention must be given to Python version compatibility.

Wheels are usually built once for a specific Python interpreter version. When a newer Python version is used, users are often forced to build from sdist — a process that can be difficult or even impossible in some environments.

Therefore, I suggest loosening the strict requirement that requires-python in the top-level metadata of pylock.toml may only specify a lower bound, and instead adopting the same semantics as packages.requires-python, which uses standard version specifiers.

In most cases, existing tooling already supports upper bounds for Python versions, so minimal or no changes should be necessary.

The requires-python field could indicate the range of Python versions under which the package was tested during CI/CD.

Also note: since the lock file is distributed inside a wheel, the wheel itself can already be limited to a specific Python version.
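
As a sketch of the suggested semantics, an installer could evaluate a full specifier set (lower and upper bound) against the running interpreter using the packaging library; the specifier value below is only an example of a tested range, not a prescribed default:

import sys

from packaging.specifiers import SpecifierSet
from packaging.version import Version

# Example value for the top-level requires-python key, expressing the
# range the package was actually tested against in CI/CD.
requires_python = SpecifierSet(">=3.10,<3.14")

running = Version(".".join(map(str, sys.version_info[:3])))
if running not in requires_python:
    print(f"Warning: lock file was not tested against Python {running}")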

:locked_with_key: Security

In the PEP 723 single-file script discussion, some raised valid concerns:

Locked, outdated dependencies might contain known vulnerabilities (CVEs).
However, unlike ad-hoc scripts, package builds typically occur in CI/CD environments with tools like Dependabot or Renovate keeping dependencies up to date.

Another concern is that lock files might reference untrusted or unexpected sources via arbitrary URLs. To mitigate this:

A package installer SHALL require user confirmation if any requirement in the lock file is to be installed from a source whose host does not match the source of the original wheel.

This is another benefit of bundling the lock file inside the wheel: it enables verification that dependencies come from trusted sources (e.g., the same index).
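
A minimal sketch of the host comparison described above, assuming the installer knows the URL the wheel itself was fetched from (function and variable names are illustrative, not an existing API):

from urllib.parse import urlparse


def needs_confirmation(wheel_url: str, locked_urls: list[str]) -> list[str]:
    """Return the locked URLs whose host differs from the wheel's own source."""
    wheel_host = urlparse(wheel_url).netloc
    return [url for url in locked_urls if urlparse(url).netloc != wheel_host]


# Example: anything not served from the same index host would require
# explicit user confirmation before being installed.
suspicious = needs_confirmation(
    "https://index.example.org/wheels/cowsay-6.1-py3-none-any.whl",
    [
        "https://index.example.org/wheels/rich-13.7.1-py3-none-any.whl",
        "https://other-host.example.net/wheels/rich-13.7.1-py3-none-any.whl",
    ],
)
print(suspicious)  # only the URL on the mismatching host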

:hammer_and_wrench: How to Include Lock Files in Wheels

Including lock files in wheels would be the responsibility of the build system.

The build backend could either copy an existing pylock.toml file or generate one dynamically during the build process. This would provide yet another mechanism for updating and locking dependencies as part of the build workflow.
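
As a rough illustration of the "copy an existing file" variant, assuming the proposed *.dist-info/pylock/ location (this is not a hook that any existing backend exposes; the function name is made up):

import shutil
from pathlib import Path


def add_lock_file(project_root: Path, dist_info_dir: Path) -> None:
    """Copy the project's pylock.toml into the wheel's dist-info/pylock/ folder.

    A build backend could run something like this while assembling the wheel;
    generating the lock dynamically at this point would be the alternative.
    """
    source = project_root / "pylock.toml"
    if not source.is_file():
        return  # nothing to bundle; the lock file is optional
    target_dir = dist_info_dir / "pylock"
    target_dir.mkdir(parents=True, exist_ok=True)
    shutil.copy2(source, target_dir / "pylock.toml")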

Disclosure: I used an LLM to proofread/improve (NOT generate) this post as I am not a native English speaker and a bit dyslexic.

2 Likes

On a first read, at a high level this seems directionally like a good capability to have: native support for packaging a complete environment spec in a wheel.

I appreciate that you called out this aspect as something important to consider. A question on it: by what mechanism are you envisioning package installers will determine whether a given installation action is for ‘library purposes’ versus for ‘tool purposes’?

It seems likely to be challenging or impossible for an installer tool to automatically inspect its context and reliably make that determination. So, it would have to be decided at wheel build time (by specifying a default) and/or specified by the user invoking the installer at install time, I would think?

Or, perhaps it might be the simple presence or absence of the lockfile in the wheel that would decide? If a lockfile is present, it should be applied (or, applied by default, at least)? That potentially runs the risk of behavior the user might not expect, though.

2 Likes

by what mechanism are you envisioning package installers will determine whether a given installation action is for ‘library purposes’ versus for ‘tool purposes’?

I only pointed to that in the Use Cases.

There are dedicated entry points/programs to install a package as a tool:

They all install a package into an isolated environment and make it available on the PATH (or run it directly).
So we know the user wants to install a tool, and we also know that we have a dedicated venv for it.
In ALL other cases we assume a library installation.

This also means that injecting additional dependencies into an environment, e.g. uvx --with some_pkg cowsay hello, will ignore the lock file of some_pkg but consider the lock file of cowsay.

2 Likes

My feeling is that this is conflating the distribution format for a single package with the distribution story for an entire application.

In particular, there is no reason that a wheel should be assumed to be the final distribution artifact for a Python application - we can build a zipapp, Android app, iOS apps, Windows installers, Docker containers, macOS .dmg etc.

12 Likes

Aha - missed that. Thanks!

The existence of tools like pipx or uvx proves that users already use wheels/pypi to install tools/entire applications.
They are extremely helpful, especially for new users, and they can be used cross-platform.
Authors of tools for the Python ecosystem can simply tell other Python users how to install their tools.

By contrast, there is no standardized or easy way to convert a Python package into a full-blown app.
And if you do, the ways to install the result are very platform-specific, which makes installation guides complicated.
Also, building is only one part of the problem. You need to manage and distribute updates, so you and your users need package managers as well as hosting services to distribute your app.

With the proposed approach, users and package/tool creators can stay in the Python ecosystem they are used to, without having to learn a probably platform-dependent way to install a new package that, in the worst case, even needs admin permissions they don't have.

For people responsible for infrastructure deployment, it eliminates the need to manage yet another package index (or worse, several more, due to platform restrictions).

Furthermore, it is up to the package installer/user whether to use the included lock files, and in most cases they would be ignored.

And as explained in the Use Cases, even for library authors it can be helpful to get more meaningful bug reports, as there will be an easy way for end users to replicate the dependencies that were used to test the library.

So, I understand the concern that this mixes installation of a package with distribution of an app (which indeed are two different things). But given the advantages, I would argue that usability outweighs purity in this case.

My understanding so far has been that tests should only be distributed within the sdist to keep the wheel size small, so including the pylock file in the wheel would somewhat defeat that purpose, wouldn't it?

3 Likes

The proposal never spoke of including tests, only lock files, whose size is small enough.
The lock file could simply be used to reproduce an end-user bug with a given dependency set that is known to work, without requiring the user to know about git etc.

And again: all of this is optional. Library maintainers most likely will not include lock files by default. They will only do so once they feel the need for their end users to have a specific environment and feel that those end users are not technical enough to clone the repo etc.

I like this idea. I can imagine uv, pip, or similar tools offering a --locked flag as an opt-in to install wheels with their locked dependencies. Even if this proposal primarily benefits pipx-style isolated installs, it would still be incredibly useful because these installations are already very popular due to their ease of use and intuitive nature.

Another significant benefit of a --locked flag emerges when a dependency releases a breaking change. For instance, if you install foo, which depends on bar>=1.0.0 and has bar internally locked to bar==1.0.0, a standard pipx install foo could pull in a new, incompatible bar==2.0.0 if it’s released. This would result in a broken installation of foo. The --locked flag would allow you to avoid this, ensuring foo continues to install and run correctly with bar==1.0.0 until foo itself publishes an update compatible with bar==2.0.0 or later.

1 Like

I’m a strong -1 on focusing on wheels as an application distribution format (and that includes proposals like this, which are specifically targeting that use case). I think the Python packaging community needs to put a lot more effort into the question of how people should distribute applications, on the understanding that the majority of applications are aimed at users with no understanding of, or interest in, Python. This is a huge topic, and one we’ve barely even scratched the surface of yet. We don’t even have a realistic understanding of the range of use cases that exist[1]. But I think we have to look realistically at what people want if we’re to make progress. And outside of the Python developer community, I’m pretty sure people don’t want uv or pipx. The JavaScript community have npm run, but no-one thinks that VS Code should be invoked from the command line as npm run vscode.

Lock files are a great tool for developing applications, but they should be used in the build process, where Python code is assembled into a working application that can be shipped to users.

To give one specific example here, the failure modes of wheels as applications are incredibly user-hostile. For example, if the user is running on a platform that isn’t supported, either by the application itself or by one of its dependencies, tools will fall back to building from source. When native code is involved, this could invoke a compiler (if the user even has the compiler installed) which would then fail with a bunch of errors that the user has no chance of understanding.


  1. Which is why I’m so uncomfortable with this proposal - it focuses on one particular use case, when we don’t even have an understanding of how important that use case is in the broader context ↩︎

13 Likes

I agree that application distribution for non-Python users is a huge and largely unresolved topic — and you’re absolutely right that we need better tooling for that space.

However, I think we’re talking about two very different problem spaces here.

Just by the fact that wheels require a Python interpreter and a Python package installer, it’s already clear that they are inherently targeted at people within the Python ecosystem — not general end-users. Tools like pipx, uv, or any form of pip install are not aimed at the general public, but at developers or technical users who are already working with Python.

The proposal I’m supporting here is not about turning wheels into a general-purpose application distribution format. It’s about improving the reproducibility of workflows within the Python user base, particularly for command-line tools or developer-facing apps. These workflows already rely on tools like pipx or uv and are not intended for non-technical end-users.

I agree that these kinds of failures are frustrating — but this is true for any use of pip install. It’s something Python developers have learned to deal with. If you want to distribute an application to non-Python users, then yes, you should use something like BeeWare, PyInstaller, shiv, or similar — that’s a totally different domain.

This PEP specifically avoids touching that domain.

Even if a package is ultimately meant for non-Python users, and the maintainer doesn’t have the resources to build full-fledged native installers, then asking users to run something like uvx my_tool is still more controlled than having them run pip install my_tool and get unpredictable dependency resolution errors. If we can make that fallback more reproducible, that’s already a clear win — even if the user experience isn’t perfect.


Back to the actual problem this PEP tries to solve:

There’s currently no standard way to install a CLI tool or even a library like Apache Airflow, which has many dependencies and a high risk of breaking due to upstream updates, in a reproducible way — unless you manually track down a lock file (if it even exists) somewhere on GitHub or as a build artifact.

This change proposes a simple, Python-user-friendly way to install such a tool along with a vetted, working set of dependencies. It’s not perfect, but it’s a big improvement over the status quo, especially in real-world use cases where:

  • Package authors already build lock files to ensure stability.
  • End-users want to reuse those lock files without needing complex setup or external tooling.

I don’t understand how this proposal would make the situation worse. Could you elaborate?

EDIT: Used LLM to improve message readability.

Couldn’t lock files explicitly help with this though?

For example, applications could ship lock files that contain only wheels, and then the installer (pipx, uvx, etc.) can explicitly tell the user that their platform isn’t supported.

Compared to the very real current situation of users installing applications with unbound dependencies via wheels.

I agree with you that wheels are not ideal for distributing applications, and it would be good to find a better solution. But I also think we can’t ignore that this is happening right now, and so small changes to the spec that could improve the safety for those users would also be a big benefit. Rather than waiting around for a future hypothetical solution.

2 Likes

I simply don’t think that we should formalise a workaround for a larger problem in a standard. I don’t think it will be as useful as the OP expects, and I think it will be bad for the packaging ecosystem in the long term.

I guess we’ll simply have to agree to differ. If this ever becomes a formal PEP, and I’m still PEP delegate at that point, that’s when it’ll be worth trying to change my mind. For now, though, I’m going to bow out of this discussion.

4 Likes

I’m trying to understand the context of where you’re coming from: do you mean “workaround” as in a lock file, or workaround as in running CLIs and applications via wheels?

Specifically, is it your view that users should never use pipx or uvx to run a Python CLI or application?

If they shouldn’t, I will note that pip does it as part of its build process: pip/.github/workflows/ci.yml at 25.1.1 · pypa/pip · GitHub, so it’s pretty ingrained into the Python packaging ecosystem, and I don’t see how that’s going to change whether this PEP is accepted or not.

If they should, but a lock file is a “workaround”, then what do you mean by that? Lock files can provide explicit dependency safety, and protect against package hijacking and dependency confusion attacks. I only see this as a path forward to providing a little bit more safety for this case.

2 Likes

I mean “workaround” as in “putting a lock file (which is about pinning exact requirements) into a wheel (which was designed for libraries which don’t pin their dependencies)”. It’s a confusion of abstract vs concrete requirements in my opinion.

Not at all. It’s an expedient and useful approach. But the model was designed around pip install <app>, which resolves the package dependencies at install time. If we want a lockfile-based equivalent (I don’t, personally, but that’s not the point) then we should redesign the model based around lockfiles (and probably locking a particular Python interpreter version as well, if we want reproducibility).

I never said it wasn’t. Entry points have been part of the wheel spec since it was introduced. I’m not disputing that.

Nobody ever asked for requirements files (with pinned dependencies) to be shipped in wheels. I find it difficult to believe that this use case has suddenly appeared, right when lockfiles get standardised. So I have to ask, what were people doing about this before lock files - and why can’t they carry on doing that?

There’s also a bunch of UI questions here. For a start, would the lockfile include the containing wheel itself? If not, then it’s not a complete lock. But if so, then what if the wheel doesn’t match the lockfile? Come to that, why isn’t the lockfile on its own sufficient? Maybe the UI should actually be pipx run http://url/to/pylock.toml entrypoint_name? If you wanted to omit the entry point, you could have a tool.pipx.entry_point field in the lockfile.

My feeling is that this proposal is too focused on the solution and hasn’t really looked at the underlying problem.

6 Likes

You’re right that this use case did not suddenly appear; I definitely remember @potiuk asking about applications being able to distribute optional constraints files, with the same motivation as this.

In the meantime you can see applications producing their own workarounds, such as:

Side note: both of these solutions often restrict you to installing these applications via pip, or via tools that have copied pip’s features exactly, rather than following standards.

What I feel is far more common, though, is that when dependencies break, users give up or complain to uv, pipx, the application they are trying to install, or the dependency that broke. And it does not occur to most of those involved that they could come to DPO and discuss a new standard.

I agree that this is a good question and should be answered by the PEP.

If I generate a lock file for the pip project using pip lock ., it looks like this:

lock-version = "1.0"
created-by = "pip"
packages = [
    { name = "pip", directory = { path = "." } },
]

My instinct is that’s how the lock file should include itself in its own distribution. But perhaps there’s some issue I am not thinking of. It should certainly be well specified by the PEP.

Agreed these are important questions the PEP should answer.

One answer might be that any lock file uploaded to PyPI should only point to distributions on PyPI, thus giving users a way to validate all wheels and deferring some level of trust to PyPI that the lock files aren’t being changed over time.

I think that’s a valid criticism, that the PEP author should take a step back and survey the problems users have.

Although I am of the opinion that there are real problems here that can be solved. But maybe they can, or should, be solved at the tool-specific level. Perhaps someone should also gather feedback from pipx and uv by opening feature requests for installing tools directly from a lock file.

This is a coincidence: an organisation I work for is currently NOT using pinning at all; it is mostly regulated by limiting which packages exist on different indexes, an increasing pain point that has been on my TODO list for a long time.
It happens that I just visited EuroPython to learn about the new standard.
Having also learned about SBOMs / PEP 725, and already having a package index that is secured (i.e. only specific customers can access specific projects/wheels), putting the lock file in the wheel just felt like too good a fit.

Coming from all the problems with breaking dependencies, I really hope that I am not confusing things here. But I will circle back with some colleagues to make sure.

Because then we lose package manager features like installing a specific version of the application, and we need to create yet another service to manage and distribute the lock files.
The problem I want to solve is how to easily distribute a versioned lock file without:

  • writing another package manager
  • and its service counterpart
  • teaching users how to use it
  • defining what a release is and how to sync it with the wheel package release

That is an interesting question. From an installation perspective I would say no, it doesn’t. It would create a chicken-and-egg problem, as it would need to know its own hash and URL. Using a wheel’s lock file implicitly installs that wheel.
From a post-installation perspective, the lock file becomes quite useless once copied into site-packages.
So we should look into that and into possible use cases (what could people use the site-packages copy of the lock file for, if anything? Could we simply not copy it?).

Can you specify what you mean by that?
I mean, of course we could define “pylock applications” that would do something similar. But is there any advantage or use case that can only be solved with this kind of model? I can’t think of any; on the contrary, I fear it would increase complexity and confuse users. E.g. airflow has version 3.4.1 but the lock has version 3.0.1: which airflow version is installed with the lock?
Is it the most current version, or did they simply forget to update/publish the lock?
It just feels easier to glue the lock file and the wheels together.

That is part of my suggestion, even if it is not formulated that strictly. Normally packages are tested against a range of interpreters, so IMHO it is fine to allow the tested range.
Also, interpreters can be harder to install/obtain in some environments[1].


  1. In our organisation we only want a specific Python interpreter version, to prevent developers from using features that are not compatible with all deployments, and we therefore disable uv’s Python installer. Also, standalone interpreters have some quirks, which make Python interpreter installation yet another can of worms that I don’t want to solve with this PEP. ↩︎

It’s also an important question whether PyPI even wants to become an application distribution hub, as opposed to a package index. Yes, I know it’s being treated as an application distribution point for many projects already, so maybe there’s no problem here. But again, formalising something without ensuring that the affected stakeholders (PyPI in this case) are OK with it is impolite at best.

One advantage of something like pipx run https://url/to/pylock.toml [entry_point] is that it can be implemented entirely as a tool feature, without needing any new standards. That would at least give an indication of how useful the functionality is.

2 Likes

Sorry, I thought that creating a PEP was the process for actually including all stakeholders. I will try to get them in the loop.

Actually, we could even put the pylock.toml directly within dist-info instead of a subfolder and still be in alignment with the standard.
After the stakeholder argument, I am not sure that would be a good idea.

The proposed pipx run https://url/to/pylock.toml [entry_point] syntax would not help in my use case, since it is missing the deployment/package manager features explained above.

I wouldn’t call the timing of this coincidental. Abstractions matter, and standardized lockfiles give us a better ability to discuss these topics than we had before the standard was accepted just days-- er, months ago!


The original phrasing of this thread presents itself as a standards-track discussion. I think that’s premature. We should establish a shared notion of what users want and need before trying to build out a specific solution.

And I have some fundamental concerns about this specific proposal. It mixes an environment specification (pylock.toml) with the contents of an environment (distributions/wheels). So it’s “inside-out” – the thing which goes “inside” contains a spec for the thing which is on the “outside”.

I think that inversion leads to a number of problems. For example:

  • How would locking work with this? Would pip-compile or a locker have to unpack dists and read their contents, rather than just their metadata?
  • What should a locker (e.g., uv lock) do when it finds a “nested lockfile” while trying to update, not freshly generate, a lock?
  • How should tools behave when installing packages which have conflicting lockfiles? What if those packages have compatible dependencies? What if they don’t?
  • Should tools ever ignore nested lockfiles? Does that match the intent of the maintainer who put a lockfile into their distribution?
  • When updating/reinstalling a package with a lockfile, are tools expected to do any differential analysis of the lockfiles from before and after the update?

I have confidence in the (often amazing!) abilities of our community of developers. I’m sure we could make it work, but if the design is wrong, the end result will be worse for users than finding a more suitable design.

And if “making it work” really is good enough, why don’t we try that before writing a standard?
I see three relatively simple ways that installation of a locked application via a lockfile could be supported by tools today:

  • Bundle a pylock.toml in your package as in-package file data (like py.typed, if you need an example) and make a tool which uses that file (a minimal reading sketch follows below)
  • Use VCS/HTTPS installs from a source repo, and make a tool which looks for pylock.toml in the repo root
  • Provide a Project-URL with a link to the associated pylock.toml (served/hosted separately)[1]

You can do this with a completely new tool, a pipx or uv fork, or by wrapping those tools.
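
For the first option in the list above, such a tool could read the bundled file with nothing more than importlib.resources; a minimal sketch, assuming the package ships the file as package data named pylock.toml:

from importlib.resources import files


def read_bundled_lock(package: str) -> str | None:
    """Return the text of a pylock.toml shipped as package data, if present."""
    resource = files(package).joinpath("pylock.toml")
    return resource.read_text() if resource.is_file() else None


# Hypothetical usage: a wrapper around pipx/uv could feed this text to its
# own resolver before creating the isolated environment.
lock_text = read_bundled_lock("some_pkg")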

(EDIT: I just noticed that the OP mentioned putting it in dist-info above, even without a standard, since the spec technically doesn’t forbid you from including extra files. I think that’s reading the spec a little too loosely, since it does reserve rights to all subdirectories of dist-info, meaning it reserves pylock.toml/. I’d rather not play with fire and just not touch that dir.)

This caught my eye because I think it’s also interesting to ask what community-owned distribution channels we should have. Should there be a Python community container repo? A zipapp repo? If PyBI work resumes and gets accepted (ref), where would that go?

PyPI is being used to distribute applications today. If we want to shift the balance of usage away from that, we need to provide or describe an alternative (read: superior) way to deliver the tools which are distributed that way. And that includes tools like mypy/pyright/pytype which may interact with the contents of a user-controlled Python environment. So that sounds like a package, unless we have some new ideas about how to distribute those tools.

I consider the status quo to be: “PyPI is a valid distribution channel for applications, if those applications are happy with the benefits and limitations inherent in being a Python package.”

This proposal is trying to alter what the tradeoffs are for a Python package.
That’s fine, but it’s not the only degree of freedom which we have as a community: as an example, we could allow packages to serve a “recommended lock” alongside their dists on PyPI.

But of course, any proposal needs to be well grounded and meet the usual standard of quality for community adoption. I think we’re having the early discussion, before it’s even clear if anything should be done.


  1. maybe dicey for security, but no worse than a setup.py ↩︎

3 Likes