This is part of why I think it would make sense to focus on the environment aspect, of which creating a lock file is just one particular use case.
No, that’s not a constraints file. In pip terminology a constraints file is a file that adds additional constraints, but will not otherwise cause anything to be installed. So if you have `mousebender==1.0.0` in your constraints file and you do `pip install -c constraints.txt requests`, you’ll end up with an environment without any mousebender installed. If you do `pip install -c constraints.txt mousebender`, you’ll get mousebender 1.0.0 (or an error if that creates an unresolvable dependency tree) no matter what other version specifiers for mousebender exist.
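Concretely (the version here is just for illustration):

```text
# constraints.txt
mousebender==1.0.0
```

```console
$ pip install -c constraints.txt requests      # mousebender is NOT installed
$ pip install -c constraints.txt mousebender   # installs mousebender 1.0.0, or errors
```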
This would be replacing the typical output of pip-compile etc., which is a requirements file with the full version set “locked” to specific versions, but not otherwise mandating where they come from. That’s actually more generally useful with how Python’s packaging is typically set up, because it means I can install from PyPI when I’m at home and from the company mirror when I’m at work, without having to recompile the lock file.
Even my solution is missing the ability to specify hashes without specifying where the files come from, which is also super useful.
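In other words, something shaped roughly like this (names and hashes are placeholders), where the pins and hashes are locked but the index is still chosen at install time:

```text
# requirements.txt, pip-compile style output
requests==2.31.0 \
    --hash=sha256:<placeholder>
urllib3==2.0.7 \
    --hash=sha256:<placeholder>
```

```console
$ pip install -r requirements.txt --index-url https://pypi.org/simple/
$ pip install -r requirements.txt --index-url https://mirror.example.internal/simple/
```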
I personally have no real interest in a lock file that doesn’t let me continue to use mirrors as normal.
It doesn’t really matter though if you think it’s asking for trouble or not. If you don’t specify the input format and how it’s compiled into this format, then people can do things like that. If you don’t want them to do that, then you need to specify the input format (but then you’re removing the ability for tooling to experiment).
Just to be clear, in my example the “lock file” would have that information fully embedded in it, but the “locker” would have an input file that allows specifying an external file for some data (as an example feature).
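To be explicit, a purely hypothetical locker input file (made-up tool, made-up keys, not proposed anywhere, just to illustrate “the locker may read external data” as a tool feature):

```toml
# hypothetical-locker-input.toml
[tool.hypothetical-locker]
requires = ["mousebender", "requests>=2.0"]
external-constraints = "constraints.txt"   # extra input the locker reads at lock time
```

The lock file the locker emits would still have all of that resolved and embedded; the external file only exists on the input side.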
So at a minimum the PEP needs to remove the idea that a tool like VS Code could generate lock files without implementing tool-specific code. Unless I’m misunderstanding, at least.
Yes, because you don’t know what the “locker” tool uses for its source of truth. You have an output that the locker produced, but you have no way to know that changes to that file will persist the next time the locker is run.
That sounds very wrong to me, and falls into the same kind of confusion that I first wrote about in setup.py vs requirements.txt · caremad. pyproject.toml is for abstract dependencies. You cannot conceptually use it as the input for a lock file (the only way you can do that is to generate an ephemeral “input” that contains one entry: the source tree that contains the pyproject.toml).
How does a pyproject.toml dependency specifier indicate that it needs to add my internal company PyPI? It can’t, and it never should be able to; otherwise you end up with keys in pyproject.toml that only sometimes matter, which is exactly the kind of spooky action at a distance that confuses people.
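For reference, the standard `[project]` table only carries abstract requirements; there’s nowhere to hang an index URL (the project name and deps below are made up):

```toml
[project]
name = "example-app"
dependencies = [
  "requests>=2.0",   # says *what* is needed, not *where* to get it from
  "mousebender",
]
```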
I understand poetry does this, but it is, IMO the wrong choice to make, and we should not be perpetuating that.
If that’s the goal of this PEP, then I would be a hard -1 on it (which of course I’m not the PEP delegate so that doesn’t block it).
I don’t think it’s purely bikeshedding. I have repositories with several environments that end up being created, and centralizing all of those lock files into a single directory just ends up making things way more confusing IMO. Now I have to worry about namespace collisions, unless we end up littering the directory tree with `pyproject-lock.d` directories.
As an example, I have a project that has two `docs.txt` requirements files in different directories (API docs, user docs). If I have to colocate the locked output of those into the same directory, I end up having to munge names around.
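e.g. a layout like this (paths are made up, but it’s the same shape as my project):

```text
docs/api/docs.txt    ->  both want to lock to something named "docs"
docs/user/docs.txt   ->  in a single central pyproject-lock.d/, so one has to be renamed
```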
You could take the `.pyc` approach and put directories colocated next to those files, but that needs to be decided at least and spelled out.
I mean, I have anecdotal evidence of it causing problems outside of a resolution context within Warehouse. My guess is that for the tooling that does this, the people who ran into problems with it just stopped using those tools. If you want I can do more digging and come up with more popular projects where it would be a problem.
Overall I don’t understand why we’re choosing this hill to die on. Nobody has suggested that there aren’t projects it will break on, just that they’re “rare”. OK, fine, let’s accept that at face value: why are we choosing to break those projects when we know in advance we might? What is the benefit to us? Slightly shorter lock files?
Yeah, the first “need to” was wrong.
Why would we assume this? This feels like a common thread in this discussion: the spec gives us the power to not have the locker deal with it. We need to either remove that power and mandate that the locker has dealt with it, or provide the tools to deal with it ourselves.
This is a common pattern for projects that want to produce a universal Python 3 wheel but don’t want to continue to support old versions of Python. `py3` technically works on 3.x, including 3.0, so projects will typically include `Requires-Python` to further filter it. Markers don’t solve it, because a marker applies to the entire file, not to specific files (and this spec doesn’t require the file to be for only a specific environment, so the locker can’t always “handle” it in advance anyway).
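e.g. (the project name is made up):

```text
foo-2.0-py3-none-any.whl        # the py3 tag technically claims any Python 3, back to 3.0
Requires-Python: >=3.8          # the metadata that actually narrows down what's supported
```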
Can’t we just suggest or mandate inline tables for hashes, if that’s the only problem? Designing a new format around a decision that we can already tell will cause us pain in the future, if something we’ve already had to do at least once happens again, seems like a bad call.
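i.e. something shaped like this (not the PEP’s actual schema, just a sketch of the inline-table idea), where adding another hash algorithm later doesn’t blow up the layout:

```toml
[[package.example.files]]
name = "example-1.0-py3-none-any.whl"
hashes = { sha256 = "<placeholder>", blake2b = "<placeholder>" }
```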
I personally would be very annoyed if using lock files turned all of my version references into URL references. Maybe I’m the weird one, but I can foresee that causing frustration and confusion.
That’s making the assumption that the installer and locker get upgraded in lockstep, no? Otherwise all the problems I mentioned still exist.
I don’t think pip has ever supported a wheel URL that didn’t end in a well-formed filename. I’d have to test it to make sure, but I’m pretty sure this problem doesn’t exist for anyone using pip. Do other installers implement it differently?
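For context, the distinction I mean (URLs are made up):

```text
https://files.example.com/packages/foo-1.0-py3-none-any.whl   # ends in a parseable wheel filename
https://files.example.com/download?pkg=foo&version=1.0        # no wheel filename; AFAIK pip won't accept this as a wheel
```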