Sometimes all I want is the source for modules - and I’ll work out the dependencies later - manually.
So, I have been trying something such as:
pip3 download --no-deps --no-build-isolation pandas
However, it still goes into a long process (building wheel metadata, I guess, from what it finally says).
(py37) aixtools@x070:[/home/aixtools/python/download]pip3 download --no-deps --no-build-isolation pandas
Using cached pandas-1.0.5.tar.gz (5.0 MB)
Preparing wheel metadata ... done
Successfully downloaded pandas
(py37) aixtools@x070:[/home/aixtools/python/download]ls -l panda*
-rw-r--r-- 1 aixtools aixtools 5007108 Jul 04 11:03 pandas-1.0.5.tar.gz
However, to get this I first had to install cython and numpy. What I would rather see happen is that either ONLY pandas package source gets downloaded, or pandas, numpy and cython sources get downloaded (and I have to figure out the dependencies).
Likely, I am not understanding one of the command-line arguments - if so, my many many thanks for pointing it out!
You probably want --no-binary :all: to tell pip not to try to find wheels. You shouldn’t need --no-build-isolation as you don’t want to build.
This is similar to pip#8387. To summarise:
- pip needs to get package metadata to verify a downloaded sdist.
- Since sdist metadata is currently not trustworthy (problem 1), pip needs to build a wheel for that.
- Pandas has a highly customised setup.py that makes its invocation very involved, even if you don’t want to actually build the package but just retrieve metadata (problem 2).
- Without build isolation, pip cannot even run that setup.py unless the build dependencies (numpy and Cython) are already installed.
To deal with the problem long-term, we need to make sdist metadata useful, but that’d take a long time. Until then, it’s unfortunately up to individual package maintainers to make metadata retrieval a viable operation.
Alternatively, pip could quit verifying downloaded artifacts (beyond checking the hash). That may or may not be a good idea; I don’t have an opinion on that.
Shouldn’t --no-deps avoid building the metadata in this case?
It somehow still does, I’m not sure why; probably something in InstallRequirement triggers it. I think it’s very possible to delay that until the installation phase with some refactoring, though (although that’d also mean pip download loses the metadata integrity checks that come with preparing the sdist, e.g. ensuring the package name and version are correct).
OK. What I read here is, roughly:
For data integrity, pip builds things (regardless of arguments), and to do that it will always need the build dependencies (which is why, even with --no-deps, numpy had to be ‘installed’ first before the download of pandas could proceed). And, I am guessing, that also explains why --no-build-isolation had no noticeable effect.
Would it be conceivable to have an option, e.g. --no-integrity-check, intended for source downloads (i.e. an option that skips the metadata build and simply downloads the sdist)?
p.s. I did not check for pip issues, as I did not want to assume there was a bug here - I expected “user error”.
Not quite. There’s a lot of messy history here, which I doubt would be of interest, around what we have and haven’t standardised, and how pip treats sdists (which have no standard).
There is a possible pip issue here. In theory, pip should only need the project name and version as long as you say --no-deps, and those are available from the sdist filename. So there should be no need to build when you have --no-deps. However, we don’t yet have a standard that says sdist filenames must include the name and version, so there’s a remote possibility that by assuming that, pip gets things wrong. We double-check during the build (when we do get a reliable name and version) and give an error if there’s a problem.
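To illustrate why filename-based name/version detection is only a convention: a tool has to assume the version is everything after the last hyphen, which nothing guarantees. A minimal sketch (a hypothetical helper, not pip’s actual code):

```python
def guess_name_version(filename: str):
    """Guess (name, version) from an sdist-style filename.

    This relies purely on convention: strip the archive extension,
    then treat everything after the last hyphen as the version.
    """
    stem = filename
    for ext in (".tar.gz", ".tar.bz2", ".zip"):
        if stem.endswith(ext):
            stem = stem[: -len(ext)]
            break
    name, _, version = stem.rpartition("-")
    return name, version

# The convention works for well-behaved sdist names...
assert guess_name_version("pandas-1.0.5.tar.gz") == ("pandas", "1.0.5")
# ...but an arbitrary archive (e.g. a branch download) misparses badly:
# no hyphen at all, so the whole stem lands in the "version" slot.
assert guess_name_version("master.tar.gz") == ("", "master")
```

This is exactly the gap a filename standard would close: without one, the double-check during the build is the only reliable source of truth.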
I don’t believe we’re deliberately doing that check for pip download - it’s a side-effect of implementation details. Certainly I don’t think the check is valuable enough to warrant a command line option.
The real solution here is to finally get round to standardising the sdist format. Obviously that’s not much help right now, though. We could try to stop pip doing this check on pip download as a short-term fix, as @uranusjr suggested. We may have to document that we dropped that check, but I doubt (famous last words!) anyone would care.
However, this is all in an area that the “new resolver” work is changing quite significantly. It may be that the new resolver code doesn’t even have this issue, which would mean that “use the new resolver” would be a sufficient workaround in the short term.
So, could you try pip download --no-deps --no-binary :all: --unstable-feature=resolver pandas, and see if that avoids the build step?
The new resolver also installs build deps and builds the wheel. I think what @aixtools wants (because this is what I also want sometimes) is something equivalent to apt source, which solely fetches the source distribution. IIUC, pip download was designed for a different purpose, downloading packages for later installation, but I’m not entirely sure.
In this very case, it seems the fastest way is to run
pip download --no-deps --no-binary pandas --no-build-isolation pandas
which skips the build (even on the new resolver this is still needed). However, it’s worth noting that the metadata is still prepared:
Running command /usr/bin/python3 .../pip/_vendor/pep517/_in_process.py prepare_metadata_for_build_wheel /tmp/tmp12w7gxw0
and due to a long standing Cython-setuptools integration issue, cythonization was still invoked and it took quite some time.
The best work-around for this AFAIK is to go to PyPI and find the distribution directly, unfortunately.
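That work-around can be scripted: PyPI’s JSON API (https://pypi.org/pypi/&lt;project&gt;/json) lists every file for a release with a packagetype field, so a short script can locate the sdist URL without pip ever running a build. The helper names below are my own and error handling is omitted; this is a sketch, not a supported tool:

```python
import json
from urllib.request import urlopen


def pick_sdist(files):
    """Return the URL of the sdist among a release's files, if any.

    PyPI tags source distributions with packagetype == "sdist";
    wheels show up as "bdist_wheel".
    """
    return next(
        (f["url"] for f in files if f["packagetype"] == "sdist"), None
    )


def sdist_url(project):
    """Fetch the JSON metadata for the latest release and pick its sdist."""
    with urlopen(f"https://pypi.org/pypi/{project}/json") as resp:
        data = json.load(resp)
    return pick_sdist(data["urls"])
```

For example, sdist_url("pandas") returns a direct .tar.gz link that can be fetched with any downloader, with no metadata preparation involved.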
I dug around and thought about this a bit more, and came to the conclusion that we really, really need to standardise sdist. pip needs the name and version from a distribution artifact. Both are practically specified in the file name, but pip cannot really use that without checking, since there is no guarantee an archive file follows the sdist naming convention (which itself is not standardised).
I can think of the following choices:
- Standardise sdist metadata, and let pip use that instead, eliminating the need to build wheel metadata.
- Standardise the sdist filename, and amend PEP 503 (Simple Repository API) to mandate that a file served under this name pattern MUST be an sdist, so we can guarantee foo-1.2.zip always means “project foo, version 1.2” and does not need a consistency check. (We need to limit this to PEP 503 indexes because e.g. on GitHub you can download a repository as a zip, and for a repo named foo and branch 1.2 the download would also be called foo-1.2.zip without being an sdist.)
- Standardise the sdist filename, but invent a new extension like wheel did. This would avoid the requirement to amend the Simple Repository API.
We’ve always hit complicated debates when we try to standardise sdist metadata. I’m not 100% sure why; I think there are some cases where people get quite anxious about the possibility that backends could generate different metadata when building the wheel than they did when building the sdist. I honestly don’t know why that could happen, or why we can’t simply declare that as something that backends are no longer allowed to do - but it does make standardising sdist metadata a potentially time-consuming process.
But I see no reason why we can’t standardise the filename - projects are already effectively required to freeze the name and version when building the sdist, so we’re not imposing anything new.
I wish we could just bless the current format as standard, but you make a good point that other sites like GitHub generate names in that format that aren’t sdists. But conversely, I’d somehow feel uncomfortable if sdists got a new extension. I know that’s silly, so I’m not going to argue too strongly, but how about this:
- sdist filenames MUST take the form NAME-VERSION.sdist.tar.gz. The “name” and “version” portions must be canonicalised the same way as wheels. Tools MUST assume that any file with extension .sdist.tar.gz is a sdist.
- The NAME and VERSION parts of the filename MUST match the distribution metadata - both the metadata in the sdist itself (when that gets standardised) and the metadata of any wheel built from that sdist. It is a backend error to create a wheel whose name and version don’t match the sdist filename.
Currently PEP 503 doesn’t make any statement about what “project files” an index can serve. I’m inclined to leave that unchanged, as it requires tools to make judgements about files without considering where they came from (which is overall a good thing). But I would make one exception to that, for compatibility purposes, and say that tools MAY assume that files named *.tar.gz and served from a PEP 503 index are sdists, and proceed as if they had been named *.sdist.tar.gz.
(It’s not inconceivable that some tool will choose to treat all .tar.gz files like this, but I’d view that as an implementation choice about how to treat non-standard files, rather than something the standard should take a view on.)
Even if we do want to go further and standardise sdist metadata, I’d still advocate for the above as the specification of the sdist filename. It feels like the minimum change needed to give us reliable information.
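For concreteness, a parser for the proposed form could be as small as one regular expression. The exact canonicalisation rules would come from the wheel spec; this pattern is just a sketch of the idea, assuming a canonicalised name that contains no hyphens so the single hyphen unambiguously separates name from version:

```python
import re

# Hypothetical pattern for the proposed NAME-VERSION.sdist.tar.gz form.
# The no-hyphens-in-name assumption mirrors wheel-style canonicalisation.
SDIST_NAME = re.compile(
    r"^(?P<name>[a-z0-9][a-z0-9._]*)-(?P<version>[^-]+)\.sdist\.tar\.gz$"
)

m = SDIST_NAME.match("foo-1.2.sdist.tar.gz")
assert m is not None
assert (m.group("name"), m.group("version")) == ("foo", "1.2")

# Plain .tar.gz archives (e.g. GitHub branch downloads) do not match,
# so tools can no longer mistake them for sdists.
assert SDIST_NAME.match("foo-1.2.tar.gz") is None
```

The point of the dedicated extension is visible in the last line: the match failure is what lets a tool refuse to guess about arbitrary archives.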
.sdist.tar.gz counts as a new extension in my mind, so that works in this regard.
One problem with this particular design, though, would be backwards compatibility. If my memory of pip’s implementation serves (I didn’t actually check), foo-1.0.sdist.tar.gz would be identified as an sdist of project foo and version 1.0.sdist (a legacy, non-PEP 440 version), get picked up and downloaded, and then fail to install. We need to invent something that does not accidentally get picked up by old pip (and maybe also easy_install) versions.
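That misparse is easy to demonstrate with the conventional “split on the last hyphen” rule (a simplification of what older tools do, not pip’s literal code):

```python
# Old-style parsing: strip the archive extension, then treat everything
# after the last hyphen as the version.
filename = "foo-1.0.sdist.tar.gz"
stem = filename[: -len(".tar.gz")]       # -> "foo-1.0.sdist"
name, _, version = stem.rpartition("-")

# The bogus ".sdist" suffix lands inside the version string.
assert (name, version) == ("foo", "1.0.sdist")
```

Since old tools would download the file and only fail at install time, the extension has to be one they never pick up in the first place.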
Sigh. That basically means a completely new extension (i.e., not one that ends with .tar.gz).
I foresee endless bikeshedding. But I’ll start by suggesting NAME-VERSION.sdist, in the hope that it’s sufficiently obvious to be uncontroversial.
I created a PEP draft and put things discussed here in it.
Edit: The PEP is in discussion at PEP 625: File name of a Source Distribution
For your information, here is another use case where the current situation is inappropriate: I am running Windows but trying to download macOS and Linux packages. This fails with pip download because of the build step.
To give more context, this is for the PyQt5-stubs project. The PyQt5 wheels and source distributions include some platform-specific stubs and modules. Since I am distributing one set of stubs for all platforms, I first want to collect all stubs from all platforms by downloading all packages. And this fails…
PEP 643 now standardises a means for build backends to reliably include static metadata in a source distribution, specifically including name and version, which can then be read without a build step. So the resolution of this issue now is:
1. Implement PEP 643 support in common backends, in particular setuptools, as it is still by far the most commonly used backend.
2. Implement PEP 643 support in pip, reading static metadata from the sdist without a build step, but still falling back to the existing behaviour if a sdist is not PEP 643 compliant.
3. Wait for projects to rebuild and publish new sdists that conform to PEP 643.
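To make step (2) concrete, here is a rough sketch (my own helper, not pip’s code) of what “read static metadata from the sdist” means under PEP 643: the PKG-INFO at the root of the sdist must declare Metadata-Version 2.2 or later, and any field listed under Dynamic cannot be trusted as static:

```python
import tarfile
from email.parser import HeaderParser


def static_name_version(sdist_path):
    """Return (name, version) if the sdist carries static PEP 643
    metadata, or None if a build step would still be required."""
    with tarfile.open(sdist_path) as tf:
        member = next(
            (m for m in tf.getmembers()
             if m.name.count("/") == 1 and m.name.endswith("/PKG-INFO")),
            None,
        )
        if member is None:
            return None  # no metadata file at all: must build
        raw = tf.extractfile(member).read().decode("utf-8")
    headers = HeaderParser().parsestr(raw)
    metadata_version = headers.get("Metadata-Version", "0")
    if tuple(int(p) for p in metadata_version.split(".")) < (2, 2):
        return None  # pre-PEP 643 metadata: not reliable
    # PEP 643 forbids Name/Version being Dynamic, but check defensively.
    dynamic = {v.lower() for v in (headers.get_all("Dynamic") or [])}
    if {"name", "version"} & dynamic:
        return None
    return headers["Name"], headers["Version"]
```

With something like this in place, pip download --no-deps on a compliant sdist would need nothing beyond reading one file out of the archive.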
Steps (1) and (2) can be done now. All they need is someone motivated enough to write the code. Step (1) is key here - once there’s reliable sdist metadata available in at least some proportion of sdists, the case for supporting it in pip is much easier to make.
So ultimately, this issue now boils down to someone interested in the issue, spending the time writing the code and getting it included in the relevant tools.
ERROR HTTPError: 400 Bad Request from https://upload.pypi.org/legacy/
'2.2' is an invalid value for Metadata-Version. Error: Use a known
metadata version. See
https://packaging.python.org/specifications/core-metadata for more information.
You have been very patient, and a command was suggested. The problem persists, though - simple downloads do not seem to work:
(py39) aixtools@x064:[py39]pip download --no-deps --no-binary :all: --unstable-feature=resolver ansible-core==2.12.6
pip download [options] <requirement specifier> [package-index-options] ...
pip download [options] -r <requirements file> [package-index-options] ...
pip download [options] <vcs project url> ...
pip download [options] <local project path> ...
pip download [options] <archive url/path> ...
no such option: --unstable-feature
(py39) aixtools@x064:[py39]pip download --no-deps --no-binary :all: ansible-core==2.12.6
Using cached ansible-core-2.12.6.tar.gz (7.8 MB)
File "/home/aixtools/py39/lib/python3.9/site-packages/pip/_internal/utils/unpacking.py", line 256, in unpack_file
File "/home/aixtools/py39/lib/python3.9/site-packages/pip/_internal/utils/unpacking.py", line 226, in untar_file
with open(path, "wb") as destfp:
UnicodeEncodeError: 'latin-1' codec can't encode characters in position 138-141: ordinal not in range(256)
WARNING: You are using pip version 21.1.3; however, version 22.1.2 is available.
- I should have upgraded pip first - I’ll update this in a moment.
- Still dreaming of the day I can just get the download done - it has been downloaded to a temp directory, but not to my working directory.
- p.s. This is not about the UnicodeEncodeError. I know what that is, and it has been stated elsewhere that it is (currently) a won’t-fix issue. (My dream is of simple downloads so I can manually create an installable wheel.)
UPDATE: Same issue with latest pip, fyi.
With pip download --no-deps --no-binary :all: ansible-core==2.12.6 (pip 22.1.2) I get the sdist downloaded into my working directory. You could pass --dest for explicitness?
You’re responding to a message from 2 years ago. Unsurprisingly, things have changed a bit - the new resolver is now default as you found. The follow-up discussion after my post from 2 years ago pretty clearly said that the suggested command wouldn’t work because we would still do the build. So I’m not sure there’s much I can add.
Nobody has done any of the implementation steps suggested in this thread, so it’s all still waiting on that, basically.