I have tried to structure the Python Launcher for UNIX so that it can eventually be turned into a wheel on PyPI, so that there’s only one place where the discovery logic needs to be implemented. My hope is that once I magically find the time to hit 1.0 with the Launcher for CLI usage, I can then work on making it a PyPI package for those who need the discovery aspects. That, though, will require making it a universal launcher that also works on Windows, which is a bit more work.
Talking with my Python extension for VS Code hat on, we find a lot of users have no idea about virtual environments, so when we recommend one we are recommending a new concept to them (same goes for when we tell conda users to create a conda environment; a lot of users just abuse the base environment). There is definitely an educational step here of letting people know that virtual environments exist. After that it’s managing them (I had a Twitter thread about naming the directory you install into, and boy were there varying opinions!).
Yup. Practically it’s more complicated, since you’d also have to deal with 32/64-bit, platforms (e.g. WSL), ABI, Python implementation, etc. The greatest missing piece here IMO is how tools can resolve to the same interpreter when the user asks for simply “3.8” or “PyPy” and multiple virtual environments exist, i.e. some kind of identifier for an interpreter. Maybe we can reuse some knowledge from wheel tagging, but I’m unfamiliar with that area.
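As a rough illustration of what such an identifier could contain (the name and format below are made up for illustration, not any standard), every piece is discoverable from the running interpreter itself:

```python
import struct
import sys

def interpreter_id():
    """Build a hypothetical identifier like 'cpython-3.8-64'."""
    impl = sys.implementation.name               # e.g. 'cpython', 'pypy'
    version = "{}.{}".format(*sys.version_info[:2])
    bits = struct.calcsize("P") * 8              # pointer size in bits: 32 or 64
    return f"{impl}-{version}-{bits}"

print(interpreter_id())
```

A real scheme would also need to cover ABI and platform, which is where the overlap with wheel tags comes in.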
As a side note: this is also one of the larger roadblocks hit by PEP 582’s __pypackages__; I think any improvements to the current virtual environment usage would need to solve it first.
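For context, PEP 582 keys installed packages off a per-version directory, which is exactly where the “which interpreter does this refer to” question shows up (layout paraphrased from the PEP):

```
myproject/
├── myscript.py
└── __pypackages__/
    └── 3.8/        # used by any Python 3.8, but which implementation/bitness?
        └── lib/
```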
That’s also an issue with the Python Launcher (which I have been avoiding by not caring about bitness).
Probably not, because I don’t think most people are very familiar with what cp38 means, or would want to bother saying py38 when all they care about is the version. Really, all people care about is the Python version, occasionally which interpreter, and very rarely bitness (and does everyone remember how bitness is specified on the various OSs for wheels? I wrote the code and I can’t). But having said that, I’m sure someone is now going to tell me they have a use case of virtual environments for the same Python version, interpreter, and bitness but with different dependencies installed.
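On the “how is bitness specified per OS” point: it hides in the platform part of a wheel tag (win32 vs. win_amd64, manylinux1_i686 vs. manylinux1_x86_64, and so on), and the stdlib can at least report the platform string those tags are derived from:

```python
import sysconfig

# The platform string wheel tags are derived from; spelling varies by OS:
# Windows: 'win-amd64' or 'win32', Linux: 'linux-x86_64', macOS: 'macosx-10.9-x86_64'
print(sysconfig.get_platform())
```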
But yes, coming up with a general naming/reference scheme would solve that issue. It might require storing more details in pyvenv.cfg, but that’s not a huge issue.
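For reference, the pyvenv.cfg that venv writes today is tiny; the last two keys below are hypothetical additions sketching the “more details” idea, not anything any tool currently writes:

```ini
home = /usr/bin
include-system-site-packages = false
version = 3.8.2
implementation = cpython
bitness = 64
```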
They only care about version until they install a package that has a native module in it. Then suddenly they care a lot about platform and bitness.
virtualenv’s built-in discovery mechanism supports both Unix and Windows, and currently allows users to specify: implementation (CPython vs PyPy), version (major/minor/patch), and bitness (32 vs 64). tox already had all this, and to some extent virtualenv too, so the code was consolidated entirely into virtualenv. In theory it could be its own package.
I couldn’t find docs on how tox handles this. Is there a page listing how to specify the varieties of options?
Probably the first step is to decide on what format we want to standardize on for specifying what interpreter you want and get that written down in a PEP or something. Once we have that then we can make sure virtualenv, tox, Python Launcher, etc. all support the same format and the Python-based ones can all standardize on a library implementing it (Python Launcher probably can’t as it needs to be in C/Rust and I don’t think tox or virtualenv want an extension module dependency).
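As a strawman for what such a written-down format might look like, here’s a toy parser for an invented spec grammar (optional implementation name, optional major.minor version, optional -bits suffix); every detail here is illustrative, not a proposal:

```python
import re

# Invented grammar: [implementation][major[.minor]][-bits]
# e.g. "3.8", "pypy", "cpython3.8-64". NOT a standard, just a sketch
# of the kind of format a PEP could pin down.
SPEC = re.compile(
    r"^(?P<impl>[a-z]+)?"
    r"(?P<version>\d+(?:\.\d+)?)?"
    r"(?:-(?P<bits>32|64))?$"
)

def parse_spec(spec):
    match = SPEC.match(spec)
    if not spec or match is None:
        raise ValueError(f"invalid interpreter spec: {spec!r}")
    return match.groupdict()

print(parse_spec("cpython3.8-64"))
# → {'impl': 'cpython', 'version': '3.8', 'bits': '64'}
```

A shared library implementing whatever gets standardized would then let virtualenv, tox, etc. agree on what “3.8” resolves to.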
Ok, I’ll admit that I didn’t actually read the whole thread, but one thing I read over and over is “should pip do everything?” And while I think this is a good question (e.g. wheel is an external package that demonstrated enough utility to become the de facto distribution format), I think the very fact that pip implements a PEP 517 interface with setuptools already suggests that it should do everything. Or at least, it should be able to build from source.
I gotta say folks, PEP 517 feels like a godsend. I’ve been trying to figure out how to package Python packages “the right way” for the better part of 2 years now, and it’s not very easy to figure out. It took me longer than it should have to realize that wheels are the way to go. It also took me forever to figure out that setuptools is in fact the preferred way to package. PEP 517 makes two affirmative statements that make my life much easier: (1) “Here’s how to package Python”, and (2) “Here’s the default tools for packaging Python”. This at least makes it clear that there’s an interface and a set of options that aren’t (or are) non-conformant hacks. It also finally provides the interface necessary for me to decide as a packager to use non-standard tools and not be afraid that users won’t be able to install my package.
As a Python user and package author, it would be very confusing if the tool that’s capable of building a wheel using a long-standing interface (i.e. pip wheel) suddenly disappeared because building packages got better. So again, as a user here, I’m pretty happy with a discussion on what feels like an implementation detail: publishing another package for pybuild as the equivalent of pip wheel but for PEP 517. It may be a very good idea to separate out that functionality into another tool, if that tool is also the default choice for pip (and replaceable via pyproject.toml). It may also make sense to first implement the interface in pip and later pull it out into a shared implementation. But frankly, seeing a pip sdist will just make so much sense to me given the existing tooling, names, PEP 517 interface, etc.
Furthermore, all python developers are package maintainers, right? Except in corporate environments where there’s enough money/staff for python build engineers. The notion of user modes as producer vs consumer is a false dichotomy, except for those rare python developers who never share their code with anybody. And frankly, are we considering them a core demographic to consider? I mean isn’t the very implementation of pip and pypi.org and ssh+github links in requirements.txt a value statement about sharing open-source code?
Anyways, I love what y’all are doing. pip makes the very idea of distributing shared code seem a world easier than what I’m usually dealing with (C++ distributions). I hope my comments can shed a little light on what users want.
Or provide an alternative https://www.python.org/dev/peps/pep-0582/
I hope this is tongue-in-cheek, because it is very untrue. The vast majority of Python developers never create or publish a package, and even if you “exclude” those who get paid for it you’ll still exclude the vast majority of people who do publish packages.
Considering only “people like us” is what got packaging into the state it’s in. We’re trying to move (incrementally and compatibly) to a place where it works for everyone.
I wouldn’t characterize setuptools as preferred, more the historical way to do packaging. Some do prefer it, but members of the PyPA are explicitly trying to move away from forcing people into preferred solutions.
Once again, it’s more about historical tools, not default. No one here is making a judgment call of setuptools being better or worse than flit; they meet different needs for different people and are equally useful.
Be careful about over-generalizing here. For instance, I have never used pip wheel, and I have been a Python user since before pip existed. That’s like saying we should never have bothered with PEP 517 because setuptools already existed with its setup.py bdist_wheel interface.
I agree with Steve that I don’t think this is true. As an example, there are plenty of people who deploy code straight to Docker these days and don’t have to package anything.
That’s a huge generalization. I have plenty of code that I have never shared with a single person and that never left the computer it was initially written on. Python’s scripting heritage, for instance, very easily leads to people writing code that is only meant for the author.
And thanks for the comments!
Thanks all for the responses. It seems I’ve been pretty unclear about quite a few points. It seems I also didn’t read the thread thoroughly enough. I’m mostly making an argument for including packaging as a core use case, mostly because I believe it’s valuable, and I believe the python community at large believes it’s valuable. Let me respond to a few of the specific comments to try and clarify what I was trying to get across.
In fact I’m not trying to exclude those who get paid to develop Python code and publish packages, especially since that would exclude me! I understood you to be making the argument that some people are Python developers and others are package maintainers. I guess I was making a false equivalence by assuming that behavior only happens in corporate environments, because I have only had that experience in corporate environments. Your point is taken that this may happen in any large team, including open source projects.
I’m trying to advocate for the perspective that even quickly-developed code may quickly turn into a package if the friction is low enough. It sounds like you’re on the way to that.
I think my meaning got lost in my hyperbole. Let me amend what I said to something more along the lines of “packaging python is a core process of python software development”. That’s an opinion. I believe that sharing python code is a core process. Fortunately pip makes it dirt-simple to install shared code as a package with a number of options (e.g. ssh, git, local directories, simple index servers, etc.).
Again, I’m advocating for a perspective here where any developer can create and distribute a package with minimum friction, and where they need not fit some specialized role of “packager”, fetch special packaging tools, read about packaging philosophy, or get confused about what a sane default is for a simple project (which I’m only just learning flit does a good job of). pip actually already supports uploading to pypi.org, right? So it seems like this is a core value. So pip build for distributing locally seems useful.
This is the one comment that I think I don’t understand. What’s the analogy? Are you saying that PEP 517 applies to something other than the pip wheel command? Or observing that not everybody uses PEP 517? I’m not sure I follow.
Hopefully I can offer a clarification: for me as an individual, having the pip wheel command deprecated will make me sad. It’s my opinion that pip wheel and PEP 517 just made the process of building wheels easier than any other process. PEP 517 is declarative and its isolated build is infinitely reproducible. It also isn’t tied to one tool. I’ll never have to worry again about developers trying to build my package having too many or too few packages installed to reproduce the build as it happens on build machines.
By “default” here I specifically mean “if you don’t provide a build-backend in the [build-system] section of your pyproject.toml, we’ll use setuptools”. In that way it is the default in the sense that it’s what’s used if an alternative is not declared. I read this type of default as “this is what we officially believe represents the interface” as well as “this probably works for most legacy projects”, even if there are better modern tools out there. I understand that it’s not a judgment on what’s “good”, but more about what’s backwards compatible. IMO this is a really good default, because it’s also implicit that you can replace it with something that better suits your needs.
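Spelled out, omitting the [build-system] table makes front-ends fall back to behaving roughly as if this had been declared (per PEP 518’s fallback; the build-backend line here is the modern setuptools equivalent rather than a verbatim quote of the default):

```toml
[build-system]
requires = ["setuptools", "wheel"]
build-backend = "setuptools.build_meta"
```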
Yes, definitely. The point of PEP 517 was to decouple the building of Python packages from being setuptools-exclusive to allowing any tool to participate in a clean manner. pip wheel is not a magical, special command; don’t get attached to it simply because it exists.
Moving away from Python 2 also made people sad.
And that’s the key point/question. Should we try to shoehorn it all into pip simply because it’s there, or should we try to break things out, like twine, into their own tools? Or do we make a new tool that is oriented towards building and development, since that isn’t the same as installing dependencies, and let pip stay oriented towards dependency installation?
So my key point to you is that pip wheel isn’t special. It’s there and I’m glad it’s useful for you. But its mere existence doesn’t preclude us from considering not making it the way people build wheels going forward.
pip wheel is the only implementation of the PEP 517 front-end, right? In that way it is special, at least from the perspective of an end-user with the current version of pip.
Ha ha. Good point.
pip build will be welcomed if it handles important use cases
And with that same package it’s literally a single function call to do a build, so creating a tool to build using PEP 517 isn’t difficult.
It does not! That’s one of the questions in the discussions around «pip should do all packaging tasks» vs «different tools can do different jobs». People use python setup.py upload, flit publish, the web interface, or other tools to upload.
Maybe, maybe not. When I work on my projects, I know what command I have to run to build distributions. When I contribute to a project, I read their instructions. When I install something from source, I don’t care that pip install builds a wheel under the covers. And I don’t remember ever running pip wheel.
So I use pip wheel extensively, either for creating local caches of wheels or for building my own from an sdist. And that dual use would suggest it may need to change one day, even though it works.
With the primary piece of building a wheel now being installing its dependencies, though, I don’t see how or why we’d split it out from pip completely. There are plenty of ways to build one, but the only reliable way is to install the correct dependencies first, and while pip can’t quite do that yet, people are still going to use it and packagers are going to have to deal with the limitations.
Uploading is another matter, but again, if you need pip to get twine, you need pip to upload and we may as well bundle it. Presumably pip has to be able to do nearly everything to pull from authenticated feeds as twine has to for pushing, so why have a separate configuration file to get out of sync?
None of this precludes other tools from doing the job, but since pip already covers the download/unpack side, having it also do the pack/upload side makes it more obvious that they’ll be compatible.
No. As Brett already pointed out, there’s pep517. And the whole point of PEP 517, and most of the interoperability standards work we’ve been doing, is to make it so that alternatives to pip can be built. It’s an explicit goal of the standards work to not focus on pip as the only front-end tool, and an explicit policy of pip to implement standards-based functionality, so that people don’t get locked into implementation-defined behaviour.
While I don’t see anything wrong with that idea (after all, who would argue against being able to do anything with “minimum friction”?) you appear to also be assuming that writing packages is the best solution for everyone. And that’s where I think you’re wrong.
Many Python users don’t write packages, and never will. Instead, they write single-file scripts, plugin code for applications, data science notebooks, etc. To give a specific example, a significant proportion of the people I know who use Python in my organisation, only use Python’s packaging infrastructure to consume 3rd party libraries. They never package their own code, and only ever share it in constrained situations where wheels, PyPI, and pip are essentially irrelevant.
Now it’s not entirely clear that this makes much difference when we’re talking about “building distributions”, which is the topic of this thread. But I think it’s extremely important to keep a sense of perspective on what we’re discussing. Building packages is a relatively minor part of the Python packaging ecosystem. Consuming packages is far more significant. And many, many Python users have no interest in building packages, and rightly so - we shouldn’t force building packages on them as a solution to their problems (even if the problems they have are related to deployment, dependency management, etc. - wheels and PyPI aren’t, and shouldn’t be treated as, a “one size fits all” solution, IMO).
Coming back to this today, it’d be useful to see how much of this can be done with poetry today.
I personally don’t have the time to take a deeper look into this right now.
Moving python-build to PyPA is a discussion about going down option 3 (see the original post for what this thread was for).