Can you expand on this? Not sure I follow.
My interpretation was that this is about having tools that manage virtual environments for you. Things like tox/nox would seem to fit into this area to an extent, as would tools like pipenv that take the idea of the user having a “project” directory and manage the environment associated with that project for you. I assume that @uranusjr’s own pyem would be another example of this, and maybe even “thinner” wrappers like pew and virtualenvwrapper.
Personally, I can definitely see the need for some sort of tool here - I mostly just use “raw” virtualenv/venv, but I do find that the admin around managing environments is annoying. But I’ve never yet really tried to pin down exactly what I’d want from such a tool - which is what I took @uranusjr as referring to when he said “more interest is needed”.
Yup, that’s what I mostly have in mind. To back up a little, a lot of modern Python users (coming from other ecosystems) hard-couple the concepts of project and environment. This leads to them feeling that the activate/deactivate step is redundant. But people used to virtual environments (myself included) most definitely want to keep being able to switch between environments within a project. Both camps have their tools that expose the appropriate abstraction. That’s fine.
The problem, from what I can tell, is that tools don’t interop. You have Pipenv and Poetry if you only want one environment, Tox and Nox if you want multiple environments, but the former don’t do task runner things well, and the latter don’t do dependency management; Pipenv doesn’t install packages into a Tox-managed virtualenv (without hacks), and Tox can’t run tasks with a Pipenv-managed virtualenv. This leads to tension between the two kinds of users, and virtualenv gets a bad rep because it (being the shared implementation behind both usages) seems not up to the needs.
One way to consolidate is to define standardised locations to place virtual environments within a project (my PyEM is trying to figure out a good scheme), and let tools share environments and complement each other. Then users don’t need to know about the underlying virtual environments that make Nox and Pipenv work, and `pipenv install` would satisfy whatever `nox` needs to run its tasks.
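As a rough illustration of the idea, a discovery helper could map a version request onto a conventional per-project environment directory that any tool would reuse. The `.venvs/` layout and the helper name below are hypothetical, not PyEM’s actual scheme:

```python
from pathlib import Path


def find_project_env(project_dir, version):
    """Locate a shared virtual environment for ``version`` under a
    hypothetical ``.venvs/`` directory inside the project.

    Returns the environment path, or None if no tool has created one yet.
    """
    env_dir = Path(project_dir) / ".venvs" / version
    # Any tool (pipenv, nox, tox, ...) finding this directory would reuse
    # it instead of creating its own private environment; pyvenv.cfg is
    # the marker that a real environment lives here.
    if (env_dir / "pyvenv.cfg").exists():
        return env_dir
    return None
```

With something like this, `pipenv install` and `nox` would both resolve “3.8” to the same directory instead of each creating their own.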
IOW have a way to say, “this is my Python 3.8 virtual environment, tools, so everyone who needs that version of Python should just use this virtual environment”? That way recreation is unnecessary and tools can just piggyback on some other tools that previously created the virtual environment?
I have tried to structure the Python Launcher for UNIX so that it can eventually be turned into a wheel on PyPI, so that there’s only one place where the discovery logic needs to be implemented. My hope is that once I magically find the time to hit 1.0 with the Launcher for CLI usage, I can then work on making it a PyPI package for those who need the discovery aspects. That, though, will require making it a universal launcher even on Windows, which is a bit more work.
Talking with my Python extension for VS Code hat on, we find a lot of users have no idea about virtual environments, so when we recommend one we are recommending a new concept to them (same goes for when we tell conda users to create a conda environment; a lot of users just abuse the base environment). There is definitely an educational step here of letting people know that virtual environments exist. After that it’s managing them (I had a Twitter thread about naming the directory you install into, and boy were there varying opinions!).
Yup. Practically it’s more complicated, since you’d also have to deal with 32/64-bit, platforms (e.g. WSL), ABI, Python implementation, etc. The greatest missing piece here IMO is how tools can resolve to the same environment when the user asks for simply “3.8” or “PyPy” while multiple virtual environments exist, i.e. some kind of identifier for an interpreter. Maybe we can reuse some knowledge from wheel tagging, but I’m unfamiliar with that area.
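For example, such an identifier could be derived entirely from information the running interpreter already exposes; the `impl-X.Y-bits` naming scheme below is just an illustration, not a proposed standard:

```python
import platform
import struct
import sys


def interpreter_id():
    """Build an identifier such as 'cpython-3.8-64' for the running
    interpreter (illustrative format, not a standard)."""
    impl = platform.python_implementation().lower()  # e.g. 'cpython', 'pypy'
    version = "{}.{}".format(*sys.version_info[:2])
    bits = struct.calcsize("P") * 8  # pointer size in bits -> 32 or 64
    return f"{impl}-{version}-{bits}"
```

Tools that agreed on a scheme like this could name (and find) environment directories consistently, which is the consolidation step described above.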
As a side note: this is also one of the larger roadblocks hit by PEP 582’s `__pypackages__`; I think any improvements to the current virtual environment usages would need to solve it first.
That’s also an issue with the Python Launcher (which I have been avoiding by not caring about bitness).
Probably not, because I don’t think most people are very familiar with what `cp38` means or would want to bother saying `py38` when all they care about is the version. Really, all people care about is the Python version, occasionally which interpreter, and very rarely bitness (and does everyone remember how bitness is specified on the various OSs for wheels? I wrote the code and I can’t). But having said that, I’m sure someone is now going to tell me they have a use case of virtual environments for the same Python version, interpreter, and bitness but with different dependencies installed.
But yes, coming up with a general naming/reference scheme would solve that issue. It might require storing more details in `pyvenv.cfg`, but that’s not a huge issue.
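For context, `pyvenv.cfg` today records only a handful of keys, so extra metadata could plausibly sit alongside them. The `implementation` and `bitness` keys below are hypothetical additions, not keys `venv` actually writes:

```ini
home = /usr/bin
include-system-site-packages = false
version = 3.8.2
# hypothetical extra keys a naming/reference scheme might add:
implementation = cpython
bitness = 64
```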
They only care about version until they install a package that has a native module in it. Then suddenly they care a lot about platform and bitness.
virtualenv’s built-in discovery mechanism supports both Unix and Windows, and currently allows users to select on: implementation (CPython vs. PyPy), version (major/minor/patch), and bitness (32 vs. 64). tox already had all this, and to some extent virtualenv too, so the code was consolidated into virtualenv entirely. In theory it could be its own package.
I couldn’t find docs on how tox handles this. Is there a page listing how to specify the varieties of options?
Probably the first step is to decide on what format we want to standardize on for specifying what interpreter you want and get that written down in a PEP or something. Once we have that then we can make sure virtualenv, tox, Python Launcher, etc. all support the same format and the Python-based ones can all standardize on a library implementing it (Python Launcher probably can’t as it needs to be in C/Rust and I don’t think tox or virtualenv want an extension module dependency).
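To make the idea concrete, here is a tiny sketch of what parsing such a standardized specifier might look like, assuming a hypothetical `impl-X.Y-bits` format where the implementation and bitness parts are optional (this format is an assumption for illustration, not anything the PEP would necessarily pick):

```python
def parse_interpreter_spec(spec):
    """Parse a hypothetical 'cpython-3.8-64'-style specifier.

    '3.8' alone, or 'pypy-3.8', are also accepted; missing parts
    come back as None.
    """
    implementations = {"cpython", "pypy", "jython", "ironpython"}
    impl = bits = None
    parts = spec.lower().split("-")
    if parts and parts[0] in implementations:
        impl = parts.pop(0)  # leading part names the implementation
    if parts and parts[-1] in ("32", "64"):
        bits = int(parts.pop())  # trailing part names the bitness
    version = parts[0] if parts else None
    return impl, version, bits
```

A shared library exposing something like this is what would let virtualenv, tox, and the Launcher all agree on what “3.8” or “pypy-3.7” means.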
Ok, I’ll admit that I didn’t actually read the whole thread, but one thing I read over and over is “should pip do everything?” And while I think this is a good question (e.g. wheel is an external package that demonstrated enough utility to become the de facto distribution format), I think the very fact that pip implements a PEP 517 interface with setuptools already suggests that it should do everything. Or at least, it should be able to build from source.
I gotta say folks, PEP 517 feels like a godsend. I’ve been trying to figure out how to package Python packages “the right way” for the better part of 2 years now, and it’s not very easy to figure out. It took me longer than it should have to realize that wheels are the way to go. It also took me forever to figure out that setuptools is in fact the preferred way to package. PEP 517 makes two affirmative statements that make my life much easier: (1) “Here’s how to package Python”, and (2) “Here’s the default tool for packaging Python”. This at least makes it clear that there’s an interface and a set of options that aren’t (or are) non-conformant hacks. It also finally provides the interface necessary for me to decide as a packager to use non-standard tools and not be afraid that users won’t be able to install my package.
As a Python user and package author, it would be very confusing if the tool that’s capable of building a wheel using a long-standing interface (i.e. `pip wheel`) suddenly disappeared because building packages got better. So again, as a user here, I’m pretty happy with a discussion on what feels like an implementation detail about publishing another package for pybuild as the equivalent to `pip wheel` but for PEP 517. It may be a very good idea to separate out that functionality into another tool, if that tool is also the default choice for pip (and replaceable via pyproject.toml). It may also make sense to first implement the interface in pip and later pull it out into a shared implementation. But frankly, seeing a `pip sdist` will just make so much sense to me given the existing tooling, names, PEP 517 interface, etc.
Furthermore, all Python developers are package maintainers, right? Except in corporate environments where there’s enough money/staff for dedicated Python build engineers. The notion of user modes as producer vs consumer is a false dichotomy, except for those rare Python developers who never share their code with anybody. And frankly, are we considering them a core demographic? I mean, isn’t the very existence of pip and pypi.org and ssh+GitHub links in requirements.txt a value statement about sharing open-source code?
Anyways, I love what y’all are doing. pip makes the very idea of distributing shared code seem a world easier than what I’m usually dealing with (C++ distributions). I hope my comments can shed a little bit of light on what users want.
Or provide an alternative https://www.python.org/dev/peps/pep-0582/
I hope this is tongue-in-cheek, because it is very untrue. The vast majority of Python developers never create or publish a package, and even if you “exclude” those who get paid for it you’ll still exclude the vast majority of people who do publish packages.
Considering only “people like us” is what got packaging into the state it’s in. We’re trying to move (incrementally and compatibly) to a place where it works for everyone.
I wouldn’t characterize setuptools as preferred, more the historical way to do packaging. Some do prefer it, but members of the PyPA are explicitly trying to move away from forcing people into preferred solutions.
Once again, it’s more about historical tools, not default. No one here is making a judgment call of setuptools being better or worse than flit; they meet different needs for different people and are equally useful.
Be careful about over-generalizing here. For instance, I have never used `pip wheel` and I have been a Python user since before pip existed. That’s like saying we should never have bothered with PEP 517 because setuptools already existed with its `setup.py bdist_wheel` interface.
I agree with Steve that I don’t think this is true. As an example, there are plenty of people who deploy code straight to Docker these days and don’t have to package anything.
That’s a huge generalization. I have plenty of code that I have never shared with a single person and that never left the computer it was initially written on. Python’s scripting heritage, for instance, very easily leads to people writing code that is only meant for the author.
And thanks for the comments!
Thanks all for the responses. It seems I’ve been pretty unclear about quite a few points, and that I also didn’t read the thread thoroughly enough. I’m mostly making an argument for including packaging as a core use case, because I believe it’s valuable, and I believe the Python community at large believes it’s valuable. Let me respond to a few of the specific comments to try and clarify what I was trying to get across.
In fact, I’m not trying to exclude those who get paid to develop Python code and publish packages, especially since that would exclude me! I understood you to be making the argument that some people are Python developers and others are package maintainers. I guess I was making a false equivalence by assuming that behavior only happens in corporate environments, because I have only had that experience in corporate environments. Your point is taken that this may happen in any large team, including open source projects.
I’m trying to advocate for the perspective that even quickly-developed code may quickly turn into a package if the friction is low enough. It sounds like you’re on the way to that.
I think my meaning got lost in my hyperbole. Let me amend what I said to something more along the lines of “packaging Python is a core process of Python software development”. That’s an opinion. I believe that sharing Python code is a core process. Fortunately, pip makes it dirt-simple to install shared code as a package with a number of options (e.g. ssh, git, local directories, simple index servers, etc.).
Again, I’m advocating for a perspective here where any developer can create and distribute a package with minimum friction, and where they need not fit some specialized role of “packager”, fetch special packaging tools, read about packaging philosophy, or get confused about what a sane default is for a simple project (which I’m only just learning flit does a good job of). pip actually already supports uploading to pypi.org, right? So it seems like this is a core value, and a `pip build` for distributing locally seems useful.
This is the one comment that I think I don’t understand. What’s the analogy? Are you saying that PEP 517 applies to something other than the `pip wheel` command? Or observing that not everybody uses PEP 517? I’m not sure I follow.
Hopefully I can offer a clarification: for me as an individual, having the `pip wheel` command deprecated will make me sad. It’s my opinion that `pip wheel` and PEP 517 just made the process of building wheels easier than any other process. PEP 517 is declarative and its isolated build is infinitely reproducible. It also isn’t tied to one tool. I’ll never have to worry again about developers trying to build my package having too many or too few packages installed to reproduce the build as it happens on build machines.
By “default” here I specifically mean “if you don’t provide a build-backend in the [build-system] section of your pyproject.toml, we’ll use setuptools”. In that way it is the default in the sense that it’s what’s used if an alternative is not declared. I read this type of default as “this is what we officially believe represents the interface” as well as “this probably works for most legacy projects”, even if there are better modern tools out there. I understand that it’s not a judgement on what’s “good”, but more about what’s backwards compatible. IMO, this is a really good default because it’s also implicit that you can replace it with something that better suits your needs.
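Concretely, declaring that default yourself looks something like the snippet below; treat the exact version pin as illustrative (pip’s undeclared-backend fallback also adds legacy-compatibility behavior on top of this):

```toml
[build-system]
requires = ["setuptools", "wheel"]
build-backend = "setuptools.build_meta"
```

Swapping `build-backend` (and `requires`) for another tool’s backend is all it takes to replace the default.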
Yes, definitely. The point of PEP 517 was to decouple building of Python packages from being setuptools-exclusive to allowing any tool participate in a clean manner.
`pip wheel` is not a magical, special command, so don’t get attached to it simply because it exists.
Moving away from Python 2 also made people sad.
And that’s the key point/question. Should we try to shoehorn it all into pip simply because it’s there, or should we try to break things out into their own tools, as was done with twine? Or do we write a new tool oriented towards building and development, since that isn’t the same as installing dependencies, and let pip stay oriented towards dependency installation?
So my key point to you is that `pip wheel` isn’t special. It’s there and I’m glad it’s useful for you. But its mere existence doesn’t preclude us from considering not making it the way people build wheels going forward.
`pip wheel` is the only implementation of a PEP 517 front-end, right? In that way it is special. At least as an end-user with the current version of pip.
Ha ha. Good point.
pip build will be welcomed if it handles important use cases
And with that same package it’s literally a single function call to do a build, so creating a tool to build using PEP 517 isn’t difficult.
It does not! That’s one of the questions in the discussions around «pip should do all packaging tasks» vs «different tools can do different jobs». People use `python setup.py upload`, `flit publish`, the web interface, or other tools to upload.
Maybe, maybe not. When I work on my projects, I know what command I have to run to build distributions. When I contribute to a project, I read their instructions. When I install something from source, I don’t care that `pip install` builds a wheel under the covers. And I don’t remember ever running