Wanting a singular packaging tool/vision

I don’t agree the UX is irrelevant, I know of people who were learning Python who would have given up if they had to use conda (I can’t get details as I no longer work with the individuals in question, I’m afraid). However, I’m happy to put that point aside.

The most glaring example is that (to my knowledge, at least) conda doesn’t work with python.org builds of Python, the Windows Store distribution or Linux distro builds.

I agree. It wasn’t me that asked for a “vision” here, I’m mainly just exploring the implications if people want to go down that route. Personally, I’m largely happy with the direction things are going in under the PyPA (on the understanding that progress is frustratingly, even glacially, slow :slightly_frowning_face:), and having conda be a separate ecosystem for people who prefer/need it.

Developer ecosystems are often able to standardize on tools with defective UX compared to competitors (git vs. Hg, for example), which is why I think UX is not the primary concern here. Also, let’s not forget that the pip UX isn’t always pretty either (witness the hodgepodge of options pip install has, for example).

I agree that conda not being able to work with non-conda Python installs is one of its major drawbacks. I’ve never had any important concerns with the conda Python builds, but I suppose YMMV.

Of course it doesn’t; conda works with its own builds because it is for managing the entire system.

conda in no way equals pip. They are fundamentally different tools. Trying to use them both the same way will only lead to confusion (which you appear to be enjoying already :wink: )

FWIW, I don’t think there’s a need to reconcile conda into a “Python packaging vision”. They can remain totally independent and self-promote, because they’re full stack.

Also, I never asked for a singular packaging vision, but a singular vision for growing, developing and supporting the Python user base that includes packaging (and education, and documentation, and outreach, etc.)

I had hoped that the SC was in a position to provide that vision, but it appears they are not. So we’re looking to some other person/group to pull everything together and find the important priorities.

Right now, honestly, Anaconda is doing it best, giving their users multiple tools, docs, guidance, etc. and actively developing new ways to use Python. Meanwhile, python-dev is looking like mere caretakers of a GitHub repository, and PyPA is trying to put out fires and reconcile massive divergence between ideas that became implementations because the discussions were too hard.[1] I hope we can Discourse our way out of it into something with a bit more focused momentum, but it feels unlikely.

  1. And before anyone takes offence, I am definitely putting myself in the “caretaker” camp. I have no affiliation with Anaconda, and just get to watch on from the outside while they do all the cool stuff. ↩︎


I understand UX isn’t the only concern, but I’d argue it is the primary one and quite literally the only thing the OP is asking about.

By the way, I just embedded on a Rust team for a month to assist with a project and I totally understand the OP’s perspective. It was quite lovely using (mostly) just Cargo for everything.

Hatch is well positioned to provide this unified experience since it is already pretty much there (and supports plugins, like Cargo does), except for 2 things:

  • lock files: this is out of my control but I’m confident @brettcannon will save us :slightly_smiling_face:
  • distribution: I tried really, really hard to package it as a single binary with PyOxidizer and Nuitka, but virtualenv / venv hard-requires a filesystem, and PyInstaller couldn’t manage it either. Basically, it needs to be distributed as Python but with Hatch pre-installed

I’m not up to speed on the history of lock file format standardization, but I hope you’re right. I can say that lock file support is going to be increasingly important for the work I’m currently doing.


Here’s an attempt at a user story to give some concrete examples of the kind of unified/rallied tooling vision I’m imagining for a future new Python developer.

Disclaimer: This is obviously very inspired by Rust, which I think does a fantastic job in this area, but some of the minor details may need to be tweaked for Python.

The new developer finds out about Python and wants to try creating a Python project. They visit the official python.org website and are presented with a Getting Started page that provides them with a cross-platform interpreter installer/manager that has the same UI/CLI on every major supported platform. This could be similar to pyenv, but Windows is a first-class citizen and the installations could be pre-compiled rather than needing to be built.

We’ll call it pyup to sound similar to rustup.

Idea: Remove the multiplicity of ways new developers are told to install Python: Homebrew, stand-alone installer, deadsnakes PPA, build from source, Linux system package, etc.

Upon installing pyup, the latest stable release of Python is then available to them with a common, cross-platform name, e.g. py.

Idea: Remove the confusion regarding launching Python as python3 on *Nix and py on Windows (unless it’s the Windows Store and then it’s python).

pyup can be used to install and manage multiple versions of Python. The developer decides they need to test their application on an older version of Python:

pyup install 3.10

Or change their default Python to 3.10:

pyup default 3.10

Python 3.11.1 is released, and the user can easily update to a new patch release:

pyup update 3.11

Idea: Make it simple and consistent across platforms how to install, update, and manage multiple Python interpreters.

Included with each interpreter installation that pyup installs is a package manager that looks something like (Poetry or hatch-with-lockfile) + pipx. We’ll call it pyrgo for now to keep with our silly naming conventions.

A coworker mentions that httpie would be a great tool to try out an API they will be interfacing with. They are able to easily install a tool globally on their system from PyPI (essentially vendoring a pipx-like experience):

pyrgo global-install httpie

Idea: Make bootstrapping the installation of Python tools something that is included out of the box. Rust has this with cargo install, Node has this with npx bundled, etc.

The developer is now ready to create their first Python project, so they run:

pyrgo new

This is similar to poetry new and hatch new in that it creates a new, standard project structure for them, including a populated pyproject.toml:

├── pyproject.toml
├── README.md
├── demo
│   └── __init__.py
└── tests
    └── __init__.py
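For context, the scaffold shown above amounts to only a handful of files. Here is a minimal, stdlib-only sketch of generating it (pyrgo itself is hypothetical; the directory name demo-project and the bare-bones pyproject.toml contents are my own illustration):

```python
from pathlib import Path

def scaffold(root: Path, name: str) -> None:
    """Create the minimal project layout shown above (sketch only)."""
    root.mkdir(parents=True, exist_ok=True)
    # pyproject.toml with roughly the minimum a build backend would need
    (root / "pyproject.toml").write_text(
        f'[project]\nname = "{name}"\nversion = "0.0.1"\n'
    )
    (root / "README.md").write_text(f"# {name}\n")
    # one package directory for the code, one for the tests
    for pkg in (name, "tests"):
        (root / pkg).mkdir(exist_ok=True)
        (root / pkg / "__init__.py").touch()

scaffold(Path("demo-project"), "demo")
print(sorted(p.name for p in Path("demo-project").iterdir()))
# → ['README.md', 'demo', 'pyproject.toml', 'tests']
```

The point isn’t the code, of course, but that a single blessed tool would give every new developer this same starting layout.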

The developer learns they will be using FastAPI for their application, so they add it as a dependency and automatically update pyproject.toml:

pyrgo add fastapi

They can then lock their requirements into a cross-platform lockfile, similar to Poetry, for reproducibility:

pyrgo lock
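For illustration, a cross-platform lock file in this vein might record exact versions and artifact hashes per package. Something like the following, where the format, field names, version number, and hash are entirely hypothetical (PEP 665 and Poetry’s poetry.lock are the real prior art):

```toml
# pyrgo.lock — invented format, for illustration only
[[package]]
name = "fastapi"
version = "0.88.0"
# hashes pin exact artifacts so every platform reproduces the same install
hashes = ["sha256:..."]
markers = "python_version >= '3.7'"
```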

And install those into a virtual environment (which is managed for them, again similar to Poetry and hatch):

pyrgo install

Idea: Avoid the common papercut of venvs being activated differently on Windows vs *Nix.
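That papercut is visible even from the stdlib: the same venv puts its interpreter under bin/ on *Nix but Scripts\ on Windows, which is exactly the difference a managing tool would hide (invoking the env’s interpreter directly, no “activate” needed, is essentially what Poetry and Hatch do for you). A small stdlib-only sketch, where the directory name demo-env is arbitrary:

```python
import sys
import venv
from pathlib import Path

def env_python(env_dir: Path) -> Path:
    """Path to a venv's interpreter; note the per-platform layout difference."""
    if sys.platform == "win32":
        return env_dir / "Scripts" / "python.exe"
    return env_dir / "bin" / "python"

env_dir = Path("demo-env")
# with_pip=False keeps creation fast and offline for this sketch
venv.EnvBuilder(with_pip=False).create(env_dir)

print(env_python(env_dir).exists())  # → True
```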

The developer would like to automatically format their code to community standards. They find that pyfmt (e.g. black/isort) comes preinstalled via pyup.


And one could imagine extending this to a linter or type checker (e.g. mypy) depending on community consensus.

The developer is ready to publish a new open source project to PyPI, so they build the sdist and wheel using a single build command similar to Poetry/hatch build:

pyrgo build

They then upload the sdist and wheel to PyPI using a single publish command similar to Poetry/hatch publish:

pyrgo publish

Idea: Avoid the developer having to discover the existence of the build and twine PyPI packages, find and read their separate user guides, install them into a virtual environment, and invoke them using different commands.


The entire tooling workflow started with Python.org. This avoids a bootstrapping problem and many platform-specific steps that trip up new and seasoned developers alike. The community rallies around these central tools and reduces duplicated effort by pooling ideas/resources.

Importantly, it creates a cohesive ecosystem where a developer is much more likely to be able to drop into a new Python project and already know the workflow/tools.

Since all of these tools are standard across platforms, IDEs and editors have an easier time integrating and keeping up with updates and changes.


I like this (not least because it’s like Rust, which I like :wink:). One immediate question springs to mind, though: how would “scripting” (writing single-file runnable Python utilities) fit in with this? This is where the analogy with Rust breaks down for me, because Rust doesn’t have the idea of scripts.

Many of my work colleagues wrote small scripts like this. It’s (in my experience) a very common use case for Python, and one that’s not well served by the “build a project” style of workflow.


A single project can define multiple binaries

Hatch always uses an env so this would work fine

For standalone scripts with no dependencies, I would think you could use the py executable that pyup provides directly and skip the pyrgo new/lock/install steps I listed. This is one of Python’s strengths (scripting) that we can continue to support.


I was going to say some things regarding why conda environments as-is are a no-go for me based on work experience, but I’m not diving into it as I think it’s a bit distracting.

I will say I’m interested in seeing what having Jupyter standardize on Hatch does for this.

No pressure. :wink: (Still slowly working towards it, BTW).

For other people’s benefit, this last came up in Creating a standalone CPython distribution . This is something else I would like to see solved upstream, but I have to clear some space in my schedule to start tackling it.

Short version:

  1. PEP 665 – A file format to list Python dependencies for reproducibility of an application | peps.python.org got rejected (search around here for the threads on the topic and PEP).
  2. I’m working towards making GitHub - brettcannon/mousebender: Create reproducible installations for a virtual environment from a lock file be an opinionated PoC for wheels-only lock/pinned dependency files
  3. Need to review PoC: Metadata implementation by dstufft · Pull Request #574 · pypa/packaging · GitHub for parsing metadata
  4. Need to do a new release of mousebender for PEP 691 (and probably PEP 700 soon)

Long version: you know where to find me. :wink:

https://pyup.io beat you to the name. :wink:

I’m hoping that once we have pre-built binaries for CPython releases I can get something like this going with the Python Launcher for Unix.


I was going to write a long description of the “writing a script (with dependencies)” use case, but it occurred to me that @njs described this a lot better, back in 2018, with this post.

I think Python tooling currently focuses mostly on the “Reusable Library” section of this, with some help for the rest of section 3. But we tend to ignore sections 1 (“Beginner”) and 2 (“Sharing with others”). Those two early phases are where people have “a bunch of scripts in a Work directory” and where a formal project directory is more overhead than they want. And as a sidenote, I think that referring to stage 1 as “beginner” is misleading - I’ve been using Python for years and I still find that most of my work is in this category.

It’s standalone scripts with dependencies that I care about. They are the case that is badly supported right now IMO. These are often all dumped into a single directory, so PEP 582 doesn’t fully address this. The pip-run tool helps, but it reinstalls dependencies for every run, which can be painful.

I’d like a “unified vision” to cover this use case as well, and not just the “create a Python project” workflow.


Hatch fixes this. For example, Hatch itself uses:

detached = true
dependencies = [
  # ...
]
update-data = [
  # ...
]
update-licenses = "python backend/scripts/update_licenses.py"
update-classifiers = [
  "pip install --upgrade trove-classifiers",
  "python backend/scripts/update_classifiers.py",
]

I like hatch a lot, although I haven’t used environments much yet. But I think you’re missing my point (or maybe I’m missing yours).

I have a directory on my PC, C:\Work\Scratch, where I keep all sorts of junk - snippets of code in C, Python, and all sorts of other languages, directories with temporary work, etc. There’s no structure and barely any organisation. The other day, I wanted a script that would take a number and display factors of that number and other numbers “close” to it. I opened a new file, and started coding. I needed sympy in the script as it has factorisation routines. How would hatch environments have helped me there? My scratch directory isn’t in a hatch-based project, and the code I was writing wasn’t worth making into a project.

At the moment, I use pew to make a temporary virtualenv, install sympy, and run my script. But I have to remember (or read the code to check) what dependencies my script has when I run it, and build a throwaway virtualenv each time.

This use case is why I often push back against people saying “using packages from PyPI is easy”. It is, but package management is a big enough overhead on a throwaway script that sticking to the stdlib can end up being preferable.

No need for a pyproject.toml: enter C:\Work\Scratch, touch hatch.toml, then hatch shell
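Concretely, that hatch.toml can be tiny. A sketch (table names per Hatch’s environment config syntax as I understand it; the sympy dependency just comes from the example above):

```toml
# hatch.toml in C:\Work\Scratch — no pyproject.toml required
[envs.default]
detached = true          # env not tied to a project
dependencies = [
  "sympy",
]
```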

The solution for this use case used by pyflow is very nice IMHO GitHub - David-OConnor/pyflow: An installation and dependency system for Python

Seriously? Wow! Hatch just keeps getting better :slightly_smiling_face:


I just want to throw my prototype into the mix: monotrail

It’s one single static binary, it will download the correct python version (feat. pyoxy’s standalone cpython distribution), manages dependencies and environments (using poetry internally for locking and lockfiles). E.g. you can do monotrail run -p 3.9 command pytest which will run pytest on python 3.9 with dependencies from pyproject.toml and (if present) poetry.lock; I’ve put a lot more examples into the readme. While it’s a very opinionated take on environments, I believe it nicely showcases a lot of the single tool features requirements