Hatch always uses an env so this would work fine
For standalone scripts with no dependencies, I would think you could use the py executable that pyup provides directly and skip the pyrgo new/lock/install steps I listed. This is one of Python's strengths (scripting) that we can continue to support.
I was going to say some things regarding why conda environments as-is are a no-go for me based on work experience, but I’m not diving into it as I think it’s a bit distracting.
I will say I’m interested in seeing what having Jupyter standardize on Hatch does for this.
No pressure. (Still slowly working towards it, BTW).
For other people’s benefit, this last came up in Creating a standalone CPython distribution. This is something else I would like to see solved upstream, but I have to clear some space in my schedule to start tackling it.
Short version:
- PEP 665 – A file format to list Python dependencies for reproducibility of an application | peps.python.org got rejected (search around here for the threads on the topic and PEP).
- I’m working towards making GitHub - brettcannon/mousebender: A package for installing Python packages be an opinionated PoC for wheels-only lock/pinned dependency files
- Need to review PoC: Metadata implementation by dstufft · Pull Request #574 · pypa/packaging · GitHub for parsing metadata
- Need to do a new release of mousebender for PEP 691 (and probably PEP 700 soon)
Long version: you know where to find me.
https://pyup.io beat you to the name.
I’m hoping that once we have pre-built binaries for CPython releases I can get something like this going with the Python Launcher for Unix.
I was going to write a long description of the “writing a script (with dependencies)” use case, but it occurred to me that @njs described this a lot better, back in 2018, with this post.
I think Python tooling currently focuses mostly on the “Reusable Library” section of this, with some help for the rest of section 3. But we tend to ignore sections 1 (“Beginner”) and 2 (“Sharing with others”). Those two early phases are where people have “a bunch of scripts in a Work directory” and where a formal project directory is more overhead than they want. And as a sidenote, I think that referring to stage 1 as “beginner” is misleading - I’ve been using Python for years and I still find that most of my work is in this category.
It’s standalone scripts with dependencies that I care about. They are the case that is badly supported right now IMO. These are often all dumped into a single directory, so PEP 582 doesn’t fully address this. The pip-run tool helps, but it reinstalls dependencies for every run, which can be painful.
I’d like a “unified vision” to cover this use case as well, and not just the “create a Python project” workflow.
Hatch fixes this. For example, Hatch itself uses:
[envs.backend]
detached = true
dependencies = [
  "httpx",
]

[envs.backend.scripts]
update-data = [
  "update-classifiers",
  "update-licenses",
]
update-licenses = "python backend/scripts/update_licenses.py"
update-classifiers = [
  "pip install --upgrade trove-classifiers",
  "python backend/scripts/update_classifiers.py",
]
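If I’m reading the environment scripts feature correctly, those entries would then be invoked with Hatch’s env:script syntax, something along the lines of:
hatch run backend:update-licenses
hatch run backend:update-data
with backend being the detached environment defined above, so no project metadata is needed for it.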
I like hatch a lot, although I haven’t used environments much yet. But I think you’re missing my point (or maybe I’m missing yours).
I have a directory on my PC, C:\Work\Scratch, where I keep all sorts of junk - snippets of code in C, Python, and all sorts of other languages, directories with temporary work, etc. There’s no structure and barely any organisation. The other day, I wanted a script that would take a number and display factors of that number and other numbers “close” to it. I opened a new file, and started coding. I needed sympy in the script as it has factorisation routines. How would hatch environments have helped me there? My scratch directory isn’t in a hatch-based project, and the code I was writing wasn’t worth making into a project.
At the moment, I use pew to make a temporary virtualenv, install sympy, and run my script. But I have to remember (or read the code to check) what dependencies my script has when I run it, and build a throwaway virtualenv each time.
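For concreteness, that dance looks roughly like this (a sketch; factors.py stands in for whatever scratch script I’m running, and pew mktmpenv discards the env when its subshell exits):
pew mktmpenv
pip install sympy   # the bit I have to remember by re-reading the script
python factors.py
exit                # temporary env is thrown away here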
This use case is why I often push back against people saying “using packages from PyPI is easy”. It is, but package management is a big enough overhead on a throwaway script that sticking to the stdlib can end up being preferable.
No need for a pyproject.toml: enter C:\Work\Scratch, then touch hatch.toml, then hatch shell
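An untested sketch of what that hatch.toml could contain for the sympy example above (detached = true keeps the env independent of any project, like the config shown earlier in the thread):
[envs.default]
detached = true
dependencies = [
  "sympy",
]
After hatch shell, the scratch script should run with sympy available via a plain python factors.py (factors.py again standing in for the scratch script).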
The solution used by pyflow for this use case is very nice IMHO: GitHub - David-OConnor/pyflow: An installation and dependency system for Python
Seriously? Wow! Hatch just keeps getting better
I just want to throw my prototype into the mix: monotrail
It’s one single static binary; it will download the correct python version (feat. pyoxy’s standalone cpython distribution) and manages dependencies and environments (using poetry internally for locking and lockfiles). E.g. you can do monotrail run -p 3.9 command pytest, which will run pytest on python 3.9 with dependencies from pyproject.toml and (if present) poetry.lock; I’ve put a lot more examples into the readme. While it’s a very opinionated take on environments, I believe it nicely showcases a lot of the single-tool feature requirements.
I think this is a management and messaging problem first and foremost. If the python packaging authority doesn’t mention conda anywhere, a lot of people will never even discover it. And even people who are aware are doubtful - I see the confusion all the time (in my dayjob and online) about which way is “the right way” to do python packaging and dependency management.
I firmly believe that the vast majority of users would adapt to any paradigm that solves their problems and doesn’t get in their way too much. I think the strongest resistance actually comes from those people knee-deep in packaging entrails, and the significance of that group is that many of them are the movers and shakers of the (non-conda) packaging ecosystem.
I think that willingness for change would be there, as long as there are no major functional regressions in terms of problems already solved today in conda-land. And that’s not just me saying it personally, but echoes the statements by the anaconda CEO in the twitter thread I’ve already linked twice (also I believe most of conda-forge/core has a very solution-oriented approach to this as well).
End users really don’t benefit from a zoo of different solutions, the confusion, and the straight-up incompatibilities between those two worlds (in both ways).
I see the users of conda as part of the exact same larger ecosystem (not some parallel world), only that they have been so thoroughly underserved by the “native” capabilities that they found(ed) a new home[1]. So I disagree quite fundamentally with this:
Conda is full-stack because that’s – unfortunately – what’s necessary to deal with the inherent complexity of the problem space. But it’s not a beneficial state of affairs for anyone IMO; that divergence is an issue that affects a huge amount of python deployments (e.g. having to decide how to prioritize the benefits of pyproject.toml / poetry’s UX, etc. vs. the advantages of conda) – it’s possible to claim that it’s too much work to reconcile, but fundamentally, that schism shouldn’t have to exist.
It’d be nice if PyPA and CPython folks didn’t treat conda as such a world apart, because for one that soft form of “othering” is not helpful in finding a common way forward, and secondly because it is a large part of the wider python ecosystem and deserves some usecase-empathy (beyond “of course we care about the data science persona!”).
Just to clarify my point – in the big picture, the whole discussion is about UX, including e.g. the avoidance of very frustrating crashes and insanely hard to debug situations. What I was referring to above was slightly more narrowly-scoped UX (as experienced through CLI/config etc.); providing a solid enough foundation is IMO the much harder thing to pull off in terms of technology / standardisation; shaping things into a nice-to-use CLI/config is important but by itself cannot solve the underlying problems.
-
or, for the sake of the metaphor, let’s say they’re living in the garage. ↩︎

It’d be nice if PyPA and CPython folks didn’t treat conda as such a world apart, because for one that soft form of “othering” is not helpful in finding a common way forward, and secondly because it is a large part of the wider python ecosystem and deserves some usecase-empathy (beyond “of course we care about the data science persona!”).
I for one would love to see conda participating in packaging discussions. We’ve asked a number of times but it’s never really happened[1]. Maybe we aren’t reaching the right people?
-
Excluding the occasional non-productive “just abandon all the existing tools and use conda” comment. ↩︎

It’s standalone scripts with dependencies that I care about.
As Ofek mentioned, both Poetry and hatch have a shell command that handles activating a managed virtual environment in a cross-platform way. In the pyrgo example I was giving:
- Developer writes a script, script.py, that depends on scipy
- pyrgo add scipy
  This adds the dependency to pyproject.toml
- They install the dependency:
  pyrgo install
- They activate the virtual environment with these dependencies:
  pyrgo shell
- And run the script within the activated environment that includes scipy:
  (venv) $ python script.py
Both Poetry and hatch also support a run command to short-circuit the need to activate the virtual environment. So rather than running the shell command and then invoking the script in a second command:
pyrgo run python script.py
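Putting the hypothetical pyrgo steps together, the whole flow for a script with a third-party dependency would be something like:
pyrgo add scipy              # records scipy in pyproject.toml
pyrgo install                # creates/updates the managed virtualenv
pyrgo run python script.py   # runs inside that env, no manual activation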
Of course, this virtual environment could also be used to install/uninstall arbitrary packages with pip based on how things currently work, if the developer wanted to fall back to the additional flexibility (and lack of structure) that currently exists with pip and venv.
Sigh. I think you’re still missing my point. One further try.

This adds the dependency to pyproject.toml
I don’t have (or want) a pyproject.toml for this use case. That’s *exactly* my point. I’m not even in a Python project, or writing one. If it helps, assume I’m writing my script in /tmp.

They activate the virtual environment with these dependencies
… and I don’t want to manage a virtual environment associated with my script. I want the system to do that for me.

Both Poetry and hatch also support a run command to short-circuit the need to activate the virtual environment.
Which is great, but they still need me to manage the environment in the sense that I have to pick a name for it, delete it when I’m done, and remember that that environment is associated with my (probably throwaway, but I’m keeping it “just in case”) script.
My ideal here is for scripts which depend on 3rd party libraries to be just as easy to write and use as scripts that only rely on the stdlib. And crucially, for all situations that pure-stdlib scripts can be used in.
The nearest I’ve found is pip-run, which lets you say __requires__ = ['requests'] in your script, and it will then install requests in a temporary environment when you run your script. Its main disadvantages are that it re-creates the environment every run (slow if you have complex dependencies) and that it has a somewhat clumsy command line syntax. But integrate that functionality with something like hatch run and you have pretty close to what I’m talking about.
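For reference, a minimal sketch of what that looks like with pip-run (the __requires__ line is the documented hook; the exact invocation is from memory, so check pip-run’s docs):
# script.py
__requires__ = ['requests']
import requests
print(requests.get("https://pypi.org/simple/").status_code)
and then something like pip-run -- script.py installs requests into a throwaway environment for just that run.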
Seems like pyflow is close enough (I have not tried). It seems to have support for __pypackages__ and __requires__.
I haven’t tried pyflow either, but this is where it seems like __pypackages__ would really come in handy, especially if built into Python (perhaps opt-in). Just brainstorming here:
- I write a little toy.py script and it imports, say, requests
- I run python3 -M toy.py (I’m just picking -M for “magic”)
- Python reaches the import, sees the missing requests dependency, goes out to PyPI and installs requests into __pypackages__, satisfying the import
- Python merrily and magically (there’s that -M again!) continues to execute toy.py
I’m not papering over all the complexity, security, metadata, etc. issues here. Well, maybe I am, but deliberately so, to give some feel for what the happy path to very simple, built-in, magical scripts would be. TPI? [1] Yeah, probably.
-
Terrible Python Idea ↩︎

Python reaches the import, sees the missing requests dependency, goes out to PyPI and installs requests
reminds me of David Beazley’s autoinstall, from David Beazley - Modules and Packages: Live and Let Die! - PyCon 2015 - YouTube

I for one would love to see conda participating in packaging discussions. We’ve asked a number of times but it’s never really happened. Maybe we aren’t reaching the right people?
That’s not really true, e.g. here.
It’s a chicken-and-egg problem in the sense that as long as pip / PyPA consider installing non-python dependencies out of scope, yet consider installing all possible binaries that python projects might bring along in scope[1], it leaves basically zero room for conda to contribute, because that stance effectively defines the problems that conda is solving out of existence.
So the 100’000ft view is that, to find a common path (and have it be pertinent for conda people to contribute), pip has to either:
- expand its mandate to also cover non-python dependencies, effectively becoming a full-stack, cross-platform installer
- restrict its purview, and allow (or rely on) plugging in other tools to fill the required gaps in installing binaries
Finally, while I don’t speak for anyone but myself, I’m spending the lion’s share of my FOSS time curating conda-forge, and know that ecosystem (and many involved people) quite well. Feel free to tag me on DPO for anything conda[-forge]-related.
-
more in-depth discussion in the link above ↩︎

Conda is full-stack because that’s – unfortunately – what’s necessary to deal with the inherent complexity of the problem space. But it’s not a beneficial state of affairs for anyone IMO
Part of the issue with any of these discussions is that:
- The actual problems (related to compiler toolchains, ABIs, distributing packages with compiled code in them, being able to express dependencies on non-Python libraries and tools, etc.) are quite complex,
- Most people here don’t have those problems as package authors, and in many cases they don’t have them as users either (simpler packages with some C/Cython code work fine as wheels),
- The solutions to those problems do necessitate some overhead, which make them hard to accept for folks that don’t have the problems,
- The scientific computing and ML/AI community hasn’t always explained the problems in enough detail. Often it’s a long email/discourse thread about one specific topic, and folks talk past each other because possible solutions are brought up before the problem is very clearly explained.
That makes it difficult to get anywhere with this conversation.
I would also say that it’s not only Conda that solves these problems. PyPI has quite fundamental problems when dealing with complex packages/dependencies with C/C++/CUDA/Fortran/etc. code. Those kinds of problems are solved (mostly) by Conda, but also by Spack, Nix, Homebrew, Linux distros, etc. Of those, Conda, Spack and Nix all have the concept of environments that you can activate/deactivate.
I’ll do a little pre-announcement here: I’m making a serious attempt at comprehensively describing the key problems scientific, ML/AI and other native-code-using folks have with PyPI, wheels and Python packaging. That will be a standalone website (first release by the end of the year) which is kept up to date and aimed to serve as a reference, so we hopefully stop talking past each other. It will not have proposed solutions - this Discourse is the place for that. At most (for now) it will have some outgoing links to key PEPs and discussions around potential solutions to some of the issues.
I’ll reach out to invite you to participate in helping shape the content on some of the topics that you’ve brought up.
One nice thing is that the standard py launcher can apparently now use Conda/Anaconda/Miniconda Python via py -V:Anaconda.
Do I understand correctly that the main problems Conda solves are related to ensuring low-level ABI / binary version compatibility for shared, non-Python dependencies?
And that the downsides are a more limited set of available packages (e.g. no Python 3.10+ yet, only popular libraries that have been specifically packaged by Conda), due to the additional effort needed to make these packages available?
How does it compare to Christoph Gohlke’s wheel binaries, which can be used with normal Python and pip? Why can they not be on PyPI?
Maybe all this will be explained on that new website?
It seems the other / original topics here are largely orthogonal, right?
- install Python versions easily (“pyup”)
- lock files
- simple scripts with dependencies but without having to deal with environments
- cross-platform standardization (py vs python3)
As the py launcher can now already “plug in” external Conda, and is the bundled user-facing top-level tool, maybe it would be the natural place to integrate more of these as subcommands via plugins?