Announcement: Hatch v1.8.0

You’re right that I could flesh out the description, but given the private flag you already mentioned (which does have a description), I would assume people understand that not passing it adds the install to your PATH.

On Windows it updates your user PATH via the registry; otherwise it modifies the startup scripts of your current shell.

edit: for others reading, this is the command in question: Reference - Hatch

OK, it looks like the hatch python command is not for me :frowning: Thanks for clarifying.

2 Likes

Sorry, I misremembered. The last time, I think I just manually nuked ~/.local/share/hatch/.

I understand, and I’m also annoyed that there is no good cross-Unix way to add some directory to $PATH, but as a matter of fact, Rustup/Cargo do the same and they’re universally appreciated in that language community. Which is objective proof that most users don’t care :person_shrugging:

2 Likes

They care when every single Python install gets its own PATH entry and they end up out of order or overflowing the cmd.exe limit and everything just fails to run.

If we had a versioning approach that didn’t rely on side-by-side installs it would be far less of a problem. As far as I can tell, Cargo has added exactly one entry to my PATH, and has no reason to ever add more. If I actually used PATH for Python, I’d have seven entries…

5 Likes

What’s the limit exactly? Is it that severe? (Sorry for my total ignorance of Windows, as a happy Linux user…)

It’s something like 4600 characters in either the system or per-user PATH setting, or 9200 all up, and if you launch cmd.exe with that length it’ll just clear things out. It seems to be a cmd.exe bug: it doesn’t affect regular processes or PowerShell, but there are sometimes unavoidable situations (e.g. we recently encountered it using nmake, which launches cmd to run commands from the makefile you’re building).
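
If you’re curious how close your own setup is, here’s a quick way to eyeball it (just a diagnostic sketch, nothing official; the thresholds above are approximate):

# Print the total PATH length and the length of each entry.
import os

path = os.environ.get("PATH", "")
entries = [e for e in path.split(os.pathsep) if e]
print(f"PATH is {len(path)} characters across {len(entries)} entries")
for entry in entries:
    print(f"  {len(entry):4d}  {entry}")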

I’m a new user; I haven’t used Hatch before, but I’m familiar with Poetry. By no means am I a Python expert, I’m pretty new to this.

  1. The hatch python feature piqued my interest, but I’m not sure whether I’m misreading it or just not understanding it.

    Why Hatch? - Hatch compares it to pyenv, so I’ll use that comparison to better explain my issue. I can install multiple Python versions with Hatch, but how do I select one of them? pyenv has these commands, but I can’t find the equivalent Hatch command:

    pyenv shell <version> -- select just for current shell session
    pyenv local <version> -- automatically select whenever you are in the current directory (or its subdirectories)
    pyenv global <version> -- select globally for your user account
    
  2. hatch python show is confusing. The help text says “Show the available Python distributions”. It wasn’t obvious to me whether it means “show the Python distributions installed on your system” or “show the Python distributions that can be installed”.

  3. I couldn’t wrap my head around the new app builder. Is the documentation incomplete or am I just stupid? I tried following it and got nowhere. There was no error. It simply didn’t do anything. I would love some examples that a newcomer like me can just easily understand and reproduce.

I hope I conveyed my thoughts properly; apologies if I misunderstood anything.

1 Like

That’s not a technical requirement: Tox could put a .gitignore containing * and git would ignore the directory. This is what venv will start doing in Python 3.13.
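
For illustration, the trick amounts to roughly this (a minimal sketch, not venv’s or tox’s actual code; the directory name is just an example):

# Drop a .gitignore containing "*" into the generated directory so git
# ignores it without the user touching their own ignore files.
from pathlib import Path

env_dir = Path(".tox")  # hypothetical generated directory, purely illustrative
env_dir.mkdir(exist_ok=True)
(env_dir / ".gitignore").write_text("*\n")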

I always assumed that if we came up with a common location on Unix OSes, we would have separate directories per version and then symlink to a common bin, with bonus symlinks for micro versions. So something like ~/.local/cpython/3.12.1/ for the install and a symlink into ~/.local/bin. We could even make python3 always point to the newest version.
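
Roughly the layout I mean, sketched in Python (the paths and symlink names are purely illustrative, not something any tool does today):

# Per-version install under ~/.local/cpython, with symlinks into ~/.local/bin,
# including a micro-version alias and python3 pointing at the newest install.
from pathlib import Path

home = Path.home()
install = home / ".local" / "cpython" / "3.12.1"   # per-version install directory
bin_dir = home / ".local" / "bin"
bin_dir.mkdir(parents=True, exist_ok=True)

links = {
    "python3.12": install / "bin" / "python3.12",
    "python3.12.1": install / "bin" / "python3.12",  # bonus micro-version alias
    "python3": install / "bin" / "python3.12",       # always the newest install
}
for name, target in links.items():
    link = bin_dir / name
    if not link.is_symlink() and not link.exists():
        link.symlink_to(target)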

3 Likes

I’m assuming you built a binary successfully, then tried the binary and got a “distribution not found” error at runtime from pip. That is because the project+version combo has not been released to PyPI. In order to use a dev version you must embed the wheel.

Honestly, as a fellow Windows user I’m quite surprised by the preference you and Steve seem to share regarding PATH manipulation. As one of the comments above mentioned, this is quite common for other tools.

Questions in earnest:

  1. Do you experience the same level of trepidation when the installers for, as an example, Rust and Go modify your registry to add the apps to your PATH?
  2. What do you think is preferable for the average user who is not a literal expert like the two of you?

edit: I think everyone here knows this implicitly but just to be explicit about it, my strategy for Hatch is that the defaults are what is desirable for the majority of users. That usually maps onto whatever provides the least amount of friction for new users. Expert use cases will always be supported, and will sometimes even be the default way of doing things, but not always.

edit 2: What do you, as experts, want out of Python installation via CLI? What is the desired workflow?

1 Like

Just add it once to your global gitignore, like you (hopefully) already do for your IDE’s files? :sweat_smile:

2 Likes

Indeed. In the earlier long threads about packaging strategy this kept cropping up as an area of real disagreement.

For the record, I prefer the environments not to be stored within the project directory. This is largely because for me many environments are not project-specific. A lot of my “projects” begin as exploratory code using a pre-existing environment that has “a bunch of useful stuff” installed. It’s only later, when (or if) those explorations begin to be whittled down and polished up, that I may make an environment specifically for a certain project.

I also want to point out that your characterization of a “newcomer’s expectations” itself involves the assumption that a newcomer is working with (or wants to work with) project-based environments. This is not necessarily the case. It’s equally possible that a newcomer follows some set of steps they find online, thinking they’re setting up a reusable environment to play around with, and then later they delete some “project” called “testproject” that they had set up initially, and are surprised when that deletes their environment. So in that situation it would be safer not to delete environments along with a project. (I’m speaking here in ignorance of how exactly hatch environments work; my point is just that newcomers using whatever tool may or may not assume that “projects” and “environments” are linked.)

There is such an approach. Or rather, not a “versioning” approach, but just an environment management approach. It is to have something like the py launcher, but allow specifying an environment name rather than just a Python version. So all you need on your path is a single environment manager tool. You can use that tool to activate an environment so that it puts that environment’s Python on your path (for the current session, or until you deactivate it); or you can use the tool to run something in a specified environment, like tool run --environment myenv myfile.py.
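
A minimal sketch of that shape, just to show the idea (the environment registry and interpreter paths here are made up):

# One launcher on PATH that maps environment names to interpreters and runs
# commands with the chosen one, e.g.  launcher.py myenv myfile.py
import subprocess
import sys

# Hypothetical registry mapping environment names to interpreters.
ENVIRONMENTS = {
    "myenv": r"C:\Envs\myenv\Scripts\python.exe",
    "scratch": "/home/user/.venvs/scratch/bin/python",
}

def run(env_name: str, args: list[str]) -> int:
    """Run the given arguments with the interpreter registered for env_name."""
    return subprocess.call([ENVIRONMENTS[env_name], *args])

if __name__ == "__main__":
    sys.exit(run(sys.argv[1], sys.argv[2:]))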

(And guess what, there is a tool that works this way and does a lot of other cool stuff too, and it starts with a c and ends with an onda.)

I can’t comment on Windows and how it handles PATH, but as a Linux user, I also strongly dislike tools that modify shell config files. Either install in the appropriate directory (e.g. ~/.local/bin) if this is a tool I expect to have a single instance of, or, if this is something I’m going to be switching between, either give me an activation file I can source or spawn a sub-shell with the configured environment (and defer any auto-activation to a tool that is configured to do that, such as direnv). Auto-adding multiple Pythons to the global (i.e. not in an activated env) PATH (especially if they’re accessible as python) makes it hard to debug when something is going wrong, especially when the number of entries (and hence possible causes of errors) becomes large.

4 Likes

Well, I’m talking about hatch, which describes itself as “a modern, extensible Python project manager”. So I think that assuming a project-based workflow is reasonable… :slightly_smiling_face:

Wanting tools for non-project based workflows is a whole other discussion, which I’m very carefully not getting into here. Many of my Python scripts are very much not project based, being created and living their lives in a “Scratch” directory on my PC that contains all sorts of junk. But I don’t consider hatch (or poetry, or PDM) to be appropriate for that type of work.

With Rust, no. As Steve said, it only adds one directory to my PATH. But that’s after-the-fact reassurance, I admit. When installing Rust for the first time (and with Go, and npm) I am nervous about whether it’ll add “clutter” to my PC that I can’t get rid of.

As regards average users, I think it’s misleading to think that there is an “average user” - there are a wide range of very different experiences. In my experience, though, a significant proportion of Windows users (even developers, in many contexts) almost exclusively use GUI tools, and the CLI is an alien environment to them. GUI tools and CLI tools have very different conventions and assumptions, and what’s reasonable to a GUI user is frustrating and complex to a CLI user - and vice versa.

In this context, I consider myself a CLI user, and I consider Python (and hatch) to be CLI tools. Most of my experience with beginners has been with experienced developers who write code in GUI environments (database stored procedures, Java and business process modelling code, etc). These people know how to code, but are almost completely unfamiliar with the CLI. When I teach Python, I start by teaching a basic level of familiarity with the CLI, and its conventions and behaviours. This means teaching the whole concept of directories, a “current directory”, etc. My students often start from a position where they open up cmd, find themselves in C:\Windows\system32, and start creating their files in that directory! Yes, tools like VS Code layer an “edit a project” model on top of the GUI view of folders, and make it easy to open a CLI “in your project”, but I’m assuming we don’t want to base CLI tools’ defaults on the assumption that people are using VS Code (or indeed any IDE).

So I don’t honestly think there’s a “preferable default” for a user who isn’t comfortable with the CLI. With that in mind, I think that conforming to common CLI conventions is the right choice.

So on to some specifics:

  1. Application locations like %APPDATA% are the only choice for a GUI application (and on Windows, they are designed very much with that in mind - they are hidden away, and the user isn’t expected to deal with them). Config settings are edited using the GUI and persisted in the same “safe, hidden” location. CLI applications, on the other hand, typically allow the user to manually edit a config file, and therefore that config file should be in an easily-accessible location - Windows doesn’t really have such a location, but tools following Unix conventions have gravitated towards the user’s home directory.

    The overall consequence here is that things are a bit of a mess, because of differing history, but “have easy to find config files” and “use %APPDATA% sparingly, if at all” are good principles here. Storing tool data in the working (project) directory is also a good choice, because again it’s a common CLI convention.

  2. GUI applications don’t get added to PATH. They are run by double-clicking them. So PATH is an inherently CLI-focused convention. And the model for PATH is very much that of a list of directories, each of which holds many executables. PATH entries with only one or a couple of executables are uncommon, and usually frowned on (for the reasons Steve mentioned). It’s also very bad form to have multiple copies of the same program on PATH, in different directories.

    So adding installed Python interpreters to PATH violates this convention twice over - it’s a one-application-per-entry approach, and it adds multiple copies of python.exe to the path, which can only be disambiguated by checking the value of PATH closely.

Yes, but those tools typically (in my experience) add a single directory with multiple executables in it (gcc, clang, ImageMagick, OpenSSH, Rust). Personally, I almost never install tools that add a directory with a single executable in it to PATH (see below).

Going back to

I routinely “try out” new tools. Probably way more than I should :slightly_smiling_face: I’ll often try them out, discard them, and come back to them again later. Because of that, I very much prefer to work with tools that I can install, try, and then uninstall completely. There’s nothing worse than trying out a new tool, getting very confused because it doesn’t work the way the docs say it should, and finding that I’d checked it out 2 years ago, and left a config file lying round that changed all the defaults but I’d forgotten about it. Or there’s cached data with surprising impacts on my experience. There’s a concept in GUI tools of “portable applications” that are totally self-contained, and can be used from a pen drive with no installation needed, or impact on the user’s system. This is my ideal for CLI tools - and as an IT consultant, it was often critical for carrying around a “toolkit” of programs I can use on a client’s environment without seeking approval to make changes to the client’s systems.

So yes, I have serious trepidation about any install method that isn’t “unzip and run”. And I dislike intensely any tool that doesn’t provide full disclosure on what changes it makes to the system it’s installed on, or doesn’t provide a “fully portable” means of working (whether that’s a portable config file, or defaults that “just work” for me).

Specifically, I use the Scoop package manager very heavily, because it gives me that “self contained, easily removable” experience for my command line tools. I use pipx for Python tools for the same reason, but that doesn’t isolate me from tools putting stuff in %APPDATA% the way Scoop typically does.

That’s fair, as long as “supported” means “experts can configure this behaviour to be the default for them” rather than “you always need a particular command flag”. Along with easily discoverable and visible (and ideally portable) configuration, so you know what your non-standard configuration is.

To give an example, my Hatch config file (config.toml) is as follows:

mode = "local"
project = ""
shell = "pwsh"

[dirs]
project = []
python = "isolated"
data = "C:\\Users\\Gustav\\AppData\\Local\\hatch"
cache = "C:\\Users\\Gustav\\AppData\\Local\\hatch\\Cache"

[dirs.env]
virtual = ".venvs"

[projects]

[publish.pypi]
user = ""
auth = ""

[template]
name = "Paul Moore"
email = "p.f.moore@gmail.com"

[template.licenses]
headers = false
default = [
    "MIT",
]

[template.plugins.default]
tests = true
ci = false
src-layout = false

[terminal.styles]
info = "bold"
success = "bold cyan"
error = "bold red"
warning = "bold yellow"
waiting = "bold magenta"
debug = "bold"
spinner = "simpleDotsScrolling"

I have literally no recollection of what any of that is for (apart from [dirs.env.virtual]), so I’ve no idea why I added it, whether something added it by default, which parts are changing a default, and which are simply re-stating the default (I think [dirs.data] and [dirs.cache] are the defaults).

As an expert[1], that’s not what I’d call usable.

Most importantly of all, integration with the conventions of the python.org installers and the Python launcher for Windows. By that I specifically mean that it should follow the conventions established by them:

  • Python is not on PATH by default, but is invoked using the py command.
  • Users can run a specific Python interpreter by quoting the full path name, or by adding the interpreter’s path to PATH manually. Note that this means the path name should be easy to remember and discover[2].
  • Users who don’t have Python on their path should not have it added by default.
  • I don’t have an opinion on what the right experience should be for users who check the “add Python to PATH” box in the python.org installer, as I don’t do that myself and I don’t recommend it to people. But if you can’t distinguish between this and the previous case (or don’t want to), then “don’t add Python to PATH” should take precedence, with “add it” being the opt-in case, to match the python.org installer.

The way the Windows Store version of Python exposes a python.exe command (including the[3] shim that opens the Store app) should be taken into account - again, I disable this and so it’s covered under “users who don’t have python in their path” for me, but as a maintainer, I don’t want to have to support users who (for example) are getting weird results because hatch overrode the store Python and they didn’t realise.

For Python installation via CLI, what I specifically want (remember, I’m in the “no Python on PATH by default” category):

  • hatch python install 3.10 installs Python somewhere accessible, but does not add it to PATH or “activate” it in any way.
  • By default, it is installed in the project directory. Installing it centrally should be available as an option, but not by default, and it should involve specifying the location where the shared copy goes, not picking a location for me.
  • I’d like the naming to be something like .pythons/3.10 by default - note that it’s just version number, not architecture or anything like that. Non-default architectures can go in a subdirectory if that’s a thing, but I want the default naming to be “clean”.
  • I’d want to be able to create virtual environments from that executable easily - virtualenv -p .pythons/3.10/Scripts/python.exe is acceptable. As is .pythons/3.10/Scripts/python.exe -m venv. Naming the interpreter by path is also fine for nox/tox/pew/pipx. Clean, short naming for the directory is crucial for this usage.
  • I’d like hatch to be able to find that interpreter by version number when creating environments (as long as the environment was created in the default location).

I hope that helps. I know it’s long, and very much just my own personal preferences and views. I fully expect a lot of people to say “I don’t agree with this”, but that’s the whole point - @ofek asked for my views so that’s what I gave. I’m not trying to say everyone should work my way, or that my way is best for anyone but me, all I’m trying to do is explain my perspective, and why I say hatch “isn’t for me” as it stands now. No-one’s forcing me to use hatch, and I’m fine with that, but I’m also grateful that @ofek is interested in my feedback as to why I made that decision.

(If there’s a better public place for a discussion like this, I’m happy to take this elsewhere. But I feel it’s more or less on topic here, as it’s feedback on the Hatch 1.8.0 release)


  1. which I don’t think I am, but whatever ↩︎

  2. Yes, this means I don’t think the default install location for the installer is ideal, but I can’t think of a better place ↩︎

  3. IMO ill-advised ↩︎

7 Likes

I also strongly prefer that tools confine themselves to the project directory when possible.

To give a concrete example of a negative consequence of not doing this, I use pre-commit quite a lot, but it stores its environments in a cache directory in my homedir. (I don’t really like that it calls this a cache, but that’s neither here nor there…)
Frequently when troubleshooting (e.g. I recently had a hook that was failing on py3.12), I’ll want to clear that cache, but the mapping of repos to environments is opaque. I can’t tell whether I’m supposed to use pre-commit gc or pre-commit clean, and sooner or later I get annoyed with the whole toolchain and just blow away the whole cache dir. Then each time I touch a repo for a while it will take a little longer to rebuild those environments. Oh well. At least I fixed my bug.
I like pre-commit a lot and get a lot of value from it. But when it breaks, its decision to opaquely manage environments for me is in the way.

In contrast, I use tox a ton, and never have any confusion about it. The commit which adds a tox.ini also adds one line to the gitignore. If I need to dig deep and troubleshoot, I can reach into .tox/py313-rainbows-unicorns/ (that’s a 3.13 env with rainbows and unicorns, for those unfamiliar with tox :wink:) and look at the site-packages dir or whatever I need.

What I’m trying to get at is that inclusion in the project directory helps make tools more transparent and understandable in their mechanisms. That, in turn, is a huge benefit when things go wrong.

Vis-a-vis hatch, this mostly means that if I start using it again, I’ll want to configure it in this way. Which I didn’t know was a thing I could do – probably my fault for not reading the docs carefully enough. But it might also guide docs or outputs from diagnostic commands. I want to be able to see the mapping from a repo to its environments. And given that a local repo can be deleted, I want to be able to inspect that mapping regardless of where I am on the filesystem.

1 Like

Thank you for expressing your preference! FYI Hatch environments can be found using the hatch env find command.

1 Like

This command finds environments that are associated with the current project i.e. by looking for pyproject.toml in the current directory and finding environments that are associated with that project. If the project directory has been deleted then this command will not find the environments that were previously associated with it.

I just tried adding virtual = '.virtualenv' into config.toml and now hatch has lost track of the environments that it previously created in ~/.local/share/hatch/env/virtual. They are still there using disk space, but hatch env find no longer sees them, and I am not sure if there is any hatch command to show that they are there.

If the environments are not stored in the project directory then what I want is an easy way to see a list of all environments, showing their disk usage and showing which are not needed any more. Then I want an easy way to delete the unused environments.
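
For a sense of what I mean, even something as small as this would cover the “show disk usage” half (assuming the default storage location mentioned above; this is just a sketch, not an existing hatch command):

# List each directory under the default virtual-env storage location with its
# total size. Adjust env_root if dirs.env.virtual has been changed in config.toml.
from pathlib import Path

env_root = Path.home() / ".local" / "share" / "hatch" / "env" / "virtual"

def dir_size(path: Path) -> int:
    return sum(f.stat().st_size for f in path.rglob("*") if f.is_file())

if env_root.is_dir():
    for env_dir in sorted(p for p in env_root.iterdir() if p.is_dir()):
        print(f"{dir_size(env_dir) / 1e6:8.1f} MB  {env_dir.name}")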

2 Likes

Now this is a great, very specific and achievable feature request! Can you please open an issue asking for that so that I may track it?

1 Like

The python.org macOS install breaks this rule.

1 Like

Not really, because they add only 1-2 executables that aren’t likely to conflict with other apps, don’t require me to disambiguate between versions, and don’t pollute the DLL search path (a.k.a. PATH) with their own DLLs.

(Unlike Paul, I’m not so concerned about directories being added with only a few executables, though I’d prefer Windows had a model where apps installed into a /bin directory that was known to be on PATH. Store apps follow this model, but traditional apps cannot unless they do it themselves.)

The py launcher also meets these criteria (ignoring the parenthetical). But python does not, and in particular the Scripts\ directories do not. It is shockingly easy to end up with multiple mismatched sets of Python executables/scripts on PATH, which is the problem I am concerned about. One-off/shared namespace installs like Rust, Go, Node.js, .NET, etc. don’t have this problem.

The Windows Store app model is a step in the right direction (and I encourage it to take more steps in the right direction, because it still has issues). With a single directory/namespace for global commands, you have a much more direct sense of which one is active, and it has much more accessible controls for changing it (though they could be even better, and I hope one day we get there).

e.g. in this image, it’s pretty easy to tell whether python.exe means Python 3.12 or 3.13, and obvious how to change it. PATH offers neither aspect.

However, my personal preference is to use temporary changes to PATH, which is typically an activated virtual environment.[1] This way I have an uncluttered PATH by default, and when I’m working in a particular project I can activate it and now python is “the right Python for this project with all its requirements”. I still tend to use -m rather than global script names, but those are available too. If I’m working in two projects at once, I’ll have two terminal tabs open with different environments activated.

Having the tools “know” the right settings for the current “workspace”. Obviously the definitions of those two quoted words will vary, but I would imagine that in a terminal, the current directory implies the workspace, and Hatch has a config file for that workspace, so that its run command just does the Right Thing. Equally, an IDE like VS or VS Code has its own concept of a current workspace, and pressing F5 in those should be able to generically locate/create and launch the right environment without needing PATH to be set up at all.

Basically, I think the workspace[2] is what the “average user” should be concerned with. They clone one from VCS-of-their-choice, and run the tool that “knows” how to prepare it.[3] Only one tool has been installed, only one tool exists on PATH all the time, but once you’ve used that tool to activate your workspace, you may have convenient access during the current session to the other tools your workspace uses.


  1. I’d still prefer a PEP 582 style ability to have a global Python recognise my project-specific libraries, but that’s clearly unpopular. ↩︎

  2. “Project” is likely the word a lot of people are already using for this, but I think that’s subtly different. You could have multiple projects that use the same workspace, for example. ↩︎

  3. In case a concrete example helps: the frontend tool that sees pyproject.toml and finds the section that identifies the backend tool needed, acquires, and launches it, and that backend tool uses its own configuration to download actual runtime components and launch the requested app/script. ↩︎
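
To make that first frontend step concrete, it is roughly this (heavily simplified; real frontends do much more):

# Read pyproject.toml and report the declared build backend and its requirements.
import tomllib  # Python 3.11+
from pathlib import Path

with Path("pyproject.toml").open("rb") as f:
    config = tomllib.load(f)

build_system = config.get("build-system", {})
print("backend: ", build_system.get("build-backend", "(none declared)"))
print("requires:", ", ".join(build_system.get("requires", [])) or "(none declared)")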

6 Likes