Publish Linux installer on

I have been using Debian (not Ubuntu) for more than 20 years as my primary development and desktop environment. I am a full-time Python application developer and also contribute to multiple Python open source projects.

I really like the philosophy behind Debian packages/apt: avoid code duplication in the system as a whole, rather than bundling every dependency into every package, which uses far more disk space by comparison. I also like how cleanly apt removes dependencies when I uninstall.

When I need a new Python package installed, I always check first whether it is available in apt before resorting to pip.

Python packages installed via apt update automatically when I upgrade my Debian distribution and get a new version of Python. That means that after an upgrade, I only need to reinstall via pip the very few packages I need that are not available in Debian before my new system is fully up and running.

These tradeoffs are very much worth it to me in exchange for having to wait a year for the latest Python release.


I think the conversation has strayed a bit.

The idea of installers would be to help less savvy people get a fresh version of python. I don’t think we would want it to replace or conflict with the system one.

Optimally, Linux users shouldn’t need to hit the command line and build manually to get a new Python version. On Windows, we can have multiple versions side by side (albeit without a system Python). So forget about the system Python, system packaging, etc. I think the desire is an easy way to do the same on Linux.


I think offering an installer that installed a specific version of python to /opt on linux that was the full “everything documented as part of CPython is expected to work when obtained via this installer” would be a net positive, but I’m a little disappointed in the situation downstream that’s led here.

If we’re just focusing on what can be done now, building and installing a working interpreter to /opt and not replacing existing symlinks for python would avoid disrupting system python on distributions where system python is not just an implementation detail, and is exposed to users.


This describes me. I’m fairly savvy with Python, but I only use Linux occasionally and haven’t touched a C environment in decades. I’ve built CPython successfully a small number of times, but it wasn’t entirely straightforward. ISTR it would fail partway through looking for dependencies, and there were configure options that suggested I might not have built the best thing.

I read enough the first time to know I should not replace the system Python. It does not seem necessary to discuss that.

It took a while to believe this was actually what I was expected to do to get a mainstream language on a popular OS. (Yes, I’m sure the search took much longer than building it once I gave in.)

My need is for something I can install, ideally without root (so maybe not just to /opt), with only one trip to the Internet (not every computer is connected), and that I can put on my path once installed. Then an easy way to switch between versions when I have several installed.
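The PATH and version-switching parts of this can be scripted by hand today. A minimal sketch, assuming hypothetical per-version installs under $HOME/.local (all paths and version numbers here are illustrative, not a real layout anyone ships):

```shell
# Hypothetical layout: each interpreter was built with something like
#   ./configure --prefix="$HOME/.local/python3.12" && make && make install
# (no root needed for a prefix under $HOME).

# A stable "current" symlink makes switching versions a one-liner:
mkdir -p "$HOME/.local"
ln -sfn "$HOME/.local/python3.12" "$HOME/.local/python-current"

# Put the current version first on PATH (e.g. in ~/.profile):
export PATH="$HOME/.local/python-current/bin:$PATH"

# Switching to another installed version later is just re-pointing
# the symlink, with no PATH edits:
#   ln -sfn "$HOME/.local/python3.11" "$HOME/.local/python-current"
```

This covers the "no root" and "switch between versions" parts; it does nothing about the "one trip to the Internet" part, which is the piece that actually needs an installer.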

asdf and pyenv (new to me from this discussion) look promising if the disconnected use case is served. Is the argument here simply that a third party tool should not be necessary?
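For the record, a typical pyenv session looks roughly like this (version numbers are examples, and this sketch needs network access for the initial steps; as I understand it, pyenv’s python-build will reuse a source tarball dropped into ~/.pyenv/cache, which is what makes the mostly-disconnected case workable):

```shell
curl -fsSL https://pyenv.run | bash   # the one trip to the Internet
mkdir -p ~/.pyenv/cache               # python-build reuses tarballs placed
                                      # here instead of re-downloading them
pyenv install 3.12.3
pyenv install 3.11.9
pyenv global 3.12.3                   # default version for this user
pyenv local 3.11.9                    # per-directory override
pyenv versions                        # list what's installed
```

Note that pyenv still builds from source under the hood, so the build-dependency problem discussed above doesn’t go away; it’s just hidden.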


But that IS the best way to make sure that it’ll work on your exact system. I have run into annoying problems trying to use something that was packaged for Ubuntu when I’m on Debian: the package assumed a particular version of some library, I had a different version, and the program wouldn’t run. Building from source ensures that this doesn’t happen.


The forced rebuild from source is part of what keeps pushing back the “year of the Linux desktop”.

Running make is easy, but getting ctypes and the other C extension modules to compile is harder.

I’d +1 something to make it easier. A manylinux-style installer could get close. Yes, there may be edge cases, but they can be smoothed over centrally.


sudo apt build-dep python3
./configure; make

I’ve never had a problem building ctypes. Have you actually done it, or are you afraid because you expect it to be a problem?

That creates a slow Python interpreter without important performance optimizations. The build also does not use the system’s security policies or system libraries. You are losing about 20% of performance plus security hardening.


True, and I would always add parameters to the configure. But you should get a fully working build with that. @csm10495 said that it wouldn’t even compile, and I am highly dubious of that happening on any modern Linux system.
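For reference, the parameters in question are usually something like the following (a sketch, not an official recipe; the prefix is an arbitrary example, and `make altinstall` is the step that avoids touching the distro’s symlinks):

```shell
sudo apt build-dep python3    # pull in CPython's build dependencies
# PGO and LTO are the optimizations a bare "./configure; make" skips:
./configure --prefix=/opt/python3.12 --enable-optimizations --with-lto
make -j"$(nproc)"
sudo make altinstall          # installs only versioned names like
                              # "python3.12"; leaves /usr/bin/python3 alone
```

The PGO build (`--enable-optimizations`) runs the test suite to collect profiles, so it takes considerably longer than a plain `make`, which is part of why casual builders skip it and end up with the slower interpreter described above.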

Now that we’re talking about security, there’s the question of security updates for these Pythons. Whatever installer mechanism we’re looking at should have a way to do updates, and a way to let the user know when they need to do updates.

These problems are solved when you look at some of the existing third party distribution mechanisms (APT repos, snaps, flatpaks, etc.) But they all come with their own advantages and downsides.

  • APT repos: only useful for Debian derivatives; requires users to trust you with root on their machine.
  • RPM repos: only useful for RH derivatives; requires users to trust you with root on their machine.
  • Snaps: unconfined snaps can work for command-line utilities, but the snap binaries typically aren’t on the default PATH (some setup required).
  • Flatpak: not a great option for CLI tools, although it should work.


Once again, I think we’re overthinking this. Use the Windows installer as an example: it doesn’t auto-update or notify.


My mum uses linux and has been doing so for 10 years. Never once has she asked me about “missing headers” or “how to get the latest python version”.

I think software developers are expected to be able to use the terminal.

Or the macOS one. This is an install-and-forget option for people who aren’t that technical (ML, data science, or users new to the language).

Not all python users are software developers. We shouldn’t really assume that.


I’m guessing that, like my Mum, she doesn’t need to build Python from source, because Debian provides a perfectly usable desktop system.

I don’t have a particularly strong opinion on whether Linux binaries should be shipped or not. I think it’s probably fine to do, and it possibly makes things easier for tools that want to manage Python installations (like PDM, hatch, etc.), as they can rely on the “standard” Python binary rather than trying to find one produced somewhere else.

Something like rustup or so would possibly be even better.

I think all of the discussion around what distros are doing, and whether it’s a good or bad thing (or whether a particular distro is doing a good or bad thing), isn’t particularly useful from either “side” of this [1].

Many people undoubtedly get great value from their distro provided Python, and I don’t think anyone is (or should) suggest that those users are somehow wrong for preferring that, and I think doing so is entirely unproductive.

Likewise, there has always been a bit of an impedance mismatch between what most (and maybe all?) distributors of Python provide and what some of Python’s users want, and trying to tell those people that the things they want are wrong is also entirely unproductive.

In some cases, providing a Linux binary may even relieve some of the pain distributors have with this, as it gives them a much simpler escape hatch to point users to when those users are trying to do something their distro doesn’t support.

  1. Coming from someone who remembers a lot of the Distro / Python Packaging “wars” of old, the folks maintaining these stacks downstream are typically super helpful, albeit they can’t always fix everything because of their own policies. ↩︎


I’ve thought about this more, and I think what would be useful here is to come up with an up to date enumeration of the issues and shortcomings [1] that upstream (both Python itself, and the packaging tools) has with the status quo of how binary versions of Python on Linux are distributed today.

I think this would be useful for a few reasons:

  • It gives us a metric by which to decide whether publishing Linux binaries is actually solving the issues we’ve outlined.
  • It can be used in discussion with downstream to determine which of the issues are fundamental issues (and unlikely ever going to be solved), which are solvable but would require a lot of work (and thus are unlikely to get solved without significant investment), and which are solvable and where there are fixes that a volunteer can implement in their free time.

I can see benefits to shipping Linux binaries, so it’s entirely possible that it’s a good and valuable thing to do even if we were happy with the state of Python + some downstream. But I also think it’s going to be a good amount of work, particularly ongoing work that Python would be signing itself up for, and it would be a shame to spend a bunch of time on it if there is a path to resolving these concerns without it.

Of course, even if we decide that it’s valuable even when the experience with downstream distributors is frictionless (for what they’re willing to provide), there’s still value in having such a list and in chasing down the right contacts to address these concerns downstream. Users are still going to use those Pythons, and the more friction we take away from them, the better off everyone will be.

I’m not entirely sure what all of the concerns with downstream are anymore, but specifically where Debian is concerned: I’ve been hanging out in their IRC and mailing list for many years now, and have advocated for and gotten changes into the way things are done with barry, tumbleweed, and doko. For at least some of the patches (at least as they relate to the packaging toolchain) I’ve given a tacit “well, it’s better than it was” approval, so to some extent some of the issues can probably be traced back to me OKing something that turned out not to be a great idea [2].

  1. I’m aware of the gist created by Christian, but it’s now 3 years old and I know there’s been movement on at least one of these things with the creation of a python-full in Debian… but I also don’t think Debian is alone here. For instance, sometimes you need specific point releases, but afaict Fedora doesn’t offer an option to install a specific X.Y.Z release, or it may require root/sudo privileges, etc. ↩︎

  2. Of course most of these decisions were compromises between Debian Policy, Upstream, and people’s available time to work on solutions. ↩︎


This would be the exact reason I recently switched over to the Windows Store Python as my only non-WSL Windows Python install. It wasn’t 100% smooth (the terminal shortcuts didn’t activate automatically), but the automatic updates outweighed any other concerns I might have had :slight_smile:


One particular thing I worry about is users nuking their Linux systems by downloading and installing Python and then sudo pip installing stuff into it.

Whatever the installer does, it must remain “isolated” from the distro Python, preferably installing into /opt, ~/.local, or similar.
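One low-tech mitigation already exists for the sudo pip worry: never `sudo pip install` into any interpreter, and use a per-project virtual environment instead. A sketch (the environment name and path are made up; this works the same for any interpreter, distro-provided or not):

```shell
# Create an isolated environment owned by the user, not the system:
python3 -m venv "$HOME/venvs/demo"

# Installing through the venv's own interpreter can never touch the
# distro's site-packages, so no sudo is ever needed:
"$HOME/venvs/demo/bin/python" -m pip --version
```

An installer that created such an environment by default (or at least refused to run pip as root) would close off most of the foot-guns here.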


Yes, this sounds interesting, this is something I would likely use.

I do not understand this comment. Why would it be any different (worse?) than the current state of things?
[Also, maybe it is time to finally get rid of the pip, pipX, pipX.Y scripts, and allow pythonX.Y -m pip only, but that is a different discussion.]

Yes to this. Not to be underestimated. If something gets done, I would really like to see it integrate with existing package managers. I guess it is good that the packages in apt, dnf, pacman, etc. are under the distros’ control (although the situation on Debian is not great, and I am not sure I can rely on deadsnakes). But maybe work can be done to provide (or take ownership of) the Python packages in things like Flathub, Snapcraft, AppImage (?), Homebrew (?), others (?). Maybe that is a next step after this one.