This topic came up at the packaging summit organized this year. We publish installers for macOS and Windows, yet no such binaries are published for the Linux operating systems. Instead we rely on the operating system distributions to repackage Python themselves, which means we give them the opportunity to change the interpreter in ways that introduce unexpected behavior for end users, such as all the custom patching that Debian does, which pretty much breaks all the tooling we have for packaging.
Because of this, it has become something of an open secret among more advanced and experienced users that instead of installing Python via the operating system package manager on Linux, they should use tools such as pyenv or asdf. This, however, is documented nowhere on python.org and creates behavior that differs from the other operating systems. I’ve opened this thread to start a discussion on whether something like this would be possible and to iron out any major blockers to achieving it.
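For context, this is roughly what that looks like with pyenv today (the version number is just an example, and a similar flow exists for asdf):

$ curl -fsSL https://pyenv.run | bash    # installs pyenv into ~/.pyenv (plus the shell init lines it prints)
$ pyenv install 3.12.3                   # builds that CPython release from source
$ pyenv global 3.12.3                    # makes it the default "python" for this user
$ python --version
Python 3.12.3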
The majority of users at the summit expressed strong interest in providing a distribution-independent installer available on python.org. While this might not be the easiest thing to do, I think it would provide real value, especially for people just starting to use Python.
In your opinion and experience, are other Linux distributions like Alpine, ArchLinux, Fedora (including CentOS, RHEL, and derivatives), Gentoo, and SUSE also affected? Or is this a problem on Debian and Ubuntu-based distributions only? I documented known issues with Debian/Ubuntu in 2021. Are these problems still unsolved despite several meetings between the Steering Council and Debian maintainers?
It would be unfair to lump all distros together when only one family of Linux distros is causing issues. People like Michał Górny and Miro Hrončok pour their hearts into excellent Python support. The Python community should support and promote their hard work as well as “good citizen” distributions.
In my personal experience, Python from the operating system works great and is superior in performance and security compared to a self-compiled Python interpreter – as long as the OS is not Debian-based. Disclaimer: I work for Red Hat and do most of my work on Fedora, CentOS, and RHEL with occasional testing on Alpine, Gentoo, and Ubuntu.
I do not use any Debian-based operating system, so I cannot comment on this. I am just a proverbial messenger here.
I forgot to mention that people brought up the rustup example to show that it is considered normal to install programming languages outside of the Linux distribution’s package manager.
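For comparison, rustup is installed with a single command from the official site and manages toolchains entirely under the user’s home directory, independent of the distro:

$ curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh    # official rustup installer, no root needed
$ rustup toolchain install stable                                   # toolchains live under ~/.rustup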
In the past I have used Arch Linux, and one problem I did notice is that there are multiple versions available, so as a new user it can be a bit confusing which variant you should install.
I will say that there are two major downsides to a Linux OS-provided Python installation: availability is often slightly delayed (compared to other platforms, where you can easily install a new release on day zero), or a version is not available at all because of the LTS nature of these distributions. Hence why, on Debian, the deadsnakes PPA exists. So having an installer you could download from python.org would allow users to sidestep these limitations.
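For reference, this is roughly how newer interpreters are typically obtained from that PPA on Ubuntu (package names are illustrative):

$ sudo add-apt-repository ppa:deadsnakes/ppa
$ sudo apt update
$ sudo apt install python3.12 python3.12-venv    # newer interpreter alongside the system one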
We have manylinux for distributing platform wheels that work on most Linux systems; it makes sense to me to use the same environment to distribute Python itself in a way that works on most Linux systems.
You are describing the state of Python packaging on Debian/Ubuntu, not Linux distros in general.
Fedora has CPython 2.7, 3.6-3.13, MicroPython, and PyPy 2.7, 3.9, and 3.10 in the core distribution. You don’t have to deal with 3rd party repositories like Deadsnakes PPA. New releases are built, packaged, and shipped to users within a day. dnf install tox installs them all, so you can run tox tests with a multitude of Python versions.
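On Fedora that workflow looks roughly like this (the tox environment names are just examples):

$ sudo dnf install tox          # pulls the alternative interpreters in as dependencies
$ tox -e py39,py311,py312       # run the test suite against several CPython versions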
Gentoo does Python packaging similarly and comes with multiple recent Python versions.
CentOS / RHEL regularly updates Python and even adds new Python versions. The Python versions are not cutting edge, but they come with security backports and bug fixes. CentOS and RHEL are Enterprise Linux distributions, which aim to find the right balance between stability and updates.
|               | 3.6    | 3.8 | 3.9    | 3.10 | 3.11 | 3.12   |
|---------------|--------|-----|--------|------|------|--------|
| RHEL 8.9      | system | yes | yes    | -    | yes  | -      |
| RHEL 8.10     | system | yes | yes    | -    | yes  | yes    |
| RHEL 9.2      | -      | -   | system | -    | yes  | -      |
| RHEL 9.4      | -      | -   | system | -    | yes  | yes    |
| RHEL 10.0 dev | -      | -   | -      | -    | -    | system |
| CentOS c8s    | system | yes | yes    | -    | yes  | yes    |
| CentOS c9s    | -      | -   | system | -    | yes  | yes    |
| CentOS c10s   | -      | -   | -      | -    | -    | system |
By the way, does deadsnakes PPA still target Debian? deadsnakes · GitHub looks like Ubuntu-only.
I use pyenv on both macOS and Linux so I don’t think I would use Linux installers if they were published as well. The reason is that I don’t just want to install Python but also to have some way of managing multiple Python installations, controlling which Python installation gets used and managing environments that are created with them.
What I would rather have is a cross-platform system for installing Python binaries that is all joined up with managing environments and installing Python packages into them so that installing Python is as easy as installing something from PyPI. I assume that uv is going to become that and then it would make sense for python.org to host whatever binary formats (like pybi?) would be used.
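As a rough sketch of the kind of joined-up workflow I mean, here it is using uv’s current interface (the exact commands and version numbers are assumptions on my part, not anything python.org provides today):

$ uv python install 3.12        # fetch a prebuilt standalone CPython
$ uv venv --python 3.12         # create a virtual environment using it
$ uv pip install requests       # install packages into that environment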
Just because it’s not a universal problem doesn’t mean it’s not a problem. By the way, I ran into this limitation with Red Hat at my company in the past too, but we worked around the problem by building our own Python interpreter. Having an installer that one can freely download and install on any distribution would be a win, even if it perhaps wouldn’t be needed everywhere, or users shouldn’t be using it all the time. It would also allow users to test interpreters well in advance of them being available from the operating system, which would be great, for example, for CI and alpha/beta/rc testing.
The state of Python on Fedora is by far the best, but the same can’t be said in general about other distributions, which is the problem itself.
I concur with you: the problem exists and has been a major pain point for both new and experienced users. I’m trying to make the point that not every Linux user is impacted in the same way. The Python community has limited resources, and Linux packaging is more complicated than macOS or Windows packaging. A specialized solution may make your lives a lot easier, especially with regard to security updates, OpenSSL linking, and the system trust store.
What will it be?
A Linux installer for all distros and major architectures?
A Linux installer for glibc-based, manylinux-compatible distros on the most common architectures (x86_64 and aarch64)?
python.org deb packages for Debian Stable and Ubuntu LTS?
I concur that distributing Python applications on Linux (and other Unix systems) is not easy due to the many different ways distros tend to change their Python installations in non-standard ways. This was the main reason why my company started the PyRun project: we wanted to ship a product on Linux and needed an easy-to-support way to do this.
However, you are talking about an installer. Given that there are plenty of ways you can install applications on Linux and other Unix systems, I think the goal needs to be spelled out in a more concrete way.
Normally, an installer refers to an application which you run on the target system to deploy an application on that system. It’s not completely uncommon to use such an approach on Unix platforms (many commercial applications are deployed this way), but most of these use a different approach which requires a package manager and a package download of some kind for the installation - not a separate installer per application.
Could you describe what the packaging summit folks had in mind when referring to an “installer”? Thanks.
This was mentioned on some of the various packaging “strategy” threads and I think it’s a good idea. Personally I think that some of the reasons we would want such a thing on Linux also apply to Windows (and probably MacOS, although I have almost no experience with that). Basically what is wanted on Linux is isolation from the host OS, but why not also include isolation of multiple Python versions, etc.?
In earlier discussions the main sticking point seemed to be that existing solutions exist but are not developed/maintained/controlled by the Python organization, and there appears to be reluctance to just having python.org tell people “use something other than what we provide here”. My opinion remains basically the same…
Isn’t this solved pretty well by the python Docker images? (Which are “official” in the Docker sense – it’s part of Docker’s set of official images – not in the sense that it’s published by the PSF.)
That gives you a consistent version of Python on any Linux distribution, with all the dependencies neatly packaged up and no installation required.
You can get Python versions as old as 2.7.9 or as new as 3.13 beta 1:
$ podman run --rm -it docker.io/library/python:2.7.9-slim
Python 2.7.9 (default, May 20 2015, 08:30:30)
[GCC 4.9.2] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> print "hello" <> u"hello"
False
$ podman run --rm -it docker.io/library/python:3.13.0b1-slim
Python 3.13.0b1 (main, May 14 2024, 06:52:27) [GCC 12.2.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> from base64 import z85encode
>>> z85encode(b"hello")
b'xK#0@zV'
Using Docker is an order of magnitude worse than native. The developer experience is nowhere near as seamless. Docker is a solution, but one that comes at a steep price in UX and performance overhead for your operating system. So I don’t consider it a solution; it’s more of a workaround.
Bernát is right. Standard container images with Docker do not provide the same UX. They are useful for deployments and CI, but not for local development and local testing.
Toolbx containers are a great alternative, though. I have been using Toolbx a lot in the past few months. Toolbx provides almost seamless integration with the system including full access to /home, SSH agent, and virtually all other system resources. Basically only /usr, /lib*, /bin, and /sbin are replaced. Toolbx is also available for Arch and Ubuntu.
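For anyone who hasn’t tried it, the basic Toolbx workflow is just a couple of commands (the container name here is an example):

$ toolbox create py-dev                         # container that shares $HOME, the SSH agent, etc.
$ toolbox enter py-dev                          # interactive shell inside it
$ toolbox run --container py-dev python3 -V     # or run a one-off command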
At a previous job, we used to build Linux installers/tools on Ubuntu 10.04 (years after EOL of that version). The idea was if it worked on 10.04, it would work on just about any more recent version. If we built on today’s Ubuntu, it wouldn’t work in 10.04.
This was a pain at first since we each had to keep sketchy VMs to build in and had to use less modern C++ features, since gcc hadn’t been updated in a while… and the VMs barely worked since they were very EOL. I made the situation better by creating a build container based off 10.04 but adding in a recent gcc and other recent versions of dependencies.
Stuff built, and it worked on 10.04 and everything later that it needed to: including RHEL/CentOS/Debian and probably some other distros too.
Aside:
If the marketing team for that job reads this: learn to tell customers we don’t support ancient servers.
Anyways:
Not saying this is a great way to handle compatibility in Linux, but it worked.
I’d be in favor of a manylinux installer of some sort. Even static python executables could be cool.
I was also under the impression that most of Debian/Ubuntu’s more egregious packaging-related patching sins had been eliminated in Debian 11 (bullseye) and Ubuntu 22.04 (I could easily be wrong about that, though - like @tiran, I am a Fedora Linux user myself).
Either way, if anything was to change in this area, my personal preference would be that it be along the lines of talking to @indygreg about possible options for making portable CPython binary builds available as part of the upstream release process.
FYI all, Hatch has offered a Python installation feature since version 1.8.0 (Dec 2023), and it also sources binaries from that same python-build-standalone project.
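If I recall the interface correctly (treat the exact commands and version as an assumption), it looks something like this:

$ hatch python install 3.12     # downloads a python-build-standalone CPython
$ hatch python show             # lists installed and available interpreters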
On Debian, if a Python version is supported, that means all the modules are compiled for that version. If I install 3.12 on stable but nothing is compiled for it and I need to use pip anyway… I might as well compile Python too, since pip will start compiling things anyway.
Plus, of course, stable distributions do not want to change the Python version. Imagine how the users would feel when stuff stops working because some allegedly unused battery got removed from the standard library.
So it would need to not interfere with the system Python, or breakage would be very likely.
Compiling Python is trivial. Certainly easier than understanding all the mess with pip, poetry, conda, and whatever (xkcd: Python Environment).
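For the record, the from-source route that stays out of the system Python’s way is just a few commands (the version and install prefix are only examples):

$ wget https://www.python.org/ftp/python/3.12.3/Python-3.12.3.tar.xz
$ tar xf Python-3.12.3.tar.xz && cd Python-3.12.3
$ ./configure --prefix="$HOME/.local/python3.12"
$ make -j"$(nproc)"
$ make altinstall               # altinstall never touches the system "python3"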
So if an install isn’t going to use the packaged python modules, then what’s the point exactly?
I’ve personally never had an issue with how Python is packaged on Debian (e.g. I don’t need X11 to be pulled onto a headless machine). As far as I know, the main changes were the installation paths (in order to reduce breakage from sudo pip) and the splitting of python into multiple packages (in order to allow a more minimal install, which is what I’d expect to happen on a binary-based distro for all apps where reasonable); python3-full then just depends on everything (so it pulls in X11 and all the other possible dependencies that Python could use). See debian/patches · master · Python Interpreter / python3 · GitLab for the patches that are made to the Debian package (I don’t know where the equivalent for Ubuntu is).
I’m not sure Docker on Linux has a performance overhead (naturally if a VM is used there is overhead, but you never need a VM to run Linux containers on Linux). I wouldn’t say it has great UX (though, given the somewhat questionable packaging ability of some upstreams, using a Docker container can sometimes be easier than trying to use their “native” package…), but if you’re going to deploy into a container, running tests in the same environment (rather than whatever you’ve got set up on your system) is probably a safer bet, and bind mounts can make things quite seamless.
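As an illustration of the bind-mount point, running a project’s tests against the official image looks roughly like this (image tag and test command are just examples):

$ docker run --rm -it -v "$PWD":/src -w /src python:3.12-slim \
      sh -c "pip install -e '.[test]' && pytest"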
If you’re trying to develop a native app and deploy on Linux, I’m not sure how an installer is going to help either, given that either it’s open source (and so should use the packaging system provided by the distro), or you use Flatpak (which has its own base Python environment that you should use) to handle your dependencies and installation.
Sorry, haven’t been able to make it to the packaging summit in the last couple of years. If only it allowed online participation…
As always, please get in touch with me if you are running into issues with something fundamental in Debian’s Python stack. We want to provide something that works for our users and developers who use our distro. We don’t have the resources to completely revamp the world in the short term (a separate system python, etc.) but we can try to move things in the right direction, and obviously care about fixing bugs.
I’d be happy to help with design here, to build something that works alongside Debian’s packaged Python.
It may work if you pick the right releases, but I don’t think it’s tested to work. There is a similar repository for Debian, produced by an individual Debian Developer. But I wouldn’t expect a lot of support for it.
Yes, adding a supported Python version to a Debian system means all installed pure-Python modules will attempt to compile against it. For a new / ancient release, there are almost certainly going to be incompatibilities (not to mention missing compiled C extensions) that are not protected against in the packaging.
So, in practice, any kind of third party (Python.org) installer for Python on Debian would not be able to use the Debian provided Python modules/extensions.