I think that usually if micropip is used then it is used from JS code before starting the main Python "process". It is just also possible to invoke micropip from within Python. The default REPL tools that I have seen will invoke micropip as an import hook for packages that are in the pyodide distribution. That is what happens if you do e.g. import numpy in some of the websites listed above.
Ouch. As you say that's crazy, and it's not something that will ever be accepted into pip itself. What we might support is the sort of --target mechanism I described, but I don't know how useful that would be to you.
Nevertheless, thanks for all your explanations, I feel like I have a much better picture now.
This sounds good to me. What should the diff look like? Is the main change here to the installer section? Maybe like this:
Installers may choose whether or not to support Pyodide. pip is not planning to support Pyodide. Installers that support Pyodide should use the _emscripten_platforms() function shown above to determine which platforms are compatible with an Emscripten build of CPython.
These sentences:
We need a tag for pyodide. We need it because pyodide is a supported Python platform, and existing tags aren't sufficient because pyodide has its own binary format.
sound like additions to either Motivation or Rationale (or do you think what's currently in there should be removed or cut down?). And I feel like the remainder of what you have said is already in the other sections of the PEP.
How do these changes sound to you @pf_moore? Are there other specific changes you'd like to see?
To be honest, I don't know - this is all too vague in my head for me to make specific suggestions. It's unfortunately one of those "I'll know it when I see it" types of situation.
I still don't know what you mean by "support Pyodide". For me, that means "can run in a pyodide environment", and you've stated that's not possible for pip because subprocess and urllib support isn't there. There's also the fact that there are two different things that you could call "pyodide" in this context - the one that runs in node, which has more capabilities and a command line, and the one that runs in the browser, which is much more limited.
So I don't see it as "pip does or doesn't support pyodide" but rather "pyodide isn't an environment that can support pip".
Maybe? I don't think a 4-line statement is sufficient for a PEP, though. I was assuming you'd end up somewhere in the middle, with most of the "what is pyodide" discussion cut out, and only leaving the core that supports the basic statements I included in my 4 lines. But I don't have a feel for what that would look like.
Reading through this thread gives me an uneasy feeling. I just feel there are too many instances of "we have this hacky way of patching stuff" and "these things don't work" and so on, and it's unclear to me how being able to put emscripten wheels on PyPI would move things forward in that regard. Rather I keep wondering why those issues cannot be ironed out using a separate repository, and then worry about putting emscripten wheels on PyPI after that's been dealt with. That's not to say that the various rough edges are bad, just that they give me the impression that some of this is still a work in progress and it's not clear now is the time to add PyPI wheels into the equation.
More specifically, it's unclear to me whether, for instance, a wheel that is tagged with the Pyodide tag will be guaranteed to run in both browser and Node environments.
I should add that I haven't used Pyodide except for playing with a few toy demo things. I have used a couple of other Python-in-the-browser solutions (mostly Brython) and that experience has made me skeptical of the idea of plugging such runtimes directly into the normal packaging mechanism.
We are going to run into exactly the same set of problems for iOS and Android support. I don't think we want to block cellphone and web support because they can't run pip and pip has no support for cross-venvs. Particularly when uv will be able to support all three without trouble.
Good question. It's similar to the question of whether a wheel tagged py3-none-any works on Windows. Many do, but the os module does not behave in exactly the same way, and so if there isn't thorough testing on a platform there may be incompatibilities.
The platform tag is intended to say that the shared libraries contained in the wheel will all load in any Pyodide environment. Out of the ~260 supported packages, the following require special code paths or do not work at all in Node:
urllib3
httpx
pygame-ce
pyxel
matplotlib
For making synchronous requests, we need to stack switch in Node. Matplotlib can render to a file in Node but cannot make an interactive graph. And pygame-ce and pyxel rely on graphics capabilities that are missing in Node.
This sort of inference is unhelpful. We are trying to standardize packaging for Pyodide and not other Python-in-the-browser solutions because Pyodide is more compatible and more mature than those other solutions.
I would have the same reservations about those platforms. Basically my position is that it should be possible to get everything working "in parallel" to PyPI before there is a need to add a tag to facilitate uploading wheels to PyPI.
That makes sense, I suppose, although it seems like the level of risk may be greater for Pyodide.
My point isn't about maturity or compatibility. It's just that the web platform is so different from others that I tend to think anyone targeting it is always going to have to think in a more explicit way about the packages they are using, and indeed the entire ground-up design of their code, and any kind of "automatically install this using something like coarse-grained compatibility tags" approach is never going to be adequate and will lead to unpleasant surprises. I'd love to find out I'm wrong though.
Anyway, here's another way of coming at this, which actually may be helpful to me and others who are unclear on how this PEP fits into the bigger Pyodide picture. Suppose I have some existing pure-Python code (not written specifically for use in a web app) that uses some pure-Python dependencies like, say, pyparsing and maybe some spellcheck library like this one. Now I want to write some small web wrapper, my little <script type="text/python">, that somehow creates a web frontend for that. Can you sketch what the build process is for something like that and point out which parts are currently blocked on having the Emscripten tag, but would become nice and smooth with just the addition of the Emscripten tag?
Well, pyspellchecker is a pure Python project with no dependencies (and in particular no binary dependencies). So no part of that is blocked on this. But here's a demo webpage using pyspellchecker.
I agree, we are. And I think it's a general issue that should be solved for all three platforms, not something that we work around or patch over for each platform in turn.
My biggest problem isn't that the solutions Pyodide currently has are "hacky", but rather that they keep the platform separate from the mainstream of the packaging ecosystem. What we want to avoid here is a situation where "packaging on Pyodide" needs a different set of tools/workflows/knowledge than packaging on Linux, or Windows, or macOS. Same for "packaging on iOS" and "packaging on Android". If we let things go that way, we risk ending up with more conda-style splits in the packaging community.
I'm getting increasingly annoyed with this suggestion that "pip doesn't support what's needed and uv does". Both pip and uv are perfectly capable of supporting Pyodide (and I hope iOS and Android). Both tools will need changes to do so. The fact that pip will need people from the pyodide community helping to implement support is the only real difference, and if the pyodide approach to pip is to say "pip doesn't do cross-venvs, so we can't use it, we'll just hack our own fork" then we end up with a self-fulfilling prophecy - the people who need pip to support cross-venvs won't contribute, so pip doesn't get support, so the claim seems justified.
Indeed, that's what platform tags are for. A library with no native code should not need a platform tag. To that extent, the request for a new tag is uncontroversial - there's a new native code format that we want to support, it should get a tag.
That list includes urllib3, which is pure Python. If you're saying urllib3 needs changes, how will that be published as a "Pyodide wheel"? I would be completely against misusing the new platform tag for that. Doing so would cause major problems (if there's no appropriate pyodide tag for a given version, installers are required to fall back to the py3-none-any wheel, which is not going to work).
On the other hand, the urllib3 documentation says that emscripten is supported, so maybe the necessary code changes have been upstreamed, and there won't be a problem. But looking at httpx and matplotlib, that doesn't seem to be the case - there are discussions in pyodide-only forums, and I can't see any sign that changes are getting incorporated in the upstream packages.
That suggests to me that the pyodide community doesn't yet fully understand what being an integrated part of the wider packaging ecosystem would involve, and that's something that should be addressed before we start setting up infrastructure to support pyodide.
And yes, that same proviso applies to Android and iOS. At the moment, though, I have no idea what "packaging on those platforms" looks like - it's as unknown to me as pyodide was before this discussion.
You should also be trying to adapt Pyodide to (standard) packaging, and move away from custom, hand-crafted solutions…
I'll note that even micropip comes under this point. Jupyter has a %pip magic that installs packages into the running environment. Having a standardised way of doing this (even if it has risks and provisos) would be generally beneficial, so why not work with Jupyter to propose something? It might need to be in the stdlib, to work around the bootstrapping question, but that's not an insurmountable obstacle.
This is what I mean when I say work with the existing ecosystem, rather than building your own independent one.
My biggest problem isn't that the solutions Pyodide currently has are "hacky", but rather that they keep the platform separate from the mainstream of the packaging ecosystem.
I agree with this. I would like to see there be greater convergence in the future. But it's hard work. I think one thing that will help prevent permanent fragmentation is that the existing Pyodide-specific solutions are just not that good. I expect that if we do get the standard tools to work, Pyodide users will adopt them because the standard tools are much better. Though I think what we are doing with pip right now doesn't fragment the ecosystem too badly because we haven't replaced any of the logic of upstream pip, merely monkeypatched sysconfig to report about our target environment.
The fact that pip will need people from the pyodide community helping to implement support is the only real difference, and if the pyodide approach to pip is to say "pip doesn't do cross-venvs, so we can't use it, we'll just hack our own fork" then we end up with a self-fulfilling prophecy - the people who need pip to support cross-venvs won't contribute, so pip doesn't get support, so the claim seems justified.
Agreed. I think it will take a lot of effort for us to get out of this self-fulfilling prophecy though. pip is quite complicated and I don't understand it very well or talk to the maintainers very much. Pyodide is also quite complicated. And all the maintainers of both tools already have a whole lot on their plates.
As I said, @freakboy3742 and I are hoping to work together on trying to standardize cross-venvs and cross-builds. And as you said, that is going to be a lot of work.
You should also be trying to adapt Pyodide to (standard) packaging, and move away from custom, hand-crafted solutions…
Agreed, this is our goal. This is difficult because the packaging ecosystem is complicated, I don't understand it very well, much of it doesn't handle cross-builds and cross-venvs, and I have short-term goals of duct-taping together something working enough.
I'll note that even micropip comes under this point. Jupyter has a %pip magic that installs packages into the running environment. Having a standardised way of doing this (even if it has risks and provisos) would be generally beneficial, so why not work with Jupyter to propose something?
In Jupyterlite, %pip uses micropip. I could see the benefits of a standard here, but even making a simple standard requires a huge amount of effort and I personally doubt I would ever want that particular standard badly enough to work on it.
The intention here is to make pyodide just like the other platforms. If we have the platform tag and can upload the wheels to PyPI then for library maintainers pyodide is just another tag in the build matrix. You can build the wheels with e.g. cibuildwheel and upload them all with the same CI job.
Is the difference you are referring to just that the commands to create a cross-venv are different from creating a normal venv?
Note that the whole purpose of the cross-venv is that once created it works like a normal venv so switching between native CPython and emscripten CPython is much like switching to a different Python version or to PyPy or to the free-threaded build etc. Perhaps in future tools like uv can get you the Pyodide python directly with uv python install 3.13_pyodide_2025_0 or something and perhaps actions/setup-python etc would be able to do the same. For now the cross-venv needs the native pip so the simplest thing is to bootstrap from native CPython.
Before the cross-venv existed you would have needed to write some JS code to launch pyodide and run it under node. The cross-venv solution along with the commands to make it is much better for Python developers but as I have repeatedly pointed out it needs to be better documented.
It's mostly the fact that cross-compiling is non-standard. And a lot of that is because there are not (yet) many places where you have two environments on the same machine. You can't cross-compile for Linux on Windows, or for Windows on Mac. So cross-compiling itself is the difference. And no one has been motivated to work on cross-compiling yet because all of the platforms that can use cross-compilation are niche. That's changing, and we should treat cross-compilation as a first-class use case.
Yes, that's hard. And it's unfortunate for pyodide that they are the first platform to have to deal with this. I'm encouraged that they are working with iOS to understand what works best for both platforms, but it's still a lot of work.
IMO, standards are premature when we don't know what the solutions are yet. A build tag is relatively uncontroversial, as I've said multiple times, but the implications are not.
For instance, is allowing emscripten wheels to be uploaded to PyPI that big of a benefit? PyPI requires that projects build and upload their own wheels - there's no mechanism for the pyodide project to build wheels for projects like matplotlib and upload them. And the matplotlib maintainers might not have the expertise or the interest in supporting pyodide themselves. A separate index avoids that problem (at the cost of creating others, conceded).
What exactly could be standardised and what benefit would it bring?
Bear in mind that for e.g. python-flint you first need to set up the emscripten toolchain and cross-compile GMP, MPFR and FLINT before you can even begin to build the wheel. As long as those things are out of scope for the standards, in practice building a wheel requires doing many things that are not standardised. The benefit that PEP 517 brought was migrating from setuptools to meson-python to have a better build system for the extension modules, but a lot more is involved than PEP 517 provides for.
When you go to build the wheel, the tricky parts are things like having the backend find the right compilers and libraries, and nothing about any of that is covered by PEP 517 except for the generic capability to pass non-standard options from pip through to the backend. Note that the pyodide build command is a PEP 517 build frontend. I think it mainly just sets up the emscripten compiler before invoking the backend, although perhaps it also monkeys with sys.platform and other things (maybe this part is what could be standardised?).
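To make that "frontend that sets things up, then hands off" structure concrete, here is a minimal sketch of a PEP 517-style frontend. The environment variables and the split between the non-standard setup step and the standard hook call are illustrative assumptions; the real pyodide build does considerably more.

```python
import importlib
import os


def build_wheel(backend_name, wheel_dir, config_settings=None):
    # 1. Non-standard part: point the toolchain at Emscripten before the
    #    backend runs (these variable names are illustrative assumptions;
    #    this is the piece PEP 517 says nothing about).
    os.environ.setdefault("CC", "emcc")
    os.environ.setdefault("CXX", "em++")
    # 2. Standard PEP 517 part: import the backend named in the project's
    #    [build-system] build-backend and call its build_wheel hook.
    backend = importlib.import_module(backend_name)
    return backend.build_wheel(wheel_dir, config_settings=config_settings)
```

Everything interesting for cross-builds happens in step 1, which is exactly why it is the part that might be worth standardising: step 2 is already interchangeable across frontends.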
It is part of the process of making it so that pyodide fits with the rest of the Python packaging ecosystem. To quote you from before:
Pyodide already has a separate distribution with its own curated wheels conda-style. The purpose of this PEP is to move from that over to standard Python packaging.
From my perspective I don't want to maintain pyodide and conda-forge packages/recipes on top of producing wheels for PyPI. Note that the fact that these recipes are outside of the project repo does not mean that the project maintainers don't end up maintaining them as well.
I also don't want to have to make my own index to store the pyodide wheels. It is easier if pyodide is just another platform in the CI build matrix with all the wheels being uploaded to the same place using the same tools and workflows.
How to describe target environments (see PEP 739 for part of that), how to get the layout (sysconfig paths) of a target environment without needing to run the target interpreter, things like that.
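As a toy illustration of the second point, one could imagine installers reading a static description of the target environment rather than executing its interpreter. The file format, keys, and paths below are invented for illustration (loosely in the spirit of PEP 739), not any existing standard:

```python
# A static description of a target environment, as might be shipped as a
# JSON file alongside a cross-build toolchain. Every key and value here
# is a made-up example.
TARGET = {
    "platform": "pyodide_2024_0_wasm32",
    "vars": {"base": "/opt/pyodide", "py_version_short": "3.12"},
    "paths": {"purelib": "{base}/lib/python{py_version_short}/site-packages"},
}


def target_path(name, target):
    # Expand a sysconfig-style path template using the target's own
    # variables, without ever running the target interpreter.
    return target["paths"][name].format(**target["vars"])
```

With something like this standardised, a frontend running on the host could compute where purelib lives in the Emscripten environment without needing Node or a browser at all.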
And my only concern about that is whether it's what we should be doing first, or whether it's better to get other parts of the process sorted out first.
But I'll repeat this yet again: I'm not against adding an emscripten tag. All of the debate here was triggered by me not knowing anything about pyodide, and asking questions - but it's arguably totally off-topic for this PEP.
We cross-compile on Windows all the time; the problem is there's no standard to inform a build backend of the target platform (and so we either rely on backend-specific environment variables, or on running the target Python under emulation so that the defaults are magically correct). x86 vs. x64 vs. ARM64 Windows all require different binaries, and hence it requires cross-compilation.
I'm pretty sure it's also possible to use MinGW to compile to ELF[1], and I've heard of people compiling for Windows using Wine on Linux. It's also possible to cross-compile for different Linux distros on Linux; I've done it (once).
But these don't involve details for frontends to worry about - it's backends that have to figure out how to do it. And if a project has used a backend that doesn't support it, then the answer is to file a bug with the project and ask them to switch.
What's missing is a way for the user to pip install --target <dir> --platform <platform> <package>. And maybe that's not even missing, it just isn't obvious enough? Though I don't think there's an interface for that platform request to be passed to the backend. Once that's in place, cross-build is entirely up to the backend, which means it can be solved immediately by those that already support it (setuptools, pymsbuild, scikit-build, and IIRC pdm and Maturin).
Defining a new platform tag is really only going to affect whether PyPI is able to add it. No other index needs to care about it - they should be allowing arbitrary platform tags, only PyPI tries to exclude those with inherent compatibility limitations.
(FWIW, this PEP in its current form looks the right "shape" to me. The only thing I'd like to see made more explicit is that the platform tag is intended for PyPI and packaging.tags - the current wording feels too passive to me.)
Which no doubt still requires a lot of help to actually run on Linux, but might be viable for an extension module with few dependencies. ↩︎
Also meson supports this via cross files. I suppose meson-python just needs a way to associate the platform request from the frontend through to a cross file.
Of course it would be nice to make the interfaces consistent, but it's not clear to me how much benefit you get from being able to run e.g. python -m build --platform=pyodide compared to pyodide build. You will still need a bunch of Pyodide-specific stuff to happen before that builds everything else, and other stuff afterwards to bundle everything up. If we're aiming for a world where you can make many wheels on a single platform by looping over --platform=$PLATFORM then that sounds great, but a load of other stuff needs to happen in that loop that I assume is not going to be included in any standards here.
Agreed, and I particularly agree that it's not unreasonable for users to directly invoke their build backend. But still, it's basically essential for -m build to support a platform argument if you want dependencies to also be built properly.
I would expect this to end up in cibuildwheel, which already can do all Windows architectures from a single invocation (possibly versions, too). This one doesn't really have to be standardised - practicality beats purity - but given the hacks involved it certainly wouldn't hurt if --platform would pass the information to the backend rather than cibuildwheel setting environment variables and hoping they're sufficient.
How is one to determine what version of Emscripten to use for a given pyodide_${YEAR}_${PATCH}_wasm32 tag? I didn't find anything in either PEP 783 or PEP 776 to figure that out. And I'm not looking for anything fancy here via code; it could be as simple as "this table in the PEP" or something. But as it stands (assuming I'm not missing anything), there isn't a way for me to know what Emscripten version to compile with to be compatible with a specific platform tag.
Using the year also limits you to only those Emscripten versions that happen to have a tag defined for them. So if I have a private Emscripten build of Python and want to build my own wheels for it, is there any guarantee that there will be a wheel tag I can use for that Emscripten version? And to be clear, I'm very much for coordinating around a tag for each Python version, but people do build stuff on their own for their own reasons and I think we should try to allow for that.
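To make the question concrete, what I have in mind is something as simple as a published lookup table from the ABI tag to a toolchain version. The table contents and function below are made up for illustration; the point is just that some normative document would need to own the mapping:

```python
import re

# Hypothetical table mapping a Pyodide ABI tag to the Emscripten version
# it was built with. These entries are illustrative placeholders, not
# values taken from either PEP.
EMSCRIPTEN_FOR_ABI = {
    ("2024", "0"): "3.1.58",
}


def emscripten_version_for(platform_tag):
    # Parse a pyodide_${YEAR}_${PATCH}_wasm32 tag and look up which
    # Emscripten version a wheel builder should compile with.
    m = re.fullmatch(r"pyodide_(\d+)_(\d+)_wasm32", platform_tag)
    if m is None:
        raise ValueError(f"not a Pyodide platform tag: {platform_tag!r}")
    return EMSCRIPTEN_FOR_ABI[(m.group(1), m.group(2))]
```

Without something filling the role of EMSCRIPTEN_FOR_ABI, a third-party builder has no way to pick a compatible toolchain for a given tag, which is the gap being pointed out above.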