I’d argue that this applies to open source projects, too–many important projects are one burned-out maintainer away from shutting down.
One long-lasting impact of uv is that it moved the packaging discussion from “what should pip do” to “what should installers do”. There were other projects in the space, but none as impactful as uv. I think that was valuable even if uv doesn’t continue to develop[1].
I view Astral’s contribution similarly to how I think of the faster-cpython project[2] or Meta devoting resources to the nogil work. A corporation decided it was worth it to invest in the Python ecosystem–not entirely altruistically, but not entirely to advance their own goals. That’s fine. We don’t have to get mad about it when that inevitably ends.
I see a lot of posts here worried particularly about uv as a foundational Python packaging tool. I think it’s worth taking a step back, looking at what uv offers that people are worried about losing, and considering investing directly in existing tooling to help it catch up:
Speed installing packages: This is significantly driven by uv’s cache mechanism, where packages are installed to a cache and then linked into a venv. That could be added to pip, or to the Python package manager of your choice.
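The cache-and-link approach can be sketched with the stdlib alone. This is a minimal illustration of the idea only (the function name and layout are invented for this example, not uv’s actual implementation): link each cached file into the venv when the filesystem allows it, and fall back to copying:

```python
import os
import shutil
from pathlib import Path

def link_tree(cache_dir: Path, site_packages: Path) -> None:
    """Link every cached file into site-packages, copying when linking fails.

    Hard links share the underlying data, so N venvs using the same
    package cost roughly one copy of it on disk.
    """
    for src in cache_dir.rglob("*"):
        if src.is_dir():
            continue
        dest = site_packages / src.relative_to(cache_dir)
        dest.parent.mkdir(parents=True, exist_ok=True)
        try:
            os.link(src, dest)       # same filesystem: near-instant, no copy
        except OSError:
            shutil.copy2(src, dest)  # cross-device or unsupported: fall back
```

A real installer does far more (RECORD files, entry-point scripts, byte-compilation), but the speed win of the linking step itself is this simple.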
Fast metadata/version handling: We’ve been making significant progress in packaging to speed up metadata handling. 26.0 significantly improved things, and since then Henry added automatic benchmarking and we’ve made even more progress on performance. Before 26.2 I plan to land a new implementation of filtering Versions which, for complex specifiers, should be more than 10x faster. And there’s a lot more that could be done: an option for a global Version cache, native code implementations, having the standard library provide very stable parts of Version parsing. PRs welcome. This benefits all tools that use packaging.
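To illustrate the two ideas mentioned here (a global Version cache and fast specifier filtering), here is a deliberately simplified sketch in pure Python. It uses a toy X.Y.Z version model, not packaging’s real PEP 440 parser, and every name in it is invented for the example:

```python
from functools import lru_cache

@lru_cache(maxsize=None)            # the "global Version cache" idea:
def parse_version(v: str) -> tuple:  # each distinct string is parsed once
    # Toy model: release segments only, no pre/post/dev releases.
    return tuple(int(part) for part in v.split("."))

def filter_versions(candidates, lower: str, upper: str):
    """Keep candidates with lower <= v < upper, comparing parsed tuples."""
    lo, hi = parse_version(lower), parse_version(upper)
    return [v for v in candidates if lo <= parse_version(v) < hi]

print(filter_versions(["1.0", "1.4.2", "2.0", "2.1"], "1.2", "2.1"))
# → ['1.4.2', '2.0']
```

The real work in packaging is of course much subtler (PEP 440 has epochs, pre-releases, local versions), but caching parsed objects and comparing precomputed keys is the shape of the speed-up.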
More advanced resolver algorithm: I’m working directly on this right now to help pip catch up. I’m hoping to integrate it directly into resolvelib, but it’s possible we will need a new resolver library. I am working on speed, but there is also the aspect of “universal” resolution, which is what may require a new library.
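For readers unfamiliar with what a resolver actually does, here is a toy backtracking resolver in pure Python. It only illustrates the concept (the index data and function are invented for this example; this is not resolvelib’s API or pip’s implementation): try the newest allowed version of each requirement, and undo choices when they lead to a conflict:

```python
# Toy index: package -> versions (newest first) -> dependencies, where a
# dependency is a (name, allowed_versions) pair. Purely illustrative data.
INDEX = {
    "app":  {"1.0": [("lib", {"1.0", "2.0"}), ("util", {"1.0"})]},
    "lib":  {"2.0": [("util", {"2.0"})], "1.0": [("util", {"1.0"})]},
    "util": {"2.0": [], "1.0": []},
}

def resolve(requirements, pinned=None):
    """Backtracking resolution: newest versions first, undo on conflict."""
    pinned = dict(pinned or {})
    if not requirements:
        return pinned
    (name, allowed), rest = requirements[0], requirements[1:]
    if name in pinned:                   # already decided: must agree
        return resolve(rest, pinned) if pinned[name] in allowed else None
    for version in INDEX[name]:          # newest-first preference
        if version not in allowed:
            continue
        trial = {**pinned, name: version}
        deps = list(INDEX[name][version])
        result = resolve(deps + rest, trial)
        if result is not None:
            return result                # first consistent assignment wins
    return None                          # dead end: caller backtracks

print(resolve([("app", {"1.0"})]))
# → {'app': '1.0', 'lib': '1.0', 'util': '1.0'}
```

Note that lib 2.0 is tried first, found to conflict over util, and abandoned: that backtracking step is what makes real-world resolvers hard to make fast.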
Lock files: We have a standard lock file format that pip can generate; the next step is being able to install from them, and work is progressing there also.
Syncing: pip-tools offers this, but I think pip should have native support, especially with lock files. An issue is open, and I would review a PR or eventually add it myself.
Workflows: This is already provided by many tools for different use cases, such as Poetry, Hatch, PDM, etc.
All in one tool: Some of the workflow tools are already all in one, but nothing stops someone from wrapping the libraries they think are best into a single CLI.
I’m not suggesting people stop using uv or investing in it. I’m just not so sure it’s the only tool that offers most of those conveniences, or that we couldn’t fairly quickly add those conveniences.
Probably an irrelevant detail but this was true long before uv showed up. To me lately, it rather felt like the trend was going more and more towards “what should uv do” instead of “what should installers do”. I hope this will be an incentive for people to try out Hatch, PDM, Poetry, and others.
Most of all I wonder what is going to happen to python-build-standalone[1]. Lately it has felt like a bunch of tools and systems were considering adopting python-build-standalone. I wonder how this will turn out.
this is still a stretch, but it could be a better comparison to the Bun situation ↩︎
IMO this is very true. There were always other installers like PDM and Poetry, but the narrative never really moved beyond pip as “the installer” until uv started gaining traction. I do hope we can retain that change in perspective.
Also, another lasting impact of uv is the clear demonstration of what well-funded development can bring to the packaging ecosystem. Whatever the results of the acquisition, uv has made huge progress in an amount of time that previously we would have considered impossible. If we could use this experience to motivate future investment in packaging tools, that would be another big win. Again, regardless of what the future holds for uv itself.
I think there’s work going on to bring python-build-standalone under the “core Python” umbrella somehow. Initially, I think it was motivated by the fact that it was a one-person project and important enough that we wanted to reduce the risk associated with that. I don’t know if astral becoming involved took some of the pressure off the core to be directly involved, but I hope that post-acquisition it becomes a goal to get python-build-standalone hosted on python.org somehow. I believe it would be a real problem to have a key[1] binary distribution owned by a single company - especially one with the sort of public perception that OpenAI has.
It’s used by a lot of packaging tools, I believe ↩︎
And I’m continuing to work on standards and such so that a baseline workflow can be based on standards. Add on the perf work in pip that Damian covers in OpenAI to acquire Astral - #23 by notatallshaw and I’m hopeful uv usage will feel like a choice and not a requirement to anyone.
There is and I’m effectively leading it. I kicked it off in October last year after the core dev sprints.
No, and in fact Astral has always supported the idea of upstreaming things so PBS doesn’t have to carry patches. Unfortunately they have been so busy since I started the work that they haven’t had time to push anything upstream lately. But I’m planning to push on regardless.
You missed the single best part of uv (for me, but also for others in my circle): the fact that uv does not run on Python, so it can be used to bootstrap and manage Python. (Then add all the other features you mentioned.)
Agreed, although like many of the other points @notatallshaw noted, it’s not exclusive to uv (or even to the fact that uv is not written in Python). We could certainly implement a tool like pip using an embedded Python interpreter so that it would look like a standalone executable, uv-style. It’s just that existing tools haven’t taken that approach.
Changing pip (for example) to be a standalone executable using an embedded Python interpreter and acting on the user’s Python environment without being installed into it would be a substantial change in architecture, probably big enough to need funded resource to achieve it. But it’s certainly not impossible, if the lesson from uv is that it’s a better approach. And of course, new tools don’t have the constraints that existing ones do, so could take that approach much more easily.
IMO, the packaging community should be investing a lot more effort into improving things around building and deploying standalone applications, including this sort of bundling. But I’ve said that before, and I don’t think it’s worth having that debate here. Suffice it to say that it’s certainly possible[1].
The new Python manager for Windows uses exactly this sort of approach, for example. ↩︎
Given Python runs in more places than uv, I think that if you put in enough time and effort to build a good installer it could be better than uv. E.g. have an installer set up like uv’s, curl ... | sh, which installs its own private copy of Python, and then you use that to run your Python packaging tool.
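As a very rough sketch of what such a bootstrap script’s skeleton might look like (every name, URL and path below is a placeholder invented for the example; the script only prints the steps rather than performing them):

```shell
# Hypothetical uv-style bootstrap flow: fetch a standalone CPython build
# into a private prefix, then run the packaging tool with it.
set -eu

TOOL_HOME="${TOOL_HOME:-$HOME/.mytool}"

plan() {
    # A real script would curl a python-build-standalone archive and
    # unpack it; here we only print the intended steps.
    echo "fetch  https://example.invalid/cpython-standalone.tar.gz"
    echo "unpack $TOOL_HOME/python"
    echo "run    $TOOL_HOME/python/bin/python3 -m mytool"
}

plan
```

The hard part, as the replies below note, is producing that relocatable standalone CPython archive in the first place, not the three shell steps around it.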
If it were easy to “just” install a private copy of Python off the shelf…
python-build-standalone (currently maintained by Astral since uv relies on it) is a non-trivial amount of work, if it were 3 shell calls and that’s it then we’d be good.
Now… maybe CPython needs to get to a level where python-build-standalone basically doesn’t need to exist. It really isn’t just a question of running existing installers. I think it would be nice if existing installers could absorb as much of this work as possible, though!
Ty for sharing that! The acquisition is clear now: they want a uv (pyx) with GPU steroids, and that’s really cool for AI.
This is really good news! But uv isn’t just a faster and less disk-hungry pip (since it uses hard links or soft links when possible). As @jamestwebber and @pf_moore pointed out, it’s a better installer package manager. And I can add that it’s not only a better installer package manager: with uv you can install different versions of Python, use uv tool, uvx and uv venv. It’s one ring to rule them all, and it’s officially supported by PyCharm.
I love pip and pipx. And at first I just used uv as a faster pip. But now I’m using it for all my new projects, because it’s not simply faster than venv + pip + pipx + poetry + *args, **kwargs: it’s much simpler.
A real game-changer would be something like Maven, which doesn’t need venvs. Venvs are a really difficult beast, and for newbies a big entry barrier that they avoid (as I did) until nothing works anymore. But that would probably be a huge change for the whole Python ecosystem.
Indeed, IMHO it’s premature FUD. But maybe this fear can push people to improve the currently available tools. And that’s good!
(a) is satisfied by a lot of people. Many of them have posted in this thread.
(b) is not strictly necessary, even if it would be better.
(c) why? History is full of people who left a company and created a similar or identical project. See MariaDB, LibreOffice, or the X programmers who moved to Meta. But I’m not a lawyer, and this kind of thing gives me a headache X-D
Hang on, where did I say that? You’re misinterpreting my comments here.
The uv pip command is basically pip, but with improvements that could be made because uv had a funded development team working full time on it. That’s “better” in a certain sense, but not in any insurmountable way. And similarly, I believe the rest of the uv interface is what other tools like PDM, Poetry and Hatch could achieve[1] with funding and more development resource.
Money doesn’t solve everything, but it helps a lot with some things
If they wanted to - workflow tools are notoriously opinionated and there’s no “one size fits all”. ↩︎
To maintain an open-source project that is 98.3% Rust (according to GitHub), you absolutely need fluent Rust developers. Unless you consider writing a new README and then more-or-less-blindly cherry-picking all commits from upstream to be “maintaining a project”.
“fluency” is the minimum bar for working on unsafe Rust.
Otherwise, the Rust compiler is good enough that it’s possible for generally experienced coders who know other systems languages to maintain a Rust project in a usable enough state that users’ pipelines still work, assuming the maintainers learn more Rust as they go.
I haven’t checked yet which category uv falls under (not feeling any urgency in the slightest to work on a fork yet).
Please don’t reframe my post as dismissive on the amount of work required, I never "just"ed the difficulty, I wrote “if you put in enough time and effort”. I’m well aware of the difficulty.
I agree. Astral identified it as a key dependency and took over maintainership, and they’ve been a good open source community participant, upstreaming many of the patches. I do think the PSF/Steering Council/Core Developers should evaluate producing their own version, if only for the bugfix releases.
That’s not even true: if you use uv for any period of time you’ll quickly find that uv is more disk-hungry than pip. All those old, fully expanded installs of packages in uv’s cache that you no longer need eat a lot more disk space than pip’s wheel cache.
I’m trying to be constructive and identify what uv does better than other Python packaging installers and what can be done to close the gap. I wish you would do the same; it isn’t helpful to just say “better”.
I will say, though, one thing Charlie has repeated often: being faster isn’t just quantitatively better. There comes a point where a tool is so much faster that you can use it in a lot more places, and it becomes a qualitative improvement. I see no reason other tools can’t learn lessons from uv here.
As far as I’m aware all of those features are supported in subset or whole by hatch, poetry, pipx, and PDM. If you think there’s something missing, please be specific.
Again that’s vague and unhelpful, I also use uv a lot, and I’m able to point to exactly why, and work on those in other tools.
Sorry, I didn’t just misinterpret your post: I also confused installer (pip) with package manager (Poetry etc.). I will correct that in the original post.
I don’t think it’s impossible to be on par with uv as an installer. On the contrary, I think it’s inevitable.
But uv pip is not exactly equal to pip. I noticed that they diverge in the way they manage package conflicts. I remember I had a problem installing from a requirements.txt with pip, while uv had no problems. And indeed, searching their site, I found this doc.
Furthermore, uv pip is mainly for compatibility, and to lure people like me who have used pip for dozens of years into using it “the pip way”. The “uv way” is to create a project and do uv add. Is it simply uv pip + writing to the toml? Dunno.
pip can be improved, and I suppose the difference between uv pip and pip will become irrelevant. But pip is an installer. It’s not poetry, pipx, venv and winget/flatpak. uv is all of that and more. And uv delivered a lot of improvements and features quickly, in a low-level programming language.
Honestly, I don’t know very much about the other package managers. I was “old school”: pip + venv + setup.py. I worked a little with Poetry because of work, but it didn’t really impress me. I preferred to continue with the old way, until I discovered that uv isn’t just a faster pip. I doubt that Poetry will change its interface to make it simpler and do all that uv does.
And that’s good! I prefer the Unix philosophy: one tool that does one thing very well. But the fact is that in 2026 no one downloads the source code of cat, grep etc., compiles them and installs them. They download Ubuntu and they get the whole zoo.
The current tools can be improved, and I will be really happy to see them improve. But will I go back to using them? Honestly, I don’t know. Let’s see what happens.
Of course, if you want to fork uv. But pipx and venv can be improved without using Rust right now. Now is better than never.
I would be happy to see more people using Rust. I would be happy to see Rust in CPython code too! AFAIK, Rust is now also used in the Linux kernel, and Linus would like more people to use it.
But honestly, is it really necessary to write all the code in Rust? Usually you write in Python and then rewrite the bottlenecks. And you have a lot of options beyond just Rust.
You can do `uv cache prune`. I don’t know if pip can do that. If not, it would be a great addition!
AFAIK, uv doesn’t copy the deps into the venv: it caches them and uses hard links when possible. This saves an incredible amount of space when you have many projects that use the same packages, particularly when the packages are big, like SciPy, TensorFlow etc. I don’t know if pip has plans for using hard/sym links when possible.
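The sharing is easy to verify with the stdlib: two hard-linked paths point at the same inode, so the data is stored once no matter how many venvs link to it. A small self-contained demo (the file names are throwaway, nothing here is uv-specific):

```python
import os
import tempfile

# Two directory entries pointing at the same inode occupy the data once.
with tempfile.TemporaryDirectory() as tmp:
    cached = os.path.join(tmp, "cache-copy.py")
    linked = os.path.join(tmp, "venv-copy.py")
    with open(cached, "w") as f:
        f.write("X = 1\n" * 1000)
    os.link(cached, linked)              # hard link, not a second copy

    same_inode = os.stat(cached).st_ino == os.stat(linked).st_ino
    print(same_inode)                    # True: one blob, two names
    print(os.stat(cached).st_nlink)      # 2: the inode now has two links
```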
About the rest, I hope I clarified my thoughts. No one is saying pip is a bad tool. On the contrary! I blessed God when pip came out. Installing Python packages before pip was a pain.
Side note: I’m just giving my impressions as an end user. No one wants to dismiss dozens of years of hard work making tools that helped a lot of people like me, made by people who got no money back and sometimes maybe not even a thank you. That’s the FOSS world, baby :'-) I know this feeling.
That’s the point. You’re an expert in this field. I’m only the end user.
And as an end user, I used pip, venv, pipx and Poetry. I don’t know them extensively, just as I don’t know uv extensively either. I know only what matters for my work.
My workflow before was: create a venv with python -m venv venv, activate it, pip install something, use pip freeze > requirements.txt. If I needed a tool, I used pipx. If I had to update pip, I used pythonX.Y -m pip install -U pip. And that was needed, since this way I got rid of the update message.
If I need another python version, well… I compiled it X-D
It’s “simple” on Ubuntu: you “just” have to do some apt install first, clone CPython, check out the right tag, create a separate build dir and do the usual:
```shell
CC=gccX.Y ../cpython/configure --enable-optimizations --with-lto
make -j
make altinstall
```
Now? Now I do:
```shell
uv init something
cd something
uv add somedep
```
And it’s done.
If I need a tool, I use uv tool. If I want to update uv, when I remember to do that, I do uv self update. If I need to lock the deps, uv lock.
If I need pythonX.Y… well, since I know how to compile it, I continue to compile it X-D but sometimes I’m lazy and I install it using uv.
You say I can do it with Poetry, Hatch and PDM. I know only Poetry, and only a little, and honestly? I don’t feel the need to use them. I already have everything I’ve ever needed with just uv.
I suppose the problem isn’t just to make pip, pipx, venv and PUT_THE_NAME_OF_YOUR_PROJECT_MANAGER_HERE. The problem is having them all in a simple and coherent way. Python was a winner also because of its “batteries included”. I suppose the Python ecosystem needs something similar for packages and project management, and something “beyond venvs”.
A number of differences come down to the fact that uv makes assumptions about package metadata that aren’t supported by standards. That means that they can give a better result on “most” packages, but break on valid, but unusual[1] cases. We can debate which approach is “better”, but that’s not the point.
Yes, you can do pip cache purge. Of course, if you purge the cache, you lose the performance benefits the cache gave you. So I don’t see the point you’re making.
Which are fine. But to bring things back on topic, it’s not about whether uv is better or worse than pip or PDM or whatever. The original concern was that if we lose uv[2], we’ll have lost something essential. And that simply isn’t true - it’s perfectly possible to incorporate everything uv achieved into existing tools, assuming that (a) we want to, and (b) development resource can be found.
The one genuine problem here is that because uv is implemented in Rust rather than Python, we can’t directly reuse code or libraries from uv in other tools in the ecosystem. Instead, we’ll have to reimplement the functionality in Python first. But that’s not a problem with the acquisition, it’s something that was inherent in the choice to implement uv in Rust in the first place.