bognition
10 hours ago
This shouldn’t be a surprise to anyone who has been using Python and has tried uv.
Python dependency management and environments have been a pain for 15 years. Poetry was nice but slow and sometimes difficult.
Uv is lightning fast and damn easy to use. It’s so functional and simple.
anitil
9 hours ago
For me the most convincing argument was that it took ~3 minutes to go from 'I wonder if I should give this thing a try' to 'oh it .... it worked!?'
tclancy
8 hours ago
Yeah, been doing this for over twenty years and finally got a chance to start playing with it a few months back and was confused at how I got that far that fast.
saghm
8 hours ago
As someone who also hasn't really used any of the past 8 years or so of Python dependency management, it's nice that it seems to support using arbitrary other tooling as well. At some point recently I wanted to run something that happened to use pdm, which I hadn't even heard of, but I was able to invoke it with `uv tool run pdm` and not have to learn anything about how to set it up manually.
om8
3 hours ago
FYI you can run just `uvx pdm`
pjmlp
19 minutes ago
I never got why.
I've used Python since version 1.6, mainly for OS scripting, because I'd rather use something with JIT/AOT in the box for application software.
Still, having a little setup script to change environment variables for PYTHONPATH, PATH and a few other things, always did the trick.
Never got to spend hours tracking down problems caused by the multiple solutions that are supposed to solve Python's problems.
ziml77
9 hours ago
It really is!
I switched to using uv just 2 weeks ago. Previously I had been dealing with maintaining a ton of batch jobs that used: global packages (yes, sudo pip install), manually managed virtualenvs, and docker containers.
uv beats all of them easily. Automatically handling the virtualenv means running a project that uses uv feels as easy as invoking the system Python.
Balinares
an hour ago
I just wish uv made it more straightforward to have arbitrary purpose-specific virtual environments, e.g. for building the package, for running the test suite, for dev tooling (PuDB), etc. That's one thing pixi does better, I think.
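For what it's worth, uv's dependency groups (PEP 735) get partway there: named requirement sets for tests, dev tooling, etc. in one pyproject.toml, though they all sync into the project's single .venv rather than into separate environments. A sketch, with group names and version pins that are purely illustrative:

```toml
# Hypothetical pyproject.toml fragment; group names and versions are
# illustrative, not from the thread.
[dependency-groups]
test = ["pytest>=8"]
dev = ["pudb>=2024.1"]
```

Then `uv sync --group test` installs the test group alongside the project, and `uv sync --only-group dev` installs just the dev tooling.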
hk1337
7 hours ago
It’s a little too fast, I’m having trouble believing it’s actually doing anything sometimes.
hyperbovine
7 hours ago
uv is so over-the-top fast compared to what we're used to that I would argue it's actually bad for the language. Suddenly it dawns on you that by far the most capable and performant package manager (and linter) (and code formatter) (and type checker) for Python is in fact not written in Python. Leaves an odd taste. Makes you wonder what else ought not be written in Python ... or why anything should be written in Python. Here be dragons ...
tyg13
7 hours ago
IMO, Python should only be used for what it was intended for: as a scripting language. I tend to use it as a kind of middle ground between shell scripting and compiled languages like Rust or C. It's a truly phenomenal language for gluing together random libraries and data formats, and whenever I have some one-off task where I need to request some data from some REST API, build a mapping from the response, categorize it, write the results as JSON, then push some result to another API -- I reach for Python.
But as soon as I have any suspicion that the task is going to perform any non-trivial computation, or when I notice the structure of the program starts to grow beyond a couple of files, that's when Python no longer feels suitable to the task.
rcleveng
6 hours ago
I'd rather write python than rust personally, and I also don't mind if anything performance critical is written in rust.
Letting each language do what it does best is really ideal, I'm glad that python has a great FFI, I wish golang did.
Speaking of golang, the TypeScript compiler is now written in it. Another case of using each language for its strengths.
slyall
6 hours ago
Well, Python isn't written in Python either.
ipnon
5 hours ago
And its killer use case today, artificial intelligence development, really uses it as a glue language for CUDA and similar GPU APIs.
arcanemachiner
7 hours ago
Try not to cut yourself while grinding that axe.
Python may not be the fastest language, but it's easy to learn, compilation times aren't an issue, you'll never have to fight the borrow checker, etc.
Every language has its warts.
ThibWeb
9 hours ago
for me the surprise is the pace? I'd expect people to be set enough in their tools that it would take longer than a few months for a new tool, no matter how good, to become the majority-use one. Though perhaps people adopt new tools more easily in CI, where install times matter more
perrygeo
9 hours ago
The pace of uv adoption is insanely fast. It's directly related to how bad the previous Python tools were/are. Even to seasoned veterans set in their ways - they still know a better solution when they see it.
rtpg
9 hours ago
uv having a good pip compatibility layer probably helped a lot, because you could try things out that way and see what fit, so to speak.
It's probably worth mentioning that Astral (The team behind uv/etc) has a team filled with people with a history of making very good CLI tooling. They probably have a very good sense for what matters in this stuff, and are thus avoiding a lot of pain.
Motivation is not enough, there's also a skill factor. And being multiple people working on it "full time"-ish means you can get so much done, especially before the backwards-compat obligations really start piling up
scuff3d
9 hours ago
uv was really smart in the way they integrated with existing solutions. My whole team just switched over from pip, and it was painless. We were already using pyproject.toml files which made it even easier, but uv also has documentation for transitioning from requirements.txt files.
simonw
9 hours ago
uv first came out 15th February 2024 so it's a year and a half old now. Still pretty impressive for it to get adoption this fast though.
lukeschlather
9 hours ago
I feel like I've tried at least 5 different package management tools for python. Between pip, poetry, pip-tools, pipx, I'm not really sure what easy_install, egg, pkg_info are, but I do know I have always been surprised I need to care.
It sounds like uv is a drop-in replacement for pip, pipx, and poetry with all of their benefits and none of the downsides, so I don't see why I wouldn't migrate to it overnight.
skylurk
10 minutes ago
It's a (better, IMO) replacement for poetry, but not a drop-in one. It is, however, a drop-in replacement for venv and pip-tools.
bognition
9 hours ago
Honestly, I was skeptical when I learned about uv. I thought, great, just what Python needs: another dependency manager… this was after fighting with pip, venv, venvwrapper, and poetry for years.
Then I gave it a try and it just worked! It’s so much better that I immediately moved all my Python projects to it.
zahlman
9 hours ago
> I thought, great, just what Python needs: another dependency manager… this was after fighting with pip, venv, venvwrapper, and poetry for years.
Pip, venv and virtualenvwrapper (people still use this?) are not meaningfully "dependency managers". A venv is just a place to put things, and pip does only basic tracking and tries to maintain a consistent environment. It isn't trying to help you figure out what dependencies you need, create new environments from scratch, update pyproject.toml....
Pip's core capability is the actual installation of packages, and uv does a far better job of that part, using smarter caching, hard links to share files, parallelized pre-compilation of .pyc files, etc. Basically it's designed from the ground up with the intention to make lots of environments and expect starting a new one to be cheap. Poetry, as far as I was able to determine, does it basically the same way as pip.
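The hard-link trick described above is ordinary filesystem hard linking: one cached copy of each package's files, linked into every venv that needs them, so new environments cost almost no disk space. A tiny illustration of the mechanism (not uv's code, just the filesystem behavior it relies on):

```python
import os
import tempfile

# Simulate uv's cache-plus-venvs layout: one real file in a "cache",
# hard links to it from two "venvs".
with tempfile.TemporaryDirectory() as root:
    cache = os.path.join(root, "cache.py")
    with open(cache, "w") as f:
        f.write("print('hello')\n")

    venv_a = os.path.join(root, "venv_a.py")
    venv_b = os.path.join(root, "venv_b.py")
    os.link(cache, venv_a)  # hard link, not a copy
    os.link(cache, venv_b)

    st = os.stat(cache)
    # All three names point at the same inode: one copy on disk.
    same_inode = os.stat(venv_a).st_ino == st.st_ino
    print(st.st_nlink, same_inode)  # 3 True
```

Deleting one "venv" link leaves the cached copy (and any other links) untouched, which is why throwing environments away is cheap.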
sgarland
7 hours ago
I actually did use virtualenvwrapper quite a bit until uv. I had built up various shell aliases and functions, so it was fairly painless to create and manage venvs. uv just means that I don’t have to think about that part now.
WD-42
9 hours ago
I think it’s been long enough now. uv just has so much velocity. pyproject.toml and PEP support just keeps getting better.
Poetry, which I think is the closest analogue, still requires a [tool.poetry.dependencies] section afaik.
greenavocado
9 hours ago
You don't even need to edit any files yourself for most simple use cases.
uv init
uv add package
uv run program.py
That's it. If you inherit a codebase made this way from someone else, merely running uv run program.py will automatically create the venv, install the packages, and run your script, seamlessly on first launch.
Uv lets you almost forget virtual environments exist. Almost.
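For reference, after those three commands the generated pyproject.toml looks roughly like this (a sketch; the exact fields and version pin depend on your uv version and the package added):

```toml
[project]
name = "example"
version = "0.1.0"
requires-python = ">=3.12"
dependencies = [
    "package>=1.0",
]
```

uv also writes a uv.lock next to it, which is what makes the first `uv run` on a fresh checkout reproducible.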
kstrauser
8 hours ago
Yep. Poetry was such a delightful upgrade from pipenv, which we’d tested as an upgrade from bare pip, which didn’t have a dependency resolver at the time. If someone’s already fully bought in on poetry, that’d be the one case where I could plausibly imagine them wanting to leave well enough alone.
For everyone else, just try uv and don’t look back.
walkabout
8 hours ago
Is this like when everyone on here had already been saying Yarn was a no-brainer replacement for npm, having totally obsoleted it, for like two-plus years, but it was still lacking safety/sanity checks, missing features, and broke in bizarre ways on lots of packages in-the-wild?
Or is the superior replacement actually up to the job this time?
dmd
8 hours ago
It really is just that good. That is why it's had such massive uptake. No matter how many times you've been burned before, no matter how skeptical you are, it's so good that, seriously, just try it, and you'll be instantly converted.
walkabout
7 hours ago
Ok, sold, I’ll try it.
kstrauser
8 hours ago
I’m certain there’s going to be some bizarre edge case where pip is fine and uv isn’t. It’s inevitable. However, in every situation where I’ve used it, uv is better than pip or poetry or any other package manager I’ve ever used.
I just found out they’re still making pipenv. Yes, if you’re using pipenv, I’m confident that uv will be a better experience in every way, except maybe “I like using pipenv so I can take long coffee breaks every time I run it”.
walkabout
8 hours ago
Yeah, I’m just skeptical because I was at an agency in the heat of yarn-mania, waaaay after people online were proclaiming npm dead and pointless, and it went poorly enough that we developed a ha-ha-only-serious joke that you knew a project was properly in-development when someone had lost a half-day debugging some really weird error only to find that “npm install” instantly fixed it, and then switched the started-in-yarn codebase over to npm.
kstrauser
8 hours ago
I could see that being traumatizing, but this really isn’t like that. Pip and uv and poetry and the rest don’t fundamentally change how a package is installed into a Python virtualenv. If `uv add foo` works, you could use the equivalent in any of those other tools and get basically the same result. You don’t have to know or care which tool is installing your project because that’s all invisible from inside the code you write.
daemonologist
6 hours ago
There is! The company I work for uses a weird version of Azure Devops for <governance> reasons, and pip can authenticate and install packages from its artifact feeds while uv cannot. We use uv for development speed (installing internal packages from source) and then switch to pip for production builds.
kstrauser
6 hours ago
This doesn't work? https://docs.astral.sh/uv/guides/integration/alternative-ind...
andy99
9 hours ago
I’ll bite - I could care less about speed, that feels like a talking point I see often repeated despite other package managers not being particularly slow. Maybe there’s some workload I’m missing that this is more important for?
I’ve tried uv a couple places where it’s been forced on me, and it didn’t work for whatever reason. I know that’s anecdotal and I’m sure it mostly works, but it obviously was off-putting. For better or worse I know how to use conda, and despite having no special attachment to it, slightly faster with a whole different set of rough edges is not at all compelling.
I have a feeling this is some kind of Rust fan thing and that’s where the push comes from, to try and insinuate it into more people’s workflows.
I’d like to hear a real reason I would ever migrate to it, and honestly if there isn’t one, am super annoyed about having it forced on me.
simonw
8 hours ago
The place where speed really matters is in virtual environment management.
uv uses some very neat tricks involving hard links such that if you start a new uv-managed virtual environment and install packages into it that you've used previously, the packages are hard-linked in. This means the new environment becomes usable almost instantly and you don't end up wasting filesystem space on a bunch of duplicate files.
This means it's no longer expensive to have dozens, hundreds or even thousands of environments on a machine. This is fantastic for people like myself who work on a lot of different projects at once.
Then you can use "uv run" to run Python code in a brand new temporary environment that gets created on demand within milliseconds of you launching it.
I wrote a Bash script the other day that lets me do this in any Python project directory that includes a setup.py or pyproject.toml file:
uv-test -p 3.11
That will run pytest with Python 3.11 (or 3.12/3.13/3.14/whatever version you like) against the current project, in a fresh isolated environment, without any risk of conflicting with anything else. And it's fast - the overhead of that environment setup is negligible. Which means I can test any code I like against different Python versions without any extra steps.
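A minimal sketch of such a wrapper (this is not simonw's actual script; the flag choices are assumptions). It builds the uv invocation for a given Python version and prints it; a real script would exec the command instead:

```shell
#!/usr/bin/env bash
# Hypothetical uv-test wrapper: run the current project's tests under a
# chosen Python version in a fresh, isolated environment.

uv_test_cmd() {
  local py="${1:-3.13}"   # Python version, e.g. 3.11
  shift || true
  # --isolated ignores any existing .venv; --with-editable installs the
  # current project into the throwaway environment alongside pytest.
  echo uv run --python "$py" --isolated --with pytest --with-editable . pytest "$@"
}

uv_test_cmd 3.11
```

Any extra arguments after the version are passed straight through to pytest, so something like `uv-test 3.11 -k fast` works too.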
Alir3z4
8 hours ago
Ooooh that's a neat one. I really like the hard links.
On my machine, there are like 100s if not thousands of venvs.
I simply have all of them under ~/.python_venvs/<project_name>/
Does that mean, no matter how many projects I install pytorch and tensorflow and huggingface and all the heavy machinery in, they'll be counted only once as long as they're unique?
If that's the case, then I can leave my habit of pip and move to uv.
This is something that always bugged my mind about virtual environments in almost all the package managers.
simonw
8 hours ago
"Does that mean, no matter how many projects I install pytorch and tensoflow and huggingface and all the heavy machinery, they'll be counted only once as long as they're unique?"
I think so, based on my understanding of how this all works. You may end up with different copies for different Python versions, but it should still save you a ton of space.
eslaught
9 hours ago
Conda doesn't do lock files. If you look into it, the best you can do is freeze your entire environment. Aside from this being an entirely manual process, and thus having all the problems that manual processes bring, it comes with a few issues:
1. If you edit any dependency, you re-solve the environment from scratch. There is no way to update just one dependency.
2. Conda "lock" files are just the hashes of the all the packages you happened to get, and that means they're non-portable. If you move from x86 to ARM, or Mac to Linux, or CPU to GPU, you have to throw everything out and resolve.
Point (2) has an additional hidden cost: unless you go massively out of your way, all your platforms can end up on different versions. That's because solving every environment is a manual process and it's unlikely you're taking the time to run through 6+ different options all at once. So if different users solve the environments on different days from the same human-readable environment file, there's no reason to expect them to be in sync. They'll slowly diverge over time and you'll start to see breakage because the versions diverge.
P.S. if you do want a "uv for Conda packages", see Pixi [1], which has a lot of the benefits of uv (e.g., lock files) but works out of the box with Conda's package ecosystem.
gre
9 hours ago
You've never waited 10 minutes for conda to solve your environment and then say it's unsolvable?
andy99
8 hours ago
I have, but it takes me back many years to some obscure situations I’ve been in. For my day to day, I can’t think of the last time I’ve encountered it, it’s been years, and I regularly am setting up new environments. That’s why I’m curious about the workflows where it matters.
gre
8 hours ago
I dunno, 2020? Since then I switched to mamba, then poetry, and now uv. I have spent way too much time fighting python's package managers and with uv I finally don't have to. ymmv
zbentley
7 hours ago
> I have a feeling this is some kind of Rust fan thing and that’s where the push comes from, to try and insinuate it into more people’s workflows.
When I first started using uv, I did not know what language it was written in; it was a good tool which worked far better than its predecessors (and I used pdm/pipenv/pyenv/etc. pretty heavily and in non-basic ways). I still don’t particularly care if it’s written in Rust or Brainfuck, it works well. Rust is just a way to get to “don’t bootstrap Python environments in Python or shell”.
> I’ve tried uv a couple places where it’s been forced on me, and it didn’t work for whatever reason.
I’m curious what issues you encountered. Were these bugs/failures of uv, issues using it in a specific environment, or workflow patterns that it didn’t support? Or something else entirely?
cgearhart
9 hours ago
I’ve been trying uv lately to replace my normal workflow of selecting a python with pyenv for the shell, then making a venv, then installing a bunch of default packages (pandas, Jupyter, etc). So far the only benefit is that I can use just the one tool for what used to take 3 (pyenv, venv, pip). I don’t _hate_ it…but it really isn’t much of an improvement.
morshu9001
9 hours ago
uv is comparable to npm. All your deps get auto tracked in a file. There are other things that do this, but pip isn't one of them, and I vaguely remember the others being less convenient.
The speed usually doesn't matter, but one time I did have to use it to auto figure out compatible deps in a preexisting project because the pip equivalent with backtracking was taking forever with CPU pegged at 100.
markkitti
8 hours ago
What tooling do you use?
testdelacc1
8 hours ago
Everyone downvoting you and disagreeing - don’t listen to them! I’m here to tell you that there is a massive conspiracy and everyone is in on it. Commenters on HN get paid every time someone downloads a Rust tool, that’s why they’re trying to convince you to use uv. It’s definitely not because they used it and found it worked well for them.
> could care less
I think “couldn’t care less” works better.
fragmede
7 hours ago
Being forced to use a tool you don't want to use sucks, no matter how awesome that tool may or may not actually be. *conda and uv have roughly the same goals, which means they're quite similar. For me, the speed of uv really does set it apart. For python programs with lots of dependencies, it's enough faster that I found it worth it to climb its learning curve. (ChatGPT makes that curve rather flat.) pip install -r requirements.txt went from a coffee break to me watching uv create the venv. But okay, speed gains aren't going to convince you.
Both of them manage venvs, but where the venv goes (by default) makes a difference, imo. Conda defaults to a user-level directory, e.g. ~/.conda/envs/my-venv. uv prefers a .venv dir in the project's folder. It's small, but it means per-project venvs are slightly more ergonomic with uv. Whereas with conda, because they're shared under the homedir, it's easy to get lazy once you have a working venv and reuse that good working venv across multiple programs, and then it breaks when one program needs its dependencies updated and now it's broken for all of them. Naturally that would never happen to a skilled conda operator, so I'll just say per-project uv venv creation and recreation flows just that tiny bit smoother, because I can just run "rm -rf .venv" and not worry about breaking other things.
One annoyance I have with uv is that it really wants to use the latest version of Python it knows about, and sometimes that version is too new for a program or one of its dependencies, and the program won't run. Running "uv venv --python 3.12" instead of "uv venv" isn't onerous, but it's annoying enough to mention. (pyproject.toml lets projects specify version requirements, but they're not always right.) Arguably that's a Python issue and not uv's, but as users, we just want things to work, dammit. That's always the first thing I look for when things don't work.
As mentioned, with uv the project venv lives in .venv inside the project's directory which lets "uv run program.py" cheat. Who amongst us hasn't forgotten to "source .venv/bin/activate" and been confused when things "suddenly" stopped working. So if you're in the project directory, "uv run" will automatically use the project's .venv dir.
As far as it being pushed to promote Rust: I'm sure there's a non-zero number of people for whom that's true, but personally, since Rust makes it harder for me to contribute to uv, it's actually a point against it. Sometimes I wonder how fast it would be if it were written in Python using the same algorithms, but run under PyPy.
Anyway, I wouldn't say any of that's revolutionary. Programs exist to translate between the different project file types (requirements.txt/environment.yml/pyproject.toml) so if you're already comfortable with conda and don't want to use uv, and you're not administering any shared system(s), I'd just stick the command to generate environment.yml from pyproject.toml on a cheat sheet somewhere.
---
One bug I ran into with one of the condas (I forget which) is that it called out to pip under the hood in interactive mode, pip got stuck waiting for user input, and that conda just sat there waiting for input that would never come. Forums were filled with reports by users talking about letting it run for hours or even days. I fixed that, but it soured me on *conda, unfortunately.
WhyNotHugo
5 hours ago
uv is weird. It's like 5 entirely different tools mashed and entangled into one program.
Last I tried it, it insisted on downloading a dynamically linked Python and installing that. This obviously doesn't work; you can't distribute dynamically linked binaries for Linux and expect them to work on any distribution (I keep seeing this pattern, and I guess it's because this typically works on macOS?).
Moreover my distribution already has a package manager which can install Python. I get that some absolute niche cases might need this functionality, but that should most definitely be a separate tool. The problem isn't just that the functionality is in the same binary, but also that it can get triggered when you're using another of its functionalities.
I wish this had been made into actual separate tools, where the useful ones can be adopted and the others ignored. And, most important, where the ecosystem can iterate on a single tool. Having "one tool that does 5 things" makes it really hard to iterate on a new tools that does only one of those things in a better way.
It's pretty disappointing to see the Python ecosystem move in this direction.
Balinares
an hour ago
Your distro's package manager cannot install arbitrary versions of Python such as might be required by a specific Python project and it cannot install anything at all for individual users without root access. These are two different tools that serve two different purposes.
atoav
8 hours ago
Started converting every repo over to uv. I had some weird and hard to deal with dependencies before. Every single one was easier to solve than before. It just works and is blazingly fast.
Absolute no-brainer.