Uv is the best thing to happen to the Python ecosystem in a decade

2058 points, posted 21 hours ago
by todsacerdoti

1144 Comments

hexbin010

4 hours ago

I predict it'll be the best and then the 'worst' thing: they'll go hard on monetisation.

Just look at this post: 1839 points and 1048 comments! That is insane. It's captured the hearts and minds of Python devs and I'm sure they know it.

I'm not against projects making money, just remember you'll likely pay a price later on once you invest in more of Astral's ecosystem. It's just temporarily free.

neves

an hour ago

The Python Software Foundation includes highly competent contributors. One reason I adopted uv is confidence in the Python community’s engineering decisions to prevent a takeover of the platform.

Progress is already underway. PEP 751 proposes a standardized format for lock files: https://peps.python.org/pep-0751/ This helps to reduce tool-specific lock-in.

uv is open source, so forking remains viable. Build metadata is committed, and conversion to other tools is feasible if needed.

However, we must all remain vigilant against the risk of lock-in.

rnhGatt

17 minutes ago

Many competent contributors have left or were silenced by the politicians. PyPI had multiple severe vulnerabilities. pip has no adequate story for the scientific ecosystem. Building from source via pip usually fails, unlike around 2010 when it usually worked.

The only thing that prevents lock-in is the religious zeal of most Python users to use anything presented by the PSF high priests, not technical merit.

The reason uv exists is the utter incompetence of PyPA.

lijok

3 hours ago

They’ve been very transparent about their monetization strategy and it does not impact uv’s foss model

hexbin010

3 hours ago

That's just marketing. Only time will tell. I'll be very happy to be wrong

lijok

3 hours ago

The time has come. They’re in closed beta with pyx, the first product they’re charging for.

maleldil

an hour ago

That assumes that pyx (or whatever else they come up with) will be enough to sustain the company.

lijok

17 minutes ago

Yes, as with every single OSS project in existence.

If I write some OSS tool that becomes popular, and lose my job, I might just start monetizing it.

motbus3

2 hours ago

Until they change their minds. If they were serious about it, it would be part of PyCQA

lijok

an hour ago

The same PyCQA that they worked hard to do a significantly better job than?

bgwalter

3 hours ago

It has always been like this. The only way to get glory and money in the Python space [1] is to set up a new package manager or package repository or both.

ActiveState, Enthought, Anaconda, now Astral.

[1] Discounting pure SaaS companies that just use Python but offer no tools.

dundercoder

3 hours ago

ActiveState, that’s a name I’ve not heard in a long time. A long time.

oblio

3 hours ago

Heh, Sonatype - Maven - Nexus, Gradle Inc - Gradle (both Java).

limagnolia

3 hours ago

It is open source. If they enshittify uv with monetization, it will be forked.

pjmlp

2 hours ago

Most forks eventually die.

passivegains

44 minutes ago

I'm not sure immortality is a good standard to hold forks to. the original project won't last forever either.

pjmlp

41 minutes ago

A project that dies before its forks has yet to exist.

Boxxed

30 minutes ago

Hudson died well before its fork (Jenkins).

zo1

2 hours ago

Yes, but only after fracturing the ecosystem even further unfortunately.

Qwertious

26 minutes ago

Yes, but how far apart will the fracture be? For instance, Mac and Windows are further apart than Ubuntu and Fedora, despite both being fractures in the OS 'ecosystem' - it's far easier to be cross-platform between Ubuntu/Fedora than between Mac/Windows.

krapht

2 hours ago

Anaconda is a good example of this.

sanskarix

9 hours ago

What excites me most about UV isn't just the speed improvement, but how it demonstrates a key principle in modern developer tooling: removing friction should never mean removing choice.

I've been following this discussion about project-centric vs. environment-centric workflows, and I think UV actually enables both patterns quite well. For the "fiddle around until something emerges" workflow that @BrenBarn mentioned, you can absolutely create a general-purpose environment with `uv venv playground` and then use `uv pip install` to gradually build up your experimental dependencies. The project structure can come later.
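
For instance, that workflow might look like this (a minimal sketch; the environment name and packages are arbitrary):

    uv venv playground
    source playground/bin/activate
    uv pip install requests pandas   # grow the environment as the experiment evolves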

What's interesting is how UV's speed makes the cost of switching between these approaches nearly zero. Want to quickly test something in isolation? Spin up a temporary environment. Want to formalize an experiment into a project? The migration is painless.

This mirrors what I've seen in other parts of the toolchain - tools like Vite for frontend dev or modern Docker practices all follow this pattern of "fast by default, but flexible when you need it." The velocity improvements compound when your entire toolchain operates on this principle.

darkamaul

7 hours ago

I think better developer tools tend to compound over time: they raise expectations and push the whole ecosystem forward.

IMO, uv is quickly becoming one of the best reasons to start a new project in Python. It’s fast, and brings a level of polish and performance that makes Python feel modern again.

ModernMech

an hour ago

I think uv is one of the best examples of why not to start a new project in Python. I mean, depending on the project really, Python has its place. But if your project is anything like uv, uv proves you can't actually write an app like uv in Python. So for as great as uv is, its proximity to Python is a constant reminder of its shortcomings.

arcanemachiner

an hour ago

A car doesn't move as fast as an airplane can.

Therefore, cars are useless and nobody should use one.

ModernMech

42 minutes ago

I had said: "I mean, depending on the project really, Python has its place."

My point is if you put an airplane next to a car factory, it's very clear you can't build the airplane in the car factory.

lawn

6 hours ago

Indeed. It's interesting to compare the expectations of a new programming language today to just a decade ago for instance.

oblio

2 hours ago

The funky thing is that people will fork uv and make package managers for other programming languages from it. Or at least from parts of it. Win win.

dekhn

20 hours ago

I hadn't paid any attention to rust before uv, but since starting to use uv, I've switched a lot of my performance-sensitive code dev to rust (with interfaces to python). These sorts of improvements really do improve my quality of life significantly.

My hope is that conda goes away completely. I run an ML cluster and we have multi-gigabyte conda directories and researchers who can't reproduce anything because just touching an env breaks the world.

embe42

20 hours ago

You might be interested in pixi, which is roughly to conda as uv is to pip (also written in Rust, it reuses the uv solver for PyPI packages)

Difwif

19 hours ago

Pixi has also been such a breath of fresh air for me. I think it's as big a deal as uv (it uses uv under the hood for the pure Python parts).

It's still very immature but if you have a mixture of languages (C, C++, Python, Rust, etc.) I highly recommend checking it out.

exasperaited

15 hours ago

Pixi is what FreeCAD is now using. (Along with Rattler).

It makes building FreeCAD pretty trivial, which is a huge deal considering FreeCAD’s really complex Python and non-python, cross-platform dependencies.

alfalfasprout

18 hours ago

Yep, pixi is game changing. Especially for AI/ML, the ability to deal with non-Python dependencies nearly as fast as `uv` is huge. We have some exciting work leveraging the lower-level primitives pixi uses that we hope to share more about soon.

icar

17 hours ago

This seems to pretty much cover the same use cases as Mise. Is that true?

suslik

10 hours ago

The main difference between mise and pixi is the ability to subscribe to conda channels and build conda environments extremely fast, bypassing or eliminating most of the usual conda frustration (regular conda users know what I mean). mise primarily installs asdf tools (last I checked).

On the Python front, however, I am somehow still an old faithful: poetry works just fine as far as I was ever concerned. I do trust the collective wisdom that uv is great, but I just never found a good reason to try it.

th0ma5

20 hours ago

This is something that uv advocates should pay attention to: there are always contexts that need different assumptions, especially with our ever-growing and complex pile of libraries and systems.

adastra22

18 hours ago

I wish the Python ecosystem would just switch to Rust. Things are nice over here… please port your packages to crates.

atty

13 hours ago

The unspoken assertion that Rust and Python are interchangeable is pretty wild and needs significant defense, I think. I know a lot of scientists who would see their first borrow checker error and immediately move back to Python/C++/Matlab/Fortran/Julia and never consider rust again.

t43562

9 hours ago

I've never used a more hostile language than Rust. Some people hate Python and I can't understand why, but such is life. One man's meat...

arw0n

2 hours ago

For me uv seems to solve some of the worst pain points of Python, which is great since I have to work with it. I think for a lot of people the hate comes in when they have to maintain or deploy Python code in scenarios that Python and its libraries weren't designed for. Some stuff just makes Python seem like an "unserious" programming language to me:

1. Installation & dependencies: Don't install Python directly, instead install pyenv, use pyenv to install python and pip, use pip to install venv, then use venv to install python dependencies. For any non-trivial project you have to be incredibly careful with dependency management, because breaking changes are extremely common.

2. Useless error messages: Outside of trivial examples with no external packages, I cannot remember when the error message I got actually pointed directly at the issue in the code. To give a quick example (pointing back to the point above), I got the error message "ImportError: cannot import name 'ChatResponse' from 'cohere.types'". A quick Google search reveals that this happens if a) the cohere API key isn't set in ENV or b) you use langchain-cohere 0.4.4 with cohere 5.x, since the two aren't compatible.

3. Undisciplined I/O in libraries: Another ML library I recently deployed has a log-to-file mode. Fair enough, should be disabled before k8s deployment no biggie. Well, the library still crashes because it checks if it has rwx-permissions on a dir it doesn't need.

4. Type conversions in C-interop: Admittedly I was also on the edge of my own capabilities when I dealt with these issues, but we had issues with large integers breaking when using numpy/pandas in between to do some transforms. It was a pain to fix, because Python makes it difficult to understand what's in a variable, and what happens when it leaves Python.

1. and 4. are mainly issues with people doing stuff in Python it wasn't really designed to do. Using Python as a scripting language or a thin (!) abstraction layer over C is where it really shines. 2. and 3. have more to do with the community, but it is compounded by bad language design.

kortex

an hour ago

1. is true, yup people have been ragging on the python install superfund site problem for years, but the rest of those are entirely 3rd party library issues. It's like saying Windows is not a serious operating system because you installed a buggy application.

2. I've used a ton of languages and frankly Python has the best tracebacks hands-down, it's not even close. It's not Python's fault a 3rd party library is throwing the wrong error.

3. Again, why is it bad language design that a library can do janky things with I/O?

4. FFI is tricky in general, but this sounds like primarily a "read the docs" problem. All of the major numeric acceleration libraries have fixed sized numbers, python itself uses a kind of bigint that can be any size. You have to stay in the arrays/tensors to get predictable behavior. This is literally python being "a thin abstraction layer over C."
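
A quick REPL illustration of that last point (values chosen so the multiplication overflows int64):

    >>> import numpy as np
    >>> (2**62) * 4            # Python int: arbitrary precision
    18446744073709551616
    >>> np.int64(2**62) * 4    # fixed-width int64: wraps to 0, NumPy emits a RuntimeWarning
    0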

moi2388

7 minutes ago

Because it isn’t a statically typed language, is terribly, terribly slow, has horrible production environments and has a lot of quirks?

I don’t understand why anybody would ever develop anything in Python other than “I want to write software but can’t be arsed to follow software design principles”, with all the mess that follows from it.

brabel

3 hours ago

I don’t really hate python but would absolutely never use it as a large code base main language. I think what people hate is other people trying to use a scripting language like Python in places where you have large code bases and large teams. Scripting languages in general are terrible for that as they give you almost no compile time guarantees about anything! But I always thought that for people whose main job is not programming and whose scripts don’t get larger than a couple of thousand lines, python is a good choice… though Lisp would perhaps be even better if historically it had gotten the huge mindshare and resulting ecosystem.

omgmajk

5 hours ago

> please port your packages to crates.

This is such a deeply unserious take. Do you have hundreds of thousands of hours to give out for free? No?

whimsicalism

20 hours ago

I work professionally in ML and have not had to touch conda in the last 7 years. In an ML cluster, it is hopefully containerized and there is no need for that?

disgruntledphd2

an hour ago

> I work professionally in ML and have not had to touch conda in the last 7 years. In an ML cluster, it is hopefully containerized and there is no need for that?

I wish my life had been like this. Unfortunately I always appear to end up needing to make this stuff work for everyone else (the curse of spending ten years on Linux, I suppose).

But then ML is a very broad church, and particularly if you're a researcher in a bigger company then I could see this being true for lots of people (again, i wish this was me).

jscyc

18 hours ago

Very common in education/research systems. Even the things which are containerised often have conda in them.

dekhn

18 hours ago

At least on my cluster, few if any workloads are containerized. We also have an EKS where folks run containerized, but that's more inference and web serving, rather than training.

BoredPositron

20 hours ago

It's still used in edu and research. Haven't seen it in working environments in quite some time, either.

kardos

20 hours ago

It would be nice indeed if there was a good solution to multi-gigabyte conda directories. Conda has been reproducible in my experience with pinned dependencies in the environment YAML... slow to build, sure, but reproducible.

blactuary

an hour ago

uv does it by caching versions of packages so they can be shared across projects/environments. So you still have to store those multi-gig directories but you don't have so much duplication. If conda could do something similar that would be great
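
You can poke at that shared cache directly, e.g.:

    uv cache dir     # print where uv keeps its cache
    uv cache prune   # drop cache entries no environment uses anymore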

PaulHoule

20 hours ago

I'd argue bzip compression was a mistake for Conda. There was a time when I had Conda packages made for the CUDA libraries so conda could locally install the right version of CUDA for every project, but boy it took forever for Conda to unpack 100MB+ packages.

kardos

20 hours ago

It seems they are using zstd now for .conda packages (bzip is obsolete), so that should be faster.

warbaker

17 hours ago

Have you figured out a good way to manage CUDA dependencies with uv?

dekhn

16 hours ago

CUDA is part of our cluster install scripts, we don't manage that with uv or conda. To me, that should be system software that only gets installed once.

persedes

17 hours ago

okanat

16 hours ago

Not the OP but does this actually package CUDA and the CUDA toolchain itself or just the libraries around it? Can it work only with PyTorch or "any" other library?

The Conda packaging system and registry are capable of understanding things like ABI and binary compatibility. They can resolve not only Python dependencies but binary dependencies too. Think dnf, yum, or apt, but OS-agnostic, including Windows.

As far as I know (apart from blindly bundling wheels), neither PyPI nor Python packaging tools have any knowledge of ABIs or of purely C/C++/Rust binary dependencies.

With Conda you can even use it to just have OS-agnostic C compiler toolchains, no Python or anything. I actually use Pixi for shipping an OS-agnostic libprotobuf version for my Rust programs. It is better than containers since you can directly interact with the OS like the Windows GUI and device drivers or Linux compositors. Conda binaries are native binaries.

Until PyPI and setuptools understand the binary intricacies, I don't think it will be able to fully replace Conda. This may mean that they need to have an epoch and API break in their packaging format and the registry.

uv, poetry, etc. can be very useful when the binary dependencies are shallow and do not integrate deeply, or when you are simply happy living behind the Linux kernel and a container and distro binaries fulfill your needs.

When you need complex hierarchies of package versions where half of them are not compiled with your current version of the base image and you need to bootstrap half a distro (on all OS kernels too!), Conda is a lifesaver. There is nothing like it.

saagarjha

7 hours ago

No, it’s PyTorch built against a particular version of CUDA. You need to install that on your system first.

dekhn

16 hours ago

If I find myself reaching a point where I would need to deal with ABIs and binary compatibility, I pretty much stop there and ask "is my workload so important that I need to recompile half the world to support it?" and the answer (for me) is always no.

okanat

16 hours ago

Well, handling OS-dependent binary dependencies is still unsolved, because of the intricate behavior of native libraries and especially how tightly C and C++ compilers integrate with their base operating systems. vcpkg, Conan, containers, Yocto, and Nix each target a limited slice of it, so there is no fully satisfactory solution. Pixi comes very close, though.

The Conda ecosystem is forced to solve this problem to a point, since ML libraries and their binary backends are terrible at keeping their binaries ABI-stable. Moreover, different GPUs have different capabilities and support different versions of GPGPU execution engines like CUDA. There is no easy way out without solving dependency hell.

saagarjha

7 hours ago

If you’re writing code for an accelerator, surely you care enough to make sure you can properly target it?

gostsamo

20 hours ago

As far as I get it, conda is still around because uv is focused on python while conda handles things written in other languages. Unless uv gets much more universal than expected, conda is here to stay.

tempay

20 hours ago

There is also pixi (which uses uv for the python side of things) which feels like uv for conda.

okanat

16 hours ago

Pixi is great! It doesn't purely use uv, though. I just love it. It solves the "create a repo that runs natively on any developer's PC" problem quite well. It handles different dependency trees per OS for the same library too!

prpl

16 hours ago

conda (and its derivatives that are also “conda” now), and conda-forge specifically, are the best ways to install things that will work across operating systems, architectures, and languages - without having to resort to compiling everything.

Want to make sure a software stack works well on a Cray with MPI+cuda+MKL, macOS, and ARM linux, with both C++ and Python libraries? It’s possible with conda-forge.

the__alchemist

an hour ago

You can do all of the above with Wheels.

prpl

an hour ago

Can you show me someone who has packaged log4cxx in a wheel? Is it in pip?

Arbitrary examples, I know, but I moved a large, truly mixed C++ and Python project to conda-forge, and all sorts of random C++ dependencies were in there, which drastically simplified distribution and drastically reduced compile time.

If I were doing it today, it might be nix+bazel, or maybe conda+bazel, but maintaining a world of C++ libraries for distribution as wheels does not sound like fun - especially because nobody is doing that work as a community now.

the__alchemist

38 minutes ago

I wrap Rust/CUDA programs in wheels. I've packaged arbitrary binaries in wheels, like software that's not directly related to Python. Can't say it works for everything, but I suspect so. You just run `maturin build`, then it can be installed with pip.
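
Roughly (a sketch; the exact wheel filename depends on your crate and platform):

    pip install maturin
    maturin build --release            # compiles the crate into a wheel under target/wheels/
    pip install target/wheels/*.whl    # install it like any other wheel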

Conda is like jQuery or Bootstrap: it was necessary before the official tools evolved. Now we don't need them any more, but they still are around for legacy reasons. You still need it, for example, for some molecular dynamics packages, but that's due to the package publishers choosing it.

kombine

7 hours ago

For me the best way to install things across operating systems has been nix. I wish it was more popular in the ML community.

levocardia

13 hours ago

Except the ONE annoying quirk that certain major projects and repos let their conda distribution get stale.

miki123211

10 hours ago

As a person who has successfully used uv for ml workloads, I'm curious what makes you still stay with Conda.

disgruntledphd2

an hour ago

Conda is much, much better for the C/Fortran/C++ parts of data science/machine learning workloads.

Like, I had real issues with GDAL and SQLite/spatialite on MacOS (easy on Linux) which uv was of no help with. I ended up re-architecting rather than go back to conda as uv is just that good (and it was early stage so wasn't that big of a deal).

blactuary

an hour ago

I stay with it because last time I tried uv it was still directory/project focused vs. environment. With conda it doesn't matter where I am, I can activate any of my environments and run code

insane_dreamer

an hour ago

cuda-nvcc was a blocker for us but it looks like a stable version is coming to pypi

oofbey

18 hours ago

Have you found it easy to write rust modules with python interfaces? What tools do you recommend?

the__alchemist

an hour ago

PyO3 + Maturin is fine, but it's a bit tedious if you have a big API. Some annoyances:

  - Enums with inner types require hacks
  - No clean way to make your Rust enums into Python enums
  - More boilerplate than you should have. I think I will have to write my own macros/helpers to solve. For example, you need getters and setters for each field. And the gotcha: You can't just macro them due to a PyO3 restriction.
  - No way to use other Python-exposed rust libs as dependencies directly. You have to re-expose everything. So, if your rust B lib depends on rust A, you will make a PyO3 Rust A package, but instead of re-using that in PyO3 Rust B, you will copy+paste your PyO3 Rust A boilerplate into PyO3 Rust B.
If you're maintaining a single lib and want to expose it to Python, it's fine once you find a pattern that works. If you have a network and/or your API is large or not stable, it's a mess. (but doable)

oconnor663

13 hours ago

PyO3 and Maturin are excellent. I've been maintaining a Python-module-written-in-Rust for several years now, and it's been quite smooth.

ederamen

13 hours ago

I'd be interested in this too. I know it's possible, but haven't found a good guide on how to do it well and manage the multi-lang complexity.

sbt567

13 hours ago

Many people use PyO3 for that

jvanderbot

20 hours ago

Obligatory: not only would Rust be faster than Python, Rust also makes it easy with Cargo. Go, C, and C++ should all exhibit the performance you are seeing in uv, had it been written in one of those languages.

The curmudgeon in me feels the need to point out that fast, lightweight software has always been possible, it's just becoming easier now with package managers.

dekhn

18 hours ago

I've programmed in all those languages before (learned C in '87, C++ in '93, Go in 2015 or so) and, to be honest, while I still love C, I absolutely hate what C++ has become. Go never appealed to me (they really ignored numeric work for a long time). Rust feels like somebody wanted to make a better C with more standard libraries, without going down the crazy path C++ took.

krzyk

10 hours ago

Also this. I liked C (I don't use it now; these days it's mostly Java), but C++ didn't appeal to me.

Rust is, for me, similar to C, just like you wrote: better and bigger, but not overwhelming the way C++ is (and Rust has Cargo; I don't know if C++ has anything comparable).

t43562

8 hours ago

OO is supposed to make life easier but C++ exposes all the complexity of the implementation to you. Its approach to hiding complexity is to shove it partially under a carpet with sharp bits sticking out.

okanat

16 hours ago

I actually got interested in Rust because its integer types and the core data structures looked sane, instead of this insanity: https://en.cppreference.com/w/cpp/types/integer.html . Fluid integer types are evil.

I stayed for the native functional programming, first class enums, good parts of C++ and the ultimate memory safety.

jvanderbot

18 hours ago

That is exactly how I feel about it. I've always loved C for its simplicity, and Rust felt like an accidental love letter.

BrenBarn

9 hours ago

The sticking point for me is the way tools like uv and poetry build everything around the idea of a "project". I don't want a separate environment for every project, and I don't want to start by creating a project. I want to start with an environment that has stuff in it, and I start fiddling around, and gradually something comes together that eventually will be pulled out into a separate project. From what I can see uv doesn't make this easy.

Uehreka

9 hours ago

This was always my issue with pip and venv: I don’t want a thing that hijacks my terminal and PATH, flips my world upside down and makes writing automated headless scripts and systemd services a huge pain.

When I drop into a Node.js project, usually some things have changed, but I always know that if I need to, I can find all of my dependencies in my node_modules folder, and I can package up that folder and move it wherever I need to without breaking anything, needing to reset my PATH or needing to call `source` inside a Dockerfile (oh lord). Many people complain about Node and npm, but as someone who works on a million things, Node/npm is never something I need to think about.

Python/pip though… Every time I need to containerize or set up a Python project for some arbitrary task, there’s always an issue with “Your Linux distro doesn’t support that version of Python anymore”, forcing me to use a newer version than the project wants and triggering an avalanche of new “you really shouldn’t install packages globally” messages, demanding new --yes-destroy-my-computer-dangerously-and-step-on-my-face-daddy flags and crashing my automated scripts from last year.

And then there’s Conda, which has all of these problems and is also closed source (I think?) and has a EULA, which makes it an even bigger pain to automate cleanly (And yes I know about mamba, and miniconda, but the default tool everyone uses should be the one that’s easy to work with).

And yes, I know that if I was a full-time Python dev there’s a “better way” that I’d know about. But I think a desirable quality for languages/ecosystems is the ability for an outsider to drop in with general Linux/Docker knowledge and be able to package things up in a sometimes unusual way. And until uv, Python absolutely failed in this regard.

jampekka

9 hours ago

Having a directory like node_modules containing the dependencies is such an obviously good choice; it's sad how the Python steering council actively resists this with what I find odd arguments.

I think a lot of the decades old farce of Python package management would have been solved by this.

https://peps.python.org/pep-0582/

https://discuss.python.org/t/pep-582-python-local-packages-d...

Sankozi

6 hours ago

This is not a good idea: this leads to longer build times and/or invalid builds (you build against different dependencies than declared in config).

Having a dependency cache and a build tool that knows where to look for it is a much superior solution.

jampekka

6 hours ago

(p)npm manages both fine with the dependency directory structure.

Sankozi

4 hours ago

This is literally not possible.

If you have local dependency repo and dependency manifest, during the build, you can either:

1. Check if local repo is in sync - correct build, takes more time

2. Skip the check - risky build, but fast

If the dependencies are only in the cache directory, you can have both - correct and fast builds.

jampekka

an hour ago

I don't follow. In pnpm there's a global cache at ~/.pnpm with versioned packages, and node_modules has symlinks to those. Dependencies are defined in package.json; transitive dependencies are versioned and SHA512-hashed in pnpm-lock.yaml.

E.g.

  $ ls -l ./node_modules/better-sqlite3
  ... node_modules/better-sqlite3 -> .pnpm/better-sqlite3@12.4.1/node_modules/better-sqlite3

zelphirkalt

7 hours ago

It's literally what a venv does and it is very widespread to just make a venv per project, just like you are creating a node_modules per project.

quietbritishjim

2 hours ago

> This was always my issue with pip and venv: I don’t want a thing that hijacks my terminal and PATH ...

What's the "this" that is supposedly always your issue? Your comment is phrased as if you're agreeing with the parent comment but I think you actually have totally different requirements.

The parent comment wants a way to have Python packages on their computer that persist across projects, or don't even have a notion of projects. venv is ideal for that. You can make some "main" venv in your user directory, or a few different venvs (e.g. one for deep learning, one for GUIs, etc.), or however you like to organise it. Before making or running a script, you can activate whichever one you prefer and do exactly like parent commenter requested - make use of already-installed packages, or install new ones (just pip install) and they'll persist for other work. You can even switch back and forth between your venvs for the same script. Totally slapdash, because there's no formal record of which scripts need which packages but also no ceremony to making new code.
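
For instance (paths and packages are just examples):

    python -m venv ~/venvs/main
    source ~/venvs/main/bin/activate
    pip install numpy requests    # stays installed in this venv for future sessions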

Whereas your requirements seem to be very project-based - that sounds to me like exactly the opposite point of view. Maybe I misunderstood you?

zelphirkalt

6 hours ago

> Python/pip though… Every time I need to containerize or set up a Python project for some arbitrary task, there’s always an issue with “Your Linux distro doesn’t support that version of Python anymore” [...]

How are you containerizing Python projects? What confuses me about your statement are the following things:

(1) How old must the Python version of those projects be, to not be supported any longer with any decent GNU/Linux distribution?

(2) Are you not using official Python docker images?

(3) What's pip gotta do with a Python version being supported?

(4) How does that "Your Linux distro doesn’t support that version of Python anymore" show itself? Is that a literal error message you are seeing?

> [...] demanding new --yes-destroy-my-computer-dangerously-and-step-on-my-face-daddy flags and crashing my automated scripts from last year

It seems you are talking about installing things in system Python, which you shouldn't do. More questions:

(1) Why are you not using virtual environments?

(2) You are claiming Node.js projects to be better in this regard, but actually they are just creating a `node_modules` folder. Why then is it a problem for you to create a virtual environment folder? Is it merely, that one is automatic, and the other isn't?

> This was always my issue with pip and venv: I don’t want a thing that hijacks my terminal and PATH, flips my world upside down and makes writing automated headless scripts and systemd services a huge pain.

It is very easy to activate a venv just for one command. Use a subshell, where you `. venv/bin/activate && python ...(your program invocation here)...`. Aside from that, projects can be set up so that you don't even see that they are using a venv. For example, I usually create a Makefile that does the venv activating and running and all that for me. Rarely, if ever, do I have to activate it manually. Since each line in a Makefile target runs in its own shell, nothing ever pollutes my actual top-level shell.
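
For example, a throwaway subshell (a sketch; `main.py` is a placeholder):

    (. .venv/bin/activate && python main.py)

The parentheses run the activation in a child shell, so your interactive shell's PATH is never touched.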

bashkiddie

5 hours ago

> (1) How old must the Python version of those projects be, to not be supported any longer with any decent GNU/Linux distribution?

Debian-13 defaults to Python-3.13. Between Python-3.12 and Python-3.13 the support for `pkg_config` got dropped, so pip projects like

https://pypi.org/project/remt/

break. What I was not aware of: `venv`s need to be created with the version of Python they are supposed to run under. So you need to have a downgraded Python executable first.

agrounds

4 hours ago

> What I was not aware of: `venv`s need to be created with the version of Python they are supposed to run under. So you need to have a downgraded Python executable first.

This is one of uv’s selling points. It will download the correct Python version automatically, create the venv using it, ensure that venv has your dependencies installed, and ensure that venv is active whenever you run your code. I’ve been bitten by the issue you’re describing many times before, and previously had to use a mix of tools (e.g. pyenv + pipenv). Now uv does it all, and much better than any previous solution.
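
Concretely, something like (3.12 being an arbitrary version):

    uv python install 3.12    # fetch a managed interpreter
    uv venv --python 3.12     # create the venv against it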

bashkiddie

5 hours ago

> (2) Are you not using official Python docker images?

Would you help me make it work?

  docker run -it --rm -v$(pwd):/venv --entrypoint python python:3.12-alpine -m venv /venv/remt-docker-venv
How do I source it?

  cd remt-docker-venv/
  source bin/activate
  python --version
  bash: python: command not found

milsebg

3 hours ago

Instead of "python --version", just use the "python" executable from within the venv. Sourcing is a concept for interactive shells.

tecleandor

3 hours ago

The python executable from the venv is not going to work inside the container, as it's a symlink by default. That venv was built on their host OS, and the symlink to the Python binary won't resolve inside the container.

You could also pass the `--copies` parameter when creating the initial venv, so it's a copy and not symlinks, but that is not going to work if you're on macOS or Windows (because the binary platform is different from the Linux that's running the container), or if your development Python is built with different library versions than the container you're starting.

tecleandor

3 hours ago

I'd recommend re-creating the virtual environment inside the Docker container.

The problem is you are mounting a virtual environment you have built in your development environment into a Docker container. Inside your virtual environment there's a `python` binary that in reality is a symlink to the python binary in your OS:

  cd .venv
  ls -l bin/python
  lrwxr-xr-x@ 1 myuser  staff  85 Oct 29 13:13 bin/python -> /Users/myuser/.local/share/uv/python/cpython-3.13.5-macos-aarch64-none/bin/python3.13
So, when you mount that virtual environment in a container, it won't find the path to the python binary.

The most basic fix would be recreating the virtual environment inside the container, so from your project (approximately, I don't know the structure):

   docker run -it --rm -v$(pwd):/app --entrypoint ash ghcr.io/astral-sh/uv:python3.12-alpine
  / # cd /app
  /app # uv pip install --system -r requirements.txt
  Using Python 3.12.12 environment at: /usr/local
  Resolved 23 packages in 97ms
  Prepared 23 packages in 975ms
  Installed 23 packages in 7ms
  [...]
  /app # python
  Python 3.12.12 (main, Oct  9 2025, 22:34:22) [GCC 14.2.0] on linux
  Type "help", "copyright", "credits" or "license" for more information.
But, if you're developing and don't wanna build the virtual environment each time you start the container, you could create a cache volume for uv, and after the first time installation, everything is going to be way faster:

  # First run
   docker run -ti --rm --volume .:/app --volume uvcache:/uvcache -e UV_CACHE_DIR="/uvcache" -e UV_LINK_MODE="copy" --entrypoint ash ghcr.io/astral-sh/uv:python3.12-alpine
  / # cd /app
  /app # uv pip install -r requirements.txt --system
  Using Python 3.12.12 environment at: /usr/local
  Resolved 23 packages in 103ms
  Prepared 23 packages in 968ms
  Installed 23 packages in 16ms
  [...]
  # Second run
   docker run -ti --rm --volume .:/app --volume uvcache:/uvcache -e UV_CACHE_DIR="/uvcache" -e UV_LINK_MODE="copy" --entrypoint ash ghcr.io/astral-sh/uv:python3.12-alpine
  / # cd /app
  /app # uv pip install -r requirements.txt --system
  Using Python 3.12.12 environment at: /usr/local
  Resolved 23 packages in 10ms
  Installed 23 packages in 21ms
You can also see some other examples, including a Docker Compose one that automatically updates your packages, here:

https://docs.astral.sh/uv/guides/integration/docker/#develop...

---

Edit notes:

  - UV_LINK_MODE="copy" is to avoid a warning when using the cache volume
  - Creating the venv with `--copies` and mounting it into the container would fail
    if your host OS is not exactly the same as the container's, and it also defeats,
    in a way, the use of a versioned Python container

Timsky

7 hours ago

> demanding new --yes-destroy-my-computer-dangerously-and-step-on-my-face-daddy flags and crashing my automated scripts from last year.

Literally my case. I recently had to compile an abandoned six-year-old scientific package written in C with Python bindings. I wasn’t aware that modern versions of pip handle builds differently than they did six years ago; specifically, they now build wheels within an isolated environment. I was surprised to see a message indicating that %package_name% was not installed, yet I was still able to import it. By the second day, I had finally discovered pip's --no-build-isolation option.
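
For anyone hitting the same wall, the flag is used along the lines of:

    pip install --no-build-isolation .

which builds against the packages already present in your environment instead of a fresh isolated one.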

rdfi

9 hours ago

To avoid having to call 'source ...' in a Dockerfile: if you use the python executable from the virtualenv directly, it will behave as if you had activated that virtualenv.

This works because the interpreter finds the adjacent pyvenv.cfg file via a relative path.

zelphirkalt

7 hours ago

The way to activate a virtual environment in a Docker container is to export a modified PATH and possibly change PYTHONHOME.
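
A common pattern for that in a Dockerfile (assuming the venv lives at /app/.venv):

    ENV VIRTUAL_ENV=/app/.venv
    ENV PATH="/app/.venv/bin:$PATH"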

Uehreka

8 hours ago

I think my ultimate problem with venv is that virtual environments are solved by Docker. Sure sure, full time Python devs need a way to manage multiple Python and package versions on their machine and that’s fine. But whatever they need has to not get in my way when I come in to do DevOps stuff. If my project needs a specific version of Node, I don’t need nvm or n, I just install the version I want in my Dockerfile. Same with Go, same with most languages I use.

Python sticks out for having the arrogance to think that it’s special, that “if you’re using Python you don’t need Docker, we already solved that problem with venv and conda”. And like, that’s cute and all, but I frequently need to package Python code and code in another language into one environment, and the fact that their choice for “containerizing” things (venv/conda) plays rudely with every other language’s choice (Docker) is really annoying.

cylemons

6 hours ago

I don't understand why you can't just install Python in your container. How does venv make it hard?

zelphirkalt

7 hours ago

Then use a Docker container that has the right Python version already? There are official containers for that.

If that's not good enough for you, you could do some devops stuff and build a docker container in which you compile Python.

I don't see where it is different from some npm project. You just need to use the available resources correctly.

BrenBarn

8 hours ago

Conda is open source. Not sure what you mean about an EULA. There are some license agreements if you use Anaconda, but if you just use conda-forge you don't have any entanglements with Anaconda the company. (I agree the nomenclature is confusing.)

Uehreka

8 hours ago

I… I’m sorry to hear that. Wow. That is shockingly bad.

Seriously, this is why we have trademarks. If Anaconda and Conda (a made-up word that only makes sense as a nickname for Anaconda and thus sounds like it’s the same thing) are two projects by different entities, then whoever came second needs to change their name, and whoever came first should sue them to force them. Footguns like this should not be allowed to exist.

BrenBarn

7 hours ago

It's not like they're entirely separate and unrelated things. Anaconda is a company that created a program called Conda which can connect to various "channels" to get packages, and initially the main one was the Anaconda channel. Conda was open source but initially its development was all done by Anaconda. Gradually the Conda program was separated out and development was taken over by a community team. Also there is now conda-forge which is a community-run channel that you can use instead of the Anaconda one. And then there is also Mamba which is basically a faster implementation of Conda. That's why there's the awkward naming. It's not like there are competing groups with similar names, it's just things that started off being named similarly because they were built at one company, but gradually the pieces got separated off and moved to community maintenance.

cma

8 hours ago

Conda was made by Anaconda; there's no one to sue. It's Chromium vs. Chrome.

matsemann

8 hours ago

What you describe I think is what most other people hate the most about python. The fact that everything pollutes the global environment, which then becomes a mess of things depending on various versions, which also ends up breaking tools included in the OS and suddenly your whole system is effed.

zelphirkalt

7 hours ago

Exactly, and this is why we use virtual environments. Which then get dumped on by other people. People can't have it both ways, but they already have all the choices in Python's ecosystem. They just complain, because they are not aware how they can do it their way and didn't invest the time to nail down the workflow they like, or their workflow idea is self-contradictory.

brap

7 hours ago

My sentiment coming to Python after getting used to the DX of Node

https://uploads.dailydot.com/2024/04/damn-bitch-you-live-lik...

abenga

6 hours ago

My exact reaction when I visit front-end-land. It always surprises me that you can't just leave a project alone for some time, get back to it and continue where you left off. There are always a couple of days of getting the thing to work again, then figuring out how everyone is doing things now, because apparently that has to change every few months.

brap

6 hours ago

While I don’t do frontend and don’t want to defend that mess, I think that locking a specific version for each dependency would solve this

abenga

5 hours ago

Uv allows you to lock versions too. And creates a .venv inside the project which, I guess, is similar to the node_modules directory.

brap

4 hours ago

For sure, I was referring to the state of python before uv. Specifically projects not being self-contained, easily portable, etc

robertfw

24 minutes ago

You could already do these things before, you just spent much more time twiddling your thumbs waiting for lock files to be resolved

cylemons

6 hours ago

System package management is a mess in the first place. If you have a program that uses Python, then all the packages it uses need to be installed globally, so you have Python packages bundled as system packages, which can conflict with the same package installed with pip.

dochne

7 hours ago

I've never quite forgiven Python for the time I tried to update it using yum; it failed, which then broke yum. Not a fun place to end up.

Treating Python as a project-level dependency rather than a system-level dependency is just an excellent design choice.

BrenBarn

7 hours ago

I'm not talking about the global environment. I want to be able to have whatever environments I want. But my point is that the environments are not in a 1-1 relationship with "projects".

gempir

9 hours ago

This is easier to do with uv than it is with pip.

You can create venvs wherever you please and then just install stuff into them. Nobody forces the project onto you; at work we don't even use the .toml yet because it's relatively new, and we still use a python_requirements.txt and install into a venv that is global to the system.

zelphirkalt

7 hours ago

Way to go to create non-reproducible results. Basically, nothing production ready is rolling out of that one.

gempir

7 hours ago

This is what the OP wanted. I don't agree to do it this way.

At work we use `uv pip freeze` to generate a more strict requirements file.

zelphirkalt

6 hours ago

What I was referring to is:

> [...] at work we don't even use the .toml yet because it's relatively new, we still use a python_requirements.txt and install into a venv that is global to the system.

Unless your `python_requirements.txt` also carries checksums, like uv's lock files or poetry's lock files, that is. Though of course things get spicy and non-reproducible again, if you have then multiple projects/services, each with their own `python_requirements.txt`, all installing into that same global venv ...

taftster

9 hours ago

You can't just create yourself an "everything" environment with UV and then experiment with it? Honest question.

I think you're basically suggesting that you'd have a VM or something that has system-wide packages already preinstalled, and then use uv on top of it?

BrenBarn

9 hours ago

If so, it's certainly not obvious. I mean look at the docs: https://docs.astral.sh/uv/getting-started/features/

I don't see anything resembling "environments" in the list of features or in the table of contents. In some sections there is stuff like "When working on a project with uv, uv will create a virtual environment as needed", but it's all about environments as tied to particular projects (and maybe tools).

You can use the `uv venv` and the `uv pip` stuff to create an environment and install stuff into it, but this isn't really different from normal venvs. And in particular it doesn't give me much benefit over conda/mamba.

I get that the project-based workflow is what a lot of people want, and I might even want it sometimes, but I don't want to be forced into foregrounding the project.

philipallstar

5 hours ago

> I don't want to be forced into foregrounding the project.

The advantage of being forced to do this is that other people (including yourself on a new laptop) can clone your project, run `uv sync`, and get working. It's the death of "works on my machine" and "well, it'll take them a couple of weeks to properly get up and running".

jcattle

9 hours ago

> And in particular it doesn't give me much benefit over conda/mamba.

How about the advantage of not taking an entire lunch break to resolve the environment every time you go to install a new library?

That was the biggest sticking point with conda/mamba for me. It's been a few years since I last used them but in particular with geospatial packages I would often run into issues.

blactuary

an hour ago

This is not the case anymore. With libmamba I've never waited more than a few minutes for resolving dependencies and installing. uv is faster for sure, but speed is no longer a big problem with conda

greazy

9 hours ago

libmamba solved this years ago. The dep solver is now much faster.

quietbritishjim

2 hours ago

I think that user (like me until finding out just now) didn't know that you could make ad-hoc virtual environments with uv, and assumed that instead you would have to make a directory with a pyproject.toml and install things into it by adding them to that pyproject.toml.

zelphirkalt

7 hours ago

You can, but an "everything environment" is rarely to be recommended. How to do it: Make uv use the --active environment to install things. Before using uv, activate that environment.

quietbritishjim

2 hours ago

I have the same feeling, so I just looked it up and actually uv does support exactly that mode. It works the same as venv and pip but you just prefix a bunch of commands with "uv". Create a new virtual environment fooenv:

   uv venv fooenv
Activate virtual environment on Windows (yes I'm sorry that's what I'm currently typing on!):

   .\fooenv\Scripts\activate
Run some environment commands:

   uv pip install numpy
   uv pip freeze
   uv pip uninstall numpy
If you run python now, it will be in this virtual environment. You can deactivate it with:

   deactivate
Admittedly, there doesn't seem to be much benefit compared to traditional venv/pip for this use case.

This is covered in the section of the docs titled "The pip interface": https://docs.astral.sh/uv/pip/

jdranczewski

8 hours ago

I agree that having a reliable main environment for quick experiments is great! On Windows I just use the main Python installation as a global environment, since no system stuff depends on it, on Linux I tend to create a "main" environment in the home directory. Then I can still have per-project environments as needed (say with uv), for example for stuff that I need to deploy to the VPS.

Note that I'm mostly in the research/hobby environments - I think this approach (and Python in general, re: some other discussions here about the language) works really well, especially for the latter, but the closer you get to "serious" work, the more sense the project environment approach makes of course

zelphirkalt

7 hours ago

Especially in research the project environment approach makes sense, where results don't mean much without reproducibility.

blactuary

an hour ago

This is me exactly. I have a default dev environment, and a couple for experimenting with new versions of various packages, I don't need or want "projects" most of the time

trymas

9 hours ago

You can setup uv inside your script, without a project.

Example: https://treyhunner.com/2024/12/lazy-self-installing-python-s...

BrenBarn

9 hours ago

I'm not talking about wanting single-file scripts, but about having a "sandbox" environment in which various things can be messed with before abstracting anything out into a project.

jrvarela56

9 hours ago

I have a directory called workspace where there's a projects directory, and the main area is for messing around. Just set up workspace once as a project.

BrenBarn

8 hours ago

But I don't want the sandbox linked in any way to a directory. I just want to be able to use it from anywhere. (This is what I can do with conda.)

csnweb

7 hours ago

You can activate the uv venv from anywhere just fine: just do source path_to_sandbox/.venv/bin/activate. It probably makes sense to define a shortcut for that, like activate sandbox. Your conda env is also linked to a directory, it’s just a hidden one; you can also create the uv one somewhere hidden. But I get it to some extent: conda has these large prefilled envs with a lot of stuff in them already that works together. Still, if you then end up needing anything else, you wait ages for the install. I find conda so unbearable by now that I voluntarily switch every conda thing I have left over to uv the second I need to touch the conda env.

zelphirkalt

7 hours ago

A shell alias that activates workspace/sandbox?

Hackbraten

9 hours ago

Doesn't the single-file script let you do exactly that?

If not, where do you see a meaningful difference?

prasoon2211

6 hours ago

Most people actually using python do not start off in scripts. Usually, I would mess around in IPython / Jupyter for a couple days until I have something I'm happy with. Then I'll "productionize" the project.

tbh this has been a sticking point for me too with uv (though I use it for everything now). I just want to start up a REPL with a bunch of stuff installed so I can try things out. My solution now is to have a ~/tmp dir where I can mess around with all kinds of stuff (not just Python), and there I have a uv virtualenv with all kinds of packages pre-installed.

Hackbraten

5 hours ago

    uv run --with=numpy,pandas python

lurking_swe

8 hours ago

Serious question - what’s stopping you from having 1 large project called “sandbox”?

codeptualize

5 hours ago

But then you don't need uv. The pain point uv solves is projects. Different projects with different dependencies (even the same but different versions), multiple people, teams, and environments trying to run the same code.

That gets problematic if environments go out of sync, or you need different versions of python or dependencies.

So you are right, you probably won't benefit a lot if you just have one big environment and that works for you, but once you pull things in a project, uv is the best tool out there atm.

You could also just create a starter project that has all the things you want, and then later on pull it out, that would be the same thing.

zelphirkalt

7 hours ago

Neither of the tools (uv, poetry) makes it the default, and the poetry designers/developers are kinda ignorant about this, but it can be done.

9dev

9 hours ago

Interesting. I never start working on something without a rough idea of what I am working on, be that just researching something or a program; and uv makes it extremely easy to create a folder, and make it a project.

Could it be that you’re just used to separate environments causing so much pain that you avoid it unless you’re serious about what you’re doing?

tlarkworthy

9 hours ago

It unblocks that workflow, that's why it's so great. You can have a single script with inline dependencies that are auto installed on execution. That can expand to importing other files, but there is very little setup tax to get started with a script and it does not block expansion.

BrenBarn

9 hours ago

It's not about single-file scripts, it's about having a "sandbox" environment in which various things can be messed with before abstracting anything out into a project.

blactuary

an hour ago

This is a divide among different Python devs it seems. A lot of people are running scripts, which I will do eventually, but I spend a ton of time in a REPL experimenting and testing.

dimatura

7 hours ago

Whenever I feel like doing that I just use "uv pip" and pretty much do the same things I'd do when using pip to messily install things in a typical virtual environment.

atoav

9 hours ago

Go ahead and do that then. uv is not preventing you from putting 10 projects within one folder.

It is still beneficial not to install stuff system-wide, since that makes it easy to forget which stuff you already have installed and which is a missing dependency.

Keeping track of dependencies is kind of part of a programmer's work, so as long as you're writing these things mostly for yourself, do whatever you like. And I say that as someone who treats everything like a project that I will forget about in 3 days and need to deploy on some server a year later.

lvl155

6 hours ago

You set up a template project folder. In the script you can just add a uv command. In the age of Claude Code and Codex, creating a sandbox/container should be step one.

vietvu

7 hours ago

You can just `uv venv`? Or even uvx?

BiteCode_dev

3 hours ago

It doesn't.

uv has a script mode, a temp env mode, and a way to superimpose a temp env on top of an existing env.

See: https://www.bitecode.dev/p/uv-tricks

That's one of the selling points of the tool: you don't need a project, you don't need to activate anything, you don't even need to keep code around.

Yesterday I wanted to mess around with loguru in IPython. I just ran `uvx --with loguru ipython` and I was ready to go.

Not even a code file to open. Nothing to explicitly install nor to clean up.

For a tool that is that fantastic and creates such enthusiasm, I'm always surprised by how little of its features people know about. It can do crazy stuff.

forrestthewoods

7 hours ago

> I don't want a separate environment for every project

That is exactly 100% what I demand. Projects should be - must be - completely isolated from one another.

Quite frankly anything else seems utterly insane to me.

mark_l_watson

2 hours ago

UV lets me love using Python. There are other languages, mostly Lisp languages, that I have always liked better but my workflow with UV is so pleasant that I find myself not minding Python the language, even looking forward to using it.

General comment: using Rust for utilities and libraries has revitalized Python.

aerhardt

21 hours ago

I'm surprised by how much I prefer prepending "uv" to everything instead of activating environments - which is still naturally an option if that's what floats your boat.

I also like how you can manage Python versions very easily with it. Everything feels very "batteries-included" and yet local to the project.

I still haven't used it long enough to tell whether it avoids the inevitable bi-yearly "debug a Python environment day" but it's shown enough promise to adopt it as a standard in all my new projects.

zahlman

20 hours ago

> how much I prefer prepending "uv" to everything instead of activating environments

You can also prepend the path to the virtual environment's bin/ (or Scripts/ on Windows). Literally all that "activating an environment" does is to manipulate a few environment variables. Generally, it puts the aforementioned directory on the path, sets $VIRTUAL_ENV to the venv root, configures the prompt (on my system that means modifying $PS1) as a reminder, and sets up whatever's necessary to undo the changes (on my system that means defining a "deactivate" function; others may have a separate explicit script for that).
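A simplified sketch of what the bash flavor of the script does (paths and prompt are illustrative):

    # roughly equivalent to `source .venv/bin/activate`:
    export VIRTUAL_ENV="/path/to/project/.venv"
    export PATH="$VIRTUAL_ENV/bin:$PATH"
    PS1="(.venv) $PS1"   # the prompt reminder
    # ...plus a deactivate() function that restores the old values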

I personally don't like the automatic detection of venvs, or the pressure to put them in a specific place relative to the project root.

> I also like how you can manage Python versions very easily with it.

I still don't understand why people value this so highly, but so it goes.

> the inevitable bi-yearly "debug a Python environment day"

If you're getting this because you have venvs based off the system Python and you upgrade the system Python, then no, uv can't do anything about that. Venvs aren't really designed to be relocated or to have their underlying Python modified. But uv will make it much faster to re-create the environment, and most likely that will be the practical solution for you.

biimugan

20 hours ago

Yup. I never even use activate, even though that's what you find in docs all over the place. Something about modifying my environment rubs me the wrong way. I just call ``./venv/bin/python driver.py`` (or ``./venv/bin/driver`` if you install it as a script) which is fairly self-evident, doesn't mess with your environment, and you can call into as many virtualenvs as you need to independently from one another.

``uv`` accomplishes the same thing, but it is another dependency you need to install. In some envs it's nice that you can do everything with the built-in Python tooling.

1718627440

18 hours ago

And when you control the installation, you can install multiple python versions with `make altinstall` into the same prefix, so you don't even need to spell out `project/bin/python`; you can just call `python-project` or `project.py` or however you like.

zahlman

17 hours ago

Yep. (Although I installed into a hierarchy within /opt, and put symlinks to the binaries in /usr/local/bin. Annoyingly, I have to specify the paths to the actual executables when making venvs, so I have a little wrapper for that as well....)

agrounds

4 hours ago

>> I also like how you can manage Python versions very easily with it.

> I still don't understand why people value this so highly, but so it goes.

Well I do need some way to install multiple python versions in parallel, and ideally the correct python version would be used in each project automatically. I used to use pyenv for this, which puts shims in your path so that it can determine which python executable to run on the fly, but I found that it was sometimes very slow, and it didn’t work seamlessly with other tools. Specifically pipenv seemed to ignore it, so I’d have to use a workaround to point pipenv to the path to the pyenv-installed python executable.

When one tool does both python installs and dependency/venv management, then it can make these work seamlessly together, and I don’t need to screw up my path to make the version selection work either.
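With uv, that looks roughly like this (version number is just an example):

    $ uv python install 3.12    # download a managed CPython build
    $ uv python pin 3.12        # write .python-version for this project
    $ uv run python --version   # picks up the pinned interpreter automatically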

pnt12

18 hours ago

If you have multiple python applications with different versions, it's nice to use the same version as deployed.

At least major and minor; matching the patch version is rarely needed for Python.

scotty79

17 hours ago

I would very much like to know the reason why they named it bin/ here and Scripts/ there. To get some closure.

omgmajk

5 hours ago

Annoys me to no end as well. I don't have an answer for you, but I would like one too.

lelandbatey

20 hours ago

I agree, once I learned (early in my programming journey) what the PATH is as a concept, I have never had an environment problem.

However, I also think many people, even many programmers, basically consider such external state "too confusing" and also don't know how they'd debug such a thing. Which I think is a shame, since once you see that it's pretty simple, it becomes a tool you can use everywhere. But given that people DON'T want to debug such things, I can understand them liking a tool like uv.

I do think automatic compiler/interpreter version management is a pretty killer feature though; that's typically really annoying otherwise, afaict, mostly because getting non-system-wide installs seems to require compiling yourself.

bobsomers

20 hours ago

Personally, I prefer prepending `uv` to my commands because they're more stateless that way. I don't need to remember which terminal my environment is sourced in, and when copying and pasting commands to people I don't need to worry about what state their terminal is in. It just works.

sirfz

20 hours ago

I use mise with uv to automatically activate a project's venv but prefixing is still useful sometimes since it would trigger a sync in case you forgot to do it.

globular-toast

19 hours ago

One of the key tenets of uv is that virtualenvs should be disposable. So, barring any bugs in uv, there should never be any environment debugging. Worst case, just delete .venv and continue as normal.

j45

20 hours ago

This isn't a comment just about Python... but it should just work. There shouldn't be constant ceremony for getting and keeping environments running.

1718627440

18 hours ago

It does, Python has essentially solved it for years.

bschwindHN

13 hours ago

lol no it hasn't

Why else is this discussion getting hundreds of comments?

For any random python tool out there, I had about a 60% chance it would work out of the box. uv is the first tool in the python ecosystem that has brought that number basically to 100%. Ironically, it's written in Rust because python does not lend itself well to distributing reliable, fast tools to end users.

zelphirkalt

3 hours ago

It comes up again and again, because people don't realize that solutions already exist(ed) and they don't spend the time to figure things out.

I have managed reproducible Python services and software for multiple years now. This was solved already before uv, although uv does it faster and maybe offers a bit more comfort, although I abstract away such tooling using a simple Makefile anyway.

The reason you are having such a bad time getting random Python projects to work out of the box is that the people creating them did not spend the effort to make them reproducible, meaning they do not ensure the setup has the same versions and checksums of every direct and transitive dependency. This can be facilitated using various tools these days: poetry, uv, and I am sure there are more. People are just clueless and think that a requirements.txt file with a few loose versions slung in is sufficient. It is not, and you end up with non-working project setups like in those cases you refer to.

1718627440

9 hours ago

The language (actually the standard implementation's build system) has. The problem is that programs and installations don't use it.

oblio

20 hours ago

There are basically 0 other programming languages that use the "directory/shell integration activated virtual environment", outside of Python.

How does the rest of the world manage to survive without venvs? Config files in the directory. Shocking, really :-)))

zahlman

19 hours ago

> Config files in the directory.

The problem is, that would require support from the Python runtime itself (so that `sys.path` can be properly configured at startup) and it would have to be done in a way that doesn't degrade the experience for people who aren't using a proper "project" setup.

One of the big selling points of Python is that you can just create a .py file anywhere, willy-nilly, and execute the code with a Python interpreter, just as you would with e.g. a Bash script. And that you can incrementally build up from there, as you start out learning programming, to get a sense of importing files, and then creating meaningful "projects", and then thinking about packaging and distribution.

9dev

18 hours ago

And how is that different from any other interpreted language? Node and PHP handle this just fine, and they don't need a Rube Goldberg contraption to load dependencies from a relative directory or the system's library path. I really don't get why Python people act like that's some kind of wicked witchcraft?

t43562

8 hours ago

Python's just working like a normal unix program. Some people like that because they can reason about it the way they reason about any other utility so it has advantages when using python as a scripting language - which is what it was invented as. AI/ML/ASGI/blablahblah are just specific applications with problems that seem overwhelmingly important to their users.

oblio

8 hours ago

Is this an AI generated comment?

Node or PHP also work like normal Unix programs...

t43562

7 hours ago

PHP and node were not developed as general purpose scripting languages for use at the commandline and are very commonly used for specific purposes so there's no need for them to be like python.

I wonder what good you think insults do? I could insult your use of English for example but would that make my argument better?

oblio

7 hours ago

Please do insult my use of English, it's a learning experience and it might be the same for others reading these comments.

> PHP and node were not developed as general purpose scripting languages for use at the commandline and are very commonly used for specific purposes so there's no need for them to be like python.

Perl, Ruby, Lua, I can keep going. You're just nitpicking. Of all the programming languages I've used, practically only Python uses the venv approach [1].

[1] Manual activation needed, a non-portable pile of dependencies per project (by design; fairly sure the documentation mentions this is not a supported use case, even across quasi-identical machines!!!), etc. I stand by my decision to call venvs a "fractal of bad design".

t43562

3 hours ago

There are 5000 programs in my /usr/bin. They take environment settings from the current environment, which means you can set a variable on the commandline for that process without finding that it is ineffectual because it has been overridden by some configuration file somewhere or by some weird "automatic" behaviour. "Clever" tools end up being a pain in the arse when you have to second-guess them and work around their behaviour.

As things are, I can share a venv with several projects, or have one for my account if I don't want to break the system tools. I can even write a trivial bash function to activate a venv automatically based on whether a venv exists in my project. It's so trivial to do and yet generates all this discussion.
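For instance, a minimal sketch of such a function (an assumption on my part: bash and the conventional ./.venv location):

    cd() {
        builtin cd "$@" || return
        # auto-activate a venv if the target directory has one
        [ -f .venv/bin/activate ] && source .venv/bin/activate
    }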

As for non-portability, that's the most specious and pathetic argument of the lot. Who can bother to make every library portable to every distro? What is the point of distros if one demands that one can run the same binaries everywhere? This is what containers were invented for. If you need the latest versions you might just, heaven forbid, have to compile something and fight with the problems of running it on a distro that it wasn't developed on.

9dev

an hour ago

I don't understand what you're talking about. Literally all common interpreted languages adhere to Unix standards: they take arguments, use environment variables, handle input and output streams, and load libraries.

Their single advantage over Python is that they are able to work fine without virtual environments, as they just load libraries from a relative path: That way, you can copy-paste a project directory, move it to another system with a copy of the interpreter binary, and… run the software. There is nothing clever about that; I’d even say Python's way of magically configuring a shell just to be able to locate files is the "clever" solution that nobody asked for!

oblio

3 hours ago

Portable as in "you can't copy a virtual env to an equivalent machine to deploy it". Look up the documentation, it's not recommended because of the hardcoded absolute paths or something like that :-|

Python venvs literally f*ed up the simplest form of deployment on the planet, scp. Yes, we have more complex solutions like Docker, another abomination (the software itself). Docker was invented in big part due to Python (not only, but it was a big factor).

Again, I use venvs. They're ok. But they're a stupid semi abstraction.

https://xkcd.com/1987/

1718627440

18 hours ago

There are path configuration files (*.pth) and you can configure sys.path in the script itself?

zahlman

17 hours ago

Yes, and in principle you can install each package into a separate folder (see the `--target` option for pip) and configure sys.path manually like that.
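A rough sketch of that manual approach (the `deps` folder name is arbitrary):

    $ pip install --target=deps requests

    # then, at the top of the script:
    import os, sys
    sys.path.insert(0, os.path.join(os.path.dirname(__file__), "deps"))
    import requests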

For .pth files to work, they have to be in a place where the standard library `site` module will look. You can add your own logic to `sitecustomize.py` and/or `usercustomize.py` but then you're really no better off vs. writing the sys.path manipulation logic.

Many years ago, the virtual environment model was considered saner, for whatever reasons. (I've actually heard people cite performance considerations from having an overly long `sys.path`, but I really doubt that matters.) And it's stuck.

whywhywhywhy

19 hours ago

The only word in the `source .venv/bin/activate` command that isn't a complete red flag that this was the wrong approach is probably bin. Everything else is so obviously wrong.

source - why are we using an OS level command to activate a programming language's environment

.venv - why is this hidden anyway, doesn't that just make it more confusing for people coming to the language

activate - why is this the most generic name possible as if no other element in a system might need to be called the activate command over something as far down the chain as a python environment

Feels dirty every time I've had to type it out, and I find it particularly annoying when Python is pushed so much as a good first language and I see people paid at a senior level not understand this command.

zahlman

19 hours ago

> why are we using an OS level command to activate a programming language's environment

Because "activating an environment" means setting environment variables in the parent process (the shell that you use to run the command), which is otherwise impossible on Linux (see for example https://stackoverflow.com/questions/6943208).

> why is this hidden anyway, doesn't that just make it more confusing for people coming to the language

It doesn't have to be. You can call it anything you want, hidden or not, and you can put it anywhere in the filesystem. It so happens that many people adopted this convention because they liked having the venv in that location and hidden; and uv gives such venvs special handling (discovering and using them by default).

> why is this the most generic name possible as if no other element in a system might need to be called the activate command over something as far down the chain as a python environment

Because the entire point is that, when you need to activate the environment, the folder in question is not on the path (the purpose of the script is to put it on the path!).

If activating virtual environments shadows e.g. /usr/bin/activate on your system (because the added path will be earlier in $PATH), you can still access that with a full absolute path; or you can forgo activation and do things like `.venv/bin/python -m foo`, `.venv/bin/my-program-wrapper`, etc.

> Feels dirty every time I've had to type it out

I use this:

  $ type activate-local 
  activate-local is aliased to `source .local/.venv/bin/activate'

Notice that, again, you don't have to put it at `.venv`. I use a .local folder to store notes that I don't want to publish in my repo nor mention in my project's .gitignore; it in turn has

  $ cat .local/.gitignore 
  # Anything found in this subdirectory will be ignored by Git.
  # This is a convenient place to put unversioned files relevant to your
  # working copy, without leaving any trace in the commit history.
  *

> and I see people paid at a senior level not understand this command.

If you know anyone who's hiring....

whywhywhywhy

17 hours ago

Fair response; it's just that nothing else feels like this weird duct-taped-together bunch of hacks to work around the design mistake of the base language assuming it's a top-level part of the OS.

> which is otherwise impossible on Linux

Node, Rust, etc all manage it.

> Because the entire point is that...

I just mean there is a history of Python using overly generic naming: activate, easy-install. It just feels weird and dirty to me that you'd give such specific things names like these, and I think it's indicative of this ideology that Python is deep in the OS.

Maybe if I'd aliased the activate command a decade ago I wouldn't feel this way or think about it.

zahlman

16 hours ago

> Node, Rust, etc all manage it.

  $ (bash -c 'export foo=bar && echo $foo')
  bar
  $ echo $foo

  $

How do they work around this?

baq

11 hours ago

They don’t use environment variables. See also git.

oblio

8 hours ago

You're arguing an awful lot in favor of Python venvs for someone who doesn't really seem to know any other programming language ecosystems in depth.

Similar mindset to the original creators of venv, I imagine :-)

zelphirkalt

2 hours ago

Or you don't realize the difference between something like "cargo run" and "python file.py".

j45

18 hours ago

Maybe it's just me, but it shouldn't be necessary to manage this and a few other things to get a python script working.

uv has increased my usage of python for production purposes because it's maintainable by a larger group of people, and beginners can become competent that much quicker.

t43562

8 hours ago

One could say... why do people not bother to learn the shell, or how programs get environment settings... or how to write a shell function to run activate for themselves, or how to create a tiny makefile which would do all of this for them?

Surely the effort of programming the actual code is so significant that starting a tool is a minor issue?

SAI_Peregrinus

2 hours ago

Because the effort of programming the actual code is often less than figuring out how to glue together a dozen different inane tools with inexplicably different usage conventions. KISS always matters, and UV keeps it very simple. Small is often the opposite of simple.

zelphirkalt

2 hours ago

By requiring special tooling to build and run your program. Namely cargo, npm, etc, while in Python these are a bit more separate concerns.

cluckindan

9 hours ago

Node.js does, if you use fnm or nvm.

j45

18 hours ago

The venv thing def stands out to me as being a bit of an outlier.

If uv makes it invisible it is a step forward.

roflyear

19 hours ago

what happens when you have two projects using different versions of node, etc? isn't that a massive headache?

not that it's great to start with, but it does happen, no?

cluckindan

9 hours ago

You create a .node-version file and use fnm or nvm, and presto, when you cd into a project dir, the corresponding node version is activated.

Installing a particular node version also becomes as easy as

    fnm install 24

oblio

19 hours ago

The rest of the world handles that through PATH/PATH equivalent.

Either the package manager is invoked with a different PATH (one that contains the desired Node/Java/whatever version as a higher priority item than any other version on the system).

Or the package manager itself has some way to figure that out through its config file.

Or there is a package manager launch tool, just like pyenv or whatever, which does that for you.

In practice it's not that a big of a deal, even for Maven, a tool created 21 years ago. As the average software dev you figure that stuff out a few weeks into using the tool, maybe you get burnt a few times early on for misconfiguring it and then you're on autopilot for the rest of your career.

Wait till you hear about Java's CLASSPATH and the idea of having a SINGLE, UNIFIED package dependency repo on your system, with no need for per-project dependency repos (node_modules), symlinks, or all of that stupidity.

CLASSPATH was introduced by Java in 1996, I think, and popularized for Java dependency management in 2004.

dragonwriter

18 hours ago

> The rest of the world handles that through PATH/PATH equivalent.

Activating a venv is just setting a few environment variables, including PATH, and storing the old values so that you can put them back to deactivate the environment.

oblio

5 hours ago

It's also putting dependencies in a certain folder in a setup that isn't even portable between machines. Probably some other stuff I'm forgetting.

1718627440

18 hours ago

Well, that is how Python does it as well; a venv's activate script is just a script that sets PATH (and a couple of other variables).

ederamen

15 hours ago

Uv is so good. I'm a curmudgeon about adopting new tooling, and tried uv with a lot of skepticism, but it was just better in every way. And even if it wasn't so polished and reliable, the raw speed makes it hard to go back to any other tool.

Uv combined with type hints reaching critical mass in the Python ecosystem, and how solid PyLance is in VSCode, feels so good it has made me consider investing in Python as my primary language for everything. But then I remember that Python is dog slow compared to other languages with comparable ergonomics and first-class support for static typing, and...idk it's a tough sell.

I know the performance meta in Python is to...not use python (bind to C, Rust, JVM) - and you can get pretty far with that (see: uv), but I'd rather spend my limited time building expertise in a language that isn't constantly hemorrhaging resources unless your code secretly calls something written in another language :/

There are so many good language options available today that compete. Python has become dominant in certain domains though, so you might not have a choice - which makes me grateful for these big steps forward in improving the tooling and ecosystem.

liampulles

6 hours ago

I think Python has a place in many developers toolkits. I've never met anyone who hates Python (though I'm sure they exist), whereas for pretty much any other language one could mention there are much more polarizing viewpoints. (as the saying goes "Python is everyone's second favorite programming language").

The Python team need not feel any pressure to change to compete; Python has already done quite well and found its niche.

bashkiddie

5 hours ago

*raises hand* I hate Python.

I am a user of pip binaries. Every few years one of them breaks.

As far as I understand, developers never cared about pinning their dependencies and python is fast to deprecate stuff.

  $ uvx remt
      Built pygobject==3.54.5
      Built remt==0.11.0
      Built pycairo==1.28.0
  Installed 12 packages in 9ms
  Traceback (most recent call last):
    File "/home/user/.cache/uv/archive-v0/BLXjdwASU_oMB-R4bIMnQ/bin/remt", line 27, in <module>
      import remt
    File "/home/user/.cache/uv/archive-v0/BLXjdwASU_oMB-R4bIMnQ/lib/python3.13/site-packages/remt/__init__.py", line 20, in <module>
      import pkg_resources
  ModuleNotFoundError: No module named 'pkg_resources'

  $ uvx maybe
  × Failed to build `blessings==1.6`
  ├─▶ The build backend returned an error
  ╰─▶ Call to `setuptools.build_meta:__legacy__.build_wheel` failed (exit status:
      1)

      [stderr]
      /home/user/.cache/uv/builds-v0/.tmpsdhgNf/lib/python3.13/site-packages/setuptools/_distutils/dist.py:289:
      UserWarning: Unknown distribution option: 'tests_require'
        warnings.warn(msg)
      /home/user/.cache/uv/builds-v0/.tmpsdhgNf/lib/python3.13/site-packages/setuptools/_distutils/dist.py:289:
      UserWarning: Unknown distribution option: 'test_suite'
        warnings.warn(msg)
      error in blessings setup command: use_2to3 is invalid.

      hint: This usually indicates a problem with the package or the build
      environment.
  help: `blessings` (v1.6) was included because `maybe` (v0.4.0) depends on
        `blessings==1.6`

I heard rumors from computer vision developers that even libraries deprecate that fast.

liampulles

5 hours ago

Be careful of attributing to python what is really the fault of python lib developers.

Having said that, our team is having to do a bunch of work to move to a new python version for our AWS serverless stuff, which is not something I'd have to worry about with Go (for example). So I agree, there is a problem here.

Sesse__

5 hours ago

> Be careful of attributing to python what is really the fault of python lib developers.

If so, you also cannot attribute to Python the virtues of Python lib developers either (in particular, a large library ecosystem).

liampulles

4 hours ago

Yip. What you are talking about is the language ecosystem.

intrasight

4 hours ago

I've been mostly not using Python since its arrival in the early 90s. Meaning I tinker with it but don't use it professionally.

beeforpork

4 hours ago

Well, I hate Python. Main points for me: the scoping (or lack of it), the lack of declaration of new identifiers (linked to the lack of scoping), the lack of update operators (like `a ||= (i > 0)`), the general lack of conciseness, the `a if b else c` confusion, the weirdness of comprehensions (particularly nested `for`s), the exceptions that are raised for non-exceptional cases. The heaviness of handling exceptions, like trying to delete a non-existing dict entry (need an additional `if` or a `try` block) or accessing a non-existing map entry (need to use `.get()` instead of `[]` or a `try` block).

Also, the syntax of layout is bad. I am not talking about layout itself, I do like Haskell syntax (despite being weird about parens). But if I write `a = b +` in Python on one line, then I get a syntax error, although the parser could instead assume that the expression is not terminated and must (obviously) continue on the next (indented) line. I hate that I need to use `\` or `(...)` to make this clear to the parser. I wrote parsers myself, and I know that it knows what needs to follow, and Python itself shows me that it knows: by raising a completely unnecessary syntax error.
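To make a couple of those complaints concrete, two tiny illustrative snippets:

    d = {"a": 1}
    d.pop("b", None)    # delete-if-present; plain `del d["b"]` raises KeyError
    v = d.get("b", 0)   # read-with-default; plain `d["b"]` raises KeyError

    a = (1 +
         2)             # parentheses (or a trailing backslash) required;
                        # a bare `a = 1 +` line is a SyntaxError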

It feels to me that the Python language design confuses `simple and easy` with `primitive`. It feels like a design made without knowledge of programming language research and ergonomics. It feels to me like a dangerous toy language, and I am never sure which of my stupid mistakes will be found by the compiler/interpreter, and which will just be silently misinterpreted. And which of my perfectly valid requests will be rejected with an exception. In some aspects it feels less safe than C, particularly due to the lack of scoping and the danger of reuse of variables, or the introduction of new function-local variables when actually outer-'scope' variables were intended to be written.

This is not really meant as a rant, but it is a personal opinion, and I try to lay out my reasons. I am not trying to shame anyone who loves Python, but I just want to clarify that there are people who hate Python.

RamtinJ95

5 hours ago

I hate Python and I use it every day because I work in the data space. It's a toy scripting/glue language that has been used for far too much that it was not designed for. The usual suspects are also really annoying, such as whitespace instead of {}, no types, it's so damn slow, and all projects use so many packages and the packages use packages etc. That last one could just be a personal preference thing, I will admit, but the rest are almost objectively bad. Especially when building infrastructure like a data platform.

mrugge

4 hours ago

Usually packages use packages in any worthwhile language with useful packages and a desire for code reuse...

wiseowise

4 hours ago

It's also funny to hear this about Python which, arguably, has the biggest stdlib and the most batteries included out of the box.

dlisboa

4 hours ago

Python's stdlib is not really something to brag about. Very inconsistent, multiple ways of doing the same thing, a lot of stuff in there that people recommend against actually using... so much cruft acquired over decades.

It's why projects use so many packages in the first place.

cyberpunk

6 hours ago

*raises hand* I hate Python.

Every Python codebase I've had to look after has rotted to the point it doesn't even build, and is a maintenance nightmare.

I also hate whitespace instead of {}

muyuu

5 hours ago

i resist saying that i hate python because that implies that i don't hate aspects of basically all alternatives (or all that are popular anyway)

like with everything else these days, it's about living with it and try to make the best of the good parts of it

i remember getting told in the 00s that i would get used to and love the whitespace-based block definition, and boy i hate it now more than ever with 1000s of hours spent looking at and coding in python

but it is what it is; for whatever reason it has become a must in many industries, a lot like Java took over some earlier on (although it seems to be fading), and javascript is a must in others

it really isn't just programming languages: these days you either learn to live with some massive annoyances and practices you may hate, or withdraw entirely from society

cyberpunk

4 hours ago

Oh, don't get me wrong, I don't refuse to work with it or anything so extreme. These days I end up writing code in any of a handful of languages.

Given the choice though, I typically don't reach for Python unless there's an obvious reason to (some good library for my task, or a team I am helping is only Python people / devops, etc.)

liampulles

5 hours ago

These issues are true of most legacy codebases I've worked on in other languages, but I think language design can be a factor here. Do you have any thoughts on if and how Python has led to this rot?

jiggawatts

5 hours ago

Another comment has explained this already, but the lack of dependency pinning and the general "it's just a script" attitude isn't conducive to long-term stability. During the early days of the llama local LLM runner, I was shocked to discover that releases weren't buildable mere days after being tagged in Git! Days!

Other platforms like Java and .NET enjoy one to two decades of life for source before it becomes mildly challenging to build.

wiseowise

4 hours ago

> Other platforms like Java and .NET enjoy one to two decades of life for source before it becomes mildly challenging to build.

Java enjoys months of life for source before it becomes impossible to build, because some prehistoric version of Gradle with shitty Groovy script stopped working.

liampulles

5 hours ago

I think that is a totally fair criticism. There is something to be said about a language being too easy to hack something together with; it's something that makes backend JavaScript coding a long-term pain as well, I think.

TheFlyingFish

6 hours ago

uv helps a lot with the first problem. Not so much with the second, though.

miki123211

10 hours ago

I wish we had a language that had the syntax of Python (notably including operator overloading, which is absolutely critical for neural networks, ML, data science and numerical computations), the performance, compile times and concurrency support of Go, the type system flexibility of Typescript, and the native platform integration of C/C++.

AyanamiKaine

9 hours ago

Then you would probably like the Nim[1] programming language. It has the syntax of Python, but transpiles to C/C++. A good type system. The main problem would probably be the compile times, because you're basically compiling generated C/C++ code. And of course the ecosystem is much, much smaller than Python's.

[1] https://nim-lang.org/

cb321

5 hours ago

Nim compile+link times can be sub-100 milliseconds with tcc [1] (just saw 93 ms on my laptop) which can yield pretty reasonable edit-compile-test cycles.

The grandparent's scientific/Go interests suggest a need for a large, working ecosystem, and there are probably several places in ArrayMancer which need some love for "fallback cases" to work with tcc (or elsewhere in the, as you note, much smaller Nim ecosystem).

EDIT: E.g., for ArrayMancer you need to make stb_image.h work with tcc by adding a nim.cfg/config.nims `passC="-DSTBIW_NO_SIMD -DSTBI_NO_SIMD"` directive, though. And, of course, to optimize compile times, just generally speaking, you always want to import only exactly what you really need which absolutely takes more time/thought.

[1] https://github.com/TinyCC/tinycc

freethinky

4 hours ago

I agree with this sentiment.

I try to learn the basics of a new programming language regularly and write a small Lisp-like interpreter in it, giving myself a maximum of 2 days working on it. It covers things like string handling, regexp, recursion, lambdas, garbage collection, ... and I run them through a tiny test suite.

In Python and JS, it was easy to do and the code was still very readable. In C++, the language I earn my money from, I had a bug I was not able to fix within the given time frame, happening only with gcc, not clang, presumably due to some undefined behavior. In C, I was able to add my own garbage collector with much less work than I expected ... but

Nim really impressed me; it really felt almost like writing Python, but it produced an executable which ran on its own and was quite a bit faster.

I work mostly in the embedded world, where the ecosystem matters somewhat less. If any employer ever gave me a chance to choose a language myself, I would definitely try to write a first prototype in Nim.

k__

7 hours ago

I think, Nim is probably the best fit.

It also compiles to JS.

thayne

9 hours ago

Rust doesn't quite hit all of those, but it hits a lot of them.

Its syntax is significantly different from Python's, but it does have operator overloading.

Its performance is comparable to Go's, and it has good concurrency support, although it is different from Go's, and there are still some rough edges with "async" code. Compile times aren't as good as Go's though.

The type system is excellent, although I'm not really sure what you mean by "flexible".

And FFI support is great.

throwup238

9 hours ago

Rust’s compile times are crippling and its type system is easily one of the most rigid of all type systems (lifetimes are part of the type!). The latter is one of Rust’s main selling points because it allows encoding business rules into affine types, but that’s very very far from flexible especially when compared to Typescript (or Python or Haskell and their many ways of polymorphism). Traits add an orthogonal axis of flexibility but they’re still limited by lifetimes (see async_trait and generic associated types and specialization).

“Flexible” means the range from gradual typing (‘any’) to Turing complete conditional types that can do stuff like string parsing (for better or for worse). Structural typing vs instanceof and so on.

There’s really no comparison between Typescript’s type system and Rust’s. It’s worth noting though that Typescript is a bolted on typesystem that has explicitly traded soundness for flexibility. That’s the real tradeoff between Rust and TS IMHO. Rust is sound and expressive but not flexible, while Typescript is expressive and flexible but not sound.

friendzis

8 hours ago

> “Flexible” means the range from gradual typing (‘any’) to Turing complete conditional types that can do stuff like string parsing (for better or for worse).

So the flexibility means one gets to pretend they are doing typing, but in reality they get to sprinkle the code with void casts, because expressing ideas is apparently hard? For better or worse, that is probably the main pillar Rust is designed on.

foresterre

7 hours ago

On compile times for Typescript: when projects have large unioned or conditional types, Typescript's compilation isn't all that fast either, and is sometimes even slower than Rust's, especially when Rust is compiling incrementally (I write both Rust and Typescript extensively).

Worse, Typescript may even run out of its allocated memory sometimes.

jmaker

8 hours ago

Also, some prominent projects migrated away from TypeScript to JSDoc type comments due to the transpile times in TypeScript. The type-checking task takes more time the more complex the type-level expressions are. Haskell can also take a long time to compile if you turn on a few extensions and move toward dependent types.

Rust compiles fast if your translation units don’t need too much macro expansion. You add something like Diesel, and you can call for the lunch break.

It's also worth mentioning Scala with Scala Native, and maybe Kotlin with Kotlin/Native. OpenJDK Project Panama FFM now gives a better FFI experience than JNI.

NeutralCrane

4 hours ago

I feel like if Python’s lack of typing is going to be considered a drawback, then the solution needs to be true typing. Rust’s strict typing is an advantage, not a drawback.

littlestymaar

7 hours ago

> Rust’s compile times are crippling

It's kind of a meme here on HN but while Rust compilation times are indeed higher than I wished they were, calling them “crippling” is a massive exaggeration.

My daily driver is a mid-range desktop computer from 2018 and I regularly code on a laptop from 2012, and even then it's completely manageable: cargo check is nigh instant on both, incremental compilation in debug mode is more than fast enough, and even incremental rebuilds in release mode and full debug builds are OK-ish (especially since they don't happen as often as the others above). Only full builds in release mode are arguably slow on the 2012 laptop (though on the project where it's the biggest problem, the majority of the time is spent compiling a C++ dependency), but then again it's a very obsolete piece of hardware and this isn't supposed to happen more than every six weeks, when you update your compiler toolchain.

throwup238

2 hours ago

I welcome you to try to work on a cxx-qt project to understand what crippling compile times look like.

I’m not memeing here, I’ve struggled with this issue on a variety of different projects since I first started using Rust seven years ago.

ozgrakkurt

7 hours ago

This really depends on what you are working on. As an example, compiling the protobuf stuff can get insanely slow for some reason.

ViewTrick1002

6 hours ago

For any larger project I would recommend working with a cargo workspace to compile as little as possible on each check/test/run/build.

Then you can build a DAG of crates and stick e.g. the Protobuf stuff in its own separate corner where it only needs to be recompiled on full rebuilds or when you work on it.

Feels a bit shitty to have to resort to managing crates instead of modules simply due to the compile times, but it is what it is.

https://doc.rust-lang.org/cargo/reference/workspaces.html

jmaker

6 hours ago

Yeah, second that. This is just release management best practice.

And it makes total sense to me, it’s a way of organizing your dependency graph by the lifetimes of your components.

This will also simplify testing, development, and audit. You won’t need to recompile autogen schemas as often as the business logic implementation anyway. Depending on artifacts pushed through a release pipeline is even more reliable. You can QA everything in guaranteed isolation while keeping it conveniently a workspace monorepo.

lambdaone

5 hours ago

If only someone had a way of automating this! Say perhaps using some kind of digital computer...

littlestymaar

2 hours ago

I wish rustc could automatically build a DAG of “smallest group of modules with cyclic references” and use that as a compilation unit, but that's unfortunately not the case so far and it won't until someone steps up to build such a system.

raverbashing

7 hours ago

Yeah

Go feels like C with training wheels.

Rust feels like riding a bike where one leg pedals the front wheel and another one pedals the back wheel, and you have one handlebar for each wheel as well and a very smart alarm system but it is very clunky to ride (and they tell you it's "flexible")

u8080

5 hours ago

Pretty normal bike for those who are used to C++ bikes.

raverbashing

5 hours ago

Yes, you pedal the C++ bike by squeezing the brake handles and brake by pedaling backwards

hexo

7 hours ago

rust hits none.

antocuni

8 hours ago

This looks exactly like what I'm trying to do with SPy -- although SPy is still "not there" and it's a WIP. I literally wrote an intro post about it yesterday: https://antocuni.eu/2025/10/29/inside-spy-part-1-motivations...

TheFlyingFish

6 hours ago

SPy looks really interesting! I've run across projects like MyPyc before, but as you say they kill a lot of the "joy" of Python by removing things like decorators and operator overloading.

One question, more out of curiosity than any genuine need: do you (or do you plan to) support any kind of trait/interface based polymorphism? E.g. it's a pretty common idiom to have a function that works on "any iterable type" and that sort of thing, which seems like it would be best modeled as an interface. I guess you could argue that's at odds with Python's tradition of "duck typing" but then again, so is static typing in general so I'm not sure.

antocuni

4 hours ago

I haven't fully decided which route to take in that direction, but I think I'd like to support something similar to Go: you declare an interface (or, in Python typing terms, a Protocol) and then all the types which have the right methods automatically implement that interface/protocol.
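For reference, Python's existing typing.Protocol already behaves structurally like that under a type checker (a minimal sketch with made-up names):

    from typing import Protocol

    class Greeter(Protocol):
        def greet(self) -> str: ...

    class English:               # no explicit inheritance from Greeter
        def greet(self) -> str:
            return "hello"

    def welcome(g: Greeter) -> str:
        return g.greet()

    print(welcome(English()))    # type-checks: English matches structurally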

lejalv

7 hours ago

This is an enlightening post and I am very excited about SPy.

I only wish discussions happened elsewhere than Discord, e.g. Zulip, where you can have web-public channels, which is great for searchable documentation, and you can interact with channels by email if you so desire.

antocuni

4 hours ago

thank you for the feedback! I wasn't aware of these features of Zulip; there are way too many competing technologies in this space and I miss the days in which everybody was on IRC and that was it.

cogman10

4 hours ago

Kotlin on a modern JVM comes pretty close.

* Performance - the JVM is very competitive with C/C++ performance.

* Compile times - Not go fast, but not C/C++/Rust slow.

* Concurrency - Virtual threads (finalized in 21) bring in the concurrency capabilities of go to the JVM

* Type System Flexibility - Kotlin isn't quite as flexible as Typescript, but it's pretty close. It's more flexible than java but not as flexible as scala. I think it strikes a good middle ground.

* Native platform integration - This is probably the weakest part of the JVM but it's gotten a lot better with the work done on Project Panama (mostly delivered in 22). Jextract makes it a lot easier to make native integrations with Java. Definitely not as painful as the JNI days.

There's also kotlin native that you could play around with (I haven't).

CraigJPerry

7 hours ago

Mojo is aiming at that. I've decided it's this year's Advent of Code language for me and I'm kinda looking forward to learning more about it.

wjholden

8 hours ago

I think Julia largely accomplishes these goals except for the platform integration.

jakobnissen

7 hours ago

It also has poor tooling when compared to Python. Julia's package manager is good, and so are its tools for performance optimisation, but for type checking, app/cli creation, semver checking and IDE integration, the tooling is quite bad. Also, the compile times are shit, and the type system makes it very hard to build a type checker in the first place.

leephillips

3 hours ago

All except maybe the compile times. I read the beginning of the article about the features of Uv; Julia has all that built in.

quaintdev

7 hours ago

I feel so much more comfortable with Go's type system than Typescript's.

callamdelaney

7 hours ago

Why would operator overloading be absolutely critical? Just use methods. The pipe operator already has around 15 uses and counting, it doesn’t need more

saagarjha

7 hours ago

Because writing a.add(b).divide(c) is miserable

CraigJPerry

7 hours ago

That's a fine approach for "plumbing" type work, you know, "join this thing to that thing then call that thing" - and that is most of the code in the world today - but it falls apart in math-heavy code.

You really just want operators when you're performing tons of operations; it's an absolute wall of text when it's all method calls.

ogogmad

4 hours ago

Instead of writing 2+2, you're suggesting writing 2.plus(2) or plus(2,2).

yard2010

8 hours ago

Also, case matching of Scala <3

movpasd

7 hours ago

You may be interested in Mojo, it's a project by Chris Lattner. It aims to have Python-like syntax and smooth integration with Python but allow Rust-like low-level control (I believe it has a borrow checker). Unfortunately, I believe it's proprietary.

rewgs

8 hours ago

There's Mojo, but it's been a while since I've heard anything about it.

https://www.modular.com/mojo

pansa2

8 hours ago

There’s been a lot of hype around Mojo, but is anyone actually using it?

Does it deliver on the bold claims of its designers?

DeathArrow

5 hours ago

Is it ready to use? Does it have libraries, tools and documentation? Is it usable for anything else beside AI? Does it work on all major platforms?

teiferer

9 hours ago

> But then I remember that Python is dog slow compared to other languages with comparable ergonomics and first-class support for static typing, and...idk it's a tough sell.

Case in point: uv itself is not written in Python. It's a Rust tool.

It always amazes me when people work on an ecosystem for a language but then don't buy enough into that to actually use it to do the work.

Avoidance of dogfooding is a big red flag to me.

wiseowise

9 hours ago

There’s this thing where you work to requirements instead of picking things on vibes, it’s called engineering.

alex_duf

7 hours ago

I understand the argument but the language used for uv (rust) and python don't have the same goal.

Python aims to be simple, not particularly fast (though it is getting faster)

I don't see a problem with that. Pick the language adapted to your problem. Python isn't aiming at solving every problem and that's okay.

pansa2

7 hours ago

> Python aims to be simple

Well, it wildly missed the mark there. Nothing about modern Python is simple. It's a very complex language hiding behind friendly syntax.

whatevaa

7 hours ago

Python performance is poor. Python achieves performance by having as little python code as possible. The runtime and all performant libraries are not in python.

It's ok for IO bound but not for CPU bound work.

rk06

2 hours ago

Interpreted languages are not the best choice for a variety of software. For example, a vast amount of the performance issues in the TypeScript CLI boil down to it being written in TypeScript, and the team is porting it to Go now.

Tools, specifically CLI tools, are best written in statically typed, compiled languages.

Epa095

8 hours ago

And that's how languages start optimizing towards being a better language to write compilers in ;-)

It's completely fair for a language to have a niche different from 'quick start-up and runtime'.

PeterStuer

7 hours ago

Different tools for different use cases.

Would you write an assembler IDE in assembler?

I use Python for >90% of my code these days. I love uv for its ux and speed. I 0% care it wasn't written in Python. In fact, making it fully independent of any existing Python environment is a plus for the Python user. No weird bootstrap problems.

It does not make me switch to Rust for my own code.

heinrichhartman

7 hours ago

Well, the Python language should be seen as a UI layer for C/C++. It's not out of character to use Rust for the "heavy lifting".

AdamN

7 hours ago

That's like every language. The language you work in (typically) has all sorts of libraries that are lower level. And even if not, it's not like your code is 'running'; it's compiled to lower-level code that actually runs, and the compilers that do that work are usually a black box unless you're a compiler engineer.

The power of Python is that it's high level and very powerful and has a great community and ecosystem of tools/libraries. There's absolutely zero problem and totally a good thing if there are core libraries written in faster languages.

jmaker

5 hours ago

In a similar vein, the Microsoft TypeScript team picked Go for the new TypeScript transpiler engine.

wiz21c

7 hours ago

the python interpreter is not written in python :-)

KeplerBoy

6 hours ago

It can be, though, and PyPy is.

amelius

6 hours ago

My main question is: how good is it when it breaks? Because with most build/package tools that's when the misery starts.

Jaxkr

15 hours ago

On performance: 3.13 removed the GIL and added experimental first-party JIT (like PyPy).

In two years I bet we’ll be seeing v8 level performance out of CPython.

pansa2

15 hours ago

The “Faster CPython” team were let go from Microsoft because they could only produce a 1.5x speedup in four years instead of the planned 5x.

It’s wildly optimistic to now expect a 10x speedup in two years, with fewer resources.

EmilStenstrom

7 hours ago

Them being let go "because they didn't meet 5x" is hearsay. The only source for that is in social-media commentary and opinion pieces. Microsoft described the layoffs as "organizational adjustments".

danielscrubs

10 hours ago

Wow, now you make me curious about the business processes at Microsoft. Did they see that they would earn more money if the interpreter had a 5x speedup, money that they wouldn't see with 1.5x? Or was trust broken?

eptcyka

10 hours ago

Instead of generating more revenue, it would drive down costs. You will need less computers to do the same amount of work if the work can be done faster.

Someone

7 hours ago

Lower costs could open new markets, as it would allow you to charge less and still make a profit.

y1n0

14 hours ago

Depends if they are the right resources.

fmbb

10 hours ago

Depends if it’s possible.

spooky_deep

9 hours ago

Python is slow due to design decisions in the language. For example, operator dispatch is slow without some kind of static analysis, and static analysis is hindered by how dynamic the language is.
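A small sketch of what that dispatch looks like: every `+` is resolved at runtime through the operand types' special methods, which is hard to optimize away without knowing the types statically:

    # CPython evaluates `a + b` (roughly) by trying type(a).__add__(a, b) first
    # and falling back to type(b).__radd__(b, a) if that returns NotImplemented.
    class Meters:
        def __init__(self, v):
            self.v = v
        def __add__(self, other):
            if isinstance(other, Meters):
                return Meters(self.v + other.v)
            return NotImplemented

    print((Meters(2) + Meters(3)).v)  # 5, dispatched at runtime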

ogogmad

4 hours ago

It's hard to make Python run fast when it pervasively uses duck typing. It makes types only resolvable at runtime. JIT is the only thing that can work here at the moment, but I think that needs to make very similar assumptions to a branch predictor, plus it needs to identify lexical regions (is that what they're called?). People here have criticised PyPy, but I've forgotten why.

heavyset_go

14 hours ago

I'd be surprised if we saw anything more than the 4x speedup from compiling Python with something like Nuitka/mypyc/etc can bring.

I also believe the JIT in v8 and Python are different, the latter relying on copy-and-patch while v8 uses a bunch of different techniques together.

rslashuser

14 hours ago

I would be surprised to see performance as good as V8's, although that would be great. As I recall, the V8 team performed exceptionally well in a corporate environment that badly wanted JS performance to improve, and maybe inherited some HotSpot people at the right time.

I'd be quite delighted to see, say, 2x Python performance vs. 3.12. The JIT work has potential, but thus far little has come of it; in fairness, it's still early days for the JIT. The funding is tiny compared to V8's. I'm surprised someone at Google, OpenAI et al. isn't sending a little more money that way. Talk about shared infrastructure!

t43562

9 hours ago

pypy is probably faster. Let's put effort into that. BUT the dynamic features that make Python lovely are always going to limit its performance.

If you're using python because you have to then you might not like all that and might see it as something to toss out. This makes me sad.

CamouflagedKiwi

4 hours ago

It didn't "remove the GIL". It added an experimental free-threading mode which removes it, but is still considered experimental and not widely used in production yet.

ShroudedNight

13 hours ago

Has something changed that allows a more relaxed refcounting / less eager "gc"? Py_DECREF was what murdered any hope of performance back when we hooked up 3.3 to OMR... Well that and the complete opacity of everything implemented in C

motoboi

15 hours ago

I bet we'll be seeing Python compiled to the JVM, getting JVM levels of performance. Much better than v8.

necovek

10 hours ago

There have for a long time been IronPython (CLR) and Jython (JVM).

But, they don't have the full compatibility with CPython, so nobody really picks them up.

CamouflagedKiwi

4 hours ago

Jython seems to be effectively dead though - it only has 2.7 compatibility.

animuchan

9 hours ago

JVM Python has existed for the longest time now, where "exists" is purely technical. It's very cursed and bad, keeping in line with the rest of the Java-adjacent stack.

homebrewer

6 hours ago

Yet this "Java-adjacent stack" wipes the floor with Python and its ilk w.r.t performance and is what's actually running the world outside of some silicon valley ephemeral unicorns.

rjzzleep

11 hours ago

Am I the only one that's sad that poetry happened before pdm? Otherwise we might have had pdm as a standard instead of uv, addressing many of the things uv addresses without all the extra bells and whistles that make it cumbersome. I don't like the wedding between the package manager and the install manager.

... but then again, neither pdm nor uv would have happened without poetry.

pjc50

5 hours ago

What is the distinction between "package manager and install manager"?

x187463

4 hours ago

One installs Python packages into a Python installation and the other manages Python installations.

testdelacc1

10 hours ago

How do extra bells and whistles bother you? You had the option to not use them. Like you said yourself, they’re “extra”.

miki123211

10 hours ago

I think in Python specifically, an install manager is absolutely the right call. There's far too much breakage between Python versions.

I recently had to downgrade one of our projects to 3.12 because of a dependency we needed. With uv, I can be sure that everybody will be running the project on 3.12, it just all happens automatically. Without uv, I'd get the inevitable "but your changes crashed the code, have you even tested them?"

ErikBjare

9 hours ago

Honestly I think poetry was a bigger development than uv. I used pipenv before it, and requirements before that, and I can't imagine going back. I've yet to fully embrace uv and migrate away from poetry for that reason (even though it seems inevitable at this point, there's just no need).

txdv

6 hours ago

Just profile the slow parts and rewrite them in Rust, easy.

ActorNightly

11 hours ago

> But then I remember that Python is dog slow compared to other languages with comparable ergonomics and first-class support for static typing, and...idk it's a tough sell.

Posts like these aptly describe why companies are downsizing in favor of AI assistants, and they are not wrong for doing so.

Yes, Python is "slow". The thing is, compute is cheap these days and development time is expensive. $1000 per month is considered expensive as hell for an EC2 instance, but no developer would work for $12000 a year.

Furthermore, in modern software dev, most of the bottleneck is network latency. If your total end-to-end operation takes 200ms, mostly because of network calls, it doesn't matter if your code runs in 10ms or 5ms as far as compute goes.

When it comes to development, the biggest uses of time are:

1. Interfacing with some API or tool, for which you have to write code.

2. Making a change, testing a change, fixing bugs.

Python has both covered better than any other language. Just today, it took me literally 10 mins to write code for a menu bar for my Mac using the rumps python library, so I have my most commonly used commands available without typing into a terminal, and that is without using an LLM. Go ahead and try to do the same in Java or Rust or C++ and I promise you that unless you have experience with Mac development, it's going to take you way more time.

Python has additional things like just putting breakpoint() where you want the debugger, jupyter notebooks for prototyping, and things like lazy imports, where you use import inside a function so large modules only get loaded when they run. No compilation step, no complex syntax.

Multiprocessing is very easy to use as a replacement for threading; I really dunno why people want to get rid of the GIL so much. Functionally the only differences are the overhead of launching a thread vs launching a process, and shared memory. But with the multiprocessing API, you simply spin up a worker pool and send data over Pipes, and it's pretty much just as fast as multithreading.
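For what it's worth, the worker-pool pattern described above is only a few lines (a minimal sketch):

    from multiprocessing import Pool

    def work(x):
        return x * x            # runs in a separate process, so no GIL contention

    if __name__ == "__main__":
        with Pool(4) as pool:   # pool of 4 worker processes
            print(pool.map(work, range(10)))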

In the end, the things that matter are results. If LLMs can produce code that works, no matter how stringy it is, that code can run in production and start making company money, while they don't have to pay you money for multiple months to write the code yourself. Likewise, if you are able to develop things fast, and a company has to spend a bit more on compute, its a no brainer on using Python.

Meanwhile, things like strong typing, speed, the GIL, and other popular talking points are all just echoes of bullshit education that you learned in CS, and people repeat them without actually having any real-world experience. So what if you have weak typing and make mistakes - the code fails to run or generates incorrect results, you go and fix the code, and problem solved. People act like failing code makes your computer explode or something. There is no functional difference between a compilation failure and a runtime failure. And as far as production goes, there has never been a strongly typed language that gets deployed without any bugs, because those bugs are all logic bugs within the actual code. And consequently, with Python, it's way easier to fix those bugs.

Youtube, Uber, and a bunch of other well used services all run Python backends for a good reason. And now with skilled LLM usage, a single developer can write services in days that would take a team of engineers to write in weeks.

So TL;DR, if you actually want to stay competitive, use Python. The next set of LLMs are all going to be highly specialized smaller models, and being able to integrate them into services with PyTorch is going to be a very valuable skill, and nobody who is hiring will give a shit how memory-safe Rust is.

saagarjha

7 hours ago

I write Python all day for work and I run into its issues all the time, be it performance or weak typing or just lack of modern language features. If you’re just vibecoding all day and that’s ok for what you want it to do, all the more power to you, but do at least take a moment to understand that when people want things you see no value in maybe they just have different requirements than you do.

spooky_deep

9 hours ago

The irony here is UV is written in Rust.

no_carrier

5 hours ago

That's not exactly for slowness reasons. The creators of uv have stated that if pip followed the same algorithms they'd see similar performance benefits. People greatly overstate the Python performance penalty.

DeathArrow

9 hours ago

Python is bad for large projects, and it's not just because of speed.

I see it shine for scripts and AI but that's it.

DeathArrow

9 hours ago

>$1000 per month is considered expensive as hell for an EC2 instance, but no developer would work for $12000 a year.

If using Python instead of what we use, our cloud costs would be more than double.

And I can't go to CEO and CFO and explain to them that I want to double the cloud costs (which are already seen as high).

Then, our development speed won't really improve because we have large projects.

That being said, I think using Python for scripting is great in our case.

never_inline

9 hours ago

Nice essay bro. What's your real-world experience scaling a Python service, and how many DB-backed or otherwise computationally non-trivial requests (select one entry by ID doesn't count) does it handle? We want to listen to your hard-earned practical wisdom.

typpilol

9 hours ago

Don't you know most slow things are only from APIs! /s

never_inline

8 hours ago

For someone writing a whole essay on why Python's speed is not a problem, one would expect them to have worked on workloads where performance actually matters. For most people in this industry, that would be web APIs. Though it's kinda fine for low-traffic APIs.

GP comment reeks of textbook "performance doesn't matter" rhetoric.

LeoPanthera

21 hours ago

For single-file Python scripts, which 99% of mine seem to be, you can simplify your life immensely by just putting this at the top of the script:

  #!/usr/bin/env -S uv run --script
  # /// script
  # requires-python = ">=3.11"
  # dependencies = [ "modules", "here" ]
  # ///
The script now works like a standalone executable, and uv will magically install and use the specified modules.
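
For example (assuming the file is saved as myscript.py):

  chmod +x myscript.py
  ./myscript.py

On the first run uv resolves and caches the dependencies; subsequent runs reuse the cache.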

thunky

17 hours ago

> The script now works like a standalone executable

But whoever runs this has to install uv first, so not really standalone.

hshdhdhehd

15 hours ago

And a shell

dmd

14 hours ago

They gotta have a computer too. And a source of power.

rafael-lua

14 hours ago

And my ax... Oh, this is hackernews.

hshdhdhehd

13 hours ago

"I write code and am curious I am a hacker"

"Lol, no I break into computer systems I am a hacker"

"Geeze hell no I have an axe, I am an OG hacker"

Zamiel_Snawley

14 hours ago

I don’t think they need a shell unless uv itself requires it, the shebang is handled by the exec syscall.

hshdhdhehd

14 hours ago

Of course. Hence the bash shebang - the shebang is the step before the shell is used. Thanks.

lgas

14 hours ago

And an operating system

NewJazz

13 hours ago

No, not a shell. Just a /usr/bin/env

TeeMassive

14 hours ago

This is a PEP and not specific to uv: https://peps.python.org/pep-0723/

dragonwriter

11 hours ago

You need a runner for scripts that follow the PEP (actually the packaging standard established initially by the PEP, hence the note about its historical status).

The two main runners I am aware of are uv and pipx. (Any compliant runner can be referenced in the shebang to make a script standalone where shebangs are supported.)

NewJazz

13 hours ago

The shebang line references uv.

lxgr

6 hours ago

Is it time for a Debian `alternatives`-style system for PEP 723-compliant Python wrappers yet?

I could totally see `#!/usr/bin/python723` become a thing :)

gre

13 hours ago

Is that a dare? /s

Small price to pay for escaping python dependency hell.

zahlman

20 hours ago

As long as your `/usr/bin/env` supports `-S`, yes.

It will install and use distribution packages, to use PyPA's terminology; the term "module" generally refers to a component of an import package. Which is to say: the names you write here must be the names that you would use in a `uv pip install` command, not the names you `import` in the code, although they may align.
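
The classic example where the two names differ is Pillow:

  # /// script
  # dependencies = ["Pillow"]  # distribution name on PyPI
  # ///
  from PIL import Image  # the import package is PIL, not Pillow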

This is an ecosystem standard (https://peps.python.org/pep-0723/) and pipx (https://pipx.pypa.io) also supports it.

hugmynutus

19 hours ago

> As long as your

Linux coreutils has supported this since 2018 (coreutils 8.30); amusingly, it is the same release that added `cp --reflink`. AFAIK you have to opt out by having `POSIXLY_CORRECT=1` or `POSIX_ME_HARDER=1` or `--pedantic` set in your environment. [1]

FreeBSD's env has supported this since 2008.

macOS has basically always supported this.

---

1. Amusingly, despite `POSIX_ME_HARDER` not being official, a large swath of coreutils supports it. https://www.gnu.org/prep/standards/html_node/Non_002dGNU-Sta...

pnt12

18 hours ago

I also recommend the flag that sets a max release date to $current_date - that basically locks all package versions as of that date without a verbose lock file!

(sadly, uv cannot detect the release date of some packages. I'm looking at you, yaml!)
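
For reference, the setting being described is uv's `exclude-newer`; if I'm reading the docs right, it can also live in the script's inline metadata (timestamp illustrative):

  # /// script
  # dependencies = ["requests"]
  #
  # [tool.uv]
  # exclude-newer = "2024-06-01T00:00:00Z"
  # ///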

d4mi3n

21 hours ago

If I were to put on my security hat, things like this give me shivers. It's one thing if you control the script and specified the dependencies. For any other use-case, you're trusting the script author to not install python dependencies that could be hiding all manner of defects or malicious intent.

This isn't a knock against UV, but more a criticism of dynamic dependency resolution. I'd feel much better about this if UV had a way to whitelist specific dependencies/dependency versions.

chatmasta

20 hours ago

If you’re executing a script from an untrusted source, you should be examining it anyway. If it fails to execute because you haven’t installed the correct dependencies, that’s an inconvenience, not a lucky security benefit. You can write a reverse shell in Python with no dependencies and just a few lines of code.

1oooqooq

17 hours ago

It's a stretch to go from "executing a script with a build user" or "from a validated distro immutable package" to "allowing something to download evergreen code and install files everywhere on the system".

teruakohatu

16 hours ago

A vanilla Python script can write files, or edit ~/.zshrc to create a sudo alias that executes code the next time you invoke sudo and type in your password.

uv installing deps is hardly more risky.

jrnng

16 hours ago

That's sneaky. Do any code scanners check for that class of vulnerability?

Scanning for external dependencies is common but not so much internal private libraries.

saagarjha

7 hours ago

Code scanners cannot protect you from code execution on your machine.

1oooqooq

16 hours ago

The point is that a script executes the code in front of you.

uv executes http://somemirror.com/some-version

Most people like their distro to vet these things. uv et al. had a reason to exist when Python 2 and 3 were a mess; I think that time is way behind us. pip is mostly for installing libraries, and even that is mostly already done by the distros.

skinner927

18 hours ago

You’re about to run an untrusted python script. The script can do whatever it wants to your system. Dependencies are the least of your worries.

schrodinger

11 hours ago

The script is just a cat or vim away from audit. Its dependencies on the other hand…

schrodinger

3 hours ago

This was very confusing!

I meant it's easy to inspect your script's logic — just look at it. Much harder to audit the code in dependencies though…

Hikikomori

7 hours ago

A download and a cat away?

schrodinger

4 hours ago

Sorry, I was half asleep! Meant that you can easily look at the code in the script and audit what it does – you can just run `cat` on it and you're done!

But it’s much harder to inspect what the imports are going to do and be sure they’re free of any unsavory behavior.

maccard

20 hours ago

If that’s your concern you should be auditing the script and the dependencies anyway, whether they’re in a lock file or in the script. It’s just as easy to put malicious stuff in a requirements.txt

gcr

18 hours ago

Would you feel better with a script containing exec(requests.get("http://pypi.org/foo.py").text) ?

It’s the script contents that count, not just dependencies.

Deno-style dependency version pinning doesn’t solve this problem unless you check every hash.

theamk

20 hours ago

Is there anything new that uv gives you here though?

If you don't care about being ecosystem-compliant (and I am sure malware does not), it's only a few lines of Python to download the code and eval it.

golem14

16 hours ago

""" uv is straightforward to install. There are a few ways, but the easiest (in my opinion) is this one-liner command — for Linux and Mac, it’s:

curl -LsSf https://astral.sh/uv/install.sh | sh """

Also isn't great. But that's how homebrew is installed, so ... shrug ... ?

Not to bash uv/homebrew, they are better than most _easy_ alternatives.

caymanjim

11 hours ago

There's a completely irrational knee-jerk reaction to curl|sh. Do you trust the source or not? People who gripe about this will think nothing of downloading a tarball and running "make install", or downloading an executable and installing it in /usr/local/bin.

I will happily copy-paste this from any source I trust, for the same reason I'll happily install their software any other way.

ShroudedNight

13 hours ago

I hate that curl $SOMETHING | sh has become normalized. One does not _have_ to blindly pipe something to a shell. It's quite possible to pull the script in a manner that allows examination. That Homebrew also endorses this behaviour doesn't make it any less of a risky abdication of administrative agency.

But then I'm a weirdo that takes personal offense at tools hijacking my rc / PATH, and keep things like homebrew at arm's length, explicitly calling shellenv when I need to use it.

p_l

20 hours ago

uv can still be redirected to a private PyPI mirror, which should be mandatory from a security and reliability perspective anyway.

ndr

5 hours ago

You don't have to remember this, instead remember `uv init --script myscript.py`

blueflow

4 hours ago

That's the same thing?

x187463

4 hours ago

I think the point is you don't have to memorize the boilerplate.

agumonkey

11 hours ago

Yeah, tried it with a REST client + pyfzf (a CLI Swagger UI, sort of), it was really fun. Near-instant dependency handling... pretty cool.

moleperson

20 hours ago

Why is the ‘-S’ argument to ‘env’ needed? Based on the man page it doesn’t appear to be doing anything useful here, and in practice it doesn’t either.

zahlman

20 hours ago

> Based on the man page it doesn’t appear to be doing anything useful here

The man page tells me:

  -S, --split-string=S
         process and split S into separate arguments; used to pass
         multiple arguments on shebang lines
Without that, the system may try to treat the entirety of "uv run --script" as the program name, and fail to find it. Depending on your env implementation and/or your shell, this may not be needed.

See also: https://unix.stackexchange.com/questions/361794

moleperson

20 hours ago

Right, I didn’t think about the shebang case being different. Thanks!

Rogach

20 hours ago

Without -S, `uv run --script` would be treated as a binary name (including spaces) and you will get an error like "env: ‘uv run --script’: No such file or directory".

-S causes the string to be split on spaces and so the arguments are passed correctly.

gcr

18 hours ago

On these systems, wouldn't binfmt attempt to exec("/usr/bin/env -S uv run --script", "foo.py") and fail anyway for the same reason?

Rogach

8 hours ago

No. The string is split to extract at most one argument. See: https://linux.die.net/man/2/execve

So in fact "-S" is not passed as a separate argument, but as a prefix of the first (and only) argument, and env then extracts it and acts accordingly:

  $ /usr/bin/env "-S echo deadbeef"
  deadbeef

pseudalopex

13 hours ago

Most systems have split on at least the first space for decades.

globular-toast

19 hours ago

You can get uv to generate this and add dependencies to it, rather than writing it yourself.
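
Something like this (filename is just an example):

  uv init --script example.py --python 3.12
  uv add --script example.py requests

The second command rewrites the `# /// script` block in place.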

XorNot

19 hours ago

I use this but I hate it.

I want to be able to ship a bundle that needs zero network access and still just runs.

It is still frustratingly difficult to make portable Python programs.

miggol

18 hours ago

I wouldn't be surprised if Astral's next product were something like this. It's so obvious, and there would be much interest from the ML crowd.

My current hobby language is janet. Creating a statically linked binary from a script in janet is trivial. You can even bring your own C libraries.

beemoe

12 hours ago

Have you tried Nuitka? It takes a little effort but it can compile your Python program to a single executable that runs without network access.

kardos

21 hours ago

> uv will magically install and use the specified modules.

As long as you have internet access, and whatever repository it's drawing from is online, and you may get a different version of Python each time, ...

tclancy

20 hours ago

And electricity and running water and oh the inconvenience. How is this worse than getting a script file that expects you to install modules?

maccard

20 hours ago

If I download a Python project from someone on the same network as me, written against a different Python version than mine and with a requirements.txt, I need all those things anyway.

dragonwriter

20 hours ago

I mean, if you use == constraints instead of >= you can avoid getting different versions, and if you’ve used it (or other things which combined have a superset of the requirements) you might have everything locally in your uv cache, too.

But, yes, python scripts with in-script dependencies plus uv to run them doesn't change dependency distribution, just streamlines use compared to manual setup of a venv per script.
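
E.g., fully pinned inline metadata (versions here are illustrative):

  # /// script
  # requires-python = "==3.12.*"
  # dependencies = ["requests==2.31.0"]
  # ///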

gkfasdfasdf

14 hours ago

You can specify python version requirements in the comment, as the standard describes

hardwaregeek

20 hours ago

I gotta say, I feel pretty vindicated. After hearing for years how Python's tooling was just fine, how you should just use virtualenv with pip, and how JS must be worse, it turns out that when Python devs finally get a taste of npm/cargo/bundler in their ecosystem, they freaking love it. Because yes, npm has its issues, but lock files and consistent installs are amazing.

caconym_

19 hours ago

There is nothing I dread more within the general context of software development, broadly, than trying to run other people's Python projects. Nothing. It's shocking that it has been so bad for so long.

hardwaregeek

19 hours ago

Never underestimate cultural momentum, I guess. NBA players shot long 2-pointers for decades before people realized 3 > 2. Doctors refused to wash their hands before doing procedures. There are so many things that seem obvious in retrospect but took a long time to become accepted.

hshdhdhehd

15 hours ago

Hey and you can use both lanes in a zip merge!

eru

12 hours ago

Isn't that the law anyway?

Moral: follow the rules.

tehnub

16 hours ago

>NBA players shot long 2 pointers for decades before people realized 3 > 2

And the game is worse for it :')

terminalshort

15 hours ago

This is a fundamental problem in sports. Baseball is going the same way. Players are incentivized to win, and the league is incentivized to entertain. Turns out these incentives are not aligned.

jancsika

13 hours ago

> Players are incentivized to win, and the league is incentivized to entertain.

Players are incentivized to win due to specific decisions made by the league.

In Bananaball the league says, "practice your choreographed dance number before batting practice." And those same athletes are like, "Wait, which choreographed dance number? The seventh inning stretch, the grand finale, or the one we do in the infield when the guy on stilts is pitching?"

Edit: the grand finale dance number I saw is both teams dancing together. That should be noted.

terminalshort

13 hours ago

Sure. There's a market for that. But the NBA sells a lot more tickets than the Harlem Globetrotters.

kbenson

13 hours ago

But that's a matter of scale. When I was a child, the Harlem Globetrotters were far more famous than any 3-4 NBA teams combined. They were in multiple Scooby Doo movies/episodes. They failed to scale the model, but wrestling didn't.

danielmarkbruce

12 hours ago

This isn't right - the league can change the rules. NFL has done a wonderful job over the years on this.

Baseball has done a terrible job, but at least seems to have turned the corner with the pitch clock. Maybe they'll move the mound back a couple feet, make the ball 5.5oz, reduce the field by a player and then we'll get more entertainment and the players can still try their hardest to win.

mh-

12 hours ago

I wonder if anyone has made an engine for simulating MLB play with various rule changes.

Personally, I think it'd be interesting to see how the game plays if you could only have two outfielders (but you could shift however you choose.)

danielmarkbruce

11 hours ago

It's a good thought.

I'd guess MLB The Show video game wouldn't be a bad place to start. They should have a decent simulator built in.

tehnub

12 hours ago

And the ongoing gambling scandal gives credence to a third incentive I've long suspected. Only half joking

yla92

14 hours ago

Is it ? I, for one, enjoy watching the 3s raining down!

peterfirefly

15 hours ago

They did wash their hands. Turns out that soap and water wasn't quite enough. Lister used carbolic acid (for dressing and wound cleaning) and Semmelweis used chlorinated lime (for hand washing).

msla

14 hours ago

And Semmelweis is a perfect case against being an asshole who's right: he was more right than wrong (he didn't fully understand why what he was doing helped, but it did), but he had such a horrible personality and such an amazing gift for pissing people off that it probably cost lives by delaying the uptake of his ideas.

But this is getting a bit off topic, I suppose.

llanowarelves

11 hours ago

Or you could say it the other way around: Even leading scientists are susceptible to letting emotions get the best of them and double-down defending their personal investments into things.

"A scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die and a new generation grows up that is familiar with it." - Max Planck.

hrimfaxi

13 hours ago

Was soap often used prior to the mid 1800s?

Ferret7446

13 hours ago

That was later; earlier in history doctors (or "doctors" if you so insist) did not wash their hands.

ipaddr

16 hours ago

People paid 100x more for their hosting when using AWS cloud until they realized they never needed 99.97% uptime for their t-shirt business. Oh wait, too soon. Saving this post for the future.

terminalshort

14 hours ago

People paid only 100x more than self hosting to use AWS until they realized that they could get a better deal by paying 200x for a service that is a wrapper over AWS but they never have to think about since it turns out that for most businesses that 100x is like 30 bucks a month.

frde_me

16 hours ago

People spent half their job figuring out self-hosted infrastructure until they realized they'd rather just have some other company deploy their website when they make a commit.

brailsafe

18 hours ago

Most people in NA even spent thousands on cars and drove to get nearly anywhere until they discovered that trains exist... oh wait ;)

IMTDb

18 hours ago

Usually when someone comes at me with that argument, I ask them to pick any weekday in the past year; then I take a random item from my calendar on that day, give them the time and address of where I needed to be as well as the address of my home, and ask them how long it's going to take me and how much it's going to cost. That's usually enough to bring them down a notch from "trains work" to "trains sometimes work". (But they tend to forget very often; they need to be reminded regularly for some reason.) Do you want to play that game with me to get your reality check in order?

Western Europe in a VERY dense city BTW.

Ukv

17 hours ago

> I give them the time and address of where I need to be [...] That's usually enough to bring them down a notch from "train work" to "sometimes train work" [...] Do you want to play that game with me to get your reality check in order ?

I don't think the implied claim is that there should be specifically a train to every particular address, if that's what you're counting as failure in the game, but rather that with good public transport (including trains) and pedestrian/cyclist-friendly streets it shouldn't be the case that most people need to drive.

_carbyau_

17 hours ago

Cars are so flexible. It's the answer to so many questions outside "how to move one or two people from A to common destination B".

Need to move 3 or 4 people? Driving the car may be cheaper.

Don't want to get rained on? Or get heatstroke? Or walk through snow? Or carry a bunch of stuff, like a family's week of groceries or whatever else? Or go into the countryside/camping? Or move a differently-abled person? Or go somewhere outside public transport hours? Or, or... or.

Are there many cases where people should take public transport or ride a bike instead of their car? Obviously yes. But once you have a car to cover the exigent circumstances it is easy to use them for personal comfort reasons.

mr_toad

16 hours ago

> Cars are so flexible.

They’re also a joke when it comes to moving large numbers of people. I can’t imagine the chaos if everyone leaving a concert at Wembley Stadium decided to leave by car.

flerchin

15 hours ago

You wouldn't have to imagine it if you visited Dallas. AT&T stadium has roughly the same capacity as Wembley, and no public transit at all.

tomrod

12 hours ago

Dallas would look very different if they emphasized public transport. Outside of downtown it is so sparse, many of the suburbs suffer from crumbling infrastructure because it turns out pipes made to last 30 years do poorly after 40 to 50 years when all the low density suburbs have aged out and there is no remaining land to subsidize the infrastructure ponzi scheme.

Fort Worth is worse for this!

Strong Towns is definitely worth a listen.

_carbyau_

11 hours ago

Are they crap during peak hour traffic or mass public events? Sure are! They're not some miracle device.

But people claiming that you can live a life without cars don't seem to realise the very many scenarios where cars are often easier and sometimes the only answer.

petre

11 hours ago

Until everyone wants to go from A to B, when a traffic jam happens. If that happens quite often, it might be more convenient to use a bicycle, an umbrella or snow boots.

Greed

16 hours ago

The argument there is a little dishonest, given that if you only had the option of riding public transit, your schedule would indeed conform well to public transit. I think everyone understands VERY well that they could get from point A to point B faster by using a dedicated vehicle solely concerned with getting them there; that's not really debatable.

In the states at least if you're using public transit it's generally as an intentional time / cost tradeoff. That's not a mystery and taking a point-to-point schedule and comparing that against public transit constraints doesn't really prove much.

seanw444

17 hours ago

The average European mind can't comprehend freedom of movement across vast amounts of open nature.

II2II

16 hours ago

I live in Canada, which is similar to the US in this regard, and I can't believe how enslaved we are to the private automobile.

If you want the freedom to move across vast amounts of open nature, then yeah the private automobile is a good approximation for freedom of mobility. But designing urban areas that necessitate the use of a private vehicle (or even mass transit) for such essentials as groceries or education is enslavement. I don't buy the density argument either. Places that historically had the density to support alternative modes of transportation, densities that are lower than they are today, are only marginally accessible to alternative forms of transportation today. Then there is modern development, where the density is decreased due to infrastructure requirements.

_carbyau_

10 hours ago

To me, "urban planning" has a lot to answer for. They seem to have the foresight of a moth. However, they are probably constrained by politics which is similar.

prayerie

17 hours ago

I’m pretty sure we can comprehend it, we just usually enjoy said freedom of movement in nature on our feet rather than sat in an SUV.

rmunn

14 hours ago

Heard an anecdote about a German engineer who was in California (I think San Francisco, but if it was Los Angeles then the distances involved would be even larger) for meetings with American colleagues, and thought he would drive up to Oregon for a day trip. His American colleagues asked him to take another look at the scale on the bottom right of the map, and calculate the driving time. Once he ran the numbers, he realized that his map-reading instincts, trained in Germany, were leading him astray: the scale of maps he was used to had him thinking it was a 2- or 3-hour drive from San Francisco to Oregon. But in fact it's a 6-hour drive just to get to the Oregon border from SF, and if you want to head deeper into the interior then it's probably 9 to 10 hours depending on where you're going.

So no, I don't think Europeans who haven't been in America have quite absorbed just how vast America is. It stretches across an entire continent in the E-W direction, and N-S (its shortest border) still takes nearly a full day. (San Diego to Seattle is about 20 hours, and that's not even the full N-S breadth of the country since you can drive another 2.5 hours north of Seattle before reaching the Canadian border). In fact, I can find a route that goes nearly straight N-S the whole way, and takes 25 hours to drive, from McAllen, TX to Pembina, ND: https://maps.app.goo.gl/BpvjrzJvvdjD9vdi9

Train travel is sometimes feasible in America (I am planning Christmas travel with my family, and we are planning to take a train from Illinois to Ohio rather than fly, because the small Illinois town we'll be in has a train station but no airport; counting travel time to get to the airport, the train will be nearly as fast as flying but a lot cheaper). But there are vast stretches of the country where trains just do not make economic sense, and those whose only experience is in Europe usually don't quite realize that until they travel over here. For most people, they might have an intellectual grasp of the vastness of the United States, but it takes experiencing it before you really get it deep down. Hence why the very smart German engineer still misread the map: his instincts weren't quite lined up with the reality of America yet, and so he forgot to check the scale of the map.

jodrellblank

12 hours ago

> there are vast stretches of the country where trains just do not make economic sense

There are plenty of city pairs where high speed trains do make economic sense and America still doesn't have them. [1] is a video "56 high speed rail links we should've built already" by CityNerd. And that's aside from providing services for the greater good instead of for profit - subsidizing public transport to make a city center more walkable and more profitable and safer and cleaner can be a worthwhile thing. The US government spends a lot subsidizing air travel.

> So no, I don't think Europeans who haven't been in America have quite absorbed just how vast America is

China had some 26,000 miles of high speed rail two years ago, almost 30,000 miles now connecting 550 cities, and adding another couple of thousand miles by 2030. A hundred-plus years ago America had train networks coast to coast. Now all Americans have is excuses for why the thing you used to have and tore up is impossible, infeasible, unaffordable, unthinkable. You have reusable space rockets that can land on a pillar of fire. If y'all had put as much effort into it as you have into special pleading about why it's impossible, you could have had it years ago.

[1] https://www.youtube.com/watch?v=wE5G1kTndI4

rmunn

10 hours ago

Personally, I'd blame California for American voters' distaste for subsidizing high-speed rail. They look at the massive budget (and time) overruns of California's celebrated high-speed rail, and say "I don't want that waste of money happening in MY state, funded with MY state taxes" and then vote against any proposed projects.

This is, of course, a massively broad generalization, and there will be plenty of voters who don't fit that generalization. But the average American voter, as best I can tell, recoils from the words "high-speed rail" like Dracula would recoil from garlic. And I do believe that California's infamous failure (multiple failures, even) to build the high-speed rail they have been working on for years has a lot to do with that "high-speed rail is a boondoggle and a waste of taxpayer dollars" knee-jerk reaction that so many voters have.

seanw444

17 hours ago

Good luck reaching the good remote spots from a train.

Dylan16807

16 hours ago

Focusing on remote spots is largely a different topic. If the majority of driving was to remote spots then we'd have 90% less driving and cars wouldn't be a problem.

cruffle_duffle

16 hours ago

Honestly, people really just don't understand how far apart things are. And yeah, the good remote spots are a 4-hour drive from the city (and you aren't even halfway across the state at that point).

The forests and wilderness of the PNW are much, much, much, much more remote and wild than virtually anywhere you’d go in Europe. Like not even close.

virgildotcodes

15 hours ago

It seems like people are just talking past each other here. The fact is that 99% of driving is not done by people in the process of visiting remote nature destinations.

yazantapuz

4 hours ago

Also, the USA is not the only big country in the world... I live in a small city in Patagonia. The nearest towns are 60 km, 90 km, and 480 km away. But you can still live without a car in the city.

luqtas

12 hours ago

They also don't realize that a country that ditches personal vehicles can invest in buses or more trains to "remote places". Nor do they realize the vehicle industry is one of the biggest sources of microplastic pollution, which screws up the "remote nature" as well as our health.

asielen

15 hours ago

Great so train to major destinations and then rent a car from there.

_carbyau_

10 hours ago

In the future, I hope this becomes a thing. As cars become more commoditised and self-driving taxis can be ordered easily, maybe there'll be competing mass fleets?

Or have a "car-cabin-without-engine-and-wheels" and treat it like a packet on a network of trains and "skateboard car platforms".

coldtea

15 hours ago

The average american mind can't comprehend this works out to a huge number of them having to commute by car 1-2 hours per day to get to work in some ungodly urban sprawl while living an alienated existence in crappy suburbs, and destroying the environment while doing so. At the same time working far more, slaving year round with laughable paid vacation time or sick day provisions, while being subjected to far worse homicide rates, and being treated as subjects by cops.

Such "freedom"...

_345

12 hours ago

No I love being stuck in traffic every day of the week for hours, its totally worth it because I can drive to an empty patch of grassland that no one wants to go to and there's nothing there. That's why cars are so amazing and freedom granting. Trains can't take you to the middle of nowhere to do nothing for the 1% of the time you don't want to be near other civilization so cars are better

coldtea

8 hours ago

lol, yeah. Meanwhile they can't even comprehend that it's a false dilemma: Europeans have cars just fine, even several per family.

They just don't have to use them all the time since they can take the more efficient public transport, and they can buy one after college even, they don't need to drive one from 16 yo just to be able to get around...

jank199x

12 hours ago

I believe Russians have something to say on that, though.

Hikikomori

4 hours ago

Is this satire? In the nordics we have allemansrätten, the right to use even private land to camp as long as you're not too close to where someone lives, not to mention huge national parks. In the US you have the right to get shot if you enter private land.

moss_dog

17 hours ago

Are you arguing that trains are infeasible (due to cost or duration) for certain trips?

I'm curious how this changes (in your mind) if "trains" can be expanded to "trains, buses, bicycle", or if you consider that to be a separate discussion.

echelon

17 hours ago

I live in Atlanta.

The Atlanta Metro has 6.5 million people across TWENTY THOUSAND square kilometers.

Trains just don't make sense for this. Everything is too spread out. And that's okay. Cities are allowed to have different models of transportation and living.

I like how much road infra we have. That I can visit forests, rivers, mountains, and dense city all within a relatively short amount of time with complete flexibility.

Autonomous driving is going to make this paradise. Cars will be superior to trains when they drive themselves.

Trains lack privacy and personal space.

kataklasm

15 hours ago

The German metro area "Rheinland" has a population of 8.7 million people across 12 thousand square kilometers: ~700/sqkm vs the 240/sqkm population density of Atlanta metro. Train and metro travel in this metro area is extremely convenient and fast. It's not that Atlanta (or anywhere else in the United States, for that matter) couldn't do it because of vastness; there's just no political and societal will behind the idea. In a society that glamorizes everyone driving the biggest trucks and carrying the largest rifles, of course convenient train systems are "not feasible".

schrodinger

12 hours ago

I'm not following your logic. Having nearly triple the population density in Rheinland makes trains way _more_ feasible, not _less_. That means on average you have a train 1/3 the distance away from you. That's a big difference.

I live in NYC which has 29,000/sqkm in Manhattan and 11,300/sqkm overall. Public transportation is great here and you don't need a car.

but at 240/sqkm, that's really not much public trans per person!

schrodinger

11 hours ago

Rule 35 of the internet? Every discussion will eventually devolve into the United States' horrible usage (or lack thereof) of public transportation.

emmelaich

15 hours ago

And it loses money. And doesn't it have time reliability issues?

irowe

14 hours ago

The exact same comment could be made of Atlanta's roads.

How did we get here from the post about uv?

echelon

12 hours ago

This did veer very far from uv!

I'm so stoked for what uv is doing for the Python ecosystem. requirements.txt and the madness around it have been a hell for over a decade. It's been so pointlessly hard to reproduce the environment state that a Python project's authors intended.

uv has been much needed. It's solving the single biggest pain point for Python.

qwertytyyuu

12 hours ago

Roads also lose a lot of money, and that's fine. Public infrastructure doesn't need to make money.

jodrellblank

13 hours ago

Is your car a profitable investment?

Public transport is to move people around, not to make money.

schrodinger

12 hours ago

Having replied in good faith already, I also want to call out that your jab about trucks and rifles adds nothing to the conversation and is merely culture-war fuel.

> Please don't use Hacker News for political or ideological battle. It tramples curiosity.

> Eschew flamebait. Avoid generic tangents. Omit internet tropes.

h33t-l4x0r

7 hours ago

It seems like a fair point to me. You can't bring your rifle on the train but you can bring it in your truck. Whether or not that shapes Atlantans' choice of transport I can't say though.

schrodinger

3 hours ago

Fair point perhaps, but was clearly intended as sarcasm:

> a society that glamorizes everyone driving the biggest trucks and carrying the largest rifles

thaumasiotes

12 hours ago

> The German metro area "Rheinland" has a population of 8.7 million people across 12 thousand square kilometers. ~700/sqkm vs the 240/sqkm population density of Atlanta metro. Train and metro travel in this metrk area is extremely convenient and fast. It's not that Atlanta (or anywhere else in the United States for that matter) couldn't do it because of vastness

Did you forget to support yourself? You're saying Rheinland has three times the population density of Atlanta, with convenient passenger rail, and that demonstrates that low population density isn't an obstacle to passenger rail in Atlanta?

dpc050505

17 hours ago

I'll happily play your game with a bicycle.

ipaddr

16 hours ago

Great lets pick Canada in January. Bring a shovel.

halostatue

13 hours ago

Don't need one in Toronto within a ½ day or so of the snow stopping for the major bicycle routes (including the MGT).

Calgary apparently also does a good job of clearing its bike lanes.

And I do my Costco shopping by bike year-round. I think I've used the car for large purchases at Costco twice in the last year.

I _rarely_ drive my car anywhere in Toronto, and find the streets on bike safer than most of the sidewalks in January -- they get plowed sooner than most homeowners and businesses clear the ice from their sidewalks.

And in Toronto we're rank amateurs at winter biking. Look at Montreal, Oslo, or Helsinki for even better examples. Too bad we've got an addle-brained carhead as our premier who doesn't understand public safety or his own provincial responsibilities.

refactor_master

14 hours ago

Just to add a less opinionated take: https://www.citymonitor.ai/analysis/why-winter-is-a-poor-arg...

Personally I've also biked to work (and everywhere, really) in sub-zero degrees many times, because the bicycle lanes are cleared and salted. It's really not too bad. It actually gets a bit too hot even, because you start out by wearing so much.

halostatue

13 hours ago

In cold weather, one should always dress for 5℃ warmer than the temperature outside when your ride is longer than 5 km. Runners pretty much have to do the same. Your body heat and good layering will take care of everything else.

thaumasiotes

11 hours ago

> Personally I've also biked to work (and everywhere, really) in sub-zero degrees many times, because the bicycle lanes are cleared and salted.

I used to bike to work in just-above-freezing temperatures. That wasn't so bad.

The one time it started to rain mid-journey, that was bad.

nickserv

16 hours ago

They don't clear snow from cycle paths in Canada? If not then it's an infrastructure problem, not a weather problem.

littlestymaar

17 hours ago

> and how much it's going to cost

Depending on how expensive gasoline is in your country, people using a car underestimate the cost of a trip by a factor of two to five, because they don't count the depreciation of their vehicle's value and the maintenance cost (and sometimes even the insurance price) driven by the kilometers covered during the trip.

croes

16 hours ago

By that logic, "cars work" also turns into "sometimes cars work". Ever heard of traffic jams? And have you compared the number of fatal car accidents vs fatal train accidents? Not to mention the negative effect on air quality with many cars in dense cities. Cars' main advantage is flexibility, and that's it. For trips where the place and time usually stay the same, like work, trains are a valid option.

trueismywork

18 hours ago

People in Europe spent years dying from heat stress before they discovered ACs...

johnisgood

5 hours ago

I've wanted an AC, but I cannot get one. This apartment is just not equipped for it, it literally cannot handle multiple ACs (the cables are old, yadda yadda).

And you have to get lots of permits to have an AC installed legally. If you do not have a permit, you will have to pay a really hefty fee when the inspectors come.

So yeah, buying an AC is what most people would do, but they do not because of the damn permits they most likely will not get. It is a shitty situation.

PaulDavisThe1st

17 hours ago

This isn't really true. Heat stress deaths in Europe are comparatively rare, or were until urbanization and climate change became bigger factors.

seanw444

17 hours ago

They still do. More Europeans die every year from heat-related injuries than Americans do from guns.

anthk

16 hours ago

Spaniard here. Don't lecture Southern Europeans on surviving heat when the church of the village of my parents predates America itself (and it's pretty fresh inside in Summer).

terminalshort

10 hours ago

Always sucks when you're arguing with someone and it turns out the buildings in their town are older than yours. Sometimes you just gotta take the L.

MobiusHorizons

15 hours ago

This is not people’s fault individually, but rather in aggregate (ie government). The places that have good train infrastructure that is legitimately an alternative to driving are very few and far between in the US. It’s just not an option for most people. And people can’t just all move to the places where it is an option, because housing and jobs are already strained in those places negating many of the benefits.

taneq

13 hours ago

Have you considered that the repeated attempts to reinvent what's basically trains are not, in fact, evidence that people don't know about trains, but evidence that people like the advantages of trains but that the downsides suck so bad that people will pay literally tens of thousands of dollars a year to avoid them?

bongodongobob

16 hours ago

Yeah all you need to do is raze and rebuild every city in America and it will work great!

rustystump

18 hours ago

Wrong kind of cheek my friend

DarmokJalad1701

18 hours ago

People in Europe spent years walking to the store everyday for food until they discovered that mechanical refrigeration exists...

CharlieDigital

18 hours ago

Something I think that goes underappreciated: in many parts of the world, the food supply chain is shorter and the food is fresher to begin with. You're not meant to shop for 14 days at a time; you're meant to go more frequently and get what you need, fresh.

supportengineer

18 hours ago

Bad example. Walking to the store everyday for fresh food would be a drastic improvement for most Americans.

tialaramex

17 hours ago

The refrigerator is a relatively modern invention. There's always been a refrigerator for me, but as a child my mother sometimes stayed with people who didn't own one and for her mother they were a new invention many people didn't have.

Actually this idea of just buying things at "the store" is relatively new too. Historically people would make more things themselves, and more food would be purchased directly from farmers who had grown it.

tomaskafka

18 hours ago

So many times I have come onto a library or tool that would fix my problem, and then realized “oh crap, it’s in Python, I don’t want to spend few hours building a brittle environment for it only for that env to break next time I need to use it” - and went to look for a worse solution in better language.

ghusto

18 hours ago

I really don't get this. I can count on no hands the number of times I've had problems simply going "pip install cool-thing-i-found".

Sure, this is just my experience, but I use Python a lot and use a lot of tools written in Python.

wongarsu

18 hours ago

If you can install it with `pip install program-name` it's usually packaged well enough to just work. But if it's a random GitHub repository with a requirements.txt with no or very few version numbers, chances are that just running `pip install -r requirements.txt` will lead you down an hour-plus rabbit hole of downgrading both your venv's Python version and various packages until you get a combination close enough to the author's venv to actually work.

Usually happens to me when I find code for some research paper. Even something that's just three months old can be a real pain to get running

optionalsquid

17 hours ago

I don't disagree with you, but in my experience even having a requirements.txt file is a luxury when it comes to scientific Python code: A lot of the time I end up having to figure out dependencies based purely on whatever the script is importing

nickserv

16 hours ago

If they can't be bothered to make a requirements.txt file, I'm not seeing how uv will be of much help...

catlifeonmars

13 hours ago

uv basically makes that a default. You don’t need to be bothered. Just uv add your dependencies and they are in your pyproject.toml.

optionalsquid

7 hours ago

Or use `uv add --script`. Then dependencies get recorded in the script itself, which is great for single-file scripts. But obviously, that won't help if the author can't be bothered to take those steps.
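
For example, with a hypothetical file name:

  uv add --script analysis.py pandas

which records pandas in the `# /// script` block of analysis.py itself.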

ghusto

8 hours ago

Ah, I get it now! The problem occurs when someone publishes something without version pinning, because package versions can become incompatible over time. I don't think I've ever installed anything outside of what's available on PyPI, which is probably why I've never run into this issue.

Still, I would have thought it's rare for versions of different packages to become incompatible with each other?

IgorPartola

18 hours ago

Seconded. Python, even with virtualenv stuff, has never been bad. There have been a few things that have been annoying especially when you need system libraries (e.g. libav for PyAV to work, etc.), but you have the same issue with every other ecosystem unless the packages come with all batteries included.

To be fair to the GP comment, this is how I feel about Ruby software. I am not nearly as practiced at installing and upgrading in that ecosystem so if there was a way to install tools in a way that lets me easily and completely blow them away, I would be happier to use them.

virtue3

18 hours ago

I still have nightmares about nokogiri gem installs from back in the day :/

antod

15 hours ago

Shudder. I'm guessing it was the always breaking libxml2 compilation step right?

zelphirkalt

16 hours ago

This mentality is exactly what many people do wrong in Python. I mean, for a one-off, yes you can have setup instructions like that. But if you want things to work for other people, on other machines, you better include a lock file with checksums. And `pip install whatever` simply does not cut it there.

ghusto

8 hours ago

Except I'm saying my experience is the opposite of the problem you describe. I (as the consumer) have always done "pip install whatever", and have never run into issues.

One of the commentors above explained what the problem really is (basically devs doing "pip install whatever" for their dependencies, instead of managing them properly). That's more a problem of bad development practices though, no?

dragonwriter

18 hours ago

Recently (like for several years), with most packages providing wheels for most platforms, it tends to be less of a problem of things actually working, except for dependencies where the platform specifiers used by Python are insufficient to select the right build of the dependency, like PyTorch.

fluoridation

17 hours ago

Recently I've been playing with Chatterbox and the setup is a nightmare. It specifically wants Python 3.11. You have 3.12? TS. Try to do pip install and you'll get an error about pkg-config calling a function that no longer exists, or something like that.

God, I hate Python. Why is it so hard to not break code?

hamandcheese

16 hours ago

> pip install cool-thing-i-found

This is the entire problem. You gonna put that in a lock file or just tell your colleagues to run the same command?

ghusto

8 hours ago

I meant I'm running that command as the consumer, and have never had problems. When I make my own packages, I ensure that anyone doing the same thing for my package won't have issues by using version pinning.

thaumasiotes

11 hours ago

Having packages in a package manager is the problem?

zzzeek

15 hours ago

I know, this is just how it is, I guess. Those of us mystified about what the big problem is with virtualenv and pip, and why we all have to use a tool distributed by a for-profit company that isn't even written in Python, will just have to start a little club or something.

I guess this is mostly about data science code and maybe people who publish software in those communities are just doing very poor packaging, so this idea of a "lock file" that freezes absolutely everything with zero chance for any kind of variation is useful. Certainly the worst packaged code I've ever seen with very brittle links to certain python versions and all that is typically some ML sort of thing, so yeah.

This is all anathema to those of us who know how to package and publish software.

caycep

17 hours ago

like democracy, it's the worst programming language except vs everything else...

throwaway2037

16 hours ago

This comment is pithy, but I reject the sentiment.

In 2025, the overall developer experience is much better in (1) Rust compared to C++, and (2) Java/DotNet(C#) compared to Python.

I'm talking about type systems/memory safety, IDEs (incl. debuggers & compilers), package management, etc.

Recently, I came back to Python from Java (for a job). Once you take the drug of a virtual machine (Java/DotNet), it is hard to go back to native binaries.

Last, for anyone unfamiliar with this quote, the original is from Winston Churchill:

    Many forms of Government have been tried, and will be tried in this world of sin and woe. No one pretends that democracy is perfect or all-wise. Indeed it has been said that democracy is the worst form of Government except for all those other forms that have been tried from time to time.

tehnub

16 hours ago

How come it's easier if the tool is in another language? What are the technical (or cultural) reasons? Do most C programs use static linking, or just not have deps?

caconym_

15 hours ago

When I need to build an established project written [mostly] in C or C++, even if I don't have the dependencies installed, it's typically just a matter of installing my distro's packages for the deps and then running configure and make, or whatever. It usually works for me. Python almost never does until I've torn half my hair out wrapping my brain around whatever new band-aid bullshit they've come up with since last time, still not having understood it fully, and muddled through to a working build via ugly shortcuts I'm sure are suboptimal at best.

I don't really know why this is, at a high level, and I don't care. All I know is that Python is, for me, with the kinds of things I tend to need to build, the absolute fucking worst. I hope uv gets adopted and drives real change.

My last dance with Python was trying to build Ardupilot, which is not written in Python but does have a build that requires a tool written in Python, for whatever reason. I think I was on my Mac, and I couldn't get this tool from Homebrew. Okay, I'll install it with Pip—but now Pip is showing me this error I've never seen before about "externally managed environments", a concept I have no knowledge of. Okay, I'll try a venv—but even with the venv activated, the Ardupilot makefile can't find the tool in its path. Okay, more googling, I'll try Pipx, as recommended broadly by the internet—I don't remember what was wrong with this approach (probably because whatever pipx does is totally incomprehensible to me) but it didn't work either. Okay, what else? I can do the thing everybody is telling me not to do, passing `--break-system-packages` to plain old Pip. Okay, now the fucking version of the tool is wrong. Back it out and install the right version. Now it's working, but at what cost?

This kind of thing always happens, even if I'm on Linux, which is where I more usually build stuff. I see errors nobody has ever posted about before in the entire history of the internet, according to Google. I run into incomprehensible changes to the already incomprehensible constellation of Python tooling, made for incomprehensible reasons, and by incomprehensible I mean I just don't care about any of it, I don't have time to care, and I shouldn't have to care. Because no other language or build system forces me to care as much, and as consistently, as Python does. And then I don't care again for 6 months, a year, 2 years, until I need to do another Python thing, and whatever I remember by then isn't exactly obsolete but it's still somehow totally fucking useless.

The universe has taught me through experience that this is what Python is, uniquely. I would welcome it teaching me otherwise.

Multicomp

19 hours ago

I agree with you wholeheartedly. Besides not preferring dynamic programming languages, I would in the past have given Python more of a look because of its low barrier to entry... but I have been repulsed by how horrific the development UX story has been, and how incredibly painful it is to then distribute the code in a portable-ish way.

uv is making me give Python a chance for the first time since a Ren'Py project I did for fun in 2015.

zelphirkalt

16 hours ago

That's because many people don't pay attention to reproducibility of their developed software. If there is no lock file in a repo that nails the exact versions and checksums, then I already know it's likely gonna be a pain. That's shoddy work of course, but that doesn't stop people from not paying attention to reproducibility.

One could argue, that this is one difference between npm and such, and what many people use in the Python ecosystem. npm and cargo and so on are automatically creating lock files. Even people, who don't understand why that is important, might commit them to their repositories, while in the Python ecosystem people who don't understand it, think that committing a requirements.txt only (without checksums) is OK.

However, it is wrong to claim that the Python ecosystem didn't have the tools to do it right. We did have them, well before uv. It took a bit more care though, which is apparently already too much for many people.
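
For example, pip-tools could already produce fully pinned, hash-checked installs long before uv (a sketch; requirements.in is whatever your input file is named):

  pip-compile --generate-hashes requirements.in -o requirements.txt
  pip install --require-hashes -r requirements.txt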

xenophonf

13 hours ago

The lock file shouldn't be in the repository. That forces the developers into maintenance that's more properly the responsibility of the CI/CD pipeline. Instead, the lock file should be published with the other build artifacts—the sdist and wheel(s) in Python's case. And it should be optional so that people who know what they're doing can risk breaking things by installing newer versions of locked dependencies should the need arise.

kortilla

8 hours ago

It absolutely should be. Otherwise you don’t have reproducible builds.

xenophonf

5 hours ago

You can reproduce the release just fine using the lock file published alongside the release. Checking it in creates unnecessary work for devs, who should only be specifying version constraints when absolutely necessary.

acomjean

19 hours ago

You aren’t kidding. Especially if it’s some bioinformatics software that is just hanging out there on GitHub older than a year…

throwaway2037

17 hours ago

Do you think bioinformatics libs written in C++ do not have the same issues?

acomjean

12 hours ago

There weren't that many in the C++ world that weren't precompiled for Linux. Python is bad, but others have issues too.

C/C++ that had to be compiled usually used "make", which I'll admit to being better than the conda/pip experience.

I suspect this is because the C/C++ code was developed by people with more of a comp-sci background. Configure / make / make install... I remember compiling this one:

https://mafft.cbrc.jp/alignment/software/source.html

If the software made it into BioGrids, life was easier:

https://biogrids.org/

But a lot of the languages had their own quirks and challenges (Perl cpan, Java…). Containerization kinda helps.

caycep

17 hours ago

I mean, I think this is par for the course for anything written by a grad student. Be thankful it's not written in MATLAB.

lacker

18 hours ago

The only thing I dreaded more was trying to run other people's C++ projects.

peterfirefly

15 hours ago

vcpkg seems to help a lot there, at least for Windows code and msbuild/Visual Studio.

oivey

13 hours ago

Which means you’re already generally in worse shape than Python. At least Python’s half baked packaging systems try to be multi-platform.

intalentive

17 hours ago

I used to think this sentiment was exaggerated. Then I tried installing Dots OCR. What a nightmare, especially when NVIDIA drivers are involved.

lynndotpy

19 hours ago

I was into Python enough that I put it into my username but this is also my experience. I have had quasi-nightmares about just the bog of installing a Python project.

the__alchemist

18 hours ago

Same! And Python was my first, and is currently my second-highest-skill language. If someone's software's installation involves Python, I move on without trying. It used to be that it would require a Python 2 interpreter.

Honorable mention: Compiling someone else's C code. Come on; C compiles to a binary; don't make the user compile.

optionalsquid

17 hours ago

There's a lot more involved in distributing C (and C++) programs than just compiling them:

I'm assuming a Linux-based system here, but consider the case where you have external dependencies. If you don't want to require that the user installs those, then you've got to bundle them or link them statically, which is its own can of worms.

Not to mention that a user with an older glibc may not be able to run your executable, even if they have your dependencies installed. Which you can, for example, solve by building against musl or a similar glibc alternative. But in the case of musl, the cost is a significant overhead if your program does a lot of allocations, due to it lacking many of the optimizations found in glibc's malloc. Mitigating that is yet another can of worms.

There's a reason why tools like Snap, AppImage, Docker, and many more exist, each of which is its own can of worms.

the__alchemist

16 hours ago

Yea def. I think Linux's ABI diaspora and the way it handles dependencies are the pain, the root of both the distribution methods you mention, and the reason software is distributed as source instead of binaries. I contrast this with Rust (and I know you can do this with C and C++, but it's not the norm):

  - Distribute a single binary (or zip with a Readme, license etc) for Windows
  - Distribute a single binary (or zip etc) for each broad Linux distro; you can cover the majority with 2 or 3. Make sure to compile on an older system (or WSL edition), as you generally get forward compatibility, but not backwards.
  - If someone's running a Linux distro other than what you built for, they can `cargo build --release`, and it will *just work*.

optionalsquid

16 hours ago

Another nice thing is that, if you can live with the slower musl malloc, then building a "universal" Linux binary with Cargo takes just two commands:

$ rustup target add x86_64-unknown-linux-musl

$ cargo build --target x86_64-unknown-linux-musl --release

Similarly for cross-compiling for Windows
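For the Windows case, the analogous pair is something like this, using the GNU target (a sketch; you also need the mingw-w64 toolchain installed, and the MSVC target needs more setup):

  $ rustup target add x86_64-pc-windows-gnu

  $ cargo build --target x86_64-pc-windows-gnu --release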

tyingq

11 hours ago

It may be fixed now, but the devil's in the details. As one example, musl has (or had) chronic issues with its DNS resolver and large responses.

mr_toad

16 hours ago

> Honorable mention: Compiling someone else's C code. Come on; C compiles to a binary; don't make the user compile.

Unless you’re on a different architecture, then having the source code is much more useful.

peterfirefly

15 hours ago

Or often just the same architecture with a slightly different OS version.

zippergz

17 hours ago

I dread running my own Python projects if I haven't worked with them in a while.

TheCondor

18 hours ago

How about shipping one? Like, even just shipping some tools to internal users is a pain.

jkercher

6 hours ago

Couldn't agree more. I have a project at work from 2016 that builds multiple different HMIs (C++) along with 2 embedded systems (C). They all have to play nicely with each other as they share some structures and can all be updated in the field with a single file on a USB stick. So there is a bash script that builds everything from a fresh clone, makes update files, and some other niceties. Then, there is a single python script that generates a handful of tables from a json file.

Guess which part of the build I spent time fixing the other day... It wasn't the ~200,000 lines of C/C++ or the 1000+ line bash script. No. It was the 100 lines of Python that were last touched 2 years ago. Python really doesn't work as a scripting language.

luckydata

18 hours ago

The python community was in profound denial for a very long time.

dataflow

15 hours ago

Not even trying to build other people's C/C++ projects on *nix?

kristopolous

16 hours ago

I really don't understand this. I find it really easy.

RobertoG

19 hours ago

Pfff... "other people's projects"... I was not even able to run my own projects until I started using Conda.

LtWorf

17 hours ago

Just stick to what's in your linux distribution and you've got no problems.

esseph

17 hours ago

No need, run Python in a container. No need to mix with what's installed on the host OS.

https://hub.docker.com/_/python

1oooqooq

17 hours ago

This manages to be even worse. Since its setup has to be full of holes to be usable (e.g. reaching out onto the filesystem), you get the worst of both worlds: random binaries without real isolation, plus the dead end for updates that you get in practice when dealing with hundreds of containers outside of a professionally managed cluster.

mk89

19 hours ago

I have used

pip freeze > requirements.txt

pip install -r requirements.txt

Way before "official" lockfiles existed.

Your requirements.txt becomes a lockfile, as long as you accept not using version ranges.

Having this in a single tool is nice, sure, but I don't understand the hype when it was basically already there.

icedchai

19 hours ago

That works for simple cases. Now, update a transitive dependency used by more than one dependency. You might get lucky and it'll just work.

mk89

19 hours ago

Not sure how uv helps here, because I am not very familiar with it.

With pip you update a dependency: it won't work if it's not compatible, it'll work if it is. Not sure where the issue is?

kstrauser

17 hours ago

> it won't work if it's not compatible

This is very new behavior in pip. Not so long ago, imagine this:

You `pip install foo` which depends on `bar==1.0`. It installs both of those packages. Now you install `pip install baz` which depends on `bar==2.0`. It installs baz, and updates bar to 2.0. Better hope foo's compatible with the newer version!

I think pip only changed in the last year or two to resolve conflicts, or die noisily explaining why it couldn't be done.

pridkett

19 hours ago

Simple for simple cases - but then you update a dependency, and that updates a dependency that has a narrow window of allowed dependency versions because one version had a security issue, which causes you to downgrade three other packages.

It can get complicated. The resolver in uv is part of its magic.

https://docs.astral.sh/uv/reference/internals/resolver/

noosphr

18 hours ago

JavaScript has truly rotted the brains of software developers.

You include the security patches of whatever your dependencies are into your local vetted PyPI repository. You control what you consider liabilities, and you don't get shocked by breakages in what should be minor versions.

Of course, you have to be able to develop software and not just snap Legos together to manage a setup like that. Which is why uv is so popular.

throw-the-towel

17 hours ago

You're implying that I have to run a local PyPI just to update some dependencies for a project? When other languages somehow manage without that? No way I'm doing that.

icedchai

17 hours ago

Some organizations force you to use their internal dependency repos because the "IT department" or similar has blessed only certain versions in the name of "security" (or at least security theater.)

Inevitably, these versions are out-of-date. Sometimes, they are very, very out of date. "Sorry, I can only install [version from 5 years ago.]" is always great for productivity.

I ran into this recently with a third-party. You'd think a 5 year old version would trigger alarm bells...

noosphr

9 hours ago

I use 30 year old software regularly. Newer doesn't mean working.

icedchai

3 hours ago

Sure. I do a lot of retrocomputing and that's fine. I have OSes from the 80's running in emulators.

But when you're developing software, you want the newer stuff. Would you use MySQL 5.0 from 2005? No, you'd be out of your mind.

Capricorn2481

18 hours ago

You can make it a language flame war, but the Python ecosystem has had no problem making this bed for themselves. That's why people are complaining about running other people's projects, not setting up their own.

Sensible defaults would completely sidestep this, that's the popularity of uv. Or you can be an ass to people online to feel superior, which I'm sure really helps.

9dev

18 hours ago

I'm wondering if people like you are getting paid to vet other people's libraries? Because on every modern project I have ever seen, you couldn't do much else for the rest of the day with the amount of library updates you'd have to be vetting.

Capricorn2481

17 hours ago

He's a consultant. Making everyone else sound incompetent is part of the gig.

oivey

13 hours ago

Cool so how does that work when you’re writing a library that you want to distribute to other people?

zamalek

15 hours ago

> Not sure how uv helps here, because I am not very familiar with it.

Which makes you part of the people the GP is referring to? Try using it in anger for a week; you'll come to understand.

It's like Sisyphus rolling a cube up a hill and being offered a sphere instead: "no thanks, I just push harder when I have to overcome the edges."

auraham

19 hours ago

Can you elaborate on this? How is npm/cargo/etc better than pip on this regard?

As far as I know, files like requirements.txt, package.json, cargo.toml are intended to be used as a snapshot of the dependencies in your project.

In case you need to update dependency A that also affects dependency B and C, I am not sure how one tool is better than other.

zelphirkalt

15 hours ago

Open a requirements.txt and a package-lock.json next to each other and compare. Then you will know the answer to the question of what npm, cargo, and others are doing better than pip. Oh, did I sneak a "lock" in there? Damn right I did.

jeremyjh

19 hours ago

They will resolve a version that works for all dependencies if it exists.

morshu9001

15 hours ago

Even more importantly, uv forces you to do it right like npm always did

halostatue

13 hours ago

npm did not always do it right, and IMO still does not do it completely right (nor does pnpm, my preferred replacement for npm -- but it has `--frozen-lockfile` at least that forces it to do the right thing) because transitive dependencies can still be updated.

cargo can also update transitive dependencies (you need `--locked` to prevent that).

Ruby's Bundler does not, which is preferred and is the only correct default behaviour. Elixir's mix does not.

I don't know whether uv handles transitive dependencies correctly, but lockfiles should be absolute and strict for reproducible builds. Regardless, uv is an absolute breath of fresh air for this frequent Python tourist.

debazel

9 hours ago

npm will not upgrade transitive dependencies if you have a lockfile. All the `--frozen-lockfile` or `npm ci` commands do is refuse to proceed if the versions specified inside `package.json` are incompatible with the lockfile, which should never happen unless you have edited the `package.json` dependencies by hand.

(It also removes all untracked dependencies in node_modules, which you should also never have unless you've done something weird.)

bdangubic

19 hours ago

it won’t work of course, no one is that lucky :)

morkalork

19 hours ago

I remember advocating for running nightly tests on every project/service I worked on, because inevitably one night one of the transitive dependencies would update and shit would break. At least with the nightly test it was forced to break early, versus when you needed to do something else, like an emergency bug fix, and ran into it then.

kstrauser

17 hours ago

That works, more or less. But now you have a requirements.txt file with 300 dependencies. Which ones do you actually care about, and which are just transitive things that your top-level deps brought along for the ride? And a year later, when GitHub's Dependabot is telling you have a security vulnerability in some package you've never heard of, do you remember if you even care about that package in the first place, or if it's left over cruft from that time you experimented with aiohttp instead of httpx?

roywiggins

16 hours ago

I always just used pip-tools. Your requirements.in is the file that is human-readable and -writable, and sets your top-level deps and the version ranges you want. requirements.txt is your lockfile that you generate from .in with pip-compile. pip-compile writes out comments specifying from where each package in requirements.txt is being required.

uv does it a lot faster and generates requirements.txts that are cross-platform, which is a nice improvement.
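For anyone who hasn't seen the pip-tools flow, it looks roughly like this (package names and versions are only illustrative):

  # requirements.in -- human-written, top-level deps only
  requests>=2.28

  $ pip-compile requirements.in

  # requirements.txt -- generated pins, with provenance comments
  certifi==2024.8.30
      # via requests
  requests==2.32.3
      # via -r requirements.in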

rtpg

19 hours ago

As a “pip is mostly fine” person, we would direct the result to a new lock file, so you could still have your direct deps, and then pin transitives and update.

Pip's solver could still cause problems in general on changes.

uv having a better solver is nice. Being fast is also nice. Mainly, though, it feeling like a tool that is maintained and can be improved upon without ripping one's hair out is a godsend.

handystudio

16 hours ago

Totally agree, UV's solver speed is exciting

pnt12

18 hours ago

This is way less than what uv and other package managers do:

- dev dependencies (or other groups)
- distinguishing between direct and indirect dependencies (useful if you want to cut some fat from a project)
- dependencies with optional extra dependencies (if you remove the main one, it will delete the orphans when relevant)

It's not unachievable with pip and virtualenvs, but verbose and prone to human error.

Like C: if you're careful enough, it can be memory safe. But teams would rather rely on memory safe languages.

2wrist

19 hours ago

It also manages the runtime, so you can pin a specific runtime to a project. It is very useful and worth investigating.

mk89

19 hours ago

I think it's a great modern tool, don't get me wrong.

But the main reason shouldn't be the "lockfile". I was replying to the parent comment mainly for that particular thing.

tecoholic

18 hours ago

I am in the same boat. I like uv for its speed, the other niceties it brings, and being a single tool to manage different things. But the lockfile is not that big a deal. I never got Poetry either. Tried it in a project once and the lockfile was a pain with merges. I didn't spend much time on it, so maybe I didn't understand the tool and workflow or whatever, but pip and pip-tools were just fine working with requirements.txt.

alejoar

5 hours ago

Hey, just so you know, newer lockfiles are meant to fully replace old ones; you shouldn't bother solving merge conflicts in these files, just always accept all the new changes.

What you SHOULD resolve are conflicts in the package/project file. Once those are solved, just create a new lockfile and replace the old one.

This applies to lockfiles in any project, Python or not.
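With uv, for instance, resolving a lockfile conflict during a merge amounts to something like this (a minimal sketch; either side of the conflict works as a starting point, since the file gets regenerated):

  $ git checkout --theirs uv.lock   # take either side of the conflict
  $ uv lock                         # regenerate from the merged pyproject.toml
  $ git add uv.lock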

selcuka

17 hours ago

The canonical way to do this with pip was using Constraints Files [1]. When you pollute your main requirements.txt it gets harder to see which package is an actual dependency of your project, and which ones are just sub-dependencies. Constraint files also let you not install a package if it's no longer a sub-dependency.

That being said, the uv experience is much nicer (also insanely fast).

[1] https://pip.pypa.io/en/stable/user_guide/#constraints-files
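A minimal sketch of that split (package names and versions are illustrative):

  # requirements.txt: direct dependencies only
  requests

  # constraints.txt: pins for the whole tree, sub-dependencies included
  certifi==2024.8.30
  requests==2.32.3
  urllib3==2.2.3

  $ pip install -r requirements.txt -c constraints.txt

Constraints only pin versions; they never cause anything to be installed, which is why packages that stop being sub-dependencies fall away naturally.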

FuckButtons

17 hours ago

Honestly, this feels like the difference between Cmake and cargo, sure Cmake does work and you can get to do everything you need, you just need discipline, knowledge and patience. On the other hand, you could just have a tool that does it all for you so you can get back to doing the actual work.

avidphantasm

17 hours ago

I don’t get the hype either. Every time I’ve tried to use tools like pyenv or pipenv they fall down when I try to install anything that doesn’t provide wheels (GDAL), so I give up and stick to pip and virtualenv. Does uv let me install GDAL without hassle?

kstrauser

17 hours ago

Pyenv's a different animal. It's meant for installing multiple Python versions at once so that you're not stuck with whatever dog your base OS happens to ship.

Pipenv tried to be what uv is, but it never did seem to work right, and it had too many weird corner cases ("why is it suddenly taking 3 hours to install packages? why it is literally impossible to get it to upgrade one single dependency and not all the others?") to ever be a contender.

ghusto

18 hours ago

I've never even understood the virtual env dogma. I can see how version conflicts _could_ happen, but they never have. Admittedly, I'm surprised I never have issues installing globally, especially since others keep telling me what a terrible idea it is and how they had nightmare-scenario-X happen to them.

selcuka

17 hours ago

I write Python code for a living and no two projects I work on have the exact same dependencies. This is especially true when working with microservices, or working for multiple customers.

tecoholic

18 hours ago

How do you work with multiple projects with different versions of the same dependencies? If you are using the “system python” for everything?

ghusto

8 hours ago

Not system Python (not least because that's a hassle to do these days anyway, with all the safeguards OS vendors have put in), but _my_ version of globally. My (user) global PyEnv version, for example.

Now, having said that, I suspect PyEnv is doing some voodoo behind the scenes, because I occasionally see messages like "Package X wants version N, but you have version N+1". I've never investigated them though, since both old and new packages seem to work just fine regardless.

LtWorf

9 hours ago

> How do you work with multiple projects with different versions of the same dependencies?

You don't… you use the same versions for everything :)

electroglyph

18 hours ago

it's very common for different projects to have different requirements, especially for fast moving libraries like transformers. if you rarely run python stuff it might not be a big deal, but i'd rather not have to reinstall stuff (especially big stuff like pytorch builds) every time i switch projects.

kstrauser

17 hours ago

That's exactly it. Imagine your company has multiple Python repos, and one depends on foo>=1.0,<2.0, and another depends on foo>=2.0. Venvs let you configure completely isolated environments for each so that they can peacefully coexist. I would not for a moment consider using Python without virtualenvs, though I'm not opinionated about which tool manages them. Uv? Great. Poetry? Fine. `python -m venv`? Whatever. They all get the job done.

Honestly, I can't think of a single good reason not to want to use a venv for Python.
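For anyone unfamiliar, the bare-bones version of that isolation, reusing the hypothetical `foo` from above:

  $ cd repo-a && python -m venv .venv && .venv/bin/pip install 'foo>=1.0,<2.0'

  $ cd repo-b && python -m venv .venv && .venv/bin/pip install 'foo>=2.0'

Each repo's .venv holds its own copy of foo, so the two constraints never collide.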

LtWorf

9 hours ago

Using the same version of everything lets you have a much easier time when a vulnerability is discovered?

kstrauser

an hour ago

How so? That hasn’t been my experience.

digisign

18 hours ago

I only ever had it be a problem with large, poorly maintained projects from work. You know the kind that has two web frameworks required in the same project, and two ORMs, etc. ;-) That one I definitely put into a venv. But my stuff, no.

not_kurt_godel

18 hours ago

And then you're sunk the moment anyone else needs to run your code, or even if you just need to run your own code on another machine.

digisign

14 hours ago

Never happened.

not_kurt_godel

an hour ago

I salute you for never needing a new computer, ever.

esseph

17 hours ago

They happen /all the time/.

For a long time there were even incompatibilities between the RHEL host Python version and the Python version the Red Hat Ansible team was shipping.

ghusto

8 hours ago

> They happen /all the time/.

So I keep hearing ;)

Meanwhile, on my machines ...

ifwinterco

19 hours ago

It is indeed fairly simple to implement it, which is why it's so weird that it's never been implemented at a language level

12345hn6789

16 hours ago

Oops, you forgot to activate your venv and now your env is messed up.

epage

19 hours ago

Good luck if you need cross-platform `requirements.txt` files.

mk89

19 hours ago

This is a good use case. Not sure how this is typically solved, I guess "requirements-os-version.txt"? A bit redundant and repetitive.

I would probably use something like this: https://stackoverflow.com/questions/17803829/how-to-customiz...

trenchpilgrim

18 hours ago

But then you have to m x n x o it for different combinations of Python version, OS, CPU architecture, GPU make/model... uv will solve it for you in milliseconds.

freehorse

19 hours ago

How does uv solve that? Like, if you use dependencies that do not cross platforms very well?

mirashii

19 hours ago

uv finds a dependency resolution that works for all platforms by default, and can do things like fork the resolution and choose different versions based on platform or python version requirements.
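The mechanism underneath is standard PEP 508 environment markers; a pinned, platform-forked requirements file looks roughly like this (package and versions illustrative):

  numpy==2.0.1 ; python_version >= "3.10"
  numpy==1.26.4 ; python_version < "3.10"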

chrisweekly

20 hours ago

Webdev since 1998 here. Tabling the Python-vs-JS/etc. debate to comment on npm per se: PNPM is better than npm in every way. Strongest possible recommendation to use it instead of npm; it's faster, more efficient, safer, and more deterministic. See https://pnpm.io/motivation

Ant59

19 hours ago

I've gone all-in on Bun for many of the same reasons. Blazingly fast installs too.

https://bun.sh/

ifwinterco

19 hours ago

I think at this point everyone on hacker news with even a passing interest in JS has heard of bun, it's promoted relentlessly

trenchpilgrim

18 hours ago

I'm still meeting devs who haven't heard of it and get their minds blown when they replace npm in their projects. Every day is a chance to meet one of the lucky 10000: https://xkcd.com/1053/

fud101

11 hours ago

I avoided JS for the longest time because I wanted nothing to do with node or npm. With bun, I'm finally enjoying JavaScript.

catlifeonmars

13 hours ago

Bun still segfaults way too often for my comfort but I’m crossing my fingers waiting for it to mature. It is definitely nice to have an alternative runtime to Node.

throw-the-towel

17 hours ago

Did you experience any compatibility problems with Bun?

tracker1

18 hours ago

Deno is pretty sweet too... shell scripts that don't need a package.json or a node_modules directory for dependencies.

chrisweekly

14 hours ago

Yeah, Deno 2 is pretty compelling.

nullbyte

19 hours ago

I find pnpm annoying to type, that's why I don't use it

DemocracyFTW2

19 hours ago

IME after years of using pnpm exclusively having to type `pnpm install` instead of `npm install` is easily the single biggest drawback of replacing `npm` with `pnpm`, so yes.

FWIW I use zsh with auto-completion-as-you-type, so just hitting `p` on an empty command line will recall the most recent command starting with `p` (which was likely `pnpm`), and you can refine with further keystrokes and accept longer prefixes (like I always do with `git add` to choose between typical ways to complete that statement). IMO people who don't use auto-completion are either people who have a magical ability to hammer text into their keyboards with the speed of light, or people who don't know about anything hence don't know about auto-completion, or terminally obsessive types who believe that only hand-crafting each line is worthwhile.

I don't know which type of person you are but since typing `pnpm` instead of `npm` bothers you to the degree you refuse to use `pnpm`, I assume you must be of the second type. Did you know you can alias commands? Did you know that no matter your shell it's straightforward to write shell scripts that do nothing but replace obnoxious command invocations with shorter ones? If you're a type 3 person then of course god forbid, no true hacker worth their salt will want to spoil the purity of their artisanal command line incantations with unnatural ersatz-commands, got it.

ASalazarMX

18 hours ago

Command alias? Even Windows can do them these days.

anp

20 hours ago

Might be worth noting that npm didn't have lock files for quite a long time, which is the era during which I formed my mental model of npm hell. The popularity of yarn (again importing bundler/cargo-isms) seems like maybe the main reason npm isn't as bad as it used to be.

no_wizard

18 hours ago

npm has evolved, slowly, but evolved, thanks to yarn and pnpm.

It even has some (I feel somewhat rudimentary) support for workspaces and isolated installs (what pnpm does)

WatchDog

17 hours ago

Lock files are only needed because of version ranging.

Maven worked fine without semantic versioning and lock files.

Edit: Changed "semantic versioning" to "version ranging"

bastawhiz

16 hours ago

> Maven worked fine without semantic versioning and lock files.

No, it actually has the exact same problem. You add a dependency, and that dependency specifies a sub-dependency against, say, version `[1.0,)`. Now you install your dependencies on a new machine and nothing works. Why? Because the sub-dependency released version 2.0 that's incompatible with the dependency you're directly referencing. Nobody likes helping to onboard the new guy when he goes to install dependencies on his laptop and stuff just doesn't work because the versions of sub-dependencies are silently different. Lock files completely avoid this.

Sankozi

6 hours ago

It is possible to set version ranges, but it is hard to see this in the real world. Everyone is using pinned dependencies.

Version ranges are a really bad idea, as we can see with npm.

WatchDog

16 hours ago

My apologies I should have said "version ranging" instead of "semantic versioning".

Before version ranging, maven dependency resolution was deterministic.

bastawhiz

13 hours ago

Always using exact versions avoids this (your pom.xml essentially is the lock file), but it effectively meant you could never upgrade anything unless every dependency and transitive dependency also supported the new version. That could mean upgrading dozens of things for a critical patch. And it's surely one of the reasons log4j was so painful to get past.

WatchDog

5 hours ago

I’ve been out of the Java ecosystem for a while, so I wasn’t involved in patching anything for log4j, but I don’t see why it would be difficult for the majority of projects.

Should just be a version bump in one place.

In the general case Java and maven doesn’t support multiple versions of the same library being loaded at once(not without tricks at least, custom class loaders or shaded deps), so it shouldn’t matter what transitive dependencies depend on.

omcnoe

14 hours ago

Maven also has some terrible design where it will allow incompatible transitive dependencies to be used, one overwriting the other based on “nearest wins” rather than returning an error.

zelphirkalt

15 hours ago

If in some supply-chain attack someone switches out a version's code under your seating apparatus, then good luck without lock files. I for one prefer being notified about checksums of things suddenly changing.

WatchDog

10 hours ago

Maven releases are immutable

zelphirkalt

6 hours ago

Sounds like the Common Lisp approach, where there are editions, or whatever they call them, and those are sets of dependencies at specific versions.

But the problem with that is when you need another version of a library that is not in that edition. For example, when a backdoor or CVE gets discovered that you have to fix ASAP, you might not want to wait for the next Maven release. Furthermore, Maven is Java-ecosystem stuff, where things tend to move quite slowly (enterprisey), and it comes with its own set of issues.

jrochkind1

17 hours ago

Yeah, python's tooling for dependency management was definitely not just fine, it was a disaster.

Coming from Ruby. However, I think uv has actually now surpassed bundler and the Ruby standard toolset for these things. It has definitely surpassed npm, which is also not fine. Couldn't speak for cargo.

icedchai

20 hours ago

poetry gave us lock files and consistent installs for years. uv is much, much faster however.

beeb

20 hours ago

I used poetry professionally for a couple of years and hit so many bugs, it was definitely not a smooth experience. Granted that was probably 3-4 years ago.

teekert

19 hours ago

I always loved poetry but then I’d always run into that bug where you can’t use repos with authentication. So I’d always go somewhere else eventually.

Some time ago I found out it does work with authentication, but their “counter ascii animation” just covers it… bug has been open for years now…

palm-tree

19 hours ago

I started using poetry about 4 years ago and definitely hit a lot of bugs around that time, but it seems to have improved considerably. That said, my company has largely moved to uv, as it does seem easier to use (particularly for devs coming from other languages).

icedchai

20 hours ago

I've occasionally run into performance issues and bugs with dependency resolution / updates. Not so much recently, but at a previous company we had a huge monorepo and I've seen it take forever.

IshKebab

18 hours ago

The very first time I tried to use Poetry I ran into a bug where it couldn't resolve some simple dependencies.

uv actually works.

no_wizard

18 hours ago

There was pipenv before that too, which also had a lockfile.

Funny how these things get forgotten to history. There's lots of prior art when it comes to replacing pip.

edit: here's an HN thread about pipenv, where many say the same things about it as they are about UV and Poetry before https://news.ycombinator.com/item?id=16302570

kstrauser

17 hours ago

Except pipenv was never anywhere near as good. It meant well but never delivered.

epistasis

13 hours ago

Exactly. I jumped onto pipenv, poetry, and pyenv as soon as I heard about them, and though they provided advantages, they all had significant flaws that prevented me from giving a full-throated endorsement of them as the solution to Python environments.

However, I have zero reservations about uv. I have not encountered bugs, and the features that are present are ready for complete adoption. Plus there are massive speed improvements. There is zero downside to using uv in any application where it can be used, and real advantages besides.

rcleveng

20 hours ago

and pip-compile before that.

Agree that uv is way, way faster than any of that, and really just a joy to use in its simplicity.

ShakataGaNai

20 hours ago

I have to agree that there were a lot of good options, but uv's speed is what sets it apart.

Also the ability to have a single script with deps declared in a TOML header, super easily.

Also also: the ability to use a random Python tool in effectively seconds with no faffing about.

RatchetWerks

13 hours ago

I've been saying this for years! JS gets a lot of hate for dependency hell.

Why?

It's almost too easy to add a dependency compared to writing your own functions.

Now compare that to adding a dependency to a C++ project.

zelphirkalt

16 hours ago

Tooling like npm, cargo, and others existed well before uv came up. I have used poetry years ago, and have had reproducible virtual environments for a long time. It's not like uv, at least in that regard, adds much. The biggest benefit I see so far, and that is also why I use it over poetry, is that it is fast. But the benefit of that is small, since usually one does not change the dependencies of a project that often, and when one does, one can also wait a few seconds longer.

brightball

13 hours ago

I tried Python for the first time after I’d been coding with multiple other languages for about 15 years.

The environment and dependency experience created so much friction compared to everything else. It changed my perspective on Docker for local dev.

Glad to hear it seems to finally be fixed.

mbac32768

17 hours ago

> that when Python devs finally get a taste of npm/cargo/bundler in their ecosystem, they freaking love it. Because yes, npm has its issues but lock files and consistent installs are amazing

I think it's more like Rust devs using Python and thinking what the fuck why isn't this more like rustup+cargo?

gigatexal

20 hours ago

the thing is I never had issues with virtual environments -- uv just allows me to easily determine what version of python that venv uses.

j2kun

19 hours ago

you mean you can't just do `venv/bin/python --version`?

shlomo_z

19 hours ago

he means "choose", not "check"

gigatexal

17 hours ago

Yes sorry you’re correct. It allows me to specify a version of Python.

doright

16 hours ago

Why did it take this long? Why did so many prior solutions ultimately fall flat after years and years of attempts? Was Python package/environment management such a hard problem that only VC money could have fixed it?

morshu9001

15 hours ago

It's not fixed quite yet because the default recommended way is still pip. And that's the same reason past attempts didn't work.

stavros

15 hours ago

It didn't, though? Poetry was largely fine, it's just that uv is so much faster. I don't think uv is that much different from Poetry in the day-to-day dependency management, I'm sure there are some slight differences, but Poetry also brought all the modern stuff we expected out of a package manager.

odyssey7

18 hours ago

Python might have been better at this but the community was struggling with the 2 vs 3 rift for years. Maybe new tooling will change it, but my personal opinion is that python does not scale very well beyond a homework assignment. That is its sweet spot: student-sized projects.

morshu9001

14 hours ago

Imo the community should've rejected Python 3 and said, find a way to improve things without breaking everyone. JS managed to do it.

pansa2

13 hours ago

The community basically did reject Python 3, at first. Almost nobody used 3.0 / 3.1 / 3.2, to the point where I’ve seen them retconned as beta releases.

Even then though, the core developers made it clear that breaking everyone’s code was the only thing they were willing to do (remember Guido’s big “No 2.8” banner at PyCon?), which left the community with no choice.

tiltowait

15 hours ago

I don't know, Poetry's existed for years, and people still use requirements.txt. Uv is great but isn't exactly unique in Python-land.

wraptile

15 hours ago

Yeah I use poetry, uv and requirements.txt - all great tools for their respective niches.

temporallobe

16 hours ago

Yep, working with bundler and npm for a decade plus has made me appreciate these tools more than you can know. I had just recently moved to Python for a project and was delighted to learn that Python had something similar; and indeed uv is more than just a package manager like bundler. It's like bundler + rbenv/rvm.

And inspired by uv, we now have rv for RoR!

nateglims

16 hours ago

Personally I never thought it was fine, but the solutions were all bad in some way that made direct venv and requirements files preferable. Poetry started to break this but I had issues with it. uv is the first one that actually feels good.

tyingq

12 hours ago

> but lock files and consistent installs are amazing

Yes, though poetry has lock files, and it didn't create the same positive feelings uv does :)

globular-toast

19 hours ago

I've been using pip-tools for the best part of a decade. uv isn't the first time we got lock files. The main difference with uv is how it abstracts away the virtualenv and you run everything using `uv run` instead, like cargo. But you can still activate the virtualenv if you want. At that point the only difference is it's faster.

ForHackernews

17 hours ago

To be fair, Poetry has done everything uv does for about a decade. uv is much faster, which is great, but Poetry already had lock files, integrated venv management, etc.

silverwind

17 hours ago

Yep, coming from poetry, uv is a pure speed increase with the same feature set.

zamalek

15 hours ago

I would dread cloning a Python project more than I would a C++ one, and that was the sole reason I made a real effort to avoid the language entirely.

zellyn

17 hours ago

What weird shadow-universe do you inhabit where you found Python developers telling you the tooling was just fine? I thought everyone has agreed packaging was a trash fire since the turn of the century.

morshu9001

14 hours ago

Hackernews and also the official Python maintainers

ThinkBeat

16 hours ago

there are severe problems with npm as well. It is not a model I hope is replicated.

pydry

20 hours ago

>finally get a taste of npm

good god no thank you.

>cargo

more like it.

internetter

19 hours ago

cargo is better than npm, yes, but npm is better than pip (in my experience)

DemocracyFTW2

18 hours ago

As someone who moved from Python to NodeJS/npm ~10yrs ago I can fully support that statement. Dissatisfaction with Python's refusal to get its dependency/package-management act together and seeing how reasonably the task is being dealt with by `npm`—notably with all its flaws—made me firmly stay with NodeJS. Actually virtualenv was for me another reason to keep my fingers out of whatever they're doing now over there in Python-land, but maybe `uv` can change that.

morshu9001

14 hours ago

Yeah but I want uv to be default first

j45

16 hours ago

I feel a little like this too.

My default answer to using Python in more ways than I already did was no, because the tooling wasn't there for others to handle it, no matter how easy it was for me.

I feel uv will help python go even more mainstream.

Spivak

18 hours ago

But you are just using virtualenv with pip. It doesn't change any of the moving pieces except that uv is virtualenv aware and will set up / use them transparently.

You've been able to have the exact same setup forever with pyenv and pyenv-virtualenv except with these nothing ever has to be prefixed. Look, uv is amazing and I would recommend it over everything else but Python devs have had this flow forever.

dragonwriter

18 hours ago

> But you are just using virtualenv with pip.

No, you aren't.

> It doesn't change any of the moving pieces

It literally does, though it maintains a mostly-parallel low-level interface; the implementation is replaced with an improved one (in speed, in dependency solving, and in other areas). You are using virtual environments (but not venv/virtualenv) and the same sources that pip uses (but not pip).

> You've been able to have the exact same setup forever with pyenv and pyenv-virtualenv except with these nothing ever has to be prefixed.

Yes, you can do a subset of what uv does with those without prefixes, and if you add pipx and hatch (though with hatch you’ll be prefixing for much the same reason as in uv) you’ll get closer to uv’s functionality.

> Look, uv is amazing and I would recommend it over everything else but Python devs have had this flow forever.

If you ignore the parts of the flow built around modern Python packaging standards like pyproject.toml, sure, pieces of the flow have been around and supported by the right constellation of other standard and nonstandard tools for a while.

NaomiLehman

18 hours ago

conda was great to me

bastawhiz

16 hours ago

conda ruined my shell and never successfully worked for me. I guess YMMV

morshu9001

14 hours ago

All my experience with Conda is from helping my friend nuke it off his laptop

NSPG911

16 hours ago

have you tried pixi for this?

insane_dreamer

11 hours ago

same here; I now prefer uv but conda served us very well, and allowed us to maintain stable reproducible environments; being able to have multiple environments for a given project is also sometimes handy vs a single pyproject.toml

insane_dreamer

11 hours ago

other than being much slower than uv, conda has worked great for years

I do prefer uv but it's not like sane python env management hasn't existed

WesolyKubeczek

20 hours ago

I somehow had quite enough problems going from bundler 1.13 to 1.16 to 2.x some years ago. I’m glad we have killed that codebase with fire.

kevin_thibedeau

20 hours ago

> you should just use virtualenv with pip

This is the most insulting take in the ongoing ruination of Python. You used to be able to avoid virtualenvs and have installed scripts and dependencies directly runnable from any shell. Now you get endlessly chastised for trying to use Python as a general purpose utility. Debian was a bastion of sanity with the split between dist-packages and site-packages, but that's ruined now too.

ElectricalUnion

19 hours ago

Unless all the Python dependencies you ever used were available in your distro (and at that point you're no longer using pip, you're using dpkg...), this never worked well. What solves this well is PEP 723 and the tooling around it.

With PEP 723 and comfortable tooling (like uv), you now get scripts that are "actually directly runnable", not just "fake directly runnable, oops, forgot to apt-get install something, sorta runnable", and they work reliably even when stuff around you is updated.

zahlman

19 hours ago

> You used to be able to avoid virtualenvs and install scripts and dependencies directly runnable from any shell.

This wasn't really the case; in principle anything you installed in the system Python environment, even "at user level", had the potential to pollute that environment and thus interfere with system tools written in Python. And if you did install it at system level, that became files within the environment your system package manager is managing, that it doesn't know how to deal with, because they didn't come from a system package.

But it's worse now because of how many system tools are written in Python — i.e., a mark of Python's success.

Notably, these tools commonly include the system package manager itself. Since you mentioned Debian (actually this is Mint, but ya know):

  $ file `which apt`
  /usr/local/bin/apt: Python script, ASCII text executable

> Now you get endlessly chastised for trying to use Python as a general purpose utility.

No, you don't. Nothing prevents you from running scripts with the system Python that make use of system-provided libraries (including ones that you install later with the system package manager).

If you need something that isn't packaged by your distro, then of course you shouldn't expect your distro to be able to help with it, and of course you should expect to use an environment isolated from the distro's environment. In Python, virtual environments are the method of isolation. All reasonable tooling uses them, including uv.

> Debian was a bastion of sanity with the split between dist-packages and site-packages but that's ruined now too.

It's not "ruined". If you choose to install the system package for pip and to use it with --break-system-packages, the consequences are on you, but you get the legacy behaviour back. And the system packages still put files separately in dist-packages. It's just that... doing this doesn't actually solve all the problems, fundamentally because of how the Python import system works.

noirscape

19 hours ago

Nowadays pip also defaults to installing to the user's home folder if you don't run it as root.

Basically the only thing missing from making pip install a smooth experience is something like npx, to cleanly run modules/binaries that were installed to that directory. It's still futzing with the PATH variable to run those scripts correctly.
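Concretely, the futzing is usually just this on Linux (the package is an arbitrary example; the exact path is platform-dependent):

  $ pip install --user yt-dlp
  $ export PATH="$HOME/.local/bin:$PATH"   # where --user puts console scripts
  $ yt-dlp --version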

zahlman

17 hours ago

> Nowadays pip also defaults to installing to the users home folder if you don't run it as root.

This could still cause problems if you run system tools as that user.

I haven't checked (because I didn't install my distro's system package for pip, and because I use virtual environments properly) but I'm pretty sure that the same marker-file protection would apply to that folder (there's no folder there, on my system).

whywhywhywhy

19 hours ago

> Python as a general purpose utility

This ideology is what caused all the problems to begin with: base Python is built as if it's the only thing in the entire operating system's environment, while its entire packaging system is built in a way that makes that impossible without manually juggling package conflicts/incompatibilities.

1718627440

18 hours ago

This is very true! I was highly surprised when I installed Python from source and found out that this problem has been solved for decades. You can have different Python versions in the same prefix just fine; you just need to pick a default one to install with `make install` and install all the others with `make altinstall`.
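For reference, the from-source side-by-side install is roughly this (a sketch; prefix and version are whatever you pick):

  $ ./configure --prefix=/usr/local
  $ make
  $ sudo make altinstall   # installs e.g. python3.13 without touching the default python3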

whalesalad

19 hours ago

it's because so many essential system tools now rely on python, and if you install arbitrary code outside of a venv it can clobber the global namespace and break the core OS' guarantees.

I do agree it is annoying, and what they need to do is just provide an automatic "userspace" virtualenv for anything a user installs themselves... but that is a pandoras box tbh. (Do you do it per user? How does the user become aware of this?)

dragonwriter

19 hours ago

What they needed to do is allow side-by-side installs of different versions of the same distribution package and allow specifying or constraining versions at import time, then you wouldn't have the problem at all.

But that's probably not practical to retrofit given the ecosystem as it is now.

kevin_thibedeau

14 hours ago

That couldn't happen with Debian's dist-packages, which was explicitly for the system tools managed by apt.

aunderscored

19 hours ago

pipx solves this perfectly.

zahlman

19 hours ago

For "applications" (which are distributed on PyPI but include specified entry points for command-line use), yes. For development — installing libraries that your own code will use — you'll still generally need something else (although the restriction is really quite arbitrary).

aunderscored

12 hours ago

Agreed! Sorry, my read was for apps. You can use --user with pip to install into the user site rather than the system site; however, it still causes overlap, which can be problematic.

jonnycomputer

2 hours ago

I have to say that I am extremely reluctant to switch over to yet another Python management system (packaging, environment, Python version). Every few years someone says: this is it, switch to Poetry! Okay, I did. And, at least for some academic packages (psychopy, I'm looking at you), it was a friggin' disaster.

So: will uv install psychopy (say version 3.2.4)?

JodieBenitez

an hour ago

Well... I did pip -> poetry -> rye -> uv in less than a year, so I can understand the fatigue. But the fact is that uv is well above the rest and was well worth the effort (and the wait).

oofbey

an hour ago

I followed much of this path as well, but with conda (and mamba) in the mix. They all had fairly obvious flaws that uv doesn’t AFAICT.

Poetry was the worst for me. It doesn't even try to manage the Python distribution, so it's only a partial solution. It was so slow our CI/CD would time out and fail. And I watched the maintainers actively refuse to fix super annoying bugs for YEARS, blaming others for the problem.

atonse

21 hours ago

These rust based tools really change the idea of what's possible (when you can get feedback in milliseconds). But I'm trying to figure out what Astral as a company does for revenue. I don't see any paid products on their website. They even have investors.

So far it seems like they have a bunch of these high performance tools. Is this part of an upcoming product suite for python or something? Just curious. I'm not a full-time python developer.

bruckie

20 hours ago

From "So how does Astral plan to make money? " (https://news.ycombinator.com/item?id=44358216):

"What I want to do is build software that vertically integrates with our open source tools, and sell that software to companies that are already using Ruff, uv, etc. Alternatives to things that companies already pay for today. An example of what this might look like [...] would be something like an enterprise-focused private package registry."

There's also this interview with Charlie Marsh (Astral founder): https://timclicks.dev/podcast/supercharging-python-tooling-a... (specifically the "Building a commerical company with venture capital " section)

throwway120385

20 hours ago

That doesn't really seem like a way to avoid getting "Broadcommed." Vertically integrated tooling is kind of a commodity.

ploxiln

17 hours ago

hmm how well did that work for Docker ...

LtWorf

17 hours ago

It doesn't seem to answer to anything.

IshKebab

18 hours ago

Conda apparently makes a ton of money just by selling access to "more secure" packages, so maybe they'll do something like that.

There are apparently 10 million Python developers in the world and pretty soon all of them will be using uv. I doubt it is that hard to monetise.

morshu9001

15 hours ago

It doesn't really matter that it's Rust. npm is written in JS.

wpm

14 hours ago

npm runs dog slow IME

ghthor

13 hours ago

Yep, it’s next up for language package tooling that runs dog slow in CI and is consistently a pain in my side.

lxgr

6 hours ago

Couldn't agree more.

Arguably this article is missing one of the biggest benefits: Being able to make Python scripts truly self-contained by including dependencies via a PEP 723 inline header and then running them via `uv run <script.py>` [1].

It's made Python my language of choice for one-off scripts easily shareable as gists, scp-able across systems etc.

[1] https://pybit.es/articles/create-project-less-python-utiliti...
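For reference, the inline header looks like this (a minimal sketch; `requests` is just an example dependency):

  # /// script
  # requires-python = ">=3.12"
  # dependencies = ["requests"]
  # ///
  import requests
  print(requests.get("https://example.com").status_code)

`uv run script.py` reads that block, provisions a matching interpreter plus an ephemeral environment, and runs the script.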

ndr

6 hours ago

If like me you never remember what the header looks like, uv will help there too:

`uv init --script myscript.py`

verdverm

21 hours ago

I'd put type annotations and GIL removal above UV without a second thought. UV is still young and I hit some of those growing pains. While it is very nice, I'm not going to put it up there with sliced bread, it's just another package manager among many

zahlman

20 hours ago

For that matter, IMX much of what people praise uv for is simply stuff that pip (and venv) can now do that it couldn't back when they gave up on pip. Which in turn has become possible because of several ecosystem standards (defined across many PEPs) and increasing awareness and adoption of those standards.

The "install things that have complex non-Python dependencies using pip" story is much better than several years ago, because of things like pip gaining a new resolver in 2020, but in large part simply because it's now much more likely that the package you want offers a pre-built wheel (and that its dependencies also do). A decade ago, it was common enough that you'd be stuck with source packages even for pure-Python projects, which forced pip to build a wheel locally first (https://pradyunsg.me/blog/2022/12/31/wheels-are-faster-pure-...).

Another important change is that for wheels on PyPI the installer can now obtain separate .metadata files, so it can learn what the transitive dependencies are for a given version of a given project from a small plain-text file rather than having to speculatively download the entire wheel and unpack the METADATA file from it. (This is also possible for source distributions that include PKG-INFO, but they aren't forced to do so, and a source distribution's metadata is allowed to have "dynamic" dependencies that aren't known until the wheel is built (worst case) or a special metadata-only build hook is run (requires additional effort for the build system to support and the developer to implement)).

pnt12

17 hours ago

Things uv does better than pip by default:

- really hard to install a package globally by accident (pip: forgetting to activate the venv)
- really easy to distinguish dev and main dependencies (pip: create different files for different groups and set up their relationship)
- distinguishes direct dependencies from indirect ones, making it easy to tell when a package is no longer needed (pip: I bet most devs are either not tracking sub-dependencies or mixing them all together with pip freeze)
- easily use different Python versions for different projects (pip: not really)

With uv it just works. With pip, technically you can make it work, and I bet you'll screw something up along the way.

zahlman

17 hours ago

> - really hard to install a package globally by accident (pip: forgetting to activate venv)

This is different as of Python 3.11. Please see https://peps.python.org/pep-0668/ for details. Nowadays, to install a package globally, you first have to have a global copy of pip (Debian makes you install that separately), then you have to intentionally bypass a security marker using --break-system-packages.

Also, you don't have to activate the venv to use it. You can specify the path to the venv's pip explicitly; or you can use a different copy of pip (e.g. a globally-installed one) passing it the `--python` argument (you have been able to do this for about 3 years now).

(Pedantically, yes, you could use a venv-installed copy of pip to install into the system environment, passing both --python and --break-system-packages. I can't prove that anyone has ever done this, and I can't fathom a reason beyond bragging rights.)
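Concretely, either of these installs into a venv without activating anything (paths illustrative; the second is the `--python` route mentioned above):

  $ .venv/bin/pip install requests

  $ pip --python .venv/bin/python install requests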

> - really easy to distinguish [dev] and main dependencies

As of 25.1, pip can install from dependency groups described in pyproject.toml, which is the standard way to group your dependencies in metadata.

> distinguish direct dependencies from indirect dependencies, making it easy to find when a package is not needed anymore

As of 25.1, pip can create PEP 751 standard lockfiles.

> easily use different python versions for different projects

If you want something to install Python for you, yes, that was never in pip's purview, by design.

If you want to use an environment based off an existing Python, that's what venv is for.

verdverm

20 hours ago

For sure, we see the same thing in the JS ecosystem. New tooling adds some feature, other options implement feature, convergence to a larger common set.

I'm still mostly on poetry

9dev

18 hours ago

The things you list may be a reason for some, but in all discussions I’ve had and read about on uv, the reason is that it behaves as a package manger should. It can just install dependencies from an automatically generated lockfile. It can update outdated minor versions. It can tell me about outdated versions of my dependencies. It can reproduce a build on another machine. The lock file can be put into version control. A coworker can run a single command to install everything. It abstracts the stupidity that is virtual environments away so much you don’t even have to touch them anymore. And also, it’s fast.

Wake me up when pip can do any of that.

zahlman

17 hours ago

> the reason is that it behaves as a package manger should.

This is a matter of opinion. Pip exists to install the packages and their dependencies. It does not, by design, exist to manage a project for you.

9dev

17 hours ago

The overwhelming majority of developers seem to agree with me though.

If anything, pip is a dependency installer, while working with even trivial projects requires a dependency manager. Parent's point was that pip is actually good enough that you don’t even need uv anymore, but as long as pip doesn’t satisfy 80% of the requirements, that’s just plain false.

FreakLegion

13 hours ago

I'm not sure an overwhelming majority of Python developers care one way or the other. Like, I'm sure uv is nice, but I've somehow never had an issue with pip or conda, so there's just no reason to futz with uv. Same deal with Jujutsu. It's probably great, but git isn't a problem, so jj isn't a priority.

A majority of HN users might agree with you, but I'd guess that a majority of developers, to paraphrase Don Draper, don't think about it at all.

zahlman

16 hours ago

"anymore" makes no sense, since pip long predates uv.

Some people don't have, or don't care about, the additional requirements you have in mind.

WD-42

21 hours ago

As far as impact on the ecosystem, I’d say uv is up there. For the language itself you are right. Curious if you’ve come across any real use cases for GIL-less python. I haven’t yet. Seems like everything that would benefit from it is already written in highly optimized native modules.

seabrookmx

21 hours ago

> Seems like everything that would benefit from it is already written in highly optimized native modules

Or by asyncio.

WD-42

20 hours ago

I'm pretty ignorant about this stuff, but I think asyncio is for exactly that, asynchronous I/O, whereas GIL-less Python would be beneficial for CPU-bound programs. My day job is boring so I'm never CPU bound, always IO bound on the database or network. If there is CPU-heavy code, it's in Numpy. So I'm not sure GIL-less actually helps there.

nomel

19 hours ago

asyncio is unrelated to the parallelism prevented by the GIL.

rustystump

21 hours ago

I second and third this. I HATE python but uv was what made it usable to me. No other language had such a confusing obnoxious setup to do anything with outside of js land. uv made it sane for me.

giancarlostoro

20 hours ago

Node definitely needs its own "uv" basically.

jampekka

20 hours ago

Why? Uv is very good compared to other Python package managers, but even plain npm is still better than uv, and pnpm is a lot better.

monkpit

20 hours ago

How is npm not exactly that?

jampekka

21 hours ago

Function annotations were introduced in 2008, and proper type hints over a decade ago, in September 2015.

zacmps

20 hours ago

But there has been continual improvement over that time, both in the ecosystem, and in the language (like a syntax for generics).

9dev

17 hours ago

And yet you still cannot write even moderately complex type expressions without severe pain.

KaiserPro

20 hours ago

typed annotations that are useful.

Currently they are a bit pointless. Sure, they aid documentation, but they take effort and cause pain when making modifications (mind you, with half-arsed agentic coding it's probably less of a problem).

What would be better is a strict mode where, instead of duck typing, types are pre-declared. It would also make a bunch of things faster (along with breaking everything, and the spirit of the language).

I still don't get the appeal of UV, but thats possibly because I'm old and have been using pyenv and venv for many many years. This means that anything new is an attack on my very being.

however if it means that conda fucks off and dies, then I'm willing to move to UV.

KK7NIL

19 hours ago

You can get pretty darn close to static typing by using ty (from the same team as uv).

I've been using it professionally and it's been a big improvement for code quality.

ggm

16 hours ago

> it's just another package manager among many

It's the python version of fink vs macports vs homebrew. Or apt vs dnf. Or pkgsrc vs ports.

But I don't think "its just another" gets the value proposition here. It's significantly simpler to deploy in practice for people like me, writing ad hoc scripts and running git downloaded scripts and codelets.

Yes, virtualenv and pip existed. No, they turned out to be a lot more fiddly to run in practice than UV.

That UV is written in Rust is funny, but not in a terrible way. The LLVM compiler toolchain is written in C++ but compiles other languages. Using one language to do things for another language isn't such a terrible outcome.

I hope UV supplants the others. Not to disrespect their authors, but UV is better for end users. If it's worse for package maintainers, I think the UV authors should be told.

morshu9001

15 hours ago

I don't want type annotations. Was kinda the point of Python not to deal with types.

surajrmal

11 hours ago

If you've ever used python on a project above a certain size (both lines of code and people who contribute to it), type annotations quickly become something you find useful.

morshu9001

7 hours ago

I have, it didn't really help. It does help if you have no tests, but a large project needs tests.

never_inline

20 minutes ago

I guess you don't use an IDE. Use VSCode or IntelliJ with auto completion and error highlighting as you type. Type hints are a blessing.

kyt

21 hours ago

I must be the odd man out but I am not a fan of uv.

1. It tries to do too many things. Please just do one thing and do it well. It's simultaneously trying to replace pip, pyenv, virtualenv, and ruff in one command.

2. You end up needing to use `uv pip` so it's not even a full replacement for pip.

3. It does not play well with Docker.

4. It adds more complexity. You end up needing to understand all of these new environmental variables: `UV_TOOL_BIN_DIR`, `UV_SYSTEM_PYTHON`, `UV_LINK_MODE`, etc.

xmprt

20 hours ago

Your implication is that pyenv, virtualenv, and pip should be 3 different tools. But for the average developer, these tools are all related to managing the python environment and versions which in my head sounds like one thing. Other languages don't have 3 different tools for this.

pip and virtualenv also add a ton of complexity and when they break (which happens quite often) debugging it is even harder despite them being "battle tested" tools.

j2kun

19 hours ago

I think OP's complaint is rather that using `uv` is leaky: now you need to learn all the underlying stuff AND uv as well.

The alternative, of course, is having Python natively support a combined tool. Which you can support while also not liking `uv` for the above reason.

Grikbdl

17 hours ago

I don't think that's true, most projects using uv don't rely on those tools at all, and you don't need to understand them. You just `uv sync` and do your work.

throwaway894345

20 hours ago

Yeah, I agree. In particular it seems insane to me that virtualenv should have to exist. I can't see any valid use case for a machine-global pool of dependencies. Why would anyone think it should be a separate tool rather than just the obvious thing that a dependency manager does? I say this as someone with nearly 20 years of Python experience.

It's the same sort of deal with pyenv--the Python version is itself a dependency of most libraries, so it's a little silly to have a dependency manager that only manages some dependencies.

morshu9001

15 hours ago

And in practice it usually ends up being 6 different machine-global pools that all weirdly intersect, and some are python2.

I started using NodeJS more after lots of Python experience. Packages make so much more sense there. Even imports. You know how hard it is to do the equivalent of "require '../foo.js'" in Python?
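The closest stdlib equivalent I know of is something like this (a sketch, assuming a sibling foo.py):

  # rough Python analogue of require('../foo.js'):
  import importlib.util

  spec = importlib.util.spec_from_file_location("foo", "../foo.py")
  foo = importlib.util.module_from_spec(spec)
  spec.loader.exec_module(foo)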

zahlman

20 hours ago

I, too, have ~20 years of Python experience.

`virtualenv` is a heavy-duty third-party library that adds functionality to the standard library venv. Or rather, venv was created as a subset of virtualenv in Python 3.3, and the projects have diverged since.

The standard library `venv` provides "obvious thing that a dependency manager does" functionality, so that every dependency manager has the opportunity to use it, and so that developers can also choose to work at a lower level. And the virtual-environment standard needs to exist so that Python can know about the pool of dependencies thus stored. Otherwise you would be forced to... depend on the dependency manager to start Python and tell it where its dependency pool is.

Fundamentally, the only things a venv needs are the `pyvenv.cfg` config file, the appropriate folder hierarchy, and some symlinks to Python (stub executables on Windows). All it's doing is providing a place for that "pool of dependencies" to exist, and providing configuration info so that Python can understand the dependency path at startup. The venvs created by the standard library module — and by uv — also provide "activation" scripts to manipulate some environment variables for ease of use; but these are completely unnecessary to making the system work.
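A typical pyvenv.cfg is just a few lines (the values vary by platform and Python version):

  home = /usr/bin
  include-system-site-packages = false
  version = 3.12.3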

Fundamentally, tools like uv create the same kind of virtual environment that the standard library does — because there is only one kind. Uv doesn't bootstrap pip into its environments (since that's slow and would be pointless), but you can equally well disable that with the standard library: `python -m venv --without-pip`.

> the Python version is itself a dependency of most libraries

This is a strange way of thinking about it IMO. If you're trying to obtain Python libraries, it's normally because you already have Python, and want to obtain libraries that are compatible with the Python you already have, so that you can write Python code that uses the libraries and works under that Python.

If you're trying to solve the problem of deploying an application to people who don't have Python (or to people who don't understand what Python is), you need another layer of wrapping anyway. You aren't going to get end users to install uv first.

grebc

18 hours ago

I don’t think people consider things from a first principles perspective these days.

“…I can't see any valid use case for a machine-global pool of dependencies…” - Rhetorical question for OP, but how do you run an operating system without having said operating system's dependencies available to everything else?

throwaway894345

15 hours ago

That quote is mine, so I think you’re meaning to address me?

> how do you run an operating system without having said operating systems dependencies available to everything else?

I’m not sure if I understand your question, but I’ll answer based on what I think you mean. The OS gets compiled into an artifact, so the dependencies aren’t available to the system itself unless they are explicitly added.

grebc

12 hours ago

You asked what’s the point of a machine based global pool of dependencies - I answered: it’s an OS.

throwaway894345

3 hours ago

An OS isn’t a machine-global pool of dependencies. Strange claim.

throwaway894345

15 hours ago

I agree with all of that context about virtualenv and venv, but it all seems orthogonal to my point. I still can’t see a case where you would want the default Python behavior (global dependencies).

> This is a strange way of thinking about it IMO. If you're trying to obtain Python libraries, it's normally because you already have Python, and want to obtain libraries that are compatible with the Python you already have, so that you can write Python code that uses the libraries and works under that Python.

“normally” is biased by what the tooling supports. If Python tooling supported pinning to an interpreter by default then perhaps it would seem more normal?

I write a lot of Go these days, and the libs pin to a version of Go. When you build a project, the toolchain will resolve and (if necessary) install the necessary Go dependency just like all of the other dependencies. It’s a very natural and pleasant workflow.

nicce

20 hours ago

Python versions and environments can be solved at a more reliable abstraction level as well, e.g. if you are a heavy Nix user.

throwaway894345

20 hours ago

On the other hand, Nix and Bazel and friends are a lot of pain. I'm sure the tradeoff makes sense in a lot of situations, but not needing to bring in Nix or Bazel just to manage dependencies is a pretty big boon. It would be great to see some of the all-in-one build tools become more usable though. Maybe one day it will seem insane that every language ecosystem has its own build tool because there's some all-in-one tool that is just as easy to use as `(car)go build`!

331c8c71

19 hours ago

Well Nix is the only sane way I know to manage fully reproducible envs that incorporate programs/scripts spanning multiple ecosystems. Very common situation in applied data analysis.

ghthor

13 hours ago

Nix is a 10x force multiplier for managing Linux systems. The fact that I can write python, go, bash, jq, any tool that is right for the job of managing and configuring the system is amazing. And on top of that I can patch any part of the entire system with just that, a patch from my fork on GitHub or anywhere else.

Top that off with first-class programming capabilities and modularization, and I can share common configuration and packages across systems. And those same customized packages can be directly included in a dev shell, making all of the amazing software out there available for tooling and support. It has really changed my outlook, and I have so much fun now not EVER dealing with tooling issues, except when I have explicitly upgraded my shell and nixpkgs version.

I just rebuilt our CI infrastructure with Nix and was able to configure multiple isolated dockerd daemons per host, calculate the subnet spread for all the networks, and write scripts configuring the env so you can run docker1 and hit daemon 1. Now we can saturate our CI machines with more parallel work without them fighting over Docker system resources like ports. I never would have attempted this without Nix: being able to generate the entire system config tree and inspect systemd service configs before even applying them to a host cut my iteration loop to an all-time low, in infrastructure land where you face 10-15 min lead times building images only to find out you misspelled Kafka as kakfa somewhere and now need to rebuild for another 15 min. Now I get almost instant feedback for most of these types of errors.

eisbaw

19 hours ago

> Maybe one day it will seem insane that every language ecosystem has its own build tool because there's some all-in-one tool that is just as easy to use as `(car)go build`!

Yep: Nix

throwaway894345

3 hours ago

Unless you’re packaging anything or consuming packages or teasing out the conflicting advice from the community on which nix-related tooling to use or literally anything else of interest.

jscheel

18 hours ago

oh man, don't even bother with bazel... hermetic python builds are such a mess.

throwaway894345

15 hours ago

Yeah, I burn my face on that particular stove once every 3 years or so.

knowitnone3

20 hours ago

"other languages don't have 3 different tools for this." But other languages DO have 3 different tools so we should do that too!

a_bored_husky

20 hours ago

> 1. It tries to do too many things. Please just do one thing and do it well. It's simultaneously trying to replace pip, pyenv, virtualenv, and ruff in one command.

I think there are more cases where pip, pyenv, and virtualenv are used together than not. It makes sense to bundle the features of the three into one. uv does not replace ruff.

> 2. You end up needing to use `uv pip` so it's not even a full replacement for pip.

uv pip is there for compatibility and to facilitate migration, but once you are fully on the uv workflow you rarely, if ever, need `uv pip`.

> 3. It does not play well with Docker.

In what sense?

> 4. It adds more complexity. You end up needing to understand all of these new environmental variables: `UV_TOOL_BIN_DIR`, `UV_SYSTEM_PYTHON`, `UV_LINK_MODE`, etc.

You don't need to touch them at all

dragonwriter

20 hours ago

> It tries to do too many things. Please just do one thing and do it well. It's simultaneously trying to replace pip, pyenv, virtualenv, and ruff in one command.

uv doesn’t try to replace ruff.

> You end up needing to use `uv pip` so it's not even a full replacement for pip.

"uv pip" doesn't use pip, it provides a low-level pip-compatible interface for uv, so it is, in fact, still uv replacing pip, with the speed and other advantages of uv when using that interface.

Also, while I’ve used uv pip and uv venv as part of familiarizing myself with the tool, I’ve never run into a situation where I need either of those low-level interfaces rather than the normal high-level interface.

> It does not play well with Docker.

How so?

leblancfg

21 hours ago

uv's pip interface is like dipping one toe in the bathtub. Take a minute and try on the full managed interface instead: https://docs.astral.sh/uv/concepts/projects/dependencies. Your commands then become:

- uv add <package_name>

- uv sync

- uv run <command>

Feels very ergonomic, I don't need to think much, and it's so much faster.
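For example, from scratch (the project and package names are just examples):

  uv init myproj && cd myproj
  uv add requests
  uv run python -c "import requests; print(requests.__version__)"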

collinmanderson

20 hours ago

> 1. It tries to do too many things. Please just do one thing and do it well. It's simultaneously trying to replace pip, pyenv, virtualenv, and ruff in one command.

In my experience it generally does all of those well. Are you running into issues with the uv replacements?

> 2. You end up needing to use `uv pip` so it's not even a full replacement for pip.

What do you end up needing to use `uv pip` for?

tclancy

20 hours ago

So I have been doing Python for far too long and have all sorts of tooling I've accreted to make Python work well for me across projects and computers, and I never quite made the leap to Poetry and was suspicious of uv.

Happened to buy a new machine and decided to jump in the deep end, and it's been glorious. I think the difference between your comment (and others in this chain) and my experience is that you're trying to make uv fit how you have always done things. Jumping all the way in, I just... never needed virtualenvs. I don't really think about them, once I sorted out a mistake I was making. uv init and you're pretty much there.

>You end up needing to use `uv pip` so it's not even a full replacement for pip

The only time I've used uv pip is on a project at work that isn't a uv-powered project. uv add should be doing what you need, and it really fights you if you're trying to add something globally, because it assumes that's an accident, which it probably is (but you can drop back to uv pip for that).

>`UV_TOOL_BIN_DIR`, `UV_SYSTEM_PYTHON`, `UV_LINK_MODE`, etc.

I've been using it for six months and didn't know those existed. I would suggest this is a symptom of trying to make it be what you're used to. I would also gently suggest those of us who have decades of Python experience may have a bit of Stockholm Syndrome around package management, packaging, etc.

brikym

20 hours ago

> It tries to do too many things. Please just do one thing and do it well.

I disagree with this principle. Sometimes what I need is a kitset. I don't want to go shopping for things, or browse multiple docs. I just want it taken care of for me. I don't use uv so I don't know if the pieces fit together well but the kitset can work well and so can a la carte.

Narushia

20 hours ago

uv has played well with Docker in my experience, from dev containers to CI/CD to production image builds. Would be interested to hear what is not working for you.

The uv docs even have a whole page dedicated to Docker; you should definitely check that out if you haven't already: https://docs.astral.sh/uv/guides/integration/docker/
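For flavour, a minimal Dockerfile along the lines the docs suggest (the image tag, lockfile, and module name here are assumptions, not copied from the docs page):

  # image with uv preinstalled, published by Astral
  FROM ghcr.io/astral-sh/uv:python3.12-bookworm-slim
  WORKDIR /app
  # install locked dependencies first so the layer caches well
  COPY pyproject.toml uv.lock ./
  RUN uv sync --frozen --no-dev
  COPY . .
  CMD ["uv", "run", "python", "-m", "myapp"]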

eatonphil

20 hours ago

> 2. You end up needing to use `uv pip` so it's not even a full replacement for pip.

Needing pip and virtualenvs was enough to make me realize uv wasn't what I was looking for. If I still need to manage virtualenvs and call pip I'm just going to do so with both of these directly.

I had been hoping someone would introduce the non-virtualenv package management solution that every single other language has where there's a dependency list and version requirements (including of the language itself) in a manifest file (go.mod, package.json, etc) and everything happens in the context of that directory alone without shell shenanigans.

notatallshaw

20 hours ago

> I had been hoping someone would introduce the non-virtualenv package management solution that every single other language has where there's a dependency list and version requirements (including of the language itself) in a manifest file (go.mod, package.json, etc) and everything happens in the context of that directory alone without shell shenanigans.

Isn't that exactly a pyproject.toml via the uv add/sync/run interface? What is that missing that you need?

eatonphil

20 hours ago

> pyproject.toml

Ah ok I was missing this and this does sound like what I was expecting. Thank you!

ellg

20 hours ago

What are you needing to use `uv pip` for? I don't think I ever call into pip from uv for anything nowadays. I typically just need to do `uv sync` and `uv run`, maybe sometimes `uvx` if I want to run some random 3rd party python script

dragonwriter

20 hours ago

> I had been hoping someone would introduce the non-virtualenv package management solution that every single other language has where there's a dependency list and version requirements (including of the language itself) in a manifest file (go.mod, package.json, etc) and everything happens in the context of that directory alone without shell shenanigans.

If you are using uv, you don’t need to do shell shenanigans, you just use uv run. So I'm not sure how uv with pyproject.toml doesn't meet this description (yes, the venv is still there, it is used exactly as you describe.)

og_kalu

20 hours ago

In most cases, you don't really need to manage virtual envs, though? uv commands that need a venv will just create one for you or install into the existing one automatically.

ivell

20 hours ago

Pixi is an alternative that you may want to try.

yoavm

19 hours ago

Really sounds like you're using it wrong, no? I completely forgot about virtualenvs, pip, and requirements.txt since I started using uv.

dsnr

20 hours ago

This. I was researching uv to replace my pipenv+pyenv setup, but after reading up a bit I decided to just give up. Pipenv is just straightforward and “just works”. Aside from being slow, not much is wrong with it. I’m not in the mood to start configuring uv, a tool that should take me 2 minutes and a “uv --help” to learn.

9dev

17 hours ago

What doesn’t just work about uv in particular? You basically need three commands - uv add, uv sync, and uv run. Forget about virtual environments, and get back to working. No configuration necessary.

robertfw

20 hours ago

“Slow” doesn't really do it justice: I'd have to wait more than 5 minutes for pipenv to finish figuring out our lock file. uv does it in less than a second.

aniforprez

13 hours ago

> Pipenv is just straightforward and “just works”

I have worked on numerous projects that started with pipenv, and it has never “just worked”, ever. Either there's some trivial dependency conflict it can't resolve, or it's slow as molasses, or something or other. pipenv has been horrible to use. I started switching projects to pip-tools, and now I recommend using uv.

realityfactchex

15 hours ago

Yeah, I find that I like to use uv for one thing, quickly/efficiently getting a Python into a new venv for some project. A la:

  uv venv ~/.venvs/my_new_project --python 3.13
  source ~/.venvs/my_new_project/bin/activate
  python3 -m ensurepip --upgrade
  cp -r /path/from/source/* .
  python3 -m pip install -r requirements.txt
So here uv installs the Python version wanted. But it's just a venv. And we pip install using requirements.txt, like normal, within that venv.

Someone, please tell me what's wrong with this. To me, this seems much less complicated than some uv-centric .toml config file, plus some uv-centric commands for more kinds of actions.

TYPE_FASTER

20 hours ago

Yeah, I'm with you. I'm forcing myself to learn it because it looks like that's the way PyWorld is going. I don't dislike uv as much as poetry. But I guess I never really ran into issues using pyenv and pip. shrug Maybe I wasn't working on complex enough projects.

mdavid626

7 hours ago

We use it for many projects. None of these are true for us.

Why do you need to use uv pip?

What problems do you have in Docker?

I don't understand any of those env variables you listed, yet I use uv without problems.

tpl

20 hours ago

What do you mean it doesn't play well with docker?

scuff3d

20 hours ago

If your pyproject.toml is setup properly you shouldn't need to use `uv pip` at all.

I'm using uv in two dozen containers with no issues at all. So not sure what you mean that it doesn't play well with Docker.

nhumrich

13 hours ago

When do you use `uv pip`? I never use it. It feels like an edge case only command.

chatmasta

20 hours ago

What problems do you encounter using it with Docker?

vindex10

20 hours ago

I would also add UV_NO_SYNC as something I had to learn. It comes up in combination with uv pip.

wtallis

20 hours ago

What's your use case for UV_NO_SYNC? I assume the option exists for a reason, but aside from maybe a modest performance improvement when working with a massive complex package environment, I'm not sure what problem it solves.

vindex10

6 hours ago

Some packages that I use for development need to be part of the virtual env, for example ipdb.

So I do `uv pip install ipdb`.

But then, after `uv add somepackage`, a `uv sync` happens and cleans up all extras. To keep extras, you need to run `uv sync --inexact`. But there is no env var for `--inexact`, so I end up doing the sync manually.
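In shell form, the dance looks like this:

  uv pip install ipdb    # dev-only extra, outside the lockfile
  uv add somepackage     # the implicit sync removes ipdb again
  uv sync --inexact      # re-sync without pruning extras

(Presumably `uv add --dev ipdb` would sidestep this by tracking ipdb in a dev group, but that means changing the project files.)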

nomel

18 hours ago

5. No concept of global/shell/local venv auto-activation, so get used to typing "uv run", or manually recreating these concepts with shell stuff.

l2silver

18 hours ago

It's funny, I feel like half the reason I use docker is for python projects.

defraudbah

21 hours ago

yeah, I've moved away from it too, but it's a great tool. The rush of Rust tools is the best thing that has happened to Python in a decade.

nicoco

20 hours ago

uv pip is a full reimplementation of pip. Way faster, better caching, less disk usage. What's not to like about it?

techbrovanguard

14 hours ago

oh look, the average golang fan. here’s a challenge for you: explain _why_ the complexity is bad without:

- resorting to logical fallacies, or

- relying on your unstated assumption that all complexity is bad

daedrdev

20 hours ago

I mean, I’ve had quite awful bugs from using pip, pyenv, and venv at the same time.

j45

20 hours ago

It's still one tool to orchestrate and run everything, which is preferable to many separate ones.

groby_b

20 hours ago

> You end up needing to use `uv pip` so it's not even a full replacement for pip.

No you don't. That's just a set of compatibility approaches for people who can't let go of pip/venv. Move to uv/PEP723, world's your oyster.

> It does not play well with Docker.

Huh? I use uv both during container build and container runtime, and it works just fine?

> You end up needing to understand all of these new environmental variables

I've not encountered the need for any of these yet. Your comments on uv are so far out of line with all the uses I've seen that I'd love to hear what you're specifically doing that makes these breaking points.

NewJazz

21 hours ago

Idk, for me ruff was more of a game changer. No more explaining why we need both flake8 and pylint (and isort), no more flake8 plugins... Just one command that does it all.

UV is great, but I use it as a more convenient pip+venv. Maybe I'm not using it to its full potential.

collinmanderson

19 hours ago

I agree flake8 -> ruff was more of a game changer for me than pip+venv -> uv. I use flake8/ruff far more often than pip/venv.

uv is probably much more of a game changer for beginner python users who just need to install stuff and don't need to lint. So it's a bigger deal for the broader python ecosystem.

zahlman

21 hours ago

> Maybe I'm not using it to its full potential.

You aren't, but that's fine. Everyone has their own idea about how tooling should work and come together, and I happen to be in your camp (from what I can tell). I actively don't want an all-in-one tool to do "project management".

hirako2000

20 hours ago

The dependency descriptor is also more structured; a requirements.txt is pretty raw in comparison.

But where it isn't a matter of opinion is speed. I've never met anyone who, given the same interface, would prefer a process taking 10x longer to execute.

zmmmmm

19 hours ago

> Instead of

  source .venv/bin/activate
  python myscript.py

> you can just do

  uv run myscript
This is by far the biggest turn-off for me. The whole point of an environment manager is to set up the environment so that the commands I run work. They need to run natively, how they are supposed to when the environment is set, not be put through a translation layer.

Side rant: yes I get triggered whenever someone tells me "you can just" do this thing that is actually longer and worse than the original.

collinmanderson

19 hours ago

> The whole point of an environment manager is to set up the environment so that the commands I run work. They need to run natively, how they are supposed to when the environment is set, not be put through a translation layer.

The `uv run` command is an optional shortcut for avoiding needing to activate the virtual environment. I personally don't like the whole "needing to activate an environment" before I can run commands "natively", so I like `uv run`. (Actually for the last 10 years I've had my `./manage.py` auto-set up the virtual environment for me.)

The `uv add` / `uv lock` / `uv sync` commands are still useful without `uv run`.
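For the curious, that auto-bootstrapping `./manage.py` idea is roughly this (a sketch, not the exact code; it assumes a .venv next to the script):

  #!/usr/bin/env python3
  # Sketch: re-exec this script under the project's venv if needed.
  import os
  import sys

  HERE = os.path.dirname(os.path.abspath(__file__))
  VENV_PYTHON = os.path.join(HERE, ".venv", "bin", "python")

  if os.path.exists(VENV_PYTHON) and os.path.realpath(sys.executable) != os.path.realpath(VENV_PYTHON):
      # replace the current process with the venv's interpreter
      os.execv(VENV_PYTHON, [VENV_PYTHON] + sys.argv)

  # ... normal manage.py logic continues here ...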

dragonwriter

19 hours ago

> They need to run natively, how they are supposed to when the environment is set, not be put through a translation layer.

There is a new standard mechanism for specifying, in the header of a single-file script, the same things you would specify when setting up a venv (a python version and dependencies), so that tooling can set up the environment and run the script using only the script file itself as the spec.

uv (and PyPA’s own pipx) support this standard.
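For reference, the inline metadata looks like this at the top of the script (a minimal sketch; the Python bound and dependency are illustrative):

  # /// script
  # requires-python = ">=3.12"
  # dependencies = ["requests"]
  # ///
  import requests

  print(requests.get("https://example.com").status_code)

Then `uv run myscript.py` (or `pipx run myscript.py`) builds a throwaway environment from that header.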

> yes I get triggered whenever someone tells me "you can just" do this thing that is actually longer and worse than the original.

"uv run myscript" is neither longer nor worse than separately manually building a venv, activating it, installing dependencies into it, and then running the script.

zbentley

18 hours ago

> I get triggered whenever someone tells me "you can just" do this thing that is actually longer and worse than the original.

Apologies in advance for triggering you, but in case you or others find it useful, here’s how to do the equivalent env-activation commands with uv: https://news.ycombinator.com/item?id=44360892

mborsuk

19 hours ago

From what I can tell (just started using uv) it doesn't break the original workflow with the venv, just adds the uv run option as well.

wtallis

19 hours ago

Yes, you still have the option of manually activating a venv, and that makes sense if the amortized cost of that is lower than several instances of typing `uv run`. Though sometimes when working in one project with its venv activated, I end up needing to run a tool from another project with a separate venv, so uv still ends up being useful.

txdv

6 hours ago

you can just point a shotgun at your foot and pull the trigger, isn't that convenient?

IshKebab

17 hours ago

You can still do the `source .venv/bin/activate` if you want.

There's also `uv tool install` which will install things in your PATH without infecting your system with Python.
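For example:

  uv tool install ruff    # isolated environment, `ruff` exposed on PATH
  ruff --version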

zmmmmm

16 hours ago

that makes me feel much better!

fireflash38

18 hours ago

Unless I'm an AI, I'm pretty sure "uv run" is the same number of characters as "python", so it's no longer to type. Also, venvs are a translation layer already, changing PATH.

zmmmmm

16 hours ago

It's not really the number of characters so much as the cognitive load of doing something different here vs there, and the fact that anything I run successfully on the command line can't be directly lifted over into scripts etc. Along with training a team of people to do that.

mgh95

21 hours ago

As someone who generally prefers not to use python in a production context (I think it's excellent for one-off scripts or cron jobs that require more features than bash provides), I agree with this sentiment. I recently wrote some python (using uv) and found it to be pleasant and well-integrated with a variety of LSPs.

seabrookmx

21 hours ago

Couldn't agree more. We were using pyenv+poetry before and regularly had to pin our poetry version to a specific one, because new poetry releases would stall trying to resolve dependencies.

pyenv was problematic because you needed the right concoction of system packages to ensure it compiled python with the right features, and we have a mix of MacOS and Linux devs so this was often non-trivial.

uv is much faster than both of these tools, has a more ergonomic CLI, and solves both of the issues I just mentioned.

I'm hoping astral's type checker is suitably good once released, because we're on mypy right now and it's a constant source of frustration (slow and buggy).

kardos

20 hours ago

> because new poetry releases would stall trying to resolve dependencies.

> uv is much faster than both of these tools

conda is also (in)famous for being slow at this, although the new mamba solver is much faster. What does uv do in order to resolve dependencies much faster?

collinmanderson

20 hours ago

> What does uv do in order to resolve dependencies much faster?

- Representing version numbers as single integer for fast comparison.

- Being implemented in rust rather than Python (compared to Poetry)

- Parallel downloads

- Caching individual files rather than zipped wheel, so installation is just hard-linking files, zero copy (on unix at least). Also makes it very storage efficient.
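The hard-linking is easy to verify on Linux (a rough check; the site-packages path depends on your Python version, and requests is just an example package):

  uv cache dir    # where uv keeps the unpacked files
  # a link count greater than 1 means the installed file is shared
  # with the cache rather than copied:
  stat -c '%h %n' .venv/lib/python3.12/site-packages/requests/__init__.py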

tehnub

16 hours ago

Before uv, I was fairly happy with pyenv + venv + pip for development and pipx for running "tools". IMO, the specific things uv improves upon are:

  - Faster dependency resolution. In fact, everything uv does is extremely fast.
  - Better ergonomics in a dozen ways (`uv run` instead of activating the virtual env, support for script metadata to run scripts with dependencies, uv add to modify the pyproject.toml (that it created for you), etc.)
  - Stack of one tool instead of four+
  - Easier Python installation (although I usually use both pyenv and uv on my machine)

tclancy

15 hours ago

The speed thing can’t be overstated. At first I thought it wasn’t actually running for some things.

j2kun

19 hours ago

This article appears to be NOT about someone who discovered uv after using venv/pip, but rather an article about someone who discovered uv after not using virtual environments at all, and is mostly excited about the cleanliness of virtual environments.

collinmanderson

19 hours ago

The article shows some advantages compared to plain virtual environments:

In principle, you can ‘activate’ this new virtual environment like any typical virtual environment that you may have seen in other tools, but the most ‘uv-onic’ way to use uv is simply to prepend any command with uv run. This command automatically picks up the correct virtual environment for you and runs your command with it. For instance, to run a script — instead of

   source .venv/bin/activate
   python myscript.py
you can just do

   uv run myscript.py

zahlman

18 hours ago

> The article shows some advantages compared to plain virtual environments

No; they are plain virtual environments. There is no special kind of virtual environment. Uv simply offers its own command structure for managing those environments. In particular, `uv run` just ensures a venv in a specific location, then uses it.

There is no requirement to activate virtual environments in order to use them (unless you have some other tooling that specifically depends on the environment variables being set). You can, similarly, "just do"

  .venv/bin/python myscript.py
without uv installed.

> This command automatically picks up the correct virtual environment for you

Some people dislike such magic, especially since it involves uv having an opinion about where the virtual environment is located.

collinmanderson

16 hours ago

Sorry, you're right I should have said "plain venv", as in the program.

`uv run` will also sync the environment to be sure it exists and meets the correct specifications.

But yes, it's optional. You can also just do `uv sync` to sync the environment and then activate it like normal.

Or use `uv venv`, `uv pip` commands and just take the speed advantage.

rldjbpin

3 hours ago

Either conda is that old now, or this is a bit of hyperbole.

Having used it personally, uv is quite fast and nice to work with at first. It's definitely nice if you work in a team that fully utilises its potential. However, there are a lot of parallels with the node.js universe, and switching to .venv / localized environments bloats up the system when you work with a boilerplate env that is the same across projects.

The additional files generated are also less human-readable compared to a requirements.txt, and the workflow felt a bit more involved for individual users, imho. It definitely has a place in the ecosystem, but personally I don't find it ready to replace everything else yet.

pidgeon_lover

an hour ago

> Can install any version of Python

Does "any" version include custom homebrew builds of Python, e.g. backports of Python 3.12 to Windows Vista/7?

runningmike

21 hours ago

Seems like a commercial blog. And imho hatch is better from a Foss perspective.

UV means more strings attached to VC-funded companies, and leaning on their infrastructure. This is a high risk for any FOSS community, and history tells us how this ends…

robot-wrangler

16 hours ago

This is going to sound harsh, but the problem with hatch is that it's pypa. And look at all the people that equate python-the-language with problems in pypa-managed solutions already. Pypa does not make good stuff or make good decisions.

Speaking of history, I was very sympathetic to the "we are open-source volunteers, give us a break" kind of stuff for the first N years.. but pypa has a pattern of creating problems, ignoring them, ignoring criticism, ignoring people who are trying to help, and pushing talent+interest elsewhere. This has fragmented the packaging ecosystem in a way that confuses newcomers, forces constant maintenance and training burden on experts, and damages the credibility of the language and its users. Hatch is frankly too little too late, and even if it becomes a wonderful standard, it would just force more maintenance, more confusion for a "temporary" period that lasts many, many years. Confidence is too far gone.

As mentioned elsewhere in the thread, there are tons of conflicting tools in the space already, and due to the fragmentation, poetry etc could never get critical mass. That's partly because pypa stuff felt most "official" and a safer long term bet than anything else, but partly because 33% better was never good enough to encourage widespread adoption until it was closer to 200% better. But uv actually IS that much better. Just let it win.

And let pypa be a case-study in how to NOT do FOSS. Fragmentation is fine up to a point, but you know what? If it wasn't for KDE / Gnome reinventing the wheel for every single kind of individual GUI then we'd have already seen the glorious "year of the linux desktop" by now.

blibble

13 hours ago

> Pypa does not make good stuff or make good decisions.

yep, I've been saying this for years, and astral have proved it in the best way: with brilliant, working software

python was a dying project 10 years ago, after the python 3000 debacle

the talent left/lost interest

then the machine learning thing kicked off (for some reason using python), and now python is everywhere and suddenly massively important

and the supporting bureaucracies, still in their death throes, are unable to handle a project of its importance

maccard

20 hours ago

You say this on a message board run by a VC about a programming language that is primarily developed by meta, google and co.

uv is MIT licensed so if they rug pull, you can fork.

LtWorf

17 hours ago

It's still annoying to fork, and they will probably try to move it to their own PyPI service, so it won't be possible to do that.

maccard

7 hours ago

I’d rather use a great tool for a year and a half (which I have done) and suffer the pain of a fork once than have to use a worse tool. Look at terraform and the tofu fork: it took a while, but throughout the process you could just stick to the last open source version of terraform until you decided what you wanted to do. Ironically, the fork there is now controlled by the VC-backed firms people love to decry.

jelder

an hour ago

> On GitHub Actions, we’re planning to use uv to quickly build a Python environment and run our unit tests. In production, uv already manages Python for all of our servers.

Does that mean they aren't running unit tests _at all_ in CI yet, or they just use a totally different, newer system in production than they do for CI? Either way, brave of them to admit that in public.

bunderbunder

an hour ago

My team just did this.

It wasn't anything like the radical change to how CI works that you seem to be envisioning. It was just deleting a lot of Python environment setup and management code that has a history of being obnoxious to maintain, and replacing it with a one-liner that, at least thus far, has given us zero fuss.

jelder

33 minutes ago

You updated _production_ before _testing_? Sorry but that just sounds like asking for a disaster.

bunderbunder

11 minutes ago

It seems like you're reading things that people aren't writing.

I don't know how the author's company manages their stack, so I can't speak to how they do their testing. But I do know that in many companies run-time environment management in production is not owned by engineering and it's common for ops and developers to use different methods to install run-time dependencies in the CI environment and in the production environment. In companies that work that way, testing changes to the production runtime environment isn't done in CI; it's done in staging.

If that's at all representative of how they work, then "we didn't test this with the automated tests that Engineering owns as part of their build" does not in any way imply, "we didn't test this at all."

Tangentially, the place I worked that maintained the highest quality and availability standards (by far) did something like this, and it was a deliberate reliability engineering choice. They wanted a separate testing phase and runtime environment management policy that developers couldn't unilaterally control as part of a defense in depth strategy. Jamming everything into a vertically integrated, heavily automated CI/CD pipeline is also a valid choice, but one that has its roots in Silicon Valley culture, and therefore reaches different solutions to the same problems compared to what you might see in older industries and companies.

frobisher

2 hours ago

It would be nice to systematically start with the problems of the non-uv world, to highlight the specific value of uv.

But nice read!

metmac

15 hours ago

UV and the crew at Astral really moved the Python packaging community forward.

I would love to see them compete with the likes of Conda and try to handle the Python C extension story.

But in the interim, I agree with everyone else who has already commented, Pixi which is partly built atop of UV’s solver is an even bigger deal and I think the longer term winner here.

Having a topologically complete package manager that can speak both Conda and PyPI is amazing.

https://pixi.sh/latest/

rayxi271828

15 hours ago

I love how uv allows me to not think of all the options anymore.

virtualenv, venv, pyenv, pipenv... I think at one point the recommended option changed because it was integrated into Python, but I can't even remember which is which anymore.

Such a pleasure to finally have just one, for maybe... ~99% of my needs.

pama

20 hours ago

I love uv. But the post starts with a simple install via a one-liner curl piped to sh, which is such a big attack surface… I would much rather have a much longer one-liner that increases safety.

tiagod

15 hours ago

What's the difference from going to the website and downloading it, or doing it through the package manager?

pama

6 hours ago

Package managers, or old-school downloads from a website (GNU-style), provide a separate public checksum and GPG signature in multiple mirrored locations. Even if their server were compromised, I could still be safe.

hirako2000

20 hours ago

It seems to be a trend in the rust community. I guess because rustup is suggested to be installed that way.

But you don't have to. Brew and other package managers hold uv in their registries.

oblio

20 hours ago

Isn't uv like... a Rust binary? If that sh has any sense it just copies the binary and adds it to PATH.

dboon

18 hours ago

If you look at the script, this is indeed more or less what happens. Except the folks over there are very clever about ergonomics, so the script is quite long so it can detect your architecture, OS, and even libc to give you an appropriate binary. There’s a tool that they use (which they wrote) which generates such install scripts for you

It’s really excellent stuff

rieogoigr

19 hours ago

But since you are curling a web URL straight to sh, you will never know. Which is the problem.

bmicraft

17 hours ago

But it's not a problem if you trust the URL and pass curl `--proto '=https' --tlsv1.2` as args.

oblio

5 hours ago

What about running a downloaded binary installer? What if the URL is HTTPS?

ggm

16 hours ago

This blog very strongly echoes my own experiential sense of the field of play.

It's just simpler to use, and better overall. It's reduced friction significantly.

I think the Python community should put it forward as the first-preference vehicle, and be respectful of the prior art and its developers, but not insist the older tools have primacy.

thatsadude

14 minutes ago

Has been a life changer for me.

bartdecrem

2 hours ago

Should I try this? I, let’s say, code with AI and am not an engineer. Python-related environment stuff makes my head explode (across the 3 computers I use). It’s the main thing my brain just can’t seem to figure out.

fishmicrowaver

2 hours ago

Well if you want to code with AI you'll have to tell claude/codex constantly that you're using uv and sometimes it'll remember.

nilamo

2 hours ago

Or, you can use a task runner like mise or just, and tell it to use the task runner instead of any particular tool directly. `mise test:unit` is much harder for an agent to get wrong.

colpabar

2 hours ago

Yes. This is currently the best tool to handle python environments and dependencies and it's very popular. If you learn uv, and you really don't need to learn that much other than how to specify a python version and your dependencies, you (hopefully) should be set for a long time.

sdairs

21 hours ago

Everything from the astral team has been superb, I don't want to use Python without ruff & uv. Yet to try "ty", anyone used it?

tabletcorry

21 hours ago

Ty is still under very active development, so it either works or very much doesn't. I run it occasionally to see if it works on my codebases, and while it is getting closer, it isn't quite there yet.

Definitely lightyears faster than mypy though.

FattiMei

20 hours ago

But what was wrong with pip, venv, and pyproject.toml in the first place? I just keep a system installation of python for my personal things and an environment for every project I'm working on. I'd get suspicious of a developer who is picky about python versions or library versions; like, what crazy programs are you writing?

the8472

20 hours ago

What's wrong? Having to modify the shell environment, no lockfile, slow download/installation, lack of a standard dependency dir, ...

> I'd get suspicious if a developer is picky about python versions or library versions

Certain library versions only support certain python versions. And they also break API. So moving up/down the python versions also means moving library versions which means stuff no longer works.

zahlman

19 hours ago

You don't have to modify the environment (this is provided as an option for convenience). The alternatives are to use higher-level management like uv does, or to specify the path to executables in the virtual environment directly. But uv works by creating virtual environments that are essentially the same as what you get with `python -m venv --without-pip` (although they reimplemented the venv creation logic).

Pip can install from dependency groups in a pyproject.toml file, and can write PEP 751 lockfiles, and work is under way to allow it to install from those lockfiles as well.

I don't know what you mean about a "standard dependency dir". When you make a venv yourself, you can call it what you want, and put it where you want. If you want to put it in a "standard" place, you can trivially make a shell alias to do so. (You can also trivially make a shell alias for "activate the venv at a hard-coded relative path", and use that from your project root.)
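E.g.:

  # in ~/.bashrc or similar:
  alias venv-here='python -m venv .venv'
  alias activate='source .venv/bin/activate'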

Yes, pip installation is needlessly slow for a variety of reasons (that mostly do not have to do with being implemented in Python rather than Rust). Resolving dependencies is also slow (and Rust may be more relevant here; I haven't done detailed testing). But your download speed is still going to be primarily limited by your internet connection to PyPI.

the8472

18 hours ago

I'm confused by this reply.

> The alternatives are to use higher-level management like uv does,

The question was specifically what's wrong with pip, venv and pyproject toml, i.e. what issues uv is trying to address. Well of course the thing trying to address the problem addresses the problem....

> I don't know what you mean about a "standard dependency dir".

like node's node_modules, or cargo's ~/.cargo/registry. You shouldn't have to manually create and manage that. installing/building should just create it. Which is what uv does and pip doesn't.

> the same as what you get with `python -m venv --without-pip`

The thing that should be automatic. And even if it is not it should at least be less arcane. An important command like that should have been streamlined long ago. One of the many improvements uv brings to the table.

> and work is under way to allow it to install from those lockfiles as well.

Yeah well, the lack up until now is one of those "what is wrong" things.

> But your download speed is still going to be primarily limited by your internet connection to PyPI.

Downloading lots of small package dependencies serially leaves a lot of performance on the table, due to latency and non-instantaneous response from congestion controllers. Downloading and installing concurrently reduces walltime further.

zahlman

18 hours ago

> Well of course the thing trying to address the problem addresses the problem....

The point is that it is a thing trying to address the "problem", and that not everyone considers it a problem.

> Which is what uv does and pip doesn't.

The point is that you might want to install something not for use in a "project", and that you might want to explicitly hand-craft the full contents of the environment. Pip is fundamentally a lower-level tool than uv.

> The thing that should be automatic.

Bootstrapping pip is the default so that people who have barely learned what Python is don't ask where pip is, or why pip isn't installing into the (right) virtual environment.

Yes, there are lots of flaws in pip. The problem is not virtual environments. Uv uses the same virtual environments. Neither is the problem "being a low-level tool that directly installs packages and their dependencies". I actively want to have that tool, and actively don't want a tool that tries to take over my entire project workflow.

johnfn

20 hours ago

As mostly a Python outsider, in the infrequent times that I do use python package management, uv just works. When I used pip, I’d get all sorts of obscure error messages that I’d have to go track down, probably because I got some obscure environment detail wrong. With uv I never run into that nonsense.

zahlman

19 hours ago

Design-wise, nothing, IMO. But I don't fault people who prefer the uv workflow, either. To each their own.

Implementation-wise, there's nothing wrong in my view with venv. Or rather, everything is compelled to use virtual environments, including uv, and venv is just a simple tool for doing so manually. Pip, on the other hand, is slow and bulky due to poor architecture, a problem made worse by the expectation (you can work around it, but it requires additional understanding and setup, and isn't a perfect solution) of re-installing it into each virtual environment.

(The standard library venv defaults to such installation; you can disable this, but then you have to have a global pip set up, and you have to direct it to install into the necessary environment. One sneaky way to do this is to install Pipx, and then set up some script wrappers that use Pipx's vendored copy of pip. I describe my techniques for this in https://zahlman.github.io/posts/2025/01/07/python-packaging-....)

Edit: by "design" above I meant the broad strokes of how you use pip, installing single packages with their transitive dependencies etc. There's a lot I would change about the CLI syntax, and other design issues like that.

jvanderbot

20 hours ago

What was wrong was that you needed to do that.

How many commands are required to build up a locally consistent workspace?

Modern package managers do that for you.

wrs

20 hours ago

The pytorch ecosystem, for one, is notorious for very specific version dependencies between libraries.

oblio

20 hours ago

How do pip and venv integrate with pyproject.toml? At least pip doesn't even use it.

sheepscreek

16 hours ago

What about Pixi[1]? It has become an irreplaceable part of my dev stack. Fantastic for tool + library version management. It has replaced a number of tools for me and greatly simplified bootstrapping in a new environment (like lxc containers when I am experimenting with stuff) or creating a lightweight sandbox for AI agents.

1. https://pixi.sh/latest/

hollow-moe

21 hours ago

curl|sh and iwr|iex send chills down my spine; no one should recommend these installation methods in 2025. I'm against closed computers, but I'm also against reckless installs. Even without the security concerns, this way of installing tends to put files in random places, making them hard to manage and clean up.

jampekka

20 hours ago

Installing an out-of-distro deb/rpm/msi/dmg/etc package is just as unsafe as curl|sh. Or even less safe, as packages tend to require root/admin.

procaryote

20 hours ago

A package is at least a signable, checksummable artefact. The curl | sh thing could have been anything and after running it you have no record of what it was you did.

There have also been PoCs on serving malicious content only when piped to sh rather than saved to file.

If you want to execute shell code from the internet, at the very least store it in a file first and store that file somewhere persistent before executing it. It will make forensics easier
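For example, with uv's own installer:

  curl --proto '=https' --tlsv1.2 -fsSL https://astral.sh/uv/install.sh -o uv-install.sh
  less uv-install.sh    # actually read it
  sh uv-install.sh      # and keep the file around for the record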

Grikbdl

17 hours ago

If you're going to run code without inspecting it though, the methods are similar. One case has https, the other a signature (which you're trusting due to obtaining it over https). You can't inspect it reliably only after getting hypothetically compromised.

nikisweeting

20 hours ago

Security and auditability is not the core problem, it's versioning and uninstalling. https://docs.sweeting.me/s/against-curl-sh

jampekka

19 hours ago

Uninstalling can be a problem.

Versioning OTOH is often more problematic with distro package managers that can't support multiple versions of the same package.

Also, the inability to do user installs is a big problem with distro package managers.

Intralexical

9 hours ago

Also file conflicts. Installing an RPM/ALPM/APK should warn you before it clobbers existing files. But for a one-off install script, all it takes is a missing environment variable or an extra space (`mv /etc/$INSTAALCONF /tmp`, `chown -R root /$MY_DATA_PATFH`), and suddenly you can't log on.

Of course unpredictability itself is also a security problem. I'm not even supposed to run partial updates that at least come from the same repository. I ain't gonna shovel random shell scripts into the mix and hope for the best.

1718627440

18 hours ago

That is still checked for its signature; the only things you bypass are the automatic download over HTTP and, by default, dependency resolution.

mystifyingpoi

21 hours ago

While I do share the sentiment, I firmly believe that for open source, no one should require the author to distribute their software, or even ask them to provide OS-specific installation methods. They wrote it for free; use it or don't. They provide a handy install script - don't like it? Sure, grab the source and build it yourself. Oops, you don't know what the software does? Gotta read every line of it, right?

Maybe if you trust the software, then trusting the install script isn't that big of a stretch?

WorldMaker

20 hours ago

For small project open source with a CLI audience, why bother with an install script at all and not just provide tarballs/ZIP files and assume that the CLI audience is smart enough to untarball/unzip it to somewhere on their PATH?

Also, many of the "distribution" tools like brew, scoop, winget, and more are just "PR a YAML file with your zip file URL, the name of your EXE to add to PATH, and a checksum hash of the zip to this git repository". We're at about the minimum-effort point in software history for producing a "distribution", so it seems interesting that install shell scripts have picked up instead.

Intralexical

9 hours ago

> Maybe if you trust the software, then trusting the install script isn't that big of a stretch?

The software is not written in a scripting language where forgetting quote marks regularly causes silent `rm -rf /` incidents. And even then, I probably don't explicitly point the software at my system root/home and tell it to go wild.

rieogoigr

19 hours ago

Part of writing software involves writing a way to deploy that software to a computer. Piping a web URL to a bash interpreter is not good enough. If that's the best installer you can do, the rest of your code is probably trash.

wtallis

18 hours ago

It's not the best installer they can come up with. It's just the most OS/distro-agnostic one-step installer they can come up with.

Intralexical

10 hours ago

It's so not, though. Half the time if you read one of those install scripts it's just an `if`-chain for a small number of platforms the developer has tested. And breaks if you use a different distro/version.

wtallis

9 hours ago

uv is pretty self-contained; there aren't a lot of ways a weird Linux distro could break it or its installer, aside from not providing any of the three user-owned paths it tries to install uv into (it doesn't try to do anything with elevated privileges or install for anyone other than the current user). Expecting $HOME and your own shell profile to be writable just isn't something that's going to break very often.

Looking at the install script or at a release page (eg. https://github.com/astral-sh/uv/releases/tag/0.9.6 ) shows they have pretty broad hardware support in their pre-compiled binaries. The most plausible route to being disappointed by the versatility of this install script is probably if you're running an OS that's not Linux, macOS, or Windows—but then, the README is pretty clear about enumerating those three as the supported operating systems.

WorldMaker

20 hours ago

That iwr|iex example is especially egregious because it hardcodes the PowerShell <7.0 EXE name to include `-ExecutionPolicy Bypass`. So it'll fail on Linux or macOS, but more importantly iwr|iex is already an execution bypass, so including a second one seems a red flag to me. (What else is it downloading?)

Also, most reasonable developers should already be running with the ExecutionPolicy RemoteSigned; it would be nice if code-signing these install scripts were a little more common, too. (There was even a proposal for icm [Invoke-Command] to take signed script URLs directly, as a much safer code-golfed alternative to iwr|iex. Maybe that proposal should be picked back up.)

shorten6084

12 hours ago

How is it even different from running a pre-compiled binary?

chasd00

21 hours ago

can't you just do curl|more and then view what it's going to do? Then, once you're convinced, go back to curl|sh.

/just guessing, haven't tried it

threeducks

20 hours ago

A malicious server could detect whether the user is actually running "curl | sh" instead of just "curl" and only serve a malicious shell script when the code is executed blindly. See this thread for reference: https://news.ycombinator.com/item?id=17636032

chasd00

20 hours ago

well you still have to execute the shell script at some point. You could do curl > install.sh, open it up to inspect, and then run the install script which would still trigger the callback to the server mentioned in the link you posted. I guess it's really up to the user to decide what programs to run and not run.

01HNNWZ0MV43FF

21 hours ago

Maybe there will be a .deb one day

mystifyingpoi

20 hours ago

That doesn't fix the core issue. You can put anything inside a .deb file; even the preinstall script can send your ~/.aws/credentials to China. The core concern is getting a package that's verified by a volunteer human to not contain anything malicious, and then getting that package into the Debian repository or an equivalent.

rieogoigr

19 hours ago

For real. You want to pipe a random URL to my bash interpreter to install?

No. That's how you get malware. Make a package. Add it to a distro. Then we will talk.

danielhanchen

7 hours ago

Super agree! Love how uv installs packages in parallel! It took `uv pip install unsloth` from 5 minutes down to 30 seconds!

randomsolutions

2 hours ago

The only reason I haven't switched is that, when using uv as my environment manager, starting/connecting to the Python kernel in notebooks within VS Code takes forever.

Phelinofist

3 hours ago

   [project]
   name = "my_project"
   version = "1.0.0"
   requires-python = ">=3.9,<3.13"
   dependencies = [
     "astropy>=5.0.0",
     "pandas>=1.0.0,<2.0",
   ]
looks like a POM file

zem

20 hours ago

I think ruff is the best thing to happen to the python ecosystem in a decade, it really sold the entire community on the difference fast native tooling could make.

taeric

20 hours ago

I still feel bitten by diving into poetry when starting some projects. Has the ecosystem fully moved on to uv, now? Do they have good influence on what python's main ecosystem is moving to?

collinmanderson

19 hours ago

> Has the ecosystem fully moved on to uv, now?

It's moving pretty quick.

> Do they have good influence on what python's main ecosystem is moving to?

Yes, they're an early adopter/implementer of the recent pyproject.toml standards.

taeric

18 hours ago

As someone that is fine on poetry for a few things, is there a good pitch deck on why I should change over?

collinmanderson

16 hours ago

The best pitch deck is spending 5 minutes playing with uv and trying out installing the dependencies for your project.

It’s hard to demonstrate the speed difference in a pitch deck.

taeric

12 hours ago

Fair. I don't even know how long that takes for my projects. I presume I've been fortunate that that is not one of my long poles, as it were.

Hopeful that a lot of this will be even more resolved next time I'm looking to make decisions.

0xpgm

13 hours ago

For now, I prefer to stick to whatever the defaults are from the Python packaging crew and standard library, i.e. `python -m venv` and `pip install` inside of it.

Python for me is great when things can remain as simple to wrap your head around as possible.
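
The whole workflow stays at roughly:

  $ python -m venv .venv
  $ . .venv/bin/activate
  $ pip install -r requirements.txt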

taeric

10 hours ago

Managing environments with `python -m venv`, and all of the easy ways that goes wrong, is exactly what I don't want to deal with. It's almost enough to make me never want to use Python.

waldrews

9 hours ago

(off topic) The code chunks in the article use a ligature font, so ">=" is rendered in a way that makes you stop to think how to type it - which is especially confusing since the context is not exactly math. Down with ligatures and extra cognitive load!

wodenokoto

8 hours ago

Some people love this, and hunt for the perfect ligature-overloaded programming font. Fira Code's popularity is often credited to its tons of ligatures.

I hate it too.

throwaway7783

15 hours ago

Been using pip and venv for a long time. I get that uv is faster, but I don't get why people drool over it this much. What is wrong with pip + venv? I build webapps, so perhaps I don't see the issues the ML world does.

isodev

20 hours ago

Or is it a corporate grab to gain more influence in the ecosystem? I like the idea, but for-profit backing is out of the question. This lesson has been learned countless times.

collinmanderson

an hour ago

Does the conda package manager have the same issue?

collinmanderson

18 hours ago

They plan on making money by running private package registries, rather than making money on the client. https://astral.sh/pyx

LtWorf

17 hours ago

What they say today and what the board will decide to do in 6 months are two entirely separate things.

wiseowise

19 hours ago

Even if it was, that would be the best implementation of the strategy ever.

dcgudeman

20 hours ago

no, it's a python library, get a grip. Also "This lesson has been learned countless times"? No it hasn't, since when has a package manager developed by a for-profit company hurt the ecosystem?

antod

12 hours ago

Was that a dig at the recent RubyGems situation?

pixelbeat__

5 hours ago

It seems dotslash would complement uv well

https://dotslash-cli.com/docs/

DotSlash to get the interpreter for your platform, and uv to get the dependencies.

Perfect for corporate setups with custom mirrors etc.

vietvu

7 hours ago

Yes, agreed. I used to need pyenv or mise to manage Python versions; now that uv does that too, uv is the only tool I need for everything in a Python environment.

mosselman

20 hours ago

uv is great. I am a Ruby developer and I always loathed having to work with Python libraries because of how bad the tooling was. It was too complex to learn for the one-off times that I needed it and nothing worked properly.

Now with uv everything just works and I can play around easily with all the great Python projects that exist.

pjmlp

20 hours ago

Using Python on and off for OS scripting since version 1.6.

It has always been enough to place installations in separate directories, and to use the same bash scripts for environment variable configuration, for all these years.

amoe_

3 hours ago

If I'm already using Poetry and not facing any issues with it, is there any advantage to using uv?

tinodb

3 hours ago

Speed. It's an order of magnitude faster, at least.

barrrrald

14 hours ago

I’m surprised not to see a discussion of the biggest drawback: despite being fewer characters, “uv” is harder to type than “pip”. It requires two different hands to participate and a longer reach with my left index finger. pip is convenient – just a little rattle off with my right hand.

toenail

12 hours ago

Linters and formatters are things you shouldn't have to run by hand... that's what CI, git hooks, IDEs, etc. are for.

palata

6 hours ago

> curl -LsSf https://astral.sh/uv/install.sh | sh

"Just pipe a random script from the internet into your shell! What could possibly go wrong?"

joefarish

6 hours ago

We've all done it. I'm curious, though: what would be the best way to prevent a user from doing this?

redleader55

2 hours ago

Wrap curl to detect whether /dev/stdout is a pipe and whether the output is a script - the latter is fast to check because of the hash-bang. From there you can do a lot of fancy things: replace the output with `echo "don't do this"; exit 1`, check the URL against a list of well-known accepted scripts based on hash, run the unknown ones through an LLM to judge whether they are potentially malicious, etc.
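
A rough sketch of just the pipe-detection part (Linux-flavoured, since it leans on /dev/stdout):

  # in ~/.bashrc
  curl() {
    if [ -p /dev/stdout ]; then
      echo "curl: stdout is a pipe; save the script to a file first" >&2
      return 1
    fi
    command curl "$@"
  }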

palata

2 hours ago

The user is the one copy-pasting that line in their terminal. There is no preventing them from doing it, they can `rm -rf /` if they feel like it.

They shouldn't, though...

random3

12 hours ago

It's surprising to me how long it can take for some languages to get decent package management solutions. There are no silver bullets, because it's tricky to "encode" compatibility in a version number. I personally think semver helped a little and damaged a lot more, by selling a pseudo-solution that stands no chance of solving the real problem.

Maven has always been a very good solution. I think Bazel is too, but haven't had much experience with it.

alienbaby

19 hours ago

Can I just start using uv if I've already got a bunch of projects managed with venv / pyenv / virtualenv? (And tbh I've kinda got into a confused mess with all these venv things, and at this point just hope they all keep working...)

jillesvangurp

19 hours ago

Python is not my first language but I've always liked it. But project and dependency management was always a bit meh and an afterthought.

Over the years, I've tried venv, conda, pipenv, poetry, and plain pip with requirements.txt. I've played with uv on some recent projects and it's a definite step up. I like it.

Uv actually fixes most of the issues with what came before, and it builds on existing things. That is not a small compliment, because the state of the art before uv was pretty bad. Venv, pip, etc. are fine; they are just not enough by themselves. Uv embraces both. Without that, all we had was a lot of puzzle pieces that barely worked together and didn't really fit that well. I tried making conda + pipenv work at some point. Pipenv shell makes your shell usage stateful, which just adds a lot of complexity; none of the IDEs I tried figured that out properly. I had high hopes for poetry, but it ended up a bit underwhelming and still left a lot of stuff to solve. Uv succeeds in providing more of an end-to-end solution: everything from project-specific Python installations, to venvs by default without hassle, to dependency management.

My basic needs are simple. I don't want to pollute my system Python with random crap I need for some project. So, like uv does, I need whatever solution I use to handle installing the right Python version. Besides, the system Python is usually out of date, behind the current stable version, which is what I would use for new projects.

bicepjai

12 hours ago

I completely agree. Deploying Python packages like MCP servers has been a real game changer. I'm so glad the days of wrestling with conda environments and Jupyter kernels are behind us. I used to start personal projects, decide to clean up my Python setup first, and inevitably give up on the project after getting lost in the cleanup.

Areibman

18 hours ago

My biggest frustration is the lack of a good universal REPL to just play around with. It's frustrating how I have to run `uvx --with x,y,z ipython` every single time I just want to spin up some python code which may or may not use packages. (Hard to overstate how annoying it is to type out the modules list).

To me, Python's best feature is the ability to quickly experiment without a second thought. Conda is nice since it keeps everything installed globally, so I can just run `python` or IPython/Jupyter anywhere and know I won't have to reinstall everything every single time.

embeng4096

18 hours ago

Would creating a `main.py` with the dependencies installed either as a uv project or inline work for you?

One thing I did recently was create a one-off script with functions to exercise a piece of equipment connected to the PC via USB, and pass that to my coworkers. I created a `main.py` and uv add'ed the library. Then when I wanted to use the script in the REPL, I just did `uv run python -i main.py`.

This let me just call functions I defined in there, like `set_led_on_equipment(led='green', on=True)` directly in the REPL, rather than having to modify the script body and re-run it every time.
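
A minimal sketch of that kind of main.py (the equipment function is of course a stand-in):

  # main.py -- REPL helpers for the USB-connected equipment
  def set_led_on_equipment(led: str, on: bool) -> None:
      # stand-in for the real USB call
      print(f"LED {led} -> {'on' if on else 'off'}")

Then `uv run python -i main.py` drops you into a REPL with those names already defined.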

Edit: another idea that I just had is to use just[0] and modify your justfile accordingly, e.g. `just pything` and in your justfile, `pything` target is actually `uv run --with x,y,z ipython`

Edit edit: I guess the above doesn't even require just, it could be a command alias or something, I probably am overengineering that lol.

[0]: https://github.com/casey/just

ValtteriL

9 hours ago

My go-to development environment for Python projects is nowadays Nix for uv, and uv for Python deps.

Nix does not play well with Python dependencies. It's so nice to have that part taken care of in a reproducible manner by uv.

mannicken

19 hours ago

God yes. I got dragged into uv when I started using copyparty and I have been a fanatical admirer ever since. I also use pipx to install tools often. I really don't understand why you can't just pip install something globally. I want this package to be available to me EVERYWHERE, why can't I do it? I only use Python recreationally, because everyone uses Python everywhere and you can't escape it. So there is a massive possibility I am simply wrong and pip-installing something globally is a huge risk. I'm just not understanding it.

collinmanderson

19 hours ago

> I really don't understand why you can't just pip install something globally. I want this package to be available to me EVERYWHERE, why can't I do it? I only use python recreationally because everyone uses python everywhere and you can't escape it. So there is a massive possibility I am simply wrong and pip-installing something globally is a huge risk. I'm just not understanding it.

You may have a library that's been globally installed, and you have multiple projects that rely on it. One day you may need to upgrade the library for use in one project, but there are backward-incompatible changes in the upgrade, so now all of your other projects break when you upgrade the global library.

In general, when projects are used by multiple people across multiple computers, it's best to have the specific dependencies and versions specified in the project itself so that everyone using that project is using the exact same version of each dependency.

For recreational projects it's not as big of a deal. It's just harder to recreate your environment.

zahlman

19 hours ago

> I want this package to be available to me EVERYWHERE, why can't I do it?

Because it being available in the system environment could cause problems for system tools, which are expecting to find something else with the same name.

And because those tools could include your system's package manager (like Apt).

> So there is a massive possibility I am simply wrong and pip-installing something globally is a huge risk. I'm just not understanding it.

I assume you're referring to the new protections created by the EXTERNALLY-MANAGED marker file, which will throw up a large boilerplate warning if you try to use pip to install packages in the system environment (even with --user, where they can still cause problems when you run the system tools without sudo).

You should read one or more of:

* the PEP where this protection was introduced (https://peps.python.org/pep-0668/);

* the Python forum discussion explaining the need for the PEP (https://discuss.python.org/t/_/10302);

* my blog post (https://zahlman.github.io/posts/2024/12/24/python-packaging-...) where I describe in a bit more detail (along with explaining a few other common grumblings about how Python packaging works);

* my Q&A on Codidact (https://software.codidact.com/posts/291839/) where I explain more comprehensively;

* the original motivating Stack Overflow Q&A (https://stackoverflow.com/questions/75608323/);

* the Python forum discussion (https://discuss.python.org/t/_/56900) where it was originally noticed that the Stack Overflow Q&A was advising people to circumvent the protection without understanding it, and a coordinated attempt was made to remedy that problem.

Or you can watch Brodie Robertson's video about the implementation of the PEP in Arch: https://www.youtube.com/watch?v=35PQrzG0rG4.

namanyayg

10 hours ago

Is there a resource to understand the nuanced technical differences between pip and uv?

As an outsider to the python ecosystem I've wanted to learn the _how_ behind uv as well, but that hasn't been immediately clear

raduan

8 hours ago

I just hope that at this point they take over Python as a language and can shape it properly for the next 10 years (for AI and also for humans).

mov_dx_cx

5 hours ago

After uv, I'm just enjoying Python projects and actually building something.

biglost

11 hours ago

I'm weird, and this opinion is even weirder, but how about starting to remove what makes Python slow? Sometimes you need to remove features. It's hard, a pain, but I don't see any other way of fixing really big mistakes...

culebron21

9 hours ago

Looks similar to buildout 15 years ago. It also gave you a local executable `./bin/python`, with its own PYTHONPATH. Then everyone rushed to virtualenv.

wrs

20 hours ago

Every time I see one of these comment threads it seems like uv desperately needs a better home page that doesn’t start with a long list of technical stuff. It’s really simple to use, in fact so simple that it confuses people!

The home page should be a simplified version of this page buried way down in the docs: https://docs.astral.sh/uv/guides/projects/

eikenberry

18 hours ago

I either want one universal tool that can manage this sort of thing across multiple languages (eg. devenv) or a native, built-in tool (eg. go's tooling). I don't see how this is any different from all the previous incarnations of Python's project/package management tools. The constant churning of 3rd party tooling for Python was one of the main reasons I mostly stopped using it for anything but smaller scripts.

9dev

17 hours ago

The difference is that this one is actually good. So good, in fact, that there is considerable momentum and thus adoption behind this tool, and I wouldn't be surprised if it eventually reaches a status similar to npm's for Node.

zelphirkalt

15 hours ago

From the article:

> uv is an incredibly powerful simplification for us that we use across our entire tech stack. As developers, we can all work with identical Python installations, which is especially important given a number of semi-experimental dependencies that we use that have breaking changes with every version. On GitHub Actions, we’re planning to use uv to quickly build a Python environment and run our unit tests. In production, uv already manages Python for all of our servers.

> It’s just so nice to always know that Python and package installation will always be handled consistently and correctly across all of our machines. That’s why uv is the best thing to happen to the Python ecosystem in a decade.

I can only conclude that the author of the article, and perhaps even the organization they work in, was unaware of other tools that did the job long before uv. If they really value reproducibility that much, how come they didn't look into the matter before? Things must have been really hastily stitched together if no one ever looked at existing tooling, and only now are they making things reproducible.

I guess reproducibility is still very much a huge problem, especially in jobs, where it should be one of the most important things to take care of: Research. ("Astronomer & Science Communicator" it says on the website). My recommendation is: Get an actual software developer (at least mid-level) to support your research team. A capable and responsibly acting developer would have sorted this problem out right from the beginning.

I am glad they improved their project setups to the level they should be at, if they want to call it research.

collinmanderson

15 hours ago

> I can only conclude, that the author of the article, and perhaps even the organization they work in, is unaware of other tools that did the job long before uv. If they really value reproducibility that much, how come they didn't look into the matter before? Things much have been really hastily stitched together, if no one ever looked at existing tooling before, and only now they make things reproducible.

Yes, Poetry has had lock files for years, and pyenv has been able to manage installations, but uv is "an incredibly powerful simplification" that makes it easy to do everything really well with just one tool.

zelphirkalt

15 hours ago

That doesn't really explain how their organization apparently got by without proper lock files before, given that they do research. If anything, this article shines light on the previously bad state of project setup in the organization.

acdha

15 hours ago

Also, I kinda feel dirty criticizing an open source project but Poetry seems to be struggling with technical debt. I hit bugs which have been open for years or stuff which is WONTFIXed, and while they truly do not owe me anything, it’s a lot more rewarding to use uv where I hit fewer issues in general and the stuff I do hit is usually fixed quickly.

There’s a bigger conversation about open source maintenance there, but if I have to get my job done it’s increasingly tempting to take the simplifications and speed.

captain_coffee

20 hours ago

Yes, uv is probably the best thing to happen to the Py ecosystem in the last decade. That is mainly because the rest of the ecosystem is somewhere between garbage fire and mediocre at best. uv in itself is a great tool, I have no complaints about it whatsoever! But we have to remember just how bad the rest of things are and never forget that everything's still in a pretty bad state even after more than 3 ** DECADES ** of constant evolution.

These335

19 hours ago

Got a specific example in mind for garbage fire and mediocre?

rkagerer

15 hours ago

This looks awesome.

But why is the Windows installation method to execute a script off the Internet with security isolation bypassed?

powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"

pshirshov

19 hours ago

And still there are some annoying issues:

  dependencies = [
      "torch==2.8.0+rocm6.4",
      "torchvision==0.23.0+rocm6.4",
      "pytorch-triton-rocm==3.4.0",
  ...
  ]
There is literally no easy way to also have a configuration for CUDA: you have to have a second config and, worse, manually copy/symlink them into the hardcoded pyproject.toml file.

sirfz

18 hours ago

Check out dependency groups and uv's conflicts configuration.
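
Roughly, per uv's docs on conflicting extras and alternative indexes (index names and versions here are illustrative):

  [project.optional-dependencies]
  rocm = ["torch==2.8.0"]
  cu124 = ["torch==2.8.0"]

  [tool.uv]
  conflicts = [[{ extra = "rocm" }, { extra = "cu124" }]]

  [tool.uv.sources]
  torch = [
    { index = "pytorch-rocm", extra = "rocm" },
    { index = "pytorch-cu124", extra = "cu124" },
  ]

  [[tool.uv.index]]
  name = "pytorch-rocm"
  url = "https://download.pytorch.org/whl/rocm6.4"
  explicit = true

  [[tool.uv.index]]
  name = "pytorch-cu124"
  url = "https://download.pytorch.org/whl/cu124"
  explicit = true

Then `uv sync --extra rocm` or `uv sync --extra cu124` picks the backend, without maintaining two pyproject.toml files.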

senderista

17 hours ago

It’s puzzling why Python became the de facto standard scripting language rather than Ruby when the tooling was so inferior.

nhumrich

13 hours ago

Perhaps the language design is more important than the tooling?

insane_dreamer

11 hours ago

the ecosystem, especially math / stats / data analysis packages. Also Google used python, making it more popular

cidd

16 hours ago

AI/ML

collinmanderson

15 hours ago

> AI/ML

The Machine-Learning world, especially "Google Brain" research team figured out that NumPy was an awesome piece of software for dealing with large arrays of numbers and matrix multiplication. They built "TensorFlow" on top of it around 2015 which became very popular. Facebook followed suit and released PyTorch in 2016.

IPython/Jupyter notebooks (for Julia, Python and R) from 2015 were another factor, also adopted by the AI/ML community.

The alternative data-science languages at the time were Mathematica, MATLAB, SAS, Fortran, Julia, R, etc, but Python probably won because it was general purpose and open source.

I suspect Python would not have survived the 2/3 split very well if it wasn't for AI/ML adopting Python as its main language.

> when the tooling was so inferior

Since 2012, Conda/Anaconda has been the go-to installer in the SciPy/NumPy world which also solves a lot of problems that uv solves.

tonymet

19 hours ago

Can someone steelman the python tooling ecosystem for me? Having a new packaging / dependency manager every few years seems excessive.

collinmanderson

19 hours ago

uv is finally an all-in-one tool that takes all of the good ideas from previous projects and combines them so they work well as one (and it's unbelievably fast).

The fact that it's a binary, not written in python, also simplifies bootstrapping. So you don't need python+dependencies installed in order to install your python+dependencies.

antod

13 hours ago

One helpful element that has changed over the years compared to the old wild west days is the large number of PEPs that have quietly in the background bit by bit standardized packaging formats and requirements.

Some foundations have moved into the stdlib. This means that newer tools are much more compatible with each other and mainly just differ in implementation rather than doing different things altogether. The new stuff is working on a much more standard base and can leave behind many dark crufty corners.

Unravelling the legacy stuff and putting the standards in place seems to have taken 15+ years?

tonymet

17 hours ago

I'm hoping for the best. Now there's a lot of CI config and README.md files that will need rewriting.

zahlman

18 hours ago

All of these tools are third-party and the Python core development team can't do anything to prevent people from inventing new ones. Even pip is technically at arms length; it has special support in the standard library (Python releases will vendor a wheel for it, which is designed to be able to bootstrap itself for installation[0]), but is developed separately.

Standards are developed to allow existing tools to inter-operate; this entails allowing new tools to appear (and inter-operate), too.

This system was in some regards deliberate, specifically to support competition in "build backends". The background here is that many popular Python projects must interface to non-Python code provided with the project; in many cases this is code in compiled languages (typically C, Fortran or Rust) and it's not always possible to pre-build for the user's system. This can get really, really complicated, and people need to connect to heavyweight build systems in some cases. The Python ecosystem standards are designed with the idea that installers can automatically obtain and use those systems when necessary.

And by doing all of this, Python core developers get to focus on Python itself.

Another important concern is that some bad choices were made initially with Setuptools, and we have been seeing a very long transition because of a very careful attitude towards backwards compatibility (even if it doesn't seem that way!) which in turn is motivated by the battle scars of the 2->3 transition. In particular, it used to be normal and expected that your project would use arbitrary Python code (in `setup.py` at the project root) simply to specify metadata. Further, `setup.py` generally expects to `import setuptools`, and might require a specific version of Setuptools; but it can't express its build-time Setuptools version requirement until the file is already running - a chicken-and-egg scenario.

Modern projects use a declarative TOML file for "abstract" metadata instead (which is the source for concrete metadata included in the actual build artifacts), but the whole ecosystem still has to support a lot of really outdated ways of doing things, because in part of how much abandonware is out there.
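
For contrast, the modern declarative core is just a few lines of TOML (names and pins illustrative):

  [build-system]
  requires = ["setuptools>=68"]
  build-backend = "setuptools.build_meta"

  [project]
  name = "example"
  version = "0.1.0"
  dependencies = ["requests>=2.31"]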

[0]: Wheels are zip-compressed, and Python can run code from a zip file, with some restrictions. The pip project is designed to make sure that this will work. The standard library provides a module "ensurepip" which locates this wheel and runs a bootstrap script from that wheel, which will then install into the current environment. Further, the standard library "venv", used to create virtual environments, defaults to using this bootstrap in the newly created environment.

tonymet

17 hours ago

It's helpful context but still seems like a lost opportunity for python to provide the UI. It feels like every couple years we are reworking the wheel and redefining how to publish software.

With Python over the years I can think of pip, pipx, setuptools, easy_install, distutils, venv, conda, .egg and wheel (formats), and now uv.

PHP stabilized with composer, perl with cpan , go with `go mod` and `go get` (builtin).

Java and Swift had some competition with Gradle/maven and swiftPM / cocoapods, but nothing as egregious.

file tree, dep tree, task DAG. how many ways can they be written?

zahlman

16 hours ago

> It feels like every couple years we are reworking the wheel

Almost literally: https://wheelnext.dev/

> how many ways can they be written?

It's not just a matter of how they're written. For Python specifically, build orchestration is a big deal. But also, you know, there are all the architecture ideas that make uv faster than pip. Smarter (and more generous) caching; hard-linking files where possible rather than copying them; parallel downloads (I tend to write this off but it probably does help a bit, even though the downloading process is intermingled with resolution); using multiple cores for precompiling bytecode (the one real CPU-intensive task for a large pure-Python installation).

tonymet

12 hours ago

It sounds great, and I'm not against uv. It probably is the best. I'm wondering what's wrong with the Python community that 25 years has seen 10 package managers. I'm not being cynical; it's a clinical/empirical question.

dark__paladin

21 hours ago

Genuinely trying to learn here - what's the major advantage of using uv over conda?

(Transparently, I'm posting this before I've completed the article.)

ethmarks

20 hours ago

They have different use cases. uv is meant to be the singular tool for managing Python packages and dependencies, replacing pip, virtualenv, and pip-tools. Conda is for more general-purpose environment management, not just Python. If you're doing something with Node or R, uv won't work at all because it's only for Python.

uv's biggest advantage is speed. It claims a 10-100x performance speedup over pip and Conda [1]. uv can also manage python versions and supports using Python scripts as executables via inline dependencies [2].

But Conda is better for non-Python usage and is more mature, especially for data science related uses.

[1]: https://github.com/astral-sh/uv/blob/main/BENCHMARKS.md [2]: https://docs.astral.sh/uv/#scripts
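
The version-management side, for example, looks like this (script name illustrative):

  $ uv python install 3.12        # uv downloads a managed CPython
  $ uv run --python 3.12 main.py  # and runs your code against it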

collinmanderson

20 hours ago

uv is unbelievably fast.

zahlman

18 hours ago

The speed is quite believable. Reinstalling packages from a cache should be extremely fast. Pip suffers from poor architecture.

psunavy03

20 hours ago

I have one problem with uv as of now, and it's more of an annoyance. It doesn't seem to understand the concept of >= when it's trying to resolve a local wheel I built and use. If I have 6.4.1 published on GitLab and the pyproject says $WHEEL_NAME>=6.2.0, it still goes to look for 6.2.0 (which I deleted) and errors out.

logicprog

17 hours ago

Uv existing is what made me willing to use Python as my primary prototype/experiment language!

aurintex

19 hours ago

I can only agree. I'm not a Python expert, but I always struggled when installing a new package and getting the warning that it could break the system packages, or when cloning an existing repo onto a newly installed system. I always wondered why it became so "complicated" over the years.

mogoh

5 hours ago

Is it worth running uv inside a docker container?

hoherd

5 hours ago

Sometimes, yes, but I would say it depends a lot on your base image and project. If nothing else, it gives you a runtime environment that is identical to your development environment.

Using uv at build time can dramatically reduce your build times if you properly handle the uv cache. https://docs.astral.sh/uv/guides/integration/docker/#caching

It's also easy:

COPY --from=ghcr.io/astral-sh/uv:latest /uv /uvx /bin/

https://docs.astral.sh/uv/guides/integration/docker
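
A minimal sketch along the lines of that guide (assumes uv's default cache path):

  FROM python:3.12-slim
  COPY --from=ghcr.io/astral-sh/uv:latest /uv /uvx /bin/
  WORKDIR /app
  COPY pyproject.toml uv.lock ./
  RUN --mount=type=cache,target=/root/.cache/uv \
      uv sync --frozen --no-install-project
  COPY . .
  RUN --mount=type=cache,target=/root/.cache/uv uv sync --frozen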

dirkc

5 hours ago

If you are using it for your project, wouldn't you necessarily have to use it for your Docker setup as well?

cyrialize

20 hours ago

I haven't tried uv yet, but I did use its precursor, Rye.

I had to update some messy python code and I was looking for a tool that could handle python versions, package updates, etc. with the least amount of documentation needing be read and troubleshooting.

Rye was that for me! Next time I write python I'm definitely going to use uv.

sirfz

20 hours ago

Indeed, Rye is great, and switching to uv is pretty straightforward. I still think Rye's use of shims was pretty cool, but uv's approach is probably more sane.

moonAA

7 hours ago

UV is so cool! These days, I never develop new projects without it.

languagehacker

21 hours ago

A very accessible and gentle introduction for the scientific set who may still be largely stuck on Conda. I liked it!

sanskarix

10 hours ago

This resonates so much. As someone who's more on the builder/product side than engineering, I've always felt that barrier with Python tooling. The learning curve for environment management has been one of those silent productivity killers.

What strikes me about uv is that it seems to understand that not everyone launching a Python-based project has a CS degree. That accessibility matters—especially in the era where more non-engineers are building products.

Curious: for those who've switched to uv, did you notice any friction when collaborating with team members who were still on traditional setups? I'm thinking about adoption challenges when you're not a solo builder.

talsperre

19 hours ago

uv is the best tool out there as long as you have Python-only dependencies. It's really fast, and you can avoid using poetry, pipenv, etc. The only reason for conda to still exist is non-Python dependencies, but that's another beast to tackle in itself.

hglaser

20 hours ago

Am I the only one who feels like this is obviated by Docker?

uv is a clear improvement over pip and venv, for sure.

But I do everything in dev containers these days. Very few things get to install on my laptop itself outside a container. I've gotten so used to this that tools that uninstall/install packages on my box on the fly give me the heebie-jeebies.

czbond

20 hours ago

> I do everything in dev containers these days. Very few things get to install on my laptop itself outside a container.

Yes, it was the NPM supply chain issues that really forced this on me. Now I install, fetch, and build in an interactive Docker container.

NumberCruncher

19 hours ago

> Am I the only one who feels like this is obviated by Docker?

This whole discussion has the same vibes as digital photography 15 years ago. Back then some people spent more time discussing the tech specs of their cameras than taking photos. Now some people spend more time discussing the pros and cons of different Python environment management solutions than building real things.

The last time I had to touch one of my dockerized environments was when Miniconda and Miniforge were merged. I told the agent "fix the dockerfile", and the third attempt worked. Another time, one dependency was updated and I had to switch to Poetry. Once again, I told the agent "refactor the repository to Poetry" and it worked. Maybe because all my Python package versions are frozen and I only update them when they break or when I need the functionality of the new version.

Whenever this topic pops up in real life, I always ask back what was the longest time they managed the same Python service in the cloud. In most cases, the answer is never. The last time, someone said one year. After a while that service was turned into two .py files.

I don't know. Maybe I'm just too far away from FAANG level sorcery. Everything is a hammer if all you have to deal with are nails.

zahlman

19 hours ago

Lots of people are doing things where they would prefer not to invoke the weight of an entire container.

panzi

13 hours ago

> uv is straightforward to install.

How do I install it globally on a system? Debian doesn't let me install packages via pip outside of a venv or similar.

varispeed

2 hours ago

I don't know. This looks more complex than pyenv.

samgranieri

20 hours ago

I'm not a pythonista, and the most recent time I've been playing with Python has been using octodns. Originally I was using a pip setup, and honestly, wow, uv was so much faster.

I'm very happy the python community has better tooling.

dec0dedab0de

20 hours ago

I don't like that it defaults to putting the virtual environment right there; I much prefer how pipenv does it, with a shared one in the user's home directory. But it's a small price to pay for how fast it is.

aranw

19 hours ago

For years I've avoided using Python tools because I've always struggled to get them working properly. Will uv solve this pain for me? Can I install a Python app globally with it?

pouetpouetpoue

8 hours ago

The best thing would be to use the package manager of the OS.

armeetj

10 hours ago

uv brings some organization to the hell that is python package management

srameshc

20 hours ago

I am still learning, and as someone who doesn't consider himself good with Python, I have the same feeling. At least I can keep my venvs under control now, is all I can say about the uv approach.

himyathanir

10 hours ago

The popularity of uv marks the commercialization of the python ecosystem. And in this case by a company whose business model is totally elusive to me. Not a fan.

pulkitsh1234

10 hours ago

uv has been my sole reason to come back to Python for coding. It was just too time-consuming to set up a working dev environment with Python locally.

never_inline

8 hours ago

This is just not true. Poetry existed for a while. It was slower than uv but not a deal breaker.

samuel2

19 hours ago

Reminds me of Julia's Pkg manager and the way Julia packages are managed (also with a .toml file). That's the way to go!

nothrowaways

20 hours ago

Does speed really matter during python installation?

andy99

16 hours ago

Possibly for some workflows, though personally I find the emphasis on speed baffling, and it's a big part of the reason I don't find most of these uv testimonials credible. I'm a regular Python user across multiple environments, and I've never considered waiting for pip to be a material part of my time; it's trivial to the point of being irrelevant. The fact that so many people come out of the woodwork to talk about how fast it is means either there's some big group somewhere with a niche use case that gets them bogged down in pip dependency resolution or whatever gets sped up (obviously the actual downloading can't be faster), or it's just a talking point that (presumably) Rust zealots who don't actually use Python arrive with en masse. Either way, it's an extremely ineffective way of promoting the product to most Python users, who don't have speed of package installation as anything close to a pain point.

maccard

20 hours ago

Speed matters everywhere. How much compute is spent on things that could easily be 100x faster than they are? Compare using VMware with pip to run a battery of unit tests with firecracker plus uv. It’s orders of magnitude quicker, and avoids a whole suite of issues related to persistent state on the machine

sunshowers

20 hours ago

Yes. Technical excellence is a virtue in and of itself.

mwcampbell

17 hours ago

This! I'm tired of the constant calls to be as mediocre as we can get away with, in the name of getting things done faster and cheaper.

collinmanderson

20 hours ago

It's fast enough that sometimes dependencies can be checked and resolved and installed at program runtime rather than it needing to be a separate step.

You can go from no virtual environment, and just "uv run myfile.py" and it does everything that's needed, nearly instantly.
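
With inline script metadata (PEP 723) it's self-contained; `requests` here is just an illustrative dependency:

  # myfile.py
  # /// script
  # dependencies = ["requests"]
  # ///
  import requests
  print(requests.get("https://example.com").status_code)

`uv run myfile.py` resolves and installs that into a cached environment before running the script.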

zahlman

19 hours ago

On my system, Pip takes noticeable time just to start up without ultimately doing anything of importance:

  $ time pip install
  ERROR: You must give at least one requirement to install (see "pip help install")

  real 0m0.356s
  user 0m0.322s
  sys 0m0.036s
(Huh, that's a slight improvement from before; I guess pip 25.3 is a bit better streamlined.)

andy99

16 hours ago

lol, who is using pip so much that .36s of startup time matters to them? If all uv offers is doing nothing slightly faster, that's an absolutely meaningless benefit.

sunshowers

14 hours ago

In general, whenever you introduce a cache to make software faster (along any dimension), you have to think about cache invalidation and eviction. If your software is fast enough to not need caching, this problem goes away.

nova22033

19 hours ago

The first time I tried to teach my son Java, I realized how badly it's missing a built-in dependency management system.

cluckindan

9 hours ago

Finally, npm comes to python-land. :-)

__mharrison__

18 hours ago

The best thing since f-strings...

I'm teaching (strongly recommending/forcing) uv in all my courses now.

quantum_state

19 hours ago

Running pytest via `uv run --active pytest` is very slow to get started... does anyone have some tips on this?

docsaintly

20 hours ago

Python venvs are the #1 reason I've avoided working with it more. It used to be #2 behind strong typing, but now that Linux OSes take over the default Python install and block it from being used for quick scripts, it has jumped to #1.

I've always wondered why Linux OSes that rely on python scripts don't make their own default venv and instead clobber the user's default python environment...


bonyt

14 hours ago

uv is fast enough that you can put things like this in your profile:

   alias ytd="uv tool upgrade yt-dlp && yt-dlp"
Which is pretty cool.

superfish

13 hours ago

Why not just `ytd="uvx yt-dlp"`?

nicman23

8 hours ago

the farce that is python packaging is the reason that 99% of docker images exist

didip

18 hours ago

UV indeed is a blessing. Love it. Hopefully it gets recommended as the official one.

LtWorf

17 hours ago

It's VC backed. I have 100% confidence that it will end up badly.

devlovstad

20 hours ago

uv has made working with different python versions and environments much, much nicer for me. Most of my colleagues in computational genomics use conda, but I've yet to encounter a scenario where I've been unable to just use uv instead.

bfkwlfkjf

20 hours ago

The best thing about uv is it's not conda.

Pip is also not conda, but uv is way faster than pip.


zelphirkalt

16 hours ago

Hmpf. I am using uv now, but I was doing fine before, using poetry. For me it is not a huge revolution, as I always value reproducibility, which means a lock file and checksums, and I was able to have that before with poetry. Yes, yes... uv is faster. I grant them that. And yes, it's pleasant when it runs so quickly. But I am not changing dependencies so often that this really impacts my productivity. A venv is created, it stays. Until at some point I update pyproject.toml and the lock file.

Since I mostly avoid non-reproducible use-cases, like stating dependencies inside the Python scripts themselves, without checksums, only with versions, and stuff like that, I am not really benefiting that much. I guess I am just not writing enough throwaway code to benefit from those use-cases.

Some people here act like uv is the first tool ever to install dependencies the way npm and cargo and so on do. Well, I guess they didn't use poetry before, which did just that.

skavi

16 hours ago

poetry was incredibly slow and flaky in my experience.

zelphirkalt

15 hours ago

I've used it in various work projects/services, and in various projects in my free time. Nothing "flaky" ever happened with it. Care to elaborate on what you mean by that?

magdyks

20 hours ago

Huge fan of uv and ruff, and starting to play around with ty. Hats off to Astral!

AbuAssar

10 hours ago

Why is uv not included by default with the standard Python binary installers?

asaddhamani

21 hours ago

I find the python tooling so confusing now. There’s pip, virtualenv, pipx, uv, probably half a dozen others I’m missing. I like node, npm isolates by default, npx is easy to understand, and the ecosystem is much less fragmented. I see a python app on GitHub and they’re all listing different package management tools. Reminds me of that competing standards xkcd.

tabletcorry

21 hours ago

Node has at least bun, and probably other tools, that attempt to speed things up in similar ways. New tooling is always coming for our languages of choice, even if we aren't paying attention.

collinmanderson

20 hours ago

> There’s pip, virtualenv, pipx, uv, probably half a dozen others I’m missing...

> Reminds me of that competing standards xkcd.

Yes, for years I've sat on the sidelines avoiding the fragmented Poetry, pyenv, pipenv, pipx, pip-tools/pip-compile, rye, etc., but uv now finally seems to be the all-in-one solution succeeding where other tools have failed.

zahlman

18 hours ago

> I see a python app on GitHub and they’re all listing different package management tools.

In general, you can use your preferred package management tool with their code. The developers are just showing you their own workflow, typically.

theultdev

20 hours ago

well there's npm, pnpm, yarn, bun package managers

not a python developer, so not sure it's equivalent as the npm registry is shared between all.

curiousgal

21 hours ago

The best thing to happen to the Python ecosystem would be something that unites pip and conda. Conda is not going anywhere given how many packages depend on non-python binaries, especially in enterprise settings.

Carbonhell

20 hours ago

You might be interested in Pixi: https://prefix.dev/ It uses uv under the hood for Python dependencies, while allowing you to also manage Conda dependencies in the same manifest (pixi.toml). The ergonomics are really nice and intuitive imo, and we're on our way to replacing our Poetry and Conda usage with only Pixi for Python/C++ astrodynamics projects. The workspace-centric approach along with native lockfiles made most of our package management issues go away. I highly recommend it! (Not affiliated in any way, other than contributing a simple PR for fun.)

karlding

21 hours ago

I'm not sure if you're aware, but there's the Wheel Variants proposal [0] that the WheelNext initiative is working through that was presented at PyCon 2025 [1][2], which hopes to solve some of those problems.

uv has implemented experimental support, which they announced here [3].

[0] https://wheelnext.dev/proposals/pepxxx_wheel_variant_support...

[1] https://us.pycon.org/2025/schedule/presentation/100/

[2] https://www.youtube.com/watch?v=1Oki8vAWb1Q

[3] https://astral.sh/blog/wheel-variants

zahlman

21 hours ago

The standard approach nowadays is to vendor the binaries, as e.g. Numpy does. This works just fine with pip.

I'm interested if you have any technical documentation about how conda environments are structured. It would be nice to be able to interact with them. But I suspect the main problem is that if you use a non-conda tool to put something into a conda environment, there needs to be a way to make conda properly aware of the change. Fundamentally it's the same issue as with trying to use pip in the system environment on Linux, which will interfere with the system package manager (leading to the PEP 668 protections).

dugidugout

20 hours ago

I had this discussion briefly with a buddy who uses Python exclusively for his career in astronomy. He was lamenting the pains of collaborating around Conda, yet seemed convinced it was irreplaceable. Since I'm not familiar with the exact limitations Conda is providing for, I'm curious if you could shed some insight here. Does Nix not technically solve the issue? I understand this isn't solely a technical problem and Nix adoption in this space isn't likely, but I'm curious nonetheless!

semiinfinitely

20 hours ago

I had a recent period in my programming career where I started to actually believe that the "worse is better" philosophy is true in practice. It was a dark period, and thankfully the existence of tools like uv saved me from that abyss.

an_guy

20 hours ago

All these comments look like advertisement. "uv is better than python!!", "8/10 programmers recommend uv", "I was a terrible programmer before but uv changed my life!!", "uv is fast!!!"

collinmanderson

20 hours ago

> All these comments look like advertisement. "uv is better than python!!", "8/10 programmers recommend uv", "I was a terrible programmer before but uv changed my life!!", "uv is fast!!!"

Have you tried uv?

an_guy

20 hours ago

Why would I? Does it offer something that the standard Python tools don't? Why uv over, let's say, conda?

andy99

20 hours ago

FWIW I asked the same question last time a uv thread was posted (two weeks ago) - got some legit answers, none that swayed me personally but I can see why people use it. Also lots of inexplicable love for it https://news.ycombinator.com/item?id=45574550

collinmanderson

16 hours ago

I agree that the speed improvements are inexplicable, as in I can't convince you in writing. "uv is fast!!!" doesn't do it justice. You kinda just have to experience it for yourself.

If you haven't spent 5 minutes trying it out, you don't know what you're missing.

If you're worried about getting addicted like everyone else, I could see that as a valid reason to never try it in the first place.

dragonwriter

20 hours ago

> Does it offer something that standard python tools doesn't?

Other than speed and consolidation, pip, pipx, hatch, virtualenv, and pyenv together roughly do the job (though pyenv itself isn’t a standard python tool.)

> Why uv over, lets say, conda?

Support for Python standard packaging specifications and consequently also easier integration with other tools that leverage them, whether standard or third party.

wiseowise

19 hours ago

Maybe open one of the hundreds of threads praising uv, to find what has been answered a thousand times?

andy99

20 hours ago

First time reading one of these threads? It’s a cult, and don’t dare criticize it. I think the same thing used to be true with rust though nobody really talks about it much anymore.

I don’t think people would think twice about the legitimacy (if you want to call it that) of uv except for all the weird fawning over it that happens, as you noticed. It makes it seem more like a religion or something.

jonnycomputer

18 hours ago

How is this different than (or better than) pyenv?

peter-m80

19 hours ago

So basically a node-like thing for python

rieogoigr

19 hours ago

Is there a way to install this that doesn't involve piping a random URL to my shell interpreter?

zahlman

18 hours ago

Uv is available as a wheel from PyPI, so you can in fact `pip install uv` into an appropriate environment. Since it provides a command-line binary, Pipx will also happily install it into an environment it manages for you. And so on and so forth. (You can even install uv with uv, if you want to, for whatever reason.)

The wheel basically contains a compiled ~53MB (huh, it's grown in recent versions) Rust executable and a few boilerplate files and folders to make that play nice with the Python packaging ecosystem. (It actually does create an importable `uv` module, but this basically just defines a function that tells you the path to the executable.)

If you want it in your system environment, you may be out of luck, but check your full set of options at https://docs.astral.sh/uv/getting-started/installation/ .

The install script does a ton of system introspection. It seems to be structured quite similarly to the Julia installer, actually.
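
So, concretely, something like:

  $ pipx install uv   # or: pip install uv, in an environment you control
  $ uv --version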

petralithic

16 hours ago

Rust is the best thing to happen to the Python (and JS) ecosystem in a decade. Once people realized that the tooling doesn't need to be written in the same language as the target language, it opened up all sorts of performance possibilities.

orangeisthe

11 hours ago

why is a uv README the top post on HN??

orliesaurus

13 hours ago

the hilarious part is that uv is written in rust

mhogers

20 hours ago

Seeing a `pip install -r requirements.txt` in a very recently created python project is almost a red flag now...

nomel

19 hours ago

requirements.txt allows pip arguments to be included, so can be doing much more than just listing package names.

For example, installing on an air-gapped system, where uv barely has support.
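
For example, a requirements.txt for an offline install from a local wheel directory (paths and pins illustrative):

  --no-index
  --find-links ./wheelhouse
  requests==2.32.3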

CalChris

20 hours ago

Mojo?

zahlman

20 hours ago

As far as I can tell, Mojo doesn't have very broad adoption. It also isn't actually Python, it just looks like it.

ModernMech

19 hours ago

Mojo stopped saying out loud they are trying to be a Python superset. Maybe they can do it one day but they're keeping that on the DL now because it's a really big ask.

helicone

13 hours ago

sunlight really IS the best disinfectant

karel-3d

7 hours ago

Another year, another new way of installing python packages?

dev_l1x_be

20 hours ago

And Rust is the best thing to happen to CS in a decade

fortran77

17 hours ago

Why is it written in Rust though? I'd prefer a pure Python solution.

pixelpoet

20 hours ago

Wait until they fully embrace the benefits of strong typing :)

tootie

20 hours ago

I've been using uv and am pleased that it is about as useful as Maven was the last time I used it 12 years ago. I'm not really sure why we still need venv.

j45

20 hours ago

uv has definitely helped make python a first class citizen in more ways.

TZubiri

6 hours ago

>curl -LsSf https://astral.sh/uv/install.sh | sh

Don't do this shit, especially if you were told to do this shit.

If you are going to do this shit, separate the commands and read the bash script (scroll all the way through it; if another file is downloaded, download that manually and inspect it too).

If you are going to ask people to do this shit, split the command in two. Someone who asks me to do something insecure is either a malicious actor trying to compromise me, or someone careless enough to be compromised themselves.
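
In other words, at minimum something like this (same URL, just split up):

    curl -LsSf https://astral.sh/uv/install.sh -o install.sh
    less install.sh    # actually read it
    sh install.sh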

I don't care what uv is; I can pip install stuff, thank you. I install 2 or 3 things, tops. I don't install 500 packages; that sounds like a security nightmare.

Change your ways or get pwned people. Don't go the way of node/npm

P.S.: Stop getting cute with your TLDs; use a .com or the TLD from your country. Gimmick TLDs are opaque and add an unnecessary vector, in this case a dependency on the politics of the British Overseas Territory of Saint Helena, Ascension, and Tristan da Cunha.

monerozcash

an hour ago

Nobody actually inspects binaries anyway, what's the difference?

eisbaw

19 hours ago

nix-shell is the OG

sph

21 hours ago

There is something hilarious about using a project/package manager written in another language.

wiseowise

19 hours ago

Care to share with the group what’s so hilarious?

philipallstar

20 hours ago

Wait til you find out what CPython is written in.

andrewstuart

21 hours ago

Venv seems pretty straightforward once you’ve learned the one activate command.

I don't really get it: uv solves all these problems I've never encountered. Just making a venv and using it seems to work fine.

nicce

21 hours ago

There have actually been many cases in my experience where venv simply worked but uv failed to install dependencies. uv is really fast, but usually you only need to install dependencies once.

cdmckay

21 hours ago

Occasionally I have to build Python projects and coming from other languages and package managers, having to deal with a venv is super weird and annoying.

collinmanderson

18 hours ago

> I don't really get it: uv solves all these problems I've never encountered. Just making a venv and using it seems to work fine.

For me package installation is way, way faster with uv, and I appreciate not needing to activate the virtual environment.
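
E.g. instead of sourcing an activate script first, I can just do this (`script.py` standing in for whatever I'm working on):

    uv run python script.py
    uv run pytest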

bigstrat2003

21 hours ago

Yeah I've never remotely had problems with venv and pip.

athorax

21 hours ago

For me the biggest value of uv was replacing pyenv for managing multiple versions of python. So uv replaced pyenv+pyenv-virtualenv+pip
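
Roughly, the whole pyenv + pyenv-virtualenv + pip dance collapses into (versions are just examples):

    uv python install 3.12 3.13          # replaces pyenv
    uv venv --python 3.13                # replaces pyenv-virtualenv
    uv pip install -r requirements.txt   # replaces pip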

Hasz

21 hours ago

This is it. Later versions of Python (3.11/3.12/3.13) have significant improvements and differences. Being able to seamlessly test/switch between them is a big QOL improvement.
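
E.g., if I remember the flag right:

    uv run --python 3.12 python -m pytest
    uv run --python 3.13 python -m pytest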

I don't love that uv is basically tied to a for-profit company, Astral. I think such core tooling should be tied to the PSF, but that's a minor point. It's partly the issue I have with Conda too.

zahlman

21 hours ago

> Later versions of Python (3.11/3.12/3.13) have significant improvements and differences. Being able to seamlessly test/switch between them is a big QOL improvement.

I just... build from source and make virtual environments based off them as necessary. Although I don't really understand why you'd want to keep older patch versions around. (The Windows installers don't even accommodate that, IIRC.) And I can't say I've noticed any of those "significant improvements and differences" between patch versions ever mattering to my own projects.
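
For the curious, that's roughly this (the prefix is just where I happen to keep them):

    ./configure --prefix="$HOME/pythons/3.13" && make -j && make install
    "$HOME/pythons/3.13/bin/python3" -m venv .venv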

> I don't love that UV is basically tied to a for profit company, Astral. I think such core tooling should be tied to the PSF, but that's a minor point. It's partially the issue I have with Conda too.

In my book, the less under the PSF's control, the better. The meager funding they do receive is mostly directed towards making PyCon happen (the main one; others like PyCon Africa get a pittance), towards certain grants, and towards a short list of paid staff who are, generally speaking, board members and other decision-makers rather than the people actually developing Python. Even without considering "politics" (cf. the latest news turning down a grant for ideological reasons), I consider this gross mismanagement.

philipallstar

20 hours ago

> I think such core tooling should be tied to the PSF, but that's a minor point.

The PSF is busy with social issues and doesn't concern itself with trivia like this.

rkomorn

21 hours ago

Didn't Astral get created out of uv (and other tools), though? Isn't it fair for the creators to try and turn it into a sustainable job?

Edit: or was it ruff? Either way. I thought they created the tools first, then the company.

gegtik

21 hours ago

Yes. poetry & pyenv were already a big improvement, but now uv wraps everything up, and additionally makes "temporary environments" possible (e.g. `uv run --with notebook jupyter-notebook` to run a notebook with my project dependencies).
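
`uvx` (shorthand for `uv tool run`) does the same for one-off tools, e.g.:

    uvx ruff check .    # runs ruff in a throwaway environment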

Wonderful project

nilamo

20 hours ago

If that works for you, then that's cool. Personally, I don't want to think about environments, and it's weird that Python is the only language that has venvs. Having a tool that handles them completely transparently is ideal, to me.

projektfu

21 hours ago

One thing that annoys me about Claude is that it doesn't seem to create a venv by default when it creates a python project. (But who knows, maybe 1/3 of the time it does or something.) But you have to ask each time to be sure.

hkt

19 hours ago

Am I the only one who thought poetry was still the greatest available whizbang?

raduan

8 hours ago

It was great maybe 5 years ago. I just migrated all of my projects from poetry to uv and it's been a big productivity boost for everyone: myself, agents, my CI and CD.

canto

18 hours ago

ffs, stop installing stuff by piping random scripts from the internet to shell!!1one

ModernMech

19 hours ago

Honestly though it's a pretty rough indictment of Python that the best thing to happen in a decade is that people started writing Python tools in Rust. Not even a little Rust, uv is 98% Rust. I mean, they just released 3.14 and that was supposed to be a pretty big deal.

pansa2

17 hours ago

I sometimes wonder if many core Python people don’t actually like the language that much. That’s why (a) they’re constantly reinventing it, and (b) they celebrate rewrites from Python into other languages. Long before Rust, it was considered a good thing when a standard library module was rewritten in C.

Compare this to the Go community, who celebrate rewrites from other languages into Go. They rewrote their compiler in Go even though that made it worse (slower) than the original C version, because they enjoy using their own language and recognise the benefits of dogfooding.

hexo

7 hours ago

Only people who don't like Python celebrate this kind of change.

zahlman

18 hours ago

No, the "best thing that happened" (in TFA's author's opinion) is that this specific tool exists, with its particular design. Rust is an implementation detail. Most of the benefit that Uv offers over pip, in my analysis, is not a result of being written in Rust.

3.14 is a big deal.

ModernMech

12 hours ago

I don't think Rust is incidental here. First, uv's particular design is cargo-culted from... well, cargo. Which it should be, because cargo is a great tool, no shade there.

But otherwise, people on this forum and elsewhere are praising uv for: speed, single-file executable, stability, and platform compatibility. That's just a summary of the top reasons to write in Rust!

I agree 3.14 is a big deal as far as Python goes, but it doesn't really move the needle for the language toward being able to author apps like uv.

wiseowise

18 hours ago

Who cares what it is written in?

sunshowers

17 hours ago

Rust's rigorous separation of immutable and mutable state consistently leads to higher-quality software that stands the test of time.

t43562

8 hours ago

C has stood a very long test of time, and we don't ascribe virtue to it for that.

ModernMech

18 hours ago

It's called dogfooding -- writing tools for the language in the language. Not doing so here, where the result is "best thing to happen to the ecosystem in a decade", is a tacit admission that Python isn't up for the task of writing best-in-class Python tooling (the use of Rust wasn't incidental). Having seen uv, people will probably start writing more Python-ecosystem projects in Rust.

Which is fine, Python is not for everything.

forrestthewoods

18 hours ago

uv is spectacular

But I'm utterly shocked that uv doesn't support "system dependencies". It's not a whole conda replacement, which is a shame, because I bloody hate Conda.

Dependencies like CUDA and random C++ libraries really, really ought to be handled by uv. I want a true, genuine one-stop shop for running Python programs. uv is like 80% of the way there, but the last 20% is still painful.

Ideally uv would obsolete the need for Docker. Docker shouldn't be a requirement to reliably run a program.

dirkc

5 hours ago

Does uv use a sandbox or do process isolation?

I've switched to running any and all Python projects in Docker as a way to ensure that low-effort supply-chain attacks don't easily get everything in my home dir. So even if I use uv, I'd only do that in a Docker image for now.
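
Roughly like this, assuming I have the official image name right (ghcr.io/astral-sh/uv, whose entrypoint is uv itself):

    docker run --rm -v "$PWD":/work -w /work ghcr.io/astral-sh/uv sync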

zoobab

3 hours ago

Ideally `uv pip install mypackage` should work 100% of the time, but that's far from the case.

Animats

21 hours ago

Another Python package manager? How many are there now?

zahlman

21 hours ago

> Another

No, the same uv that people have been regularly (https://hn.algolia.com/?q=uv) posting about on HN since its first public releases in February of 2024 (see e.g. https://news.ycombinator.com/item?id=39387641).

> How many are there now?

Why is this a problem? The ecosystem has developed usable interoperable standards (for example, fundamentally uv manages isolated environments by using the same kind of virtual environment created by the standard library — because that's the only kind that Python cares about; the key component is the `pyvenv.cfg` file, and Python is hard-coded to look for and use that); and you don't have to learn or use more than one.
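
A typical one looks something like this (exact keys vary by Python version and by the tool that created the venv):

    $ cat .venv/pyvenv.cfg
    home = /usr/bin
    include-system-site-packages = false
    version = 3.12.3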

There are competing options because people have different ideas about what a "package manager" should or shouldn't be responsible for, and about the expectations for those tasks.

andy99

20 hours ago

It’s definitely an issue for learning the language. Obviously after working with python a bit that doesn’t matter, but fragmentation still makes it more of a hassle to get open source projects up and running if they don’t use something close to your usual package management approach.

hirako2000

20 hours ago

A problem remains: many (and still more) of the popular repositories don't use uv to manage their dependencies.

So you are back to having to use conda and the rest. Now you have yet another package manager to handle.

I wouldn't be harsh on the engineers at Astral, who developed amazing tooling, but the issue with the Python ecosystem isn't a lack of tooling; it's the proliferation and fragmentation. To solve dependency management fully would be to incorporate other package descriptors, or convert them.

Rsbuild, another Rust tool, did just that for the Node ecosystem, for building and bundling. They came up with rspack, which has broad compatibility with the webpack config.

You find a webpack repo? Just add rsbuild and rspack, and you are pretty much ready to go, without the slow, Node-native webpack.

raduan

8 hours ago

I've started forking some of the less popular ones and migrating them with AI to the latest Python tooling + uv.

It's been a joy to own some of those dependencies that haven't been maintained much.

Mostly just using codex web/claude code web and it's doing wonders.

zahlman

18 hours ago

When packages require conda, that has nothing to do with them "not using uv to manage their dependencies".

Conda solves a completely orthogonal set of problems, and is increasingly unnecessary. You can `pip install scipy` for example, and have been able to for a while.

oblio

20 hours ago

Don't they publish to PyPI? What do you care what they use behind the scenes?

hirako2000

19 hours ago

It isn't what they use behind the scenes.

I referred to the interfaces of other packaging tools. I use uv and it's excellent on its own.

You get a repo, it's using playwright, what do you do now? You install all the dependencies found in its dependency descriptor, then sync to create a uv descriptor, or you compose a descriptor that uv understands.

It's repetitive and rather systematic, so it could be automated (see the sketch below). I should volunteer for a PR, but my point is that introducing yet another tool to an ecosystem suffering a proliferation of build and dependency-management tooling expands the issue. It would have been helpful from the get-go to support existing, prolific formats.
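
Something in this direction already mostly works for the requirements.txt case (a sketch, and it only covers one of the many formats):

    uv init                       # create a pyproject.toml if missing
    uv add -r requirements.txt    # import the existing pins as dependencies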

pnpm understands package.json. It didn't reinvent the wheel, because we have millions of wheels out there. It created its own pnpm lock file, but that's a file users aren't meant to touch, so the transition from npm to pnpm is seamless. Almost the same when migrating from webpack to rsbuild.

oblio

8 hours ago

Ah, you mean if you take over maintenance for a project that uses a different tool? Yes, fragmentation hurts, but adopting good tools is better for everyone in the long run.

shevy-java

3 hours ago

I am a bit skeptical.

You only have to look at the Ruby ecosystem and the recent mass-expulsion of long-term developers from rubygems/bundler via Ruby Central going full corporate mode ("we needs us some more moneeeeeys now ... all for the community!!!" - or something). While one COULD find pros in everything, is what is happening in these different language ecosystems really better for both users and developers? I am not quite convinced.

I am not saying the prior status quo was perfect. What I am saying is: I am not quite convinced that the proposed benefits are real. In fact, I find managing multiple versions actually annoying.

I should say that I already handle that mostly the GoboLinux way (Name/Version/ going into a central directory; this is also similar to what homebrew does, and to some extent nixos, except that nixos stores things via a unique hash, which is less elegant). For instance, on GoboLinux I would have /Programs/Ruby/3.3.0/ - that's about as simple as can possibly be. I really don't want a tool I don't understand to inject itself here and add more complications. My system is already quite simple, and I don't really need anything it describes to me as "you need this".

I also track and handle dependencies on my own. This is more work initially, but past that point I just type "ue" on the commandline to update to the latest version, where ue is simply an alias to a ruby class called UpdateEntry, which updates an entry in a .yml file, which is then used to populate a SQL database and also downloads, repackages, and optionally compiles/installs the given package. E.g. "ue mesa" would just update the mesa .tar.xz locally. I usually don't compile automatically, though, so I just use "ue" to update or change a program version; it also accepts a URL, of course, so users can override this behaviour as they see fit.

skywhopper

2 hours ago

So you’ve learned one tool to handle this sort of complexity, but you are complaining that someone created a new one?