hamishwhc
a day ago
The author’s point about “not caring about pip vs poetry vs uv” is missing that uv directly supports this use case, including PyPI dependencies, and all you need is uv and your preferred Python version installed: https://docs.astral.sh/uv/guides/scripts/#using-a-shebang-to...
meander_water
a day ago
Actually you can go one better:
#!/usr/bin/env -S uv run --python 3.14 --script
Then you don't even need python installed. uv will install the version of python you specified and run the command.
rikafurude21
a day ago
alternatively, uv lets you do this:
#!/usr/bin/env -S uv run --script
#
# /// script
# requires-python = ">=3.12"
# dependencies = ["foo"]
# ///
semi-extrinsic
20 hours ago
The /// script block is actually specified in PEP 723 and supported by several other tools apart from uv.
nemosaltat
15 hours ago
The last time I commented extolling the virtues of uv on here, I got a similar reply, pointing out that PEP 723 specs this behavior, and uv isn’t the only way. So I’ll try again in this thread: I’m bullish on uv, and waiting for Cunningham.
semi-extrinsic
13 hours ago
I am all in on uv as well, and advocating for its use heavily at $dayjob. But I think having as much as possible of these things encoded in standards is good for the ecosystem. Maybe in a few years time, someone will make something even better than uv. And in the meantime, having things standardised speeds up adoption in e.g. syntax highlighting in editors and such.
yjftsjthsd-h
16 hours ago
That's good to hear; do you know what other tools support it?
semi-extrinsic
16 hours ago
From what I can tell, Hatch, PDM, pipx and pip-run also support it.
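For example, the same script can be run through several of them (rough sketch; exact subcommands and minimum versions vary, so check each tool's docs):
uv run ./script.py
pipx run ./script.py
hatch run ./script.py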
rjzzleep
6 hours ago
I'm actually a bit annoyed that uv won. I found PDM to be a really nice solution that fixed most of the problems Poetry had, without the hard ideological stance behind it. But maybe that ideology was partly what drove its adoption.
nemosaltat
15 hours ago
I’ve started migrating all of my ~15 years of one-off python scripts to have this front matter. Right now, I just update when/if I use them. I keep thinking that if I were handier with grep/sed/regex etc., I’d try to programmatically update .pys system-wide. But many aren’t git tracked/version controlled, just lying in whatever dir they service(d). I’ve several times started a “python script dashboard” or “hacky tools coordinator” but stop when I remember most of these are unrelated (to each other) and un/rarely used. I keep watching the chatter and thinking this is probably an easy task for codex or some other agent, but these pys are “mine” (and I knew^ how they worked when I wrote^ them), and also, they’re scattered and there’s no way I’m turning an agent loose on my file system.
^mostly, some defs might have StackOverflow copy/pasta
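That said, if I've got the flags right, uv itself can bolt the front matter onto an existing script, which is most of the batch job (the file name here is just an example):
uv add --script ./old_tool.py requests
uv run ./old_tool.py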
giancarlostoro
14 hours ago
You could run ripgrep on your file system root to find most of them - it's insanely fast - then feed the results to Claude or something to generate a script to do it for you.
ncouture
17 hours ago
This is an awesome feature for quick development.
I'm sure the documentation of this featureset highlights what I'm about to say, but if you're attracted to the simplicity of writing Python projects initialized this way, do not use this code in staging/prod.
If you don't see why this is not production friendly, the simple reason is that when you create deployable artifacts packaging a project (or a dependency of a project) that uses this method, reproducible builds become impossible.
This will also lead to builds that pass your CI but fail to run in their destination environment, and vice versa, because they download their dependencies on the fly.
There may be workarounds - I don't know this feature well, so investigate for yourself if you must.
My two cents.
zahlman
20 hours ago
This isn't really "alternatively"; it's pointing out that in addition to the shebang you can add a PEP 723 dependency specification that `uv run` (like pipx, and some other tools) can take into account.
dietr1ch
20 hours ago
Yeah, but you need `uv`. If we are reaching out for tools that might not be around, then you can also depend on nix-shell,
#! /usr/bin/env nix-shell
#! nix-shell -i python3 --packages python3
mystifyingpoi
20 hours ago
Yeah, but you need Nix. If we are reaching out for tools that might not be around, then you can also depend on `curl | sudo bash` to install Nix when not present.
(this is a joke btw)
jonhohle
19 hours ago
Yeah, but you need curl, sudo, and bash…
lioeters
17 hours ago
"Give me a 190-byte hex0 seed of x86 assembly, and I shall compile the rest of the world." - Archimedes
JodieBenitez
14 hours ago
... you must first invent the universe
kokada
15 hours ago
The issue I have with `nix-shell` is that the evaluation time is long, so if you need to run the script repeatedly it may take a long time. `nix shell` at least fixes this issue by caching evaluations, but I think uv is still faster.
m3at
11 hours ago
As shared in a sibling comment, you can get away with just curl+shell: https://paulw.tokyo/standalone-python-script-with-uv/
Cyph0n
17 hours ago
This comes with the added benefit that your environment is reverted as soon as you exit the Nix shell.
jonhohle
19 hours ago
That shebang will work on GNU/Linux-based systems, but might not work elsewhere. I know that’s the most popular target, but it may not work on macOS, BSDs, or even busybox.
rented_mule
18 hours ago
I just tried the one you are replying to and it worked great on macOS. I frequently use a variant of this on my Mac.
jonhohle
10 hours ago
That’s interesting. I wonder when that changed. Maybe FreeBSD supports multi arg shebangs now, too
shakna
8 hours ago
The -S argument to env splits the argument on whitespace.
The shell doesn't support anything, it just passes the string to env.
So beware quoting and other delimiters that won't work the way you expect.
m3at
11 hours ago
And with some small shebang trick, you don't even need to have uv installed [1], just curl and a posix shell
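Roughly (from memory, so details may differ from the linked post), it's an sh/Python polyglot header: the file starts out as a shell script that installs uv if needed, then re-execs itself through `uv run`:
#!/bin/sh
''':'
command -v uv >/dev/null 2>&1 || curl -LsSf https://astral.sh/uv/install.sh | sh
export PATH="$HOME/.local/bin:$PATH"  # installer default location, adjust to taste
exec uv run --script "$0" "$@"
'''
# /// script
# requires-python = ">=3.12"
# dependencies = ["requests"]
# ///
import requests
print(requests.get("https://example.com").status_code)
Python treats the shell lines as a harmless string literal, and the shell never reaches the closing quotes because of the exec.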
Supermancho
18 hours ago
> Then you don't even need python installed. uv will install the version of python you specified and run the command
What you meant was, "you don't need python pre-installed". This does not solve the problem of not wanting to have (or being prevented from having) python installed.
benrutter
a day ago
I thought that too, but I think the tricky bit is if you're a non-python user, this isn't yet obvious.
If you've never used Clojure and start a Clojure project, you will almost definitely find advice telling you to use Leiningen.
For Python, if you search online you might find someone saying to use uv, but also potentially venv, poetry or hatch. I definitely think uv is taking over, but it's not yet ubiquitous.
Ironically, I actually had a similar thing installing Go the other day. I'd never used Go before, and installed it using apt only to find that version was too old and I'd done it wrong.
Although in that case, it was a much quicker resolution than I think anyone fighting with virtual environments would have.
idoubtit
a day ago
That's my experience. I'm not a Python developer, and installing Python programs has been a mess for decades, so I'd rather stay away from the language than try another new tool.
Over the years, I've used setup.py, pip, pipenv (which kept crashing though it was an official recommendation), manual venv+pip (or virtualenv? I vaguely remember there were 2 similar tools and neither was part of a minimal Python install). Does uv work in all of these cases? The uv doc pointed out by the GP is vague about legacy projects, though I've just skimmed through the long page.
IIRC, Python tools didn't share their data across projects, so they could build the same heavy dependencies multiple times. I've also seen projects with incomplete dependencies (installed through Conda, IIRC) which were a major pain to get working. For many years, the only simple and sane way to run some Python code was in a Docker image, which has its own drawbacks.
lexicality
a day ago
> Does uv work in all of these cases?
Yes. The goal of uv is to defuck the python ecosystem and they're doing a very good job at it so far.
prox
20 hours ago
What are the big offenders right now? What does uv unfuck?
I only work a little bit with python.
lexicality
14 hours ago
In my experience every other python tool has a variety of slightly to extremely painful behaviours that you have to work around or at least be aware of.
Sometimes it's things like updating to Fedora 43 and every tool you installed with `pipx` breaking because it was doing things that got wiped out by the system upgrade, sometimes it's `poetry update --only dep1` silently updating dep2 in the background without telling you because there was an update available and even though you specified `--only` you were wrong to do that and Poetry knows best.
Did you know that when you call `python -m venv` you should always pass `--upgrade-deps` because otherwise it intentionally installs an out of date version of pip and setuptools as a joke? Maybe you're not using `python -m venv` because you ran the pyenv installer and it automatically installed `pyenv-virtualenv` without asking, which overrides a bunch of virtualenv features because the pyenv team think you should develop things in the same way they do regardless of how you want to develop things. I hate pyenv.
So far the only problem I've had with uv is that if you run `uv venv` it doesn't install pip in the created virtualenv because you're supposed to run `uv pip install` instead of `pip install`. That's annoying but it's not a dealbreaker.
Outside of that, I feel very confident that I could give a link to the uv docs to a junior developer and tell them to run `uv python install 3.13` and `uv tool install ruff` and then run `uv sync` in a project and everything will work out and I'm not going to have to help them recover their hard drive because they made the foolish mistake of assuming that `brew install python` wouldn't wreck their macbook when the next version of Python gets released.
lucideer
12 hours ago
uv not only completely replaces all of pip, pyenv & venv, but it also does a much better job than any of them at their intended function, as well as a bunch of other convenient, simple developer-friendly features.
1. pip isn't entirely to blame for all of Python's bad package management - distutils & setuptools gave us setup.py shenanigans - but either way, uv does away with that in favour of a modern, consistent, declarative, parseable pyproject.toml manifest (PEP 621, with PEP 508 dependency specifiers), along with its own well-designed lockfile (there was no accepted lockfile PEP at the time uv was created - since PEP 751 was accepted, uv has added support, though that PEP is still limited so there's more work to do here).
2. pyenv works fine but uv is faster & adds some nice extra features with uvx
3. venv has always been a pain - ensuring you're always in the right venv, shell support, etc. uv handles this invisibly & automatically - because it's one tool you don't need to worry about running pip in the right venv or whatever.
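As a rough sketch of what "one tool" looks like in practice (commands from memory, double-check against the docs):
uv python install 3.13              # interpreter management, no pyenv
uv init myproj && cd myproj         # creates pyproject.toml
uv add requests                     # resolve + lock + install into a managed .venv
uv run python -c "import requests"  # no manual venv activation
uvx ruff check .                    # run a tool in an ephemeral environment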
kevin_thibedeau
19 hours ago
pip and venv. The Python ecosystem has taken a huge step backwards with the preachy attitude that you have to do everything in a venv. Not when I want to have installable utility scripts usable from all my shells at any time or location.
I get that installing to the site-packages is a security vulnerability. Installing to my home directory is not, so why can't that be the happy path by default? Debian used to make this easy with the dist-packages split leaving site-packages as a safe sandbox but they caved.
kstrauser
16 hours ago
Regarding why not your home directory: which version of Foo do you install, the one that Project A needs or the incompatible one that Project B needs?
The brilliant part about venvs is that A and B can have their completely separate mutually incompatible environments.
kevin_thibedeau
14 hours ago
They have their place. But the default shouldn't force you into a "project" when you want general purpose applicability. Python should work from the shell as readily as it did 20 years ago. Not mysteriously break what used to work with no low-friction replacement.
gorgolo
2 hours ago
Python can work from the shell, if you don’t have external dependencies. But once you have external dependencies, with potentially incompatible versions, I just don’t see how you could do this with “one environment”.
andoando
15 hours ago
Why can't we just have something like npm/gradle/maven dependencies? What makes python any different?
lexicality
14 hours ago
A python virtualenv is just a slightly more complicated node_modules. Tools like PDM, Poetry and uv handle them automatically for you to the point where it effectively is the same as npm.
The thing that makes Python different is that it was never designed with any kind of per-project isolation in mind and this is the best way anyone's come up with to hack that behaviour into the language.
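Concretely, a venv is just a directory along these lines (layout varies a bit by platform and Python version):
.venv/
  pyvenv.cfg                      # points back at the base interpreter
  bin/                            # Scripts\ on Windows: python, activate, entry points
  lib/python3.12/site-packages/   # the per-project packages, i.e. the node_modules part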
halostatue
16 hours ago
For years, pipx did almost all the work that I needed it to do for safely running utility scripts.
uv has replaced that for me, and has replaced most other tools that I used with the (tiny amount of) Python that I write for production.
hexo
11 hours ago
It unfucks nothing because it wasn't fucked in the first place. The whole of uv is a solution to a non-existent problem.
aeurielesn
21 hours ago
That's giving way too much credit to uv.
kraddypatties
18 hours ago
I'm interpreting this as "uv was built off of years of PEPs", which is true; that being said the UX of `uv` is their own, and to me has significantly reduced the amount of time I spend thinking about requirements, modules, etc.
karel-3d
20 hours ago
uv is really that good.
hexo
11 hours ago
If so, ok, let's port this prototype back to python and get rid of uv.
llbeansandrice
6 hours ago
What does this comment mean? Port the dependency and virtual environment manager back to the language?
Should we port npm “back” to node js?
karel-3d
3 hours ago
Well, go does have module management, including downloading new versions of itself, built into the `go` tool itself. It is really great.
But I don't see this happening in python.
hexo
2 hours ago
You don't see that happening because you don't want to.
hexo
2 hours ago
npm is written in JavaScript, not Rust or C#.
Yes, we should bring the package manager back into the language, if it is so awesome and solves a real problem.
lexicality
14 hours ago
They've definitely not done it yet, but they're getting there.
NeutralCrane
19 hours ago
It really isn't.
simonw
a day ago
> IIRC, Python tools didn't share their data across projects, so they could build the same heavy dependencies multiple times.
One of the neatest features of uv is that it uses clever symlinking tricks so if you have a dozen different Python environments all with the same dependency there's only one copy of that dependency on disk.
zahlman
19 hours ago
Hard links, in fact. It's not hard to do - just (the Rust equivalent of) `os.link` in place of `shutil.copy`, pretty much. The genuinely clever part is that the package cache actually contains files that can be used this way, instead of just holding wheels and unpacking them from scratch each time.
For pip to do this, first it would have to organize its cache in a sensible manner, such that it could work as an actual download cache. Currently it is an HTTP cache (except for locally-built wheels), where it uses a vendored third-party library to simulate the connection to files.pythonhosted.org (in the common PyPI case). But it still needs to connect to pypi.org to figure out the URI that the third-party library will simulate accessing.
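A toy illustration of why the hard-link trick makes the dedup nearly free (not uv's actual code, just the idea):
import os, tempfile

cache = tempfile.mkdtemp()   # stand-in for the global package cache
venv_a = tempfile.mkdtemp()  # stand-ins for two environments
venv_b = tempfile.mkdtemp()

src = os.path.join(cache, "big_module.py")
with open(src, "w") as f:
    f.write("DATA = 'x' * 10_000_000\n")

# link instead of copy: both "venvs" see the file, but it exists once on disk
os.link(src, os.path.join(venv_a, "big_module.py"))
os.link(src, os.path.join(venv_b, "big_module.py"))

print(os.stat(src).st_nlink)  # 3 names, one set of blocks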
runjake
20 hours ago
I would not be putting up with Python if not for uv. It’s that good.
Before uv came along I was starting to write stuff in Go that I’d normally write in Python.
QuercusMax
18 hours ago
Coming from a mostly-Java background (since around 2001), I'd been away from Python for a while; my two most recent work projects have been in Python, and both switched to uv around the time I joined. Such a huge difference in time and pain - I'm with you here.
Python's always been a pretty nice language to work in, and uv makes it one of the most pleasant to deal with.
runjake
15 hours ago
I don't even like Python as a language (it's growing on me, but only a little).
It's just so useful: uv is great and there are decent quality packages for everything imaginable.
kristianbrigman
12 hours ago
That's partly because python has a very large installed base, and ease of entry (including distribution). This leads to people running into issues quicker, and many alternative solutions.
Unlike something like Rust, which has far fewer users (though growing) and requires PhDs in Compiler Imprecation and Lexical Exegetics.
Or C++ which has a much larger installed base but also no standard distribution method at all, and an honorary degree in Dorsal Artillery.
whimsicalism
18 hours ago
uv solved it, it’s safe to come back now.
regularfry
20 hours ago
There's definitely a philosophical shift that you can observe happening over the last 12-15 years or so, where at the start you have the interpreter as the centre of the world and at the end there's an ecosystem management tool that you use to give yourself an interpreter (and virtual environments, and so on) per project.
I think this properly kicked off with RVM, which needed to come into existence because you had this situation where the Ruby interpreter was going through incompatible changes, the versions on popular distributions were lagging, and Rails, the main reason people were turning to Ruby, was relatively militant about which interpreter versions it would support. Also, building the interpreter such that it would successfully run Rails wasn't trivial. Not that hard, but enough that a convenience wrapper mattered. So you had a whole generation of web devs coming up in an environment where the core language wasn't the first touchpoint, and there wasn't an assumption that you could (or should) rely on what you could apt-get install on the base OS.
This is broadly an extremely good thing.
But the critical thing that RVM did was that it broke the circular dependency at the core of the problem: it didn't itself depend on having a working ruby interpreter. Prior to that you could observe a sort of sniffiness about tools for a language which weren't implemented in that language, but RVM solved enough of the pain that it barged straight past that.
Then you had similar tools popping up in other languages - nvm and leiningen are the first that spring to mind, but I'd also throw (for instance) asdf into the mix here - where the executable that you call to set up your environment has a '#!/bin/bash' shebang line.
Go has sidestepped most of this because of three things: 1) rigorous backwards compatibility; 2) the simplest possible installation onramp; 3) being timed with the above timeline so that having a pre-existing `go` binary provided by your OS is unlikely unless you install it yourself. And none of those are true of Python. The backwards compatibility breaks in this period are legendary, you almost always do have a pre-existing Python to confuse things, and installing a new python without breaking that pre-existing Python, which your OS itself depends on, is a risk. Add to that the sniffiness I mentioned (which you can still see today on `uv` threads) and you've got a situation where Python is catching up to what other languages managed a decade ago.
Again.
bee_rider
18 hours ago
It is sort of funny, if we squint just the wrong way, “ecosystem management tool first, then think about interpreters” starts to look a lot like… a package manager, haha.
houzi
a day ago
Do you think a non-python user would piece it together if the shebang line reveals what tool to use?
MarsIronPI
18 hours ago
> If you've never used Clojure and start a Clojure project, you will almost definitely find advice telling you to use Leiningen.
I thought the current best practice for Clojure was to use the shiny new built-in tooling? deps.edn or something like that?
fulafel
15 hours ago
Clojure CLI (aka deps.edn) came out in 2018 and in the survey "how do you manage your dependencies?" question crossed 50% usage in early 2020. So for 6-8 years now.
codemonkey-zeta
16 hours ago
deps.edn is becoming the default choice, yes. I interpreted the parent comment as saying "you will see advice to use leiningen (even though newer solutions exist, simply because it _was_ the default choice when the articles were written)"
zahlman
20 hours ago
> you might find someone saying to use uv, but also potentially venv, poetry or hatch.
This is sort of like saying "You might find someone saying to drive a Ford, but also potentially internal combustion engine, Nissan or Hyundai".
evilduck
20 hours ago
Only to those already steeped in Python. To an outsider they're all equally arbitrary non-descriptive words and there's not even obvious proper noun capitalization to tell apart a component from a tool brand.
zahlman
19 hours ago
It's always rather irritating to me that people make these complaints without trying to understand any of the under-the-hood stuff, because the ultimate conclusion is that it's somehow a bad thing that, on a FOSS project, multiple people tried to solve a problem concurrently.
NetMageSCW
17 hours ago
That’s especially ironic given that part of the philosophy inside Python is “There should be one-- and preferably only one --obvious way to do it.” So why does Python’s external environment seem more like something that escaped from a Perl zoo?
hexo
10 hours ago
Because a lot of people have no clue about packaging or how to write compatible software - software that is actually installable as a normal application. I suspect a lot of them learned stuff in the node.js or Ruby ecosystem first and this is the result. Same as requiring Docker to install or build an application. It isn't cool, funny or the right way to do stuff. I still don't get what was so wrong about venv that anyone needed uv. I have no need to even try it, and I've been writing Python stuff for so long that I cannot even estimate how long. To me it feels like reinvention for the sake of a rewrite in Rust. If it is so good - OK, I get it, it might be - then all that good stuff needs to go back to Python, as Python.
kstrauser
16 hours ago
The one obvious way is the underlying virtualenv abstraction. Everything else just makes that part easier or more transparent.
zahlman
15 hours ago
What kstrauser said.
But with much more detail: it seems complicated because
* People refuse to learn basic concepts that are readily explained by many sources; e.g. https://chriswarrick.com/blog/2018/09/04/python-virtual-envi... [0].
* People cling to memories of long-obsolete issues. When people point to XKCD 1987 they overlook that Python 2.x has been EOL for almost six years (and 3.6 for over four, but whatever)[1]; only Mac users have to worry about "homebrew" (which I understand was directly interfering with stuff back in the day) or "framework builds" of Python; easy_install is similarly a long-deprecated dinosaur that you also would never need once you have pip set up; and fewer and fewer people actually need Anaconda for anything[2][3].
* There is never just one way to do it, depending on your understanding of "do". Everyone will always imagine that the underlying functionality can be wrapped in a more user-friendly way, and they will have multiple incompatible ideas about what is the most user-friendly.
But there is one obvious "way to do it", which is to set up the virtual environment and then launch the virtual environment's Python executable. Literally everything else is window dressing on top of that. The only thing that "activating" the environment does is configure environment variables so that `python` means the virtual environment's Python executable. All your various alternative tools are just presenting different ways to ensure that you run the correct Python (under the assumption that you don't want to remember a path to it, I guess) and to bundle up the virtual environment creation with some other development task.
The Python community did explicitly provide for multiple people to provide such wrappers. This was not by providing the "15th competing standard". It was by providing the standard (really a set of standards designed to work together: the virtual environment support in the standard library, the PEPs describing `pyproject.toml`, and so on), which replaced a Wild West (where Setuptools was the sheriff and pip its deputy).
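Spelled out (POSIX paths; on Windows it's .venv\Scripts\python.exe), the whole thing is:
python -m venv .venv
.venv/bin/python -m pip install requests
.venv/bin/python my_script.py
Everything else - activate scripts, wrappers, uv run - is convenience layered on exactly this.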
[0]: By the way, this is by someone who doesn't like virtual environments and was one of the biggest backers of PEP 582.
[1]: Of course, this is not Randall Munroe's fault. The comic dates to 2018, right in the middle of the period where the community was trying to sort things out and figure out how to not require the often problematic `setup.py` configuration for every project including pure-Python ones.
[2]: The SciPy stack has been installable from wheels for almost everyone for quite some time and they were even able to get 3.12 wheels out promptly despite being hamstrung by the standard library `distutils` removal.
[3]: Those who do need it, meanwhile, can generally live within that environment entirely.
fwip
17 hours ago
I imagine by this they meant `python -m venv` specifically, using that interface directly, rather than through another wrapper CLI tool.
zahlman
15 hours ago
Fair.
The way I teach, I would start there; then you always have it as a fallback, and understand the system better.
I generally sort users into aspirants who really should learn those things (and will benefit from it), vs. complete end users who just want the code to run (for whom the developer should be expected to provide, if they expect to gain such a following).
NeutralCrane
19 hours ago
uv has been around for less than two years. It’s on track to become the default choice, it’s just a matter of time.
the__alchemist
18 hours ago
I solved this in 2019 with PyFlow, but no one used it, so I lost interest. It's an OSS tool written in Rust that automatically and transparently manages Python versions and venvs. You just set up a `pyproject.toml`, run `pyflow main.py` etc., and it just works. It installs and locks dependencies like Cargo, and installs and runs the correct Python version for the project.
At the time, Poetry and Pipenv were the popular tools, but I found they were not sufficient; they did a good job abstracting dependencies, but not venvs and Python version.
greensh
16 hours ago
Sounds awesome. Just out of interest, why do you think PyFlow didn't catch on, but uv did?
the__alchemist
16 hours ago
My best guess: I'm bad at marketing, and gave up too soon. The feedback I received was generally "Why would I use this when pip, Pipenv and Poetry work fine?". To me they didn't; they were a hassle due to not handling venvs and Python versions, but I didn't find many people who had run into the same problem.
the_mitsuhiko
16 hours ago
Polish, and the fact that uv gets you entire Python interpreters automatically without having to compile or manually install them.
In retrospect, that was what made rye temporarily attractive and popular.
embedding-shape
21 hours ago
I've moved over mostly to uv too, using `uv pip` when needed but mostly sticking with `uv add`. But as soon as you start using `uv pip` you inherit all of pip's drawbacks, namely that whatever you install later can affect earlier dependency resolutions too. Running `uv pip install dep-a` and then `... dep-b` isn't the same as `... dep-b` first and then `... dep-a`, or the same as `uv pip install dep-a dep-b` - which, coming from an ecosystem that does proper dependency resolution and has workspaces, can be really confusing.
This is more of a pip issue than a uv one, though, and `uv pip` is still preferable in my mind, but it seems Python package management will forever be a mess; not even the bandaid uv can fix things like these.
sieep
20 hours ago
I've been away from Python for a while now; I was under the impression uv was somehow solving this dependency hell. What's the benefit of using uv and pip together? Speed?
embedding-shape
19 hours ago
As far as I can tell, `pip` by itself still doesn't even do something as basic as resolving the dependency tree first and then downloading all the packages in parallel. The `uv pip` shim does.
And regardless of whether you use only uv, or pip-via-uv, or straight-up pip, dependencies you install later step over dependencies you installed earlier, and no tool so far seems to try to solve this, which leads me to conclude it's a Python problem, not a package manager problem.
chuckadams
20 hours ago
`uv pip` is still uv, it's just uv's compatibility layer for pip.
micik
15 hours ago
I found uv frustrating. I don't know what problem it is trying to solve. It's not a tool for managing virtualenvs, but it does them as well. I guess it's a tool for dependency management. The "uv tool" stuff is kinda weird. I gave it an honest try, but I was working around it with shell functions all the time.
In the end I went back to good old virtualenvwrapper.sh and setting PYTHONPATH. Full control over what goes into the venv and how. I guess people like writing new tools; I can understand that.
embedding-shape
15 hours ago
Maybe I "entered" the Python ecosystem at a different time, but I never used virtualenvwrapper.sh nor sat PYTHONPATH manually ever. When I first came into contact with Python, I think doing `virtuelenv venv && source venv/bin/activate` was what was recommended to me at the time. Eventually I used `python -m venv` but always also with `pip` and a `requirements.txt`. I pretty much stuck with that until maybe 1 year ago I started playing around with `uv`, and for me, I just use `uv venv|pip|init|add` from uv, and nothing else from any other tools, and generally do pretty basic stuff.
Maybe for more complex projects and use cases it's harder, but it's a lot faster than just pip and pyproject.toml is a lot nicer to manage than `requirements.txt`, so that's two easy enough wins for me to move over.
otterley
10 hours ago
uv solves many problems, but one textbook case is the problem of running some arbitrary Python-based command-line tool, where 1/you don’t have any Python interpreter installed, 2/your OS-provided Python interpreter isn’t compatible with the tool, or 3/you want to run single or multiple tools from any arbitrary folder where your data already is, as opposed to adapting your workflow to fit the virtualenv or running the risk that two tools have conflicting dependencies that would make the virtualenv not work well.
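For the tool-running case that looks something like this (pycowsay being the stock demo package):
uvx pycowsay "hello"                # ephemeral, cached environment; no venv to babysit
uvx --python 3.12 pycowsay "hello"  # pin the interpreter if the tool is picky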
Narushia
an hour ago
Isn't what you're describing solved by `uv tool install`?
JodieBenitez
19 hours ago
you don't even need your preferred python version installed, uv will download it.
zahlman
20 hours ago
There are really so many things about this point that I don't get.
First off, in my mind the kinds of things that are "scripts" don't have dependencies outside the standard library, or if they do are highly specific to my own needs on my own system. (It's also notable that one of the advantages the author cites for Go in this niche is a standard library that avoids the need for dependencies in quick scripts! Is this not one of Python's major selling points since day 1?)
Second, even if you have dependencies you don't have to learn differences between these tools. You can pick one and use it.
Third, virtual environments are literally just a place on disk for those dependencies to be installed, that contains a config file and some stubs that are automatically set up by a one-liner provided by the standard library. You don't need to go into them and inspect anything if you don't want to. You don't need to use the activation script; you can just specify the venv's executable instead if you prefer. None of it is conceptually difficult.
Fourth, sharing an environment for these quick scripts actually just works fine an awful lot of the time. I got away with it for years before proper organization became second nature, and I would usually still be fine with it (except that having an isolated environment for the current project is the easiest way to be sure that I've correctly listed its dependencies). In my experience it's just not a thing for your quick throwaway scripts to be dependent on incompatible Numpy versions or whatever.
... And really, to avoid ever having to think about the dependencies you provide dynamically, you're going to switch to a compiled language? If it were such a good idea, nobody would have thought of making languages like Python in the first place.
And uh...
> As long as the receiving end has the latest version of go, the script will run on any OS for tens of years in the future. Anyone who's ever tried to get python working on different systems knows what a steep annoying curve it is.
The pseudo-shebang trick here isn't going to work on Windows any more than a conventional one is. And no, when I switched from Windows to Linux, getting my Python stuff to work was not a "steep annoying curve" at all. It came more or less automatically with acclimating to Linux in general.
(I guess referring to ".pyproject" instead of the actually-meaningful `pyproject.toml` is just part of the trolling.)
kstrauser
19 hours ago
> Third, virtual environments are literally just a place on disk for those dependencies
I had a recent conversation with a colleague. I said how nice it is using uv now. They said they were glad, because they'd hated messing with virtualenvs so much that they preferred TypeScript now. I asked them what node_modules is, they paused for a moment, and replied “point taken”.
Uv still uses venvs because it’s the official way Python stores all the project packages in one place. Node/npm, Go/go, and Rust/cargo all do similar things, but I only really hear people grousing about Python’s version, which as you say, you can totally ignore and never ever look at.
zahlman
19 hours ago
From my experience, it seems like a lot of the grousing is from people who don't like the "activation script" workflow and mistakenly think it's mandatory. Though I've also seen aesthetic objections to the environment actually having internal structure rather than just being another `site-packages` folder (okay; and what are the rules for telling Python to use it?)
The very long discussion (https://discuss.python.org/t/pep-582-python-local-packages-d...) of PEP 582 (https://peps.python.org/pep-0582/ ; the "__pypackages__" folder proposal) seems relevant here.
kstrauser
19 hours ago
I've heard those objections, too. I do get that specific complaint: it's another step you have to do. That said, things like direnv and mise make that disappear. I personally like the activation workflow and how explicit it is, as you're activating that specific venv, or maybe one in a different location if you want to use that instead. I don't like sprinkling "uv run ..." all over the place. But the nice part is that both of those work, and you can pick whichever one you prefer.
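With direnv that's a one-line .envrc in the project root (if I recall the helper's name correctly):
# .envrc
layout python3   # or simply: source .venv/bin/activate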
It'll be interesting to see how this all plays out with __pypackages__ and friends.
zahlman
18 hours ago
> But the nice part is that both of those work, and you can pick whichever one you prefer.
Yep. And so does the pyenv approach (which I understand involves permanently adding a relative path to $PATH, wherein the system might place a stub executable that invokes the venv associated with the current working directory).
And so do hand-made subshell-based approaches, etc. etc.
In "development mode" I use my activation-script-based wrappers. When just hacking around I generally just give the path to the venv's python explicitly.
kstrauser
17 hours ago
I use your "hacking around" method for things like cron jobs, with command lines like:
* * * * * /path/to/project/.venv/bin/python /path/to/project/foo.py
It's more typing one time, but avoids a whole lot of fragility later.
tgv
a day ago
Won't those dependencies then be global? With potential conflicts as a result?
auxym
21 hours ago
uv uses a global cache but hardlinks the dependencies for your script into a temp venv that is only for your script, so it's still pretty fast.
stephenlf
a day ago
Nope! uv takes care of that. uv is a work of art.
tgv
21 hours ago
Then I should seriously take a look at it. I figured it was just another package manager.
t43562
21 hours ago
....but you have to be able to get uv, and on some platforms (e.g. a Raspberry Pi) it won't build because the version of Rust is too old. So I wrote a script called "pv" in Python which works a bit like uv - just enough to get my program to work. It made me laugh a bit, but it works anywhere, well enough for my use case. All I had to do was embed a primitive AI-generated TOML parser in it.
zahlman
19 hours ago
> All I had to do was embed a primitive AI generated TOML parser in it.
The standard recommendation for this is `tomli`, which became the basis of the standard library `tomllib` in 3.11.
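A minimal sketch of the extraction side with tomllib (the reference implementation in PEP 723 uses a stricter regex; this is just the idea):
import tomllib  # Python 3.11+; the tomli backport on older versions

def script_metadata(path):
    # pull out the '# /// script' ... '# ///' block and parse it as TOML
    lines = open(path, encoding="utf-8").read().splitlines()
    start = lines.index("# /// script")
    block = []
    for line in lines[start + 1:]:
        if line.strip() == "# ///":
            break
        block.append(line[2:] if line.startswith("# ") else line.lstrip("#"))
    return tomllib.loads("\n".join(block))

# e.g. script_metadata("myscript.py").get("dependencies", [])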