captn3m0
a day ago
> The M4 Max MacBook I'm using to write this would've ranked among the 50 fastest supercomputers on Earth in 2009.
I attempted to validate this: you'd need >75 TFlop/s to get into the top 50 of the TOP500[0] rankings in 2009. An M4 Max review says 18.4 TFlop/s at FP32, but TOP500 uses LINPACK, which runs at FP64 precision.
An M2 benchmark gives a 1:4 ratio for double precision, so you'd get maybe 4-5 TFlop/s at FP64? That wouldn't make the TOP500 in 2009.
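For concreteness, here's that back-of-envelope check as a small Python sketch. The ~75 TFlop/s cutoff, the 18.4 TFlop/s FP32 figure, and the 1:4 FP64 ratio are all the commenter's numbers (assumptions, not independently verified):

    # Back-of-envelope check using the figures from the comment above (assumed, not verified):
    m4_max_fp32_tflops = 18.4         # reported M4 Max FP32 throughput
    fp64_ratio = 1 / 4                # assumed FP64:FP32 ratio, borrowed from an M2 benchmark
    top500_2009_cutoff_tflops = 75    # rough #50 LINPACK (Rmax) score in 2009

    m4_max_fp64_tflops = m4_max_fp32_tflops * fp64_ratio      # ~4.6 TFlop/s
    print(m4_max_fp64_tflops >= top500_2009_cutoff_tflops)    # False

Even with generous rounding, the FP64 estimate falls well short of the cutoff, which supports the commenter's conclusion.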
fleebee
18 hours ago
I'm guessing that's an LLM hallucination. The conclusion section especially has some hints it was pulled out of an LLM:
> The package managers we benchmarked weren't built wrong, they were solutions designed for the constraints of their time.
> Bun's approach wasn't revolutionary, it was just willing to look at what actually slows things down today.
> Installing packages 25x faster isn't "magic": it's what happens when tools are built for the hardware we actually have.
notpushkin
14 hours ago
Sorry, what is the telltale sign here?
pests
13 hours ago
It’s not _____, it’s ______.
Here's some more conversation I had a week or so ago:
nine_k
a day ago
> Now multiply that by thousands of concurrent connections each doing multiple I/O operations. Servers spent ~95% of their time waiting for I/O operations.
Well, no. A particular thread of execution might have been spending 95% of its time waiting for I/O, but a server (the machine serving the thousands of connections) would easily run at 70-80% CPU utilization (above that, tail latency starts to suffer badly). If your server had 5% CPU utilization under full load, you were not running enough parallel processes, or did not install enough RAM to do so.
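A quick illustration of the distinction (the numbers are illustrative assumptions, not from the article): if each request spends 95% of its time blocked on I/O, it only takes on the order of 20 in-flight requests per core to keep the CPU busy.

    # Simplified, Little's-law-style sketch: how many concurrent requests per core
    # does it take to keep a core busy if each request spends `wait_frac` of its
    # time blocked on I/O? (Illustrative assumption, not a measurement.)
    def concurrency_to_saturate_core(wait_frac: float) -> float:
        cpu_frac = 1.0 - wait_frac    # share of a request's time that actually needs the CPU
        return 1.0 / cpu_frac

    print(concurrency_to_saturate_core(0.95))   # 20.0 -> ~20 in-flight requests per core

That's the gap between "each thread waits 95% of the time" and "the machine sits 95% idle": with enough parallelism, the former doesn't imply the latter.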
Well, it's a technicality, but the post is devoted to technicalities, and such small blunders erode the trust to the rest of the post. (I'm saying this as a fan of Bun.)
LollipopYakuza
2 hours ago
> even low-end smartphones have more RAM than high-end servers had in 2009
That's even less accurate, by about two orders of magnitude. High-end servers in 2009 had way more than 4GB. The (not even high-end) HP ProLiant I installed for a small business in 2008, which had already been bought used at the time, had 128GB of RAM.
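As a rough scale check (using the commenter's 128GB figure and an assumed 4GB for a low-end phone; both numbers are illustrative assumptions):

    # Rough scale check with assumed figures:
    phone_ram_gb = 4            # assumed low-end smartphone today
    server_2009_ram_gb = 128    # the used ProLiant mentioned above; bigger 2009 boxes held more
    print(server_2009_ram_gb / phone_ram_gb)   # 32.0 -> well over an order of magnitude

Higher-end 2009 machines could take several hundred GB, which is where the two-orders-of-magnitude gap comes from.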
I understand why one would want to make an article entertaining, but that makes me seriously doubt the rest of the article when it dives into topics I don't know as well.