amiga386
10 days ago
What's old is new again.
Microsoft introduced the "Office Startup Assistant" or "Office Startup Application" (osa.exe) in Office 97, it sat in the system tray and loaded the main office DLLs at startup: https://web.archive.org/web/20041214010329/http://support.mi...
OpenOffice.org (predecessor of LibreOffice) copied this feature, which they called "QuickStarter", I don't know exactly when, but no later than 2003: https://www.openoffice.org/documentation/setup_guide2/1.1.x/...
Microsoft made OSA non-default in Office 2007, and _removed_ it from Office 2010: https://learn.microsoft.com/en-us/previous-versions/office/o...
Are they now bringing it back?
barrkel
10 days ago
I came here looking for this. It's an old idea, from the days when spinning rust was the limiting factor - precache the binaries.
If you ever tried Office 97 on a PC of 10+ years later, it's amazing how fast and lightweight it was. Instant startup, super snappy. And those apps were not lacking in features. 95% of what you need out of a desktop word processor was in Word 97.
MisterTea
10 days ago
> from the days when spinning rust was the limiting factor
How did we get back to this though? We have gigabytes/sec with NVMe and stupid fast CPUs with at least 4 cores in even low end models. Yet a text editor takes so long to load we need to load it up on boot... Such a frustrating field to work in.
afavour
10 days ago
I know this is such a stereotypical "get off my lawn" statement but we've lost the art of software engineering. It's all about stuffing as many features in as quickly as we can and pushing it out to as many people as possible. Performance is always secondary.
Not that I'm that nostalgic for the old days, we would have been doing the exact same thing if we were able to get away with it. But performance restrictions meant you had no choice but to care. Modern tech has "freed" us from that concern.
trealira
10 days ago
Niklaus Wirth wrote about this in 1995, in his essay A Plea for Lean Software.
About 25 years ago, an interactive text editor could be designed with as little as 8,000 bytes of storage. (Modern program editors request 100 times that much). An operating system had to manage with 8,000 bytes, and a compiler had to fit into 32 Kbytes, whereas their modern descendants require megabytes. Has all this inflated software become any faster? On the contrary, were it not for a thousand times faster hardware, modern software would be utterly unusable.
https://www.computer.org/csdl/magazine/co/1995/02/r2064/13rR...
That said, as someone fairly young, I still don't think that makes it wrong or something only an old man would think. Software seems to perform exactly as well as it needs to and no more, which is why hardware advances don't make our computers run software much faster.
nextos
10 days ago
Aside from slowness, feature creep leads to poor quality, i.e. tons of bugs and user confusion with ever-changing graphical interfaces.
If software was simpler, we could afford to offer some formal guarantees of correctness. Model check protocols, verify pre and post conditions à la Dafny, etc.
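For instance (a throwaway sketch with made-up names, nothing from a real verifier): even plain runtime asserts in C capture the pre/postcondition idea that Dafny would prove statically at compile time.

    #include <assert.h>
    #include <stdio.h>

    /* Precondition: len > 0. Postcondition: the result is >= every element.
       Dafny would check this contract statically; here it is only enforced
       at runtime, but the shape of the guarantee is the same. */
    int max_of(const int *a, int len) {
        assert(len > 0);                 /* precondition */
        int best = a[0];
        for (int i = 1; i < len; i++)
            if (a[i] > best)
                best = a[i];
        for (int i = 0; i < len; i++)
            assert(a[i] <= best);        /* postcondition */
        return best;
    }

    int main(void) {
        int xs[] = {3, 9, 2};
        printf("%d\n", max_of(xs, 3));
        return 0;
    }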
There's too much change for the sake of change.
aylmao
9 days ago
> There's too much change for the sake of change.
+1 to this. Like a lot of issues, I think the root is ideological, but this one in particular very clearly manifests organizationally.
The companies building everyday software are ever bigger— full of software engineers, designers and various kinds of managers who are asked to justify their generous salaries. At an individual level I'm sure there's all sorts of cases, but at a general level there's often almost no other option but to introduce change for the sake of change.
chipsrafferty
7 days ago
I once asked a man who worked in marketing why Oreos keep making crazy new flavors like "sour patch kids Oreos" when the normal kind is great and clearly has no issues being sold. I could see some upside - it gets people talking about them, it's fun, it reinforces the normal flavor as the best chocolate cookie, etc. but I was still dubious that those benefits outweighed the cost of developing new flavors in a lab, paperwork for food safety, a new manufacturing process, new ads, new packaging, etc. especially for something temporary.
He said it's often just some new marketing exec wants to put something on their resume, and they have certain metrics that they target that don't necessarily align with long term profits of the company.
I'm sure software has a similar problem.
etruong42
9 days ago
This is exactly what I see as well.
At a general level, I believe there are other options - changes/features need to meet some level of usage or they are scrapped, out of recognition that supporting all these features makes bugs more likely, degrades performance, makes it harder to add new features, makes the product more difficult to use, etc.
etruong42
9 days ago
> Software seems to perform exactly as well as it needs to and no more
The cynical spin I would put on this idea is that software performs as poorly as it can get away with. MSFT is feeling the need/pressure to have Office load faster, and they will try to get away with preloading it.
Otherwise, there is a strong pull towards bloat that different people will try to take credit for as features even if the cumulative experience of all these "features" is actually a worse user-experience.
naikrovek
10 days ago
software authors that don't care about performance annoy me (and I am an old man.)
The amount of things a computer can do in a single thread are amazing, and computers now have a dozen or more threads to do work. If developers cared about performance, things would easily be 20x as performant as they are today.
I'm not talking about "write in assembly, duh" I'm talking about just doing things intelligently instead of naively. The developers I support often simply are not thinking about the problem they're solving and they solve the problem in the simplest way (for them) and not the simplest way for a computer.
Software is an inefficiency amplifier, because the number of developers for a piece of code is much smaller than the number of computers that run that code; how much coal has been burned solely because of shitty implementations? I'd wager that the answer is "a LOT!"
Even if you don't care about coal usage, think about how much happier your users would be if your application were suddenly 5x faster than it was previously. Now think of how many customers want their software to be slow (outside of TheDailyWTF): zero.
languages like javascript and python remove you so much from the CPU and the cache that even if you were thinking of those things, you can't do anything about it. JS and Electron are great for developers, and horrible for users because of that amplification I described above.
I am dead tired of seeing hustle culture overtake everything in this field, and important things, to me, like quality and performance and support all fall straight down the toilet simply because executives want to release features faster.
things like copilot could help with this, i hope. presumably copilot will help introduce better code into applications than a daydreaming developer would, though the existence of vibe coding sort of nulls that out probably.
one thing that AI will do quite soon is increase the amount of software that exists quite dramatically. and I am kinda concerned about the possibility that it's all going to suck horribly.
ppenenko
10 days ago
I commiserate with your frustration with developers writing things suboptimally all too often. However, I disagree with the assumption that it's a JS/Python vs C issue.
Example: when VS Code came out, it was much, much faster, more responsive and stable than Visual Studio at the time. Despite being based on Electron, it apparently was much better on architecture, algorithms and multithreading than VS with its C++ and .NET legacy codebase. That really impressed me, as a C++ programmer.
Overall, it feels like folks who idealize bygone eras of computing didn't witness or have forgotten how slow Windows, VS, Office etc. used to feel in the 90s.
Nevermark
10 days ago
> Overall, it feels like folks who idealize bygone eras of computing didn't witness or have forgotten how slow Windows, VS, Office etc. used to feel in the 90s.
Let’s normalize speed over time like we do dollars, so we are talking about the same thing.
Given the enormous multiplier in CPU and storage hardware speeds and parallelism today vs. say 1995, any “slow” application then should be indistinguishable from instant today.
“Slow” in the 90’s vs. “Slow” in 2025 are essentially different words. Using them interchangeably without clarification sweeps several orders of magnitude of speed or inefficiency difference under the rug.
naikrovek
9 days ago
“Slow” is when the human waits on the computer.
The promise of computing is that what was slow in the 1960s and 1970s would be instant in 1990. And those things were instant, but those things aren’t what people did with computers anymore.
New software that did more than before, but less efficiently, came around, so everything felt the same. Developers didn’t have to focus on performance so much, so they didn’t.
Developers are lazy sacks who are held skyward because of hardware designers alone. And software developers are just getting heavier and heavier all the time, but the hardware people can’t hold them forever.
This cannot continue forever. Run software from the 1990s or 2000s on modern hardware. It is unbelievably fast.
Maybe it was slow in the 1990s, sure. I ask why we can’t (or won’t) write software that performs like that today.
The compiler for Turbo Pascal could compile something like a million lines per minute in 1990. We have regressed to waiting for 60+ minute C++ compile times today, on even moderate project sizes.
Debugging in visual studio used to be instant when you did things like Step Over. You could hold the Function key down and just eyeball your watch variables to see what was going on. The UI would update at 60FPS the entire time. Now if I hold down that key, the UI freezes and when I let go of the key it takes time to catch up. Useless. All so Microsoft could write the front end in dotnet. Ruin a product so it is easier to write… absolute nonsense decision.
All software is like that today. It’s all slow because developers are lazy sacks who will only do the minimum necessary so they can proceed to the next thing. I am ashamed of my industry because of things like this.
Gigablah
8 days ago
“Developers are lazy sacks who are held skyward because of hardware designers alone”
As a programmer who studied computer and electrical engineering in university, never before have I been so offended by something I one hundred percent agree with
wonnage
9 days ago
Counterpoint: single threaded performance hasn't improved much in the past 20 years. Maybe 5x at best. And virtually every UI programming environment still has problems with work done on the main thread.
DrillShopper
9 days ago
Single thread performance increased every processor generation, and is still doing so today.
zveyaeyv3sfye
9 days ago
Increased yes, but not by a whole lot. See https://cdn.arstechnica.net/wp-content/uploads/2020/11/CPU-p... and https://cdn.arstechnica.net/wp-content/uploads/2020/11/CPU-p...
Source: https://arstechnica.com/gadgets/2020/11/a-history-of-intel-v...
(I'm sure someone could dig up more recent graphs, but you get the idea).
In order to get more performance, your app needs to use multithreading.
Nevermark
9 days ago
Too true!
RAM parallel bandwidth, increased caching levels and size, and better caching rules, instruction re-ordering, predictive branching, register optimization, vector instructions, ... there have been many advances in single thread execution since the 90's. Beyond any clock speed advances.
antod
9 days ago
Office 4.3 loading on Win3.1 was glacial. I haven't forgotten.
p_ing
10 days ago
> The amount of things a computer can do in a single thread are amazing, and computers now have a dozen or more threads to do work. If developers cared about performance, things would easily be 20x as performant as they are today.
Why? A good portion of programs are still single-threaded, and often that's the correct choice. Even in games a single-threaded main thread or logic thread may be the only choice. Where multi-threading makes sense it should be employed, but it's difficult to do well.
Otherwise, it's up to the OS to balance threads appropriately. All major OSes do this well today.
ExoticPearTree
9 days ago
I think what the author wanted to say is that because computers are very fast today, developers have no incentive to write optimized code.
Nowadays you just "scale horizontally" by the magic of whatever orchestration platform you happen to use, which is the modern version of throwing hardware at the problem from the vertical-scaling days.
naikrovek
9 days ago
It’s not about programs being multithreaded. It’s about computers running multiple programs at once on different threads and they all perform well.
One can write software that uses the CPU cache in non-dumb ways no matter how many threads your program has. You can craft your structs so that they take less space in RAM, meaning you can fit more in cache at once. You can have structs of arrays instead of arrays of structs if that helps your application. Few people think of things like this today, they just go for the most naive implementation possible so that the branch predictor can’t work well and everything needs to be fetched from RAM every time instead of building things so that the branch predictor and the cache are helping you instead of impeding you. People just do the bare minimum so that the PM says the card is complete and they never think of it again. It’s depressing.
The tools to write fast software are at our fingertips, already installed on our computers. And I have had zero success in getting people to believe that they should develop with performance in mind.
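To make the data-layout point concrete, here's a minimal sketch (made-up names, not from any real codebase) of array-of-structs vs struct-of-arrays: summing one hot field touches far fewer cache lines with the second layout.

    #include <stddef.h>
    #include <stdint.h>
    #include <stdio.h>

    #define N 100000

    /* Array of structs: summing hp drags every other field through the
       cache; only 4 of the ~20 bytes per entity are useful to this loop. */
    struct EntityAoS {
        float   x, y, z;
        int32_t hp;
        uint8_t flags;
    };
    static struct EntityAoS aos[N];

    /* Struct of arrays: all hp values are contiguous, so every byte of
       every cache line the loop fetches is data the loop actually needs. */
    struct EntitiesSoA {
        float   x[N], y[N], z[N];
        int32_t hp[N];
        uint8_t flags[N];
    };
    static struct EntitiesSoA soa;

    static long total_hp_aos(void) {
        long sum = 0;
        for (size_t i = 0; i < N; i++)
            sum += aos[i].hp;
        return sum;
    }

    static long total_hp_soa(void) {
        long sum = 0;
        for (size_t i = 0; i < N; i++)
            sum += soa.hp[i];
        return sum;
    }

    int main(void) {
        printf("%ld %ld\n", total_hp_aos(), total_hp_soa());
        return 0;
    }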
p_ing
9 days ago
So your assertion is that developers should get in a big huddle to decide how they’re going to consume L1 between applications? Which of course no dev has control over since the OS determines what runs and when.
naikrovek
9 days ago
You can make your time in the CPU more efficient by thinking of the cache and the branch predictor, or you can say “nothing I do matters because the OS schedules things how it wants.” Up to you I guess, but I know which of those approaches performs significantly better.
njarboe
10 days ago
My standard is that software should appear to work instantly to me, a human. Then it is fast enough. No pressing a button and waiting. That would be great.
naikrovek
9 days ago
That is probably the correct measure. If “The Promise of Computing” is ever to come true, people must never wait on computers when interacting with them.
Waiting is ok when it comes to sending batches of data to be transformed or rendered or processed or whatever. I’m talking about synchronous stuff; when I push a key on my keyboard the computer should be done with what I told it to do before I finish pushing the button all the way down. Anything less is me waiting on the computer and that slows the user down.
Businesses should be foaming at the mouth about performance; every second spent by a user waiting on a computer to do work locally, multiplied by the number of users who wait, multiplied by the number of times this happens per day, multiplied by the number of work days in a year… it’s not a small amount of money lost. Every more efficient piece of code means lighter devices are needed by users. Lambda is billed by CPU and RAM usage, and inefficient code there directly translates into higher bills. But everyone still writes code which stores a Boolean value as a 32-bit integer, and where all numbers are always 8-bytes wide.
What. The. Fuck.
People already go on smoke breaks and long lunches and come in late and leave early; do we want them waiting on their computers all of the time, too? Apparently so, because I’ve never once heard anyone complain to a vendor that their software is so slow that it’s costing money, but almost all of those vendor products are that slow.
I’m old enough that I’m almost completely sick of the industry I once loved.
Software developers used to be people who really wanted to write software, and wanted to write it well. Now, it’s just a stepping stone on the way to a few VP positions at a dozen failed startups and thousands of needlessly optimistic posts on LinkedIn. There’s almost no craft here anymore. Businessmen have taken everything good about this career and flushed it down the toilet and turned teams into very unhappy machines. And if you don’t pretend you’re happy, you’re “not a good fit” anymore and you’re fired. All because you want to do your job well and it’s been made too difficult to reliably do anything well.
autoexec
10 days ago
> languages like javascript and python remove you so much from the CPU and the cache that even if you were thinking of those things, you can't do anything about it.
Even operating systems don't get direct access to the hardware these days. Instead a bunch of SoC middlemen handle everything however they like.
nativeit
9 days ago
Wait…those dastardly systems architecture engineers with their decadent trusted platform modules, each with an outrageous number of kilobytes of ROM. They are the true villains of software performance?
naikrovek
6 days ago
that doesn't matter; if you make your cache usage smart and your branches predictable, the CPU will take advantage of that and your program will run faster. It is in the interests of the system and CPU designers to make sure this is the case, and it is.
If you do the things which make your code friendly to the CPU cache and the branch predictor, when it comes time for your code to run on the CPU, it will run faster than it would if you did not do those things.
ralphc
10 days ago
What's your proposal for a "compromise" language between programmer productivity and performance, especially for multiple threads and CPUs? Go, Rust, a BEAM language?
Voultapher
9 days ago
I don't think the tools are the issue here, they are tools you can do good and bad jobs with all of them. What is lacking are the right incentives. The tech market has never been as anti-competitive as it is today. Let's repeal DMCA 1201 and go from there.
naikrovek
9 days ago
Jai seems to be an excellent start. Possibly Zig as well.
Both are written/designed by people who care a lot about application performance and developer experience.
teg4n_
9 days ago
This is an unserious take. Jai doesn’t even have official documentation and zig hasn’t reached a 1.0 release
naikrovek
6 days ago
I wasn't asked for examples of software that is congruent to whatever definition you want. I was asked for a proposal of a "compromise" language, and I answered that question.
colonial
10 days ago
> presumably copilot will help introduce better code into applications than a daydreaming developer would
Copilot is trained on Github (and probably other Git forges w/o permission, because OpenAI and Microsoft are run by greedy sociopaths.)
I'd wager that the majority of fleshed out repositories on these sites contain projects written at the "too-high level" you describe. This certainly seems to be true based on how these models perform ("good" results for web development and scripting, awful results for C/C++/Rust/assembly...) - so I wouldn't get your hopes up, unfortunately.
tough
10 days ago
I don't know if it's just the training data, or that CRUD and webapps are more inherently easy to parrot away.
Low level programming means actual -thinking- about the system, resources, and language decisions etc
Even humans struggle with it. It's much easier to build a website than, say, a compiler - for anyone, humans and LLMs included.
colonial
10 days ago
That probably plays into it as well. I have yet to see any convincing evidence that contradicts LLMs being mere pattern parrots.
My personal benchmark for these models is writing a simple socket BPF in a Rust program. Even the latest and greatest hosted frontier models (with web search and reasoning enabled!) can only ape the structure. The substance is inevitably wanting, with invalid BPF instructions and hallucinated/missing imports.
tough
10 days ago
imho these tools are great if you know what you're doing, because you know how to smell test the output, but a footgun otherwise.
It works great for me, but it is necessarily more of an aid/learning tool than a full-on replacement; someone's still gotta do the thinking part, even if the LLMs can cosplay -reasoning- now
vacuity
9 days ago
I'm also young and heavily favor simple, quality software. Age is a highly coarse proxy for one's experiences, but in this case I think it has more to do with my personality. I have enough experience in computing that I don't think I'm making demands that are unrealistic, although they are certainly unrealistic if we maintain current incentives and motives.
joelwilliamson
10 days ago
“What Andy giveth, Bill taketh away”
mapt
10 days ago
I saw the writing on the wall when I had to install a couple 150MB IDEs to run 101-level Java programs in the mid 2000's. 150 megabytes. MEGABYTES. I could consume about 1 kilobyte per minute of fantasy novel text in uncompressed ASCII, call it 1/8th that fully compressed. That means this compressed binary you're handing me is around 1.2 million minutes of work (150 MB at ~125 bytes per minute; more if ingesting a novel is faster than writing/testing/debugging a program) for what is functionally a text editor, file manager, syntax library, compiler, and debugger. Pretty sure that got done in 150 kilobytes a generation earlier. A generation later, maybe it will be 150 gigabytes.
netsharc
10 days ago
I looked it up, you want an illegal taxi? 168 MB: https://apkcombo.com/uber/com.ubercab/
reaperducer
10 days ago
install a couple 150MB IDEs
Not Java, but an IDE in 4K: https://en.wikipedia.org/wiki/BASIC_Programming
Having used it quite extensively (Well, five solid days over two weeks, which is about 1000x longer than most people gargling on the internet), it's surprisingly capable.
Imagine if someone with the same talent and motivation was working on today's hardware.
<aside> Someone on Atari Age wrote a LISP for the same machine.
MisterTea
10 days ago
> I know this is such a stereotypical "get off my lawn" statement but we've lost the art of software engineering.
Indeed. I am sure many of us here are burnt out on bloat. I am also sure many of us want to move to smaller stuff but cant simply because of industry momentum. BUT that doesn't mean the dream is dead, only that we must work towards those goals on our own time. I found Plan 9 and haven't looked back. I can rebuild the entire OS in seconds on a fast machine. Even my little Celeron J1900 can rebuild the OS for several supported architectures in minutes. I can share a USB device seamlessly across my network, PXE booted from a single disk without installing anything. Cat(1) is just 36 lines of C.
There's still hope. Just ignore the industry hype noise and put in the effort ourselves.
ryandrake
10 days ago
And just when we think we can't make software any more inefficient, slow, and bloated, they release things like Electron, where you ship an entire browser with your app! And then when we think it can't even get worse, we have Docker and containers where we ship the entire OS with the application.
I'm looking forward to when app developers ship you an entire computer in the mail to run their text editor.
bloomca
9 days ago
The problem with Electron is that business-wise it is an excellent decision. You can get by with a few people to wrap the web app and integrate it with the OS, and then get updates pretty much for free.
Yet for the user it is bad -- bloated, slow, feels non-native, has specific bugs which are hard to address for the devs, etc.
I don't see any light for the desktop UI development unless there is some lightweight universal rendering engine. Tauri with WebView is somewhat promising, but it has problems on Linux and it is hard to target older systems.
ryandrake
9 days ago
It's a pretty OK example of a negative externality. A little like polluting: Just dumping your waste into the environment is business-wise an excellent decision. You avoid the cost and everyone else has to deal with the downsides.
bloomca
9 days ago
Polluting is indeed an excellent business decision. The thing about apps is that all of them are polluting, just some of them are worse than others. And we tend to fill all available resources, so over time it only gets worse.
bigstrat2003
9 days ago
It's an excellent business decision... right up until your customers abandon you because you make bad quality software. Like many businesses have found time and again, deliberately sacrificing quality for profit is a short term gain for a long term loss.
tekknik
9 days ago
there are quite a few examples of software built with electron that have very large user bases. this sounds like a personal vendetta against electron rather than meaningful insight.
LuciOfStars
9 days ago
Electron is horrid, but as a user, I prefer bloated "apps" to no support at all.
As for your second point: [1]
mbs159
4 days ago
> Cat(1) is just 36 lines of C.
Correct me if I'm wrong, but isn't it around 800 lines[1]?
1. https://github.com/coreutils/coreutils/blob/master/src/cat.c
gregates
10 days ago
I love that I work at a place (Row Zero) where caring about performance is baked into the culture, and I can spend days fixing weird perf edge cases our users discover without management asking why I'm wasting my time. And it's office software, no less!
zer00eyz
10 days ago
>> It's all about stuffing as many features in as quickly as we can...
The problem isn't "engineering" the problem is the culture of product management. (Note: NOT product managers per se).
I ask this basic question, how many Directors, VP's or CPO's do you know who got their job based on "cutting out unused features"? If you can find one, it will end up being the exception that proves the rule. The culture of "add", "new" and "shiny" doesn't reward keeping things lean and effective.
In the tangible world we look to accountants for this sort of thing (because they tend to have costs). Think cheap Costco hotdogs and free cookies at Double Tree. No one in product, dev and accounting is going to sit down and try to justify losing some code, features and maybe a few customers to make it faster when you can just "engineer" your way out of it and not have to sell less is more.
zero_bias
10 days ago
> I ask this basic question, how many Directors, VP's or CPO's do you know who got their job based on "cutting out unused features"?
Google goes a step further and kills entire apps
LuciOfStars
9 days ago
Moment of silence for Play Music. YT Music isn't any less awful now than it was a decade ago.
bigfatkitten
9 days ago
Ford does. They look at connected vehicle telemetry to strip out features nobody uses in order to save software maintenance, hardware and third party IP licensing costs.
https://www.slashgear.com/1513242/ford-gets-rid-of-self-park...
lispisok
9 days ago
My personal theory is there is a threshold of performance. Below the threshold the experience is bad enough it affects revenue so getting the program up to speed becomes a priority. Above the threshold only features are prioritized to drive more revenue. That's why despite computers getting orders of magnitude faster computer programs seem to run about the same speed.
nativeit
9 days ago
I think the threshold is more about how much more rent we can seek to collect from users, and making things more performant or ergonomic doesn't do anything to allow sales to add another 10% to the per-user subscription pricing (I assume this is a product with per-user subscriptions, even though it's almost certainly unnecessary).
But adding yet another gateway to ChatGPT’s API…that’s a $15/mo/user add-on right there, and not just a monkey, but one of the slower intern monkeys, could write the code for such an incredibly innovative and, there’s no other word for it, powerful new feature that everyone should be thrilled to pay for at a 200-300% (or more!) markup. So guess who gets that VP of New Bullshit Development position, and guess what kind of choices get reinforced?
(EDIT: grammar)
acdha
10 days ago
How does someone get promoted at Microsoft? How do they avoid being seen as a low performer?
Performance just isn’t on that list, and it’s often more and harder work than a given new feature took to create. Office users are getting what Microsoft is designed to deliver.
ambicapter
10 days ago
I really don't think we've "lost it", I think performance has just not been a consideration in the engineering of Office for a long time, if ever.
trinsic2
8 days ago
> I know this is such a stereotypical "get off my lawn" statement but we've lost the art of software engineering.
Absolutely this. I think this is evidence that points to modern civilization starting to collapse. When we can't engineer correctly, we're fucked.
90s_dev
10 days ago
> we've lost the art of software engineering
Yes! This is what all my projects are geared towards restoring. The big one is not quite ready to announce yet, but I am very proud of it, and extremely excited to release it, to solve exactly that: it makes engineering fun again!
makapuf
9 days ago
Well that username matches
ysofunny
10 days ago
we don't have software engineering any more than the Romans had civil engineering
we now DO have civil engineering but that is it
gwbas1c
10 days ago
It was always like that
Salgat
10 days ago
It's a matter of resource allocation. Lowering your design requirements for performance can save significant developer cost. Also, Word in 2025 is doing a lot more under the hood than 97.
Workaccount2
10 days ago
I'll take the hate for this, but I have been using gemini to build narrow scope apps and they are extremely fucking fast compared to their bloated software package suite $200/user/month counterparts. It's amazing how fast and efficient programs can be when not trying to cover every use case for every possible user at every possible moment on top of a sea of tech debt programming.
While true LLMs fall flat on their face when fed massive codebases, the fact of the matter is that I don't need a 200k LOC program to accomplish a single task that an LLM can do in 2k LOC.
To give an example, we have a proprietary piece of software that is used to make (physical) product test systems using flow charts and menus. It's expansive and complex. But we don't need it when we can just spend 30 minutes prompting our way to working test code and it produces way faster and more robust systems.
Maybe the devs of that software package cannot dump that whole codebase into an LLM and work on it. But they are completely missing the forest for the trees.
tharkun__
10 days ago
I will make this analogy:
Have a large ZIP file. Preferably like a few gigs and lots of (small) files.
Try to open it with the built-in Windows 11 tooling from Microsoft. It's going to be super slow to even show anything never mind unpack it.
Now install say 7-zip and do the exact same thing(s): opening is instant, and unpacking takes a much, much smaller amount of time (only limited by disk speed).
Turns out optimizations / not doing stupid things is still a thing even with all this raw power we now have.
jlarocco
10 days ago
Because an entire generation of developers and their managers believe the hardware is so fast there's no point trying to optimize the software.
Besides, the only thing that matters is getting tickets "done" before the arbitrary sprint deadline in 2 weeks, so best not to spend any extra time cleaning up or optimizing the first thing that works. You can't think about performance until the sprint dedicated to performance.
boringg
9 days ago
100% thought process is: why waste internal resources on speeding up software when the user has enough hardware to manage the workload.
codr7
9 days ago
Battery use is a pretty big concern these days; also, some users like running several things at the same time.
hunter2_
9 days ago
For the local OS and sustained workloads like video playback, yes, battery optimization is huge. For an individual app with bursty compute, less so, plus some of that inefficient code can run in the cloud instead, which is costly, but premium subscriptions can pay for it, and power plants are now colocated with data centers so power transmission cost is negligible. The incentive to be efficient is insufficient.
prussian
9 days ago
I think people forget that some of this software may be relatively fast. The problem is, most corporate environments are loaded up with EDRs and other strange anti-malware software that impede quick startup or speedy library calls. I've seen a misconfigured Forcepoint EDR rule block a window for 5 seconds on copy and paste from Chrome to Word.
Another example: it takes ~2 seconds to run git on my work machine
(Measure-Command { git status | Out-Null }).TotalSeconds
while running the same command on my personal Windows 11 virtual machine is near instant: ~0.1 seconds.
Still slower than Linux, but not nearly as bad as my work machine.
bigmattystyles
10 days ago
Telemetry, syncing to the cloud by default…
dagmx
10 days ago
Neither of which contribute significantly to size though. The size aspect is what these new preloaders would help with.
2OEH8eoCRo0
10 days ago
We stopped developing for users/customers and instead added layers to make developer lives easier.
Why the hell are all my desktop apps written in JS now?!
cogman10
10 days ago
> Why the hell are all my desktop apps written in JS now?!
Have you seen the state of pretty much every non-js UX framework?
That's why.
JS/css/html won the UX domain in a way that no other language comes close to. If you look at the most recent most modern UX frameworks, they are often just half implemented poor mimics of the js/css/html stack with approximately 0 devs writing new 3rd party extensions.
Intellij uses swing, SWING, as its UX. A framework written in the 90s filled with warts. Yet, it's still a better experience than the more recent JavaFX experience. Swing simply has more support.
ezst
9 days ago
Call me an idiot, but I still gladly take Swing and javafx over JS and monstrosities like react. The state of Qt is also very good. Web won because the distribution model is easier on the user, and because managers thought UX designers would be making whole apps now, saving on rates. Not because it's technically superior.
cogman10
9 days ago
You're not an idiot for liking the Swing/javafx/QT way of doing things. Or even for thinking they are technically superior.
The bigger issue isn't the tech, it's the ecosystem. While you might like swing, you simply are never going to find the swing version of Material UI or D3.js. That's more the problem that you'll run into.
For some of our apps because we need charting, we are using GraalJS just to run the JS charting library to export to an image that we ultimately put on some of our downloadable reports. It's a huge pain but really the only way to do that.
ezst
9 days ago
> you simply are never going to find the swing version of Material UI or D3.js. That's more the problem that you'll run into.
I remember a time when having your application look "out of place" was undesired, and the ultimate goal was to be "as native as possible". If you are running a website selling something, I agree that you want a brand-identity and a unique look. But productive software shouldn't require users to adapt to new UX paradigms (guessing whether the cancel button comes on the left or on the right, dealing with slightly different input methods and entry shortcuts…).
Anyhow, I think things could be worse, since, as you say, we can embed a webview into any JavaFX/Qt/… app and get the best of both worlds.
baobun
9 days ago
It's quite something that there hasn't been any real successor living up to Delphi or even goddamn Visual Basic for modern desktops.
stcroixx
10 days ago
Skilled programmers working on boring stuff like office. Most programmers today don't have the skills they think they do and would find working on something like Office boring.
layer8
10 days ago
Ironically, at the start of my career working on something like Office was my dream, and would actually still be. I reserve the right to change my mind once I’ve seen the code base, though. ;)
bradley13
10 days ago
Cruft built on frameworks using libraries with a zillion dependencies, some of which are cruft built on frameworks...
sh34r
7 days ago
Well, we networked all the computers together around that time, and it turned out that all the 1337 performance hacking that people did back then had severe unintended consequences. You’re talking about an era in which the NX bit would not be utilized by Windows for another ~7 years. “Smashing the Stack for Fun and Profit” was contemporary work.
It’s not rocket science to eke out oodles of performance out of a potato if you don’t care about correctness or memory safety.
Word 97 will only delight you if you use it on an airgapped computer, as a glorified typewriter, never open anyone else’s documents with it, and are diligent about manually saving the doc to multiple places for when it inevitably self-corrupts.
But at that point, why not be like GRRM and write on a DOS word processor? Those were historically a lot more reliable than these second-generation GUI apps.
PJDK
10 days ago
So, we often look back on the old days with rose tinted glasses. But let me recount my IT classes from the 90s.
We'd sometimes go to the library to write something up in MS Word. We always liked this because it would be a good 5-10 mins to boot up some kind of basic Unix menu. You'd then select windows 3.1 and wait another 10-15 minutes for that to load. Then you could fire up word and wait another 5 minutes. Then you could do 5 minutes work before the class was over!
layer8
10 days ago
> How did we get back to this though?
By piling up nonzero-cost abstractions left and right.
wk_end
10 days ago
And it's easy to understand how we get into that trap. Each one of those abstractions is very low-cost, so it seems harmless.
And getting out of the trap is hard too, because no single abstraction is to blame - you can't just hit things with your profiler and find the hot spot. It's all of them. So now you either live with it or rewrite an entire stack of abstractions.
lukan
9 days ago
"How did we get back to this though?"
Probably because windows needs to make a connection for every file somewhere else first and wait for the reply, before granting you the advanced software as a service feature called text editing.
It definitely feels like this at times and I fear there is too much truth in my statement.
But it is not just windows only. My old chromebook took seconds to open a folder in the file browser (even if it was already open). But a "ls" on the terminal was instant for any folder. So getting the information was not the problem. But from there to displaying it in a GUI, there seems to be myriads of important (tracking?) layers involved.
7657364574
10 days ago
My guess is that we're already seeing the consequences of "AI-assisted programming". Just yesterday, Microsoft's CEO revealed that 30% of their code is written by AI.
drjasonharrison
9 days ago
Given the game of telephone that would have had to occur for that 30% figure to travel from developers up to the CEO, it's probably including things like autocomplete...
The Plan
In the beginning, there was a plan,
And then came the assumptions,
And the assumptions were without form,
And the plan without substance,

And the darkness was upon the face of the workers,
And they spoke among themselves saying,
"It is a crock of shit and it stinks."

And the workers went unto their Supervisors and said,
"It is a pile of dung, and we cannot live with the smell."

And the Supervisors went unto their Managers saying,
"It is a container of excrement, and it is very strong,
Such that none may abide by it."

And the Managers went unto their Directors saying,
"It is a vessel of fertilizer, and none may abide by its strength."

And the Directors spoke among themselves saying to one another,
"It contains that which aids plants growth, and it is very strong."

And the Directors went to the Vice Presidents saying unto them,
"It promotes growth, and it is very powerful."

And the Vice Presidents went to the President, saying unto him,
"This new plan will actively promote the growth and vigor
Of the company
With very powerful effects."

And the President looked upon the Plan
And saw that it was good,
And the Plan became Policy.
And this, my friend, is how shit happens.
from anonymous email
Dylan16807
9 days ago
Interesting that this story is not really a game of telephone, but instead a single layer replaced the meaning on purpose.
codr7
9 days ago
By software, most of it is probably generated scaffolding.
miunau
9 days ago
There are no financial or career incentives to optimise existing, working things in these institutions. The structures which these large companies use are for people working on new things, so that's what we get. Making office faster doesn't create headlines, but a new product that does the same thing somehow will.
intelVISA
8 days ago
Software 'engineering' is too abstract, yet imagine the outrage if every new highway had a 20km/h limit...
causality0
9 days ago
Ever notice how Windows 7 and 10 and 11 have basically the same features and benchmark the same on performance tests yet 10 and especially 11 completely shit the bed if you try to run them off a hard drive? Like they might all boot in twenty seconds from an SSD but booting from a hard disk might take W7 two minutes and W11 ten minutes to a stable desktop.
PeterStuer
8 days ago
In the old days there was the saying 'What Intel Giveth, Microsoft Taketh Away'
skywhopper
10 days ago
How much engineering time do you think is spent optimizing startup time on most modern editors? I’m guessing next to nothing.
marcod
10 days ago
"eight megabytes and constantly swapping"
Cthulhu_
10 days ago
I wish companies would go back to building fast apps, but alas. Everything is feature-packed and requires you to download half the internet.
My intellij license just expired so today I'm back using Sublime Text, and honestly it's a breath of fresh air / relief - and it's not even the fastest editor, iirc it uses Python under the hood. I've installed Zed but getting plugins and keyboard shortcuts all lined up is always challenging. That one took ~2-3 seconds to cold start.
qwerty456127
10 days ago
> I wish companies would go back to building fast apps
It seems fascinating how much more efficient Windows apps were back in the nineties, capable of doing almost everything today's apps do, in a similar manner, on orders of magnitude less powerful hardware, often performing even faster.
The last time I expressed this, probably also here, somebody suggested the performance drop is the cost of modern security - vulnerability mitigations, cryptography etc.
dbalatero
10 days ago
I think the performance drop probably has more to do with managers & product folks expecting growth and features at all costs at the expense of keeping performance at some baseline.
I also wonder if it's just harder to continually monitor performance in a way that alerts a team early enough to deal with regressions?
That said, security _can_ impact performance! I work on Stripe frontend surfaces, and one performance bottleneck we have comes from needing to use iframes to sandbox and isolate code for security. Having to load code in iframes adds an additional network load before we can get to page render.
zerkten
10 days ago
I think you need to add that many more developers, or even teams of developers, are building these apps. They have their own fiefdoms and it's less common for devs to have a complete understanding of the code base.
Over time decisions are made independently by devs/teams which cause the code bases to get out of alignment with a performant design. This is exacerbated by the growth pressure. It's then really hard for someone to come in and optimize top to bottom because there is everything from a bug to a design problem. Remediation has significant overhead, so only things around the edges are touched.
Fast forward a couple of years and you have a code base that devs struggle to evolve and add features to as well as keep performant. The causes are many and we come to the same one whether we complain about performance or maintainability. You probably don't feel this way, but Stripe and other premier engineering companies are way ahead of others in terms of their practices when you compare with the average situation developers are facing.
Independent mobile development is often where I see most attention to performance these days. The situation for these devs is a little bit closer to what existed in the nineties. They have a bigger span of control and performance is something they feel directly so are incentivized to ensure it's great.
apricot
10 days ago
> It seems fascinating how much more efficient Windows apps were back in the nineties
I remember everyone complaining about how bloated they were at the time. Pretty sure someone in 2055 is going to run today's Office on 2055 computers and marvel at how streamlined it is.
Illotus
10 days ago
My recollection is completely different, software was really slow on contemporary PCs in the 90s. Spinning disks, single core cpus, lot more swapping due to memory being so much more expensive.
barrkel
10 days ago
Contemporary software was slow. You could tell you should consider more RAM when the HDD light and chugging noises told you it was swapping. But if you ran the same software with the benefit of 10 years of hardware improvement, it was not slow at all.
hedora
10 days ago
This might match your recollection. x86 Win95 raytracing in javascript on my arm laptop is usable, but sort of slow:
https://copy.sh/v86/?profile=windows95
Once it boots, run povray, then click run.
It took over 2 minutes to render biscuit.pov, though it did manage to use SSE!
reaperducer
10 days ago
It took over 2 minutes to render biscuit.pov
We used to wait two hours for a mandelbrot to display on a Commodore 64, and were delighted when it did.
spookie
10 days ago
Nah, it's about favouring bad programming practices, not thinking about the architecture of the software and giving developer experience a bigger role than end user experience. All these stemming from a push to get to market earlier or making employees replaceable.
taeric
10 days ago
I'd guess less on "bad programming practices" and more "prioritizing development speed?" Mostly inline with your next point. We value getting things released and do not have a solid process for optimizing without adding new things.
Ironically, it was the longer feedback cycle of long builds and scheduled releases that seems to have given us better software?
spookie
10 days ago
Fair, I just think there is a huge overlap between bad practice and speed of development. The latter fuels the former in many ways.
taeric
9 days ago
Oh, I likely fully agree with you on it. I'm just pointing at the hazard that a lot of these practices aren't, intrinsically, bad. Rapidly getting something done is, generally, a good thing. I'm not entirely sure how to make it not the priority, but it does feel that that is the problem.
ralphc
10 days ago
How much modern security do I need to write & print a term paper or church bulletin on my own computer?
qwerty456127
9 days ago
This depends on whether you want to exchange data with other computers using even removable media, let alone the Internet. Also whether you want to use Unicode. In case you only want to hand-type, edit and print a plain English paper right away by dumping plaintext/PostScript/PCL to LPT you probably are fine with any computer you can find. It's just nobody is using standalone computers anymore, almost every modern computer is a network device.
toast0
10 days ago
I mean, as mentioned upthread, Office 97 (and Office 95 before it) was slow to load, so slow that they added the start up accelerator.
You can run Office 97 now and it'll start fast because disk i/o and cpus are so much faster now. Otoh Excel 97 has a maximum of 256 columns and 64k rows. You might want to try Excel 2007 for 16k columns and 1M rows, and/or Excel 2016 for 64-bit memory.
spitfire
10 days ago
The 90s was a time when computers were doubling in speed every 18 months. I remember Office 97* being lightning fast on a 366MHz Celeron - a cheap chip in 1998.
You could build fast software today by simply adopting a reference platform, say a 10-year-old 4-core system, then measuring performance there. If it lags then do whatever work needs to be done to speed it up.
Personally I think we should all adopt the raspberry pi zero as a reference platform.
Edit: * office 2000 was fast too with 32 megs of ram. Seriously what have we done?
hedora
10 days ago
Oddly, I remember timing it, and the startup accelerator didn't seem to speed up Office start time. It slowed everything else down though.
I rebooted a lot though (mostly used Linux by then), so maybe it was just fighting with itself when I immediately fired up office after boot.
kyrra
10 days ago
If you like fast apps, maybe check out FilePilot, a Windows explorer alternative.
It's amazingly fast, though it's missing some features and will be really expensive when it leaves beta.
QuicklyNow
10 days ago
I made an account to thank you for this. I've been looking for a _fast_ alternative to explorer since Windows XP. But one that doesn't require a change in workflow. This is the fastest I've tried by far. I've only been using it for 5 minutes, but I'm sold. Earlybird discount too!
Thank you for posting this, and if you have any other speedy apps you'd recommend I'd welcome suggestions. My top suggestions are Speedcrunch [0] (calculator app) and Everything [1] file search combined with Listary [2]
[0] https://github.com/ruphy/speedcrunch
[1] https://www.voidtools.com/
(For reference, I've tried Total Commander, DOpus, Files, Explorer XP, XY Explorer, Explorer ++, FreeCommander, Double Commander, Q-Dir)
jpsouth
10 days ago
Everything (the tool) is ridiculously fast, I’ve used it for quite a while now and it’s nice to see it mentioned here.
kyrra
10 days ago
I'll check these out, thanks!
I learned about File Pilot (whose author posts here: https://x.com/vkrajacic) from Casey Muratori (https://x.com/cmuratori) who pushed it a bunch because he loves fast things.
ajolly
8 days ago
Listary has slowed down tremendously ever since they included search. For a launcher I'm still using "Find and Run Robot", which truly is a 90s era piece of software but works blazingly fast. I do have a plug-in to tie it into Everything.
zerkten
10 days ago
Have you tried xplorer²? I only know about it because I was into Windows programming using the WTL eons ago.
qwerty456127
10 days ago
In my opinion Total Commander has always been the most ideal (also fast) file management tool since Windows 3.x. It was named Windows Commander back in the days but it still supports Windows 3.x as Total Commander.
vlachen
10 days ago
I never knew it was a Windows program. I've been using it on my Android phones for years.
Oleksa_dr
10 days ago
All of these ‘fast’ file managers have a big problem: they don't support system calls to dialog windows.
Mostly users interact with the explorer in this scenario to open/save a file in ‘BrowserOS’
zerkten
10 days ago
>> they don't support system calls to dialog windows.
It's a little unclear what you mean exactly. Do you want the browsing experience changed for the system's file open/save dialogs? i.e. a third-party file explorer opens instead with all of its features.
gymbeaux
10 days ago
Do you know how it compares to Dolphin for interacting with very large (100k+ files) directories? Dolphin is the only file manager I’ve found that keeps up with the large directories- GNOME (Nautilus) and Windows Explorer are dogshit slow, even after the file list populates. macOS Finder is somewhere in the middle but still very slow.
Lammy
10 days ago
Check out ROX-Filer https://github.com/rox-desktop/rox-filer
worthless-trash
10 days ago
When you have 100k+ files sometimes the filesystem itself matters. Have you set your expectations appropriately, aka compared it to a raw ls/dir ?
aloisdg
10 days ago
I like dolphin but pcmanfm is faster
ct0
10 days ago
how does it compare to directory opus?
jacurtis
10 days ago
I actually use IntelliJ and Sublime interchangeably throughout the day. I default to sublime for most work because of how snappy fast it is. I only load up Intellij when I need to do something that leverages its full IDE capabilities. To second your comment, I encourage people to try Sublime, you will be shocked at how fast it feels.
I still love IntelliJ, it's a great product. But it is slow, bloats the computer, needs constant updating. But at least it's incredibly powerful as a tradeoff to the bloat.
The Office debate is slightly different. It is slow, bloats the computer, needs constant updating. But unlike IntelliJ I don't feel that there is any advantage to all that added weight. We are using a word processor. We type words onto a blank screen. Why are Word and Outlook the most common applications to crash my computer that has an M1 Max Chip with 48GB of memory? I can literally run and build AI models on my computer with no problem but you boot up Microsoft Word to type words onto a blank page and you have a 33% chance of crashing the computer.
Google Sheets and Docs are actually better tools in my opinion for most people. 99% or more of the features you need are available in the google equivalents. These products are fast and run in a browser! The UI is actually VASTLY superior to Microsoft's ribbon. I still can't find stuff that I need on a routine basis and have to google it. I don't have this problem when using Google's office equivalents.
ben-schaaf
10 days ago
> and it's not even the fastest editor, iirc it uses Python under the hood
The majority of the app is written in C++. Python is used for plugins.
mdhb
10 days ago
Instead they decided to build it with React if I remember correctly which is truly one of the fucking dumbest ideas I’ve ever heard of.
tremon
10 days ago
I don't think these apps were ever built to be fast, they were built with the resource constraints of the time. Moore's law is what's making these apps fast.
rollcat
10 days ago
Intel giveth, Microsoft taketh.
This is even more sad with Apple. My M1 Mac felt incredibly snappy with Big Sur, but is getting ever so slightly slower with each update. My iPhone SE2 is a complete disaster.
mleo
10 days ago
They used no networking services either. Now, I open a shared PowerPoint and am stuck waiting for a couple of minutes while it is syncing or doing who knows what. People have no sense of what templates they copy from other documents causing the size of the file and load times to bloat.
formerly_proven
10 days ago
Word 2000 definitely was not very quick on a contemporary office PC (Pentium II or III), though I'm pretty certain it was much, much faster than desktop O365 is on a contemporary office PC today, despite those being >100x faster. So a Fermi estimate would put modern Office at requiring at least 1000x more resources than Office 2000.
citizenpaul
10 days ago
>I wish companies would go back to building fast apps
My prediction is that we are about to enter a great winter of poor performance across the industry as AI slop is put into production en masse. Along with bugginess that we have not seen since the early dotcom days.
bell-cot
10 days ago
> I wish companies would go back to building fast apps...
Similar to the slow-opening glove box in a car, many humans perceive slow computers & software as signifiers of importance/luxury/prestige. At least at first. By the time they are familiar with the slow software, and impatient to just get their work done - too late, SnailSoft already has their money.
jonhohle
10 days ago
I always think the same thing. 486s could run real-time spell check and do WYSIWYG layouts, and the software came on floppy disks. Now we have screen recording apps that require 256 MB downloads every 5 minutes (yesterday’s story).
I have a small utility app that I sell and take great pains to keep it small and resource-light. I really appreciate when other devs do the same.
hedora
10 days ago
486s were well past the point when that became practical. I remember using Prodigy on a 286. Screenshots are rare, since you can't fire it up in an emulator unless you have a modem that can make phone calls to the 1980s. Also, machines from back then didn't really have room to store screenshots:
https://www.vintagecomputing.com/index.php/archives/1063
Their service used vector art to render interactive pages like that over <= 2400 baud modems. Other than it being proprietary and stuff, I'm guessing the web would be a much cooler place if HTML hadn't eaten their lunch. SVG is a decent consolation prize, I guess.
ralphc
10 days ago
Let me point you to Prodigy Reloaded, https://github.com/ProdigyReloaded. We're reviving the Prodigy server and as many cache files as we can find, using Elixir as the backend language.
It's not merged yet but I've written an Elixir library that writes graphics files in Prodigy's graphics format, NAPLPS. I'm using it to get current weather and creating weather maps that are displayed in Prodigy. https://github.com/rrcook/naplps_writer
You can run Prodigy in DOSBox and get screenshots.
donny2018
10 days ago
Where is that guy who coded RollerCoaster Tycoon in Assembly?
alliao
10 days ago
Some modern software is slower because someone made the poor decision to fetch something from the network on the critical path of the main functionality... kills me.
amiga386
10 days ago
Example: https://old.reddit.com/r/libreoffice/comments/upf8nw/fixed_o...
Opening a spreadsheet, even if you don't want to print it, will hang for 30 seconds doing nothing, because LibreOffice loads the printer settings for the document, which means asking the printer; if the printer is network-based and turned off, that means a 30-second wait for a timeout.
Reported in 2011, still not fixed: https://bugs.documentfoundation.org/show_bug.cgi?id=42673
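For illustration, a minimal sketch (not LibreOffice's actual code; all names are made up) of what keeping that printer query off the document-open path could look like, using a background thread and a hard timeout:

```python
import time
from concurrent.futures import ThreadPoolExecutor, TimeoutError

def query_printer_settings():
    # Hypothetical stand-in for the OS printer query that can block for
    # ~30 seconds when a configured network printer is unreachable.
    time.sleep(30)
    return {"paper": "A4"}

def open_spreadsheet(path):
    # Hypothetical fast, purely local document load.
    return f"<spreadsheet {path}>"

executor = ThreadPoolExecutor(max_workers=1)
settings_future = executor.submit(query_printer_settings)  # fire off in the background

doc = open_spreadsheet("report.ods")  # user sees the sheet immediately

try:
    printer = settings_future.result(timeout=2)  # wait briefly at most
except TimeoutError:
    printer = None  # fall back to defaults; re-query lazily if the user actually prints

print(doc, "opened; printer settings:", printer)
# (the stub worker keeps sleeping, so this demo takes ~30 s to exit)
```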
ramshanker
10 days ago
This, a hundred times over. It wastes a few minutes of my day, every day. A five-figure commercial application, state of the art in its domain, but somehow it has to wait for that unresponsive network printer and hang the startup UI.
HideousKojima
10 days ago
Best workaround (still ridiculous that it hasn't been fixed) is to set a PDF printer as your default printer.
hedora
10 days ago
I frequently have to do this for a different reason on MacOS/iOS. Printing directly sends bad PCL to my printer and makes it spew garbage pages.
"Print to PDF -> print the PDF" is much more reliable.
sigh
buovjaga
8 days ago
> Reported in 2011, still not fixed: https://bugs.documentfoundation.org/show_bug.cgi?id=42673
Perhaps spurred by this comment, there was new discussion in the report and it turned out Microsoft has fixed the issue in Windows 11: https://bugs.documentfoundation.org/show_bug.cgi?id=42673#c8...
Quote:
"I was one of the many who reported this problem in one from or another. The problem is Windows-specific. I have found out that the problem actually comes from the Windows print system. There is no way to check that a printer is actually active or working or even present without incurring a long time-out in case the printer is not present or powered. Trying to check the defualt printer will incur the same time-out.
Calc apparently wants to check the printer to know how to construct its layout, and has to wait for the time-out to continue.
Some of the comments that claim that Calc hangs and never returns have probably not waited long enough for the timeout.
On my new Windows 11 computer, this printer system behavior has been changed and I no longer experience a delay while opening Calc."
Twirrim
9 days ago
I don't recall spreadsheets ever having any visible printer formatting until the point at which you print them? So surely it's not even a case of "even if you don't want to print it"?
sgarland
10 days ago
It’s because modern devs by and large have zero concept of latency differences. I had to explain to people yesterday why an on-disk temporary table in MySQL was slower than an in-memory one. “But the disk is an SSD,” was an actual rebuttal I got. Never mind the fact that this was Aurora, so the “disk” is a slice of (multiple) SSDs exposed over a SAN…
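For anyone who wants the rough numbers behind that: these are the classic order-of-magnitude latency figures (illustrative, not measurements from any particular system); the exact values vary by hardware, but the ratios are the point:

```python
# Rough, order-of-magnitude latencies; "but it's an SSD" is still ~1000x RAM,
# and a networked "disk" adds another order of magnitude on top of that.
LATENCY_NS = {
    "L1 cache reference": 1,
    "main memory reference": 100,
    "NVMe SSD random read": 100_000,                  # ~100 microseconds
    "round trip inside a datacenter": 500_000,        # ~0.5 ms
    "networked block storage (SAN/EBS-like)": 1_000_000,  # ~1 ms or more
}

ram = LATENCY_NS["main memory reference"]
for name, ns in LATENCY_NS.items():
    print(f"{name:40s} {ns:>12,} ns  (~{ns / ram:,.0f}x RAM)")
```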
osigurdson
10 days ago
Agree. I've heard people say things like "it is slow because it is not C++". When, in reality, the problem was I/O and N^2 algos.
zahlman
10 days ago
I've been trying to impress upon people from my own research in Python packaging: Pip is slow because it defaults to pre-compiling bytecode (and doesn't invoke multiprocessing for that, although this seems to be in the works from the discussion I've seen); imports literally hundreds of modules even when it ultimately does nothing; creates complex wrappers to set up for connecting to the Internet (and using SSH, of course) even if you tell it to install from a locally downloaded wheel directly; caches things in a way that simulates network sessions instead of just actually having the files... you get the idea.
"It's written in Python and not e.g. in Rust" is simply not relevant in that context.
(For that matter, when uv is asked to pre-compile, while it does some intelligent setup for multiprocessing, it still ultimately invokes the same bytecode compiler - which is part of the Python implementation itself, written in C unless you're using an alternative implementation like PyPy.)
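If anyone wants to see the import cost for themselves, CPython's `-X importtime` flag prints a per-module timing tree on stderr. A rough sketch of summarizing it for pip (assumes CPython 3.7+ and that pip is installed in the interpreter you run this with):

```python
# Run "pip --version" under -X importtime and list the slowest imports.
import subprocess
import sys

proc = subprocess.run(
    [sys.executable, "-X", "importtime", "-m", "pip", "--version"],
    capture_output=True,
    text=True,
)

# Each timing line on stderr looks like:
#   import time:   self [us] | cumulative | imported package
rows = []
for line in proc.stderr.splitlines():
    if not line.startswith("import time:"):
        continue
    parts = line.split("|")
    if len(parts) == 3 and parts[1].strip().isdigit():
        rows.append((int(parts[1].strip()), parts[2].strip()))

rows.sort(reverse=True)
print(f"{len(rows)} modules imported just to print pip's version")
for cumulative_us, module in rows[:10]:
    print(f"{cumulative_us:>10} us  {module}")
```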
osigurdson
10 days ago
There may be a bit of culture at play sometimes as well. If a language isn't meant to be fast, then perhaps devs using the language do not prioritize performance very much. For some, as long as it is possible to point to some externality ("hey this is Python, what do you expect") this is sufficient.
Of course, not always the case. C++ is a good counter example with a massive range of performance "orientedness". On the other hand, I suspect there are few Rust / Zig or C programmers that don't care about performance.
zahlman
10 days ago
To me, Python is more of a challenge than an excuse when it comes to performance.
On the flip side, I've seen quite a few C developers using their own hand-rolled linked lists where vector-like storage would be more appropriate, without giving it a second thought. Implementing good hash tables from scratch turns out not to be very much fun, either. I'm sure there are off the shelf solutions for that sort of thing, but `#include` and static compilation in C don't exactly encourage module reuse the same way that modern languages with package managers do (even considering all the unique issues with Python package management). For better or worse.
(For what it's worth, I worked in J2ME for phones in the North American market in the mid 00s, if you remember what those were like.)
__mharrison__
9 days ago
Do you have a real world example of a Python project that does selective import?
zahlman
9 days ago
Not sure what you mean, but I guess you're getting at something like "is it really pip's fault that it imports almost everything on almost every run?".
Well first off, pip itself does defer quite a few imports - just not in a way that really matters. Notably, if you use the `--python` flag, the initial run will only import some of the modules before it manages to hand off to the subprocess (which has to import those modules again). But that new pip process will end up importing a bunch more eventually anyway.
The thing is that this isn't just about where you put `import` statements (at the top, following style guidelines, vs. in a function to defer them and take full advantage of `sys.modules` caching). The real problem is with library dependencies, and their architecture.
If I use a library that provides the `foo` top-level package, and `import foo.bar.baz.quux`, at a minimum Python will need to load the `foo`, `foo.bar`, `foo.bar.baz` and `foo.bar.baz.quux` modules (generally, from the library's `foo/__init__.py`, `foo/bar/__init__.py`, `foo/bar/baz/__init__.py` and `foo/bar/baz/quux.py` respectively). But some libraries offer tons of parallel, stand-alone functionality, such that that's the end of it; others are interconnected, such that those modules will have a bunch of their own top-level `import`s, etc. There are even cases where library authors preemptively import unnecessary things in their `__init__.py` files just to simplify the import statements for the client code. That also happens for backward compatibility reasons (if a library reorganizes some functionality from `foo` into `foo.bar`, then `foo/__init__.py` might have `from . import bar` to avoid breaking existing code that does `import foo`... and then over time `bar` might grow a lot bigger).
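One pattern that avoids paying for those eager imports is a lazy `__init__.py` via module-level `__getattr__` (PEP 562, Python 3.7+). A minimal sketch with a made-up package, not any particular library:

```python
# foo/__init__.py -- hypothetical package, purely illustrative.
# Heavy submodules are imported the first time someone touches them,
# instead of eagerly from the package's __init__.
import importlib

_LAZY_SUBMODULES = {"bar", "plotting"}

def __getattr__(name):
    if name in _LAZY_SUBMODULES:
        module = importlib.import_module(f".{name}", __name__)
        globals()[name] = module  # cache so __getattr__ isn't hit again
        return module
    raise AttributeError(f"module {__name__!r} has no attribute {name!r}")

def __dir__():
    # Keep tab completion / dir() honest about the lazy names.
    return sorted(list(globals()) + list(_LAZY_SUBMODULES))
```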
For pip, rich (https://pypi.org/project/rich/) is a major culprit, from what I've seen. Pip uses little of its functionality (AFAIK, just for coloured text and progress bars) and it tends to import quite a bit preemptively (such as an emoji database). It's very much not designed with modularity or import speed in mind. (It also uses the really old-fashioned system of ad-hoc testing by putting some demo code in each module behind an `if __name__ == '__main__':` block - code that is only ever used if you do `python -m rich.whatever.submodule` from the command line, but has to be processed by everyone).
And yes, these things are slow even with Python's system for precompiling and caching bytecode. It's uniformly slow, without an obvious bottleneck - the problem is the amount of code, not (as far as I can tell) any specific thing in that code.
athenot
10 days ago
It's also a Product issue: "fast and snappy" is almost never on their list of priorities. So product managers will push devs for more features, which satisfy the roadmap, and only get concerned about speed when it reaches painful levels.
jMyles
10 days ago
> only get concerned about speed when it reaches painful levels.
...and by then, the requests for performance are somewhere between onerous and ridiculous.
I'm as wary of premature optimization as anyone, but I also have a healthy fear of future-proofed sluggishness.
jeltz
10 days ago
But is it actually faster by any significant amount on your workload? Did you benchmark? Temporary tables in databases rarely do disk I/O on any blocking code path; they mostly just dirty buffers in the OS or in the database itself, and something writes them to disk in the background. This limits throughput but does not add much extra latency in the common cases.
Edit: It still might be a bad idea to waste the I/O if you do not have to, but the latency of a temporary table is usually RAM latency, not disk latency, even for temporary tables on disk.
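A rough sketch of the kind of benchmark that would settle it (assumes a reachable MySQL instance and the PyMySQL package; host, credentials, and table/column names are made up):

```python
import time
import pymysql

conn = pymysql.connect(host="127.0.0.1", user="bench", password="bench", database="bench")

def bench(engine: str, rows: int = 100_000) -> float:
    # Create an explicit temporary table with the given engine, then time
    # a bulk insert plus an aggregate read over it.
    with conn.cursor() as cur:
        cur.execute("DROP TEMPORARY TABLE IF EXISTS t")
        cur.execute(
            f"CREATE TEMPORARY TABLE t (id INT PRIMARY KEY, v VARCHAR(64)) ENGINE={engine}"
        )
        start = time.perf_counter()
        cur.executemany(
            "INSERT INTO t VALUES (%s, %s)",
            [(i, f"row-{i}") for i in range(rows)],
        )
        cur.execute("SELECT COUNT(*), MAX(v) FROM t")
        cur.fetchall()
        return time.perf_counter() - start

for engine in ("MEMORY", "InnoDB"):
    print(engine, f"{bench(engine):.3f}s")

conn.close()
```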
sgarland
9 days ago
Didn’t have to, prod benchmarked it for us, twice.
If you’re curious, the EBS disks Aurora uses for temporary storage, when faced with a QD of approximately 240, can manage approximately 5000 IOPS. This was an r6i.32xlarge.
My hypothesis is currently that the massive context switching the CPU had to do to handle the interrupts slowed down its acceptance of new connections / processing other queries enough such that everything piled up. I’ve no idea what kind of core pinning / isolation AWS does under the hood, but CPU utilization from disk I/O alone, according to Enhanced Monitoring, was about 20%.
101008
10 days ago
That is what happens when your devs learned to code from an online course instead of going to college or getting a proper education. "Anyone can code!"
dns_snek
10 days ago
And I thought we were past this sort of gatekeeping and elitism. I've worked with people who had a master's degree in CS who couldn't code their way out of a wet paper bag. In my experience there's very little correlation between how someone learned programming and how deep their understanding of the entire stack is.
dbalatero
10 days ago
Yeah, deep understanding I think is a matter of how much time you spend investigating the craft and improving. Maybe the real question is: what motivates that to happen? Maybe it's school type, but maybe it's not.
My personal anecdotes, which are music centric but all apply to my software career:
1. I've studied music my whole life, and baked into music is the idea of continual practice & improvement. Because of this experiential input, I believe that I can always improve at things if I actively put a bit of energy into it and show up. I believe it because I've put in so many hours to it and have seen the results. This is deeply ingrained.
2. When I picked up bass in high school, I spent the first year learning tabs in my bedroom. It was ok, but my ability accelerated when I started a band with my friends and had to keep up. I really think the people you surround yourself with can: push you more, teach you things you didn't know, and make the process way more of a fun hang than doing it by yourself.
3. Another outcome from music education was learning that I really love how mastery feels. There's a lot of positive feeling from achieving and growing. As a result, I try to seek it out in other places as well. I imagine sports/etc are the same?
aleph_minus_one
10 days ago
> I've worked with people who had a master's degree in CS who couldn't code their way out of a wet paper bag.
"Programming" consists of an insanely large number of isolated communities. Assuming the respective person is capable, I would assume that he simply comes from a very different "programming culture".
I actually observe a closely related phenomenon in myself: the more I learn about some very complicated programming topics, the more "alien" the programming work I have to do at my job starts to feel to me.
bsrkf
10 days ago
Hope this doesn't come off as disrespectful, as if I don't believe you, but out of personal interest, would you consider expanding on that? I'd love to hear about the particular example you were thinking of, or in what ways self-taught coders surprised you compared to academically taught ones, if you've had experience working with both over a meaningful span of time. Also, if applicable, in what ways self-taught coders were/are maybe lacking on average.
If you've ever given answers to that in another comments on HN or elsewhere, feel free to link.
MattPalmer1086
10 days ago
It's certainly true in my experience. The main thing that makes a difference is simply how curious and interested you are.
Plenty of graduates simply got into it to make money, and have no real deep interest. Some of them love the theory but hate the practice. And some of them are good at both, of course.
By contrast, self taught people tend to have personal interest going for them. But I've also worked with self taught people who had no real understanding (despite their interest), and who were satisfied if something just worked. Even if they are motivated to know more, they are often lacking in deeper theoretical computer science (this is a gap I've had to gradually fill in myself).
Anyway, the determining factor is rarely exactly how they acquired their skills, it's the approach they take to the subject and personal motivation and curiosity.
bsrkf
10 days ago
Makes sense; out of all the potential differentiators, the source of skill attainment simply isn't necessarily the dominant one. Thanks for the answer :)
dns_snek
10 days ago
Not disrespectful at all. I agree with the sibling comments, I think what allows someone to become a great software developer, to have great intuition, and to understand systems and abstractions on a really deep level is their curiosity, hunger for knowledge, and a lot of experience.
There are many college educated software developers who have that sort of drive (or passion, if you will) and there are just as many who don't, it's not something college can teach you, and the same is true for self-taught developers.
At the end of the day "self-taught" is also a spectrum that ranges from people who created their first "hello world" React app 2 months ago to people who have been involved in systems programming since they were 10 years old, and have a wealth of knowledge in multiple related fields like web development, systems administration, and networking. That's why I think it's silly to generalize like that.
Software development is extremely broad so depending on the discipline self-taught developers might not be missing anything essential, or they might have to learn algorithms, discrete mathematics, linear algebra, or calculus on their own. I learned all of that in college but I'd probably have to learn most of it again if I really needed it.
bsrkf
10 days ago
Thanks for the answer, very nice of you to take the time even hours after.
Guess it makes sense; I'm self-taught myself, but thought academically taught developers would have a leg up in theory and mathematics. At the same time, when I considered further formal education for myself at one point (at least paid courses and such), I realized there isn't much I can't teach myself with the resources available (which include high-quality university lectures, free of charge).
Thanks for your perspective.
mbork_pl
8 days ago
Not the GP, but here is my N=1 (N=3, actually, as you'll see).
I have a Master's in Economics. After 3 years of economics, I started a Master's program in maths (back then the Master's degree was the standard thing you achieved after 4.5–5 years of studying in my country, there was basically nothing in-between high school and that). 9 years later I got a PhD in mathematical analysis (so not really close to CS). But I've been programming as a hobby since late 80s (Commodore 64 Basic, Logo, then QBasic, of course quite a bit of Turbo Pascal, and a tiny bit of C, too). I also read a bit (but not very much) about things like algos, data structures etc. Of course, a strong background in analysis gives one a pretty solid grasp of things like O(n) vs O(n log n) vs O(n^2) vs O(2^n). 9 years ago I started programming as a side job, and 2 years later I quit the uni.
I lack a lot of foundations – I know very little about networks, for example. But even in our small team of 5, I don't /need/ to know that much – if I have a problem, I can ask a teammate next desk (who actually studied CS).
Of course, _nemo iudex in causa sua_, but I think doing some stuff on the C64 and then Turbo Pascal gave me pretty solid feeling for what is going on under the hood. (I believe it's very probable that coding in BASIC on an 8-bit computer was objectively "closer to the bare metal" than contemporary C with all the compiler optimizations.) I would probably be able to implement the (in)famous linked list with eyes closed, and avoiding N+1 database queries is a natural thing for me to do (having grown up with a 1 MHz processor I tend to be more frugal with cycles than my younger friends). Recently I was tasked with rewriting a part of our system to optimize it to consume less memory (which is not necessarily an obvious thing in Node.js).
Another teammate (call them A) who joined us maybe 2 years ago is a civil engineer who decided to switch careers. They are mainly self-taught (well, with a bit of older-brother-taught), but they are a very intelligent person with a lot of curiosity and willingness to learn. I used to work with someone else (call them B) who had formal CS education (and wasn't even fresh out of a university, it was their second programming job, I think), but lacked general life experience (they were 5-10 years younger than A), curiosity and deep understanding, and I preferred working with A over B, hands down. For example, B was perfectly willing to accept rules of thumb as universal truths "because I was told it was good practice", without even trying to understand why, while A liked to know _why_ it was a good practice.
So – as you yourself noticed – how you acquire knowledge is not _that_ important. IMHO the most important advantage of having a formal CS education is that your knowledge is more likely (but not guaranteed!) to be much more comprehensive. That advantage can be offset by curiosity, willingness to learn, some healthy skepticism and age. And yes, I think that young age – as in, lack of general life experience – can be a disadvantage. B was willing to accept even stupid tasks at face value and just code their way through them (and then tear the code apart because it had some fundamental design problem). A, as a more mature person, instinctively (or maybe even consciously) saw when a task did not fit the business needs and sometimes was able to find another solution which was, for example, simpler/easier to code and at the same time satisfied the actual needs of the customer better.
sgarland
9 days ago
I’ve worked with both (I personally have an unrelated BS and an MS in SWE, which I used purely to get my foot in the door – it worked), and IMO if someone has a BS, not MS, there’s a decent chance they at least understand DS&A, probably took an OS course, etc.
That said, I have also worked with brilliant people who had no formal education in the subject whatsoever, they just really, really liked it.
I’m biased towards ops because that’s what I do and like, but at least in that field, the single biggest green flag I’ve found is whether or not someone has a homelab. People can cry all they want about “penalizing people for having hobbies outside of their job,” but it’s pretty obvious that if you spend more time doing something – even moreso if you enjoy it – you will learn it at a much faster rate than someone who only does it during business hours.
jeltz
10 days ago
Maybe you should skip bashing modern devs and people who learned from online courses when the parent poster is the one in the wrong. Unless InnoDB has a particularly bad implementation of disk-backed temporary tables, almost all disk I/O should be done in the background, so the latency cost should be small. I recommend benchmarking on your particular workload if you really want to know how big it is.
I am an old-school developer with a computer engineering degree but many of the old famous devs were self taught. Yes, if you learn how to code through online courses you will miss some important fundamentals but those are possible to learn later and I know several people who have.
sgarland
9 days ago
Please see this comment: https://news.ycombinator.com/item?id=43863885
We have excellent metrics between PMM and AWS Performance Insights / Enhanced Monitoring. I assure you, on-disk temp tables were to blame. To your point, though, MySQL 5.7 does have a sub-optimal implementation in that it kicks a lot of queries out to disk from memory because of the existence of a TEXT column, which is internally treated as a BLOB. Schema design is also partially to blame, since most of the tables were unnecessarily denormalized, and TEXT was often used where VARCHAR would have been more appropriate, but still.
StefanBatory
10 days ago
I wish. I did my undergrad, and am now doing my Master's, at a top-10 uni in Poland.
Trust me, some of the stuff I learned there was actively harmful. Half of the subjects were random filler, and so on.
I envy Americans so much with that, their CS education seems to be top notch.
jonhohle
10 days ago
hedora
10 days ago
That really should include "sync write 4KB to an SSD". People don't realize it's purely in-RAM on the disk these days (you pay a PCIe round trip, and that's about it).
sgarland
9 days ago
I do want to add that many of the devs I work with and have worked with are genuinely interested in learning this stuff, and I am thrilled by it. What bothers me is when they neither have an interest in learning it, nor in fixing their bad patterns, and leadership shrugs and tells infra teams to make it work.
mjmas
10 days ago
Opening any MS Office app with the network disconnected is almost instant...
deeThrow94
10 days ago
This is also a classic emacs mistake, so I wouldn't put it all on age.
qwerty456127
10 days ago
What I am glad to leave behind and forget like a nightmare is Windows local networking with shared folders etc. - it never worked nicely, and the last time anybody I know used it was pre-2010. Today we just use NextCloud, Matrix, email and Git for all our collaboration needs.
vladvasiliu
10 days ago
Who's "we"? I work for a company who drank the MS cool-aid, so running Windows on laptops, using Office365 for e-mail, word processing, spreadsheets, teams for chat, sharepoint / onedrive for shared ressources.
Have you tried launching a local app by typing in the start menu on a default win11 install with limited / slow internet access? Good times. How about doing some operation (say delete an e-mail) in one window of "new" outlook and having the others refresh?
I have never understood how some otherwise reasonable people are able to consider this absolute shitshow of a work environment good enough.
bluedino
10 days ago
A lifetime ago when I was doing MSP work, our law office clients were using the DOS versions of WordPerfect because the Windows version was too slow.
They refused to store files in directories and use good file names (although they were limited to 8.3), so they just scrolled through all their files until they found the right one. But they could open them so fast they didn't care.
In Windows you had to use the mouse, click three times, wait for the document to load and render... it was instant in DOS character mode.
layer8
10 days ago
I agree with the point about the immediacy of TUIs, but I use MS Office (and Windows in general) almost exclusively by keyboard, so the point about having to use a mouse isn’t completely accurate.
bluedino
9 days ago
Right, but you could just hit up/down to scroll through the files in the directory listing. Very simple
layer8
9 days ago
You can still do that? I mean, there are certainly places where the interaction happens to be more cumbersome than it is in a TUI, because the relevant piece was designed mouse-first. However, one reason I prefer Windows as a desktop UI is that it was designed to be fully operable by keyboard as well.
bluedino
9 days ago
You could see the files as you scrolled through them. I don't think you understood how it works; I didn't do a great job explaining it. You can see the content of the files instantly on the screen as you go through them, similar to how preview works on the Mac.
layer8
9 days ago
Ah, I see. Yeah, that’s more rare to come across on Windows for text documents.
psunavy03
9 days ago
The worst thing Office ever did was effectively kill WordPerfect, and I will die on this hill.
Reveal Codes was an order of magnitude more useful than whatever crap MS continues to pass off as formatting notation to this day 20+ years later, and that's before we even get into Word being over-helpful and getting ahead of you by doing things with autoformat you never wanted to have happen.
Yes, I know WordPerfect is still around, but fat chance being able to conveniently use it anymore.
SoftTalker
10 days ago
Yeah that era of Office, pre-ribbon, was pretty nice as Office goes.
Illotus
10 days ago
Ribbon was better for most people who didn't have all the shortcuts in muscle memory. It is much more discoverable.
SoftTalker
10 days ago
I find its discoverability is terrible. I am always hunting for what I want to do and it's never anywhere that seems to me to be sensible. I usually end up doing a google search for what I want. Perusing the ribbon takes me much more time than just looking at the various options under the old style menus.
Also traditional menus had some traditional standards. Once you learned what was under "File" or "View" or "Insert" or "Format" it was often pretty similar across applications.
CrimsonCape
9 days ago
Logically, users have to learn the name of a tool before they can form any sort of spatial association with it (which menu, which symbol, etc. the tool lives under).
There is no faster discoverability than O(log(N)) search using the letters of the name as a lookup.
The biggest failure of modern operating systems is failing to standardize this innate reality.
Windows, Linux, etc. should have: 1. a keyboard button to jump to a search box; 2. type 1-2 letters and hit Enter. Operating systems and applications should all have this kind of interface.
The most ironic apps have a ribbon named something like "Edit" but then the most used "edit" command lives in an unrelated ribbon.
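A minimal sketch of that kind of type-a-few-letters lookup over sorted command names (the command names are made up; `bisect` gives the O(log N) positioning mentioned above):

```python
import bisect

COMMANDS = sorted([
    "align left", "align right", "bold", "insert table",
    "page setup", "paste special", "print preview",
])

def lookup(prefix: str) -> list[str]:
    # Binary-search the sorted list for the range of names sharing the prefix.
    lo = bisect.bisect_left(COMMANDS, prefix)
    hi = bisect.bisect_right(COMMANDS, prefix + "\uffff")  # upper bound for the prefix
    return COMMANDS[lo:hi]

print(lookup("pa"))  # ['page setup', 'paste special']
print(lookup("al"))  # ['align left', 'align right']
```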
allears
10 days ago
Anecdotal evidence from myself: Although I've been using Word for many decades, I've never had much "muscle memory" in terms of accessing features. It was always a case of learning which pulldown menu held the required function.
When the accursed ribbon came along, "discoverability" went out the window. Functions were grouped according to some strange MS logic I could never understand, and it takes me twice as long to format a page as it used to. Now, I basically just use Word for text entry, and if I want an elegant format, I use a graphic design app like Illustrator.
Judging from what I've read online, you may be the only person who actually likes the ribbon.
mcswell
9 days ago
Hieroglyphics are the opposite of "discoverable". That's why they became uninterpretable for almost two thousand years, until the discovery of the Rosetta Stone. And even then it took considerable work to figure out how they functioned. In the Ribbon, in order to discover what some hieroglyph does, you have to mouse over it. Since there are lots of hieroglyphs there, that's a lot of mouse-over. And no, the Ribbon's images make no sense in 99% of the cases.
hedora
10 days ago
That might have been true for the first five minutes of using the software (assuming the person had not yet used a CUA application before the first time they used office). After that, it was strictly worse.
CUA ~= "standard menus + keyboard shortcuts for dos and windows": https://en.wikipedia.org/wiki/IBM_Common_User_Access
Illotus
10 days ago
Not really, it is much more discoverable for most people. If interested, the MS UI lead has a blog about a lot of the reasons for the ribbon and the research backing it: https://learn.microsoft.com/en-us/archive/blogs/jensenh
cowboylowrez
9 days ago
It's probably a case of UI discoverability vs. usability. Brand new users might discover features better with the ribbon, but as they keep using the product they transition into experienced users, and the UI needs to adapt to that. My experience is that the ribbon doesn't. It's tolerable, and for me that's enough, but I get the points about the ribbon. It's sort of like the new Reddit UI, though props to Reddit: they at least kept much of the old UI available for longer than I expected them to.
hedora
10 days ago
The problem with it was that it constantly moved the buttons around. So, you had to constantly rediscover it.
hedora
10 days ago
Sadly, none of the links I tried work anymore. (Though the conversation in the comments where they have to explain how to open a .ppt in PowerPoint is internet gold!)
I was hoping to figure out what led to design incompetence so spectacular that people would still be discussing it after 17 years.
I think there’s a clue in the abstract: The author claims they made 25,000 mock UI screenshots, but doesn’t mention user studies or even internally prototyping some of the concepts to see how they feel.
Illotus
9 days ago
Looks like the links in the posts no longer work, but all the posts are readable and he goes through the work they did and why. They did a lot of usability testing for the ribbon. Anyway, I have no horse in this race other than preferring the ribbon over 16x16 icons and menus, so there's no point hashing this out further.
layer8
10 days ago
The Ribbon is more difficult to visually grep for me than the classic menus. Not to mention that a number of functions are still hidden in mini-menus in the Ribbon.
It wouldn’t be so bad if keyboard navigation was as good as with the classic menus, but having to press the Alt key separately, and general increased latency, kills it.
sh34r
7 days ago
Ignoring the serious memory safety and security problems that were exploited to get acceptable performance on contemporary hardware, I’d point to shared document editing, automatic backups via OneDrive, and version history as rather significant features in that missing 5%.
I think rosy recollection is tainting your memory here. How often docs would get corrupted due to those aforementioned memory safety issues, even if that Win95/98 box was never connected to the internet.
Of course it’s gonna be super snappy. It was designed to run on any Doom-compatible hardware. Which includes some toasters now.
Edit: it’s also worth noting that 1997 was right around the time where Moore’s law succumbed to the laws of physics for improving single-core CPU performance via clock speed alone.
josephernest
10 days ago
I still use Office 2007 on my computer. Super super snappy, I think Word or Excel starts and finishes loading in 0.5 second after clicking the icon. It has 99% of the features I need compared to the newest Office version.
benjiro
9 days ago
Same... Office 2007 for the win. Frankly, I don't even use 90% of Office 2007's features, so why would I need more? For AI... no.
And the funny thing is, it barely uses any memory! It makes today's apps look like monsters. Even my music player uses 5x more memory while paused than freaking Excel with multiple sheets open.
ajolly
8 days ago
Do you have a good way to run multiple versions of Office? I do sometimes need some of the newer features, but I'd like to default to the older versions.
mcswell
9 days ago
And 100% of what you don't need--the Ribbon--was not in Word 97.
lizknope
9 days ago
I remember a laptop in the early 1990's that had Microsoft Office in ROM to load faster. I can't find a reference to it right now.
Lammy
9 days ago
HP Omnibook 300? https://www.hpmuseum.net/display_item.php?hw=123
`Omnibook300Intro-5091-6270-6pages-May93.pdf` https://www.hpmuseum.net/exhibit.php?hwdoc=123 sez:
> Built-in ROM applications don't use up disk space
- Microsoft Windows
- Microsoft Word for Windows
- Microsoft Excel
- Appointment Book
- Phone Book
- HP Calculator
- Microsoft File Manager
- LapLink Remote Access™
lizknope
9 days ago
That looks like it. I didn't have one but I remember the magazine ads for it.
gymbeaux
10 days ago
Man you can still get by with Photoshop CS3 as far as features go.
erkt
10 days ago
Lightroom 5 is miles better than what they offer today. Lightroom Creative Cloud is steaming dog shit. Adobe seriously gets to extort over $120 a year out of me simply for the privilege of reading raw files from a new camera. They provide zero positive contribution these days. All of these incumbent tech companies extract rents on algorithms written decades ago. This needs to end. I am very excited for AI to get advanced enough to whip up replacement programs for everything these companies maintain monopolies over.
mrguyorama
10 days ago
> I am very excited for AI to get advanced enough to whip up replacement programs for everything these companies maintain monopolies over.
You are wildly off base. The algorithms aren't difficult or special. They were written by people reading text books for the most part.
They are able to sit on an old algorithm for decades because the DMCA made interoperability and selling cheaper tools like that basically illegal.
Because of the DMCA, the work that was done to make IBM PC clones would have been close enough to illegal to kill all the businesses that tried it. They still tried with the liberal IP laws at the time.
erkt
9 days ago
Well, that's a bit disappointing, but I am aware that there are gaps in my knowledge. I had assumed it was the feasibility of competing with an entrenched firm, not lack of access to research texts. I will chat with Chat to learn more about how the DMCA applies, thank you.
gymbeaux
10 days ago
Back in those days it took 15 minutes for Windows to “finish” booting. You’d hit the desktop but the HDD was still going ham loading a dozen programs, each with their own splash screen to remind you of their existence.
toast0
10 days ago
If you've had the privilege of running Windows 10 on a spinning drive, it never gets to disk idle. Who knows what it's doing, but by that metric it never finishes. It probably never gets to disk idle on an SSD either, but SSDs have so much more I/O capacity that it isn't noticed.
Grazester
10 days ago
Ah, the ole splash screen. I remember in my high school days writing programs in VB, and of course they had to have some "cool" splash screen.
amiga386
10 days ago
"I bet somebody got a really nice bonus for that feature" https://devblogs.microsoft.com/oldnewthing/20061101-03/?p=29...
> The thing is, all of these bad features were probably justified by some manager somewhere because it’s the only way their feature would get noticed. They have to justify their salary by pushing all these stupid ideas in the user’s faces. “Hey, look at me! I’m so cool!” After all, when the boss asks, “So, what did you accomplish in the past six months,” a manager can’t say, “Um, a bunch of stuff you can’t see. It just works better.” They have to say, “Oh, check out this feature, and that icon, and this dialog box.” Even if it’s a stupid feature.
On the other hand, I very much enjoyed going to Excel 97 cell X97:L97, pressing Tab, holding Ctrl+Shift and clicking on the chart icon, because then you could play Excel's built-in flight simulator.
qwerty456127
10 days ago
> OpenOffice.org (predecessor of LibreOffice) copied this feature, which they called "QuickStarter"
It still does. Neither LibreOffice itself nor its installation process, with its choice of components, has changed seriously since the old days, and I'm very grateful for that. The QuickStarter isn't as relevant anymore now that we have fast SSDs, but some slow computers are still around, and it's great we still have the option.
agilob
10 days ago
This has always been the problem with Microsoft. Here is a rant from 2020 about Visual Studio https://www.youtube.com/watch?v=GC-0tCy4P1U with a live comparison of performance degradation while loading the same project.
silon42
10 days ago
I remember about 15 years ago running Windows + VS in VMWare because I could skip installing Office inside the VM and the system would run noticeably faster.
d_tr
10 days ago
I thought Windows had a generic subsystem for "warming up" frequently used apps for faster launches.
xquce
10 days ago
People don't use Office frequently, and then when they do, it's slow, which is a bad look. So vendors cheat in a way that prioritizes their own software; then everyone else does the same, and the feature loses all value, as every program launches at startup so as not to look "slow".
layer8
10 days ago
Only for OS components, I think?
mrguyorama
10 days ago
https://en.wikipedia.org/wiki/Windows_Vista_I/O_technologies...
SuperFetch was supposed to work with any app. It never seemed to have much effect IMO.
infinet
9 days ago
My workplace has two Windows 10 machines, one in daily use and the other mostly idle. Both are extremely slow. The one in daily use takes at least 15 minutes to become usable after boot; the other is not much better. They are domain-controlled, so there's not much we can do to improve them. In contrast, we still have a 20+ year old computer with an IDE hard drive, running Windows 2000 with MS Office to operate an instrument, and it is much more responsive.
plorg
10 days ago
Huh. Maybe it's because I haven't installed Office since the 2010 version, but I assumed OSA was still a thing. My work computer has O365 managed by IT and I could swear I've seen resident daemons (besides, say, Teams) running in the background.
araes
9 days ago
Equivalently useless back in 2004.
Notably, one solution to the current issues with modern Office is to use a copy of Office 97.
A 20+ page XLS uses ~7 MB and loads effectively instantly (on a frankly horribly performing laptop that can usually barely run the demos posted on HN).
lolinder
10 days ago
Yeah, I'm honestly trying to figure out why this is getting so much attention. Applications setting themselves to start up at login in order to avoid doing a bunch of work when they're asked for later is one of the oldest patterns in Windows. It's annoying to regularly have to go in and clear out new startup applications, but it's been part of maintaining a Windows machine for decades, and Microsoft Office is hardly the worst offender.
Why is HN suddenly so interested in Microsoft doing the same thing that has always been done by large, bloated app suites?
fredoliveira
10 days ago
> Why is HN suddenly so interested in Microsoft doing the same thing that has always been done by large, bloated app suites?
Probably because it is horrible? It's indicative of how we spend less time optimizing code than we do coming up with computationally expensive, inefficient workarounds.
Let's say a hypothetical Windows user spends 2% of their day using Office (I made that number up). Why should Office be partially loaded the other 98% of the time? How is it acceptable to use those resources?
When are we actually going to start using the new compute capabilities in our machines, rather than letting them get consumed by unoptimized, barely decent code?
vladvasiliu
10 days ago
> Let's say a hypothetical Windows user spends 2% of their day using Office
I don't know about a "hypothetical" user, but I'd bet a "mean" (corporate) user probably uses Office all day long. Hell, I've lost count of the number of e-mails I've seen with a screenshot inside a Word document for some reason, or the number of Excel files that are just a 5x4 table.
raincole
10 days ago
You're honestly trying to figure out why people want performant apps...?
lolinder
10 days ago
That's not at all what I asked.