amiga386
9 months ago
What's old is new again.
Microsoft introduced the "Office Startup Assistant" or "Office Startup Application" (osa.exe) in Office 97, it sat in the system tray and loaded the main office DLLs at startup: https://web.archive.org/web/20041214010329/http://support.mi...
OpenOffice.org (predecessor of LibreOffice) copied this feature, which they called "QuickStarter", I don't know exactly when, but no later than 2003: https://www.openoffice.org/documentation/setup_guide2/1.1.x/...
Microsoft made OSA non-default in Office 2007, and _removed_ it from Office 2010: https://learn.microsoft.com/en-us/previous-versions/office/o...
Are they now bringing it back?
barrkel
9 months ago
I came here looking for this. It's an old idea, from the days when spinning rust was the limiting factor - precache the binaries.
If you ever tried Office 97 on a PC of 10+ years later, it's amazing how fast and lightweight it was. Instant startup, super snappy. And those apps were not lacking in features. 95% of what you need out of a desktop word processor was in Word 97.
MisterTea
9 months ago
> from the days when spinning rust was the limiting factor
How did we get back to this though? We have gigabytes/sec with NVMe and stupid fast CPUs with at least 4 cores in even low-end models. Yet a text editor takes so long to load that we need to load it up on boot... Such a frustrating field to work in.
afavour
9 months ago
I know this is such a stereotypical "get off my lawn" statement but we've lost the art of software engineering. It's all about stuffing as many features in as quickly as we can and pushing it out to as many people as possible. Performance is always secondary.
Not that I'm that nostalgic for the old days, we would have been doing the exact same thing if we were able to get away with it. But performance restrictions meant you had no choice but to care. Modern tech has "freed" us from that concern.
trealira
9 months ago
Niklaus Wirth wrote about this in 1995, in his essay A Plea for Lean Software.
About 25 years ago, an interactive text editor could be designed with as little as 8,000 bytes of storage. (Modern program editors request 100 times that much). An operating system had to manage with 8,000 bytes, and a compiler had to fit into 32 Kbytes, whereas their modern descendants require megabytes. Has all this inflated software become any faster? On the contrary, were it not for a thousand times faster hardware, modern software would be utterly unusable.
https://www.computer.org/csdl/magazine/co/1995/02/r2064/13rR...
That said, as someone fairly young, I still don't think that makes it wrong or something only an old man would think. Software seems to perform exactly as well as it needs to and no more, which is why hardware advances don't make our computers run software much faster.
nextos
9 months ago
Aside from slowness, feature creep leads to poor quality, i.e. tons of bugs and user confusion with ever-changing graphical interfaces.
If software was simpler, we could afford to offer some formal guarantees of correctness. Model check protocols, verify pre and post conditions à la Dafny, etc.
There's too much change for the sake of change.
aylmao
9 months ago
> There's too much change for the sake of change.
+1 to this. Like a lot of issues, I think the root is ideological, but this one in particular very clearly manifests organizationally.
The companies building everyday software are ever bigger— full of software engineers, designers and various kinds of managers who are asked to justify their generous salaries. At an individual level I'm sure there's all sorts of cases, but at a general level there's often almost no other option but to introduce change for the sake of change.
etruong42
9 months ago
> Software seems to perform exactly as well as it needs to and no more
The cynical spin I would put on this idea is that software performs as poorly as it can get away with. MSFT is feeling the need/pressure to have Office load faster, and they will try to get away with preloading it.
Otherwise, there is a strong pull towards bloat that different people will try to take credit for as features even if the cumulative experience of all these "features" is actually a worse user-experience.
naikrovek
9 months ago
software authors that don't care about performance annoy me (and I am an old man.)
The amount of things a computer can do in a single thread is amazing, and computers now have a dozen or more threads to do work. If developers cared about performance, things would easily be 20x as performant as they are today.
I'm not talking about "write in assembly, duh" I'm talking about just doing things intelligently instead of naively. The developers I support often simply are not thinking about the problem they're solving and they solve the problem in the simplest way (for them) and not the simplest way for a computer.
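As a toy illustration of "simplest for the developer" versus "simplest for the computer" (a hypothetical Python sketch with arbitrary sizes, not from the thread):

```python
# The same membership-checking job, written the "simplest way for the
# developer" (scan a list per lookup) versus a way that suits the
# machine (hash lookup in a set). Sizes are arbitrary.
import time

items = list(range(5_000))
needles = list(range(0, 10_000, 2))  # half will be found, half won't

start = time.perf_counter()
hits_naive = sum(1 for n in needles if n in items)   # O(n) scan per lookup
naive_s = time.perf_counter() - start

lookup = set(items)
start = time.perf_counter()
hits_smart = sum(1 for n in needles if n in lookup)  # O(1) average per lookup
smart_s = time.perf_counter() - start

print(f"list: {naive_s:.4f}s  set: {smart_s:.4f}s")
```

Both versions give the same answer; only the machine's view of the work differs.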
Software is an inefficiency amplifier, because the number of developers for a piece of code is much smaller than the number of computers that run that code; how much coal has been burned solely because of shitty implementations? I'd wager that the answer is "a LOT!"
Even if you don't care about coal usage, think about how much happier your users would be if your application was suddenly 5x faster than it was previously? now think of how many customers want their software to be slow (outside of TheDailyWTF): zero.
languages like javascript and python remove you so much from the CPU and the cache that even if you were thinking of those things, you can't do anything about it. JS and Electron are great for developers, and horrible for users because of that amplification I described above.
I am dead tired of seeing hustle culture overtake everything in this field, and important things, to me, like quality and performance and support all fall straight down the toilet simply because executives want to release features faster.
things like copilot could help with this, i hope. presumably copilot will help introduce better code into applications than a daydreaming developer would, though the existence of vibe coding sort of nulls that out probably.
one thing that AI will do quite soon is increase the amount of software that exists quite dramatically. and I am kinda concerned about the possibility that it's all going to suck horribly.
ppenenko
9 months ago
I commiserate with your frustration with developers writing things suboptimally all too often. However, I disagree with the assumption that it's a JS/Python vs C issue.
Example: when VS Code came out, it was much, much faster, more responsive and stable than Visual Studio at the time. Despite being based on Electron, it apparently was much better on architecture, algorithms and multithreading than VS with its C++ and .NET legacy codebase. That really impressed me, as a C++ programmer.
Overall, it feels like folks who idealize bygone eras of computing didn't witness or have forgotten how slow Windows, VS, Office etc. used to feel in the 90s.
p_ing
9 months ago
> The amount of things a computer can do in a single thread are amazing, and computers now have a dozen or more threads to do work. If developers cared about performance, things would easily be 20x as performant as they are today.
Why? A good portion of programs are still single-threaded, and often that's the correct choice. Even in games a single-threaded main thread or logic thread may be the only choice. Where multi-threading makes sense it should be employed, but it's difficult to do well.
Otherwise, it's up to the OS to balance threads appropriately. All major OSes do this well today.
njarboe
9 months ago
My standard is that software should appear to work instantly to me, a human. Then it is fast enough. No pressing a button and waiting. That would be great.
autoexec
9 months ago
> languages like javascript and python remove you so much from the CPU and the cache that even if you were thinking of those things, you can't do anything about it.
Even operating systems don't get direct access to the hardware these days. Instead a bunch of SoC middlemen handle everything however they like.
ralphc
9 months ago
What's your proposal for a "compromise" language between programmer productivity and performance, especially for multiple threads and CPUs? Go, Rust, a BEAM language?
colonial
9 months ago
> presumably copilot will help introduce better code into applications than a daydreaming developer would
Copilot is trained on Github (and probably other Git forges w/o permission, because OpenAI and Microsoft are run by greedy sociopaths.)
I'd wager that the majority of fleshed out repositories on these sites contain projects written at the "too-high level" you describe. This certainly seems to be true based on how these models perform ("good" results for web development and scripting, awful results for C/++/Rust/assembly...) - so I wouldn't get your hopes up, unfortunately.
vacuity
9 months ago
I'm also young and heavily favor simple, quality software. Age is a highly coarse proxy for one's experiences, but in this case I think it has more to do with my personality. I have enough experience in computing that I don't think I'm making demands that are unrealistic, although they are certainly unrealistic if we maintain current incentives and motives.
joelwilliamson
9 months ago
“What Andy giveth, Bill taketh away”
mapt
9 months ago
I saw the writing on the wall when I had to install a couple 150MB IDEs to run 101-level Java programs in the mid 2000's. 150 megabytes. MEGABYTES. I could consume about 1 kilobyte per minute of fantasy novel text in uncompressed ASCII, call it 1/8th that fully compressed. That means this compressed binary you're handing me is around 1.2 million minutes of work (more if ingesting a novel is faster than writing/testing/debugging a program) for what is functionally a text editor, file manager, syntax library, compiler, and debugger. Pretty sure that got done in 150 kilobytes a generation earlier. A generation later, maybe it will be 150 gigabytes.
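For what it's worth, the back-of-envelope arithmetic comes out to roughly 1.2 million minutes under the commenter's own assumptions (1 KB of prose per minute, compressing to about 1/8 of that):

```python
# Checking the back-of-envelope numbers above (assumptions from the
# comment: a 150 MB IDE download, 1 KB/minute of uncompressed prose,
# compressing to roughly 1/8 of its size).
ide_kb = 150 * 1024               # 150 MB in KB
kb_per_minute_compressed = 1 / 8  # prose output rate, after compression

minutes = ide_kb / kb_per_minute_compressed
years = minutes / 60 / 24 / 365
print(f"{minutes:,.0f} minutes, about {years:.1f} years of nonstop typing")
```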
netsharc
9 months ago
I looked it up, you want an illegal taxi? 168 MB: https://apkcombo.com/uber/com.ubercab/
reaperducer
9 months ago
install a couple 150MB IDEs
Not Java, but an IDE in 4K: https://en.wikipedia.org/wiki/BASIC_Programming
Having used it quite extensively (Well, five solid days over two weeks, which is about 1000x longer than most people gargling on the internet), it's surprisingly capable.
Imagine if someone with the same talent and motivation was working on today's hardware.
<aside> Someone on Atari Age wrote a LISP for the same machine.
MisterTea
9 months ago
> I know this is such a stereotypical "get off my lawn" statement but we've lost the art of software engineering.
Indeed. I am sure many of us here are burnt out on bloat. I am also sure many of us want to move to smaller stuff but cant simply because of industry momentum. BUT that doesn't mean the dream is dead, only that we must work towards those goals on our own time. I found Plan 9 and haven't looked back. I can rebuild the entire OS in seconds on a fast machine. Even my little Celeron J1900 can rebuild the OS for several supported architectures in minutes. I can share a USB device seamlessly across my network, PXE booted from a single disk without installing anything. Cat(1) is just 36 lines of C.
There's still hope. Just ignore the industry hype noise and put in the effort ourselves.
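The entire contract of a cat in that spirit really does fit in a few dozen lines. A hypothetical Python rendering (not the actual Plan 9 C source):

```python
# A minimal cat in the Plan 9 spirit: copy each named file (or stdin)
# to stdout, byte for byte, and do nothing else. Hypothetical sketch,
# not the actual 36-line C source.
import shutil
import sys

def cat(paths, out=None):
    out = out if out is not None else sys.stdout.buffer
    if not paths:
        shutil.copyfileobj(sys.stdin.buffer, out)
        return
    for path in paths:
        with open(path, "rb") as f:
            shutil.copyfileobj(f, out)

# usage: python cat.py file1 file2 ...
if __name__ == "__main__" and sys.argv[1:]:
    cat(sys.argv[1:])
```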
ryandrake
9 months ago
And just when we think we can't make software any more inefficient, slow, and bloated, they release things like Electron, where you ship an entire browser with your app! And then when we think it can't even get worse, we have Docker and containers where we ship the entire OS with the application.
I'm looking forward to when app developers ship you an entire computer in the mail to run their text editor.
bloomca
9 months ago
The problem with Electron is that business-wise it is an excellent decision. You can get by with a few people to wrap the web app and integrate it with the OS, and then get updates pretty much for free.
Yet for the user it is bad -- bloated, slow, feels non-native, has specific bugs which are hard to address for the devs, etc.
I don't see any light for the desktop UI development unless there is some lightweight universal rendering engine. Tauri with WebView is somewhat promising, but it has problems on Linux and it is hard to target older systems.
LuciOfStars
9 months ago
Electron is horrid, but as a user, I prefer bloated "apps" to no support at all.
As for your second point: [1]
mbs159
9 months ago
> Cat(1) is just 36 lines of C.
Correct me if I'm wrong, but isn't it around 800 lines[1]?
1. https://github.com/coreutils/coreutils/blob/master/src/cat.c
gregates
9 months ago
I love that I work at a place (Row Zero) where caring about performance is baked into the culture, and I can spend days fixing weird perf edge cases our users discover without management asking why I'm wasting my time. And it's office software, no less!
zer00eyz
9 months ago
>> It's all about stuffing as many features in as quickly as we can...
The problem isn't "engineering" the problem is the culture of product management. (Note: NOT product managers per se).
I ask this basic question: how many Directors, VP's or CPO's do you know who got their job based on "cutting out unused features"? If you can find one, it will end up being the exception that proves the rule. The culture of "add", "new" and "shiny" doesn't reward keeping things lean and effective.
In the tangible world we look to accountants for this sort of thing (because features tend to have costs). Think cheap Costco hotdogs and free cookies at DoubleTree. No one in product, dev and accounting is going to sit down and try to justify losing some code, features and maybe a few customers to make it faster when you can just "engineer" your way out of it and not have to sell less is more.
zero_bias
9 months ago
> I ask this basic question, how many Directors, VP's or CPO's do you know who got their job based on "cutting out unused features"?
Google goes a step further and kills entire apps
LuciOfStars
9 months ago
Moment of silence for Play Music. YT Music isn't any less awful now than it was a decade ago.
bigfatkitten
9 months ago
Ford does. They look at connected vehicle telemetry to strip out features nobody uses in order to save software maintenance, hardware and third party IP licensing costs.
https://www.slashgear.com/1513242/ford-gets-rid-of-self-park...
lispisok
9 months ago
My personal theory is there is a threshold of performance. Below the threshold the experience is bad enough it affects revenue so getting the program up to speed becomes a priority. Above the threshold only features are prioritized to drive more revenue. That's why despite computers getting orders of magnitude faster computer programs seem to run about the same speed.
nativeit
9 months ago
I think the threshold is more about how much more rent can we seek to collect from users, and making things more performant or ergonomic doesn’t do anything to allow sales to add another 10% to the per-user subscription pricing (I assume this is a product with per-user subscriptions, even though it’s almost certainly unnecessary).
But adding yet another gateway to ChatGPT’s API…that’s a $15/mo/user add-on right there, and not just a monkey, but one of the slower intern monkeys, could write the code for such an incredibly innovative and, there’s no other word for it, powerful new feature that everyone should be thrilled to pay for at a 200-300% (or more!) markup. So guess who gets that VP of New Bullshit Development position, and guess what kind of choices get reinforced?
(EDIT: grammar)
acdha
9 months ago
How does someone get promoted at Microsoft? How do they avoid being seen as a low performer?
Performance just isn’t on that list, and improving it is often more work, and harder work, than a given new feature took to create. Office users are getting what Microsoft is designed to deliver.
ambicapter
9 months ago
I really don't think we've "lost it", I think performance has just not been a consideration in the engineering of Office for a long time, if ever.
trinsic2
9 months ago
> I know this is such a stereotypical "get off my lawn" statement but we've lost the art of software engineering.
Absolutely this. I think this is evidence that points to modern civilization starting to collapse. When we can't engineer correctly, we're fucked.
90s_dev
9 months ago
> we've lost the art of software engineering
Yes! This is what all my projects are geared towards restoring. The big one is not quite ready to announce yet, but I am very proud of it, and extremely excited to release it, to solve exactly that: it makes engineering fun again!
makapuf
9 months ago
Well that username matches
ysofunny
9 months ago
we don't have software engineering any more than the Romans had civil engineering
we now DO have civil engineering but that is it
gwbas1c
9 months ago
It was always like that
Salgat
9 months ago
It's a matter of resource allocation. Lowering your design requirements for performance can save significant developer cost. Also, Word in 2025 is doing a lot more under the hood than 97.
Workaccount2
9 months ago
I'll take the hate for this, but I have been using gemini to build narrow scope apps and they are extremely fucking fast compared to their bloated software package suite $200/user/month counterparts. It's amazing how fast and efficient programs can be when not trying to cover every use case for every possible user at every possible moment on top of a sea of tech debt programming.
While true LLMs fall flat on their face when fed massive codebases, the fact of the matter is that I don't need a 200k LOC program to accomplish a single task that an LLM can do in 2k LOC.
To give an example, we have a proprietary piece of software that is used to make (physical) product test systems using flow charts and menus. It's expansive and complex. But we don't need it when we can just spend 30 minutes prompting our way to working test code, and it produces way faster and more robust systems.
Maybe the devs of that software package cannot dump that whole codebase into an LLM and work on it. But they are completely missing the forest for the trees.
tharkun__
9 months ago
I will make this analogy:
Have a large ZIP file. Preferably like a few gigs and lots of (small) files.
Try to open it with the built-in Windows 11 tooling from Microsoft. It's going to be super slow to even show anything never mind unpack it.
Now install, say, 7-Zip and do the exact same things: opening is instant, and unpacking takes a much, much smaller amount of time (limited only by disk speed).
Turns out optimizations / not doing stupid things is still a thing even with all this raw power we now have.
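The asymmetry described above is easy to demonstrate: a ZIP's entry names live in a central directory at the end of the archive, so listing it requires no decompression at all and should be near-instant regardless of archive size. A hypothetical Python sketch:

```python
# Listing a ZIP only requires reading the central directory at the end
# of the archive, not decompressing anything, so it should be fast even
# for archives containing lots of small files.
import io
import time
import zipfile

# Build an in-memory archive of 5,000 small files.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
    for i in range(5_000):
        zf.writestr(f"file_{i:05d}.txt", "x" * 256)

start = time.perf_counter()
with zipfile.ZipFile(buf) as zf:
    names = zf.namelist()  # central directory read only
listing_s = time.perf_counter() - start

print(f"listed {len(names)} entries in {listing_s * 1000:.1f} ms")
```

A tool that is slow at this step is doing extra work somewhere, not fighting the format.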
jlarocco
9 months ago
Because an entire generation of developers and their managers believe the hardware is so fast there's no point trying to optimize the software.
Besides, the only thing that matters is getting tickets "done" before the arbitrary sprint deadline in 2 weeks, so best not to spend any extra time cleaning up or optimizing the first thing that works. You can't think about performance until the sprint dedicated to performance.
boringg
9 months ago
100%. The thought process is: why waste internal resources on speeding up software when the user has enough hardware to manage the workload?
codr7
9 months ago
Battery use is a pretty big concern these days; also, some users like running several things at the same time.
hunter2_
9 months ago
For the local OS and sustained workloads like video playback, yes, battery optimization is huge. For an individual app with bursty compute, less so, plus some of that inefficient code can run in the cloud instead, which is costly, but premium subscriptions can pay for it, and power plants are now colocated with data centers so power transmission cost is negligible. The incentive to be efficient is insufficient.
prussian
9 months ago
I think people forget that some of this software may be relatively fast. The problem is, most corporate environments are loaded up with EDRs and other strange anti-malware software that impede quick startup or speedy library calls. I've seen a misconfigured Forcepoint EDR rule block a window for 5 seconds on copy and paste from Chrome to Word.
Another example: it takes ~2 seconds to run git on my work machine
(Measure-Command { git status | Out-Null }).TotalSeconds
while running the same command on my personal Windows 11 virtual machine is near instant: ~0.1 seconds.
Still slower than Linux, but not nearly as bad as my work machine.
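A hypothetical cross-platform equivalent of that measurement, for anyone without PowerShell handy:

```python
# Rough cross-platform version of the Measure-Command check above:
# wall-clock a command end to end, EDR hooks and all. Best-of-N runs
# to smooth out warm-up noise.
import subprocess
import time

def time_command(argv, runs=3):
    best = float("inf")
    for _ in range(runs):
        start = time.perf_counter()
        subprocess.run(argv, capture_output=True)
        best = min(best, time.perf_counter() - start)
    return best

# e.g. on the suspect machine:
# print(time_command(["git", "status"]))
```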
bigmattystyles
9 months ago
Telemetry, syncing to the cloud by default…
dagmx
9 months ago
Neither of which contribute significantly to size though. The size aspect is what these new preloaders would help with.
2OEH8eoCRo0
9 months ago
We stopped developing for users/customers and instead added layers to make developer lives easier.
Why the hell are all my desktop apps written in JS now?!
cogman10
9 months ago
> Why the hell are all my desktop apps written in JS now?!
Have you seen the state of pretty much every non-js UX framework?
That's why.
JS/css/html won the UX domain in a way that no other language comes close to. If you look at the newest, most modern UX frameworks, they are often just half-implemented, poor mimics of the js/css/html stack with approximately 0 devs writing new 3rd-party extensions.
IntelliJ uses Swing, SWING, as its UX. A framework written in the 90s, filled with warts. Yet it's still a better experience than the more recent JavaFX. Swing simply has more support.
ezst
9 months ago
Call me an idiot, but I still gladly take Swing and javafx over JS and monstrosities like react. The state of Qt is also very good. Web won because the distribution model is easier on the user, and because managers thought UX designers would be making whole apps now, saving on rates. Not because it's technically superior.
cogman10
9 months ago
You're not an idiot for liking the Swing/javafx/QT way of doing things. Or even for thinking they are technically superior.
The bigger issue isn't the tech, it's the ecosystem. While you might like swing, you simply are never going to find the swing version of Material UI or D3.js. That's more the problem that you'll run into.
For some of our apps because we need charting, we are using GraalJS just to run the JS charting library to export to an image that we ultimately put on some of our downloadable reports. It's a huge pain but really the only way to do that.
baobun
9 months ago
It's quite something that there hasn't been any real successor living up to Delphi or even goddamn Visual Basic for modern desktops.
stcroixx
9 months ago
Skilled programmers working on boring stuff like office. Most programmers today don't have the skills they think they do and would find working on something like Office boring.
layer8
9 months ago
Ironically, at the start of my career working on something like Office was my dream, and it would actually still be. I reserve the right to change my mind once I’ve seen the code base, though. ;)
bradley13
9 months ago
Cruft built on frameworks using libraries with a zillion dependencies, some of which are cruft built on frameworks...
PJDK
9 months ago
So, we often look back on the old days with rose tinted glasses. But let me recount my IT classes from the 90s.
We'd sometimes go to the library to write something up in MS Word. We always liked this because it would be a good 5-10 mins to boot up some kind of basic Unix menu. You'd then select windows 3.1 and wait another 10-15 minutes for that to load. Then you could fire up word and wait another 5 minutes. Then you could do 5 minutes work before the class was over!
sh34r
9 months ago
Well, we networked all the computers together around that time, and it turned out that all the 1337 performance hacking that people did back then had severe unintended consequences. You’re talking about an era in which the NX bit would not be utilized by Windows for another ~7 years. “Smashing the Stack for Fun and Profit” was contemporary work.
It’s not rocket science to eke out oodles of performance out of a potato if you don’t care about correctness or memory safety.
Word 97 will only delight you if you use it on an airgapped computer, as a glorified typewriter, never open anyone else’s documents with it, and are diligent about manually saving the doc to multiple places for when it inevitably self-corrupts.
But at that point, why not be like GRRM and write on a DOS word processor? Those were historically a lot more reliable than these second-generation GUI apps.
layer8
9 months ago
> How did we get back to this though?
By piling up nonzero-cost abstractions left and right.
wk_end
9 months ago
And it's easy to understand how we get into that trap. Each one of those abstractions is very low-cost, so it seems harmless.
And getting out of the trap is hard too, because no single abstraction is to blame - you can't just hit things with your profiler and find the hot spot. It's all of them. So now you either live with it or rewrite an entire stack of abstractions.
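A toy illustration of that flat-profile trap (hypothetical Python, arbitrary layer count): each wrapper is individually negligible, yet the stack of them dominates, and no single frame looks hot.

```python
# Thirty wrappers, each individually cheap, with the total cost smeared
# evenly across all of them so no single layer shows up as a hot spot
# in a profiler.
import time

def base(x):
    return x + 1

def wrap(fn):
    def layer(x):
        args = (x,)        # a little packaging...
        return fn(*args)   # ...a little dispatch, harmless on its own
    return layer

layered = base
for _ in range(30):        # thirty "low-cost" abstractions
    layered = wrap(layered)

N = 200_000

start = time.perf_counter()
for i in range(N):
    base(i)
bare_s = time.perf_counter() - start

start = time.perf_counter()
for i in range(N):
    layered(i)
layered_s = time.perf_counter() - start

print(f"bare: {bare_s:.3f}s  30 layers: {layered_s:.3f}s")
```

Profiling the layered version shows the time split into thirty identical slivers, which is exactly why "find the hot spot" stops working.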
lukan
9 months ago
"How did we get back to this though?"
Probably because windows needs to make a connection for every file somewhere else first and wait for the reply, before granting you the advanced software as a service feature called text editing.
It definitely feels like this at times and I fear there is too much truth in my statement.
But it is not just windows only. My old chromebook took seconds to open a folder in the file browser (even if it was already open). But a "ls" on the terminal was instant for any folder. So getting the information was not the problem. But from there to displaying it in a GUI, there seems to be myriads of important (tracking?) layers involved.
7657364574
9 months ago
My guess is that we're already seeing the consequences of "AI-assisted programming". Just yesterday, Microsoft's CEO revealed that 30% of their code is written by AI.
drjasonharrison
9 months ago
Given the game of telephone that would have had to occur for that 30% figure to travel from developers up to the CEO, it's probably including things like autocomplete...
The Plan

In the beginning, there was a plan,
And then came the assumptions,
And the assumptions were without form,
And the plan without substance,

And the darkness was upon the face of the workers,
And they spoke among themselves saying,
"It is a crock of shit and it stinks."

And the workers went unto their Supervisors and said,
"It is a pile of dung, and we cannot live with the smell."

And the Supervisors went unto their Managers saying,
"It is a container of excrement, and it is very strong,
Such that none may abide by it."

And the Managers went unto their Directors saying,
"It is a vessel of fertilizer, and none may abide by its strength."

And the Directors spoke among themselves saying to one another,
"It contains that which aids plant growth, and it is very strong."

And the Directors went to the Vice Presidents saying unto them,
"It promotes growth, and it is very powerful."

And the Vice Presidents went to the President, saying unto him,
"This new plan will actively promote the growth and vigor
Of the company
With very powerful effects."

And the President looked upon the Plan
And saw that it was good,
And the Plan became Policy.

And this, my friend, is how shit happens.
from anonymous email
Dylan16807
9 months ago
Interesting that this story is not really a game of telephone, but instead a single layer replaced the meaning on purpose.
codr7
9 months ago
By software, most of it is probably generated scaffolding.
miunau
9 months ago
There are no financial or career incentives to optimise existing, working things in these institutions. The structures which these large companies use are for people working on new things, so that's what we get. Making office faster doesn't create headlines, but a new product that does the same thing somehow will.
causality0
9 months ago
Ever notice how Windows 7 and 10 and 11 have basically the same features and benchmark the same on performance tests yet 10 and especially 11 completely shit the bed if you try to run them off a hard drive? Like they might all boot in twenty seconds from an SSD but booting from a hard disk might take W7 two minutes and W11 ten minutes to a stable desktop.
intelVISA
9 months ago
Software 'engineering' is too abstract, yet imagine the outrage if every new highway had a 20km/h limit...
PeterStuer
9 months ago
In the old days there was the saying 'What Intel Giveth, Microsoft Taketh Away'
skywhopper
9 months ago
How much engineering time do you think is spent optimizing startup time on most modern editors? I’m guessing next to nothing.
marcod
9 months ago
"eight megabytes and constantly swapping"
Cthulhu_
9 months ago
I wish companies would go back to building fast apps, but alas. Everything is feature-packed and requires you to download half the internet.
My intellij license just expired so today I'm back using Sublime Text, and honestly it's a breath of fresh air / relief - and it's not even the fastest editor, iirc it uses Python under the hood. I've installed Zed but getting plugins and keyboard shortcuts all lined up is always challenging. That one took ~2-3 seconds to cold start.
qwerty456127
9 months ago
> I wish companies would go back to building fast apps
It seems fascinating how much more efficient Windows apps were back in the nineties, capable of doing almost everything today's apps do, in a similar manner, on orders of magnitude less powerful hardware, often performing even faster.
The last time I expressed this, probably also here, somebody suggested the performance drop is the cost of modern security - vulnerability mitigations, cryptography etc.
dbalatero
9 months ago
I think the performance drop probably has more to do with managers & product folks expecting growth and features at all costs at the expense of keeping performance at some baseline.
I also wonder if it's just harder to continually monitor performance in a way that alerts a team early enough to deal with regressions?
That said, security _can_ impact performance! I work on Stripe frontend surfaces, and one performance bottleneck we have comes from needing to use iframes to sandbox and isolate code for security. Having to load code in iframes adds an additional network load before we can get to page render.
zerkten
9 months ago
I think you need to add that many more developers, or even teams of developers, are building these apps. They have their own fiefdoms and it's less common for devs to have a complete understanding of the code base.
Over time decisions are made independently by devs/teams which cause the code bases to get out of alignment with a performant design. This is exacerbated by the growth pressure. It's then really hard for someone to come in and optimize top to bottom because there is everything from a bug to a design problem. Remediation has significant overhead, so only things around the edges are touched.
Fast forward a couple of years and you have a code base that devs struggle to evolve and add features to as well as keep performant. The causes are many and we come to the same one whether we complain about performance or maintainability. You probably don't feel this way, but Stripe and other premier engineering companies are way ahead of others in terms of their practices when you compare with the average situation developers are facing.
Independent mobile development is often where I see most attention to performance these days. The situation for these devs is a little bit closer to what existed in the nineties. They have a bigger span of control and performance is something they feel directly so are incentivized to ensure it's great.
apricot
9 months ago
> It seems fascinating how much more efficient Windows apps were back in the nineties
I remember everyone complaining about how bloated they were at the time. Pretty sure someone in 2055 is going to run today's Office on 2055 computers and marvel at how streamlined it is.
Illotus
9 months ago
My recollection is completely different, software was really slow on contemporary PCs in the 90s. Spinning disks, single core cpus, lot more swapping due to memory being so much more expensive.
barrkel
9 months ago
Contemporary software was slow. You could tell you should consider more RAM when the HDD light and chugging noises told you it was swapping. But if you ran the same software with the benefit of 10 years of hardware improvement, it was not slow at all.
hedora
9 months ago
This might match your recollection. x86 Win95 raytracing in javascript on my arm laptop is usable, but sort of slow:
https://copy.sh/v86/?profile=windows95
Once it boots, run povray, then click run.
It took over 2 minutes to render biscuit.pov, though it did manage to use SSE!
reaperducer
9 months ago
It took over 2 minutes to render biscuit.pov
We used to wait two hours for a mandelbrot to display on a Commodore 64, and were delighted when it did.
spookie
9 months ago
Nah, it's about favouring bad programming practices, not thinking about the architecture of the software and giving developer experience a bigger role than end user experience. All these stemming from a push to get to market earlier or making employees replaceable.
taeric
9 months ago
I'd guess less on "bad programming practices" and more "prioritizing development speed?" Mostly inline with your next point. We value getting things released and do not have a solid process for optimizing without adding new things.
Ironically, it seems it was specifically the longer feedback cycle of long builds and scheduled releases that gave us better software?
spookie
9 months ago
Fair, I just think there is a huge overlap between bad practice and speed of development. The latter fuels the former in many ways.
ralphc
9 months ago
How much modern security do I need to write & print a term paper or church bulletin on my own computer?
qwerty456127
9 months ago
This depends on whether you want to exchange data with other computers using even removable media, let alone the Internet. Also whether you want to use Unicode. In case you only want to hand-type, edit and print a plain English paper right away by dumping plaintext/PostScript/PCL to LPT you probably are fine with any computer you can find. It's just nobody is using standalone computers anymore, almost every modern computer is a network device.
toast0
9 months ago
I mean, as mentioned upthread, Office 97 (and Office 95 before it) was slow to load, so slow that they added the start up accelerator.
You can run Office 97 now and it'll start fast because disk i/o and cpus are so much faster now. Otoh Excel 97 has a maximum of 256 columns and 64k rows. You might want to try Excel 2007 for 16k columns and 1M rows, and/or Excel 2016 for 64-bit memory.
spitfire
9 months ago
The 90s was a time when computers were doubling in speed every 18 months. I remember Office 97* being lightning fast on a 366 MHz Celeron - a cheap chip in 1998.
You could build fast software today by simply adopting a reference platform - say, a 10-year-old 4-core system - then measuring performance there. If it lags, do whatever work needs to be done to speed it up.
Personally I think we should all adopt the raspberry pi zero as a reference platform.
Edit: * office 2000 was fast too with 32 megs of ram. Seriously what have we done?
hedora
9 months ago
Oddly, I remember timing it, and the startup accelerator didn't seem to speed up Office start time. It slowed everything else down though.
I rebooted a lot though (mostly used Linux by then), so maybe it was just fighting with itself when I immediately fired up office after boot.
kyrra
9 months ago
If you like fast apps, maybe check out FilePilot, a Windows explorer alternative.
It's amazingly fast, though it's missing some features and will be really expensive when it leaves beta.
QuicklyNow
9 months ago
I made an account to thank you for this. I've been looking for a _fast_ alternative to explorer since Windows XP. But one that doesn't require a change in workflow. This is the fastest I've tried by far. I've only been using it for 5 minutes, but I'm sold. Earlybird discount too!
Thank you for posting this, and if you have any other speedy apps you'd recommend I'd welcome suggestions. My top suggestions are SpeedCrunch [0] (calculator app) and Everything [1] file search combined with Listary [2]
[0] https://github.com/ruphy/speedcrunch
[1] https://www.voidtools.com/
(For reference, I've tried Total Commander, DOpus, Files, Explorer XP, XY Explorer, Explorer ++, FreeCommander, Double Commander, Q-Dir)
jpsouth
9 months ago
Everything (the tool) is ridiculously fast, I’ve used it for quite a while now and it’s nice to see it mentioned here.
kyrra
9 months ago
I'll check these out, thanks!
I learned about File Pilot (whose author posts here: https://x.com/vkrajacic) from Casey Muratori (https://x.com/cmuratori) who pushed it a bunch because he loves fast things.
ajolly
9 months ago
Listary has slowed down tremendously ever since they included search. For a launcher I'm still using Find and Run Robot, which truly is a 90s-era piece of software but works blazingly fast. I do have a plug-in to tie it into Everything search.
zerkten
9 months ago
Have you tried xplorer²? I only know about it because I was into Windows programming using the WTL eons ago.
qwerty456127
9 months ago
In my opinion Total Commander has always been the ideal (and fast) file management tool since Windows 3.x. It was named Windows Commander back in the day, and even as Total Commander it still supports Windows 3.x.
vlachen
9 months ago
I never knew it was a Windows program. I've been using it on my Android phones for years.
Oleksa_dr
9 months ago
All of these ‘fast’ file managers have a big problem: they don't hook into the system file open/save dialogs.
Mostly, users interact with Explorer in exactly that scenario - opening/saving a file in ‘BrowserOS’.
zerkten
9 months ago
>> they don't support system calls to dialog windows.
It's a little unclear what you mean exactly. Do you want the browsing experience changed for the system's file open/save dialogs? i.e. a third-party file explorer opens instead with all of its features.
gymbeaux
9 months ago
Do you know how it compares to Dolphin for interacting with very large (100k+ files) directories? Dolphin is the only file manager I’ve found that keeps up with the large directories- GNOME (Nautilus) and Windows Explorer are dogshit slow, even after the file list populates. macOS Finder is somewhere in the middle but still very slow.
Lammy
9 months ago
Check out ROX-Filer https://github.com/rox-desktop/rox-filer
worthless-trash
9 months ago
When you have 100k+ files sometimes the filesystem itself matters. Have you set your expectations appropriately, aka compared it to a raw ls/dir ?
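For what it's worth, how the file manager reads the directory matters too. A small Python sketch (timings are machine-dependent): `os.scandir` returns each entry's name and type from the directory entries themselves, while `listdir` plus a per-file check pays an extra syscall per entry - a gap that compounds at 100k+ files:

```python
import os
import shutil
import tempfile
import time

# Build a throwaway directory with a few thousand files.
d = tempfile.mkdtemp()
for i in range(2000):
    open(os.path.join(d, f"f{i}"), "w").close()

# listdir + per-file stat: one extra syscall for every entry
t0 = time.perf_counter()
slow = [(n, os.path.isdir(os.path.join(d, n))) for n in os.listdir(d)]
t_listdir = time.perf_counter() - t0

# scandir: name and type come from the directory entry itself
t0 = time.perf_counter()
fast = [(e.name, e.is_dir()) for e in os.scandir(d)]
t_scandir = time.perf_counter() - t0

print(f"listdir+stat: {t_listdir:.4f}s  scandir: {t_scandir:.4f}s")
shutil.rmtree(d)
```

Both produce the same listing; the difference is purely how many trips into the kernel it takes.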
aloisdg
9 months ago
I like Dolphin but pcmanfm is faster
ct0
9 months ago
how does it compare to directory opus?
jacurtis
9 months ago
I actually use IntelliJ and Sublime interchangeably throughout the day. I default to sublime for most work because of how snappy fast it is. I only load up Intellij when I need to do something that leverages its full IDE capabilities. To second your comment, I encourage people to try Sublime, you will be shocked at how fast it feels.
I still love IntelliJ, it's a great product. But it is slow, bloats the computer, and needs constant updating. At least it's incredibly powerful as a tradeoff for the bloat.
The Office debate is slightly different. It is slow, bloats the computer, needs constant updating. But unlike IntelliJ, I don't feel there is any advantage to all that added weight. We are using a word processor. We type words onto a blank screen. Why are Word and Outlook the most common applications to crash my computer, which has an M1 Max chip with 48 GB of memory? I can literally run and build AI models on my computer with no problem, but boot up Microsoft Word to type words onto a blank page and you have a 33% chance of crashing the computer.
Google Sheets and Docs are actually better tools in my opinion for most people. 99% or more of the features you need are available in the google equivalents. These products are fast and run in a browser! The UI is actually VASTLY superior to Microsoft's ribbon. I still can't find stuff that I need on a routine basis and have to google it. I don't have this problem when using Google's office equivalents.
ben-schaaf
9 months ago
> and it's not even the fastest editor, iirc it uses Python under the hood
The majority of the app is written in C++. Python is used for plugins.
mdhb
9 months ago
Instead they decided to build it with React if I remember correctly which is truly one of the fucking dumbest ideas I’ve ever heard of.
tremon
9 months ago
I don't think these apps were ever built to be fast, they were built with the resource constraints of the time. Moore's law is what's making these apps fast.
rollcat
9 months ago
Intel giveth, Microsoft taketh.
This is even more sad with Apple. My M1 Mac felt incredibly snappy with Big Sur, but is getting ever so slightly slower with each update. My iPhone SE2 is a complete disaster.
mleo
9 months ago
They used no networking services either. Now, I open a shared PowerPoint and am stuck waiting for a couple of minutes while it is syncing or doing who knows what. People have no sense of what templates they copy from other documents causing the size of the file and load times to bloat.
formerly_proven
9 months ago
Word 2000 definitely was not very quick on a contemporary office PC (Pentium II or III), though I'm pretty certain it was much, much faster than desktop O365 is on a contemporary office PC today, despite those being >100x faster. So a Fermi estimate would put modern Office at probably at least 1000x the resources of Office 2000.
citizenpaul
9 months ago
>I wish companies would go back to building fast apps
My prediction is that we are about to enter a great winter of poor performance across the industry as AI slop is put into production en masse. Along with bugginess that we have not seen since the early dotcom days.
bell-cot
9 months ago
> I wish companies would go back to building fast apps...
Similar to the slow-opening glove box in a car, many humans perceive slow computers & software as signifiers of importance/luxury/prestige. At least at first. By the time they are familiar with the slow software, and impatient to just get their work done - too late, SnailSoft already has their money.
jonhohle
9 months ago
I always think the same thing. 486s could run real-time spell check and do WYSIWYG layouts, and came on floppy disks. Now we have screen recording apps that require 256MB downloads every 5 minutes (yesterday's story).
I have a small utility app that I sell and take great pains to keep it small and resource-light. I really appreciate when other devs do the same.
hedora
9 months ago
486s were way past the point when that became practical. I remember using Prodigy on a 286. Screenshots are rare, since you can't fire it up in an emulator unless you have a modem that can make phone calls to the 1980s. Also, machines from back then didn't really have room to store screenshots:
https://www.vintagecomputing.com/index.php/archives/1063
Their service used vector art to render interactive pages like that over <= 2400 baud modems. Other than it being proprietary and stuff, I'm guessing the web would be a much cooler place if HTML hadn't eaten their lunch. SVG is a decent consolation prize, I guess.
ralphc
9 months ago
Let me point you to Prodigy Reloaded, https://github.com/ProdigyReloaded. We're reviving the Prodigy server and as many cache files as we can find, using Elixir as the backend language.
It's not merged yet but I've written an Elixir library that writes graphics files in Prodigy's graphics format, NAPLPS. I'm using it to get current weather and creating weather maps that are displayed in Prodigy. https://github.com/rrcook/naplps_writer
You can run Prodigy in DOSBox and get screenshots.
donny2018
9 months ago
Where is that guy who coded RollerCoaster Tycoon in Assembly?
alliao
9 months ago
some of the modern software is slower because someone made a poor decision to fetch something from the network on the critical path of main functionality... kills me
amiga386
9 months ago
Example: https://old.reddit.com/r/libreoffice/comments/upf8nw/fixed_o...
Opening a spreadsheet, even if you don't want to print it, will hang for 30 seconds doing nothing, because LibreOffice will load the printer settings for the document, which means asking the printer, which if the printer is network-based and turned off, means a 30 second wait for a timeout.
Reported in 2011, still not fixed: https://bugs.documentfoundation.org/show_bug.cgi?id=42673
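The fix pattern here is old and simple: never let a blocking network query sit on the UI's critical path. A minimal Python sketch of the idea, where `query_printer_settings` is a hypothetical stand-in for the blocking OS print API (here it simulates a powered-off network printer with a 2-second sleep):

```python
import concurrent.futures
import time

# Hypothetical stand-in for the blocking OS printer query; a real
# powered-off network printer would block for ~30 seconds.
def query_printer_settings():
    time.sleep(2.0)
    return {"paper": "A4"}

CACHED_DEFAULTS = {"paper": "Letter"}

_pool = concurrent.futures.ThreadPoolExecutor(max_workers=1)

def settings_for_layout(timeout=0.2):
    """Ask the printer, but never stall longer than `timeout` seconds;
    fall back to cached defaults and let the real answer arrive later."""
    future = _pool.submit(query_printer_settings)
    try:
        return future.result(timeout=timeout)
    except concurrent.futures.TimeoutError:
        return CACHED_DEFAULTS  # render now, refresh layout if/when it answers
```

With something like this, a 30-second spooler timeout becomes a 200 ms worst case at open, and the document can re-layout in the background once the printer finally responds (or doesn't).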
ramshanker
9 months ago
This... 100 times. Every day it wastes a few minutes of my time. A 5-figure commercial application, state of the art in its domain, but somehow it has to wait for that unresponsive network printer and hang the startup UI.
HideousKojima
9 months ago
Best workaround (still ridiculous that it hasn't been fixed) is to set a PDF printer as your default printer.
hedora
9 months ago
I frequently have to do this for a different reason on MacOS/iOS. Printing directly sends bad PCL to my printer and makes it spew garbage pages.
"Print to PDF -> print the PDF" is much more reliable.
sigh
buovjaga
9 months ago
> Reported in 2011, still not fixed: https://bugs.documentfoundation.org/show_bug.cgi?id=42673
Perhaps spurred by this comment, there was new discussion in the report and it turned out Microsoft has fixed the issue in Windows 11: https://bugs.documentfoundation.org/show_bug.cgi?id=42673#c8...
Quote:
"I was one of the many who reported this problem in one form or another. The problem is Windows-specific. I have found out that the problem actually comes from the Windows print system. There is no way to check that a printer is actually active or working or even present without incurring a long time-out in case the printer is not present or powered. Trying to check the default printer will incur the same time-out.
Calc apparently wants to check the printer to know how to construct its layout, and has to wait for the time-out to continue.
Some of the comments that claim that Calc hangs and never returns have probably not waited long enough for the timeout.
On my new Windows 11 computer, this printer system behavior has been changed and I no longer experience a delay while opening Calc."
Twirrim
9 months ago
I don't recall spreadsheets ever having any visible printer formatting until the point at which you print them? So surely it's not even a case of "even if you don't want to print it"?
sgarland
9 months ago
It’s because modern devs by and large have zero concept of latency differences. I had to explain to people yesterday why an on-disk temporary table in MySQL was slower than an in-memory one. “But the disk is an SSD,” was an actual rebuttal I got. Never mind the fact that this was Aurora, so the “disk” is a slice of (multiple) SSDs exposed over a SAN…
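The underlying numbers are easy to demonstrate. A rough, machine-dependent Python sketch - and note it still flatters the disk, since repeated reads of one small file are served from the page cache rather than the device:

```python
import os
import tempfile
import time

# Same 4 KiB payload, fetched from a dict vs. round-tripped through
# the filesystem.
payload = b"x" * 4096
mem_store = {"key": payload}

fd, path = tempfile.mkstemp()
os.write(fd, payload)
os.close(fd)

N = 1000

t0 = time.perf_counter()
for _ in range(N):
    _ = mem_store["key"]
mem_s = time.perf_counter() - t0

t0 = time.perf_counter()
for _ in range(N):
    with open(path, "rb") as f:
        _ = f.read()
disk_s = time.perf_counter() - t0

os.unlink(path)
print(f"memory: {mem_s:.6f}s  filesystem: {disk_s:.6f}s  "
      f"ratio: {disk_s / mem_s:.0f}x")
```

Even with the cache doing the disk a favor, the ratio is typically in the hundreds or thousands; add a real device, then a SAN hop, and "but it's an SSD" stops being an argument.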
osigurdson
9 months ago
Agree. I've heard people say things like "it is slow because it is not C++". When, in reality, the problem was I/O and N^2 algos.
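A toy illustration of the second failure mode - the same deduplication written with an O(N) membership scan per element (O(N²) overall) versus an O(1) hash lookup:

```python
import time

def dedupe_quadratic(items):
    seen, out = [], []
    for x in items:
        if x not in seen:      # O(N) linear scan per element
            seen.append(x)
            out.append(x)
    return out

def dedupe_linear(items):
    seen, out = set(), []
    for x in items:
        if x not in seen:      # O(1) hash lookup per element
            seen.add(x)
            out.append(x)
    return out

data = list(range(5000)) * 2

t0 = time.perf_counter(); a = dedupe_quadratic(data); tq = time.perf_counter() - t0
t0 = time.perf_counter(); b = dedupe_linear(data); tl = time.perf_counter() - t0
print(f"quadratic: {tq:.4f}s  linear: {tl:.4f}s")
```

Both return identical results; no amount of "rewrite it in C++" closes a gap that grows with the square of the input.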
zahlman
9 months ago
I've been trying to impress upon people from my own research in Python packaging: Pip is slow because it defaults to pre-compiling bytecode (and doesn't invoke multiprocessing for that, although this seems to be in the works from the discussion I've seen); imports literally hundreds of modules even when it ultimately does nothing; creates complex wrappers to set up for connecting to the Internet (and using SSH, of course) even if you tell it to install from a locally downloaded wheel directly; caches things in a way that simulates network sessions instead of just actually having the files... you get the idea.
"It's written in Python and not e.g. in Rust" is simply not relevant in that context.
(For that matter, when uv is asked to pre-compile, while it does some intelligent setup for multiprocessing, it still ultimately invokes the same bytecode compiler - which is part of the Python implementation itself, written in C unless you're using an alternative implementation like PyPy.)
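For reference, both knobs already exist today (the wheel name below is illustrative): pip's `--no-compile` skips the install-time bytecode step, and `compileall -j 0` does the same work later in parallel across all cores:

```shell
# Skip the single-threaded bytecode pre-compilation at install time
# (wheel name is illustrative):
pip install --no-compile somepkg-1.0-py3-none-any.whl

# ...then byte-compile later, in parallel across all cores (-j 0):
python -m compileall -q -j 0 \
    "$(python -c 'import sysconfig; print(sysconfig.get_paths()["purelib"])')"
```

This doesn't fix pip's import-hundreds-of-modules startup cost, but it moves the one embarrassingly parallel step off the serial path.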
osigurdson
9 months ago
There may be a bit of culture at play sometimes as well. If a language isn't meant to be fast, then perhaps devs using the language do not prioritize performance very much. For some, as long as it is possible to point to some externality ("hey this is Python, what do you expect") this is sufficient.
Of course, not always the case. C++ is a good counter example with a massive range of performance "orientedness". On the other hand, I suspect there are few Rust / Zig or C programmers that don't care about performance.
__mharrison__
9 months ago
Do you have a real world example of a Python project that does selective import?
athenot
9 months ago
It's also a Product issue: "fast and snappy" is almost never on their list of priorities. So product managers will push devs for more features, which satisfy the roadmap, and only get concerned about speed when it reaches painful levels.
jMyles
9 months ago
> only get concerned about speed when it reaches painful levels.
...and by then, the requests for performance are somewhere between onerous and ridiculous.
I'm as wary of premature optimization as anyone, but I also have a healthy fear of future-proofed sluggishness.
jeltz
9 months ago
But is it actually faster by any significant amount on your workload? Did you benchmark? Temporary tables in databases rarely actually do disk IO in any blocking code path but mostly just dirty buffers in the OS or in the database itself and then something writes it to disk in the background. This limits throughput but does not add much extra latency in the common cases.
Edit: It still might be a bad idea to waste the IO if you do not have to but the latency of a temporary table is usually RAM latency, not disk latency even for temporary tables on disk.
sgarland
9 months ago
Didn’t have to, prod benchmarked it for us, twice.
If you’re curious, the EBS disks Aurora uses for temporary storage, when faced with a QD of approximately 240, can manage approximately 5000 IOPS. This was an r6i.32xlarge.
My hypothesis is currently that the massive context switching the CPU had to do to handle the interrupts slowed down its acceptance of new connections / processing other queries enough such that everything piled up. I’ve no idea what kind of core pinning / isolation AWS does under the hood, but CPU utilization from disk I/O alone, according to Enhanced Monitoring, was about 20%.
101008
9 months ago
That is what happens when your dev learnt it from an online course instead of going to college or through proper education. "Anyone can code!"
dns_snek
9 months ago
And I thought we were past this sort of gatekeeping and elitism. I've worked with people who had a master's degree in CS who couldn't code their way out of a wet paper bag. In my experience there's very little correlation between how someone learned programming and how deep their understanding of the entire stack is.
dbalatero
9 months ago
Yeah, deep understanding I think is a matter of how much time you spend investigating the craft and improving. Maybe the real question is: what motivates that to happen? Maybe it's school type, but maybe it's not.
My personal anecdotes, which are music centric but all apply to my software career:
1. I've studied music my whole life, and baked into music is the idea of continual practice & improvement. Because of this experiential input, I believe that I can always improve at things if I actively put a bit of energy into it and show up. I believe it because I've put in so many hours to it and have seen the results. This is deeply ingrained.
2. When I picked up bass in high school, I spent the first year learning tabs in my bedroom. It was ok, but my ability accelerated when I started a band with my friends and had to keep up. I really think the people you surround yourself with can: push you more, teach you things you didn't know, and make the process way more of a fun hang than doing it by yourself.
3. Another outcome from music education was learning that I really love how mastery feels. There's a lot of positive feeling from achieving and growing. As a result, I try to seek it out in other places as well. I imagine sports/etc are the same?
aleph_minus_one
9 months ago
> I've worked with people who had a master's degree in CS who couldn't code their way out of a wet paper bag.
"Programming" consists of an insanely large number of isolated communities. Assuming the respective person is capable, I would assume that he simply comes from a very different "programming culture".
I actually observe a very related phenomenon for myself: the more I learn about some very complicated programming topics, the more "alien" I actually (start to) become to the programming topics that I have to do at work.
bsrkf
9 months ago
Hope this doesn't come off as disrespectful, as in that I don't believe you, but out of personal interest, would you consider expanding on that? I'd love to hear about the particular example you were thinking of, or in what ways self-taught coders surprised you over academically-taught ones, if you've had experience working with both over a meaningful span of time. Also, if the case, in what ways self-taught coders were/are maybe lacking on average.
If you've ever given answers to that in other comments on HN or elsewhere, feel free to link.
sgarland
9 months ago
I’ve worked with both (I personally have an unrelated BS and an MS in SWE, which I used purely to get my foot in the door – it worked), and IMO if someone has a BS, not MS, there’s a decent chance they at least understand DS&A, probably took an OS course, etc.
That said, I have also worked with brilliant people who had no formal education in the subject whatsoever, they just really, really liked it.
I’m biased towards ops because that’s what I do and like, but at least in that field, the single biggest green flag I’ve found is whether or not someone has a homelab. People can cry all they want about “penalizing people for having hobbies outside of their job,” but it’s pretty obvious that if you spend more time doing something – even moreso if you enjoy it – you will learn it at a much faster rate than someone who only does it during business hours.
jeltz
9 months ago
Maybe you should skip bashing modern devs and people who learnt from online courses when the parent poster is the one in the wrong. Unless InnoDB has a particularly bad implementation of disk-backed temporary tables, almost all disk IO should be done in the background, so the latency cost should be small. I recommend benchmarking on your particular workload if you really want to know how big it is.
I am an old-school developer with a computer engineering degree but many of the old famous devs were self taught. Yes, if you learn how to code through online courses you will miss some important fundamentals but those are possible to learn later and I know several people who have.
sgarland
9 months ago
Please see this comment: https://news.ycombinator.com/item?id=43863885
We have excellent metrics between PMM and AWS Perf. Insights / Enhanced Monitoring. I assure you, on-disk temp tables were to blame. To your point, though, MySQL 5.7 does have a sub-optimal implementation in that it kicks a lot of queries out to disk from memory because of the existence of a TEXT column, which is internally treated as a BLOB. Schema design is also partially to blame, since most of the tables were unnecessarily denormalized, and TEXT was often used where VARCHAR would have been more appropriate, but still.
StefanBatory
9 months ago
I wish. I did my undergrad, and am now doing my Masters, at a top-10 uni in Poland.
Trust me so much, some of the stuff I learned there was actively harmful. Half of subjects were random fillers, and so on.
I envy Americans so much with that, their CS education seems to be top notch.
hedora
9 months ago
That really should include "sync write 4KB to an SSD". People don't realize it's purely in-RAM on the disk these days (you pay a PCIe round trip, and that's about it).
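A quick probe of that number (POSIX-only sketch; `O_SYNC` forces each write to reach the device before `write()` returns, so this measures the full round trip rather than a page-cache copy - actual figures vary a lot by drive):

```python
import os
import tempfile
import time

# Create a scratch file, then reopen it with O_SYNC so every write is
# a synchronous round trip to the storage device.
fd0, path = tempfile.mkstemp()
os.close(fd0)
fd = os.open(path, os.O_WRONLY | os.O_SYNC)
buf = b"\0" * 4096

t0 = time.perf_counter()
for _ in range(100):
    os.pwrite(fd, buf, 0)   # rewrite the same 4 KiB block, synchronously
elapsed = time.perf_counter() - t0

os.close(fd)
os.unlink(path)
print(f"avg sync 4 KiB write: {elapsed / 100 * 1e6:.0f} us")
```

On a decent NVMe drive this lands in the tens of microseconds - consistent with the point that you're mostly paying for the PCIe round trip into the drive's RAM.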
sgarland
9 months ago
I do want to add that many of the devs I work with and have worked with are genuinely interested in learning this stuff, and I am thrilled by it. What bothers me is when they neither have an interest in learning it, nor in fixing their bad patterns, and leadership shrugs and tells infra teams to make it work.
mjmas
9 months ago
Opening any ms office app with the network disconnected opens up almost instantly...
deeThrow94
9 months ago
This is also a classic emacs mistake, so I wouldn't put it all on age.
qwerty456127
9 months ago
What I am glad to leave behind and forget like a nightmare is Windows local networking with shared folders etc - these never worked nice and the last time anybody I know used these was pre-2010. Today we just use NextCloud, Matrix, email and Git for all our collaboration needs.
vladvasiliu
9 months ago
Who's "we"? I work for a company that drank the MS Kool-Aid, so we're running Windows on laptops, using Office 365 for e-mail, word processing, and spreadsheets, Teams for chat, and SharePoint / OneDrive for shared resources.
Have you tried launching a local app by typing in the Start menu on a default Win11 install with limited / slow internet access? Good times. How about doing some operation (say, deleting an e-mail) in one window of "new" Outlook and having the others refresh?
I have never understood how some otherwise reasonable people are able to consider this absolute shitshow of a work environment good enough.
bluedino
9 months ago
A lifetime ago when I was doing MSP work, our law office clients were using the DOS versions of WordPerfect because the Windows version was too slow.
They refused to store files in directories and use good file names (although they were limited to 8.3), so they just scrolled through all their files until they found the right one. But they could open them so fast they didn't care.
In Windows you had to use the mouse, click three times, wait for the document to load and render... it was instant in DOS character mode.
layer8
9 months ago
I agree with the point about the immediacy of TUIs, but I use MS Office (and Windows in general) almost exclusively by keyboard, so the point about having to use a mouse isn’t completely accurate.
bluedino
9 months ago
Right, but you could just hit up/down to scroll through the files in the directory listing. Very simple
layer8
9 months ago
You can still do that? I mean, there are certainly places where the interaction happens to be more cumbersome than it is in a TUI, because the relevant piece was designed mouse-first. However, one reason I prefer Windows as a desktop UI is that it was designed to be fully operable by keyboard as well.
bluedino
9 months ago
You could see the files as you scrolled through them. I don't think you understood how it works, I didn't do a great job explaining it. You can see the content of the files instantly on the screen as you go through them. Similar to how preview works on Mac
psunavy03
9 months ago
The worst thing Office ever did was effectively kill WordPerfect, and I will die on this hill.
Reveal Codes was an order of magnitude more useful than whatever crap MS continues to pass off as formatting notation to this day 20+ years later, and that's before we even get into Word being over-helpful and getting ahead of you by doing things with autoformat you never wanted to have happen.
Yes, I know WordPerfect is still around, but fat chance being able to conveniently use it anymore.
SoftTalker
9 months ago
Yeah that era of Office, pre-ribbon, was pretty nice as Office goes.
Illotus
9 months ago
Ribbon was better for most people who didn't have all the shortcuts in muscle memory. It is much more discoverable.
SoftTalker
9 months ago
I find its discoverability is terrible. I am always hunting for what I want to do and it's never anywhere that seems to me to be sensible. I usually end up doing a google search for what I want. Perusing the ribbon takes me much more time than just looking at the various options under the old style menus.
Also traditional menus had some traditional standards. Once you learned what was under "File" or "View" or "Insert" or "Format" it was often pretty similar across applications.
CrimsonCape
9 months ago
Logically, users have to learn the name of the tool before performing any sort of geographical associations (which menu, symbol, etc to find the tool).
There is no faster discoverability than O(log(N)) search using the letters of the name as a lookup.
The biggest failure of modern operating systems is failing to standardize this innate reality.
Windows,Linux,etc should have 1. keyboard button to jump to search box 2. type 1-2 letters and hit enter. Operating systems and applications should all have this kind of interface.
The most ironic apps have a ribbon tab named something like "Edit" but then the most-used "edit" command lives in an unrelated tab.
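The "type a letter or two, hit enter" lookup described above is cheap to build. A minimal sketch: keep command names in a sorted list and find the matching range with binary search, O(log N) per keystroke (the command names are illustrative):

```python
import bisect

# Illustrative command palette, kept sorted for binary search.
COMMANDS = sorted([
    "copy", "cut", "edit links", "export", "format cells",
    "insert row", "paste", "print", "save as",
])

def prefix_matches(prefix):
    """Return all commands starting with `prefix`, via two O(log N) bisects."""
    lo = bisect.bisect_left(COMMANDS, prefix)
    hi = bisect.bisect_left(COMMANDS, prefix + "\uffff")  # past any suffix
    return COMMANDS[lo:hi]

print(prefix_matches("e"))   # ['edit links', 'export']
print(prefix_matches("pa"))  # ['paste']
```

This is essentially what launchers like Everything or the Sublime command palette do (with fuzzier matching); the point stands that name-first search beats hunting icons across tabs.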
allears
9 months ago
Anecdotal evidence from myself: Although I've been using Word for many decades, I've never had much "muscle memory" in terms of accessing features. It was always a case of learning which pulldown menu held the required function.
When the accursed ribbon came along, "discoverability" went out the window. Functions were grouped according to some strange MS logic I could never understand, and it takes me twice as long to format a page as it used to. Now, I basically just use Word for text entry, and if I want an elegant format, I use a graphic design app like Illustrator.
Judging from what I've read online, you may be the only person who actually likes the ribbon.
mcswell
9 months ago
Hieroglyphics are the opposite of "discoverable". That's why they became uninterpretable for almost two thousand years, until the discovery of the Rosetta Stone. And even then it took considerable work to figure out how they functioned. In the Ribbon, in order to discover what some hieroglyph does, you have to mouse over it. Since there are lots of hieroglyphs there, that's a lot of mouse-over. And no, the Ribbon's images make no sense in 99% of the cases.
hedora
9 months ago
That might have been true for the first five minutes of using the software (assuming the person had not yet used a CUA application before the first time they used office). After that, it was strictly worse.
CUA ~= "standard menus + keyboard shortcuts for dos and windows": https://en.wikipedia.org/wiki/IBM_Common_User_Access
Illotus
9 months ago
Not really, it is much more discoverable for most people. If interested, MS UI lead has a blog about lot of the reasons for ribbon and on the research backing it https://learn.microsoft.com/en-us/archive/blogs/jensenh
cowboylowrez
9 months ago
It's probably a case of UI discoverability vs usability. Brand-new users might discover better with the ribbon, but as they keep using the product they transition to experienced users, and the UI needs to adapt to that. My experience is that the ribbon doesn't. It's tolerable, and for me that's enough, but I get the points about the ribbon. It's sort of like the new Reddit UI, but props to Reddit - they at least have kept much of the old UI available for longer than I expected them to.
hedora
9 months ago
The problem with it was that it constantly moved the buttons around. So, you had to constantly rediscover it.
layer8
9 months ago
The Ribbon is more difficult to visually grep for me than the classic menus. Not to mention that a number of functions are still hidden in mini-menus in the Ribbon.
It wouldn’t be so bad if keyboard navigation were as good as with the classic menus, but having to press the Alt key separately, plus the generally increased latency, kills it.
sh34r
9 months ago
Ignoring the serious memory safety and security problems that were exploited to get acceptable performance on contemporary hardware, I’d point to shared document editing, automatic backups via OneDrive, and version history as rather significant features in that missing 5%.
I think rosy recollection is tainting your memory here. Docs would often get corrupted by those aforementioned memory safety issues, even if that Win95/98 box was never connected to the internet.
Of course it’s gonna be super snappy. It was designed to run on any Doom-compatible hardware. Which includes some toasters now.
Edit: it’s also worth noting that 1997 was right around the time when Moore’s law succumbed to the laws of physics for improving single-core CPU performance via clock speed alone.
josephernest
9 months ago
I still use Office 2007 on my computer. Super, super snappy: I think Word or Excel starts and finishes loading in 0.5 seconds after clicking the icon. It has 99% of the features I need compared to the newest Office version.
benjiro
9 months ago
Same ... Office 2007 for the win. Frankly, I don't even use 90% of Office 2007's features, so why would I need more? For AI ... no.
And the funny thing is, it barely uses any memory! It makes today's apps look like monsters. Even my music player uses 5x more memory while paused than freaking Excel with multiple pages open.
ajolly
9 months ago
Do you have a good way to run multiple versions of office? I do sometimes need some of the newer features but I'd like to default to using the older versions.
mcswell
9 months ago
And 100% of what you don't need--the Ribbon--was not in Word 97.
lizknope
9 months ago
I remember a laptop in the early 1990s that had Microsoft Office in ROM to load faster. I can't find a reference to it right now.
Lammy
9 months ago
HP Omnibook 300? https://www.hpmuseum.net/display_item.php?hw=123
`Omnibook300Intro-5091-6270-6pages-May93.pdf` https://www.hpmuseum.net/exhibit.php?hwdoc=123 sez:
> Built-in ROM applications don't use up disk space
- Microsoft Windows
- Microsoft Word for Windows
- Microsoft Excel
- Appointment Book
- Phone Book
- HP Calculator
- Microsoft File Manager
- LapLink Remote Access™
lizknope
9 months ago
That looks like it. I didn't have one but I remember the magazine ads for it.
gymbeaux
9 months ago
Man you can still get by with Photoshop CS3 as far as features go.
erkt
9 months ago
Lightroom 5 is miles better than what they offer today. Lightroom Creative Cloud is steaming dog shit. Adobe seriously gets to extort over $120 a year out of me simply for the privilege of reading raw files from a new camera. They provide zero positive contribution these days. All of these incumbent tech companies extract rents on algorithms written decades ago. This needs to end. I am very excited for AI to get advanced enough to whip up replacement programs for everything these companies maintain monopolies over.
mrguyorama
9 months ago
> I am very excited for AI to get advanced enough to whip up replacement programs for everything these companies maintain monopolies over.
You are wildly off base. The algorithms aren't difficult or special. They were written by people reading text books for the most part.
They are able to sit on an on old algorithm for decades because the DMCA made interoperability and selling cheaper tools like that basically illegal.
Under the DMCA, the work done to make IBM PC clones would have been close enough to illegal to kill all the businesses that tried it; they only got away with trying because of the liberal IP laws of the time.
erkt
9 months ago
Well, that's a bit disappointing, but I am aware that there are gaps in my knowledge. I had assumed it was the feasibility of competing with an entrenched firm, not lack of access to research texts. I will chat with Chat to learn more about how the DMCA applies, thank you.
gymbeaux
9 months ago
Back in those days it took 15 minutes for Windows to “finish” booting. You’d hit the desktop but the HDD was still going ham loading a dozen programs, each with their own splash screen to remind you of their existence.
toast0
9 months ago
If you've ever had the privilege of running Windows 10 on a spinning drive, you'll know it never gets to disk idle. Who knows what it's doing, but by that metric it never finishes booting. It probably never gets to disk idle on an SSD either, but SSDs have so much more I/O capacity that it isn't noticed.
Grazester
9 months ago
Ah, the ole splash screen. I remember in my high school days writing programs in VB, and of course they had to have some "cool" splash screen.
amiga386
9 months ago
"I bet somebody got a really nice bonus for that feature" https://devblogs.microsoft.com/oldnewthing/20061101-03/?p=29...
> The thing is, all of these bad features were probably justified by some manager somewhere because it’s the only way their feature would get noticed. They have to justify their salary by pushing all these stupid ideas in the user’s faces. “Hey, look at me! I’m so cool!” After all, when the boss asks, “So, what did you accomplish in the past six months,” a manager can’t say, “Um, a bunch of stuff you can’t see. It just works better.” They have to say, “Oh, check out this feature, and that icon, and this dialog box.” Even if it’s a stupid feature.
On the other hand, I very much enjoyed going to the Excel 97 range X97:L97, pressing Tab, holding Ctrl+Shift, and clicking the chart icon, because then you could play Excel's built-in flight simulator.
qwerty456127
9 months ago
> OpenOffice.org (predecessor of LibreOffice) copied this feature, which they called "QuickStarter"
It still does. Neither LibreOffice itself nor its installation process, with its choice of components, has changed seriously since the old days, and I'm very grateful for that. The QuickStarter isn't as relevant anymore now that we have fast SSDs, but some slow computers are still around, and it's great that we still have the option.
agilob
9 months ago
This has always been the problem with Microsoft. Here is a rant from 2020 about Visual Studio, with a live comparison of performance degradation while loading the same project: https://www.youtube.com/watch?v=GC-0tCy4P1U
silon42
9 months ago
I remember about 15 years ago running Windows + VS in VMWare because I could skip installing Office inside the VM and the system would run noticeably faster.
d_tr
9 months ago
I thought Windows had a generic subsystem for "warming up" frequently used apps for faster launches.
xquce
9 months ago
People don't use Office frequently, and then when they do, it's slow, which is a bad look. So Microsoft cheats in a way that prioritizes their own software; then everyone else does the same, and the feature loses all value, as all programs launch on startup so as not to be "slow".
layer8
9 months ago
Only for OS components, I think?
mrguyorama
9 months ago
https://en.wikipedia.org/wiki/Windows_Vista_I/O_technologies...
SuperFetch was supposed to work with any app. It never seemed to have much effect IMO.
infinet
9 months ago
My workplace has two Windows 10 machines, one in daily use and the other mostly idle. Both are extremely slow. The one in daily use takes at least 15 minutes to become usable after boot. The other is not much better. They are domain-controlled, so there's not much we can do to improve them. In contrast, we still have a 20+ year old computer with an IDE hard drive, running Windows 2000 with MS Office to operate an instrument, and it is much more responsive.
plorg
9 months ago
Huh. Maybe it's because I haven't installed Office since the 2010 version, I assumed OSA was still a thing. My work computer has O365 managed by IT and I could swear I've seen resident daemons (besides, say, Teams) running in the background.
araes
9 months ago
Equivalently useless back in 2004.
Notably, a solution to the current issues with modern office is to use a copy of Office 97.
A 20+ page XLS uses ~7 MB and loads effectively instantly (on a frankly horribly performing laptop that can usually barely run the demos posted on HN).
lolinder
9 months ago
Yeah, I'm honestly trying to figure out why this is getting so much attention. Applications setting themselves to start up at login in order to avoid doing a bunch of work when they're asked for later is one of the oldest patterns in Windows. It's annoying to regularly have to go in and clear out new startup applications, but it's been part of maintaining a Windows machine for decades, and Microsoft Office is hardly the worst offender.
Why is HN suddenly so interested in Microsoft doing the same thing that has always been done by large, bloated app suites?
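For reference, the classic mechanism behind this pattern is nothing more than a value under the registry's Run key; Windows launches every entry there at user login. A minimal sketch as a .reg fragment, with a hypothetical app name and path:

```
Windows Registry Editor Version 5.00

; Hypothetical example: register a background stub ("ContosoOffice") to
; launch at every user login. Each value under this key is a command line
; that Explorer runs when the user logs in.
[HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Run]
"ContosoOffice"="\"C:\\Program Files\\Contoso\\officestub.exe\" /background"
```

The per-machine equivalent lives under HKEY_LOCAL_MACHINE at the same subkey path, which is why clearing out startup apps has been routine Windows maintenance for decades.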
fredoliveira
9 months ago
> Why is HN suddenly so interested in Microsoft doing the same thing that has always been done by large, bloated app suites?
Probably because it is horrible? It's indicative of how we spend less time optimizing code than we do coming up with computationally expensive, inefficient workarounds.
Let's say a hypothetical Windows user spends 2% of their day using Office (I made that number up). Why should Office be partially loaded the other 98% of the time? How is it acceptable to use those resources?
When are we actually going to start using the new compute capabilities in our machines, rather than letting them get consumed by unoptimized, barely decent code?
vladvasiliu
9 months ago
> Let's say a hypothetical Windows user spends 2% of their day using Office
I don't know about a "hypothetical" user, but I'd bet a "mean" (corporate) user probably uses Office all day long. Hell, I've lost count of the number of e-mails I've seen with a screenshot pasted inside a Word document for some reason, or the number of Excel files that are just a 5x4 table.
raincole
9 months ago
You're honestly trying to figure out why people want performant apps...?
lolinder
9 months ago
That's not at all what I asked.