If you use something like Win32 it probably will remain usable into the far future, especially with WINE and/or VMs. I have roughly 30-year-old utilities, originally written for Win95, that still work on Win11.
> probably
No. Pure Win32, maybe, but as soon as vcredist or DirectX enters the scene, you're out of luck.
> is it realistic these days to just expect software to write once and run forever?
Yes, but you have to put in a bit of effort to make it so. For example, if you write your software as a ROM for an early game console (SNES/GBA/etc.), you can probably expect it to run for a very long time, as there will likely be people who want to play Final Fantasy 6 and Pokémon Silver for as long as computers are around.
That's one extreme, but you don't have to go that far. I have some HTML/CSS/JS projects (meant to run locally) from the 2000s that still work totally fine today. Same for Python code from 2010, etc. I wouldn't be surprised if those still worked just fine 50 years from now.
All I do is write code that is meant to run locally first, with very minimal dependencies, choosing only conservative, proven technologies.
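To make that concrete, here's a sketch of the kind of thing I mean, in Python, standard library only (the port is arbitrary):

    # Serve the current directory locally; nothing here can bit-rot,
    # because it's all standard library.
    import http.server
    import socketserver

    PORT = 8000  # arbitrary local port
    handler = http.server.SimpleHTTPRequestHandler
    with socketserver.TCPServer(("", PORT), handler) as httpd:
        httpd.serve_forever()

No packages to install, no build step, so there's nothing to break when the ecosystem moves on.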
DOSBox! Super stable APIs that will never change. And it self-hosts compilers and editors (up to the latest versions of Emacs, and at least vim versions from earlier this century), so anywhere your code can run you can also modify and recompile it.
(In practice I edit my code outside of DOSBox, and sometimes I use cross-compilers, but it is good to know that there is a fallback.)
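For instance, a hedged sketch of scripting that fallback from the host side in Python (DOSBox's -c flag runs commands at startup; the paths here are made up):

    # Launch DOSBox with a project directory mounted as C:.
    # Assumes a `dosbox` binary on PATH and a ./src directory.
    import subprocess

    subprocess.run(["dosbox", "-c", "mount c ./src", "-c", "c:"])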
> there will likely be people who want to play Final Fantasy 6 and Pokémon Silver for as long as computers are around
Is Final Fantasy 6 seeing a lot of new players today? Old games are mostly played by the people who enjoyed them when they were new. Even the landmark games are easily forgotten, and younger players will never bother when they have so many modern choices.
The modern choices are accompanied by ads and gambling, to the point that it wouldn't be surprising if Final Fantasy 6 were seeing an uptick. Go enjoy your "modern" games all you like, but it doesn't always matter whether a game's community is growing, especially if it's not one requiring connectivity to blast ads and skins at people.
> Even the landmark games are easily forgotten and younger players will never bother when they have so many modern choices.
www.gog.com
Features regularly on HN
The Titan submersible implosion features a lot on HN, and almost nobody here built such a contraption, or took one to the bottom of the ocean, or imploded. Naming a gaming storefront which may sell an old game doesn't answer the question of whether young people want to play it.
My reasonable baseline assumption is that almost no young player is going to jump at playing a 30+ year old game. I'll be generous here and not name things like Frogger or, worse, The Oregon Trail. But give a kid who has seen modern games a copy of Diablo, or SimCity 2000, or even newer things like GoldenEye 007 or System Shock, and watch the "excitement". And these are the heavy hitters.
A lot of these oldies didn't age well even for the people who loved them way back when. It's hard to get young players excited about them. Very few oldies could stand on their own today.
Some software does. If you've worked on SCADA gear, you know it's expected to go into production and then run for 20-30 years without being patched once a month, or, actually, ever.
Someone once asked the author of a security utility that was widely used at the time why it hadn't been updated since 1996, and whether it was abandonware. His response: "No, some people just get it right the first time". Just had a quick check and there's a single CVE for it, from 25 years ago, possibly from a third-party mod rather than the original code.
I'd reasonably expect software to run forever insofar as the environment it runs in doesn't change. Essentially, no OS or dependency updates; anything networked is inevitably going to break.
Anecdata: My Wii (2006) console has had a few hardware issues I've fixed, but the software is just as responsive as a decade ago (though many external networks/servers have shut down). Homebrew community is very much alive and has expanded its utility.
I was thinking of deferring to the compiler or VM to keep things stable; i.e., a Go binary or a JavaScript virtual machine should just run forever and be able to deal with OS updates.
In the frontend world, the browser has so far been super reliable in maintaining backwards compatibility of HTML, CSS and JS, for years and years.
Unix shell scripting has also more or less reached a stable state. Targeting Bash is even optional; plain POSIX shell is enough.
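In that spirit, a small illustrative sketch (Python; the app name and file are made up): write against the runtime's portable APIs and let the runtime absorb OS differences.

    # Use pathlib's portable abstractions rather than OS-specific
    # paths, so OS updates stay the runtime's problem, not mine.
    import pathlib

    cfg = pathlib.Path.home() / ".myapp" / "config.txt"  # hypothetical app dir
    cfg.parent.mkdir(parents=True, exist_ok=True)
    cfg.write_text("last_run=ok\n")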
> But is it realistic these days to just expect software to write once and run forever?
From an engineering perspective, yes.
From the current mindset in SW development (reinventing the wheel every couple of years), no.
Almost all current SW is throwaway software.
For most of it you need a particular version of an OS, particular versions of libraries, and a particular planetary alignment just to be able to run it.
I've used plenty of Common Lisp code from 30 or even 40 years ago without issue.
You can get close. I have personal apps, and production systems from past jobs, that just run along year after year doing what they were designed to do.
You can never escape security patches, but your theory of limiting yourself to a few stable dependencies usually works really well for me.
I was a beta tester for FreeHand MX, and used FreeHand and Altsys Virtuoso for decades before that. The fact that it still runs on Windows is why I still use Windows, and I despair of what I will use for vector drawing when it stops running.
None of the Inkscape devs are invested in FreeHand's keyboard shortcuts and working techniques, Graphite doesn't even work with a stylus, Cenon is clunky beyond words and hasn't been upgraded in decades last I checked, Krita is pixel-oriented, and none of the note-taking programs really work for technical drawing/vector editing...
You seem to be describing how web dev worked after jQuery but before React. It wasn't prettier than it is now.
I agree that the wider NPM ecosystem is a morass of slop, and that is technical debt for anyone who wanders into that minefield. But the solution isn't to assume that there are no bad/unmaintained GoLang libraries. It's to realize that maintenance, quality, and sustainability need to be first-class attributes of every library you choose to allow your project to depend on.
Your proposal will yield lots of LLM near-slop: basically code that works given the original prompt requirements, but will fail to continue working well once some requirement changes, some original assumption is violated, or some browser change lands.
Ultimately, the sustainable solution is to have a subset of NPM libraries that are extremely high quality, vetted via robust tests and security audits, and visibly different from the average slop on NPM. Basically a very visible delineation between untrustworthy code and very trustworthy code. Then you could tell the LLM to use only dependencies from that vetted subset.
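As a rough sketch of how that gate might look in CI (Python; the vetted set and package names are invented for illustration):

    # Fail the build if package.json depends on anything outside
    # the hypothetical vetted allowlist.
    import json

    VETTED = {"react", "lodash"}  # stand-in for the audited subset
    with open("package.json") as f:
        deps = json.load(f).get("dependencies", {})
    rogue = sorted(set(deps) - VETTED)
    if rogue:
        raise SystemExit(f"unvetted dependencies: {rogue}")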