IMHO, this overlooks probably the biggest advantage of all: software you can buy once and run locally is predictable.
While the modern world of mobile devices and near-permanent fast, reliable connectivity has brought some real advantages, it has also given software developers the ability to ruthlessly exploit their users in ways no one would have dreamt of 20 or 30 years ago. Often these are pitched as if they were for the user’s benefit: a UI “enhancement” here, an “improved” feature there, a bit of casual spying “to help us improve our software and share only with carefully selected partners”, a subscription model that “avoids the big up-front cost everyone used to pay” (or some questionable logic about “CAPEX vs OPEX” for business software), or the reassurance that our startup has been bought by a competitor, but all the customers who chose our product specifically to avoid that competitor’s inferior alternative have nothing to worry about, because the new owners have no ulterior motive and will continue developing it just the way we have so far.
The truth we all know but don’t want to talk about is that many of the modern trends in software have been widely adopted because they make things easier and/or more profitable for software developers at the direct expense of the user’s experience and/or bank account.
And if you’re willing to pay $195 for the package, plus a $50 upgrade each time the OS breaks the app, we’re good. And as long as the software doesn’t have to sync across mobile and desktop. And as long as it’s single-user software.
Then sure.
This entire paradigm gets turned on its head with AI. I tried to do this with purely local compute, and it's a bad play. We don't have good edge compute yet.
1. A lot of good models require an amount of VRAM found only in data-center GPUs.
2. For models that can run locally (Flux, etc.), you get dramatically different performance between top-of-the-line cards and older GPUs, so you end up serving different models with different sampling techniques to different hardware classes (see the sketch after this list).
3. GPU hardware is expensive, and most consumers don't have a discrete GPU. You'll severely limit your TAM if you require one.
4. Mac support is horrible, which alienates half of your potential customers.
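To make point 2 concrete, here's a minimal sketch of the hardware-tiering logic a purely local app ends up carrying. It assumes PyTorch for device detection; the tier table and the model/sampler names are illustrative, not a real API:

    import torch

    # Hypothetical (min VRAM in GiB, model, sampler) tiers, best first.
    MODEL_TIERS = [
        (24.0, "flux-dev",     "dpm++_2m"),  # top-of-the-line cards
        (12.0, "flux-schnell", "euler"),     # mid-range consumer GPUs
        (0.0,  "sd-1.5",       "ddim"),      # older GPUs / CPU fallback
    ]

    def pick_model():
        """Pick a (model, sampler) pair based on locally available VRAM."""
        if not torch.cuda.is_available():
            # No CUDA GPU at all: fall back to the smallest tier.
            return MODEL_TIERS[-1][1], MODEL_TIERS[-1][2]
        vram_gib = torch.cuda.get_device_properties(0).total_memory / 2**30
        for min_gib, model, sampler in MODEL_TIERS:
            if vram_gib >= min_gib:
                return model, sampler
        return MODEL_TIERS[-1][1], MODEL_TIERS[-1][2]

Every row in that table is another model you have to test, ship, and keep in sync across your user base.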
It's best to follow the Cursor model where the data center is a necessary evil and the local software is an adapter and visualizer of the local file system.
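That split can be sketched in a few lines (my own illustration, not Cursor's actual code; the endpoint and payload shape are invented): the heavy lifting happens remotely, and the local side only reads and writes files.

    import json, pathlib, urllib.request

    def edit_file(path, instruction, endpoint="https://api.example.com/v1/edit"):
        """Read a local file, send it to remote compute with an instruction,
        and write the result back. Endpoint and payload shape are made up."""
        source = pathlib.Path(path).read_text()
        body = json.dumps({"source": source, "instruction": instruction}).encode()
        req = urllib.request.Request(
            endpoint, data=body, headers={"Content-Type": "application/json"}
        )
        with urllib.request.urlopen(req) as resp:
            new_source = json.load(resp)["source"]
        pathlib.Path(path).write_text(new_source)

The local process never needs a GPU; it just adapts and visualizes the file system while the data center does the inference.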
I think you missed the most important thing, more important than any of those: if there is no service, then the service cannot delete or deny access to your data.
And, relatedly, if there is no service, then the service cannot fail to secure your data.
No possibility of a billing or login error where you lose access to your stuff because they think you're not current or valid when you are.
No possibility of losing access because the internet is down or your current location is geo-blocked, etc.
No possibility of your account being killed because they scanned your data and decided it contained child porn, pirated movies or software, CAD files to make guns, hate speech, etc.
Those overlap with the free and privacy points, but the separate point is not the money or the privacy: it's that someone else can kill your stuff at any time, without warning or recourse.
And someone else can lose your stuff, either directly, by having their own servers broken into, or indirectly, by your login credentials being leaked on your end.
All of those advantages are also reasons why businesses don't adopt it…