saltysalt
2 hours ago
Not sure the dial-up analogy fits, instead I tend to think we are in the mainframe period of AI, with large centralised computing models that are so big and expensive to host, only a few corporations can afford to do so. We rent a computing timeshare from them (tokens = punch cards).
I look forward to the "personal computing" period, with small models distributed everywhere...
chemotaxis
an hour ago
> I look forward to the "personal computing" period, with small models distributed everywhere...
One could argue that this period was just a brief fluke. Personal computers really took off only in the 1990s, web 2.0 happened in the mid-2000s. Now, for the average person, 95%+ of screen time boils down to using the computer as a dumb terminal to access centralized services "in the cloud".
jayd16
17 minutes ago
I don't know, I think you're conflating content streaming with central compute.
Also, is percentage of screentime the relevant metric? We moved TV consumption to the PC, does that take away from PCs?
Many apps moved to the web but that's basically just streamed code to be run in a local VM. Is that a dumb terminal? It's not exactly local compute independent...
pksebben
40 minutes ago
That 'average' is doing a lot of work to obfuscate the landscape. Open source continues to grow (indicating a robust ecosystem of individuals who use their computers for local work), and more importantly, the 'average' looks the way it does not because local use has declined, but because of an explosion of users that did not previously exist (mobile-first, SaaS customers, etc.)
The thing we do need to be careful about is regulatory capture. We could very well end up with nothing but monolithic centralized systems simply because it's made illegal to distribute, use, and share open models. They hinted quite strongly that they wanted to do this with deepseek.
There may even be a case to be made that at some point in the future, small local models will outperform monoliths. If distributed training becomes cheap enough, or if we find an alternative to backprop that allows models to learn as they infer (like a more developed forward-forward or something like it), we may see models that do better simply because they aren't a large centralized organism behind a walled garden. I'll grant that this is a fairly Pollyanna take and represents the best possible outcome, but it's not outlandishly fantastic, and there is good reason to believe that any system based on a robust decentralized architecture would be more resilient to problems like platform enshittification and overdeveloped censorship.
At the end of the day, it's not important what the 'average' user is doing, so long as there are enough non-average users pushing the ball forward on the important stuff.
idiotsecant
20 minutes ago
I can't imagine a universe where a small mind with limited computing resources has an advantage against a datacenter mind, no matter the architecture.
btown
an hour ago
Even the most popular games (with few exceptions) present as relatively dumb terminals that need constant connectivity to sync every activity to a mainframe - not necessarily because it's an MMO or multiplayer game, but because it's the industry standard way to ensure fairness. And by fairness, of course, I mean the optimization of enforcing "grindiness" as a mechanism to sell lootboxes and premium subscriptions.
And AI just further normalizes the need for connectivity; cloud models are likely to improve faster than local models, for both technical and business reasons. They've got the premium-subscriptions model down. I shudder to think what happens when OpenAI begins hiring/subsuming-the-knowledge-of "revenue optimization analysts" from the AAA gaming world as a way to boost revenue.
But hey, at least you still need humans, at some level, if your paperclip optimizer is told to find ways to get humans to spend money on "a sense of pride and accomplishment." [0]
We do not live in a utopia.
[0] https://www.guinnessworldrecords.com/world-records/503152-mo... - https://www.reddit.com/r/StarWarsBattlefront/comments/7cff0b...
paxys
2 hours ago
Why would companies sell you the golden goose when they can instead sell you an egg every day?
codegeek
2 hours ago
You could say the same thing about computers when they were mostly mainframes. I am sure someone will figure out how to commoditize this, just like personal computers and the internet.
fph
an hour ago
An interesting remark: in the 1950s-1970s, mainframes were typically rented rather than sold.
vjvjvjvjghv
an hour ago
It looks to me like the personal computer era is over. Everything is in the cloud and accessed through terminals like phones and tablets.
DevKoala
an hour ago
Because someone else will sell it to you if they don't.
kakapo5672
2 hours ago
Because companies are not some monolith, all doing identical things forever. If someone sees a new angle to make money, they'll start doing it.
Data General and Unisys did not create PCs - small disrupters did that. These startups were happy to sell eggs.
otterley
an hour ago
They didn't create them, but PC startups like Apple and Commodore only made inroads into the home -- a relatively narrow market compared to business. It took IBM to legitimize PCs as business tools.
worldsayshi
2 hours ago
Well if there's at least one competitor selling golden geese to consumers the rest have to adapt.
Assuming consumers even bother to set up a coop in their living room...
saltysalt
2 hours ago
Exactly! It's a rent-seeking model.
echelon
2 hours ago
> I look forward to the "personal computing" period, with small models distributed everywhere...
Like the web, which worked out great?
Our Internet is largely centralized platforms. Built on technology controlled by trillion dollar titans.
Google somehow got the lion's share of browser usage and is now dictating the direction of web tech, including the removal of adblock. The URL bar defaults to Google search, where the top results are paid ads.
Your typical everyday person uses their default, locked down iPhone or Android to consume Google or Apple platform products. They then communicate with their friends over Meta platforms, Reddit, or Discord.
The decentralized web could never outrun money. It's difficult to out-engineer hundreds of thousands of the most talented, most highly paid engineers that are working to create these silos.
NemoNobody
14 minutes ago
Ok, so Brave Browser exists: if you download it, you will see zero ads on the internet. I've never really seen ads on the internet, even in the before-Brave times.
Fr tho, no ads. I'm not making money off them, I've got no invite code for you, I'm a human; I just don't get it. I've probably told 500 people about Brave, and I don't know any that ever tried it.
I don't ever know what to say. You're not wrong, as long as you never try to do something else.
saltysalt
2 hours ago
I agree man, it's depressing.
mulmen
2 hours ago
Your margin is my opportunity. The more expensive centralized models get the easier it is for distributed models to compete.
8ytecoder
an hour ago
Funny you would pick this analogy. I feel like we’re back in the mainframe era. A lot of software can’t operate without an internet connection. Even if in practice they execute some of the code on your device, a lot of the data and the heavyweight processing is already happening on the server. Even basic services designed from the ground up to be distributed and local first - like email (“downloading”) - are used in this fashion - like gmail. Maps apps added offline support years after they launched and still cripple the search. Even git has GitHub sitting in the middle and most people don’t or can’t use git any other way. SaaS, Electron, …etc. have brought us back to the mainframe era.
thewebguyd
39 minutes ago
It's always struck me as living in some sort of bizarro world. We now have these super powerful personal computers, both handheld (phones) and laptops (my M4 Pro smokes even some desktop-class processors), and yet I use all this powerful compute hardware to...be a dumb terminal to someone else's computer.
I had always hoped we'd do more locally on-device (and with native apps, not running 100 instances of chromium for various electron apps). But, it's hard to extract rent that way I suppose.
ryandrake
3 minutes ago
I don't understand why computer and phone manufacturers even try to make their devices faster anymore, since for most computing tasks, the bottleneck is all the data that needs to be transferred to and from the modern version of the mainframe.
dzonga
an hour ago
this -- chips are getting fast enough, both ARM and x86. Unified memory architecture means we can get more RAM on devices at faster throughput. We're already seeing local models; it's just that their capability is limited by RAM.
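The RAM constraint is easy to see with back-of-envelope math. A minimal sketch (the parameter counts and quantization widths below are illustrative, and this ignores KV cache and runtime overhead, which add more on top):

```python
# Rough estimate of the RAM needed just to hold a model's weights.
# bits_per_weight reflects quantization: 16-bit float vs. 8-bit or 4-bit ints.

def weight_ram_gb(params_billions: float, bits_per_weight: int) -> float:
    """Approximate decimal gigabytes required for the weights alone."""
    bytes_total = params_billions * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9

for params in (7, 13, 70):
    for bits in (16, 8, 4):
        print(f"{params}B @ {bits}-bit: ~{weight_ram_gb(params, bits):.1f} GB")
```

A 7B model at 4-bit quantization needs roughly 3.5 GB for weights, which fits on a phone, while a 70B model at 16-bit needs around 140 GB, which is out of reach for consumer devices; that gap is exactly the "limited by RAM" point above.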
sixtyj
2 hours ago
Dial-up + mainframe. Mainframe in the sense of silos; dial-up in the sense of speed, the way today's setup will look when we look back at 2025 from 2035.
onlyrealcuzzo
2 hours ago
Don't we already have small models highly distributed?
saltysalt
2 hours ago
We do, but the vast majority of users interact with centralised models from OpenAI, Google Gemini, Grok...
onlyrealcuzzo
2 hours ago
I'm not sure we can look forward to self-hosted models ever being mainstream.
Like 50% of internet users are already interacting with one of these daily.
You usually only change your habit when something is substantially better.
I don't know how free versions are going to be smaller, run on commodity hardware, take up trivial space and RAM, etc., AND be substantially better.
oceanplexian
2 hours ago
> I'm not sure we can look forward to self-hosted models ever being mainstream.
If you are using an Apple product chances are you are already using self-hosted models for things like writing tools and don't even know it.
ryanianian
2 hours ago
The "enshittification" hasn't happened yet. They'll add ads and other gross stuff to the free or cheap tiers. Some will continue to use it, but there will be an opportunity for self-hosted models to emerge.
o11c
2 hours ago
> Like 50% of internet users are already interacting with one of these daily. You usually only change your habit when something is substantially better.
No, you usually only change your habit when the tools you are already using are changed without consulting you, and the statistics are then used to lie.
saltysalt
2 hours ago
You make a fair point, I'm just hoping this will happen, but not confident either to be frank.
raincole
2 hours ago
Because small models are just not that good.
gowld
2 hours ago
We are also in the mainframe period of computing, with large centralised cloud services.
runarberg
2 hours ago
I actually think we are much closer to the sneaker era of shoes, or the monorail era of public transit.
cyanydeez
2 hours ago
I think we are in the dotcom boom era where investment is circular and the cash investments all depend on the idea that growth is infinite.
Just a bunch of billionaires jockeying for not being poor.
EGreg
an hour ago
I actually don’t look forward to this period. I have always been for open source software and distributism — until AI.
Because if there’s one thing worse than governments having nuclear weapons, it’s everyone having them.
It would be chaos. And with physical drones and robots coming, it would be even worse. Think "shitcoins and memecoins," but unlike those, you don't just lose the money you put in, and you can't opt out. They'd affect everyone, and you can never escape the chaos ever again. They'd be posting around the whole Internet (including here: YouTube deepfakes, extortion, annoyance, constantly trying to rewrite history, get published, reputational destruction at scale, etc.), and constant armies of bots fighting. A dark forest.
And if AI can pay for its own propagation via decentralized hosting and inference, then the chance of a runaway advanced persistent threat compounds. It just takes a few bad apples, or even practical jokers, to unleash crazy stuff. And it will never be shut down, just build and build like some kind of Kessler syndrome. And I'm talking about with just CURRENT AI agent and drone technology.