AI's Dial-Up Era

107 points, posted 4 hours ago
by nowflux

91 Comments

saltysalt

2 hours ago

Not sure the dial-up analogy fits; instead I tend to think we are in the mainframe period of AI, with large centralised computing models that are so big and expensive to host that only a few corporations can afford to do so. We rent a computing timeshare from them (tokens = punch cards).

I look forward to the "personal computing" period, with small models distributed everywhere...

chemotaxis

an hour ago

> I look forward to the "personal computing" period, with small models distributed everywhere...

One could argue that this period was just a brief fluke. Personal computers really took off only in the 1990s, and web 2.0 happened in the mid-2000s. Now, for the average person, 95%+ of screen time boils down to using the computer as a dumb terminal to access centralized services "in the cloud".

jayd16

17 minutes ago

I don't know, I think you're conflating content streaming with central compute.

Also, is percentage of screen time the relevant metric? We moved TV consumption to the PC; does that take away from PCs?

Many apps moved to the web, but that's basically just streamed code run in a local VM. Is that a dumb terminal? It's not exactly independent of local compute...

pksebben

40 minutes ago

That 'average' is doing a lot of work to obfuscate the landscape. Open source continues to grow (indicating a robust ecosystem of individuals who use their computers for local work), and more importantly, the 'average' looks the way it does not necessarily because of a reduction in local use, but because of an explosion of users who did not previously exist (mobile-first users, SaaS customers, etc.)

The thing we do need to be careful about is regulatory capture. We could very well end up with nothing but monolithic centralized systems simply because it's made illegal to distribute, use, and share open models. They hinted quite strongly that they wanted to do this with DeepSeek.

There may even be a case to be made that at some point in the future, small local models will outperform monoliths: if distributed training becomes cheap enough, or if we find an alternative to backprop that allows models to learn as they infer (like a more developed forward-forward, or something like it), we may see models that do better simply because they aren't a large centralized organism behind a walled garden. I'll grant that this is a fairly Pollyanna take and represents the best possible outcome, but it's not outlandishly fantastic, and there is good reason to believe that any system built on a robust decentralized architecture would be more resilient to problems like platform enshittification and overdeveloped censorship.
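
For anyone unfamiliar with the forward-forward reference, here is a minimal toy sketch of the idea (after Hinton's 2022 proposal): each layer trains against a purely local "goodness" objective, so no gradients flow between layers. The data, layer sizes, and hyperparameters below are made up for illustration; this sketches the mechanism, not anyone's production recipe.

    # Toy sketch of the forward-forward idea: every layer is trained with a
    # purely local "goodness" objective (sum of squared activations), so no
    # gradients ever flow between layers. Synthetic data; illustrative only.
    import numpy as np

    rng = np.random.default_rng(0)

    def normalize(x, eps=1e-8):
        # Length-normalize activations so the next layer can't read goodness trivially.
        return x / (np.linalg.norm(x, axis=1, keepdims=True) + eps)

    class FFLayer:
        def __init__(self, n_in, n_out, lr=0.05, threshold=2.0):
            self.W = rng.normal(0, 1 / np.sqrt(n_in), size=(n_in, n_out))
            self.b = np.zeros(n_out)
            self.lr, self.threshold = lr, threshold

        def forward(self, x):
            return np.maximum(0.0, x @ self.W + self.b)  # ReLU

        def train_step(self, x_pos, x_neg):
            # One local update: push goodness above the threshold for "real" data,
            # below it for "fake" data. No backprop through other layers.
            dW, db, outs = np.zeros_like(self.W), np.zeros_like(self.b), []
            for x, label in ((x_pos, 1.0), (x_neg, 0.0)):
                z = x @ self.W + self.b
                h = np.maximum(0.0, z)
                goodness = (h ** 2).sum(axis=1)
                p = 1.0 / (1.0 + np.exp(-(goodness - self.threshold)))
                dz = ((p - label)[:, None] * 2.0 * h) * (z > 0)  # chain rule within this layer only
                dW += x.T @ dz / len(x)
                db += dz.mean(axis=0)
                outs.append(normalize(h))
            self.W -= self.lr * dW
            self.b -= self.lr * db
            return outs[0], outs[1]

    # Toy task: "positive" samples cluster around a fixed direction, "negative" are noise.
    direction = rng.normal(size=16)
    pos = rng.normal(size=(256, 16)) * 0.1 + direction
    neg = rng.normal(size=(256, 16))

    layers = [FFLayer(16, 64), FFLayer(64, 64)]
    for _ in range(300):
        xp, xn = pos, neg
        for layer in layers:
            xp, xn = layer.train_step(xp, xn)

    def total_goodness(x):
        g = 0.0
        for layer in layers:
            h = layer.forward(x)
            g += (h ** 2).sum(axis=1)
            x = normalize(h)
        return g.mean()

    print("goodness(positive):", total_goodness(pos))  # should end up higher
    print("goodness(negative):", total_goodness(neg))  # should end up lower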

At the end of the day, it's not important what the 'average' user is doing, so long as there are enough non-average users pushing the ball forward on the important stuff.

idiotsecant

20 minutes ago

I can't imagine a universe where a small mind with limited computing resources has an advantage against a datacenter mind, no matter the architecture.

btown

an hour ago

Even the most popular games (with few exceptions) present as relatively dumb terminals that need constant connectivity to sync every activity to a mainframe - not necessarily because it's an MMO or multiplayer game, but because it's the industry standard way to ensure fairness. And by fairness, of course, I mean the optimization of enforcing "grindiness" as a mechanism to sell lootboxes and premium subscriptions.

And AI just further normalizes the need for connectivity; cloud models are likely to improve faster than local models, for both technical and business reasons. They've got the premium-subscriptions model down. I shudder to think what happens when OpenAI begins hiring/subsuming-the-knowledge-of "revenue optimization analysts" from the AAA gaming world as a way to boost revenue.

But hey, at least you still need humans, at some level, if your paperclip optimizer is told to find ways to get humans to spend money on "a sense of pride and accomplishment." [0]

We do not live in a utopia.

[0] https://www.guinnessworldrecords.com/world-records/503152-mo... - https://www.reddit.com/r/StarWarsBattlefront/comments/7cff0b...

paxys

2 hours ago

Why would companies sell you the golden goose when they can instead sell you an egg every day?

codegeek

2 hours ago

You could say the same thing about computers when they were mostly mainframes. I am sure someone will figure out how to commoditize AI, just like personal computers and the internet.

fph

an hour ago

An interesting remark: in the 1950s-1970s, mainframes were typically rented rather than sold.

vjvjvjvjghv

an hour ago

It looks to me like the personal computer era is over. Everything is in the cloud and accessed through terminals like phones and tablets.

DevKoala

an hour ago

Because someone else will sell it to you if they don't.

kakapo5672

2 hours ago

Because companies are not some monolith, all doing identical things forever. If someone sees a new angle to make money, they'll start doing it.

Data General and Unisys did not create PCs - small disrupters did that. These startups were happy to sell eggs.

otterley

an hour ago

They didn't create them, but PC startups like Apple and Commodore only made inroads into the home -- a relatively narrow market compared to business. It took IBM to legitimize PCs as business tools.

worldsayshi

2 hours ago

Well if there's at least one competitor selling golden geese to consumers the rest have to adapt.

Assuming consumers even bother to set up a coop in their living room...

saltysalt

2 hours ago

Exactly! It's a rent-seeking model.

echelon

2 hours ago

> I look forward to the "personal computing" period, with small models distributed everywhere...

Like the web, which worked out great?

Our Internet is largely centralized platforms. Built on technology controlled by trillion dollar titans.

Google somehow got the lion's share of browser usage and is now dictating the direction of web tech, including the removal of ad blocking. The URL bar defaults to Google search, where the top results are paid ads.

Your typical everyday person uses their default, locked down iPhone or Android to consume Google or Apple platform products. They then communicate with their friends over Meta platforms, Reddit, or Discord.

The decentralized web could never outrun money. It's difficult to out-engineer hundreds of thousands of the most talented, most highly paid engineers that are working to create these silos.

NemoNobody

14 minutes ago

OK, so Brave Browser exists. If you download it, you will see zero ads on the internet. I've never really seen ads on the internet, even in the before-Brave times.

For real though, no ads. I'm not making money off them, I've got no invite code for you, I'm a human; I just don't get it. I've probably told 500 people about Brave, and I don't know any that ever tried it.

I don't ever know what to say. You're not wrong, as long as you never try to do something else.

saltysalt

2 hours ago

I agree man, it's depressing.

mulmen

2 hours ago

Your margin is my opportunity. The more expensive centralized models get the easier it is for distributed models to compete.

8ytecoder

an hour ago

Funny you would pick this analogy. I feel like we’re back in the mainframe era. A lot of software can’t operate without an internet connection. Even if in practice they execute some of the code on your device, a lot of the data and the heavyweight processing already happens on the server. Even basic services designed from the ground up to be distributed and local-first, like email (“downloading” it), are used in this fashion; see Gmail. Maps apps added offline support years after they launched and still cripple the search. Even git has GitHub sitting in the middle, and most people don’t or can’t use git any other way. SaaS, Electron, etc. have brought us back to the mainframe era.

thewebguyd

39 minutes ago

It's always struck me as living in some sort of bizarro world. We now have these super powerful personal computers, both handheld (phones) and laptops (my M4 Pro smokes even some desktop-class processors), and yet I use all this powerful compute hardware to... be a dumb terminal to someone else's computer.

I had always hoped we'd do more locally on-device (and with native apps, not running 100 instances of Chromium for various Electron apps). But it's hard to extract rent that way, I suppose.

ryandrake

3 minutes ago

I don't even understand why computer and phone manufacturers even try to make their devices faster anymore, since for most computing tasks, the bottleneck is all the data that needs to be transferred to and from the modern version of the mainframe.

dzonga

an hour ago

This -- chips are getting fast enough, both ARM and x86. Unified memory architectures mean we can get more RAM on devices at higher throughput. We're already seeing local models; their capability is just limited by RAM.
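
To put a rough number on "limited by RAM", a back-of-envelope sketch, assuming model weights dominate memory and ignoring KV-cache and activation overhead (the 70B parameter count is just an illustrative size, not a reference to any specific model):

    def model_ram_gb(params_billions: float, bits_per_weight: int) -> float:
        """Approximate RAM needed just to hold the weights, in GB."""
        bytes_per_weight = bits_per_weight / 8
        return params_billions * 1e9 * bytes_per_weight / 1e9

    # Illustrative 70B-parameter model at common quantization levels:
    for bits in (16, 8, 4):
        print(f"70B @ {bits:>2}-bit: ~{model_ram_gb(70, bits):.0f} GB of weights")
    # -> ~140 GB, ~70 GB, ~35 GB: roughly why 64-128 GB unified-memory machines
    #    can run models locally that ordinary 8-16 GB laptops cannot.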

sixtyj

2 hours ago

Dial-up + mainframe. Mainframe in the sense of silos; dial-up in the sense that, looking back at 2025 from 2035, the speeds we have now will seem like dial-up.

onlyrealcuzzo

2 hours ago

Don't we already have small models highly distributed?

saltysalt

2 hours ago

We do, but the vast majority of users interact with centralised models from OpenAI, Google Gemini, Grok...

onlyrealcuzzo

2 hours ago

I'm not sure we can look forward to self-hosted models ever being mainstream.

Like 50% of internet users are already interacting with one of these daily.

You usually only change your habit when something is substantially better.

I don't know how free versions are going to be smaller, run on commodity hardware, take up trivial space and RAM, etc., AND be substantially better.

oceanplexian

2 hours ago

> I'm not sure we can look forward to self-hosted models ever being mainstream.

If you are using an Apple product chances are you are already using self-hosted models for things like writing tools and don't even know it.

ryanianian

2 hours ago

The "enshittification" hasn't happened yet. They'll add ads and other gross stuff to the free or cheap tiers. Some will continue to use it, but there will be an opportunity for self-hosted models to emerge.

o11c

2 hours ago

> Like 50% of internet users are already interacting with one of these daily. You usually only change your habit when something is substantially better.

No, you usually only change your habit when the tools you are already using are changed without consulting you, and the statistics are then used to lie.

saltysalt

2 hours ago

You make a fair point. I'm just hoping this will happen, but I'm not confident either, to be frank.

raincole

2 hours ago

Because small models are just not that good.

gowld

2 hours ago

We are also in the mainframe period of computing, with large centralised cloud services.

runarberg

2 hours ago

I actually think we are much closer to the sneaker era of shoes, or the monorail era of public transit.

cyanydeez

2 hours ago

I think we are in the dotcom boom era where investment is circular and the cash investments all depend on the idea that growth is infinite.

Just a bunch of billionaires jockeying for not being poor.

EGreg

an hour ago

I actually don’t look forward to this period. I have always been for open source software and distributism — until AI.

Because if there’s one thing worse than governments having nuclear weapons, it’s everyone having them.

It would be chaos. And with physical drones and robots coming, it would be even worse. Think “shitcoins and memecoins”, but unlike those, you don’t just lose the money you put in, and you can’t opt out. They’d affect everyone, and you can never escape the chaos ever again. They’d be posting around the whole Internet (including here: YouTube deepfakes, extortion, annoyance, constantly trying to rewrite history, getting published, reputational destruction at scale, etc.), and constant armies of bots fighting. A dark forest.

And if AI can pay for its own propagation via decentralized hosting and inference, then the chance of a runaway advanced persistent threat compounds. It just takes a few bad apples, or even practical jokers, to unleash crazy stuff. And it will never be shut down; it will just build and build, like some kind of Kessler syndrome. And I’m talking about just CURRENT AI agent and drone technology.

indigodaddy

2 hours ago

Funny how this guy thinks he knows exactly what's up with AI, and how "others" are "partly right and wrong." Takes a bit of hubris to be so confident. I certainly don't have the hubris to think I know exactly how it's all going to go down.

fragmede

2 hours ago

But do you have the audacity to be wrong?

indigodaddy

2 hours ago

Yeah that's interesting, good perspective

ivape

2 hours ago

The problem is that the bubble people are so unimaginative, similar to Krugman, that those who have any inkling of an imagination can literally feel like visionaries compared to them. I know I’m describing Dunning-Kruger, but so be it; the bubble people are very, very wrong. It’s like, man, they really are unable to imagine a very real future.

teaearlgraycold

5 minutes ago

Almost no one I hear calling our AI hype machine a bubble is claiming AI is a short-term fluke. They're saying the marketing doesn't match the reality. The companies don't have the revenue they need. The model performance is hitting the top of the S-curve. Essentially, this is the first big wave, but it'll be a while before the sea level rises permanently.

bdangubic

3 minutes ago

> marketing doesn't match the reality.

true for every marketing ever

confirmmesenpai

2 hours ago

takes a lot of hubris to be sure it's a bubble too.

hitarpetar

an hour ago

that's why I always identify the central position of any argument and take it. That way no one can accuse me of hubris.

ares623

20 minutes ago

My head canon is that the thing that preemptively pops the bubble is Apple coming out and saying, very publicly, that AI is a dead end, and they are dropping it completely (no more half assed implicit promises).

And not just that, they come out with an iPhone that has _no_ camera as an attempt to really distance themselves from all the negative press tech (software and internet in particular) has at the moment.

NemoNobody

9 minutes ago

That would require people who know about AI to actually choose to cancel it, which nobody who actually knows what AI can do would ever do.

The Apple engineers, with their top-level, unfettered access to the best Apple AI, will convince shareholders to fund it forever, even if normal people never catch on.

ladberg

14 minutes ago

Do you know a single person who'd buy an iPhone without a camera? I don't.

sailfast

2 hours ago

I recall the unit economics making sense for all these other industries and bubbles (short of maybe tulips, which you could plant…). Sure, there were over-valuation bubbles because of speculative demand, but right now the assumption seems to be “first to AGI wins”, and that… may not happen.

The key variable for me in this house of cards is how long folks will wait before they need to see their money again, and whether these companies will go in the right direction long enough, given these valuations, to get to AGI. Not guaranteed, and in the meantime society will need to play ball (also not a guarantee).

kaoD

2 hours ago

> If you told someone in 1995 that within 25 years [...] most people would find that hard to believe.

That's not how I remember it (but I was just a kid so I might be misremembering?)

As I remember (and from what I gather from media of the era), the late 80s/early 90s were hyper-optimistic about tech. So much so that I distinctly remember a German (?) TV show when I was a kid where they had what amounts to modern smartphones, and we all assumed that was right around the corner. If anything, it took too damn long.

Were adults outside my household not as optimistic about tech progress?

runarberg

an hour ago

That’s how I remember it too. The video is from 1999, during the height of the dot-com bubble. These experts predict that within 10 years the internet will be on your phone, and that people will be using their phones as credit cards with the phone company managing the transaction, which actually comes pretty close to the predictions made by bitcoin enthusiasts.

https://bsky.app/profile/ruv.is/post/3liyszqszds22

Note that this is the state TV broadcasting this in their main news program. The most popular daily show in Iceland.

slackr

an hour ago

There’s a big difference between the fibre infrastructure left by the dotcom crash and the GPUs that AI firms will leave behind.

mjr00

3 hours ago

While I mostly agree with the article's premise (that AI will cause more software development to happen, not less) I disagree with two parts:

1. the opening premise comparing AI to dial-up internet; basically everyone knew the internet would be revolutionary long before 1995. Being able to talk to people halfway across the world on a BBS? Sending a message to your family on the other side of the country and them receiving it instantly? Yeah, it was pretty obvious this was transformative. The Krugman quote is an extreme, notable outlier, and it gets thrown out around literally every new technology, from blockchain to VR headsets to 3DTVs, so just like, don't use it please.

2. the closing thesis of

> Consider the restaurant owner from earlier who uses AI to create custom inventory software that is useful only for them. They won’t call themselves a software engineer.

The idea that restaurant owners will be writing inventory software might make sense if the only challenge of creating custom inventory software, or any custom software, was writing the code... but it isn't. Software projects don't fail because people didn't write enough code.

solomonb

2 hours ago

Before I got my first full-time software engineering gig (I had worked part-time briefly years prior), I was working full time as a carpenter. We were paying for an expensive online work order system. Having some previous experience writing software for music in college and a couple of /brief/ LAMP stack freelance jobs after college, I decided to try to write my own work order system. It took me like a month, and it would never have scaled, was really ugly, and had the absolute minimum number of features. I could never have accepted money from someone to use it, but it did what we needed, and we ran with it for several years after that.

I was only able to do this because I had some prior programming experience, but I would imagine that if AI coding tools get a bit better, they will enable a larger cohort of people to build a personal tool like I did.

Kiro

2 hours ago

I don't think his quote is that extreme, and it was definitely not obvious to most people. A common thing you heard even around '95 was "I've tried the internet but it was nothing special".

alecbz

2 hours ago

> basically everyone knew the internet would be revolutionary long before 1995. Being able to talk to people halfway across the world on a BBS? Sending a message to your family on the other side of the country and them receiving it instantly? Yeah, it was pretty obvious this was transformative.

That sounds pretty similar to long-distance phone calls? (which I'm sure was transformative in its own way, but not on nearly the same scale as the internet)

Do we actually know how transformative the general population of 1995 thought the internet would or wouldn't be?

gnarlouse

31 minutes ago

I feel like this article is too cute. The internet, and the state of the art of computing in general, has been driven by one thing and one thing alone: Moore’s Law. In that very real sense, the semiconductor industry, and perhaps even just TSMC, is responsible for the rise of the internet and its success.

We’re at the end of Moore’s Law, it’s pretty reasonable to assume. 3nm M5 chips mean there are, what, a few hundred silicon atoms per transistor? We’re an order of magnitude away from 0.2 nm, which is the diameter of a single silicon atom.

My point is, 30 years have passed since dial-up. That’s a lot of time to have exponentially increasing returns.

There’s a lot of implicit assumption that “it’s just possible” to have a Moore’s Law for the very concept of intelligence. I think that’s kinda silly.

idiotsecant

16 minutes ago

I would go so far as to say we are still in the computing dial-up era. We're at the tail end, maybe: we don't write machine code any longer, mostly, and we've abstracted up a few levels, but we're still writing code. Eventually computing will be everywhere, like air, and natural language interfaces will be nearly exclusively how people interact with computing machines. I don't think the idea of 'writing software' is something that will stick around; I think we're in a very weird and very brief little epoch where that is a thing.

arcticbull

3 hours ago

People tend to equate this to the railroad boom when saying that infrastructure spending will yield durable returns into the future no matter what.

When the railroad bubble popped we had railroads. Metal and sticks, and probably more importantly, rights-of-way.

If this is a bubble, and it pops, basically all the money will have been spent on Nvidia GPUs that depreciate to 0 over 4 years. All this GPU spending will need to be done again, every 4 years.

Hopefully we at least get some nuclear power plants out of this.

simonw

3 hours ago

Yeah, the short-lived GPU deprecation cycle does feel very relevant here.

I'm still a fan of the railroad comparisons though for a few additional reasons:

1. The environmental impact of the railroad buildout was almost incomprehensibly large (though back in the 1800s people weren't really thinking about that at all.)

2. A lot of people lost their shirts investing in railroads! There were several bubbly crashes. A huge amount of money was thrown away.

3. There was plenty of wasted effort too. It was common for competing railroads to build out rails that served the same route within miles of each other. One of them might go bust and that infrastructure would be wasted.

rhubarbtree

2 hours ago

What percentage of data centre build costs are the GPUs, vs. power stations, water cooling plants, buildings, roads, network, racks, batteries, power systems, etc.?

paxys

2 hours ago

There's a lot more to infrastructure spending than GPUs. Companies are building data centers, cooling systems, power plants (including nuclear), laying cables under oceans, launching satellites. Bubble or not, all of this will continue to be useful for decades in the future.

Heck if nothing else all the new capacity being created today may translate to ~zero cost storage, CPU/GPU compute and networking available to startups in the future if the bubble bursts, and that itself may lead to a new software revolution. Just think of how many good ideas are held back today because deploying them at scale is too expensive.

bryanlarsen

2 hours ago

> including nuclear

Note that these are just power purchase agreements. It's not nothing, but it's a long ways away from building nuclear.

amluto

2 hours ago

A bunch of the money is being spent on data centers and their associated cooling and power systems and on the power plants and infrastructure. Those should have much longer depreciation schedules.

schwarzrules

2 hours ago

>> basically all the money will have been spent on Nvidia GPUs that depreciate to 0 over 4 years

I agree the depreciation schedule always seems like a real risk to the financial assumptions these companies/investors make, but here's a question I've wondered about: will there be an unexpected opportunity when all these "useless" GPUs are put out to pasture? It seems like saying a factory will be useless because nobody wants to buy an IBM mainframe, when an innovative company can repurpose a non-zero part of that infrastructure for another use case.

ares623

3 hours ago

The recycling industry will boom. From what demand you ask? We'll find out soon enough.

robinhoode

3 hours ago

Railroads need repair too? Not sure if it's every 4 years. Also, the trains I take to/from work are super slow because there is no money to upgrade.

I think we may not upgrade every 4 years, but instead upgrade when the AI models are not meeting our needs AND we have the funding & political will to do the upgrade.

Perhaps the singularity is just a sigmoid with the top of the curve being the level of capex the economy can withstand.

FridgeSeal

2 hours ago

Imagine the progress we could have made on climate change if this money had been funneled into that, instead of making some GPU manufacturers obscenely wealthy.

vjvjvjvjghv

an hour ago

I think the hardware infrastructure may be obsolete but at the moment we are still just beginning to figure out how to use AI. So the knowledge will be the important thing that’s left after the bubble. The current infrastructure will probably be as obsolete as dial up infrastructure.

troupo

3 hours ago

The boom might not last long enough for western countries to pull heads out of their collective asses and ramp up production of nuclear plants.

It takes China 5 years now, but they've been ramping up for more than 20 years.

fjdjcjejdjfje

2 hours ago

This is precisely why the AI bubble is so much worse than previous bubbles: the main capital asset that the bubble is acquiring is going to depreciate before the bubble's participants can ever turn a profit. Regardless of what AI's future capabilities are going to be, it's physically impossible for any of these companies to become profitable before the GPUs that they have already purchased are either obsolete or burnt out from running under heavy load.

bigwheels

2 hours ago

> Benchmark today’s AI boom using five gauges:

> 1. Economic strain (investment as a share of GDP)

> 2. Industry strain (capex to revenue ratios)

> 3. Revenue growth trajectories (doubling time)

> 4. Valuation heat (price-to-earnings multiples)

> 5. Funding quality (the resilience of capital sources)

> His analysis shows that AI remains in a demand-led boom rather than a bubble, but if two of the five gauges head into red, we will be in bubble territory.

This seems like a more quantitative approach than most of the "the sky is falling", "bubble time!", "circular money!" etc. analyses commonly found on HN and in the news. Are there other worthwhile macro-economic indicators to look at?

It's fascinating how challenging it is to meaningfully compare recent events to prior economic cycles such as the Y2K-era tech bubble. It seems like it should be easy, but AFAICT it barely even rhymes.

rhubarbtree

2 hours ago

Yep.

Stock market capitalisation as a percentage of GDP, AKA the Buffett indicator.

https://www.longtermtrends.net/market-cap-to-gdp-the-buffett...

Good luck, folks.

rybosworld

2 hours ago

How valuable is this metric considering that the biggest companies now draw a significant % of revenue from outside the U.S.?

I'm sure there are other factors that make this metric not great for comparisons with other time periods, e.g.:

- rates

- accounting differences

cb321

2 hours ago

Besides your chart, another point along these lines is that the article cites Azhar claiming multiples are not in bubble territory while also mentioning Murati getting an essentially infinite price multiple. Hmmmm...

bena

2 hours ago

“But the fact that some geniuses were laughed at does not imply that all who are laughed at are geniuses. They laughed at Columbus, they laughed at Fulton, they laughed at the Wright brothers. But they also laughed at Bozo the Clown.”

Because some notable people dismissed things that wound up having profound effect on the world, it does not mean that everything dismissed will have a profound effect.

We could just as easily be "peak Laserdisc" as "dial-up internet".

rainsford

35 minutes ago

I was happy to come into this thread and see I was not the first person for whom that quote came to mind. The dial-up Internet comparison implicitly argues for a particular outcome of current AI as a technology, but doesn't actually support that argument.

There's another presumably unintended aspect of the comparison that seems worth considering. The Internet in 2025 is certainly vastly more successful and impactful than the Internet in the mid-90s. But dial-up itself as a technology for accessing the Internet was as much of a dead-end as Laserdisc was for watching movies at home.

Whether or not AI has a similar trajectory as the Internet is separate from the question of whether the current implementation has an actual future. It seems reasonable to me that in the future we're enjoying the benefits of AI while laughing thinking back to the 2025 approach of just throwing more GPUs at the problem in the same way we look back now and get a chuckle out of the idea of "shotgun modems" as the future.

runarberg

3 hours ago

The vast majority of the dot-com comparison that I personally see are economic, not technological. People (or at least the ones I see) are claiming that the bubble mechanics of e.g. circular trading and over-investments are similar to the dot-com bubble, not that the AI technology is somehow similar the internet (it obviously isn’t). And to that extent we are in the year 1999 not 1995.

When this article claims there are two sides of the debate, I believe only one of them is real (the one hyping up the technology). While there are people like me who are pessimistic about the technology, we are not in any position of power, and our opinion on the matter is basically side noise. I think a much more common belief (among people with any say in the future of this technology) is that this technology is not yet at a point which warrants all this investment. There were people who said that about the internet in 1999, and they were proven 100% correct in the months that followed.

vjvjvjvjghv

an hour ago

Agreed. It would probably be better to keep improving AI before investing that much into infrastructure.

dude250711

2 hours ago

> Consider the restaurant owner from earlier who uses AI to create custom inventory software that is useful only for them.

That is the real dial-up thinking.

Couldn't AI like be their custom inventory software?

Codex and Claude Code should not even exist.

ToucanLoucan

2 hours ago

> Couldn't AI like be their custom inventory software?

Absolutely not. It's inherently software with a non-zero amount of randomness in every operation. You'd have a similar experience asking an intern to remember your inventory.

Like, I enjoy Copilot as a research tool, right, but at the same time, ANYTHING that involves delving into our chat history is often wrong. I own three vehicles, for example, and it cannot for its very life remember their year, make, and model. Like, they're there, but they're constantly getting switched around in the buffer. And once I started posing questions about friends' vehicles, that only got worse.

dude250711

2 hours ago

But you should be able to say "remember this well" and AI would know it needs a reliable database instead of relying on its LLM cache or whatever. Could it not just spin up Postgres in some Codex Cloud like a human developer would? Not today but in a few years?

ToucanLoucan

an hour ago

Why do I need to tell an AI to remember things?! How does AI consistently feel less intelligent than regular old boring software?!

morkalork

2 hours ago

"That side of prime rib is totally in the walk-in, just keep looking. Trust me, bro"

dg0

2 hours ago

Nice article, but somewhat overstates how bad 1995 was meant to be.

A single image generally took nothing like a minute. Most people had moved to 28.8K modems that would deliver an acceptable large image in 10-20 seconds. Mind you, the full-screen resolution was typically 800x600 and color was an 8-bit palette… so much less data to move.

Moreover, thanks to “progressive jpeg”, you got to see the full picture in blocky form within a second or two.

And of course, with pages less busy and tracking cookies still a thing of the future, you could get enough of a news site up to start reading in less time than it takes today.

One final irk is that it’s a little overdone to claim that “For the first time in history, you can exchange letters with someone across the world in seconds”. Telex had been around for decades, and faxes, taking 10-20 seconds per page, were already commonplace.

yapyap

2 hours ago

Big bias shining through in comparing AI to the internet.

Because we all know how essential the internet is nowadays.

righthand

2 hours ago

More like AI’s Diaper-Up Era, aka AI’s Analogy Era to Mask Its Shortcomings.

bitwize

2 hours ago

Recently, in my city, the garbage trucks started to come equipped with a device I call "The Claw" (think Toy Story). The truck drives to your curb where your bin is waiting, and then The Claw extends, grasps the bin, lifts it into the air and empties the contents into the truck before setting it down again.

The Claw allows a garbage truck to be crewed by one man where it would have needed two or three before, and to collect garbage much faster than when the bins were emptied by hand. We don't know what the economics of such automation of (physical) garbage collection portend in the long term, but what we do know is that sanitation workers are being put out of work. "Just upskill," you might say, but until Claw-equipped trucks started appearing on the streets there was no need to upskill, and now that they're here the displaced sanitation workers may be in jeopardy of being unable to afford to feed their families, let alone find and train in some new marketable skill.

So no, we're in the The Claw era of AI, when business finds a new way to funge labor with capital, devaluing certain kinds of labor to zero with no way out for those who traded in such labor. The long-term implications of this development are unclear, but the short-term ones are: more money for the owner class, and some people are out on their ass without a safety net because this is Goddamn America and we don't brook that sort of commie nonsense here.

wewewedxfgdf

2 hours ago

Most of the big services seem to waste so much time clunking through updating and editing files.

I'm no expert, but I can't help feeling there are lots of things they could be doing vastly better in this regard; presumably there is lots to do and they will get around to it.