Why I'm Not Worried About Running Out of Work in the Age of AI

35 points, posted 14 hours ago
by 0bytematt

38 Comments

dvrp

13 hours ago

An (ex-) Senior VP at Salesforce is not worried about running out of work.

thx lol -- why is this on the front page?

myth_drannon

12 hours ago

"Why are you worried? You can always play golf with your buddies!"

refactor_master

12 hours ago

> The real move for the autoworker was sideways, not upward: industrial maintenance, tool and die work, welding, industrial electrical work, construction trades, trucking, or logistics.

But in an AI post-work future, all the sideways moves have also been taken over by AI and robots. After all, “knowledge work” as a discipline will not be there, right? Whether I can write code, manage teams, or copywrite, all of it will be automated.

When the complexity vs cost of automation tips in favor of humans, that’s where I’ll have to reskill to. You said it: trucking, welding, … The PhD I have in knowledge work is just worthless paper now.

bayarearefugee

12 hours ago

> When the complexity vs cost of automation tips in favor of humans, that’s where I’ll have to reskill to. You said it: trucking, welding

Even the 'safe' jobs are going to suffer a lot relative to today because it'll be a race to the bottom as more and more people try to shift into a reduced number of jobs with less demand.

E.g. being a welder is safe from AI at least until the robots are perfected, but even welders have two huge problems to contend with in the nearer term: reduced demand for their services as ~40% of the current workforce loses its income, plus an influx of competition as those same displaced workers look to shift into safer jobs.

People who assume everything will be alright post-AI because everything (mostly) worked itself out in the past are underestimating the extent to which the scale of so much changing so fast will negatively impact every aspect of our economy for anyone who isn't already a wealthy asset owner.

The economy can absorb buggy whip makers being obsoleted, or car factory workers being offshored, but even though those situations sucked for the people impacted by them, the scale of them was tiny (and the time to adjust was so much longer) compared to what is coming with AI displacement.

tartoran

11 hours ago

On top of that, this could trigger social unrest on a scale most of us have never seen in our lifetimes. But who benefits from AI is still politically negotiable. It is possible to build policies that spread the gains more broadly. Otherwise, the economy hollows out, society starts to fracture, and nobody wins, including the ultra-wealthy. Their wealth is not insulated from collapse; it depends on the stability of the system itself.

jonnycoder

12 hours ago

It’s shifting for knowledge workers too; we just need to pivot. I have had many app ideas for a while, and now AI lets me build them quickly. Access to education and knowledge led to your advanced education; now access to cheap, fast building leads to product execution. Use your PhD brain to come up with a well-researched idea and plan, then go execute.

markus_zhang

12 hours ago

Just a note that everyone is doing that, at 10x speed, and very good people can now output 100x thanks to AI.

kjuulh

11 hours ago

Will SWEs be squeezed? Yes. But I don't think everything will just be magically done by these models. Right now the wheels are completely off the wagon, as we see more and more vibe-coded apps going live with fatal security vulnerabilities and privacy issues. The act of putting pen to paper will change, the positions will change, but I don't think these models are a silver bullet.

Nothing has been this simple until now, and it seems strange that we would just reach a certain point where all of our problems are suddenly solved, completely. In my experience so far at my current start-up, it has reduced our need to hire a tad, but not too much. I've also seen early-stage start-ups needing to hire because they started out building a product and it became too much to handle. That's anecdotal and recent; I'd just find it strange if we end up automating ourselves away. My own role has sort of turned into AI enablement for the rest of the start-up: mostly C-level, business, pretty much everyone other than the SWEs. There is potential, but mixed success so far. Agents are good enough to build something that works, but not good enough to build the right solution.

I had a guy who ended up building a local dashboard in Perl (the only thing Claude could find on his Mac) and wanted to distribute it to his colleagues. Engineers sometimes forget that normal people don't usually work in the unknown; they will solve problems in any way they know, in this case a folder of Perl code AirDropped to each other.

goopthink

11 hours ago

Somewhat buried in the post — “focus”.

The way people use AI is the opposite of focus. It's throw-as-much-as-you-can-against-the-wall, because it is fast and cheap and possible. It is peanut-buttering. Companies today mandate AI use so indiscriminately that you might as well call it a comparative disadvantage. It's not that the AI is bad, but that people haven't figured out that the core competency of an organization is focus and coordination to achieve a goal, and that victory is external: users, customers. But AI is used to unfocus, to spread, and to meet internal goals (build more, etc.).

The challenge was never writing more code or creating more content (all of which is ultimately a long tail of debt that needs to be cleaned up and managed by someone else), which can be done cheaply enough by other paths. It was figuring out what is worth doing, and aligning everyone around that. So in a sense, I welcome company AI mandates today, because they are so misdirected as to make "doing things the inefficient way they were done before" a relative advantage.

suzzer99

9 hours ago

A pervasive middle class in developed countries has only existed since WWII. We may just be an 80-year anomaly in a world history of serfs, owners, skilled tradesmen, and merchants.

pupppet

11 hours ago

I’m not worried about running out of work. I’m worried about running out of paid work.

dlivingston

12 hours ago

The writing is on the wall, but what it spells we don't yet know. One thing that is virtually certain is that tomorrow's world of software engineering will be very different from today's. Thinking about and preparing for that world is something we need to do.

I take the negative comments here as expressions of anxiety about the future - we are, after all, this industry's auto workers in what may be the Rust Belt.

markus_zhang

12 hours ago

I think the auto-worker analogy is wrong. This time it's as if ALL the workers who do welding/maintenance/etc. were hit at once.

TuringNYC

12 hours ago

So many things wrong with this article.

>> The popular horse-switching fantasy answer is retraining. “Go back to school and become an engineer.” In theory, yes. In practice, rarely. The jump from an assembly-line worker to an engineer requires years of schooling and a different educational foundation.

Sure... and after "years of schooling" that work will also get taken by AI, since learning is accelerating. Remember six years ago when they told laid-off people to learn to code? Then remember three years ago when they said to learn prompt engineering? Unfortunately, the tech moves faster than retraining does for many.

>> So many things that we could do to help our customers.

The author assumes the customer is still in good shape. Not a great assumption; the value chain is being squeezed and disintermediated.

>> Which is why the idea that we’re somehow going to run out of work strikes me as absurd. It feels like a theory written by people who haven’t actually spent much time doing the work in the first place — serving customers, building products, and running businesses. There is always more that could be done.

Sure, there is always work. Not sure what the ROI is on that work, though. Is it worth paying someone to do? If so, why wasn't it done before?

thatmf

13 hours ago

Get wrecked. The actual reason bro is not worried is in the right column:

> I am an EIR at Balderton Capital and principal of my own eponymous consulting business.

> I bring an uncommon perspective to enterprise software, having more than ten years’ experience in each of the CEO, CMO, and independent director roles in companies from zero to over $1B in revenues.

First, what the hell is an EIR?

Second, the fact that you are one at some Bertie-Wooster-ass venture capital firm means that you could probably retire tomorrow, if not necessarily in the manner to which you are accustomed.

So yes, must be nice.

simonw

13 hours ago

EIR = Entrepreneur In Residence. It's a slightly odd position, and varies a little depending on the firm, but generally it means someone is employed by a VC firm for a period of time to work on developing their next idea and also help out around the VC firm sourcing deals and mentoring companies.

gerdesj

12 hours ago

I think I'm with you on this. The dribbling cynicism, pontification, and entitlement are rather grating.

There are some uncommonly long dashes... em dashes... sprinkled into para six and again later on. Perhaps our hero has a charmap app handy, or has a remarkable keyboard, or remembers a carefully curated handful of compose sequences.

The system prompt for this beastie must surely have started with: You are a complete wanker, riff on the eighties "loadsa money" theme.

locusofself

13 hours ago

Right, must be nice. I live in an HCOL area and have a mortgage and family to support. If big tech lays me off, it's going to be stressful and will probably mean selling my house and moving somewhere LCOL.

guyzero

13 hours ago

EIR == Entrepreneur-in-Residence

polalavik

13 hours ago

so an employee at a VC? lmao

matthest

13 hours ago

Read the article; it actually raises some fascinating points that are agnostic to the possibility that he may be financially well off.

gerdesj

12 hours ago

Please don't tell people to RTFA. I have and it is still entitled rambling bollocks.

Is this really leading edge ... whatever it is supposed to be:

"The popular horse-switching fantasy answer is retraining. “Go back to school and become an engineer.” In theory, yes. In practice, rarely. The jump from an assembly-line worker to an engineer requires years of schooling and a different educational foundation."

They might as well pat the person who is losing their livelihood on the head and say "there, there, it will all come good in the wash".

emestifs

13 hours ago

Starting your comment with "Get wrecked" doesn't inspire confidence. Makes you sound, to me, like an edgy teenager.

joshuamoyers

13 hours ago

I appreciate the sentiment to a certain extent: it's not going away, so skate to where the puck is if you care to do so. But the writing is repetitive, and there's an entire repeated paragraph (bullet to paragraph form). There are also lots of things to be worried about, even for the most seasoned individuals, on conservatively half-decade increments. Assuming large parts of SWE become commoditized, in the form of paying a handful of frontier model providers more and more of the share of what was once SWE wages, the high end is what survives. High-context, fox-like people (a la terrence tao's foxes and hedgehogs) are guiding AI to build, and then they are eventually displaced as well. Extreme societal pain seems like it's on the horizon, assuming we don't have some incredibly unlikely massive mobilization towards post-work, post-scarcity thinking with active social safety nets. Economic diffusion probably means this is a little further away than we think, but time moves pretty damn fast.

clipsy

11 hours ago

> high context fox-like (a la terrence tao's foxes and hedgehogs)

Pedantically, I think you mean Isaiah Berlin's foxes and hedgehogs[0].

> assuming we don't have some incredibly unlikely massive mobilization towards post-work post-scarcity thinking with active social safety nets

The problem being that we're not actually heading toward post-work or post-scarcity; we're heading toward post-knowledge-work. Any chance of UBI or similar will be summarily shot down by the Epstein class, most likely by using their ownership of 90+% of the media to drive a class war between the ascendant blue-collar workers and the collapsing white-collar workers.

[0]: https://en.wikipedia.org/wiki/The_Hedgehog_and_the_Fox

matheusmoreira

12 hours ago

When is AI going to take over the executive positions? I'm sure they can do a much better job than these guys.

tkel

11 hours ago

A fundamental characteristic of capitalism is that capitalist businesses are run as dictatorships, with the capitalists as the dictators. When is the dictator going to replace themselves? Never. If anything, the capitalist will fully automate their job and continue to control all of the profits and run the organization as a dictator. Most capitalists already do some form of this: they hire other people to do the work while they do little to nothing, yet continue to "own" the organization. That is why the setup is so good for them, and why their job could so easily be replaced by AI. None of this changes unless the structures of businesses change and capitalists no longer have dictatorial authority over hiring and firing.

zb3

13 hours ago

> there is always, always more work yet to do

And they always, always forget that it's not about "work", it's about whether a particular person will be able to contribute work that someone is willing to pay for. It's definitely NOT true that there'll always be more paid work to do that can be done by a particular person.

But this is what you get when these authors are wondering if something is good for "the economy" instead of thinking about actual people.

stavros

12 hours ago

Yeah, all these "work has always been fine!" writers forget that we've never invented cheap artificial people before.

swarmgram

12 hours ago

I like the horse analogy... we will always find a way.

GSimon

12 hours ago

"Retirement aged man not concerned about finding work"

bayarearefugee

12 hours ago

> Aggressively learn AI.

Unless the author is talking about learning how transformer architectures work, and I don't think they are (and if they are, it won't help the vast majority of people anyway), this is the dumbest advice I keep seeing.

You don't have to "learn AI". "Learning AI" will not be a moat, for anyone. The power of "AI" is that you prompt it in plain language. And it just goes and does the thing. Using AI is not really a skill. It arguably was a little bit when the models were a lot dumber, but now it isn't.

This "transition" is going to be way worse and way more disruptive than even people who think they are thinking about this problem assume.

simonw

12 hours ago

Prompting in plain English doesn't mean learning to use AI tools is easy. I continue to believe that "AI is easy" is the single biggest misconception in the entire field.

I've been a daily user of LLMs since ChatGPT came out and I'm still figuring out new ways to use them on a daily basis.

Wowfunhappy

12 hours ago

If you wrote a blog post about this, I would be very interested in reading it.

I'm certainly aware of fun things I can do with local models, which take setup, and if you're into e.g. ComfyUI those workflows can get very complicated. But that's more of a hobby; I don't actually think I get better results this way vs naively prompting a SoTA model.

There are some more advanced workflows for e.g. Claude Code, but I feel like all of that is likely to go away once the underlying models get better (for example, longer context windows mean less need to manage context).

dlivingston

12 hours ago

"Learn AI" doesn't mean "learn how to prompt." It means that intelligence will be a commodity, and the business value of that will be the integration of intelligence into business models. So learn that. Where would a business line benefit from integration of AI? In what form would that take? The software industry has their answer in the form of Claude Code and code autocomplete. That's a design and integration problem. What's the equivalent for, say, energy companies? Or hospital administration?

K0balt

12 hours ago

People are painfully ignorant about how this is going to play out with robotics. There is an expectation based on current performance… understandable, but it fails to incorporate the why, which is grossly inadequate training data. When that is solved (and it soon will be), robots are going to have their GPT-3.5 moment.