I think I have a pretty different view, though maybe it hinges on the bit about 9 in 10 software people being code monkeys, or what that means. To the extent I agree that LLMs are going to eliminate coding jobs (permanently), they're going to be the ones you could basically do with StackOverflow and Google (when those things worked).
I think there's a cohort thing going on here, because Google has been spam rekt for long enough that entire classes of students have graduated, entered the workforce, and been promoted all since search and SO went to hell, so the gizmo that has the working code and you just need to describe it seems totally novel.
But we've been through this before: one day there's this box, you type a question, and bam, reasonable code. Joining a FAANG is kind of like that too: you do the mega grep for parse_sockaddr_trie and there's this beautifully documented thing with, like, here's the paper where it shows it's O(log n).
But you call the thing and it seems to work, you send the diff, and the senior person is like: that doesn't do IPv6, and that's rolling out to us next quarter, you should add IPv6. And the thing was exploiting the octets, so it's hard.
The thing is, a sigmoid looks exactly like an exponential when you're standing on it. But it never is one. Even a nuclear bomb is only exponential very briefly (and ChatGPT is not a nuclear bomb, and wouldn't be even if it were 100x more capable).
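To make the sigmoid point concrete, here's a small Python sketch (my own illustration, not anything from the original argument): everywhere well before the midpoint, a logistic curve and the exponential that matches its early growth agree to a fraction of a percent, so from inside the curve you can't tell which one you're on.

```python
import math

# Logistic curve f(t) = L / (1 + e^-(t - t0)) with arbitrary illustrative
# parameters: carrying capacity L, midpoint t0.
L, t0 = 1000.0, 20.0

def logistic(t):
    return L / (1 + math.exp(-(t - t0)))

def early_exponential(t):
    # The exponential that matches the logistic's growth for t << t0,
    # where the denominator above is dominated by e^-(t - t0).
    return L * math.exp(t - t0)

# Well before the midpoint, the two agree to within 1%...
for t in range(0, 10):
    rel_err = abs(logistic(t) - early_exponential(t)) / logistic(t)
    assert rel_err < 0.01

# ...but past the midpoint the logistic saturates near L while the
# exponential keeps exploding.
print(logistic(100.0), early_exponential(100.0))
```

The relative error between the two is exactly e^(t - t0), so it's invisible early on and only becomes obvious once you're near the knee, which is the whole problem with extrapolating from inside the curve.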
Think about defense, or trading, or anything competitive like that: now you need the LLM because the other guy has it too. But he's not watching YouTube all day, he's still chain-smoking and taking Adderall, except now he has an LLM too.
So yeah, in the world where any of nine TypeScript frameworks would produce roughly the same dark-mode website, and any of them gets the acquisition done because really the founder knows a guy? That job is genuinely risky right now.
But low effort shit is always risky unless you're the founder who knows a guy.