Things I learned from burning myself out with AI coding agents

24 points, posted 6 hours ago
by brazukadev

13 Comments

iainctduncan

3 hours ago

From the article "And yet these tools have opened a world of creative potential in software that was previously closed to me, and they feel personally empowering. "

I keep seeing things like this, about "democratization" etc. But coding has been essentially free to learn for about 25 years now, with all the online resources and open source tools anyone can use.

Is this just a huge boon to those too lazy to learn? And what does that mean for later security and tech debt issues?

kyancey

2 hours ago

> Is this just a huge boon to those too lazy to learn? And what does that mean for later security and tech debt issues?

In the same way that GPS is a boon to people too lazy to learn celestial navigation or read a paper map.

In the same way that word processors are a boon to people too lazy to use a typewriter and white-out.

In the same way that supermarkets are a boon to people too lazy to hunt and gather their own food.

In the same way that synthesizers are a boon to people too lazy to hire a 40-piece orchestra.

In the same way that power drills are a boon to people too lazy to use a hand crank.

wjfuu32984

an hour ago

Those are all false equivalences. The GP speaks of the "democratization of learning", which had already happened. It's more akin to saying "now people can finally vote" when remote voting was expanded to civilians. It's not as if people couldn't vote before, and in fact the expansion had only a modest impact on turnout.

Then people would ask "is this just a huge boon to those too lazy to vote?", and the answer would be "no actually, voting is still a thing where one must do their own thinking."

If anything, it's a boon to people too lazy to drive, similar to LLMs being a boon for those too lazy to type.

borroka

3 hours ago

Think about having to assemble a car yourself (you can find specs and tutorials online, say) and then drive it, versus asking engineers and mechanics to assemble it, and then using the car assembled by others to go for a drive.

brazukadev

6 hours ago

> 8. People may become busier than ever

this is so true and the opposite of what was expected

dapperdrake

4 hours ago

LLMs provide the benefit of lossy compression of all the text on the Internet.

It’s a crappier reddit in your pocket.

Use it well.

funkyfiddler69

6 hours ago

nice write-up of things that are only obvious if you spend time with AI. pretty much everything also applies to non-agentic AI work, code or not, if you are aiming beyond average quality and conventional design, that is. people who give up somewhat early won't give up much later just because they use AI or teach an AI agent.

but the article is also mostly what people outside the field, or only tangentially related to it, already expect. it's here, but that big thing isn't.

I could say I dabbled with woodworking, but really I just used a chainsaw to cut down some trees and make slabs, then used a drill and screws to construct the cheapest, fastest MVP of a piece of furniture, which I used until the shed burned down. But that's not woodworking, not really.

"AI coding agents" are just an auto-iterating chat with a large coding model that you still have to iterate over yourself, which is as obvious as an apprentice in a woodworking shop doing a lot--if not all--of the work alone until the meister points out all the mistakes and has him do it all over again.

> I was soon spending eight hours a day during my winter vacation shepherding about 15 Claude Code projects at once

If you are a "computer person", spending 8h a day on multiple projects is normal, although 15 is, IMO, way too freaking much. But I have ADHD and am not really a computer person. While I run dozens of narratives in parallel all the time, I only "shepherd" and iterate over a handful of them in 'flexible' time intervals.

The reason for the burnout might be the following (and I can relate, due to my ADHD):

> Due to what might poetically be called “preconceived notions” baked into a coding model’s neural network (more technically, statistical semantic associations), it can be difficult to get AI agents to create truly novel things, even if you carefully spell out what you want.

The expectation to create something "truly novel" based on ideas that aren't truly novel (yet, ...what?) is weird enough, but then expecting that an AI coding agent, an apprentice, will make it novel, even though the entire thing basically already exists and the novelty makes no sense conceptually until the core elements are separated,

> a UFO (instead of a circular checker piece) flying over a field of adjacent squares,

is quite analogous to semi-functional ADHD people who believe they will get at least some of their ideas out if they "work" or dream on all of them. It can work, but you have to separate concerns. In the case of ADHD people, that means becoming functional: don't consume stuff that impedes body and brain, do things to eliminate bio-physical distractions and keep hormonal and neural morale high at most times, and only then work. In the case of AI coding agents, it means separating concepts that are programmatically/mathematically/linguistically intertwined, and only then defining mechanics and features within or beyond the individual or combined constraints.

karmakaze

3 hours ago

> The first 90 percent of an AI coding project comes in fast and amazes you. The last 10 percent involves tediously filling in the details through back-and-forth trial-and-error conversation with the agent. Tasks that require deeper insight or understanding than what the agent can provide still require humans to make the connections and guide it in the right direction.

So then why not, at this point, switch to the human being the primary author and only have the AI do reviews and touch-ups? Or are we restricting ourselves to vibe coding only?

funkyfiddler69

30 minutes ago

> The last 10 percent involves tediously filling in the details through back-and-forth trial-and-error conversation with the agent

It so often sounds like "traditional coding" flows like an orchestra during an opera, while vibe and 'agentic' coding flows like a bunch of big bands practicing.

Are they trying to tell the story that "it's the same" or that "it's just not the same"? Is the toolchain changing so much that there is no reason to learn the baseline anymore? So the next ten years of AI development should be left to those who already wield the basic tools? Just like the economy? Is the narrative meant to establish a singularity-driven relationship with young coders, computer scientists, and those who use code to entertain, inform, and sell via media? While simultaneously pushing the outliers to the edge of the sphere, and locking them out, via their lack of AI skills and experience with such tools, from ever reaching a proper chunk of the mob?

On the one hand, it's a personal decision. Trends and narratives can be convincing. Defectors are rare nowadays, while polarization and the status quo are the de facto standard. So on the other hand, it's a depersonalized decision reinforcing the hierarchies (hardware) that dictate which tools (hardware, the cloud) dominate the mainstream either way.

> Or are we restricting ourselves to vibe coding only?

> why not at this point switch to the human being primary author

It's the only choice. You are either the primary author of the code or of the learning material. In the former case, the latter is implied and you can't teach if you don't know.

In essence, all this "AI hype" should really only motivate. But these perceptions of "the end of stuff as we know it" and "NOW it's definitely not in my/our hands anymore" that are everywhere weigh heavy. So the only "residue outcome" really is: making money is the only thing that's left ..., again, reinforcing the hierarchies (hardware) that dictate which tools (hardware, the cloud) dominate the mainstream either way--whether you break under the weight or not, whether you shrug it off or become versed enough to just carry it along--while establishing a singularity-driven relationship of the system with its constituents.

This is the way.

jaggs

6 hours ago

This is an excellent article. I resonated with all ten of his points. This section at the end particularly made sense.

"Regardless of my own habits, the flow of new software will not slow down. There will soon be a seemingly endless supply of AI-augmented media (games, movies, images, books), and that’s a problem we’ll have to figure out how to deal with. These products won’t all be “AI slop,” either; some will be done very well, and the acceleration in production times due to these new power tools will balloon the quantity beyond anything we’ve seen."

tonyedgecombe

5 hours ago

The problem is finding the pearls amongst the slop.

voakbasda

3 hours ago

How is that any different than, say, all of human history?

pdimitar

2 hours ago

It's not different per se, it's just being made much more difficult. I.e., if you used to look for one pearl in a pile of 200 barnacles, now you have to scan through 3,000.