Dave Farley on What Makes High Quality Code

3 points posted 16 hours ago
by rmason

4 Comments

gregjor

7 hours ago

I will state my conclusion first. Quality in code serves as a euphemism for a skills issue. When we encounter code that we find hard to read or change, we might call the code unreadable or low-quality, but we actually mean "The programmers who wrote this code did a poor job," or "I don't have the skills needed to understand this code." If we don't recognize the problem as our own lack of skills, we blame the code. Whichever framing applies, putting the blame on the code rather than the author or the reader prevents hurt feelings and bruised egos.

Dave Farley gives a definition of code quality that kind of kicks the can down the road, and at the same time has little value except in hindsight. "If it's easy to change, it's high quality; if it's hard to change, it's not." But "easy to change" defies easy definition and measurement just as much as "quality" does. If we don't discover (or judge) code to have low quality until we need to change it, how does that help us write quality code to begin with?

How easy or hard a programmer thinks changing some code might get depends mainly on the programmer. An inexperienced programmer will find most code hard to read and change (and thus low quality by Farley's definition), whereas an experienced and skilled programmer will not have the same difficulty. A skills issue, in other words.

As programmers gain experience they usually internalize a vague set of heuristics by which they judge code quality. At the same time the range of code they can read and confidently change expands. Some programmers will get stuck in the "expert beginner" loop [1] and stop improving. Some programmers get hung up on aesthetic trivia and dismiss code as unreadable because of indentation or naming conventions. Many programmers arrive at some style or paradigm for development that works for them -- a local maximum -- then judge everyone else according to their metrics. Farley's podcast dealt with Extreme Programming, and he relentlessly promotes CI/CD. If you don't do XP, CI/CD, OOP, functional, "Clean Code," take your pick, your code must lack quality, because you didn't use some version of "best practices."

Brian Kernighan and P. J. Plauger wrote The Elements of Programming Style [2] back in 1974. That book summarized many of the heuristics experienced programmers had come to use for writing quality code, and for judging the quality of code. Kernighan focused on writing code, not so much on system design or how to organize projects, but I think he did put his finger on a lot of what we think we mean when we talk about code quality. Kernighan also gave the nicest and clearest illustration of code readability, or quality, as a skills issue in The C Programming Language (2nd Edition) in section 5.5, when he walks through multiple implementations of `strcpy` from beginner to expert, while explaining pointers.
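For readers who don't have K&R at hand, the progression goes roughly like this (paraphrased from memory; the version suffixes `_v1` through `_v4` are mine, not K&R's). Each version does the same thing; what changes is how much fluency with pointers the reader needs:

```c
/* copy string t into s; array-subscript version -- readable to a beginner */
void strcpy_v1(char *s, char *t) {
    int i = 0;
    while ((s[i] = t[i]) != '\0')
        i++;
}

/* pointer version -- increments the pointers instead of an index */
void strcpy_v2(char *s, char *t) {
    while ((*s = *t) != '\0') {
        s++;
        t++;
    }
}

/* fold the increments into the test expression */
void strcpy_v3(char *s, char *t) {
    while ((*s++ = *t++) != '\0')
        ;
}

/* idiomatic C: the assignment's value is the truth test */
void strcpy_v4(char *s, char *t) {
    while (*s++ = *t++)
        ;
}
```

A beginner finds `strcpy_v4` cryptic and `strcpy_v1` clear; an experienced C programmer reads `strcpy_v4` at a glance. Which one counts as "higher quality" depends entirely on who is doing the reading, which is exactly the skills-issue point.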

I listened to the podcast and couldn't tell what Farley meant when he referred to "more traditional approaches," but in context it seems he means any development style other than XP and/or CI/CD/TDD. I think saying that those "traditional approaches... start off assuming that we have to get things right at the beginning" caricatures up-front planning and design, drawing a false dichotomy between so-called waterfall (traditional, I suppose) and agile and XP. Farley looks old enough to know better. Software development embraced the reality of software changing over time, and new requirements driving ongoing development and maintenance, well before I started programming over 40 years ago. The concepts of modules, cohesion, and coupling -- central to designing and implementing readable and maintainable software -- got described back in the 1960s by Larry Constantine [3] at the beginning of the structured design phase, when the industry was grappling with large-scale software development and maintenance lifecycles.
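Those ideas predate objects and agile, and they show up in plain C just fine: a cohesive module exposes a narrow interface and hides its representation, so changes don't ripple outward. A minimal sketch (the `counter` module and its names are mine, for illustration, not anything from Constantine):

```c
#include <stdlib.h>

/* counter.h -- the only surface other modules couple to */
typedef struct Counter Counter;          /* representation hidden */
Counter *counter_new(void);
void     counter_add(Counter *c, int n);
int      counter_total(const Counter *c);

/* counter.c -- callers can't reach inside, so the internal
   representation can change without breaking any caller */
struct Counter { int total; };

Counter *counter_new(void)               { return calloc(1, sizeof(Counter)); }
void counter_add(Counter *c, int n)      { c->total += n; }
int  counter_total(const Counter *c)     { return c->total; }
```

Swap the `int` for a running log, a checksum, whatever -- only `counter.c` changes. That is loose coupling and high cohesion as described in the 1960s, no TDD or CI/CD required.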

No serious software designer ever started out assuming they had to, or could, get everything right at the beginning. Waterfall never worked that way either; it always had feedback and iterations, formal or informal, and programmers understood the necessity of iterating and checking in with users and stakeholders back in the olden days. Maybe Farley meant to contrast with some other tradition that I don't know about.

[1] https://daedtech.com/how-developers-stop-learning-rise-of-th...

[2] https://en.wikipedia.org/wiki/The_Elements_of_Programming_St...

[3] https://en.wikipedia.org/wiki/Coupling_(computer_programming...

hitchdev

6 hours ago

It's not purely about skill; code quality also improves as a function of discipline, a willingness to take risks, and outside pressure.

Fwiw I don't think I've ever seen clean code that didn't make use of something at the very least resembling both exhaustive CI/CD and TDD. There are some practices which are basically essential, even if some of the mavens of "best" practices mistakenly label a few that aren't necessary as necessary.

gregjor

4 hours ago

Can you define and measure "code quality"? If not, how can you say it improves?

Even granting your point, the things you list that supposedly improve code quality -- discipline, taking risks -- come under the "skills" heading in my understanding.

> I dont think Ive ever seen clean code that didnt make use of something at the very least resembling both exhaustive CI/CD and TDD

Yet almost all of the code ever written, including very successful long-lived things like Unix, the C standard library, Oracle, etc. got written before CI/CD and TDD. I don't have a definition of "clean code" to judge by, but I have certainly worked on lots of readable and maintainable code that did not come from TDD or CI/CD processes.