abc-1
12 hours ago
Contagion is exactly why interfaces are one of the most important pieces of design and should be given significant thought. A beautiful interface with a suboptimal implementation can be easily cleaned up when time is allotted. The reverse is rarely true.
majormajor
3 hours ago
I don't disagree, but I think you're commonly missing one of the two things necessary for a proper design:
1) time to design it 2) knowledge of exactly what it needs to do, both today and in a year
Sometimes you're missing both.
In which case I think you can prevent contagion from being too terrible by enforcing smaller modules and single responsibility in a compositional way. That doesn't require as much knowledge of the future, or as much time; it just requires you to avoid high-surface-area interfaces that end up with lots of behavioral variants controlled via parameters in a nesting-doll style. Instead, push your config/parsing/behavioral decisions to the edges of your logic rather than letting them seep into all your underlying models.
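A minimal sketch of what the parent describes (all names hypothetical): parsing and behavioral switches live in a thin edge layer, while the core logic takes plain, fully-resolved values and carries no mode flags.

```python
from dataclasses import dataclass

# Core model: no config objects, no mode flags, no parsing.
@dataclass(frozen=True)
class Invoice:
    subtotal_cents: int
    tax_rate: float

def total_cents(invoice: Invoice) -> int:
    """Pure core logic: operates only on fully-resolved values."""
    return round(invoice.subtotal_cents * (1 + invoice.tax_rate))

# Edge layer: parsing and behavioral decisions stay here,
# instead of seeping into the core as extra parameters.
def invoice_from_request(form: dict[str, str],
                         region_tax: dict[str, float]) -> Invoice:
    region = form.get("region", "US")
    return Invoice(
        subtotal_cents=int(form["subtotal_cents"]),
        tax_rate=region_tax.get(region, 0.0),
    )
```

The core stays small-surface-area: adding a new region or input format touches only the edge function, never the model or the arithmetic.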
appplication
2 hours ago
Agree, but I’ve found designing robust, future-proof interfaces to be one of the hardest problems in developing software. Even intentionally setting out to avoid tech debt at all costs, it’s just hard to do correctly. It requires more than technical bravado and architectural vision. It really does get into the realm of predicting the future.
abc-1
2 hours ago
Look at how mathematicians build minimal yet complete definitions for inspiration. An algebraic system can be created from a set of operations such as multiplication and addition, and existing concepts, such as money, can be mapped onto it; the underlying algebraic system never changes. It is complete.
Much of the system can be complete like this with forethought. The pieces that cannot can be factored out to the edges.
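One way to read that in code (a sketch; the names are illustrative): define the algebra once, here addition and scalar multiplication, and map a concept like money onto it. Everything else is expressed in terms of those fixed operations, so the interface never needs to grow.

```python
from __future__ import annotations
from dataclasses import dataclass

@dataclass(frozen=True)
class Money:
    """Money mapped onto a small algebra: addition and integer scaling.
    These two operations are the complete interface; they never change."""
    cents: int

    def __add__(self, other: Money) -> Money:
        return Money(self.cents + other.cents)

    def __mul__(self, k: int) -> Money:
        return Money(self.cents * k)

# Higher-level behavior is built from the fixed operations,
# not by adding new ones to the core type.
def total(prices: list[Money]) -> Money:
    acc = Money(0)
    for p in prices:
        acc = acc + p
    return acc
```

New requirements (discounts, line items, taxes) compose out of `+` and `*` at the edges; the algebraic core stays closed.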
appplication
an hour ago
You’re not wrong in a theoretical sense, but building useful interfaces that your average dev can grok enough to build on top of requires higher-level abstractions, approximations, and “reasonable defaults”. My experience is that only a small number of devs truly understand the codebases they work in (and care enough to be thoughtful in interfacing with them).
The majority of devs are generally happy to tack their features and PRs onto whatever random scaffolding they can, without regard for, or awareness of, how their individual component fits into the larger system, or how it may be extended. And to be honest that’s not necessarily a bad thing, because they do need to get work done, and merging PRs shouldn’t be reserved for the enlightened.
I guess I’m just pessimistic. The reason we don’t see perfect software is that we are not capable of producing it. At a certain point it all becomes spaghetti. If you work with software that isn’t spaghetti, it’s only because the people who care about it not becoming spaghetti haven’t left yet. This is good, but eventually they will leave, standards will decline, and you will become one with the pasta.
abc-1
an hour ago
Keep fighting the good fight. It’s more satisfying, even if entropy inevitably wins ;)
yodsanklai
6 hours ago
Which is why I like languages that make interfaces very explicit, like OCaml or Ada. Most of the time, I don't want to see the implementation, just a properly documented interface. If people can't describe the behavior of an interface in simple terms, something is wrong.