EN 35: Conventions are not dogma

The other day I saw a comment about a particular way of writing code that shocked me. It was about using uppercase snake case for constants, LIKE_THIS.

The style of writing constants that way wasn’t what shocked me, as it’s quite common. The shock came from the idea that they felt justified in doing it like that because it was “industry standard” and because, in their many centuries of programming, they had always done it that way. Moreover, it was implied that if we really cared about quality, this “industry standard” had to be followed. I forgot to mention a crucial piece of data: the codebase doesn’t use that style anywhere.
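For reference, the clash was between these two styles (a trivial, made-up example):

```js
// The “industry standard” being invoked: uppercase snake case.
const MAX_RETRIES = 3;

// The style the codebase actually used everywhere.
const maxRetries = 3;
```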

Conventions are just conventions.

Convention: a usual or accepted way of behaving, especially in social situations, often following an old way of thinking or a custom in one particular society.

The value of conventions, as with standards, comes from people agreeing to follow them. A kilogram is a kilogram everywhere, but only because its definition was agreed upon; it remains useful as a base unit of mass only as long as we all stick to that same definition.

Going to a codebase that doesn’t follow a certain convention is neither heretical nor bad. Maybe for some people it would be spectacular if the entire world coded the same way, in the same languages and with the same code style. They could go in and out of projects with ease. But that’s rarely the case. Teams and organisations are more akin to living organisms than to robotic or stale entities.

Each team has its own idiosyncrasies, and while many things might be similar, there will always be some differences. For example, in the JavaScript world, ESLint and Prettier are two of the most used libraries. Typically, you’d enable the recommended rules by default, but in my experience, teams tweak them based on their needs: turning rules on or off, downgrading errors to warnings or vice versa…
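A minimal sketch of what that tweaking looks like, using ESLint’s flat config format (the specific rules picked here are just illustrative, not a recommendation):

```js
// eslint.config.js: start from the recommended baseline, then adjust.
import js from "@eslint/js";

export default [
  js.configs.recommended,
  {
    rules: {
      // Downgrade a recommended error to a warning.
      "no-unused-vars": "warn",
      // Turn a recommended rule off entirely (e.g. when another tool
      // already covers it).
      "no-undef": "off",
      // Promote a rule the team cares about that isn't in the baseline.
      eqeqeq: "error",
    },
  },
];
```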

ESLint in particular has recommended rules that make sense and prevent tons of errors, but Prettier is just an opinionated convention for formatting code, like any other before it. It automates formatting, following the rules its creators chose. You don’t have to think about it, and you can avoid the holy wars where the team has to agree on all the different ways to write code (tabs vs spaces, "" vs ''…). Ultimately, the team agrees to follow Prettier’s rules so everyone can worry about more important stuff and keep the codebase more or less consistent. And even then, Prettier’s rules usually get slightly adjusted!
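For example, a hypothetical team’s Prettier overrides might be as small as this, assuming an ESM project (both options shown are real Prettier settings; the specific values are just this team’s taste):

```js
// prettier.config.js: accept Prettier's defaults, deviate minimally.
export default {
  singleQuote: true, // default is double quotes
  printWidth: 100,   // default is 80
};
```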

Maybe you’re thinking that some conventions have a good rationale, and that it would behoove us to use them to prevent errors. There’s a good point there: conventions can arise from the need to prevent mistakes or, in the case of code styles, to add extra meaning that might not be in the language. A famous case is Hungarian notation: names like IStuff or decPrice that identify something as an interface, a decimal, and so on. The critical part to recognise is that most conventions are contextual; you can only apply them if they make sense in your context. For example, with all the modern tooling we have nowadays, Hungarian notation doesn’t make sense to me; it just adds more noise.
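To make that concrete, here’s Hungarian notation next to plain naming (made-up names; with editors and type checkers showing types on hover, the prefixes carry little extra information):

```js
// Hungarian notation: the name encodes the type.
const decPrice = 19.99;
const strCustomerName = "Ada";

// Plain naming: let the tooling surface the types instead.
const price = 19.99;
const customerName = "Ada";
```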

If I work in a codebase that uses Hungarian notation, I’ll have to code in that style, whether I like it or not and despite the many millennia I’ve spent in the profession doing things another way. If I want to change it, it can only be done by agreeing with the team, never in a unilateral and forceful way; otherwise we end up in the wild west, and I end up being a jerk.
