Worse but Better
In a perfect world, software would be like a work of art. Sleek, transparent, designed with mathematical precision and philosophical depth.
Every line of code would be crafted like a verse in a Shakespearean sonnet, and every API like a finely tuned harpsichord - harmonious, elegant, timeless.
But, as you can probably guess, we don't live in a perfect world. We live in a world where worse wins out over better. And not just occasionally - it’s practically the rule. This is exactly what Richard Gabriel wrote about in his famous essay “The Rise of ‘Worse is Better’”, which - ironically, given its intellectual depth - survives today mostly as a kind of legendary meme: Unix beat the Lisp Machines not because it was better, but because it was simpler, uglier, and easier to deploy. In that essay, Gabriel contrasts two design philosophies - one elegant and idealistic, the other down-to-earth and pragmatic. One prioritizes aesthetics and consistency; the other just tries to make things work somehow. Guess which one conquered the world?
Yes, that's a rhetorical question. Anyone who has ever tried to implement an architecturally flawless solution in a corporate environment knows that code purity takes a back seat to what I like to call “Excel purity” - meaning KPIs align, and the client doesn’t call after hours. Meanwhile, things like sed, awk, and bash - even though they look like ASCII art generated after a head injury - actually work, are readily available, and solve problems faster than you can even think about refactoring them.
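To make that concrete, here is the kind of thing I mean - a quick sketch, with an invented log path and message format, that answers “what is breaking most often?” long before anyone finishes designing a proper log-analytics pipeline:

    # Hypothetical quick fix: rank the most frequent error messages in a log.
    # Assumes lines shaped like "2024-05-01 12:00:00 ERROR: something broke".
    grep 'ERROR' /var/log/app.log \
      | sed 's/^.*ERROR: //' \
      | sort | uniq -c | sort -rn | head -5

It’s fragile, it’s opaque at a glance, and it answers the question in under a minute. That, in five lines of pipe, is the whole argument.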
This whole concept of “worse is better” is eerily similar to the legendary mindset of the so-called DevOps engineer who logs into production as root and fixes everything with a single script. And you know what? Sometimes it works. Sometimes it works better than a distributed system designed by an architect who’s read seventeen books but tested none of them in a real-world environment.
This brings us to a bitter truth: beauty is not a currency in IT. Functionality is. A system that is beautiful but difficult to implement will lose to one that’s clunky but can be pushed to the server without compilation, testing, or the CTO’s blessing. And that second system? There’s a good chance someone will still understand it six months from now.
Does that mean we should accept mediocrity? Not quite. But we should understand it. It’s time to stop acting surprised that Python became entrenched in machine learning, even though under the hood it’s basically a collection of performance-avoidance techniques wearing a nice syntax. Or that JavaScript - a language born as a quick hack for Netscape - now powers full-scale banking applications. This isn’t a triumph of quality; it’s a triumph of compromise.
Gabriel points out that a “worse” system has an advantage because it’s easier to implement and easier to understand. And people choose what works now, not what might be perfect in some vague future. Somewhere between “we’ll fix it later” and “this has to work by Friday” lies the whole tragicomedy of modern programming.
Let’s be honest: how many of us have had the time to take a system to architectural perfection? And how many of us have written sloppy workarounds because the client called to say, “The page doesn’t load”? And no, not because we don’t know better - but because reality isn’t a GitHub repo full of theory. It’s a minefield with a deadline.
Picture a software architect. Not the one from LinkedIn. A real one - fifteen sprints in, holding a cup of cold coffee, fresh off a production bug report. Is this the time to contemplate semantic purity and monads? No. It’s time to find a fix. Any fix. As long as it works.
And that’s exactly why the “worse is better” philosophy works. It doesn’t ask for permission. It doesn’t wait for ideal conditions. It doesn’t require three PhDs to understand. It’s like that old script no one dares to touch - but everyone’s glad it still runs.
That doesn’t mean we should surrender to chaos as the new normal. But maybe it’s time we stopped condemning pragmatic solutions just because they wouldn’t look good on a conference slide. Maybe real maturity in IT means recognizing beauty - and knowing when to walk away from it in order to get something done.
So next time you come across code so ugly it makes your eyes water and your brain throb - but it works - maybe don’t mock it. Maybe bow your head, just a little. Because it might be another case of “worse” that, once again, turned out to be better.