Worse Is Better
The other day I read that phrase, Worse is better… and it clicked with me. Not a very original statement, I know; just another iteration of Perfect is the enemy of good, but sometimes it’s not the words, it’s the phrasing. You don’t need a PhD in Linguistics to know this (though I happen to have one!).
The point is that I’ve been struggling with scalc’s code for quite a few weeks. The main roadblock is ticket #10, namely, that I want to merge, as far as possible, the code that handles math operations with the code that reads and runs commands.1
I don’t feel like explaining in detail what is going on there… precisely because my goal is to change that code for the better, but in a nutshell the situation is this: both operations and commands are defined statically and made available through two different “catalogs”, which are arrays of data structures. The problem is that operations and commands use two data structure types that are just slightly different, so the code that maps a string to the function pointer we want is… slightly different as well… and, therefore, duplicated.
Code duplication is not bad per se… but it definitely is a red flag in most cases. It smells.
I still don’t have a working solution. I tried two variations of the same strategy: hiding the arity of mathematical operations so that, basically, commands could work as 0-ary operations.2 The code became so utterly complex that it seemed like… I don’t know… worthy of being pushed to GLib’s mainline? Maybe GTK+? It was horrible.
There I was, sadly looking at my screen, when, in a moment of distraction reading some blog, I guess, I came across the phrase Worse is better. It made me look at the current code of scalc differently: still sad and frustrated, but… what if?
What if I just forget about ticket #10 and ship 0.1.0 as is? The code works, and all these changes I’m sulking over are… internal… OK, yeah, they could theoretically open scalc to other possibilities, maybe? We might save some bytes both in storage and in memory, of course, but… is it really so urgent that it should become a roadblock?
I’m starting to think it isn’t worth it.
I tried an experiment, this time with minitimer. There was a feature I had wanted to support for a long time: an output file, like ii’s. ii is a file-based IRC client, which has you send messages by writing to named pipes and read them by reading from files. As minitimer already has a named pipe, an accompanying output file seemed nice! What for, especially? To be able to reuse the timer’s output from outside the shell minitimer is running in. Yes, you could already do some weird redirection magic to achieve this at startup, but never connect to a timer that was already running…
…and I want my timers to be shown on swaybar, regardless of which terminal they’re running in.
So, if you’ve been paying attention (you should!), you know I released a new version of minitimer with this feature.
Now to the fun part: the implementation is bad. OK, it works, but it doesn’t give the user any choice: you can’t opt out of the output file, nor use it exclusively instead of the named pipe, for that matter. It’s immature, but it does its job.
The important thing is that the feature got released, no matter how, as long as we’re not breaking security or the system (nope, minitimer can’t do that). Commits are free: there’s always time to improve upon things. Who cares if the next couple of commits, and maybe the next release, are solely devoted to rounding things out?
Worse is better: aiming for a solution, any solution that works, will always be better than a hypothetical ideal one that isn’t being implemented.
The same goes for scalc, I think. I do strive to improve things, but over time. Sometimes a bad, sketchy pair of commits will set things moving so that later I may write a good, more “ideal” solution. But features come first… at least that’s how I see it.
And now, I’ll have to make a decision on scalc… ohnoes