You know how annoyingly often you want to display a ratio — a percentage or framerate or some such — and you don't care what happens if you divide by zero, because the result isn't meaningful then anyway? But you don't want it to crash, so you end up writing things like this, over and over...
printf("did %d things (%d%% hard) at %d/s\n",
       n, n ? nhard * 100 / n : 0, dt ? n / dt : 0);
If you're writing it over and over, it should be a function, of course:
div0 a b = if (zero? b) 0 (a / b)
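In C, a minimal sketch of the same helper (assuming integer arguments; the values in main are just for illustration) lets the printf call above read straight through:

#include <stdio.h>

/* Sloppy division: returns 0 instead of crashing when the divisor is zero. */
static int div0(int a, int b) {
    return b ? a / b : 0;
}

int main(void) {
    int n = 42, nhard = 7, dt = 3;
    printf("did %d things (%d%% hard) at %d/s\n",
           n, div0(nhard * 100, n), div0(n, dt));
    return 0;
}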
This is a sinful function: it doesn't help you do the Right Thing, only helps you be sloppy more conveniently. Programmers tend to feel guilty about writing such functions, and especially about including them in public interfaces, lest they encourage someone else to be sloppy. Often that guilt deters us from writing them at all, even when we plan to be sloppy anyway, so there would be no harm in it. Language designers avoid such features even more scrupulously, and lament the ones they're forced to include for backward compatibility. Even if div0 is frequently useful, you won't see it in many languages' standard libraries1.
On the other hand, a large class of language features are valuable because they support seemingly sloppy practices. Garbage collection lets you leak memory with impunity, dynamic redefinition lets you modify a program while it's running, automated tests let you use code you can't prove correct — so rejecting features that promote sloppiness is not a very good heuristic. I suspect a better understanding is possible here, but I don't have it yet.
1 Although there probably is some language which lacks a good way to report exceptions, and therefore has div0 as the standard division operator.