I remember one program. What they used to call a dusty deck, after the deck of punched cards it was stored on. Dusty from years sitting on a shelf. It had one constant for two very different purposes. The equivalent of using foo for both the range of values for the month and for the hour in a date-time, because they both happen to be 12. Lots of luck in Israel, where the Hebrew calendar (1..13) and military time (0..23) are often used.
Your intentions should always be clear to the engineer who has to work on your code after you. Or port it to another language or another operating system. And to the compiler that has to generate valid instructions.
Don’t rely on implicit type conversion, with the possible exception of upsizing, e.g., upsizing an int to a long.
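A minimal sketch of the distinction, in TypeScript. TypeScript has no separate int and long, so number and bigint stand in for the narrow and wide types here; the variable names are made up for illustration.
const rowCount: number = 2_000_000_000;       // fits comfortably in a regular number
// Writing rowCount + 1n is a compile-time error: the language refuses to
// convert number to bigint implicitly. Widening explicitly makes the
// intent plain to the next reader:
const total: bigint = BigInt(rowCount) + 1n;  // 2000000001n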
Your programming language might have standardized rules for what happens if you write 7 + "5". Which, by the way, may not be the same as if you write "7" + 5. Don’t rely on this without an awfully good reason. That it works now is no guarantee it works everywhere and everywhen or will be obvious to whoever looks at your code.
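For what it’s worth, here is a brief TypeScript sketch (in TypeScript, as it happens, both mixed expressions evaluate to the string "75"); writing the conversion out removes the guesswork:
const fromString = 7 + Number("5");   // 12: arithmetic was the intent
const asText = String(7) + "5";       // "75": concatenation was the intent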
Don’t rely on how an empty string or a zero or a NULL pointer is treated in a Boolean context. Explicitly convert it to Boolean so that your intent is clear and your code isn’t dependent on language minutiae. You need to know what if (object) does, if only because the language has idioms other people will use. But if (object != NULL) is clearer.
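A small TypeScript sketch of the same point; object is a made-up name, and TypeScript spells the comparison !== null rather than != NULL:
const object: { id: number } | null = Math.random() > 0.5 ? { id: 42 } : null;
// Idiomatic, but implicit: it works only because null happens to be falsy.
if (object) {
  console.log(object.id);
}
// Explicit: the comparison says exactly what is being tested.
if (object !== null) {
  console.log(object.id);
}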
Like most rules, there are exceptions, and it’s a judgment call. If the language generates a Boolean value for a || b, then you might as well make it a || bool(b).
But if the result of || is the actual value of the first operand that doesn’t evaluate to FALSE, then you have the useful idiom:
result = maybe_empty || "default value";
Which is more graceful than:
result = (object != NULL) ? object : "default value";
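TypeScript is one language where || behaves the second way, returning the first operand that isn’t falsy, so the idiom works as written; maybeEmpty is a placeholder name:
const maybeEmpty: string = "";                 // imagine this arrived from user input
const result = maybeEmpty || "default value";  // "default value", since "" is falsy
// The same intent spelled out with an explicit comparison:
const spelledOut = maybeEmpty !== "" ? maybeEmpty : "default value";
One wrinkle: the || form also replaces 0 and an empty string with the default, not just null or undefined, which is why the language later added the ?? operator for cases where only a missing value should trigger the fallback.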
Also, if you’re working on code that follows an established style, follow the style, not the rule. There’s a higher rule: Standard is better than better.