Rob Pike just tweeted a historical artifact:
#define bitblt(s, r, d, p, c)(*((void(*)())0x430d6))(s, r, d, p, c)
First-ever C cast to function pointer? dmr wrote it for me in 1981.
Ah, 0x430d6! Who could forget the hexadecimal address of bitblt() in that C program Rob was writing on whatever version of Unix was running on that VAX in the machine room at Murray Hill in 1981?
Joking aside, it seems to me that this single tweet contains within it everything that makes C so good and so bad.
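Before guessing at the why, it helps to unpack the what. Here is my reading of the macro body, inside out (the annotations are mine, not dmr's):

    /* Reading (*((void(*)())0x430d6))(s, r, d, p, c) inside out:
     *
     *   0x430d6                  an integer constant: a raw memory address
     *   (void(*)())0x430d6       cast it to "pointer to function returning
     *                            void, arguments unspecified"
     *   *((void(*)())0x430d6)    dereference to get the function itself
     *                            (the explicit * is not required in C; a
     *                            function pointer can be called directly)
     *   (...)(s, r, d, p, c)     call it with the five arguments
     */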
If I can take a guess at what is going on here, 0x430d6 is the entry point of a bitblt() function that was probably written in assembly language (for speed, naturally, because C is bloated). So bitblt() would have been written in a separate file (a .s file instead of a .c file) and compiled separately. And perhaps in 1981 the linker wasn't quite as capable as the ones we have today, so you had to use an actual address instead of a symbolic name.
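For contrast, here is a sketch of how the same call would look with a modern linker. It assumes the assembly file exports the symbol bitblt, which is my guess, not documented fact:

    /* The routine is assumed to live in bitblt.s and be exported by
     * name; the linker fills in the address, so no 0x430d6 appears
     * anywhere.  (Empty parens mean "arguments unspecified" in the
     * pre-C23 style this era of code would have used.) */
    extern void bitblt();

    int main(void)
    {
        bitblt(0, 0, 0, 0, 0);   /* arguments are placeholders */
        return 0;
    }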
Now, why are we using #define instead of a C function? There are two possibilities. First, we don't want to call a function just to call another function; that's wasteful overhead. Second, the bitblt() function might need to be applied to arguments of varied types, and a C function would force you to assign types to those arguments, which might not work with C's type system. That would make this possibly the first-ever time that Rob decided he didn't like generics. (As an aside, I hate the term "generics" because it reminds me of Java and That Enterprise Smell; I prefer "polymorphism". And I think it's fine to leave them out of your programming language.)
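To make those two possibilities concrete, here is a sketch of the macro next to a hypothetical wrapper function. The parameter types are my invention; the real signature is unknown:

    /* The macro forwards its arguments untyped: whatever the caller
     * passes is what the code at 0x430d6 receives, and there is no
     * extra function call in between. */
    #define bitblt(s, r, d, p, c) (*((void(*)())0x430d6))(s, r, d, p, c)

    /* A wrapper function adds one call per use and pins every argument
     * to a specific type up front.  These types are hypothetical; the
     * unspecified-arguments call below is K&R-era C. */
    void bitblt_fn(void *src, void *rect, void *dst, void *pt, int code)
    {
        (*((void(*)())0x430d6))(src, rect, dst, pt, code);
    }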
So from this tweet we see all of the power that C gives to programmers. In C the compiler is dumb and the programmer can do whatever is necessary. And there is just enough abstraction in the language to make programs portable across the diverse computer architectures of the 1970s (at which point in time, C was considered a high-level language!). This was essential in a language that bootstrapped itself and an operating system onto all of those computers, and it led straight to the dominance of Unix we see today (in its derived forms: Linux, iOS, Android).
This power is unrestrained, and that is what makes it both beautiful and terrible. A language in which the programmer can jump to any location in memory is perilously close to a language in which every program contains eval(). History since 1981 gives ample evidence that this in fact does hold for C.
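To see how short that distance is, here is a minimal sketch: a few lines that turn any integer into control flow.

    #include <stdint.h>

    /* Any integer can become a jump target.  Compilers accept this cast
     * without complaint, and nothing checks that addr points at real
     * code.  If addr is ever derived from attacker-controlled data, the
     * attacker chooses what runs next. */
    void call_anything(uintptr_t addr)
    {
        void (*fn)(void) = (void (*)(void))addr;
        fn();   /* undefined behavior unless addr is a genuine function */
    }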
I like C, and I like programming in C. Historically I rank it #2 on the list of important programming languages. But there is no way to remove this danger from C, and that is why, in today’s security climate, we must abandon it.