But at some point, when Alan Turing’s abstract theory of computation, based in large part on Gödel’s 1931 paper, collided with the concrete engineering realities, some of the more perceptive people (Turing himself and John von Neumann especially) put two and two together and realized that their machines, incorporating the richness of integer arithmetic that Gödel had shown was so potent, were thereby universal. All at once, these machines were like music boxes that could read arbitrary paper scrolls with holes in them, and thus could play any piece of music whatsoever.
The early computer engineers thought of their computers as number-crunching devices and did not see numbers as a universal medium. Today we (and by “we” I mean our culture as a whole, rather than specialists) do not see numbers that way either, but our lack of understanding is for an entirely different reason — in fact, for exactly the opposite reason. Today it is because all those numbers are so neatly hidden behind the screens of our laptops and desktops that we utterly forget they are there. We watch virtual football games unfolding on our screen between “dream teams” that exist only inside the central processing unit (which is carrying out arithmetical instructions, just as it was designed to do). Children build virtual towns inhabited by little people who virtually ride by on virtual bicycles, with leaves that virtually fall from trees and smoke that virtually dissipates into the virtual air. Cosmologists create virtual galaxies, let them loose, and watch what happens as they virtually collide. Biologists create virtual proteins and watch them fold up according to the complex virtual chemistry of their constituent virtual submolecules.
I could list hundreds of things that take place on computer screens, but few people ever think about the fact that all of this is happening courtesy of arithmetical operations carried out on integers.
Actually, it’s more ambiguous than that, and for the same reasons that I elaborated in Chapter 11. Wherever there is a pattern, it can be seen either as itself or as standing for anything to which it is isomorphic. Words that apply to Pomponnette’s straying also apply, as it happens, to Aurélie’s straying, and neither interpretation is truer than the other, even if one of them was the originally intended one. Likewise, an operation on an integer written out in binary notation (for instance, the conversion of “0000000011001111” into “1100111100000000”) that one person might describe as multiplication by 256 might be described by a second observer as a left-shift by eight bits, by a third as the transfer of a color from one pixel to its neighbor, and by yet another as the deletion of an alphanumeric character in a file. As long as each one is a correct description of what’s happening, none of them is privileged. The reason we call computers “computers”, then, is historical. They originated as integer-calculation machines, and they are still of course validly describable as such — but we now realize, as Kurt Gödel first did back in 1931, that such devices can be equally validly perceived and talked about in terms that are fantastically different from what their originators intended.
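To make the equivalence of these descriptions concrete, here is a small sketch in Python; the language, the variable names, and the reading of the sixteen bits as two adjacent “pixels” are my own illustrative choices, nothing intrinsic to the machine.

    # One and the same sixteen-bit event, described in three equally valid ways.
    word = 0b0000000011001111             # decimal 207, the pattern from the example

    as_multiplication = word * 256        # "multiplication by 256"
    as_shift = word << 8                  # "a left-shift by eight bits"
    assert as_multiplication == as_shift == 0b1100111100000000   # identical result

    # A third observer treats the sixteen bits as two adjacent eight-bit "pixels"
    # and sees the very same operation as a color value moving from one pixel
    # to its neighbor.
    before = (word >> 8, word & 0xFF)         # (left pixel, right pixel) = (0, 207)
    after = (as_shift >> 8, as_shift & 0xFF)  # (207, 0): the color has moved over
    print(before, after)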
Universal Beings