Sometimes it’s nice to go back to your roots. And sometimes, it’s nice to go back to the roots you never had.
As a computer science dropout, I'd never had the fundamentals presented to me in a coherent fashion. I'd picked up many of the connections between mathematics, logic, and electrical engineering from reading online, but that's no substitute for having the progression of computer science laid out systematically. Code: The Hidden Language of Computer Hardware and Software by Charles Petzold is the Computer Science 101 course I wish I'd had in college.
Code is a great trip through the fundamentals of computing. Starting from first principles, the author takes you through basic computer theory to the point where you could build your own computer. It's a journey that begins with a simple light switch and ends with the advent of the Internet.
Morse code, combinatorial theory, the telegraph, logic circuits: all are part of the story of the computer. It's a story told not through analogies but in a straightforward, building-block fashion. It describes the insights and intuitive leaps that lie behind the most important tool of our age.
At times, it can seem like computers and high-tech gadgets are magic. Reading a book like Code reminds us that there's no wizardry involved, just careful engineering.
We came close to steampunk-era computing. All the insights required to make simple calculators and computers were present in the late 19th century. Most of the equipment had already been invented for the telegraph. But it took several more decades before the pieces came together, in a 1938 master's thesis written by Claude Shannon while at MIT. (Shannon also popularized the name for the basic unit of digital information, the binary digit, or bit.)
The idea of separating code from data — something that gets beaten into every programmer at the beginning — goes all the way back to the origin of computing. Even though it's all binary numbers and logic gates at the hardware level, some of those numbers represent instructions and some represent data. Many of today's higher-level languages are so far removed from the basic circuitry that you can (sometimes) treat code and data as one and the same.
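To make that concrete with a toy example of my own (not one from the book): the same bytes in memory can be read as text, as numbers, or as instructions, and nothing about the bytes themselves tells you which.

```python
# A minimal sketch: the same five bytes, interpreted two different ways.
blob = bytes([0x48, 0x65, 0x6C, 0x6C, 0x6F])

print(blob.decode("ascii"))  # read as text data: "Hello"
print(list(blob))            # read as plain numbers: [72, 101, 108, 108, 111]

# Pointed at by a CPU's instruction pointer, those same bytes would be
# decoded as instructions instead -- nothing in memory marks the difference.
```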
It turns out the way most programming languages represent negative numbers derives from electrical engineering. The circuitry needed to subtract two numbers can be greatly simplified if you accept a few constraints on the size of the numbers and apply some tricks for how they're encoded. I'd assumed the reason had to do with storage space alone, but it simplifies the electrical circuitry required as well.
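The encoding trick in question is two's complement. Here's a rough Python sketch (my own illustration, not the book's) of why it matters: with a fixed width and this encoding, subtraction becomes plain addition, so the hardware only needs an adder.

```python
# Two's-complement arithmetic in 8 bits.
BITS = 8
MASK = (1 << BITS) - 1  # 0xFF

def twos_complement(n):
    """Flip the bits and add one: the 8-bit encoding of -n."""
    return (~n + 1) & MASK

def subtract(a, b):
    """Compute a - b using nothing but addition and the complement."""
    return (a + twos_complement(b)) & MASK

print(subtract(100, 58))    # 42
print(twos_complement(1))   # 255, i.e. 0b11111111, the 8-bit pattern for -1
```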
I had no idea that assembly language was essentially machine language with human annotations and symbols. I thought that they were fundamentally different, somehow.
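A toy sketch of what that means in practice (invented opcodes, not a real instruction set): each mnemonic is just a readable name for a machine-code byte, so "assembling" is little more than a table lookup.

```python
# Made-up opcodes for illustration only.
OPCODES = {"LOD": 0x10, "ADD": 0x20, "STO": 0x11, "HLT": 0xFF}

source = [
    ("LOD", 0x40),   # load the byte at address 0x40
    ("ADD", 0x41),   # add the byte at address 0x41
    ("STO", 0x42),   # store the result at address 0x42
    ("HLT", None),   # stop
]

machine_code = []
for mnemonic, operand in source:
    machine_code.append(OPCODES[mnemonic])
    if operand is not None:
        machine_code.append(operand)

print([hex(b) for b in machine_code])
# ['0x10', '0x40', '0x20', '0x41', '0x11', '0x42', '0xff']
```

Disassembly, as the name suggests, is just the reverse mapping: from bytes back to mnemonics.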
And speaking of assembly language, I now have a vague idea of what the disassembly Visual Studio shows me during debugging means. By the time you're finished with Code, you'll have a basic understanding of it as well, along with an appreciation for how tedious it was to program in. I'm glad to be working with today's much more expressive programming languages.
Code was a great read, and ought to sit on every professional programmer’s bookshelf. Part Wired, part Make, part junior high science class, it’ll teach you the inner workings of the complicated and ubiquitous devices that run everything these days.
My only criticism of the book was that it wrapped up too quickly. The graphical user interface gets only a chapter, and events since the rise of the Internet barely half of one. I’d like to see another volume (or two!) covering in depth everything that’s happened since the Apple Macintosh.
But that’s really more of a compliment than a criticism.