January 30, 2010 at 4:12 PM by Dr. Drang
Among the many disadvantages of middle age are: the loss of suppleness in both your mind and body; the simultaneous loss of hair where you want it and growth where you don’t; and the recognition that people in Cialis commercials are meant to represent you (albeit much better looking). One of the few advantages is the pleasure you get from complaining about how the world is going to hell in a handbasket and how much better things were in your day.1 I’m sorry to say that an unfortunate side effect of this week’s iPad announcement is that pre-middle-agers are now horning in on nostalgia, a pastime that should be restricted to their elders.
For example, both Mark Pilgrim (37) and Alex Payne (26!) have written little essays on how the iPad is going to destroy the idyllic world in which they grew up and blossomed, replacing it with a harsh dystopia in which turtlenecked priests in Cupertino decide who may learn to program and who may not. The essays have struck a chord with other wet-behind-the-ears programmers who are worried that the rise of appliance computers will push out the kind of open, tinkerable computers they grew up with.
Maybe they’re worried because they’re so young they’ve never seen change before. I have no doubt that iPad-like devices will change the way computing is done, and I have no doubt that this will make the practice of programming very different 10-15 years from now. But it doesn’t worry me, because I’m old enough to have seen changes at least as drastic.
When I took my first programming class— Well, let’s stop right there. When I started programming, almost everyone learned to program by taking a college-level class. This was the late 70s, and although the personal computer revolution was under way, PCs were by no means common.
The first order of business in my introductory programming course was learning how to use a keypunch machine. That’s how we made the stacks of cards fed into the computer. We almost never saw the computer itself. Computer operators—talk about a priesthood!—would come out of the machine room, pick up the stacks of cards, and do God knows what to them. Some time later (often hours later), the cards would be returned to us along with a printout of the results, which were usually a set of compiler errors.
A year or two later, when I first sat down to work at an interactive terminal, it was an absolute revelation. I could correct my typing errors by backspacing!
The march of computing history has been toward greater accessibility. More computers, more people using computers, and more people able to program their computers. I don’t see this changing.
The nature of programming will definitely change, just as it has in the past. My first computer programs were compiled, and hardcore programmers wrote in assembly language. Nowadays it’s more common for people to write in interpreted (or perhaps semi-compiled) languages. No one seriously considers this to be a sign of the death of programming.
I was particularly struck by one part of Mark Pilgrim’s essay. He talks about starting in BASIC on an Apple ][e and then moving on to a Mac, where he played with MacsBug and ResEdit. The thing is, in 1984 much of what is now being said about the iPad was being said about the Mac. You couldn’t write programs on the first Macs; development was done on a Lisa. It wasn’t until HyperCard that the Mac had a simple, free, Apple-supported development environment that kids could play around with.
So buck up, Gen Xers and Gen Yers and whatever other Gens there are! Things won’t be as bad as you fear.
And get off my lawn.
1. This does not, of course, prevent you from also lecturing the young on how tough the world used to be and how easy they have it. Memory loss allows a certain inconsistency. ↩