I live in an age of miracles and wonder, but can’t bring myself to be impressed. As noted elsewhere here, I run computers that are four or five years old and don’t miss having new, because there just isn’t that much new to miss. My Windows phone has the problem of being a Windows phone, but is basically very, very competent. Astoundingly so, if you could have put one in my hand back in, say, 1998.
I get it – these devices are, by any standard, extraordinary. But they do not move me. For that, to get my sense of wonder on, I have to reach back.
For a long while after I first got seriously interested in computers, my special thing was the history of the machines and software. I figured I was catching up, and that as soon as I read through the mainframe era, the “Soul of a New Machine” mini-computers, the Apple IIs and the original PC, I’d be done. I bought a lot of obsolete and odd machines along the way.
Turns out, I care far more about how we got to the present than about the present itself. And lately, more about the past than the future; some of that’s me getting older, but more of it is disappointment with what I’m being sold. I was expecting some kind of intelligence augmentation; what I got was Twitter.
I’m about halfway through George Dyson’s “Turing’s Cathedral,” which is a deep dive into the construction of one of the first computers, and into the way software was first worked out. I’ve read a lot of books that cover much of the same ground, but they all tell the story differently. “Cathedral” adds to the genre with what amounts to a mini-biography of John von Neumann. It also throws light on some names I know, but didn’t know much about, especially Stanislaw Ulam, a mathematician who was a protege of von Neumann, and who strikes me as underrated in the history of computing.
Mostly, I realize again just how smart all these people were, and how they were fumbling along, taking the magnificent hypothetical of Alan Turing’s universal computing machine and instantiating it with vacuum tubes and wire for memory – and, most importantly, how the real world of balky electricity and bad air conditioning shaped what was possible. Dealing with that real world ended up being the critical accomplishment of the computer pioneers – it was hard to think of storing information as binary digits, but it was insanely difficult to turn that concept into something fast and reliable. They solved big, practical problems building computers, which I think gave them the confidence to see these machines as reliable tools for thinking, and to tackle even harder, more complicated problems – they made the modern world of weather forecasting and climate science, to name just one.
And that’s the thing I keep returning to – my phone, my tablet, my computer are designed more to save me from thinking than to extend and enhance what I can do. Yes, I can (and do) use these machines to think with, but the central tendency, all the weight, seems to be elsewhere. They’re all about – just watch the TV commercials – a seamless experience. In fact, as I was writing this I saw some beautiful footage of a city, apparently shot from the back of a metro train, and framed by white. It was simple, graceful, lovely. It was, the ad said, shot on an iPhone 6.
One suspects the actual turning of the iPhone video into something suitable for TV was more complicated than the ad suggests, but never mind; like much advertising, this ad is aspirational. It wants us to picture ourselves gliding through life as easily, as smoothly as the video of the city passes by the train window, with the help of our iPhone, of course. Now this is not new: car manufacturers have been doing it for decades. But it is at odds with the history of computing that we’ve been sold, parts of which I believe to be true: these machines were developed by very smart – and in some cases very noble – people; there was a strong drive to democratize the technology; it can be used to spread knowledge in an unprecedented way. That’s the picture we’re supposed to hold in the background, while we more and more live in the tightly packaged, increasingly centralized experiences of social media. Our lived experience of computing is at odds with the reason it’s been given special status.
To put it another way, I follow with professional interest the growth of “mobile” as opposed to desktop or laptop computing – and it’s a rout. Mobile – phones – is the dominant computing experience for most people. And what is mobile really good for? Consuming things – words, music, increasingly video. But as a means of production, mobile sucks. Yes, you can post to Twitter or Facebook or Instagram, but I wouldn’t try writing this blog on my phone. Think of it as another example of the most important lock-in going on, the one that all content companies, ISPs, aggregators, you name it, agree on – we are here to be marketed to, period. All this talk about using the internet to get smarter, to learn things, is, well, not in the business plan.
The good news here is that the internet remains a big and somewhat unruly place, and you can still go against the grain, find the places and people that can teach you something, though this is much easier done on a laptop or desktop than on your phone. (Why is the web such a generally crappy experience on phones and tablets? I almost never use either to look at the web itself, because I find it slow and buggy and miserable.)
But you can feel the exits slowly being sealed, the back streets blocked off or renovated out of existence in the name of a new! better! internet, one which is largely controlled by a handful of companies, with various toll schemes thrown up to extract money from the rest of us, while effectively blocking any opposition movements from forming. (And yes, in this scenario there will be a fair amount of highbrow culture available – the Comcast Symphony Orchestra would be both an attractive tax write-off and a lovely counter to arguments that it’s somehow less than optimal for three cable companies to control the vast majority of U.S. internet traffic.)
What will we be left with? I fear more things like my iPhone 6 video – a frictionless not-quite experience, a place where the real world is increasingly viewed not as a source of necessary constraints – a good thing, the place from which you start learning – but as something that isn’t quite as good as what we have in our pocket. Much of the history of computing has been a thrilling story of how we learned to make tools, and become competent in entirely new ways; right now, the story seems to be how the tools have been turned against us, and how little we care.