Maybe “Recursive” isn’t a bad word after all

So I nearly wrote an entry last week wondering whether “recursive” referred to anything at all beyond tricks on the order of animated GIFs. Because, really, does anything ever coincide with itself like that? Whenever you get back to where you started, neither you nor the start is the same. A bit like Heraclitus (you can’t step in the same river twice, he said, supposedly).

But then when we were asked to come up with a Metaphor of our Own (you’ve read that Forster novel, right?), mine was the Dream Machine. It was to be an immedia machine, meaning no media would mediate between brainwave and Dream Machine (DM) simulation of the brilliant poem, scherzo, or watercolor in my head. So a species of immediatism, just in case you’re a fan of early Hakim Bey. Unlike the Dynabook and the iPad, where you’re always putzing around with some plastic and glass and software algorithms to translate your conception into their midwifed birthing of a not-quite-your-conception.

Which means I recursed upon our earlier reading by Ted Nelson by coinciding with (ok, well, stealing) his book title. Though I was dreaming something different from his Xanadu of parallel textfaces. So while not recursion exactly (or is that properly?), perhaps it was a recursing of the perennial human tendency to project our frustrations onto a machine that would relieve them by fulfilling our thwarted desires.

Which would suggest that our fascination with what’s normally meant by “recursion” may in fact be an attempt to coincide with ourselves rather than drift forever in Heraclitean, Derridean self-difference. File under fool’s errands.
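For the record, here is the programmer’s ordinary sense of the word, in a toy Python sketch of my own (an illustration, not anything from our readings): a function that calls itself. Notice that even here the coinciding is imperfect: each call gets its own stack frame and its own n, so the function never quite steps into the same river twice.

```python
def countdown(n, depth=0):
    """A function that 'returns to where it started' by calling itself."""
    print("  " * depth + f"entering countdown(n={n}) at depth {depth}")
    if n == 0:
        return  # base case: the recursion bottoms out
    countdown(n - 1, depth + 1)  # the same function, but a new frame and a new n

countdown(3)
```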

Truth is, I don’t want my kind of Dream Machine. It would be too much like a hypercompetitive parent or older brother who did all your ideas better than you could. Who needs that kind of grief, especially at my age?

Give me a pretty interface and a program that doesn’t crash and does maybe 60% of what I want it to. That’s about right for a midweek evening.

The Case of the Provincial Nostalgic

Nostalgia is a peculiar filter built for emotional gratification. It works by reinventing the past as a dreamworld unconsciously designed to fend off the present or to generate some sense of security in one’s own sensorium. The nineteenth century, particularly in its later decades, was notorious for inventing culture worlds that were awesomely wonderful alternatives to the ruthless industrialization that fretted the age’s gentler thinkers.

Provincialism is another filter, also gratifying, one that represses the Other, variously conceived in terms of region, era, ethnicity, or sensibility. It allows someone to reorient the world and all of humanity around the self, as the self…

When you put the two together, it gets either scary or annoying, depending upon what’s at stake. All of which is an elaborate sigh of exasperation at our editors’ commentary upon a piece by Kay & Goldberg presenting their Dynabook. Their breathless page is the case of the Provincial Nostalgic in my title: they admire these two for all the wrong reasons, in a way that slights others working at the time and is demeaning, really, to Kay & Goldberg.

Having whacked the hornet’s nest with my baseball bat, let me explain the offense(s) taken:

  1. O, please: does anyone really think that in 1977, date of original publication, no one had ever thought of a handheld device that would do everything and be connected to the mother ship of data? That is the least commendable achievement of the piece. Sci-fi is littered with versions of the iPad going back decades. Star Trek began televising in 1966. Two years later, any American with a tomorrowland kind of pulse watched Dave and Frank use their Kubrick-edition iPads in 2001. Stanisław Lem, Isaac Asimov, and many others did the imagineering for which the editors so awkwardly laud Kay & Goldberg. Patents on tablet machines for pen input go back to 1888. So, really, to praise them for thinking of all the things a handheld computer could do is nonsensical. If they really did “conceive the computer from a radically different perspective,” the question might be: different from whom? Certainly not from an enormous host of people who could imagine a computer doing such things. Which insight takes us to point number two, now that a vast swath of the past has been restored to visibility.
  2. Different from… What we ought to laud Kay & Goldberg for is figuring out how to make a functioning prototype. That, really, is harder than just imagining the device itself. How do you make parts small enough, capable enough, and fast enough? There wasn’t much on the shelf that you could use, though the idea of miniaturization had been around a while: transistor radios were demonstrated in 1954 and sold in the billions through the 1960s and 70s. The noteworthy factor here is that Kay and Goldberg presided over a team that figured out the software and hardware designs and the marriage of the two. That takes some doing. Their designer-selves drove the concern with ease and speed of use, their engineer-selves drove the concern with capacities and procedures, and their entrepreneur-selves drove the imagining of it from the point of view of mass users wanting a “multimodal” device and children being able to use it.

It takes engineers, designers, and entrepreneurs to make real things that real people can use, Apple being a case in point. All three do imagineering, each contributing a key piece of the vision, and if you lack any one of the three, you don’t get there (see Android tablet culture and Surface un-design for useful warnings). What’s remarkable is that the Xerox team had a synergy going among the three legs of the technology stool.

Not that they conceived the idea of a handheld, not after all those reruns of tricorders and communicators and universal translators. But that they made a functioning device out of stuff that could barely do it, and that they won the battle that Microsoft’s Courier team lost to the spreadsheet and floppy-drive guys. No doubt the editors want to imagine Kay and Goldberg as versions of their own artistic, digitalizing, creative selves. But, really, the Xerox team was knee-deep in wires, busted circuit boards, and usability testers going wild with Smalltalk.

Living History

1974. The Altair computer (named, they say, for a destination in that week’s Star Trek episode) appeared on the cover of Popular Electronics, and Paul Allen told his pal Bill Gates that microcomputers were on the way. Check.

Also, that same year, Ted Nelson publishes his thoughts on user interface and, more generally, on how computers should work. In 2010, what he was looking for showed up on the market with a “ten-minute system,” a “prefabricated environment carefully tuned for easy use,” and an interface in which you touched stuff on the screen to move things along (though without Nelson’s beloved lightpen). And critics didn’t like it any better than they’d liked the iPhone or the iPod when they first came out. Check.

Interesting to see Nelson as a culture warrior against the Geeksquad’s ethos: the intuitive versus the infinitely tinkerable, the touchscreen versus the command line, the creative non-technical versus the programmer technophile… And the closed garden for creative play versus the bloatware infinity of buttons and palettes and options.

He has a cardboard mockup; we have OS X and iOS. He has Thinkertoys (p. 332); we have Scrivener, which offers all seven of his key traits for presenting “‘views’ of the complexities in many different forms” so that we can use the computer as a “decision/creativity system.” He has Parallel Textface™; we have wikis and Snapshot versioning comparisons. He has the Edit Rose™; we have the Toolbar. We both have Undo and History. He thinks “the mechanisms at the computer level must be hidden to make [user clear-mindedness] work”; iOS hides everything but the doing (of painting, movie-editing, mind-mapping, writing, and various other forms of, as he calls it, “collateration”).

We both also have the issue of dreaming something worth dreaming with our liberating computers. Which is why I’m looking forward to our getting into those who are doing the dreaming in this seminar on “awakening the digital imagination.”

Two ways to miss the point…

It’s easy to lump together Vannevar Bush & Douglas Engelbart because they’re both, shall we say, on the geeky side, imagining how to plug this into that and physically manage transfers and copies and recordings and the like. And they were writing their essays within a generation of each other. But that agglomeration misses the point of the difference between them, and between both of them and the most interesting innovators around right now.

I could fall back on the last post and say that, on a continuum that matters, Bush has a much lower ratio of digital to analog thinking than Engelbart shows us (the memex is, almost entirely, a thing that a human being uses to do certain practical tasks more efficiently, a thoroughly analog storyline, whereas Engelbart borders upon delirium as he loses himself in the network of associations that explodes outward from his first simple notchings of notecards).

Or I could rely on mushy terms like system or network to explain the difference, except that both words are in themselves acritical: they fail to mark the distinction clearly for all readers (and definitely fail for some), thanks to two limiters from conventional western thought. That is, conventional thought is comfortable with both “system” and “network” as long as it can define them in its own way.

So here is an effort to distill something useful from long teaching of postmodern thought, new art in several mediums, and digital literary culture: the two easiest (or is that laziest?) ways to miss the point, the point at which one might think “like a native” of the digital rather than as an analog interloper taking a quick look around and inadvertently reconstructing the familiar rather than really seeing what’s emergent.

Pay no attention to the man behind the curtain (there’s no one there, and no curtain, and no not-curtain, and…)

You can think of network as lines connecting existing nodes. Which is not really a network at all, but more like a bunch of classically conceived entities communicating, more or less. It’s so common-sensical it’s almost hard to see why I’d bother talking about it. But, classically, we have thought of things as if they were separate, even autonomous, maybe even transcendental entities with a definable essence and just enough ineffability remaining to make each seem unique. So, in other words, you have nodes that pre-exist the so-called network that links them, like Bush’s human being who sits down at his [sic] Memex. If, on the other hand, you’d crossed the great paradigm divide (by tapping into Dadaism rather than surrealism or, worse, canonical modernism, or by following the Oulipo, or postmodern art, or post-structural theory, or by actually reading Nietzsche or Heraclitus or any one of a number of insurgent thinkers, or…), you might conceive of nodes as epiphenomenal effects of interconnections. That is, “nodes” are effects of connections, nanosecond-by-nanosecond products of the sum total of relations intersecting at any given point. If rates of change are slow enough, we mentally construct an entity out of a series, the way we take rapidly flashed stills as a smooth moving picture (as we used to call film).
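To make that contrast concrete, here is a toy sketch of my own in Python (an illustration, not anything from our readings): the common-sense picture stores nodes first and then draws lines between them, while the other picture stores only relations and computes each “node” fresh, moment by moment, from whatever intersects there.

```python
# Picture 1: nodes pre-exist, and edges merely connect them.
nodes = {"you", "memex", "library"}
edges = {("you", "memex"), ("memex", "library")}

# Picture 2: only relations exist; a "node" is an effect computed,
# moment by moment, from the connections intersecting at a name.
relations = [("you", "memex", 1.0), ("memex", "library", 0.5)]

def node_at(name, relations):
    """A 'node' as epiphenomenon: the momentary sum of its intersecting relations."""
    return sum(w for a, b, w in relations if name in (a, b))

print(node_at("you", relations))        # 1.0 at this instant...
relations.append(("you", "seminar", 0.5))
print(node_at("you", relations))        # ...1.5 a moment later: no stable "you"
```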

For ontologists on the far side of that great paradigm divide, “humans” are wetware whose flash memory operating systems are continuously updated by experiences within culture, “within” because there’s no “without.” And since those lines of interconnection are themselves in continuous flux, pulsing and expiring, rerouting and transforming, none of the stabilities inherent in the conventionally conceived network pertain. Be careful: you may be changed by a randomly accessed input—which is not a new phenomenon (books change people), but just accelerated in terms of the number and diversity of exposures to which we now have access.

So no doubt it gets worse, right?

Ok, so a truly digital network is not a library reading room in which you can walk in from the outside and sit down at a table to discover another object that changes you. You are, instead, always already connected and undergoing digital reconfigurations of body and consciousness. “You” can’t step in the same river twice (Heraclitus) because the so-called “you” that sticks a foot in the “second” time is not entirely the same.

But all this talk about “you” signals the second way to miss the point: if you try to begin with (whatever kind of) “you” and then add a network to it, you’re still using a logic of independent (autonomous? transcendental?) boxes to limit the scope of what you’re talking about, the way we classically scissor one shape out of the world and talk about it as if it were a meaningful “it” once we’d done so: its meaning is really a function of the systems of concepts and symbols and processes valorized by our little moment in cultural history. Or, to get concrete about it, the story isn’t about a man [sic] sitting down at Bush’s memex; it’s about a human-memex aggregate (system) working within larger aggregates: networks of networks, systems of systems.

The more we are aware of how these systems cycle through each other, the less provincial we are in thinking about a human who augments itself by outsourcing some memory or some correlation functions. When Engelbart is talking about “associative linking” and “a complex symbol structure that would grow as the work progressed,” he’s walking across the paradigm divide from Bush’s world to the one in which you cannot understand where this is going unless you no longer see singular units (human, memex, data card) but a field of interrelations in which linkages and minds and societies are all being produced as the work progresses bit by bit. It’s dazzling, yes, because it’s so vast and complicated, each little (not)thing a function of all the other (not)things; it’s also discomfiting, because of what happens to that old oddly comforting notion of the human as an entity with an inner nature that is its own unique and enduring identity: it’s gone, poof. In its place is whatever we are being as we surf the network and play the system and jiggle around the possibilities of rewiring and reconfiguring.

The more we do such things, the less we resemble the human beings who functioned in a different network of networks (one which, perhaps, we nostalgically reify as we want it to be rather than as it now appears to “us” when we look back upon eras in which all this seems easier to finesse). But, really, consciousness changes.

We now see noteworthy differences between generations that lead us to wonder if videogames are destroying our children. Analog logic, analog ontology: if you sense the issue at all, then, yes, those children are already (partly) destroyed, or reconfigured, though videogames are only one set of forms within which that’s taking place. Read Engelbart: “process structuring limiting symbol structuring, symbol structuring limiting concept structuring, and concept structuring limiting mental structuring…” It’s a 1962 usage to say “limiting”; something like configuring is closer to it. But he’s recognizing that to enter intimately into the technologically amplified digital world of interrelations is to see that structuring (understanding that word as the cumulation of all interrelations pulsing away at a given moment) effects the changing nature of everything amidst those interrelations, moment to moment.

Which is why I like the work of “programmatological” literary figure John Cayley more than I like the work of those who do familiar things (writing analog things in analog logic) on a computer and call it “new.” Or Talan Memmott, whose “Lexia to Perplexia” is all about the network effecting interdependent reconfigurations of the network and its nodes-of-the-moment (or, as he calls them, the “cell….f”). They give us the rare gift of work from the other side of the great paradigm shift at a historical moment when we, like the digital pioneers we’re reading just now, are an awkward and perhaps ungainly mix of residual conventional culture and emergent “digital” ontology.

Zoom zoom.