Watch this: Sunshine
… the medium is the message.
And this: Helvetica
While the Dynabook doesn’t seem so revolutionary today, this was certainly not the case in the late 1960s and early 1970s. Your own computer? That you program? How wild and futuristic, considering that Thomas Watson had predicted a worldwide demand for five computers. Interesting how these visionaries, like Kay and Goldberg, realized that “the best way to predict the future is to invent it.” Many of us simply experience technology happening to us.
Universities had better not assume this posture. There are already signs that universities and colleges, as institutions of higher education, are being undermined by entrepreneurs who are seizing upon technology and the opportunities afforded by economic conditions to create change – i.e., invent the future (hey! I thought WE were supposed to do that!) See: A Boom Time for Education Start-Ups in The Chronicle of Higher Education.
Gardner Campbell spoke to my students today on emerging trends that I suspect they hadn’t quite considered. I felt like they expected me to defend the role of the university in its current form, which I was not about to do. Given the role of personal “computers” as tools/devices with infinite applications – limited only by what we can imagine – our conception of “computer” as a box, portable, or tablet should already be changing. We have “smart” roads, smart cities, cloud-connected thermostats, smart bombs(?), etc., all with the ability to collect information and react to specific commands. In essence, the Dynabook has been miniaturized and distributed throughout our environment in one form or another.
The fear is that humans will lose the personal skills to deal with other humans and stop building relationships. I already witness this, and it doesn’t involve computers. Religion and politics are very effective at dividing and dissolving human interaction. In my opinion, religions that preach discrimination, intolerance, and hate are far more destructive to human relations.
Hard to imagine watching Doug Engelbart in 1968 talking about what we now consider so basic and take for granted. See: http://youtu.be/X4kp9Ciy1nE. I have to stop and wonder what human/computer relations will be 20, 30, 40, or more years into the future. An external intelligence that is very mechanical – display, mouse, chord keyset, etc. – will likely be replaced by something internal and biological. But how? What could it be?
A first step will be something like this “thought-controlled” Siri application: http://youtu.be/xFIRmnRHNUM. Just as with voice activation, the interface involves training to recognize prompts and convert them into machine-readable commands. The headsets seem a bit awkward at present, but I’m sure they will shrink and disappear by being affixed to the skin (or embedded underneath). Better than having to ‘clap on’ and ‘clap off’ the lights.
Not so stylish now, but you know that will certainly change. Just think how much work you could get done in a meeting, since you won’t have to talk or type to compose or communicate. You’ll look like you’re actually paying attention. So what will be the advantage? Seems simple enough: just think of commands or statements and they will be executed the same way keystrokes or mouse clicks are today. So efficient! (slight sarcasm). But what happens if I fall asleep with it on and it streams some of my very strange dreams into text or audio? Better not doze off if you don’t want to be found out.
Just when we got kids off the couch with Kinect, they’ll be back on their butts with some new headgear. What happens if the communication is reversed? And the applications for surveillance, control, and interrogation, a la MK-ULTRA? Hmm… I’m going to stop thinking about it now, just in case I’ve already been implanted with one of these things.
Often cited in the bibliometrics literature, Vannevar Bush provided a vision for how scholars can think about scholarship. Instead of our narrow, disciplinary perspectives, his vision was about the “science of science” – keeping track of what we (and others) do for the purposes of better understanding what we (and others) do. “Science has provided the swiftest communication between individuals; it has provided a record of ideas and has enabled man to manipulate and to make extracts from that record so that knowledge evolves and endures throughout the life of a race rather than that of an individual.” We may not give it much thought at the time, but the references we cite in our publications then become part of the record, establishing a pathway of connected ideas and influences. We are obligated to do so, and we have a range of motivations for doing so (a topic for another day).
Later he states, “He has built a civilization so complex that he needs to mechanize his records more fully if he is to push his experiment to its logical conclusion and not merely become bogged down part way there by overtaxing his limited memory.” Information, artifacts of knowledge, and other such transactions are being generated ever more quickly, and it’s not often that we step back to examine these networks and patterns – yet more information. We have the technology, but do we have the time?
As Herbert Simon said in 1971, “In an information-rich world, the wealth of information means a dearth of something else: a scarcity of whatever it is that information consumes. What information consumes is rather obvious: it consumes the attention of its recipients. Hence a wealth of information creates a poverty of attention.” This becomes the challenge – in 140 characters or less – with a half-life of 2.5 hours.