Learning

This morning, I was seized by a whim to make myself a cup of coffee. It was only me in the apartment, so I just wanted a cup or two… not enough to fire up the entire coffee brewer. But as I pulled the french press off the shelf, I realized suddenly that I had never made french press coffee on my own. Ever. And, for that matter, I’d never learned how. What was I to do?

Modern man that I am, I typed “How to make french press coffee” into Google, and the very first result (from www.howtobrewcoffee.com/french, no less) gave me exactly the info I needed:

Medium-to-coarse grind
Water between 195 and 200 degrees F
Two tablespoons of grounds per eight ounces of water
3-5 minutes to brew

Along with a procedural list of steps, I had everything I needed. That is, until I realized that I didn’t have a way to measure ounces. Undaunted, I quickly Googled “8 oz in cups” and a handy-dandy volume conversion tool popped up with my answer already displayed. (After which I face-palmed at the fact that I did not know that 8 oz is in fact one cup… an equivalency any baker worth his weight in cake should probably know.)
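A nerdy aside: the recipe scales linearly, so the arithmetic is simple enough to script. Here’s a minimal sketch in Python (the constants come straight from the recipe above; the function name is my own invention):

    # french press proportions, per the recipe above:
    # two tablespoons of grounds per eight ounces (one cup) of water.
    TBSP_GROUNDS_PER_CUP = 2
    OZ_PER_CUP = 8

    def french_press(cups):
        """Scale the recipe to a desired number of cups of water."""
        return {
            "water_oz": cups * OZ_PER_CUP,
            "grounds_tbsp": cups * TBSP_GROUNDS_PER_CUP,
            "water_temp_f": (195, 200),  # degrees Fahrenheit
            "brew_minutes": (3, 5),
        }

    print(french_press(2))
    # {'water_oz': 16, 'grounds_tbsp': 4, 'water_temp_f': (195, 200), 'brew_minutes': (3, 5)}

Two cups of water, four tablespoons of grounds: the same answer Google handed me, now frozen in code.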

I’m not going to lie: I made a pretty awesome cup of coffee. As I sat and enjoyed the fruit of my brief labor, I reflected on how empowered the internet, via Google, had made me. In mere milliseconds I had every bit of info I needed to get something done. It was all spelled out for me, black as coffee. Before this morning, I had no clue. Now, I can make french press coffee whenever I like.

I asked the internet how to make french press coffee, and the internet taught me how to make french press coffee. Huzzah for the modern world.

But as I sat and sipped, I felt like there was something missing from my experience. Sure, I had more information now. I had the formula in my head, “two tablespoons per eight ounces,” and the new conversion knowledge, “eight ounces is a cup.” And I had the rewarding experience of putting that information to work. So what was missing? I thought for a bit, and stupid as it sounded, I kept coming back to one single thought:

I hadn’t made a crappy cup of coffee.

With good information, I had done it right on the first try. And while at face value such an outcome seems ideal, I realized that it was this lack of failure I was feeling, and feeling negatively.

What I didn’t get to experience was the act of screwing up a cup of french press coffee and all the knowledge that such an experience would have given me. In the act of such a failure, there would have been a wealth of subtle bits of information not grasped consciously but absorbed unconsciously. In the back and forth of trial and error lay the possibility of more intuitive knowledge, a deeper mastery, a more comprehensive understanding of this mythical magical beast of french press coffee.

My discovery of perfect knowledge via Google did not lead me closer to any kind of mastery the way a failure would have. Had I experimented with the variables in this equation – the water temperature, the grounds-to-water ratio, the grind, the brewing time – I would have grasped not one useful set of variables to solve the equation, but a working knowledge of the equation itself; not just the elements, but the way the elements interacted; not just a good solution, but a working understanding of why that solution was good.

The act of learning is not just receiving and assimilating a fact. It is also a discovery, a “working through” of the fact to the underlying process of how the fact works, how the fact is contextualized, and how the fact is connected to not just good information, but also bad information. To divorce the good information from all the bad information that provides its foundation is – with respect to mastery and real comprehension – to depower and devalue the very definition of knowledge.

The oft-lauded achievement of the internet and digitized information is that knowledge is now ubiquitous, easily available, and easily found. Even as I type and you read, the work goes on to reproduce, code, and index the entire library of human understanding. At no point in the entire history of mankind have we ever been this capable of sharing and producing information. It is a glorious age.

But there is something lost in this informational user-friendly omnipresence. I contend that just because information, and especially good information, is more easily found does not mean that we are all better equipped and more able, in a word, “smarter,” than we once were. Once the effort to obtain information has been reduced to its lowest common denominator, we no longer have to “learn” anything. We merely have to “look it up.” I did not have to “learn” how to make a good cup of french press coffee. I just had to look it up.

The ubiquity and ease of use of online information have re-inscribed us as creatures that have access to, but no longer possession of, information. Previously, I wrote of how we no longer own things in the digital age, but rather “stream” our possessions, and in so doing no longer pay for the right to own, but for the right to borrow – the rights for temporary, mediated access. This paradigm extends to knowledge itself. The logical endpoint of this progression is that we no longer construct our own knowledge by way of experience, by experimentation, by learning. We only “access” information. We stream it from the internet. We no longer own anything.

Now that the supply of information is nearly limitless – indeed perhaps could be considered infinite – the demand for working knowledge may be waning. The premium seems to be shifting from mastery of knowledge to mastery of the means of access to knowledge. In other words, it’s no longer a question of who knows more trivia, but a question of whose phone has the fastest access to the internet, who has the best app, and who can use that app most efficiently and effectively. The value has shifted from the content itself to access to content.

I’m not trying to devalue the usefulness and power of the internet as a tool. After all, it certainly gave me a great cup of coffee this morning. But we must be careful that the ease of access to and the ubiquity of information does not take the place of the process of discovery and learning itself. I feel it would benefit us to keep a vigilant eye on how the Google search begins to take both the role and the name of “learning” as an action.

When “looking it up” relegates “figuring it out” to obsolescence, whole worlds of deeper knowledge – worlds created in self-actualizing, existentially fulfilling acts of productive failure – may be lost.

Grounded

The other day, a friend of mine here at Virginia Tech (a fellow student, though a fiction writer in the MFA program) asked me what my thesis was about. I began talking in earnest about how cultural productions are continually evolving sites rich with interaction between subjects, objects, and the ideologies and structures of social power that inform and are informed by the relationships between them and how an updated understanding of the critical paradigms which may be used to understand these interactions must be continually refined and updated, which is why I’m hoping to proffer (or at the very least, review) an updated interpretative framework that can function productively for reading the modern texts of cultural life and identifying the themes, symbols, and language that determine how culture disseminates ideologies of class, race, gender, sexuality, politics, and other crucial concerns with respect to the globalized, post-postmodern world and vis a vis modern feminist theory as it pertains to subject-object construction, identity formation, and consciousness/reflexivity as a means of developing agency, identity, and mobility within and as actions of transgression against such cultural ideological systems…

And I think it was at about “continually evolving sites” when her eyes started to glaze over, somewhere around “updated understanding of the critical paradigms” when she whipped out her phone and started texting, and near “globalized, post-postmodern world” when she literally turned away from me and started talking with the person sitting next to her.

And really I don’t blame her. At the time, mid-sentence, I was actually starting to wonder how loaded I really was.

This isn’t to say that I was talking nonsense. I happen to believe that I might be on to something really interesting here. But her lack of interest was proof enough that sometimes (or often, depending on who you are and what you’re doing) the language and even topics that we engage with in the scholarly world are not welcome in places like restaurants and bars where the interaction is casual and the answer to any question is expected to be tweetable. The sting that I felt at her lack of tact notwithstanding, I appreciated her refusal to engage with the topic because it manifested a fear of mine that’s grown larger for me the further I climb towards my PhD: the fear that as my work grows more interesting and relevant to intellectuals and scholars, it grows less so to everyone else.

This is why the idea of a blog/YouTube channel as an offshoot of my academic self seems absolutely critical. Especially if my end goal is to offer people (my students, my peers, my readers, my instructors, my future kids) new ways to understand the world around them, then above all else my ideas not only have to be complex and nuanced, they have to be tweetable; they have to be bar-room ready; they have to be grounded not only in a bedrock of previous research and a coherent network of scholarly discussion, but also in the real experiences, real language, and real world of real people.

This is why I like to keep my blog somewhat irreverent. Why I write fragmented sentences like this one. Why I let the (sometimes not-so-) occasional obscenity stand proud on the page. Why I rant and joke (hopefully) just as much as I muse and ponder. Why I want a YouTube channel that subscribes to the VlogBrothers and PBSIdeachannel and Feminist Frequency and puts out videos that reference Bronies just as much as Judith Butler.

Not only do I want to be out there in academia, I want to be out there in the world. I want my work to be grounded in the conversations I have sitting on barstools with fiction-writing friends. Grounded in the knowledge that relevance has not been sacrificed for complexity and depth. Grounded in the real life of real people.

My greatest fear is not that my work will be irrelevant to the academic community, but that it will be irrelevant to everyone else.

Audience & Comments

I have to begin this post with the unabashed statement that I am a blatant narcissist. And to that end, I don’t usually like writing blogs online because I’m convinced that the effort is only so much more noise in a vast sea of cacophonous drivel. The feeling runs parallel to the very idea of a Facebook page or a Twitter account… the idea that anybody really cares about what I’m currently having for dinner, how pissed off I am about Futurama being cancelled, or how awesome I think a hand-carved Stormtrooper coffee mug is.

The confession of narcissism might seem to contradict this. After all, aren’t Facebook and Twitter the ideal answer to every narcissist’s most fervent prayers? The ultimate digital tools for drawing attention to the self?

True… so perhaps I’ll modify the term: I’m a realistic/pessimistic narcissist (the “slashed” term giving a nod to those who would argue it only natural for the two words to be conflated). Sure, I want everyone to be interested in my life and what I have to say. But I believe on some deeper level that nobody really much cares, because it doesn’t really much matter.

I wonder sometimes if others fool themselves as I do… thinking that perhaps some random internet surfer will ride a digital wave into my part of town and be snagged by the elegance of my prose, miraculously “discovering” my site. But really, how often have I myself surfed around, read some random blog, and found something worth connecting with?

And if we’re being completely honest with ourselves, how many of us maintain a steady stream of mental production worthy of a blog? At the end of the day, perhaps the dread centers not around how many people look at my blog postings, but around the cold hard fact that I really do have nothing worthwhile to say.

All of this is counteracted by the comment. The glorious, miraculous comment that for a brief moment shatters the conceptualization of the blog as just more ignored noise in a vast sea of ignored noise. Suddenly, a user steps from the shadows, not only declaring their existence but simultaneously confirming that they have read my blog and, dare I believe it, have been stimulated enough to respond to it. Oh happiest of days! Such glorious payoff!

If you, dear reader, are not quite sure how much sarcasm that last paragraph contained, that makes two of us. On the one hand, the presence of comments does give me a kind of leap inside, a confirmation of my own self-worth. On the other hand, how much self-worth am I really locating in that faceless, nameless crowd of potential commenters, or worse yet, in the tick-marks that make up the pitiful number of page-views my plugin counts for me, as if assigning a number to the lack of readers would somehow make the fact easier to bear?

Is blogging a way to escape or nourish my narcissistic impulses? Do the blog and its so-called (and actual) readers confirm the validity or necessity of the activity, or are they just elements of a support system that continually decenters my self-worth and displaces it onto the abstract concept of my supposed readers? And if not that, is it an exercise in futility? A tree falling in a forest with nobody around to hear it? If nobody hears me, am I really blogging at all?

Dependence

Now I don’t want to sound like a curmudgeonly old goat who refuses to embrace any new-fangled fancy-schmancy technologimical whatsits because they’re leading to our inevitable destruction – a digital hand-basket in which we are all enthusiastically riding to hell…

but oh who am I kidding, I quite enjoy sounding like that. Doom-mongering is all sorts of entertaining because if you’re right you get to say “I told you so” and if you’re wrong it doesn’t matter because nobody took you seriously in the first place anyway.

But with that being said, I’ll state for the record here that I’m just not comfortable with the level of dependence that new advancements in digital technology are pushing me into.

Let’s take music for example. It used to be, in the good ol’ days, we had to go to the store to buy music. We got a nice shrink-wrapped package that held a disc with a cool design on it and colorful liner notes with the lyrics printed out and everything. And when you took that CD home it was YOURS, because you could hold it in your hand, and you could take it anywhere and play it in anybody’s car or boombox or home stereo you wanted. You could give it to your friends and get pissed as hell when they put new scratches on it. At the end of the day, come hell or high water, you could hold that CD and know, without a doubt, that Limp Bizkit’s Chocolate Starfish and the Hot Dog Flavored Water was yours to treasure as long as you possessed that 30 grams of aluminum, lacquer, and plastic.

Now we buy our music from iTunes and hope to God that whatever new extension and format it comes in isn’t packed with DRM nonsense that only lets us play it on one computer, or won’t let us burn it to a disc, or has a really shitty sample rate, or will commit suicide and refuse to play after X many days, or, worst of all, requires a connection to the Internet. We stream our music now from YouTube channels, Spotify, and Pandora radio. The music we “own” exists somewhere in zeros and ones, and we don’t even own those. We own the right to ASK for the music to play, not the music itself.

The newest SimCity title released by EA Games requires an Internet connection to play the game. This is known as “always-online digital rights management,” and it was incorporated into the game ostensibly to facilitate some of the game’s unique features, but in reality it was merely an operation to crack down on piracy. This means that if my Internet connection goes down, or worse, the company servers go down, I can’t play the game. I could own the damn thing, own a computer that could run the program, AND have an Internet connection, but if for any reason the company servers are offline, the game will refuse to play, even though my machine is perfectly capable of running the code all on its own.

As long as I can find a wall socket and some kind of display (tinkering with the input method notwithstanding), I can play Super Mario Brothers to my heart’s content any damn time I please. Don’t have an internet connection? No problem! Nintendo goes completely out of business? No problem! I bought that game with my own money (given to me for doing my chores and mowing my parents’ lawn) and I have the right to play it whenever I like, regardless of what state the company or my internet connection is in.

But if EA for whatever reason were to go out of business, then no more servers, no more service. Suddenly, everyone who dropped 60 bucks on SimCity gets a big middle finger as every copy of the game becomes forever after unplayable. Something very close to this happened on the day of the game’s launch: hundreds of people who had paid full price couldn’t play the title they had just purchased because EA’s servers were having some sort of conniption.

That’s horseshit.

And I see it everywhere. Kindle asks you to forgo that bulky physical library. Your device can carry thousands of books! Yes, until your device breaks… then your thousands of books vanish. Until you want to share one of those books with a friend. Until the power goes out and you didn’t charge your device. Until (if your books are in some sort of “cloud”) a server error decides to delete your library, or until whoever ACTUALLY owns and possesses that data is compromised.

If Armageddon comes tomorrow, I’ll be reading my copy of World War Z in what I’m sure will be abundant firelight.

I pay money to POSSESS, not to BORROW.

Advancements in digital technology are quietly pushing us toward more dependency on companies and servers and data that we do not in fact control. Data is now clouded, video games are played on external servers, programs are run remotely. Products are becoming inextricably tied to the means by which they are purchased. What happens to your music library if iTunes or, god forbid, the great and powerful Apple is compromised? And the further we move into “virtual” ownership, the more power we place into the hands of those that ALLOW us to own. When ownership is empowered by that which is outside of my control, it is not real ownership at all. It is an illusion.

We are iTunes customers before and as a prerequisite to being music owners.

The more we move toward a virtual world, the more that money and time and products and relationships and media become a series of ones and zeros that exists simultaneously everywhere and nowhere, the more we must rely on the system that keeps those ones and zeros moving. And because of the nature of the digital world, I think that system is more volatile, unstable, and open to compromise than most people realize.

Time

<insert obligatory “Ain’t nobody got time for that” gif here>

I have not inserted the obligatory “Ain’t nobody got time for that” gif above to facilitate my own crude and, I’m sure, original bit of irony: I simply don’t have time for it.

I don’t have time to go tracking the gif down, then figuring out whether I can remote source the image or if I have to download it, then once downloaded, figuring out how to incorporate it properly into this blog post, and all the while wondering if I’m breaking any sort of copyright laws or trespassing on any intellectual property, either for the original source of the gif, or the guy/gal who made it into a gif in the first place…

And I’m sure the simple argument against this is, “Actually, yes, there are a number of people who have time for it.” KT’s blog (<hey, at least I had time to make up that brilliant hyperlink) is a wonderful shining example of a deft skill in everything that I seem to have no time for.

But really I don’t. I’m typing this frantically in the 10 (scratch that, just looked at the clock, 5) minutes before I have to get to one of my classes, after which I have a mountain of work to finish in the next couple days that I don’t know how I can possibly complete unless I just don’t sleep – which I probably won’t anyway because if I try, I’ll be too busy worrying about all the things I could be doing rather than sleeping.

The general prompt that instigated and ostensibly pushes this blog forward comes from my professor saying that “You don’t need to make the entries perfect, or polished, or anything like that. You just need to do them.” Or something like that. I don’t have time to pull up the blog for the class, find an exact quote, and paste it in here. <Ain’t nobody got time for that.gif>

Which seems to be a bit ironic considering that once something goes online, it becomes available to a massive audience and, arguably, can never be deleted or destroyed. Of all the places one’s writing could appear, it seems to me that the internet is the absolute worst place for one to skimp on the polish, or thought, or care. But, say hello to irony again, the internet is filled with people at their sloppiest and people at their worst… as I perhaps am now.

It is one of my goals, yet another one of my tasks, to bang this blog out till the end of the semester in spectacular form, writing 3-4 posts a week to make up for the miserable dearth of work I’ve presented up to this point. But just know (you interwebs, you faceless audience that could consist of everyone and no one, knowledge of my rhetorical situation be damned) that I do not do so willingly. That is, I am not comfortable with this image that I am presenting to you. Instead of paparazzi waiting behind bushes to snap the opportune picture and paste it to a “what do they look like without makeup?!” tabloid article, I am putting my phone camera up to the mirror in the morning, snapping my untouched face, and presenting it for the world to see with full knowledge of the lack of presentation. Just don’t ask me to be okay about it.

</genderbending metaphors>
</rant>
</great, now I’m late for class>

<I told you I didn’t have time for that.gif>

Ennui

I am an English major, and as such, I’ve done my fair share of reading. There is something about reading that is very insular in that it boxes one into the world of the book. It engages one’s imagination to start constructing the characters, what everything looks like, how everything moves and sounds and behaves.

But it’s all going on inside one’s head. The reality of the situation is a lone individual, sitting in the quiet with her head bent, eyes down, motionless, a body at complete rest, only the cogs of the mind whirling away.

There is nothing inherently wrong with this picture. In fact, we as a society are always trying to get our children to read MORE, to spend more time sitting alone with a good book rather than staring at some kind of screen.

But really, is reading a book all that different from staring at a screen?

In both cases, we sit motionless, soundless, usually alone. The television screen substitutes for the imagination a bit more, but how different, really, is the virtual world projected on the screen from the virtual world created from the pages of a book?

For someone like me, someone who has done his fair share of reading and is also constantly online, who fills his recreational time with movies and television shows and video games and other books, writing on friends’ Facebook walls, reading interesting articles online, tweeting, blogging…

none of this is REAL. That is, while this prodigious amount of activity is supposedly happening, I cannot help but sense how dead I am to the world around me. It is as if my brain were plugged into the Matrix and I had just started to become aware of the plug at the base of my neck, could start to sense that as I talked with a friend or ate a steak, I was really lying motionless in a vat of pink jello.

In the books that I read, there are always marvelous descriptions of characters. There is always something great behind a character’s eyes or their smile. They do dramatic things; they run and jump and fight and die and laugh, and all of it is happening TO them.

I sometimes think: if someone were to write a story about me, how would they describe my eyes? They wouldn’t; they would never have seen my eyes. Perhaps a picture of my physical presence at some kind of gathering would surface online, but really, it would only be a representation of me.

In my grad school life, for the past year now at least, my world has become increasingly digital. Sure there is interaction in person during class periods… but conversation there is taken up by theory, or analysis, or diving into a literary or virtual world. When do I talk about anything that really matters?

I have a brief 10 minute walk to and from my classes, but during that time, I put in ear-buds and crank up dubstep like it’s nobody’s business. I look down at the ground, too insecure to face the possibility that somebody might make eye contact. If a man, it might provoke a challenge. If a woman, it might be misconstrued as lascivious. So I bury my eyes into the cracks of the pavement and bang on with Skrillex until I reach my next quiet and insulated room.

I come home and read books in silence, constructing the worlds and people to my own specifications. For a break, I watch reruns of The Office with my wife, or the newest episode of whatever cop drama happened to release something new in our Hulu queue. We talk about the characters in terms of how believable or unbelievable they are. We wonder how close to real life the drama is. We applaud that which comes the closest.

We don’t look at each other when we have the conversation. Sometimes, I don’t even watch the show. I turn to my iPad and play a match-3 game incessantly, looking up every now and then at the important parts and listening to the rest.

I know that over the course of the Digital Self class, we have looked at a large amount of research that has done away somewhat with the gloom and doom that some like to sling at the digital world and our increasingly online lives. But perhaps I, as an example of someone who is very isolated and “virtual” in more ways than one, can offer some anecdotal evidence to the contrary.

I feel increasingly like a hollowed out tree – something that is constantly filled up, yet forever, characteristically, empty. I feel unreal. I cannot remember the last time I had a real conversation with a real person, face to face. The last time I went outside for some reason other than to get somewhere else. I am typing this blog entry sitting alone on my bed in my room. My wife is in the next room, watching television. We are both sitting nearly motionless, staring, not speaking, hardly even breathing, while nothing happens to us at all.

Damn the evidence, but this is no way to live.

The Setup – What do I use to get stuff done?

(I’ve borrowed this posting idea from The Setup, as referenced in the title.)

Who are you, and what do you do?

My name is David Calkins, and I am a grad student currently working toward my master’s degree in English at Virginia Tech. I’m interested in critical theory and its application to American cultural artifacts, media, and daily life. My research interests also include modern feminist theory, film & music studies, and modern American literature. I also teach freshman composition at Virginia Tech.

What hardware do you use?

My primary research and writing tool is my Dell Inspiron 17 laptop, which I inherited from my late father-in-law. Up until only a couple months ago, this was my only device and my only access to the internet, but recently my technology collection has expanded dramatically. I now include my smartphone – a Motorola Droid Razr – and an iPad (lent to me by my university) among my hardware options. I should also include my PlayStation 3 in this list, as I don’t subscribe to any kind of network television, and most if not all of my media browsing (television shows, YouTube channels, music, and film) filters through this device. And finally, an old Maxtor One Touch external hard drive handles my backup needs.

And what software?

My laptop runs Windows 7, but I am too unfamiliar with my other devices to know what OS each might be using.

A big part of my composition process, believe it or not, is the program Notepad – a bare-bones text editor that comes standard with most versions of Windows. I prefer its extremely basic approach: I never get any red or green squiggles when I’m writing, no auto-formatting, no correcting. Just me and the text and nothing in between. Nothing to fiddle with, nothing to worry about except for writing. I’ve used this little gem in the early stages of almost all my composition projects for as long as I can remember, and I love it. I’m even composing this blog entry in a Notepad window.

I also use Office Suite to add final polish to my work and to run spellcheckers, do formatting, etc. I handle grading for my classes with Excel spreadsheets.

I’ve recently taken to reading class texts with the Kindle app on my iPad, but I have to admit that I still prefer hard copy over digital as far as reading is concerned. That said, having a digitized and, more importantly, *searchable* text has been extremely beneficial to me, and I often wish I had access to both hard-copy and electronic versions of my texts and research materials.

I also use Dropbox to keep certain files in the cloud and synced with one another. Calendar and task-list applications on my phone sync to my Google account online and help me keep track of due dates, meetings, and project deadlines. I also have my phone programmed to remind me to attend events, with a number of “SmartActions” set to silence my phone whenever my schedule designates my time as “busy” (or “asleep,” for that matter). Finally, I adore the e-mail applications on my phone, which allow me to handle incoming messages in real time and keep tabs on my inboxes so they don’t build up over time. I hate messy inboxes.

Also, a small thing but a great one: I really enjoy using the Swype keyboard on my phone.

What would be your dream setup?

Though I’m not a huge fan of Apple and their products, I would feel more at ease if somehow my phone, laptop, iPad, etc. were all connected, all synced, and all talking to each other. At the moment, I have issues with certain applications on certain systems not updating when they should or not keeping in perfect sync with others, and it tends to interrupt my workflow. Also, I would very much prefer it if ALL of my research, writing, and teaching materials were available to me in the cloud.

My laptop is a bit too large with its 17.3″ display, and I would prefer something much smaller and lighter to carry around with me. I’ve been using the iPad to accomplish this so far, but I absolutely hate having to type on its screen, which makes it useless as a word processor for me.

And though I don’t need one, I would love to have some kind of solid home system (like a desktop unit) in a designated area for working at home. I like the ability to use my laptop anywhere, but when I’m at home, I need designated spaces for work, play, and relaxation; otherwise they start to bleed into each other and interfere with what I need to be doing.

Recentering

[Image: mental map of my research process]

To the right is a picture of the mental map I recently drew in my Digital Self class to describe my “process” (if you could call it that) when conducting research and putting a project together. Something I had assumed my map would have in common with other students’ maps was the presence of an avatar for “me” at the center, with all of the different streams coming into that locus. However, I found that most of my peers had their laptop at the center of their diagrams.

I once read in an article somewhere that an interesting way to solve problems is to literally sleep on them. The article claimed that once you set your brain on figuring something out, even if you weren’t thinking of it consciously, your mind was still working things out unconsciously. This was meant to explain the way epiphanies will sometimes strike at odd moments that are totally unrelated to the thought you were having. It’s that classic moment when you can’t think of that book title or that actor’s name and then two hours later in the middle of doing dishes, you suddenly exclaim out loud “Dowager Countess of Grantham!”

Maybe it started with reading that article, who knows, but I seem to have developed this notion that I have to keep everything in my head at all times, just so that my unconscious brain can keep working on everything when I’m not looking. I figure that if I am the ultimate funnel point for all things, then somewhere in the pillowcase’s worth of grey folds that makes up my brain, all the information will be magically stored, codified, and worked on, even though (and especially when) I’m not thinking about it.

Sadly, this is not the case.

More often than not, I find I’m quite forgetful and I’ll lose an idea if I don’t write it down. Put simply, this strategy just does not work. And in the information overload that is the digital age, I cannot afford to keep working as if I am not bionic, as if I were not supported by a vast network of tubes and microchips that can store and recall information so much more accurately and easily than I ever could do under my own power.

Perhaps this method has served me well in my undergraduate work and before, but now that I am tackling graduate-school-sized work, I just can’t do it any more. I’m only two semesters in and I’ve already had enough mental breakdowns to last a good long while.

It’s time to recenter.

Mo’ Tech, Mo’ Problems

During the last few months I have been whiplashed into the modern age when it comes to tech. I’ve never been an “early adopter” – a term I recently came across in Networked that describes those people on the cutting edge of culture who love to try out new gear the minute it hits the public market. It’s not that I don’t love new technology. It’s just that I see it as yet another thing I have to do.

It was bad enough when I was backed up with 12 different hardcover books to read, sitting forlorn on my bookshelf, chastising me for what a bad English MA student I was for not reading them. But now I’m expected to keep up my Facebook profile, incessantly check my Twitter feed, update my blogs, dynamically respond to my e-mails. And this is all just normal digital mechanics. Now there are pinboards and forum walls and StumbleUpons and wiki pages, all of them yelling at me: aren’t you interested in all this? Don’t you want to check this all out?!

And I can’t, I just can’t. There isn’t enough time in the day. A couple months back, I decided that I would aggregate all the internet activity I could think of into a single RSS feed in an attempt to bring this massive amount of data and activity into a controllable and efficient form that I could get done and out of the way. But it didn’t work. True, I wasn’t missing a beat, but the beats just kept on coming. There was no end in sight. All the blogs I liked to read, the new YouTube videos to watch, the boards to check, the entries to write, the status updates and twitter posts to check, the e-mail to read, the pictures to browse, the weather to check, the news to investigate… it eventually formed an enormous circle that threatened to ensconce me forever in the digital world. I called my feed the GodFeed and Nietzsche would have been proud the day I laid it to rest.
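For the curious: the mechanics of the GodFeed were never complicated. Here’s a minimal sketch of that kind of aggregation in Python, assuming the third-party feedparser library; the feed URLs are placeholders (my actual list was far, far longer):

    import time
    import feedparser  # third-party library: pip install feedparser

    # Placeholder feed URLs; the real list went on and on.
    FEEDS = [
        "https://example.com/blog/rss",
        "https://example.com/news/atom.xml",
    ]

    def god_feed(urls):
        """Merge every feed into a single stream, newest entries first."""
        entries = []
        for url in urls:
            entries.extend(feedparser.parse(url).entries)
        # Sort by published date when available; undated entries sink to the bottom.
        entries.sort(key=lambda e: e.get("published_parsed") or time.gmtime(0),
                     reverse=True)
        return entries

    for entry in god_feed(FEEDS):
        print(entry.get("published", "undated"), "::", entry.get("title", "untitled"))

The plumbing was never the problem. The problem was that the stream never stops.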

I’m reminded of Tyler Durden in Fight Club:

No fear. No distractions. The ability to let that which does not matter truly slide.

And that is the digital world really – chock full at every turn with that which does not matter. I’ve heard it said that the digital age is slowly but surely distracting the masses, and for the most part this is true. I found a 3-minute video of a cat randomly batting piano keys on YouTube that had been watched 24,593,828 times. At three minutes a view, that’s roughly 74 million minutes, or about 140 years, of human endeavor diverted to Nora the Piano Cat.

Such is the new world. In this new age, perhaps, success won’t go to those who have more drive, or more resources, or more intelligence, or any of the other factors that built toward success in the past. Perhaps now it is FOCUS itself that will be the determining factor in who gets things done and who doesn’t.

God save us all!