Seven Years, and How Social Software eats everything

I can’t believe I’ve been “blogging” for over seven years. How the hell did that happen?

Actually, I think it’s been longer — if I remember correctly, my first blog was on some service whose name I simply cannot remember now, and I used it until I ran across Blogger in 2000. Then I switched over, using Blogger to run a blog hosted on server space my then-employer let me use for free; they even let me use their nameserver for my domain name … drewspace.com. That name has since gone to someone or something else. But I did manage to suck all the old archives into my web space here. Here are the first posts I have a record of, from August 2000.

This boggling (bloggling?) stretch of time occurred to me once I saw Ross Mayfield’s recent post about how he’s been blogging for five years. Of course, he’s much more industrious than I, what with a company of his own and writing that’s a heck of a lot more focused and, well, valuable. But of course, social software has been his professional focus for quite a while, whereas for me it’s been more of a fitful obsession.

“Social software” is turning out to be the monster that ate everything. Which only makes sense. The Web is inherently social, and so are human beings. Anything that better enables the flow of natural social behaviors (rather than more artificial broadcast/consume behaviors) is going to grow like kudzu in Georgia.

Anybody thinking of social software as a special category of software design needs to wake up and smell the friends list. Everything from eBay to Plaxo is integrating social networking tools into their services, and Google is looking to connect them all together (or at least change the game so that all must comply or die of irrelevance).

Poor old blog

I looked at my blog (this thing I’m writing in now) today and the thought that surfaced, unbidden, was “poor old blog.”

I felt bad because I haven’t been writing here like I used to, so sure I get the “poor” part — poor pitiful blog that isn’t getting my attention.

But where on earth did “old” come from? Besides the fact that “poor old whatever” is a common figure of speech, it felt a little shocking coming to the front of my brain while looking at a blog. I mean, I wouldn’t say “poor old iPhone” if I hadn’t picked one up for a week (and if I owned one to begin with).

I mean, blogs are still new, right?

But here he is (I’m convinced my blog is a “he,” but I have no idea why, really). My blog, like all the other blogs, just taken for granted now. Blogs — part of the permanent landscape, like plastic grocery bags and 24-hour gas stations.

Blogs were such a big deal not so long ago, but now here they are, sitting around watching other, younger, nimbler channels giddily running around their feet without a care in the world. The Twitters, Jaikus, Facebook apps. The Dopplrs, Flickrs and the rest.

It’s like somebody took a hammer to the idea of “blog” and it exploded, skittering into a million bits, like mercury.

So that’s what’s been up. I’ve been twittering, facebooking (yeah, it’s a verb, as far as I’m concerned), text-messaging… even sending the occasional “instant message” through the venerable old AIM or iChat, though now that’s starting to feel as antiquated as a smoke signal or a carrier pigeon.

If there was ever any chance of keeping focus long enough to write sound, thorough paragraphs, lately it’s been eviscerated to a barely throbbing stump.

I wonder if my poor old blog will rally? If it’ll show these whippersnappers it’s not done for yet? Like in the sports movies, you know, where the old batter who everybody thinks is all washed up slams another one over the bleachers?

I don’t know. All I know right now is, there’s my blog. With its complete sentences, its barely-touched comment threads. Its antiquated notion of being at a domain-named location. Its precious permalinks & dated archives, like it’s some kind of newspaper scholars will scan on microfiche in future generations.

Doesn’t it know that everything’s just a stream now? Everything’s a vapor trail?

Poor old blog.

Conference fatigue

I have it. I’ve been to several conferences in the last six months, and I’m now burned out. For a few months, anyway.

But UX Week was actually very good. Top-shelf, in fact.

So, there it is, a glimmer on my blog that I am in fact still alive. I have lots of bloggy ideas, but they’ll have to wait until I can think in more than 2-3 sentences at a time.

OK Computer x 10

[Image: OK Computer album cover, via Wikipedia]

I don’t know the exact release date, but I do know that it was right about ten years ago that I first heard OK Computer.

In May of ’97, I had just finished my MFA in Creative Writing at UNC Greensboro. But I had no job prospects. I’d had a job lined up at a small press out of state, run by some dear friends and mentors of mine, but money issues and a new baby meant that, at the last minute, I had to turn the opportunity down. (I handled it horribly, and lost some of those dear friends because of it.)

My future, or my identity in a sense, felt completely unmoored. The thing I’d assumed for two years I’d be doing after finishing my degree was no longer an option; I’d fallen out of love with teaching, and didn’t really have any good opportunities to do that anyway. All I had going was this nerdy hobby I’d been obsessing on for some years: the Web.

So, I needed a job, and ended up talking my way into a technical writing gig in the registrar’s office of my MFA alma mater. I wouldn’t be editing poems and fiction for a press or a journal (as I’d gotten used to doing and thinking of myself as doing) but writing tutorials and help instructions for complicated, workaday processes and applications. But at least I’d be on the “Web Team” — helping figure out how to better use the Web for the school. I’d been working with computer applications, designing them and writing instructions for them, off and on in my side-job life while I’d been in grad school, so it wasn’t a total stretch. It just wasn’t where I imagined my career would take me.

That summer, in a fit of (possibly misplaced) optimism and generosity, my new employer sent me to a posh seminar in Orlando to learn better Photoshop skills. And one of the presenters there was the guy who made some of the most collected X-Files trading cards around, and an acknowledged wunderkind with digital mixed-media collages. (Cannot find his name…)

As I was waiting to see this guy’s presentation, and people were filing into the room, he was setting up, with a slideshow of his creepy graphics going onscreen. And this spooky, ethereal, densely harmonic, splintery music was playing over the room’s speakers. I was feeling a little transfixed.

And, of course, when I asked him later what it was, he said it was Radiohead’s OK Computer.

Here’s the thing: I’d heard Radiohead interviewed on NPR by Bob Woodward a month or so before; they discussed the new album, the band’s methods, and how they recorded most of it in Jane Seymour’s ancient country mansion. They played clips from it throughout, and I remember thinking “wow, that’s just too over the top for me… a little too strange. I guess I won’t be getting that album — sounds like experimentation for its own sake.”

It’s just one of a thousand times this has happened to me — conservative, knee-jerk reaction to something, only to come to embrace it later.

Something about this music connected with me on a deep level at that time in my life, and with a lot of things going on in my own head. It *sounded* like my own head. And to some degree it still does, though now it feels more like a remnant of a younger self. The music still feels quite right, quite relevant, but I hear different things in it now.

So. This just occurred to me. Had to share. I’m on record as a huge Radiohead fan, even though I realize this isn’t exactly a unique thing to be. I’ve found every release of theirs to be fascinating, challenging, and rewarding once it has a chance to settle in. (Not a huge fan of Thom Yorke’s solo effort, but I’m glad it’s out of his system, so to speak — then again, who knows, four years from now it may be my favorite thing ever.)

They have a new album coming out sometime this year, if all the stars align. Can’t wait.

Everybody's a Cartographer

Wired has a great story explaining the profound implications of Google Maps and Google Earth, mainly that these maps no longer have to come from a single point of view, but can capture the collective frame of reference of millions of users across the globe: Google Maps Is Changing the Way We See the World.

This quote captures what I think is the main world-changing factor:

The annotations weren’t created by Google, nor by some official mapping agency. Instead, they are the products of a volunteer army of amateur cartographers. “It didn’t take sophisticated software,” Hanke says. “What it took was a substrate — the satellite imagery of Earth — in an accessible form and a simple authoring language for people to create and share stuff. Once that software existed, the urge to describe and annotate just took off.”

Some of the article is a little more utopian than fits reality, but that’s just Wired. Still, you can’t deny that it really does change, forever, the way the human geographical world describes itself. I think the main thing, for me, is the stories: because we’re not stuck with a single, two-dimensional map that can speak from only one or a few frames of reference, we can now see a given spot of the earth and learn of its human context — the stories that happened there to regular people, or people you might not otherwise know or understand.
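For what it’s worth, the “simple authoring language” Hanke mentions is, at least on the Google Earth side, KML. Just to make the quote concrete, here’s a rough Python sketch of what one of those amateur annotations boils down to: a named point on the globe with a little story attached. The place name, description, and coordinates are invented for illustration.

# A minimal, hypothetical KML placemark: one person's story pinned to one
# spot on the earth. Everything here is made up for the sake of example.
placemark = """<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Placemark>
    <name>Grandpa's old farm</name>
    <description>The barn stood here until the flood of 1972.</description>
    <Point>
      <coordinates>-79.7920,36.0726,0</coordinates>
    </Point>
  </Placemark>
</kml>
"""

# Write it to a file; opening the result in Google Earth drops the pin.
with open("my_story.kml", "w", encoding="utf-8") as f:
    f.write(placemark)

That really is about all it takes, which I think is Hanke’s point.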

It really is amazing what happens when you have the right banana.

Vegas Lingers

[Photo: This is Vegas]

It’s easy to overlook them. The Skinner-box button-pushers, watching the wheels roll and roll. Surrounded by a ‘paradise’ that still leaves them wanting — and thinking they’ll find it like this.

Vegas was a mixed bag. I guess I’d always seen so many glamorous photos and film shots; even the ones that tried to be ‘gritty’ still managed to put a sort of mythic gleam over everything.

But it’s not mythic. It’s plastic. It’s the progeny of a one-night stand between the Magic Kingdom and TGI Friday’s. Inescapable throngs of flip-flopped, booze-soaked denizens, eyes bugged wide by the promise of … what? I’m not even sure. Entertainment, certainly, but another flavor invades, the way saccharine crowds out and leaves a film over any other flavor. Luxury, perhaps. Richness of the kind that first comes to mind when someone says “rich”: Trumpism mixed with Hollywood ersatz.

I don’t mean to be so down on it. Really. I’m a big fan of decadent, crazy, outrageous kitsch. But this somehow was so overwhelming, it wasn’t even kitsch. (Definitely not camp.) Now I understand why U2 filmed the video for “I Still Haven’t Found What I’m Looking For” here so many years ago — and that was before it was injected with virtual-reality steroids.

The conference was terrific (except for the trouble of escaping the waves of noise and humanity long enough to have a decent conversation). I’m amazed the team pulled it together as well as they did, given the circumstances.

Fortunately, most of the time was much happier than I’m letting on here. Check out my iasummit2007 Flickr stream.

For the record: I know plenty of people enjoy Vegas a great deal, and they have fun gambling and seeing shows and everything, and I think that’s actually really great. Some of my family enjoy doing it from time to time, and they seem to always come back smiling. I think it just hit me in a strange way on this trip — but I’m always like that; if there’s a silver lining I’ll find a cloud. I just can’t help noticing the souls that seem to be a little lost, a little vacant behind the marquee-reflecting eyes. But hey, that’s just me.

The Incredible Power of the "Side Project"

Colleague Michael Magoolaghan passed along a link to the transcript of Tim Berners-Lee’s testimony before Congress.

Hearing on the “Digital Future of the United States: Part I — The Future of the World Wide Web”

It’s fascinating reading, and extremely quotable. But one part that really struck me is in the first paragraphs (emphasis added):

To introduce myself, I should mention that I studied Physics at Oxford, but on graduating *discovered the new world of microprocessors* and joined the electronics and computer science industry for several years. In 1980, I worked on a contract at CERN, the European Particle Physics Laboratory, and *wrote for my own benefit a simple program for tracking the various parts of the project using linked note cards*. In 1984 I returned to CERN for ten years, during which time I found the need for a universal information system, and *developed the World Wide Web as a side project in 1990*.

While TBL didn’t invent the Internet itself, his bit of brilliance made it relevant for the masses. Even though that wasn’t his intention right off the bat, it became so as he realized the implications of what he’d done.

But let’s look at the three bits I’ve emphasized:

  1. He started studying Physics, then decided to follow a side interest in microprocessors.
  2. He created an e-notecard system for himself, on the side of (and to help with) what he was contracted to do at CERN.
  3. He developed a universal version of his notecard system so everyone could share and link together, as a side project in 1990.

Imagine the world impact of those three “side projects”?

This really raises the question for any organization: does it give its members the leeway for “side” interests? Or are they considered inefficient, or just odd?

It’s not that every person is going to invent another Web. It’s more that the few people who might do something like that get trampled before they get started, and that the slightly larger group of people who might do something merely impressive are trampled in the same way.

There was a time when amateurs were the experts — they were the ones who dabbled and learned and communicated in excited screeds and philosophical societies. They were “blessed” to have the time and money to do as they pleased, and the intellectual curiosity to dig in and dirty their hands with figuring out the world.

It could very well be that we’re in the midst of a similar rush of amateur dabbling. Just think of all the millionaires who are now taking on things like AIDS, malaria, and space flight. Or the empowerment people have to just go and remix and remake their worlds. There’s an excellent O’Reilly Conference keynote I wish I’d seen, but the PDF of the slides gives a decent accounting. Here’s an abstract:

Rules for Remixing; Rael Dornfest & Tim O’Reilly

Citizen engineers are throwing their warranties to the wind, hacking their TiVos, Xboxes, and home networks. Wily geeks are jacking Jetsons-like technology into their cars for music, movies, geolocation, and internet connectivity on the road. E-commerce and network service giants like Amazon, eBay, PayPal, and Google are decoupling, opening, and syndicating their services, then realizing and sharing the network effects. Professional musicians and weekend DJs are serving up custom mixes on the dance floor. Operating system and software application makers are tearing down the arbitrary walls they’ve built, turning the monolithic PC into a box of loosely coupled component parts and services. The massive IT infrastructure of the ’90s is giving way to what analyst Doc Searls calls “do-it-yourself IT.”
We see all of this as a reflection of the same trend: the mass amateurization of technology, or, as Fast Company put it, “the amateur revolution.” And it’s these hacks, tweaks, re-combinations, and shaping of the future we’re exploring in this year’s Emerging Technology Conference theme: Remix.

I saw Mark Frauenfelder on The Colbert Report last night, talking about Make Magazine and the very things mentioned in the abstract above. Colbert marveled at the ingenuity, and I wondered how many people watching would think to themselves: “Hey, yeah! Why not just take things apart and change them to the way I want them???”

It’s on the rise, isn’t it? Wow. Another sea change, and I’m not even 40. What a time to be alive.

All hail side projects and passionate tangents. Long may they reign.

But for now… I gotta get back to work.