Information Overload, 140 Characters at a Time
July/August 2009 (Vol. 13, No. 4) pp. 4-5
1089-7801/09/$31.00 © 2009 IEEE

Published by the IEEE Computer Society
Fred Douglis · Data Domain · f.douglis@computer.org
OK, I lied. I said in my last column that I expected to talk about cloud computing for the next few issues, leading into the special issue on the topic next time around, but in the meantime I had an epiphany. Or perhaps I was merely deluged with so much news on the topic of social networks, and especially Twitter, that I couldn't fend off the idea that this called for immediate commentary.
Of course, I've been hearing about Twitter for some time, and I've even tried it myself a bit, but mostly to follow other users and not to subject my few followers to my own pronouncements. Don't try to look me up, though — for now, my only account isn't associated with my real name. I'll probably create another public account there in the near future if I actually decide to publish, rather than perish. But I get ahead of myself.
It's clear that Twitter has taken the usual evolutionary step from geek niche to mainstream: just this year, Twitter handles have become almost as commonplace in advertisements, articles, radio spots, and so on as URLs once were. Ashton Kutcher has reportedly passed the 1-million-follower mark, and numerous other celebrities have a Twitter presence.
My decision to write about Twitter this issue was spurred by its appearance on the cover of Time magazine in early June. The article included numerous examples of real "tweets," showing a remarkable variety of serious communication, jokes, and utter nonsense. But one item that struck me was that when Oprah Winfrey (or her ghost-tweeter) made a remark about getting ticks off a dog, her account got tens of thousands of tweets in reply! Whoa.
My initial reaction to Twitter was, basically, who cares? The kind of mundane information people often tweet about (and which winds up in their Facebook status feeds, where I see it) doesn't seem to serve much purpose. The Time article's claim that such mundane status updates foster a sense of community might have some merit, but I remain unconvinced. I honestly don't care how many of my social-network friends are eating a bagel or a donut for breakfast.
Lately, however, I've seen a number of genuinely useful aspects of Twitter, and I'm coming around. One example was a tweet by Rob Glaser (CEO of RealNetworks) asking for information about a massive traffic jam he was in, followed by a comment that a reply told him where the road cleared. Another is the growing trend of attendees sending updates from conferences and trade shows, remarking on interesting items and often providing pointers (via "shortened URLs" like is.gd/…) to more information. Not long ago, it was commonplace to "live blog" an event with frequent posts, but Twitter seems to be overtaking that model. And it is.gd. Furthermore, the ability to join real-time conversations about topics of interest is nice, and it goes beyond old-style Internet Relay Chat and AOL chatrooms because topics can come and go dynamically, created simply by adding a #keyword to a message.
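To make the hashtag mechanism concrete, here is a minimal sketch in Python of carving an ad hoc topic channel out of a raw message stream; the handles, tweets, and shortened links are invented purely for illustration, and Twitter's actual APIs aren't shown:

```python
import re

# Hypothetical (user, text) pairs standing in for a raw tweet stream;
# the handles, tags, and is.gd links here are all made up.
stream = [
    ("alice", "Great dedup talk this morning #fast09 is.gd/xyz"),
    ("bob",   "Eating a bagel for breakfast"),
    ("carol", "Slides from my #fast09 session: is.gd/abc"),
]

def topic(stream, tag):
    """Return only the tweets mentioning the given #hashtag."""
    pattern = re.compile(re.escape(tag) + r"(?!\w)", re.IGNORECASE)
    return [(user, text) for user, text in stream if pattern.search(text)]

for user, text in topic(stream, "#fast09"):
    print(f"{user}: {text}")
```

No registration or setup is involved: the "channel" exists only because two messages happen to contain the same tag, which is exactly what lets topics appear and vanish so dynamically.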
Still, I have to wonder what the endgame is. Just how many frequent posters can one person "follow" on Twitter without being so overloaded with updates that it becomes unmanageable? Following updates in FriendFeed, via Facebook, or even on the Twitter Web page is a pull activity: you see a batch of updates, all at once, when you choose to look for them. Having updates pushed to your mobile device is another story, and a bridge I'm not ready to cross.
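The arithmetic behind that worry is simple enough to sketch; the per-account rates below are pure assumptions, chosen only to show how quickly a timeline becomes unmanageable:

```python
# Hypothetical rates, purely to illustrate how fast a timeline fills up.
accounts_followed = 200    # a modest "following" list
tweets_per_day = 5         # assumed average posting rate per account
waking_hours = 16

updates_per_day = accounts_followed * tweets_per_day    # 1,000
updates_per_hour = updates_per_day / waking_hours       # ~62

print(f"{updates_per_day} updates/day, ~{updates_per_hour:.0f} per waking hour")
```

At roughly one update a minute, all day long, reading everything stops being an option.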
Returning to the Oprah story, another question is how public figures can use Twitter without being overwhelmed by it. Most likely, they arrange things so that communication is one-way: the tens of thousands of @Oprah tweets might exist in the system, but they won't be delivered to her account as an enormous queue of things to look at. Instead, only the accounts Oprah follows would have their @Oprah messages get past her firewall.
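One plausible way to implement such filtering (my speculation, not Twitter's documented design) is to deliver an @mention only when its sender is an account the recipient follows. A minimal Python sketch, with made-up handles:

```python
# Speculative sketch: deliver an @mention only if its sender is someone
# the recipient follows; everything else stays in the system, unread.
def deliver_mentions(mentions, following):
    """mentions: list of (sender, text) addressed to the recipient;
    following: set of accounts the recipient follows."""
    return [(sender, text) for sender, text in mentions if sender in following]

# Made-up handles and messages, for illustration only.
mentions = [
    ("a_friend", "@Oprah loved today's show"),
    ("fan12345", "@Oprah my dog has ticks too!"),
]
print(deliver_mentions(mentions, following={"a_friend"}))
# -> only the message from the followed account is delivered
```

The key design point is that the filter runs on delivery, not on posting: the fans' tweets still exist publicly, but the celebrity's reading queue stays bounded by her own follow list.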
The difference between public and private conversations on the Internet is quite telling, and it applies in many domains beyond Twitter with the same effect. If I comment on a colleague's blog, the odds are good that there will be only 2, 5, 10, or maybe 20 comments in total, and I can read them, respond, and engage in real dialog. If I comment on a Slashdot posting, an NPR news item, or something else with wide readership, not only is my comment lost in a sea of other commentary, but I also have little hope of discerning the one reply that's really directed at my earlier comment. (Some sites, such as Slashdot, use threaded comments, so a reply to my posting can be tagged as such, but far too many sites have just a continuous stream of time-ordered comments, many of them redundant.) Add to this all the spurious junk — think of all the people who try to post simply "first" in response to Slashdot stories, public Facebook group postings, and so on — and looking for interesting content becomes a search for a needle in an ever-growing haystack.
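The value of threading is easy to see in code. A minimal sketch, assuming each comment carries an id and an optional parent_id (the comments themselves are invented), shows how a threaded site can surface the one reply aimed at my comment, which a flat time-ordered stream cannot:

```python
from collections import defaultdict

# Hypothetical comments: (id, parent_id, author, text); parent_id is None
# for a top-level comment, so replies are explicitly linked to their target.
comments = [
    (1, None, "me",      "Here's my take on the article..."),
    (2, None, "someone", "first"),                  # the usual noise
    (3, 1,    "reader",  "Re your take: I disagree, because..."),
    (4, None, "other",   "A redundant restatement of comment 1"),
]

# Index the thread structure: parent id -> list of direct replies.
children = defaultdict(list)
for cid, parent, author, text in comments:
    children[parent].append((cid, author, text))

# With threading, the replies aimed at my comment (id 1) are one lookup
# away; in a flat, time-ordered stream, comment 3 would just be buried.
for cid, author, text in children[1]:
    print(f"{author} replied: {text}")
```

Sites that store only a flat list have thrown away the parent link, which is precisely why a reply to me is indistinguishable from the rest of the haystack.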
My own prediction is that Twitter will continue to be successful for a while, but if it doesn't improve its communication model, it will be overtaken by a competing service with a better handle on controlling information, one that lets people get high-value content while avoiding the dreck. Time will tell. In the meantime, you might want to check out Twitter yourself, and while you're at it, look up the IEEE Computer Society's online magazine portal, Computing Now, at http://twitter.com/computingnow.
For more information on this or any other computing topics, please visit the IEEE Computer Society Digital Library at www.computer.org/publications/dlib.
The opinions expressed in this column are my personal opinions. I speak neither for my employer nor for IEEE Internet Computing in this regard, and any errors or omissions are my own.