Hyperactive


I got a link through on a feed this morning to a piece on "The Conversation", about an article written eighty years ago by Vannevar Bush on information overload and its impact on the efficiency and effectiveness of academic research. In his original piece of July 1945, he argued that the sheer weight of information produced by academia in the form of papers, books and journal articles stood as a barrier to the useful absorption and cataloguing of those data for re-use in future research. In those days, research was very much a case of sifting through physical catalogues and index files in order to make the connections between ideas and the concrete findings of individual researchers, and so to advance knowledge further.

He argued that the volume - even then - of works which needed to be considered in any rational assessment of a particular subject's development was too onerous, given the very linear nature of the filing structures and access methods then available, to be of much use at all, which naturally led to many otherwise valuable avenues of thought lying completely unnoticed. What he proposed in his thought experiment was what he termed, arbitrarily, a 'Memex': a device - he envisaged a physical desk with as yet undeveloped technology at its heart - that would enable a researcher to link ideas, documents and artefacts in the associative manner of human thought: the web-like way that the brain can jump from one strand of thought or argument to another, then another, and another, almost without limit and in very short order. Niklas Luhmann posited and employed a physical manifestation of the idea with his Zettelkasten [blog posts passim] to assist with his writings: he called it his '...second memory...'.

In the 1960s, Ted Nelson and Doug Engelbart developed the concept of 'hypertext', whereby documents, articles, fragments of media and so on could be cross-linked, forming chains of associations that would assist research and learning. The idea came of age in the 1980s and 1990s with the development of Bill Atkinson's HyperCard [blog posts passim], ultimately finding fruit in Tim Berners-Lee's World Wide Web [cf. likewise]: the web as we know it today. What is interesting and disturbing to me in equal measure is how quickly the application of hyper-linking of data has led from Vannevar Bush's state of information overload to our current state of information overload. It's as if nothing has fundamentally changed.

We simply seem to have allowed this new methodology to become as bloated as the old. What was meant to clarify has rapidly become as confusing and obfuscatory as that which came before, with little real progress in general information handling: it is still just as difficult as ever to sort the wheat from the chaff. The situation is not exactly helped by all the hype surrounding A.I. in this field, with human actors relying more and more on machine inputs and less and less on the associative powers of their own minds to do the sorts of tasks that we have hitherto managed ourselves. There is a place in life for systems of assisted thought, whatever they may be, but the kicker is that you have to think for yourself in the first place to make any meaningful use of them whatsoever...
