It’s about time we got there. We began with a model in which learners worked either independently or in a close relationship with mentors, instructors, or teachers, absorbing from them knowledge built upon scarce information that only those teachers possessed. Knowledge, and its power, was a direct correlate of what you could remember and recall when the time and place demanded it.
But the growth of facts began to challenge even the most capacious human minds. We took to recording them, laboriously by scribes, onto an external storage medium. The effort made copies scarce and costly, adding economic value to the mix. Among the knowledge elite this precious external storage medium was prized and guarded. Practitioners of the transmission of knowledge were seen at times silently mouthing words and sentences in a mystical, and to some deeply frightening, practice of ‘quiet reading’. They translated the coded representations of knowledge on the fly as their eyes danced across the storage medium, bringing them to express things that those around them knew to be beyond their experience – and scaring the unlearned with the “power” in these new devices… books.
The revolution of the printing press democratised access to information. Knowledge was no longer the preserve of the rich and powerful. The transition took several hundred years, but it laid part of the foundation for the explosion of knowledge that characterised the Renaissance. It also changed the way we think. What once counted as knowledge by virtue of memory and recall could now be stored outside the little grey cells in your cranium. We needed an indexing system to track all of that externally stored information, and mechanisms for its efficient retrieval.
Various mechanisms emerged: descriptions of the location of the physical objects (the library of the Persian city of Shiraz, 10th century (1)), the location itself coded by numbers (the library at Amiens Cathedral in France (2)), or Thomas Hyde’s Incunabulum, a printed catalogue of the books in the Bodleian Library, Oxford University. All were attempts to organise, and make more accessible to humans, the increasingly vast body of information accumulating in the world of knowledge creation.
We have always, as tool-making creatures, used our ability to build things to improve our existence. Initially this was focused on survival, but as we became more capable it quickly spread to other ways of making life easier, better and more fulfilling. To organise our knowledge we built repositories in the form of library collections, indexed at first somewhat arbitrarily and later through classification schemas (e.g., the Dewey Decimal System).
The advent of representing information more abstractly, as binary codings of human-readable characters, launched the digital revolution. The initial physical manifestations derived from the cutting-edge technology of the period – gears, levers and pulleys gave way to a re-appropriation of the loom, with punched holes read as ones and their absence as zeros. The “Jacquard head” for dobby looms provided the insight connecting a pattern in the abstract with the Jacquard weaving that resulted. His key idea was the use of punched cards to represent a sequence of operations, leading to the Analytical Engine of Charles Babbage and later the card tabulating machine of Herman Hollerith, built to perform the 1890 US Census.
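The binary coding of human-readable characters mentioned above can be sketched in a few lines. This is only an illustration using the modern ASCII convention (an assumption on my part – the text does not name a specific encoding), with each character mapped to an eight-bit pattern, much as a punched card maps operations to holes and their absence:

```python
def to_binary(text: str) -> list[str]:
    # Map each character to its 8-bit ASCII pattern
    # ('1' for a punched hole, '0' for its absence, by analogy).
    return [format(byte, "08b") for byte in text.encode("ascii")]

bits = to_binary("loom")
print(bits)  # ['01101100', '01101111', '01101111', '01101101']
```

The same idea scales from four characters to a library catalogue: once text is a sequence of bits, it can be stored, indexed and retrieved by machinery rather than memory.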
Tying all this together is the use of technology to augment the human intellect. Fast forward to the end of World War II, and the same concern – the proliferation of information, and ways to find and use it rather than continue a cycle of rediscovery – was expressed by Vannevar Bush in the classic article “As We May Think”, printed in The Atlantic in July 1945 (3). In it he proposed, based on the state of the art of his day, the Memex, a machine to record knowledge by mimicking the human search process through what he termed Associative Trails. He wanted to record not just the artefacts but the way in which humans thought through the steps that led to the formation of those artefacts. Further, he wanted to make these shareable, so that others could see not just the result but the process by which that result was derived.
It took another 23 years before the technology of the day could at least attempt a physical implementation of this idea. It was presented to the world in a breathtaking live demonstration by Doug Engelbart at the Fall Joint Computer Conference in San Francisco, California – the “Mother of All Demos” (4). The demo was an attempt to show, rather than talk about, something Engelbart had written six years before in the landmark paper Augmenting Human Intellect: A Conceptual Framework (5). In one 100-minute ‘show and tell’, Doug demonstrated the computer mouse, video conferencing, teleconferencing, hypertext, word processing, hypermedia, object addressing and dynamic file linking, bootstrapping, and a collaborative real-time editor. Our world was forever changed.
Today we routinely offload memory into silicon. Some call this the dumbing down of our intellect by Google (6), but we have been doing it for centuries – we just do it better and more efficiently today. And while it may appear we are mired in our devices – walking head-down into street signs as we text, or ‘friending’ people who aren’t present while sitting at the dinner table with friends, through the devices formerly known as phones – we are moving past that era.
What marks this shift? The post-digital age is like prior radical transitions – it is marked by the fact that we no longer recognise it as different. Think back to when your parents had an icebox, replenished with block ice at least daily. And then something happened: refrigeration. In less than a generation we went from astonishment at this miracle to forgetting that the world had ever been different.
Look at children playing today with their parents’ smartphones, or perhaps their own tablet computers. When they walk up to pictures now, they naturally try to manipulate them with the now-ubiquitous thumb-and-forefinger spread to zoom the image. We walk around with digital sensors measuring our gait, altitude, and velocity, and glance at our ‘phones’ (we need a new term for these) to see the dashboard of our activity.
More importantly, we are starting to see and think in ways we couldn’t before, because our devices are shaping what we conceive as questions. In 2011, we began to make ‘movies’ by directly recording the impulses from voxels in the brain, reconstructing imagery recalled from the memory of movies the subjects had seen (7). We are on the verge of communicating rich media from neural storage to the sensors that pre-process it in others. It will not be long before we can transfer these memories, bypassing their biological encoding. Will they be ‘memories’ at all without this step? Will we perceive them the same as those created by our own neural infrastructure? We don’t know yet, but we soon will.
As the embedding of digitally enabled devices extends the concept of the Internet of Things (8) to the interconnection of networked objects with ourselves, we silently enter the post-digital age. As David Foster Wallace wrote in 2008, “the most obvious, ubiquitous, important realities are often the ones that are the hardest to see and talk about” (9). Fish don’t see water, but we must. Welcome to the post-digital age.
(2) Joachim, Martin D. (Ed.). Historical Aspects of Cataloging and Classification, Volume 2. The Haworth Information Press, Binghamton, NY, 2003, p. 460.
(7) Nishimoto, S., et al. (2011). Reconstructing visual experiences from brain activity evoked by natural movies. Current Biology, 21(19), 1641–1646.
(8) Ashton, Kevin (22 June 2009). “That ‘Internet of Things’ Thing: In the real world, things matter more than ideas”. RFID Journal.