Gone for good: The digital dark age.
A disturbing report last month in the Guardian noted concern over the rapid evolution of computer systems and the consequent inability of contemporary computers to open files produced over the past twenty years. The situation has become serious enough for the British National Archives and Microsoft to form a partnership to “…prevent what was described as ‘a digital dark age’, and unlock millions of unreadable stored computer files”.
As the Guardian notes:
While anyone able to follow a Norman scribe’s handwriting can read the Domesday Book, on display at the archives, and work out who owned land, fishing rights and beehives in the 11th century, the software for many 10-year-old files – including thousands of government records – is already obsolete.
The solution is a version of Virtual PC, which allows “…users to run multiple operating systems simultaneously on the same computer, and unlock what are called ‘legacy’ Microsoft Office formats dating back 15 years or more. The system should retrieve not just the text but, crucially, the formatting and original appearance of the files as they were created”.
It may appear counter-intuitive, but as Natalie Ceeney, chief executive of the National Archives, said: “Digital information is inherently far more ephemeral than paper…This is a critical issue for us, and for UK society as a whole. We assume our personal records are secure, we expect our pensions to be paid, but anyone with a floppy disc even three or four years old is already having a hard time finding a computer that will open it.”
In a world where the internet is now all-pervasive and where it is possible to store data online, it may seem strange that the basic elements of data storage and continuity of access are problematic, but for data stored in bespoke systems this is a very real issue.
Fragmentation of the computer applications market means that multiple companies and corporations sold similar, but not quite identical, applications. Some of those companies went out of business; others were taken over by larger competitors. When an application was discontinued, that tended to be that. The means to access an individual file were constrained by the ability to run the operating system within which the application could operate.
And even then the capacity for one application to open files produced in another was, and remains, limited. A contemporary example would be file transfer and conversion between the desktop publishing applications QuarkXPress and Adobe InDesign. It is possible to do precisely this for some versions of the two applications, but not – unfortunately from the point of view of many designers – for all. And the reasons are as much commercial as technical. Simply put, an application provider will seek – entirely reasonably from their perspective – to retain market share by effectively ‘locking in’ the customer base to a single application. If it is easy to convert and open files in a competitor’s application, that opens the way to the rival application gaining market share.
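One small, practical corner of this problem is simply working out what format an orphaned file is in, once its application is gone and its extension may be missing or misleading. A common technique is to inspect the file’s leading “magic number” bytes. Below is a minimal sketch in Python; the signatures listed are real, published ones, but the function and the small set of formats covered are illustrative only:

```python
# Sketch: guessing a file's format from its "magic number" header bytes,
# rather than trusting a (possibly lost or renamed) file extension.
# Signatures are real published ones; the selection here is illustrative.

MAGIC_SIGNATURES = {
    b"\xd0\xcf\x11\xe0\xa1\xb1\x1a\xe1": "OLE2 compound document (legacy MS Office .doc/.xls/.ppt)",
    b"PK\x03\x04": "ZIP archive (also Office Open XML .docx/.xlsx)",
    b"%PDF": "PDF document",
    b"\x89PNG\r\n\x1a\n": "PNG image",
}

def identify(path):
    """Return a best-guess description of the file's format, or None."""
    with open(path, "rb") as f:
        header = f.read(16)  # longest signature above is 8 bytes
    for signature, description in MAGIC_SIGNATURES.items():
        if header.startswith(signature):
            return description
    return None  # unknown format -- precisely the archivist's problem
```

Identifying the format is, of course, only the first step: knowing a file is an OLE2 compound document does not by itself recover its text and formatting, which is exactly why the National Archives turned to emulation.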
I suspect that the point about opening files on discs three or four years old is overstated, but, across longer time scales there are evident problems.
For example, I have some files produced in the early desktop publishing application, the cheerily named Ready, Set, Go!, stored on a floppy disk somewhere. These would have been produced in the late 1980s. I have a 3.5″ disk drive, also located in that unknown somewhere. I don’t have Ready, Set, Go! The application was purchased by the Diwan corporation and, although still extant, I have never come across it in use in the intervening years. I could, I imagine, purchase the latest version of RSG!, but I doubt that version would open my 1989 files, or indeed that the system software on my computer would run any previous version of RSG! with which I might be able to open them. To do so I would probably need a computer at least ten years old, running software that was older still. It is just about possible that I might be able to open the files in another application, but it is highly likely that they would not open with precisely the same formatting.
So, to all intents and purposes, the work I produced in 1989 is irretrievable. This may well be a good thing, and if I were entirely serious about such matters I could reproduce the work from print-outs. That would be inconvenient, and it would not be the original, merely an approximation of same.
Similarly, anyone who has had the dubious pleasure of attempting to co-ordinate databases across different operating systems (Windows and Macintosh, or indeed the late lamented OS/2) will recognise that the situation there was even more difficult.
Yet, this encapsulates on a minor scale the much more serious issues that face those who hold critical data. I would argue that this demonstrates an endemic short-term approach in the digital media. There is – to some degree – an element of “Year Zero” to much that is produced in it. I’ve noted previously how there is little in the way of a defined history of the field of visual communications or visual and material culture, but while that is a conceptual deficit the short-term approach is very much a practical deficit. Digital media exists in an ever-present now.
But humans don’t. Health, financial and other data is not ephemeral. It is critical to personal and social well-being across lifetimes. The ability to access it at any point in that lifetime is crucial. The possibility that it may be inaccessible is something approaching a societal catastrophe.
One might point to a certain irony in Microsoft having to work with the National Archives to permit files produced in its own applications to be read. Surely such backwards compatibility should have been written into the code of those applications? But that would be to ignore the way in which software has had to – as it were – adapt to rapidly changing microchip technology. Microsoft are often criticised, but they actually have quite a good track record in these matters.
There are other problems. Many storage media are relatively short-lived: tape and CD-ROMs can degrade within a decade. Transferring files from one medium to another incurs the risk of data loss or corruption. And even where storage is successful, we return to the situation described above, where access may be limited.
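The corruption risk in transferring files between media can at least be detected, if not prevented, by comparing cryptographic checksums of the source and the copy – a standard practice in digital archiving. A minimal sketch in Python, with hypothetical file paths:

```python
# Sketch: verifying that a file survived a transfer between storage media
# bit-for-bit intact, by comparing SHA-256 checksums of source and copy.
# The hashing approach is standard archival practice; paths are hypothetical.
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Hash a file in 1 MB chunks so large files need not fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_copy(source_path, copy_path):
    """True if the copy is bit-for-bit identical to the source."""
    return sha256_of(source_path) == sha256_of(copy_path)
```

A matching checksum guarantees the bits arrived intact; it says nothing, of course, about whether any application will still be able to interpret them in twenty years.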
An analogous process has been seen in television, where programmes from the 1950s and 1960s were stored on short-lived media which were overwritten or discarded. The cultural artefacts lost ranged from the trivial to the irreplaceable.
Is this overstated? Is this to some extent the equivalent of Y2K? I don’t think so. In a compelling overview of the problem Stewart Brand notes that:
…we are now in a period that may be a maddening blank to future historians–a Dark Age–because nearly all of our art, science, news, and other records are being created and stored on media that we know can’t outlast even our own lifetimes. We arrived at this situation partly because digitization otherwise offers so many profound benefits. We can now store, search, and cross-correlate literally everything. In fact, according to estimates by Bellcore’s Michael Lesk, who calculated the total amount of data there is in the whole world, storage has now surpassed data, probably permanently. There is more room to store stuff than there is stuff to store. We need never again throw anything away. That particular role of archivists and curators has become obsolete.
The problem is that the structures and systems we have established are insufficiently flexible to deal with long-term access and storage. The inbuilt logic of commercial competition and rapid technological development has led to gaps and dislocations in the overall process. Too great a quantity of the ‘stuff to store’ is slipping beyond access, and out of memory.
And once gone it is gone for good.