
THE PROBLEM

Memory institutions are facing increasing transfers of electronic documents and other media content for long-term preservation. In this context, they need to be sure that content is produced according to a standard, tested for conformance, and (if needed) re-processed for corrections within an iteration that remains under the full control of the institutions themselves or of the curators in charge of managing and preserving the electronic documents and other media content in the long term. This is particularly valid for digitisation projects and institutional archives, where archivists may have more input into record creation and management.

Preservation models are often inspired by standards, such as the Open Archival Information System (OAIS) reference model, where transfers and preservation are built on information packages containing both data and metadata. Metadata is normally stored or exported in XML and specified by different schemas. XML is a stable and easily accessible standard, and the schema specifications, such as METS, PREMIS and EAD, are controlled by the community of professional curators in digital preservation through different international boards and committees. Data, on the other hand, is normally stored in specific file formats for documents, images, sound, video, etc., produced by software from different vendors. Even if the transferred files are in standard formats, correct implementation of those standards cannot be guaranteed, and results may differ depending on the software used. The main reason is that the software implementing the standards for producing the electronic files is controlled neither by the institution that produces them nor by the memory institution that holds the archive.
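
To make the metadata side of this concrete, the following is a minimal sketch of validating an exported XML metadata file against its schema using the lxml library in Python. The file names and the locally stored schema copy are placeholders, not part of any specific institution's workflow.

```python
# Minimal sketch: checking an exported metadata file against its XML schema
# (e.g. a METS document against the METS XSD) with lxml.
from lxml import etree

# Parse a local copy of the schema (placeholder file name).
schema_doc = etree.parse("mets.xsd")
schema = etree.XMLSchema(schema_doc)

# Parse the metadata file delivered inside the information package.
metadata = etree.parse("package_metadata.xml")

# validate() returns True if the document conforms to the schema;
# error_log lists every violation with its line number.
if schema.validate(metadata):
    print("Metadata conforms to the schema.")
else:
    for error in schema.error_log:
        print(f"line {error.line}: {error.message}")
```

Because the schemas themselves are maintained openly by the preservation community, a check of this kind can be run entirely under the institution's control; the harder problem, discussed below, is the data files produced by vendor software.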

As a result, memory institutions have to perform conformance tests before accepting transfers of electronic collections, to verify that the files have been produced according to the specifications of a standard file format and hence match the acceptance criteria for long-term preservation established by the memory institution. However, the software used to perform these tests is, in turn, not controlled by the institution.
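
As a hypothetical illustration of such an acceptance check at ingest time, the sketch below routes every file in a transfer to a format validator before the collection is accepted. The directory layout, the validator registry and the deliberately naive PDF header check are illustrative assumptions only; a real conformance test would apply the full format specification.

```python
# Hypothetical ingest-time acceptance check: each file in a transfer must pass
# a validator registered for its format, otherwise the transfer is flagged.
from pathlib import Path
from typing import Callable, Dict, List, Tuple

def check_transfer(transfer_dir: str,
                   validators: Dict[str, Callable[[Path], bool]]) -> List[Tuple[Path, str]]:
    """Return (file, problem) pairs; an empty list means the transfer
    meets the institution's acceptance criteria."""
    problems = []
    for path in sorted(Path(transfer_dir).rglob("*")):
        if not path.is_file():
            continue
        validate = validators.get(path.suffix.lower())
        if validate is None:
            problems.append((path, "no validator registered for this format"))
        elif not validate(path):
            problems.append((path, "file does not conform to the format specification"))
    return problems

# Deliberately naive example validator: only inspects the PDF file header.
def looks_like_pdf(path: Path) -> bool:
    return path.read_bytes().startswith(b"%PDF-")

report = check_transfer("incoming/transfer_001", {".pdf": looks_like_pdf})
for path, problem in report:
    print(f"REJECT {path}: {problem}")
```

The point of the sketch is structural: whatever validators are plugged in, the institution still depends on third-party software to implement the format specification correctly.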

This situation, in which conformance to standards is not guaranteed, may result in increased costs. Furthermore, it poses problems for curation and long-term preservation, since data objects meant for preservation that pass through an uncontrolled, degenerative process can jeopardise the whole preservation exercise. Migration of data files, for example, can become more or less impossible to carry out while keeping the authenticity and integrity of the files intact. This is particularly true for born-digital archives, and it is the reason why preservation and curation of these objects is so difficult.
