The Standard of Metadata

The creation of robust metadata is of primary importance to the success of any digital project. Metadata is typically defined as “data about data”. Good metadata makes it possible to catalogue and present digital information effectively to the public; it typically describes how an image was digitised, its format and its provenance. A wide variety of metadata schemes currently exist, but to date no single metadata standard has gained worldwide acceptance, so the decision about which standard to use should be made before materials are digitised. As the Digital Repository of Ireland puts it, ‘quality metadata is what is fit for purpose’ (DRI, 2015).

Digitising audio-visual material involves more than simply converting it into a digital file; the accompanying data must be created and managed as well. Metadata must be recorded every time a component undergoes processing so that its history can be tracked, and the parameters used in each process must themselves be recorded. The long-term maintenance and preservation of digital files is now of great concern to institutions worldwide. It has been suggested that preserving a project’s digital data could cost as much as ten per cent of the initial cost of the project per year. To maintain the ability to display, retrieve and use digital collections, digital files must be actively cared for: hard drives have a limited lifespan, and operating systems are constantly updated.
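To illustrate the point in practice, the following minimal sketch (in Python, with field names that are purely illustrative assumptions rather than any prescribed schema) shows how digitisation parameters and subsequent processing events might be recorded so that a file’s history can be reconstructed later.

    from dataclasses import dataclass, field
    from datetime import datetime, timezone

    @dataclass
    class ProcessingEvent:
        """One step in a file's history, recorded at the moment it happens."""
        action: str          # e.g. "scanned", "cropped", "format migration"
        performed_at: str    # ISO 8601 timestamp
        parameters: dict     # the settings used, e.g. resolution or codec

    @dataclass
    class ImageMetadata:
        """Minimal technical and provenance metadata for one digitised image."""
        identifier: str                  # persistent identifier for the master file
        source: str                      # the physical item the scan was made from
        file_format: str                 # e.g. "image/tiff"
        capture_device: str
        capture_resolution_dpi: int
        history: list = field(default_factory=list)

        def record_event(self, action: str, parameters: dict) -> None:
            """Append a processing event so the file's history stays complete."""
            self.history.append(ProcessingEvent(
                action=action,
                performed_at=datetime.now(timezone.utc).isoformat(),
                parameters=parameters,
            ))

    # Example: the record grows each time the image is processed.
    record = ImageMetadata(
        identifier="coll-001/item-0042/master",
        source="Glass plate negative, Box 12",
        file_format="image/tiff",
        capture_device="Flatbed scanner",
        capture_resolution_dpi=600,
    )
    record.record_event("scanned", {"bit_depth": 16, "colour_space": "Adobe RGB"})
    record.record_event("derivative created", {"format": "image/jpeg", "quality": 85})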
Digital preservation is a rapidly changing and complex field, and there are open questions about how to ensure the long-term stability of high-quality digital data. Some consider the main component in the preservation of digital files to be the establishment of a central digital repository. Is this the right way to attain these goals, essentially placing all the eggs in one basket? It has been suggested that, before undertaking a digitisation and preservation project, companies should make sure they have an archiving solution in place that allows them to easily archive, search and retrieve the content. But there may still be reason to consider older formats, microfiche for example. A major requirement is that metadata remain closely attached to the project media throughout the workflow, including in the archive. The archive database must also be highly scalable in order to store, manage and index the millions of metadata entries generated by any preservation project. Should that central repository be lost for whatever reason, everything is lost: both images and metadata. If the metadata were also stored in hard copy, on microfiche for example, the search parameters would still exist, and the image might still be retrieved in another format or through a simple web search. There is no doubt that high-quality metadata is essential to preservation projects; the storage of irreplaceable data will always be the challenge.
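One way of keeping metadata closely attached to the media, and of hedging against the loss of a central index, is to store each record as a plain-text sidecar file alongside its master image. The sketch below is a hypothetical illustration of that approach (the paths and field names are invented for the example), not a prescription drawn from any of the cited guidelines.

    import json
    from pathlib import Path

    def write_sidecar(master_file: Path, metadata: dict) -> Path:
        """Write the metadata as a plain-text JSON 'sidecar' file stored next to
        the master image, so the record survives independently of any central
        database index and can be read (or printed to hard copy) without it."""
        sidecar = master_file.parent / (master_file.name + ".json")
        sidecar.parent.mkdir(parents=True, exist_ok=True)
        sidecar.write_text(json.dumps(metadata, indent=2, ensure_ascii=False),
                           encoding="utf-8")
        return sidecar

    # Example usage with an illustrative file path and record.
    write_sidecar(
        Path("archive/coll-001/item-0042.tif"),
        {"identifier": "coll-001/item-0042/master",
         "title": "Glass plate negative, Box 12"},
    )

Because the sidecar is human-readable text rather than a proprietary binary format, the same record could equally be exported to another system or reproduced in hard copy.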

Bibliography:

McCarthy, Kate, ‘Metadata Quality Control’, Digital Repository of Ireland (Dublin: Royal Irish Academy, 2015), DOI: 10.3318/DRI.2015.1, http://dri.ie/sites/default/files/files/metadata-quality-control.pdf.
Dangerfield, Marie-Claire and Kalshoven, Lisette, Report and Recommendations from the Task Force on Metadata Quality, Europeana Task Force (The Hague, 2015).
Irish Manuscripts Commission, Digitisation Policy (Dublin, 2007), http://www.irishmanuscripts.ie/downloads/IRISH%20IMC%20DTF%20report%20210607.pdf.
Puglia, Steven, Reed, Jeffrey and Rhodes, Erin, Technical Guidelines for Digitizing Archival Materials for Electronic Access: Creation of Production Master Files – Raster Images, U.S. National Archives and Records Administration (NARA).