
PASIG 2017: Ageing of Digital – Towards Managed Services for Digital Continuity

PASIG 2017 (Preservation and Archiving Special Interest Group) was hosted in Oxford this year at the Natural History Museum by Bodleian Libraries & Digital Preservation at Oxford and Cambridge (DPOC). I attended all three days (11th–13th September) and, when I wasn’t working, had the opportunity to listen to some thought-provoking talks centred on digital preservation.

One of the highlights of the conference for me was a talk given by Natasa Milic-Frayling, the founder of Intact Digital. The presentation, entitled ‘Ageing of Digital: Towards Managed Services for Digital Continuity’, demonstrated some of the innovative ways in which digital preservation issues are being approached.

Digital technology has a short lifespan; hardware and software become redundant and obsolete in a very short time. Outdated software that no longer receives vendor support or updates is known as ‘legacy software’.

This poses the problem: how can we manage the life-cycle of digital in the face of a dynamic and changing computing ecosystem?

Technologies are routinely changed, updated (sometimes at a cost), made redundant and retired, yet the value of digital assets needs to be protected. In the current climate there is an imbalance of power between technology producers and providers on the one hand and content producers, owners and curators on the other. The providers and producers can move on without the opinion or input of those who use the software.

How do we enable the prolonged use of software to protect the value of digital assets?

A case study was presented that contextualised the problem and the solution. The vendor Tamal Vista Insights provided Cut&Search, software for the automated and semi-automated indexing of digitised manuscripts and digital artefacts that standard OCR cannot handle.
The software was supplied to Fo Guang Shan, an international Chinese Buddhist monastic order with over 200 branch temples worldwide, for use with its digitised manuscript collection. The project is made up of thousands of volunteers and spans many years, well beyond the provider’s expected life-cycle for the product, its primary market lifetime.
Intact Digital provide a managed service that allows for digital continuity. There are several steps in the process, which then provide a number of options to software providers and content producers:
  • Deposit
  • Hosting
  • Remote Access
  • Digital Continuity Assurance Plans

The software can be hosted in a virtual machine and accessed remotely via a browser. The implications of this are far-reaching for projects like those undertaken by Fo Guang Shan: they don’t need to worry about the Cut&Search software becoming redundant, and their digital assets remain protected. For smaller organisations operating on ever-decreasing budgets this is an important step, both for asset protection and for digital preservation.
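
To make the pattern a little more concrete, here is a minimal sketch (my own illustration, not Intact Digital’s actual implementation) of how a preserved legacy command-line tool, kept running inside a hosted environment, might be exposed to users through a browser-facing web layer. The tool name legacy_index is hypothetical, standing in for something like Cut&Search, and Flask is simply a convenient way to sketch the remote-access layer.

# Minimal sketch: exposing a preserved legacy command-line tool over HTTP.
# Hypothetical example only -- 'legacy_index' stands in for a real legacy tool
# kept alive inside a hosted VM/container; Flask provides the browser-facing layer.
import subprocess

from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/index", methods=["POST"])
def run_legacy_tool():
    """Accept an uploaded manuscript image and return the legacy tool's output."""
    upload = request.files["manuscript"]
    path = "/tmp/" + upload.filename
    upload.save(path)

    # Run the preserved binary exactly as it ran in its original environment.
    result = subprocess.run(
        ["legacy_index", "--input", path],
        capture_output=True, text=True, timeout=300,
    )
    return jsonify({"stdout": result.stdout, "returncode": result.returncode})

if __name__ == "__main__":
    # Runs inside the managed environment; users only ever see the HTTP interface.
    app.run(host="0.0.0.0", port=8080)

The point is not the code itself but the separation it illustrates: the fragile, frozen software environment sits on the provider’s side, while users keep a simple, current way in.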

Key areas to develop

Although this is an important step, there is still much work to do, and some key areas that need to be developed were highlighted. Addressing these will result in the sustained use of digital:

  • Economy around “retired” software
  • Legal frameworks and sustainable business models
  • New practices to create demand
  • New services to make it efficient, economical and sustainable

Changes to the Ecosystem

Taking these steps and creating a dialogue between the technology producers/providers and the content producers changes the dynamic of the ecosystem, redressing the imbalance in control.

 

The talk ended with two very pertinent statements:

“Together we can create new practices and new models of extending the life of digital”

“Without digital continuity our digital content, information and knowledge has no future”

As a trainee I still have lots to learn, but a major theme running throughout digital archiving and digital preservation is the need for communication, collaboration and dialogue. Working together, sharing ideas and challenges, is key to securing the future of digital content.

 

A complete collection of the slides relating to this talk can be found here: Milic-Frayling, Natasa (2017): Aging of digital: Towards managed services for digital continuity. figshare. https://doi.org/10.6084/m9.figshare.5415040.v1

Researchers, practitioners and their use of the archived web. IIPC Web Archiving Conference, 15th June 2017

From the 14th to the 16th of June, researchers and practitioners from a global community came together for a series of talks, presentations and workshops on the subject of web archiving at the IIPC Web Archiving Conference. The event coincided with Web Archiving Week 2017, a week-long programme running from the 12th to the 16th of June, hosted by the British Library and the School of Advanced Study.

I was lucky enough to attend the conference on the 15th June with a fellow trainee digital archivist and listen to some thoughtful, engaging and challenging talks.

The day started with a plenary in which John Sheridan, Digital Director of the National Archives, spoke about the work of the National Archives and their challenges and approaches to web archiving. The National Archives is principally the archive of government; it allows us to see what the state saw, through the state’s eyes. Archiving government websites is a crucial part of this record keeping as we move further into a digital age where records are increasingly born-digital. A number of points were made that highlighted the motivations behind web archiving at the National Archives:

  • They care about the records that government is publishing, and their primary function is to preserve those records
  • Accountability for the services government provides online and the information it publishes
  • Capturing both the context and the content

By preserving what the government publishes online, it can be held accountable; accountability is one aspect that demonstrates the inherent value of archiving the web. You can find a great blog post on accountability and digital services by Richard Pope here: http://blog.memespring.co.uk/2016/11/23/oscon-2016/

The records and content published on the internet provide valuable and crucial context for the records that are unpublished, linking the backstory to the published records. This allows for a greater understanding and analysis of the information and will be vital for researchers and historians now and into the future.

Quality assurance is a high priority at the National Archives. Having a narrow crawling focus has both allowed and prompted a lot of effort to be directed into the quality of the archived material, so that it plays back with high fidelity. To keep these high standards, a really good, in-depth crawl can take weeks. Having a small, curated collection is an incentive to work harder on capture.
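
To give a flavour of what automated support for that kind of quality assurance could look like (this is my own illustrative sketch, not the National Archives’ actual workflow), a short script can walk through the WARC file produced by a crawl and flag captures that came back with error status codes and so will not play back well. It assumes the warcio library and a local file called crawl.warc.gz.

# Illustrative crawl QA sketch: flag poor captures in a WARC file.
# Assumes the 'warcio' library (pip install warcio) and a local 'crawl.warc.gz';
# this is not the National Archives' actual quality assurance process.
from warcio.archiveiterator import ArchiveIterator

def report_bad_captures(warc_path):
    """Print captured URLs whose HTTP response was not a 200 OK."""
    with open(warc_path, "rb") as stream:
        for record in ArchiveIterator(stream):
            if record.rec_type != "response":
                continue  # skip request and metadata records
            url = record.rec_headers.get_header("WARC-Target-URI")
            status = record.http_headers.get_statuscode() if record.http_headers else None
            if status != "200":
                print(status or "no HTTP status", url)

if __name__ == "__main__":
    report_bad_captures("crawl.warc.gz")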

The users and their needs were also discussed, as these often shape the way the data is collected, packaged and delivered:

  • Users want to substantiate a point; they use the archived sites for citation on Facebook or Twitter, for example
  • The need of a writer or researcher to cite
  • Legal – what was the government’s stance, or the law, at the time of my client’s case?
  • Researchers’ needs – this was highlighted as an area where improvements can be made
  • Government itself uses the archives for information purposes
  • Government websites requesting crawls before they close – an example of this is the NHS website transferring to GOV.UK

The last part of the talk focused on the future of web archiving and how this might take shape at the National Archives. Web archiving is complex and at times chaotic. Traditional archiving standards have been placed upon it in an attempt to order the records; it was a natural evolution for information managers and archivists to use their existing knowledge, skills and standards to bring this information under control. This has resulted in difficulties in searching across web archives, describing the content and structuring the information. The nature of the internet and the way in which its information is created mean that uncertainty inevitably has to be embraced. Digital archiving could move into a second generation, a ‘2.0’, stepping away from the traditional standards and embracing new standards and concepts.

One proposed approach is the ICA Records in Contexts conceptual model. It proposes a multidimensional description, with each ‘thing’ having its own description, as opposed to the traditional one-size-fits-all unit of description. Instead of a single hierarchical fonds-down approach, the Records in Contexts model uses a description that can be formed as a network or graph. The context of the fonds is broader, linking to other collections and records to give different perspectives and views. Records can be enriched in this way, providing a fuller picture of the record or archive. The web produces content that is in a constant state of flux, and a system of description that can grow and morph over time, creating new links and context, would be a fruitful addition.

Visual diagram of how the Records in Contexts conceptual model works

“This example shows some information about P.G.F. Leveau a French public notary in the 19th century including:
• data from the Archives nationales de France (ANF) (in blue); and
• data from a local archival institution, the Archives départementales du Cher (in yellow).” International Council on Archives, Records in Contexts: A Conceptual Model for Archival Description, p. 93.

 

Traditional Fonds Level Description
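
As a very rough sketch of the difference between the two models pictured above (my own toy illustration, not code from the talk, and the relation names are invented rather than the official RiC vocabulary), a fonds-down description is essentially a tree in which each record sits in exactly one place, whereas a Records in Contexts style description is a graph in which records, agents and collections can be linked to one another across institutions.

# Toy contrast between a hierarchical (fonds-down) description and a
# graph-style, Records in Contexts-like description. Names are illustrative.

# Fonds-down: one fixed hierarchy; each record has exactly one place.
fonds_tree = {
    "Fonds: Leveau notarial papers": {
        "Series: Deeds": ["Item: deed, 1842", "Item: deed, 1851"],
        "Series: Correspondence": ["Item: letter to a client, 1849"],
    }
}

# Graph: the same world described as entities plus typed relations, so one
# agent can link records held by two different institutions at once.
nodes = {
    "record:deed_1842": {"type": "Record", "title": "Deed, 1842"},
    "agent:leveau": {"type": "Person", "name": "P.G.F. Leveau, notary"},
    "recordset:anf": {"type": "RecordSet", "holder": "Archives nationales de France"},
    "recordset:cher": {"type": "RecordSet", "holder": "Archives départementales du Cher"},
}
edges = [
    ("record:deed_1842", "createdBy", "agent:leveau"),
    ("record:deed_1842", "memberOf", "recordset:anf"),
    ("agent:leveau", "documentedIn", "recordset:cher"),
]

def related(entity):
    """Return every typed relation touching an entity, in either direction."""
    return [edge for edge in edges if entity in (edge[0], edge[2])]

# The notary now connects material held by two institutions -- a link that a
# single fonds-down tree cannot easily express.
print(related("agent:leveau"))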

 

I really enjoyed the conference as a whole and the talk by John Sheridan. I learnt a lot about the National Archives’ approach to web archiving, the challenges involved and where the future of web archiving might go. I’m looking forward to taking this new knowledge and applying it to the web archiving work I do here at the Bodleian.

Changes are currently being made to the National Archives’ web archiving site, and it will relaunch on the 1st of July this year. Why don’t you go and check it out?