Further to the previous post, I have now published the spreadsheet toolkits on Google Docs.
Although formally completed, the AIDA project is now contributing to the Integrated data management planning toolkit & support (IDMP) project, funded by the JISC. Since January 2011 I have been offering support to various projects in the 07/09 Research Data Management Infrastructure strand, and adapting the AIDA methodology into a benchmarking tool applicable to research data.
The toolkit below is the “Adapted AIDA” developed for this purpose.
I was recently invited to present a module on AIDA at Steve Hitchcock’s training programme. This was part of his series ‘Digital Preservation Tools for Repository Managers’ within the KeepIt project at Southampton. Steve has blogged about it here, with a Slideshare link to my slides; my impressions of the half-day module have also been added recently.
AIDA featured in a one-day workshop delivered by the Digital Curation Centre on 15 July 2009 in Leeds City Centre. The workshop was open to researchers planning to submit bids to the JISC 07/09 call: Data Management Infrastructure. It provided an introduction to the digital curation lifecycle and an overview of the specific JISC-funded tools recommended for use by funded projects.
Ed Pinsent (AIDA project manager) and Sarah Jones (Data Audit Framework) gave a joint presentation, called Requirements gathering exercise using the Data Audit Framework and Assessing Institutional Digital Assets (AIDA) toolkits. A copy of the slideshow is here.
The latest version of the AIDA Toolkit was released in May 2009. The new version and its accompanying blank scorecard can be downloaded here.
The new Toolkit has the same three-leg and five-stage structure as before, but features a new "two-tiered" approach, reflecting an Institutional/Departmental method of self-assessment. Those completing the assessment should enter two scores for each element. The first score records how the entire Institution is doing. The second score records how you personally are doing within your chosen department, or with regard to the asset collection, or project, that is being assessed.
We recognise there may be variances between Institutional and Departmental practices, and that is precisely what we want to assess. A Department might be doing very well with technological resourcing and score a 3, while the Institution as a whole may be performing so badly in this area that it scores only a 1. Many early users reported that "it is difficult to fill in for the University as a whole as there is so much variation", so the AIDA project has added the second tier to the toolkit in an attempt to accommodate this variance.
Bearing this in mind, there may even be some synergy between the two levels, such that good (or bad) practice with regard to resource distribution at Institutional level has a top-down effect on your own Department. Conversely, your own Department may be forging ahead with technological development at a rate that leaves other parts of the Institution behind. We realise these relationships are very complex in real life, but they can be expressed to some extent within the confines of this assessment exercise.
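To make the two-tier idea concrete, here is a minimal sketch of how the Institutional and Departmental scores for each element might be compared to surface exactly this kind of variance. This is purely illustrative: the actual AIDA Toolkit is spreadsheet-based, and the element names and scores below are hypothetical, not taken from the toolkit itself.

```python
# Hypothetical two-tier scores (1-5 stages) for a couple of elements.
# The real AIDA scorecard is a spreadsheet; this only illustrates the idea.
elements = {
    "Technological resourcing": {"institution": 1, "department": 3},
    "Policy and strategy":      {"institution": 2, "department": 2},
}

def variance_report(scores):
    """List the elements where Departmental practice diverges from the Institution."""
    report = []
    for name, tiers in scores.items():
        gap = tiers["department"] - tiers["institution"]
        if gap != 0:
            direction = "ahead of" if gap > 0 else "behind"
            report.append(
                f"{name}: department is {direction} the institution by {abs(gap)} stage(s)"
            )
    return report

for line in variance_report(elements):
    print(line)
```

Run against the sample scores above, this flags "Technological resourcing" (the department is two stages ahead) and stays silent on "Policy and strategy", where the two tiers agree.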
Added a page for the AIDA weighted scorecard. A working prototype of this was completed in January 2009.