Something New For Something Old (part 1)

December 5th, 2008 by Merrilee

I’m attending the “Something New For Something Old” conference (put together by the Philadelphia Consortium of Special Collections Libraries, or PACSCL). The event is being held to celebrate the end of the group’s big backlog survey project, but the conference is about more than backlogs; it’s a look at archival practice and “where next.” I have a ton of notes and am left with lots of ideas, but I should probably start by getting the notes out and follow with commentary.

The first set of presentations centered on collection management, with talks by Christine Di Bella, Mark Greene, and Rob Cox.

The Di Bella presentation was on the PACSCL backlog survey project. The project (now nearly complete) was conceived to help the group respond collectively to opportunities, to find connections between collections, and to help prioritize work (collectively and individually). Much of what we think we know about our collections is impressionistic; a collection survey helps to quantify it. The survey team, which was shared by all the institutions that participated, helped to create efficiencies: everyone in the group was working with the same goals and standards. The project was perhaps artificially skewed toward the backlog, and having this as a project made the survey activity seem different from “regular work,” so it runs the risk of not being carried on beyond the project. There is lots of documentation, and the group hopes there will be a way to integrate the survey into tools such as the Archivists’ Toolkit. [Christine has been a member of our survey tools project, and I’m glad to have her advice and expertise.]

Mark Greene talked about the importance of collection policy for reappraisal and deaccessioning, and how his own institution (the American Heritage Center) has developed coherent collection policies that document a planned, rational approach not only to new acquisitions but also to reappraisal of existing collections. Not many archives have a publicly available collection policy. The AHC has found its collection policies invaluable in setting priorities, revisiting previous decisions, and dealing with donors (especially when rejecting offers, or even when returning collections). Mark said that deaccessioning is either controversial or invisible in archives, and this shouldn’t be so.

Rob Cox rounded out the panel by talking about UMarmot, the “catablog” at the University of Massachusetts, Amherst. This interesting approach to “layered cataloging” resulted from an entrepreneurial staff coupled with management that was willing to allow experimentation. The result is built on WordPress (but could be any blogging software), which allows textual descriptions of collections to be put up with links to finding aids, images, other collections, related resources, and so on. The catablog is visible to Google and allows for the incorporation of Web 2.0 features such as comments (which Rob admits are rarely valuable). They have changed their thinking about collection description: a two-paragraph description goes up immediately, followed at some later point by a finding aid (see the sketch below). Some quotables: “Description is a long, drawn out process.” “Rich textual description gives researchers Velcro to stick to.” And, because trends and vocabularies change, “Any finding aid is temporary and conditional” and probably needs to be revisited every 10 years or so.
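To make the layered idea concrete, here is a minimal sketch in Python (my own illustration, not UMarmot’s implementation; the entry fields and the example collection are hypothetical): an entry is published with nothing but a short description, and the richer layers are attached as they materialize.

    # Illustrative model of layered description (not UMarmot's code).
    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class CatablogEntry:
        title: str
        description: str                       # the two-paragraph description, posted immediately
        finding_aid_url: Optional[str] = None  # attached at some later point
        related: List[str] = field(default_factory=list)  # images, related collections, etc.

    # Day one: the entry goes up with just a description.
    entry = CatablogEntry(
        title="Example Family Papers",
        description="Correspondence, diaries, and photographs of ...",
    )

    # Months later: the finding aid and related resources are layered on.
    entry.finding_aid_url = "https://library.example.edu/findingaids/example-family"
    entry.related.append("https://library.example.edu/images/example-family")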

My own panel was on cataloging and description (my presentation is available on SlideShare).

Valerie Hotchkiss gave a presentation about the University of Illinois at Urbana-Champaign’s Quick and Clean rare book cataloging project. When Valerie came into her position, there were 70,000 rare books in the backlog, materials purchased from the 1920s through the 1970s: American and European imprints from the 16th through 18th centuries. They are coming to the end of a Mellon-funded project that has made a significant dent in this backlog, cataloging materials onsite for around $12 a book (as compared to figures ranging from $15 to $53 per book). I was gratified to hear that they were able to find 83% of their books in WorldCat, and that they have added records to WorldCat for the books that lacked copy (they also added 4,000 new holdings to the ESTC and enhanced existing WorldCat records). They have averaged 1,200 books a month, and apart from using grad students (some from the library school, some from elsewhere on campus), the real cost-cutting measure has been not classifying the newly cataloged materials. The impact of the project was seen almost immediately, as people began to request the newly available books (they file them using an alphanumeric code). As funds for the first grant run out, they are looking for ways to keep their well-oiled machine running in order to finish the task. Valerie also briefly described Project Unica: when a book appears to be unique, they digitize it and make it available right away.
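Some back-of-the-envelope math of my own (not figures from the talk): at 1,200 books a month, a 70,000-volume backlog is roughly 58 months of steady work, and at $12 a book it comes to about $840,000; at the comparison rates of $15 to $53 a book, the same job would run from roughly $1.05 million to $3.7 million. It’s easy to see why the cheaper workflow matters at this scale.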

Finally, Dan Santamaria talked about efforts within the University Archives at Princeton to create at least a minimal record for every collection. It took two staff members three months of focus and attention to get this done, conducting a very basic survey of collections. Once they had the data, they created MARC records, then used MarcEdit to export EAD (a sketch of that kind of crosswalk follows below). As with the Quick and Clean project, once basic access was created, usage went up immediately. One problem that results from this process is that there are now twice as many files to keep up to date. Dan also spoke briefly about a versioning system that Princeton has developed to keep track of edits to EAD files (I hope someone else took good notes on this, because I don’t have anything!).
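For the curious, here is a minimal sketch of the kind of MARC-to-EAD crosswalk involved, not Princeton’s actual workflow (they used MarcEdit): collection-level data of the sort a basic survey might capture (the field names and sample values are my own, hypothetical) mapped onto a skeletal EAD record using nothing but Python’s standard library.

    # A skeletal EAD record from collection-level survey data (illustrative only).
    import xml.etree.ElementTree as ET

    def minimal_ead(collection):
        ead = ET.Element("ead")
        header = ET.SubElement(ead, "eadheader")
        ET.SubElement(header, "eadid").text = collection["id"]
        archdesc = ET.SubElement(ead, "archdesc", level="collection")
        did = ET.SubElement(archdesc, "did")
        ET.SubElement(did, "unittitle").text = collection["title"]
        ET.SubElement(did, "unitdate").text = collection["dates"]
        physdesc = ET.SubElement(did, "physdesc")
        ET.SubElement(physdesc, "extent").text = collection["extent"]
        ET.SubElement(did, "abstract").text = collection["abstract"]
        return ET.tostring(ead, encoding="unicode")

    print(minimal_ead({
        "id": "AC999",  # hypothetical collection identifier
        "title": "Office of the Registrar Records",
        "dates": "1900-1950",
        "extent": "12 linear feet",
        "abstract": "Subject files and correspondence.",
    }))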

That’s enough to get you started. I’ll be back tomorrow with the second part of the first day.
