Archive for March, 2010

All futured out: UK public funding and risks to libraries

Tuesday, March 30th, 2010 by John

The future, it seems, has never been as popular as it is at the present time. We talk, think and write about it endlessly. The transformations in the world we live in over the past few decades have induced so much uncertainty that we look to the future because we crave a place where certainty and sureness return. As librarians, curators and archivists, of course, it is a professional duty to keep looking to the future in order to plan ahead, to prioritise, to make maximum impact from available resources and to prove that we manage well. But the current preoccupation with prediction goes much further. It seems likely that we are living through the most future-obsessed era our profession has ever experienced.

My first awareness that librarianship was a profession deeply concerned about its future came with the publication of James Thompson’s The end of libraries in 1982, which was still a relatively recent work when I first went to library school. Thompson, University Librarian at the University of Reading, was interested in library technology and its potential to liberate libraries from what he saw as a paralysed state of continual growth unrelated to use. In an article of the same title as his book, published the following year in the then-new journal The electronic library, he wrote:

One way to by-pass problems would of course be to store in the electronic memory not just the surrogate references, but the full text of the documents.

He didn’t imagine Google, but he did perhaps foresee the changes which are now underway, though he would doubtless have been surprised that they would take 30 years to occur. If the changes have been slow, the pace of future-gazing has intensified over these 30 years, and seems to be currently experiencing rocket thrust. On a recent visit to the National Library of Scotland, I was given a copy of its new discussion document Thriving or surviving? National Library of Scotland in 2030. The National Library of Wales has been less daring by ten years, producing Twenty-twenty: a long view of the National Library of Wales. Both institutions are taking on the challenge of providing national library services within a new sector – what the Scottish report calls small, smart countries. Read the rest of this entry »

Risky business: research libraries as enterprise

Thursday, March 25th, 2010 by Merrilee

In the summer of 2008, OCLC Research undertook a risk assessment exercise for research libraries in order to help us frame and shape our work. The results were shared internally, and the findings have been enormously useful as a backdrop for work planning and prioritization. I’ve long wished for an external version of the findings, so that I could wave at something during discussions with external colleagues. After months of me sighing about this, Jim, Constance and Arnold have bravely boiled down the most useful findings into a succinct report, Research Libraries, Risk and Systemic Change [pdf]. But they didn’t do it for me, they did it for you.

The analysis takes a traditional approach, but looks at a group rather than individual institutions, which makes it useful as a platform for group action. Major risk factors for the research library “enterprise” include the erosion of perceived value, the current and future workforce, collections and spaces, legacy technology, and lack of control in scholarly communication. The report goes beyond articulating challenges and suggests strategies for mitigating risks. These include developing shared infrastructure, restructuring workflows, and devising new services. There’s nothing new in these suggestions; what is new is that the strategies are tied to cooling specific hot spots in the heat map, taking us from major or catastrophic impact to more moderate consequences.

This 20-page document is worth your attention, and I urge you to read it. I think this paper will help crystallize what we need to face in order to get to our future.

Pick of the week: ATF 19 March 2010

Monday, March 22nd, 2010 by Jim

ATF banner

Reading in a Digital Age (External site)

The American Scholar  •  Spring 2010

Food for thought. In an era of shortened attention spans, reading fiction requires deep focus, says author Sven Birkerts. Being a good reader involves two levels: suspension of reality to “live” the narrative premise, accompanied by the “resonance” created through the author’s carefully crafted use of language. “The two levels operate on a lag, with the resonance accumulating behind the sense, building a linguistic density that is the verbal equivalent of an aftertaste, or the ‘finish.’ The reader who reads without directed concentration, who skims, or even just steps hurriedly across the surface, is missing much of the real point of the work; he is gobbling his foie gras.”

A long piece that resonated with my own perceptions of the changes in my reading patterns and habits; in many cases Birkerts helped me name a phenomenon I acknowledged but couldn’t properly characterize. “But more and more comes the complaint, even from practiced readers, that it is hard to maintain attentive focus. The works have presumably not changed. What has changed is either the conditions of reading or something in the cognitive reflexes of the reader. Or both.” I think both. I have felt about Shirley Hazzard’s The Transit of Venus exactly as Birkerts describes in this essay. That despite having three copies always at hand to press on those who have never read it.
(Michalko)

And related to this essay, I very much recommend this review essay on a similar theme: a very thoughtful reflection by a professional reader on how reading may be changing.

Texts Without Context
By Michiko Kakutani  •  March 21, 2010
How the Internet and mash-up culture change everything we know about reading.

See the rest of this ATF issue here.
Subscribe to ATF here.
Subscribe to the RSS feed of ATF here.

Back issues are here.

A case study of supply-driven product development

Monday, March 22nd, 2010 by Jim

There’s a very clear-headed reflection on the development of the Archivists’ Toolkit in the most recent Code4Lib Journal. Challenges in Sustainable Open Source: A Case Study was written by Sibyl Schaefer, who worked on the project. She offers the kind of brave, objective reflection on the product’s development that isn’t often done in our domain.

It reminded me of Lorcan’s recent post on the Ithaka report Sustainability and revenue models for online academic resources [PDF], in which he quotes the report:

“The absence of focused effort on use, impact, and competition among these types of projects has deep implications for their potential long-term success.”

Lorcan goes on to characterize a further sustainability issue:

“much project work is supply-driven rather than demand-driven. Project leaders, they suggest, tend to focus on the inherent values of their work rather than on what might be of most importance to their intended users.”

For a comparably brave and objective reflection from a former product manager and current colleague see Ricky Erway’s post on Desperately Seeking Sustainability.

Analyzing MARC tags and projecting MARC’s future

Saturday, March 20th, 2010 by Karen

The RLG Partners working group that has spent the past two years gathering and analyzing evidence about MARC tag usage to inform library metadata practices has completed its work. The 72-page report, Implications of MARC Tag Usage on Library Metadata Practices, was published on March 12, with links to thirteen detailed data tables for those who love to immerse themselves in statistics. They’re spreadsheets, so you can also filter and sort the data as you like.

The working group’s studies focused on machine applications, an important user category that has generally been ignored in user studies. MARC data is used for machine matching and manipulation, linking, harvesting, collection analysis, ranking, and providing systematic views of publications. If we envision a future of linked data, in which all the work information professionals have invested in creating and maintaining legacy MARC data is available to the rest of the information universe, machine applications will become increasingly important. Future encoding schemas will need a robust MARC crosswalk to ingest our millions of legacy records.

We believe that MARC data cannot continue to exist in its own discrete environment. It will need to be leveraged and used in other domains to reach users in their own networked environments. With the increase of digitized full text from various mass digitization efforts, we advise MARC practitioners to focus on what keyword searching of full text cannot provide (authorized names, classifications, identifiers, and controlled vocabularies) rather than on “descriptive metadata”.

The working group held a Webinar on March 18, 2010 to discuss its findings and projections for MARC’s future with those interested. I was grateful that Catherine Argus at the National Library of Australia was willing to get up extra early to present her work, at 7:00 am her local time, so that RLG Partner staff on the east coast of the US could join the discussion at 4:00 pm EDT. A couple of Catherine’s colleagues at the NLA also listened in. Lisa Rowlison de Ortiz (University of California, Berkeley) also joined the discussion; she collaborated on the executive summary, which pulled together all our work and presented the working group’s views on MARC’s future summarized above. The recording of that Webinar will be available on the OCLC Research Webinars page soon.

The working group members each selected a topic to research, and then wrote a report summarizing the findings, which we presented during the Webinar:

Read the rest of this entry »

Pick of the week: ATF 12 March 2010

Thursday, March 18th, 2010 by Jim

ATF banner

Thanks to my colleagues in Research for their informed commentary. Here’s the pick of the week.

Ebooks What Ebooks? (External site)

The Auricle  •  February 21, 2010

When is an ebook not an ebook? When it’s not downloaded to a portable e-reader device, says author Derek Morrison, who notes that some recent ebook usage studies have based their conclusions on ebooks that are accessible only on a laptop or PC screen for a limited time period. Read on for a discussion of tethered vs. untethered models and what this may mean as we ease into the e-reader era. [See also Highlighting E-Readers for a discussion on using the Kindle for academic research.]

A thoughtful article on tethered (i.e., each page accessed separately) vs. untethered (whole books downloaded at once) models, with an aside comparing monochrome E-ink screens with full-color devices. Good arguments about the dangers of depending on content that demands a continuous connection. I find iPhone-sized screens a bit small for reading, but Morrison prefers them to larger-format screens, citing their added convenience and the comfortably small amount of text to absorb at one time.
(Hickey)

See the rest of this ATF issue here.
Subscribe to ATF here.
Subscribe to the RSS feed of ATF here.

Back issues are here.

Pick of the week: ATF 9 March 2010

Friday, March 12th, 2010 by Merrilee

Basking in the afterglow of Undue Diligence, I’ve been at leisure to catch up on blog postings and email, which both include the always-informative Above the Fold. Here’s my pick for this week.

Publishing: The Revolutionary Future (full article here)

The New York Review of Books  •  March 11, 2010

Crisis at the crossroads. Veteran publisher Jason Epstein offers a wide-ranging discussion of the pros and cons of digitization that draws on his extensive experience in both the hard copy and digital publishing businesses. One tidbit: “That the contents of the world’s great libraries will eventually be accessed practically anywhere at the click of a mouse is not an unmixed blessing. Another click might obliterate these same contents and bring civilization to an end: an overwhelming argument, if one is needed, for physical books in the digital age.”

I have enormous admiration for Mr. Epstein and think his essays are always worth attention. In this essay he takes on not only the tumult and change in publishing but also worries sensibly about what we call “digital preservation.” In that regard I commend to you the recently released final report of the NSF Blue Ribbon Task Force on Sustainable Digital Preservation and Access, titled “Sustainable Economics for a Digital Planet: Ensuring Long-Term Access to Digital Information.” My colleague, Brian Lavoie, was co-chair of the panel. They’ve delivered a much-needed and incredibly useful report that may unify our expectations and our vocabulary in managing this important responsibility. (Michalko)

This issue  •  Subscribe via email or RSS  •  Back issues

Focus and reframe: rights and unpublished materials

Wednesday, March 10th, 2010 by Merrilee

I’m using this blog posting to wrap together a bunch of ideas I’ll be presenting at a meeting tomorrow, Undue Diligence: Seeking Low-risk Strategies for Making Collections of Unpublished Materials More Accessible.

Mark Greene and Dennis Meissner helped to reframe processing modern archival collections in More Product, Less Process. Similarly, Shifting Gears helped to recast digitization from special collections. The purpose of Undue Diligence is to help professionals to look anew at rights issues around unpublished materials, specifically with regard to digitization of those materials, particularly 20th and 21st century collections.

The RLG Partnership exists to identify shared problem spaces, and to reduce pain and effort in those areas. With increasing expectations that our holdings will be made digitally accessible, assessing rights (copyright, along with privacy rights and potentially sensitive materials) within archival collections is one of those points of pain. The prospect of analyzing items within archival collections is so painful, in fact, that many institutions avoid digitizing collections that were created in the last 70 to 100 years. While this is a very safe practice, it does little to advance broad and democratic access to the collections in our care.

The RLG Partnership likewise dodged the copyright bullet in 2007 when we held our forum, Digitization Matters (from which Shifting Gears was born). We ruled copyright out of scope. While reframing the conversation around digitization — from preservation to access, from quality to quantity — did help move the conversation on digitization forward, it did little for those institutions that have major collections relating to … the Great Depression, World Wars I and II, the Korean, Vietnam, and Gulf wars, the civil rights movement, the free speech movement… the list goes on and on. This is a small slice of the topics that are studied by researchers, taught in classrooms, and of interest to citizens everywhere.

In 2008, we published a short paper, Copyright Investigation Summary Report, which looked at then-current practices around copyright for both published and unpublished materials. We learned that most copyright investigations related to permissions and almost never to digitization. The work was high effort and low return. “We say no a lot,” said one interviewee. Having conducted the interviews, I was pretty depressed by what I heard: a tale of professionals paralyzed by potential risks, and of collections shackled.

One of the proposed outcomes of the paper was to “…further explore community practice and issues around unpublished materials held in special collections and archives.” We did so by sponsoring the meeting that led to the SAA Orphan Works Statement of Best Practices, which was published in 2009. This document provides good guidance for institutions conducting a “reasonable search,” but does not frame rights assessment within a risk management strategy.

The risk of perceived harm in digitizing a collection is quite variable, depending on factors like content, purpose of creation, and date of creation. We believe that, in addition to standards for conducting a reasonable search, the community needs to reframe the issues of rights and risk collectively, and to embrace rights assessment as archivists: at a collection or series level, not at an item level.

We are holding this event, with a star-studded cast of presenters, to help set the stage for an important conversation: the development of what we are calling a set of “well-intentioned practices.” We hope that this will have two effects. The first is that archivists will not need to reinvent the wheel, and can draw from community practices to identify lower-risk collections of high research interest. The second is that institutions will digitize collections more freely. Even if institutions digitize two out of ten collections, as opposed to one out of ten, access to collections will double!

We will follow up with subsequent blog postings both to report on the content of Undue Diligence and also to report on outcomes.

Many thanks to the advisory group who both helped to shape this event and our program of work in this area.

If you wish to follow the event on Twitter, follow #UndueD. I’ve also set up a Twapper Keeper for the event.

Pick of the week: ATF 2 March 2010

Saturday, March 6th, 2010 by Jim

ATF banner

Some of you may already be subscribers to Above The Fold (ATF), our weekly current awareness compilation and commentary. We just sent out the seventieth issue. Our objective in assembling the newsletter was to offer an information professional’s view of issues from outside our domain that were worth your consideration and related to library, archive and museum challenges. We selected items of interest likely to be beyond your normal reading sphere, to help folks look farther more often with less work. The selection and the commentary on the chosen articles would, we hoped, encourage some lateral thinking in our domain.

That issue marked our seventieth week, and ATF now has nearly 3,100 subscribers. We’ve decided to feature a chosen article each week here in hangingtogether. I’ve chosen this article to feature not because it’s outside our domain but because it shines such a light on the obstacles to change in the research library arena.

E-Library Economics (full article here)

Inside Higher Ed  •  February 10, 2010

The hard truth about hard copy. Recent studies suggest it might take up to 50 years, or two generations, before faculty in some disciplines will accept the predominance of digital resources over hard copy. But the economics may help to persuade them: estimates peg the cost of keeping a book on a shelf at a little over $4 a year, versus about 15 cents for a digital version.

This is the most disheartening saga. I feel bad for my colleague Suzanne Thorin, the university librarian at Syracuse, who is being vilified for acknowledging that the research library in the contemporary academy cannot contribute to the central academic mission without dramatic changes to its traditional processes and services. Managing the local book collection as part of a broad national pattern of provision, particularly alongside the emerging digital aggregations of text, could give readers and researchers more and better than any local print inventory. I’m looking forward to seeing the report mentioned in the article, authored by another colleague, Paul Courant of the University of Michigan, but will have to wait until sometime in April. The sooner it’s available the better, since cost evidence is largely absent from these discussions. Read the comments to fully appreciate the bile that this topic can attract. (Michalko)

See the rest of this ATF issue here.
Subscribe to ATF here.
Subscribe to the RSS feed of ATF here.

Back issues are here.

Over, Under, Around and Through

Wednesday, March 3rd, 2010 by Merrilee

Our paper on obstacles that archivists experience with adopting Encoded Archival Description (and how to get around them) is out!

Over Under Around and Through: Getting Around Barriers to EAD Implementation [pdf].

We are holding a webinar for the RLG Partnership tomorrow and I’ll share the link of the recorded session later.

The paper covers both “social” and “technical” barriers to implementation, and also gives suggestions for how to get around them. This is not a “how to” manual and it is not meant to be read all the way through (although I’m not going to stop you if you want to do that!). The paper is a collection of tips and tricks, and is as much about attitude adjustment as anything else.

Some high level thoughts:

  • EAD is 12 years old, but still has not reached the point of industrialization. There are others laboring in the same fields that you are and this paper is chock full of links to existing tools. So many that you should not need to invent your own! Use what’s out there rather than reinventing the wheel (or the stylesheet).
  • The paper makes much of consortia, and indeed, these organizations play a vital role in the creation and dissemination of EAD encoded finding aids. Many of these organizations are at risk, or could be at risk. We all are stakeholders in their continued existence.
  • I was surprised that I couldn’t find any high level talking points to “sell” EAD. We came up with some. Use them.
  • There are many barriers that can be bridged, but the standard is complicated and should be rethought; fortunately, there’s a call for the EAD Working Group to do just that.

Many thanks (and congratulations!) go to my co-authors: Michele Combs, Mark Matienzo, and Lisa Spiro. We look forward to your comments.