OCLC RLP Assessment Interest Group — lessons learned and ingredients for success

In OCLC Research, we very much see ourselves as a learning organization. In that spirit, we are always trying new things, and in the process learning what works and what doesn’t. In the OCLC Research Library Partnership, we are learning together with those of you who work at institutions in the OCLC RLP.


In 2018 we launched an interest group on Library Assessment. The Assessment Interest Group built on a three-part WebJunction webinar series, Evaluating and Sharing Your Library’s Impact. It was delightful to have a partnership within OCLC Research that combined the OCLC RLP with WebJunction and leveraged the expertise of our colleague Lynn Silipigni Connaway, OCLC’s Director of Library Trends and User Research. On the OCLC RLP side, I worked with the group alongside my colleagues Rebecca Bryant and Titia van der Werf, and we were joined by WebJunction colleague Jennifer Peterson. Rebecca, Titia, Jennifer, and I are by no means assessment experts, so we really were co-learning with the OCLC RLP cohort.

We’ve written about the Assessment Interest Group along the way. As a learning organization, we also took some time to evaluate what worked and what did not go so well for us and our co-learners. We also interviewed two people who seemed to get a lot out of the course, Anne Simmons and Tessa Brawley-Barker, both from the (US) National Gallery of Art, to help us see what we could glean from successful learners.

First, a few negatives:

Be(a)ware of spam filters. To communicate with our learners, we set up a Google Group. We did this to give us the flexibility not only to send and share emails but also to potentially share documents. You can either directly add people to a Google Group or invite them to join, and we quickly learned that the “direct add” method works best. But alas! Invitations got routed to spam or were never received. Even with the direct add method, emails from the group were caught by hungry spam filters, which needed to be trained to accept emails from the group. We never wound up using the shared document functionality in Google Groups. The lesson here is that if you are setting up a new listserv of any type, make sure people are checking and correcting their spam filters! Otherwise, they will miss valuable communications and discussions.

Too much elapsed time? The WebJunction course spanned April to October, with the second of the three webinars in August. This left us with a lot of time between sessions one and two, and not a lot to discuss in the interim. Ideally, the sessions would have been more evenly spaced.

Not maximizing our resources. As I said previously, none of us helping to lead the interest group were assessment experts — but of course we had experts who were leading the WebJunction courses. We could have better leveraged the expertise of our presenters by including them in our discussion groups. This occurred to us rather late in the game but is something we would do if we had a second chance.

However, we had many positive lessons learned:

Alongside those takeaways, we also learned from our star students what worked. As more and more of us look to expand and extend our skills through online courses and cohorts, some of these reflections seem particularly apt.

Use prepared course materials. Anne and Tessa valued the Webinar Series Learner Guide and treated the guide as an assignment, which they filled out and shared before each call. Treating the guide as a homework assignment helped to keep them accountable.

Use group meetings as deadlines. Tessa and Anne valued having the interest group calls on their calendars and treated them as deadlines for working through their assignments (the learner guide).

Leverage adjacent opportunities. Tessa took a Library Juice Academy “User Experience Research and Design” course concurrently with part of the webinar / interest group. This was beneficial because ideas presented in that class overlapped and complemented what she learned in our group.

Learning is better together. Tessa and Anne work in the same small library but don’t normally work together. Participating in this learning opportunity together helped with an NGA goal of fostering more cross-functional collaboration. They reported that having a “learning partner” was a factor in their success (and as an added bonus, they now know each other better as colleagues).

Mixing it up. A strength of the OCLC RLP is the diversity of institution types and geographic regions that are represented. Anne and Tessa both valued the mix of people who attended the calls, and the range of expertise, from novices to more seasoned assessment professionals.

As always, it is invigorating to work with those at our OCLC RLP institutions and with colleagues on our extended OCLC Research team. We have been applying our lessons learned from the Assessment Interest Group to a Research Data Management interest group that took place from October through December, and sharing those conversations here as well.

If you have either pitfalls or success tips gleaned from leading or participating in an online learning or discussion group, I hope you will share them with us!