Eleven-day-old daughter and sleep-deprived wife in tow, I attended the 2014 Charleston Conference, arguably flying in the face of reason. I had the advantage of a free place to stay: my parents-in-law live out on James Island, a 15-minute drive from the Francis Marion Hotel, where the conference is held. Given this fact and the conference’s unique focus on acquisitions, it makes sense for this meeting to become an annual excursion for me.
The opening speaker, Anthea Stratigos (apparently her real last name) from Outsell, Inc. talked about the importance of strategy, marketing, and branding the experience your library provides. She emphasized that in tough budgetary times it is all the more important to know your target users and to deliver the services, products, and environment they are looking for rather than mindlessly trying to keep up with the Joneses and do everything all at once. “Know your portfolio,” advised Ms. Stratigos. I would say that we at ZSR do a good job of this.
At “Metadata Challenges in Discovery Systems,” speakers from Ex Libris, SAGE, Queens University, and the University of Waterloo discussed the functionality gap that exists in library discovery systems. While tools like Summon have great potential and deliver generally good results, they are reliant on good metadata to function. In an environment in which records come from numerous sources, the task of normalizing data is a challenge for library, vendor, and system provider alike. Consistent and rational metadata practices, both across the industry and within a given library, are essential. To the extent that it is possible, a good discovery system ought to be able to smooth out issues with inconsistent/bad metadata; but the onus is largely on catalogers. I for one am glad that we are on top of authority control. I am also glad that at the time of implementation I was safely 800 miles away in Louisiana.
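To make the normalization problem a bit more concrete, here is a minimal, purely illustrative sketch (not from the session; the record layout, field names, and cleanup rules are my own assumptions) of the kind of tidying that has to happen when records describing the same title arrive from different sources with inconsistently formatted titles, ISSNs, and dates:

```python
import re

def normalize_record(record: dict) -> dict:
    """Illustrative cleanup of a minimal bibliographic record.

    Assumes records are simple dicts with 'title', 'issn', and 'year'
    fields, each formatted differently depending on the source.
    """
    cleaned = dict(record)

    # Collapse runs of whitespace and strip trailing punctuation from titles.
    cleaned["title"] = re.sub(r"\s+", " ", record.get("title", "")).strip(" ./")

    # Normalize ISSNs to the conventional NNNN-NNNN form.
    digits = re.sub(r"[^0-9Xx]", "", record.get("issn", ""))
    cleaned["issn"] = f"{digits[:4]}-{digits[4:]}".upper() if len(digits) == 8 else None

    # Pull a four-digit year out of whatever date string the source supplied.
    match = re.search(r"(1[89]\d{2}|20\d{2})", str(record.get("year", "")))
    cleaned["year"] = int(match.group(1)) if match else None

    return cleaned

# A fictional journal, described two different ways by two different sources,
# comes out looking the same after cleanup.
print(normalize_record({"title": "Serials  Librarianship  Quarterly.", "issn": "1234567x", "year": "c2014."}))
print(normalize_record({"title": "Serials Librarianship Quarterly", "issn": "1234-567X", "year": "2014"}))
```

Multiply that by dozens of record sources and millions of records, and the scale of the challenge the panel described becomes clear.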
In a highly entertaining staged debate over the premise that “Wherever possible, library collections should be shaped by patrons instead of librarians,” Rick Anderson from Utah and David Magier from Princeton contested the question of how large a role patron-driven or demand-driven acquisition (PDA/DDA) should play in collection development in an academic context. Arguing pro-DDA, Mr. Anderson claimed that we’ve confused the ends with the means in providing content: the selection process by librarians ought properly to be seen simply as a method for identifying needed content, and if another, more automated process (DDA) can accomplish the same purpose (and perhaps do it better), then it ought to be embraced. Arguing the other side, Mr. Magier emphasized DDA’s limitations, eloquently comparing over-reliance on it to eating mashed potatoes with a screwdriver just because a screwdriver is a useful tool. He pointed out that even in the absence of DDA, librarians have always worked closely and directly with patrons to meet their collection needs. In truth, both debaters would likely agree that a balance of DDA and traditional selection by librarians is the ideal model.
One interesting program discussed the inadequacy of downloads as a proxy for usage, given the amount of resource-sharing that occurs post-download. At another, librarians from UMass-Amherst and Simmons College presented results of their Kanopy streaming video DDA (PDA to them) program, similar to the one we’ll be rolling out later this month; they found that promotion to faculty was essential in generating views. On Saturday morning, librarians from Utah State talked about the importance of interlibrary loan as a supplement to acquisitions budgets and collection development policies in a regional consortium context. On this point, they try to include in all e-resource license agreements a clause specifying that ILL shall be allowed “utilizing the prevailing technology of the day” – an attempt to guarantee that they will remain able to loan their e-materials regardless of format or platform changes, or any other new technological developments.
Also on Saturday, Charlie Remy of UT-Chattanooga and Paul Moss from OCLC discussed adoption of OCLC’s Knowledge Base and Cooperative Management Initiative. This was of particular interest, as we in Resource Services plan on exploring use of the Knowledge Base early next year. Mr. Remy shared some of the positives and negatives he has experienced: chief among the former is the crowdsourcing of e-resource metadata maintenance in a cooperative environment; among the latter were slow updating of the knowledge base, especially with record sets from new vendors, along with the usual problem of bad vendor-provided metadata. The final session I attended was about link resolvers and the crucial role that delivery plays in our mission. As speakers pointed out, we’ve spent the past few years focusing on discovery, discovery, discovery. Now might be a good time to look again at how well the content our users find is being delivered.
3 Comments on ‘The Ellers Visit the In-Laws; Charleston 2014’
Awesome Post. I love Charleston. Count me in on the OCLC Knowledgebase discussions.
James
Thanks for the report and the great Charleston photos! Good conference in a wonderful city. I’d be interested in hearing more about downloads as indicators of usage in the age of resource-sharing. Any potential solutions proposed?
Susan – Carol Tenopir from UT-Knoxville and colleagues have a research grant from Elsevier to study secondary (post-download) usage. They are interested in Twitter linking to articles, the emailing of articles between colleagues (both full articles and citations), links in blogs, etc. Preliminary findings, unsurprisingly, indicate that article sharing is a big factor in actual usage – whether that sharing is 100% legal or not. Here’s a link to their slides: http://www.slideshare.net/slideshow/embed_code/41629759