North Carolina Independent Colleges and Universities is a consortium of private schools that lobbies the North Carolina legislature and provides professional development programming for various university units, including libraries. Besides the fact that you just can't get too much information on assessment, I was interested in this conference because I have just been added to the list of potential reviewers for SACS. In the small-world department, I started talking to the man next to me at lunch and learned that he grew up in the Detroit area and worked at Wayne State in the Center for Urban Studies at the same time I worked there. And he is also a big Red Wings fan (but who isn't at this time of year)!
Keynote, Steven Sheeley, Southern Association of Colleges and Schools (SACS)
Sheeley talked about how accountability in higher education is increasing, driven by a more vocal and demanding public. In this economic downturn, all institutions of higher education have been hit, but public institutions may have been hit the hardest. The book Turnaround (Johns Hopkins, 2009) is prescient in examining fragile institutions that may not survive additional financial stress. There will be a focus on efficiencies across the campus. The recession will affect enrollment in both positive and negative ways (community college enrollment is expected to go through the roof). Strategic decision making, informed by data and analysis, becomes even more important in times of financial stress.
Navigating the SACS Accreditation Process, Steven Sheeley
Standards and Policies carry equal weight as institutional responsibilities, but Guidelines (such as those on faculty qualifications) are informative, not normative. Some standards require a policy and require that the policy be followed. Institutions undergo decennial review, leading to reaffirmation of accreditation every ten years.
Track A (baccalaureate only) or Track B (master's and above)
Off-site Committee: compliance certification document review; each "cluster" reviews 3 or 4 institutions in a two-day meeting in Atlanta. The committee report goes to the institution and forms the basis of the On-site Committee report. Findings are either compliant or non-compliant.
On-site Committee: the focused report and QEP document are sent to the committee 6 weeks before the visit. The final report is narrative, and the institution has a chance to respond within 5 months. The Compliance and Reports (C&R) Committee reviews the on-site report, the institution's response, and the chair's evaluation of that response to make a decision.
Quality Enhancement Plan (QEP): should still be in the planning stage until approved as part of reaffirmation. The QEP should come out of assessment activities, NOT just brainstorming, and needs to focus on student learning outcomes. The QEP lead evaluator can come from outside the institution, even from outside the SACS region.
Common areas of off-site non-compliance: faculty competence (not sending in enough documentation), college-level competencies (Gen Ed), institutional effectiveness, administrative staff evaluations.
Common areas of on-site non-compliance: QEP, college-level competencies, faculty competencies
Common areas of C&R (in monitoring) non-compliance: institutional effectiveness, college-level competencies, QEP, library/learning resources, financial stability
Danger zones: institutional effectiveness
Some sound practices: think like the reviewer (get off your own campus); begin early; clear documentation is key; the burden of persuasion is on the institution; READ the standards carefully; weave assessment throughout; ask if you don't understand.
Fifth-Year Report: a mini compliance report on progress
Use what you've got and get what you need: Strengthening your library's assessment program, Yvonne Belanger and Diane Harvey, Perkins Library, Duke
I met Yvonne when we toured the Center for Instructional Technology at Duke a few months ago. She does assessment for CIT and is a resource for the libraries as well. They presented a very practical program on the basics of library assessment. My favorite quote was "Culture eats strategy for breakfast," credited to Ford Motor Company in 2006. That rang true, because I once heard an ARL consultant say that it takes 15 years (give or take) to change a culture. So that got me thinking about how I would describe the predominant culture at ZSR, and I think I'd say intensely personal service. But I digress…
The growth in assessment programs in libraries mirrors the growth in assessment in higher education. On many campuses, SACS accreditors say that the library does a better job at assessment than most campus units (and probably 1/3 of the attendees here today are librarians). Libraries singled out for excellence in assessment efforts include the University of Pennsylvania, the University of Washington, and the University of Virginia. A key in library assessment is demonstrating the impact on institutional goals. The most successful library assessment programs are those that are infused throughout the organization, rather than being the responsibility of one coordinator or one committee; hence the "culture of assessment" we hear about. A good rule of thumb, attributed to Susan Gibbons of the University of Rochester, is "don't guess, just ask." With the availability of easy web survey tools and built-in focus groups of student employees or Lib100 classes, this is good advice to follow. For example, when I see our virtual reference statistics declining, counter to all other prevailing trends, a good approach would be to gather focus groups of students and ask them what is going on.
Other nuggets that I picked up and will bring to various people when I get home:
- Bring together all assessment data in one place on the website so all can access and use it
- Look into LibStats, a free, open-source resource
- Build evaluative thinking by linking assessment to staff development
- Give data back: e.g., analyze instruction sections by academic department and report back to department chairs and liaisons (a minimal sketch follows this list)
- Our OCLC replication study given at ACRL was cited here as an example of how data tends to be local!
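As a concrete (and entirely hypothetical) illustration of that "give data back" nugget, here is a minimal Python sketch of what a per-department summary might look like. The file name and column names (department, course, attendance) are my inventions, not anything the presenters showed.

```python
# Hypothetical sketch of "giving data back": summarize library instruction
# sessions by academic department so liaisons can share counts with chairs.
# The CSV and its column names are invented for illustration.
import pandas as pd

sessions = pd.read_csv("instruction_sessions.csv")

summary = (
    sessions.groupby("department")
    .agg(sections=("course", "count"), students=("attendance", "sum"))
    .sort_values("sections", ascending=False)
)

# One small report per department, ready to send to the chair or liaison.
for dept, row in summary.iterrows():
    print(f"{dept}: {row.sections} sections, {row.students} students reached")
```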
Better Assessment: The I-E-O Model Revisited, Libby Joyce and Rob Springer, Elon
They used the National Survey of Student Engagement (NSSE, administered in the spring semester of freshman year) and the Beginning College Survey of Student Engagement (BCSSE, administered to entering freshmen before they get to campus). They did a study of 331 matched pairs using the I-E-O model as a framework: Input (student profile), Environment (engagement), Output (outcome).
Impressively, they performed an ANOVA (analysis of variance) with:
- Dependent variable: retention
- Fixed variables: NSSE cognitive variables
- Covariates: BCSSE cognitive variables
They found very strong statistical significance in their outcomes by looking at the complete picture of the student profile coming in (BCSSE), the environmental intervention, and then the outcome as self-reported in the NSSE survey.
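For the statistically curious, here is a minimal sketch of what an analysis along these lines might look like; since it includes covariates, it is technically an ANCOVA. This is my own reconstruction under made-up column names (retained, nsse_cog, bcsse_cog) and a made-up file of matched records, not Elon's actual code or data.

```python
# Hypothetical reconstruction of the analysis described above.
# Assumes a CSV of matched BCSSE/NSSE pairs with invented columns:
#   retained  - 1 if the student returned, 0 otherwise (outcome)
#   nsse_cog  - NSSE cognitive engagement score (environment)
#   bcsse_cog - BCSSE cognitive expectation score (input/covariate)
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

pairs = pd.read_csv("matched_pairs.csv")  # e.g., 331 matched records

# Model retention as a function of NSSE engagement while controlling
# for the incoming BCSSE profile -- effectively an ANCOVA.
model = ols("retained ~ nsse_cog + bcsse_cog", data=pairs).fit()

# Type II ANOVA table: does engagement explain variance in retention
# beyond what the incoming student profile already predicts?
print(sm.stats.anova_lm(model, typ=2))
```

A binary outcome like retention is more commonly modeled with logistic regression, but the ANOVA table above mirrors how the presenters framed the analysis.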
You always have to ask: how much data do you gather before it becomes a burden?
What does this mean and where do we go from here? Assessing an information literacy program, Jennifer Hanft and Susan McClintock, Meredith College
Assessment is hard to define, but it has elements of accountability, focus, outcomes alignment, measurement, and acknowledgement of professional knowledge. It doesn't have to be comprehensive, unchanging, intimidating, exceptional, self-sufficient, or expensive. You are already assessing your program if you are meeting regularly with instruction faculty to discuss best practices, conducting regular student evaluations, grading assignments, conducting pre-tests, or partnering with faculty on assignments.
ACRL Information Literacy Competency Standards for Higher Education: at Meredith, these inform a three-tiered program (English 111, English 200, IL thread) that is incremental and developmental. Where to go from here: continue as part of the Gen Ed program (it survived revision), extend to graduate programs, and continue to assess.
How they tackle assessment in Information Literacy:
- Identify a skill
- Find the applicable ACRL standard
- Identify appropriate level(s) of program
- Align with program’s defined outcome
- Decide how best to measure