I attended Assessing Information Literacy Outcomes, Part 1 – Learning from Some Internal Experiences on Friday at the VALE Conference, hosted by Jacqui DaCosta of the VALE Shared Information Literacy Committee.
To recap, there were brief presentations from four institutions:
1) Camden County College
-Collaboration between the library and the biology department has existed since 2003 in the form of information literacy goals written into the curriculum for Biology 111. The curriculum outlines objectives that apply to a specific assignment related to scientific literature, and includes student outcomes.
2) New Jersey Institute of Technology (NJIT)
-Working with humanities/general education, librarians created a rubric to score research papers on factors such as citing sources, evidence of research, appropriateness, and integration. This program required readers to perform the assessment, but HUM101 has subsequently made information literacy 10% of the grade.
-Created a ‘home-grown’ rubric in the writing program. This involved a research notebook, and focused on the bibliography in the 12 steps of writing a paper. The goal was to translate students’ written work into numbers.
4) William Paterson University
-In a “Literacy, Technology & Instruction” class, students were assigned to interview a school librarian. Students used some prepared questions and then had to present their results. This not only taught them about librarians but also got them thinking about information more generally.
Having attended the second session as well (Assessing Information Literacy Outcomes, Part 2 – Learning from Some External Experiences), I find it interesting to see the different approaches to information literacy assessment. Home-grown assessment programs often seem to be extremely relevant to student information literacy, but they cannot provide a consistent ‘score’ to be used across institutions. Also, the library’s value to the institution in terms of successful information literacy instruction may not be brought to the forefront: it can be difficult to demonstrate the library’s quantitative benefits when it’s ultimately the faculty who are responsible for making information literacy part of their student assessment.
In some ways it seems more straightforward to have students take a test when they start a program, provide information literacy instruction for them, then have them take another test when they complete the program, and so be able to show an improvement (presumably!) in hard numeric terms. But such an abstract test of information literacy is difficult to create, and there is also the question of how to motivate students to take a test seriously if they receive no grade or credit for doing so.
From my perspective, assessments integrated into curricula are sufficient to at least meet ACRL’s Information Literacy Competency Standards, and collaborative tools such as the VALE Online Information Literacy Archive (VOILA) will be very useful for librarians to share ideas and best practices.
(Disclosure: I’m a Reference and Instruction Librarian at Camden County College, & so my thinking may be influenced by working there. Also I’m rather a new librarian, & so there may be some holes in my understanding of information literacy.)