At the VALE Conference on Friday, December 9, 2009, I attended "Assessing Information Literacy Outcomes, Part 2: Learning from Some External Experiences," facilitated by Jacqui DaCosta, representing the VALE Shared Information Literacy Committee.
Unlike Part 1, Part 2 focused on external/commercial information literacy assessment tools and their effectiveness in three different college/university environments with different approaches.
In this session three colleges presented.
1) Berkeley College: reported on its use of SAILS (Standardized Assessment of Information Literacy Skills, www.projectsails.org/). The test consists of 45 multiple-choice questions that address four of the five ACRL Information Literacy Standards, excluding Standard 4. Certain items need to be considered in administering this tool: the physical space in computer labs, the ability to track students internally, and presentation to an Institutional Review Board if necessary. Pre- and post-testing was part of this effort: librarian liaisons went into classes to present the test; enrollment numbers were important; students were tested before receiving any information literacy instruction; and program-of-study demographics were included in setting up the test administration. Librarians proctored the tests.
Discoveries: The number of like institutions using the test is important for comparative results. The number of students enrolled in the program to be tested is also critical: a large cohort must be tested, especially if the college is planning pre- and post-tests. Faculty cooperation is critical as well. The test can demonstrate that each cohort is markedly different in strengths and weaknesses; such results permit instructional efforts to be targeted accordingly. And, significantly, the way information about the test is presented is critically important.
2) Mercer County Community College: tried out TRAILS (Tool for Real-Time Assessment of Information Literacy Skills, http://www.trails-9.org/). TRAILS comes as a complete package, is free of charge, and focuses on five areas. The content is appropriate for first-year students and was originally written with high school students in mind. Mercer found administering the test manageable and, with the cooperation of its English Department, ran it in six English 101 and 102 classes. Giving the test during class time ensured almost 100% participation. TRAILS reports individual scores and scores by category. The results helped show the English Department the impact of information literacy sessions, and further aided both the English Department and the library instructors in targeting information literacy instruction at the students' weaker areas.
Discoveries: TRAILS is available only online; there is no paper equivalent. It is clearly written for high schools, not for colleges. The wording of the questions and the scenarios include high-school-specific references, e.g., "your principal," throughout. In addition, the instrument's questions had no flexibility, and the topics were uneven. However, Mercer plans to re-use TRAILS, revising the test to reflect college-level references and material where appropriate; both an online and a paper version will be available. The revisions have been undertaken with the approval of TRAILS under two conditions: a) the resulting version must be kept closed to the specific college and password protected, and b) TRAILS must be credited with the development of the initial instrument. A re-test is scheduled for Spring '09.
3) New Jersey Institute of Technology (NJIT): used iSkills from ETS. iSkills is a performance-based assessment instrument that "uses scenario-based tasks to measure both cognitive and technical skills and is intended to guide institutional information/communication/technology literacy initiatives, guide curricula innovations, measure progress standings, and assess individual student proficiency. ETS says that educational institutions can receive test scores in a form that allows them to compare their students' performance with those of students from similar institutions" (NJIT website). Within the NJIT community, "librarians took the leadership role in developing an assessment plan that would make use of both the real-time, scenario-based tasks offered by the ETS iSkills™ assessment and the portfolio system of writing assessment used by the Department of Humanities" (NJIT, http://library.njit.edu/researchhelpdesk/infolit/assessment.php).
The iSkills tool assesses tasks uniformly and measures information literacy skills so that a school can establish its own benchmarks or norms. Again, the difficulty is assessing ACRL Standard 4, which spotlights critical thinking. Another consideration for NJIT (and for all of us) is how seriously students take information literacy programs and the iSkills test. At NJIT the score is put on the transcript. NJIT tested 400 students: 200 lower-level and 200 upper-level.
NJIT also mapped the iSkills items against the ACRL standards and NJIT's own concerns. The information can be accessed at the website above. iSkills has two versions of the assessment: Core, appropriate for students transitioning into four-year college programs or completing their freshman or sophomore undergraduate studies, and Advanced, appropriate for students transitioning to upper-level coursework or the workplace. The test is expensive ($20 per test) and time-consuming (75 minutes).
Discoveries: Many of these have been folded into the review above, but NJIT is clearly concerned with a larger scope than the iSkills test covers. At the iSkills website, and in viewing the instrument myself, the emphasis seems to be more on technical skills than on cognitive ones. NJIT has therefore added a portfolio component to its assessment activities. Their "method focuses on the examination of evidence of information literacy in student work product. . . . research papers were selected from the writing portfolios of students taking first-year composition or senior capstone seminars in the Humanities" (NJIT website). The portfolio system was developed in collaboration with the Department of Humanities, so both the iSkills and the portfolio assessments are used.
What I noticed about these sessions is that the most important component of information literacy is the most difficult to assess. No multiple-choice or standardized tool seems to get at critical thinking, evaluation of information, or the more advanced forms of using and integrating knowledge once students have demonstrated the technical skill to find the information they seek. Even in seeking that information, critical thinking is a necessity.
The session enlightened and informed me about the various commercial tools that are "out there," and I'm sure there are more available. Since I also attended Part 1, I found some of the home-grown instruments more capable of assessing the (in my view) more significant components of information literacy. I also think that "information literacy," while now a popular term, is an unfortunate one, since it focuses attention on information rather than on the complexities of seeking, creating, and synthesizing knowledge, which is what information literacy is truly all about. The tools these schools presented seem well geared to discovering the skills entering college students have, and they provide a means for establishing benchmarks and basic standards, but, as all the presenters pointed out, they do not go far enough.
All the presenters highlighted these difficulties and their approaches to them, whether adjusting the instrument or adding assessment components, in order to capture the complexities of the knowledge culture and how well our students are able to function within it. Another issue that became clear to me while listening to these presenters is the nature of our institutions of higher education and their various missions: while some comparisons are possible, we would still have to point out differences and distinctions. Information literacy, in all its complexity, is of primary importance to our students and to our populations in general, but standardized instruments cannot be the only assessment answer. We must assess qualitatively as well.