Evaluation of Information Literacy Education: Measuring Effectiveness and Continual Improvement
Measuring the effectiveness of information literacy education is crucial for assessing student needs and enhancing learning outcomes. This evaluation incorporates a variety of methods, including Kirkpatrick's Four-Level Model, to gauge immediate reactions, changes in knowledge and skills, long-term behavioural shifts, and the return on investment. The process involves continual research to align educational goals with evidence-based learning approaches.
Presentation Transcript
Evaluation of Information Literacy Education. Mgr. Gabriela Šimková (Faculty of Arts, Masaryk University), Mgr. Jiří Kratochvíl, Ph.D. (The MU Campus Library)
Measuring effectiveness: main reasons. "E-learning can be a powerful tool: it is scalable and less expensive than traditional training (...). But after spending a lot of money on infrastructure and content, how do you know if your e-learning content or program is really effective?" Learning activities focus on students' needs: knowledge, skills and attitudes. The Evidence-Based Learning Approach (continual research as a precondition for more effective achievement of the educational goals defined within information literacy (IL) education) is a key project activity.
Measuring as a continuing process: educational needs analysis → measuring methodology design → measuring before the activity → learning activity → measuring after the activity.
Kirkpatrick's Four-Level Model. Our aim: to strengthen students' satisfaction and learning results. The first level evaluates students' immediate reactions to an educational activity (environment, content and the lecturer) - short paper questionnaires (smile sheets). The second level explores the change in knowledge and skills - a pre-test and a post-test. The third level tries to identify the long-term change in participants' behaviour - qualitative methodology, specifically a focus group series and 360-degree feedback. The fourth level is focused on the return on investment in education.
About the model: introduced as early as 1959 as a reaction to the increasing pressure to prove the effectiveness, value and benefit of education for business. It is one of the most widespread models for education evaluation, reflects the current constructivist conception of instruction well, and is comprised of four hierarchically ordered levels revealing, one by one, the levels of effectiveness of the educational process.
Level 1: Immediate Reaction to Education. Key question: To what extent were the participants satisfied with the educational activity? The aim is to evaluate students' immediate reactions to an educational activity (a seminar, a workshop, an e-learning module, etc.) and how participants feel about the various aspects of a training program. The instrument needs a clear research goal, understandable questions and quantifiable answers, and should ensure the anonymity of participants and the possibility of adding a comment. We assessed students' satisfaction with the study environment, the study content and the lecturer.
About smile sheets: a type of tool capturing students' immediate response to the educational activity. Our target group comprised mainly students aged between 20 and 25. The design of the questionnaire was based on an adjusted five-point Likert scale: instead of an evaluation ranging from "extremely satisfied" to "not at all satisfied", it consisted of five smileys indicating the level of satisfaction with a particular aspect. The three main aspects evaluated by this questionnaire are: the content and organization of the seminar, the instructor, and the overall assessment of the lesson.
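As a concrete illustration, the sketch below shows one way the five-point smiley answers could be collapsed into a three-category distribution, similar to the beneficial / neutral / not beneficial shares reported in the charts that follow. This is not the authors' actual tooling; the 1-5 coding and the sample answers are assumptions for illustration only.

```python
from collections import Counter

def summarise_smile_sheet(responses):
    """Collapse five-point smiley answers (1 = most negative, 5 = most positive;
    an assumed coding) into a three-category percentage distribution."""
    counts = Counter(responses)
    n = len(responses)
    return {
        "positive %": round(100 * (counts[4] + counts[5]) / n, 1),
        "neutral %": round(100 * counts[3] / n, 1),
        "negative %": round(100 * (counts[1] + counts[2]) / n, 1),
    }

# Hypothetical answers for one aspect of one seminar:
print(summarise_smile_sheet([5, 4, 4, 3, 5, 2, 4, 5, 3, 4]))
```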
Measuring participants' satisfaction at the MU Campus Library. [Chart: The medical students' satisfaction with e-learning aspects, part one (n=48 students from autumn 2013, n=27 from spring 2014). Aspects rated beneficial / neutral / not beneficial for each semester: anytime learning, anywhere learning, only online study materials, permanent accessibility of study materials, own study plan, necessity to have a PC. Horizontal axis: 75-100 % of students.]
[Chart: The medical students' satisfaction with e-learning aspects, part two (n=48 students from autumn 2013, n=27 from spring 2014). Aspects rated beneficial / neutral / not beneficial for each semester: no personal contact with classmates, no personal contact with a teacher, online communication with a teacher, online communication with classmates. Horizontal axis: 0-100 % of students.]
[Chart: The medical students' satisfaction with taught topics, part one (n=48 students from autumn 2013, n=27 from spring 2014). Topics rated beneficial / neutral / not beneficial for each semester: evaluation of a website's quality, MU portal on EIR, searching, Web of Science + Scopus, multidisciplinary fulltext databases, subject-specific databases (medicine), basic rules for scientific writing, publication and citation ethics, citation styles, reference managers. Horizontal axis: 0-100 % of students.]
[Chart: The medical students' satisfaction with taught topics, part two (n=48 students from autumn 2013, n=27 from spring 2014). Topics rated beneficial / neutral / not beneficial for each semester: library terminology and library services, ILL service, catalogue, databases with e-books, online medical journals and books, impact factor + SNIP/SJR, h-index. Horizontal axis: 0-100 % of students.]
[Chart: The medical students' satisfaction with tasks (n=48 students from autumn 2013, n=27 from spring 2014). Tasks rated beneficial / neutral / not beneficial for each semester: searching a shelf number in the catalogue, evaluating the quality of information on a website, searching in Web of Science, using the SFX linking service, searching a fulltext in PubMed, detecting signs of plagiarism, writing an abstract/annotation, manually creating a list of references, creating a list of references in a reference manager, comparing the quality of journals. Horizontal axis: 0-100 % of students.]
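Stacked percentage bars like those in the charts above can be drawn with a few lines of matplotlib. The sketch below uses hypothetical placeholder values rather than the actual survey results, and the three aspect names are just examples taken from the first chart.

```python
import matplotlib.pyplot as plt

# Placeholder values only; not the actual survey results.
aspects = ["anytime learning", "anywhere learning", "own study plan"]
beneficial = [90, 88, 82]
neutral = [8, 10, 14]
not_beneficial = [2, 2, 4]

fig, ax = plt.subplots()
ax.barh(aspects, beneficial, label="beneficial")
ax.barh(aspects, neutral, left=beneficial, label="neutral")
ax.barh(aspects, not_beneficial,
        left=[b + n for b, n in zip(beneficial, neutral)], label="not beneficial")
ax.set_xlabel("% of students")
ax.legend()
plt.tight_layout()
plt.show()
```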
Level 2: Gained Knowledge. Key question: To what extent did the participants obtain the expected knowledge and skills as a result of attending the educational activity? This level explores the change in one or more areas of participants' knowledge, skills or attitudes due to the educational activity. The change is expressed by the quantity of knowledge transferred during a lesson and is measured with quantitative methods and statistical evaluation: knowledge is assessed both prior to and after the lesson for comparison (a pre-test and a post-test).
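The slides do not say which statistical procedure was applied. As one possible approach, a paired comparison of each student's pre- and post-test score can be run with SciPy; the scores below are hypothetical placeholders, not the course data.

```python
from scipy import stats

# Hypothetical paired scores (one pair per student); placeholders only.
pre = [10, 12, 9, 14, 11, 13, 8, 15, 12, 10]
post = [15, 16, 14, 18, 15, 17, 12, 19, 16, 15]

mean_gain = sum(b - a for a, b in zip(pre, post)) / len(pre)
t_stat, p_value = stats.ttest_rel(post, pre)  # paired t-test on the same students
print(f"mean gain: {mean_gain:.1f} points")
print(f"paired t-test: t = {t_stat:.2f}, p = {p_value:.4f}")
```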
Measuring gained knowledge. [Chart: e-learning Course of Information Literacy, pre- and post-test, autumn 2013 (n=1184). Vertical axis: correct answers in %; horizontal axis: question number 1-22.]
[Chart: The percentage of medical students from spring 2014 responding correctly in the pre-test and post-test (n=63). Vertical axis: 0-100 %.]
[Chart: The percentage of PhD medical students from spring 2014 responding correctly in the pre-test and post-test (n=29). Vertical axis: 0-100 %.]
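The per-question percentages plotted in the charts above amount to simple counting. A minimal sketch, assuming a data layout of one boolean answer sheet per student (an assumption for illustration), might look like this:

```python
def percent_correct_per_question(answer_sheets):
    """answer_sheets: one list of booleans per student (True = correct answer).
    Returns the percentage of students answering each question correctly."""
    n_students = len(answer_sheets)
    n_questions = len(answer_sheets[0])
    return [
        round(100 * sum(sheet[q] for sheet in answer_sheets) / n_students, 1)
        for q in range(n_questions)
    ]

# Hypothetical pre- and post-test sheets for three students and four questions:
pre_sheets = [[True, False, False, True], [True, True, False, False], [False, False, True, True]]
post_sheets = [[True, True, False, True], [True, True, True, False], [True, False, True, True]]
print(percent_correct_per_question(pre_sheets))   # [66.7, 33.3, 33.3, 66.7]
print(percent_correct_per_question(post_sheets))  # [100.0, 66.7, 66.7, 66.7]
```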
Level 3: Long-Term Effects. Key question: To what extent do the participants apply the knowledge and skills obtained to their everyday work? The aim is to identify the long-term change in participants' behaviour with the benefit of hindsight (e.g. three to six months after the lesson). Methods used for third-level measurements usually have a qualitative character: an interview, or finding out the views of students or teachers who work with the person who attended the course. A focus group is a small number of people (usually between 4 and 15, typically 8) brought together with a moderator to focus on a specific topic (satisfaction with study materials, communication aspects, ...). Focus groups aim at a discussion rather than individual responses to formal questions, and produce qualitative data (preferences and beliefs) that may or may not be representative of the general population.
Level 4: Results. Key question: To what extent have the planned objectives of a development project and subsequent support activities been achieved? This level shows the tangible results of a programme and is applied mainly in the commercial sphere, because it is focused on the return on investment in education.
References
1. Eldredge, J.: Evidence-Based Librarianship: searching for the needed EBL evidence. Medical Reference Services Quarterly. 3, 1-18 (2000)
2. Davies, P.: What is evidence-based education? British Journal of Educational Studies. 2, 108-121 (1999)
3. Smith, A.: Scientifically Based Research and Evidence-Based Education: A Federal Policy Context. Research & Practice for Persons with Severe Disabilities. 3, 126-132 (2003)
4. Kirkpatrick, D.: The Four Levels of Evaluation: Measurement and Evaluation. American Society for Training & Development Press, Alexandria (2007)
5. Kirkpatrick, D.: Seven Keys to Unlock the Four Levels of Evaluation. Performance Improvement. 45, 5-8 (2006)
6. Explorable: Pretest-Posttest Designs, https://explorable.com/pretest-posttest-designs
7. Kirkpatrick, J.: The Hidden Power of Kirkpatrick's Four Levels. T+D. 61(8), 34-37 (2007)
8. Naugle, K.A., Naugle, L.B., Naugle, R.J.: Kirkpatrick's Evaluation Model as a Means of Evaluating Teacher Performance. Education. 1, 135-144 (2000)