Understanding Reader Assessment for Student Learning Improvement
This presentation outlines the importance of assessment in student learning, covering topics such as the reader review process, student learning outcomes, evidence analysis, and stakeholder engagement. It emphasizes the systematic collection of information to inform decisions for enhancing learning outcomes. The process involves assigning reader teams to review reports independently before meeting to provide consensus feedback. The aim is to evaluate progress, identify improvement opportunities, and promote accountability in academic programs.
Training for Readers: Assessment of Student Learning Academic Program Reports
Jo Lynn Autry Digranes, Coordinator for Assessment
Updated 10/2017
Presentation Outline
- Definition of Assessment
- Why Do We Assess?
- Reader Review Process
- HLC Statement on Student Learning, Assessment, and Accreditation
- Fundamental Questions
- Student Learning Outcomes
- Evidence of Learning
- Analysis and Use
- Shared Responsibility
- Evaluation and Planning
- Informing Stakeholders
Assessment of Student Learning
The systematic collection of information about student learning, using the time, knowledge, expertise, and resources available, in order to inform decisions about how to improve learning.
Walvoord, B. E. (2004). Assessment clear and simple: A practical guide for institutions, departments, and general education. San Francisco, CA: Jossey-Bass.
Why Do We Assess?
- To clearly see opportunities for improvement (student learning; the university)
- To measure progress
- To be reflective practitioners
- To be accountable
Slide from the 2012 Assessment Reader Training PowerPoint: Dr. Kent Buchanan and Mr. Michael Jackson
The Process - Before
- Two (2) readers will be assigned to work as a team.
- Each team reviews approximately six (6) reports (depending upon the number of reader volunteers).
- Teams will review only reports that are not from their own program area.
The Process - During
- Each reader first works independently to review each report. PDF versions of the Qualtrics-based assessment reports will be e-mailed to you.
- Previous readers suggested that each program's reader review from the prior report year also be provided. One question states: "If the program was reviewed by a reader team the previous year, please include a response to readers' comments from that review." Did the academic program being reviewed make appropriate responses? PDF versions of those prior reviews will be e-mailed to you as well.
The Process - During
- Readers meet (in person or electronically) to develop consensus feedback reports, submitted on the Qualtrics-based form: http://okcu.qualtrics.com/jfe/form/SV_2iekczdEPGeeTnT
- A preview of the Qualtrics-based form may be accessed at https://az1.qualtrics.com/jfe/preview/SV_2iekczdEPGeeTnT?Q_CHL=preview
- All consensus reports should be completed and submitted in Qualtrics by Friday, November 10, 2017.
The Process - During
Question: Describe the success of your 2015-2016 graduates.
Answer yes or no based upon the following criteria:
- Did the program have that information?
- How comprehensive was the information?
- Did the information describe the majority of the program graduates?
The Process - During
Question: Provide examples of student research or creative activities.
This question will not be rated. It was included to compile data for the Higher Learning Commission and other reports.
The Process - During
Criteria: Student Learning Outcome - What specific, measurable outcome do you want to achieve?
Look for verbs like these: define, classify, describe, demonstrate, interpret, calculate, evaluate, synthesize, critique, diagnose.
The Process - During
Criteria: How is the outcome measured?
For academic assessment, direct measures must be included. Examples include:
- Tests
- Essays/papers
- Performances
- Presentations/demonstrations
- Creative products
- Portfolios
How are direct measures evaluated (rubrics, etc.)?
Indirect measures, such as surveys, may also be utilized. Grades can be too subjective as a measure.
The Process - During
Criteria: Results/Findings
- Does the description provide sufficient data for further analysis?
- Does the program have expected goals for achievement? An example: Eighty (80) percent of the students will be rated, utilizing a rubric, at a level of 4 of a possible 5 on their performance or presentation.
- How many students are assessed? All majors? With a small number of majors, such as 2 or 3, results may be limited.
The Process - During
Criteria: Analysis of Results
- How do the results compare to expected results?
- Does the program provide a logical interpretation of why the goals were or were not achieved?
- The report clearly identifies what contributed to the actual results, such as a recent curriculum change that led to improved achievement, or what may have impacted a lack of achievement. Did 50% of the students achieve the expected criteria as opposed to an expected 80%? If so, there were likely issues with curriculum or instruction that should be identified in the results/findings.
The Process - During
Criteria: Action Plan - What changes will be made to improve the program in light of these findings?
- Is the action plan based upon the analysis?
- Does the action plan appear appropriate to address any issues in curriculum, instruction, or support resources identified during the analysis?
- Is the action plan described in sufficient detail so that it can be implemented?
- Is the proposed time frame for implementation realistic?
The Process - After
- PDF versions of reader reports (with no reader names) will be e-mailed to the appropriate dean.
- A comprehensive report will be developed through Qualtrics. The report will display the average scores for the eight (8) numerically scored criteria and the question on graduate success.
Below is an example from the 2014-2015 report of one criterion's average score.
Criterion 1. The program has clearly defined, measurable student learning outcomes that focus on knowledge, skills, behaviors, or values.
Question: How well does this program meet this criterion?
  Excellent: 26
  Acceptable: 22
  Needs Improvement or Clarification: 7
  Total Responses: 55
  Mean: 2.35
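The reported mean can be reproduced arithmetically if one assumes the common three-point coding for the rating scale (Excellent = 3, Acceptable = 2, Needs Improvement or Clarification = 1); the report itself does not state the point values, so this is a sketch under that assumption:

```python
# Hypothetical reconstruction of the 2014-2015 criterion mean.
# Assumed scale (not stated in the report): Excellent = 3,
# Acceptable = 2, Needs Improvement or Clarification = 1.
counts = {"Excellent": 26, "Acceptable": 22, "Needs Improvement or Clarification": 7}
points = {"Excellent": 3, "Acceptable": 2, "Needs Improvement or Clarification": 1}

total = sum(counts.values())  # 55 total responses
mean = sum(points[k] * n for k, n in counts.items()) / total

print(total, round(mean, 2))  # 55 2.35
```

Under this assumed coding the weighted average works out to 129/55 ≈ 2.35, which matches the mean shown in the example table.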
Higher Learning Commission
HLC Statement on Student Learning, Assessment, and Accreditation: Fundamental Questions for Conversations on Student Learning
Six fundamental questions serve as prompts for conversations about student learning and the role of assessment in affirming and improving that learning:
1. How are your stated student learning outcomes appropriate to your mission, programs, degrees, and students?
2. What evidence do you have that students achieve your stated learning outcomes?
3. In what ways do you analyze and use evidence of student learning?
4. How do you ensure shared responsibility for student learning and for assessment of student learning?
5. How do you evaluate and improve the effectiveness of your efforts to assess and improve student learning?
6. In what ways do you inform the public and other stakeholders about what students are learning---and how well?
Higher Learning Commission. (2007). Statement on Student Learning, Assessment and Accreditation. HLC website: http://ncahlc.org/Information-for-Institutions/publications.html
HLC Fundamental Questions
How are your stated student learning outcomes appropriate to your mission, programs, degrees, and students?
OCU Mission Emphasis:
- Scholarship
- Service
- Culturally rich community
- Moral and spiritual development
- Rigorous curriculum
- Effective leaders
HLC Fundamental Questions
What evidence do you have that students achieve your stated learning outcomes?
- Stated learning outcomes should be measurable.
- Data can come from both direct and indirect measures, but always incorporate direct assessment.
Examples of Goal Levels
- Institutional: Students will communicate effectively orally and in writing.
- General Education Curriculum: Students will write essays in which they select and defend a position on a debatable issue, analyze a text, propose research, or define a problem and suggest solutions.
- Composition Course: Students will write a 5- to 7-page argumentative essay in which they select and defend a position on a debatable issue, support their position with evidence from their readings, and address counterarguments.
Allen, M. J. (2006). Assessing general education programs. San Francisco, CA: Jossey-Bass.
Direct and Indirect Assessment
- Direct assessment involves an analysis of products or behaviors that demonstrate the extent of students' mastery of learning outcomes.
- Indirect assessment involves people's opinions, and these opinions can richly supplement what is learned in direct assessment studies.
Allen, M. J. (2006). Assessing general education programs. San Francisco, CA: Jossey-Bass.
Direct Assessment Examples
- Standardized tests
- Locally developed tests
- Embedded assignments and activities
- Portfolios
Allen, M. J. (2006). Assessing general education programs. San Francisco, CA: Jossey-Bass.
More Direct Assessment Examples
- Final projects, such as a senior thesis, undergraduate research project, senior art show, or music recital
- Capstone experiences, such as student teaching, an internship, or a cooperative educational experience
Middaugh, M. F. (2010). Planning and assessment in higher education. San Francisco, CA: Jossey-Bass.
Indirect Assessment Examples
- Surveys
- Interviews
- Focus groups
Allen, M. J. (2006). Assessing general education programs. San Francisco, CA: Jossey-Bass.
HLC Fundamental Questions
In what ways do you analyze and use evidence of student learning?
- Use multiple measures of direct and indirect assessment. (Grades are typically not adequate measures.)
- Do you utilize evidence for reflecting upon program outcomes?
- Do you utilize evidence for indication of student learning?
- Do you utilize evidence for planning and change?
HLC Fundamental Questions
How do you ensure shared responsibility for student learning and for assessment of student learning?
- How many faculty members are involved?
- How many courses are assessed?
- How many students are assessed?
- How often are learning outcomes assessed?
- Are external stakeholders involved in assessment, such as in service learning or internships?
HLC Fundamental Questions
How do you evaluate and improve the effectiveness of your efforts to assess and improve student learning?
- What is the plan for improvement, if needed?
- How does the plan for improvement link to strategic planning or budget requests?
- How do you know that last year's plans worked?
- How did readers' recommendations impact or improve effectiveness?
HLC Fundamental Questions
In what ways do you inform the public and other stakeholders about what students are learning---and how well?
- How are students informed of assessment results?
- How are internal stakeholders informed of assessment results?
- How are external stakeholders informed of assessment results?
Questions? Thank You!