Equity-Informed Evaluation Plan Development Guide


Develop an equity-informed evaluation plan to assess program effectiveness and address key objectives. Explore considerations for evaluating program implementation, strategic planning, and monitoring across the program life cycle. Learn how to incorporate equity considerations in evaluation processes, engage target populations, and manage evaluation teams effectively.


Uploaded on Jul 31, 2024



Presentation Transcript


  1. Guide to Building an Equity-Informed Evaluation Plan

  2. Monitoring and Evaluation Across the Program Life Cycle
  ASSESSMENT: What is the nature of the (health) problem?
  STRATEGIC PLANNING: What primary objectives should my program pursue to address this problem?
  DESIGN: What strategy, intervention, and approaches should my program use to achieve these priorities?
  IMPLEMENTATION MONITORING / PROCESS EVALUATION: How do I know the activities are being implemented as designed? How much does implementation vary from site to site? How can the program become more efficient or effective?
  (SUMMATIVE) EVALUATION: How do I know the strategy is working? How do I judge if the intervention is making a difference?

  3. Equity considerations: How are the needs of your target population(s) being addressed in the evaluation process? What barriers do they face in engaging with the evaluation? Does the budget/scope allow for a community engagement component?

  4. Before You Get Started: Management of the Evaluation
  Evaluation team:
  - Who will manage and implement this evaluation?
  - What evaluation skills or approaches are needed to successfully conduct this evaluation?
  - Have you identified an external reviewer to provide feedback on the evaluation plan?
  - Does the team include individuals 1) who have experience working with communities similar to the client group and/or 2) who share attributes and experiences with the client group (e.g., racialized youth, LGBTQ community)?
  Team roster columns: Individual | Title or Role | Responsibilities

  5. Evaluation Plan
  An evaluation plan sets out the proposed details of an evaluation: what will be evaluated, how, and when. The evaluation plan should include information about:
  What the evaluation is trying to do
  - what is to be evaluated
  - the purposes of the evaluation
  - key evaluation questions
  How it will be done
  - what data will be collected
  - how and when data will be collected
  - how data will be analyzed
  - how and when results will be reported

  6. Evaluation Plan Steps
  1. Writing a project description
  2. Stating an evaluation purpose
  3. Identifying evaluation stakeholders
  4. Choosing evaluation questions
  5. Selecting evaluation design
  6. Choosing evaluation tools
  7. Identifying evaluation sources
  8. Data analysis & interpretation
  9. Use & communication

  7. 1. Writing a project description
  Need: What need is your program designed to meet?
  Context: What is the program's relevant socioeconomic and political context? That is, what contextual or cultural factors may affect its implementation or effectiveness?
  Population Addressed: Who is included in the population for whom activities are intended? Also consider: how does your program affect health equity for the identified vulnerable or marginalized populations? What are their determinants-of-health considerations (e.g., income level, access to health services)?
  Stage of Development: How long has the program been in place? Is it in the planning or implementation stage?
  Logic model chain: Resources/Inputs → Activities → Outputs → Outcomes (short-term/intermediate → long-term; initial → subsequent)

  8. LOGIC MODEL (evaluation embedded within an equity framework)
  Framing fields: Situation Statement | Goal | Target Population | Equity Considerations | Assumptions | External Factors
  Knowledge synthesis: needs assessment, gap analysis, literature review, environmental scan
  Inputs: staff, space, time, dollars, equipment, expertise
  Activities: outreach, communication activities, KTE, other
  Outputs: flyers, workshops, webinars, modules, posters, other
  Short-term outcomes (change in): knowledge, awareness, perception, understanding, perceived belief
  Medium-term outcomes (change in): attitude, behaviour, practice, performance change, plan to transform
  Long-term outcomes: improved access to care, improved quality of life
  Impact measures: access, participation, satisfaction, impact, sustainability, scalability, relevance, validity, accuracy, reach, use, efficiency, pre-post test, new learning, cost-effectiveness, cost-benefit, SROI

  9. 2. State the purpose of your evaluation
  Through the process of clarifying the objectives and goals of your initiative, you will be able to identify which major components should be evaluated.
  Example: Access Alliance Language Services, as a program of Access Alliance, was evaluated in order to assess its effectiveness in meeting its desired outcomes, and as a means to identify strengths and areas for improvement.

  10. 3. Stakeholder Assessment and Engagement Plan
  Table columns: Stakeholder Name {may be an individual or a group} | Stakeholder Category {primary, secondary, tertiary} | Interest or Perspective {program participant, staff, etc.} | Role in the Evaluation {planning team, external reviewer, etc.}

  11. 4. Choosing your Evaluation Questions What three to five major questions do you intend to answer through this evaluation? Do the questions align with the Good Evaluation Questions Checklist? Key Evaluation Questions should be developed by considering the type of evaluation being done, its intended users, its intended uses (purposes), and the evaluative criteria being used.

  12. Good Evaluation Questions Checklist
  The Centers for Disease Control and Prevention (CDC, 2013) created a checklist for use in assessing potential evaluation questions. This checklist is designed for reviewing the overarching questions guiding an evaluation; it does not apply to the specific questions included in a data collection instrument, such as survey or interview questions.
  Access the tool here: https://www.cdc.gov/asthma/program_eval/assessingevaluationquestionchecklist.pdf

  13. Program Evaluation Dimensions
  Reach and Relevance: How important or significant is the project with regard to population-level health needs and priorities? Is the project responding to an identified client community need?
  Effectiveness: Alignment between the stated aims of the project and the actual results. Is the project achieving its intended goals and objectives? How large is the effect or impact of the project compared with the objectives planned (comparison: results vs. plan)?
  Efficiency: How efficiently were the resources used (comparison: resources applied vs. results)? How were the project's resources used to achieve the desired results?
  Impact and Sustainability: Long-term ability and operational capacity of the project to continue delivering against its goals. Are the positive effects or impacts sustainable and scalable? How is the sustainability or permanence of the intervention and its effects to be assessed? Does the intervention contribute to reaching higher-level development objectives (preferably, the overall objective)? Are these outcomes transferable to other agencies in the sector?

  14. Key Evaluation Questions (KEQs): Process Evaluation
  - How is the program being implemented?
  - How appropriate are the processes compared with quality standards?
  - Is the program being implemented correctly?
  - Are participants being reached as intended?
  - How satisfied are program clients? For which clients?
  - What has been done in an innovative way?

  15. Key Evaluation Questions (KEQs): Outcome Evaluation
  - How well did the program work?
  - Did the program produce or contribute to the intended outcomes in the short, medium and long term? For whom, in what ways and in what circumstances?
  - What unintended outcomes (positive and negative) were produced? Were inequities increased or decreased?
  - To what extent can changes be attributed to the program?
  - What were the particular features of the program and context that made a difference?
  - What was the influence of other factors?

  16. Key Evaluation Questions (KEQs): Economic Evaluation
  - What has been the ratio of costs to benefits?
  - What is the most cost-effective option?
  - Has the intervention been cost-effective (compared to alternatives)?
  - Is the program the best use of resources?
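
The cost-to-benefit and cost-effectiveness questions above reduce to simple ratios. As a minimal sketch, the function names and all dollar and outcome figures below are hypothetical, not drawn from any specific evaluation:

```python
# Illustrative benefit-cost and cost-effectiveness ratios for an economic
# evaluation. All figures are hypothetical.

def benefit_cost_ratio(total_benefits: float, total_costs: float) -> float:
    """Ratio of monetized benefits to costs; a value above 1.0 suggests
    benefits exceed costs."""
    if total_costs <= 0:
        raise ValueError("total_costs must be positive")
    return total_benefits / total_costs

def cost_effectiveness_ratio(total_costs: float, units_of_outcome: float) -> float:
    """Cost per unit of outcome (e.g., cost per client achieving the
    intended outcome)."""
    if units_of_outcome <= 0:
        raise ValueError("units_of_outcome must be positive")
    return total_costs / units_of_outcome

# Hypothetical program: $120,000 in costs, $150,000 in monetized benefits,
# 300 clients achieving the intended outcome.
bcr = benefit_cost_ratio(150_000, 120_000)    # 1.25
cer = cost_effectiveness_ratio(120_000, 300)  # 400.0 dollars per client
print(f"Benefit-cost ratio: {bcr:.2f}")
print(f"Cost per successful outcome: ${cer:.2f}")
```

Comparing the cost-effectiveness ratio across alternative interventions is one way to answer "Is the program the best use of resources?", provided the outcome unit is defined the same way for each option.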

  17. Evaluation Questions to Gauge Appropriateness
  - To what extent does the program address an identified need?
  - How well does the program align with government and agency priorities?
  - Does the program represent a legitimate role for government?

  18. Evaluation Questions to Gauge Effectiveness & Efficiency
  Effectiveness:
  - To what extent is the program achieving the intended outcomes, in the short, medium and long term?
  - To what extent is the program producing worthwhile results (outputs, outcomes) and/or meeting each of its objectives?
  Efficiency:
  - Do the outcomes of the program represent value for money?
  - To what extent is the relationship between inputs and outputs timely, cost-effective and to expected standards?

  19. Evaluation Plan Template
  Columns: Objectives | Key Evaluation Question | Questions (What questions do you need to ask to determine that your objectives have been met?) | Information Required (What information do you require to answer each of these questions?) | Data Source (How will you collect this information?)
  Example row 1 - Objective: Better and more timely planning and delivery of community services and infrastructure in community X through a partnership model. Key evaluation questions: Effectiveness; Sustainable & Scalable. Questions: What was the model? Who was involved? Was the diversity of the community represented? What was delivered? What was delivered differently because of the model? Has capacity been built to make the model sustainable? What lessons were learned? Could this model be generalised to other areas?
  Example row 2 - Objective: Increased level of community participation in general activities and governance. Key evaluation questions: Reach; Effectiveness. Questions: Are more people from a range of backgrounds involved in general activities? Are more people from a range of backgrounds involved in governance?

  20. 5. Selecting Evaluation Design
  What is the design for this evaluation? What is the rationale for using this design?
  Descriptive/non-experimental designs. Common features: does not require a comparison group; includes qualitative and quantitative data collection; does not require advanced statistical methods. Common methods: review of program documents and administrative data; interviews; focus groups; direct observation. Types of analysis: thematic identification; confirmation of findings across sources (triangulation).
  Experimental or quasi-experimental designs. Common features: typically requires quantitative data; may include a comparison group (impact evaluation); often requires statistical methods. Common methods include: pre-post test.
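
The pre-post test named above can be sketched as a paired comparison. The participant scores below are hypothetical, and a real evaluation would typically add a paired statistical test (e.g., a paired t-test) rather than reporting only the average change:

```python
# Minimal pre-post test comparison using only the standard library.
# Scores are hypothetical; each index is the same participant before and
# after the intervention.
from statistics import mean

pre_scores  = [55, 60, 48, 70, 62, 58]  # hypothetical baseline scores
post_scores = [68, 72, 60, 78, 70, 66]  # same participants afterwards

# Paired change per participant, then the average change across the group.
changes = [post - pre for pre, post in zip(pre_scores, post_scores)]
avg_change = mean(changes)

print(f"Average pre-to-post change: {avg_change:.1f} points")
print(f"Participants who improved: {sum(c > 0 for c in changes)} of {len(changes)}")
```

Pairing each participant's scores (rather than comparing group averages of two unrelated samples) is what makes this a pre-post design: it controls for differences between individuals.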

  21. https://www.nationalservice.gov/sites/default/files/resource/Evaluation_Designs_Slides.pdf

  22. 6. Choosing Evaluation Tools: Program Evaluation Levels
  Reaction - Focus: perception of knowledge, satisfaction, motivation, usefulness. Measurement methods: rating scales, surveys, focus groups, structured in-depth interviews.
  Learning - Focus: acquisition of knowledge, skills, attitude. Measurement methods: pre-post tests, essay questions, case study analysis, simulations/role play.
  Transfer - Focus: real-life transfer of knowledge, skills, attitude, problem solving. Measurement methods: record reviews and audits, surveys, observations, checklists, critical incident reports.
  Utilization - Focus: real-world outcomes. Measurement methods: record reviews and audits, surveys, observations, compliance reviews, critical incident reports, program institutionalization.

  23. Common Evaluation Tools
  Client experience surveys; focus groups; in-depth interviews; debriefing; case studies; reflective diaries; content evaluation (pre-post); other methods (e.g., arts-based methods).
  Equity considerations: How will it be ensured that the methods suit the participants? (e.g., Are the methods appropriate for nonverbal clients or clients with mental health issues, cognitive issues, or addictions?)

  24. 7. Choosing Evaluation Sources & Data Collection
  What constitutes success? Table columns: Evaluation Question | Criteria or Indicator
  Data collection table columns: Evaluation Question | Data Collection Method | Source of Data

  25. Evaluation Plan Template (worked example)
  Columns: Objectives | Key Evaluation Question | Questions (What questions do you need to ask to determine that your objectives have been met?) | Information Required (What information do you require to answer each of these questions?) | Data Source (How will you collect this information?)
  Example row 1 - Objective: Better and more timely planning and delivery of community services and infrastructure in community X through a partnership model. Key evaluation questions: Effectiveness; Sustainable & Scalable. Questions: What was the model? Who was involved? Was the diversity of the community represented? What was delivered? What was delivered differently because of the model? Has capacity been built to make the model sustainable? What lessons were learned? Could this model be generalised to other areas? Information required: description of the model; list of partners over time; significant organisations not involved in the partnership; audit of services/infrastructure delivered; partners' assessment of what happened differently because of the model; partners' assessment of the capacity that has been built and its sustainability; partners' assessment of lessons; high-level strategic opinion about the applicability of the findings to other areas. Data sources: project application; bi-monthly reports; stakeholder interviews; achievement audit; workshop on evaluation results.
  Example row 2 - Objective: Increased level of community participation in general activities and governance. Key evaluation questions: Reach; Effectiveness. Questions: Are more people from a range of backgrounds involved in general activities? Are more people from a range of backgrounds involved in governance? Information required: number of participants by age group, CALD, Indigenous status, etc. Data sources: enrolment forms; achievement audit/tool; pre and post population indicators; bi-monthly reports.

  26. 8. Data Analysis & Interpretation
  Analysis: What method(s) will you use to analyze your data (e.g., descriptive statistics, inferential statistics, or qualitative analysis such as content or thematic analysis)? Provide example table shells, templates, or a qualitative codebook that specifies the output for each type of analysis you plan to conduct.
  Interpretation: Who will you involve in drawing, interpreting, and justifying conclusions? Does this group include program participants or others affected by the program? What are your plans, including evaluation capacity building activities, to involve them in this process?
  Analysis plan columns: Analysis to Be Performed | Data to Be Analyzed | Person(s) Responsible | Due Date
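
As a minimal sketch of the descriptive statistics and thematic analysis named above, assuming survey ratings and coded interview excerpts have already been collected; all ratings, code labels, and variable names below are hypothetical:

```python
# Descriptive statistics for survey ratings plus a simple frequency count
# of qualitative codes. Data are hypothetical.
from statistics import mean, median
from collections import Counter

satisfaction_ratings = [4, 5, 3, 4, 5, 4, 2, 5]  # 1-5 client survey scale

# Codes applied to interview excerpts during thematic analysis.
interview_codes = ["access", "wait times", "access",
                   "staff rapport", "access", "wait times"]

print(f"Mean satisfaction: {mean(satisfaction_ratings):.2f}")
print(f"Median satisfaction: {median(satisfaction_ratings)}")
print("Most common themes:", Counter(interview_codes).most_common(2))
```

Counting code frequencies is only a starting point for thematic analysis; the qualitative codebook mentioned above should define what each code means so counts are comparable across coders.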

  27. 9. Use and Communication
  Use:
  - How will evaluation findings be used? By whom?
  - How does the timeline for reporting findings and potential recommendations align with key events for which you will need information from the evaluation (e.g., grant application, partner meeting)?
  - Who is responsible for creating and monitoring an action plan to guide the implementation of evaluation recommendations? What follow-up is needed?
  - What lessons learned, including those about evaluation and evaluation capacity building, should be shared? How will they be documented?
  KTE plan:
  - Which evaluation stakeholders will you communicate with, and for what purpose (e.g., update on status of the evaluation, invite to meetings, share interim or final findings)?
  - What methods (e.g., in-person meetings, emails, written reports, newsletter articles, presentations) will you use to communicate with evaluation stakeholders?
  - Who is best suited to deliver the information (e.g., evaluator, program manager, coalition leader)?
  - Why are these methods appropriate for the specific evaluation stakeholder audience of interest?

  28. Communication Plan
  Table columns: Applicable (Yes/No) | Purpose | Format | Messenger | Time/Dates | Notes
  Purposes to cover: include in decision making about evaluation design/activities; inform about specific upcoming evaluation activities; keep informed about progress of the evaluation; present initial/interim findings; present complete/final findings; document the evaluation and its findings; document implementation of actions taken because of the evaluation.
  Equity considerations: How can we ensure results are communicated to all clients and the wider community? (e.g., using a checklist of stakeholders)

  29. Evaluation Report Guidelines

  30. Thank you!
