Rationale and Framework for Evaluating Creativity Collaboratives


Explore the importance of evaluating Creativity Collaboratives through an overview of the evaluation framework developed by Durham University. Discover the aims, roles and responsibilities involved in the evaluation process, along with the vision for creative education and teaching shared by Arts Council England. Definitions of creativity and creative thinking are provided, emphasising the significance of teaching for creativity. Delve into the reasons for evaluating Creativity Collaboratives and the expected outcomes of the evaluation project.





Presentation Transcript


  1. Creativity Collaboratives Information Session - Evaluation. Helen Cramman and Victoria Menzies, School of Education, Durham University, 6th July 2021

  2. Contents
  - Introduction from Arts Council England
  - Aims for the session
  - Why evaluate the Creativity Collaboratives?
  - Overview of the overarching evaluation framework
  - Evaluation roles and responsibilities: what each party will be doing and expecting
  - Questions so far
  - BREAK (5 minutes)
  - What might individual Creativity Collaboratives want to learn from their own evaluation?
  - Questions and discussion

  3. Introduction from Arts Council England
  - Durham Commission vision for a creative education
  - Teaching and creativity through system leadership and collaboration
  - A watershed moment in education

  4. Aims of the session
  The aims of the session are to:
  1) Share the rationale for evaluating the Creativity Collaboratives.
  2) Give a brief overview of what evaluation is.
  3) Summarise the plans for the overarching evaluation framework being carried out by Durham University.
  4) Provide the opportunity for you to ask questions about the evaluation of the Creativity Collaboratives.

  5. Definitions
  - Creativity: the capacity to imagine, conceive, express, or make something that was not there before.
  - Creative thinking: a process through which knowledge, intuition and skills are applied to imagine, express or make something novel or individual in its contexts. Creative thinking is present in all areas of life. It may appear spontaneous, but it can be underpinned by perseverance, experimentation, critical thinking and collaboration.
  - Teaching for creativity: explicitly using pedagogies and practices that cultivate creativity in young people.
  Source: Durham Commission on Creativity and Education Report (2019), https://www.dur.ac.uk/resources/creativitycommission/DurhamReport.pdf

  6. Why evaluate the Creativity Collaboratives?

  7. Why evaluate the Creativity Collaboratives?
  1. What is evaluation?
  2. Who benefits from evaluation?
  3. Evaluation aims for this project
  4. Who is the evaluation for, and where will the findings of this project be used?

  8. What is evaluation?
  Educational evaluations are intended to provide evidence-based arguments about whether educational outcomes can be improved through the implementation of intervention strategies (ScienceDirect, 2021). The aim of evaluation is to provide rigorously gathered evidence to inform decisions about policies, programmes and resources.
  For a pilot project, the emphasis is on evidence of promise (e.g. is there evidence of the expected change happening?), feasibility (e.g. is the approach acceptable to participants? how manageable is the intervention?) and identifying the factors that appear most important for successful implementation (Humphrey, 2016).
  https://www-sciencedirect-com.ezphost.dur.ac.uk/topics/social-sciences/educational-evaluation
  https://educationendowmentfoundation.org.uk/public/files/Evaluation/Setting_up_an_Evaluation/IPE_Handbook.pdf

  9. Who benefits from evaluation?
  Evaluation can be beneficial at many levels: pupils, parents, teachers, senior leaders, consortia, professional bodies, funders, policy makers and researchers.
  It provides evidence to support decision making. For example:
  - What is working well, and where could refinements be made?
  - Are all students benefitting from participating in a programme?
  - What is important for successful implementation?
  - Which programmes should time and money be invested in?
  Evaluation can be conducted internally by staff within an organisation, externally, or as a combination of both approaches.

  10. Evaluation aims for this project
  The evaluation framework has three key aims:
  1. To understand the impact of each Creativity Collaborative against their individual research question(s).
  2. To support Creativity Collaboratives in monitoring and evaluating their own practice, so that they can use data to learn about what they are doing, improve their practice and justify their practice to external stakeholders.
  3. To bring together and analyse the impact of the differing approaches adopted across the national Creativity Collaboratives programme, and to understand the overall impact of the programme in establishing teaching for creativity within schools and the curriculum.

  11. Who is the evaluation of this project for, and where will the findings be used?
  Immediate and direct benefits from the evaluation findings for:
  - Schools within the Creativity Collaboratives programme
  - Partner organisations
  - Arts Council England
  - Durham Commission on Creativity and Education
  Aiming to influence:
  - Department for Education and across Government
  - Ofsted
  - Funders
  - Schools
  - Organisations that work with schools
  - Business and enterprise

  12. Overview of the overarching evaluation framework

  13. Evaluation data collection
  The evaluation framework will include data collected at a range of levels. Data collection tools will be provided by the Durham research team.
  To be undertaken by all schools (leads and partners), supported by Creativity Collaborative Leads, at three time-points (baseline November 2021, interim July 2023 and end of programme July 2024):
  - Senior leadership school usual practice survey
  - Teacher confidence in teaching for creativity survey (all teachers in lead and partner schools)
  - Pupil-level data collection

  14. Evaluation data collection
  Undertaken by Creativity Collaborative Leads:
  - Online reflective portfolio: six time-points (February and July annually)
  - Attending and contributing to focus group discussions at annual conferences: three time-points (September 2022, 2023, 2024)
  - Completion of ACE annual reporting documentation (CC Leads): three time-points (September 2022, 2023, 2024)
  Undertaken with ACE at the end of the programme:
  - Interview with ACE central team and focus group with ACE regional managers (July 2024)
  - Case study with one or two Creativity Collaboratives

  15. Timeline (part 1)
  - September 2021: Successful Creativity Collaboratives announced
  - Theory of Change and evaluation plans development and training sessions
  - November 2021: Baseline senior leader usual practice survey, teacher confidence survey, pupil data collection
  - January 2022: Start of programme
  - February 2022: Creativity Collaborative Leads complete online reflective portfolio entry
  - July 2022: Creativity Collaborative Leads complete online reflective portfolio entry
  - September 2022: Workshop with presentation of interim evaluation findings and Creativity Collaborative Lead discussion focus groups
  Year 2
  - February 2023: Creativity Collaborative Leads complete online reflective portfolio entry
  - July 2023: Interim senior leader usual practice survey, teacher confidence survey, pupil data collection
  - September 2023: Creativity Collaborative Leads complete online reflective portfolio entry

  16. Timeline (part 2)
  - September 2023: Workshop with presentation of interim evaluation findings and Creativity Collaborative Lead discussion focus groups
  Year 3
  - February 2024: Creativity Collaborative Leads complete online reflective portfolio entry
  - July 2024: End of programme senior leader usual practice survey, teacher confidence survey, pupil data collection
  End of programme:
  - Creativity Collaborative Leads complete online reflective portfolio entry
  - ACE central team interviews and regional delivery manager focus group
  - Workshop with presentation of interim evaluation findings and final Creativity Collaborative Lead discussion focus groups
  - December 2024: End of project evaluation report submitted to ACE

  17. Summary of roles
  What are the roles and responsibilities in relation to the overarching evaluation framework and evaluation within your network?
  - Creativity Collaborative Lead
  - Lead Creativity Collaborative Schools
  - Partner Creativity Collaborative Schools
  - Partner organisations
  - ACE
  - Durham University Evaluation Team

  18. Questions so far?

  19. BREAK (5 minutes)

  20. What might you, as individual Creativity Collaboratives, want to learn from your own evaluation?

  21. Questions
  1. What impact are you aiming for your programme to have?
  2. What outcomes do you need to achieve to make that impact?
  3. What activities are you going to carry out to achieve those outcomes?

  22. Questions
  1. What impact are you aiming for your programme to have?
  2. What outcomes do you need to achieve to make that impact?
  3. What activities are you going to carry out to achieve those outcomes?
  What assumptions are you making about how the activities will lead to the outcomes, and how the outcomes will lead to the desired impact?

  23. Questions
  1. What impact are you aiming for your programme to have?
  2. What outcomes do you need to achieve to make that impact?
  3. What activities are you going to carry out to achieve those outcomes?
  Are the aims for the impact of your programme SMART (Specific, Measurable, Achievable, Relevant, Timely)?
  Will there be impact for different levels of participant (e.g. teaching staff, senior leadership, students, parents)?
  Will impact be the same across your whole network?

  24. Questions
  1. What impact are you aiming for your programme to have?
  2. What outcomes do you need to achieve to make that impact?
  3. What activities are you going to carry out to achieve those outcomes?
  How will you know if you have successfully achieved the desired impact?

  25. Questions
  Measuring impact:
  - What can you measure to assess impact objectively?
  - When do you expect measurable impact to have occurred?
  - Will the timeline for impact be different for different participants?
  - What contextual information do you need about participants to see whether the impact has been the same for everyone?

  26. Questions
  1. What impact are you aiming for your programme to have?
  2. What outcomes do you need to achieve to make that impact?
  3. What activities are you going to carry out to achieve those outcomes?
  How will you know what is working well and what could be improved? When is it useful to have this information?

  27. Questions
  Example questions to understand implementation:
  - Did everything get implemented as planned?
  - How was the programme changed as it was implemented?
  - How much of the programme did participants experience?
  - Did everyone who should have been able to access it have access?
  - Were there any groups that particularly struggled to access it?
  - Were there any barriers to participation? If so, what were they?
  - What did the participants think of the programme?
  - How was the programme different to what was already available?

  28. Data collection methods
  Align data collection closely to your research questions. Data collection could include:
  - Assessments
  - Interviews
  - Focus groups
  - Surveys/questionnaires
  - Concept maps
  - Samples of work
  - Policy documents
  - Video of classroom discussions/activities
  - Writing
  - Photographic data
  - Reflections
  - Diaries
  - Storytelling
  What kind of data will tell you what you need to know? Consider summative/impact data versus implementation and process data. The Durham evaluation team can support you with this.

  29. Evaluation Support
  The Creativity Collaborative Leads will be supported by the Durham research team to undertake their evaluation activities through several routes:
  - An evaluation toolkit provided by the research team at Durham University, which will include template data collection tools that Creativity Collaboratives have the option to choose from to support their evaluation activities ('pick and mix').
  - Two one-day workshops in October and November 2021 to train Creativity Collaborative Leads in developing and refining Theory of Change (ToC) models, to introduce the principles of evaluation, and to support Creativity Collaboratives in refining their evaluation design and data collection plans.
  - A one-hour phone call with the Durham research team, available to Creativity Collaborative Leads before the end of December 2021, for additional support.
  - From January 2022, up to five hours of evaluation coaching support provided by the Durham research team, which each Creativity Collaborative can access at any time up to the end of the programme.

  30. Questions and Discussion
  If you have questions about the evaluation after the session, please direct these to Sophia Ronketti (Sophia.Ronketti@artscouncil.org.uk) or Nicky Morgan (Nicky.Morgan@artscouncil.org.uk) at Arts Council England.
