Innovating evaluation
Examining the unintended consequences and complexities of evaluation processes, discussing the interplay between regulation and experimentation, and exploring challenges in the policy analysis and evaluation fields. The discussion dives into the need for systems thinking to make evaluation impactful.
Presentation Transcript
Innovating evaluation: challenges and opportunities to make evaluation impactful
- Cross-purpose mechanisms at play: an unintended consequence?
- Regulation vs. experimentation: constraints and opportunities to innovate evaluation
- Challenges in the politics and policy analysis/evaluation fields
- What can systems thinking and a complexity approach offer?
Mita Marra, PhD - University of Naples & the George Washington University
Associate Professor and Editor-in-Chief, Evaluation and Program Planning
mita.marra@unina.it or marram@gwu.edu
Cross-purpose mechanisms at play: an unintended consequence?
The mechanisms triggered by the institutionalization of evaluation were:
- Fear of punishment, shirking, compliance, red tape, resistance & contestation
As opposed to:
- Evaluative thinking, policy learning, social learning, inclusive participation, co-creation
Behavioral sciences have pointed to the biases and heuristics that end-receivers of information display when they perceive that information as potentially challenging.
Do we need to work on communication style and channels? Yes, but not only.
Experimentation vs. regulation: constraints & opportunities to innovate
On the supply side:
- Laws and by-laws are required and binding in civil law countries (compliance mentality)
- Nudge and soft-power measures (libertarian paternalism) to increase the influence of evaluation, e.g., sunset legislation; organizational roles such as the devil's advocate or red teaming, which spot shortcomings through critical thinking
On the demand side:
- Evidence-informed policy making
- Interest in social impact assessment cutting across the public and private sectors
- ESG and pay-for-success schemes that increase the demand for evaluation
Challenges in the policy/politics and policy analysis/evaluation fields
- Post-truth phenomena and polarization of ideological positions
- New and old crises (climate change and pandemic, war and poverty)
- Multi-actor stakeholder participation
- Overarching programs with multiple objectives and complex theories of change to explore
- Multiple sources of evidence to integrate, interpret, and make sense of (scientific evidence, evaluative findings, expert judgment, user experience, organizational know-how, citizen science, social media, Big Data, etc.)
What can systems thinking offer? Innovating evaluation: a few hints based on Hirschman
- Allow for self-evaluation and peer review => give latitude and space for unintended consequences and serendipity
- Envision framework legislation with standards set and peer-reviewed by independent authorities rather than binding by-laws
- Make different evaluation systems interact at different scales => backward and forward linkages, e.g., education evaluation with training and the labor market, scientific research & business ecosystems, infrastructure and services in the organization of urban life, cultural heritage and the environment. See:
https://www.ees2022.eu/files/copenhagen_framework_for_sound_evaluation_systems.pdf
https://europeanevaluation.org/wp-content/uploads/2020/04/Evaluation-Connections-December-2012.pdf
https://onlinelibrary.wiley.com/doi/abs/10.1002/sres.2423 by M. Reynolds, E. Gates, R. Hummelbrunner, M. Marra, B. Williams (2016)
- Reflect upon the positionality of the evaluator and other key stakeholders to grasp emergence and embeddedness => become aware of obstacles in perceiving change: e.g., are we observing change, immersed within it, or acting in favor of it?
- Play with evaluative designs, techniques, and methods => trespass disciplinary and traditional approaches, e.g., mixed-methods case study designs and/or case study mixed-methods designs
- Question metrics and indicators => unleash creativity beyond rationalistic or incrementalist approaches to reform: e.g., are ESG metrics useful? Why, when, how, and for whom?
- Focus on the use of evaluation findings rather than on the findings themselves => reform-mongering
Hirschman, A.O. (1967) Development Projects Observed, Brookings Institution, Washington DC