Understanding and Evaluating Smart Specialisation in Practice

An overview of monitoring and evaluating Smart Specialisation Strategies (S3), with a focus on effectiveness, efficiency, and shared understanding. The presentation covers the challenges and processes involved in designing and conducting evaluations, the RIS3 model and its role in exposing the gap between expected and actual outcomes, and the basics of evaluation and how to improve processes for better results.



Presentation Transcript


1. Evaluating Smart Specialisation: Where to Start?
Ken Guy, Wise Guys Ltd., Brussels, 240119
Wise Guys Ltd., 2 Queens Place, Shoreham-by-Sea, W. Sussex BN43 5AA, UK
T +44 1273 45 45 35 | M +44 7715 05 41 10 | E ken.guy@wiseguys.ltd.uk | W www.wiseguys.ltd.uk

2. Structure of the Day
Monitoring Smart Specialisation
- Technical and conceptual challenges
- Organisational challenges
- Is there a shared understanding?
- Monitoring in action
From Monitoring to Evaluation
- What questions should an S3 evaluation answer?
- How should we evaluate?
- The relationship between monitoring and evaluation
- Timescale for evaluation
Evaluating Smart Specialisation: where do we start?

3. How do we get to our destination? Well, I wouldn't start from here.

4. Designing and Conducting Evaluations
- There has to be shared understanding
- Monitoring and evaluation are inextricably linked
- The basic concepts are quite easy to explain
- Designing M&E schemes is more difficult
- Implementing them can be hell
- But the effort is usually worth it

5. Evaluation Basics
Evaluation criteria and the questions they answer:
- Appropriateness: Was it the right thing to do?
- Economy: Has it worked out cheaper than we expected?
- Effectiveness: Has it lived up to expectations?
- Efficiency: What's the return on investment (ROI)?
- Process Efficiency: Is it working well?
- Quality: How good are the outputs?
- Impact: What has happened as a result of it?
- Additionality: What has happened over and above what would have happened anyway?
- Process Improvement: How can we do it better?
- Strategy: What should we do next?
Running alongside these: Contextual Understanding, Issues and Approaches, Models and Indicators, Data and Analysis, Communication.

6. RIS3 Model
A logic-model chain of Expected Inputs → Expected Outputs → Expected Results → Expected Impacts, mirrored by Actual Inputs → Actual Outputs → Actual Results → Actual Impacts, with the Performance Gap lying between the expected and actual chains.

7. In Practice
- RIS3 monitoring systems are, in theory, well suited to summative evaluations that focus on effectiveness (actual vs. expected outcomes) and efficiency (outcome/input ratios)
- In practice they are complicated by technical, conceptual and organisational challenges
- They are less well suited to formative evaluations designed to lead to learning and process improvements
- But learning can and should be embedded into existing systems
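The summative checks mentioned above reduce to simple comparisons of monitored values against targets. As a minimal sketch, not part of the original slides, the snippet below illustrates how a performance gap, an effectiveness ratio and an outcome/input efficiency ratio might be computed from RIS3 monitoring data; the indicator names and figures are hypothetical, invented purely for illustration.

```python
# Sketch of summative RIS3 checks: effectiveness (actual vs. expected outcomes),
# performance gap, and efficiency (outcome/input ratio). All values are hypothetical.

from dataclasses import dataclass


@dataclass
class Indicator:
    name: str
    expected: float  # target set when the strategy was designed
    actual: float    # value recorded by the monitoring system


def effectiveness(indicator: Indicator) -> float:
    """Effectiveness: actual outcome as a share of the expected outcome."""
    return indicator.actual / indicator.expected


def efficiency(outcome: Indicator, spend: Indicator) -> float:
    """Efficiency: outcome achieved per unit of input actually used."""
    return outcome.actual / spend.actual


inputs = Indicator("Public R&I funding committed (EUR m)", expected=50.0, actual=62.0)
results = Indicator("Firms introducing new-to-market products", expected=120.0, actual=95.0)

print(f"Effectiveness: {effectiveness(results):.0%} of the expected result achieved")
print(f"Performance gap: {results.expected - results.actual:.0f} firms short of target")
print(f"Efficiency: {efficiency(results, inputs):.2f} innovating firms per EUR m spent")
```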

8. Where Next for Regions?
Stocktaking
- In the absence of any need for regulatory compliance, regions will have to consider the utility of complementing RIS3 monitoring activities with evaluation activities, and the levels at which these can be tackled
Mid-term Reviews
- Mid-term reviews can lead to process improvements during the latter stages of the 2014-2020 cycle and provide valuable inputs into strategy formulation for the next cycle
- In many instances, RIS3 monitoring will need to be complemented by other customised evaluation exercises
Ex-post Evaluations
- Ex-post evaluations are critical for longer-term assessments of the efficiency and effectiveness of policies at the regional and supra-regional levels
- Commitment is needed from regional authorities to ensure that some RIS3 monitoring data can feed into both regional assessments of performance and comparisons of performance across regions

9. Where Next for the Commission?
- Eradicate confusion and inefficiency by ensuring that compliance via fulfilment criteria is associated with both RIS3 monitoring and evaluation activities
- Provide adequate guidance to regions concerning the design and conduct of RIS3 mid-term reviews and ex-post evaluations
- Devise a conceptual scheme and related guidance that will allow regions to produce some evaluation data that lend themselves to inter-regional comparison and aggregation
- But also emphasise the primacy of learning and the slightly different approaches needed to achieve this
- Support research into the use of data analytics to make sense of the vast amounts of data that will be generated via RIS3 monitoring and evaluation exercises

10. Are we there yet? Let's stop for a drink and think about it.
