Evaluation Culture in SEE: Comparative Study in Public Procurement Innovation


This study explores the evaluation culture in South East Europe (SEE) with a focus on public procurement innovation through a comparative and needs assessment approach. The research considers the feasibility of benchmarking with both objective and subjective indicators, highlighting challenges in diverse national contexts. Methodological insights from literature and research are discussed, emphasizing the importance of stakeholder engagement, knowledge transfer, and decision-making in public procurement of Research, Technological Development, and Innovation (RTDI) evaluations.



Presentation Transcript


  1. Evaluation culture in SEE (Public procurement in SEE innovation evaluations: A comparative and needs assessment study). Lena Tsipouri and Nikos Sidiropoulos, University of Athens, Centre of Financial Studies. December 12, 2013, Zagreb. www.eval-inno.eu

  2. Outline: Does a comparative study and benchmarking make sense (objective/subjective indicators)? Methodological remarks (literature; research); experiences from the EVAL-INNO countries; a synthesis; lessons from intelligent benchmarking.

  3. Does a comparative study and benchmarking make sense? The background of the countries studied differs considerably (Austria, Greece, Hungary, Bulgaria / Montenegro, Serbia). The legal framework is hardly applicable to the size of the procurement studied. Objective indicators (measurable, such as number of evaluations, budgets and frequency, but also qualitative regarding quality and response) are limited. Subjective indicators can be obtained but are subject to criticism (the solution is stakeholder approval).
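
The slide does not prescribe a formula, but the tension between objective and subjective indicators can be illustrated with a small sketch. Everything below (indicator names, country values, weights, the 0-5 scale) is hypothetical and only shows one way measurable counts and stakeholder ratings could be put on a common footing for comparison.

```python
# Illustrative sketch only: the indicators, values and weights are hypothetical,
# not taken from the EVAL-INNO study.

def normalise(value, worst, best):
    """Linearly rescale a raw value to 0-5, clipping at the extremes."""
    if best == worst:
        return 0.0
    score = 5.0 * (value - worst) / (best - worst)
    return max(0.0, min(5.0, score))

# Objective indicator: e.g. number of RTDI evaluations tendered per year (hypothetical counts).
objective = {"Austria": 12, "Bulgaria": 3, "Greece": 2}
# Subjective indicator: e.g. average stakeholder rating (1-5) of ToR quality (hypothetical ratings).
subjective = {"Austria": 4.2, "Bulgaria": 2.5, "Greece": 2.0}

for country in objective:
    obj_score = normalise(objective[country], worst=0, best=15)
    subj_score = normalise(subjective[country], worst=1, best=5)
    combined = 0.5 * obj_score + 0.5 * subj_score   # equal weights, an assumption
    print(f"{country}: objective {obj_score:.1f}, subjective {subj_score:.1f}, combined {combined:.1f}")
```

The equal weighting is an arbitrary choice; in practice the weights themselves would need the kind of stakeholder approval the slide mentions.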

  4. Methodological remarks (literature): Common rules for public procurement exist as a tool of the Single Market (contracts over 499,000 Euros; open procedures, restricted procedures, negotiated procedures and competitive dialogue). RTDI evaluations usually fall in the budget range of 50,000-500,000 Euros. There is no real academic literature. There is a broad number of tender documents and terms of reference available, mostly (but not exclusively) originating in procurement by the European Commission. Information on the later stages of the procurement, namely monitoring of the evaluations and their acceptance, is practically absent.
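
As a reading aid only, the budget bands quoted on the slide can be expressed as a simple classification rule. The thresholds below come from the slide; the function name and the interpretation of the lower band are assumptions, not part of the presentation.

```python
# Thresholds quoted on the slide; the classification labels are illustrative.
EU_THRESHOLD = 499_000          # EUR: EU-wide procurement rules apply above this
RTDI_RANGE = (50_000, 500_000)  # EUR: typical budget band for RTDI evaluations

def procurement_regime(contract_value_eur: int) -> str:
    """Classify a contract by the budget bands mentioned on the slide."""
    if contract_value_eur > EU_THRESHOLD:
        return "EU-wide rules (open/restricted/negotiated/competitive dialogue)"
    if RTDI_RANGE[0] <= contract_value_eur <= RTDI_RANGE[1]:
        return "typical RTDI evaluation budget band"
    return "below typical RTDI evaluation budgets"

print(procurement_regime(120_000))   # typical RTDI evaluation budget band
print(procurement_regime(750_000))   # EU-wide rules
```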

  5. Methodological remarks (research): Borrow notions for benchmarking from the overall literature on public procurement: quality provision of the service; knowledge of the international standards and good practices; knowledge of the local needs and capabilities; knowledge of international capabilities and interest to respond to national tenders (price and reputational issues); good and timely decisions.

  6. Public procurement of RTDI evaluations, our approach: decompose the process into identifying the requirements and user readiness; market intelligence; the tendering process (terms of reference: background, data availability, questions and methods); assessing tenders and awarding contracts; managing contract delivery; response to recommendations.

  7. The basic dimensions: the institutional set-up (formal and informal rules); key organisations involved; the tendering process.

  8. The institutional set-up (formal rules): the budget thresholds for general provisions for public tendering; the existence (or not) of special provisions for RTDI evaluations (e.g. specific thresholds, individual selection procedures etc.); explicit legislation (or not) regarding the legal obligation of awarding authorities to evaluate their programmes or organisations; the existence (or not) of evaluation standards.

  9. The institutional set-up (informal rules): the relevant parameters for launching tenders (strategy issues) are the frequency and the type of evaluations.

  10. Key organisations involved: awarding authorities (how many, how good, how can they improve); evaluators (local, national, international; issues of independence, expertise and reliability for evaluators called for direct or restricted tenders; how good, how can the market evolve); other stakeholders (exercising pressure for RTDI evaluations).

  11. Tendering process: Terms of Reference (how good they are/could be); smooth process (no legal or other complications); time to contract (benchmarks); monitoring (hands on or off?); content (how ambitious are the Terms of Reference?); adoption of recommendations (of the specific evaluation and more in general).

  12. Type of parameters
  Parameter | Objective | Subjective
  FORMAL RULES OF PUBLIC PROCUREMENT OF EVALUATION
  The budget thresholds for general provisions for public tendering | Yes |
  The existence (or not) of special provisions for RTDI evaluations (e.g. specific thresholds; individual selection procedures etc.) | Yes |
  Explicit legislation (or not) regarding the legal obligation of awarding authorities to evaluate their programmes or organisations | Yes |
  The existence (or not) of evaluation standards | Yes |
  INFORMAL RULES OF PUBLIC PROCUREMENT OF EVALUATION
  The frequency of evaluations | | Yes, because there is no systematic record
  The type of evaluations | | Yes
  The willingness to improve | | Yes
  Who are the champions? | | Yes

  13. Type of parameters
  IMPLEMENTATION
  Smooth process | Yes
  Time to contract | Yes (no track record)
  Monitoring | Yes
  Content | Yes
  Adoption of recommendations | Yes (no track record)
  AWARDING AUTHORITIES
  Awarding authorities in WP3 | Yes, number from the DB
  Awarding authorities interviewed | Yes, based on willingness
  Experience in evaluation market/needs | Yes
  Experience in drafting ToR | Yes
  Willingness to experiment | Yes
  Willingness to participate in training | Yes
  EVALUATORS
  Evaluators in WP3 | Yes, number from the DB
  Evaluators interviewed | Yes, based on willingness
  Experience | Yes
  Experience in drafting ToR | Yes
  Willingness to participate in training | Yes

  14. Experiences from the EVAL-INNO countries: measuring/subjective rating per parameter decomposed; comments per parameter; comments per country.

  15. An example of experiences from the EVAL-INNO countries
  Country | Special provisions for RTDI evaluations | Explicit legislation | Standards | Comments per country
  Austria | No | Yes | Yes | Model country
  Bulgaria | No | No | No | Significant problems
  Greece | No | No | No | Significant problems
  Hungary | No | Yes | No | Basics in place
  Montenegro | No | No | No | Significant problems
  Serbia | No | No | No | Significant problems
  Comments per parameter: special provisions exist only in the general framework; in Austria and Hungary explicit rules require all programmes to be evaluated; standards only exist in Austria.

  16. Experiences from the EVAL-INNO countries
  Country | Frequency identified by country visits | Type of evaluations | Willingness to improve/experiment (max 5*) | RTDI evaluation champions
  Austria | High | Restricted tenders very frequent | *** | *** (Platform)
  Bulgaria | Low | Mainly mandatory through SF | ** | * (Structural Funds)
  Greece | Negligible | Mainly internal | ** | * (GSRT)
  Hungary | Low | Mainly internal | * | * (New Unit)
  Montenegro | Negligible | PRAG | ** | --
  Serbia | Negligible | PRAG | * | --
  PRAG: the Practical Guide to Contract Procedures for EU External Actions, defined by the European Commission.

  17. Experiences from the EVAL-INNO countries
  Country | Smooth process | Time to contract | Monitoring | Content | Adoption of recommendations | Comments per country
  Austria | Yes | *** | Good/variable | Variable | 60% | Implementation is smooth but can be further improved
  Bulgaria | Yes | *** | Limited/variable | Standard | 40% | Need to improve monitoring, content of the ToR and relevance of recommendations
  Greece | Yes | * | Limited | Standard | 20% |
  Hungary | Yes | ** | Limited/variable | Standard | 40% |
  Montenegro | Yes | ** | Limited | Standard | 30% |
  Serbia | Yes | ** | Limited | Standard | 30% |
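
A hypothetical illustration, not the study's method: the table above mixes Yes/No answers, star ratings, qualitative monitoring and content labels, and adoption percentages. The sketch below shows one way such heterogeneous entries could be mapped onto a single 0-5 score per country; the label-to-number mapping, the reading of stars as a 0-5 scale and the unweighted average are all assumptions.

```python
# Hypothetical scoring scheme: maps the mixed entries of the implementation
# table onto a single 0-5 scale per country.

MONITORING = {"Limited": 1.0, "Limited/variable": 2.5, "Good/variable": 4.0}  # assumed mapping
CONTENT = {"Standard": 2.5, "Variable": 4.0}                                  # assumed mapping

def implementation_score(smooth, time_stars, monitoring, content, adoption_pct):
    """Average the five implementation indicators on a common 0-5 scale."""
    parts = [
        5.0 if smooth else 0.0,            # smooth process: yes/no
        float(time_stars),                 # time to contract: stars read as 0-5
        MONITORING.get(monitoring, 0.0),   # monitoring label
        CONTENT.get(content, 0.0),         # ambition of the ToR content
        5.0 * adoption_pct / 100.0,        # adoption of recommendations (%)
    ]
    return sum(parts) / len(parts)         # unweighted average, an assumption

# Austria and Greece rows from the table above.
print(round(implementation_score(True, 3, "Good/variable", "Variable", 60), 1))
print(round(implementation_score(True, 1, "Limited", "Standard", 20), 1))
```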

  18. A topic for benchmarking: a radar chart comparing the six countries (Austria, Bulgaria, Greece, Hungary, Montenegro, Serbia) on a 0-5 scale across five dimensions: formal institutions, informal institutions, implementation, awarding authorities and evaluators.
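
For readers who want to reproduce a chart of this kind, a minimal sketch follows. The five dimensions come from the slide; the per-country scores are illustrative placeholders, since the original figures are not recoverable from the transcript, and only three of the six countries are shown.

```python
# Sketch of a benchmarking radar chart; scores are illustrative placeholders.
import numpy as np
import matplotlib.pyplot as plt

dimensions = ["Formal institutions", "Informal", "Implementation",
              "Awarding authorities", "Evaluators"]
scores = {                                   # hypothetical 0-5 scores
    "Austria":    [4.5, 4.0, 4.0, 4.0, 4.5],
    "Greece":     [2.0, 1.5, 1.5, 2.0, 2.5],
    "Montenegro": [1.5, 1.0, 1.5, 1.5, 1.0],
}

# One angle per dimension; repeat the first point to close each polygon.
angles = np.linspace(0, 2 * np.pi, len(dimensions), endpoint=False).tolist()
angles += angles[:1]

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
for country, values in scores.items():
    values = values + values[:1]
    ax.plot(angles, values, label=country)
    ax.fill(angles, values, alpha=0.1)

ax.set_xticks(angles[:-1])
ax.set_xticklabels(dimensions)
ax.set_ylim(0, 5)
ax.legend(loc="upper right", bbox_to_anchor=(1.3, 1.1))
plt.show()
```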

  19. What next: Do countries want to incorporate the benchmarking lessons (what should be their priorities; what is their distance to the top; who are the top performers; from whom to learn)? Should/could the EU play a role in taking the (obvious but documented) results a step further? If yes, how? Are other stakeholders interested in the results (to be read horizontally or vertically)?

  20. Thank you for your attention! tsipouri@econ.uoa.gr nikos.sidiropoulos@gmail.com office@eval-inno.eu www.eval-inno.eu
