Challenges and Opportunities in Science and Technology Policy Indicators


Working towards indicators for the field of science and technology policy, this presentation discusses the increasing demands of research management and evaluation, the potential role of indicators in decision-making, the drawbacks of conventional indicators, and current issues with the use of science and technology indicators.





Presentation Transcript


  1. ORCID CASRAI Barcelona, May 2015. Towards indicators for opening up science and technology policy. Ismael Rafols, Ingenio (CSIC-UPV), Universitat Politècnica de València; SPRU (Science Policy Research Unit), University of Sussex, Brighton, UK; Observatoire des Sciences et des Techniques (OST-HCERES), Paris. Building on work with Tommaso Ciarli and Andy Stirling (SPRU), Loet Leydesdorff (Amsterdam), Alan Porter (Georgia Tech, Atlanta)

  2. Pressing demands of research management and evaluation. Increasing size of the research endeavour: 1.5 M papers per year in Web of Science alone. Globalisation: many mid-income countries have multiplied their publication output (China). Within a country: 3,000 postgraduate programmes are evaluated in 48 panels in Brazil. Increasing competition for funding, globally and locally: success rates of research calls are very low in the US and EU (10%-20%). Increasing societal demands: interactions with industry and social actors (NGOs); grand challenges (climate change, epidemics, water & food security). Traditional qualitative techniques of management cannot cope, hence the hope that the use of indicators can help...

  3. Can indicators help? Yes, indicators can help make decisions: they increase transparency and a sense of objectivity, reduce complexity, and reduce time and costs. The dream of rationality, the science of science policy (De Solla Price, Garfield, 1960s; Marburger, Julia Lane, 2000s)... but do they lead to the right decisions? Evaluation gap (Wouters): discrepancy between evaluation criteria and the social and economic functions of science.

  4. Perverse effects of conventional indicators. Conventional indicators (such as IFs or the h-index) are (often) biased against: field research (epidemiology), applied research, social sciences and humanities, peripheral countries, non-English publications and authors, and some topics outside the mainstream (e.g. preventive medicine)?? This reinforces existing power structures in S&T and reduces diversity, making S&T less relevant to society. (Q: would use of peer review lead to the same biased outcomes?)

  5. Current use of S&T indicators. Use of conventional S&T indicators is *problematic*: narrow inputs (only pubs!), scalar outputs (rankings!), aggregated solutions (missing variation), opaque selections and classifications (privately owned databases), and large, leading scientometric groups embedded in government / consultancy with limited possibility of public scrutiny. Sometimes even mathematically debatable: the Impact Factor of journals (only 2 years, large error bar), or the average number of citations per publication in skewed distributions.
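
Since the slide flags averages over skewed citation distributions as mathematically debatable, here is a minimal sketch of why the mean is a shaky summary of citation counts. The data are synthetic (a log-normal distribution chosen only for illustration), not figures from the talk.

```python
import numpy as np

# Sketch with synthetic data: citation counts are typically highly skewed,
# so the mean (used in average-citation indicators and, over a 2-year window,
# in the Journal Impact Factor) is pulled up by a few highly cited papers.
rng = np.random.default_rng(0)
citations = rng.lognormal(mean=1.0, sigma=1.2, size=2000).astype(int)

print("mean citations:   ", citations.mean())      # inflated by the long tail
print("median citations: ", np.median(citations))  # what a 'typical' paper gets
print("share of papers below the mean:",
      round((citations < citations.mean()).mean(), 2))  # usually well above 0.5
```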

  6. The Leiden Manifesto (in the making) on the use of indicators. Metrics should support, not replace, expert evaluation. They should match the institutional mission, should not suppress locally relevant research, should be simple, transparent, accessible and verifiable by those evaluated, and should take into account field and country differences/contexts. Metrics for individual researchers must be based on qualitative judgment. Intended and unintended effects of metrics should be reflected upon before use. Hicks, Wouters, Waltman, de Rijcke and Rafols (Nature, in press)

  7. How can S&T indicators help in science policy? What type of "answer" should indicators provide? Model 1: unique and prescriptive, proposing best choices; rankings, i.e. a ranked list of preferences. Model 2: plural and conditional, exploring complementary choices; facilitating options/choices in landscapes.

  8. From S&T indicators for justification and disciplining... Justification in decision-making: weak justification, "Give me a number, any number!"; strong justification, "Show in numbers that X is the best choice!" S&T indicators have a performative role: they don't just measure. They do not just happen to be used in science policy (neutral); they are a constitutive part of the incentive structure for disciplining (loaded). They signal to stakeholders what is important. Institutions use these techniques to discipline subjects and to articulate framings, goals and narratives on performance, collaboration, interdisciplinarity.

  9. ... towards S&T indicators as tools for deliberation. Yet it is possible to design indicators that foster plural reflection rather than justifying or reinforcing dominant perspectives. This shift is facilitated by trends pushed by ICT and visualisation tools: more inputs (publications, patents, but also news, websites, etc.); multidimensional outputs (interactive maps); institutional repositories; multiple solutions highlighting variation and confidence intervals; more inclusive and contrasting classifications (by-passing private data ownership? PubMed, arXiv); more possibilities for open scrutiny (new research groups).

  10. 1. Conceptual framework: broadening out vs. opening up policy appraisal

  11. Policy use of S&T indicators: Appraisal. Appraisal: the ensemble of processes through which knowledges are gathered and produced in order to inform decision-making and wider institutional commitments (Leach et al., 2008). Breadth: extent to which appraisal covers diverse dimensions of knowledge. Openness: degree to which outputs provide an array of options for policies.

  12. Policy use of S&T indicators: Appraisal. Appraisal: the ensemble of processes through which knowledges are gathered and produced in order to inform decision-making and wider institutional commitments (Leach et al., 2010). Example: allocation of resources based on research excellence. Breadth: extent to which appraisal covers diverse dimensions of knowledge. Narrow: citations/paper. Broad: citations, peer interviews, stakeholder views, media coverage, altmetrics. Openness: degree to which outputs provide an array of options for policies. Closed: fixed composite measure of variables (unitary and prescriptive). Open: consideration of various dimensions (plural and conditional).

  13. Appraisal methods: broad vs. narrow & closing vs. opening. [2x2 diagram: horizontal axis, effect of appraisal outputs on decision-making (closing-down vs. opening-up); vertical axis, range of appraisal inputs, i.e. issues, perspectives, scenarios, methods (narrow vs. broad).] Leach et al. 2010

  14. Appraisal methods: broad vs. narrow & close vs. open. [The same 2x2 diagram populated with example appraisal methods: cost-benefit analysis, risk assessment, sensitivity analysis, decision analysis, open hearings, structured interviews, citizens' juries, q-method, consensus conferences, scenario workshops, narrative-based participant observation, multi-criteria mapping.] Stirling et al. (2007)

  15. Appraisal methods: broad vs. narrow & closing vs. opening. [The same 2x2 diagram, placing most conventional S&T indicators?? in the narrow, closing-down quadrant.]

  16. Broadening out S&T indicators. [The same 2x2 diagram: an arrow moves conventional S&T indicators?? from narrow towards broad appraisal inputs.] Broadening out: incorporation of plural analytical dimensions (global & local networks, hybrid lexical-actor nets, etc.) and new analytical inputs (media, blogosphere...).

  17. Appraisal methods: broad vs. narrow & closing vs. opening. [The same 2x2 diagram with examples in the narrow, closing-down quadrant: journal rankings, university rankings, the European Innovation Scoreboard.] These are unitary measures that are opaque, tend to favour the established perspectives, and are easily translated into prescription.

  18. Opening up S&T indicators. [The same 2x2 diagram: an arrow moves conventional S&T indicators?? from closing-down towards opening-up.] Opening up: making explicit the underlying conceptualisations and creating heuristic tools to facilitate exploration. NOT about the uniquely best method, or the unitary best explanation, or the single best prediction.

  19. 2. Examples of Opening Up a. Broadening out AND Opening up b. Opening up WITH NARROW inputs

  20. 1. Preserving multiple dimensions in broad appraisals. [The same 2x2 diagram: conventional S&T indicators?? in the narrow, closing-down quadrant, with arrows for broadening out and opening up.] Leach et al. 2010

  21. Composite innovation indicators (25-30 indicators): the European (Union) Innovation Scoreboard. Grupp and Schubert (2010) show that the ranking order is highly dependent on the indicators' weightings. Sensitivity analysis.
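
To make the weighting sensitivity concrete, here is a minimal sketch of how the rank order produced by a weighted composite shifts when the weights change. The scores, country names and aggregation are entirely invented, not the Scoreboard's data or Grupp and Schubert's actual procedure.

```python
import numpy as np

# Sketch with synthetic data: rankings from a composite indicator built as a
# weighted average of sub-indicators can change markedly under different
# weighting schemes, echoing the sensitivity analysis of Grupp and Schubert (2010).
rng = np.random.default_rng(1)
countries = [f"Country_{c}" for c in "ABCDEFGH"]
scores = rng.uniform(0, 1, size=(len(countries), 25))  # 25 normalised sub-indicators

def ranking(weights):
    composite = scores @ (weights / weights.sum())      # weighted average per country
    return [countries[i] for i in np.argsort(-composite)]

print("equal weights:   ", ranking(np.ones(25)))
for trial in range(3):                                  # perturb the weights and re-rank
    print(f"random weights {trial}:", ranking(rng.uniform(0.5, 1.5, size=25)))
```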

  22. Solution: representing multiple dimensions (critique by Grupp and Schubert, 2010). Use of spider diagrams allows comparing like with like, e.g. U-rank university performance comparison tools (Univ. Twente). [Spider diagram with axes such as the Community trademarks indicator (5.4).]
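
As a rough illustration of the spider-diagram idea, the sketch below draws a radar chart for two hypothetical units. The dimension names and values are invented for illustration, not taken from U-rank or the Innovation Scoreboard.

```python
import numpy as np
import matplotlib.pyplot as plt

# Sketch with made-up data: a spider/radar diagram keeps the separate
# dimensions of a composite indicator visible instead of collapsing them
# into a single score.
dimensions = ["Publications", "Citations/pub", "PhD degrees",
              "Industry income", "Community trademarks", "International co-pubs"]
unit_a = [0.8, 0.6, 0.9, 0.3, 0.5, 0.7]
unit_b = [0.5, 0.9, 0.4, 0.8, 0.6, 0.4]

angles = np.linspace(0, 2 * np.pi, len(dimensions), endpoint=False).tolist()
angles += angles[:1]  # repeat the first angle to close the polygon

fig, ax = plt.subplots(subplot_kw={"polar": True})
for label, values in [("Unit A", unit_a), ("Unit B", unit_b)]:
    vals = values + values[:1]
    ax.plot(angles, vals, label=label)
    ax.fill(angles, vals, alpha=0.15)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(dimensions, fontsize=8)
ax.legend(loc="upper right")
plt.show()
```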

  23. 2. Examples of Opening Up b. Opening up WITH NARROW inputs

  24. Opening up S&T indicators. [The same 2x2 diagram: an arrow moves conventional S&T indicators?? from closing-down towards opening-up.] Opening up: making explicit the underlying conceptualisations and creating heuristic tools to facilitate exploration. NOT about the uniquely best method, or the unitary best explanation, or the single best prediction. Leach et al. 2010

  25. 1. Excellence: Opening Up Perspectives. Provide different perspectives of scientific impact.

  26. Measures of scientific excellence. [Bar charts comparing six units (ISSTI, SPRU, MIoIR, Imperial, WBS, LBS) on three measures: Journal Impact Factor, ABS Rank, and journal-field normalised citations per publication.] Which one is more meaningful?? Rafols et al. (2012, Research Policy)

  27. Measures of scientific excellence. [The same bar charts with a fourth measure added, citing-paper normalised citations per publication, alongside Journal Impact Factor, ABS Rank and journal-field normalised citations per publication, for ISSTI, SPRU, MIoIR, Imperial, WBS and LBS.] Which one is more meaningful?? Rafols et al. (2012, Research Policy)
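
For readers unfamiliar with field normalisation, here is a minimal sketch of one common variant: each paper's citations are divided by the world average for its field and year, and the ratios are averaged per unit. The records and world averages below are made up, and this is a generic formulation rather than the exact procedure of Rafols et al. (2012).

```python
from collections import defaultdict

# Sketch with hypothetical records: field-normalised citations per publication.
papers = [
    # (unit, field, year, citations) -- made-up records
    ("SPRU", "Management", 2008, 12),
    ("SPRU", "STS", 2009, 4),
    ("LBS", "Management", 2008, 30),
    ("LBS", "Economics", 2009, 7),
]

# Hypothetical world averages of citations per paper by (field, year).
world_avg = {("Management", 2008): 10.0, ("STS", 2009): 3.0,
             ("Economics", 2009): 8.0}

ratios = defaultdict(list)
for unit, field, year, cites in papers:
    ratios[unit].append(cites / world_avg[(field, year)])

for unit, rs in ratios.items():
    print(unit, "field-normalised citations/pub:", round(sum(rs) / len(rs), 2))
```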

  28. 2. Interdisciplinarity: Opening Up Perspectives. Explore different concepts of the same policy notion.

  29. Multiple concepts of interdisciplinarity. Conspicuous lack of consensus, but most indicators aim to capture the following concepts: Diversity (research that draws on diverse bodies of knowledge), Coherence (research that links different disciplines), and Intermediation (research that lies between or outside the dominant disciplines). [2x2 diagram of Diversity (low/high) vs. Coherence (low/high): high diversity and high coherence = interdisciplinary integration (diversity & coherence); high diversity and low coherence = multidisciplinary; low diversity = monodisciplinary. A separate low-to-high Intermediation axis runs from monodisciplinary to interdisciplinary.]

  30. Assessing interdisciplinarity: Diversity. [Map of the Web of Science categories of the references of ISSTI Edinburgh publications.]
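
Diversity of this kind is often summarised with the Rao-Stirling index, which the summary slide further below also uses. Here is a minimal sketch with made-up category shares and similarities, assuming the standard formula Delta = sum over i != j of p_i * p_j * d_ij (conventions differ on whether each unordered pair is counted once or twice).

```python
import numpy as np

# Sketch with made-up numbers: Rao-Stirling diversity of a reference portfolio,
# where p_i is the share of references in category i and d_ij = 1 - similarity
# between categories i and j.
categories = ["Econ", "Sociology", "CompSci", "Ecology"]
p = np.array([0.4, 0.3, 0.2, 0.1])            # shares of references per category

similarity = np.array([                        # hypothetical pairwise similarities
    [1.0, 0.6, 0.2, 0.1],
    [0.6, 1.0, 0.1, 0.2],
    [0.2, 0.1, 1.0, 0.1],
    [0.1, 0.2, 0.1, 1.0],
])
distance = 1.0 - similarity                    # diagonal is 0, so i == j adds nothing

rao_stirling = np.sum(np.outer(p, p) * distance)   # full double sum over i != j
print("Rao-Stirling diversity:", round(rao_stirling, 3))
```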

  31. Assessing interdisciplinarity: Coherence. [Map of observed/expected cross-citations between categories for ISSTI Edinburgh.]
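
Coherence via observed/expected cross-citations can be gauged with a simple contingency-table null model, as in the sketch below. The citation counts are invented and the null model is a generic choice, not necessarily the one behind the slide's map.

```python
import numpy as np

# Sketch with made-up counts: compare observed cross-citations between
# categories with the citations expected if each category cited others in
# proportion to their overall share (observed/expected > 1 suggests
# unusually strong linkage, i.e. higher coherence).
categories = ["Econ", "Sociology", "CompSci"]
# observed[i, j]: citations from papers in category i to papers in category j
observed = np.array([
    [50, 20,  5],
    [18, 40,  2],
    [ 4,  3, 30],
], dtype=float)

row_totals = observed.sum(axis=1, keepdims=True)
col_shares = observed.sum(axis=0) / observed.sum()
expected = row_totals * col_shares             # proportional "null" allocation

ratio = observed / expected
for i, src in enumerate(categories):
    for j, dst in enumerate(categories):
        if i != j:
            print(f"{src} -> {dst}: observed/expected = {ratio[i, j]:.2f}")
```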

  32. Assessing interdisciplinarity: Intermediation. [Journal citation map of ISSTI Edinburgh references, spanning management and psychology journals (e.g. JManage, AcadManageJ, JPersSocPsychol), science policy and STS journals (ResPolicy, Scientometrics, SocStudSci), medical journals (BritMedJ, Lancet), economics journals (Econometrica, JFinanc), and environmental science journals (ClimaticChange, EnergPolicy), among others.]

  33. Summary: IS (blue) units are more interdisciplinary than BMS (orange). [Chart comparing the units on three measures: more coherent (observed/expected cross-citation distance), more diverse (Rao-Stirling diversity), more interstitial (average similarity).]

  34. 3. Research focus: Opening Up Perspectives. Explore directions of research.

  35. Thinking in terms of research portfolios: the case of rice. [Topic map of rice research: pests and weeds (plant protection), rice varieties (classic genetics), plant nutrition, production & socioeconomic issues, transgenics (molecular biology, genomics), consumption (human nutrition, food technologies).] Ciarli and Rafols (2014, unpublished)
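
One simple way to operationalise such a portfolio view, before the country maps that follow, is to tally publications per topic per country. The sketch below uses invented publication records and keyword-to-topic rules purely for illustration; it is not the method used by Ciarli and Rafols.

```python
from collections import Counter

# Sketch with hypothetical records and keyword rules: count a country's rice
# publications across portfolio topics. Topic labels follow the slide; the
# keyword lists and records are made up.
topic_keywords = {
    "Plant protection": ["pest", "weed", "blast"],
    "Classic genetics": ["variety", "breeding"],
    "Molecular biology / genomics": ["transgenic", "genome", "gene expression"],
    "Human nutrition / food tech": ["nutrition", "cooking", "starch"],
    "Production & socioeconomics": ["yield gap", "farmer", "irrigation"],
}

records = [  # (country, title) -- made-up publication records
    ("India", "Breeding a drought-tolerant rice variety"),
    ("India", "Farmer adoption of irrigation practices"),
    ("US", "Genome-wide gene expression in rice seedlings"),
    ("US", "Transgenic approaches to rice blast resistance"),
]

portfolio = Counter()
for country, title in records:
    for topic, keywords in topic_keywords.items():
        if any(kw in title.lower() for kw in keywords):
            portfolio[(country, topic)] += 1

for (country, topic), n in sorted(portfolio.items()):
    print(f"{country:5s} {topic}: {n}")
```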

  36. Rice research US, 2000-12 Ciarli and Rafols (2014, unpublished)

  37. Rice research India 2000-12 Ciarli and Rafols (2014, unpublished)

  38. Rice research Thailand 2000-12 Ciarli and Rafols (2014, unpublished)

  39. Rice research Brazil 2000-12 Ciarli and Rafols (2014, unpublished)

  40. 3. Summary and conclusions

  41. S&T indicators as tools to open up the debate. Conventional use of indicators ('pure scientist', Pielke): purely analytical character (i.e. free of normative assumptions); instruments of objectification of dominant perspectives; aimed at legitimising/justifying decisions (e.g. excellence); unitary and prescriptive advice. Opening up scientometrics ('honest broker', Pielke): aimed at locating the actors in their context and dynamics; not predictive or explanatory, but exploratory; construction of indicators based on a choice of perspectives; making explicit the possible choices on what matters; supporting debate; making science policy more socially robust; plural and conditional advice. Barré (2001, 2004, 2010), Stirling (2008)

  42. Strategies for opening up, or how to keep it complex yet manageable. Presenting contrasting perspectives: at least TWO, in order to give a taste of choice. Simultaneous visualisation of multiple properties/dimensions, allowing the user to take their own perspective. Interactivity: allowing the user to give their own weight to criteria/factors and to manipulate visuals...

  43. Is opening up worth the effort? (1) Sustaining diversity in the S&T system. Decrease in diversity: a potential unintended consequence of the evaluation machine. Why diversity matters: a systemic ('ecological') understanding of S&T, in which outcomes depend on synergistic interactions between disparate elements; a dynamic understanding of excellence and relevance, given new social needs, challenges and expectations from S&T; managing diverse portfolios to hedge against uncertainty in research, e.g. the Office of Portfolio Analysis (National Institutes of Health), http://dpcpsi.nih.gov/opa/; opening the possibility for S&T to work for the disenfranchised, e.g. topics outside dominant science (neglected diseases).
