METHODS MATTER: ON THE IMPORTANCE OF RELEVANT EVIDENCE FOR POLICY SOLUTIONS
PROF. JAROSŁAW GÓRNIAK
CENTRE FOR EVALUATION AND ANALYSIS OF PUBLIC POLICIES
JAGIELLONIAN UNIVERSITY IN KRAKOW
“Evidence-based policy in Erasmus+. Seminar on research and methodology”
November 28-30, Warsaw
Credit for the title:
THE IMPORTANCE OF THEORY FOR POLICY MAKING
“There is Nothing More Practical Than A Good Theory” – Kurt Lewin (Ludwig Boltzmann? James Maxwell?)
“Theory is when you know everything but nothing works. Practice is when everything works but no one knows why. In our laboratory, theory and practice are combined: nothing works and no one knows why!” – Albert Einstein
“(…) practitioners and policymakers—at all levels—wanted to know the answers to questions about cause and effect. They wanted to know if A caused B, and wanted IES to commission research that would provide them with answers.” – Murnane & Willett
A theory that matters for policy has to make evidence-based causal claims
EVIDENCE-BASED POLICY – POLICY ANALYSIS – EVALUATION
Evidence-based policy – public policy based on scientifically sound evidence
Policy analysis as analysis for policymaking – providing policy makers with (evidence-based) advice on problems, causal mechanisms, instruments, and the potential consequences of the available options
Evaluation as a source of knowledge about what works, for whom, and in what circumstances
[Policy cycle diagram: social issue on the policy agenda → problem definition (diagnosis, elaboration, evidence) → policy analysis (designing policy options; calculating the consequences of particular options; ex ante evaluation; policy recommendation; choice of policy option; policy design) → implementation (ongoing evaluation) → ex post evaluation]
POLICY ANALYSIS WITHOUT EVALUATION IS BLIND
EVALUATION WITHOUT POLICY ANALYSIS IS POWERLESS
NOBEL LAUREATE JAMES HECKMAN: THE EVALUATION OF PUBLIC POLICIES FACES THREE MAIN PROBLEMS
“Evaluating the impact of historical interventions on outcomes including their impact in terms of the well-being of the treated and society at large.”
“Forecasting the impacts (constructing counterfactual states) of interventions implemented in one environment in other environments, including their impacts in terms of well-being.”
“Forecasting the impacts of interventions (constructing counterfactual states associated with interventions) never historically experienced to various environments, including their impacts in terms of well-being.”
(Heckman, 2008, p. 8; see also Heckman, 2005)
GOOD EVIDENCE
Policy-relevant – justifying the choice of policy conduct and instruments
Trustworthy:
- Sound theory – causal claims
- Proper scientific methodology – research design and measurement
- Dependable and up-to-date data
Conclusive – “clearly speaks for or against the policy”
“Will it work here? That is, will the policy that you are considering make a positive difference in the desired outcome if you implement it, bearing in mind how, where, and when you would do so? In the language of the standard literature, this is a call for a prediction of effectiveness.” – Cartwright & Hardie
MARYLAND EVALUATION SCALE
Level 5 – Randomised trials: evaluations with well-implemented random assignment of treatment to subjects in treatment and control groups
Level 4 – Quasi-experiments: evaluations that use a naturally occurring event that makes the treatment assignment as good as random
Level 3 – Matching techniques / regression analysis: non-experimental evaluations where treatment and comparison groups are matched on observable characteristics
Level 2 – Simple comparisons: studies with a treated and a comparison group, but with no attempt to control for differences among the groups
Level 1 – Pre- and post-analysis: studies where no comparison group is used; outcomes are measured pre- and post-treatment
Source: Daniel Fujiwara, Methodological Developments and Challenges in UK Policy Evaluation (presentation at the VIII Evaluation Conference in Warsaw), based on Farrington, D. P. (2002). Methodological Quality Standards for Evaluation Research. Paper presented at the Third Annual Jerry Lee Crime Prevention Symposium, University of Maryland.
RANDOMIZED CONTROL TRIALS ARE IMPORTANT, BUT NOT SUFFICIENT
“They [RCTs] cannot alone support the expectation that a policy will work for you. What they tell you is true—that this policy produced that result there.”
“They do not even tell you that a policy works. What they tell you is that a policy worked there, where the trial was carried out, in that population.”
“The fact that it worked there is indeed fact. But for that fact to be evidence that it will work here, it needs to be relevant to that conclusion.”
– Nancy Cartwright and Jeremy Hardie, Evidence-Based Policy: A Practical Guide to Doing It Better
An RCT provides an internally valid effect size and significance (“works there”), but no insight into the causal mechanism and no guarantee of external validity (“will it work here?”)
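Cartwright and Hardie's point can be made concrete with a toy simulation (all coefficients and prevalences below are hypothetical, chosen only for illustration): the same treatment is trialled in two populations that differ only in the prevalence of a binary effect-moderator, and the internally valid estimate travels badly.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 20_000

def toy_rct(p_moderator: float) -> float:
    """Simulate an RCT in a population where the treatment works only
    for units with a binary moderator m=1, present with probability
    p_moderator. Returns the estimated average treatment effect."""
    m = rng.random(n) < p_moderator             # effect-moderator
    x = rng.integers(0, 2, n)                   # randomized treatment
    y = 1.0 * x * m + rng.normal(0.0, 1.0, n)   # outcome: effect only if m=1
    return y[x == 1].mean() - y[x == 0].mean()

effect_there = toy_rct(0.9)  # trial population: moderator is common
effect_here = toy_rct(0.1)   # target population: moderator is rare

print(f"worked there: {effect_there:.2f}")  # large
print(f"works here:   {effect_here:.2f}")   # small
```

Both trials are internally valid; only the make-up of the population changes, yet the policy that "worked there" barely moves the outcome "here".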
WHY CAUSAL MECHANISMS MATTER
X – the randomized stimulus
Y – the measure of the outcome
η1 – the latent variable manipulated by the stimulus (the cause of the policy outcome)
η2 – the latent outcome variable (the policy outcome)
Problems:
- How strong is the effect of the stimulus X on η1?
- How good is Y as a measure of the outcome η2?
- If these relations are weak, results obtained traditionally by ANOVA/regression will not reveal the influence, even if it is a strong one
Kenneth A. Bollen and Judea Pearl (2012), Eight Myths About Causality and Structural Equation Models
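A minimal numerical sketch of this problem (the coefficients are hypothetical, not from the slides): even when the latent causal path η1 → η2 is strong, a weak stimulus-to-η1 link and a noisy measure Y shrink the X–Y contrast that ANOVA/regression would detect.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

x = rng.integers(0, 2, n)                 # randomized stimulus X

# Weak link X -> eta1 (coefficient 0.1), but a strong
# latent causal path eta1 -> eta2 (coefficient 1.0).
eta1 = 0.1 * x + rng.normal(0.0, 1.0, n)
eta2 = 1.0 * eta1 + rng.normal(0.0, 1.0, n)

# Y is an unreliable measure of eta2 (measurement noise sd = 3).
y = eta2 + rng.normal(0.0, 3.0, n)

# Naive experimental contrast: difference in mean Y across arms.
observed = y[x == 1].mean() - y[x == 0].mean()

# The latent mechanism itself, if eta1 and eta2 were observable.
latent_slope = np.polyfit(eta1, eta2, 1)[0]

print(f"observed X->Y contrast:  {observed:.3f}")      # small
print(f"latent eta1->eta2 slope: {latent_slope:.3f}")  # near 1.0
```

The strong mechanism is real, but the experiment sees only its attenuated shadow through the weak stimulus and the noisy measure.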
WHY CAUSAL MECHANISMS MATTER
- X causes another latent variable η3, which in turn causes η2
- X and Y remain significantly associated
Kenneth A. Bollen and Judea Pearl (2012), Eight Myths About Causality and Structural Equation Models
WHY CAUSAL MECHANISMS MATTER
- X and Y are significantly associated
- η1 is not the true cause of η2
- The stimulus X causes another variable, η4, which does not cause η2 but does cause Y
Kenneth A. Bollen and Judea Pearl (2012), Eight Myths About Causality and Structural Equation Models
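This scenario can likewise be sketched numerically (hypothetical coefficients): X moves a side variable η4 that contaminates the measure Y, so X and Y are strongly associated even though the true policy outcome η2 never changes.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000

x = rng.integers(0, 2, n)                 # randomized stimulus X

# X moves a side variable eta4 that feeds the *measure* Y ...
eta4 = 1.0 * x + rng.normal(0.0, 1.0, n)
# ... while the true policy outcome eta2 is independent of X.
eta2 = rng.normal(0.0, 1.0, n)
y = eta2 + 1.0 * eta4 + rng.normal(0.0, 1.0, n)

measured = y[x == 1].mean() - y[x == 0].mean()            # looks like success
true_effect = eta2[x == 1].mean() - eta2[x == 0].mean()   # nothing happened

print(f"X->Y contrast:    {measured:.3f}")     # near 1.0
print(f"X->eta2 contrast: {true_effect:.3f}")  # near 0
```

A trial reading only Y would declare the policy a success while the outcome it was meant to change is untouched, which is exactly why the mechanism behind the measurement matters.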
WHAT MATTERS FOR GOOD EVIDENCE IS BOTH:
PROPER CAUSAL THEORY AND ADEQUATE METHODS
Source: http://www.dagitty.net/dags.html#
LIMITATIONS OF EVIDENCE-BASED POLICY
Policy decisions are based not only on evidence; they are also prone to the influence of competing interests – politics is involved here
Institutional and cultural constraints matter too
Evidence may not reflect the political priority or social desirability of what is being measured
Priorities and social desirability usually differ across groups
External validity of social research is more problematic than in medicine (“what works there might not work here”)
Evidence bias (Parkhurst, 2017):
- Technical bias: evidence can be misused or manipulated for political reasons
- Issue bias: appeals to evidence serve to obscure key social values or to impose political priorities in unrepresentative ways
EBP is a significant component of the rational model of policy-making; it is less significant, or takes on a different meaning, in other models of the policy process: policy as a political game, a discourse, a garbage can, or an institutional process (see Enserink, Koppenjan & Mayer, 2013)
LIMITATIONS OF EVIDENCE-BASED POLICY
Problem of policy scope, size, and complexity: evidence based on causal research (“what works”) is often restricted to selected policy problems and is of limited use for complex reforms
Communication problems – decision-makers use stories rather than pure scientific reports; there is a need for translation from the language of science into policy narratives
Timing – politicians (like businessmen) have much shorter timescales than researchers
The job of policy makers is to anticipate rather than to explain past events and processes, whereas social scientists prefer the latter
“No prophet is acceptable in his own country”
WHAT TO DO?
Policy makers: commission methodologically excellent and
causally conclusive policy research
Universities: train researchers more thoroughly
Experts: obtain and use sound evidence
Scientists: develop proper theories and methods
REFERENCES
Bollen, K. A., & Pearl, J. (2013). Eight myths about causality and structural equation models. In S. L. Morgan (Ed.), Handbook of Causal Analysis for Social Research (pp. 301-328). Dordrecht: Springer.
Cartwright, N., & Hardie, J. (2012). Evidence-Based Policy: A Practical Guide to Doing It Better. Oxford: Oxford University Press.
Enserink, B., Koppenjan, J. F. M., & Mayer, I. S. (2012). A Policy Sciences View on Policy Analysis. In W. A. H. Thissen & W. E. Walker (Eds.), Public Policy Analysis: New Developments.
Heckman, J. J. (2005). The scientific model of causality. Sociological Methodology, 35(1), 1-97.
Heckman, J. J. (2008). Econometric causality. International Statistical Review, 76(1), 1-27.
Murnane, R. J., & Willett, J. B. (2011). Methods Matter: Improving Causal Inference in Educational and Social Science Research. New York: Oxford University Press.
Parkhurst, J. (2017). The Politics of Evidence: From Evidence-Based Policy to the Good Governance of Evidence. London and New York: Taylor & Francis.
THANK YOU FOR YOUR ATTENTION!
JAROSLAW.GORNIAK@UJ.EDU.PL
