Understanding Bayesian Networks in Fine Arts Investigations


Explore the application of Bayesian networks in quantifying the weight of evidence in fine arts investigations. Delve into probability theory, Bayes' theorem, decision theory, and their implementation. Discover how Bayesian statistics provide a framework for comparing theories and updating probabilities in light of new evidence.






Presentation Transcript


  1. Quantitative Provenance: Using Bayesian Networks to Help Quantify the Weight of Evidence in Fine Arts Investigations. A Case Study: Red, Black and Silver

  2. Outline
  - Probability theory and Bayes' theorem
  - Likelihood ratios and the weight of evidence
  - Decision theory and its implementation: Bayesian networks
  - Simple example of a BN: Why is the grass wet?
  - The Taroni Bayesian network for trace evidence
  - The Bayesian network for Red, Black and Silver
  - Stress testing: sensitivity analysis
  - Recommendation for RBS

  3. Probability Theory
  "The actual science of logic is conversant at present only with things either certain [or] impossible. Therefore the true logic for this world is the calculus of Probabilities, which takes account of the magnitude of the probability which is in a reasonable man's mind." James Clerk Maxwell, 1850 [C]
  "Probability theory is nothing but common sense reduced to calculation." Laplace, 1819 [L]

  4. Probability Theory
  "Probability: A particular scale on which degrees of plausibility can be measured. They are a means of describing the information given in the statement of a problem." E. T. Jaynes, 1996 [Ja]

  5. Probability Theory
  Probability theory forms the rules of reasoning. Using probability theory we can explore the logical consequences of our propositions. Probabilities can be updated in light of new evidence via Bayes' theorem.

  6. Bayesian Statistics
  The basic Bayesian philosophy: Prior Knowledge + Data = Updated Knowledge, a better understanding of the world. In short: Prior + Data = Posterior.

  7. The Bayesian Framework
  Bayes' theorem to compare theories:
  Ha = Theory A (the prosecution's hypothesis [AT])
  Hb = Theory B (the defence's hypothesis [AT])
  E = any evidence
  I = any background information
  Pr(Ha | E, I) = Pr(E | Ha, I) Pr(Ha | I) / Pr(E | I)
  Pr(Hb | E, I) = Pr(E | Hb, I) Pr(Hb | I) / Pr(E | I)
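As a toy illustration of the two expressions above, here is a minimal Python sketch; the priors and likelihoods are hypothetical placeholder numbers, not figures from the case.

```python
# Hypothetical priors Pr(H|I) and likelihoods Pr(E|H,I) for two theories.
prior = {"Ha": 0.5, "Hb": 0.5}
likelihood = {"Ha": 0.08, "Hb": 0.02}

# Pr(E | I): total probability of the evidence under the two theories.
p_evidence = sum(prior[h] * likelihood[h] for h in prior)

# Posterior probability of each theory given the evidence, via Bayes' theorem.
posterior = {h: prior[h] * likelihood[h] / p_evidence for h in prior}
print(posterior)  # {'Ha': 0.8, 'Hb': 0.2}
```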

  8. The Bayesian Framework
  Odds form of Bayes' rule:
  Pr(Ha | E, I) / Pr(Hb | E, I) = [Pr(E | Ha, I) / Pr(E | Hb, I)] × [Pr(Ha | I) / Pr(Hb | I)]
  Posterior odds in favour of Theory A = Likelihood ratio × Prior odds in favour of Theory A
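Pushing the same hypothetical numbers through the odds form shows how the likelihood ratio rescales the prior odds into posterior odds.

```python
# Odds form: posterior odds = likelihood ratio x prior odds (hypothetical numbers).
prior_odds = 0.5 / 0.5            # Pr(Ha | I) / Pr(Hb | I)
likelihood_ratio = 0.08 / 0.02    # Pr(E | Ha, I) / Pr(E | Hb, I)
posterior_odds = likelihood_ratio * prior_odds
print(posterior_odds)             # 4.0 = Pr(Ha | E, I) / Pr(Hb | E, I), i.e. 0.8 / 0.2
```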

  9. The Bayesian Framework
  The likelihood ratio has largely come to be the main quantity of interest in the forensic statistics literature:
  LR = Pr(E | Ha, I) / Pr(E | Hb, I)
  It is a measure of how much weight or support the evidence gives to Theory A relative to Theory B. [AT]

  10. The Bayesian Framework
  LR = Pr(E | Ha, I) / Pr(E | Hb, I)
  The likelihood ratio ranges from 0 to infinity. Points of interest on the LR scale:
  Kass-Raftery scale [KR]:
  - LR < 1: evidence supports Theory B
  - 1 to 3: evidence barely supports Theory A
  - 3 to 20: evidence positively supports Theory A
  - 20 to 150: evidence strongly supports Theory A
  - > 150: evidence very strongly supports Theory A
  Jeffreys scale [Je]:
  - LR < 1: evidence supports Theory B
  - 1 to 3: evidence barely supports Theory A
  - 3 to 10: evidence substantially supports Theory A
  - 10 to 30: evidence strongly supports Theory A
  - 30 to 100: evidence very strongly supports Theory A
  - > 100: evidence decisively supports Theory A
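A small Python helper that maps a likelihood ratio onto the two verbal scales above; how the exact boundary values (for example an LR of exactly 3) are binned is an assumption here.

```python
# Map a likelihood ratio onto the verbal categories of the two scales.
def kass_raftery(lr: float) -> str:
    if lr < 1:
        return "evidence supports Theory B"
    if lr < 3:
        return "evidence barely supports Theory A"
    if lr < 20:
        return "evidence positively supports Theory A"
    if lr < 150:
        return "evidence strongly supports Theory A"
    return "evidence very strongly supports Theory A"

def jeffreys(lr: float) -> str:
    if lr < 1:
        return "evidence supports Theory B"
    if lr < 3:
        return "evidence barely supports Theory A"
    if lr < 10:
        return "evidence substantially supports Theory A"
    if lr < 30:
        return "evidence strongly supports Theory A"
    if lr < 100:
        return "evidence very strongly supports Theory A"
    return "evidence decisively supports Theory A"

for lr in (0.5, 2, 25, 500):
    print(f"LR = {lr}: {kass_raftery(lr)} (KR); {jeffreys(lr)} (Jeffreys)")
```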

  11. Decision Theory
  - Frame the decision problem (scenario)
  - List possibilities and options
  - Quantify the uncertainty with available information: domain-specific expertise and historical data, if available
  - Combine the information, respecting the laws of probability, to arrive at a decision/recommendation

  12. Bayesian Networks
  A scenario is represented by a joint probability function. It contains variables relevant to the situation, which represent uncertain information, and dependencies between the variables that describe how they influence each other. A graphical way to represent the joint probability function is with nodes and directed lines; this is called a Bayesian network. [P]
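The joint probability function such a network encodes is the usual chain-rule factorization over the nodes and their parents; the grass-wet example on the next slides is one instance of it.

```latex
\Pr(X_1,\dots,X_n) = \prod_{i=1}^{n} \Pr\bigl(X_i \mid \mathrm{parents}(X_i)\bigr),
\qquad\text{e.g.}\quad
\Pr(\text{Grass Wet},\text{Sprinkler},\text{Rain})
  = \Pr(\text{Rain})\,\Pr(\text{Sprinkler}\mid\text{Rain})\,\Pr(\text{Grass Wet}\mid\text{Sprinkler},\text{Rain})
```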

  13. Bayesian Networks
  (A very!) simple example [Wiki]: What is the probability the grass is wet?
  - Influenced by the possibility of rain
  - Influenced by the possibility of sprinkler action
  - Sprinkler action is itself influenced by the possibility of rain
  Construct the joint probability function to answer questions about this scenario: Pr(Grass Wet, Rain, Sprinkler)

  14. Bayesian Networks
  Pr(Rain): yes 20%, no 80%
  Pr(Sprinkler | Rain):
  - Rain = yes: sprinkler was on 1%, was off 99%
  - Rain = no: sprinkler was on 40%, was off 60%
  Pr(Grass Wet | Rain, Sprinkler):
  - Sprinkler was on, Rain = yes: grass wet 99%, not wet 1%
  - Sprinkler was on, Rain = no: grass wet 90%, not wet 10%
  - Sprinkler was off, Rain = yes: grass wet 80%, not wet 20%
  - Sprinkler was off, Rain = no: grass wet 0%, not wet 100%

  15. Bayesian Networks
  You observe that the grass is wet. Entering this observation, the other probabilities in the network, Pr(Rain) and Pr(Sprinkler), are adjusted accordingly.
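A minimal plain-Python sketch of slides 14 and 15: it multiplies out the conditional probability tables above to get the joint, then computes the marginal probability of wet grass and the updated probability of rain once wet grass is observed. The variable names are mine, not from the original presentation.

```python
from itertools import product

# Conditional probability tables from the grass-wet example (slide 14).
p_rain = {"yes": 0.20, "no": 0.80}
p_sprinkler_given_rain = {           # Pr(Sprinkler | Rain)
    "yes": {"on": 0.01, "off": 0.99},
    "no":  {"on": 0.40, "off": 0.60},
}
p_wet_given = {                      # Pr(Grass Wet = yes | Sprinkler, Rain)
    ("on", "yes"): 0.99,
    ("on", "no"):  0.90,
    ("off", "yes"): 0.80,
    ("off", "no"):  0.00,
}

def joint(rain, sprinkler, wet):
    """Pr(Rain, Sprinkler, Grass Wet) = Pr(Rain) Pr(Sprinkler|Rain) Pr(Wet|Sprinkler,Rain)."""
    p_wet = p_wet_given[(sprinkler, rain)]
    return (p_rain[rain]
            * p_sprinkler_given_rain[rain][sprinkler]
            * (p_wet if wet == "yes" else 1.0 - p_wet))

# Marginal probability that the grass is wet (slide 14's query).
p_grass_wet = sum(joint(r, s, "yes") for r, s in product(p_rain, ["on", "off"]))

# Posterior probability of rain after observing wet grass (slide 15's update).
p_rain_and_wet = sum(joint("yes", s, "yes") for s in ["on", "off"])
p_rain_given_wet = p_rain_and_wet / p_grass_wet

print(f"Pr(Grass Wet = yes)              = {p_grass_wet:.4f}")   # ~0.4484
print(f"Pr(Rain = yes | Grass Wet = yes) = {p_rain_given_wet:.4f}")  # ~0.3577
```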

  16. Bayesian Networks
  The likelihood ratio can be obtained from the BN once the evidence is entered. Use the odds form of Bayes' theorem: divide the odds on the theories after the evidence is entered (posterior odds) by the odds on the theories before the evidence is entered (prior odds).
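Sticking with the grass-wet toy network, one way to read this slide: take Ha: Rain = yes, Hb: Rain = no, and E: the grass is wet. The sketch below recovers the likelihood ratio from the prior and posterior odds and cross-checks it against the direct definition; the rounded posterior 0.3577 is carried over from the previous sketch.

```python
# Prior odds on Ha, straight from Pr(Rain) on slide 14.
prior_odds = 0.20 / 0.80

# Posterior probability after entering the evidence (from the previous sketch).
posterior_rain = 0.3577
posterior_odds = posterior_rain / (1.0 - posterior_rain)

# Likelihood ratio recovered from the BN via the odds form of Bayes' theorem.
lr_from_odds = posterior_odds / prior_odds

# Cross-check: compute the LR directly as Pr(E | Ha) / Pr(E | Hb).
p_wet_given_rain    = 0.01 * 0.99 + 0.99 * 0.80   # sprinkler on/off cases
p_wet_given_no_rain = 0.40 * 0.90 + 0.60 * 0.00
lr_direct = p_wet_given_rain / p_wet_given_no_rain

print(f"LR via odds form:     {lr_from_odds:.2f}")   # ~2.23
print(f"LR computed directly: {lr_direct:.2f}")      # ~2.23
```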

  17. Bayesian Networks
  Areas where Bayesian networks are used:
  - Medical recommendation/diagnosis: IBM/Watson, Massachusetts General Hospital/DXplain
  - Image processing
  - Business decision support: Boeing, Intel, United Technologies, Oracle, Philips
  - Information search algorithms and on-line recommendation engines
  - Space vehicle diagnostics: NASA
  - Search and rescue planning: US Military
  Requires software. Some free options: GeNIe (University of Pittsburgh) [G], SamIam (UCLA) [S], Hugin (free only for a few nodes) [H], and the gR family of R packages [gR].

  18. Taroni Model for Trace Evidence
  Taroni et al. have prescribed a general BN fragment that can model trace evidence transfer scenarios [T]:
  - H: theory (hypothesis) node
  - X: trace associated with (a) suspect node
  - TS: mediating node to allow for a chance match between the suspect's trace and a trace from an alternative source
  - T: trace transfer node
  - Y: trace associated with the crime scene node
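One plausible wiring of this fragment, written as a plain edge list; the exact arc set used by Taroni et al. and in the RBS network is not reproduced in this transcript, so the edges below are an assumption for illustration only.

```python
# Assumed wiring of a Taroni-style fragment (illustrative only, not the
# arc set from the original presentation).
fragment_edges = [
    ("H",  "T"),   # the theory influences whether a transfer occurred
    ("T",  "Y"),   # a transfer influences the trace found on the work
    ("X",  "TS"),  # the suspect's trace feeds the chance-match mediator
    ("TS", "Y"),   # the mediator allows a match from an alternative source
]

# Parents of each node, as implied by the directed edges.
nodes = {n for edge in fragment_edges for n in edge}
parents = {n: [p for p, c in fragment_edges if c == n] for n in sorted(nodes)}
for node, pars in parents.items():
    print(f"{node}: parents = {pars if pars else 'none (root node)'}")
```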

  19. Trace Evidence BN for RBS case
  The theories are that Pollock, or someone else associated with him in summer 1956, made the painting; there are two suspects.
  Use a Taroni fragment for each of:
  - Group of wool carpet fibers
  - Human hair
  - Polar bear hair
  Use a modified Taroni fragment (no suspect node) for each of:
  - Beach grass seeds
  - Garnet

  20. Trace Evidence BN for RBS case
  Link the garnet and seeds fragments together directly: they are very likely to co-occur. Link all the fragments together with the Theory (Painter) node and a Location node.

  21. Trace Evidence BN for RBS case Enter the evidence:

  22. Sensitivity Analysis
  Local sensitivity [Co]: the posterior's sensitivity to small changes in the model's parameters. Threshold > 1.

  23. Sensitivity Analysis
  Global sensitivity [Co]: the posterior's sensitivity to large changes in the model's parameters. Threshold < 0.1.
  Parameter 24 is "the probability of a transfer of polar bear hair, given the painting was made outside of Springs by Pollock and he had little potential of shedding the hair".
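The presentation does not reproduce the RBS parameters, so as an illustration of the idea only, here is a one-way sensitivity sweep on the grass-wet toy network: vary a single conditional probability table entry and watch how the posterior of interest moves.

```python
def posterior_rain_given_wet(p_wet_off_rain):
    """Pr(Rain = yes | Grass Wet = yes) as a function of one CPT entry,
    Pr(Grass Wet = yes | Sprinkler = off, Rain = yes)."""
    p_rain = 0.20
    # Pr(wet | rain) and Pr(wet | no rain), marginalising over the sprinkler.
    p_wet_given_rain = 0.01 * 0.99 + 0.99 * p_wet_off_rain
    p_wet_given_no_rain = 0.40 * 0.90 + 0.60 * 0.00
    num = p_rain * p_wet_given_rain
    den = num + (1.0 - p_rain) * p_wet_given_no_rain
    return num / den

baseline = posterior_rain_given_wet(0.80)   # the value used on slide 14
for p in (0.60, 0.70, 0.80, 0.90, 1.00):
    shifted = posterior_rain_given_wet(p)
    print(f"parameter = {p:.2f}  posterior = {shifted:.3f}  "
          f"change vs baseline = {shifted - baseline:+.3f}")
```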

  24. Conservative Recommendation
  Considering the likelihood ratio calculated with the Red, Black and Silver trace evidence network, coupled with the sensitivity analysis results, the physical evidence is more in support of the theory that Pollock made RBS than of the theory that someone else made RBS:
  - Strongly to very strongly (Kass-Raftery scale)
  - Very strongly to decisively (Jeffreys scale)

  25. References
  [C] Lewis Campbell. The Life of James Clerk Maxwell: With Selections from His Correspondence and Occasional Writings. Nabu Press, 2012.
  [L] Pierre Simon Laplace. Théorie Analytique des Probabilités. Nabu Press, 2010.
  [Ja] E. T. Jaynes. Probability Theory: The Logic of Science. Cambridge University Press, 2003.
  [AT] C. G. G. Aitken, F. Taroni. Statistics and the Evaluation of Evidence for Forensic Scientists. 2nd ed. Wiley, 2004.
  [Je] Harold Jeffreys. Theory of Probability. 3rd ed. Oxford University Press, 1998.
  [KR] R. Kass, A. Raftery. Bayes Factors. Journal of the American Statistical Association 90(430): 773-795, 1995.
  [P] Judea Pearl. Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference. Morgan Kaufmann, San Mateo, California, 1988.
  [Wiki] http://en.wikipedia.org/wiki/Bayesian_network
  [T] F. Taroni, A. Biedermann, S. Bozza, P. Garbolino, C. G. G. Aitken. Bayesian Networks for Probabilistic Inference and Decision Analysis in Forensic Science. 2nd ed. Wiley, 2014.
  [Co] Veerle M. H. Coupé, Finn V. Jensen, Uffe Kjærulff, Linda C. van der Gaag. A Computational Architecture for N-way Sensitivity Analysis of Bayesian Networks. Technical report, people.cs.aau.dk/~uk/papers/coupe-etal-00.ps.gz, 2000.
  [G] GeNIe, http://genie.sis.pitt.edu/
  [S] SamIam, http://reasoning.cs.ucla.edu/samiam/
  [H] Hugin, http://www.hugin.com/
  [gR] Claus Dethlefsen, Søren Højsgaard. A Common Platform for Graphical Models in R: The gRbase Package. Journal of Statistical Software, http://www.jstatsoft.org/v14/i17/, 2005.

  26. Fin
