Benefits of Probabilistic Static Analysis for Improving Program Analysis

Probabilistic static analysis offers a novel approach to enhancing the accuracy and usefulness of program analysis results. By treating static analysis facts probabilistically, uncertainties and imprecisions can be addressed, leading to more interpretable and actionable outcomes. This methodology allows analysis results to be prioritized by certainty, helping users navigate a flood of results efficiently. Additionally, combining hard and soft rules with prior knowledge can further improve the quality of analysis outcomes. Integrating probabilistic elements into static analysis helps strike a balance between soundness and utility considerations.



Presentation Transcript


  1. In Defense of Probabilistic Static Analysis. Ben Livshits, Shuvendu Lahiri, Microsoft Research.

  2. FROM THE PEOPLE WHO BROUGHT YOU SOUNDINESS.ORG

  3. Static analysis: uneasy tradeoffs. Too imprecise; does not scale; useless results. Overkill for some things; may not scale; possibly still too imprecise for others.

  4. What is missing is analysis elasticity.

  5. Our approach is probabilistic treatment. For a fact such as Points-to(p, v, h), many interpretations are possible: our certainty in the fact based on static evidence such as program structure (for example, branch direction information), our certainty based on runtime observations, and our certainty based on priors obtained from this or other programs. The slide illustrates these readings on small code fragments (a jQuery call $("mydiv").css("color", "red") under an if, and a sequence of Object x = new Object(); ... x = null; assignments inside try/catch).
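Below is a minimal Python sketch (not from the talk) of the idea on this slide: a single fact such as Points-to(p, v, h) carries a per-source certainty for static evidence, runtime observations, and priors, and the sources are combined into one number. The field names, the example probabilities, and the noisy-or combination are all illustrative assumptions.

    # A minimal sketch of attaching certainty to one analysis fact.
    # Evidence sources and the noisy-or combination are illustrative assumptions.

    def combine_noisy_or(sources):
        """Combine independent evidence: the fact holds unless every source fails."""
        p_fail = 1.0
        for p in sources.values():
            p_fail *= (1.0 - p)
        return 1.0 - p_fail

    points_to_fact = {
        "fact": ("Points-to", "p", "v", "h"),
        "evidence": {
            "static_structure": 0.70,        # e.g. branch direction information
            "runtime_observation": 0.90,     # fact seen to hold in test runs
            "prior_from_other_programs": 0.40,
        },
    }

    points_to_fact["certainty"] = combine_noisy_or(points_to_fact["evidence"])
    print(points_to_fact["fact"], round(points_to_fact["certainty"], 3))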

  6. Benefits. Result prioritization: static analysis results can be naturally ranked or prioritized in terms of certainty, nearly a requirement in a situation where analysis users are frequently flooded with results. Analysis debugging: program points, or even static analysis inference rules and facts, that lead to imprecision can be identified with the help of backward propagation.
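The prioritization benefit can be pictured with a few lines of Python: once every warning carries a certainty, showing the most certain ones first is a sort. The warnings and scores below are made up for illustration.

    # A minimal sketch of result prioritization: sort warnings by certainty so
    # users flooded with results see the most certain ones first.

    warnings = [
        ("possible null dereference of z at line 9", 0.62),
        ("unreachable catch block", 0.12),
        ("tainted value flows into css() call", 0.91),
    ]

    for message, certainty in sorted(warnings, key=lambda w: w[1], reverse=True):
        print(f"{certainty:.2f}  {message}")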

  7. More benefits. Hard and soft rules: in an effort to make their analysis fully sound, analysis designers often combine certain inference rules with those that cover generally unlikely cases to maintain soundness. Naturally blending such inference rules together, by giving high probabilities to the former and low probabilities to the latter, allows us to balance soundness and utility considerations. Infusing with priors: the end quality of analysis results can often be improved by domain knowledge such as information about variable naming, check-in information from source control repositories, bug-fix data from bug repositories, etc.
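A small Python sketch of the weighting idea described above: near-certain rules get large weights, unlikely-case rules kept only for soundness get small ones, and domain-knowledge priors are weighted the same way. The specific rules, weights, and the logistic squashing are illustrative assumptions, not values from the talk.

    import math

    # Hard rules get large weights, soft rules small ones, priors enter the same way.
    # All weights below are made up for illustration.
    rule_weights = {
        "FLOW via unconditional ASSIGN": 10.0,     # hard rule: near-certain
        "FLOW via ASSIGNCOND": 1.0,                # soft rule: depends on branch direction
        "ERROR when !ISNULL but NULLABLE": 0.5,    # unlikely-case rule kept for soundness
        "prior: variable named 'maybeNull'": 0.8,  # domain-knowledge prior
    }

    def weight_to_confidence(w):
        """Squash a rule weight into a (0, 1) confidence, logistic-style."""
        return 1.0 / (1.0 + math.exp(-w))

    for rule, w in rule_weights.items():
        print(f"{weight_to_confidence(w):.2f}  {rule}")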

  8. Simple analysis in Datalog.

    // transitive flow propagation
    FLOW(x,z) :- FLOW(x,y), ASSIGN(y,z).
    FLOW(a,c) :- FLOW(a,b), ASSIGNCOND(b,c).
    FLOW(x,x).

    // nullable variables
    NULLABLE(x) :- FLOW(x,y), ISNULL(y).

    // error detection
    ERROR(a) :- ISNULL(a), DEREF(a).
    ERROR(a) :- !ISNULL(a), NULLABLE(a), DEREF(a).

  Example program:

    1. x = 3;
    2. y = null;
    3. z = null;
    4. z = x;
    5. if (...) {
    6.   z = null;
    7.   y = 5;
    8.
    9.   w = *z
       }
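To make the slide concrete, here is a hedged Python sketch that runs the three rule groups as a naive fixpoint over hand-encoded facts for the example program. The fact encoding (flow-insensitive, with each null literal modelled as a pseudo-variable such as null@3) is an assumption made for illustration, not the extraction the authors use.

    # ASSIGN(a, b) encodes "a = b"; ASSIGNCOND is an assignment under a condition;
    # null literals become pseudo-variables null@<line> marked ISNULL.
    ASSIGN = {("x", "3@1"), ("y", "null@2"), ("z", "null@3"), ("z", "x"), ("y", "5@7")}
    ASSIGNCOND = {("z", "null@6")}          # line 6: z = null, inside if(...)
    ISNULL = {"null@2", "null@3", "null@6"}
    DEREF = {"z"}                           # line 9: w = *z

    VARS = {v for pair in ASSIGN | ASSIGNCOND for v in pair}

    # FLOW(a, b): a may take the value of b.  FLOW(x, x) is the base case.
    FLOW = {(v, v) for v in VARS}
    changed = True
    while changed:                          # naive fixpoint for the two flow rules
        changed = False
        new = set()
        new |= {(x, z) for (x, y) in FLOW for (y2, z) in ASSIGN if y == y2}
        new |= {(a, c) for (a, b) in FLOW for (b2, c) in ASSIGNCOND if b == b2}
        if not new <= FLOW:
            FLOW |= new
            changed = True

    # NULLABLE(x) :- FLOW(x, y), ISNULL(y)
    NULLABLE = {x for (x, y) in FLOW if y in ISNULL}

    # ERROR(a) :- ISNULL(a), DEREF(a)
    # ERROR(a) :- !ISNULL(a), NULLABLE(a), DEREF(a)
    ERROR = {a for a in DEREF if a in ISNULL}
    ERROR |= {a for a in DEREF if a not in ISNULL and a in NULLABLE}

    print("NULLABLE:", sorted(NULLABLE))
    print("ERROR:", sorted(ERROR))          # expected: z, the dereference at line 9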

  9. Relaxing the rules.

  Original Datalog rules:

    // transitive flow propagation
    FLOW(x,z) :- FLOW(x,y), ASSIGN(y,z).
    FLOW(a,c) :- FLOW(a,b), ASSIGNCOND(b,c).
    FLOW(x,x).

    // nullable variables
    NULLABLE(x) :- FLOW(x,y), ISNULL(y).

    // error detection
    ERROR(a) :- ISNULL(a), DEREF(a).
    ERROR(a) :- !ISNULL(a), NULLABLE(a), DEREF(a).

  Relaxed (weighted) rules:

    // transitive flow propagation
    FLOW(x,y) ^ ASSIGN(y,z) => FLOW(x,z).
    1    FLOW(a,b) ^ ASSIGNCOND(b,c) => FLOW(a,c)
    FLOW(x,x).

    // nullable variables
    FLOW(x,y) ^ ISNULL(y) => NULLABLE(x).

    // error detection
    ISNULL(a) ^ DEREF(a) => ERROR(a).
    0.5  !ISNULL(a) ^ NULLABLE(a) ^ DEREF(a) => ERROR(a).

    // priors and shaping distributions
    3    !FLOW(x,y).
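One way to see what the weights buy is a toy scoring of derivations: hard rules keep confidence 1.0, each soft rule contributes a confidence derived from its weight, and a derived fact is scored by the product over the rules used in its derivation. The Python below is only a heuristic illustration; Alchemy performs genuine Markov logic inference over possible worlds, and the prior 3 !FLOW(x,y) is ignored in this toy score. The rule names are made up; the weights 1 and 0.5 are the ones on the slide.

    import math

    def rule_confidence(weight):
        """Hard rules (weight None) count as 1.0; soft rules are squashed logistically."""
        return 1.0 if weight is None else 1.0 / (1.0 + math.exp(-weight))

    RULES = {
        "FLOW via ASSIGN": None,              # hard rule
        "FLOW via ASSIGNCOND": 1.0,           # soft rule, weight 1
        "NULLABLE from ISNULL source": None,  # hard rule
        "ERROR on ISNULL + DEREF": None,      # hard rule
        "ERROR on NULLABLE + DEREF": 0.5,     # soft rule, weight 0.5
    }

    def score(derivation):
        conf = 1.0
        for rule in derivation:
            conf *= rule_confidence(RULES[rule])
        return conf

    # One derivation of ERROR(z) from the example program goes through the
    # conditional assignment z = null inside the if and the unlikely-case error rule.
    print(round(score(["FLOW via ASSIGNCOND",
                       "NULLABLE from ISNULL source",
                       "ERROR on NULLABLE + DEREF"]), 3))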

  10. Probabilistic inference with Alchemy. Tuning the rules; tuning the weights. Semantics are not as obvious; shaping priors is non-trivial, but fruitful. (The slide shows a network of ground atoms, labelled X1, Y1, U1, Z1-Z3 and W1-W11, annotated with inferred marginal probabilities such as 0.545, 0.561, 0.568, 0.615 and 0.617.)

  11. Challenges. Learning the weights: expert users, or learning (which needs a labelled dataset). What class of static analysis can be made elastic? Datalog, abstract interpretation, decision-procedure (SMT)-based.
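As a hedged illustration of the weight-learning challenge, the Python sketch below fits rule weights from a labelled set of warnings (which soft rules fired, and whether the warning turned out to be a real bug) using plain logistic regression. The data is invented, and this is a simplified stand-in rather than Alchemy's actual weight-learning procedure for Markov logic.

    import math

    # Each example: (which of two soft rules fired, 1 if the warning was a real bug).
    # The data is made up for illustration.
    data = [
        ([1, 0], 1), ([1, 0], 1), ([1, 1], 0),
        ([0, 1], 0), ([0, 1], 0), ([1, 0], 1),
    ]

    weights = [0.0, 0.0]
    bias = 0.0
    lr = 0.1

    def predict(x):
        z = bias + sum(w * xi for w, xi in zip(weights, x))
        return 1.0 / (1.0 + math.exp(-z))

    for _ in range(2000):                   # gradient descent on the logistic loss
        for x, label in data:
            err = predict(x) - label
            for i, xi in enumerate(x):
                weights[i] -= lr * err * xi
            bias -= lr * err

    print("learned rule weights:", [round(w, 2) for w in weights])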
