Density Independent Algorithms for Sparsifying Random Walks
This presentation discusses density-independent algorithms for sparsifying k-step random walks on graphs, focusing on sparsification by effective resistances and spectral sparsification. The talk covers definitions, applications, and results: random walk graphs, transition matrices, Laplacian properties, and worked examples. The main results improve on the running times of prior work by several groups of researchers.
Presentation Transcript
Density Independent Algorithms for Sparsifying k-Step Random Walks
Gorav Jindal, Pavel Kolev (MPI-INF), Richard Peng, Saurabh Sawlani (Georgia Tech)
August 18, 2017
Talk Outline
Definitions: sparsification, random walk graphs
Our result
Sparsification by resistances
Walk sampling algorithm
Spectral Sparsification
Sparsification: removing (many) edges from a graph while approximating some property; here, spectral properties of the graph Laplacian.
Graph Laplacian: L = D - A (weighted degrees on the diagonal, minus the weighted adjacency matrix); the slide shows a small example Laplacian.
Formally, find a sparse H such that for all x:
(1 - ε) x^T L_G x ≤ x^T L_H x ≤ (1 + ε) x^T L_G x.
This preserves eigenvalues and cuts.
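The quadratic-form guarantee above can be checked empirically. A minimal sketch (numpy; all function names are ours, and testing random vectors is a necessary condition, not a proof of sparsification):

```python
import numpy as np

def laplacian(n, edges):
    """Build the graph Laplacian L = D - A from (u, v, weight) triples."""
    L = np.zeros((n, n))
    for u, v, w in edges:
        L[u, u] += w
        L[v, v] += w
        L[u, v] -= w
        L[v, u] -= w
    return L

def is_eps_sparsifier(LG, LH, eps, trials=200, seed=0):
    """Check (1-eps) x^T LG x <= x^T LH x <= (1+eps) x^T LG x
    on random test vectors."""
    rng = np.random.default_rng(seed)
    for _ in range(trials):
        x = rng.standard_normal(LG.shape[0])
        qG, qH = x @ LG @ x, x @ LH @ x
        if not ((1 - eps) * qG <= qH <= (1 + eps) * qG):
            return False
    return True
```

Any graph is trivially a 0-sparsifier of itself, while uniformly doubling all weights violates the guarantee for small ε.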
Applications of Sparsification
Process huge graphs faster.
Handle dense graphs that arise inside algorithms: partial states of Gaussian elimination, k-step random walks.
Random Walk Graphs
Walk along the edges of G. When at vertex u, choose the next edge e with probability proportional to its weight w(e):
Pr[e | u] = w(e) / (w(ua) + w(ub) + w(uc))
(shown for a vertex u with neighbors a, b, c).
Each step corresponds to D^{-1}A; the k-step transition matrix is (D^{-1}A)^k.
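The one-step and k-step transition matrices can be sketched directly from the definition (numpy; names are ours):

```python
import numpy as np

def transition_matrix(A):
    """P = D^{-1} A: from vertex u, move along edge (u, v) with
    probability proportional to its weight w(u, v)."""
    d = A.sum(axis=1)          # weighted degrees
    return A / d[:, None]

def k_step_transition(A, k):
    """Transition matrix of the k-step walk, (D^{-1} A)^k."""
    return np.linalg.matrix_power(transition_matrix(A), k)
```

Each row of the result is a probability distribution over endpoints of a k-step walk.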
Random Walk Graphs
Special case: D = I (for this talk), so weights become probabilities.
Transition matrix of the k-step walk: A^k.
Its Laplacian: I - A^k.
Random Walk Graphs
Example: a walk graph G with transition probabilities .2 and .8; its two-step walk graph G^2 has probabilities .68 and .32.
(Figure: G and G^2 drawn side by side on the same vertices.)
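The figure's numbers are consistent with a two-state chain, which we assume here in reconstructing the garbled figure; squaring the transition matrix reproduces them:

```python
import numpy as np

# Assumed reading of the slide's example: a two-state chain where one
# step has probabilities .2 / .8, so two steps give .68 / .32.
A = np.array([[0.2, 0.8],
              [0.8, 0.2]])
A2 = np.linalg.matrix_power(A, 2)
```

Here .68 = .2^2 + .8^2 (stay via two flips or two stays) and .32 = 2 * .2 * .8.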
Our Result
Assume ε is constant; Õ hides log log factors.

Spielman, Srivastava '08: Õ(m log^2.5 n) (only k = 1)
Kapralov, Panigrahi '11: Õ(m log^3 n) (k = 1, combinatorial)
Koutis, Levin, Peng '12: Õ(m log^2 n) (only k = 1)
Peng, Spielman '14: Õ(m + n log^10 n) (only k = 1, 2, combinatorial)
Koutis '14: Õ(m log^4 n) (only k = 1, 2, combinatorial)
Cheng, Cheng, Liu, Peng, Teng '15: O(k^2 m log^O(1) n) (k ≥ 1)
Jindal, Kolev '15: O(m log^2 n + n log^4 n log^5 k) (only k = 2^i)
Our result: O(m + k^2 n log^4 n), and O(m + k^2 n log^6 n) combinatorial (k ≥ 1)
Density Independence
O(m + k^2 n log^4 n): only sparsify when m >> the size of the sparsifier.
SS '08 + KLP '12: an O(n log n)-edge sparsifier in O(m log^2 n) time; the actual cost is at least O(m log^3 n).
Density independent: an O(m) term plus overhead that depends on n and k but not on m.
This gives a clearer picture of the runtime.
Algorithm
Sample an edge (u, v) in G.
Pick an integer i u.a.r. between 0 and k - 1.
Walk i steps from u and k - 1 - i steps from v.
Add the corresponding edge of G^k to the sparsifier (with rescaling).
Walk sampling has analogs in personalized PageRank algorithms and in triangle counting / sampling.
Effective Resistances
View G as an electrical circuit, with each edge e a resistor of resistance r_e = 1/w_e.
Effective resistance ER(u, v) between two vertices: the voltage difference required between them to send 1 unit of current between them.
Leverage score: τ_e = w_e · ER(e); it measures the importance of an edge.
An intuitive way of observing a graph; sparsification by ER is extremely useful (next slide).
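Effective resistances can be computed from the Laplacian pseudoinverse via ER(u, v) = (e_u - e_v)^T L^+ (e_u - e_v), a standard identity. A small sketch (numpy; names are ours):

```python
import numpy as np

def effective_resistance(L, u, v):
    """ER(u, v) = (e_u - e_v)^T L^+ (e_u - e_v), where L^+ is the
    Moore-Penrose pseudoinverse of the Laplacian L."""
    chi = np.zeros(L.shape[0])
    chi[u], chi[v] = 1.0, -1.0
    return chi @ np.linalg.pinv(L) @ chi

# Unit-weight path 0 - 1 - 2: resistances in series simply add.
L_path = np.array([[ 1.0, -1.0,  0.0],
                   [-1.0,  2.0, -1.0],
                   [ 0.0, -1.0,  1.0]])
```

On a tree there is a single current path, so ER between two vertices is the sum of edge resistances along it.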
Sparsification using ER
Suppose we have upper bounds τ̃(e) on the leverage scores of the edges: τ̃(e) ≥ τ(e).
Algorithm: repeat N = O(ε^{-2} · Σ_e τ̃(e) · log n) times:
Pick an edge e with probability τ̃(e) / Σ_e τ̃(e); add it to H with appropriate re-weighting.
[Tropp '12]: the output H is an ε-sparsifier of G.
Need: leverage-score bounds for the edges of G^k.
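The sampling loop above can be sketched as follows (numpy; names, the reweighting convention w_e / (N p_e), and the omission of Tropp-style constants are our assumptions). The reweighting makes H an unbiased estimator of G edge by edge:

```python
import math
import numpy as np

def sample_by_leverage(edges, weights, tau, eps, seed=0):
    """Draw N ~ eps^-2 * sum(tau) * log(n) edges i.i.d. with probability
    proportional to the leverage-score upper bounds tau; each pick of e
    adds weight w_e / (N * p_e) to H, so E[H] = G edgewise."""
    n = 1 + max(max(u, v) for u, v in edges)
    total = sum(tau)
    p = np.array(tau) / total
    N = max(1, math.ceil(total * math.log(n) / eps ** 2))
    rng = np.random.default_rng(seed)
    H = {}
    for idx in rng.choice(len(edges), size=N, p=p):
        e = edges[idx]
        H[e] = H.get(e, 0.0) + weights[idx] / (N * p[idx])
    return H
```

With exact leverage scores of a unit triangle (each 2/3), the sampler returns a reweighted subgraph of the input.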
Tools for Bounding Leverage Scores
Odd-even lemma [CCLPT '15]:
1. For odd k: ER_{G^k}(u, v) ≤ 2 · ER_G(u, v).
2. For even k: ER_{G^k}(u, v) ≤ ER_{G^2}(u, v).
Triangle inequality for ER (on a path u_0, ..., u_k):
ER_G(u_0, u_k) ≤ Σ_{i=0}^{k-1} ER_G(u_i, u_{i+1}).
We will use these to implicitly select edges by leverage score.
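The triangle inequality for ER (a true metric property of effective resistance) is easy to verify numerically on a small graph; a sketch with our own helper names:

```python
import numpy as np

def effective_resistance(L, u, v):
    """ER via the Laplacian pseudoinverse."""
    chi = np.zeros(L.shape[0])
    chi[u], chi[v] = 1.0, -1.0
    return chi @ np.linalg.pinv(L) @ chi

def check_er_triangle_inequality(L, path):
    """ER(path[0], path[-1]) <= sum of ER over consecutive path pairs."""
    ends = effective_resistance(L, path[0], path[-1])
    hops = sum(effective_resistance(L, a, b) for a, b in zip(path, path[1:]))
    return ends <= hops + 1e-12

# Unit-weight 4-cycle on vertices 0, 1, 2, 3.
L_cycle = np.array([[ 2.0, -1.0,  0.0, -1.0],
                    [-1.0,  2.0, -1.0,  0.0],
                    [ 0.0, -1.0,  2.0, -1.0],
                    [-1.0,  0.0, -1.0,  2.0]])
```

On the 4-cycle, adjacent vertices have ER 3/4 and opposite vertices ER 1, so both path checks below pass with slack.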
Analysis
Simplifications: assume k is odd (even k needs one more idea), and assume access to exact effective resistances of G (available from previous works).
Goal: sample each edge (u_0, u_k) of G^k with probability proportional to
w_{G^k}(u_0, u_k) · ER_{G^k}(u_0, u_k).
Claim: walk sampling achieves this!
Analysis
Claim: it suffices to sample each path u_0, u_1, ..., u_k with probability proportional to
P(u_0, u_1, ..., u_k) · Σ_{i=0}^{k-1} ER_G(u_i, u_{i+1})
≥ P(u_0, u_1, ..., u_k) · ER_G(u_0, u_k)                (triangle inequality)
≥ (1/2) · P(u_0, u_1, ..., u_k) · ER_{G^k}(u_0, u_k)    (odd-even lemma)
Summing over all k-length paths from u_0 to u_k (their probabilities sum to w_{G^k}(u_0, u_k)):
≥ (1/2) · w_{G^k}(u_0, u_k) · ER_{G^k}(u_0, u_k).
Walk Sampling Algorithm
Algorithm (to pick an edge in G^k):
Choose an edge (u, v) in G with probability proportional to w_{uv} · ER(u, v).
Pick u.a.r. an index i in the range [0, k - 1].
Walk i steps from u and k - 1 - i steps from v.
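The three steps above can be sketched as one sampling routine (Python; the adjacency-dict representation, function names, and the assumption that the seed scores w_uv · ER(u, v) are precomputed are ours):

```python
import random

def walk(adj, start, steps, rng):
    """Take `steps` weighted random-walk steps from `start`;
    adj maps a vertex to a list of (neighbor, weight) pairs."""
    u = start
    for _ in range(steps):
        nbrs, ws = zip(*adj[u])
        u = rng.choices(nbrs, weights=ws)[0]
    return u

def sample_gk_edge(adj, seed_edges, seed_scores, k, rng):
    """One draw of walk sampling: seed edge (u, v) chosen proportional
    to w_uv * ER(u, v), position i chosen u.a.r., then the walk is
    extended to total length k."""
    u, v = rng.choices(seed_edges, weights=seed_scores)[0]
    i = rng.randrange(k)                   # position of the seed edge
    a = walk(adj, u, i, rng)               # i steps out from u
    b = walk(adj, v, k - 1 - i, rng)       # k - 1 - i steps out from v
    return a, b                            # endpoints form an edge of G^k
```

Each returned pair (a, b) is then added to the sparsifier with the appropriate rescaling, as on the earlier algorithm slide.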
Analysis of Walk Sampling
Probability of sampling the walk (u_0, u_1, ..., u_k):
Pr[walk] = Σ_{i=0}^{k-1} Pr[index = i] × Pr[select edge (u_i, u_{i+1})] × Pr[walk from u_i to u_0] × Pr[walk from u_{i+1} to u_k]
= Σ_{i=0}^{k-1} (1/k) · (w(u_i, u_{i+1}) · ER(u_i, u_{i+1}) / Z) · Π_{j=0}^{i-1} w(u_j, u_{j+1}) · Π_{j=i+1}^{k-1} w(u_j, u_{j+1})
= (1 / (kZ)) · P(u_0, u_1, ..., u_k) · Σ_{i=0}^{k-1} ER(u_i, u_{i+1}),
where Z = Σ_{(u,v) ∈ E} w_{uv} · ER(u, v) is the normalizing constant and P(u_0, ..., u_k) = Π_{j=0}^{k-1} w(u_j, u_{j+1}) (weights are probabilities since D = I).
k even: G^2
G^2 is still dense and cannot be computed explicitly!
Instead, compute the product of G with a length-2 path P_2 and read effective resistances off that:
ER_{G × P_2}(u_1, v_1) = ER_{G^2}(u, v)
(Figure: the example graph G with probabilities .2 / .8, the path P_2, and their product with entries .32 / .68.)
Future Work
This result: O(m + k^2 n log^4 n) time.
Remove the log dependency on k (as in JK '15).
A better runtime of O(m + n log^2 n) for G^k (combinatorial algorithm).
ER Estimates for G (or G × P_2)
Iterative improvement similar to KLP '12:
Create a sequence of graphs, each more tree-like than the previous: G_1, ..., G_t.
O(1)-sparsify the last graph to get H_t.
Use the sparsifier H_{i+1} to construct an O(1)-sparsifier H_i of G_i.