Understanding Fourier Growth of Parity Decision Trees

Explore the concepts of Fourier growth in Parity Decision Trees through in-depth analysis of Boolean functions, Fourier coefficients, decision trees, and quantum vs. parity query complexity. Discover known bounds, optimal separations, and lower bounds in the context of decision tree complexity theory. The discussion also delves into the use of expander graphs and random walks to analyze the growth of Parity Decision Trees.

  • Fourier Growth
  • Parity Decision Trees
  • Decision Tree Complexity
  • Quantum Query Complexity
  • Random Walks


Presentation Transcript


  1. Fourier Growth of Parity Decision Trees Kewen Wu (UC Berkeley) Joint with Uma Girish (Princeton) and Avishay Tal (UC Berkeley)

  2. Decision Trees and Parity Decision Trees
  Input: x in {+1, -1}^n; output: a value in {0, 1}.
  A decision tree queries a single variable x_i at each internal node and branches on its value. A parity decision tree instead queries the parity (product) of a subset of variables at each node, e.g. x_1·x_2, x_3·x_5, x_1·x_5, or x_2·x_3·x_6.
  [Figure: a depth-3 decision tree next to a parity decision tree with parity queries at its nodes.]
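The two models are easy to make concrete in code. Below is a minimal sketch (my own toy tree and helper name `eval_pdt`, not the tree from the figure): each internal node of a parity decision tree queries the product of a subset of {+1, -1} variables and branches on the result; an ordinary decision tree is the special case where every queried subset has size one.

```python
# A minimal sketch (not from the talk): evaluating a parity decision tree.
# Internal nodes are dicts holding the queried variable indices and the two
# subtrees; leaves are outputs in {0, 1}.

def eval_pdt(node, x):
    """Evaluate a parity decision tree on input x, a tuple of +1/-1 values."""
    while isinstance(node, dict):          # internal node
        parity = 1
        for i in node["vars"]:             # product of the queried variables
            parity *= x[i]
        node = node["+1"] if parity == +1 else node["-1"]
    return node                            # leaf: an output in {0, 1}

# Depth-2 example: first query the parity x0*x1, then one more parity.
tree = {
    "vars": (0, 1),
    "+1": {"vars": (2,), "+1": 1, "-1": 0},
    "-1": {"vars": (1, 2), "+1": 0, "-1": 1},
}

print(eval_pdt(tree, (+1, +1, +1)))  # x0*x1 = +1, then x2 = +1 -> leaf 1
```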

  3. Fourier Basics
  Boolean function f: {+1, -1}^n -> {0, 1}.
  Fourier coefficients: f̂(S) = E_x[f(x)·χ_S(x)], where χ_S(x) = ∏_{i in S} x_i.
  Fourier representation: f(x) = Σ_S f̂(S)·χ_S(x).
  Level-k L1-sum: L_{1,k}(f) = Σ_{|S|=k} |f̂(S)|.
  Building PRGs for functions with small L_{1,k} [RSV13, CHRT18, CHHL19, CHLT19, CGL+20].
  Learning theory: small L1-norm implies sparse (in the Fourier basis) approximators [KM93, Man95, OS07].
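For small n these definitions can be checked by brute force. A minimal sketch (my own helper names, exhaustive enumeration over {+1, -1}^n, so only practical for small n):

```python
from itertools import product, combinations

# A minimal sketch (not from the talk): f̂(S) = E_x[f(x) * prod_{i in S} x_i]
# and the level-k L1-sum L_{1,k}(f) = sum over |S| = k of |f̂(S)|.

def fourier_coeff(f, n, S):
    total = 0.0
    for x in product((+1, -1), repeat=n):  # all 2^n inputs
        chi = 1
        for i in S:
            chi *= x[i]
        total += f(x) * chi
    return total / 2 ** n

def level_k_l1(f, n, k):
    return sum(abs(fourier_coeff(f, n, S)) for S in combinations(range(n), k))

# Example: f = AND of 2 bits (1 iff both bits are +1), whose expansion is
# f(x) = 1/4 + x0/4 + x1/4 + x0*x1/4.
f = lambda x: int(x[0] == +1 and x[1] == +1)
print(level_k_l1(f, 2, 1))  # |f̂({0})| + |f̂({1})| = 1/4 + 1/4 = 0.5
```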

  4. L_{1,k}: Known Bounds
  PRGs with polylog(n) seed length follow from such bounds [CHHL19, CHLT19, CGL+20].
  • Width-w DNFs/CNFs: L_{1,k} ≤ O(w)^k [Man95]
  • AC0 circuits of size s and depth Δ: L_{1,k} ≤ O(log s)^{(Δ-1)k} [Tal17]
  • Boolean functions with sensitivity s: L_{1,k} ≤ O(s)^k [GSTW16]
  • Width-w read-once branching programs: L_{1,k} ≤ O(log n)^{wk} [CHRT18]
  • Depth-d (randomized) decision trees: L_{1,k} ≤ (d choose k)^{1/2} · O(1)^k [Tal20, SSW21]
  • Degree-d polynomials over GF(2): L_{1,k} ≤ 2^{O(dk)} [CHHL19]
  • Depth-d (randomized) noisy decision trees: L_{1,k} ≤ (d·k·log n)^{k/2} [New]
  • Depth-d (randomized) parity decision trees: L_{1,k} ≤ (d·k²·log² n)^{k/2} [New]
  The dependence on d in the new bounds is optimal.

  5. Quantum vs Parity Query Complexity
  Forr_k (k-fold Forrelation) is a partial Boolean function over n input bits.
  • Quantum query complexity of Forr_k is ⌈k/2⌉ [Aaronson-Ambainis 14]
  • Randomized query complexity of Forr_k is Ω̃(n^{1-1/k}) [Bansal-Sinha 21], via Fourier growth lower bounds: an optimal separation
  • [New] Randomized parity query complexity of Forr_k is Ω̃(n^{1-1/k})
  • [New] Randomized noisy query complexity of Forr_k is Ω̃(n^{1-1/k})

  6. Expander Random Walks [Cohen, Peri, Ta-Shma 21]
  An application of Fourier growth of PDTs and NDTs.
  Assume a function f over {+1, -1}^n satisfies L_{1,k}(f) ≤ c^k. Let G be an expander graph with small second eigenvalue λ₂(G), with half of its vertices labeled +1 and half labeled -1.
  Let x₁ be a uniform random string in {+1, -1}^n, and let x₂ be the label sequence of an (n-1)-step random walk on G.
  Then E_{x₁}[f(x₁)] ≈ E_{x₂}[f(x₂)], with error controlled by c and λ₂(G).
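A minimal numeric sketch of this phenomenon (my own toy instance, not from [Cohen, Peri, Ta-Shma 21]): the complete graph K_m is a very good expander (its normalized second eigenvalue has absolute value 1/(m-1)), and for f = AND of n bits, whose Fourier mass is small, both expectations can be computed in closed form and compared.

```python
# A minimal sketch (hypothetical instance, not from the talk): compare
# E[f] under a uniform string vs. under the label sequence of an
# (n-1)-step stationary random walk on K_m with half the vertices +1.

def and_uniform(n):
    return 0.5 ** n  # Pr[all n uniform bits are +1]

def and_walk(m, n):
    # First vertex is labeled +1 w.p. 1/2; each later step stays inside the
    # +1 half w.p. (m/2 - 1)/(m - 1) (any neighbor except the current vertex).
    stay = (m / 2 - 1) / (m - 1)
    return 0.5 * stay ** (n - 1)

m, n = 100, 10
print(and_uniform(n), and_walk(m, n))  # the two expectations are close
```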

  7. XOR-Lifting in Communication Protocols [Girish-Raz-Tal 21]
  Assume F: {+1, -1}^n × {+1, -1}^n -> {0, 1} is computable by a deterministic communication protocol of cost c: Alice gets x in {+1, -1}^n, Bob gets y in {+1, -1}^n, and they should obtain F(x, y) after c bits of communication.
  Define f: {+1, -1}^n -> [0, 1] by f(z) = E_{x∘y = z}[F(x, y)], where x∘y is the coordinate-wise product.
  Conjecture: L_{1,2}(f) ≤ c · polylog(n). Known bounds: L_{1,2}(f) = O(c²).
  If true: the randomized communication complexity of Forr₂ ∘ XOR is Ω̃(√n).
  The conjecture is true if the protocol follows a parity decision tree strategy.

  8. Technical Overview
  General framework: a random top-down walk in the decision tree; adaptive Azuma's inequality; a reduction from L_{1,k} to L_{1,k-1}. Recall L_{1,k}(f) = Σ_{|S|=k} |f̂(S)|.
  • L_{1,1} for depth-d decision trees: O(√d) in expectation, plus a concentration bound
  • L_{1,1} for depth-d parity decision trees: adaptive Azuma's inequality
  • L_{1,2} for depth-d parity decision trees: reduction to L_{1,1}

  9. Proof Overview: L_{1,1} ≤ O(√d) for Decision Trees
  Let π be a random root-to-leaf path, and set π_i in {+1, -1} iff x_i is queried on the path and fixed to that value (π_i = 0 otherwise).
  Then f̂({i}) = E_x[f(x)·x_i] = E_π[ E_x[f(x)·x_i | π] ] = E_π[f(π)·π_i].
  By negating x_i in f, we may assume f̂({i}) ≥ 0. By querying dummy variables, we may assume the tree is full.
  Then L_{1,1}(f) = Σ_i f̂({i}) = E_π[f(π)·Σ_i π_i] ≤ E_π|Σ_i π_i|.
  Σ_i π_i depends only on the number of +1s and -1s on the path: it is the endpoint of a simple d-step 1D random walk, hence O(√d) in expectation [O'Donnell-Servedio 07] and O(√(d·log(1/δ))) with probability 1 - δ.
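The O(√d) step can be checked directly. A minimal sketch (my own helper name, exact computation via the binomial distribution rather than sampling): the endpoint of a d-step ±1 walk is S_d = 2k - d with k ~ Bin(d, 1/2), and E|S_d| approaches √(2d/π).

```python
import math

# A minimal sketch (not from the talk): exact E|S_d| for a simple d-step
# +/-1 random walk, computed from the binomial distribution.

def exact_mean_abs(d):
    # S_d = 2k - d where k ~ Bin(d, 1/2)
    return sum(abs(2 * k - d) * math.comb(d, k) for k in range(d + 1)) / 2 ** d

for d in (4, 16, 64, 256):
    # E|S_d| approaches sqrt(2d/pi), i.e. Theta(sqrt(d))
    print(d, exact_mean_abs(d), math.sqrt(2 * d / math.pi))
```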

  10. Standard Azuma's Inequality
  Let X₀ = 0 and X_t = X_{t-1} + c_t·ε_t for t = 1, 2, …, T, where c_t is a fixed value given ε₁, ε₂, …, ε_{t-1}, and the ε_t's are independent random variables with |ε_t| ≤ 1 and E[ε_t] = 0.
  Standard Azuma's inequality: if |c_t| ≤ a_t almost surely for all t, then
  Pr[|X_T| ≥ β] ≤ 2·exp(-β² / (2·Σ_t a_t²)).
  In particular, with a_t = 1 and ε_t in {+1, -1}: Pr[|X_T| ≥ β] ≤ 2·exp(-β²/(2T)).
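A minimal empirical sketch (my own toy check, not from the talk): for the simplest case c_t = 1 and ε_t uniform in {+1, -1}, the simulated tail probability sits well below the Azuma bound 2·exp(-β²/(2T)).

```python
import math, random

# A minimal sketch (not from the talk): Monte Carlo estimate of the tail
# Pr[|X_T| >= beta] for a simple +/-1 walk, compared to the Azuma bound.

def tail_prob(T, beta, trials=50_000, seed=1):
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        x = sum(rng.choice((+1, -1)) for _ in range(T))
        if abs(x) >= beta:
            hits += 1
    return hits / trials

T, beta = 100, 30
empirical = tail_prob(T, beta)
bound = 2 * math.exp(-beta ** 2 / (2 * T))
print(empirical, bound)  # the empirical tail is below the Azuma bound
```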

  11. Proof Overview: L_{1,1} for Parity Decision Trees
  Let π be a random root-to-leaf path; now π_i in {+1, -1} iff x_i is fixed to that value along the path (a parity query can fix variables only in combination with earlier queries).
  As before, f̂({i}) = E_x[f(x)·x_i] = E_π[ E_x[f(x)·x_i | π] ] = E_π[f(π)·π_i].
  By negating x_i in f, we may assume f̂({i}) ≥ 0. By querying dummy variables, we may assume the tree is full.
  Then L_{1,1}(f) = Σ_i f̂({i}) = E_π[f(π)·Σ_i π_i] ≤ E_π|Σ_i π_i|.
  But does Σ_i π_i depend only on the number of +1s and -1s on the path? No: we need to reweigh the parity decision tree.

  12. Proof Overview: L_{1,1} for Parity Decision Trees (cont.)
  Relabel each edge of the parity decision tree by the sum of the newly fixed variables. The relabeled tree computes the same quantity, so L_{1,1}(f) ≤ E_π|Σ_i π_i|, where the increments of the walk are the edge labels.
  Good: it is still a 1D random walk.
  Bad: each step size is not fixed, but adaptive, based on the path taken so far.
  We need to make the step sizes smooth [Blais-Tan-Wan 15]: rather than fixing, say, x₁, x₂, x₃, x₄, x₅ all at once, fix them gradually.
  And we need an adaptive Azuma's inequality for the concentration bounds.

  13. Adaptive Azuma's Inequality [New]
  Let X₀ = 0 and X_t = X_{t-1} + c_t·ε_t for t = 1, 2, …, T, where c_t is a fixed value given ε₁, ε₂, …, ε_{t-1}, and the ε_t's are independent random variables with |ε_t| ≤ 1 and E[ε_t] = 0.
  Standard Azuma's inequality needs fixed bounds |c_t| ≤ a_t. Adaptive Azuma's inequality: if Σ_{t=1}^{T} c_t² ≤ σ² with probability at least 1 - δ (the c_t may be chosen adaptively), then
  Pr[|X_T| ≥ β] ≤ 2·exp(-β² / (2σ²)) + δ.
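A minimal simulation sketch (my own hypothetical walk, not the paper's construction): the step size c_t below depends on the history, constrained only by the budget Σ c_t² ≤ σ², and the adaptive Azuma bound (here with δ = 0, since the budget always holds) still caps the tail.

```python
import math, random

# A minimal sketch (hypothetical walk, not from the paper): a martingale
# with adaptive step sizes under the budget sum of c_t^2 <= sigma^2.

def adaptive_walk(sigma2, rng):
    x, budget = 0.0, sigma2
    while budget > 1e-9:
        c = 2.0 if x > 0 else 1.0          # step size depends on the past
        c = min(c, math.sqrt(budget))      # never exceed remaining budget
        budget -= c * c
        x += c * rng.choice((+1, -1))
    return x

rng = random.Random(2)
sigma2, beta, trials = 100.0, 30.0, 50_000
hits = sum(abs(adaptive_walk(sigma2, rng)) >= beta for _ in range(trials))
bound = 2 * math.exp(-beta ** 2 / (2 * sigma2))
print(hits / trials, bound)  # empirical tail vs. adaptive Azuma bound
```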

  14. [Figure: a small example of an adaptive random walk, illustrating path-dependent step sizes.]

  15. Proof Overview: L_{1,2} for Parity Decision Trees
  Let π be a random root-to-leaf path; π_i in {+1, -1} iff x_i is fixed to that value along the path.
  Then f̂({i,j}) = E_x[f(x)·x_i·x_j] = E_π[ E_x[f(x)·x_i·x_j | π] ] = E_π[f(π)·π_i·π_j].
  Let σ_{i,j} = sgn(f̂({i,j})). Then L_{1,2}(f) = Σ_{i<j} |f̂({i,j})| = E_π[f(π)·Σ_{i<j} σ_{i,j}·π_i·π_j] ≤ E_π|Σ_{i<j} σ_{i,j}·π_i·π_j|.
  A clean-up step is needed: before it we may have, e.g., E_π[π₁·π₂] = 1, while after clean-up E_π[π₁·π₂] = 1/2.
  The growth of Σ_{i<j} σ_{i,j}·π_i·π_j at step t can be written as a level-1 sum L_{1,1} of a depth-d parity decision tree. Intuitively, this is because the derivative of a quadratic is a linear function.

  16. Summary
  We prove Fourier growth bounds for (randomized) parity and noisy decision trees, via a random top-down walk, an adaptive Azuma's inequality, and induction on the levels.
  Applications:
  • Separations between quantum and parity query complexity
  • Expander random walks
  • XOR-lifting in communication complexity
  Open problems:
  • Level-2 Fourier bounds for XOR-lifting in communication complexity
  • Tight bounds for the Fourier growth: the upper and lower bounds differ by a factor of (k·log n)^{O(k)}

  17. Thank you!
