
Insights into Information Complexity and Communication Theory
An overview of information complexity: how classical information theory extends to interactive communication, with applications to circuit complexity, streaming algorithms, data structures, and distributed computing, and a survey of lower bounds, compression, and open problems in multi-party computation.
Presentation Transcript
Information Complexity: an Overview
Rotem Oshman, Princeton CCI
Based on work by Braverman, Barak, Chen, Rao, and others
Charles River Science of Information Day, 2014
Classical Information Theory
Shannon '48, A Mathematical Theory of Communication: how many bits are needed to transmit a message X across a channel?
Motivation: Communication Complexity
Alice holds an input x and Bob holds an input y; they communicate to compute f(x, y) while exchanging as few bits as possible.
[Yao 79, Some complexity questions related to distributive computing]
Motivation: Communication Complexity
More generally: solve some task T(x, y) on Alice's and Bob's joint inputs (see the sketch below).
[Yao 79, Some complexity questions related to distributive computing]
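To make the model concrete, here is a minimal sketch (my own illustration, not from the talk) of a trivial protocol in Yao's model: Alice sends her whole input, so every function on n-bit inputs has a protocol of cost n + 1.

```python
# Yao's two-party model: Alice holds x, Bob holds y; they alternate sending
# bits until both know the answer. Cost = bits exchanged on the worst input.

def trivial_protocol(x: str, y: str) -> tuple[bool, int]:
    """Alice sends all of x (|x| bits); Bob computes EQ(x, y) and replies with 1 bit."""
    bits_sent = len(x)          # Alice -> Bob: the entire input x
    answer = (x == y)           # Bob now knows both inputs
    bits_sent += 1              # Bob -> Alice: the one-bit answer
    return answer, bits_sent

print(trivial_protocol("0110", "0110"))  # (True, 5)
```

The interesting question is how far below this trivial bound a clever protocol can go for a given f.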
Motivation: Communication Complexity
Applications: circuit complexity, streaming algorithms, data structures, distributed computing, property testing.
Example: Streaming Lower Bounds
A streaming algorithm reads the data in a single pass, maintaining a small state: how much space is required to approximate f(data)?
Lower bounds by reduction from communication complexity [Alon, Matias, Szegedy 99]: Alice runs the algorithm on her half of the data and sends the state of the algorithm to Bob, who finishes the run. A space-s algorithm thus yields an s-bit protocol, so a communication lower bound implies a space lower bound (a sketch follows below).
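A hedged sketch of this reduction; the StreamingAlgorithm class and its interface are hypothetical stand-ins, and pickling stands in loosely for "serialize the state".

```python
# Any one-pass algorithm using s bits of space yields an s-bit protocol:
# Alice runs it on her half of the stream, ships the state, Bob resumes.

import pickle

class StreamingAlgorithm:
    """Toy example: exact count of stream items (state = one counter)."""
    def __init__(self):
        self.count = 0
    def process(self, item):
        self.count += 1
    def result(self):
        return self.count

def reduction(alice_stream, bob_stream):
    alg = StreamingAlgorithm()
    for item in alice_stream:        # Alice runs the algorithm on her half...
        alg.process(item)
    state = pickle.dumps(alg)        # ...and sends its state to Bob:
    comm_bits = 8 * len(state)       # communication = size of the state.
    alg = pickle.loads(state)        # Bob resumes from Alice's state.
    for item in bob_stream:
        alg.process(item)
    return alg.result(), comm_bits

print(reduction([1, 2, 3], [4, 5]))  # (5, <state size in bits>)
```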
Advances in Communication Complexity
Very successful in proving unconditional lower bounds, e.g., Ω(n) for set disjointness [KS 92, Razborov 92] and Ω(n) for gap Hamming distance [Chakrabarti, Regev 10].
But stuck on some hard questions: multi-party communication complexity, Karchmer-Wigderson games.
[Chakrabarti, Shi, Wirth, Yao 01], [Bar-Yossef, Jayram, Kumar, Sivakumar 04]: use tools from information theory.
Extending Information Theory to Interactive Computation
One-way communication: Task: send X across the channel. Cost: H(X) bits. Shannon: in the limit over many instances; Huffman: H(X) + 1 bits for one instance.
Interactive computation: Task: e.g., compute f(X, Y). Cost: ?
Information Cost
Reminder: mutual information I(X; Y) = H(X) - H(X | Y) = H(Y) - H(Y | X).
Conditional mutual information: I(X; Y | Z) = H(X | Z) - H(X | Y, Z) = E_z[ I(X; Y | Z = z) ].
Basic properties: I(X; Y) >= 0; I(X; Y) <= H(X) and I(X; Y) <= H(Y).
Chain rule: I(XY; Z) = I(X; Z) + I(Y; Z | X).
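As a sanity check, a small self-contained script (my own illustration) that computes I(X; Y) from an explicit joint distribution and confirms 0 <= I(X; Y) <= H(X):

```python
from math import log2
from collections import defaultdict

def H(p):
    """Entropy of a distribution given as {outcome: probability}."""
    return -sum(q * log2(q) for q in p.values() if q > 0)

def marginal(joint, idx):
    """Marginal of coordinate idx of a joint distribution over pairs."""
    m = defaultdict(float)
    for outcome, q in joint.items():
        m[outcome[idx]] += q
    return m

def I(joint):
    """I(X;Y) = H(X) + H(Y) - H(XY)."""
    return H(marginal(joint, 0)) + H(marginal(joint, 1)) - H(joint)

# X is a fair bit, Y = X with probability 3/4 (a noisy copy):
joint = {(0, 0): 3/8, (0, 1): 1/8, (1, 0): 1/8, (1, 1): 3/8}
print(I(joint))   # ~0.1887 bits, between 0 and H(X) = 1
```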
Information Cost
Fix a protocol π. Notation abuse: let π also denote the transcript of the protocol.
Two ways to measure information cost:
External information cost: I(π; XY).
Internal information cost: I(π; X | Y) + I(π; Y | X).
Cost of a task: infimum over all protocols.
Which cost is the "right" one?
Information Cost: Basic Properties
External information: I(π; XY). Internal information: I(π; X | Y) + I(π; Y | X).
Internal <= external, and it can be much smaller. E.g., X = Y uniform over {0,1}^n, and Alice sends X to Bob: the external cost is n, but the internal cost is 0, since each player already knows the other's input. But the two are equal if X, Y are independent.
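A worked version of the X = Y example, under the stated assumption that the transcript is just Alice's message:

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
Take $X = Y$ uniform over $\{0,1\}^n$ and the protocol $\pi$ = ``Alice sends $X$'':
\begin{align*}
\text{external:}\quad & I(\pi; XY) = H(\pi) - H(\pi \mid XY) = H(X) - 0 = n,\\
\text{internal:}\quad & I(\pi; X \mid Y) + I(\pi; Y \mid X)
                       = H(\pi \mid Y) - H(\pi \mid XY) + 0 = 0,
\end{align*}
since the transcript $\pi = X = Y$ is fully determined by either player's input.
\end{document}
```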
Information Cost: Basic Properties
External information <= communication: I(π; XY) <= H(π) <= |π|.
Information Cost: Basic Properties
Internal information <= communication cost: I(π; X | Y) + I(π; Y | X) <= |π|.
By induction: let π = π_1 … π_|π|. For every r: I(π_<=r; X | Y) + I(π_<=r; Y | X) <= r.
By the chain rule,
I(π_<=r; X | Y) + I(π_<=r; Y | X) = [I(π_<r; X | Y) + I(π_<r; Y | X)] (what we knew after r-1 rounds; <= r-1 by the induction hypothesis) + [I(π_r; Y | X, π_<r) + I(π_r; X | Y, π_<r)] (what we learn in round r, given what we already know).
Information vs. Communication
Want: I(π_r; Y | X, π_<r) + I(π_r; X | Y, π_<r) <= 1.
Suppose bit π_r is sent by Alice.
What does Alice learn? π_r is a function of π_<r and X, so I(π_r; Y | X, π_<r) = 0.
What does Bob learn? I(π_r; X | Y, π_<r) <= H(π_r) <= 1.
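Assembling this slide with the previous one gives the full inductive step:

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
\begin{align*}
I(\pi_{\le r}; X \mid Y) + I(\pi_{\le r}; Y \mid X)
  &= \underbrace{I(\pi_{<r}; X \mid Y) + I(\pi_{<r}; Y \mid X)}_{\le\, r-1
     \text{ (induction hypothesis)}} \\
  &\quad + \underbrace{I(\pi_r; Y \mid X, \pi_{<r})
     + I(\pi_r; X \mid Y, \pi_{<r})}_{\le\, 0 + H(\pi_r) \,\le\, 1}
  \;\le\; r .
\end{align*}
\end{document}
```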
Information vs. Communication
We have: internal information <= communication; external information <= communication; internal information <= external information.
Information vs. Communication
Information cost = communication cost?
In the limit: internal information! [Braverman, Rao 10]
For one instance: external information! [Barak, Braverman, Chen, Rao 10]
Big question: can protocols be compressed down to their internal information cost?
[Ganor, Kol, Raz 14]: no! There is a task with internal IC = k but CC = 2^k. But this remains open for functions with small output.
Information vs. Amortized Communication
Theorem [Braverman, Rao 10]: lim_{n→∞} CC(f^n, μ^n, ε) / n = IC(f, μ, ε).
The <= direction: compression. The >= direction: direct sum.
We know: CC(f^n, μ^n, ε) >= IC(f^n, μ^n, ε).
We can show: IC(f^n, μ^n, ε) = n · IC(f, μ, ε).
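For clarity, here is how the two displayed facts combine to give the theorem (the o(n) term is my shorthand for the compression overhead):

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
\begin{align*}
\text{(direct sum, then IC} \le \text{CC):}\quad
  & CC(f^n, \mu^n, \epsilon) \;\ge\; IC(f^n, \mu^n, \epsilon)
    \;=\; n \cdot IC(f, \mu, \epsilon),\\
\text{(compression):}\quad
  & CC(f^n, \mu^n, \epsilon) \;\le\; n \cdot IC(f, \mu, \epsilon) + o(n),
\end{align*}
so dividing by $n$ and letting $n \to \infty$ gives the theorem.
\end{document}
```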
Direct Sum Theorem [BR10]
IC(f^n, μ^n, ε) = n · IC(f, μ, ε): let π be a protocol for f^n on n-copy inputs X_1…X_n, Y_1…Y_n. Construct a protocol ρ for f as follows: Alice and Bob get inputs x, y; they choose a random coordinate i ∈ [n] and set X_i = x, Y_i = y.
Bad idea: publicly sample all the other coordinates X_{-i}, Y_{-i}. Suppose that in π, Alice sends X_1 ⊕ … ⊕ X_n. In π, Bob learns one bit; in ρ he should learn only 1/n bit. But if X_{-i} is public, Bob learns 1 bit about x!
Instead: publicly sample X_1, …, X_{i-1}; Bob privately samples Y_1, …, Y_{i-1}; publicly sample Y_{i+1}, …, Y_n; Alice privately samples X_{i+1}, …, X_n. (A sketch of this embedding follows.)
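A minimal sketch of this embedding (variable names are mine; shown for a product distribution μ, where the private coordinates can be sampled independently; for general μ they must be drawn conditioned on the public values):

```python
import random

def embed(x, y, n, sample, shared_rng, alice_rng, bob_rng):
    """Embed a single instance (x, y) at a random coordinate of an n-copy input."""
    i = shared_rng.randrange(n)       # public random coordinate
    X, Y = [None] * n, [None] * n
    X[i], Y[i] = x, y
    for j in range(i):                # j < i: X_j public, Y_j private to Bob
        X[j] = sample(shared_rng)
        Y[j] = sample(bob_rng)
    for j in range(i + 1, n):         # j > i: Y_j public, X_j private to Alice
        Y[j] = sample(shared_rng)
        X[j] = sample(alice_rng)
    return X, Y, i                    # Alice holds X, Bob holds Y; run pi on them

X, Y, i = embed(1, 0, 5, lambda r: r.randrange(2),
                random.Random(1), random.Random(2), random.Random(3))
print(i, X, Y)
```

Each player knows all of their own n-copy input (public coordinates plus their own private samples plus the embedded instance), yet neither side's private samples leak to the other.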
Compression
What we know: a protocol with communication C, internal info I and external info I_ext can be compressed to:
O(I_ext · polylog(C)) [BBCR 10]
O(√(I · C) · polylog(C)) [BBCR 10]
2^{O(I)} [Braverman 10]
Major open question: can we compress to I · polylog(C)? [Ganor, Kol, Raz 14; partial answer: no]
Using Information Complexity to Prove Communication Lower Bounds
Internal/external information <= communication.
Essentially the most powerful technique known: [Kerenidis, Laplante, Lerays, Roland, Xiao 12] show that most lower bound techniques imply IC lower bounds.
Disadvantage: hard to show incompressibility! One must exhibit a problem with low IC but high CC, yet proving high CC usually proves high IC.
Extending IC to Multiple Players
Recent interest in multi-player number-in-hand communication complexity, motivated by "big data": streaming and sketching, e.g., [Woodruff, Zhang 11, 12, 13]; distributed learning, e.g., [Awasthi, Balcan, Long 14].
Extending IC to Multiple Players
Multi-player computation is traditionally hard to analyze.
[Braverman, Ellen, O., Pitassi, Vaikuntanathan]: Ω(nk) lower bound for Set Disjointness with n elements and k players, private channels, NIH inputs.
Information Complexity on Private Channels
First obstacle: secure multi-party computation. [Goldreich, Micali, Wigderson 87]: any function can be computed with perfect information-theoretic security against < k/2 players, so on private channels the standard information cost can be made negligible.
Solution: redefine information cost to measure both the information a player learns, and the information a player leaks to all the others.
Extending IC to Multiple Players
Set disjointness: input X_1, …, X_k ⊆ [n]; output: is X_1 ∩ … ∩ X_k = ∅? (See the snippet below.)
Open problem: can we extend to gap set disjointness?
First step: a purely information-theoretic 2-party analysis.
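For concreteness, the task itself in a few lines (my own illustration):

```python
from functools import reduce

def disjointness(sets):
    """Output True iff X_1 ∩ … ∩ X_k = ∅ (number-in-hand: player j holds X_j)."""
    return reduce(set.intersection, sets) == set()

print(disjointness([{1, 2}, {2, 3}, {2, 4}]))  # False: 2 is in every set
print(disjointness([{1}, {2}, {3}]))           # True: empty intersection
```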
Extending IC to Multiple Players
In [Braverman, Ellen, O., Pitassi, Vaikuntanathan] we show a direct sum theorem for multi-party: solving n instances = n × solving one instance.
Does direct sum hold across players? Is solving with k players = k × solving with 2 players? Not always.
Does compression work for multi-party?
Conclusion
Information complexity extends classical information theory to the interactive setting, where the picture is much less well-understood.
Powerful tool for lower bounds.
Fascinating open problems: compression; information complexity for multi-player computation, quantum communication, …